Co-located haptic and 3D graphic interface for medical simulations.
Berkelman, Peter; Miyasaka, Muneaki; Bozlee, Sebastian
2013-01-01
We describe a system which provides high-fidelity haptic feedback in the same physical location as a 3D graphical display, in order to enable realistic physical interaction with virtual anatomical tissue during modelled procedures such as needle driving, palpation, and other interventions performed using handheld instruments. The haptic feedback is produced by the interaction between an array of coils located behind a thin flat LCD screen, and permanent magnets embedded in the instrument held by the user. The coil and magnet configuration permits arbitrary forces and torques to be generated on the instrument in real time according to the dynamics of the simulated tissue by activating the coils in combination. A rigid-body motion tracker provides position and orientation feedback of the handheld instrument to the computer simulation, and the 3D display is produced using LCD shutter glasses and a head-tracking system for the user.
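The coil-array actuation described above amounts to solving a small linear system: each coil contributes a wrench on the instrument's magnets proportional to its current, so a desired force/torque is realized by a least-squares choice of currents. Below is a minimal sketch of that computation, not the authors' code; the actuation matrix A and the coil count are illustrative stand-ins.

```python
# Minimal sketch (not the authors' code): choose coil currents that realize
# a desired 6-DOF wrench on the magnet-bearing instrument. Assumes a
# precomputed actuation matrix A (6 x n_coils) mapping currents to wrench
# at the instrument's current pose; here A is a random stand-in.
import numpy as np

def coil_currents(A, wrench):
    """Least-squares currents i such that A @ i approximates the wrench."""
    currents, *_ = np.linalg.lstsq(A, wrench, rcond=None)
    return currents

rng = np.random.default_rng(0)
A = rng.normal(size=(6, 12))                  # 12 hypothetical coils
w = np.array([0.0, 0.0, 1.0, 0.0, 0.0, 0.0])  # 1 N upward, no torque
i = coil_currents(A, w)
print(np.allclose(A @ i, w))                  # True: wrench reproduced
```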
Precise Haptic Device Co-Location for Visuo-Haptic Augmented Reality.
Eck, Ulrich; Pankratz, Frieder; Sandor, Christian; Klinker, Gudrun; Laga, Hamid
2015-12-01
Visuo-haptic augmented reality systems enable users to see and touch digital information that is embedded in the real world. PHANToM haptic devices are often employed to provide haptic feedback. Precise co-location of computer-generated graphics and the haptic stylus is necessary to provide a realistic user experience. Previous work has focused on calibration procedures that compensate the non-linear position error caused by inaccuracies in the joint angle sensors. In this article we present a more complete procedure that additionally compensates for errors in the gimbal sensors and improves position calibration. The proposed procedure further includes software-based temporal alignment of sensor data and a method for the estimation of a reference for position calibration, resulting in increased robustness against haptic device initialization and external tracker noise. We designed our procedure to require minimal user input to maximize usability. We conducted an extensive evaluation with two different PHANToMs, two different optical trackers, and a mechanical tracker. Compared to state-of-the-art calibration procedures, our approach significantly improves the co-location of the haptic stylus. This results in higher fidelity visual and haptic augmentations, which are crucial for fine-motor tasks in areas such as medical training simulators, assembly planning tools, or rapid prototyping applications.
Algorithms for Haptic Rendering of 3D Objects
NASA Technical Reports Server (NTRS)
Basdogan, Cagatay; Ho, Chih-Hao; Srinivasan, Mandayam
2003-01-01
Algorithms have been developed to provide haptic rendering of three-dimensional (3D) objects in virtual (that is, computationally simulated) environments. The goal of haptic rendering is to generate tactual displays of the shapes, hardnesses, surface textures, and frictional properties of 3D objects in real time. Haptic rendering is a major element of the emerging field of computer haptics, which invites comparison with computer graphics. We have already seen various applications of computer haptics in the areas of medicine (surgical simulation, telemedicine, haptic user interfaces for blind people, and rehabilitation of patients with neurological disorders), entertainment (3D painting, character animation, morphing, and sculpting), mechanical design (path planning and assembly sequencing), and scientific visualization (geophysical data analysis and molecular manipulation).
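Haptic rendering of 3D objects, at its simplest, computes a reaction force from the penetration of the haptic interface point into a surface. The sketch below illustrates this penalty-based scheme for a sphere; it is only a toy instance of the general algorithms the abstract describes (which use more sophisticated proxy methods), and all constants are hypothetical.

```python
# Toy penalty-based haptic rendering for a sphere: the reaction force is
# proportional to penetration depth along the outward normal. Constants
# are illustrative.
import numpy as np

def sphere_force(p, center, radius, k=800.0):
    """Reaction force on the haptic interface point p (positions in m, k in N/m)."""
    d = np.asarray(p, float) - np.asarray(center, float)
    dist = np.linalg.norm(d)
    if dist >= radius or dist == 0.0:
        return np.zeros(3)              # outside the object (or degenerate): no force
    n = d / dist                        # outward surface normal
    return k * (radius - dist) * n      # push out, proportional to penetration

print(sphere_force([0.0, 0.0, 0.045], [0.0, 0.0, 0.0], 0.05))  # ~[0, 0, 4] N
```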
An augmented reality haptic training simulator for spinal needle procedures.
Sutherland, Colin; Hashtrudi-Zaad, Keyvan; Sellens, Rick; Abolmaesumi, Purang; Mousavi, Parvin
2013-11-01
This paper presents the prototype for an augmented reality haptic simulation system with potential for spinal needle insertion training. The proposed system is composed of a torso mannequin, a MicronTracker2 optical tracking system, a PHANToM haptic device, and a graphical user interface to provide visual feedback. The system allows users to perform simulated needle insertions on a physical mannequin overlaid with an augmented reality cutaway of patient anatomy. A finite-element-based tissue model provides forces during the insertion. The system allows for training without the need for the presence of a trained clinician or access to live patients or cadavers. A pilot user study demonstrates the potential and functionality of the system.
Improved haptic interface for colonoscopy simulation.
Woo, Hyun Soo; Kim, Woo Seok; Ahn, Woojin; Lee, Doo Yong; Yi, Sun Young
2007-01-01
This paper presents an improved haptic interface for the KAIST-Ewha colonoscopy simulator II. The haptic interface enables the distal portion of the colonoscope to be freely bent while guaranteeing enough workspace and reflective forces for colonoscopy simulation. Its force-torque sensor measures the force profiles applied by the user. Manipulation of the colonoscope tip is monitored by four deflection sensors and triggers computation to render accurate graphic images corresponding to the angle-knob rotation. Tack switches are attached to the valve-actuation buttons of the colonoscope to simulate air injection or suction, and the corresponding deformation of the colon.
Haptic interface of the KAIST-Ewha colonoscopy simulator II.
Woo, Hyun Soo; Kim, Woo Seok; Ahn, Woojin; Lee, Doo Yong; Yi, Sun Young
2008-11-01
This paper presents an improved haptic interface for the Korea Advanced Institute of Science and Technology Ewha Colonoscopy Simulator II. The haptic interface enables the distal portion of the colonoscope to be freely bent while guaranteeing sufficient workspace and reflective forces for colonoscopy simulation. Its force-torque sensor measures the profiles of the user. Manipulation of the colonoscope tip is monitored by four deflection sensors and triggers computations to render accurate graphic images corresponding to the rotation of the angle knob. Tack sensors are attached to the valve-actuation buttons of the colonoscope to simulate air injection or suction as well as the corresponding deformation of the colon. A survey study for face validation was conducted, and the result shows that the developed haptic interface provides realistic haptic feedback for colonoscopy simulations.
G2H--graphics-to-haptic virtual environment development tool for PC's.
Acosta, E; Temkin, B; Krummel, T M; Heinrichs, W L
2000-01-01
Existing surgical virtual environments have improved greatly for surgical training and preparation, but mostly in their visual aspects. Incorporating haptics into virtual-reality-based surgical simulations would greatly enhance their sense of realism. To aid the development of haptic surgical virtual environments, we have created a graphics-to-haptic (G2H) virtual-environment development tool. G2H transforms graphical virtual environments (created or imported) into haptic virtual environments without programming. Its capability has been demonstrated on the complex 3D pelvic model of Lucy 2.0, the Stanford Visible Female: the pelvis was made haptic using G2H without any further programming effort.
Prototype of haptic device for sole of foot using magnetic field sensitive elastomer
NASA Astrophysics Data System (ADS)
Kikuchi, T.; Masuda, Y.; Sugiyama, M.; Mitsumata, T.; Ohori, S.
2013-02-01
Walking is one of the most popular activities and a healthy aerobic exercise for the elderly. However, for those with physical and/or cognitive disabilities, travelling to unfamiliar places can be challenging. The final goal of this study is to develop a virtual-reality walking system that allows users to walk through virtual worlds rendered with computer graphics. We focus on a haptic device that can apply various plantar pressures to the user's soles as an additional sensory channel during virtual-reality walking. In this study, we discuss the use of a magnetic-field-sensitive elastomer (MSE) as the working material for the haptic interface on the sole, and we develop and evaluate a first MSE prototype. Measurements of plantar pressure show that the device can apply different pressures to the sole of a lightweight user by applying a magnetic field to the MSE. The results also indicate that the magnetic circuit and the basic structure of the device's mechanism need improvement.
Besse, Nadine; Rosset, Samuel; Zarate, Juan Jose; Ferrari, Elisabetta; Brayda, Luca; Shea, Herbert
2018-01-01
We present a fully latching and scalable 4 × 4 haptic display with 4 mm pitch, 5 s refresh time, 400 mN holding force, and 650 μm displacement per taxel. The display serves to convey dynamic graphical information to blind and visually impaired users. Combining significant holding force with high taxel density and large-amplitude motion in a very compact overall form factor was made possible by exploiting the reversible, fast, hundred-fold change in the stiffness of a thin shape memory polymer (SMP) membrane when heated above its glass transition temperature. Local heating is produced by an addressable array of stretchable microheaters, 3 mm in diameter, patterned on the SMP. Each taxel is selectively and independently actuated by synchronizing local Joule heating with a single pressure supply. Switching off the heating locks each taxel in its position (up or down), so any array configuration can be held with zero power consumption. A 3D-printed pin array mounted over the SMP membrane provides the user with a smooth, room-temperature array of movable pins to explore by touch. Perception tests carried out with 24 blind users resulted in 70 percent correct pattern recognition over a 12-word tactile dictionary.
Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface.
Aleotti, Jacopo; Micconi, Giorgio; Caselli, Stefano; Benassi, Giacomo; Zambelli, Nicola; Bettelli, Manuele; Zappettini, Andrea
2017-09-29
A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV and geographical information, such as buildings. Experiments performed in a real environment are reported using an intense nuclear source.
Klatzky, Roberta L; Giudice, Nicholas A; Bennett, Christopher R; Loomis, Jack M
2014-01-01
Many developers wish to capitalize on touch-screen technology for developing aids for the blind, particularly by incorporating vibrotactile stimulation to convey patterns on their surfaces, which otherwise are featureless. Our belief is that they will need to take into account basic research on haptic perception in designing these graphics interfaces. We point out constraints and limitations in haptic processing that affect the use of these devices. We also suggest ways to use sound to augment basic information from touch, and we include evaluation data from users of a touch-screen device with vibrotactile and auditory feedback that we have been developing, called a vibro-audio interface.
Training Surgical Residents With a Haptic Robotic Central Venous Catheterization Simulator.
Pepley, David F; Gordon, Adam B; Yovanoff, Mary A; Mirkin, Katelin A; Miller, Scarlett R; Han, David C; Moore, Jason Z
Ultrasound-guided central venous catheterization (CVC) is a common surgical procedure with complication rates ranging from 5 to 21 percent. Training is typically performed using manikins that do not simulate anatomical variations such as obesity and abnormal vessel positioning. The goal of this study was to develop and validate the effectiveness of a new virtual-reality, force-haptic-based simulation platform for CVC of the right internal jugular vein. A CVC simulation platform was developed using a haptic robotic arm, a 3D position tracker, and computer visualization. The haptic robotic arm simulated needle insertion forces based on cadaver experiments. The 3D position tracker was used as a mock ultrasound device with realistic visualization on a computer screen. Upon completion of a practice simulation, performance feedback is given to the user through a graphical user interface, including scoring factors based on good CVC practice. The effectiveness of the system was evaluated by training 13 first-year surgical residents on the virtual-reality haptic-based training system over a 3-month period. The participants' performance increased from 52% to 96% on the baseline training scenario, approaching the average score of an expert surgeon (98%). This also produced improvements in positive CVC practices, including a 61% decrease in the distance between the final needle tip position and the vein center, a decrease in mean insertion attempts from 1.92 to 1.23, and a 12% increase in time spent aspirating the syringe throughout the procedure. A virtual-reality haptic robotic simulator for CVC was successfully developed. Surgical residents training on the simulator improved to near-expert levels after three robotic training sessions, suggesting that the system could act as an effective training device for CVC.
Dennerlein, J T; Yang, M C
2001-01-01
Pointing devices, essential input tools for the graphical user interface (GUI) of desktop computers, require precise motor control and dexterity to use. Haptic force-feedback devices provide the human operator with tactile cues, adding the sense of touch to existing visual and auditory interfaces. However, the performance enhancements, comfort, and possible musculoskeletal loading of using a force-feedback device in an office environment are unknown. Hypothesizing that the time to perform a task and the self-reported pain and discomfort of the task improve with the addition of force feedback, 26 people ranging in age from 22 to 44 years performed a point-and-click task 540 times with and without an attractive force field surrounding the desired target. The point-and-click movements were approximately 25% faster with the addition of force feedback (paired t-tests, p < 0.001). Perceived user discomfort and pain, as measured through a questionnaire, were also smaller with the addition of force feedback (p < 0.001). However, this difference decreased as additional distracting force fields were added to the task environment, simulating a more realistic work situation. These results suggest that for a given task, use of a force-feedback device improves performance, and potentially reduces musculoskeletal loading during mouse use. Actual or potential applications of this research include human-computer interface design, specifically that of the pointing device extensively used for the graphical user interface.
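The attractive force field used in such point-and-click studies can be modeled as a spring that engages once the cursor enters a basin around the target. A minimal sketch under assumed gains and radii (not the study's actual parameters):

```python
# Hedged sketch of an attractive force field around a click target: inside
# a basin, a spring pulls the cursor toward the target center. Gains and
# radii are illustrative, not the experiment's values.
import numpy as np

def attraction_force(cursor, target, basin_radius=0.03, k=150.0):
    """2D force (N) pulling the cursor toward the target; positions in m."""
    d = np.asarray(target, float) - np.asarray(cursor, float)
    dist = np.linalg.norm(d)
    if dist > basin_radius or dist == 0.0:
        return np.zeros(2)          # outside the basin (or on target): no pull
    return k * d                    # spring pull toward the target center

print(attraction_force([0.01, 0.0], [0.0, 0.0]))  # [-1.5, 0.0] N toward target
```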
Perception of synchronization errors in haptic and visual communications
NASA Astrophysics Data System (ADS)
Kameyama, Seiji; Ishibashi, Yutaka
2006-10-01
This paper deals with a system that conveys the haptic sensation experienced by one user to a remote user. In the system, the user controls a remote haptic interface device with a local haptic interface device while watching video; the haptic media and video of the real object being touched are transmitted to the other user. By subjective assessment, we investigate the allowable and imperceptible ranges of synchronization error between the haptic media and video. We employ four real objects and ask each subject whether the synchronization error is perceptible for each object. The assessment results show that synchronization error is more easily perceived when the haptic media lead the video than when they lag behind it.
Open Touch/Sound Maps: A system to convey street data through haptic and auditory feedback
NASA Astrophysics Data System (ADS)
Kaklanis, Nikolaos; Votis, Konstantinos; Tzovaras, Dimitrios
2013-08-01
The use of spatial (geographic) information is becoming ever more central and pervasive in today's internet society, but most of it is currently inaccessible to visually impaired users. Access to visual maps is severely restricted for people with visual impairments or blindness because they cannot interpret graphical information. Thus, alternative ways of presenting maps must be explored to make them accessible. Multiple types of sensory perception, such as touch and hearing, may work as substitutes for vision in the exploration of maps, and multimodal virtual environments seem a promising alternative for people with visual impairments. The present paper introduces a tool for automatic multimodal map generation with haptic and audio feedback using OpenStreetMap data. For a desired map area, an elevation map is automatically generated and can be explored by touch using a haptic device. A sonification and a text-to-speech (TTS) mechanism also provide audio navigation information during haptic exploration of the map.
Low cost heads-up virtual reality (HUVR) with optical tracking and haptic feedback
NASA Astrophysics Data System (ADS)
Margolis, Todd; DeFanti, Thomas A.; Dawe, Greg; Prudhomme, Andrew; Schulze, Jurgen P.; Cutchin, Steve
2011-03-01
Researchers at the University of California, San Diego, have created a new, relatively low-cost augmented reality system that enables users to touch the virtual environment they are immersed in. The Heads-Up Virtual Reality device (HUVR) couples a consumer 3D HD flat screen TV with a half-silvered mirror to project any graphic image onto the user's hands and into the space surrounding them. With his or her head position optically tracked to generate the correct perspective view, the user maneuvers a force-feedback (haptic) device to interact with the 3D image, literally 'touching' the object's angles and contours as if it was a tangible physical object. HUVR can be used for training and education in structural and mechanical engineering, archaeology and medicine as well as other tasks that require hand-eye coordination. One of the most unique characteristics of HUVR is that a user can place their hands inside of the virtual environment without occluding the 3D image. Built using open-source software and consumer level hardware, HUVR offers users a tactile experience in an immersive environment that is functional, affordable and scalable.
Shadow-driven 4D haptic visualization.
Zhang, Hui; Hanson, Andrew
2007-01-01
Just as we can work with two-dimensional floor plans to communicate 3D architectural design, we can exploit reduced-dimension shadows to manipulate the higher-dimensional objects generating the shadows. In particular, by taking advantage of physically reactive 3D shadow-space controllers, we can transform the task of interacting with 4D objects to a new level of physical reality. We begin with a teaching tool that uses 2D knot diagrams to manipulate the geometry of 3D mathematical knots via their projections; our unique 2D haptic interface allows the user to become familiar with sketching, editing, exploration, and manipulation of 3D knots rendered as projected images on a 2D shadow space. By combining graphics and collision-sensing haptics, we can enhance the 2D shadow-driven editing protocol to successfully leverage 2D pen-and-paper or blackboard skills. Building on the reduced-dimension 2D editing tool for manipulating 3D shapes, we develop the natural analogy to produce a reduced-dimension 3D tool for manipulating 4D shapes. By physically modeling the correct properties of 4D surfaces, their bending forces, and their collisions in the 3D haptic controller interface, we can support full-featured physical exploration of 4D mathematical objects in a manner that is otherwise far beyond the experience accessible to human beings. As far as we are aware, this paper reports the first interactive system with force-feedback that provides "4D haptic visualization" permitting the user to model and interact with 4D cloth-like objects.
Integration of Haptics in Agricultural Robotics
NASA Astrophysics Data System (ADS)
Kannan Megalingam, Rajesh; Sreekanth, M. M.; Sivanantham, Vinu; Sai Kumar, K.; Ghanta, Sriharsha; Surya Teja, P.; Reddy, Rajesh G.
2017-08-01
Robots can be classified as open-loop or closed-loop systems, and many problems arise when no feedback is available from the robot. In this research paper, we discuss approaches to achieving a fully closed-loop system for a multiple-DOF robotic arm, used in a coconut-tree climbing and cutting robot, by introducing a haptic device. We employ various sensors, including tactile, vibration, force, and proximity sensors, to obtain feedback. The robotic arm is monitored through graphical user interface software that simulates its operation, returns the real-time analog values produced by the various sensors, and provides real-time graphs for estimating the robot's efficiency.
Comparative study on collaborative interaction in non-immersive and immersive systems
NASA Astrophysics Data System (ADS)
Shahab, Qonita M.; Kwon, Yong-Moo; Ko, Heedong; Mayangsari, Maria N.; Yamasaki, Shoko; Nishino, Hiroaki
2007-09-01
This research studies virtual reality simulation of collaborative interaction, in which different people in different places interact with one object concurrently. Our focus is the real-time handling of inputs from multiple users, where an object's behavior is determined by the combination of the multiple inputs. The issues addressed in this research are: 1) the effects of using haptics on collaborative interaction, and 2) the possibilities of collaboration between users in different environments. We conducted user tests on our system in several cases: 1) comparison between non-haptic and haptic collaborative interaction over a LAN, 2) comparison between non-haptic and haptic collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments. Two case studies are considered: collaborative authoring of a 3D model by two users, and collaborative haptic interaction by multiple users. In Virtual Dollhouse, users can observe the laws of physics while constructing a dollhouse from existing building blocks under gravity. In Virtual Stretcher, multiple users can collaborate on moving a stretcher together while feeling each other's haptic motions.
Villard, P F; Vidal, F P; Hunt, C; Bello, F; John, N W; Johnson, S; Gould, D A
2009-11-01
We present here a simulator for interventional radiology focusing on percutaneous transhepatic cholangiography (PTC). This procedure consists of inserting a needle into the biliary tree using fluoroscopy for guidance. The requirements of the simulator have been driven by a task analysis, from which three main components were identified: respiration, real-time X-ray display (fluoroscopy), and haptic rendering (the sense of touch). The framework for modelling the respiratory motion is based on kinematic laws and on the ChainMail algorithm. The fluoroscopic simulation is performed on the graphics card and uses the Beer-Lambert law to compute the X-ray attenuation. Finally, the haptic rendering is integrated into the virtual environment and takes into account the soft-tissue reaction force feedback and the maintenance of the needle's initial direction during insertion. Five training scenarios have been created using patient-specific data, each providing the user with variable breathing behaviour, a fluoroscopic display tuneable to any device parameters, and needle force feedback. A detailed task analysis has been used to design and build the PTC simulator described in this paper. The simulator includes real-time respiratory motion with two independent parameters (rib kinematics and diaphragm action), on-line fluoroscopy implemented on the graphics processing unit, and haptic feedback to convey the soft-tissue behaviour of the organs during needle insertion.
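The Beer-Lambert computation mentioned above attenuates each X-ray exponentially with the attenuation coefficients and thicknesses of the materials it crosses. A minimal sketch with rough illustrative coefficients (not the simulator's calibrated values):

```python
# Minimal sketch of Beer-Lambert attenuation along one ray:
# I = I0 * exp(-sum(mu_i * d_i)) over the materials the ray crosses.
# The attenuation coefficients below are rough illustrative values.
import math

def transmitted_intensity(i0, segments):
    """segments: (mu, thickness) pairs along the ray; mu in 1/cm, thickness in cm."""
    return i0 * math.exp(-sum(mu * d for mu, d in segments))

# A ray crossing 3 cm of soft tissue (mu ~ 0.2/cm) then 1 cm of bone (mu ~ 0.5/cm):
print(transmitted_intensity(1.0, [(0.2, 3.0), (0.5, 1.0)]))  # ~0.33
```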
O'Modhrain, Sile; Giudice, Nicholas A; Gardner, John A; Legge, Gordon E
2015-01-01
This paper discusses issues of importance to designers of media for visually impaired users. The paper considers the influence of human factors on the effectiveness of presentation as well as the strengths and weaknesses of tactile, vibrotactile, haptic, and multimodal methods of rendering maps, graphs, and models. The authors, all of whom are visually impaired researchers in this domain, present findings from their own work and work of many others who have contributed to the current understanding of how to prepare and render images for both hard-copy and technology-mediated presentation of Braille and tangible graphics.
Augmented reality and haptic interfaces for robot-assisted surgery.
Yamamoto, Tomonori; Abolhassani, Niki; Jung, Sung; Okamura, Allison M; Judkins, Timothy N
2012-03-01
Current teleoperated robot-assisted minimally invasive surgical systems do not take full advantage of the potential performance enhancements offered by various forms of haptic feedback to the surgeon. Direct and graphical haptic feedback systems can be integrated with vision and robot control systems in order to provide haptic feedback to improve safety and tissue mechanical property identification. An interoperable interface for teleoperated robot-assisted minimally invasive surgery was developed to provide haptic feedback and augmented visual feedback using three-dimensional (3D) graphical overlays. The software framework consists of control and command software, robot plug-ins, image processing plug-ins and 3D surface reconstructions. The feasibility of the interface was demonstrated in two tasks performed with artificial tissue: palpation to detect hard lumps and surface tracing, using vision-based forbidden-region virtual fixtures to prevent the patient-side manipulator from entering unwanted regions of the workspace. The interoperable interface enables fast development and successful implementation of effective haptic feedback methods in teleoperation.
Human-computer interface including haptically controlled interactions
Anderson, Thomas G.
2005-10-11
The present invention provides a method of human-computer interfacing that provides haptic feedback to control interface interactions such as scrolling or zooming within an application. Haptic feedback in the present method allows the user more intuitive control of the interface interactions, and allows the user's visual focus to remain on the application. The method comprises providing a control domain within which the user can control interactions. For example, a haptic boundary can be provided corresponding to scrollable or scalable portions of the application domain. The user can position a cursor near such a boundary, feeling its presence haptically (reducing the requirement for visual attention for control of scrolling of the display). The user can then apply force relative to the boundary, causing the interface to scroll the domain. The rate of scrolling can be related to the magnitude of applied force, providing the user with additional intuitive, non-visual control of scrolling.
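The rate control described in the patent, scrolling speed tied to the force applied against the boundary, reduces to a simple mapping. A hedged sketch with hypothetical threshold and gain values:

```python
# Illustrative sketch of force-controlled scrolling: no motion below a
# force threshold, then scrolling rate proportional to the excess force.
# Threshold and gain are hypothetical, not values from the patent.
def scroll_rate(applied_force_n, threshold_n=0.5, gain=120.0):
    """Scrolling rate (lines/s) for a force (N) pressed against the boundary."""
    excess = applied_force_n - threshold_n
    return gain * excess if excess > 0.0 else 0.0

print(scroll_rate(1.5))  # 120.0 lines/s at 1 N beyond the threshold
```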
Visuo-Haptic Mixed Reality with Unobstructed Tool-Hand Integration.
Cosco, Francesco; Garre, Carlos; Bruno, Fabio; Muzzupappa, Maurizio; Otaduy, Miguel A
2013-01-01
Visuo-haptic mixed reality consists of adding to a real scene the ability to see and touch virtual objects. It requires the use of see-through display technology for visually mixing real and virtual objects, and haptic devices for adding haptic interaction with the virtual objects. Unfortunately, the use of commodity haptic devices poses obstruction and misalignment issues that complicate the correct integration of a virtual tool and the user's real hand in the mixed reality scene. In this work, we propose a novel mixed reality paradigm where it is possible to touch and see virtual objects in combination with a real scene, using commodity haptic devices, and with a visually consistent integration of the user's hand and the virtual tool. We discuss the visual obstruction and misalignment issues introduced by commodity haptic devices, and then propose a solution that relies on four simple technical steps: color-based segmentation of the hand, tracking-based segmentation of the haptic device, background repainting using image-based models, and misalignment-free compositing of the user's hand. We have developed a successful proof-of-concept implementation, where a user can touch virtual objects and interact with them in the context of a real scene, and we have evaluated the impact on user performance of obstruction and misalignment correction.
Rastogi, Ravi; Pawluk, Dianne T V
2013-01-01
An increasing amount of the information content used in school, work, and everyday living is presented in graphical form. Unfortunately, it is difficult for people who are blind or visually impaired to access this information, especially when many diagrams are needed. One problem is that details, even in relatively simple visual diagrams, can be very difficult to perceive using touch. With manually created tactile diagrams, these details are often presented in separate diagrams, which must be selected from among others. Being able to actively zoom in on an area of a single diagram, so that details can be presented at a reasonable size for exploration, seems a simpler approach for the user. However, directly applying visual zooming methods has some limitations when they are used haptically. Therefore, a new zooming method is proposed to avoid these pitfalls. A preliminary experiment examined the usefulness of the algorithm compared to no zooming. The results showed that the number of correct responses improved with the developed zooming algorithm, and participants found it more usable than exploring a floor map without zooming.
Mental rotation of tactile stimuli: using directional haptic cues in mobile devices.
Gleeson, Brian T; Provancher, William R
2013-01-01
Haptic interfaces have the potential to enrich users' interactions with mobile devices and convey information without burdening the user's visual or auditory attention. Haptic stimuli with directional content, for example, navigational cues, may be difficult to use in handheld applications; the user's hand, where the cues are delivered, may not be aligned with the world, where the cues are to be interpreted. In such a case, the user would be required to mentally transform the stimuli between different reference frames. We examine the mental rotation of directional haptic stimuli in three experiments, investigating: 1) users' intuitive interpretation of rotated stimuli, 2) mental rotation of haptic stimuli about a single axis, and 3) rotation about multiple axes and the effects of specific hand poses and joint rotations. We conclude that directional haptic stimuli are suitable for use in mobile applications, although users do not naturally interpret rotated stimuli in any one universal way. We find evidence of cognitive processes involving the rotation of analog, spatial representations and discuss how our results fit into the larger body of mental rotation research. For small angles (e.g., less than 40 degrees), these mental rotations come at little cost, but rotations with larger misalignment angles impact user performance. When considering the design of a handheld haptic device, our results indicate that hand pose must be carefully considered, as certain poses increase the difficulty of stimulus interpretation. Generally, all tested joint rotations impact task difficulty, but finger flexion and wrist rotation interact to greatly increase the cost of stimulus interpretation; such hand poses should be avoided when designing a haptic interface.
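The reference-frame transformation studied here can be illustrated with yaw-only misalignment: a world-frame cue must be counter-rotated by the hand's orientation before rendering. A minimal single-axis sketch with illustrative conventions:

```python
# Single-axis sketch of the reference-frame problem: a world-frame cue is
# counter-rotated by the hand's yaw before being rendered on the device.
# Purely illustrative conventions (0 deg = device "forward").
def cue_in_hand_frame(world_angle_deg, hand_yaw_deg):
    """Direction the device should render so the user perceives the world cue."""
    return (world_angle_deg - hand_yaw_deg) % 360.0

print(cue_in_hand_frame(90.0, 40.0))  # 50.0: render offset by the misalignment
```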
Ascending and Descending in Virtual Reality: Simple and Safe System Using Passive Haptics.
Nagao, Ryohei; Matsumoto, Keigo; Narumi, Takuji; Tanikawa, Tomohiro; Hirose, Michitaka
2018-04-01
This paper presents a novel interactive system that provides users with virtual reality (VR) experiences wherein they feel as if they are ascending or descending stairs through passive haptic feedback. The passive haptic stimuli are provided by small bumps under the user's feet, which represent the edges of the stairs in the virtual environment. The visual stimuli of the stairs and shoes, provided by head-mounted displays, evoke a visuo-haptic interaction that modifies the user's perception of the floor shape. Our system enables users to experience all types of stairs, such as half-turn and spiral stairs, in a VR setting. We conducted a preliminary user study and two experiments to evaluate the proposed technique. The preliminary user study investigated the effectiveness of the basic idea behind the proposed technique for the case of a user ascending stairs. The results demonstrated that the passive haptic feedback produced by the small bumps enhanced the user's feeling of presence and sense of ascending. We subsequently performed an experiment to investigate an improved viewpoint manipulation method and the interaction of the manipulation and haptics for both the ascending and descending cases. The experimental results demonstrated that participants felt presence and a steep stair gradient under the condition of haptic feedback with viewpoint manipulation based on the characteristics of actual stair-walking data. However, these results also indicated that the proposed system may not be as effective in providing a sense of descending stairs without an optimization of the haptic stimuli. We therefore redesigned the shape of the small bumps and evaluated the design in a second experiment. The results indicated that the best shape for presenting the haptic stimuli is a right-triangle cross-section in both the ascending and descending cases. Although the protrusions must be installed in the determined orientation, this optimized shape enhanced the user's feeling of the stairs' presence and the sensation of walking up and down.
A kinesthetic washout filter for force-feedback rendering.
Danieau, Fabien; Lecuyer, Anatole; Guillotel, Philippe; Fleureau, Julien; Mollet, Nicolas; Christie, Marc
2015-01-01
Today haptic feedback can be designed and associated to audiovisual content (haptic-audiovisuals or HAV). Although there are multiple means to create individual haptic effects, the issue of how to properly adapt such effects on force-feedback devices has not been addressed and is mostly a manual endeavor. We propose a new approach for the haptic rendering of HAV, based on a washout filter for force-feedback devices. A body model and an inverse kinematics algorithm simulate the user's kinesthetic perception. Then, the haptic rendering is adapted in order to handle transitions between haptic effects and to optimize the amplitude of effects regarding the device capabilities. Results of a user study show that this new haptic rendering can successfully improve the HAV experience.
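A washout filter, in its simplest form, is a high-pass filter: transient force effects pass through while sustained offsets decay, letting the device return toward neutral within its workspace. The following first-order sketch illustrates the principle; it is not the paper's filter, and the time constant is illustrative:

```python
# First-order high-pass ("washout") sketch: y[n] = a*(y[n-1] + x[n] - x[n-1]),
# with a = tau/(tau + dt). Transients pass; sustained forces decay to zero,
# letting the device return toward neutral. tau and dt are illustrative.
def washout(forces, dt=0.001, tau=0.5):
    a = tau / (tau + dt)
    y, x_prev, out = 0.0, forces[0], []
    for x in forces:
        y = a * (y + x - x_prev)    # high-pass update
        x_prev = x
        out.append(y)
    return out

step = [0.0] * 10 + [1.0] * 4000    # a sustained 1 N effect at 1 kHz
print(round(washout(step)[-1], 3))  # ~0.0: the constant offset has washed out
```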
Graphic and haptic simulation system for virtual laparoscopic rectum surgery.
Pan, Jun J; Chang, Jian; Yang, Xiaosong; Zhang, Jian J; Qureshi, Tahseen; Howell, Robert; Hickish, Tamas
2011-09-01
Medical simulators with visual and haptic feedback techniques offer a cost-effective and efficient alternative to traditional medical training. They have been used to train doctors in many specialties of medicine, allowing tasks to be practised in a safe and repetitive manner. This paper describes a virtual-reality (VR) system intended to improve surgeons' learning curves in the technically challenging field of laparoscopic surgery of the rectum. Data from MRI of the rectum and videos of real operations are used to construct the virtual models. A haptic force filter based on radial basis functions is designed to offer realistic and smooth force feedback. To handle collision detection efficiently, a hybrid model is presented to compute the deformation of the intestines. Finally, a real-time mesh-based cutting technique is employed to represent the incision operation. Despite numerous research efforts, fast and realistic modelling of soft tissues with large deformation, such as the intestines, remains extremely challenging; this paper introduces our latest contribution to this endeavour. With this system, the user can haptically operate on the virtual rectum and simultaneously watch the soft tissue deform. Our system has been tested by colorectal surgeons, who believe that the simulated tactile and visual feedback is realistic. It could replace the traditional training process and effectively transfer surgical skills to novices.
Improving manual skills in persons with disabilities (PWD) through a multimodal assistance system.
Covarrubias, Mario; Gatti, Elia; Bordegoni, Monica; Cugini, Umberto; Mansutti, Alessandro
2014-07-01
In this research work, we present a Multimodal Guidance System (MGS) that provides dynamic assistance to persons with disabilities (PWD) while they perform manual activities such as drawing, coloring-in, and foam-cutting tasks. The MGS provides robotic assistance in the execution of 2D tasks through haptic and sound interactions. Haptic technology renders the virtual path of 2D shapes through a point-based approach, while sound technology provides audio feedback related to the hand's velocity during sketching, filling, or cutting operations. By combining this multimodal system with haptic assistance, we have created a new approach with possible applications in fields as diverse as physical rehabilitation, scientific investigation of sensorimotor learning, and assessment of hand movements in PWD. The MGS has been tested by people with specific disorders affecting coordination, such as Down syndrome and developmental disabilities, under the supervision of their teachers and care assistants inside their learning environment. A graphical user interface was designed so that teachers and care assistants could provide training during the test sessions. Our results provide conclusive evidence that using the MGS increases accuracy in the task operations. The MGS is an interface that offers haptic and sound feedback while manual tasks are performed. Several studies have demonstrated that haptic guidance systems can help people recover cognitive function at different levels of complexity and impairment. The applications supported by our device could also play an important role in supporting physical therapists and cognitive psychologists in helping patients recover motor and visuo-spatial abilities.
Study on Collaborative Object Manipulation in Virtual Environment
NASA Astrophysics Data System (ADS)
Mayangsari, Maria Niken; Yong-Moo, Kwon
This paper presents a comparative study of networked collaboration performance under different degrees of immersion. In particular, the relationship between user collaboration performance and the degree of immersion provided by the system is addressed and compared across several experiments. The user tests on our system include several cases: 1) comparison between non-haptic and haptic collaborative interaction over a LAN, 2) comparison between non-haptic and haptic collaborative interaction over the Internet, and 3) analysis of collaborative interaction between non-immersive and immersive display environments.
Sensing and Force-Feedback Exoskeleton (SAFE) Robotic Glove.
Ben-Tzvi, Pinhas; Ma, Zhou
2015-11-01
This paper presents the design, implementation and experimental validation of a novel robotic haptic exoskeleton device to measure the user's hand motion and assist hand motion while remaining portable and lightweight. The device consists of a five-finger mechanism actuated with miniature DC motors through antagonistically routed cables at each finger, which act as both active and passive force actuators. The SAFE Glove is a wireless and self-contained mechatronic system that mounts over the dorsum of a bare hand and provides haptic force feedback to each finger. The glove is adaptable to a wide variety of finger sizes without constraining the range of motion. This makes it possible to accurately and comfortably track the complex motion of the finger and thumb joints associated with common movements of hand functions, including grip and release patterns. The glove can be wirelessly linked to a computer for displaying and recording the hand status through 3D Graphical User Interface (GUI) in real-time. The experimental results demonstrate that the SAFE Glove is capable of reliably modeling hand kinematics, measuring finger motion and assisting hand grasping motion. Simulation and experimental results show the potential of the proposed system in rehabilitation therapy and virtual reality applications.
Haptic-STM: a human-in-the-loop interface to a scanning tunneling microscope.
Perdigão, Luís M A; Saywell, Alex
2011-07-01
The operation of a haptic device interfaced with a scanning tunneling microscope (STM) is presented here. The user moves the STM tip in three dimensions by means of a stylus attached to the haptic instrument. The tunneling current measured by the STM is converted to a vertical force, applied to the stylus and felt by the user, with the user being incorporated into the feedback loop that controls the tip-surface distance. A haptic-STM interface of this nature allows the user to feel atomic features on the surface and facilitates the tactile manipulation of the adsorbate/substrate system. The operation of this device is demonstrated via the room-temperature STM imaging of C60 molecules adsorbed on an Au(111) surface in ultra-high vacuum.
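Since the tunneling current varies roughly exponentially with tip-surface distance, one plausible current-to-force mapping is logarithmic, which behaves approximately like a spring in distance. The sketch below is an assumption for illustration, not the instrument's actual calibration:

```python
# Assumed (illustrative) current-to-force mapping: because tunneling current
# grows ~exponentially as the tip approaches the surface, a logarithmic map
# yields a roughly linear-in-distance, spring-like force on the stylus.
import math

def stylus_force(current_na, setpoint_na=1.0, gain=0.05):
    """Vertical force (N): repulsive above the current setpoint, attractive below."""
    if current_na <= 0.0:
        return 0.0
    return gain * math.log(current_na / setpoint_na)

print(stylus_force(10.0))  # positive: tip too close, stylus pushed back
```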
Safe Local Navigation for Visually Impaired Users With a Time-of-Flight and Haptic Feedback Device.
Katzschmann, Robert K; Araki, Brandon; Rus, Daniela
2018-03-01
This paper presents ALVU (Array of Lidars and Vibrotactile Units), a contactless, intuitive, hands-free, and discreet wearable device that allows visually impaired users to detect low- and high-hanging obstacles, as well as physical boundaries in their immediate environment. The solution allows for safe local navigation in both confined and open spaces by enabling the user to distinguish free space from obstacles. The device presented is composed of two parts: a sensor belt and a haptic strap. The sensor belt is an array of time-of-flight distance sensors worn around the front of a user's waist, and the pulses of infrared light provide reliable and accurate measurements of the distances between the user and surrounding obstacles or surfaces. The haptic strap communicates the measured distances through an array of vibratory motors worn around the user's upper abdomen, providing haptic feedback. The linear vibration motors are combined with a point-loaded pretensioned applicator to transmit isolated vibrations to the user. We validated the device's capability in an extensive user study entailing 162 trials with 12 blind users. Users wearing the device successfully walked through hallways, avoided obstacles, and detected staircases.
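A sensor-to-motor mapping of the kind ALVU uses can be sketched as a per-sensor ramp: vibration intensity rises as the measured distance falls between a far and a near threshold. The thresholds below are illustrative, not the device's calibrated values:

```python
# Sketch of a distance-to-vibration mapping: each time-of-flight sensor
# drives its motor harder as the obstacle it sees gets closer. The near/far
# thresholds are illustrative, not ALVU's calibrated values.
def vibration_levels(distances_m, near=0.3, far=2.0):
    """Per-motor intensity in [0, 1] from per-sensor obstacle distances (m)."""
    levels = []
    for d in distances_m:
        if d >= far:
            levels.append(0.0)                                 # nothing nearby
        else:
            levels.append(min(1.0, (far - d) / (far - near)))  # ramp up as d falls
    return levels

print(vibration_levels([0.25, 1.0, 3.0]))  # [1.0, ~0.59, 0.0]
```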
Haptic feedback for virtual assembly
NASA Astrophysics Data System (ADS)
Luecke, Greg R.; Zafer, Naci
1998-12-01
Assembly operations require high speed and precision at low cost. The manufacturing industry has recently turned its attention to investigating assembly procedures using graphical displays of CAD parts. For these tasks, force feedback to the person is invaluable in providing a real sense of interaction with virtual parts. This research develops the use of a commercial assembly robot as the haptic display in such tasks. For demonstration, a peg-in-hole insertion task is studied. Kane's method is employed to derive the dynamics of the peg and the contact motions between the peg and the hole. A handle modeled as a cylindrical peg is attached to the end effector of a PUMA 560 robotic arm, which is equipped with a six-axis force/torque transducer. The user grabs the handle, and the user-applied forces are recorded. A 300 MHz Pentium computer simulates the dynamics of the virtual peg and its interactions as it is inserted into the virtual hole. Computed-torque control is then employed to render the full dynamics of the task to the user's hand. Visual feedback is also incorporated to help the user insert the peg into the hole. Experimental results are presented showing several contact configurations for this virtually simulated task.
Using the PhysX engine for physics-based virtual surgery with force feedback.
Maciel, Anderson; Halic, Tansel; Lu, Zhonghua; Nedel, Luciana P; De, Suvranu
2009-09-01
The development of modern surgical simulators is highly challenging, as they must support complex simulation environments. The demand for higher realism in such simulators has driven researchers to adopt physics-based models, which are computationally very demanding. This poses a major problem, since real-time interaction requires graphical updates at 30 Hz and a much higher rate of 1 kHz for force feedback (haptics). Recently, several physics engines have been developed which offer multi-physics simulation capabilities, including rigid and deformable bodies, cloth, and fluids. While such physics engines provide unique opportunities for the development of surgical simulators, their higher latencies, compared to what is necessary for real-time graphics and haptics, pose significant barriers to their use in interactive simulation environments. In this work, we propose solutions to this problem and demonstrate how a multimodal surgical simulation environment may be developed based on NVIDIA's PhysX physics library, so that models undergoing relatively low-frequency updates in PhysX can exist in an environment that demands much higher-frequency updates for haptics. We use a collision-handling layer to interface between the physical response provided by PhysX and the haptic rendering device, providing both real-time tissue response and force feedback. Our simulator integrates a bimanual haptic interface for force feedback and per-pixel shaders for graphical realism in real time. To demonstrate the effectiveness of our approach, we present the simulation of the laparoscopic adjustable gastric banding (LAGB) procedure as a case study. Developing complex and realistic surgical trainers with realistic organ geometries and tissue properties demands stable physics-based deformation methods, which are not always compatible with the interaction rates required for such trainers. We have shown that combining different modelling strategies for behaviour, collision, and graphics is possible and desirable; such multimodal environments enable suitable rates for simulating the major steps of the LAGB procedure.
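The multirate problem described here, physics at tens of Hz versus haptics at about 1 kHz, is typically bridged by having the fast loop compute forces against the most recent slow-loop state through a stiff coupling. A simplified one-dimensional sketch with hypothetical names, rates, and gains (the paper's collision-handling layer is more elaborate):

```python
# Simplified 1-D sketch of multirate coupling: the ~1 kHz haptic loop
# computes forces against the latest state from a ~30 Hz physics solver
# through a stiff virtual spring. Names, rates, and gains are hypothetical.
def haptic_force(device_pos, proxy_pos, k=500.0):
    """Force (N) rendered at haptic rate from the last physics proxy state."""
    return -k * (device_pos - proxy_pos)

proxy = 0.0
for step in range(100):        # 100 haptic ticks at 1 kHz (~0.1 s)
    if step % 33 == 0:         # a fresh physics state every ~33 ms (~30 Hz)
        proxy += 0.001         # stand-in for a PhysX solver update
    force = haptic_force(0.02, proxy)
print(round(force, 1))         # -8.0 N against the latest proxy position
```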
Sharp, Ian; Patton, James; Listenberger, Molly; Case, Emily
2011-08-08
Recent research testing interactive devices for prolonged therapy practice has revealed new prospects for robotics combined with graphical and other forms of biofeedback. Previous human-robot interactive systems have required different software commands to be implemented for each robot, leading to unnecessary development overhead each time a new system becomes available. For example, when a haptic/graphic virtual reality environment has been coded for one specific robot to provide haptic feedback, that robot cannot be traded for another without recoding the program. However, recent efforts in the open-source community have proposed a wrapper-class approach that can elicit nearly identical responses regardless of the robot used. The result can enable researchers across the globe to perform similar experiments using shared code, so that modular "switching out" of one robot for another does not affect development time. In this paper, we outline the successful creation and implementation of a wrapper class for one robot in the open-source H3DAPI, which integrates the software commands most commonly used by all robots.
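The wrapper-class idea is that application code targets one abstract device interface and each concrete robot gets a thin back end implementing it. A minimal sketch; the class and method names are hypothetical, not H3DAPI's actual API:

```python
# Sketch of the wrapper-class idea: application code targets one abstract
# interface; each robot gets a thin back end. Class and method names are
# hypothetical, not H3DAPI's actual API.
from abc import ABC, abstractmethod

class HapticRobot(ABC):
    @abstractmethod
    def position(self) -> tuple:            # (x, y, z) in meters
        ...
    @abstractmethod
    def set_force(self, fx, fy, fz):        # force command in newtons
        ...

class SimulatedRobot(HapticRobot):
    def position(self):
        return (0.01, 0.0, 0.0)
    def set_force(self, fx, fy, fz):
        print(f"force command: ({fx}, {fy}, {fz}) N")

def control_loop(robot: HapticRobot):
    x, y, z = robot.position()              # identical calls for any back end
    robot.set_force(-10.0 * x, -10.0 * y, -10.0 * z)

control_loop(SimulatedRobot())              # swap in another robot unchanged
```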
Olsson, Pontus; Nysjö, Fredrik; Hirsch, Jan-Michaél; Carlbom, Ingrid B
2013-11-01
Cranio-maxillofacial (CMF) surgery to restore normal skeletal anatomy in patients with serious facial trauma can be both complex and time-consuming, but it is generally accepted that careful pre-operative planning leads to a better outcome, with a higher degree of function and reduced morbidity, in addition to reduced time in the operating room. However, today's surgery planning systems are primitive, relying mostly on the user's ability to plan complex tasks with a two-dimensional graphical interface. We present a system for planning the restoration of skeletal anatomy in facial trauma patients using a virtual model derived from patient-specific CT data. The system combines stereo visualization with six-degree-of-freedom, high-fidelity haptic feedback that enables analysis, planning, and preoperative testing of alternative solutions for restoring bone fragments to their proper positions. The stereo display provides accurate visual spatial perception, and the haptics system provides intuitive haptic feedback when bone fragments are in contact, as well as six-degree-of-freedom attraction forces for precise bone fragment alignment. A senior surgeon without prior experience of the system received 45 min of system training; following the training session, he completed a virtual reconstruction of a complex mandibular fracture in 22 min with an adequately reduced result. This preliminary testing with one surgeon indicates that our surgery planning system, which combines stereo visualization with sophisticated haptics, has the potential to become a powerful tool for CMF surgery planning. With little training, it allows a surgeon to complete a complex plan in a short amount of time.
Haptic interface of web-based training system for interventional radiology procedures
NASA Astrophysics Data System (ADS)
Ma, Xin; Lu, Yiping; Loe, KiaFock; Nowinski, Wieslaw L.
2004-05-01
The existing web-based medical training systems and surgical simulators can provide affordable and accessible medical training curricula, but they seldom offer the trainee realistic and affordable haptic feedback, and therefore cannot offer a suitable practice environment. In this paper, a haptic solution for interventional radiology (IR) procedures is proposed. The system architecture of a web-based training system for IR procedures is briefly presented first. Then the mechanical structure, working principle, and application of a haptic device are discussed in detail. The haptic device works as an interface between the training environment and the trainee and is placed at the end-user side. With the system, the user can be trained on interventional radiology procedures - navigating catheters, inflating balloons, deploying coils and placing stents - over the web, and receive surgical haptic feedback in real time.
Girod, Sabine; Schvartzman, Sara C; Gaudilliere, Dyani; Salisbury, Kenneth; Silva, Rebeka
2016-01-01
Computer-assisted surgical (CAS) planning tools are available for craniofacial surgery but are usually based on computer-aided design (CAD) tools that lack the ability to detect collisions between virtual objects (i.e., fractured bone segments). We developed a CAS system featuring a sense of touch (haptics) that enables surgeons to physically interact with individual, patient-specific anatomy and immerse themselves in a three-dimensional virtual environment. In this study, we evaluated initial user experience with our novel system compared to an existing CAD system. Ten surgical residents received a brief verbal introduction to both the haptic and CAD systems. Users simulated mandibular fracture reduction in three clinical cases within a 15-minute time limit for each system and completed a questionnaire to assess their subjective experience. We compared standard landmarks and linear and angular measurements between the simulated results and the actual surgical outcome and found that the haptic simulation results were not significantly different from the actual postoperative outcomes. In contrast, the CAD results differed significantly from both the haptic simulation and the actual postoperative results. In addition to enabling a more accurate fracture repair, the haptic system provided a better user experience than the CAD system in terms of intuitiveness and self-reported quality of repair.
ERIC Educational Resources Information Center
Schonborn, Konrad J.; Bivall, Petter; Tibell, Lena A. E.
2011-01-01
This study explores tertiary students' interaction with a haptic virtual model representing the specific binding of two biomolecules, a core concept in molecular life science education. Twenty students assigned to a "haptics" (experimental) or "no-haptics" (control) condition performed a "docking" task where users sought the most favourable…
Lemole, G Michael; Banerjee, P Pat; Luciano, Cristian; Neckrysh, Sergey; Charbel, Fady T
2007-07-01
Mastery of the neurosurgical skill set involves many hours of supervised intraoperative training. Convergence of political, economic, and social forces has limited neurosurgical resident operative exposure. There is a need to develop realistic neurosurgical simulations that reproduce the operative experience, unrestricted by time and patient safety constraints. Computer-based, virtual reality platforms offer just such a possibility. The combination of virtual reality with dynamic, three-dimensional stereoscopic visualization and haptic feedback technologies makes realistic procedural simulation possible. Most neurosurgical procedures can be conceptualized and segmented into critical task components, which can be simulated independently or in conjunction with other modules to recreate the experience of a complex neurosurgical procedure. We use the ImmersiveTouch (ImmersiveTouch, Inc., Chicago, IL) virtual reality platform, developed at the University of Illinois at Chicago, to simulate the task of ventriculostomy catheter placement as a proof of concept. Computed tomographic data are used to create a virtual anatomic volume. Haptic feedback offers simulated resistance and relaxation with passage of a virtual three-dimensional ventriculostomy catheter through the brain parenchyma into the ventricle. A dynamic three-dimensional graphical interface renders changing visual perspective as the user's head moves. The simulation platform was found to have realistic visual, tactile, and handling characteristics, as assessed by neurosurgical faculty, residents, and medical students. We have developed a realistic, haptics-based virtual reality simulator for neurosurgical education. Our first module recreates a critical component of the ventriculostomy placement task. This approach to task simulation can be assembled in a modular manner to reproduce entire neurosurgical procedures.
Jin, Seung-A Annie
2010-06-01
This study gauged the effects of force feedback in the Novint Falcon haptics system on the sensory and cognitive dimensions of a virtual test-driving experience. First, in order to explore the effects of tactile stimuli with force feedback on users' sensory experience, feelings of physical presence (the extent to which virtual physical objects are experienced as actual physical objects) were measured after participants used the haptics interface. Second, to evaluate the effects of force feedback on the cognitive dimension of consumers' virtual experience, this study investigated brand personality perception. The experiment utilized the Novint Falcon haptics controller to induce immersive virtual test-driving through tactile stimuli. The author designed a two-group (haptics stimuli with force feedback versus no force feedback) comparison experiment (N = 238) by manipulating the level of force feedback. Users in the force feedback condition were exposed to tactile stimuli involving various force feedback effects (e.g., terrain effects, acceleration, and lateral forces) while test-driving a rally car. In contrast, users in the control condition test-drove the rally car using the Novint Falcon but were not given any force feedback. Results of ANOVAs indicated that (a) users exposed to force feedback felt stronger physical presence than those in the no force feedback condition, and (b) users exposed to haptics stimuli with force feedback perceived the brand personality of the car to be more rugged than those in the control condition. Managerial implications of the study for product trial in the business world are discussed.
Review of Designs for Haptic Data Visualization.
Paneels, Sabrina; Roberts, Jonathan C
2010-01-01
There are many different uses for haptics, such as training medical practitioners, teleoperation, or navigation of virtual environments. This review focuses on haptic methods that display data. The hypothesis is that haptic devices can be used to present information, and consequently, the user gains quantitative, qualitative, or holistic knowledge about the presented data. Not only is this useful for users who are blind or partially sighted (who can feel line graphs, for instance), but also the haptic modality can be used alongside other modalities, to increase the number of variables being presented, or to duplicate some variables to reinforce the presentation. Over the last 20 years, a significant amount of research has been done in haptic data presentation; e.g., researchers have developed force feedback line graphs, bar charts, and other forms of haptic representations. However, previous research is published in different conferences and journals, with different application emphases. This paper gathers and collates these various designs to provide a comprehensive review of designs for haptic data visualization. The designs are classified by their representation: Charts, Maps, Signs, Networks, Diagrams, Images, and Tables. This review provides a comprehensive reference for researchers and learners, and highlights areas for further research.
Barmpoutis, Angelos; Alzate, Jose; Beekhuizen, Samantha; Delgado, Horacio; Donaldson, Preston; Hall, Andrew; Lago, Charlie; Vidal, Kevin; Fox, Emily J
2016-01-01
In this paper a prototype system is presented for home-based physical tele-therapy using a wearable device for haptic feedback. The haptic feedback is generated as a sequence of vibratory cues from 8 vibrator motors equally spaced along an elastic wearable band. The motors guide the patients' movement as they perform a prescribed exercise routine in a way that replaces the physical therapist's haptic guidance in an unsupervised or remotely supervised home-based therapy session. A pilot study of 25 human subjects was performed that focused on: a) testing the capability of the system to guide the users along arbitrary motion paths in space and b) comparing the motion of the users during typical physical therapy exercises with and without haptic-based guidance. The results demonstrate the efficacy of the proposed system.
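As a rough illustration of the guidance principle described above, the sketch below picks the vibrator closest to the desired correction direction, assuming the controller reduces guidance to a 2D direction error in the band's plane; the layout and the motor_for_direction helper are illustrative assumptions, not the authors' implementation.

```python
import math

NUM_MOTORS = 8  # motors equally spaced around an elastic band

def motor_for_direction(error_x, error_y):
    """Map a 2D direction correction to the nearest of 8 motors.

    The motor closest to the desired correction direction is pulsed,
    steering the wearer's limb toward the prescribed exercise path.
    """
    angle = math.atan2(error_y, error_x) % (2 * math.pi)
    sector = 2 * math.pi / NUM_MOTORS
    return int((angle + sector / 2) // sector) % NUM_MOTORS

# Example: a correction up and to the right activates motor 1 (45 degrees).
print(motor_for_direction(1.0, 1.0))
```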
Veras, Eduardo J; De Laurentis, Kathryn J; Dubey, Rajiv
2008-01-01
This paper describes the design and implementation of a control system that integrates visual and haptic information to give assistive force feedback through a haptic controller (Omni Phantom) to the user. A sensor-based assistive function and velocity scaling program provides force feedback that helps the user complete trajectory-following exercises for rehabilitation purposes. The system also incorporates a PUMA robot for teleoperation; a camera and a laser range finder, controlled in real time by a PC, help the user define the intended path to the selected target. The real-time force feedback from the remote robot to the haptic controller is made possible by using effective multithreading programming strategies in the control system design and by novel sensor integration. The sensor-based assistive function concept, applied to teleoperation as well as shared control, enhances the motion range and manipulation capabilities of users executing rehabilitation exercises such as trajectory following along a sensor-defined path. The system is modularly designed to allow for integration of different master devices and sensors. Furthermore, because this real-time system is versatile, the haptic component can be used separately from the telerobotic component; in other words, the haptic device can be used for rehabilitation purposes in cases where assistance is needed to perform tasks (e.g., stroke rehab) and also for teleoperation with force feedback and sensor assistance in either supervisory or automatic modes.
Zenner, Andre; Kruger, Antonio
2017-04-01
We define the concept of Dynamic Passive Haptic Feedback (DPHF) for virtual reality by introducing Shifty, a weight-shifting physical DPHF proxy object. This concept combines actuators known from active haptics with physical proxies known from passive haptics to construct proxies that automatically adapt their passive haptic feedback. We describe the concept behind our ungrounded weight-shifting DPHF proxy Shifty and the implementation of our prototype. In two experiments, we then investigate how Shifty can, by automatically changing its internal weight distribution, enhance the user's perception of the virtual objects being interacted with. In the first experiment, we show that Shifty can enhance the perception of virtual objects changing in shape, especially in length and thickness. Here, Shifty significantly increased the user's fun and perceived realism compared to an equivalent passive haptic proxy. In the second experiment, Shifty is used to pick up virtual objects of different virtual weights. The results show that Shifty enhances the perception of weight, and thus the perceived realism, by adapting its kinesthetic feedback to the picked-up virtual object. In the same experiment, we additionally show that specific combinations of haptic, visual and auditory feedback during the pick-up interaction help to compensate for the visual-haptic mismatch perceived during the shifting process.
NASA Technical Reports Server (NTRS)
Adams, Richard J.
2015-01-01
The patent-pending Glove-Enabled Computer Operations (GECO) design leverages extravehicular activity (EVA) glove design features as platforms for instrumentation and tactile feedback, enabling the gloves to function as human-computer interface devices. Flexible sensors in each finger enable control inputs that can be mapped to any number of functions (e.g., a mouse click, a keyboard strike, or a button press). Tracking of hand motion is interpreted alternatively as movement of a mouse (change in cursor position on a graphical user interface) or a change in hand position on a virtual keyboard. Programmable vibro-tactile actuators aligned with each finger enrich the interface by creating the haptic sensations associated with control inputs, such as recoil of a button press.
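The following minimal sketch illustrates the kind of mapping the abstract describes, from flex-sensor readings and hand motion to interface events. The threshold, the finger-to-action table, and the poll_glove helper are hypothetical placeholders; the patent-pending GECO design itself is not public.

```python
# Hypothetical thresholds and mappings for illustration only.
FLEX_THRESHOLD = 0.6  # normalized flex beyond which a finger "press" fires

FINGER_ACTIONS = {
    0: "mouse_click",
    1: "keyboard_strike",
    2: "button_press",
}

def poll_glove(flex_values, hand_delta):
    """Translate raw glove readings into interface events.

    flex_values: normalized [0, 1] flex reading per instrumented finger
    hand_delta:  (dx, dy) hand displacement, mapped to cursor motion
    """
    events = []
    for finger, value in enumerate(flex_values):
        if value > FLEX_THRESHOLD and finger in FINGER_ACTIONS:
            events.append(FINGER_ACTIONS[finger])
    dx, dy = hand_delta
    events.append(("move_cursor", dx, dy))
    return events

print(poll_glove([0.8, 0.2, 0.7], (5, -3)))
```

On real hardware, each fired event would also trigger the corresponding vibro-tactile actuator to simulate the recoil of the press.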
Tele-rehabilitation using in-house wearable ankle rehabilitation robot.
Jamwal, Prashant K; Hussain, Shahid; Mir-Nasiri, Nazim; Ghayesh, Mergen H; Xie, Sheng Q
2018-01-01
This article explores the wide-ranging potential of a wearable ankle robot for in-house rehabilitation. The presented robot has been conceptualized following a brief analysis of existing technologies, systems, and solutions for in-house physical ankle rehabilitation. Configuration design analysis and component selection for the ankle robot are discussed as part of the conceptual design. The complexities of human-robot interaction are closely encountered while maneuvering a rehabilitation robot. We present a fuzzy logic-based controller to perform the required robot-assisted ankle rehabilitation treatment. Designs of visual haptic interfaces are also discussed, which make the treatment engaging and motivate the subject to exert more effort and regain lost functions rapidly. The complex nature of web-based communication between the user and remotely located physiotherapy staff is also discussed. A high-level software architecture coupled with the robot ensures user-friendly operation. This software is made up of three important components: a patient-related database, a graphical user interface (GUI), and a library of virtual reality exercises specifically developed for ankle rehabilitation.
Visual and haptic integration in the estimation of softness of deformable objects
Cellini, Cristiano; Kaim, Lukas; Drewing, Knut
2013-01-01
Softness perception intrinsically relies on haptic information. However, through everyday experiences we learn correspondences between felt softness and the visual effects of exploratory movements that are executed to feel softness. Here, we studied how visual and haptic information is integrated to assess the softness of deformable objects. Participants discriminated between the softness of two softer or two harder objects using only-visual, only-haptic or both visual and haptic information. We assessed the reliabilities of the softness judgments using the method of constant stimuli. In visuo-haptic trials, discrepancies between the two senses' information allowed us to measure the contribution of the individual senses to the judgments. Visual information (finger movement and object deformation) was simulated using computer graphics; input in visual trials was taken from previous visuo-haptic trials. Participants were able to infer softness from vision alone, and vision considerably contributed to bisensory judgments (∼35%). The visual contribution was higher than predicted from models of optimal integration (senses are weighted according to their reliabilities). Bisensory judgments were less reliable than predicted from optimal integration. We conclude that the visuo-haptic integration of softness information is biased toward vision, rather than being optimal, and might even be guided by a fixed weighting scheme. PMID:25165510
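The "optimal integration" benchmark mentioned above is the standard maximum-likelihood cue-combination model, in which each sense is weighted by its reliability (inverse variance). A minimal sketch of the predictions the study compared against; the single-cue thresholds sigma_v and sigma_h in the example are invented for illustration.

```python
def optimal_integration(sigma_v, sigma_h):
    """Maximum-likelihood visuo-haptic cue combination.

    Each cue is weighted by its reliability (inverse variance); the
    combined estimate is predicted to be more reliable than either cue.
    """
    r_v, r_h = 1.0 / sigma_v**2, 1.0 / sigma_h**2
    w_v = r_v / (r_v + r_h)                # predicted visual weight
    sigma_vh = (1.0 / (r_v + r_h)) ** 0.5  # predicted bisensory threshold
    return w_v, sigma_vh

# Example (invented thresholds): a haptic cue twice as precise as the
# visual cue predicts a visual weight of 0.2, below the ~35% contribution
# the study actually observed.
w_v, sigma_vh = optimal_integration(sigma_v=2.0, sigma_h=1.0)
print(f"visual weight {w_v:.2f}, combined threshold {sigma_vh:.2f}")
```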
Ottensmeyer, M P; Ben-Ur, E; Salisbury, J K
2000-01-01
Current efforts in surgical simulation very often focus on creating realistic graphical feedback, but neglect some or all tactile and force (haptic) feedback that a surgeon would normally receive. Simulations that do include haptic feedback do not typically use real tissue compliance properties, favoring estimates and user feedback to determine realism. When tissue compliance data are used, there are virtually no in vivo property measurements to draw upon. Together with the Center for Innovative Minimally Invasive Therapy at the Massachusetts General Hospital, the Haptics Group is developing tools to introduce more comprehensive haptic feedback in laparoscopy simulators and to provide biological tissue material property data for our software simulation. The platform for providing haptic feedback is a PHANToM Haptic Interface, produced by SensAble Technologies, Inc. Our devices supplement the PHANToM to provide for grasping and, optionally, for the roll axis of the tool. Together with feedback from the PHANToM, which provides the pitch, yaw and thrust axes of a typical laparoscopy tool, we can recreate all of the haptic sensations experienced during laparoscopy. The devices integrate real laparoscopy tool handles and a compliant torso model to complete the set of visual and tactile sensations. Biological tissues are known to exhibit non-linear mechanical properties, and change their properties dramatically when removed from a living organism. To measure the properties in vivo, two devices are being developed. The first is a small-displacement, 1-D indenter. It will measure the linear tissue compliance (stiffness and damping) over a wide range of frequencies. These data will be used as inputs to a finite element or other model. The second device will be able to deflect tissues in 3-D over a larger range, so that the non-linearities due to changes in the tissue geometry will be measured. This will allow us to validate the performance of the model on large tissue deformations. Both devices are designed to pass through standard 12 mm laparoscopy trocars, and will be suitable for use during open or minimally invasive procedures. We plan to acquire data from pigs used by surgeons for training purposes, but conceivably, the tools could be refined for use on humans undergoing surgery. Our work will provide the necessary data input for surgical simulations to accurately model the force interactions that a surgeon would have with tissue, and will provide the force output to create a truly realistic simulation of minimally invasive surgery.
Haptic device for colonoscopy training simulator.
Kwon, Jun Yong; Woo, Hyun Soo; Lee, Doo Yong
2005-01-01
A new 2-DOF haptic device for a colonoscopy training simulator employing flexible endoscopes is developed. The user operates the device in the translational and roll directions. The developed folding guides of the device keep the endoscope tube straight. This helps transmit the large decoupled forces of the colonoscopy simulation to the user. The device also includes a mechanism to detect jiggling motion of the scope, allowing users to practice this important skill of colonoscopy. The device includes a PD controller to compensate for inertia and friction effects, providing users with a more transparent sensation of the simulation.
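A minimal sketch of the kind of PD compensation described, assuming position and velocity references from the simulation; the gains and the pd_compensation helper are illustrative assumptions, as the abstract does not give the actual control structure.

```python
def pd_compensation(pos_ref, pos, vel_ref, vel, kp=6.0, kd=0.4):
    """PD term that drives the mechanism to track the commanded motion,
    masking the device's own inertia and friction from the user.
    Gains are illustrative, not the published values."""
    return kp * (pos_ref - pos) + kd * (vel_ref - vel)

# Example: the device lags 2 mm behind and moves 5 mm/s too slowly.
print(pd_compensation(10.0, 8.0, 20.0, 15.0))  # -> 14.0 (force units)
```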
Haptics – Touchfeedback Technology Widening the Horizon of Medicine
Kapoor, Shalini; Arora, Pallak; Kapoor, Vikas; Jayachandran, Mahesh; Tiwari, Manish
2014-01-01
Haptics, or touch-sense haptic technology, is a major breakthrough in medical and dental interventions. Haptic perception is the process of recognizing objects through touch. Haptic sensations are created by actuators or motors which generate vibrations for the user and are controlled by embedded software integrated into the device. It takes advantage of a combination of the somatosensory pattern of the skin and proprioception of hand position. Anatomical and diagnostic knowledge, when combined with this touch-sense technology, has revolutionized medical education. This amalgamation of the worlds of diagnosis and surgical intervention adds precise robotic touch to the skill of the surgeon. A systematic literature review was done using MEDLINE, Google Search, and PubMed. The aim of this article was to introduce the fundamentals of haptic technology, its current applications in medical training and robotic surgeries, the limitations of haptics, and future prospects of haptics in medicine. PMID:24783164
Discriminating Tissue Stiffness with a Haptic Catheter: Feeling the Inside of the Beating Heart.
Kesner, Samuel B; Howe, Robert D
2011-01-01
Catheter devices allow physicians to access the inside of the human body easily and painlessly through natural orifices and vessels. Although catheters allow for the delivery of fluids and drugs, the deployment of devices, and the acquisition of measurements, they do not allow clinicians to assess the physical properties of tissue inside the body due to tissue motion and the transmission limitations of the catheter devices, including compliance, friction, and backlash. The goal of this research is to increase the tactile information available to physicians during catheter procedures by providing haptic feedback during palpation procedures. To accomplish this goal, we have developed the first motion-compensated actuated catheter system that enables haptic perception of fast-moving tissue structures. The actuated catheter is instrumented with a distal tip force sensor and a force feedback interface that allows users to adjust the position of the catheter while experiencing the forces on the catheter tip. The efficacy of this device and interface is evaluated through a psychophysical study comparing how accurately users can differentiate various materials attached to a cardiac motion simulator using the haptic device and a conventional manual catheter. The results demonstrate that haptics improves a user's ability to differentiate material properties and decreases the total number of errors by 50% over the manual catheter system. PMID:25285321
Haptics-based dynamic implicit solid modeling.
Hua, Jing; Qin, Hong
2004-01-01
This paper systematically presents a novel, interactive solid modeling framework, Haptics-based Dynamic Implicit Solid Modeling, which is founded upon volumetric implicit functions and powerful physics-based modeling. In particular, we augment our modeling framework with a haptic mechanism in order to take advantage of additional realism associated with a 3D haptic interface. Our dynamic implicit solids are semi-algebraic sets of volumetric implicit functions and are governed by the principles of dynamics, hence responding to sculpting forces in a natural and predictable manner. In order to directly manipulate existing volumetric data sets as well as point clouds, we develop a hierarchical fitting algorithm to reconstruct and represent discrete data sets using our continuous implicit functions, which permit users to further design and edit those existing 3D models in real-time using a large variety of haptic and geometric toolkits, and visualize their interactive deformation at arbitrary resolution. The additional geometric and physical constraints afford more sophisticated control of the dynamic implicit solids. The versatility of our dynamic implicit modeling enables the user to easily modify both the geometry and the topology of modeled objects, while the inherent physical properties can offer an intuitive haptic interface for direct manipulation with force feedback.
Evaluation of Wearable Haptic Systems for the Fingers in Augmented Reality Applications.
Maisto, Maurizio; Pacchierotti, Claudio; Chinello, Francesco; Salvietti, Gionata; De Luca, Alessandro; Prattichizzo, Domenico
2017-01-01
Although Augmented Reality (AR) has been around for almost five decades, only recently have we witnessed AR systems and applications entering our everyday life. Representative examples of this technological revolution are the smartphone games "Pokémon GO" and "Ingress" or the Google Translate real-time sign interpretation app. Even though AR applications are already quite compelling and widespread, users are still not able to physically interact with the computer-generated reality. In this respect, wearable haptics can provide the compelling illusion of touching the superimposed virtual objects without constraining the motion or the workspace of the user. In this paper, we present the experimental evaluation of two wearable haptic interfaces for the fingers in three AR scenarios, enrolling 38 participants. In the first experiment, subjects were requested to write on a virtual board using a real chalk. The haptic devices provided the interaction forces between the chalk and the board. In the second experiment, subjects were asked to pick and place virtual and real objects. The haptic devices provided the interaction forces due to the weight of the virtual objects. In the third experiment, subjects were asked to balance a virtual sphere on a real cardboard. The haptic devices provided the interaction forces due to the weight of the virtual sphere rolling on the cardboard. Providing haptic feedback through the considered wearable devices significantly improved the performance of all the considered tasks. Moreover, subjects significantly preferred conditions providing wearable haptic feedback.
Rajanna, Vijay; Vo, Patrick; Barth, Jerry; Mjelde, Matthew; Grey, Trevor; Oduola, Cassandra; Hammond, Tracy
2016-03-01
A carefully planned, structured, and supervised physiotherapy program following surgery is crucial for the successful recovery from physical injuries. Nearly 50% of surgeries fail due to unsupervised and erroneous physiotherapy. The demand for a physiotherapist for an extended period is expensive to afford and sometimes inaccessible. Researchers have tried to leverage advancements in wearable sensors and motion tracking by building affordable, automated physio-therapeutic systems that direct a physiotherapy session by providing audio-visual feedback on the patient's performance. There are many aspects of an automated physiotherapy program which are yet to be addressed by existing systems: a wide classification of patients' physiological conditions to be diagnosed, multiple demographics of patients (blind, deaf, etc.), and the need to persuade patients to adopt the system for an extended period for self-care. In our research, we have tried to address these aspects by building a health behavior change support system called KinoHaptics for post-surgery rehabilitation. KinoHaptics is an automated, wearable, haptic-assisted, physio-therapeutic system that can be used by a wide variety of demographics and for various physiological conditions of patients. The system provides rich and accurate vibro-haptic feedback that can be felt by the user, irrespective of physiological limitations. KinoHaptics is built to ensure that no injuries are induced during the rehabilitation period. The persuasive nature of the system allows for personal goal-setting, progress tracking, and, most importantly, life-style compatibility. The system was evaluated under laboratory conditions, involving 14 users. Results show that KinoHaptics is highly convenient to use, and the vibro-haptic feedback is intuitive, accurate, and has been shown to prevent accidental injuries. Also, results show that KinoHaptics is persuasive in nature, as it supports behavior change and habit building. The successful acceptance of KinoHaptics, an automated, wearable, haptic-assisted, physio-therapeutic system, proves the need for and future scope of automated physio-therapeutic systems for self-care and behavior change. It also proves that such systems incorporated with vibro-haptic feedback encourage strong adherence to the physiotherapy program, and can have a profound impact on the physiotherapy experience, resulting in a higher acceptance rate.
Intuitive tactile zooming for graphics accessed by individuals who are blind and visually impaired.
Rastogi, Ravi; Pawluk, T V Dianne; Ketchum, Jessica
2013-07-01
One possibility for providing access to visual graphics for those who are visually impaired is to present them tactually; unfortunately, details easily available to vision need to be magnified to be accessible through touch. For this, we propose an "intuitive" zooming algorithm to solve potential problems with directly applying visual zooming techniques to haptic displays that sense the current location of a user on a virtual diagram with a position sensor and then provide the appropriate local information either through force or tactile feedback. Our technique works by determining and then traversing the levels of an object tree hierarchy of a diagram. In this manner, the zoom steps adjust to the content to be viewed, avoid clipping, and do not zoom when no object is present. The algorithm was tested using a small, "mouse-like" display with tactile feedback on pictures representing houses in a community and boats on a lake. We asked the users to answer questions related to details in the pictures. Comparing our technique to linear and logarithmic step zooming, we found a significant increase in the correctness of the responses (odds ratios of 2.64:1 and 2.31:1, respectively) and usability (differences of 36% and 19%, respectively) using our "intuitive" zooming technique.
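A minimal sketch of the tree-traversal idea behind the "intuitive" zoom, assuming a simple bounding-box object hierarchy; the Node class and the example scene are invented for illustration, not the authors' data structures.

```python
class Node:
    def __init__(self, name, bbox, children=()):
        self.name, self.bbox, self.children = name, bbox, list(children)

    def contains(self, x, y):
        x0, y0, x1, y1 = self.bbox
        return x0 <= x <= x1 and y0 <= y <= y1

def zoom_in(current, x, y):
    """Descend one level of the object hierarchy under the probe position.

    The zoom step adapts to the content: if no child lies under (x, y),
    the view is left unchanged instead of blindly magnifying empty space.
    """
    for child in current.children:
        if child.contains(x, y):
            return child   # new view = the child's bounding box
    return current         # no object here: do not zoom

# A community picture: a house that contains a door.
door = Node("door", (2, 0, 3, 2))
house = Node("house", (0, 0, 6, 5), [door])
scene = Node("community", (0, 0, 100, 100), [house])
view = zoom_in(scene, 2.5, 1.0)  # -> house
view = zoom_in(view, 2.5, 1.0)   # -> door
print(view.name)
```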
Analysis of hand contact areas and interaction capabilities during manipulation and exploration.
Gonzalez, Franck; Gosselin, Florian; Bachta, Wael
2014-01-01
Manual human-computer interfaces for virtual reality are designed to allow an operator to interact with a computer simulation as naturally as possible. Dexterous haptic interfaces are best suited for this goal. They give intuitive and efficient control of the environment with haptic and tactile feedback. This paper is aimed at helping in the choice of the interaction areas to be taken into account in the design of such interfaces. The literature dealing with hand interactions is first reviewed in order to point out the contact areas involved in exploration and manipulation tasks. Their frequencies of use are then extracted from existing recordings. The results are gathered in an original graphical interaction map allowing for a simple visualization of the way the hand is used, and compared with a map of mechanoreceptor densities. Then an interaction tree, mapping the relative amount of actions made available through the use of a given contact area, is built and correlated with the losses of hand function induced by amputations. A rating of some existing haptic interfaces and guidelines for their design are finally presented to illustrate a possible use of the developed graphical tools.
Covarrubias, Mario; Bordegoni, Monica; Cugini, Umberto
2013-01-01
In this article, we present an approach that uses both two force sensitive handles (FSH) and a flexible capacitive touch sensor (FCTS) to drive a haptic-based immersive system. The immersive system has been developed as part of a multimodal interface for product design. The haptic interface consists of a strip that can be used by product designers to evaluate the quality of a 3D virtual shape by using touch, vision and hearing and, also, to interactively change the shape of the virtual object. Specifically, the user interacts with the FSH to move the virtual object and to appropriately position the haptic interface for retrieving the six degrees of freedom required for both manipulation and modification modalities. The FCTS allows the system to track the movement and position of the user's fingers on the strip, which is used for rendering visual and sound feedback. Two evaluation experiments are described, which involve both the evaluation and the modification of a 3D shape. Results show that the use of the haptic strip for the evaluation of aesthetic shapes is effective and supports product designers in the appreciation of the aesthetic qualities of the shape. PMID:24113680
A Review of Simulators with Haptic Devices for Medical Training.
Escobar-Castillejos, David; Noguez, Julieta; Neri, Luis; Magana, Alejandra; Benes, Bedrich
2016-04-01
Medical procedures often involve the use of the tactile sense to manipulate organs or tissues by using special tools. Doctors require extensive preparation in order to perform them successfully; for example, research shows that a minimum of 750 operations are needed to acquire sufficient experience to perform medical procedures correctly. Haptic devices have become an important training alternative and have been considered to improve medical training because they let users interact with virtual environments by adding the sense of touch to the simulation. Previous articles in the field state that haptic devices enhance the learning of surgeons compared to current training environments used in medical schools (corpses, animals, or synthetic skin and organs). Consequently, virtual environments use haptic devices to improve realism. The goal of this paper is to provide a state-of-the-art review of recent medical simulators that use haptic devices. In particular we focus on stitching, palpation, dental procedures, endoscopy, laparoscopy, and orthopaedics. These simulators are reviewed and compared from the viewpoint of the technology used, the number of degrees of freedom, degrees of force feedback, perceived realism, immersion, and feedback provided to the user. In the conclusion, several observations per area and suggestions for future work are provided.
Sensorimotor Interactions in the Haptic Perception of Virtual Objects
1997-01-01
Compared to our understanding of vision and audition, our knowledge of human haptic perception is very limited. Some preliminary work has been done on the influence of other modalities, such as vision and audition, on the haptic perception of viscosity or mass. (Only fragments of this scanned report are legible.)
Multimodal Virtual Environments: MAGIC Toolkit and Visual-Haptic Interaction Paradigms
1998-01-01
This report describes the MAGIC Toolkit and visual-haptic interaction paradigms, building on haptic display work at Northwestern University [Colgate, 1994]. Among the legible fragments: it is possible for a user to touch one side of a thin object and be propelled out the opposite side; realism is greatest when there is a high correlation in motion and force between the visual and haptic realms; the report concludes with an evaluation of the application. (Only table-of-contents fragments and isolated sentences of this scanned report are legible.)
Vision-Based Haptic Feedback for Remote Micromanipulation in-SEM Environment
NASA Astrophysics Data System (ADS)
Bolopion, Aude; Dahmen, Christian; Stolle, Christian; Haliyo, Sinan; Régnier, Stéphane; Fatikow, Sergej
2012-07-01
This article presents an intuitive environment for remote micromanipulation composed of both haptic feedback and virtual reconstruction of the scene. To enable nonexpert users to perform complex teleoperated micromanipulation tasks, it is of utmost importance to provide them with information about the 3-D relative positions of the objects and the tools. Haptic feedback is an intuitive way to transmit such information. Since position sensors are not available at this scale, visual feedback is used to derive information about the scene. In this work, three different techniques are implemented, evaluated, and compared to derive the object positions from scanning electron microscope images. The modified correlation matching with generated template algorithm is accurate and provides reliable detection of objects. To track the tool, a marker-based approach is chosen, since fast detection is required for stable haptic feedback. Information derived from these algorithms is used to propose an intuitive remote manipulation system that enables users situated in geographically distant sites to benefit from specific equipment, such as SEMs. Stability of the haptic feedback is ensured by the minimization of delays, the computational efficiency of the vision algorithms, and the proper tuning of the haptic coupling. Virtual guides are proposed to avoid any involuntary collisions between the tool and the objects. This approach is validated by a teleoperation involving melamine microspheres with a diameter of less than 2 μm between Paris, France and Oldenburg, Germany.
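The correlation-matching step can be illustrated with a brute-force normalized cross-correlation, sketched below under the assumption of grayscale SEM images stored as NumPy arrays; the authors' modified algorithm with generated templates is more elaborate, and production code would use FFTs or an optimized library.

```python
import numpy as np

def ncc_match(image, template):
    """Locate a template in an image by normalized cross-correlation.

    Slides the template over every position and scores the normalized
    correlation; the best-scoring offset is the detected object position.
    """
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom == 0:
                continue  # flat patch carries no correlation information
            score = (p * t).sum() / denom
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best
```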
A design of hardware haptic interface for gastrointestinal endoscopy simulation.
Gu, Yunjin; Lee, Doo Yong
2011-01-01
Gastrointestinal endoscopy simulations have been developed to train endoscopic procedures, which require hundreds of practice sessions to become competent in the skills. Even though realistic haptic feedback is important for providing a realistic sensation to the user, most previous simulations, including commercial ones, have mainly focused on providing realistic visual feedback. In this paper, we propose a novel design of a portable haptic interface, which provides 2-DOF force feedback, for gastrointestinal endoscopy simulation. The haptic interface consists of completely decoupled translational and rotational force feedback mechanisms, and a gripping mechanism for controlling the connection between the endoscope and the force feedback mechanism.
A Haptic-Enhanced System for Molecular Sensing
NASA Astrophysics Data System (ADS)
Comai, Sara; Mazza, Davide
The science of haptics has received enormous attention in the last decade. One of the major application trends of haptics technology is data visualization and training. In this paper, we present a haptically-enhanced system for manipulation and tactile exploration of molecules. The geometrical models of molecules are extracted either from theoretical or empirical data using file formats widely adopted in the chemical and biological fields. The addition of information computed with computational chemistry tools allows users to feel the interaction forces between an explored molecule and a charge associated with the haptic device, and to visualize a huge amount of numerical data in a more comprehensible way. The developed tool can be used either for teaching or research purposes due to its high reliance on both theoretical and experimental data.
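A minimal sketch of the force computation such a system might perform, assuming a single point charge on the haptic probe and precomputed partial charges on the molecule's atoms; this plain Coulomb sum is an illustrative stand-in for the tool's actual computational-chemistry pipeline.

```python
import numpy as np

K = 8.9875e9  # Coulomb constant, N*m^2/C^2

def probe_force(probe_pos, probe_charge, atom_positions, atom_charges):
    """Net electrostatic force on the probe charge held by the haptic device.

    Sums Coulomb contributions from each partial charge of the molecule;
    a scaled version of this vector is what the user would feel.
    """
    force = np.zeros(3)
    for pos, q in zip(atom_positions, atom_charges):
        r = probe_pos - pos  # vector from atom to probe
        force += K * probe_charge * q * r / np.linalg.norm(r) ** 3
    return force

# Toy example: a dipole-like pair of partial charges (positions in meters).
atoms = np.array([[0.0, 0.0, 0.0], [1.5e-10, 0.0, 0.0]])
charges = np.array([0.4e-19, -0.4e-19])
print(probe_force(np.array([0.0, 2.0e-10, 0.0]), 1.6e-19, atoms, charges))
```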
Human eye haptics-based multimedia.
Velandia, David; Uribe-Quevedo, Alvaro; Perez-Gutierrez, Byron
2014-01-01
Immersive and interactive multimedia applications offer complementary study tools in anatomy, as users can explore 3D models while obtaining information about the organ, tissue or part being explored. Haptics increases the sense of interaction with virtual objects, improving the user experience in a more realistic manner. Common eye-study tools are books, illustrations and assembly models, and more recently these are being complemented by mobile apps whose 3D capabilities, computing power and user bases are increasing. The goal of this project is to develop a complementary eye anatomy and pathology study tool using deformable models within a multimedia application, offering students the opportunity to explore the eye from up close and within, with relevant information. Validation of the tool provided feedback on the potential of the development, along with suggestions on improving haptic feedback and navigation.
Do Haptic Representations Help Complex Molecular Learning?
ERIC Educational Resources Information Center
Bivall, Petter; Ainsworth, Shaaron; Tibell, Lena A. E.
2011-01-01
This study explored whether adding a haptic interface (that provides users with somatosensory information about virtual objects by force and tactile feedback) to a three-dimensional (3D) chemical model enhanced students' understanding of complex molecular interactions. Two modes of the model were compared in a between-groups pre- and posttest…
Object discrimination using optimized multi-frequency auditory cross-modal haptic feedback.
Gibson, Alison; Artemiadis, Panagiotis
2014-01-01
As the field of brain-machine interfaces and neuro-prosthetics continues to grow, there is a pressing need for sensor and actuation mechanisms that can provide haptic feedback to the user. Current technologies employ expensive, invasive and often inefficient force feedback methods, resulting in an unrealistic solution for individuals who rely on these devices. This paper responds through the development, integration and analysis of a novel feedback architecture where haptic information during the neural control of a prosthetic hand is perceived through multi-frequency auditory signals. By representing force magnitude with volume and force location with frequency, the feedback architecture can translate the haptic experiences of a robotic end effector into the alternative sensory modality of sound. Previous research with the proposed cross-modal feedback method confirmed its learnability, so the current work aimed to investigate which frequency map (i.e., frequency-specific locations on the hand) is optimal in helping users distinguish between hand-held objects and the tasks associated with them. After short use of the cross-modal feedback during the electromyographic (EMG) control of a prosthetic hand, testing results show that users are able to use auditory feedback alone to discriminate between everyday objects. While users showed adaptation to three different frequency maps, the simplest map, containing only two frequencies, was found to be the most useful in discriminating between objects. This outcome provides support for the feasibility and practicality of the cross-modal feedback method during the neural control of prosthetics.
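A minimal sketch of the cross-modal encoding described, with force location selecting the tone and force magnitude setting the volume; the two-frequency map, the force range, and the haptic_to_audio helper are hypothetical placeholders, not the study's actual parameters.

```python
# Hypothetical two-frequency map (the study's best performer used only
# two frequencies); tones and sensor range are illustrative.
FREQ_MAP = {"fingers": 440.0, "palm": 220.0}  # Hz
MAX_FORCE = 20.0  # N, assumed sensor range

def haptic_to_audio(location, force):
    """Encode a contact event as a (frequency, volume) pair.

    Pitch says where the hand is loaded; louder means harder contact.
    """
    freq = FREQ_MAP[location]
    volume = min(force / MAX_FORCE, 1.0)
    return freq, volume

print(haptic_to_audio("palm", 5.0))  # -> (220.0, 0.25)
```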
Culbertson, Heather; Kuchenbecker, Katherine J
2017-01-01
Interacting with physical objects through a tool elicits tactile and kinesthetic sensations that comprise your haptic impression of the object. These cues, however, are largely missing from interactions with virtual objects, yielding an unrealistic user experience. This article evaluates the realism of virtual surfaces rendered using haptic models constructed from data recorded during interactions with real surfaces. The models include three components: surface friction, tapping transients, and texture vibrations. We render the virtual surfaces on a SensAble Phantom Omni haptic interface augmented with a Tactile Labs Haptuator for vibration output. We conducted a human-subject study to assess the realism of these virtual surfaces and the importance of the three model components. Following a perceptual discrepancy paradigm, subjects compared each of 15 real surfaces to a full rendering of the same surface plus versions missing each model component. The realism improvement achieved by including friction, tapping, or texture in the rendering was found to directly relate to the intensity of the surface's property in that domain (slipperiness, hardness, or roughness). A subsequent analysis of forces and vibrations measured during interactions with virtual surfaces indicated that the Omni's inherent mechanical properties corrupted the user's haptic experience, decreasing realism of the virtual surface.
Telerobotic Haptic Exploration in Art Galleries and Museums for Individuals with Visual Impairments.
Park, Chung Hyuk; Ryu, Eun-Seok; Howard, Ayanna M
2015-01-01
This paper presents a haptic telepresence system that enables visually impaired users to explore locations with rich visual observation, such as art galleries and museums, by using a telepresence robot, an RGB-D sensor (color and depth camera), and a haptic interface. Recent improvements in RGB-D sensors have enabled real-time access to 3D spatial information in the form of point clouds. However, the real-time representation of these data in the form of a tangible haptic experience has not been sufficiently explored, especially in the case of telepresence for individuals with visual impairments. Thus, the proposed system addresses the real-time haptic exploration of remote 3D information through video encoding and real-time 3D haptic rendering of the remote real-world environment. This paper investigates two scenarios in haptic telepresence, i.e., mobile navigation and object exploration in a remote environment. Participants with and without visual impairments participated in our experiments based on the two scenarios, and the system performance was validated. In conclusion, the proposed framework provides a new methodology of haptic telepresence for individuals with visual impairments by providing an enhanced interactive experience where they can remotely access public places (art galleries and museums) with the aid of the haptic modality and robotic telepresence.
Dynamics modeling for parallel haptic interfaces with force sensing and control.
Bernstein, Nicholas; Lawrence, Dale; Pao, Lucy
2013-01-01
Closed-loop force control can be used on haptic interfaces (HIs) to mitigate the effects of mechanism dynamics. A single multidimensional force-torque sensor is often employed to measure the interaction force between the haptic device and the user's hand. The parallel haptic interface at the University of Colorado (CU) instead employs smaller 1D force sensors oriented along each of the five actuating rods to build up a 5D force vector. This paper shows that a particular manipulandum/hand partition in the system dynamics is induced by the placement and type of force sensing, and discusses the implications on force and impedance control for parallel haptic interfaces. The details of a "squaring down" process are also discussed, showing how to obtain reduced degree-of-freedom models from the general six degree-of-freedom dynamics formulation.
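A minimal sketch of the "squaring up" of five 1-D rod readings into a multidimensional force vector, assuming a known linear map from rod tensions to device coordinates; the identity matrix below is an illustrative stand-in for the mechanism's actual force Jacobian transpose.

```python
import numpy as np

# Illustrative geometry only: in a real device each column encodes the
# direction (and moment terms) through which one rod loads the handle.
JT = np.eye(5)  # stand-in for the force Jacobian transpose

def assemble_wrench(rod_forces):
    """Build a 5-D interaction force vector from five 1-D rod sensors.

    Each rod sensor reads only the force along its own axis; stacking
    the readings through the Jacobian transpose recovers the kind of
    multidimensional measurement a single force-torque sensor would give.
    """
    return JT @ np.asarray(rod_forces, dtype=float)

print(assemble_wrench([1.0, 0.0, 0.5, 0.0, 0.2]))
```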
Palpation simulator with stable haptic feedback.
Kim, Sang-Youn; Ryu, Jee-Hwan; Lee, WooJeong
2015-01-01
The main difficulty in constructing palpation simulators is to compute and to generate stable and realistic haptic feedback without vibration. When a user haptically interacts with highly non-homogeneous soft tissues through a palpation simulator, a sudden change of stiffness in target tissues causes unstable interaction with the object. We propose a model consisting of a virtual adjustable damper and an energy measuring element. The energy measuring element gauges energy which is stored in a palpation simulator and the virtual adjustable damper dissipates the energy to achieve stable haptic interaction. To investigate the haptic behavior of the proposed method, impulse and continuous inputs are provided to target tissues. If a haptic interface point meets with the hardest portion in the target tissues modeled with a conventional method, we observe unstable motion and feedback force. However, when the target tissues are modeled with the proposed method, a palpation simulator provides stable interaction without vibration. The proposed method overcomes a problem in conventional haptic palpation simulators where unstable force or vibration can be generated if there is a big discrepancy in material property between an element and its neighboring elements in target tissues.
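A minimal sketch in the spirit of the proposed model, pairing an energy-measuring element with a virtual adjustable damper; the passivity-style bookkeeping below is an assumption for illustration, not the authors' exact formulation.

```python
class EnergyBoundedDamper:
    """Virtual adjustable damper driven by an energy-measuring element.

    The element integrates the power the simulator delivers to the user;
    whenever the net energy would turn active (negative), the damper is
    sized to dissipate exactly the excess, suppressing the vibration that
    sudden stiffness jumps otherwise excite.
    """

    def __init__(self):
        self.energy = 0.0  # net energy exchanged with the user, J

    def step(self, force_cmd, velocity, dt):
        """Return a stabilized force for one haptic tick."""
        self.energy += force_cmd * velocity * dt  # power integral
        if self.energy < 0.0 and abs(velocity) > 1e-9:
            alpha = -self.energy / (velocity ** 2 * dt)  # damping needed
            self.energy = 0.0
            return force_cmd - alpha * velocity
        return force_cmd
```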
User Acceptance of a Haptic Interface for Learning Anatomy
ERIC Educational Resources Information Center
Yeom, Soonja; Choi-Lundberg, Derek; Fluck, Andrew; Sale, Arthur
2013-01-01
Visualizing the structure and relationships in three dimensions (3D) of organs is a challenge for students of anatomy. To provide an alternative way of learning anatomy engaging multiple senses, we are developing a force-feedback (haptic) interface for manipulation of 3D virtual organs, using design research methodology, with iterations of system…
Overview Electrotactile Feedback for Enhancing Human Computer Interface
NASA Astrophysics Data System (ADS)
Pamungkas, Daniel S.; Caesarendra, Wahyu
2018-04-01
To achieve effective interaction between a human and a computing device or machine, adequate feedback from the computing device or machine is required. Recently, haptic feedback is increasingly being utilised to improve the interactivity of the Human Computer Interface (HCI). Most existing haptic feedback enhancements aim at producing forces or vibrations to enrich the user's interactive experience. However, these force- and/or vibration-actuated haptic feedback systems can be bulky and uncomfortable to wear, and are only capable of delivering a limited amount of information to the user, which can limit both their effectiveness and the applications they can be applied to. To address this deficiency, electrotactile feedback is used. This involves delivering haptic sensations to the user by electrically stimulating nerves in the skin via electrodes placed on the surface of the skin. This paper presents a review and explores the capability of electrotactile feedback for HCI applications. In addition, the sensory receptors within the skin that sense tactile stimuli and electric currents are described, and several factors that influence the transmission of electric signals to the brain via the skin are explained.
Emotion Telepresence: Emotion Augmentation through Affective Haptics and Visual Stimuli
NASA Astrophysics Data System (ADS)
Tsetserukou, D.; Neviarouskaya, A.
2012-03-01
The paper focuses on a novel concept of emotional telepresence. The iFeel_IM! system, which is in the vanguard of this technology, integrates the 3D virtual world Second Life, an intelligent component for automatic emotion recognition from text messages, and innovative affective haptic interfaces providing additional nonverbal communication channels through simulation of emotional feedback and social touch (physical co-presence). Users can not only exchange messages but also emotionally and physically feel the presence of the communication partner (e.g., family member, friend, or beloved person). The next prototype of the system will include a tablet computer. The user will be able to interact haptically with an avatar, and thus influence its mood and the emotion of the partner. A finger gesture language will be designed for communication with the avatar. This will bring a new level of immersion to on-line communication.
Modeling and test of a kinaesthetic actuator based on MR fluid for haptic applications.
Yang, Tae-Heon; Koo, Jeong-Hoi; Kim, Sang-Youn; Kwon, Dong-Soo
2017-03-01
Haptic display units have been widely used for conveying button sensations to users, primarily employing vibrotactile actuators. However, the human feeling of pressing buttons mainly relies on kinaesthetic sensations (rather than vibrotactile sensations), and few studies exist on small-scale kinaesthetic haptic units. Thus, the primary goals of this paper are to design a miniature kinaesthetic actuator based on Magneto-Rheological (MR) fluid that can convey various button-clicking sensations and to experimentally evaluate its haptic performance. The design focus of the proposed actuator was to produce sufficiently large actuation forces (resistive forces) for human users within a given size constraint and to offer a wide range of actuation forces for conveying vivid haptic sensations to users. To this end, this study first performed a series of parametric studies using mathematical force models for multiple operating modes of MR fluid in conjunction with finite element electromagnetism analysis. After selecting design parameters based on the parametric studies, a prototype actuator was constructed, and its performance was evaluated using a dynamic mechanical analyzer. It measured the actuator's resistive force with a varying stroke (pressed depth) up to 1 mm and a varying input current from 0 A to 200 mA. The results show that the proposed actuator creates a wide range of resistive forces, from around 2 N (off-state) to over 9.5 N at 200 mA. In order to assess the prototype's performance from the perspective of haptic applications, a maximum force rate was calculated to determine the just noticeable difference in force changes for the 1 mm stroke of the actuator. The results show that the force rate is sufficient to mimic various levels of button sensations, indicating that the proposed kinaesthetic actuator can offer a wide range of resistive force changes that can be conveyed to human operators.
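A minimal sketch of how the reported force range might be used at runtime, interpolating a calibration table to map coil current to button resistance. Only the end points (about 2 N off-state, 9.5 N at 200 mA) come from the abstract; the intermediate points and helper names are invented.

```python
import numpy as np

# End points from the abstract; intermediate points are hypothetical
# placeholders for a measured calibration table.
currents_mA = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
forces_N = np.array([2.0, 3.8, 5.9, 7.8, 9.5])

def resistive_force(current_mA):
    """Look up the button resistance produced by a given coil current."""
    return float(np.interp(current_mA, currents_mA, forces_N))

def current_for_force(target_N):
    """Invert the calibration to render a desired click force."""
    return float(np.interp(target_N, forces_N, currents_mA))

print(resistive_force(120.0), current_for_force(6.0))
```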
Haptic display for the VR arthroscopy training simulator
NASA Astrophysics Data System (ADS)
Ziegler, Rolf; Brandt, Christoph; Kunstmann, Christian; Mueller, Wolfgang; Werkhaeuser, Holger
1997-05-01
A specific desire to find new training methods arose from the new field called 'minimally invasive surgery.' With technical advances, modern video arthroscopy became the standard procedure in the OR. Holding the optical system with the video camera in one hand and watching the operation field on the monitor, the other hand is free to guide, e.g., a probe. As arthroscopy became a more common procedure, it became obvious that some sort of special training was necessary to guarantee a certain level of qualification of the surgeons. Therefore, a hospital in Frankfurt, Germany approached the Fraunhofer Institute for Computer Graphics to develop a training system for arthroscopy based on VR techniques. The main drawback of the developed simulator, however, is the absence of haptic perception, especially force feedback. In cooperation with the Department of Electro-Mechanical Construction at the Darmstadt Technical University, we have designed and built a haptic display for the VR arthroscopy training simulator. In parallel, we developed a concept for the integration of the haptic display in a configurable way.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anthony L. Crawford
2012-08-01
Natural movements and force feedback are important elements in using teleoperated equipment if complex and speedy manipulation tasks are to be accomplished in remote and/or hazardous environments, such as hot cells, glove boxes, decommissioning, explosives disarmament, and space, to name a few. In order to achieve this end, the research presented in this paper has developed an admittance-type, exoskeleton-like, multi-fingered haptic hand user interface that secures the user's palm and provides 3-dimensional force feedback to the user's fingertips. Unlike conventional haptic hand user interfaces, which limit themselves to integrating the human hand's characteristics just into the system's mechanical design, this system also carries that inspiration into the designed user interface's controller. This is achieved by manifesting the property differences of manipulation and grasping activities, as they pertain to the human hand, in a nonlinear master-slave force relationship. The results presented in this paper show that the admittance-type system has sufficient bandwidth that it appears nearly transparent to the user when the user is in free motion, and that when the system is subjected to a manipulation task, increased performance is achieved using the nonlinear force relationship compared to the traditional linear scaling techniques implemented in the vast majority of systems.
River multimodal scenario for rehabilitation robotics.
Munih, Marko; Novak, Domen; Milavec, Maja; Ziherl, Jaka; Olenšek, Andrej; Mihelj, Matjaž
2011-01-01
This paper presents the novel "River" multimodal rehabilitation robotics scenario, which includes video, audio and haptic modalities. Elements contributing to intrinsic motivation are carefully joined across the three modalities to increase the motivation of the user. The user first needs to perform a motor action, then receives a cognitive challenge that is solved with adequate motor activity. Audio includes environmental sounds, music and spoken instructions or encouraging statements. Sounds and music were classified according to the arousal-valence space. The haptic modality can provide catching, grasping, tunnel or adaptive assistance, all depending on the user's needs. The scenario was evaluated in 16 stroke users, who responded to it favourably according to the Intrinsic Motivation Inventory questionnaire. Additionally, the River multimodal environment seems to elicit higher motivation than a simpler apple pick-and-place multimodal task.
Virtual Reality simulator for dental anesthesia training in the inferior alveolar nerve block.
Corrêa, Cléber Gimenez; Machado, Maria Aparecida de Andrade Moreira; Ranzini, Edith; Tori, Romero; Nunes, Fátima de Lourdes Santos
2017-01-01
This study presents the development and validation of a dental anesthesia training simulator, specifically for the inferior alveolar nerve block (IANB). The system developed provides the tactile sensation of inserting a real needle into a human patient, using Virtual Reality (VR) techniques and a haptic device that can provide perceived force feedback in the needle insertion task during the anesthesia procedure. To simulate a realistic anesthesia procedure, a Carpule syringe was coupled to the haptic device. The Volere method was used to elicit requirements from users in the Dentistry area; repeated-measures two-way ANOVA (Analysis of Variance), Tukey post-hoc tests, and averages were used for the analysis of results. A questionnaire-based subjective evaluation method was applied to collect information about the simulator, and 26 people participated in the experiments (12 beginners, 12 at intermediate level, and 2 experts). The questionnaire covered user profile, preferences (number of viewpoints, texture of the objects, and haptic device handle), as well as visual (appearance, scale, and position of objects) and haptic aspects (motion space, tactile sensation, and motion reproduction). The visual aspect was considered appropriate, while the haptic feedback must be improved, which users can do by calibrating the virtual tissues' resistance. The evaluation of visual aspects was influenced by the participants' experience, according to the ANOVA test (F=15.6, p=0.0002, with p<0.01). The user preferences were a simulator with two viewpoints, objects with image-based textures, and the device with a syringe coupled to it. The simulation was considered thoroughly satisfactory for anesthesia training, considering the needle insertion task, which includes the correct insertion point and depth, as well as the perception of tissue resistances during the insertion.
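A minimal sketch of the calibratable tissue-resistance idea, modeling the insertion force as piecewise stiffness accumulated along the needle path; the layer depths, stiffness values, and needle_force helper are hypothetical placeholders a user would tune, not the simulator's published model.

```python
# Hypothetical tissue layers along the IANB insertion path:
# (depth in mm reached at the end of the layer, stiffness in N/mm).
LAYERS = [
    (2.0, 0.08),   # mucosa
    (10.0, 0.25),  # muscle
    (25.0, 0.12),  # loose connective tissue near the nerve
]

def needle_force(depth_mm):
    """Resistance felt at a given insertion depth, summed layer by layer."""
    force, start = 0.0, 0.0
    for end, k in LAYERS:
        if depth_mm <= start:
            break
        segment = min(depth_mm, end) - start  # portion of this layer crossed
        force += k * segment
        start = end
    return force

print(round(needle_force(12.0), 3))  # -> 2.4
```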
Virtual reality simulation: basic concepts and use in endoscopic neurosurgery training.
Cohen, Alan R; Lohani, Subash; Manjila, Sunil; Natsupakpong, Suriya; Brown, Nathan; Cavusoglu, M Cenk
2013-08-01
Virtual reality simulation is a promising means of training surgical residents outside the operating room. It is also a useful aid to anatomic study, residency training, surgical rehearsal, credentialing, and recertification. Surgical simulation is based on a virtual reality with varying degrees of immersion and realism. Simulators provide a no-risk environment for harmless and repeatable practice. Virtual reality has three main components of simulation: graphics/volume rendering, model behavior/tissue deformation, and haptic feedback. The challenge of accurately simulating the forces and tactile sensations experienced in neurosurgery limits the sophistication of a virtual simulator. The limited haptic feedback available in minimally invasive neurosurgery makes it a favorable subject for simulation. Virtual simulators with realistic graphics and force feedback have been developed for ventriculostomy, intraventricular surgery, and transsphenoidal pituitary surgery, thus allowing preoperative study of the individual anatomy and increasing the safety of the procedure. The authors also present experiences with their own virtual simulation of endoscopic third ventriculostomy.
High-fidelity bilateral teleoperation systems and the effect of multimodal haptics.
Tavakoli, Mahdi; Aziminejad, Arash; Patel, Rajni V; Moallem, Mehrdad
2007-12-01
In master-slave teleoperation applications that deal with a delicate and sensitive environment, it is important to provide haptic feedback of slave/environment interactions to the user's hand as it improves task performance and teleoperation transparency (fidelity), which is the extent of telepresence of the remote environment available to the user through the master-slave system. For haptic teleoperation, in addition to a haptics-capable master interface, often one or more force sensors are also used, which warrant new bilateral control architectures while increasing the cost and the complexity of the teleoperation system. In this paper, we investigate the added benefits of using force sensors that measure hand/master and slave/environment interactions and of utilizing local feedback loops on the teleoperation transparency. We compare the two-channel and the four-channel bilateral control systems in terms of stability and transparency, and study the stability and performance robustness of the four-channel method against nonidealities that arise during bilateral control implementation, which include master-slave communication latency and changes in the environment dynamics. The next issue addressed in the paper deals with the case where the master interface is not haptics capable, but the slave is equipped with a force sensor. In the context of robotics-assisted soft-tissue surgical applications, we explore through human factors experiments whether slave/environment force measurements can be of any help with regard to improving task performance. The last problem we study is whether slave/environment force information, with and without haptic capability in the master interface, can help improve outcomes under degraded visual conditions.
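For readers unfamiliar with the two-channel position-force architecture compared above, the sketch below simulates a one-DOF master-slave loop: the slave tracks the commanded master position and the measured environment force is reflected back to the operator. Gains, inertia, and the spring-like environment are placeholder assumptions; the four-channel law additionally exchanges both positions and both forces.

    # One-DOF discrete-time position-force (two-channel) bilateral loop.
    # All constants are illustrative, not from the paper's experiments.
    dt, kp, kd = 0.001, 200.0, 2.0    # 1 kHz loop, slave PD gains
    k_env = 500.0                     # stiff environment spring (N/m)
    m_s = 0.1                         # slave inertia (kg)
    xm = 0.01                         # master position, held fixed here (m)
    xs, vs = 0.0, 0.0                 # slave state

    for _ in range(2000):             # simulate 2 s of contact
        f_env = k_env * max(xs, 0.0)          # slave/environment force
        f_ctrl = kp * (xm - xs) - kd * vs     # position channel: track master
        vs += (f_ctrl - f_env) / m_s * dt     # slave dynamics (explicit Euler)
        xs += vs * dt
        f_reflected = f_env                   # force channel back to the hand

    print(f"slave settles near {xs * 1000:.2f} mm, operator feels {f_reflected:.2f} N")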
Haptic Foot Pedal: Influence of Shoe Type, Age, and Gender on Subjective Pulse Perception.
Geitner, Claudia; Birrell, Stewart; Krehl, Claudia; Jennings, Paul
2018-06-01
This study investigates the influence of shoe type (sneakers and safety boots), age, and gender on the perception of haptic pulse feedback provided by a prototype accelerator pedal in a stationary vehicle with the engine running. Haptic feedback can be a less distracting alternative to traditional visual and auditory in-vehicle feedback. However, to be effective, the device delivering the haptic feedback needs to be in contact with the person. Factors such as shoe type vary naturally with the seasons and could render feedback that is clearly perceived in one situation unnoticeable in another. In this study, we evaluate factors that can influence the subjective perception of haptic feedback in a stationary but running car: shoe type, age, and gender. Thirty-six drivers in three age groups (≤39, 40-59, and ≥60) took part. For each haptic pulse, participants rated intensity, urgency, and comfort via a questionnaire. The perception of the haptic feedback is significantly influenced by the interaction between the pulse's duration and force amplitude and the participant's age and gender, but not by shoe type. The results indicate that it is important to consider different age groups and genders in the evaluation of haptic feedback. Future research might also look into approaches for adapting haptic feedback to the individual driver's preferences. Findings from this study can be applied to the design of an accelerator pedal in a car, for example for a nonvisual in-vehicle warning, but also to the planning of user studies with a haptic pedal in general.
Development of a master arm of a 2-DOF planar parallel manipulator for In-Vitro Fertilization
NASA Astrophysics Data System (ADS)
Thamrongaphichartkul, Kitti; Vongbunyong, Supachai; Nuntakarn, Lalana
2018-01-01
A micromanipulator is a mechanical device used for manipulating miniature objects on the order of microns. It is widely used in In-Vitro Fertilization (IVF), in which sperm are held in a micro-needle that penetrates an oocyte for fertilization. IVF must be performed by highly skilled embryologists who can control the movement of the needle accurately despite the lack of tactile perception. A haptic device can transmit and simulate position, velocity and force in order to enhance interaction between the user and the system. However, commercially available haptic devices have unnecessary degrees of freedom and limited workspace, which are inappropriate for the IVF process. This paper focuses on the development of a haptic device for use in the IVF process. It serves as the master arm of a master-slave system for IVF in order to enhance the user's ability to control the micromanipulator. As a result, the embryologist is able to carry out the IVF process more effectively, with tactile perception.
Shared virtual environments for telerehabilitation.
Popescu, George V; Burdea, Grigore; Boian, Rares
2002-01-01
Current VR telerehabilitation systems use offline remote monitoring from the clinic and patient-therapist videoconferencing. Such "store and forward" and video-based systems cannot implement medical services involving direct patient-therapist interaction. Real-time telerehabilitation applications (including remote therapy) can be developed using a shared Virtual Environment (VE) architecture. We developed a two-user shared VE for hand telerehabilitation. Each site has a telerehabilitation workstation with a video camera and a Rutgers Master II (RMII) force feedback glove. Each user can control a virtual hand and interact haptically with virtual objects. Simulated physical interactions between therapist and patient are implemented using hand force feedback. The therapist's graphic interface contains several virtual panels, which allow control over the rehabilitation process. These controls start a videoconferencing session, collect patient data, or apply therapy. Several experimental telerehabilitation scenarios were successfully tested on a LAN. A Web-based approach to "real-time" patient telemonitoring--the monitoring portal for hand telerehabilitation--was also developed. The therapist interface is implemented as a Java3D applet that monitors patient hand movement. The monitoring portal achieves real-time performance on off-the-shelf desktop workstations.
NASA Astrophysics Data System (ADS)
Hwang, Donghyun; Lee, Jaemin; Kim, Keehoon
2017-10-01
This paper proposes a miniature haptic ring that can display touch/pressure and shearing force to the user's fingerpad. For practical use and wider application, the device is developed with the aim of achieving high wearability and mobility/portability as well as cutaneous force feedback functionality. The main body of the device is designed as a ring-shaped lightweight structure with a simple driving mechanism, and thin shape memory alloy (SMA) wires with high energy density are used as actuating elements. Based on a band-type wireless control unit that includes a wireless data communication module, the whole device is realized as a wearable mobile haptic system. These features give the device diverse functional advantages and provide users with substantial usability. In this work, the proposed miniature haptic ring is systematically designed, and its performance is experimentally evaluated with a fabricated functional prototype. The experimental results demonstrate that the proposed device exhibits a higher force-to-weight ratio than conventional finger-wearable haptic devices for cutaneous force feedback. The results also show that the operational performance of the device is strongly influenced by the electro-thermomechanical behavior of the SMA actuators. In addition to the performance-evaluation experiments, we conduct a preliminary user test to assess practical feasibility and usability based on users' qualitative feedback.
Design and Evaluation of Shape-Changing Haptic Interfaces for Pedestrian Navigation Assistance.
Spiers, Adam J; Dollar, Aaron M
2017-01-01
Shape-changing interfaces are a category of device capable of altering their form in order to facilitate the communication of information. In this work, we present a shape-changing device designed for navigation assistance. 'The Animotus' (previously 'The Haptic Sandwich') resembles a cube with an articulated upper half that is able to rotate and extend (translate) relative to the bottom half, which is fixed in the user's grasp. This rotation and extension, generally felt via the user's fingers, is used to represent heading and proximity to navigational targets. The device is intended to provide an alternative to screen- or audio-based interfaces for visually impaired, hearing impaired, deafblind, and sighted pedestrians. The motivation and design of the haptic device are presented, followed by the results of a navigation experiment that aimed to determine the role of each device DOF in facilitating guidance. An additional device, 'The Haptic Taco', which modulated its volume in response to target proximity (negating directional feedback), was also compared. Results indicate that while the heading (rotational) DOF benefited motion efficiency, the proximity (translational) DOF benefited velocity. Combining the two DOF improved overall performance. The volumetric Taco performed comparably to the Animotus' extension DOF.
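A minimal sketch of how heading and proximity can be mapped onto the device's two DOF, rotation and extension; the clamping range and scaling are assumptions, not the published control law.

    # Hypothetical Animotus-style command mapping: rotation encodes heading
    # error to the target, extension encodes remaining distance.
    import math

    def device_command(user_xy, user_heading_rad, target_xy,
                       max_rot=math.radians(30), max_ext_mm=15.0, range_m=50.0):
        dx, dy = target_xy[0] - user_xy[0], target_xy[1] - user_xy[1]
        err = math.atan2(dy, dx) - user_heading_rad
        err = (err + math.pi) % (2 * math.pi) - math.pi   # wrap to [-pi, pi]
        rot = max(-max_rot, min(max_rot, err))            # heading DOF
        ext = max_ext_mm * min(math.hypot(dx, dy) / range_m, 1.0)  # proximity DOF
        return rot, ext

    print(device_command((0.0, 0.0), 0.0, (10.0, 10.0)))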
Real-time dual-band haptic music player for mobile devices.
Hwang, Inwook; Lee, Hyeseon; Choi, Seungmoon
2013-01-01
We introduce a novel dual-band haptic music player for real-time simultaneous vibrotactile playback with music in mobile devices. Our haptic music player features a new miniature dual-mode actuator that can produce vibrations consisting of two principal frequencies and a real-time vibration generation algorithm that can extract vibration commands from a music file for dual-band playback (bass and treble). The algorithm uses a "haptic equalizer" and provides plausible sound-to-touch modality conversion based on human perceptual data. In addition, we present a user study carried out to evaluate the subjective performance (precision, harmony, fun, and preference) of the haptic music player, in comparison with the current practice of bass-band-only vibrotactile playback via a single-frequency voice-coil actuator. The evaluation results indicated that the new dual-band playback outperforms the bass-only rendering, also providing several insights for further improvements. The developed system and experimental findings have implications for improving the multimedia experience with mobile devices.
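The band-splitting idea behind dual-band playback can be sketched as follows: filter the music into bass and treble bands and convert each into a short-window amplitude envelope that drives the corresponding actuator mode. Cutoff frequencies, window length, and the synthetic input are assumptions, not the authors' algorithm.

    # Dual-band envelope extraction sketch for vibrotactile playback.
    import numpy as np
    from scipy.signal import butter, sosfilt

    fs = 44100
    t = np.arange(0, 1.0, 1 / fs)
    music = np.sin(2 * np.pi * 80 * t) + 0.5 * np.sin(2 * np.pi * 2000 * t)

    bass = sosfilt(butter(4, 250, "lowpass", fs=fs, output="sos"), music)
    treble = sosfilt(butter(4, 1000, "highpass", fs=fs, output="sos"), music)

    def envelope(x, win=441):                   # ~10 ms RMS windows
        x = x[: len(x) // win * win].reshape(-1, win)
        return np.sqrt((x ** 2).mean(axis=1))   # one drive command per window

    for name, band in (("bass", bass), ("treble", treble)):
        print(name, f"mean drive level {envelope(band).mean():.2f}")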
Detection of Membrane Puncture with Haptic Feedback using a Tip-Force Sensing Needle.
Elayaperumal, Santhi; Bae, Jung Hwa; Daniel, Bruce L; Cutkosky, Mark R
2014-09-01
This paper presents calibration and user test results of a 3-D tip-force sensing needle with haptic feedback. The needle is a modified MRI-compatible biopsy needle with embedded fiber Bragg grating (FBG) sensors for strain detection. After calibration, the needle is interrogated at 2 kHz, and dynamic forces are displayed remotely with a voice coil actuator. The needle is tested in a single-axis master/slave system, with the voice coil haptic display at the master, and the needle at the slave end. Tissue phantoms with embedded membranes were used to determine the ability of the tip-force sensors to provide real-time haptic feedback as compared to external sensors at the needle base during needle insertion via the master/slave system. Subjects were able to determine the position of the embedded membranes with significantly better accuracy using FBG tip feedback than with base feedback using a commercial force/torque sensor (p = 0.045) or with no added haptic feedback (p = 0.0024).
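The calibration step can be illustrated with a standard linear least-squares fit from FBG wavelength shifts to tip forces. The 3x3 linear model and the synthetic data below are assumptions for illustration; the paper's actual procedure also handles effects such as temperature.

    # Least-squares force calibration sketch: solve F ~ W @ C for C from
    # paired wavelength shifts W (nm) and known applied forces F (N).
    import numpy as np

    rng = np.random.default_rng(0)
    C_true = np.array([[2.0, 0.1, 0.0],
                       [0.1, 1.8, 0.2],
                       [0.0, 0.2, 2.2]])          # N per nm, illustrative
    W = rng.normal(size=(200, 3))                 # measured shifts
    F = W @ C_true + 0.01 * rng.normal(size=(200, 3))

    C_est, *_ = np.linalg.lstsq(W, F, rcond=None)
    print("max calibration error:", np.abs(C_est - C_true).max())
    print("force from new reading:", rng.normal(size=3) @ C_est)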
Haptic interfaces: Hardware, software and human performance
NASA Technical Reports Server (NTRS)
Srinivasan, Mandayam A.
1995-01-01
Virtual environments are computer-generated synthetic environments with which a human user can interact to perform a wide variety of perceptual and motor tasks. At present, most virtual environment systems engage only the visual and auditory senses, and not the haptic sensorimotor system that conveys the sense of touch and feel of objects in the environment. Computer keyboards, mice, and trackballs constitute relatively simple haptic interfaces. Gloves and exoskeletons that track hand postures have more interaction capabilities and are available on the market. Although desktop and wearable force-reflecting devices have been built and implemented in research laboratories, the current capabilities of such devices are quite limited. To realize the full promise of virtual environments and teleoperation of remote systems, further development of haptic interfaces is critical. In this paper, the status of and research needs in human haptics, technology development, and the interactions between the two are described. In particular, the excellent performance characteristics of the PHANToM, a haptic interface recently developed at MIT, are highlighted. Realistic sensations of single-point-of-contact interactions with objects of variable geometry (e.g., smooth, textured, polyhedral) and material properties (e.g., friction, impedance) in the context of a variety of tasks (e.g., needle biopsy, switch panels) achieved through this device are described, and the associated issues in haptic rendering are discussed.
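The single-point-of-contact rendering discussed here is classically realized as a penalty method: once the interface point penetrates a virtual surface, a spring-damper force along the surface normal is commanded. A minimal sketch for a flat surface, with illustrative stiffness and damping:

    # Penalty-based haptic rendering for a plane at z = 0.
    def render_force(pos, vel, k=800.0, b=2.0):
        """pos, vel: (x, y, z) of the haptic interface point."""
        depth = -pos[2]                      # penetration below the plane
        if depth <= 0.0:
            return (0.0, 0.0, 0.0)           # free space: no force
        fz = k * depth - b * vel[2]          # spring-damper along the normal
        return (0.0, 0.0, max(fz, 0.0))      # a surface only pushes outward

    print(render_force((0.0, 0.0, -0.002), (0.0, 0.0, -0.05)))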
Invited Article: A review of haptic optical tweezers for an interactive microworld exploration
NASA Astrophysics Data System (ADS)
Pacoret, Cécile; Régnier, Stéphane
2013-08-01
This paper is the first review of haptic optical tweezers, a new technique which associates force feedback teleoperation with optical tweezers. This technique allows users to explore the microworld by sensing and exerting picoNewton-scale forces with trapped microspheres. Haptic optical tweezers also allow improved dexterity of micromanipulation and micro-assembly. One of the challenges of this technique is to sense and magnify picoNewton-scale forces by a factor of 10^12 to enable human operators to perceive interactions that they have never experienced before, such as adhesion phenomena, extremely low inertia, and high frequency dynamics of extremely small objects. The design of optical tweezers for high quality haptic feedback is challenging, given the requirements for very high sensitivity and dynamic stability. The concept, design process, and specification of optical tweezers reviewed here are focused on those intended for haptic teleoperation. In this paper, two new specific designs as well as the current state-of-the-art are presented. Moreover, the remaining important issues are identified for further developments. The initial results obtained are promising and demonstrate that optical tweezers have a significant potential for haptic exploration of the microworld. Haptic optical tweezers will become an invaluable tool for force feedback micromanipulation of biological samples and nano- and micro-assembly parts.
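The macro-micro coupling at the heart of haptic optical tweezers can be sketched as a pair of scale factors: one downscales operator motion to trap motion, the other upscales piconewton trap forces to hand-scale forces. The constants below are order-of-magnitude illustrations only.

    # Hypothetical bidirectional scaling for haptic optical tweezers.
    ALPHA_POS = 1e-5   # 1 cm of hand motion -> 100 nm of trap motion
    ALPHA_F = 1e12     # 1 pN at the bead -> 1 N at the hand
    K_TRAP = 1e-6      # Hookean trap stiffness, N/m (typical order)

    def update(hand_pos_m, bead_offset_m):
        trap_pos = ALPHA_POS * hand_pos_m    # downscaled motion command
        f_bead = -K_TRAP * bead_offset_m     # trap force on the bead
        f_hand = ALPHA_F * f_bead            # upscaled force feedback
        return trap_pos, f_hand

    print(update(0.01, 50e-9))               # 1 cm hand motion, 50 nm offset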
Hadavand, Mostafa; Mirbagheri, Alireza; Behzadipour, Saeed; Farahmand, Farzam
2014-06-01
An effective master robot for haptic tele-surgery applications needs to provide a solution for the inverted movements of the surgical tool, in addition to sufficient workspace and manipulability, with minimal moving inertia. A novel 4 + 1-DOF mechanism was proposed, based on a triple parallelogram linkage, which provides a Remote Center of Motion (RCM) at the back of the user's hand. The kinematics of the robot was analyzed, and a prototype was fabricated and evaluated by experimental tests. With an RCM at the back of the user's hand and the actuators far from the end effector, the robot could produce the sensation of hand-inside surgery with minimal moving inertia. The target workspace was achieved with acceptable manipulability. The trajectory tracking experiments revealed small errors due to backlash at the joints. The proposed mechanism meets the basic requirements of an effective master robot for haptic tele-surgery applications.
Designing Haptic Assistive Technology for Individuals Who Are Blind or Visually Impaired.
Pawluk, Dianne T V; Adams, Richard J; Kitada, Ryo
2015-01-01
This paper considers issues relevant to the design and use of haptic technology for assistive devices for individuals who are blind or visually impaired in some of the major areas of importance: Braille reading, tactile graphics, and orientation and mobility. We show that there is a wealth of behavioral research that is highly applicable to assistive technology design. In a few cases, conclusions from behavioral experiments have been directly applied to design, with positive results. Differences in brain organization and performance capabilities between individuals who are "early blind" and "late blind" when using the same tactile/haptic accommodations, such as Braille, suggest the importance of training and assessing these groups individually. Practical restrictions on device design, such as performance limitations of the technology and cost, raise the question of which aspects of these restrictions truly matter to overcome for high performance. More generally, this raises the question of what it means to provide functional equivalence as opposed to sensory equivalence.
Haptic Stylus and Empirical Studies on Braille, Button, and Texture Display
Kyung, Ki-Uk; Lee, Jun-Young; Park, Junseok
2008-01-01
This paper presents a haptic stylus interface with a built-in compact tactile display module and an impact module, together with empirical studies on Braille, button, and texture display. We describe preliminary evaluations verifying the tactile display's performance, indicating that it can satisfactorily represent Braille numbers for both sighted and blind users. To demonstrate the haptic feedback capability of the stylus, an experiment providing impact feedback mimicking the click of a button was conducted. Since the developed device is small enough to be attached to a force feedback device, its applicability to combined force and tactile feedback display in a pen-held haptic device is also investigated. The handle of a pen-held haptic interface was replaced by the pen-like interface to add tactile feedback capability to the device. Since the system provides a combination of force, tactile and impact feedback, three haptic representation methods for texture display were compared on surfaces with 3 texture groups differing in direction, groove width, and shape. In addition, we evaluate its capacity to support touch screen operations by providing tactile sensations when a user rubs against an image displayed on a monitor. PMID:18317520
Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu
2011-07-01
Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures that maintains high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with update rates above 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle the simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with results comparable to VEs for local users.
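The decoupled-rate pattern described here, a fast haptics loop running independently of a much slower viewer, can be sketched with two threads sharing state under a lock. This illustrates only the general pattern, not the framework's actual architecture.

    # Decoupled simulation sketch: ~1 kHz haptics thread, ~60 Hz viewer.
    import threading
    import time

    state = {"force": 0.0, "steps": 0}
    lock = threading.Lock()
    stop = threading.Event()

    def haptics_loop():                      # high-rate force updates
        while not stop.is_set():
            with lock:
                state["force"] = 0.5         # placeholder force computation
                state["steps"] += 1
            time.sleep(0.001)

    def viewer_loop():                       # low-rate rendering
        while not stop.is_set():
            with lock:
                snapshot = dict(state)       # consistent copy to draw from
            time.sleep(1 / 60)

    threads = [threading.Thread(target=f) for f in (haptics_loop, viewer_loop)]
    for th in threads:
        th.start()
    time.sleep(0.5)
    stop.set()
    for th in threads:
        th.join()
    print("haptic steps in 0.5 s:", state["steps"])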
Modeling and modification of medical 3D objects. The benefit of using a haptic modeling tool.
Kling-Petersen, T; Rydmark, M
2000-01-01
The Computer Laboratory of the medical faculty in Goteborg (Mednet) has, since the end of 1998, been one of a limited number of participants in the development of a new modeling tool together with SensAble Technologies Inc [http://www.sensable.com/]. The software, called SensAble FreeForm, was officially released at SIGGRAPH in September 1999. Briefly, the software mimics the modeling techniques traditionally used by clay artists. An imported model or a user-defined block of "clay" can be modified using different tools such as a ball, square block, scrape etc. via a SensAble Technologies PHANToM haptic arm. The model deforms in 3D as a result of touching the "clay" with any selected tool, and the amount of deformation is linear in the force applied. By getting instantaneous haptic as well as visual feedback, precise and intuitive changes are easily made. While SensAble FreeForm lacks several of the features normally associated with a 3D modeling program (such as text handling, application of surface and bump maps, high-end rendering engines, etc.), its strength lies in the ability to rapidly create non-geometric 3D models. For medical use, very few anatomically correct models are created from scratch. However, FreeForm's tools enable advanced modification of reconstructed or 3D-scanned models. One of the main problems with 3D laser scanning of medical specimens is that the technique usually leaves holes or gaps in the dataset corresponding to areas in shadow, such as orifices, deep grooves etc. By using FreeForm's different tools, these defects are easily corrected and gaps are filled in. Similarly, traditional 3D reconstruction (based on serial sections etc.) often shows artifacts as a result of the triangulation and/or tessellation processes. These artifacts usually manifest as unnatural ridges or uneven areas ("the accordion effect"). FreeForm contains a smoothing algorithm that enables the user to select an area to be modified and subsequently apply any given amount of smoothing to the object. While the final objects need to be exported for further 3D graphic manipulation, FreeForm addresses one of the most time-consuming problems of 3D modeling: the modification and creation of non-geometric 3D objects.
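The smoothing operation described, flattening ridges from the "accordion effect", is typically some variant of Laplacian smoothing, in which each selected vertex is pulled toward the centroid of its neighbors. FreeForm's own algorithm is proprietary; the sketch below shows the standard technique.

    # Laplacian smoothing sketch on a selected set of mesh vertices.
    import numpy as np

    def smooth(verts, neighbors, selected, lam=0.5, passes=10):
        verts = verts.copy()
        for _ in range(passes):
            new = verts.copy()
            for i in selected:
                centroid = verts[neighbors[i]].mean(axis=0)
                new[i] = (1 - lam) * verts[i] + lam * centroid
            verts = new
        return verts

    # Tiny example: the middle vertex sticks out of an otherwise flat strip.
    verts = np.array([[0, 0, 0], [1, 0, 0.8], [2, 0, 0], [3, 0, 0]], float)
    print(smooth(verts, {1: [0, 2]}, selected=[1])[1])  # ridge flattens toward z = 0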
A Study on Immersion and Presence of a Portable Hand Haptic System for Immersive Virtual Reality
Kim, Mingyu; Jeon, Changyu; Kim, Jinmo
2017-01-01
This paper proposes a portable hand haptic system using Leap Motion as a haptic interface that can be used in various virtual reality (VR) applications. The proposed hand haptic system was designed as an Arduino-based sensor architecture to enable a variety of tactile senses at low cost, and is also equipped with a portable wristband. As a haptic system designed for tactile feedback, the proposed system first identifies the left and right hands and then sends tactile senses (vibration and heat) to each fingertip (thumb and index finger). It is incorporated into a wearable band-type system, making its use easy and convenient. Next, hand motion is accurately captured using the sensor of the hand tracking system and is used for virtual object control, thus achieving interaction that enhances immersion. A VR application was designed with the purpose of testing the immersion and presence aspects of the proposed system. Lastly, technical and statistical tests were carried out to assess whether the proposed haptic system can provide a new immersive presence to users. According to the results of the presence questionnaire and the simulator sickness questionnaire, we confirmed that the proposed hand haptic system, in comparison to the existing interaction that uses only the hand tracking system, provided greater presence and a more immersive environment in the virtual reality. PMID:28513545
Turolla, Andrea; Daud Albasini, Omar A; Oboe, Roberto; Agostini, Michela; Tonin, Paolo; Paolucci, Stefano; Sandrini, Giorgio; Venneri, Annalena; Piron, Lamberto
2013-01-01
Background. Haptic robots allow the exploitation of known motor learning mechanisms, representing a valuable option for motor treatment after stroke. The aim of this feasibility multicentre study was to test the clinical efficacy of a haptic prototype for the recovery of hand function after stroke. Methods. A prospective pilot clinical trial was planned on 15 consecutive patients enrolled in 3 rehabilitation centres in Italy. All the framework features of the haptic robot (e.g., control loop, external communication, and graphic rendering for virtual reality) were implemented in a real-time MATLAB/Simulink environment, controlling a five-bar linkage able to provide forces up to 20 N at the end effector, used for finger and hand rehabilitation therapies. Clinical (i.e., Fugl-Meyer upper extremity scale; nine-hole pegboard test) and kinematic (i.e., time, velocity, jerk metric, normalized jerk of standard movements) outcomes were assessed before and after treatment to detect changes in patients' motor performance. Reorganization of cortical activation was detected in one patient by fMRI. Results and Conclusions. All patients showed significant improvements in both clinical and kinematic outcomes. Additionally, the fMRI results suggest that the proposed approach may promote better cortical activation in the brain.
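Of the kinematic outcomes listed, normalized jerk deserves a concrete definition. The study does not spell out its normalization, so the sketch below uses the common dimensionless form: the square root of half the integral of squared jerk, scaled by duration to the fifth power over squared path length.

    # Dimensionless normalized jerk, a standard smoothness metric.
    import numpy as np

    def normalized_jerk(pos, dt):
        vel = np.gradient(pos, dt, axis=0)
        acc = np.gradient(vel, dt, axis=0)
        jerk = np.gradient(acc, dt, axis=0)
        T = dt * (len(pos) - 1)                                   # duration
        L = np.sum(np.linalg.norm(np.diff(pos, axis=0), axis=1))  # path length
        return np.sqrt(0.5 * np.sum(jerk ** 2) * dt * T ** 5 / L ** 2)

    t = np.linspace(0, 1, 500)[:, None]
    path = np.hstack([10 * t**3 - 15 * t**4 + 6 * t**5, 0 * t])   # minimum-jerk reach
    print(f"NJ of a minimum-jerk reach: {normalized_jerk(path, 1 / 499):.1f}")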
Haptic augmented skin surface generation toward telepalpation from a mobile skin image.
Kim, K
2018-05-01
Very little is known about methods for integrating palpation into existing mobile teleskin imaging, which delivers only low-quality tactile information (roughness) for telepalpation, and no study has yet been reported on telehaptic palpation using mobile phone images for teledermatology or teleconsultations in skincare. This study therefore introduces a new algorithm that accurately reconstructs a haptically augmented skin surface for telehaptic palpation using a low-cost clip-on microscope simply attached to a mobile phone. Multiple algorithms, including gradient-based image enhancement, roughness-adaptive tactile mask generation, roughness-enhanced 3D tactile map building, and visual and haptic rendering with a three-degrees-of-freedom (DOF) haptic device, were developed and integrated into one system. Evaluation experiments were conducted to test the performance of 3D roughness reconstruction with and without the tactile mask. The results confirm that the haptic roughness reconstructed with the tactile mask is superior to that reconstructed without it. Additional experiments demonstrate that the proposed algorithm is robust against varying lighting conditions and blurring. Lastly, a user study was designed to assess the effect of adding the haptic modality to the existing visual-only interface, and the results confirm that haptic skin palpation can significantly improve skin examination performance. Mobile image-based telehaptic palpation technology was proposed, and an initial version was developed. The developed technology was tested with several skin images, and the experimental results showed the superiority of the proposed scheme in terms of the haptic augmentation of real skin images.
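The gradient-based enhancement and tactile-mask steps can be pictured as follows: the gradient magnitude of a grayscale image serves as a local roughness estimate, and a threshold mask suppresses smooth, non-wrinkle regions. The threshold and the random stand-in image are illustrative assumptions, not the published pipeline.

    # Roughness map with a "tactile mask" from a grayscale image.
    import numpy as np

    def roughness_map(img, thresh=0.05):
        gy, gx = np.gradient(img.astype(float))
        mag = np.hypot(gx, gy)
        mag /= mag.max() + 1e-9              # normalize to [0, 1]
        mask = mag > thresh                  # keep only textured regions
        return mag * mask, mask

    rng = np.random.default_rng(1)
    img = rng.random((64, 64))               # stand-in for a microscope image
    height, mask = roughness_map(img)
    print(f"textured fraction: {mask.mean():.2f}")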
Fortmeier, Dirk; Mastmeyer, Andre; Schröder, Julian; Handels, Heinz
2016-01-01
This study presents a new visuo-haptic virtual reality (VR) training and planning system for percutaneous transhepatic cholangio-drainage (PTCD) based on partially segmented virtual patient models. We only use partially segmented image data instead of a full segmentation and circumvent the necessity of surface or volume mesh models. Haptic interaction with the virtual patient during virtual palpation, ultrasound probing and needle insertion is provided. Furthermore, the VR simulator includes X-ray and ultrasound simulation for image-guided training. The visualization techniques are GPU-accelerated by implementation in Cuda and include real-time volume deformations computed on the grid of the image data. Computation on the image grid enables straightforward integration of the deformed image data into the visualization components. To provide shorter rendering times, the performance of the volume deformation algorithm is improved by a multigrid approach. To evaluate the VR training system, a user evaluation has been performed and deformation algorithms are analyzed in terms of convergence speed with respect to a fully converged solution. The user evaluation shows positive results with increased user confidence after a training session. It is shown that using partially segmented patient data and direct volume rendering is suitable for the simulation of needle insertion procedures such as PTCD.
Prototype tactile feedback system for examination by skin touch.
Lee, O; Lee, K; Oh, C; Kim, K; Kim, M
2014-08-01
Diagnosis of conditions such as psoriasis and atopic dermatitis, in the case of induration, involves palpating the affected area by hand and then selecting a rating score. However, the score is determined based on the tester's experience and standards, making it subjective. To provide tactile feedback on the skin, we developed a prototype tactile feedback system that simulates skin wrinkles with the PHANToM OMNI. To provide the user with tactile feedback on skin wrinkles, a visual and haptic Augmented Reality system was developed. First, a pair of stereo skin images obtained by a stereo camera generates a disparity map of skin wrinkles. Second, the generated disparity map is sent to a tactile rendering algorithm that computes a reaction force according to the user's interaction with the skin image. We first obtained a stereo image of skin wrinkles from the in vivo stereo imaging system, which has a baseline of 50.8 μm, and computed the disparity map with a graph-cuts algorithm. The left image is displayed on the monitor to enable the user to recognize the location visually. The disparity map of the skin wrinkle image delivers skin wrinkle information as a tactile response to the user through a haptic device. We successfully developed a tactile feedback system for virtual skin wrinkle simulation by means of a commercial haptic device that provides the user with a single point of contact to feel the surface roughness of a virtual skin sample.
Haptic device for a ventricular shunt insertion simulator.
Panchaphongsaphak, Bundit; Stutzer, Diego; Schwyter, Etienne; Bernays, René-Ludwig; Riener, Robert
2006-01-01
In this paper we propose a new one-degree-of-freedom haptic device that can be used to simulate ventricular shunt insertion procedures. The device is used together with the BRAINTRAIN training simulator developed for neuroscience education, neurological data visualization and surgical planning. The design of the haptic device is based on a push-pull cable concept. The rendered forces produced by a linear motor connected at one end of the cable are transferred to the user via a sliding mechanism at the end-effector located at the other end of the cable. The end-effector provides a range of movement of up to 12 cm. The force is controlled by an open-loop impedance algorithm and can reach up to 15 N.
A novel shape-changing haptic table-top display
NASA Astrophysics Data System (ADS)
Wang, Jiabin; Zhao, Lu; Liu, Yue; Wang, Yongtian; Cai, Yi
2018-01-01
A shape-changing table-top display with haptic feedback allows its users to perceive 3D visual and texture displays interactively. Since few existing devices provide an accurate display with regulated haptic feedback, a novel attentive and immersive shape-changing mechanical interface (SCMI) consisting of an image processing unit and a transformation unit is proposed in this paper. In order to support a precise 3D table-top display with an offset of less than 2 mm, a custom-made mechanism was developed to form a precise surface and regulate the feedback force. The proposed image processing unit is capable of extracting texture data from 2D pictures for rendering a shape-changing surface and realizing 3D modeling. The preliminary evaluation results proved the feasibility of the proposed system.
Validation and learning in the Procedicus KSA virtual reality surgical simulator.
Ström, P; Kjellin, A; Hedman, L; Johnson, E; Wredmark, T; Felländer-Tsai, L
2003-02-01
Advanced simulator training within medicine is a rapidly growing field. Virtual reality simulators are being introduced as cost-saving educational tools, which also lead to increased patient safety. Fifteen medical students were included in the study. For 10 medical students, performance was monitored before and after 1 h of training in two endoscopic simulators (the Procedicus KSA with haptic feedback and anatomical graphics, and the established MIST simulator without such haptic feedback and graphics). Five medical students performed 50 tests in the Procedicus KSA in order to analyze learning curves. One of these five medical students performed multiple training sessions over 2 weeks, completing more than 300 tests. There was a significant improvement after 1 h of training in time, movement economy, and total score. The results in the two simulators were highly correlated. Our results show that the use of surgical simulators as a pedagogical tool in medical student training is encouraging. Learning curves are rapid, and our suggestion is to introduce endoscopic simulator training in undergraduate medical education during the surgery course, when motivation is high and before the development of "negative stereotypes" and incorrect practices.
A haptic pedal for surgery assistance.
Díaz, Iñaki; Gil, Jorge Juan; Louredo, Marcos
2014-09-01
The research and development of mechatronic aids for surgery is a persistent challenge in the field of robotic surgery. This paper presents a new haptic pedal conceived to assist surgeons in the operating room by transmitting real-time surgical information through the foot. An effective human-robot interaction system for medical practice must exchange appropriate information with the operator as quickly and accurately as possible. Moreover, information must flow through the appropriate sensory modalities for a natural and simple interaction. However, users of current robotic systems might experience cognitive overload and be increasingly overwhelmed by data streams from multiple modalities. A new haptic channel is thus explored to complement and improve existing systems. A preliminary set of experiments has been carried out to evaluate the performance of the proposed system in a virtual surgical drilling task. The results of the experiments show the effectiveness of the haptic pedal in providing surgical information through the foot.
Face and construct validity of a computer-based virtual reality simulator for ERCP.
Bittner, James G; Mellinger, John D; Imam, Toufic; Schade, Robert R; Macfadyen, Bruce V
2010-02-01
Background. Currently, little evidence supports computer-based simulation for ERCP training. Objective. To determine the face and construct validity of a computer-based simulator for ERCP and to assess its perceived utility as a training tool. Design. Novice and expert endoscopists completed 2 simulated ERCP cases using the GI Mentor II. Setting. Virtual Education and Surgical Simulation Laboratory, Medical College of Georgia. Main outcome measurements. Times to complete the procedure, reach the papilla, and use fluoroscopy; attempts to cannulate the papilla, pancreatic duct, and common bile duct; and number of contrast injections and complications. Subjects also assessed simulator graphics, procedural accuracy, difficulty, haptics, overall realism, and training potential. Results. Only when performance data from cases A and B were combined did the GI Mentor II differentiate novices and experts, based on times to complete the procedure, reach the papilla, and use fluoroscopy. Across skill levels, overall opinions were similar regarding graphics (moderately realistic), accuracy (similar to clinical ERCP), difficulty (similar to clinical ERCP), overall realism (moderately realistic), and haptics. Most participants (92%) stated that the simulator has definite training potential or should be required for training. Limitations. Small sample size, single institution. Conclusions. The GI Mentor II demonstrated construct validity for ERCP based on select metrics. Most subjects thought that the simulated graphics, procedural accuracy, and overall realism exhibit face validity, and deemed it a useful training tool. Repeating the study with more participants and cases may help confirm these results and establish the simulator's ability to differentiate skill levels based on ERCP-specific metrics.
NASA Astrophysics Data System (ADS)
Yang, Tae-Heon; Koo, Jeong-Hoi
2017-12-01
Humans can experience realistic and vivid haptic sensations through the sense of touch. For a fully immersive haptic experience, both kinaesthetic and vibrotactile information must be presented to the user. To date, little haptic research has addressed small haptic actuators that can convey both vibrotactile feedback, at vibration frequencies up to the human-perceivable limit, and multiple levels of kinaesthetic feedback in rapid succession. This study therefore designs a miniature haptic device based on magnetorheological (MR) fluid and experimentally evaluates its ability to convey vibrotactile feedback up to 300 Hz along with kinaesthetic feedback. After constructing a prototype device, a series of tests was performed to evaluate its performance using an experimental setup consisting of a precision dynamic mechanical analyzer and an accelerometer. The kinaesthetic testing results show that the prototype can provide a force rate of up to 89% at 5 V (360 mA), which can be discretized into multiple levels of 'just noticeable difference' force rate, indicating that the device can convey a wide range of kinaesthetic sensations. To evaluate the high-frequency vibrotactile performance of the device, its acceleration responses were measured and processed using FFT analysis. The results indicate that the device can convey high-frequency vibrotactile sensations up to 300 Hz with acceleration intensities large enough for humans to feel.
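The discretization into 'just noticeable difference' levels follows from Weber's law: distinguishable force levels form a geometric progression. The Weber fraction of 0.1 used below is a typical literature value for force perception, not a figure from this paper.

    # Counting JND levels between a minimum and maximum force.
    import math

    def jnd_levels(f_min, f_max, weber=0.10):
        # successive levels satisfy F_{n+1} = F_n * (1 + weber)
        return int(math.log(f_max / f_min) / math.log(1 + weber)) + 1

    print(jnd_levels(0.1, 5.0), "distinguishable levels between 0.1 N and 5 N")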
Evaluation of Pseudo-Haptic Interactions with Soft Objects in Virtual Environments.
Li, Min; Sareh, Sina; Xu, Guanghua; Ridzuan, Maisarah Binti; Luo, Shan; Xie, Jun; Wurdemann, Helge; Althoefer, Kaspar
2016-01-01
This paper proposes a pseudo-haptic feedback method that conveys simulated soft-surface stiffness information through a visual interface. The method exploits a combination of two feedback techniques, namely visual feedback of soft-surface deformation and control of the indenter avatar speed, to convey stiffness information about a simulated soft-object surface in virtual environments. The proposed method was effective in distinguishing different sizes of virtual hard nodules integrated into the simulated soft bodies. To further improve the interactive experience, the approach was extended into a multi-point pseudo-haptic feedback system. A comparison against a tablet computer incorporating vibration feedback was conducted, with (a) nodule detection sensitivity and (b) elapsed time as performance indicators in hard-nodule detection experiments. The multi-point pseudo-haptic interaction is shown to be more time-efficient than the single-point pseudo-haptic interaction. Multi-point pseudo-haptic feedback also performs comparably to the vibration-based feedback method on both performance measures, elapsed time and nodule detection sensitivity. This shows that the proposed method can convey detailed haptic information for virtual environmental tasks, even subtle ones, using either a computer mouse or a pressure-sensitive device as the input device. The pseudo-haptic feedback method provides an opportunity for low-cost simulation of objects with soft surfaces and hard inclusions, as occur, for example, in ever more realistic video games with increasing emphasis on interaction with the physical environment, and in minimally invasive surgery in the form of soft-tissue organs with embedded cancer nodules. Hence, the method can be used in many low-budget applications where haptic sensation is required, such as surgeon training or video games, on either desktop computers or portable devices, showing reasonably high fidelity in conveying stiffness perception to the user.
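Pseudo-haptic stiffness cues are usually produced by modulating the control-display ratio, so that the avatar slows down over stiffer regions and the user reads the slowdown as resistance. A sketch with assumed gains:

    # Pseudo-haptic speed modulation: avatar displacement per unit input
    # shrinks with the simulated local stiffness. Gains are illustrative.
    def avatar_step(input_delta_px, local_stiffness, k=0.02):
        cd_ratio = 1.0 / (1.0 + k * local_stiffness)   # control-display ratio
        return input_delta_px * cd_ratio

    for s in (0.0, 50.0, 200.0):       # soft tissue vs. harder nodule
        print(f"stiffness {s:5.1f} -> avatar moves {avatar_step(10, s):.2f} px")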
Haptics using a smart material for eyes-free interaction in personal devices
NASA Astrophysics Data System (ADS)
Wang, Huihui; Lane, William Brian; Pappas, Devin; Duque, Bryam; Leong, John
2014-03-01
In this paper we present a prototype that uses a dry ionic polymer metal composite (IPMC) in interactive personal devices such as bracelets, necklaces, pocket key chains or mobile devices for haptic interaction when audio or visual feedback is not possible or practical. The prototype interface is an electro-mechanical system that realizes a shape-changing haptic display for information communication. A dry IPMC changes its dimensions through the electrostatic effect when an electrical potential is applied to it. The IPMC operates at low voltage (less than 2.5 V), which is compatible with the requirements of personal or mobile electronic devices. The prototype consists of addressable arrays of IPMCs of different dimensions, which can be deformed into different shapes with proper handling or customization. 3D printing technology is used to form the supporting parts. Microcontrollers (about 3 cm square) from DigiKey are embedded into the personal device. An Android-based mobile app communicates with the microcontrollers to control the IPMCs. When the personal device receives an information signal, the original shape of the prototype changes to another shape related to the specific sender or type of information source. This interactive prototype can simultaneously realize multiple methods for conveying haptic information, such as dimension, force, and texture, thanks to the flexible array design. We conducted several user-experience studies to explore how users respond to shape-change information.
Simulation and training of lumbar punctures using haptic volume rendering and a 6DOF haptic device
NASA Astrophysics Data System (ADS)
Färber, Matthias; Heller, Julika; Handels, Heinz
2007-03-01
A lumbar puncture is performed by inserting a needle into the spinal canal of the patient to inject medication or to extract cerebrospinal fluid. The training of this procedure is usually done on patients under the guidance of experienced supervisors. A virtual reality lumbar puncture simulator has been developed in order to minimize training costs and the patient's risk. We use a haptic device with six degrees of freedom (6DOF) to feed back forces that resist needle insertion and rotation. An improved haptic volume-rendering approach is used to calculate the forces. This approach makes use of label data of relevant structures like skin, bone, muscles or fat, together with the original CT data, which contributes information about image structures that cannot be segmented. A real-time 3D visualization with optional stereo view shows the punctured region. 2D visualizations of orthogonal slices enable a detailed impression of the anatomical context. The input data, consisting of CT and label data and surface models of relevant structures, is defined in an XML file together with haptic rendering and visualization parameters. In a first evaluation, the Visible Human male dataset was used to generate a virtual training body. Several users with different levels of medical experience tested the lumbar puncture trainer. The simulator gives a good haptic and visual impression of the needle insertion, and the haptic volume-rendering technique enables the feeling of unsegmented structures. In particular, the restriction of transversal needle movement together with the rotation constraints enabled by the 6DOF device facilitates a realistic puncture simulation.
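One simple way to picture haptic force computation from label data: sample the label volume at the needle tip and command a resistance proportional to penetration, with a stiffness chosen per tissue label. This is a loose illustration under assumed stiffness values, far simpler than the 6DOF haptic volume rendering used in the simulator.

    # Label-volume force lookup sketch for needle insertion.
    import numpy as np

    STIFF = {0: 0.0, 1: 80.0, 2: 400.0, 3: 2000.0}  # air, fat, muscle, bone (N/m)

    def tip_force(tip_mm, entry_mm, labels, voxel_mm=1.0):
        idx = np.clip((tip_mm / voxel_mm).astype(int), 0, np.array(labels.shape) - 1)
        k = STIFF[int(labels[tuple(idx)])]           # stiffness of local tissue
        return -k * (tip_mm - entry_mm) * 1e-3       # resist along insertion (N)

    labels = np.zeros((50, 50, 50), dtype=int)
    labels[:, :, 20:] = 2                            # muscle slab behind the skin
    print(tip_force(np.array([25.0, 25.0, 24.0]), np.array([25.0, 25.0, 19.0]), labels))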
Griffiths, Paul G; Gillespie, R Brent
2005-01-01
This paper describes a paradigm for human/automation control sharing in which the automation acts through a motor coupled to a machine's manual control interface. The manual interface becomes a haptic display, continually informing the human about automation actions. While monitoring by feel, users may choose either to conform to the automation or to override it and express their own control intentions. This paper's objective is to demonstrate that adding automation through haptic display can be used not only to improve performance on a primary task but also to reduce perceptual demands or free attention for a secondary task. Results are presented from three experiments in which 11 participants completed a lane-following task using a motorized steering wheel on a fixed-base driving simulator. The automation behaved like a copilot, assisting with lane following by applying torques to the steering wheel. Results indicate that haptic assist improves lane following by at least 30%, p < .0001, while reducing visual demand by 29%, p < .0001, or improving reaction time in a secondary tone-localization task by 18 ms, p = .0009. Potential applications of this research include the design of automation interfaces based on haptics that support human/automation control sharing better than traditional push-button automation interfaces.
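The copilot behavior can be pictured as a torque on the shared steering wheel proportional to the lane-tracking error, which the driver feels and may override. Gains and the saturation limit below are illustrative assumptions, not the experiment's controller.

    # Hypothetical haptic shared-control steering law.
    def assist_torque(lat_err_m, lat_err_rate, kp=3.0, kd=0.8, tau_max=2.0):
        tau = -(kp * lat_err_m + kd * lat_err_rate)   # push back toward center
        return max(-tau_max, min(tau_max, tau))       # saturate so the driver can override

    print(assist_torque(0.4, 0.1), "N*m corrective torque while drifting right")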
Lin, Yanping; Chen, Huajiang; Yu, Dedong; Zhang, Ying; Yuan, Wen
2017-01-01
Bone drilling simulators with virtual and haptic feedback provide a safe, cost-effective and repeatable alternative to traditional surgical training methods. To develop such a simulator, accurate haptic rendering based on a force model is required to feed back bone drilling forces in response to user input. Current predictive bone drilling force models, based on bovine bones under various drilling conditions and parameters, are not representative of the bone drilling process in bone surgery. The objective of this study was to provide a bone drilling force model for haptic rendering based on calibration and validation experiments in fresh cadaveric bones with different bone densities. Using a commonly used drill bit geometry (2 mm diameter), feed rates (20-60 mm/min) and spindle speeds (4000-6000 rpm) in orthognathic surgeries, the bone drilling forces of specimens from two groups were measured and the calibration coefficients of the specific normal and frictional pressures were determined. The comparison of the predicted forces and the measured forces from validation experiments with a large range of feed rates and spindle speeds demonstrates that the proposed force model predicts the trends and average forces well. The presented bone drilling force model can be used for haptic rendering in surgical simulators.
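Mechanistic drilling-force models of this kind typically express thrust as a cutting term, scaled by the specific normal pressure and the feed per revolution, plus a friction term. The sketch below follows that common structure with placeholder coefficients; the paper's calibrated coefficients are not reproduced here.

    # Illustrative mechanistic drilling thrust model.
    def thrust_force(feed_mm_min, rpm, diam_mm=2.0, K_n=60.0, K_f=1.5):
        feed_per_rev = feed_mm_min / rpm             # mm advanced per revolution
        chip_area = 0.5 * diam_mm * feed_per_rev     # rough two-lip chip load
        return K_n * chip_area + K_f * diam_mm       # cutting + friction (N)

    for f, n in ((20, 4000), (60, 6000)):
        print(f"feed {f} mm/min @ {n} rpm -> {thrust_force(f, n):.2f} N")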
A pervasive visual-haptic framework for virtual delivery training.
Abate, Andrea F; Acampora, Giovanni; Loia, Vincenzo; Ricciardi, Stefano; Vasilakos, Athanasios V
2010-03-01
Thanks to advances in virtual reality (VR) technologies and haptic systems, virtual simulators are increasingly becoming a viable alternative to physical simulators in medicine and surgery, though many challenges remain. In this study, a pervasive visual-haptic framework aimed at training obstetricians and midwives in vaginal delivery is described. The haptic feedback is provided by means of two hand-based haptic devices able to reproduce force feedback on the fingers and arms, enabling much more realistic manipulation than stylus-based solutions. The interactive simulation is not solely driven by an approximated model of complex forces and physical constraints but is instead approached through a formal model of the whole labor and of the assistance/intervention procedures, performed by means of a timed automata network and applied to a parametric 3-D model of the anatomy able to mimic a wide range of configurations. This novel methodology is able not only to represent the sequence of the main events associated with either a spontaneous or an operative childbirth process, but also to help validate the manual intervention, as the actions performed by the user during the simulation are evaluated according to established medical guidelines. A discussion of the first results, as well as the challenges still unaddressed, is included.
Casellato, Claudia; Pedrocchi, Alessandra; Zorzi, Giovanna; Vernisse, Lea; Ferrigno, Giancarlo; Nardocci, Nardo
2013-05-01
New insights suggest that dystonic motor impairments could also involve a deficit of sensory processing. In this framework, biofeedback, which makes covert physiological processes more overt, could be useful. The present work proposes an innovative integrated setup that provides the user with electromyogram (EMG)-based visual-haptic biofeedback during upper limb movements (spiral tracking tasks), to test whether augmented sensory feedback can induce motor control improvement in patients with primary dystonia. The ad hoc developed real-time control algorithm synchronizes the haptic loop with the EMG reading; the brachioradialis EMG values were used to modify visual and haptic features of the interface: the higher the EMG level, the higher the virtual table friction, and the background color shifted proportionally from green to red. Recordings from dystonic and healthy subjects showed statistically that biofeedback has a significant impact, correlated with the local impairment, on dystonic muscular control. These tests point to the effectiveness of biofeedback paradigms in gaining better voluntary control of specific muscles. The flexible tool developed here shows promising prospects for clinical application and sensorimotor rehabilitation.
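The described mapping, higher EMG driving both higher virtual friction and a green-to-red background shift, can be sketched as two normalized transfer functions. Bounds and gains are assumptions for illustration.

    # EMG-driven visual-haptic biofeedback mapping sketch.
    def biofeedback(emg_rms, emg_min=0.01, emg_max=0.5,
                    mu_base=0.1, mu_span=0.9):
        a = (emg_rms - emg_min) / (emg_max - emg_min)
        a = max(0.0, min(1.0, a))              # clamp activation to [0, 1]
        friction = mu_base + mu_span * a       # haptic channel: table friction
        rgb = (a, 1.0 - a, 0.0)                # visual channel: green -> red
        return friction, rgb

    print(biofeedback(0.3))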
Absence of modulatory action on haptic height perception with musical pitch
Geronazzo, Michele; Avanzini, Federico; Grassi, Massimo
2015-01-01
Although acoustic frequency is not a spatial property of physical objects, in common language pitch, i.e., the psychological correlate of frequency, is often labeled spatially (i.e., “high in pitch” or “low in pitch”). Pitch-height is known to modulate (and interact with) the responses of participants when they are asked to judge spatial properties of non-auditory stimuli (e.g., visual) in a variety of behavioral tasks. In the current study we investigated whether the modulatory action of pitch-height extends to the haptic estimation of the height of a virtual step. We implemented a HW/SW setup able to render virtual 3D objects (stair-steps) haptically through a PHANTOM device and to provide real-time continuous auditory feedback depending on the user's interaction with the object. The haptic exploration was associated with a sinusoidal tone whose pitch varied as a function of the interaction point's height within (i) a narrower and (ii) a wider pitch range, or (iii) with a random pitch variation acting as a control audio condition. Explorations were also performed with no sound (haptic only). Participants were instructed to explore the virtual step freely and to communicate their height estimate by opening their thumb and index finger to mimic the step riser height, or verbally by reporting the height of the step riser in centimeters. We analyzed the role of musical expertise by dividing participants into non-musicians and musicians. Results showed no effect of musical pitch on height estimation with the highly realistic haptic feedback. Overall, there was no difference between the two groups in the proposed multimodal conditions. Additionally, we observed a different haptic response distribution between musicians and non-musicians when estimates in the auditory conditions were matched with estimates in the no-sound condition. PMID:26441745
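The height-to-pitch mapping lends itself to a short sketch, assuming linear interpolation of frequency over the step height; the frequency bounds and the uniform random control condition are illustrative assumptions, not the study's values.

```python
# Sketch of the height-to-pitch sonification: the tone frequency follows the
# contact point's height, with a random condition as the audio control.

import random

def height_to_pitch(height_m, height_max_m, f_low=200.0, f_high=2000.0,
                    random_control=False):
    """Return the tone frequency (Hz) for the current contact-point height."""
    if random_control:                        # control audio condition
        return random.uniform(f_low, f_high)
    t = min(max(height_m / height_max_m, 0.0), 1.0)
    return f_low + t * (f_high - f_low)       # narrower/wider range via f_low/f_high
```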
Weather information network including graphical display
NASA Technical Reports Server (NTRS)
Leger, Daniel R. (Inventor); Burdon, David (Inventor); Son, Robert S. (Inventor); Martin, Kevin D. (Inventor); Harrison, John (Inventor); Hughes, Keith R. (Inventor)
2006-01-01
An apparatus for providing weather information onboard an aircraft includes a processor unit and a graphical user interface. The processor unit processes weather information after it is received onboard the aircraft from a ground-based source, and the graphical user interface provides a graphical presentation of the weather information to a user onboard the aircraft. Preferably, the graphical user interface includes one or more user-selectable options for graphically displaying at least one of convection information, turbulence information, icing information, weather satellite information, SIGMET information, significant weather prognosis information, and winds aloft information.
NASA Astrophysics Data System (ADS)
Simonnet, Mathieu; Jacobson, Dan; Vieilledent, Stephane; Tisseau, Jacques
Navigating consists of coordinating egocentric and allocentric spatial frames of reference. Virtual environments have provided researchers in the spatial community with tools to investigate the learning of space. The issue of transfer between virtual and real situations is not trivial. A central question is the role of frames of reference in mediating the transfer of spatial knowledge to external surroundings, as is the effect of the different sensory modalities accessed in simulated and real worlds. This challenges the capacity of blind people to use virtual reality to explore a scene without graphics. The present experiment involves a haptic and auditory maritime virtual environment. In triangulation tasks we measured systematic errors, and preliminary results show an ability to learn configurational knowledge and to navigate through it without vision. Subjects appeared to take advantage of getting lost in an egocentric “haptic” view in the virtual environment to improve performance in the real environment.
Husman, M A B; Maqbool, H F; Awad, M I; Abouhossein, A; Dehghani-Sanij, A A
2016-08-01
Haptic feedback to lower limb amputees is essential to maximize the functionality of a prosthetic device by providing information to the user about the interaction with the environment and the position of the prosthesis in space. Severed sensory pathways and the absence of a connection between the prosthesis and the Central Nervous System (CNS) after lower limb amputation reduce balance control, increase visual dependency and increase the risk of falls among amputees. This work describes the design of a wearable haptic feedback device for lower limb amputees using a lateral skin-stretch modality, intended to serve as a feedback cue during ambulation. A feedback scheme based on gait event detection was proposed for possible real-time postural adjustment. A preliminary perceptual test with healthy subjects in static conditions was carried out, and the results indicated over 98% accuracy in determining stimulus location around the upper leg region, suggesting good perceptibility of the delivered stimuli.
Advanced Maintenance Simulation by Means of Hand-Based Haptic Interfaces
NASA Astrophysics Data System (ADS)
Nappi, Michele; Paolino, Luca; Ricciardi, Stefano; Sebillo, Monica; Vitiello, Giuliana
Aerospace industry has been involved in virtual simulation for design and testing since the birth of virtual reality. Today this industry is showing a growing interest in the development of haptic-based maintenance training applications, which represent the most advanced way to simulate maintenance and repair tasks within a virtual environment by means of a visual-haptic approach. The goal is to allow the trainee to experience the service procedures not only as a workflow reproduced at a visual level but also in terms of the kinaesthetic feedback involved in the manipulation of tools and components. This study, conducted in collaboration with aerospace industry specialists, is aimed at the development of an immersive system capable of placing trainees in a virtual environment where mechanics and technicians can perform maintenance simulation or training tasks by directly manipulating 3D virtual models of aircraft parts while perceiving force feedback through the haptic interface. The proposed system is based on ViRstperson, a virtual reality engine under development at the Italian Center for Aerospace Research (CIRA) to support engineering and technical activities such as design-time maintenance procedure validation and maintenance training. This engine has been extended to support haptic-based interaction, enabling a more complete level of interaction, also in terms of impedance control, and thus fostering the development of haptic knowledge in the user. The user's “sense of touch” within the immersive virtual environment is simulated through an Immersion CyberForce® hand-based force-feedback device. Preliminary testing of the proposed system seems encouraging.
Vibrotactile perception assessment for a haptic interface on an antigravity suit.
Ko, Sang Min; Lee, Kwangil; Kim, Daeho; Ji, Yong Gu
2017-01-01
Haptic technology is used in various fields to transmit information to the user with or without visual and auditory cues. This study aimed to provide preliminary data for use in developing a haptic interface for an antigravity (anti-G) suit. With the structural characteristics of the anti-G suit in mind, we determined five areas on the body (lower back, outer thighs, inner thighs, outer calves, and inner calves) on which to install ten bar-type eccentric rotating mass (ERM) motors as vibration actuators. To determine the design factors of the haptic anti-G suit, we conducted three experiments to find the absolute threshold, moderate intensity, and subjective assessments of vibrotactile stimuli. Twenty-six fighter pilots participated in the experiments, which were conducted in a fixed-based flight simulator. From the results of our study, we recommend 1) absolute thresholds of ∼11.98-15.84 Hz and 102.01-104.06 dB, 2) moderate intensities of 74.36 Hz and 126.98 dB for the lower back and 58.65 Hz and 122.37 dB for either side of the thighs and calves, and 3) subjective assessments of vibrotactile stimuli (displeasure, easy to perceive, and level of comfort). The results of this study will be useful for the design of a haptic anti-G suit. Copyright © 2016 Elsevier Ltd. All rights reserved.
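For reference, the recommended operating points above can be collected into a small configuration sketch; the dictionary layout and key names are assumptions made for illustration, not part of the study.

```python
# The study's recommended vibrotactile operating points for a haptic anti-G
# suit, gathered into one config structure (layout is an assumption).

ANTI_G_HAPTIC_PARAMS = {
    "absolute_threshold": {            # lowest reliably detectable stimulus
        "freq_hz": (11.98, 15.84),
        "level_db": (102.01, 104.06),
    },
    "moderate_intensity": {            # comfortable operating points
        "lower_back":    {"freq_hz": 74.36, "level_db": 126.98},
        "thighs_calves": {"freq_hz": 58.65, "level_db": 122.37},
    },
}
```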
Framework for e-learning assessment in dental education: a global model for the future.
Arevalo, Carolina R; Bayne, Stephen C; Beeley, Josie A; Brayshaw, Christine J; Cox, Margaret J; Donaldson, Nora H; Elson, Bruce S; Grayden, Sharon K; Hatzipanagos, Stylianos; Johnson, Lynn A; Reynolds, Patricia A; Schönwetter, Dieter J
2013-05-01
The framework presented in this article demonstrates strategies for a global approach to e-curricula in dental education by considering a collection of outcome assessment tools. By combining the outcomes for overall assessment, a global model for a pilot project that applies e-assessment tools to virtual learning environments (VLE), including haptics, is presented. Assessment strategies from two projects, HapTEL (Haptics in Technology Enhanced Learning) and UDENTE (Universal Dental E-learning), act as case-user studies that have helped develop the proposed global framework. They incorporate additional assessment tools and include evaluations from questionnaires and stakeholders' focus groups. These measure each of the factors affecting the classical teaching/learning theory framework as defined by Entwistle in a standardized manner. A mathematical combinatorial approach is proposed to join these results together as a global assessment. With the use of haptic-based simulation learning, exercises for tooth preparation assessing enamel and dentine were compared to plastic teeth in manikins. Equivalence for student performance for haptic versus traditional preparation methods was established, thus establishing the validity of the haptic solution for performing these exercises. Further data collected from HapTEL are still being analyzed, and pilots are being conducted to validate the proposed test measures. Initial results have been encouraging, but clearly the need persists to develop additional e-assessment methods for new learning domains.
Prototyping a Hybrid Cooperative and Tele-robotic Surgical System for Retinal Microsurgery
Balicki, Marcin; Xia, Tian; Jung, Min Yang; Deguet, Anton; Vagvolgyi, Balazs; Kazanzides, Peter; Taylor, Russell
2013-01-01
This paper presents the design of a tele-robotic microsurgical platform designed for development of cooperative and tele-operative control schemes, sensor based smart instruments, user interfaces and new surgical techniques with eye surgery as the driving application. The system is built using the distributed component-based cisst libraries and the Surgical Assistant Workstation framework. It includes a cooperatively controlled EyeRobot2, a da Vinci Master manipulator, and a remote stereo visualization system. We use constrained optimization based virtual fixture control to provide Virtual Remote-Center-of-Motion (vRCM) and haptic feedback. Such system can be used in a hybrid setup, combining local cooperative control with remote tele-operation, where an experienced surgeon can provide hand-over-hand tutoring to a novice user. In another scheme, the system can provide haptic feedback based on virtual fixtures constructed from real-time force and proximity sensor information. PMID:24398557
Collision detection and modeling of rigid and deformable objects in laparoscopic simulator
NASA Astrophysics Data System (ADS)
Dy, Mary-Clare; Tagawa, Kazuyoshi; Tanaka, Hiromi T.; Komori, Masaru
2015-03-01
Laparoscopic simulators are viable alternatives for surgical training and rehearsal. Haptic devices can also be incorporated with virtual reality simulators to provide additional cues to the users. However, to provide realistic feedback, the haptic device must be updated at 1 kHz. On the other hand, realistic visual cues, that is, the collision detection and deformation between interacting objects, must be rendered at least at 30 fps. Our current laparoscopic simulator detects the collision between a point on the tool tip and the organ surfaces, with haptic devices attached to actual tool tips for realistic tool manipulation. The triangular-mesh organ model is rendered using a mass-spring deformation model or finite element method-based models. In this paper, we investigated multi-point-based collision detection on the rigid tool rods. Based on the preliminary results, we propose a method to improve the collision detection scheme and speed up the organ deformation reaction. We discuss our proposal for an efficient method to compute simultaneous multiple collisions between rigid (laparoscopic tools) and deformable (organs) objects and to perform the subsequent collision response, with haptic feedback, in real time.
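The two update rates mentioned above suggest a dual-rate loop structure, sketched below under the assumption of one haptic thread at roughly 1 kHz and one visual/deformation loop at roughly 30 fps; the thread layout and callback names are illustrative, not the simulator's actual architecture.

```python
# Sketch of the dual-rate simulation loop implied by the abstract: a fast
# force-update thread and a slower collision/deformation/render loop.

import threading
import time

def haptic_loop(get_force, send_to_device, stop):
    while not stop.is_set():              # ~1 kHz force update for the device
        send_to_device(get_force())
        time.sleep(0.001)

def visual_loop(detect_collisions, deform_and_render, stop):
    while not stop.is_set():              # ~30 fps collision + deformation
        deform_and_render(detect_collisions())
        time.sleep(1.0 / 30.0)

# Usage sketch:
# stop = threading.Event()
# threading.Thread(target=haptic_loop, args=(f, send, stop)).start()
```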
Man, mind, and machine: the past and future of virtual reality simulation in neurologic surgery.
Robison, R Aaron; Liu, Charles Y; Apuzzo, Michael L J
2011-11-01
To review virtual reality in neurosurgery, including the history of simulation and virtual reality and some of the current implementations; to examine some of the technical challenges involved; and to propose a potential paradigm for the development of virtual reality in neurosurgery going forward. A search was made on PubMed using key words surgical simulation, virtual reality, haptics, collision detection, and volumetric modeling to assess the current status of virtual reality in neurosurgery. Based on previous results, investigators extrapolated the possible integration of existing efforts and potential future directions. Simulation has a rich history in surgical training, and there are numerous currently existing applications and systems that involve virtual reality. All existing applications are limited to specific task-oriented functions and typically sacrifice visual realism for real-time interactivity or vice versa, owing to numerous technical challenges in rendering a virtual space in real time, including graphic and tissue modeling, collision detection, and direction of the haptic interface. With ongoing technical advancements in computer hardware and graphic and physical rendering, incremental or modular development of a fully immersive, multipurpose virtual reality neurosurgical simulator is feasible. The use of virtual reality in neurosurgery is predicted to change the nature of neurosurgical education, and to play an increased role in surgical rehearsal and the continuing education and credentialing of surgical practitioners. Copyright © 2011 Elsevier Inc. All rights reserved.
Buck, Ursula; Naether, Silvio; Braun, Marcel; Thali, Michael
2008-09-18
Non-invasive documentation methods such as surface scanning and radiological imaging are gaining in importance in the forensic field. These three-dimensional technologies provide digital 3D data, which are processed and handled in the computer. However, the sense of touch gets lost with the virtual approach. A haptic device enables the use of the sense of touch to handle and feel digital 3D data. The multifunctional application of a haptic device for forensic approaches is evaluated and illustrated in three different cases: the representation of bone fractures of the lower extremities caused by traffic accidents, in a non-invasive manner; the comparison of bone injuries with the presumed injury-inflicting instrument; and, in a gunshot case, the identification of the gun by the muzzle imprint and the reconstruction of the holding position of the gun. The 3D models of the bones are generated from Computed Tomography (CT) images. The 3D models of the exterior injuries, the injury-inflicting tools and the bone injuries, where a higher resolution is necessary, are created by optical surface scanning. The haptic device is used in combination with the software FreeForm Modelling Plus to touch the surface of the 3D models in order to feel minute injuries and the surfaces of tools, to reposition displaced bone parts and to compare an injury-causing instrument with an injury. The repositioning of 3D models in a reconstruction is executed more easily, faster and more precisely by means of the sense of touch and the user-friendly movement in 3D space. For representation purposes, the fracture lines of bones are coloured. This work demonstrates that the haptic device is a suitable and efficient application in forensic science. The haptic device offers a new way of handling digital data in virtual 3D space.
Graphical User Interface Programming in Introductory Computer Science.
ERIC Educational Resources Information Center
Skolnick, Michael M.; Spooner, David L.
Modern computing systems exploit graphical user interfaces for interaction with users; as a result, introductory computer science courses must begin to teach the principles underlying such interfaces. This paper presents an approach to graphical user interface (GUI) implementation that is simple enough for beginning students to understand, yet…
Use of force feedback to enhance graphical user interfaces
NASA Astrophysics Data System (ADS)
Rosenberg, Louis B.; Brave, Scott
1996-04-01
This project focuses on the use of force feedback sensations to enhance user interaction with standard graphical user interface paradigms. While typical joystick and mouse devices are input-only, force feedback controllers allow physical sensations to be reflected to a user. Tasks that require users to position a cursor on a given target can be enhanced by applying physical forces to the user that aid in targeting. For example, an attractive force field implemented at the location of a graphical icon can greatly facilitate target acquisition and selection of the icon. It has been shown that force feedback can enhance a user's ability to perform basic functions within graphical user interfaces.
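An attractive force field of the kind described can be sketched as a simple spring pull inside an activation radius; the gain and radius values below are illustrative assumptions.

```python
# Sketch of an attractive force field around a GUI icon: inside the field
# radius the cursor is pulled toward the icon center, easing target
# acquisition. Gain and radius are assumptions for illustration.

import math

def icon_attraction(cursor, icon, radius=30.0, k=0.05):
    """Return the (fx, fy) force pulling the cursor toward the icon center."""
    dx, dy = icon[0] - cursor[0], icon[1] - cursor[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0 or dist > radius:
        return (0.0, 0.0)        # no force outside the field or at the center
    return (k * dx, k * dy)      # spring-like pull toward the icon
```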
Yoon, Han U.; Anil Kumar, Namita; Hur, Pilwon
2017-01-01
Cutaneous sensory feedback can be used to provide additional sensory cues to a person performing a motor task where vision is a dominant feedback signal. A haptic joystick has been widely used to guide a user by providing force feedback. However, the benefit of providing force feedback is still debatable due to performance dependency on factors such as the user's skill level and task difficulty. Meanwhile, recent studies have shown the feasibility of improving motor task performance by providing skin-stretch feedback. Therefore, a combination of the two aforementioned feedback types is deemed promising to promote synergistic effects that consistently improve a person's motor performance. In this study, we aimed at identifying the effect of combined haptic and skin-stretch feedback on elderly persons' driving motor performance. For the experiment, 15 healthy elderly subjects (age 72.8 ± 6.6 years) were recruited and instructed to drive a virtual power wheelchair through four different courses with obstacles. Four augmented sensory feedback conditions were tested: no feedback, force feedback, skin-stretch feedback, and a combination of both force and skin-stretch feedback. While the haptic force was provided to the hand by the joystick, the skin stretch was provided to the steering forearm by a custom-designed wearable skin-stretch device. We tested two hypotheses: (i) an elderly individual's motor control would benefit from receiving information about a desired trajectory from multiple sensory feedback sources, and (ii) the benefit does not depend on task difficulty. Various metrics related to skills and safety were used to evaluate control performance. Repeated-measures ANOVA was performed for those metrics with two factors: task scenario and type of augmented sensory feedback. The results revealed that elderly subjects' control performance significantly improved when the combination of haptic force and skin-stretch feedback was applied. The proposed approach suggests the feasibility of improving people's task performance through the synergistic effects of multiple augmented sensory feedback modalities. PMID:28690514
Kokes, Rebecca; Lister, Kevin; Gullapalli, Rao; Zhang, Bao; MacMillan, Alan; Richard, Howard; Desai, Jaydev P.
2009-01-01
Objective The purpose of this paper is to explore the feasibility of developing a MRI-compatible needle driver system for radiofrequency ablation (RFA) of breast tumors under continuous MRI imaging while being teleoperated by a haptic feedback device from outside the scanning room. The developed needle driver prototype was designed and tested both for tumor targeting capability and for RFA. Methods The single degree-of-freedom (DOF) prototype was interfaced with a PHANToM haptic device controlled from outside the scanning room. Experiments were performed to demonstrate MRI compatibility and position control accuracy with hydraulic actuation, along with an experiment to determine the PHANToM's ability to guide the RFA tool to a tumor nodule within a phantom breast tissue model while continuously imaging within the MRI scanner and receiving force feedback from the RFA tool. Results Hydraulic actuation is shown to be a feasible actuation technique for operation in an MRI environment. The design is MRI-compatible in all aspects except for force sensing in the directions perpendicular to the direction of motion. Experiments confirm that the user is able to distinguish healthy from cancerous tissue in a phantom model when provided with both visual (imaging) feedback and haptic feedback. Conclusion The teleoperated 1-DOF needle driver system presented in this paper demonstrates the feasibility of implementing a MRI-compatible robot for RFA of breast tumors with haptic feedback capability. PMID:19303805
Rodriguez-Guerrero, Carlos; Knaepen, Kristel; Fraile-Marinero, Juan C.; Perez-Turiel, Javier; Gonzalez-de-Garibay, Valentin; Lefeber, Dirk
2017-01-01
In order to harmonize robotic devices with human beings, the robots should be able to perceive important psychosomatic impact triggered by emotional states such as frustration or boredom. This paper presents a new type of biocooperative control architecture, which acts toward improving the challenge/skill relation perceived by the user when interacting with a robotic multimodal interface in a cooperative scenario. In the first part of the paper, open-loop experiments revealed which physiological signals were optimal for inclusion in the feedback loop. These were heart rate, skin conductance level, and skin conductance response frequency. In the second part of the paper, the proposed controller, consisting of a biocooperative architecture with two degrees of freedom, simultaneously modulating game difficulty and haptic assistance through performance and psychophysiological feedback, is presented. With this setup, the perceived challenge can be modulated by means of the game difficulty and the perceived skill by means of the haptic assistance. A new metric (FlowIndex) is proposed to numerically quantify and visualize the challenge/skill relation. The results are contrasted with comparable previously published work and show that the new method afforded a higher FlowIndex (i.e., a superior challenge/skill relation) and an improved balance between augmented performance and user satisfaction (higher level of valence, i.e., a more enjoyable and satisfactory experience). PMID:28507503
Building the Joint Battlespace Infosphere. Volume 2: Interactive Information Technologies
1999-12-17
G. A. Vouros, “A Knowledge-Based Methodology for Supporting Multilingual and User-Tailored Interfaces,” Interacting With Computers, Vol. 9 (1998), p... project is to develop a two-handed user interface to the stereoscopic field analyzer, an interactive 3-D scientific visualization system. ... See http://www.hitl.washington.edu/research/vrd/. ... R. Baumann and R. Clavel, “Haptic Interface for Virtual Reality Based
A dual-user teleoperation system with Online Authority Adjustment for haptic training.
Fei Liu; Leleve, Arnaud; Eberard, Damien; Redarce, Tanneguy
2015-08-01
This paper introduces a dual-user teleoperation system for hands-on medical training. A shared-control-based architecture is presented for authority management. In this structure, the combination of control signals is obtained using a dominance factor. Its main improvement is Online Authority Adjustment (OAA): the authority can be adjusted manually or automatically as training progresses. Experimental results are provided to validate the performance of the system.
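A minimal sketch of dominance-factor blending in the standard shared-control form, assuming two master commands; the OAA logic that adjusts the factor online is abstracted into the alpha argument, which is an assumption about how the paper's scheme would be parameterized.

```python
# Sketch of dominance-factor blending for a dual-user trainer: the command
# sent to the slave is a convex combination of the two masters' commands.

def blended_command(u_trainer, u_trainee, alpha):
    """alpha in [0, 1]: 1.0 gives the trainer full authority, 0.0 the trainee.
    Online Authority Adjustment would update alpha during training."""
    return alpha * u_trainer + (1.0 - alpha) * u_trainee
```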
Representing Graphical User Interfaces with Sound: A Review of Approaches
ERIC Educational Resources Information Center
Ratanasit, Dan; Moore, Melody M.
2005-01-01
The inability of computer users who are visually impaired to access graphical user interfaces (GUIs) has led researchers to propose approaches for adapting GUIs to auditory interfaces, with the goal of providing access for visually impaired people. This article outlines the issues involved in nonvisual access to graphical user interfaces, reviews…
a New ER Fluid Based Haptic Actuator System for Virtual Reality
NASA Astrophysics Data System (ADS)
Böse, H.; Baumann, M.; Monkman, G. J.; Egersdörfer, S.; Tunayar, A.; Freimuth, H.; Ermert, H.; Khaled, W.
The concept and some steps in the development of a new actuator system which enables the haptic perception of mechanically inhomogeneous virtual objects are introduced. The system consists of a two-dimensional planar array of actuator elements containing an electrorheological (ER) fluid. When a user presses his fingers onto the surface of the actuator array, he perceives locally variable resistance forces generated by vertical pistons which slide in the ER fluid through the gaps between electrode pairs. The voltage in each actuator element can be individually controlled by a novel sophisticated switching technology based on optoelectric gallium arsenide elements. The haptic information represented at the actuator array can be transferred from a corresponding sensor system based on ultrasonic elastography. The combined sensor-actuator system may serve as a technology platform for various applications in virtual reality, such as telemedicine, where information on the tissue consistency of a real patient is detected by the sensor part and reproduced by the actuator part at a remote location.
NASA Technical Reports Server (NTRS)
Litt, Jonathan; Wong, Edmond; Simon, Donald L.
1994-01-01
A prototype Lisp-based soft real-time object-oriented Graphical User Interface for control system development is presented. The Graphical User Interface executes alongside a test system in laboratory conditions to permit observation of the closed loop operation through animation, graphics, and text. Since it must perform interactive graphics while updating the screen in real time, techniques are discussed which allow quick, efficient data processing and animation. Examples from an implementation are included to demonstrate some typical functionalities which allow the user to follow the control system's operation.
The Role of Direct and Visual Force Feedback in Suturing Using a 7-DOF Dual-Arm Teleoperated System.
Talasaz, Ali; Trejos, Ana Luisa; Patel, Rajni V
2017-01-01
The lack of haptic feedback in robotics-assisted surgery can result in tissue damage or accidental tool-tissue hits. This paper focuses on exploring the effect of haptic feedback via direct force reflection and visual presentation of force magnitudes on performance during suturing in robotics-assisted minimally invasive surgery (RAMIS). For this purpose, a haptics-enabled dual-arm master-slave teleoperation system capable of measuring tool-tissue interaction forces in all seven Degrees-of-Freedom (DOFs) was used. Two suturing tasks, tissue puncturing and knot-tightening, were chosen to assess user skills when suturing on phantom tissue. Sixteen subjects participated in the trials and their performance was evaluated from various points of view: force consistency, number of accidental hits with tissue, amount of tissue damage, quality of the suture knot, and the time required to accomplish the task. According to the results, visual force feedback was not very useful during the tissue puncturing task as different users needed different amounts of force depending on the penetration of the needle into the tissue. Direct force feedback, however, was more useful for this task to apply less force and to minimize the amount of damage to the tissue. Statistical results also reveal that both visual and direct force feedback were required for effective knot tightening: direct force feedback could reduce the number of accidental hits with the tissue and also the amount of tissue damage, while visual force feedback could help to securely tighten the suture knots and maintain force consistency among different trials/users. These results provide evidence of the importance of 7-DOF force reflection when performing complex tasks in a RAMIS setting.
Virtual reality simulation in neurosurgery: technologies and evolution.
Chan, Sonny; Conti, François; Salisbury, Kenneth; Blevins, Nikolas H
2013-01-01
Neurosurgeons are faced with the challenge of learning, planning, and performing increasingly complex surgical procedures in which there is little room for error. With improvements in computational power and advances in visual and haptic display technologies, virtual surgical environments can now offer potential benefits for surgical training, planning, and rehearsal in a safe, simulated setting. This article introduces the various classes of surgical simulators and their respective purposes through a brief survey of representative simulation systems in the context of neurosurgery. Many technical challenges currently limit the application of virtual surgical environments. Although we cannot yet expect a digital patient to be indistinguishable from reality, new developments in computational methods and related technology bring us closer every day. We recognize that the design and implementation of an immersive virtual reality surgical simulator require expert knowledge from many disciplines. This article highlights a selection of recent developments in research areas related to virtual reality simulation, including anatomic modeling, computer graphics and visualization, haptics, and physics simulation, and discusses their implication for the simulation of neurosurgery.
The Graphical User Interface: Crisis, Danger, and Opportunity.
ERIC Educational Resources Information Center
Boyd, L. H.; And Others
1990-01-01
This article describes differences between the graphical user interface and traditional character-based interface systems, identifies potential problems posed by graphic computing environments for blind computer users, and describes some programs and strategies that are being developed to provide access to those environments. (Author/JDD)
A Prototype Graphical User Interface for Co-op: A Group Decision Support System.
1992-03-01
achieve their potential to communicate. Information-oriented, systematic graphic design is the use of typography, symbols, color, and other static and... This thesis designs and develops the prototype Graphical User Interface (GUI) for Co-op...
Graphical Requirements for Force Level Planning. Volume 2
1991-09-01
technology review includes graphics algorithms, computer hardware, computer software, and design methodologies. The technology can either exist today or... level graphics language. 7.4 User Interface Design Tools: As user interfaces have become more sophisticated, they have become harder to develop. ... Stephen M. Pizer, editors, Proceedings 1986 Workshop on Interactive 3D Graphics, October 1986. J. S. Dumas, Designing User Interface Software, Prentice
An Object-Oriented Graphical User Interface for a Reusable Rocket Engine Intelligent Control System
NASA Technical Reports Server (NTRS)
Litt, Jonathan S.; Musgrave, Jeffrey L.; Guo, Ten-Huei; Paxson, Daniel E.; Wong, Edmond; Saus, Joseph R.; Merrill, Walter C.
1994-01-01
An intelligent control system for reusable rocket engines under development at NASA Lewis Research Center requires a graphical user interface to allow observation of the closed-loop system in operation. The simulation testbed consists of a real-time engine simulation computer, a controls computer, and several auxiliary computers for diagnostics and coordination. The system is set up so that the simulation computer could be replaced by the real engine and the change would be transparent to the control system. Because of the hard real-time requirement of the control computer, putting a graphical user interface on it was not an option. Thus, a separate computer used strictly for the graphical user interface was warranted. An object-oriented LISP-based graphical user interface has been developed on a Texas Instruments Explorer 2+ to indicate the condition of the engine to the observer through plots, animation, interactive graphics, and text.
Common Graphics Library (CGL). Volume 1: LEZ user's guide
NASA Technical Reports Server (NTRS)
Taylor, Nancy L.; Hammond, Dana P.; Hofler, Alicia S.; Miner, David L.
1988-01-01
Users are introduced to and instructed in the use of the Langley Easy (LEZ) routines of the Common Graphics Library (CGL). The LEZ routines form an application independent graphics package which enables the user community to view data quickly and easily, while providing a means of generating scientific charts conforming to the publication and/or viewgraph process. A distinct advantage for using the LEZ routines is that the underlying graphics package may be replaced or modified without requiring the users to change their application programs. The library is written in ANSI FORTRAN 77, and currently uses a CORE-based underlying graphics package, and is therefore machine independent, providing support for centralized and/or distributed computer systems.
Haptic-assistive technologies for audition and vision sensory disabilities.
Sorgini, Francesca; Caliò, Renato; Carrozza, Maria Chiara; Oddo, Calogero Maria
2018-05-01
The aim of this review is to analyze haptic sensory substitution technologies for deaf, blind and deaf-blind individuals. The literature search was performed in the Scopus, PubMed and Google Scholar databases using selected keywords, analyzing studies from the 1960s to the present. The search for scientific publications was accompanied by a web search for commercial devices. Results were classified by sensory disability and functionality, and analyzed by assistive technology. Complementary analyses were also carried out on the websites of public international agencies, such as the World Health Organization (WHO), and of associations representing sensory disabled persons. The reviewed literature provides evidence that sensory substitution aids are able to mitigate in part the deficits in language learning, communication and navigation for deaf, blind and deaf-blind individuals, and that the tactile sense can be a means of communication to provide some kinds of information to sensory disabled individuals. A lack of acceptance emerged from the discussion of the capabilities and limitations of haptic assistive technologies. Future research should move towards miniaturized, custom-designed and low-cost haptic interfaces and integration with personal devices such as smartphones, for a wider diffusion of sensory aids among disabled people. Implications for rehabilitation: Systematic review of the state of the art of haptic assistive technologies for vision and audition sensory disabilities. Sensory substitution systems for visual and hearing disabilities have a central role in the transmission of information to patients with sensory impairments, enabling users to interact with the non-disabled community in daily activities. Visual and auditory inputs are converted into haptic feedback via different actuation technologies, and the information is presented in the form of static or dynamic stimulation of the skin. Their effectiveness and ease of use make haptic sensory substitution systems suitable for patients with different levels of disability, and they constitute a cheaper and less invasive alternative to implantable partial sensory restitution systems. Future research is oriented towards the optimization of stimulation parameters, together with the development of miniaturized, custom-designed and low-cost aids operating in synergy in networks, aiming to increase patients' acceptance of these technologies.
NASA Technical Reports Server (NTRS)
Mavroidis, Constantinos; Pfeiffer, Charles; Paljic, Alex; Celestino, James; Lennon, Jamie; Bar-Cohen, Yoseph
2000-01-01
For many years, the robotics community sought to develop robots that can eventually operate autonomously and eliminate the need for human operators. However, there is an increasing realization that there are some tasks that humans can perform significantly better but that, due to associated hazards, distance, physical limitations and other causes, only robots can be employed to perform. Remotely performing these types of tasks requires operating robots as human surrogates. While current "hand master" haptic systems are able to reproduce the feeling of rigid objects, they present great difficulties in emulating the feeling of remote/virtual stiffness. In addition, they tend to be heavy and cumbersome, and usually they allow only a limited operator workspace. In this paper a novel haptic interface is presented to enable human operators to "feel" and intuitively mirror the stiffness/forces at remote/virtual sites, enabling control of robots as human surrogates. This haptic interface is intended to provide human operators an intuitive feeling of the stiffness and forces at remote or virtual sites in support of space robots performing dexterous manipulation tasks (such as operating a wrench or a drill). Remote applications refer to the control of actual robots, whereas virtual applications refer to simulated operations. The developed haptic interface will be applicable to IVA-operated robotic EVA tasks to enhance human performance, extend crew capability and assure crew safety. The electrically controlled stiffness is obtained using a constrained ElectroRheological Fluid (ERF), which changes its viscosity under electrical stimulation. Forces applied at the robot end-effector due to a compliant environment will be reflected to the user through this ERF device, in which a change in the system viscosity occurs proportionally to the force to be transmitted. In this paper, we present the results of our modeling, simulation, and initial testing of such an electrorheological fluid (ERF) based haptic device.
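The proportional force-reflection law described above can be sketched as follows, assuming a linear mapping from measured end-effector force to ERF excitation voltage; the gain and saturation values are illustrative assumptions, not the device's calibration.

```python
# Sketch of proportional ERF force reflection: the electrode voltage (and
# hence fluid viscosity / resistive feel) scales with the force measured at
# the remote robot's end-effector. Gain and limit are assumptions.

def erf_voltage(force_n, gain_v_per_n=150.0, v_max=4000.0):
    """Map the end-effector force (N) to an ERF electrode voltage (V)."""
    return min(abs(force_n) * gain_v_per_n, v_max)   # saturate at the supply limit
```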
Common Graphics Library (CGL). Volume 2: Low-level user's guide
NASA Technical Reports Server (NTRS)
Taylor, Nancy L.; Hammond, Dana P.; Theophilos, Pauline M.
1989-01-01
The intent is to instruct the users of the Low-Level routines of the Common Graphics Library (CGL). The Low-Level routines form an application-independent graphics package enabling the user community to construct and design scientific charts conforming to the publication and/or viewgraph process. The Low-Level routines allow the user to design unique or unusual report-quality charts from a set of graphics utilities. The features of these routines can be used stand-alone or in conjunction with other packages to enhance or augment their capabilities. This library is written in ANSI FORTRAN 77, and currently uses a CORE-based underlying graphics package, and is therefore machine-independent, providing support for centralized and/or distributed computer systems.
Doxon, Andrew J; Johnson, David E; Tan, Hong Z; Provancher, William R
2013-01-01
Many of the devices used in haptics research are over-engineered for the task and are designed with capabilities that go far beyond human perception levels. Designing devices that more closely match the limits of human perception will make them smaller, less expensive, and more useful. However, many device-centric perception thresholds have yet to be evaluated. To this end, three experiments were conducted, using a one-degree-of-freedom contact location feedback device in combination with a kinesthetic display, to provide a more explicit set of specifications for similar tactile-kinesthetic haptic devices. The first of these experiments evaluated the ability of humans to repeatedly localize tactile cues across the fingerpad. Subjects could localize cues to within 1.3 mm and showed a bias toward the center of the fingerpad. The second experiment evaluated the minimum perceptible difference of backlash at the tactile element. Subjects were able to discriminate device backlash in excess of 0.46 mm on low-curvature models and 0.93 mm on high-curvature models. The last experiment evaluated the minimum perceptible difference of system delay between user action and device reaction. Subjects were able to discriminate delays in excess of 61 ms. The results from these studies can serve as the maximum (i.e., most demanding) device specifications for most tactile-kinesthetic haptic systems.
OAP- OFFICE AUTOMATION PILOT GRAPHICS DATABASE SYSTEM
NASA Technical Reports Server (NTRS)
Ackerson, T.
1994-01-01
The Office Automation Pilot (OAP) Graphics Database system offers the IBM PC user assistance in producing a wide variety of graphs and charts. OAP uses a convenient database system, called a chartbase, for creating and maintaining data associated with the charts, and twelve different graphics packages are available to the OAP user. Each of the graphics capabilities is accessed in a similar manner. The user chooses creation, revision, or chartbase/slide show maintenance options from an initial menu. The user may then enter or modify data displayed on a graphic chart. The cursor moves through the chart in a "circular" fashion to facilitate data entries and changes. Various "help" functions and on-screen instructions are available to aid the user. The user data is used to generate the graphics portion of the chart. Completed charts may be displayed in monotone or color, printed, plotted, or stored in the chartbase on the IBM PC. Once completed, the charts may be put in a vector format and plotted for color viewgraphs. The twelve graphics capabilities are divided into three groups: Forms, Structured Charts, and Block Diagrams. There are eight Forms available: 1) Bar/Line Charts, 2) Pie Charts, 3) Milestone Charts, 4) Resources Charts, 5) Earned Value Analysis Charts, 6) Progress/Effort Charts, 7) Travel/Training Charts, and 8) Trend Analysis Charts. There are three Structured Charts available: 1) Bullet Charts, 2) Organization Charts, and 3) Work Breakdown Structure (WBS) Charts. The Block Diagram available is an N x N Chart. Each graphics capability supports a chartbase. The OAP graphics database system provides the IBM PC user with an effective means of managing data which is best interpreted as a graphic display. The OAP graphics database system is written in IBM PASCAL 2.0 and assembler for interactive execution on an IBM PC or XT with at least 384K of memory, and a color graphics adapter and monitor. Printed charts require an Epson, IBM, OKIDATA, or HP Laser printer (or equivalent). Plots require the Tektronix 4662 Penplotter. Source code is supplied to the user for modification and customizing. Executables are also supplied for all twelve graphics capabilities. This system was developed in 1983, and Version 3.1 was released in 1986.
Schvartzman, Sara C; Silva, Rebeka; Salisbury, Ken; Gaudilliere, Dyani; Girod, Sabine
2014-10-01
Computer-assisted surgical (CAS) planning tools have become widely available in craniomaxillofacial surgery, but are time consuming and often require professional technical assistance to simulate a case. An initial oral and maxillofacial (OM) surgical user experience was evaluated with a newly developed CAS system featuring a bimanual sense of touch (haptic). Three volunteer OM surgeons received a 5-minute verbal introduction to the use of a newly developed haptic-enabled planning system. The surgeons were instructed to simulate mandibular fracture reductions of 3 clinical cases, within a 15-minute time limit and without a time limit, and complete a questionnaire to assess their subjective experience with the system. Standard landmarks and linear and angular measurements between the simulated results and the actual surgical outcome were compared. After the 5-minute instruction, all 3 surgeons were able to use the system independently. The analysis of standardized anatomic measurements showed that the simulation results within a 15-minute time limit were not significantly different from those without a time limit. Mean differences between measurements of surgical and simulated fracture reductions were within current resolution limitations in collision detection, segmentation of computed tomographic scans, and haptic devices. All 3 surgeons reported that the system was easy to learn and use and that they would be comfortable integrating it into their daily clinical practice for trauma cases. A CAS system with a haptic interface that capitalizes on touch and force feedback experience similar to operative procedures is fast and easy for OM surgeons to learn and use. Copyright © 2014 American Association of Oral and Maxillofacial Surgeons. All rights reserved.
Reasoning about Users' Actions in a Graphical User Interface.
ERIC Educational Resources Information Center
Virvou, Maria; Kabassi, Katerina
2002-01-01
Describes a graphical user interface called IFM (Intelligent File Manipulator) that provides intelligent help to users. Explains two underlying reasoning mechanisms, one an adaptation of human plausible reasoning and one that performs goal recognition based on the effects of users' commands; and presents results of an empirical study that…
Temporal bone dissection simulator for training pediatric otolaryngology surgeons
NASA Astrophysics Data System (ADS)
Tabrizi, Pooneh R.; Sang, Hongqiang; Talari, Hadi F.; Preciado, Diego; Monfaredi, Reza; Reilly, Brian; Arikatla, Sreekanth; Enquobahrie, Andinet; Cleary, Kevin
2017-03-01
Cochlear implantation is the standard of care for infants born with severe hearing loss. Current guidelines approve the surgical placement of implants as early as 12 months of age. Implantation at a younger age poses a greater surgical challenge since the underdeveloped mastoid tip, along with thin calvarial bone, creates less room for surgical navigation and can result in increased surgical risk. We have been developing a temporal bone dissection simulator based on actual clinical cases for training otolaryngology fellows in this delicate procedure. The simulator system is based on pre-procedure CT (Computed Tomography) images from pediatric infant cases (<12 months old) at our hospital. The simulator includes: (1) a simulation engine to provide the virtual reality of the temporal bone surgery environment, (2) a newly developed haptic interface for holding the surgical drill, (3) an Oculus Rift to provide a microscope-like view of the temporal bone surgery, and (4) a user interface for interacting with the simulator through the Oculus Rift and the haptic device. To evaluate the system, we have collected 10 representative CT data sets and segmented the key structures: cochlea, round window, facial nerve, and ossicles. The simulator presents these key structures to the user and warns the user if needed by continuously calculating the distances between the tip of the surgical drill and the key structures.
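A sketch of the distance-based warning computation, assuming the segmented structures are available as point clouds in drill coordinates; the 1 mm threshold and the data layout are illustrative assumptions.

```python
# Sketch of the proximity check implied above: compare the drill-tip position
# against each segmented critical structure every haptic/visual update.

import numpy as np

def endangered_structures(tip, structures, warn_mm=1.0):
    """structures: dict name -> (N, 3) array of surface points in mm.
    Return (name, distance) pairs closer than the warning threshold."""
    hits = []
    for name, pts in structures.items():
        d = np.linalg.norm(pts - np.asarray(tip), axis=1).min()
        if d < warn_mm:
            hits.append((name, d))
    return sorted(hits, key=lambda h: h[1])   # closest structure first
```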
Ferre, Manuel; Galiana, Ignacio; Aracil, Rafael
2011-01-01
This paper describes the design and calibration of a thimble that measures the forces applied by a user during manipulation of virtual and real objects. Haptic devices benefit from force measurement capabilities at their end-point. However, the heavy weight and cost of force sensors prevent their widespread incorporation in these applications. The design of a lightweight, user-adaptable, and cost-effective thimble with four contact force sensors is described in this paper. The sensors are calibrated before being placed in the thimble to provide normal and tangential forces. Normal forces are exerted directly by the fingertip and thus can be properly measured. Tangential forces are estimated by sensors strategically placed in the thimble sides. Two applications are provided in order to facilitate an evaluation of sensorized thimble performance. These applications focus on: (i) force signal edge detection, which determines task segmentation of virtual object manipulation, and (ii) the development of complex object manipulation models, wherein the mechanical features of a real object are obtained and these features are then reproduced for training by means of virtual object manipulation. PMID:22247677
Evaluation of stiffness feedback for hard nodule identification on a phantom silicone model.
Li, Min; Konstantinova, Jelizaveta; Xu, Guanghua; He, Bo; Aminzadeh, Vahid; Xie, Jun; Wurdemann, Helge; Althoefer, Kaspar
2017-01-01
Haptic information in robotic surgery can significantly improve clinical outcomes and help detect hard soft-tissue inclusions that indicate potential abnormalities. Visual representation of tissue stiffness information is a cost-effective technique. Meanwhile, direct force feedback, although considerably more expensive than visual representation, is an intuitive method of conveying information regarding tissue stiffness to surgeons. In this study, real-time visual stiffness feedback by sliding indentation palpation is proposed, validated, and compared with force feedback involving human subjects. In an experimental tele-manipulation environment, a dynamically updated color map depicting the stiffness of probed soft tissue is presented via a graphical interface. The force feedback is provided, aided by a master haptic device. The haptic device uses data acquired from an F/T sensor attached to the end-effector of a tele-manipulated robot. Hard nodule detection performance is evaluated for 2 modes (force feedback and visual stiffness feedback) of stiffness feedback on an artificial organ containing buried stiff nodules. From this artificial organ, a virtual-environment tissue model is generated based on sliding indentation measurements. Employing this virtual-environment tissue model, we compare the performance of human participants in distinguishing differently sized hard nodules by force feedback and visual stiffness feedback. Results indicate that the proposed distributed visual representation of tissue stiffness can be used effectively for hard nodule identification. The representation can also be used as a sufficient substitute for force feedback in tissue palpation.
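A minimal sketch of the stiffness-to-color mapping behind such a visual display, assuming stiffness is estimated as indentation force over depth and mapped onto a linear two-color ramp; the normalization bounds are illustrative assumptions.

```python
# Sketch of a dynamically updated stiffness color map for sliding indentation
# palpation: soft regions render blue, stiff (possible nodule) regions red.

def estimate_stiffness(force_n, depth_mm):
    """Crude local stiffness estimate (N/mm) from indentation measurements."""
    return force_n / depth_mm if depth_mm > 1e-6 else 0.0

def stiffness_to_rgb(k, k_min=0.5, k_max=5.0):
    """Map stiffness k (N/mm) to an RGB triple in [0, 1] (soft=blue, stiff=red)."""
    t = min(max((k - k_min) / (k_max - k_min), 0.0), 1.0)
    return (t, 0.0, 1.0 - t)
```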
Network Control Center User Planning System (NCC UPS)
NASA Technical Reports Server (NTRS)
Dealy, Brian
1991-01-01
NCC UPS is presented in the form of the viewgraphs. The following subject areas are covered: UPS overview; NCC UPS role; major NCC UPS functional requirements; interactive user access levels; UPS interfaces; interactive user subsystem; interface navigation; scheduling screen hierarchy; interactive scheduling input panels; autogenerated schedule request panel; schedule data tabular display panel; schedule data graphic display panel; graphic scheduling aid design; and schedule data graphic display.
Functionalization of Tactile Sensation for Robot Based on Haptograph and Modal Decomposition
NASA Astrophysics Data System (ADS)
Yokokura, Yuki; Katsura, Seiichiro; Ohishi, Kiyoshi
In the real world, robots should be able to recognize the environment in order to be of help to humans. A video camera and a laser range finder are devices that can help robots recognize the environment. However, these devices cannot obtain tactile information from environments. Future human-assisting-robots should have the ability to recognize haptic signals, and a disturbance observer can possibly be used to provide the robot with this ability. In this study, a disturbance observer is employed in a mobile robot to functionalize the tactile sensation. This paper proposes a method that involves the use of haptograph and modal decomposition for the haptic recognition of road environments. The haptograph presents a graphic view of the tactile information. It is possible to classify road conditions intuitively. The robot controller is designed by considering the decoupled modal coordinate system, which consists of translational and rotational modes. Modal decomposition is performed by using a quarry matrix. Once the robot is provided with the ability to recognize tactile sensations, its usefulness to humans will increase.
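As an illustration of the modal-decomposition step described above, the sketch below maps wheel-space disturbance estimates of a differential-drive robot into decoupled translational and rotational modes. The 2x2 transform is the standard differential-drive choice and is an assumption; the paper's quarry matrix is not reproduced here.

```python
# Minimal sketch of modal decomposition for a two-wheeled mobile robot:
# wheel-space disturbance estimates (e.g., from a disturbance observer)
# are mapped into decoupled translational and rotational modes. The 2x2
# transform below is the standard differential-drive choice and is an
# assumption, not the paper's quarry matrix.
import numpy as np

# columns act on [left_wheel, right_wheel]
Q = np.array([[0.5, 0.5],     # translational mode (common motion)
              [0.5, -0.5]])   # rotational mode (differential motion)

def to_modes(tau_left, tau_right):
    """Wheel disturbance torques -> (translational, rotational) modes."""
    return Q @ np.array([tau_left, tau_right])

def to_wheels(f_trans, f_rot):
    """Modal commands -> wheel torques (inverse transform)."""
    return np.linalg.inv(Q) @ np.array([f_trans, f_rot])

trans, rot = to_modes(1.0, 0.0)
print(trans, rot)             # 0.5 0.5 -> a left-wheel bump excites both modes
print(to_wheels(trans, rot))  # [1. 0.] -> inverse recovers the wheel torques
```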
Graphics Software For VT Terminals
NASA Technical Reports Server (NTRS)
Wang, Caroline
1991-01-01
VTGRAPH graphics software tool for DEC/VT computer terminal or terminals compatible with it, widely used by government and industry. Callable in FORTRAN or C language, library program enabling user to cope with many computer environments in which VT terminals used for window management and graphic systems. Provides PLOT10-like package plus color or shade capability for VT240, VT241, and VT300 terminals. User can easily design more-friendly user-interface programs and design PLOT10 programs on VT terminals with different computer systems. Requires ReGis graphics set terminal and FORTRAN compiler.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faculjak, D.A.
1988-03-01
Graphics Manager (GFXMGR) is menu-driven, user-friendly software designed to interactively create, edit, and delete graphics displays on the Advanced Electronics Design (AED) graphics controller, Model 767. The software runs on the VAX family of computers and has been used successfully in security applications to create and change site layouts (maps) of specific facilities. GFXMGR greatly benefits graphics development by minimizing display-development time, reducing tedium on the part of the user, and improving system performance. It is anticipated that GFXMGR can be used to create graphics displays for many types of applications. 8 figs., 2 tabs.
Real-Time Distributed Algorithms for Visual and Battlefield Reasoning
2006-08-01
Keywords: High-Level Task Definition Language, Graphical User Interface (GUI), Story Analysis, Story Interpretation, SensIT Nodes. Tasks define one or more actions to be taken in the event the conditions are satisfied; we developed graphical user interfaces that may be used to express such tasks.
Program for Generating Graphs and Charts
NASA Technical Reports Server (NTRS)
Ackerson, C. T.
1986-01-01
Office Automation Pilot (OAP) Graphics Database system offers IBM personal computer user assistance in producing wide variety of graphs and charts and convenient data-base system, called chart base, for creating and maintaining data associated with graphs and charts. Thirteen different graphics packages available. Access graphics capabilities obtained in similar manner. User chooses creation, revision, or chartbase-maintenance options from initial menu; enters or modifies data displayed on graphic chart. OAP graphics data-base system written in Microsoft PASCAL.
Overview of Graphical User Interfaces.
ERIC Educational Resources Information Center
Hulser, Richard P.
1993-01-01
Discussion of graphical user interfaces for online public access catalogs (OPACs) covers the history of OPACs; OPAC front-end design, including examples from Indiana University and the University of Illinois; and planning and implementation of a user interface. (10 references) (EA)
TIGER: A graphically interactive grid system for turbomachinery applications
NASA Technical Reports Server (NTRS)
Shih, Ming-Hsin; Soni, Bharat K.
1992-01-01
Numerical grid generation algorithm associated with the flow field about turbomachinery geometries is presented. Graphical user interface is developed with FORMS Library to create an interactive, user-friendly working environment. This customized algorithm reduces the man-hours required to generate a grid associated with turbomachinery geometry, as compared to the use of general-purpose grid generation software. Bezier curves are utilized both interactively and automatically to accomplish grid line smoothness and orthogonality. Graphical user interactions are provided in the algorithm, allowing the user to design and manipulate the grid lines with a mouse.
Neodymium:YAG laser cutting of intraocular lens haptics in vitro and in vivo.
Feder, J M; Rosenberg, M A; Farber, M D
1989-09-01
Various complications following intraocular lens (IOL) surgery result in explantation of the lenses. Haptic fibrosis may necessitate cutting the IOL haptics prior to removal. In this study we used the neodymium:YAG (Nd:YAG) laser to cut polypropylene and poly(methyl methacrylate) (PMMA) haptics in vitro and in rabbit eyes. In vitro we were able to cut 100% of both haptic types successfully (28 PMMA and 30 polypropylene haptics). In rabbit eyes we were able to cut 50% of the PMMA haptics and 43% of the polypropylene haptics. Poly(methyl methacrylate) haptics were easier to cut in vitro and in vivo than polypropylene haptics, requiring fewer shots for transection. Complications of Nd:YAG laser use frequently interfered with haptic transections in rabbit eyes. Haptic transection may be more easily accomplished in human eyes.
The Rise of the Graphical User Interface.
ERIC Educational Resources Information Center
Edwards, Alastair D. N.
1996-01-01
Discusses the history of the graphical user interface (GUI) and the growing realization that adaptations must be made to it lest its visual nature discriminate against nonsighted or sight-impaired users. One of the most popular commercially developed adaptations is to develop sounds that signal the location of icons or menus to mouse users.…
Graphical User Interfaces and Library Systems: End-User Reactions.
ERIC Educational Resources Information Center
Zorn, Margaret; Marshall, Lucy
1995-01-01
Describes a study by Parke-Davis Pharmaceutical Research Library to determine user satisfaction with the graphical user interface-based (GUI) Dynix Marquis compared with the text-based Dynix Classic Online Public Access Catalog (OPAC). Results show that the GUI-based OPAC was preferred by end users over the text-based OPAC. (eight references) (DGM)
Developing a Graphical User Interface for the ALSS Crop Planning Tool
NASA Technical Reports Server (NTRS)
Koehlert, Erik
1997-01-01
The goal of my project was to create a graphical user interface for a prototype crop scheduler. The crop scheduler was developed by Dr. Jorge Leon and Laura Whitaker for the ALSS (Advanced Life Support System) program. The addition of a system-independent graphical user interface to the crop planning tool will make the application more accessible to a wider range of users and enhance its value as an analysis, design, and planning tool. My presentation will demonstrate the form and functionality of this interface. This graphical user interface allows users to edit system parameters stored in the file system. Data on the interaction of the crew, crops, and waste processing system with the available system resources is organized and labeled. Program output, which is stored in the file system, is also presented to the user in performance-time plots and organized charts. The menu system is designed to guide the user through analysis and decision making tasks, providing some help if necessary. The Java programming language was used to develop this interface in hopes of providing portability and remote operation.
The Graphical User Interface Crisis: Danger and Opportunity.
ERIC Educational Resources Information Center
Boyd, Lawrence H.; And Others
This paper examines graphic computing environments, identifies potential problems in providing access to blind people, and describes programs and strategies being developed to provide this access. The paper begins with an explanation of how graphic user interfaces differ from character-based systems in their use of pixels, visual metaphors such as…
NASA Astrophysics Data System (ADS)
Dawson, P.; Gage, J.; Takatsuka, M.; Goyette, S.
2009-02-01
To compete with other digital images, holograms must go beyond the current range of source-image types, such as sequences of photographs, laser scans, and 3D computer graphics (CG) scenes made with software designed for other applications. This project develops a set of innovative techniques for creating 3D digital content specifically for digital holograms, with virtual tools which enable the direct hand-crafting of subjects, mark by mark, analogous to Michelangelo's practice in drawing, painting and sculpture. The haptic device, the Phantom Premium 1.5, is used to draw against three-dimensional laser-scan templates of Michelangelo's sculpture placed within the holographic viewing volume.
Heterogeneous Deformable Modeling of Bio-Tissues and Haptic Force Rendering for Bio-Object Modeling
NASA Astrophysics Data System (ADS)
Lin, Shiyong; Lee, Yuan-Shin; Narayan, Roger J.
This paper presents a novel technique for modeling soft biological tissues as well as the development of an innovative interface for bio-manufacturing and medical applications. Heterogeneous deformable models may be used to represent the actual internal structures of deformable biological objects, which possess multiple components and nonuniform material properties. Both heterogeneous deformable object modeling and accurate haptic rendering can greatly enhance the realism and fidelity of virtual reality environments. In this paper, a tri-ray node snapping algorithm is proposed to generate a volumetric heterogeneous deformable model from a set of object interface surfaces between different materials. A constrained local static integration method is presented for simulating deformation and accurate force feedback based on the material properties of a heterogeneous structure. Biological soft tissue modeling is used as an example to demonstrate the proposed techniques. By integrating the heterogeneous deformable model into a virtual environment, users can both observe different materials inside a deformable object as well as interact with it by touching the deformable object using a haptic device. The presented techniques can be used for surgical simulation, bio-product design, bio-manufacturing, and medical applications.
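The force-feedback idea above can be illustrated with a much simpler stand-in than the paper's constrained local static integration: a penalty-style rendering in which the stiffness used for the reflected force is looked up from the nearest node's material property. Everything below (node data, contact normal, penalty law) is an assumed toy example.

```python
# Illustrative penalty-style force feedback with per-node material
# properties. This is NOT the paper's constrained local static
# integration; it only shows how a heterogeneous stiffness lookup can
# drive the reflected force at the haptic tool tip.
import numpy as np

node_pos = np.array([[0.0, 0.0, 0.0], [0.01, 0.0, 0.0]])  # model nodes [m]
node_k   = np.array([800.0, 3000.0])  # stiffness per node [N/m]; the
                                      # second node is a harder inclusion

def haptic_force(tool_pos, contact_depth):
    """Force on the tool: stiffness of the nearest node times penetration."""
    if contact_depth <= 0.0:
        return np.zeros(3)
    nearest = np.argmin(np.linalg.norm(node_pos - tool_pos, axis=1))
    normal = np.array([0.0, 0.0, 1.0])        # assumed contact normal
    return node_k[nearest] * contact_depth * normal

print(haptic_force(np.array([0.011, 0.0, 0.0]), 0.002))  # ~6 N on hard node
```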
NASA Technical Reports Server (NTRS)
Lucas, S. H.; Davis, R. C.
1992-01-01
A user's manual is presented for MacPASCO, which is an interactive, graphic preprocessor for panel design. MacPASCO creates input for PASCO, an existing computer code for structural analysis and sizing of longitudinally stiffened composite panels. MacPASCO provides a graphical user interface which simplifies the specification of panel geometry and reduces user input errors. The user draws the initial structural geometry on the computer screen, then uses a combination of graphic and text inputs to: refine the structural geometry; specify information required for analysis such as panel load and boundary conditions; and define design variables and constraints for minimum mass optimization. Only the use of MacPASCO is described, since the use of PASCO has been documented elsewhere.
Review of surgical robotics user interface: what is the best way to control robotic surgery?
Simorov, Anton; Otte, R Stephen; Kopietz, Courtni M; Oleynikov, Dmitry
2012-08-01
As surgical robots begin to occupy a larger place in operating rooms around the world, continued innovation is necessary to improve our outcomes. A comprehensive review of current surgical robotic user interfaces was performed to describe the modern surgical platforms, identify the benefits, and address the issues of feedback and limitations of visualization. Most robots currently used in surgery employ a master/slave relationship, with the surgeon seated at a work-console, manipulating the master system and visualizing the operation on a video screen. Although enormous strides have been made to advance current technology to the point of clinical use, limitations still exist. A lack of haptic feedback to the surgeon and the inability of the surgeon to be stationed at the operating table are the most notable examples. The future of robotic surgery sees a marked increase in the visualization technologies used in the operating room, as well as in the robots' abilities to convey haptic feedback to the surgeon. This will allow unparalleled sensation for the surgeon and almost eliminate inadvertent tissue contact and injury. A novel design for a user interface will allow the surgeon to have access to the patient bedside, remaining sterile throughout the procedure, employ a head-mounted three-dimensional visualization system, and allow the most intuitive master manipulation of the slave robot to date.
Using R in Introductory Statistics Courses with the pmg Graphical User Interface
ERIC Educational Resources Information Center
Verzani, John
2008-01-01
The pmg add-on package for the open source statistics software R is described. This package provides a simple to use graphical user interface (GUI) that allows introductory statistics students, without advanced computing skills, to quickly create the graphical and numeric summaries expected of them. (Contains 9 figures.)
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.; Rothmann, Elizabeth; Mittal, Nitin; Koppen, Sandra Howell
1994-01-01
The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide host of highly reliable fault-tolerant system architectures and is also applicable to electronic systems in general. The tool system was designed at the outset to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. This document is a user's guide for the HiRel graphical preprocessor Graphics Oriented (GO) program. GO is a graphical user interface for the HARP engine that enables the drawing of reliability/availability models on a monitor. A mouse is used to select fault tree gates or Markov graphical symbols from a menu for drawing.
Graphic and haptic simulation for transvaginal cholecystectomy training in NOTES.
Pan, Jun J; Ahn, Woojin; Dargar, Saurabh; Halic, Tansel; Li, Bai C; Sankaranarayanan, Ganesh; Roberts, Kurt; Schwaitzberg, Steven; De, Suvranu
2016-04-01
Natural Orifice Transluminal Endoscopic Surgery (NOTES) is an emerging surgical technique that usually requires a long learning curve for surgeons. Virtual reality (VR) medical simulators with visual and haptic feedback can offer an efficient and cost-effective alternative to traditional training approaches without risk to patients. With this motivation, we developed the first virtual reality simulator for transvaginal cholecystectomy in NOTES (VTEST™). This VR-based surgical simulator aims to simulate the hybrid NOTES cholecystectomy. We use a 6-DOF haptic device and a tracking sensor to construct the core hardware of the simulator. For the software, an innovative approach based on inner spheres is presented to deform the organs in real time. To handle the frequent collisions between soft tissue and surgical instruments, an adaptive GPU-based collision detection method is designed and implemented. To give a realistic visual rendering of gallbladder fat tissue removal by a cautery hook, a multi-layer hexahedral model is presented to simulate the electric dissection of fat tissue. Experimental results show that trainees can operate in real time with a high degree of stability and fidelity. A preliminary study was also performed to evaluate the realism and usefulness of this hybrid NOTES simulator. The prototype system has been verified by surgeons through a pilot study, in which participants rated several aspects of its visual performance and utility fairly high. It exhibits the potential to improve the surgical skills of trainees and effectively shorten their learning curve. Copyright © 2016 Elsevier Inc. All rights reserved.
Sorensen, Mads Solvsten; Mosegaard, Jesper; Trier, Peter
2009-06-01
Existing virtual simulators for middle ear surgery are based on 3-dimensional (3D) models from computed tomographic or magnetic resonance imaging data, in which image quality is limited by the lack of detail (maximum, approximately 50 voxels/mm3), natural color, and texture of the source material. Virtual training often requires the purchase of a program, a customized computer, and expensive peripherals dedicated exclusively to this purpose. The Visible Ear freeware library of digital images from a fresh-frozen human temporal bone was segmented and volume rendered in real time as a 3D model of high fidelity, true color, and great anatomic detail and realism of the surgically relevant structures. A haptic drilling model was developed for surgical interaction with the 3D model. Realistic visualization in high fidelity (approximately 125 voxels/mm3) and true color, 2D, or optional anaglyph stereoscopic 3D was achieved on a standard Core 2 Duo personal computer with a GeForce 8800 GTX graphics card, and surgical interaction was provided through a relatively inexpensive (approximately $2,500) Phantom Omni haptic 3D pointing device. This prototype is published for download (approximately 120 MB) as freeware at http://www.alexandra.dk/ves/index.htm. With increasing personal computer performance, future versions may include enhanced resolution (up to 8,000 voxels/mm3) and realistic interaction with deformable soft tissue components such as skin, tympanic membrane, dura, and cholesteatomas, features some of which are not possible with computed tomographic/magnetic resonance imaging-based systems.
NASA Technical Reports Server (NTRS)
Szczur, Martha R.
1991-01-01
The Transportable Applications Environment (TAE) Plus, developed at GSFC, is an advanced portable user interface development environment which simplifies the process of creating and managing complex application graphical user interfaces (GUI's), supports prototyping, allows applications to be ported easily between different platforms and encourages appropriate levels of user interface consistency between applications. The following topics are discussed: the capabilities of the TAE Plus tool; how the implementation has utilized state-of-the-art technologies within graphic workstations; and how it has been used both within and outside of NASA.
Joyce, Brendan; Lee, Danny; Rubio, Alex; Ogurtsov, Aleksey; Alves, Gelio; Yu, Yi-Kuo
2018-03-15
RAId is a software package that has been actively developed for the past 10 years for computationally and visually analyzing MS/MS data. Founded on rigorous statistical methods, RAId's core program computes accurate E-values for peptides and proteins identified during database searches. Our main goal here is to make this robust tool readily accessible to the proteomics community through a graphical user interface (GUI). We have constructed a graphical user interface to facilitate the use of RAId on users' local machines. Written in Java, RAId_GUI not only makes it easy to run RAId but also provides tools for data/spectra visualization, MS-product analysis, molecular isotopic distribution analysis, and graphing retrieval versus the proportion of false discoveries. The results viewer displays the analysis results and allows users to download them. Both the knowledge-integrated organismal databases and the code package (containing source code, the graphical user interface, and a user manual) are available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads/raid.html.
NASA Technical Reports Server (NTRS)
Stockwell, Alan E.; Cooper, Paul A.
1991-01-01
The Integrated Multidisciplinary Analysis Tool (IMAT) consists of a menu driven executive system coupled with a relational database which links commercial structures, structural dynamics and control codes. The IMAT graphics system, a key element of the software, provides a common interface for storing, retrieving, and displaying graphical information. The IMAT Graphics Manual shows users of commercial analysis codes (MATRIXx, MSC/NASTRAN and I-DEAS) how to use the IMAT graphics system to obtain high quality graphical output using familiar plotting procedures. The manual explains the key features of the IMAT graphics system, illustrates their use with simple step-by-step examples, and provides a reference for users who wish to take advantage of the flexibility of the software to customize their own applications.
Cooperative processing user interfaces for AdaNET
NASA Technical Reports Server (NTRS)
Gutzmann, Kurt M.
1991-01-01
A cooperative processing user interface (CUI) system shares the task of graphical display generation and presentation between the user's computer and a remote host. The communications link between the two computers is typically a modem or Ethernet. The two main purposes of a CUI are reduction of the amount of data transmitted between user and host machines, and provision of a graphical user interface system to make the system easier to use.
Vibrotactile display for mobile applications based on dielectric elastomer stack actuators
NASA Astrophysics Data System (ADS)
Matysek, Marc; Lotz, Peter; Flittner, Klaus; Schlaak, Helmut F.
2010-04-01
Dielectric elastomer stack actuators (DESA) offer the possibility to build actuator arrays at very high density. The driving voltage can be set via the film thickness, which ranges from 80 μm down to 5 μm at a driving field strength of 30 V/μm. In this paper we present the development of a vibrotactile display based on multilayer technology. The display is used to present several operating conditions of a machine as haptic information to a human finger. As an example, the design of an mp3-player interface is introduced. To build an intuitive and user-friendly interface, several aspects of human haptic perception have to be considered. Using the results of preliminary user tests, the interface is designed and an appropriate actuator layout is derived. Controlling these actuators is important because there are many possibilities to present different information, e.g., by varying the driving parameters. A demonstrator was built to verify the concept: a high recognition rate of more than 90% validates the design. A characterization of mechanical and electrical parameters proves the suitability of dielectric elastomer stack actuators for use in mobile applications.
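The relationship between film thickness and driving voltage quoted above follows directly from voltage = field strength x thickness; a quick worked check under the stated numbers:

```python
# Worked check of the stated actuator numbers: the driving voltage of a
# dielectric elastomer stack layer is roughly field strength x film
# thickness. The values below are the ones quoted in the abstract.
FIELD = 30.0                      # driving field strength [V/um]
for thickness_um in (80.0, 5.0):  # film thickness range [um]
    print(f"{thickness_um:5.1f} um film -> {FIELD * thickness_um:6.0f} V")
# 80 um film -> 2400 V; 5 um film -> 150 V, i.e. thinner films bring
# the driving voltage down toward mobile-friendly levels.
```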
Development of a User Interface for a Regression Analysis Software Tool
NASA Technical Reports Server (NTRS)
Ulbrich, Norbert Manfred; Volden, Thomas R.
2010-01-01
An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.
Assessment of a User Guide for One Semi-Automated Forces (OneSAF) Version 2.0
2009-09-01
OneSAF uses a two-dimensional feature named a Plan View Display (PVD) as the primary graphical interface. The PVD replicates a map with a series ... As the primary interface, the PVD is how the user watches the scenario unfold, and it requires the most interaction with the user. As seen in Table 3, all ... participants indicated never using these seven map-related functions. Graphic control measures are applied to the PVD map to ...
User's Guide for Flight Simulation Data Visualization Workstation
NASA Technical Reports Server (NTRS)
Kaplan, Joseph A.; Chen, Ronnie; Kenney, Patrick S.; Koval, Christopher M.; Hutchinson, Brian K.
1996-01-01
Today's modern flight simulation research produces vast amounts of time sensitive data. The meaning of this data can be difficult to assess while in its raw format. Therefore, a method of breaking the data down and presenting it to the user in a graphical format is necessary. Simulation Graphics (SimGraph) is intended as a data visualization software package that will incorporate simulation data into a variety of animated graphical displays for easy interpretation by the simulation researcher. This document is intended as an end user's guide.
Towards automation of user interface design
NASA Technical Reports Server (NTRS)
Gastner, Rainer; Kraetzschmar, Gerhard K.; Lutz, Ernst
1992-01-01
This paper suggests an approach to automatic software design in the domain of graphical user interfaces. There are still some drawbacks in existing user interface management systems (UIMS's) which basically offer only quantitative layout specifications via direct manipulation. Our approach suggests a convenient way to get a default graphical user interface which may be customized and redesigned easily in further prototyping cycles.
Computer systems and methods for the query and visualization of multidimensional databases
Stolte, Chris; Tang, Diane L; Hanrahan, Patrick
2014-04-29
In response to a user request, a computer generates a graphical user interface on a computer display. A schema information region of the graphical user interface includes multiple operand names, each operand name associated with one or more fields of a multi-dimensional database. A data visualization region of the graphical user interface includes multiple shelves. Upon detecting a user selection of the operand names and a user request to associate each user-selected operand name with a respective shelf in the data visualization region, the computer generates a visual table in the data visualization region in accordance with the associations between the operand names and the corresponding shelves. The visual table includes a plurality of panes, each pane having at least one axis defined based on data for the fields associated with a respective operand name.
Computer systems and methods for the query and visualization of multidimensional databases
Stolte, Chris [Palo Alto, CA; Tang, Diane L [Palo Alto, CA; Hanrahan, Patrick [Portola Valley, CA
2011-02-01
In response to a user request, a computer generates a graphical user interface on a computer display. A schema information region of the graphical user interface includes multiple operand names, each operand name associated with one or more fields of a multi-dimensional database. A data visualization region of the graphical user interface includes multiple shelves. Upon detecting a user selection of the operand names and a user request to associate each user-selected operand name with a respective shelf in the data visualization region, the computer generates a visual table in the data visualization region in accordance with the associations between the operand names and the corresponding shelves. The visual table includes a plurality of panes, each pane having at least one axis defined based on data for the fields associated with a respective operand name.
Computer systems and methods for the query and visualization of multidimensional databases
Stolte, Chris [Palo Alto, CA; Tang, Diane L [Palo Alto, CA; Hanrahan, Patrick [Portola Valley, CA
2012-03-20
In response to a user request, a computer generates a graphical user interface on a computer display. A schema information region of the graphical user interface includes multiple operand names, each operand name associated with one or more fields of a multi-dimensional database. A data visualization region of the graphical user interface includes multiple shelves. Upon detecting a user selection of the operand names and a user request to associate each user-selected operand name with a respective shelf in the data visualization region, the computer generates a visual table in the data visualization region in accordance with the associations between the operand names and the corresponding shelves. The visual table includes a plurality of panes, each pane having at least one axis defined based on data for the fields associated with a respective operand name.
VTGRAPH - GRAPHIC SOFTWARE TOOL FOR VT TERMINALS
NASA Technical Reports Server (NTRS)
Wang, C.
1994-01-01
VTGRAPH is a graphics software tool for DEC/VT or VT compatible terminals which are widely used by government and industry. It is a FORTRAN or C-language callable library designed to allow the user to deal with many computer environments which use VT terminals for window management and graphic systems. It also provides a PLOT10-like package plus color or shade capability for VT240, VT241, and VT300 terminals. The program is transportable to many different computers which use VT terminals. With this graphics package, the user can easily design more friendly user interface programs and design PLOT10 programs on VT terminals with different computer systems. VTGRAPH was developed using the ReGis Graphics set which provides a full range of graphics capabilities. The basic VTGRAPH capabilities are as follows: window management, PLOT10 compatible drawing, generic program routines for two and three dimensional plotting, and color graphics or shaded graphics capability. The program was developed in VAX FORTRAN in 1988. VTGRAPH requires a ReGis graphics set terminal and a FORTRAN compiler. The program has been run on a DEC MicroVAX 3600 series computer operating under VMS 5.0, and has a virtual memory requirement of 5KB.
Development of a graphical user interface for the global land information system (GLIS)
Alstad, Susan R.; Jackson, David A.
1993-01-01
The process of developing a Motif graphical user interface for the Global Land Information System (GLIS) involved incorporating user requirements, in-house visual and functional design requirements, and Open Software Foundation (OSF) Motif style guide standards. Motif user interface windows were developed, and software to support the Motif window functions was written using the C programming language. The GLIS architecture was modified to support multiple servers and remote handlers running the X Window System by forming a network of servers and handlers connected by TCP/IP communications. In April 1993, prior to release, the GLIS graphical user interface and system architecture modifications were tested by developers and users located at the EROS Data Center and 11 beta test sites across the country.
NASA Astrophysics Data System (ADS)
Gaik Tay, Kim; Cheong, Tau Han; Foong Lee, Ming; Kek, Sie Long; Abdul-Kahar, Rosmila
2017-08-01
In previous work on an Euler's spreadsheet calculator for solving an ordinary differential equation, Visual Basic for Applications (VBA) programming was used; however, a graphical user interface was not developed to capture user input. This weakness may confuse users about the input and output, since both are displayed in the same worksheet. Besides, the existing Euler's spreadsheet calculator is not interactive, as there is no prompt message if there is a mistake in inputting the parameters. On top of that, there are no user instructions to guide users in inputting the derivative function. Hence, in this paper, we improve on these limitations by developing a user-friendly and interactive graphical user interface. This improvement aims to capture user input with instructions and interactive error messages, implemented using VBA programming. This Euler's graphical user interface spreadsheet calculator does not act as a black box, as users can click on any cell in the worksheet to see the formula used to implement the numerical scheme. In this way, it can enhance self-learning and life-long learning in implementing the numerical scheme in a spreadsheet and later in any programming language.
fgui: A Method for Automatically Creating Graphical User Interfaces for Command-Line R Packages
Hoffmann, Thomas J.; Laird, Nan M.
2009-01-01
The fgui R package is designed for developers of R packages, to help rapidly, and sometimes fully automatically, create a graphical user interface for a command line R package. The interface is built upon the Tcl/Tk graphical interface included in R. The package further facilitates the developer by loading in the help files from the command line functions to provide context sensitive help to the user with no additional effort from the developer. Passing a function as the argument to the routines in the fgui package creates a graphical interface for the function, and further options are available to tweak this interface for those who want more flexibility. PMID:21625291
Soft tissue deformation modelling through neural dynamics-based reaction-diffusion mechanics.
Zhang, Jinao; Zhong, Yongmin; Gu, Chengfan
2018-05-30
Soft tissue deformation modelling forms the basis of development of surgical simulation, surgical planning and robotic-assisted minimally invasive surgery. This paper presents a new methodology for modelling of soft tissue deformation based on reaction-diffusion mechanics via neural dynamics. The potential energy stored in soft tissues due to a mechanical load to deform tissues away from their rest state is treated as the equivalent transmembrane potential energy, and it is distributed in the tissue masses in the manner of reaction-diffusion propagation of nonlinear electrical waves. The reaction-diffusion propagation of mechanical potential energy and nonrigid mechanics of motion are combined to model soft tissue deformation and its dynamics, both of which are further formulated as the dynamics of cellular neural networks to achieve real-time computational performance. The proposed methodology is implemented with a haptic device for interactive soft tissue deformation with force feedback. Experimental results demonstrate that the proposed methodology exhibits nonlinear force-displacement relationship for nonlinear soft tissue deformation. Homogeneous, anisotropic and heterogeneous soft tissue material properties can be modelled through the inherent physical properties of mass points. Graphical abstract Soft tissue deformation modelling with haptic feedback via neural dynamics-based reaction-diffusion mechanics.
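A minimal sketch of the reaction-diffusion idea follows, assuming a 1-D grid, explicit Euler integration, and an illustrative bistable reaction term; the paper's cellular-neural-network formulation and material-property mapping are not reproduced here.

```python
# Minimal 1-D sketch of distributing mechanical potential energy by
# reaction-diffusion: explicit Euler on u_t = D*u_xx + R(u) with an
# illustrative cubic reaction term. D, dt, dx, and R are assumptions,
# not the paper's neural-dynamics formulation.
import numpy as np

N, D, dx, dt = 100, 1e-4, 1e-3, 1e-3   # nodes, diffusivity, spacing, step
u = np.zeros(N)
u[N // 2] = 1.0                        # energy injected by a point load

def step(u):
    # periodic-boundary discrete Laplacian (fine for a sketch)
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2
    reaction = u * (1.0 - u) * (u - 0.1)   # illustrative bistable term
    return u + dt * (D * lap + reaction)   # stable: D*dt/dx**2 = 0.1 < 0.5

for _ in range(200):
    u = step(u)
print(f"energy spread over {np.sum(u > 1e-3)} nodes, peak {u.max():.3f}")
```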
TmoleX--a graphical user interface for TURBOMOLE.
Steffen, Claudia; Thomas, Klaus; Huniar, Uwe; Hellweg, Arnim; Rubner, Oliver; Schroer, Alexander
2010-12-01
We herein present the graphical user interface (GUI) TmoleX for the quantum chemical program package TURBOMOLE. TmoleX allows users to execute the complete workflow of a quantum chemical investigation from the initial building of a structure to the visualization of the results in a user friendly graphical front end. The purpose of TmoleX is to make TURBOMOLE easy to use and to provide a high degree of flexibility. Hence, it should be a valuable tool for most users from beginners to experts. The program is developed in Java and runs on Linux, Windows, and Mac platforms. It can be used to run calculations on local desktops as well as on remote computers. © 2010 Wiley Periodicals, Inc.
SWATMOD-PREP: Graphical user interface for preparing coupled SWAT-modflow simulations
USDA-ARS?s Scientific Manuscript database
This paper presents SWATMOD-Prep, a graphical user interface that couples a SWAT watershed model with a MODFLOW groundwater flow model. The interface is based on a recently published SWAT-MODFLOW code that couples the models via mapping schemes. The spatial layout of SWATMOD-Prep guides the user t...
User's manual for the two-dimensional transputer graphics toolkit
NASA Technical Reports Server (NTRS)
Ellis, Graham K.
1988-01-01
The user manual for the 2-D graphics toolkit for a transputer based parallel processor is presented. The toolkit consists of a package of 2-D display routines that can be used for the simulation visualizations. It supports multiple windows, double buffered screens for animations, and simple graphics transformations such as translation, rotation, and scaling. The display routines are written in occam to take advantage of the multiprocessing features available on transputers. The package is designed to run on a transputer separate from the graphics board.
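The simple transformations named above (translation, rotation, scaling) compose as 3x3 homogeneous-coordinate matrices. The sketch below shows the math in Python rather than the toolkit's occam, so the names and ordering are illustrative only.

```python
# The toolkit's 2-D transformations (translate, rotate, scale) in
# standard 3x3 homogeneous-coordinate form -- a Python sketch of the
# math, since the original occam routines are not reproduced here.
import numpy as np

def translate(tx, ty):
    return np.array([[1, 0, tx], [0, 1, ty], [0, 0, 1]], float)

def rotate(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]], float)

def scale(sx, sy):
    return np.array([[sx, 0, 0], [0, sy, 0], [0, 0, 1]], float)

# Compose right-to-left: scale, then rotate, then translate
M = translate(5, 2) @ rotate(np.pi / 2) @ scale(2, 2)
p = np.array([1.0, 0.0, 1.0])            # point (1, 0)
print(M @ p)                              # -> [5. 4. 1.]
```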
Characteristic analysis and simulation for polysilicon comb micro-accelerometer
NASA Astrophysics Data System (ADS)
Liu, Fengli; Hao, Yongping
2008-10-01
A high force update rate is a key factor in achieving high-performance haptic rendering, and it imposes a stringent real-time requirement on the execution environment of the haptic system. This requirement typically confines the haptic system to simplified environments in order to reduce the computation cost of haptic rendering algorithms. In this paper, we present a novel "hyper-threading" architecture consisting of several threads for haptic rendering. The high force update rate is achieved with a relatively large computation time interval for each haptic loop. The proposed method was tested and proved effective in experiments on a virtual-wall prototype haptic system using the Delta haptic device.
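A minimal sketch of the multi-threaded idea in the abstract above: a fast force loop reads the most recent state published by a slower simulation loop, decoupling the force update rate from simulation cost. The rates, the shared-state scheme, and the virtual-wall-style force law are assumptions.

```python
# Minimal sketch of a multi-threaded haptic architecture: a fast force
# loop reuses the most recent state published by a slower simulation
# loop, so the force update rate is decoupled from simulation cost.
# Rates and the shared-state scheme are illustrative assumptions.
import threading, time

state = {"stiffness": 500.0}          # latest simulation output [N/m]
lock = threading.Lock()
stop = threading.Event()

def simulation_loop():                # slow: ~30 Hz environment update
    while not stop.is_set():
        with lock:
            state["stiffness"] = 500.0  # expensive update would go here
        time.sleep(1 / 30)

def haptic_loop():                    # fast: ~1 kHz force update
    depth = 0.001                     # pretend 1 mm wall penetration
    force = 0.0
    for _ in range(1000):
        with lock:
            k = state["stiffness"]
        force = k * depth             # virtual-wall law; send to device here
        time.sleep(1 / 1000)
    print(f"last commanded force: {force:.2f} N")

threading.Thread(target=simulation_loop, daemon=True).start()
haptic_loop()
stop.set()
```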
Perception of force and stiffness in the presence of low-frequency haptic noise
Gurari, Netta; Okamura, Allison M.; Kuchenbecker, Katherine J.
2017-01-01
Objective This work lays the foundation for future research on quantitative modeling of human stiffness perception. Our goal was to develop a method by which a human’s ability to perceive suprathreshold haptic force stimuli and haptic stiffness stimuli can be affected by adding haptic noise. Methods Five human participants performed a same-different task with a one-degree-of-freedom force-feedback device. Participants used the right index finger to actively interact with variations of force (∼5 and ∼8 N) and stiffness (∼290 N/m) stimuli that included one of four scaled amounts of haptically rendered noise (None, Low, Medium, High). The haptic noise was zero-mean Gaussian white noise that was low-pass filtered with a 2 Hz cut-off frequency; the resulting low-frequency signal was added to the force rendered while the participant interacted with the force and stiffness stimuli. Results We found that the precision with which participants could identify the magnitude of both the force and stiffness stimuli was affected by the magnitude of the low-frequency haptically rendered noise added to the haptic stimulus, as well as the magnitude of the haptic stimulus itself. The Weber fraction strongly correlated with the standard deviation of the low-frequency haptic noise with a Pearson product-moment correlation coefficient of ρ > 0.83. The mean standard deviation of the low-frequency haptic noise in the haptic stimuli ranged from 0.184 N to 1.111 N across the four haptically rendered noise levels, and the corresponding mean Weber fractions spanned between 0.042 and 0.101. Conclusions The human ability to perceive both suprathreshold haptic force and stiffness stimuli degrades in the presence of added low-frequency haptic noise. Future work can use the reported methods to investigate how force perception and stiffness perception may relate, with possible applications in haptic watermarking and in the assessment of the functionality of peripheral pathways in individuals with haptic impairments. PMID:28575068
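The noise construction described in the Methods above can be sketched as follows, assuming a 1 kHz haptic rate and a second-order Butterworth low-pass filter; the original filter order and scaling are not specified beyond the 2 Hz cutoff.

```python
# Sketch of the noise construction: zero-mean Gaussian white noise,
# low-pass filtered with a 2 Hz cutoff, then added to the rendered
# force. The Butterworth order, 1 kHz haptic rate, and the target
# standard deviation are assumptions for illustration.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                                  # haptic update rate [Hz]
t = np.arange(0, 5, 1 / fs)                  # 5 s trial
b, a = butter(2, 2.0 / (fs / 2))             # 2 Hz low-pass, 2nd order

rng = np.random.default_rng(1)
white = rng.standard_normal(t.size)
low_freq = filtfilt(b, a, white)             # zero-phase low-pass filtering
noise = 0.5 * low_freq / low_freq.std()      # scale to target std ~0.5 N

base_force = 5.0                             # ~5 N stimulus
rendered = base_force + noise
print(f"noise std = {noise.std():.3f} N, mean force = {rendered.mean():.2f} N")
```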
The effects of perceptual priming on 4-year-olds' haptic-to-visual cross-modal transfer.
Kalagher, Hilary
2013-01-01
Four-year-old children often have difficulty visually recognizing objects that were previously experienced only haptically. This experiment attempts to improve their performance in these haptic-to-visual transfer tasks. Sixty-two 4-year-old children participated in priming trials in which they explored eight unfamiliar objects visually, haptically, or visually and haptically together. Subsequently, all children participated in the same haptic-to-visual cross-modal transfer task. In this task, children haptically explored the objects that were presented in the priming phase and then visually identified a match from among three test objects, each matching the object on only one dimension (shape, texture, or color). Children in all priming conditions predominantly made shape-based matches; however, the most shape-based matches were made in the Visual and Haptic condition. All kinds of priming provided the necessary memory traces upon which subsequent haptic exploration could build a strong enough representation to enable subsequent visual recognition. Haptic exploration patterns during the cross-modal transfer task are discussed and the detailed analyses provide a unique contribution to our understanding of the development of haptic exploratory procedures.
Fisher, J Brian; Porter, Susan M
2002-01-01
This paper describes an application of a display approach which uses chromakey techniques to composite real and computer-generated images, allowing a user to see his hands and medical instruments collocated with the display of virtual objects during a medical training simulation. Haptic feedback is provided through the use of a PHANTOM force feedback device in addition to tactile augmentation, which allows the user to touch virtual objects by introducing corresponding real objects into the workspace. A simplified catheter-introducer insertion simulation was developed to demonstrate the capabilities of this approach.
Measuring Presence in Virtual Environments
1994-10-01
If users can move their viewpoint to change what they see, reposition their head to affect binaural hearing, or search the environment haptically, they will experience a ... increase presence in an alternate environment. For example, a head-mounted display that isolates the user from the real world may increase the sense ... movement interface devices such as treadmills and trampolines, different gloves, and auditory equipment. Even as a low-end technological implementation of ...
Development of a Haptic Interface for Natural Orifice Translumenal Endoscopic Surgery Simulation
Dargar, Saurabh; Sankaranarayanan, Ganesh
2016-01-01
Natural orifice translumenal endoscopic surgery (NOTES) is a minimally invasive procedure that utilizes the body's natural orifices to gain access to the peritoneal cavity. The NOTES procedure is designed to minimize external scarring and patient trauma; however, flexible-endoscopy-based pure NOTES procedures require critical scope-handling skills. The delicate nature of the NOTES procedure requires extensive training, so to improve access to training while reducing risk to patients we have designed and developed the VTEST©, a virtual reality NOTES simulator. As part of the simulator, a novel decoupled 2-DOF haptic device was developed to provide realistic force feedback to the user in training. A series of experiments was performed to determine the behavioral characteristics of the device. The device was found capable of rendering up to 5.62 N of continuous force in the translational DOF and 0.190 N·m of continuous torque in the rotational DOF. The device possesses 18.1 Hz and 5.7 Hz of force bandwidth in the translational and rotational DOF, respectively. A feedforward friction compensator was also successfully implemented to minimize the negative impact of friction during interaction with the device. In this work we have presented the detailed development and evaluation of the haptic device for the VTEST©. PMID:27008674
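The feedforward friction compensator mentioned above is commonly built as a Coulomb-plus-viscous term added in the direction of motion; the sketch below uses that common form with hypothetical coefficients, since the abstract does not state the actual friction model.

```python
# Sketch of a feedforward friction compensator of the common
# Coulomb-plus-viscous form. The coefficients and the velocity deadband
# are hypothetical; the abstract only states that a feedforward
# compensator was implemented, not its exact model.
import numpy as np

F_COULOMB = 0.4    # [N] kinetic friction magnitude (assumed)
B_VISCOUS = 1.2    # [N*s/m] viscous coefficient (assumed)
V_EPS = 1e-3       # [m/s] deadband to avoid chatter near zero velocity

def compensate(desired_force, velocity):
    """Add a feedforward term cancelling the device's own friction."""
    if abs(velocity) < V_EPS:
        friction = 0.0                       # inside deadband: no term
    else:
        friction = F_COULOMB * np.sign(velocity) + B_VISCOUS * velocity
    return desired_force + friction

print(compensate(2.0, 0.05))   # moving forward: command is boosted (2.46 N)
print(compensate(2.0, -0.05))  # moving backward: command is reduced (1.54 N)
```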
Opportunities of hydrostatically coupled dielectric elastomer actuators for haptic interfaces
NASA Astrophysics Data System (ADS)
Carpi, Federico; Frediani, Gabriele; De Rossi, Danilo
2011-04-01
As a means to improve the versatility and safety of dielectric elastomer actuators (DEAs) for several fields of application, so-called 'hydrostatically coupled' DEAs (HC-DEAs) have recently been described. HC-DEAs are based on an incompressible fluid that mechanically couples a DE-based active part to a passive part interfaced to the load, so as to enable hydrostatic transmission. This paper presents ongoing developments of HC-DEAs and potential applications in the field of haptics. Three specific examples are considered. The first deals with a wearable tactile display used to provide users with tactile feedback during electronic navigation in virtual environments. The display consists of HC-DEAs arranged in contact with the fingertips. As a second example, an up-scaled prototype version of an 8-dot refreshable cell for dynamic Braille displays is shown. Each Braille dot consists of a miniature HC-DEA with a diameter lower than 2 mm. The third example refers to a device for finger rehabilitation, conceived to work as a sort of active version of a rehabilitation squeezing ball. The device is designed to dynamically change its compliance according to an electric control. The three examples of applications intend to show the potential of the new technology and the prospective opportunities for haptic interfaces.
Gal, Gilad Ben; Weiss, Ervin I; Gafni, Naomi; Ziv, Amitai
2011-04-01
Virtual reality force feedback simulators provide haptic (sense of touch) feedback through the device held by the user. The simulator's goal is to provide a learning experience resembling reality. A newly developed haptic simulator (IDEA Dental, Las Vegas, NV, USA) was assessed in this study. Our objectives were to assess the simulator's ability to serve as a tool for dental instruction, self-practice, and student evaluation, as well as to evaluate the sensation it provides. A total of thirty-three evaluators were divided into two groups. The first group consisted of twenty-one experienced dental educators; the second consisted of twelve fifth-year dental students. Each participant performed drilling tasks using the simulator and filled out a questionnaire regarding the simulator and potential ways of using it in dental education. The results show that experienced dental faculty members as well as advanced dental students found that the simulator could provide significant potential benefits in the teaching and self-learning of manual dental skills. Further development of the simulator's tactile feedback is needed to bring it closer to genuine sensation. Further studies relating to aspects of the simulator's structure, its predictive validity, its scoring system, and the nature of the performed tasks should be conducted.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anthony L. Crawford
Nonlinear Force Profile Used to Increase the Performance of a Haptic User Interface for Teleoperating a Robotic Hand. Natural movements and force feedback are important elements in using teleoperated equipment if complex and speedy manipulation tasks are to be accomplished in hazardous environments, such as hot cells, glove boxes, decommissioning, explosives disarmament, and space. The research associated with this paper hypothesizes that a user interface and complementary radiation-compatible robotic hand that integrates the human hand's anthropometric properties, speed capability, nonlinear strength profile, reduction of active degrees of freedom during the transition from manipulation to grasping, and just-noticeable-difference force sensation characteristics will enhance a user's teleoperation performance. The main contribution of this research is that a system that concisely integrates all these factors has yet to be developed, and furthermore has yet to be applied to hazardous environments such as those referenced above. In fact, the most prominent slave-manipulator teleoperation technology in use today is based on a design patented in 1945 (Patent 2632574) [1]. Robotic hand/user interface systems of similar function to the one developed in this research limit their design input requirements, in the best case, to complementing the hand's anthropometric properties, speed capability, and a linearly scaled force application relationship (e.g., robotic force is a constant 4 times that of the user). In this paper, a nonlinear relationship between the force experienced at the user interface and the force applied by the robotic hand was devised, based on property differences between manipulation and grasping activities as they pertain to the human hand. The results show that such a relationship, when subjected to a manipulation task and a grasping task, produces increased performance compared to the traditional linear scaling techniques used by other systems. Key words: teleoperation, robotic hand, robotic force scaling.
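To illustrate the contrast the abstract draws between linear scaling and a nonlinear force profile, the sketch below compares a constant gain with a hypothetical power-law-style profile; the paper's actual profile, derived from human-hand strength properties, is not reproduced.

```python
# Illustration of nonlinear vs. linear force scaling between a haptic
# user interface and a robotic hand. The quadratic profile and its
# constants are hypothetical stand-ins; the paper's actual profile,
# derived from human-hand strength data, is not reproduced here.

def linear_scale(user_force, gain=4.0):
    """Traditional approach: robot force is a constant multiple."""
    return gain * user_force

def nonlinear_scale(user_force):
    """Gentle gain at fine-manipulation forces, growing for grasps."""
    return 2.0 * user_force + 0.15 * user_force**2

for f in (1.0, 5.0, 20.0):     # light touch -> firm grasp [N]
    print(f"user {f:5.1f} N -> linear {linear_scale(f):6.1f} N, "
          f"nonlinear {nonlinear_scale(f):6.2f} N")
```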
Alió, Jorge L; Plaza-Puche, Ana B; Javaloy, Jaime; Ayala, María José; Vega-Estrada, Alfredo
2013-04-01
To compare the visual and intraocular optical quality outcomes of different designs of the refractive rotationally asymmetric multifocal intraocular lens (MFIOL) (Lentis Mplus; Oculentis GmbH, Berlin, Germany) with or without capsular tension ring (CTR) implantation. One hundred thirty-five consecutive eyes of 78 patients with cataract (ages 36 to 82 years) were divided into three groups: 43 eyes implanted with the C-Loop haptic design without CTR (C-Loop haptic only group); 47 eyes implanted with the C-Loop haptic design with CTR (C-Loop haptic with CTR group); and 45 eyes implanted with the plate-haptic design (plate-haptic group). Visual acuity, contrast sensitivity, defocus curve, and ocular and intraocular optical quality were evaluated at 3 months postoperatively. Significant differences in the postoperative sphere were found (P = .01), with a more myopic postoperative refraction for the C-Loop haptic only group. No significant differences were detected in photopic and scotopic contrast sensitivity among groups (P ⩾ .05). Significantly better visual acuities were present in the C-Loop haptic with CTR group for the defocus levels of -2.0, -1.5, -1.0, and -0.50 D (P ⩽ .03). Statistically significant differences among groups were found in total intraocular root mean square (RMS), high-order intraocular RMS, and intraocular coma-like RMS aberrations (P ⩽ .04), with lower values in the plate-haptic group. The plate-haptic design and the C-Loop haptic design with CTR implantation both allow good visual rehabilitation. However, better refractive predictability and intraocular optical quality were obtained with the plate-haptic design without CTR implantation. The plate-haptic design seems to be a better design to support rotationally asymmetric MFIOL optics. Copyright 2013, SLACK Incorporated.
Limited value of haptics in virtual reality laparoscopic cholecystectomy training.
Thompson, Jonathan R; Leonard, Anthony C; Doarn, Charles R; Roesch, Matt J; Broderick, Timothy J
2011-04-01
Haptics is an expensive addition to virtual reality (VR) simulators, and its added value to training has not been proven. This study evaluated the benefit of haptics in VR laparoscopic surgery training for novices. The Simbionix LapMentor II haptic VR simulator was used in the study. Thirty-three laparoscopic novice students were randomly placed in one of three groups: control, haptics-trained, or nonhaptics-trained. The control group performed nine basic laparoscopy tasks and four cholecystectomy procedural tasks one time with haptics engaged at the default setting. The haptics group was trained to proficiency in the basic tasks and then performed each of the procedural tasks one time with haptics engaged. The nonhaptics group used the same training protocol except that haptics was disengaged. The proficiency values used were previously published expert values. Each group was assessed in the performance of 10 laparoscopic cholecystectomies (alternating with and without haptics). Performance was measured via automatically collected simulator data. The three groups exhibited no differences in terms of sex, education level, hand dominance, video game experience, surgical experience, and nonsurgical simulator experience. The number of attempts required to reach proficiency did not differ between the haptics- and nonhaptics-training groups. The haptics and nonhaptics groups exhibited no difference in performance. Both training groups outperformed the control group in number of movements as well as path length of the left instrument. In addition, the nonhaptics group outperformed the control group in total time. Haptics does not improve the efficiency or effectiveness of LapMentor II VR laparoscopic surgery training. The limited benefit and the significant cost of haptics suggest that haptics should not be included routinely in VR laparoscopic surgery training.
A novel graphical user interface for ultrasound-guided shoulder arthroscopic surgery
NASA Astrophysics Data System (ADS)
Tyryshkin, K.; Mousavi, P.; Beek, M.; Pichora, D.; Abolmaesumi, P.
2007-03-01
This paper presents a novel graphical user interface developed for a navigation system for ultrasound-guided computer-assisted shoulder arthroscopic surgery. The envisioned purpose of the interface is to assist the surgeon in determining the position and orientation of the arthroscopic camera and other surgical tools within the anatomy of the patient. The user interface features real time position tracking of the arthroscopic instruments with an optical tracking system, and visualization of their graphical representations relative to a three-dimensional shoulder surface model of the patient, created from computed tomography images. In addition, the developed graphical interface facilitates fast and user-friendly intra-operative calibration of the arthroscope and the arthroscopic burr, capture and segmentation of ultrasound images, and intra-operative registration. A pilot study simulating the computer-aided shoulder arthroscopic procedure on a shoulder phantom demonstrated the speed, efficiency and ease-of-use of the system.
Surgical scissors extension adds the 7th axis of force feedback to the Freedom 6S.
Powers, Marilyn J; Sinclair, Ian P W; Brouwer, Iman; Laroche, Denis
2007-01-01
A virtual reality surgical simulator ideally allows seamless transition between the real and virtual worlds. In that respect, all of a surgeon's motions and tools must be simulated. Until now, researchers have been limited to using a pen-like tool with six degrees of freedom. This paper presents the addition of haptically enabled scissors to the end effector of a 6-DOF haptic device, the Freedom 6S. The scissors can render a maximum pinch torque of 460 mN·m with low inertia and low back-drive friction. The device is a balanced design, so the user feels as though they are holding nothing more than actual scissors, although with some added inertia at the load end. The system is interchangeable between the 6-DOF and 7-DOF configurations to allow tools to be switched quickly.
Smart glove: hand master using magnetorheological fluid actuators
NASA Astrophysics Data System (ADS)
Nam, Y. J.; Park, M. K.; Yamane, R.
2007-12-01
In this study, a hand master using five miniature magneto-rheological (MR) actuators, called 'the smart glove', is introduced. This hand master is intended to display haptic feedback to the fingertips of a human user interacting with virtual objects in a virtual environment. Two effective approaches are proposed for the smart glove: (i) by using the MR actuator, which can be considered a passive actuator, the smart glove is made simple in structure, high in power, low in inertia, safe in interaction, and stable in haptic feedback, and (ii) with a novel flexible link mechanism designed for position-force transmission between the fingertips and the actuators, the number of actuators and the weight of the smart glove can be reduced. These features improve the manipulability and portability of the smart glove. The feasibility of the constructed smart glove is verified through basic performance evaluation.
Self-Control of Haptic Assistance for Motor Learning: Influences of Frequency and Opinion of Utility
Williams, Camille K.; Tseung, Victrine; Carnahan, Heather
2017-01-01
Studies of self-controlled practice have shown benefits when learners controlled the feedback schedule, the use of assistive devices, and task difficulty, with benefits attributed to the information-processing and motivational advantages of self-control. Although haptic assistance serves as feedback, aids task performance, and modifies task difficulty, researchers had yet to explore whether self-control over haptic assistance could be beneficial for learning. We explored whether self-control of haptic assistance would be beneficial for learning a tracing task. Self-controlled participants selected the practice blocks on which they would receive haptic assistance, while participants in a yoked group received haptic assistance on blocks determined by a matched self-controlled participant. We inferred learning from performance on retention tests without haptic assistance. From qualitative analysis of open-ended questions related to rationales for/experiences of the haptic assistance that was chosen/provided, themes emerged regarding participants' views of the utility of haptic assistance for performance and learning. Results showed that, for self-controlled participants only, learning was directly impacted by the frequency of haptic assistance and by their view of haptic assistance. Furthermore, self-controlled participants' views were significantly associated with their requested haptic assistance frequency. We discuss these findings as further support for the beneficial role of self-controlled practice in motor learning. PMID:29255438
Scientific and Graphic Design Foundations for C2
2007-06-01
This report presents a summary of concepts in graphic design layout, typography, color, and data graphics intended to assist users in perceiving and recognizing patterns in information; typography is described as the art and technique of designing textual content. Cited works include 'Principles of typography for user interface design' (interactions, Vol. 5, p. 15, Nov/Dec 1998) and Kahneman, D., & Henik, A. (1981), 'Perceptual organization...'.
Schweykart, N; Reinhard, T; Engelhardt, S; Sundmacher, R
1999-06-01
Ultrasound biomicroscopy (UBM) allows determination of the haptic position of posterior chamber lenses (PCLs) in relation to adjacent structures. In transsclerally sutured PCLs, comparison between haptic positions localized intraoperatively by endoscopy and postoperatively by UBM showed a correspondence of only 81%. The discrepant localization of the remaining 19% of examined haptic positions was attributed to postoperative dislocation, although without direct proof of this assumption. The purpose of this study was therefore to correlate UBM results with haptic positions determined simultaneously by gonioscopy in aniridia after black-diaphragm PCL implantation. The haptic positions of black-diaphragm PCL implants in 20 patients with congenital and 13 patients with traumatic aniridia were determined by UBM (50-MHz probe) and gonioscopy 44.4 (6-75) months postoperatively. 39 of 66 haptic positions could be localized by both gonioscopy and UBM; 38 of these haptics (97.4%) showed the same position with both examination techniques. Determination of the haptic position by one of the two techniques was impossible for 27 of 66 haptics (11 in gonioscopy, 16 in UBM), primarily because of haptic positions behind iris remnants and corneal opacities in gonioscopy, and scarring of the ciliary body in UBM. The validity of UBM in localizing PCL haptics was thus confirmed gonioscopically, which also supports our prior assumption of postoperative displacement of IOL haptics after transscleral suturing in about 20% of cases. Scarring of the ciliary body was the most important obstacle to determining PCL haptic positions in relation to adjacent structures.
Ward-Garrison, Christian; Markstrom, Steven L.; Hay, Lauren E.
2009-01-01
The U.S. Geological Survey Downsizer is a computer application that selects, downloads, verifies, and formats station-based time-series data for environmental-resource models, particularly the Precipitation-Runoff Modeling System. Downsizer implements the client-server software architecture. The client presents a map-based, graphical user interface that is intuitive to modelers; the server provides streamflow and climate time-series data from over 40,000 measurement stations across the United States. This report is the Downsizer user's manual and provides (1) an overview of the software design, (2) installation instructions, (3) a description of the graphical user interface, (4) a description of selected output files, and (5) troubleshooting information.
VAGUE: a graphical user interface for the Velvet assembler.
Powell, David R; Seemann, Torsten
2013-01-15
Velvet is a popular open-source de novo genome assembly software tool, which is run from the Unix command line. Most of the problems experienced by new users of Velvet revolve around constructing syntactically and semantically correct command lines, getting input files into acceptable formats and assessing the output. Here, we present Velvet Assembler Graphical User Environment (VAGUE), a multi-platform graphical front-end for Velvet. VAGUE aims to make sequence assembly accessible to a wider audience and to facilitate better usage amongst existing users of Velvet. VAGUE is implemented in JRuby and targets the Java Virtual Machine. It is available under an open-source GPLv2 licence from http://www.vicbioinformatics.com/. Contact: torsten.seemann@monash.edu.
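Since VAGUE's core job is turning GUI selections into syntactically correct Velvet command lines, the underlying idea can be sketched in Python (VAGUE itself is written in JRuby); the file names and parameter values below are illustrative assumptions, not VAGUE's actual defaults:

```python
import subprocess

def run_velvet(out_dir, kmer, reads_fastq, exp_cov="auto"):
    """Drive Velvet's two-stage pipeline the way a GUI front-end might:
    velveth builds the k-mer hash, velvetg performs the assembly."""
    velveth_cmd = ["velveth", out_dir, str(kmer), "-fastq", "-short", reads_fastq]
    velvetg_cmd = ["velvetg", out_dir, "-exp_cov", exp_cov]
    for cmd in (velveth_cmd, velvetg_cmd):
        print("Running:", " ".join(cmd))   # show the constructed command line
        subprocess.run(cmd, check=True)    # fail loudly on assembler errors

# Example (hypothetical inputs): run_velvet("assembly_out", 31, "reads.fastq")
```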
Graphical user interfaces for symbol-oriented database visualization and interaction
NASA Astrophysics Data System (ADS)
Brinkschulte, Uwe; Siormanolakis, Marios; Vogelsang, Holger
1997-04-01
In this approach, two basic services designed for the engineering of computer-based systems are combined: a symbol-oriented man-machine service and a high-speed database service. The man-machine service is used to build graphical user interfaces (GUIs) for the database service; these interfaces are stored using the database service. The idea is to create a GUI-builder and a GUI-manager for the database service based upon the man-machine service using the concept of symbols. With user-definable and predefined symbols, database contents can be visualized and manipulated in a very flexible and intuitive way. Using the GUI-builder and GUI-manager, users can build and operate their own graphical user interfaces for a given database according to their needs without writing a single line of code.
Hung, Ling-Hong; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee
2016-01-01
Reproducibility is vital in science. For complex computational methods, it is often necessary, not just to recreate the code, but also the software and hardware environment to reproduce results. Virtual machines, and container software such as Docker, make it possible to reproduce the exact environment regardless of the underlying hardware and operating system. However, workflows that use Graphical User Interfaces (GUIs) remain difficult to replicate on different host systems as there is no high level graphical software layer common to all platforms. GUIdock allows for the facile distribution of a systems biology application along with its graphics environment. Complex graphics based workflows, ubiquitous in systems biology, can now be easily exported and reproduced on many different platforms. GUIdock uses Docker, an open source project that provides a container with only the absolutely necessary software dependencies and configures a common X Windows (X11) graphic interface on Linux, Macintosh and Windows platforms. As proof of concept, we present a Docker package that contains a Bioconductor application written in R and C++ called networkBMA for gene network inference. Our package also includes Cytoscape, a java-based platform with a graphical user interface for visualizing and analyzing gene networks, and the CyNetworkBMA app, a Cytoscape app that allows the use of networkBMA via the user-friendly Cytoscape interface. PMID:27045593
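The X11-sharing pattern that GUIdock configures can be illustrated with a short Python launcher; the image name below is a hypothetical stand-in, and this shows the common Linux idiom rather than GUIdock's exact scripts:

```python
import os
import subprocess

def launch_gui_container(image="guidock/networkbma"):   # hypothetical image name
    """Run a containerized GUI application by sharing the host's X11
    socket and DISPLAY variable, the standard pattern on Linux hosts."""
    display = os.environ.get("DISPLAY", ":0")
    cmd = [
        "docker", "run", "--rm",
        "-e", f"DISPLAY={display}",              # forward the X display address
        "-v", "/tmp/.X11-unix:/tmp/.X11-unix",   # share the X server socket
        image,
    ]
    subprocess.run(cmd, check=True)

# launch_gui_container()  # would open the container's GUI on the host display
```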
[Visual cuing effect for haptic angle judgment].
Era, Ataru; Yokosawa, Kazuhiko
2009-08-01
We investigated whether visual cues are useful for judging haptic angles. Participants explored three-dimensional angles with a virtual haptic feedback device. As visual cues, we used a location cue, which was synchronized with haptic exploration, and a space cue, which specified the haptic space. In Experiment 1, angles were judged more correctly with both cues present, but were overestimated with a location cue only. In Experiment 2, the visual cues emphasized depth; overestimation with location cues again occurred, but space cues had no influence. The results showed that (a) when both cues are presented, haptic angles are judged more correctly; (b) location cues facilitate only motion information, not depth information; and (c) haptic angles are apt to be overestimated when both haptic and visual information are present.
Erdogan, Goker; Yildirim, Ilker; Jacobs, Robert A.
2015-01-01
People learn modality-independent, conceptual representations from modality-specific sensory signals. Here, we hypothesize that any system that accomplishes this feat will include three components: a representational language for characterizing modality-independent representations, a set of sensory-specific forward models for mapping from modality-independent representations to sensory signals, and an inference algorithm for inverting forward models—that is, an algorithm for using sensory signals to infer modality-independent representations. To evaluate this hypothesis, we instantiate it in the form of a computational model that learns object shape representations from visual and/or haptic signals. The model uses a probabilistic grammar to characterize modality-independent representations of object shape, uses a computer graphics toolkit and a human hand simulator to map from object representations to visual and haptic features, respectively, and uses a Bayesian inference algorithm to infer modality-independent object representations from visual and/or haptic signals. Simulation results show that the model infers identical object representations when an object is viewed, grasped, or both. That is, the model’s percepts are modality invariant. We also report the results of an experiment in which different subjects rated the similarity of pairs of objects in different sensory conditions, and show that the model provides a very accurate account of subjects’ ratings. Conceptually, this research significantly contributes to our understanding of modality invariance, an important type of perceptual constancy, by demonstrating how modality-independent representations can be acquired and used. Methodologically, it provides an important contribution to cognitive modeling, particularly an emerging probabilistic language-of-thought approach, by showing how symbolic and statistical approaches can be combined in order to understand aspects of human perception. PMID:26554704
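The paper's three-component hypothesis (a representation language, per-modality forward models, and an inference algorithm that inverts them) can be caricatured in a few lines; the toy shape set, feature values, and Gaussian likelihood below are invented stand-ins for the authors' shape grammar, graphics toolkit, hand simulator, and Bayesian machinery:

```python
import math

SHAPES = ["cube", "cylinder", "sphere"]          # toy stand-in for a shape grammar

def visual_forward(shape):                        # graphics-toolkit stand-in
    return {"cube": 4.0, "cylinder": 2.0, "sphere": 0.0}[shape]

def haptic_forward(shape):                        # hand-simulator stand-in
    return {"cube": 1.0, "cylinder": 0.5, "sphere": 0.0}[shape]

def gauss_like(obs, pred, sigma=0.5):
    """Gaussian likelihood of an observed feature given a prediction."""
    return math.exp(-(obs - pred) ** 2 / (2 * sigma ** 2))

def posterior(visual_obs=None, haptic_obs=None):
    """Invert the forward models: P(shape | signals) by enumeration.
    The same modality-independent shape wins whether the object is
    seen, grasped, or both."""
    weights = {}
    for s in SHAPES:
        w = 1.0
        if visual_obs is not None:
            w *= gauss_like(visual_obs, visual_forward(s))
        if haptic_obs is not None:
            w *= gauss_like(haptic_obs, haptic_forward(s))
        weights[s] = w
    z = sum(weights.values())
    return {s: w / z for s, w in weights.items()}

print(posterior(visual_obs=3.8))                  # vision only
print(posterior(haptic_obs=0.9))                  # touch only
print(posterior(visual_obs=3.8, haptic_obs=0.9))  # both modalities, same winner
```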
Applications of graphics to support a testbed for autonomous space vehicle operations
NASA Technical Reports Server (NTRS)
Schmeckpeper, K. R.; Aldridge, J. P.; Benson, S.; Horner, S.; Kullman, A.; Mulder, T.; Parrott, W.; Roman, D.; Watts, G.; Bochsler, Daniel C.
1989-01-01
Researchers describe their experience using graphics tools and utilities while building AUTOPS, an application that uses a graphical Macintosh(TM)-like interface for the input and display of data, and animation graphics to enhance the presentation of results of autonomous space vehicle operations simulations. AUTOPS is a testbed for evaluating decisions made by intelligent control systems for autonomous vehicles. A decision made by an intelligent control system, e.g., a revised mission plan, might be displayed to the user in textual format, or the user can witness the effects of that decision via out-of-the-window graphics animations. Although a textual description conveys the essentials, a graphics animation conveys the replanning results in a more convincing way. Similarly, iconic and menu-driven screen interfaces provide the user with more meaningful options and displays. Presented here are experiences with the SunView and TAE Plus graphics tools used for interface design, and with the Johnson Space Center Interactive Graphics Laboratory animation graphics tools used for generating out-of-the-window graphics.
Storyboard method of end-user programming with natural language configuration
Bouchard, Ann M; Osbourn, Gordon C
2013-11-19
A technique for end-user programming includes populating a template with graphically illustrated actions and then invoking a command to generate a screen element based on the template. The screen element is rendered within a computing environment and provides a mechanism for triggering execution of a sequence of user actions. The sequence of user actions is based at least in part on the graphically illustrated actions populated into the template.
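A toy rendition of the template-then-replay idea may make the abstract concrete; every class and method name here is invented for illustration and is not taken from the patent:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class StoryboardTemplate:
    """A template populated with (icon, action) pairs; generate_button
    produces a screen element that replays the recorded action sequence."""
    actions: List[Callable[[], None]] = field(default_factory=list)

    def add_action(self, icon: str, action: Callable[[], None]) -> None:
        print(f"Added illustrated action: {icon}")
        self.actions.append(action)

    def generate_button(self) -> Callable[[], None]:
        def on_trigger():
            for action in self.actions:   # replay the user actions in order
                action()
        return on_trigger

# Example: record two actions, then trigger the generated screen element.
tmpl = StoryboardTemplate()
tmpl.add_action("open-file.png", lambda: print("open file"))
tmpl.add_action("save.png", lambda: print("save"))
button = tmpl.generate_button()
button()
```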
Effects of geared motor characteristics on tactile perception of tissue stiffness.
Longnion, J; Rosen, J; Sinanan, M; Hannaford, B
2001-01-01
Endoscopic haptic surgical devices have shown promise in addressing the loss of tactile sensation associated with minimally invasive surgery. However, these devices must be capable of generating forces and torques similar to those applied to the tissue with a standard endoscopic tool. Geared motors are a possible solution for actuation; however, they possess mechanical characteristics that could potentially interfere with tactile perception of tissue qualities. The aim of the current research was to determine how the characteristics of a geared motor suitable for a haptic surgical device affect a user's perception of stiffness. The experiment involved six blindfolded subjects who were asked to discriminate the stiffness of six distinct silicone rubber samples whose mechanical properties are similar to those of soft tissue. Using a novel testing device whose dimensions approximated those of an endoscopic grasper, each subject palpated 30 permutations of sample pairs for each of three types of mechanical load: the motor (friction and inertia), a flywheel (with the same inertia as the motor), and a control (no significant mechanical interference). A one-factor ANOVA of the error scores and palpation times showed no significant difference among error scores, but the mean palpation time for the control was significantly less than for the other two conditions. These results indicate that the mechanical characteristics of a geared motor chosen for application in a haptic surgical device did not interfere with the subjects' perception of the silicone samples' stiffness, but they may significantly affect the energy expenditure and time required for tissue palpation. Therefore, before geared motors can be considered for use in haptic surgical devices, consideration should be given to factors such as palpation speed and fatigue.
A systematic review: the influence of real time feedback on wheelchair propulsion biomechanics.
Symonds, Andrew; Barbareschi, Giulia; Taylor, Stephen; Holloway, Catherine
2018-01-01
Clinical guidelines recommend that, in order to minimize upper limb injury risk, wheelchair users adopt a semi-circular propulsion pattern with a slow cadence and a large push arc. This review examined whether real-time feedback can be used to influence manual wheelchair propulsion biomechanics. Clinical trials and case series comparing the use of real-time feedback against no feedback were included. A general review was performed and methodological quality assessed by two independent practitioners using the Downs and Black checklist. The review was completed in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Six papers met the inclusion criteria. The selected studies involved 123 participants and analysed the effect of visual and, in one case, haptic feedback. Across the studies, participants were able to achieve significant changes in propulsion biomechanics when provided with real-time feedback. However, targeting a single propulsion variable may lead to unwanted alterations in other parameters. Methodological assessment identified weaknesses in external validity. Visual feedback could be used to consistently increase push arc and decrease push rate, and may be the best focus for feedback training. Further investigation is required to assess such interventions during outdoor propulsion. Implications for Rehabilitation: Upper limb pain and injuries are common secondary disorders that negatively affect wheelchair users' physical activity and quality of life. Clinical guidelines suggest that manual wheelchair users should aim to propel with a semi-circular pattern, a low push rate, and a large push arc in order to minimize upper limb loading. Real-time visual and haptic feedback are effective tools for improving propulsion biomechanics in both complete novices and experienced manual wheelchair users.
DISPLAY3D. A Graphics Preprocessor for CHIEF
1990-12-27
For unsupported graphics devices, the user may write a graphics program that can read DISPLAY3D output files, or use one of the commercial plotting packages.
Active skin as new haptic interface
NASA Astrophysics Data System (ADS)
Vuong, Nguyen Huu Lam; Kwon, Hyeok Yong; Chuc, Nguyen Huu; Kim, Duksang; An, Kuangjun; Phuc, Vuong Hong; Moon, Hyungpil; Koo, Jachoon; Lee, Youngkwan; Nam, Jae-Do; Choi, Hyouk Ryeol
2010-04-01
In this paper, we present a new haptic interface, called "active skin", which combines a tactile sensor and a tactile stimulator in a single haptic cell; multiple haptic cells are embedded in a dielectric elastomer. The active skin generates a wide variety of haptic sensations in response to touch by synchronizing the sensor and the stimulator. The design of the haptic cell is derived via iterative analysis and design procedures. A fabrication method dedicated to the proposed device is investigated, and a controller to drive multiple haptic cells is developed. In addition, several experiments are performed to evaluate the performance of the active skin.
Vertigo in virtual reality with haptics: case report.
Viirre, Erik; Ellisman, Mark
2003-08-01
A researcher was working with a desktop virtual environment system. The system displayed vector fields of a cyclonic weather system and incorporated a haptic display of the forces in the cyclonic field. As the subject viewed the rotating cyclone field, they would move a handle "through" the representation of the moving winds and "feel" the forces buffeting the handle as it moved. Stopping after using the system for about 10 min, the user experienced an immediate sensation of postural instability for several minutes. Several hours later came the onset of vertigo with head turns. This vertigo lasted several hours and was accompanied by nausea and motion illusions that were exacerbated by head movements. Symptoms persisted mildly the next day and were still present on the third and fourth days, but by then were provoked only by head movements. There were no accompanying symptoms or history to suggest an inner ear disorder. Physical examination of inner ear and associated neurologic function was normal. No other users of this system have reported similar symptoms. This case suggests that some individuals may be susceptible to the interaction of displays with motion and movement forces and, as a result, experience motion illusions. Operators of such systems should be aware of this potential and minimize exposure if vertigo occurs.
A comparison of haptic material perception in blind and sighted individuals.
Baumgartner, Elisabeth; Wiebel, Christiane B; Gegenfurtner, Karl R
2015-10-01
We investigated material perception in blind participants to explore the influence of visual experience on material representations and the relationship between visual and haptic material perception. In a previous study with sighted participants, we had found participants' visual and haptic judgments of material properties to be very similar (Baumgartner, Wiebel, & Gegenfurtner, 2013). In a categorization task, however, visual exploration had led to higher categorization accuracy than haptic exploration. Here, we asked congenitally blind participants to explore different materials haptically and rate several material properties in order to assess the role of the visual sense for the emergence of haptic material perception. Principal components analyses combined with a procrustes superimposition showed that the material representations of blind and blindfolded sighted participants were highly similar. We also measured haptic categorization performance, which was equal for the two groups. We conclude that haptic material representations can emerge independently of visual experience, and that there are no advantages for either group of observers in haptic categorization.
Menon, Samir; Brantner, Gerald; Aholt, Chris; Kay, Kendrick; Khatib, Oussama
2013-01-01
A challenging problem in motor control neuroimaging studies is the inability to perform complex human motor tasks given the Magnetic Resonance Imaging (MRI) scanner's disruptive magnetic fields and confined workspace. In this paper, we propose a novel experimental platform that combines Functional MRI (fMRI) neuroimaging, haptic virtual simulation environments, and an fMRI-compatible haptic device for real-time haptic interaction across the scanner workspace (above torso, ∼0.65×0.40×0.20 m³). We implement this Haptic fMRI platform with a novel haptic device, the Haptic fMRI Interface (HFI), and demonstrate its suitability for motor neuroimaging studies. HFI has three degrees of freedom (DOF), uses electromagnetic motors to enable high-fidelity haptic rendering (>350 Hz), integrates radio frequency (RF) shields to prevent electromagnetic interference with fMRI (temporal SNR >100), and is kinematically designed to minimize currents induced by the MRI scanner's magnetic field during motor displacement (<2 cm). HFI possesses uniform inertial and force transmission properties across the workspace, and has low friction (0.05-0.30 N). HFI's RF noise levels, in addition, are within a 3 Tesla fMRI scanner's baseline noise variation (∼0.85±0.1%). Finally, HFI is haptically transparent and does not interfere with human motor tasks (tested for 0.4 m reaches). By allowing fMRI experiments involving complex three-dimensional manipulation with haptic interaction, Haptic fMRI enables, for the first time, non-invasive neuroscience experiments involving interactive motor tasks, object manipulation, tactile perception, and visuo-motor integration.
A general graphical user interface for automatic reliability modeling
NASA Technical Reports Server (NTRS)
Liceaga, Carlos A.; Siewiorek, Daniel P.
1991-01-01
Reported here is a general Graphical User Interface (GUI) for automatic reliability modeling of Processor Memory Switch (PMS) structures using a Markov model. This GUI is based on a hierarchy of windows. One window has graphical editing capabilities for specifying the system's communication structure, hierarchy, reconfiguration capabilities, and requirements. Other windows have field texts, popup menus, and buttons for specifying parameters and selecting actions. An example application of the GUI is given.
Graphic Design Is Not a Medium.
ERIC Educational Resources Information Center
Gruber, John Edward, Jr.
2001-01-01
Discusses graphic design and reviews its development from analog processes to a digital tool with the use of computers. Topics include graphical user interfaces; the need for visual communication concepts; transmedia as opposed to repurposing; and graphic design instruction in higher education. (LRW)
Image Understanding and Intelligent Parallel Systems
1991-05-09
The report describes a common user interface for the interactive, graphical manipulation of those histories, reports speedups of up to a factor of 100 over single-workstation implementations, and addresses user interfaces to large multiprocessor computers as a difficult issue. Cited works include Yap, S.-K. and M.L. Scott, "PenGuin: A language for reactive graphical user interface programming" (INTERACT, Cambridge, United Kingdom, 1990).
NLEdit: A generic graphical user interface for Fortran programs
NASA Technical Reports Server (NTRS)
Curlett, Brian P.
1994-01-01
NLEdit is a generic graphical user interface for the preprocessing of Fortran namelist input files. The interface consists of a menu system, a message window, a help system, and data entry forms. A form is generated for each namelist. The form has an input field for each namelist variable along with a one-line description of that variable. Detailed help information, default values, and minimum and maximum allowable values can all be displayed via menu picks. Inputs are processed through a scientific calculator program that allows complex equations to be used instead of simple numeric inputs. A custom user interface is generated simply by entering information about the namelist input variables into an ASCII file. There is no need to learn a new graphics system or programming language. NLEdit can be used as a stand-alone program or as part of a larger graphical user interface. Although NLEdit is intended for files using namelist format, it can be easily modified to handle other file formats.
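The generate-a-form-from-an-ASCII-description idea can be sketched as follows; the pipe-delimited format shown is a hypothetical stand-in for NLEdit's actual description-file format:

```python
def parse_variable_description(path):
    """Read a simple 'name|description|default|min|max' ASCII file
    (hypothetical format) and build a list of form-field specs, one
    per namelist variable."""
    fields = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):   # skip blanks and comments
                continue
            name, desc, default, lo, hi = line.split("|")
            fields.append({
                "name": name,
                "description": desc,
                "default": float(default),
                "min": float(lo),
                "max": float(hi),
            })
    return fields

# Each dict could then drive one labeled input field on the generated form.
```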
Spatial issues in user interface design from a graphic design perspective
NASA Technical Reports Server (NTRS)
Marcus, Aaron
1989-01-01
The user interface of a computer system is a visual display that provides information about the status of operations on data within the computer and control options to the user that enable adjustments to these operations. From the very beginning of computer technology the user interface was a spatial display, although its spatial features were not necessarily complex or explicitly recognized by the users. All text and nonverbal signs appeared in a virtual space generally thought of as a single flat plane of symbols. Current technology of high performance workstations permits any element of the display to appear as dynamic, multicolor, 3-D signs in a virtual 3-D space. The complexity of appearance and the user's interaction with the display provide significant challenges to the graphic designer of current and future user interfaces. In particular, spatial depiction provides many opportunities for effective communication of objects, structures, processes, navigation, selection, and manipulation. Issues are presented that are relevant to the graphic designer seeking to optimize the user interface's spatial attributes for effective visual communication.
Saving and Reproduction of Human Motion Data by Using Haptic Devices with Different Configurations
NASA Astrophysics Data System (ADS)
Tsunashima, Noboru; Yokokura, Yuki; Katsura, Seiichiro
Recently, there has been increased focus on "haptic recording", and development of a motion-copying system is an efficient method for its realization. Haptic recording involves the saving and reproduction of human motion data on the basis of haptic information. To increase the number of applications of the motion-copying system in various fields, it is necessary to reproduce human motion data using haptic devices with different configurations. In this study, a method for such haptic recording is developed. In this method, human motion data are saved and reproduced on the basis of work-space information, which is obtained by coordinate transformation of motor-space information. The validity of the proposed method is demonstrated by experiments. With the proposed method, saving and reproduction of human motion data using various devices are achieved, making it possible to use haptic recording in various fields.
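The key step, transforming motor-space (joint) data into work-space (Cartesian) data so that a differently configured device can replay it, can be sketched for planar two-link arms; the link lengths and the elbow-down inverse-kinematics choice are illustrative assumptions, not the authors' device models:

```python
import math

def motor_to_workspace(theta1, theta2, l1=0.3, l2=0.25):
    """Forward kinematics: joint angles (motor space) -> tip position
    (work space) for a planar two-link arm."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def workspace_to_motor(x, y, l1=0.35, l2=0.2):
    """Inverse kinematics for a *differently configured* two-link device,
    so saved work-space trajectories can be replayed on new hardware."""
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    theta2 = math.acos(max(-1.0, min(1.0, c2)))          # elbow-down solution
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

# Record one point on device A, replay it on device B with other link lengths.
x, y = motor_to_workspace(0.5, 0.8)
print(workspace_to_motor(x, y))
```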
Teodorescu, Kinneret; Bouchigny, Sylvain; Korman, Maria
2013-08-01
In this study, we explored the time course of haptic stiffness discrimination learning and how it was affected by two experimental factors, the addition of visual information and/or knowledge of results (KR) during training. Stiffness perception may integrate both haptic and visual modalities. However, in many tasks, the visual field is typically occluded, forcing stiffness perception to depend exclusively on haptic information. No studies to date had addressed the time course of haptic stiffness perceptual learning. Using a virtual environment (VE) haptic interface and a two-alternative forced-choice discrimination task, the haptic stiffness discrimination ability of 48 participants was tested across 2 days. Each day included two haptic test blocks separated by a training block. Additional visual information and/or KR were manipulated between participants during training blocks. Practice repetitions alone induced significant improvement in haptic stiffness discrimination. Between days, accuracy slightly improved, but decision time performance deteriorated. The addition of visual information and/or KR had only temporary effects on decision time, without affecting the time course of haptic discrimination learning. Learning in haptic stiffness discrimination appears to evolve through at least two distinctive phases: a single training session resulted in both immediate and latent learning. This learning was not affected by the training manipulations inspected. Training skills in VE in spaced sessions can be beneficial for tasks in which haptic perception is critical, such as surgical procedures, when the visual field is occluded. However, training protocols for such tasks should account for the low impact of multisensory information and KR.
Virtual rounds: simulation-based education in procedural medicine
NASA Astrophysics Data System (ADS)
Shaffer, David W.; Meglan, Dwight A.; Ferrell, Margaret; Dawson, Steven L.
1999-07-01
Computer-based simulation is a goal for training physicians in specialties where traditional training puts patients at risk. Intuitively, interactive simulation of anatomy, pathology, and therapeutic actions should lead to shortening of the learning curve for novice or inexperienced physicians. Effective transfer of knowledge acquired in simulators must be shown for such devices to be widely accepted in the medical community. We have developed an Interventional Cardiology Training Simulator which incorporates real-time graphic interactivity coupled with haptic response, and an embedded curriculum permitting rehearsal, hypertext links, personal archiving and instructor review and testing capabilities. This linking of purely technical simulation with educational content creates a more robust educational purpose for procedural simulators.
Structural impact detection with vibro-haptic interfaces
NASA Astrophysics Data System (ADS)
Jung, Hwee-Kwon; Park, Gyuhae; Todd, Michael D.
2016-07-01
This paper presents a new sensing paradigm for structural impact detection using vibro-haptic interfaces. The goal of this study is to allow humans to 'feel' structural responses (impact, shape changes, and damage) and eventually determine the health condition of a structure. The target applications for this study are aerospace structures, in particular airplane wings. Both hardware and software components are developed to realize the vibro-haptic-based impact detection system. First, L-shaped piezoelectric sensor arrays are deployed to measure the acoustic emission data generated by impacts on a wing. Unique haptic signals are then generated by processing the measured acoustic emission data. These haptic signals are wirelessly transmitted to human arms and, with the vibro-haptic interface, human pilots can identify impact location, intensity, and the possibility of subsequent damage initiation. The experimental results demonstrate that, with the haptic interface, humans could correctly identify such events while reducing false indications of structural condition by capitalizing on humans' classification capability. Several important aspects of this study, including the development of haptic interfaces, the design of optimal human training strategies, and the extension of the haptic capability to structural impact detection, are summarized in this paper.
Haptic wearables as sensory replacement, sensory augmentation and trainer - a review.
Shull, Peter B; Damian, Dana D
2015-07-20
Sensory impairments decrease quality of life and can slow or hinder rehabilitation. Small, computationally powerful electronics have enabled the recent development of wearable systems aimed to improve function for individuals with sensory impairments. The purpose of this review is to synthesize current haptic wearable research for clinical applications involving sensory impairments. We define haptic wearables as untethered, ungrounded body worn devices that interact with skin directly or through clothing and can be used in natural environments outside a laboratory. Results of this review are categorized by degree of sensory impairment. Total impairment, such as in an amputee, blind, or deaf individual, involves haptics acting as sensory replacement; partial impairment, as is common in rehabilitation, involves haptics as sensory augmentation; and no impairment involves haptics as trainer. This review found that wearable haptic devices improved function for a variety of clinical applications including: rehabilitation, prosthetics, vestibular loss, osteoarthritis, vision loss and hearing loss. Future haptic wearables development should focus on clinical needs, intuitive and multimodal haptic displays, low energy demands, and biomechanical compliance for long-term usage.
NASA Technical Reports Server (NTRS)
Szczur, Martha R.
1992-01-01
The Transportable Applications Environment (TAE) Plus was built to support the construction of graphical user interfaces (GUIs) for highly interactive applications, such as real-time processing systems and scientific analysis systems. It is a general-purpose portable tool that includes a 'What You See Is What You Get' WorkBench that allows user interface designers to lay out and manipulate windows and interaction objects. The WorkBench includes both user entry objects (e.g., radio buttons, menus) and data-driven objects (e.g., dials, gauges, stripcharts), which dynamically change based on values of real-time data. Discussed here is what TAE Plus provides, how the implementation has utilized state-of-the-art technologies within graphic workstations, and how it has been used both within and outside NASA.
Information Graphic Classification, Decomposition and Alternative Representation
ERIC Educational Resources Information Center
Gao, Jinglun
2012-01-01
This thesis work is mainly focused on two problems related to improving accessibility of information graphics for visually impaired users. The first problem is automated analysis of information graphics for information extraction and the second problem is multi-modal representations for accessibility. Information graphics are graphical…
Meaningful Real-Time Graphics Workstation Performance Measurements
1988-11-01
The report argues that the program can be operated effectively with little or no help from user's manuals or other users, and discusses thorough and efficient design of the command-line interface. Cited works include Apgar, Brian; Bersack, Bret; and Mammen, Abraham, "A Display System for the Stellar Graphics..." and specifications published in San Jose, California, 1988.
A Graphical Database Interface for Casual, Naive Users.
ERIC Educational Resources Information Center
Burgess, Clifford; Swigger, Kathleen
1986-01-01
Describes the design of a database interface for infrequent users of computers which consists of a graphical display of a model of a database and a natural language query language. This interface was designed for and tested with physicians at the University of Texas Health Science Center in Dallas. (LRW)
Computer-Based Tools for Evaluating Graphical User Interfaces
NASA Technical Reports Server (NTRS)
Moore, Loretta A.
1997-01-01
The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; it should be part of an iterative design cycle, with the output of evaluation fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research, an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.
Interactive graphics for expressing health risks: development and qualitative evaluation.
Ancker, Jessica S; Chan, Connie; Kukafka, Rita
2009-01-01
Recent findings suggest that interactive game-like graphics might be useful in communicating probabilities. We developed a prototype for a risk communication module, focusing on eliciting users' preferences for different interactive graphics and assessing usability and user interpretations. Feedback from five focus groups was used to design the graphics. The final version displayed a matrix of square buttons; clicking on any button allowed the user to see whether the stick figure underneath was affected by the health outcome. When participants used this interaction to learn about a risk, they expressed more emotional responses, both positive and negative, than when viewing any static graphic or numerical description of a risk. Their responses included relief about small risks and concern about large risks. The groups also commented on static graphics: arranging the figures affected by disease randomly throughout a group of figures made it more difficult to judge the proportion affected but often was described as more realistic. Interactive graphics appear to have potential for expressing risk magnitude as well as the feeling of risk. This affective impact could be useful in increasing perceived threat of high risks, calming fears about low risks, or comparing risks. Quantitative studies are planned to assess the effect on perceived risks and estimated risk magnitudes.
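A minimal version of the described button matrix, where clicking a button reveals whether the figure beneath is affected, can be sketched with tkinter; the grid size and 12% risk value are arbitrary choices for illustration:

```python
import random
import tkinter as tk

ROWS, COLS, RISK = 10, 10, 0.12    # 100 figures, 12 affected (illustrative)

root = tk.Tk()
root.title("Click to reveal each person's outcome")

# Scatter the affected figures randomly through the population grid,
# mirroring the prototype's matrix of square buttons.
n_affected = round(ROWS * COLS * RISK)
cells = [True] * n_affected + [False] * (ROWS * COLS - n_affected)
random.shuffle(cells)

def reveal(button, affected):
    """Show the outcome hidden under one button."""
    button.config(text="x" if affected else "o",
                  bg="tomato" if affected else "pale green")

for i, affected in enumerate(cells):
    btn = tk.Button(root, width=2)
    btn.config(command=lambda b=btn, a=affected: reveal(b, a))
    btn.grid(row=i // COLS, column=i % COLS)

root.mainloop()
```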
SimHap GUI: an intuitive graphical user interface for genetic association analysis.
Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J
2008-12-25
Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single-SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimation of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.
End-to-End Flow Control for Visual-Haptic Communication under Bandwidth Change
NASA Astrophysics Data System (ADS)
Yashiro, Daisuke; Tian, Dapeng; Yakoh, Takahiro
This paper proposes an end-to-end flow controller for visual-haptic communication. A visual-haptic communication system transmits non-real-time packets, which contain large-size visual data, and real-time packets, which contain small-size haptic data. When the transmission rate of visual data exceeds the communication bandwidth, the visual-haptic communication system becomes unstable owing to buffer overflow. To solve this problem, an end-to-end flow controller is proposed. This controller determines the optimal transmission rate of visual data on the basis of the traffic conditions, which are estimated by the packets for haptic communication. Experimental results confirm that in the proposed method, a short packet-sending interval and a short delay are achieved under bandwidth change, and thus, high-precision visual-haptic communication is realized.
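One plausible reading of such a controller, adapting the visual transmission rate from delay measured on the small, frequent haptic packets, can be sketched as follows; the thresholds and gains are invented for illustration and are not the paper's values:

```python
def update_visual_rate(rate_bps, haptic_delay_s,
                       target_delay_s=0.005,
                       increase=1.05, decrease=0.7,
                       min_rate=50_000, max_rate=5_000_000):
    """Adjust the visual-data transmission rate from congestion hints
    carried by the haptic packets: back off sharply when their delay
    grows (a queue is building), probe gently otherwise (AIMD-like)."""
    if haptic_delay_s > target_delay_s:
        rate_bps *= decrease      # congestion: cut rate before buffers overflow
    else:
        rate_bps *= increase      # headroom: probe for more bandwidth
    return max(min_rate, min(max_rate, rate_bps))

rate = 1_000_000
for delay in [0.002, 0.003, 0.009, 0.012, 0.004]:   # measured haptic delays [s]
    rate = update_visual_rate(rate, delay)
    print(f"delay={delay * 1000:.0f} ms -> visual rate={rate / 1e6:.2f} Mbps")
```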
Preserved Haptic Shape Processing after Bilateral LOC Lesions.
Snow, Jacqueline C; Goodale, Melvyn A; Culham, Jody C
2015-10-07
The visual and haptic perceptual systems are understood to share a common neural representation of object shape. A region thought to be critical for recognizing visual and haptic shape information is the lateral occipital complex (LOC). We investigated whether LOC is essential for haptic shape recognition in humans by studying behavioral responses and brain activation for haptically explored objects in a patient (M.C.) with bilateral lesions of the occipitotemporal cortex, including LOC. Despite severe deficits in recognizing objects using vision, M.C. was able to accurately recognize objects via touch. M.C.'s psychophysical response profile to haptically explored shapes was also indistinguishable from controls. Using fMRI, M.C. showed no object-selective visual or haptic responses in LOC, but her pattern of haptic activation in other brain regions was remarkably similar to healthy controls. Although LOC is routinely active during visual and haptic shape recognition tasks, it is not essential for haptic recognition of object shape. The lateral occipital complex (LOC) is a brain region regarded to be critical for recognizing object shape, both in vision and in touch. However, causal evidence linking LOC with haptic shape processing is lacking. We studied recognition performance, psychophysical sensitivity, and brain response to touched objects, in a patient (M.C.) with extensive lesions involving LOC bilaterally. Despite being severely impaired in visual shape recognition, M.C. was able to identify objects via touch and she showed normal sensitivity to a haptic shape illusion. M.C.'s brain response to touched objects in areas of undamaged cortex was also very similar to that observed in neurologically healthy controls. These results demonstrate that LOC is not necessary for recognizing objects via touch.
User interaction in smart ambient environment targeted for senior citizen.
Pulli, Petri; Hyry, Jaakko; Pouke, Matti; Yamamoto, Goshiro
2012-11-01
Many countries are facing a problem as the age structure of society changes. The number of senior citizens is rising rapidly, and the number of caretaking personnel cannot match the problems and needs of these citizens. Smart, ubiquitous technologies can offer ways of coping with the need for more nursing staff and with the rising costs to society of taking care of senior citizens. Helping senior citizens with a novel, easy-to-use interface that guides and assists them could improve their quality of living and enable them to participate more in daily activities. This paper presents a projection-based display system for elderly people with memory impairments and the proposed user interface for the system. Recognition of the user's activities based on a sensor network is also described. Elderly people wearing the system can interact with the projected user interface by tapping physical surfaces (such as walls, tables, or doors), using them as natural input surfaces with haptic feedback.
Haptic augmentation of science instruction: Does touch matter?
NASA Astrophysics Data System (ADS)
Jones, M. Gail; Minogue, James; Tretter, Thomas R.; Negishi, Atsuko; Taylor, Russell
2006-01-01
This study investigated the impact of haptic augmentation of a science inquiry program on students' learning about viruses and nanoscale science. The study assessed how the addition of different types of haptic feedback (active touch and kinesthetic feedback) combined with computer visualizations influenced middle and high school students' experiences. The influences of a PHANToM (a sophisticated haptic desktop device), a Sidewinder (a haptic gaming joystick), and a mouse (no haptic feedback) interface were compared. The levels of engagement in the instruction and students' attitudes about the instructional program were assessed using a combination of constructed response and Likert scale items. Potential cognitive differences were examined through an analysis of spontaneously generated analogies that appeared during student discourse. Results showed that the addition of haptic feedback from the haptic-gaming joystick and the PHANToM provided a more immersive learning environment that not only made the instruction more engaging but may also influence the way in which the students construct their understandings about abstract science concepts.
Matsumiya, Kazumichi
2013-10-01
Current views on face perception assume that the visual system receives only visual facial signals. However, I show that the visual perception of faces is systematically biased by adaptation to a haptically explored face. Recently, face aftereffects (FAEs; the altered perception of faces after adaptation to a face) have been demonstrated not only in visual perception but also in haptic perception; therefore, I combined the two FAEs to examine whether the visual system receives face-related signals from the haptic modality. I found that adaptation to a haptically explored facial expression on a face mask produced a visual FAE for facial expression. This cross-modal FAE was not due to explicitly imaging a face, response bias, or adaptation to local features. Furthermore, FAEs transferred from vision to haptics. These results indicate that visual face processing depends on substrates adapted by haptic faces, which suggests that face processing relies on shared representation underlying cross-modal interactions.
Visual design for the user interface, Part 1: Design fundamentals.
Lynch, P J
1994-01-01
Digital audiovisual media and computer-based documents will be the dominant forms of professional communication in both clinical medicine and the biomedical sciences. The design of highly interactive multimedia systems will shortly become a major activity for biocommunications professionals. The problems of human-computer interface design are intimately linked with graphic design for multimedia presentations and on-line document systems. This article outlines the history of graphic interface design and the theories that have influenced the development of today's major graphic user interfaces.
A magnetorheological haptic cue accelerator for manual transmission vehicles
NASA Astrophysics Data System (ADS)
Han, Young-Min; Noh, Kyung-Wook; Lee, Yang-Sub; Choi, Seung-Bok
2010-07-01
This paper proposes a new haptic cue function for manual transmission vehicles to achieve optimal gear shifting. This function is implemented on the accelerator pedal by utilizing a magnetorheological (MR) brake mechanism. By combining the haptic cue function with the accelerator pedal, the proposed haptic cue device can transmit the optimal moment of gear shifting for manual transmission to a driver without requiring the driver's visual attention. As a first step to achieve this goal, a MR fluid-based haptic device is devised to enable rotary motion of the accelerator pedal. Taking into account spatial limitations, the design parameters are optimally determined using finite element analysis to maximize the relative control torque. The proposed haptic cue device is then manufactured and its field-dependent torque and time response are experimentally evaluated. Then the manufactured MR haptic cue device is integrated with the accelerator pedal. A simple virtual vehicle emulating the operation of the engine of a passenger vehicle is constructed and put into communication with the haptic cue device. A feed-forward torque control algorithm for the haptic cue is formulated and control performances are experimentally evaluated and presented in the time domain.
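The feed-forward structure, mapping the engine's excess speed past the optimal shift point to a cue torque and then to MR-brake coil current, can be sketched as follows; all constants and the linear field-torque model are illustrative assumptions rather than the paper's identified parameters:

```python
def haptic_cue_current(engine_rpm, shift_rpm=2500.0,
                       k_torque=0.004, torque_to_current=2.0,
                       max_current=1.5):
    """Feed-forward haptic cue: no cue below the optimal shift speed,
    then pedal resistance grows with the RPM excess. The commanded
    torque is converted to MR-brake coil current via an assumed
    linear field-torque model."""
    excess = max(0.0, engine_rpm - shift_rpm)
    torque = k_torque * excess                # desired cue torque [N·m]
    current = torque * torque_to_current      # coil current [A]
    return min(current, max_current)          # respect the coil's current limit

for rpm in (2000, 2600, 3200):
    print(rpm, "rpm ->", round(haptic_cue_current(rpm), 3), "A")
```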
PRay - A graphical user interface for interactive visualization and modification of rayinvr models
NASA Astrophysics Data System (ADS)
Fromm, T.
2016-01-01
PRay is a graphical user interface for the interactive display and editing of velocity models for seismic refraction. It is optimized for editing rayinvr models but can also be used as a dynamic viewer for ray-tracing results from other software. The main features are graphical editing of nodes and fast adjustment of the display (stations and phases). It can be extended with user-defined shell scripts and links to phase-picking software. PRay is open-source software written in the scripting language Perl, runs on Unix-like operating systems including Mac OS X, and provides a version-controlled source code repository for community development (https://sourceforge.net/projects/pray-plot-rayinvr/).
Methods and apparatus for graphical display and editing of flight plans
NASA Technical Reports Server (NTRS)
Gibbs, Michael J. (Inventor); Adams, Jr., Mike B. (Inventor); Chase, Karl L. (Inventor); Lewis, Daniel E. (Inventor); McCrobie, Daniel E. (Inventor); Omen, Debi Van (Inventor)
2002-01-01
Systems and methods are provided for an integrated graphical user interface which facilitates the display and editing of aircraft flight-plan data. A user (e.g., a pilot) located within the aircraft provides input to a processor through a cursor control device and receives visual feedback via a display produced by a monitor. The display includes various graphical elements associated with the lateral position, vertical position, flight-plan and/or other indicia of the aircraft's operational state as determined from avionics data and/or various data sources. Through use of the cursor control device, the user may modify the flight-plan and/or other such indicia graphically in accordance with feedback provided by the display. In one embodiment, the display includes a lateral view, a vertical profile view, and a hot-map view configured to simplify the display and editing of the aircraft's flight-plan data.
Multi-fingered haptic palpation utilizing granular jamming stiffness feedback actuators
NASA Astrophysics Data System (ADS)
Li, Min; Ranzani, Tommaso; Sareh, Sina; Seneviratne, Lakmal D.; Dasgupta, Prokar; Wurdemann, Helge A.; Althoefer, Kaspar
2014-09-01
This paper describes a multi-fingered haptic palpation method using stiffness feedback actuators for simulating tissue palpation procedures in traditional and in robot-assisted minimally invasive surgery. Soft tissue stiffness is simulated by changing the stiffness property of the actuator during palpation. For the first time, granular jamming and pneumatic air actuation are combined to realize stiffness modulation. The stiffness feedback actuator is validated by stiffness measurements in indentation tests and through stiffness discrimination based on a user study. According to the indentation test results, the introduction of a pneumatic chamber to granular jamming can amplify the stiffness variation range and reduce hysteresis of the actuator. The advantage of multi-fingered palpation using the proposed actuators is proven by the comparison of the results of the stiffness discrimination performance using two-fingered (sensitivity: 82.2%, specificity: 88.9%, positive predicative value: 80.0%, accuracy: 85.4%, time: 4.84 s) and single-fingered (sensitivity: 76.4%, specificity: 85.7%, positive predicative value: 75.3%, accuracy: 81.8%, time: 7.48 s) stiffness feedback.
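For reference, the reported discrimination metrics follow the standard confusion-matrix definitions; a quick sketch with made-up counts shows how sensitivity, specificity, positive predictive value, and accuracy are computed:

```python
def detection_metrics(tp, fn, tn, fp):
    """Standard confusion-matrix metrics, as reported for the
    stiffness-discrimination user study (counts here are illustrative)."""
    return {
        "sensitivity": tp / (tp + fn),              # true positive rate
        "specificity": tn / (tn + fp),              # true negative rate
        "ppv":         tp / (tp + fp),              # positive predictive value
        "accuracy":    (tp + tn) / (tp + tn + fp + fn),
    }

print(detection_metrics(tp=37, fn=8, tn=40, fp=5))  # hypothetical counts
```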
Yovanoff, Mary; Pepley, David; Mirkin, Katelin; Moore, Jason; Han, David; Miller, Scarlett
2017-01-01
While Virtual Reality (VR) has emerged as a viable method for training new medical residents, it has not yet reached all areas of training. One area lacking such development is surgical residency programs where there are large learning curves associated with skill development. In order to address this gap, a Dynamic Haptic Robotic Trainer (DHRT) was developed to help train surgical residents in the placement of ultrasound guided Internal Jugular Central Venous Catheters and to incorporate personalized learning. In order to accomplish this, a 2-part study was conducted to: (1) systematically analyze the feedback given to 18 third year medical students by trained professionals to identify the items necessary for a personalized learning system and (2) develop and experimentally test the usability of the personalized learning interface within the DHRT system. The results can be used to inform the design of VR and personalized learning systems within the medical community. PMID:29123361
Flight Telerobotic Servicer prototype simulator
NASA Astrophysics Data System (ADS)
Schein, Rob; Krauze, Linda; Hartley, Craig; Dickenson, Alan; Lavecchia, Tom; Working, Bob
A prototype simulator for the Flight Telerobotic Servicer (FTS) system is described for use in the design development of the FTS, with emphasis on the hand controller and user interface. The simulator is based on a graphics workstation and rapid prototyping tools for systems analysis of the user interface and the hand controller. Kinematic modeling, manipulator-control algorithms, and communications programs are contained in the simulator software. The hardwired FTS panels and operator interface for use on the STS Orbiter are represented graphically, and the simulated controls function as they will in the final FTS system configuration. The robotic arm moves in response to the user's hand-controller inputs, and the joint angles and other data are shown on the prototype user interface. This graphics simulation tool provides a means of familiarizing crewmembers with FTS system operation, displays, and controls.
Design of a haptic device with grasp and push-pull force feedback for a master-slave surgical robot.
Hu, Zhenkai; Yoon, Chae-Hyun; Park, Samuel Byeongjun; Jo, Yung-Ho
2016-07-01
We propose a portable haptic device providing grasp (kinesthetic) and push-pull (cutaneous) sensations for optical-motion-capture master interfaces. Although optical-motion-capture master interfaces for surgical robot systems can overcome the stiffness, friction, and coupling problems of mechanical master interfaces, it is difficult to add haptic feedback to an optical-motion-capture master interface without constraining the free motion of the operator's hands. We therefore utilized a Bowden-cable-driven mechanism to provide the grasp and push-pull sensations while retaining the free hand motion of the optical-motion-capture master interface. To evaluate the haptic device, we constructed a 2-DOF force-sensing/force-feedback system and compared the sensed forces with the forces reproduced by the haptic device. Finally, a needle insertion test was performed to evaluate the performance of the haptic interface in the master-slave system. The results demonstrate that both the grasp and push-pull force feedback provided by the haptic interface closely matched the forces sensed by the slave robot. We successfully applied our haptic interface in the optical-motion-capture master-slave system, and the needle insertion test showed that our haptic feedback provides more safety than visual observation alone. In summary, we developed a haptic device that produces both kinesthetic grasp force feedback and cutaneous push-pull force feedback. Our future research will include further objective performance evaluations of the optical-motion-capture master-slave robot system with our haptic interface in surgical scenarios.
PrimerMapper: high throughput primer design and graphical assembly for PCR and SNP detection
O’Halloran, Damien M.
2016-01-01
Primer design represents a widely employed gambit in diverse molecular applications including PCR, sequencing, and probe hybridization. Variations of PCR, including primer walking, allele-specific PCR, and nested PCR, provide specialized validation and detection protocols for molecular analyses that often require screening large numbers of DNA fragments. In these cases, automated sequence retrieval and processing become important features; furthermore, a graphic that gives the user a visual guide to the distribution of designed primers across targets is most helpful in quickly ascertaining primer coverage. To this end, I describe here PrimerMapper, a comprehensive graphical user interface that designs robust primers from any number of inputted sequences while providing the user with both graphical maps of primer distribution for each inputted sequence and a global assembled map of all inputted sequences with designed primers. PrimerMapper also enables the visualization of graphical maps within a browser and allows the user to draw new primers directly onto the webpage. Other features of PrimerMapper include allele-specific design features for SNP genotyping, a remote BLAST window to NCBI databases, and remote sequence retrieval from GenBank and dbSNP. PrimerMapper is hosted at GitHub and freely available without restriction. PMID:26853558
The Efficacy of Surface Haptics and Force Feedback in Education
ERIC Educational Resources Information Center
Gorlewicz, Jenna Lynn
2013-01-01
This dissertation bridges the fields of haptics, engineering, and education to realize some of the potential benefits haptic devices may have in Science, Technology, Engineering, and Math (STEM) education. Specifically, this dissertation demonstrates the development, implementation, and assessment of two haptic devices in engineering and math…
Incorporating Haptic Feedback in Simulation for Learning Physics
ERIC Educational Resources Information Center
Han, Insook; Black, John B.
2011-01-01
The purpose of this study was to investigate the effectiveness of a haptic augmented simulation in learning physics. The results indicate that haptic augmented simulations, both the force-and-kinesthetic and the purely kinesthetic versions, were more effective than the equivalent non-haptic simulation in providing perceptual experiences and…
Haptic Distal Spatial Perception Mediated by Strings: Haptic "Looming"
ERIC Educational Resources Information Center
Cabe, Patrick A.
2011-01-01
Five experiments tested a haptic analog of optical looming, demonstrating string-mediated haptic distal spatial perception. Horizontally collinear hooks supported a weighted string held taut by a blindfolded participant's finger midway between the hooks. At the finger, the angle between string segments increased as the finger approached…
Haptic Classification of Common Objects: Knowledge-Driven Exploration.
ERIC Educational Resources Information Center
Lederman, Susan J.; Klatzky, Roberta L.
1990-01-01
Theoretical and empirical issues relating to haptic exploration and the representation of common objects during haptic classification were investigated in 3 experiments involving a total of 112 college students. Results are discussed in terms of a computational model of human haptic object classification with implications for dextrous robot…
enhancedGraphics: a Cytoscape app for enhanced node graphics
Morris, John H.; Kuchinsky, Allan; Ferrin, Thomas E.; Pico, Alexander R.
2014-01-01
enhancedGraphics (http://apps.cytoscape.org/apps/enhancedGraphics) is a Cytoscape app that implements a series of enhanced charts and graphics that may be added to Cytoscape nodes. It enables users and other app developers to create pie, line, bar, and circle plots that are driven by columns in the Cytoscape Node Table. Charts are drawn using vector graphics to allow full-resolution scaling. PMID:25285206
Model-Driven Development of Interactive Multimedia Applications with MML
NASA Astrophysics Data System (ADS)
Pleuss, Andreas; Hussmann, Heinrich
There is an increasing demand for high-quality interactive applications which combine complex application logic with a sophisticated user interface, making use of individual media objects like graphics, animations, 3D graphics, audio or video. Their development is still challenging as it requires the integration of software design, user interface design, and media design.
Applying Minimal Manual Principles for Documentation of Graphical User Interfaces.
ERIC Educational Resources Information Center
Nowaczyk, Ronald H.; James, E. Christopher
1993-01-01
Investigates the need to include computer screens in documentation for software using a graphical user interface. Describes the uses and purposes of "minimal manuals" and their principles. Studies student reaction to their use of one of three on-screen manuals: screens, icon, and button. Finds some benefit for including icon and button…
ERIC Educational Resources Information Center
Metros, Susan E.; Hedberg, John G.
2002-01-01
Examines the relationship between the graphical user interface (GUI) and the cognitive demands placed on the learner in eLearning (electronic learning) environments. Describes ways educators can design appropriate interfaces to facilitate meaningful interactions with educational content; and examines learner engagement and engagement theory using…
The Design of a Graphical User Interface for an Electronic Classroom.
ERIC Educational Resources Information Center
Cahalan, Kathleen J.; Levin, Jacques
2000-01-01
Describes the design of a prototype for the graphical user interface component of an electronic classroom (ECR) application that supports real-time lectures and question-and-answer sessions between an instructor and students. Based on requirements analysis and an analysis of competing products, a Web-based ECR prototype was produced. Findings show…
Circumventing Graphical User Interfaces in Chemical Engineering Plant Design
ERIC Educational Resources Information Center
Romey, Noel; Schwartz, Rachel M.; Behrend, Douglas; Miao, Peter; Cheung, H. Michael; Beitle, Robert
2007-01-01
Graphical User Interfaces (GUIs) are pervasive elements of most modern technical software and represent a convenient tool for student instruction. For example, GUIs are used for [chemical] process design software (e.g., CHEMCAD, PRO/II and ASPEN) typically encountered in the senior capstone course. Drag and drop aspects of GUIs are challenging for…
Software Graphical User Interface For Analysis Of Images
NASA Technical Reports Server (NTRS)
Leonard, Desiree M.; Nolf, Scott R.; Avis, Elizabeth L.; Stacy, Kathryn
1992-01-01
CAMTOOL software provides graphical interface between Sun Microsystems workstation and Eikonix Model 1412 digitizing camera system. Camera scans and digitizes images, halftones, reflectives, transmissives, rigid or flexible flat material, or three-dimensional objects. Users digitize images and select from three destinations: work-station display screen, magnetic-tape drive, or hard disk. Written in C.
Comparing Text-based and Graphic User Interfaces for Novice and Expert Users
Chen, Jung-Wei; Zhang, Jiajie
2007-01-01
Graphic User Interface (GUI) is commonly considered to be superior to Text-based User Interface (TUI). This study compares GUI and TUI in an electronic dental record system. Several usability analysis techniques were used to compare the relative effectiveness of a GUI and a TUI. Expert and novice users were evaluated on the time required and the steps needed to complete the task. A within-subject design was used to evaluate whether experience with either interface affects task performance. The results show that the GUI was not better than the TUI for expert users, whereas the GUI was better for novice users; for novice users there was also a learning-transfer effect from TUI to GUI. This means that whether a user interface is user-friendly depends on the mapping between the interface and the tasks: GUI by itself may or may not be better than TUI. PMID:18693811
A Java-Enabled Interactive Graphical Gas Turbine Propulsion System Simulator
NASA Technical Reports Server (NTRS)
Reed, John A.; Afjeh, Abdollah A.
1997-01-01
This paper describes a gas turbine simulation system which utilizes the newly developed Java language environment software system. The system provides an interactive graphical environment which allows the quick and efficient construction and analysis of arbitrary gas turbine propulsion systems. The simulation system couples a graphical user interface, developed using the Java Abstract Window Toolkit, and a transient, space-averaged, aero-thermodynamic gas turbine analysis method, both entirely coded in the Java language. The combined package provides analytical, graphical and data management tools which allow the user to construct and control engine simulations by manipulating graphical objects on the computer display screen. Distributed simulations, including parallel processing and distributed database access across the Internet and World-Wide Web (WWW), are made possible through services provided by the Java environment.
Interactive SIGHT: textual access to simple bar charts
NASA Astrophysics Data System (ADS)
Demir, Seniz; Oliver, David; Schwartz, Edward; Elzer, Stephanie; Carberry, Sandra; Mccoy, Kathleen F.; Chester, Daniel
2010-12-01
Information graphics, such as bar charts and line graphs, are an important component of many articles in the popular media. The majority of such graphics have an intention (a high-level message) to communicate to the graph viewer. Since the intended message of a graphic is often not repeated in the accompanying text, graphics and textual segments together contribute to the overall purpose of an article, and the graphics cannot be ignored. Unfortunately, these visual displays are provided in a format that is not readily accessible to everyone. For example, individuals with sight impairments who use screen readers to listen to documents have limited access to the graphics. This article presents a new accessibility tool, the Interactive SIGHT (Summarizing Information GrapHics Textually) system, which is intended to enable visually impaired users to access the knowledge that one would gain from viewing information graphics found on the web. The current system, implemented as a browser extension that works on simple bar charts, can be invoked by the user via a keystroke combination while navigating the web. Once launched, Interactive SIGHT first provides a brief summary that conveys the underlying intention of a bar chart along with the chart's most significant and salient features, and then produces history-aware follow-up responses to provide further information about the chart upon request. We present two user studies, conducted with sighted and visually impaired users, that determine how effective the initial summary and follow-up responses are in conveying the informational content of bar charts and evaluate how easy the system interface is to use. The evaluation results are promising and indicate that the system responses are well structured and enable visually impaired users to answer key questions about bar charts in an easy-to-use manner. Post-experimental interviews revealed that visually impaired participants were very satisfied with the options the system offers for accessing the content of a chart to meet their specific needs, and that they would use Interactive SIGHT if it were publicly available so as not to have to ignore graphics on the web. As a language-based assistive technology designed to compensate for the lack of sight, this work paves the road for a stronger acceptance of natural language interfaces to graph interpretation, which we believe will be of great benefit to the visually impaired community.
An experimental study on CHVE's performance evaluation.
Paiva, Paulo V F; Machado, Liliane S; Oliveira, Jauvane C
2012-01-01
Virtual reality-based training simulators with collaborative capabilities are known to improve the way users interact with one another while learning or improving skills for a given medical procedure. Performance evaluation of Collaborative Haptic Virtual Environments (CHVE) allows us to understand how such systems can work on the Internet, as well as the requirements for multisensory and real-time data. This work presents new performance evaluation results for the collaborative module of the CyberMed VR framework.
Perceptualization of geometry using intelligent haptic and visual sensing
NASA Astrophysics Data System (ADS)
Weng, Jianguang; Zhang, Hui
2013-01-01
We present a set of paradigms for investigating geometric structures using haptic and visual sensing. Our principal test cases include smoothly embedded geometric shapes, such as knotted curves embedded in 3D and knotted surfaces in 4D, that produce massive self-intersections when projected one dimension lower. One can exploit a touch-responsive 3D interactive probe to haptically override this conflicting evidence in the rendered images, forcing continuity in the haptic representation to emphasize the true topology. In our work, we exploited predictive haptic guidance, a "computer-simulated hand" with supplementary force suggestion, to support intelligent exploration of geometric shapes, smoothing the exploration and maximizing the probability of recognition. The cognitive load can be reduced further by enabling attention-driven visual sensing during the haptic exploration. Our methods combine to reveal the full richness of the haptic exploration of geometric structures and to overcome the limitations of traditional 4D visualization.
User's manual for the coupled rotor/airframe vibration analysis graphic package
NASA Technical Reports Server (NTRS)
Studwell, R. E.
1982-01-01
User instructions for a graphics package for coupled rotor/airframe vibration analysis are presented. The responses the user must make to plot package messages to activate plot package operations and options are described. Installation instructions required to set up the program on the CDC system are included. The plot package overlay structure and the subroutines that have to be modified for the CDC system are also described. Operating instructions for CDC applications are included.
Brayda, Luca; Campus, Claudio; Memeo, Mariacarla; Lucagrossi, Laura
2015-01-01
Tactile maps are efficient tools to improve the spatial understanding and mobility skills of visually impaired people. Their limited adaptability can be compensated with haptic devices that display graphical information, but their assessment is frequently limited to performance-based metrics, which can hide potential spatial abilities in orientation and mobility (O&M) protocols. We assess a low-tech tactile mouse able to deliver three-dimensional content, considering how performance, mental workload, behavior, and anxiety status vary with task difficulty and gender in congenitally blind, late blind, and sighted subjects. Results show that task difficulty coherently modulates the efficiency and difficulty of building mental maps, regardless of visual experience. Although exhibiting attitudes that were similar and gender-independent, the female participants had lower performance and higher cognitive load, especially when congenitally blind. All groups showed a significant decrease in anxiety after using the device. Tactile graphics delivered with our device therefore seem applicable across different levels of visual experience, with no negative emotional consequences of mentally demanding spatial tasks. Going beyond performance-based assessment, our methodology can help with better targeting technological solutions in orientation and mobility protocols.
Differential effects of non-informative vision and visual interference on haptic spatial processing
van Rheede, Joram J.; Postma, Albert; Kappers, Astrid M. L.
2008-01-01
The primary purpose of this study was to examine the effects of non-informative vision and visual interference upon haptic spatial processing, which supposedly derives from an interaction between an allocentric and egocentric reference frame. To this end, a haptic parallelity task served as baseline to determine the participant-dependent biasing influence of the egocentric reference frame. As expected, large systematic participant-dependent deviations from veridicality were observed. In the second experiment we probed the effect of non-informative vision on the egocentric bias. Moreover, orienting mechanisms (gazing directions) were studied with respect to the presentation of haptic information in a specific hemispace. Non-informative vision proved to have a beneficial effect on haptic spatial processing. No effect of gazing direction or hemispace was observed. In the third experiment we investigated the effect of simultaneously presented interfering visual information on the haptic bias. Interfering visual information parametrically influenced haptic performance. The interplay of reference frames that subserves haptic spatial processing was found to be related to both the effects of non-informative vision and visual interference. These results suggest that spatial representations are influenced by direct cross-modal interactions; inter-participant differences in the haptic modality resulted in differential effects of the visual modality. PMID:18553074
Size-Sensitive Perceptual Representations Underlie Visual and Haptic Object Recognition
Craddock, Matt; Lawson, Rebecca
2009-01-01
A variety of similarities between visual and haptic object recognition suggests that the two modalities may share common representations. However, it is unclear whether such common representations preserve low-level perceptual features or whether transfer between vision and haptics is mediated by high-level, abstract representations. Two experiments used a sequential shape-matching task to examine the effects of size changes on unimodal and crossmodal visual and haptic object recognition. Participants felt or saw 3D plastic models of familiar objects. The two objects presented on a trial were either the same size or different sizes and were the same shape or different but similar shapes. Participants were told to ignore size changes and to match on shape alone. In Experiment 1, size changes on same-shape trials impaired performance similarly for both visual-to-visual and haptic-to-haptic shape matching. In Experiment 2, size changes impaired performance on both visual-to-haptic and haptic-to-visual shape matching and there was no interaction between the cost of size changes and direction of transfer. Together the unimodal and crossmodal matching results suggest that the same, size-specific perceptual representations underlie both visual and haptic object recognition, and indicate that crossmodal memory for objects must be at least partly based on common perceptual representations. PMID:19956685
Rauter, Georg; Sigrist, Roland; Riener, Robert; Wolf, Peter
2015-01-01
In the literature, the effectiveness of haptics for motor learning is controversially discussed. Haptics is believed to be effective for motor learning in general; however, different types of haptic control enhance different movement aspects. Thus, depending on the movement aspects of interest, one type of haptic control may be effective whereas another is not. Therefore, in the current work, it was investigated if and how different types of haptic controllers affect learning of spatial and temporal movement aspects. In particular, haptic controllers that enforce active participation of the participants were expected to improve spatial aspects, while only haptic controllers that provide feedback about the task's velocity profile were expected to improve temporal aspects. In a study on learning a complex trunk-arm rowing task, the effect of training with four different types of haptic control was investigated: position control, path control, adaptive path control, and reactive path control. A fifth group (control) trained with visual concurrent augmented feedback. As hypothesized, the position controller was most effective for learning temporal movement aspects, while the path controller was most effective in teaching spatial movement aspects of the rowing task. Visual feedback was also effective for learning temporal and spatial movement aspects.
Prevailing Trends in Haptic Feedback Simulation for Minimally Invasive Surgery.
Pinzon, David; Byrns, Simon; Zheng, Bin
2016-08-01
Background: The amount of direct hand-tool-tissue interaction and feedback in minimally invasive surgery varies from being attenuated in laparoscopy to being completely absent in robotic minimally invasive surgery. The role of haptic feedback during surgical skill acquisition and its emphasis in training have been a constant source of controversy. This review discusses the major developments in haptic simulation as they relate to surgical performance and the current research questions that remain unanswered. Search Strategy: An in-depth review of the literature was performed using PubMed. Results: A total of 198 abstracts were returned based on our search criteria. Three major areas of research were identified: advancements in one of the four components of haptic systems, evaluation of the effectiveness of haptic integration in simulators, and improvements to haptic feedback in robotic surgery. Conclusions: Force feedback is the best method for tissue identification in minimally invasive surgery, and haptic feedback provides the greatest benefit to surgical novices in the early stages of their training. New technology has improved our ability to capture, play back, and enhance the utility of haptic cues in simulated surgery. Future research should focus on deciphering how haptic training in surgical education can increase performance, safety, and training efficiency.
Graphical workstation capability for reliability modeling
NASA Technical Reports Server (NTRS)
Bavuso, Salvatore J.; Koppen, Sandra V.; Haley, Pamela J.
1992-01-01
In addition to computational capabilities, software tools for estimating the reliability of fault-tolerant digital computer systems must also provide a means of interfacing with the user. Described here is the new graphical interface capability of the hybrid automated reliability predictor (HARP), a software package that implements advanced reliability modeling techniques. The graphics oriented (GO) module provides the user with a graphical language for modeling system failure modes through the selection of various fault-tree gates, including sequence-dependency gates, or by a Markov chain. By using this graphical input language, a fault tree becomes a convenient notation for describing a system. In accounting for any sequence dependencies, HARP converts the fault-tree notation to a complex stochastic process that is reduced to a Markov chain, which it can then solve for system reliability. The graphics capability is available for use on an IBM-compatible PC, a Sun, and a VAX workstation. The GO module is written in the C programming language and uses the Graphical Kernel System (GKS) standard for graphics implementation. The PC, VAX, and Sun versions of the HARP GO module are currently in beta-testing stages.
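The fault-tree-to-Markov-chain conversion that HARP performs can be illustrated on a much smaller scale. The sketch below solves a three-state continuous-time Markov chain for transient state probabilities; the two-unit redundant system, its rates, and the mission time are hypothetical stand-ins, not HARP's model.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical two-unit redundant system reduced to a Markov chain.
lam, mu = 1e-4, 1e-2   # failure and repair rates per hour (illustrative)
# States: 0 = both units up, 1 = one unit up, 2 = system failed (absorbing)
Q = np.array([
    [-2*lam,       2*lam,  0.0],
    [    mu, -(mu + lam),  lam],
    [   0.0,         0.0,  0.0],
])
p0 = np.array([1.0, 0.0, 0.0])     # both units up at t = 0
t = 1000.0                         # mission time, hours
p_t = p0 @ expm(Q * t)             # transient solution of dp/dt = pQ
print("reliability at t:", 1.0 - p_t[2])
```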
NASA Astrophysics Data System (ADS)
Sherley, Patrick L.; Pujol, Alfonso, Jr.; Meadow, John S.
1990-07-01
To provide a means of rendering complex computer architectures, languages, and input/output modalities transparent to experienced and inexperienced users, research is being conducted to develop a voice-driven/voice-response computer graphics imaging system. The system will be used for reconstructing and displaying computed tomography and magnetic resonance imaging scan data. In conjunction with this study, an artificial intelligence (AI) control strategy was developed to interface the voice components and support software to the computer graphics functions implemented on the Sun Microsystems 4/280 color graphics workstation. Based on generated text and converted renditions of verbal utterances by the user, the AI control strategy determines the user's intent and develops and validates a plan. The program type and parameters within the plan are used as input to the graphics system for reconstructing and displaying medical image data corresponding to that perceived intent. If the plan is not valid, the control strategy queries the user for additional information. The control strategy operates in a conversational mode and vocally provides system status reports. A detailed examination of the various AI techniques is presented, with major emphasis placed on their specific roles within the total control strategy structure.
SimHap GUI: An intuitive graphical user interface for genetic association analysis
Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J
2008-01-01
Background: Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. Results: We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single-SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimation of haplotype simulation progress. Conclusion: SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses, and a free alternative to commercial statistics packages that is specifically designed for genetic association analysis. PMID:19109877
ERIC Educational Resources Information Center
Hancock-Beaulieu, Micheline; And Others
1995-01-01
An online library catalog was used to evaluate an interactive query expansion facility based on relevance feedback for the Okapi, probabilistic, term weighting, retrieval system. A graphical user interface allowed searchers to select candidate terms extracted from relevant retrieved items to reformulate queries. Results suggested that the…
GeoWorks Considered. Part I: A GUI for the Rest of Us. Part II: Doing Windows Right.
ERIC Educational Resources Information Center
Flanders, Bruce; Lewis, Paul
1991-01-01
Describes GeoWorks, a new graphical user interface (GUI) that works on older, less powerful IBM PCs and compatibles. The PC/GEOS (PC/Graphical Environment Operating System) is explained, user friendliness is emphasized, comparisons are made to Microsoft Windows, and GeoWorks applications software is described. (LRW)
ERIC Educational Resources Information Center
National Council on Disability, Washington, DC.
This report investigates the use of the graphical user interface (GUI) in computer programs, the problems it creates for individuals with visual impairments or blindness, and advocacy efforts concerning this issue, which have been targeted primarily at Microsoft, producer of Windows. The report highlights the concerns of individuals with visual…
Young Children's Skill in Using a Mouse to Control a Graphical Computer Interface.
ERIC Educational Resources Information Center
Crook, Charles
1992-01-01
Describes a study that investigated the performance of preschoolers and children in the first three years of formal education on tasks that involved skills using a mouse-based control of a graphical computer interface. The children's performance is compared with that of novice adult users and expert users. (five references) (LRW)
Automatic programming of simulation models
NASA Technical Reports Server (NTRS)
Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.
1990-01-01
The concepts of software engineering were used to improve the simulation modeling environment. Emphasis was placed on applying an element of rapid prototyping, or automatic programming, to assist the modeler in defining the problem specification. Once the problem specification has been defined, an automatic code generator is used to write the simulation code. Two domains were selected for evaluating the concepts of software engineering for discrete event simulation: a manufacturing domain and a spacecraft countdown network sequence. The specific tasks were to: (1) define the software requirements for a graphical user interface to the Automatic Manufacturing Programming System (AMPS); (2) develop a graphical user interface for AMPS; and (3) compare the AMPS graphical interface with the AMPS interactive user interface.
iMOSFLM: a new graphical interface for diffraction-image processing with MOSFLM
Battye, T. Geoff G.; Kontogiannis, Luke; Johnson, Owen; Powell, Harold R.; Leslie, Andrew G. W.
2011-01-01
iMOSFLM is a graphical user interface to the diffraction data-integration program MOSFLM. It is designed to simplify data processing by dividing the process into a series of steps, which are normally carried out sequentially. Each step has its own display pane, allowing control over parameters that influence that step and providing graphical feedback to the user. Suitable values for integration parameters are set automatically, but additional menus provide a detailed level of control for experienced users. The image display and the interfaces to the different tasks (indexing, strategy calculation, cell refinement, integration and history) are described. The most important parameters for each step and the best way of assessing success or failure are discussed. PMID:21460445
An adaptive structure data acquisition system using a graphical-based programming language
NASA Technical Reports Server (NTRS)
Baroth, Edmund C.; Clark, Douglas J.; Losey, Robert W.
1992-01-01
An example of the implementation of data fusion using a PC and a graphical programming language is discussed. A schematic of the data acquisition system and the user interface panel for an adaptive structure test are presented. The computer programs (a series of icons 'wired' together) are also discussed. It is shown how using graphical programming software to control a data acquisition system can simplify the analysis of data, promote multidisciplinary interaction, and give users a more visual key to understanding their data.
Advanced graphical user interface for multi-physics simulations using AMST
NASA Astrophysics Data System (ADS)
Hoffmann, Florian; Vogel, Frank
2017-07-01
Numerical modelling of particulate matter has gained much popularity in recent decades. Advanced Multi-physics Simulation Technology (AMST) is a state-of-the-art three-dimensional numerical modelling technique combining the eXtended Discrete Element Method (XDEM) with Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) [1]. One major limitation of this code is the lack of a graphical user interface (GUI), meaning that all pre-processing has to be done directly in an HDF5 file. This contribution presents the first graphical pre-processor developed for AMST.
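To make the pre-processing limitation concrete, a case setup written by hand into HDF5 might look like the h5py sketch below. Every group, dataset, and attribute name here is hypothetical, not AMST's actual schema; the GUI described in the paper is meant to generate such files for the user.

```python
import h5py

# Hypothetical hand-written case file; all names are illustrative only.
with h5py.File("amst_case.h5", "w") as f:
    dem = f.create_group("xdem/particles")
    dem.create_dataset("diameter_m", data=[0.004, 0.005, 0.004])
    dem.create_dataset("density_kg_m3", data=[2500.0, 2500.0, 2500.0])
    cfd = f.create_group("cfd")
    cfd.attrs["time_step_s"] = 1.0e-4     # hypothetical solver setting
    cfd.attrs["end_time_s"] = 2.0
```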
ROCOPT: A user friendly interactive code to optimize rocket structural components
NASA Technical Reports Server (NTRS)
Rule, William K.
1989-01-01
ROCOPT is a user-friendly, graphically interfaced, microcomputer-based computer program (IBM compatible) that optimizes rocket components by minimizing the structural weight. The rocket components considered are ring-stiffened truncated cones and cylinders. The applied loading is static and can consist of any combination of internal or external pressure, axial force, bending moment, and torque. Stress margins are calculated by means of simple closed-form, strength-of-materials-type equations. Stability margins are determined by approximate, orthotropic-shell, closed-form equations. A modified form of Powell's method, in conjunction with a modified form of the external penalty method, is used to determine the minimum weight of the structure subject to stress and stability margin constraints, as well as user-input constraints on the structural dimensions. The graphical interface guides the user through the required data prompts, explains program options, and graphically displays results for easy interpretation.
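The optimization strategy described, a Powell-type direction-set search wrapped in an exterior penalty, can be sketched on a drastically simplified problem. The one-variable pressurized-cylinder model below is illustrative only; ROCOPT's actual stress and stability equations, and its modified Powell and penalty variants, are more involved.

```python
import numpy as np
from scipy.optimize import minimize

rho, pr, R, L = 2700.0, 2.0e6, 0.5, 2.0   # density, pressure, radius, length
sigma_allow, t_min = 2.0e8, 1.0e-3        # allowable stress, minimum gauge

def weight(t):
    return rho * 2.0 * np.pi * R * t * L  # thin-shell weight, kg

def penalized(x, r=1.0e9):
    t = x[0]
    # Normalized margins, >= 0 when satisfied (hoop stress, minimum gauge):
    g1 = 1.0 - (pr * R / t) / sigma_allow if t > 0 else -1.0
    g2 = t / t_min - 1.0
    # Exterior quadratic penalty: only violated constraints contribute.
    return weight(t) + r * (max(0.0, -g1)**2 + max(0.0, -g2)**2)

res = minimize(penalized, x0=[0.02], method="Powell")
print("optimal thickness (m):", res.x[0])  # about pr*R/sigma_allow = 5 mm
```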
Customizing graphical user interface technology for spacecraft control centers
NASA Technical Reports Server (NTRS)
Beach, Edward; Giancola, Peter; Gibson, Steven; Mahmot, Ronald
1993-01-01
The Transportable Payload Operations Control Center (TPOCC) project is applying the latest in graphical user interface technology to the spacecraft control center environment. This project of the Mission Operations Division's (MOD) Control Center Systems Branch (CCSB) at NASA Goddard Space Flight Center (GSFC) has developed an architecture for control centers which makes use of a distributed processing approach and the latest in Unix workstation technology. The TPOCC project is committed to following industry standards and using commercial off-the-shelf (COTS) hardware and software components wherever possible to reduce development costs and to improve operational support. TPOCC's most successful use of commercial software products and standards has been in the development of its graphical user interface. This paper describes TPOCC's successful use and customization of four separate layers of commercial software products to create a flexible and powerful user interface that is uniquely suited to spacecraft monitoring and control.
Dibble, Edward; Zivanovic, Aleksandar; Davies, Brian
2004-01-01
This paper presents the results of several early studies relating to human haptic perception sensitivity when probing a virtual object. A 1-degree-of-freedom (DoF) rotary haptic system, designed and built for this purpose, is also presented. The experiments were designed to assess the maximum forces applied in a minimally invasive surgery (MIS) procedure, to quantify the compliance sensitivity threshold when probing virtual tissue, and to identify the haptic system loop rate necessary for haptic feedback to feel realistic.
User's manual for the HYPGEN hyperbolic grid generator and the HGUI graphical user interface
NASA Technical Reports Server (NTRS)
Chan, William M.; Chiu, Ing-Tsau; Buning, Pieter G.
1993-01-01
The HYPGEN program is used to generate a 3-D volume grid over a user-supplied single-block surface grid. This is accomplished by solving the 3-D hyperbolic grid generation equations, consisting of two orthogonality relations and one cell volume constraint. In this user manual, the required input files and parameters and the output files are described. Guidelines on how to select the input parameters are given, and illustrated examples are provided showing a variety of topologies and geometries that can be treated. HYPGEN can be used in stand-alone mode as a batch program, or it can be called from within a graphical user interface, HGUI, that runs on Silicon Graphics workstations. This user manual provides a description of the menus, buttons, sliders, and type-in fields in HGUI that users employ to enter the parameters needed to run HYPGEN. Instructions are given on how to configure the interface to allow HYPGEN to run either locally or on a faster remote machine through the use of shell scripts on UNIX operating systems. The volume grid generated is copied back to the local machine for visualization using a built-in hook to PLOT3D.
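The marching nature of hyperbolic grid generation is easiest to see in two dimensions, where the cell-volume constraint reduces to a per-step area constraint. The toy sketch below advances layers outward from a half-circle body along local normals; it is an illustrative analogue of the idea, not HYPGEN's 3-D scheme or its smoothing.

```python
import numpy as np

n_pts, n_layers, dV = 41, 20, 0.002
theta = np.linspace(0.0, np.pi, n_pts)
layer = np.stack([np.cos(theta), np.sin(theta)], axis=1)  # body: half circle

grid = [layer]
for _ in range(n_layers):
    tang = np.gradient(layer, axis=0)            # tangents along the layer
    ds = np.linalg.norm(tang, axis=1)            # local arc-length spacing
    normal = np.stack([tang[:, 1], -tang[:, 0]], axis=1) / ds[:, None]
    dn = dV / ds                                 # area constraint: ds * dn = dV
    layer = layer + normal * dn[:, None]         # march one layer outward
    grid.append(layer)

grid = np.array(grid)    # (n_layers + 1, n_pts, 2) "volume" grid
print(grid.shape)
```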
Roughness based perceptual analysis towards digital skin imaging system with haptic feedback.
Kim, K
2016-08-01
To examine psoriasis or atopic eczema, analyzing skin roughness by palpation is essential to precisely diagnose skin diseases. However, optical-sensor-based skin imaging systems do not allow dermatologists to touch skin images. To solve this problem, a new haptic rendering technology that can accurately display skin roughness must be developed. In addition, the rendering algorithm must be able to filter spatial noise created during 2D-to-3D image conversion without losing the original roughness of the skin image. In this study, a perceptual way to design a noise filter that removes spatial noise while recovering maximal roughness is introduced, based on an understanding of human sensitivity to surface roughness. A visuohaptic rendering system that lets a user see and touch digital skin surface roughness has been developed, including a geometric roughness estimation method for a meshed surface. A psychophysical experiment was then designed and conducted with 12 human subjects to measure human perception of surface roughness with the developed visual and haptic interfaces. The experiment showed that touch is more sensitive at lower surface roughness and less sensitive at higher roughness; perception with both senses, vision and touch, becomes less sensitive to surface distortions as roughness increases. When interacting through both channels, the visual and haptic interfaces, the ability to detect roughness abnormalities is greatly improved by sensory integration with the developed visuohaptic rendering system. The result can be used as a guideline for designing a noise filter that perceptually removes spatial noise while recovering maximal roughness values from a digital skin image obtained by optical sensors. The result also confirms that the developed visuohaptic rendering system can help dermatologists and skin care professionals examine skin conditions using vision and touch at the same time.
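A geometric roughness estimate of the general kind described can be computed as the root-mean-square deviation of surface heights about a detrended baseline. The sketch below operates on a synthetic 1-D profile; the paper's mesh-based estimator is not specified here, so treat this as an assumed, simplified illustration.

```python
import numpy as np

# Synthetic height profile standing in for heights sampled from a skin mesh.
x = np.linspace(0.0, 1.0, 500)
heights = 0.02 * np.sin(40 * np.pi * x) + 0.001 * np.random.randn(x.size)

# Remove low-order form so only fine-scale texture remains.
baseline = np.polyval(np.polyfit(x, heights, 2), x)
dev = heights - baseline

rq = np.sqrt(np.mean(dev**2))   # RMS roughness (Rq)
ra = np.mean(np.abs(dev))       # arithmetic-mean roughness (Ra)
print(f"Rq = {rq:.4f}, Ra = {ra:.4f}")
```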
X based interactive computer graphics applications for aerodynamic design and education
NASA Technical Reports Server (NTRS)
Benson, Thomas J.; Higgs, C. Fred, III
1995-01-01
Six computer applications packages have been developed to solve a variety of aerodynamic problems in an interactive environment on a single workstation. The packages perform classical one-dimensional analysis under the control of a graphical user interface and can be used for preliminary design or educational purposes. The programs were originally developed on a Silicon Graphics workstation and used the GL version of the FORMS library as the graphical user interface. These programs have recently been converted to the XFORMS library of X-based graphics widgets and have been tested on SGI, IBM, Sun, HP and PC-Linux computers. The paper will show results from the new VU-DUCT program as a prime example. VU-DUCT has been developed as an educational package for the study of subsonic open and closed loop wind tunnels.
Color graphics, interactive processing, and the supercomputer
NASA Technical Reports Server (NTRS)
Smith-Taylor, Rudeen
1987-01-01
The development of a common graphics environment for the NASA Langley Research Center user community and the integration of a supercomputer into this environment is examined. The initial computer hardware, the software graphics packages, and their configurations are described. The addition of improved computer graphics capability to the supercomputer, and the utilization of the graphic software and hardware are discussed. Consideration is given to the interactive processing system which supports the computer in an interactive debugging, processing, and graphics environment.
NASA Astrophysics Data System (ADS)
Choi, Seung-Hyun; Kim, Soomin; Kim, Pyunghwa; Park, Jinhyuk; Choi, Seung-Bok
2015-06-01
In this study, we developed a novel four-degrees-of-freedom haptic master using controllable magnetorheological (MR) fluid. We also integrated the haptic master with a vision device with image processing for robot-assisted minimally invasive surgery (RMIS). The proposed master can be used in RMIS as a haptic interface to provide the surgeon with a sense of touch by using both kinetic and kinesthetic information. The slave robot, which is manipulated with a proportional-integral-derivative controller, uses a force sensor to obtain the desired forces from tissue contact, and these desired repulsive forces are then embodied through the MR haptic master. To verify the effectiveness of the haptic master, the desired force and the actual force are compared in the time domain. In addition, a visual feedback system is implemented in the RMIS experiment to distinguish between the tumor and the organ more clearly and provide better visibility to the operator. The hue-saturation-value (HSV) color space is adopted for the image processing, since it is often more intuitive than other color spaces. The effects of the image processing and haptic feedback on surgical performance are then evaluated: tumor-cutting experiments are conducted under four different operating conditions (haptic feedback on, haptic feedback off, image processing on, and image processing off), and the experiments show that the performance index, which is a function of pixels, differs across the four operating conditions.
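HSV-based segmentation of the kind used for the visual feedback can be sketched with OpenCV in a few lines. The hue, saturation, and value bounds and the file names below are hypothetical placeholders, not the study's calibrated values.

```python
import cv2
import numpy as np

frame = cv2.imread("endoscope_frame.png")          # BGR input image
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)       # convert to HSV space

lower = np.array([35, 60, 60])                     # hypothetical lower H, S, V
upper = np.array([85, 255, 255])                   # hypothetical upper H, S, V
mask = cv2.inRange(hsv, lower, upper)              # binary mask of the region

highlighted = cv2.bitwise_and(frame, frame, mask=mask)
cv2.imwrite("segmented_region.png", highlighted)
```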
Aging and solid shape recognition: Vision and haptics.
Norman, J Farley; Cheeseman, Jacob R; Adkins, Olivia C; Cox, Andrea G; Rogers, Connor E; Dowell, Catherine J; Baxter, Michael W; Norman, Hideko F; Reyes, Cecia M
2015-10-01
The ability of 114 younger and older adults to recognize naturally-shaped objects was evaluated in three experiments. The participants viewed or haptically explored six randomly-chosen bell peppers (Capsicum annuum) in a study session and were later required to judge whether each of twelve bell peppers was "old" (previously presented during the study session) or "new" (not presented during the study session). When recognition memory was tested immediately after study, the younger adults' (Experiment 1) performance for vision and haptics was identical when the individual study objects were presented once. Vision became superior to haptics, however, when the individual study objects were presented multiple times. When 10- and 20-min delays (Experiment 2) were inserted in between study and test sessions, no significant differences occurred between vision and haptics: recognition performance in both modalities was comparable. When the recognition performance of older adults was evaluated (Experiment 3), a negative effect of age was found for visual shape recognition (younger adults' overall recognition performance was 60% higher). There was no age effect, however, for haptic shape recognition. The results of the present experiments indicate that the visual recognition of natural object shape is different from haptic recognition in multiple ways: visual shape recognition can be superior to that of haptics and is affected by aging, while haptic shape recognition is less accurate and unaffected by aging.
Validation of the PASSPORT V2 training environment for arthroscopic skills.
Stunt, J J; Kerkhoffs, G M M J; Horeman, T; van Dijk, C N; Tuijthof, G J M
2016-06-01
Virtual reality simulators used in the education of orthopaedic residents often lack realistic haptic feedback. To address this, the PASSPORT (Practice Arthroscopic Surgical Skills for Perfect Operative Real-life Treatment) simulator was developed and subjected to fundamental changes: improved realism and an improved user interface. The purpose of this study was to demonstrate its face and construct validity. Thirty-one participants were divided into three groups with different levels of arthroscopic experience. Participants answered questions regarding general information and the outer appearance of the simulator for face validity. Construct validity was assessed with one standardized, timed navigation task. Face validity, educational value, and user-friendliness were determined with two representative exercises and a questionnaire, with a score of 7 or greater considered sufficient. Construct validity was demonstrated between experts and novices. Median task time for the fifth trial was 55 s (range 17-139 s) for the novices, 33 s (range 17-59 s) for the intermediates, and 26 s (range 14-52 s) for the experts. Median task times did not differ significantly between novices and intermediates on three of the trials, and did not differ between intermediates and experts on any trial. Face validity, educational value, and user-friendliness were perceived as sufficient (median >7). The presence of realistic tactile feedback was considered the biggest asset of the simulator. Proper preparation for arthroscopic operations will increase the quality of real-life surgery and patient safety. The PASSPORT simulator can assist in achieving this, as it showed construct and face validity, and its physical nature offered adequate haptic feedback during training. This indicates that PASSPORT has the potential to evolve into a valuable training modality.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sasser, D.W.
1978-03-01
EASI (Estimate of Adversary Sequence Interruption) is an analytical technique for measuring the effectiveness of physical protection systems. EASI Graphics is a computer graphics extension of EASI which provides a capability for performing sensitivity and trade-off analyses of the parameters of a physical protection system. This document reports on the implementation of EASI Graphics and illustrates its application with some examples.
Haptic Paddle Enhancements and a Formal Assessment of Student Learning in System Dynamics
ERIC Educational Resources Information Center
Gorlewicz, Jenna L.; Kratchman, Louis B.; Webster, Robert J., III
2014-01-01
The haptic paddle is a force-feedback joystick used at several universities in teaching System Dynamics, a core mechanical engineering undergraduate course where students learn to model dynamic systems in several domains. A second goal of the haptic paddle is to increase the accessibility of robotics and haptics by providing a low-cost device for…
Giordano, Bruno L; Visell, Yon; Yao, Hsin-Yun; Hayward, Vincent; Cooperstock, Jeremy R; McAdams, Stephen
2012-05-01
Locomotion generates multisensory information about walked-upon objects. How perceptual systems use such information to get to know the environment remains unexplored. The ability to identify solid (e.g., marble) and aggregate (e.g., gravel) walked-upon materials was investigated in auditory, haptic or audio-haptic conditions, and in a kinesthetic condition where tactile information was perturbed with a vibromechanical noise. Overall, identification performance was better than chance in all experimental conditions and for both solids and the better identified aggregates. Despite large mechanical differences between the response of solids and aggregates to locomotion, for both material categories discrimination was at its worst in the auditory and kinesthetic conditions and at its best in the haptic and audio-haptic conditions. An analysis of the dominance of sensory information in the audio-haptic context supported a focus on the most accurate modality, haptics, but only for the identification of solid materials. When identifying aggregates, response biases appeared to produce a focus on the least accurate modality--kinesthesia. When walking on loose materials such as gravel, individuals do not perceive surfaces by focusing on the most accurate modality, but by focusing on the modality that would most promptly signal postural instabilities.
Role of combined tactile and kinesthetic feedback in minimally invasive surgery.
Lim, Soo-Chul; Lee, Hyung-Kew; Park, Joonah
2014-10-18
Haptic feedback is of critical importance in surgical tasks. However, conventional surgical robots do not provide haptic feedback to surgeons during surgery. Thus, in this study, a combined tactile and kinesthetic feedback system was developed to provide haptic feedback to surgeons during robotic surgery. To assess haptic feasibility, the effects of two types of haptic feedback were examined empirically - kinesthetic and tactile feedback - to measure object-pulling force with a telesurgery robotics system at two desired pulling forces (1 N and 2 N). Participants answered a set of questionnaires after experiments. The experimental results reveal reductions in force error (39.1% and 40.9%) when using haptic feedback during 1 N and 2 N pulling tasks. Moreover, survey analyses show the effectiveness of the haptic feedback during teleoperation. The combined tactile and kinesthetic feedback of the master device in robotic surgery improves the surgeon's ability to control the interaction force applied to the tissue.
Study on development of active-passive rehabilitation system for upper limbs: Hybrid-PLEMO
NASA Astrophysics Data System (ADS)
Kikuchi, T.; Jin, Y.; Fukushima, K.; Akai, H.; Furusho, J.
2009-02-01
In recent years, many researchers have studied the potential of using robotics technology to assist and quantify motor function in neuro-rehabilitation. Several kinds of haptic devices have been developed and their efficiency evaluated in clinical tests, for example, upper limb training for patients with spasticity after stroke. Active-type (motor-driven) haptic devices can realize a wide variety of haptic sensations, but they fundamentally require a high-cost safety system. On the other hand, passive-type (brake-based) haptic devices are inherently safe but strongly limited in the variety of haptics they can present. There is not yet sufficient evidence to clarify how passive and active haptics affect the rehabilitation of motor skills. To address these problems, we developed an active-passive-switchable rehabilitation system with an ER clutch/brake device, named "Hybrid-PLEMO". In this paper, the basic structure and haptic control methods of the Hybrid-PLEMO are described.
Advanced display object selection methods for enhancing user-computer productivity
NASA Technical Reports Server (NTRS)
Osga, Glenn A.
1993-01-01
The User-Interface Technology Branch at NCCOSC RDT&E Division has been conducting a series of studies to address the suitability of commercial off-the-shelf (COTS) graphical user-interface (GUI) methods for efficiency and performance in critical naval combat systems. This paper presents an advanced selection algorithm and method developed to increase user performance when making selections on tactical displays. The method has also been applied with considerable success to a variety of cursor and pointing tasks. Typical GUIs allow user selection by (1) moving a cursor with a pointing device such as a mouse, trackball, joystick, or touchscreen, and (2) placing the cursor on the object. Examples of GUI objects are the buttons, icons, folders, and scroll bars used in many personal computer and workstation applications. This paper presents an improved method of selection and the theoretical basis for the significant performance gains achieved with the various input devices tested. The method is applicable to all GUI styles and display sizes, and is particularly useful for selections on small screens such as notebook computers. Considering the amount of work-hours spent pointing and clicking across all styles of available graphical user interfaces, the cost/benefit of applying this method is substantial, with the potential for increasing productivity across thousands of users and applications.
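One simple member of the family of assisted-selection techniques alluded to above is nearest-target capture: the selection snaps to the closest object within a capture radius, enlarging each object's effective hit area. The sketch below is a generic illustration under that assumption, not the branch's published algorithm.

```python
import math

def select(cursor, targets, capture_radius=24.0):
    """Return the target nearest the cursor within the capture radius."""
    best, best_d = None, capture_radius
    for t in targets:
        d = math.hypot(cursor[0] - t["x"], cursor[1] - t["y"])
        if d < best_d:
            best, best_d = t, d
    return best

# Hypothetical tactical-display objects and a cursor position in pixels:
targets = [{"name": "track-42", "x": 100, "y": 80},
           {"name": "track-17", "x": 130, "y": 95}]
print(select((118, 90), targets))   # nearest object within 24 px is chosen
```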
Interactive Design and the Mythical "Intuitive User Interface."
ERIC Educational Resources Information Center
Bielenberg, Daniel R.
1993-01-01
Discusses the design of graphical user interfaces. Highlights include conceptual models, including user needs, content, and what multimedia can do; and tools for building the users' mental models, including metaphor, natural mappings, prompts, feedback, and user testing. (LRW)
Mastoidectomy simulation with combined visual and haptic feedback.
Agus, Marco; Giachetti, Andrea; Gobbetti, Enrico; Zanetti, Gianluigi; Zorcolo, Antonio; John, Nigel W; Stone, Robert J
2002-01-01
Mastoidectomy is one of the most common surgical procedures relating to the petrous bone. In this paper we describe our preliminary results in the realization of a virtual reality mastoidectomy simulator. Our system is designed to work on patient-specific volumetric object models directly derived from 3D CT and MRI images. The paper summarizes the detailed task analysis performed in order to define the system requirements, introduces the architecture of the prototype simulator, and discusses the initial feedback received from selected end users.
AnthropMMD: An R package with a graphical user interface for the mean measure of divergence.
Santos, Frédéric
2018-01-01
The mean measure of divergence is a dissimilarity measure between groups of individuals described by dichotomous variables. It is well suited to datasets with many missing values, and it is generally used to compute distance matrices and represent phenograms. Although often used in biological anthropology and archaeozoology, this method suffers from a lack of implementation in common statistical software. A package for the R statistical software, AnthropMMD, is presented here. Offering a dynamic graphical user interface, it is the first one dedicated to Smith's mean measure of divergence. The package also provides facilities for graphical representations and the crucial step of trait selection, so that the entire analysis can be performed through the graphical user interface. Its use is demonstrated using an artificial dataset, and the impact of trait selection is discussed. Finally, AnthropMMD is compared to three other free tools available for calculating the mean measure of divergence, and is proven to be consistent with them. © 2017 Wiley Periodicals, Inc.
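For readers unfamiliar with the statistic, a minimal sketch of one common formulation of Smith's MMD (the Freeman-Tukey angular transformation with a small-sample correction) follows. AnthropMMD offers several variants and handles missing data, so treat this as illustrative rather than the package's exact default:

```python
# Hedged sketch of Smith's mean measure of divergence between two groups
# described by dichotomous traits, using the Freeman-Tukey transformation.
import math

def theta_ft(k: int, n: int) -> float:
    """Freeman-Tukey angular transform of a trait frequency k/n."""
    return 0.5 * (math.asin(1 - 2 * k / (n + 1))
                  + math.asin(1 - 2 * (k + 1) / (n + 1)))

def mmd(counts1, n1, counts2, n2) -> float:
    """counts*: per-trait positive counts; n*: group sizes (no missing data)."""
    total = 0.0
    for k1, k2 in zip(counts1, counts2):
        d = theta_ft(k1, n1) - theta_ft(k2, n2)
        total += d * d - (1 / (n1 + 0.5) + 1 / (n2 + 0.5))
    return total / len(counts1)

print(round(mmd([10, 4, 18], 40, [22, 9, 30], 55), 4))
```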
Graphical explanation in an expert system for Space Station Freedom rack integration
NASA Technical Reports Server (NTRS)
Craig, F. G.; Cutts, D. E.; Fennel, T. R.; Purves, B.
1990-01-01
The rationale and methodology used to incorporate graphics into explanations provided by an expert system for Space Station Freedom rack integration are examined. The rack integration task is typical of a class of constraint satisfaction problems for large programs where expertise from several areas is required. Graphically oriented approaches are used to explain the conclusions made by the system, the knowledge base content, and, at more abstract levels, the control strategies employed by the system. The implemented architecture combines hypermedia and inference engine capabilities. The advantages of this architecture include: closer integration of user interface, explanation system, and knowledge base; the ability to embed links to deeper knowledge underlying the compiled knowledge used in the knowledge base; and more direct control of explanation depth and duration by the user. The graphical techniques employed range from simple static presentation of schematics to dynamic creation of a series of pictures presented motion-picture style. User models control the type, amount, and order of information presented.
Towards a Taxonomy of Metaphorical Graphical User Interfaces: Demands and Implementations.
ERIC Educational Resources Information Center
Cates, Ward Mitchell
The graphical user interface (GUI) has become something of a standard for instructional programs in recent years. One type of GUI is the metaphorical type. For example, the Macintosh GUI is based on the "desktop" metaphor, in which the objects one manipulates within the GUI are implied to be objects one might find on a real office desktop.…
A Monthly Water-Balance Model Driven By a Graphical User Interface
McCabe, Gregory J.; Markstrom, Steven L.
2007-01-01
This report describes a monthly water-balance model driven by a graphical user interface, referred to as the Thornthwaite monthly water-balance program. Computations of monthly water-balance components of the hydrologic cycle are made for a specified location. The program can be used as a research tool, an assessment tool, and a tool for classroom instruction.
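As a sketch of the Thornthwaite core of such a model, the following computes monthly potential evapotranspiration (PET) from mean temperatures and runs a one-bucket soil store. The USGS program additionally handles snow, direct runoff, and daylength corrections, which are omitted here for brevity:

```python
# Hedged sketch: Thornthwaite PET plus a single soil-moisture bucket.

def thornthwaite_pet(monthly_temp_c):
    """Return PET (mm/month) for 12 mean monthly temperatures (deg C)."""
    heat_index = sum((t / 5.0) ** 1.514 for t in monthly_temp_c if t > 0)
    a = (6.75e-7 * heat_index**3 - 7.71e-5 * heat_index**2
         + 1.792e-2 * heat_index + 0.49239)
    return [0.0 if t <= 0 else 16.0 * (10.0 * t / heat_index) ** a
            for t in monthly_temp_c]

def water_balance(precip_mm, temp_c, capacity_mm=150.0):
    """Yield (soil storage, surplus/runoff) in mm for each month."""
    storage = capacity_mm
    for p, pet in zip(precip_mm, thornthwaite_pet(temp_c)):
        storage = storage + p - pet            # recharge minus demand
        surplus = max(0.0, storage - capacity_mm)
        storage = min(max(storage, 0.0), capacity_mm)
        yield round(storage, 1), round(surplus, 1)

temps = [2, 4, 8, 13, 17, 21, 23, 22, 18, 12, 6, 3]
print(list(water_balance([60.0] * 12, temps))[:3])
```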
Integrating Commercial Off-The-Shelf (COTS) graphics and extended memory packages with CLIPS
NASA Technical Reports Server (NTRS)
Callegari, Andres C.
1990-01-01
This paper addresses the question of how to combine CLIPS with graphics and how to overcome the PC's memory limitations by using the extended memory available in the computer. By adding graphics and extended-memory capabilities, CLIPS can be converted into a complete and powerful system development tool on the most economical and popular computer platform. New models of PCs have processing capabilities and graphics resolutions that cannot be ignored and should be used to the fullest. CLIPS is a powerful expert system development tool, but it cannot be complete without the support of a graphics package for creating user interfaces and general-purpose graphics, or without enough memory to handle large knowledge bases. A well-known limitation of the PC is its real-memory model, which restricts CLIPS to only 640 KB; that problem can now be solved by developing a version of CLIPS that uses extended memory. The user then has access to up to 16 MB of memory on 80286-based computers and practically all the available memory (4 GB) on computers that use the 80386 processor. If CLIPS is given a self-configuring graphics package that automatically detects the graphics hardware and pointing device present in the computer, together with access to the computer's extended memory (with no special hardware needed), users will be able to create more powerful systems at a fraction of the cost on the most popular, portable, and economical platform available: the PC.
NASA Astrophysics Data System (ADS)
Huang, Shih-Chieh Douglas
In this dissertation, I investigate the effects of a grounded learning experience on college students' mental models of physics systems. The grounded learning experience consisted of a priming stage and an instruction stage, and within each stage, one of two different types of visuo-haptic representation was applied: visuo-gestural simulation (visual modality and gestures) and visuo-haptic simulation (visual modality, gestures, and somatosensory information). A pilot study involving N = 23 college students examined how using different types of visuo-haptic representation in instruction affected people's mental model construction for physics systems. Participants' abilities to construct mental models were operationalized through their pretest-to-posttest gain scores for a basic physics system and their performance on a transfer task involving an advanced physics system. Findings from this pilot study revealed that, while both simulations significantly improved participants' mental model construction for physics systems, visuo-haptic simulation was significantly better than visuo-gestural simulation. In addition, clinical interviews suggested that participants' mental model construction for physics systems benefited from receiving visuo-haptic simulation in a tutorial prior to the instruction stage. A dissertation study involving N = 96 college students examined how types of visuo-haptic representation in different applications support participants' mental model construction for physics systems. Participants' abilities to construct mental models were again operationalized through their pretest-to-posttest gain scores for a basic physics system and their performance on a transfer task involving an advanced physics system. Participants' physics misconceptions were also measured before and after the grounded learning experience. Findings from this dissertation study not only revealed that visuo-haptic simulation was significantly more effective in promoting mental model construction and remedying participants' physics misconceptions than visuo-gestural simulation, but also that visuo-haptic simulation was more effective during the priming stage than during the instruction stage. Interestingly, the effects of visuo-haptic simulation in priming and visuo-haptic simulation in instruction on participants' pretest-to-posttest gain scores for a basic physics system appeared additive. These results suggest that visuo-haptic simulation is effective in physics learning, especially when it is used during the priming stage.
Optimization Techniques for 3D Graphics Deployment on Mobile Devices
NASA Astrophysics Data System (ADS)
Koskela, Timo; Vatjus-Anttila, Jarkko
2015-03-01
3D Internet technologies are becoming essential enablers in many application areas including games, education, collaboration, navigation and social networking. The use of 3D Internet applications with mobile devices provides location-independent access and a richer use context, but also introduces performance issues. Therefore, one of the important challenges facing 3D Internet applications is the deployment of 3D graphics on mobile devices. In this article, we present an extensive survey on optimization techniques for 3D graphics deployment on mobile devices and qualitatively analyze the applicability of each technique from the standpoints of visual quality, performance and energy consumption. The analysis focuses on optimization techniques related to data-driven 3D graphics deployment, because it supports off-line use, multi-user interaction, user-created 3D graphics and creation of arbitrary 3D graphics. The outcome of the analysis facilitates the development and deployment of 3D Internet applications on mobile devices and provides guidelines for future research.
Takahashi, Chie; Watt, Simon J.
2014-01-01
When we hold an object while looking at it, estimates from visual and haptic cues to size are combined in a statistically optimal fashion, whereby the “weight” given to each signal reflects their relative reliabilities. This allows object properties to be estimated more precisely than would otherwise be possible. Tools such as pliers and tongs systematically perturb the mapping between object size and the hand opening. This could complicate visual-haptic integration because it may alter the reliability of the haptic signal, thereby disrupting the determination of appropriate signal weights. To investigate this we first measured the reliability of haptic size estimates made with virtual pliers-like tools (created using a stereoscopic display and force-feedback robots) with different “gains” between hand opening and object size. Haptic reliability in tool use was straightforwardly determined by a combination of sensitivity to changes in hand opening and the effects of tool geometry. The precise pattern of sensitivity to hand opening, which violated Weber's law, meant that haptic reliability changed with tool gain. We then examined whether the visuo-motor system accounts for these reliability changes. We measured the weight given to visual and haptic stimuli when both were available, again with different tool gains, by measuring the perceived size of stimuli in which visual and haptic sizes were varied independently. The weight given to each sensory cue changed with tool gain in a manner that closely resembled the predictions of optimal sensory integration. The results are consistent with the idea that different tool geometries are modeled by the brain, allowing it to calculate not only the distal properties of objects felt with tools, but also the certainty with which those properties are known. These findings highlight the flexibility of human sensory integration and tool-use, and potentially provide an approach for optimizing the design of visual-haptic devices. PMID:24592245
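The optimal-integration model being tested has a compact form: each cue is weighted by its relative reliability (inverse variance), and the tool gain matters only through its effect on haptic reliability. A minimal sketch with illustrative numbers:

```python
# Minimal sketch of reliability-weighted (statistically optimal) integration
# of visual and haptic size estimates, as in the model the study tests.

def integrate(size_v, var_v, size_h, var_h):
    """Return (combined estimate, combined variance) for two Gaussian cues."""
    r_v, r_h = 1.0 / var_v, 1.0 / var_h      # reliability = inverse variance
    w_v = r_v / (r_v + r_h)                  # weight tracks relative reliability
    return w_v * size_v + (1 - w_v) * size_h, 1.0 / (r_v + r_h)

# A high-gain tool that degrades haptic reliability shifts the combined
# estimate toward vision (all numbers are illustrative only).
print(integrate(size_v=50.0, var_v=4.0, size_h=54.0, var_h=16.0))
```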
van der Meijden, O A J; Schijven, M P
2009-06-01
Virtual reality (VR) as a surgical training tool has become a state-of-the-art technique in training and teaching skills for minimally invasive surgery (MIS). Although intuitively appealing, the true benefits of haptic (VR training) platforms are unknown. Many questions about haptic feedback in the different areas of surgical skills (training) need to be answered before adding costly haptic feedback to VR simulation for MIS training. This study was designed to review the current status and value of haptic feedback in conventional and robot-assisted MIS and training by using virtual reality simulation. A systematic review of the literature was undertaken using PubMed and MEDLINE. The following search terms were used: Haptic feedback OR Haptics OR Force feedback AND/OR Minimal Invasive Surgery AND/OR Minimal Access Surgery AND/OR Robotics AND/OR Robotic Surgery AND/OR Endoscopic Surgery AND/OR Virtual Reality AND/OR Simulation OR Surgical Training/Education. The results were assessed according to level of evidence as reflected by the Oxford Centre of Evidence-based Medicine Levels of Evidence. In the current literature, no firm consensus exists on the importance of haptic feedback in performing minimally invasive surgery. Although the majority of the results show positive assessment of the benefits of force feedback, results are ambivalent and not unanimous on the subject. Benefits are least disputed when related to surgery using robotics, because there is no haptic feedback in currently used robotic systems. The addition of haptics is believed to reduce surgical errors resulting from its absence, especially in knot tying. Little research has been performed in the area of robot-assisted endoscopic surgical training, but results seem promising. Concerning VR training, results indicate that haptic feedback is important during the early phase of psychomotor skill acquisition.
ModelMate - A graphical user interface for model analysis
Banta, Edward R.
2011-01-01
ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.
Hedayat, Isabel; Moraes, Renato; Lanovaz, Joel L; Oates, Alison R
2017-06-01
There are different ways to add haptic input during walking, which may affect walking balance. This study compared the use of two different haptic tools (a rigid railing and haptic anchors) and investigated whether any effects on walking were the result of the added sensory input and/or the posture generated when using those tools. Data from 28 young healthy adults were collected using the Mobility Lab inertial sensor system (APDM, Oregon, USA). Participants walked with and without both haptic tools and while pretending to use both haptic tools (placebo trials), with eyes opened and eyes closed. Using the tools or pretending to use them decreased normalized stride velocity (p < .001-.008) and peak medial-lateral (ML) trunk velocity (p < .001-.001). Normalized stride velocity was slower when actually using the railing compared to placebo railing trials (p = .006). Using the anchors resulted in lower peak ML trunk velocity than the railing (p = .002). The anchors had lower peak ML trunk velocity than placebo anchors (p < .001), but there was no difference between railing and placebo railing (p > .999). These findings highlight a difference in the type of tool used to add haptic input: they suggest that the changes in balance control strategy when using the railing are driven by arm placement, whereas with the haptic anchors it is the posture combined with the added sensory input that affects balance control. These findings provide a strong framework for additional research on the effects of haptic input on walking in populations known to have decreased walking balance.
Haptic device development based on electro static force of cellulose electro active paper
NASA Astrophysics Data System (ADS)
Yun, Gyu-young; Kim, Sang-Youn; Jang, Sang-Dong; Kim, Dong-Gu; Kim, Jaehwan
2011-04-01
Haptic devices are well suited to demanding virtual reality applications such as medical equipment, mobile devices, and online marketing. Many haptic device concepts have been proposed to meet these industrial demands. Cellulose has received much attention as an emerging smart material, named electro-active paper (EAPap). EAPap is attractive for mobile haptic devices due to its unique characteristics: low actuation power, suitability for thin devices, and transparency. In this paper, we suggest a new concept of haptic actuator using cellulose EAPap and evaluate its performance under various actuation conditions. As a result, the cellulose electrostatic-force actuator shows a large output displacement and fast response, making it suitable for mobile haptic devices.
Detection thresholds for small haptic effects
NASA Astrophysics Data System (ADS)
Dosher, Jesse A.; Hannaford, Blake
2002-02-01
We are interested in finding out whether or not haptic interfaces will be useful in portable and handheld devices. Such systems will have severe constraints on force output. Our first step is to investigate the lower limits at which haptic effects can be perceived. In this paper we report on experiments studying the effects of varying the amplitude, size, shape, and pulse duration of a haptic feature. Using a specific haptic device, we measure the smallest detectable haptic effects, with active exploration of saw-tooth-shaped icons sized 3, 4, and 5 mm, a sine-shaped icon 5 mm wide, and static pulses 50, 100, and 150 ms in width. Smooth-shaped icons resulted in a detection threshold of approximately 55 mN, almost twice that of saw-tooth-shaped icons, which had a threshold of 31 mN.
Enhancing audiovisual experience with haptic feedback: a survey on HAV.
Danieau, F; Lecuyer, A; Guillotel, P; Fleureau, J; Mollet, N; Christie, M
2013-01-01
Haptic technology has been widely employed in applications ranging from teleoperation and medical simulation to art and design, including entertainment, flight simulation, and virtual reality. Today there is a growing interest among researchers in integrating haptic feedback into audiovisual systems. A new medium emerges from this effort: haptic-audiovisual (HAV) content. This paper presents the techniques, formalisms, and key results pertinent to this medium. We first review the three main stages of the HAV workflow: the production, distribution, and rendering of haptic effects. We then highlight the pressing necessity for evaluation techniques in this context and discuss the key challenges in the field. By building on existing technologies and tackling the specific challenges of the enhancement of audiovisual experience with haptics, we believe the field presents exciting research perspectives whose financial and societal stakes are significant.
Investigating Students' Ideas About Buoyancy and the Influence of Haptic Feedback
NASA Astrophysics Data System (ADS)
Minogue, James; Borland, David
2016-04-01
While haptics (simulated touch) represents a potential breakthrough technology for science teaching and learning, there is relatively little research into its differential impact in the context of teaching and learning. This paper describes the testing of a haptically enhanced simulation (HES) for learning about buoyancy. Despite a lifetime of everyday experiences, a scientifically sound explanation of buoyancy remains difficult for many to construct. It requires the integration of domain-specific knowledge regarding density, fluid, force, gravity, mass, weight, and buoyancy. Prior studies suggest that novices often focus on only one dimension of the sinking and floating phenomenon. Our HES was designed to promote the integration of the subconcepts of density and buoyant forces and stresses the relationship between the object itself and the surrounding fluid. The study employed a randomized pretest-posttest control-group research design and a suite of measures, including an open-ended prompt and objective content questions, to provide insights into the influence of haptic feedback on undergraduate students' thinking about buoyancy. A convenience sample (n = 40) was drawn from a university's population of undergraduate elementary education majors. Two groups were formed: haptic feedback (n = 22) and no haptic feedback (n = 18). Through content analysis, discernible differences were seen in the posttest explanations of sinking and floating across treatment groups. Learners who experienced the haptic feedback made more frequent use of "haptically grounded" terms (e.g., mass, gravity, buoyant force, pushing), leading us to begin to build a local theory of language-mediated haptic cognition.
Purpura, Giulia; Cioni, Giovanni; Tinelli, Francesca
2018-07-01
Object recognition is a long and complex adaptive process and its full maturation requires combination of many different sensory experiences as well as cognitive abilities to manipulate previous experiences in order to develop new percepts and subsequently to learn from the environment. It is well recognized that the transfer of visual and haptic information facilitates object recognition in adults, but less is known about development of this ability. In this study, we explored the developmental course of object recognition capacity in children using unimodal visual information, unimodal haptic information, and visuo-haptic information transfer in children from 4 years to 10 years and 11 months of age. Participants were tested through a clinical protocol, involving visual exploration of black-and-white photographs of common objects, haptic exploration of real objects, and visuo-haptic transfer of these two types of information. Results show an age-dependent development of object recognition abilities for visual, haptic, and visuo-haptic modalities. A significant effect of time on development of unimodal and crossmodal recognition skills was found. Moreover, our data suggest that multisensory processes for common object recognition are active at 4 years of age. They facilitate recognition of common objects, and, although not fully mature, are significant in adaptive behavior from the first years of age. The study of typical development of visuo-haptic processes in childhood is a starting point for future studies regarding object recognition in impaired populations.
Lee Masson, Haemy; Bulthé, Jessica; Op de Beeck, Hans P; Wallraven, Christian
2016-08-01
Humans are highly adept at multisensory processing of object shape in both vision and touch. Previous studies have mostly focused on where visually perceived object-shape information can be decoded, with haptic shape processing receiving less attention. Here, we investigate visuo-haptic shape processing in the human brain using multivoxel correlation analyses. Importantly, we use tangible, parametrically defined novel objects as stimuli. Two groups of participants first performed either a visual or haptic similarity-judgment task. The resulting perceptual object-shape spaces were highly similar and matched the physical parameter space. In a subsequent fMRI experiment, objects were first compared within the learned modality and then in the other modality in a one-back task. When correlating neural similarity spaces with perceptual spaces, visually perceived shape was decoded well in the occipital lobe along with the ventral pathway, whereas haptically perceived shape information was mainly found in the parietal lobe, including frontal cortex. Interestingly, ventrolateral occipito-temporal cortex decoded shape in both modalities, highlighting this as an area capable of detailed visuo-haptic shape processing. Finally, we found haptic shape representations in early visual cortex (in the absence of visual input), when participants switched from visual to haptic exploration, suggesting top-down involvement of visual imagery on haptic shape processing. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Plots, Calculations and Graphics Tools (PCG2). Software Transfer Request Presentation
NASA Technical Reports Server (NTRS)
Richardson, Marilou R.
2010-01-01
This slide presentation reviews the development of the Plots, Calculations and Graphics Tools (PCG2) system. PCG2 is an easy to use tool that provides a single user interface to view data in a pictorial, tabular or graphical format. It allows the user to view the same display and data in the Control Room, engineering office area, or remote sites. PCG2 supports extensive and regular engineering needs that are both planned and unplanned and it supports the ability to compare, contrast and perform ad hoc data mining over the entire domain of a program's test data.
Engineering computer graphics in gas turbine engine design, analysis and manufacture
NASA Technical Reports Server (NTRS)
Lopatka, R. S.
1975-01-01
A time-sharing and computer graphics facility designed to provide effective interactive tools to a large number of engineering users with varied requirements is described. The application of computer graphics displays at several levels of hardware complexity and capability is discussed, with examples of graphics systems tracing gas turbine product development from preliminary design through manufacture. Highlights of an operating system stylized for interactive engineering graphics are described.
Analog-to-digital clinical data collection on networked workstations with graphic user interface.
Lunt, D
1991-02-01
An innovative respiratory examination system has been developed that combines physiological response measurement, real-time graphic displays, user-driven operating sequences, and networked file archiving and review into a scientific research and clinical diagnosis tool. This newly constructed computer network is being used to enhance the research center's ability to perform patient pulmonary function examinations. Respiratory data are simultaneously acquired and graphically presented during patient breathing maneuvers and rapidly transformed into graphic and numeric reports, suitable for statistical analysis or database access. The environment consists of the hardware (Macintosh computer, MacADIOS converters, analog amplifiers), the software (HyperCard v2.0, HyperTalk, XCMDs), and the network (AppleTalk, fileservers, printers) as building blocks for data acquisition, analysis, editing, and storage. System operation modules include: Calibration, Examination, Reports, On-line Help Library, Graphic/Data Editing, and Network Storage.
Graphical Language for Data Processing
NASA Technical Reports Server (NTRS)
Alphonso, Keith
2011-01-01
A graphical language for processing data allows processing elements to be connected with virtual wires that represent data flows between processing modules. The processing of complex data, such as lidar data, requires many different algorithms to be applied. The purpose of this innovation is to automate the processing of complex data, such as LIDAR, without the need for complex scripting and programming languages. The system consists of a set of user-interface components that allow the user to drag and drop various algorithmic and processing components onto a process graph. By working graphically, the user can completely visualize the process flow and create complex diagrams. This innovation supports the nesting of graphs, such that a graph can be included in another graph as a single step for processing. In addition to the user interface components, the system includes a set of .NET classes that represent the graph internally. These classes provide the internal system representation of the graphical user interface. The system includes a graph execution component that reads the internal representation of the graph (as described above) and executes that graph. The execution of the graph follows the interpreted model of execution in that each node is traversed and executed from the original internal representation. In addition, there are components that allow external code elements, such as algorithms, to be easily integrated into the system, thus making the system infinitely expandable.
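As a sketch of the interpreted execution model described above (traverse each node and run it once its inputs are available), here is a minimal process-graph executor. The class and function names are illustrative, not the NASA system's API:

```python
# Hedged sketch of an interpreted process-graph executor.

class Node:
    def __init__(self, name, func, inputs=()):
        self.name, self.func, self.inputs = name, func, list(inputs)

def execute(nodes):
    """Run a process graph; each node's `inputs` name upstream nodes."""
    results, remaining = {}, list(nodes)
    while remaining:
        ready = [n for n in remaining if all(i in results for i in n.inputs)]
        if not ready:
            raise ValueError("cycle or missing input in process graph")
        for n in ready:
            results[n.name] = n.func(*(results[i] for i in n.inputs))
            remaining.remove(n)
    return results

graph = [Node("load", lambda: [3.0, 1.0, 2.0]),
         Node("sort", sorted, inputs=["load"]),
         Node("report", lambda xs: f"min={xs[0]}", inputs=["sort"])]
print(execute(graph)["report"])   # min=1.0
```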
NASA Technical Reports Server (NTRS)
Szczur, Martha R.
1991-01-01
The Transportable Applications Environment Plus (TAE Plus), developed at the NASA Goddard Space Flight Center, is a portable, what-you-see-is-what-you-get (WYSIWYG) user interface development and management system. Its primary objective is to provide an integrated software environment that allows interactive prototyping and development of graphical user interfaces, as well as management of the user interface within the operational domain. TAE Plus is being applied to many types of applications; this paper discusses what TAE Plus provides, how the implementation utilizes state-of-the-art technologies within graphics workstations, and how it has been used both within and outside NASA.
1988-04-30
[Garbled report-documentation page; only fragments are recoverable.] Keywords: haptic hand, touch, vision, robot, object recognition, categorization. Abstract (fragmentary): ...established that the haptic system has remarkable capabilities for object recognition. We define haptics as purposive touch. The basic tactual system... gathered ratings of the importance of dimensions for categorizing common objects by touch. Texture and hardness ratings strongly co-vary, which is...
MuSim, a Graphical User Interface for Multiple Simulation Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberts, Thomas; Cummings, Mary Anne; Johnson, Rolland
2016-06-01
MuSim is a new user-friendly program designed to interface to many different particle simulation codes, regardless of their data formats or geometry descriptions. It presents the user with a compelling graphical user interface that includes a flexible 3-D view of the simulated world plus powerful editing and drag-and-drop capabilities. All aspects of the design can be parametrized so that parameter scans and optimizations are easy. It is simple to create plots and display events in the 3-D viewer (with a slider to vary the transparency of solids), allowing for an effortless comparison of different simulation codes. Simulation codes: G4beamline, MAD-X, and MCNP; more coming. Many accelerator design tools and beam optics codes were written long ago, with primitive user interfaces by today's standards. MuSim is specifically designed to make it easy to interface to such codes, providing a common user experience for all, and permitting the construction and exploration of models with very little overhead. For today's technology-driven students, graphical interfaces meet their expectations far better than text-based tools, and education in accelerator physics is one of our primary goals.
Seung, Sungmin; Choi, Hongseok; Jang, Jongseong; Kim, Young Soo; Park, Jong-Oh; Park, Sukho; Ko, Seong Young
2017-01-01
This article presents a haptic-guided teleoperation for a tumor removal surgical robotic system, the so-called SIROMAN system. The system was developed in our previous work to make it possible to access tumor tissue, even tissue seated deep inside the brain, and to remove it with full maneuverability. For a safe and accurate operation that removes only tumor tissue completely while minimizing damage to normal tissue, a virtual wall-based haptic guidance together with medical image-guided control is proposed and developed. The virtual wall is extracted from preoperative medical images, and the robot is controlled to restrict its motion within the virtual wall using haptic feedback. Coordinate transformation between sub-systems, a collision detection algorithm, and a haptic-guided teleoperation using a virtual wall are described in the context of using SIROMAN. A series of experiments using a simplified virtual wall are performed to evaluate the performance of virtual wall-based haptic-guided teleoperation. With haptic guidance, the accuracy of the robotic manipulator's trajectory is improved by 57% compared to teleoperation without it. The tissue removal performance is also improved by 21% (p < 0.05). The experiments show that virtual wall-based haptic guidance provides safer and more accurate tissue removal for single-port brain surgery.
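A minimal sketch of the virtual-wall idea, assuming a planar wall and illustrative spring-damper gains (SIROMAN's wall is a patient-specific surface extracted from the images, not a plane):

```python
# Hedged sketch: penalty-based virtual wall. When the tool tip penetrates
# the wall, a restoring spring-damper force pushes it back into the
# allowed workspace. Gains k, b are illustrative only.
import numpy as np

def wall_force(tip, tip_vel, point_on_wall, normal, k=500.0, b=5.0):
    """Return the feedback force (N); `normal` points into the allowed side."""
    n = np.asarray(normal, float)
    n /= np.linalg.norm(n)
    depth = np.dot(np.asarray(point_on_wall, float) - np.asarray(tip, float), n)
    if depth <= 0.0:                     # tip still on the allowed side
        return np.zeros(3)
    v_n = np.dot(tip_vel, n)             # velocity component along the normal
    return (k * depth - b * v_n) * n     # spring-damper penalty force

print(wall_force([0, 0, -0.002], [0, 0, -0.05], [0, 0, 0], [0, 0, 1]))
```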
Haptograph Representation of Real-World Haptic Information by Wideband Force Control
NASA Astrophysics Data System (ADS)
Katsura, Seiichiro; Irie, Kouhei; Ohishi, Kiyoshi
Artificial acquisition and reproduction of human sensations are basic technologies of communication engineering. For example, auditory information is obtained by a microphone, and a speaker reproduces it by artificial means. Furthermore, a video camera and a television make it possible to transmit visual sensation by broadcasting. By contrast, since tactile or haptic information is subject to Newton's "law of action and reaction" in the real world, a device which acquires, transmits, and reproduces this information has not been established. From this point of view, real-world haptics is the key technology for future haptic communication engineering. This paper proposes a novel acquisition method for haptic information named the "haptograph", which visualizes haptic information in the way a photograph visualizes light. The proposed haptograph is applied to haptic recognition of the contact environment. A linear motor contacts the surface of the environment, and its reaction force is used to make a haptograph. Robust contact motion and sensorless sensing of the reaction force are attained by using a disturbance observer. As a result, an encyclopedia of contact environments is attained. Since temporal and spatial analyses are conducted to represent haptic information as the haptograph, the information can be recognized and evaluated intuitively.
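A minimal sketch of the sensorless, disturbance-observer-based force estimation the paper relies on: the external force is inferred from the current command and measured velocity, with no force sensor. The first-order observer below uses illustrative parameters and omits friction/gravity compensation:

```python
# Hedged sketch of a reaction-force observer built on a disturbance observer
# (DOB). At steady state the estimate converges to the external force; in a
# full implementation, modeled friction/gravity would be subtracted as well.

class ReactionForceObserver:
    def __init__(self, kt, mass, g_obs, dt):
        self.kt, self.m, self.g, self.dt = kt, mass, g_obs, dt
        self.x = 0.0                                # low-pass filter state

    def update(self, i_ref, vel):
        """Return the estimated external force (N) for this sample."""
        u = self.kt * i_ref + self.g * self.m * vel
        self.x += self.dt * self.g * (u - self.x)   # first-order LPF
        return self.x - self.g * self.m * vel       # ~ LPF(F_thrust - m*a)

obs = ReactionForceObserver(kt=20.0, mass=0.5, g_obs=300.0, dt=1e-4)
for _ in range(5000):                   # constant push against a blocked motor
    f_ext = obs.update(i_ref=0.1, vel=0.0)
print(round(f_ext, 3))                  # converges to kt * i_ref = 2.0 N
```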
Make Movies out of Your Dynamical Simulations with OGRE!
NASA Astrophysics Data System (ADS)
Tamayo, Daniel; Douglas, R. W.; Ge, H. W.; Burns, J. A.
2013-10-01
We have developed OGRE, the Orbital GRaphics Environment, an open-source project comprising a graphical user interface that allows the user to view the output from several dynamical integrators (e.g., SWIFT) that are commonly used for academic work. One can interactively vary the display speed, rotate the view and zoom the camera. This makes OGRE a great tool for students or the general public to explore accurate orbital histories that may display interesting dynamical features, e.g. the destabilization of Solar System orbits under the Nice model, or interacting pairs of exoplanets. Furthermore, OGRE allows the user to choreograph sequences of transformations as the simulation is played to generate movies for use in public talks or professional presentations. The graphical user interface is coded using Qt to ensure portability across different operating systems. OGRE will run on Linux and Mac OS X. The program is available as a self-contained executable, or as source code that the user can compile. We are continually updating the code, and hope that people who find it useful will contribute to the development of new features.
Graphic Interfaces and Online Information.
ERIC Educational Resources Information Center
Percival, J. Mark
1990-01-01
Discusses the growing importance of the use of Graphic User Interfaces (GUIs) with microcomputers and online services. Highlights include the development of graphics interfacing with microcomputers; CD-ROM databases; an evaluation of HyperCard as a potential interface to electronic mail and online commercial databases; and future possibilities.…
Graphic Design in Libraries: A Conceptual Process
ERIC Educational Resources Information Center
Ruiz, Miguel
2014-01-01
Providing successful library services requires efficient and effective communication with users; therefore, it is important that content creators who develop visual materials understand key components of design and, specifically, develop a holistic graphic design process. Graphic design, as a form of visual communication, is the process of…
NASA Technical Reports Server (NTRS)
Lewis, Clayton; Wilde, Nick
1989-01-01
Space construction will require heavy investment in the development of a wide variety of user interfaces for the computer-based tools that will be involved at every stage of construction operations. Using today's technology, user interface development is very expensive for two reasons: (1) specialized and scarce programming skills are required to implement the necessary graphical representations and complex control regimes for high-quality interfaces; (2) iteration on prototypes is required to meet user and task requirements, since these are difficult to anticipate with current (and foreseeable) design knowledge. We are attacking this problem by building a user interface development tool based on extensions to the spreadsheet model of computation. The tool provides high-level support for graphical user interfaces and permits dynamic modification of interfaces, without requiring conventional programming concepts and skills.
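A minimal sketch of the spreadsheet model of computation applied to an interface, with hypothetical names: cells hold values or formulas over other cells, and a user edit propagates to dependent "widget" cells without conventional event-handling code.

```python
# Hedged sketch of a spreadsheet-style reactive UI model (illustrative only;
# not the tool described above). Derived cells recompute on read.

class Sheet:
    def __init__(self):
        self.formulas = {}                 # cell name -> (func, dependencies)

    def set(self, name, func, deps=()):
        self.formulas[name] = (func, deps)

    def value(self, name):
        func, deps = self.formulas[name]
        return func(*(self.value(d) for d in deps))

ui = Sheet()
ui.set("slider", lambda: 0.75)                             # input widget state
ui.set("percent", lambda s: f"{s:.0%}", deps=["slider"])   # derived label
ui.set("bar_width", lambda s: int(200 * s), deps=["slider"])
print(ui.value("percent"), ui.value("bar_width"))          # 75% 150

ui.set("slider", lambda: 0.25)             # user drags the slider...
print(ui.value("percent"))                 # ...dependent cells follow: 25%
```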
Development of a StandAlone Surgical Haptic Arm.
Jones, Daniel; Lewis, Andrew; Fischer, Gregory S
2011-01-01
When performing telesurgery with current commercially available Minimally Invasive Robotic Surgery (MIRS) systems, a surgeon cannot feel the tool interactions that are inherent in traditional laparoscopy. It is proposed that haptic feedback in the control of MIRS systems could improve the speed, safety and learning curve of robotic surgery. To test this hypothesis, a standalone surgical haptic arm (SASHA) capable of manipulating da Vinci tools has been designed and fabricated with the additional ability of providing information for haptic feedback. This arm was developed as a research platform for developing and evaluating approaches to telesurgery, including various haptic mappings between master and slave and evaluating the effects of latency.
Graphical User Interface Development and Design to Support Airport Runway Configuration Management
NASA Technical Reports Server (NTRS)
Jones, Debra G.; Lenox, Michelle; Onal, Emrah; Latorella, Kara A.; Lohr, Gary W.; Le Vie, Lisa
2015-01-01
The objective of this effort was to develop a graphical user interface (GUI) for the National Aeronautics and Space Administration's (NASA) System Oriented Runway Management (SORM) decision support tool to support runway management. This tool is expected to be used by traffic flow managers and supervisors in the Airport Traffic Control Tower (ATCT) and Terminal Radar Approach Control (TRACON) facilities.
Collaborative voxel-based surgical virtual environments.
Acosta, Eric; Muniz, Gilbert; Armonda, Rocco; Bowyer, Mark; Liu, Alan
2008-01-01
Virtual Reality-based surgical simulators can utilize Collaborative Virtual Environments (C-VEs) to provide team-based training. To support real-time interactions, C-VEs are typically replicated on each user's local computer and a synchronization method helps keep all local copies consistent. This approach does not work well for voxel-based C-VEs since large and frequent volumetric updates make synchronization difficult. This paper describes a method that allows multiple users to interact within a voxel-based C-VE for a craniotomy simulator being developed. Our C-VE method requires smaller update sizes and provides faster synchronization update rates than volumetric-based methods. Additionally, we address network bandwidth/latency issues to simulate networked haptic and bone drilling tool interactions with a voxel-based skull C-VE.
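The abstract does not spell out the update encoding; one plausible scheme consistent with "smaller update sizes" is to broadcast compact tool-operation events and have every client apply the same deterministic edit to its local voxel volume, as in this hedged sketch:

```python
# Hedged sketch of event-based synchronization for a voxel C-VE: ship a
# few-byte drill event instead of a volumetric diff. Illustrative only.
import json

def drill_event(user_id, seq, center, radius):
    """Serialize one drilling operation for broadcast to all clients."""
    return json.dumps({"u": user_id, "n": seq, "c": center, "r": radius})

def apply_drill(volume, event):
    """Deterministically remove voxels inside the drill sphere locally."""
    e = json.loads(event)
    cx, cy, cz = e["c"]
    r2 = e["r"] ** 2
    for (x, y, z) in list(volume):
        if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r2:
            volume.discard((x, y, z))

skull = {(x, y, 0) for x in range(5) for y in range(5)}   # toy voxel set
apply_drill(skull, drill_event("surgeon-1", seq=1, center=[2, 2, 0], radius=1))
print(len(skull))   # 20 voxels remain after 5 are drilled away
```

Sequence numbers per user give each client a consistent ordering to replay, which is what keeps the local copies convergent.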
User Interface Technology Transfer to NASA's Virtual Wind Tunnel System
NASA Technical Reports Server (NTRS)
vanDam, Andries
1998-01-01
Funded by NASA grants for four years, the Brown Computer Graphics Group has developed novel 3D user interfaces for desktop and immersive scientific visualization applications. This past grant period supported the design and development of a software library, the 3D Widget Library, which supports the construction and run-time management of 3D widgets. The 3D Widget Library is a mechanism for transferring user interface technology from the Brown Graphics Group to the Virtual Wind Tunnel system at NASA Ames as well as the public domain.
MeV+R: using MeV as a graphical user interface for Bioconductor applications in microarray analysis
Chu, Vu T; Gottardo, Raphael; Raftery, Adrian E; Bumgarner, Roger E; Yeung, Ka Yee
2008-01-01
We present MeV+R, an integration of the JAVA MultiExperiment Viewer program with Bioconductor packages. This integration of MultiExperiment Viewer and R is easily extensible to other R packages and provides users with point and click access to traditionally command line driven tools written in R. We demonstrate the ability to use MultiExperiment Viewer as a graphical user interface for Bioconductor applications in microarray data analysis by incorporating three Bioconductor packages, RAMA, BRIDGE and iterativeBMA. PMID:18652698
Gromita: a fully integrated graphical user interface to gromacs 4.
Sellis, Diamantis; Vlachakis, Dimitrios; Vlassi, Metaxia
2009-09-07
Gromita is a fully integrated and efficient graphical user interface (GUI) to the recently updated molecular dynamics suite Gromacs, version 4. Gromita is a cross-platform, perl/tcl-tk based, interactive front end designed to break the command line barrier and introduce a new user-friendly environment to run molecular dynamics simulations through Gromacs. Our GUI features a novel workflow interface that guides the user through each logical step of the molecular dynamics setup process, making it accessible to both advanced and novice users. This tool provides a seamless interface to the Gromacs package, while providing enhanced functionality by speeding up and simplifying the task of setting up molecular dynamics simulations of biological systems. Gromita can be freely downloaded from http://bio.demokritos.gr/gromita/.
Is There a Chance for a Standardised User Interface?
ERIC Educational Resources Information Center
Fletcher, Liz
1993-01-01
Issues concerning the implementation of standard user interfaces for CD-ROMs are discussed, including differing perceptions of the ideal interface, graphical user interfaces, user needs, and the standard protocols. It is suggested users should be able to select from a variety of user interfaces on each CD-ROM. (EA)
An Interactive Graphics Program for Assistance in Learning Convolution.
ERIC Educational Resources Information Center
Frederick, Dean K.; Waag, Gary L.
1980-01-01
A program has been written for the interactive computer graphics facility at Rensselaer Polytechnic Institute that is designed to assist the user in learning the mathematical technique of convolving two functions. Because convolution can be represented graphically by a sequence of steps involving folding, shifting, multiplying, and integration, it…
Imagining a Stata / Python Combination
NASA Technical Reports Server (NTRS)
Fiedler, James
2012-01-01
There are occasions when a task is difficult in Stata, but fairly easy in a more general programming language. Python is a popular language for a range of uses. It is easy to use, has many high-quality packages, and programs can be written relatively quickly. Is there any advantage in combining Stata and Python within a single interface? Stata already offers support for user-written programs, which allow extensive control over calculations, but somewhat less control over graphics. Also, except for specifying output, the user has minimal programmatic control over the user interface. Python can be used in a way that allows more control over the interface and graphics, and in so doing provides a roundabout method for satisfying some user requests (e.g., transparency levels in graphics and the ability to clear the results window). My talk will explore these ideas, present a possible method for combining Stata and Python, and give examples to demonstrate how this combination might be useful.
Virtual hand: a 3D tactile interface to virtual environments
NASA Astrophysics Data System (ADS)
Rogowitz, Bernice E.; Borrel, Paul
2008-02-01
We introduce a novel system that allows users to experience the sensation of touch in a computer graphics environment. In this system, the user places his/her hand on an array of pins, which is moved about space on a 6 degree-of-freedom robot arm. The surface of the pins defines a surface in the virtual world. This "virtual hand" can move about the virtual world. When the virtual hand encounters an object in the virtual world, the heights of the pins are adjusted so that they represent the object's shape, surface, and texture. A control system integrates pin and robot arm motions to transmit information about objects in the computer graphics world to the user. It also allows the user to edit, change and move the virtual objects, shapes and textures. This system provides a general framework for touching, manipulating, and modifying objects in a 3-D computer graphics environment, which may be useful in a wide range of applications, including computer games, computer aided design systems, and immersive virtual worlds.
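A minimal sketch of the pin-height computation implied by this description, with an illustrative toy surface: each pin is raised to the virtual surface height sampled beneath its world-space position, clamped to its mechanical travel.

```python
# Hedged sketch of pin-array rendering for a "virtual hand" (geometry and
# travel limits are illustrative, not the described system's parameters).
import math

def pin_heights(hand_pose, pins_xy, surface_height, travel_mm=10.0):
    """hand_pose: (x0, y0) offset of the pin array in the virtual world.
    pins_xy: pin positions relative to the array.
    surface_height: f(x, y) -> virtual object height (mm) at that point."""
    x0, y0 = hand_pose
    return [max(0.0, min(travel_mm, surface_height(x0 + px, y0 + py)))
            for px, py in pins_xy]

bump = lambda x, y: 8.0 * math.exp(-(x * x + y * y) / 50.0)   # toy object
grid = [(px, py) for px in range(-2, 3) for py in range(-2, 3)]
print([round(h, 1) for h in pin_heights((0.0, 0.0), grid, bump)][:5])
```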
System and method for creating expert systems
NASA Technical Reports Server (NTRS)
Hughes, Peter M. (Inventor); Luczak, Edward C. (Inventor)
1998-01-01
A system and method provide for the creation of a highly graphical expert system without the need for programming in code. An expert system is created by initially building a data interface, defining appropriate Mission, User-Defined, Inferred, and externally-generated GenSAA (EGG) data variables whose data values will be updated and input into the expert system. Next, the rules of the expert system are created by building the conditions that must be satisfied and then the actions to be executed when the corresponding conditions are satisfied. Finally, an appropriate user interface is built, which can be highly graphical in nature and can include message display and/or modification of the display characteristics of a graphical display object, to visually alert the user of the expert system to changing data values when the conditions of a created rule are satisfied. The data interface building, rule building, and user interface building are done in an efficient manner and can be accomplished without the need for programming in code.
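A minimal sketch of the condition/action rule structure such a builder produces, with hypothetical names (not GenSAA's API): each rule pairs a condition over the data variables with an action on a display object.

```python
# Hedged sketch: condition/action rules driving a graphical display.

def make_rule(condition, action):
    return {"condition": condition, "action": action}

def run_rules(rules, variables, display):
    """Fire every rule whose condition holds for the current variable values."""
    for rule in rules:
        if rule["condition"](variables):
            rule["action"](display)

rules = [make_rule(lambda v: v["battery_temp_c"] > 45,
                   lambda d: d.update(battery_icon="red", blink=True))]

display = {"battery_icon": "green", "blink": False}
run_rules(rules, {"battery_temp_c": 52}, display)
print(display)   # {'battery_icon': 'red', 'blink': True}
```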
Haptic, Virtual Interaction and Motor Imagery: Entertainment Tools and Psychophysiological Testing.
Invitto, Sara; Faggiano, Chiara; Sammarco, Silvia; De Luca, Valerio; De Paolis, Lucio T
2016-03-18
In this work, the perception of affordances was analysed in terms of cognitive neuroscience during an interactive experience in a virtual reality environment. In particular, we chose a virtual reality scenario based on the Leap Motion controller: this sensor device captures the movements of the user's hand and fingers, which are reproduced on a computer screen by the proper software applications. For our experiment, we employed a sample of 10 subjects, matched by age and sex and chosen among university students. The subjects took part in motor imagery training and an immersive affordance condition (virtual training with the Leap Motion and haptic training with real objects). After each training session the subjects performed a recognition task, in order to investigate event-related potential (ERP) components. The results revealed significant differences in the attentional components during the Leap Motion training: latencies increased in the occipital lobes, which subserve visual sensory processing, whereas latencies decreased in the frontal lobe, where the brain is mainly activated for attention and action planning.
Kitada, Ryo; Johnsrude, Ingrid S; Kochiyama, Takanori; Lederman, Susan J
2009-10-01
Humans can recognize common objects by touch extremely well whenever vision is unavailable. Despite its importance to a thorough understanding of human object recognition, the neuroscientific study of this topic has been relatively neglected. To date, the few published studies have addressed the haptic recognition of nonbiological objects. We now focus on haptic recognition of the human body, a particularly salient object category for touch. Neuroimaging studies demonstrate that regions of the occipito-temporal cortex are specialized for visual perception of faces (fusiform face area, FFA) and other body parts (extrastriate body area, EBA). Are the same category-sensitive regions activated when these components of the body are recognized haptically? Here, we use fMRI to compare brain organization for haptic and visual recognition of human body parts. Sixteen subjects identified exemplars of faces, hands, feet, and nonbiological control objects using vision and haptics separately. We identified two discrete regions within the fusiform gyrus (FFA and the haptic face region) that were each sensitive to both haptically and visually presented faces; however, these two regions differed significantly in their response patterns. Similarly, two regions within the lateral occipito-temporal area (EBA and the haptic body region) were each sensitive to body parts in both modalities, although the response patterns differed. Thus, although the fusiform gyrus and the lateral occipito-temporal cortex appear to exhibit modality-independent, category-sensitive activity, our results also indicate a degree of functional specialization related to sensory modality within these structures.
Norman, J Farley; Phillips, Flip; Holmin, Jessica S; Norman, Hideko F; Beers, Amanda M; Boswell, Alexandria M; Cheeseman, Jacob R; Stethen, Angela G; Ronning, Cecilia
2012-10-01
A set of three experiments evaluated 96 participants' ability to visually and haptically discriminate solid object shape. In the past, some researchers have found haptic shape discrimination to be substantially inferior to visual shape discrimination, while other researchers have found haptics and vision to be essentially equivalent. A primary goal of the present study was to understand these discrepant past findings and to determine the true capabilities of the haptic system. All experiments used the same task (same vs. different shape discrimination) and stimulus objects (James Gibson's "feelies" and a set of naturally shaped objects--bell peppers). However, the methodology varied across experiments. Experiment 1 used random 3-dimensional (3-D) orientations of the stimulus objects, and the conditions were full-cue (active manipulation of objects and rotation of the visual objects in depth). Experiment 2 restricted the 3-D orientations of the stimulus objects and limited the haptic and visual information available to the participants. Experiment 3 compared restricted and full-cue conditions using random 3-D orientations. We replicated both previous findings in the current study. When we restricted visual and haptic information (and placed the stimulus objects in the same orientation on every trial), the participants' visual performance was superior to that obtained for haptics (replicating the earlier findings of Davidson et al. in Percept Psychophys 15(3):539-543, 1974). When the circumstances resembled those of ordinary life (e.g., participants able to actively manipulate objects and see them from a variety of perspectives), we found no significant difference between visual and haptic solid shape discrimination.
PC Software graphics tool for conceptual design of space/planetary electrical power systems
NASA Technical Reports Server (NTRS)
Truong, Long V.
1995-01-01
This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.
A study on haptic collaborative game in shared virtual environment
NASA Astrophysics Data System (ADS)
Lu, Keke; Liu, Guanyang; Liu, Lingzhi
2013-03-01
A study of a collaborative game in a shared virtual environment with haptic feedback over computer networks is introduced in this paper. A collaborative task was used in which players located at remote sites played the game together. Unlike in traditional networked multiplayer games, the players receive both visual and haptic feedback in the virtual environment. The experiment was designed with two conditions: visual feedback only, and visual-haptic feedback. The goal of the experiment was to assess the impact of force feedback on collaborative task performance. Results indicate that haptic feedback is beneficial for performance enhancement in a collaborative game in a shared virtual environment. The outcomes of this research can have a powerful impact on networked computer games.
A "virtually minimal" visuo-haptic training of attention in severe traumatic brain injury.
Dvorkin, Assaf Y; Ramaiya, Milan; Larson, Eric B; Zollman, Felise S; Hsu, Nancy; Pacini, Sonia; Shah, Amit; Patton, James L
2013-08-09
Although common during the early stages of recovery from severe traumatic brain injury (TBI), attention deficits have been scarcely investigated. Encouraging evidence suggests beneficial effects of attention training in more chronic and higher functioning patients. Interactive technology may provide new opportunities for rehabilitation in inpatients who are earlier in their recovery. We designed a "virtually minimal" approach using robot-rendered haptics in a virtual environment to train severely injured inpatients in the early stages of recovery to sustain attention to a visuo-motor task. 21 inpatients with severe TBI completed repetitive reaching toward targets that were both seen and felt. Patients were tested over two consecutive days, experiencing 3 conditions (no haptic feedback, a break-through force, and haptic nudge) in 12 successive, 4-minute blocks. The interactive visuo-haptic environments were well-tolerated and engaging. Patients typically remained attentive to the task. However, patients exhibited attention loss both before (prolonged initiation) and during (pauses during motion) a movement. Compared to no haptic feedback, patients benefited from haptic nudge cues but not break-through forces. As training progressed, patients increased the number of targets acquired and spontaneously improved from one day to the next. Interactive visuo-haptic environments could be beneficial for attention training for severe TBI patients in the early stages of recovery and warrants further and more prolonged clinical testing.
NASA Astrophysics Data System (ADS)
Yin, Feilong; Hayashi, Ryuzo; Raksincharoensak, Pongsathorn; Nagai, Masao
This research proposes a haptic velocity guidance assistance system for realizing eco-driving as well as enhancing traffic capacity by cooperating with ITS (Intelligent Transportation Systems). The proposed guidance system generates the desired accelerator pedal (abbreviated as pedal) stroke with respect to the desired velocity obtained from ITS, taking vehicle dynamics into account, and presents the desired pedal stroke to the driver via a haptic pedal whose reaction force is controllable, guiding the driver to trace the desired velocity in real time. The main purpose of this paper is to discuss the feasibility of haptic velocity guidance. A haptic velocity guidance system for research was developed on the Driving Simulator of TUAT (DS) by attaching a low-inertia, low-friction motor to the pedal, which leaves the pedal's original characteristics unchanged when the motor is not operated, and by implementing the desired pedal stroke calculation and the reaction force controller. The haptic guidance maneuver was designed on the basis of human pedal-stepping experiments. A simple velocity profile with acceleration, deceleration, and cruising, synthesized from naturalistic driving, was used to test the proposed system. Experimental results from 9 drivers show that the haptic guidance provides high accuracy and quick response in velocity tracking. These results demonstrate that haptic guidance is a promising velocity guidance method from the viewpoint of HMI (Human Machine Interface).
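The core mapping described above, from desired velocity to desired pedal stroke and from stroke error to a guidance reaction force, can be sketched as follows. The point-mass vehicle model and every numeric parameter are assumptions for illustration, not values identified in the paper.

```python
# Sketch: desired pedal stroke from a desired velocity/acceleration via a
# point-mass longitudinal model, plus a guidance reaction force proportional
# to the stroke error. All parameters are illustrative assumptions.
m     = 1500.0   # vehicle mass, kg (assumed)
c_res = 0.4      # lumped aero/rolling resistance, N/(m/s)^2 (assumed)
k_ped = 3000.0   # driving force per unit pedal stroke, N (assumed)

def desired_stroke(v_des, a_des):
    """Pedal stroke (0..1) balancing inertia plus motion resistance."""
    f_req = m * a_des + c_res * v_des ** 2
    return min(max(f_req / k_ped, 0.0), 1.0)

def guidance_force(stroke_actual, stroke_des, k_guide=20.0):
    """Reaction-force offset nudging the driver's foot toward the target."""
    return k_guide * (stroke_actual - stroke_des)

s = desired_stroke(v_des=20.0, a_des=0.5)   # mild acceleration at 20 m/s
print(f"desired stroke: {s:.2f}, extra pedal reaction: {guidance_force(0.4, s):.2f} N")
```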
Haptic perception accuracy depending on self-produced movement.
Park, Chulwook; Kim, Seonjin
2014-01-01
This study measured whether self-produced movement influences haptic perception ability (Experiment 1) as well as the factors associated with levels of influence (Experiment 2) in racket sports. For Experiment 1, the haptic perception accuracy of five male table tennis experts and five male novices was examined under two conditions (no movement vs. movement). For Experiment 2, the haptic afferent subsystems of five male table tennis experts and five male novices were investigated in the self-produced-movement condition only. Inferential statistics (ANOVA, t-tests) applied to data recorded with custom-made devices (shock and vibration sensor, Qualisys Track Manager) were used to determine haptic perception accuracy (Experiments 1 and 2) and its association with expertise. The results of this research show that expert-level players achieve higher accuracy with less variability (racket vibration and angle) than novice-level players, especially in performances coupled with self-produced movement. The important finding is that, in terms of accuracy, skill-associated differences were enlarged during self-produced movement. To explain the origin of this difference between experts and novices, the functional variability of haptic afferent subsystems can serve as a reference. These two factors (self-produced accuracy and the variability of haptic features) as investigated in this study would be useful criteria for educators in racket sports and suggest a broader hypothesis for further research into the effects of haptic accuracy related to variability.
Mechatronic design of haptic forceps for robotic surgery.
Rizun, P; Gunn, D; Cox, B; Sutherland, G
2006-12-01
Haptic feedback increases operator performance and comfort during telerobotic manipulation. Feedback of grasping pressure is critical in many microsurgical tasks, yet no haptic interface for surgical tools is commercially available. Literature on the psychophysics of touch was reviewed to define the spectrum of human touch perception and the fidelity requirements of an ideal haptic interface. Mechanical design and control literature was reviewed to translate the psychophysical requirements into engineering specifications. High-fidelity haptic forceps were then developed through an iterative process between engineering and surgery. The forceps are a modular device that integrates with a haptic hand controller to add force feedback for tool actuation in telerobotic or virtual surgery. Their overall length is 153 mm and their mass is 125 g. A contact-free voice coil actuator generates force feedback at frequencies up to 800 Hz. Maximum force output is 6 N (2 N continuous) and the force resolution is 4 mN. The forceps employ a contact-free magnetic position sensor as well as micro-machined accelerometers to measure opening/closing acceleration. Position resolution is 0.6 µm with 1.3 µm RMS noise. The forceps can simulate stiffnesses greater than 20 N/mm or impedances smaller than 15 g with no noticeable haptic artifacts or friction. As telerobotic surgery evolves, haptics will play an increasingly important role. Copyright 2006 John Wiley & Sons, Ltd.
Afzal, Muhammad Raheel; Byun, Ha-Young; Oh, Min-Kyun; Yoon, Jungwon
2015-03-13
Haptic control is a useful therapeutic option in rehabilitation featuring virtual reality interaction. As with visual and vibrotactile biofeedback, kinesthetic haptic feedback may assist in postural control and can achieve balance control. Kinesthetic haptic feedback in terms of body sway can be delivered via a commercially available haptic device and can enhance the balance stability of both young healthy subjects and stroke patients. Our system features a waist-attached smartphone, software running on a computer (PC), and a dedicated Phantom Omni® device. Young healthy participants performed balance tasks, with eyes closed, in each of four distinct postures held for 30 s (one foot on the ground; the Tandem Romberg stance; one foot on foam; and the Tandem Romberg stance on foam). Stroke patients kept their eyes open and were tested only in the Romberg stance, during a balance task 25 s in duration. An Android application running continuously on the smartphone sent mediolateral (ML) and anteroposterior (AP) tilt angles to a PC, which generated kinesthetic haptic feedback via the Phantom Omni®. A total of 16 subjects participated in the study: 8 young healthy subjects and 8 who had suffered stroke. Post-experiment data analysis was performed using MATLAB®. Mean Velocity Displacement (MVD), Planar Deviation (PD), Mediolateral Trajectory (MLT) and Anteroposterior Trajectory (APT) parameters were analyzed to measure reduction in body sway. Our kinesthetic haptic feedback system was effective in reducing postural sway in young healthy subjects regardless of posture and ground condition, and in improving MVD and PD in stroke patients performing the Romberg stance. Analysis of Variance (ANOVA) revealed that kinesthetic haptic feedback significantly reduced body sway in both categories of subjects. Kinesthetic haptic feedback can be implemented using a commercial haptic device and a smartphone. Intuitive balance cues were created using the handle of a haptic device, rendering the approach very simple yet efficient in practice. This novel form of biofeedback will be a useful rehabilitation tool for improving the balance of stroke patients.
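A minimal sketch of the feedback mapping described above, converting the smartphone's ML/AP tilt angles into a clamped corrective force at the haptic handle, might look like this; the gain and saturation values are assumptions, not the study's settings.

```python
# Sketch: body-sway tilt angles -> kinesthetic cue force at the haptic handle.
# The gain and force limit are assumed placeholders.
import math

K_FB  = 1.5   # N per degree of tilt (assumed)
F_MAX = 3.0   # clamp to what a small desktop haptic device can sustain (assumed)

def sway_feedback(ml_deg, ap_deg):
    fx = -K_FB * ml_deg      # push opposite the mediolateral sway
    fy = -K_FB * ap_deg      # push opposite the anteroposterior sway
    mag = math.hypot(fx, fy)
    if mag > F_MAX:          # keep the cue within device limits
        fx, fy = fx * F_MAX / mag, fy * F_MAX / mag
    return fx, fy

print(sway_feedback(ml_deg=1.2, ap_deg=-0.5))   # gentle corrective cue
```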
Development of a Graphical User Interface to Visualize Surface Observations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buckley, R.L.
1998-07-13
Thousands of worldwide observing stations provide meteorological information near the earth's surface as often as once each hour. These surface data may be plotted on geographical maps to provide the meteorologist with useful information regarding weather patterns for a region of interest. This report describes the components and applications of a graphical user interface that has been developed to visualize surface observations at any global location and time of interest.
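For a sense of the end product, a plot of this kind can be sketched in a few lines; the station coordinates and temperatures below are invented placeholders, and matplotlib stands in for whatever plotting layer the interface actually uses.

```python
# Sketch: surface observations rendered at their longitude/latitude positions.
# Station data are invented for illustration.
import matplotlib.pyplot as plt

stations = [  # (lon, lat, temperature in degC) -- placeholders
    (-81.7, 33.3, 24.0), (-80.2, 32.9, 26.5), (-82.5, 34.1, 22.8),
]
lons, lats, temps = zip(*stations)
sc = plt.scatter(lons, lats, c=temps, cmap="coolwarm", s=80)
plt.colorbar(sc, label="Temperature (degC)")
plt.xlabel("Longitude"); plt.ylabel("Latitude")
plt.title("Hourly surface observations")
plt.show()
```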
User's manual for EZPLOT version 5.5: A FORTRAN program for 2-dimensional graphic display of data
NASA Technical Reports Server (NTRS)
Garbinski, Charles; Redin, Paul C.; Budd, Gerald D.
1988-01-01
EZPLOT is a computer applications program that converts data resident on a file into a plot displayed on the screen of a graphics terminal. This program generates either time history or x-y plots in response to commands entered interactively from a terminal keyboard. Plot parameters consist of a single independent parameter and from one to eight dependent parameters. Various line patterns, symbol shapes, axis scales, text labels, and data modification techniques are available. This user's manual describes EZPLOT as it is implemented on the Ames Research Center, Dryden Research Facility ELXSI computer using DI-3000 graphics software tools.
Prototyping the graphical user interface for the operator of the Cherenkov Telescope Array
NASA Astrophysics Data System (ADS)
Sadeh, I.; Oya, I.; Schwarz, J.; Pietriga, E.
2016-07-01
The Cherenkov Telescope Array (CTA) is a planned gamma-ray observatory. CTA will incorporate about 100 imaging atmospheric Cherenkov telescopes (IACTs) at a Southern site, and about 20 in the North. Previous IACT experiments have used up to five telescopes. Consequently, the design of a graphical user interface (GUI) for the operator of CTA involves new challenges. We present a GUI prototype, the concept for which is being developed in collaboration with experts from the field of Human-Computer Interaction (HCI). The prototype is based on Web technology; it incorporates a Python web server, WebSockets and graphics generated with the d3.js JavaScript library.
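A minimal sketch of the server side of such a Web-based GUI, pushing telemetry to browser clients over WebSockets for a d3.js front end to render, could look like this; it uses the third-party Python `websockets` package (modern asyncio interface assumed), and the payload fields are invented.

```python
# Sketch: asyncio WebSocket server streaming telemetry to GUI clients.
# Payload fields are invented placeholders for real monitoring data.
import asyncio, json, random
import websockets  # third-party package

async def telemetry(ws):
    while True:
        await ws.send(json.dumps({
            "telescope_id": random.randrange(100),
            "camera_temp_C": round(random.gauss(20.0, 0.3), 2),
        }))
        await asyncio.sleep(1.0)

async def main():
    async with websockets.serve(telemetry, "localhost", 8765):
        await asyncio.Future()  # serve until cancelled

if __name__ == "__main__":
    asyncio.run(main())
```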
LinkWinds: An Approach to Visual Data Analysis
NASA Technical Reports Server (NTRS)
Jacobson, Allan S.
1992-01-01
The Linked Windows Interactive Data System (LinkWinds) is a prototype visual data exploration and analysis system resulting from a NASA/JPL program of research into graphical methods for rapidly accessing, displaying and analyzing large multivariate multidisciplinary datasets. It is an integrated multi-application execution environment allowing the dynamic interconnection of multiple windows containing visual displays and/or controls through a data-linking paradigm. This paradigm, which results in a system much like a graphical spreadsheet, is not only a powerful method for organizing large amounts of data for analysis, but also provides a highly intuitive, easy-to-learn user interface on top of the traditional graphical user interface.
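The data-linking paradigm can be illustrated in miniature: windows subscribe to a shared data object, so a change made through any control propagates to every linked display, spreadsheet-style. The class names below are hypothetical, not LinkWinds API.

```python
# Sketch: linked-windows data model -- one shared value, many live views.
class LinkedData:
    def __init__(self, value):
        self._value, self._subscribers = value, []

    def link(self, window):
        self._subscribers.append(window)
        window.refresh(self._value)      # a newly linked window syncs immediately

    def set(self, value):                # called from any control window
        self._value = value
        for w in self._subscribers:      # every linked display updates
            w.refresh(value)

class DisplayWindow:
    def __init__(self, name): self.name = name
    def refresh(self, value): print(f"{self.name}: redraw with {value}")

threshold = LinkedData(0.5)
threshold.link(DisplayWindow("histogram"))
threshold.link(DisplayWindow("contour plot"))
threshold.set(0.8)   # both windows redraw
```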
Guide to using Cuechart, Tellagraf, and Disspla at ANL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertoncini, P.J.; Thommes, M.M.
1986-01-01
Guide to Cuechart, Tellagraf, and Disspla at ANL provides information necessary for using the three ISSCO graphics packages at Argonne: Cuechart is a cue-and-response program available in CMS that aids users in creating bar charts, line charts, pie charts, and word charts. It is appropriate for users with little or no previous graphics experience. Cuechart provides much of the capability of Tellagraf without the user's having to learn Tellagraf commands. Tellagraf is a more powerful, easy-to-use graphics package also available in CMS. With a little training, scientists, administrators, and secretaries can produce sophisticated publication-quality log or linear plots, bar charts, pie charts, tables, or posters. Disspla is a more versatile and sophisticated graphics package. It is available in both CMS and batch and consists of several hundred Fortran-callable and PL/I-callable subroutines that enable the user to obtain professional-quality plots. In addition to log or linear plots, bar charts, pie charts, and pages of text, Disspla provides subroutines for contour plots, 3-D plots, and world maps.
Demonstration of new PCDS capabilities
NASA Technical Reports Server (NTRS)
Gough, M.
1986-01-01
The new, more flexible and more friendly graphics capabilities to be available in later releases of the Pilot Climate Data System were demonstrated. The LIMS-LAMAT data set was chosen to illustrate these new capabilities. Pseudocolor and animation were used to represent the third and fourth dimensions, expanding the analytical capabilities available through the traditional two-dimensional x-y plot. In the new version, variables for the axes are chosen by scrolling through viable selections. This scrolling feature is a function of the new user interface customization. The new graphics are extremely user friendly and should free the scientist to look at data and converse with it, without doing any programming. The system is designed to rapidly plot any variable versus any other variable and animate by any variable. Any one plot in itself is not extraordinary; however, the fact that a user can generate the plots instead of a programmer distinguishes the graphics capabilities of the PCDS from other software packages. In addition, with the new CDF design, the system will become more generic, and the new graphics will become much more rigorous in the area of correlative studies.
NASA Technical Reports Server (NTRS)
Szczur, Martha R.
1989-01-01
The Transportable Applications Environment Plus (TAE Plus), developed by NASA's Goddard Space Flight Center, is a portable User Interface Management System (UIMS), which provides an intuitive WYSIWYG WorkBench for prototyping and designing an application's user interface, integrated with tools for efficiently implementing the designed user interface and effective management of the user interface during an application's active domain. During the development of TAE Plus, many design and implementation decisions were based on the state-of-the-art within graphics workstations, windowing system and object-oriented programming languages. Some of the problems and issues experienced during implementation are discussed. A description of the next development steps planned for TAE Plus is also given.
Salisbury, C M; Gillespie, R B; Tan, H Z; Barbagli, F; Salisbury, J K
2011-01-01
In this paper, we extend the concept of the contrast sensitivity function - used to evaluate video projectors - to the evaluation of haptic devices. We propose using human observers to determine if vibrations rendered using a given haptic device are accompanied by artifacts detectable to humans. This determination produces a performance measure that carries particular relevance to applications involving texture rendering. For cases in which a device produces detectable artifacts, we have developed a protocol that localizes deficiencies in device design and/or hardware implementation. In this paper, we present results from human vibration detection experiments carried out using three commercial haptic devices and one high performance voice coil motor. We found that all three commercial devices produced perceptible artifacts when rendering vibrations near human detection thresholds. Our protocol allowed us to pinpoint the deficiencies, however, and we were able to show that minor modifications to the haptic hardware were sufficient to make these devices well suited for rendering vibrations, and by extension, the vibratory components of textures. We generalize our findings to provide quantitative design guidelines that ensure the ability of haptic devices to proficiently render the vibratory components of textures.
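One plausible way to quantify such artifacts numerically, alongside the human-observer protocol the paper uses, is to command a pure tone, record the handle's acceleration, and measure the energy falling outside the commanded frequency. The signals below are simulated stand-ins for a real accelerometer capture.

```python
# Sketch: out-of-band energy as a vibration-rendering artifact metric.
# The "measured" signal is simulated with a fake 3rd harmonic.
import numpy as np

fs, f0, dur = 5000.0, 100.0, 1.0                  # sample rate, tone, duration
t = np.arange(int(fs * dur)) / fs
measured = np.sin(2 * np.pi * f0 * t) + 0.05 * np.sin(2 * np.pi * 3 * f0 * t)

spec  = np.abs(np.fft.rfft(measured)) ** 2
freqs = np.fft.rfftfreq(len(measured), 1 / fs)
in_band = np.abs(freqs - f0) < 2.0                # +/- 2 Hz around the tone
ratio = np.sqrt(spec[~in_band].sum() / spec[in_band].sum())
print(f"relative out-of-band amplitude: {ratio:.3f}")  # ~0.05 here
```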
Haptic Feedback in Robot-Assisted Minimally Invasive Surgery
Okamura, Allison M.
2009-01-01
Purpose of Review Robot-assisted minimally invasive surgery (RMIS) holds great promise for improving the accuracy and dexterity of a surgeon while minimizing trauma to the patient. However, widespread clinical success with RMIS has been marginal. It is hypothesized that the lack of haptic (force and tactile) feedback presented to the surgeon is a limiting factor. This review explains the technical challenges of creating haptic feedback for robot-assisted surgery and provides recent results that evaluate the effectiveness of haptic feedback in mock surgical tasks. Recent Findings Haptic feedback systems for RMIS are still under development and evaluation. Most provide only force feedback, with limited fidelity. The major challenge at this time is sensing forces applied to the patient. A few tactile feedback systems for RMIS have been created, but their practicality for clinical implementation needs to be shown. It is particularly difficult to sense and display spatially distributed tactile information. The cost-benefit ratio for haptic feedback in RMIS has not been established. Summary The designs of existing commercial RMIS systems are not conducive for force feedback, and creative solutions are needed to create compelling tactile feedback systems. Surgeons, engineers, and neuroscientists should work together to develop effective solutions for haptic feedback in RMIS. PMID:19057225
Learning, retention, and generalization of haptic categories
NASA Astrophysics Data System (ADS)
Do, Phuong T.
This dissertation explored how haptic concepts are learned, retained, and generalized to the same or a different modality. Participants learned to classify objects into three categories either visually or haptically via different training procedures, followed by an immediate or delayed transfer test. Experiment I involved visual versus haptic learning and transfer. Intermodal matching between vision and haptics was investigated in Experiment II. Experiments III and IV examined intersensory conflict in within- and between-category bimodal situations to determine the degree of perceptual dominance between sight and touch. Experiment V explored the intramodal relationship between similarity and categorization in a psychological space, as revealed by MDS analysis of similarity judgments. Major findings were: (1) visual examination resulted in relatively higher performance accuracy than haptic learning; (2) systematic training produced better category learning of haptic concepts across all modality conditions; (3) the category prototypes were rated newer than any transfer stimulus following learning, both immediately and after a week's delay; and (4) although they converged at the apex of two transformational trajectories, the category prototypes became more central to their respective categories and increasingly structured as a function of learning. Implications for theories of multimodal similarity and categorization behavior are discussed in terms of discrimination learning, sensory integration, and dominance relations.
A Multi-Finger Interface with MR Actuators for Haptic Applications.
Qin, Huanhuan; Song, Aiguo; Gao, Zhan; Liu, Yuqing; Jiang, Guohua
2018-01-01
Haptic devices with multi-finger input are highly desirable in providing realistic and natural feelings when interacting with the remote or virtual environment. Compared with conventional actuators, MR (magneto-rheological) actuators are preferable options in haptics because of their larger passive torque and torque-to-volume ratios. Most existing haptic MR actuators, however, are still bulky and heavy; smaller and lighter designs would make them more suitable for haptics. In this paper, a small-scale yet powerful MR actuator was designed to build a multi-finger interface for a 6-DOF haptic device. A compact structure was achieved by adopting a multi-disc configuration. Based on this configuration, the MR actuator can generate a maximum torque of 480 N·mm with dimensions of only 36 mm diameter and 18 mm height. Performance evaluation showed that it exhibits a relatively high dynamic range and good response characteristics when compared with other haptic MR actuators. The multi-finger interface is equipped with three MR actuators and can provide up to 8 N of passive force to each of the thumb, index, and middle fingers. An application example was used to demonstrate the effectiveness and potential of this new MR actuator based interface.
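For a back-of-envelope check on torque of this order, the standard shear model for disc-type MR brakes, T = N * (2*pi/3) * tau_y * (Ro^3 - Ri^3), can be evaluated with plausible numbers. The dimensions and yield stress below are assumed placeholders fitted to a 36 mm body, not the authors' design values.

```python
# Sketch: first-order torque estimate for a multi-disc MR actuator.
# T = N * (2*pi/3) * tau_y * (Ro^3 - Ri^3); all values are assumptions.
import math

N_gaps = 2             # number of active shear gaps (assumed)
Ro, Ri = 0.015, 0.005  # outer/inner disc radii, m (assumed)
tau_y  = 45e3          # MR-fluid yield stress at full field, Pa (typical order)

T = N_gaps * (2 * math.pi / 3) * tau_y * (Ro**3 - Ri**3)
print(f"{T * 1e3:.0f} N*mm")  # ~610 N*mm -- same order as the reported 480 N*mm
```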
Interactive graphics to demonstrate health risks: formative development and qualitative evaluation
Ancker, Jessica S.; Chan, Connie; Kukafka, Rita
2015-01-01
Background Recent findings suggest that interactive game-like graphics might be useful in communicating probabilities. We developed a prototype for a risk communication module, focusing on eliciting users’ preferences for different interactive graphics and assessing usability and user interpretations. Methods Focus groups and iterative design methods. Results Feedback from five focus groups was used to design the graphics. The final version displayed a matrix of square buttons; clicking on any button allowed the user to see whether the stick figure underneath was affected by the health outcome. When participants used this interaction to learn about a risk, they expressed more emotional responses, both positive and negative, than when viewing any static graphic or numerical description of a risk. Their responses included relief about small risks and concern about large risks. The groups also commented on static graphics: arranging the figures affected by disease randomly throughout a group of figures made it more difficult to judge the proportion affected but was described as more realistic. Conclusions Interactive graphics appear to have potential for expressing risk magnitude as well as the affective feeling of risk. Quantitative studies are planned to assess the effect on perceived risks and estimated risk magnitudes. PMID:19657926
User-Extensible Graphics Using Abstract Structure,
1987-08-01
Contents include: the Flex system; the Algol68 model of the graphical abstract structure; the creation of a PictureDefinition; and the making of a picture from a PictureDefinition. An abstract structure comprises data together with the operations that can be performed on that data.
Tag, You're It: Enhancing Access to Graphic Novels
ERIC Educational Resources Information Center
West, Wendy
2013-01-01
Current users of academic libraries are avid readers of graphic novels. These thought-provoking materials are used for leisure reading, in instruction, and for research purposes. Libraries need to take care in providing access to these resources. This study analyzed the cataloging practices and social tagging of a specific list of graphic novel…
Starting Over: Current Issues in Online Catalog User Interface Design.
ERIC Educational Resources Information Center
Crawford, Walt
1992-01-01
Discussion of online catalogs focuses on issues in interface design. Issues addressed include understanding the user base; common user access (CUA) with personal computers; common command language (CCL); hyperlinks; screen design issues; differences from card catalogs; indexes; graphic user interfaces (GUIs); color; online help; and remote users.…
Optimizing Cubature for Efficient Integration of Subspace Deformations
An, Steven S.; Kim, Theodore; James, Doug L.
2009-01-01
We propose an efficient scheme for evaluating nonlinear subspace forces (and Jacobians) associated with subspace deformations. The core problem we address is efficient integration of the subspace force density over the 3D spatial domain. Similar to Gaussian quadrature schemes that efficiently integrate functions that lie in particular polynomial subspaces, we propose cubature schemes (multi-dimensional quadrature) optimized for efficient integration of force densities associated with particular subspace deformations, particular materials, and particular geometric domains. We support generic subspace deformation kinematics, and nonlinear hyperelastic materials. For an r-dimensional deformation subspace with O(r) cubature points, our method is able to evaluate subspace forces at O(r^2) cost. We also describe composite cubature rules for runtime error estimation. Results are provided for various subspace deformation models, several hyperelastic materials (St. Venant-Kirchhoff, Mooney-Rivlin, Arruda-Boyce), and multimodal (graphics, haptics, sound) applications. We show dramatically better efficiency than traditional Monte Carlo integration. CR Categories: I.6.8 [Simulation and Modeling]: Types of Simulation—Animation; I.3.5 [Computer Graphics]: Computational Geometry and Object Modeling—Physically based modeling; G.1.4 [Mathematics of Computing]: Numerical Analysis—Quadrature and Numerical Differentiation. PMID:19956777
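In sketch form, the runtime evaluation is just a weighted sum over the pre-selected cubature points, with each point sampling the force density in the subspace. The force-density function below is a toy stand-in for a real hyperelastic model.

```python
# Sketch: reduced force via an optimized cubature rule,
# f_r(q) = sum_i w_i * U_i^T * f(U_i @ q), where U_i is the 3 x r basis block
# of cubature point i. O(r) points give O(r^2) evaluation cost.
import numpy as np

def reduced_force(q, U, points, weights, f_density):
    f_r = np.zeros(q.shape[0])
    for i, w in zip(points, weights):
        Ui = U[3 * i:3 * i + 3, :]          # basis rows of cubature point i
        f_r += w * Ui.T @ f_density(Ui @ q, i)
    return f_r

# Toy stand-in: a linear restoring force density at each sampled point.
f_density = lambda u, i: -10.0 * u
rng = np.random.default_rng(0)
U = rng.standard_normal((300, 5))           # 100 vertices, r = 5 subspace
q = rng.standard_normal(5)
print(reduced_force(q, U, points=[3, 42, 77], weights=[1.2, 0.7, 2.1],
                    f_density=f_density))
```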
NASA Technical Reports Server (NTRS)
1991-01-01
The Engineering Scripting Language (ESL) is a language designed to allow nonprogramming users to write Higher Order Language (HOL) programs by drawing directed graphs to represent the program and having the system generate the corresponding program in HOL. The ESL system supports user generation of HOL programs through the manipulation of directed graphs. The components of these graphs (nodes, ports, and connectors) are objects, each of which has its own properties and property values. The purpose of the ESL graphical editor is to allow the user to create or edit graph objects which represent programs.
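A toy rendering of that object model, with nodes, ports, and connectors each carrying their own properties, might look like the following; all names are hypothetical, not ESL syntax.

```python
# Sketch: directed-graph program representation with nodes, ports, and
# connectors as property-bearing objects. Names are hypothetical.
class Port:
    def __init__(self, node, name, direction):
        self.node, self.name, self.direction = node, name, direction

class Node:
    def __init__(self, name, **props):
        self.name, self.props, self.ports = name, props, {}

    def add_port(self, pname, direction):
        self.ports[pname] = Port(self, pname, direction)
        return self.ports[pname]

class Connector:
    def __init__(self, src, dst):
        assert src.direction == "out" and dst.direction == "in"
        self.src, self.dst = src, dst

read = Node("ReadSensor", rate_hz=10)
filt = Node("LowPass", cutoff_hz=1.0)
wire = Connector(read.add_port("value", "out"), filt.add_port("input", "in"))
# A code generator would walk the nodes and connectors to emit the HOL program.
```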
Moore, Kieran M; Edge, Graham; Kurc, Andrew R
2008-11-14
Timeliness is a critical asset to the detection of public health threats when using syndromic surveillance systems. In order for epidemiologists to effectively distinguish which events are indicative of a true outbreak, the ability to utilize specific data streams from generalized data summaries is necessary. Taking advantage of graphical user interfaces and visualization capacities of current surveillance systems makes it easier for users to investigate detected anomalies by generating custom graphs, maps, plots, and temporal-spatial analysis of specific syndromes or data sources. PMID:19025683
SearchGUI: An open-source graphical user interface for simultaneous OMSSA and X!Tandem searches.
Vaudel, Marc; Barsnes, Harald; Berven, Frode S; Sickmann, Albert; Martens, Lennart
2011-03-01
The identification of proteins by mass spectrometry is a standard technique in the field of proteomics, relying on search engines to perform the identifications of the acquired spectra. Here, we present a user-friendly, lightweight and open-source graphical user interface called SearchGUI (http://searchgui.googlecode.com), for configuring and running the freely available OMSSA (open mass spectrometry search algorithm) and X!Tandem search engines simultaneously. Freely available under the permissible Apache2 license, SearchGUI is supported on Windows, Linux and OSX. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
The contributions of vision and haptics to reaching and grasping
Stone, Kayla D.; Gonzalez, Claudia L. R.
2015-01-01
This review aims to provide a comprehensive outlook on the sensory (visual and haptic) contributions to reaching and grasping. The focus is on studies in developing children, normal, and neuropsychological populations, and in sensory-deprived individuals. Studies have suggested a right-hand/left-hemisphere specialization for visually guided grasping and a left-hand/right-hemisphere specialization for haptically guided object recognition. This poses the interesting possibility that when vision is not available and grasping relies heavily on the haptic system, there is an advantage to use the left hand. We review the evidence for this possibility and dissect the unique contributions of the visual and haptic systems to grasping. We ultimately discuss how the integration of these two sensory modalities shape hand preference. PMID:26441777
FLIP for FLAG model visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wooten, Hasani Omar
A graphical user interface has been developed for FLAG users. FLIP (FLAG Input deck Parser) provides users with an organized view of FLAG models and a means for efficiently and easily navigating and editing nodes, parameters, and variables.
Robotic guidance benefits the learning of dynamic, but not of spatial movement characteristics.
Lüttgen, Jenna; Heuer, Herbert
2012-10-01
Robotic guidance is an engineered form of haptic-guidance training and is intended to enhance motor learning in rehabilitation, surgery, and sports. However, its benefits (and pitfalls) are still debated. Here, we investigate the effects of different presentation modes on the reproduction of a spatiotemporal movement pattern. In three different groups of participants, the movement was demonstrated in three different modalities: visual, haptic, and visuo-haptic. After demonstration, participants had to reproduce the movement in two alternating recall conditions: haptic and visuo-haptic. Performance of the three groups during recall was compared with regard to spatial and dynamic movement characteristics. After haptic presentation, participants showed superior dynamic accuracy, whereas after visual presentation, participants performed better with regard to spatial accuracy. Added visual feedback during recall always led to enhanced performance, independent of the movement characteristic and the presentation modality. These findings substantiate the different benefits of different presentation modes for different movement characteristics. In particular, robotic guidance is beneficial for the learning of dynamic, but not of spatial, movement characteristics.
Haptic feedback in OP:Sense - augmented reality in telemanipulated robotic surgery.
Beyl, T; Nicolai, P; Mönnich, H; Raczkowksy, J; Wörn, H
2012-01-01
In current research, haptic feedback in robot-assisted interventions plays an important role. However, most approaches to haptic feedback only consider mapping the current forces at the surgical instrument to the haptic input devices, whereas surgeons demand a combination of medical imaging and telemanipulated robotic setups. In this paper we describe how this feature is integrated into our robotic research platform OP:Sense. The proposed method allows the automatic transfer of segmented imaging data to the haptic renderer and therefore allows enriching the haptic feedback with virtual fixtures based on imaging data. Anatomical structures are extracted from pre-operatively generated medical images, or virtual walls are defined by the surgeon inside the imaging data. Combining real forces with virtual fixtures can guide the surgeon to the regions of interest and helps prevent damage to critical structures inside the patient. We believe that the combination of medical imaging and telemanipulation is a crucial step for the next generation of MIRS systems.
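One common formulation of such a forbidden-region fixture is a repulsive spring engaged once the instrument tip crosses a safety margin around a segmented structure. The sketch below assumes a signed-distance field precomputed from segmentation; the toy sphere and all gains are placeholders.

```python
# Sketch: forbidden-region virtual fixture from a signed-distance field.
# Stiffness, margin, and the toy geometry are assumed placeholders.
import numpy as np

K_WALL, MARGIN = 500.0, 0.005   # N/m, 5 mm standoff (assumed)

def fixture_force(tip, signed_distance, gradient):
    """Repulsive force pushing the tip out along the distance-field gradient."""
    d = signed_distance(tip)            # > 0 outside the structure
    if d >= MARGIN:
        return np.zeros(3)
    n = gradient(tip)
    n = n / np.linalg.norm(n)           # outward normal direction
    return K_WALL * (MARGIN - d) * n

# Toy critical structure: a 2 cm sphere at the origin.
sd   = lambda p: np.linalg.norm(p) - 0.02
grad = lambda p: p / np.linalg.norm(p)
print(fixture_force(np.array([0.021, 0.0, 0.0]), sd, grad))  # ~[2, 0, 0] N
```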
Design of a 7-DOF haptic master using a magneto-rheological devices for robot surgery
NASA Astrophysics Data System (ADS)
Kang, Seok-Rae; Choi, Seung-Bok; Hwang, Yong-Hoon; Cha, Seung-Woo
2017-04-01
This paper presents a 7-degrees-of-freedom (7-DOF) haptic master which is applicable to robot-assisted minimally invasive surgery (RMIS). By utilizing a controllable magneto-rheological (MR) fluid, the haptic master can provide force information to the surgeon during surgery. The proposed haptic master consists of three translational motions in X, Y, and Z and four motions of pitch, yaw, roll, and grasping, all with force-feedback capability. The haptic master can generate repulsive forces or torques by activating an MR clutch and an MR brake. Both the MR clutch and the MR brake are designed and manufactured with consideration of the size and output torque usable in robotic surgery. A proportional-integral-derivative (PID) controller is then designed and implemented to achieve torque/force tracking trajectories. It is verified that the proposed haptic master can closely track the desired torque and force arising at the surgical site by controlling the input current applied to the MR clutch and brake.
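The torque-tracking loop can be sketched as a discrete PID acting on coil current; the gains, current clamp, and the crude first-order torque response below are illustrative assumptions, not the authors' identified plant.

```python
# Sketch: discrete PID tracking a desired MR clutch/brake torque by adjusting
# coil current. Gains and the torque-vs-current model are assumptions.
class PIDTorque:
    def __init__(self, kp=2.0, ki=30.0, kd=0.01, dt=0.001):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integ, self.prev_err = 0.0, 0.0

    def step(self, t_des, t_meas):
        err = t_des - t_meas
        self.integ += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        i_cmd = self.kp * err + self.ki * self.integ + self.kd * deriv
        return min(max(i_cmd, 0.0), 2.0)   # coil-current clamp, A (assumed)

pid, torque = PIDTorque(), 0.0
for _ in range(2000):                       # 2 s at 1 kHz
    i = pid.step(t_des=0.3, t_meas=torque)  # track 0.3 N*m
    torque += 0.02 * (0.25 * i - torque)    # crude first-order MR response
print(f"tracked torque: {torque:.3f} N*m")
```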
Menon, Samir; Zhu, Jack; Goyal, Deeksha; Khatib, Oussama
2017-07-01
Haptic interfaces compatible with functional magnetic resonance imaging (Haptic fMRI) promise to enable rich motor neuroscience experiments that study how humans perform complex manipulation tasks. Here, we present a large-scale study (176 scan runs, 33 scan sessions) that characterizes the reliability and performance of one such electromagnetically actuated device, Haptic fMRI Interface 3 (HFI-3). We outline engineering advances that ensured HFI-3 did not interfere with fMRI measurements. Observed fMRI temporal noise levels with HFI-3 operating were at the fMRI baseline (0.8% noise to signal). We also present results from HFI-3 experiments demonstrating that high-resolution fMRI can be used to study spatio-temporal patterns of blood oxygenation level dependent (BOLD) activation. These experiments include motor planning, goal-directed reaching, and visually-guided force control. Observed fMRI responses are consistent with existing literature, which supports Haptic fMRI's effectiveness at studying the brain's motor regions.
Configurable software for satellite graphics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartzman, P D
An important goal in interactive computer graphics is to provide users with both quick system responses for basic graphics functions and enough computing power for complex calculations. One solution is to have a distributed graphics system in which a minicomputer and a powerful large computer share the work. The most versatile type of distributed system is an intelligent satellite system in which the minicomputer is programmable by the application user and can do most of the work while the large remote machine is used for difficult computations. At New York University, the hardware was configured from available equipment. The level of system intelligence resulted almost completely from software development. Unlike previous work with intelligent satellites, the resulting system had system control centered in the satellite. It also had the ability to reconfigure software during realtime operation. The design of the system was done at a very high level using set theoretic language. The specification clearly illustrated processor boundaries and interfaces. The high-level specification also produced a compact, machine-independent virtual graphics data structure for picture representation. The software was written in a systems implementation language; thus, only one set of programs was needed for both machines. A user can program both machines in a single language. Tests of the system with an application program indicate that it has very high potential. A major result of this work is the demonstration that a gigantic investment in new hardware is not necessary for computing facilities interested in graphics.
Display system for imaging scientific telemetric information
NASA Technical Reports Server (NTRS)
Zabiyakin, G. I.; Rykovanov, S. N.
1979-01-01
A system for imaging scientific telemetric information, based on the M-6000 minicomputer and the SIGD graphic display, is described. Two-dimensional graphic display of telemetric information, and interaction with the computer in the analysis and processing of telemetric parameters displayed on the screen, are provided. The method for outputting running-parameter information is presented. User capabilities in the analysis and processing of telemetric information imaged on the display screen, and the user language, are discussed and illustrated.
Spatial Modeling Tools for Cell Biology
2006-10-01
multiphysics modeling expertise. A graphical user interface (GUI) for CoBi, JCoBi, was written in Java with interactive 3D graphics. CoBi has been...tools (C++ and Java) to simulate complex cell and organ biology problems. CoBi has been designed to interact with the other Bio-SPICE software...fall of 2002. VisIt supports C++, Python and Java interfaces. The C++ and Java interfaces make it possible to provide alternate user interfaces for
NASA Technical Reports Server (NTRS)
Collier, Mark D.; Killough, Ronnie; Martin, Nancy L.
1990-01-01
NASA is currently using a set of applications called the Display Builder and Display Manager. They run on Concurrent systems and depend heavily on the Graphical Kernel System (GKS). At this time, however, these two applications would more appropriately be developed in X Windows, in which low-level X calls are used for all actual text and graphics display and a standard widget set (such as Motif) is used for the user interface. Use of X Windows will increase performance, improve the user interface, enhance portability, and improve reliability. A prototype of an X Window/Motif-based Display Manager provides the following advantages over a GKS-based application: improved performance, since low-level X makes the display of graphics and text more efficient; an improved user interface through Motif; improved portability, operating on both Concurrent and Sun workstations; and improved reliability.
Design and validation of an improved graphical user interface with the 'Tool ball'.
Lee, Kuo-Wei; Lee, Ying-Chu
2012-01-01
The purpose of this research is to introduce the design of an improved graphical user interface (GUI) and to verify the operational efficiency of the proposed interface. Until now, clicking the toolbar with the mouse has been the usual way to operate software functions. In our research, we designed an improved graphical user interface - a tool ball that is operated by a mouse wheel to perform software functions. Several experiments were conducted to measure the time needed to operate certain software functions with the traditional combination of "mouse click + tool button" and the proposed integration of "mouse wheel + tool ball". The results indicate that the tool ball design can accelerate the speed of operating software functions, decrease the number of icons on the screen, and enlarge the applications of the mouse wheel. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.
AutoAssemblyD: a graphical user interface system for several genome assemblers.
Veras, Adonney Allan de Oliveira; de Sá, Pablo Henrique Caracciolo Gomes; Azevedo, Vasco; Silva, Artur; Ramos, Rommel Thiago Jucá
2013-01-01
Next-generation sequencing technologies have increased the amount of biological data generated. Thus, bioinformatics has become important because new methods and algorithms are necessary to manipulate and process such data. However, certain challenges have emerged, such as genome assembly using short reads and high-throughput platforms. In this context, several algorithms have been developed, such as Velvet, Abyss, Euler-SR, Mira, Edna, Maq, SHRiMP, Newbler, ALLPATHS, Bowtie and BWA. However, most such assemblers do not have a graphical interface, which makes their use difficult for users without computing experience, given the complexity of the assembler syntax. Thus, to make the operation of such assemblers accessible to users without a computing background, we developed AutoAssemblyD, which is a graphical tool for genome assembly submission and remote management by multiple assemblers through XML templates. AutoAssemblyD is freely available at https://sourceforge.net/projects/autoassemblyd. It requires Sun JDK 6 or higher.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hohn, Michael; Adams, Paul
2006-09-05
The L3 system is a computational steering environment for image processing and scientific computing. It consists of an interactive graphical language and interface. Its purpose is to help advanced users control their computational software and assist in the management of data accumulated during numerical experiments. L3 provides a combination of features not found in other environments: textual and graphical construction of programs; persistence of programs and associated data; direct mapping between the scripts, the parameters, and the produced data; implicit hierarchical data organization; full programmability, including conditionals and functions; and incremental execution of programs. The software includes the L3 language and the graphical environment. The language is a single-assignment functional language; the implementation consists of a lexer, parser, interpreter, storage handler, and editing support. The graphical environment is an event-driven nested list viewer/editor providing graphical elements corresponding to the language. These elements are both the representation of a user's program and active interfaces to the values computed by that program.
Advancing satellite operations with intelligent graphical monitoring systems
NASA Technical Reports Server (NTRS)
Hughes, Peter M.; Shirah, Gregory W.; Luczak, Edward C.
1993-01-01
For nearly twenty-five years, spacecraft missions have been operated in essentially the same manner: human operators monitor displays filled with alphanumeric text watching for limit violations or other indicators that signal a problem. The task is performed predominately by humans. Only in recent years have graphical user interfaces and expert systems been accepted within the control center environment to help reduce operator workloads. Unfortunately, the development of these systems is often time consuming and costly. At the NASA Goddard Space Flight Center (GSFC), a new domain specific expert system development tool called the Generic Spacecraft Analyst Assistant (GenSAA) has been developed. Through the use of a highly graphical user interface and point-and-click operation, GenSAA facilitates the rapid, 'programming-free' construction of intelligent graphical monitoring systems to serve as real-time, fault-isolation assistants for spacecraft analysts. Although specifically developed to support real-time satellite monitoring, GenSAA can support the development of intelligent graphical monitoring systems in a variety of space and commercial applications.
Alloy Design Workbench-Surface Modeling Package Developed
NASA Technical Reports Server (NTRS)
Abel, Phillip B.; Noebe, Ronald D.; Bozzolo, Guillermo H.; Good, Brian S.; Daugherty, Elaine S.
2003-01-01
NASA Glenn Research Center's Computational Materials Group has integrated a graphical user interface with in-house-developed surface modeling capabilities, with the goal of using computationally efficient atomistic simulations to aid the development of advanced aerospace materials through the modeling of alloy surfaces, surface alloys, and segregation. The software is also ideal for modeling nanomaterials, since surface and interfacial effects can dominate material behavior and properties at this scale. Through the combination of an accurate atomistic surface modeling methodology and an efficient computational engine, it is now possible to directly model these types of surface phenomena and metallic nanostructures without a supercomputer. Fulfilling a High Operating Temperature Propulsion Components (HOTPC) project level-I milestone, a graphical user interface was created for a suite of quantum-approximate atomistic materials modeling Fortran programs developed at Glenn. The resulting "Alloy Design Workbench-Surface Modeling Package" (ADW-SMP) is the combination of proven quantum-approximate Bozzolo-Ferrante-Smith (BFS) algorithms (refs. 1 and 2) with a productivity-enhancing graphical front end. Written in the portable, platform-independent Java programming language, the graphical user interface calls on extensively tested Fortran programs running in the background for the detailed computational tasks. Designed to run on desktop computers, the package has been deployed on PC, Mac, and SGI computer systems. The graphical user interface integrates two modes of computational materials exploration. One mode uses Monte Carlo simulations to determine lowest-energy equilibrium configurations. The second approach is an interactive "what if" comparison of atomic configuration energies, designed to provide real-time insight into the underlying drivers of alloying processes.
Effects of Motion and Figural Goodness on Haptic Object Perception in Infancy.
ERIC Educational Resources Information Center
Streri, Arlette; Spelke, Elizabeth S.
1989-01-01
After haptic habituation to a ring display, infants perceived the rings in two experiments as parts of one connected object. In both haptic and visual modes, infants appeared to perceive object unity by analyzing motion but not by analyzing figural goodness. (RH)
Teaching Classical Mechanics Concepts Using Visuo-Haptic Simulators
ERIC Educational Resources Information Center
Neri, Luis; Noguez, Julieta; Robledo-Rella, Victor; Escobar-Castillejos, David; Gonzalez-Nucamendi, Andres
2018-01-01
In this work, the design and implementation of several physics scenarios using haptic devices are presented and discussed. Four visuo-haptic applications were developed for an undergraduate engineering physics course. Experiments with experimental and control groups were designed and implemented. Activities and exercises related to classical…
INCA- INTERACTIVE CONTROLS ANALYSIS
NASA Technical Reports Server (NTRS)
Bauer, F. H.
1994-01-01
The Interactive Controls Analysis (INCA) program was developed to provide a user-friendly environment for the design and analysis of linear control systems, primarily feedback control systems. INCA is designed for use with both small- and large-order systems. Using the interactive graphics capability, the INCA user can quickly plot a root locus, frequency response, or time response of either a continuous-time system or a sampled-data system. The system configuration and parameters can be easily changed, allowing the INCA user to design compensation networks and perform sensitivity analysis in a very convenient manner. A journal file capability is included. This stores an entire sequence of commands generated during an INCA session into a file that can be accessed later. Also included in INCA are a context-sensitive help library, a screen editor, and plot windows. INCA is robust to VAX-specific overflow problems. The transfer function is the basic unit of INCA. Transfer functions are automatically saved and are available to the INCA user at any time. A powerful, user-friendly transfer function manipulation and editing capability is built into the INCA program. The user can do all transfer function manipulations and plotting without leaving INCA, although provisions are made to input transfer functions from data files. By using a small set of commands, the user may compute and edit transfer functions, and then examine these functions by using the ROOT_LOCUS, FREQUENCY_RESPONSE, and TIME_RESPONSE capabilities. Basic input data, including gains, are handled as single-input single-output transfer functions. These functions can be developed using the function editor or by using FORTRAN-like arithmetic expressions. In addition to the arithmetic functions, special functions are available to 1) compute step, ramp, and sinusoid functions, 2) compute closed-loop transfer functions, 3) convert from the S plane to the Z plane with optional advanced Z transform, and 4) convert from the Z plane to the W plane and back. These capabilities allow the INCA user to perform block diagram algebraic manipulations quickly for functions in the S, Z, and W domains. Additionally, a versatile digital control capability has been included in INCA. Special plane transformations allow the user to easily convert functions from one domain to another. Other digital control capabilities include: 1) totally independent open-loop frequency response analyses on a continuous plant with a discrete control system with a delay, 2) advanced Z-transform capability for systems with delays, and 3) multirate sampling analyses. The current version of INCA includes Dynamic Functions (which change when a parameter changes), standard filter generation, PD and PID controller generation, incorporation of the QZ-algorithm (function addition, inverse Laplace), and describing functions that allow the user to calculate the gain and phase characteristics of a nonlinear device. The INCA graphic modes provide the user with a convenient means to document and study frequency response, time response, and root locus analyses. General graphics features include: 1) zooming and dezooming, 2) plot documentation, 3) a table of analytic computation results, 4) multiple curves on the same plot, and 5) displaying frequency and gain information for a specific point on a curve.
Additional capabilities in the frequency response mode include: 1) a full complement of graphical methods: Bode magnitude, Bode phase, Bode combined magnitude and phase, Bode strip plots, root contour plots, Nyquist, Nichols, and Popov plots; 2) user-selected plot scaling; and 3) gain and phase margin calculation and display. In the time response mode, additional capabilities include: 1) support for inverse Laplace and inverse Z transforms, 2) support for various input functions, 3) closed-loop response evaluation, 4) loop gain sensitivity analyses, 5) intersample time response for discrete systems using the advanced Z transform, and 6) closed-loop time response using mixed-plane (S, Z, W) operations with delay. A Graphics mode command was added to the current version of INCA, version 3.13, to produce Metafiles (graphic files) of the currently displayed plot. The metafile can be displayed and edited using the QPLOT Graphics Editor and Replotter for Metafiles (GERM) program included with the INCA package. The INCA program is written in Pascal and FORTRAN for interactive or batch execution and has been implemented on a DEC VAX series computer under VMS. Both source code and executable code are supplied for INCA. Full INCA graphics capabilities are supported for various Tektronix 40xx and 41xx terminals; DEC VT graphics terminals; many PC and Macintosh terminal emulators; TEK014 hardcopy devices such as the LN03 Laserprinter; and bit-map graphics external hardcopy devices. Also included for TEK4510 rasterizer users are a multiple copy feature, a wide line feature, and additional graphics fonts. The INCA program was developed in 1985, Version 2.04 was released in 1986, Version 3.00 was released in 1988, and Version 3.13 was released in 1989. An INCA version 2.0X conversion program is included.
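As a generic illustration of the block-diagram algebra such a package automates, the sketch below forms a unity-feedback closed loop G/(1+G) from polynomial coefficients and samples its frequency response with SciPy; the plant is an arbitrary example, and none of this reflects INCA's actual command language.

```python
# Sketch: closed-loop transfer function and frequency response, the kind of
# computation INCA performs interactively. Plant coefficients are arbitrary.
import numpy as np
from scipy import signal

num, den = [10.0], [1.0, 2.0, 0.0]      # G(s) = 10 / (s^2 + 2s)
cl_den = np.polyadd(den, num)           # unity feedback: G / (1 + G)
closed_loop = signal.TransferFunction(num, cl_den)

w = np.logspace(-1, 2, 200)
w, mag_db, phase_deg = signal.bode(closed_loop, w)
print(f"closed-loop DC gain: {mag_db[0]:.2f} dB")   # ~0 dB for this type-1 plant
```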
Systems Engineering Model and Training Application for Desktop Environment
NASA Technical Reports Server (NTRS)
May, Jeffrey T.
2010-01-01
This application provides a graphical-user-interface-based simulator for desktop training, operations, procedure development, and system reference. The simulator allows engineers to train and further understand the dynamics of their system from their local desktops. It lets users train on and evaluate their system at a pace and skill level matched to their competency, and from a perspective matched to their needs. The simulator does not require any special resources to execute and should generally be available for use. The interface is based on a concept of presenting the model of the system in the ways that best suit the user's application or training needs. The three levels of views are the Component View, the System View (overall system), and the Console View (monitor). These views are portals into a single model, so changes made to the model from one view, or from a model-manager Graphical User Interface, are reflected in all other views.
Applying Cognitive Psychology to User Interfaces
NASA Astrophysics Data System (ADS)
Durrani, Sabeen; Durrani, Qaiser S.
This paper explores some key aspects of cognitive psychology that may be mapped onto user interfaces. The major focus of existing user interface guidelines is on consistency, simplicity, feedback, system messages, display issues, navigation, colors, graphics, visibility and error prevention [8-10]. These guidelines are effective in designing user interfaces. However, they do not handle the issues that may arise from the innate structure of the human brain and from human limitations, for example, where to place graphics on the screen so that the user can easily process them, or what kind of background should be given on the screen given the limitations of the human motor system. In this paper we have collected some available guidelines from the area of cognitive psychology [1, 5, 7]. In addition, we have extracted a few guidelines from theories and studies of cognitive psychology [3, 11] which may be mapped to user interfaces.
NASA Technical Reports Server (NTRS)
Raible, E.
1994-01-01
The Panel Library and Editor is a graphical user interface (GUI) builder for the Silicon Graphics IRIS workstation family. The toolkit creates "widgets" which can be manipulated by the user. Its appearance is similar to that of the X-Windows System. The Panel Library is written in C and is used by programmers writing user-friendly mouse-driven applications for the IRIS. GUIs built using the Panel Library consist of "actuators" and "panels." Actuators are buttons, dials, sliders, or other mouse-driven symbols. Panels are groups of actuators that occupy separate windows on the IRIS workstation. The application user can alter variables in the graphics program, or fire off functions with a click on a button. The evolution of data values can be tracked with meters and strip charts, and dialog boxes with text processing can be built. Panels can be stored as icons when not in use. The Panel Editor is a program used to interactively create and test panel library interfaces in a simple and efficient way. The Panel Editor itself uses a panel library interface, so all actions are mouse driven. Extensive context-sensitive on-line help is provided. Programmers can graphically create and test the user interface without writing a single line of code. Once an interface is judged satisfactory, the Panel Editor will dump it out as a file of C code that can be used in an application. The Panel Library (v9.8) and Editor (v1.1) are written in C-Language (63%) and Scheme, a dialect of LISP, (37%) for Silicon Graphics 4D series workstations running IRIX 3.2 or higher. Approximately 10Mb of disk space is required once compiled. 1.5Mb of main memory is required to execute the panel editor. This program is available on a .25 inch streaming magnetic tape cartridge in UNIX tar format for an IRIS, and includes a copy of XScheme, the public-domain Scheme interpreter used by the Panel Editor. The Panel Library Programmer's Manual is included on the distribution media. The Panel Library and Editor were released to COSMIC in 1991. Silicon Graphics, IRIS, and IRIX are trademarks of Silicon Graphics, Inc. X-Window System is a trademark of Massachusetts Institute of Technology.
Visual Debugging of Object-Oriented Systems With the Unified Modeling Language
2004-03-01
to be "the systematic and imaginative use of the technology of interactive computer graphics and the disciplines of graphic design, typography ...". Cited works include: Graphics, volume 23, no. 6, pp. 893-901, 1999; and [SHN98] Shneiderman, B., Designing the User Interface: Strategies for Effective Human-Computer Interaction. Contents include System Design Objectives and 3.3 System Architecture.
Operator dynamics for stability condition in haptic and teleoperation system: A survey.
Li, Hongbing; Zhang, Lei; Kawashima, Kenji
2018-04-01
Currently, haptic systems ignore the varying impedance of the human hand, with its countless configurations, and thus cannot recreate complex haptic interactions. The literature does not offer a comprehensive survey of the methods proposed, and this study is an attempt to bridge this gap. The paper includes an extensive review of human arm impedance modeling and control deployed to address inherent stability and transparency issues in haptic interaction and teleoperation systems. A detailed classification and comparative study of various contributions to human arm modeling are presented and summarized in tables and diagrams. The main challenges in modeling human arm impedance for haptic robotic applications are identified. Possible future research directions are outlined based on the gaps identified in the survey. Copyright © 2018 John Wiley & Sons, Ltd.
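Most of the surveyed models reduce, at a given posture, to a second-order endpoint impedance Z(s) = m*s^2 + b*s + k. The sketch below evaluates its magnitude across frequency for two grip conditions; the parameter values are representative orders of magnitude only, not identified human data.

```python
# Sketch: second-order human-arm endpoint impedance magnitude vs frequency.
# Parameter values are representative placeholders, not identified data.
import numpy as np

def impedance_mag(m, b, k, f_hz):
    s = 1j * 2 * np.pi * np.asarray(f_hz)
    return np.abs(m * s**2 + b * s + k)

f = np.array([0.5, 2.0, 8.0, 32.0])                 # Hz
relaxed = impedance_mag(1.5, 8.0, 150.0, f)         # light grasp (assumed)
firm    = impedance_mag(1.5, 25.0, 1200.0, f)       # firm grasp (assumed)
print("relaxed:", relaxed.round(1))
print("firm:   ", firm.round(1))
# A coupled-stability analysis would insert Z(s) into the haptic loop instead
# of assuming a worst-case rigid grasp.
```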
A Haptics Symposium Retrospective: 20 Years
NASA Technical Reports Server (NTRS)
Colgate, J. Edward; Adelstein, Bernard
2012-01-01
The very first "Haptics Symposium" actually went by the name "Issues in the Development of Kinesthetic Displays for Teleoperation and Virtual Environments." The word "Haptic" didn't make it into the name until the next year. Not only was the most important word absent, but so were RFPs, journals, and commercial markets. And yet, as we prepare for the 2012 symposium, haptics is a thriving and amazingly diverse field of endeavor. In this talk we'll reflect on the origins of this field and on its evolution over the past twenty years, as well as the evolution of the Haptics Symposium itself. We hope to share with you some of the excitement we've felt along the way, and that we continue to feel as we look toward the future of our field.
Direct Visuo-Haptic 4D Volume Rendering Using Respiratory Motion Models.
Fortmeier, Dirk; Wilms, Matthias; Mastmeyer, Andre; Handels, Heinz
2015-01-01
This article presents methods for direct visuo-haptic 4D volume rendering of virtual patient models under respiratory motion. Breathing models are computed based on patient-specific 4D CT image data sequences. Virtual patient models are visualized in real-time by ray casting based rendering of a reference CT image warped by a time-variant displacement field, which is computed using the motion models at run-time. Furthermore, haptic interaction with the animated virtual patient models is provided by using the displacements computed at high rendering rates to translate the position of the haptic device into the space of the reference CT image. This concept is applied to virtual palpation and the haptic simulation of insertion of a virtual bendable needle. To this aim, different motion models that are applicable in real-time are presented and the methods are integrated into a needle puncture training simulation framework, which can be used for simulated biopsy or vessel puncture in the liver. To confirm real-time applicability, a performance analysis of the resulting framework is given. It is shown that the presented methods achieve mean update rates around 2,000 Hz for haptic simulation and interactive frame rates for volume rendering and thus are well suited for visuo-haptic rendering of virtual patients under respiratory motion.
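A minimal sketch of the position-mapping step described above, assuming a displacement field d(x, t) sampled on the reference-CT voxel grid; the function name and the trilinear interpolation are illustrative stand-ins, not the authors' implementation.

import numpy as np
from scipy.ndimage import map_coordinates

def to_reference_space(p_world, disp_field, spacing):
    """p_world: (3,) device position [mm]; disp_field: (3, X, Y, Z) field [mm]."""
    voxel = p_world / spacing                       # world -> voxel coordinates
    d = np.array([map_coordinates(disp_field[i], voxel.reshape(3, 1), order=1)[0]
                  for i in range(3)])               # trilinear interpolation
    return p_world - d                              # undo the breathing motion

# toy example: a constant 5 mm shift along the third axis
field = np.zeros((3, 64, 64, 64)); field[2] += 5.0
print(to_reference_space(np.array([30.0, 30.0, 30.0]), field, np.array([1.0, 1.0, 1.0])))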
Automation in the graphic arts
NASA Astrophysics Data System (ADS)
Truszkowski, Walt
1995-04-01
The CHIMES (Computer-Human Interaction Models) tool was designed to help solve a simply stated but important problem, i.e., the problem of generating a user interface to a system that complies with established human factors standards and guidelines. Though designed for use in a fairly restricted user domain, i.e., spacecraft mission operations, the CHIMES system is essentially domain independent and applicable wherever graphical user interfaces or displays are encountered. The CHIMES philosophy and operating strategy are quite simple. Instead of requiring a human designer to actively maintain in his or her head the now encyclopedic knowledge that human factors and user interface specialists have evolved, CHIMES incorporates this information in its knowledge bases. When directed to evaluate a design, CHIMES determines and accesses the appropriate knowledge, performs an evaluation of the design against that information, determines whether the design is compliant with the selected guidelines, and suggests corrective actions if deviations from the guidelines are discovered. This paper provides an overview of the capabilities of the current CHIMES tool and discusses the potential integration of CHIMES-like technology in automated graphic arts systems.
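The evaluate-and-suggest loop that CHIMES performs can be sketched as a table of guideline predicates with attached corrective advice. The rules and design fields below are invented placeholders, not the contents of the CHIMES knowledge bases.

design = {"font_pt": 8, "fg": (255, 255, 0), "bg": (255, 255, 255)}

def contrast(fg, bg):                 # crude luminance difference, illustration only
    lum = lambda c: 0.299 * c[0] + 0.587 * c[1] + 0.114 * c[2]
    return abs(lum(fg) - lum(bg))

rules = [
    ("minimum readable font size", lambda d: d["font_pt"] >= 10,
     "increase font to at least 10 pt"),
    ("sufficient text/background contrast", lambda d: contrast(d["fg"], d["bg"]) > 90,
     "choose colours with higher luminance contrast"),
]

for name, check, fix in rules:
    if not check(design):
        print(f"guideline violated: {name} -> {fix}")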
NASA Technical Reports Server (NTRS)
Szczur, Martha R.
1991-01-01
The Transportable Applications Environment (TAE) Plus, developed at NASA's Goddard Space Flight Center, is an advanced portable user interface development environment which simplifies the process of creating and managing complex application graphical user interfaces (GUIs), supports prototyping, allows applications to be ported easily between different platforms, and encourages appropriate levels of user interface consistency between applications. This paper discusses the capabilities of the TAE Plus tool, and how it makes the job of designing and developing GUIs easier for application developers. The paper also explains how tools like TAE Plus provide for reusability and ensure reliability of UI software components, as well as how they aid in the reduction of development and maintenance costs.
An Intuitive Graphical User Interface for Small UAS
2013-05-01
Indexed snippets only (full abstract not recovered): the number of operators is reduced from two to one; the stock displays, including video with text overlay on one screen and FalconView on the other, are replaced with a single graphics-based display. Tactical UAVs such as the Raven, Puma and Wasp are often used by dismounted warfighters on missions that require a high degree of mobility by the operators on the ground. The current ground control stations (GCS) for the Wasp, Raven and Puma tactical UAVs require two people and two user ...
Özcan, Alpay; Christoforou, Eftychios; Brown, Daniel; Tsekos, Nikolaos
2011-01-01
The graphical user interface for an MR-compatible robotic device can display oblique MR slices in 2D and in a 3D virtual environment, together with a representation of the robotic arm, so that the intervention can be completed swiftly. By exploiting the advantages of the MR modality, the device saves time and effort, is safer for the medical staff, and is more comfortable for the patient. PMID:17946067
Menon, Samir; Quigley, Paul; Yu, Michelle; Khatib, Oussama
2014-01-01
Neuroimaging artifacts in haptic functional magnetic resonance imaging (Haptic fMRI) experiments have the potential to induce spurious fMRI activation where there is none, or to make neural activation measurements appear correlated across brain regions when they are actually not. Here, we demonstrate that performing three-dimensional goal-directed reaching motions while operating Haptic fMRI Interface (HFI) does not create confounding motion artifacts. To test for artifacts, we simultaneously scanned a subject's brain with a customized soft phantom placed a few centimeters away from the subject's left motor cortex. The phantom captured task-related motion and haptic noise, but did not contain associated neural activation measurements. We quantified the task-related information present in fMRI measurements taken from the brain and the phantom by using a linear max-margin classifier to predict whether raw time series data could differentiate between motion planning or reaching. fMRI measurements in the phantom were uninformative (2σ, 45-73%; chance=50%), while those in primary motor, visual, and somatosensory cortex accurately classified task-conditions (2σ, 90-96%). We also localized artifacts due to the haptic interface alone by scanning a stand-alone fBIRN phantom, while an operator performed haptic tasks outside the scanner's bore with the interface at the same location. The stand-alone phantom had lower temporal noise and had similar mean classification but a tighter distribution (bootstrap Gaussian fit) than the brain phantom. Our results suggest that any fMRI measurement artifacts for Haptic fMRI reaching experiments are dominated by actual neural responses.
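The artifact test reduces to asking whether a linear max-margin classifier can predict the task condition from raw time-series windows. A sketch on synthetic data follows, with scikit-learn's LinearSVC standing in for the authors' classifier; the signal magnitudes are arbitrary.

import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_timepoints = 80, 20
labels = np.repeat([0, 1], n_trials // 2)            # 0 = planning, 1 = reaching

phantom = rng.normal(size=(n_trials, n_timepoints))  # no task information
motor = rng.normal(size=(n_trials, n_timepoints))
motor[labels == 1] += 0.8                            # task-locked signal change

for name, X in [("phantom", phantom), ("motor cortex", motor)]:
    acc = cross_val_score(LinearSVC(max_iter=10000), X, labels, cv=5).mean()
    print(f"{name}: {acc:.2f} (chance = 0.50)")      # phantom should stay near chance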
Evaluation of Motor Control Using Haptic Device
NASA Astrophysics Data System (ADS)
Nuruki, Atsuo; Kawabata, Takuro; Shimozono, Tomoyuki; Yamada, Masafumi; Yunokuchi, Kazutomo
When kinesthesia and touch act at the same time, the resulting perception is called haptic perception. This sense plays a key role in motor information for force and position control. Haptic perception is important in fields where evaluation of motor control is needed. The purpose of this paper is to evaluate motor control, i.e., the perception of heaviness and distance, in normal and fatigued conditions using psychophysical experiments. We used a haptic device in order to generate precise forces and distances, but there is little precedent for an evaluation system built on a haptic device. Therefore, a further purpose is to examine whether the haptic device is useful as an evaluation system for motor control. The psychophysical quantities of force and distance were measured in two kinds of experiments. Eight healthy subjects participated in this study. The stimulation was presented by a haptic device [PHANTOM Omni: SensAble Company]. The subjects compared standard and test stimulations and reported which stimulation felt stronger. For the psychophysical quantity of force, the just noticeable difference (JND) showed a significant difference, while the point of subjective equality (PSE) did not differ between the normal and muscle-fatigue conditions. On the other hand, for the psychophysical quantity of distance, neither JND nor PSE differed between the normal and muscle-fatigue conditions. These results show that control of force, but not control of distance, was influenced by muscle fatigue. Moreover, they suggest that the haptic device is useful as an evaluation system for motor control.
Mazella, Anaïs; Albaret, Jean-Michel; Picard, Delphine
2016-01-01
To fill an important gap in the psychometric assessment of children and adolescents with impaired vision, we designed a new battery of haptic tests, called Haptic-2D, for visually impaired and sighted individuals aged five to 18 years. Unlike existing batteries, ours uses only two-dimensional raised materials that participants explore using active touch. It is composed of 11 haptic tests, measuring scanning skills, tactile discrimination skills, spatial comprehension skills, short-term tactile memory, and comprehension of tactile pictures. We administered this battery to 138 participants, half of whom were sighted (n=69), and half visually impaired (blind, n=16; low vision, n=53). Results indicated a significant main effect of age on haptic scores, but no main effect of vision or Age × Vision interaction effect. Reliability of test items was satisfactory (Cronbach's alpha, α=0.51-0.84). Convergent validity was good, as shown by a significant correlation (age partialled out) between total haptic scores and scores on the B101 test (rp=0.51, n=47). Discriminant validity was also satisfactory, as attested by a lower but still significant partial correlation between total haptic scores and the raw score on the verbal WISC (rp=0.43, n=62). Finally, test-retest reliability was good (rs=0.93, n=12; interval of one to two months). This new psychometric tool should prove useful to practitioners working with young people with impaired vision. Copyright © 2015 Elsevier Ltd. All rights reserved.
Performance evaluation of a robot-assisted catheter operating system with haptic feedback.
Song, Yu; Guo, Shuxiang; Yin, Xuanchun; Zhang, Linshuai; Hirata, Hideyuki; Ishihara, Hidenori; Tamiya, Takashi
2018-06-20
In this paper, a novel robot-assisted catheter operating system (RCOS) is proposed as a method to reduce physical stress and X-ray exposure time for physicians during endovascular procedures. The unique design of this system allows the physician to apply conventional bedside catheterization skills (advance, retreat and rotate) to an input catheter, which is placed at the master side to control another patient catheter placed at the slave side. For this purpose, a magnetorheological (MR) fluids-based master haptic interface has been developed to measure the axial and radial motions of the input catheter, as well as to provide haptic feedback to the physician during the operation. In order to achieve a quick response of the haptic force in the master haptic interface, a hall sensor-based closed-loop control strategy is employed. On the slave side, a catheter manipulator delivers the patient catheter according to position commands received from the master haptic interface. The contact forces between the patient catheter and the blood vessel system are measured by the force sensor unit of the catheter manipulator. Four levels of haptic force are provided to make the operator aware of the resistance encountered by the patient catheter during the insertion procedure. The catheter manipulator was evaluated for positioning precision, and the time lag from sensed motion to replicated motion was tested. To verify the efficacy of the proposed haptic feedback method, in vitro evaluation experiments were carried out. The results demonstrate that the proposed system can decrease the contact forces between the catheter and the vasculature.
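A rough sketch of the four-level haptic mapping and a proportional closed loop on the master interface follows; all thresholds and gains are invented for illustration, and the paper's actual hall-sensor control law is not reproduced here.

def haptic_level(contact_force_n, thresholds=(0.2, 0.5, 1.0)):
    """Map a measured contact force [N] to one of four discrete haptic levels."""
    return sum(contact_force_n > t for t in thresholds)   # 0..3

def brake_current(level, measured_force_n, k_p=0.5, level_force=(0.0, 0.3, 0.8, 1.5)):
    """Proportional closed loop toward the target force of the chosen level."""
    target = level_force[level]
    return max(0.0, k_p * (target - measured_force_n))    # coil current command [A]

lvl = haptic_level(0.7)                                   # -> level 2
print(lvl, brake_current(lvl, measured_force_n=0.6))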
Fundamentals of computer graphics for artists and designers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riley, B.A.
1986-01-01
This tutorial provides introductory information about computer graphics slanted towards novice users from artist/designer backgrounds. The goal is to describe the applications and terminology sufficiently to provide a base of knowledge for discussions with vendors.
Profex: a graphical user interface for the Rietveld refinement program BGMN.
Doebelin, Nicola; Kleeberg, Reinhard
2015-10-01
Profex is a graphical user interface for the Rietveld refinement program BGMN . Its interface focuses on preserving BGMN 's powerful and flexible scripting features by giving direct access to BGMN input files. Very efficient workflows for single or batch refinements are achieved by managing refinement control files and structure files, by providing dialogues and shortcuts for many operations, by performing operations in the background, and by providing import filters for CIF and XML crystal structure files. Refinement results can be easily exported for further processing. State-of-the-art graphical export of diffraction patterns to pixel and vector graphics formats allows the creation of publication-quality graphs with minimum effort. Profex reads and converts a variety of proprietary raw data formats and is thus largely instrument independent. Profex and BGMN are available under an open-source license for Windows, Linux and OS X operating systems.
Investigations into haptic space and haptic perception of shape for active touch
NASA Astrophysics Data System (ADS)
Sanders, A. F. J.
2008-12-01
This thesis presents a number of psychophysical investigations into haptic space and haptic perception of shape. Haptic perception is understood to include the two subsystems of the cutaneous sense and kinesthesis. Chapter 2 provides an extensive quantitative study into haptic perception of curvature. I investigated bimanual curvature discrimination of cylindrically curved, hand-sized surfaces. I found that discrimination thresholds were in the same range as unimanual thresholds reported in previous studies. Moreover, the distance between the surfaces or the position of the setup with respect to the observer had no effect on thresholds. Finally, I found idiosyncratic biases: A number of observers judged two surfaces that had different radii as equally curved. Biases were of the same order of magnitude as thresholds. In Chapter 3, I investigated haptic space. Here, haptic space is understood to be (1) the set of observer’s judgments of spatial relations in physical space, and (2) a set of constraints by which these judgments are internally consistent. I asked blindfolded observers to construct straight lines in a number of different tasks. I show that the shape of the haptically straight line depends on the task used to produce it. I therefore conclude that there is no unique definition of the haptically straight line and that doubts are cast on the usefulness of the concept of haptic space. In Chapter 4, I present a new experiment into haptic length perception. I show that when observers trace curved pathways with their index finger and judge distance traversed, their distance estimates depend on the geometry of the paths: Lengths of convex, cylindrically curved pathways were overestimated and lengths of concave pathways were underestimated. In addition, I show that a kinematic mechanism must underlie this interaction: (1) the geometry of the path traced by the finger affects movement speed and consequently movement time, and (2) movement time is taken as a measure of traversed length. The study presented in Chapter 5 addresses the question of how kinematic properties of exploratory movements affect perceived shape. I identify a kinematic invariant for the case of a single finger moving across cylindrically curved strips under conditions of slip. I found that the rotation angle of the finger increased linearly with the curvature of the stimulus. In addition, I show that observers took rotation angle as their primary measure of perceived curvature: Observers rotated their finger less on a concave curvature by a constant amount, and consequently, they overestimated the radius of the concave strips compared to the convex ones. Finally, in Chapter 6, I investigated the haptic filled-space illusion for dynamic touch: Observers move their fingertip across an unfilled extent or an extent filled with intermediate stimulations. Previous researchers have reported lengths of filled extents to be overestimated, but the parameters affecting the strength of the illusion are still largely unknown. Factors investigated in this chapter include end point effects, filler density and overall average movement speed.
How Information Visualization Systems Change Users' Understandings of Complex Data
ERIC Educational Resources Information Center
Allendoerfer, Kenneth Robert
2009-01-01
User-centered evaluations of information systems often focus on the usability of the system rather than its usefulness. This study examined how using an interactive knowledge-domain visualization (KDV) system affected users' understanding of a domain. Interactive KDVs allow users to create graphical representations of domains that depict important…
Haptic Glove Technology: Skill Development through Video Game Play
ERIC Educational Resources Information Center
Bargerhuff, Mary Ellen; Cowan, Heidi; Oliveira, Francisco; Quek, Francis; Fang, Bing
2010-01-01
This article introduces a recently developed haptic glove system and describes how the participants used a video game that was purposely designed to train them in skills that are needed for the efficient use of the haptic glove. Assessed skills included speed, efficiency, embodied skill, and engagement. The findings and implications for future…
The Role of Visual Experience on the Representation and Updating of Novel Haptic Scenes
ERIC Educational Resources Information Center
Pasqualotto, Achille; Newell, Fiona N.
2007-01-01
We investigated the role of visual experience on the spatial representation and updating of haptic scenes by comparing recognition performance across sighted, congenitally and late blind participants. We first established that spatial updating occurs in sighted individuals to haptic scenes of novel objects. All participants were required to…
Effect of Auditory Interference on Memory of Haptic Perceptions.
ERIC Educational Resources Information Center
Anater, Paul F.
1980-01-01
The effect of auditory interference on the processing of haptic information by 61 visually impaired students (8 to 20 years old) was the focus of the research described in this article. It was assumed that as the auditory interference approximated the verbalized activity of the haptic task, accuracy of recall would decline. (Author)
Investigating Students' Ideas about Buoyancy and the Influence of Haptic Feedback
ERIC Educational Resources Information Center
Minogue, James; Borland, David
2016-01-01
While haptics (simulated touch) represents a potential breakthrough technology for science teaching and learning, there is relatively little research into its differential impact in the context of teaching and learning. This paper describes the testing of a haptically enhanced simulation (HES) for learning about buoyancy. Despite a lifetime of…
Superior haptic-to-visual shape matching in autism spectrum disorders.
Nakano, Tamami; Kato, Nobumasa; Kitazawa, Shigeru
2012-04-01
A weak central coherence theory in autism spectrum disorder (ASD) proposes that a cognitive bias toward local processing in ASD derives from a weakness in integrating local elements into a coherent whole. Using this theory, we hypothesized that shape perception through active touch, which requires sequential integration of sensorimotor traces of exploratory finger movements into a shape representation, would be impaired in ASD. Contrary to our expectation, adults with ASD showed superior performance in a haptic-to-visual delayed shape-matching task compared to adults without ASD. Accuracy in discriminating haptic lengths or haptic orientations, which lies within the somatosensory modality, did not differ between adults with ASD and adults without ASD. Moreover, this superior ability in inter-modal haptic-to-visual shape matching was not explained by the score in a unimodal visuospatial rotation task. These results suggest that individuals with ASD are not impaired in integrating sensorimotor traces into a global visual shape and that their multimodal shape representations and haptic-to-visual information transfer are more accurate than those of individuals without ASD. Copyright © 2012 Elsevier Ltd. All rights reserved.
Haptic Recreation of Elbow Spasticity
Kim, Jonghyun; Damiano, Diane L.
2013-01-01
The aim of this paper is to develop a haptic device capable of presenting standardized recreation of elbow spasticity. Using the haptic device, clinicians will be able to repeatedly practice the assessment of spasticity without requiring patient involvement, and these practice opportunities will help improve accuracy and reliability of the assessment itself. Haptic elbow spasticity simulator (HESS) was designed and prototyped according to mechanical requirements to recreate the feel of elbow spasticity. Based on the data collected from subjects with elbow spasticity, a mathematical model representing elbow spasticity is proposed. As an attempt to differentiate the feel of each score in Modified Ashworth Scale (MAS), parameters of the model were obtained respectively for three different MAS scores 1, 1+, and 2. The implemented haptic recreation was evaluated by experienced clinicians who were asked to give MAS scores by manipulating the haptic device. The clinicians who participated in the study were blinded to each other’s scores and to the given models. They distinguished the three models and the MAS scores given to the recreated models matched 100% with the original MAS scores from the patients. PMID:22275660
Control of a haptic gear shifting assistance device utilizing a magnetorheological clutch
NASA Astrophysics Data System (ADS)
Han, Young-Min; Choi, Seung-Bok
2014-10-01
This paper proposes a haptic-clutch-driven gear-shifting assistance device that helps the driver shift the gears of a transmission system. To achieve this goal, a magnetorheological (MR) fluid-based clutch is devised to accommodate the rotary motion of the accelerator pedal into which the MR clutch is integrated. The proposed MR clutch is then manufactured, and its transmission torque is experimentally evaluated according to the magnetic field intensity. The manufactured MR clutch is integrated with the accelerator pedal to transmit a haptic cue signal to the driver. The central control issue is to cue the driver to shift the gear via the haptic force. Therefore, a gear-shifting decision algorithm is constructed by considering the vehicle engine speed together with engine combustion dynamics, vehicle dynamics and driving resistance. The algorithm is then integrated with a compensation strategy for attaining the desired haptic force. In this work, the compensator is developed and implemented through a discrete version of the inverse hysteretic model. The control performances, such as the haptic force tracking responses and fuel consumption, are experimentally evaluated.
Torque Measurement of 3-DOF Haptic Master Operated by Controllable Electrorheological Fluid
NASA Astrophysics Data System (ADS)
Oh, Jong-Seok; Choi, Seung-Bok; Lee, Yang-Sub
2015-02-01
This work presents a torque measurement method of 3-degree-of-freedom (3-DOF) haptic master featuring controllable electrorheological (ER) fluid. In order to reflect the sense of an organ for a surgeon, the ER haptic master which can generate the repulsive torque of an organ is utilized as a remote controller for a surgery robot. Since accurate representation of organ feeling is essential for the success of the robot-assisted surgery, it is indispensable to develop a proper torque measurement method of 3-DOF ER haptic master. After describing the structural configuration of the haptic master, the torque models of ER spherical joint are mathematically derived based on the Bingham model of ER fluid. A new type of haptic device which has pitching, rolling, and yawing motions is then designed and manufactured using a spherical joint mechanism. Subsequently, the field-dependent parameters of the Bingham model are identified and generating repulsive torque according to applied electric field is measured. In addition, in order to verify the effectiveness of the proposed torque model, a comparative work between simulated and measured torques is undertaken.
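For orientation, a field-dependent Bingham model of the kind referred to above is commonly written as

\tau = \tau_y(E)\,\operatorname{sgn}(\dot{\gamma}) + \eta\,\dot{\gamma}, \qquad \tau_y(E) = \alpha E^{\beta},

where \tau_y is the field-dependent yield stress, \eta the plastic viscosity, \dot{\gamma} the shear rate, and \alpha and \beta are the experimentally identified field-dependent parameters; the joint torque then follows by integrating the moment of the shear stress over the electrode surface, T = \int_A r\,\tau\,\mathrm{d}A. This is one plausible form shown for illustration; the paper's exact derivation for the spherical joint is not reproduced here.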
NASA Technical Reports Server (NTRS)
Szczur, Martha R.
1993-01-01
The Transportable Applications Environment (TAE) Plus, developed at NASA's Goddard Space Flight Center, is an advanced portable user interface development environment which simplifies the process of creating and managing complex application graphical user interfaces (GUIs). TAE Plus supports the rapid prototyping of GUIs and allows applications to be ported easily between different platforms. This paper discusses the capabilities of the TAE Plus tool, and how it makes the job of designing and developing GUIs easier for application developers. TAE Plus is being applied to many types of applications, and this paper discusses how it has been used both within and outside NASA.
NASA Technical Reports Server (NTRS)
Laudeman, Irene V.; Brasil, Connie L.; Stassart, Philippe
1998-01-01
The Planview Graphical User Interface (PGUI) is the primary display of air traffic for the Conflict Prediction and Trial Planning function of the Center TRACON Automation System. The PGUI displays air traffic information that assists the user in making decisions related to conflict detection, conflict resolution, and traffic flow management. The intent of this document is to outline the human factors issues related to the design of the conflict prediction and trial planning portions of the PGUI, document all human factors related design changes made to the PGUI from December 1996 to September 1997, and outline future plans for the ongoing PGUI design.
Computer-Aided System Engineering and Analysis (CASE/A) Programmer's Manual, Version 5.0
NASA Technical Reports Server (NTRS)
Knox, J. C.
1996-01-01
The Computer Aided System Engineering and Analysis (CASE/A) Version 5.0 Programmer's Manual provides the programmer and user with information regarding the internal structure of the CASE/A 5.0 software system. CASE/A 5.0 is a trade study tool that provides modeling/simulation capabilities for analyzing environmental control and life support systems and active thermal control systems. CASE/A has been successfully used in studies such as the evaluation of carbon dioxide removal in the space station. CASE/A modeling provides a graphical and command-driven interface for the user. This interface allows the user to construct a model by placing equipment components in a graphical layout of the system hardware, then connect the components via flow streams and define their operating parameters. Once the equipment is placed, the simulation time and other control parameters can be set to run the simulation based on the model constructed. After completion of the simulation, graphical plots or text files can be obtained for evaluation of the simulation results over time. Additionally, users have the capability to control the simulation and extract information at various times in the simulation (e.g., control equipment operating parameters over the simulation time or extract plot data) by using "User Operations (OPS) Code." This OPS code is written in FORTRAN with a canned set of utility subroutines for performing common tasks. CASE/A version 5.0 software runs under the VAX VMS(Trademark) environment. It utilizes the Tektronix 4014(Trademark) graphics display system and the VT100(Trademark) text manipulation/display system.
IDEA: Interactive Display for Evolutionary Analyses.
Egan, Amy; Mahurkar, Anup; Crabtree, Jonathan; Badger, Jonathan H; Carlton, Jane M; Silva, Joana C
2008-12-08
The availability of complete genomic sequences for hundreds of organisms promises to make obtaining genome-wide estimates of substitution rates, selective constraints and other molecular evolution variables of interest an increasingly important approach to addressing broad evolutionary questions. Two of the programs most widely used for this purpose are codeml and baseml, parts of the PAML (Phylogenetic Analysis by Maximum Likelihood) suite. A significant drawback of these programs is their lack of a graphical user interface, which can limit their user base and considerably reduce their efficiency. We have developed IDEA (Interactive Display for Evolutionary Analyses), an intuitive graphical input and output interface which interacts with PHYLIP for phylogeny reconstruction and with codeml and baseml for molecular evolution analyses. IDEA's graphical input and visualization interfaces eliminate the need to edit and parse text input and output files, reducing the likelihood of errors and improving processing time. Further, its interactive output display gives the user immediate access to results. Finally, IDEA can process data in parallel on a local machine or computing grid, allowing genome-wide analyses to be completed quickly. IDEA provides a graphical user interface that allows the user to follow a codeml or baseml analysis from parameter input through to the exploration of results. Novel options streamline the analysis process, and post-analysis visualization of phylogenies, evolutionary rates and selective constraint along protein sequences simplifies the interpretation of results. The integration of these functions into a single tool eliminates the need for lengthy data handling and parsing, significantly expediting access to global patterns in the data.
NASA Astrophysics Data System (ADS)
Le, Minh Tuan; Nguyen, Congdu; Yoon, Dae-Il; Jung, Eun Ku; Kim, Hae-Kwang
2007-12-01
In this paper, we introduce a graphics-to-Scalable Vector Graphics (SVG) adaptation framework with a mechanism for vector graphics transmission, to overcome the shortcomings in real-time representation and interaction of 3D graphics applications running on mobile devices. We have developed an interactive 3D visualization system based on the proposed framework for rapidly representing a 3D scene on mobile devices without having to download it from the server. Our system scenario is composed of a client viewer and a graphics-to-SVG adaptation server. The client viewer allows the user to access the same 3D content from different devices according to consumer interactions.
Gurari, Netta; Baud-Bovy, Gabriel
2014-09-30
The emergence of commercial haptic devices offers new research opportunities to enhance our understanding of the human sensory-motor system. Yet, commercial device capabilities have limitations which need to be addressed. This paper describes the customization of a commercial force feedback device for displaying forces with a precision that exceeds the human force perception threshold. The device was outfitted with a multi-axis force sensor and closed-loop controlled to improve its transparency. Additionally, two force sensing resistors were attached to the device to measure grip force. Force errors were modeled in the frequency- and time-domain to identify contributions from the mass, viscous friction, and Coulomb friction during open- and closed-loop control. The effect of user interaction on system stability was assessed in the context of a user study which aimed to measure force perceptual thresholds. Findings based on 15 participants demonstrate that the system maintains stability when rendering forces ranging from 0-0.20 N, with an average maximum absolute force error of 0.041 ± 0.013 N. Modeling the force errors revealed that Coulomb friction and inertia were the main contributors to force distortions during respectively slow and fast motions. Existing commercial force feedback devices cannot render forces with the required precision for certain testing scenarios. Building on existing robotics work, this paper shows how a device can be customized to make it reliable for studying the perception of weak forces. The customized and closed-loop controlled device is suitable for measuring force perceptual thresholds. Copyright © 2014 Elsevier B.V. All rights reserved.
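The time-domain error model named above can be written as F_err = m a + b v + f_c sgn(v). A brief sketch of estimating the three contributions by least squares on synthetic data follows; all coefficients are invented for the example.

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 2, 2000)
v = 0.3 * np.sin(2 * np.pi * 1.5 * t)                 # stylus velocity [m/s]
a = np.gradient(v, t)                                 # acceleration [m/s^2]
f_err = 0.08 * a + 0.4 * v + 0.05 * np.sign(v) + rng.normal(0, 0.005, t.size)

X = np.column_stack([a, v, np.sign(v)])
m, b, fc = np.linalg.lstsq(X, f_err, rcond=None)[0]   # least-squares estimates
print(f"mass {m:.3f} kg, viscous {b:.3f} N·s/m, Coulomb {fc:.3f} N")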
Development of haptic system for surgical robot
NASA Astrophysics Data System (ADS)
Gang, Han Gyeol; Park, Jiong Min; Choi, Seung-Bok; Sohn, Jung Woo
2017-04-01
In this paper, a new type of haptic system for surgical robot application is proposed and its performances are evaluated experimentally. The proposed haptic system consists of an effective master device and a precision slave robot. The master device has 3-DOF rotational motion as same as human wrist motion. It has lightweight structure with a gyro sensor and three small-sized MR brakes for position measurement and repulsive torque generation, respectively. The slave robot has 3-DOF rotational motion using servomotors, five bar linkage and a torque sensor is used to measure resistive torque. It has been experimentally demonstrated that the proposed haptic system has good performances on tracking control of desired position and repulsive torque. It can be concluded that the proposed haptic system can be effectively applied to the surgical robot system in real field.
fMRI-Compatible Electromagnetic Haptic Interface.
Riener, R; Villgrattner, T; Kleiser, R; Nef, T; Kollias, S
2005-01-01
A new haptic interface device is suggested, which can be used for functional magnetic resonance imaging (fMRI) studies. The basic components of this 1-DOF haptic device are two coils that produce a Lorentz force induced by the large static magnetic field of the MR scanner. An MR-compatible optical angular encoder and an optical force sensor enable the implementation of different control architectures for haptic interactions. The challenge was to provide a large torque while not affecting image quality with the currents applied in the device. The haptic device was tested in a 3T MR scanner. With a current of up to 1A and a distance of 1m to the focal point of the MR scanner it was possible to generate torques of up to 4 Nm. Within these boundaries image quality was not affected.
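The actuation principle is standard electromagnetics; for orientation (not the authors' derivation):

\vec{F} = I \oint \mathrm{d}\vec{l} \times \vec{B}, \qquad \vec{m} = n I A\,\hat{n}, \qquad \vec{\tau} = \vec{m} \times \vec{B},

for a coil of n turns, area A and current I in the local field \vec{B}. The reported 4 Nm at I = 1 A thus implies an effective n A B \sin\theta of roughly 4 N·m/A at the device's position in the scanner's field.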
Flexible Environmental Modeling with Python and Open - GIS
NASA Astrophysics Data System (ADS)
Pryet, Alexandre; Atteia, Olivier; Delottier, Hugo; Cousquer, Yohann
2015-04-01
Numerical modeling now represents a prominent task of environmental studies. During the last decades, numerous commercial programs have been made available to environmental modelers. These software applications offer user-friendly graphical user interfaces that allow an efficient management of many case studies. However, they suffer from a lack of flexibility and closed-source policies impede source code reviewing and enhancement for original studies. Advanced modeling studies require flexible tools capable of managing thousands of model runs for parameter optimization, uncertainty and sensitivity analysis. In addition, there is a growing need for the coupling of various numerical models associating, for instance, groundwater flow modeling to multi-species geochemical reactions. Researchers have produced hundreds of open-source powerful command line programs. However, there is a need for a flexible graphical user interface allowing an efficient processing of geospatial data that comes along any environmental study. Here, we present the advantages of using the free and open-source Qgis platform and the Python scripting language for conducting environmental modeling studies. The interactive graphical user interface is first used for the visualization and pre-processing of input geospatial datasets. Python scripting language is then employed for further input data processing, call to one or several models, and post-processing of model outputs. Model results are eventually sent back to the GIS program, processed and visualized. This approach combines the advantages of interactive graphical interfaces and the flexibility of Python scripting language for data processing and model calls. The numerous python modules available facilitate geospatial data processing and numerical analysis of model outputs. Once input data has been prepared with the graphical user interface, models may be run thousands of times from the command line with sequential or parallel calls. We illustrate this approach with several case studies in groundwater hydrology and geochemistry and provide links to several python libraries that facilitate pre- and post-processing operations.
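The batch half of the described workflow can be sketched in a few lines of Python; the model executable "gwflow" and its flags are hypothetical placeholders for whatever command-line model a study uses.

import itertools
import subprocess
from concurrent.futures import ProcessPoolExecutor

def run_model(params):
    k, recharge = params
    out = f"run_k{k}_r{recharge}.out"
    subprocess.run(["gwflow", "--conductivity", str(k),      # hypothetical CLI
                    "--recharge", str(recharge), "--output", out], check=True)
    return out

# parameter grid prepared from GIS inputs; runs dispatched in parallel
grid = list(itertools.product([1e-5, 1e-4, 1e-3], [50, 100, 200]))
with ProcessPoolExecutor() as pool:
    results = list(pool.map(run_model, grid))
print(f"{len(results)} runs completed")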
The effect of haptic guidance and visual feedback on learning a complex tennis task.
Marchal-Crespo, Laura; van Raai, Mark; Rauter, Georg; Wolf, Peter; Riener, Robert
2013-11-01
While haptic guidance can improve ongoing performance of a motor task, several studies have found that it ultimately impairs motor learning. However, some recent studies suggest that the haptic demonstration of optimal timing, rather than movement magnitude, enhances learning in subjects trained with haptic guidance. Timing of an action plays a crucial role in the proper accomplishment of many motor skills, such as hitting a moving object (discrete timing task) or learning a velocity profile (time-critical tracking task). The aim of the present study is to evaluate which feedback conditions-visual or haptic guidance-optimize learning of the discrete and continuous elements of a timing task. The experiment consisted in performing a fast tennis forehand stroke in a virtual environment. A tendon-based parallel robot connected to the end of a racket was used to apply haptic guidance during training. In two different experiments, we evaluated which feedback condition was more adequate for learning: (1) a time-dependent discrete task-learning to start a tennis stroke and (2) a tracking task-learning to follow a velocity profile. The effect that the task difficulty and subject's initial skill level have on the selection of the optimal training condition was further evaluated. Results showed that the training condition that maximizes learning of the discrete time-dependent motor task depends on the subjects' initial skill level. Haptic guidance was especially suitable for less-skilled subjects and in especially difficult discrete tasks, while visual feedback seems to benefit more skilled subjects. Additionally, haptic guidance seemed to promote learning in a time-critical tracking task, while visual feedback tended to deteriorate the performance independently of the task difficulty and subjects' initial skill level. Haptic guidance outperformed visual feedback, although additional studies are needed to further analyze the effect of other types of feedback visualization on motor learning of time-critical tasks.
A user's guide for DTIZE an interactive digitizing and graphical editing computer program
NASA Technical Reports Server (NTRS)
Thomas, C. C.
1981-01-01
A guide for DTIZE, a two-dimensional digitizing program with graphical editing capability, is presented. DTIZE provides the capability to simultaneously create and display a picture on the display screen. Data descriptions may be permanently saved in three different formats. DTIZE creates the picture graphics in locator mode, inputting one coordinate each time the terminator button is pushed. Graphic input (GIN) devices are also used to select commands from a function menu. These menu commands and the program's interactive prompting sequences provide a complete capability for creating, editing, and permanently recording a graphical picture file. DTIZE is written in FORTRAN IV for the Tektronix 4081 graphic system utilizing the Plot 80 Distributed Graphics Library (DGL) subroutines. The Tektronix 4953/4954 Graphic Tablet with mouse, pen, or joystick is used as the graphics input device to create picture graphics.
Haptic Cues for Balance: Use of a Cane Provides Immediate Body Stabilization
Sozzi, Stefania; Crisafulli, Oscar; Schieppati, Marco
2017-01-01
Haptic cues are important for balance. Knowledge of the temporal features of their effect may be crucial for the design of neural prostheses. Touching a stable surface with a fingertip reduces body sway in standing subjects eyes closed (EC), and removal of haptic cue reinstates a large sway pattern. Changes in sway occur rapidly on changing haptic conditions. Here, we describe the effects and time-course of stabilization produced by a haptic cue derived from a walking cane. We intended to confirm that cane use reduces body sway, to evaluate the effect of vision on stabilization by a cane, and to estimate the delay of the changes in body sway after addition and withdrawal of haptic input. Seventeen healthy young subjects stood in tandem position on a force platform, with eyes closed or open (EO). They gently lowered the cane onto and lifted it from a second force platform. Sixty trials per direction of haptic shift (Touch → NoTouch, T-NT; NoTouch → Touch, NT-T) and visual condition (EC-EO) were acquired. Traces of Center of foot Pressure (CoP) and the force exerted by cane were filtered, rectified, and averaged. The position in space of a reflective marker positioned on the cane tip was also acquired by an optoelectronic device. Cross-correlation (CC) analysis was performed between traces of cane tip and CoP displacement. Latencies of changes in CoP oscillation in the frontal plane EC following the T-NT and NT-T haptic shift were statistically estimated. The CoP oscillations were larger in EC than EO under both T and NT (p < 0.001) and larger during NT than T conditions (p < 0.001). Haptic-induced effect under EC (Romberg quotient NT/T ~ 1.2) was less effective than that of vision under NT condition (EC/EO ~ 1.5) (p < 0.001). With EO cane had little effect. Cane displacement lagged CoP displacement under both EC and EO. Latencies to changes in CoP oscillations were longer after addition (NT-T, about 1.6 s) than withdrawal (T-NT, about 0.9 s) of haptic input (p < 0.001). These latencies were similar to those occurring on fingertip touch, as previously shown. Overall, data speak in favor of substantial equivalence of the haptic information derived from both “direct” fingertip contact and “indirect” contact with the floor mediated by the cane. Cane, finger and visual inputs would be similarly integrated in the same neural centers for balance control. Haptic input from a walking aid and its processing time should be considered when designing prostheses for locomotion. PMID:29311785
Multiple reference frames in haptic spatial processing
NASA Astrophysics Data System (ADS)
Volčič, R.
2008-08-01
The present thesis focused on haptic spatial processing. In particular, our interest was directed to the perception of spatial relations with the main focus on the perception of orientation. To this end, we studied haptic perception in different tasks, either in isolation or in combination with vision. The parallelity task, where participants have to match the orientations of two spatially separated bars, was used in its two-dimensional and three-dimensional versions in Chapter 2 and Chapter 3, respectively. The influence of non-informative vision and visual interference on performance in the parallelity task was studied in Chapter 4. A different task, the mental rotation task, was introduced in a purely haptic study in Chapter 5 and in a visuo-haptic cross-modal study in Chapter 6. The interaction of multiple reference frames and their influence on haptic spatial processing were the common denominators of these studies. In this thesis we approached the problems of which reference frames play the major role in haptic spatial processing and how the relative roles of distinct reference frames change depending on the available information and the constraints imposed by different tasks. We found that the influence of a reference frame centered on the hand was the major cause of the deviations from veridicality observed in both the two-dimensional and three-dimensional studies. The results were described by a weighted average model, in which the hand-centered egocentric reference frame is supposed to have a biasing influence on the allocentric reference frame. Performance in haptic spatial processing has been shown to depend also on sources of information or processing that are not strictly connected to the task at hand. When non-informative vision was provided, a beneficial effect was observed in the haptic performance. This improvement was interpreted as a shift from the egocentric to the allocentric reference frame. Moreover, interfering visual information presented in the vicinity of the haptic stimuli parametrically modulated the magnitude of the deviations. The influence of the hand-centered reference frame was shown also in the haptic mental rotation task where participants were quicker in judging the parity of objects when these were aligned with respect to the hands than when they were physically aligned. Similarly, in the visuo-haptic cross-modal mental rotation task the parity judgments were influenced by the orientation of the exploring hand with respect to the viewing direction. This effect was shown to be modulated also by an intervening temporal delay that supposedly counteracts the influence of the hand-centered reference frame. We suggest that the hand-centered reference frame is embedded in a hierarchical structure of reference frames where some of these emerge depending on the demands and the circumstances of the surrounding environment and the needs of an active perceiver.
An introduction to real-time graphical techniques for analyzing multivariate data
NASA Astrophysics Data System (ADS)
Friedman, Jerome H.; McDonald, John Alan; Stuetzle, Werner
1987-08-01
Orion I is a graphics system used to study applications of computer graphics - especially interactive motion graphics - in statistics. Orion I is the newest of a family of "Prim" systems, whose most striking common feature is the use of real-time motion graphics to display three dimensional scatterplots. Orion I differs from earlier Prim systems through the use of modern and relatively inexpensive raster graphics and microprocessor technology. It also delivers more computing power to its user; Orion I can perform more sophisticated real-time computations than were possible on previous such systems. We demonstrate some of Orion I's capabilities in our film: "Exploring data with Orion I".
Myoelectric control of prosthetic hands: state-of-the-art review
Geethanjali, Purushothaman
2016-01-01
Myoelectric signals (MES) have been used in various applications, in particular, for identification of user intention to potentially control assistive devices for amputees, orthotic devices, and exoskeleton in order to augment capability of the user. MES are also used to estimate force and, hence, torque to actuate the assistive device. The application of MES is not limited to assistive devices, and they also find potential applications in teleoperation of robots, haptic devices, virtual reality, and so on. The myoelectric control-based prosthetic hand aids to restore activities of daily living of amputees in order to improve the self-esteem of the user. All myoelectric control-based prosthetic hands may not have similar operations and exhibit variation in sensing input, deciphering the signals, and actuating prosthetic hand. Researchers are focusing on improving the functionality of prosthetic hand in order to suit the user requirement with the different operating features. The myoelectric control differs in operation to accommodate various external factors. This article reviews the state of the art of myoelectric prosthetic hand, giving description of each control strategy. PMID:27555799
Design of a 4-DOF MR haptic master for application to robot surgery: virtual environment work
NASA Astrophysics Data System (ADS)
Oh, Jong-Seok; Choi, Seung-Hyun; Choi, Seung-Bok
2014-09-01
This paper presents the design and control performance of a novel type of 4-degrees-of-freedom (4-DOF) haptic master in cyberspace for a robot-assisted minimally invasive surgery (RMIS) application. By using a controllable magnetorheological (MR) fluid, the proposed haptic master can provide a feedback function for a surgical robot. Due to the difficulty of utilizing real human organs in the experiment, a cyberspace featuring a virtual object is constructed to evaluate the performance of the haptic master. In order to realize the cyberspace, a volumetric deformable object is represented by a shape-retaining chain-linked (S-chain) model, which is a fast volumetric model suitable for real-time applications. In the haptic architecture for an RMIS application, the desired torque and position induced from the virtual object of the cyberspace and the haptic master of real space are transferred to each other. In order to validate the superiority of the proposed master and volumetric model, a tracking control experiment is implemented with a nonhomogeneous volumetric cubic object to demonstrate that the proposed model can be utilized in a real-time haptic rendering architecture. A proportional-integral-derivative (PID) controller is then designed and empirically implemented to accomplish the desired torque trajectories. It has been verified from the experiment that tracking control performance for torque trajectories from a virtual slave can be successfully achieved.
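A minimal discrete sketch of the PID torque-tracking loop described above follows; the gains, sample time and first-order plant stand-in are placeholders, not the paper's identified values.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, 0.0

    def update(self, desired, measured):
        err = desired - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

pid = PID(kp=2.0, ki=5.0, kd=0.01, dt=0.001)
torque = 0.0
for _ in range(1000):                       # 1 s of simulated control at 1 kHz
    u = pid.update(desired=0.5, measured=torque)
    torque += 0.05 * (u - torque)           # crude first-order plant stand-in
print(f"tracked torque ≈ {torque:.3f} N·m (target 0.5)")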
A brain-computer interface with vibrotactile biofeedback for haptic information.
Chatterjee, Aniruddha; Aggarwal, Vikram; Ramos, Ander; Acharya, Soumyadipta; Thakor, Nitish V
2007-10-17
It has been suggested that Brain-Computer Interfaces (BCI) may one day be suitable for controlling a neuroprosthesis. For closed-loop operation of BCI, a tactile feedback channel that is compatible with neuroprosthetic applications is desired. Operation of an EEG-based BCI using only vibrotactile feedback, a commonly used method to convey haptic senses of contact and pressure, is demonstrated with a high level of accuracy. A Mu-rhythm based BCI using a motor imagery paradigm was used to control the position of a virtual cursor. The cursor position was shown visually as well as transmitted haptically by modulating the intensity of a vibrotactile stimulus to the upper limb. A total of six subjects operated the BCI in a two-stage targeting task, receiving only vibrotactile biofeedback of performance. The location of the vibration was also systematically varied between the left and right arms to investigate location-dependent effects on performance. Subjects are able to control the BCI using only vibrotactile feedback with an average accuracy of 56% and as high as 72%. These accuracies are significantly higher than the 15% predicted by random chance if the subject had no voluntary control of their Mu-rhythm. The results of this study demonstrate that vibrotactile feedback is an effective biofeedback modality to operate a BCI using motor imagery. In addition, the study shows that placement of the vibrotactile stimulation on the biceps ipsilateral or contralateral to the motor imagery introduces a significant bias in the BCI accuracy. This bias is consistent with a drop in performance generated by stimulation of the contralateral limb. Users demonstrated the capability to overcome this bias with training.
Design of Flight Control Panel Layout using Graphical User Interface in MATLAB
NASA Astrophysics Data System (ADS)
Wirawan, A.; Indriyanto, T.
2018-04-01
This paper introduces the design of a Flight Control Panel (FCP) layout using the Graphical User Interface facility in MATLAB. The FCP is the interface used to give commands to the simulation and to monitor model variables while the simulation is running. The commands accommodated by the FCP are the altitude command, angle-of-sideslip command, heading command, and a setting command for the turbulence model. The FCP was also designed to monitor flight parameters while the simulation is running.
IGGy: An interactive environment for surface grid generation
NASA Technical Reports Server (NTRS)
Prewitt, Nathan C.
1992-01-01
A graphically interactive derivative of the EAGLE boundary code is presented. This code allows the user to interactively build and execute commands and immediately see the results. Strong ties with a batch oriented script language are maintained. A generalized treatment of grid definition parameters allows a more generic definition of the grid generation process and allows the generation of command scripts which can be applied to topologically similar configurations. The use of the graphical user interface is outlined and example applications are presented.
Graphical User Interface for an Observing Control System for the UK Infrared Telescope
NASA Astrophysics Data System (ADS)
Tan, M.; Bridger, A.; Wright, G. S.; Adamson, A. J.; Currie, M. J.; Economou, F.
A graphical user interface for the observing control system of the UK Infrared Telescope has been developed as part of the ORAC (Observatory Reduction and Acquisition Control) project. We analyzed and designed the system using the Unified Modelling Language (UML) with the CASE tool Rational Rose 98. The system has been implemented in a modular way with Java packages using Swing and RMI, and is component-based and pluggable. Object-orientation concepts and UML notations have been applied throughout the development.
Magezi, David A
2015-01-01
Linear mixed-effects models (LMMs) are increasingly being used for data analysis in cognitive neuroscience and experimental psychology, where within-participant designs are common. The current article provides an introductory review of the use of LMMs for within-participant data analysis and describes a free, simple, graphical user interface (LMMgui). LMMgui uses the package lme4 (Bates et al., 2014a,b) in the statistical environment R (R Core Team).
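For readers outside R, the within-participant structure that LMMgui passes to lme4, e.g. lmer(rt ~ condition + (1 | participant)), can be approximated with Python's statsmodels; the data below are synthetic and the mapping is an analogy, not part of LMMgui.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
participants = np.repeat([f"p{i}" for i in range(8)], 20)
condition = np.tile(["A", "B"], 80)
offsets = dict(zip([f"p{i}" for i in range(8)], rng.normal(0, 20, 8)))
rt = (400 + 30 * (condition == "B")                 # fixed effect of condition
      + np.array([offsets[p] for p in participants])  # per-participant intercept
      + rng.normal(0, 15, 160))
df = pd.DataFrame({"participant": participants, "condition": condition, "rt": rt})

# random intercept per participant, fixed effect of condition
fit = smf.mixedlm("rt ~ condition", df, groups=df["participant"]).fit()
print(fit.summary())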
Graphical user interface for a neonatal parenteral nutrition decision support system.
Peverini, R. L.; Beach, D. S.; Wan, K. W.; Vyhmeister, N. R.
2000-01-01
We developed and implemented a decision support system for prescribing parenteral nutrition (PN) solutions for infants in our neonatal intensive care unit. We employed a graphical user interface to provide clinical guidelines and aid the understanding of the interaction among the various ingredients that make up a PN solution. In particular, by displaying the interaction between the PN total solution volume, protein, calcium and phosphorus, we have eliminated PN orders that previously would have resulted in calcium-phosphorus precipitation errors. PMID:11079964
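Purely as an illustration of the kind of interaction check the abstract describes, a sketch of a calcium-phosphorus product screen follows. The threshold, units, and rule are hypothetical placeholders; real calcium-phosphate compatibility depends on pH, temperature, amino acid concentration, and product-specific solubility curves, none of which are given in the abstract.

```java
// Illustrative only. THRESHOLD and units are hypothetical placeholders;
// real precipitation risk depends on pH, temperature, amino acid
// concentration, and product-specific solubility data.
public final class PnCompatibilityCheck {
    private static final double HYPOTHETICAL_THRESHOLD = 100.0; // placeholder

    /** Flags an order when the Ca x P concentration product is too high. */
    public static boolean precipitationRisk(double calciumMeqPerL,
                                            double phosphateMmolPerL) {
        return calciumMeqPerL * phosphateMmolPerL > HYPOTHETICAL_THRESHOLD;
    }
}
```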
The use of Graphic User Interface for development of a user-friendly CRS-Stack software
NASA Astrophysics Data System (ADS)
Sule, Rachmat; Prayudhatama, Dythia; Perkasa, Muhammad D.; Hendriyana, Andri; Fatkhan; Sardjito; Adriansyah
2017-04-01
The development of a user-friendly Common Reflection Surface (CRS) Stack software package built around a Graphical User Interface (GUI) is described in this paper. The original CRS-Stack software developed by the WIT Consortium is compiled in the Unix/Linux environment and is not user-friendly: the user must write commands and parameters manually in a script file. Due to this limitation the CRS-Stack has remained unpopular, although applying the method is a promising way to obtain seismic sections with better reflector continuity and S/N ratio. After successful tests using several seismic data sets belonging to oil companies in Indonesia, we decided to develop a user-friendly version of the software in our own laboratory. A GUI is a type of user interface that allows people to interact with computer programs in a better way: rather than typing commands and module parameters, it lets users operate the programs much more simply and easily, transforming the text-based interface into graphical icons and visual indicators. The use of complicated Seismic Unix shell scripts can be avoided. The Java Swing GUI library is used to develop this CRS-Stack GUI; every shell script that represents a seismic processing step is invoked from the Java environment. Besides providing an interactive GUI for CRS-Stack processing, the software is designed to help geophysicists manage projects with complex seismic processing procedures. The CRS-Stack GUI is organized around an input directory, operators, and an output directory, which together define a seismic data processing workflow. The CRS-Stack processing workflow involves four steps: automatic CMP stack, initial CRS-Stack, optimized CRS-Stack, and CRS-Stack supergather. These operations are visualized in an informative flowchart with a self-explanatory system that guides the user in entering the parameter values for each operation. The knowledge of the CRS-Stack processing procedure is thus preserved in the software, making it easy and efficient to learn. The software will continue to be developed, and new innovative seismic processing workflows will be added to the GUI in the future.
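A minimal sketch of the pattern the abstract describes, invoking the four workflow steps as shell scripts from Java in order. The script names and project directory are assumptions for illustration; the real GUI wraps this in Swing forms and a flowchart view.

```java
// Sketch: run the four CRS-Stack steps as shell scripts from Java.
// Script names and the working directory are illustrative assumptions.
import java.io.File;
import java.io.IOException;
import java.util.List;

public final class CrsStackWorkflow {
    public static void main(String[] args) throws IOException, InterruptedException {
        List<String> steps = List.of(
            "01_cmp_stack.sh",       // automatic CMP stack
            "02_initial_crs.sh",     // initial CRS-Stack
            "03_optimized_crs.sh",   // optimized CRS-Stack
            "04_supergather.sh");    // CRS-Stack supergather
        for (String step : steps) {
            Process p = new ProcessBuilder("sh", step)
                    .directory(new File("project")) // project working directory
                    .inheritIO()                    // stream script output
                    .start();
            if (p.waitFor() != 0) {
                throw new IOException("Step failed: " + step);
            }
        }
    }
}
```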
Perceptual Grouping in Haptic Search: The Influence of Proximity, Similarity, and Good Continuation
ERIC Educational Resources Information Center
Overvliet, Krista E.; Krampe, Ralf Th.; Wagemans, Johan
2012-01-01
We conducted a haptic search experiment to investigate the influence of the Gestalt principles of proximity, similarity, and good continuation. We expected faster search when the distractors could be grouped. We chose edges at different orientations as stimuli because they are processed similarly in the haptic and visual modality. We therefore…
Immediate Memory for Haptically-Examined Braille Symbols by Blind and Sighted Subjects.
ERIC Educational Resources Information Center
Newman, Slater E.; And Others
The paper reports on two experiments in Braille learning which compared blind and sighted subjects on the immediate recall of haptically-examined Braille symbols. In the first study, sighted subjects (N=64) haptically examined each of a set of Braille symbols with their preferred or nonpreferred hand and immediately recalled the symbol by drawing…
Haptic Cues Used for Outdoor Wayfinding by Individuals with Visual Impairments
ERIC Educational Resources Information Center
Koutsoklenis, Athanasios; Papadopoulos, Konstantinos
2014-01-01
Introduction: The study presented here examines which haptic cues individuals with visual impairments use more frequently and determines which of these cues are deemed by these individuals to be the most important for way-finding in urban environments. It also investigates the ways in which these haptic cues are used by individuals with visual…
ERIC Educational Resources Information Center
Stock, Oliver; Roder, Brigitte; Burke, Michael; Bien, Siegfried; Rosler, Frank
2009-01-01
The present study used functional magnetic resonance imaging to delineate cortical networks that are activated when objects or spatial locations encoded either visually (visual encoding group, n = 10) or haptically (haptic encoding group, n = 10) had to be retrieved from long-term memory. Participants learned associations between auditorily…
Physical Student-Robot Interaction with the ETHZ Haptic Paddle
ERIC Educational Resources Information Center
Gassert, R.; Metzger, J.; Leuenberger, K.; Popp, W. L.; Tucker, M. R.; Vigaru, B.; Zimmermann, R.; Lambercy, O.
2013-01-01
Haptic paddles--low-cost one-degree-of-freedom force feedback devices--have been used with great success at several universities throughout the US to teach the basic concepts of dynamic systems and physical human-robot interaction (pHRI) to students. The ETHZ haptic paddle was developed for a new pHRI course offered in the undergraduate…
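At the heart of any one-degree-of-freedom haptic paddle is an impedance-rendering servo loop: read the handle position, render a virtual spring-damper, command the motor. A minimal sketch follows; the hardware interface, gains, and loop timing are illustrative assumptions, not the ETHZ design.

```java
// Sketch of a 1-DOF impedance rendering loop. Hardware access is hidden
// behind a hypothetical interface; gains and timing are illustrative.
public final class PaddleLoop {
    interface Paddle {
        double readPosition();           // handle angle, rad
        void commandTorque(double tau);  // motor torque, N*m
    }

    static void run(Paddle paddle, double k, double b, double dt) {
        double prev = paddle.readPosition();
        while (true) {
            double x = paddle.readPosition();
            double v = (x - prev) / dt;            // finite-difference velocity
            paddle.commandTorque(-k * x - b * v);  // virtual spring-damper at x = 0
            prev = x;
            // a real loop would wait for the next ~1 kHz servo tick here
        }
    }
}
```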
The use of graphics in the design of the human-telerobot interface
NASA Technical Reports Server (NTRS)
Stuart, Mark A.; Smith, Randy L.
1989-01-01
The Man-Systems Telerobotics Laboratory (MSTL) of NASA's Johnson Space Center employs computer graphics tools in their design and evaluation of the Flight Telerobotic Servicer (FTS) human/telerobot interface on the Shuttle and on the Space Station. It has been determined by the MSTL that the use of computer graphics can promote more expedient and less costly design endeavors. Several specific examples of computer graphics applied to the FTS user interface by the MSTL are described.
3D Graphics For Interactive Surgical Simulation And Implant Design
NASA Astrophysics Data System (ADS)
Dev, P.; Fellingham, L. L.; Vassiliadis, A.; Woolson, S. T.; White, D. N.; Young, S. L.
1984-10-01
The combination of user-friendly, highly interactive software, 3D graphics, and the high-resolution detailed views of anatomy afforded by X-ray computer tomography and magnetic resonance imaging can provide surgeons with the ability to plan and practice complex surgeries. In addition to providing a realistic and manipulable 3D graphics display, this system can drive a milling machine in order to produce physical models of the anatomy or prosthetic devices and implants which have been designed using its interactive graphics editing facilities.
Continuation of research into language concepts for the mission support environment
NASA Technical Reports Server (NTRS)
1991-01-01
A concept for a more intuitive and graphically based Computation (Comp) Builder was developed. The Graphical Comp Builder Prototype was developed, which is an X Window based graphical tool that allows the user to build Comps using graphical symbols. Investigation was conducted to determine the availability and suitability of the Ada programming language for the development of future control center type software. The Space Station Freedom Project identified Ada as the desirable programming language for the development of Space Station Control Center software systems.
NASA Technical Reports Server (NTRS)
1974-01-01
The present form of this cardiovascular model simulates both 1-g and zero-g LBNP (lower body negative pressure) experiments and tilt experiments. In addition, the model simulates LBNP experiments at any body angle. The model is currently accessible on the Univac 1110 Time-Shared System in an interactive operational mode. Model output may be in tabular form and/or graphic form. The graphic capabilities are programmed for the Tektronix 4010 graphics terminal and the Univac 1110.
The DFVLR main department for central data processing, 1976 - 1983
NASA Technical Reports Server (NTRS)
1983-01-01
Data processing, equipment and systems operation, operative and user systems, user services, computer networks and communications, text processing, computer graphics, and high power computers are discussed.
Exploring the simulation requirements for virtual regional anesthesia training
NASA Astrophysics Data System (ADS)
Charissis, V.; Zimmer, C. R.; Sakellariou, S.; Chan, W.
2010-01-01
This paper presents an investigation of the simulation requirements for virtual regional anaesthesia training. To this end we have developed a prototype human-computer interface designed to facilitate Virtual Reality (VR) augmented educational tactics for regional anaesthesia training. The proposed interface system aims to complement nerve-blocking methods. The system is designed to operate in a real-time 3D environment, presenting anatomical information and enabling the user to explore the spatial relation of different human parts without any physical constraints. Furthermore, the proposed system aims to help trainee anaesthetists build a mental, three-dimensional map of the anatomical elements and their relationship to the ultrasound imaging that is used to navigate the anaesthetic needle. Opting for a sophisticated approach to interaction, the interface elements are based on simplified visual representations of real objects and can be operated through haptic devices and surround auditory cues. This paper discusses the challenges involved in the HCI design, introduces the visual components of the interface, and presents a tentative plan of future work, which involves the development of realistic haptic feedback and various regional anaesthesia training scenarios.
Yim, Sunghoon; Jeon, Seokhee; Choi, Seungmoon
2016-01-01
In this paper, we present an extended data-driven haptic rendering method capable of reproducing force responses during pushing and sliding interaction on a large surface area. The main part of the approach is a novel input variable set for the training of an interpolation model, which incorporates the position of a proxy - an imaginary contact point on the undeformed surface. This allows us to estimate friction in both sliding and sticking states in a unified framework. Estimating the proxy position is done in real-time based on simulation using a sliding yield surface - a surface defining a border between the sliding and sticking regions in the external force space. During modeling, the sliding yield surface is first identified via an automated palpation procedure. Then, through manual palpation on a target surface, input data and resultant force data are acquired. The data are used to build a radial basis interpolation model. During rendering, this input-output mapping interpolation model is used to estimate force responses in real-time in accordance with the interaction input. Physical performance evaluation demonstrates that our approach achieves reasonably high estimation accuracy. A user study also shows plausible perceptual realism under diverse and extensive exploration.
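The interpolation step at the core of such data-driven rendering can be sketched compactly: given training input vectors (e.g., penetration depth and proxy-relative position) and weights solved during the modeling phase, the force at a new input is a weighted sum of radial basis functions. The Gaussian kernel and all names below are assumptions for illustration, not the authors' exact formulation.

```java
// Sketch of RBF interpolation for force estimation. Kernel choice,
// input layout, and names are illustrative assumptions.
public final class RbfForceModel {
    private final double[][] centers; // training input vectors
    private final double[] weights;   // solved in the modeling phase
    private final double gamma;       // kernel width

    public RbfForceModel(double[][] centers, double[] weights, double gamma) {
        this.centers = centers;
        this.weights = weights;
        this.gamma = gamma;
    }

    /** Estimate the force response at a new interaction input. */
    public double estimateForce(double[] input) {
        double f = 0.0;
        for (int i = 0; i < centers.length; i++) {
            double d2 = 0.0;
            for (int j = 0; j < input.length; j++) {
                double diff = input[j] - centers[i][j];
                d2 += diff * diff;
            }
            f += weights[i] * Math.exp(-gamma * d2); // Gaussian basis
        }
        return f;
    }
}
```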
Geometric modeling of the temporal bone for cochlea implant simulation
NASA Astrophysics Data System (ADS)
Todd, Catherine A.; Naghdy, Fazel; O'Leary, Stephen
2004-05-01
The first stage in the development of a clinically valid surgical simulator for training otologic surgeons in performing cochlear implantation is presented. For this purpose, a geometric model of the temporal bone has been derived from a cadaver specimen using the biomedical image processing software package Analyze (AnalyzeDirect, Inc), and its three-dimensional reconstruction is examined. Simulator construction begins with registration and processing of a Computed Tomography (CT) medical image sequence. Important anatomical structures of the middle and inner ear are identified and segmented from each scan in a semi-automated, threshold-based approach. Linear interpolation between image slices produces a three-dimensional volume dataset: the geometric model. Artefacts are effectively eliminated using a semi-automatic seeded region-growing algorithm, and unnecessary bony structures are removed. Once validated by an Ear, Nose and Throat (ENT) specialist, the model may be imported into the Reachin Application Programming Interface (API) (Reachin Technologies AB) for the visual and haptic rendering associated with a virtual mastoidectomy. Interaction with the model is realized through a haptic interface, providing the user with accurate torque and force feedback. Electrode array insertion into the cochlea will be introduced in the final stage of design.
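A compact sketch of the seeded region-growing step named above, shown on a single 2D slice for brevity. The intensity thresholds, 4-connectivity, and image representation are illustrative assumptions; the paper's implementation operates within the Analyze package.

```java
// Sketch: seeded region growing on one 2D slice. Thresholds and
// 4-connectivity are illustrative assumptions.
import java.util.ArrayDeque;
import java.util.Deque;

public final class RegionGrow {
    public static boolean[][] grow(int[][] img, int seedRow, int seedCol,
                                   int lo, int hi) {
        int rows = img.length, cols = img[0].length;
        boolean[][] region = new boolean[rows][cols];
        Deque<int[]> stack = new ArrayDeque<>();
        stack.push(new int[] {seedRow, seedCol});
        while (!stack.isEmpty()) {
            int[] p = stack.pop();
            int r = p[0], c = p[1];
            if (r < 0 || c < 0 || r >= rows || c >= cols) continue; // bounds
            if (region[r][c] || img[r][c] < lo || img[r][c] > hi) continue;
            region[r][c] = true;              // accept pixel into the region
            stack.push(new int[] {r + 1, c}); // grow to 4-connected neighbors
            stack.push(new int[] {r - 1, c});
            stack.push(new int[] {r, c + 1});
            stack.push(new int[] {r, c - 1});
        }
        return region;
    }
}
```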
Controllable surface haptics via particle jamming and pneumatics.
Stanley, Andrew A; Okamura, Allison M
2015-01-01
The combination of particle jamming and pneumatics allows the simultaneous control of shape and mechanical properties in a tactile display. A hollow silicone membrane is molded into an array of thin cells, each filled with coffee grounds such that adjusting the vacuum level in any individual cell rapidly switches it between flexible and rigid states. The array clamps over a pressure-regulated air chamber with internal mechanisms designed to pin the nodes between cells at any given height. Various sequences of cell vacuuming, node pinning, and chamber pressurization allow the surface to balloon into a variety of shapes. Experiments were performed to expand existing physical models of jamming at the inter-particle level to define the rheological characteristics of jammed systems from a macroscopic perspective, relevant to force-displacement interactions that would be experienced by human users. Force-displacement data show that a jammed cell in compression fits a Maxwell model and a cell deflected in the center while supported only at the edges fits a Zener model, each with stiffness and damping parameters that increase at higher levels of applied vacuum. This provides a framework to tune and control the mechanical properties of a jamming haptic interface.
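The Maxwell and Zener models named above have standard constitutive forms. Written for force F and displacement x (the notation is mine; the paper's fitted parameter values are not reproduced here), with spring stiffnesses k, k0, k1 and damping coefficient c:

```latex
\[
\text{Maxwell:}\quad F + \frac{c}{k}\,\dot{F} = c\,\dot{x},
\qquad
\text{Zener:}\quad F + \frac{c}{k_1}\,\dot{F}
  = k_0\,x + c\left(1 + \frac{k_0}{k_1}\right)\dot{x}.
\]
```

The Maxwell model is a spring and damper in series; the Zener (standard linear solid) model adds a parallel spring k0, which is why it captures the edge-supported cell's residual elasticity. Per the abstract, the fitted stiffness and damping parameters grow with the applied vacuum level.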
Internet-based support for the production of holographic stereograms
NASA Astrophysics Data System (ADS)
Gustafsson, Jonny
1998-03-01
Holographic hard-copy techniques suffer from a lack of availability for ordinary users of computer graphics. The production of holograms usually requires special skills as well as expensive equipment, which means that the direct production cost will be high for an ordinary user with little or no knowledge of holography. Here it is shown how a system may be created in which users of computer graphics can do all communication with a holography studio through a Java-based web browser. This system helps the user understand the technique of holographic stereograms, make decisions about angles, views, lighting, etc., previsualize the end result, and automatically submit the 3D data to the producer of the hologram. A prototype system has been built which uses internal scripting in VRML.
Graphical User Interface for the NASA FLOPS Aircraft Performance and Sizing Code
NASA Technical Reports Server (NTRS)
Lavelle, Thomas M.; Curlett, Brian P.
1994-01-01
XFLOPS is an X-Windows/Motif graphical user interface for the aircraft performance and sizing code FLOPS. This new interface simplifies entering data and analyzing results, thereby reducing analysis time and errors. Data entry is simpler because input windows are used for each of the FLOPS namelists. These windows contain fields to input the variable's values along with help information describing the variable's function. Analyzing results is simpler because output data are displayed rapidly. This is accomplished in two ways. First, because the output file has been indexed, users can view particular sections with the click of a mouse button. Second, because menu picks have been created, users can plot engine and aircraft performance data. In addition, XFLOPS has a built-in help system and complete on-line documentation for FLOPS.
Haptic perception and body representation in lateral and medial occipito-temporal cortices.
Costantini, Marcello; Urgesi, Cosimo; Galati, Gaspare; Romani, Gian Luca; Aglioti, Salvatore M
2011-04-01
Although vision is the primary sensory modality that humans and other primates use to identify objects in the environment, we can recognize crucial object features (e.g., shape, size) using the somatic modality. Previous studies have shown that the occipito-temporal areas dedicated to the visual processing of object forms, faces and bodies also show category-selective responses when the preferred stimuli are haptically explored out of view. Visual processing of human bodies engages specific areas in lateral (extrastriate body area, EBA) and medial (fusiform body area, FBA) occipito-temporal cortex. This study aimed at exploring the relative involvement of EBA and FBA in the haptic exploration of body parts. During fMRI scanning, participants were asked to haptically explore either real-size fake body parts or objects. We found a selective activation of right and left EBA, but not of right FBA, while participants haptically explored body parts as compared to real objects. This suggests that EBA may integrate visual body representations with somatosensory information regarding body parts and form a multimodal representation of the body. Furthermore, both left and right EBA showed a comparable level of body selectivity during haptic perception and visual imagery. However, right but not left EBA was more activated during haptic exploration than visual imagery of body parts, ruling out that the response to haptic body exploration was entirely due to the use of visual imagery. Overall, the results point to the existence of different multimodal body representations in the occipito-temporal cortex which are activated during perception and imagery of human body parts. Copyright © 2011 Elsevier Ltd. All rights reserved.
Perceptual grouping determines haptic contextual modulation.
Overvliet, K E; Sayim, B
2016-09-01
Since the early phenomenological demonstrations of Gestalt principles, one of the major challenges of Gestalt psychology has been to quantify these principles. Here, we show that contextual modulation, i.e. the influence of context on target perception, can be used as a tool to quantify perceptual grouping in the haptic domain, similar to the visual domain. We investigated the influence of target-flanker grouping on performance in haptic vernier offset discrimination. We hypothesized that when, despite the apparent differences between vision and haptics, similar grouping principles are operational, a similar pattern of flanker interference would be observed in the haptic as in the visual domain. Participants discriminated the offset of a haptic vernier. The vernier was flanked by different flanker configurations: no flankers, single flanking lines, 10 flanking lines, rectangles and single perpendicular lines, varying the degree to which the vernier grouped with the flankers. Additionally, we used two different flanker widths (same width as and narrower than the target), again to vary target-flanker grouping. Our results show a clear effect of flankers: performance was much better when the vernier was presented alone compared to when it was presented with flankers. In the majority of flanker configurations, grouping between the target and the flankers determined the strength of interference, similar to the visual domain. However, in the same width rectangular flanker condition we found aberrant results. We discuss the results of our study in light of similarities and differences between vision and haptics and the interaction between different grouping principles. We conclude that in haptics, similar organization principles apply as in visual perception and argue that grouping and Gestalt are key organization principles not only of vision, but of the perceptual system in general. Copyright © 2015 Elsevier Ltd. All rights reserved.
Haptic biofeedback for improving compliance with lower-extremity partial weight bearing.
Fu, Michael C; DeLuke, Levi; Buerba, Rafael A; Fan, Richard E; Zheng, Ying Jean; Leslie, Michael P; Baumgaertner, Michael R; Grauer, Jonathan N
2014-11-01
After lower-extremity orthopedic trauma and surgery, patients are often advised to restrict weight bearing on the affected limb. Conventional training methods are not effective at enabling patients to comply with recommendations for partial weight bearing. The current study assessed a novel method of using real-time haptic (vibratory/vibrotactile) biofeedback to improve compliance with instructions for partial weight bearing. Thirty healthy, asymptomatic participants were randomized into 1 of 3 groups: verbal instruction, bathroom scale training, and haptic biofeedback. Participants were instructed to restrict lower-extremity weight bearing in a walking boot with crutches to 25 lb, with an acceptable range of 15 to 35 lb. A custom weight bearing sensor and biofeedback system was attached to all participants, but only those in the haptic biofeedback group were given a vibrotactile signal if they exceeded the acceptable range. Weight bearing in all groups was measured with a separate validated commercial system. The verbal instruction group bore an average of 60.3±30.5 lb (mean±standard deviation). The bathroom scale group averaged 43.8±17.2 lb, whereas the haptic biofeedback group averaged 22.4±9.1 lb (P<.05). As a percentage of body weight, the verbal instruction group averaged 40.2±19.3%, the bathroom scale group averaged 32.5±16.9%, and the haptic biofeedback group averaged 14.5±6.3% (P<.05). In this initial evaluation of the use of haptic biofeedback to improve compliance with lower-extremity partial weight bearing, haptic biofeedback was superior to conventional physical therapy methods. Further studies in patients with clinical orthopedic trauma are warranted. Copyright 2014, SLACK Incorporated.
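The biofeedback rule the study describes reduces to a threshold check on the measured limb load. A minimal sketch follows; the sensor and actuator interfaces are hypothetical, and triggering on both sides of the 15-35 lb window is one plausible reading of "exceeded the acceptable range".

```java
// Sketch: vibrate when the measured load leaves the prescribed window.
// Sensor/actuator interfaces are hypothetical; the two-sided trigger is
// an assumption about the study's rule.
public final class WeightBearingMonitor {
    private static final double LOW_LB = 15.0;   // prescribed lower bound
    private static final double HIGH_LB = 35.0;  // prescribed upper bound

    interface Buzzer { void vibrate(boolean on); }

    static void update(double loadLb, Buzzer buzzer) {
        boolean outOfRange = loadLb < LOW_LB || loadLb > HIGH_LB;
        buzzer.vibrate(outOfRange); // haptic cue only outside the window
    }
}
```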
mcaGUI: microbial community analysis R-Graphical User Interface (GUI).
Copeland, Wade K; Krishnan, Vandhana; Beck, Daniel; Settles, Matt; Foster, James A; Cho, Kyu-Chul; Day, Mitch; Hickey, Roxana; Schütte, Ursel M E; Zhou, Xia; Williams, Christopher J; Forney, Larry J; Abdo, Zaid
2012-08-15
Microbial communities have an important role in natural ecosystems and have an impact on animal and human health. Intuitive graphic and analytical tools that can facilitate the study of these communities are in short supply. This article introduces Microbial Community Analysis GUI, a graphical user interface (GUI) for the R programming language (R Development Core Team, 2010). With this application, researchers can input aligned and clustered sequence data to create custom abundance tables and perform analyses specific to their needs. This GUI provides a flexible modular platform, expandable to include other statistical tools for microbial community analysis in the future. The mcaGUI package and source are freely available as part of Bioconductor at http://www.bioconductor.org/packages/release/bioc/html/mcaGUI.html
Mabuchi, Kunihiko
2013-01-01
We are currently developing an artificial arm/hand system which is capable of sensing stimuli and then transferring these stimuli to users as somatic sensations. Presently, we are evoking the virtual somatic sensations by electrically stimulating a sensory nerve fiber which innervates a single mechanoreceptor unit at the target area; this is done using a tungsten microelectrode percutaneously inserted into the user's peripheral nerve (a microstimulation method). The artificial arm/hand system is composed of a robot hand equipped with a pressure sensor system on its fingers. The sensor system detects mechanical stimuli, which are transferred to the user by means of the microstimulation method so that the user experiences the stimuli as the corresponding somatic sensations. In trials, the system worked satisfactorily and there was a good correlation between the pressure applied to the pressure sensors on the robot fingers and the subjective intensities of the evoked pressure sensations.
LTCP 2D Graphical User Interface. Application Description and User's Guide
NASA Technical Reports Server (NTRS)
Ball, Robert; Navaz, Homayun K.
1996-01-01
A graphical user interface (GUI) written for NASA's LTCP (Liquid Thrust Chamber Performance) two-dimensional computational fluid dynamics code is described. The GUI is written in C++ for a desktop personal computer running under a Microsoft Windows operating environment. Through the use of common and familiar dialog boxes, features, and tools, the user can easily and quickly create and modify input files for the LTCP code. In addition, old input files used with the LTCP code can be opened and modified using the GUI. The program and its capabilities are presented, followed by a detailed description of each menu selection and the method of creating an input file for LTCP. A cross reference is included to help experienced users quickly find the variables which commonly need changes. Finally, the system requirements and installation instructions are provided.
Enhanced operator perception through 3D vision and haptic feedback
NASA Astrophysics Data System (ADS)
Edmondson, Richard; Light, Kenneth; Bodenhamer, Andrew; Bosscher, Paul; Wilkinson, Loren
2012-06-01
Polaris Sensor Technologies (PST) has developed a stereo vision upgrade kit for TALON® robot systems comprised of a replacement gripper camera and a replacement mast zoom camera on the robot, and a replacement display in the Operator Control Unit (OCU). Harris Corporation has developed a haptic manipulation upgrade for TALON® robot systems comprised of a replacement arm and gripper and an OCU that provides haptic (force) feedback. PST and Harris have recently collaborated to integrate the 3D vision system with the haptic manipulation system. In multiple studies done at Fort Leonard Wood, Missouri it has been shown that 3D vision and haptics provide more intuitive perception of complicated scenery and improved robot arm control, allowing for improved mission performance and the potential for reduced time on target. This paper discusses the potential benefits of these enhancements to robotic systems used for the domestic homeland security mission.
[Haptic tracking control for minimally invasive robotic surgery].
Xu, Zhaohong; Song, Chengli; Wu, Wenwu
2012-06-01
Haptic feedback plays a significant role in minimally invasive robotic surgery (MIRS). A major deficiency of current MIRS is the lack of haptic perception for the surgeon, including in the commercially available da Vinci surgical system. In this paper, a dynamics model of a haptic robot is established based on the Newton-Euler method. Because computing the exact dynamics solution takes considerable time, a digital PID algorithm informed by the robot dynamics was used to ensure real-time bilateral control and to improve tracking precision and control efficiency. To validate the proposed method, an experimental system was developed in which two Novint Falcon haptic devices act as a master-slave system. Simulations and experiments showed that the proposed methods could provide instrument force feedback to the operator, and that the bilateral control strategy is an effective approach to master-slave MIRS. The proposed methods could be used in a tele-robotic system.
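A minimal sketch of the digital PID position-tracking law such bilateral control uses: the slave is driven toward the master's position each servo tick. Gains, the fixed sample period, and names are illustrative assumptions, not the paper's tuned controller.

```java
// Sketch of a discrete PID tracking law for master-slave control.
// Gains and the fixed servo period are illustrative assumptions.
public final class DigitalPid {
    private final double kp, ki, kd, dt;
    private double integral, prevError;

    public DigitalPid(double kp, double ki, double kd, double dt) {
        this.kp = kp; this.ki = ki; this.kd = kd; this.dt = dt;
    }

    /** Command for the slave given master (desired) and slave positions. */
    public double update(double masterPos, double slavePos) {
        double error = masterPos - slavePos;
        integral += error * dt;                      // accumulated error
        double derivative = (error - prevError) / dt; // error rate
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
}
```

In a bilateral scheme, a mirror of this law (or the measured slave-side contact force) is fed back to the master device so the operator feels the instrument's interaction with tissue.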
NASA Technical Reports Server (NTRS)
Walatka, Pamela P.; Clucas, Jean; McCabe, R. Kevin; Plessel, Todd; Potter, R.; Cooper, D. M. (Technical Monitor)
1994-01-01
The Flow Analysis Software Toolkit, FAST, is a software environment for visualizing data. FAST is a collection of separate programs (modules) that run simultaneously and allow the user to examine the results of numerical and experimental simulations. The user can load data files, perform calculations on the data, visualize the results of these calculations, construct scenes of 3D graphical objects, and plot, animate and record the scenes. Computational Fluid Dynamics (CFD) visualization is the primary intended use of FAST, but FAST can also assist in the analysis of other types of data. FAST combines the capabilities of such programs as PLOT3D, RIP, SURF, and GAS into one environment with modules that share data. Sharing data between modules eliminates the drudgery of transferring data between programs. All the modules in the FAST environment have a consistent, highly interactive graphical user interface. Most commands are entered by pointing and clicking. The modular construction of FAST makes it flexible and extensible. The environment can be custom configured and new modules can be developed and added as needed. The following modules have been developed for FAST: VIEWER, FILE IO, CALCULATOR, SURFER, TOPOLOGY, PLOTTER, TITLER, TRACER, ARCGRAPH, GQ, SURFERU, SHOTET, and ISOLEVU. A utility is also included to make the inclusion of user defined modules in the FAST environment easy. The VIEWER module is the central control for the FAST environment. From VIEWER, the user can change object attributes, interactively position objects in three-dimensional space, define and save scenes, create animations, spawn new FAST modules, add additional view windows, and save and execute command scripts. The FAST User Guide uses text and FAST MAPS (graphical representations of the entire user interface) to guide the user through the use of FAST. Chapters include: Maps, Overview, Tips, Getting Started Tutorial, a separate chapter for each module, file formats, and system administration.
NASA Astrophysics Data System (ADS)
Unemi, Tatsuo
This chapter describes a basic framework for simulated breeding, a type of interactive evolutionary computing used to breed artifacts, whose origin is Blind Watchmaker by Dawkins. These methods make it easy for a human to design a complex object adapted to his/her subjective criteria, much as the agricultural products we have been developing over thousands of years. Starting from randomly initialized genomes, the solution candidates are improved through several generations of artificial selection. The graphical user interface supports the breeding process with techniques of multi-field user interface and partial breeding. The former improves the diversity of individuals, which prevents the search from being trapped at a local optimum. The latter makes it possible for the user to fix features with which he/she is already satisfied. These methods were examined through the author's artistic applications: SBART for graphics art and SBEAT for music. Combined with a direct genome editor and exportation to other graphical or musical tools on the computer, they can be powerful tools for artistic creation. These systems may contribute to the creation of a new type of culture.
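The core loop of simulated breeding replaces an explicit fitness function with the user's choices: display a field of candidates, let the user pick favorites, and fill the next generation by mutating the picks. A minimal sketch follows; the real-vector genome and mutation rate are illustrative assumptions, not SBART's or SBEAT's actual representation.

```java
// Sketch of one simulated-breeding generation: user-selected parents
// produce mutated offspring. Genome encoding is an illustrative assumption.
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public final class SimulatedBreeding {
    static final Random RNG = new Random();

    static double[] mutate(double[] genome, double sigma) {
        double[] child = genome.clone();
        int i = RNG.nextInt(child.length);
        child[i] += sigma * RNG.nextGaussian(); // perturb one gene
        return child;
    }

    /** Build the next generation from the user's selections. */
    static List<double[]> nextGeneration(List<double[]> selected, int popSize) {
        List<double[]> next = new ArrayList<>(selected); // keep the picks
        while (next.size() < popSize) {
            double[] parent = selected.get(RNG.nextInt(selected.size()));
            next.add(mutate(parent, 0.1));               // assumed step size
        }
        return next;
    }
}
```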
MAPA: Implementation of the Standard Interchange Format and use for analyzing lattices
NASA Astrophysics Data System (ADS)
Shasharina, Svetlana G.; Cary, John R.
1997-05-01
MAPA (Modular Accelerator Physics Analysis) is an object oriented application for accelerator design and analysis with a Motif based graphical user interface. MAPA has been ported to AIX, Linux, HPUX, Solaris, and IRIX. MAPA provides an intuitive environment for accelerator study and design. The user can bring up windows for fully nonlinear analysis of accelerator lattices in any number of dimensions. The current graphical analysis methods of Lifetime plots and Surfaces of Section have been used to analyze the improved lattice designs of Wan, Cary, and Shasharina (this conference). MAPA can now read and write Standard Interchange Format (MAD) accelerator description files and it has a general graphical user interface for adding, changing, and deleting elements. MAPA's consistency checks prevent deletion of used elements and prevent creation of recursive beam lines. Plans include development of a richer set of modeling tools and the ability to invoke existing modeling codes through the MAPA interface. MAPA will be demonstrated on a Pentium 150 laptop running Linux.
AirShow 1.0 CFD Software Users' Guide
NASA Technical Reports Server (NTRS)
Mohler, Stanley R., Jr.
2005-01-01
AirShow is visualization post-processing software for Computational Fluid Dynamics (CFD). Upon reading binary PLOT3D grid and solution files into AirShow, the engineer can quickly see how hundreds of complex 3-D structured blocks are arranged and numbered. Additionally, chosen grid planes can be displayed and colored according to various aerodynamic flow quantities such as Mach number and pressure. The user may interactively rotate and translate the graphical objects using the mouse. The software source code was written in cross-platform Java, C++, and OpenGL, and runs on Unix, Linux, and Windows. The graphical user interface (GUI) was written using Java Swing. Java also provides multiple synchronized threads. The Java Native Interface (JNI) provides a bridge between the Java code and the C++ code where the PLOT3D files are read, the OpenGL graphics are rendered, and numerical calculations are performed. AirShow is easy to learn and simple to use. The source code is available for free from the NASA Technology Transfer and Partnership Office.
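The Java-to-C++ bridge pattern the abstract describes can be sketched in a few lines: Java declares native methods and loads the shared library, while the C++ side reads the PLOT3D files and renders with OpenGL. The library and method names below are assumptions for illustration, not AirShow's actual API.

```java
// Sketch of a JNI bridge. Library and method names are hypothetical.
public final class Plot3dBridge {
    static {
        System.loadLibrary("airshow_native"); // hypothetical native library
    }

    // Implemented in C++ behind the JNI boundary: file reading, OpenGL
    // rendering, and numerical calculations live on the native side.
    public native long openGridFile(String path);
    public native void renderPlane(long gridHandle, int block, int plane);
}
```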
X-Antenna: A graphical interface for antenna analysis codes
NASA Technical Reports Server (NTRS)
Goldstein, B. L.; Newman, E. H.; Shamansky, H. T.
1995-01-01
This report serves as the user's manual for the X-Antenna code. X-Antenna is intended to simplify the analysis of antennas by giving the user graphical interfaces in which to enter all relevant antenna and analysis code data. Essentially, X-Antenna creates a Motif interface to the user's antenna analysis codes. A command-file allows new antennas and codes to be added to the application. The menu system and graphical interface screens are created dynamically to conform to the data in the command-file. Antenna data can be saved and retrieved from disk. X-Antenna checks all antenna and code values to ensure they are of the correct type, writes an output file, and runs the appropriate antenna analysis code. Volumetric pattern data may be viewed in 3D space with an external viewer run directly from the application. Currently, X-Antenna includes analysis codes for thin wire antennas (dipoles, loops, and helices), rectangular microstrip antennas, and thin slot antennas.
Dakota Graphical User Interface v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman-Hill, Ernest; Glickman, Matthew; Gibson, Marcus
Graphical analysis environment for Sandia's Dakota software for optimization and uncertainty quantification. The Dakota GUI is an interactive graphical analysis environment for creating, running, and interpreting Dakota optimization and uncertainty quantification studies. It includes problem (Dakota study) set-up, option specification, simulation interfacing, analysis execution, and results visualization. Through the use of wizards, templates, and views, the Dakota GUI helps users navigate Dakota's complex capability landscape.
The WebACS - An Accessible Graphical Editor.
Parker, Stefan; Nussbaum, Gerhard; Pölzer, Stephan
2017-01-01
This paper describes the solution to accessibility problems encountered when implementing a graphical editor, a major challenge being the comprehension of the relationships between graphical components, which must be guaranteed for blind and vision-impaired users. In the concrete case, the HTML5 canvas and JavaScript were used. Accessibility was achieved by implementing a list view of elements, which also enhances the usability of the editor.
NASA Technical Reports Server (NTRS)
Desautel, Richard
1993-01-01
The objectives of this research include supporting the Aerothermodynamics Branch's research by developing graphical visualization tools for both the branch's adaptive grid code and flow field ray tracing code. The completed research for the reporting period includes development of a graphical user interface (GUI) and its implementation into the NAS Flowfield Analysis Software Tool kit (FAST), for both the adaptive grid code (SAGE) and the flow field ray tracing code (CISS).
Program Aids Specification Of Multiple-Block Grids
NASA Technical Reports Server (NTRS)
Sorenson, R. L.; Mccann, K. M.
1993-01-01
3DPREP computer program aids specification of multiple-block computational grids. Highly interactive graphical preprocessing program designed for use on powerful graphical scientific computer workstation. Divided into three main parts, each corresponding to principal graphical-and-alphanumerical display. Relieves user of some burden of collecting and formatting many data needed to specify blocks and grids, and prepares input data for NASA's 3DGRAPE grid-generating computer program.
Haptic Guidance Improves the Visuo-Manual Tracking of Trajectories
Bluteau, Jérémy; Coquillart, Sabine; Payan, Yohan; Gentaz, Edouard
2008-01-01
Background Learning to perform new movements is usually achieved by following visual demonstrations. Haptic guidance by a force feedback device is a recent and original technology which provides additional proprioceptive cues during visuo-motor learning tasks. The effects of two types of haptic guidance, control in position (HGP) or in force (HGF), on the visuo-manual tracking ("following") of trajectories are still under debate. Methodology/Principal Findings Three training techniques of haptic guidance (HGP, HGF, or a control condition, NHG, without haptic guidance) were evaluated in two experiments. Movements produced by adults were assessed in terms of shape (dynamic time warping) and kinematic criteria (number of velocity peaks and mean velocity) before and after the training sessions. Trajectories consisted of two Arabic and two Japanese-inspired letters in Experiment 1 and ellipses in Experiment 2. We observed that the use of HGF globally improves the fluency of the visuo-manual tracking of trajectories, while no significant improvement was found for HGP or NHG. Conclusion/Significance These results show that the addition of haptic information, probably encoded in force coordinates, plays a crucial role in the visuo-manual tracking of new trajectories. PMID:18335049
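Dynamic time warping, the shape measure used above, aligns two trajectories sample-by-sample while allowing local stretching. A minimal sketch of the standard DTW distance follows; one-dimensional samples and an absolute-difference cost are simplifying assumptions (the study compares 2D trajectories).

```java
// Sketch of the standard O(n*m) dynamic time warping distance.
// 1-D samples and absolute-difference cost are simplifying assumptions.
public final class Dtw {
    public static double distance(double[] a, double[] b) {
        int n = a.length, m = b.length;
        double[][] d = new double[n + 1][m + 1];
        for (int i = 0; i <= n; i++)
            for (int j = 0; j <= m; j++)
                d[i][j] = Double.POSITIVE_INFINITY;
        d[0][0] = 0.0;
        for (int i = 1; i <= n; i++) {
            for (int j = 1; j <= m; j++) {
                double cost = Math.abs(a[i - 1] - b[j - 1]);
                d[i][j] = cost + Math.min(d[i - 1][j - 1],  // match
                          Math.min(d[i - 1][j],             // insertion
                                   d[i][j - 1]));           // deletion
            }
        }
        return d[n][m]; // smaller = more similar shapes
    }
}
```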
The effect of perceptual grouping on haptic numerosity perception.
Verlaers, K; Wagemans, J; Overvliet, K E
2015-01-01
We used a haptic enumeration task to investigate whether enumeration can be facilitated by perceptual grouping in the haptic modality. Eight participants were asked to count tangible dots as quickly and accurately as possible, while moving their finger pad over a tactile display. In Experiment 1, we manipulated the number and organization of the dots, while keeping the total exploration area constant. The dots were either evenly distributed on a horizontal line (baseline condition) or organized into groups based on either proximity (dots placed in closer proximity to each other) or configural cues (dots placed in a geometric configuration). In Experiment 2, we varied the distance between the subsets of dots. We hypothesized that when subsets of dots can be grouped together, the enumeration time will be shorter and accuracy will be higher than in the baseline condition. The results of both experiments showed faster enumeration for the configural condition than for the baseline condition, indicating that configural grouping also facilitates haptic enumeration. In Experiment 2, faster enumeration was also observed for the proximity condition than for the baseline condition. Thus, perceptual grouping speeds up haptic enumeration by both configural and proximity cues, suggesting that similar mechanisms underlie perceptual grouping in both visual and haptic enumeration.