Sample records for eye tracking device

  1. A pilot study of eye-tracking devices in intensive care.

    PubMed

    Garry, Jonah; Casey, Kelly; Cole, Therese Kling; Regensburg, Angela; McElroy, Colleen; Schneider, Eric; Efron, David; Chi, Albert

    2016-03-01

    Eye-tracking devices have been suggested as a means of improving communication and psychosocial status among patients in the intensive care unit (ICU). This study was undertaken to explore the psychosocial impact and communication effects of eye-tracking devices in the ICU. A convenience sample of patients in the medical ICU, surgical ICU, and neurosciences critical care unit was enrolled prospectively. Patients participated in 5 guided sessions of 45 minutes each with the eye-tracking computer. After completion of the sessions, the Psychosocial Impact of Assistive Devices Scale (PIADS) was used to evaluate the device from the patient's perspective. All patients who participated in the study were able to communicate basic needs to nursing staff and family. Delirium as assessed by the Confusion Assessment Method for the Intensive Care Unit was present in 4 patients at recruitment and none after training. The device's overall psychosocial impact ranged from neutral (-0.29) to strongly positive (2.76). Compared with the absence of intervention (0 = no change), patients exposed to eye-tracking computers demonstrated a positive mean overall impact score (PIADS = 1.30; P = .004). This finding was present in mean scores for each PIADS domain: competence = 1.26, adaptability = 1.60, and self-esteem = 1.02 (all P < .01). There is a population of patients in the ICU whose psychosocial status, delirium, and communication ability may be enhanced by eye-tracking devices. These 3 outcomes are intertwined with ICU patient outcomes and indirectly suggest that eye-tracking devices might improve outcomes. A more in-depth exploration of the population to be targeted, the device's limitations, and the benefits of eye-tracking devices in the ICU is warranted. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Eye-tracking for clinical decision support: A method to capture automatically what physicians are viewing in the EMR.

    PubMed

    King, Andrew J; Hochheiser, Harry; Visweswaran, Shyam; Clermont, Gilles; Cooper, Gregory F

    2017-01-01

    Eye-tracking is a valuable research tool that is used in laboratory and limited field environments. We take steps toward developing methods that enable widespread adoption of eye-tracking and its real-time application in clinical decision support. Eye-tracking will enhance awareness and enable intelligent views, more precise alerts, and other forms of decision support in the Electronic Medical Record (EMR). We evaluated a low-cost eye-tracking device and found the device's accuracy to be non-inferior to a more expensive device. We also developed and evaluated an automatic method for mapping eye-tracking data to interface elements in the EMR (e.g., a displayed laboratory test value). Mapping was 88% accurate across the six participants in our experiment. Finally, we piloted the use of the low-cost device and the automatic mapping method to label training data for a Learning EMR (LEMR) which is a system that highlights the EMR elements a physician is predicted to use.

  3. Eye-tracking for clinical decision support: A method to capture automatically what physicians are viewing in the EMR

    PubMed Central

    King, Andrew J.; Hochheiser, Harry; Visweswaran, Shyam; Clermont, Gilles; Cooper, Gregory F.

    2017-01-01

    Eye-tracking is a valuable research tool that is used in laboratory and limited field environments. We take steps toward developing methods that enable widespread adoption of eye-tracking and its real-time application in clinical decision support. Eye-tracking will enhance awareness and enable intelligent views, more precise alerts, and other forms of decision support in the Electronic Medical Record (EMR). We evaluated a low-cost eye-tracking device and found the device’s accuracy to be non-inferior to a more expensive device. We also developed and evaluated an automatic method for mapping eye-tracking data to interface elements in the EMR (e.g., a displayed laboratory test value). Mapping was 88% accurate across the six participants in our experiment. Finally, we piloted the use of the low-cost device and the automatic mapping method to label training data for a Learning EMR (LEMR) which is a system that highlights the EMR elements a physician is predicted to use. PMID:28815151
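
    The mapping step described in the two records above (assigning each gaze sample to the EMR interface element it lands on) can be pictured with a minimal sketch. The element names, coordinates, and helper functions below are hypothetical illustrations, not the authors' implementation; the sketch assumes gaze samples and element bounding boxes share the same screen pixel coordinate system.

    ```python
    # Minimal sketch (not the authors' code): assign gaze samples to EMR interface
    # elements by testing which on-screen bounding box contains each gaze point.
    # Element names and coordinates are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class ScreenElement:
        name: str
        x: float          # left edge, screen pixels
        y: float          # top edge, screen pixels
        width: float
        height: float

        def contains(self, gx: float, gy: float) -> bool:
            return (self.x <= gx <= self.x + self.width
                    and self.y <= gy <= self.y + self.height)

    def map_gaze_to_elements(gaze_samples, elements):
        """Return the element name hit by each (x, y) gaze sample, or None."""
        hits = []
        for gx, gy in gaze_samples:
            hit = next((e.name for e in elements if e.contains(gx, gy)), None)
            hits.append(hit)
        return hits

    if __name__ == "__main__":
        elements = [
            ScreenElement("serum_lactate_value", 100, 200, 180, 24),
            ScreenElement("heart_rate_trend", 100, 240, 180, 24),
        ]
        gaze = [(150, 210), (160, 250), (900, 700)]
        print(map_gaze_to_elements(gaze, elements))
        # -> ['serum_lactate_value', 'heart_rate_trend', None]
    ```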

  4. Evaluation of the Tobii EyeX Eye tracking controller and Matlab toolkit for research.

    PubMed

    Gibaldi, Agostino; Vanegas, Mauricio; Bex, Peter J; Maiello, Guido

    2017-06-01

    The Tobii EyeX Controller is a new low-cost binocular eye tracker marketed for integration in gaming and consumer applications. The manufacturers claim that the system was conceived for natural eye gaze interaction, does not require continuous recalibration, and allows moderate head movements. The Controller is provided with an SDK to foster the development of new eye tracking applications. We review the characteristics of the device for its possible use in scientific research. We develop and evaluate an open source Matlab Toolkit that can be employed to interface with the EyeX device for gaze recording in behavioral experiments. The Toolkit provides calibration procedures tailored to both binocular and monocular experiments, as well as procedures to evaluate other eye tracking devices. The observed performance of the EyeX (i.e., accuracy < 0.6°, precision < 0.25°, latency < 50 ms, and sampling frequency ≈55 Hz) is sufficient for some classes of research application. The device can be successfully employed to measure fixation parameters, as well as saccadic, smooth pursuit and vergence eye movements. However, the relatively low sampling rate and moderate precision limit the suitability of the EyeX for monitoring micro-saccadic eye movements or for real-time gaze-contingent stimulus control. For these applications, research grade, high-cost eye tracking technology may still be necessary. Therefore, despite its limitations with respect to high-end devices, the EyeX has the potential to further the dissemination of eye tracking technology to a broad audience, and could be a valuable asset in consumer and gaming applications as well as a subset of basic and clinical research settings.
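
    For context, accuracy and precision figures of the kind quoted above are typically derived from raw gaze samples recorded while a participant fixates known targets. The following is a rough sketch of one common way to compute them, assuming gaze is already expressed in degrees of visual angle; it is not code from the paper's Matlab toolkit, and the simulated data are placeholders.

    ```python
    # Sketch of common accuracy/precision definitions for an eye tracker:
    # accuracy = mean angular offset from the known target,
    # precision = RMS of sample-to-sample angular differences.
    import numpy as np

    def accuracy_precision_deg(gaze_deg: np.ndarray, target_deg: np.ndarray):
        """gaze_deg: (N, 2) horizontal/vertical gaze angles in degrees.
        target_deg: (2,) true target position in degrees."""
        offsets = gaze_deg - target_deg
        accuracy = np.mean(np.linalg.norm(offsets, axis=1))
        diffs = np.diff(gaze_deg, axis=0)
        precision = np.sqrt(np.mean(np.sum(diffs**2, axis=1)))
        return accuracy, precision

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        samples = rng.normal(loc=[5.2, 0.1], scale=0.15, size=(300, 2))  # simulated fixation
        acc, prec = accuracy_precision_deg(samples, np.array([5.0, 0.0]))
        print(f"accuracy ≈ {acc:.2f} deg, precision ≈ {prec:.2f} deg RMS")
    ```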

  5. Infant Eye-Tracking in the Context of Goal-Directed Actions

    ERIC Educational Resources Information Center

    Corbetta, Daniela; Guan, Yu; Williams, Joshua L.

    2012-01-01

    This paper presents two methods that we applied to our research to record infant gaze in the context of goal-oriented actions using different eye-tracking devices: head-mounted and remote eye-tracking. For each type of eye-tracking system, we discuss their advantages and disadvantages, describe the particular experimental setups we used to study…

  6. Towards understanding addiction factors of mobile devices: An eye tracking study on effect of screen size.

    PubMed

    Wibirama, Sunu; Nugroho, Hanung A

    2017-07-01

    Mobile device addiction has been an important research topic in cognitive science, mental health, and human-machine interaction. Previous works observed mobile device addiction by logging mobile device activity. Although immersion has been linked as a significant predictor of video game addiction, investigation of addiction factors of mobile devices with behavioral measurement has never been done before. In this research, we demonstrated the use of eye tracking to observe the effect of screen size on the experience of immersion. We compared subjective judgment with eye movement analysis. Non-parametric analysis of immersion scores shows that screen size affects the experience of immersion (p < 0.05). Furthermore, our experimental results suggest that fixational eye movements may be used as an indicator for future investigation of mobile device addiction. Our experimental results are also useful for developing a guideline as well as an intervention strategy to deal with smartphone addiction.

  7. Using an eye tracker during medication administration to identify gaps in nursing students' contextual knowledge: an observational study.

    PubMed

    Amster, Brian; Marquard, Jenna; Henneman, Elizabeth; Fisher, Donald

    2015-01-01

    In this clinical simulation study using an eye-tracking device, 40% of senior nursing students administered a contraindicated medication to a patient. Our findings suggest that the participants who did not identify the error did not know that amoxicillin is a type of penicillin. Eye-tracking devices may be valuable for determining whether nursing students are making rule- or knowledge-based errors, a distinction not easily captured via observations and interviews.

  8. Extracting information of fixational eye movements through pupil tracking

    NASA Astrophysics Data System (ADS)

    Xiao, JiangWei; Qiu, Jian; Luo, Kaiqin; Peng, Li; Han, Peng

    2018-01-01

    Human eyes are never completely static, even when they are fixating a stationary point. These irregular, small movements, which consist of micro-tremors, micro-saccades and drifts, can prevent the fading of the images that enter our eyes. The importance of researching fixational eye movements has been experimentally demonstrated recently. However, the characteristics of fixational eye movements and their roles in visual processing have not been explained clearly, because these signals cannot yet be completely extracted. In this paper, we developed a new eye movement detection device with a high-speed camera. This device includes a beam splitter mirror, an infrared light source and a high-speed digital video camera with a frame rate of 200 Hz. To avoid the influence of head shaking, we made the device wearable by fixing the camera on a safety helmet. Using this device, pupil tracking experiments were conducted. By localizing the pupil center and performing spectrum analysis, the envelope frequency spectrum of micro-saccades, micro-tremors and drifts is shown clearly. The experimental results show that the device is feasible and effective, so that it can be applied in further characteristic analysis.
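
    To illustrate the spectral-analysis step mentioned in this abstract, the sketch below computes an amplitude spectrum from a pupil-center trace sampled at the camera's 200 Hz frame rate. The simulated trace, band limits, and all numerical values are placeholders, not the authors' data or code.

    ```python
    # Illustration of spectral analysis of a pupil-center trace: compute an
    # amplitude spectrum so low-frequency drift and higher-frequency tremor
    # components can be inspected separately. Simulated data only.
    import numpy as np

    FS = 200.0  # camera frame rate, Hz

    def amplitude_spectrum(pupil_x: np.ndarray):
        """pupil_x: 1-D horizontal pupil-center position (pixels), sampled at FS."""
        x = pupil_x - np.mean(pupil_x)          # remove DC offset
        window = np.hanning(len(x))             # reduce spectral leakage
        spectrum = np.abs(np.fft.rfft(x * window))
        freqs = np.fft.rfftfreq(len(x), d=1.0 / FS)
        return freqs, spectrum

    if __name__ == "__main__":
        t = np.arange(0, 2.0, 1.0 / FS)
        # Simulated trace: slow drift + small 70 Hz tremor + measurement noise.
        trace = 0.5 * t + 0.05 * np.sin(2 * np.pi * 70 * t) + np.random.normal(0, 0.02, t.size)
        freqs, spec = amplitude_spectrum(trace)
        drift_band = (freqs > 0) & (freqs < 5)
        tremor_band = freqs > 40
        print("drift peak:", freqs[drift_band][np.argmax(spec[drift_band])], "Hz")
        print("tremor peak:", freqs[tremor_band][np.argmax(spec[tremor_band])], "Hz")
    ```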

  9. The Right Track for Vision Correction

    NASA Technical Reports Server (NTRS)

    2003-01-01

    More and more people are putting away their eyeglasses and contact lenses as a result of laser vision correction surgery. LASIK, the most widely performed version of this surgical procedure, improves vision by reshaping the cornea, the clear front surface of the eye, using an excimer laser. One excimer laser system, Alcon's LADARVision 4000, utilizes a laser radar (LADAR) eye tracking device that gives it unmatched precision. During LASIK surgery, laser pulses must be accurately placed to reshape the cornea. A challenge to this procedure is the patient's constant eye movement. A person's eyes make small, involuntary movements known as saccadic movements about 100 times per second. Since the saccadic movements will not stop during LASIK surgery, most excimer laser systems use an eye tracking device that measures the movements and guides the placement of the laser beam. LADARVision's eye tracking device stems from the LADAR technology originally developed through several Small Business Innovation Research (SBIR) contracts with NASA's Johnson Space Center and the U.S. Department of Defense's Ballistic Missile Defense Organization (BMDO). In the 1980s, Johnson awarded Autonomous Technologies Corporation a Phase I SBIR contract to develop technology for autonomous rendezvous and docking of space vehicles to service satellites. During Phase II of the Johnson SBIR contract, Autonomous Technologies developed a prototype range and velocity imaging LADAR to demonstrate technology that could be used for this purpose.

  10. A Novel Hybrid Mental Spelling Application Based on Eye Tracking and SSVEP-Based BCI

    PubMed Central

    Stawicki, Piotr; Gembler, Felix; Rezeika, Aya; Volosyak, Ivan

    2017-01-01

    Steady state visual evoked potentials (SSVEPs)-based Brain-Computer Interfaces (BCIs), as well as eye-tracking devices, provide a pathway for re-establishing communication for people with severe disabilities. We fused these control techniques into a novel eye-tracking/SSVEP hybrid system, which utilizes eye tracking for initial rough selection and the SSVEP technology for fine target activation. Based on our previous studies, only four stimuli were used for the SSVEP aspect, granting sufficient control for most BCI users. As eye-tracking data are not used for activation of letters, false positives due to inappropriate dwell times are avoided. This novel approach combines the high speed of eye tracking systems and the high classification accuracies of low-target SSVEP-based BCIs, leading to an optimal combination of both methods. We evaluated the accuracy and speed of the proposed hybrid system with a 30-target spelling application implementing all three control approaches (pure eye tracking, SSVEP and the hybrid system) with 32 participants. Although the highest information transfer rates (ITRs) were achieved with pure eye tracking, a considerable number of subjects were not able to gain sufficient control over the stand-alone eye-tracking device or the pure SSVEP system (78.13% and 75% of the participants reached reliable control, respectively). In this respect, the proposed hybrid was the most universal (over 90% of users achieved reliable control), and outperformed the pure SSVEP system in terms of speed and user friendliness. The presented hybrid system might offer communication to a wider range of users in comparison to the standard techniques. PMID:28379187
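
    The information transfer rates (ITRs) mentioned above are conventionally computed with the Wolpaw formula; a minimal sketch follows. The target count, accuracy, and selection time in the example are illustrative placeholders, not values taken from the study.

    ```python
    # Sketch of the standard Wolpaw information-transfer-rate (ITR) calculation
    # commonly used to report BCI speller performance. Example values are invented.
    import math

    def itr_bits_per_min(n_targets: int, accuracy: float, seconds_per_selection: float) -> float:
        """Wolpaw ITR in bits/min for n_targets classes at the given selection accuracy."""
        if accuracy <= 1.0 / n_targets:
            return 0.0                      # at or below chance: report zero by convention
        bits = math.log2(n_targets)
        if accuracy < 1.0:
            bits += accuracy * math.log2(accuracy)
            bits += (1 - accuracy) * math.log2((1 - accuracy) / (n_targets - 1))
        return bits * (60.0 / seconds_per_selection)

    if __name__ == "__main__":
        # e.g. a 30-target speller, 90% selection accuracy, 4 s per selection
        print(round(itr_bits_per_min(30, 0.90, 4.0), 1), "bits/min")
    ```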

  11. The eye-tracking computer device for communication in amyotrophic lateral sclerosis.

    PubMed

    Spataro, R; Ciriacono, M; Manno, C; La Bella, V

    2014-07-01

    To explore the effectiveness of communication and the variables affecting eye-tracking computer system (ETCS) utilization in patients with late-stage amyotrophic lateral sclerosis (ALS). We performed a telephone survey of 30 patients with advanced, non-demented ALS who had been provided with an ETCS device. Median age at interview was 55 years (IQR = 48-62), with a relatively high education (13 years, IQR = 8-13). A one-off interview was conducted, and answers were provided with the help of the caregiver. The interview included items about demographic and clinical variables affecting daily ETCS utilization. The median time of ETCS device possession was 15 months (IQR = 9-20). The actual daily utilization was 300 min (IQR = 100-720), mainly for communication with relatives/caregivers, internet surfing, e-mailing, and social networking. 23.3% of patients with ALS (n = 7) had a low daily ETCS utilization; the most commonly reported causes were eye-gaze tiredness and oculomotor dysfunction. The eye-tracking computer system is a valuable device for augmentative and alternative communication (AAC) in patients with ALS, and it can be operated with good performance. The development of oculomotor impairment may limit its functional use. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  12. Video-based eye tracking for neuropsychiatric assessment.

    PubMed

    Adhikari, Sam; Stark, David E

    2017-01-01

    This paper presents a video-based eye-tracking method, ideally deployed via a mobile device or laptop-based webcam, as a tool for measuring brain function. Eye movements and pupillary motility are tightly regulated by brain circuits, are subtly perturbed by many disease states, and are measurable using video-based methods. Quantitative measurement of eye movement by readily available webcams may enable early detection and diagnosis, as well as remote/serial monitoring, of neurological and neuropsychiatric disorders. We successfully extracted computational and semantic features for 14 testing sessions, comprising 42 individual video blocks and approximately 17,000 image frames generated across several days of testing. Here, we demonstrate the feasibility of collecting video-based eye-tracking data from a standard webcam in order to assess psychomotor function. Furthermore, we were able to demonstrate through systematic analysis of this data set that eye-tracking features (in particular, radial and tangential variance on a circular visual-tracking paradigm) predict performance on well-validated psychomotor tests. © 2017 New York Academy of Sciences.
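
    As a rough illustration of the kind of feature named in this record, the sketch below computes radial and tangential variance of gaze samples relative to an ideal circular target path. The exact feature definitions used by the authors are not specified here, so this is only one plausible formulation with hypothetical inputs.

    ```python
    # Plausible (hypothetical) formulation of radial/tangential gaze variance on a
    # circular visual-tracking paradigm; not the authors' feature definitions.
    import numpy as np

    def radial_tangential_variance(gaze_xy, center_xy, radius):
        """gaze_xy: (N, 2) gaze positions recorded while tracking a circular target
        of the given radius around center_xy. Returns (radial_var, tangential_var)."""
        rel = np.asarray(gaze_xy, dtype=float) - np.asarray(center_xy, dtype=float)
        r = np.linalg.norm(rel, axis=1)
        theta = np.unwrap(np.arctan2(rel[:, 1], rel[:, 0]))
        radial_err = r - radius                  # deviation from the circle itself
        # Tangential jitter: deviation of the angular trajectory from a constant-speed
        # linear fit, converted to arc length so both measures share the same units.
        n = len(theta)
        trend = np.polyval(np.polyfit(np.arange(n), theta, 1), np.arange(n))
        tangential_err = radius * (theta - trend)
        return float(np.var(radial_err)), float(np.var(tangential_err))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        t = np.linspace(0, 2 * np.pi, 400)
        gaze = np.column_stack([10 * np.cos(t), 10 * np.sin(t)]) + rng.normal(0, 0.3, (400, 2))
        print(radial_tangential_variance(gaze, (0.0, 0.0), 10.0))
    ```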

  13. Through Their Eyes: Tracking the Gaze of Students in a Geology Field Course

    ERIC Educational Resources Information Center

    Maltese, Adam V.; Balliet, Russell N.; Riggs, Eric M.

    2013-01-01

    The focus of this research was to investigate how students learn to do fieldwork through observation. This study addressed the following questions: (1) Can mobile eye-tracking devices provide a robust source of data to investigate the observations and workflow of novice students while participating in a field exercise? If so, what are the…

  14. Etracker: A Mobile Gaze-Tracking System with Near-Eye Display Based on a Combined Gaze-Tracking Algorithm.

    PubMed

    Li, Bin; Fu, Hong; Wen, Desheng; Lo, WaiLun

    2018-05-19

    Eye tracking technology has become increasingly important for psychological analysis, medical diagnosis, driver assistance systems, and many other applications. Various gaze-tracking models have been established by previous researchers. However, there is currently no near-eye display system with accurate gaze-tracking performance and a convenient user experience. In this paper, we constructed a complete prototype of the mobile gaze-tracking system 'Etracker' with a near-eye viewing device for human gaze tracking. We proposed a combined gaze-tracking algorithm. In this algorithm, the convolutional neural network is used to remove blinking images and predict coarse gaze position, and then a geometric model is defined for accurate human gaze tracking. Moreover, we proposed using the mean value of gazes to resolve pupil center changes caused by nystagmus in calibration algorithms, so that an individual user only needs to calibrate it the first time, which makes our system more convenient. The experiments on gaze data from 26 participants show that the eye center detection accuracy is 98% and Etracker can provide an average gaze accuracy of 0.53° at a rate of 30-60 Hz.

  15. Virtual Averaging Making Nonframe-Averaged Optical Coherence Tomography Images Comparable to Frame-Averaged Images.

    PubMed

    Chen, Chieh-Li; Ishikawa, Hiroshi; Wollstein, Gadi; Bilonick, Richard A; Kagemann, Larry; Schuman, Joel S

    2016-01-01

    To develop a novel image enhancement method so that nonframe-averaged optical coherence tomography (OCT) images become comparable to active eye-tracking frame-averaged OCT images. Twenty-one eyes of 21 healthy volunteers were scanned with a non-eye-tracking, nonframe-averaged OCT device and an active eye-tracking, frame-averaged OCT device. Virtual averaging was applied to nonframe-averaged images with voxel resampling and adding amplitude deviation with 15-time repetitions. Signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), and the distance between the end of the visible nasal retinal nerve fiber layer (RNFL) and the foveola were assessed to evaluate the image enhancement effect and retinal layer visibility. Retinal thicknesses before and after processing were also measured. All virtual-averaged nonframe-averaged images showed notable improvement and clear resemblance to active eye-tracking frame-averaged images. SNR and CNR were significantly improved (SNR: 30.5 vs. 47.6 dB, CNR: 4.4 vs. 6.4 dB, original versus processed, P < 0.0001, paired t-test). The distance between the end of the visible nasal RNFL and the foveola was significantly different before (681.4 vs. 446.5 μm, Cirrus versus Spectralis, P < 0.0001) but not after processing (442.9 vs. 446.5 μm, P = 0.76). Sectoral macular total retinal and circumpapillary RNFL thicknesses showed systematic differences between Cirrus and Spectralis that became non-significant after processing. The virtual averaging method successfully improved non-tracking, nonframe-averaged OCT image quality and made the images comparable to active eye-tracking frame-averaged OCT images. Virtual averaging may enable detailed retinal structure studies on images acquired using a mixture of nonframe-averaged and frame-averaged OCT devices without concern about systematic differences in both qualitative and quantitative aspects.

  16. Virtual Averaging Making Nonframe-Averaged Optical Coherence Tomography Images Comparable to Frame-Averaged Images

    PubMed Central

    Chen, Chieh-Li; Ishikawa, Hiroshi; Wollstein, Gadi; Bilonick, Richard A.; Kagemann, Larry; Schuman, Joel S.

    2016-01-01

    Purpose Developing a novel image enhancement method so that nonframe-averaged optical coherence tomography (OCT) images become comparable to active eye-tracking frame-averaged OCT images. Methods Twenty-one eyes of 21 healthy volunteers were scanned with a non-eye-tracking, nonframe-averaged OCT device and an active eye-tracking, frame-averaged OCT device. Virtual averaging was applied to nonframe-averaged images with voxel resampling and adding amplitude deviation with 15-time repetitions. Signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), and the distance between the end of the visible nasal retinal nerve fiber layer (RNFL) and the foveola were assessed to evaluate the image enhancement effect and retinal layer visibility. Retinal thicknesses before and after processing were also measured. Results All virtual-averaged nonframe-averaged images showed notable improvement and clear resemblance to active eye-tracking frame-averaged images. SNR and CNR were significantly improved (SNR: 30.5 vs. 47.6 dB, CNR: 4.4 vs. 6.4 dB, original versus processed, P < 0.0001, paired t-test). The distance between the end of the visible nasal RNFL and the foveola was significantly different before (681.4 vs. 446.5 μm, Cirrus versus Spectralis, P < 0.0001) but not after processing (442.9 vs. 446.5 μm, P = 0.76). Sectoral macular total retinal and circumpapillary RNFL thicknesses showed systematic differences between Cirrus and Spectralis that became non-significant after processing. Conclusion The virtual averaging method successfully improved non-tracking, nonframe-averaged OCT image quality and made the images comparable to active eye-tracking frame-averaged OCT images. Translational Relevance Virtual averaging may enable detailed retinal structure studies on images acquired using a mixture of nonframe-averaged and frame-averaged OCT devices without concern about systematic differences in both qualitative and quantitative aspects. PMID:26835180
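
    The SNR gain reported in the two records above rests on the familiar principle that averaging repeated noisy realizations of the same underlying signal suppresses uncorrelated noise. The toy example below only demonstrates that averaging principle on a synthetic 1-D profile; it is not the published voxel-resampling algorithm, and all values are made up.

    ```python
    # Toy illustration of the averaging principle behind "virtual averaging":
    # averaging several noisy realizations of the same signal raises SNR roughly
    # with the square root of the number of repetitions. Synthetic data only.
    import numpy as np

    def snr_db(signal: np.ndarray, noisy: np.ndarray) -> float:
        noise = noisy - signal
        return 10 * np.log10(np.mean(signal**2) / np.mean(noise**2))

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        depth = np.linspace(0, 1, 512)
        clean = np.exp(-((depth - 0.5) ** 2) / 0.01)      # idealized depth profile
        repeats = np.stack([clean + rng.normal(0, 0.3, clean.size) for _ in range(15)])
        single = repeats[0]
        averaged = repeats.mean(axis=0)
        print(f"single frame SNR: {snr_db(clean, single):.1f} dB")
        print(f"15-repetition average SNR: {snr_db(clean, averaged):.1f} dB")
    ```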

  17. Eye-Tracking Analysis of the Figures of Anti-Smoking Health Promoting Periodical's Illustrations

    ERIC Educational Resources Information Center

    Maródi, Ágnes; Devosa, Iván; Steklács, János; Fáyné-Dombi, Alice; Buzas, Zsuzsanna; Vanya, Melinda

    2015-01-01

    Nowadays new education technologies and e-communication devices give researchers new measuring and assessing tools. Eye-tracking is one of these new methods in education. In our study we assessed 4 figures from the anti-smoking health issues of the National Institute for Health Development. In the study 22 students were included from a 7th grade…

  18. Oculomotor Behavior Metrics Change According to Circadian Phase and Time Awake

    NASA Technical Reports Server (NTRS)

    Flynn-Evans, Erin E.; Tyson, Terence L.; Cravalho, Patrick; Feick, Nathan; Stone, Leland S.

    2017-01-01

    There is a need for non-invasive, objective measures to forecast performance impairment arising from sleep loss and circadian misalignment, particularly in safety-sensitive occupations. Eye-tracking devices have been used in some operational scenarios, but such devices typically focus on eyelid closures and slow rolling eye movements and are susceptible to the intrusion of head movement artifacts. We hypothesized that an expanded suite of oculomotor behavior metrics, collected during a visual tracking task, would change according to circadian phase and time awake, and could be used as a marker of performance impairment.

  19. Encoding Strategies in Primary School Children: Insights from an Eye-Tracking Approach and the Role of Individual Differences in Attentional Control

    ERIC Educational Resources Information Center

    Roebers, Claudia M.; Schmid, Corinne; Roderer, Thomas

    2010-01-01

    The authors explored different aspects of encoding strategy use in primary school children by including (a) an encoding strategy task in which children's encoding strategy use was recorded through a remote eye-tracking device and, later, free recall and recognition for target items was assessed; and (b) tasks measuring resistance to interference…

  20. Pilots' visual scan patterns and situation awareness in flight operations.

    PubMed

    Yu, Chung-San; Wang, Eric Min-Yang; Li, Wen-Chin; Braithwaite, Graham

    2014-07-01

    Situation awareness (SA) is considered an essential prerequisite for safe flying. If the impact of visual scanning patterns on a pilot's situation awareness could be identified in flight operations, then eye-tracking tools could be integrated with flight simulators to improve training efficiency. Participating in this research were 18 qualified, mission-ready fighter pilots. The equipment included high-fidelity and fixed-base type flight simulators and mobile head-mounted eye-tracking devices to record a subject's eye movements and SA while performing air-to-surface tasks. There were significant differences in pilots' percentage of fixation in three operating phases: preparation (M = 46.09, SD = 14.79), aiming (M = 24.24, SD = 11.03), and release and break-away (M = 33.98, SD = 14.46). Also, there were significant differences in pilots' pupil sizes, which were largest in the aiming phase (M = 27,621, SD = 6390.8), followed by release and break-away (M = 27,173, SD = 5830.46), then preparation (M = 25,710, SD = 6078.79), which was the smallest. Furthermore, pilots with better SA performance showed lower perceived workload (M = 30.60, SD = 17.86), and pilots with poor SA performance showed higher perceived workload (M = 60.77, SD = 12.72). Pilots' percentage of fixation and average fixation duration among five different areas of interest showed significant differences as well. Eye-tracking devices can aid in capturing pilots' visual scan patterns and SA performance, unlike traditional flight simulators. Therefore, integrating eye-tracking devices into the simulator may be a useful method for promoting SA training in flight operations, and can provide in-depth understanding of the mechanism of visual scan patterns and information processing to improve training effectiveness in aviation.

  1. New Eye-Tracking Techniques May Revolutionize Mental Health Screening

    DTIC Science & Technology

    2015-11-04

    health? Recent progress in eye-tracking techniques is opening new avenues for quantitative, objective, simple, inexpensive, and rapid evaluation ...to check with your doctor whether any corrective action should be taken. What if similar devices could be made available for the evaluation of mental... evaluations, especially for those disorders for which a clear chemical, genetic, morphological, physiological, or histological biomarker has not yet

  2. Can You See Me Thinking (about My Answers)? Using Eye-Tracking to Illuminate Developmental Differences in Monitoring and Control Skills and Their Relation to Performance

    ERIC Educational Resources Information Center

    Roderer, Thomas; Roebers, Claudia M.

    2014-01-01

    This study focuses on relations between 7- and 9-year-old children's and adults' metacognitive monitoring and control processes. In addition to explicit confidence judgments (CJ), data for participants' control behavior during learning and recall as well as implicit CJs were collected with an eye-tracking device (Tobii 1750).…

  3. Real-time eye tracking for the assessment of driver fatigue.

    PubMed

    Xu, Junli; Min, Jianliang; Hu, Jianfeng

    2018-04-01

    Eye-tracking is an important approach for collecting evidence regarding participants' driving fatigue. In this contribution, the authors present a non-intrusive system for evaluating driver fatigue by tracking eye movement behaviours. A real-time eye-tracker was used to monitor participants' eye state and collect eye-movement data. These data are useful for gaining insight into participants' fatigue state during monotonous driving. In this study, ten healthy subjects performed continuous simulated driving for 1-2 h on a driving simulator with eye state monitoring, and the measured features of fixation time and pupil area were recorded using an eye movement tracking device. To achieve a good cost-performance ratio and fast computation time, a fuzzy K-nearest neighbour classifier was employed to evaluate and analyse the influence of different participants on the variations in drivers' fixation duration and pupil area. The findings of this study indicate that there are significant differences in the value distribution of the pupil area between normal and fatigued driving states. The results also suggest that the recognition accuracy under jackknife validation reaches about 89% on average, indicating significant potential for real-time application of the proposed approach and its capability to detect driver fatigue.

  4. Real-time eye tracking for the assessment of driver fatigue

    PubMed Central

    Xu, Junli; Min, Jianliang

    2018-01-01

    Eye-tracking is an important approach for collecting evidence regarding participants’ driving fatigue. In this contribution, the authors present a non-intrusive system for evaluating driver fatigue by tracking eye movement behaviours. A real-time eye-tracker was used to monitor participants’ eye state and collect eye-movement data. These data are useful for gaining insight into participants’ fatigue state during monotonous driving. In this study, ten healthy subjects performed continuous simulated driving for 1–2 h on a driving simulator with eye state monitoring, and the measured features of fixation time and pupil area were recorded using an eye movement tracking device. To achieve a good cost-performance ratio and fast computation time, a fuzzy K-nearest neighbour classifier was employed to evaluate and analyse the influence of different participants on the variations in drivers’ fixation duration and pupil area. The findings of this study indicate that there are significant differences in the value distribution of the pupil area between normal and fatigued driving states. The results also suggest that the recognition accuracy under jackknife validation reaches about 89% on average, indicating significant potential for real-time application of the proposed approach and its capability to detect driver fatigue. PMID:29750113
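
    As a simplified stand-in for the evaluation described in the two records above, the sketch below runs a plain (non-fuzzy) k-nearest-neighbour classifier on two simulated features, mean fixation duration and mean pupil area, with leave-one-out ("jackknife") cross-validation. The data, class labels, and parameter choices are invented for illustration and do not reproduce the authors' pipeline.

    ```python
    # Simplified stand-in: ordinary KNN (not fuzzy KNN) on simulated fixation-duration
    # and pupil-area features, evaluated with leave-one-out cross-validation.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(42)
    # Feature rows: [fixation duration (s), pupil area (a.u.)]; label 0 = alert, 1 = fatigued.
    alert = np.column_stack([rng.normal(0.25, 0.05, 30), rng.normal(900, 60, 30)])
    fatigued = np.column_stack([rng.normal(0.40, 0.08, 30), rng.normal(760, 70, 30)])
    X = np.vstack([alert, fatigued])
    y = np.array([0] * 30 + [1] * 30)

    # (In practice the features would be standardized before distance-based classification.)
    clf = KNeighborsClassifier(n_neighbors=5)
    scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
    print(f"leave-one-out accuracy: {scores.mean():.2%}")
    ```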

  5. Auditory display as feedback for a novel eye-tracking system for sterile operating room interaction.

    PubMed

    Black, David; Unger, Michael; Fischer, Nele; Kikinis, Ron; Hahn, Horst; Neumuth, Thomas; Glaser, Bernhard

    2018-01-01

    The growing number of technical systems in the operating room has increased attention on developing touchless interaction methods for sterile conditions. However, touchless interaction paradigms lack the tactile feedback found in common input devices such as mice and keyboards. We propose a novel touchless eye-tracking interaction system with auditory display as a feedback method for completing typical operating room tasks. Auditory display provides feedback concerning the selected input into the eye-tracking system as well as a confirmation of the system response. An eye-tracking system with a novel auditory display using both earcons and parameter-mapping sonification was developed to allow touchless interaction for six typical scrub nurse tasks. An evaluation with novice participants compared auditory display with visual display with respect to reaction time and a series of subjective measures. When using auditory display to substitute for the lost tactile feedback during eye-tracking interaction, participants exhibit reduced reaction time compared to using visual-only display. In addition, the auditory feedback led to lower subjective workload and higher usefulness and system acceptance ratings. Due to the absence of tactile feedback for eye-tracking and other touchless interaction methods, auditory display is shown to be a useful and necessary addition to new interaction concepts for the sterile operating room, reducing reaction times while improving subjective measures, including usefulness, user satisfaction, and cognitive workload.

  6. GazeParser: an open-source and multiplatform library for low-cost eye tracking and analysis.

    PubMed

    Sogo, Hiroyuki

    2013-09-01

    Eye movement analysis is an effective method for research on visual perception and cognition. However, recordings of eye movements present practical difficulties related to the cost of the recording devices and the programming of device controls for use in experiments. GazeParser is an open-source library for low-cost eye tracking and data analysis; it consists of a video-based eyetracker and libraries for data recording and analysis. The libraries are written in Python and can be used in conjunction with PsychoPy and VisionEgg experimental control libraries. Three eye movement experiments are reported on performance tests of GazeParser. These showed that the means and standard deviations for errors in sampling intervals were less than 1 ms. Spatial accuracy ranged from 0.7° to 1.2°, depending on participant. In gap/overlap tasks and antisaccade tasks, the latency and amplitude of the saccades detected by GazeParser agreed with those detected by a commercial eyetracker. These results showed that the GazeParser demonstrates adequate performance for use in psychological experiments.
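
    Libraries of this kind typically detect saccades and fixations from the recorded gaze stream. The sketch below shows a generic velocity-threshold saccade detector to illustrate that type of analysis; it is deliberately not written against the GazeParser API, and the threshold value is an arbitrary placeholder.

    ```python
    # Generic velocity-threshold saccade detection, shown only to illustrate the
    # kind of analysis such libraries perform; this is NOT the GazeParser API.
    import numpy as np

    def detect_saccades(x, y, fs, vel_threshold_deg_s=30.0):
        """x, y: gaze position in degrees sampled at fs Hz.
        Returns a list of (onset_index, offset_index) pairs."""
        vx = np.gradient(np.asarray(x, dtype=float)) * fs
        vy = np.gradient(np.asarray(y, dtype=float)) * fs
        speed = np.hypot(vx, vy)                      # angular speed, deg/s
        fast = speed > vel_threshold_deg_s
        saccades, start = [], None
        for i, flag in enumerate(fast):
            if flag and start is None:
                start = i
            elif not flag and start is not None:
                saccades.append((start, i - 1))
                start = None
        if start is not None:
            saccades.append((start, len(fast) - 1))
        return saccades
    ```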

  7. A MATLAB-based eye tracking control system using non-invasive helmet head restraint in the macaque.

    PubMed

    De Luna, Paolo; Mohamed Mustafar, Mohamed Faiz Bin; Rainer, Gregor

    2014-09-30

    Tracking eye position is vital for behavioral and neurophysiological investigations in systems and cognitive neuroscience. Infrared camera systems, which are now available, can be used for eye tracking without the need to surgically implant magnetic search coils. These systems are generally employed using rigid head fixation in monkeys, which maintains the eye in a constant position and facilitates eye tracking. We investigate the use of non-rigid head fixation using a helmet that constrains only general head orientation and allows some freedom of movement. We present a MATLAB software solution to gather and process eye position data, present visual stimuli, interact with various devices, provide experimenter feedback and store data for offline analysis. Our software solution achieves excellent timing performance due to the use of data streaming, instead of the traditionally employed data storage mode for processing analog eye position data. We present behavioral data from two monkeys, demonstrating that adequate performance levels can be achieved on a simple fixation paradigm and show how performance depends on parameters such as fixation window size. Our findings suggest that non-rigid head restraint can be employed for behavioral training and testing on a variety of gaze-dependent visual paradigms, reducing the need for rigid head restraint systems for some applications. While developed for the macaque monkey, our system can of course work equally well for applications in human eye tracking where head constraint is undesirable. Copyright © 2014. Published by Elsevier B.V.
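
    The dependence of task performance on fixation window size reported here reduces to a simple acceptance test applied to the gaze stream. Below is a minimal sketch of such a test, with hypothetical parameter names and default values; it is not taken from the published MATLAB software.

    ```python
    # Minimal sketch of a fixation-window check: a trial counts as a successful
    # fixation if every gaze sample over the hold period falls within a square
    # window around the fixation point. Parameter names/defaults are illustrative.
    import numpy as np

    def held_fixation(gaze_deg, target_deg, window_deg=2.0, hold_s=1.0, fs=250.0):
        """gaze_deg: (N, 2) gaze samples in degrees; target_deg: (2,) fixation point.
        Returns True if the most recent hold_s seconds of samples stay inside the window."""
        n_required = int(hold_s * fs)
        if len(gaze_deg) < n_required:
            return False
        recent = np.asarray(gaze_deg, dtype=float)[-n_required:]
        inside = np.all(np.abs(recent - np.asarray(target_deg)) <= window_deg / 2.0, axis=1)
        return bool(np.all(inside))
    ```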

  8. Technical Report of Successful Deployment of Tandem Visual Tracking During Live Laparoscopic Cholecystectomy Between Novice and Expert Surgeon.

    PubMed

    Puckett, Yana; Baronia, Benedicto C

    2016-09-20

    With the recent advances in eye tracking technology, it is now possible to track surgeons' eye movements while engaged in a surgical task or when surgical residents practice their surgical skills. Several studies have compared eye movements of surgical experts and novices and developed techniques to assess surgical skill on the basis of eye movement utilizing simulators and live surgery. None have evaluated simultaneous visual tracking between an expert and a novice during live surgery. Here, we describe a successful simultaneous deployment of visual tracking of an expert and a novice during live laparoscopic cholecystectomy. One expert surgeon and one chief surgical resident at an accredited surgical program in Lubbock, TX, USA performed a live laparoscopic cholecystectomy while simultaneously wearing the visual tracking devices. Their visual attitudes and movements were monitored via video recordings. The recordings were then analyzed for correlation between the expert and the novice. The visual attitudes and movements correlated approximately 85% between an expert surgeon and a chief surgical resident. The surgery was carried out uneventfully, and the data was abstracted with ease. We conclude that simultaneous deployment of visual tracking during live laparoscopic surgery is a possibility. More studies and subjects are needed to verify the success of our results and obtain data analysis.

  9. Using eye-tracking technology for communication in Rett syndrome: perceptions of impact.

    PubMed

    Vessoyan, Kelli; Steckle, Gill; Easton, Barb; Nichols, Megan; Mok Siu, Victoria; McDougall, Janette

    2018-04-27

    Studies have investigated the use of eye-tracking technology to assess cognition in individuals with Rett syndrome, but few have looked at this access method for communication for this group. Loss of speech, decreased hand use, and severe motor apraxia significantly impact functional communication for this population. Eye gaze is one modality that may be used successfully by individuals with Rett syndrome. This multiple case study explored whether using eye-tracking technology, with ongoing support from a team of augmentative and alternative communication (AAC) therapists, could help four participants with Rett syndrome meet individualized communication goals. Two secondary objectives were to examine parents' perspectives on (a) the psychosocial impact of their child's use of the technology, and (b) satisfaction with using the technology. All four participants were rated by the treating therapists to have made improvement on their goals. According to both quantitative findings and descriptive information, eye-tracking technology was viewed by parents as contributing to participants' improved psychosocial functioning. Parents reported being highly satisfied with both the device and the clinical services received. This study provides initial evidence that eye-tracking may be perceived as a worthwhile and potentially satisfactory technology to support individuals with Rett syndrome in communicating. Future, more rigorous research that addresses the limitations of a case study design is required to substantiate study findings.

  10. Brief Report: Patterns of Eye Movements in Face to Face Conversation Are Associated with Autistic Traits--Evidence from a Student Sample

    ERIC Educational Resources Information Center

    Vabalas, Andrius; Freeth, Megan

    2016-01-01

    The current study investigated whether the amount of autistic traits shown by an individual is associated with viewing behaviour during a face-to-face interaction. The eye movements of 36 neurotypical university students were recorded using a mobile eye-tracking device. High amounts of autistic traits were neither associated with reduced looking…

  11. Eye-Tracking Technology and the Dynamics of Natural Gaze Behavior in Sports: A Systematic Review of 40 Years of Research.

    PubMed

    Kredel, Ralf; Vater, Christian; Klostermann, André; Hossner, Ernst-Joachim

    2017-01-01

    Reviewing 60 studies on natural gaze behavior in sports, it becomes clear that, over the last 40 years, the use of eye-tracking devices has considerably increased. Specifically, this review reveals the large variance of methods applied, analyses performed, and measures derived within the field. The results of sub-sample analyses suggest that sports-related eye-tracking research strives, on the one hand, for ecologically valid test settings (i.e., viewing conditions and response modes), while on the other, for experimental control along with high measurement accuracy (i.e., controlled test conditions with high-frequency eye-trackers linked to algorithmic analyses). To meet both demands, some promising compromises of methodological solutions have been proposed, in particular the integration of robust mobile eye-trackers in motion-capture systems. However, as the fundamental trade-off between laboratory and field research cannot be solved by technological means, researchers need to carefully weigh the arguments for one or the other approach by accounting for the respective consequences. Nevertheless, for future research on dynamic gaze behavior in sports, further development of the current mobile eye-tracking methodology seems highly advisable to allow for the acquisition and algorithmic analyses of larger amounts of gaze data and, further, to increase the explanatory power of the derived results.

  12. Integrating Eye Trackers with Handwriting Tablets to Discover Difficulties of Solving Geometry Problems

    ERIC Educational Resources Information Center

    Lin, John J. H.; Lin, Sunny S. J.

    2018-01-01

    To deepen our understanding of those aspects of problems that cause the most difficulty for solvers, this study integrated eye-tracking with handwriting devices to investigate problem solvers' online processes while solving geometry problems. We are interested in whether the difference between successful and unsuccessful solvers can be identified…

  13. Analysis of eye-tracking experiments performed on a Tobii T60

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banks, David C

    2008-01-01

    Commercial eye-gaze trackers have the potential to be an important tool for quantifying the benefits of new visualization techniques. The expense of such trackers has made their use relatively infrequent in visualization studies. As such, it is difficult for researchers to compare multiple devices: obtaining several demonstration models is impractical in cost and time, and quantitative measures from real-world use are not readily available. In this paper, we present a sample protocol to determine the accuracy of a gaze-tracking device.
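
    A protocol like this ultimately reduces to converting the on-screen offset between a known target and the reported gaze point into visual angle. The sketch below shows that conversion; the screen geometry and viewing distance are assumed values for illustration, not measurements from the paper.

    ```python
    # Converting an on-screen gaze error in pixels to degrees of visual angle.
    # Screen geometry and viewing distance below are assumptions for illustration.
    import math

    def pixel_offset_to_deg(offset_px, px_per_cm, viewing_distance_cm):
        """Visual angle (degrees) subtended by an on-screen offset of offset_px pixels."""
        offset_cm = offset_px / px_per_cm
        return math.degrees(math.atan2(offset_cm, viewing_distance_cm))

    if __name__ == "__main__":
        px_per_cm = 1280 / 34.0   # horizontal resolution / display width in cm (assumed)
        error_deg = pixel_offset_to_deg(25, px_per_cm, 60.0)
        print(f"a 25 px gaze offset at 60 cm corresponds to about {error_deg:.2f} deg")
    ```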

  14. Technical Report of Successful Deployment of Tandem Visual Tracking During Live Laparoscopic Cholecystectomy Between Novice and Expert Surgeon

    PubMed Central

    Baronia, Benedicto C

    2016-01-01

    With the recent advances in eye tracking technology, it is now possible to track surgeons’ eye movements while engaged in a surgical task or when surgical residents practice their surgical skills. Several studies have compared eye movements of surgical experts and novices and developed techniques to assess surgical skill on the basis of eye movement utilizing simulators and live surgery. None have evaluated simultaneous visual tracking between an expert and a novice during live surgery. Here, we describe a successful simultaneous deployment of visual tracking of an expert and a novice during live laparoscopic cholecystectomy. One expert surgeon and one chief surgical resident at an accredited surgical program in Lubbock, TX, USA performed a live laparoscopic cholecystectomy while simultaneously wearing the visual tracking devices. Their visual attitudes and movements were monitored via video recordings. The recordings were then analyzed for correlation between the expert and the novice. The visual attitudes and movements correlated approximately 85% between an expert surgeon and a chief surgical resident. The surgery was carried out uneventfully, and the data was abstracted with ease. We conclude that simultaneous deployment of visual tracking during live laparoscopic surgery is a possibility. More studies and subjects are needed to verify the success of our results and obtain data analysis. PMID:27774359

  15. Web Camera Based Eye Tracking to Assess Visual Memory on a Visual Paired Comparison Task.

    PubMed

    Bott, Nicholas T; Lange, Alex; Rentz, Dorene; Buffalo, Elizabeth; Clopton, Paul; Zola, Stuart

    2017-01-01

    Background: Web cameras are increasingly part of the standard hardware of most smart devices. Eye movements can often provide a noninvasive "window on the brain," and the recording of eye movements using web cameras is a burgeoning area of research. Objective: This study investigated a novel methodology for administering a visual paired comparison (VPC) decisional task using a web camera. To further assess this method, we examined the correlation between a standard eye-tracking camera automated scoring procedure [obtaining images at 60 frames per second (FPS)] and a manually scored procedure using a built-in laptop web camera (obtaining images at 3 FPS). Methods: This was an observational study of 54 clinically normal older adults. Subjects completed three in-clinic visits with simultaneous recording of eye movements on a VPC decision task by a standard eye tracker camera and a built-in laptop-based web camera. Inter-rater reliability was analyzed using Siegel and Castellan's kappa formula. Pearson correlations were used to investigate the correlation between VPC performance using a standard eye tracker camera and a built-in web camera. Results: Strong associations were observed on VPC mean novelty preference score between the 60 FPS eye tracker and 3 FPS built-in web camera at each of the three visits (r = 0.88-0.92). Inter-rater agreement of web camera scoring at each time point was high (κ = 0.81-0.88). There were strong relationships on VPC mean novelty preference score between 10, 5, and 3 FPS training sets (r = 0.88-0.94). Significantly fewer data quality issues were encountered using the built-in web camera. Conclusions: Human scoring of a VPC decisional task using a built-in laptop web camera correlated strongly with automated scoring of the same task using a standard high frame rate eye tracker camera. While this method is not suitable for eye tracking paradigms requiring the collection and analysis of fine-grained metrics, such as fixation points, built-in web cameras are a standard feature of most smart devices (e.g., laptops, tablets, smart phones) and can be effectively employed to track eye movements on decisional tasks with high accuracy and minimal cost.
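
    The two statistics reported in this record, inter-rater agreement and between-method correlation, can be computed with standard library routines as sketched below. Cohen's kappa is used here as a stand-in for the Siegel and Castellan formulation, and the data are simulated placeholders rather than the study's measurements.

    ```python
    # Sketch of the agreement and correlation statistics on simulated data:
    # Cohen's kappa (stand-in for Siegel & Castellan's kappa) and Pearson's r.
    import numpy as np
    from scipy.stats import pearsonr
    from sklearn.metrics import cohen_kappa_score

    rng = np.random.default_rng(7)

    # Inter-rater agreement: two raters' left/right looking judgments per frame.
    rater_a = rng.integers(0, 2, 200)
    rater_b = np.where(rng.random(200) < 0.9, rater_a, 1 - rater_a)   # ~90% raw agreement
    print("kappa:", round(cohen_kappa_score(rater_a, rater_b), 2))

    # Correlation between eye-tracker (60 FPS) and web-camera (3 FPS) novelty scores.
    tracker_score = rng.normal(0.65, 0.08, 54)
    webcam_score = tracker_score + rng.normal(0, 0.03, 54)
    r, p = pearsonr(tracker_score, webcam_score)
    print(f"Pearson r = {r:.2f}, p = {p:.1e}")
    ```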

  16. Eye Tracking of Occluded Self-Moved Targets: Role of Haptic Feedback and Hand-Target Dynamics.

    PubMed

    Danion, Frederic; Mathew, James; Flanagan, J Randall

    2017-01-01

    Previous studies on smooth pursuit eye movements have shown that humans can continue to track the position of their hand, or a target controlled by the hand, after it is occluded, thereby demonstrating that arm motor commands contribute to the prediction of target motion driving pursuit eye movements. Here, we investigated this predictive mechanism by manipulating both the complexity of the hand-target mapping and the provision of haptic feedback. Two hand-target mappings were used, either a rigid (simple) one in which hand and target motion matched perfectly or a nonrigid (complex) one in which the target behaved as a mass attached to the hand by means of a spring. Target animation was obtained by asking participants to oscillate a lightweight robotic device that provided (or not) haptic feedback consistent with the target dynamics. Results showed that as long as 7 s after target occlusion, smooth pursuit continued to be the main contributor to total eye displacement (∼60%). However, the accuracy of eye-tracking varied substantially across experimental conditions. In general, eye-tracking was less accurate under the nonrigid mapping, as reflected by higher positional and velocity errors. Interestingly, haptic feedback helped to reduce the detrimental effects of target occlusion when participants used the nonrigid mapping, but not when they used the rigid one. Overall, we conclude that the ability to maintain smooth pursuit in the absence of visual information can extend to complex hand-target mappings, but the provision of haptic feedback is critical for the maintenance of accurate eye-tracking performance.

  17. Eye Tracking of Occluded Self-Moved Targets: Role of Haptic Feedback and Hand-Target Dynamics

    PubMed Central

    Mathew, James

    2017-01-01

    Abstract Previous studies on smooth pursuit eye movements have shown that humans can continue to track the position of their hand, or a target controlled by the hand, after it is occluded, thereby demonstrating that arm motor commands contribute to the prediction of target motion driving pursuit eye movements. Here, we investigated this predictive mechanism by manipulating both the complexity of the hand-target mapping and the provision of haptic feedback. Two hand-target mappings were used, either a rigid (simple) one in which hand and target motion matched perfectly or a nonrigid (complex) one in which the target behaved as a mass attached to the hand by means of a spring. Target animation was obtained by asking participants to oscillate a lightweight robotic device that provided (or not) haptic feedback consistent with the target dynamics. Results showed that as long as 7 s after target occlusion, smooth pursuit continued to be the main contributor to total eye displacement (∼60%). However, the accuracy of eye-tracking varied substantially across experimental conditions. In general, eye-tracking was less accurate under the nonrigid mapping, as reflected by higher positional and velocity errors. Interestingly, haptic feedback helped to reduce the detrimental effects of target occlusion when participants used the nonrigid mapping, but not when they used the rigid one. Overall, we conclude that the ability to maintain smooth pursuit in the absence of visual information can extend to complex hand-target mappings, but the provision of haptic feedback is critical for the maintenance of accurate eye-tracking performance. PMID:28680964

  18. Eye gaze tracking based on the shape of pupil image

    NASA Astrophysics Data System (ADS)

    Wang, Rui; Qiu, Jian; Luo, Kaiqing; Peng, Li; Han, Peng

    2018-01-01

    The eye tracker is an important instrument for research in psychology, widely used in attention, visual perception, reading and other fields of research. Because of its potential role in human-computer interaction, eye gaze tracking has been a topic of research in many fields over the last decades. Nowadays, with the development of technology, non-intrusive methods are increasingly welcomed. In this paper, we present a method based on the shape of the pupil image to estimate the gaze point of human eyes without any intrusive devices such as a hat or a pair of glasses. After applying an ellipse fitting algorithm to the captured pupil image, we can determine the direction of fixation from the shape of the pupil. The innovative aspect of this method is to use the shape of the pupil so that much more complicated algorithms can be avoided. The proposed approach is helpful for the study of eye gaze tracking: it needs only one camera, without infrared light, and uses changes in the shape of the pupil to determine the direction of gaze; no additional conditions are required.
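
    A crude version of the idea in this record, inferring gaze deviation from how elliptical the pupil appears, can be sketched with OpenCV as below. The threshold value and the small-angle geometric assumption (minor/major axis ratio approximating the cosine of the deviation angle) are simplifications for illustration, not the authors' algorithm.

    ```python
    # Illustrative sketch (not the authors' implementation): threshold the dark
    # pupil, fit an ellipse to its contour, and treat the minor/major axis ratio
    # as the cosine of the angle between the eye's optical axis and the camera axis.
    import math
    import cv2
    import numpy as np

    def gaze_angle_from_pupil(gray_eye_image: np.ndarray):
        """gray_eye_image: 8-bit grayscale eye image.
        Returns (deviation_angle_deg, ellipse) or None if no pupil contour is found."""
        _, mask = cv2.threshold(gray_eye_image, 50, 255, cv2.THRESH_BINARY_INV)  # dark pupil
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        pupil = max(contours, key=cv2.contourArea)
        if len(pupil) < 5:                      # fitEllipse needs at least 5 points
            return None
        ellipse = cv2.fitEllipse(pupil)         # ((cx, cy), (axis lengths), rotation)
        axes = sorted(ellipse[1])
        ratio = axes[0] / axes[1]               # ~1.0 when looking straight at the camera
        angle_deg = math.degrees(math.acos(min(1.0, ratio)))
        return angle_deg, ellipse
    ```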

  19. The Eyes Have It

    NASA Technical Reports Server (NTRS)

    1999-01-01

    NASA's Ames Research Center contracted with SRI International to construct a device that would be able to anticipate, track, and monitor involuntary ocular movement horizontally, vertically, and with respect to depth of field. This development helped research institutions to understand the eye. The Eyetracker, manufactured and distributed by Forward Optical Technologies, Inc., is now used in the clinical/medical field.

  20. Validation of mobile eye-tracking as novel and efficient means for differentiating progressive supranuclear palsy from Parkinson's disease

    PubMed Central

    Marx, Svenja; Respondek, Gesine; Stamelou, Maria; Dowiasch, Stefan; Stoll, Josef; Bremmer, Frank; Oertel, Wolfgang H.; Höglinger, Günter U.; Einhäuser, Wolfgang

    2012-01-01

    Background: The decreased ability to carry out vertical saccades is a key symptom of Progressive Supranuclear Palsy (PSP). Objective measurement devices can help to reliably detect subtle eye movement disturbances to improve sensitivity and specificity of the clinical diagnosis. The present study aims at transferring findings from restricted stationary video-oculography (VOG) to a wearable head-mounted device, which can be readily applied in clinical practice. Methods: We investigated the eye movements in 10 possible or probable PSP patients, 11 Parkinson's disease (PD) patients, and 10 age-matched healthy controls (HCs) using a mobile, gaze-driven video camera setup (EyeSeeCam). Ocular movements were analyzed during a standardized fixation protocol and in an unrestricted real-life scenario while walking along a corridor. Results: The EyeSeeCam detected prominent impairment of both saccade velocity and amplitude in PSP patients, differentiating them from PD and HCs. Differences were particularly evident for saccades in the vertical plane, and stronger for saccades than for other eye movements. Differences were more pronounced during the standardized protocol than in the real-life scenario. Conclusions: Combined analysis of saccade velocity and saccade amplitude during the fixation protocol with the EyeSeeCam provides a simple, rapid (<20 s), and reliable tool to differentiate clinically established PSP patients from PD and HCs. As such, our findings prepare the ground for using wearable eye-tracking in patients with uncertain diagnoses. PMID:23248593

  1. SET: a pupil detection method using sinusoidal approximation

    PubMed Central

    Javadi, Amir-Homayoun; Hakimi, Zahra; Barati, Morteza; Walsh, Vincent; Tcheang, Lili

    2015-01-01

    Mobile eye-tracking in external environments remains challenging, despite recent advances in eye-tracking software and hardware engineering. Many current methods fail to deal with the vast range of outdoor lighting conditions and the speed at which these can change. This confines experiments to artificial environments where conditions must be tightly controlled. Additionally, the emergence of low-cost eye tracking devices calls for the development of analysis tools that enable non-technical researchers to process the output of their images. We have developed a fast and accurate method (known as “SET”) that is suitable even for natural environments with uncontrolled, dynamic and even extreme lighting conditions. We compared the performance of SET with that of two open-source alternatives by processing two collections of eye images: images of natural outdoor scenes with extreme lighting variations (“Natural”); and images of less challenging indoor scenes (“CASIA-Iris-Thousand”). We show that SET excelled in outdoor conditions and was faster, without significant loss of accuracy, indoors. SET offers a low cost eye-tracking solution, delivering high performance even in challenging outdoor environments. It is offered through an open-source MATLAB toolkit as well as a dynamic-link library (“DLL”), which can be imported into many programming languages including C# and Visual Basic in Windows OS (www.eyegoeyetracker.co.uk). PMID:25914641

  2. Walking simulator for evaluation of ophthalmic devices

    NASA Astrophysics Data System (ADS)

    Barabas, James; Woods, Russell L.; Peli, Eli

    2005-03-01

    Simulating mobility tasks in a virtual environment reduces risk for research subjects, and allows for improved experimental control and measurement. We are currently using a simulated shopping mall environment (where subjects walk on a treadmill in front of a large projected video display) to evaluate a number of ophthalmic devices developed at the Schepens Eye Research Institute for people with vision impairment, particularly visual field defects. We have conducted experiments to study subjects' perception of "safe passing distance" when walking towards stationary obstacles. The subjects' binary responses about potential collisions are analyzed by fitting a psychometric function, which gives an estimate of the subject's perceived safe passing distance and the variability of subject responses. The system also enables simulations of visual field defects using head and eye tracking, allowing better understanding of the impact of visual field loss. Technical infrastructure for our simulated walking environment includes a custom eye and head tracking system, a gait feedback system to adjust treadmill speed, and a handheld 3-D pointing device. Images are generated by a graphics workstation, which contains a model with photographs of storefronts from an actual shopping mall, where concurrent validation experiments are being conducted.
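
    The psychometric-function analysis mentioned here can be sketched as a logistic fit to the binary collision judgments, with the 50% point read off as the perceived safe passing distance. The offsets, response model, and parameter values below are simulated placeholders, and a least-squares fit is used for brevity where a maximum-likelihood fit would be more standard.

    ```python
    # Sketch: fit a logistic psychometric function to simulated binary
    # "would collide / would not collide" responses versus lateral offset,
    # then report the 50% point as the perceived safe passing distance.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(x, x50, slope):
        return 1.0 / (1.0 + np.exp(-(x - x50) / slope))

    rng = np.random.default_rng(3)
    offsets_m = np.repeat(np.linspace(0.1, 1.5, 8), 20)              # lateral offsets tested (m)
    p_true = logistic(offsets_m, 0.7, 0.12)
    responses = (rng.random(offsets_m.size) < p_true).astype(float)  # 1 = "no collision"

    (x50, slope), _ = curve_fit(logistic, offsets_m, responses, p0=[0.8, 0.2])
    print(f"estimated perceived safe passing distance ≈ {x50:.2f} m (slope {slope:.2f})")
    ```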

  3. Advanced autostereoscopic display for G-7 pilot project

    NASA Astrophysics Data System (ADS)

    Hattori, Tomohiko; Ishigaki, Takeo; Shimamoto, Kazuhiro; Sawaki, Akiko; Ishiguchi, Tsuneo; Kobayashi, Hiromi

    1999-05-01

    An advanced auto-stereoscopic display is described that permits the observation of a stereo pair by several persons simultaneously, without the use of special glasses or any kind of head-mounted tracking devices worn by the viewers. The system is composed of a right-eye system, a left-eye system and a sophisticated head-tracking system. In each eye system, a transparent-type color liquid crystal imaging plate is used with a special backlight unit. The backlight unit consists of a monochrome 2D display and a large-format convex lens, and it directs light only to the correct eye of each viewer. The right-eye perspective system is combined with the left-eye perspective system by a half mirror in order to function as a time-parallel stereoscopic system. The viewers' IR image is captured through, and focused by, the large-format convex lens and fed back to the backlight as a modulated binary half-face image. The auto-stereoscopic display employs the TTL method for accurate head tracking. The system was operated as a stereoscopic TV phone between the telemedicine department of Duke University and the radiology department of Nagoya University School of Medicine over a high-speed GIBN digital line. Applications are also described in this paper.

  4. Webcam mouse using face and eye tracking in various illumination environments.

    PubMed

    Lin, Yuan-Pin; Chao, Yi-Ping; Lin, Chung-Chih; Chen, Jyh-Horng

    2005-01-01

    Nowadays, thanks to improved computer performance and the widespread use of webcams, it has become possible to capture users' gestures for human-computer interaction with a PC via webcam. However, illumination variation can dramatically decrease the stability and accuracy of skin-based face tracking systems, especially on notebook or portable platforms. In this study we present an effective illumination recognition technique, combining a K-Nearest Neighbor (KNN) classifier with an adaptive skin model, to realize a real-time tracking system. We have demonstrated that the accuracy of face detection based on the KNN classifier is higher than 92% in various illumination environments. In real-time implementation, the system successfully tracks the user's face and eye features at 15 fps on standard notebook platforms. Although the KNN classifier is initialized with only five environments, the system permits users to define and add their own environments to the KNN model for computer access. Finally, based on this efficient tracking algorithm, we have developed a "Webcam Mouse" system to control the PC cursor using face and eye tracking. Preliminary studies with "point and click" style PC web games also show promising applications in future consumer electronics markets.
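
    As a minimal sketch of the illumination-recognition step described above (not the authors' implementation), the snippet below classifies a frame's illumination environment with a K-Nearest Neighbor classifier over simple global colour statistics, which would then be used to pick the matching adaptive skin model. The feature choice, environment labels and brightness levels are invented for illustration.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def illumination_features(frame_rgb):
        """Simple global features per frame: mean and std of each colour channel."""
        pixels = frame_rgb.reshape(-1, 3).astype(float)
        return np.concatenate([pixels.mean(axis=0), pixels.std(axis=0)])

    # Hypothetical training frames for five illumination environments, simulated here
    # by different overall brightness levels (real frames would come from the webcam)
    rng = np.random.default_rng(0)
    env_brightness = {"office": 120, "daylight": 200, "dim": 50, "backlit": 160, "warm": 90}
    frames, labels = [], []
    for env, level in env_brightness.items():
        for _ in range(5):
            frames.append(np.clip(rng.normal(level, 20, (120, 160, 3)), 0, 255).astype(np.uint8))
            labels.append(env)

    knn = KNeighborsClassifier(n_neighbors=3)
    knn.fit([illumination_features(f) for f in frames], labels)

    # At run time, classify the current frame and switch to the matching skin model
    current = np.clip(rng.normal(195, 20, (120, 160, 3)), 0, 255).astype(np.uint8)
    print("detected environment:", knn.predict([illumination_features(current)])[0])  # "daylight"
    ```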

  5. Optical eye tracking system for real-time noninvasive tumor localization in external beam radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Via, Riccardo, E-mail: riccardo.via@polimi.it; Fassi, Aurora; Fattori, Giovanni

    Purpose: External beam radiotherapy currently represents an important therapeutic strategy for the treatment of intraocular tumors. Accurate target localization and efficient compensation of involuntary eye movements are crucial to avoid deviations in dose distribution with respect to the treatment plan. This paper describes an eye tracking system (ETS) based on noninvasive infrared video imaging. The system was designed for capturing the tridimensional (3D) ocular motion and provides an on-line estimation of intraocular lesions position based on a priori knowledge coming from volumetric imaging. Methods: Eye tracking is performed by localizing cornea and pupil centers on stereo images captured by two calibrated video cameras, exploiting eye reflections produced by infrared illumination. Additionally, torsional eye movements are detected by template matching in the iris region of eye images. This information allows estimating the 3D position and orientation of the eye by means of an eye local reference system. By combining ETS measurements with volumetric imaging for treatment planning [computed tomography (CT) and magnetic resonance (MR)], one is able to map the position of the lesion to be treated in local eye coordinates, thus enabling real-time tumor referencing during treatment setup and irradiation. Experimental tests on an eye phantom and seven healthy subjects were performed to assess ETS tracking accuracy. Results: Measurements on phantom showed an overall median accuracy within 0.16 mm and 0.40° for translations and rotations, respectively. Torsional movements were affected by 0.28° median uncertainty. On healthy subjects, the gaze direction error ranged between 0.19° and 0.82° at a median working distance of 29 cm. The median processing time of the eye tracking algorithm was 18.60 ms, thus allowing eye monitoring up to 50 Hz. Conclusions: A noninvasive ETS prototype was designed to perform real-time target localization and eye movement monitoring during ocular radiotherapy treatments. The device aims at improving state-of-the-art invasive procedures based on surgical implantation of radiopaque clips and repeated acquisition of X-ray images, with expected positive effects on treatment quality and patient outcome.

  6. Optical eye tracking system for real-time noninvasive tumor localization in external beam radiotherapy.

    PubMed

    Via, Riccardo; Fassi, Aurora; Fattori, Giovanni; Fontana, Giulia; Pella, Andrea; Tagaste, Barbara; Riboldi, Marco; Ciocca, Mario; Orecchia, Roberto; Baroni, Guido

    2015-05-01

    External beam radiotherapy currently represents an important therapeutic strategy for the treatment of intraocular tumors. Accurate target localization and efficient compensation of involuntary eye movements are crucial to avoid deviations in dose distribution with respect to the treatment plan. This paper describes an eye tracking system (ETS) based on noninvasive infrared video imaging. The system was designed for capturing the tridimensional (3D) ocular motion and provides an on-line estimation of intraocular lesions position based on a priori knowledge coming from volumetric imaging. Eye tracking is performed by localizing cornea and pupil centers on stereo images captured by two calibrated video cameras, exploiting eye reflections produced by infrared illumination. Additionally, torsional eye movements are detected by template matching in the iris region of eye images. This information allows estimating the 3D position and orientation of the eye by means of an eye local reference system. By combining ETS measurements with volumetric imaging for treatment planning [computed tomography (CT) and magnetic resonance (MR)], one is able to map the position of the lesion to be treated in local eye coordinates, thus enabling real-time tumor referencing during treatment setup and irradiation. Experimental tests on an eye phantom and seven healthy subjects were performed to assess ETS tracking accuracy. Measurements on phantom showed an overall median accuracy within 0.16 mm and 0.40° for translations and rotations, respectively. Torsional movements were affected by 0.28° median uncertainty. On healthy subjects, the gaze direction error ranged between 0.19° and 0.82° at a median working distance of 29 cm. The median processing time of the eye tracking algorithm was 18.60 ms, thus allowing eye monitoring up to 50 Hz. A noninvasive ETS prototype was designed to perform real-time target localization and eye movement monitoring during ocular radiotherapy treatments. The device aims at improving state-of-the-art invasive procedures based on surgical implantation of radiopaque clips and repeated acquisition of X-ray images, with expected positive effects on treatment quality and patient outcome.
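
    The two records above localize cornea and pupil centres on images from two calibrated cameras; as a generic sketch of the triangulation step (a standard linear/DLT formulation, not the authors' implementation), the snippet below recovers a 3D point from its pixel coordinates in two views with known projection matrices. The camera parameters and point are illustrative.

    ```python
    import numpy as np

    def triangulate(P1, P2, pt1, pt2):
        """Linear (DLT) triangulation of one 3D point from two calibrated views.
        P1, P2: 3x4 projection matrices; pt1, pt2: (u, v) pixel coordinates."""
        u1, v1 = pt1
        u2, v2 = pt2
        A = np.array([u1 * P1[2] - P1[0],
                      v1 * P1[2] - P1[1],
                      u2 * P2[2] - P2[0],
                      v2 * P2[2] - P2[1]])
        _, _, vt = np.linalg.svd(A)
        X = vt[-1]
        return X[:3] / X[3]                   # homogeneous -> Euclidean (same units as P)

    # Illustrative calibration: identical intrinsics, cameras 60 mm apart along x
    K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([np.eye(3), np.array([[-60.0], [0.0], [0.0]])])

    point = np.array([10.0, 5.0, 300.0, 1.0])           # true 3D position (mm)
    pix1 = P1 @ point; pix1 = pix1[:2] / pix1[2]
    pix2 = P2 @ point; pix2 = pix2[:2] / pix2[2]
    print(triangulate(P1, P2, pix1, pix2))              # ~ [10, 5, 300]
    ```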

  7. Method for Improving EEG Based Emotion Recognition by Combining It with Synchronized Biometric and Eye Tracking Technologies in a Non-invasive and Low Cost Way

    PubMed Central

    López-Gil, Juan-Miguel; Virgili-Gomá, Jordi; Gil, Rosa; Guilera, Teresa; Batalla, Iolanda; Soler-González, Jorge; García, Roberto

    2016-01-01

    Technical advances, particularly the integration of wearable and embedded sensors, facilitate tracking of physiological responses in a less intrusive way. Currently, there are many devices that allow gathering biometric measurements from human beings, such as EEG headsets or health bracelets. The massive data sets generated by tracking EEG and physiology may be used, among other things, to infer knowledge about human moods and emotions. Apart from direct biometric signal measurement, eye tracking systems are nowadays capable of determining the point of gaze of users interacting in ICT environments, which provides added value for research in many different areas, such as psychology or marketing. We present a process in which devices for eye tracking, biometric, and EEG signal measurements are used synchronously for studying both basic and complex emotions. We selected the least intrusive devices for the different signal data collections given the study requirements and cost constraints, so that users would behave in the most natural way possible. On the one hand, we have been able to determine the basic emotions participants were experiencing by means of valence and arousal. On the other hand, a complex emotion such as empathy has also been detected. To validate the usefulness of this approach, a study involving forty-four people was carried out, in which they were exposed to a series of affective stimuli while their EEG activity, biometric signals, and eye position were synchronously recorded to detect self-regulation. The hypothesis of the work was that people who self-regulated would show significantly different results when their EEG data were analyzed. Participants were divided into two groups depending on whether Electrodermal Activity (EDA) data indicated that they self-regulated or not. The comparison of the results obtained using different machine learning algorithms for emotion recognition shows that using EEG activity alone as a predictor of self-regulation does not allow properly determining whether a person is self-regulating his or her emotions while watching affective stimuli. However, adequately combining different data sources in a synchronous way to detect emotions makes it possible to overcome the limitations of single detection methods. PMID:27594831

  8. Method for Improving EEG Based Emotion Recognition by Combining It with Synchronized Biometric and Eye Tracking Technologies in a Non-invasive and Low Cost Way.

    PubMed

    López-Gil, Juan-Miguel; Virgili-Gomá, Jordi; Gil, Rosa; García, Roberto

    2016-01-01

    Technical advances, particularly the integration of wearable and embedded sensors, facilitate tracking of physiological responses in a less intrusive way. Currently, there are many devices that allow gathering biometric measurements from human beings, such as EEG headsets or health bracelets. The massive data sets generated by tracking EEG and physiology may be used, among other things, to infer knowledge about human moods and emotions. Apart from direct biometric signal measurement, eye tracking systems are nowadays capable of determining the point of gaze of users interacting in ICT environments, which provides added value for research in many different areas, such as psychology or marketing. We present a process in which devices for eye tracking, biometric, and EEG signal measurements are used synchronously for studying both basic and complex emotions. We selected the least intrusive devices for the different signal data collections given the study requirements and cost constraints, so that users would behave in the most natural way possible. On the one hand, we have been able to determine the basic emotions participants were experiencing by means of valence and arousal. On the other hand, a complex emotion such as empathy has also been detected. To validate the usefulness of this approach, a study involving forty-four people was carried out, in which they were exposed to a series of affective stimuli while their EEG activity, biometric signals, and eye position were synchronously recorded to detect self-regulation. The hypothesis of the work was that people who self-regulated would show significantly different results when their EEG data were analyzed. Participants were divided into two groups depending on whether Electrodermal Activity (EDA) data indicated that they self-regulated or not. The comparison of the results obtained using different machine learning algorithms for emotion recognition shows that using EEG activity alone as a predictor of self-regulation does not allow properly determining whether a person is self-regulating his or her emotions while watching affective stimuli. However, adequately combining different data sources in a synchronous way to detect emotions makes it possible to overcome the limitations of single detection methods.

  9. The Learning Benefits of Using Eye Trackers to Enhance the Geospatial Abilities of Elementary School Students

    ERIC Educational Resources Information Center

    Wang, Hsiao-shen; Chen, Yi-Ting; Lin, Chih-Hung

    2014-01-01

    In this study, we examined the spatial abilities of students using eye-movement tracking devices to identify and analyze their characteristics. For this research, 12 students aged 11-12 years participated as novices and 4 mathematics students participated as experts. A comparison of the visual-spatial abilities of each group showed key factors of…

  10. Remote vs. head-mounted eye-tracking: a comparison using radiologists reading mammograms

    NASA Astrophysics Data System (ADS)

    Mello-Thoms, Claudia; Gur, David

    2007-03-01

    Eye position monitoring has been used for decades in radiology to determine how radiologists interpret medical images. Using these devices, researchers have made several discoveries about the perception/decision-making process, such as the importance of comparisons of perceived abnormalities with selected areas of the background, the likelihood that a true lesion will attract visual attention early in the reading process, and the finding that most misses attract prolonged visual dwell, often comparable to dwell in the location of reported lesions. However, eye position tracking is a cumbersome process, which often requires the observer to wear a helmet containing the eye tracker itself and a magnetic head tracker that allows for the computation of head position. Observers tend to complain of fatigue after wearing the gear for a prolonged time. Recently, with the advances made in remote eye-tracking, the use of head-mounted systems has seemed destined to become a thing of the past. In this study we evaluated a remote eye tracking system and compared it to a head-mounted system as radiologists read a case set of one-view mammograms on a high-resolution display. We compared visual search parameters between the two systems, such as time to hit the location of the lesion for the first time, amount of dwell time in the location of the lesion, total time analyzing the image, etc. We also evaluated the observers' impressions of both systems and their perceptions of the restrictions of each system.
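
    Whichever tracker is used, the visual search parameters compared above reduce to simple computations over a fixation list; the sketch below computes time to first hit and total dwell time inside a rectangular lesion area of interest (AOI). The fixation record format, coordinates and AOI are hypothetical.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Fixation:
        x: float         # screen coordinates (pixels)
        y: float
        start: float     # onset time (ms from image onset)
        duration: float  # ms

    def aoi_metrics(fixations, aoi):
        """aoi = (x_min, y_min, x_max, y_max). Returns (time to first hit in ms
        or None if never hit, total dwell time in ms inside the AOI)."""
        x0, y0, x1, y1 = aoi
        first_hit, dwell = None, 0.0
        for f in fixations:
            if x0 <= f.x <= x1 and y0 <= f.y <= y1:
                if first_hit is None:
                    first_hit = f.start
                dwell += f.duration
        return first_hit, dwell

    # Hypothetical reading of one mammogram with a lesion AOI
    fixations = [Fixation(200, 300, 0, 250), Fixation(640, 410, 260, 400),
                 Fixation(650, 420, 680, 900), Fixation(120, 700, 1600, 300)]
    lesion_aoi = (600, 380, 720, 480)
    print(aoi_metrics(fixations, lesion_aoi))   # (260, 1300.0)
    ```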

  11. Application of TrackEye in equine locomotion research.

    PubMed

    Drevemo, S; Roepstorff, L; Kallings, P; Johnston, C J

    1993-01-01

    TrackEye is an analysis system applicable to equine biokinematic studies. It covers the whole process from digitizing of images through automatic target tracking to analysis. Key components in the system are an image workstation for processing video images and a high-resolution film-to-video scanner for 16-mm film. A recording module controls the input device and handles the capture of image sequences into a videodisc system, and a tracking module is able to follow reference markers automatically. The system offers flexible analysis, including calculations of marker displacements, distances and joint angles, velocities and accelerations. TrackEye was used to study the effects of phenylbutazone on the fetlock and carpal joint angle movements in a horse with a mild lameness caused by osteoarthritis in the fetlock joint of a forelimb. Significant differences, most evident before treatment, were observed in the minimum fetlock and carpal joint angles when contralateral limbs were compared (p < 0.001). The minimum fetlock angle and the minimum carpal joint angle were significantly greater in the lame limb before treatment compared to those 6, 37 and 49 h after the last treatment (p < 0.001).
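
    As a sketch of the joint-angle calculation mentioned above, the function below computes the angle at a joint from three tracked marker positions (proximal segment, joint centre, distal segment) in a single frame; the marker names and coordinates are illustrative, not TrackEye output.

    ```python
    import numpy as np

    def joint_angle(proximal, joint, distal):
        """Angle (degrees) at `joint` formed by the two limb segments,
        given 2D or 3D marker coordinates from one tracked frame."""
        u = np.asarray(proximal, float) - np.asarray(joint, float)
        v = np.asarray(distal, float) - np.asarray(joint, float)
        cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

    # Hypothetical fetlock markers (metacarpus, fetlock joint, pastern), in mm
    print(round(joint_angle((0, 300), (0, 0), (80, -150)), 1))   # ~151.9 degrees
    ```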

  12. Use of Eye Tracking as an Innovative Instructional Method in Surgical Human Anatomy.

    PubMed

    Sánchez-Ferrer, María Luísa; Grima-Murcia, María Dolores; Sánchez-Ferrer, Francisco; Hernández-Peñalver, Ana Isabel; Fernández-Jover, Eduardo; Sánchez Del Campo, Francisco

    Tobii glasses can record corneal reflection of infrared light to track pupil position and to map gaze focus in the video recording. Eye tracking has been proposed for use in training and coaching as a visually guided control interface. The aim of our study was to test the potential use of these glasses in various situations: explanations of anatomical structures on tablet-type electronic devices, explanations of anatomical models and dissected cadavers, and during the prosection thereof. An additional aim of the study was to test the use of the glasses during laparoscopies performed on Thiel-embalmed cadavers (which allow pneumoinsufflation and exact reproduction of the laparoscopic surgical technique). The device was also tried out in actual surgery (both laparoscopy and open surgery). We performed a pilot study using the Tobii glasses in the dissection room at our School of Medicine and in the operating room at our hospital. To evaluate usefulness, a survey was designed for use among students, instructors, and practicing physicians. The results were satisfactory, with the usefulness of this tool supported by more than 80% positive responses to most questions. There was no inconvenience for surgeons, and patient safety was ensured in the real laparoscopy. To our knowledge, this is the first publication to demonstrate the usefulness of eye tracking in practical instruction of human anatomy, as well as in teaching clinical anatomy and surgical techniques in the dissection and operating rooms. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  13. Fly's Eye camera system: optical imaging using a hexapod platform

    NASA Astrophysics Data System (ADS)

    Jaskó, Attila; Pál, András; Vida, Krisztián; Mészáros, László; Csépány, Gergely; Mező, György

    2014-07-01

    The Fly's Eye Project is a high resolution, high coverage time-domain survey in multiple optical passbands: our goal is to cover the entire visible sky above 30° horizontal altitude with a cadence of ~3 min. Imaging is going to be performed by 19 wide-field cameras mounted on a hexapod platform resembling a fly's eye. Using a hexapod developed and built by our team allows us to create a highly fault-tolerant instrument that uses the sky as a reference to define its own tracking motion. The virtual axis of the platform is automatically aligned with the Earth's rotational axis; therefore the same mechanics can be used independently of the geographical location of the device. Its enclosure makes it capable of autonomous observing and of withstanding harsh environmental conditions. We briefly introduce the electrical, mechanical and optical design concepts of the instrument and summarize our early results, focusing on sidereal tracking. Because the hexapod design makes the construction independent of the actual location, it is considerably easier to build, install and operate a network of such devices around the world.

  14. Evaluation of an eye-pointer interaction device for human-computer interaction.

    PubMed

    Cáceres, Enrique; Carrasco, Miguel; Ríos, Sebastián

    2018-03-01

    Advances in eye-tracking technology have led to better human-computer interaction, including control of a computer without any kind of physical contact. This research describes the transformation of a commercial eye-tracker for use as an alternative peripheral device in human-computer interaction, implementing a pointer that needs only the eye movements of a user facing a computer screen, thus replacing the need to control the software by hand movements. The experiment was performed with 30 test individuals who used the prototype with a set of educational videogames. The results show that, although most of the test subjects would prefer a mouse to control the pointer, the prototype tested has an empirical precision similar to that of the mouse, both when controlling its movements and when clicking on a point of the screen.

  15. An improved apparatus of infrared videopupillography for monitoring pupil size

    NASA Astrophysics Data System (ADS)

    Huang, T.-.; Ko, M.-.; Ouyang, Y.; Chen, Y.-.; Sone, B.-.; Ou-Yang, M.; Chiou, J.-.

    2014-10-01

    Intraocular pressure (IOP) can generally be used to diagnose or track glaucoma because it is one of the physiological parameters associated with the disease, but IOP is difficult to measure consistently under different measurement conditions. In addition, diabetes is associated with diabetic autonomic neuropathy (DAN). The pupil size response may provide an indirect probe of the underlying neuronal pathways, so abnormal pupil size may be related to DAN. Hence an infrared videopupillography is needed for tracking glaucoma and for exploring the relation between pupil size and DAN. Our previous research proposed an infrared videopupillography to monitor pupil size under different light stimuli in a dark room. This portable infrared videopupillography contains a camera, a beam splitter, visible-light LEDs for stimulating the eyes, and infrared LEDs for illuminating the eyes, and it can be mounted on any eyeglass frame. However, it could be adjusted in only two dimensions, so we could not zoom in or out on the eyes. Moreover, the eye diameter curves were jagged rather than smooth because of light spots, long eyelashes, and blinks. Therefore, we redesigned the optical path of our device to allow adjustment in three dimensions. We can now zoom in on the eye to increase the resolution and to avoid the LED light spots; the light-spot problem is addressed by setting the distance between the IR LED and the CCD appropriately. The new device has a smaller volume and a lower price than our previous videopupillography. We hope the new infrared videopupillography proposed in this paper can achieve early detection of autonomic neuropathy in the future.

  16. Gaze-contingent control for minimally invasive robotic surgery.

    PubMed

    Mylonas, George P; Darzi, Ara; Yang, Guang Zhong

    2006-09-01

    Recovering tissue depth and deformation during robotically assisted minimally invasive procedures is an important step towards motion compensation, stabilization and co-registration with preoperative data. This work demonstrates that eye gaze derived from binocular eye tracking can be effectively used to recover 3D motion and deformation of the soft tissue. A binocular eye-tracking device was integrated into the stereoscopic surgical console. After calibration, the 3D fixation point of the participating subjects could be accurately resolved in real time. A CT-scanned phantom heart model was used to demonstrate the accuracy of gaze-contingent depth extraction and motion stabilization of the soft tissue. The dynamic response of the oculomotor system was assessed with the proposed framework by using autoregressive modeling techniques. In vivo data were also used to perform gaze-contingent decoupling of cardiac and respiratory motion. Depth reconstruction, deformation tracking, and motion stabilization of the soft tissue were possible with binocular eye tracking. The dynamic response of the oculomotor system was able to cope with frequencies likely to occur under most routine minimally invasive surgical operations. The proposed framework presents a novel approach towards the tight integration of a human and a surgical robot where interaction in response to sensing is required to be under the control of the operating surgeon.
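
    A minimal sketch of recovering a 3D fixation point from binocular gaze, assuming each eye contributes an origin and a calibrated gaze direction: the point closest in the least-squares sense to the two gaze rays is taken as the fixation. This is a generic ray-intersection formulation under those assumptions, not the calibration or framework used in the paper above.

    ```python
    import numpy as np

    def fixation_point(origins, directions):
        """Least-squares intersection of gaze rays (one origin/direction per eye)."""
        A, b = np.zeros((3, 3)), np.zeros(3)
        for o, d in zip(origins, directions):
            d = np.asarray(d, float)
            d /= np.linalg.norm(d)
            P = np.eye(3) - np.outer(d, d)     # projector orthogonal to this ray
            A += P
            b += P @ np.asarray(o, float)
        return np.linalg.solve(A, b)           # fails only if the rays are parallel

    # Hypothetical eyes 65 mm apart, both converging on a point 400 mm ahead
    left, right = np.array([-32.5, 0.0, 0.0]), np.array([32.5, 0.0, 0.0])
    target = np.array([0.0, 0.0, 400.0])
    print(fixation_point([left, right], [target - left, target - right]))  # ~ [0, 0, 400]
    ```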

  17. An eye-tracking paradigm for analyzing the processing time of sentences with different linguistic complexities.

    PubMed

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics.

  18. An Eye-Tracking Paradigm for Analyzing the Processing Time of Sentences with Different Linguistic Complexities

    PubMed Central

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics. PMID:24950184
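
    As a sketch of the analysis described in the two records above, the snippet below computes, across trials, the proportion of gaze samples on the target picture in each time bin and reports the first bin in which that proportion exceeds a criterion as a crude stand-in for the decision moment. The bin size, criterion, sampling rate and data layout are assumptions, not the authors' exact procedure.

    ```python
    import numpy as np

    def target_fixation_curve(on_target, fs=250, bin_ms=100):
        """on_target: trials x samples boolean array (True = gaze on the target picture).
        Returns bin start times (ms) and mean proportion of target fixation per bin."""
        samples_per_bin = int(fs * bin_ms / 1000)
        n_bins = on_target.shape[1] // samples_per_bin
        trimmed = on_target[:, :n_bins * samples_per_bin]
        binned = trimmed.reshape(on_target.shape[0], n_bins, samples_per_bin)
        return np.arange(n_bins) * bin_ms, binned.mean(axis=(0, 2))

    def decision_moment(times, curve, criterion=0.75):
        above = np.nonzero(curve >= criterion)[0]
        return times[above[0]] if above.size else None

    # Synthetic example: 20 trials, 3 s at 250 Hz, gaze shifts to the target around 1.5 s
    rng = np.random.default_rng(1)
    on_target = rng.random((20, 750)) < np.where(np.arange(750) < 375, 0.5, 0.9)
    t, curve = target_fixation_curve(on_target)
    print("decision moment ~", decision_moment(t, curve), "ms")   # ~1500 ms
    ```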

  19. Focus and perspective adaptive digital surgical microscope: optomechanical design and experimental implementation

    NASA Astrophysics Data System (ADS)

    Claus, Daniel; Reichert, Carsten; Herkommer, Alois

    2017-05-01

    This paper relates to the improvement of conventional surgical stereo microscopy via the application of digital recording devices and adaptive optics. The research is aimed at improving the working conditions of the surgeon during the operation, such that free head movement is possible. The depth clues known from conventional stereo microscopy in interaction with the human eye's functionality, such as convergence, disparity, angular elevation, parallax, and accommodation, are implemented in a digital recording system via adaptive optomechanical components. Two laterally moving pupil apertures have been used mimicking the digital implementation of the eye's vergence and head motion. The natural eye's accommodation is mimicked via the application of a tunable lens. Additionally, another system has been built, which enables tracking the surgeon's eye pupil through a digital displaying stereoscopic microscope to supply the necessary information for steering the recording system. The optomechanical design and experimental results for both systems, digital recording stereoscopic microscope and pupil tracking system, are shown.

  20. Strabismus and Amblyopia.

    ERIC Educational Resources Information Center

    Trief, E.; Morse, A. R.

    1988-01-01

    Strabismus and amblyopia are two common childhood vision conditions requiring early identification and treatment. Screening devices include external examination of the eye, ability to track, a cover test, acuity tests, and stereoscopic tests. Treatment includes patching therapy, use of glasses, orthoptics, CAM vision stimulator, or a combination…

  1. Design of integrated eye tracker-display device for head mounted systems

    NASA Astrophysics Data System (ADS)

    David, Y.; Apter, B.; Thirer, N.; Baal-Zedaka, I.; Efron, U.

    2009-08-01

    We propose an eye tracker/display system based on a novel dual-function device, termed ETD, which allows the eye tracker and the display to share optical paths and supports on-chip processing. The proposed ETD design is based on a CMOS chip combining Liquid-Crystal-on-Silicon (LCoS) micro-display technology with a near-infrared (NIR) active pixel sensor imager. The eye-tracking operation captures the NIR light back-reflected from the eye's retina; the retinal image is then used to detect the current direction of the eye's gaze. The design of the eye-tracking imager is based on "deep p-well" pixel technology, providing low crosstalk while shielding the active pixel circuitry, which serves the imaging and display drivers, from the photo charges generated in the substrate. The use of the ETD in an HMD enables a very compact design suitable for smart goggle applications. A preliminary optical, electronic and digital design of the goggle and its associated ETD chip and digital control are presented.

  2. Impaired Oculomotor Behavior of Children with Developmental Dyslexia in Antisaccades and Predictive Saccades Tasks

    PubMed Central

    Lukasova, Katerina; Silva, Isadora P.; Macedo, Elizeu C.

    2016-01-01

    Analysis of eye movement patterns during tracking tasks represents a potential way to identify differences in the cognitive processing and motor mechanisms underlying reading in dyslexic children before the occurrence of school failure. The current study aimed to evaluate the pattern of eye movements in antisaccades, predictive saccades and visually guided saccades in typical readers and readers with developmental dyslexia. The study included 30 children (age M = 11; SD = 1.67), 15 diagnosed with developmental dyslexia (DG) and 15 regular readers (CG), matched by age, gender and school grade. Cognitive assessment was performed prior to the eye-tracking task, during which both eyes were registered using the Tobii® 1750 eye-tracking device. The results demonstrated a lower correct antisaccade rate in dyslexic children compared to the controls (p < 0.001, DG = 25%, CG = 37%). Dyslexic children also made fewer saccades at predictive latency (p < 0.001, DG = 34%, CG = 46%, predictive latency within −300 to 120 ms with the target onset as the 0 point). No between-group difference was found for visually guided saccades. In this task, both groups showed shorter latency for right-side targets. The results indicated altered oculomotor behavior in dyslexic children, which has been reported in previous studies. We extend these findings by demonstrating impaired implicit learning of the target's time/position patterns in dyslexic children. PMID:27445945

  3. Rotational symmetric HMD with eye-tracking capability

    NASA Astrophysics Data System (ADS)

    Liu, Fangfang; Cheng, Dewen; Wang, Qiwei; Wang, Yongtian

    2016-10-01

    As an important auxiliary function of head-mounted displays (HMDs), eye tracking plays a key role in intelligent human-machine interaction. In this paper, an eye-tracking HMD system (ET-HMD) is designed based on a rotationally symmetric optical system. The tracking principle is based on pupil-corneal reflection. The ET-HMD system comprises three optical paths for virtual display, infrared illumination, and eye tracking. The display optics is shared by the three optical paths and consists of four spherical lenses. For the eye-tracking path, an extra imaging lens is added to match the image sensor and achieve eye tracking. The display optics provides users a 40° diagonal FOV with a 0.61″ OLED, a 19 mm eye clearance, and a 10 mm exit pupil diameter. The eye-tracking path can capture a 15 mm × 15 mm area of the user's eye. The average MTF is above 0.1 at 26 lp/mm for the display path, and exceeds 0.2 at 46 lp/mm for the eye-tracking path. Eye illumination is simulated using LightTools with an eye model and an 850 nm near-infrared LED (NIR-LED). The results of the simulation show that the illumination of the NIR-LED can cover the area of the eye model through the display optics, which is sufficient for eye tracking. Integrating an eye-tracking feature into the HMD optics can help improve the HMD experience of users.

  4. iShadow: Design of a Wearable, Real-Time Mobile Gaze Tracker.

    PubMed

    Mayberry, Addison; Hu, Pan; Marlin, Benjamin; Salthouse, Christopher; Ganesan, Deepak

    2014-06-01

    Continuous, real-time tracking of eye gaze is valuable in a variety of scenarios including hands-free interaction with the physical world, detection of unsafe behaviors, leveraging visual context for advertising, life logging, and others. While eye tracking is commonly used in clinical trials and user studies, it has not bridged the gap to everyday consumer use. The challenge is that a real-time eye tracker is a power-hungry and computation-intensive device which requires continuous sensing of the eye using an imager running at many tens of frames per second, and continuous processing of the image stream using sophisticated gaze estimation algorithms. Our key contribution is the design of an eye tracker that dramatically reduces the sensing and computation needs for eye tracking, thereby achieving orders of magnitude reductions in power consumption and form-factor. The key idea is that eye images are extremely redundant, therefore we can estimate gaze by using a small subset of carefully chosen pixels per frame. We instantiate this idea in a prototype hardware platform equipped with a low-power image sensor that provides random access to pixel values, a low-power ARM Cortex M3 microcontroller, and a bluetooth radio to communicate with a mobile phone. The sparse pixel-based gaze estimation algorithm is a multi-layer neural network learned using a state-of-the-art sparsity-inducing regularization function that minimizes the gaze prediction error while simultaneously minimizing the number of pixels used. Our results show that we can operate at roughly 70mW of power, while continuously estimating eye gaze at the rate of 30 Hz with errors of roughly 3 degrees.

  5. iShadow: Design of a Wearable, Real-Time Mobile Gaze Tracker

    PubMed Central

    Mayberry, Addison; Hu, Pan; Marlin, Benjamin; Salthouse, Christopher; Ganesan, Deepak

    2015-01-01

    Continuous, real-time tracking of eye gaze is valuable in a variety of scenarios including hands-free interaction with the physical world, detection of unsafe behaviors, leveraging visual context for advertising, life logging, and others. While eye tracking is commonly used in clinical trials and user studies, it has not bridged the gap to everyday consumer use. The challenge is that a real-time eye tracker is a power-hungry and computation-intensive device which requires continuous sensing of the eye using an imager running at many tens of frames per second, and continuous processing of the image stream using sophisticated gaze estimation algorithms. Our key contribution is the design of an eye tracker that dramatically reduces the sensing and computation needs for eye tracking, thereby achieving orders of magnitude reductions in power consumption and form-factor. The key idea is that eye images are extremely redundant, therefore we can estimate gaze by using a small subset of carefully chosen pixels per frame. We instantiate this idea in a prototype hardware platform equipped with a low-power image sensor that provides random access to pixel values, a low-power ARM Cortex M3 microcontroller, and a bluetooth radio to communicate with a mobile phone. The sparse pixel-based gaze estimation algorithm is a multi-layer neural network learned using a state-of-the-art sparsity-inducing regularization function that minimizes the gaze prediction error while simultaneously minimizing the number of pixels used. Our results show that we can operate at roughly 70mW of power, while continuously estimating eye gaze at the rate of 30 Hz with errors of roughly 3 degrees. PMID:26539565
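
    The sparse-pixel neural network described in the two iShadow records is not reproduced here; as a simplified, hedged stand-in for the core idea (predicting gaze while forcing most pixels to contribute nothing), the sketch below fits a row-sparse multi-task lasso so that a pixel is either used for both gaze coordinates or dropped entirely. The data are synthetic and the linear model is a deliberate simplification of the paper's neural network.

    ```python
    import numpy as np
    from sklearn.linear_model import MultiTaskLasso

    rng = np.random.default_rng(0)
    n_frames, n_pixels = 500, 400              # e.g. 20x20 low-resolution eye images
    X = rng.normal(size=(n_frames, n_pixels))

    # Synthetic gaze (x, y) that truly depends on only 12 of the 400 pixels
    true_w = np.zeros((n_pixels, 2))
    informative = rng.choice(n_pixels, size=12, replace=False)
    true_w[informative] = rng.normal(size=(12, 2))
    Y = X @ true_w + 0.05 * rng.normal(size=(n_frames, 2))

    # Row-sparse fit: the penalty zeroes whole pixels rather than single weights
    model = MultiTaskLasso(alpha=0.05).fit(X, Y)
    used = np.nonzero(np.any(model.coef_ != 0, axis=0))[0]
    print(f"pixels kept: {used.size} of {n_pixels}")   # only a small subset survives
    ```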

  6. Eye/head tracking technology to improve HCI with iPad applications.

    PubMed

    Lopez-Basterretxea, Asier; Mendez-Zorrilla, Amaia; Garcia-Zapirain, Begoña

    2015-01-22

    In order to improve human-computer interaction (HCI) for people with special needs, this paper presents an alternative form of interaction which uses the iPad's front camera and eye/head tracking technology. With this capability operating in the background, the user can control already developed or new applications for the iPad by moving their eyes and/or head. Many techniques are currently used to detect facial features such as the eyes or the face itself. Open-source libraries exist for this purpose, such as OpenCV, which make it possible to apply very reliable and accurate detection algorithms, such as Haar cascades, using high-level programming. All processing is undertaken in real time, and it is therefore important to pay close attention to the limited resources (processing capacity) of devices such as the iPad. The system was validated in tests involving 22 users of different ages and characteristics (people with dark and light-colored eyes and with/without glasses). These tests were performed to assess user/device interaction and to ascertain whether the system works properly. The system obtained an accuracy of between 60% and 100% in the three test exercises taken into consideration. The results showed that the Haar cascade was highly effective, detecting faces in 100% of cases, unlike eyes and pupils, where interference (light and shade) reduced effectiveness. In addition to ascertaining the effectiveness of the system via these exercises, the demo application has also helped to show that user constraints need not affect the enjoyment and use of this type of technology. In short, the results obtained are encouraging, and these systems may continue to be developed, extended and updated in the future.

  7. Eye/Head Tracking Technology to Improve HCI with iPad Applications

    PubMed Central

    Lopez-Basterretxea, Asier; Mendez-Zorrilla, Amaia; Garcia-Zapirain, Begoña

    2015-01-01

    In order to improve human-computer interaction (HCI) for people with special needs, this paper presents an alternative form of interaction which uses the iPad's front camera and eye/head tracking technology. With this capability operating in the background, the user can control already developed or new applications for the iPad by moving their eyes and/or head. Many techniques are currently used to detect facial features such as the eyes or the face itself. Open-source libraries exist for this purpose, such as OpenCV, which make it possible to apply very reliable and accurate detection algorithms, such as Haar cascades, using high-level programming. All processing is undertaken in real time, and it is therefore important to pay close attention to the limited resources (processing capacity) of devices such as the iPad. The system was validated in tests involving 22 users of different ages and characteristics (people with dark and light-colored eyes and with/without glasses). These tests were performed to assess user/device interaction and to ascertain whether the system works properly. The system obtained an accuracy of between 60% and 100% in the three test exercises taken into consideration. The results showed that the Haar cascade was highly effective, detecting faces in 100% of cases, unlike eyes and pupils, where interference (light and shade) reduced effectiveness. In addition to ascertaining the effectiveness of the system via these exercises, the demo application has also helped to show that user constraints need not affect the enjoyment and use of this type of technology. In short, the results obtained are encouraging, and these systems may continue to be developed, extended and updated in the future. PMID:25621603
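
    The two records above lean on OpenCV's Haar cascade detectors; a minimal Python/OpenCV sketch of that detection step is shown below (the published system runs on the iPad itself, so this desktop snippet only approximates the idea). The cascade file names come from the cascades bundled with OpenCV.

    ```python
    import cv2

    # Cascades bundled with OpenCV for frontal faces and eyes
    face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

    cap = cv2.VideoCapture(0)                  # front-facing camera / webcam
    ok, frame = cap.read()
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            roi = gray[y:y + h, x:x + w]       # search for eyes only inside the face
            eyes = eye_cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=5)
            print(f"face at {(x, y, w, h)} with {len(eyes)} eye(s) detected")
    cap.release()
    ```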

  8. Use of head-worn sensors to detect lapses in vigilance through the measurement of PERCLOS and cerebral blood flow velocity

    NASA Astrophysics Data System (ADS)

    McIntire, Lindsey K.; McKinley, R. Andy; Goodyear, Chuck; McIntire, John P.

    2017-05-01

    The purpose of this study was to determine the ability of an eye-tracker to detect changes in vigilance performance compared to the common method of using cerebral blood flow velocities (CBFV). Sixteen subjects completed this study. Each participant performed a 40-minute vigilance task while wearing an eye-tracker and a transcranial Doppler (TCD) on each of four separate days. The results indicate that the percentage of eye closure (PERCLOS) measured by the eye-tracker increased, and right CBFV measured by the TCD decreased, as vigilance performance declined. PERCLOS (left eye r = -.72, right eye r = -.67) correlated more strongly with changes in performance than CBFV (r = .54). We conclude that PERCLOS, as measured by a head-worn eye tracking system, may serve as a compelling alternative (or supplemental) indicator of impending or concurrent performance declines in operational settings where sustained attention or vigilance is required. Such head-worn, or perhaps even off-body, oculometric sensor systems could potentially overcome some of the practical disadvantages inherent in TCD data collection for operational purposes. If portability and discomfort challenges with TCD can be overcome, both TCD and eye tracking might be advantageously combined for even greater performance monitoring than can be offered by any single device.
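
    As a sketch of the PERCLOS measure referred to above: given per-sample eyelid-closure flags from the eye-tracker, PERCLOS is the fraction of time the eye is (nearly) closed within a window; the snippet below computes it over one-minute windows and correlates it with a per-minute performance score. The sampling rate, window length, closure criterion and synthetic data are assumptions, not the study's protocol.

    ```python
    import numpy as np

    def perclos(closed, fs, window_s=60):
        """closed: boolean array, True when the eye is (nearly) closed.
        Returns PERCLOS per non-overlapping window of `window_s` seconds."""
        n = int(fs * window_s)
        n_windows = closed.size // n
        return closed[:n_windows * n].reshape(n_windows, n).mean(axis=1)

    # Synthetic 40-minute vigil at 60 Hz: eye closures become more frequent over time
    rng = np.random.default_rng(2)
    t = np.arange(40 * 60 * 60)
    closed = rng.random(t.size) < (0.02 + 0.10 * t / t.size)
    performance = np.linspace(0.95, 0.70, 40) + 0.02 * rng.normal(size=40)  # hits/min

    p = perclos(closed, fs=60)
    r = np.corrcoef(p, performance)[0, 1]
    print(f"correlation between PERCLOS and performance: r = {r:.2f}")      # negative
    ```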

  9. Eye-movements and Voice as Interface Modalities to Computer Systems

    NASA Astrophysics Data System (ADS)

    Farid, Mohsen M.; Murtagh, Fionn D.

    2003-03-01

    We investigate the visual and vocal modalities of interaction with computer systems. We focus our attention on the integration of visual and vocal interface as possible replacement and/or additional modalities to enhance human-computer interaction. We present a new framework for employing eye gaze as a modality of interface. While voice commands, as means of interaction with computers, have been around for a number of years, integration of both the vocal interface and the visual interface, in terms of detecting user's eye movements through an eye-tracking device, is novel and promises to open the horizons for new applications where a hand-mouse interface provides little or no apparent support to the task to be accomplished. We present an array of applications to illustrate the new framework and eye-voice integration.

  10. Disk space and load time requirements for eye movement biometric databases

    NASA Astrophysics Data System (ADS)

    Kasprowski, Pawel; Harezlak, Katarzyna

    2016-06-01

    Biometric identification is a very popular area of interest nowadays. Problems with the so-called physiological methods, like fingerprint or iris recognition, have resulted in increased attention to methods measuring behavioral patterns. Eye movement based biometric (EMB) identification is one of the interesting behavioral methods, and thanks to the intensive development of eye tracking devices it has become possible to define new methods for processing the eye movement signal. Such methods should be supported by efficient storage used to collect eye movement data and provide it for further analysis. The aim of this research was to evaluate various setups enabling such a storage choice. Various aspects were taken into consideration, such as disk space usage and the time required for loading and saving the whole data set or chosen parts of it.

  11. On Biometrics With Eye Movements.

    PubMed

    Zhang, Youming; Juhola, Martti

    2017-09-01

    Eye movements are a relatively novel data source for biometric identification. As the video cameras used for eye tracking become smaller and more efficient, this data source could offer interesting opportunities for the development of eye movement biometrics. In this paper, we study primarily biometric identification, seen as a multi-class classification task, and secondarily biometric verification, considered as binary classification. Our research is based on saccadic eye movement measurements from 109 young subjects. In order to test the measured data, we use a biometric identification procedure according to the one-versus-one (subject) principle. In a development from our previous research, which also involved biometric verification based on saccadic eye movements, we now apply another eye movement tracker device with a higher sampling frequency of 250 Hz. The results obtained are good, with correct identification rates of 80-90% at their best.
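
    As a sketch of identification treated as multi-class classification on per-saccade features (in the spirit of the record above, not its actual feature set or data), the snippet below trains a support vector classifier with its native one-versus-one scheme on synthetic feature vectors.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(3)
    n_subjects, saccades_per_subject = 20, 50

    # Hypothetical per-saccade features: amplitude, peak velocity, duration, latency
    X, y = [], []
    for subj in range(n_subjects):
        centre = rng.normal(size=4)            # subject-specific "signature"
        X.append(centre + 0.3 * rng.normal(size=(saccades_per_subject, 4)))
        y += [subj] * saccades_per_subject
    X, y = np.vstack(X), np.array(y)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
    clf = SVC(kernel="rbf", decision_function_shape="ovo").fit(X_tr, y_tr)
    print(f"identification accuracy: {clf.score(X_te, y_te):.2f}")
    ```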

  12. Before your very eyes: the value and limitations of eye tracking in medical education.

    PubMed

    Kok, Ellen M; Jarodzka, Halszka

    2017-01-01

    Medicine is a highly visual discipline. Physicians from many specialties constantly use visual information in diagnosis and treatment. However, they are often unable to explain how they use this information. Consequently, it is unclear how to train medical students in this visual processing. Eye tracking is a research technique that may offer answers to these open questions, as it enables researchers to investigate such visual processes directly by measuring eye movements. This may help researchers understand the processes that support or hinder a particular learning outcome. In this article, we clarify the value and limitations of eye tracking for medical education researchers. For example, eye tracking can clarify how experience with medical images mediates diagnostic performance and how students engage with learning materials. Furthermore, eye tracking can also be used directly for training purposes by displaying eye movements of experts in medical images. Eye movements reflect cognitive processes, but cognitive processes cannot be directly inferred from eye-tracking data. In order to interpret eye-tracking data properly, theoretical models must always be the basis for designing experiments as well as for analysing and interpreting eye-tracking data. The interpretation of eye-tracking data is further supported by sound experimental design and methodological triangulation. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  13. Let's Use Cognitive Science to Create Collaborative Workstations.

    PubMed

    Reicher, Murray A; Wolfe, Jeremy M

    2016-05-01

    When informed by an understanding of cognitive science, radiologists' workstations could become collaborative to improve radiologists' performance and job satisfaction. The authors review relevant literature and present several promising areas of research, including image toggling, eye tracking, cognitive computing, intelligently restricted messaging, work habit tracking, and innovative input devices. The authors call for more research in "perceptual design," a promising field that can complement advances in computer-aided detection. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  14. Using an auditory sensory substitution device to augment vision: evidence from eye movements.

    PubMed

    Wright, Thomas D; Margolis, Aaron; Ward, Jamie

    2015-03-01

    Sensory substitution devices convert information normally associated with one sense into another sense (e.g. converting vision into sound). This is often done to compensate for an impaired sense. The present research uses a multimodal approach in which both natural vision and sound-from-vision ('soundscapes') are simultaneously presented. Although there is a systematic correspondence between what is seen and what is heard, we introduce a local discrepancy between the signals (the presence of a target object that is heard but not seen) that the participant is required to locate. In addition to behavioural responses, the participants' gaze is monitored with eye-tracking. Although the target object is only presented in the auditory channel, behavioural performance is enhanced when visual information relating to the non-target background is presented. In this instance, vision may be used to generate predictions about the soundscape that enhances the ability to detect the hidden auditory object. The eye-tracking data reveal that participants look for longer in the quadrant containing the auditory target even when they subsequently judge it to be located elsewhere. As such, eye movements generated by soundscapes reveal the knowledge of the target location that does not necessarily correspond to the actual judgment made. The results provide a proof of principle that multimodal sensory substitution may be of benefit to visually impaired people with some residual vision and, in normally sighted participants, for guiding search within complex scenes.

  15. New method for remote and repeatable monitoring of intraocular pressure variations.

    PubMed

    Margalit, Israel; Beiderman, Yevgeny; Skaat, Alon; Rosenfeld, Elkanah; Belkin, Michael; Tornow, Ralf-Peter; Mico, Vicente; Garcia, Javier; Zalevsky, Zeev

    2014-02-01

    We present initial steps toward a new measurement device enabling high-precision, noncontact, remote and repeatable monitoring of intraocular pressure (IOP), based on an innovative measurement principle. Using only a camera and a laser source, the device measures IOP by tracking the secondary speckle pattern trajectories produced by the reflection of an illuminating laser beam from the iris or the sclera. The device was tested on rabbit eyes using two different methods to modify IOP: via an infusion bag and via mechanical pressure. In both cases, the eyes were stimulated with increasing and decreasing ramps of IOP. As IOP variations changed the speckle distributions reflected back from the eye, data were recorded under various optical configurations to define and optimize the best experimental configuration for IOP extraction. The association between the data provided by our proposed device and the controlled modification of IOP was assessed, revealing high correlation (R² = 0.98) and sensitivity and providing high-precision measurement (5% estimated error) for the best experimental configuration. Future steps will be directed toward applying the proposed measurement principle in clinical trials for monitoring IOP in human subjects.
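
    The speckle trajectories mentioned above are obtained by estimating how the reflected speckle pattern shifts from frame to frame; as a generic, hedged sketch of one such step (integer-pixel FFT cross-correlation, not the authors' algorithm or calibration), the snippet below recovers the translation between two frames. Relating such trajectories to IOP would still require the calibration described in the paper.

    ```python
    import numpy as np

    def speckle_shift(frame_a, frame_b):
        """Estimate how far frame_b is translated relative to frame_a
        using FFT cross-correlation (integer-pixel precision)."""
        a = frame_a - frame_a.mean()
        b = frame_b - frame_b.mean()
        corr = np.fft.ifft2(np.fft.fft2(b) * np.conj(np.fft.fft2(a))).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        if dy > corr.shape[0] // 2:            # map wrap-around indices to signed shifts
            dy -= corr.shape[0]
        if dx > corr.shape[1] // 2:
            dx -= corr.shape[1]
        return dx, dy

    # Synthetic speckle frame and a copy shifted by (dx, dy) = (3, -2) pixels
    rng = np.random.default_rng(4)
    frame = rng.random((128, 128))
    shifted = np.roll(np.roll(frame, -2, axis=0), 3, axis=1)
    print(speckle_shift(frame, shifted))       # (3, -2)
    ```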

  16. Real time eye tracking using Kalman extended spatio-temporal context learning

    NASA Astrophysics Data System (ADS)

    Munir, Farzeen; Minhas, Fayyaz ul Amir Asfar; Jalil, Abdul; Jeon, Moongu

    2017-06-01

    Real-time eye tracking has numerous applications in human-computer interaction, such as mouse cursor control in a computer system, and is useful for persons with muscular or motion impairments. However, tracking the movement of the eye is complicated by occlusion due to blinking, head movement, screen glare, rapid eye movements, etc. In this work, we present the algorithmic and construction details of a real-time eye tracking system. Our proposed system is an extension of spatio-temporal context learning through Kalman filtering. Spatio-temporal context learning offers state-of-the-art accuracy in general object tracking, but its performance suffers under object occlusion. The addition of the Kalman filter allows the proposed method to model the dynamics of eye motion and provide robust eye tracking in cases of occlusion. We demonstrate the effectiveness of this tracking technique by controlling the computer cursor in real time with eye movements.
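
    As a sketch of the Kalman-filtering component described above (not the spatio-temporal context tracker itself), the snippet below runs a constant-velocity Kalman filter over noisy 2D eye-centre measurements, which keeps the estimate coherent across brief occlusions such as blinks. The noise parameters and frame rate are illustrative assumptions.

    ```python
    import numpy as np

    dt = 1 / 30                                    # camera frame interval (s)
    F = np.array([[1, 0, dt, 0],                   # constant-velocity state transition
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], float)
    H = np.array([[1, 0, 0, 0],                    # only position (x, y) is measured
                  [0, 1, 0, 0]], float)
    Q = 1e-2 * np.eye(4)                           # process noise (assumed)
    R = 4.0 * np.eye(2)                            # measurement noise (assumed, px^2)

    x = np.zeros(4)                                # state: [x, y, vx, vy]
    P = np.eye(4) * 100.0

    def kalman_step(z):
        """One predict/update cycle; pass z=None for occluded frames (e.g. blinks)."""
        global x, P
        x = F @ x                                  # predict
        P = F @ P @ F.T + Q
        if z is not None:                          # update only when a detection exists
            y = np.asarray(z, float) - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ y
            P = (np.eye(4) - K @ H) @ P
        return x[:2]

    # Noisy detections of an eye centre drifting right, with a "blink" at frame 5
    for i, z in enumerate([(10, 50), (12, 50), (14, 51), (16, 50), (18, 51), None, (22, 50)]):
        print(i, np.round(kalman_step(z), 1))
    ```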

  17. The Role of Executive Control of Attention and Selective Encoding for Preschoolers' Learning

    ERIC Educational Resources Information Center

    Roderer, Thomas; Krebs, Saskia; Schmid, Corinne; Roebers, Claudia M.

    2012-01-01

    Selectivity in encoding, aspects of attentional control and their contribution to learning performance were explored in a sample of preschoolers. While the children were performing a learning task, their encoding of relevant information and their attention towards irrelevant information were recorded with an eye-tracking device. Recognition of target items was…

  18. Eye-tracking and EMG supported 3D Virtual Reality - an integrated tool for perceptual and motor development of children with severe physical disabilities: a research concept.

    PubMed

    Pulay, Márk Ágoston

    2015-01-01

    Letting children with severe physical disabilities (such as tetraparesis spastica) gain relevant movement experiences of appropriate quality and quantity is now the greatest challenge for us in the field of neurorehabilitation. These movement experiences may underpin many cognitive processes, and their absence may also cause additional secondary cognitive dysfunctions such as disorders in body image, figure invariance, visual perception, auditory differentiation, concentration, analytic and synthetic ways of thinking, visual memory etc. Virtual Reality is a technology that provides a sense of presence in a computer-generated environment with the help of 3D pictures and animations, and enables the person to interact with the objects in that environment. One of our biggest challenges is to find a well-suited input device (hardware) to let children with severe physical disabilities interact with the computer. Based on our own experiences and a thorough literature review, we have come to the conclusion that an effective combination of eye-tracking and EMG devices should work well.

  19. Comparison of Predictable Smooth Ocular and Combined Eye-Head Tracking Behaviour in Patients with Lesions Affecting the Brainstem and Cerebellum

    NASA Technical Reports Server (NTRS)

    Grant, Michael P.; Leigh, R. John; Seidman, Scott H.; Riley, David E.; Hanna, Joseph P.

    1992-01-01

    We compared the ability of eight normal subjects and 15 patients with brainstem or cerebellar disease to follow a moving visual stimulus smoothly with either the eyes alone or with combined eye-head tracking. The visual stimulus was either a laser spot (horizontal and vertical planes) or a large rotating disc (torsional plane), which moved at one sinusoidal frequency for each subject. The visually enhanced vestibulo-ocular reflex (VOR) was also measured in each plane. In the horizontal and vertical planes, we found that if the tracking gain (gaze velocity/target velocity) for smooth pursuit was close to 1, the gain of combined eye-head tracking was similar. If the tracking gain during smooth pursuit was less than about 0.7, combined eye-head tracking was usually superior. Most patients, irrespective of diagnosis, showed combined eye-head tracking that was superior to smooth pursuit; only two patients showed the converse. In the torsional plane, in which optokinetic responses were weak, combined eye-head tracking was much superior, and this was the case in both subjects and patients. We found that a linear model, in which an internal ocular tracking signal cancelled the VOR, could account for our findings in most normal subjects in the horizontal and vertical planes, but not in the torsional plane. The model failed to account for tracking behaviour in most patients in any plane, and suggested that the brain may use additional mechanisms to reduce the internal gain of the VOR during combined eye-head tracking. Our results confirm that certain patients who show impairment of smooth-pursuit eye movements preserve their ability to smoothly track a moving target with combined eye-head tracking.
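
    As a sketch of the gain measure quoted above (gaze velocity divided by target velocity at a single sinusoidal frequency): fit sine and cosine components at the known stimulus frequency to both velocity traces by linear least squares and take the ratio of the fitted amplitudes. The sampling rate, stimulus frequency and data below are illustrative, not the study's recordings.

    ```python
    import numpy as np

    def sinusoid_amplitude(signal, t, freq):
        """Least-squares amplitude of `signal` at frequency `freq` (Hz)."""
        A = np.column_stack([np.sin(2 * np.pi * freq * t),
                             np.cos(2 * np.pi * freq * t),
                             np.ones_like(t)])
        coef, *_ = np.linalg.lstsq(A, signal, rcond=None)
        return np.hypot(coef[0], coef[1])

    fs, freq = 250.0, 0.4                        # sampling rate (Hz), stimulus frequency (Hz)
    t = np.arange(0, 20, 1 / fs)
    target_vel = 30 * np.cos(2 * np.pi * freq * t)                      # deg/s
    gaze_vel = 0.85 * target_vel + 2.0 * np.random.default_rng(5).normal(size=t.size)

    gain = sinusoid_amplitude(gaze_vel, t, freq) / sinusoid_amplitude(target_vel, t, freq)
    print(f"tracking gain ~ {gain:.2f}")         # close to the simulated 0.85
    ```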

  20. Spatial orientation perception and reflexive eye movements--a perspective, an overview, and some clinical implications

    NASA Technical Reports Server (NTRS)

    Guedry, F. E.; Paloski, W. F. (Principal Investigator)

    1996-01-01

    When head motion includes a linear velocity component, the eye velocity required to track an earth-fixed target depends upon: a) angular and linear head velocity, b) target distance, and c) direction of gaze relative to the motion trajectory. Recent research indicates that eye movements (LVOR), presumably otolith-mediated, partially compensate for linear velocity in small head excursions on small devices. Canal-mediated eye velocity (AVOR), otolith-mediated eye velocity (LVOR), and Ocular Torsion (OT) can be measured, one by one, on small devices. However, response dynamics that depend upon the ratio of linear to angular velocity in the motion trajectory and on subject orientation relative to the trajectory are present in a centrifuge paradigm. With this paradigm, two 3-min runs yield measures of: LVOR differentially modulated by different subject orientations in the two runs; OT dynamics in four conditions; two directions of "steady-state" OT; and two directions of AVOR. Efficient assessment of the dynamics (and of the underlying central integrative processes) may require a centrifuge radius of 1.0 meter or more. Clinical assessment of the spatial orientation system should include evaluation of the central integrative processes that determine the dynamics of these responses.

  1. Detection of third and sixth cranial nerve palsies with a novel method for eye tracking while watching a short film clip

    PubMed Central

    Samadani, Uzma; Farooq, Sameer; Ritlop, Robert; Warren, Floyd; Reyes, Marleen; Lamm, Elizabeth; Alex, Anastasia; Nehrbass, Elena; Kolecki, Radek; Jureller, Michael; Schneider, Julia; Chen, Agnes; Shi, Chen; Mendhiratta, Neil; Huang, Jason H.; Qian, Meng; Kwak, Roy; Mikheev, Artem; Rusinek, Henry; George, Ajax; Fergus, Robert; Kondziolka, Douglas; Huang, Paul P.; Smith, R. Theodore

    2015-01-01

    OBJECT Automated eye movement tracking may provide clues to nervous system function at many levels. Spatial calibration of the eye tracking device requires the subject to have relatively intact ocular motility that implies function of cranial nerves (CNs) III (oculomotor), IV (trochlear), and VI (abducent) and their associated nuclei, along with the multiple regions of the brain imparting cognition and volition. The authors have developed a technique for eye tracking that uses temporal rather than spatial calibration, enabling detection of impaired ability to move the pupil relative to normal (neurologically healthy) control volunteers. This work was performed to demonstrate that this technique may detect CN palsies related to brain compression and to provide insight into how the technique may be of value for evaluating neuropathological conditions associated with CN palsy, such as hydrocephalus or acute mass effect. METHODS The authors recorded subjects’ eye movements by using an Eyelink 1000 eye tracker sampling at 500 Hz over 200 seconds while the subject viewed a music video playing inside an aperture on a computer monitor. The aperture moved in a rectangular pattern over a fixed time period. This technique was used to assess ocular motility in 157 neurologically healthy control subjects and 12 patients with either clinical CN III or VI palsy confirmed by neuro-ophthalmological examination, or surgically treatable pathological conditions potentially impacting these nerves. The authors compared the ratio of vertical to horizontal eye movement (height/width defined as aspect ratio) in normal and test subjects. RESULTS In 157 normal controls, the aspect ratio (height/width) for the left eye had a mean value ± SD of 1.0117 ± 0.0706. For the right eye, the aspect ratio had a mean of 1.0077 ± 0.0679 in these 157 subjects. There was no difference between sexes or ages. A patient with known CN VI palsy had a significantly increased aspect ratio (1.39), whereas 2 patients with known CN III palsy had significantly decreased ratios of 0.19 and 0.06, respectively. Three patients with surgically treatable pathological conditions impacting CN VI, such as infratentorial mass effect or hydrocephalus, had significantly increased ratios (1.84, 1.44, and 1.34, respectively) relative to normal controls, and 6 patients with supratentorial mass effect had significantly decreased ratios (0.27, 0.53, 0.62, 0.45, 0.49, and 0.41, respectively). These alterations in eye tracking all reverted to normal ranges after surgical treatment of underlying pathological conditions in these 9 neurosurgical cases. CONCLUSIONS This proof of concept series of cases suggests that the use of eye tracking to detect CN palsy while the patient watches television or its equivalent represents a new capacity for this technology. It may provide a new tool for the assessment of multiple CNS functions that can potentially be useful in the assessment of awake patients with elevated intracranial pressure from hydrocephalus or trauma. PMID:25495739

  2. Detection of third and sixth cranial nerve palsies with a novel method for eye tracking while watching a short film clip.

    PubMed

    Samadani, Uzma; Farooq, Sameer; Ritlop, Robert; Warren, Floyd; Reyes, Marleen; Lamm, Elizabeth; Alex, Anastasia; Nehrbass, Elena; Kolecki, Radek; Jureller, Michael; Schneider, Julia; Chen, Agnes; Shi, Chen; Mendhiratta, Neil; Huang, Jason H; Qian, Meng; Kwak, Roy; Mikheev, Artem; Rusinek, Henry; George, Ajax; Fergus, Robert; Kondziolka, Douglas; Huang, Paul P; Smith, R Theodore

    2015-03-01

    Automated eye movement tracking may provide clues to nervous system function at many levels. Spatial calibration of the eye tracking device requires the subject to have relatively intact ocular motility that implies function of cranial nerves (CNs) III (oculomotor), IV (trochlear), and VI (abducent) and their associated nuclei, along with the multiple regions of the brain imparting cognition and volition. The authors have developed a technique for eye tracking that uses temporal rather than spatial calibration, enabling detection of impaired ability to move the pupil relative to normal (neurologically healthy) control volunteers. This work was performed to demonstrate that this technique may detect CN palsies related to brain compression and to provide insight into how the technique may be of value for evaluating neuropathological conditions associated with CN palsy, such as hydrocephalus or acute mass effect. The authors recorded subjects' eye movements by using an Eyelink 1000 eye tracker sampling at 500 Hz over 200 seconds while the subject viewed a music video playing inside an aperture on a computer monitor. The aperture moved in a rectangular pattern over a fixed time period. This technique was used to assess ocular motility in 157 neurologically healthy control subjects and 12 patients with either clinical CN III or VI palsy confirmed by neuro-ophthalmological examination, or surgically treatable pathological conditions potentially impacting these nerves. The authors compared the ratio of vertical to horizontal eye movement (height/width defined as aspect ratio) in normal and test subjects. In 157 normal controls, the aspect ratio (height/width) for the left eye had a mean value ± SD of 1.0117 ± 0.0706. For the right eye, the aspect ratio had a mean of 1.0077 ± 0.0679 in these 157 subjects. There was no difference between sexes or ages. A patient with known CN VI palsy had a significantly increased aspect ratio (1.39), whereas 2 patients with known CN III palsy had significantly decreased ratios of 0.19 and 0.06, respectively. Three patients with surgically treatable pathological conditions impacting CN VI, such as infratentorial mass effect or hydrocephalus, had significantly increased ratios (1.84, 1.44, and 1.34, respectively) relative to normal controls, and 6 patients with supratentorial mass effect had significantly decreased ratios (0.27, 0.53, 0.62, 0.45, 0.49, and 0.41, respectively). These alterations in eye tracking all reverted to normal ranges after surgical treatment of underlying pathological conditions in these 9 neurosurgical cases. This proof of concept series of cases suggests that the use of eye tracking to detect CN palsy while the patient watches television or its equivalent represents a new capacity for this technology. It may provide a new tool for the assessment of multiple CNS functions that can potentially be useful in the assessment of awake patients with elevated intracranial pressure from hydrocephalus or trauma.
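    The aspect-ratio metric reported in the two records above can be reproduced from raw gaze samples in a few lines. The sketch below is a minimal illustration, not the authors' implementation; the percentile-based excursion range and the 3-SD screening rule are assumptions layered on the reported normal mean of roughly 1.01 (SD about 0.07).

```python
import numpy as np

def eye_aspect_ratio(x, y, lo=5, hi=95):
    """Ratio of vertical to horizontal pupil excursion (height / width).

    x, y : arrays of horizontal and vertical pupil positions recorded while
    the subject follows an aperture moving around a rectangular path.
    Percentile-based ranges are used instead of min/max to soften the
    influence of blinks and tracker noise (an assumption, not the
    published method).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    width = np.percentile(x, hi) - np.percentile(x, lo)
    height = np.percentile(y, hi) - np.percentile(y, lo)
    return height / width

def flag_possible_palsy(aspect, mean=1.01, sd=0.07, k=3.0):
    """Hypothetical screening rule: flag ratios far from the normal mean."""
    return abs(aspect - mean) > k * sd
```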

  3. Advances in Eye Tracking in Infancy Research

    ERIC Educational Resources Information Center

    Oakes, Lisa M.

    2012-01-01

    In 2004, McMurray and Aslin edited for "Infancy" a special section on eye tracking. The articles in that special issue revealed the enormous promise of automatic eye tracking with young infants and demonstrated that eye-tracking procedures can provide significant insight into the emergence of cognitive, social, and emotional processing in infancy.…

  4. Testing of visual field with virtual reality goggles in manual and visual grasp modes.

    PubMed

    Wroblewski, Dariusz; Francis, Brian A; Sadun, Alfredo; Vakili, Ghazal; Chopra, Vikas

    2014-01-01

    Automated perimetry is used for the assessment of visual function in a variety of ophthalmic and neurologic diseases. We report the development and clinical testing of a compact, head-mounted, eye-tracking perimeter (VirtualEye) that provides a more comfortable test environment than the standard instrumentation. VirtualEye performs the equivalent of a full threshold 24-2 visual field in two modes: (1) manual, with patient response registered with a mouse click, and (2) visual grasp, where the eye tracker senses a change in gaze direction as evidence of target acquisition. In total, 59 patients successfully completed the test in manual mode and 40 in visual grasp mode, with 59 undergoing the standard Humphrey field analyzer (HFA) testing. Large visual field defects were reliably detected by VirtualEye. Point-by-point comparison between the results obtained with the different modalities indicates: (1) minimal systematic differences between measurements taken in visual grasp and manual modes, (2) an average standard deviation of the difference distributions of about 5 dB, and (3) a systematic shift (of 4-6 dB) to lower sensitivities for the VirtualEye device, observed mostly in the high-dB range. The usability survey suggested patients' acceptance of the head-mounted device. The study appears to validate the concepts of a head-mounted perimeter and the visual grasp mode.
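    In visual grasp mode the eye tracker itself registers the response by treating a sustained gaze shift onto the stimulus as target acquisition. A minimal sketch of such a detector follows; the sampling rate, tolerance radius, and dwell-time criterion are illustrative assumptions, not the VirtualEye parameters.

```python
import numpy as np

def visual_grasp_detected(gaze_xy, target_xy, fs=60.0,
                          radius_deg=2.0, dwell_s=0.15):
    """Return True if gaze dwells within `radius_deg` of the target
    for at least `dwell_s` seconds (interpreted as target acquisition).

    gaze_xy   : (N, 2) array of gaze positions in degrees of visual angle
    target_xy : (2,) stimulus location in the same coordinates
    """
    gaze_xy = np.asarray(gaze_xy, float)
    dist = np.linalg.norm(gaze_xy - np.asarray(target_xy, float), axis=1)
    inside = dist < radius_deg
    need = int(round(dwell_s * fs))  # consecutive samples required
    run = 0
    for hit in inside:
        run = run + 1 if hit else 0
        if run >= need:
            return True
    return False
```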

  5. Pseudo-cat's eye for improved tilt-immune interferometry.

    PubMed

    Speake, Clive C; Bradshaw, Miranda J

    2015-08-20

    We present a new simple optical design for a cat's eye retroreflector. We describe the design of the new optical configuration and its use in tilt-immune interferometry where it enables the tracking of the displacement of a plane target mirror with minimum sensitivity to its tilt about axes orthogonal to the interferometer's optical axis. In this application the new cat's eye does not behave as a perfect retroreflector and we refer to it as a "pseudo"-cat's eye (PCE). The device allows, for the first time, tilt-immune interferometric displacement measurements in cases where the nominal distance to the target mirror is significantly larger than the length of the cat's eye. We describe the general optical characteristics of the PCE and compare its performance in our application with that of a conventional cat's eye optical configuration using ABCD matrices and Zemax analyses. We further suggest a simple modification to the design that would enable the PCE to behave as a perfect cat's eye, and this design may provide an advantageous solution for other applications.

  6. The effect of concurrent hand movement on estimated time to contact in a prediction motion task.

    PubMed

    Zheng, Ran; Maraj, Brian K V

    2018-04-27

    In many activities, we need to predict the arrival of an occluded object. This action is called prediction motion or motion extrapolation. Previous researchers have found that both eye tracking and the internal clocking model are involved in the prediction motion task. Additionally, it has been reported that concurrent hand movement facilitates the eye tracking of an externally generated target in a tracking task, even if the target is occluded. The present study examined the effect of concurrent hand movement on the estimated time to contact (TTC) in a prediction motion task. We found that different (accurate/inaccurate) concurrent hand movements had opposite effects on eye tracking accuracy and estimated TTC in the prediction motion task. That is, accurate concurrent hand tracking enhanced eye tracking accuracy and tended to increase the precision of the estimated TTC, whereas inaccurate concurrent hand tracking decreased eye tracking accuracy and disrupted the estimated TTC. However, eye tracking accuracy did not determine the precision of the estimated TTC.

  7. Eye Tracking Outcomes in Tobacco Control Regulation and Communication: A Systematic Review.

    PubMed

    Meernik, Clare; Jarman, Kristen; Wright, Sarah Towner; Klein, Elizabeth G; Goldstein, Adam O; Ranney, Leah

    2016-10-01

    In this paper, we synthesize the evidence from eye tracking research in tobacco control to inform tobacco regulatory strategies and tobacco communication campaigns. We systematically searched 11 databases for studies that reported eye tracking outcomes with regard to tobacco regulation and communication. Two coders independently reviewed studies for inclusion and abstracted study characteristics and findings. Eighteen studies met the full criteria for inclusion. Eye tracking studies on health warnings consistently showed that these warnings were often ignored, though eye tracking demonstrated that novel warnings, graphic warnings, and plain packaging can increase attention toward warnings. Eye tracking also revealed that greater visual attention to warnings on advertisements and packages was consistently associated with cognitive processing as measured by warning recall. Eye tracking is a valid indicator of attention, cognitive processing, and memory. The use of this technology in tobacco control research complements existing methods in tobacco regulatory and communication science; it can also be used to examine the effects of health warnings and other tobacco product communications on consumer behavior in experimental settings prior to the implementation of novel health communication policies. However, the utility of eye tracking will be enhanced by the standardization of methodology and reporting metrics.

  8. Eye Tracking Outcomes in Tobacco Control Regulation and Communication: A Systematic Review

    PubMed Central

    Meernik, Clare; Jarman, Kristen; Wright, Sarah Towner; Klein, Elizabeth G.; Goldstein, Adam O.; Ranney, Leah

    2016-01-01

    Objective In this paper, we synthesize the evidence from eye tracking research in tobacco control to inform tobacco regulatory strategies and tobacco communication campaigns. Methods We systematically searched 11 databases for studies that reported eye tracking outcomes with regard to tobacco regulation and communication. Two coders independently reviewed studies for inclusion and abstracted study characteristics and findings. Results Eighteen studies met the full criteria for inclusion. Eye tracking studies on health warnings consistently showed that these warnings were often ignored, though eye tracking demonstrated that novel warnings, graphic warnings, and plain packaging can increase attention toward warnings. Eye tracking also revealed that greater visual attention to warnings on advertisements and packages was consistently associated with cognitive processing as measured by warning recall. Conclusions Eye tracking is a valid indicator of attention, cognitive processing, and memory. The use of this technology in tobacco control research complements existing methods in tobacco regulatory and communication science; it can also be used to examine the effects of health warnings and other tobacco product communications on consumer behavior in experimental settings prior to the implementation of novel health communication policies. However, the utility of eye tracking will be enhanced by the standardization of methodology and reporting metrics. PMID:27668270

  9. Emerging applications of eye-tracking technology in dermatology.

    PubMed

    John, Kevin K; Jensen, Jakob D; King, Andy J; Pokharel, Manusheela; Grossman, Douglas

    2018-04-06

    Eye-tracking technology has been used within a multitude of disciplines to provide data linking eye movements to visual processing of various stimuli (e.g., x-rays, situational positioning, printed information, and warnings). Despite the benefits provided by eye-tracking in allowing for the identification and quantification of visual attention, the discipline of dermatology has yet to see broad application of the technology. Notwithstanding dermatologists' heavy reliance upon visual patterns and cues to discriminate between benign and atypical nevi, literature that applies eye-tracking to the study of dermatology is sparse, and literature specific to patient-initiated behaviors, such as skin self-examination (SSE), is largely non-existent. The current article provides a review of eye-tracking research in various medical fields, culminating in a discussion of current applications and advantages of eye-tracking for dermatology research. Copyright © 2018 Japanese Society for Investigative Dermatology. Published by Elsevier B.V. All rights reserved.

  10. Toward Optimization of Gaze-Controlled Human-Computer Interaction: Application to Hindi Virtual Keyboard for Stroke Patients.

    PubMed

    Meena, Yogesh Kumar; Cecotti, Hubert; Wong-Lin, Kongfatt; Dutta, Ashish; Prasad, Girijesh

    2018-04-01

    Virtual keyboard applications and alternative communication devices provide new means of communication to assist disabled people. To date, virtual keyboard optimization schemes based on script-specific information, along with multimodal input access facilities, are limited. In this paper, we propose a novel method for optimizing the position of the displayed items in gaze-controlled, tree-based menu selection systems by considering a combination of letter frequency and command selection time. The optimized graphical user interface layout has been designed for a Hindi-language virtual keyboard based on a menu in which 10 commands provide access for typing 88 different characters, along with additional text editing commands. The system can be controlled in two different modes: eye-tracking alone and eye-tracking with an access soft-switch. Five different keyboard layouts have been presented and evaluated with ten healthy participants. Furthermore, the two best-performing keyboard layouts have been evaluated with eye-tracking alone in ten stroke patients. The overall performance analysis demonstrated significantly superior typing performance, high usability (a System Usability Scale score of 87%), and low workload (a NASA-TLX score of 17) for the letter-frequency- and time-based organization with a script-specific arrangement. This paper presents the first optimized gaze-controlled Hindi virtual keyboard, which can be extended to other languages.
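    The core idea of the layout optimization, combining letter frequency with command selection time, can be expressed as an expected-cost calculation over candidate layouts. The sketch below is a simplified illustration under assumed data structures (a character-frequency table and a per-slot selection time); it is not the authors' algorithm.

```python
# Minimal sketch: score gaze-controlled menu layouts by expected selection
# time, i.e. the sum over characters of frequency * time needed to reach the
# slot that character is assigned to. All values here are assumptions.

def expected_selection_time(layout, freq, slot_time):
    """layout    : dict char -> slot index
       freq      : dict char -> relative frequency (sums to ~1)
       slot_time : dict slot index -> mean selection time in seconds"""
    return sum(freq[c] * slot_time[layout[c]] for c in layout)

def best_layout(candidates, freq, slot_time):
    """Pick the candidate layout with the lowest expected selection time."""
    return min(candidates,
               key=lambda lay: expected_selection_time(lay, freq, slot_time))

# Example with hypothetical numbers: frequent characters should land in
# faster-to-reach slots.
freq = {"a": 0.6, "b": 0.3, "c": 0.1}
slot_time = {0: 1.0, 1: 1.5, 2: 2.5}
layouts = [{"a": 0, "b": 1, "c": 2}, {"a": 2, "b": 1, "c": 0}]
print(best_layout(layouts, freq, slot_time))  # -> {'a': 0, 'b': 1, 'c': 2}
```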

  11. Quantifying gaze and mouse interactions on spatial visual interfaces with a new movement analytics methodology

    PubMed Central

    2017-01-01

    Eye movements provide insights into what people pay attention to, and therefore are commonly included in a variety of human-computer interaction studies. Eye movement recording devices (eye trackers) produce gaze trajectories, that is, sequences of gaze location on the screen. Despite recent technological developments that enabled more affordable hardware, gaze data are still costly and time-consuming to collect; therefore, some propose using mouse movements instead. These are easy to collect automatically and on a large scale. If and how these two movement types are linked, however, is less clear and highly debated. We address this problem in two ways. First, we introduce a new movement analytics methodology to quantify the level of dynamic interaction between the gaze and the mouse pointer on the screen. Our method uses a volumetric representation of movement, the space-time densities, which allows us to calculate interaction levels between two physically different types of movement. We describe the method and compare the results with existing dynamic interaction methods from movement ecology. The sensitivity to method parameters is evaluated on simulated trajectories where we can control interaction levels. Second, we perform an experiment with eye and mouse tracking to generate real data with real levels of interaction, to apply and test our new methodology on a real case. Further, as our experimental task mimics route-tracing on a map, the experiment is more than a data collection exercise: it simultaneously allows us to investigate the actual connection between the eye and the mouse. We find that there seems to be a natural coupling when the eyes are not under conscious control, but that this coupling breaks down when participants are instructed to move their eyes intentionally. Based on these observations, we tentatively suggest that for natural tracing tasks, mouse tracking could potentially provide similar information to eye-tracking and therefore be used as a proxy for attention. However, more research is needed to confirm this. PMID:28777822

  12. Quantifying gaze and mouse interactions on spatial visual interfaces with a new movement analytics methodology.

    PubMed

    Demšar, Urška; Çöltekin, Arzu

    2017-01-01

    Eye movements provide insights into what people pay attention to, and therefore are commonly included in a variety of human-computer interaction studies. Eye movement recording devices (eye trackers) produce gaze trajectories, that is, sequences of gaze location on the screen. Despite recent technological developments that enabled more affordable hardware, gaze data are still costly and time-consuming to collect; therefore, some propose using mouse movements instead. These are easy to collect automatically and on a large scale. If and how these two movement types are linked, however, is less clear and highly debated. We address this problem in two ways. First, we introduce a new movement analytics methodology to quantify the level of dynamic interaction between the gaze and the mouse pointer on the screen. Our method uses a volumetric representation of movement, the space-time densities, which allows us to calculate interaction levels between two physically different types of movement. We describe the method and compare the results with existing dynamic interaction methods from movement ecology. The sensitivity to method parameters is evaluated on simulated trajectories where we can control interaction levels. Second, we perform an experiment with eye and mouse tracking to generate real data with real levels of interaction, to apply and test our new methodology on a real case. Further, as our experimental task mimics route-tracing on a map, the experiment is more than a data collection exercise: it simultaneously allows us to investigate the actual connection between the eye and the mouse. We find that there seems to be a natural coupling when the eyes are not under conscious control, but that this coupling breaks down when participants are instructed to move their eyes intentionally. Based on these observations, we tentatively suggest that for natural tracing tasks, mouse tracking could potentially provide similar information to eye-tracking and therefore be used as a proxy for attention. However, more research is needed to confirm this.
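    The space-time-density method described in these two records is more involved, but the basic notion of dynamic interaction between two simultaneously recorded trajectories can be illustrated with a simple proximity-based index: the fraction of time-aligned samples in which gaze and cursor lie within a chosen distance of each other. The threshold and input layout below are assumptions, not the published method.

```python
import numpy as np

def proximity_interaction(gaze_xy, mouse_xy, threshold_px=100.0):
    """Crude dynamic-interaction index between gaze and mouse trajectories.

    gaze_xy, mouse_xy : (N, 2) arrays of screen positions sampled at the
    same time stamps (resample beforehand if the devices differ in rate).
    Returns the fraction of samples in which the two pointers are closer
    than `threshold_px` pixels: 1.0 means constant coupling, 0.0 none.
    """
    gaze_xy = np.asarray(gaze_xy, float)
    mouse_xy = np.asarray(mouse_xy, float)
    d = np.linalg.norm(gaze_xy - mouse_xy, axis=1)
    return float(np.mean(d < threshold_px))
```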

  13. 2012 Year-End Report on Neurotechnologies for In-Vehicle Applications

    DTIC Science & Technology

    2013-06-01

    signals. • Alternative feature extraction methods have been proposed based on matching pursuit and wavelet analysis. Examining specific features of... locally networked PCs. 4.3 Arduino-Based Simulation Synchronization: Time synchronization across measurement devices in neuroscience experiments is... steering behavior; the Optalert (Optalert, Melbourne, Australia) system, which predicts fatigue based on eye-tracking measures; or the SafeTraK (Takata...

  14. Looking at Images with Human Figures: Comparison between Autistic and Normal Children.

    ERIC Educational Resources Information Center

    van der Geest, J. N.; Kemner, C.; Camfferman, G.; Verbaten, M. N.; van Engeland, H.

    2002-01-01

    In this study, the looking behavior of 16 autistic and 14 non-autistic children toward cartoon-like scenes that included a human figure was measured quantitatively using an infrared eye-tracking device. Fixation behavior of autistic children was similar to that of their age- and IQ-matched normal peers. Results do not support the idea that autistic…

  15. A Bayesian computational model for online character recognition and disability assessment during cursive eye writing.

    PubMed

    Diard, Julien; Rynik, Vincent; Lorenceau, Jean

    2013-01-01

    This research involves a novel apparatus, in which the user is presented with an illusion-inducing visual stimulus. The user perceives illusory movement that can be followed by the eye, so that smooth pursuit eye movements can be sustained in arbitrary directions. Thus, free-flow trajectories of any shape can be traced. In other words, coupled with an eye-tracking device, this apparatus enables "eye writing," which appears to be an original object of study. We adapt a previous model of reading and writing to this context. We describe a probabilistic model called the Bayesian Action-Perception for Eye On-Line model (BAP-EOL). It encodes probabilistic knowledge about isolated letter trajectories, their size, high-frequency components of the produced trajectory, and pupil diameter. We show how Bayesian inference, in this single model, can be used to solve several tasks, like letter recognition and novelty detection (i.e., recognizing when a presented character is not part of the learned database). We are interested in the potential use of the eye writing apparatus by motor impaired patients: the final task we solve by Bayesian inference is disability assessment (i.e., measuring and tracking the evolution of motor characteristics of produced trajectories). Preliminary experimental results are presented, which illustrate the method, showing the feasibility of character recognition in the context of eye writing. We then show experimentally how a model of the unknown character can be used to detect trajectories that are likely to be new symbols, and how disability assessment can be performed by opportunistically observing characteristics of fine motor control, as letters are being traced. Experimental analyses also help identify specificities of eye writing, as compared to handwriting, and the resulting technical challenges.

  16. A Bayesian computational model for online character recognition and disability assessment during cursive eye writing

    PubMed Central

    Diard, Julien; Rynik, Vincent; Lorenceau, Jean

    2013-01-01

    This research involves a novel apparatus, in which the user is presented with an illusion-inducing visual stimulus. The user perceives illusory movement that can be followed by the eye, so that smooth pursuit eye movements can be sustained in arbitrary directions. Thus, free-flow trajectories of any shape can be traced. In other words, coupled with an eye-tracking device, this apparatus enables “eye writing,” which appears to be an original object of study. We adapt a previous model of reading and writing to this context. We describe a probabilistic model called the Bayesian Action-Perception for Eye On-Line model (BAP-EOL). It encodes probabilistic knowledge about isolated letter trajectories, their size, high-frequency components of the produced trajectory, and pupil diameter. We show how Bayesian inference, in this single model, can be used to solve several tasks, like letter recognition and novelty detection (i.e., recognizing when a presented character is not part of the learned database). We are interested in the potential use of the eye writing apparatus by motor impaired patients: the final task we solve by Bayesian inference is disability assessment (i.e., measuring and tracking the evolution of motor characteristics of produced trajectories). Preliminary experimental results are presented, which illustrate the method, showing the feasibility of character recognition in the context of eye writing. We then show experimentally how a model of the unknown character can be used to detect trajectories that are likely to be new symbols, and how disability assessment can be performed by opportunistically observing characteristics of fine motor control, as letters are being traced. Experimental analyses also help identify specificities of eye writing, as compared to handwriting, and the resulting technical challenges. PMID:24273525
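    A drastically simplified version of the recognition step described in these two records can be written as Bayesian classification over per-letter feature models: the posterior over letters is proportional to the prior times the likelihood of the observed trajectory features, and a very low maximum likelihood flags a possible novel symbol. The independent-Gaussian feature models and the novelty threshold below are assumptions; the BAP-EOL model itself is considerably richer.

```python
import numpy as np
from scipy.stats import norm

def letter_posterior(features, models, prior=None):
    """features : 1-D array of trajectory descriptors (e.g. size, curvature)
       models   : dict letter -> (mean vector, std vector) for those features
       Returns (posterior dict, max likelihood) for recognition and novelty
       detection."""
    letters = list(models)
    if prior is None:
        prior = {c: 1.0 / len(letters) for c in letters}
    like = {}
    for c in letters:
        mu, sd = models[c]
        # independent Gaussian feature model (naive Bayes assumption)
        like[c] = float(np.prod(norm.pdf(features, loc=mu, scale=sd)))
    unnorm = {c: prior[c] * like[c] for c in letters}
    z = sum(unnorm.values()) or 1.0
    post = {c: v / z for c, v in unnorm.items()}
    return post, max(like.values())

def is_novel_symbol(max_likelihood, threshold=1e-6):
    """Flag trajectories whose best per-letter likelihood is very low."""
    return max_likelihood < threshold
```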

  17. MR-Compatible Integrated Eye Tracking System

    DTIC Science & Technology

    2016-03-10

    This instrumentation grant was used to purchase a state-of-the-art, high-resolution video eye tracker. Keywords: video eye tracking, eye movements, visual search, camouflage-breaking.

  18. Eye Tracking: A Brief Guide for Developmental Researchers

    ERIC Educational Resources Information Center

    Feng, Gary

    2011-01-01

    Eye tracking offers a powerful research tool for developmental scientists. In this brief article, the author introduces the methodology and issues associated with its applications in developmental research, beginning with an overview of eye movements and eye-tracking technologies, followed by examples of how it is used to study the developing mind…

  19. Eye-Tracking Study of Complexity in Gas Law Problems

    ERIC Educational Resources Information Center

    Tang, Hui; Pienta, Norbert

    2012-01-01

    This study, part of a series investigating students' use of online tools to assess problem solving, uses eye-tracking hardware and software to explore the effect of problem difficulty and cognitive processes when students solve gas law word problems. Eye movements are indices of cognition; eye-tracking data typically include the location,…

  20. Physiologically Modulating Videogames or Simulations which use Motion-Sensing Input Devices

    NASA Technical Reports Server (NTRS)

    Pope, Alan T. (Inventor); Stephens, Chad L. (Inventor); Blanson, Nina Marie (Inventor)

    2014-01-01

    New types of controllers allow players to make inputs to a video game or simulation by moving the entire controller itself. This capability is typically accomplished using a wireless input device having accelerometers, gyroscopes, and an infrared LED tracking camera. The present invention exploits these wireless motion-sensing technologies to modulate the player's movement inputs to the videogame based upon physiological signals. Such biofeedback-modulated video games train valuable mental skills beyond eye-hand coordination. These psychophysiological training technologies support the personal improvement, not just the diversion, of the user.

  1. Testing of Visual Field with Virtual Reality Goggles in Manual and Visual Grasp Modes

    PubMed Central

    Wroblewski, Dariusz; Francis, Brian A.; Sadun, Alfredo; Vakili, Ghazal; Chopra, Vikas

    2014-01-01

    Automated perimetry is used for the assessment of visual function in a variety of ophthalmic and neurologic diseases. We report the development and clinical testing of a compact, head-mounted, eye-tracking perimeter (VirtualEye) that provides a more comfortable test environment than the standard instrumentation. VirtualEye performs the equivalent of a full threshold 24-2 visual field in two modes: (1) manual, with patient response registered with a mouse click, and (2) visual grasp, where the eye tracker senses a change in gaze direction as evidence of target acquisition. In total, 59 patients successfully completed the test in manual mode and 40 in visual grasp mode, with 59 undergoing the standard Humphrey field analyzer (HFA) testing. Large visual field defects were reliably detected by VirtualEye. Point-by-point comparison between the results obtained with the different modalities indicates: (1) minimal systematic differences between measurements taken in visual grasp and manual modes, (2) an average standard deviation of the difference distributions of about 5 dB, and (3) a systematic shift (of 4–6 dB) to lower sensitivities for the VirtualEye device, observed mostly in the high-dB range. The usability survey suggested patients' acceptance of the head-mounted device. The study appears to validate the concepts of a head-mounted perimeter and the visual grasp mode. PMID:25050326

  2. Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI

    DTIC Science & Technology

    2017-06-01

    Report for Award Number W81XWH-13-1-0095, "Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI," covering the reporting period 08 MAR 2016 – 07 MAR 2017. Supporting data: none.

  3. Active eye-tracking for an adaptive optics scanning laser ophthalmoscope

    PubMed Central

    Sheehy, Christy K.; Tiruveedhula, Pavan; Sabesan, Ramkumar; Roorda, Austin

    2015-01-01

    We demonstrate a system that combines a tracking scanning laser ophthalmoscope (TSLO) and an adaptive optics scanning laser ophthalmoscope (AOSLO) system resulting in both optical (hardware) and digital (software) eye-tracking capabilities. The hybrid system employs the TSLO for active eye-tracking at a rate up to 960 Hz for real-time stabilization of the AOSLO system. AOSLO videos with active eye-tracking signals showed, at most, an amplitude of motion of 0.20 arcminutes for horizontal motion and 0.14 arcminutes for vertical motion. Subsequent real-time digital stabilization limited residual motion to an average of only 0.06 arcminutes (a 95% reduction). By correcting for high amplitude, low frequency drifts of the eye, the active TSLO eye-tracking system enabled the AOSLO system to capture high-resolution retinal images over a larger range of motion than previously possible with just the AOSLO imaging system alone. PMID:26203370

  4. Assistive Device for Efficient Intravitreal Injections.

    PubMed

    Ullrich, Franziska; Michels, Stephan; Lehmann, Daniel; Pieters, Roel S; Becker, Matthias; Nelson, Bradley J

    2016-08-01

    Intravitreal therapy is the most common treatment for many chronic ophthalmic diseases, such as age-related macular degeneration. Due to the increasing worldwide demand for intravitreal injections, there exists a need to render this medical procedure more time- and cost-efficient while increasing patient safety. The authors propose a medical assistive device that injects medication intravitreally. Compared to the manual intravitreal injection procedure, an automated device has the potential to increase safety for patients, decrease procedure times, allow for integrated data storage and documentation, and reduce costs for medical staff and expensive operating rooms. This work demonstrates the development of an assistive injection system that is coarsely positioned over the patient's head by the human operator, followed by automatic fine positioning and intravitreal injection through the pars plana. Several safety features, such as continuous eye tracking and iris recognition, have been implemented. The functioning system is demonstrated through ex vivo experiments with porcine eyes. [Ophthalmic Surg Lasers Imaging Retina. 2016;47:752-762.]. Copyright 2016, SLACK Incorporated.

  5. An eye tracking system for monitoring face scanning patterns reveals the enhancing effect of oxytocin on eye contact in common marmosets.

    PubMed

    Kotani, Manato; Shimono, Kohei; Yoneyama, Toshihiro; Nakako, Tomokazu; Matsumoto, Kenji; Ogi, Yuji; Konoike, Naho; Nakamura, Katsuki; Ikeda, Kazuhito

    2017-09-01

    Eye tracking systems are used to investigate eye position and gaze patterns presumed to reflect eye contact in humans. Eye contact is a useful biomarker of social communication and is known to be deficient in patients with autism spectrum disorders (ASDs). Interestingly, the same eye tracking systems have been used to directly compare face scanning patterns in some non-human primates to those in humans. Thus, eye tracking is expected to be a useful translational technique for investigating not only social attention and visual interest, but also the effects of psychiatric drugs, such as oxytocin, a neuropeptide that regulates social behavior. In this study, we report on a newly established method for eye tracking in common marmosets, unique New World primates that, like humans, use eye contact as a means of communication. Our investigation was aimed at characterizing these primates' face scanning patterns and evaluating the effects of oxytocin on their eye contact behavior. We found that normal common marmosets spend more time viewing the eye region of a picture of a common marmoset than the mouth region or a scrambled picture. In the oxytocin experiment, the change in the eyes/face ratio was significantly greater in the oxytocin group than in the vehicle group. Moreover, the oxytocin-induced increase in the change in the eyes/face ratio was completely blocked by the oxytocin receptor antagonist L-368,899. These results indicate that eye tracking in common marmosets may be useful for evaluating drug candidates targeting psychiatric conditions, especially ASDs. Copyright © 2017 Elsevier Ltd. All rights reserved.
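    The eyes/face ratio used as the readout in this study is essentially a dwell-time ratio over two areas of interest. A minimal sketch is given below; the rectangular AOI representation and the sample-count approximation of dwell time are simplifying assumptions, not the study's analysis pipeline.

```python
import numpy as np

def dwell_fraction(gaze_xy, aoi):
    """Fraction of gaze samples falling inside a rectangular AOI.
    gaze_xy : (N, 2) gaze coordinates; aoi : (xmin, ymin, xmax, ymax)."""
    x, y = np.asarray(gaze_xy, float).T
    xmin, ymin, xmax, ymax = aoi
    inside = (x >= xmin) & (x <= xmax) & (y >= ymin) & (y <= ymax)
    return float(np.mean(inside))

def eyes_face_ratio(gaze_xy, eyes_aoi, face_aoi):
    """Time on the eye region divided by time on the whole face region."""
    face = dwell_fraction(gaze_xy, face_aoi)
    return dwell_fraction(gaze_xy, eyes_aoi) / face if face > 0 else np.nan
```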

  6. Using Eye-Tracking in Applied Linguistics and Second Language Research

    ERIC Educational Resources Information Center

    Conklin, Kathy; Pellicer-Sánchez, Ana

    2016-01-01

    With eye-tracking technology the eye is thought to give researchers a window into the mind. Importantly, eye-tracking has significant advantages over traditional online processing measures: chiefly that it allows for more "natural" processing as it does not require a secondary task, and that it provides a very rich moment-to-moment data…

  7. Optics of the human cornea influence the accuracy of stereo eye-tracking methods: a simulation study.

    PubMed

    Barsingerhorn, A D; Boonstra, F N; Goossens, H H L M

    2017-02-01

    Current stereo eye-tracking methods model the cornea as a sphere with one refractive surface. However, the human cornea is slightly aspheric and has two refractive surfaces. Here we used ray-tracing and the Navarro eye-model to study how these optical properties affect the accuracy of different stereo eye-tracking methods. We found that pupil size, gaze direction and head position all influence the reconstruction of gaze. Resulting errors range between ± 1.0 degrees at best. This shows that stereo eye-tracking may be an option if reliable calibration is not possible, but the applied eye-model should account for the actual optics of the cornea.
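    The reconstruction errors discussed above arise from how gaze-estimation models refract rays at the cornea. The sketch below shows vector refraction at a single surface (Snell's law); the study itself used the aspheric, two-surface Navarro eye model, so this is only an illustrative simplification with assumed refractive indices.

```python
import numpy as np

def refract(d, n, n1=1.000, n2=1.376):
    """Refract unit direction `d` at a surface with unit normal `n`
    (pointing against the incoming ray), going from index n1 to n2.
    Returns the refracted unit direction, or None for total internal
    reflection. Vector form of Snell's law."""
    d, n = np.asarray(d, float), np.asarray(n, float)
    r = n1 / n2
    cos_i = -float(np.dot(n, d))
    sin2_t = r * r * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None
    cos_t = np.sqrt(1.0 - sin2_t)
    t = r * d + (r * cos_i - cos_t) * n
    return t / np.linalg.norm(t)

# Example: a ray striking the corneal apex head-on passes undeviated.
print(refract([0.0, 0.0, -1.0], [0.0, 0.0, 1.0]))  # -> [0. 0. -1.]
```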

  8. Footprints in the Sky: Using Student Track Logs from a "Bird's Eye View" Virtual Field Trip to Enhance Learning

    ERIC Educational Resources Information Center

    Treves, Richard; Viterbo, Paolo; Haklay, Mordechai

    2015-01-01

    Research into virtual field trips (VFTs) started in the 1990s but, only recently, the maturing technology of devices and networks has made them viable options for educational settings. By considering an experiment, the learning benefits of logging the movement of students within a VFT are shown. The data are visualized by two techniques:…

  9. Elevated intracranial pressure and reversible eye-tracking changes detected while viewing a film clip.

    PubMed

    Kolecki, Radek; Dammavalam, Vikalpa; Bin Zahid, Abdullah; Hubbard, Molly; Choudhry, Osamah; Reyes, Marleen; Han, ByoungJun; Wang, Tom; Papas, Paraskevi Vivian; Adem, Aylin; North, Emily; Gilbertson, David T; Kondziolka, Douglas; Huang, Jason H; Huang, Paul P; Samadani, Uzma

    2018-03-01

    OBJECTIVE The precise threshold differentiating normal and elevated intracranial pressure (ICP) is variable among individuals. In the context of several pathophysiological conditions, elevated ICP leads to abnormalities in global cerebral functioning and impacts the function of cranial nerves (CNs), either or both of which may contribute to ocular dysmotility. The purpose of this study was to assess the impact of elevated ICP on eye-tracking performed while patients were watching a short film clip. METHODS Awake patients requiring placement of an ICP monitor for clinical purposes underwent eye tracking while watching a 220-second continuously playing video moving around the perimeter of a viewing monitor. Pupil position was recorded at 500 Hz and metrics associated with each eye individually and both eyes together were calculated. Linear regression with generalized estimating equations was performed to test the association of eye-tracking metrics with changes in ICP. RESULTS Eye tracking was performed at ICP levels ranging from -3 to 30 mm Hg in 23 patients (12 women, 11 men, mean age 46.8 years) on 55 separate occasions. Eye-tracking measures correlating with CN function linearly decreased with increasing ICP (p < 0.001). Measures for CN VI were most prominently affected. The area under the curve (AUC) for eye-tracking metrics to discriminate between ICP < 12 and ≥ 12 mm Hg was 0.798. To discriminate an ICP < 15 from ≥ 15 mm Hg the AUC was 0.833, and to discriminate ICP < 20 from ≥ 20 mm Hg the AUC was 0.889. CONCLUSIONS Increasingly elevated ICP was associated with increasingly abnormal eye tracking detected while patients were watching a short film clip. These results suggest that eye tracking may be used as a noninvasive, automatable means to quantitate the physiological impact of elevated ICP, which has clinical application for assessment of shunt malfunction, pseudotumor cerebri, concussion, and prevention of second-impact syndrome.
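    The discrimination performance quoted above (AUC for separating recordings below and above an ICP cutoff) can be computed directly from a metric and a binary label with the rank-based (Mann-Whitney) formulation. The sketch below is generic; the variable names and the example cutoff are assumptions, and the study's regression step additionally used generalized estimating equations, which are not shown here.

```python
import numpy as np

def auc_mann_whitney(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney U) identity.

    scores : eye-tracking metric per recording session
    labels : 1 if ICP >= cutoff (e.g. 20 mm Hg), else 0
    If the metric decreases with rising ICP, the AUC will fall below 0.5;
    flip the sign of the scores (or report 1 - AUC) in that case.
    """
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, int)
    pos, neg = scores[labels == 1], scores[labels == 0]
    # fraction of (positive, negative) pairs ranked correctly; ties count half
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))
```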

  10. Eye-Tracking in the Study of Visual Expertise: Methodology and Approaches in Medicine

    ERIC Educational Resources Information Center

    Fox, Sharon E.; Faulkner-Jones, Beverly E.

    2017-01-01

    Eye-tracking is the measurement of eye motions and point of gaze of a viewer. Advances in this technology have been essential to our understanding of many forms of visual learning, including the development of visual expertise. In recent years, these studies have been extended to the medical professions, where eye-tracking technology has helped us…

  11. Vestibulo-Cervico-Ocular Responses and Tracking Eye Movements after Prolonged Exposure to Microgravity

    NASA Technical Reports Server (NTRS)

    Kornilova, L. N.; Naumov, I. A.; Azarov, K. A.; Sagalovitch, S. V.; Reschke, Millard F.; Kozlovskaya, I. B.

    2007-01-01

    The vestibular function and tracking eye movements were investigated in 12 Russian crew members of ISS missions on days 1(2), 4(5-6), and 8(9-10) after prolonged exposure to microgravity (126 to 195 days). The spontaneous oculomotor activity, static torsional otolith-cervico-ocular reflex, dynamic vestibulo-cervico-ocular responses, vestibular reactivity, tracking eye movements, and gaze-holding were studied using videooculography (VOG) and electrooculography (EOG) for parallel eye movement recording. On post-flight days 1-2 (R+1-2) some cosmonauts demonstrated: - an increased spontaneous oculomotor activity (floating eye movements, spontaneous nystagmus of the typical and atypical form, square wave jerks, gaze nystagmus) with the head held in the vertical position; - suppressed otolith function (absent or reduced by one half amplitude of torsional compensatory eye counter-rolling) with the head inclined statically right- or leftward by 30°; - increased vestibular reactivity (lowered threshold and increased intensity of the vestibular nystagmus) during head turns around the longitudinal body axis at 0.125 Hz; - a significant change in the accuracy, velocity, and temporal characteristics of the eye tracking. The pattern, depth, dynamics, and velocity of the vestibular function and tracking eye movements recovery varied with individual participants in the investigation. However, there were also regular responses during readaptation to the normal gravity: - suppression of the otolith function was typically accompanied by an exaggerated vestibular reactivity; - the structure of visual tracking (the accuracy of fixational eye rotations, smooth tracking, and gaze-holding) was disturbed (the appearance of correcting saccades, the transition of smooth tracking to saccadic tracking) only in those cosmonauts who, in parallel to an increased reactivity of the vestibular input, also had central changes in the oculomotor system (spontaneous nystagmus, gaze nystagmus).

  12. Real-time eye motion correction in phase-resolved OCT angiography with tracking SLO

    PubMed Central

    Braaf, Boy; Vienola, Kari V.; Sheehy, Christy K.; Yang, Qiang; Vermeer, Koenraad A.; Tiruveedhula, Pavan; Arathorn, David W.; Roorda, Austin; de Boer, Johannes F.

    2012-01-01

    In phase-resolved OCT angiography blood flow is detected from phase changes in between A-scans that are obtained from the same location. In ophthalmology, this technique is vulnerable to eye motion. We address this problem by combining inter-B-scan phase-resolved OCT angiography with real-time eye tracking. A tracking scanning laser ophthalmoscope (TSLO) at 840 nm provided eye tracking functionality and was combined with a phase-stabilized optical frequency domain imaging (OFDI) system at 1040 nm. Real-time eye tracking corrected eye drift and prevented discontinuity artifacts from (micro)saccadic eye motion in OCT angiograms. This improved the OCT spot stability on the retina and consequently reduced the phase-noise, thereby enabling the detection of slower blood flows by extending the inter-B-scan time interval. In addition, eye tracking enabled the easy compounding of multiple data sets from the fovea of a healthy volunteer to create high-quality eye motion artifact-free angiograms. High-quality images are presented of two distinct layers of vasculature in the retina and the dense vasculature of the choroid. Additionally we present, for the first time, a phase-resolved OCT angiogram of the mesh-like network of the choriocapillaris containing typical pore openings. PMID:23304647

  13. New developments in short-pulse eye safe lasers pave the way for future LADARs and 3D mapping performances

    NASA Astrophysics Data System (ADS)

    Pasmanik, Guerman; Latone, Kevin; Shilov, Alex; Shklovsky, Eugeni; Spiro, Alex; Tiour, Larissa

    2005-06-01

    We have demonstrated that direct excitation of 3rd Stokes Raman emission in a crystal can produce short (few-nanosecond) eye-safe pulses. The produced beam has very high quality, and the pulse energy can be as high as tens of millijoules. For pulsed, diode-pumped solid-state lasers the demonstrated repetition rate was 250 Hz, but higher repetition rates are certainly achievable. Importantly, the tested schemes do not place strict requirements on the laser pump parameters, namely beam divergence and frequency bandwidth. The obtained results are very relevant to the development of eye-safe laser devices, such as the new generation of rangefinders, target designators, and laser tracking and pin-pointing devices, as well as remote 2D and 3D imaging systems.

  14. Improvement of design of a surgical interface using an eye tracking device

    PubMed Central

    2014-01-01

    Background Surgical interfaces are used to help surgeons interpret and quantify patient information, and to present an integrated workflow in which all available data are combined to enable optimal treatments. Human factors research provides a systematic approach to designing user interfaces with safety, accuracy, satisfaction and comfort. One human factors research method, the user-centered design approach, is used to develop a surgical interface for kidney tumor cryoablation. An eye tracking device is used to obtain the best configuration of the developed surgical interface. Methods The surgical interface for kidney tumor cryoablation has been developed considering the four phases of the user-centered design approach, which are analysis, design, implementation and deployment. Possible configurations of the surgical interface, which comprise various combinations of menu-based command controls, visual displays of multi-modal medical images, 2D and 3D models of the surgical environment, graphical or tabulated information, visual alerts, etc., have been developed. Experiments on a simulated kidney tumor cryoablation task have been performed with surgeons to evaluate the proposed surgical interface. Fixation durations and the number of fixations at informative regions of the surgical interface have been analyzed, and these data were used to modify the surgical interface. Results Eye movement data showed that participants concentrated their attention on informative regions more when the number of displayed Computed Tomography (CT) images was reduced. Additionally, the time the participants required to complete the kidney tumor cryoablation task decreased with the reduced number of CT images. Furthermore, the fixation durations obtained after the revision of the surgical interface are very close to those observed in visual search and natural scene perception studies, suggesting more efficient and comfortable interaction with the surgical interface. The National Aeronautics and Space Administration Task Load Index (NASA-TLX) and Short Post-Assessment Situational Awareness (SPASA) questionnaire results showed that the overall mental workload of surgeons related to the surgical interface was low, as intended, and that the overall situational awareness scores of surgeons were considerably high. Conclusions This preliminary study highlights the improvement of a developed surgical interface using eye tracking technology to obtain the best surgical interface (SI) configuration. The results presented here reveal that a visual surgical interface design prepared according to eye movement characteristics may lead to improved usability. PMID:25080176

  15. Improvement of design of a surgical interface using an eye tracking device.

    PubMed

    Erol Barkana, Duygun; Açık, Alper; Duru, Dilek Goksel; Duru, Adil Deniz

    2014-05-07

    Surgical interfaces are used to help surgeons interpret and quantify patient information, and to present an integrated workflow in which all available data are combined to enable optimal treatments. Human factors research provides a systematic approach to designing user interfaces with safety, accuracy, satisfaction and comfort. One human factors research method, the user-centered design approach, is used to develop a surgical interface for kidney tumor cryoablation. An eye tracking device is used to obtain the best configuration of the developed surgical interface. The surgical interface for kidney tumor cryoablation has been developed considering the four phases of the user-centered design approach, which are analysis, design, implementation and deployment. Possible configurations of the surgical interface, which comprise various combinations of menu-based command controls, visual displays of multi-modal medical images, 2D and 3D models of the surgical environment, graphical or tabulated information, visual alerts, etc., have been developed. Experiments on a simulated kidney tumor cryoablation task have been performed with surgeons to evaluate the proposed surgical interface. Fixation durations and the number of fixations at informative regions of the surgical interface have been analyzed, and these data were used to modify the surgical interface. Eye movement data showed that participants concentrated their attention on informative regions more when the number of displayed Computed Tomography (CT) images was reduced. Additionally, the time the participants required to complete the kidney tumor cryoablation task decreased with the reduced number of CT images. Furthermore, the fixation durations obtained after the revision of the surgical interface are very close to those observed in visual search and natural scene perception studies, suggesting more efficient and comfortable interaction with the surgical interface. The National Aeronautics and Space Administration Task Load Index (NASA-TLX) and Short Post-Assessment Situational Awareness (SPASA) questionnaire results showed that the overall mental workload of surgeons related to the surgical interface was low, as intended, and that the overall situational awareness scores of surgeons were considerably high. This preliminary study highlights the improvement of a developed surgical interface using eye tracking technology to obtain the best surgical interface (SI) configuration. The results presented here reveal that a visual surgical interface design prepared according to eye movement characteristics may lead to improved usability.
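    Fixation-based evaluation of an interface layout, as used in the two records above, reduces to counting fixations and summing their durations inside each informative region. The sketch below assumes fixations have already been extracted (for example by the eye tracker's own event detector) and that regions are axis-aligned rectangles; both are simplifying assumptions rather than the authors' pipeline.

```python
from collections import defaultdict

def aoi_fixation_stats(fixations, aois):
    """fixations : list of (x, y, duration_ms) tuples
       aois      : dict name -> (xmin, ymin, xmax, ymax)
       Returns dict name -> {"count": n, "total_ms": t, "mean_ms": m}."""
    stats = defaultdict(lambda: {"count": 0, "total_ms": 0.0})
    for x, y, dur in fixations:
        for name, (xmin, ymin, xmax, ymax) in aois.items():
            if xmin <= x <= xmax and ymin <= y <= ymax:
                stats[name]["count"] += 1
                stats[name]["total_ms"] += dur
    for s in stats.values():
        s["mean_ms"] = s["total_ms"] / s["count"]
    return dict(stats)
```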

  16. Eye-in-Hand Manipulation for Remote Handling: Experimental Setup

    NASA Astrophysics Data System (ADS)

    Niu, Longchuan; Suominen, Olli; Aref, Mohammad M.; Mattila, Jouni; Ruiz, Emilio; Esque, Salvador

    2018-03-01

    A prototype for eye-in-hand manipulation in the context of remote handling in the International Thermonuclear Experimental Reactor (ITER) is presented in this paper. The setup consists of an industrial robot manipulator with a modified open control architecture, equipped with a pair of stereoscopic cameras, a force/torque sensor, and pneumatic tools. It is controlled through a haptic device in a mock-up environment. The industrial robot controller has been replaced by a single industrial PC running Xenomai that has a real-time connection to both the robot controller and another Linux PC running as the controller for the haptic device. The new remote handling control environment enables further development of advanced control schemes for autonomous and semi-autonomous manipulation tasks. This setup benefits from a stereovision system for accurate tracking of target objects with irregular shapes. The overall setup successfully demonstrates the robustness and precision that remote handling tasks require.

  17. Correlation of individual cosmic ray nuclei with the observation of light flashes by Apollo astronauts. [nuclear emulsion detector design and operation

    NASA Technical Reports Server (NTRS)

    Pinsky, L. S.; Osborne, W. Z.; Bailey, J. V.

    1975-01-01

    A nuclear emulsion detector known as the Apollo Light Flash Moving Emulsion Detector (ALFMED) was designed: (1) to record tracks of primary cosmic rays; (2) to provide time-of-passage information via a relative plate translation technique; (3) to provide particle trajectory information; and (4) to fit into a masklike device that could be located about the head and eyes of an astronaut. An ALFMED device was worn by an astronaut observing light flashes for 60 minutes on each of the last two Apollo missions. During the Apollo 17 experiment seventeen separate flashes were reported by the observer. With one-third of the total plate area completely analyzed, two definite correlations have been found between Z greater than 8 cosmic ray nuclei traversing an eye and the reports of visual sensations.

  18. Using Eye Trackers for Usability Evaluation of Health Information Technology: A Systematic Literature Review

    PubMed Central

    Yang, Yushi

    2015-01-01

    Background Eye-tracking technology has been used to measure human cognitive processes and has the potential to improve the usability of health information technology (HIT). However, it is still unclear how the eye-tracking method can be integrated with other traditional usability methodologies to achieve its full potential. Objective The objective of this study was to report on HIT evaluation studies that have used eye-tracker technology, and to envision the potential use of eye-tracking technology in future research. Methods We used four reference databases to initially identify 5248 related papers, which resulted in only 9 articles that met our inclusion criteria. Results Eye-tracking technology was useful in finding usability problems in many ways, but is still in its infancy for HIT usability evaluation. Limited types of HITs have been evaluated by eye trackers, and there has been a lack of evaluation research in natural settings. Conclusions More research should be done in natural settings to discover the real contextual-based usability problems of clinical and mobile HITs using eye-tracking technology with more standardized methodologies and guidance. PMID:27026079

  19. Scanning mid-IR laser apparatus with eye tracking for refractive surgery

    NASA Astrophysics Data System (ADS)

    Telfair, William B.; Yoder, Paul R., Jr.; Bekker, Carsten; Hoffman, Hanna J.; Jensen, Eric F.

    1999-06-01

    A robust, real-time, dynamic eye tracker has been integrated with the short pulse mid-infrared laser scanning delivery system previously described. This system employs a Q-switched Nd:YAG laser pumped optical parametric oscillator operating at 2.94 micrometers. Previous ablation studies on human cadaver eyes and in-vivo cat eyes demonstrated very smooth ablations with extremely low damage levels similar to results with an excimer. A 4-month healing study with cats indicated no adverse healing effects. In order to treat human eyes, the tracker is required because the eyes move during the procedure due to both voluntary and involuntary motions such as breathing, heartbeat, drift, loss of fixation, saccades and microsaccades. Eye tracking techniques from the literature were compared. A limbus tracking system was best for this application. Temporal and spectral filtering techniques were implemented to reduce tracking errors, reject stray light, and increase the signal-to-noise ratio. The expanded-capability system (IRVision AccuScan 2000 Laser System) has been tested in the lab on simulated eye targets, glass eyes, cadaver eyes, and live human subjects. Circular targets ranging from 10-mm to 14-mm diameter were successfully tracked. The tracker performed beyond expectations while the system performed myopic photorefractive keratectomy procedures on several legally blind human subjects.

  20. A Proposed Treatment for Visual Field Loss caused by Traumatic Brain Injury using Interactive Visuotactile Virtual Environment

    NASA Astrophysics Data System (ADS)

    Farkas, Attila J.; Hajnal, Alen; Shiratuddin, Mohd F.; Szatmary, Gabriella

    In this paper, we propose a novel approach to using interactive virtual environment technology in Vision Restoration Therapy for visual field loss caused by Traumatic Brain Injury. We call the new system the Interactive Visuotactile Virtual Environment, and it holds promise for expanding the scope of existing rehabilitation techniques. Traditional vision rehabilitation methods are based on passive psychophysical training procedures and can last up to six months before any modest improvements can be seen in patients. A highly immersive and interactive virtual environment will allow the patient to practice everyday activities such as object identification and object manipulation through the use of 3D motion-sensing handheld devices such as a data glove or the Nintendo Wiimote. Employing both perceptual and action components in the training procedures holds the promise of more efficient sensorimotor rehabilitation. Increased stimulation of visual and sensorimotor areas of the brain should facilitate a comprehensive recovery of visuomotor function by exploiting the plasticity of the central nervous system. Integrated with a motion tracking system and an eye tracking device, the interactive virtual environment allows for the creation and manipulation of a wide variety of stimuli, as well as real-time recording of hand, eye, and body movements and their coordination. The goal of the project is to design a cost-effective and efficient vision restoration system.

  1. Advances in the Dynallax solid-state dynamic parallax barrier autostereoscopic visualization display system.

    PubMed

    Peterka, Tom; Kooima, Robert L; Sandin, Daniel J; Johnson, Andrew; Leigh, Jason; DeFanti, Thomas A

    2008-01-01

    A solid-state dynamic parallax barrier autostereoscopic display mitigates some of the restrictions present in static barrier systems, such as fixed view-distance range, slow response to head movements, and fixed stereo operating mode. By dynamically varying barrier parameters in real time, viewers may move closer to the display and move faster laterally than with a static barrier system, and the display can switch between 3D and 2D modes by disabling the barrier on a per-pixel basis. Moreover, Dynallax can output four independent eye channels when two viewers are present, and both head-tracked viewers receive an independent pair of left-eye and right-eye perspective views based on their position in 3D space. The display device is constructed by using a dual-stacked LCD monitor where a dynamic barrier is rendered on the front display and a modulated virtual environment composed of two or four channels is rendered on the rear display. Dynallax was recently demonstrated in a small-scale head-tracked prototype system. This paper summarizes the concepts presented earlier, extends the discussion of various topics, and presents recent improvements to the system.

  2. Magnetic eye tracking in mice

    PubMed Central

    Payne, Hannah L

    2017-01-01

    Eye movements provide insights about a wide range of brain functions, from sensorimotor integration to cognition; hence, the measurement of eye movements is an important tool in neuroscience research. We describe a method, based on magnetic sensing, for measuring eye movements in head-fixed and freely moving mice. A small magnet was surgically implanted on the eye, and changes in the magnet angle as the eye rotated were detected by a magnetic field sensor. Systematic testing demonstrated high resolution measurements of eye position of <0.1°. Magnetic eye tracking offers several advantages over the well-established eye coil and video-oculography methods. Most notably, it provides the first method for reliable, high-resolution measurement of eye movements in freely moving mice, revealing increased eye movements and altered binocular coordination compared to head-fixed mice. Overall, magnetic eye tracking provides a lightweight, inexpensive, easily implemented, and high-resolution method suitable for a wide range of applications. PMID:28872455
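
    To make the sensing principle concrete, the sketch below fits a smooth mapping from magnetic-sensor output to known eye angles and then converts raw sensor samples to eye position. The cubic polynomial fit, the synthetic sine-shaped sensor response, and the sample values are illustrative assumptions, not the calibration procedure used in the paper.

        # Hypothetical calibration sketch: map sensor output (volts) to eye angle (degrees).
        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic calibration data: known eye angles vs. a nonlinear sensor response
        angles_deg = np.linspace(-20, 20, 21)
        sensor_v = 0.8 * np.sin(np.deg2rad(angles_deg)) + rng.normal(0, 0.002, angles_deg.size)

        coeffs = np.polyfit(sensor_v, angles_deg, deg=3)  # fit: volts -> degrees

        def sensor_to_angle(volts):
            """Convert raw magnetic-sensor readings to eye position in degrees."""
            return np.polyval(coeffs, volts)

        # Apply the calibration to a short stream of raw samples
        raw = 0.8 * np.sin(np.deg2rad(np.array([-10.0, 0.0, 5.0, 15.0])))
        print(np.round(sensor_to_angle(raw), 2))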

  3. Eye tracking detects disconjugate eye movements associated with structural traumatic brain injury and concussion.

    PubMed

    Samadani, Uzma; Ritlop, Robert; Reyes, Marleen; Nehrbass, Elena; Li, Meng; Lamm, Elizabeth; Schneider, Julia; Shimunov, David; Sava, Maria; Kolecki, Radek; Burris, Paige; Altomare, Lindsey; Mehmood, Talha; Smith, Theodore; Huang, Jason H; McStay, Christopher; Todd, S Rob; Qian, Meng; Kondziolka, Douglas; Wall, Stephen; Huang, Paul

    2015-04-15

    Disconjugate eye movements have been associated with traumatic brain injury since ancient times. Ocular motility dysfunction may be present in up to 90% of patients with concussion or blast injury. We developed an algorithm for eye tracking in which the Cartesian coordinates of the right and left pupils are tracked over 200 sec and compared to each other as a subject watches a short film clip moving inside an aperture on a computer screen. We prospectively eye tracked 64 normal healthy noninjured control subjects and compared findings to 75 trauma subjects with either a positive head computed tomography (CT) scan (n=13), negative head CT (n=39), or nonhead injury (n=23) to determine whether eye tracking would reveal the disconjugate gaze associated with both structural brain injury and concussion. Tracking metrics were then correlated to the clinical concussion measure Sport Concussion Assessment Tool 3 (SCAT3) in trauma patients. Five out of five measures of horizontal disconjugacy were increased in positive and negative head CT patients relative to noninjured control subjects. Only one of five vertical disconjugacy measures was significantly increased in brain-injured patients relative to controls. Linear regression analysis of all 75 trauma patients demonstrated that three metrics for horizontal disconjugacy negatively correlated with SCAT3 symptom severity score and positively correlated with total Standardized Assessment of Concussion score. Abnormal eye-tracking metrics improved over time toward baseline in brain-injured subjects observed in follow-up. Eye tracking may help quantify the severity of ocular motility disruption associated with concussion and structural brain injury.
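
    As an illustration only (not the authors' published algorithm or their five metrics), the sketch below computes simple horizontal and vertical disconjugacy statistics from simultaneously tracked left- and right-pupil coordinates; the synthetic data, sampling rate, and variance-based metric are assumptions.

        # Illustrative sketch: quantify horizontal and vertical disconjugacy from
        # simultaneously tracked left/right pupil positions over a viewing period.
        import numpy as np

        def disconjugacy_metrics(left_xy: np.ndarray, right_xy: np.ndarray) -> dict:
            """left_xy, right_xy: (N, 2) arrays of pupil (x, y) positions sampled
            over the viewing period (e.g., ~200 s of clip viewing)."""
            diff = left_xy - right_xy            # inter-ocular difference per sample
            diff = diff - diff.mean(axis=0)      # remove the fixed anatomical offset
            return {
                "horizontal_disconjugacy_var": float(np.var(diff[:, 0])),
                "vertical_disconjugacy_var": float(np.var(diff[:, 1])),
            }

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            t = np.linspace(0, 200, 10_000)      # assumed 50 Hz sampling for 200 s
            target = np.c_[np.sin(0.1 * t), np.cos(0.1 * t)]
            left = target + rng.normal(0, 0.02, target.shape)
            right = target + rng.normal(0, 0.02, target.shape)
            print(disconjugacy_metrics(left, right))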

  4. Eye Tracking Detects Disconjugate Eye Movements Associated with Structural Traumatic Brain Injury and Concussion

    PubMed Central

    Ritlop, Robert; Reyes, Marleen; Nehrbass, Elena; Li, Meng; Lamm, Elizabeth; Schneider, Julia; Shimunov, David; Sava, Maria; Kolecki, Radek; Burris, Paige; Altomare, Lindsey; Mehmood, Talha; Smith, Theodore; Huang, Jason H.; McStay, Christopher; Todd, S. Rob; Qian, Meng; Kondziolka, Douglas; Wall, Stephen; Huang, Paul

    2015-01-01

    Abstract Disconjugate eye movements have been associated with traumatic brain injury since ancient times. Ocular motility dysfunction may be present in up to 90% of patients with concussion or blast injury. We developed an algorithm for eye tracking in which the Cartesian coordinates of the right and left pupils are tracked over 200 sec and compared to each other as a subject watches a short film clip moving inside an aperture on a computer screen. We prospectively eye tracked 64 normal healthy noninjured control subjects and compared findings to 75 trauma subjects with either a positive head computed tomography (CT) scan (n=13), negative head CT (n=39), or nonhead injury (n=23) to determine whether eye tracking would reveal the disconjugate gaze associated with both structural brain injury and concussion. Tracking metrics were then correlated to the clinical concussion measure Sport Concussion Assessment Tool 3 (SCAT3) in trauma patients. Five out of five measures of horizontal disconjugacy were increased in positive and negative head CT patients relative to noninjured control subjects. Only one of five vertical disconjugacy measures was significantly increased in brain-injured patients relative to controls. Linear regression analysis of all 75 trauma patients demonstrated that three metrics for horizontal disconjugacy negatively correlated with SCAT3 symptom severity score and positively correlated with total Standardized Assessment of Concussion score. Abnormal eye-tracking metrics improved over time toward baseline in brain-injured subjects observed in follow-up. Eye tracking may help quantify the severity of ocular motility disruption associated with concussion and structural brain injury. PMID:25582436

  5. Alteration of travel patterns with vision loss from glaucoma and macular degeneration.

    PubMed

    Curriero, Frank C; Pinchoff, Jessie; van Landingham, Suzanne W; Ferrucci, Luigi; Friedman, David S; Ramulu, Pradeep Y

    2013-11-01

    The distance patients can travel outside the home influences how much of the world they can sample and to what extent they can live independently. Recent technological advances have allowed travel outside the home to be directly measured in patients' real-world routines. To determine whether decreased visual acuity (VA) from age-related macular degeneration (AMD) and visual field (VF) loss from glaucoma are associated with restricted travel patterns in older adults. Cross-sectional study. Patients were recruited from an eye clinic, while travel patterns were recorded during their real-world routines using a cellular tracking device. Sixty-one control subjects with normal vision, 84 subjects with glaucoma with bilateral VF loss, and 65 subjects with AMD with bilateral or severe unilateral loss of VA had their location tracked every 15 minutes between 7 am and 11 pm for 7 days using a tracking device. Average daily excursion size (defined as maximum distance away from home) and average daily excursion span (defined as maximum span of travel) were defined for each individual. The effects of vision loss on travel patterns were evaluated after controlling for individual and geographic factors. In multivariable models comparing subjects with AMD and control subjects, average excursion size and span decreased by approximately one-quarter mile for each line of better-eye VA loss (P ≤ .03 for both). Similar but not statistically significant associations were observed between average daily excursion size and span for severity of better-eye VF loss in subjects with glaucoma and control subjects. Being married or living with someone and younger age were associated with more distant travel, while less-distant travel was noted for older individuals, African Americans, and those living in more densely populated regions. Age-related macular degeneration-related loss of VA, but not glaucoma-related loss of VF, is associated with restriction of travel to more nearby locations. This constriction of life space may impact quality of life and restrict access to services.
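
    A minimal sketch of the two travel metrics as defined above, assuming that excursion size is the maximum distance from home and that excursion span is the largest distance between any two tracked locations in a day; planar kilometre coordinates are used for brevity.

        # Illustrative computation of daily excursion size and excursion span
        # from one day's location fixes (planar km offsets, an assumption).
        import numpy as np
        from itertools import combinations

        def excursion_size(points_km: np.ndarray, home_km: np.ndarray) -> float:
            """Maximum distance away from home over one day's location fixes."""
            return float(np.max(np.linalg.norm(points_km - home_km, axis=1)))

        def excursion_span(points_km: np.ndarray) -> float:
            """Maximum pairwise distance between any two of the day's fixes
            (O(N^2), fine for ~64 fixes per day at 15-minute sampling)."""
            return max(float(np.linalg.norm(a - b)) for a, b in combinations(points_km, 2))

        if __name__ == "__main__":
            home = np.array([0.0, 0.0])
            day = np.array([[0, 0], [1.2, 0.5], [3.0, -1.0], [0.4, 0.2]])  # km offsets
            print(excursion_size(day, home), excursion_span(day))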

  6. Use of Cognitive and Metacognitive Strategies in Online Search: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Zhou, Mingming; Ren, Jing

    2016-01-01

    This study used eye-tracking technology to track students' eye movements while searching information on the web. The research question guiding this study was "Do students with different search performance levels have different visual attention distributions while searching information online? If yes, what are the patterns for high and low…

  7. Assessing the Potential Use of Eye-Tracking Triangulation for Evaluating the Usability of an Online Diabetes Exercise System.

    PubMed

    Schaarup, Clara; Hartvigsen, Gunnar; Larsen, Lars Bo; Tan, Zheng-Hua; Årsand, Eirik; Hejlesen, Ole Kristian

    2015-01-01

    The Online Diabetes Exercise System was developed to motivate people with Type 2 diabetes to do a 25-minute low-volume, high-intensity interval training program. In a previous multi-method evaluation of the system, several usability issues were identified and corrected. Despite the thorough testing, it was unclear whether all usability problems had been identified using the multi-method evaluation. Our hypothesis was that adding eye-tracking triangulation to the multi-method evaluation would increase the accuracy and completeness of usability testing of the system. The study design was an eye-tracking triangulation: conventional eye-tracking with predefined tasks followed by the Post-Experience Eye-Tracked Protocol (PEEP). Six Areas of Interest were the basis for the PEEP session. The eye-tracking triangulation gave objective and subjective results, which are believed to be highly relevant for designing, implementing, evaluating and optimizing systems in the field of health informatics. Future work should include testing the method on a larger and more representative group of users and applying the method to different system types.

  8. Video-Based Eye Tracking in Sex Research: A Systematic Literature Review.

    PubMed

    Wenzlaff, Frederike; Briken, Peer; Dekker, Arne

    2015-12-21

    Although eye tracking has been used for decades, it has gained popularity in the area of sex research only recently. The aim of this article is to examine the potential merits of eye tracking for this field. We present a systematic review of the current use of video-based eye-tracking technology in this area, evaluate the findings, and identify future research opportunities. A total of 34 relevant studies published between 2006 and 2014 were identified for inclusion by means of online databases and other methods. We grouped them into three main areas of research: body perception and attractiveness, forensic research, and sexual orientation. Despite the methodological and theoretical differences across the studies, eye tracking has been shown to be a promising tool for sex research. The article suggests there is much potential for further studies to employ this technique because it is noninvasive and yet still allows for the assessment of both conscious and unconscious perceptional processes. Furthermore, eye tracking can be implemented in investigations of various theoretical backgrounds, ranging from biology to the social sciences.

  9. Registration of clinical volumes to beams-eye-view images for real-time tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryant, Jonathan H.; Rottmann, Joerg; Lewis, John H.

    2014-12-15

    Purpose: The authors combine the registration of 2D beam’s eye view (BEV) images and 3D planning computed tomography (CT) images, with relative, markerless tumor tracking to provide automatic absolute tracking of physician-defined volumes such as the gross tumor volume (GTV). Methods: During treatment of lung SBRT cases, BEV images were continuously acquired with an electronic portal imaging device (EPID) operating in cine mode. For absolute registration of physician-defined volumes, an intensity-based 2D/3D registration to the planning CT was performed using the end-of-exhale (EoE) phase of the four-dimensional computed tomography (4DCT). The volume was converted from Hounsfield units into electron density by a calibration curve and digitally reconstructed radiographs (DRRs) were generated for each beam geometry. Using normalized cross correlation between the DRR and an EoE BEV image, the best in-plane rigid transformation was found. The transformation was applied to physician-defined contours in the planning CT, mapping them into the EPID image domain. A robust multiregion method of relative markerless lung tumor tracking quantified deviations from the EoE position. Results: The success of 2D/3D registration was demonstrated at the EoE breathing phase. By registering at this phase and then employing a separate technique for relative tracking, the authors are able to successfully track target volumes in the BEV images throughout the entire treatment delivery. Conclusions: Through the combination of EPID/4DCT registration and relative tracking, a necessary step toward the clinical implementation of BEV tracking has been completed. The knowledge of tumor volumes relative to the treatment field is important for future applications like real-time motion management, adaptive radiotherapy, and delivered dose calculations.
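
    The sketch below illustrates only the normalized cross correlation (NCC) matching step on same-sized images, restricted to an exhaustive in-plane translation search; the full method described above also recovers rotation and maps 3D contours into the EPID image domain, which is not reproduced here.

        # Minimal NCC-based registration sketch: find the integer translation that
        # best aligns a DRR with a BEV image (translation only, for illustration).
        import numpy as np

        def ncc(a: np.ndarray, b: np.ndarray) -> float:
            a = (a - a.mean()) / (a.std() + 1e-12)
            b = (b - b.mean()) / (b.std() + 1e-12)
            return float(np.mean(a * b))

        def best_translation(drr: np.ndarray, bev: np.ndarray, search: int = 10):
            """Exhaustively search integer shifts of the DRR against the BEV image."""
            best = (-np.inf, (0, 0))
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    shifted = np.roll(np.roll(drr, dy, axis=0), dx, axis=1)
                    score = ncc(shifted, bev)
                    if score > best[0]:
                        best = (score, (dy, dx))
            return best  # (NCC value, (row shift, col shift))

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            bev = rng.random((64, 64))
            drr = np.roll(np.roll(bev, -3, axis=0), 5, axis=1)  # known offset
            print(best_translation(drr, bev))                   # recovers (3, -5)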

  10. Brief Report: Patterns of Eye Movements in Face to Face Conversation are Associated with Autistic Traits: Evidence from a Student Sample.

    PubMed

    Vabalas, Andrius; Freeth, Megan

    2016-01-01

    The current study investigated whether the amount of autistic traits shown by an individual is associated with viewing behaviour during a face-to-face interaction. The eye movements of 36 neurotypical university students were recorded using a mobile eye-tracking device. High amounts of autistic traits were neither associated with reduced looking to the social partner overall, nor with reduced looking to the face. However, individuals who were high in autistic traits exhibited reduced visual exploration during the face-to-face interaction overall, as demonstrated by shorter and less frequent saccades. Visual exploration was not related to social anxiety. This study suggests that there are systematic individual differences in visual exploration during social interactions and these are related to amount of autistic traits.

  11. Mobile Eye Tracking Methodology in Informal E-Learning in Social Groups in Technology-Enhanced Science Centres

    ERIC Educational Resources Information Center

    Magnussen, Rikke; Zachariassen, Maria; Kharlamov, Nikita; Larsen, Birger

    2017-01-01

    This paper presents a methodological discussion of the potential and challenges of involving mobile eye tracking technology in studies of knowledge generation and learning in a science centre context. The methodological exploration is based on eye-tracking studies of audience interaction and knowledge generation in the technology-enhanced health…

  12. Context Effects and Spoken Word Recognition of Chinese: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Yip, Michael C. W.; Zhai, Mingjun

    2018-01-01

    This study examined the time-course of context effects on spoken word recognition during Chinese sentence processing. We recruited 60 native Mandarin listeners to participate in an eye-tracking experiment. In this eye-tracking experiment, listeners were told to listen to a sentence carefully, which ended with a Chinese homophone, and look at…

  13. Eye Movements during Multiple Object Tracking: Where Do Participants Look?

    ERIC Educational Resources Information Center

    Fehd, Hilda M.; Seiffert, Adriane E.

    2008-01-01

    Similar to the eye movements you might make when viewing a sports game, this experiment investigated where participants tend to look while keeping track of multiple objects. While eye movements were recorded, participants tracked either 1 or 3 of 8 red dots that moved randomly within a square box on a black background. Results indicated that…

  14. Eye-Hand Synergy and Intermittent Behaviors during Target-Directed Tracking with Visual and Non-visual Information

    PubMed Central

    Huang, Chien-Ting; Hwang, Ing-Shiou

    2012-01-01

    Visual feedback and non-visual information play different roles in tracking of an external target. This study explored the respective roles of the visual and non-visual information in eleven healthy volunteers who coupled the manual cursor to a rhythmically moving target of 0.5 Hz under three sensorimotor conditions: eye-alone tracking (EA), eye-hand tracking with visual feedback of manual outputs (EH tracking), and the same tracking without such feedback (EHM tracking). Tracking error, kinematic variables, and movement intermittency (saccade and speed pulse) were contrasted among tracking conditions. The results showed that EHM tracking exhibited larger pursuit gain, less tracking error, and less movement intermittency for the ocular plant than EA tracking. With the vision of manual cursor, EH tracking achieved superior tracking congruency of the ocular and manual effectors with smaller movement intermittency than EHM tracking, except that the rate precision of manual action was similar for both types of tracking. The present study demonstrated that visibility of manual consequences altered mutual relationships between movement intermittency and tracking error. The speed pulse metrics of manual output were linked to ocular tracking error, and saccade events were time-locked to the positional error of manual tracking during EH tracking. In conclusion, peripheral non-visual information is critical to smooth pursuit characteristics and rate control of rhythmic manual tracking. Visual information adds to eye-hand synchrony, underlying improved amplitude control and elaborate error interpretation during oculo-manual tracking. PMID:23236498
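
    Movement intermittency analyses of this kind typically start from event detection in the gaze signal. The sketch below is a generic velocity-threshold (I-VT style) saccade detector on a one-dimensional gaze trace; the 30 deg/s threshold, the sampling rate, and the synthetic step stimulus are assumptions and not the criteria used in this study.

        # Generic velocity-threshold saccade detection on a 1-D gaze position trace.
        import numpy as np

        def detect_saccades(gaze_deg: np.ndarray, fs: float, vel_thresh: float = 30.0):
            """gaze_deg: (N,) gaze position in degrees sampled at fs Hz.
            Returns (start_index, end_index) pairs where eye velocity exceeds
            vel_thresh deg/s."""
            vel = np.abs(np.gradient(gaze_deg) * fs)          # deg/s
            fast = vel > vel_thresh
            edges = np.flatnonzero(np.diff(fast.astype(int)))  # rising/falling edges
            starts = edges[::2] + 1
            ends = edges[1::2] + 1
            return list(zip(starts, ends))

        if __name__ == "__main__":
            fs = 500.0
            t = np.arange(0, 2, 1 / fs)
            pos = np.where(t < 1.0, 0.0, 10.0)                       # single 10-degree step
            pos = np.convolve(pos, np.ones(25) / 25, mode="same")    # smooth the step
            print(detect_saccades(pos, fs))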

  15. Instructional Suggestions Supporting Science Learning in Digital Environments Based on a Review of Eye-Tracking Studies

    ERIC Educational Resources Information Center

    Yang, Fang-Ying; Tsai, Meng-Jung; Chiou, Guo-Li; Lee, Silvia Wen-Yu; Chang, Cheng-Chieh; Chen, Li-Ling

    2018-01-01

    The main purpose of this study was to provide instructional suggestions for supporting science learning in digital environments based on a review of eye tracking studies in e-learning related areas. Thirty-three eye-tracking studies from 2005 to 2014 were selected from the Social Science Citation Index (SSCI) database for review. Through a…

  16. How Visual Search Relates to Visual Diagnostic Performance: A Narrative Systematic Review of Eye-Tracking Research in Radiology

    ERIC Educational Resources Information Center

    van der Gijp, A.; Ravesloot, C. J.; Jarodzka, H.; van der Schaaf, M. F.; van der Schaaf, I. C.; van Schaik, J. P.; ten Cate, Th. J.

    2017-01-01

    Eye tracking research has been conducted for decades to gain understanding of visual diagnosis such as in radiology. For educational purposes, it is important to identify visual search patterns that are related to high perceptual performance and to identify effective teaching strategies. This review of eye-tracking literature in the radiology…

  17. Using Eye Tracking as a Tool to Teach Informatics Students the Importance of User Centered Design

    ERIC Educational Resources Information Center

    Gelderblom, Helene; Adebesin, Funmi; Brosens, Jacques; Kruger, Rendani

    2017-01-01

    In this article the authors describe how they incorporate eye tracking in a human-computer interaction (HCI) course that forms part of a postgraduate Informatics degree. The focus is on an eye tracking assignment that involves student groups performing usability evaluation studies for real world clients. Over the past three years the authors have…

  18. A relationship between eye movement patterns and performance in a precognitive tracking task

    NASA Technical Reports Server (NTRS)

    Repperger, D. W.; Hartzell, E. J.

    1977-01-01

    Eye movements made by various subjects in the performance of a precognitive tracking task are studied. The tracking task presented by an antiaircraft artillery (AAA) simulator has an input forcing function represented by a deterministic aircraft fly-by. The performance of subjects is ranked by two metrics. Good, mediocre, and poor trackers are selected for analysis based on performance during the difficult segment of the tracking task and over replications. Using phase planes to characterize both the eye movement patterns and the displayed error signal, a simple metric is developed to study these patterns. Two characterizations of eye movement strategies are defined and quantified. Using these two types of eye strategies, two conclusions are obtained about good, mediocre, and poor trackers. First, subjects who used a fixed eye-movement strategy consistently performed better. Second, the best fixed strategy is that of a Crosshair Fixator.

  19. 3D ocular ultrasound using gaze tracking on the contralateral eye: a feasibility study.

    PubMed

    Afsham, Narges; Najafi, Mohammad; Abolmaesumi, Purang; Rohling, Robert

    2011-01-01

    A gaze-deviated examination of the eye with a 2D ultrasound transducer is a common and informative ophthalmic test; however, 3D interpretation is hampered by the complex task of estimating the pose of the ultrasound images relative to the eye. To tackle this challenge, a novel system for 3D image reconstruction based on gaze tracking of the contralateral eye has been proposed. The gaze fixates on several target points and, for each fixation, the pose of the examined eye is inferred from the gaze tracking. A single camera system has been developed for pose estimation combined with subject-specific parameter identification. The ultrasound images are then transformed to the coordinate system of the examined eye to create a 3D volume. The accuracy of the proposed gaze tracking system and of the pose estimation of the eye has been validated in a set of experiments. The overall system errors, including pose estimation and calibration, are 3.12 mm and 4.68 degrees.

  20. A laser-based eye-tracking system.

    PubMed

    Irie, Kenji; Wilson, Bruce A; Jones, Richard D; Bones, Philip J; Anderson, Tim J

    2002-11-01

    This paper reports on the development of a new eye-tracking system for noninvasive recording of eye movements. The eye tracker uses a flying-spot laser to selectively image landmarks on the eye and, subsequently, measure horizontal, vertical, and torsional eye movements. Considerable work was required to overcome the adverse effects of specular reflection of the flying-spot from the surface of the eye onto the sensing elements of the eye tracker. These effects have been largely overcome, and the eye-tracker has been used to document eye movement abnormalities, such as abnormal torsional pulsion of saccades, in the clinical setting.

  1. A free geometry model-independent neural eye-gaze tracking system

    PubMed Central

    2012-01-01

    Background Eye Gaze Tracking Systems (EGTSs) estimate the Point Of Gaze (POG) of a user. In diagnostic applications EGTSs are used to study oculomotor characteristics and abnormalities, whereas in interactive applications EGTSs are proposed as input devices for human computer interfaces (HCI), e.g. to move a cursor on the screen when mouse control is not possible, such as in the case of assistive devices for people suffering from locked-in syndrome. If the user’s head remains still and the cornea rotates around its fixed centre, the pupil follows the eye in the images captured from one or more cameras, whereas the outer corneal reflection generated by an IR light source, i.e. the glint, can be assumed to be a fixed reference point. According to the so-called pupil centre corneal reflection method (PCCR), the POG can thus be estimated from the pupil-glint vector. Methods A new model-independent EGTS based on the PCCR is proposed. The mapping function, based on artificial neural networks, avoids any specific model assumption or approximation for either the user’s eye physiology or the initial system setup, and admits free geometric positioning of the user and the system components. The robustness of the proposed EGTS is proven by assessing its accuracy when tested on real data coming from: i) different healthy users; ii) different geometric settings of the camera and the light sources; iii) different protocols based on the observation of points on a calibration grid and halfway points of a test grid. Results The achieved accuracy is approximately 0.49°, 0.41°, and 0.62° for the horizontal, vertical, and radial error of the POG, respectively. Conclusions The results prove the validity of the proposed approach, as the proposed system performs better than EGTSs designed for HCI which, even if equipped with superior hardware, show accuracy values in the range 0.6°-1°. PMID:23158726
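
    A minimal sketch of the PCCR-plus-neural-network idea: train a small multilayer perceptron to map pupil-glint vectors to points of gaze using calibration-grid observations, then evaluate it on test points. scikit-learn's MLPRegressor, the synthetic mapping, and the network size are assumptions; the paper's own network architecture and training protocol are not reproduced.

        # Hypothetical PCCR mapping: pupil-glint vector (px) -> point of gaze (screen px).
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(42)

        def true_map(v):
            # Unknown, mildly nonlinear eye/screen mapping the network must learn (synthetic)
            return 960 + 14 * v + 0.05 * v**2

        # Synthetic calibration-grid data: pupil-glint vectors vs. known POG
        pg_calib = rng.uniform(-30, 30, size=(200, 2))
        pog_calib = true_map(pg_calib) + rng.normal(0, 2, (200, 2))

        net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
        net.fit(pg_calib, pog_calib)

        # Evaluate on halfway/test points, as in the grid-based protocol described above
        pg_test = rng.uniform(-30, 30, size=(50, 2))
        err = np.linalg.norm(net.predict(pg_test) - true_map(pg_test), axis=1)
        print(f"mean POG error on test points: {err.mean():.1f} px")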

  2. SU-G-BRA-06: Quantification of Tracking Performance of a Multi-Layer Electronic Portal Imaging Device

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Y; Rottmann, J; Myronakis, M

    2016-06-15

    Purpose: The purpose of this study was to quantify the improvement in tumor tracking, with and without fiducial markers, afforded by employing a multi-layer (MLI) electronic portal imaging device (EPID) over the current state-of-the-art, single-layer, digital megavolt imager (DMI) architecture. Methods: An ideal observer signal-to-noise ratio (d’) approach was used to quantify the ability of an MLI EPID and a current, state-of-the-art DMI EPID to track lung tumors from the treatment beam’s-eye-view. Using each detector’s modulation transfer function (MTF) and noise power spectrum (NPS) as inputs, a detection task was employed with object functions describing simple three-dimensional Cartesian shapes (spheres and cylinders). Marker-less tumor tracking algorithms often use texture discrimination to differentiate benign and malignant tissue. The performance of such algorithms is simulated by employing a discrimination task for the ideal observer, which measures the ability of a system to differentiate two image quantities. These were defined as the measured textures for benign and malignant lung tissue. Results: The NNPS of the MLI is ∼25% of that of the DMI at the expense of decreased MTF at intermediate frequencies (0.25≤
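
    A hedged sketch of the ideal-observer figure of merit named above: d'^2 is the sum over spatial frequencies of |W(f)|^2 * MTF(f)^2 / NNPS(f), where W is the Fourier transform of the task object function. The disc-shaped task, the exponential MTF, and the toy NNPS below are placeholders, not measured detector data for either imager.

        # Toy ideal-observer d' computation for a detection task (illustrative only).
        import numpy as np

        def ideal_observer_dprime(task_spatial: np.ndarray, mtf2d: np.ndarray,
                                  nnps2d: np.ndarray, pixel_mm: float) -> float:
            W = np.fft.fftshift(np.fft.fft2(task_spatial)) * pixel_mm**2  # task in frequency domain
            integrand = (np.abs(W) ** 2) * mtf2d**2 / nnps2d
            df = 1.0 / (task_spatial.shape[0] * pixel_mm)                 # frequency bin width (1/mm)
            return float(np.sqrt(np.sum(integrand) * df * df))

        if __name__ == "__main__":
            n, pix = 256, 0.5                                   # 256x256 grid, 0.5 mm pixels
            y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
            r_mm = np.hypot(x, y) * pix
            task = (r_mm <= 5.0).astype(float)                  # 5 mm disc (sphere projection)
            f = np.hypot(x, y) * (1.0 / (n * pix))              # radial frequency (1/mm)
            mtf = np.exp(-2.0 * f)                              # toy MTF
            nnps = 1e-5 * (1.0 + np.exp(-f))                    # toy NNPS
            print(f"d' = {ideal_observer_dprime(task, mtf, nnps, pix):.1f}")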

  3. A deep (learning) dive into visual search behaviour of breast radiologists

    NASA Astrophysics Data System (ADS)

    Mall, Suneeta; Brennan, Patrick C.; Mello-Thoms, Claudia

    2018-03-01

    Visual search, the process of detecting and identifying objects using eye movements (saccades) and foveal vision, has been studied to identify the root causes of errors in the interpretation of mammography. The aim of this study is to model the visual search behaviour of radiologists and their interpretation of mammograms using deep machine learning approaches. Our model is based on a deep convolutional neural network, a biologically inspired multilayer perceptron that simulates the visual cortex, and is reinforced with transfer learning techniques. Eye tracking data obtained from 8 radiologists (of varying experience levels in reading mammograms) reviewing 120 two-view digital mammography cases (59 cancers) were used to train the model, which was pre-trained with the ImageNet dataset for transfer learning. Areas of the mammogram that received direct (foveally fixated), indirect (peripherally fixated) or no (never fixated) visual attention were extracted from radiologists' visual search maps (obtained by a head-mounted eye tracking device). These areas, along with the radiologists' assessment (including confidence of the assessment) of suspected malignancy, were used to model: 1) the radiologists' decision; 2) the radiologists' confidence in that decision; and 3) the attentional level (i.e. foveal, peripheral or none) given to an area of the mammogram. Our results indicate high accuracy and low misclassification in modelling these behaviours.
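
    The sketch below shows the general shape of such a transfer-learning setup, assuming PyTorch/torchvision: an ImageNet-pretrained backbone is frozen and a new classification head predicts an attentional label (foveal, peripheral, or never fixated) for an image patch. The backbone choice, label encoding, and training step are illustrative assumptions, not the authors' architecture.

        # Transfer-learning sketch: frozen ImageNet backbone + new attentional-label head.
        import torch
        import torch.nn as nn
        from torchvision import models

        NUM_CLASSES = 3  # foveal, peripheral, never fixated (assumed encoding)

        backbone = models.resnet18(weights="DEFAULT")      # ImageNet pre-training
        for p in backbone.parameters():                    # freeze transferred features
            p.requires_grad = False
        backbone.fc = nn.Linear(backbone.fc.in_features, NUM_CLASSES)  # trainable head

        optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
        criterion = nn.CrossEntropyLoss()

        # One toy training step on random tensors standing in for mammogram patches
        patches = torch.randn(8, 3, 224, 224)
        labels = torch.randint(0, NUM_CLASSES, (8,))
        loss = criterion(backbone(patches), labels)
        loss.backward()
        optimizer.step()
        print(float(loss))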

  4. Learning and Treatment of Anaphylaxis by Laypeople: A Simulation Study Using Pupilar Technology

    PubMed Central

    Fernandez-Mendez, Felipe; Barcala-Furelos, Roberto; Padron-Cabo, Alexis; Garcia-Magan, Carlos; Moure-Gonzalez, Jose; Contreras-Jordan, Onofre; Rodriguez-Nuñez, Antonio

    2017-01-01

    An anaphylactic shock is a time-critical emergency situation. Decision-making during emergencies is an important responsibility but difficult to study. Eye-tracking technology allows us to identify the visual patterns involved in decision-making. The aim of this pilot study was to evaluate two training models for the recognition and treatment of anaphylaxis by laypeople, based on expert assessment and eye-tracking technology. A cross-sectional quasi-experimental simulation study was made to evaluate the identification and treatment of anaphylaxis. Fifty subjects were randomly assigned to four groups: three groups watching different training videos with content supervised by healthcare personnel and one control group who received face-to-face training during paediatric practice. To evaluate the learning, a simulation scenario represented by an anaphylaxis victim was designed. A device capturing eye movement as well as expert evaluation was used to assess performance. The subjects who underwent paediatric face-to-face training achieved better and faster recognition of the anaphylaxis. They also used the adrenaline injector with better precision and fewer mistakes, and they needed a smaller number of visual fixations to recognise the anaphylaxis and to make the decision to inject epinephrine. Analysis of the different video formats yielded mixed results; therefore, they should be tested for usability before implementation. PMID:28758128

  5. Where Do Neurologists Look When Viewing Brain CT Images? An Eye-Tracking Study Involving Stroke Cases

    PubMed Central

    Matsumoto, Hideyuki; Terao, Yasuo; Yugeta, Akihiro; Fukuda, Hideki; Emoto, Masaki; Furubayashi, Toshiaki; Okano, Tomoko; Hanajima, Ritsuko; Ugawa, Yoshikazu

    2011-01-01

    The aim of this study was to investigate where neurologists look when they view brain computed tomography (CT) images and to evaluate how they deploy their visual attention by comparing their gaze distribution with saliency maps. Brain CT images showing cerebrovascular accidents were presented to 12 neurologists and 12 control subjects. The subjects' ocular fixation positions were recorded using an eye-tracking device (Eyelink 1000). Heat maps were created based on the eye-fixation patterns of each group and compared between the two groups. The heat maps revealed that the areas on which control subjects frequently fixated often coincided with areas identified as outstanding in saliency maps, while the areas on which neurologists frequently fixated often did not. Dwell time in regions of interest (ROI) was likewise compared between the two groups, revealing that, although dwell time on large lesions was not different between the two groups, dwell time in clinically important areas with low salience was longer in neurologists than in controls. Therefore it appears that neurologists intentionally scan clinically important areas when reading brain CT images showing cerebrovascular accidents. Both neurologists and control subjects used the “bottom-up salience” form of visual attention, although the neurologists more effectively used the “top-down instruction” form. PMID:22174928

  6. Learning and Treatment of Anaphylaxis by Laypeople: A Simulation Study Using Pupilar Technology.

    PubMed

    Fernandez-Mendez, Felipe; Saez-Gallego, Nieves Maria; Barcala-Furelos, Roberto; Abelairas-Gomez, Cristian; Padron-Cabo, Alexis; Perez-Ferreiros, Alexandra; Garcia-Magan, Carlos; Moure-Gonzalez, Jose; Contreras-Jordan, Onofre; Rodriguez-Nuñez, Antonio

    2017-01-01

    An anaphylactic shock is a time-critical emergency situation. Decision-making during emergencies is an important responsibility but difficult to study. Eye-tracking technology allows us to identify the visual patterns involved in decision-making. The aim of this pilot study was to evaluate two training models for the recognition and treatment of anaphylaxis by laypeople, based on expert assessment and eye-tracking technology. A cross-sectional quasi-experimental simulation study was made to evaluate the identification and treatment of anaphylaxis. Fifty subjects were randomly assigned to four groups: three groups watching different training videos with content supervised by healthcare personnel and one control group who received face-to-face training during paediatric practice. To evaluate the learning, a simulation scenario represented by an anaphylaxis victim was designed. A device capturing eye movement as well as expert evaluation was used to assess performance. The subjects who underwent paediatric face-to-face training achieved better and faster recognition of the anaphylaxis. They also used the adrenaline injector with better precision and fewer mistakes, and they needed a smaller number of visual fixations to recognise the anaphylaxis and to make the decision to inject epinephrine. Analysis of the different video formats yielded mixed results; therefore, they should be tested for usability before implementation.

  7. Using Cognitive Task Analysis and Eye Tracking to Understand Imagery Analysis

    DTIC Science & Technology

    2006-01-01

    Laura Kurland, Abigail Gertner, Tom Bartee, Michael Chisholm, and colleagues used cognitive task analysis and eye tracking to study analysts' search behavior in detail, within a Cognitive Task Analysis (CTA) framework for knowledge…

  8. An Exploration of Cognitive Agility as Quantified by Attention Allocation in a Complex Environment

    DTIC Science & Technology

    2017-03-01

    Attention allocation was quantified by eye-tracking data collected while subjects played a military-relevant cognitive agility computer game (Make Goal), to determine whether certain patterns are associated with effective…

  9. A Novel Eye-Tracking Method to Assess Attention Allocation in Individuals with and without Aphasia Using a Dual-Task Paradigm

    PubMed Central

    Heuer, Sabine; Hallowell, Brooke

    2015-01-01

    Numerous authors report that people with aphasia have greater difficulty allocating attention than people without neurological disorders. Studying how attention deficits contribute to language deficits is important. However, existing methods for indexing attention allocation in people with aphasia pose serious methodological challenges. Eye-tracking methods have great potential to address such challenges. We developed and assessed the validity of a new dual-task method incorporating eye tracking to assess attention allocation. Twenty-six adults with aphasia and 33 control participants completed auditory sentence comprehension and visual search tasks. To test whether the new method validly indexes well-documented patterns in attention allocation, demands were manipulated by varying task complexity in single- and dual-task conditions. Differences in attention allocation were indexed via eye-tracking measures. For all participants significant increases in attention allocation demands were observed from single- to dual-task conditions and from simple to complex stimuli. Individuals with aphasia had greater difficulty allocating attention with greater task demands. Relationships between eye-tracking indices of comprehension during single and dual tasks and standardized testing were examined. Results support the validity of the novel eye-tracking method for assessing attention allocation in people with and without aphasia. Clinical and research implications are discussed. PMID:25913549

  10. APPLICATION OF EYE TRACKING FOR MEASUREMENT AND EVALUATION IN HUMAN FACTORS STUDIES IN CONTROL ROOM MODERNIZATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kovesdi, C.; Spielman, Z.; LeBlanc, K.

    An important element of human factors engineering (HFE) pertains to measurement and evaluation (M&E). The role of HFE-M&E should be integrated throughout the entire control room modernization (CRM) process and be used for human-system performance evaluation and for diagnostic purposes in resolving potential human engineering deficiencies (HEDs) and other human machine interface (HMI) design issues. NUREG-0711 describes how HFE in CRM should employ a hierarchical set of measures, particularly during integrated system validation (ISV), including plant performance, personnel task performance, situation awareness, cognitive workload, and anthropometric/physiological factors. Historically, subjective measures have been used primarily because they are easier to collect and do not require specialized equipment. However, there are pitfalls in relying solely on subjective measures in M&E that negatively impact reliability, sensitivity, and objectivity. As part of comprehensively capturing a diverse set of measures that strengthen findings and inferences about the benefits of emerging technologies like advanced displays, this paper discusses the value of using eye tracking as an objective method in M&E. A brief description of eye tracking technology and relevant eye tracking measures is provided. Additionally, technical considerations and the unique challenges of using eye tracking in full-scale simulations are addressed. Finally, this paper shares preliminary findings regarding the use of a wearable eye tracking system in a full-scale simulator study. These findings should help guide future full-scale simulator studies using eye tracking as a methodology to evaluate human-system performance.

  11. Clutter in electronic medical records: examining its performance and attentional costs using eye tracking.

    PubMed

    Moacdieh, Nadine; Sarter, Nadine

    2015-06-01

    The objective was to use eye tracking to trace the underlying changes in attention allocation associated with the performance effects of clutter, stress, and task difficulty in visual search and noticing tasks. Clutter can degrade performance in complex domains, yet more needs to be known about the associated changes in attention allocation, particularly in the presence of stress and for different tasks. Frequently used and relatively simple eye tracking metrics do not effectively capture the various effects of clutter, which is critical for comprehensively analyzing clutter and developing targeted, real-time countermeasures. Electronic medical records (EMRs) were chosen as the application domain for this research. Clutter, stress, and task difficulty were manipulated, and physicians' performance on search and noticing tasks was recorded. Several eye tracking metrics were used to trace attention allocation throughout those tasks, and subjective data were gathered via a debriefing questionnaire. Clutter degraded performance in terms of response time and noticing accuracy. These decrements were largely accentuated by high stress and task difficulty. Eye tracking revealed the underlying attentional mechanisms, and several display-independent metrics were shown to be significant indicators of the effects of clutter. Eye tracking provides a promising means to understand in detail (offline) and prevent (in real time) major performance breakdowns due to clutter. Display designers need to be aware of the risks of clutter in EMRs and other complex displays and can use the identified eye tracking metrics to evaluate and/or adjust their display. © 2015, Human Factors and Ergonomics Society.

  12. Parallax barrier engineering for image quality improvement in an autostereoscopic 3D display.

    PubMed

    Kim, Sung-Kyu; Yoon, Ki-Hyuk; Yoon, Seon Kyu; Ju, Heongkyu

    2015-05-18

    We present an image quality improvement in a parallax barrier (PB)-based multiview autostereoscopic 3D display system under real-time tracking of the positions of a viewer's eyes. The system presented exploits a parallax barrier engineered to offer significantly improved quality of three-dimensional images for a moving viewer without eyewear under dynamic eye tracking. The improved image quality includes enhanced uniformity of image brightness, reduced point crosstalk, and no pseudoscopic effects. We control the ratio between two parameters, i.e., the pixel size and the aperture of the parallax barrier slit, to improve uniformity of image brightness at a viewing zone. The eye tracking that monitors the positions of a viewer's eyes enables the pixel data control software to turn on only the pixels for view images near the viewer's eyes (the other pixels are turned off), thus reducing point crosstalk. The eye-tracking-combined software provides the right images for the respective eyes, therefore producing no pseudoscopic effects at zone boundaries. The viewing zone can be spanned over an area larger than the central viewing zone offered by a conventional PB-based multiview autostereoscopic 3D display (no eye tracking). Our 3D display system also provides multiviews for motion parallax under eye tracking. More importantly, we demonstrate a substantial reduction of point crosstalk of images at the viewing zone, its level being comparable to that of a commercial eyewear-assisted 3D display system. The multiview autostereoscopic 3D display presented can greatly resolve the point crosstalk problem, which is one of the critical factors that has made it difficult for previous multiview autostereoscopic 3D display technologies to replace their eyewear-assisted counterparts.
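
    For orientation, the sketch below evaluates the textbook two-view parallax-barrier geometry relating pixel pitch, eye separation, and viewing distance to the barrier gap and barrier pitch. It is only a baseline illustration; the engineered multiview, eye-tracked barrier described above involves additional parameters (notably the slit aperture to pixel-size ratio) that are not modelled here.

        # Textbook two-view parallax-barrier geometry (baseline illustration only).
        def two_view_barrier(pixel_pitch_mm: float, eye_sep_mm: float = 65.0,
                             view_dist_mm: float = 600.0):
            """Return (barrier gap, barrier pitch) in mm for a simple two-view display."""
            gap = pixel_pitch_mm * view_dist_mm / eye_sep_mm              # slit-to-pixel distance
            pitch = 2.0 * pixel_pitch_mm * view_dist_mm / (view_dist_mm + gap)
            return gap, pitch

        if __name__ == "__main__":
            g, b = two_view_barrier(pixel_pitch_mm=0.1)
            print(f"barrier gap = {g:.3f} mm, barrier pitch = {b:.5f} mm (slightly < 2 pixel pitches)")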

  13. VisualEyes: a modular software system for oculomotor experimentation.

    PubMed

    Guo, Yi; Kim, Eun H; Kim, Eun; Alvarez, Tara; Alvarez, Tara L

    2011-03-25

    Eye movement studies have provided a strong foundation for understanding how the brain acquires visual information in both the normal and the dysfunctional brain.(1) However, developing a platform to stimulate and store eye movements can require substantial programming, time, and cost. Many systems do not offer the flexibility to program numerous stimuli for a variety of experimental needs. However, the VisualEyes System has a flexible architecture, allowing the operator to choose any background and foreground stimulus, program one or two screens for tandem or opposing eye movements, and stimulate the left and right eye independently. This system can significantly reduce the programming development time needed to conduct an oculomotor study. The VisualEyes System will be discussed in three parts: 1) the oculomotor recording device to acquire eye movement responses, 2) the VisualEyes software written in LabVIEW, to generate an array of stimuli and store responses as text files, and 3) offline data analysis. Eye movements can be recorded by several types of instrumentation, such as a limbus tracking system, a scleral search coil, or a video image system. Typical eye movement stimuli such as saccadic steps, vergence ramps, and vergence steps, with the corresponding responses, will be shown. In this video report, we demonstrate the flexibility of a system to create numerous visual stimuli and record eye movements that can be utilized by basic scientists and clinicians to study healthy as well as clinical populations.

  14. Detection of differential viewing patterns to erotic and non-erotic stimuli using eye-tracking methodology.

    PubMed

    Lykins, Amy D; Meana, Marta; Kambe, Gretchen

    2006-10-01

    As a first step in the investigation of the role of visual attention in the processing of erotic stimuli, eye-tracking methodology was employed to measure eye movements during erotic scene presentation. Because eye-tracking is a novel methodology in sexuality research, we attempted to determine whether the eye-tracker could detect differences (should they exist) in visual attention to erotic and non-erotic scenes. A total of 20 men and 20 women were presented with a series of erotic and non-erotic images, and their eye movements were tracked during image presentation. Comparisons between the erotic and non-erotic image groups showed significant differences on two of three dependent measures of visual attention (number of fixations and total time) in both men and women. As hypothesized, there was a significant Stimulus x Scene Region interaction, indicating that participants visually attended to the body more in the erotic stimuli than in the non-erotic stimuli, as evidenced by a greater number of fixations and longer total time devoted to that region. These findings provide support for the application of eye-tracking methodology as a measure of visual attentional capture in sexuality research. Future applications of this methodology to expand our knowledge of the role of cognition in sexuality are suggested.

  15. Quantitative, simultaneous, and collinear eye-tracked, high dynamic range optical coherence tomography at 850 and 1060 nm

    NASA Astrophysics Data System (ADS)

    Mooser, Matthias; Burri, Christian; Stoller, Markus; Luggen, David; Peyer, Michael; Arnold, Patrik; Meier, Christoph; Považay, Boris

    2017-07-01

    Ocular optical coherence tomography at the wavelength ranges of 850 and 1060 nm has been integrated with a confocal scanning laser ophthalmoscope eye-tracker as a clinical, commercial-class system. Collinear optics enables an exact overlap of the different channels to produce precisely overlapping depth-scans for evaluating the similarities and differences between the wavelengths and extracting additional physiologic information. A reliable segmentation algorithm utilizing graph cuts has been implemented and applied to automatically extract retinal and choroidal shape in cross-sections and volumes. The device has been tested in normal subjects and in pathologies, including a cross-sectional and longitudinal study of myopia progression and control with a duplicate instrument in Asian children.

  16. Active eye-tracking improves LASIK results.

    PubMed

    Lee, Yuan-Chieh

    2007-06-01

    To study the advantage of active eye-tracking for photorefractive surgery. In a prospective, double-masked study, LASIK for myopia and myopic astigmatism was performed in 50 patients using the ALLEGRETTO WAVE version 1007. All patients received LASIK with full comprehension of the importance of fixation during the procedure. All surgical procedures were performed by a single surgeon. The eye-tracker was turned off in one group (n = 25) and kept on in another group (n = 25). Preoperatively and 3 months postoperatively, patients underwent a standard ophthalmic examination, which included corneal topography. All patients treated with the eye-tracker off had uncorrected visual acuity (UCVA) of ≥20/40, and 64% had ≥20/20. Compared with the patients treated with the eye-tracker on, they had higher residual cylindrical astigmatism (P < .05). Those treated with the eye-tracker on achieved better UCVA and best spectacle-corrected visual acuity (P < .05). Spherical error and potential visual acuity (TMS-II) were not significantly different between the groups. The flying-spot system can achieve a fair result without active eye-tracking, but active eye-tracking helps improve the visual outcome and reduces postoperative cylindrical astigmatism.

  17. Towards free 3D end-point control for robotic-assisted human reaching using binocular eye tracking.

    PubMed

    Maimon-Dror, Roni O; Fernandez-Quesada, Jorge; Zito, Giuseppe A; Konnaris, Charalambos; Dziemian, Sabine; Faisal, A Aldo

    2017-07-01

    Eye movements are the only directly observable behavioural signals that are highly correlated with actions at the task level; they are predictive of body movements and thus reflect action intentions. Moreover, eye movements are preserved in many movement disorders leading to paralysis, and in amputees, from stroke, spinal cord injury, Parkinson's disease, multiple sclerosis, and muscular dystrophy, among others. Despite this benefit, eye tracking is not widely used as a control interface for robotic systems in movement-impaired patients because of poor human-robot interfaces. We demonstrate here how combining 3D gaze tracking using our GT3D binocular eye tracker with a custom-designed 3D head tracking system and calibration method enables continuous 3D end-point control of a robotic arm support system. Users can move their own hand to any location in the workspace by simply looking at the target and winking once. This purely eye-tracking-based system enables the end-user to retain free head movement and yet achieves high spatial end-point accuracy, on the order of 6 cm RMSE in each dimension with a standard deviation of 4 cm. 3D calibration is achieved by moving the robot along a three-dimensional space-filling Peano curve while the user tracks it with their eyes. This results in a fully automated calibration procedure that yields several thousand calibration points, versus standard approaches using a dozen points, resulting in beyond state-of-the-art 3D accuracy and precision.

  18. An eye tracking study of bloodstain pattern analysts during pattern classification.

    PubMed

    Arthur, R M; Hoogenboom, J; Green, R D; Taylor, M C; de Bruin, K G

    2018-05-01

    Bloodstain pattern analysis (BPA) is the forensic discipline concerned with the classification and interpretation of bloodstains and bloodstain patterns at the crime scene. At present, it is unclear exactly which stain or pattern properties and their associated values are most relevant to analysts when classifying a bloodstain pattern. Eye tracking technology has been widely used to investigate human perception and cognition. Its application to forensics, however, is limited. This is the first study to use eye tracking as a tool for gaining access to the mindset of the bloodstain pattern expert. An eye tracking method was used to follow the gaze of 24 bloodstain pattern analysts during an assigned task of classifying a laboratory-generated test bloodstain pattern. With the aid of an automated image-processing methodology, the properties of selected features of the pattern were quantified leading to the delineation of areas of interest (AOIs). Eye tracking data were collected for each AOI and combined with verbal statements made by analysts after the classification task to determine the critical range of values for relevant diagnostic features. Eye-tracking data indicated that there were four main regions of the pattern that analysts were most interested in. Within each region, individual elements or groups of elements that exhibited features associated with directionality, size, colour and shape appeared to capture the most interest of analysts during the classification task. The study showed that the eye movements of trained bloodstain pattern experts and their verbal descriptions of a pattern were well correlated.
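
    One simple way to combine fixation data with delineated AOIs, as in the analysis above, is to accumulate dwell time per AOI. The sketch below assumes rectangular AOIs and a plain list of fixations; the actual study used image-processing-derived AOIs and additional verbal data, which are not modelled here.

        # Illustrative dwell-time accumulation per area of interest (AOI).
        from dataclasses import dataclass

        @dataclass
        class Fixation:
            x: float
            y: float
            duration_ms: float

        def dwell_time_per_aoi(fixations, aois):
            """aois: dict name -> (xmin, ymin, xmax, ymax). Returns dict name -> ms."""
            dwell = {name: 0.0 for name in aois}
            for f in fixations:
                for name, (x0, y0, x1, y1) in aois.items():
                    if x0 <= f.x <= x1 and y0 <= f.y <= y1:
                        dwell[name] += f.duration_ms
            return dwell

        if __name__ == "__main__":
            aois = {"region_A": (0, 0, 100, 100), "region_B": (100, 0, 200, 100)}
            fixations = [Fixation(40, 50, 220), Fixation(150, 20, 310), Fixation(60, 90, 180)]
            print(dwell_time_per_aoi(fixations, aois))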

  19. Feasibility of utilizing a commercial eye tracker to assess electronic health record use during patient simulation.

    PubMed

    Gold, Jeffrey Allen; Stephenson, Laurel E; Gorsuch, Adriel; Parthasarathy, Keshav; Mohan, Vishnu

    2016-09-01

    Numerous reports describe unintended consequences of electronic health record implementation. Having previously described physicians' failures to recognize patient safety issues within our electronic health record simulation environment, we now report on our use of eye and screen-tracking technology to understand factors associated with poor error recognition during an intensive care unit-based electronic health record simulation. We linked performance on the simulation to standard eye and screen-tracking readouts including number of fixations, saccades, mouse clicks and screens visited. In addition, we developed an overall Composite Eye Tracking score which measured when, where and how often each safety item was viewed. For 39 participants, the Composite Eye Tracking score correlated with performance on the simulation (p = 0.004). Overall, the improved performance was associated with a pattern of rapid scanning of data manifested by increased number of screens visited (p = 0.001), mouse clicks (p = 0.03) and saccades (p = 0.004). Eye tracking can be successfully integrated into electronic health record-based simulation and provides a surrogate measure of cognitive decision making and electronic health record usability. © The Author(s) 2015.
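
    As a hedged illustration of the idea behind a composite gaze score, the sketch below combines whether, how soon, and how often each safety item was viewed into a single number per participant and correlates it with a synthetic performance outcome. The weights, item counts, and data are invented; they are not the published scoring rules or results.

        # Invented composite gaze score plus correlation with a synthetic outcome.
        import numpy as np

        def composite_score(items):
            """items: list of dicts with keys 'viewed' (bool), 'time_to_first_view_s',
            and 'view_count' for each safety item shown to one participant."""
            score = 0.0
            for it in items:
                if it["viewed"]:
                    score += 1.0                                        # where: item was seen at all
                    score += 1.0 / (1.0 + it["time_to_first_view_s"])   # when: earlier is better
                    score += 0.1 * it["view_count"]                     # how often
            return score

        if __name__ == "__main__":
            rng = np.random.default_rng(3)
            scores, performance = [], []
            for _ in range(39):                                  # 39 participants, as in the study
                items = [{"viewed": rng.random() > 0.3,          # 14 items is a made-up count
                          "time_to_first_view_s": float(rng.uniform(1, 60)),
                          "view_count": int(rng.integers(0, 6))} for _ in range(14)]
                s = composite_score(items)
                scores.append(s)
                performance.append(0.5 * s + rng.normal(0, 1))   # synthetic outcome
            r = np.corrcoef(scores, performance)[0, 1]
            print(f"Pearson r between composite score and performance: {r:.2f}")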

  20. Optimization of illumination schemes in a head-mounted display integrated with eye tracking capabilities

    NASA Astrophysics Data System (ADS)

    Pansing, Craig W.; Hua, Hong; Rolland, Jannick P.

    2005-08-01

    Head-mounted display (HMD) technologies find a variety of applications in the field of 3D virtual and augmented environments, 3D scientific visualization, as well as wearable displays. While most of the current HMDs use head pose to approximate line of sight, we propose to investigate approaches and designs for integrating eye tracking capability into HMDs from a low-level system design perspective and to explore schemes for optimizing system performance. In this paper, we particularly propose to optimize the illumination scheme, which is a critical component in designing an eye tracking-HMD (ET-HMD) integrated system. An optimal design can improve not only eye tracking accuracy, but also robustness. Using LightTools, we present the simulation of a complete eye illumination and imaging system using an eye model along with multiple near infrared LED (IRLED) illuminators and imaging optics, showing the irradiance variation of the different eye structures. The simulation of dark pupil effects along with multiple 1st-order Purkinje images will be presented. A parametric analysis is performed to investigate the relationships between the IRLED configurations and the irradiance distribution at the eye, and a set of optimal configuration parameters is recommended. The analysis will be further refined by actual eye image acquisition and processing.

  1. Comparison of Threshold Saccadic Vector Optokinetic Perimetry (SVOP) and Standard Automated Perimetry (SAP) in Glaucoma. Part II: Patterns of Visual Field Loss and Acceptability.

    PubMed

    McTrusty, Alice D; Cameron, Lorraine A; Perperidis, Antonios; Brash, Harry M; Tatham, Andrew J; Agarwal, Pankaj K; Murray, Ian C; Fleck, Brian W; Minns, Robert A

    2017-09-01

    We compared patterns of visual field loss detected by standard automated perimetry (SAP) to saccadic vector optokinetic perimetry (SVOP) and examined patient perceptions of each test. A cross-sectional study was done of 58 healthy subjects and 103 with glaucoma who were tested using SAP and two versions of SVOP (v1 and v2). Visual fields from both devices were categorized by masked graders as: 0, normal; 1, paracentral defect; 2, nasal step; 3, arcuate defect; 4, altitudinal; 5, biarcuate; and 6, end-stage field loss. SVOP and SAP classifications were cross-tabulated. Subjects completed a questionnaire on their opinions of each test. We analyzed 142 (v1) and 111 (v2) SVOP and SAP test pairs. SVOP v2 had a sensitivity of 97.7% and specificity of 77.9% for identifying normal versus abnormal visual fields. SAP and SVOP v2 classifications showed complete agreement in 54% of glaucoma patients, with a further 23% disagreeing by one category. On repeat testing, 86% of SVOP v2 classifications agreed with the previous test, compared to 91% of SAP classifications; 71% of subjects preferred SVOP compared to 20% who preferred SAP. Eye-tracking perimetry can be used to obtain threshold visual field sensitivity values in patients with glaucoma and produce maps of visual field defects, with patterns exhibiting close agreement to SAP. Patients preferred eye-tracking perimetry compared to SAP. This first report of threshold eye tracking perimetry shows good agreement with conventional automated perimetry and provides a benchmark for future iterations.

  2. Measuring Human Performance in Simulated Nuclear Power Plant Control Rooms Using Eye Tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kovesdi, Casey Robert; Rice, Brandon Charles; Bower, Gordon Ross

    Control room modernization will be an important part of life extension for the existing light water reactor fleet. As part of modernization efforts, personnel will need to gain a full understanding of how control room technologies affect the performance of human operators. Recent advances in technology enable the use of eye tracking to continuously measure an operator's eye movements, which correlate with a variety of human performance constructs such as situation awareness and workload. This report describes eye tracking metrics in the context of how they will be used in nuclear power plant control room simulator studies.

  3. Eye gaze tracking for endoscopic camera positioning: an application of a hardware/software interface developed to automate Aesop.

    PubMed

    Ali, S M; Reisner, L A; King, B; Cao, A; Auner, G; Klein, M; Pandya, A K

    2008-01-01

    A redesigned motion control system for the medical robot Aesop allows automating and programming its movements. An IR eye tracking system has been integrated with this control interface to implement an intelligent, autonomous eye gaze-based laparoscopic positioning system. A laparoscopic camera held by Aesop can be moved based on the data from the eye tracking interface to keep the user's gaze point region at the center of a video feedback monitor. This system setup provides autonomous camera control that works around the surgeon, providing an optimal robotic camera platform.

  4. 21 CFR 886.4750 - Ophthalmic eye shield.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Ophthalmic eye shield. 886.4750 Section 886.4750...) MEDICAL DEVICES OPHTHALMIC DEVICES Surgical Devices § 886.4750 Ophthalmic eye shield. (a) Identification. An ophthalmic eye shield is a device that consists of a plastic or aluminum eye covering intended to...

  5. 21 CFR 886.4750 - Ophthalmic eye shield.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Ophthalmic eye shield. 886.4750 Section 886.4750...) MEDICAL DEVICES OPHTHALMIC DEVICES Surgical Devices § 886.4750 Ophthalmic eye shield. (a) Identification. An ophthalmic eye shield is a device that consists of a plastic or aluminum eye covering intended to...

  6. Quantifying Pilot Visual Attention in Low Visibility Terminal Operations

    NASA Technical Reports Server (NTRS)

    Ellis, Kyle K.; Arthur, J. J.; Latorella, Kara A.; Kramer, Lynda J.; Shelton, Kevin J.; Norman, Robert M.; Prinzel, Lawrence J.

    2012-01-01

    Quantifying pilot visual behavior allows researchers to determine not only where a pilot is looking and when, but also, when these data are coupled with flight technical performance, to track specific behaviors. Remote eye tracking systems have been integrated into simulators at NASA Langley with effectively no impact on the pilot environment. This paper discusses the installation and use of a remote eye tracking system. The data collection techniques from a complex human-in-the-loop (HITL) research experiment are discussed, in particular the data reduction algorithms and logic used to transform raw eye tracking data into quantified visual behavior metrics and the analysis methods used to interpret visual behavior. The findings suggest superior performance for Head-Up Display (HUD) and improved attentional behavior for Head-Down Display (HDD) implementations of Synthetic Vision System (SVS) technologies for low visibility terminal area operations. Keywords: eye tracking, flight deck, NextGen, human machine interface, aviation
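
    As a rough illustration of the kind of data reduction described above, the sketch below aggregates raw gaze samples into per-area-of-interest dwell times (e.g. head-up versus head-down). The sample format, the rectangular areas of interest and the 60 Hz sampling rate are assumptions made for the example, not the actual NASA Langley pipeline.

    ```python
    # Hypothetical reduction of raw gaze samples to per-AOI dwell time.
    from collections import defaultdict

    def dwell_time_by_aoi(samples, aois, sample_period_s=1.0 / 60):
        """samples: iterable of (x, y) gaze points in screen pixels;
        aois: dict mapping AOI name -> (x0, y0, x1, y1) rectangle."""
        dwell = defaultdict(float)
        for x, y in samples:
            for name, (x0, y0, x1, y1) in aois.items():
                if x0 <= x <= x1 and y0 <= y <= y1:
                    dwell[name] += sample_period_s  # one sample's worth of time
                    break
        return dict(dwell)

    # Example AOIs (hypothetical pixel coordinates): head-up vs. head-down display
    aois = {"HUD": (0, 0, 800, 300), "HDD": (0, 301, 800, 600)}
    ```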

  7. A geometric method for computing ocular kinematics and classifying gaze events using monocular remote eye tracking in a robotic environment.

    PubMed

    Singh, Tarkeshwar; Perry, Christopher M; Herter, Troy M

    2016-01-26

    Robotic and virtual-reality systems offer tremendous potential for improving assessment and rehabilitation of neurological disorders affecting the upper extremity. A key feature of these systems is that visual stimuli are often presented within the same workspace as the hands (i.e., peripersonal space). Integrating video-based remote eye tracking with robotic and virtual-reality systems can provide an additional tool for investigating how cognitive processes influence visuomotor learning and rehabilitation of the upper extremity. However, remote eye tracking systems typically compute ocular kinematics by assuming eye movements are made in a plane with constant depth (e.g. frontal plane). When visual stimuli are presented at variable depths (e.g. transverse plane), eye movements have a vergence component that may influence reliable detection of gaze events (fixations, smooth pursuits and saccades). To our knowledge, there are no available methods to classify gaze events in the transverse plane for monocular remote eye tracking systems. Here we present a geometrical method to compute ocular kinematics from a monocular remote eye tracking system when visual stimuli are presented in the transverse plane. We then use the obtained kinematics to compute velocity-based thresholds that allow us to accurately identify onsets and offsets of fixations, saccades and smooth pursuits. Finally, we validate our algorithm by comparing the gaze events computed by the algorithm with those obtained from the eye-tracking software and manual digitization. Within the transverse plane, our algorithm reliably differentiates saccades from fixations (static visual stimuli) and smooth pursuits from saccades and fixations when visual stimuli are dynamic. The proposed methods provide advancements for examining eye movements in robotic and virtual-reality systems. Our methods can also be used with other video-based or tablet-based systems in which eye movements are performed in a peripersonal plane with variable depth.
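
    The published method corrects the ocular kinematics for vergence before classifying events; the minimal sketch below shows only the generic velocity-threshold classification stage that such methods end with. The threshold values are chosen purely for illustration and are not taken from the paper.

    ```python
    # Generic velocity-threshold labelling of gaze samples (not the authors' algorithm).
    import numpy as np

    def classify_gaze(angles_deg, t_s, saccade_thresh=60.0, pursuit_thresh=5.0):
        """angles_deg: (N,) gaze direction in degrees; t_s: (N,) timestamps in seconds.
        Returns one label per velocity sample: 'saccade', 'pursuit' or 'fixation'."""
        velocity = np.abs(np.diff(angles_deg) / np.diff(t_s))  # angular speed, deg/s
        return np.where(velocity > saccade_thresh, "saccade",
               np.where(velocity > pursuit_thresh, "pursuit", "fixation"))
    ```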

  8. The Potential Influence of “Stimulus Overselectivity” in AAC: Information from Eye-tracking and Behavioral Studies of Attention

    PubMed Central

    Dube, William V.; Wilkinson, Krista M.

    2014-01-01

    This paper examines the phenomenon of “stimulus overselectivity” or “overselective attention” as it may impact AAC training and use in individuals with intellectual disabilities. Stimulus overselectivity is defined as an atypical limitation in the number of stimuli or stimulus features within an image that are attended to and subsequently learned. Within AAC, the term “stimulus” could refer to symbols or line drawings on speech generating devices, drawings or pictures on low-technology systems, and/or the elements within visual scene displays. In this context, overselective attention may result in unusual or uneven error patterns such as confusion between two symbols that share a single feature or difficulties with transitioning between different types of hardware. We review some of the ways that overselective attention has been studied behaviorally. We then examine how eye tracking technology allows a glimpse into some of the behavioral characteristics of overselective attention. We describe an intervention approach, differential observing responses, that may reduce or eliminate overselectivity, and we consider this type of intervention as it relates to issues of relevance for AAC. PMID:24773053

  9. Tracking Students' Eye-Movements When Reading Learning Objects on Mobile Phones: A Discourse Analysis of Luganda Language Teacher-Trainees' Reflective Observations

    ERIC Educational Resources Information Center

    Kabugo, David; Muyinda, Paul B.; Masagazi, Fred. M.; Mugagga, Anthony M.; Mulumba, Mathias B.

    2016-01-01

    Although eye-tracking technologies such as Tobii-T120/TX and Eye-Tribe are steadily becoming ubiquitous, and while their appropriation in education can aid teachers to collect robust information on how students move their eyes when reading and engaging with different learning objects, many teachers of Luganda language are yet to gain experiences…

  10. Visual attention on a respiratory function monitor during simulated neonatal resuscitation: an eye-tracking study.

    PubMed

    Katz, Trixie A; Weinberg, Danielle D; Fishman, Claire E; Nadkarni, Vinay; Tremoulet, Patrice; Te Pas, Arjan B; Sarcevic, Aleksandra; Foglia, Elizabeth E

    2018-06-14

    A respiratory function monitor (RFM) may improve positive pressure ventilation (PPV) technique, but many providers do not use RFM data appropriately during delivery room resuscitation. We sought to use eye-tracking technology to identify RFM parameters that neonatal providers view most commonly during simulated PPV. Mixed methods study. Neonatal providers performed RFM-guided PPV on a neonatal manikin while wearing eye-tracking glasses to quantify visual attention on displayed RFM parameters (i.e., exhaled tidal volume, flow, leak). Participants subsequently provided qualitative feedback on the eye-tracking glasses. Level 3 academic neonatal intensive care unit. Twenty neonatal resuscitation providers. Visual attention: overall gaze sample percentage; total gaze duration, visit count and average visit duration for each displayed RFM parameter. Qualitative feedback: willingness to wear eye-tracking glasses during clinical resuscitation. Twenty providers participated in this study. The mean gaze sample captured was 93% (SD 4%). Exhaled tidal volume waveform was the RFM parameter with the highest total gaze duration (median 23%, IQR 13-51%), highest visit count (median 5.17 per 10 s, IQR 2.82-6.16) and longest visit duration (median 0.48 s, IQR 0.38-0.81 s). All participants were willing to wear the glasses during clinical resuscitation. Wearable eye-tracking technology is feasible to identify gaze fixation on the RFM display and is well accepted by providers. Neonatal providers look at exhaled tidal volume more than any other RFM parameter. Future applications of eye-tracking technology include use during clinical resuscitation. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  11. USSR and Eastern Europe Scientific Abstracts, Cybernetics, Computers, and Automation Technology, Number 27

    DTIC Science & Technology

    1977-05-10

    apply this method of forecasting in the solution of all major scientific-technical problems of the national economy. Citing the slow... the future, however, computers will "mature" and learn to recognize patterns in what amounts to a much more complex language, the language of visual... images. Photoelectronic tracking devices or "eyes" will allow the computer to take in information in a much more complex form and to perform opera...

  12. Mobile Eye Tracking Reveals Little Evidence for Age Differences in Attentional Selection for Mood Regulation

    PubMed Central

    Isaacowitz, Derek M.; Livingstone, Kimberly M.; Harris, Julia A.; Marcotte, Stacy L.

    2014-01-01

    We report two studies representing the first use of mobile eye tracking to study emotion regulation across adulthood. Past research on age differences in attentional deployment using stationary eye tracking has found older adults show relatively more positive looking, and seem to benefit more mood-wise from this looking pattern, compared to younger adults. However, these past studies have greatly constrained the stimuli participants can look at, despite real-world settings providing numerous possibilities for what to choose to look at. We therefore used mobile eye tracking to study age differences in attentional selection, as indicated by fixation patterns to stimuli of different valence freely chosen by the participant. In contrast to stationary eye tracking studies of attentional deployment, Study 1 showed that younger and older individuals generally selected similar proportions of valenced stimuli, and attentional selection had similar effects on mood across age groups. Study 2 replicated this pattern with an adult lifespan sample including middle-aged individuals. Emotion regulation-relevant attention may thus differ depending on whether stimuli are freely chosen or not. PMID:25527965

  13. Binocular device for displaying numerical information in field of view

    NASA Technical Reports Server (NTRS)

    Fuller, H. V. (Inventor)

    1977-01-01

    An apparatus is described for superimposing numerical information on the field of view of binoculars. The invention has application in the flying of radio-controlled model airplanes. Information such as airspeed and angle of attack is sensed on a model airplane and transmitted back to earth, where it is converted into numerical form. Optical means attached to the binoculars that a pilot is using to track the model airplane display the numerical information in the field of view of the binoculars. The device includes means for focusing the numerical information at infinity, whereby the user of the binoculars can see both the field of view and the numerical information without refocusing his eyes.

  14. Designs and Algorithms to Map Eye Tracking Data with Dynamic Multielement Moving Objects.

    PubMed

    Kang, Ziho; Mandal, Saptarshi; Crutchfield, Jerry; Millan, Angel; McClung, Sarah N

    2016-01-01

    Design concepts and algorithms were developed to address the eye tracking analysis issues that arise when (1) participants interrogate dynamic multielement objects that can overlap on the display and (2) visual angle error of the eye trackers is incapable of providing exact eye fixation coordinates. These issues were addressed by (1) developing dynamic areas of interests (AOIs) in the form of either convex or rectangular shapes to represent the moving and shape-changing multielement objects, (2) introducing the concept of AOI gap tolerance (AGT) that controls the size of the AOIs to address the overlapping and visual angle error issues, and (3) finding a near optimal AGT value. The approach was tested in the context of air traffic control (ATC) operations where air traffic controller specialists (ATCSs) interrogated multiple moving aircraft on a radar display to detect and control the aircraft for the purpose of maintaining safe and expeditious air transportation. In addition, we show how eye tracking analysis results can differ based on how we define dynamic AOIs to determine eye fixations on moving objects. The results serve as a framework to more accurately analyze eye tracking data and to better support the analysis of human performance.
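
    A minimal sketch of the AOI gap tolerance (AGT) idea described above is given below, assuming rectangular AOIs and a nearest-centre rule for resolving overlaps; the parameter names and the tie-breaking rule are illustrative assumptions, not the authors' exact design.

    ```python
    # Hypothetical fixation-to-AOI assignment with an AOI gap tolerance (AGT) margin.
    import math

    def assign_fixation(fix, aoi_rects, agt):
        """fix: (x, y) fixation point; aoi_rects: dict id -> (x0, y0, x1, y1);
        agt: padding in pixels applied around each AOI."""
        fx, fy = fix
        hits = []
        for aid, (x0, y0, x1, y1) in aoi_rects.items():
            if (x0 - agt) <= fx <= (x1 + agt) and (y0 - agt) <= fy <= (y1 + agt):
                cx, cy = (x0 + x1) / 2.0, (y0 + y1) / 2.0
                hits.append((math.hypot(fx - cx, fy - cy), aid))
        return min(hits)[1] if hits else None  # nearest padded AOI, or unmapped
    ```

    Sweeping the agt value and checking how assignments change is one way to look for the near-optimal tolerance the authors describe.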

  15. Designs and Algorithms to Map Eye Tracking Data with Dynamic Multielement Moving Objects

    PubMed Central

    Mandal, Saptarshi

    2016-01-01

    Design concepts and algorithms were developed to address the eye tracking analysis issues that arise when (1) participants interrogate dynamic multielement objects that can overlap on the display and (2) visual angle error of the eye trackers is incapable of providing exact eye fixation coordinates. These issues were addressed by (1) developing dynamic areas of interests (AOIs) in the form of either convex or rectangular shapes to represent the moving and shape-changing multielement objects, (2) introducing the concept of AOI gap tolerance (AGT) that controls the size of the AOIs to address the overlapping and visual angle error issues, and (3) finding a near optimal AGT value. The approach was tested in the context of air traffic control (ATC) operations where air traffic controller specialists (ATCSs) interrogated multiple moving aircraft on a radar display to detect and control the aircraft for the purpose of maintaining safe and expeditious air transportation. In addition, we show how eye tracking analysis results can differ based on how we define dynamic AOIs to determine eye fixations on moving objects. The results serve as a framework to more accurately analyze eye tracking data and to better support the analysis of human performance. PMID:27725830

  16. [Electronic Device for Retinal and Iris Imaging].

    PubMed

    Drahanský, M; Kolář, R; Mňuk, T

    This paper describes the design and construction of a new device for automatic capture of the eye retina and iris. The device has two possible uses: either for biometric purposes (person recognition on the basis of eye characteristics) or for medical purposes as a supporting diagnostic device. Keywords: eye retina, eye iris, device, acquisition, image.

  17. Three-Dimensional Eye Tracking in a Surgical Scenario.

    PubMed

    Bogdanova, Rositsa; Boulanger, Pierre; Zheng, Bin

    2015-10-01

    Eye tracking has been widely used in studying the eye behavior of surgeons in the past decade. Most eye-tracking data are reported in a 2-dimensional (2D) fashion, and data for describing surgeons' behaviors on stereoperception are often missed. With the introduction of stereoscopes in laparoscopic procedures, there is an increasing need for studying the depth perception of surgeons under 3D image-guided surgery. We developed a new algorithm for the computation of convergence points in stereovision by measuring surgeons' interpupillary distance, the distance to the view target, and the difference between gaze locations of the 2 eyes. To test the feasibility of our new algorithm, we recruited 10 individuals to watch stereograms using binocular disparity and asked them to develop stereoperception using a cross-eyed viewing technique. Participants' eye motions were recorded by the Tobii eye tracker while they performed the trials. Convergence points between normal and stereo-viewing conditions were computed using the developed algorithm. All 10 participants were able to develop stereovision after a short period of training. During stereovision, participants' eye convergence points were 14 ± 1 cm in front of their eyes, which was significantly closer than the convergence points under the normal viewing condition (77 ± 20 cm). By applying our method of calculating convergence points using eye tracking, we were able to elicit the eye movement patterns of human operators between the normal and stereovision conditions. Knowledge from this study can be applied to the design of surgical visual systems, with the goal of improving surgical performance and patient safety. © The Author(s) 2015.
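
    The abstract lists the quantities the calculation uses (interpupillary distance, distance to the viewed target, and the separation between the two eyes' gaze points); a similar-triangles sketch that is consistent with those quantities is shown below. The formula and names are an illustration of the geometry, not the authors' published algorithm.

    ```python
    # Similar-triangles estimate of the binocular convergence distance (illustrative).
    def convergence_distance(ipd_cm, screen_dist_cm, gaze_separation_cm):
        """ipd_cm: interpupillary distance; screen_dist_cm: eye-to-display distance;
        gaze_separation_cm: horizontal separation of the two eyes' gaze points on the
        display when the gaze lines cross in front of the screen."""
        return screen_dist_cm * ipd_cm / (ipd_cm + gaze_separation_cm)

    # e.g. an IPD of 6.3 cm, a display at 77 cm and gaze points crossed by ~28 cm
    # give a convergence point roughly 14 cm in front of the eyes, in line with
    # the values reported above.
    print(convergence_distance(6.3, 77.0, 28.0))  # ~14.1
    ```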

  18. Out of the Corner of My Eye: Foveal Semantic Load Modulates Parafoveal Processing in Reading.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Payne, Brennan R.; Stites, Mallory C.; Federmeier, Kara D.

    In two experiments, we examined the impact of foveal semantic expectancy and congruity on parafoveal word processing during reading. Experiment 1 utilized an eye-tracking gaze-contingent display change paradigm, and Experiment 2 measured event-related brain potentials (ERP) in a modified RSVP paradigm to track the time-course of foveal semantic influences on covert attentional allocation to parafoveal word processing. Furthermore, eye-tracking and ERP data converged to reveal graded effects of semantic foveal load on parafoveal processing.

  19. Out of the Corner of My Eye: Foveal Semantic Load Modulates Parafoveal Processing in Reading.

    DOE PAGES

    Payne, Brennan R.; Stites, Mallory C.; Federmeier, Kara D.

    2016-07-18

    In two experiments, we examined the impact of foveal semantic expectancy and congruity on parafoveal word processing during reading. Experiment 1 utilized an eye-tracking gaze-contingent display change paradigm, and Experiment 2 measured event-related brain potentials (ERP) in a modified RSVP paradigm to track the time-course of foveal semantic influences on covert attentional allocation to parafoveal word processing. Furthermore, eye-tracking and ERP data converged to reveal graded effects of semantic foveal load on parafoveal processing.

  20. Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI

    DTIC Science & Technology

    2016-04-01

    and we anticipate the initiation of the neuroimaging portion of the study early in Year 3. The fMRI task has been completed and is in beta testing...neurocognitive test battery, and self-report measures of cognitive efficacy. We will also include functional magnetic resonance imaging (fMRI) and... fMRI and DTI will provide an objective basis for cross-validating the EEG and eye tracking system. Both the EEG and eye tracking data will be

  1. Processing Control Information in a Nominal Control Construction: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Kwon, Nayoung; Sturt, Patrick

    2016-01-01

    In an eye-tracking experiment, we examined the processing of the nominal control construction. Participants' eye-movements were monitored while they read sentences that included either giver control nominals (e.g. "promise" in "Luke's promise to Sophia to photograph himself") or recipient control nominals (e.g. "plea"…

  2. NMR Spectra through the Eyes of a Student: Eye Tracking Applied to NMR Items

    ERIC Educational Resources Information Center

    Topczewski, Joseph J.; Topczewski, Anna M.; Tang, Hui; Kendhammer, Lisa K.; Pienta, Norbert J.

    2017-01-01

    Nuclear magnetic resonance spectroscopy (NMR) plays a key role in introductory organic chemistry, spanning theory, concepts, and experimentation. Therefore, it is imperative that the instruction methods for NMR are both efficient and effective. By utilizing eye tracking equipment, the researchers were able to monitor how second-semester organic…

  3. Optimizations and Applications in Head-Mounted Video-Based Eye Tracking

    ERIC Educational Resources Information Center

    Li, Feng

    2011-01-01

    Video-based eye tracking techniques have become increasingly attractive in many research fields, such as visual perception and human-computer interface design. The technique primarily relies on the positional difference between the center of the eye's pupil and the first-surface reflection at the cornea, the corneal reflection (CR). This…

  4. Compensating For Movement Of Eye In Laser Surgery

    NASA Technical Reports Server (NTRS)

    Juday, Richard D.

    1991-01-01

    Conceptual system for laser surgery of retina includes subsystem that tracks position of retina. Tracking signal used to control galvanometer-driven mirrors keeping laser aimed at desired spot on retina as eye moves. Alternatively or additionally, indication of position used to prevent firing of laser when eye moved too far from proper aiming position.

  5. Oculomatic: High speed, reliable, and accurate open-source eye tracking for humans and non-human primates.

    PubMed

    Zimmermann, Jan; Vazquez, Yuriria; Glimcher, Paul W; Pesaran, Bijan; Louie, Kenway

    2016-09-01

    Video-based noninvasive eye trackers are an extremely useful tool for many areas of research. Many open-source eye trackers are available, but current open-source systems are not designed to track eye movements with the temporal resolution required to investigate the mechanisms of oculomotor behavior. Commercial systems are available but employ closed-source hardware and software and are relatively expensive, limiting widespread use. Here we present Oculomatic, an open-source software and modular hardware solution to eye tracking for use in humans and non-human primates. Oculomatic features high temporal resolution (up to 600 Hz), real-time eye tracking with high spatial accuracy (<0.5°), and low system latency (∼1.8 ms, 0.32 ms STD) at relatively low cost. Oculomatic compares favorably to our existing scleral search-coil system while being fully noninvasive. We propose that Oculomatic can support a wide range of research into the properties and neural mechanisms of oculomotor behavior. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. A novel device for head gesture measurement system in combination with eye-controlled human machine interface

    NASA Astrophysics Data System (ADS)

    Lin, Chern-Sheng; Ho, Chien-Wa; Chang, Kai-Chieh; Hung, San-Shan; Shei, Hung-Jung; Yeh, Mau-Shiun

    2006-06-01

    This study describes the design and combination of an eye-controlled and a head-controlled human-machine interface system. The system is a highly effective human-machine interface that detects head movement from the changing positions and number of light sources mounted on the head. When the user browses a computer screen through the head-mounted display, the system captures images of the user's eyes with CCD cameras, which can also measure the angle and position of the light sources. In the eye-tracking system, a computer program locates the center point of each pupil in the images and records the gaze trace and pupil diameter. In the head gesture measurement system, the user wears a double-source eyeglass frame, and the system captures images of the user's head with a CCD camera placed in front of the user. The program locates the center point of the head and transforms it to screen coordinates, so that the user can control the cursor by head motion. We combine the eye-controlled and head-controlled human-machine interface systems for virtual reality applications.
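
    A minimal sketch of the dark-pupil centre detection that such an eye-tracking stage relies on is shown below, assuming a grayscale eye image with a single dark blob; the threshold value is an illustrative assumption.

    ```python
    # Hypothetical dark-pupil centre detection: threshold dark pixels, take the centroid.
    import numpy as np

    def pupil_center(gray_image, dark_threshold=40):
        """gray_image: 2D uint8 array. Returns (row, col) centroid of dark pixels, or None."""
        ys, xs = np.nonzero(gray_image < dark_threshold)
        if ys.size == 0:
            return None  # no sufficiently dark region found
        return float(ys.mean()), float(xs.mean())
    ```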

  7. The effects of using a portable music player on simulated driving performance and task-sharing strategies.

    PubMed

    Young, Kristie L; Mitsopoulos-Rubens, Eve; Rudin-Brown, Christina M; Lenné, Michael G

    2012-07-01

    This study examined the effects of performing scrollable music selection tasks using a portable music player (iPod Touch™) on simulated driving performance and task-sharing strategies, as evidenced through eye glance behaviour and secondary task performance. A total of 37 drivers (18-48 yrs) completed the PC-based MUARC Driver Distraction Test (DDT) while performing music selection tasks on an iPod Touch. Drivers' eye glance behaviour was examined using faceLAB eye tracking equipment. Results revealed that performing music search tasks while driving increased the amount of time that drivers spent with their eyes off the roadway and decreased their ability to maintain a constant lane position and time headway from a lead vehicle. There was also evidence, however, that drivers attempted to regulate their behaviour when distracted by decreasing their speed and taking a large number of short glances towards the device. Overall, results suggest that performing music search tasks while driving is problematic and steps to prohibit this activity should be taken. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  8. [A tracking function of human eye in microgravity and during readaptation to earth's gravity].

    PubMed

    Kornilova, L N

    2001-01-01

    The paper summarizes results of electro-oculography of all modes of visual tracking: fixational eye movements (saccades), smooth pursuit of linearly, pendulum-like and circularly moving point stimuli, and pursuit of vertically moving foveoretinal optokinetic stimuli; it also presents values of thresholds and amplification coefficients of the optokinetic nystagmus during tracking of linear movement of foveoretinal optokinetic stimuli. Investigations were performed aboard the Salyut and Mir space stations with the participation of 31 cosmonauts, of whom 27 made long-term (76 to 438 days) and 4 made short-term (7 to 9 days) missions. It was shown that in space flight the saccadic structure within the tracking reaction does not change; yet corrective movements (additional microsaccades to achieve tracking) appeared in 47% of observations at the onset and in 76% of observations in months 3 to 6 of space flight. After landing, the structure of vertical saccades was found to be altered in half the cosmonauts. Both in and after flight, reverse nystagmus was present along with gaze nystagmus during static saccades in 22% of the observations (7 cosmonauts). The amplitude of tracking vertically, diagonally or circularly moving stimuli was significantly reduced as time on mission increased. Early in flight (40% of the cosmonauts) and shortly afterwards (21% of the cosmonauts) the structure of the smooth tracking reaction broke up entirely, that is, the eye followed the stimulus with micro- or macrosaccades. The structure of smooth eye tracking recovered on flight days 6-8 and on postflight days 3-4. However, in 46% of the cosmonauts on long-term missions the structure of smooth eye tracking was periodically disturbed, i.e. smooth tracking was replaced by saccadic tracking.

  9. Micro-video display with ocular tracking and interactive voice control

    NASA Technical Reports Server (NTRS)

    Miller, James E.

    1993-01-01

    In certain space-restricted environments, many of the benefits resulting from computer technology have been foregone because of the size, weight, inconvenience, and lack of mobility associated with existing computer interface devices. Accordingly, an effort to develop a highly miniaturized and 'wearable' computer display and control interface device, referred to as the Sensory Integrated Data Interface (SIDI), is underway. The system incorporates a micro-video display that provides data display and ocular tracking on a lightweight headset. Software commands are implemented by conjunctive eye movement and voice commands of the operator. In this initial prototyping effort, various 'off-the-shelf' components have been integrated with a desktop computer and a customized menu-tree software application to demonstrate feasibility and conceptual capabilities. When fully developed as a customized system, the interface device will allow mobile, 'hands-free' operation of portable computer equipment. It will thus allow integration of information technology applications into those restrictive environments, both military and industrial, that have not yet taken advantage of the computer revolution. This effort is Phase 1 of Small Business Innovation Research (SBIR) Topic number N90-331 sponsored by the Naval Undersea Warfare Center Division, Newport. The prime contractor is Foster-Miller, Inc. of Waltham, MA.

  10. Temperature profiles of patient-applied eyelid warming therapies.

    PubMed

    Wang, Michael T M; Gokul, Akilesh; Craig, Jennifer P

    2015-12-01

    To compare temperature profile characteristics (on and off eye) of two patient-applied heat therapies for meibomian gland dysfunction (MGD): an eye mask containing disposable warming units (EyeGiene(®)) and a microwave-heated flaxseed eye bag(®) (MGDRx EyeBag(®)). In vitro evaluation: surface temperature profiles of activated eye masks and heated eye bags(®) (both n=10), were tracked every 10s until return to ambient temperature. Heat-transfer assessment: outer and inner eyelid temperature profiles throughout the eye mask and eye bag(®) treatment application period (10min) were investigated in triplicate. The devices were applied for 12 different time intervals in a randomised order, with a cool-down period in between to ensure ocular temperatures returned to baseline. Temperature measurements were taken before and immediately after each application. In vitro evaluation: on profile, the eye bag(®) surface temperature peaked earlier (0±0 s vs. 100±20 s, p<0.001), cooled more slowly and displayed less variability than the eye mask (all p<0.05). Heat-transfer assessment: the eye bag(®) effected higher peak inner eyelid temperatures (38.1±0.4°C vs. 37.4±0.2°C, p=0.04), as well as larger inner eyelid temperature increases over the first 2 min, and between 9 and 10 min (all p<0.05). The eye bag(®) surface temperature profile displayed greater uniformity and slower cooling than the eye mask, and was demonstrated to be significantly more effective in raising ocular temperatures than the eye mask, both statistically and clinically. This has implications for MGD treatment, where the melting points of meibomian secretions are likely to be higher with increasing disease severity. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Prior Knowledge and Online Inquiry-Based Science Reading: Evidence from Eye Tracking

    ERIC Educational Resources Information Center

    Ho, Hsin Ning Jessie; Tsai, Meng-Jung; Wang, Ching-Yeh; Tsai, Chin-Chung

    2014-01-01

    This study employed eye-tracking technology to examine how students with different levels of prior knowledge process text and data diagrams when reading a web-based scientific report. Students' visual behaviors were tracked and recorded when they read a report demonstrating the relationship between the greenhouse effect and global climate…

  12. Like a rolling stone: naturalistic visual kinematics facilitate tracking eye movements.

    PubMed

    Souto, David; Kerzel, Dirk

    2013-02-06

    Newtonian physics constrains object kinematics in the real world. We asked whether eye movements towards tracked objects depend on their compliance with those constraints. In particular, the force of gravity constrains round objects to roll on the ground with a particular rotational and translational motion. We measured tracking eye movements towards rolling objects. We found that objects whose rotational and translational motion was congruent with an object rolling on the ground elicited faster tracking eye movements during pursuit initiation than incongruent stimuli. Relative to a condition without a rotational component, we essentially obtained benefits from congruence and, to a lesser extent, costs from incongruence. Anticipatory pursuit responses showed no congruence effect, suggesting that the effect is based on visually-driven predictions, not on velocity storage. We suggest that the eye movement system incorporates information about object kinematics acquired by a lifetime of experience with visual stimuli obeying the laws of Newtonian physics.

  13. Comparison of smooth pursuit and combined eye-head tracking in human subjects with deficient labyrinthine function

    NASA Technical Reports Server (NTRS)

    Leigh, R. J.; Thurston, S. E.; Sharpe, J. A.; Ranalli, P. J.; Hamid, M. A.

    1987-01-01

    The effects of deficient labyrinthine function on smooth visual tracking with the eyes and head were investigated, using ten patients with bilateral peripheral vestibular disease and ten normal controls. Active, combined eye-head tracking (EHT) was significantly better in patients than smooth pursuit with the eyes alone, whereas normal subjects pursued equally well in both cases. Compensatory eye movements during active head rotation in darkness were always less in patients than in normal subjects. These data were used to examine current hypotheses that postulate central cancellation of the vestibulo-ocular reflex (VOR) during EHT. A model that proposes summation of an integral smooth pursuit command and VOR/compensatory eye movements is consistent with the findings. Observation of passive EHT (visual fixation of a head-fixed target during en bloc rotation) appears to indicate that in this mode parametric gain changes contribute to modulation of the VOR.

  14. Experimental support that ocular tremor in Parkinson's disease does not originate from head movement.

    PubMed

    Gitchel, George T; Wetzel, Paul A; Qutubuddin, Abu; Baron, Mark S

    2014-07-01

    Our recent report of ocular tremor in Parkinson's disease (PD) has raised considerable controversy as to the origin of the tremor. Using an infrared based eye tracker and a magnetic head tracker, we reported that ocular tremor was recordable in PD subjects with no apparent head tremor. However, other investigators suggest that the ocular tremor may represent either transmitted appendicular tremor or subclinical head tremor inducing the vestibulo-ocular reflex (VOR). The present study aimed to further investigate the origin of ocular tremor in PD. Eye movements were recorded in 8 PD subjects both head free, and with full head restraint by means of a head holding device and a dental impression bite plate. Head movements were recorded independently using both a high sensitivity tri-axial accelerometer and a magnetic tracking system, each synchronized to the eye tracker. Ocular tremor was observed in all 8 PD subjects and was not influenced by head free and head fixed conditions. Both magnetic tracking and accelerometer recordings supported that the ocular tremor was fully independent of head position. The present study findings support our initial findings that ocular tremor is a fundamental feature of PD unrelated to head movements. Although the utility of ocular tremor for diagnostic purposes requires validation, current findings in large cohorts of PD subjects suggest its potential as a reliable clinical biomarker. Published by Elsevier Ltd.

  15. Effects of phencyclidine, secobarbital and diazepam on eye tracking in rhesus monkeys.

    PubMed

    Ando, K; Johanson, C E; Levy, D L; Yasillo, N J; Holzman, P S; Schuster, C R

    1983-01-01

    Rhesus monkeys were trained to track a moving disk using a procedure in which responses on a lever were reinforced with water delivery only when the disk, oscillating in a horizontal plane on a screen at a frequency of 0.4 Hz in a visual angle of 20 degrees, dimmed for a brief period. Pursuit eye movements were recorded by electrooculography (EOG). IM phencyclidine, secobarbital, and diazepam injections decreased the number of reinforced lever presses in a dose-related manner. Both secobarbital and diazepam produced episodic jerky-pursuit eye movements, while phencyclidine had no consistent effects on eye movements. Lever pressing was disrupted at doses which had little effect on the quality of smooth-pursuit eye movements in some monkeys. This separation was particularly pronounced with diazepam. The similarities of the drug effects on smooth-pursuit eye movements between the present study and human studies indicate that the present method using rhesus monkeys may be useful for predicting drug effects on eye tracking and oculomotor function in humans.

  16. Do the Eyes Have It? Using Eye Tracking to Assess Students Cognitive Dimensions

    ERIC Educational Resources Information Center

    Nisiforou, Efi A.; Laghos, Andrew

    2013-01-01

    Field dependence/independence (FD/FI) is a significant dimension of cognitive styles. The paper presents results of a study that seeks to identify individuals' level of field independence during visual stimulus tasks processing. Specifically, it examined the relationship between the Hidden Figure Test (HFT) scores and the eye tracking metrics.…

  17. Tracking the Eye Movement of Four Years Old Children Learning Chinese Words

    ERIC Educational Resources Information Center

    Lin, Dan; Chen, Guangyao; Liu, Yingyi; Liu, Jiaxin; Pan, Jue; Mo, Lei

    2018-01-01

    Storybook reading is the major source of literacy exposure for beginning readers. The present study tracked 4-year-old Chinese children's eye movements while they were reading simulated storybook pages. Their eye-movement patterns were examined in relation to their word learning gains. The same reading list, consisting of 20 two-character Chinese…

  18. The Role of Face Familiarity in Eye Tracking of Faces by Individuals with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Sterling, Lindsey; Dawson, Geraldine; Webb, Sara; Murias, Michael; Munson, Jeffrey; Panagiotides, Heracles; Aylward, Elizabeth

    2008-01-01

    It has been shown that individuals with autism spectrum disorders (ASD) demonstrate normal activation in the fusiform gyrus when viewing familiar, but not unfamiliar faces. The current study utilized eye tracking to investigate patterns of attention underlying familiar versus unfamiliar face processing in ASD. Eye movements of 18 typically…

  19. Eye-Tracking as a Measure of Responsiveness to Joint Attention in Infants at Risk for Autism

    ERIC Educational Resources Information Center

    Navab, Anahita; Gillespie-Lynch, Kristen; Johnson, Scott P.; Sigman, Marian; Hutman, Ted

    2012-01-01

    Reduced responsiveness to joint attention (RJA), as assessed by the Early Social Communication Scales (ESCS), is predictive of both subsequent language difficulties and autism diagnosis. Eye-tracking measurement of RJA is a promising prognostic tool because it is highly precise and standardized. However, the construct validity of eye-tracking…

  20. Reading Mathematics Representations: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Andrá, Chiara; Lindström, Paulina; Arzarello, Ferdinando; Holmqvist, Kenneth; Robutti, Ornella; Sabena, Cristina

    2015-01-01

    We use eye tracking as a method to examine how different mathematical representations of the same mathematical object are attended to by students. The results of this study show that there is a meaningful difference in the eye movements between formulas and graphs. This difference can be understood in terms of the cultural and social shaping of…

  1. A Critical Test of Temporal and Spatial Accuracy of the Tobii T60XL Eye Tracker

    ERIC Educational Resources Information Center

    Morgante, James D.; Zolfaghari, Rahman; Johnson, Scott P.

    2012-01-01

    Infant eye tracking is becoming increasingly popular for its presumed precision relative to traditional looking time paradigms and potential to yield new insights into developmental processes. However, there is strong reason to suspect that the temporal and spatial resolution of popular eye tracking systems is not entirely accurate, potentially…

  2. 21 CFR 886.1510 - Eye movement monitor.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Eye movement monitor. 886.1510 Section 886.1510...) MEDICAL DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1510 Eye movement monitor. (a) Identification. An eye movement monitor is an AC-powered device with an electrode intended to measure and record...

  3. 21 CFR 886.1510 - Eye movement monitor.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Eye movement monitor. 886.1510 Section 886.1510...) MEDICAL DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1510 Eye movement monitor. (a) Identification. An eye movement monitor is an AC-powered device with an electrode intended to measure and record...

  4. Eyes wide shut: implied social presence, eye tracking and attention.

    PubMed

    Risko, Evan F; Kingstone, Alan

    2011-02-01

    People often behave differently when they know they are being watched. Here, we report the first investigation of whether such social presence effects also influence looking behavior--a popular measure of attention allocation. We demonstrate that wearing an eye tracker, an implied social presence, leads individuals to avoid looking at particular stimuli. These results demonstrate that an implied social presence, here an eye tracker, can alter looking behavior. These data provide a new manipulation of social attention, as well as presenting a methodological challenge to researchers using eye tracking.

  5. Assessing cognitive functioning in females with Rett syndrome by eye-tracking methodology.

    PubMed

    Ahonniska-Assa, Jaana; Polack, Orli; Saraf, Einat; Wine, Judy; Silberg, Tamar; Nissenkorn, Andreea; Ben-Zeev, Bruria

    2018-01-01

    While many individuals with severe developmental impairments learn to communicate with augmentative and alternative communication (AAC) devices, a significant number of individuals show major difficulties in the effective use of AAC. Recent technological innovations, i.e., eye-tracking technology (ETT), aim to improve the transparency of communication and may also enable a more valid cognitive assessment. To investigate whether ETT in forced-choice tasks can enable children with very severe motor and speech impairments to respond consistently, allowing a more reliable evaluation of their language comprehension. Participants were 17 girls with Rett syndrome (M = 6:06 years). Their ability to respond by eye gaze was first practiced with computer games using ETT. Afterwards, their receptive vocabulary was assessed using the Peabody Picture Vocabulary Test-4 (PPVT-4). Target words were orally presented and participants responded by focusing their eyes on the preferred picture. Remarkable differences between the participants in receptive vocabulary were demonstrated using ETT. The verbal comprehension abilities of 32% of the participants ranged from low-average to mild cognitive impairment, and the other 68% of the participants showed moderate to severe impairment. Young age at the time of assessment was positively correlated with higher receptive vocabulary. The use of ETT seems to make the communicational signals of children with severe motor and communication impairments more easily understood. Early practice of ETT may improve the quality of communication and enable more reliable conclusions in learning and assessment sessions. Copyright © 2017 European Paediatric Neurology Society. Published by Elsevier Ltd. All rights reserved.

  6. Large-field-of-view wide-spectrum artificial reflecting superposition compound eyes

    NASA Astrophysics Data System (ADS)

    Huang, Chi-Chieh

    The study of the imaging principles of natural compound eyes has become an active area of research and has fueled the advancement of modern optics with many attractive design features beyond those available with conventional technologies. Most prominent among all compound eyes is the reflecting superposition compound eyes (RSCEs) found in some decapods. They are extraordinary imaging systems with numerous optical features such as minimum chromatic aberration, wide-angle field of view (FOV), high sensitivity to light and superb acuity to motion. Inspired by their remarkable visual system, we were able to implement the unique lens-free, reflection-based imaging mechanisms into a miniaturized, large-FOV optical imaging device operating at the wide visible spectrum to minimize chromatic aberration without any additional post-image processing. First, two micro-transfer printing methods, a multiple and a shear-assisted transfer printing technique, were studied and discussed to realize life-sized artificial RSCEs. The processes exploited the differential adhesive tendencies of the microstructures formed between a donor and a transfer substrate to accomplish an efficient release and transfer process. These techniques enabled conformal wrapping of three-dimensional (3-D) microstructures, initially fabricated in two-dimensional (2-D) layouts with standard fabrication technology onto a wide range of surfaces with complex and curvilinear shapes. Final part of this dissertation was focused on implementing the key operational features of the natural RSCEs into large-FOV, wide-spectrum artificial RSCEs as an optical imaging device suitable for the wide visible spectrum. Our devices can form real, clear images based on reflection rather than refraction, hence avoiding chromatic aberration due to dispersion by the optical materials. Compared to the performance of conventional refractive lenses of comparable size, our devices demonstrated minimum chromatic aberration, exceptional FOV up to 165o without distortion, modest spherical aberrations and comparable imaging quality without any post-image processing. Together with an augmenting cruciform pattern surrounding each focused image, our devices possessed enhanced, dynamic motion-tracking capability ideal for diverse applications in military, security, search and rescue, night navigation, medical imaging and astronomy. In the future, due to its reflection-based operating principles, it can be further extended into mid- and far-infrared for more demanding applications.

  7. Wide-angle camera with multichannel architecture using microlenses on a curved surface.

    PubMed

    Liang, Wei-Lun; Shen, Hui-Kai; Su, Guo-Dung J

    2014-06-10

    We propose a multichannel imaging system that combines the principles of an insect's compound eye and the human eye. The optical system enables a reduction in track length of the imaging device to achieve miniaturization. The multichannel structure is achieved by a curved microlens array, and a Hypergon lens is used as the main lens to simulate the human eye, achieving large field of view (FOV). With this architecture, each microlens of the array transmits a segment of the overall FOV. The partial images are recorded in separate channels and stitched together to form the final image of the whole FOV by image processing. The design is 2.7 mm thick, with 59 channels; the 100°×80° full FOV is optimized using ZEMAX ray-tracing software on an image plane. The image plane size is 4.53  mm×3.29  mm. Given the recent progress in the fabrication of microlenses, this image system has the potential to be commercialized in the near future.

  8. A full-parallax 3D display with restricted viewing zone tracking viewer's eye

    NASA Astrophysics Data System (ADS)

    Beppu, Naoto; Yendo, Tomohiro

    2015-03-01

    Three-dimensional (3D) vision has become widely known as a familiar imaging technique. 3D displays have been put into practical use in various fields, such as entertainment and medicine, and the development of 3D display technology will play an important role in a wide range of fields. There are various methods of displaying 3D images; we focused on one that displays 3D images by reproducing light rays. This method requires many viewpoint images to achieve full parallax, because it shows a different image depending on the viewpoint. We proposed to reduce wasted rays by limiting the projector's rays to the region around the viewer using a spinning mirror, thereby increasing the effectiveness of the display device and achieving a full-parallax 3D display. The proposed method uses viewer eye tracking, a high-speed projector, a rotating mirror that tracks the viewer (a spinning mirror), a concave mirror array whose elements have different vertical slopes and are arranged circumferentially, and a cylindrical mirror. In simulation, we confirmed the scanning range and the horizontal trajectory of the rays, as well as viewpoint switching and convergence performance in the vertical direction. We therefore confirmed that a full-parallax display can be realized.

  9. 21 CFR 886.3200 - Artificial eye.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Artificial eye. 886.3200 Section 886.3200 Food and... OPHTHALMIC DEVICES Prosthetic Devices § 886.3200 Artificial eye. (a) Identification. An artificial eye is a device resembling the anterior portion of the eye, usually made of glass or plastic, intended to be...

  10. 21 CFR 886.3200 - Artificial eye.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Artificial eye. 886.3200 Section 886.3200 Food and... OPHTHALMIC DEVICES Prosthetic Devices § 886.3200 Artificial eye. (a) Identification. An artificial eye is a device resembling the anterior portion of the eye, usually made of glass or plastic, intended to be...

  11. Surface ablation with iris recognition and dynamic rotational eye tracking-based tissue saving treatment with the Technolas 217z excimer laser.

    PubMed

    Prakash, Gaurav; Agarwal, Amar; Kumar, Dhivya Ashok; Jacob, Soosan; Agarwal, Athiya; Maity, Amrita

    2011-03-01

    To evaluate the visual and refractive outcomes and expected benefits of Tissue Saving Treatment algorithm-guided surface ablation with iris recognition and dynamic rotational eye tracking. This prospective, interventional case series comprised 122 eyes (70 patients). Pre- and postoperative assessment included uncorrected distance visual acuity (UDVA), corrected distance visual acuity (CDVA), refraction, and higher order aberrations. All patients underwent Tissue Saving Treatment algorithm-guided surface ablation with iris recognition and dynamic rotational eye tracking using the Technolas 217z 100-Hz excimer platform (Technolas Perfect Vision GmbH). Follow-up was performed up to 6 months postoperatively. Theoretical benefit analysis was performed to evaluate the algorithm's outcomes compared to others. Preoperative spherocylindrical power was sphere -3.62 ± 1.60 diopters (D) (range: 0 to -6.75 D), cylinder -1.15 ± 1.00 D (range: 0 to -3.50 D), and spherical equivalent -4.19 ± 1.60 D (range: -7.75 to -2.00 D). At 6 months, 91% (111/122) of eyes were within ± 0.50 D of attempted correction. Postoperative UDVA was comparable to preoperative CDVA at 1 month (P=.47) and progressively improved at 6 months (P=.004). Two eyes lost one line of CDVA at 6 months. Theoretical benefit analysis revealed that of 101 eyes with astigmatism, 29 would have had cyclotorsion-induced astigmatism of ≥ 10% if iris recognition and dynamic rotational eye tracking were not used. Furthermore, the mean percentage decrease in maximum depth of ablation by using the Tissue Saving Treatment was 11.8 ± 2.9% over Aspheric, 17.8 ± 6.2% over Personalized, and 18.2 ± 2.8% over Planoscan algorithms. Tissue saving surface ablation with iris recognition and dynamic rotational eye tracking was safe and effective in this series of eyes. Copyright 2011, SLACK Incorporated.

  12. The seam visual tracking method for large structures

    NASA Astrophysics Data System (ADS)

    Bi, Qilin; Jiang, Xiaomin; Liu, Xiaoguang; Cheng, Taobo; Zhu, Yulong

    2017-10-01

    In this paper, a compact and flexible weld seam visual tracking method is proposed. First, because a fixed tracking height can cause interference between the vision device and the work-piece to be welded, a weld vision system with a compact structure and adjustable tracking height is developed. Second, by analyzing the relative spatial pose between the camera, the laser, and the work-piece, and applying the theory of relative geometric imaging, a mathematical model is established that relates image feature parameters to the three-dimensional trajectory of the assembly gap to be welded. Third, the imaging parameters of the line-structured light are optimized experimentally for the weld structure. Fourth, line-structured light scatters on bright metal areas, and surface scratches also appear bright in the image; these disturbances seriously reduce computational efficiency, so an algorithm based on the human visual attention mechanism is used to extract weld features efficiently and stably. Finally, experiments verify that the proposed compact and flexible weld tracking method achieves a tracking accuracy of 0.5 mm on large structural parts, indicating broad prospects for industrial application.
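
    As a rough illustration of the kind of camera-laser model the abstract describes (relating image features of the line-structured-light stripe to the 3D trajectory of the weld gap), the sketch below back-projects a stripe pixel through an assumed pinhole camera and intersects that ray with an assumed laser plane. The intrinsics and plane parameters are placeholders, not values from the paper.

```python
import numpy as np

def stripe_pixel_to_3d(u, v, K, plane_n, plane_d):
    """Back-project a laser-stripe pixel (u, v) through a pinhole camera with
    intrinsics K and intersect the viewing ray with the laser plane
    n . X = d (both expressed in the camera frame)."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])   # viewing ray direction, camera frame
    t = plane_d / (plane_n @ ray)                    # ray parameter at the plane
    return t * ray                                   # 3D point on the weld gap, camera frame

# Hypothetical numbers: 1000 px focal length, principal point (640, 360),
# laser plane 0.3 m in front of the camera, tilted about the x-axis.
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
plane_n = np.array([0.0, np.sin(np.radians(30)), np.cos(np.radians(30))])
plane_d = 0.3
print(stripe_pixel_to_3d(700, 400, K, plane_n, plane_d))
```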

  13. Dissociable Frontal Controls during Visible and Memory-guided Eye-Tracking of Moving Targets

    PubMed Central

    Ding, Jinhong; Powell, David; Jiang, Yang

    2009-01-01

    When tracking visible or occluded moving targets, several frontal regions including the frontal eye fields (FEF), dorsal-lateral prefrontal cortex (DLPFC), and Anterior Cingulate Cortex (ACC) are involved in smooth pursuit eye movements (SPEM). To investigate how these areas play different roles in predicting future locations of moving targets, twelve healthy college students participated in a smooth pursuit task of visual and occluded targets. Their eye movements and brain responses measured by event-related functional MRI were simultaneously recorded. Our results show that different visual cues resulted in time discrepancies between physical and estimated pursuit time only when the moving dot was occluded. Visible phase velocity gain was higher than that of occlusion phase. We found bilateral FEF association with eye-movement whether moving targets are visible or occluded. However, the DLPFC and ACC showed increased activity when tracking and predicting locations of occluded moving targets, and were suppressed during smooth pursuit of visible targets. When visual cues were increasingly available, less activation in the DLPFC and the ACC was observed. Additionally, there was a significant hemisphere effect in DLPFC, where right DLPFC showed significantly increased responses over left when pursuing occluded moving targets. Correlation results revealed that DLPFC, the right DLPFC in particular, communicates more with FEF during tracking of occluded moving targets (from memory). The ACC modulates FEF more during tracking of visible targets (likely related to visual attention). Our results suggest that DLPFC and ACC modulate FEF and cortical networks differentially during visible and memory-guided eye tracking of moving targets. PMID:19434603

  14. Remote gaze tracking system for 3D environments.

    PubMed

    Congcong Liu; Herrup, Karl; Shi, Bertram E

    2017-07-01

    Eye tracking systems are typically divided into two categories: remote and mobile. Remote systems, where the eye tracker is located near the object being viewed by the subject, have the advantage of being less intrusive, but are typically used for tracking gaze points on fixed two-dimensional (2D) computer screens. Mobile systems such as eye tracking glasses, where the eye tracker is attached to the subject, are more intrusive, but are better suited for cases where subjects are viewing objects in the three-dimensional (3D) environment. In this paper, we describe how remote gaze tracking systems developed for 2D computer screens can be used to track gaze points in a 3D environment. The system is non-intrusive. It compensates for small head movements by the user, so that the head need not be stabilized by a chin rest or bite bar. The system maps the 3D gaze points of the user onto 2D images from a scene camera and is also located remotely from the subject. Measurement results from this system indicate that it is able to estimate gaze points in the scene camera to within one degree over a wide range of head positions.

  15. Measuring vigilance decrement using computer vision assisted eye tracking in dynamic naturalistic environments.

    PubMed

    Bodala, Indu P; Abbasi, Nida I; Yu Sun; Bezerianos, Anastasios; Al-Nashash, Hasan; Thakor, Nitish V

    2017-07-01

    Eye tracking offers a practical solution for monitoring cognitive performance in real-world tasks. However, eye tracking in dynamic environments is difficult due to the high spatial and temporal variation of stimuli and needs further, thorough investigation. In this paper, we study the possibility of developing a novel computer-vision-assisted eye tracking analysis based on fixations. Eye movement data were obtained from a long-duration naturalistic driving experiment. The scale-invariant feature transform (SIFT) algorithm was implemented using the VLFeat toolbox to identify multiple areas of interest (AOIs). A new measure called the 'fixation score' was defined to capture the dynamics of fixation position between the target AOI and the non-target AOIs. The fixation score is maximal when subjects focus on the target AOI and diminishes when they gaze at non-target AOIs. A statistically significant negative correlation was found between fixation score and reaction time data (r = -0.2253, p < 0.05). This implies that with vigilance decrement, the fixation score decreases as visual attention shifts away from the target objects, resulting in increased reaction time.
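
    The paper's exact definition of the fixation score is not given in the abstract, so the sketch below uses a simple stand-in (the fraction of a trial's fixations landing on the target AOI) and correlates it with reaction times in the way the reported analysis does. The AOI labels, trial data, and scoring rule are hypothetical.

```python
import numpy as np
from scipy.stats import pearsonr

def fixation_score(fixation_aois, target_aoi):
    """Toy stand-in for the paper's 'fixation score': the fraction of a trial's
    fixations that land on the target AOI (1.0 = all on target, 0.0 = none).
    The paper's exact definition is not reproduced here."""
    fixation_aois = np.asarray(fixation_aois)
    return float(np.mean(fixation_aois == target_aoi))

# Hypothetical per-trial data: AOI label of each fixation, and reaction times (s).
trials = [
    ["road", "road", "mirror", "road"],
    ["road", "dashboard", "mirror", "mirror"],
    ["dashboard", "mirror", "mirror", "dashboard"],
]
reaction_times = [0.62, 0.81, 1.05]

scores = [fixation_score(t, target_aoi="road") for t in trials]
r, p = pearsonr(scores, reaction_times)
print(f"fixation scores={scores}, r={r:.3f}, p={p:.3f}")
```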

  16. Online Learners’ Reading Ability Detection Based on Eye-Tracking Sensors

    PubMed Central

    Zhan, Zehui; Zhang, Lei; Mei, Hu; Fong, Patrick S. W.

    2016-01-01

    The detection of university online learners' reading ability is generally problematic and time-consuming. Eye-tracking sensors were therefore employed in this study to record temporal and spatial human eye movements. Learners' pupils, blinks, fixations, saccades, and regressions are recognized as primary indicators for detecting reading ability. A computational model is established from the empirical eye-tracking data by applying a multi-feature regularization machine learning mechanism based on a low-rank constraint. The model shows good generalization ability, with an error of only 4.9% over 100 random runs. It has clear advantages in saving time and improving precision, with only 20 min of testing required to predict an individual learner's reading ability. PMID:27626418
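
    The paper's low-rank-constrained, multi-feature regularized model is not reproduced here; as a hedged stand-in, the sketch below fits an ordinary ridge regression on synthetic versions of the named eye-movement features and reports a cross-validated error with scikit-learn.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in data: one row per learner, columns are the indicator
# features named in the abstract (pupil size, blink rate, fixation duration,
# saccade length, regression count); target is a reading-ability score.
X = rng.normal(size=(100, 5))
y = X @ np.array([0.4, -0.2, 0.6, -0.3, -0.5]) + rng.normal(scale=0.3, size=100)

# Ridge regression is used here as a generic regularized substitute for the
# paper's low-rank-constrained model.
model = make_pipeline(StandardScaler(), Ridge(alpha=1.0))
scores = cross_val_score(model, X, y, cv=10, scoring="neg_mean_absolute_error")
print(f"mean absolute error: {-scores.mean():.3f}")
```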

  17. Human-like object tracking and gaze estimation with PKD android

    PubMed Central

    Wijayasinghe, Indika B.; Miller, Haylie L.; Das, Sumit K; Bugnariu, Nicoleta L.; Popa, Dan O.

    2018-01-01

    As the use of robots increases for tasks that require human-robot interactions, it is vital that robots exhibit and understand human-like cues for effective communication. In this paper, we describe the implementation of object tracking capability on Philip K. Dick (PKD) android and a gaze tracking algorithm, both of which further robot capabilities with regard to human communication. PKD's ability to track objects with human-like head postures is achieved with visual feedback from a Kinect system and an eye camera. The goal of object tracking with human-like gestures is twofold: to facilitate better human-robot interactions and to enable PKD as a human gaze emulator for future studies. The gaze tracking system employs a mobile eye tracking system (ETG; SensoMotoric Instruments) and a motion capture system (Cortex; Motion Analysis Corp.) for tracking the head orientations. Objects to be tracked are displayed by a virtual reality system, the Computer Assisted Rehabilitation Environment (CAREN; MotekForce Link). The gaze tracking algorithm converts eye tracking data and head orientations to gaze information facilitating two objectives: to evaluate the performance of the object tracking system for PKD and to use the gaze information to predict the intentions of the user, enabling the robot to understand physical cues by humans. PMID:29416193

  18. Human-like object tracking and gaze estimation with PKD android

    NASA Astrophysics Data System (ADS)

    Wijayasinghe, Indika B.; Miller, Haylie L.; Das, Sumit K.; Bugnariu, Nicoleta L.; Popa, Dan O.

    2016-05-01

    As the use of robots increases for tasks that require human-robot interactions, it is vital that robots exhibit and understand human-like cues for effective communication. In this paper, we describe the implementation of object tracking capability on Philip K. Dick (PKD) android and a gaze tracking algorithm, both of which further robot capabilities with regard to human communication. PKD's ability to track objects with human-like head postures is achieved with visual feedback from a Kinect system and an eye camera. The goal of object tracking with human-like gestures is twofold: to facilitate better human-robot interactions and to enable PKD as a human gaze emulator for future studies. The gaze tracking system employs a mobile eye tracking system (ETG; SensoMotoric Instruments) and a motion capture system (Cortex; Motion Analysis Corp.) for tracking the head orientations. Objects to be tracked are displayed by a virtual reality system, the Computer Assisted Rehabilitation Environment (CAREN; MotekForce Link). The gaze tracking algorithm converts eye tracking data and head orientations to gaze information facilitating two objectives: to evaluate the performance of the object tracking system for PKD and to use the gaze information to predict the intentions of the user, enabling the robot to understand physical cues by humans.
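
    The conversion described in the last sentence, combining eye-in-head gaze from the ETG with head orientation from motion capture to obtain gaze in world coordinates, reduces to a rotation plus a translation. The sketch below is a minimal version of that transform under our own conventions, not the authors' implementation; the pose values are made up.

```python
import numpy as np

def gaze_in_world(head_R, head_t, gaze_dir_eye):
    """Combine a head pose from motion capture (rotation head_R, position head_t,
    world frame) with a unit gaze direction from the eye tracker expressed in the
    head frame, returning the gaze ray (origin, direction) in world coordinates."""
    direction = head_R @ (gaze_dir_eye / np.linalg.norm(gaze_dir_eye))
    return head_t, direction

# Hypothetical pose: head yawed 45 degrees to the left, eyes looking straight ahead.
yaw = np.radians(45)
head_R = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
origin, direction = gaze_in_world(head_R, np.array([0.0, 0.0, 1.6]), np.array([1.0, 0.0, 0.0]))
print(origin, direction)
```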

  19. Exploring Eye Movements of Experienced and Novice Readers of Medical Texts Concerning the Cardiovascular System in Making a Diagnosis

    ERIC Educational Resources Information Center

    Vilppu, Henna; Mikkilä-Erdmann, Mirjamaija; Södervik, Ilona; Österholm-Matikainen, Erika

    2017-01-01

    This study used the eye-tracking method to explore how the level of expertise influences reading, and solving, two written patient cases on cardiac failure and pulmonary embolus. Eye-tracking is a fairly commonly used method in medical education research, but it has been primarily applied to studies analyzing the processing of visualizations, such…

  20. Web Usability or Accessibility: Comparisons between People with and without Intellectual Disabilities in Viewing Complex Naturalistic Scenes Using Eye-Tracking Technology

    ERIC Educational Resources Information Center

    Bazar, Nancy Sceery

    2009-01-01

    The purpose of this primarily quantitative study was to compare how young adults with and without intellectual disabilities examine different types of images. Two experiments were conducted. The first, a replication and extension of a classic eye-tracking study (Yarbus, 1967), generated eye gaze patterns and data in response to questions related…

  1. An Eye for Words: Gauging the Role of Attention in Incidental L2 Vocabulary Acquisition by Means of Eye-Tracking

    ERIC Educational Resources Information Center

    Godfroid, Aline; Boers, Frank; Housen, Alex

    2013-01-01

    This eye-tracking study tests the hypothesis that more attention leads to more learning, following claims that attention to new language elements in the input results in their initial representation in long-term memory (i.e., intake; Robinson, 2003; Schmidt, 1990, 2001). Twenty-eight advanced learners of English read English texts that contained…

  2. Preliminary results of tracked photorefractive keratectomy (T-PRK) for mild to moderate myopia with the autonomous technologies excimer laser at Cedars-Sinai Medical Center

    NASA Astrophysics Data System (ADS)

    Maguen, Ezra I.; Salz, James J.; Nesburn, Anthony B.

    1997-05-01

    Preliminary results of the correction of myopia up to -7.00 D by tracked photorefractive keratectomy (T-PRK) with a scanning and tracking excimer laser by Autonomous Technologies are discussed. The study included 41 eyes (20 males), of which 28 eyes were evaluated one month postoperatively. On the day of epithelialization, mean uncorrected vision was 20/45.3. At one month postoperatively, 92.8% of eyes were 20/40 and 46.4% were 20/20; no eye was worse than 20/50. 75% of eyes were within +/- 0.5 D of emmetropia and 82% were within +/- 1.00 D of emmetropia. Eyes corrected for monovision were included. One eye lost 3 lines of best corrected vision and had more than 1.00 D of induced astigmatism due to a central corneal ulcer. Additional complications included symptomatic recurrent corneal erosions, which were controlled with topical hypertonic saline. T-PRK appears to allow effective correction of low to moderate myopia; further study will establish the safety and efficacy of the procedure.

  3. Can eye-tracking technology improve situational awareness in paramedic clinical education?

    PubMed

    Williams, Brett; Quested, Andrew; Cooper, Simon

    2013-01-01

    Human factors play a significant part in clinical error. Situational awareness (SA) means being aware of one's surroundings, comprehending the present situation, and being able to predict outcomes. It is a key human skill that, when properly applied, is associated with reducing medical error: eye-tracking technology can be used to provide an objective and qualitative measure of the initial perception component of SA. Feedback from eye-tracking technology can be used to improve the understanding and teaching of SA in clinical contexts, and consequently, has potential for reducing clinician error and the concomitant adverse events.

  4. Development of a novel visuomotor integration paradigm by integrating a virtual environment with mobile eye-tracking and motion-capture systems

    PubMed Central

    Miller, Haylie L.; Bugnariu, Nicoleta; Patterson, Rita M.; Wijayasinghe, Indika; Popa, Dan O.

    2018-01-01

    Visuomotor integration (VMI), the use of visual information to guide motor planning, execution, and modification, is necessary for a wide range of functional tasks. To comprehensively, quantitatively assess VMI, we developed a paradigm integrating virtual environments, motion-capture, and mobile eye-tracking. Virtual environments enable tasks to be repeatable, naturalistic, and varied in complexity. Mobile eye-tracking and minimally-restricted movement enable observation of natural strategies for interacting with the environment. This paradigm yields a rich dataset that may inform our understanding of VMI in typical and atypical development. PMID:29876370

  5. Eye Tracking Metrics for Workload Estimation in Flight Deck Operation

    NASA Technical Reports Server (NTRS)

    Ellis, Kyle; Schnell, Thomas

    2010-01-01

    Flight decks of the future are being enhanced through improved avionics that adapt to both aircraft and operator state. Eye tracking allows for non-invasive analysis of pilot eye movements, from which a set of metrics can be derived to effectively and reliably characterize workload. This research identifies eye tracking metrics that correlate with aircraft automation conditions, and identifies the correlation of pilot workload with the same automation conditions. Saccade length was used as an indirect index of pilot workload: pilots in the fully automated condition were observed to have, on average, larger saccadic movements than in the guidance and manual flight conditions. The data set also provides a general model of human eye movement behavior, and thus, ostensibly, of visual attention distribution in the cockpit during approach-to-land tasks with various levels of automation, by means of the same metrics used for workload algorithm development.
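
    Saccade length can be derived from raw gaze samples with a simple velocity-threshold segmentation. The sketch below is a generic I-VT-style illustration, not the metric pipeline used in the study; the sampling rate, velocity threshold, and toy gaze trace are assumptions.

```python
import numpy as np

def mean_saccade_amplitude(gaze_deg, fs=60.0, vel_threshold=30.0):
    """Segment gaze samples (N x 2 array, degrees of visual angle) into saccades
    with a simple velocity threshold (deg/s) and return the mean amplitude of
    the detected saccades in degrees. I-VT-style sketch, not a validated detector."""
    velocity = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1) * fs
    is_saccade = velocity > vel_threshold
    amplitudes, start = [], None
    for i, s in enumerate(is_saccade):
        if s and start is None:
            start = i
        elif not s and start is not None:
            amplitudes.append(np.linalg.norm(gaze_deg[i] - gaze_deg[start]))
            start = None
    if start is not None:
        amplitudes.append(np.linalg.norm(gaze_deg[-1] - gaze_deg[start]))
    return float(np.mean(amplitudes)) if amplitudes else 0.0

# Toy trace: fixation, a 6-degree saccade over two samples, another fixation.
trace = np.array([[0, 0], [0, 0], [0, 0], [3, 0], [6, 0], [6, 0], [6, 0]], dtype=float)
print(mean_saccade_amplitude(trace))
```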

  6. 21 CFR 872.2060 - Jaw tracking device.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...) MEDICAL DEVICES DENTAL DEVICES Diagnostic Devices § 872.2060 Jaw tracking device. (a) Jaw tracking device... Controls Guidance Document: Dental Sonography and Jaw Tracking Devices.” [68 FR 67367, Dec. 2, 2003] ...

  7. Accounting for direction and speed of eye motion in planning visually guided manual tracking.

    PubMed

    Leclercq, Guillaume; Blohm, Gunnar; Lefèvre, Philippe

    2013-10-01

    Accurate motor planning in a dynamic environment is a critical skill for humans because we are often required to react quickly and adequately to the visual motion of objects. Moreover, we are often in motion ourselves, and this complicates motor planning. Indeed, the retinal and spatial motions of an object are different because of the retinal motion component induced by self-motion. Many studies have investigated motion perception during smooth pursuit and concluded that eye velocity is partially taken into account by the brain. Here we investigate whether the eye velocity during ongoing smooth pursuit is taken into account for the planning of visually guided manual tracking. We had 10 human participants manually track a target while in steady-state smooth pursuit toward another target such that the difference between the retinal and spatial target motion directions could be large, depending on both the direction and the speed of the eye. We used a measure of initial arm movement direction to quantify whether motor planning occurred in retinal coordinates (not accounting for eye motion) or was spatially correct (incorporating eye velocity). Results showed that the eye velocity was nearly fully taken into account by the neuronal areas involved in the visuomotor velocity transformation (between 75% and 102%). In particular, these neuronal pathways accounted for the nonlinear effects due to the relative velocity between the target and the eye. In conclusion, the brain network transforming visual motion into a motor plan for manual tracking adequately uses extraretinal signals about eye velocity.
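
    One common way to formalize the quantity estimated here (our notation, not the authors'): the spatial velocity of the target is the sum of its retinal velocity and the eye's own velocity, and a compensation gain g expresses how much of the eye velocity enters the motor plan, with the reported 75-102% corresponding to g between 0.75 and 1.02.

```latex
\vec{v}_{\mathrm{spatial}} = \vec{v}_{\mathrm{retinal}} + \vec{v}_{\mathrm{eye}},
\qquad
\vec{v}_{\mathrm{plan}} = \vec{v}_{\mathrm{retinal}} + g\,\vec{v}_{\mathrm{eye}},
\quad 0.75 \le g \le 1.02
```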

  8. Long-range eye tracking: A feasibility study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jayaweera, S.K.; Lu, Shin-yee

    1994-08-24

    The design considerations for a long-range, Purkinje-effect-based video tracking system using current technology are presented. Past work, current experiments, and future directions are thoroughly discussed, with an emphasis on digital signal processing techniques and obstacles. It has been determined that while a robust, efficient, long-range, and non-invasive eye tracking system will be difficult to develop, such a project is indeed feasible.

  9. Using eye tracking technology to compare the effectiveness of malignant hyperthermia cognitive aid design.

    PubMed

    King, Roderick; Hanhan, Jaber; Harrison, T Kyle; Kou, Alex; Howard, Steven K; Borg, Lindsay K; Shum, Cynthia; Udani, Ankeet D; Mariano, Edward R

    2018-05-15

    Malignant hyperthermia is a rare but potentially fatal complication of anesthesia, and several different cognitive aids designed to facilitate a timely and accurate response to this crisis currently exist. Eye tracking technology can measure voluntary and involuntary eye movements, gaze fixation within an area of interest, and speed of visual response and has been used to a limited extent in anesthesiology. With eye tracking technology, we compared the accessibility of five malignant hyperthermia cognitive aids by collecting gaze data from twelve volunteer participants. Recordings were reviewed and annotated to measure the time required for participants to locate objects on the cognitive aid to provide an answer; cumulative time to answer was the primary outcome. For the primary outcome, there were differences detected between cumulative time to answer survival curves (P < 0.001). Participants demonstrated the shortest cumulative time to answer when viewing the Society for Pediatric Anesthesia (SPA) cognitive aid compared to four other publicly available cognitive aids for malignant hyperthermia, and this outcome was not influenced by the anesthesiologists' years of experience. This is the first study to utilize eye tracking technology in a comparative evaluation of cognitive aid design, and our experience suggests that there may be additional applications of eye tracking technology in healthcare and medical education. Potentially advantageous design features of the SPA cognitive aid include a single page, linear layout, and simple typescript with minimal use of single color blocking.

  10. 21 CFR 886.3320 - Eye sphere implant.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Eye sphere implant. 886.3320 Section 886.3320 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES OPHTHALMIC DEVICES Prosthetic Devices § 886.3320 Eye sphere implant. (a) Identification. An eye...

  11. 21 CFR 886.3320 - Eye sphere implant.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Eye sphere implant. 886.3320 Section 886.3320 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES OPHTHALMIC DEVICES Prosthetic Devices § 886.3320 Eye sphere implant. (a) Identification. An eye...

  12. 21 CFR 878.4440 - Eye pad.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Eye pad. 878.4440 Section 878.4440 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES GENERAL AND PLASTIC SURGERY DEVICES Surgical Devices § 878.4440 Eye pad. (a) Identification. An eye pad is...

  13. 21 CFR 878.4440 - Eye pad.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Eye pad. 878.4440 Section 878.4440 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES GENERAL AND PLASTIC SURGERY DEVICES Surgical Devices § 878.4440 Eye pad. (a) Identification. An eye pad is...

  14. 21 CFR 878.4440 - Eye pad.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Eye pad. 878.4440 Section 878.4440 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES GENERAL AND PLASTIC SURGERY DEVICES Surgical Devices § 878.4440 Eye pad. (a) Identification. An eye pad is...

  15. Objective eye-gaze behaviour during face-to-face communication with proficient alaryngeal speakers: a preliminary study.

    PubMed

    Evitts, Paul; Gallop, Robert

    2011-01-01

    There is a large body of research demonstrating the impact of visual information on speaker intelligibility in both normal and disordered speaker populations. However, there is minimal information on which specific visual features listeners find salient during conversational discourse. To investigate listeners' eye-gaze behaviour during face-to-face conversation with normal, laryngeal and proficient alaryngeal speakers. Sixty participants individually participated in a 10-min conversation with one of four speakers (typical laryngeal, tracheoesophageal, oesophageal, electrolaryngeal; 15 participants randomly assigned to one mode of speech). All speakers were > 85% intelligible and were judged to be 'proficient' by two certified speech-language pathologists. Participants were fitted with a head-mounted eye-gaze tracking device (Mobile Eye, ASL) that calculated the region of interest and mean duration of eye-gaze. Self-reported gaze behaviour was also obtained following the conversation using a 10 cm visual analogue scale. While listening, participants viewed the lower facial region of the oesophageal speaker more than the normal or tracheoesophageal speaker. Results of non-hierarchical cluster analyses showed that while listening, the pattern of eye-gaze was predominantly directed at the lower face of the oesophageal and electrolaryngeal speaker and more evenly dispersed among the background, lower face, and eyes of the normal and tracheoesophageal speakers. Finally, results show a low correlation between self-reported eye-gaze behaviour and objective regions of interest data. Overall, results suggest similar eye-gaze behaviour when healthy controls converse with normal and tracheoesophageal speakers and that participants had significantly different eye-gaze patterns when conversing with an oesophageal speaker. Results are discussed in terms of existing eye-gaze data and its potential implications on auditory-visual speech perception. © 2011 Royal College of Speech & Language Therapists.

  16. Quantitative measurement of eyestrain on 3D stereoscopic display considering the eye foveation model and edge information.

    PubMed

    Heo, Hwan; Lee, Won Oh; Shin, Kwang Yong; Park, Kang Ryoung

    2014-05-15

    We propose a new method for measuring the degree of eyestrain on 3D stereoscopic displays using a glasses-type eye tracking device. Our study is novel in the following four ways: first, a circular area containing the user's gaze position is defined based on the calculated gaze position and the gaze estimation error. Within this circular area, the position where edge strength is maximized can be detected, and we take this position as the gaze position with a higher probability of being the correct one. Based on this gaze point, the eye foveation model is defined. Second, we quantitatively evaluate the correlation between the degree of eyestrain and the causal factors of visual fatigue, such as the degree of change of stereoscopic disparity (CSD), stereoscopic disparity (SD), frame cancellation effect (FCE), and edge component (EC) of the 3D stereoscopic display using the eye foveation model. Third, by comparing the eyestrain in conventional 3D video and experimental 3D sample video, we analyze the characteristics of eyestrain according to various factors and types of 3D video. Fourth, by comparing the eyestrain with or without the compensation of saccadic eye movements in 3D video, we analyze the characteristics of eyestrain according to the types of eye movements in 3D video. Experimental results show that the degree of CSD causes more eyestrain than other factors.
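
    The first of the four steps, picking the most plausible gaze position as the pixel of maximum edge strength inside the circle defined by the estimated gaze point and the gaze-estimation error, can be sketched as follows. This is our own minimal rendering of that step with a Sobel edge map; the frame, gaze point, and error radius are placeholders.

```python
import numpy as np
from scipy import ndimage

def refine_gaze_by_edges(image, gaze_xy, error_radius_px):
    """Within a circle of radius `error_radius_px` around the estimated gaze
    point (x, y), return the pixel with maximum Sobel edge magnitude, used here
    as the more likely true gaze position (sketch of the foveation-model step)."""
    gx = ndimage.sobel(image.astype(float), axis=1)
    gy = ndimage.sobel(image.astype(float), axis=0)
    edge = np.hypot(gx, gy)
    yy, xx = np.indices(image.shape)
    inside = (xx - gaze_xy[0]) ** 2 + (yy - gaze_xy[1]) ** 2 <= error_radius_px ** 2
    edge_masked = np.where(inside, edge, -np.inf)
    y_best, x_best = np.unravel_index(np.argmax(edge_masked), edge.shape)
    return x_best, y_best

# Toy frame with a vertical step edge at column 24; estimated gaze at (20, 16).
frame = np.zeros((32, 48))
frame[:, 24:] = 1.0
print(refine_gaze_by_edges(frame, gaze_xy=(20, 16), error_radius_px=6))
```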

  17. Eye-Tracking as a Tool to Evaluate Functional Ability in Everyday Tasks in Glaucoma.

    PubMed

    Kasneci, Enkelejda; Black, Alex A; Wood, Joanne M

    2017-01-01

    To date, few studies have investigated the eye movement patterns of individuals with glaucoma while they undertake everyday tasks in real-world settings. While some of these studies have reported possible compensatory gaze patterns in those with glaucoma who demonstrated good task performance despite their visual field loss, little is known about the complex interaction between field loss and visual scanning strategies and the impact on task performance and, consequently, on quality of life. We review existing approaches that have quantified the effect of glaucomatous visual field defects on the ability to undertake everyday activities through the use of eye movement analysis. Furthermore, we discuss current developments in eye-tracking technology and the potential for combining eye-tracking with virtual reality and advanced analytical approaches. Recent technological developments suggest that systems based on eye-tracking have the potential to assist individuals with glaucomatous loss to maintain or even improve their performance on everyday tasks and hence enhance their long-term quality of life. We discuss novel approaches for studying the visual search behavior of individuals with glaucoma that have the potential to assist individuals with glaucoma, through the use of personalized programs that take into consideration the individual characteristics of their remaining visual field and visual search behavior.

  18. Eye-Tracking as a Tool to Evaluate Functional Ability in Everyday Tasks in Glaucoma

    PubMed Central

    Black, Alex A.

    2017-01-01

    To date, few studies have investigated the eye movement patterns of individuals with glaucoma while they undertake everyday tasks in real-world settings. While some of these studies have reported possible compensatory gaze patterns in those with glaucoma who demonstrated good task performance despite their visual field loss, little is known about the complex interaction between field loss and visual scanning strategies and the impact on task performance and, consequently, on quality of life. We review existing approaches that have quantified the effect of glaucomatous visual field defects on the ability to undertake everyday activities through the use of eye movement analysis. Furthermore, we discuss current developments in eye-tracking technology and the potential for combining eye-tracking with virtual reality and advanced analytical approaches. Recent technological developments suggest that systems based on eye-tracking have the potential to assist individuals with glaucomatous loss to maintain or even improve their performance on everyday tasks and hence enhance their long-term quality of life. We discuss novel approaches for studying the visual search behavior of individuals with glaucoma that have the potential to assist individuals with glaucoma, through the use of personalized programs that take into consideration the individual characteristics of their remaining visual field and visual search behavior. PMID:28293433

  19. Eye-tracking-based assessment of cognitive function in low-resource settings.

    PubMed

    Forssman, Linda; Ashorn, Per; Ashorn, Ulla; Maleta, Kenneth; Matchado, Andrew; Kortekangas, Emma; Leppänen, Jukka M

    2017-04-01

    Early development of neurocognitive functions in infants can be compromised by poverty, malnutrition and lack of adequate stimulation. Optimal management of neurodevelopmental problems in infants requires assessment tools that can be used early in life, and are objective and applicable across economic, cultural and educational settings. The present study examined the feasibility of infrared eye tracking as a novel and highly automated technique for assessing visual-orienting and sequence-learning abilities as well as attention to facial expressions in young (9-month-old) infants. Techniques piloted in a high-resource laboratory setting in Finland (N=39) were subsequently field-tested in a community health centre in rural Malawi (N=40). Parents' perception of the acceptability of the method (Finland 95%, Malawi 92%) and percentages of infants completing the whole eye-tracking test (Finland 95%, Malawi 90%) were high, and percentages of valid test trials (Finland 69-85%, Malawi 68-73%) satisfactory at both sites. Test completion rates were slightly higher for eye tracking (90%) than traditional observational tests (87%) in Malawi. The predicted response pattern indicative of specific cognitive function was replicated in Malawi, but Malawian infants exhibited lower response rates and slower processing speed across tasks. High test completion rates and the replication of the predicted test patterns in a novel environment in Malawi support the feasibility of eye tracking as a technique for assessing infant development in low-resource settings. Further research is needed to establish the test-retest stability and predictive validity of the eye-tracking scores in low-income settings. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  20. Our self-tracking movement and health literacy: are we really making every moment count?

    PubMed

    Vamos, Sandra; Klein, Klaus

    2016-08-03

    There is a growing movement related to self-tracking in the quest for better health. Why do so many people like to use 'intelligent tools' like shiny sensors or mobile apps to keep an eye on every move? Do they really help us drive sustained healthy behavioral changes? Despite technological advances and product promises, we must remember that technology alone does not facilitate change to optimize health benefits. The purpose of the commentary is to pose the question: How 'health literate' do we have to be to reap the actionable health benefits of self-tracking? Research has revealed the prevalence of limited health literacy across the globe. Health literacy involves a complex set of inter-connected skills, including acting upon health information. This commentary puts attention on health literacy as an essential human tool to better equip people to overcome barriers and use devices to leverage their full potential. © The Author(s) 2016.

  1. Color vision deficits and laser eyewear protection for soft tissue laser applications.

    PubMed

    Teichman, J M; Vassar, G J; Yates, J T; Angle, B N; Johnson, A J; Dirks, M S; Thompson, I M

    1999-03-01

    Laser safety considerations require urologists to wear laser eye protection. Laser eye protection devices block transmittance of specific light wavelengths and may distort color perception. We tested whether urologists risk color confusion when wearing laser eye protection devices for laser soft tissue applications. Subjects were tested with the Farnsworth-Munsell 100-Hue Test without (controls) and with laser eye protection devices for carbon dioxide, potassium titanyl phosphate (KTP), neodymium (Nd):YAG and holmium:YAG lasers. Color deficits were characterized by error scores, polar graphs, confusion angles, confusion index, scatter index and color axes. Laser eye protection device spectral transmittance was tested with spectrophotometry. Mean total error scores plus or minus standard deviation were 13+/-5 for controls, and 44+/-31 for carbon dioxide, 273+/-26 for KTP, 22+/-6 for Nd:YAG and 14+/-8 for holmium:YAG devices (p <0.001). The KTP laser eye protection polar graphs, and confusion and scatter indexes revealed moderate blue-yellow and red-green color confusion. Color axes indicated no significant deficits for controls, or carbon dioxide, Nd:YAG or holmium:YAG laser eye protection in any subject compared to blue-yellow color vision deficits in 8 of 8 tested with KTP laser eye protection (p <0.001). Spectrophotometry demonstrated that light was blocked with laser eye protection devices for carbon dioxide less than 380, holmium:YAG greater than 850, Nd:YAG less than 350 and greater than 950, and KTP less than 550 and greater than 750 nm. The laser eye protection device for KTP causes significant blue-yellow and red-green color confusion. Laser eye protection devices for carbon dioxide, holmium:YAG and Nd:YAG cause no significant color confusion compared to controls. The differences are explained by laser eye protection spectrophotometry characteristics and visual physiology.

  2. Automatic Classification of Users’ Health Information Need Context: Logistic Regression Analysis of Mouse-Click and Eye-Tracker Data

    PubMed Central

    Pian, Wenjing; Khoo, Christopher SG

    2017-01-01

    Background: Users searching for health information on the Internet may be searching for their own health issue, searching for someone else's health issue, or browsing with no particular health issue in mind. Previous research has found that these three categories of users focus on different types of health information. However, most health information websites provide static content for all users. If the three types of user health information need contexts can be identified by the Web application, the search results or information offered to the user can be customized to increase its relevance or usefulness to the user. Objective: The aim of this study was to investigate the possibility of identifying the three user health information contexts (searching for self, searching for others, or browsing with no particular health issue in mind) using just hyperlink clicking behavior; using eye-tracking information; and using a combination of eye-tracking, demographic, and urgency information. Predictive models are developed using multinomial logistic regression. Methods: A total of 74 participants (39 females and 35 males) who were mainly staff and students of a university were asked to browse a health discussion forum, Healthboards.com. An eye tracker recorded their examining (eye fixation) and skimming (quick eye movement) behaviors on 2 types of screens: summary result screen displaying a list of post headers, and detailed post screen. The following three types of predictive models were developed using logistic regression analysis: model 1 used only the time spent in scanning the summary result screen and reading the detailed post screen, which can be determined from the user's mouse clicks; model 2 used the examining and skimming durations on each screen, recorded by an eye tracker; and model 3 added user demographic and urgency information to model 2. Results: An analysis of variance (ANOVA) found that users' browsing durations were significantly different for the three health information contexts (P<.001). The logistic regression model 3 was able to predict the user's type of health information context with a 10-fold cross validation mean accuracy of 84% (62/74), followed by model 2 at 73% (54/74) and model 1 at 71% (52/78). In addition, correlation analysis found that particular browsing durations were highly correlated with users' age, education level, and the urgency of their information need. Conclusions: A user's type of health information need context (ie, searching for self, for others, or with no health issue in mind) can be identified with reasonable accuracy using just user mouse clicks that can easily be detected by Web applications. Higher accuracy can be obtained using Google Glass or future computing devices with eye tracking function. PMID:29269342
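
    A minimal sketch of the "model 1"-style analysis, predicting the three contexts from two browsing durations with a multinomial logistic regression evaluated by 10-fold cross-validation, is given below using scikit-learn. The data are synthetic stand-ins, so the printed accuracy has no relation to the reported 71-84%.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic stand-in for "model 1": per-session durations (seconds) spent on the
# summary-result screen and on detailed post screens, with one of three context
# labels (0 = self, 1 = others, 2 = browsing).
n = 74
labels = rng.integers(0, 3, size=n)
summary_time = rng.normal(loc=30 + 10 * labels, scale=8, size=n)
detail_time = rng.normal(loc=60 - 12 * labels, scale=10, size=n)
X = np.column_stack([summary_time, detail_time])

# With the default lbfgs solver and three classes, LogisticRegression fits a
# multinomial model; accuracy is estimated with stratified 10-fold CV.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
accuracy = cross_val_score(model, X, labels, cv=10, scoring="accuracy")
print(f"10-fold mean accuracy: {accuracy.mean():.2f}")
```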

  3. Toward Collaboration Sensing

    ERIC Educational Resources Information Center

    Schneider, Bertrand; Pea, Roy

    2014-01-01

    We describe preliminary applications of network analysis techniques to eye-tracking data collected during a collaborative learning activity. This paper makes three contributions: first, we visualize collaborative eye-tracking data as networks, where the nodes of the graph represent fixations and edges represent saccades. We found that those…
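
    The representation described in the first contribution, fixations as nodes and saccades as edges, can be built directly with a graph library. The sketch below uses networkx and a made-up fixation sequence; it illustrates the data structure, not the authors' analysis.

```python
import networkx as nx

# Hypothetical fixation sequence: (area of interest, duration in ms) per fixation.
fixations = [("diagram", 310), ("legend", 180), ("diagram", 420), ("partner_screen", 250)]

# Nodes are fixations, edges are the saccades connecting consecutive fixations.
g = nx.DiGraph()
for i, (aoi, duration) in enumerate(fixations):
    g.add_node(i, aoi=aoi, duration_ms=duration)
for i in range(len(fixations) - 1):
    g.add_edge(i, i + 1)

print(g.number_of_nodes(), g.number_of_edges())   # 4 fixation nodes, 3 saccade edges
print(nx.degree_centrality(g))                    # simple network-analysis measure
```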

  4. Eye Tracking Dysfunction in Schizophrenia: Characterization and Pathophysiology

    PubMed Central

    Sereno, Anne B.; Gooding, Diane C.; O’Driscoll, Gilllian A.

    2011-01-01

    Eye tracking dysfunction (ETD) is one of the most widely replicated behavioral deficits in schizophrenia and is over-represented in clinically unaffected first-degree relatives of schizophrenia patients. Here, we provide an overview of research relevant to the characterization and pathophysiology of this impairment. Deficits are most robust in the maintenance phase of pursuit, particularly during the tracking of predictable target movement. Impairments are also found in pursuit initiation and correlate with performance on tests of motion processing, implicating early sensory processing of motion signals. Taken together, the evidence suggests that ETD involves higher-order structures, including the frontal eye fields, which adjust the gain of the pursuit response to visual and anticipated target movement, as well as early parts of the pursuit pathway, including motion areas (the middle temporal area and the adjacent medial superior temporal area). Broader application of localizing behavioral paradigms in patient and family studies would be advantageous for refining the eye tracking phenotype for genetic studies. PMID:21312405

  5. Physiological and behavioral responses to an exposure of pitch illusion in the simulator.

    PubMed

    Cheung, Bob; Hofer, Kevin; Heskin, Raquel; Smith, Andrew

    2004-08-01

    It has been suggested that a pilot's physiological and behavioral responses during disorientation can provide a real-time model of pilot state in order to optimize performance. We investigated whether there were consistent behavioral or physiological "markers" that can be monitored during a single episode of disorientation. An Integrated Physiological Trainer with closed-loop interactive aircraft control and a point-of-gaze/eye-tracking device was employed. Sixteen subjects, proficient in maintaining straight and level flight and familiar with attitude-change procedures, were exposed to yaw rotation and a brief head roll to 35 +/- 2 degrees. On return to the upright head position, subjects were required to initiate either an ascent or a descent to a prescribed attitude. BP, HR, skin conductance, eye movements, and point of gaze were monitored throughout the onset and duration of the disorientation insult and immediately afterward. Simultaneously, airspeed and power settings were recorded. Compared with the control condition, a significant increase (p < 0.01) in HR, HR variability, and mean arterial BP was observed during the disorientation. Flight performance decrement was reflected by a significant delay in setting power for attitude change and by deviation in maintaining airspeed (p < 0.01). Changes in cardiovascular responses appear to be correlated with the onset of disorientation. The correlation between changing eye-tracking behavior and flight performance decrement is consistent with our previous findings. Further study is required to determine whether these findings can be extrapolated to repeated exposures and to other disorientation scenarios.

  6. Ultra-low-cost 3D gaze estimation: an intuitive high information throughput compliment to direct brain-machine interfaces

    NASA Astrophysics Data System (ADS)

    Abbott, W. W.; Faisal, A. A.

    2012-08-01

    Eye movements are highly correlated with motor intentions and are often retained by patients with serious motor deficiencies. Despite this, eye tracking is not widely used as control interface for movement in impaired patients due to poor signal interpretation and lack of control flexibility. We propose that tracking the gaze position in 3D rather than 2D provides a considerably richer signal for human machine interfaces by allowing direct interaction with the environment rather than via computer displays. We demonstrate here that by using mass-produced video-game hardware, it is possible to produce an ultra-low-cost binocular eye-tracker with comparable performance to commercial systems, yet 800 times cheaper. Our head-mounted system has 30 USD material costs and operates at over 120 Hz sampling rate with a 0.5-1 degree of visual angle resolution. We perform 2D and 3D gaze estimation, controlling a real-time volumetric cursor essential for driving complex user interfaces. Our approach yields an information throughput of 43 bits s-1, more than ten times that of invasive and semi-invasive brain-machine interfaces (BMIs) that are vastly more expensive. Unlike many BMIs our system yields effective real-time closed loop control of devices (10 ms latency), after just ten minutes of training, which we demonstrate through a novel BMI benchmark—the control of the video arcade game ‘Pong’.
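
    The abstract does not detail the 3D gaze computation; a standard way to obtain a 3D gaze point from a binocular tracker is to take the point closest, in the least-squares sense, to the two eyes' gaze rays. The sketch below is that generic construction with made-up geometry, not the authors' calibration or algorithm.

```python
import numpy as np

def triangulate_gaze(p_left, d_left, p_right, d_right):
    """Return the 3D point closest (least squares) to two gaze rays, given each
    eye's position p and gaze direction d in a common head/world frame."""
    A, b = [], []
    for p, d in ((p_left, d_left), (p_right, d_right)):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)       # projector onto the plane normal to the ray
        A.append(P)
        b.append(P @ p)
    x, *_ = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)
    return x

# Hypothetical geometry: eyes 6 cm apart, both verging on a point 50 cm ahead.
target = np.array([0.0, 0.0, 0.5])
p_l, p_r = np.array([-0.03, 0.0, 0.0]), np.array([0.03, 0.0, 0.0])
print(triangulate_gaze(p_l, target - p_l, p_r, target - p_r))   # ~ [0, 0, 0.5]
```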

  7. Mouse cursor movement and eye tracking data as an indicator of pathologists’ attention when viewing digital whole slide images

    PubMed Central

    Raghunath, Vignesh; Braxton, Melissa O.; Gagnon, Stephanie A.; Brunyé, Tad T.; Allison, Kimberly H.; Reisch, Lisa M.; Weaver, Donald L.; Elmore, Joann G.; Shapiro, Linda G.

    2012-01-01

    Context: Digital pathology has the potential to dramatically alter the way pathologists work, yet little is known about pathologists’ viewing behavior while interpreting digital whole slide images. While tracking pathologist eye movements when viewing digital slides may be the most direct method of capturing pathologists’ viewing strategies, this technique is cumbersome and technically challenging to use in remote settings. Tracking pathologist mouse cursor movements may serve as a practical method of studying digital slide interpretation, and mouse cursor data may illuminate pathologists’ viewing strategies and time expenditures in their interpretive workflow. Aims: To evaluate the utility of mouse cursor movement data, in addition to eye-tracking data, in studying pathologists’ attention and viewing behavior. Settings and Design: Pathologists (N = 7) viewed 10 digital whole slide images of breast tissue that were selected using a random stratified sampling technique to include a range of breast pathology diagnoses (benign/atypia, carcinoma in situ, and invasive breast cancer). A panel of three expert breast pathologists established a consensus diagnosis for each case using a modified Delphi approach. Materials and Methods: Participants’ foveal vision was tracked using SensoMotoric Instruments RED 60 Hz eye-tracking system. Mouse cursor movement was tracked using a custom MATLAB script. Statistical Analysis Used: Data on eye-gaze and mouse cursor position were gathered at fixed intervals and analyzed using distance comparisons and regression analyses by slide diagnosis and pathologist expertise. Pathologists’ accuracy (defined as percent agreement with the expert consensus diagnoses) and efficiency (accuracy and speed) were also analyzed. Results: Mean viewing time per slide was 75.2 seconds (SD = 38.42). Accuracy (percent agreement with expert consensus) by diagnosis type was: 83% (benign/atypia); 48% (carcinoma in situ); and 93% (invasive). Spatial coupling was close between eye-gaze and mouse cursor positions (highest frequency ∆x was 4.00px (SD = 16.10), and ∆y was 37.50px (SD = 28.08)). Mouse cursor position moderately predicted eye gaze patterns (Rx = 0.33 and Ry = 0.21). Conclusions: Data detailing mouse cursor movements may be a useful addition to future studies of pathologists’ accuracy and efficiency when using digital pathology. PMID:23372984

  8. Mouse cursor movement and eye tracking data as an indicator of pathologists' attention when viewing digital whole slide images.

    PubMed

    Raghunath, Vignesh; Braxton, Melissa O; Gagnon, Stephanie A; Brunyé, Tad T; Allison, Kimberly H; Reisch, Lisa M; Weaver, Donald L; Elmore, Joann G; Shapiro, Linda G

    2012-01-01

    Digital pathology has the potential to dramatically alter the way pathologists work, yet little is known about pathologists' viewing behavior while interpreting digital whole slide images. While tracking pathologist eye movements when viewing digital slides may be the most direct method of capturing pathologists' viewing strategies, this technique is cumbersome and technically challenging to use in remote settings. Tracking pathologist mouse cursor movements may serve as a practical method of studying digital slide interpretation, and mouse cursor data may illuminate pathologists' viewing strategies and time expenditures in their interpretive workflow. To evaluate the utility of mouse cursor movement data, in addition to eye-tracking data, in studying pathologists' attention and viewing behavior. Pathologists (N = 7) viewed 10 digital whole slide images of breast tissue that were selected using a random stratified sampling technique to include a range of breast pathology diagnoses (benign/atypia, carcinoma in situ, and invasive breast cancer). A panel of three expert breast pathologists established a consensus diagnosis for each case using a modified Delphi approach. Participants' foveal vision was tracked using SensoMotoric Instruments RED 60 Hz eye-tracking system. Mouse cursor movement was tracked using a custom MATLAB script. Data on eye-gaze and mouse cursor position were gathered at fixed intervals and analyzed using distance comparisons and regression analyses by slide diagnosis and pathologist expertise. Pathologists' accuracy (defined as percent agreement with the expert consensus diagnoses) and efficiency (accuracy and speed) were also analyzed. Mean viewing time per slide was 75.2 seconds (SD = 38.42). Accuracy (percent agreement with expert consensus) by diagnosis type was: 83% (benign/atypia); 48% (carcinoma in situ); and 93% (invasive). Spatial coupling was close between eye-gaze and mouse cursor positions (highest frequency ∆x was 4.00px (SD = 16.10), and ∆y was 37.50px (SD = 28.08)). Mouse cursor position moderately predicted eye gaze patterns (Rx = 0.33 and Ry = 0.21). Data detailing mouse cursor movements may be a useful addition to future studies of pathologists' accuracy and efficiency when using digital pathology.
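
    The distance comparisons and per-axis regressions reported above can be reproduced in outline as follows. The data here are synthetic stand-ins, so the printed R values will not match the reported Rx = 0.33 and Ry = 0.21; only the shape of the analysis is illustrated.

```python
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(2)

# Synthetic samples of gaze and mouse-cursor positions (pixels) at fixed intervals;
# the cursor loosely follows gaze with noise and an offset, as a stand-in for real data.
gaze = rng.uniform(0, 1920, size=(500, 2))
cursor = gaze + rng.normal(loc=(4.0, 37.5), scale=(16.0, 28.0), size=(500, 2))

delta = cursor - gaze
print("median |dx|, |dy| (px):", np.median(np.abs(delta), axis=0))

# Per-axis regression of gaze on cursor position, analogous to the reported Rx, Ry.
for axis, name in enumerate("xy"):
    fit = linregress(cursor[:, axis], gaze[:, axis])
    print(f"R{name} = {fit.rvalue:.2f}")
```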

  9. A new generation of IC based beam steering devices for free-space optical communication

    NASA Astrophysics Data System (ADS)

    Bedi, Vijit

    Free Space Optical (FSO) communication has advanced tremendously within the last decade to meet the ever increasing demand for higher communication bandwidth. Advances in laser technology since its invention in the 1960s [1] have made lasers the dominant source in FSO communication modules. The future of FSO systems lies in implementing semiconductor lasers due to their small size, power efficiency and suitability for mass fabrication. In the near future, these systems are very likely to be used in space and ground based applications, and revolutionary beam steering technologies will be required for distant communications in free space. The highly directional characteristic inherent to a laser beam calls for new beam pointing and steering technologies for this type of communication. In this dissertation, research is done on a novel FSO communication device based on semiconductor lasers for high bandwidth communication. The "Fly eye transceiver" is an extremely wide steering bandwidth, completely non-mechanical FSO laser communication device primarily designed to replace traditional mechanical beam steering optical systems. This non-mechanical FSO device possesses a full spherical steering range and a very high tracking bandwidth. Inspired by the evolutionary model of a fly's eye, the full spherical steering range is assured by electronically controlled switching of its sub-eyes. Non-mechanical technologies used in the past for beam steering, such as acousto-optic Bragg cells, liquid crystal arrays or piezoelectric elements, offer wide steering bandwidth and fast response time, but are limited in their angular steering range. Mechanical gimbals offer a much greater steering range but suffer from a much slower response time or steering bandwidth and often require intelligent adaptive controls with bulky driver amplifiers to feed their actuators. As a solution providing both fast and full spherical steering, the Fly-eye transceiver is studied as part of my PhD work. The design tool created for the research of the fly eye is then used to study different applications that may be implemented with the concept. Research is done on the mathematical feasibility, modeling, design, application of the technology, and its characterization in a simulation environment. In addition, the effects of atmospheric turbulence on beam propagation in free space and the application of optical encryption for data security are also researched.

  10. Intraocular and extraocular cameras for retinal prostheses: Effects of foveation by means of visual prosthesis simulation

    NASA Astrophysics Data System (ADS)

    McIntosh, Benjamin Patrick

    Blindness due to Age-Related Macular Degeneration and Retinitis Pigmentosa is unfortunately both widespread and largely incurable. Advances in visual prostheses that can restore functional vision in those afflicted by these diseases have evolved rapidly from new areas of research in ophthalmology and biomedical engineering. This thesis is focused on further advancing the state-of-the-art of both visual prostheses and implantable biomedical devices. A novel real-time system with a high performance head-mounted display is described that enables enhanced realistic simulation of intraocular retinal prostheses. A set of visual psychophysics experiments is presented using the visual prosthesis simulator that quantify, in several ways, the benefit of foveation afforded by an eye-pointed camera (such as an eye-tracked extraocular camera or an implantable intraocular camera) as compared with a head-pointed camera. A visual search experiment demonstrates a significant improvement in the time to locate a target on a screen when using an eye-pointed camera. A reach and grasp experiment demonstrates a 20% to 70% improvement in time to grasp an object when using an eye-pointed camera, with the improvement maximized when the percept is blurred. A navigation and mobility experiment shows a 10% faster walking speed and a 50% better ability to avoid obstacles when using an eye-pointed camera. Improvements to implantable biomedical devices are also described, including the design and testing of VLSI-integrable positive mobile ion contamination sensors and humidity sensors that can validate the hermeticity of biomedical device packages encapsulated by hermetic coatings, and can provide early warning of leaks or contamination that may jeopardize the implant. The positive mobile ion contamination sensors are shown to be sensitive to externally applied contamination. A model is proposed to describe sensitivity as a function of device geometry, and verified experimentally. Guidelines are provided on the use of spare CMOS oxide and metal layers to maximize the hermeticity of an implantable microchip. In addition, results are presented on the design and testing of small form factor, very low power, integrated CMOS clock generation circuits that are stable enough to drive commercial image sensor arrays, and therefore can be incorporated in an intraocular camera for retinal prostheses.

  11. SacLab: A toolbox for saccade analysis to increase usability of eye tracking systems in clinical ophthalmology practice.

    PubMed

    Cercenelli, Laura; Tiberi, Guido; Corazza, Ivan; Giannaccare, Giuseppe; Fresina, Michela; Marcelli, Emanuela

    2017-01-01

    Many open source software packages have been recently developed to expand the usability of eye tracking systems to study oculomotor behavior, but none of these is specifically designed to encompass all the main functions required for creating eye tracking tests and for providing the automatic analysis of saccadic eye movements. The aim of this study is to introduce SacLab, an intuitive, freely-available MATLAB toolbox based on Graphical User Interfaces (GUIs) that we have developed to increase the usability of the ViewPoint EyeTracker (Arrington Research, Scottsdale, AZ, USA) in clinical ophthalmology practice. SacLab consists of four processing modules that enable the user to easily create visual stimuli tests (Test Designer), record saccadic eye movements (Data Recorder), analyze the recorded data to automatically extract saccadic parameters of clinical interest (Data Analyzer) and provide an aggregate analysis from multiple eye movements recordings (Saccade Analyzer), without requiring any programming effort by the user. A demo application of SacLab to carry out eye tracking tests for the analysis of horizontal saccades was reported. We tested the usability of SacLab toolbox with three ophthalmologists who had no programming experience; the ophthalmologists were briefly trained in the use of SacLab GUIs and were asked to perform the demo application. The toolbox gained an enthusiastic feedback from all the clinicians in terms of intuitiveness, ease of use and flexibility. Test creation and data processing were accomplished in 52±21s and 46±19s, respectively, using the SacLab GUIs. SacLab may represent a useful tool to ease the application of the ViewPoint EyeTracker system in clinical routine in ophthalmology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Insights into numerical cognition: considering eye-fixations in number processing and arithmetic.

    PubMed

    Mock, J; Huber, S; Klein, E; Moeller, K

    2016-05-01

    Considering eye-fixation behavior is standard in reading research to investigate underlying cognitive processes. However, in numerical cognition research eye-tracking is used less often and less systematically. Nevertheless, we identified over 40 studies on this topic from the last 40 years, with an increase in eye-tracking studies on numerical cognition during the last decade. Here, we review and discuss these empirical studies to evaluate the added value of eye-tracking for the investigation of number processing. Our literature review revealed that the way eye-fixation behavior is considered in numerical cognition research ranges from investigating basic perceptual aspects of processing non-symbolic and symbolic numbers, through assessing the common representational space of numbers and space, to evaluating the influence of characteristics of the base-10 place-value structure of Arabic numbers and of executive control on number processing. Apart from basic results such as reading times of numbers increasing with their magnitude, studies revealed that number processing can influence domain-general processes such as attention shifting, but also the other way around. Domain-general processes such as cognitive control were found to affect number processing. In summary, eye-fixation behavior allows for new insights into both domain-specific and domain-general processes involved in number processing. Based thereon, a processing model of the temporal dynamics of numerical cognition is postulated, which distinguishes an early stage of stimulus-driven bottom-up processing from later more top-down controlled stages. Furthermore, perspectives for eye-tracking research in numerical cognition are discussed to emphasize the potential of this methodology for advancing our understanding of numerical cognition.

  13. What interests them in the pictures?--differences in eye-tracking between rhesus monkeys and humans.

    PubMed

    Hu, Ying-Zhou; Jiang, Hui-Hui; Liu, Ci-Rong; Wang, Jian-Hong; Yu, Cheng-Yang; Carlson, Synnöve; Yang, Shang-Chuan; Saarinen, Veli-Matti; Rizak, Joshua D; Tian, Xiao-Guang; Tan, Hen; Chen, Zhu-Yue; Ma, Yuan-Ye; Hu, Xin-Tian

    2013-10-01

    Studies estimating eye movements have demonstrated that non-human primates have fixation patterns similar to humans at the first sight of a picture. In the current study, three sets of pictures containing monkeys, humans or both were presented to rhesus monkeys and humans. The eye movements on these pictures by the two species were recorded using a Tobii eye-tracking system. We found that monkeys paid more attention to the head and body in pictures containing monkeys, whereas both monkeys and humans paid more attention to the head in pictures containing humans. The humans always concentrated on the eyes and head in all the pictures, indicating the social role of facial cues in society. Although humans paid more attention to the hands than monkeys, both monkeys and humans were interested in the hands and what was being done with them in the pictures. This may suggest the importance and necessity of hands for survival. Finally, monkeys scored lower in eye-tracking when fixating on the pictures, as if they were less interested in looking at the screen than humans. The locations of fixation in monkeys may provide insight into the role of eye movements in an evolutionary context.

  14. Effects of reward on the accuracy and dynamics of smooth pursuit eye movements.

    PubMed

    Brielmann, Aenne A; Spering, Miriam

    2015-08-01

    Reward modulates behavioral choices and biases goal-oriented behavior, such as eye or hand movements, toward locations or stimuli associated with higher rewards. We investigated reward effects on the accuracy and timing of smooth pursuit eye movements in 4 experiments. Eye movements were recorded in participants tracking a moving visual target on a computer monitor. Before target motion onset, a monetary reward cue indicated whether participants could earn money by tracking accurately, or whether the trial was unrewarded (Experiments 1 and 2, n = 11 each). Reward significantly improved eye-movement accuracy across different levels of task difficulty. Improvements were seen even in the earliest phase of the eye movement, within 70 ms of tracking onset, indicating that reward impacts visual-motor processing at an early level. We obtained similar findings when reward was not precued but explicitly associated with the pursuit target (Experiment 3, n = 16); critically, these results were not driven by stimulus prevalence or other factors such as preparation or motivation. Numerical cues (Experiment 4, n = 9) were not effective. (c) 2015 APA, all rights reserved.

  15. Eye tracking a self-moved target with complex hand-target dynamics

    PubMed Central

    Landelle, Caroline; Montagnini, Anna; Madelain, Laurent

    2016-01-01

    Previous work has shown that the ability to track with the eye a moving target is substantially improved when the target is self-moved by the subject's hand compared with when being externally moved. Here, we explored a situation in which the mapping between hand movement and target motion was perturbed by simulating an elastic relationship between the hand and target. Our objective was to determine whether the predictive mechanisms driving eye-hand coordination could be updated to accommodate this complex hand-target dynamics. To fully appreciate the behavioral effects of this perturbation, we compared eye tracking performance when self-moving a target with a rigid mapping (simple) and a spring mapping as well as when the subject tracked target trajectories that he/she had previously generated when using the rigid or spring mapping. Concerning the rigid mapping, our results confirmed that smooth pursuit was more accurate when the target was self-moved than externally moved. In contrast, with the spring mapping, eye tracking had initially similar low spatial accuracy (though shorter temporal lag) in the self versus externally moved conditions. However, within ∼5 min of practice, smooth pursuit improved in the self-moved spring condition, up to a level similar to the self-moved rigid condition. Subsequently, when the mapping unexpectedly switched from spring to rigid, the eye initially followed the expected target trajectory and not the real one, thereby suggesting that subjects used an internal representation of the new hand-target dynamics. Overall, these results emphasize the stunning adaptability of smooth pursuit when self-maneuvering objects with complex dynamics. PMID:27466129
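
    The elastic hand-target relationship described above can be made concrete with a toy simulation. The Python sketch below attaches the target to the hand position through a damped spring, so the target lags and overshoots the hand instead of following it rigidly; the stiffness and damping constants are illustrative assumptions and are not the parameters used in the study.

        # Illustrative sketch (parameters are assumptions, not those of the study):
        # simulate a target attached to the hand through a damped spring, so target
        # motion lags and overshoots hand motion instead of following it rigidly.
        import numpy as np

        def simulate_spring_target(hand_pos, dt=0.01, stiffness=40.0, damping=4.0):
            """hand_pos: 1-D array of hand positions, one sample per time step."""
            target = np.zeros_like(hand_pos)
            velocity = 0.0
            for i in range(1, len(hand_pos)):
                # spring pulls the target toward the hand; damping dissipates energy
                accel = stiffness * (hand_pos[i - 1] - target[i - 1]) - damping * velocity
                velocity += accel * dt
                target[i] = target[i - 1] + velocity * dt
            return target

        t = np.arange(0, 5, 0.01)
        hand = np.sin(2 * np.pi * 0.4 * t)                    # sinusoidal hand movement at 0.4 Hz
        target = simulate_spring_target(hand)
        print(f"hand peak {np.abs(hand).max():.2f}, target peak {np.abs(target).max():.2f}")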

  16. Nerve Fiber Flux Analysis Using Wide-Field Swept-Source Optical Coherence Tomography.

    PubMed

    Tan, Ou; Liu, Liang; Liu, Li; Huang, David

    2018-02-01

    To devise a method to quantify nerve fibers over their arcuate courses over an extended peripapillary area using optical coherence tomography (OCT). Participants were imaged with 8 × 8-mm volumetric OCT scans centered at the optic disc. A new quantity, nerve fiber flux (NFF), represents the cross-sectional area transected perpendicular to the nerve fibers. The peripapillary area was divided into 64 tracks with equal flux. An iterative algorithm traced the trajectory of the tracks assuming that the relative distribution of the NFF was conserved, with compensation for fiber connections to ganglion cells on the macular side. The average trajectory was computed from normal eyes and used to calculate the NFF maps for glaucomatous eyes. The NFF maps were divided into eight sectors that correspond to visual field regions. Twenty-four healthy and 10 glaucomatous eyes were enrolled. The algorithm converged on similar patterns of nerve fiber layer (NFL) tracks for all healthy eyes. In glaucomatous eyes, NFF correlated with visual field sensitivity in the arcuate sectors (Spearman ρ = 0.53-0.62). Focal nerve fiber loss in glaucomatous eyes appeared as uniform tracks of NFF defects that followed the expected arcuate fiber trajectory. Using an algorithm based on the conservation of flux, we derived nerve fiber trajectories in the peripapillary area. The NFF map is useful for the visualization of focal defects and quantification of sector nerve fiber loss from wide-area volumetric OCT scans. NFF provides a cumulative measure of volumetric loss along nerve fiber tracks and could improve the detection of focal glaucoma damage.
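
    The division of the peripapillary area into 64 tracks of equal flux can be illustrated with a toy calculation. The hedged Python sketch below approximates flux as thickness times arc length along a circumpapillary thickness profile and finds equal-flux angular boundaries from the cumulative flux; this simplification is an assumption and is not the authors' iterative trajectory-tracing algorithm.

        # Simplified sketch (not the authors' algorithm): split a circumpapillary
        # nerve-fiber-layer thickness profile into 64 angular tracks that each carry
        # an equal share of the total flux, approximating flux as thickness x arc length.
        import numpy as np

        n_samples, n_tracks = 720, 64
        angle = np.linspace(0, 2 * np.pi, n_samples, endpoint=False)
        thickness_um = 100 + 40 * np.cos(2 * angle)       # toy double-hump NFL profile

        arc_len = 2 * np.pi * 1.7e3 / n_samples           # um of circumference per sample (1.7 mm radius)
        flux = thickness_um * arc_len                     # cross-sectional area per sample (um^2)
        cum_flux = np.concatenate(([0.0], np.cumsum(flux)))

        # Angular boundaries where cumulative flux reaches k/64 of the total
        targets = np.linspace(0, cum_flux[-1], n_tracks + 1)
        boundaries = np.interp(targets, cum_flux, np.linspace(0, 2 * np.pi, n_samples + 1))
        print(np.degrees(boundaries[:5]))                 # first few track boundaries, in degrees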

  17. Alcohol and disorientation-responses. VI, Effects of alcohol on eye movements and tracking performance during laboratory angular accelerations about the yaw and pitch axes.

    DOT National Transportation Integrated Search

    1972-12-01

    Alcohol ingestion interferes with visual control of vestibular eye movements and thereby produces significant decrements in performance at a compensatory tracking task during oscillation about the yaw axis; significant or consistent decrements in per...

  18. 77 FR 35983 - Agency Information Collection Activities; Proposed Collection; Comment Request; Eye Tracking...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-15

    ... also help improve questionnaire design. Different respondents may pay differing degrees of attention to... and strategies for improving the design (Refs. 5 and 6). Finally, eye tracking data can provide... design elements (e.g., prominence, text vs. graphics) will cause variations in information seeking. To...

  19. High-Speed Noninvasive Eye-Tracking System

    NASA Technical Reports Server (NTRS)

    Talukder, Ashit; LaBaw, Clayton; Michael-Morookian, John; Monacos, Steve; Serviss, Orin

    2007-01-01

    The figure schematically depicts a system of electronic hardware and software that noninvasively tracks the direction of a person's gaze in real time. Like prior commercial noninvasive eye-tracking systems, this system is based on (1) illumination of an eye by a low-power infrared light-emitting diode (LED); (2) acquisition of video images of the pupil, iris, and cornea in the reflected infrared light; (3) digitization of the images; and (4) processing the digital image data to determine the direction of gaze from the centroids of the pupil and cornea in the images. Relative to the prior commercial systems, the present system operates at much higher speed and thereby offers enhanced capability for applications that involve human-computer interactions, including typing and computer command and control by handicapped individuals, and eye-based diagnosis of physiological disorders that affect gaze responses.
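
    The pupil/corneal-reflection principle summarized above is commonly implemented by mapping the vector from the corneal glint to the pupil centroid onto screen coordinates through a calibration fit. The Python sketch below illustrates that generic idea with a second-order polynomial calibration; the calibration points, screen layout and model order are assumptions, not the NASA system's implementation.

        # Illustrative sketch of the pupil/corneal-reflection principle (not the NASA system):
        # map the vector from corneal glint to pupil centroid onto screen coordinates
        # with a second-order polynomial fitted from calibration points.
        import numpy as np

        def design_matrix(v):
            x, y = v[:, 0], v[:, 1]
            return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

        # Hypothetical calibration: pupil-glint difference vectors and the known screen targets
        pg_vectors = np.array([[x, y] for y in (-3, 0, 3) for x in (-5, 0, 5)], float)
        screen_xy  = np.array([[sx, sy] for sy in (100, 360, 620) for sx in (100, 640, 1180)], float)

        coeffs, *_ = np.linalg.lstsq(design_matrix(pg_vectors), screen_xy, rcond=None)

        def gaze_point(pupil, glint):
            v = np.asarray(pupil, float) - np.asarray(glint, float)
            return design_matrix(v[None, :]) @ coeffs     # (1, 2) screen coordinates

        print(gaze_point(pupil=(312, 240), glint=(310, 242)))   # pupil-glint vector = (2, -2)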

  20. 21 CFR 872.2060 - Jaw tracking device.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Jaw tracking device. 872.2060 Section 872.2060 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES DENTAL DEVICES Diagnostic Devices § 872.2060 Jaw tracking device. (a) Jaw tracking device...

  1. 21 CFR 872.2060 - Jaw tracking device.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Jaw tracking device. 872.2060 Section 872.2060 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES DENTAL DEVICES Diagnostic Devices § 872.2060 Jaw tracking device. (a) Jaw tracking device...

  2. 21 CFR 872.2060 - Jaw tracking device.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Jaw tracking device. 872.2060 Section 872.2060 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES DENTAL DEVICES Diagnostic Devices § 872.2060 Jaw tracking device. (a) Jaw tracking device...

  3. 21 CFR 872.2060 - Jaw tracking device.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Jaw tracking device. 872.2060 Section 872.2060 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES DENTAL DEVICES Diagnostic Devices § 872.2060 Jaw tracking device. (a) Jaw tracking device...

  4. Gaze Estimation Method Using Analysis of Electrooculogram Signals and Kinect Sensor

    PubMed Central

    Tanno, Koichi

    2017-01-01

    A gaze estimation system is one of the communication methods for severely disabled people who cannot perform gestures and speech. We previously developed an eye tracking method using a compact and lightweight electrooculogram (EOG) device, but its accuracy is not very high. In the present study, we conducted experiments to investigate the EOG component most strongly correlated with changes in eye movement. The experiments were of two types: experiments in which objects were viewed with eye movements only, and experiments in which objects were viewed with combined face and eye movements. The experimental results show the possibility of an eye tracking method using EOG signals and a Kinect sensor. PMID:28912800
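
    The general idea of combining an EOG-derived eye-in-head angle with a head-orientation estimate from a depth sensor can be sketched as follows. The linear gain, sample values and simple additive model are assumptions for illustration only and are not the authors' method.

        # Minimal sketch (assumed values, not the authors' method): combine a horizontal
        # EOG signal, converted to eye-in-head angle via a calibrated linear gain, with a
        # head-yaw estimate from a depth sensor to obtain gaze yaw in world coordinates.
        import numpy as np

        EOG_GAIN_DEG_PER_UV = 0.05                          # hypothetical calibration: deg of eye rotation per uV
        eog_uv   = np.array([0.0, 100.0, 200.0, -150.0])    # horizontal EOG samples (uV)
        head_yaw = np.array([0.0,   5.0,   5.0,  -10.0])    # head yaw from the depth sensor (deg)

        eye_in_head = EOG_GAIN_DEG_PER_UV * eog_uv          # eye rotation relative to the head
        gaze_yaw = head_yaw + eye_in_head                   # gaze direction in world coordinates
        print(gaze_yaw)                                     # [  0.   10.   15.  -17.5]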

  5. Differences in gaze behaviour of expert and junior surgeons performing open inguinal hernia repair.

    PubMed

    Tien, Tony; Pucher, Philip H; Sodergren, Mikael H; Sriskandarajah, Kumuthan; Yang, Guang-Zhong; Darzi, Ara

    2015-02-01

    Various fields have used gaze behaviour to evaluate task proficiency. This may also apply to surgery for the assessment of technical skill, but has not previously been explored in live surgery. The aim was to assess differences in gaze behaviour between expert and junior surgeons during open inguinal hernia repair. Gaze behaviour of expert and junior surgeons (defined by operative experience) performing the operation was recorded using eye-tracking glasses (SMI Eye Tracking Glasses 2.0, SensoMotoric Instruments, Germany). Primary endpoints were fixation frequency (steady eye gaze rate) and dwell time (fixation and saccade duration), analysed for designated areas of interest in the subject's visual field. Secondary endpoints were maximum pupil size, pupil rate of change (change frequency in pupil size) and pupil entropy (predictability of pupil change). The NASA TLX scale measured perceived workload. Recorded metrics were compared between groups for the entire procedure and for comparable procedural segments. Twenty-five cases were recorded and 13 operations from 9 surgeons were analysed, giving 630 min of data recorded at 30 Hz. Experts demonstrated higher fixation frequency and longer dwell time on the operative site during application of mesh (fixation frequency: median [IQR] 1.86 [0.3] vs 0.96 [0.3], P = 0.006; dwell time: 792 [159] vs 469 [109] s, P = 0.028) and during closure of the external oblique (1.79 [0.2] vs 1.20 [0.6], P = 0.003; 625 [154] vs 448 [147] s, P = 0.032), and dwelled longer on the sterile field during cutting of mesh (716 [173] vs 268 [297] s, P = 0.019). NASA TLX scores indicated that experts found the procedure less mentally demanding than juniors (3 [2] vs 12 [5.2], P = 0.038). No subjects reported problems with wearing the device or obstruction of their view. Use of portable eye-tracking technology in open surgery is feasible without impinging on surgical performance. Differences in gaze behaviour during open inguinal hernia repair can be seen between expert and junior surgeons and may have uses for the assessment of surgical skill.
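
    For readers unfamiliar with the primary endpoints, the Python sketch below shows how fixation frequency and dwell time for an area of interest (AOI) can be computed from a list of fixation events; the fixation data and segment duration are made up for illustration and have no relation to the study's recordings.

        # Sketch of the two primary endpoints (illustrative data, not the study's):
        # fixation frequency = fixations per second on an area of interest (AOI),
        # dwell time = total time spent fixating within that AOI.
        fixations = [                      # (AOI label, fixation duration in seconds)
            ("operative_site", 0.45), ("operative_site", 0.60),
            ("sterile_field", 0.30), ("operative_site", 0.52), ("instruments", 0.25),
        ]
        segment_duration_s = 120.0         # length of the analysed procedural segment

        def aoi_metrics(fixations, aoi, segment_duration_s):
            durations = [d for label, d in fixations if label == aoi]
            fixation_frequency = len(durations) / segment_duration_s   # fixations per second
            dwell_time = sum(durations)                                # seconds spent on the AOI
            return fixation_frequency, dwell_time

        print(aoi_metrics(fixations, "operative_site", segment_duration_s))  # ~(0.025, 1.57)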

  6. Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI

    DTIC Science & Technology

    2016-04-01

    but these delays are nearing resolution and we anticipate the initiation of the neuroimaging portion of the study early in Year 3. The fMRI task...resonance imaging (fMRI) and diffusion tensor imaging (DTI) to characterize the extent of functional cortical recruitment and white matter injury...respectively. The inclusion of fMRI and DTI will provide an objective basis for cross-validating the EEG and eye tracking system. Both the EEG and eye

  7. An automatic eye detection and tracking technique for stereo video sequences

    NASA Astrophysics Data System (ADS)

    Paduru, Anirudh; Charalampidis, Dimitrios; Fouts, Brandon; Jovanovich, Kim

    2009-05-01

    Human-computer interfacing (HCI) describes a system or process with which two information processors, namely a human and a computer, attempt to exchange information. Computer-to-human (CtH) information transfer has been relatively effective through visual displays and sound devices. On the other hand, the human-to-computer (HtC) interfacing avenue has yet to reach its full potential. For instance, the most common HtC communication means are the keyboard and mouse, which are already becoming a bottleneck in the effective transfer of information. The solution to the problem is the development of algorithms that allow the computer to understand human intentions based on their facial expressions, head motion patterns, and speech. In this work, we are investigating the feasibility of a stereo system to effectively determine the head position, including the head rotation angles, based on the detection of eye pupils.

  8. A non-invasive method for studying an index of pupil diameter and visual performance in the rhesus monkey.

    PubMed

    Fairhall, Sarah J; Dickson, Carol A; Scott, Leah; Pearce, Peter C

    2006-04-01

    A non-invasive model has been developed to estimate gaze direction and relative pupil diameter, in minimally restrained rhesus monkeys, to investigate the effects of low doses of ocularly administered cholinergic compounds on visual performance. Animals were trained to co-operate with a novel device, which enabled eye movements to be recorded using modified human eye-tracking equipment, and to perform a task which determined visual threshold contrast. Responses were made by gaze transfer under twilight conditions. 4% w/v pilocarpine nitrate was studied to demonstrate the suitability of the model. Pilocarpine induced marked miosis for >3 h which was accompanied by a decrement in task performance. The method obviates the need for invasive surgery and, as the position of point of gaze can be approximately defined, the approach may have utility in other areas of research involving non-human primates.

  9. Glaucoma in modified osteo-odonto-keratoprosthesis eyes: role of additional stage 1A and Ahmed glaucoma drainage device-technique and timing.

    PubMed

    Iyer, Geetha; Srinivasan, Bhaskar; Agarwal, Shweta; Shetty, Roshni; Krishnamoorthy, Sripriya; Balekudaru, Shantha; Vijaya, Lingam

    2015-03-01

    To report the technique, timing, and outcomes of the Ahmed glaucoma drainage device in eyes with the modified osteo-odonto-keratoprosthesis (MOOKP) and the role of an additional stage 1A to the Rome-Vienna protocol. Retrospective interventional case series. Case records of 22 eyes of 20 patients with high intraocular pressure at various stages of the MOOKP procedure performed in 85 eyes of 82 patients were studied. Stage 1A, which includes total iridodialysis, intracapsular cataract extraction, and anterior vitrectomy, was done in all eyes as the primary stage. Seventeen Ahmed glaucoma drainage devices were implanted in 15 eyes of 14 patients (chemical injury in 9 [10 eyes] and Stevens-Johnson syndrome in 5 patients). Implantation was performed during and after stage 1A in 2 and 7 eyes, respectively, after stage 1B+1C in 1 eye, and after stage 2 in 6 eyes. Eleven of 15 eyes (73.3%) remained stable with adequate control of intraocular pressure over a mean follow-up period of 33.68 months (1-90 months). Complications related to the drainage device were hypotony in 1 eye and vitreous block of the tube in 1 eye. It is ideal to place the Ahmed glaucoma drainage device prior to the mucosal graft when the anatomy of the ocular surface is least altered with best outcomes. The technique of placement of the drainage device during the various stages of the MOOKP procedure has been described. The intraocular pressure stabilized in three quarters of the eyes with pre-existing glaucoma. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Frequency analysis of gaze points with CT colonography interpretation using eye gaze tracking system

    NASA Astrophysics Data System (ADS)

    Tsutsumi, Shoko; Tamashiro, Wataru; Sato, Mitsuru; Okajima, Mika; Ogura, Toshihiro; Doi, Kunio

    2017-03-01

    It is important to investigate eye tracking gaze points of experts, in order to assist trainees in understanding of image interpretation process. We investigated gaze points of CT colonography (CTC) interpretation process, and analyzed the difference in gaze points between experts and trainees. In this study, we attempted to understand how trainees can be improved to a level achieved by experts in viewing of CTC. We used an eye gaze point sensing system, Gazefineder (JVCKENWOOD Corporation, Tokyo, Japan), which can detect pupil point and corneal reflection point by the dark pupil eye tracking. This system can provide gaze points images and excel file data. The subjects are radiological technologists who are experienced, and inexperienced in reading CTC. We performed observer studies in reading virtual pathology images and examined observer's image interpretation process using gaze points data. Furthermore, we examined eye tracking frequency analysis by using the Fast Fourier Transform (FFT). We were able to understand the difference in gaze points between experts and trainees by use of the frequency analysis. The result of the trainee had a large amount of both high-frequency components and low-frequency components. In contrast, both components by the expert were relatively low. Regarding the amount of eye movement in every 0.02 second we found that the expert tended to interpret images slowly and calmly. On the other hand, the trainee was moving eyes quickly and also looking for wide areas. We can assess the difference in the gaze points on CTC between experts and trainees by use of the eye gaze point sensing system and based on the frequency analysis. The potential improvements in CTC interpretation for trainees can be evaluated by using gaze points data.
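
    The frequency analysis described above, an FFT of the gaze-displacement series sampled every 0.02 s (50 Hz), can be sketched in a few lines of Python. The synthetic displacement signal and the 2 Hz split between low- and high-frequency bands are assumptions for illustration, not the authors' settings.

        # Illustrative sketch (synthetic data, arbitrary 2 Hz cutoff): FFT of the
        # frame-to-frame gaze displacement sampled every 0.02 s (50 Hz), then compare
        # power in low- vs high-frequency bands as in the described analysis.
        import numpy as np

        fs = 50.0                                      # samples per second (one every 0.02 s)
        t = np.arange(0, 60, 1 / fs)                   # one minute of viewing
        rng = np.random.default_rng(0)
        # Synthetic displacement: slow scanning drift plus fast jittery search movements
        displacement = 2.0 * np.sin(2 * np.pi * 0.3 * t) + 0.5 * rng.standard_normal(t.size)

        spectrum = np.fft.rfft(displacement - displacement.mean())
        freqs = np.fft.rfftfreq(displacement.size, d=1 / fs)
        power = np.abs(spectrum) ** 2

        low_power = power[freqs <= 2.0].sum()          # slow, deliberate eye movements
        high_power = power[freqs > 2.0].sum()          # quick, wide-ranging search movements
        print(f"low/high frequency power ratio: {low_power / high_power:.2f}")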

  11. Effects of Detailed Illustrations on Science Learning: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Lin, Yu Ying; Holmqvist, Kenneth; Miyoshi, Kiyofumi; Ashida, Hiroshi

    2017-01-01

    The eye-tracking method was used to assess the influence of detailed, colorful illustrations on reading behaviors and learning outcomes. Based on participants' subjective ratings in a pre-study, we selected eight one-page human anatomy lessons. In the main study, participants learned these eight human anatomy lessons; four were accompanied by…

  12. Incidental L2 Vocabulary Acquisition "from" and "while" Reading: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Pellicer-Sánchez, Ana

    2016-01-01

    Previous studies have shown that reading is an important source of incidental second language (L2) vocabulary acquisition. However, we still do not have a clear picture of what happens when readers encounter unknown words. Combining offline (vocabulary tests) and online (eye-tracking) measures, the incidental acquisition of vocabulary knowledge…

  13. How Young Children View Mathematical Representations: A Study Using Eye-Tracking Technology

    ERIC Educational Resources Information Center

    Bolden, David; Barmby, Patrick; Raine, Stephanie; Gardner, Matthew

    2015-01-01

    Background: It has been shown that mathematical representations can aid children's understanding of mathematical concepts but that children can sometimes have difficulty in interpreting them correctly. New advances in eye-tracking technology can help in this respect because it allows data to be gathered concerning children's focus of attention and…

  14. Subtitles and Eye Tracking: Reading and Performance

    ERIC Educational Resources Information Center

    Kruger, Jan-Louis; Steyn, Faans

    2014-01-01

    This article presents an experimental study to investigate whether subtitle reading has a positive impact on academic performance. In the absence of reliable indexes of reading behavior in dynamic texts, the article first formulates and validates an index to measure the reading of text, such as subtitles on film. Eye-tracking measures (fixations…

  15. Looking at Movies and Cartoons: Eye-Tracking Evidence from Williams Syndrome and Autism

    ERIC Educational Resources Information Center

    Riby, D.; Hancock, P. J. B.

    2009-01-01

    Background: Autism and Williams syndrome (WS) are neuro-developmental disorders associated with distinct social phenotypes. While individuals with autism show a lack of interest in socially important cues, individuals with WS often show increased interest in socially relevant information. Methods: The current eye-tracking study explores how…

  16. Reconceptualizing Reactivity of Think-Alouds and Eye Tracking: Absence of Evidence Is Not Evidence of Absence

    ERIC Educational Resources Information Center

    Godfroid, Aline; Spino, Le Anne

    2015-01-01

    This study extends previous reactivity research on the cognitive effects of think-alouds to include eye-tracking methodology. Unlike previous studies, we supplemented traditional superiority tests with equivalence tests, because only the latter are conceptually appropriate for demonstrating nonreactivity. Advanced learners of English read short…

  17. Smooth Pursuit in Schizophrenia: A Meta-Analytic Review of Research since 1993

    ERIC Educational Resources Information Center

    O'Driscoll, Gillian A.; Callahan, Brandy L.

    2008-01-01

    Abnormal smooth pursuit eye-tracking is one of the most replicated deficits in the psychophysiological literature in schizophrenia [Levy, D. L., Holzman, P. S., Matthysse, S., & Mendell, N. R. (1993). "Eye tracking dysfunction and schizophrenia: A critical perspective." "Schizophrenia Bulletin, 19", 461-505]. We used meta-analytic procedures to…

  18. 78 FR 40153 - Agency Information Collection Activities; Submission for Office of Management and Budget Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-03

    ... images (Refs. 1 to 4, 7). Data from eye tracking studies can also help improve questionnaire design... response options. Eye tracking data can help to identify the need and strategies for improving the design... product familiarity or personal needs will cause variations in information seeking and that design...

  19. Visual Attention for Solving Multiple-Choice Science Problem: An Eye-Tracking Analysis

    ERIC Educational Resources Information Center

    Tsai, Meng-Jung; Hou, Huei-Tse; Lai, Meng-Lung; Liu, Wan-Yi; Yang, Fang-Ying

    2012-01-01

    This study employed an eye-tracking technique to examine students' visual attention when solving a multiple-choice science problem. Six university students participated in a problem-solving task to predict occurrences of landslide hazards from four images representing four combinations of four factors. Participants' responses and visual attention…

  20. Using Dual Eye-Tracking Measures to Differentiate between Collaboration on Procedural and Conceptual Learning Activities

    ERIC Educational Resources Information Center

    Belenky, Daniel; Ringenberg, Michael; Olsen, Jennifer; Aleven, Vincent; Rummel, Nikol

    2013-01-01

    Dual eye-tracking measures enable novel ways to test predictions about collaborative learning. For example, the research project we are engaging in uses measures of gaze recurrence to help understand how collaboration may differ when students are completing various learning activities focused on different learning objectives. Specifically, we…

  1. Eye-Tracking as a Tool in Process-Oriented Reading Test Validation

    ERIC Educational Resources Information Center

    Solheim, Oddny Judith; Uppstad, Per Henning

    2011-01-01

    The present paper addresses the continuous need for methodological reflection on how to validate inferences made on the basis of test scores. Validation is a process that requires many lines of evidence. In this article we discuss the potential of eye tracking methodology in process-oriented reading test validation. Methodological considerations…

  2. Linguistic Complexity and Information Structure in Korean: Evidence from Eye-Tracking during Reading

    ERIC Educational Resources Information Center

    Lee, Yoonhyoung; Lee, Hanjung; Gordon, Peter C.

    2007-01-01

    The nature of the memory processes that support language comprehension and the manner in which information packaging influences online sentence processing were investigated in three experiments that used eye-tracking during reading to measure the ease of understanding complex sentences in Korean. All three experiments examined reading of embedded…

  3. Eye Tracking and Head Movement Detection: A State-of-Art Survey

    PubMed Central

    2013-01-01

    Eye-gaze detection and tracking have been an active research field in the past years as it adds convenience to a variety of applications. It is considered a significant untraditional method of human computer interaction. Head movement detection has also received researchers' attention and interest as it has been found to be a simple and effective interaction method. Both technologies are considered the easiest alternative interface methods. They serve a wide range of severely disabled people who are left with minimal motor abilities. For both eye tracking and head movement detection, several different approaches have been proposed and used to implement different algorithms for these technologies. Despite the amount of research done on both technologies, researchers are still trying to find robust methods to use effectively in various applications. This paper presents a state-of-art survey for eye tracking and head movement detection methods proposed in the literature. Examples of different fields of applications for both technologies, such as human-computer interaction, driving assistance systems, and assistive technologies are also investigated. PMID:27170851

  4. Measure and Analysis of a Gaze Position Using Infrared Light Technique

    DTIC Science & Technology

    2001-10-25

    Ramdane-Cherif, Z.; Naït-Ali, A.; Motsch, J. F.; Krebs, M. O. (INSERM E 01-17) ...also proposes a method to correct head movements. Keywords: eye movement, gaze tracking, visual scan path, spatial mapping. Eye gaze tracking has been used for clinical purposes to detect illnesses, such as nystagmus, unusual eye movements and many others [1][2][3]. It is also used

  5. Tracking with the mind's eye

    NASA Technical Reports Server (NTRS)

    Krauzlis, R. J.; Stone, L. S.

    1999-01-01

    The two components of voluntary tracking eye-movements in primates, pursuit and saccades, are generally viewed as relatively independent oculomotor subsystems that move the eyes in different ways using independent visual information. Although saccades have long been known to be guided by visual processes related to perception and cognition, only recently have psychophysical and physiological studies provided compelling evidence that pursuit is also guided by such higher-order visual processes, rather than by the raw retinal stimulus. Pursuit and saccades also do not appear to be entirely independent anatomical systems, but involve overlapping neural mechanisms that might be important for coordinating these two types of eye movement during the tracking of a selected visual object. Given that the recovery of objects from real-world images is inherently ambiguous, guiding both pursuit and saccades with perception could represent an explicit strategy for ensuring that these two motor actions are driven by a single visual interpretation.

  6. Eye-Tracking, Autonomic, and Electrophysiological Correlates of Emotional Face Processing in Adolescents with Autism Spectrum Disorder

    PubMed Central

    Wagner, Jennifer B.; Hirsch, Suzanna B.; Vogel-Farley, Vanessa K.; Redcay, Elizabeth; Nelson, Charles A.

    2014-01-01

    Individuals with autism spectrum disorder (ASD) often have difficulty with social-emotional cues. This study examined the neural, behavioral, and autonomic correlates of emotional face processing in adolescents with ASD and typical development (TD) using eye-tracking and event-related potentials (ERPs) across two different paradigms. Scanning of faces was similar across groups in the first task, but the second task found that face-sensitive ERPs varied with emotional expressions only in TD. Further, ASD showed enhanced neural responding to non-social stimuli. In TD only, attention to eyes during eye-tracking related to faster face-sensitive ERPs in a separate task; in ASD, a significant positive association was found between autonomic activity and attention to mouths. Overall, ASD showed an atypical pattern of emotional face processing, with reduced neural differentiation between emotions and a reduced relationship between gaze behavior and neural processing of faces. PMID:22684525

  7. Simultaneous Recordings of Human Microsaccades and Drifts with a Contemporary Video Eye Tracker and the Search Coil Technique

    PubMed Central

    McCamy, Michael B.; Otero-Millan, Jorge; Leigh, R. John; King, Susan A.; Schneider, Rosalyn M.; Macknik, Stephen L.; Martinez-Conde, Susana

    2015-01-01

    Human eyes move continuously, even during visual fixation. These “fixational eye movements” (FEMs) include microsaccades, intersaccadic drift and oculomotor tremor. Research in human FEMs has grown considerably in the last decade, facilitated by the manufacture of noninvasive, high-resolution/speed video-oculography eye trackers. Due to the small magnitude of FEMs, obtaining reliable data can be challenging, however, and depends critically on the sensitivity and precision of the eye tracking system. Yet, no study has conducted an in-depth comparison of human FEM recordings obtained with the search coil (considered the gold standard for measuring microsaccades and drift) and with contemporary, state-of-the art video trackers. Here we measured human microsaccades and drift simultaneously with the search coil and a popular state-of-the-art video tracker. We found that 95% of microsaccades detected with the search coil were also detected with the video tracker, and 95% of microsaccades detected with video tracking were also detected with the search coil, indicating substantial agreement between the two systems. Peak/mean velocities and main sequence slopes of microsaccades detected with video tracking were significantly higher than those of the same microsaccades detected with the search coil, however. Ocular drift was significantly correlated between the two systems, but drift speeds were higher with video tracking than with the search coil. Overall, our combined results suggest that contemporary video tracking now approaches the search coil for measuring FEMs. PMID:26035820

  8. Perceptions of rapport across the life span: Gaze patterns and judgment accuracy.

    PubMed

    Vicaria, Ishabel M; Bernieri, Frank J; Isaacowitz, Derek M

    2015-06-01

    Although age-related deficits in emotion perception have been established using photographs of individuals, the extension of these findings to dynamic displays and dyads is just beginning. Similarly, most eye-tracking research in the person perception literature, including those that study age differences, have focused on individual attributes gleaned from static images; to our knowledge, no previous research has considered cue use in dyadic judgments with eye-tracking. The current study employed a Brunswikian lens model analysis in conjunction with eye-tracking measurements to study age differences in the judgment of rapport, a social construct comprised of mutual attentiveness, positive feelings, and coordination between interacting partners. Judgment accuracy and cue utilization of younger (n = 47) and older (n = 46) adults were operationalized as correlations between a perceiver's judgments and criterion values within a set of 34 brief interaction videos in which 2 opposite sex college students discussed a controversial topic. No age differences emerged in the accuracy of judgments; however, pathways to accuracy differed by age: Younger adults' judgments relied on some behavioral cues more than older adults. In addition, eye-tracking analyses revealed that older adults spent more time looking at the bodies of the targets in the videos, whereas younger adults spent more time looking at the targets' heads. The contributions from both the lens model and eye-tracking findings provide distinct but complementary insights to our understanding of age-related continuities and shifts in social perceptual processing. (c) 2015 APA, all rights reserved.

  9. What triggers catch-up saccades during visual tracking?

    PubMed

    de Brouwer, Sophie; Yuksel, Demet; Blohm, Gunnar; Missal, Marcus; Lefèvre, Philippe

    2002-03-01

    When tracking moving visual stimuli, primates orient their visual axis by combining two kinds of eye movements, smooth pursuit and saccades, that have very different dynamics. Yet, the mechanisms that govern the decision to switch from one type of eye movement to the other are still poorly understood, even though they could bring a significant contribution to the understanding of how the CNS combines different kinds of control strategies to achieve a common motor and sensory goal. In this study, we investigated the oculomotor responses to a large range of different combinations of position error and velocity error during visual tracking of moving stimuli in humans. We found that the oculomotor system uses a prediction of the time at which the eye trajectory will cross the target, defined as the "eye crossing time" (T(XE)). The eye crossing time, which depends on both position error and velocity error, is the criterion used to switch between smooth and saccadic pursuit, i.e., to trigger catch-up saccades. On average, for T(XE) between 40 and 180 ms, no saccade is triggered and target tracking remains purely smooth. Conversely, when T(XE) becomes smaller than 40 ms or larger than 180 ms, a saccade is triggered after a short latency (around 125 ms).
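
    The decision rule reported above can be written down directly: the eye crossing time is the predicted time for the eye, at its current velocity, to cross the target, and a catch-up saccade is triggered when that prediction falls outside roughly 40 to 180 ms. The thresholds below come from the abstract; the sign conventions and example numbers are illustrative assumptions.

        # Hedged sketch of the reported criterion: predict the "eye crossing time" from
        # position error and velocity error, and trigger a catch-up saccade when it falls
        # outside roughly 40-180 ms (thresholds from the abstract; the rest is illustrative).
        def eye_crossing_time(position_error, eye_velocity, target_velocity):
            """position_error = target minus eye position (deg); velocities in deg/s."""
            closing_velocity = eye_velocity - target_velocity        # how fast the gap shrinks
            if abs(closing_velocity) < 1e-6:
                return float("inf")                                   # gap never closes smoothly
            return position_error / closing_velocity                  # seconds (can be negative)

        def needs_catch_up_saccade(t_xe, lower=0.040, upper=0.180):
            return not (lower <= t_xe <= upper)

        t_xe = eye_crossing_time(position_error=2.0, eye_velocity=25.0, target_velocity=10.0)
        print(t_xe, needs_catch_up_saccade(t_xe))   # ~0.133 s -> smooth pursuit continues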

  10. Quantitative Measurement of Eyestrain on 3D Stereoscopic Display Considering the Eye Foveation Model and Edge Information

    PubMed Central

    Heo, Hwan; Lee, Won Oh; Shin, Kwang Yong; Park, Kang Ryoung

    2014-01-01

    We propose a new method for measuring the degree of eyestrain on 3D stereoscopic displays using a glasses-type eye tracking device. Our study is novel in the following four ways: first, the circular area where a user's gaze position exists is defined based on the calculated gaze position and gaze estimation error. Within this circular area, the position where edge strength is maximized can be detected, and we take this position to be the gaze position with a higher probability of being the correct one. Based on this gaze point, the eye foveation model is defined. Second, we quantitatively evaluate the correlation between the degree of eyestrain and the causal factors of visual fatigue, such as the degree of change of stereoscopic disparity (CSD), stereoscopic disparity (SD), frame cancellation effect (FCE), and edge component (EC) of the 3D stereoscopic display using the eye foveation model. Third, by comparing the eyestrain in conventional 3D video and experimental 3D sample video, we analyze the characteristics of eyestrain according to various factors and types of 3D video. Fourth, by comparing the eyestrain with and without compensation for saccadic eye movements in 3D video, we analyze the characteristics of eyestrain according to the types of eye movements in 3D video. Experimental results show that the degree of CSD causes more eyestrain than other factors. PMID:24834910
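
    The first step of the method, refining the estimated gaze point to the position of maximal edge strength within a circle whose radius reflects the gaze-estimation error, can be sketched as follows. The gradient-magnitude edge measure and the synthetic frame are assumptions used only to illustrate the idea.

        # Hedged sketch of the described gaze-refinement step (edge operator and data are
        # assumptions): within a circle around the estimated gaze point whose radius equals
        # the gaze-estimation error, pick the pixel with maximal edge strength.
        import numpy as np

        def refine_gaze(gray, gaze_xy, error_radius_px):
            gy, gx = np.gradient(gray.astype(float))              # central-difference gradients
            edge_strength = np.hypot(gx, gy)

            h, w = gray.shape
            ys, xs = np.mgrid[0:h, 0:w]
            inside = (xs - gaze_xy[0]) ** 2 + (ys - gaze_xy[1]) ** 2 <= error_radius_px ** 2

            masked = np.where(inside, edge_strength, -np.inf)     # ignore pixels outside the circle
            y_best, x_best = np.unravel_index(np.argmax(masked), masked.shape)
            return x_best, y_best                                 # refined gaze position

        rng = np.random.default_rng(1)
        frame = rng.integers(0, 30, size=(480, 640)).astype(float)
        frame[:, 320:] += 100.0                                   # vertical luminance edge at x = 320
        print(refine_gaze(frame, gaze_xy=(310, 240), error_radius_px=25))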

  11. 76 FR 51876 - Medical Devices; Ophthalmic Devices; Classification of the Eyelid Thermal Pulsation System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-19

    ... meibomian gland dysfunction (MGD), also known as evaporative dry eye or lipid deficiency dry eye. The system... evaporative dry eye or lipid deficiency dry eye. The system consists of a component that is inserted around...

  12. 45 Mbps cat's eye modulating retro-reflector link over 7 Km

    NASA Astrophysics Data System (ADS)

    Rabinovich, W. S.; Mahon, R.; Goetz, P. G.; Swingen, L.; Murphy, J.; Ferraro, M.; Burris, R.; Suite, M.; Moore, C. I.; Gilbreath, G. C.; Binari, S.

    2006-09-01

    Modulating retro-reflectors (MRRs) allow free-space optical links with no need for pointing, tracking or a laser on one end of the link. They work by coupling a passive optical retro-reflector with an optical modulator. The most common kind of MRR uses a corner cube retro-reflector. These devices must have a modulator whose active area is as large as the area of the corner cube. This limits the ability to close longer range high speed links because the large aperture needed to return sufficient light implies a large modulator capacitance. To overcome this limitation we developed the concept of a cat's eye MRR. Cat's eye MRRs place the modulator in the focal plane of a lens system designed to passively retro-reflect light. Because the light focuses onto the modulator, a small, low-capacitance modulator can be used with a large optical aperture. However, the position of the focal spot varies with the angle of incidence, so an array of modulators must be placed in the focal plane. In addition, to avoid having to drive all the modulator pixels, an angle-of-arrival sensor must be used. We discuss several cat's eye MRR systems with near diffraction-limited performance and bandwidths of 45 Mbps. We also discuss a link to a cat's eye MRR over a 7 km range.

  13. The feasibility of automated eye tracking with the Early Childhood Vigilance Test of attention in younger HIV-exposed Ugandan children.

    PubMed

    Boivin, Michael J; Weiss, Jonathan; Chhaya, Ronak; Seffren, Victoria; Awadu, Jorem; Sikorskii, Alla; Giordani, Bruno

    2017-07-01

    Tobii eye tracking was compared with webcam-based observer scoring on an animation viewing measure of attention (Early Childhood Vigilance Test; ECVT) to evaluate the feasibility of automating measurement and scoring. Outcomes from both scoring approaches were compared with the Mullen Scales of Early Learning (MSEL), Color-Object Association Test (COAT), and Behavior Rating Inventory of Executive Function for preschool children (BRIEF-P). A total of 44 children aged 44 to 65 months were evaluated with the ECVT, COAT, MSEL, and BRIEF-P. Tobii X2-30 portable infrared cameras were programmed to monitor pupil direction during the ECVT 6-min animation and compared with observer-based PROCODER webcam scoring. Children watched 78% of the cartoon (Tobii) compared with 67% (webcam scoring), although the 2 measures were highly correlated (r = .90, p = .001). It is possible for 2 such measures to be highly correlated even if one is consistently higher than the other (Bergemann et al., 2012). Both ECVT Tobii and webcam ECVT measures significantly correlated with COAT immediate recall (r = .37, p = .02 vs. r = .38, p = .01, respectively) and total recall (r = .33, p = .06 vs. r = .42, p = .005) measures. However, neither the Tobii eye tracking nor PROCODER webcam ECVT measures of attention correlated with MSEL composite cognitive performance or BRIEF-P global executive composite. ECVT scoring using Tobii eye tracking is feasible with at-risk very young African children and consistent with webcam-based scoring approaches in their correspondence to one another and other neurocognitive performance-based measures. By automating measurement and scoring, eye tracking technologies can improve efficiency and help better standardize ECVT testing of attention in younger children. This holds promise for other neurodevelopmental tests where eye movements, tracking, and gaze length can provide important behavioral markers of neuropsychological and neurodevelopmental processes associated with such tests. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  14. Preliminary Experience Using Eye-Tracking Technology to Differentiate Novice and Expert Image Interpretation for Ultrasound-Guided Regional Anesthesia.

    PubMed

    Borg, Lindsay K; Harrison, T Kyle; Kou, Alex; Mariano, Edward R; Udani, Ankeet D; Kim, T Edward; Shum, Cynthia; Howard, Steven K

    2018-02-01

    Objective measures are needed to guide the novice's pathway to expertise. Within and outside medicine, eye tracking has been used for both training and assessment. We designed this study to test the hypothesis that eye tracking may differentiate novices from experts in static image interpretation for ultrasound (US)-guided regional anesthesia. We recruited novice anesthesiology residents and regional anesthesiology experts. Participants wore eye-tracking glasses, were shown 5 sonograms of US-guided regional anesthesia, and were asked a series of anatomy-based questions related to each image while their eye movements were recorded. The answer to each question was a location on the sonogram, defined as the area of interest (AOI). The primary outcome was the total gaze time in the AOI (seconds). Secondary outcomes were the total gaze time outside the AOI (seconds), total time to answer (seconds), and time to first fixation on the AOI (seconds). Five novices and 5 experts completed the study. Although the gaze time (mean ± SD) in the AOI was not different between groups (7 ± 4 seconds for novices and 7 ± 3 seconds for experts; P = .150), the gaze time outside the AOI was greater for novices (75 ± 18 versus 44 ± 4 seconds for experts; P = .005). The total time to answer and total time to first fixation in the AOI were both shorter for experts. Experts in US-guided regional anesthesia take less time to identify sonoanatomy and spend less unfocused time away from a target compared to novices. Eye tracking is a potentially useful tool to differentiate novices from experts in the domain of US image interpretation. © 2017 by the American Institute of Ultrasound in Medicine.

  15. Online webcam-based eye tracking in cognitive science: A first look.

    PubMed

    Semmelmann, Kilian; Weigelt, Sarah

    2018-04-01

    Online experimentation is emerging in many areas of cognitive psychology as a viable alternative or supplement to classical in-lab experimentation. While performance- and reaction-time-based paradigms are covered in recent studies, one instrument of cognitive psychology has not received much attention up to now: eye tracking. In this study, we used JavaScript-based eye tracking algorithms recently made available by Papoutsaki et al. (International Joint Conference on Artificial Intelligence, 2016) together with consumer-grade webcams to investigate the potential of online eye tracking to benefit from the common advantages of online data conduction. We compared three in-lab conducted tasks (fixation, pursuit, and free viewing) with online-acquired data to analyze the spatial precision in the first two, and replicability of well-known gazing patterns in the third task. Our results indicate that in-lab data exhibit an offset of about 172 px (15% of screen size, 3.94° visual angle) in the fixation task, while online data is slightly less accurate (18% of screen size, 207 px), and shows higher variance. The same results were found for the pursuit task with a constant offset during the stimulus movement (211 px in-lab, 216 px online). In the free-viewing task, we were able to replicate the high attention attribution to eyes (28.25%) compared to other key regions like the nose (9.71%) and mouth (4.00%). Overall, we found web technology-based eye tracking to be suitable for all three tasks and are confident that the required hard- and software will be improved continuously for even more sophisticated experimental paradigms in all of cognitive psychology.
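
    For context, a pixel offset such as the reported 172 px converts to degrees of visual angle once the screen geometry and viewing distance are known. The values in the sketch below are assumptions chosen only to land near the reported figure of roughly 3.94 degrees; they are not taken from the study.

        # Minimal sketch: convert a gaze-accuracy offset in pixels to degrees of visual
        # angle. Screen width and viewing distance are assumptions chosen only to land
        # near the reported ~3.94 degrees for a 172 px offset; they are not from the study.
        import math

        def px_to_visual_angle(offset_px, screen_width_px, screen_width_cm, viewing_distance_cm):
            offset_cm = offset_px * screen_width_cm / screen_width_px
            return math.degrees(2 * math.atan(offset_cm / (2 * viewing_distance_cm)))

        angle = px_to_visual_angle(offset_px=172, screen_width_px=1150,
                                   screen_width_cm=28.0, viewing_distance_cm=60.0)
        print(f"{angle:.2f} degrees of visual angle")   # ~4.0 degrees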

  16. Screening for Dyslexia Using Eye Tracking during Reading.

    PubMed

    Nilsson Benfatto, Mattias; Öqvist Seimyr, Gustaf; Ygge, Jan; Pansell, Tony; Rydberg, Agneta; Jacobson, Christer

    2016-01-01

    Dyslexia is a neurodevelopmental reading disability estimated to affect 5-10% of the population. While there is yet no full understanding of the cause of dyslexia, or agreement on its precise definition, it is certain that many individuals suffer persistent problems in learning to read for no apparent reason. Although it is generally agreed that early intervention is the best form of support for children with dyslexia, there is still a lack of efficient and objective means to help identify those at risk during the early years of school. Here we show that it is possible to identify 9-10 year old individuals at risk of persistent reading difficulties by using eye tracking during reading to probe the processes that underlie reading ability. In contrast to current screening methods, which rely on oral or written tests, eye tracking does not depend on the subject to produce some overt verbal response and thus provides a natural means to objectively assess the reading process as it unfolds in real-time. Our study is based on a sample of 97 high-risk subjects with early identified word decoding difficulties and a control group of 88 low-risk subjects. These subjects were selected from a larger population of 2165 school children attending second grade. Using predictive modeling and statistical resampling techniques, we develop classification models from eye tracking records less than one minute in duration and show that the models are able to differentiate high-risk subjects from low-risk subjects with high accuracy. Although dyslexia is fundamentally a language-based learning disability, our results suggest that eye movements in reading can be highly predictive of individual reading ability and that eye tracking can be an efficient means to identify children at risk of long-term reading difficulties.

  17. 21 CFR 878.4440 - Eye pad.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Eye pad. 878.4440 Section 878.4440 Food and Drugs... GENERAL AND PLASTIC SURGERY DEVICES Surgical Devices § 878.4440 Eye pad. (a) Identification. An eye pad is... use as a bandage over the eye for protection or absorption of secretions. (b) Classification. Class I...

  18. 21 CFR 878.4440 - Eye pad.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Eye pad. 878.4440 Section 878.4440 Food and Drugs... GENERAL AND PLASTIC SURGERY DEVICES Surgical Devices § 878.4440 Eye pad. (a) Identification. An eye pad is... use as a bandage over the eye for protection or absorption of secretions. (b) Classification. Class I...

  19. Non-orthogonal tool/flange and robot/world calibration.

    PubMed

    Ernst, Floris; Richter, Lars; Matthäus, Lars; Martens, Volker; Bruder, Ralf; Schlaefer, Alexander; Schweikard, Achim

    2012-12-01

    For many robot-assisted medical applications, it is necessary to accurately compute the relation between the robot's coordinate system and the coordinate system of a localisation or tracking device. Today, this is typically carried out using hand-eye calibration methods like those proposed by Tsai/Lenz or Daniilidis. We present a new method for simultaneous tool/flange and robot/world calibration by estimating a solution to the matrix equation AX = YB. It is computed using a least-squares approach. Because real robots and localisation devices are afflicted by errors, our approach allows for non-orthogonal matrices, partially compensating for imperfect calibration of the robot or localisation device. We also introduce a new method where full robot/world and partial tool/flange calibration is possible by using localisation devices providing fewer than six degrees of freedom (DOFs). The methods are evaluated on simulation data and on real-world measurements from optical and magnetic tracking devices, volumetric ultrasound providing 3-DOF data, and a surface laser scanning device. We compare our methods with two classical approaches: the method by Tsai/Lenz and the method by Daniilidis. In all experiments, the new algorithms outperform the classical methods in terms of translational accuracy by up to 80% and perform similarly in terms of rotational accuracy. Additionally, the methods are shown to be stable: the number of calibration stations used has far less influence on calibration quality than for the classical methods. Our work shows that the new method can be used for estimating the relationship between the robot's and the localisation device's coordinate systems. The new method can also be used for deficient systems providing only 3-DOF data, and it can be employed in real-time scenarios because of its speed. Copyright © 2012 John Wiley & Sons, Ltd.
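
    The core estimation problem, solving AX = YB in a least-squares sense while allowing non-orthogonal matrices, can be illustrated with a standard linear formulation based on Kronecker products and an SVD null-space solve. This is one common way to set up the problem and is not necessarily the authors' exact algorithm; the synthetic data generation below is an assumption used only to check consistency.

        # Hedged sketch of one linear least-squares formulation of A_i X = Y B_i
        # (Kronecker products + SVD null space); the authors' exact algorithm may differ.
        # A_i, B_i, X, Y are 4x4 homogeneous transforms; X, Y may be non-orthogonal.
        import numpy as np

        def solve_axyb(A_list, B_list):
            """Estimate X, Y (up to the homogeneous scale) from A_i X = Y B_i."""
            I4 = np.eye(4)
            rows = []
            for A, B in zip(A_list, B_list):
                # Column-major vec identities: vec(A X) = (I kron A) vec(X), vec(Y B) = (B^T kron I) vec(Y)
                rows.append(np.hstack([np.kron(I4, A), -np.kron(B.T, I4)]))
            M = np.vstack(rows)                      # (16 * n_stations) x 32 system
            _, _, Vt = np.linalg.svd(M)
            v = Vt[-1]                               # right singular vector of the smallest singular value
            X = v[:16].reshape(4, 4, order="F")      # undo column-major vectorisation
            Y = v[16:].reshape(4, 4, order="F")
            return X / X[3, 3], Y / X[3, 3]          # fix the arbitrary scale so X[3, 3] == 1

        def random_rigid(rng):
            Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
            Q *= np.sign(np.linalg.det(Q))           # force a proper rotation (det = +1)
            T = np.eye(4)
            T[:3, :3], T[:3, 3] = Q, rng.normal(size=3)
            return T

        # Synthetic consistency check: build station pairs from known ground-truth X, Y
        rng = np.random.default_rng(0)
        X_true, Y_true = random_rigid(rng), random_rigid(rng)
        A_list = [random_rigid(rng) for _ in range(6)]
        B_list = [np.linalg.inv(Y_true) @ A @ X_true for A in A_list]

        X_est, Y_est = solve_axyb(A_list, B_list)
        print(np.allclose(X_est, X_true, atol=1e-6), np.allclose(Y_est, Y_true, atol=1e-6))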

  20. How visual search relates to visual diagnostic performance: a narrative systematic review of eye-tracking research in radiology.

    PubMed

    van der Gijp, A; Ravesloot, C J; Jarodzka, H; van der Schaaf, M F; van der Schaaf, I C; van Schaik, J P J; Ten Cate, Th J

    2017-08-01

    Eye tracking research has been conducted for decades to gain understanding of visual diagnosis such as in radiology. For educational purposes, it is important to identify visual search patterns that are related to high perceptual performance and to identify effective teaching strategies. This review of eye-tracking literature in the radiology domain aims to identify visual search patterns associated with high perceptual performance. Databases PubMed, EMBASE, ERIC, PsycINFO, Scopus and Web of Science were searched using 'visual perception' OR 'eye tracking' AND 'radiology' and synonyms. Two authors independently screened search results and included eye tracking studies concerning visual skills in radiology published between January 1, 1994 and July 31, 2015. Two authors independently assessed study quality with the Medical Education Research Study Quality Instrument, and extracted study data with respect to design, participant and task characteristics, and variables. A thematic analysis was conducted to extract and arrange study results, and a textual narrative synthesis was applied for data integration and interpretation. The search resulted in 22 relevant full-text articles. Thematic analysis resulted in six themes that informed the relation between visual search and level of expertise: (1) time on task, (2) eye movement characteristics of experts, (3) differences in visual attention, (4) visual search patterns, (5) search patterns in cross sectional stack imaging, and (6) teaching visual search strategies. Expert search was found to be characterized by a global-focal search pattern, which represents an initial global impression, followed by a detailed, focal search-to-find mode. Specific task-related search patterns, like drilling through CT scans and systematic search in chest X-rays, were found to be related to high expert levels. One study investigated teaching of visual search strategies, and did not find a significant effect on perceptual performance. Eye tracking literature in radiology indicates several search patterns are related to high levels of expertise, but teaching novices to search as an expert may not be effective. Experimental research is needed to find out which search strategies can improve image perception in learners.

  1. Home use of binocular dichoptic video content device for treatment of amblyopia: a pilot study.

    PubMed

    Mezad-Koursh, Daphna; Rosenblatt, Amir; Newman, Hadas; Stolovitch, Chaim

    2018-04-01

    To evaluate the efficacy of the BinoVision home system as measured by improvement of visual acuity in the patient's amblyopic eye. An open-label prospective pilot trial of the system was conducted with amblyopic children aged 4-8 years at the pediatric ophthalmology unit, Tel-Aviv Medical Center, January 2014 to October 2015. Participants were assigned to the study or sham group for treatment with BinoVision for 8 or 12 weeks. Patients were instructed to watch animated television shows and videos at home using the BinoVision device for 60 minutes, 6 days a week. The BinoVision program incorporates elements at different contrast and brightness levels for both eyes, weak eye tracking training by superimposed screen images, and weak eye flicker stimuli with alerting sound manipulations. Patients were examined at 4, 8, 12, 24, and 36 weeks. A total of 27 children were recruited (14 boys), with 19 in the treatment group. Median age was 5 years (range, 4-8 years). Mean visual acuity improved by 0.26 logMAR lines in the treatment group from baseline to 12 weeks. Visual acuity was improved compared to baseline during all study and follow-up appointments (P < 0.01), with stabilization of visual acuity after cessation of treatment. The sham group completed 4 weeks of sham protocol with no change in visual acuity (P = 0.285). The average compliance rate was 88% ± 16% (50% to 100%) in the treatment group. This pilot trial of 12 weeks of amblyopia treatment with the BinoVision home system demonstrated significant improvement in patients' visual acuity. Copyright © 2018 American Association for Pediatric Ophthalmology and Strabismus. Published by Elsevier Inc. All rights reserved.

  2. Using eye tracking to identify faking attempts during penile plethysmography assessment.

    PubMed

    Trottier, Dominique; Rouleau, Joanne-Lucine; Renaud, Patrice; Goyette, Mathieu

    2014-01-01

    Penile plethysmography (PPG) is considered the most rigorous method for sexual interest assessment. Nevertheless, it is subject to faking attempts by participants, which compromises the internal validity of the instrument. To date, various attempts have been made to limit voluntary control of sexual response during PPG assessments, without satisfactory results. This exploratory research examined eye-tracking technologies' ability to identify the presence of cognitive strategies responsible for erectile inhibition during PPG assessment. Eye movements and penile responses for 20 subjects were recorded while exploring animated human-like computer-generated stimuli in a virtual environment under three distinct viewing conditions: (a) the free visual exploration of a preferred sexual stimulus without erectile inhibition; (b) the viewing of a preferred sexual stimulus with erectile inhibition; and (c) the free visual exploration of a non-preferred sexual stimulus. Results suggest that attempts to control erectile responses generate specific eye-movement variations, characterized by a general deceleration of the exploration process and limited exploration of the erogenous zone. Findings indicate that recording eye movements can provide significant information on the presence of competing covert processes responsible for erectile inhibition. The use of eye-tracking technologies during PPG could therefore lead to improved internal validity of the plethysmographic procedure.

  3. Development of a novel disposable lid speculum with a drape.

    PubMed

    Urano, Toru; Kasaoka, Masataka; Yamakawa, Ryoji; Tamai, Yukihiko; Nakamura, Shoichiro

    2013-01-01

    To evaluate the clinical use of a newly developed disposable lid speculum with a drape. LiDrape® is a cylindrical device that consists of two flexible rings of polyacetal resin with a transparent elastic silicone sheet attached to the rings. The novel device holds the eyelids between the rings, and a hole in the center of the device provides a surgical field. We used the novel device in cataract surgery (75 eyes), glaucoma surgery (11 eyes), vitrectomy (10 eyes), and intravitreal injection (6 eyes) and evaluated its clinical efficacy. The LiDrape was easy to attach and detach. The novel device did not detach from the eye during surgery. No eyelashes or secretions from the meibomian glands were seen in the surgical field, and the drape provided a sufficient surgical field. The LiDrape functions as a lid speculum as well as a drape. Our results showed that the novel device is useful for ocular surgeries.

  4. Geometry and Gesture-Based Features from Saccadic Eye-Movement as a Biometric in Radiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammond, Tracy; Tourassi, Georgia; Yoon, Hong-Jun

    In this study, we present a novel application of sketch gesture recognition on eye-movement for biometric identification and estimating task expertise. The study was performed for the task of mammographic screening with simultaneous viewing of four coordinated breast views, as typically done in clinical practice. Eye-tracking data and diagnostic decisions collected for 100 mammographic cases (25 normal, 25 benign, 50 malignant) and 10 readers (three board-certified radiologists and seven radiology residents) formed the corpus for this study. Sketch gesture recognition techniques were employed to extract geometric and gesture-based features from saccadic eye-movements. Our results show that saccadic eye-movement, characterized using sketch-based features, results in more accurate models for predicting individual identity and level of expertise than more traditional eye-tracking features.
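
    A hypothetical sketch of the kind of geometric feature extraction described above: a few simple shape descriptors computed from a saccadic gaze path. The specific features (path length, net displacement, straightness, bounding-box area) are illustrative assumptions and are not the authors' actual sketch-based feature set.

      # Geometric features of a gaze path (illustrative assumptions, not the
      # study's actual feature set).
      import numpy as np

      def gaze_path_features(xy):
          """xy: (N, 2) array of gaze samples in screen pixels."""
          xy = np.asarray(xy, dtype=float)
          steps = np.diff(xy, axis=0)                      # sample-to-sample moves
          path_len = np.linalg.norm(steps, axis=1).sum()   # total traversed length
          disp = np.linalg.norm(xy[-1] - xy[0])            # start-to-end displacement
          straightness = disp / path_len if path_len > 0 else 0.0
          bbox_area = np.prod(xy.max(axis=0) - xy.min(axis=0))
          return {"path_len": path_len, "displacement": disp,
                  "straightness": straightness, "bbox_area": bbox_area}

      print(gaze_path_features([[0, 0], [10, 2], [25, 1], [40, 0]]))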

  5. Effects of Device on Video Head Impulse Test (vHIT) Gain.

    PubMed

    Janky, Kristen L; Patterson, Jessie N; Shepard, Neil T; Thomas, Megan L A; Honaker, Julie A

    2017-10-01

    Numerous video head impulse test (vHIT) devices are available commercially; however, gain is not calculated uniformly. An evaluation of these devices/algorithms in healthy controls and patients with vestibular loss is necessary for comparing and synthesizing work that utilizes different devices and gain calculations. The purpose of the present study was to use three commercially available vHIT devices/algorithms to compare: (1) horizontal canal vHIT gain among devices/algorithms in normal control subjects; (2) the effects of age on vHIT gain for each device/algorithm in normal control subjects; and (3) the clinical performance of horizontal canal vHIT gain between devices/algorithms for differentiating normal versus abnormal vestibular function. Prospective. Sixty-one normal control adult subjects (age range 20-78 years) and eleven adults with unilateral or bilateral vestibular loss (age range 32-79 years). vHIT was administered using three different devices/algorithms, randomized in order, for each subject on the same day: (1) Impulse (Otometrics, Schaumburg, IL; monocular eye recording, right eye only; using area under the curve gain), (2) EyeSeeCam (Interacoustics, Denmark; monocular eye recording, left eye only; using instantaneous gain), and (3) VisualEyes (MicroMedical, Chatham, IL; binocular eye recording; using position gain). There was a significant mean difference in vHIT gain among devices/algorithms for both the normal control and vestibular loss groups. vHIT gain was significantly larger in the ipsilateral direction of the eye used to measure gain; however, in spite of the significant mean differences in vHIT gain among devices/algorithms and the significant directional bias, classification of "normal" versus "abnormal" gain was consistent across all compared devices/algorithms, with the exception of instantaneous gain at 40 msec. There was no effect of age on vHIT gain up to 78 years, regardless of the device/algorithm. These findings indicate that vHIT gain differs significantly between devices/algorithms, suggesting that care should be taken when making direct comparisons of absolute gain values between devices/algorithms. American Academy of Audiology
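
    One of the gain definitions named above is area-under-the-curve (AUC) gain. A minimal sketch, under assumed sampling and windowing (not any vendor's actual algorithm), of computing AUC vHIT gain as the integral of eye velocity divided by the integral of head velocity over one impulse:

      import numpy as np

      def auc_vhit_gain(eye_vel, head_vel, fs=250.0):
          """eye_vel, head_vel: 1-D velocity traces (deg/s) covering one head impulse."""
          t = np.arange(len(head_vel)) / fs
          eye_auc = np.trapz(np.abs(eye_vel), t)    # total eye displacement (deg)
          head_auc = np.trapz(np.abs(head_vel), t)  # total head displacement (deg)
          return eye_auc / head_auc

      # Example: an eye response at 90% of head velocity gives a gain of 0.9.
      head = 200.0 * np.sin(np.linspace(0, np.pi, 50))
      print(round(auc_vhit_gain(0.9 * head, head), 2))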

  6. Low cost heads-up virtual reality (HUVR) with optical tracking and haptic feedback

    NASA Astrophysics Data System (ADS)

    Margolis, Todd; DeFanti, Thomas A.; Dawe, Greg; Prudhomme, Andrew; Schulze, Jurgen P.; Cutchin, Steve

    2011-03-01

    Researchers at the University of California, San Diego, have created a new, relatively low-cost augmented reality system that enables users to touch the virtual environment they are immersed in. The Heads-Up Virtual Reality device (HUVR) couples a consumer 3D HD flat-screen TV with a half-silvered mirror to project any graphic image onto the user's hands and into the space surrounding them. With his or her head position optically tracked to generate the correct perspective view, the user maneuvers a force-feedback (haptic) device to interact with the 3D image, literally 'touching' the object's angles and contours as if it were a tangible physical object. HUVR can be used for training and education in structural and mechanical engineering, archaeology and medicine, as well as other tasks that require hand-eye coordination. A distinctive characteristic of HUVR is that users can place their hands inside the virtual environment without occluding the 3D image. Built using open-source software and consumer-level hardware, HUVR offers users a tactile experience in an immersive environment that is functional, affordable and scalable.

  7. Tracking C. elegans and its neuromuscular activity using NemaFlex

    NASA Astrophysics Data System (ADS)

    van Bussel, Frank; Rahman, Mizanur; Hewitt, Jennifer; Blawzdziewicz, Jerzy; Driscoll, Monica; Szewczyk, Nathaniel; Vanapalli, Siva

    Recently, a novel platform has been developed for studying the behavior and physical characteristics of the nematode C. elegans: NemaFlex, developed by the Vanapalli group at Texas Tech University to analyze the movement and muscular strength of crawling C. elegans. NemaFlex is a microfluidic device consisting of an array of deformable PDMS pillars with which the C. elegans interacts in the course of moving through the system. Deflection measurements then allow us to calculate the force exerted by the worm via Euler-Bernoulli beam theory. For the procedure to be fully automated, fairly sophisticated analysis software has to be developed in tandem with the physical device. In particular, the usefulness of the force calculations is highly dependent on the accuracy and volume of the deflection measurements, which would be prohibitively time-consuming if carried out by hand and eye. To correlate the force results with muscle activations, the C. elegans itself has to be tracked simultaneously, and pillar deflections must be precisely associated with mechanical contact on the worm's body. Here we will outline the data processing and analysis routines that have been implemented in order to automate the calculation of these forces and muscular activations.
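
    A hedged sketch of the Euler-Bernoulli step mentioned above: treating a pillar as a cantilever with a point load at its free end and converting a measured deflection into a force. The tip-load formula F = 3EI*delta/L^3 and the example material and geometry values are illustrative assumptions, not NemaFlex's exact calibration.

      import math

      def pillar_force(deflection_m, E_pa, diameter_m, length_m):
          """Point load at the free end of a circular cantilever (Euler-Bernoulli)."""
          I = math.pi * diameter_m**4 / 64.0  # second moment of area of a circular section
          return 3.0 * E_pa * I * deflection_m / length_m**3

      # Example: 40-um-diameter, 90-um-tall PDMS pillar (E ~ 1 MPa), 5 um deflection.
      force = pillar_force(5e-6, 1e6, 40e-6, 90e-6)
      print(f"{force * 1e6:.2f} uN")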

  8. Keeping an eye on pain: investigating visual attention biases in individuals with chronic pain using eye-tracking methodology

    PubMed Central

    Fashler, Samantha R; Katz, Joel

    2016-01-01

    Attentional biases to painful stimuli are evident in individuals with chronic pain, although the directional tendency of these biases (ie, toward or away from threat-related stimuli) remains unclear. This study used eye-tracking technology, a measure of visual attention, to evaluate the attentional patterns of individuals with and without chronic pain during exposure to injury-related and neutral pictures. Individuals with (N=51) and without chronic pain (N=62) completed a dot-probe task using injury-related and neutral pictures while their eye movements were recorded. Mixed-design analysis of variance evaluated the interaction between group (chronic pain, pain-free) and picture type (injury-related, neutral). Reaction time results showed that regardless of chronic pain status, participants responded faster to trials with neutral stimuli in comparison to trials that included injury-related pictures. Eye-tracking measures showed within-group differences whereby injury-related pictures received more frequent fixations and visits, as well as longer average visit durations. Between-group differences showed that individuals with chronic pain had fewer fixations and shorter average visit durations for all stimuli. An examination of how biases change over the time-course of stimulus presentation showed that during the late phase of attention, individuals with chronic pain had longer average gaze durations on injury pictures relative to pain-free individuals. The results show the advantage of incorporating eye-tracking methodology when examining attentional biases, and suggest future avenues of research. PMID:27570461

  9. Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays.

    PubMed

    Padmanaban, Nitish; Konrad, Robert; Stramer, Tal; Cooper, Emily A; Wetzstein, Gordon

    2017-02-28

    From the desktop to the laptop to the mobile device, personal computing platforms evolve over time. Moving forward, wearable computing is widely expected to be integral to consumer electronics and beyond. The primary interface between a wearable computer and a user is often a near-eye display. However, current generation near-eye displays suffer from multiple limitations: they are unable to provide fully natural visual cues and comfortable viewing experiences for all users. At their core, many of the issues with near-eye displays are caused by limitations in conventional optics. Current displays cannot reproduce the changes in focus that accompany natural vision, and they cannot support users with uncorrected refractive errors. With two prototype near-eye displays, we show how these issues can be overcome using display modes that adapt to the user via computational optics. By using focus-tunable lenses, mechanically actuated displays, and mobile gaze-tracking technology, these displays can be tailored to correct common refractive errors and provide natural focus cues by dynamically updating the system based on where a user looks in a virtual scene. Indeed, the opportunities afforded by recent advances in computational optics open up the possibility of creating a computing platform in which some users may experience better quality vision in the virtual world than in the real one.

  10. Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays

    NASA Astrophysics Data System (ADS)

    Padmanaban, Nitish; Konrad, Robert; Stramer, Tal; Cooper, Emily A.; Wetzstein, Gordon

    2017-02-01

    From the desktop to the laptop to the mobile device, personal computing platforms evolve over time. Moving forward, wearable computing is widely expected to be integral to consumer electronics and beyond. The primary interface between a wearable computer and a user is often a near-eye display. However, current generation near-eye displays suffer from multiple limitations: they are unable to provide fully natural visual cues and comfortable viewing experiences for all users. At their core, many of the issues with near-eye displays are caused by limitations in conventional optics. Current displays cannot reproduce the changes in focus that accompany natural vision, and they cannot support users with uncorrected refractive errors. With two prototype near-eye displays, we show how these issues can be overcome using display modes that adapt to the user via computational optics. By using focus-tunable lenses, mechanically actuated displays, and mobile gaze-tracking technology, these displays can be tailored to correct common refractive errors and provide natural focus cues by dynamically updating the system based on where a user looks in a virtual scene. Indeed, the opportunities afforded by recent advances in computational optics open up the possibility of creating a computing platform in which some users may experience better quality vision in the virtual world than in the real one.

  11. Using Eye Tracking to Investigate Semantic and Spatial Representations of Scientific Diagrams during Text-Diagram Integration

    ERIC Educational Resources Information Center

    Jian, Yu-Cin; Wu, Chao-Jung

    2015-01-01

    We investigated strategies used by readers when reading a science article with a diagram and assessed whether semantic and spatial representations were constructed while reading the diagram. Seventy-one undergraduate participants read a scientific article while tracking their eye movements and then completed a reading comprehension test. Our…

  12. Sight-Reading Expertise: Cross-Modality Integration Investigated Using Eye Tracking

    ERIC Educational Resources Information Center

    Drai-Zerbib, Veronique; Baccino, Thierry; Bigand, Emmanuel

    2012-01-01

    It is often said that experienced musicians are capable of hearing what they read (and vice versa). This suggests that they are able to process and to integrate multimodal information. The present study investigates this issue with an eye-tracking technique. Two groups of musicians chosen on the basis of their level of expertise (experts,…

  13. Effects of Different Multimedia Presentations on Viewers' Information-Processing Activities Measured by Eye-Tracking Technology

    ERIC Educational Resources Information Center

    Chuang, Hsueh-Hua; Liu, Han-Chin

    2012-01-01

    This study implemented eye-tracking technology to understand the impact of different multimedia instructional materials, i.e., five successive pages versus a single page with the same amount of information, on information-processing activities in 21 non-science-major college students. The findings showed that students demonstrated the same number…

  14. Morphosyntactic Development in a Second Language: An Eye-Tracking Study on the Role of Attention

    ERIC Educational Resources Information Center

    Issa, Bernard Ibrahim, II

    2015-01-01

    One common claim in second language (L2) acquisition research is that attention is crucial for development to occur. Although previous empirical research supports this claim, methodological approaches have not been able to directly measure attention. This thesis utilized eye-tracking to directly measure attention and thus provide converging…

  15. Hidden Communicative Competence: Case Study Evidence Using Eye-Tracking and Video Analysis

    ERIC Educational Resources Information Center

    Grayson, Andrew; Emerson, Anne; Howard-Jones, Patricia; O'Neil, Lynne

    2012-01-01

    A facilitated communication (FC) user with an autism spectrum disorder produced sophisticated texts by pointing, with physical support, to letters on a letterboard while their eyes were tracked and while their pointing movements were video recorded. This FC user has virtually no independent means of expression, and is held to have no literacy…

  16. An Eye-Tracking Investigation of Written Sarcasm Comprehension: The Roles of Familiarity and Context

    ERIC Educational Resources Information Center

    Țurcan, Alexandra; Filik, Ruth

    2016-01-01

    This article addresses a current theoretical debate between the standard pragmatic model, the graded salience hypothesis, and the implicit display theory, by investigating the roles of the context and of the properties of the sarcastic utterance itself in the comprehension of a sarcastic remark. Two eye-tracking experiments were conducted where we…

  17. Using Eye Tracking to Understand the Responses of Learners to Vocabulary Learning Strategy Instruction and Use

    ERIC Educational Resources Information Center

    Liu, Pei-Lin

    2014-01-01

    This study examined the influence of morphological instruction in an eye-tracking English vocabulary recognition task. Sixty-eight freshmen enrolled in an English course and received either traditional or morphological instruction for learning English vocabulary. The experimental part of the study was conducted over two-hour class periods for…

  18. Eye-Tracking Verification of the Strategy Used to Analyse Algorithms Expressed in a Flowchart and Pseudocode

    ERIC Educational Resources Information Center

    Andrzejewska, Magdalena; Stolinska, Anna; Blasiak, Wladyslaw; Peczkowski, Pawel; Rosiek, Roman; Rozek, Bozena; Sajka, Miroslawa; Wcislo, Dariusz

    2016-01-01

    The results of qualitative and quantitative investigations conducted with individuals who learned algorithms in school are presented in this article. In these investigations, eye-tracking technology was used to follow the process of solving algorithmic problems. The algorithmic problems were presented in two comparable variants: in a pseudocode…

  19. Basic Number Processing Deficits in Developmental Dyscalculia: Evidence from Eye Tracking

    ERIC Educational Resources Information Center

    Moeller, K.; Neuburger, S.; Kaufmann, L.; Landerl, K.; Nuerk, H. C.

    2009-01-01

    Recent research suggests that developmental dyscalculia is associated with a subitizing deficit (i.e., the inability to quickly enumerate small sets of up to 3 objects). However, the nature of this deficit has not previously been investigated. In the present study the eye-tracking methodology was employed to clarify whether (a) the subitizing…

  20. Eye-Tracking, Autonomic, and Electrophysiological Correlates of Emotional Face Processing in Adolescents with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Wagner, Jennifer B.; Hirsch, Suzanna B.; Vogel-Farley, Vanessa K.; Redcay, Elizabeth; Nelson, Charles A.

    2013-01-01

    Individuals with autism spectrum disorder (ASD) often have difficulty with social-emotional cues. This study examined the neural, behavioral, and autonomic correlates of emotional face processing in adolescents with ASD and typical development (TD) using eye-tracking and event-related potentials (ERPs) across two different paradigms. Scanning of…

  1. Theories of Spoken Word Recognition Deficits in Aphasia: Evidence from Eye-Tracking and Computational Modeling

    ERIC Educational Resources Information Center

    Mirman, Daniel; Yee, Eiling; Blumstein, Sheila E.; Magnuson, James S.

    2011-01-01

    We used eye-tracking to investigate lexical processing in aphasic participants by examining the fixation time course for rhyme (e.g., "carrot-parrot") and cohort (e.g., "beaker-beetle") competitors. Broca's aphasic participants exhibited larger rhyme competition effects than age-matched controls. A re-analysis of previously reported data (Yee,…

  2. Target Selection by the Frontal Cortex during Coordinated Saccadic and Smooth Pursuit Eye Movements

    ERIC Educational Resources Information Center

    Srihasam, Krishna; Bullock, Daniel; Grossberg, Stephen

    2009-01-01

    Oculomotor tracking of moving objects is an important component of visually based cognition and planning. Such tracking is achieved by a combination of saccades and smooth-pursuit eye movements. In particular, the saccadic and smooth-pursuit systems interact to often choose the same target, and to maximize its visibility through time. How do…

  3. 78 FR 71621 - Agency Information Collection Activities; Proposed Collection; Comment Request; Eye Tracking...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-29

    ... notice. This notice solicits comments on research entitled, "Eye Tracking Study of Direct-to-Consumer... the FDA to conduct research relating to health information. Section 1003(d)(2)(C) of the Federal Food, Drug, and Cosmetic Act (the FD&C Act) (21 U.S.C. 393(b)(2)(c)) authorizes FDA to conduct research...

  4. Visual Processing of Faces in Individuals with Fragile X Syndrome: An Eye Tracking Study

    ERIC Educational Resources Information Center

    Farzin, Faraz; Rivera, Susan M.; Hessl, David

    2009-01-01

    Gaze avoidance is a hallmark behavioral feature of fragile X syndrome (FXS), but little is known about whether abnormalities in the visual processing of faces, including disrupted autonomic reactivity, may underlie this behavior. Eye tracking was used to record fixations and pupil diameter while adolescents and young adults with FXS and sex- and…

  5. Factors Influencing the Use of Captions by Foreign Language Learners: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Winke, Paula; Gass, Susan; Sydorenko, Tetyana

    2013-01-01

    This study investigates caption-reading behavior by foreign language (L2) learners and, through eye-tracking methodology, explores the extent to which the relationship between the native and target language affects that behavior. Second-year (4th semester) English-speaking learners of Arabic, Chinese, Russian, and Spanish watched 2 videos…

  6. Do Faces Capture the Attention of Individuals with Williams Syndrome or Autism? Evidence from Tracking Eye Movements

    ERIC Educational Resources Information Center

    Riby, Deborah M.; Hancock, Peter J. B.

    2009-01-01

    The neuro-developmental disorders of Williams syndrome (WS) and autism can reveal key components of social cognition. Eye-tracking techniques were applied in two tasks exploring attention to pictures containing faces. Images were (i) scrambled pictures containing faces or (ii) pictures of scenes with embedded faces. Compared to individuals who…

  7. The Influences of Static and Interactive Dynamic Facial Stimuli on Visual Strategies in Persons with Asperger Syndrome

    ERIC Educational Resources Information Center

    Falkmer, Marita; Bjallmark, Anna; Larsson, Matilda; Falkmer, Torbjorn

    2011-01-01

    Several studies, using eye tracking methodology, suggest that different visual strategies in persons with autism spectrum conditions, compared with controls, are applied when viewing facial stimuli. Most eye tracking studies are, however, made in laboratory settings with either static (photos) or non-interactive dynamic stimuli, such as video…

  8. Investigating the Effect of Complexity Factors in Stoichiometry Problems Using Logistic Regression and Eye Tracking

    ERIC Educational Resources Information Center

    Tang, Hui; Kirk, John; Pienta, Norbert J.

    2014-01-01

    This paper includes two experiments, one investigating complexity factors in stoichiometry word problems, and the other identifying students' problem-solving protocols by using eye-tracking technology. The word problems used in this study had five different complexity factors, which were randomly assigned by a Web-based tool that we developed. The…

  9. Similarity-Based Interference during Language Comprehension: Evidence from Eye Tracking during Reading

    ERIC Educational Resources Information Center

    Gordon, Peter C.; Hendrick, Randall; Johnson, Marcus; Lee, Yoonhyoung

    2006-01-01

    The nature of working memory operation during complex sentence comprehension was studied by means of eye-tracking methodology. Readers had difficulty when the syntax of a sentence required them to hold 2 similar noun phrases (NPs) in working memory before syntactically and semantically integrating either of the NPs with a verb. In sentence …

  10. Eye-Tracking Study on Facial Emotion Recognition Tasks in Individuals with High-Functioning Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Tsang, Vicky

    2018-01-01

    The eye-tracking experiment was carried out to assess fixation duration and scan paths that individuals with and without high-functioning autism spectrum disorders employed when identifying simple and complex emotions. Participants viewed human photos of facial expressions and decided on the identification of emotion, the negative-positive emotion…

  11. Intentional Response Distortion on Personality Tests: Using Eye-Tracking to Understand Response Processes when Faking

    ERIC Educational Resources Information Center

    van Hooft, Edwin A. J.; Born, Marise Ph.

    2012-01-01

    Intentional response distortion or faking among job applicants completing measures such as personality and integrity tests is a concern in personnel selection. The present study aimed to investigate whether eye-tracking technology can improve our understanding of the response process when faking. In an experimental within-participants design, a…

  12. Automatic Classification of Users' Health Information Need Context: Logistic Regression Analysis of Mouse-Click and Eye-Tracker Data.

    PubMed

    Pian, Wenjing; Khoo, Christopher Sg; Chi, Jianxing

    2017-12-21

    Users searching for health information on the Internet may be searching for their own health issue, searching for someone else's health issue, or browsing with no particular health issue in mind. Previous research has found that these three categories of users focus on different types of health information. However, most health information websites provide static content for all users. If the three types of user health information need contexts can be identified by the Web application, the search results or information offered to the user can be customized to increase its relevance or usefulness to the user. The aim of this study was to investigate the possibility of identifying the three user health information contexts (searching for self, searching for others, or browsing with no particular health issue in mind) using just hyperlink clicking behavior; using eye-tracking information; and using a combination of eye-tracking, demographic, and urgency information. Predictive models were developed using multinomial logistic regression. A total of 74 participants (39 females and 35 males), who were mainly staff and students of a university, were asked to browse a health discussion forum, Healthboards.com. An eye tracker recorded their examining (eye fixation) and skimming (quick eye movement) behaviors on 2 types of screens: a summary result screen displaying a list of post headers, and a detailed post screen. The following three types of predictive models were developed using logistic regression analysis: model 1 used only the time spent in scanning the summary result screen and reading the detailed post screen, which can be determined from the user's mouse clicks; model 2 used the examining and skimming durations on each screen, recorded by an eye tracker; and model 3 added user demographic and urgency information to model 2. An analysis of variance (ANOVA) found that users' browsing durations were significantly different for the three health information contexts (P<.001). The logistic regression model 3 was able to predict the user's type of health information context with a 10-fold cross-validation mean accuracy of 84% (62/74), followed by model 2 at 73% (54/74) and model 1 at 71% (52/74). In addition, correlation analysis found that particular browsing durations were highly correlated with users' age, education level, and the urgency of their information need. A user's type of health information need context (ie, searching for self, for others, or with no health issue in mind) can be identified with reasonable accuracy using just user mouse clicks, which can easily be detected by Web applications. Higher accuracy can be obtained using Google Glass or future computing devices with eye-tracking functionality. ©Wenjing Pian, Christopher SG Khoo, Jianxing Chi. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 21.12.2017.
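
    A minimal sketch of the modeling approach described for model 1: multinomial logistic regression predicting the health-information context from two browsing durations. The toy data, feature choice, and use of scikit-learn are illustrative assumptions; the study's actual features and fitted coefficients are not reproduced here.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Columns: [seconds on summary result screen, seconds on detailed post screen]
      X = np.array([[12.0, 95.0], [15.0, 80.0], [40.0, 30.0],
                    [35.0, 25.0], [20.0, 10.0], [25.0, 12.0]])
      # 0 = searching for self, 1 = searching for others, 2 = browsing with no issue
      y = np.array([0, 0, 1, 1, 2, 2])

      # With more than two classes, the default lbfgs solver fits a multinomial model.
      clf = LogisticRegression(max_iter=1000).fit(X, y)
      print(clf.predict([[14.0, 90.0]]))  # -> predicted context label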

  13. Covert enaction at work: Recording the continuous movements of visuospatial attention to visible or imagined targets by means of Steady-State Visual Evoked Potentials (SSVEPs).

    PubMed

    Gregori Grgič, Regina; Calore, Enrico; de'Sperati, Claudio

    2016-01-01

    Whereas overt visuospatial attention is customarily measured with eye tracking, covert attention is assessed by various methods. Here we exploited Steady-State Visual Evoked Potentials (SSVEPs) - the oscillatory responses of the visual cortex to incoming flickering stimuli - to record the movements of covert visuospatial attention in a way operatively similar to eye tracking (attention tracking), which allowed us to compare motion observation and motion extrapolation with and without eye movements. Observers fixated a central dot and covertly tracked a target oscillating horizontally and sinusoidally. In the background, the left and the right halves of the screen flickered at two different frequencies, generating two SSVEPs in occipital regions whose size varied reciprocally as observers attended to the moving target. The two signals were combined into a single quantity that was modulated at the target frequency in a quasi-sinusoidal way, often clearly visible in single trials. The modulation continued almost unchanged when the target was switched off and observers mentally extrapolated its motion in imagery, and also when observers pointed their finger at the moving target during covert tracking, or imagined doing so. The amplitude of modulation during covert tracking was ∼25-30% of that measured when observers followed the target with their eyes. We used 4 electrodes in parieto-occipital areas, but similar results were achieved with a single electrode in Oz. In a second experiment we tested ramp and step motion. During overt tracking, SSVEPs were remarkably accurate, showing both saccadic-like and smooth pursuit-like modulations of cortical responsiveness, although the modulation deteriorated during covert tracking. Covert tracking was better with sinusoidal motion than with ramp motion, and better with moving targets than stationary ones. The clear modulation of cortical responsiveness recorded during both overt and covert tracking, identical for motion observation and motion extrapolation, suggests including covert attention movements in enactive theories of mental imagery. Copyright © 2015 Elsevier Ltd. All rights reserved.
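
    A hedged sketch of the "combined into a single quantity" step: estimate occipital EEG power at the two background flicker frequencies in short sliding windows and form a normalized left-right difference, which should oscillate with the covertly tracked target. The window length, frequencies, and normalized-difference formula are assumptions for illustration, not the authors' exact analysis.

      import numpy as np

      def attention_index(eeg, fs, f_left, f_right, win_s=0.5, step_s=0.1):
          """eeg: 1-D occipital signal; returns (times, index in [-1, 1])."""
          win, step = int(win_s * fs), int(step_s * fs)
          times, idx = [], []
          for start in range(0, len(eeg) - win, step):
              seg = eeg[start:start + win] * np.hanning(win)   # windowed segment
              freqs = np.fft.rfftfreq(win, 1.0 / fs)
              power = np.abs(np.fft.rfft(seg)) ** 2
              p_l = power[np.argmin(np.abs(freqs - f_left))]   # left-hemifield SSVEP power
              p_r = power[np.argmin(np.abs(freqs - f_right))]  # right-hemifield SSVEP power
              idx.append((p_r - p_l) / (p_r + p_l))            # lateralization index
              times.append((start + win / 2) / fs)
          return np.array(times), np.array(idx)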

  14. Context effects on smooth pursuit and manual interception of a disappearing target.

    PubMed

    Kreyenmeier, Philipp; Fooken, Jolande; Spering, Miriam

    2017-07-01

    In our natural environment, we interact with moving objects that are surrounded by richly textured, dynamic visual contexts. Yet most laboratory studies on vision and movement show visual objects in front of uniform gray backgrounds. Context effects on eye movements have been widely studied, but it is less well known how visual contexts affect hand movements. Here we ask whether eye and hand movements integrate motion signals from target and context similarly or differently, and whether context effects on eye and hand change over time. We developed a track-intercept task requiring participants to track the initial launch of a moving object ("ball") with smooth pursuit eye movements. The ball disappeared after a brief presentation, and participants had to intercept it in a designated "hit zone." In two experiments (n = 18 human observers each), the ball was shown in front of a uniform or a textured background that either was stationary or moved along with the target. Eye and hand movement latencies and speeds were similarly affected by the visual context, but eye and hand interception (eye position at time of interception, and hand interception timing error) did not differ significantly between context conditions. Eye and hand interception timing errors were strongly correlated on a trial-by-trial basis across all context conditions, highlighting the close relation between these responses in manual interception tasks. Our results indicate that visual contexts similarly affect eye and hand movements but that these effects may be short-lasting, affecting movement trajectories more than movement end points. NEW & NOTEWORTHY In a novel track-intercept paradigm, human observers tracked a briefly shown object moving across a textured, dynamic context and intercepted it with their finger after it had disappeared. Context motion significantly affected eye and hand movement latency and speed, but not interception accuracy; eye and hand position at interception were correlated on a trial-by-trial basis. Visual context effects may be short-lasting, affecting movement trajectories more than movement end points. Copyright © 2017 the American Physiological Society.

  15. Implications of comorbidity for genetic studies of bipolar disorder: P300 and eye tracking as biological markers for illness.

    PubMed

    Blackwood, D H; Sharp, C W; Walker, M T; Doody, G A; Glabus, M F; Muir, W J

    1996-06-01

    In large families with affective illness, identification of a biological variable is needed that reflects brain dysfunction at an earlier point than symptom development. Eye movement disorder, a possible vulnerability marker in schizophrenia, is less clearly associated with affective illness, although a subgroup of affective disorders shows smooth-pursuit eye movement disorder. The auditory P300 event-related potential may be a useful marker for risk to schizophrenia, but a role in bipolar illness is less certain. The distribution of these two biological variables and their association with symptoms in two multiply affected bipolar families is described. In a single, five-generation family identified for linkage studies through two bipolar I (BPI) probands, 128 members (including 20 spouses) were interviewed. The 108 related individuals had diagnoses of BPI (7), bipolar II (2), cyclothymia (3), or major depressive disorder (19). Eight others had generalised anxiety (1), minor depression (5), intermittent depression (1), or alcoholism (1). Sixty-nine subjects had no psychiatric diagnosis. P300 latency (81) and eye tracking (71) were recorded from a subgroup of relatives within the pedigree. Eye tracking was abnormal in 11 of 71 relatives (15.5%) and was bimodally distributed. In these 11 relatives, clinical diagnoses included minor depression (1), alcoholism (1) and generalised anxiety disorder (1). P300 latency was normally distributed and did not differ from controls. In a second family in which five of seven siblings have BPI illness, P300 latency and eye movement disorder were found in affected relatives and in some unaffected offspring. In these large families, clinical diagnoses of general anxiety, alcoholism and minor depression, when associated with eye tracking abnormality, may be considered alternative clinical manifestations of the same trait that in other relatives is expressed as bipolar illness.

  16. Eye Protection in Kansas Schools.

    ERIC Educational Resources Information Center

    Hay, Kenneth M.; And Others

    A law passed by a state legislature requires that students in industrial arts shops and science laboratories must wear eye protective devices. Explanatory material presents the text of the bill and guidelines for implementation, including--(1) types of eye hazards, (2) types of protective devices, (3) administrating eye safety equipment, (4)…

  17. Corneal seal device

    NASA Technical Reports Server (NTRS)

    Baehr, E. F. (Inventor)

    1977-01-01

    A corneal seal device is provided which, when placed in an incision in the eye, permits the insertion of a surgical tool or instrument through the device into the eye. The device includes a seal chamber which opens into a tube which is adapted to be sutured to the eye and serves as an entry passage for a tool. A sealable aperture in the chamber permits passage of the tool through the chamber into the tube and hence into the eye. The chamber includes inlet ports adapted to be connected to a regulated source of irrigation fluid which provides a safe intraocular pressure.

  18. Getting Inside the Expert's Head: An Analysis of Physician Cognitive Processes During Trauma Resuscitations.

    PubMed

    White, Matthew R; Braund, Heather; Howes, Daniel; Egan, Rylan; Gegenfurtner, Andreas; van Merrienboer, Jeroen J G; Szulewski, Adam

    2018-04-23

    Crisis resource management skills are integral to leading the resuscitation of a critically ill patient. Despite their importance, crisis resource management skills (and their associated cognitive processes) have traditionally been difficult to study in the real world. The objective of this study was to derive key cognitive processes underpinning expert performance in resuscitation medicine, using a new eye-tracking-based video capture method during clinical cases. During an 18-month period, a sample of 10 trauma resuscitations led by 4 expert trauma team leaders was analyzed. The physician team leaders were outfitted with mobile eye-tracking glasses for each case. After each resuscitation, participants were debriefed with a modified cognitive task analysis, based on a cued-recall protocol, augmented by viewing their own first-person perspective eye-tracking video from the clinical encounter. Eye-tracking technology was successfully applied as a tool to aid in the qualitative analysis of expert performance in a clinical setting. All participants stated that using these methods helped uncover previously unconscious aspects of their cognition. Overall, 5 major themes were derived from the interviews: logistic awareness, managing uncertainty, visual fixation behaviors, selective attendance to information, and anticipatory behaviors. The novel approach of cognitive task analysis augmented by eye tracking allowed the derivation of 5 unique cognitive processes underpinning expert performance in leading a resuscitation. An understanding of these cognitive processes has the potential to enhance educational methods and to create new assessment modalities of these previously tacit aspects of expertise in this field. Copyright © 2018 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  19. Combining user logging with eye tracking for interactive and dynamic applications.

    PubMed

    Ooms, Kristien; Coltekin, Arzu; De Maeyer, Philippe; Dupont, Lien; Fabrikant, Sara; Incoul, Annelies; Kuhn, Matthias; Slabbinck, Hendrik; Vansteenkiste, Pieter; Van der Haegen, Lise

    2015-12-01

    User evaluations of interactive and dynamic applications face various challenges related to the active nature of these displays. For example, users can often zoom and pan on digital products, and these interactions cause changes in the extent and/or level of detail of the stimulus. Therefore, in eye tracking studies, when a user's gaze is at a particular screen position (gaze position) over a period of time, the information contained in this particular position may have changed. Such digital activities are commonplace in modern life, yet it has been difficult to automatically compare the changing information at the viewed position, especially across many participants. Existing solutions typically involve tedious and time-consuming manual work. In this article, we propose a methodology that can overcome this problem. By combining eye tracking and user logging (mouse and keyboard actions) for cartographic products, we are able to accurately reference screen coordinates to geographic coordinates. This referencing approach allows researchers to know which geographic object (location or attribute) corresponds to the gaze coordinates at all times. We tested the proposed approach through two case studies, and discuss the advantages and disadvantages of the applied methodology. Furthermore, the applicability of the proposed approach is discussed with respect to other fields of research that use eye tracking, namely marketing, sports and movement sciences, and experimental psychology. From these case studies and discussions, we conclude that combining eye tracking and user-logging data is an essential step forward in efficiently studying user behavior with interactive and static stimuli in multiple research fields.
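
    A minimal sketch, under simplifying assumptions, of the referencing idea described above: use the logged pan/zoom state (the geographic bounds currently on screen) to convert a gaze position in screen pixels into map coordinates. A simple linear (equirectangular) mapping is assumed for illustration; a real web map would apply the appropriate projection.

      def gaze_to_geo(px, py, screen_w, screen_h, bounds):
          """bounds: (min_lon, min_lat, max_lon, max_lat) of the current viewport."""
          min_lon, min_lat, max_lon, max_lat = bounds
          lon = min_lon + (px / screen_w) * (max_lon - min_lon)
          lat = max_lat - (py / screen_h) * (max_lat - min_lat)  # screen y grows downward
          return lon, lat

      # Example: gaze at the screen centre maps to the viewport centre.
      print(gaze_to_geo(960, 540, 1920, 1080, (3.0, 50.6, 4.0, 51.4)))  # (3.5, 51.0)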

  20. Driver eye-scanning behavior at intersections at night.

    DOT National Transportation Integrated Search

    2009-10-01

    This research project analyzed drivers' eye-scanning behavior at night when approaching signalized and unsignalized intersections, using data from a head-mounted eye-tracking system during open-road driving on a prescribed route. During the ...

  1. A model that integrates eye velocity commands to keep track of smooth eye displacements.

    PubMed

    Blohm, Gunnar; Optican, Lance M; Lefèvre, Philippe

    2006-08-01

    Previous studies have reported conflicting findings on the oculomotor system's ability to keep track of smooth eye movements in darkness. Whereas some results indicate that saccades cannot compensate for smooth eye displacements, others report that memory-guided saccades during smooth pursuit are spatially correct. Recently, it was shown that the amount of time before the saccade made a difference: short-latency saccades were retinotopically coded, whereas long-latency saccades were spatially coded. Here, we propose a model of the saccadic system that can explain the available experimental data. The novel part of this model consists of a delayed integration of efferent smooth eye velocity commands. Two alternative physiologically realistic neural mechanisms for this integration stage are proposed. Model simulations accurately reproduced prior findings. Thus, this model reconciles the earlier contradictory reports from the literature about compensation for smooth eye movements before saccades because it involves a slow integration process.
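
    A toy numerical sketch of the model's core idea, delayed integration of efferent smooth eye velocity commands into a smooth eye displacement signal. The delay value and the discrete-time formulation are illustrative assumptions, not the paper's fitted model.

      import numpy as np

      def integrated_displacement(eye_vel, fs=1000.0, delay_s=0.09):
          """eye_vel: efferent smooth eye velocity command (deg/s), sampled at fs (Hz)."""
          delay_n = int(delay_s * fs)
          delayed = np.concatenate([np.zeros(delay_n), eye_vel[:len(eye_vel) - delay_n]])
          return np.cumsum(delayed) / fs  # running integral -> displacement (deg)

      # Example: 0.5 s of 10 deg/s pursuit. Only the part of the command older than
      # the delay has been integrated, so the estimate lags the true 5 deg displacement.
      disp = integrated_displacement(np.full(500, 10.0))
      print(round(disp[-1], 2))  # ~4.1 deg with a 90 ms delay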

  2. High-resolution eye tracking using V1 neuron activity

    PubMed Central

    McFarland, James M.; Bondy, Adrian G.; Cumming, Bruce G.; Butts, Daniel A.

    2014-01-01

    Studies of high-acuity visual cortical processing have been limited by the inability to track eye position with sufficient accuracy to precisely reconstruct the visual stimulus on the retina. As a result, studies on primary visual cortex (V1) have been performed almost entirely on neurons outside the high-resolution central portion of the visual field (the fovea). Here we describe a procedure for inferring eye position using multi-electrode array recordings from V1 coupled with nonlinear stimulus processing models. We show that this method can be used to infer eye position with one arc-minute accuracy – significantly better than conventional techniques. This allows for analysis of foveal stimulus processing, and provides a means to correct for eye-movement induced biases present even outside the fovea. This method could thus reveal critical insights into the role of eye movements in cortical coding, as well as their contribution to measures of cortical variability. PMID:25197783

  3. Novel shadowless imaging for eyes-like diagnosis in vivo

    NASA Astrophysics Data System (ADS)

    Xue, Ning; Jiang, Kai; Li, Qi; Zhang, Lili; Ma, Li; Huang, Guoliang

    2016-10-01

    Eyes-like diagnosis is a traditional Chinese medicine method used for many diseases, such as chronic gastritis, diabetes, and hypertension. There is a close relationship between the viscera and the appearance of the eye: the white of the eye is divided into fourteen sections corresponding to different viscera, so the eye's appearance reflects the status of the viscera; in other words, it is an epitome of visceral health. In this paper, we developed a novel shadowless imaging technology and system for eyes-like diagnosis in vivo, consisting of an optical shadowless imaging device for capturing and saving images of patients' eyes and a computer linked to the device for image processing. A character-matching algorithm was developed to extract the characteristics of the white of the eye in the corresponding sections of the images taken by the optical shadowless imaging device; from these characteristics, the presence of visceral disease could be inferred. A series of assays was carried out, and the results verified the feasibility of the eyes-like diagnosis technique.

  4. Experiencing Light's Properties within Your Own Eye

    ERIC Educational Resources Information Center

    Mauser, Michael

    2011-01-01

    Seeing the reflection, refraction, dispersion, absorption, polarization, and scattering or diffraction of light within your own eye makes these properties of light truly personal. There are practical aspects of these within-the-eye phenomena, such as eye tracking for computer interfaces. They also offer some intriguing diversions, for example,…

  5. An Eye Tracking Comparison of External Pointing Cues and Internal Continuous Cues in Learning with Complex Animations

    ERIC Educational Resources Information Center

    Boucheix, Jean-Michel; Lowe, Richard K.

    2010-01-01

    Two experiments used eye tracking to investigate a novel cueing approach for directing learner attention to low salience, high relevance aspects of a complex animation. In the first experiment, comprehension of a piano mechanism animation containing spreading-colour cues was compared with comprehension obtained with arrow cues or no cues. Eye…

  6. Sensitivity to Speaker Control in the Online Comprehension of Conditional Tips and Promises: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Stewart, Andrew J.; Haigh, Matthew; Ferguson, Heather J.

    2013-01-01

    Statements of the form if… then… can be used to communicate conditional speech acts such as tips and promises. Conditional promises require the speaker to have perceived control over the outcome event, whereas conditional tips do not. In an eye-tracking study, we examined whether readers are sensitive to information about perceived speaker control…

  7. An Exploration of the Use of Eye-Gaze Tracking to Study Problem-Solving on Standardized Science Assessments

    ERIC Educational Resources Information Center

    Tai, Robert H.; Loehr, John F.; Brigham, Frederick J.

    2006-01-01

    This pilot study investigated the capacity of eye-gaze tracking to identify differences in problem-solving behaviours within a group of individuals who possessed varying degrees of knowledge and expertise in three disciplines of science (biology, chemistry and physics). The six participants, all pre-service science teachers, completed an 18-item…

  8. An Eye-Tracking Study of Learning from Science Text with Concrete and Abstract Illustrations

    ERIC Educational Resources Information Center

    Mason, Lucia; Pluchino, Patrik; Tornatora, Maria Caterina; Ariasi, Nicola

    2013-01-01

    This study investigated the online process of reading and the offline learning from an illustrated science text. The authors examined the effects of using a concrete or abstract picture to illustrate a text and adopted eye-tracking methodology to trace text and picture processing. They randomly assigned 59 eleventh-grade students to 3 reading…

  9. Peer Assessment of Webpage Design: Behavioral Sequential Analysis Based on Eye-Tracking Evidence

    ERIC Educational Resources Information Center

    Hsu, Ting-Chia; Chang, Shao-Chen; Liu, Nan-Cen

    2018-01-01

    This study employed an eye-tracking machine to record the process of peer assessment. Each web page was divided into several regions of interest (ROIs) based on the frame design and content. A total of 49 undergraduate students with a visual learning style participated in the experiment. This study investigated the peer assessment attitudes of the…

  10. Expertise Differences in the Comprehension of Visualizations: A Meta-Analysis of Eye-Tracking Research in Professional Domains

    ERIC Educational Resources Information Center

    Gegenfurtner, Andreas; Lehtinen, Erno; Saljo, Roger

    2011-01-01

    This meta-analysis integrates 296 effect sizes reported in eye-tracking research on expertise differences in the comprehension of visualizations. Three theories were evaluated: Ericsson and Kintsch's ("Psychol Rev" 102:211-245, 1995) theory of long-term working memory, Haider and Frensch's ("J Exp Psychol Learn Mem Cognit" 25:172-190, 1999)…

  11. Procedural Learning and Associative Memory Mechanisms Contribute to Contextual Cueing: Evidence from fMRI and Eye-Tracking

    ERIC Educational Resources Information Center

    Manelis, Anna; Reder, Lynne M.

    2012-01-01

    Using a combination of eye tracking and fMRI in a contextual cueing task, we explored the mechanisms underlying the facilitation of visual search for repeated spatial configurations. When configurations of distractors were repeated, greater activation in the right hippocampus corresponded to greater reductions in the number of saccades to locate…

  12. An Eye Tracking Investigation of Attentional Biases towards Affect in Young Children

    ERIC Educational Resources Information Center

    Burris, Jessica L.; Barry-Anwar, Ryan A.; Rivera, Susan M.

    2017-01-01

    This study examines attentional biases in the presence of angry, happy and neutral faces using a modified eye tracking version of the dot probe task (DPT). Participants were 111 young children between 9 and 48 months. Children passively viewed an affective attention bias task that consisted of a face pairing (neutral paired with either neutral,…

  13. Where Low and High Inference Data Converge: Validation of CLASS Assessment of Mathematics Instruction Using Mobile Eye Tracking with Expert and Novice Teachers

    ERIC Educational Resources Information Center

    Cortina, Kai S.; Miller, Kevin F.; McKenzie, Ryan; Epstein, Alanna

    2015-01-01

    Classroom observation research and research on teacher expertise are similar in their reliance on observational data with high-inference procedure to assess the quality of instruction. Expertise research usually uses low-inference measures like eye tracking to identify qualitative difference between expert and novice behaviors and cognition. In…

  14. A psychotechnological review on eye-tracking systems: towards user experience.

    PubMed

    Mele, Maria Laura; Federici, Stefano

    2012-07-01

    The aim of the present work is to present a critical review of the international literature on eye-tracking technologies by focusing on those features that characterize them as 'psychotechnologies'. A critical literature review was conducted through the main psychology, engineering, and computer sciences databases by following specific inclusion and exclusion criteria. A total of 46 matches from 1998 to 2010 were selected for content analysis. Results have been divided into four broad thematic areas. We found that, although there is growing attention to end users, most of the studies reviewed in this work fall short of adopting holistic human-computer interaction models that include both the individual differences and the needs of users. The user is often considered only as a measurement object for the functioning of the technological system and not as a real alter ego in the intrasystemic interaction. To fully benefit from the communicative functions of gaze, research on eye tracking must emphasize user experience. Eye-tracking systems will become an effective assistive technology for integration, adaptation, and neutralization of environmental barriers only when a holistic model is applied to both the design process and the assessment of the functional components of the interaction.

  15. A first approach to a neuropsychological screening tool using eye-tracking for bedside cognitive testing based on the Edinburgh Cognitive and Behavioural ALS Screen.

    PubMed

    Keller, Jürgen; Krimly, Amon; Bauer, Lisa; Schulenburg, Sarah; Böhm, Sarah; Aho-Özhan, Helena E A; Uttner, Ingo; Gorges, Martin; Kassubek, Jan; Pinkhardt, Elmar H; Abrahams, Sharon; Ludolph, Albert C; Lulé, Dorothée

    2017-08-01

    Reliable assessment of cognitive functions is a challenging task in amyotrophic lateral sclerosis (ALS) patients unable to speak and write. We therefore present an eye-tracking-based neuropsychological screening tool derived from the Edinburgh Cognitive and Behavioural ALS Screen (ECAS), a standard screening instrument for cognitive deficits in ALS. In total, 46 ALS patients and 50 healthy controls matched for age, gender and education were tested with an oculomotor-based and a standard paper-and-pencil version of the ECAS. Significant correlations between the two versions were observed for ALS patients and healthy controls in the ECAS total score and in all of its ALS-specific domains (all r > 0.3; all p < 0.05). The eye-tracking version of the ECAS reliably distinguished between ALS patients and healthy controls in the ECAS total score (p < 0.05). Also, cognitively impaired and non-impaired patients could be reliably distinguished with a specificity of 95%. This study provides initial evidence that the eye-tracking-based ECAS version is a promising approach for assessing cognitive deficits in ALS patients who are unable to speak or write.

  16. Ambient-Light-Canceling Camera Using Subtraction of Frames

    NASA Technical Reports Server (NTRS)

    Morookian, John Michael

    2004-01-01

    The ambient-light-canceling camera (ALCC) is a proposed near-infrared electronic camera that would utilize a combination of (1) synchronized illumination during alternate frame periods and (2) subtraction of readouts from consecutive frames to obtain images without a background component of ambient light. The ALCC is intended especially for use in tracking the motion of an eye by the pupil center corneal reflection (PCCR) method. Eye tracking by the PCCR method has shown potential for application in human-computer interaction for people with and without disabilities, and for noninvasive monitoring, detection, and even diagnosis of physiological and neurological deficiencies. In the PCCR method, an eye is illuminated by near-infrared light from a light-emitting diode (LED). Some of the infrared light is reflected from the surface of the cornea. Some of the infrared light enters the eye through the pupil and is reflected from the back of the eye out through the pupil, a phenomenon commonly observed as the red-eye effect in flash photography. An electronic camera is oriented to image the user's eye. The output of the camera is digitized and processed by algorithms that locate the two reflections. Then, from the locations of the centers of the two reflections, the direction of gaze is computed. As described thus far, the PCCR method is susceptible to errors caused by reflections of ambient light. Although a near-infrared band-pass optical filter can be used to discriminate against ambient light, some sources of ambient light have enough in-band power to compete with the LED signal. The mode of operation of the ALCC would complement or supplant spectral filtering by providing more nearly complete cancellation of the effect of ambient light. In the operation of the ALCC, a near-infrared LED would be pulsed on during one camera frame period and off during the next frame period. Thus, the scene would be illuminated by both the LED (signal) light and the ambient (background) light during one frame period, and would be illuminated with only ambient (background) light during the next frame period. The camera output would be digitized and sent to a computer, wherein the pixel values of the background-only frame would be subtracted from the pixel values of the signal-plus-background frame to obtain signal-only pixel values (see figure). To prevent artifacts of motion from entering the images, it would be necessary to acquire image data at a rate greater than the standard video rate of 30 frames per second. For this purpose, the ALCC would exploit a novel control technique developed at NASA's Jet Propulsion Laboratory for advanced charge-coupled-device (CCD) cameras. This technique provides for readout from a subwindow [region of interest (ROI)] within the image frame. Because the desired reflections from the eye would typically occupy a small fraction of the area within the image frame, the ROI capability would make it possible to acquire and subtract pixel values at rates of several hundred frames per second, considerably greater than the standard video rate and sufficient both to (1) suppress motion artifacts and (2) track the motion of the eye between consecutive subtractive frame pairs.
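
    A minimal sketch of the frame-subtraction step described above: subtract the background-only (LED-off) frame from the signal-plus-background (LED-on) frame so that only the LED-illuminated reflections remain. The unsigned 8-bit frames and the clipping strategy are illustrative assumptions.

      import numpy as np

      def ambient_cancel(frame_led_on, frame_led_off):
          """Both frames: 2-D uint8 arrays from consecutive exposures of the same ROI."""
          diff = frame_led_on.astype(np.int16) - frame_led_off.astype(np.int16)
          return np.clip(diff, 0, 255).astype(np.uint8)  # keep only the LED signal

      # Example: the LED reflection (120 counts above background) survives;
      # the ambient-only pixels cancel to zero.
      on = np.array([[200, 80], [80, 80]], dtype=np.uint8)
      off = np.array([[80, 80], [80, 80]], dtype=np.uint8)
      print(ambient_cancel(on, off))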

  17. Comparison of corneal power, astigmatism, and wavefront aberration measurements obtained by a point-source color light-emitting diode-based topographer, a Placido-disk topographer, and a combined Placido and dual Scheimpflug device.

    PubMed

    Ventura, Bruna V; Wang, Li; Ali, Shazia F; Koch, Douglas D; Weikert, Mitchell P

    2015-08-01

    To evaluate and compare the performance of a point-source color light-emitting diode (LED)-based topographer (color-LED) in measuring anterior corneal power and aberrations with that of a Placido-disk topographer and a combined Placido and dual Scheimpflug device. Cullen Eye Institute, Baylor College of Medicine, Houston, Texas USA. Retrospective observational case series. Normal eyes and post-refractive-surgery eyes were consecutively measured using color-LED, Placido, and dual-Scheimpflug devices. The main outcome measures were anterior corneal power, astigmatism, and higher-order aberrations (HOAs) (6.0 mm pupil), which were compared using the t test. There were no statistically significant differences in corneal power measurements in normal and post-refractive surgery eyes and in astigmatism magnitude in post-refractive surgery eyes between the color-LED device and Placido or dual Scheimpflug devices (all P > .05). In normal eyes, there were no statistically significant differences in 3rd-order coma and 4th-order spherical aberration between the color-LED and Placido devices and in HOA root mean square, 3rd-order coma, 3rd-order trefoil, 4th-order spherical aberration, and 4th-order secondary astigmatism between the color-LED and dual Scheimpflug devices (all P > .05). In post-refractive surgery eyes, the color-LED device agreed with the Placido and dual-Scheimpflug devices regarding 3rd-order coma and 4th-order spherical aberration (all P > .05). In normal and post-refractive surgery eyes, all 3 devices were comparable with respect to corneal power. The agreement in corneal aberrations varied. Drs. Wang, Koch, and Weikert are consultants to Ziemer Ophthalmic Systems AG. Dr. Koch is a consultant to Abbott Medical Optics, Inc., Alcon Surgical, Inc., and i-Optics Corp. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  18. Validation of a Behavioral Approach for Measuring Saccades in Parkinson's Disease.

    PubMed

    Turner, Travis H; Renfroe, Jenna B; Duppstadt-Delambo, Amy; Hinson, Vanessa K

    2017-01-01

    Speed and control of saccades are related to disease progression and cognitive functioning in Parkinson's disease (PD). Traditional eye-tracking complexities encumber application for individual evaluations and clinical trials. The authors examined psychometric properties of standalone tasks for reflexive prosaccade latency, volitional saccade initiation, and saccade inhibition (antisaccade) in a heterogeneous sample of 65 PD patients. Demographics had minimal impact on task performance. Thirty-day test-retest reliability estimates for behavioral tasks were acceptable and similar to traditional eye tracking. Behavioral tasks demonstrated concurrent validity with traditional eye-tracking measures; discriminant validity was less clear. Saccade initiation and inhibition discriminated PD patients with cognitive impairment. The present findings support further development and use of the behavioral tasks for assessing latency and control of saccades in PD.

  19. Temporal eye movement strategies during naturalistic viewing

    PubMed Central

    Wang, Helena X.; Freeman, Jeremy; Merriam, Elisha P.; Hasson, Uri; Heeger, David J.

    2011-01-01

    The deployment of eye movements to complex spatiotemporal stimuli likely involves a variety of cognitive factors. However, eye movements to movies are surprisingly reliable both within and across observers. We exploited and manipulated that reliability to characterize observers’ temporal viewing strategies. Introducing cuts and scrambling the temporal order of the resulting clips systematically changed eye movement reliability. We developed a computational model that exhibited this behavior and provided an excellent fit to the measured eye movement reliability. The model assumed that observers searched for, found, and tracked a point-of-interest, and that this process reset when there was a cut. The model did not require that eye movements depend on temporal context in any other way, and it managed to describe eye movements consistently across different observers and two movie sequences. Thus, we found no evidence for the integration of information over long time scales (greater than a second). The results are consistent with the idea that observers employ a simple tracking strategy even while viewing complex, engaging naturalistic stimuli. PMID:22262911

  20. [Virtual reality in ophthalmological education].

    PubMed

    Wagner, C; Schill, M; Hennen, M; Männer, R; Jendritza, B; Knorz, M C; Bender, H J

    2001-04-01

    We present a computer-based medical training workstation for the simulation of intraocular eye surgery. The surgeon manipulates two original instruments inside a mechanical model of the eye. The instrument positions are tracked by CCD cameras and monitored by a PC, which renders the scene using a computer-graphic model of the eye and the instruments. The simulator incorporates a model of the operating table, a mechanical eye, three CCD cameras for the position tracking, the stereo display, and a computer. The three cameras are mounted under the operating table, from where they can observe the interior of the mechanical eye. Using small markers, the cameras recognize the instruments and the eye; their position and orientation in space are determined by stereoscopic back projection. The simulation runs at more than 20 frames per second and provides a realistic impression of the surgery. It includes the cold light source, which can be moved inside the eye, and the shadow of the instruments on the retina, which is important for navigational purposes.
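    The stereoscopic back projection mentioned above amounts to triangulating each marker from two (or three) calibrated camera views. A minimal Python sketch using OpenCV's cv2.triangulatePoints is shown below; the intrinsic matrix, camera pose, and pixel coordinates are placeholder values, not the simulator's actual calibration.

        import numpy as np
        import cv2

        # Assumed intrinsics shared by both cameras (focal length, principal point).
        K = np.array([[800.0, 0.0, 320.0],
                      [0.0, 800.0, 240.0],
                      [0.0, 0.0, 1.0]])
        # Second camera: small rotation about the vertical axis, 50 mm baseline.
        R, _ = cv2.Rodrigues(np.array([0.0, 0.2, 0.0]))
        t = np.array([[-50.0], [0.0], [0.0]])
        P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])   # 3x4 projection matrices
        P2 = K @ np.hstack([R, t])

        # Pixel coordinates of the same instrument marker in both views (2xN arrays).
        pts1 = np.array([[320.0], [240.0]])
        pts2 = np.array([[401.3], [240.0]])

        # Triangulate and convert from homogeneous to Euclidean coordinates.
        X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
        X = (X_h[:3] / X_h[3]).ravel()
        print("marker position in the camera-1 frame (mm):", X)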

  1. Analysis of Eye Movements and Linguistic Boundaries in a Text for the Investigation of Japanese Reading Processes

    NASA Astrophysics Data System (ADS)

    Tera, Akemi; Shirai, Kiyoaki; Yuizono, Takaya; Sugiyama, Kozo

    In order to investigate reading processes of Japanese language learners, we have conducted an experiment to record eye movements during Japanese text reading using an eye-tracking system. We showed that Japanese native speakers use “forward and backward jumping eye movements” frequently[13],[14]. In this paper, we further analyzed the same eye-tracking data. Our goal is to examine whether Japanese learners fixate at boundaries of linguistic units such as words, phrases or clauses when they start or end “backward jumping”. We consider conventional linguistic boundaries as well as boundaries empirically defined based on the entropy of an N-gram model. Another goal is to examine the relation between the entropy of the N-gram model and the depth of syntactic structures of sentences. Our analysis shows that (1) Japanese learners often fix their eyes at linguistic boundaries, and (2) the average entropy is greatest at the fifth depth of syntactic structures.
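    The entropy referred to above can be estimated, for each possible boundary position, as the Shannon entropy of the next-token distribution of an N-gram model; high entropy suggests a likely unit boundary. The Python sketch below shows the bigram case on a toy whitespace-tokenized text, which stands in for properly segmented Japanese; it is only an illustration of the measure, not the authors' code.

        import math
        from collections import Counter, defaultdict

        def bigram_entropies(tokens):
            """Shannon entropy (bits) of the next-token distribution after each
            token, estimated from maximum-likelihood bigram counts."""
            following = defaultdict(Counter)
            for prev, nxt in zip(tokens, tokens[1:]):
                following[prev][nxt] += 1
            entropies = {}
            for prev, counter in following.items():
                total = sum(counter.values())
                entropies[prev] = -sum((c / total) * math.log2(c / total)
                                       for c in counter.values())
            return entropies

        # Toy example; a real experiment would use segmented Japanese text.
        tokens = "the cat sat on the mat while the cat ran".split()
        for token, h in sorted(bigram_entropies(tokens).items()):
            print(f"H(next | {token!r}) = {h:.2f} bits")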

  2. An evaluation of eye tracking technology in the assessment of 12 lead electrocardiography interpretation.

    PubMed

    Breen, Cathal J; Bond, Raymond; Finlay, Dewar

    2014-01-01

    This study investigated the use of eye-tracking technology in teaching 12-lead electrocardiography interpretation to Healthcare Scientist students. Participants (n=33) interpreted ten 12-lead ECG recordings and were randomized to receive objective individual appraisal of their efforts either in a traditional didactic format or via eye-tracker software. One hundred percent of participants rated the experience positively for improving their ECG interpretation competency. ECG analysis time ranged between 13.2 and 59.5s. The rhythm strip was the most commonly studied lead and was fixated on for the longest duration (mean 9.9s). Lead I was studied for the shortest duration (mean 0.25s). Feedback using eye-tracking data during ECG interpretation did not produce any significant difference between the assessment marks of the study and control groups (p=0.32). Although the hypothesis of this study was rejected, active teaching and early feedback practices are recommended within this discipline. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Simulation of eye-tracker latency, spot size, and ablation pulse depth on the correction of higher order wavefront aberrations with scanning spot laser systems.

    PubMed

    Bueeler, Michael; Mrochen, Michael

    2005-01-01

    The aim of this theoretical work was to investigate the robustness of scanning spot laser treatments with different laser spot diameters and peak ablation depths in case of incomplete compensation of eye movements due to eye-tracker latency. Scanning spot corrections of 3rd- to 5th-order Zernike wavefront errors were numerically simulated. Measured eye-movement data were used to calculate the positioning error of each laser shot assuming eye-tracker latencies of 0, 5, 30, and 100 ms, and for the case of no eye tracking. The single spot ablation depth ranged from 0.25 to 1.0 microm and the spot diameter from 250 to 1000 microm. The quality of the ablation was rated by the postoperative surface variance and the Strehl intensity ratio, which was calculated after a low-pass filter was applied to simulate epithelial surface smoothing. Treatments performed with nearly ideal eye tracking (latency approximately 0) provide the best results with a small laser spot (0.25 mm) and a small ablation depth per pulse (0.25 microm). However, combinations of a large spot diameter (1000 microm) and a small ablation depth per pulse (0.25 microm) yield better results for latencies above a certain threshold, which has to be determined specifically. Treatments performed with tracker latencies on the order of 100 ms yield results similar to treatments done completely without eye-movement compensation. CONCLUSIONS: Reducing the spot diameter was shown to make the correction more susceptible to eye-movement-induced error. A smaller spot size is only beneficial when eye movement is neutralized with a tracking system with a latency <5 ms.
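    The core of such a simulation is the shot-placement error introduced by latency: each shot is aimed with an eye position that is one latency interval old, so the residual error is the eye displacement over that interval. The Python sketch below illustrates this on a synthetic 1 kHz eye-movement trace with the latencies listed above; the trace, sampling rate, and noise level are illustrative assumptions, not the authors' data or code.

        import numpy as np

        def rms_placement_error(eye_xy, fs_hz, latency_ms):
            """RMS spot-placement error (same units as eye_xy) when each shot is
            aimed using an eye position that is latency_ms old."""
            lag = int(round(latency_ms * 1e-3 * fs_hz))
            if lag == 0:
                return 0.0
            current = eye_xy[lag:]        # true eye position at shot time
            delayed = eye_xy[:-lag]       # position reported by the tracker
            err = np.linalg.norm(current - delayed, axis=1)
            return float(np.sqrt(np.mean(err ** 2)))

        # Synthetic 1 kHz trace: slow drift plus tremor, in micrometers.
        rng = np.random.default_rng(0)
        t = np.arange(0.0, 5.0, 1e-3)
        eye = np.stack([50 * np.sin(2 * np.pi * 0.3 * t),
                        50 * np.cos(2 * np.pi * 0.2 * t)], axis=1)
        eye += rng.normal(0.0, 5.0, eye.shape)

        for latency in (0, 5, 30, 100):
            print(f"latency {latency:3d} ms -> RMS error "
                  f"{rms_placement_error(eye, 1000, latency):.1f} um")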

  4. Binocular eye movement control and motion perception: what is being tracked?

    PubMed

    van der Steen, Johannes; Dits, Joyce

    2012-10-19

    We investigated under what conditions humans can make independent slow phase eye movements. The ability to make independent movements of the two eyes generally is attributed to few specialized lateral eyed animal species, for example chameleons. In our study, we showed that humans also can move the eyes in different directions. To maintain binocular retinal correspondence independent slow phase movements of each eye are produced. We used the scleral search coil method to measure binocular eye movements in response to dichoptically viewed visual stimuli oscillating in orthogonal direction. Correlated stimuli led to orthogonal slow eye movements, while the binocularly perceived motion was the vector sum of the motion presented to each eye. The importance of binocular fusion on independency of the movements of the two eyes was investigated with anti-correlated stimuli. The perceived global motion pattern of anti-correlated dichoptic stimuli was perceived as an oblique oscillatory motion, as well as resulted in a conjugate oblique motion of the eyes. We propose that the ability to make independent slow phase eye movements in humans is used to maintain binocular retinal correspondence. Eye-of-origin and binocular information are used during the processing of binocular visual information, and it is decided at an early stage whether binocular or monocular motion information and independent slow phase eye movements of each eye are produced during binocular tracking.

  5. Binocular Eye Movement Control and Motion Perception: What Is Being Tracked?

    PubMed Central

    van der Steen, Johannes; Dits, Joyce

    2012-01-01

    Purpose. We investigated under what conditions humans can make independent slow phase eye movements. The ability to make independent movements of the two eyes generally is attributed to few specialized lateral eyed animal species, for example chameleons. In our study, we showed that humans also can move the eyes in different directions. To maintain binocular retinal correspondence independent slow phase movements of each eye are produced. Methods. We used the scleral search coil method to measure binocular eye movements in response to dichoptically viewed visual stimuli oscillating in orthogonal direction. Results. Correlated stimuli led to orthogonal slow eye movements, while the binocularly perceived motion was the vector sum of the motion presented to each eye. The importance of binocular fusion on independency of the movements of the two eyes was investigated with anti-correlated stimuli. The perceived global motion pattern of anti-correlated dichoptic stimuli was perceived as an oblique oscillatory motion, as well as resulted in a conjugate oblique motion of the eyes. Conclusions. We propose that the ability to make independent slow phase eye movements in humans is used to maintain binocular retinal correspondence. Eye-of-origin and binocular information are used during the processing of binocular visual information, and it is decided at an early stage whether binocular or monocular motion information and independent slow phase eye movements of each eye are produced during binocular tracking. PMID:22997286

  6. Face landmark point tracking using LK pyramid optical flow

    NASA Astrophysics Data System (ADS)

    Zhang, Gang; Tang, Sikan; Li, Jiaquan

    2018-04-01

    LK pyramid optical flow is an effective method for tracking objects in video; in this paper it is applied to face landmark point tracking. The landmark points considered are the outer and inner corners of the left eye, the inner and outer corners of the right eye, the tip of the nose, and the left and right corners of the mouth. The landmark points are marked by hand in the first frame, and tracking performance is analyzed for the subsequent frames. Two kinds of conditions are considered: single factors, such as the normalized case, pose variation with slow motion, expression variation, illumination variation, occlusion, frontal face with rapid motion, and posed face with rapid motion; and combinations of factors, such as pose and illumination variation, pose and expression variation, pose variation and occlusion, illumination and expression variation, and expression variation and occlusion. Global and local measures are introduced to evaluate tracking performance under the individual factors and their combinations. The global measures comprise the number of images aligned successfully, the average alignment error, and the number of images aligned before failure; the local measures comprise the number of images aligned successfully for each facial component and the average alignment error for each component. To test tracking performance for the face landmark points under the different cases, experiments were carried out on image sequences gathered by us. Results show that the LK pyramid optical flow method can track face landmark points under the normalized case, expression variation, illumination variation that does not affect facial details, and pose variation, and that different factors and combinations of factors affect alignment performance differently for different landmark points.
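    For readers who want to reproduce this kind of tracking, the pyramidal Lucas-Kanade implementation in OpenCV (cv2.calcOpticalFlowPyrLK) is the standard tool. The Python sketch below propagates hand-marked landmark points frame by frame; the video path, landmark coordinates, window size, and pyramid depth are illustrative, not the parameters used in the paper.

        import numpy as np
        import cv2

        def track_landmarks(video_path, initial_points):
            """Track hand-marked face landmark points through a video with
            pyramidal Lucas-Kanade optical flow."""
            cap = cv2.VideoCapture(video_path)
            ok, frame = cap.read()
            if not ok:
                raise IOError(f"cannot read {video_path}")
            prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            pts = np.asarray(initial_points, dtype=np.float32).reshape(-1, 1, 2)

            tracks = [pts.reshape(-1, 2).copy()]     # one (N, 2) array per frame
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                pts, status, _err = cv2.calcOpticalFlowPyrLK(
                    prev_gray, gray, pts, None, winSize=(21, 21), maxLevel=3,
                    criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))
                tracks.append(pts.reshape(-1, 2).copy())
                prev_gray = gray
            cap.release()
            return tracks

        # Usage (path and coordinates are placeholders):
        # landmarks = [(210, 180), (250, 182), (300, 181), (340, 183),
        #              (275, 240), (240, 290), (310, 291)]
        # trajectory = track_landmarks("face_sequence.mp4", landmarks)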

  7. Predictive factor analysis for successful performance of iris recognition-assisted dynamic rotational eye tracking during laser in situ keratomileusis.

    PubMed

    Prakash, Gaurav; Ashok Kumar, Dhivya; Agarwal, Amar; Jacob, Soosan; Sarvanan, Yoga; Agarwal, Athiya

    2010-02-01

    To analyze the predictive factors associated with success of iris recognition and dynamic rotational eye tracking on a laser in situ keratomileusis (LASIK) platform with active assessment and correction of intraoperative cyclotorsion. Interventional case series. Two hundred seventy-five eyes of 142 consecutive candidates underwent LASIK with attempted iris recognition and dynamic rotational tracking on the Technolas 217z100 platform (Technolas Perfect Vision, St Louis, Missouri, USA) at a tertiary care ophthalmic hospital. The main outcome measures were age, gender, flap creation method (femtosecond, microkeratome, epi-LASIK), success of static rotational tracking, ablation algorithm, pulses, and depth; preablation and intraablation rotational activity were analyzed and evaluated using regression models. Preablation static iris recognition was successful in 247 eyes, without difference in flap creation methods (P = .6). Age (partial correlation, -0.16; P = .014), amount of pulses (partial correlation, 0.39; P = 1.6 x 10(-8)), and gender (P = .02) were significant predictive factors for the amount of intraoperative cyclodeviation. Tracking difficulties leading to linking the ablation with a new intraoperatively acquired iris image were more frequent with femtosecond-assisted flaps (P = 2.8 x 10(-7)) and with greater intraoperative cyclotorsion (P = .02). However, the number of cases having nonresolvable failure of intraoperative rotational tracking was similar in the 3 flap creation methods (P = .22). Intraoperative cyclotorsional activity depends on the age, gender, and duration of ablation (pulses delivered). Femtosecond flaps do not seem to have a disadvantage over microkeratome flaps as far as iris recognition and success of intraoperative dynamic rotational tracking are concerned. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  8. Computer vision syndrome and ergonomic practices among undergraduate university students.

    PubMed

    Mowatt, Lizette; Gordon, Carron; Santosh, Arvind Babu Rajendra; Jones, Thaon

    2018-01-01

    To determine the prevalence of computer vision syndrome (CVS) and ergonomic practices among students in the Faculty of Medical Sciences at The University of the West Indies (UWI), Jamaica. A cross-sectional study was done with a self-administered questionnaire. Four hundred and nine students participated; 78% were females. The mean age was 21.6 years. Neck pain (75.1%), eye strain (67%), shoulder pain (65.5%) and eye burn (61.9%) were the most common CVS symptoms. Dry eyes (26.2%), double vision (28.9%) and blurred vision (51.6%) were the least commonly experienced symptoms. Eye burning (P = .001), eye strain (P = .041) and neck pain (P = .023) were significantly related to the level of viewing. Moderate eye burning (55.1%) and double vision (56%) occurred in those who used handheld devices (P = .001 and .007, respectively). Moderate blurred vision was reported by 52% of those who looked down at the device compared with 14.8% who held it at an angle. Severe eye strain occurred in 63% of those who looked down at a device compared with 21% who kept the device at eye level. Shoulder pain was not related to the pattern of use. Ocular symptoms and neck pain were less likely if the device was held just below eye level. There is a high prevalence of CVS symptoms amongst university students; these symptoms, in particular neck pain, eye strain and eye burning, could be reduced with improved ergonomic practices. © 2017 John Wiley & Sons Ltd.

  9. Adaptive optics optical coherence tomography with dynamic retinal tracking

    PubMed Central

    Kocaoglu, Omer P.; Ferguson, R. Daniel; Jonnal, Ravi S.; Liu, Zhuolin; Wang, Qiang; Hammer, Daniel X.; Miller, Donald T.

    2014-01-01

    Adaptive optics optical coherence tomography (AO-OCT) is a highly sensitive and noninvasive method for three-dimensional imaging of the microscopic retina. Like all in vivo retinal imaging techniques, however, it suffers the effects of involuntary eye movements that occur even under normal fixation. In this study we investigated dynamic retinal tracking to measure and correct eye motion at kHz rates for AO-OCT imaging. A customized retina tracking module was integrated into the sample arm of the 2nd-generation Indiana AO-OCT system and images were acquired on three subjects. Analyses were developed based on temporal amplitude and spatial power spectra in conjunction with strip-wise registration to independently measure AO-OCT tracking performance. After optimization of the tracker parameters, the system was found to correct eye movements up to 100 Hz and reduce residual motion to 10 µm root mean square. Between-session precision was 33 µm. Performance was limited by tracker-generated noise at high temporal frequencies. PMID:25071963

  10. Tracking Students' Cognitive Processes during Program Debugging--An Eye-Movement Approach

    ERIC Educational Resources Information Center

    Lin, Yu-Tzu; Wu, Cheng-Chih; Hou, Ting-Yun; Lin, Yu-Chih; Yang, Fang-Ying; Chang, Chia-Hu

    2016-01-01

    This study explores students' cognitive processes while debugging programs by using an eye tracker. Students' eye movements during debugging were recorded by an eye tracker to investigate whether and how high- and low-performance students act differently during debugging. Thirty-eight computer science undergraduates were asked to debug two C…

  11. Eye-Movement Patterns Are Associated with Communicative Competence in Autistic Spectrum Disorders

    ERIC Educational Resources Information Center

    Norbury, Courtenay Frazier; Brock, Jon; Cragg, Lucy; Einav, Shiri; Griffiths, Helen; Nation, Kate

    2009-01-01

    Background: Investigations using eye-tracking have reported reduced fixations to salient social cues such as eyes when participants with autism spectrum disorders (ASD) view social scenes. However, these studies have not distinguished different cognitive phenotypes. Methods: The eye-movements of 28 teenagers with ASD and 18 typically developing…

  12. Improving Silent Reading Performance through Feedback on Eye Movements: A Feasibility Study

    ERIC Educational Resources Information Center

    Korinth, Sebastian P.; Fiebach, Christian J.

    2018-01-01

    This feasibility study investigated if feedback about individual eye movements, reflecting varying word processing stages, can improve reading performance. Twenty-five university students read 90 newspaper articles during 9 eye-tracking sessions. Training group participants (n = 12) were individually briefed before each session, which eye movement…

  13. Tracking without perceiving: a dissociation between eye movements and motion perception.

    PubMed

    Spering, Miriam; Pomplun, Marc; Carrasco, Marisa

    2011-02-01

    Can people react to objects in their visual field that they do not consciously perceive? We investigated how visual perception and motor action respond to moving objects whose visibility is reduced, and we found a dissociation between motion processing for perception and for action. We compared motion perception and eye movements evoked by two orthogonally drifting gratings, each presented separately to a different eye. The strength of each monocular grating was manipulated by inducing adaptation to one grating prior to the presentation of both gratings. Reflexive eye movements tracked the vector average of both gratings (pattern motion) even though perceptual responses followed one motion direction exclusively (component motion). Observers almost never perceived pattern motion. This dissociation implies the existence of visual-motion signals that guide eye movements in the absence of a corresponding conscious percept.

  14. Tracking Without Perceiving: A Dissociation Between Eye Movements and Motion Perception

    PubMed Central

    Spering, Miriam; Pomplun, Marc; Carrasco, Marisa

    2011-01-01

    Can people react to objects in their visual field that they do not consciously perceive? We investigated how visual perception and motor action respond to moving objects whose visibility is reduced, and we found a dissociation between motion processing for perception and for action. We compared motion perception and eye movements evoked by two orthogonally drifting gratings, each presented separately to a different eye. The strength of each monocular grating was manipulated by inducing adaptation to one grating prior to the presentation of both gratings. Reflexive eye movements tracked the vector average of both gratings (pattern motion) even though perceptual responses followed one motion direction exclusively (component motion). Observers almost never perceived pattern motion. This dissociation implies the existence of visual-motion signals that guide eye movements in the absence of a corresponding conscious percept. PMID:21189353

  15. Simulating hemispatial neglect with virtual reality.

    PubMed

    Baheux, Kenji; Yoshizawa, Makoto; Yoshida, Yasuko

    2007-07-19

    Hemispatial neglect is a cognitive disorder defined as a lack of attention for stimuli contra-lateral to the brain lesion. The assessment is traditionally done with basic pencil and paper tests and the rehabilitation programs are generally not well adapted. We propose a virtual reality system featuring an eye-tracking device for a better characterization of the neglect that will lead to new rehabilitation techniques. This paper presents a comparison of eye-gaze patterns of healthy subjects, patients and healthy simulated patients on a virtual line bisection test. The task was also executed with a reduced visual field condition hoping that fewer stimuli would limit the neglect. We found that patients and healthy simulated patients had similar eye-gaze patterns. However, while the reduced visual field condition had no effect on the healthy simulated patients, it actually had a negative impact on the patients. We discuss the reasons for these differences and how they relate to the limitations of the neglect simulation. We argue that with some improvements the technique could be used to determine the potential of new rehabilitation techniques and also help the rehabilitation staff or the patient's relatives to better understand the neglect condition.

  16. The Human Engineering Eye Movement Measurement Research Facility.

    DTIC Science & Technology

    1985-04-01

    tracked reliably. When tracking is disrupted (e.g., by gross and sudden head movements, gross change in the head position, sneezing, prolonged eye...these are density and "busyness" of the slides (stimulus material), as well as consistency between successive... change the material being projected based on the subject’s previous performance. The minicomputer relays the calibrated data to one of the magnetic

  17. Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI

    DTIC Science & Technology

    2015-04-01

    virtual reality driving simulator data acquisition. Data collection for the pilot study is nearly complete and data analyses are currently under way...Training for primary study procedures including neuropsychological testing, eye-tracking, virtual reality driving simulator, and EEG data acquisition is...the virtual reality driving simulator. Participants are instructed to drive along a coastal highway while performing the target detection task

  18. Brief Report: Broad Autism Phenotype in Adults Is Associated with Performance on an Eye-Tracking Measure of Joint Attention

    ERIC Educational Resources Information Center

    Swanson, Meghan R.; Siller, Michael

    2014-01-01

    The current study takes advantage of modern eye-tracking technology and evaluates how individuals allocate their attention when viewing social videos that display an adult model who is gazing at a series of targets that appear and disappear in the four corners of the screen (congruent condition), or gazing elsewhere (incongruent condition). Data…

  19. Eye-Tracking Provides a Sensitive Measure of Exploration Deficits After Acute Right MCA Stroke

    PubMed Central

    Delazer, Margarete; Sojer, Martin; Ellmerer, Philipp; Boehme, Christian; Benke, Thomas

    2018-01-01

    The eye-tracking study aimed at assessing spatial biases in visual exploration in patients after acute right MCA (middle cerebral artery) stroke. Patients affected by unilateral neglect show less functional recovery and experience severe difficulties in everyday life. Thus, accurate diagnosis is essential, and specific treatment is required. Early assessment is of high importance as rehabilitative interventions are more effective when applied soon after stroke. Previous research has shown that deficits may be overlooked when classical paper-and-pencil tasks are used for diagnosis. Conversely, eye-tracking allows direct monitoring of visual exploration patterns. We hypothesized that the analysis of eye-tracking provides more sensitive measures for spatial exploration deficits after right middle cerebral artery stroke. Twenty-two patients with right MCA stroke (median 5 days after stroke) and 28 healthy controls were included. Lesions were confirmed by MRI/CCT. Groups performed comparably in the Mini–Mental State Examination (patients and controls median 29) and in a screening of executive functions. Eleven patients scored at ceiling in neglect screening tasks, 11 showed minimal to severe signs of unilateral visual neglect. An overlap plot based on MRI and CCT imaging showed lesions in the temporo–parieto–frontal cortex, basal ganglia, and adjacent white matter tracts. Visual exploration was evaluated in two eye-tracking tasks, one assessing free visual exploration of photographs, the other visual search using symbols and letters. An index of fixation asymmetries proved to be a sensitive measure of spatial exploration deficits. Both patient groups showed a marked exploration bias to the right when looking at complex photographs. A single case analysis confirmed that also most of those patients who showed no neglect in screening tasks performed outside the range of controls in free exploration. The analysis of patients’ scoring at ceiling in neglect screening tasks is of special interest, as possible deficits may be overlooked and thus remain untreated. Our findings are in line with other studies suggesting considerable limitations of laboratory screening procedures to fully appreciate the occurrence of neglect symptoms. Future investigations are needed to explore the predictive value of the eye-tracking index and its validity in everyday situations.
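    The abstract does not spell out the fixation asymmetry index; a common formulation is the normalized difference between fixation time spent on the right and left screen halves. The Python sketch below implements that formulation as a stated assumption, not necessarily the index used in the study.

        def fixation_asymmetry(fixations, screen_width_px):
            """Lateral asymmetry of fixation time: (right - left) / (right + left).
            fixations is a list of (x_px, duration_s) tuples; values near -1 mean
            exploration confined to the left half of the display, +1 to the right."""
            mid = screen_width_px / 2.0
            left = sum(d for x, d in fixations if x < mid)
            right = sum(d for x, d in fixations if x >= mid)
            total = left + right
            return 0.0 if total == 0 else (right - left) / total

        # Example: a rightward exploration bias as seen in neglect patients
        print(fixation_asymmetry([(200, 0.3), (900, 1.2), (1100, 0.9)], 1280))  # 0.75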

  20. Adaptive eye-gaze tracking using neural-network-based user profiles to assist people with motor disability.

    PubMed

    Sesin, Anaelis; Adjouadi, Malek; Cabrerizo, Mercedes; Ayala, Melvin; Barreto, Armando

    2008-01-01

    This study developed an adaptive real-time human-computer interface (HCI) that serves as an assistive technology tool for people with severe motor disability. The proposed HCI design uses eye gaze as the primary computer input device. Controlling the mouse cursor with raw eye coordinates results in sporadic motion of the pointer because of the saccadic nature of the eye. Even though eye movements are subtle and completely imperceptible under normal circumstances, they considerably affect the accuracy of an eye-gaze-based HCI. The proposed HCI system is novel because it adapts to each specific user's different and potentially changing jitter characteristics through the configuration and training of an artificial neural network (ANN) that is structured to minimize the mouse jitter. This task is based on feeding the ANN a user's initially recorded eye-gaze behavior through a short training session. The ANN finds the relationship between the gaze coordinates and the mouse cursor position based on the multilayer perceptron model. An embedded graphical interface is used during the training session to generate user profiles that make up these unique ANN configurations. The results with 12 subjects in test 1, which involved following a moving target, showed an average jitter reduction of 35%; the results with 9 subjects in test 2, which involved following the contour of a square object, showed an average jitter reduction of 53%. In both tests, the resulting trajectories were significantly smoother and reached fixed or moving targets with relative ease, within a 5% error margin or deviation from the desired trajectories. The positive effects of such jitter reduction are presented graphically for visual appreciation.
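    The mapping step can be sketched with any multilayer-perceptron library. The Python example below trains a small MLP (scikit-learn's MLPRegressor) on a synthetic calibration session in which the user follows a known target while the tracker reports jittery gaze; the network size, noise level, and normalized coordinates are illustrative assumptions, not the published configuration.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        # Synthetic calibration session in normalized screen coordinates: the
        # user follows a known target while the tracker reports jittery gaze.
        rng = np.random.default_rng(1)
        t = np.linspace(0, 2 * np.pi, 2000)
        target = np.stack([0.5 + 0.35 * np.cos(t), 0.5 + 0.25 * np.sin(t)], axis=1)
        gaze = target + rng.normal(0.0, 0.02, target.shape)   # saccadic jitter

        # Multilayer perceptron mapping raw gaze to the intended cursor position;
        # one such network would be trained per user profile.
        mlp = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=1)
        mlp.fit(gaze, target)
        smoothed = mlp.predict(gaze)

        def rmse(a, b):
            return float(np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1))))

        print(f"jitter RMSE raw: {rmse(gaze, target):.4f}, "
              f"after MLP: {rmse(smoothed, target):.4f}")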

  1. Involuntary eye motion correction in retinal optical coherence tomography: Hardware or software solution?

    PubMed

    Baghaie, Ahmadreza; Yu, Zeyun; D'Souza, Roshan M

    2017-04-01

    In this paper, we review state-of-the-art techniques to correct eye motion artifacts in Optical Coherence Tomography (OCT) imaging. The methods for eye motion artifact reduction can be categorized into two major classes: (1) hardware-based techniques and (2) software-based techniques. In the first class, additional hardware is mounted onto the OCT scanner to gather information about the eye motion patterns during OCT data acquisition. This information is later processed and applied to the OCT data for creating an anatomically correct representation of the retina, either in an offline or online manner. In software-based techniques, the motion patterns are approximated either by comparing the acquired data to a reference image, or by considering some prior assumptions about the nature of the eye motion. Careful investigation of the most common methods in the field provides invaluable insight regarding future directions of the research in this area. The challenge in hardware-based techniques lies in the implementation aspects of particular devices. However, the results of these techniques are superior to those obtained from software-based techniques because they are capable of capturing secondary data related to eye motion during OCT acquisition. Software-based techniques, on the other hand, achieve moderate success and their performance is highly dependent on the quality of the OCT data in terms of the amount of motion artifacts contained in them. However, they are still relevant to the field since they are the sole class of techniques that can be applied to legacy data acquired using systems that do not have extra hardware to track eye motion. Copyright © 2017 Elsevier B.V. All rights reserved.
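    A minimal example of the reference-based software approach is rigid registration by phase correlation: each acquired frame (or strip) is compared against a reference image and the estimated shift is used to undo the motion. The Python sketch below uses plain FFTs and synthetic data; real OCT pipelines typically add strip-wise, sub-pixel, and axial corrections.

        import numpy as np

        def rigid_shift(reference, moving):
            """Estimate the (row, col) translation of `moving` relative to
            `reference` by phase correlation."""
            F_ref = np.fft.fft2(reference)
            F_mov = np.fft.fft2(moving)
            cross_power = F_mov * np.conj(F_ref)
            cross_power /= np.abs(cross_power) + 1e-12
            corr = np.fft.ifft2(cross_power).real
            peak = np.unravel_index(np.argmax(corr), corr.shape)
            # Wrap shifts larger than half the image size to negative values.
            return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))

        # Toy example: shift a random "en face" image by (3, -5) pixels and recover it.
        rng = np.random.default_rng(2)
        ref = rng.random((128, 128))
        mov = np.roll(ref, shift=(3, -5), axis=(0, 1))
        print(rigid_shift(ref, mov))   # -> (3, -5)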

  2. Fusion of P300 and eye-tracker data for spelling using BCI2000

    NASA Astrophysics Data System (ADS)

    Kalika, Dmitry; Collins, Leslie; Caves, Kevin; Throckmorton, Chandra

    2017-10-01

    Objective. Various augmentative and alternative communication (AAC) devices have been developed in order to aid communication for individuals with communication disorders. Recently, there has been interest in combining EEG data and eye-gaze data with the goal of developing a hybrid (or ‘fused’) BCI (hBCI) AAC system. This work explores the effectiveness of a speller that fuses data from an eye-tracker and the P300 speller in order to create a hybrid P300 speller. Approach. This hybrid speller collects both eye-tracking and EEG data in parallel, and the user spells characters on the screen in the same way that they would if they were only using the P300 speller. Online and offline experiments were performed. The online experiments measured the performance of the speller for sixteen non-disabled participants, while the offline simulations were used to assess the robustness of the hybrid system. Main results. Online results showed that for fifteen non-disabled participants, using eye-gaze in a Bayesian framework with EEG data from the P300 speller improved accuracy (0.0163 +/- 2.72, 0.085 +/- 0.111, 0.080 +/- 0.106 for the estimated, medium, and high variance configurations) and reduced the average number of flashes required to spell a character compared to the standard P300 speller that relies solely on EEG data (-53.27 +/- 25.87, -36.15 +/- 19.3, -18.85 +/- 12.43 for the estimated, medium, and high variance configurations). Offline simulations indicate that the system provides more robust performance than a standalone eye-gaze system. Significance. The results of this work on non-disabled participants show the potential efficacy of a hybrid P300 and eye-tracker speller. Further validation on the amyotrophic lateral sclerosis population is needed to assess the benefit of this hybrid system.
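    The Bayesian framework can be pictured as a prior over characters derived from the gaze position, multiplied by likelihoods derived from the EEG classifier scores. The Python sketch below is a toy illustration of that idea; the 6x6 grid, the Gaussian gaze model, and the softmax score-to-likelihood mapping are all assumptions for illustration, not the BCI2000 implementation described in the paper.

        import numpy as np

        def fuse_gaze_and_eeg(gaze_xy, char_xy, eeg_scores, gaze_sigma=1.0):
            """Posterior over characters from a Gaussian gaze prior centred on the
            measured gaze point and per-character EEG evidence (classifier scores
            converted to likelihoods with a softmax)."""
            d2 = np.sum((char_xy - np.asarray(gaze_xy)) ** 2, axis=1)
            prior = np.exp(-d2 / (2.0 * gaze_sigma ** 2))
            prior /= prior.sum()
            likelihood = np.exp(eeg_scores - np.max(eeg_scores))
            likelihood /= likelihood.sum()
            posterior = prior * likelihood
            return posterior / posterior.sum()

        # 6x6 speller grid with unit spacing; gaze lands near row 2, column 3,
        # and the EEG evidence also favours that cell.
        grid = np.array([(r, c) for r in range(6) for c in range(6)], dtype=float)
        scores = np.random.default_rng(3).normal(0.0, 1.0, 36)
        scores[2 * 6 + 3] += 3.0
        posterior = fuse_gaze_and_eeg((2.1, 2.8), grid, scores)
        print("most probable character index:", int(np.argmax(posterior)))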

  3. Eyelid contour detection and tracking for startle research related eye-blink measurements from high-speed video records.

    PubMed

    Bernard, Florian; Deuter, Christian Eric; Gemmar, Peter; Schachinger, Hartmut

    2013-10-01

    Using the positions of the eyelids is an effective and contact-free way to measure startle-induced eye-blinks, which play an important role in human psychophysiological research. To the best of our knowledge, no conveniently usable methods exist that allow psychophysiological researchers to efficiently detect and track exact eyelid contours in image sequences captured at high speed. In this publication a semi-automatic, model-based eyelid contour detection and tracking algorithm for the analysis of high-speed video recordings from an eye tracker is presented. Because a large number of images had been acquired prior to method development, it was important that our technique be able to deal with images recorded without any special parametrisation of the eye tracker. The method entails pupil detection and specular reflection removal, and makes use of dynamic model adaption. In a proof-of-concept study we achieved a correct detection rate of 90.6%. With this approach, we provide a feasible method to accurately assess eye-blinks from high-speed video recordings. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  4. Quantifying Novice and Expert Differences in Visual Diagnostic Reasoning in Veterinary Pathology Using Eye-Tracking Technology.

    PubMed

    Warren, Amy L; Donnon, Tyrone L; Wagg, Catherine R; Priest, Heather; Fernandez, Nicole J

    2018-01-18

    Visual diagnostic reasoning is the cognitive process by which pathologists reach a diagnosis based on visual stimuli (cytologic, histopathologic, or gross imagery). Currently, there is little to no literature examining visual reasoning in veterinary pathology. The objective of the study was to use eye tracking to establish baseline quantitative and qualitative differences between the visual reasoning processes of novice and expert veterinary pathologists viewing cytology specimens. Novice and expert participants were each shown 10 cytology images and asked to formulate a diagnosis while wearing eye-tracking equipment (10 slides) and while concurrently verbalizing their thought processes using the think-aloud protocol (5 slides). Compared to novices, experts demonstrated significantly higher diagnostic accuracy (p<.017), shorter time to diagnosis (p<.017), and a higher percentage of time spent viewing areas of diagnostic interest (p<.017). Experts elicited more key diagnostic features in the think-aloud protocol and had more efficient patterns of eye movement. These findings suggest that experts' fast time to diagnosis, efficient eye-movement patterns, and preference for viewing areas of interest support system 1 (pattern-recognition) reasoning and script-inductive knowledge structures, with system 2 (analytic) reasoning used to verify the diagnosis.

  5. Beam steering for virtual/augmented reality displays with a cycloidal diffractive waveplate.

    PubMed

    Chen, Haiwei; Weng, Yishi; Xu, Daming; Tabiryan, Nelson V; Wu, Shin-Tson

    2016-04-04

    We proposed a switchable beam steering device with a cycloidal diffractive waveplate (CDW) for eye tracking in a virtual reality (VR) or augmented reality (AR) display system. Such a CDW diffracts the incident circularly polarized light into the first order with over 95% efficiency. To convert the input linearly polarized light to right-handed or left-handed circular polarization, we developed a broadband polarization switch consisting of a twisted nematic liquid crystal cell and an achromatic quarter-wave retardation film. By cascading 2-3 CDWs together, multiple diffraction angles can be achieved. To suppress the color dispersion, we proposed two approaches to obtain the same diffraction angle for full-color displays based on red, green, and blue LEDs. Our device exhibits several advantages, such as high diffraction efficiency, fast response time, low power consumption, and low cost. It holds promise for the emerging VR/AR displays.
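    For orientation, the deflection of a thin diffractive waveplate follows the usual transmission-grating relation sin(theta_out) = sin(theta_in) + m * lambda / Lambda, with m = +1 or -1 selected by the handedness of the incident circular polarization, and cascaded stages add their contributions in sine space. The Python sketch below is a back-of-the-envelope illustration with assumed grating periods, not the parameters of the device in the paper.

        import math

        def cascade_angle_deg(wavelength_nm, periods_um, orders):
            """Deflection of a normally incident beam after a cascade of thin
            diffractive waveplates: sine contributions m * lambda / period add up.
            orders holds +1 or -1 per stage (set by the polarization switch)."""
            sin_theta = 0.0
            for period_um, m in zip(periods_um, orders):
                sin_theta += m * (wavelength_nm * 1e-3) / period_um
            return math.degrees(math.asin(sin_theta))

        # Assumed periods for a two-stage cascade -> four switchable angles (green).
        for orders in [(+1, +1), (+1, -1), (-1, +1), (-1, -1)]:
            print(orders, f"{cascade_angle_deg(532, (5.0, 10.0), orders):+.2f} deg")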

  6. Child attention to pain and pain tolerance are dependent upon anxiety and attention control: An eye-tracking study.

    PubMed

    Heathcote, L C; Lau, J Y F; Mueller, S C; Eccleston, C; Fox, E; Bosmans, M; Vervoort, T

    2017-02-01

    Pain is common and can be debilitating in childhood. Theoretical models propose that attention to pain plays a key role in pain outcomes, however, very little research has investigated this in youth. This study examined how anxiety-related variables and attention control interacted to predict children's attention to pain cues using eye-tracking methodology, and their pain tolerance on the cold pressor test (CPT). Children aged 8-17 years had their eye-gaze tracked whilst they viewed photographs of other children displaying painful facial expressions during the CPT, before completing the CPT themselves. Children also completed self-report measures of anxiety and attention control. Findings indicated that anxiety and attention control did not impact children's initial fixations on pain or neutral faces, but did impact how long they dwelled on pain versus neutral faces. For children reporting low levels of attention control, higher anxiety was associated with less dwell time on pain faces as opposed to neutral faces, and the opposite pattern was observed for children with high attention control. Anxiety and attention control also interacted to predict pain outcomes. For children with low attention control, increasing anxiety was associated with anticipating more pain and tolerating pain for less time. This is the first study to examine children's attention to pain cues using eye-tracking technology in the context of a salient painful experience. Data suggest that attention control is an important moderator of anxiety on multiple outcomes relevant to young people's pain experiences. This study uses eye tracking to study attention to pain cues in children. Attention control is an important moderator of anxiety on attention bias to pain and tolerance of cold pressor pain in youth. © 2016 European Pain Federation - EFIC®.

  7. Pupil Tracking for Real-Time Motion Corrected Anterior Segment Optical Coherence Tomography

    PubMed Central

    Carrasco-Zevallos, Oscar M.; Nankivil, Derek; Viehland, Christian; Keller, Brenton; Izatt, Joseph A.

    2016-01-01

    Volumetric acquisition with anterior segment optical coherence tomography (ASOCT) is necessary to obtain accurate representations of the tissue structure and to account for asymmetries of the anterior eye anatomy. Additionally, recent interest in imaging of anterior segment vasculature and aqueous humor flow resulted in application of OCT angiography techniques to generate en face and 3D micro-vasculature maps of the anterior segment. Unfortunately, ASOCT structural and vasculature imaging systems do not capture volumes instantaneously and are subject to motion artifacts due to involuntary eye motion that may hinder their accuracy and repeatability. Several groups have demonstrated real-time tracking for motion-compensated in vivo OCT retinal imaging, but these techniques are not applicable in the anterior segment. In this work, we demonstrate a simple and low-cost pupil tracking system integrated into a custom swept-source OCT system for real-time motion-compensated anterior segment volumetric imaging. Pupil oculography hardware coaxial with the swept-source OCT system enabled fast detection and tracking of the pupil centroid. The pupil tracking ASOCT system with a field of view of 15 x 15 mm achieved diffraction-limited imaging over a lateral tracking range of +/- 2.5 mm and was able to correct eye motion at up to 22 Hz. Pupil tracking ASOCT offers a novel real-time motion compensation approach that may facilitate accurate and reproducible anterior segment imaging. PMID:27574800
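    A minimal version of the pupil-centroid detection that drives such a tracker can be written with OpenCV: threshold the dark pupil, keep the largest blob, and take the centroid from image moments. The Python sketch below is illustrative only; the published system's optics, thresholds, and timing are not given in the abstract.

        import cv2

        def pupil_centroid(gray_frame, dark_threshold=40):
            """Return the (x, y) centroid of the pupil in a grayscale eye image,
            or None if no dark blob is found. Thresholds are illustrative."""
            blurred = cv2.GaussianBlur(gray_frame, (7, 7), 0)
            # The pupil is the darkest region: keep pixels below the threshold.
            _, mask = cv2.threshold(blurred, dark_threshold, 255, cv2.THRESH_BINARY_INV)
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            if not contours:
                return None
            pupil = max(contours, key=cv2.contourArea)
            m = cv2.moments(pupil)
            if m["m00"] == 0:
                return None
            return (m["m10"] / m["m00"], m["m01"] / m["m00"])

        # The centroid's offset from a calibrated reference position would then be
        # fed to the OCT scanner as the lateral motion-correction signal.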

  8. The perception of heading during eye movements

    NASA Technical Reports Server (NTRS)

    Royden, Constance S.; Banks, Martin S.; Crowell, James A.

    1992-01-01

    Warren and Hannon (1988, 1990), while studying the perception of heading during eye movements, concluded that people do not require extraretinal information to judge heading with eye/head movements present. Here, heading judgments are examined at higher, more typical eye movement velocities than the extremely slow tracking eye movements used by Warren and Hannon. It is found that people require extraretinal information about eye position to perceive heading accurately under many viewing conditions.

  9. Driver fatigue detection based on eye state.

    PubMed

    Lin, Lizong; Huang, Chao; Ni, Xiaopeng; Wang, Jiawen; Zhang, Hao; Li, Xiao; Qian, Zhiqin

    2015-01-01

    Nowadays, more and more traffic accidents occur because of driver fatigue. In order to reduce and prevent such accidents, in this study a calculation method based on machine vision and using the PERCLOS (percentage of eye closure time) parameter was developed. It determined whether a driver's eyes were in a fatigue state according to the PERCLOS value. The overall workflow included face detection and tracking, detection and location of the human eye, human eye tracking, eye state recognition, and driver fatigue testing. The key aspects of the detection system were the detection and location of the human eyes and the driver fatigue test itself. The simplified method of measuring the driver's PERCLOS value was to calculate the ratio of frames in which the eyes were closed to the total number of frames in a given period. If the proportion of closed-eye frames exceeded the set threshold, the system would alert the driver. Many experiments showed that, in addition to its simple detection algorithm, rapid computing speed, and high detection and recognition accuracies, the system met the real-time requirements of a driver fatigue detection system.
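    PERCLOS itself reduces to a rolling fraction of closed-eye frames. The Python sketch below keeps a sliding window of per-frame eye states and raises an alert when the fraction exceeds a threshold; the window length and threshold are illustrative, not the values used in the cited system.

        from collections import deque

        class PerclosMonitor:
            """Rolling PERCLOS: fraction of the last `window_frames` frames in
            which the eyes were judged closed."""

            def __init__(self, window_frames=900, alert_threshold=0.4):
                self.states = deque(maxlen=window_frames)   # e.g. 30 s at 30 fps
                self.alert_threshold = alert_threshold

            def update(self, eyes_closed):
                """Feed one frame's eye state; return True if an alert is due."""
                self.states.append(bool(eyes_closed))
                if len(self.states) < self.states.maxlen:
                    return False                            # not enough history yet
                perclos = sum(self.states) / len(self.states)
                return perclos > self.alert_threshold

        # Usage: classify the eye state for each video frame, then
        #   if monitor.update(eyes_closed): sound_alarm()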

  10. The Reliability, Validity, and Normative Data of Interpupillary Distance and Pupil Diameter Using Eye-Tracking Technology

    PubMed Central

    Murray, Nicholas P.; Hunfalvay, Melissa; Bolte, Takumi

    2017-01-01

    Purpose The purpose of this study was to determine the reliability of interpupillary distance (IPD) and pupil diameter (PD) measures using an infrared eye tracker and central point stimuli. Validity of the test compared to known clinical tools was determined, and normative data was established against which individuals can measure themselves. Methods Participants (416) across various demographics were examined for normative data. Of these, 50 were examined for reliability and validity. Validity for IPD measured the test (RightEye IPD/PD) against the PL850 Pupilometer and the Essilor Digital CRP. For PD, the test was measured against the Rosenbaum Pocket Vision Screener (RPVS). Reliability was analyzed with intraclass correlation coefficients (ICC) between trials with Cronbach's alpha (CA) and the standard error of measurement for each ICC. Convergent validity was investigated by calculating the bivariate correlation coefficient. Results Reliability results were strong (CA > 0.7) for all measures. High positive significant correlations were found between the RightEye IPD test and the PL850 Pupilometer (P < 0.001) and Essilor Digital CRP (P < 0.001) and for the RightEye PD test and the RPVS (P < 0.001). Conclusions Using infrared eye tracking and the RightEye IPD/PD test stimuli, reliable and accurate measures of IPD and PD were found. Results from normative data showed an adequate comparison for people with normal vision development. Translational Relevance Results revealed a central point of fixation may remove variability in examining PD reliably using infrared eye tracking when consistent environmental and experimental procedures are conducted. PMID:28685104

  11. Reproducibility of retinal nerve fiber layer thickness measures using eye tracking in children with nonglaucomatous optic neuropathy.

    PubMed

    Rajjoub, Raneem D; Trimboli-Heidler, Carmelina; Packer, Roger J; Avery, Robert A

    2015-01-01

    To determine the intra- and intervisit reproducibility of circumpapillary retinal nerve fiber layer (RNFL) thickness measures using eye tracking-assisted spectral-domain optical coherence tomography (SD OCT) in children with nonglaucomatous optic neuropathy. Prospective longitudinal study. Circumpapillary RNFL thickness measures were acquired with SD OCT using the eye-tracking feature at 2 separate study visits. Children with normal and abnormal vision (visual acuity ≥ 0.2 logMAR above normal and/or visual field loss) who demonstrated clinical and radiographic stability were enrolled. Intra- and intervisit reproducibility was calculated for the global average and 9 anatomic sectors by calculating the coefficient of variation and intraclass correlation coefficient. Forty-two subjects (median age 8.6 years, range 3.9-18.2 years) met inclusion criteria and contributed 62 study eyes. Both the abnormal and normal vision cohorts demonstrated the lowest intravisit coefficient of variation for the global RNFL thickness. Intervisit reproducibility remained good for those with normal and abnormal vision, although small but statistically significant increases in the coefficient of variation were observed for multiple anatomic sectors in both cohorts. The magnitude of visual acuity loss was significantly associated with the global (β = 0.026, P < .01) and temporal sector coefficient of variation (β = 0.099, P < .01). SD OCT with eye tracking demonstrates highly reproducible RNFL thickness measures. Subjects with vision loss demonstrate greater intra- and intervisit variability than those with normal vision. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Preverbal Infants Anticipate that Food Will Be Brought to the Mouth: An Eye Tracking Study of Manual Feeding and Flying Spoons

    ERIC Educational Resources Information Center

    Kochukhova, Olga; Gredeback, Gustaf

    2010-01-01

    This study relies on eye tracking technology to investigate how humans perceive others' feeding actions. Results demonstrate that 6-month-olds (n = 54) anticipate that food is brought to the mouth when observing an adult feeding herself with a spoon. Still, they fail to anticipate self-propelled (SP) spoons that move toward the mouth and manual…

  13. Similarity and Difference in the Processing of Same- and Other-Race Faces as Revealed by Eye Tracking in 4- to 9-Month-Olds

    ERIC Educational Resources Information Center

    Liu, Shaoying; Quinn, Paul C.; Wheeler, Andrea; Xiao, Naiqi; Ge, Liezhong; Lee, Kang

    2011-01-01

    Fixation duration for same-race (i.e., Asian) and other-race (i.e., Caucasian) female faces by Asian infant participants between 4 and 9 months of age was investigated with an eye-tracking procedure. The age range tested corresponded with prior reports of processing differences between same- and other-race faces observed in behavioral looking time…

  14. Adding More Fuel to the Fire: An Eye-Tracking Study of Idiom Processing by Native and Non-Native Speakers

    ERIC Educational Resources Information Center

    Siyanova-Chanturia, Anna; Conklin, Kathy; Schmitt, Norbert

    2011-01-01

    Using eye-tracking, we investigate on-line processing of idioms in a biasing story context by native and non-native speakers of English. The stimuli are idioms used figuratively ("at the end of the day"--"eventually"), literally ("at the end of the day"--"in the evening"), and novel phrases ("at the end of the war"). Native speaker results…

  15. The effect of extended sensory range via the EyeCane sensory substitution device on the characteristics of visionless virtual navigation.

    PubMed

    Maidenbaum, Shachar; Levy-Tzedek, Shelly; Chebat, Daniel Robert; Namer-Furstenberg, Rinat; Amedi, Amir

    2014-01-01

    Mobility training programs that help the blind navigate through unknown places with a White-Cane significantly improve their mobility. However, what is the effect of new assistive technologies, which offer more information to the blind user, on the underlying premises of these programs, such as navigation patterns? We developed the virtual-EyeCane, a minimalistic sensory substitution device translating single-point distance into auditory cues identical to the EyeCane's in the real world. We compared performance in virtual environments when using the virtual-EyeCane, a virtual-White-Cane, no device, and visual navigation. We show that the characteristics of virtual-EyeCane navigation differ from navigation with a virtual-White-Cane or no device, that virtual-EyeCane users complete more levels successfully, taking shorter paths and with fewer collisions than these groups, and we demonstrate the relative similarity of virtual-EyeCane and visual navigation patterns. This suggests that additional distance information indeed changes navigation patterns from virtual-White-Cane use and brings them closer to visual navigation.

  16. Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays

    PubMed Central

    Padmanaban, Nitish; Konrad, Robert; Stramer, Tal; Wetzstein, Gordon

    2017-01-01

    From the desktop to the laptop to the mobile device, personal computing platforms evolve over time. Moving forward, wearable computing is widely expected to be integral to consumer electronics and beyond. The primary interface between a wearable computer and a user is often a near-eye display. However, current generation near-eye displays suffer from multiple limitations: they are unable to provide fully natural visual cues and comfortable viewing experiences for all users. At their core, many of the issues with near-eye displays are caused by limitations in conventional optics. Current displays cannot reproduce the changes in focus that accompany natural vision, and they cannot support users with uncorrected refractive errors. With two prototype near-eye displays, we show how these issues can be overcome using display modes that adapt to the user via computational optics. By using focus-tunable lenses, mechanically actuated displays, and mobile gaze-tracking technology, these displays can be tailored to correct common refractive errors and provide natural focus cues by dynamically updating the system based on where a user looks in a virtual scene. Indeed, the opportunities afforded by recent advances in computational optics open up the possibility of creating a computing platform in which some users may experience better quality vision in the virtual world than in the real one. PMID:28193871

  17. Assessing computerized eye tracking technology for gaining insight into expert interpretation of the 12-lead electrocardiogram: an objective quantitative approach.

    PubMed

    Bond, R R; Zhu, T; Finlay, D D; Drew, B; Kligfield, P D; Guldenring, D; Breen, C; Gallagher, A G; Daly, M J; Clifford, G D

    2014-01-01

    It is well known that accurate interpretation of the 12-lead electrocardiogram (ECG) requires a high degree of skill. There is also a moderate degree of variability among those who interpret the ECG. While this is the case, there are no best practice guidelines for the actual ECG interpretation process. Hence, this study adopts computerized eye tracking technology to investigate whether eye-gaze can be used to gain a deeper insight into how expert annotators interpret the ECG. Annotators were recruited in San Jose, California at the 2013 International Society of Computerised Electrocardiology (ISCE). Each annotator was recruited to interpret a number of 12-lead ECGs (N=12) while their eye gaze was recorded using a Tobii X60 eye tracker. The device is based on corneal reflection and is non-intrusive. With a sampling rate of 60 Hz, eye gaze coordinates were acquired every 16.7 ms. Fixations were determined using a predefined computerized classification algorithm, which was then used to generate heat maps of where the annotators looked. The ECGs used in this study form four groups (3=ST elevation myocardial infarction [STEMI], 3=hypertrophy, 3=arrhythmias and 3=exhibiting unique artefacts). There was also an equal distribution of difficulty levels (3=easy to interpret, 3=average and 3=difficult). ECGs were displayed using the 4x3+1 display format and computerized annotations were concealed. Precisely 252 expert ECG interpretations (21 annotators×12 ECGs) were recorded. Average duration for ECG interpretation was 58s (SD=23). Fleiss' generalized kappa coefficient (Pa=0.56) indicated moderate inter-rater reliability among the annotators. There was a 79% inter-rater agreement for STEMI cases, 71% agreement for arrhythmia cases, 65% for the lead misplacement and dextrocardia cases and only 37% agreement for the hypertrophy cases. In analyzing the total fixation duration, it was found that on average annotators study lead V1 the most (4.29s), followed by leads V2 (3.83s), the rhythm strip (3.47s), II (2.74s), V3 (2.63s), I (2.53s), aVL (2.45s), V5 (2.27s), aVF (1.74s), aVR (1.63s), V6 (1.39s), III (1.32s) and V4 (1.19s). It was also found that on average the annotator spends an equal amount of time studying leads in the frontal plane (15.89s) when compared to leads in the transverse plane (15.70s). It was found that on average the annotators fixated on lead I first, followed by leads V2, aVL, V1, II, aVR, V3, rhythm strip, III, aVF, V5, V4 and V6. We found a strong correlation (r=0.67) between time to first fixation on a lead and the total fixation duration on each lead. This indicates that leads studied first are studied the longest. There was a weak negative correlation between duration and accuracy (r=-0.2) and a strong correlation between age and accuracy (r=0.67). Eye tracking facilitated a deeper insight into how expert annotators interpret the 12-lead ECG. As a result, the authors recommend that ECG annotators adopt an initial first-impression/pattern-recognition approach followed by a conventional systematic protocol for ECG interpretation. This recommendation is based on observed misdiagnoses that resulted from relying on a first impression alone. In summary, this research presents eye gaze results from expert ECG annotators and provides scope for future work that involves exploiting computerized eye tracking technology to further the science of ECG interpretation. Copyright © 2014 Elsevier Inc. All rights reserved.
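    The abstract notes that fixations were determined by a predefined classification algorithm without naming it; a common choice for 60 Hz gaze data is dispersion-threshold identification (I-DT). The Python sketch below implements a generic I-DT detector under that assumption, with typical default thresholds rather than the study's actual parameters.

        def idt_fixations(samples, fs_hz=60, min_duration_ms=100, dispersion_px=35):
            """Dispersion-threshold (I-DT) fixation detection.

            samples is a sequence of (x, y) gaze coordinates sampled at fs_hz.
            Returns (start_index, end_index, centroid_x, centroid_y) tuples."""
            def dispersion(window):
                xs, ys = zip(*window)
                return (max(xs) - min(xs)) + (max(ys) - min(ys))

            min_len = max(2, int(round(min_duration_ms * fs_hz / 1000.0)))
            fixations, i, n = [], 0, len(samples)
            while i + min_len <= n:
                j = i + min_len
                if dispersion(samples[i:j]) <= dispersion_px:
                    # Grow the window while the dispersion stays below threshold.
                    while j < n and dispersion(samples[i:j + 1]) <= dispersion_px:
                        j += 1
                    xs, ys = zip(*samples[i:j])
                    fixations.append((i, j - 1, sum(xs) / len(xs), sum(ys) / len(ys)))
                    i = j
                else:
                    i += 1
            return fixations

        # At 60 Hz (one sample every ~16.7 ms), min_duration_ms=100 corresponds to
        # a minimum fixation window of 6 samples; per-lead dwell times then follow
        # by summing fixation durations inside each lead's display area.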

  18. Neurodegeneration and Vision Loss after Mild Blunt Trauma in the C57Bl/6 and DBA/2J Mouse

    PubMed Central

    Bricker-Anthony, Courtney; Rex, Tonia S.

    2015-01-01

    Damage to the eye from blast exposure can occur as a result of the overpressure air-wave (primary injury), flying debris (secondary injury), blunt force trauma (tertiary injury), and/or chemical/thermal burns (quaternary injury). In this study, we investigated damage in the contralateral eye after a blast directed at the ipsilateral eye in the C57Bl/6J and DBA/2J mouse. Assessments of ocular health (gross pathology, electroretinogram recordings, optokinetic tracking, optical coherence tomography and histology) were performed at 3, 7, 14 and 28 days post-trauma. Olfactory epithelium and optic nerves were also examined. Anterior pathologies were more common in the DBA/2J than in the C57Bl/6 and could be prevented with non-medicated viscous eye drops. Visual acuity decreased over time in both strains, but the decrease was more rapid and severe in the DBA/2J. Retinal cell death was present in approximately 10% of the retina at 7 and 28 days post-blast in both strains. Approximately 60% of the cell death occurred in photoreceptors. Increased oxidative stress and microglial reactivity were detected in both strains, beginning at 3 days post-injury. However, there was no sign of injury to the olfactory epithelium or optic nerve in either strain. Although our model directs an overpressure air-wave at the left eye in a restrained and otherwise protected mouse, retinal damage was detected in the contralateral eye. The lack of damage to the olfactory epithelium and optic nerve, as well as the different timing of cell death as compared to the blast-exposed eye, suggests that the injuries were due to physical contact between the contralateral eye and the housing chamber of the blast device and not propagation of the blast wave through the head. Thus, we describe a model of mild blunt eye trauma. PMID:26148200

  19. Eye Tracking Based Control System for Natural Human-Computer Interaction

    PubMed Central

    Lin, Shu-Fan

    2017-01-01

    Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. In order to improve the reliability, mobility, and usability of the eye tracking technique in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode using only the user's eyes. The usage flow of the proposed system is designed to follow natural human habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of differing difficulty (searching for an article and browsing a multimedia website) were performed to compare the proposed eye control tool with an existing system. Technology Acceptance Model (TAM) measures were used to evaluate the perceived effectiveness of our system. The results demonstrate that the proposed system is very effective with regard to usability and interface design. PMID:29403528

  20. Eye Tracking Based Control System for Natural Human-Computer Interaction.

    PubMed

    Zhang, Xuebai; Liu, Xiaolong; Yuan, Shyan-Ming; Lin, Shu-Fan

    2017-01-01

    Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. In order to improve the reliability, mobility, and usability of the eye tracking technique in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode using only the user's eyes. The usage flow of the proposed system is designed to follow natural human habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of differing difficulty (searching for an article and browsing a multimedia website) were performed to compare the proposed eye control tool with an existing system. Technology Acceptance Model (TAM) measures were used to evaluate the perceived effectiveness of our system. The results demonstrate that the proposed system is very effective with regard to usability and interface design.

  1. The impact of fatigue on latent print examinations as revealed by behavioral and eye gaze testing.

    PubMed

    Busey, Thomas; Swofford, Henry J; Vanderkolk, John; Emerick, Brandi

    2015-06-01

    Eye tracking and behavioral methods were used to assess the effects of fatigue on performance in latent print examiners. Eye gaze was measured both before and after a fatiguing exercise involving fine-grained examination decisions. The eye tracking tasks used similar images, often laterally reversed versions of previously viewed prints, which holds image detail constant while minimizing prior recognition. These methods, as well as a within-subject design with fine grained analyses of the eye gaze data, allow fairly strong conclusions despite a relatively small subject population. Consistent with the effects of fatigue on practitioners in other fields such as radiology, behavioral performance declined with fatigue, and the eye gaze statistics suggested a smaller working memory capacity. Participants also terminated the search/examination process sooner when fatigued. However, fatigue did not produce changes in inter-examiner consistency as measured by the Earth Mover Metric. Implications for practice are discussed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  2. Data on eye behavior during idea generation and letter-by-letter reading.

    PubMed

    Walcher, Sonja; Körner, Christof; Benedek, Mathias

    2017-12-01

    This article describes data from an idea generation task (alternate uses task; Guilford, 1967 [1]) and a letter-by-letter reading task under two background brightness conditions with healthy adults, as well as a baseline measurement and questionnaire data (SIPI, Huba et al., 1981 [2]; DDFS, Singer and Antrobus, 1972, 1963 [3]; RIBS, Runco et al., 2001 [4]). Data are hosted at the Open Science Framework (OSF): https://osf.io/fh66g/ (Walcher et al., 2017) [5]. There you will find eye tracking data, task performance data, questionnaire data, analysis scripts (in R; R Core Team, 2017 [6]), eye tracking paradigms (in the Experiment Builder; SR Research Ltd. [7]), and graphs of pupil and eye vergence angle dynamics. The data are interpreted and discussed in the article 'Looking for ideas: Eye behavior during goal-directed internally focused cognition' (Walcher et al., 2017) [8].

  3. Tracking the Eye Movement of Four Years Old Children Learning Chinese Words.

    PubMed

    Lin, Dan; Chen, Guangyao; Liu, Yingyi; Liu, Jiaxin; Pan, Jue; Mo, Lei

    2018-02-01

    Storybook reading is the major source of literacy exposure for beginning readers. The present study tracked 4-year-old Chinese children's eye movements while they were reading simulated storybook pages. Their eye-movement patterns were examined in relation to their word learning gains. The same reading list, consisting of 20 two-character Chinese words, was used in the pretest, the 5-min eye-tracking learning session, and the posttest. Additionally, visual spatial skill and phonological awareness were assessed in the pretest as cognitive controls. The results showed that the children's attention was attracted quickly by the pictures, on which their attention was focused most, with only 13% of the time spent looking at the words. Moreover, significant learning gains in word reading were observed from pretest to posttest after the 5-min exposure to simulated storybook pages on which the words, pictures, and pronunciations of the two-character words were presented. Furthermore, the children's attention to words significantly predicted posttest reading beyond socioeconomic status, age, visual spatial skill, phonological awareness, and pretest reading performance. This eye-movement evidence from children as young as four years reading a non-alphabetic script (i.e., Chinese) demonstrates that children can learn words effectively with minimal exposure and little instruction; it also suggests that learning to read requires attention to the words themselves. The study contributes to our understanding of early reading acquisition with eye-movement evidence from beginning readers.

  4. The Head Tracks and Gaze Predicts: How the World’s Best Batters Hit a Ball

    PubMed Central

    Mann, David L.; Spratford, Wayne; Abernethy, Bruce

    2013-01-01

    Hitters in fast ball-sports do not align their gaze with the ball throughout ball-flight; rather, they use predictive eye movement strategies that contribute towards their level of interceptive skill. Existing studies claim that (i) baseball and cricket batters cannot track the ball because it moves too quickly to be tracked by the eyes, and that consequently (ii) batters do not – and possibly cannot – watch the ball at the moment they hit it. However, to date no studies have examined the gaze of truly elite batters. We examined the eye and head movements of two of the world’s best cricket batters and found that both claims do not apply to these batters. Remarkably, the batters coupled the rotation of their head to the movement of the ball, ensuring the ball remained in a consistent direction relative to their head. As a result, the ball could be followed simply by moving the head and keeping the eyes still. Instead of doing so, we show the elite batters used distinctive eye movement strategies, usually relying on two predictive saccades to anticipate (i) the location of ball-bounce, and (ii) the location of bat-ball contact, ensuring they could direct their gaze towards the ball as they hit it. These specific head and eye movement strategies play important functional roles in contributing towards interceptive expertise. PMID:23516460

  5. Ablation centration after active eye tracker-assisted LASIK and comparison of flying-spot and broad-beam laser.

    PubMed

    Lin, Jane-Ming; Chen, Wen-Lu; Chiang, Chun-Chi; Tsai, Yi-Yu

    2008-04-01

    To evaluate ablation centration of flying-spot LASIK, investigate the effect of patient- and surgeon-related factors on centration, and compare flying-spot and broad-beam laser results. This retrospective study comprised 173 eyes of 94 patients who underwent LASIK with the Alcon LADARVision4000 with an active eye-tracking system. The effective tracking rate of the system is 100 Hz. The amount of decentration was analyzed by corneal topography. Patient-related (low, high, and extreme myopia; effect of learning) and surgeon-related (learning curve) factors influencing centration were identified. Centration was compared to that of the SCHWIND Multiscan broad-beam laser with a 50-Hz tracker from a previous study. Mean decentration was 0.36 ± 0.18 mm (range: 0 to 0.9 mm). Centration did not differ among low, high, and extreme myopia or between patients' first and second eyes. There were no significant differences in centration between the first 50 LASIK procedures and the last 50 procedures. Comparing flying-spot and broad-beam laser results, there were no differences in centration in low myopia. However, the LADARVision4000 yielded better centration results in high and extreme myopia. The Alcon LADARVision4000 active eye tracking system provides good centration for all levels of myopic correction and better centration than the SCHWIND Multiscan broad-beam laser in eyes with high and extreme myopia.

  6. The Influence of Different Representations on Solving Concentration Problems at Elementary School

    NASA Astrophysics Data System (ADS)

    Liu, Chia-Ju; Shen, Ming-Hsun

    2011-10-01

    This study investigated the students' learning process of the concept of concentration at the elementary school level in Taiwan. The influence of different representational types on the process of proportional reasoning was also explored. The participants included nineteen third-grade and eighteen fifth-grade students. Eye-tracking technology was used in conducting the experiment. The materials were adapted from Noelting's (1980a) "orange juice test" experiment. All problems on concentration included three stages (the intuitive, the concrete operational, and the formal operational), and each problem was displayed in iconic and symbolic representations. The data were collected through eye-tracking technology and post-test interviews. The results showed that the representational types influenced students' solving of concentration problems. Furthermore, the data on eye movement indicated that students used different strategies or rules to solve concentration problems at the different stages of the problems with different representational types. This study is intended to contribute to the understanding of elementary school students' problem-solving strategies and the usability of eye-tracking technology in related studies.

  7. ATTENTION BIAS OF ANXIOUS YOUTH DURING EXTENDED EXPOSURE OF EMOTIONAL FACE PAIRS: AN EYE-TRACKING STUDY

    PubMed Central

    Shechner, Tomer; Jarcho, Johanna M.; Britton, Jennifer C.; Leibenluft, Ellen; Pine, Daniel S.; Nelson, Eric E.

    2012-01-01

    Background: Previous studies demonstrate that anxiety is characterized by biased attention toward threats, typically measured by differences in motor reaction time to threat and neutral cues. Using eye-tracking methodology, the current study measured attention biases in anxious and nonanxious youth, using unrestricted free viewing of angry, happy, and neutral faces. Methods: Eighteen anxious and 15 nonanxious youth (8–17 years old) passively viewed angry-neutral and happy-neutral face pairs for 10 s while their eye movements were recorded. Results: Anxious youth displayed a greater attention bias toward angry faces than nonanxious youth, and this bias occurred in the earliest phases of stimulus presentation. Specifically, anxious youth were more likely to direct their first fixation to angry faces, and they made faster fixations to angry than neutral faces. Conclusions: Consistent with findings from earlier reaction-time studies, the current study shows that anxious youth, like anxious adults, exhibit biased orienting to threat-related stimuli. This study adds to the existing literature by documenting that threat biases in eye-tracking patterns are manifest at initial attention orienting. PMID:22815254

  8. Low frequency rTMS over posterior parietal cortex impairs smooth pursuit eye tracking.

    PubMed

    Hutton, Samuel B; Weekes, Brendan S

    2007-11-01

    The role of the posterior parietal cortex in smooth pursuit eye movements remains unclear. We used low frequency repetitive transcranial magnetic stimulation (rTMS) to study the cognitive and neural systems involved in the control of smooth pursuit eye movements. Eighteen participants were tested on two separate occasions. On each occasion we measured smooth pursuit eye tracking before and after 6 min of 1 Hz rTMS delivered at 90% of motor threshold. Low frequency rTMS over the posterior parietal cortex led to a significant reduction in smooth pursuit velocity gain, whereas rTMS over the motor cortex had no effect on gain. We conclude that low frequency offline rTMS is a potentially useful tool with which to explore the cortical systems involved in oculomotor control.

  9. Quantifying Eye Tracking Between Skilled Nurses and Nursing Students in Intravenous Injection.

    PubMed

    Maekawa, Yasuko; Majima, Yukie; Soga, Masato

    2016-01-01

    In nursing education, it is important that nursing students acquire appropriate nursing knowledge and skills, which include the empirical tacit knowledge of skilled nurses; such knowledge is difficult to verbalize. We therefore examined eye tracking during skill performance by expert nurses and nursing students. Sight is said to account for more than 70% of all sensory information. To support the learning of tacit nursing skills, we analyzed the differences between the two groups, including gaze, using measurements from an eye mark recorder. The results showed that the nurses paid particular attention to the needle-insertion part of the task and moved their eyes safely, surely, and economically in line with the purposes of their tasks.

  10. CEFR and Eye Movement Characteristics during EFL Reading: The Case of Intermediate Readers

    ERIC Educational Resources Information Center

    Dolgunsöz, Emrah; Sariçoban, Arif

    2016-01-01

    This study primarily aims to (1) examine the relationship between foreign language reading proficiency and eye movements during reading, and (2) describe eye movement differences between two CEFR proficiency groups (B1 and B2) using the eye tracking technique. 57 learners of EFL were tested under two experimental conditions: Natural L2 reading…

  11. The effect of image sharpness on quantitative eye movement data and on image quality evaluation while viewing natural images

    NASA Astrophysics Data System (ADS)

    Vuori, Tero; Olkkonen, Maria

    2006-01-01

    The aim of the study is to test both customer image quality rating (subjective image quality) and physical measurement of user behavior (eye movement tracking) to find customer satisfaction differences in imaging technologies. The methodological aim is to find out whether eye movements could be quantitatively used in image quality preference studies. In general, we want to map objective or physically measurable image quality to subjective evaluations and eye movement data. We conducted a series of image quality tests, in which the test subjects evaluated image quality while we recorded their eye movements. Results show that eye movement parameters consistently change according to the instructions given to the user and according to physical image quality, e.g. saccade duration increased with increasing blur. Results indicate that eye movement tracking could be used to differentiate between users' image quality evaluation strategies. Results also show that eye movements would help map between technological and subjective image quality. Furthermore, these results give some empirical emphasis to top-down processes in image quality perception and evaluation by showing differences between perceptual processes in situations where the cognitive task varies.

  12. Pre- and Post-Head Processing for Single- and Double-Scrambled Sentences of a Head-Final Language as Measured by the Eye Tracking Method

    ERIC Educational Resources Information Center

    Tamaoka, Katsuo; Asano, Michiko; Miyaoka, Yayoi; Yokosawa, Kazuhiko

    2014-01-01

    Using the eye-tracking method, the present study depicted pre- and post-head processing for simple scrambled sentences of head-final languages. Three versions of simple Japanese active sentences with ditransitive verbs were used: namely, (1) SO[subscript 1]O[subscript 2]V canonical, (2) SO[subscript 2]O[subscript 1]V single-scrambled, and (3)…

  13. Time Course of Visual Attention in Infant Categorization of Cats versus Dogs: Evidence for a Head Bias as Revealed through Eye Tracking

    ERIC Educational Resources Information Center

    Quinn, Paul C.; Doran, Matthew M.; Reiss, Jason E.; Hoffman, James E.

    2009-01-01

    Previous looking time studies have shown that infants use the heads of cat and dog images to form category representations for these animal classes. The present research used an eye-tracking procedure to determine the time course of attention to the head and whether it reflects a preexisting bias or online learning. Six- to 7-month-olds were…

  14. Upconverting device for enhanced recognition of certain wavelengths of light

    DOEpatents

    Kross, Brian; McKisson, John E; McKisson, John; Weisenberger, Andrew; Xi, Wenze; Zorn, Carl

    2013-05-21

    An upconverting device for enhanced recognition of selected wavelengths is provided. The device comprises a transparent light transmitter in combination with a plurality of upconverting nanoparticles. The device may be a lens in eyewear or, alternatively, a transparent panel such as a window in an instrument or machine. In use, the upconverting device is positioned between a light source and the eye(s) of the user of the upconverting device.

  15. Gaze inspired subtitle position evaluation for MOOCs videos

    NASA Astrophysics Data System (ADS)

    Chen, Hongli; Yan, Mengzhen; Liu, Sijiang; Jiang, Bo

    2017-06-01

    Online educational resources, such as MOOCs, are becoming increasingly popular, especially in higher education. One of the most important media types for MOOCs is the course video. Besides the traditional bottom-position subtitles that accompany videos, in recent years researchers have tried to develop more advanced algorithms to generate speaker-following subtitles. However, the effectiveness of such subtitles is still unclear. In this paper, we investigate the relationship between subtitle position and the learning effect after watching the video on tablet devices. Inspired by image-based human eye tracking techniques, this work combines objective gaze estimation statistics with a subjective user study to reach a convincing conclusion: speaker-following subtitles are more suitable for online educational videos.

  16. 21 CFR 886.3200 - Artificial eye.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Artificial eye. 886.3200 Section 886.3200 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES... device resembling the anterior portion of the eye, usually made of glass or plastic, intended to be...

  17. 21 CFR 886.3200 - Artificial eye.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Artificial eye. 886.3200 Section 886.3200 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES... device resembling the anterior portion of the eye, usually made of glass or plastic, intended to be...

  18. 21 CFR 886.3200 - Artificial eye.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Artificial eye. 886.3200 Section 886.3200 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES... device resembling the anterior portion of the eye, usually made of glass or plastic, intended to be...

  19. 76 FR 66283 - Notice of Intent To Grant Partially Exclusive Patent License; BOLD Industries, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-26

    ... Method for a Mobile Tracking Device.//U.S. Patent Application No. 20110036998 filed on August 14, 2009: Countermeasure Device for a Mobile Tracking Device.//U.S. Patent Application No. 20110113949 filed on May 12, 2010: Modulation Device for a Mobile Tracking Device.//U.S. Patent Application Serial No. 12/778,643...

  20. Remote gaze tracking system on a large display.

    PubMed

    Lee, Hyeon Chang; Lee, Won Oh; Cho, Chul Woo; Gwon, Su Yeong; Park, Kang Ryoung; Lee, Heekyung; Cha, Jihun

    2013-10-07

    We propose a new remote gaze tracking system as an intelligent TV interface. Our research is novel in the following three ways: first, because a user can sit at various positions in front of a large display, the capture volume of the gaze tracking system should be greater, so the proposed system includes two cameras which can be moved simultaneously by panning and tilting mechanisms, a wide view camera (WVC) for detecting eye position and an auto-focusing narrow view camera (NVC) for capturing enlarged eye images. Second, in order to remove the complicated calibration between the WVC and NVC and to enhance the capture speed of the NVC, these two cameras are combined in a parallel structure. Third, the auto-focusing of the NVC is achieved on the basis of both the user's facial width in the WVC image and a focus score calculated on the eye image of the NVC. Experimental results showed that the proposed system can be operated with a gaze tracking accuracy of ±0.737°~±0.775° and a speed of 5~10 frames/s.

  1. Remote Gaze Tracking System on a Large Display

    PubMed Central

    Lee, Hyeon Chang; Lee, Won Oh; Cho, Chul Woo; Gwon, Su Yeong; Park, Kang Ryoung; Lee, Heekyung; Cha, Jihun

    2013-01-01

    We propose a new remote gaze tracking system as an intelligent TV interface. Our research is novel in the following three ways: first, because a user can sit at various positions in front of a large display, the capture volume of the gaze tracking system should be greater, so the proposed system includes two cameras which can be moved simultaneously by panning and tilting mechanisms, a wide view camera (WVC) for detecting eye position and an auto-focusing narrow view camera (NVC) for capturing enlarged eye images. Second, in order to remove the complicated calibration between the WVC and NVC and to enhance the capture speed of the NVC, these two cameras are combined in a parallel structure. Third, the auto-focusing of the NVC is achieved on the basis of both the user's facial width in the WVC image and a focus score calculated on the eye image of the NVC. Experimental results showed that the proposed system can be operated with a gaze tracking accuracy of ±0.737°∼±0.775° and a speed of 5∼10 frames/s. PMID:24105351
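
    A focus score of the kind used in the NVC auto-focusing step is often computed as the variance of the Laplacian of the image; the Python/OpenCV sketch below shows that common measure as an assumption (the actual score used in this system is not specified), with a hypothetical file name.

      # Minimal focus-score sketch: variance of the Laplacian as a sharpness
      # measure. Higher values indicate a sharper (better focused) image.
      # The file name is a hypothetical placeholder.
      import cv2

      def focus_score(gray_image):
          return cv2.Laplacian(gray_image, cv2.CV_64F).var()

      eye = cv2.imread("nvc_eye_frame.png", cv2.IMREAD_GRAYSCALE)
      if eye is not None:
          print("focus score:", focus_score(eye))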

  2. Helmet Mounted Eye Tracking for Virtual Panoramic Displays. Volume 1: Review of Current Eye Movement Measurement Technology

    DTIC Science & Technology

    1989-08-01

    paths for integration with the off-aperture and dual-mirror VPD designs. PREFACE: The goal of this work was to explore integration of an eye line-of-gaze ... Relationship in one plane between point-of-gaze on a flat scene and relative eye, detector, and scene positions ... and eye line-of-gaze measurement. As a first step towards the design of an appropriate eye tracking system for interface with the virtual cockpit

  3. Eye-tracking novice and expert geologist groups in the field and laboratory

    NASA Astrophysics Data System (ADS)

    Cottrell, R. D.; Evans, K. M.; Jacobs, R. A.; May, B. B.; Pelz, J. B.; Rosen, M. R.; Tarduno, J. A.; Voronov, J.

    2010-12-01

    We are using an Active Vision approach to learn how novices and expert geologists acquire visual information in the field. The Active Vision approach emphasizes that visual perception is an active process wherein new information is acquired about a particular environment through exploratory eye movements. Eye movements are not only influenced by physical stimuli, but are also strongly influenced by high-level perceptual and cognitive processes. Eye-tracking data were collected on ten novices (undergraduate geology students) and three experts during a 10-day field trip across California focused on neotectonics. In addition, high-resolution panoramic images were captured at each key locality for use in a semi-immersive laboratory environment. Examples of each data type will be presented. The number of observers will be increased in subsequent field trips, but expert/novice differences are already apparent in the first set of individual eye-tracking records, including gaze time, gaze pattern and object recognition. We will review efforts to quantify these patterns and the development of semi-immersive environments to display geologic scenes. The research is a collaborative effort between Earth scientists, cognitive scientists, and imaging scientists at the University of Rochester and the Rochester Institute of Technology, with funding from the National Science Foundation.

  4. Adolescents' attention to responsibility messages in magazine alcohol advertisements: an eye-tracking approach.

    PubMed

    Thomsen, Steven R; Fulton, Kristi

    2007-07-01

    To investigate whether adolescent readers attend to responsibility or moderation messages (e.g., "drink responsibly") included in magazine advertisements for alcoholic beverages and to assess the association between attention and the ability to accurately recall the content of these messages. An integrated head-eye tracking system (ASL Eye-TRAC 6000) was used to measure the eye movements, including fixations and fixation duration, of a group of 63 adolescents (ages 12-14 years) as they viewed six print advertisements for alcoholic beverages. Immediately after the eye-tracking sessions, participants completed a masked-recall exercise. Overall, the responsibility or moderation messages were the least frequently viewed textual or visual areas of the advertisements. Participants spent an average of only 0.35 seconds, or 7% of the total viewing time, fixating on each responsibility message. Beverage bottles, product logos, and cartoon illustrations were the most frequently viewed elements of the advertisements. Among those participants who fixated at least once on an advertisement's warning message, only a relatively small percentage were able to recall its general concept or restate it verbatim in the masked recall test. Voluntary responsibility or moderation messages failed to capture the attention of teenagers who participated in this study and need to be typographically modified to be more effective.

  5. Sex differences in a chronometric mental rotation test with cube figures: a behavioral, electroencephalography, and eye-tracking pilot study.

    PubMed

    Scheer, Clara; Mattioni Maturana, Felipe; Jansen, Petra

    2018-05-07

    In chronometric mental rotation tasks, sex differences are widely discussed. Most studies find men to be more skilled in mental rotation than women, which can be explained by the holistic strategy that they use to rotate stimuli. Women are believed to apply a piecemeal strategy. So far, there have been no studies investigating this phenomenon using eye-tracking methods in combination with electroencephalography (EEG) analysis: the present study compared behavioral responses, EEG activity, and eye movements of 15 men and 15 women while solving a three-dimensional chronometric mental rotation test. The behavioral analysis showed neither differences in reaction time nor in the accuracy rate between men and women. The EEG data showed a higher right activation on parietal electrodes for women, and the eye-tracking results indicated a longer fixation in a higher number of areas of interest at 0° for women. Men and women are likely to possess different perceptual (visual search) and decision-making mechanisms, but similar mental rotation processes. Furthermore, men presented longer visual search processing, characterized by the greater saccade latency of 0°-135°. Generally, this study could be considered a pilot study to investigate sex differences in mental rotation tasks while combining eye-tracking and EEG methods.

  6. Objective Methods to Test Visual Dysfunction in the Presence of Cognitive Impairment

    DTIC Science & Technology

    2015-12-01

    the eye and 3) purposeful eye movements to track targets that are resolved. Major Findings: Three major objective tests of vision were successfully...developed and optimized to detect disease. These were 1) the pupil light reflex (either comparing the two eyes or independently evaluating each eye ...separately for retina or optic nerve damage, 2) eye movement based analysis of target acquisition, fixation, and eccentric viewing as a means of

  7. Your Child's Vision

    MedlinePlus

    ... 3½, kids should have eye health screenings and visual acuity tests (tests that measure sharpness of vision) ... eye rubbing extreme light sensitivity poor focusing poor visual tracking (following an object) abnormal alignment or movement ...

  8. The added value of eye-tracking in diagnosing dyscalculia: a case study

    PubMed Central

    van Viersen, Sietske; Slot, Esther M.; Kroesbergen, Evelyn H.; van't Noordende, Jaccoline E.; Leseman, Paul P. M.

    2013-01-01

    The present study compared eye movements and performance of a 9-year-old girl with Developmental Dyscalculia (DD) on a series of number line tasks to those of a group of typically developing (TD) children (n = 10), in order to answer the question whether eye-tracking data from number line estimation tasks can be a useful tool to discriminate between TD children and children with a number processing deficit. Quantitative results indicated that the child with dyscalculia performed worse on all symbolic number line tasks compared to the control group, indicated by a low linear fit (R2) and a low accuracy measured by mean percent absolute error. In contrast to the control group, her magnitude representations seemed to be better represented by a logarithmic than a linear fit. Furthermore, qualitative analyses on the data of the child with dyscalculia revealed more unidentifiable fixation patterns in the processing of multi-digit numbers and more dysfunctional estimation strategy use in one third of the estimation trials as opposed to ~10% in the control group. In line with her dyscalculia diagnosis, these results confirm the difficulties with spatially representing and manipulating numerosities on a number line, resulting in inflexible and inadequate estimation or processing strategies. It can be concluded from this case study that eye-tracking data can be used to discern different number processing and estimation strategies in TD children and children with a number processing deficit. Hence, eye-tracking data in combination with number line estimation tasks might be a valuable and promising addition to current diagnostic measures. PMID:24098294

  9. Instruction-based clinical eye-tracking study on the visual interpretation of divergence: How do students look at vector field plots?

    NASA Astrophysics Data System (ADS)

    Klein, P.; Viiri, J.; Mozaffari, S.; Dengel, A.; Kuhn, J.

    2018-06-01

    Relating mathematical concepts to graphical representations is a challenging task for students. In this paper, we introduce two visual strategies to qualitatively interpret the divergence of graphical vector field representations. One strategy is based on the graphical interpretation of partial derivatives, while the other is based on the flux concept. We test the effectiveness of both strategies in an instruction-based eye-tracking study with N = 41 physics majors. We found that students' performance improved when both strategies were introduced (74% correct) instead of only one strategy (64% correct), and students performed best when they were free to choose between the two strategies (88% correct). This finding supports the idea of introducing multiple representations of a physical concept to foster student understanding. Relevant eye-tracking measures demonstrate that both strategies imply different visual processing of the vector field plots, therefore reflecting conceptual differences between the strategies. Advanced analysis methods further reveal significant differences in eye movements between the best and worst performing students. For instance, the best students performed predominantly horizontal and vertical saccades, indicating correct interpretation of partial derivatives. They also focused on smaller regions when they balanced positive and negative flux. This mixed-method research leads to new insights into student visual processing of vector field representations, highlights the advantages and limitations of eye-tracking methodologies in this context, and discusses implications for teaching and for future research. The introduction of saccadic direction analysis expands traditional methods, and shows the potential to discover new insights into student understanding and learning difficulties.
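
    To make the partial-derivative reading of divergence concrete, the sketch below (our own illustration, not part of the study materials) numerically estimates div F = ∂Fx/∂x + ∂Fy/∂y for a sampled two-dimensional vector field.

      # Illustrative sketch: divergence of a 2D vector field on a grid,
      # estimated from partial derivatives via finite differences.
      import numpy as np

      x = np.linspace(-2.0, 2.0, 41)
      y = np.linspace(-2.0, 2.0, 41)
      X, Y = np.meshgrid(x, y, indexing="xy")

      # Example field F = (x, y): a "source" whose divergence is 2 everywhere.
      Fx, Fy = X, Y

      # With 'xy' indexing, x varies along axis 1 and y along axis 0.
      dFx_dx = np.gradient(Fx, x, axis=1)
      dFy_dy = np.gradient(Fy, y, axis=0)
      divergence = dFx_dx + dFy_dy

      print(divergence.mean())  # ~2.0 for this field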

  10. The added value of eye-tracking in diagnosing dyscalculia: a case study.

    PubMed

    van Viersen, Sietske; Slot, Esther M; Kroesbergen, Evelyn H; Van't Noordende, Jaccoline E; Leseman, Paul P M

    2013-01-01

    The present study compared eye movements and performance of a 9-year-old girl with Developmental Dyscalculia (DD) on a series of number line tasks to those of a group of typically developing (TD) children (n = 10), in order to answer the question whether eye-tracking data from number line estimation tasks can be a useful tool to discriminate between TD children and children with a number processing deficit. Quantitative results indicated that the child with dyscalculia performed worse on all symbolic number line tasks compared to the control group, indicated by a low linear fit (R2) and a low accuracy measured by mean percent absolute error. In contrast to the control group, her magnitude representations seemed to be better represented by a logarithmic than a linear fit. Furthermore, qualitative analyses on the data of the child with dyscalculia revealed more unidentifiable fixation patterns in the processing of multi-digit numbers and more dysfunctional estimation strategy use in one third of the estimation trials as opposed to ~10% in the control group. In line with her dyscalculia diagnosis, these results confirm the difficulties with spatially representing and manipulating numerosities on a number line, resulting in inflexible and inadequate estimation or processing strategies. It can be concluded from this case study that eye-tracking data can be used to discern different number processing and estimation strategies in TD children and children with a number processing deficit. Hence, eye-tracking data in combination with number line estimation tasks might be a valuable and promising addition to current diagnostic measures.
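
    As an illustration of the fit statistics mentioned in this record (a sketch with made-up estimates, not the study's analysis code), a linear and a logarithmic model can be fitted to number line estimates and the mean percent absolute error computed as follows.

      # Illustrative sketch: linear vs. logarithmic fit of number line
      # estimates and mean percent absolute error (PAE). Data are made up.
      import numpy as np

      line_max = 100.0
      targets = np.array([3, 12, 25, 48, 67, 89], dtype=float)
      estimates = np.array([9, 21, 33, 52, 70, 86], dtype=float)  # hypothetical responses

      def r_squared(y, y_hat):
          ss_res = np.sum((y - y_hat) ** 2)
          ss_tot = np.sum((y - y.mean()) ** 2)
          return 1.0 - ss_res / ss_tot

      # Linear model: estimate = a * target + b
      lin = np.polyfit(targets, estimates, 1)
      r2_linear = r_squared(estimates, np.polyval(lin, targets))

      # Logarithmic model: estimate = a * ln(target) + b
      log_fit = np.polyfit(np.log(targets), estimates, 1)
      r2_log = r_squared(estimates, np.polyval(log_fit, np.log(targets)))

      # Mean percent absolute error relative to the number line length.
      pae = np.mean(np.abs(estimates - targets)) / line_max * 100.0

      print(r2_linear, r2_log, pae)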

  11. Measuring and tracking eye movements of a behaving archer fish by real-time stereo vision.

    PubMed

    Ben-Simon, Avi; Ben-Shahar, Ohad; Segev, Ronen

    2009-11-15

    The archer fish (Toxotes chatareus) exhibits unique visual behavior in that it is able to aim at insects resting on the foliage above water level, shoot them down with a squirt of water, and then feed on them. This extreme behavior requires excellent visual acuity, learning, and tight synchronization between the visual system and body motion. This behavior also raises many important questions, such as the fish's ability to compensate for air-water refraction and the neural mechanisms underlying target acquisition. While many such questions remain open, significant insights towards solving them can be obtained by tracking the eye and body movements of freely behaving fish. Unfortunately, existing tracking methods suffer from either a high level of invasiveness or low resolution. Here, we present a video-based eye tracking method for accurately and remotely measuring the eye and body movements of a freely moving, behaving fish. Based on a stereo vision system and a unique triangulation method that corrects for air-glass-water refraction, we are able to measure the full three-dimensional pose of the fish eye and body with high temporal and spatial resolution. Our method, being generic, can be applied to studying the behavior of marine animals in general. We demonstrate how data collected by our method may be used to show that the hunting behavior of the archer fish is composed of surfacing concomitant with rotating the body around the direction of the fish's fixed gaze towards the target, until the snout reaches the correct shooting position at water level.

  12. Tracking the eye non-invasively: simultaneous comparison of the scleral search coil and optical tracking techniques in the macaque monkey

    PubMed Central

    Kimmel, Daniel L.; Mammo, Dagem; Newsome, William T.

    2012-01-01

    From human perception to primate neurophysiology, monitoring eye position is critical to the study of vision, attention, oculomotor control, and behavior. Two principal techniques for the precise measurement of eye position—the long-standing sclera-embedded search coil and more recent optical tracking techniques—are in use in various laboratories, but no published study compares the performance of the two methods simultaneously in the same primates. Here we compare two popular systems—a sclera-embedded search coil from C-N-C Engineering and the EyeLink 1000 optical system from SR Research—by recording simultaneously from the same eye in the macaque monkey while the animal performed a simple oculomotor task. We found broad agreement between the two systems, particularly in positional accuracy during fixation, measurement of saccade amplitude, detection of fixational saccades, and sensitivity to subtle changes in eye position from trial to trial. Nonetheless, certain discrepancies persist, particularly elevated saccade peak velocities, post-saccadic ringing, influence of luminance change on reported position, and greater sample-to-sample variation in the optical system. Our study shows that optical performance now rivals that of the search coil, rendering optical systems appropriate for many if not most applications. This finding is consequential, especially for animal subjects, because the optical systems do not require invasive surgery for implantation and repair of search coils around the eye. Our data also allow laboratories using the optical system in human subjects to assess the strengths and limitations of the technique for their own applications. PMID:22912608

  13. 49 CFR 214.509 - Required visual illumination and reflective devices for new on-track roadway maintenance machines.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... devices for new on-track roadway maintenance machines. 214.509 Section 214.509 Transportation Other... TRANSPORTATION RAILROAD WORKPLACE SAFETY On-Track Roadway Maintenance Machines and Hi-Rail Vehicles § 214.509 Required visual illumination and reflective devices for new on-track roadway maintenance machines. Each new...

  14. Ciliary muscle contraction force and trapezius muscle activity during manual tracking of a moving visual target.

    PubMed

    Domkin, Dmitry; Forsman, Mikael; Richter, Hans O

    2016-06-01

    Previous studies have shown an association of visual demands during near work and increased activity of the trapezius muscle. Those studies were conducted under stationary postural conditions with fixed gaze and artificial visual load. The present study investigated the relationship between ciliary muscle contraction force and trapezius muscle activity across individuals during performance of a natural dynamic motor task under free gaze conditions. Participants (N=11) tracked a moving visual target with a digital pen on a computer screen. Tracking performance, eye refraction and trapezius muscle activity were continuously measured. Ciliary muscle contraction force was computed from eye accommodative response. There was a significant Pearson correlation between ciliary muscle contraction force and trapezius muscle activity on the tracking side (0.78, p<0.01) and passive side (0.64, p<0.05). The study supports the hypothesis that high visual demands, leading to an increased ciliary muscle contraction during continuous eye-hand coordination, may increase trapezius muscle tension and thus contribute to the development of musculoskeletal complaints in the neck-shoulder area. Further experimental studies are required to clarify whether the relationship is valid within each individual or may represent a general personal trait, when individuals with higher eye accommodative response tend to have higher trapezius muscle activity. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. The socialization effect on decision making in the Prisoner's Dilemma game: An eye-tracking study

    PubMed Central

    Myagkov, Mikhail G.; Harriff, Kyle

    2017-01-01

    We used a mobile eye-tracking system (in the form of glasses) to study the characteristics of visual perception in decision making in the Prisoner's Dilemma game. In each experiment, one of the 12 participants was equipped with eye-tracking glasses. The experiment was conducted in three stages: an anonymous Individual Game stage against a randomly chosen partner (one of the 12 other participants of the experiment); a Socialization stage, in which the participants were divided into two groups; and a Group Game stage, in which the participants played with partners in the groups. After each round, the respondent received information about his or her personal score in the last round and the overall winner of the game at the moment. The study proves that eye-tracking systems can be used for studying the process of decision making and forecasting. The total viewing time and the time of fixation on areas corresponding to noncooperative decisions is related to the participants’ overall level of cooperation. The increase in the total viewing time and the time of fixation on the areas of noncooperative choice is due to a preference for noncooperative decisions and a decrease in the overall level of cooperation. The number of fixations on the group attributes is associated with group identity, but does not necessarily lead to cooperative behavior. PMID:28394939

  16. Measuring advertising effectiveness in Travel 2.0 websites through eye-tracking technology.

    PubMed

    Muñoz-Leiva, Francisco; Hernández-Méndez, Janet; Gómez-Carmona, Diego

    2018-03-06

    The advent of Web 2.0 is changing tourists' behaviors, prompting them to take on a more active role in preparing their travel plans. It is also leading tourism companies to have to adapt their marketing strategies to different online social media. The present study analyzes advertising effectiveness in social media in terms of customers' visual attention and self-reported memory (recall). Data were collected through a within-subjects and between-groups design based on eye-tracking technology, followed by a self-administered questionnaire. Participants were instructed to visit three Travel 2.0 websites (T2W), including a hotel's blog, social network profile (Facebook), and virtual community profile (Tripadvisor). Overall, the results revealed greater advertising effectiveness in the case of the hotel social network; and visual attention measures based on eye-tracking data differed from measures of self-reported recall. Visual attention to the ad banner was paid at a low level of awareness, which explains why the associations with the ad did not activate its subsequent recall. The paper offers a pioneering attempt in the application of eye-tracking technology, and examines the possible impact of visual marketing stimuli on user T2W-related behavior. The practical implications identified in this research, along with its limitations and future research opportunities, are of interest both for further theoretical development and practical application. Copyright © 2018 Elsevier Inc. All rights reserved.

  17. E-Readers Are More Effective than Paper for Some with Dyslexia

    PubMed Central

    Schneps, Matthew H.; Thomson, Jenny M.; Chen, Chen; Sonnert, Gerhard; Pomplun, Marc

    2013-01-01

    E-readers are fast rivaling print as a dominant method for reading. Because they offer accessibility options that are impossible in print, they are potentially beneficial for those with impairments, such as dyslexia. Yet, little is known about how the use of these devices influences reading in those who struggle. Here, we observe reading comprehension and speed in 103 high school students with dyslexia. Reading on paper was compared with reading on a small handheld e-reader device, formatted to display few words per line. We found that use of the device significantly improved speed and comprehension, when compared with traditional presentations on paper for specific subsets of these individuals: Those who struggled most with phoneme decoding or efficient sight word reading read more rapidly using the device, and those with limited VA Spans gained in comprehension. Prior eye tracking studies demonstrated that short lines facilitate reading in dyslexia, suggesting that it is the use of short lines (and not the device per se) that leads to the observed benefits. We propose that these findings may be understood as a consequence of visual attention deficits, in some with dyslexia, that make it difficult to allocate attention to uncrowded text near fixation, as the gaze advances during reading. Short lines ameliorate this by guiding attention to the uncrowded span. PMID:24058697

  18. Generic Dynamic Environment Perception Using Smart Mobile Devices.

    PubMed

    Danescu, Radu; Itu, Razvan; Petrovai, Andra

    2016-10-17

    The driving environment is complex and dynamic, and the attention of the driver is continuously challenged; therefore, computer-based assistance achieved by processing image and sensor data may increase traffic safety. While active sensors and stereovision have the advantage of obtaining 3D data directly, monocular vision is easy to set up and can benefit from the increasing computational power of smart mobile devices and from the fact that almost all of them come with an embedded camera. Several driving assistance applications are available for mobile devices, but they are mostly targeted at simple scenarios and a limited range of obstacle shapes and poses. This paper presents a technique for generic, shape-independent, real-time obstacle detection on mobile devices, based on a dynamic, free-form 3D representation of the environment: the particle-based occupancy grid. Images acquired in real time from the smart mobile device's camera are processed by removing the perspective effect and segmenting the resulting bird's-eye view image to identify candidate obstacle areas, which are then used to update the occupancy grid. The tracked occupancy grid cells are grouped into obstacles, depicted as cuboids having position, size, orientation, and speed. The easy-to-set-up system is able to reliably detect most obstacles in urban traffic, and its measurement accuracy is comparable to that of a stereovision system.
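
    The bird's-eye view mentioned above is commonly obtained by removing the perspective effect with a homography (inverse perspective mapping); the Python/OpenCV sketch below illustrates that step only, with point correspondences that are purely hypothetical stand-ins for a calibrated setup.

      # Minimal inverse-perspective-mapping sketch: warp a road image into a
      # bird's-eye view with a homography. The correspondences are hypothetical
      # and would come from camera calibration in a real system.
      import cv2
      import numpy as np

      frame = cv2.imread("road_frame.png")  # hypothetical camera frame
      if frame is not None:
          h, w = frame.shape[:2]
          # Four points on the road plane in the image (a trapezoid) ...
          src = np.float32([[w * 0.45, h * 0.60], [w * 0.55, h * 0.60],
                            [w * 0.90, h * 0.95], [w * 0.10, h * 0.95]])
          # ... and where they should land in the top-down view.
          dst = np.float32([[w * 0.30, 0], [w * 0.70, 0],
                            [w * 0.70, h], [w * 0.30, h]])
          H = cv2.getPerspectiveTransform(src, dst)
          birds_eye = cv2.warpPerspective(frame, H, (w, h))
          cv2.imwrite("birds_eye.png", birds_eye)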

  19. An open-source framework for testing tracking devices using Lego Mindstorms

    NASA Astrophysics Data System (ADS)

    Jomier, Julien; Ibanez, Luis; Enquobahrie, Andinet; Pace, Danielle; Cleary, Kevin

    2009-02-01

    In this paper, we present an open-source framework for testing tracking devices in surgical navigation applications. At the core of image-guided intervention systems is the tracking interface that handles communication with the tracking device and gathers tracking information. Given that the correctness of tracking information is critical for protecting patient safety and for ensuring the successful execution of an intervention, the tracking software component needs to be thoroughly tested on a regular basis. Furthermore, with the widespread use of extreme programming methodology, which emphasizes continuous and incremental testing of application components, testing design becomes critical. While it is easy to automate most of the testing process, it is often more difficult to test components that require manual intervention, such as a tracking device. Our framework consists of a robotic arm built from a set of Lego Mindstorms and an open-source toolkit written in C++ to control the robot movements and assess the accuracy of the tracking devices. The application program interface (API) is cross-platform and runs on Windows, Linux and MacOS. We applied this framework to the continuous testing of the Image-Guided Surgery Toolkit (IGSTK), an open-source toolkit for image-guided surgery, and have shown that regression testing of tracking devices can be performed at low cost and can significantly improve the quality of the software.
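
    An automated accuracy check of the kind such a framework enables might look like the sketch below; the robot and tracker interfaces here are hypothetical placeholders (not the IGSTK or Lego Mindstorms API), and the tolerance is an assumed value.

      # Sketch of a tracking-accuracy regression test: command the robot arm
      # through known poses, read positions back from the tracker, and fail
      # if the RMS error exceeds a tolerance. `robot` and `tracker` are
      # hypothetical stand-ins for the real device interfaces.
      import math

      def rms_error(commanded, measured):
          """Root-mean-square Euclidean distance between paired 3D points (mm)."""
          sq = [sum((c - m) ** 2 for c, m in zip(cp, mp))
                for cp, mp in zip(commanded, measured)]
          return math.sqrt(sum(sq) / len(sq))

      def run_accuracy_test(robot, tracker, poses_mm, tolerance_mm=1.0):
          measured = []
          for pose in poses_mm:
              robot.move_to(pose)              # hypothetical robot call
              measured.append(tracker.read())  # hypothetical tracker call
          error = rms_error(poses_mm, measured)
          assert error <= tolerance_mm, f"tracking RMS error {error:.2f} mm too high"
          return error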

  20. Postural stabilizing effect of alfacalcidol and active absorbable algal calcium (AAA Ca) compared with calcium carbonate assessed by computerized posturography.

    PubMed

    Fujita, Takuo; Nakamura, Shoji; Ohue, Mutsumi; Fujii, Yoshio; Miyauchi, Akimitsu; Takagi, Yasuyuki; Tsugeno, Hirofumi

    2007-01-01

    Sway and postural instability have drawn attention as a risk factor for osteoporotic fracture, in addition to low bone mineral density (BMD) and poor bone quality. In view of the fracture-reducing effect of alfacalcidol and active absorbable algal calcium (AAA Ca) not being readily explained by rather mild increases of BMD, attempts were made to evaluate the postural stabilizing effects of alfacalcidol, AAA Ca, and calcium carbonate (CaCO3) by computerized posturography. The track of the center of gravity was analyzed to calculate parameters related to track length, track range, and track density, expressing the degree of sway before and after supplementation in 126 subjects ranging in age from 20 to 81 years, randomly divided into four groups. Supplementation with AAA Ca containing 900 mg elemental Ca (group A), no calcium (group B), CaCO3 also containing 900 mg elemental Ca (group C), or alfacalcidol (group D) continued daily for 12 months. For each parameter, the ratio of the closed-eye value to the open-eye value (Romberg ratio) was calculated to detect aggravation of sway by eye closure. Age, parameters of Ca and P, and proportions of subjects with fracture and those with low BMD showed no marked deviation among the groups. With eyes open, significant decreases of a track range parameter (REC) from group B were noted in groups A (P = 0.0397) and D (P = 0.0296), but not in group C, according to multiple comparison by Scheffe, indicating a superior postural stabilizing effect of groups A and D over group C. In the first 2 months, a significant fall was already evident in REC from group B in group D (P = 0.0120) with eyes open. Paired comparison of sway parameters before and after supplementation revealed a significant increase of the track density parameter (LNGA), indicating sway control efficiency, and a significant decrease of REC in groups A and D compared to group B with eyes open. With eyes closed, only group A showed a significant improvement from group B (P = 0.0456; Fig. 1), with a significant shortening on paired After/Before comparison (P = 0.0142; Fig. 2). Computerized posturography appears to be useful in analyzing sway phenomena, especially with regard to the effects of vitamin D and various Ca preparations.
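
    The Romberg ratio used in this study is simply the eyes-closed value of a sway parameter divided by its eyes-open value; below is a minimal Python sketch with made-up parameter values (only the parameter names REC and LNGA come from the record above).

      # Minimal sketch: Romberg ratio = eyes-closed value / eyes-open value
      # for each sway parameter. The numbers are made up.
      def romberg_ratio(closed_eye_value, open_eye_value):
          return closed_eye_value / open_eye_value

      sway_open = {"track_length": 62.0, "REC": 4.1, "LNGA": 15.3}    # eyes open
      sway_closed = {"track_length": 80.6, "REC": 5.7, "LNGA": 18.4}  # eyes closed

      ratios = {param: romberg_ratio(sway_closed[param], sway_open[param])
                for param in sway_open}
      print(ratios)  # values > 1 indicate that sway worsens with eye closure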

  1. Interacting with mobile devices by fusion eye and hand gestures recognition systems based on decision tree approach

    NASA Astrophysics Data System (ADS)

    Elleuch, Hanene; Wali, Ali; Samet, Anis; Alimi, Adel M.

    2017-03-01

    Two systems of eye and hand gesture recognition are used to control mobile devices. Based on real-time video streaming captured from the device's camera, the first system recognizes the motion of the user's eyes and the second one detects static hand gestures. To avoid any confusion between natural and intentional movements, we developed a system to fuse the decisions coming from the eye and hand gesture recognition systems. The fusion phase was based on a decision tree approach. We conducted a study on 5 volunteers, and the results show that our system is robust and competitive.
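
    As a loose sketch of decision-level fusion with a decision tree (our illustration on synthetic inputs, not the authors' implementation), the predicted classes and confidences of the two recognizers can be combined as features for a tree classifier.

      # Illustrative decision-level fusion: the gesture predicted by the eye
      # recognizer and by the hand recognizer, plus their confidences, feed a
      # decision tree that outputs the final command. Data are synthetic.
      from sklearn.tree import DecisionTreeClassifier

      # Features: [eye_gesture_id, eye_confidence, hand_gesture_id, hand_confidence]
      X_train = [
          [0, 0.9, 0, 0.8],   # both recognizers agree on gesture 0
          [1, 0.8, 1, 0.9],   # both agree on gesture 1
          [0, 0.4, 1, 0.9],   # eye uncertain, trust the hand
          [1, 0.9, 0, 0.3],   # hand uncertain, trust the eye
      ]
      y_train = [0, 1, 1, 1]  # fused command labels (synthetic)

      fusion_tree = DecisionTreeClassifier(max_depth=3, random_state=0)
      fusion_tree.fit(X_train, y_train)

      # Fused decision for a new pair of recognizer outputs.
      print(fusion_tree.predict([[0, 0.95, 0, 0.2]]))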

  2. Eye tracking reveals a crucial role for facial motion in recognition of faces by infants

    PubMed Central

    Xiao, Naiqi G.; Quinn, Paul C.; Liu, Shaoying; Ge, Liezhong; Pascalis, Olivier; Lee, Kang

    2015-01-01

    Current knowledge about face processing in infancy comes largely from studies using static face stimuli, but faces that infants see in the real world are mostly moving ones. To bridge this gap, 3-, 6-, and 9-month-old Asian infants (N = 118) were familiarized with either moving or static Asian female faces and then their face recognition was tested with static face images. Eye tracking methodology was used to record eye movements during familiarization and test phases. The results showed a developmental change in eye movement patterns, but only for the moving faces. In addition, the more infants shifted their fixations across facial regions, the better was their face recognition, but only for the moving faces. The results suggest that facial movement influences the way faces are encoded from early in development. PMID:26010387

  3. Reasoning strategies with rational numbers revealed by eye tracking.

    PubMed

    Plummer, Patrick; DeWolf, Melissa; Bassok, Miriam; Gordon, Peter C; Holyoak, Keith J

    2017-07-01

    Recent research has begun to investigate the impact of different formats for rational numbers on the processes by which people make relational judgments about quantitative relations. DeWolf, Bassok, and Holyoak (Journal of Experimental Psychology: General, 144(1), 127-150, 2015) found that accuracy on a relation identification task was highest when fractions were presented with countable sets, whereas accuracy was relatively low for all conditions where decimals were presented. However, it is unclear what processing strategies underlie these disparities in accuracy. We report an experiment that used eye-tracking methods to externalize the strategies that are evoked by different types of rational numbers for different types of quantities (discrete vs. continuous). Results showed that eye-movement behavior during the task was jointly determined by image and number format. Discrete images elicited a counting strategy for both fractions and decimals, but this strategy led to higher accuracy only for fractions. Continuous images encouraged magnitude estimation and comparison, but to a greater degree for decimals than fractions. This strategy led to decreased accuracy for both number formats. By analyzing participants' eye movements when they viewed a relational context and made decisions, we were able to obtain an externalized representation of the strategic choices evoked by different ontological types of entities and different types of rational numbers. Our findings using eye-tracking measures enable us to go beyond previous studies based on accuracy data alone, demonstrating that quantitative properties of images and the different formats for rational numbers jointly influence strategies that generate eye-movement behavior.

  4. Comparability of anterior chamber depth measurements with partial coherence interferometry and optical low-coherence reflectometry in pseudophakic eyes.

    PubMed

    Luft, Nikolaus; Hirnschall, Nino; Farrokhi, Sanaz; Findl, Oliver

    2015-08-01

    To assess whether anterior chamber depth (ACD) measurements in pseudophakic eyes obtained with partial coherence interferometry (PCI) and optical low-coherence reflectometry (OLCR) devices can be used interchangeably. Vienna Institute for Research in Ocular Surgery, A Karl Landsteiner Institute, Hanusch Hospital, Vienna, Austria. Prospective case series. The ACD measurements in 1 eye of each pseudophakic patient were performed with the PCI-based ACMaster device and the OLCR-based Lenstar LS900 device at least 1 day postoperatively. The study comprised 65 eyes of 65 patients with a mean age of 71.7 years ± 9.0 (SD) (range 39 to 91 years). In 15 eyes, no valid ACD readings could be obtained with the OLCR device. No obvious reason for these measurement failures was identified; however, tear-film alterations shortly after surgery were suspected. No significant difference in the mean ACD in the remaining 50 eyes was found between PCI measurements (5019 ± 660 μm; range 4008 to 6181 μm) and OLCR measurements (5015 ± 663 μm; range 4017 to 6163 μm) (P = .06). Three (6%) of 50 measurements were not within the 95% limits of agreement in the Bland-Altman analysis. Pseudophakic ACD measurements with the PCI and OLCR devices can be used interchangeably. The OLCR device proved to be more user-friendly and faster; however, in a substantial number of eyes, no usable values were obtainable. No author has a financial or proprietary interest in any material or method mentioned. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  5. An ex vivo rat eye model to aid development of high-resolution retina imaging devices for rodents

    NASA Astrophysics Data System (ADS)

    van Oterendorp, Christian; Martin, Keith R.; Zhong, Jiang Jian; Diaz-Santana, Luis

    2010-09-01

    High resolution in vivo retinal imaging in rodents is becoming increasingly important in eye research. Development of suitable imaging devices currently requires many lengthy animal procedures. We present an ex vivo rat model eye with fluorescently labelled retinal ganglion cells (RGC) and nerve fibre bundles that reduces the need for animal procedures while preserving key properties of the living rat eye. Optical aberrations and scattering of four model eyes and eight live rat eyes were quantified using a Shack-Hartmann sensor. Fluorescent images from RGCs were obtained using a prototype scanning laser ophthalmoscope. The wavefront aberration root mean square value without defocus did not significantly differ between model and living eyes. Higher order aberrations were slightly higher but RGC image quality was comparable to published in vivo work. Overall, the model allows a large reduction in number and duration of animal procedures required to develop new in vivo retinal imaging devices.

  6. An Examination of Cognitive Processing of Multimedia Information Based on Viewers' Eye Movements

    ERIC Educational Resources Information Center

    Liu, Han-Chin; Chuang, Hsueh-Hua

    2011-01-01

    This study utilized qualitative and quantitative designs and eye-tracking technology to understand how viewers process multimedia information. Eye movement data were collected from eight college students (non-science majors) while they were viewing web pages containing different types of text and illustrations depicting the mechanism of…

  7. Codebook-based electrooculography data analysis towards cognitive activity recognition.

    PubMed

    Lagodzinski, P; Shirahama, K; Grzegorzek, M

    2018-04-01

    With the advancement of mobile/wearable technology, people have started to use a variety of sensing devices to track their daily activities as well as their health and fitness in order to improve quality of life. This work addresses eye movement analysis, which, owing to its strong correlation with cognitive tasks, can be utilized for activity recognition. Eye movements are recorded using an electrooculographic (EOG) system built into the frames of glasses, which can be worn more unobtrusively and comfortably than other devices. Since the obtained information is low-level sensor data expressed as a sequence of values sampled at constant intervals (100 Hz), the cognitive activity recognition problem is formulated as sequence classification. However, it is unclear what kind of features are useful for accurate cognitive activity recognition. Thus, a codebook approach is applied which, instead of relying on feature engineering, describes sequences of recorded EOG data by the distribution of characteristic subsequences (codewords), where the codewords are obtained by clustering a large number of subsequences. Further, statistical analysis of the codeword distribution reveals features that are characteristic of a certain activity class. Experimental results demonstrate good accuracy of the codebook-based cognitive activity recognition, reflecting the effective use of the codewords. Copyright © 2017 Elsevier Ltd. All rights reserved.
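
    A sketch of the codebook idea described above: sliding-window subsequences from the EOG signal are clustered into codewords, each recording is summarized by its codeword histogram, and any standard classifier recognizes the activity from that histogram. Window length, codebook size, and the simulated signals are illustrative choices, not the parameters of the cited study.

```python
# Codebook representation for EOG-like sequences (illustrative parameters).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def subsequences(seq, win=64, step=16):
    """Sliding-window subsequences of a 1-D signal."""
    return np.array([seq[i:i + win] for i in range(0, len(seq) - win + 1, step)])

rng = np.random.default_rng(1)
# Simulated 100 Hz EOG recordings for two hypothetical activity classes.
recordings = [rng.standard_normal(1000) + cls * np.sin(np.arange(1000) / 20)
              for cls in (0, 1) for _ in range(20)]
labels = [0] * 20 + [1] * 20

# 1) Build the codebook by clustering subsequences pooled from all recordings.
k = 32
pooled = np.vstack([subsequences(r) for r in recordings])
codebook = KMeans(n_clusters=k, n_init=10, random_state=0).fit(pooled)

# 2) Encode each recording as a normalized histogram of codeword assignments.
def encode(seq):
    words = codebook.predict(subsequences(seq))
    hist = np.bincount(words, minlength=k).astype(float)
    return hist / hist.sum()

X = np.array([encode(r) for r in recordings])

# 3) Classify activities from the codeword distributions.
clf = SVC().fit(X, labels)
print(clf.score(X, labels))
```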

  8. Comparison of home and away-from-home physical activity using accelerometers and cellular network-based tracking devices.

    PubMed

    Ramulu, Pradeep Y; Chan, Emilie S; Loyd, Tara L; Ferrucci, Luigi; Friedman, David S

    2012-08-01

    Measuring physical activity at home and away from home is essential for assessing health and well-being, and could help design interventions to increase physical activity. Here, we describe how physical activity at home and away from home can be quantified by combining information from cellular network-based tracking devices and accelerometers. Thirty-five working adults wore a cellular network-based tracking device and an accelerometer for 6 consecutive days and logged their travel away from home. Performance of the tracking device was determined using the travel log for reference. Tracking device and accelerometer data were merged to compare physical activity at home and away from home. The tracking device detected 98.6% of all away-from-home excursions, accurately measured time away from home and demonstrated few prolonged signal drop-out periods. Most physical activity took place away from home on weekdays, but not on weekends. Subjects were more physically active per unit of time while away from home, particularly on weekends. Cellular network-based tracking devices represent an alternative to global positioning systems for tracking location, and provide information easily integrated with accelerometers to determine where physical activity takes place. Promoting greater time spent away from home may increase physical activity.
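
    The merging step described above can be illustrated with a small sketch: accelerometer epochs are labeled home or away using the intervals reported by the tracking device, then activity counts are summarized per location. Column names and data below are hypothetical, not from the cited study.

```python
# Illustrative merge of accelerometer epochs with away-from-home intervals.
import pandas as pd

# One-minute accelerometer epochs with activity counts.
epochs = pd.DataFrame({
    "time": pd.date_range("2012-06-01 08:00", periods=8, freq="min"),
    "counts": [120, 300, 2500, 2600, 180, 90, 1500, 200],
})

# Intervals during which the tracking device placed the subject away from home.
away = pd.DataFrame({
    "start": pd.to_datetime(["2012-06-01 08:02"]),
    "end": pd.to_datetime(["2012-06-01 08:04"]),
})

def is_away(t):
    """True if timestamp t falls inside any logged away-from-home interval."""
    return bool(((away["start"] <= t) & (t < away["end"])).any())

epochs["location"] = epochs["time"].map(lambda t: "away" if is_away(t) else "home")

# Activity totals and per-minute intensity by location.
print(epochs.groupby("location")["counts"].agg(["sum", "mean"]))
```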

  9. Distractor interference during smooth pursuit eye movements.

    PubMed

    Spering, Miriam; Gegenfurtner, Karl R; Kerzel, Dirk

    2006-10-01

    When 2 targets for pursuit eye movements move in different directions, the eye velocity follows the vector average (S. G. Lisberger & V. P. Ferrera, 1997). The present study investigates the mechanisms of target selection when observers are instructed to follow a predefined horizontal target and to ignore a moving distractor stimulus. Results show that at 140 ms after distractor onset, horizontal eye velocity is decreased by about 25%. Vertical eye velocity increases or decreases by 1°/s in the direction opposite to the distractor. This deviation varies in size with distractor direction, velocity, and contrast. The effect was present during the initiation and steady-state tracking phase of pursuit but only when the observer had prior information about target motion. Neither vector averaging nor winner-take-all models could predict the response to a moving to-be-ignored distractor during steady-state tracking of a predefined target. The contributions of perceptual mislocalization and spatial attention to the vertical deviation in pursuit are discussed. Copyright 2006 APA.

  10. Contribution of malocclusion and female facial attractiveness to smile esthetics evaluated by eye tracking.

    PubMed

    Richards, Michael R; Fields, Henry W; Beck, F Michael; Firestone, Allen R; Walther, Dirk B; Rosenstiel, Stephen; Sacksteder, James M

    2015-04-01

    There is disagreement in the literature concerning the importance of the mouth in overall facial attractiveness. Eye tracking provides an objective method to evaluate what people see. The objective of this study was to determine whether dental and facial attractiveness alters viewers' visual attention in terms of which area of the face (eyes, nose, mouth, chin, ears, or other) is viewed first, viewed the greatest number of times, and viewed for the greatest total time (duration) using eye tracking. Seventy-six viewers underwent 1 eye tracking session. Of these, 53 were white (49% female, 51% male). Their ages ranged from 18 to 29 years, with a mean of 19.8 years, and none were dental professionals. After being positioned and calibrated, they were shown 24 unique female composite images, each image shown twice for reliability. These images reflected a repaired unilateral cleft lip or 3 grades of dental attractiveness similar to those of grades 1 (near ideal), 7 (borderline treatment need), and 10 (definite treatment need) as assessed in the aesthetic component of the Index of Orthodontic Treatment Need (AC-IOTN). The images were then embedded in faces of 3 levels of attractiveness: attractive, average, and unattractive. During viewing, data were collected for the first location, frequency, and duration of each viewer's gaze. Observer reliability ranged from 0.58 to 0.92 (intraclass correlation coefficients) but was less than 0.07 (interrater) for the chin, which was eliminated from the study. Likewise, reliability for the area of first fixation was kappa less than 0.10 for both intrarater and interrater reliabilities; the area of first fixation was also removed from the data analysis. Repeated-measures analysis of variance showed a significant effect (P <0.001) for level of attractiveness by malocclusion by area of the face. For both number of fixations and duration of fixations, the eyes were overwhelmingly the most salient, with the mouth receiving the second most visual attention. At times, the mouth and the eyes were statistically indistinguishable in the number and duration of viewers' fixations. As the dental attractiveness decreased, visual attention to the mouth increased, approaching that of the eyes. AC-IOTN grade 10 gained the most attention, followed by both AC-IOTN grade 7 and the cleft. AC-IOTN grade 1 received the least amount of visual attention. Also, lower dental attractiveness (AC-IOTN 7 and AC-IOTN 10) received more visual attention as facial attractiveness increased. Eye tracking indicates that dental attractiveness can alter the level of visual attention depending on the female models' facial attractiveness when viewed by laypersons. Copyright © 2015 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  11. Eye vision system using programmable micro-optics and micro-electronics

    NASA Astrophysics Data System (ADS)

    Riza, Nabeel A.; Amin, M. Junaid; Riza, Mehdi N.

    2014-02-01

    Proposed is a novel eye vision system that combines the use of advanced micro-optic and microelectronic technologies that includes programmable micro-optic devices, pico-projectors, Radio Frequency (RF) and optical wireless communication and control links, energy harvesting and storage devices and remote wireless energy transfer capabilities. This portable light weight system can measure eye refractive powers, optimize light conditions for the eye under test, conduct color-blindness tests, and implement eye strain relief and eye muscle exercises via time sequenced imaging. Described is the basic design of the proposed system and its first stage system experimental results for vision spherical lens refractive error correction.

  12. Dual Purkinje-Image Eyetracker

    DTIC Science & Technology

    1996-01-01

    Abnormal nystagmus can also be detected through the use of an eyetracker [4]. Through tracking points of eye gaze within a scene, it is possible to...moving, even when gazing. Correcting for these unpredictable micro eye movements would allow corrective procedures in eye surgery to become more accurate...victim with a screen of letters on a monitor. A calibrated eyetracker then provides a processor with information about the location of eye gaze. The

  13. Comparing Eye Tracking with Electrooculography for Measuring Individual Sentence Comprehension Duration

    PubMed Central

    Müller, Jana Annina; Wendt, Dorothea; Kollmeier, Birger; Brand, Thomas

    2016-01-01

    The aim of this study was to validate a procedure for performing the audio-visual paradigm introduced by Wendt et al. (2015) with reduced practical challenges. The original paradigm records eye fixations using an eye tracker and calculates the duration of sentence comprehension based on a bootstrap procedure. In order to reduce practical challenges, we first reduced the measurement time by evaluating a smaller measurement set with fewer trials. The results of 16 listeners showed effects comparable to those obtained when testing the original full measurement set on a different collective of listeners. Secondly, we introduced electrooculography as an alternative technique for recording eye movements. The correlation between the results of the two recording techniques (eye tracker and electrooculography) was r = 0.97, indicating that both methods are suitable for estimating the processing duration of individual participants. Similar changes in processing duration arising from sentence complexity were found using the eye tracker and the electrooculography procedure. Thirdly, the time course of eye fixations was estimated with an alternative procedure, growth curve analysis, which is more commonly used in recent studies analyzing eye tracking data. The results of the growth curve analysis were compared with the results of the bootstrap procedure. Both analysis methods show similar processing durations. PMID:27764125
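
    To illustrate the resampling idea behind the bootstrap procedure mentioned above, the sketch below estimates one listener's mean processing duration and its confidence interval from simulated single-trial durations. This is only a generic bootstrap illustration, not the authors' exact fixation-based procedure.

```python
# Generic bootstrap of a mean processing duration (illustrative data).
import numpy as np

rng = np.random.default_rng(2)
durations = rng.normal(loc=850, scale=200, size=40)   # simulated trial durations (ms)

boot_means = np.array([
    rng.choice(durations, size=durations.size, replace=True).mean()
    for _ in range(5000)
])
estimate = boot_means.mean()
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
print(f"duration ~ {estimate:.0f} ms, 95% CI [{ci_low:.0f}, {ci_high:.0f}]")
```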

  14. Dynamic light scattering in ophthalmology: results of in vitro and in vivo experiments.

    PubMed

    Fankhauser, Franz

    2006-01-01

    To calibrate new dynamic light scattering (DLS) devices in defined solutions and in post mortem porcine and human eyes, to examine all segments of the eye, and to become familiar with the use of the technique in living subjects. METHODS, DESIGN: Three new DLS devices for use in patients were developed. Mono-disperse solutions, poly-disperse solutions, gels, post mortem porcine and human eyes as well as healthy volunteers were studied. The detected signals were inverted into autocorrelation functions. We constructed three DLS devices appropriate for in vitro as well as in vivo examinations. In mono-disperse solutions, precise disintegration rates could be calculated. In poly-disperse solutions, in gels, and in the vitreous, the results did not correlate with movements of individual particles, but we could calculate characteristics of the complete scattering system. In vivo measurements demonstrated that DLS can be used in all human eye segments. DLS is a unique technique. With DLS the molecular composition of eye segments can be studied in living subjects. This can be used to understand the molecular basis of severe eye diseases. The presented data demonstrate that DLS delivers reproducible data from all eye segments. It is possible to study the molecular structures of eye segments in living subjects. The developed devices proved successful in vitro as well as in vivo. Limitations are the low specificity of DLS and its sensitivity to background noise. Clinical studies are now necessary to demonstrate potential diagnostic benefits of DLS in specific eye diseases.

  15. The specificity of attentional biases by type of gambling: An eye-tracking study.

    PubMed

    McGrath, Daniel S; Meitner, Amadeus; Sears, Christopher R

    2018-01-01

    A growing body of research indicates that gamblers develop an attentional bias for gambling-related stimuli. Compared to research on substance use, however, few studies have examined attentional biases in gamblers using eye-gaze tracking, which has many advantages over other measures of attention. In addition, previous studies of attentional biases in gamblers have not directly matched type of gambler with personally-relevant gambling cues. The present study investigated the specificity of attentional biases for individual types of gambling using an eye-gaze tracking paradigm. Three groups of participants (poker players, video lottery terminal/slot machine players, and non-gambling controls) took part in one test session in which they viewed 25 sets of four images (poker, VLTs/slot machines, bingo, and board games). Participants' eye fixations were recorded throughout each 8-second presentation of the four images. The results indicated that, as predicted, the two gambling groups preferentially attended to their primary form of gambling, whereas control participants attended to board games more than gambling images. The findings have clinical implications for the treatment of individuals with gambling disorder. Understanding the importance of personally-salient gambling cues will inform the development of effective attentional bias modification treatments for problem gamblers.
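
    Attentional bias in such gaze paradigms is often summarized as the proportion of dwell time on each image category across trials. The sketch below computes such a dwell-time index for one hypothetical participant; the data, category names, and the specific index are illustrative and not taken from the study.

```python
# Dwell-time attentional-bias index for one participant (illustrative data).
import pandas as pd

fixations = pd.DataFrame({
    "trial": [1, 1, 1, 2, 2, 2],
    "category": ["poker", "bingo", "board", "poker", "slots", "board"],
    "duration_ms": [2200, 600, 400, 1800, 900, 500],
})

dwell = fixations.groupby("category")["duration_ms"].sum()
bias = dwell / dwell.sum()        # proportion of total dwell time per category
print(bias.sort_values(ascending=False))
```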

  16. Perception of direct vs. averted gaze in portrait paintings: An fMRI and eye-tracking study.

    PubMed

    Kesner, Ladislav; Grygarová, Dominika; Fajnerová, Iveta; Lukavský, Jiří; Nekovářová, Tereza; Tintěra, Jaroslav; Zaytseva, Yuliya; Horáček, Jiří

    2018-06-15

    In this study, we use separate eye-tracking measurements and functional magnetic resonance imaging to investigate the neuronal and behavioral response to painted portraits with direct versus averted gaze. We further explored modulatory effects of several painting characteristics (premodern vs modern period, influence of style and pictorial context). In the fMRI experiment, we show that the direct versus averted gaze elicited increased activation in lingual and inferior occipital and the fusiform face area, as well as in several areas involved in attentional and social cognitive processes, especially the theory of mind: angular gyrus/temporo-parietal junction, inferior frontal gyrus and dorsolateral prefrontal cortex. The additional eye-tracking experiment showed that participants spent more time viewing the portrait's eyes and mouth when the portrait's gaze was directed towards the observer. These results suggest that static and, in some cases, highly stylized depictions of human beings in artistic portraits elicit brain activation commensurate with the experience of being observed by a watchful intelligent being. They thus involve observers in implicit inferences of the painted subject's mental states and emotions. We further confirm the substantial influence of representational medium on brain activity. Copyright © 2018 Elsevier Inc. All rights reserved.

  17. Eye-tracking of visual attention in web-based assessment using the Force Concept Inventory

    NASA Astrophysics Data System (ADS)

    Han, Jing; Chen, Li; Fu, Zhao; Fritchman, Joseph; Bao, Lei

    2017-07-01

    This study used eye-tracking technology to investigate students’ visual attention while taking the Force Concept Inventory (FCI) in a web-based interface. Eighty nine university students were randomly selected into a pre-test group and a post-test group. Students took the 30-question FCI on a computer equipped with an eye-tracker. There were seven weeks of instruction between the pre- and post-test data collection. Students’ performance on the FCI improved significantly from pre-test to post-test. Meanwhile, the eye-tracking results reveal that the time students spent on taking the FCI test was not affected by student performance and did not change from pre-test to post-test. Analysis of students’ attention to answer choices shows that on the pre-test students primarily focused on the naïve choices and ignored the expert choices. On the post-test, although students had shifted their primary attention to the expert choices, they still kept a high level of attention to the naïve choices, indicating significant conceptual mixing and competition during problem solving. Outcomes of this study provide new insights on students’ conceptual development in learning physics.

  18. Visual Fatigue Induced by Viewing a Tablet Computer with a High-resolution Display.

    PubMed

    Kim, Dong Ju; Lim, Chi Yeon; Gu, Namyi; Park, Choul Yong

    2017-10-01

    In the present study, the visual discomfort induced by smart mobile devices was assessed in normal and healthy adults. Fifty-nine volunteers (age, 38.16 ± 10.23 years; male : female = 19 : 40) were exposed to tablet computer screen stimuli (iPad Air, Apple Inc.) for 1 hour. Participants watched a movie or played a computer game on the tablet computer. Visual fatigue and discomfort were assessed using an asthenopia questionnaire, tear film break-up time, and total ocular wavefront aberration before and after viewing smart mobile devices. Based on the questionnaire, viewing smart mobile devices for 1 hour significantly increased mean total asthenopia score from 19.59 ± 8.58 to 22.68 ± 9.39 (p < 0.001). Specifically, the scores for five items (tired eyes, sore/aching eyes, irritated eyes, watery eyes, and hot/burning eye) were significantly increased by viewing smart mobile devices. Tear film break-up time significantly decreased from 5.09 ± 1.52 seconds to 4.63 ± 1.34 seconds (p = 0.003). However, total ocular wavefront aberration was unchanged. Visual fatigue and discomfort were significantly induced by viewing smart mobile devices, even though the devices were equipped with state-of-the-art display technology. © 2017 The Korean Ophthalmological Society

  19. Visual Fatigue Induced by Viewing a Tablet Computer with a High-resolution Display

    PubMed Central

    Kim, Dong Ju; Lim, Chi-Yeon; Gu, Namyi

    2017-01-01

    Purpose In the present study, the visual discomfort induced by smart mobile devices was assessed in normal and healthy adults. Methods Fifty-nine volunteers (age, 38.16 ± 10.23 years; male : female = 19 : 40) were exposed to tablet computer screen stimuli (iPad Air, Apple Inc.) for 1 hour. Participants watched a movie or played a computer game on the tablet computer. Visual fatigue and discomfort were assessed using an asthenopia questionnaire, tear film break-up time, and total ocular wavefront aberration before and after viewing smart mobile devices. Results Based on the questionnaire, viewing smart mobile devices for 1 hour significantly increased mean total asthenopia score from 19.59 ± 8.58 to 22.68 ± 9.39 (p < 0.001). Specifically, the scores for five items (tired eyes, sore/aching eyes, irritated eyes, watery eyes, and hot/burning eye) were significantly increased by viewing smart mobile devices. Tear film break-up time significantly decreased from 5.09 ± 1.52 seconds to 4.63 ± 1.34 seconds (p = 0.003). However, total ocular wavefront aberration was unchanged. Conclusions Visual fatigue and discomfort were significantly induced by viewing smart mobile devices, even though the devices were equipped with state-of-the-art display technology. PMID:28914003

  20. Patient Positioning Based on a Radioactive Tracer Implanted in Patients With Localized Prostate Cancer: A Performance and Safety Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kruijf, Willy J.M. de, E-mail: kruijf.de.w@bvi.nl; Verstraete, Jan; Neustadter, David

    2013-02-01

    Purpose: To evaluate the performance and safety of a radiation therapy positioning system (RealEye) based on tracking a radioactive marker (Tracer) implanted in patients with localized prostate cancer. Methods and Materials: We performed a single-arm multi-institutional trial in 20 patients. The iridium-192 (¹⁹²Ir)-containing Tracer was implanted in the patient together with 4 standard gold seed fiducials. Patient prostate-related symptoms were evaluated with the International Prostate Symptom Score (IPSS) questionnaire. Computed tomography (CT) was performed for treatment planning, during treatment, and after treatment to evaluate the migration stability of the Tracer. At 5 treatment sessions, cone beam CT was performed to test the positioning accuracy of the RealEye. Results: The Tracer was successfully implanted in all patients. No device or procedure-related adverse events occurred. Changes in IPSS scores were limited. The difference between the mean change in Tracer-fiducial distance and the mean change in fiducial-fiducial distance was -0.39 mm (95% confidence interval [CI] upper boundary, -0.22 mm). The adjusted mean difference between Tracer position according to RealEye and the Tracer position on the CBCT for all patients was 1.34 mm (95% CI upper boundary, 1.41 mm). Conclusions: Implantation of the Tracer is feasible and safe. Migration stability of the Tracer is good. Prostate patients can be positioned and monitored accurately by using RealEye.

  1. Patient positioning based on a radioactive tracer implanted in patients with localized prostate cancer: a performance and safety evaluation.

    PubMed

    de Kruijf, Willy J M; Verstraete, Jan; Neustadter, David; Corn, Benjamin W; Hol, Sandra; Venselaar, Jack L M; Davits, Rob J; Wijsman, Bart P; Van den Bergh, Laura; Budiharto, Tom; Oyen, Raymond; Haustermans, Karin; Poortmans, Philip M P

    2013-02-01

    To evaluate the performance and safety of a radiation therapy positioning system (RealEye) based on tracking a radioactive marker (Tracer) implanted in patients with localized prostate cancer. We performed a single-arm multi-institutional trial in 20 patients. The iridium-192 ((192)Ir)-containing Tracer was implanted in the patient together with 4 standard gold seed fiducials. Patient prostate-related symptoms were evaluated with the International Prostate Symptom Score (IPSS) questionnaire. Computed tomography (CT) was performed for treatment planning, during treatment, and after treatment to evaluate the migration stability of the Tracer. At 5 treatment sessions, cone beam CT was performed to test the positioning accuracy of the RealEye. The Tracer was successfully implanted in all patients. No device or procedure-related adverse events occurred. Changes in IPSS scores were limited. The difference between the mean change in Tracer-fiducial distance and the mean change in fiducial-fiducial distance was -0.39 mm (95% confidence interval [CI] upper boundary, -0.22 mm). The adjusted mean difference between Tracer position according to RealEye and the Tracer position on the CBCT for all patients was 1.34 mm (95% CI upper boundary, 1.41 mm). Implantation of the Tracer is feasible and safe. Migration stability of the Tracer is good. Prostate patients can be positioned and monitored accurately by using RealEye. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. How are learning strategies reflected in the eyes? Combining results from self-reports and eye-tracking.

    PubMed

    Catrysse, Leen; Gijbels, David; Donche, Vincent; De Maeyer, Sven; Lesterhuis, Marije; Van den Bossche, Piet

    2018-03-01

    Up until now, empirical studies in the Student Approaches to Learning field have mainly focused on the use of self-report instruments, such as interviews and questionnaires, to uncover differences in students' general preferences towards learning strategies, but have focused less on the use of task-specific and online measures. This study aimed at extending current research on students' learning strategies by combining general and task-specific measurements of students' learning strategies using both offline and online measures. We want to clarify how students process learning content and to what extent this is related to their self-reported learning strategies. Twenty students with different generic learning profiles (according to self-report questionnaires) read an expository text while their eye movements were registered, and then answered questions on the content. Eye-tracking data were analysed with generalized linear mixed-effects models. The results indicate that students with an all-high profile, combining both deep and surface learning strategies, spend more time on rereading the text than students with an all-low profile, scoring low on both learning strategies. This study showed that we can use eye-tracking to distinguish very strategic students, characterized by their use of cognitive processing and regulation strategies, from low-strategic students, characterized by a lack of cognitive and regulation strategies. These students processed the expository text according to how they self-reported. © 2017 The British Psychological Society.
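
    The mixed-effects analysis mentioned above can be sketched as follows: rereading time is modeled as a function of learning profile with random intercepts per participant. A linear mixed model stands in for the generalized models used in the study, and variable names and data are hypothetical.

```python
# Simplified mixed-effects sketch: rereading time ~ learning profile,
# with random intercepts per participant (simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
participants = np.repeat(np.arange(20), 30)        # 20 students x 30 text segments
profile = np.repeat(rng.integers(0, 2, 20), 30)    # 0 = all-low, 1 = all-high
reread = (200 + 80 * profile
          + np.repeat(rng.normal(0, 30, 20), 30)   # participant-level variation
          + rng.normal(0, 50, 600))                # residual noise

data = pd.DataFrame({"participant": participants,
                     "profile": profile,
                     "reread_ms": reread})

model = smf.mixedlm("reread_ms ~ profile", data, groups=data["participant"]).fit()
print(model.summary())
```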

  3. Computer vision enhances mobile eye-tracking to expose expert cognition in natural-scene visual-search tasks

    NASA Astrophysics Data System (ADS)

    Keane, Tommy P.; Cahill, Nathan D.; Tarduno, John A.; Jacobs, Robert A.; Pelz, Jeff B.

    2014-02-01

    Mobile eye-tracking provides the fairly unique opportunity to record and elucidate cognition in action. In our research, we are searching for patterns in, and distinctions between, the visual-search performance of experts and novices in the geo-sciences. Traveling to regions resultant from various geological processes as part of an introductory field studies course in geology, we record the prima facie gaze patterns of experts and novices when they are asked to determine the modes of geological activity that have formed the scene-view presented to them. Recording eye video and scene video in natural settings generates complex imagery that requires advanced applications of computer vision research to generate registrations and mappings between the views of separate observers. By developing such mappings, we could then place many observers into a single mathematical space where we can spatio-temporally analyze inter- and intra-subject fixations, saccades, and head motions. While working towards perfecting these mappings, we developed an updated experiment setup that allowed us to statistically analyze intra-subject eye-movement events without the need for a common domain. Through such analyses we are finding statistical differences between novices and experts in these visual-search tasks. In the course of this research we have developed a unified, open-source, software framework for processing, visualization, and interaction of mobile eye-tracking and high-resolution panoramic imagery.

  4. Adaptive optics with pupil tracking for high resolution retinal imaging

    PubMed Central

    Sahin, Betul; Lamory, Barbara; Levecq, Xavier; Harms, Fabrice; Dainty, Chris

    2012-01-01

    Adaptive optics, when integrated into retinal imaging systems, compensates for rapidly changing ocular aberrations in real time and results in improved high resolution images that reveal the photoreceptor mosaic. Imaging the retina at high resolution has numerous potential medical applications, and yet for the development of commercial products that can be used in the clinic, the complexity and high cost of the present research systems have to be addressed. We present a new method to control the deformable mirror in real time based on pupil tracking measurements which uses the default camera for the alignment of the eye in the retinal imaging system and requires no extra cost or hardware. We also present the first experiments done with a compact adaptive optics flood illumination fundus camera where it was possible to compensate for the higher order aberrations of a moving model eye and in vivo in real time based on pupil tracking measurements, without the real time contribution of a wavefront sensor. As an outcome of this research, we showed that pupil tracking can be effectively used as a low cost and practical adaptive optics tool for high resolution retinal imaging because eye movements constitute an important part of the ocular wavefront dynamics. PMID:22312577

  5. Adaptive optics with pupil tracking for high resolution retinal imaging.

    PubMed

    Sahin, Betul; Lamory, Barbara; Levecq, Xavier; Harms, Fabrice; Dainty, Chris

    2012-02-01

    Adaptive optics, when integrated into retinal imaging systems, compensates for rapidly changing ocular aberrations in real time and results in improved high resolution images that reveal the photoreceptor mosaic. Imaging the retina at high resolution has numerous potential medical applications, and yet for the development of commercial products that can be used in the clinic, the complexity and high cost of the present research systems have to be addressed. We present a new method to control the deformable mirror in real time based on pupil tracking measurements which uses the default camera for the alignment of the eye in the retinal imaging system and requires no extra cost or hardware. We also present the first experiments done with a compact adaptive optics flood illumination fundus camera where it was possible to compensate for the higher order aberrations of a moving model eye and in vivo in real time based on pupil tracking measurements, without the real time contribution of a wavefront sensor. As an outcome of this research, we showed that pupil tracking can be effectively used as a low cost and practical adaptive optics tool for high resolution retinal imaging because eye movements constitute an important part of the ocular wavefront dynamics.

  6. Genetics Home Reference: horizontal gaze palsy with progressive scoliosis

    MedlinePlus

    ... to track moving objects. Up-and-down (vertical) eye movements are typically normal. In people with HGPPS , an ... the brainstem is the underlying cause of the eye movement abnormalities associated with the disorder. The cause of ...

  7. The Use of Eye Movements in the Study of Multimedia Learning

    ERIC Educational Resources Information Center

    Hyona, Jukka

    2010-01-01

    This commentary focuses on the use of the eye-tracking methodology to study cognitive processes during multimedia learning. First, some general remarks are made about how the method is applied to investigate visual information processing, followed by a reflection on the eye movement measures employed in the studies published in this special issue.…

  8. Reliability and Validity of Eye Movement Measures of Children's Reading

    ERIC Educational Resources Information Center

    Foster, Tori E.; Ardoin, Scott P.; Binder, Katherine S.

    2018-01-01

    Although strong claims have been made regarding the educational utility of eye tracking, such statements seem somewhat unfounded in the absence of clear evidence regarding the technical adequacy of eye movement (EM) data. Past studies have yielded direct and indirect evidence concerning the utility of EMs as measures of reading, but recent…

  9. The eyes have it: Using eye tracking to inform information processing strategies in multi-attributes choices.

    PubMed

    Ryan, Mandy; Krucien, Nicolas; Hermens, Frouke

    2018-04-01

    Although choice experiments (CEs) are widely applied in economics to study choice behaviour, understanding of how individuals process attribute information remains limited. We show how eye-tracking methods can provide insight into how decisions are made. Participants completed a CE, while their eye movements were recorded. Results show that although the information presented guided participants' decisions, there were also several processing biases at work. Evidence was found of (a) top-to-bottom, (b) left-to-right, and (c) first-to-last order biases. Experimental factors (whether attributes are defined as "best" or "worst," choice task complexity, and attribute ordering) also influence information processing. How individuals visually process attribute information was shown to be related to their choices. Implications for the design and analysis of CEs and future research are discussed. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Eye tracking reveals a crucial role for facial motion in recognition of faces by infants.

    PubMed

    Xiao, Naiqi G; Quinn, Paul C; Liu, Shaoying; Ge, Liezhong; Pascalis, Olivier; Lee, Kang

    2015-06-01

    Current knowledge about face processing in infancy comes largely from studies using static face stimuli, but faces that infants see in the real world are mostly moving ones. To bridge this gap, 3-, 6-, and 9-month-old Asian infants (N = 118) were familiarized with either moving or static Asian female faces, and then their face recognition was tested with static face images. Eye-tracking methodology was used to record eye movements during the familiarization and test phases. The results showed a developmental change in eye movement patterns, but only for the moving faces. In addition, the more infants shifted their fixations across facial regions, the better their face recognition was, but only for the moving faces. The results suggest that facial movement influences the way faces are encoded from early in development. (c) 2015 APA, all rights reserved).

  11. A Model-Based Approach for the Measurement of Eye Movements Using Image Processing

    NASA Technical Reports Server (NTRS)

    Sung, Kwangjae; Reschke, Millard F.

    1997-01-01

    This paper describes a video eye-tracking algorithm which searches for the best fit of the pupil modeled as a circular disk. The algorithm is robust to common image artifacts such as the droopy eyelids and light reflections while maintaining the measurement resolution available by the centroid algorithm. The presented algorithm is used to derive the pupil size and center coordinates, and can be combined with iris-tracking techniques to measure ocular torsion. A comparison search method of pupil candidates using pixel coordinate reference lookup tables optimizes the processing requirements for a least square fit of the circular disk model. This paper includes quantitative analyses and simulation results for the resolution and the robustness of the algorithm. The algorithm presented in this paper provides a platform for a noninvasive, multidimensional eye measurement system which can be used for clinical and research applications requiring the precise recording of eye movements in three-dimensional space.
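
    As a simplified illustration of the circular-disk pupil model described above, the sketch below fits a circle to thresholded dark-pupil pixels by linear least squares and reports the pupil center and size. It shows the circle-model idea only; the paper's comparison search with pixel-coordinate lookup tables and its handling of eyelid and reflection artifacts are not reproduced here.

```python
# Simplified model-based pupil estimate: least-squares circle fit (illustrative).
import numpy as np

def fit_circle(xs, ys):
    """Algebraic least-squares circle fit; returns (cx, cy, radius)."""
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    b = xs**2 + ys**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return cx, cy, np.sqrt(c + cx**2 + cy**2)

# Synthetic "pupil" pixels: points near a circle of radius 40 centred at (120, 90),
# as might be obtained by thresholding a dark pupil in an eye image.
rng = np.random.default_rng(4)
theta = rng.uniform(0, 2 * np.pi, 400)
r = 40 + rng.normal(0, 1.0, 400)
xs, ys = 120 + r * np.cos(theta), 90 + r * np.sin(theta)

cx, cy, radius = fit_circle(xs, ys)
print(f"centre ~ ({cx:.1f}, {cy:.1f}), radius ~ {radius:.1f} px")
```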

  12. Through the eyes of the own-race bias: eye-tracking and pupillometry during face recognition.

    PubMed

    Wu, Esther Xiu Wen; Laeng, Bruno; Magnussen, Svein

    2012-01-01

    People are generally better at remembering faces of their own race than faces of a different race, and this effect is known as the own-race bias (ORB) effect. We used eye-tracking and pupillometry to investigate whether Caucasian and Asian face stimuli elicited different-looking patterns in Caucasian participants in a face-memory task. Consistent with the ORB effect, we found better recognition performance for own-race faces than other-race faces, and shorter response times. In addition, at encoding, eye movements and pupillary responses to Asian faces (i.e., the other race) were different from those to Caucasian faces (i.e., the own race). Processing of own-race faces was characterized by more active scanning, with a larger number of shorter fixations, and more frequent saccades. Moreover, pupillary diameters were larger when viewing other-race than own-race faces, suggesting a greater cognitive effort when encoding other-race faces.

  13. Comprehensive Oculomotor Behavioral Response Assessment (COBRA)

    NASA Technical Reports Server (NTRS)

    Stone, Leland S. (Inventor); Liston, Dorion B. (Inventor)

    2017-01-01

    An eye movement-based methodology and assessment tool may be used to quantify many aspects of human dynamic visual processing using a relatively simple and short oculomotor task, noninvasive video-based eye tracking, and validated oculometric analysis techniques. By examining the eye movement responses to a task including a radially organized, appropriately randomized sequence of Rashbass-like step-ramp pursuit-tracking trials, distinct performance measurements may be generated that may be associated with, for example, pursuit initiation (e.g., latency and open-loop pursuit acceleration), steady-state tracking (e.g., gain, catch-up saccade amplitude, and the proportion of the steady-state response consisting of smooth movement), direction tuning (e.g., oblique effect amplitude, horizontal-vertical asymmetry, and direction noise), and speed tuning (e.g., speed responsiveness and noise). This quantitative approach may quickly provide results (e.g., a multi-dimensional set of oculometrics and a single scalar impairment index) that can be interpreted by someone without a high degree of scientific sophistication or extensive training.
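
    Two of the oculometrics named above, pursuit latency and steady-state gain, can be illustrated on a single simulated step-ramp trial as below. The velocity threshold and analysis windows are illustrative choices, not the validated values used by the assessment tool.

```python
# Illustrative oculometrics from one step-ramp pursuit trial: latency (first
# threshold crossing of eye velocity) and steady-state gain (eye/target velocity).
import numpy as np

fs = 500.0                                 # sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
target_vel = 15.0                          # ramp speed (deg/s)

# Simulated eye velocity: ~150 ms latency, then tracking at ~90% of target speed.
rng = np.random.default_rng(5)
eye_vel = np.where(t > 0.15, 0.9 * target_vel, 0.0) + rng.normal(0, 0.8, t.size)

latency_idx = np.argmax(eye_vel > 0.2 * target_vel)   # first threshold crossing
latency_ms = 1000 * t[latency_idx]

steady = (t > 0.4) & (t < 0.9)                         # steady-state window
gain = eye_vel[steady].mean() / target_vel

print(f"latency ~ {latency_ms:.0f} ms, steady-state gain ~ {gain:.2f}")
```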

  14. Bilingualism influences inhibitory control in auditory comprehension

    PubMed Central

    Blumenfeld, Henrike K.; Marian, Viorica

    2013-01-01

    Bilinguals have been shown to outperform monolinguals at suppressing task-irrelevant information. The present study aimed to identify how processing linguistic ambiguity during auditory comprehension may be associated with inhibitory control. Monolinguals and bilinguals listened to words in their native language (English) and identified them among four pictures while their eye-movements were tracked. Each target picture (e.g., hamper) appeared together with a similar-sounding within-language competitor picture (e.g., hammer) and two neutral pictures. Following each eye-tracking trial, priming probe trials indexed residual activation of target words, and residual inhibition of competitor words. Eye-tracking showed similar within-language competition across groups; priming showed stronger competitor inhibition in monolinguals than in bilinguals, suggesting differences in how inhibitory control was used to resolve within-language competition. Notably, correlation analyses revealed that inhibition performance on a nonlinguistic Stroop task was related to linguistic competition resolution in bilinguals but not in monolinguals. Together, monolingual-bilingual comparisons suggest that cognitive control mechanisms can be shaped by linguistic experience. PMID:21159332

  15. A resource for assessing information processing in the developing brain using EEG and eye tracking

    PubMed Central

    Langer, Nicolas; Ho, Erica J.; Alexander, Lindsay M.; Xu, Helen Y.; Jozanovic, Renee K.; Henin, Simon; Petroni, Agustin; Cohen, Samantha; Marcelle, Enitan T.; Parra, Lucas C.; Milham, Michael P.; Kelly, Simon P.

    2017-01-01

    We present a dataset combining electrophysiology and eye tracking intended as a resource for the investigation of information processing in the developing brain. The dataset includes high-density task-based and task-free EEG, eye tracking, and cognitive and behavioral data collected from 126 individuals (ages: 6–44). The task battery spans both the simple/complex and passive/active dimensions to cover a range of approaches prevalent in modern cognitive neuroscience. The active task paradigms facilitate principled deconstruction of core components of task performance in the developing brain, whereas the passive paradigms permit the examination of intrinsic functional network activity during varying amounts of external stimulation. Alongside these neurophysiological data, we include an abbreviated cognitive test battery and questionnaire-based measures of psychiatric functioning. We hope that this dataset will lead to the development of novel assays of neural processes fundamental to information processing, which can be used to index healthy brain development as well as detect pathologic processes. PMID:28398357

  16. A resource for assessing information processing in the developing brain using EEG and eye tracking.

    PubMed

    Langer, Nicolas; Ho, Erica J; Alexander, Lindsay M; Xu, Helen Y; Jozanovic, Renee K; Henin, Simon; Petroni, Agustin; Cohen, Samantha; Marcelle, Enitan T; Parra, Lucas C; Milham, Michael P; Kelly, Simon P

    2017-04-11

    We present a dataset combining electrophysiology and eye tracking intended as a resource for the investigation of information processing in the developing brain. The dataset includes high-density task-based and task-free EEG, eye tracking, and cognitive and behavioral data collected from 126 individuals (ages: 6-44). The task battery spans both the simple/complex and passive/active dimensions to cover a range of approaches prevalent in modern cognitive neuroscience. The active task paradigms facilitate principled deconstruction of core components of task performance in the developing brain, whereas the passive paradigms permit the examination of intrinsic functional network activity during varying amounts of external stimulation. Alongside these neurophysiological data, we include an abbreviated cognitive test battery and questionnaire-based measures of psychiatric functioning. We hope that this dataset will lead to the development of novel assays of neural processes fundamental to information processing, which can be used to index healthy brain development as well as detect pathologic processes.

  17. Raman Shifted Nd:YAG Class I Eye-Safe Laser Development 21 January 1986

    NASA Astrophysics Data System (ADS)

    Nichols, R. W.; Ng, W. K.

    1986-07-01

    Hughes Aircraft has been developing a hand-held eye-safe laser rangefinder for the Army utilizing Stimulated Raman Scattering technology. The device uses the 2915 cm⁻¹ vibrational mode of methane (CH4) to wavelength-shift the Nd:YAG pump laser's 1.064 μm output to an eye-safe 1.543 μm. The result is a lightweight BRH Class I eye-safe tactical device. A brief description of Raman wavelength-shifting basics is followed by a description of the Hughes system.

  18. 21 CFR 888.3060 - Spinal intervertebral body fixation orthosis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... each of a series of vertebral bodies. An eye-type screw is inserted in a hole in the center of each of the plates. A braided cable is threaded through each eye-type screw. The cable is tightened with a tension device and it is fastened or crimped at each eye-type screw. The device is used to apply force to...

  19. 21 CFR 888.3060 - Spinal intervertebral body fixation orthosis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... each of a series of vertebral bodies. An eye-type screw is inserted in a hole in the center of each of the plates. A braided cable is threaded through each eye-type screw. The cable is tightened with a tension device and it is fastened or crimped at each eye-type screw. The device is used to apply force to...

  20. 29 CFR Appendix B to Subpart I to... - Non-mandatory Compliance Guidelines for Hazard Assessment and Personal Protective Equipment...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the foot, head, eye and face, and hand hazard situations that exist in an occupational or educational...) sources of high temperatures that could result in burns, eye injury or ignition of protective equipment... devices for eye protection against dust and chemical splash to ensure that the devices are sealed to the...

  1. Object motion computation for the initiation of smooth pursuit eye movements in humans.

    PubMed

    Wallace, Julian M; Stone, Leland S; Masson, Guillaume S

    2005-04-01

    Pursuing an object with smooth eye movements requires an accurate estimate of its two-dimensional (2D) trajectory. This 2D motion computation requires that different local motion measurements are extracted and combined to recover the global object-motion direction and speed. Several combination rules have been proposed such as vector averaging (VA), intersection of constraints (IOC), or 2D feature tracking (2DFT). To examine this computation, we investigated the time course of smooth pursuit eye movements driven by simple objects of different shapes. For type II diamond (where the direction of true object motion is dramatically different from the vector average of the 1-dimensional edge motions, i.e., VA ≠ IOC = 2DFT), the ocular tracking is initiated in the vector average direction. Over a period of less than 300 ms, the eye-tracking direction converges on the true object motion. The reduction of the tracking error starts before the closing of the oculomotor loop. For type I diamonds (where the direction of true object motion is identical to the vector average direction, i.e., VA = IOC = 2DFT), there is no such bias. We quantified this effect by calculating the direction error between responses to types I and II and measuring its maximum value and time constant. At low contrast and high speeds, the initial bias in tracking direction is larger and takes longer to converge onto the actual object-motion direction. This effect is attenuated with the introduction of more 2D information to the extent that it was totally obliterated with a texture-filled type II diamond. These results suggest a flexible 2D computation for motion integration, which combines all available one-dimensional (edge) and 2D (feature) motion information to refine the estimate of object-motion direction over time.
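
    A small numeric illustration (not taken from the paper) of why VA and IOC can disagree for a type II configuration: each edge only reveals the velocity component along its own normal; IOC solves those constraints exactly, whereas averaging the normal-velocity vectors yields a biased direction. The edge orientations and object velocity below are arbitrary examples.

```python
# Vector averaging (VA) versus intersection of constraints (IOC) for two edges.
import numpy as np

# True object motion: rightward at 10 deg/s.
v_true = np.array([10.0, 0.0])

# Edge normals (unit vectors) both tilted to the same side, a type II configuration.
n1 = np.array([np.cos(np.radians(20)), np.sin(np.radians(20))])
n2 = np.array([np.cos(np.radians(70)), np.sin(np.radians(70))])

# 1-D measurements: speed of each edge along its own normal.
s1, s2 = v_true @ n1, v_true @ n2

# Vector average of the two normal-velocity vectors (biased upward here).
va = 0.5 * (s1 * n1 + s2 * n2)

# Intersection of constraints: solve v . n_i = s_i as a 2x2 linear system.
ioc = np.linalg.solve(np.vstack([n1, n2]), np.array([s1, s2]))

print("VA :", np.round(va, 2))    # points up and to the right
print("IOC:", np.round(ioc, 2))   # recovers the true rightward motion
```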

  2. I spy with my little eye: Analysis of airline pilots' gaze patterns in a manual instrument flight scenario.

    PubMed

    Haslbeck, Andreas; Zhang, Bo

    2017-09-01

    The aim of this study was to analyze pilots' visual scanning in a manual approach and landing scenario. Manual flying skills suffer from increasing use of automation. In addition, predominantly long-haul pilots with only a few opportunities to practice these skills experience this decline. Airline pilots representing different levels of practice (short-haul vs. long-haul) had to perform a manual raw data precision approach while their visual scanning was recorded by an eye-tracking device. The analysis of gaze patterns, which are based on predominant saccades, revealed one main group of saccades among long-haul pilots. In contrast, short-haul pilots showed more balanced scanning using two different groups of saccades. Short-haul pilots generally demonstrated better manual flight performance and within this group, one type of scan pattern was found to facilitate the manual landing task more. Long-haul pilots tend to utilize visual scanning behaviors that are inappropriate for the manual ILS landing task. This lack of skills needs to be addressed by providing specific training and more practice. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Time Course of Visual Attention to High-Calorie Virtual Food in Individuals with Bulimic Tendencies.

    PubMed

    Kim, Jiwon; Kim, Kiho; Lee, Jang-Han

    2016-01-01

    The aim of the present study was to use an eye-tracking device to investigate attention bias and its mechanism toward high-calorie virtual food in individuals with bulimic tendencies (BT). A total of 76 participants were divided into two groups: a BT group (n = 38) and a control group (n = 38). The eye movements of all participants were continuously measured while the participants were confronted with pairs of high-calorie, low-calorie, and nonfood virtual stimuli (pictures). It was found that the BT group detected high-calorie food more quickly than they did the low-calorie food and nonfood stimuli, but they also avoided the high-calorie food. These results indicate that individuals with BT automatically allocate their attention toward high-calorie food and, subsequently, try to avoid it. Based on these results, we suggest that this approach-avoidance pattern for high-calorie virtual food could be a factor in the development and maintenance of bulimia symptoms by encouraging individuals with BT to be in conflict with the urge to overeat.

  4. A new vestibulo-ocular reflex recording system designed for routine vestibular clinical use.

    PubMed

    Funabiki, K; Naito, Y; Matsuda, K; Honjo, I

    1999-01-01

    A new vestibulo-ocular reflex (VOR) recording system was developed, which consists of an infrared eye camera, a small velocity sensor and a frequency modulator. Using this system, the head velocity signal was frequency modulated and simultaneously recorded as a sound signal on the audio track of a Hi8 video recorder with eye images. This device enabled recording of the VOR response in routine vestibular clinical practice. The reliability and effectiveness of this system were estimated by recording and analysing the VOR response against manually controlled rotation in normal subjects (n = 22) and in patients with unilateral severe vestibular hypofunction (n = 11). VOR gain on clockwise rotation viewed from the top was defined as R gain, and counterclockwise rotation as L gain. Directional preponderance (DP%) was also calculated. VOR gain towards the diseased side was significantly lower than that towards the intact side, and also significantly lower than that of normal subjects. DP% of unilateral vestibular hypofunction cases was significantly larger than that of normal subjects. These findings indicate that this VOR recording system reliably detects severe unilateral vestibular hypofunction.
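
    The abstract defines R and L gains and reports directional preponderance (DP%) without giving formulas. The sketch below estimates gain as the regression slope of slow-phase eye velocity on head velocity and uses the common convention DP% = 100 × (R - L)/(R + L); both choices are assumptions for illustration and may differ from the authors' exact definitions. Data are simulated.

```python
# Illustrative VOR gain and directional-preponderance computation (simulated data).
import numpy as np

def vor_gain(eye_vel, head_vel):
    """Gain as the regression slope of slow-phase eye velocity on head velocity."""
    return -np.polyfit(head_vel, eye_vel, 1)[0]   # sign flipped: VOR is compensatory

rng = np.random.default_rng(6)
head_cw = np.linspace(10, 80, 50)                   # clockwise rotations (deg/s)
head_ccw = np.linspace(10, 80, 50)                  # counterclockwise rotations
eye_cw = -0.85 * head_cw + rng.normal(0, 2, 50)     # simulated responses
eye_ccw = -0.55 * head_ccw + rng.normal(0, 2, 50)   # weaker toward the diseased side

R = vor_gain(eye_cw, head_cw)
L = vor_gain(eye_ccw, head_ccw)
dp = 100 * (R - L) / (R + L)                        # assumed DP% convention
print(f"R gain ~ {R:.2f}, L gain ~ {L:.2f}, DP ~ {dp:.1f}%")
```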

  5. Simulator scene display evaluation device

    NASA Technical Reports Server (NTRS)

    Haines, R. F. (Inventor)

    1986-01-01

    An apparatus for aligning and calibrating scene displays in an aircraft simulator has a base on which all of the instruments for the aligning and calibrating are mounted. A laser directs its beam at a double right prism, which is attached to a pivoting support on the base. The pivot point of the prism is located at the design eye point (DEP) of the simulator during the aligning and calibrating. The objective lens in the base is movable on a track to follow the laser beam at different angles within the field of vision at the DEP. An eyepiece and a precision diopter are movable into a position behind the prism during the scene evaluation. A photometer or illuminometer is pivotable about the pivot into and out of position behind the eyepiece.

  6. Understanding Student Cognition about Complex Earth System Processes Related to Climate Change

    NASA Astrophysics Data System (ADS)

    McNeal, K. S.; Libarkin, J.; Ledley, T. S.; Dutta, S.; Templeton, M. C.; Geroux, J.; Blakeney, G. A.

    2011-12-01

    The Earth's climate system includes complex behavior and interconnections with other Earth spheres that present challenges to student learning. To better understand these unique challenges, we have conducted experiments with high-school and introductory level college students to determine how information pertaining to the connections between the Earth's atmospheric system and the other Earth spheres (e.g., hydrosphere and cryosphere) are processed. Specifically, we include psychomotor tests (e.g., eye-tracking) and open-ended questionnaires in this research study, where participants were provided scientific images of the Earth (e.g., global precipitation and ocean and atmospheric currents), eye-tracked, and asked to provide causal or relational explanations about the viewed images. In addition, the students engaged in on-line modules (http://serc.carleton.edu/eslabs/climate/index.html) focused on Earth system science as training activities to address potential cognitive barriers. The developed modules included interactive media, hands-on lessons, links to outside resources, and formative assessment questions to promote a supportive and data-rich learning environment. Student eye movements were tracked during engagement with the materials to determine the role of perception and attention on understanding. Students also completed a conceptual questionnaire pre-post to determine if these on-line curriculum materials assisted in their development of connections between Earth's atmospheric system and the other Earth systems. The pre-post results of students' thinking about climate change concepts, as well as eye-tracking results, will be presented.

  7. Measuring social attention and motivation in autism spectrum disorder using eye-tracking: Stimulus type matters.

    PubMed

    Chevallier, Coralie; Parish-Morris, Julia; McVey, Alana; Rump, Keiran M; Sasson, Noah J; Herrington, John D; Schultz, Robert T

    2015-10-01

    Autism Spectrum Disorder (ASD) is characterized by social impairments that have been related to deficits in social attention, including diminished gaze to faces. Eye-tracking studies are commonly used to examine social attention and social motivation in ASD, but they vary in sensitivity. In this study, we hypothesized that the ecological nature of the social stimuli would affect participants' social attention, with gaze behavior during more naturalistic scenes being most predictive of ASD vs. typical development. Eighty-one children with and without ASD participated in three eye-tracking tasks that differed in the ecological relevance of the social stimuli. In the "Static Visual Exploration" task, static images of objects and people were presented; in the "Dynamic Visual Exploration" task, video clips of individual faces and objects were presented side-by-side; in the "Interactive Visual Exploration" task, video clips of children playing with objects in a naturalistic context were presented. Our analyses uncovered a three-way interaction between Task, Social vs. Object Stimuli, and Diagnosis. This interaction was driven by group differences on one task only: the Interactive task. Bayesian analyses confirmed that the other two tasks were insensitive to group membership. In addition, receiver operating characteristic analyses demonstrated that, unlike the other two tasks, the Interactive task had significant classification power. The ecological relevance of social stimuli is an important factor to consider for eye-tracking studies aiming to measure social attention and motivation in ASD. © 2015 International Society for Autism Research, Wiley Periodicals, Inc.
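
    The classification-power analysis mentioned above can be sketched with a receiver operating characteristic (ROC) computation on a gaze measure. The "proportion of gaze to social stimuli" values below are simulated for illustration, not the study's data, and the specific measure is an assumption.

```python
# ROC sketch: how well a gaze measure separates diagnostic groups (simulated data).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
# 1 = ASD, 0 = typically developing; lower social attention simulated for ASD.
diagnosis = np.array([1] * 40 + [0] * 41)
social_prop = np.concatenate([rng.normal(0.45, 0.10, 40),
                              rng.normal(0.60, 0.10, 41)])

# AUC of 0.5 means no classification power; values near 1 indicate strong power.
auc = roc_auc_score(diagnosis, -social_prop)   # negate: lower proportion predicts ASD
print(f"AUC ~ {auc:.2f}")
```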

  8. How young adults with autism spectrum disorder watch and interpret pragmatically complex scenes.

    PubMed

    Lönnqvist, Linda; Loukusa, Soile; Hurtig, Tuula; Mäkinen, Leena; Siipo, Antti; Väyrynen, Eero; Palo, Pertti; Laukka, Seppo; Mämmelä, Laura; Mattila, Marja-Leena; Ebeling, Hanna

    2017-11-01

    The aim of the current study was to investigate subtle characteristics of social perception and interpretation in high-functioning individuals with autism spectrum disorders (ASDs), and to study the relation between watching and interpreting. As a novelty, we used an approach that combined moment-by-moment eye tracking and verbal assessment. Sixteen young adults with ASD and 16 neurotypical control participants watched a video depicting a complex communication situation while their eye movements were tracked. The participants also completed a verbal task with questions related to the pragmatic content of the video. We compared verbal task scores and eye movements between groups, and assessed correlations between task performance and eye movements. Individuals with ASD had more difficulty than the controls in interpreting the video, and during two short moments there were significant group differences in eye movements. Additionally, we found significant correlations between verbal task scores and moment-level eye movement in the ASD group, but not among the controls. We concluded that participants with ASD had slight difficulties in understanding the pragmatic content of the video stimulus and attending to social cues, and that the connection between pragmatic understanding and eye movements was more pronounced for participants with ASD than for neurotypical participants.
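
    To make the combination of verbal assessment and moment-level eye movements concrete, here is a minimal sketch of one way to correlate verbal task scores with gaze measures moment by moment. The table layout, sample sizes, and measures are assumptions for illustration, not the authors' data or pipeline.

      # Sketch: correlating pragmatic-task scores with moment-level gaze measures.
      # Assumes, per participant, a verbal score and the proportion of time spent
      # fixating a socially relevant region during each short analysis moment.
      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(1)
      n_participants, n_moments = 16, 10

      verbal_scores = rng.integers(10, 30, size=n_participants)            # toy data
      gaze_proportions = rng.uniform(0, 1, size=(n_participants, n_moments))

      # Spearman correlation between verbal scores and gaze, moment by moment.
      for moment in range(n_moments):
          rho, p = spearmanr(verbal_scores, gaze_proportions[:, moment])
          print(f"moment {moment:2d}: rho = {rho:+.2f}, p = {p:.3f}")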

  9. Design of a Gaze-Sensitive Virtual Social Interactive System for Children With Autism

    PubMed Central

    Lahiri, Uttama; Warren, Zachary; Sarkar, Nilanjan

    2013-01-01

    Impairments in social communication skills are thought to be core deficits in children with autism spectrum disorder (ASD). In recent years, several assistive technologies, particularly Virtual Reality (VR), have been investigated to promote social interactions in this population. It is well known that children with ASD demonstrate atypical viewing patterns during social interactions and thus monitoring eye-gaze can be valuable for designing intervention strategies. While several studies have used eye-tracking technology to monitor eye-gaze for offline analysis, there exists no real-time system that can monitor eye-gaze dynamically and provide individualized feedback. Given the promise of VR-based social interaction and the usefulness of monitoring eye-gaze in real time, a novel VR-based dynamic eye-tracking system is developed in this work. This system, called Virtual Interactive system with Gaze-sensitive Adaptive Response Technology (VIGART), is capable of delivering individualized feedback based on a child’s dynamic gaze patterns during VR-based interaction. Results are presented from a usability study with six adolescents with ASD that examines the acceptability and usefulness of VIGART. The results, in terms of improvement in behavioral viewing and changes in relevant eye physiological indexes of participants while interacting with VIGART, indicate the potential of this novel technology. PMID:21609889
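
    The paper does not publish VIGART's code; the sketch below only illustrates the general idea of gaze-contingent feedback described here: sample gaze in real time, estimate looking toward a face region over a sliding window, and trigger feedback when that fraction drops below a threshold. The region, thresholds, and the `read_gaze_sample` and `deliver_feedback` callables are hypothetical stand-ins for an eye-tracker SDK and a virtual-environment API.

      # Sketch of a gaze-contingent feedback loop (not the authors' implementation).
      # read_gaze_sample() and deliver_feedback() are hypothetical placeholders for
      # the eye-tracker SDK and the virtual-environment API.
      import time
      from collections import deque

      FACE_REGION = (0.35, 0.25, 0.65, 0.60)   # assumed normalized screen rectangle
      WINDOW_SECONDS = 5.0
      SAMPLE_HZ = 60
      THRESHOLD = 0.30                         # minimum on-face fraction before prompting

      def in_region(x, y, region):
          left, top, right, bottom = region
          return left <= x <= right and top <= y <= bottom

      def gaze_feedback_loop(read_gaze_sample, deliver_feedback):
          window = deque(maxlen=int(WINDOW_SECONDS * SAMPLE_HZ))
          while True:
              x, y = read_gaze_sample()                    # normalized gaze coordinates
              window.append(in_region(x, y, FACE_REGION))
              if len(window) == window.maxlen:
                  on_face_fraction = sum(window) / len(window)
                  if on_face_fraction < THRESHOLD:
                      deliver_feedback("Try looking at the person you are talking to.")
                      window.clear()                       # avoid repeated prompts
              time.sleep(1.0 / SAMPLE_HZ)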

  10. Design of a gaze-sensitive virtual social interactive system for children with autism.

    PubMed

    Lahiri, Uttama; Warren, Zachary; Sarkar, Nilanjan

    2011-08-01

    Impairments in social communication skills are thought to be core deficits in children with autism spectrum disorder (ASD). In recent years, several assistive technologies, particularly Virtual Reality (VR), have been investigated to promote social interactions in this population. It is well known that children with ASD demonstrate atypical viewing patterns during social interactions and thus monitoring eye-gaze can be valuable for designing intervention strategies. While several studies have used eye-tracking technology to monitor eye-gaze for offline analysis, there exists no real-time system that can monitor eye-gaze dynamically and provide individualized feedback. Given the promise of VR-based social interaction and the usefulness of monitoring eye-gaze in real time, a novel VR-based dynamic eye-tracking system is developed in this work. This system, called Virtual Interactive system with Gaze-sensitive Adaptive Response Technology (VIGART), is capable of delivering individualized feedback based on a child's dynamic gaze patterns during VR-based interaction. Results are presented from a usability study with six adolescents with ASD that examines the acceptability and usefulness of VIGART. The results, in terms of improvement in behavioral viewing and changes in relevant eye physiological indexes of participants while interacting with VIGART, indicate the potential of this novel technology. © 2011 IEEE

  11. Video-Based Eye Tracking to Detect the Attention Shift: A Computer Classroom Context-Aware System

    ERIC Educational Resources Information Center

    Kuo, Yung-Lung; Lee, Jiann-Shu; Hsieh, Min-Chai

    2014-01-01

    Eye and head movements are evoked in response to obvious visual attention shifts. However, there has been little progress so far in understanding the causes of absent-mindedness. This paper proposes an attention-awareness system that captures how eye gaze and head pose interact under various forms of attentional switching in a computer classroom.…
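
    The full method is not described in this excerpt; as a rough sketch of how eye gaze and head pose might be combined to flag an attention shift, one could apply a simple rule like the following. The sample format, thresholds, and minimum run length are assumptions, not the system described in the paper.

      # Sketch: flagging a possible attention shift from gaze and head pose.
      # Thresholds and the sample format are illustrative assumptions only.
      from dataclasses import dataclass

      @dataclass
      class Sample:
          gaze_on_screen: bool   # whether the gaze point falls on the monitor
          head_yaw_deg: float    # head rotation away from the screen, in degrees

      YAW_LIMIT = 25.0
      MIN_CONSECUTIVE = 30       # e.g., about 1 second of samples at 30 Hz

      def attention_shifts(samples):
          """Yield indices where a sustained off-screen gaze plus head turn begins."""
          run = 0
          for i, s in enumerate(samples):
              off_task = (not s.gaze_on_screen) and abs(s.head_yaw_deg) > YAW_LIMIT
              run = run + 1 if off_task else 0
              if run == MIN_CONSECUTIVE:
                  yield i - MIN_CONSECUTIVE + 1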

  12. Visual Data Mining: An Exploratory Approach to Analyzing Temporal Patterns of Eye Movements

    ERIC Educational Resources Information Center

    Yu, Chen; Yurovsky, Daniel; Xu, Tian

    2012-01-01

    Infant eye movements are an important behavioral resource to understand early human development and learning. But the complexity and amount of gaze data recorded from state-of-the-art eye-tracking systems also pose a challenge: how does one make sense of such dense data? Toward this goal, this article describes an interactive approach based on…

  13. A Statistical Physics Perspective to Understand Social Visual Attention in Autism Spectrum Disorder.

    PubMed

    Liberati, Alessio; Fadda, Roberta; Doneddu, Giuseppe; Congiu, Sara; Javarone, Marco A; Striano, Tricia; Chessa, Alessandro

    2017-08-01

    This study investigated social visual attention in children with Autism Spectrum Disorder (ASD) and with typical development (TD) in the light of Brockmann and Geisel's model of visual attention. The probability distribution of gaze movements and the clustering of gaze points, registered with eye-tracking technology, were studied during a free visual exploration of a gaze stimulus. A data-driven analysis of the distribution of eye movements, together with a computational model to simulate group differences, was chosen to avoid methodological problems related to the experimenters' subjective expectations about the informative content of the image. Analysis of the eye-tracking data indicated that the scanpaths of children with TD and ASD were characterized by eye movements geometrically equivalent to Lévy flights. Children with ASD showed a higher frequency of long saccadic amplitudes than controls. A clustering analysis revealed a greater dispersion of eye movements for these children. Modeling of the results indicated higher values of the model parameter modulating the dispersion of eye movements for children with ASD. Together, the experimental results and the model point to a greater dispersion of gaze points in ASD.
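
    As an informal illustration of the kind of analysis described, the sketch below estimates a power-law (Lévy-like) exponent for saccade amplitudes with a maximum-likelihood (Hill) estimator and measures the spatial dispersion of gaze points with a clustering pass. The simulated data, the choice of x_min, and the DBSCAN parameters are assumptions; this is not the authors' model or code.

      # Sketch: heavy-tailed saccade amplitudes and dispersion of gaze points.
      # Simulated data; x_min and DBSCAN parameters are illustrative assumptions.
      import numpy as np
      from sklearn.cluster import DBSCAN

      rng = np.random.default_rng(2)

      # Saccade amplitudes (degrees of visual angle), simulated with a Pareto tail.
      amplitudes = (rng.pareto(1.8, size=500) + 1.0) * 0.5
      x_min = 1.0
      tail = amplitudes[amplitudes >= x_min]

      # Maximum-likelihood (Hill) estimate of the power-law exponent above x_min.
      alpha_hat = 1.0 + len(tail) / np.sum(np.log(tail / x_min))
      print(f"estimated exponent: {alpha_hat:.2f}")

      # Spatial clustering of gaze points (pixels); fewer, tighter clusters would
      # indicate less dispersed visual exploration.
      gaze_xy = rng.normal(loc=[512, 384], scale=120, size=(300, 2))
      labels = DBSCAN(eps=40, min_samples=5).fit_predict(gaze_xy)
      n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
      print(f"gaze clusters found: {n_clusters}")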

  14. Is eye to eye contact really threatening and avoided in social anxiety?--An eye-tracking and psychophysiology study.

    PubMed

    Wieser, Matthias J; Pauli, Paul; Alpers, Georg W; Mühlberger, Andreas

    2009-01-01

    The effects of direct and averted gaze on autonomic arousal and gaze behavior in social anxiety were investigated using a new paradigm including animated movie stimuli and eye-tracking methodology. While high, medium, and low socially anxious (HSA vs. MSA vs. LSA) women watched animated movie clips, in which faces responded to the gaze of the participants with either direct or averted gaze, their eye movements, heart rate (HR), and skin conductance responses (SCR) were continuously recorded. Groups did not differ in their gaze behavior concerning direct vs. averted gaze, but high socially anxious women tended to fixate the eye region of the presented face longer than MSA and LSA women did. Furthermore, they responded to direct gaze with more pronounced cardiac acceleration. This physiological finding indicates that direct gaze may be a fear-relevant feature for socially anxious individuals in social interaction. However, this does not seem to result in gaze avoidance. Future studies should examine the role of gaze direction and its interaction with facial expressions in social anxiety, and its consequences for avoidance behavior and fear responses. Additionally, further research is needed to clarify the role of gaze perception in social anxiety.

  15. Storyline Visualizations of Eye Tracking of Movie Viewing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balint, John T.; Arendt, Dustin L.; Blaha, Leslie M.

    Storyline visualizations offer an approach that promises to capture the spatio-temporal characteristics of individual observers and simultaneously illustrate emerging group behaviors. We develop a visual analytics approach to parsing, aligning, and clustering fixation sequences from eye tracking data. Visualization of the results captures the similarities and differences across a group of observers performing a common task. We apply our storyline approach to visualize gaze patterns of people watching dynamic movie clips. Storylines mitigate some of the shortcomings of existing spatio-temporal visualization techniques and, importantly, continue to highlight individual observer behavioral dynamics.
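
    In the spirit of the parsing, aligning, and clustering pipeline described, here is a small sketch that groups observers by the similarity of their fixation sequences: each sequence is reduced to a string of area-of-interest labels, pairwise edit distances are computed, and the observers are clustered hierarchically. The distance measure, labels, and toy sequences are assumptions; this is not the authors' storyline algorithm.

      # Sketch: clustering observers by similarity of their fixation sequences.
      # AOI labels and example sequences are illustrative assumptions.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import squareform

      def edit_distance(a, b):
          """Classic Levenshtein distance between two label sequences."""
          dp = np.arange(len(b) + 1)
          for i, ca in enumerate(a, start=1):
              prev, dp[0] = dp[0], i
              for j, cb in enumerate(b, start=1):
                  prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1,
                                           prev + (ca != cb))
          return dp[-1]

      observers = {
          "obs1": "FFFBBOOF",   # F = face, B = background, O = object (toy data)
          "obs2": "FFBBBOOF",
          "obs3": "OOOBBFFO",
          "obs4": "OOBBBFFO",
      }

      names = list(observers)
      n = len(names)
      dist = np.zeros((n, n))
      for i in range(n):
          for j in range(i + 1, n):
              d = edit_distance(observers[names[i]], observers[names[j]])
              dist[i, j] = dist[j, i] = d

      # Hierarchical clustering over the condensed distance matrix.
      tree = linkage(squareform(dist), method="average")
      groups = fcluster(tree, t=2, criterion="maxclust")
      print(dict(zip(names, groups)))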

  16. A minimally invasive approach to long-term head fixation in behaving nonhuman primates

    PubMed Central

    Davis, T.S.; Torab, K.; House, P.; Greger, B.

    2009-01-01

    We have designed a device for long-term head fixation for use in behaving nonhuman primates that is robust yet minimally invasive and simple to use. This device is a modified version of the halo system that is used in humans for cervical traction and stabilization after spinal column injuries. This device consists of an aluminum halo with four titanium skull pins offset from the halo by aluminum posts. The titanium pins insert onto small segments of cranially reinforcing titanium plate, which are attached to the skull with titanium cortex screws. The surgery involves four scalp incisions, placement of the reinforcing plates, insertion of the pins for attachment of the halo, and incision closure. After the halo is attached, the animal’s head can be fixed to a primate chair using a custom-built attachment arm that provides three degrees of adjustability for proper positioning during behavioral tasks. We have installed this device on two Macaque monkeys weighing seven and ten kilograms. The halos have been in place on these animals for up to eight months without signs of discomfort or loss of fixation. Using this method of head fixation, we have been able to track the animals’ eye positions with an accuracy of less than two visual degrees while they perform behavioral tasks. PMID:19394360

  17. Object tracking on mobile devices using binary descriptors

    NASA Astrophysics Data System (ADS)

    Savakis, Andreas; Quraishi, Mohammad Faiz; Minnehan, Breton

    2015-03-01

    With the growing ubiquity of mobile devices, advanced applications are relying on computer vision techniques to provide novel experiences for users. Currently, few tracking approaches take into consideration the resource constraints on mobile devices. Designing efficient tracking algorithms and optimizing performance for mobile devices can result in better and more efficient tracking for applications, such as augmented reality. In this paper, we use binary descriptors, including Fast Retina Keypoint (FREAK), Oriented FAST and Rotated BRIEF (ORB), Binary Robust Independent Elementary Features (BRIEF), and Binary Robust Invariant Scalable Keypoints (BRISK), to obtain real-time tracking performance on mobile devices. We consider both Google's Android and Apple's iOS operating systems to implement our tracking approach. The Android implementation is done using Android's Native Development Kit (NDK), which gives the performance benefits of using native code as well as access to legacy libraries. The iOS implementation was created using both the native Objective-C and the C++ programming languages. We also introduce simplified versions of the BRIEF and BRISK descriptors that improve processing speed without compromising tracking accuracy.
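
    For readers unfamiliar with binary descriptors, the sketch below shows frame-to-frame matching with one of the descriptors named above (ORB) using brute-force Hamming-distance matching in OpenCV's Python bindings. It only illustrates the general approach; it is not the paper's Android NDK or Objective-C/C++ implementation, and the parameter values are assumptions.

      # Sketch: frame-to-frame matching with a binary descriptor (ORB) in OpenCV.
      import cv2

      orb = cv2.ORB_create(nfeatures=500)
      # Binary descriptors are compared with Hamming distance.
      matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

      def match_frames(prev_gray, curr_gray, max_matches=50):
          """Return the strongest ORB keypoint matches between two grayscale frames."""
          kp1, des1 = orb.detectAndCompute(prev_gray, None)
          kp2, des2 = orb.detectAndCompute(curr_gray, None)
          if des1 is None or des2 is None:
              return []
          matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
          return matches[:max_matches]

      # Example usage with two video frames f0 and f1 obtained elsewhere:
      # matches = match_frames(cv2.cvtColor(f0, cv2.COLOR_BGR2GRAY),
      #                        cv2.cvtColor(f1, cv2.COLOR_BGR2GRAY))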

  18. Looking to the eyes influences the processing of emotion on face-sensitive event-related potentials in 7-month-old infants.

    PubMed

    Vanderwert, Ross E; Westerlund, Alissa; Montoya, Lina; McCormick, Sarah A; Miguel, Helga O; Nelson, Charles A

    2015-10-01

    Previous studies in infants have shown that face-sensitive components of the ongoing electroencephalogram (the event-related potential, or ERP) are larger in amplitude to negative emotions (e.g., fear, anger) than to positive emotions (e.g., happy). However, it is still unclear whether it is negative emotion linked with a face, or negative emotion alone, that contributes to these amplitude differences. We simultaneously recorded infant looking behaviors (via eye-tracking) and face-sensitive ERPs while 7-month-old infants viewed human faces or animals displaying happy, fearful, or angry expressions. We observed that the amplitude of the N290 was greater (i.e., more negative) to angry animals compared to happy or fearful animals; no such differences were obtained for human faces. Eye-tracking data highlighted the importance of the eye region in processing emotional human faces. Infants who spent more time looking to the eye region of human faces showing fearful or angry expressions had greater N290 or P400 amplitudes, respectively. © 2014 Wiley Periodicals, Inc.

  19. The specificity of attentional biases by type of gambling: An eye-tracking study

    PubMed Central

    Meitner, Amadeus; Sears, Christopher R.

    2018-01-01

    A growing body of research indicates that gamblers develop an attentional bias for gambling-related stimuli. Compared to research on substance use, however, few studies have examined attentional biases in gamblers using eye-gaze tracking, which has many advantages over other measures of attention. In addition, previous studies of attentional biases in gamblers have not directly matched type of gambler with personally-relevant gambling cues. The present study investigated the specificity of attentional biases for individual types of gambling using an eye-gaze tracking paradigm. Three groups of participants (poker players, video lottery terminal/slot machine players, and non-gambling controls) took part in one test session in which they viewed 25 sets of four images (poker, VLTs/slot machines, bingo, and board games). Participants' eye fixations were recorded throughout each 8-second presentation of the four images. The results indicated that, as predicted, the two gambling groups preferentially attended to their primary form of gambling, whereas control participants attended to board games more than gambling images. The findings have clinical implications for the treatment of individuals with gambling disorder. Understanding the importance of personally-salient gambling cues will inform the development of effective attentional bias modification treatments for problem gamblers. PMID:29385164
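
    The attentional-bias measure in paradigms like this one typically reduces to how long each participant dwells on each image category within a trial. The sketch below computes per-participant dwell-time proportions by category from a fixation table; the column names and toy rows are assumptions rather than the study's fixation report format.

      # Sketch: dwell-time attentional-bias scores by image category.
      # Column names and toy rows are assumptions; real data would come from the
      # eye tracker's fixation report for each 8-second, four-image trial.
      import pandas as pd

      fixations = pd.DataFrame({
          "participant": ["p1", "p1", "p1", "p2", "p2", "p2"],
          "category":    ["poker", "board", "vlt", "poker", "bingo", "board"],
          "duration_ms": [1200, 400, 300, 250, 500, 900],
      })

      # Proportion of total dwell time each participant spent on each category.
      dwell = (fixations.groupby(["participant", "category"])["duration_ms"].sum()
               .groupby(level="participant")
               .transform(lambda s: s / s.sum())
               .rename("dwell_proportion")
               .reset_index())
      print(dwell)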

  20. The Potential Utility of Eye Movements in the Detection and Characterization of Everyday Functional Difficulties in Mild Cognitive Impairment.

    PubMed

    Seligman, Sarah C; Giovannetti, Tania

    2015-06-01

    Mild cognitive impairment (MCI) refers to the intermediate period between the typical cognitive decline of normal aging and the more severe decline associated with dementia, and it is associated with greater risk for progression to dementia. Research has suggested that functional abilities are compromised in MCI, but the degree of impairment and the underlying mechanisms remain poorly understood. The development of sensitive measures to assess subtle functional decline poses a major challenge for characterizing functional limitations in MCI. Eye-tracking methodology has been used to describe visual processes during everyday, naturalistic action among healthy older adults, as well as in several case studies of severely impaired individuals, and it has successfully differentiated healthy older adults from those with MCI on specific visual tasks. These studies highlight the promise of eye-tracking technology as a method to characterize subtle functional decline in MCI. However, to date no studies have examined visual behaviors during completion of naturalistic tasks in MCI. This review describes the current understanding of functional ability in MCI, summarizes findings of eye-tracking studies in healthy individuals, in severe impairment, and in MCI, and presents future research directions to aid early identification and prevention of functional decline in disorders of aging.
