Yang, Yushi
2015-01-01
Background: Eye-tracking technology has been used to measure human cognitive processes and has the potential to improve the usability of health information technology (HIT). However, it is still unclear how the eye-tracking method can be integrated with other traditional usability methodologies to achieve its full potential. Objective: The objective of this study was to report on HIT evaluation studies that have used eye-tracker technology, and to envision the potential use of eye-tracking technology in future research. Methods: We used four reference databases to initially identify 5248 related papers, which resulted in only 9 articles that met our inclusion criteria. Results: Eye-tracking technology was useful in finding usability problems in many ways, but is still in its infancy for HIT usability evaluation. Limited types of HITs have been evaluated by eye trackers, and there has been a lack of evaluation research in natural settings. Conclusions: More research should be done in natural settings to discover the real contextual-based usability problems of clinical and mobile HITs using eye-tracking technology with more standardized methodologies and guidance. PMID:27026079
Emerging applications of eye-tracking technology in dermatology.
John, Kevin K; Jensen, Jakob D; King, Andy J; Pokharel, Manusheela; Grossman, Douglas
2018-04-06
Eye-tracking technology has been used within a multitude of disciplines to provide data linking eye movements to visual processing of various stimuli (i.e., x-rays, situational positioning, printed information, and warnings). Despite the benefits provided by eye-tracking in allowing for the identification and quantification of visual attention, the discipline of dermatology has yet to see broad application of the technology. Notwithstanding dermatologists' heavy reliance upon visual patterns and cues to discriminate between benign and atypical nevi, literature that applies eye-tracking to the study of dermatology is sparse; and literature specific to patient-initiated behaviors, such as skin self-examination (SSE), is largely non-existent. The current article provides a review of eye-tracking research in various medical fields, culminating in a discussion of current applications and advantages of eye-tracking for dermatology research.
ERIC Educational Resources Information Center
Magnussen, Rikke; Zachariassen, Maria; Kharlamov, Nikita; Larsen, Birger
2017-01-01
This paper presents a methodological discussion of the potential and challenges of involving mobile eye tracking technology in studies of knowledge generation and learning in a science centre context. The methodological exploration is based on eye-tracking studies of audience interaction and knowledge generation in the technology-enhanced health…
Eye-Tracking in the Study of Visual Expertise: Methodology and Approaches in Medicine
ERIC Educational Resources Information Center
Fox, Sharon E.; Faulkner-Jones, Beverly E.
2017-01-01
Eye-tracking is the measurement of eye motions and point of gaze of a viewer. Advances in this technology have been essential to our understanding of many forms of visual learning, including the development of visual expertise. In recent years, these studies have been extended to the medical professions, where eye-tracking technology has helped us…
Measuring Human Performance in Simulated Nuclear Power Plant Control Rooms Using Eye Tracking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kovesdi, Casey Robert; Rice, Brandon Charles; Bower, Gordon Ross
Control room modernization will be an important part of life extension for the existing light water reactor fleet. As part of modernization efforts, personnel will need to gain a full understanding of how control room technologies affect performance of human operators. Recent advances in technology enable the use of eye tracking technology to continuously measure an operator’s eye movement, which correlates with a variety of human performance constructs such as situation awareness and workload. This report describes eye tracking metrics in the context of how they will be used in nuclear power plant control room simulator studies.
Using eye-tracking technology for communication in Rett syndrome: perceptions of impact.
Vessoyan, Kelli; Steckle, Gill; Easton, Barb; Nichols, Megan; Mok Siu, Victoria; McDougall, Janette
2018-04-27
Studies have investigated the use of eye-tracking technology to assess cognition in individuals with Rett syndrome, but few have looked at this access method for communication for this group. Loss of speech, decreased hand use, and severe motor apraxia significantly impact functional communication for this population. Eye gaze is one modality that may be used successfully by individuals with Rett syndrome. This multiple case study explored whether using eye-tracking technology, with ongoing support from a team of augmentative and alternative communication (AAC) therapists, could help four participants with Rett syndrome meet individualized communication goals. Two secondary objectives were to examine parents' perspectives on (a) the psychosocial impact of their child's use of the technology, and (b) satisfaction with using the technology. All four participants were rated by the treating therapists to have made improvement on their goals. According to both quantitative findings and descriptive information, eye-tracking technology was viewed by parents as contributing to participants' improved psychosocial functioning. Parents reported being highly satisfied with both the device and the clinical services received. This study provides initial evidence that eye-tracking may be perceived as a worthwhile and potentially satisfactory technology to support individuals with Rett syndrome in communicating. Future, more rigorous research that addresses the limitations of a case study design is required to substantiate study findings.
Eye Tracking: A Brief Guide for Developmental Researchers
ERIC Educational Resources Information Center
Feng, Gary
2011-01-01
Eye tracking offers a powerful research tool for developmental scientists. In this brief article, the author introduces the methodology and issues associated with its applications in developmental research, beginning with an overview of eye movements and eye-tracking technologies, followed by examples of how it is used to study the developing mind…
Using Eye-Tracking in Applied Linguistics and Second Language Research
ERIC Educational Resources Information Center
Conklin, Kathy; Pellicer-Sánchez, Ana
2016-01-01
With eye-tracking technology the eye is thought to give researchers a window into the mind. Importantly, eye-tracking has significant advantages over traditional online processing measures: chiefly that it allows for more "natural" processing as it does not require a secondary task, and that it provides a very rich moment-to-moment data…
Can eye-tracking technology improve situational awareness in paramedic clinical education?
Williams, Brett; Quested, Andrew; Cooper, Simon
2013-01-01
Human factors play a significant part in clinical error. Situational awareness (SA) means being aware of one's surroundings, comprehending the present situation, and being able to predict outcomes. It is a key human skill that, when properly applied, is associated with reducing medical error: eye-tracking technology can be used to provide an objective and qualitative measure of the initial perception component of SA. Feedback from eye-tracking technology can be used to improve the understanding and teaching of SA in clinical contexts, and consequently, has potential for reducing clinician error and the concomitant adverse events.
Use of Cognitive and Metacognitive Strategies in Online Search: An Eye-Tracking Study
ERIC Educational Resources Information Center
Zhou, Mingming; Ren, Jing
2016-01-01
This study used eye-tracking technology to track students' eye movements while searching information on the web. The research question guiding this study was "Do students with different search performance levels have different visual attention distributions while searching information online? If yes, what are the patterns for high and low…
King, Roderick; Hanhan, Jaber; Harrison, T Kyle; Kou, Alex; Howard, Steven K; Borg, Lindsay K; Shum, Cynthia; Udani, Ankeet D; Mariano, Edward R
2018-05-15
Malignant hyperthermia is a rare but potentially fatal complication of anesthesia, and several different cognitive aids designed to facilitate a timely and accurate response to this crisis currently exist. Eye tracking technology can measure voluntary and involuntary eye movements, gaze fixation within an area of interest, and speed of visual response and has been used to a limited extent in anesthesiology. With eye tracking technology, we compared the accessibility of five malignant hyperthermia cognitive aids by collecting gaze data from twelve volunteer participants. Recordings were reviewed and annotated to measure the time required for participants to locate objects on the cognitive aid to provide an answer; cumulative time to answer was the primary outcome. For the primary outcome, there were differences detected between cumulative time to answer survival curves (P < 0.001). Participants demonstrated the shortest cumulative time to answer when viewing the Society for Pediatric Anesthesia (SPA) cognitive aid compared to four other publicly available cognitive aids for malignant hyperthermia, and this outcome was not influenced by the anesthesiologists' years of experience. This is the first study to utilize eye tracking technology in a comparative evaluation of cognitive aid design, and our experience suggests that there may be additional applications of eye tracking technology in healthcare and medical education. Potentially advantageous design features of the SPA cognitive aid include a single page, linear layout, and simple typescript with minimal use of single color blocking.
Eye Tracking and Head Movement Detection: A State-of-Art Survey
2013-01-01
Eye-gaze detection and tracking have been an active research field in the past years as it adds convenience to a variety of applications. It is considered a significant untraditional method of human computer interaction. Head movement detection has also received researchers' attention and interest as it has been found to be a simple and effective interaction method. Both technologies are considered the easiest alternative interface methods. They serve a wide range of severely disabled people who are left with minimal motor abilities. For both eye tracking and head movement detection, several different approaches have been proposed and used to implement different algorithms for these technologies. Despite the amount of research done on both technologies, researchers are still trying to find robust methods to use effectively in various applications. This paper presents a state-of-art survey for eye tracking and head movement detection methods proposed in the literature. Examples of different fields of applications for both technologies, such as human-computer interaction, driving assistance systems, and assistive technologies are also investigated. PMID:27170851
The Right Track for Vision Correction
NASA Technical Reports Server (NTRS)
2003-01-01
More and more people are putting away their eyeglasses and contact lenses as a result of laser vision correction surgery. LASIK, the most widely performed version of this surgical procedure, improves vision by reshaping the cornea, the clear front surface of the eye, using an excimer laser. One excimer laser system, Alcon's LADARVision 4000, utilizes a laser radar (LADAR) eye tracking device that gives it unmatched precision. During LASIK surgery, laser pulses must be accurately placed to reshape the cornea. A challenge to this procedure is the patient's constant eye movement. A person's eyes make small, involuntary movements known as saccadic movements about 100 times per second. Since the saccadic movements will not stop during LASIK surgery, most excimer laser systems use an eye tracking device that measures the movements and guides the placement of the laser beam. LADARVision's eye tracking device stems from the LADAR technology originally developed through several Small Business Innovation Research (SBIR) contracts with NASA's Johnson Space Center and the U.S. Department of Defense's Ballistic Missile Defense Office (BMDO). In the 1980s, Johnson awarded Autonomous Technologies Corporation a Phase I SBIR contract to develop technology for autonomous rendezvous and docking of space vehicles to service satellites. During Phase II of the Johnson SBIR contract, Autonomous Technologies developed a prototype range and velocity imaging LADAR to demonstrate technology that could be used for this purpose.
How Young Children View Mathematical Representations: A Study Using Eye-Tracking Technology
ERIC Educational Resources Information Center
Bolden, David; Barmby, Patrick; Raine, Stephanie; Gardner, Matthew
2015-01-01
Background: It has been shown that mathematical representations can aid children's understanding of mathematical concepts but that children can sometimes have difficulty in interpreting them correctly. New advances in eye-tracking technology can help in this respect because it allows data to be gathered concerning children's focus of attention and…
Katz, Trixie A; Weinberg, Danielle D; Fishman, Claire E; Nadkarni, Vinay; Tremoulet, Patrice; Te Pas, Arjan B; Sarcevic, Aleksandra; Foglia, Elizabeth E
2018-06-14
A respiratory function monitor (RFM) may improve positive pressure ventilation (PPV) technique, but many providers do not use RFM data appropriately during delivery room resuscitation. We sought to use eye-tracking technology to identify RFM parameters that neonatal providers view most commonly during simulated PPV. Mixed methods study. Neonatal providers performed RFM-guided PPV on a neonatal manikin while wearing eye-tracking glasses to quantify visual attention on displayed RFM parameters (ie, exhaled tidal volume, flow, leak). Participants subsequently provided qualitative feedback on the eye-tracking glasses. Level 3 academic neonatal intensive care unit. Twenty neonatal resuscitation providers. Visual attention: overall gaze sample percentage; total gaze duration, visit count and average visit duration for each displayed RFM parameter. Qualitative feedback: willingness to wear eye-tracking glasses during clinical resuscitation. Twenty providers participated in this study. The mean gaze sample captured was 93% (SD 4%). Exhaled tidal volume waveform was the RFM parameter with the highest total gaze duration (median 23%, IQR 13-51%), highest visit count (median 5.17 per 10 s, IQR 2.82-6.16) and longest visit duration (median 0.48 s, IQR 0.38-0.81 s). All participants were willing to wear the glasses during clinical resuscitation. Wearable eye-tracking technology is feasible to identify gaze fixation on the RFM display and is well accepted by providers. Neonatal providers look at exhaled tidal volume more than any other RFM parameter. Future applications of eye-tracking technology include use during clinical resuscitation.
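As a rough illustration of how per-parameter gaze metrics of this kind (total gaze duration, visit count, average visit duration) can be derived from a fixation log, the following sketch uses hypothetical AOI labels and durations, not the study's data:

```python
from collections import defaultdict

# Hypothetical fixation log: (aoi_label, duration_in_seconds) per fixation,
# in chronological order. AOI names and values are illustrative only.
fixations = [
    ("tidal_volume_waveform", 0.52),
    ("flow_waveform", 0.31),
    ("tidal_volume_waveform", 0.44),
    ("leak_value", 0.20),
]

total_recording_s = 10.0  # assumed length of the analysed PPV epoch

# A "visit" here is a maximal run of consecutive fixations on the same AOI.
visits = defaultdict(list)
prev_aoi, run = None, 0.0
for aoi, dur in fixations + [(None, 0.0)]:  # sentinel flushes the last run
    if aoi == prev_aoi:
        run += dur
    else:
        if prev_aoi is not None:
            visits[prev_aoi].append(run)
        prev_aoi, run = aoi, dur

for aoi, runs in visits.items():
    total = sum(runs)
    print(f"{aoi}: gaze share {100 * total / total_recording_s:.1f}%, "
          f"visits {len(runs)}, mean visit {total / len(runs):.2f}s")
```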
ERIC Educational Resources Information Center
Chuang, Hsueh-Hua; Liu, Han-Chin
2012-01-01
This study implemented eye-tracking technology to understand the impact of different multimedia instructional materials, i.e., five successive pages versus a single page with the same amount of information, on information-processing activities in 21 non-science-major college students. The findings showed that students demonstrated the same number…
Eye Tracking Outcomes in Tobacco Control Regulation and Communication: A Systematic Review.
Meernik, Clare; Jarman, Kristen; Wright, Sarah Towner; Klein, Elizabeth G; Goldstein, Adam O; Ranney, Leah
2016-10-01
In this paper we synthesize the evidence from eye tracking research in tobacco control to inform tobacco regulatory strategies and tobacco communication campaigns. We systematically searched 11 databases for studies that reported eye tracking outcomes in regards to tobacco regulation and communication. Two coders independently reviewed studies for inclusion and abstracted study characteristics and findings. Eighteen studies met full criteria for inclusion. Eye tracking studies on health warnings consistently showed these warnings often were ignored, though eye tracking demonstrated that novel warnings, graphic warnings, and plain packaging can increase attention toward warnings. Eye tracking also revealed that greater visual attention to warnings on advertisements and packages consistently was associated with cognitive processing as measured by warning recall. Eye tracking is a valid indicator of attention, cognitive processing, and memory. The use of this technology in tobacco control research complements existing methods in tobacco regulatory and communication science; it also can be used to examine the effects of health warnings and other tobacco product communications on consumer behavior in experimental settings prior to the implementation of novel health communication policies. However, the utility of eye tracking will be enhanced by the standardization of methodology and reporting metrics.
ERIC Educational Resources Information Center
Kabugo, David; Muyinda, Paul B.; Masagazi, Fred. M.; Mugagga, Anthony M.; Mulumba, Mathias B.
2016-01-01
Although eye-tracking technologies such as Tobii-T120/TX and Eye-Tribe are steadily becoming ubiquitous, and while their appropriation in education can aid teachers to collect robust information on how students move their eyes when reading and engaging with different learning objects, many teachers of Luganda language are yet to gain experiences…
Prior Knowledge and Online Inquiry-Based Science Reading: Evidence from Eye Tracking
ERIC Educational Resources Information Center
Ho, Hsin Ning Jessie; Tsai, Meng-Jung; Wang, Ching-Yeh; Tsai, Chin-Chung
2014-01-01
This study employed eye-tracking technology to examine how students with different levels of prior knowledge process text and data diagrams when reading a web-based scientific report. Students' visual behaviors were tracked and recorded when they read a report demonstrating the relationship between the greenhouse effect and global climate…
Evaluation of the Tobii EyeX Eye tracking controller and Matlab toolkit for research.
Gibaldi, Agostino; Vanegas, Mauricio; Bex, Peter J; Maiello, Guido
2017-06-01
The Tobii EyeX Controller is a new low-cost binocular eye tracker marketed for integration in gaming and consumer applications. The manufacturers claim that the system was conceived for natural eye gaze interaction, does not require continuous recalibration, and allows moderate head movements. The Controller is provided with an SDK to foster the development of new eye tracking applications. We review the characteristics of the device for its possible use in scientific research. We develop and evaluate an open source Matlab Toolkit that can be employed to interface with the EyeX device for gaze recording in behavioral experiments. The Toolkit provides calibration procedures tailored to both binocular and monocular experiments, as well as procedures to evaluate other eye tracking devices. The observed performance of the EyeX (i.e. accuracy < 0.6°, precision < 0.25°, latency < 50 ms and sampling frequency ≈55 Hz) is sufficient for some classes of research application. The device can be successfully employed to measure fixation parameters, saccadic, smooth pursuit and vergence eye movements. However, the relatively low sampling rate and moderate precision limit the suitability of the EyeX for monitoring micro-saccadic eye movements or for real-time gaze-contingent stimulus control. For these applications, research grade, high-cost eye tracking technology may still be necessary. Therefore, despite its limitations with respect to high-end devices, the EyeX has the potential to further the dissemination of eye tracking technology to a broad audience, and could be a valuable asset in consumer and gaming applications as well as a subset of basic and clinical research settings.
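For readers evaluating a low-cost tracker along similar lines, the sketch below computes accuracy and precision from gaze samples using common definitions (mean angular offset from the target; RMS of sample-to-sample deviation). These definitions and the synthetic data are assumptions for illustration, not the toolkit's actual procedures:

```python
import numpy as np

def accuracy_and_precision(gaze_deg, target_deg):
    """Gaze and target positions in degrees of visual angle, shape (n, 2).

    Accuracy: mean angular offset between gaze and target.
    Precision: RMS of successive sample-to-sample angular deviations.
    """
    gaze = np.asarray(gaze_deg, dtype=float)
    target = np.asarray(target_deg, dtype=float)
    offsets = np.linalg.norm(gaze - target, axis=1)
    accuracy = offsets.mean()
    steps = np.linalg.norm(np.diff(gaze, axis=0), axis=1)
    precision_rms = np.sqrt(np.mean(steps ** 2))
    return accuracy, precision_rms

# Example with synthetic samples scattered around a target at (0, 0) degrees
rng = np.random.default_rng(0)
gaze = rng.normal(loc=[0.3, -0.2], scale=0.1, size=(100, 2))
print(accuracy_and_precision(gaze, np.zeros((100, 2))))
```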
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kovesdi, C.; Spielman, Z.; LeBlanc, K.
An important element of human factors engineering (HFE) pertains to measurement and evaluation (M&E). The role of HFE-M&E should be integrated throughout the entire control room modernization (CRM) process and be used for human-system performance evaluation and diagnostic purposes in resolving potential human engineering deficiencies (HEDs) and other human machine interface (HMI) design issues. NUREG-0711 describes how HFE in CRM should employ a hierarchical set of measures, particularly during integrated system validation (ISV), including plant performance, personnel task performance, situation awareness, cognitive workload, and anthropometric/physiological factors. Historically, subjective measures have been primarily used since they are easier to collect and do not require specialized equipment. However, there are pitfalls with relying solely on subjective measures in M&E that negatively impact reliability, sensitivity, and objectivity. As part of comprehensively capturing a diverse set of measures that strengthen findings and inferences made about the benefits of emerging technologies like advanced displays, this paper discusses the value of using eye tracking as an objective method that can be used in M&E. A brief description of eye tracking technology and relevant eye tracking measures is provided. Additionally, technical considerations and the unique challenges with using eye tracking in full-scale simulations are addressed. Finally, this paper shares preliminary findings regarding the use of a wearable eye tracking system in a full-scale simulator study. These findings should help guide future full-scale simulator studies using eye tracking as a methodology to evaluate human-system performance.
NASA Astrophysics Data System (ADS)
Maguen, Ezra I.; Salz, James J.; Nesburn, Anthony B.
1997-05-01
Preliminary results of the correction of myopia up to -7.00 D by tracked photorefractive keratectomy (T-PRK) with a scanning and tracking excimer laser by Autonomous Technologies are discussed. 41 eyes participated (20 males). 28 eyes were evaluated one month postop. At epithelization day mean uncorrected vision was 20/45.3. At one month postop, 92.8% of eyes were 20/40 and 46.4% were 20/20. No eye was worse than 20/50. 75% of eyes were within +/- 0.5 D of emmetropia and 82% were within +/- 1.00 D of emmetropia. Eyes corrected for monovision were included. One eye lost 3 lines of best corrected vision, and had more than 1.00 D induced astigmatism due to a central corneal ulcer. Additional complications included symptomatic recurrent corneal erosions which were controlled with topical hypertonic saline. T-PRK appears to allow effective correction of low to moderate myopia. Further study will establish safety and efficacy of the procedure.
A psychotechnological review on eye-tracking systems: towards user experience.
Mele, Maria Laura; Federici, Stefano
2012-07-01
The aim of the present work is to show a critical review of the international literature on eye-tracking technologies by focusing on those features that characterize them as 'psychotechnologies'. A critical literature review was conducted through the main psychology, engineering, and computer sciences databases by following specific inclusion and exclusion criteria. A total of 46 matches from 1998 to 2010 were selected for content analysis. Results have been divided into four broad thematic areas. We found that, although there is a growing attention to end-users, most of the studies reviewed in this work are far from being considered as adopting holistic human-computer interaction models that include both individual differences and needs of users. User is often considered only as a measurement object of the functioning of the technological system and not as a real alter-ego of the intrasystemic interaction. In order to fully benefit from the communicative functions of gaze, the research on eye-tracking must emphasize user experience. Eye-tracking systems would become an effective assistive technology for integration, adaptation and neutralization of the environmental barrier only when a holistic model can be applied for both design processes and assessment of the functional components of the interaction.
ERIC Educational Resources Information Center
Bazar, Nancy Sceery
2009-01-01
The purpose of this primarily quantitative study was to compare how young adults with and without intellectual disabilities examine different types of images. Two experiments were conducted. The first, a replication and extension of a classic eye-tracking study (Yarbus, 1967), generated eye gaze patterns and data in response to questions related…
Video-Based Eye Tracking in Sex Research: A Systematic Literature Review.
Wenzlaff, Frederike; Briken, Peer; Dekker, Arne
2015-12-21
Although eye tracking has been used for decades, it has gained popularity in the area of sex research only recently. The aim of this article is to examine the potential merits of eye tracking for this field. We present a systematic review of the current use of video-based eye-tracking technology in this area, evaluate the findings, and identify future research opportunities. A total of 34 relevant studies published between 2006 and 2014 were identified for inclusion by means of online databases and other methods. We grouped them into three main areas of research: body perception and attractiveness, forensic research, and sexual orientation. Despite the methodological and theoretical differences across the studies, eye tracking has been shown to be a promising tool for sex research. The article suggests there is much potential for further studies to employ this technique because it is noninvasive and yet still allows for the assessment of both conscious and unconscious perceptional processes. Furthermore, eye tracking can be implemented in investigations of various theoretical backgrounds, ranging from biology to the social sciences.
Long-range eye tracking: A feasibility study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jayaweera, S.K.; Lu, Shin-yee
1994-08-24
The design considerations for a long-range Purkinje effects based video tracking system using current technology are presented. Past work, current experiments, and future directions are thoroughly discussed, with an emphasis on digital signal processing techniques and obstacles. It has been determined that while a robust, efficient, long-range, and non-invasive eye tracking system will be difficult to develop, such a project is indeed feasible.
Kredel, Ralf; Vater, Christian; Klostermann, André; Hossner, Ernst-Joachim
2017-01-01
Reviewing 60 studies on natural gaze behavior in sports, it becomes clear that, over the last 40 years, the use of eye-tracking devices has considerably increased. Specifically, this review reveals the large variance of methods applied, analyses performed, and measures derived within the field. The results of sub-sample analyses suggest that sports-related eye-tracking research strives, on the one hand, for ecologically valid test settings (i.e., viewing conditions and response modes), while on the other, for experimental control along with high measurement accuracy (i.e., controlled test conditions with high-frequency eye-trackers linked to algorithmic analyses). To meet both demands, some promising compromises of methodological solutions have been proposed-in particular, the integration of robust mobile eye-trackers in motion-capture systems. However, as the fundamental trade-off between laboratory and field research cannot be solved by technological means, researchers need to carefully weigh the arguments for one or the other approach by accounting for the respective consequences. Nevertheless, for future research on dynamic gaze behavior in sports, further development of the current mobile eye-tracking methodology seems highly advisable to allow for the acquisition and algorithmic analyses of larger amounts of gaze-data and further, to increase the explanatory power of the derived results.
Eye-Tracking as a Tool to Evaluate Functional Ability in Everyday Tasks in Glaucoma.
Kasneci, Enkelejda; Black, Alex A; Wood, Joanne M
2017-01-01
To date, few studies have investigated the eye movement patterns of individuals with glaucoma while they undertake everyday tasks in real-world settings. While some of these studies have reported possible compensatory gaze patterns in those with glaucoma who demonstrated good task performance despite their visual field loss, little is known about the complex interaction between field loss and visual scanning strategies and the impact on task performance and, consequently, on quality of life. We review existing approaches that have quantified the effect of glaucomatous visual field defects on the ability to undertake everyday activities through the use of eye movement analysis. Furthermore, we discuss current developments in eye-tracking technology and the potential for combining eye-tracking with virtual reality and advanced analytical approaches. Recent technological developments suggest that systems based on eye-tracking have the potential to assist individuals with glaucomatous loss to maintain or even improve their performance on everyday tasks and hence enhance their long-term quality of life. We discuss novel approaches for studying the visual search behavior of individuals with glaucoma that have the potential to assist individuals with glaucoma, through the use of personalized programs that take into consideration the individual characteristics of their remaining visual field and visual search behavior.
Li, Bin; Fu, Hong; Wen, Desheng; Lo, WaiLun
2018-05-19
Eye tracking technology has become increasingly important for psychological analysis, medical diagnosis, driver assistance systems, and many other applications. Various gaze-tracking models have been established by previous researchers. However, there is currently no near-eye display system with accurate gaze-tracking performance and a convenient user experience. In this paper, we constructed a complete prototype of the mobile gaze-tracking system 'Etracker' with a near-eye viewing device for human gaze tracking. We proposed a combined gaze-tracking algorithm. In this algorithm, the convolutional neural network is used to remove blinking images and predict coarse gaze position, and then a geometric model is defined for accurate human gaze tracking. Moreover, we proposed using the mean value of gazes to resolve pupil center changes caused by nystagmus in calibration algorithms, so that an individual user only needs to calibrate it the first time, which makes our system more convenient. The experiments on gaze data from 26 participants show that the eye center detection accuracy is 98% and Etracker can provide an average gaze accuracy of 0.53° at a rate of 30–60 Hz.
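The mean-gaze calibration idea described above can be illustrated with a minimal sketch: average the raw gaze estimates recorded at each calibration target, then fit a simple mapping to screen coordinates. The coordinates, target layout, and affine model are assumptions for illustration only; the actual Etracker pipeline combines a convolutional neural network with a geometric model:

```python
import numpy as np

# Hypothetical calibration data: raw pupil-centre estimates (px) recorded while
# the user fixates each known on-screen target (px). Averaging the samples per
# target damps pupil-centre jitter (e.g. from nystagmus), as the abstract suggests.
raw_samples = {
    (100, 100): [(412, 305), (415, 309), (410, 307)],
    (800, 100): [(530, 303), (528, 301), (531, 306)],
    (100, 500): [(409, 388), (413, 391), (411, 389)],
    (800, 500): [(527, 390), (529, 392), (526, 388)],
}

targets = np.array(list(raw_samples.keys()), dtype=float)
means = np.array([np.mean(v, axis=0) for v in raw_samples.values()])

# Fit an affine mapping [x, y, 1] -> screen coordinates by least squares.
A = np.hstack([means, np.ones((len(means), 1))])
coeffs, *_ = np.linalg.lstsq(A, targets, rcond=None)

def to_screen(pupil_xy):
    return np.array([*pupil_xy, 1.0]) @ coeffs

print(to_screen((413, 307)))  # should land near the (100, 100) target
```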
The Influence of Different Representations on Solving Concentration Problems at Elementary School
NASA Astrophysics Data System (ADS)
Liu, Chia-Ju; Shen, Ming-Hsun
2011-10-01
This study investigated the students' learning process of the concept of concentration at the elementary school level in Taiwan. The influence of different representational types on the process of proportional reasoning was also explored. The participants included nineteen third-grade and eighteen fifth-grade students. Eye-tracking technology was used in conducting the experiment. The materials were adapted from Noelting's (1980a) "orange juice test" experiment. All problems on concentration included three stages (the intuitive, the concrete operational, and the formal operational), and each problem was displayed in iconic and symbolic representations. The data were collected through eye-tracking technology and post-test interviews. The results showed that the representational types influenced students' solving of concentration problems. Furthermore, the data on eye movement indicated that students used different strategies or rules to solve concentration problems at the different stages of the problems with different representational types. This study is intended to contribute to the understanding of elementary school students' problem-solving strategies and the usability of eye-tracking technology in related studies.
Odean, Rosalie; Nazareth, Alina; Pruden, Shannon M.
2015-01-01
Developmental systems theory posits that development cannot be segmented by influences acting in isolation, but should be studied through a scientific lens that highlights the complex interactions between these forces over time (Overton, 2013a). This poses a unique challenge for developmental psychologists studying complex processes like language development. In this paper, we advocate for the combining of highly sophisticated data collection technologies in an effort to move toward a more systemic approach to studying language development. We investigate the efficiency and appropriateness of combining eye-tracking technology and the LENA (Language Environment Analysis) system, an automated language analysis tool, in an effort to explore the relation between language processing in early development, and external dynamic influences like parent and educator language input in the home and school environments. Eye-tracking allows us to study language processing via eye movement analysis; these eye movements have been linked to both conscious and unconscious cognitive processing, and thus provide one means of evaluating cognitive processes underlying language development that does not require the use of subjective parent reports or checklists. The LENA system, on the other hand, provides automated language output that describes a child’s language-rich environment. In combination, these technologies provide critical information not only about a child’s language processing abilities but also about the complexity of the child’s language environment. Thus, when used in conjunction these technologies allow researchers to explore the nature of interacting systems involved in language development. PMID:26379591
Quantifying Pilot Visual Attention in Low Visibility Terminal Operations
NASA Technical Reports Server (NTRS)
Ellis, Kyle K.; Arthur, J. J.; Latorella, Kara A.; Kramer, Lynda J.; Shelton, Kevin J.; Norman, Robert M.; Prinzel, Lawrence J.
2012-01-01
Quantifying pilot visual behavior allows researchers to determine not only where a pilot is looking and when, but also, when these data are coupled with flight technical performance, to track specific behaviors. Remote eye tracking systems have been integrated into simulators at NASA Langley with effectively no impact on the pilot environment. This paper discusses the installation and use of a remote eye tracking system. The data collection techniques from a complex human-in-the-loop (HITL) research experiment are discussed; in particular, the data reduction algorithms and logic to transform raw eye tracking data into quantified visual behavior metrics, and analysis methods to interpret visual behavior. The findings suggest superior performance for Head-Up Display (HUD) and improved attentional behavior for Head-Down Display (HDD) implementations of Synthetic Vision System (SVS) technologies for low visibility terminal area operations. Keywords: eye tracking, flight deck, NextGen, human machine interface, aviation
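A typical first step in such data reduction is converting raw gaze samples into fixations; the sketch below is a minimal dispersion-threshold (I-DT) detector, with threshold values chosen only for illustration and not taken from the NASA Langley implementation:

```python
import numpy as np

def idt_fixations(x, y, t, dispersion_px=30.0, min_dur_s=0.1):
    """Minimal dispersion-threshold (I-DT) fixation detector.

    x, y: gaze coordinates in pixels; t: timestamps in seconds.
    Returns (start_time, end_time, centroid_x, centroid_y) per fixation.
    """
    fixations, i, n = [], 0, len(t)
    while i < n:
        j = i
        # Grow the window while horizontal + vertical dispersion stays small.
        while j + 1 < n and ((max(x[i:j + 2]) - min(x[i:j + 2])) +
                             (max(y[i:j + 2]) - min(y[i:j + 2]))) <= dispersion_px:
            j += 1
        if t[j] - t[i] >= min_dur_s:
            fixations.append((t[i], t[j],
                              float(np.mean(x[i:j + 1])), float(np.mean(y[i:j + 1]))))
            i = j + 1
        else:
            i += 1
    return fixations
```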
Design of a Gaze-Sensitive Virtual Social Interactive System for Children With Autism
Lahiri, Uttama; Warren, Zachary; Sarkar, Nilanjan
2013-01-01
Impairments in social communication skills are thought to be core deficits in children with autism spectrum disorder (ASD). In recent years, several assistive technologies, particularly Virtual Reality (VR), have been investigated to promote social interactions in this population. It is well known that children with ASD demonstrate atypical viewing patterns during social interactions and thus monitoring eye-gaze can be valuable to design intervention strategies. While several studies have used eye-tracking technology to monitor eye-gaze for offline analysis, there exists no real-time system that can monitor eye-gaze dynamically and provide individualized feedback. Given the promise of VR-based social interaction and the usefulness of monitoring eye-gaze in real-time, a novel VR-based dynamic eye-tracking system is developed in this work. This system, called Virtual Interactive system with Gaze-sensitive Adaptive Response Technology (VIGART), is capable of delivering individualized feedback based on a child’s dynamic gaze patterns during VR-based interaction. Results from a usability study with six adolescents with ASD are presented that examines the acceptability and usefulness of VIGART. The results in terms of improvement in behavioral viewing and changes in relevant eye physiological indexes of participants while interacting with VIGART indicate the potential of this novel technology. PMID:21609889
ERIC Educational Resources Information Center
Andrzejewska, Magdalena; Stolinska, Anna; Blasiak, Wladyslaw; Peczkowski, Pawel; Rosiek, Roman; Rozek, Bozena; Sajka, Miroslawa; Wcislo, Dariusz
2016-01-01
The results of qualitative and quantitative investigations conducted with individuals who learned algorithms in school are presented in this article. In these investigations, eye-tracking technology was used to follow the process of solving algorithmic problems. The algorithmic problems were presented in two comparable variants: in a pseudocode…
ERIC Educational Resources Information Center
Tang, Hui; Kirk, John; Pienta, Norbert J.
2014-01-01
This paper includes two experiments, one investigating complexity factors in stoichiometry word problems, and the other identifying students' problem-solving protocols by using eye-tracking technology. The word problems used in this study had five different complexity factors, which were randomly assigned by a Web-based tool that we developed. The…
ERIC Educational Resources Information Center
van Hooft, Edwin A. J.; Born, Marise Ph.
2012-01-01
Intentional response distortion or faking among job applicants completing measures such as personality and integrity tests is a concern in personnel selection. The present study aimed to investigate whether eye-tracking technology can improve our understanding of the response process when faking. In an experimental within-participants design, a…
Using eye tracking to identify faking attempts during penile plethysmography assessment.
Trottier, Dominique; Rouleau, Joanne-Lucine; Renaud, Patrice; Goyette, Mathieu
2014-01-01
Penile plethysmography (PPG) is considered the most rigorous method for sexual interest assessment. Nevertheless, it is subject to faking attempts by participants, which compromises the internal validity of the instrument. To date, various attempts have been made to limit voluntary control of sexual response during PPG assessments, without satisfactory results. This exploratory research examined eye-tracking technologies' ability to identify the presence of cognitive strategies responsible for erectile inhibition during PPG assessment. Eye movements and penile responses for 20 subjects were recorded while exploring animated human-like computer-generated stimuli in a virtual environment under three distinct viewing conditions: (a) the free visual exploration of a preferred sexual stimulus without erectile inhibition; (b) the viewing of a preferred sexual stimulus with erectile inhibition; and (c) the free visual exploration of a non-preferred sexual stimulus. Results suggest that attempts to control erectile responses generate specific eye-movement variations, characterized by a general deceleration of the exploration process and limited exploration of the erogenous zone. Findings indicate that recording eye movements can provide significant information on the presence of competing covert processes responsible for erectile inhibition. The use of eye-tracking technologies during PPG could therefore lead to improved internal validity of the plethysmographic procedure.
Breen, Cathal J; Bond, Raymond; Finlay, Dewar
2014-01-01
This study investigated the use of eye tracking technology in teaching 12 lead electrocardiography interpretation to Healthcare Scientist students. Participants (n=33) interpreted ten 12 lead ECG recordings and were randomized to receive objective individual appraisal of their efforts either in a traditional didactic format or via eye tracker software. All participants rated the experience positively for improving their ECG interpretation competency. ECG analysis time ranged between 13.2 and 59.5s. The rhythm strip was the most common lead studied and was fixated on for the longest duration (mean 9.9s). Lead I was studied for the shortest duration (mean 0.25s). Feedback using eye tracking data during ECG interpretation did not produce any significant variation between the assessment marks of the study and control groups (p=0.32). Although the hypothesis of this study was rejected, active teaching and early feedback practices are recommended within this discipline.
Eye-Tracking Analysis of the Figures of Anti-Smoking Health Promoting Periodical's Illustrations
ERIC Educational Resources Information Center
Maródi, Ágnes; Devosa, Iván; Steklács, János; Fáyné-Dombi, Alice; Buzas, Zsuzsanna; Vanya, Melinda
2015-01-01
Nowadays, new education technologies and e-communication devices provide researchers with new measuring and assessment tools. Eye-tracking is one of these new methods in education. In our study we assessed 4 figures from the anti-smoking health issues of the National Institute for Health Development. In the study 22 students were included from a 7th grade…
Measuring advertising effectiveness in Travel 2.0 websites through eye-tracking technology.
Muñoz-Leiva, Francisco; Hernández-Méndez, Janet; Gómez-Carmona, Diego
2018-03-06
The advent of Web 2.0 is changing tourists' behaviors, prompting them to take on a more active role in preparing their travel plans. It is also leading tourism companies to have to adapt their marketing strategies to different online social media. The present study analyzes advertising effectiveness in social media in terms of customers' visual attention and self-reported memory (recall). Data were collected through a within-subjects and between-groups design based on eye-tracking technology, followed by a self-administered questionnaire. Participants were instructed to visit three Travel 2.0 websites (T2W), including a hotel's blog, social network profile (Facebook), and virtual community profile (Tripadvisor). Overall, the results revealed greater advertising effectiveness in the case of the hotel social network; and visual attention measures based on eye-tracking data differed from measures of self-reported recall. Visual attention to the ad banner was paid at a low level of awareness, which explains why the associations with the ad did not activate its subsequent recall. The paper offers a pioneering attempt in the application of eye-tracking technology, and examines the possible impact of visual marketing stimuli on user T2W-related behavior. The practical implications identified in this research, along with its limitations and future research opportunities, are of interest both for further theoretical development and practical application.
Gold, Jeffrey Allen; Stephenson, Laurel E; Gorsuch, Adriel; Parthasarathy, Keshav; Mohan, Vishnu
2016-09-01
Numerous reports describe unintended consequences of electronic health record implementation. Having previously described physicians' failures to recognize patient safety issues within our electronic health record simulation environment, we now report on our use of eye and screen-tracking technology to understand factors associated with poor error recognition during an intensive care unit-based electronic health record simulation. We linked performance on the simulation to standard eye and screen-tracking readouts including number of fixations, saccades, mouse clicks and screens visited. In addition, we developed an overall Composite Eye Tracking score which measured when, where and how often each safety item was viewed. For 39 participants, the Composite Eye Tracking score correlated with performance on the simulation (p = 0.004). Overall, the improved performance was associated with a pattern of rapid scanning of data manifested by increased number of screens visited (p = 0.001), mouse clicks (p = 0.03) and saccades (p = 0.004). Eye tracking can be successfully integrated into electronic health record-based simulation and provides a surrogate measure of cognitive decision making and electronic health record usability.
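The reported association between the Composite Eye Tracking score and simulation performance could be examined, for example, with a rank correlation; the sketch below uses hypothetical per-participant values and does not reproduce the study's actual statistical method:

```python
from scipy.stats import spearmanr

# Hypothetical per-participant data: composite eye-tracking score and the
# number of embedded safety items recognised during the simulation.
composite_score = [0.42, 0.55, 0.61, 0.38, 0.70, 0.49]
items_recognised = [3, 5, 6, 2, 7, 4]

rho, p = spearmanr(composite_score, items_recognised)
print(f"Spearman rho={rho:.2f}, p={p:.3f}")
```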
NASA Astrophysics Data System (ADS)
Pansing, Craig W.; Hua, Hong; Rolland, Jannick P.
2005-08-01
Head-mounted display (HMD) technologies find a variety of applications in the field of 3D virtual and augmented environments, 3D scientific visualization, as well as wearable displays. While most of the current HMDs use head pose to approximate line of sight, we propose to investigate approaches and designs for integrating eye tracking capability into HMDs from a low-level system design perspective and to explore schemes for optimizing system performance. In this paper, we particularly propose to optimize the illumination scheme, which is a critical component in designing an eye tracking-HMD (ET-HMD) integrated system. An optimal design can improve not only eye tracking accuracy, but also robustness. Using LightTools, we present the simulation of a complete eye illumination and imaging system using an eye model along with multiple near infrared LED (IRLED) illuminators and imaging optics, showing the irradiance variation of the different eye structures. The simulation of dark pupil effects along with multiple 1st-order Purkinje images will be presented. A parametric analysis is performed to investigate the relationships between the IRLED configurations and the irradiance distribution at the eye, and a set of optimal configuration parameters is recommended. The analysis will be further refined by actual eye image acquisition and processing.
ERIC Educational Resources Information Center
Swanson, Meghan R.; Siller, Michael
2014-01-01
The current study takes advantage of modern eye-tracking technology and evaluates how individuals allocate their attention when viewing social videos that display an adult model who is gazing at a series of targets that appear and disappear in the four corners of the screen (congruent condition), or gazing elsewhere (incongruent condition). Data…
Bond, R R; Kligfield, P D; Zhu, T; Finlay, D D; Drew, B; Guldenring, D; Breen, C; Clifford, G D; Wagner, G S
2015-01-01
The 12-lead electrocardiogram (ECG) is a complex set of cardiac signals that require a high degree of skill and clinical knowledge to interpret. Therefore, it is imperative to record and understand how expert readers interpret the 12-lead ECG. This short paper showcases how eye tracking technology and audio data can be fused together and visualised to gain insight into the interpretation techniques employed by an eminent ECG champion, namely Dr Rory Childers.
ERIC Educational Resources Information Center
Kochukhova, Olga; Gredeback, Gustaf
2010-01-01
This study relies on eye tracking technology to investigate how humans perceive others' feeding actions. Results demonstrate that 6-month-olds (n = 54) anticipate that food is brought to the mouth when observing an adult feeding herself with a spoon. Still, they fail to anticipate self-propelled (SP) spoons that move toward the mouth and manual…
An eye tracking study of bloodstain pattern analysts during pattern classification.
Arthur, R M; Hoogenboom, J; Green, R D; Taylor, M C; de Bruin, K G
2018-05-01
Bloodstain pattern analysis (BPA) is the forensic discipline concerned with the classification and interpretation of bloodstains and bloodstain patterns at the crime scene. At present, it is unclear exactly which stain or pattern properties and their associated values are most relevant to analysts when classifying a bloodstain pattern. Eye tracking technology has been widely used to investigate human perception and cognition. Its application to forensics, however, is limited. This is the first study to use eye tracking as a tool for gaining access to the mindset of the bloodstain pattern expert. An eye tracking method was used to follow the gaze of 24 bloodstain pattern analysts during an assigned task of classifying a laboratory-generated test bloodstain pattern. With the aid of an automated image-processing methodology, the properties of selected features of the pattern were quantified leading to the delineation of areas of interest (AOIs). Eye tracking data were collected for each AOI and combined with verbal statements made by analysts after the classification task to determine the critical range of values for relevant diagnostic features. Eye-tracking data indicated that there were four main regions of the pattern that analysts were most interested in. Within each region, individual elements or groups of elements that exhibited features associated with directionality, size, colour and shape appeared to capture the most interest of analysts during the classification task. The study showed that the eye movements of trained bloodstain pattern experts and their verbal descriptions of a pattern were well correlated.
A Novel Hybrid Mental Spelling Application Based on Eye Tracking and SSVEP-Based BCI
Stawicki, Piotr; Gembler, Felix; Rezeika, Aya; Volosyak, Ivan
2017-01-01
Steady state visual evoked potentials (SSVEPs)-based Brain-Computer interfaces (BCIs), as well as eye-tracking devices, provide a pathway for re-establishing communication for people with severe disabilities. We fused these control techniques into a novel eye-tracking/SSVEP hybrid system, which utilizes eye tracking for initial rough selection and the SSVEP technology for fine target activation. Based on our previous studies, only four stimuli were used for the SSVEP aspect, granting sufficient control for most BCI users. As eye tracking data are not used for activation of letters, false positives due to inappropriate dwell times are avoided. This novel approach combines the high speed of eye tracking systems and the high classification accuracies of low target SSVEP-based BCIs, leading to an optimal combination of both methods. We evaluated accuracy and speed of the proposed hybrid system with a 30-target spelling application implementing all three control approaches (pure eye tracking, SSVEP and the hybrid system) with 32 participants. Although the highest information transfer rates (ITRs) were achieved with pure eye tracking, a considerable number of subjects were not able to gain sufficient control over the stand-alone eye-tracking device or the pure SSVEP system (78.13% and 75% of the participants reached reliable control, respectively). In this respect, the proposed hybrid was most universal (over 90% of users achieved reliable control), and outperformed the pure SSVEP system in terms of speed and user friendliness. The presented hybrid system might offer communication to a wider range of users in comparison to the standard techniques. PMID:28379187
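Information transfer rate comparisons of this kind are commonly based on the Wolpaw formula; a small sketch follows, with the target count, accuracy, and selection time chosen only as placeholders rather than taken from the study:

```python
import math

def wolpaw_itr(n_targets, accuracy, selection_time_s):
    """Information transfer rate in bits/min for an n-target speller.

    accuracy: proportion of correct selections (0 < accuracy <= 1).
    selection_time_s: average time per selection, in seconds.
    """
    n, p = n_targets, accuracy
    bits = math.log2(n)
    if 0 < p < 1:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * (60.0 / selection_time_s)

# e.g. a 30-target speller at 95% accuracy and 3 s per selection
print(f"{wolpaw_itr(30, 0.95, 3.0):.1f} bits/min")
```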
NASA Astrophysics Data System (ADS)
Vuori, Tero; Olkkonen, Maria
2006-01-01
The aim of the study is to test both customer image quality ratings (subjective image quality) and physical measurement of user behavior (eye movement tracking) to find customer satisfaction differences in imaging technologies. A methodological aim is to find out whether eye movements could be used quantitatively in image quality preference studies. In general, we want to map objective or physically measurable image quality to subjective evaluations and eye movement data. We conducted a series of image quality tests in which the test subjects evaluated image quality while we recorded their eye movements. Results show that eye movement parameters change consistently according to the instructions given to the user and according to physical image quality; e.g., saccade duration increased with increasing blur. Results indicate that eye movement tracking could be used to differentiate the image quality evaluation strategies that users adopt. Results also show that eye movements would help in mapping between technological and subjective image quality. Furthermore, these results give some empirical emphasis to top-down processes in image quality perception and evaluation by showing differences between perceptual processes in situations where the cognitive task varies.
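The record reports saccade duration as one of the eye movement parameters (e.g., it increased with blur). As a hedged illustration of how such events are commonly segmented, here is a minimal velocity-threshold (I-VT style) sketch; the threshold, sampling rate, and degree-based input are assumptions, not details from the study.

```python
# Minimal velocity-threshold (I-VT style) segmentation of gaze samples into
# saccade events. Threshold and sampling rate are illustrative; the study
# above does not specify its algorithm.
import numpy as np

def detect_saccades(x_deg, y_deg, fs_hz, vel_thresh_deg_s=30.0):
    """Return a list of (start_idx, end_idx, duration_s) saccade events."""
    vx = np.gradient(np.asarray(x_deg)) * fs_hz
    vy = np.gradient(np.asarray(y_deg)) * fs_hz
    speed = np.hypot(vx, vy)
    is_sacc = speed > vel_thresh_deg_s
    events, start = [], None
    for i, s in enumerate(is_sacc):
        if s and start is None:
            start = i                                  # saccade onset
        elif not s and start is not None:
            events.append((start, i - 1, (i - start) / fs_hz))
            start = None
    if start is not None:                              # saccade running at end of trace
        events.append((start, len(is_sacc) - 1, (len(is_sacc) - start) / fs_hz))
    return events
```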
An Examination of Cognitive Processing of Multimedia Information Based on Viewers' Eye Movements
ERIC Educational Resources Information Center
Liu, Han-Chin; Chuang, Hsueh-Hua
2011-01-01
This study utilized qualitative and quantitative designs and eye-tracking technology to understand how viewers process multimedia information. Eye movement data were collected from eight college students (non-science majors) while they were viewing web pages containing different types of text and illustrations depicting the mechanism of…
Parallax barrier engineering for image quality improvement in an autostereoscopic 3D display.
Kim, Sung-Kyu; Yoon, Ki-Hyuk; Yoon, Seon Kyu; Ju, Heongkyu
2015-05-18
We present an image quality improvement in a parallax barrier (PB)-based multiview autostereoscopic 3D display system under real-time tracking of the positions of a viewer's eyes. The system exploits a parallax barrier engineered to offer significantly improved quality of three-dimensional images for a moving viewer without eyewear under dynamic eye tracking. The improved image quality includes enhanced uniformity of image brightness, reduced point crosstalk, and no pseudoscopic effects. We control the ratio between two parameters, i.e., the pixel size and the aperture of a parallax barrier slit, to improve uniformity of image brightness at a viewing zone. The eye tracking that monitors the positions of a viewer's eyes enables pixel data control software to turn on only the pixels for view images near the viewer's eyes (the other pixels are turned off), thus reducing point crosstalk. The eye-tracking-combined software provides the correct images for the respective eyes, therefore producing no pseudoscopic effects at zone boundaries. The viewing zone can span an area larger than the central viewing zone offered by a conventional PB-based multiview autostereoscopic 3D display (no eye tracking). Our 3D display system also provides multiple views for motion parallax under eye tracking. More importantly, we demonstrate a substantial reduction of point crosstalk of images at the viewing zone, its level being comparable to that of a commercialized eyewear-assisted 3D display system. The multiview autostereoscopic 3D display presented can greatly resolve the point crosstalk problem, which is one of the critical factors that have made it difficult for previous multiview autostereoscopic 3D display technologies to replace their eyewear-assisted counterparts.
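The record describes tuning the ratio of pixel size to barrier aperture, but the exact engineering is not given. As background only, the sketch below computes the baseline similar-triangle relations for an N-view parallax barrier (gap and slit pitch); it is textbook geometry under assumed parameters, not the authors' design.

```python
# Baseline similar-triangle relations for an N-view parallax-barrier display
# (textbook geometry, not the engineered design described above).
def parallax_barrier_geometry(pixel_pitch_mm, n_views, view_dist_mm, eye_sep_mm=65.0):
    gap_mm = pixel_pitch_mm * view_dist_mm / eye_sep_mm               # barrier-to-pixel gap
    slit_pitch_mm = n_views * pixel_pitch_mm * view_dist_mm / (view_dist_mm + gap_mm)
    return gap_mm, slit_pitch_mm

# e.g. 0.1 mm subpixel pitch, 4 views, 600 mm viewing distance (assumed values):
print(parallax_barrier_geometry(0.1, 4, 600.0))
```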
Possibilities and Implications of Using a Motion-Tracking System in Physical Education
ERIC Educational Resources Information Center
Chow, Jia Yi; Tan, Clara Wee Keat; Lee, Miriam Chang Yi; Button, Chris
2014-01-01
Advances in technology have created new opportunities for enhanced delivery of teaching to improve the acquisition of game skills in physical education (PE). The availability of a motion-tracking system (i.e. the A-Eye), which determines positional information of students in a practice context, might offer a suitable technology to support…
Statistical Analysis of Online Eye and Face-tracking Applications in Marketing
NASA Astrophysics Data System (ADS)
Liu, Xuan
Eye-tracking and face-tracking technology have been widely adopted to study viewers' attention and emotional responses. In this dissertation, we apply these two technologies to investigate effective online content designed to attract and direct attention and to engage viewers' emotional responses. In the first part of the dissertation, we conduct a series of experiments that use eye-tracking technology to explore how online models' facial cues affect users' attention on static e-commerce websites. The joint effects of two facial cues, gaze direction and facial expression, on attention are estimated by Bayesian ANOVA, allowing various distributional assumptions. We also consider the similarities and differences in the effects of facial cues among American and Chinese consumers. This study offers insights on how to attract and retain customers' attention for advertisers that use static advertisements on various websites or ad networks. In the second part of the dissertation, we conduct a face-tracking study in which we investigate the relation between experiment participants' emotional responses while watching comedy movie trailers and their intentions to watch the actual movies. Viewers' facial expressions are collected in real time and converted to emotional responses with algorithms based on a facial coding system. To analyze the data, we propose a joint modeling method that links viewers' longitudinal emotion measurements and their watching intentions. This research provides recommendations to filmmakers on how to improve the effectiveness of movie trailers and how to boost audiences' desire to watch the movies.
Puckett, Yana; Baronia, Benedicto C
2016-09-20
With the recent advances in eye tracking technology, it is now possible to track surgeons' eye movements while they are engaged in a surgical task or when surgical residents practice their surgical skills. Several studies have compared eye movements of surgical experts and novices and developed techniques to assess surgical skill on the basis of eye movements utilizing simulators and live surgery. None have evaluated simultaneous visual tracking of an expert and a novice during live surgery. Here, we describe a successful simultaneous deployment of visual tracking of an expert and a novice during live laparoscopic cholecystectomy. One expert surgeon and one chief surgical resident at an accredited surgical program in Lubbock, TX, USA performed a live laparoscopic cholecystectomy while simultaneously wearing the visual tracking devices. Their visual attitudes and movements were monitored via video recordings. The recordings were then analyzed for correlation between the expert and the novice. The visual attitudes and movements correlated approximately 85% between the expert surgeon and the chief surgical resident. The surgery was carried out uneventfully, and the data were abstracted with ease. We conclude that simultaneous deployment of visual tracking during live laparoscopic surgery is feasible. More studies and subjects are needed to verify our results and to refine the data analysis.
1989-08-01
paths for integration with the off-aperture and dual-mirror VPD designs. PREFACE: The goal of this work was to explore integration of an eye line-of-gaze ... Relationship in one plane between point-of-gaze on a flat scene and relative eye, detector, and scene positions ... and eye line-of-gaze measurement. As a first step towards the design of an appropriate eye tracking system for interface with the virtual cockpit
Baronia, Benedicto C
2016-01-01
With the recent advances in eye tracking technology, it is now possible to track surgeons' eye movements while they are engaged in a surgical task or when surgical residents practice their surgical skills. Several studies have compared eye movements of surgical experts and novices and developed techniques to assess surgical skill on the basis of eye movements utilizing simulators and live surgery. None have evaluated simultaneous visual tracking of an expert and a novice during live surgery. Here, we describe a successful simultaneous deployment of visual tracking of an expert and a novice during live laparoscopic cholecystectomy. One expert surgeon and one chief surgical resident at an accredited surgical program in Lubbock, TX, USA performed a live laparoscopic cholecystectomy while simultaneously wearing the visual tracking devices. Their visual attitudes and movements were monitored via video recordings. The recordings were then analyzed for correlation between the expert and the novice. The visual attitudes and movements correlated approximately 85% between the expert surgeon and the chief surgical resident. The surgery was carried out uneventfully, and the data were abstracted with ease. We conclude that simultaneous deployment of visual tracking during live laparoscopic surgery is feasible. More studies and subjects are needed to verify our results and to refine the data analysis. PMID:27774359
Eye gaze tracking based on the shape of pupil image
NASA Astrophysics Data System (ADS)
Wang, Rui; Qiu, Jian; Luo, Kaiqing; Peng, Li; Han, Peng
2018-01-01
The eye tracker is an important instrument for research in psychology, widely used in attention, visual perception, reading, and other fields of research. Because of its potential role in human-computer interaction, eye gaze tracking has been a topic of research in many fields over the last decades. Nowadays, with the development of technology, non-intrusive methods are more and more welcomed. In this paper, we present a method based on the shape of the pupil image to estimate the gaze point of human eyes without any intrusive devices such as a hat, a pair of glasses, and so on. After using an ellipse fitting algorithm to process the captured pupil image, we can determine the direction of fixation from the shape of the pupil. The innovative aspect of this method is to utilize the shape of the pupil so that much complicated computation can be avoided. The proposed approach is very helpful for the study of eye gaze tracking: it needs just one camera, without infrared light, to detect the changes in the shape of the pupil and determine the direction of the eye gaze; no additional equipment is required.
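A minimal sketch of the pupil-shape idea described above, assuming OpenCV is available: threshold a cropped grayscale eye image, fit an ellipse to the largest dark contour, and read off the axis ratio and orientation as a crude proxy for gaze eccentricity and direction. The threshold value and the shape-to-gaze mapping are illustrative assumptions, not the paper's algorithm.

```python
# Minimal pupil ellipse fitting with OpenCV; threshold and interpretation of
# the ellipse shape are illustrative assumptions, not the paper's method.
import cv2

def pupil_ellipse(gray_eye_img, thresh=40):
    _, binary = cv2.threshold(gray_eye_img, thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)        # assume pupil is the largest dark blob
    if len(pupil) < 5:                                # fitEllipse needs at least 5 points
        return None
    (cx, cy), (ax1, ax2), angle = cv2.fitEllipse(pupil)
    ratio = min(ax1, ax2) / max(ax1, ax2)             # ~1.0 near frontal gaze, <1 when eccentric
    return {"center": (cx, cy), "axis_ratio": ratio, "angle_deg": angle}

# usage (assuming 'eye.png' is a cropped grayscale eye image):
# print(pupil_ellipse(cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)))
```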
Development of SPIES (Space Intelligent Eyeing System) for smart vehicle tracing and tracking
NASA Astrophysics Data System (ADS)
Abdullah, Suzanah; Ariffin Osoman, Muhammad; Guan Liyong, Chua; Zulfadhli Mohd Noor, Mohd; Mohamed, Ikhwan
2016-06-01
SPIES, or Space-based Intelligent Eyeing System, is an intelligent technology that can be utilized for various applications such as gathering spatial information about features on Earth, tracking the movement of an object, tracing historical information, monitoring driving behavior, and serving as a real-time security and alarm system, among many others. Because SPIES will be developed and supplied modularly, it encourages usage based on users' needs and affordability. SPIES is a complete system with camera, GSM, GPS/GNSS, and G-sensor modules with intelligent functions and capabilities. The camera is mainly used to capture pictures and video, sometimes with audio, of an event. Its usage is not limited to everyday recording for nostalgic purposes; it can also serve as a security reference and as material evidence when an undesirable event such as a crime occurs. When integrated with the space-based technology of the Global Navigation Satellite System (GNSS), photos and videos can be recorded together with positioning information. Integrating these technologies with Information and Communication Technology (ICT) and Geographic Information Systems (GIS) produces an innovative method of information gathering in which still pictures or video with positioning information can be conveyed in real time via the web to display location on a map, hence creating an intelligent eyeing system based on space technology. Providing global positioning information is a challenge, but SPIES overcomes it even in areas without GNSS signal reception, enabling continuous tracking and tracing capability.
Understanding Health Literacy Measurement Through Eye Tracking
Mackert, Michael; Champlin, Sara E.; Pasch, Keryn E.; Weiss, Barry D.
2013-01-01
This study used eye-tracking technology to explore how individuals with different levels of health literacy visualize health-related information. The authors recruited 25 university administrative staff (more likely to have adequate health literacy skills) and 25 adults enrolled in an adult literacy program (more likely to have limited health literacy skills). The authors administered the Newest Vital Sign (NVS) health literacy assessment to each participant. The assessment involves having individuals answer questions about a nutrition label while viewing the label. The authors used computerized eye-tracking technology to measure the amount of time each participant spent fixating on nutrition label information that was relevant to the questions being asked and the amount of time they spent viewing nonrelevant information. Results showed that lower NVS scores were significantly associated with more time spent on information not relevant for answering the NVS items. This finding suggests that efforts to improve health literacy measurement should include the ability to differentiate not just between individuals who have difficulty interpreting and using health information, but also between those who have difficulty finding relevant information. In addition, this finding suggests that health education material should minimize the inclusion of nonrelevant information. PMID:24093355
Light, Janice; McNaughton, David
2014-06-01
In order to improve outcomes for individuals who require AAC, there is an urgent need for research across the full spectrum--from basic research to investigate fundamental language and communication processes, to applied clinical research to test applications of this new knowledge in the real world. To date, there has been a notable lack of basic research in the AAC field to investigate the underlying cognitive, sensory perceptual, linguistic, and motor processes of individuals with complex communication needs. Eye tracking research technology provides a promising method for researchers to investigate some of the visual cognitive processes that underlie interaction via AAC. The eye tracking research technology automatically records the latency, duration, and sequence of visual fixations, providing key information on what elements attract the individual's attention (and which ones do not), for how long, and in what sequence. As illustrated by the papers in this special issue, this information can be used to improve the design of AAC systems, assessments, and interventions to better meet the needs of individuals with developmental and acquired disabilities who require AAC (e.g., individuals with autism spectrum disorders, Down syndrome, intellectual disabilities of unknown origin, aphasia).
Eye Tracking Based Control System for Natural Human-Computer Interaction
Lin, Shu-Fan
2017-01-01
Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. In order to improve the reliability, mobility, and usability of eye tracking techniques in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode using only the user's eyes. The usage flow of the proposed system is designed to follow natural human habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of different difficulty (searching for an article and browsing multimedia web pages) were performed to compare the proposed eye control tool with an existing system. Technology Acceptance Model (TAM) measures were used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design. PMID:29403528
Eye Tracking Based Control System for Natural Human-Computer Interaction.
Zhang, Xuebai; Liu, Xiaolong; Yuan, Shyan-Ming; Lin, Shu-Fan
2017-01-01
Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. In order to improve the reliability, mobility, and usability of eye tracking techniques in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode using only the user's eyes. The usage flow of the proposed system is designed to follow natural human habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of different difficulty (searching for an article and browsing multimedia web pages) were performed to compare the proposed eye control tool with an existing system. Technology Acceptance Model (TAM) measures were used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design.
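The two records above describe an eye-only interface that integrates mouse and keyboard functions with a magnifier for accurate operation. One common activation strategy for such interfaces is dwell-time selection; the sketch below illustrates that strategy with hypothetical thresholds and is not the authors' implementation.

```python
# Minimal dwell-time "click" logic of the kind used by eye-controlled
# interfaces: a target is activated when gaze stays within a tolerance radius
# for a fixed dwell time. Thresholds are illustrative, not from the study.
import math
import time

class DwellSelector:
    def __init__(self, dwell_s=1.0, radius_px=40):
        self.dwell_s = dwell_s
        self.radius_px = radius_px
        self._anchor = None
        self._since = None

    def update(self, gaze_x, gaze_y, now=None):
        """Feed one gaze sample; returns the (x, y) anchor when a dwell completes."""
        now = time.monotonic() if now is None else now
        if self._anchor is None or math.dist((gaze_x, gaze_y), self._anchor) > self.radius_px:
            self._anchor, self._since = (gaze_x, gaze_y), now   # gaze moved: restart dwell
            return None
        if now - self._since >= self.dwell_s:
            anchor, self._anchor, self._since = self._anchor, None, None
            return anchor                                        # dwell completed: "click"
        return None
```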
A compact eyetracked optical see-through head-mounted display
NASA Astrophysics Data System (ADS)
Hua, Hong; Gao, Chunyu
2012-03-01
An eye-tracked head-mounted display (ET-HMD) system is able to display virtual images as a classical HMD does, while additionally tracking the gaze direction of the user. There is ample evidence that a fully integrated ET-HMD system offers multi-fold benefits, not only to fundamental scientific research but also to emerging applications of such technology. For instance, eye-tracking capability in HMDs adds a very valuable tool and objective metric for scientists to quantitatively assess user interaction with 3D environments and to investigate the effectiveness of various 3D visualization technologies for specific tasks, including training, education, and augmented cognition tasks. In this paper, we present an innovative optical approach to the design of an optical see-through ET-HMD system based on freeform optical technology and an innovative optical scheme that uniquely combines the display optics with the eye imaging optics. A preliminary design of the described ET-HMD system is presented.
ERIC Educational Resources Information Center
Jian, Yu-Cin
2016-01-01
Previous research suggests that multiple representations can improve science reading comprehension. This facilitation effect is premised on the observation that readers can efficiently integrate information in text and diagram formats; however, this effect in young readers is still contested. Using eye-tracking technology and sequential analysis,…
NASA Technical Reports Server (NTRS)
1999-01-01
NASA's Ames Research Center contracted with SRI International to develop a device that would be able to anticipate, track, and monitor involuntary ocular movement horizontally, vertically, and with respect to depth of field. This development helped research institutions to understand the eye. The Eyetracker, manufactured and distributed by Forward Optical Technologies, Inc., is now used in the clinical/medical field.
What Does the Eye See? Reading Online Primary Source Photographs in History
ERIC Educational Resources Information Center
Levesque, Stephane; Ng-A-Fook, Nicholas; Corrigan, Julie
2014-01-01
This exploratory study looks at how a sample of preservice teachers and historians read visuals in the context of school history. The participants used eye tracking technology and think-aloud protocol, as they examined a series of online primary source photographs from a virtual exhibit. Voluntary participants (6 students and 2 professional…
Warren, Amy L; Donnon, Tyrone L; Wagg, Catherine R; Priest, Heather; Fernandez, Nicole J
2018-01-18
Visual diagnostic reasoning is the cognitive process by which pathologists reach a diagnosis based on visual stimuli (cytologic, histopathologic, or gross imagery). Currently, there is little to no literature examining visual reasoning in veterinary pathology. The objective of the study was to use eye tracking to establish baseline quantitative and qualitative differences between the visual reasoning processes of novice and expert veterinary pathologists viewing cytology specimens. Novice and expert participants were each shown 10 cytology images and asked to formulate a diagnosis while wearing eye-tracking equipment (10 slides) and while concurrently verbalizing their thought processes using the think-aloud protocol (5 slides). Compared to novices, experts demonstrated significantly higher diagnostic accuracy (p<.017), shorter time to diagnosis (p<.017), and a higher percentage of time spent viewing areas of diagnostic interest (p<.017). Experts elicited more key diagnostic features in the think-aloud protocol and had more efficient patterns of eye movement. These findings suggest that experts' fast time to diagnosis, efficient eye-movement patterns, and preference for viewing areas of interest support system 1 (pattern-recognition) reasoning and script-inductive knowledge structures, with system 2 (analytic) reasoning used to verify the diagnosis.
Fashler, Samantha R; Katz, Joel
2016-01-01
Attentional biases to painful stimuli are evident in individuals with chronic pain, although the directional tendency of these biases (ie, toward or away from threat-related stimuli) remains unclear. This study used eye-tracking technology, a measure of visual attention, to evaluate the attentional patterns of individuals with and without chronic pain during exposure to injury-related and neutral pictures. Individuals with (N=51) and without chronic pain (N=62) completed a dot-probe task using injury-related and neutral pictures while their eye movements were recorded. Mixed-design analysis of variance evaluated the interaction between group (chronic pain, pain-free) and picture type (injury-related, neutral). Reaction time results showed that regardless of chronic pain status, participants responded faster to trials with neutral stimuli in comparison to trials that included injury-related pictures. Eye-tracking measures showed within-group differences whereby injury-related pictures received more frequent fixations and visits, as well as longer average visit durations. Between-group differences showed that individuals with chronic pain had fewer fixations and shorter average visit durations for all stimuli. An examination of how biases change over the time-course of stimulus presentation showed that during the late phase of attention, individuals with chronic pain had longer average gaze durations on injury pictures relative to pain-free individuals. The results show the advantage of incorporating eye-tracking methodology when examining attentional biases, and suggest future avenues of research. PMID:27570461
Rotational symmetric HMD with eye-tracking capability
NASA Astrophysics Data System (ADS)
Liu, Fangfang; Cheng, Dewen; Wang, Qiwei; Wang, Yongtian
2016-10-01
As an important auxiliary function of head-mounted displays (HMDs), eye tracking has an important role in the field of intelligent human-machine interaction. In this paper, an eye-tracking HMD system (ET-HMD) is designed based on a rotationally symmetric system. The tracking principle in this paper is based on pupil-corneal reflection. The ET-HMD system comprises three optical paths for virtual display, infrared illumination, and eye tracking. The display optics is shared by the three optical paths and consists of four spherical lenses. For the eye-tracking path, an extra imaging lens is added to match the image sensor and achieve eye tracking. The display optics provides users a 40° diagonal FOV with a 0.61″ OLED, a 19 mm eye clearance, and a 10 mm exit pupil diameter. The eye-tracking path can capture 15 mm × 15 mm of the user's eye. The average MTF is above 0.1 at 26 lp/mm for the display path, and exceeds 0.2 at 46 lp/mm for the eye-tracking path. Eye illumination is simulated using LightTools with an eye model and an 850 nm near-infrared LED (NIR-LED). The results of the simulation show that the illumination of the NIR-LED can cover the area of the eye model with the display optics, which is sufficient for eye tracking. HMDs with an integrated eye-tracking feature can help improve the user experience.
Borg, Lindsay K; Harrison, T Kyle; Kou, Alex; Mariano, Edward R; Udani, Ankeet D; Kim, T Edward; Shum, Cynthia; Howard, Steven K
2018-02-01
Objective measures are needed to guide the novice's pathway to expertise. Within and outside medicine, eye tracking has been used for both training and assessment. We designed this study to test the hypothesis that eye tracking may differentiate novices from experts in static image interpretation for ultrasound (US)-guided regional anesthesia. We recruited novice anesthesiology residents and regional anesthesiology experts. Participants wore eye-tracking glasses, were shown 5 sonograms of US-guided regional anesthesia, and were asked a series of anatomy-based questions related to each image while their eye movements were recorded. The answer to each question was a location on the sonogram, defined as the area of interest (AOI). The primary outcome was the total gaze time in the AOI (seconds). Secondary outcomes were the total gaze time outside the AOI (seconds), total time to answer (seconds), and time to first fixation on the AOI (seconds). Five novices and 5 experts completed the study. Although the gaze time (mean ± SD) in the AOI was not different between groups (7 ± 4 seconds for novices and 7 ± 3 seconds for experts; P = .150), the gaze time outside the AOI was greater for novices (75 ± 18 versus 44 ± 4 seconds for experts; P = .005). The total time to answer and total time to first fixation in the AOI were both shorter for experts. Experts in US-guided regional anesthesia take less time to identify sonoanatomy and spend less unfocused time away from a target compared to novices. Eye tracking is a potentially useful tool to differentiate novices from experts in the domain of US image interpretation. © 2017 by the American Institute of Ultrasound in Medicine.
Online webcam-based eye tracking in cognitive science: A first look.
Semmelmann, Kilian; Weigelt, Sarah
2018-04-01
Online experimentation is emerging in many areas of cognitive psychology as a viable alternative or supplement to classical in-lab experimentation. While performance- and reaction-time-based paradigms are covered in recent studies, one instrument of cognitive psychology has not received much attention up to now: eye tracking. In this study, we used JavaScript-based eye tracking algorithms recently made available by Papoutsaki et al. (International Joint Conference on Artificial Intelligence, 2016) together with consumer-grade webcams to investigate the potential of online eye tracking to benefit from the common advantages of online data conduction. We compared three in-lab conducted tasks (fixation, pursuit, and free viewing) with online-acquired data to analyze the spatial precision in the first two, and replicability of well-known gazing patterns in the third task. Our results indicate that in-lab data exhibit an offset of about 172 px (15% of screen size, 3.94° visual angle) in the fixation task, while online data is slightly less accurate (18% of screen size, 207 px), and shows higher variance. The same results were found for the pursuit task with a constant offset during the stimulus movement (211 px in-lab, 216 px online). In the free-viewing task, we were able to replicate the high attention attribution to eyes (28.25%) compared to other key regions like the nose (9.71%) and mouth (4.00%). Overall, we found web technology-based eye tracking to be suitable for all three tasks and are confident that the required hard- and software will be improved continuously for even more sophisticated experimental paradigms in all of cognitive psychology.
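The record reports offsets both in pixels and in degrees of visual angle (e.g., 172 px ≈ 3.94°). The conversion below is the standard geometry; the screen dimensions and viewing distance in the example are assumptions for illustration, not parameters reported in the study.

```python
# Standard conversion from an on-screen offset in pixels to degrees of visual
# angle. Screen geometry and viewing distance below are assumed values.
import math

def px_to_visual_angle_deg(offset_px, screen_width_px, screen_width_cm, view_dist_cm):
    offset_cm = offset_px * (screen_width_cm / screen_width_px)
    return math.degrees(2 * math.atan(offset_cm / (2 * view_dist_cm)))

# e.g. a 172 px offset on a 1280 px / 33 cm wide screen viewed from 60 cm:
print(round(px_to_visual_angle_deg(172, 1280, 33.0, 60.0), 2))
```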
ERIC Educational Resources Information Center
Liang, Jiali; Wilkinson, Krista
2018-01-01
Purpose: A striking characteristic of the social communication deficits in individuals with autism is atypical patterns of eye contact during social interactions. We used eye-tracking technology to evaluate how the number of human figures depicted and the presence of sharing activity between the human figures in still photographs influenced visual…
Liang, Jiali; Wilkinson, Krista
2018-04-18
A striking characteristic of the social communication deficits in individuals with autism is atypical patterns of eye contact during social interactions. We used eye-tracking technology to evaluate how the number of human figures depicted and the presence of sharing activity between the human figures in still photographs influenced visual attention by individuals with autism, typical development, or Down syndrome. We sought to examine visual attention to the contents of visual scene displays, a growing form of augmentative and alternative communication support. Eye-tracking technology recorded point-of-gaze while participants viewed 32 photographs in which either 2 or 3 human figures were depicted. Sharing activities between these human figures were either present or absent. The sampling rate was 60 Hz; that is, the technology gathered 60 samples of gaze behavior per second, per participant. Gaze behaviors, including latency to fixate and time spent fixating, were quantified. The overall gaze behaviors were quite similar across groups, regardless of the social content depicted. However, individuals with autism were significantly slower than the other groups in latency to first view the human figures, especially when there were 3 people depicted in the photographs (as compared with 2 people). When participants' own viewing pace was considered, individuals with autism resembled those with Down syndrome. The current study supports the inclusion of social content with various numbers of human figures and sharing activities between human figures into visual scene displays, regardless of the population served. Study design and reporting practices in eye-tracking literature as it relates to autism and Down syndrome are discussed. https://doi.org/10.23641/asha.6066545.
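The record quantifies latency to first fixation and time spent fixating from 60 Hz point-of-gaze samples. A minimal sketch of those two metrics computed directly from raw samples is shown below; the rectangular AOI and the sample-based definition of fixating are simplifying assumptions, not the study's filtering.

```python
# Minimal sketch: latency to first gaze on an AOI and total time spent in the
# AOI, computed from raw 60 Hz point-of-gaze samples. The rectangular AOI and
# sample-based "fixating" definition are simplifying assumptions.
def aoi_latency_and_dwell(samples, aoi, fs_hz=60.0):
    """samples: iterable of (x, y) gaze points; aoi: (x, y, w, h) rectangle."""
    x0, y0, w, h = aoi
    dt = 1.0 / fs_hz
    latency_s, dwell_s = None, 0.0
    for i, (gx, gy) in enumerate(samples):
        inside = x0 <= gx <= x0 + w and y0 <= gy <= y0 + h
        if inside:
            dwell_s += dt
            if latency_s is None:
                latency_s = i * dt
    return latency_s, dwell_s
```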
White, Matthew R; Braund, Heather; Howes, Daniel; Egan, Rylan; Gegenfurtner, Andreas; van Merrienboer, Jeroen J G; Szulewski, Adam
2018-04-23
Crisis resource management skills are integral to leading the resuscitation of a critically ill patient. Despite their importance, crisis resource management skills (and their associated cognitive processes) have traditionally been difficult to study in the real world. The objective of this study was to derive key cognitive processes underpinning expert performance in resuscitation medicine, using a new eye-tracking-based video capture method during clinical cases. During an 18-month period, a sample of 10 trauma resuscitations led by 4 expert trauma team leaders was analyzed. The physician team leaders were outfitted with mobile eye-tracking glasses for each case. After each resuscitation, participants were debriefed with a modified cognitive task analysis, based on a cued-recall protocol, augmented by viewing their own first-person perspective eye-tracking video from the clinical encounter. Eye-tracking technology was successfully applied as a tool to aid in the qualitative analysis of expert performance in a clinical setting. All participants stated that using these methods helped uncover previously unconscious aspects of their cognition. Overall, 5 major themes were derived from the interviews: logistic awareness, managing uncertainty, visual fixation behaviors, selective attendance to information, and anticipatory behaviors. The novel approach of cognitive task analysis augmented by eye tracking allowed the derivation of 5 unique cognitive processes underpinning expert performance in leading a resuscitation. An understanding of these cognitive processes has the potential to enhance educational methods and to create new assessment modalities of these previously tacit aspects of expertise in this field. Copyright © 2018 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
The Biosocial Subject: Sensor Technologies and Worldly Sensibility
ERIC Educational Resources Information Center
de Freitas, Elizabeth
2018-01-01
Sensor technologies are increasingly part of everyday life, embedded in buildings (movement, sound, temperature) and worn on persons (heart rate, electro-dermal activity, eye tracking). This paper presents a theoretical framework for research on computational sensor data. My approach moves away from theories of agent-centered perceptual synthesis…
Advanced Engineering Technology for Measuring Performance.
Rutherford, Drew N; D'Angelo, Anne-Lise D; Law, Katherine E; Pugh, Carla M
2015-08-01
The demand for competency-based assessments in surgical training is growing. Use of advanced engineering technology for clinical skills assessment allows for objective measures of hands-on performance. Clinical performance can be assessed in several ways via quantification of an assessee's hand movements (motion tracking), direction of visual attention (eye tracking), levels of stress (physiologic marker measurements), and location and pressure of palpation (force measurements). Innovations in video recording technology and qualitative analysis tools allow for a combination of observer- and technology-based assessments. Overall the goal is to create better assessments of surgical performance with robust validity evidence. Copyright © 2015 Elsevier Inc. All rights reserved.
Boivin, Michael J; Weiss, Jonathan; Chhaya, Ronak; Seffren, Victoria; Awadu, Jorem; Sikorskii, Alla; Giordani, Bruno
2017-07-01
Tobii eye tracking was compared with webcam-based observer scoring on an animation viewing measure of attention (Early Childhood Vigilance Test; ECVT) to evaluate the feasibility of automating measurement and scoring. Outcomes from both scoring approaches were compared with the Mullen Scales of Early Learning (MSEL), Color-Object Association Test (COAT), and Behavior Rating Inventory of Executive Function for preschool children (BRIEF-P). A total of 44 children 44 to 65 months of age were evaluated with the ECVT, COAT, MSEL, and BRIEF-P. Tobii X2-30 portable infrared cameras were programmed to monitor pupil direction during the ECVT 6-min animation and compared with observer-based PROCODER webcam scoring. Children watched 78% of the cartoon (Tobii) compared with 67% (webcam scoring), although the 2 measures were highly correlated (r = .90, p = .001). It is possible for 2 such measures to be highly correlated even if one is consistently higher than the other (Bergemann et al., 2012). Both ECVT Tobii and webcam ECVT measures significantly correlated with COAT immediate recall (r = .37, p = .02 vs. r = .38, p = .01, respectively) and total recall (r = .33, p = .06 vs. r = .42, p = .005) measures. However, neither the Tobii eye tracking nor PROCODER webcam ECVT measures of attention correlated with MSEL composite cognitive performance or BRIEF-P global executive composite. ECVT scoring using Tobii eye tracking is feasible with at-risk very young African children and consistent with webcam-based scoring approaches in their correspondence to one another and other neurocognitive performance-based measures. By automating measurement and scoring, eye tracking technologies can improve the efficiency and help better standardize ECVT testing of attention in younger children. This holds promise for other neurodevelopmental tests where eye movements, tracking, and gaze length can provide important behavioral markers of neuropsychological and neurodevelopmental processes associated with such tests. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
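The record notes that two measures can be highly correlated even when one is consistently higher. The sketch below illustrates that distinction with made-up numbers: a high Pearson correlation alongside a Bland-Altman-style bias and limits of agreement.

```python
# Correlation measures association, not agreement: two scorers can correlate
# highly while one is systematically higher. Made-up data for illustration.
import numpy as np

tobii = np.array([0.80, 0.75, 0.82, 0.70, 0.78, 0.85])   # proportion of cartoon watched
webcam = tobii - 0.10 + np.random.default_rng(0).normal(0, 0.01, tobii.size)

r = np.corrcoef(tobii, webcam)[0, 1]                      # high despite the constant offset
diff = tobii - webcam
low = diff.mean() - 1.96 * diff.std(ddof=1)
high = diff.mean() + 1.96 * diff.std(ddof=1)
print(f"r = {r:.2f}, mean bias = {diff.mean():.3f}, limits of agreement = {low:.3f} to {high:.3f}")
```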
Before your very eyes: the value and limitations of eye tracking in medical education.
Kok, Ellen M; Jarodzka, Halszka
2017-01-01
Medicine is a highly visual discipline. Physicians from many specialties constantly use visual information in diagnosis and treatment. However, they are often unable to explain how they use this information. Consequently, it is unclear how to train medical students in this visual processing. Eye tracking is a research technique that may offer answers to these open questions, as it enables researchers to investigate such visual processes directly by measuring eye movements. This may help researchers understand the processes that support or hinder a particular learning outcome. In this article, we clarify the value and limitations of eye tracking for medical education researchers. For example, eye tracking can clarify how experience with medical images mediates diagnostic performance and how students engage with learning materials. Furthermore, eye tracking can also be used directly for training purposes by displaying eye movements of experts in medical images. Eye movements reflect cognitive processes, but cognitive processes cannot be directly inferred from eye-tracking data. In order to interpret eye-tracking data properly, theoretical models must always be the basis for designing experiments as well as for analysing and interpreting eye-tracking data. The interpretation of eye-tracking data is further supported by sound experimental design and methodological triangulation. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
Eye/head tracking technology to improve HCI with iPad applications.
Lopez-Basterretxea, Asier; Mendez-Zorrilla, Amaia; Garcia-Zapirain, Begoña
2015-01-22
In order to improve human-computer interaction (HCI) for people with special needs, this paper presents an alternative form of interaction, which uses the iPad's front camera and eye/head tracking technology. With this capability operating in the background, the user can control already developed or new applications for the iPad by moving their eyes and/or head. There are many techniques currently used to detect facial features, such as the eyes or even the face itself. Open-source libraries exist for this purpose, such as OpenCV, which enable very reliable and accurate detection algorithms, such as Haar cascades, to be applied using very high-level programming. All processing is undertaken in real time, and it is therefore important to pay close attention to the use of the limited resources (processing capacity) of devices such as the iPad. The system was validated in tests involving 22 users of different ages and characteristics (people with dark and light-colored eyes and with/without glasses). These tests were performed to assess user/device interaction and to ascertain whether the system works properly. The system obtained an accuracy of between 60% and 100% in the three test exercises taken into consideration. The results showed that the Haar cascade had a significant effect by detecting faces in 100% of cases, unlike the eyes and the pupil, where interference (light and shade) resulted in lower effectiveness. In addition to ascertaining the effectiveness of the system via these exercises, the demo application has also helped to show that user constraints need not affect the enjoyment and use of a particular type of technology. In short, the results obtained are encouraging and these systems may continue to be developed if extended and updated in the future.
Eye/Head Tracking Technology to Improve HCI with iPad Applications
Lopez-Basterretxea, Asier; Mendez-Zorrilla, Amaia; Garcia-Zapirain, Begoña
2015-01-01
In order to improve human-computer interaction (HCI) for people with special needs, this paper presents an alternative form of interaction, which uses the iPad's front camera and eye/head tracking technology. With this capability operating in the background, the user can control already developed or new applications for the iPad by moving their eyes and/or head. There are many techniques currently used to detect facial features, such as the eyes or even the face itself. Open-source libraries exist for this purpose, such as OpenCV, which enable very reliable and accurate detection algorithms, such as Haar cascades, to be applied using very high-level programming. All processing is undertaken in real time, and it is therefore important to pay close attention to the use of the limited resources (processing capacity) of devices such as the iPad. The system was validated in tests involving 22 users of different ages and characteristics (people with dark and light-colored eyes and with/without glasses). These tests were performed to assess user/device interaction and to ascertain whether the system works properly. The system obtained an accuracy of between 60% and 100% in the three test exercises taken into consideration. The results showed that the Haar cascade had a significant effect by detecting faces in 100% of cases, unlike the eyes and the pupil, where interference (light and shade) resulted in lower effectiveness. In addition to ascertaining the effectiveness of the system via these exercises, the demo application has also helped to show that user constraints need not affect the enjoyment and use of a particular type of technology. In short, the results obtained are encouraging and these systems may continue to be developed if extended and updated in the future. PMID:25621603
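Both records above name OpenCV and Haar cascade detection as the underlying technique. The sketch below shows that detection step using OpenCV's bundled cascade files, written here in Python for illustration rather than as on-device iOS code.

```python
# Minimal Haar-cascade face and eye detection with OpenCV, illustrating the
# detection step the records above describe (Python rather than iOS code).
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_face_and_eyes(bgr_frame):
    gray = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        roi = gray[y:y + h, x:x + w]                   # search for eyes inside the face region
        eyes = eye_cascade.detectMultiScale(roi, scaleFactor=1.1, minNeighbors=5)
        results.append({"face": (x, y, w, h),
                        "eyes": [(x + ex, y + ey, ew, eh) for (ex, ey, ew, eh) in eyes]})
    return results
```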
Eye-tracking novice and expert geologist groups in the field and laboratory
NASA Astrophysics Data System (ADS)
Cottrell, R. D.; Evans, K. M.; Jacobs, R. A.; May, B. B.; Pelz, J. B.; Rosen, M. R.; Tarduno, J. A.; Voronov, J.
2010-12-01
We are using an Active Vision approach to learn how novices and expert geologists acquire visual information in the field. The Active Vision approach emphasizes that visual perception is an active process wherein new information is acquired about a particular environment through exploratory eye movements. Eye movements are not only influenced by physical stimuli, but are also strongly influenced by high-level perceptual and cognitive processes. Eye-tracking data were collected on ten novices (undergraduate geology students) and three experts during a 10-day field trip across California focused on neotectonics. In addition, high-resolution panoramic images were captured at each key locality for use in a semi-immersive laboratory environment. Examples of each data type will be presented. The number of observers will be increased in subsequent field trips, but expert/novice differences are already apparent in the first set of individual eye-tracking records, including gaze time, gaze pattern, and object recognition. We will review efforts to quantify these patterns, and development of semi-immersive environments to display geologic scenes. The research is a collaborative effort between Earth scientists, cognitive scientists, and imaging scientists at the University of Rochester and the Rochester Institute of Technology, with funding from the National Science Foundation.
Real time eye tracking using Kalman extended spatio-temporal context learning
NASA Astrophysics Data System (ADS)
Munir, Farzeen; Minhas, Fayyaz ul Amir Asfar; Jalil, Abdul; Jeon, Moongu
2017-06-01
Real time eye tracking has numerous applications in human computer interaction such as a mouse cursor control in a computer system. It is useful for persons with muscular or motion impairments. However, tracking the movement of the eye is complicated by occlusion due to blinking, head movement, screen glare, rapid eye movements, etc. In this work, we present the algorithmic and construction details of a real time eye tracking system. Our proposed system is an extension of Spatio-Temporal context learning through Kalman Filtering. Spatio-Temporal Context Learning offers state of the art accuracy in general object tracking but its performance suffers due to object occlusion. Addition of the Kalman filter allows the proposed method to model the dynamics of the motion of the eye and provide robust eye tracking in cases of occlusion. We demonstrate the effectiveness of this tracking technique by controlling the computer cursor in real time by eye movements.
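The record adds a Kalman filter to a spatio-temporal context tracker so the eye position can be predicted through occlusions such as blinks. The sketch below is a generic constant-velocity Kalman filter for a 2D eye center, with illustrative noise parameters; it is not the authors' full tracker.

```python
# Generic constant-velocity Kalman filter for a 2D eye-center position,
# illustrating how a motion model can bridge occlusions such as blinks.
# Not the full spatio-temporal context tracker described above.
import numpy as np

class EyeKalman:
    def __init__(self, dt=1/30, q=1e-2, r=4.0):
        self.F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
                           [0, 0, 1, 0], [0, 0, 0, 1]], float)   # state: x, y, vx, vy
        self.H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)   # we only measure x, y
        self.Q = q * np.eye(4)
        self.R = r * np.eye(2)
        self.x = np.zeros(4)
        self.P = np.eye(4) * 100.0

    def step(self, measurement=None):
        # predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # update only when the eye is visible; during a blink, coast on the model
        if measurement is not None:
            z = np.asarray(measurement, float)
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ (z - self.H @ self.x)
            self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]          # current estimate of the eye center
```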
Implicit prosody mining based on the human eye image capture technology
NASA Astrophysics Data System (ADS)
Gao, Pei-pei; Liu, Feng
2013-08-01
Eye-tracker technology has become one of the main methods for analyzing recognition issues in human-computer interaction. Human eye image capture is the key problem in eye tracking. Building on this research, a new human-computer interaction method is introduced to enrich the forms of speech synthesis. We propose a method of implicit prosody mining based on human eye image capture technology to extract parameters from images of the eyes during reading, to control and drive prosody generation in speech synthesis, and to establish a prosodic model with high simulation accuracy. The duration model is a key issue for prosody generation. For the duration model, this paper puts forward a new idea: obtaining gaze duration during reading from eye image capture and synchronously controlling this duration and pronunciation duration in speech synthesis. The movement of the eyes during reading is a comprehensive, multi-factor interactive process involving fixations, saccades, and regressions. Therefore, how to extract the appropriate information from eye images needs to be considered, and the regularities of eye gaze need to be obtained as references for modeling. Based on an analysis of three current eye-movement control models and the characteristics of implicit prosody in reading, the relative independence of the text speech-processing system and the eye-movement control system is discussed. It is shown that, under the same text-familiarity condition, gaze duration during reading and the duration of internal voice pronunciation are synchronous. An eye-gaze duration model based on the prosodic structure of Chinese is presented to replace previous methods based on machine learning and probability forecasting, to obtain readers' real internal reading rhythm, and to synthesize speech with personalized rhythm. This research enriches the forms of human-computer interaction and has practical significance and application prospects for assistive speech interaction for people with disabilities. Experiments show that implicit prosody mining based on human eye image capture technology gives the synthesized speech more flexible expression.
Murray, Nicholas P.; Hunfalvay, Melissa; Bolte, Takumi
2017-01-01
Purpose: The purpose of this study was to determine the reliability of interpupillary distance (IPD) and pupil diameter (PD) measures using an infrared eye tracker and central point stimuli. Validity of the test compared to known clinical tools was determined, and normative data was established against which individuals can measure themselves. Methods: Participants (416) across various demographics were examined for normative data. Of these, 50 were examined for reliability and validity. Validity for IPD measured the test (RightEye IPD/PD) against the PL850 Pupilometer and the Essilor Digital CRP. For PD, the test was measured against the Rosenbaum Pocket Vision Screener (RPVS). Reliability was analyzed with intraclass correlation coefficients (ICC) between trials with Cronbach's alpha (CA) and the standard error of measurement for each ICC. Convergent validity was investigated by calculating the bivariate correlation coefficient. Results: Reliability results were strong (CA > 0.7) for all measures. High positive significant correlations were found between the RightEye IPD test and the PL850 Pupilometer (P < 0.001) and Essilor Digital CRP (P < 0.001) and for the RightEye PD test and the RPVS (P < 0.001). Conclusions: Using infrared eye tracking and the RightEye IPD/PD test stimuli, reliable and accurate measures of IPD and PD were found. Results from normative data showed an adequate comparison for people with normal vision development. Translational Relevance: Results revealed a central point of fixation may remove variability in examining PD reliably using infrared eye tracking when consistent environmental and experimental procedures are conducted. PMID:28685104
Eye-tracking of visual attention in web-based assessment using the Force Concept Inventory
NASA Astrophysics Data System (ADS)
Han, Jing; Chen, Li; Fu, Zhao; Fritchman, Joseph; Bao, Lei
2017-07-01
This study used eye-tracking technology to investigate students' visual attention while taking the Force Concept Inventory (FCI) in a web-based interface. Eighty-nine university students were randomly selected into a pre-test group and a post-test group. Students took the 30-question FCI on a computer equipped with an eye-tracker. There were seven weeks of instruction between the pre- and post-test data collection. Students' performance on the FCI improved significantly from pre-test to post-test. Meanwhile, the eye-tracking results reveal that the time students spent on taking the FCI test was not affected by student performance and did not change from pre-test to post-test. Analysis of students' attention to answer choices shows that on the pre-test students primarily focused on the naïve choices and ignored the expert choices. On the post-test, although students had shifted their primary attention to the expert choices, they still kept a high level of attention to the naïve choices, indicating significant conceptual mixing and competition during problem solving. Outcomes of this study provide new insights into students' conceptual development in learning physics.
Fly eye radar or micro-radar sensor technology
NASA Astrophysics Data System (ADS)
Molchanov, Pavlo; Asmolova, Olga
2014-05-01
To compensate for its inability to point its eyes at a target, the fly's eye consists of multiple angularly spaced sensors, giving the fly the wide-area visual coverage it needs to detect and avoid the threats around it. Based on a similar concept, a revolutionary new micro-radar sensor technology is proposed for detecting and tracking ground and/or airborne low-profile, low-altitude targets in harsh urban environments. Distributed along a border or around a protected object (military facilities and buildings, a camp, a stadium), small-size, low-power unattended radar sensors can be used for target detection and tracking, threat warning, and pre-shot sniper protection, and can provide effective support for homeland security. In addition, the technology can provide 3D recognition and target classification because it delivers five orders of magnitude more pulses to each point in space than any scanning radar, by using a few points of view, diversity signals, and intelligent processing. The application of an array of directional antennas eliminates the need for a mechanically scanning antenna or phase processor. It radically decreases radar size and increases bearing accuracy severalfold. The proposed micro-radar sensors can easily be connected to one or several operators by point-to-point, invisible, protected communication. The directional antennas have higher gain, can be multi-frequency, and can be connected to a multi-functional network. Fly-eye micro-radars are inexpensive, can be expendable, and will reduce the cost of defense.
Heathcote, L C; Lau, J Y F; Mueller, S C; Eccleston, C; Fox, E; Bosmans, M; Vervoort, T
2017-02-01
Pain is common and can be debilitating in childhood. Theoretical models propose that attention to pain plays a key role in pain outcomes, however, very little research has investigated this in youth. This study examined how anxiety-related variables and attention control interacted to predict children's attention to pain cues using eye-tracking methodology, and their pain tolerance on the cold pressor test (CPT). Children aged 8-17 years had their eye-gaze tracked whilst they viewed photographs of other children displaying painful facial expressions during the CPT, before completing the CPT themselves. Children also completed self-report measures of anxiety and attention control. Findings indicated that anxiety and attention control did not impact children's initial fixations on pain or neutral faces, but did impact how long they dwelled on pain versus neutral faces. For children reporting low levels of attention control, higher anxiety was associated with less dwell time on pain faces as opposed to neutral faces, and the opposite pattern was observed for children with high attention control. Anxiety and attention control also interacted to predict pain outcomes. For children with low attention control, increasing anxiety was associated with anticipating more pain and tolerating pain for less time. This is the first study to examine children's attention to pain cues using eye-tracking technology in the context of a salient painful experience. Data suggest that attention control is an important moderator of anxiety on multiple outcomes relevant to young people's pain experiences. This study uses eye tracking to study attention to pain cues in children. Attention control is an important moderator of anxiety on attention bias to pain and tolerance of cold pressor pain in youth. © 2016 European Pain Federation - EFIC®.
Gaze-contingent displays: a review.
Duchowski, Andrew T; Cournia, Nathan; Murphy, Hunter
2004-12-01
Gaze-contingent displays (GCDs) attempt to balance the amount of information displayed against the visual information processing capacity of the observer through real-time eye movement sensing. Based on the assumed knowledge of the instantaneous location of the observer's focus of attention, GCD content can be "tuned" through several display processing means. Screen-based displays alter pixel-level information, generally matching the resolvability of the human retina in an effort to maximize bandwidth. Model-based displays alter geometric-level primitives along similar goals. Attentive user interfaces (AUIs) manage object-level entities (e.g., windows, applications) depending on the assumed attentive state of the observer. Such real-time display manipulation is generally achieved through non-contact, unobtrusive tracking of the observer's eye movements. This paper briefly reviews past and present display techniques as well as emerging graphics and eye tracking technology for GCD development.
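As an illustration of the screen-based GCD idea (matching displayed detail to the resolvability of the retina around the point of regard), here is a toy eccentricity-based level-of-detail rule; the falloff constants are assumptions, not values from the review.

```python
# Toy gaze-contingent level-of-detail rule: pick a resolution level from the
# angular eccentricity of each screen region relative to the tracked gaze
# point. The falloff constants are illustrative only.
import math

def lod_for_region(region_center_px, gaze_px, px_per_degree, degrees_per_level=5.0, max_level=4):
    dx = region_center_px[0] - gaze_px[0]
    dy = region_center_px[1] - gaze_px[1]
    eccentricity_deg = math.hypot(dx, dy) / px_per_degree
    return min(max_level, int(eccentricity_deg // degrees_per_level))  # 0 = full resolution

# e.g. a region 600 px from gaze at 35 px/deg is ~17 deg eccentric -> level 3
print(lod_for_region((1200, 400), (600, 400), 35.0))
```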
Advanced Physiological Estimation of Cognitive Status. Part 2
2011-05-24
Neurofeedback algorithms and gaze controller. EEG sensor system (g.USBamp): internal 24-bit ADC and digital signal processor; 16 channels (expandable) ... Subject terms: EEG, eye-tracking, mental state estimation, machine learning. Leonard J. Trejo, Pacific Development and Technology LLC, 999 Commercial St., Palo ... (fatigue, overload). Technology Transfer Opportunity - technology from PDT: methods to acquire various physiological signals (EEG, EOG, EMG, ECG, etc
NASA Technical Reports Server (NTRS)
Grant, Michael P.; Leigh, R. John; Seidman, Scott H.; Riley, David E.; Hanna, Joseph P.
1992-01-01
We compared the ability of eight normal subjects and 15 patients with brainstem or cerebellar disease to follow a moving visual stimulus smoothly with either the eyes alone or with combined eye-head tracking. The visual stimulus was either a laser spot (horizontal and vertical planes) or a large rotating disc (torsional plane), which moved at one sinusoidal frequency for each subject. The visually enhanced vestibulo-ocular reflex (VOR) was also measured in each plane. In the horizontal and vertical planes, we found that if tracking gain (gaze velocity/target velocity) for smooth pursuit was close to 1, the gain of combined eye-head tracking was similar. If the tracking gain during smooth pursuit was less than about 0.7, combined eye-head tracking was usually superior. Most patients, irrespective of diagnosis, showed combined eye-head tracking that was superior to smooth pursuit; only two patients showed the converse. In the torsional plane, in which optokinetic responses were weak, combined eye-head tracking was much superior, and this was the case in both subjects and patients. We found that a linear model, in which an internal ocular tracking signal cancelled the VOR, could account for our findings in most normal subjects in the horizontal and vertical planes, but not in the torsional plane. The model failed to account for tracking behaviour in most patients in any plane, and suggested that the brain may use additional mechanisms to reduce the internal gain of the VOR during combined eye-head tracking. Our results confirm that certain patients who show impairment of smooth-pursuit eye movements preserve their ability to smoothly track a moving target with combined eye-head tracking.
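The record defines tracking gain as gaze velocity divided by target velocity, with gaze velocity equal to eye-in-head velocity plus head velocity during combined eye-head tracking. The sketch below estimates these gains from velocity traces by a least-squares fit; it is a generic estimator for illustration, not the study's analysis.

```python
# Tracking gain as defined above (gaze velocity / target velocity), estimated
# by a least-squares fit of gaze velocity onto target velocity. Generic
# estimator for illustration, not the study's analysis pipeline.
import numpy as np

def tracking_gain(gaze_vel, target_vel):
    gaze_vel, target_vel = np.asarray(gaze_vel), np.asarray(target_vel)
    return float(np.dot(gaze_vel, target_vel) / np.dot(target_vel, target_vel))

# During combined eye-head tracking, gaze velocity is eye-in-head plus head velocity.
def combined_gain(eye_vel, head_vel, target_vel):
    return tracking_gain(np.asarray(eye_vel) + np.asarray(head_vel), target_vel)

# e.g. a sinusoidal target at 0.5 Hz and a pursuit gain of ~0.6 (made-up traces):
t = np.linspace(0, 4, 400)
target = 20 * np.cos(2 * np.pi * 0.5 * t)          # deg/s
print(round(tracking_gain(0.6 * target, target), 2))
```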
Learning and Treatment of Anaphylaxis by Laypeople: A Simulation Study Using Pupilar Technology
Fernandez-Mendez, Felipe; Saez-Gallego, Nieves Maria; Barcala-Furelos, Roberto; Abelairas-Gomez, Cristian; Padron-Cabo, Alexis; Perez-Ferreiros, Alexandra; Garcia-Magan, Carlos; Moure-Gonzalez, Jose; Contreras-Jordan, Onofre; Rodriguez-Nuñez, Antonio
2017-01-01
An anaphylactic shock is a time-critical emergency situation. Decision-making during emergencies is an important responsibility but is difficult to study. Eye-tracking technology allows us to identify the visual patterns involved in decision-making. The aim of this pilot study was to evaluate two training models for the recognition and treatment of anaphylaxis by laypeople, based on expert assessment and eye-tracking technology. A cross-sectional quasi-experimental simulation study was conducted to evaluate the identification and treatment of anaphylaxis. Fifty subjects were randomly assigned to four groups: three groups watching different training videos with content supervised by healthcare personnel, and one control group who received face-to-face training during paediatric practice. To evaluate the learning, a simulation scenario featuring an anaphylaxis victim was designed. A device capturing eye movement, together with expert evaluation, was used to assess performance. The subjects who underwent paediatric face-to-face training achieved better and faster recognition of the anaphylaxis. They also used the adrenaline injector with better precision and fewer mistakes, and they needed a smaller number of visual fixations to recognise the anaphylaxis and to make the decision to inject epinephrine. Analysis of the different video formats yielded mixed results; therefore, they should be tested to evaluate their usability before implementation. PMID:28758128
Advances in Eye Tracking in Infancy Research
ERIC Educational Resources Information Center
Oakes, Lisa M.
2012-01-01
In 2004, McMurray and Aslin edited for "Infancy" a special section on eye tracking. The articles in that special issue revealed the enormous promise of automatic eye tracking with young infants and demonstrated that eye-tracking procedures can provide significant insight into the emergence of cognitive, social, and emotional processing in infancy.…
A Statistical Physics Perspective to Understand Social Visual Attention in Autism Spectrum Disorder.
Liberati, Alessio; Fadda, Roberta; Doneddu, Giuseppe; Congiu, Sara; Javarone, Marco A; Striano, Tricia; Chessa, Alessandro
2017-08-01
This study investigated social visual attention in children with Autism Spectrum Disorder (ASD) and with typical development (TD) in the light of Brockmann and Geisel's model of visual attention. The probability distribution of gaze movements and the clustering of gaze points, registered with eye-tracking technology, were studied during free visual exploration of a gaze stimulus. A data-driven analysis of the distribution of eye movements, in addition to a computational model to simulate group differences, was chosen to overcome possible methodological problems related to the experimenters' subjective expectations about the informative content of the image. Analysis of the eye-tracking data indicated that the scanpaths of children with TD and ASD were characterized by eye movements geometrically equivalent to Lévy flights. Children with ASD showed a higher frequency of long saccadic amplitudes compared with controls. A clustering analysis revealed a greater dispersion of eye movements for these children. Modeling of the results indicated higher values of the model parameter modulating the dispersion of eye movements for children with ASD. Together, the experimental results and the model point to a greater dispersion of gaze points in ASD.
Access to augmentative and alternative communication: new technologies and clinical decision-making.
Fager, Susan; Bardach, Lisa; Russell, Susanne; Higginbotham, Jeff
2012-01-01
Children with severe physical impairments require a variety of access options to augmentative and alternative communication (AAC) and computer technology. Access technologies have continued to develop, allowing children with severe motor control impairments greater independence and access to communication. This article will highlight new advances in access technology, including eye and head tracking, scanning, and access to mainstream technology, as well as discuss future advances. Considerations for clinical decision-making and implementation of these technologies will be presented along with case illustrations.
Infant Eye-Tracking in the Context of Goal-Directed Actions
ERIC Educational Resources Information Center
Corbetta, Daniela; Guan, Yu; Williams, Joshua L.
2012-01-01
This paper presents two methods that we applied to our research to record infant gaze in the context of goal-oriented actions using different eye-tracking devices: head-mounted and remote eye-tracking. For each type of eye-tracking system, we discuss their advantages and disadvantages, describe the particular experimental setups we used to study…
Lu, Shengfu; Xu, Jiying; Li, Mi; Xue, Jia; Lu, Xiaofeng; Feng, Lei; Fu, Bingbing; Wang, Gang; Zhong, Ning; Hu, Bin
2017-10-01
Objective To compare the attentional bias of depressed patients and non-depressed control subjects and examine the effects of age using eye-tracking technology in a free-viewing set of tasks. Methods Patients with major depressive disorder (MDD) and non-depressed control subjects completed an eye-tracking task to assess attentional processing of negative, positive and neutral facial expressions. In this cross-sectional study, the tasks were separated into two types (neutral versus happy faces and neutral versus sad faces) and assessed in two age groups ('young' [18-30 years] and 'middle-aged' [31-55 years]). Results Compared with non-depressed control subjects (n = 75), patients with MDD (n = 90) had a significantly reduced positive attentional bias and an enhanced negative attentional bias irrespective of age. The positive attentional bias in 'middle-aged' patients with MDD was significantly lower than in 'young' patients, although there was no difference between the two age groups in negative attentional bias. Conclusions These results confirm that there are emotional attentional biases in patients with MDD and that positive attentional biases are influenced by age.
The effect of concurrent hand movement on estimated time to contact in a prediction motion task.
Zheng, Ran; Maraj, Brian K V
2018-04-27
In many activities, we need to predict the arrival of an occluded object. This action is called prediction motion or motion extrapolation. Previous researchers have found that both eye tracking and an internal clocking model are involved in the prediction motion task. Additionally, it has been reported that concurrent hand movement facilitates the eye tracking of an externally generated target in a tracking task, even if the target is occluded. The present study examined the effect of concurrent hand movement on the estimated time to contact (TTC) in a prediction motion task. We found that accurate and inaccurate concurrent hand movements had opposite effects on eye-tracking accuracy and estimated TTC in the prediction motion task. That is, accurate concurrent hand tracking enhanced eye-tracking accuracy and tended to increase the precision of estimated TTC, whereas inaccurate concurrent hand tracking decreased eye-tracking accuracy and disrupted estimated TTC. However, eye-tracking accuracy does not determine the precision of estimated TTC.
A new mapping function in table-mounted eye tracker
NASA Astrophysics Data System (ADS)
Tong, Qinqin; Hua, Xiao; Qiu, Jian; Luo, Kaiqing; Peng, Li; Han, Peng
2018-01-01
The eye tracker is a relatively new apparatus for human-computer interaction that has attracted much attention in recent years. Eye-tracking technology obtains the subject's current direction of visual attention (gaze) using mechanical, electronic, optical, image-processing and other means of detection. The mapping function is one of the key elements of the image processing and largely determines the accuracy of the whole eye-tracker system. In this paper, we present a new mapping model based on the geometric relationship among the eyes, the camera and the screen at which the eye gazes. Firstly, according to this geometrical relationship, the framework of the mapping function between the pupil center and the screen coordinates is constructed. Secondly, in order to simplify the vector inversion in the mapping function, the coordinates of the eyes, the camera and the screen are modelled with coaxial coordinate systems. A corresponding experiment was implemented to verify the mapping function, and it was also compared with the traditional quadratic polynomial function. The results show that our approach can improve the accuracy of determining the gaze point. Compared with other methods, this mapping function is simple and valid.
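For reference, the "traditional quadratic polynomial function" that the paper compares against is commonly fitted by least squares from a calibration grid. The sketch below illustrates that baseline only (it is not the authors' new coaxial model), and the calibration data are hypothetical:

```python
# Minimal sketch of second-order polynomial gaze calibration: screen
# coordinates are regressed on pupil-centre coordinates measured while the
# subject fixates known calibration targets.
import numpy as np

def poly2_features(px, py):
    """Quadratic polynomial terms of the pupil-centre coordinates."""
    px, py = np.asarray(px, float), np.asarray(py, float)
    return np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])

def fit_mapping(pupil_xy, screen_xy):
    """Least-squares fit of screen (x, y) as quadratic functions of pupil (x, y)."""
    A = poly2_features(pupil_xy[:, 0], pupil_xy[:, 1])
    coef, *_ = np.linalg.lstsq(A, screen_xy, rcond=None)   # shape (6, 2)
    return coef

def map_gaze(pupil_xy, coef):
    """Apply the fitted mapping to new pupil-centre samples."""
    return poly2_features(pupil_xy[:, 0], pupil_xy[:, 1]) @ coef

# Example with synthetic calibration points (pupil coordinates are hypothetical).
pupil = np.random.rand(9, 2) * 100
true_coef = np.random.rand(6, 2)
screen = poly2_features(pupil[:, 0], pupil[:, 1]) @ true_coef
coef = fit_mapping(pupil, screen)
print(np.allclose(map_gaze(pupil, coef), screen))            # True
```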
Seligman, Sarah C; Giovannetti, Tania
2015-06-01
Mild cognitive impairment (MCI) refers to the intermediate period between the typical cognitive decline of normal aging and more severe decline associated with dementia, and it is associated with greater risk for progression to dementia. Research has suggested that functional abilities are compromised in MCI, but the degree of impairment and underlying mechanisms remain poorly understood. The development of sensitive measures to assess subtle functional decline poses a major challenge for characterizing functional limitations in MCI. Eye-tracking methodology has been used to describe visual processes in everyday, naturalistic action among healthy older adults as well as several case studies of severely impaired individuals, and it has successfully differentiated healthy older adults from those with MCI on specific visual tasks. These studies highlight the promise of eye-tracking technology as a method to characterize subtle functional decline in MCI. However, to date no studies have examined visual behaviors during completion of naturalistic tasks in MCI. This review describes the current understanding of functional ability in MCI, summarizes findings of eye-tracking studies in healthy individuals, severe impairment, and MCI, and presents future research directions to aid with early identification and prevention of functional decline in disorders of aging.
ERIC Educational Resources Information Center
Mabila, Jabulisiwe; Gelderblom, Helene; Ssemugabi, Samuel
2014-01-01
The internet gives individuals access to learning through online technologies. The prolific use of Learning Management Systems (LMSs) in higher education institutions makes Information and Communication Technology (ICT) skills or e-skills very important. ICT skill levels have been positively related to students' effectiveness and efficiency in…
Design of integrated eye tracker-display device for head mounted systems
NASA Astrophysics Data System (ADS)
David, Y.; Apter, B.; Thirer, N.; Baal-Zedaka, I.; Efron, U.
2009-08-01
We propose an eye tracker/display system based on a novel dual-function device, termed ETD, which shares the optical paths of the eye tracker and the display and supports on-chip processing. The proposed ETD design is based on a CMOS chip combining Liquid-Crystal-on-Silicon (LCoS) micro-display technology with a near-infrared (NIR) Active Pixel Sensor imager. In eye-tracking operation, the device captures the NIR light back-reflected from the eye's retina. The retinal image is then used to detect the current direction of the eye's gaze. The design of the eye-tracking imager is based on the "deep p-well" pixel technology, providing low crosstalk while shielding the active pixel circuitry, which serves the imaging and the display drivers, from the photo charges generated in the substrate. The use of the ETD in the HMD design enables a very compact design suitable for Smart Goggle applications. A preliminary optical, electronic and digital design of the goggle and its associated ETD chip and digital control is presented.
MR-Compatible Integrated Eye Tracking System
2016-03-10
This instrumentation grant was used to purchase a state-of-the-art, high-resolution video eye tracker for research on video eye tracking, eye movements, and visual search, including camouflage-breaking.
Billeci, L; Narzisi, A; Campatelli, G; Crifaci, G; Calderoni, S; Gagliano, A; Calzone, C; Colombi, C; Pioggia, G; Muratori, F
2016-05-17
Joint attention (JA), whose deficit is an early risk marker for autism spectrum disorder (ASD), has two dimensions: (1) responding to JA and (2) initiating JA. Eye-tracking technology has largely been used to investigate responding JA, but rarely to study initiating JA, especially in young children with ASD. The aim of this study was to describe the differences in the visual patterns of toddlers with ASD and those with typical development (TD) during both responding JA and initiating JA tasks. Eye-tracking technology was used to monitor the gaze of 17 children with ASD and 15 age-matched children with TD during the presentation of short video sequences involving one responding JA and two initiating JA tasks (initiating JA-1 and initiating JA-2). Gaze accuracy, transitions and fixations were analyzed. No differences were found in the responding JA task between children with ASD and those with TD, whereas, in the initiating JA tasks, different patterns of fixations and transitions were shown between the groups. These results suggest that children with ASD and those with TD show different visual patterns when they are expected to initiate joint attention but not when they respond to joint attention. We hypothesized that the differences in transitions and fixations are linked to ASD impairments in visual disengagement from faces, in global scanning of the scene and in the ability to anticipate the object's action.
Eye-Tracking Study of Complexity in Gas Law Problems
ERIC Educational Resources Information Center
Tang, Hui; Pienta, Norbert
2012-01-01
This study, part of a series investigating students' use of online tools to assess problem solving, uses eye-tracking hardware and software to explore the effect of problem difficulty and cognitive processes when students solve gas law word problems. Eye movements are indices of cognition; eye-tracking data typically include the location,…
Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI
2017-06-01
Annual report for Award Number W81XWH-13-1-0095, "Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI" (Psychological Health research area), covering the period 08 MAR 2016 – 07 MAR 2017. No supporting data are included in this report.
Active eye-tracking for an adaptive optics scanning laser ophthalmoscope
Sheehy, Christy K.; Tiruveedhula, Pavan; Sabesan, Ramkumar; Roorda, Austin
2015-01-01
We demonstrate a system that combines a tracking scanning laser ophthalmoscope (TSLO) and an adaptive optics scanning laser ophthalmoscope (AOSLO) system resulting in both optical (hardware) and digital (software) eye-tracking capabilities. The hybrid system employs the TSLO for active eye-tracking at a rate up to 960 Hz for real-time stabilization of the AOSLO system. AOSLO videos with active eye-tracking signals showed, at most, an amplitude of motion of 0.20 arcminutes for horizontal motion and 0.14 arcminutes for vertical motion. Subsequent real-time digital stabilization limited residual motion to an average of only 0.06 arcminutes (a 95% reduction). By correcting for high amplitude, low frequency drifts of the eye, the active TSLO eye-tracking system enabled the AOSLO system to capture high-resolution retinal images over a larger range of motion than previously possible with just the AOSLO imaging system alone. PMID:26203370
Kotani, Manato; Shimono, Kohei; Yoneyama, Toshihiro; Nakako, Tomokazu; Matsumoto, Kenji; Ogi, Yuji; Konoike, Naho; Nakamura, Katsuki; Ikeda, Kazuhito
2017-09-01
Eye-tracking systems are used to investigate eye position and gaze patterns, such as those presumed to reflect eye contact, in humans. Eye contact is a useful biomarker of social communication and is known to be deficient in patients with autism spectrum disorders (ASDs). Interestingly, the same eye-tracking systems have been used to directly compare face-scanning patterns in some non-human primates to those in humans. Thus, eye tracking is expected to be a useful translational technique for investigating not only social attention and visual interest, but also the effects of psychiatric drugs, such as oxytocin, a neuropeptide that regulates social behavior. In this study, we report a newly established method for eye tracking in common marmosets, New World primates that, like humans, use eye contact as a means of communication. Our investigation was aimed at characterizing these primates' face-scanning patterns and evaluating the effects of oxytocin on their eye-contact behavior. We found that normal common marmosets spend more time viewing the eye region in a picture of a common marmoset than the mouth region or a scrambled picture. In the oxytocin experiment, the change in eyes/face ratio was significantly greater in the oxytocin group than in the vehicle group. Moreover, the oxytocin-induced increase in the change in eyes/face ratio was completely blocked by the oxytocin receptor antagonist L-368,899. These results indicate that eye tracking in common marmosets may be useful for evaluating drug candidates targeting psychiatric conditions, especially ASDs. Copyright © 2017 Elsevier Ltd. All rights reserved.
Samadani, Uzma; Farooq, Sameer; Ritlop, Robert; Warren, Floyd; Reyes, Marleen; Lamm, Elizabeth; Alex, Anastasia; Nehrbass, Elena; Kolecki, Radek; Jureller, Michael; Schneider, Julia; Chen, Agnes; Shi, Chen; Mendhiratta, Neil; Huang, Jason H.; Qian, Meng; Kwak, Roy; Mikheev, Artem; Rusinek, Henry; George, Ajax; Fergus, Robert; Kondziolka, Douglas; Huang, Paul P.; Smith, R. Theodore
2015-01-01
OBJECT Automated eye movement tracking may provide clues to nervous system function at many levels. Spatial calibration of the eye tracking device requires the subject to have relatively intact ocular motility that implies function of cranial nerves (CNs) III (oculomotor), IV (trochlear), and VI (abducent) and their associated nuclei, along with the multiple regions of the brain imparting cognition and volition. The authors have developed a technique for eye tracking that uses temporal rather than spatial calibration, enabling detection of impaired ability to move the pupil relative to normal (neurologically healthy) control volunteers. This work was performed to demonstrate that this technique may detect CN palsies related to brain compression and to provide insight into how the technique may be of value for evaluating neuropathological conditions associated with CN palsy, such as hydrocephalus or acute mass effect. METHODS The authors recorded subjects’ eye movements by using an Eyelink 1000 eye tracker sampling at 500 Hz over 200 seconds while the subject viewed a music video playing inside an aperture on a computer monitor. The aperture moved in a rectangular pattern over a fixed time period. This technique was used to assess ocular motility in 157 neurologically healthy control subjects and 12 patients with either clinical CN III or VI palsy confirmed by neuro-ophthalmological examination, or surgically treatable pathological conditions potentially impacting these nerves. The authors compared the ratio of vertical to horizontal eye movement (height/width defined as aspect ratio) in normal and test subjects. RESULTS In 157 normal controls, the aspect ratio (height/width) for the left eye had a mean value ± SD of 1.0117 ± 0.0706. For the right eye, the aspect ratio had a mean of 1.0077 ± 0.0679 in these 157 subjects. There was no difference between sexes or ages. A patient with known CN VI palsy had a significantly increased aspect ratio (1.39), whereas 2 patients with known CN III palsy had significantly decreased ratios of 0.19 and 0.06, respectively. Three patients with surgically treatable pathological conditions impacting CN VI, such as infratentorial mass effect or hydrocephalus, had significantly increased ratios (1.84, 1.44, and 1.34, respectively) relative to normal controls, and 6 patients with supratentorial mass effect had significantly decreased ratios (0.27, 0.53, 0.62, 0.45, 0.49, and 0.41, respectively). These alterations in eye tracking all reverted to normal ranges after surgical treatment of underlying pathological conditions in these 9 neurosurgical cases. CONCLUSIONS This proof of concept series of cases suggests that the use of eye tracking to detect CN palsy while the patient watches television or its equivalent represents a new capacity for this technology. It may provide a new tool for the assessment of multiple CNS functions that can potentially be useful in the assessment of awake patients with elevated intracranial pressure from hydrocephalus or trauma. PMID:25495739
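The following sketch illustrates an aspect-ratio metric in the spirit of the one described above (the ratio of vertical to horizontal extent of pupil movement while the aperture traverses a rectangular path). The authors' exact computation may differ; the percentile-based range is my own choice to reduce sensitivity to blinks and outliers.

```python
# Minimal sketch of a height/width aspect ratio computed from a pupil
# trajectory recorded while watching a stimulus moving in a rectangular path.
import numpy as np

def aspect_ratio(x: np.ndarray, y: np.ndarray, lo=2.5, hi=97.5) -> float:
    """Height/width of the region covered by the pupil trajectory."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    valid = np.isfinite(x) & np.isfinite(y)            # drop blink samples (NaN)
    width = np.percentile(x[valid], hi) - np.percentile(x[valid], lo)
    height = np.percentile(y[valid], hi) - np.percentile(y[valid], lo)
    return height / width

# Example: a synthetic 200 s recording; values near 1.0 are in line with the
# healthy-control means reported above.
t = np.linspace(0, 2 * np.pi, 200 * 500)
x, y = np.cos(t), 1.01 * np.sin(t)
print(round(aspect_ratio(x, y), 3))
```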
Investigating Hypervigilance for Social Threat of Lonely Children
ERIC Educational Resources Information Center
Qualter, Pamela; Rotenberg, Ken; Barrett, Louise; Henzi, Peter; Barlow, Alexandra; Stylianou, Maria; Harris, Rebecca A.
2013-01-01
The hypothesis that lonely children show hypervigilance for social threat was examined in a series of three studies that employed different methods including advanced eye-tracking technology. Hypervigilance for social threat was operationalized as hostility to ambiguously motivated social exclusion in a variation of the hostile attribution…
Barsingerhorn, A D; Boonstra, F N; Goossens, H H L M
2017-02-01
Current stereo eye-tracking methods model the cornea as a sphere with one refractive surface. However, the human cornea is slightly aspheric and has two refractive surfaces. Here we used ray-tracing and the Navarro eye model to study how these optical properties affect the accuracy of different stereo eye-tracking methods. We found that pupil size, gaze direction and head position all influence the reconstruction of gaze. The resulting errors are on the order of ±1.0 degrees even in the best case. This shows that stereo eye-tracking may be an option if reliable calibration is not possible, but the applied eye model should account for the actual optics of the cornea.
Kolecki, Radek; Dammavalam, Vikalpa; Bin Zahid, Abdullah; Hubbard, Molly; Choudhry, Osamah; Reyes, Marleen; Han, ByoungJun; Wang, Tom; Papas, Paraskevi Vivian; Adem, Aylin; North, Emily; Gilbertson, David T; Kondziolka, Douglas; Huang, Jason H; Huang, Paul P; Samadani, Uzma
2018-03-01
OBJECTIVE The precise threshold differentiating normal and elevated intracranial pressure (ICP) is variable among individuals. In the context of several pathophysiological conditions, elevated ICP leads to abnormalities in global cerebral functioning and impacts the function of cranial nerves (CNs), either or both of which may contribute to ocular dysmotility. The purpose of this study was to assess the impact of elevated ICP on eye-tracking performed while patients were watching a short film clip. METHODS Awake patients requiring placement of an ICP monitor for clinical purposes underwent eye tracking while watching a 220-second continuously playing video moving around the perimeter of a viewing monitor. Pupil position was recorded at 500 Hz and metrics associated with each eye individually and both eyes together were calculated. Linear regression with generalized estimating equations was performed to test the association of eye-tracking metrics with changes in ICP. RESULTS Eye tracking was performed at ICP levels ranging from -3 to 30 mm Hg in 23 patients (12 women, 11 men, mean age 46.8 years) on 55 separate occasions. Eye-tracking measures correlating with CN function linearly decreased with increasing ICP (p < 0.001). Measures for CN VI were most prominently affected. The area under the curve (AUC) for eye-tracking metrics to discriminate between ICP < 12 and ≥ 12 mm Hg was 0.798. To discriminate an ICP < 15 from ≥ 15 mm Hg the AUC was 0.833, and to discriminate ICP < 20 from ≥ 20 mm Hg the AUC was 0.889. CONCLUSIONS Increasingly elevated ICP was associated with increasingly abnormal eye tracking detected while patients were watching a short film clip. These results suggest that eye tracking may be used as a noninvasive, automatable means to quantitate the physiological impact of elevated ICP, which has clinical application for assessment of shunt malfunction, pseudotumor cerebri, concussion, and prevention of second-impact syndrome.
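The kind of threshold analysis reported above (how well a single eye-tracking metric discriminates recordings below versus at/above an ICP cut-off, summarised by the ROC AUC) can be illustrated as follows. The data here are synthetic and the simple AUC computation stands in for the fuller regression analysis the authors describe:

```python
# Minimal sketch: ROC AUC of an eye-tracking metric for discriminating ICP
# below versus at/above a given cut-off, using made-up session data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
icp = rng.uniform(-3, 30, size=55)                       # mm Hg, one per session
metric = 1.0 - 0.02 * icp + rng.normal(0, 0.15, 55)      # decreases with ICP

for cutoff in (12, 15, 20):
    labels = (icp >= cutoff).astype(int)
    # The metric decreases with ICP, so its negative serves as the "score".
    auc = roc_auc_score(labels, -metric)
    print(f"AUC for ICP >= {cutoff} mm Hg: {auc:.3f}")
```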
NASA Astrophysics Data System (ADS)
Maguen, Ezra I.; Nesburn, Anthony B.; Salz, James J.
2000-06-01
A study was undertaken to assess the safety and efficacy of LASIK with the LADARVision laser by Autonomous Technologies (Orlando, FL). The study included four subsets: spherical myopia up to -11.00 D, spherical hyperopia up to +6.00 D, and myopic and hyperopic astigmatism, each with up to 6.00 D of astigmatism. A total of 105 patients participated. Sixty-six patients were myopic and 39 were hyperopic. The mean (+/- SD) age was 42.8 +/- 9.3 years for myopia and 53.2 +/- 9.9 years for hyperopia. At 3 months postoperatively, 61 myopic eyes were available for evaluation. Uncorrected visual acuity was 20/20 in 70% of eyes and 20/40 in 92.9% of all eyes. The refractive outcome was within +/- 0.50 D in 73.8% of eyes and within +/- 1.00 D in 96.7% of eyes. Thirty-eight hyperopic eyes were available. Uncorrected visual acuity was 20/20 in 42.1% of eyes and 20/40 in 88% of all eyes. The refractive outcome was within +/- 0.50 D in 57.9% of eyes and within +/- 1.00 D in 86.8% of eyes. Complications were not sight-threatening and are discussed in detail. LASIK with the LADARVision laser appears to be safe and effective.
NASA Technical Reports Server (NTRS)
Kornilova, L. N.; Naumov, I. A.; Azarov, K. A.; Sagalovitch, S. V.; Reschke, Millard F.; Kozlovskaya, I. B.
2007-01-01
The vestibular function and tracking eye movements were investigated in 12 Russian crew members of ISS missions on days 1(2), 4(5-6), and 8(9-10) after prolonged exposure to microgravity (126 to 195 days). The spontaneous oculomotor activity, static torsional otolith-cervico-ocular reflex, dynamic vestibulo-cervico-ocular responses, vestibular reactivity, tracking eye movements, and gaze-holding were studied using videooculography (VOG) and electrooculography (EOG) for parallel eye movement recording. On post-flight days 1-2 (R+1-2) some cosmonauts demonstrated: - an increased spontaneous oculomotor activity (floating eye movements, spontaneous nystagmus of the typical and atypical form, square wave jerks, gaze nystagmus) with the head held in the vertical position; - suppressed otolith function (absent or reduced by one half amplitude of torsional compensatory eye counter-rolling) with the head inclined statically right- or leftward by 30°; - increased vestibular reactivity (lowered threshold and increased intensity of the vestibular nystagmus) during head turns around the longitudinal body axis at 0.125 Hz; - a significant change in the accuracy, velocity, and temporal characteristics of the eye tracking. The pattern, depth, dynamics, and velocity of recovery of the vestibular function and tracking eye movements varied with individual participants in the investigation. However, there were also regular responses during readaptation to normal gravity: - suppression of the otolith function was typically accompanied by an exaggerated vestibular reactivity; - the structure of visual tracking (the accuracy of fixational eye rotations, smooth tracking, and gaze-holding) was disturbed (the appearance of correcting saccades, the transition of smooth tracking to saccadic tracking) only in those cosmonauts who, in parallel to an increased reactivity of the vestibular input, also had central changes in the oculomotor system (spontaneous nystagmus, gaze nystagmus).
Real-time eye motion correction in phase-resolved OCT angiography with tracking SLO
Braaf, Boy; Vienola, Kari V.; Sheehy, Christy K.; Yang, Qiang; Vermeer, Koenraad A.; Tiruveedhula, Pavan; Arathorn, David W.; Roorda, Austin; de Boer, Johannes F.
2012-01-01
In phase-resolved OCT angiography blood flow is detected from phase changes in between A-scans that are obtained from the same location. In ophthalmology, this technique is vulnerable to eye motion. We address this problem by combining inter-B-scan phase-resolved OCT angiography with real-time eye tracking. A tracking scanning laser ophthalmoscope (TSLO) at 840 nm provided eye tracking functionality and was combined with a phase-stabilized optical frequency domain imaging (OFDI) system at 1040 nm. Real-time eye tracking corrected eye drift and prevented discontinuity artifacts from (micro)saccadic eye motion in OCT angiograms. This improved the OCT spot stability on the retina and consequently reduced the phase-noise, thereby enabling the detection of slower blood flows by extending the inter-B-scan time interval. In addition, eye tracking enabled the easy compounding of multiple data sets from the fovea of a healthy volunteer to create high-quality eye motion artifact-free angiograms. High-quality images are presented of two distinct layers of vasculature in the retina and the dense vasculature of the choroid. Additionally we present, for the first time, a phase-resolved OCT angiogram of the mesh-like network of the choriocapillaris containing typical pore openings. PMID:23304647
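The core flow-contrast step described above (detecting flow from phase changes between repeated A-scans at the same location) can be sketched as follows. This is heavily simplified; real pipelines also apply bulk-motion and phase-noise corrections, and the variable names are illustrative:

```python
# Minimal sketch of phase-resolved flow contrast: the phase change per depth
# pixel between two complex-valued A-scans acquired at the same location.
import numpy as np

def phase_difference_contrast(ascan_1: np.ndarray, ascan_2: np.ndarray) -> np.ndarray:
    """Absolute wrapped phase change per depth pixel between repeated A-scans."""
    # angle(a2 * conj(a1)) gives the phase difference at each depth.
    return np.abs(np.angle(ascan_2 * np.conj(ascan_1)))

# Example: static tissue (near-zero phase change) versus voxels with flow.
depth = 512
static = np.exp(1j * np.random.uniform(-np.pi, np.pi, depth))
moving = static * np.exp(1j * 0.8)           # 0.8 rad Doppler shift per voxel
print(phase_difference_contrast(static, static).max())    # ~0
print(phase_difference_contrast(static, moving).mean())   # ~0.8
```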
King, Andrew J.; Hochheiser, Harry; Visweswaran, Shyam; Clermont, Gilles; Cooper, Gregory F.
2017-01-01
Eye-tracking is a valuable research tool that is used in laboratory and limited field environments. We take steps toward developing methods that enable widespread adoption of eye-tracking and its real-time application in clinical decision support. Eye-tracking will enhance awareness and enable intelligent views, more precise alerts, and other forms of decision support in the Electronic Medical Record (EMR). We evaluated a low-cost eye-tracking device and found the device’s accuracy to be non-inferior to a more expensive device. We also developed and evaluated an automatic method for mapping eye-tracking data to interface elements in the EMR (e.g., a displayed laboratory test value). Mapping was 88% accurate across the six participants in our experiment. Finally, we piloted the use of the low-cost device and the automatic mapping method to label training data for a Learning EMR (LEMR) which is a system that highlights the EMR elements a physician is predicted to use. PMID:28815151
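The mapping of eye-tracking data to interface elements described above boils down to associating each gaze sample with the screen region of a displayed element. The sketch below shows the basic bounding-box hit-testing idea only; the element names and layout are hypothetical and the study's automatic method is more involved:

```python
# Minimal sketch: map gaze samples to on-screen EMR elements by bounding-box
# hit testing.
from typing import Optional

# Each interface element as (name, left, top, right, bottom) in pixels.
ELEMENTS = [
    ("lab:creatinine", 100, 200, 400, 230),
    ("lab:potassium", 100, 235, 400, 265),
    ("med:insulin", 500, 200, 800, 230),
]

def element_at(x: float, y: float) -> Optional[str]:
    """Return the name of the element containing the gaze point, if any."""
    for name, left, top, right, bottom in ELEMENTS:
        if left <= x <= right and top <= y <= bottom:
            return name
    return None

def label_gaze_stream(samples):
    """Map a sequence of (x, y) gaze samples to element names (or None)."""
    return [element_at(x, y) for x, y in samples]

print(label_gaze_stream([(150, 210), (600, 215), (50, 50)]))
# ['lab:creatinine', 'med:insulin', None]
```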
The Effectiveness of Gaze-Contingent Control in Computer Games.
Orlov, Paul A; Apraksin, Nikolay
2015-01-01
Eye-tracking technology and gaze-contingent control in human-computer interaction have become an objective reality. This article reports on a series of eye-tracking experiments in which we concentrated on one aspect of gaze-contingent interaction: its effectiveness compared with mouse-based control in a computer strategy game. We propose a measure for evaluating the effectiveness of interaction based on the "time of recognition" of a game unit. In this article, we use this measure to compare gaze- and mouse-contingent systems, and we present an analysis of the differences as a function of the number of game units. Our results indicate that the performance of gaze-contingent interaction is typically higher than that of mouse manipulation in a visual search task. When tested on 60 subjects, the results showed that the effectiveness of gaze-contingent systems was over 1.5 times higher. In addition, we found that eye behavior stays quite stable with or without mouse interaction. © The Author(s) 2015.
Spinks, Jean; Mortimer, Duncan
2016-02-03
The provision of additional information is often assumed to improve consumption decisions, allowing consumers to more accurately weigh the costs and benefits of alternatives. However, increasing the complexity of decision problems may prompt changes in information processing. This is particularly relevant for experimental methods such as discrete choice experiments (DCEs) where the researcher can manipulate the complexity of the decision problem. The primary aims of this study are (i) to test whether consumers actually process additional information in an already complex decision problem, and (ii) consider the implications of any such 'complexity-driven' changes in information processing for design and analysis of DCEs. A discrete choice experiment (DCE) is used to simulate a complex decision problem; here, the choice between complementary and conventional medicine for different health conditions. Eye-tracking technology is used to capture the number of times and the duration that a participant looks at any part of a computer screen during completion of DCE choice sets. From this we can analyse what has become known in the DCE literature as 'attribute non-attendance' (ANA). Using data from 32 participants, we model the likelihood of ANA as a function of choice set complexity and respondent characteristics using fixed and random effects models to account for repeated choice set completion. We also model whether participants are consistent with regard to which characteristics (attributes) they consider across choice sets. We find that complexity is the strongest predictor of ANA when other possible influences, such as time pressure, ordering effects, survey specific effects and socio-demographic variables (including proxies for prior experience with the decision problem) are considered. We also find that most participants do not apply a consistent information processing strategy across choice sets. Eye-tracking technology shows promise as a way of obtaining additional information from consumer research, improving DCE design, and informing the design of policy measures. With regards to DCE design, results from the present study suggest that eye-tracking data can identify the point at which adding complexity (and realism) to DCE choice scenarios becomes self-defeating due to unacceptable increases in ANA. Eye-tracking data therefore has clear application in the construction of guidelines for DCE design and during piloting of DCE choice scenarios. With regards to design of policy measures such as labelling requirements for CAM and conventional medicines, the provision of additional information has the potential to make difficult decisions even harder and may not have the desired effect on decision-making.
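Detecting attribute non-attendance (ANA) from eye-tracking, as described above, typically amounts to checking whether fixations landed on the screen region of each attribute in a choice set. The sketch below is an illustrative operationalisation only; the zero/low dwell-time criterion and data layout are assumptions, and studies differ in how ANA is defined:

```python
# Minimal sketch: flag attributes as non-attended when total fixation duration
# on their area of interest (AOI) does not exceed a threshold.
from collections import defaultdict

def attribute_dwell_times(fixations):
    """Sum fixation durations (ms) per attribute AOI for one choice set.

    `fixations` is an iterable of (aoi_name, duration_ms) pairs, where aoi_name
    is the attribute whose screen region the fixation landed in (or None).
    """
    dwell = defaultdict(float)
    for aoi, duration in fixations:
        if aoi is not None:
            dwell[aoi] += duration
    return dict(dwell)

def non_attended(fixations, attributes, min_dwell_ms=0.0):
    """Attributes whose total dwell time does not exceed `min_dwell_ms`."""
    dwell = attribute_dwell_times(fixations)
    return [a for a in attributes if dwell.get(a, 0.0) <= min_dwell_ms]

attributes = ["price", "effectiveness", "side_effects", "practitioner"]
fixations = [("price", 310), ("effectiveness", 650), ("price", 120), (None, 90)]
print(non_attended(fixations, attributes))   # ['side_effects', 'practitioner']
```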
Young and Older Adults' Reading of Distracters
ERIC Educational Resources Information Center
Kemper, Susan; Mcdowd, Joan; Metcalf, Kim; Liu, Chiung-Ju
2008-01-01
Eye-tracking technology was employed to examine young and older adults' performance in the reading with distraction paradigm. Distracters of 1, 2, and 4 words that formed meaningful phrases were used. There were marked age differences in fixation patterns. Young adults' fixations to the distracters and targets increased with distracter length.…
New and Improved: Security Goes High-Tech.
ERIC Educational Resources Information Center
Gamble, Cheryl
2002-01-01
Explains the technology of biometrics, the science of identifying a person by unique physical characteristics, and its application in the fight against terrorism. Argues that biometrics, such as hand readers, fingerprint readers, and eye scans, are reliable and efficient. Also describes proximity cards, digital tracking systems, and smart cards.…
Kelly, Brendan S; Rainford, Louise A; Darcy, Sarah P; Kavanagh, Eoin C; Toomey, Rachel J
2016-07-01
Purpose To investigate the development of chest radiograph interpretation skill through medical training by measuring both diagnostic accuracy and eye movements during visual search. Materials and Methods An institutional exemption from full ethical review was granted for the study. Five consultant radiologists were deemed the reference expert group, and four radiology registrars, five senior house officers (SHOs), and six interns formed four clinician groups. Participants were shown 30 chest radiographs, 14 of which had a pneumothorax, and were asked to give their level of confidence as to whether a pneumothorax was present. Receiver operating characteristic (ROC) curve analysis was carried out on diagnostic decisions. Eye movements were recorded with a Tobii TX300 (Tobii Technology, Stockholm, Sweden) eye tracker. Four eye-tracking metrics were analyzed. Variables were compared to identify any differences between groups. All data were compared by using the Friedman nonparametric method. Results The average area under the ROC curve for the groups increased with experience (0.947 for consultants, 0.792 for registrars, 0.693 for SHOs, and 0.659 for interns; P = .009). A significant difference in diagnostic accuracy was found between consultants and registrars (P = .046). All four eye-tracking metrics decreased with experience, and there were significant differences between registrars and SHOs. Total reading time decreased with experience; it was significantly lower for registrars compared with SHOs (P = .046) and for SHOs compared with interns (P = .025). Conclusion Chest radiograph interpretation skill increased with experience, both in terms of diagnostic accuracy and visual search. The observed level of experience at which there was a significant difference was higher for diagnostic accuracy than for eye-tracking metrics. © RSNA, 2016. Online supplemental material is available for this article.
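A Friedman comparison of the kind mentioned above can be sketched as follows, treating each radiograph as a repeated "block" on which every experience group has a summary eye-tracking value (for example, mean total reading time). The exact structuring of the authors' comparison is assumed here and the numbers are synthetic:

```python
# Minimal sketch: Friedman nonparametric comparison of an eye-tracking metric
# across experience groups, blocked by radiograph.
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(1)
n_radiographs = 30
# Mean total reading time (s) per radiograph for each experience group.
consultants = rng.normal(12, 2, n_radiographs)
registrars = rng.normal(15, 2, n_radiographs)
shos = rng.normal(20, 3, n_radiographs)
interns = rng.normal(24, 3, n_radiographs)

stat, p = friedmanchisquare(consultants, registrars, shos, interns)
print(f"Friedman chi-square = {stat:.1f}, p = {p:.3g}")
```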
Scanning mid-IR laser apparatus with eye tracking for refractive surgery
NASA Astrophysics Data System (ADS)
Telfair, William B.; Yoder, Paul R., Jr.; Bekker, Carsten; Hoffman, Hanna J.; Jensen, Eric F.
1999-06-01
A robust, real-time, dynamic eye tracker has been integrated with the short pulse mid-infrared laser scanning delivery system previously described. This system employs a Q- switched Nd:YAG laser pumped optical parametric oscillator operating at 2.94 micrometers. Previous ablation studies on human cadaver eyes and in-vivo cat eyes demonstrated very smooth ablations with extremely low damage levels similar to results with an excimer. A 4-month healing study with cats indicated no adverse healing effects. In order to treat human eyes, the tracker is required because the eyes move during the procedure due to both voluntary and involuntary motions such as breathing, heartbeat, drift, loss of fixation, saccades and microsaccades. Eye tracking techniques from the literature were compared. A limbus tracking system was best for this application. Temporal and spectral filtering techniques were implemented to reduce tracking errors, reject stray light, and increase signal to noise ratio. The expanded-capability system (IRVision AccuScan 2000 Laser System) has been tested in the lab on simulated eye targets, glass eyes, cadaver eyes, and live human subjects. Circular targets ranging from 10-mm to 14-mm diameter were successfully tracked. The tracker performed beyond expectations while the system performed myopic photorefractive keratectomy procedures on several legally blind human subjects.
Payne, Hannah L
2017-01-01
Eye movements provide insights about a wide range of brain functions, from sensorimotor integration to cognition; hence, the measurement of eye movements is an important tool in neuroscience research. We describe a method, based on magnetic sensing, for measuring eye movements in head-fixed and freely moving mice. A small magnet was surgically implanted on the eye, and changes in the magnet angle as the eye rotated were detected by a magnetic field sensor. Systematic testing demonstrated high resolution measurements of eye position of <0.1°. Magnetic eye tracking offers several advantages over the well-established eye coil and video-oculography methods. Most notably, it provides the first method for reliable, high-resolution measurement of eye movements in freely moving mice, revealing increased eye movements and altered binocular coordination compared to head-fixed mice. Overall, magnetic eye tracking provides a lightweight, inexpensive, easily implemented, and high-resolution method suitable for a wide range of applications. PMID:28872455
Samadani, Uzma; Ritlop, Robert; Reyes, Marleen; Nehrbass, Elena; Li, Meng; Lamm, Elizabeth; Schneider, Julia; Shimunov, David; Sava, Maria; Kolecki, Radek; Burris, Paige; Altomare, Lindsey; Mehmood, Talha; Smith, Theodore; Huang, Jason H; McStay, Christopher; Todd, S Rob; Qian, Meng; Kondziolka, Douglas; Wall, Stephen; Huang, Paul
2015-04-15
Disconjugate eye movements have been associated with traumatic brain injury since ancient times. Ocular motility dysfunction may be present in up to 90% of patients with concussion or blast injury. We developed an algorithm for eye tracking in which the Cartesian coordinates of the right and left pupils are tracked over 200 sec and compared to each other as a subject watches a short film clip moving inside an aperture on a computer screen. We prospectively eye tracked 64 normal healthy noninjured control subjects and compared findings to 75 trauma subjects with either a positive head computed tomography (CT) scan (n=13), negative head CT (n=39), or nonhead injury (n=23) to determine whether eye tracking would reveal the disconjugate gaze associated with both structural brain injury and concussion. Tracking metrics were then correlated to the clinical concussion measure Sport Concussion Assessment Tool 3 (SCAT3) in trauma patients. Five out of five measures of horizontal disconjugacy were increased in positive and negative head CT patients relative to noninjured control subjects. Only one of five vertical disconjugacy measures was significantly increased in brain-injured patients relative to controls. Linear regression analysis of all 75 trauma patients demonstrated that three metrics for horizontal disconjugacy negatively correlated with SCAT3 symptom severity score and positively correlated with total Standardized Assessment of Concussion score. Abnormal eye-tracking metrics improved over time toward baseline in brain-injured subjects observed in follow-up. Eye tracking may help quantify the severity of ocular motility disruption associated with concussion and structural brain injury.
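One plausible horizontal-disconjugacy measure in the spirit of those described above is sketched below; the paper defines several such metrics and their exact formulas are not reproduced here, so treat this as an illustration only:

```python
# Minimal sketch: spread of the difference between left- and right-eye
# horizontal pupil positions while both eyes follow the same moving aperture.
import numpy as np

def horizontal_disconjugacy(left_x: np.ndarray, right_x: np.ndarray) -> float:
    """Standard deviation of the left-right difference in horizontal position."""
    diff = np.asarray(left_x, float) - np.asarray(right_x, float)
    diff = diff[np.isfinite(diff)]            # ignore blink/dropout samples
    return float(np.std(diff))

# Example: a 200 s recording at 500 Hz; the "injured" trace has extra drift in
# one eye that is uncorrelated with the target, raising the disconjugacy score.
t = np.arange(0, 200, 1 / 500)
target = np.sin(2 * np.pi * 0.05 * t)
left = target + 0.01 * np.random.randn(t.size)
right_healthy = target + 0.01 * np.random.randn(t.size)
right_injured = target + 0.15 * np.sin(2 * np.pi * 0.02 * t) + 0.01 * np.random.randn(t.size)
print(horizontal_disconjugacy(left, right_healthy) < horizontal_disconjugacy(left, right_injured))  # True
```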
Schaarup, Clara; Hartvigsen, Gunnar; Larsen, Lars Bo; Tan, Zheng-Hua; Årsand, Eirik; Hejlesen, Ole Kristian
2015-01-01
The Online Diabetes Exercise System was developed to motivate people with Type 2 diabetes to do a 25-minute low-volume high-intensity interval training program. In a previous multi-method evaluation of the system, several usability issues were identified and corrected. Despite the thorough testing, it was unclear whether all usability problems had been identified using the multi-method evaluation. Our hypothesis was that adding eye-tracking triangulation to the multi-method evaluation would increase the accuracy and completeness when testing the usability of the system. The study design was an eye-tracking triangulation: conventional eye-tracking with predefined tasks followed by the Post-Experience Eye-Tracked Protocol (PEEP). Six Areas of Interest were the basis for the PEEP session. The eye-tracking triangulation gave objective and subjective results, which are believed to be highly relevant for designing, implementing, evaluating and optimizing systems in the field of health informatics. Future work should include testing the method on a larger and more representative group of users and applying the method to different system types.
Frutos-Pascual, Maite; Garcia-Zapirain, Begonya
2015-01-01
This study examines the use of eye-tracking sensors as a means to identify children's behavior in attention-enhancement therapies. For this purpose, a set of data collected from 32 children with different attention skills is analyzed during their interaction with a set of puzzle games. The authors of this study hypothesize that participants with better performance may have quantifiably different eye-movement patterns from users with poorer results. The use of eye trackers outside the research community may help to extend their potential with available intelligent therapies, bringing state-of-the-art technologies to users. The use of gaze data constitutes a new information source in intelligent therapies that may help to build new approaches that are fully customized to final users' needs. This may be achieved by implementing machine learning algorithms for classification. The initial study of the dataset yielded a 0.88 (±0.11) classification accuracy with a random forest classifier, using cross-validation and hierarchical tree-based feature selection. Further approaches need to be examined in order to establish more detailed attention behaviors and patterns among children with and without attention problems. PMID:25985158
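The classification step described above (tree-based feature selection followed by a cross-validated random forest) can be sketched as follows. The gaze features here are synthetic, and the authors' "hierarchical" selection procedure is approximated by a generic tree-based selector:

```python
# Minimal sketch: tree-based feature selection + random forest classification
# of gaze-derived features, evaluated with cross-validation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_children, n_features = 32, 20
X = rng.normal(size=(n_children, n_features))        # e.g. fixation/saccade stats
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, n_children) > 0).astype(int)

model = make_pipeline(
    SelectFromModel(RandomForestClassifier(n_estimators=200, random_state=0)),
    RandomForestClassifier(n_estimators=200, random_state=0),
)
scores = cross_val_score(model, X, y, cv=5)
print(f"accuracy: {scores.mean():.2f} (+/- {scores.std():.2f})")
```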
Examining Focused L2 Practice: From in Vitro to in Vivo
ERIC Educational Resources Information Center
Cornillie, Frederik; Van Den Noortgate, Wim; Van den Branden, Kris; Desmet, Piet
2017-01-01
Behaviour-tracking technology has been used for decades in SLA research on focused practice with an eye toward elucidating the nature of L2 automatization (e.g. DeKeyser, 1997; Robinson, 1997). This involves longitudinally capturing learners' judgments or linguistic production along with their response times in order to investigate how specific…
Shape Up: An Eye-Tracking Study of Preschoolers' Shape Name Processing and Spatial Development
ERIC Educational Resources Information Center
Verdine, Brian N.; Bunger, Ann; Athanasopoulou, Angeliki; Golinkoff, Roberta Michnick; Hirsh-Pasek, Kathy
2017-01-01
Learning the names of geometric shapes is at the intersection of early spatial, mathematical, and language skills, all important for school-readiness and predictors of later abilities in science, technology, engineering, and mathematics (STEM). We investigated whether socioeconomic status (SES) influenced children's processing of shape names and…
The Influence of Different Representations on Solving Concentration Problems at Elementary School
ERIC Educational Resources Information Center
Liu, Chia-Ju; Shen, Ming-Hsun
2011-01-01
This study investigated the students' learning process of the concept of concentration at the elementary school level in Taiwan. The influence of different representational types on the process of proportional reasoning was also explored. The participants included nineteen third-grade and eighteen fifth-grade students. Eye-tracking technology was…
Dube, William V.; Wilkinson, Krista M.
2014-01-01
This paper examines the phenomenon of “stimulus overselectivity” or “overselective attention” as it may impact AAC training and use in individuals with intellectual disabilities. Stimulus overselectivity is defined as an atypical limitation in the number of stimuli or stimulus features within an image that are attended to and subsequently learned. Within AAC, the term “stimulus” could refer to symbols or line drawings on speech generating devices, drawings or pictures on low-technology systems, and/or the elements within visual scene displays. In this context, overselective attention may result in unusual or uneven error patterns such as confusion between two symbols that share a single feature or difficulties with transitioning between different types of hardware. We review some of the ways that overselective attention has been studied behaviorally. We then examine how eye tracking technology allows a glimpse into some of the behavioral characteristics of overselective attention. We describe an intervention approach, differential observing responses, that may reduce or eliminate overselectivity, and we consider this type of intervention as it relates to issues of relevance for AAC. PMID:24773053
Video-based eye tracking for neuropsychiatric assessment.
Adhikari, Sam; Stark, David E
2017-01-01
This paper presents a video-based eye-tracking method, ideally deployed via a mobile device or laptop-based webcam, as a tool for measuring brain function. Eye movements and pupillary motility are tightly regulated by brain circuits, are subtly perturbed by many disease states, and are measurable using video-based methods. Quantitative measurement of eye movement by readily available webcams may enable early detection and diagnosis, as well as remote/serial monitoring, of neurological and neuropsychiatric disorders. We successfully extracted computational and semantic features for 14 testing sessions, comprising 42 individual video blocks and approximately 17,000 image frames generated across several days of testing. Here, we demonstrate the feasibility of collecting video-based eye-tracking data from a standard webcam in order to assess psychomotor function. Furthermore, we were able to demonstrate through systematic analysis of this data set that eye-tracking features (in particular, radial and tangential variance on a circular visual-tracking paradigm) predict performance on well-validated psychomotor tests. © 2017 New York Academy of Sciences.
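The abstract highlights radial and tangential variance on a circular visual-tracking paradigm as predictive features. The exact definitions are not given there, so the following is only a plausible sketch of that decomposition: gaze error relative to the moving target is split into a component along the circle's radius and a component along its tangent, and the variance of each is taken. The data are synthetic.

```python
# Sketch (not the authors' implementation): radial/tangential error variance
# of gaze samples collected while tracking a target moving on a circle.
import numpy as np

def radial_tangential_variance(gaze_xy, target_angle, center, radius):
    """gaze_xy: (N, 2) gaze samples; target_angle: (N,) target angle on circle (rad)."""
    # Target position on the circle at each sample.
    target = center + radius * np.column_stack([np.cos(target_angle),
                                                np.sin(target_angle)])
    error = gaze_xy - target
    # Unit radial direction (circle center through target) and the
    # perpendicular tangential direction at each sample.
    radial_dir = np.column_stack([np.cos(target_angle), np.sin(target_angle)])
    tangential_dir = np.column_stack([-np.sin(target_angle), np.cos(target_angle)])
    radial_err = np.sum(error * radial_dir, axis=1)
    tangential_err = np.sum(error * tangential_dir, axis=1)
    return np.var(radial_err), np.var(tangential_err)

# Synthetic example: gaze lags the target slightly and is noisy.
t = np.linspace(0, 2 * np.pi, 600)
center, radius = np.array([512.0, 384.0]), 200.0
gaze = center + radius * np.column_stack([np.cos(t - 0.05), np.sin(t - 0.05)])
gaze += np.random.default_rng(1).normal(scale=5.0, size=gaze.shape)
print(radial_tangential_variance(gaze, t, center, radius))
```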
Promoting autonomy in a smart home environment with a smarter interface.
Brennan, C P; McCullagh, P J; Galway, L; Lightbody, G
2015-01-01
In the not too distant future, the median population age will tend towards 65, an age at which dependency on others increases. Most older people want to remain autonomous and self-sufficient for as long as possible. As environments become smarter, home automation solutions can be provided to support this aspiration. The technology discussed within this paper focuses on providing a home automation system that can be controlled by most users regardless of mobility restrictions, and hence it may be applicable to older people. It comprises a hybrid Brain-Computer Interface (BCI), a home automation user interface, and actuators. In the first instance, our system is controlled with conventional computer input, which is then replaced with eye tracking and finally with a combination of BCI and eye tracking. The systems have been assessed in terms of information throughput; benefits and limitations are evaluated.
Demšar, Urška; Çöltekin, Arzu
2017-01-01
Eye movements provide insights into what people pay attention to, and therefore are commonly included in a variety of human-computer interaction studies. Eye movement recording devices (eye trackers) produce gaze trajectories, that is, sequences of gaze locations on the screen. Despite recent technological developments that have enabled more affordable hardware, gaze data are still costly and time consuming to collect; therefore, some propose using mouse movements instead. These are easy to collect automatically and on a large scale. If and how these two movement types are linked, however, is less clear and highly debated. We address this problem in two ways. First, we introduce a new movement analytics methodology to quantify the level of dynamic interaction between the gaze and the mouse pointer on the screen. Our method uses a volumetric representation of movement, the space-time densities, which allows us to calculate interaction levels between two physically different types of movement. We describe the method and compare the results with existing dynamic interaction methods from movement ecology. The sensitivity to method parameters is evaluated on simulated trajectories where we can control interaction levels. Second, we perform an experiment with eye and mouse tracking to generate real data with real levels of interaction, to apply and test our new methodology on a real case. Further, as our experimental task mimics route-tracing on a map, it is more than a data collection exercise and simultaneously allows us to investigate the actual connection between the eye and the mouse. We find that there seems to be a natural coupling when the eyes are not under conscious control, but that this coupling breaks down when participants are instructed to move their eyes intentionally. Based on these observations, we tentatively suggest that for natural tracing tasks, mouse tracking could potentially provide similar information as eye tracking and therefore be used as a proxy for attention. However, more research is needed to confirm this. PMID:28777822
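The authors' method builds volumetric space-time densities; as a much simpler illustration of quantifying "dynamic interaction" between two movement types, the sketch below computes the fraction of time-aligned samples at which gaze and mouse pointer fall within a proximity threshold. The data and threshold are hypothetical, and this is a proxy measure, not the paper's method.

```python
# Sketch: a simple proximity-based interaction index between gaze and mouse
# trajectories sampled on a common clock. Not the space-time density method.
import numpy as np

def proximity_interaction(gaze_xy, mouse_xy, threshold_px=100.0):
    """gaze_xy, mouse_xy: (N, 2) screen coordinates; returns fraction of close samples."""
    dist = np.linalg.norm(gaze_xy - mouse_xy, axis=1)
    return float(np.mean(dist <= threshold_px))

rng = np.random.default_rng(2)
gaze = rng.uniform(0, 1000, size=(1000, 2))
mouse = gaze + rng.normal(scale=80.0, size=gaze.shape)  # mouse loosely follows gaze
print(f"interaction level: {proximity_interaction(gaze, mouse):.2f}")
```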
Context Effects and Spoken Word Recognition of Chinese: An Eye-Tracking Study
ERIC Educational Resources Information Center
Yip, Michael C. W.; Zhai, Mingjun
2018-01-01
This study examined the time-course of context effects on spoken word recognition during Chinese sentence processing. We recruited 60 native Mandarin listeners to participate in an eye-tracking experiment. In this eye-tracking experiment, listeners were told to listen to a sentence carefully, which ended with a Chinese homophone, and look at…
Eye Movements during Multiple Object Tracking: Where Do Participants Look?
ERIC Educational Resources Information Center
Fehd, Hilda M.; Seiffert, Adriane E.
2008-01-01
This experiment investigated where participants tend to look while keeping track of multiple objects, similar to the eye movements one might make when viewing a sports game. While eye movements were recorded, participants tracked either 1 or 3 of 8 red dots that moved randomly within a square box on a black background. Results indicated that…
A pilot study of eye-tracking devices in intensive care.
Garry, Jonah; Casey, Kelly; Cole, Therese Kling; Regensburg, Angela; McElroy, Colleen; Schneider, Eric; Efron, David; Chi, Albert
2016-03-01
Eye-tracking devices have been suggested as a means of improving communication and psychosocial status among patients in the intensive care unit (ICU). This study was undertaken to explore the psychosocial impact and communication effects of eye-tracking devices in the ICU. A convenience sample of patients in the medical ICU, surgical ICU, and neurosciences critical care unit were enrolled prospectively. Patients participated in 5 guided sessions of 45 minutes each with the eye-tracking computer. After completion of the sessions, the Psychosocial Impact of Assistive Devices Scale (PIADS) was used to evaluate the device from the patient's perspective. All patients who participated in the study were able to communicate basic needs to nursing staff and family. Delirium as assessed by the Confusion Assessment Method for the Intensive Care Unit was present in 4 patients at recruitment and none after training. The device's overall psychosocial impact ranged from neutral (-0.29) to strongly positive (2.76). Compared with the absence of intervention (0 = no change), patients exposed to eye-tracking computers demonstrated a positive mean overall impact score (PIADS = 1.30; P = .004). This finding was present in mean scores for each PIADS domain: competence = 1.26, adaptability = 1.60, and self-esteem = 1.02 (all P < .01). There is a population of patients in the ICU whose psychosocial status, delirium, and communication ability may be enhanced by eye-tracking devices. These 3 outcomes are intertwined with ICU patient outcomes and indirectly suggest that eye-tracking devices might improve outcomes. A more in-depth exploration of the population to be targeted, the device's limitations, and the benefits of eye-tracking devices in the ICU is warranted. Copyright © 2016 Elsevier Inc. All rights reserved.
Huang, Chien-Ting; Hwang, Ing-Shiou
2012-01-01
Visual feedback and non-visual information play different roles in tracking of an external target. This study explored the respective roles of the visual and non-visual information in eleven healthy volunteers who coupled the manual cursor to a rhythmically moving target of 0.5 Hz under three sensorimotor conditions: eye-alone tracking (EA), eye-hand tracking with visual feedback of manual outputs (EH tracking), and the same tracking without such feedback (EHM tracking). Tracking error, kinematic variables, and movement intermittency (saccade and speed pulse) were contrasted among tracking conditions. The results showed that EHM tracking exhibited larger pursuit gain, less tracking error, and less movement intermittency for the ocular plant than EA tracking. With the vision of manual cursor, EH tracking achieved superior tracking congruency of the ocular and manual effectors with smaller movement intermittency than EHM tracking, except that the rate precision of manual action was similar for both types of tracking. The present study demonstrated that visibility of manual consequences altered mutual relationships between movement intermittency and tracking error. The speed pulse metrics of manual output were linked to ocular tracking error, and saccade events were time-locked to the positional error of manual tracking during EH tracking. In conclusion, peripheral non-visual information is critical to smooth pursuit characteristics and rate control of rhythmic manual tracking. Visual information adds to eye-hand synchrony, underlying improved amplitude control and elaborate error interpretation during oculo-manual tracking. PMID:23236498
ERIC Educational Resources Information Center
Yang, Fang-Ying; Tsai, Meng-Jung; Chiou, Guo-Li; Lee, Silvia Wen-Yu; Chang, Cheng-Chieh; Chen, Li-Ling
2018-01-01
The main purpose of this study was to provide instructional suggestions for supporting science learning in digital environments based on a review of eye tracking studies in e-learning related areas. Thirty-three eye-tracking studies from 2005 to 2014 were selected from the Social Science Citation Index (SSCI) database for review. Through a…
ERIC Educational Resources Information Center
van der Gijp, A.; Ravesloot, C. J.; Jarodzka, H.; van der Schaaf, M. F.; van der Schaaf, I. C.; van Schaik, J. P.; ten Cate, Th. J.
2017-01-01
Eye tracking research has been conducted for decades to gain understanding of visual diagnosis such as in radiology. For educational purposes, it is important to identify visual search patterns that are related to high perceptual performance and to identify effective teaching strategies. This review of eye-tracking literature in the radiology…
Using Eye Tracking as a Tool to Teach Informatics Students the Importance of User Centered Design
ERIC Educational Resources Information Center
Gelderblom, Helene; Adebesin, Funmi; Brosens, Jacques; Kruger, Rendani
2017-01-01
In this article the authors describe how they incorporate eye tracking in a human-computer interaction (HCI) course that forms part of a postgraduate Informatics degree. The focus is on an eye tracking assignment that involves student groups performing usability evaluation studies for real world clients. Over the past three years the authors have…
Assessing cognitive functioning in females with Rett syndrome by eye-tracking methodology.
Ahonniska-Assa, Jaana; Polack, Orli; Saraf, Einat; Wine, Judy; Silberg, Tamar; Nissenkorn, Andreea; Ben-Zeev, Bruria
2018-01-01
While many individuals with severe developmental impairments learn to communicate with augmentative and alternative communication (AAC) devices, a significant number of individuals show major difficulties in the effective use of AAC. Recent technological innovations, i.e., eye-tracking technology (ETT), aim to improve the transparency of communication and may also enable a more valid cognitive assessment. To investigate whether ETT in forced-choice tasks can enable children with very severe motor and speech impairments to respond consistently, allowing a more reliable evaluation of their language comprehension. Participants were 17 girls with Rett syndrome (M = 6:06 years). Their ability to respond by eye gaze was first practiced with computer games using ETT. Afterwards, their receptive vocabulary was assessed using the Peabody Picture Vocabulary Test-4 (PPVT-4). Target words were orally presented and participants responded by focusing their eyes on the preferred picture. Remarkable differences between the participants in receptive vocabulary were demonstrated using ETT. The verbal comprehension abilities of 32% of the participants ranged from low-average to mild cognitive impairment, and the other 68% of the participants showed moderate to severe impairment. Young age at the time of assessment was positively correlated with higher receptive vocabulary. The use of ETT seems to make the communicational signals of children with severe motor and communication impairments more easily understood. Early practice of ETT may improve the quality of communication and enable more reliable conclusions in learning and assessment sessions. Copyright © 2017 European Paediatric Neurology Society. Published by Elsevier Ltd. All rights reserved.
Bekele, Esubalew; Zheng, Zhi; Swanson, Amy; Crittendon, Julie; Warren, Zachary; Sarkar, Nilanjan
2013-01-01
Autism Spectrum Disorders (ASD) are characterized by atypical patterns of behaviors and impairments in social communication. Among the fundamental social impairments in the ASD population are challenges in appropriately recognizing and responding to facial expressions. Traditional intervention approaches often require intensive support and well-trained therapists to address core deficits, and many individuals with ASD have tremendous difficulty accessing such care due to a lack of available trained therapists as well as intervention costs. As a result, emerging technology such as virtual reality (VR) has the potential to offer useful technology-enabled intervention systems. In this paper, an innovative VR-based facial emotional expression presentation system was developed that allows monitoring of eye gaze and physiological signals related to emotion identification to explore new efficient therapeutic paradigms. A usability study of this new system involving ten adolescents with ASD and ten typically developing adolescents as a control group was performed. The eye tracking and physiological data were analyzed to determine intragroup and intergroup variations of gaze and physiological patterns. Performance data, eye tracking indices and physiological features indicated that there were differences in the way adolescents with ASD process and recognize emotional faces compared to their typically developing peers. These results will be used in the future for an online adaptive VR-based multimodal social interaction system to improve emotion recognition abilities of individuals with ASD. PMID:23428456
A relationship between eye movement patterns and performance in a precognitive tracking task
NASA Technical Reports Server (NTRS)
Repperger, D. W.; Hartzell, E. J.
1977-01-01
Eye movements made by various subjects in the performance of a precognitive tracking task are studied. The tracking task, presented by an antiaircraft artillery (AAA) simulator, has an input forcing function represented by a deterministic aircraft fly-by. The performance of subjects is ranked by two metrics. Good, mediocre, and poor trackers are selected for analysis based on performance during the difficult segment of the tracking task and over replications. Using phase planes to characterize both the eye movement patterns and the displayed error signal, a simple metric is developed to study these patterns. Two characterizations of eye movement strategies are defined and quantified. Using these two types of eye strategies, two conclusions are obtained about good, mediocre, and poor trackers. First, a tracker who uses a fixed eye-movement strategy consistently performs better. Second, the best fixed strategy is that of the Crosshair Fixator.
3D ocular ultrasound using gaze tracking on the contralateral eye: a feasibility study.
Afsham, Narges; Najafi, Mohammad; Abolmaesumi, Purang; Rohling, Robert
2011-01-01
A gaze-deviated examination of the eye with a 2D ultrasound transducer is a common and informative ophthalmic test; however, the complex task of estimating the pose of the ultrasound images relative to the eye affects 3D interpretation. To tackle this challenge, a novel system for 3D image reconstruction based on gaze tracking of the contralateral eye has been proposed. The gaze fixates on several target points and, for each fixation, the pose of the examined eye is inferred from the gaze tracking. A single camera system has been developed for pose estimation combined with subject-specific parameter identification. The ultrasound images are then transformed to the coordinate system of the examined eye to create a 3D volume. The accuracy of the proposed gaze tracking system and of the eye pose estimation has been validated in a set of experiments. Overall system errors, including pose estimation and calibration, are 3.12 mm and 4.68 degrees.
A laser-based eye-tracking system.
Irie, Kenji; Wilson, Bruce A; Jones, Richard D; Bones, Philip J; Anderson, Tim J
2002-11-01
This paper reports on the development of a new eye-tracking system for noninvasive recording of eye movements. The eye tracker uses a flying-spot laser to selectively image landmarks on the eye and, subsequently, measure horizontal, vertical, and torsional eye movements. Considerable work was required to overcome the adverse effects of specular reflection of the flying-spot from the surface of the eye onto the sensing elements of the eye tracker. These effects have been largely overcome, and the eye-tracker has been used to document eye movement abnormalities, such as abnormal torsional pulsion of saccades, in the clinical setting.
ERIC Educational Resources Information Center
Rodriguez, Christina M.; Cook, Anne E.; Jedrziewski, Chezlie T.
2012-01-01
Objective: Researchers in the child maltreatment field have traditionally relied on explicit self-reports to study factors that may exacerbate physical child abuse risk. The current investigation evaluated an implicit analog task utilizing eye tracking technology to assess both parental attributions of child misbehavior and empathy. Method: Based…
ERIC Educational Resources Information Center
Belenky, Daniel; Ringenberg, Michael; Olsen, Jennifer; Aleven, Vincent; Rummel, Nikol
2014-01-01
As learning technologies proliferate, it is important for research to address how to best align instruction to educational goals. For example, recent evidence indicates that working collaboratively may have unique benefits for facilitating the acquisition of conceptual understanding, as opposed to procedural fluency (Mullins, Rummel & Spada,…
Laparra-Hernández, José; Medina, Enric; Sancho, María; Soriano, Carolina; Durá, Juanvi; Barberà-Guillem, Ricard; Poveda-Puente, Rakel
2015-01-01
Senior citizens can benefit from banking services but the lack of usability hampers this possibility. New approaches based on physiological response, eye tracking and user movement analysis can provide more information during interface interaction. This research shows the differences depending on user knowledge and use of technology, gender and type of interface.
Evaluation of an eye-pointer interaction device for human-computer interaction.
Cáceres, Enrique; Carrasco, Miguel; Ríos, Sebastián
2018-03-01
Advances in eye-tracking technology have led to better human-computer interaction, including controlling a computer without any kind of physical contact. This research describes the transformation of a commercial eye-tracker for use as an alternative peripheral device in human-computer interactions, implementing a pointer that only needs the eye movements of a user facing a computer screen, thus replacing the need to control the software by hand movements. The experiment was performed with 30 test individuals who used the prototype with a set of educational videogames. The results show that, although most of the test subjects would prefer a mouse to control the pointer, the tested prototype has an empirical precision similar to that of the mouse, whether controlling its movements or clicking on a point of the screen.
Photorefractive keratectomy with a small spot laser and tracker.
Pallikaris, I G; Koufala, K I; Siganos, D S; Papadaki, T G; Katsanevaki, V J; Tourtsan, V; McDonald, M B
1999-01-01
The Autonomous Technologies LADARVision excimer laser system utilizes an eye tracking mechanism and a small spot for photorefractive keratectomy. One hundred and two eyes of 102 patients were treated for -1.50 to -6.25 D of spherical myopia at the spectacle plane using a 6-mm diameter ablation zone. One year follow-up was available for 93 eyes (91%). Uncorrected visual acuity for eyes treated for distance vision was 20/40 or better in 99% (n = 90), and 20/20 or better in 70% (n = 64) of eyes at 12 months. Spectacle-corrected visual acuity was 20/25 or better in all 92 eyes reported; no eye lost more than 2 lines of spectacle-corrected visual acuity, and only 1 eye (1.0%) experienced a loss of 2 lines (20/12.5 to 20/20) at 1 year. The refractive result was within +/- 0.50 D of the desired correction in 75% (n = 70), and within +/- 1.00 D in 93% (n = 86) of eyes at 12 months. Refractive stability was achieved between 3 and 6 months. Corneal haze was graded as trace or less in 100% of the 93 eyes. No significant reductions were noted in contrast sensitivity or endothelial cell density. Patients treated with the Autonomous Technologies LADARVision excimer laser system for -1.50 to -6.25 D of spherical myopia with 1 year follow-up had uncorrected visual acuity of 20/20 or better in 70%, no significant loss of spectacle-corrected visual acuity, no reduction of endothelial cell density or contrast sensitivity, and low levels of corneal haze.
Using Cognitive Task Analysis and Eye Tracking to Understand Imagery Analysis
2006-01-01
Kurland, Laura; Gertner, Abigail; Bartee, Tom; Chisholm, Michael; …
Using a Cognitive Task Analysis (CTA) framework together with eye tracking, the authors studied imagery analysts' search behavior in detail.
An Exploration of Cognitive Agility as Quantified by Attention Allocation in a Complex Environment
2017-03-01
Attention allocation was quantified by eye-tracking data collected while subjects played a military-relevant cognitive agility computer game (Make Goal), to determine whether certain patterns are associated with effective…
Heuer, Sabine; Hallowell, Brooke
2015-01-01
Numerous authors report that people with aphasia have greater difficulty allocating attention than people without neurological disorders. Studying how attention deficits contribute to language deficits is important. However, existing methods for indexing attention allocation in people with aphasia pose serious methodological challenges. Eye-tracking methods have great potential to address such challenges. We developed and assessed the validity of a new dual-task method incorporating eye tracking to assess attention allocation. Twenty-six adults with aphasia and 33 control participants completed auditory sentence comprehension and visual search tasks. To test whether the new method validly indexes well-documented patterns in attention allocation, demands were manipulated by varying task complexity in single- and dual-task conditions. Differences in attention allocation were indexed via eye-tracking measures. For all participants significant increases in attention allocation demands were observed from single- to dual-task conditions and from simple to complex stimuli. Individuals with aphasia had greater difficulty allocating attention with greater task demands. Relationships between eye-tracking indices of comprehension during single and dual tasks and standardized testing were examined. Results support the validity of the novel eye-tracking method for assessing attention allocation in people with and without aphasia. Clinical and research implications are discussed. PMID:25913549
Moacdieh, Nadine; Sarter, Nadine
2015-06-01
The objective was to use eye tracking to trace the underlying changes in attention allocation associated with the performance effects of clutter, stress, and task difficulty in visual search and noticing tasks. Clutter can degrade performance in complex domains, yet more needs to be known about the associated changes in attention allocation, particularly in the presence of stress and for different tasks. Frequently used and relatively simple eye tracking metrics do not effectively capture the various effects of clutter, which is critical for comprehensively analyzing clutter and developing targeted, real-time countermeasures. Electronic medical records (EMRs) were chosen as the application domain for this research. Clutter, stress, and task difficulty were manipulated, and physicians' performance on search and noticing tasks was recorded. Several eye tracking metrics were used to trace attention allocation throughout those tasks, and subjective data were gathered via a debriefing questionnaire. Clutter degraded performance in terms of response time and noticing accuracy. These decrements were largely accentuated by high stress and task difficulty. Eye tracking revealed the underlying attentional mechanisms, and several display-independent metrics were shown to be significant indicators of the effects of clutter. Eye tracking provides a promising means to understand in detail (offline) and prevent (in real time) major performance breakdowns due to clutter. Display designers need to be aware of the risks of clutter in EMRs and other complex displays and can use the identified eye tracking metrics to evaluate and/or adjust their display. © 2015, Human Factors and Ergonomics Society.
Lykins, Amy D; Meana, Marta; Kambe, Gretchen
2006-10-01
As a first step in the investigation of the role of visual attention in the processing of erotic stimuli, eye-tracking methodology was employed to measure eye movements during erotic scene presentation. Because eye-tracking is a novel methodology in sexuality research, we attempted to determine whether the eye-tracker could detect differences (should they exist) in visual attention to erotic and non-erotic scenes. A total of 20 men and 20 women were presented with a series of erotic and non-erotic images, and their eye movements were tracked during image presentation. Comparisons between erotic and non-erotic image groups showed significant differences on two of three dependent measures of visual attention (number of fixations and total time) in both men and women. As hypothesized, there was a significant Stimulus x Scene Region interaction, indicating that participants visually attended to the body more in the erotic stimuli than in the non-erotic stimuli, as evidenced by a greater number of fixations and longer total time devoted to that region. These findings provide support for the application of eye-tracking methodology as a measure of visual attentional capture in sexuality research. Future applications of this methodology to expand our knowledge of the role of cognition in sexuality are suggested.
NASA Astrophysics Data System (ADS)
Iatsun, Iana; Larabi, Mohamed-Chaker; Fernandez-Maloigne, Christine
2014-03-01
The transition of TV systems from 2D to 3D mode is the next expected step in the telecommunication world. Some work has already been done to achieve this progress technically, but how the third dimension interacts with human viewers is not yet clear. Previously, it was found that any increased load on the visual system can create visual fatigue, as in prolonged TV watching, computer work or video gaming. Watching S3D, however, can cause a different kind of visual fatigue, since all S3D technologies create the illusion of the third dimension based on the characteristics of binocular vision. In this work we propose to evaluate and compare the visual fatigue from watching 2D and S3D content, showing the difference in how visual fatigue accumulates and is assessed for the two types of content. To perform this comparison, eye-tracking experiments using six commercially available movies were conducted. Healthy naive participants took part in the test and provided subjective evaluations. It was found that watching stereo 3D content induces a stronger feeling of visual fatigue than conventional 2D, and that the nature of the video has an important effect on its increase. Visual characteristics obtained using eye tracking were investigated regarding their relation with visual fatigue.
Active eye-tracking improves LASIK results.
Lee, Yuan-Chieh
2007-06-01
To study the advantage of active eye-tracking for photorefractive surgery. In a prospective, double-masked study, LASIK for myopia and myopic astigmatism was performed in 50 patients using the ALLEGRETTO WAVE version 1007. All patients received LASIK with full comprehension of the importance of fixation during the procedure. All surgical procedures were performed by a single surgeon. The eye-tracker was turned off in one group (n = 25) and kept on in another group (n = 25). Preoperatively and 3 months postoperatively, patients underwent a standard ophthalmic examination, which included corneal topography. In the patients treated with the eye-tracker off, all had uncorrected visual acuity (UCVA) of > or = 20/40 and 64% had > or = 20/20. Compared with the patients treated with the eye-tracker on, they had higher residual cylindrical astigmatism (P < .05). Those treated with the eye-tracker on achieved better UCVA and best spectacle-corrected visual acuity (P < .05). Spherical error and potential visual acuity (TMS-II) were not significantly different between the groups. The flying-spot system can achieve a fair result without active eye-tracking, but active eye-tracking helps improve the visual outcome and reduces postoperative cylindrical astigmatism.
Towards free 3D end-point control for robotic-assisted human reaching using binocular eye tracking.
Maimon-Dror, Roni O; Fernandez-Quesada, Jorge; Zito, Giuseppe A; Konnaris, Charalambos; Dziemian, Sabine; Faisal, A Aldo
2017-07-01
Eye movements are the only directly observable behavioural signals that are highly correlated with actions at the task level and that precede body movements, and they thus reflect action intentions. Moreover, eye movements are preserved in many movement disorders leading to paralysis (and in amputees), including stroke, spinal cord injury, Parkinson's disease, multiple sclerosis, and muscular dystrophy, among others. Despite this benefit, eye tracking is not widely used as a control interface for robotics in movement-impaired patients due to poor human-robot interfaces. We demonstrate here how combining 3D gaze tracking using our GT3D binocular eye tracker with a custom-designed 3D head tracking system and calibration method enables continuous 3D end-point control of a robotic arm support system. The users can move their own hand to any location of the workspace by simply looking at the target and winking once. This purely eye-tracking-based system enables the end user to retain free head movement and yet achieves high spatial end-point accuracy on the order of 6 cm RMSE error in each dimension and a standard deviation of 4 cm. 3D calibration is achieved by moving the robot along a 3-dimensional space-filling Peano curve while the user tracks it with their eyes. This results in a fully automated calibration procedure that yields several thousand calibration points, versus standard approaches using a dozen points, resulting in beyond state-of-the-art 3D accuracy and precision.
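The general idea behind such a calibration can be sketched as follows: while the robot moves along a dense space-filling trajectory, each known 3D robot position is paired with the binocular gaze features recorded at that moment, and a regression is fitted from gaze features to 3D end-points. The feature layout, the kernel-ridge model, and the synthetic data below are assumptions for illustration, not the authors' GT3D implementation.

```python
# Sketch: regression-based 3D gaze calibration over many trajectory samples.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(3)
# Hypothetical calibration set: left/right pupil coordinates (4 features)
# paired with the robot's known 3D position (metres) at the same instant.
gaze_features = rng.normal(size=(2000, 4))
true_points = rng.uniform(-0.3, 0.3, size=(2000, 3))
# Pretend gaze features are a noisy nonlinear function of the 3D point.
gaze_features[:, :3] += np.tanh(true_points) * 2.0

model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.5)
model.fit(gaze_features, true_points)

pred = model.predict(gaze_features)
rmse = np.sqrt(np.mean((pred - true_points) ** 2, axis=0))
print("per-axis RMSE (m):", np.round(rmse, 3))
```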
ERIC Educational Resources Information Center
Corden, Ben; Chilvers, Rebecca; Skuse, David
2008-01-01
We combined eye-tracking technology with a test of facial affect recognition and a measure of self-reported social anxiety in order to explore the aetiology of social-perceptual deficits in Asperger's syndrome (AS). Compared to controls matched for age, IQ and visual-perceptual ability, we found a group of AS adults was impaired in their…
Ali, S M; Reisner, L A; King, B; Cao, A; Auner, G; Klein, M; Pandya, A K
2008-01-01
A redesigned motion control system for the medical robot Aesop allows automating and programming its movements. An IR eye tracking system has been integrated with this control interface to implement an intelligent, autonomous eye gaze-based laparoscopic positioning system. A laparoscopic camera held by Aesop can be moved based on the data from the eye tracking interface to keep the user's gaze point region at the center of a video feedback monitor. This system setup provides autonomous camera control that works around the surgeon, providing an optimal robotic camera platform.
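A simplified sketch of the control idea described above: if the user's gaze point drifts outside a central dead-band of the video monitor, pan/tilt commands are issued to bring the gazed region back toward the centre. The gains, dead-band, and command interface are illustrative assumptions, not the Aesop control API.

```python
# Sketch: proportional gaze-centering camera control with a dead-band.
def camera_command(gaze_x, gaze_y, screen_w=1280, screen_h=1024,
                   deadband=0.15, gain=0.002):
    """Return (pan, tilt) velocity commands from a gaze point in pixels."""
    # Normalised offset of gaze from screen centre, in [-0.5, 0.5].
    dx = (gaze_x - screen_w / 2) / screen_w
    dy = (gaze_y - screen_h / 2) / screen_h
    pan = gain * dx * screen_w if abs(dx) > deadband else 0.0
    tilt = gain * dy * screen_h if abs(dy) > deadband else 0.0
    return pan, tilt

print(camera_command(1100, 500))   # gaze far right -> pan right, no tilt
print(camera_command(650, 520))    # gaze near centre -> no movement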
Bond, R R; Zhu, T; Finlay, D D; Drew, B; Kligfield, P D; Guldenring, D; Breen, C; Gallagher, A G; Daly, M J; Clifford, G D
2014-01-01
It is well known that accurate interpretation of the 12-lead electrocardiogram (ECG) requires a high degree of skill. There is also a moderate degree of variability among those who interpret the ECG. While this is the case, there are no best practice guidelines for the actual ECG interpretation process. Hence, this study adopts computerized eye tracking technology to investigate whether eye-gaze can be used to gain a deeper insight into how expert annotators interpret the ECG. Annotators were recruited in San Jose, California at the 2013 International Society of Computerised Electrocardiology (ISCE). Each annotator was recruited to interpret a number of 12-lead ECGs (N=12) while their eye gaze was recorded using a Tobii X60 eye tracker. The device is based on corneal reflection and is non-intrusive. With a sampling rate of 60Hz, eye gaze coordinates were acquired every 16.7ms. Fixations were determined using a predefined computerized classification algorithm, which was then used to generate heat maps of where the annotators looked. The ECGs used in this study form four groups (3=ST elevation myocardial infarction [STEMI], 3=hypertrophy, 3=arrhythmias and 3=exhibiting unique artefacts). There was also an equal distribution of difficulty levels (3=easy to interpret, 3=average and 3=difficult). ECGs were displayed using the 4x3+1 display format and computerized annotations were concealed. Precisely 252 expert ECG interpretations (21 annotators×12 ECGs) were recorded. Average duration for ECG interpretation was 58s (SD=23). Fleiss' generalized kappa coefficient (Pa=0.56) indicated a moderate inter-rater reliability among the annotators. There was a 79% inter-rater agreement for STEMI cases, 71% agreement for arrhythmia cases, 65% for the lead misplacement and dextrocardia cases and only 37% agreement for the hypertrophy cases. In analyzing the total fixation duration, it was found that on average annotators study lead V1 the most (4.29s), followed by leads V2 (3.83s), the rhythm strip (3.47s), II (2.74s), V3 (2.63s), I (2.53s), aVL (2.45s), V5 (2.27s), aVF (1.74s), aVR (1.63s), V6 (1.39s), III (1.32s) and V4 (1.19s). It was also found that on average the annotator spends an equal amount of time studying leads in the frontal plane (15.89s) when compared to leads in the transverse plane (15.70s). It was found that on average the annotators fixated on lead I first followed by leads V2, aVL, V1, II, aVR, V3, rhythm strip, III, aVF, V5, V4 and V6. We found a strong correlation (r=0.67) between time to first fixation on a lead and the total fixation duration on each lead. This indicates that leads studied first are studied the longest. There was a weak negative correlation between duration and accuracy (r=-0.2) and a strong correlation between age and accuracy (r=0.67). Eye tracking facilitated a deeper insight into how expert annotators interpret the 12-lead ECG. As a result, the authors recommend ECG annotators to adopt an initial first impression/pattern recognition approach followed by a conventional systematic protocol to ECG interpretation. This recommendation is based on observing misdiagnoses given due to first impression only. In summary, this research presents eye gaze results from expert ECG annotators and provides scope for future work that involves exploiting computerized eye tracking technology to further the science of ECG interpretation. Copyright © 2014 Elsevier Inc. All rights reserved.
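The per-lead summaries reported above (total fixation duration, time to first fixation, and their correlation across leads) reduce to straightforward bookkeeping over fixations and lead areas of interest. The sketch below uses hypothetical AOI coordinates and fixation data, not the study's Tobii export format.

```python
# Sketch: per-AOI total fixation duration, time to first fixation, and their
# correlation across AOIs. All coordinates and fixations are hypothetical.
import numpy as np

# fixation: (timestamp_ms, x, y, duration_ms); AOI: lead -> (x0, y0, x1, y1)
fixations = [(0, 110, 80, 300), (320, 115, 90, 250), (600, 410, 80, 400),
             (1050, 120, 300, 200), (1300, 415, 310, 350)]
aois = {"I": (50, 50, 200, 150), "V1": (350, 50, 500, 150),
        "II": (50, 250, 200, 350), "V2": (350, 250, 500, 350)}

total_dur, first_fix = {}, {}
for lead, (x0, y0, x1, y1) in aois.items():
    hits = [(t, d) for t, x, y, d in fixations if x0 <= x <= x1 and y0 <= y <= y1]
    total_dur[lead] = sum(d for _, d in hits)
    first_fix[lead] = min((t for t, _ in hits), default=np.nan)

leads = [l for l in aois if not np.isnan(first_fix[l])]
r = np.corrcoef([first_fix[l] for l in leads], [total_dur[l] for l in leads])[0, 1]
print(total_dur, first_fix, f"r={r:.2f}")
```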
Bölte, S; Bartl-Pokorny, KD; Jonsson, U; Berggren, S; Zhang, D; Kostrzewa, E; Falck-Ytter, T; Einspieler, C; Pokorny, FB; Jones, EJH; Roeyers, H; Charman, T; Marschik, PB
2018-01-01
We reviewed original research papers that used quantifiable technology to detect early autism spectrum disorder (ASD) and identified 376 studies from 34 countries published from 1965 to 2013. Publications have increased significantly since 2000, with most coming from the USA. Electroencephalography, magnetic resonance imaging and eye-tracking were the most frequently used technologies. Conclusion: The use of quantifiable technology to detect early ASD has increased in recent decades, but has had limited impact on early detection and treatment. Further scientific developments are anticipated and we hope that they will increasingly be used in clinical practice for early ASD screening, diagnosis and intervention. PMID:26479859
Singh, Tarkeshwar; Perry, Christopher M; Herter, Troy M
2016-01-26
Robotic and virtual-reality systems offer tremendous potential for improving assessment and rehabilitation of neurological disorders affecting the upper extremity. A key feature of these systems is that visual stimuli are often presented within the same workspace as the hands (i.e., peripersonal space). Integrating video-based remote eye tracking with robotic and virtual-reality systems can provide an additional tool for investigating how cognitive processes influence visuomotor learning and rehabilitation of the upper extremity. However, remote eye tracking systems typically compute ocular kinematics by assuming eye movements are made in a plane with constant depth (e.g. frontal plane). When visual stimuli are presented at variable depths (e.g. transverse plane), eye movements have a vergence component that may influence reliable detection of gaze events (fixations, smooth pursuits and saccades). To our knowledge, there are no available methods to classify gaze events in the transverse plane for monocular remote eye tracking systems. Here we present a geometrical method to compute ocular kinematics from a monocular remote eye tracking system when visual stimuli are presented in the transverse plane. We then use the obtained kinematics to compute velocity-based thresholds that allow us to accurately identify onsets and offsets of fixations, saccades and smooth pursuits. Finally, we validate our algorithm by comparing the gaze events computed by the algorithm with those obtained from the eye-tracking software and manual digitization. Within the transverse plane, our algorithm reliably differentiates saccades from fixations (static visual stimuli) and smooth pursuits from saccades and fixations when visual stimuli are dynamic. The proposed methods provide advancements for examining eye movements in robotic and virtual-reality systems. Our methods can also be used with other video-based or tablet-based systems in which eye movements are performed in a peripersonal plane with variable depth.
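The final classification step the abstract describes can be illustrated with a minimal velocity-threshold sketch: once angular gaze velocity is available (taken as given here rather than derived from the authors' transverse-plane geometry), samples are labelled fixation, smooth pursuit, or saccade. The threshold values are illustrative assumptions.

```python
# Sketch: velocity-threshold labelling of gaze samples.
import numpy as np

def classify_gaze_samples(angular_velocity_deg_s,
                          pursuit_threshold=5.0, saccade_threshold=60.0):
    """Label each sample from its angular velocity (deg/s)."""
    labels = np.full(angular_velocity_deg_s.shape, "fixation", dtype=object)
    labels[angular_velocity_deg_s >= pursuit_threshold] = "smooth_pursuit"
    labels[angular_velocity_deg_s >= saccade_threshold] = "saccade"
    return labels

velocity = np.array([1.2, 2.0, 15.0, 22.0, 300.0, 250.0, 3.0, 1.0])
print(classify_gaze_samples(velocity))
# ['fixation' 'fixation' 'smooth_pursuit' 'smooth_pursuit' 'saccade'
#  'saccade' 'fixation' 'fixation']
```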
Speaking and Listening with the Eyes: Gaze Signaling during Dyadic Interactions.
Ho, Simon; Foulsham, Tom; Kingstone, Alan
2015-01-01
Cognitive scientists have long been interested in the role that eye gaze plays in social interactions. Previous research suggests that gaze acts as a signaling mechanism and can be used to control turn-taking behaviour. However, early research on this topic employed methods of analysis that aggregated gaze information across an entire trial (or trials), which masks any temporal dynamics that may exist in social interactions. More recently, attempts have been made to understand the temporal characteristics of social gaze but little research has been conducted in a natural setting with two interacting participants. The present study combines a temporally sensitive analysis technique with modern eye tracking technology to 1) validate the overall results from earlier aggregated analyses and 2) provide insight into the specific moment-to-moment temporal characteristics of turn-taking behaviour in a natural setting. Dyads played two social guessing games (20 Questions and Heads Up) while their eyes were tracked. Our general results are in line with past aggregated data, and using cross-correlational analysis on the specific gaze and speech signals of both participants we found that 1) speakers end their turn with direct gaze at the listener and 2) the listener in turn begins to speak with averted gaze. Convergent with theoretical models of social interaction, our data suggest that eye gaze can be used to signal both the end and the beginning of a speaking turn during a social interaction. The present study offers insight into the temporal dynamics of live dyadic interactions and also provides a new method of analysis for eye gaze data when temporal relationships are of interest.
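A sketch of the kind of cross-correlational analysis described above: given two time-aligned signals (speaker gazing at partner, speaker talking), the correlation is computed at a range of lags to see whether direct gaze tends to lead or follow speech. The signals here are synthetic, and the lag convention is an assumption for illustration.

```python
# Sketch: lagged correlation between a gaze signal and a speech signal.
import numpy as np

def lagged_correlation(gaze, speech, max_lag):
    """Pearson correlation of gaze vs speech shifted by each lag (in samples)."""
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:
            a, b = gaze[:lag], speech[-lag:]
        elif lag > 0:
            a, b = gaze[lag:], speech[:-lag]
        else:
            a, b = gaze, speech
        out[lag] = float(np.corrcoef(a, b)[0, 1])
    return out

rng = np.random.default_rng(4)
speech = (rng.uniform(size=500) < 0.5).astype(float)
gaze = np.roll(speech, 10) * 0.8 + rng.uniform(size=500) * 0.2  # gaze lags speech
corr = lagged_correlation(gaze, speech, max_lag=20)
best_lag = max(corr, key=corr.get)
print(f"peak correlation {corr[best_lag]:.2f} at lag {best_lag} samples")
```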
Improving data retention in EEG research with children using child-centered eye tracking
Maguire, Mandy J.; Magnon, Grant; Fitzhugh, Anna E.
2014-01-01
Background Event Related Potentials (ERPs) elicited by visual stimuli have increased our understanding of developmental disorders and adult cognitive abilities for decades; however, these studies are very difficult with populations who cannot sustain visual attention such as infants and young children. Current methods for studying such populations include requiring a button response, which may be impossible for some participants, and experimenter monitoring, which is subject to error, highly variable, and spatially imprecise. New Method We developed a child-centered methodology to integrate EEG data acquisition and eye-tracking technologies that uses “attention-getters” in which stimulus display is contingent upon the child’s gaze. The goal was to increase the number of trials retained. Additionally, we used the eye-tracker to categorize and analyze the EEG data based on gaze to specific areas of the visual display, compared to analyzing based on stimulus presentation. Results Compared with existing methods, the number of trials retained was substantially improved using the child-centered methodology compared to a button-press response in 7–8 year olds. In contrast, analyzing the EEG based on eye gaze to specific points within the visual display as opposed to stimulus presentation provided too few trials for reliable interpretation. Conclusions By using the linked EEG-eye-tracker we significantly increased data retention. With this method, studies can be completed with fewer participants and a wider range of populations. However, caution should be used when epoching based on participants’ eye gaze because, in this case, this technique provided substantially fewer trials. PMID:25251555
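A schematic sketch of the gaze-contingent logic described above: the attention-getter stays on screen until the child's gaze has dwelled in a central region long enough, after which the stimulus is presented and an EEG event marker is sent. The tracker and EEG interfaces shown here are hypothetical placeholders, not the authors' acquisition software.

```python
# Sketch: gaze-contingent stimulus presentation with an EEG trigger.
import time

def wait_for_gaze_then_present(get_gaze, present_stimulus, send_eeg_trigger,
                               region=(560, 340, 720, 420), dwell_s=0.3):
    """Block until gaze dwells in `region` for `dwell_s`, then show the stimulus."""
    x0, y0, x1, y1 = region
    dwell_start = None
    while True:
        x, y = get_gaze()                      # latest gaze sample in pixels
        if x0 <= x <= x1 and y0 <= y <= y1:
            if dwell_start is None:
                dwell_start = time.monotonic()
            if time.monotonic() - dwell_start >= dwell_s:
                send_eeg_trigger(1)            # mark stimulus onset in the EEG
                present_stimulus()
                return
        else:
            dwell_start = None                 # gaze left the region; restart dwell
        time.sleep(0.01)
```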
Auditory display as feedback for a novel eye-tracking system for sterile operating room interaction.
Black, David; Unger, Michael; Fischer, Nele; Kikinis, Ron; Hahn, Horst; Neumuth, Thomas; Glaser, Bernhard
2018-01-01
The growing number of technical systems in the operating room has increased attention on developing touchless interaction methods for sterile conditions. However, touchless interaction paradigms lack the tactile feedback found in common input devices such as mice and keyboards. We propose a novel touchless eye-tracking interaction system with auditory display as a feedback method for completing typical operating room tasks. Auditory display provides feedback concerning the selected input into the eye-tracking system as well as a confirmation of the system response. An eye-tracking system with a novel auditory display using both earcons and parameter-mapping sonification was developed to allow touchless interaction for six typical scrub nurse tasks. An evaluation with novice participants compared auditory display with visual display with respect to reaction time and a series of subjective measures. When using auditory display to substitute for the lost tactile feedback during eye-tracking interaction, participants exhibit reduced reaction time compared to using visual-only display. In addition, the auditory feedback led to lower subjective workload and higher usefulness and system acceptance ratings. Due to the absence of tactile feedback for eye-tracking and other touchless interaction methods, auditory display is shown to be a useful and necessary addition to new interaction concepts for the sterile operating room, reducing reaction times while improving subjective measures, including usefulness, user satisfaction, and cognitive workload.
Garcia-Zapirain, Begoña; de la Torre Díez, Isabel; López-Coronado, Miguel
2017-07-01
Attention Deficit Hyperactivity Disorder (ADHD) is a brain disorder marked by an ongoing pattern of inattention and/or hyperactivity-impulsivity that interferes with development or functioning. It affects 3-5% of all American and European children. The objective of this paper is to develop and test a dual system for the rehabilitation of cognitive functions in children with ADHD. A technological platform has been developed using the .NET framework, which makes use of two physiological sensors (an eye-tracker and a hand gesture recognition sensor) in order to provide children with the opportunity to develop their learning and attention skills. The two physiological sensors we utilized for the development are the Tobii X1 Light Eye Tracker and the Leap Motion. SUS and QUIS questionnaires were administered. Nineteen users tested the system, with an average age of 10.88 years (SD = 3.14). The results obtained after the tests were quite positive and encouraging. The items on user learning and the interfaces received high scores, with a mean of 7.34 (SD = 1.06) for the SUS questionnaire and 7.73 (SD = 0.6) for the QUIS questionnaire. We didn't find differences between boys and girls. The developed multimodal rehabilitation system can help children with attention deficit and learning issues. Moreover, teachers may utilize this system to track their students' progress and observe their behavior.
Lee, Jun-Hak; Lim, Jeong-Hwan; Hwang, Han-Jeong; Im, Chang-Hwan
2013-01-01
The main goal of this study was to develop a hybrid mental spelling system combining a steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI) technology and a webcam-based eye-tracker, which utilizes information from the brain electrical activity and eye gaze direction at the same time. In the hybrid mental spelling system, a character decoded using SSVEP was not typed if the position of the selected character was not matched with the eye direction information ('left' or 'right') obtained from the eye-tracker. Thus, the users did not need to correct a misspelled character using a 'BACKSPACE' key. To verify the feasibility of the developed hybrid mental spelling system, we conducted online experiments with ten healthy participants. Each participant was asked to type 15 English words consisting of 68 characters. As a result, 16.6 typing errors could be prevented on average, demonstrating that the implemented hybrid mental spelling system could enhance the practicality of our mental spelling system.
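The gating rule the abstract describes can be sketched as follows: a character decoded from SSVEP is accepted only when its position on the keyboard layout (left or right half) agrees with the gaze direction reported by the webcam eye-tracker. The decoder and tracker calls below are hypothetical placeholders.

```python
# Sketch: accept an SSVEP-decoded character only if it agrees with gaze direction.
def type_character(ssvep_decode, gaze_direction, keyboard_side):
    """
    ssvep_decode: () -> candidate character from the SSVEP classifier
    gaze_direction: () -> 'left' or 'right' from the eye-tracker
    keyboard_side: dict mapping character -> 'left' or 'right' half of the layout
    Returns the accepted character, or None if the selection is rejected.
    """
    candidate = ssvep_decode()
    if keyboard_side.get(candidate) == gaze_direction():
        return candidate          # both modalities agree: type it
    return None                   # mismatch: likely SSVEP error, suppress typing

# Hypothetical usage with stubbed components.
layout = {"A": "left", "B": "left", "Y": "right", "Z": "right"}
print(type_character(lambda: "Z", lambda: "right", layout))  # -> 'Z'
print(type_character(lambda: "A", lambda: "right", layout))  # -> None (rejected)
```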
Isaacowitz, Derek M.; Livingstone, Kimberly M.; Harris, Julia A.; Marcotte, Stacy L.
2014-01-01
We report two studies representing the first use of mobile eye tracking to study emotion regulation across adulthood. Past research on age differences in attentional deployment using stationary eye tracking has found older adults show relatively more positive looking, and seem to benefit more mood-wise from this looking pattern, compared to younger adults. However, these past studies have greatly constrained the stimuli participants can look at, despite real-world settings providing numerous possibilities for what to choose to look at. We therefore used mobile eye tracking to study age differences in attentional selection, as indicated by fixation patterns to stimuli of different valence freely chosen by the participant. In contrast to stationary eye tracking studies of attentional deployment, Study 1 showed that younger and older individuals generally selected similar proportions of valenced stimuli, and attentional selection had similar effects on mood across age groups. Study 2 replicated this pattern with an adult lifespan sample including middle-aged individuals. Emotion regulation-relevant attention may thus differ depending on whether stimuli are freely chosen or not. PMID:25527965
A MATLAB-based eye tracking control system using non-invasive helmet head restraint in the macaque.
De Luna, Paolo; Mohamed Mustafar, Mohamed Faiz Bin; Rainer, Gregor
2014-09-30
Tracking eye position is vital for behavioral and neurophysiological investigations in systems and cognitive neuroscience. Infrared camera systems, which are now available, can be used for eye tracking without the need to surgically implant magnetic search coils. These systems are generally employed using rigid head fixation in monkeys, which maintains the eye in a constant position and facilitates eye tracking. We investigate the use of non-rigid head fixation using a helmet that constrains only general head orientation and allows some freedom of movement. We present a MATLAB software solution to gather and process eye position data, present visual stimuli, interact with various devices, provide experimenter feedback and store data for offline analysis. Our software solution achieves excellent timing performance due to the use of data streaming, instead of the traditionally employed data storage mode for processing analog eye position data. We present behavioral data from two monkeys, demonstrating that adequate performance levels can be achieved on a simple fixation paradigm, and show how performance depends on parameters such as fixation window size. Our findings suggest that non-rigid head restraint can be employed for behavioral training and testing on a variety of gaze-dependent visual paradigms, reducing the need for rigid head restraint systems for some applications. While developed for the macaque monkey, our system can of course work equally well for applications in human eye tracking where head constraint is undesirable. Copyright © 2014. Published by Elsevier B.V.
Designs and Algorithms to Map Eye Tracking Data with Dynamic Multielement Moving Objects.
Kang, Ziho; Mandal, Saptarshi; Crutchfield, Jerry; Millan, Angel; McClung, Sarah N
2016-01-01
Design concepts and algorithms were developed to address the eye tracking analysis issues that arise when (1) participants interrogate dynamic multielement objects that can overlap on the display and (2) visual angle error of the eye trackers is incapable of providing exact eye fixation coordinates. These issues were addressed by (1) developing dynamic areas of interests (AOIs) in the form of either convex or rectangular shapes to represent the moving and shape-changing multielement objects, (2) introducing the concept of AOI gap tolerance (AGT) that controls the size of the AOIs to address the overlapping and visual angle error issues, and (3) finding a near optimal AGT value. The approach was tested in the context of air traffic control (ATC) operations where air traffic controller specialists (ATCSs) interrogated multiple moving aircraft on a radar display to detect and control the aircraft for the purpose of maintaining safe and expeditious air transportation. In addition, we show how eye tracking analysis results can differ based on how we define dynamic AOIs to determine eye fixations on moving objects. The results serve as a framework to more accurately analyze eye tracking data and to better support the analysis of human performance. PMID:27725830
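A sketch of the dynamic-AOI idea described above: each moving object gets a rectangular AOI rebuilt every frame from its on-screen elements, expanded by an AOI gap tolerance (AGT) margin to absorb eye-tracker visual-angle error, and fixations are mapped to whichever AOI, if any, contains them. The shapes, element layout, and AGT value are illustrative assumptions rather than the paper's convex-hull variant.

```python
# Sketch: rectangular dynamic AOIs with a gap-tolerance margin and a
# fixation-to-AOI mapping. All coordinates are hypothetical.
def dynamic_rect_aoi(element_points, agt):
    """Bounding rectangle of an object's elements, padded by the AGT margin."""
    xs, ys = zip(*element_points)
    return (min(xs) - agt, min(ys) - agt, max(xs) + agt, max(ys) + agt)

def map_fixation(fix_xy, aois):
    """Return the id of the first AOI containing the fixation, else None."""
    fx, fy = fix_xy
    for obj_id, (x0, y0, x1, y1) in aois.items():
        if x0 <= fx <= x1 and y0 <= fy <= y1:
            return obj_id
    return None

# One display frame: each aircraft is a target symbol plus its data-block corners.
frame_elements = {"AC101": [(200, 300), (230, 280), (255, 265)],
                  "AC202": [(600, 410), (630, 395), (655, 380)]}
aois = {k: dynamic_rect_aoi(v, agt=15) for k, v in frame_elements.items()}
print(map_fixation((240, 270), aois))   # -> 'AC101'
print(map_fixation((500, 200), aois))   # -> None
```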
López-Gil, Juan-Miguel; Virgili-Gomá, Jordi; Gil, Rosa; Guilera, Teresa; Batalla, Iolanda; Soler-González, Jorge; García, Roberto
2016-01-01
Technical advances, particularly the integration of wearable and embedded sensors, facilitate tracking of physiological responses in a less intrusive way. Currently, there are many devices that allow gathering biometric measurements from human beings, such as EEG headsets or health bracelets. The massive data sets generated by tracking of EEG and physiology may be used, among other things, to infer knowledge about human moods and emotions. Apart from direct biometric signal measurement, eye tracking systems are nowadays capable of determining the point of gaze of users interacting in ICT environments, which provides added value to research in many different areas, such as psychology or marketing. We present a process in which devices for eye tracking, biometric, and EEG signal measurements are synchronously used for studying both basic and complex emotions. We selected the least intrusive devices for the different signal data collection given the study requirements and cost constraints, so users would behave in the most natural way possible. On the one hand, we have been able to determine the basic emotions participants were experiencing by means of valence and arousal. On the other hand, a complex emotion such as empathy has also been detected. To validate the usefulness of this approach, a study involving forty-four people has been carried out, where they were exposed to a series of affective stimuli while their EEG activity, biometric signals, and eye position were synchronously recorded to detect self-regulation. The hypothesis of the work was that people who self-regulated would show significantly different results when analyzing their EEG data. Participants were divided into two groups depending on whether Electro Dermal Activity (EDA) data indicated they self-regulated or not. The comparison of the results obtained using different machine learning algorithms for emotion recognition shows that using EEG activity alone as a predictor for self-regulation does not allow properly determining whether a person is self-regulating his or her emotions while watching affective stimuli. However, adequately combining different data sources in a synchronous way to detect emotions makes it possible to overcome the limitations of single detection methods. PMID:27594831
Real-time eye tracking for the assessment of driver fatigue.
Xu, Junli; Min, Jianliang; Hu, Jianfeng
2018-04-01
Eye-tracking is an important approach for collecting evidence of driver fatigue. In this contribution, the authors present a non-intrusive system for evaluating driver fatigue by tracking eye movement behaviours. A real-time eye-tracker was used to monitor participants' eye state and collect eye-movement data, which provide insight into participants' fatigue state during monotonous driving. In this study, ten healthy subjects performed continuous simulated driving for 1-2 h on a driving simulator with eye-state monitoring, and features of fixation time and pupil area were recorded with an eye-movement tracking device. To achieve a good cost-performance ratio and fast computation time, a fuzzy K-nearest neighbour classifier was employed to evaluate and analyse how the fixation duration and pupil area of drivers varied across participants. The findings of this study indicate that there are significant differences in the value distribution of the pupil area between the normal and fatigued driving states. The results also suggest that recognition accuracy under jackknife validation reaches about 89% on average, indicating significant potential for real-time application and the capability of the proposed approach to detect driver fatigue. PMID:29750113
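A minimal sketch of the classification step described above, assuming a Keller-style fuzzy k-nearest-neighbour rule and a toy two-feature training set (fixation duration, pupil area); the data and parameter values are illustrative, not the paper's:

```python
# Hedged sketch of a fuzzy k-nearest-neighbour classifier applied to two eye
# features per window: mean fixation duration and mean pupil area.
import numpy as np

def fuzzy_knn_memberships(X_train, y_train, x, k=5, m=2.0, eps=1e-9):
    d = np.linalg.norm(X_train - x, axis=1)
    nn = np.argsort(d)[:k]
    w = 1.0 / (d[nn] ** (2.0 / (m - 1.0)) + eps)   # inverse-distance weights
    classes = np.unique(y_train)
    u = np.array([w[y_train[nn] == c].sum() for c in classes])
    return classes, u / u.sum()                    # class memberships in [0, 1]

# toy training set: [fixation duration (s), pupil area (a.u.)]; 0 = alert, 1 = fatigued
X = np.array([[0.25, 12.0], [0.30, 11.5], [0.55, 9.0], [0.60, 8.5]])
y = np.array([0, 0, 1, 1])
classes, u = fuzzy_knn_memberships(X, y, np.array([0.50, 9.5]), k=3)
print(dict(zip(classes, np.round(u, 2))))          # higher membership for the fatigued class
```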
Three-Dimensional Eye Tracking in a Surgical Scenario.
Bogdanova, Rositsa; Boulanger, Pierre; Zheng, Bin
2015-10-01
Eye tracking has been widely used in studying the eye behavior of surgeons in the past decade. Most eye-tracking data are reported in a 2-dimensional (2D) fashion, and data describing surgeons' stereoperception behaviors are often missing. With the introduction of stereoscopes in laparoscopic procedures, there is an increasing need to study the depth perception of surgeons under 3D image-guided surgery. We developed a new algorithm for the computation of convergence points in stereovision by measuring surgeons' interpupillary distance, the distance to the view target, and the difference between the gaze locations of the 2 eyes. To test the feasibility of our new algorithm, we recruited 10 individuals to watch stereograms using binocular disparity and asked them to develop stereoperception using a cross-eyed viewing technique. Participants' eye motions were recorded by the Tobii eye tracker while they performed the trials. Convergence points between the normal and stereo-viewing conditions were computed using the developed algorithm. All 10 participants were able to develop stereovision after a short period of training. During stereovision, participants' eye convergence points were 14 ± 1 cm in front of their eyes, which was significantly closer than the convergence points under the normal viewing condition (77 ± 20 cm). By applying our method of calculating convergence points using eye tracking, we were able to elicit the eye movement patterns of human operators between the normal and stereovision conditions. Knowledge from this study can be applied to the design of surgical visual systems, with the goal of improving surgical performance and patient safety. © The Author(s) 2015.
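The convergence-point computation is described above only in terms of its inputs (interpupillary distance, viewing distance, and the separation between the two eyes' gaze points), so the following is a back-of-envelope geometric sketch rather than the authors' exact algorithm:

```python
# With the two eyes separated by the interpupillary distance I and a display at
# viewing distance D, the left- and right-eye gaze rays intersect at a depth
# z = D * I / (I + s), where s is the horizontal separation between the two
# gaze points on the display (s > 0 when the gaze is crossed). Numbers are illustrative.

def convergence_depth(ipd_cm: float, screen_dist_cm: float, gaze_sep_cm: float) -> float:
    """Depth of the binocular convergence point in front of the eyes (cm)."""
    return screen_dist_cm * ipd_cm / (ipd_cm + gaze_sep_cm)

# Crossed viewing of a stereogram: gaze points ~25 cm apart on a screen 77 cm away
print(round(convergence_depth(6.3, 77.0, 25.0), 1))   # ~15.5 cm, of the same order as the reported 14 ± 1 cm
```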
Out of the Corner of My Eye: Foveal Semantic Load Modulates Parafoveal Processing in Reading.
Payne, Brennan R.; Stites, Mallory C.; Federmeier, Kara D.
2016-07-18
In two experiments, we examined the impact of foveal semantic expectancy and congruity on parafoveal word processing during reading. Experiment 1 utilized an eye-tracking gaze-contingent display-change paradigm, and Experiment 2 measured event-related brain potentials (ERP) in a modified RSVP paradigm to track the time-course of foveal semantic influences on covert attentional allocation to parafoveal word processing. Furthermore, eye-tracking and ERP data converged to reveal graded effects of semantic foveal load on parafoveal processing.
Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI
2016-04-01
and we anticipate the initiation of the neuroimaging portion of the study early in Year 3. The fMRI task has been completed and is in beta testing ... neurocognitive test battery, and self-report measures of cognitive efficacy. We will also include functional magnetic resonance imaging (fMRI) and ... fMRI and DTI will provide an objective basis for cross-validating the EEG and eye tracking system. Both the EEG and eye tracking data will be
Our self-tracking movement and health literacy: are we really making every moment count?
Vamos, Sandra; Klein, Klaus
2016-08-03
There is a growing movement related to self-tracking in the quest for better health. Why do so many people like to use 'intelligent tools' like shiny sensors or mobile apps to keep an eye on every move? Do they really help us drive sustained healthy behavioral changes? Despite technological advances and product promises, we must remember that technology alone does not facilitate change to optimize health benefits. The purpose of the commentary is to pose the question: How 'health literate' do we have to be to reap the actionable health benefits of self-tracking? Research has revealed the prevalence of limited health literacy across the globe. Health literacy involves a complex set of inter-connected skills, including acting upon health information. This commentary focuses attention on health literacy as an essential human tool to better equip people to overcome barriers and use devices to their full potential. © The Author(s) 2016.
Processing Control Information in a Nominal Control Construction: An Eye-Tracking Study
ERIC Educational Resources Information Center
Kwon, Nayoung; Sturt, Patrick
2016-01-01
In an eye-tracking experiment, we examined the processing of the nominal control construction. Participants' eye-movements were monitored while they read sentences that included either giver control nominals (e.g. "promise" in "Luke's promise to Sophia to photograph himself") or recipient control nominals (e.g. "plea"…
NMR Spectra through the Eyes of a Student: Eye Tracking Applied to NMR Items
ERIC Educational Resources Information Center
Topczewski, Joseph J.; Topczewski, Anna M.; Tang, Hui; Kendhammer, Lisa K.; Pienta, Norbert J.
2017-01-01
Nuclear magnetic resonance spectroscopy (NMR) plays a key role in introductory organic chemistry, spanning theory, concepts, and experimentation. Therefore, it is imperative that the instruction methods for NMR are both efficient and effective. By utilizing eye tracking equipment, the researchers were able to monitor how second-semester organic…
Optimizations and Applications in Head-Mounted Video-Based Eye Tracking
ERIC Educational Resources Information Center
Li, Feng
2011-01-01
Video-based eye tracking techniques have become increasingly attractive in many research fields, such as visual perception and human-computer interface design. The technique primarily relies on the positional difference between the center of the eye's pupil and the first-surface reflection at the cornea, the corneal reflection (CR). This…
Compensating For Movement Of Eye In Laser Surgery
NASA Technical Reports Server (NTRS)
Juday, Richard D.
1991-01-01
Conceptual system for laser surgery of retina includes subsystem that tracks position of retina. Tracking signal used to control galvanometer-driven mirrors keeping laser aimed at desired spot on retina as eye moves. Alternatively or additionally, indication of position used to prevent firing of laser when eye moved too far from proper aiming position.
Zimmermann, Jan; Vazquez, Yuriria; Glimcher, Paul W; Pesaran, Bijan; Louie, Kenway
2016-09-01
Video-based noninvasive eye trackers are an extremely useful tool for many areas of research. Many open-source eye trackers are available, but current open-source systems are not designed to track eye movements with the temporal resolution required to investigate the mechanisms of oculomotor behavior. Commercial systems are available but employ closed-source hardware and software and are relatively expensive, limiting widespread use. Here we present Oculomatic, an open-source software and modular hardware solution to eye tracking for use in humans and non-human primates. Oculomatic features high temporal resolution (up to 600 Hz), real-time eye tracking with high spatial accuracy (<0.5°), and low system latency (∼1.8 ms, 0.32 ms STD) at relatively low cost. Oculomatic compares favorably to our existing scleral search-coil system while being fully noninvasive. We propose that Oculomatic can support a wide range of research into the properties and neural mechanisms of oculomotor behavior. Copyright © 2016 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Treves, Richard; Viterbo, Paolo; Haklay, Mordechai
2015-01-01
Research into virtual field trips (VFTs) started in the 1990s but, only recently, the maturing technology of devices and networks has made them viable options for educational settings. By considering an experiment, the learning benefits of logging the movement of students within a VFT are shown. The data are visualized by two techniques:…
Understanding Optimal Decision-Making in Wargaming
2013-10-01
beneficial outcomes from wargaming, one of which is a better understanding of the impact of decisions as a part of combat processes. However, using ... under instrument flight rules (IFR) (Bellenkes et al., 1997; Katoh, 1997). Of note, eye-tracking technology also has been applied to investigate ... Neuroscience, 7. Skinner, A., Berka, C., Ohara-Long, L., & Sebrechts, M. (2010). Impact of Virtual Environment Fidelity on Behavioral and
[A tracking function of human eye in microgravity and during readaptation to earth's gravity].
Kornilova, L N
2001-01-01
The paper summarizes results of electro-oculography of all modes of visual tracking: fixational eye movements (saccades), smooth pursuit of linearly, pendulum-like and circularly moving point stimuli, and pursuit of vertically moving foveoretinal optokinetic stimuli, and presents values of thresholds and amplification coefficients of the optokinetic nystagmus during tracking of linear movement of foveoretinal optokinetic stimuli. Investigations were performed aboard the Salyut and Mir space stations with the participation of 31 cosmonauts, of whom 27 made long-term (76 to 438 days) and 4 made short-term (7 to 9 days) missions. It was shown that in space flight the saccadic structure within the tracking reaction does not change; yet corrective movements (additional microsaccades to achieve tracking) appeared in 47% of observations at the onset and in 76% of observations on months 3 to 6 of space flight. After landing, the structure of vertical saccades was found to be altered in half the cosmonauts. Both in and after flight, reverse nystagmus was present along with gaze nystagmus during static saccades in 22% of the observations (7 cosmonauts). The amplitude of tracking of vertically, diagonally or circularly moving stimuli was significantly reduced as time on mission increased. Early in flight (40% of the cosmonauts) and shortly afterwards (21% of the cosmonauts), the structure of the smooth tracking reaction was totally broken up; that is, the eye followed the stimulus with micro- or macrosaccades. The structure of smooth eye tracking recovered on flight days 6-8 and on postflight days 3-4. However, in 46% of the cosmonauts on long-term missions the structure of smooth eye tracking was noted to be disturbed periodically, i.e. smooth tracking was replaced by saccadic tracking.
Via, Riccardo; Fassi, Aurora; Fattori, Giovanni; Fontana, Giulia; Pella, Andrea; Tagaste, Barbara; Riboldi, Marco; Ciocca, Mario; Orecchia, Roberto; Baroni, Guido
2015-05-01
External beam radiotherapy currently represents an important therapeutic strategy for the treatment of intraocular tumors. Accurate target localization and efficient compensation of involuntary eye movements are crucial to avoid deviations in dose distribution with respect to the treatment plan. This paper describes an eye tracking system (ETS) based on noninvasive infrared video imaging. The system was designed for capturing the tridimensional (3D) ocular motion and provides an on-line estimation of intraocular lesions position based on a priori knowledge coming from volumetric imaging. Eye tracking is performed by localizing cornea and pupil centers on stereo images captured by two calibrated video cameras, exploiting eye reflections produced by infrared illumination. Additionally, torsional eye movements are detected by template matching in the iris region of eye images. This information allows estimating the 3D position and orientation of the eye by means of an eye local reference system. By combining ETS measurements with volumetric imaging for treatment planning [computed tomography (CT) and magnetic resonance (MR)], one is able to map the position of the lesion to be treated in local eye coordinates, thus enabling real-time tumor referencing during treatment setup and irradiation. Experimental tests on an eye phantom and seven healthy subjects were performed to assess ETS tracking accuracy. Measurements on phantom showed an overall median accuracy within 0.16 mm and 0.40° for translations and rotations, respectively. Torsional movements were affected by 0.28° median uncertainty. On healthy subjects, the gaze direction error ranged between 0.19° and 0.82° at a median working distance of 29 cm. The median processing time of the eye tracking algorithm was 18.60 ms, thus allowing eye monitoring up to 50 Hz. A noninvasive ETS prototype was designed to perform real-time target localization and eye movement monitoring during ocular radiotherapy treatments. The device aims at improving state-of-the-art invasive procedures based on surgical implantation of radiopaque clips and repeated acquisition of X-ray images, with expected positive effects on treatment quality and patient outcome.
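As an illustrative sketch only (the angle conventions, rotation order, and all numbers below are assumptions, not the ETS calibration), the final referencing step, expressing a planned lesion position in room coordinates once the eye's position and orientation are known, might look like this:

```python
# Once the eye centre and its orientation (gaze azimuth/elevation plus torsion) have
# been estimated, a lesion whose coordinates are known in the eye-local frame from
# CT/MR planning can be expressed in room coordinates.
import numpy as np

def rot_x(a): c, s = np.cos(a), np.sin(a); return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
def rot_y(a): c, s = np.cos(a), np.sin(a); return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
def rot_z(a): c, s = np.cos(a), np.sin(a); return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def lesion_in_room(eye_centre, azimuth, elevation, torsion, lesion_local):
    """Map a lesion position from the eye-local frame to room coordinates (mm)."""
    R = rot_y(azimuth) @ rot_x(elevation) @ rot_z(torsion)   # assumed rotation order
    return eye_centre + R @ lesion_local

eye_centre = np.array([10.0, 35.0, 290.0])     # mm, e.g. from stereo triangulation
lesion_local = np.array([0.0, 4.0, -11.0])     # mm, from the treatment plan
print(lesion_in_room(eye_centre, np.deg2rad(5), np.deg2rad(-2), np.deg2rad(0.3), lesion_local))
```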
Maa, April Y; Wojciechowski, Barbara; Hunt, Kelly J; Dismuke, Clara; Shyu, Jason; Janjua, Rabeea; Lu, Xiaoqin; Medert, Charles M; Lynch, Mary G
2017-04-01
The aging population is at risk of common eye diseases, and routine eye examinations are recommended to prevent visual impairment. Unfortunately, patients are less likely to seek care as they age, which may be the result of significant travel and time burdens associated with going to an eye clinic in person. A new method of eye-care delivery that mitigates distance barriers and improves access was developed to improve screening for potentially blinding conditions. We present the quality data from the early experience (first 13 months) of Technology-Based Eye Care Services (TECS), a novel ophthalmologic telemedicine program. With TECS, a trained ophthalmology technician is stationed in a primary care clinic away from the main hospital. The ophthalmology technician follows a detailed protocol that collects information about the patient's eyes. The information is then interpreted remotely. Patients with possible abnormal findings are scheduled for a face-to-face examination in the eye clinic. Any patient with no known ocular disease who desires a routine eye screening examination is eligible. Technology-Based Eye Care Services was established in 5 primary care clinics in Georgia surrounding the Atlanta Veterans Affairs hospital. Four program operation metrics (patient satisfaction, eyeglass remakes, disease detection, and visit length) and 2 access-to-care metrics (appointment wait time and no-show rate) were tracked. Care was rendered to 2690 patients over the first 13 months of TECS. The program has been met with high patient satisfaction (4.95 of 5). The eyeglass remake rate was 0.59%. Abnormal findings were noted in 36.8% of patients, and there was >90% agreement between the TECS reading and the face-to-face findings of the physician. TECS saved both patient time (25% less) and physician time (50% less), and access to care improved substantially, with 99% of patients seen within 14 days of contacting the eye clinic and a TECS no-show rate of 5.2%. The early experience with TECS has been promising. Tele-ophthalmology has the potential to improve operational efficiency, reduce cost, and significantly improve access to care. Although further study is necessary, TECS shows potential to help prevent avoidable vision loss. Published by Elsevier Inc.
Like a rolling stone: naturalistic visual kinematics facilitate tracking eye movements.
Souto, David; Kerzel, Dirk
2013-02-06
Newtonian physics constrains object kinematics in the real world. We asked whether eye movements towards tracked objects depend on their compliance with those constraints. In particular, the force of gravity constrains round objects to roll on the ground with a particular rotational and translational motion. We measured tracking eye movements towards rolling objects. We found that objects with rotational and translational motion that was congruent with an object rolling on the ground elicited faster tracking eye movements during pursuit initiation than incongruent stimuli. Relative to a condition without a rotational component, we essentially obtained benefits of congruence and, to a lesser extent, costs from incongruence. Anticipatory pursuit responses showed no congruence effect, suggesting that the effect is based on visually driven predictions, not on velocity storage. We suggest that the eye movement system incorporates information about object kinematics acquired by a lifetime of experience with visual stimuli obeying the laws of Newtonian physics.
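For readers who want the kinematic constraint behind the "congruent" condition, a ball rolling without slipping translates at v and spins at omega = v / r; a tiny sketch with made-up stimulus parameters (these are not the values used in the study):

```python
# Rotation rate congruent with rolling, for a stimulus whose translation speed and
# radius are both expressed in degrees of visual angle (small-angle approximation).
import math

def rolling_rotation_deg_per_s(translation_deg_per_s: float, radius_deg: float) -> float:
    omega_rad = translation_deg_per_s / radius_deg   # spin in rad/s
    return math.degrees(omega_rad)

print(round(rolling_rotation_deg_per_s(10.0, 1.5), 1))  # ~382 deg/s of spin for a 1.5 deg radius ball
```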
Methodical aspects of text testing in a driving simulator.
Sundin, A; Patten, C J D; Bergmark, M; Hedberg, A; Iraeus, I-M; Pettersson, I
2012-01-01
A test with 30 participants was conducted in a driving simulator. The test was a concept exploration and comparison of existing user-interaction technologies for text message handling, with a focus on traffic safety and experience (technology familiarity and learning effects). Emphasis was placed on methodological aspects: how to measure and how to analyze the data. Results show difficulties both with the eye tracking system itself (e.g., calibration) and with the subsequent raw-data preparation. The physical setup in the car was found to be important for test completion.
Physiologically Modulating Videogames or Simulations which use Motion-Sensing Input Devices
NASA Technical Reports Server (NTRS)
Pope, Alan T. (Inventor); Stephens, Chad L. (Inventor); Blanson, Nina Marie (Inventor)
2014-01-01
New types of controllers allow players to make inputs to a video game or simulation by moving the entire controller itself. This capability is typically accomplished using a wireless input device having accelerometers, gyroscopes, and an infrared LED tracking camera. The present invention exploits these wireless motion-sensing technologies to modulate the player's movement inputs to the videogame based upon physiological signals. Such biofeedback-modulated video games train valuable mental skills beyond eye-hand coordination. These psychophysiological training technologies enhance personal improvement, not just the diversion, of the user.
NASA Technical Reports Server (NTRS)
Leigh, R. J.; Thurston, S. E.; Sharpe, J. A.; Ranalli, P. J.; Hamid, M. A.
1987-01-01
The effects of deficient labyrinthine function on smooth visual tracking with the eyes and head were investigated, using ten patients with bilateral peripheral vestibular disease and ten normal controls. Active, combined eye-head tracking (EHT) was significantly better in patients than smooth pursuit with the eyes alone, whereas normal subjects pursued equally well in both cases. Compensatory eye movements during active head rotation in darkness were always less in patients than in normal subjects. These data were used to examine current hypotheses that postulate central cancellation of the vestibulo-ocular reflex (VOR) during EHT. A model that proposes summation of an integral smooth pursuit command and VOR/compensatory eye movements is consistent with the findings. Observation of passive EHT (visual fixation of a head-fixed target during en bloc rotation) appears to indicate that in this mode parametric gain changes contribute to modulation of the VOR.
Tian, Shu; Yin, Xu-Cheng; Wang, Zhi-Bin; Zhou, Fang; Hao, Hong-Wei
2015-01-01
Phacoemulsification is one of the most advanced surgeries for treating cataract. However, conventional surgery involves a low level of automation and relies heavily on the surgeon's skill. Alternatively, one envisioned scenario is to use video processing and pattern recognition technologies to automatically detect the cataract grade and intelligently control the release of the ultrasonic energy during the operation. Unlike cataract grading in diagnosis systems with static images, dynamic videos of the surgery introduce complicated backgrounds, unexpected noise, and varied information. Here we develop a Video-Based Intelligent Recognition and Decision (VeBIRD) system, which breaks new ground by providing a generic framework for automatically tracking the operation process and classifying the cataract grade in microscope videos of phacoemulsification cataract surgery. VeBIRD comprises a robust eye (iris) detector based on the randomized Hough transform to precisely locate the eye against the noisy background, an effective probe tracker based on Tracking-Learning-Detection to track the operation probe through the dynamic process, and an intelligent decider based on discriminative learning to recognize the cataract grade in the complicated video. Experiments with a variety of real microscope videos of phacoemulsification verify VeBIRD's effectiveness. PMID:26693249
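A simplified stand-in for the iris-localization step (the paper uses a randomized Hough transform; this sketch uses OpenCV's standard Hough circle detector, and the file name and radius bounds are placeholders):

```python
# Locate the strongest circular edge in a grayscale microscope frame as a rough
# proxy for the iris boundary.
import cv2
import numpy as np

frame = cv2.imread("surgery_frame.png", cv2.IMREAD_GRAYSCALE)  # placeholder path
assert frame is not None, "replace with a real microscope video frame"
blurred = cv2.medianBlur(frame, 5)                             # suppress specular noise
circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.5, minDist=200,
                           param1=120, param2=60, minRadius=80, maxRadius=250)
if circles is not None:
    x, y, r = np.round(circles[0, 0]).astype(int)              # strongest circle ~ iris boundary
    cv2.circle(frame, (x, y), r, 255, 2)
    cv2.imwrite("iris_located.png", frame)
```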
Effects of phencyclidine, secobarbital and diazepam on eye tracking in rhesus monkeys.
Ando, K; Johanson, C E; Levy, D L; Yasillo, N J; Holzman, P S; Schuster, C R
1983-01-01
Rhesus monkeys were trained to track a moving disk using a procedure in which responses on a lever were reinforced with water delivery only when the disk, oscillating in a horizontal plane on a screen at a frequency of 0.4 Hz in a visual angle of 20 degrees, dimmed for a brief period. Pursuit eye movements were recorded by electrooculography (EOG). IM phencyclidine, secobarbital, and diazepam injections decreased the number of reinforced lever presses in a dose-related manner. Both secobarbital and diazepam produced episodic jerky-pursuit eye movements, while phencyclidine had no consistent effects on eye movements. Lever pressing was disrupted at doses which had little effect on the quality of smooth-pursuit eye movements in some monkeys. This separation was particularly pronounced with diazepam. The similarities of the drug effects on smooth-pursuit eye movements between the present study and human studies indicate that the present method using rhesus monkeys may be useful for predicting drug effects on eye tracking and oculomotor function in humans.
Do the Eyes Have It? Using Eye Tracking to Assess Students Cognitive Dimensions
ERIC Educational Resources Information Center
Nisiforou, Efi A.; Laghos, Andrew
2013-01-01
Field dependence/independence (FD/FI) is a significant dimension of cognitive styles. The paper presents results of a study that seeks to identify individuals' level of field independence during visual stimulus tasks processing. Specifically, it examined the relationship between the Hidden Figure Test (HFT) scores and the eye tracking metrics.…
Tracking the Eye Movement of Four Years Old Children Learning Chinese Words
ERIC Educational Resources Information Center
Lin, Dan; Chen, Guangyao; Liu, Yingyi; Liu, Jiaxin; Pan, Jue; Mo, Lei
2018-01-01
Storybook reading is the major source of literacy exposure for beginning readers. The present study tracked 4-year-old Chinese children's eye movements while they were reading simulated storybook pages. Their eye-movement patterns were examined in relation to their word learning gains. The same reading list, consisting of 20 two-character Chinese…
The Role of Face Familiarity in Eye Tracking of Faces by Individuals with Autism Spectrum Disorders
ERIC Educational Resources Information Center
Sterling, Lindsey; Dawson, Geraldine; Webb, Sara; Murias, Michael; Munson, Jeffrey; Panagiotides, Heracles; Aylward, Elizabeth
2008-01-01
It has been shown that individuals with autism spectrum disorders (ASD) demonstrate normal activation in the fusiform gyrus when viewing familiar, but not unfamiliar faces. The current study utilized eye tracking to investigate patterns of attention underlying familiar versus unfamiliar face processing in ASD. Eye movements of 18 typically…
Eye-Tracking as a Measure of Responsiveness to Joint Attention in Infants at Risk for Autism
ERIC Educational Resources Information Center
Navab, Anahita; Gillespie-Lynch, Kristen; Johnson, Scott P.; Sigman, Marian; Hutman, Ted
2012-01-01
Reduced responsiveness to joint attention (RJA), as assessed by the Early Social Communication Scales (ESCS), is predictive of both subsequent language difficulties and autism diagnosis. Eye-tracking measurement of RJA is a promising prognostic tool because it is highly precise and standardized. However, the construct validity of eye-tracking…
Reading Mathematics Representations: An Eye-Tracking Study
ERIC Educational Resources Information Center
Andrá, Chiara; Lindström, Paulina; Arzarello, Ferdinando; Holmqvist, Kenneth; Robutti, Ornella; Sabena, Cristina
2015-01-01
We use eye tracking as a method to examine how different mathematical representations of the same mathematical object are attended to by students. The results of this study show that there is a meaningful difference in the eye movements between formulas and graphs. This difference can be understood in terms of the cultural and social shaping of…
A Critical Test of Temporal and Spatial Accuracy of the Tobii T60XL Eye Tracker
ERIC Educational Resources Information Center
Morgante, James D.; Zolfaghari, Rahman; Johnson, Scott P.
2012-01-01
Infant eye tracking is becoming increasingly popular for its presumed precision relative to traditional looking time paradigms and potential to yield new insights into developmental processes. However, there is strong reason to suspect that the temporal and spatial resolution of popular eye tracking systems is not entirely accurate, potentially…
Graham, Dan J.; Jeffery, Robert W.
2012-01-01
Background Nutrition Facts labels can keep consumers better informed about their diets' nutritional composition, however, consumers currently do not understand these labels well or use them often. Thus, modifying existing labels may benefit public health. Objective The present study tracked the visual attention of individuals making simulated food-purchasing decisions to assess Nutrition Facts label viewing. Primary research questions were how self-reported viewing of Nutrition Facts labels and their components relates to measured viewing and whether locations of labels and specific label components relate to viewing. Design The study involved a simulated grocery shopping exercise conducted on a computer equipped with an eye-tracking camera. A post-task survey assessed self-reported nutrition information viewing, health behaviors, and demographics. Subjects/setting Individuals 18 years old and older and capable of reading English words on a computer (n=203) completed the 1-hour protocol at the University of Minnesota during Spring 2010. Statistical analyses Primary analyses included χ2, analysis of variance, and t tests comparing self-reported and measured viewing of label components in different presentation configurations. Results Self-reported viewing of Nutrition Facts label components was higher than objectively measured viewing. Label components at the top of the label were viewed more than those at the bottom, and labels positioned in the center of the screen were viewed more than those located on the sides. Conclusions Nutrition Facts label position within a viewing area and position of specific components on a label relate to viewing. Eye tracking is a valuable technology for evaluating consumers' attention to nutrition information, informing nutrition labeling policy (eg, front-of-pack labels), and designing labels that best support healthy dietary decisions. PMID:22027053
Chen, Janice D; Falkmer, Torbjörn; Parsons, Richard; Buzzard, Jennifer; Ciccarelli, Marina
2014-05-01
The Rapid Upper Limb Assessment (RULA) is an observation-based screening tool that has been used to assess postural risks of children in school settings. Studies using eye-tracking technology suggest that visual search strategies are influenced by experience in the task performed. This study investigated if experience in postural risk assessments contributed to differences in outcome scores on the RULA and the visual search strategies utilized. While wearing an eye-tracker, 16 student occupational therapists and 16 experienced occupational therapists used the RULA to assess 11 video scenarios of a child using different mobile information and communication technologies (ICT) in the home environment. No significant differences in RULA outcome scores, and no conclusive differences in visual search strategies between groups were found. RULA can be used as a screening tool for postural risks following a short training session regardless of the assessor's experience in postural risk assessments. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Eyes wide shut: implied social presence, eye tracking and attention.
Risko, Evan F; Kingstone, Alan
2011-02-01
People often behave differently when they know they are being watched. Here, we report the first investigation of whether such social presence effects also influence looking behavior--a popular measure of attention allocation. We demonstrate that wearing an eye tracker, an implied social presence, leads individuals to avoid looking at particular stimuli. These results demonstrate that an implied social presence, here an eye tracker, can alter looking behavior. These data provide a new manipulation of social attention, as well as presenting a methodological challenge to researchers using eye tracking.
Dual Use of Image Based Tracking Techniques: Laser Eye Surgery and Low Vision Prosthesis
NASA Technical Reports Server (NTRS)
Juday, Richard D.; Barton, R. Shane
1994-01-01
With a concentration on Fourier optics pattern recognition, we have developed several methods of tracking objects in dynamic imagery to automate certain space applications such as orbital rendezvous and spacecraft capture, or planetary landing. We are developing two of these techniques for Earth applications in real-time medical image processing. The first is warping of a video image, developed to evoke shift invariance to scale and rotation in correlation pattern recognition. The technology is being applied to compensation for certain field defects in low vision humans. The second is using the optical joint Fourier transform to track the translation of unmodeled scenes. Developed as an image fixation tool to assist in calculating shape from motion, it is being applied to tracking motions of the eyeball quickly enough to keep a laser photocoagulation spot fixed on the retina, thus avoiding collateral damage.
Speaking and Listening with the Eyes: Gaze Signaling during Dyadic Interactions
Ho, Simon; Foulsham, Tom; Kingstone, Alan
2015-01-01
Cognitive scientists have long been interested in the role that eye gaze plays in social interactions. Previous research suggests that gaze acts as a signaling mechanism and can be used to control turn-taking behaviour. However, early research on this topic employed methods of analysis that aggregated gaze information across an entire trial (or trials), which masks any temporal dynamics that may exist in social interactions. More recently, attempts have been made to understand the temporal characteristics of social gaze but little research has been conducted in a natural setting with two interacting participants. The present study combines a temporally sensitive analysis technique with modern eye tracking technology to 1) validate the overall results from earlier aggregated analyses and 2) provide insight into the specific moment-to-moment temporal characteristics of turn-taking behaviour in a natural setting. Dyads played two social guessing games (20 Questions and Heads Up) while their eyes were tracked. Our general results are in line with past aggregated data, and using cross-correlational analysis on the specific gaze and speech signals of both participants we found that 1) speakers end their turn with direct gaze at the listener and 2) the listener in turn begins to speak with averted gaze. Convergent with theoretical models of social interaction, our data suggest that eye gaze can be used to signal both the end and the beginning of a speaking turn during a social interaction. The present study offers insight into the temporal dynamics of live dyadic interactions and also provides a new method of analysis for eye gaze data when temporal relationships are of interest. PMID:26309216
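A hedged sketch of the kind of cross-correlational analysis described above, using synthetic binary gaze and speech time series; the sampling rate, lag range, and data are invented for illustration:

```python
# Two binary time series per person (1 = speaking, 1 = gazing at the partner),
# sampled at a common rate, are correlated at a range of lags.
import numpy as np

def lagged_correlation(speech, gaze, max_lag):
    """Pearson correlation between speech shifted by each lag (in samples) and gaze."""
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = speech[lag:], gaze[:len(gaze) - lag]
        else:
            a, b = speech[:lag], gaze[-lag:]
        out[lag] = np.corrcoef(a, b)[0, 1]
    return out

rng = np.random.default_rng(0)
speech = (rng.random(600) > 0.5).astype(float)          # speaking on/off, 600 samples
gaze = np.roll(speech, 30) * (rng.random(600) > 0.2)    # gaze roughly follows speech with a 30-sample lag
corrs = lagged_correlation(speech, gaze, max_lag=60)
print(max(corrs, key=corrs.get))                        # -30: gaze trails speech by ~30 samples here
```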
Prakash, Gaurav; Agarwal, Amar; Kumar, Dhivya Ashok; Jacob, Soosan; Agarwal, Athiya; Maity, Amrita
2011-03-01
To evaluate the visual and refractive outcomes and expected benefits of Tissue Saving Treatment algorithm-guided surface ablation with iris recognition and dynamic rotational eye tracking. This prospective, interventional case series comprised 122 eyes (70 patients). Pre- and postoperative assessment included uncorrected distance visual acuity (UDVA), corrected distance visual acuity (CDVA), refraction, and higher order aberrations. All patients underwent Tissue Saving Treatment algorithm-guided surface ablation with iris recognition and dynamic rotational eye tracking using the Technolas 217z 100-Hz excimer platform (Technolas Perfect Vision GmbH). Follow-up was performed up to 6 months postoperatively. Theoretical benefit analysis was performed to evaluate the algorithm's outcomes compared to others. Preoperative spherocylindrical power was sphere -3.62 ± 1.60 diopters (D) (range: 0 to -6.75 D), cylinder -1.15 ± 1.00 D (range: 0 to -3.50 D), and spherical equivalent -4.19 ± 1.60 D (range: -7.75 to -2.00 D). At 6 months, 91% (111/122) of eyes were within ± 0.50 D of attempted correction. Postoperative UDVA was comparable to preoperative CDVA at 1 month (P=.47) and progressively improved at 6 months (P=.004). Two eyes lost one line of CDVA at 6 months. Theoretical benefit analysis revealed that of 101 eyes with astigmatism, 29 would have had cyclotorsion-induced astigmatism of ≥ 10% if iris recognition and dynamic rotational eye tracking were not used. Furthermore, the mean percentage decrease in maximum depth of ablation by using the Tissue Saving Treatment was 11.8 ± 2.9% over Aspheric, 17.8 ± 6.2% over Personalized, and 18.2 ± 2.8% over Planoscan algorithms. Tissue saving surface ablation with iris recognition and dynamic rotational eye tracking was safe and effective in this series of eyes. Copyright 2011, SLACK Incorporated.
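As a hedged aside on why uncorrected cyclotorsion matters, the standard crossed-cylinder result (textbook optics, not this paper's model) gives a residual astigmatism of about 2·C·sin(θ) when a cylinder of magnitude C is rotated off-axis by θ, i.e. a fraction 2·sin(θ) of the treated cylinder:

```python
# Fraction of the treated cylinder left uncorrected as a function of cyclotorsion angle.
import math

def induced_fraction(theta_deg: float) -> float:
    return 2.0 * math.sin(math.radians(theta_deg))

for theta in (1, 2, 3, 5, 10):
    print(theta, f"{induced_fraction(theta):.0%}")   # the 10% level is reached near 3 degrees of cyclotorsion
```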
ERIC Educational Resources Information Center
Thiessen, Amber; Beukelman, David; Hux, Karen; Longenecker, Maria
2016-01-01
Purpose: The purpose of the study was to compare the visual attention patterns of adults with aphasia and adults without neurological conditions when viewing visual scenes with 2 types of engagement. Method: Eye-tracking technology was used to measure the visual attention patterns of 10 adults with aphasia and 10 adults without neurological…
Lueck, Jennifer A
2017-07-01
Although disproportionately affected by depression, most depressed college students do not seek the help they need. Research has recently uncovered the potential negative effects of depression help-seeking messages if depressed cognition is not considered in the health message design process. It is unclear if depression determines whether and how individuals pay attention to gain- and loss-framed depression help-seeking messages, a mechanism that has significant implications for the strategic planning of health communication interventions. In order to enable the effective matching of message design and audience features, this study investigated attention patterns for gain (n = 75)- and loss (n = 78)-framed depression help-seeking messages using eye-tracking technology and self-report measures. The results confirmed that depression is a characteristic of risk avoidance and negative cognition. Depressed participants tended to pay more attention to disease information that was placed in a loss-framed rather than a gain-framed depression help-seeking message. Using negative message framing strategies for health messages seeking to educate about depression symptoms might therefore be a useful persuasive strategy, particularly when disseminated to vulnerable populations affected by depression. Furthermore, the present study emphasizes the effective use of eye-tracking technology in communication research.
Hepach, Robert; Vaish, Amrisha; Tomasello, Michael
2015-01-01
A central challenge of investigating the underlying mechanisms of and the individual differences in young children’s behavior is the measurement of the internal physiological mechanism and the involved expressive emotions. Here, we illustrate two paradigms that assess concurrent indicators of both children’s social perception as well as their emotional expression. In one set of studies, children view situations while their eye movements are mapped onto a live scene. In these studies, children’s internal arousal is measured via changes in their pupil dilation by using eye tracking technology. In another set of studies, we measured children’s emotional expression via changes in their upper-body posture by using depth sensor imaging technology. Together, these paradigms can provide new insights into the internal mechanism and outward emotional expression involved in young children’s behavior. PMID:26217246
Dissociable Frontal Controls during Visible and Memory-guided Eye-Tracking of Moving Targets
Ding, Jinhong; Powell, David; Jiang, Yang
2009-01-01
When tracking visible or occluded moving targets, several frontal regions including the frontal eye fields (FEF), dorsal-lateral prefrontal cortex (DLPFC), and Anterior Cingulate Cortex (ACC) are involved in smooth pursuit eye movements (SPEM). To investigate how these areas play different roles in predicting future locations of moving targets, twelve healthy college students participated in a smooth pursuit task of visual and occluded targets. Their eye movements and brain responses measured by event-related functional MRI were simultaneously recorded. Our results show that different visual cues resulted in time discrepancies between physical and estimated pursuit time only when the moving dot was occluded. Visible phase velocity gain was higher than that of occlusion phase. We found bilateral FEF association with eye-movement whether moving targets are visible or occluded. However, the DLPFC and ACC showed increased activity when tracking and predicting locations of occluded moving targets, and were suppressed during smooth pursuit of visible targets. When visual cues were increasingly available, less activation in the DLPFC and the ACC was observed. Additionally, there was a significant hemisphere effect in DLPFC, where right DLPFC showed significantly increased responses over left when pursuing occluded moving targets. Correlation results revealed that DLPFC, the right DLPFC in particular, communicates more with FEF during tracking of occluded moving targets (from memory). The ACC modulates FEF more during tracking of visible targets (likely related to visual attention). Our results suggest that DLPFC and ACC modulate FEF and cortical networks differentially during visible and memory-guided eye tracking of moving targets. PMID:19434603
Maa, April Y; Wojciechowski, Barbara; Hunt, Kelly; Dismuke, Clara; Janjua, Rabeea; Lynch, Mary G
2017-01-01
Veterans are at high risk for eye disease because of age and comorbid conditions. Access to eye care is challenging within the entire Veterans Hospital Administration's network of hospitals and clinics in the USA because it is the third busiest outpatient clinical service and growing at a rate of 9% per year. Rural and highly rural veterans face many more barriers to accessing eye care because of distance, cost to travel, and difficulty finding care in the community as many live in medically underserved areas. Also, rural veterans may be diagnosed in later stages of eye disease than their non-rural counterparts due to lack of access to specialty care. In March 2015, Technology-based Eye Care Services (TECS) was launched from the Atlanta Veterans Affairs (VA) as a quality improvement project to provide eye screening services for rural veterans. By tracking multiple measures including demographic and access to care metrics, data shows that TECS significantly improved access to care, with 33% of veterans receiving same-day access and >98% of veterans receiving an appointment within 30 days of request. TECS also provided care to a significant percentage of homeless veterans, 10.6% of the patients screened. Finally, TECS reduced healthcare costs, saving the VA up to US$148 per visit and approximately US$52 per patient in round trip travel reimbursements when compared to completing a face-to-face exam at the medical center. Overall savings to the VA system in this early phase of TECS totaled US$288,400, about US$41,200 per month. Other healthcare facilities may be able to use a similar protocol to extend care to at-risk patients.
Remote gaze tracking system for 3D environments.
Congcong Liu; Herrup, Karl; Shi, Bertram E
2017-07-01
Eye tracking systems are typically divided into two categories: remote and mobile. Remote systems, where the eye tracker is located near the object being viewed by the subject, have the advantage of being less intrusive, but are typically used for tracking gaze points on fixed two-dimensional (2D) computer screens. Mobile systems such as eye tracking glasses, where the eye tracker is attached to the subject, are more intrusive, but are better suited for cases where subjects are viewing objects in the three-dimensional (3D) environment. In this paper, we describe how remote gaze tracking systems developed for 2D computer screens can be used to track gaze points in a 3D environment. The system is non-intrusive. It compensates for small head movements by the user, so that the head need not be stabilized by a chin rest or bite bar. The system maps the 3D gaze points of the user onto 2D images from a scene camera and is also located remotely from the subject. Measurement results from this system indicate that it is able to estimate gaze points in the scene camera to within one degree over a wide range of head positions.
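A minimal sketch, assuming a calibrated pinhole scene camera with placeholder intrinsics (not the paper's calibration), of mapping an estimated 3D gaze point, already expressed in the scene camera's frame, onto the scene-camera image:

```python
# Project a 3D gaze point (metres, camera frame, z > 0) to pixel coordinates.
import numpy as np

K = np.array([[900.0,   0.0, 640.0],     # fx, skew, cx (pixels)
              [  0.0, 900.0, 360.0],     # fy, cy
              [  0.0,   0.0,   1.0]])

def project_gaze(gaze_cam_xyz):
    p = K @ np.asarray(gaze_cam_xyz, dtype=float)
    return p[:2] / p[2]

print(project_gaze([0.10, -0.05, 1.2]))   # gaze point 1.2 m ahead, slightly to the right and above centre
```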
Bodala, Indu P; Abbasi, Nida I; Yu Sun; Bezerianos, Anastasios; Al-Nashash, Hasan; Thakor, Nitish V
2017-07-01
Eye tracking offers a practical solution for monitoring cognitive performance in real-world tasks. However, eye tracking in dynamic environments is difficult due to high spatial and temporal variation of stimuli and needs further, thorough investigation. In this paper, we study the possibility of developing a novel computer-vision-assisted eye tracking analysis based on fixations. Eye movement data were obtained from a long-duration naturalistic driving experiment. The scale-invariant feature transform (SIFT) algorithm was implemented using the VLFeat toolbox to identify multiple areas of interest (AOIs). A new measure called 'fixation score' was defined to capture the dynamics of fixation position between the target AOI and the non-target AOIs. The fixation score is maximal when subjects focus on the target AOI and diminishes when they gaze at non-target AOIs. A statistically significant negative correlation was found between fixation score and reaction time data (r = -0.2253, p < 0.05). This implies that with vigilance decrement, the fixation score decreases as visual attention shifts away from the target objects, resulting in an increase in reaction time.
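The paper's exact fixation-score definition is not reproduced above, so the following is one plausible, heavily simplified formulation (the decay constant and AOI geometry are assumptions introduced here, not the authors' definition):

```python
# Score a fixation as 1 when it lands inside the matched target AOI and let it
# decay towards 0 with distance from that AOI, so gazing at non-target regions
# yields low scores.
import math

def fixation_score(fix_xy, target_centre, target_radius, decay=200.0):
    d = math.dist(fix_xy, target_centre)
    if d <= target_radius:
        return 1.0
    return math.exp(-(d - target_radius) / decay)   # smooth fall-off outside the target AOI

print(fixation_score((420, 300), (400, 310), 60))   # 1.0: on the target
print(fixation_score((900, 120), (400, 310), 60))   # small: gazing at a non-target region
```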
Online Learners’ Reading Ability Detection Based on Eye-Tracking Sensors
Zhan, Zehui; Zhang, Lei; Mei, Hu; Fong, Patrick S. W.
2016-01-01
The detection of university online learners’ reading ability is generally problematic and time-consuming. Thus the eye-tracking sensors have been employed in this study, to record temporal and spatial human eye movements. Learners’ pupils, blinks, fixation, saccade, and regression are recognized as primary indicators for detecting reading abilities. A computational model is established according to the empirical eye-tracking data, and applying the multi-feature regularization machine learning mechanism based on a Low-rank Constraint. The model presents good generalization ability with an error of only 4.9% when randomly running 100 times. It has obvious advantages in saving time and improving precision, with only 20 min of testing required for prediction of an individual learner’s reading ability. PMID:27626418
Eye Tracking of Occluded Self-Moved Targets: Role of Haptic Feedback and Hand-Target Dynamics.
Danion, Frederic; Mathew, James; Flanagan, J Randall
2017-01-01
Previous studies on smooth pursuit eye movements have shown that humans can continue to track the position of their hand, or a target controlled by the hand, after it is occluded, thereby demonstrating that arm motor commands contribute to the prediction of target motion driving pursuit eye movements. Here, we investigated this predictive mechanism by manipulating both the complexity of the hand-target mapping and the provision of haptic feedback. Two hand-target mappings were used, either a rigid (simple) one in which hand and target motion matched perfectly or a nonrigid (complex) one in which the target behaved as a mass attached to the hand by means of a spring. Target animation was obtained by asking participants to oscillate a lightweight robotic device that provided (or not) haptic feedback consistent with the target dynamics. Results showed that as long as 7 s after target occlusion, smooth pursuit continued to be the main contributor to total eye displacement (∼60%). However, the accuracy of eye-tracking varied substantially across experimental conditions. In general, eye-tracking was less accurate under the nonrigid mapping, as reflected by higher positional and velocity errors. Interestingly, haptic feedback helped to reduce the detrimental effects of target occlusion when participants used the nonrigid mapping, but not when they used the rigid one. Overall, we conclude that the ability to maintain smooth pursuit in the absence of visual information can extend to complex hand-target mappings, but the provision of haptic feedback is critical for the maintenance of accurate eye-tracking performance.
Human-like object tracking and gaze estimation with PKD android
Wijayasinghe, Indika B.; Miller, Haylie L.; Das, Sumit K; Bugnariu, Nicoleta L.; Popa, Dan O.
2018-01-01
As the use of robots increases for tasks that require human-robot interactions, it is vital that robots exhibit and understand human-like cues for effective communication. In this paper, we describe the implementation of object tracking capability on the Philip K. Dick (PKD) android and a gaze tracking algorithm, both of which further robot capabilities with regard to human communication. PKD's ability to track objects with human-like head postures is achieved with visual feedback from a Kinect system and an eye camera. The goal of object tracking with human-like gestures is twofold: to facilitate better human-robot interactions and to enable PKD as a human gaze emulator for future studies. The gaze tracking system employs a mobile eye tracking system (ETG; SensoMotoric Instruments) and a motion capture system (Cortex; Motion Analysis Corp.) for tracking the head orientations. Objects to be tracked are displayed by a virtual reality system, the Computer Assisted Rehabilitation Environment (CAREN; MotekForce Link). The gaze tracking algorithm converts eye tracking data and head orientations to gaze information facilitating two objectives: to evaluate the performance of the object tracking system for PKD and to use the gaze information to predict the intentions of the user, enabling the robot to understand physical cues by humans. PMID:29416193
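A minimal sketch of the core composition in the gaze-tracking algorithm described above: head orientation (here reduced to a single yaw angle, as might come from motion capture) is combined with the eye-in-head azimuth from the eye-tracking glasses to give a gaze direction in the room frame. The yaw-only simplification and the angle values are assumptions for illustration.

```python
import numpy as np

def gaze_direction(head_yaw_deg, eye_azimuth_deg):
    """Unit gaze vector in the horizontal plane (x to the right, y forward),
    obtained by adding the eye-in-head azimuth to the head yaw."""
    total = np.radians(head_yaw_deg + eye_azimuth_deg)
    return np.array([np.sin(total), np.cos(total)])

# Head turned 20 degrees to the right while the eyes look 5 degrees left
print(gaze_direction(20.0, -5.0))
```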
ERIC Educational Resources Information Center
Vilppu, Henna; Mikkilä-Erdmann, Mirjamaija; Södervik, Ilona; Österholm-Matikainen, Erika
2017-01-01
This study used the eye-tracking method to explore how the level of expertise influences reading, and solving, two written patient cases on cardiac failure and pulmonary embolus. Eye-tracking is a fairly commonly used method in medical education research, but it has been primarily applied to studies analyzing the processing of visualizations, such…
ERIC Educational Resources Information Center
Godfroid, Aline; Boers, Frank; Housen, Alex
2013-01-01
This eye-tracking study tests the hypothesis that more attention leads to more learning, following claims that attention to new language elements in the input results in their initial representation in long-term memory (i.e., intake; Robinson, 2003; Schmidt, 1990, 2001). Twenty-eight advanced learners of English read English texts that contained…
Design of a virtual reality based adaptive response technology for children with autism.
Lahiri, Uttama; Bekele, Esubalew; Dohrmann, Elizabeth; Warren, Zachary; Sarkar, Nilanjan
2013-01-01
Children with autism spectrum disorder (ASD) demonstrate potent impairments in social communication skills including atypical viewing patterns during social interactions. Recently, several assistive technologies, particularly virtual reality (VR), have been investigated to address specific social deficits in this population. Some studies have coupled eye-gaze monitoring mechanisms to design intervention strategies. However, presently available systems are designed primarily to chain learning via aspects of one's performance alone, which affords a restricted range of individualization. The presented work seeks to bridge this gap by developing a novel VR-based interactive system with gaze-sensitive adaptive response technology that can seamlessly integrate VR-based tasks with eye-tracking techniques to intelligently facilitate engagement in tasks relevant to advancing social communication skills. Specifically, such a system is capable of objectively identifying and quantifying one's engagement level by measuring real-time viewing patterns, subtle changes in eye physiological responses, as well as performance metrics in order to adaptively respond in an individualized manner to foster improved social communication skills among the participants. The developed system was tested through a usability study with eight adolescents with ASD. The results indicate the potential of the system to promote improved social task performance along with socially-appropriate mechanisms during VR-based social conversation tasks.
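A toy sketch of the kind of gaze-sensitive adaptation loop described above: engagement is estimated from the fraction of recent gaze samples that fall on a socially relevant region, and the next task prompt is chosen accordingly. The threshold, the binary gaze flags, and the two-level adaptation are illustrative assumptions, not the system's actual logic.

```python
def next_prompt(gaze_on_face_flags, threshold=0.6):
    """Pick the next conversation prompt from a simple engagement estimate:
    the fraction of recent gaze samples that landed on the partner's face."""
    engagement = sum(gaze_on_face_flags) / max(len(gaze_on_face_flags), 1)
    return "advance_topic" if engagement >= threshold else "redirect_attention"

# Three of five recent samples on the face -> engagement 0.6 -> advance_topic
print(next_prompt([True, True, False, True, False]))
```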
Miller, Haylie L.; Bugnariu, Nicoleta; Patterson, Rita M.; Wijayasinghe, Indika; Popa, Dan O.
2018-01-01
Visuomotor integration (VMI), the use of visual information to guide motor planning, execution, and modification, is necessary for a wide range of functional tasks. To comprehensively, quantitatively assess VMI, we developed a paradigm integrating virtual environments, motion-capture, and mobile eye-tracking. Virtual environments enable tasks to be repeatable, naturalistic, and varied in complexity. Mobile eye-tracking and minimally-restricted movement enable observation of natural strategies for interacting with the environment. This paradigm yields a rich dataset that may inform our understanding of VMI in typical and atypical development. PMID:29876370
Eye Tracking Metrics for Workload Estimation in Flight Deck Operation
NASA Technical Reports Server (NTRS)
Ellis, Kyle; Schnell, Thomas
2010-01-01
Flight decks of the future are being enhanced through improved avionics that adapt to both aircraft and operator state. Eye tracking allows for non-invasive analysis of pilot eye movements, from which a set of metrics can be derived to effectively and reliably characterize workload. This research identifies eye tracking metrics that correlate to aircraft automation conditions, and identifies the correlation of pilot workload to the same automation conditions. Saccade length was used as an indirect index of pilot workload: pilots in the fully automated condition were observed to have, on average, larger saccadic movements than in the guidance and manual flight conditions. The data set also provides a general model of eye movement behavior, and thus of visual attention distribution in the cockpit, for approach-to-land tasks at various levels of automation, using the same metrics applied to workload algorithm development.
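A minimal sketch of the saccade-length metric used above, computed here as the mean Euclidean distance between consecutive fixation centroids; the fixation list and the degree units are made-up assumptions.

```python
import numpy as np

def mean_saccade_length(fixations):
    """Average Euclidean distance between consecutive fixation centroids,
    a simple proxy for the saccade-length workload index."""
    pts = np.asarray(fixations, dtype=float)
    return float(np.mean(np.linalg.norm(np.diff(pts, axis=0), axis=1)))

# Hypothetical fixation centroids (degrees of visual angle) for one approach
print(mean_saccade_length([(0, 0), (4, 1), (9, 3), (2, 2)]))
```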
Accounting for direction and speed of eye motion in planning visually guided manual tracking.
Leclercq, Guillaume; Blohm, Gunnar; Lefèvre, Philippe
2013-10-01
Accurate motor planning in a dynamic environment is a critical skill for humans because we are often required to react quickly and adequately to the visual motion of objects. Moreover, we are often in motion ourselves, and this complicates motor planning. Indeed, the retinal and spatial motions of an object are different because of the retinal motion component induced by self-motion. Many studies have investigated motion perception during smooth pursuit and concluded that eye velocity is partially taken into account by the brain. Here we investigate whether the eye velocity during ongoing smooth pursuit is taken into account for the planning of visually guided manual tracking. We had 10 human participants manually track a target while in steady-state smooth pursuit toward another target such that the difference between the retinal and spatial target motion directions could be large, depending on both the direction and the speed of the eye. We used a measure of initial arm movement direction to quantify whether motor planning occurred in retinal coordinates (not accounting for eye motion) or was spatially correct (incorporating eye velocity). Results showed that the eye velocity was nearly fully taken into account by the neuronal areas involved in the visuomotor velocity transformation (between 75% and 102%). In particular, these neuronal pathways accounted for the nonlinear effects due to the relative velocity between the target and the eye. In conclusion, the brain network transforming visual motion into a motor plan for manual tracking adequately uses extraretinal signals about eye velocity.
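The decomposition underlying this study can be written as retinal target velocity = spatial target velocity minus eye velocity, with a compensation gain g describing how much of the eye velocity is added back when planning the arm movement (g = 0 would be a purely retinal plan, g = 1 a fully spatial one; the study estimates 75-102%). The toy velocities below are assumptions used only to show how the planned direction shifts with g.

```python
import numpy as np

target_spatial = np.array([8.0, 0.0])   # spatial target velocity, deg/s
eye_velocity = np.array([0.0, 6.0])     # pursuit of the other target, deg/s
retinal = target_spatial - eye_velocity # what lands on the retina

for g in (0.0, 0.75, 1.0):              # 0 = retinal plan, 1 = fully spatial
    planned = retinal + g * eye_velocity
    angle = np.degrees(np.arctan2(planned[1], planned[0]))
    print(f"g = {g:.2f}: planned direction {angle:5.1f} deg")
```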
Pan, Weiyuan; Jung, Dongwook; Yoon, Hyo Sik; Lee, Dong Eun; Naqvi, Rizwan Ali; Lee, Kwan Woo; Park, Kang Ryoung
2016-08-31
Gaze tracking is the technology that identifies the region in space a user is looking at. Most previous non-wearable gaze tracking systems use a near-infrared (NIR) light camera with an NIR illuminator. Depending on the camera lens used, the viewing angle and depth-of-field (DOF) of a gaze tracking camera can differ, which affects the performance of the gaze tracking system. Nevertheless, to the best of our knowledge, most previous studies implemented gaze tracking cameras without ground truth information for determining the optimal viewing angle and DOF of the camera lens. Eye-tracker manufacturers may also use ground truth information, but they do not make it public. Therefore, researchers and developers cannot refer to such information when implementing a gaze tracking system. We address this problem with an empirical study in which we design an optimal gaze tracking camera based on experimental measurements of the amount and velocity of users' head movements. Based on our results and analyses, researchers and developers should be able to implement an optimal gaze tracking system more easily. Experimental results show that our gaze tracking system achieves high performance in terms of accuracy, user convenience and interest.
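A back-of-the-envelope sketch of the kind of ground-truth sizing the authors advocate: given a measured lateral head-movement range and a working distance, the minimum horizontal viewing angle that keeps the eye region in frame follows from simple geometry. The numbers (head range, margin, distance) are assumptions, not the paper's measurements.

```python
import math

def required_viewing_angle(head_range_mm, margin_mm, distance_mm):
    """Horizontal field of view needed to keep the eye region in frame while
    the head moves laterally by head_range_mm at distance_mm from the camera."""
    half_width = head_range_mm / 2 + margin_mm
    return math.degrees(2 * math.atan(half_width / distance_mm))

# e.g. +/-10 cm of head movement plus a 3 cm margin at 70 cm from the camera
print(f"{required_viewing_angle(200, 30, 700):.1f} degrees")
```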
A holographic waveguide based eye tracker
NASA Astrophysics Data System (ADS)
Liu, Changgeng; Pazzucconi, Beatrice; Liu, Juan; Liu, Lei; Yao, Xincheng
2018-02-01
We demonstrated the feasibility of using a holographic waveguide for eye tracking. A custom-built holographic waveguide, a 20 mm x 60 mm x 3 mm flat glass substrate with integrated in- and out-couplers, was used for the prototype development. The in- and out-couplers, photopolymer films with holographic fringes, induced total internal reflection in the glass substrate. Diffractive optical elements were integrated into the in-coupler to serve as an optical collimator. The waveguide captured images of the anterior segment of the eye right in front of it and guided the images to a processing unit distant from the eye. The vector connecting the pupil center (PC) and the corneal reflex (CR) of the eye was used to compute eye position in the socket. An eye model, made of a high-quality prosthetic eye, was used for prototype validation. The benchtop prototype demonstrated a linear relationship between the angular eye position and the PC/CR vector over a range of 60 horizontal degrees and 30 vertical degrees at a resolution of 0.64-0.69 degrees/pixel by simple pixel count. The uncertainties of the measurements at different angular positions were within 1.2 pixels, which indicated that the prototype exhibited a high level of repeatability. These results confirmed that the holographic waveguide technology could be a feasible platform for developing a wearable eye tracker. Further development can lead to a compact, see-through eye tracker, which allows continuous monitoring of eye movement during real life tasks, and thus benefits diagnosis of oculomotor disorders.
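A minimal sketch of the PC/CR step described above, assuming a simple linear calibration from the pupil-center-minus-corneal-reflex offset (in pixels) to horizontal gaze angle (in degrees); the calibration points are made up, chosen only so the slope lands near the reported 0.64-0.69 degrees/pixel.

```python
import numpy as np

# Hypothetical calibration: PC-CR offsets recorded while the eye model is set
# to known horizontal angles, then a linear fit gives degrees per pixel.
pc_cr_px = np.array([-30.0, -15.0, 0.0, 15.0, 30.0])   # PC-CR offsets (px)
gaze_deg = np.array([-20.0, -10.0, 0.0, 10.0, 20.0])   # known target angles

slope, intercept = np.polyfit(pc_cr_px, gaze_deg, 1)
print(f"{slope:.2f} deg/pixel")                         # ~0.67 by construction
print(f"estimated gaze at PC-CR = 9 px: {slope * 9 + intercept:.1f} deg")
```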
García-Blanco, Ana; Salmerón, Ladislao; Perea, Manuel; Livianos, Lorenzo
2014-03-30
Attentional biases toward emotional information may represent vulnerability and maintenance factors in bipolar disorder (BD). The present experimental study examined the processing of emotional information in BD patients using the eye-tracking technology. Bipolar patients in their different states (euthymia, mania, depression) simultaneously viewed four pictures with different emotional valence (happy, neutral, sad, threatening) for 20s while their eye movements were monitored. A group of healthy individuals served as the control. The data revealed the following: (i) a decrease in attention to happy images in BD patients in their depressive episodes compared to healthy individuals, and (ii) an increase in attention to threatening images in BD patients (regardless of their episode) relative to the healthy controls. These biases appeared in the late stages of information processing and were sustained over the 20s interval. Thus, the present findings reveal that attentional biases toward emotional information can be a key feature of BD, in that: (i) an anhedonic lack of sensitivity to positive stimuli during the bipolar depressive episode may be considered a maintaining factor of this clinical state, and (ii) the trait-bias toward threat, even in asymptomatic patients, may reflect a marker of vulnerability in BD. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Weiner, A. M.; Gundy, J.; Brown-Bertold, B.; Yates, H.; Dobler, J. T.
2017-12-01
Since their introduction, geostationary weather satellites have enabled us to track hurricane life-cycle movement from development to dissipation. During the 2017 hurricane season, the new GOES-16 geostationary satellite demonstrated just how far we have progressed technologically in geostationary satellite imaging, with hurricane imagery showing never-before-seen detail of the hurricane eye and eyewall structure and life cycle. In addition, new ground system technology, leveraging high-performance computing, delivered imagery and data to forecasters with unprecedented speed—and with updates as often as every 30 seconds. As additional satellites and new products become operational, forecasters will be able to track hurricanes with even greater accuracy and assist in aftermath evaluations. This presentation will present glimpses into the past, a look at the present, and a prediction for the future utilization of geostationary satellites with respect to all facets of hurricane support.
Human-Computer Interaction in Smart Environments
Paravati, Gianluca; Gatteschi, Valentina
2015-01-01
Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.
Eye-tracking-based assessment of cognitive function in low-resource settings.
Forssman, Linda; Ashorn, Per; Ashorn, Ulla; Maleta, Kenneth; Matchado, Andrew; Kortekangas, Emma; Leppänen, Jukka M
2017-04-01
Early development of neurocognitive functions in infants can be compromised by poverty, malnutrition and lack of adequate stimulation. Optimal management of neurodevelopmental problems in infants requires assessment tools that can be used early in life, and are objective and applicable across economic, cultural and educational settings. The present study examined the feasibility of infrared eye tracking as a novel and highly automated technique for assessing visual-orienting and sequence-learning abilities as well as attention to facial expressions in young (9-month-old) infants. Techniques piloted in a high-resource laboratory setting in Finland (N=39) were subsequently field-tested in a community health centre in rural Malawi (N=40). Parents' perception of the acceptability of the method (Finland 95%, Malawi 92%) and percentages of infants completing the whole eye-tracking test (Finland 95%, Malawi 90%) were high, and percentages of valid test trials (Finland 69-85%, Malawi 68-73%) satisfactory at both sites. Test completion rates were slightly higher for eye tracking (90%) than for traditional observational tests (87%) in Malawi. The predicted response pattern indicative of specific cognitive function was replicated in Malawi, but Malawian infants exhibited lower response rates and slower processing speed across tasks. High test completion rates and the replication of the predicted test patterns in a novel environment in Malawi support the feasibility of eye tracking as a technique for assessing infant development in low-resource settings. Further research is needed on the test-retest stability and predictive validity of the eye-tracking scores in low-income settings. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Practical low-cost stereo head-mounted display
NASA Astrophysics Data System (ADS)
Pausch, Randy; Dwivedi, Pramod; Long, Allan C., Jr.
1991-08-01
A high-resolution head-mounted display has been developed from substantially cheaper components than previous systems. The monochrome displays provide 720 by 280 pixels to each eye in a one-inch-square region positioned approximately one inch from each eye. The display hardware is the Private Eye, manufactured by Reflection Technologies, Inc. The tracking system uses the Polhemus Isotrak, providing (x, y, z, azimuth, elevation and roll) information on the user's head position and orientation 60 times per second. In combination with a modified Nintendo Power Glove, this system provides a full-functionality virtual reality/simulation system. Using two host 80386 computers, real-time wire frame images can be produced. Other virtual reality systems require roughly $250,000 in hardware, while this one requires only $5,000. Stereo is particularly useful for this system because shading or occlusion cannot be used as depth cues.
ERIC Educational Resources Information Center
Schneider, Bertrand; Pea, Roy
2014-01-01
We describe preliminary applications of network analysis techniques to eye-tracking data collected during a collaborative learning activity. This paper makes three contributions: first, we visualize collaborative eye-tracking data as networks, where the nodes of the graph represent fixations and edges represent saccades. We found that those…
Eye Tracking Dysfunction in Schizophrenia: Characterization and Pathophysiology
Sereno, Anne B.; Gooding, Diane C.; O'Driscoll, Gillian A.
2011-01-01
Eye tracking dysfunction (ETD) is one of the most widely replicated behavioral deficits in schizophrenia and is over-represented in clinically unaffected first-degree relatives of schizophrenia patients. Here, we provide an overview of research relevant to the characterization and pathophysiology of this impairment. Deficits are most robust in the maintenance phase of pursuit, particularly during the tracking of predictable target movement. Impairments are also found in pursuit initiation and correlate with performance on tests of motion processing, implicating early sensory processing of motion signals. Taken together, the evidence suggests that ETD involves higher-order structures, including the frontal eye fields, which adjust the gain of the pursuit response to visual and anticipated target movement, as well as early parts of the pursuit pathway, including motion areas (the middle temporal area and the adjacent medial superior temporal area). Broader application of localizing behavioral paradigms in patient and family studies would be advantageous for refining the eye tracking phenotype for genetic studies. PMID:21312405
Raghunath, Vignesh; Braxton, Melissa O.; Gagnon, Stephanie A.; Brunyé, Tad T.; Allison, Kimberly H.; Reisch, Lisa M.; Weaver, Donald L.; Elmore, Joann G.; Shapiro, Linda G.
2012-01-01
Context: Digital pathology has the potential to dramatically alter the way pathologists work, yet little is known about pathologists’ viewing behavior while interpreting digital whole slide images. While tracking pathologist eye movements when viewing digital slides may be the most direct method of capturing pathologists’ viewing strategies, this technique is cumbersome and technically challenging to use in remote settings. Tracking pathologist mouse cursor movements may serve as a practical method of studying digital slide interpretation, and mouse cursor data may illuminate pathologists’ viewing strategies and time expenditures in their interpretive workflow. Aims: To evaluate the utility of mouse cursor movement data, in addition to eye-tracking data, in studying pathologists’ attention and viewing behavior. Settings and Design: Pathologists (N = 7) viewed 10 digital whole slide images of breast tissue that were selected using a random stratified sampling technique to include a range of breast pathology diagnoses (benign/atypia, carcinoma in situ, and invasive breast cancer). A panel of three expert breast pathologists established a consensus diagnosis for each case using a modified Delphi approach. Materials and Methods: Participants’ foveal vision was tracked using SensoMotoric Instruments RED 60 Hz eye-tracking system. Mouse cursor movement was tracked using a custom MATLAB script. Statistical Analysis Used: Data on eye-gaze and mouse cursor position were gathered at fixed intervals and analyzed using distance comparisons and regression analyses by slide diagnosis and pathologist expertise. Pathologists’ accuracy (defined as percent agreement with the expert consensus diagnoses) and efficiency (accuracy and speed) were also analyzed. Results: Mean viewing time per slide was 75.2 seconds (SD = 38.42). Accuracy (percent agreement with expert consensus) by diagnosis type was: 83% (benign/atypia); 48% (carcinoma in situ); and 93% (invasive). Spatial coupling was close between eye-gaze and mouse cursor positions (highest frequency ∆x was 4.00px (SD = 16.10), and ∆y was 37.50px (SD = 28.08)). Mouse cursor position moderately predicted eye gaze patterns (Rx = 0.33 and Ry = 0.21). Conclusions: Data detailing mouse cursor movements may be a useful addition to future studies of pathologists’ accuracy and efficiency when using digital pathology. PMID:23372984
Cercenelli, Laura; Tiberi, Guido; Corazza, Ivan; Giannaccare, Giuseppe; Fresina, Michela; Marcelli, Emanuela
2017-01-01
Many open source software packages have recently been developed to expand the usability of eye tracking systems for studying oculomotor behavior, but none of these is specifically designed to encompass all the main functions required for creating eye tracking tests and providing automatic analysis of saccadic eye movements. The aim of this study is to introduce SacLab, an intuitive, freely available MATLAB toolbox based on Graphical User Interfaces (GUIs) that we have developed to increase the usability of the ViewPoint EyeTracker (Arrington Research, Scottsdale, AZ, USA) in clinical ophthalmology practice. SacLab consists of four processing modules that enable the user to easily create visual stimuli tests (Test Designer), record saccadic eye movements (Data Recorder), analyze the recorded data to automatically extract saccadic parameters of clinical interest (Data Analyzer) and provide an aggregate analysis from multiple eye movement recordings (Saccade Analyzer), without requiring any programming effort by the user. A demo application of SacLab to carry out eye tracking tests for the analysis of horizontal saccades is reported. We tested the usability of the SacLab toolbox with three ophthalmologists who had no programming experience; the ophthalmologists were briefly trained in the use of the SacLab GUIs and were asked to perform the demo application. The toolbox received enthusiastic feedback from all the clinicians in terms of intuitiveness, ease of use and flexibility. Test creation and data processing were accomplished in 52±21 s and 46±19 s, respectively, using the SacLab GUIs. SacLab may represent a useful tool to ease the application of the ViewPoint EyeTracker system in clinical routine in ophthalmology. Copyright © 2016 Elsevier Ltd. All rights reserved.
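SacLab itself is a MATLAB toolbox; the sketch below only illustrates, in Python, the kind of velocity-threshold saccade detection and parameter extraction (onset, amplitude, peak velocity) that a module like its Data Analyzer automates. The 30 deg/s threshold and the synthetic position trace are assumptions, not SacLab's actual algorithm.

```python
import numpy as np

def detect_saccades(t, x, vel_thresh=30.0):
    """Return onset time, amplitude and peak velocity of supra-threshold
    eye-velocity episodes in a horizontal position trace x(t) (degrees)."""
    v = np.gradient(x, t)
    moving = np.abs(v) > vel_thresh
    saccades, i = [], 0
    while i < len(moving):
        if moving[i]:
            j = i
            while j < len(moving) and moving[j]:
                j += 1
            saccades.append({
                "onset_s": float(t[i]),
                "amplitude_deg": float(abs(x[j - 1] - x[i])),
                "peak_velocity_deg_s": float(np.max(np.abs(v[i:j]))),
            })
            i = j
        else:
            i += 1
    return saccades

# Synthetic 10-degree rightward saccade embedded in a 1 s, 500 Hz trace
t = np.arange(0, 1, 0.002)
x = 10 / (1 + np.exp(-(t - 0.5) / 0.01))   # sigmoidal position step
print(detect_saccades(t, x))
```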
Insights into numerical cognition: considering eye-fixations in number processing and arithmetic.
Mock, J; Huber, S; Klein, E; Moeller, K
2016-05-01
Considering eye-fixation behavior is standard in reading research to investigate underlying cognitive processes. However, in numerical cognition research, eye-tracking is used less often and less systematically. Nevertheless, we identified over 40 studies on this topic from the last 40 years, with an increase of eye-tracking studies on numerical cognition during the last decade. Here, we review and discuss these empirical studies to evaluate the added value of eye-tracking for the investigation of number processing. Our literature review revealed that the way eye-fixation behavior is considered in numerical cognition research ranges from investigating basic perceptual aspects of processing non-symbolic and symbolic numbers, through assessing the common representational space of numbers and space, to evaluating the influence of characteristics of the base-10 place-value structure of Arabic numbers and executive control on number processing. Apart from basic results such as reading times of numbers increasing with their magnitude, studies revealed that number processing can influence domain-general processes such as attention shifting, but also the other way round. Domain-general processes such as cognitive control were found to affect number processing. In summary, eye-fixation behavior allows for new insights into both domain-specific and domain-general processes involved in number processing. Based thereon, a processing model of the temporal dynamics of numerical cognition is postulated, which distinguishes an early stage of stimulus-driven bottom-up processing from later, more top-down controlled stages. Furthermore, perspectives for eye-tracking research in numerical cognition are discussed to emphasize the potential of this methodology for advancing our understanding of numerical cognition.
What interests them in the pictures?--differences in eye-tracking between rhesus monkeys and humans.
Hu, Ying-Zhou; Jiang, Hui-Hui; Liu, Ci-Rong; Wang, Jian-Hong; Yu, Cheng-Yang; Carlson, Synnöve; Yang, Shang-Chuan; Saarinen, Veli-Matti; Rizak, Joshua D; Tian, Xiao-Guang; Tan, Hen; Chen, Zhu-Yue; Ma, Yuan-Ye; Hu, Xin-Tian
2013-10-01
Studies estimating eye movements have demonstrated that non-human primates have fixation patterns similar to humans at the first sight of a picture. In the current study, three sets of pictures containing monkeys, humans or both were presented to rhesus monkeys and humans. The eye movements on these pictures by the two species were recorded using a Tobii eye-tracking system. We found that monkeys paid more attention to the head and body in pictures containing monkeys, whereas both monkeys and humans paid more attention to the head in pictures containing humans. The humans always concentrated on the eyes and head in all the pictures, indicating the social role of facial cues in society. Although humans paid more attention to the hands than monkeys, both monkeys and humans were interested in the hands and what was being done with them in the pictures. This may suggest the importance and necessity of hands for survival. Finally, monkeys scored lower in eye-tracking when fixating on the pictures, as if they were less interested in looking at the screen than humans. The locations of fixation in monkeys may provide insight into the role of eye movements in an evolutionary context.
Effects of reward on the accuracy and dynamics of smooth pursuit eye movements.
Brielmann, Aenne A; Spering, Miriam
2015-08-01
Reward modulates behavioral choices and biases goal-oriented behavior, such as eye or hand movements, toward locations or stimuli associated with higher rewards. We investigated reward effects on the accuracy and timing of smooth pursuit eye movements in 4 experiments. Eye movements were recorded in participants tracking a moving visual target on a computer monitor. Before target motion onset, a monetary reward cue indicated whether participants could earn money by tracking accurately, or whether the trial was unrewarded (Experiments 1 and 2, n = 11 each). Reward significantly improved eye-movement accuracy across different levels of task difficulty. Improvements were seen even in the earliest phase of the eye movement, within 70 ms of tracking onset, indicating that reward impacts visual-motor processing at an early level. We obtained similar findings when reward was not precued but explicitly associated with the pursuit target (Experiment 3, n = 16); critically, these results were not driven by stimulus prevalence or other factors such as preparation or motivation. Numerical cues (Experiment 4, n = 9) were not effective. (c) 2015 APA, all rights reserved.
An Active Vision Approach to Understanding and Improving Visual Training in the Geosciences
NASA Astrophysics Data System (ADS)
Voronov, J.; Tarduno, J. A.; Jacobs, R. A.; Pelz, J. B.; Rosen, M. R.
2009-12-01
Experience in the field is a fundamental aspect of geologic training, and its effectiveness is largely unchallenged because of anecdotal evidence of its success among expert geologists. However, there have been only a few quantitative studies based on large data collection efforts to investigate how Earth scientists learn in the field. In a recent collaboration among Earth scientists, cognitive scientists, and imaging science experts at the University of Rochester and Rochester Institute of Technology, we are conducting such a study. Within cognitive science, one school of thought, referred to as the Active Vision approach, emphasizes that visual perception is an active process requiring us to move our eyes to acquire new information about our environment. The Active Vision approach identifies the perceptual skills that experts possess and that novices will need to acquire to achieve expert performance. We describe data collection efforts using portable eye-trackers to assess how novice and expert geologists acquire visual knowledge in the field. We also discuss our efforts to collect images for use in a semi-immersive classroom environment, useful for further testing of novices and experts using eye-tracking technologies.
Eye tracking a self-moved target with complex hand-target dynamics
Landelle, Caroline; Montagnini, Anna; Madelain, Laurent
2016-01-01
Previous work has shown that the ability to track with the eye a moving target is substantially improved when the target is self-moved by the subject's hand compared with when being externally moved. Here, we explored a situation in which the mapping between hand movement and target motion was perturbed by simulating an elastic relationship between the hand and target. Our objective was to determine whether the predictive mechanisms driving eye-hand coordination could be updated to accommodate this complex hand-target dynamics. To fully appreciate the behavioral effects of this perturbation, we compared eye tracking performance when self-moving a target with a rigid mapping (simple) and a spring mapping as well as when the subject tracked target trajectories that he/she had previously generated when using the rigid or spring mapping. Concerning the rigid mapping, our results confirmed that smooth pursuit was more accurate when the target was self-moved than externally moved. In contrast, with the spring mapping, eye tracking had initially similar low spatial accuracy (though shorter temporal lag) in the self versus externally moved conditions. However, within ∼5 min of practice, smooth pursuit improved in the self-moved spring condition, up to a level similar to the self-moved rigid condition. Subsequently, when the mapping unexpectedly switched from spring to rigid, the eye initially followed the expected target trajectory and not the real one, thereby suggesting that subjects used an internal representation of the new hand-target dynamics. Overall, these results emphasize the stunning adaptability of smooth pursuit when self-maneuvering objects with complex dynamics. PMID:27466129
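A minimal simulation of the nonrigid (spring) hand-target mapping described above: the target is treated as a point mass attached to the hand through a damped spring and integrated with semi-implicit Euler steps. The mass, stiffness, damping, and hand trajectory are arbitrary assumptions for illustration.

```python
import numpy as np

m, k, b, dt = 0.2, 8.0, 0.5, 0.01          # mass (kg), stiffness, damping, step (s)
t = np.arange(0, 5, dt)
hand = 0.1 * np.sin(2 * np.pi * 0.8 * t)   # oscillating hand position (m)

target = np.zeros_like(t)
vel = 0.0
for i in range(1, len(t)):
    # Damped spring pulls the target toward the current hand position
    acc = (-k * (target[i - 1] - hand[i - 1]) - b * vel) / m
    vel += acc * dt
    target[i] = target[i - 1] + vel * dt

print(f"peak target excursion: {target.max():.3f} m")
```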
Nerve Fiber Flux Analysis Using Wide-Field Swept-Source Optical Coherence Tomography.
Tan, Ou; Liu, Liang; Liu, Li; Huang, David
2018-02-01
To devise a method to quantify nerve fibers along their arcuate courses over an extended peripapillary area using optical coherence tomography (OCT). Participants were imaged with 8 × 8-mm volumetric OCT scans centered at the optic disc. A new quantity, nerve fiber flux (NFF), represents the cross-sectional area transected perpendicular to the nerve fibers. The peripapillary area was divided into 64 tracks with equal flux. An iterative algorithm traced the trajectory of the tracks assuming that the relative distribution of the NFF was conserved with compensation for fiber connections to ganglion cells on the macular side. An average trajectory was computed from normal eyes and used to calculate the NFF maps for glaucomatous eyes. The NFF maps were divided into eight sectors that correspond to visual field regions. There were 24 healthy and 10 glaucomatous eyes enrolled. The algorithm converged on similar patterns of NFL tracks for all healthy eyes. In glaucomatous eyes, NFF correlated with visual field sensitivity in the arcuate sectors (Spearman ρ = 0.53-0.62). Focal nerve fiber loss in glaucomatous eyes appeared as uniform tracks of NFF defects that followed the expected arcuate fiber trajectory. Using an algorithm based on the conservation of flux, we derived nerve fiber trajectories in the peripapillary area. The NFF map is useful for the visualization of focal defects and quantification of sector nerve fiber loss from wide-area volumetric OCT scans. NFF provides a cumulative measure of volumetric loss along nerve fiber tracks and could improve the detection of focal glaucoma damage.
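A rough sketch of how a sector's nerve fiber flux could be accumulated, assuming NFF is approximated as the sum, over the tracks crossing that sector, of nerve fiber layer thickness times track width measured perpendicular to the fiber trajectory. The units and sample values are assumptions, not the paper's algorithm.

```python
import numpy as np

def sector_flux(thickness_mm, width_mm):
    """Cross-sectional area (mm^2) carried by the tracks crossing one sector:
    sum of NFL thickness times perpendicular track width."""
    return float(np.sum(np.asarray(thickness_mm) * np.asarray(width_mm)))

thickness = [0.12, 0.11, 0.10, 0.09]   # NFL thickness at each track crossing
width = [0.05, 0.05, 0.05, 0.05]       # perpendicular width of each track
print(f"sector flux: {sector_flux(thickness, width):.4f} mm^2")
```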
DOT National Transportation Integrated Search
1972-12-01
Alcohol ingestion interferes with visual control of vestibular eye movements and thereby produces significant decrements in performance at a compensatory tracking task during oscillation about the yaw axis; significant or consistent decrements in per...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-15
... also help improve questionnaire design. Different respondents may pay differing degrees of attention to... and strategies for improving the design (Refs. 5 and 6). Finally, eye tracking data can provide... design elements (e.g., prominence, text vs. graphics) will cause variations in information seeking. To...
High-Speed Noninvasive Eye-Tracking System
NASA Technical Reports Server (NTRS)
Talukder, Ashit; LaBaw, Clayton; Michael-Morookian, John; Monacos, Steve; Serviss, Orin
2007-01-01
The figure schematically depicts a system of electronic hardware and software that noninvasively tracks the direction of a person's gaze in real time. Like prior commercial noninvasive eye-tracking systems, this system is based on (1) illumination of an eye by a low-power infrared light-emitting diode (LED); (2) acquisition of video images of the pupil, iris, and cornea in the reflected infrared light; (3) digitization of the images; and (4) processing the digital image data to determine the direction of gaze from the centroids of the pupil and cornea in the images. Relative to the prior commercial systems, the present system operates at much higher speed and thereby offers enhanced capability for applications that involve human-computer interactions, including typing and computer command and control by handicapped individuals, and eye-based diagnosis of physiological disorders that affect gaze responses.
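Step (4) above derives gaze from the pupil and corneal centroids; the sketch below shows only the first part of that step, locating the dark-pupil centroid in an infrared frame by simple thresholding. The fixed threshold, the synthetic image, and the 8-bit grayscale input are assumptions.

```python
import numpy as np

def dark_region_centroid(img, thresh=40):
    """Centroid (x, y) of the dark pupil region in an infrared eye image."""
    ys, xs = np.nonzero(img < thresh)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic 100x100 frame with a dark disc standing in for the pupil
img = np.full((100, 100), 200, dtype=np.uint8)
yy, xx = np.mgrid[:100, :100]
img[(xx - 60) ** 2 + (yy - 45) ** 2 < 12 ** 2] = 10
print(dark_region_centroid(img))   # approximately (60.0, 45.0)
```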
Pulay, Márk Ágoston
2015-01-01
Letting children with severe physical disabilities (such as tetraparesis spastica) get relevant motional experiences of appropriate quality and quantity is now the greatest challenge for us in the field of neurorehabilitation. These motional experiences may establish many cognitive processes, but may also cause additional secondary cognitive dysfunctions such as disorders in body image, figure invariance, visual perception, auditory differentiation, concentration, analytic and synthetic ways of thinking, visual memory etc. Virtual Reality is a technology that provides a sense of presence in a real environment with the help of 3D pictures and animations formed in a computer environment and enables the person to interact with the objects in that environment. One of our biggest challenges is to find a well-suited input device (hardware) that lets children with severe physical disabilities interact with the computer. Based on our own experiences and a thorough literature review, we have come to the conclusion that an effective combination of eye-tracking and EMG devices should work well.
Gaze Estimation Method Using Analysis of Electrooculogram Signals and Kinect Sensor
Tanno, Koichi
2017-01-01
A gaze estimation system is one of the communication methods for severely disabled people who cannot perform gestures and speech. We previously developed an eye tracking method using a compact and light electrooculogram (EOG) signal, but its accuracy is not very high. In the present study, we conducted experiments to investigate the EOG component strongly correlated with the change of eye movements. The experiments in this study are of two types: experiments to see objects only by eye movements and experiments to see objects by face and eye movements. The experimental results show the possibility of an eye tracking method using EOG signals and a Kinect sensor. PMID:28912800
Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI
2016-04-01
but these delays are nearing resolution and we anticipate the initiation of the neuroimaging portion of the study early in Year 3. The fMRI task...resonance imaging (fMRI) and diffusion tensor imaging (DTI) to characterize the extent of functional cortical recruitment and white matter injury...respectively. The inclusion of fMRI and DTI will provide an objective basis for cross-validating the EEG and eye tracking system. Both the EEG and eye
Frequency analysis of gaze points with CT colonography interpretation using eye gaze tracking system
NASA Astrophysics Data System (ADS)
Tsutsumi, Shoko; Tamashiro, Wataru; Sato, Mitsuru; Okajima, Mika; Ogura, Toshihiro; Doi, Kunio
2017-03-01
It is important to investigate the eye tracking gaze points of experts in order to help trainees understand the image interpretation process. We investigated gaze points during CT colonography (CTC) interpretation and analyzed the difference in gaze points between experts and trainees. In this study, we attempted to understand how trainees can be brought up to the level achieved by experts in viewing CTC. We used an eye gaze point sensing system, Gazefineder (JVCKENWOOD Corporation, Tokyo, Japan), which detects the pupil point and corneal reflection point by dark pupil eye tracking. This system provides gaze point images and Excel file data. The subjects were radiological technologists who were either experienced or inexperienced in reading CTC. We performed observer studies in reading virtual pathology images and examined each observer's image interpretation process using gaze point data. Furthermore, we performed an eye tracking frequency analysis using the fast Fourier transform (FFT). We were able to understand the difference in gaze points between experts and trainees through the frequency analysis. The trainee's results contained large amounts of both high-frequency and low-frequency components, whereas both components were relatively low for the expert. Regarding the amount of eye movement every 0.02 seconds, we found that the expert tended to interpret images slowly and calmly, while the trainee moved the eyes quickly and scanned wide areas. We can assess the difference in gaze points on CTC between experts and trainees by using the eye gaze point sensing system together with the frequency analysis. The potential improvement in CTC interpretation for trainees can be evaluated using gaze point data.
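A minimal sketch of the frequency analysis described above: the gaze displacement sampled every 0.02 s (50 Hz) is transformed with an FFT and summarized as low- versus high-frequency shares, the kind of contrast drawn between experts and trainees. The synthetic signal and the 2 Hz split point are assumptions.

```python
import numpy as np

fs = 50.0                                  # 0.02 s sampling interval
t = np.arange(0, 20, 1 / fs)
# Synthetic gaze displacement: slow scanning drift plus fast jitter (pixels)
displacement = 50 * np.sin(2 * np.pi * 0.3 * t) + 5 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(displacement - displacement.mean()))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
low = spectrum[freqs < 2.0].sum()
high = spectrum[freqs >= 2.0].sum()
print(f"low-frequency power share: {low / (low + high):.2f}")
```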
Effects of Detailed Illustrations on Science Learning: An Eye-Tracking Study
ERIC Educational Resources Information Center
Lin, Yu Ying; Holmqvist, Kenneth; Miyoshi, Kiyofumi; Ashida, Hiroshi
2017-01-01
The eye-tracking method was used to assess the influence of detailed, colorful illustrations on reading behaviors and learning outcomes. Based on participants' subjective ratings in a pre-study, we selected eight one-page human anatomy lessons. In the main study, participants learned these eight human anatomy lessons; four were accompanied by…
Incidental L2 Vocabulary Acquisition "from" and "while" Reading: An Eye-Tracking Study
ERIC Educational Resources Information Center
Pellicer-Sánchez, Ana
2016-01-01
Previous studies have shown that reading is an important source of incidental second language (L2) vocabulary acquisition. However, we still do not have a clear picture of what happens when readers encounter unknown words. Combining offline (vocabulary tests) and online (eye-tracking) measures, the incidental acquisition of vocabulary knowledge…
Subtitles and Eye Tracking: Reading and Performance
ERIC Educational Resources Information Center
Kruger, Jan-Louis; Steyn, Faans
2014-01-01
This article presents an experimental study to investigate whether subtitle reading has a positive impact on academic performance. In the absence of reliable indexes of reading behavior in dynamic texts, the article first formulates and validates an index to measure the reading of text, such as subtitles on film. Eye-tracking measures (fixations…
Looking at Movies and Cartoons: Eye-Tracking Evidence from Williams Syndrome and Autism
ERIC Educational Resources Information Center
Riby, D.; Hancock, P. J. B.
2009-01-01
Background: Autism and Williams syndrome (WS) are neuro-developmental disorders associated with distinct social phenotypes. While individuals with autism show a lack of interest in socially important cues, individuals with WS often show increased interest in socially relevant information. Methods: The current eye-tracking study explores how…
ERIC Educational Resources Information Center
Godfroid, Aline; Spino, Le Anne
2015-01-01
This study extends previous reactivity research on the cognitive effects of think-alouds to include eye-tracking methodology. Unlike previous studies, we supplemented traditional superiority tests with equivalence tests, because only the latter are conceptually appropriate for demonstrating nonreactivity. Advanced learners of English read short…
Smooth Pursuit in Schizophrenia: A Meta-Analytic Review of Research since 1993
ERIC Educational Resources Information Center
O'Driscoll, Gillian A.; Callahan, Brandy L.
2008-01-01
Abnormal smooth pursuit eye-tracking is one of the most replicated deficits in the psychophysiological literature in schizophrenia [Levy, D. L., Holzman, P. S., Matthysse, S., & Mendell, N. R. (1993). "Eye tracking dysfunction and schizophrenia: A critical perspective." "Schizophrenia Bulletin, 19", 461-505]. We used meta-analytic procedures to…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-03
... images (Refs. 1 to 4, 7). Data from eye tracking studies can also help improve questionnaire design... response options. Eye tracking data can help to identify the need and strategies for improving the design... product familiarity or personal needs will cause variations in information seeking and that design...
Visual Attention for Solving Multiple-Choice Science Problem: An Eye-Tracking Analysis
ERIC Educational Resources Information Center
Tsai, Meng-Jung; Hou, Huei-Tse; Lai, Meng-Lung; Liu, Wan-Yi; Yang, Fang-Ying
2012-01-01
This study employed an eye-tracking technique to examine students' visual attention when solving a multiple-choice science problem. Six university students participated in a problem-solving task to predict occurrences of landslide hazards from four images representing four combinations of four factors. Participants' responses and visual attention…
ERIC Educational Resources Information Center
Belenky, Daniel; Ringenberg, Michael; Olsen, Jennifer; Aleven, Vincent; Rummel, Nikol
2013-01-01
Dual eye-tracking measures enable novel ways to test predictions about collaborative learning. For example, the research project we are engaging in uses measures of gaze recurrence to help understand how collaboration may differ when students are completing various learning activities focused on different learning objectives. Specifically, we…
Eye-Tracking as a Tool in Process-Oriented Reading Test Validation
ERIC Educational Resources Information Center
Solheim, Oddny Judith; Uppstad, Per Henning
2011-01-01
The present paper addresses the continuous need for methodological reflection on how to validate inferences made on the basis of test scores. Validation is a process that requires many lines of evidence. In this article we discuss the potential of eye tracking methodology in process-oriented reading test validation. Methodological considerations…
Linguistic Complexity and Information Structure in Korean: Evidence from Eye-Tracking during Reading
ERIC Educational Resources Information Center
Lee, Yoonhyoung; Lee, Hanjung; Gordon, Peter C.
2007-01-01
The nature of the memory processes that support language comprehension and the manner in which information packaging influences online sentence processing were investigated in three experiments that used eye-tracking during reading to measure the ease of understanding complex sentences in Korean. All three experiments examined reading of embedded…
Amster, Brian; Marquard, Jenna; Henneman, Elizabeth; Fisher, Donald
2015-01-01
In this clinical simulation study using an eye-tracking device, 40% of senior nursing students administered a contraindicated medication to a patient. Our findings suggest that the participants who did not identify the error did not know that amoxicillin is a type of penicillin. Eye-tracking devices may be valuable for determining whether nursing students are making rule- or knowledge-based errors, a distinction not easily captured via observations and interviews.
Measure and Analysis of a Gaze Position Using Infrared Light Technique
2001-10-25
Ramdane-Cherif, Z.; Naït-Ali, A.; Motsch, J. F.; Krebs, M. O. ...also proposes a method to correct head movements. Keywords: eye movement, gaze tracking, visual scan path, spatial mapping. Eye gaze tracking has been used for clinical purposes to detect illnesses, such as nystagmus, unusual eye movements and many others [1][2][3]. It is also used
Jia, Yuncheng; Cheng, Gang; Zhang, Dajun; Ta, Na; Xia, Mu; Ding, Fangyuan
2017-01-01
Objective: To determine the influence of adult attachment orientations on infant preference. Methods: We adopted eye-tracking technology to monitor childless college women’s eye movements when looking at pairs of faces, including one adult face (man or woman) and one infant face, with three different expressions (happy, sadness, and neutral). The participants (N = 150; 84% Han ethnicity) were aged 18–29 years (M = 19.22, SD = 1.72). A random intercepts multilevel linear regression analysis was used to assess the unique contribution of attachment avoidance, determined using the Experiences in Close Relationships scale, to preference for infant faces. Results: Women with higher attachment avoidance showed less infant preference, as shown by less sustained overt attentional bias to the infant face than the adult face based on fixation time and count. Conclusion: Adult attachment might be related to infant preference according to eye movement indices. Women with higher attachment avoidance may lack attentional preference for infant faces. The findings may aid the treatment and remediation of the interactions between children and mothers with insecure attachment. PMID:28184210
NASA Technical Reports Server (NTRS)
Krauzlis, R. J.; Stone, L. S.
1999-01-01
The two components of voluntary tracking eye-movements in primates, pursuit and saccades, are generally viewed as relatively independent oculomotor subsystems that move the eyes in different ways using independent visual information. Although saccades have long been known to be guided by visual processes related to perception and cognition, only recently have psychophysical and physiological studies provided compelling evidence that pursuit is also guided by such higher-order visual processes, rather than by the raw retinal stimulus. Pursuit and saccades also do not appear to be entirely independent anatomical systems, but involve overlapping neural mechanisms that might be important for coordinating these two types of eye movement during the tracking of a selected visual object. Given that the recovery of objects from real-world images is inherently ambiguous, guiding both pursuit and saccades with perception could represent an explicit strategy for ensuring that these two motor actions are driven by a single visual interpretation.
Wagner, Jennifer B.; Hirsch, Suzanna B.; Vogel-Farley, Vanessa K.; Redcay, Elizabeth; Nelson, Charles A.
2014-01-01
Individuals with autism spectrum disorder (ASD) often have difficulty with social-emotional cues. This study examined the neural, behavioral, and autonomic correlates of emotional face processing in adolescents with ASD and typical development (TD) using eye-tracking and event-related potentials (ERPs) across two different paradigms. Scanning of faces was similar across groups in the first task, but the second task found that face-sensitive ERPs varied with emotional expressions only in TD. Further, ASD showed enhanced neural responding to non-social stimuli. In TD only, attention to eyes during eye-tracking related to faster face-sensitive ERPs in a separate task; in ASD, a significant positive association was found between autonomic activity and attention to mouths. Overall, ASD showed an atypical pattern of emotional face processing, with reduced neural differentiation between emotions and a reduced relationship between gaze behavior and neural processing of faces. PMID:22684525
McCamy, Michael B.; Otero-Millan, Jorge; Leigh, R. John; King, Susan A.; Schneider, Rosalyn M.; Macknik, Stephen L.; Martinez-Conde, Susana
2015-01-01
Human eyes move continuously, even during visual fixation. These “fixational eye movements” (FEMs) include microsaccades, intersaccadic drift and oculomotor tremor. Research in human FEMs has grown considerably in the last decade, facilitated by the manufacture of noninvasive, high-resolution/speed video-oculography eye trackers. Due to the small magnitude of FEMs, obtaining reliable data can be challenging, however, and depends critically on the sensitivity and precision of the eye tracking system. Yet, no study has conducted an in-depth comparison of human FEM recordings obtained with the search coil (considered the gold standard for measuring microsaccades and drift) and with contemporary, state-of-the-art video trackers. Here we measured human microsaccades and drift simultaneously with the search coil and a popular state-of-the-art video tracker. We found that 95% of microsaccades detected with the search coil were also detected with the video tracker, and 95% of microsaccades detected with video tracking were also detected with the search coil, indicating substantial agreement between the two systems. Peak/mean velocities and main sequence slopes of microsaccades detected with video tracking were significantly higher than those of the same microsaccades detected with the search coil, however. Ocular drift was significantly correlated between the two systems, but drift speeds were higher with video tracking than with the search coil. Overall, our combined results suggest that contemporary video tracking now approaches the search coil for measuring FEMs. PMID:26035820
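The 95% figures above amount to a matching rate between the two systems' detected microsaccade onsets. As a rough illustration (the temporal tolerance and event format below are assumptions, not the authors' matching criterion), such agreement can be computed like this:

```python
def detection_agreement(events_a, events_b, tolerance_s=0.01):
    """Fraction of onset times in events_a that have a match in events_b
    within +/- tolerance_s (a simple, illustrative matching rule)."""
    if not events_a:
        return float("nan")
    matched = sum(any(abs(a - b) <= tolerance_s for b in events_b) for a in events_a)
    return matched / len(events_a)

coil = [0.12, 0.45, 0.78, 1.30]          # microsaccade onsets (s), search coil
video = [0.121, 0.452, 1.295, 1.80]      # onsets (s), video tracker
print(detection_agreement(coil, video))  # 0.75 in this toy example
print(detection_agreement(video, coil))  # agreement in the other direction
```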
Perceptions of rapport across the life span: Gaze patterns and judgment accuracy.
Vicaria, Ishabel M; Bernieri, Frank J; Isaacowitz, Derek M
2015-06-01
Although age-related deficits in emotion perception have been established using photographs of individuals, the extension of these findings to dynamic displays and dyads is just beginning. Similarly, most eye-tracking research in the person perception literature, including those that study age differences, have focused on individual attributes gleaned from static images; to our knowledge, no previous research has considered cue use in dyadic judgments with eye-tracking. The current study employed a Brunswikian lens model analysis in conjunction with eye-tracking measurements to study age differences in the judgment of rapport, a social construct comprised of mutual attentiveness, positive feelings, and coordination between interacting partners. Judgment accuracy and cue utilization of younger (n = 47) and older (n = 46) adults were operationalized as correlations between a perceiver's judgments and criterion values within a set of 34 brief interaction videos in which 2 opposite sex college students discussed a controversial topic. No age differences emerged in the accuracy of judgments; however, pathways to accuracy differed by age: Younger adults' judgments relied on some behavioral cues more than older adults. In addition, eye-tracking analyses revealed that older adults spent more time looking at the bodies of the targets in the videos, whereas younger adults spent more time looking at the targets' heads. The contributions from both the lens model and eye-tracking findings provide distinct but complementary insights to our understanding of age-related continuities and shifts in social perceptual processing. (c) 2015 APA, all rights reserved.
What triggers catch-up saccades during visual tracking?
de Brouwer, Sophie; Yuksel, Demet; Blohm, Gunnar; Missal, Marcus; Lefèvre, Philippe
2002-03-01
When tracking moving visual stimuli, primates orient their visual axis by combining two kinds of eye movements, smooth pursuit and saccades, that have very different dynamics. Yet, the mechanisms that govern the decision to switch from one type of eye movement to the other are still poorly understood, even though they could bring a significant contribution to the understanding of how the CNS combines different kinds of control strategies to achieve a common motor and sensory goal. In this study, we investigated the oculomotor responses to a large range of different combinations of position error and velocity error during visual tracking of moving stimuli in humans. We found that the oculomotor system uses a prediction of the time at which the eye trajectory will cross the target, defined as the "eye crossing time" (T(XE)). The eye crossing time, which depends on both position error and velocity error, is the criterion used to switch between smooth and saccadic pursuit, i.e., to trigger catch-up saccades. On average, for T(XE) between 40 and 180 ms, no saccade is triggered and target tracking remains purely smooth. Conversely, when T(XE) becomes smaller than 40 ms or larger than 180 ms, a saccade is triggered after a short latency (around 125 ms).
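The trigger rule can be made concrete with a small sketch. Under the simplifying assumption of constant retinal slip, the position error evolves as PE + VE·t, so the eye crossing time is TXE = -PE/VE; a catch-up saccade is triggered when TXE falls outside the 40-180 ms window quoted above. The function names and units are illustrative only.

```python
def eye_crossing_time_ms(position_error_deg, velocity_error_deg_s):
    """Predicted time (ms) until the eye trajectory crosses the target,
    assuming constant retinal slip: PE(t) = PE + VE*t = 0  =>  t = -PE/VE."""
    if velocity_error_deg_s == 0:
        return float("inf")               # no relative motion: the gap never closes
    return -1000.0 * position_error_deg / velocity_error_deg_s

def triggers_catchup_saccade(position_error_deg, velocity_error_deg_s,
                             smooth_zone_ms=(40.0, 180.0)):
    """Catch-up saccade when T_XE lies outside the 40-180 ms smooth zone."""
    t_xe = eye_crossing_time_ms(position_error_deg, velocity_error_deg_s)
    low, high = smooth_zone_ms
    return not (low <= t_xe <= high)

# 1 deg position error being closed at 5 deg/s -> T_XE = 200 ms -> saccade triggered
print(triggers_catchup_saccade(1.0, -5.0))
```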
SET: a pupil detection method using sinusoidal approximation
Javadi, Amir-Homayoun; Hakimi, Zahra; Barati, Morteza; Walsh, Vincent; Tcheang, Lili
2015-01-01
Mobile eye-tracking in external environments remains challenging, despite recent advances in eye-tracking software and hardware engineering. Many current methods fail to deal with the vast range of outdoor lighting conditions and the speed at which these can change. This confines experiments to artificial environments where conditions must be tightly controlled. Additionally, the emergence of low-cost eye tracking devices calls for the development of analysis tools that enable non-technical researchers to process the output of their images. We have developed a fast and accurate method (known as “SET”) that is suitable even for natural environments with uncontrolled, dynamic and even extreme lighting conditions. We compared the performance of SET with that of two open-source alternatives by processing two collections of eye images: images of natural outdoor scenes with extreme lighting variations (“Natural”); and images of less challenging indoor scenes (“CASIA-Iris-Thousand”). We show that SET excelled in outdoor conditions and was faster, without significant loss of accuracy, indoors. SET offers a low cost eye-tracking solution, delivering high performance even in challenging outdoor environments. It is offered through an open-source MATLAB toolkit as well as a dynamic-link library (“DLL”), which can be imported into many programming languages including C# and Visual Basic in Windows OS (www.eyegoeyetracker.co.uk). PMID:25914641
Screening for Dyslexia Using Eye Tracking during Reading.
Nilsson Benfatto, Mattias; Öqvist Seimyr, Gustaf; Ygge, Jan; Pansell, Tony; Rydberg, Agneta; Jacobson, Christer
2016-01-01
Dyslexia is a neurodevelopmental reading disability estimated to affect 5-10% of the population. While there is yet no full understanding of the cause of dyslexia, or agreement on its precise definition, it is certain that many individuals suffer persistent problems in learning to read for no apparent reason. Although it is generally agreed that early intervention is the best form of support for children with dyslexia, there is still a lack of efficient and objective means to help identify those at risk during the early years of school. Here we show that it is possible to identify 9-10 year old individuals at risk of persistent reading difficulties by using eye tracking during reading to probe the processes that underlie reading ability. In contrast to current screening methods, which rely on oral or written tests, eye tracking does not depend on the subject to produce some overt verbal response and thus provides a natural means to objectively assess the reading process as it unfolds in real-time. Our study is based on a sample of 97 high-risk subjects with early identified word decoding difficulties and a control group of 88 low-risk subjects. These subjects were selected from a larger population of 2165 school children attending second grade. Using predictive modeling and statistical resampling techniques, we develop classification models from eye tracking records less than one minute in duration and show that the models are able to differentiate high-risk subjects from low-risk subjects with high accuracy. Although dyslexia is fundamentally a language-based learning disability, our results suggest that eye movements in reading can be highly predictive of individual reading ability and that eye tracking can be an efficient means to identify children at risk of long-term reading difficulties.
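The modeling step described above (building classifiers from short eye-tracking records and validating them by resampling) can be pictured with a generic cross-validation sketch. The synthetic features below (fixation duration, fixations per word, regression rate, saccade length) are illustrative stand-ins, not the variables or classifier actually used in the study.

```python
import numpy as np
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Hypothetical per-subject features from <1 min of reading:
# [mean fixation duration, fixations per word, regression rate, mean saccade length]
n_high_risk, n_low_risk = 97, 88
X = np.vstack([
    rng.normal([260, 1.6, 0.25, 1.8], [40, 0.3, 0.08, 0.4], size=(n_high_risk, 4)),
    rng.normal([220, 1.2, 0.15, 2.2], [40, 0.3, 0.08, 0.4], size=(n_low_risk, 4)),
])
y = np.array([1] * n_high_risk + [0] * n_low_risk)   # 1 = high risk, 0 = low risk

model = make_pipeline(StandardScaler(), LinearSVC())
scores = cross_val_score(model, X, y, cv=StratifiedKFold(5, shuffle=True, random_state=0))
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```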
Henneman, Elizabeth A
2017-07-01
The Institute of Medicine (now National Academy of Medicine) reports "To Err is Human" and "Crossing the Quality Chasm" made explicit 3 previously unappreciated realities: (1) Medical errors are common and result in serious, preventable adverse events; (2) The majority of medical errors are the result of system versus human failures; and (3) It would be impossible for any system to prevent all errors. With these realities, the role of the nurse in the "near miss" process and as the final safety net for the patient is of paramount importance. The nurse's role in patient safety is described from both a systems perspective and a human factors perspective. Critical care nurses use specific strategies to identify, interrupt, and correct medical errors. Strategies to identify errors include knowing the patient, knowing the plan of care, double-checking, and surveillance. Nursing strategies to interrupt errors include offering assistance, clarifying, and verbally interrupting. Nurses correct errors by persevering, being physically present, reviewing/confirming the plan of care, or involving another nurse or physician. Each of these strategies has implications for education, practice, and research. Surveillance is a key nursing strategy for identifying medical errors and reducing adverse events. Eye-tracking technology is a novel approach for evaluating the surveillance process during common, high-risk processes such as blood transfusion and medication administration. Eye tracking has also been used to examine the impact of interruptions to care caused by bedside alarms as well as by other health care personnel. Findings from this safety-related eye-tracking research provide new insight into effective bedside surveillance and interruption management strategies. ©2017 American Association of Critical-Care Nurses.
Attention and Recall of Point-of-sale Tobacco Marketing: A Mobile Eye-Tracking Pilot Study.
Bansal-Travers, Maansi; Adkison, Sarah E; O'Connor, Richard J; Thrasher, James F
2016-01-01
As tobacco advertising restrictions have increased, the retail 'power wall' behind the counter has become increasingly valuable for marketing tobacco products. The primary objectives of this pilot study were 3-fold: (1) evaluate the attention paid/fixations on the area behind the cash register where tobacco advertising is concentrated and tobacco products are displayed in a real-world setting, (2) evaluate the duration (dwell-time) of these fixations, and (3) evaluate the recall of advertising displayed on the tobacco power wall. Data came from 13 Smokers (S) and 12 Susceptible or non-daily Smokers (SS) aged 18-30 who took part in a mobile eye-tracking study. Mobile eye-tracking technology records the orientation (fixation) and duration (dwell-time) of visual attention. Participants were randomized to one of three purchase tasks at a convenience store: Candy bar Only (CO; N = 10), Candy bar + Specified cigarette Brand (CSB; N = 6), and Candy bar + cigarette Brand of their Choice (CBC; N = 9). A post-session survey evaluated recall of tobacco marketing. Key outcomes were fixations and dwell-time on the cigarette displays at the point-of-sale. Participants spent a median time of 44 seconds during the standardized time evaluated and nearly three-quarters (72%) fixated on the power wall during their purchase, regardless of smoking status (S: 77%, SS: 67%) or purchase task (CO: 44%, CSB: 71%, CBC: 100%). In the post-session survey, nearly all participants (96%) indicated they noticed a cigarette brand and 64% were able to describe a specific part of the tobacco wall or recall a promotional offer. Consumers are exposed to point-of-sale tobacco marketing, regardless of smoking status. FDA should consider regulations that limit exposure to point-of-sale tobacco marketing among consumers.
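The key outcome measures (whether participants fixated the power wall and for how long) reduce to area-of-interest summaries over a list of fixations. A minimal sketch, assuming a static reference frame and a hypothetical fixation format, is:

```python
# Each fixation: (x_px, y_px, duration_ms); the AOI is an axis-aligned rectangle.
def aoi_summary(fixations, aoi):
    """Return (hit, dwell_ms, n_fixations) for one area of interest."""
    x0, y0, x1, y1 = aoi
    inside = [(x, y, d) for x, y, d in fixations if x0 <= x <= x1 and y0 <= y <= y1]
    return (len(inside) > 0, sum(d for _, _, d in inside), len(inside))

power_wall_aoi = (400, 0, 1280, 300)                  # illustrative pixel bounds
fixations = [(500, 120, 310), (900, 240, 260), (200, 600, 450)]
print(aoi_summary(fixations, power_wall_aoi))         # (True, 570, 2)
```

With a mobile (head-worn) tracker, gaze first has to be mapped from the scene-camera frame onto the shelf area frame by frame before such a summary applies.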
Use of JPSS ATMS, CrIS, and VIIRS data to Improve Tropical Cyclone Track and Intensity Forecasting
NASA Astrophysics Data System (ADS)
Chirokova, G.; Demaria, M.; DeMaria, R.; Knaff, J. A.; Dostalek, J.; Musgrave, K. D.; Beven, J. L.
2015-12-01
JPSS data provide unique information that could be critical for the forecasting of tropical cyclone (TC) track and intensity and are currently underutilized. Preliminary results from several TC applications using data from the Advanced Technology Microwave Sounder (ATMS), the Cross-Track Infrared Sounder (CrIS), and the Visible Infrared Imaging Radiometer Suite (VIIRS), carried by the Suomi National Polar-Orbiting Partnership satellite (SNPP), will be discussed. The first group of applications, which includes applications for moisture flux and for eye-detection, aims to improve rapid intensification (RI) forecasts, which is one of the highest priorities within NOAA. The applications could be used by forecasters directly and will also provide additional input to the Rapid Intensification Index (RII), the statistical-dynamical tool for forecasting RI events that is operational at the National Hurricane Center. The moisture flux application uses bias-corrected ATMS-MIRS (Microwave Integrated Retrieval System) and NUCAPS (NOAA Unique CrIS ATMS Processing System) retrievals, which provide very accurate temperature and humidity soundings in the TC environment, to detect dry air intrusions. The objective automated eye-detection application uses geostationary and VIIRS data in combination with machine learning and computer vision techniques to determine the onset of eye formation, which is very important for TC intensity forecasting but is usually determined by subjective methods. The first version of the algorithm showed very promising results with a 75% success rate. The second group of applications develops tools to better utilize VIIRS data, including day-night band (DNB) imagery, for tropical cyclone forecasting. Disclaimer: The views, opinions, and findings contained in this article are those of the authors and should not be construed as an official National Oceanic and Atmospheric Administration (NOAA) or U.S. Government position, policy, or decision.
van der Gijp, A; Ravesloot, C J; Jarodzka, H; van der Schaaf, M F; van der Schaaf, I C; van Schaik, J P J; Ten Cate, Th J
2017-08-01
Eye tracking research has been conducted for decades to gain understanding of visual diagnosis such as in radiology. For educational purposes, it is important to identify visual search patterns that are related to high perceptual performance and to identify effective teaching strategies. This review of eye-tracking literature in the radiology domain aims to identify visual search patterns associated with high perceptual performance. Databases PubMed, EMBASE, ERIC, PsycINFO, Scopus and Web of Science were searched using 'visual perception' OR 'eye tracking' AND 'radiology' and synonyms. Two authors independently screened search results and included eye tracking studies concerning visual skills in radiology published between January 1, 1994 and July 31, 2015. Two authors independently assessed study quality with the Medical Education Research Study Quality Instrument, and extracted study data with respect to design, participant and task characteristics, and variables. A thematic analysis was conducted to extract and arrange study results, and a textual narrative synthesis was applied for data integration and interpretation. The search resulted in 22 relevant full-text articles. Thematic analysis resulted in six themes that informed the relation between visual search and level of expertise: (1) time on task, (2) eye movement characteristics of experts, (3) differences in visual attention, (4) visual search patterns, (5) search patterns in cross sectional stack imaging, and (6) teaching visual search strategies. Expert search was found to be characterized by a global-focal search pattern, which represents an initial global impression, followed by a detailed, focal search-to-find mode. Specific task-related search patterns, like drilling through CT scans and systematic search in chest X-rays, were found to be related to high expert levels. One study investigated teaching of visual search strategies, and did not find a significant effect on perceptual performance. Eye tracking literature in radiology indicates several search patterns are related to high levels of expertise, but teaching novices to search as an expert may not be effective. Experimental research is needed to find out which search strategies can improve image perception in learners.
Advanced autostereoscopic display for G-7 pilot project
NASA Astrophysics Data System (ADS)
Hattori, Tomohiko; Ishigaki, Takeo; Shimamoto, Kazuhiro; Sawaki, Akiko; Ishiguchi, Tsuneo; Kobayashi, Hiromi
1999-05-01
An advanced autostereoscopic display is described that permits several people to observe a stereo pair simultaneously without special glasses or head-mounted tracking devices. The system is composed of a right-eye system, a left-eye system, and a sophisticated head-tracking system. In each eye system, a transmissive color liquid-crystal imaging plate is used with a special backlight unit. The backlight unit consists of a monochrome 2D display and a large-format convex lens, and it directs light only toward the correct eye of each viewer. The right-eye perspective system is combined with the left-eye perspective system by a half mirror so that the display functions as a time-parallel stereoscopic system. The viewer's IR image is captured through, and focused by, the large-format convex lens and fed back to the backlight as a modulated binary half-face image; this through-the-lens (TTL) arrangement provides the accurate head tracking. The system was operated as a stereoscopic TV phone between the Department of Telemedicine at Duke University and the Department of Radiology at Nagoya University School of Medicine over a high-speed GIBN digital line. Applications are also described in this paper.
2010-08-01
...astigmatism and other sources, and stay constant over time (LC Technologies, 2000). Systematic errors can sometimes reach many degrees of visual angle... Taking the average of all disparities would mean treating each as equally important, regardless of whether they come from correct or incorrect mappings. In... likely stop somewhere near the centroid, because the large hM basically treats every point equally (or nearly equally, if using the multivariate...
1977-05-10
...apply this method of forecasting in the solution of all major scientific-technical problems of the national economy. Citing the slow... the future, however, computers will "mature" and learn to recognize patterns in what amounts to a much more complex language: the language of visual... images. Photoelectronic tracking devices or "eyes" will allow the computer to take in information in a much more complex form and to perform operations...
Geometry and Gesture-Based Features from Saccadic Eye-Movement as a Biometric in Radiology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hammond, Tracy; Tourassi, Georgia; Yoon, Hong-Jun
In this study, we present a novel application of sketch gesture recognition on eye-movement for biometric identification and estimating task expertise. The study was performed for the task of mammographic screening with simultaneous viewing of four coordinated breast views as typically done in clinical practice. Eye-tracking data and diagnostic decisions collected for 100 mammographic cases (25 normal, 25 benign, 50 malignant) and 10 readers (three board-certified radiologists and seven radiology residents) formed the corpus for this study. Sketch gesture recognition techniques were employed to extract geometric and gesture-based features from saccadic eye-movements. Our results show that saccadic eye-movements, characterized using sketch-based features, result in more accurate models for predicting individual identity and level of expertise than more traditional eye-tracking features.
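To give a sense of what geometric features of saccadic eye-movement can look like in practice, the sketch below computes amplitude, direction, and a path/chord curvature ratio from raw gaze samples; these particular features are illustrative assumptions, not necessarily the ones used in the study.

```python
import math

def saccade_features(samples):
    """samples: list of (x_deg, y_deg) gaze positions belonging to one saccade.
    Returns amplitude (deg), direction (rad), and curvature as path/chord ratio."""
    (x0, y0), (x1, y1) = samples[0], samples[-1]
    chord = math.hypot(x1 - x0, y1 - y0)
    path = sum(math.hypot(bx - ax, by - ay)
               for (ax, ay), (bx, by) in zip(samples, samples[1:]))
    direction = math.atan2(y1 - y0, x1 - x0)
    curvature = path / chord if chord > 0 else float("nan")
    return {"amplitude_deg": chord, "direction_rad": direction, "curvature": curvature}

print(saccade_features([(0.0, 0.0), (1.0, 0.3), (2.1, 0.4), (3.0, 0.0)]))
```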
Remote vs. head-mounted eye-tracking: a comparison using radiologists reading mammograms
NASA Astrophysics Data System (ADS)
Mello-Thoms, Claudia; Gur, David
2007-03-01
Eye position monitoring has been used for decades in Radiology in order to determine how radiologists interpret medical images. Using these devices several discoveries about the perception/decision making process have been made, such as the importance of comparisons of perceived abnormalities with selected areas of the background, the likelihood that a true lesion will attract visual attention early in the reading process, and the finding that most misses attract prolonged visual dwell, often comparable to dwell in the location of reported lesions. However, eye position tracking is a cumbersome process, which often requires the observer to wear a helmet gear which contains the eye tracker per se and a magnetic head tracker, which allows for the computation of head position. Observers tend to complain of fatigue after wearing the gear for a prolonged time. Recently, with the advances made to remote eye-tracking, the use of head-mounted systems seemed destined to become a thing of the past. In this study we evaluated a remote eye tracking system, and compared it to a head-mounted system, as radiologists read a case set of one-view mammograms on a high-resolution display. We compared visual search parameters between the two systems, such as time to hit the location of the lesion for the first time, amount of dwell time in the location of the lesion, total time analyzing the image, etc. We also evaluated the observers' impressions of both systems, and what their perceptions were of the restrictions of each system.
ERIC Educational Resources Information Center
Jian, Yu-Cin; Wu, Chao-Jung
2015-01-01
We investigated strategies used by readers when reading a science article with a diagram and assessed whether semantic and spatial representations were constructed while reading the diagram. Seventy-one undergraduate participants read a scientific article while tracking their eye movements and then completed a reading comprehension test. Our…
Sight-Reading Expertise: Cross-Modality Integration Investigated Using Eye Tracking
ERIC Educational Resources Information Center
Drai-Zerbib, Veronique; Baccino, Thierry; Bigand, Emmanuel
2012-01-01
It is often said that experienced musicians are capable of hearing what they read (and vice versa). This suggests that they are able to process and to integrate multimodal information. The present study investigates this issue with an eye-tracking technique. Two groups of musicians chosen on the basis of their level of expertise (experts,…
Morphosyntactic Development in a Second Language: An Eye-Tracking Study on the Role of Attention
ERIC Educational Resources Information Center
Issa, Bernard Ibrahim, II
2015-01-01
One common claim in second language (L2) acquisition research is that attention is crucial for development to occur. Although previous empirical research supports this claim, methodological approaches have not been able to directly measure attention. This thesis utilized eye-tracking to directly measure attention and thus provide converging…
Hidden Communicative Competence: Case Study Evidence Using Eye-Tracking and Video Analysis
ERIC Educational Resources Information Center
Grayson, Andrew; Emerson, Anne; Howard-Jones, Patricia; O'Neil, Lynne
2012-01-01
A facilitated communication (FC) user with an autism spectrum disorder produced sophisticated texts by pointing, with physical support, to letters on a letterboard while their eyes were tracked and while their pointing movements were video recorded. This FC user has virtually no independent means of expression, and is held to have no literacy…
An Eye-Tracking Investigation of Written Sarcasm Comprehension: The Roles of Familiarity and Context
ERIC Educational Resources Information Center
?urcan, Alexandra; Filik, Ruth
2016-01-01
This article addresses a current theoretical debate between the standard pragmatic model, the graded salience hypothesis, and the implicit display theory, by investigating the roles of the context and of the properties of the sarcastic utterance itself in the comprehension of a sarcastic remark. Two eye-tracking experiments were conducted where we…
ERIC Educational Resources Information Center
Liu, Pei-Lin
2014-01-01
This study examined the influence of morphological instruction in an eye-tracking English vocabulary recognition task. Sixty-eight freshmen enrolled in an English course and received either traditional or morphological instruction for learning English vocabulary. The experimental part of the study was conducted over two-hour class periods for…
Basic Number Processing Deficits in Developmental Dyscalculia: Evidence from Eye Tracking
ERIC Educational Resources Information Center
Moeller, K.; Neuburger, S.; Kaufmann, L.; Landerl, K.; Nuerk, H. C.
2009-01-01
Recent research suggests that developmental dyscalculia is associated with a subitizing deficit (i.e., the inability to quickly enumerate small sets of up to 3 objects). However, the nature of this deficit has not previously been investigated. In the present study the eye-tracking methodology was employed to clarify whether (a) the subitizing…
ERIC Educational Resources Information Center
Mirman, Daniel; Yee, Eiling; Blumstein, Sheila E.; Magnuson, James S.
2011-01-01
We used eye-tracking to investigate lexical processing in aphasic participants by examining the fixation time course for rhyme (e.g., "carrot-parrot") and cohort (e.g., "beaker-beetle") competitors. Broca's aphasic participants exhibited larger rhyme competition effects than age-matched controls. A re-analysis of previously reported data (Yee,…
Target Selection by the Frontal Cortex during Coordinated Saccadic and Smooth Pursuit Eye Movements
ERIC Educational Resources Information Center
Srihasam, Krishna; Bullock, Daniel; Grossberg, Stephen
2009-01-01
Oculomotor tracking of moving objects is an important component of visually based cognition and planning. Such tracking is achieved by a combination of saccades and smooth-pursuit eye movements. In particular, the saccadic and smooth-pursuit systems interact to often choose the same target, and to maximize its visibility through time. How do…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-29
... notice. This notice solicits comments on research entitled, ``Eye Tracking Study of Direct-to-Consumer... the FDA to conduct research relating to health information. Section 1003(d)(2)(C) of the Federal Food, Drug, and Cosmetic Act (the FD&C Act) (21 U.S.C. 393(b)(2)(c)) authorizes FDA to conduct research...
Visual Processing of Faces in Individuals with Fragile X Syndrome: An Eye Tracking Study
ERIC Educational Resources Information Center
Farzin, Faraz; Rivera, Susan M.; Hessl, David
2009-01-01
Gaze avoidance is a hallmark behavioral feature of fragile X syndrome (FXS), but little is known about whether abnormalities in the visual processing of faces, including disrupted autonomic reactivity, may underlie this behavior. Eye tracking was used to record fixations and pupil diameter while adolescents and young adults with FXS and sex- and…
Factors Influencing the Use of Captions by Foreign Language Learners: An Eye-Tracking Study
ERIC Educational Resources Information Center
Winke, Paula; Gass, Susan; Sydorenko, Tetyana
2013-01-01
This study investigates caption-reading behavior by foreign language (L2) learners and, through eye-tracking methodology, explores the extent to which the relationship between the native and target language affects that behavior. Second-year (4th semester) English-speaking learners of Arabic, Chinese, Russian, and Spanish watched 2 videos…
ERIC Educational Resources Information Center
Riby, Deborah M.; Hancock, Peter J. B.
2009-01-01
The neuro-developmental disorders of Williams syndrome (WS) and autism can reveal key components of social cognition. Eye-tracking techniques were applied in two tasks exploring attention to pictures containing faces. Images were (i) scrambled pictures containing faces or (ii) pictures of scenes with embedded faces. Compared to individuals who…
ERIC Educational Resources Information Center
Falkmer, Marita; Bjallmark, Anna; Larsson, Matilda; Falkmer, Torbjorn
2011-01-01
Several studies, using eye tracking methodology, suggest that different visual strategies in persons with autism spectrum conditions, compared with controls, are applied when viewing facial stimuli. Most eye tracking studies are, however, made in laboratory settings with either static (photos) or non-interactive dynamic stimuli, such as video…
Through Their Eyes: Tracking the Gaze of Students in a Geology Field Course
ERIC Educational Resources Information Center
Maltese, Adam V.; Balliet, Russell N.; Riggs, Eric M.
2013-01-01
The focus of this research was to investigate how students learn to do fieldwork through observation. This study addressed the following questions: (1) Can mobile eye-tracking devices provide a robust source of data to investigate the observations and workflow of novice students while participating in a field exercise? If so, what are the…
ERIC Educational Resources Information Center
Gordon, Peter C.; Hendrick, Randall; Johnson, Marcus; Lee, Yoonhyoung
2006-01-01
The nature of working memory operation during complex sentence comprehension was studied by means of eye-tracking methodology. Readers had difficulty when the syntax of a sentence required them to hold 2 similar noun phrases (NPs) in working memory before syntactically and semantically integrating either of the NPs with a verb. In sentence …
ERIC Educational Resources Information Center
Tsang, Vicky
2018-01-01
The eye-tracking experiment was carried out to assess fixation duration and scan paths that individuals with and without high-functioning autism spectrum disorders employed when identifying simple and complex emotions. Participants viewed human photos of facial expressions and decided on the identification of emotion, the negative-positive emotion…
A full-parallax 3D display with restricted viewing zone tracking viewer's eye
NASA Astrophysics Data System (ADS)
Beppu, Naoto; Yendo, Tomohiro
2015-03-01
Three-dimensional (3D) vision has become a familiar imaging technique, and 3D displays are now in practical use in fields such as entertainment and medicine; further development of 3D display technology will play an important role across a wide range of applications. Among the various approaches to displaying 3D images, we focus on ray reproduction. This approach requires many viewpoint images to achieve full parallax, because a different viewpoint image must be displayed for each viewing position. We propose to reduce wasted rays by limiting the projector's rays to the region around the viewer only, using a spinning mirror, thereby increasing the effectiveness of the display device and achieving a full-parallax 3D display. The proposed method combines eye tracking of the viewer, a high-speed projector, a rotating mirror that follows the viewer (a spinning mirror), a concave mirror array with different vertical slopes arranged circumferentially, and a cylindrical mirror. In simulation, we confirmed the scanning range and the trajectory of ray movement in the horizontal direction, as well as viewpoint switching and convergence performance in the vertical direction. These results confirm that a full-parallax display can be realized with the proposed method.
Gregori Grgič, Regina; Calore, Enrico; de'Sperati, Claudio
2016-01-01
Whereas overt visuospatial attention is customarily measured with eye tracking, covert attention is assessed by various methods. Here we exploited Steady-State Visual Evoked Potentials (SSVEPs) - the oscillatory responses of the visual cortex to incoming flickering stimuli - to record the movements of covert visuospatial attention in a way operatively similar to eye tracking (attention tracking), which allowed us to compare motion observation and motion extrapolation with and without eye movements. Observers fixated a central dot and covertly tracked a target oscillating horizontally and sinusoidally. In the background, the left and the right halves of the screen flickered at two different frequencies, generating two SSVEPs in occipital regions whose size varied reciprocally as observers attended to the moving target. The two signals were combined into a single quantity that was modulated at the target frequency in a quasi-sinusoidal way, often clearly visible in single trials. The modulation continued almost unchanged when the target was switched off and observers mentally extrapolated its motion in imagery, and also when observers pointed their finger at the moving target during covert tracking, or imagined doing so. The amplitude of modulation during covert tracking was ∼25-30% of that measured when observers followed the target with their eyes. We used 4 electrodes in parieto-occipital areas, but similar results were achieved with a single electrode in Oz. In a second experiment we tested ramp and step motion. During overt tracking, SSVEPs were remarkably accurate, showing both saccadic-like and smooth pursuit-like modulations of cortical responsiveness, although during covert tracking the modulation deteriorated. Covert tracking was better with sinusoidal motion than ramp motion, and better with moving targets than stationary ones. The clear modulation of cortical responsiveness recorded during both overt and covert tracking, identical for motion observation and motion extrapolation, suggests to include covert attention movements in enactive theories of mental imagery. Copyright © 2015 Elsevier Ltd. All rights reserved.
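Only the general principle is sketched below: estimate the occipital response amplitude at each background flicker frequency and combine the two into a single left-right index that follows the attended location. The frequencies, channel choice, and normalization are illustrative assumptions, not the authors' exact analysis.

```python
import numpy as np

def ssvep_amplitude(signal, fs, freq):
    """Amplitude of the EEG signal at one flicker frequency (simple FFT estimate)."""
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return spectrum[np.argmin(np.abs(freqs - freq))]

def attention_index(signal, fs, f_left=12.0, f_right=15.0):
    """Positive values: stronger response to the half of the screen flickering at
    f_left; negative: stronger response to the f_right half (normalized difference)."""
    a_left = ssvep_amplitude(signal, fs, f_left)
    a_right = ssvep_amplitude(signal, fs, f_right)
    return (a_left - a_right) / (a_left + a_right)

# Toy occipital trace: stronger 12 Hz component, as if attending the left half
fs, t = 250, np.arange(0, 2, 1 / 250)
eeg = 2.0 * np.sin(2 * np.pi * 12 * t) + 0.8 * np.sin(2 * np.pi * 15 * t)
print(round(attention_index(eeg, fs), 2))   # positive -> attention on the left
```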
Context effects on smooth pursuit and manual interception of a disappearing target.
Kreyenmeier, Philipp; Fooken, Jolande; Spering, Miriam
2017-07-01
In our natural environment, we interact with moving objects that are surrounded by richly textured, dynamic visual contexts. Yet most laboratory studies on vision and movement show visual objects in front of uniform gray backgrounds. Context effects on eye movements have been widely studied, but it is less well known how visual contexts affect hand movements. Here we ask whether eye and hand movements integrate motion signals from target and context similarly or differently, and whether context effects on eye and hand change over time. We developed a track-intercept task requiring participants to track the initial launch of a moving object ("ball") with smooth pursuit eye movements. The ball disappeared after a brief presentation, and participants had to intercept it in a designated "hit zone." In two experiments ( n = 18 human observers each), the ball was shown in front of a uniform or a textured background that either was stationary or moved along with the target. Eye and hand movement latencies and speeds were similarly affected by the visual context, but eye and hand interception (eye position at time of interception, and hand interception timing error) did not differ significantly between context conditions. Eye and hand interception timing errors were strongly correlated on a trial-by-trial basis across all context conditions, highlighting the close relation between these responses in manual interception tasks. Our results indicate that visual contexts similarly affect eye and hand movements but that these effects may be short-lasting, affecting movement trajectories more than movement end points. NEW & NOTEWORTHY In a novel track-intercept paradigm, human observers tracked a briefly shown object moving across a textured, dynamic context and intercepted it with their finger after it had disappeared. Context motion significantly affected eye and hand movement latency and speed, but not interception accuracy; eye and hand position at interception were correlated on a trial-by-trial basis. Visual context effects may be short-lasting, affecting movement trajectories more than movement end points. Copyright © 2017 the American Physiological Society.
Sources of the monochromatic aberrations induced in human eyes after laser refractive surgery
NASA Astrophysics Data System (ADS)
Porter, Jason
Laser in-situ keratomileusis (LASIK) procedures correct the eye's defocus and astigmatism but also introduce higher order monochromatic aberrations. Little is known about the origins of these induced aberrations. The advent of wavefront sensor technology has made it possible to measure accurately and quickly the aberrations of normal and postoperative LASIK eyes. The goal of this thesis was to exploit this technology to better understand some of the potential mechanisms by which aberrations could be introduced during LASIK. A first step towards investigating these sources was to characterize the aberration changes in post-LASIK eyes. Higher order rms wavefront error increased after conventional and customized LASIK surgery. On average, spherical aberration approximately doubled, and significant changes in vertical and horizontal coma were observed. We examined two sources of postoperative aberrations: the creation of a microkeratome flap and the subsequent laser ablation. Higher order rms increased slightly and there was a wide variation in the response of individual Zernike modes after cutting a flap. The majority of induced spherical aberration was due to the laser ablation and not the flap-cut. Aberrations are also induced by static and dynamic decentrations of the patient's pupil. We found that ablations were typically decentered in the superotemporal direction due to shifts in pupil center location between aberration measurement (dilated) and surgical (undilated) conditions in customized LASIK eyes. There was a weak correlation between the horizontal coma theoretically induced by this offset and that measured postoperatively. Finally, dynamic eye movements during the procedure induce higher order aberrations. We found that the most problematic decentrations during LASIK are relatively slow drifts in eye position. An eye-tracking system with a 2-Hz closed-loop bandwidth could compensate for most eye movements during LASIK. One solution for reducing the aberrations induced by static and dynamic shifts in pupil center location is to reference the aberration measurement and treatment with respect to fixed features on the eye. Several other sources of aberration induction in LASIK, such as the efficiency of laser pulses striking the cornea perpendicularly versus obliquely, must still be investigated to optimize postoperative optical quality after LASIK.
Webcam mouse using face and eye tracking in various illumination environments.
Lin, Yuan-Pin; Chao, Yi-Ping; Lin, Chung-Chih; Chen, Jyh-Horng
2005-01-01
With increasing computer performance and the widespread use of webcams, it has become possible to capture users' gestures for human-computer interaction with a PC via webcam. However, illumination variation dramatically decreases the stability and accuracy of skin-based face tracking systems, especially on notebook or portable platforms. In this study we present an effective illumination recognition technique, combining a K-Nearest Neighbor (KNN) classifier with an adaptive skin model, to realize a real-time tracking system. We demonstrate that the accuracy of face detection based on the KNN classifier is higher than 92% across various illumination environments. In a real-time implementation, the system tracks the user's face and eye features at 15 fps on standard notebook platforms. Although the KNN classifier is initialized with only five environments, the system lets users define and add their own environments for computer access. Based on this efficient tracking algorithm, we developed a "Webcam Mouse" system that controls the PC cursor using face and eye tracking. Preliminary studies with "point and click" style PC web games also show promising applications in future consumer electronics markets.
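The illumination-recognition step can be pictured as a small K-Nearest-Neighbor classifier over global frame statistics that selects which skin model to apply; the features and the five example classes below are assumptions for illustration, not the descriptors used in the paper.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical per-frame features: [mean brightness, brightness std, red/green ratio]
X_train = np.array([
    [0.20, 0.05, 1.10], [0.22, 0.06, 1.12],   # "dim indoor"
    [0.55, 0.10, 1.05], [0.58, 0.11, 1.04],   # "office lighting"
    [0.80, 0.15, 0.95], [0.82, 0.14, 0.96],   # "daylight window"
    [0.40, 0.20, 1.30], [0.42, 0.22, 1.28],   # "warm lamp"
    [0.90, 0.05, 1.00], [0.88, 0.06, 1.01],   # "overexposed"
])
y_train = ["dim", "dim", "office", "office", "daylight", "daylight",
           "lamp", "lamp", "overexposed", "overexposed"]

knn = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
frame_features = np.array([[0.57, 0.10, 1.05]])       # features of the current frame
print(knn.predict(frame_features))                    # ['office'] -> pick that skin model
```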
Blackwood, D H; Sharp, C W; Walker, M T; Doody, G A; Glabus, M F; Muir, W J
1996-06-01
In large families with affective illness, identification of a biological variable is needed that reflects brain dysfunction at an earlier point than symptom development. Eye movement disorder, a possible vulnerability marker in schizophrenia, is less clearly associated with affective illness, although a subgroup of affective disorders shows smooth-pursuit eye movement disorder. The auditory P300 event-related potential may be a useful marker for risk to schizophrenia, but a role in bipolar illness is less certain. The distribution of these two biological variables and their association with symptoms in two multiply affected bipolar families is described. In a single, five-generation family identified for linkage studies through two bipolar I (BPI) probands, 128 members (including 20 spouses) were interviewed. The 108 related individuals had diagnoses of BPI (7), bipolar II (2), cyclothymia (3), or major depressive disorder (19). Eight others had generalised anxiety (1), minor depression (5), intermittent depression (1), or alcoholism (1). Sixty-nine subjects had no psychiatric diagnosis. P300 latency (81) and eye tracking (71) were recorded from a subgroup of relatives within the pedigree. Eye tracking was abnormal in 11 of 71 relatives (15.5%) and was bimodally distributed. In these 11 relatives, clinical diagnoses included minor depression (1), alcoholism (1) and generalised anxiety disorder (1). P300 latency was normally distributed and did not differ from controls. In a second family in which five of seven siblings have BPI illness, P300 latency and eye movement disorder were found in affected relatives and in some unaffected offspring. In these large families, clinical diagnoses of general anxiety, alcoholism and minor depression, when associated with eye tracking abnormality, may be considered alternative clinical manifestations of the same trait that in other relatives is expressed as bipolar illness.
Combining user logging with eye tracking for interactive and dynamic applications.
Ooms, Kristien; Coltekin, Arzu; De Maeyer, Philippe; Dupont, Lien; Fabrikant, Sara; Incoul, Annelies; Kuhn, Matthias; Slabbinck, Hendrik; Vansteenkiste, Pieter; Van der Haegen, Lise
2015-12-01
User evaluations of interactive and dynamic applications face various challenges related to the active nature of these displays. For example, users can often zoom and pan on digital products, and these interactions cause changes in the extent and/or level of detail of the stimulus. Therefore, in eye tracking studies, when a user's gaze is at a particular screen position (gaze position) over a period of time, the information contained in this particular position may have changed. Such digital activities are commonplace in modern life, yet it has been difficult to automatically compare the changing information at the viewed position, especially across many participants. Existing solutions typically involve tedious and time-consuming manual work. In this article, we propose a methodology that can overcome this problem. By combining eye tracking with user logging (mouse and keyboard actions) with cartographic products, we are able to accurately reference screen coordinates to geographic coordinates. This referencing approach allows researchers to know which geographic object (location or attribute) corresponds to the gaze coordinates at all times. We tested the proposed approach through two case studies, and discuss the advantages and disadvantages of the applied methodology. Furthermore, the applicability of the proposed approach is discussed with respect to other fields of research that use eye tracking-namely, marketing, sports and movement sciences, and experimental psychology. From these case studies and discussions, we conclude that combining eye tracking and user-logging data is an essential step forward in efficiently studying user behavior with interactive and static stimuli in multiple research fields.
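The core referencing step, converting a gaze sample recorded in screen pixels into the map location displayed at that instant, can be sketched as follows for a planar (already projected) map view; the viewport-log format is a hypothetical stand-in for the logged pan/zoom state, not the authors' file format.

```python
def gaze_to_map(gaze_x_px, gaze_y_px, viewport):
    """Convert a gaze position in screen pixels to map coordinates, given the
    viewport state logged at the same timestamp (planar map, y axis flipped)."""
    x_map = viewport["x_min"] + gaze_x_px * viewport["units_per_px"]
    y_map = viewport["y_max"] - gaze_y_px * viewport["units_per_px"]
    return x_map, y_map

# Logged state after a pan/zoom: map window origin and scale at this timestamp
viewport = {"x_min": 540000.0, "y_max": 6610000.0, "units_per_px": 2.5}
print(gaze_to_map(960, 540, viewport))   # (542400.0, 6608650.0)
```

Each pan or zoom event in the user log updates the viewport record, so the same gaze pixel resolves to different geographic locations at different times.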
Driver eye-scanning behavior at intersections at night.
DOT National Transportation Integrated Search
2009-10-01
This research project analyzed drivers' eye-scanning behavior at night when approaching signalized and unsignalized intersections, using data from a head-mounted eye-tracking system during open-road driving on a prescribed route. During the ...
A model that integrates eye velocity commands to keep track of smooth eye displacements.
Blohm, Gunnar; Optican, Lance M; Lefèvre, Philippe
2006-08-01
Past results have reported conflicting findings on the oculomotor system's ability to keep track of smooth eye movements in darkness. Whereas some results indicate that saccades cannot compensate for smooth eye displacements, others report that memory-guided saccades during smooth pursuit are spatially correct. Recently, it was shown that the amount of time before the saccade made a difference: short-latency saccades were retinotopically coded, whereas long-latency saccades were spatially coded. Here, we propose a model of the saccadic system that can explain the available experimental data. The novel part of this model consists of a delayed integration of efferent smooth eye velocity commands. Two alternative physiologically realistic neural mechanisms for this integration stage are proposed. Model simulations accurately reproduced prior findings. Thus, this model reconciles the earlier contradictory reports from the literature about compensation for smooth eye movements before saccades because it involves a slow integration process.
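The delayed integration stage can be pictured numerically: integrate a delayed copy of the efferent eye-velocity command to estimate the smooth displacement accumulated since the target was seen, and subtract it from the stored retinal error so the saccade goal stays spatially correct. The sketch below is a schematic of that idea (with an assumed 90 ms delay), not the published model equations.

```python
import numpy as np

def spatial_saccade_target(retinal_error_deg, eye_velocity_deg_s, dt_s, delay_s=0.09):
    """Subtract the smooth eye displacement (integral of a delayed copy of the
    efferent eye-velocity command) from the stored retinal error."""
    delay_samples = int(round(delay_s / dt_s))
    delayed_velocity = np.concatenate([np.zeros(delay_samples),
                                       eye_velocity_deg_s[:len(eye_velocity_deg_s) - delay_samples]])
    smooth_displacement = np.sum(delayed_velocity) * dt_s   # rectangular integration
    return retinal_error_deg - smooth_displacement

dt = 0.001                                   # 1 kHz command signal
velocity = np.full(500, 10.0)                # 10 deg/s pursuit for 500 ms
# About -1.1 deg with the delayed displacement estimate (true displacement is 5 deg)
print(round(spatial_saccade_target(3.0, velocity, dt), 2))
```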
High-resolution eye tracking using V1 neuron activity
McFarland, James M.; Bondy, Adrian G.; Cumming, Bruce G.; Butts, Daniel A.
2014-01-01
Studies of high-acuity visual cortical processing have been limited by the inability to track eye position with sufficient accuracy to precisely reconstruct the visual stimulus on the retina. As a result, studies on primary visual cortex (V1) have been performed almost entirely on neurons outside the high-resolution central portion of the visual field (the fovea). Here we describe a procedure for inferring eye position using multi-electrode array recordings from V1 coupled with nonlinear stimulus processing models. We show that this method can be used to infer eye position with one arc-minute accuracy – significantly better than conventional techniques. This allows for analysis of foveal stimulus processing, and provides a means to correct for eye-movement induced biases present even outside the fovea. This method could thus reveal critical insights into the role of eye movements in cortical coding, as well as their contribution to measures of cortical variability. PMID:25197783
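The general idea, inferring eye position by asking which spatial shift of the stimulus best explains the recorded activity under fitted stimulus-processing models, can be caricatured with a toy grid search; the 1-D stimulus, exponential model, and noise-free counts below are simplifications for a deterministic demo, not the authors' method.

```python
import numpy as np

def poisson_ll(counts, rates):
    """Poisson log-likelihood up to a constant in the counts."""
    rates = np.maximum(rates, 1e-9)
    return np.sum(counts * np.log(rates) - rates)

def infer_eye_offset(counts, stimulus, rate_model, candidate_offsets_px):
    """Grid search: pick the eye-position offset whose shifted stimulus best
    explains the recorded activity under the fitted rate model."""
    scores = [poisson_ll(counts, rate_model(np.roll(stimulus, off)))
              for off in candidate_offsets_px]
    return candidate_offsets_px[int(np.argmax(scores))]

rng = np.random.default_rng(1)
n_neurons, n_pix = 50, 200
stimulus = rng.normal(size=n_pix)                     # 1-D toy stimulus
W = rng.normal(scale=0.1, size=(n_neurons, n_pix))    # "fitted" linear filters
rate_model = lambda s: np.exp(W @ s)                  # toy linear-nonlinear model

true_offset_px = 3                                    # unknown eye shift to recover
counts = rate_model(np.roll(stimulus, true_offset_px))  # noise-free counts for determinism
print(infer_eye_offset(counts, stimulus, rate_model, list(range(-10, 11))))  # 3
```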
Experiencing Light's Properties within Your Own Eye
ERIC Educational Resources Information Center
Mauser, Michael
2011-01-01
Seeing the reflection, refraction, dispersion, absorption, polarization, and scattering or diffraction of light within your own eye makes these properties of light truly personal. There are practical aspects of these within the eye phenomena, such as eye tracking for computer interfaces. They also offer some intriguing diversions, for example,…
ERIC Educational Resources Information Center
Boucheix, Jean-Michel; Lowe, Richard K.
2010-01-01
Two experiments used eye tracking to investigate a novel cueing approach for directing learner attention to low salience, high relevance aspects of a complex animation. In the first experiment, comprehension of a piano mechanism animation containing spreading-colour cues was compared with comprehension obtained with arrow cues or no cues. Eye…
ERIC Educational Resources Information Center
Stewart, Andrew J.; Haigh, Matthew; Ferguson, Heather J.
2013-01-01
Statements of the form if… then… can be used to communicate conditional speech acts such as tips and promises. Conditional promises require the speaker to have perceived control over the outcome event, whereas conditional tips do not. In an eye-tracking study, we examined whether readers are sensitive to information about perceived speaker control…
ERIC Educational Resources Information Center
Tai, Robert H.; Loehr, John F.; Brigham, Frederick J.
2006-01-01
This pilot study investigated the capacity of eye-gaze tracking to identify differences in problem-solving behaviours within a group of individuals who possessed varying degrees of knowledge and expertise in three disciplines of science (biology, chemistry and physics). The six participants, all pre-service science teachers, completed an 18-item…
An Eye-Tracking Study of Learning from Science Text with Concrete and Abstract Illustrations
ERIC Educational Resources Information Center
Mason, Lucia; Pluchino, Patrik; Tornatora, Maria Caterina; Ariasi, Nicola
2013-01-01
This study investigated the online process of reading and the offline learning from an illustrated science text. The authors examined the effects of using a concrete or abstract picture to illustrate a text and adopted eye-tracking methodology to trace text and picture processing. They randomly assigned 59 eleventh-grade students to 3 reading…
Peer Assessment of Webpage Design: Behavioral Sequential Analysis Based on Eye-Tracking Evidence
ERIC Educational Resources Information Center
Hsu, Ting-Chia; Chang, Shao-Chen; Liu, Nan-Cen
2018-01-01
This study employed an eye-tracking machine to record the process of peer assessment. Each web page was divided into several regions of interest (ROIs) based on the frame design and content. A total of 49 undergraduate students with a visual learning style participated in the experiment. This study investigated the peer assessment attitudes of the…
ERIC Educational Resources Information Center
Gegenfurtner, Andreas; Lehtinen, Erno; Saljo, Roger
2011-01-01
This meta-analysis integrates 296 effect sizes reported in eye-tracking research on expertise differences in the comprehension of visualizations. Three theories were evaluated: Ericsson and Kintsch's ("Psychol Rev" 102:211-245, 1995) theory of long-term working memory, Haider and Frensch's ("J Exp Psychol Learn Mem Cognit" 25:172-190, 1999)…
ERIC Educational Resources Information Center
Manelis, Anna; Reder, Lynne M.
2012-01-01
Using a combination of eye tracking and fMRI in a contextual cueing task, we explored the mechanisms underlying the facilitation of visual search for repeated spatial configurations. When configurations of distractors were repeated, greater activation in the right hippocampus corresponded to greater reductions in the number of saccades to locate…
An Eye Tracking Investigation of Attentional Biases towards Affect in Young Children
ERIC Educational Resources Information Center
Burris, Jessica L.; Barry-Anwar, Ryan A.; Rivera, Susan M.
2017-01-01
This study examines attentional biases in the presence of angry, happy and neutral faces using a modified eye tracking version of the dot probe task (DPT). Participants were 111 young children between 9 and 48 months. Children passively viewed an affective attention bias task that consisted of a face pairing (neutral paired with either neutral,…
ERIC Educational Resources Information Center
Cortina, Kai S.; Miller, Kevin F.; McKenzie, Ryan; Epstein, Alanna
2015-01-01
Classroom observation research and research on teacher expertise are similar in their reliance on observational data with high-inference procedure to assess the quality of instruction. Expertise research usually uses low-inference measures like eye tracking to identify qualitative difference between expert and novice behaviors and cognition. In…
Keller, Jürgen; Krimly, Amon; Bauer, Lisa; Schulenburg, Sarah; Böhm, Sarah; Aho-Özhan, Helena E A; Uttner, Ingo; Gorges, Martin; Kassubek, Jan; Pinkhardt, Elmar H; Abrahams, Sharon; Ludolph, Albert C; Lulé, Dorothée
2017-08-01
Reliable assessment of cognitive functions is a challenging task in amyotrophic lateral sclerosis (ALS) patients unable to speak and write. We therefore present an eye-tracking based neuropsychological screening tool based on the Edinburgh Cognitive and Behavioural ALS Screen (ECAS), a standard screening tool for cognitive deficits in ALS. In total, 46 ALS patients and 50 healthy controls matched for age, gender and education were tested with an oculomotor based and a standard paper-and-pencil version of the ECAS. Significant correlation between both versions was observed for ALS patients and healthy controls in the ECAS total score and in all of its ALS-specific domains (all r > 0.3; all p < 0.05). The eye-tracking version of the ECAS reliably distinguished between ALS patients and healthy controls in the ECAS total score (p < 0.05). Also, cognitively impaired and non-impaired patients could be reliably distinguished with a specificity of 95%. This study provides first evidence that the eye-tracking based ECAS version is a promising approach for assessing cognitive deficits in ALS patients who are unable to speak or write.
Innovative Techniques for Evaluating Behavioral Nutrition Interventions
Laugero, Kevin D; Cunningham, Brian T; Lora, Karina R; Reicks, Marla
2017-01-01
Assessing outcomes and the impact from behavioral nutrition interventions has remained challenging because of the lack of methods available beyond traditional nutrition assessment tools and techniques. With the current high global obesity and related chronic disease rates, novel methods to evaluate the impact of behavioral nutrition-based interventions are much needed. The objective of this narrative review is to describe and review the current status of knowledge as it relates to 4 different innovative methods or tools to assess behavioral nutrition interventions. Methods reviewed include 1) the assessment of stress and stress responsiveness to enhance the evaluation of nutrition interventions, 2) eye-tracking technology in nutritional interventions, 3) smartphone biosensors to assess nutrition and health-related outcomes, and 4) skin carotenoid measurements to assess fruit and vegetable intake. Specifically, the novel use of functional magnetic resonance imaging, by characterizing the brain’s responsiveness to an intervention, can help researchers develop programs with greater efficacy. Similarly, if eye-tracking technology can enable researchers to get a better sense as to how participants view materials, the materials may be better tailored to create an optimal impact. The latter 2 techniques reviewed, smartphone biosensors and methods to detect skin carotenoids, can provide the research community with portable, effective, nonbiased ways to assess dietary intake and quality and more in the field. The information gained from using these types of methodologies can improve the efficacy and assessment of behavior-based nutrition interventions. PMID:28096132
Validation of a Behavioral Approach for Measuring Saccades in Parkinson's Disease.
Turner, Travis H; Renfroe, Jenna B; Duppstadt-Delambo, Amy; Hinson, Vanessa K
2017-01-01
Speed and control of saccades are related to disease progression and cognitive functioning in Parkinson's disease (PD). Traditional eye-tracking complexities encumber application for individual evaluations and clinical trials. The authors examined psychometric properties of standalone tasks for reflexive prosaccade latency, volitional saccade initiation, and saccade inhibition (antisaccade) in a heterogeneous sample of 65 PD patients. Demographics had minimal impact on task performance. Thirty-day test-retest reliability estimates for behavioral tasks were acceptable and similar to traditional eye tracking. Behavioral tasks demonstrated concurrent validity with traditional eye-tracking measures; discriminant validity was less clear. Saccade initiation and inhibition discriminated PD patients with cognitive impairment. The present findings support further development and use of the behavioral tasks for assessing latency and control of saccades in PD.
Workload assessment of surgeons: correlation between NASA TLX and blinks.
Zheng, Bin; Jiang, Xianta; Tien, Geoffrey; Meneghetti, Adam; Panton, O Neely M; Atkins, M Stella
2012-10-01
Blinks are known as an indicator of visual attention and mental stress. In this study, surgeons' mental workload was evaluated utilizing a paper assessment instrument (National Aeronautics and Space Administration Task Load Index, NASA TLX) and by examining their eye blinks. Correlation between these two assessments was reported. Surgeons' eye motions were video-recorded using a head-mounted eye-tracker while the surgeons performed a laparoscopic procedure on a virtual reality trainer. Blink frequency and duration were computed using computer vision technology. The level of workload experienced during the procedure was reported by surgeons using the NASA TLX. A total of 42 valid videos were recorded from 23 surgeons. After blinks were computed, videos were divided into two groups based on the blink frequency: infrequent group (≤ 6 blinks/min) and frequent group (more than 6 blinks/min). Surgical performance (measured by task time and trajectories of tool tips) was not significantly different between these two groups, but NASA TLX scores were significantly different. Surgeons who blinked infrequently reported a higher level of frustration (46 vs. 34, P = 0.047) and higher overall level of workload (57 vs. 47, P = 0.045) than those who blinked more frequently. The correlation coefficients (Pearson test) between NASA TLX and the blink frequency and duration were -0.17 and 0.446. Reduction of blink frequency and shorter blink duration matched the increasing level of mental workload reported by surgeons. The value of using eye-tracking technology for assessment of surgeon mental workload was shown.
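As an editorial illustration of the analysis described above, the sketch below computes blink frequency and mean blink duration from blink onset/offset timestamps and then correlates blink frequency with NASA TLX scores using SciPy's pearsonr. All variable names and numbers are invented for the example; only the general procedure (blink metrics plus a Pearson test) follows the abstract.

```python
import numpy as np
from scipy.stats import pearsonr

def blink_metrics(blink_onsets_s, blink_offsets_s, recording_minutes):
    """Return blink frequency (blinks/min) and mean blink duration (s) from event timestamps."""
    onsets = np.asarray(blink_onsets_s, dtype=float)
    offsets = np.asarray(blink_offsets_s, dtype=float)
    frequency = len(onsets) / recording_minutes
    mean_duration = float(np.mean(offsets - onsets))
    return frequency, mean_duration

# Invented per-surgeon values, one pair per recorded procedure.
blink_freq = [4.2, 5.8, 7.1, 9.3, 6.0, 3.9]     # blinks per minute
tlx_total = [58, 55, 44, 40, 47, 60]            # NASA TLX overall workload score

r, p = pearsonr(blink_freq, tlx_total)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")      # a negative r mirrors the reported direction
```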
Temporal eye movement strategies during naturalistic viewing
Wang, Helena X.; Freeman, Jeremy; Merriam, Elisha P.; Hasson, Uri; Heeger, David J.
2011-01-01
The deployment of eye movements to complex spatiotemporal stimuli likely involves a variety of cognitive factors. However, eye movements to movies are surprisingly reliable both within and across observers. We exploited and manipulated that reliability to characterize observers’ temporal viewing strategies. Introducing cuts and scrambling the temporal order of the resulting clips systematically changed eye movement reliability. We developed a computational model that exhibited this behavior and provided an excellent fit to the measured eye movement reliability. The model assumed that observers searched for, found, and tracked a point-of-interest, and that this process reset when there was a cut. The model did not require that eye movements depend on temporal context in any other way, and it managed to describe eye movements consistently across different observers and two movie sequences. Thus, we found no evidence for the integration of information over long time scales (greater than a second). The results are consistent with the idea that observers employ a simple tracking strategy even while viewing complex, engaging naturalistic stimuli. PMID:22262911
[Virtual reality in ophthalmological education].
Wagner, C; Schill, M; Hennen, M; Männer, R; Jendritza, B; Knorz, M C; Bender, H J
2001-04-01
We present a computer-based medical training workstation for the simulation of intraocular eye surgery. The surgeon manipulates two original instruments inside a mechanical model of the eye. The instrument positions are tracked by CCD cameras and monitored by a PC which renders the scenery using a computer-graphic model of the eye and the instruments. The simulator incorporates a model of the operation table, a mechanical eye, three CCD cameras for the position tracking, the stereo display, and a computer. The three cameras are mounted under the operation table from where they can observe the interior of the mechanical eye. Using small markers the cameras recognize the instruments and the eye. Their position and orientation in space is determined by stereoscopic back projection. The simulation runs with more than 20 frames per second and provides a realistic impression of the surgery. It includes the cold light source which can be moved inside the eye and the shadow of the instruments on the retina which is important for navigational purposes.
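The simulator above recovers instrument positions by stereoscopic back projection from multiple calibrated cameras. The snippet below is a minimal two-camera triangulation sketch using OpenCV's triangulatePoints; the intrinsic matrix, baseline, and pixel coordinates are assumed values, not the authors' calibration, and the real system uses three cameras and marker detection.

```python
import cv2
import numpy as np

# Assumed intrinsics and a 60 mm stereo baseline; in the real system these come
# from camera calibration (three cameras observe the interior of the mechanical eye).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                   # camera 1 at the origin
P2 = K @ np.hstack([np.eye(3), np.array([[-60.0], [0.0], [0.0]])])  # camera 2, shifted baseline

# Pixel coordinates of the same instrument marker in both views (2xN arrays, toy values).
pts_cam1 = np.array([[512.0], [384.0]])
pts_cam2 = np.array([[432.0], [385.0]])

points_4d = cv2.triangulatePoints(P1, P2, pts_cam1, pts_cam2)       # homogeneous 4xN
marker_xyz = (points_4d[:3] / points_4d[3]).ravel()                 # back-projected 3D position
print("marker position in the camera-1 frame (mm):", marker_xyz)
```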
NASA Astrophysics Data System (ADS)
Tera, Akemi; Shirai, Kiyoaki; Yuizono, Takaya; Sugiyama, Kozo
In order to investigate reading processes of Japanese language learners, we have conducted an experiment to record eye movements during Japanese text reading using an eye-tracking system. We showed that Japanese native speakers use “forward and backward jumping eye movements” frequently[13],[14]. In this paper, we analyzed further the same eye tracking data. Our goal is to examine whether Japanese learners fix their eye movements at boundaries of linguistic units such as words, phrases or clauses when they start or end “backward jumping”. We consider conventional linguistic boundaries as well as boundaries empirically defined based on the entropy of the N-gram model. Another goal is to examine the relation between the entropy of the N-gram model and the depth of syntactic structures of sentences. Our analysis shows that (1) Japanese learners often fix their eyes at linguistic boundaries, (2) the average of the entropy is the greatest at the fifth depth of syntactic structures.
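Because the boundaries above are defined empirically from the entropy of an N-gram model, a small sketch may help: the function below estimates, for each position in a token sequence, the entropy of the next-token distribution under a bigram model. The toy corpus and the use of a bigram (rather than a higher-order) model are illustrative assumptions, not the authors' setup.

```python
import math
from collections import Counter, defaultdict

def bigram_entropies(tokens):
    """Entropy (bits) of the next-token distribution at each position, under a bigram model."""
    follow = defaultdict(Counter)
    for a, b in zip(tokens, tokens[1:]):
        follow[a][b] += 1
    entropies = []
    for a in tokens[:-1]:
        counts = follow[a]
        total = sum(counts.values())
        entropies.append(-sum((c / total) * math.log2(c / total) for c in counts.values()))
    return entropies

# Toy corpus; in practice the model would be estimated from a large text collection.
tokens = "the cat sat on the mat the cat lay on the rug".split()
for token, h in zip(tokens[:-1], bigram_entropies(tokens)):
    print(f"{token:>4s} -> entropy {h:.2f} bits")
```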
Bueeler, Michael; Mrochen, Michael
2005-01-01
The aim of this theoretical work was to investigate the robustness of scanning spot laser treatments with different laser spot diameters and peak ablation depths in case of incomplete compensation of eye movements due to eye-tracker latency. Scanning spot corrections of 3rd to 5th Zernike order wavefront errors were numerically simulated. Measured eye-movement data were used to calculate the positioning error of each laser shot assuming eye-tracker latencies of 0, 5, 30, and 100 ms, and for the case of no eye tracking. The single spot ablation depth ranged from 0.25 to 1.0 microm and the spot diameter from 250 to 1000 microm. The quality of the ablation was rated by the postoperative surface variance and the Strehl intensity ratio, which was calculated after a low-pass filter was applied to simulate epithelial surface smoothing. Treatments performed with nearly ideal eye tracking (latency approximately 0) provide the best results with a small laser spot (0.25 mm) and a small ablation depth (0.25 microm). However, combinations of a large spot diameter (1000 microm) and a small ablation depth per pulse (0.25 microm) yield better results for latencies above a certain threshold that must be determined specifically. Treatments performed with tracker latencies in the order of 100 ms yield results similar to treatments done completely without eye-movement compensation. CONCLUSIONS: Reduction of spot diameter was shown to make the correction more susceptible to eye movement induced error. A smaller spot size is only beneficial when eye movement is neutralized with a tracking system with a latency <5 ms.
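The latency effect studied above can be illustrated with a toy simulation: the beam is aimed at the eye position measured one latency interval earlier, so the placement error is simply the eye displacement over that interval. The trace, sampling rate, and error metric below are assumptions for illustration, not the authors' measured eye-movement data or ablation model.

```python
import numpy as np

fs = 1000.0                                     # Hz, assumed sampling rate of the eye-movement trace
t = np.arange(0, 2.0, 1.0 / fs)
# Toy lateral eye-position trace in micrometres (the study used measured eye-movement data).
eye_x = 50.0 * np.sin(2 * np.pi * 0.8 * t) + 20.0 * np.sin(2 * np.pi * 4.0 * t)

def rms_placement_error(eye_pos, latency_s, fs):
    """RMS error when each shot is aimed at the position measured `latency_s` earlier."""
    lag = int(round(latency_s * fs))
    if lag == 0:
        return 0.0
    error = eye_pos[lag:] - eye_pos[:-lag]
    return float(np.sqrt(np.mean(error ** 2)))

for latency_ms in (0, 5, 30, 100):
    err = rms_placement_error(eye_x, latency_ms / 1000.0, fs)
    print(f"latency {latency_ms:3d} ms -> RMS placement error {err:5.1f} um")
```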
Binocular eye movement control and motion perception: what is being tracked?
van der Steen, Johannes; Dits, Joyce
2012-10-19
We investigated under what conditions humans can make independent slow phase eye movements. The ability to make independent movements of the two eyes generally is attributed to few specialized lateral eyed animal species, for example chameleons. In our study, we showed that humans also can move the eyes in different directions. To maintain binocular retinal correspondence independent slow phase movements of each eye are produced. We used the scleral search coil method to measure binocular eye movements in response to dichoptically viewed visual stimuli oscillating in orthogonal direction. Correlated stimuli led to orthogonal slow eye movements, while the binocularly perceived motion was the vector sum of the motion presented to each eye. The importance of binocular fusion on independency of the movements of the two eyes was investigated with anti-correlated stimuli. The perceived global motion pattern of anti-correlated dichoptic stimuli was perceived as an oblique oscillatory motion, as well as resulted in a conjugate oblique motion of the eyes. We propose that the ability to make independent slow phase eye movements in humans is used to maintain binocular retinal correspondence. Eye-of-origin and binocular information are used during the processing of binocular visual information, and it is decided at an early stage whether binocular or monocular motion information and independent slow phase eye movements of each eye are produced during binocular tracking.
Binocular Eye Movement Control and Motion Perception: What Is Being Tracked?
van der Steen, Johannes; Dits, Joyce
2012-01-01
Purpose. We investigated under what conditions humans can make independent slow phase eye movements. The ability to make independent movements of the two eyes generally is attributed to few specialized lateral eyed animal species, for example chameleons. In our study, we showed that humans also can move the eyes in different directions. To maintain binocular retinal correspondence independent slow phase movements of each eye are produced. Methods. We used the scleral search coil method to measure binocular eye movements in response to dichoptically viewed visual stimuli oscillating in orthogonal direction. Results. Correlated stimuli led to orthogonal slow eye movements, while the binocularly perceived motion was the vector sum of the motion presented to each eye. The importance of binocular fusion on independency of the movements of the two eyes was investigated with anti-correlated stimuli. The perceived global motion pattern of anti-correlated dichoptic stimuli was perceived as an oblique oscillatory motion, as well as resulted in a conjugate oblique motion of the eyes. Conclusions. We propose that the ability to make independent slow phase eye movements in humans is used to maintain binocular retinal correspondence. Eye-of-origin and binocular information are used during the processing of binocular visual information, and it is decided at an early stage whether binocular or monocular motion information and independent slow phase eye movements of each eye are produced during binocular tracking. PMID:22997286
Face landmark point tracking using LK pyramid optical flow
NASA Astrophysics Data System (ADS)
Zhang, Gang; Tang, Sikan; Li, Jiaquan
2018-04-01
LK pyramid optical flow is an effective method for object tracking in video, and in this paper it is used for face landmark point tracking. Seven landmark points are considered: the outer and inner corners of the left eye, the inner and outer corners of the right eye, the tip of the nose, and the left and right corners of the mouth. The landmark points are marked by hand in the first frame; tracking performance is then analyzed over the subsequent frames. Two kinds of conditions are considered: single factors (the normalized case, pose variation with slow movement, expression variation, illumination variation, occlusion, a frontal face moving rapidly, a posed face moving rapidly) and combinations of factors (pose and illumination variation, pose and expression variation, pose variation and occlusion, illumination and expression variation, expression variation and occlusion). Global and local measures are introduced to evaluate tracking performance under the different factors and their combinations. The global measures comprise the number of images aligned successfully, the average alignment error, and the number of images aligned before failure; the local measures comprise the number of images aligned successfully for each facial component and the average alignment error for each component. To verify face landmark tracking performance under the different cases, tests were carried out on image sequences that we collected. Results show that the LK pyramid optical flow method can track face landmark points under the normalized case, expression variation, illumination variation that does not affect facial details, and pose variation, and that different factors and combinations of factors affect alignment performance differently for different landmark points.
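For readers who want to reproduce the basic tracking step, the sketch below propagates hand-marked landmark points through a video with OpenCV's pyramidal Lucas-Kanade routine (calcOpticalFlowPyrLK). The file name, initial coordinates, and window/pyramid parameters are placeholders, and the snippet omits the evaluation measures described in the abstract.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("face_sequence.avi")            # assumed input video
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Seven landmark points marked by hand in the first frame: eye corners, nose tip, mouth corners.
points = np.array([[[210, 180]], [[250, 182]], [[300, 181]], [[340, 179]],
                   [[275, 230]], [[240, 280]], [[310, 282]]], dtype=np.float32)

lk_params = dict(winSize=(21, 21), maxLevel=3,
                 criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))

while True:
    ok, frame = cap.read()
    if not ok or len(points) == 0:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    new_points, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None, **lk_params)
    points = new_points[status.ravel() == 1].reshape(-1, 1, 2)   # keep only successfully tracked points
    prev_gray = gray
    # ...here one would compare `points` with ground-truth landmarks to compute alignment error.
cap.release()
```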
Prakash, Gaurav; Ashok Kumar, Dhivya; Agarwal, Amar; Jacob, Soosan; Sarvanan, Yoga; Agarwal, Athiya
2010-02-01
To analyze the predictive factors associated with success of iris recognition and dynamic rotational eye tracking on a laser in situ keratomileusis (LASIK) platform with active assessment and correction of intraoperative cyclotorsion. Interventional case series. Two hundred seventy-five eyes of 142 consecutive candidates underwent LASIK with attempted iris recognition and dynamic rotational tracking on the Technolas 217z100 platform (Technolas Perfect Vision, St Louis, Missouri, USA) at a tertiary care ophthalmic hospital. The main outcome measures were age, gender, flap creation method (femtosecond, microkeratome, epi-LASIK), success of static rotational tracking, ablation algorithm, pulses, and depth; preablation and intraablation rotational activity were analyzed and evaluated using regression models. Preablation static iris recognition was successful in 247 eyes, without difference in flap creation methods (P = .6). Age (partial correlation, -0.16; P = .014), number of pulses (partial correlation, 0.39; P = 1.6 x 10(-8)), and gender (P = .02) were significant predictive factors for the amount of intraoperative cyclodeviation. Tracking difficulties leading to linking the ablation with a new intraoperatively acquired iris image were more frequent with femtosecond-assisted flaps (P = 2.8 x 10(-7)) and with greater intraoperative cyclotorsion (P = .02). However, the number of cases having nonresolvable failure of intraoperative rotational tracking was similar in the 3 flap creation methods (P = .22). Intraoperative cyclotorsional activity depends on age, gender, and duration of ablation (pulses delivered). Femtosecond flaps do not seem to have a disadvantage over microkeratome flaps as far as iris recognition and success of intraoperative dynamic rotational tracking are concerned. Copyright (c) 2010 Elsevier Inc. All rights reserved.
Adaptive optics optical coherence tomography with dynamic retinal tracking
Kocaoglu, Omer P.; Ferguson, R. Daniel; Jonnal, Ravi S.; Liu, Zhuolin; Wang, Qiang; Hammer, Daniel X.; Miller, Donald T.
2014-01-01
Adaptive optics optical coherence tomography (AO-OCT) is a highly sensitive and noninvasive method for three dimensional imaging of the microscopic retina. Like all in vivo retinal imaging techniques, however, it suffers the effects of involuntary eye movements that occur even under normal fixation. In this study we investigated dynamic retinal tracking to measure and correct eye motion at KHz rates for AO-OCT imaging. A customized retina tracking module was integrated into the sample arm of the 2nd-generation Indiana AO-OCT system and images were acquired on three subjects. Analyses were developed based on temporal amplitude and spatial power spectra in conjunction with strip-wise registration to independently measure AO-OCT tracking performance. After optimization of the tracker parameters, the system was found to correct eye movements up to 100 Hz and reduce residual motion to 10 µm root mean square. Between session precision was 33 µm. Performance was limited by tracker-generated noise at high temporal frequencies. PMID:25071963
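Strip-wise registration of the kind mentioned above can be prototyped offline with phase correlation: each horizontal strip of a frame is aligned to the corresponding strip of a reference frame, and the per-strip shifts approximate the eye motion during that strip's acquisition. The sketch below uses scikit-image's phase_cross_correlation on synthetic data; it is a generic offline analogue, not the hardware tracking module described in the study.

```python
import numpy as np
from skimage.registration import phase_cross_correlation

def stripwise_shifts(reference, frame, strip_height=32):
    """Sub-pixel (dy, dx) shift of each horizontal strip of `frame` relative to `reference`."""
    shifts = []
    for top in range(0, reference.shape[0] - strip_height + 1, strip_height):
        ref_strip = reference[top:top + strip_height]
        mov_strip = frame[top:top + strip_height]
        shift, _error, _phase = phase_cross_correlation(ref_strip, mov_strip, upsample_factor=10)
        shifts.append(shift)
    return np.array(shifts)

rng = np.random.default_rng(0)
reference = rng.random((256, 256))
frame = np.roll(reference, shift=(3, -2), axis=(0, 1))      # toy motion: 3 px down, 2 px left
print(stripwise_shifts(reference, frame).mean(axis=0))      # recovers the simulated shift
```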
Tracking Students' Cognitive Processes during Program Debugging--An Eye-Movement Approach
ERIC Educational Resources Information Center
Lin, Yu-Tzu; Wu, Cheng-Chih; Hou, Ting-Yun; Lin, Yu-Chih; Yang, Fang-Ying; Chang, Chia-Hu
2016-01-01
This study explores students' cognitive processes while debugging programs by using an eye tracker. Students' eye movements during debugging were recorded by an eye tracker to investigate whether and how high- and low-performance students act differently during debugging. Thirty-eight computer science undergraduates were asked to debug two C…
Eye-Movement Patterns Are Associated with Communicative Competence in Autistic Spectrum Disorders
ERIC Educational Resources Information Center
Norbury, Courtenay Frazier; Brock, Jon; Cragg, Lucy; Einav, Shiri; Griffiths, Helen; Nation, Kate
2009-01-01
Background: Investigations using eye-tracking have reported reduced fixations to salient social cues such as eyes when participants with autism spectrum disorders (ASD) view social scenes. However, these studies have not distinguished different cognitive phenotypes. Methods: The eye-movements of 28 teenagers with ASD and 18 typically developing…
Improving Silent Reading Performance through Feedback on Eye Movements: A Feasibility Study
ERIC Educational Resources Information Center
Korinth, Sebastian P.; Fiebach, Christian J.
2018-01-01
This feasibility study investigated if feedback about individual eye movements, reflecting varying word processing stages, can improve reading performance. Twenty-five university students read 90 newspaper articles during 9 eye-tracking sessions. Training group participants (n = 12) were individually briefed before each session, which eye movement…
Tracking without perceiving: a dissociation between eye movements and motion perception.
Spering, Miriam; Pomplun, Marc; Carrasco, Marisa
2011-02-01
Can people react to objects in their visual field that they do not consciously perceive? We investigated how visual perception and motor action respond to moving objects whose visibility is reduced, and we found a dissociation between motion processing for perception and for action. We compared motion perception and eye movements evoked by two orthogonally drifting gratings, each presented separately to a different eye. The strength of each monocular grating was manipulated by inducing adaptation to one grating prior to the presentation of both gratings. Reflexive eye movements tracked the vector average of both gratings (pattern motion) even though perceptual responses followed one motion direction exclusively (component motion). Observers almost never perceived pattern motion. This dissociation implies the existence of visual-motion signals that guide eye movements in the absence of a corresponding conscious percept.
Tracking Without Perceiving: A Dissociation Between Eye Movements and Motion Perception
Spering, Miriam; Pomplun, Marc; Carrasco, Marisa
2011-01-01
Can people react to objects in their visual field that they do not consciously perceive? We investigated how visual perception and motor action respond to moving objects whose visibility is reduced, and we found a dissociation between motion processing for perception and for action. We compared motion perception and eye movements evoked by two orthogonally drifting gratings, each presented separately to a different eye. The strength of each monocular grating was manipulated by inducing adaptation to one grating prior to the presentation of both gratings. Reflexive eye movements tracked the vector average of both gratings (pattern motion) even though perceptual responses followed one motion direction exclusively (component motion). Observers almost never perceived pattern motion. This dissociation implies the existence of visual-motion signals that guide eye movements in the absence of a corresponding conscious percept. PMID:21189353
The Human Engineering Eye Movement Measurement Research Facility.
1985-04-01
tracked reliably. When tracking is disrupted (e.g., by gross and sudden head movements, gross change in the head position, sneezing, prolonged eye...) ...these are density and "busyness" of the slides (stimulus material), as well as consistency between successive... change the material being projected based on the subject's previous performance. The minicomputer relays the calibrated data to one of the magnetic...
ERIC Educational Resources Information Center
Roebers, Claudia M.; Schmid, Corinne; Roderer, Thomas
2010-01-01
The authors explored different aspects of encoding strategy use in primary school children by including (a) an encoding strategy task in which children's encoding strategy use was recorded through a remote eye-tracking device and, later, free recall and recognition for target items was assessed; and (b) tasks measuring resistance to interference…
Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI
2015-04-01
virtual reality driving simulator data acquisition. Data collection for the pilot study is nearly complete and data analyses are currently under way... Training for primary study procedures including neuropsychological testing, eye-tracking, virtual reality driving simulator, and EEG data acquisition is... the virtual reality driving simulator. Participants are instructed to drive along a coastal highway while performing the target detection task
Eye-Tracking Provides a Sensitive Measure of Exploration Deficits After Acute Right MCA Stroke
Delazer, Margarete; Sojer, Martin; Ellmerer, Philipp; Boehme, Christian; Benke, Thomas
2018-01-01
The eye-tracking study aimed at assessing spatial biases in visual exploration in patients after acute right MCA (middle cerebral artery) stroke. Patients affected by unilateral neglect show less functional recovery and experience severe difficulties in everyday life. Thus, accurate diagnosis is essential, and specific treatment is required. Early assessment is of high importance as rehabilitative interventions are more effective when applied soon after stroke. Previous research has shown that deficits may be overlooked when classical paper-and-pencil tasks are used for diagnosis. Conversely, eye-tracking allows direct monitoring of visual exploration patterns. We hypothesized that the analysis of eye-tracking provides more sensitive measures for spatial exploration deficits after right middle cerebral artery stroke. Twenty-two patients with right MCA stroke (median 5 days after stroke) and 28 healthy controls were included. Lesions were confirmed by MRI/CCT. Groups performed comparably in the Mini–Mental State Examination (patients and controls median 29) and in a screening of executive functions. Eleven patients scored at ceiling in neglect screening tasks, 11 showed minimal to severe signs of unilateral visual neglect. An overlap plot based on MRI and CCT imaging showed lesions in the temporo–parieto–frontal cortex, basal ganglia, and adjacent white matter tracts. Visual exploration was evaluated in two eye-tracking tasks, one assessing free visual exploration of photographs, the other visual search using symbols and letters. An index of fixation asymmetries proved to be a sensitive measure of spatial exploration deficits. Both patient groups showed a marked exploration bias to the right when looking at complex photographs. A single case analysis confirmed that also most of those patients who showed no neglect in screening tasks performed outside the range of controls in free exploration. The analysis of patients’ scoring at ceiling in neglect screening tasks is of special interest, as possible deficits may be overlooked and thus remain untreated. Our findings are in line with other studies suggesting considerable limitations of laboratory screening procedures to fully appreciate the occurrence of neglect symptoms. Future investigations are needed to explore the predictive value of the eye-tracking index and its validity in everyday situations.
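A fixation-asymmetry index of the kind used above can be computed in a few lines: weight each fixation by its duration and compare the time spent left versus right of the screen centre. The data, screen width, and exact weighting below are assumptions; the study's index may be defined somewhat differently.

```python
import numpy as np

def fixation_asymmetry(fix_x, fix_durations, screen_width_px):
    """Duration-weighted left/right asymmetry: +1 all gaze right of centre, -1 all gaze left."""
    fix_x = np.asarray(fix_x, dtype=float)
    dur = np.asarray(fix_durations, dtype=float)
    right = dur[fix_x > screen_width_px / 2].sum()
    left = dur[fix_x <= screen_width_px / 2].sum()
    return (right - left) / (right + left)

# Toy patient-like pattern on a 1920 px wide display: most fixation time on the right half.
x = [1500, 1620, 1710, 980, 1400, 300, 1550]    # fixation x-coordinates (px)
d = [420, 380, 510, 200, 450, 150, 300]         # fixation durations (ms)
print(round(fixation_asymmetry(x, d, 1920), 2)) # positive value -> rightward exploration bias
```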
Improvement of design of a surgical interface using an eye tracking device
2014-01-01
Background Surgical interfaces are used for helping surgeons in interpretation and quantification of the patient information, and for the presentation of an integrated workflow where all available data are combined to enable optimal treatments. Human factors research provides a systematic approach to design user interfaces with safety, accuracy, satisfaction and comfort. One of the human factors research called user-centered design approach is used to develop a surgical interface for kidney tumor cryoablation. An eye tracking device is used to obtain the best configuration of the developed surgical interface. Methods Surgical interface for kidney tumor cryoablation has been developed considering the four phases of user-centered design approach, which are analysis, design, implementation and deployment. Possible configurations of the surgical interface, which comprise various combinations of menu-based command controls, visual display of multi-modal medical images, 2D and 3D models of the surgical environment, graphical or tabulated information, visual alerts, etc., has been developed. Experiments of a simulated cryoablation of a tumor task have been performed with surgeons to evaluate the proposed surgical interface. Fixation durations and number of fixations at informative regions of the surgical interface have been analyzed, and these data are used to modify the surgical interface. Results Eye movement data has shown that participants concentrated their attention on informative regions more when the number of displayed Computer Tomography (CT) images has been reduced. Additionally, the time required to complete the kidney tumor cryoablation task by the participants had been decreased with the reduced number of CT images. Furthermore, the fixation durations obtained after the revision of the surgical interface are very close to what is observed in visual search and natural scene perception studies suggesting more efficient and comfortable interaction with the surgical interface. The National Aeronautics and Space Administration Task Load Index (NASA-TLX) and Short Post-Assessment Situational Awareness (SPASA) questionnaire results have shown that overall mental workload of surgeons related with surgical interface has been low as it has been aimed, and overall situational awareness scores of surgeons have been considerably high. Conclusions This preliminary study highlights the improvement of a developed surgical interface using eye tracking technology to obtain the best SI configuration. The results presented here reveal that visual surgical interface design prepared according to eye movement characteristics may lead to improved usability. PMID:25080176
Improvement of design of a surgical interface using an eye tracking device.
Erol Barkana, Duygun; Açık, Alper; Duru, Dilek Goksel; Duru, Adil Deniz
2014-05-07
Surgical interfaces are used for helping surgeons in interpretation and quantification of the patient information, and for the presentation of an integrated workflow where all available data are combined to enable optimal treatments. Human factors research provides a systematic approach to design user interfaces with safety, accuracy, satisfaction and comfort. One of the human factors research called user-centered design approach is used to develop a surgical interface for kidney tumor cryoablation. An eye tracking device is used to obtain the best configuration of the developed surgical interface. Surgical interface for kidney tumor cryoablation has been developed considering the four phases of user-centered design approach, which are analysis, design, implementation and deployment. Possible configurations of the surgical interface, which comprise various combinations of menu-based command controls, visual display of multi-modal medical images, 2D and 3D models of the surgical environment, graphical or tabulated information, visual alerts, etc., has been developed. Experiments of a simulated cryoablation of a tumor task have been performed with surgeons to evaluate the proposed surgical interface. Fixation durations and number of fixations at informative regions of the surgical interface have been analyzed, and these data are used to modify the surgical interface. Eye movement data has shown that participants concentrated their attention on informative regions more when the number of displayed Computer Tomography (CT) images has been reduced. Additionally, the time required to complete the kidney tumor cryoablation task by the participants had been decreased with the reduced number of CT images. Furthermore, the fixation durations obtained after the revision of the surgical interface are very close to what is observed in visual search and natural scene perception studies suggesting more efficient and comfortable interaction with the surgical interface. The National Aeronautics and Space Administration Task Load Index (NASA-TLX) and Short Post-Assessment Situational Awareness (SPASA) questionnaire results have shown that overall mental workload of surgeons related with surgical interface has been low as it has been aimed, and overall situational awareness scores of surgeons have been considerably high. This preliminary study highlights the improvement of a developed surgical interface using eye tracking technology to obtain the best SI configuration. The results presented here reveal that visual surgical interface design prepared according to eye movement characteristics may lead to improved usability.
Evaluation of helmet-mounted display targeting symbology based on eye tracking technology
NASA Astrophysics Data System (ADS)
Wang, Lijing; Wen, Fuzhen; Ma, Caixin; Zhao, Shengchu; Liu, Xiaodong
2014-06-01
The purpose of this paper is to identify the Target Locator Line (TLL) design that performs best by comparing three kinds of TLLs for a fighter helmet-mounted display (HMD) in a contrast experiment. Ten male university students, aged 21-23, with corrected visual acuity of 1.5, participated in the experiment. Head movement data were obtained with TrackIR. The geometric relationship between real-world coordinates and the coordinates of the visual display was obtained from the distance between the viewpoint and the midpoint of both eyes together with the head movement data. A virtual helmet simulation environment was created by drawing the TLLs of the fighter HMD in the flight simulator visual scene. An eye tracker was used to record search time and saccade trajectory, and the designs were evaluated by search duration and saccade trajectory length. The results showed that the "locator line with digital vector length indication" symbol cost the most time and produced the longest saccade trajectory, making it the least effective and least acceptable design. The "locator line with extending head vector length" symbol cost less time and produced a shorter saccade trajectory; it is effective and acceptable. The "locator line with reflected vector length" symbol cost the least time and produced the shortest saccade trajectory, making it the most effective and most acceptable design and thus the best-performing TLL. The results provide reference values for future TLL research.
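The two outcome measures used above, search duration and saccade-trajectory length, reduce to simple operations on the recorded gaze samples. The sketch below sums point-to-point gaze distances and takes the elapsed time; the sampling rate and the synthetic gaze path are illustrative only.

```python
import numpy as np

def scanpath_metrics(gaze_xy, timestamps_s):
    """Return total viewing time (s) and the summed point-to-point gaze path length (px)."""
    gaze_xy = np.asarray(gaze_xy, dtype=float)
    timestamps_s = np.asarray(timestamps_s, dtype=float)
    steps = np.diff(gaze_xy, axis=0)
    path_length = float(np.sqrt((steps ** 2).sum(axis=1)).sum())
    duration = float(timestamps_s[-1] - timestamps_s[0])
    return duration, path_length

t = np.arange(0, 1.0, 1 / 60)                                  # toy 60 Hz gaze samples, ~1 s trial
xy = np.column_stack([np.linspace(200, 900, t.size),           # horizontal sweep towards a target
                      400 + 30 * np.sin(2 * np.pi * 3 * t)])
print(scanpath_metrics(xy, t))
```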
Bernard, Florian; Deuter, Christian Eric; Gemmar, Peter; Schachinger, Hartmut
2013-10-01
Using the positions of the eyelids is an effective and contact-free way for the measurement of startle induced eye-blinks, which plays an important role in human psychophysiological research. To the best of our knowledge, no methods for an efficient detection and tracking of the exact eyelid contours in image sequences captured at high-speed exist that are conveniently usable by psychophysiological researchers. In this publication a semi-automatic model-based eyelid contour detection and tracking algorithm for the analysis of high-speed video recordings from an eye tracker is presented. As a large number of images have been acquired prior to method development it was important that our technique is able to deal with images that are recorded without any special parametrisation of the eye tracker. The method entails pupil detection, specular reflection removal and makes use of dynamic model adaption. In a proof-of-concept study we could achieve a correct detection rate of 90.6%. With this approach, we provide a feasible method to accurately assess eye-blinks from high-speed video recordings. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
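As a rough starting point for the pupil-detection stage mentioned above, the snippet below thresholds a dark pupil, keeps the largest plausible contour, and fits an ellipse with OpenCV. The threshold, minimum area, and file name are assumptions, and the published method is model-based and considerably more robust (it also removes specular reflections and adapts an eyelid model).

```python
import cv2

def detect_pupil(gray_eye_image, dark_threshold=40, min_area=200):
    """Return (centre, ellipse) of the largest dark blob, or None if nothing plausible is found."""
    blurred = cv2.GaussianBlur(gray_eye_image, (7, 7), 0)
    _, mask = cv2.threshold(blurred, dark_threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    candidates = [c for c in contours if cv2.contourArea(c) > min_area and len(c) >= 5]
    if not candidates:
        return None
    pupil = max(candidates, key=cv2.contourArea)
    ellipse = cv2.fitEllipse(pupil)              # ((cx, cy), (major, minor), angle)
    return ellipse[0], ellipse

frame = cv2.imread("eye_frame.png", cv2.IMREAD_GRAYSCALE)   # assumed high-speed camera frame
if frame is not None:
    print(detect_pupil(frame))
```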
Chen, Chieh-Li; Ishikawa, Hiroshi; Wollstein, Gadi; Bilonick, Richard A; Kagemann, Larry; Schuman, Joel S
2016-01-01
To develop a novel image enhancement method so that nonframe-averaged optical coherence tomography (OCT) images become comparable to active eye-tracking frame-averaged OCT images. Twenty-one eyes of 21 healthy volunteers were scanned with a non-eye-tracking nonframe-averaged OCT device and an active eye-tracking frame-averaged OCT device. Virtual averaging was applied to nonframe-averaged images with voxel resampling and adding amplitude deviation with 15-time repetitions. Signal-to-noise (SNR), contrast-to-noise ratios (CNR), and the distance between the end of visible nasal retinal nerve fiber layer (RNFL) and the foveola were assessed to evaluate the image enhancement effect and retinal layer visibility. Retinal thicknesses before and after processing were also measured. All virtual-averaged nonframe-averaged images showed notable improvement and clear resemblance to active eye-tracking frame-averaged images. Signal-to-noise and CNR were significantly improved (SNR: 30.5 vs. 47.6 dB, CNR: 4.4 vs. 6.4 dB, original versus processed, P < 0.0001, paired t-test). The distance between the end of visible nasal RNFL and the foveola was significantly different before (681.4 vs. 446.5 μm, Cirrus versus Spectralis, P < 0.0001) but not after processing (442.9 vs. 446.5 μm, P = 0.76). Sectoral macular total retinal and circumpapillary RNFL thicknesses showed systematic differences between Cirrus and Spectralis that were no longer significant after processing. The virtual averaging method successfully improved nontracking nonframe-averaged OCT image quality and made the images comparable to active eye-tracking frame-averaged OCT images. Virtual averaging may enable detailed retinal structure studies on images acquired using a mixture of nonframe-averaged and frame-averaged OCT devices without concern about systematic differences in both qualitative and quantitative aspects.
Chen, Chieh-Li; Ishikawa, Hiroshi; Wollstein, Gadi; Bilonick, Richard A.; Kagemann, Larry; Schuman, Joel S.
2016-01-01
Purpose. To develop a novel image enhancement method so that nonframe-averaged optical coherence tomography (OCT) images become comparable to active eye-tracking frame-averaged OCT images. Methods. Twenty-one eyes of 21 healthy volunteers were scanned with a non-eye-tracking nonframe-averaged OCT device and an active eye-tracking frame-averaged OCT device. Virtual averaging was applied to nonframe-averaged images with voxel resampling and adding amplitude deviation with 15-time repetitions. Signal-to-noise (SNR), contrast-to-noise ratios (CNR), and the distance between the end of visible nasal retinal nerve fiber layer (RNFL) and the foveola were assessed to evaluate the image enhancement effect and retinal layer visibility. Retinal thicknesses before and after processing were also measured. Results. All virtual-averaged nonframe-averaged images showed notable improvement and clear resemblance to active eye-tracking frame-averaged images. Signal-to-noise and CNR were significantly improved (SNR: 30.5 vs. 47.6 dB, CNR: 4.4 vs. 6.4 dB, original versus processed, P < 0.0001, paired t-test). The distance between the end of visible nasal RNFL and the foveola was significantly different before (681.4 vs. 446.5 μm, Cirrus versus Spectralis, P < 0.0001) but not after processing (442.9 vs. 446.5 μm, P = 0.76). Sectoral macular total retinal and circumpapillary RNFL thicknesses showed systematic differences between Cirrus and Spectralis that were no longer significant after processing. Conclusion. The virtual averaging method successfully improved nontracking nonframe-averaged OCT image quality and made the images comparable to active eye-tracking frame-averaged OCT images. Translational Relevance. Virtual averaging may enable detailed retinal structure studies on images acquired using a mixture of nonframe-averaged and frame-averaged OCT devices without concern about systematic differences in both qualitative and quantitative aspects. PMID:26835180
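The SNR and CNR figures reported above can in principle be reproduced from image regions; the sketch below uses one common pair of definitions (peak signal over background standard deviation for SNR, mean separation over pooled standard deviation for CNR, both in dB). The exact formulas and the synthetic regions are assumptions and may differ from the authors' implementation.

```python
import numpy as np

def snr_db(signal_region, background_region):
    """Peak signal over background noise, in dB (one common OCT convention)."""
    return 20 * np.log10(signal_region.max() / background_region.std())

def cnr_db(tissue_region, background_region):
    """Mean separation over pooled standard deviation, in dB."""
    num = abs(tissue_region.mean() - background_region.mean())
    den = np.sqrt(0.5 * (tissue_region.var() + background_region.var()))
    return 10 * np.log10(num / den)

rng = np.random.default_rng(1)
background = rng.normal(10, 2, 10_000)          # toy noise-floor intensities
tissue = rng.normal(60, 8, 10_000)              # toy retinal-layer intensities
print(f"SNR ~ {snr_db(tissue, background):.1f} dB, CNR ~ {cnr_db(tissue, background):.1f} dB")
```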
Pupil Tracking for Real-Time Motion Corrected Anterior Segment Optical Coherence Tomography
Carrasco-Zevallos, Oscar M.; Nankivil, Derek; Viehland, Christian; Keller, Brenton; Izatt, Joseph A.
2016-01-01
Volumetric acquisition with anterior segment optical coherence tomography (ASOCT) is necessary to obtain accurate representations of the tissue structure and to account for asymmetries of the anterior eye anatomy. Additionally, recent interest in imaging of anterior segment vasculature and aqueous humor flow resulted in application of OCT angiography techniques to generate en face and 3D micro-vasculature maps of the anterior segment. Unfortunately, ASOCT structural and vasculature imaging systems do not capture volumes instantaneously and are subject to motion artifacts due to involuntary eye motion that may hinder their accuracy and repeatability. Several groups have demonstrated real-time tracking for motion-compensated in vivo OCT retinal imaging, but these techniques are not applicable in the anterior segment. In this work, we demonstrate a simple and low-cost pupil tracking system integrated into a custom swept-source OCT system for real-time motion-compensated anterior segment volumetric imaging. Pupil oculography hardware coaxial with the swept-source OCT system enabled fast detection and tracking of the pupil centroid. The pupil tracking ASOCT system with a field of view of 15 x 15 mm achieved diffraction-limited imaging over a lateral tracking range of +/- 2.5 mm and was able to correct eye motion at up to 22 Hz. Pupil tracking ASOCT offers a novel real-time motion compensation approach that may facilitate accurate and reproducible anterior segment imaging. PMID:27574800
Web Camera Based Eye Tracking to Assess Visual Memory on a Visual Paired Comparison Task.
Bott, Nicholas T; Lange, Alex; Rentz, Dorene; Buffalo, Elizabeth; Clopton, Paul; Zola, Stuart
2017-01-01
Background: Web cameras are increasingly part of the standard hardware of most smart devices. Eye movements can often provide a noninvasive "window on the brain," and the recording of eye movements using web cameras is a burgeoning area of research. Objective: This study investigated a novel methodology for administering a visual paired comparison (VPC) decisional task using a web camera. To further assess this method, we examined the correlation between a standard eye-tracking camera automated scoring procedure [obtaining images at 60 frames per second (FPS)] and a manually scored procedure using a built-in laptop web camera (obtaining images at 3 FPS). Methods: This was an observational study of 54 clinically normal older adults. Subjects completed three in-clinic visits with simultaneous recording of eye movements on a VPC decision task by a standard eye tracker camera and a built-in laptop-based web camera. Inter-rater reliability was analyzed using Siegel and Castellan's kappa formula. Pearson correlations were used to investigate the correlation between VPC performance using a standard eye tracker camera and a built-in web camera. Results: Strong associations were observed on VPC mean novelty preference score between the 60 FPS eye tracker and 3 FPS built-in web camera at each of the three visits (r = 0.88-0.92). Inter-rater agreement of web camera scoring at each time point was high (κ = 0.81-0.88). There were strong relationships on VPC mean novelty preference score between 10, 5, and 3 FPS training sets (r = 0.88-0.94). Significantly fewer data quality issues were encountered using the built-in web camera. Conclusions: Human scoring of a VPC decisional task using a built-in laptop web camera correlated strongly with automated scoring of the same task using a standard high frame rate eye tracker camera. While this method is not suitable for eye tracking paradigms requiring the collection and analysis of fine-grained metrics, such as fixation points, built-in web cameras are a standard feature of most smart devices (e.g., laptops, tablets, smart phones) and can be effectively employed to track eye movements on decisional tasks with high accuracy and minimal cost.
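The three statistics reported above, novelty preference, inter-rater agreement, and the camera-to-camera correlation, are straightforward to compute. The hedged sketch below uses invented numbers, and it substitutes scikit-learn's cohen_kappa_score and SciPy's pearsonr for the Siegel and Castellan formula named in the abstract.

```python
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

def novelty_preference(time_on_novel_ms, time_on_familiar_ms):
    """Proportion of test-phase looking time spent on the novel image."""
    return time_on_novel_ms / (time_on_novel_ms + time_on_familiar_ms)

# Per-frame gaze classifications from two human raters scoring the 3 FPS web-camera frames.
rater_a = ["novel", "novel", "familiar", "off", "novel", "familiar"]
rater_b = ["novel", "novel", "familiar", "novel", "novel", "familiar"]
print("kappa:", round(cohen_kappa_score(rater_a, rater_b), 2))

# Mean novelty preference per subject from the two recording methods (invented values).
eyetracker_60fps = [0.71, 0.64, 0.55, 0.69, 0.60]
webcam_3fps = [0.69, 0.66, 0.52, 0.72, 0.58]
r, p = pearsonr(eyetracker_60fps, webcam_3fps)
print(f"camera-to-camera correlation: r = {r:.2f}, p = {p:.3f}")
```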
The perception of heading during eye movements
NASA Technical Reports Server (NTRS)
Royden, Constance S.; Banks, Martin S.; Crowell, James A.
1992-01-01
Warren and Hannon (1988, 1990), while studying the perception of heading during eye movements, concluded that people do not require extraretinal information to judge heading with eye/head movements present. Here, heading judgments are examined at higher, more typical eye movement velocities than the extremely slow tracking eye movements used by Warren and Hannon. It is found that people require extraretinal information about eye position to perceive heading accurately under many viewing conditions.
Differences in gaze behaviour of expert and junior surgeons performing open inguinal hernia repair.
Tien, Tony; Pucher, Philip H; Sodergren, Mikael H; Sriskandarajah, Kumuthan; Yang, Guang-Zhong; Darzi, Ara
2015-02-01
Various fields have used gaze behaviour to evaluate task proficiency. This may also apply to surgery for the assessment of technical skill, but has not previously been explored in live surgery. The aim was to assess differences in gaze behaviour between expert and junior surgeons during open inguinal hernia repair. Gaze behaviour of expert and junior surgeons (defined by operative experience) performing the operation was recorded using eye-tracking glasses (SMI Eye Tracking Glasses 2.0, SensoMotoric Instruments, Germany). Primary endpoints were fixation frequency (steady eye gaze rate) and dwell time (fixation and saccades duration) and were analysed for designated areas of interest in the subject's visual field. Secondary endpoints were maximum pupil size, pupil rate of change (change frequency in pupil size) and pupil entropy (predictability of pupil change). NASA TLX scale measured perceived workload. Recorded metrics were compared between groups for the entire procedure and for comparable procedural segments. Twenty-five cases were recorded, with 13 operations analysed, from 9 surgeons giving 630 min of data, recorded at 30 Hz. Experts demonstrated higher fixation frequency (median[IQR] 1.86 [0.3] vs 0.96 [0.3]; P = 0.006) and dwell time on the operative site during application of mesh (792 [159] vs 469 [109] s; P = 0.028), closure of the external oblique (1.79 [0.2] vs 1.20 [0.6]; P = 0.003) (625 [154] vs 448 [147] s; P = 0.032) and dwelled more on the sterile field during cutting of mesh (716 [173] vs 268 [297] s; P = 0.019). NASA TLX scores indicated experts found the procedure less mentally demanding than juniors (3 [2] vs 12 [5.2]; P = 0.038). No subjects reported problems with wearing of the device, or obstruction of view. Use of portable eye-tracking technology in open surgery is feasible, without impinging surgical performance. Differences in gaze behaviour during open inguinal hernia repair can be seen between expert and junior surgeons and may have uses for assessment of surgical skill.
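Fixation frequency and dwell time per area of interest (AOI), the primary endpoints above, amount to counting and summing fixation events per AOI label. The sketch below assumes fixations have already been detected and mapped to AOIs; the labels, durations, and per-second normalization are illustrative.

```python
from collections import defaultdict

def aoi_summary(fixations, total_time_s):
    """fixations: iterable of (aoi_label, duration_s); returns {aoi: (fixations_per_s, dwell_s)}."""
    counts = defaultdict(int)
    dwell = defaultdict(float)
    for aoi, duration in fixations:
        counts[aoi] += 1
        dwell[aoi] += duration
    return {aoi: (counts[aoi] / total_time_s, dwell[aoi]) for aoi in counts}

# Invented fixation events for a short procedural segment lasting 3 s.
fixations = [("operative site", 0.45), ("operative site", 0.60), ("sterile field", 0.30),
             ("operative site", 0.52), ("instrument tray", 0.25)]
print(aoi_summary(fixations, total_time_s=3.0))
```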
Driver fatigue detection based on eye state.
Lin, Lizong; Huang, Chao; Ni, Xiaopeng; Wang, Jiawen; Zhang, Hao; Li, Xiao; Qian, Zhiqin
2015-01-01
Nowadays, more and more traffic accidents occur because of driver fatigue. To reduce and prevent such accidents, this study developed a machine vision-based calculation method using the PERCLOS (percentage of eye closure time) parameter, which determines whether a driver's eyes are in a fatigued state according to the PERCLOS value. The overall workflow included face detection and tracking, detection and location of the human eye, eye tracking, eye state recognition, and driver fatigue testing; the key components were eye detection and location and the fatigue test itself. The simplified method of measuring the driver's PERCLOS value was to compute the proportion of frames in which the eyes were closed out of the total number of frames in a given period; if this proportion exceeded a set threshold, the system alerted the driver. Many experiments showed that, in addition to its simple detection algorithm, rapid computing speed, and high detection and recognition accuracy, the system met the real-time requirements of a driver fatigue detection system.
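A minimal PERCLOS computation following the description above: maintain a sliding window of per-frame eye states and compare the fraction of closed-eye frames with an alarm threshold. The window length, threshold, and frame stream below are assumed values, not the parameters used in the study.

```python
from collections import deque

PERCLOS_THRESHOLD = 0.4          # assumed alarm threshold
WINDOW_FRAMES = 30 * 60          # e.g. a 60 s window at 30 frames per second

window = deque(maxlen=WINDOW_FRAMES)

def update_perclos(eye_closed: bool) -> float:
    """Add one frame's eye state and return the fraction of closed-eye frames in the window."""
    window.append(1 if eye_closed else 0)
    return sum(window) / len(window)

# Toy frame stream: alert driving followed by an increasingly drowsy stretch.
frames = [False] * 1200 + [True] * 800
for state in frames:
    perclos = update_perclos(state)
print(f"final PERCLOS = {perclos:.2f}; alert = {perclos > PERCLOS_THRESHOLD}")
```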
Rajjoub, Raneem D; Trimboli-Heidler, Carmelina; Packer, Roger J; Avery, Robert A
2015-01-01
To determine the intra- and intervisit reproducibility of circumpapillary retinal nerve fiber layer (RNFL) thickness measures using eye tracking-assisted spectral-domain optical coherence tomography (SD OCT) in children with nonglaucomatous optic neuropathy. Prospective longitudinal study. Circumpapillary RNFL thickness measures were acquired with SD OCT using the eye-tracking feature at 2 separate study visits. Children with normal and abnormal vision (visual acuity ≥ 0.2 logMAR above normal and/or visual field loss) who demonstrated clinical and radiographic stability were enrolled. Intra- and intervisit reproducibility was calculated for the global average and 9 anatomic sectors by calculating the coefficient of variation and intraclass correlation coefficient. Forty-two subjects (median age 8.6 years, range 3.9-18.2 years) met inclusion criteria and contributed 62 study eyes. Both the abnormal and normal vision cohort demonstrated the lowest intravisit coefficient of variation for the global RNFL thickness. Intervisit reproducibility remained good for those with normal and abnormal vision, although small but statistically significant increases in the coefficient of variation were observed for multiple anatomic sectors in both cohorts. The magnitude of visual acuity loss was significantly associated with the global (ß = 0.026, P < .01) and temporal sector coefficient of variation (ß = 0.099, P < .01). SD OCT with eye tracking demonstrates highly reproducible RNFL thickness measures. Subjects with vision loss demonstrate greater intra- and intervisit variability than those with normal vision. Copyright © 2015 Elsevier Inc. All rights reserved.
A new generation of IC based beam steering devices for free-space optical communication
NASA Astrophysics Data System (ADS)
Bedi, Vijit
Free Space Optical (FSO) communication has tremendously advanced within the last decade to meet the ever increasing demand for higher communication bandwidth. Advancement in laser technology since its invention in the 1960's [1] attracted them to be the dominant source in FSO communication modules. The future of FSO systems lay in implementing semiconductor lasers due to their small size, power efficiency and mass fabrication abilities. In the near future, these systems are very likely to be used in space and ground based applications and revolutionary beam steering technologies will be required for distant communications in free-space. The highly directional characteristic inherent to a laser beam challenges and calls for new beam pointing and steering technologies for such type of communication. In this dissertation, research is done on a novel FSO communication device based on semiconductor lasers for high bandwidth communication. The "Fly eye transceiver" is an extremely wide steering bandwidth, completely non-mechanical FSO laser communication device primarily designed to replace traditional mechanical beam steering optical systems. This non-mechanical FSO device possesses a full spherical steering range and a very high tracking bandwidth. Inspired by the evolutionary model of a fly's eye, the full spherical steering range is assured by electronically controlled switching of its sub-eyes. Non mechanical technologies used in the past for beam steering such as acousto-optic Bragg cells, liquid crystal arrays or piezoelectric elements offer the wide steering bandwidth and fast response time, but are limited in their angular steering range. Mechanical gimbals offer a much greater steering range but face a much slower response time or steering bandwidth problem and often require intelligent adaptive controls with bulky driver amplifiers to feed their actuators. As a solution to feed both the fast and full spherical steering, the Fly-eye transceiver is studied as part of my PhD work. The design tool created for the research of the fly eye is then used to study different applications that may be implemented with the concept. Research is done on the mathematical feasibility, modeling, design, application of the technology, and its characterization in a simulation environment. In addition, effects of atmospheric turbulence on beam propagation in free space, and applying data security using optical encryption are also researched.
ERIC Educational Resources Information Center
Liu, Shaoying; Quinn, Paul C.; Wheeler, Andrea; Xiao, Naiqi; Ge, Liezhong; Lee, Kang
2011-01-01
Fixation duration for same-race (i.e., Asian) and other-race (i.e., Caucasian) female faces by Asian infant participants between 4 and 9 months of age was investigated with an eye-tracking procedure. The age range tested corresponded with prior reports of processing differences between same- and other-race faces observed in behavioral looking time…
New Eye-Tracking Techniques May Revolutionize Mental Health Screening
2015-11-04
health? Recent progress in eye-tracking techniques is opening new avenues for quantitative, objective, simple, inexpensive, and rapid evaluation... to check with your doctor whether any corrective action should be taken. What if similar devices could be made available for the evaluation of mental... evaluations, especially for those disorders for which a clear chemical, genetic, morphological, physiological, or histological biomarker has not yet
ERIC Educational Resources Information Center
Siyanova-Chanturia, Anna; Conklin, Kathy; Schmitt, Norbert
2011-01-01
Using eye-tracking, we investigate on-line processing of idioms in a biasing story context by native and non-native speakers of English. The stimuli are idioms used figuratively ("at the end of the day"--"eventually"), literally ("at the end of the day"--"in the evening"), and novel phrases ("at the end of the war"). Native speaker results…
ERIC Educational Resources Information Center
Roderer, Thomas; Roebers, Claudia M.
2014-01-01
This study focuses on relations between 7- and 9-year-old children's and adults' metacognitive monitoring and control processes. In addition to explicit confidence judgments (CJ), data for participants' control behavior during learning and recall as well as implicit CJs were collected with an eye-tracking device (Tobii 1750).…
Gaze-contingent control for minimally invasive robotic surgery.
Mylonas, George P; Darzi, Ara; Yang, Guang Zhong
2006-09-01
Recovering tissue depth and deformation during robotically assisted minimally invasive procedures is an important step towards motion compensation, stabilization and co-registration with preoperative data. This work demonstrates that eye gaze derived from binocular eye tracking can be effectively used to recover 3D motion and deformation of the soft tissue. A binocular eye-tracking device was integrated into the stereoscopic surgical console. After calibration, the 3D fixation point of the participating subjects could be accurately resolved in real time. A CT-scanned phantom heart model was used to demonstrate the accuracy of gaze-contingent depth extraction and motion stabilization of the soft tissue. The dynamic response of the oculomotor system was assessed with the proposed framework by using autoregressive modeling techniques. In vivo data were also used to perform gaze-contingent decoupling of cardiac and respiratory motion. Depth reconstruction, deformation tracking, and motion stabilization of the soft tissue were possible with binocular eye tracking. The dynamic response of the oculomotor system was able to cope with frequencies likely to occur under most routine minimally invasive surgical operations. The proposed framework presents a novel approach towards the tight integration of a human and a surgical robot where interaction in response to sensing is required to be under the control of the operating surgeon.
Application of TrackEye in equine locomotion research.
Drevemo, S; Roepstorff, L; Kallings, P; Johnston, C J
1993-01-01
TrackEye is an analysis system applicable to equine biokinematic studies. It covers the whole process from digitization of images through automatic target tracking to analysis. Key components of the system are an image workstation for processing video images and a high-resolution film-to-video scanner for 16-mm film. A recording module controls the input device and handles the capture of image sequences into a videodisc system, and a tracking module is able to follow reference markers automatically. The system offers flexible analysis, including calculation of marker displacements, distances and joint angles, velocities, and accelerations. TrackEye was used to study the effects of phenylbutazone on fetlock and carpal joint angle movements in a horse with mild lameness caused by osteoarthritis in the fetlock joint of a forelimb. Significant differences, most evident before treatment, were observed in the minimum fetlock and carpal joint angles when contralateral limbs were compared (p < 0.001). The minimum fetlock angle and the minimum carpal joint angle were significantly greater in the lame limb before treatment compared to 6, 37, and 49 h after the last treatment (p < 0.001).
The impact of fatigue on latent print examinations as revealed by behavioral and eye gaze testing.
Busey, Thomas; Swofford, Henry J; Vanderkolk, John; Emerick, Brandi
2015-06-01
Eye tracking and behavioral methods were used to assess the effects of fatigue on performance in latent print examiners. Eye gaze was measured both before and after a fatiguing exercise involving fine-grained examination decisions. The eye tracking tasks used similar images, often laterally reversed versions of previously viewed prints, which holds image detail constant while minimizing prior recognition. These methods, as well as a within-subject design with fine grained analyses of the eye gaze data, allow fairly strong conclusions despite a relatively small subject population. Consistent with the effects of fatigue on practitioners in other fields such as radiology, behavioral performance declined with fatigue, and the eye gaze statistics suggested a smaller working memory capacity. Participants also terminated the search/examination process sooner when fatigued. However, fatigue did not produce changes in inter-examiner consistency as measured by the Earth Mover Metric. Implications for practice are discussed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
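As an illustrative aside: the inter-examiner consistency measure named above (the Earth Mover Metric) is a Wasserstein distance. A minimal one-dimensional sketch, using hypothetical fixation-duration samples rather than the study's two-dimensional fixation maps, is:

import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
# Hypothetical fixation durations (ms) before and after the fatiguing exercise.
pre_fatigue = rng.gamma(shape=2.0, scale=150.0, size=300)
post_fatigue = rng.gamma(shape=2.0, scale=120.0, size=300)

# 1-D earth mover's (Wasserstein) distance between the two duration distributions.
emd = wasserstein_distance(pre_fatigue, post_fatigue)
print(f"earth mover's distance: {emd:.1f} ms")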
Data on eye behavior during idea generation and letter-by-letter reading.
Walcher, Sonja; Körner, Christof; Benedek, Mathias
2017-12-01
This article describes data from an idea generation task (alternate uses task; Guilford, 1967 [1]) and a letter-by-letter reading task under two background brightness conditions with healthy adults, as well as a baseline measurement and questionnaire data (SIPI, Huba et al., 1981 [2]; DDFS, Singer and Antrobus, 1972 [3], 1963; RIBS, Runco et al., 2001 [4]). Data are hosted at the Open Science Framework (OSF): https://osf.io/fh66g/ (Walcher et al., 2017) [5]. There you will find eye tracking data, task performance data, questionnaire data, analysis scripts (in R; R Core Team, 2017 [6]), eye tracking paradigms (in the Experiment Builder; SR Research Ltd. [7]), and graphs of pupil and eye vergence angle dynamics. Data are interpreted and discussed in the article 'Looking for ideas: Eye behavior during goal-directed internally focused cognition' (Walcher et al., 2017) [8].
Tracking the Eye Movement of Four Years Old Children Learning Chinese Words.
Lin, Dan; Chen, Guangyao; Liu, Yingyi; Liu, Jiaxin; Pan, Jue; Mo, Lei
2018-02-01
Storybook reading is the major source of literacy exposure for beginning readers. The present study tracked 4-year-old Chinese children's eye movements while they were reading simulated storybook pages. Their eye-movement patterns were examined in relation to their word learning gains. The same reading list, consisting of 20 two-character Chinese words, was used in the pretest, the 5-min eye-tracking learning session, and the posttest. Additionally, visual spatial skill and phonological awareness were assessed in the pretest as cognitive controls. The results showed that the children's attention was quickly attracted by the pictures, on which most of their attention was focused, with only 13% of the time spent looking at words. Moreover, significant learning gains in word reading were observed from pretest to posttest after the 5-min exposure to simulated storybook pages with the words, pictures, and pronunciations of the two-character words present. Furthermore, the children's attention to words significantly predicted posttest reading beyond socioeconomic status, age, visual spatial skill, phonological awareness, and pretest reading performance. This eye-movement evidence from children as young as four years reading a non-alphabetic script (i.e., Chinese) demonstrates that children can learn words effectively with minimal exposure and little instruction; the findings suggest that learning to read requires attention to the words themselves. The study contributes to our understanding of early reading acquisition with eye-movement evidence from beginning readers.
The Head Tracks and Gaze Predicts: How the World’s Best Batters Hit a Ball
Mann, David L.; Spratford, Wayne; Abernethy, Bruce
2013-01-01
Hitters in fast ball-sports do not align their gaze with the ball throughout ball-flight; rather, they use predictive eye movement strategies that contribute towards their level of interceptive skill. Existing studies claim that (i) baseball and cricket batters cannot track the ball because it moves too quickly to be tracked by the eyes, and that consequently (ii) batters do not – and possibly cannot – watch the ball at the moment they hit it. However, to date no studies have examined the gaze of truly elite batters. We examined the eye and head movements of two of the world’s best cricket batters and found that both claims do not apply to these batters. Remarkably, the batters coupled the rotation of their head to the movement of the ball, ensuring the ball remained in a consistent direction relative to their head. As a result, the ball could be followed if the batters simply moved their head and kept their eyes still. Instead of doing so, we show the elite batters used distinctive eye movement strategies, usually relying on two predictive saccades to anticipate (i) the location of ball-bounce, and (ii) the location of bat-ball contact, ensuring they could direct their gaze towards the ball as they hit it. These specific head and eye movement strategies play important functional roles in contributing towards interceptive expertise. PMID:23516460
Lin, Jane-Ming; Chen, Wen-Lu; Chiang, Chun-Chi; Tsai, Yi-Yu
2008-04-01
To evaluate ablation centration of flying-spot LASIK, investigate the effect of patient- and surgeon-related factors on centration, and compare flying-spot and broad-beam laser results. This retrospective study comprised 173 eyes of 94 patients who underwent LASIK with the Alcon LADARVision4000 with an active eye-tracking system. The effective tracking rate of the system is 100 Hz. The amount of decentration was analyzed by corneal topography. Patient- (low, high, and extreme myopia; effect of learning) and surgeon-related (learning curve) factors influencing centration were identified. Centration was compared to the SCHWIND Multiscan broad-beam laser with a 50-Hz tracker from a previous study. Mean decentration was 0.36+/-0.18 mm (range: 0 to 0.9 mm). Centration did not differ in low, high, and extreme myopia or in patients' first and second eyes. There were no significant differences in centration between the first 50 LASIK procedures and the last 50 procedures. Comparing flying-spot and broad-beam laser results, there were no differences in centration in low myopia. However, the LADARVision4000 yielded better centration results in high and extreme myopia. The Alcon LADARVision4000 active eye tracking system provides good centration for all levels of myopic correction and better centration than the Schwind broad-beam Multiscan in eyes with high and extreme myopia.
Shechner, Tomer; Jarcho, Johanna M.; Britton, Jennifer C.; Leibenluft, Ellen; Pine, Daniel S.; Nelson, Eric E.
2012-01-01
Background Previous studies demonstrate that anxiety is characterized by biased attention toward threats, typically measured by differences in motor reaction time to threat and neutral cues. Using eye-tracking methodology, the current study measured attention biases in anxious and nonanxious youth, using unrestricted free viewing of angry, happy, and neutral faces. Methods Eighteen anxious and 15 nonanxious youth (8–17 years old) passively viewed angry-neutral and happy-neutral face pairs for 10 s while their eye movements were recorded. Results Anxious youth displayed a greater attention bias toward angry faces than nonanxious youth, and this bias occurred in the earliest phases of stimulus presentation. Specifically, anxious youth were more likely to direct their first fixation to angry faces, and they made faster fixations to angry than neutral faces. Conclusions Consistent with findings from earlier, reaction-time studies, the current study shows that anxious youth, like anxious adults, exhibit biased orienting to threat-related stimuli. This study adds to the existing literature by documenting that threat biases in eye-tracking patterns are manifest at initial attention orienting. PMID:22815254
Low frequency rTMS over posterior parietal cortex impairs smooth pursuit eye tracking.
Hutton, Samuel B; Weekes, Brendan S
2007-11-01
The role of the posterior parietal cortex in smooth pursuit eye movements remains unclear. We used low frequency repetitive transcranial magnetic stimulation (rTMS) to study the cognitive and neural systems involved in the control of smooth pursuit eye movements. Eighteen participants were tested on two separate occasions. On each occasion we measured smooth pursuit eye tracking before and after 6 min of 1 Hz rTMS delivered at 90% of motor threshold. Low frequency rTMS over the posterior parietal cortex led to a significant reduction in smooth pursuit velocity gain, whereas rTMS over the motor cortex had no effect on gain. We conclude that low frequency offline rTMS is a potentially useful tool with which to explore the cortical systems involved in oculomotor control.
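As an illustrative aside: smooth pursuit velocity gain is conventionally the eye velocity divided by the target velocity. A minimal sketch on hypothetical samples (saccades are not removed here, which a real analysis would do) is:

import numpy as np

def pursuit_gain(eye_pos_deg, target_velocity_deg_s, sample_rate_hz):
    """Velocity gain = mean eye velocity / target velocity."""
    eye_velocity = np.gradient(eye_pos_deg) * sample_rate_hz   # deg/s
    return float(np.mean(eye_velocity) / target_velocity_deg_s)

# Hypothetical 1 s of pursuit of a 15 deg/s ramp sampled at 500 Hz, with slight lag.
t = np.arange(0, 1.0, 1.0 / 500)
eye_trace = 13.5 * t + 0.05 * np.random.default_rng(3).standard_normal(t.size)
print(f"gain = {pursuit_gain(eye_trace, 15.0, 500):.2f}")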
Quantifying Eye Tracking Between Skilled Nurses and Nursing Students in Intravenous Injection.
Maekawa, Yasuko; Majima, Yukie; Soga, Masato
2016-01-01
In nursing education, it is important that nursing students acquire appropriate nursing knowledge and skills, which include the empirical tacit knowledge of skilled nurses; such knowledge is difficult to verbalize. We therefore focused on eye tracking during skill performance by expert nurses and nursing students. Vision is said to account for more than 70% of all sensory information. To support learning of tacit nursing skills, we analyzed differences between the two groups, including gaze, from measurements taken with an eye mark recorder. The results showed that, among all the actions, the nurses attended particularly to the part related to inserting the needle, moving their eyes safely, surely, and economically in accordance with the purposes of their tasks.
CEFR and Eye Movement Characteristics during EFL Reading: The Case of Intermediate Readers
ERIC Educational Resources Information Center
Dolgunsöz, Emrah; Sariçoban, Arif
2016-01-01
This study primarily aims to (1) examine the relationship between foreign language reading proficiency and eye movements during reading, and (2) to describe eye movement differences between two CEFR proficiency groups (B1 and B2) by using eye tracking technique. 57 learners of EFL were tested under two experimental conditions: Natural L2 reading…
ERIC Educational Resources Information Center
Tamaoka, Katsuo; Asano, Michiko; Miyaoka, Yayoi; Yokosawa, Kazuhiko
2014-01-01
Using the eye-tracking method, the present study depicted pre- and post-head processing for simple scrambled sentences of head-final languages. Three versions of simple Japanese active sentences with ditransitive verbs were used: namely, (1) SO₁O₂V canonical, (2) SO₂O₁V single-scrambled, and (3)…
ERIC Educational Resources Information Center
Quinn, Paul C.; Doran, Matthew M.; Reiss, Jason E.; Hoffman, James E.
2009-01-01
Previous looking time studies have shown that infants use the heads of cat and dog images to form category representations for these animal classes. The present research used an eye-tracking procedure to determine the time course of attention to the head and whether it reflects a preexisting bias or online learning. Six- to 7-month-olds were…
Remote Gaze Tracking System on a Large Display
Lee, Hyeon Chang; Lee, Won Oh; Cho, Chul Woo; Gwon, Su Yeong; Park, Kang Ryoung; Lee, Heekyung; Cha, Jihun
2013-01-01
We propose a new remote gaze tracking system as an intelligent TV interface. Our research is novel in the following three ways: first, because a user can sit at various positions in front of a large display, the capture volume of the gaze tracking system should be greater, so the proposed system includes two cameras which can be moved simultaneously by panning and tilting mechanisms, a wide view camera (WVC) for detecting eye position and an auto-focusing narrow view camera (NVC) for capturing enlarged eye images. Second, in order to remove the complicated calibration between the WVC and NVC and to enhance the capture speed of the NVC, these two cameras are combined in a parallel structure. Third, the auto-focusing of the NVC is achieved on the basis of both the user's facial width in the WVC image and a focus score calculated on the eye image of the NVC. Experimental results showed that the proposed system can be operated with a gaze tracking accuracy of ±0.737°∼±0.775° and a speed of 5∼10 frames/s. PMID:24105351
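As an illustrative aside (the constants and helper names below are hypothetical, not the authors' implementation): the described auto-focusing can be sketched as a coarse distance guess from the facial width in the wide view camera, refined by maximizing a sharpness score on the narrow view camera's eye image.

import numpy as np

def focus_score(eye_image):
    """Sharpness proxy: variance of image gradients (higher = better focused)."""
    gy, gx = np.gradient(eye_image.astype(float))
    return float(np.var(gx) + np.var(gy))

def distance_from_face_width(face_width_px, focal_length_px, real_face_width_mm=150.0):
    """Pinhole-camera estimate of viewer distance from facial width in the WVC image."""
    return real_face_width_mm * focal_length_px / face_width_px

z_mm = distance_from_face_width(face_width_px=180, focal_length_px=1200)
print(f"initial NVC focus distance ~ {z_mm:.0f} mm; then refine by maximizing focus_score")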
Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger
2014-01-01
An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics. PMID:24950184
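As an illustrative aside: the "decision moment" idea can be sketched as the first time at which the across-trial proportion of fixations on the target picture exceeds and stays above a criterion; the threshold and data below are hypothetical, not the authors' exact analysis.

import numpy as np

def decision_moment(fix_on_target, time_s, threshold=0.75):
    """fix_on_target: (n_trials, n_samples) booleans, True when gaze is on the target picture.
    Returns the first time at which the proportion of target fixations exceeds and
    stays above `threshold`, plus the proportion curve itself."""
    proportion = fix_on_target.mean(axis=0)
    above = proportion >= threshold
    idx = [i for i in range(above.size) if above[i:].all()]
    return (time_s[idx[0]] if idx else None), proportion

# Hypothetical data: 40 trials of 3 s at 60 Hz, target preference emerging around 1.2 s.
rng = np.random.default_rng(1)
t = np.arange(0, 3.0, 1 / 60)
p_target = np.where(t > 1.2, 0.95, 0.5)
fixations = rng.random((40, t.size)) < p_target
moment, curve = decision_moment(fixations, t)
print("decision moment ~", moment, "s")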
Thomsen, Steven R; Fulton, Kristi
2007-07-01
To investigate whether adolescent readers attend to responsibility or moderation messages (e.g., "drink responsibly") included in magazine advertisements for alcoholic beverages and to assess the association between attention and the ability to accurately recall the content of these messages. An integrated head-eye tracking system (ASL Eye-TRAC 6000) was used to measure the eye movements, including fixations and fixation duration, of a group of 63 adolescents (ages 12-14 years) as they viewed six print advertisements for alcoholic beverages. Immediately after the eye-tracking sessions, participants completed a masked-recall exercise. Overall, the responsibility or moderation messages were the least frequently viewed textual or visual areas of the advertisements. Participants spent an average of only .35 seconds, or 7% of the total viewing time, fixating on each responsibility message. Beverage bottles, product logos, and cartoon illustrations were the most frequently viewed elements of the advertisements. Among those participants who fixated at least once on an advertisement's warning message, only a relatively small percentage were able to recall its general concept or restate it verbatim in the masked recall test. Voluntary responsibility or moderation messages failed to capture the attention of teenagers who participated in this study and need to be typographically modified to be more effective.
Scheer, Clara; Mattioni Maturana, Felipe; Jansen, Petra
2018-05-07
In chronometric mental rotation tasks, sex differences are widely discussed. Most studies find men to be more skilled in mental rotation than women, which can be explained by the holistic strategy that they use to rotate stimuli. Women are believed to apply a piecemeal strategy. So far, there have been no studies investigating this phenomenon using eye-tracking methods in combination with electroencephalography (EEG) analysis: the present study compared behavioral responses, EEG activity, and eye movements of 15 men and 15 women while solving a three-dimensional chronometric mental rotation test. The behavioral analysis showed neither differences in reaction time nor in the accuracy rate between men and women. The EEG data showed a higher right activation on parietal electrodes for women, and the eye-tracking results indicated longer fixations in a higher number of areas of interest at 0° for women. Men and women are likely to possess different perceptual (visual search) and decision-making mechanisms, but similar mental rotation processes. Furthermore, men showed longer visual search processing, characterized by greater saccade latency at 0°-135°. Generally, this study could be considered a pilot study to investigate sex differences in mental rotation tasks while combining eye-tracking and EEG methods.
Objective Methods to Test Visual Dysfunction in the Presence of Cognitive Impairment
2015-12-01
the eye and 3) purposeful eye movements to track targets that are resolved. Major Findings: Three major objective tests of vision were successfully ... developed and optimized to detect disease. These were 1) the pupil light reflex (either comparing the two eyes or independently evaluating each eye ... separately for retina or optic nerve damage, 2) eye movement based analysis of target acquisition, fixation, and eccentric viewing as a means of
Alteration of travel patterns with vision loss from glaucoma and macular degeneration.
Curriero, Frank C; Pinchoff, Jessie; van Landingham, Suzanne W; Ferrucci, Luigi; Friedman, David S; Ramulu, Pradeep Y
2013-11-01
The distance patients can travel outside the home influences how much of the world they can sample and to what extent they can live independently. Recent technological advances have allowed travel outside the home to be directly measured in patients' real-world routines. To determine whether decreased visual acuity (VA) from age-related macular degeneration (AMD) and visual field (VF) loss from glaucoma are associated with restricted travel patterns in older adults. Cross-sectional study. Patients were recruited from an eye clinic, while travel patterns were recorded during their real-world routines using a cellular tracking device. Sixty-one control subjects with normal vision, 84 subjects with glaucoma with bilateral VF loss, and 65 subjects with AMD with bilateral or severe unilateral loss of VA had their location tracked every 15 minutes between 7 am and 11 pm for 7 days using a tracking device. Average daily excursion size (defined as maximum distance away from home) and average daily excursion span (defined as maximum span of travel) were defined for each individual. The effects of vision loss on travel patterns were evaluated after controlling for individual and geographic factors. In multivariable models comparing subjects with AMD and control subjects, average excursion size and span decreased by approximately one-quarter mile for each line of better-eye VA loss (P ≤ .03 for both). Similar but not statistically significant associations were observed between average daily excursion size and span for severity of better-eye VF loss in subjects with glaucoma and control subjects. Being married or living with someone and younger age were associated with more distant travel, while less-distant travel was noted for older individuals, African Americans, and those living in more densely populated regions. Age-related macular degeneration-related loss of VA, but not glaucoma-related loss of VF, is associated with restriction of travel to more nearby locations. This constriction of life space may impact quality of life and restrict access to services.
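As an illustrative aside: the two travel measures defined above can be computed from location fixes with a great-circle distance; a minimal sketch with hypothetical coordinates is:

import numpy as np

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points given in decimal degrees."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = np.sin((lat2 - lat1) / 2) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

def excursion_size_and_span(home, fixes):
    """Excursion size = max distance from home; span = max pairwise distance among fixes."""
    size = max(haversine_km(*home, lat, lon) for lat, lon in fixes)
    span = max(haversine_km(*a, *b) for a in fixes for b in fixes)
    return size, span

# Hypothetical day of 15-minute location fixes.
home = (39.2904, -76.6122)
fixes = [(39.2904, -76.6122), (39.3289, -76.6205), (39.2546, -76.7097)]
print(excursion_size_and_span(home, fixes))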
iShadow: Design of a Wearable, Real-Time Mobile Gaze Tracker
Mayberry, Addison; Hu, Pan; Marlin, Benjamin; Salthouse, Christopher; Ganesan, Deepak
2015-01-01
Continuous, real-time tracking of eye gaze is valuable in a variety of scenarios including hands-free interaction with the physical world, detection of unsafe behaviors, leveraging visual context for advertising, life logging, and others. While eye tracking is commonly used in clinical trials and user studies, it has not bridged the gap to everyday consumer use. The challenge is that a real-time eye tracker is a power-hungry and computation-intensive device which requires continuous sensing of the eye using an imager running at many tens of frames per second, and continuous processing of the image stream using sophisticated gaze estimation algorithms. Our key contribution is the design of an eye tracker that dramatically reduces the sensing and computation needs for eye tracking, thereby achieving orders of magnitude reductions in power consumption and form-factor. The key idea is that eye images are extremely redundant, therefore we can estimate gaze by using a small subset of carefully chosen pixels per frame. We instantiate this idea in a prototype hardware platform equipped with a low-power image sensor that provides random access to pixel values, a low-power ARM Cortex M3 microcontroller, and a bluetooth radio to communicate with a mobile phone. The sparse pixel-based gaze estimation algorithm is a multi-layer neural network learned using a state-of-the-art sparsity-inducing regularization function that minimizes the gaze prediction error while simultaneously minimizing the number of pixels used. Our results show that we can operate at roughly 70mW of power, while continuously estimating eye gaze at the rate of 30 Hz with errors of roughly 3 degrees. PMID:26539565
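As an illustrative aside (a simplified stand-in, not the authors' multi-layer network): the "few carefully chosen pixels" idea can be sketched with an L1-regularized linear regressor, whose penalty drives most pixel weights to exactly zero:

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(42)

# Hypothetical training set: 2000 low-resolution eye images (24 x 24 = 576 pixels)
# paired with a known horizontal gaze angle in degrees.
X = rng.random((2000, 576))
true_w = np.zeros(576)
true_w[rng.choice(576, size=20, replace=False)] = rng.normal(0, 5, 20)  # 20 informative pixels
y = X @ true_w + rng.normal(0, 0.5, 2000)

# The L1 penalty selects a sparse pixel subset for gaze estimation.
model = Lasso(alpha=0.05).fit(X, y)
print(f"{np.count_nonzero(model.coef_)} of 576 pixels retained")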
... 3½, kids should have eye health screenings and visual acuity tests (tests that measure sharpness of vision) ... eye rubbing, extreme light sensitivity, poor focusing, poor visual tracking (following an object), abnormal alignment or movement ...
The added value of eye-tracking in diagnosing dyscalculia: a case study
van Viersen, Sietske; Slot, Esther M.; Kroesbergen, Evelyn H.; van't Noordende, Jaccoline E.; Leseman, Paul P. M.
2013-01-01
The present study compared eye movements and performance of a 9-year-old girl with Developmental Dyscalculia (DD) on a series of number line tasks to those of a group of typically developing (TD) children (n = 10), in order to answer the question whether eye-tracking data from number line estimation tasks can be a useful tool to discriminate between TD children and children with a number processing deficit. Quantitative results indicated that the child with dyscalculia performed worse on all symbolic number line tasks compared to the control group, indicated by a low linear fit (R2) and a low accuracy measured by mean percent absolute error. In contrast to the control group, her magnitude representations seemed to be better represented by a logarithmic than a linear fit. Furthermore, qualitative analyses on the data of the child with dyscalculia revealed more unidentifiable fixation patterns in the processing of multi-digit numbers and more dysfunctional estimation strategy use in one third of the estimation trials as opposed to ~10% in the control group. In line with her dyscalculia diagnosis, these results confirm the difficulties with spatially representing and manipulating numerosities on a number line, resulting in inflexible and inadequate estimation or processing strategies. It can be concluded from this case study that eye-tracking data can be used to discern different number processing and estimation strategies in TD children and children with a number processing deficit. Hence, eye-tracking data in combination with number line estimation tasks might be a valuable and promising addition to current diagnostic measures. PMID:24098294
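As an illustrative aside: the accuracy and fit measures named above (mean percent absolute error, linear versus logarithmic R^2) can be sketched as follows on hypothetical number-line estimates:

import numpy as np

def r_squared(y, y_hat):
    return 1 - np.sum((y - y_hat) ** 2) / np.sum((y - np.mean(y)) ** 2)

def number_line_measures(targets, estimates, line_max=100):
    """Percent absolute error plus R^2 of linear and logarithmic fits to the estimates."""
    pae = np.mean(np.abs(estimates - targets) / line_max) * 100
    lin = np.polyval(np.polyfit(targets, estimates, 1), targets)
    log = np.polyval(np.polyfit(np.log(targets), estimates, 1), np.log(targets))
    return pae, r_squared(estimates, lin), r_squared(estimates, log)

# Hypothetical 0-100 number-line trials with estimates compressed toward a logarithmic pattern.
targets = np.array([3, 7, 12, 19, 33, 46, 58, 71, 84, 96], dtype=float)
estimates = 22 * np.log(targets) + np.random.default_rng(2).normal(0, 4, targets.size)
pae, r2_lin, r2_log = number_line_measures(targets, estimates)
print(f"PAE = {pae:.1f}%, linear R^2 = {r2_lin:.2f}, log R^2 = {r2_log:.2f}")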
NASA Astrophysics Data System (ADS)
Klein, P.; Viiri, J.; Mozaffari, S.; Dengel, A.; Kuhn, J.
2018-06-01
Relating mathematical concepts to graphical representations is a challenging task for students. In this paper, we introduce two visual strategies to qualitatively interpret the divergence of graphical vector field representations. One strategy is based on the graphical interpretation of partial derivatives, while the other is based on the flux concept. We test the effectiveness of both strategies in an instruction-based eye-tracking study with N =41 physics majors. We found that students' performance improved when both strategies were introduced (74% correct) instead of only one strategy (64% correct), and students performed best when they were free to choose between the two strategies (88% correct). This finding supports the idea of introducing multiple representations of a physical concept to foster student understanding. Relevant eye-tracking measures demonstrate that both strategies imply different visual processing of the vector field plots, therefore reflecting conceptual differences between the strategies. Advanced analysis methods further reveal significant differences in eye movements between the best and worst performing students. For instance, the best students performed predominantly horizontal and vertical saccades, indicating correct interpretation of partial derivatives. They also focused on smaller regions when they balanced positive and negative flux. This mixed-method research leads to new insights into student visual processing of vector field representations, highlights the advantages and limitations of eye-tracking methodologies in this context, and discusses implications for teaching and for future research. The introduction of saccadic direction analysis expands traditional methods, and shows the potential to discover new insights into student understanding and learning difficulties.
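As a brief note on the two strategies (these are the standard vector-calculus definitions, not material from the study): the derivative-based and flux-based readings of the divergence of a two-dimensional field \vec{F} = (F_x, F_y) are two forms of the same quantity,

\[
\nabla \cdot \vec{F} \;=\; \frac{\partial F_x}{\partial x} + \frac{\partial F_y}{\partial y}
\;=\; \lim_{A \to 0} \frac{1}{A} \oint_{\partial A} \vec{F} \cdot \hat{n}\, \mathrm{d}s ,
\]

so, for example, \vec{F} = (x, y) has divergence 2 everywhere, while the pure rotation \vec{F} = (-y, x) is divergence-free.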
Measuring and tracking eye movements of a behaving archer fish by real-time stereo vision.
Ben-Simon, Avi; Ben-Shahar, Ohad; Segev, Ronen
2009-11-15
The archer fish (Toxotes chatareus) exhibits unique visual behavior in that it is able to aim at and shoot down with a squirt of water insects resting on the foliage above water level and then feed on them. This extreme behavior requires excellent visual acuity, learning, and tight synchronization between the visual system and body motion. This behavior also raises many important questions, such as the fish's ability to compensate for air-water refraction and the neural mechanisms underlying target acquisition. While many such questions remain open, significant insights towards solving them can be obtained by tracking the eye and body movements of freely behaving fish. Unfortunately, existing tracking methods suffer from either a high level of invasiveness or low resolution. Here, we present a video-based eye tracking method for accurately and remotely measuring the eye and body movements of a freely moving behaving fish. Based on a stereo vision system and a unique triangulation method that corrects for air-glass-water refraction, we are able to measure a full three-dimensional pose of the fish eye and body with high temporal and spatial resolution. Our method, being generic, can be applied to studying the behavior of marine animals in general. We demonstrate how data collected by our method may be used to show that the hunting behavior of the archer fish is composed of surfacing concomitant with rotating the body around the direction of the fish's fixed gaze towards the target, until the snout reaches in the correct shooting position at water level.
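As an illustrative aside (a single flat air-water interface only; the study's correction also handles the glass wall of the tank): the refraction step underlying such triangulation follows Snell's law, which in vector form can be sketched as:

import numpy as np

def refract(direction, normal, n1, n2):
    """Snell's law for a ray hitting a flat interface. `normal` is the unit interface
    normal pointing into the incident medium. Returns the refracted unit direction,
    or None for total internal reflection."""
    d = direction / np.linalg.norm(direction)
    n = normal / np.linalg.norm(normal)
    cos_i = -np.dot(d, n)
    eta = n1 / n2
    k = 1.0 - eta ** 2 * (1.0 - cos_i ** 2)
    if k < 0:
        return None                      # total internal reflection
    return eta * d + (eta * cos_i - np.sqrt(k)) * n

# Camera ray entering water (n_air ~ 1.00, n_water ~ 1.33) at about 30 degrees from vertical.
ray = np.array([np.sin(np.radians(30)), 0.0, -np.cos(np.radians(30))])
print(refract(ray, np.array([0.0, 0.0, 1.0]), 1.00, 1.33))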
Kimmel, Daniel L.; Mammo, Dagem; Newsome, William T.
2012-01-01
From human perception to primate neurophysiology, monitoring eye position is critical to the study of vision, attention, oculomotor control, and behavior. Two principal techniques for the precise measurement of eye position—the long-standing sclera-embedded search coil and more recent optical tracking techniques—are in use in various laboratories, but no published study compares the performance of the two methods simultaneously in the same primates. Here we compare two popular systems—a sclera-embedded search coil from C-N-C Engineering and the EyeLink 1000 optical system from SR Research—by recording simultaneously from the same eye in the macaque monkey while the animal performed a simple oculomotor task. We found broad agreement between the two systems, particularly in positional accuracy during fixation, measurement of saccade amplitude, detection of fixational saccades, and sensitivity to subtle changes in eye position from trial to trial. Nonetheless, certain discrepancies persist, particularly elevated saccade peak velocities, post-saccadic ringing, influence of luminance change on reported position, and greater sample-to-sample variation in the optical system. Our study shows that optical performance now rivals that of the search coil, rendering optical systems appropriate for many if not most applications. This finding is consequential, especially for animal subjects, because the optical systems do not require invasive surgery for implantation and repair of search coils around the eye. Our data also allow laboratories using the optical system in human subjects to assess the strengths and limitations of the technique for their own applications. PMID:22912608
NASA Astrophysics Data System (ADS)
Talukder, Ashit; Morookian, John M.; Monacos, Steve P.; Lam, Raymond K.; Lebaw, C.; Bond, A.
2004-04-01
Eyetracking is one of the latest technologies that has shown potential in several areas including human-computer interaction for people with and without disabilities, and for noninvasive monitoring, detection, and even diagnosis of physiological and neurological problems in individuals. Current non-invasive eyetracking methods achieve a 30 Hz rate with possibly low accuracy in gaze estimation, that is insufficient for many applications. We propose a new non-invasive visual eyetracking system that is capable of operating at speeds as high as 6-12 KHz. A new CCD video camera and hardware architecture is used, and a novel fast image processing algorithm leverages specific features of the input CCD camera to yield a real-time eyetracking system. A field programmable gate array (FPGA) is used to control the CCD camera and execute the image processing operations. Initial results show the excellent performance of our system under severe head motion and low contrast conditions.
Domkin, Dmitry; Forsman, Mikael; Richter, Hans O
2016-06-01
Previous studies have shown an association of visual demands during near work and increased activity of the trapezius muscle. Those studies were conducted under stationary postural conditions with fixed gaze and artificial visual load. The present study investigated the relationship between ciliary muscle contraction force and trapezius muscle activity across individuals during performance of a natural dynamic motor task under free gaze conditions. Participants (N=11) tracked a moving visual target with a digital pen on a computer screen. Tracking performance, eye refraction and trapezius muscle activity were continuously measured. Ciliary muscle contraction force was computed from eye accommodative response. There was a significant Pearson correlation between ciliary muscle contraction force and trapezius muscle activity on the tracking side (0.78, p<0.01) and passive side (0.64, p<0.05). The study supports the hypothesis that high visual demands, leading to an increased ciliary muscle contraction during continuous eye-hand coordination, may increase trapezius muscle tension and thus contribute to the development of musculoskeletal complaints in the neck-shoulder area. Further experimental studies are required to clarify whether the relationship is valid within each individual or may represent a general personal trait, when individuals with higher eye accommodative response tend to have higher trapezius muscle activity. Copyright © 2015 Elsevier Ltd. All rights reserved.
The socialization effect on decision making in the Prisoner's Dilemma game: An eye-tracking study
Myagkov, Mikhail G.; Harriff, Kyle
2017-01-01
We used a mobile eye-tracking system (in the form of glasses) to study the characteristics of visual perception in decision making in the Prisoner's Dilemma game. In each experiment, one of the 12 participants was equipped with eye-tracking glasses. The experiment was conducted in three stages: an anonymous Individual Game stage against a randomly chosen partner (one of the 12 other participants of the experiment); a Socialization stage, in which the participants were divided into two groups; and a Group Game stage, in which the participants played with partners in the groups. After each round, the respondent received information about his or her personal score in the last round and the overall winner of the game at the moment. The study proves that eye-tracking systems can be used for studying the process of decision making and forecasting. The total viewing time and the time of fixation on areas corresponding to noncooperative decisions is related to the participants’ overall level of cooperation. The increase in the total viewing time and the time of fixation on the areas of noncooperative choice is due to a preference for noncooperative decisions and a decrease in the overall level of cooperation. The number of fixations on the group attributes is associated with group identity, but does not necessarily lead to cooperative behavior. PMID:28394939
Improving visual search in instruction manuals using pictograms.
Kovačević, Dorotea; Brozović, Maja; Možina, Klementina
2016-11-01
Instruction manuals provide important messages about the proper use of a product. They should communicate in such a way that they facilitate users' searches for specific information. Despite the increasing research interest in visual search, there is a lack of empirical knowledge concerning the role of pictograms in search performance during the browsing of a manual's pages. This study investigates how the inclusion of pictograms improves the search for the target information. Furthermore, it examines whether this search process is influenced by the visual similarity between the pictograms and the searched for information. On the basis of eye-tracking measurements, as objective indicators of the participants' visual attention, it was found that pictograms can be a useful element of search strategy. Another interesting finding was that boldface highlighting is a more effective method for improving user experience in information seeking, rather than the similarity between the pictorial and adjacent textual information. Implications for designing effective user manuals are discussed. Practitioner Summary: Users often view instruction manuals with the aim of finding specific information. We used eye-tracking technology to examine different manual pages in order to improve the user's visual search for target information. The results indicate that the use of pictograms and bold highlighting of relevant information facilitate the search process.
Lami, Mariam; Singh, Harsimrat; Dilley, James H; Ashraf, Hajra; Edmondon, Matthew; Orihuela-Espina, Felipe; Hoare, Jonathan; Darzi, Ara; Sodergren, Mikael H
2018-02-07
The adenoma detection rate (ADR) is an important quality indicator in colonoscopy. The aim of this study was to evaluate the changes in visual gaze patterns (VGPs) with increasing polyp detection rate (PDR), a surrogate marker of ADR. 18 endoscopists participated in the study. VGPs were measured using eye-tracking technology during the withdrawal phase of colonoscopy. VGPs were characterized using two analyses - screen and anatomy. Eye-tracking parameters were used to characterize performance, which was further substantiated using hidden Markov model (HMM) analysis. Subjects with higher PDRs spent more time viewing the outer ring of the 3 × 3 grid for both analyses (screen-based: r = 0.56, P = 0.02; anatomy: r = 0.62, P < 0.01). Fixation distribution to the "bottom U" of the screen in screen-based analysis was positively correlated with PDR (r = 0.62, P = 0.01). HMM demarcated the VGPs into three PDR groups. This study defined distinct VGPs that are associated with expert behavior. These data may allow introduction of visual gaze training within structured training programs, and have implications for adoption in higher-level assessment. © Georg Thieme Verlag KG Stuttgart · New York.
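As an illustrative aside (the grid size comes from the abstract; the screen resolution, sample data, and dwell-time weighting are assumptions): the screen-based 3 x 3 analysis can be sketched as classifying fixations into grid cells and computing the share of dwell time on the outer ring:

import numpy as np

def outer_ring_share(fix_x, fix_y, durations, width=1920, height=1080):
    """Proportion of total fixation duration on the outer ring of a 3x3 screen grid."""
    col = np.clip((np.asarray(fix_x) // (width / 3)).astype(int), 0, 2)
    row = np.clip((np.asarray(fix_y) // (height / 3)).astype(int), 0, 2)
    outer = (col != 1) | (row != 1)                  # every cell except the centre one
    durations = np.asarray(durations, dtype=float)
    return float(durations[outer].sum() / durations.sum())

# Hypothetical withdrawal-phase fixations (pixels) and durations (ms).
x = [960, 300, 1700, 980, 200, 1500]
y = [540, 200, 900, 500, 950, 100]
d = [400, 250, 300, 500, 200, 350]
print(f"outer-ring dwell share = {outer_ring_share(x, y, d):.2f}")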
NASA Astrophysics Data System (ADS)
Farkas, Attila J.; Hajnal, Alen; Shiratuddin, Mohd F.; Szatmary, Gabriella
In this paper, we propose a novel approach of using interactive virtual environment technology in Vision Restoration Therapy for visual impairment caused by Traumatic Brain Injury. We call the new system the Interactive Visuotactile Virtual Environment, and it holds the promise of expanding the scope of existing rehabilitation techniques. Traditional vision rehabilitation methods are based on passive psychophysical training procedures and can last up to six months before any modest improvements can be seen in patients. A highly immersive and interactive virtual environment will allow the patient to practice everyday activities such as object identification and object manipulation through the use of 3D motion-sensing handheld devices such as a data glove or the Nintendo Wiimote. Employing both perceptual and action components in the training procedures holds the promise of more efficient sensorimotor rehabilitation. Increased stimulation of visual and sensorimotor areas of the brain should facilitate a comprehensive recovery of visuomotor function by exploiting the plasticity of the central nervous system. Integrated with a motion tracking system and an eye tracking device, the interactive virtual environment allows for the creation and manipulation of a wide variety of stimuli, as well as real-time recording of hand, eye, and body movements and their coordination. The goal of the project is to design a cost-effective and efficient vision restoration system.
Fujita, Takuo; Nakamura, Shoji; Ohue, Mutsumi; Fujii, Yoshio; Miyauchi, Akimitsu; Takagi, Yasuyuki; Tsugeno, Hirofumi
2007-01-01
Sway and postural instability have drawn attention as a risk factor for osteoporotic fracture, in addition to low bone mineral density (BMD) and poor bone quality. In view of the fracture-reducing effect of alfacalcidol and active absorbable algal calcium (AAA Ca) not readily explained by rather mild increases of BMD, attempts were made to evaluate the postural stabilizing effect of alfacalcidol, AAA Ca, and calcium carbonate (CaCO(3)) by computerized posturography. The track of the gravity center was analyzed to calculate parameters related to track length, track range, and track density to express the degree of sway before and after supplementation in 126 subjects ranging in age between 20 and 81 years, randomly divided into four groups. Supplementation with AAA Ca containing 900 mg elemental Ca (group A), no calcium (group B), CaCO(3) also containing 900 mg elemental Ca (group C), or alfacalcidol (group D) continued daily for 12 months. For each parameter, the ratio of the closed-eye value to the open-eye value (Romberg ratio) was calculated to detect aggravation of sway by eye closure. Age, parameters of Ca and P, and the proportions of subjects with fracture and those with low BMD showed no marked deviation among the groups. With eyes open, significant decreases of a track range parameter (REC) from group B were noted in groups A (P = 0.0397) and D (P = 0.0296), but not in group C, according to multiple comparison by Scheffe, indicating a superior postural stabilizing effect of A and D over C. In the first 2 months, a significant fall in REC from group B was already evident in group D (P = 0.0120) with eyes open. Paired comparison of sway parameters before and after supplementation revealed a significant increase of a track density parameter (LNGA), indicating sway control efficiency, and a significant decrease of REC in groups A and D compared to group B with eyes open. With eyes closed, only group A showed a significant improvement from group B (P = 0.0456; Fig. 1), with a significant shortening on paired After/Before comparison (P = 0.0142; Fig. 2). Computerized posturography appears to be useful in analyzing sway phenomena, especially as to the effects of vitamin D and various Ca preparations.
Avoidance of Cigarette Pack Health Warnings among Regular Cigarette Smokers
Maynard, Olivia M.; Attwood, Angela; O’Brien, Laura; Brooks, Sabrina; Hedge, Craig; Leonards, Ute; Munafò, Marcus R.
2016-01-01
Background Previous research with adults and adolescents indicates that plain cigarette packs increase visual attention to health warnings among non-smokers and non-regular smokers, but not among regular smokers. This may be because regular smokers: 1) are familiar with the health warnings, 2) preferentially attend to branding, or 3) actively avoid health warnings. We sought to distinguish between these explanations using eye-tracking technology. Method A convenience sample of 30 adult dependant smokers were recruited to participate in an eye-tracking study. Participants viewed branded, plain and blank packs of cigarettes with familiar and unfamiliar health warnings. The number of fixations to health warnings and branding on the different pack types were recorded. Results Analysis of variance indicated that regular smokers were biased towards fixating the branding location rather than the health warning location on all three pack types (p < 0.002). This bias was smaller, but still evident, for blank packs, where smokers preferentially attended the blank region over the health warnings. Time-course analysis showed that for branded and plain packs, attention was preferentially directed to the branding location for the entire 10 seconds of the stimulus presentation, while for blank packs this occurred for the last 8 seconds of the stimulus presentation. Familiarity with health warnings had no effect on eye gaze location. Conclusion Smokers actively avoid cigarette pack health warnings, and this remains the case even in the absence of salient branding information. Smokers may have learned to divert their attention away from cigarette pack health warnings. These findings have policy implications for the design of health warning on cigarette packs. PMID:24485554
Case Comparisons: An Efficient Way of Learning Radiology.
Kok, Ellen M; de Bruin, Anique B H; Leppink, Jimmie; van Merriënboer, Jeroen J G; Robben, Simon G F
2015-10-01
Radiologists commonly use comparison films to improve their differential diagnosis. Educational literature suggests that this technique might also be used to bolster the process of learning to interpret radiographs. We investigated the effectiveness of three comparison techniques in medical students, whom we invited to compare cases of the same disease (same-disease comparison), cases of different diseases (different-disease comparison), disease images with normal images (disease/normal comparison), and identical images (no comparison/control condition). Furthermore, we used eye-tracking technology to investigate which elements of the two cases were compared by the students. We randomly assigned 84 medical students to one of four conditions and had them study different diseases on chest radiographs, while their eye movements were being measured. Thereafter, participants took two tests that measured diagnostic performance and their ability to locate diseases, respectively. Students studied most efficiently in the same-disease and different-disease comparison conditions: test 1, F(3, 68) = 3.31, P = .025, ηp(2) = 0.128; test 2, F(3, 65) = 2.88, P = .043, ηp(2) = 0.117. We found that comparisons were effected in 91% of all trials (except for the control condition). Comparisons between normal anatomy were particularly common (45.8%) in all conditions. Comparing cases can be an efficient way of learning to interpret radiographs, especially when the comparison technique used is specifically tailored to the learning goal. Eye tracking provided insight into the comparison process, by showing that few comparisons were made between abnormalities, for example. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
Eye tracking reveals a crucial role for facial motion in recognition of faces by infants
Xiao, Naiqi G.; Quinn, Paul C.; Liu, Shaoying; Ge, Liezhong; Pascalis, Olivier; Lee, Kang
2015-01-01
Current knowledge about face processing in infancy comes largely from studies using static face stimuli, but faces that infants see in the real world are mostly moving ones. To bridge this gap, 3-, 6-, and 9-month-old Asian infants (N = 118) were familiarized with either moving or static Asian female faces and then their face recognition was tested with static face images. Eye tracking methodology was used to record eye movements during familiarization and test phases. The results showed a developmental change in eye movement patterns, but only for the moving faces. In addition, the more infants shifted their fixations across facial regions, the better was their face recognition, but only for the moving faces. The results suggest that facial movement influences the way faces are encoded from early in development. PMID:26010387
Reasoning strategies with rational numbers revealed by eye tracking.
Plummer, Patrick; DeWolf, Melissa; Bassok, Miriam; Gordon, Peter C; Holyoak, Keith J
2017-07-01
Recent research has begun to investigate the impact of different formats for rational numbers on the processes by which people make relational judgments about quantitative relations. DeWolf, Bassok, and Holyoak (Journal of Experimental Psychology: General, 144(1), 127-150, 2015) found that accuracy on a relation identification task was highest when fractions were presented with countable sets, whereas accuracy was relatively low for all conditions where decimals were presented. However, it is unclear what processing strategies underlie these disparities in accuracy. We report an experiment that used eye-tracking methods to externalize the strategies that are evoked by different types of rational numbers for different types of quantities (discrete vs. continuous). Results showed that eye-movement behavior during the task was jointly determined by image and number format. Discrete images elicited a counting strategy for both fractions and decimals, but this strategy led to higher accuracy only for fractions. Continuous images encouraged magnitude estimation and comparison, but to a greater degree for decimals than fractions. This strategy led to decreased accuracy for both number formats. By analyzing participants' eye movements when they viewed a relational context and made decisions, we were able to obtain an externalized representation of the strategic choices evoked by different ontological types of entities and different types of rational numbers. Our findings using eye-tracking measures enable us to go beyond previous studies based on accuracy data alone, demonstrating that quantitative properties of images and the different formats for rational numbers jointly influence strategies that generate eye-movement behavior.
Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays.
Padmanaban, Nitish; Konrad, Robert; Stramer, Tal; Cooper, Emily A; Wetzstein, Gordon
2017-02-28
From the desktop to the laptop to the mobile device, personal computing platforms evolve over time. Moving forward, wearable computing is widely expected to be integral to consumer electronics and beyond. The primary interface between a wearable computer and a user is often a near-eye display. However, current generation near-eye displays suffer from multiple limitations: they are unable to provide fully natural visual cues and comfortable viewing experiences for all users. At their core, many of the issues with near-eye displays are caused by limitations in conventional optics. Current displays cannot reproduce the changes in focus that accompany natural vision, and they cannot support users with uncorrected refractive errors. With two prototype near-eye displays, we show how these issues can be overcome using display modes that adapt to the user via computational optics. By using focus-tunable lenses, mechanically actuated displays, and mobile gaze-tracking technology, these displays can be tailored to correct common refractive errors and provide natural focus cues by dynamically updating the system based on where a user looks in a virtual scene. Indeed, the opportunities afforded by recent advances in computational optics open up the possibility of creating a computing platform in which some users may experience better quality vision in the virtual world than in the real one.
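As an illustrative aside (a schematic control rule, not the calibration used by these prototypes): gaze-contingent focus can be sketched as setting the tunable-lens power to the dioptric demand of the fixated virtual point, offset by an assumed spherical correction and clamped to the lens range:

import numpy as np

def lens_power_diopters(fixation_distance_m, user_sphere_d=0.0, lens_range_d=(-3.0, 3.0)):
    """Schematic target power: dioptric distance of the fixated point plus an assumed
    spherical correction, clamped to the tunable lens's range."""
    demand = 1.0 / max(fixation_distance_m, 0.1)     # diopters; guard very near fixations
    return float(np.clip(demand + user_sphere_d, *lens_range_d))

# Hypothetical myopic user (-2.5 D) fixating a virtual object 0.5 m away.
print(f"{lens_power_diopters(0.5, user_sphere_d=-2.5):+.2f} D")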
Garcia-Martin, Elena; Pinilla, Isabel; Sancho, Eva; Almarcegui, Carmen; Dolz, Isabel; Rodriguez-Mena, Diego; Fuertes, Isabel; Cuenca, Nicolas
2012-09-01
To evaluate the ability of time-domain and Fourier-domain optical coherence tomographies (OCTs) to detect macular and retinal nerve fiber layer atrophies in retinitis pigmentosa (RP). To test the intrasession reproducibility using three OCT instruments (Stratus, Cirrus, and Spectralis). Eighty eyes of 80 subjects (40 RP patients and 40 healthy subjects) underwent a visual field examination, together with 3 macular scans and 3 optic disk evaluations by the same experienced examiner using 3 OCT instruments. Differences between healthy and RP eyes were compared. The relationship between measurements with each OCT instrument was evaluated. Repeatability was studied by intraclass correlation coefficients and coefficients of variation. Macular and retinal nerve fiber layer atrophies were detected in RP patients for all OCT parameters. Macular and retinal nerve fiber layer thicknesses, as determined by the different OCTs, were correlated but significantly different (P < 0.05). Reproducibility was moderately high using Stratus, good using Cirrus and Spectralis, and excellent using the Tru-track technology of Spectralis. In RP eyes, measurements showed higher variability compared with healthy eyes. Differences in thickness measurements existed between OCT instruments, despite there being a high degree of correlation. Fourier-domain OCT can be considered a valid and repeatable technique to detect retinal nerve fiber layer atrophy in RP patients.
Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays
NASA Astrophysics Data System (ADS)
Padmanaban, Nitish; Konrad, Robert; Stramer, Tal; Cooper, Emily A.; Wetzstein, Gordon
2017-02-01
From the desktop to the laptop to the mobile device, personal computing platforms evolve over time. Moving forward, wearable computing is widely expected to be integral to consumer electronics and beyond. The primary interface between a wearable computer and a user is often a near-eye display. However, current generation near-eye displays suffer from multiple limitations: they are unable to provide fully natural visual cues and comfortable viewing experiences for all users. At their core, many of the issues with near-eye displays are caused by limitations in conventional optics. Current displays cannot reproduce the changes in focus that accompany natural vision, and they cannot support users with uncorrected refractive errors. With two prototype near-eye displays, we show how these issues can be overcome using display modes that adapt to the user via computational optics. By using focus-tunable lenses, mechanically actuated displays, and mobile gaze-tracking technology, these displays can be tailored to correct common refractive errors and provide natural focus cues by dynamically updating the system based on where a user looks in a virtual scene. Indeed, the opportunities afforded by recent advances in computational optics open up the possibility of creating a computing platform in which some users may experience better quality vision in the virtual world than in the real one.
Distractor interference during smooth pursuit eye movements.
Spering, Miriam; Gegenfurtner, Karl R; Kerzel, Dirk
2006-10-01
When 2 targets for pursuit eye movements move in different directions, the eye velocity follows the vector average (S. G. Lisberger & V. P. Ferrera, 1997). The present study investigates the mechanisms of target selection when observers are instructed to follow a predefined horizontal target and to ignore a moving distractor stimulus. Results show that at 140 ms after distractor onset, horizontal eye velocity is decreased by about 25%. Vertical eye velocity increases or decreases by 1°/s in the direction opposite from the distractor. This deviation varies in size with distractor direction, velocity, and contrast. The effect was present during the initiation and steady-state tracking phase of pursuit but only when the observer had prior information about target motion. Neither vector averaging nor winner-take-all models could predict the response to a moving to-be-ignored distractor during steady-state tracking of a predefined target. The contributions of perceptual mislocalization and spatial attention to the vertical deviation in pursuit are discussed. Copyright 2006 APA.
Richards, Michael R; Fields, Henry W; Beck, F Michael; Firestone, Allen R; Walther, Dirk B; Rosenstiel, Stephen; Sacksteder, James M
2015-04-01
There is disagreement in the literature concerning the importance of the mouth in overall facial attractiveness. Eye tracking provides an objective method to evaluate what people see. The objective of this study was to determine whether dental and facial attractiveness alters viewers' visual attention in terms of which area of the face (eyes, nose, mouth, chin, ears, or other) is viewed first, viewed the greatest number of times, and viewed for the greatest total time (duration) using eye tracking. Seventy-six viewers underwent 1 eye tracking session. Of these, 53 were white (49% female, 51% male). Their ages ranged from 18 to 29 years, with a mean of 19.8 years, and none were dental professionals. After being positioned and calibrated, they were shown 24 unique female composite images, each image shown twice for reliability. These images reflected a repaired unilateral cleft lip or 3 grades of dental attractiveness similar to those of grades 1 (near ideal), 7 (borderline treatment need), and 10 (definite treatment need) as assessed in the aesthetic component of the Index of Orthodontic Treatment Need (AC-IOTN). The images were then embedded in faces of 3 levels of attractiveness: attractive, average, and unattractive. During viewing, data were collected for the first location, frequency, and duration of each viewer's gaze. Observer reliability ranged from 0.58 to 0.92 (intraclass correlation coefficients) but was less than 0.07 (interrater) for the chin, which was eliminated from the study. Likewise, reliability for the area of first fixation was kappa less than 0.10 for both intrarater and interrater reliabilities; the area of first fixation was also removed from the data analysis. Repeated-measures analysis of variance showed a significant effect (P < 0.001) for level of attractiveness by malocclusion by area of the face. For both number of fixations and duration of fixations, the eyes overwhelmingly were most salient, with the mouth receiving the second most visual attention. At times, the mouth and the eyes were statistically indistinguishable in viewers' fixation counts and durations. As the dental attractiveness decreased, the visual attention increased on the mouth, approaching that of the eyes. AC-IOTN grade 10 gained the most attention, followed by both AC-IOTN grade 7 and the cleft. AC-IOTN grade 1 received the least amount of visual attention. Also, lower dental attractiveness (AC-IOTN 7 and AC-IOTN 10) received more visual attention as facial attractiveness increased. Eye tracking indicates that dental attractiveness can alter the level of visual attention depending on the female models' facial attractiveness when viewed by laypersons. Copyright © 2015 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
Emerging Technologies Look Deeper into the Eyes to Catch Signs of Disease
... Eye Disease Vision Screening World Sight Day Emerging technologies look deeper into the eyes to catch signs ... to eye gazing Adaptive optics (AO) is one technology helping to overcome this problem. It deals with ...
Dual Purkinje-Image Eyetracker
1996-01-01
Abnormal nystagmus can also be detected through the use of an eyetracker [4]. Through tracking points of eye gaze within a scene, it is possible to...moving, even when gazing. Correcting for these unpredictable micro eye movements would allow corrective procedures in eye surgery to become more accurate...victim with a screen of letters on a monitor. A calibrated eyetracker then provides a processor with information about the location of eye gaze. The
Müller, Jana Annina; Wendt, Dorothea; Kollmeier, Birger; Brand, Thomas
2016-01-01
The aim of this study was to validate a procedure for performing the audio-visual paradigm introduced by Wendt et al. (2015) with reduced practical challenges. The original paradigm records eye fixations using an eye tracker and calculates the duration of sentence comprehension based on a bootstrap procedure. In order to reduce practical challenges, we first reduced the measurement time by evaluating a smaller measurement set with fewer trials. The results of 16 listeners showed effects comparable to those obtained when testing the original full measurement set on a different collective of listeners. Secondly, we introduced electrooculography as an alternative technique for recording eye movements. The correlation between the results of the two recording techniques (eye tracker and electrooculography) was r = 0.97, indicating that both methods are suitable for estimating the processing duration of individual participants. Similar changes in processing duration arising from sentence complexity were found using the eye tracker and the electrooculography procedure. Thirdly, the time course of eye fixations was estimated with an alternative procedure, growth curve analysis, which is more commonly used in recent studies analyzing eye tracking data. The results of the growth curve analysis were compared with the results of the bootstrap procedure. Both analysis methods show similar processing durations. PMID:27764125
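As a generic illustration only, and not the bootstrap procedure of Wendt et al. referenced above, a processing-duration estimate could be bootstrapped from trial-level target-fixation data by resampling trials, averaging the fixation curves, and noting when the average first exceeds a criterion. The 0.5 criterion, 60 Hz sampling rate, and array layout below are assumptions.

```python
import numpy as np

def bootstrap_processing_duration(fixation_curves: np.ndarray,
                                  fs_hz: float = 60.0,
                                  criterion: float = 0.5,
                                  n_boot: int = 1000,
                                  seed: int = 0) -> tuple[float, float]:
    """Bootstrap the time (s) at which the mean target-fixation proportion
    first exceeds `criterion`.

    fixation_curves: (n_trials, n_samples) array of 0/1 target fixations.
    Returns (mean estimate, standard deviation) across bootstrap resamples.
    """
    rng = np.random.default_rng(seed)
    n_trials, _ = fixation_curves.shape
    estimates = []
    for _ in range(n_boot):
        idx = rng.integers(0, n_trials, n_trials)        # resample trials with replacement
        mean_curve = fixation_curves[idx].mean(axis=0)   # average fixation proportion
        above = np.flatnonzero(mean_curve > criterion)
        if above.size:                                   # first criterion crossing
            estimates.append(above[0] / fs_hz)
    if not estimates:
        return float("nan"), float("nan")
    estimates = np.asarray(estimates)
    return estimates.mean(), estimates.std(ddof=1)
```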
The specificity of attentional biases by type of gambling: An eye-tracking study.
McGrath, Daniel S; Meitner, Amadeus; Sears, Christopher R
2018-01-01
A growing body of research indicates that gamblers develop an attentional bias for gambling-related stimuli. Compared to research on substance use, however, few studies have examined attentional biases in gamblers using eye-gaze tracking, which has many advantages over other measures of attention. In addition, previous studies of attentional biases in gamblers have not directly matched type of gambler with personally-relevant gambling cues. The present study investigated the specificity of attentional biases for individual types of gambling using an eye-gaze tracking paradigm. Three groups of participants (poker players, video lottery terminal/slot machine players, and non-gambling controls) took part in one test session in which they viewed 25 sets of four images (poker, VLTs/slot machines, bingo, and board games). Participants' eye fixations were recorded throughout each 8-second presentation of the four images. The results indicated that, as predicted, the two gambling groups preferentially attended to their primary form of gambling, whereas control participants attended to board games more than gambling images. The findings have clinical implications for the treatment of individuals with gambling disorder. Understanding the importance of personally-salient gambling cues will inform the development of effective attentional bias modification treatments for problem gamblers.
Perception of direct vs. averted gaze in portrait paintings: An fMRI and eye-tracking study.
Kesner, Ladislav; Grygarová, Dominika; Fajnerová, Iveta; Lukavský, Jiří; Nekovářová, Tereza; Tintěra, Jaroslav; Zaytseva, Yuliya; Horáček, Jiří
2018-06-15
In this study, we use separate eye-tracking measurements and functional magnetic resonance imaging to investigate the neuronal and behavioral response to painted portraits with direct versus averted gaze. We further explored modulatory effects of several painting characteristics (premodern vs modern period, influence of style and pictorial context). In the fMRI experiment, we show that the direct versus averted gaze elicited increased activation in lingual and inferior occipital and the fusiform face area, as well as in several areas involved in attentional and social cognitive processes, especially the theory of mind: angular gyrus/temporo-parietal junction, inferior frontal gyrus and dorsolateral prefrontal cortex. The additional eye-tracking experiment showed that participants spent more time viewing the portrait's eyes and mouth when the portrait's gaze was directed towards the observer. These results suggest that static and, in some cases, highly stylized depictions of human beings in artistic portraits elicit brain activation commensurate with the experience of being observed by a watchful intelligent being. They thus involve observers in implicit inferences of the painted subject's mental states and emotions. We further confirm the substantial influence of representational medium on brain activity. Copyright © 2018 Elsevier Inc. All rights reserved.
The eye-tracking computer device for communication in amyotrophic lateral sclerosis.
Spataro, R; Ciriacono, M; Manno, C; La Bella, V
2014-07-01
To explore the effectiveness of communication and the variables affecting the eye-tracking computer system (ETCS) utilization in patients with late-stage amyotrophic lateral sclerosis (ALS). We performed a telephone survey on 30 patients with advanced non-demented ALS who were provided with an ETCS device. Median age at interview was 55 years (IQR = 48-62), with a relatively high education (13 years, IQR = 8-13). A one-off interview was conducted, and answers were later provided with the help of the caregiver. The interview included items about demographic and clinical variables affecting the daily ETCS utilization. The median time of ETCS device possession was 15 months (IQR = 9-20). The actual daily utilization was 300 min (IQR = 100-720), mainly for communication with relatives/caregivers, internet surfing, e-mailing, and social networking. 23.3% of patients with ALS (n = 7) had a low daily ETCS utilization; the most commonly reported causes were eye-gaze tiredness and oculomotor dysfunction. The eye-tracking computer system is a valuable device for AAC in patients with ALS, and it can be operated with good performance. The development of oculomotor impairment may limit its functional use. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Catrysse, Leen; Gijbels, David; Donche, Vincent; De Maeyer, Sven; Lesterhuis, Marije; Van den Bossche, Piet
2018-03-01
Up until now, empirical studies in the Student Approaches to Learning field have mainly been focused on the use of self-report instruments, such as interviews and questionnaires, to uncover differences in students' general preferences towards learning strategies, but have focused less on the use of task-specific and online measures. This study aimed at extending current research on students' learning strategies by combining general and task-specific measurements of students' learning strategies using both offline and online measures. We want to clarify how students process learning contents and to what extent this is related to their self-report of learning strategies. Twenty students with different generic learning profiles (according to self-report questionnaires) read an expository text, while their eye movements were registered to answer questions on the content afterwards. Eye-tracking data were analysed with generalized linear mixed-effects models. The results indicate that students with an all-high profile, combining both deep and surface learning strategies, spend more time on rereading the text than students with an all-low profile, scoring low on both learning strategies. This study showed that we can use eye-tracking to distinguish very strategic students, characterized using cognitive processing and regulation strategies, from low strategic students, characterized by a lack of cognitive and regulation strategies. These students processed the expository text according to how they self-reported. © 2017 The British Psychological Society.
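A minimal sketch of the kind of mixed-effects analysis mentioned above, substituting a linear mixed model (random intercept per participant) for the generalized linear mixed-effects models actually used in the study; the column names and values are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: total rereading time (ms) per participant and text segment.
data = pd.DataFrame({
    "participant": [p for p in ["p1", "p2", "p3", "p4", "p5", "p6"] for _ in range(3)],
    "profile": ["all_high"] * 9 + ["all_low"] * 9,
    "rereading_time": [620, 580, 660, 700, 640, 690, 610, 650, 630,
                       300, 340, 280, 310, 330, 290, 320, 270, 350],
})

# Fixed effect of learning profile, random intercept per participant.
model = smf.mixedlm("rereading_time ~ profile", data, groups=data["participant"])
print(model.fit().summary())
```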
NASA Astrophysics Data System (ADS)
Keane, Tommy P.; Cahill, Nathan D.; Tarduno, John A.; Jacobs, Robert A.; Pelz, Jeff B.
2014-02-01
Mobile eye-tracking provides the fairly unique opportunity to record and elucidate cognition in action. In our research, we are searching for patterns in, and distinctions between, the visual-search performance of experts and novices in the geo-sciences. Traveling to regions resultant from various geological processes as part of an introductory field studies course in geology, we record the prima facie gaze patterns of experts and novices when they are asked to determine the modes of geological activity that have formed the scene-view presented to them. Recording eye video and scene video in natural settings generates complex imagery that requires advanced applications of computer vision research to generate registrations and mappings between the views of separate observers. By developing such mappings, we could then place many observers into a single mathematical space where we can spatio-temporally analyze inter- and intra-subject fixations, saccades, and head motions. While working towards perfecting these mappings, we developed an updated experiment setup that allowed us to statistically analyze intra-subject eye-movement events without the need for a common domain. Through such analyses we are finding statistical differences between novices and experts in these visual-search tasks. In the course of this research we have developed a unified, open-source, software framework for processing, visualization, and interaction of mobile eye-tracking and high-resolution panoramic imagery.
Lukasova, Katerina; Silva, Isadora P.; Macedo, Elizeu C.
2016-01-01
Analysis of eye movement patterns during tracking tasks represents a potential way to identify differences in the cognitive processing and motor mechanisms underlying reading in dyslexic children before the occurrence of school failure. The current study aimed to evaluate the pattern of eye movements in antisaccades, predictive saccades and visually guided saccades in typical readers and readers with developmental dyslexia. The study included 30 children (age M = 11; SD = 1.67), 15 diagnosed with developmental dyslexia (DG) and 15 regular readers (CG), matched by age, gender and school grade. Cognitive assessment was performed prior to the eye-tracking task during which both eyes were registered using the Tobii® 1750 eye-tracking device. The results demonstrated a lower correct antisaccade rate in dyslexic children compared to the controls (p < 0.001, DG = 25%, CG = 37%). Dyslexic children also made fewer saccades in predictive latency (p < 0.001, DG = 34%, CG = 46%, predictive latency within −300 to 120 ms, with the target as the 0 point). No between-group difference was found for visually guided saccades. In this task, both groups showed shorter latency for right-side targets. The results indicated altered oculomotor behavior in dyslexic children, which has been reported in previous studies. We extend these findings by demonstrating impaired implicit learning of the target's time/position patterns in dyslexic children. PMID:27445945
Yoon, Ki-Hyuk; Kang, Min-Koo; Lee, Hwasun; Kim, Sung-Kyu
2018-01-01
We study optical technologies for viewer-tracked autostereoscopic 3D display (VTA3D), which provides improved 3D image quality and extended viewing range. In particular, we utilize a technique, the so-called dynamic fusion of viewing zone (DFVZ), for each 3D optical line to realize image quality equivalent to that achievable at optimal viewing distance, even when a viewer is moving in a depth direction. In addition, we examine quantitative properties of viewing zones provided by the VTA3D system that adopted DFVZ, revealing that the optimal viewing zone can be formed at the viewer's position. Last, we show that the comfort zone is extended due to DFVZ. This is demonstrated by a viewer's subjective evaluation of the 3D display system that employs both multiview autostereoscopic 3D display and DFVZ.
Pilots' visual scan patterns and situation awareness in flight operations.
Yu, Chung-San; Wang, Eric Min-Yang; Li, Wen-Chin; Braithwaite, Graham
2014-07-01
Situation awareness (SA) is considered an essential prerequisite for safe flying. If the impact of visual scanning patterns on a pilot's situation awareness could be identified in flight operations, then eye-tracking tools could be integrated with flight simulators to improve training efficiency. Participating in this research were 18 qualified, mission-ready fighter pilots. The equipment included high-fidelity and fixed-base type flight simulators and mobile head-mounted eye-tracking devices to record a subject's eye movements and SA while performing air-to-surface tasks. There were significant differences in pilots' percentage of fixation in three operating phases: preparation (M = 46.09, SD = 14.79), aiming (M = 24.24, SD = 11.03), and release and break-away (M = 33.98, SD = 14.46). Also, there were significant differences in pilots' pupil sizes, which were largest in the aiming phase (M = 27,621, SD = 6390.8), followed by release and break-away (M = 27,173, SD = 5830.46), then preparation (M = 25,710, SD = 6078.79), which was the smallest. Furthermore, pilots with better SA performance showed lower perceived workload (M = 30.60, SD = 17.86), and pilots with poor SA performance showed higher perceived workload (M = 60.77, SD = 12.72). Pilots' percentage of fixation and average fixation duration among five different areas of interest showed significant differences as well. Eye-tracking devices can aid in capturing pilots' visual scan patterns and SA performance, unlike traditional flight simulators. Therefore, integrating eye-tracking devices into the simulator may be a useful method for promoting SA training in flight operations, and can provide in-depth understanding of the mechanism of visual scan patterns and information processing to improve training effectiveness in aviation.
Adaptive optics with pupil tracking for high resolution retinal imaging
Sahin, Betul; Lamory, Barbara; Levecq, Xavier; Harms, Fabrice; Dainty, Chris
2012-01-01
Adaptive optics, when integrated into retinal imaging systems, compensates for rapidly changing ocular aberrations in real time and results in improved high resolution images that reveal the photoreceptor mosaic. Imaging the retina at high resolution has numerous potential medical applications, and yet for the development of commercial products that can be used in the clinic, the complexity and high cost of the present research systems have to be addressed. We present a new method to control the deformable mirror in real time based on pupil tracking measurements which uses the default camera for the alignment of the eye in the retinal imaging system and requires no extra cost or hardware. We also present the first experiments done with a compact adaptive optics flood illumination fundus camera where it was possible to compensate for the higher order aberrations of a moving model eye and in vivo in real time based on pupil tracking measurements, without the real time contribution of a wavefront sensor. As an outcome of this research, we showed that pupil tracking can be effectively used as a low cost and practical adaptive optics tool for high resolution retinal imaging because eye movements constitute an important part of the ocular wavefront dynamics. PMID:22312577
Adaptive optics with pupil tracking for high resolution retinal imaging.
Sahin, Betul; Lamory, Barbara; Levecq, Xavier; Harms, Fabrice; Dainty, Chris
2012-02-01
Adaptive optics, when integrated into retinal imaging systems, compensates for rapidly changing ocular aberrations in real time and results in improved high resolution images that reveal the photoreceptor mosaic. Imaging the retina at high resolution has numerous potential medical applications, and yet for the development of commercial products that can be used in the clinic, the complexity and high cost of the present research systems have to be addressed. We present a new method to control the deformable mirror in real time based on pupil tracking measurements which uses the default camera for the alignment of the eye in the retinal imaging system and requires no extra cost or hardware. We also present the first experiments done with a compact adaptive optics flood illumination fundus camera where it was possible to compensate for the higher order aberrations of a moving model eye and in vivo in real time based on pupil tracking measurements, without the real time contribution of a wavefront sensor. As an outcome of this research, we showed that pupil tracking can be effectively used as a low cost and practical adaptive optics tool for high resolution retinal imaging because eye movements constitute an important part of the ocular wavefront dynamics.
Genetics Home Reference: horizontal gaze palsy with progressive scoliosis
... to track moving objects. Up-and-down (vertical) eye movements are typically normal. In people with HGPPS , an ... the brainstem is the underlying cause of the eye movement abnormalities associated with the disorder. The cause of ...
The Use of Eye Movements in the Study of Multimedia Learning
ERIC Educational Resources Information Center
Hyona, Jukka
2010-01-01
This commentary focuses on the use of the eye-tracking methodology to study cognitive processes during multimedia learning. First, some general remarks are made about how the method is applied to investigate visual information processing, followed by a reflection on the eye movement measures employed in the studies published in this special issue.…
Reliability and Validity of Eye Movement Measures of Children's Reading
ERIC Educational Resources Information Center
Foster, Tori E.; Ardoin, Scott P.; Binder, Katherine S.
2018-01-01
Although strong claims have been made regarding the educational utility of eye tracking, such statements seem somewhat unfounded in the absence of clear evidence regarding the technical adequacy of eye movement (EM) data. Past studies have yielded direct and indirect evidence concerning the utility of EMs as measures of reading, but recent…
Ryan, Mandy; Krucien, Nicolas; Hermens, Frouke
2018-04-01
Although choice experiments (CEs) are widely applied in economics to study choice behaviour, understanding of how individuals process attribute information remains limited. We show how eye-tracking methods can provide insight into how decisions are made. Participants completed a CE, while their eye movements were recorded. Results show that although the information presented guided participants' decisions, there were also several processing biases at work. Evidence was found of (a) top-to-bottom, (b) left-to-right, and (c) first-to-last order biases. Experimental factors-whether attributes are defined as "best" or "worst," choice task complexity, and attribute ordering-also influence information processing. How individuals visually process attribute information was shown to be related to their choices. Implications for the design and analysis of CEs and future research are discussed. Copyright © 2017 John Wiley & Sons, Ltd.
Wibirama, Sunu; Nugroho, Hanung A
2017-07-01
Mobile device addiction has been an important research topic in cognitive science, mental health, and human-machine interaction. Previous works observed mobile device addiction by logging mobile device activity. Although immersion has been linked as a significant predictor of video game addiction, investigation of mobile device addiction factors with behavioral measurement has not been done before. In this research, we demonstrated the use of eye tracking to observe the effect of screen size on the experience of immersion. We compared subjective judgment with eye movement analysis. Non-parametric analysis of immersion scores shows that screen size affects the experience of immersion (p < 0.05). Furthermore, our experimental results suggest that fixational eye movements may be used as an indicator for future investigation of mobile device addiction. Our experimental results are also useful for developing guidelines as well as intervention strategies to deal with smartphone addiction.
Eye tracking reveals a crucial role for facial motion in recognition of faces by infants.
Xiao, Naiqi G; Quinn, Paul C; Liu, Shaoying; Ge, Liezhong; Pascalis, Olivier; Lee, Kang
2015-06-01
Current knowledge about face processing in infancy comes largely from studies using static face stimuli, but faces that infants see in the real world are mostly moving ones. To bridge this gap, 3-, 6-, and 9-month-old Asian infants (N = 118) were familiarized with either moving or static Asian female faces, and then their face recognition was tested with static face images. Eye-tracking methodology was used to record eye movements during the familiarization and test phases. The results showed a developmental change in eye movement patterns, but only for the moving faces. In addition, the more infants shifted their fixations across facial regions, the better their face recognition was, but only for the moving faces. The results suggest that facial movement influences the way faces are encoded from early in development. (c) 2015 APA, all rights reserved.
A Model-Based Approach for the Measurement of Eye Movements Using Image Processing
NASA Technical Reports Server (NTRS)
Sung, Kwangjae; Reschke, Millard F.
1997-01-01
This paper describes a video eye-tracking algorithm which searches for the best fit of the pupil modeled as a circular disk. The algorithm is robust to common image artifacts such as the droopy eyelids and light reflections while maintaining the measurement resolution available by the centroid algorithm. The presented algorithm is used to derive the pupil size and center coordinates, and can be combined with iris-tracking techniques to measure ocular torsion. A comparison search method of pupil candidates using pixel coordinate reference lookup tables optimizes the processing requirements for a least square fit of the circular disk model. This paper includes quantitative analyses and simulation results for the resolution and the robustness of the algorithm. The algorithm presented in this paper provides a platform for a noninvasive, multidimensional eye measurement system which can be used for clinical and research applications requiring the precise recording of eye movements in three-dimensional space.
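The circular-disk model fit described above can be illustrated with a standard algebraic least-squares circle fit to candidate pupil-edge pixels; this generic Kåsa-style fit stands in for the paper's lookup-table comparison search, which is not reproduced here, and the synthetic points are made up.

```python
import numpy as np

def fit_pupil_circle(xs: np.ndarray, ys: np.ndarray) -> tuple[float, float, float]:
    """Least-squares circle fit (center x, center y, radius) to edge points.

    Solves the linear system for the algebraic circle equation
    x^2 + y^2 + D*x + E*y + F = 0, a common stand-in for a full model search.
    """
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    b = -(xs ** 2 + ys ** 2)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = -D / 2.0, -E / 2.0
    r = np.sqrt(cx ** 2 + cy ** 2 - F)
    return cx, cy, r

# Quick check with noisy synthetic edge points on a circle of radius 40 centered at (120, 90).
theta = np.linspace(0, 2 * np.pi, 50, endpoint=False)
noise = np.random.default_rng(1).normal(0, 0.5, theta.size)
cx, cy, r = fit_pupil_circle(120 + (40 + noise) * np.cos(theta),
                             90 + (40 + noise) * np.sin(theta))
print(round(cx, 1), round(cy, 1), round(r, 1))
```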
Through the eyes of the own-race bias: eye-tracking and pupillometry during face recognition.
Wu, Esther Xiu Wen; Laeng, Bruno; Magnussen, Svein
2012-01-01
People are generally better at remembering faces of their own race than faces of a different race, and this effect is known as the own-race bias (ORB) effect. We used eye-tracking and pupillometry to investigate whether Caucasian and Asian face stimuli elicited different-looking patterns in Caucasian participants in a face-memory task. Consistent with the ORB effect, we found better recognition performance for own-race faces than other-race faces, and shorter response times. In addition, at encoding, eye movements and pupillary responses to Asian faces (i.e., the other race) were different from those to Caucasian faces (i.e., the own race). Processing of own-race faces was characterized by more active scanning, with a larger number of shorter fixations, and more frequent saccades. Moreover, pupillary diameters were larger when viewing other-race than own-race faces, suggesting a greater cognitive effort when encoding other-race faces.
Comprehensive Oculomotor Behavioral Response Assessment (COBRA)
NASA Technical Reports Server (NTRS)
Stone, Leland S. (Inventor); Liston, Dorion B. (Inventor)
2017-01-01
An eye movement-based methodology and assessment tool may be used to quantify many aspects of human dynamic visual processing using a relatively simple and short oculomotor task, noninvasive video-based eye tracking, and validated oculometric analysis techniques. By examining the eye movement responses to a task including a radially organized, appropriately randomized sequence of Rashbass-like step-ramp pursuit-tracking trials, distinct performance measurements may be generated that may be associated with, for example, pursuit initiation (e.g., latency and open-loop pursuit acceleration), steady-state tracking (e.g., gain, catch-up saccade amplitude, and the proportion of the steady-state response consisting of smooth movement), direction tuning (e.g., oblique effect amplitude, horizontal-vertical asymmetry, and direction noise), and speed tuning (e.g., speed responsiveness and noise). This quantitative approach may provide fast results (e.g., a multi-dimensional set of oculometrics and a single scalar impairment index) that can be interpreted by one without a high degree of scientific sophistication or extensive training.
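Two of the oculometrics named above, pursuit latency and steady-state gain, can be estimated from an eye-velocity trace with a sketch like the one below. The velocity threshold, steady-state window, sampling rate, and synthetic traces are illustrative assumptions, not the patented analysis.

```python
import numpy as np

def pursuit_latency_and_gain(eye_vel: np.ndarray,
                             target_vel: np.ndarray,
                             fs_hz: float = 500.0,
                             onset_thresh: float = 2.0,
                             steady_start_s: float = 0.4) -> tuple[float, float]:
    """Estimate pursuit latency (s) and steady-state gain from velocity traces.

    Latency: first sample after target motion onset where eye speed exceeds
    `onset_thresh` deg/s. Gain: mean eye velocity / mean target velocity over
    the steady-state window starting `steady_start_s` after target onset.
    """
    onset = np.flatnonzero(np.abs(target_vel) > 0)[0]          # target motion onset
    moving = np.flatnonzero(np.abs(eye_vel[onset:]) > onset_thresh)
    latency = moving[0] / fs_hz if moving.size else np.nan
    steady = slice(onset + int(steady_start_s * fs_hz), None)
    gain = eye_vel[steady].mean() / target_vel[steady].mean()
    return latency, gain

# Synthetic example: 15 deg/s ramp beginning at 0.2 s, eye responding after ~150 ms.
fs = 500.0
t = np.arange(0, 1.0, 1 / fs)
target = np.where(t >= 0.2, 15.0, 0.0)
eye = np.where(t >= 0.35, 13.5, 0.0)
print(pursuit_latency_and_gain(eye, target, fs))   # approx (0.15, 0.9)
```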
Bilingualism influences inhibitory control in auditory comprehension
Blumenfeld, Henrike K.; Marian, Viorica
2013-01-01
Bilinguals have been shown to outperform monolinguals at suppressing task-irrelevant information. The present study aimed to identify how processing linguistic ambiguity during auditory comprehension may be associated with inhibitory control. Monolinguals and bilinguals listened to words in their native language (English) and identified them among four pictures while their eye-movements were tracked. Each target picture (e.g., hamper) appeared together with a similar-sounding within-language competitor picture (e.g., hammer) and two neutral pictures. Following each eye-tracking trial, priming probe trials indexed residual activation of target words, and residual inhibition of competitor words. Eye-tracking showed similar within-language competition across groups; priming showed stronger competitor inhibition in monolinguals than in bilinguals, suggesting differences in how inhibitory control was used to resolve within-language competition. Notably, correlation analyses revealed that inhibition performance on a nonlinguistic Stroop task was related to linguistic competition resolution in bilinguals but not in monolinguals. Together, monolingual-bilingual comparisons suggest that cognitive control mechanisms can be shaped by linguistic experience. PMID:21159332
A resource for assessing information processing in the developing brain using EEG and eye tracking
Langer, Nicolas; Ho, Erica J.; Alexander, Lindsay M.; Xu, Helen Y.; Jozanovic, Renee K.; Henin, Simon; Petroni, Agustin; Cohen, Samantha; Marcelle, Enitan T.; Parra, Lucas C.; Milham, Michael P.; Kelly, Simon P.
2017-01-01
We present a dataset combining electrophysiology and eye tracking intended as a resource for the investigation of information processing in the developing brain. The dataset includes high-density task-based and task-free EEG, eye tracking, and cognitive and behavioral data collected from 126 individuals (ages: 6–44). The task battery spans both the simple/complex and passive/active dimensions to cover a range of approaches prevalent in modern cognitive neuroscience. The active task paradigms facilitate principled deconstruction of core components of task performance in the developing brain, whereas the passive paradigms permit the examination of intrinsic functional network activity during varying amounts of external stimulation. Alongside these neurophysiological data, we include an abbreviated cognitive test battery and questionnaire-based measures of psychiatric functioning. We hope that this dataset will lead to the development of novel assays of neural processes fundamental to information processing, which can be used to index healthy brain development as well as detect pathologic processes. PMID:28398357
A resource for assessing information processing in the developing brain using EEG and eye tracking.
Langer, Nicolas; Ho, Erica J; Alexander, Lindsay M; Xu, Helen Y; Jozanovic, Renee K; Henin, Simon; Petroni, Agustin; Cohen, Samantha; Marcelle, Enitan T; Parra, Lucas C; Milham, Michael P; Kelly, Simon P
2017-04-11
We present a dataset combining electrophysiology and eye tracking intended as a resource for the investigation of information processing in the developing brain. The dataset includes high-density task-based and task-free EEG, eye tracking, and cognitive and behavioral data collected from 126 individuals (ages: 6-44). The task battery spans both the simple/complex and passive/active dimensions to cover a range of approaches prevalent in modern cognitive neuroscience. The active task paradigms facilitate principled deconstruction of core components of task performance in the developing brain, whereas the passive paradigms permit the examination of intrinsic functional network activity during varying amounts of external stimulation. Alongside these neurophysiological data, we include an abbreviated cognitive test battery and questionnaire-based measures of psychiatric functioning. We hope that this dataset will lead to the development of novel assays of neural processes fundamental to information processing, which can be used to index healthy brain development as well as detect pathologic processes.
Object motion computation for the initiation of smooth pursuit eye movements in humans.
Wallace, Julian M; Stone, Leland S; Masson, Guillaume S
2005-04-01
Pursuing an object with smooth eye movements requires an accurate estimate of its two-dimensional (2D) trajectory. This 2D motion computation requires that different local motion measurements are extracted and combined to recover the global object-motion direction and speed. Several combination rules have been proposed such as vector averaging (VA), intersection of constraints (IOC), or 2D feature tracking (2DFT). To examine this computation, we investigated the time course of smooth pursuit eye movements driven by simple objects of different shapes. For type II diamond (where the direction of true object motion is dramatically different from the vector average of the 1-dimensional edge motions, i.e., VA not equal IOC = 2DFT), the ocular tracking is initiated in the vector average direction. Over a period of less than 300 ms, the eye-tracking direction converges on the true object motion. The reduction of the tracking error starts before the closing of the oculomotor loop. For type I diamonds (where the direction of true object motion is identical to the vector average direction, i.e., VA = IOC = 2DFT), there is no such bias. We quantified this effect by calculating the direction error between responses to types I and II and measuring its maximum value and time constant. At low contrast and high speeds, the initial bias in tracking direction is larger and takes longer to converge onto the actual object-motion direction. This effect is attenuated with the introduction of more 2D information to the extent that it was totally obliterated with a texture-filled type II diamond. These results suggest a flexible 2D computation for motion integration, which combines all available one-dimensional (edge) and 2D (feature) motion information to refine the estimate of object-motion direction over time.
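The two combination rules contrasted above can be made concrete. For edges whose measured motion is a speed along each edge's unit normal, vector averaging simply averages the normal-velocity vectors, while the intersection of constraints solves the equations n_i . v = s_i for the single 2D velocity consistent with every edge. The edge orientations below are made up to give a type II configuration; this is a generic illustration, not the paper's analysis.

```python
import numpy as np

def vector_average(normals: np.ndarray, speeds: np.ndarray) -> np.ndarray:
    """Average of the 1D (normal) velocity vectors n_i * s_i."""
    return (normals * speeds[:, None]).mean(axis=0)

def intersection_of_constraints(normals: np.ndarray, speeds: np.ndarray) -> np.ndarray:
    """Solve n_i . v = s_i (least squares for >2 edges) for the 2D object velocity."""
    v, *_ = np.linalg.lstsq(normals, speeds, rcond=None)
    return v

# Two edges whose unit normals point at 20 and 70 degrees; the object translates
# rightward at 10 deg/s, so the normals sit on the same side of the true direction
# (a type II configuration).
angles = np.deg2rad([20.0, 70.0])
normals = np.column_stack([np.cos(angles), np.sin(angles)])
true_v = np.array([10.0, 0.0])
speeds = normals @ true_v                              # measured normal speeds

print(vector_average(normals, speeds))                 # ~[5.0, 3.2]: biased upward
print(intersection_of_constraints(normals, speeds))    # [10, 0]: true motion recovered
```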
Understanding Student Cognition about Complex Earth System Processes Related to Climate Change
NASA Astrophysics Data System (ADS)
McNeal, K. S.; Libarkin, J.; Ledley, T. S.; Dutta, S.; Templeton, M. C.; Geroux, J.; Blakeney, G. A.
2011-12-01
The Earth's climate system includes complex behavior and interconnections with other Earth spheres that present challenges to student learning. To better understand these unique challenges, we have conducted experiments with high-school and introductory level college students to determine how information pertaining to the connections between the Earth's atmospheric system and the other Earth spheres (e.g., hydrosphere and cryosphere) are processed. Specifically, we include psychomotor tests (e.g., eye-tracking) and open-ended questionnaires in this research study, where participants were provided scientific images of the Earth (e.g., global precipitation and ocean and atmospheric currents), eye-tracked, and asked to provide causal or relational explanations about the viewed images. In addition, the students engaged in on-line modules (http://serc.carleton.edu/eslabs/climate/index.html) focused on Earth system science as training activities to address potential cognitive barriers. The developed modules included interactive media, hands-on lessons, links to outside resources, and formative assessment questions to promote a supportive and data-rich learning environment. Student eye movements were tracked during engagement with the materials to determine the role of perception and attention on understanding. Students also completed a conceptual questionnaire pre-post to determine if these on-line curriculum materials assisted in their development of connections between Earth's atmospheric system and the other Earth systems. The pre-post results of students' thinking about climate change concepts, as well as eye-tracking results, will be presented.
Chevallier, Coralie; Parish-Morris, Julia; McVey, Alana; Rump, Keiran M; Sasson, Noah J; Herrington, John D; Schultz, Robert T
2015-10-01
Autism Spectrum Disorder (ASD) is characterized by social impairments that have been related to deficits in social attention, including diminished gaze to faces. Eye-tracking studies are commonly used to examine social attention and social motivation in ASD, but they vary in sensitivity. In this study, we hypothesized that the ecological nature of the social stimuli would affect participants' social attention, with gaze behavior during more naturalistic scenes being most predictive of ASD vs. typical development. Eighty-one children with and without ASD participated in three eye-tracking tasks that differed in the ecological relevance of the social stimuli. In the "Static Visual Exploration" task, static images of objects and people were presented; in the "Dynamic Visual Exploration" task, video clips of individual faces and objects were presented side-by-side; in the "Interactive Visual Exploration" task, video clips of children playing with objects in a naturalistic context were presented. Our analyses uncovered a three-way interaction between Task, Social vs. Object Stimuli, and Diagnosis. This interaction was driven by group differences on one task only-the Interactive task. Bayesian analyses confirmed that the other two tasks were insensitive to group membership. In addition, receiver operating characteristic analyses demonstrated that, unlike the other two tasks, the Interactive task had significant classification power. The ecological relevance of social stimuli is an important factor to consider for eye-tracking studies aiming to measure social attention and motivation in ASD. © 2015 International Society for Autism Research, Wiley Periodicals, Inc.
Sesin, Anaelis; Adjouadi, Malek; Cabrerizo, Mercedes; Ayala, Melvin; Barreto, Armando
2008-01-01
This study developed an adaptive real-time human-computer interface (HCI) that serves as an assistive technology tool for people with severe motor disability. The proposed HCI design uses eye gaze as the primary computer input device. Controlling the mouse cursor with raw eye coordinates results in sporadic motion of the pointer because of the saccadic nature of the eye. Even though eye movements are subtle and completely imperceptible under normal circumstances, they considerably affect the accuracy of an eye-gaze-based HCI. The proposed HCI system is novel because it adapts to each specific user's different and potentially changing jitter characteristics through the configuration and training of an artificial neural network (ANN) that is structured to minimize the mouse jitter. This task is based on feeding the ANN a user's initially recorded eye-gaze behavior through a short training session. The ANN finds the relationship between the gaze coordinates and the mouse cursor position based on the multilayer perceptron model. An embedded graphical interface is used during the training session to generate user profiles that make up these unique ANN configurations. The results with 12 subjects in test 1, which involved following a moving target, showed an average jitter reduction of 35%; the results with 9 subjects in test 2, which involved following the contour of a square object, showed an average jitter reduction of 53%. For both results, the outcomes led to trajectories that were significantly smoother and apt at reaching fixed or moving targets with relative ease and within a 5% error margin or deviation from desired trajectories. The positive effects of such jitter reduction are presented graphically for visual appreciation.
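A minimal sketch of the jitter-reduction idea, assuming a small multilayer perceptron trained to map a short window of raw gaze samples to a de-jittering correction for the newest sample. The window length, network size, simulated trajectory, and the use of the known smooth path as the training target are illustrative assumptions, not the system described above.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Simulated calibration session: a smooth gaze trajectory plus measurement jitter.
t = np.linspace(0, 4 * np.pi, 2000)
smooth = np.column_stack([300 + 200 * np.cos(t), 300 + 150 * np.sin(t)])
raw = smooth + rng.normal(0, 15, smooth.shape)

WINDOW = 5  # number of recent gaze samples fed to the network

# Each input row holds the last WINDOW raw samples; the target is the correction
# (true position minus latest raw sample) that de-jitters the newest sample.
X = np.column_stack([raw[i:len(raw) - WINDOW + i + 1] for i in range(WINDOW)])
latest = raw[WINDOW - 1:]
y = smooth[WINDOW - 1:] - latest

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32,), max_iter=1000, random_state=0))
model.fit(X, y)

smoothed = latest + model.predict(X)
raw_err = np.linalg.norm(latest - smooth[WINDOW - 1:], axis=1).mean()
mlp_err = np.linalg.norm(smoothed - smooth[WINDOW - 1:], axis=1).mean()
print(f"mean error (px): raw {raw_err:.1f}, MLP-smoothed {mlp_err:.1f}")
```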
How young adults with autism spectrum disorder watch and interpret pragmatically complex scenes.
Lönnqvist, Linda; Loukusa, Soile; Hurtig, Tuula; Mäkinen, Leena; Siipo, Antti; Väyrynen, Eero; Palo, Pertti; Laukka, Seppo; Mämmelä, Laura; Mattila, Marja-Leena; Ebeling, Hanna
2017-11-01
The aim of the current study was to investigate subtle characteristics of social perception and interpretation in high-functioning individuals with autism spectrum disorders (ASDs), and to study the relation between watching and interpreting. As a novelty, we used an approach that combined moment-by-moment eye tracking and verbal assessment. Sixteen young adults with ASD and 16 neurotypical control participants watched a video depicting a complex communication situation while their eye movements were tracked. The participants also completed a verbal task with questions related to the pragmatic content of the video. We compared verbal task scores and eye movements between groups, and assessed correlations between task performance and eye movements. Individuals with ASD had more difficulty than the controls in interpreting the video, and during two short moments there were significant group differences in eye movements. Additionally, we found significant correlations between verbal task scores and moment-level eye movement in the ASD group, but not among the controls. We concluded that participants with ASD had slight difficulties in understanding the pragmatic content of the video stimulus and attending to social cues, and that the connection between pragmatic understanding and eye movements was more pronounced for participants with ASD than for neurotypical participants.
Video-Based Eye Tracking to Detect the Attention Shift: A Computer Classroom Context-Aware System
ERIC Educational Resources Information Center
Kuo, Yung-Lung; Lee, Jiann-Shu; Hsieh, Min-Chai
2014-01-01
Eye and head movements evoked in response to obvious visual attention shifts. However, there has been little progress on the causes of absent-mindedness so far. The paper proposes an attention awareness system that captures the conditions regarding the interaction of eye gaze and head pose under various attentional switching in computer classroom.…
Visual Data Mining: An Exploratory Approach to Analyzing Temporal Patterns of Eye Movements
ERIC Educational Resources Information Center
Yu, Chen; Yurovsky, Daniel; Xu, Tian
2012-01-01
Infant eye movements are an important behavioral resource to understand early human development and learning. But the complexity and amount of gaze data recorded from state-of-the-art eye-tracking systems also pose a challenge: how does one make sense of such dense data? Toward this goal, this article describes an interactive approach based on…
Extracting information of fixational eye movements through pupil tracking
NASA Astrophysics Data System (ADS)
Xiao, JiangWei; Qiu, Jian; Luo, Kaiqin; Peng, Li; Han, Peng
2018-01-01
Human eyes are never completely static even when they are fixating a stationary point. These irregular, small movements, which consist of micro-tremors, micro-saccades and drifts, can prevent the fading of the images that enter our eyes. The importance of researching fixational eye movements has been experimentally demonstrated recently. However, the characteristics of fixational eye movements and their roles in visual processing have not been explained clearly, because until now these signals could hardly be extracted completely. In this paper, we developed a new eye movement detection device with a high-speed camera. This device includes a beam splitter mirror, an infrared light source and a high-speed digital video camera with a frame rate of 200 Hz. To avoid the influence of head shaking, we made the device wearable by fixing the camera on a safety helmet. Using this device, pupil-tracking experiments were conducted. By localizing the pupil center and performing spectral analysis, the envelope frequency spectra of micro-saccades, micro-tremors and drifts can be clearly resolved. The experimental results show that the device is feasible and effective and can be applied to further characterization of fixational eye movements.
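A minimal sketch of the analysis described above, assuming a pupil-center trace has already been extracted frame by frame (for example by thresholding and taking the centroid): compute the amplitude spectrum of the position signal at the 200 Hz frame rate to inspect the bands usually attributed to drift and tremor. The band boundaries in the comments are rough literature values, not results from this paper, and the trace is synthetic.

```python
import numpy as np

FS_HZ = 200.0  # camera frame rate reported above

def pupil_position_spectrum(pupil_x: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """One-sided amplitude spectrum of a detrended pupil-center coordinate trace."""
    x = pupil_x - pupil_x.mean()                       # remove DC offset
    spectrum = np.abs(np.fft.rfft(x * np.hanning(x.size))) / x.size
    freqs = np.fft.rfftfreq(x.size, d=1.0 / FS_HZ)
    return freqs, spectrum

# Synthetic fixation trace: slow drift plus a tremor-like component near 80 Hz.
t = np.arange(0, 5, 1 / FS_HZ)
trace = 0.3 * np.sin(2 * np.pi * 0.7 * t) + 0.02 * np.sin(2 * np.pi * 80 * t)
freqs, amp = pupil_position_spectrum(trace)
# Drift energy sits below ~1 Hz; tremor is conventionally reported around 40-100 Hz.
print(freqs[np.argmax(amp)])   # dominant frequency (the 0.7 Hz drift component here)
```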
Wieser, Matthias J; Pauli, Paul; Alpers, Georg W; Mühlberger, Andreas
2009-01-01
The effects of direct and averted gaze on autonomic arousal and gaze behavior in social anxiety were investigated using a new paradigm including animated movie stimuli and eye-tracking methodology. While high, medium, and low socially anxious (HSA vs. MSA vs. LSA) women watched animated movie clips, in which faces responded to the gaze of the participants with either direct or averted gaze, their eye movements, heart rate (HR) and skin conductance responses (SCR) were continuously recorded. Groups did not differ in their gaze behavior concerning direct vs. averted gaze, but high socially anxious women tended to fixate the eye region of the presented face longer than MSA and LSA, respectively. Furthermore, they responded to direct gaze with more pronounced cardiac acceleration. This physiological finding indicates that direct gaze may be a fear-relevant feature for socially anxious individuals in social interaction. However, this seems not to result in gaze avoidance. Future studies should examine the role of gaze direction and its interaction with facial expressions in social anxiety and its consequences for avoidance behavior and fear responses. Additionally, further research is needed to clarify the role of gaze perception in social anxiety.
NASA Astrophysics Data System (ADS)
McIntire, Lindsey K.; McKinley, R. Andy; Goodyear, Chuck; McIntire, John P.
2017-05-01
The purpose of this study is to determine the ability of an eye-tracker to detect changes in vigilance performance compared to the common method of using cerebral blood flow velocities (CBFV). Sixteen subjects completed this study. Each participant performed a 40-minute vigilance task while wearing an eye-tracker and a transcranial Doppler (TCD) on each of four separate days. The results indicate that percentage of eye closure (PERCLOS) measured by the eye-tracker increased as vigilance performance declined, and right CBFV as measured by the TCD decreased as performance declined. The results indicate that PERCLOS (left eye r = -.72, right eye r = -.67) correlated more strongly with changes in performance than CBFV did (r = .54). We conclude that PERCLOS, as measured by a head-worn eye tracking system, may serve as a compelling alternative (or supplemental) indicator of impending or concurrent performance declines in operational settings where sustained attention or vigilance is required. Such head-worn or perhaps even off-body oculometric sensor systems could potentially overcome some of the practical disadvantages inherent with TCD data collection for operational purposes. If portability and discomfort challenges with TCD can be overcome, both TCD and eye tracking might be advantageously combined for even greater performance monitoring than can be offered by any single device.
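PERCLOS, the measure referenced above, is conventionally the proportion of time within a window during which the eyelids cover at least about 80% of the pupil. A sketch of that computation from per-frame eyelid-closure estimates follows; the 60 Hz frame rate, 60 s window, and simulated data are assumptions.

```python
import numpy as np

def perclos(closure_fraction: np.ndarray,
            fs_hz: float = 60.0,
            window_s: float = 60.0,
            closed_thresh: float = 0.8) -> np.ndarray:
    """PERCLOS per consecutive window from per-frame eyelid closure (0 = open, 1 = closed).

    Returns, for each full window, the fraction of frames in which the eyelid
    covered at least `closed_thresh` of the pupil.
    """
    closed = (closure_fraction >= closed_thresh).astype(float)
    win = int(window_s * fs_hz)
    n_windows = len(closed) // win
    return closed[:n_windows * win].reshape(n_windows, win).mean(axis=1)

# Example: 5 minutes of simulated closure data with one prolonged eyelid droop.
rng = np.random.default_rng(2)
sim = rng.uniform(0.0, 0.6, 5 * 60 * 60)        # mostly-open baseline at 60 Hz
sim[9000:9300] = 0.95                            # a 5-second slow eyelid droop
print(perclos(sim))                              # one value per 60 s window
```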
Storyline Visualizations of Eye Tracking of Movie Viewing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balint, John T.; Arendt, Dustin L.; Blaha, Leslie M.
Storyline visualizations offer an approach that promises to capture the spatio-temporal characteristics of individual observers and simultaneously illustrate emerging group behaviors. We develop a visual analytics approach to parsing, aligning, and clustering fixation sequences from eye tracking data. Visualization of the results captures the similarities and differences across a group of observers performing a common task. We apply our storyline approach to visualize gaze patterns of people watching dynamic movie clips. Storylines mitigate some of the shortcomings of existent spatio-temporal visualization techniques and, importantly, continue to highlight individual observer behavioral dynamics.
Vanderwert, Ross E; Westerlund, Alissa; Montoya, Lina; McCormick, Sarah A; Miguel, Helga O; Nelson, Charles A
2015-10-01
Previous studies in infants have shown that face-sensitive components of the ongoing electroencephalogram (the event-related potential, or ERP) are larger in amplitude to negative emotions (e.g., fear, anger) versus positive emotions (e.g., happy). However, it is still unclear whether the negative emotions linked with the face or the negative emotions alone contribute to these amplitude differences. We simultaneously recorded infant looking behaviors (via eye-tracking) and face-sensitive ERPs while 7-month-old infants viewed human faces or animals displaying happy, fear, or angry expressions. We observed that the amplitude of the N290 was greater (i.e., more negative) to angry animals compared to happy or fearful animals; no such differences were obtained for human faces. Eye-tracking data highlighted the importance of the eye region in processing emotional human faces. Infants that spent more time looking to the eye region of human faces showing fearful or angry expressions had greater N290 or P400 amplitudes, respectively. © 2014 Wiley Periodicals, Inc.
GazeParser: an open-source and multiplatform library for low-cost eye tracking and analysis.
Sogo, Hiroyuki
2013-09-01
Eye movement analysis is an effective method for research on visual perception and cognition. However, recordings of eye movements present practical difficulties related to the cost of the recording devices and the programming of device controls for use in experiments. GazeParser is an open-source library for low-cost eye tracking and data analysis; it consists of a video-based eyetracker and libraries for data recording and analysis. The libraries are written in Python and can be used in conjunction with PsychoPy and VisionEgg experimental control libraries. Three eye movement experiments are reported on performance tests of GazeParser. These showed that the means and standard deviations for errors in sampling intervals were less than 1 ms. Spatial accuracy ranged from 0.7° to 1.2°, depending on participant. In gap/overlap tasks and antisaccade tasks, the latency and amplitude of the saccades detected by GazeParser agreed with those detected by a commercial eyetracker. These results showed that the GazeParser demonstrates adequate performance for use in psychological experiments.
Micro-video display with ocular tracking and interactive voice control
NASA Technical Reports Server (NTRS)
Miller, James E.
1993-01-01
In certain space-restricted environments, many of the benefits resulting from computer technology have been foregone because of the size, weight, inconvenience, and lack of mobility associated with existing computer interface devices. Accordingly, an effort to develop a highly miniaturized and 'wearable' computer display and control interface device, referred to as the Sensory Integrated Data Interface (SIDI), is underway. The system incorporates a micro-video display that provides data display and ocular tracking on a lightweight headset. Software commands are implemented by conjunctive eye movement and voice commands of the operator. In this initial prototyping effort, various 'off-the-shelf' components have been integrated with a desktop computer and a customized menu-tree software application to demonstrate feasibility and conceptual capabilities. When fully developed as a customized system, the interface device will allow mobile, 'hands-free' operation of portable computer equipment. It will thus allow integration of information technology applications into those restrictive environments, both military and industrial, that have not yet taken advantage of the computer revolution. This effort is Phase 1 of Small Business Innovative Research (SBIR) Topic number N90-331 sponsored by the Naval Undersea Warfare Center Division, Newport. The prime contractor is Foster-Miller, Inc. of Waltham, MA.
The specificity of attentional biases by type of gambling: An eye-tracking study
Meitner, Amadeus; Sears, Christopher R.
2018-01-01
A growing body of research indicates that gamblers develop an attentional bias for gambling-related stimuli. Compared to research on substance use, however, few studies have examined attentional biases in gamblers using eye-gaze tracking, which has many advantages over other measures of attention. In addition, previous studies of attentional biases in gamblers have not directly matched type of gambler with personally-relevant gambling cues. The present study investigated the specificity of attentional biases for individual types of gambling using an eye-gaze tracking paradigm. Three groups of participants (poker players, video lottery terminal/slot machine players, and non-gambling controls) took part in one test session in which they viewed 25 sets of four images (poker, VLTs/slot machines, bingo, and board games). Participants' eye fixations were recorded throughout each 8-second presentation of the four images. The results indicated that, as predicted, the two gambling groups preferentially attended to their primary form of gambling, whereas control participants attended to board games more than gambling images. The findings have clinical implications for the treatment of individuals with gambling disorder. Understanding the importance of personally-salient gambling cues will inform the development of effective attentional bias modification treatments for problem gamblers. PMID:29385164
Eyes and ears: Using eye tracking and pupillometry to understand challenges to speech recognition.
Van Engen, Kristin J; McLaughlin, Drew J
2018-05-04
Although human speech recognition is often experienced as relatively effortless, a number of common challenges can render the task more difficult. Such challenges may originate in talkers (e.g., unfamiliar accents, varying speech styles), the environment (e.g. noise), or in listeners themselves (e.g., hearing loss, aging, different native language backgrounds). Each of these challenges can reduce the intelligibility of spoken language, but even when intelligibility remains high, they can place greater processing demands on listeners. Noisy conditions, for example, can lead to poorer recall for speech, even when it has been correctly understood. Speech intelligibility measures, memory tasks, and subjective reports of listener difficulty all provide critical information about the effects of such challenges on speech recognition. Eye tracking and pupillometry complement these methods by providing objective physiological measures of online cognitive processing during listening. Eye tracking records the moment-to-moment direction of listeners' visual attention, which is closely time-locked to unfolding speech signals, and pupillometry measures the moment-to-moment size of listeners' pupils, which dilate in response to increased cognitive load. In this paper, we review the uses of these two methods for studying challenges to speech recognition. Copyright © 2018. Published by Elsevier B.V.
Use of a genetic algorithm for the analysis of eye movements from the linear vestibulo-ocular reflex
NASA Technical Reports Server (NTRS)
Shelhamer, M.
2001-01-01
It is common in vestibular and oculomotor testing to use a single-frequency (sine) or combination of frequencies [sum-of-sines (SOS)] stimulus for head or target motion. The resulting eye movements typically contain a smooth tracking component that follows the stimulus, interspersed with rapid eye movements (saccades or fast phases). The parameters of the smooth tracking (the amplitude and phase of each component frequency) are of interest; many methods have been devised that attempt to identify the fast eye movements and separate them from the smooth component. We describe a new approach to this problem, tailored to both single-frequency and sum-of-sines stimulation of the human linear vestibulo-ocular reflex. An approximate derivative is used to identify fast movements, which are then omitted from further analysis. The remaining points form a series of smooth tracking segments. A genetic algorithm is used to fit these segments together to form a smooth (but disconnected) waveform, by iteratively removing biases due to the missing fast phases. A genetic algorithm is an iterative optimization procedure; it provides a basis for extending this approach to more complex stimulus-response situations. In the SOS case, the genetic algorithm estimates the amplitude and phase values of the component frequencies as well as removing biases.
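As a rough illustration of the kind of fit described above, the sketch below uses a simple genetic-style search (truncation selection, blend crossover, Gaussian mutation) to estimate the amplitude and phase of each known stimulus frequency from a desaccaded smooth-tracking trace. It is not the published algorithm: the per-segment bias removal is omitted, and the fitness function, parameter ranges, and all numeric settings are illustrative assumptions.

```python
# Minimal genetic-style fit of sum-of-sines amplitudes and phases (sketch only).
import numpy as np

rng = np.random.default_rng(0)

def model(params, t, freqs):
    # params = [A1, phi1, A2, phi2, ...] for each stimulus frequency (Hz)
    out = np.zeros_like(t)
    for k, f in enumerate(freqs):
        A, phi = params[2 * k], params[2 * k + 1]
        out += A * np.sin(2 * np.pi * f * t + phi)
    return out

def fitness(params, t, y, freqs):
    return -np.sum((y - model(params, t, freqs)) ** 2)  # higher is better

def ga_fit(t, y, freqs, pop_size=60, gens=200, mut=0.1):
    n = 2 * len(freqs)
    pop = rng.uniform(-np.pi, np.pi, size=(pop_size, n))
    for _ in range(gens):
        scores = np.array([fitness(p, t, y, freqs) for p in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]   # truncation selection
        idx_a = rng.integers(0, len(parents), pop_size - len(parents))
        idx_b = rng.integers(0, len(parents), pop_size - len(parents))
        children = 0.5 * (parents[idx_a] + parents[idx_b])         # blend crossover
        children += rng.normal(0.0, mut, children.shape)           # Gaussian mutation
        pop = np.vstack([parents, children])
    return pop[np.argmax([fitness(p, t, y, freqs) for p in pop])]

# Example: recover amplitude/phase of a two-frequency SOS from a noisy "smooth" trace
t = np.linspace(0, 20, 2000)
freqs = [0.2, 0.5]
truth = np.array([1.0, 0.3, 0.6, -1.2])
y = model(truth, t, freqs) + rng.normal(0, 0.05, t.size)
print(ga_fit(t, y, freqs))
```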
Fukushima, Junko; Akao, Teppei; Kurkin, Sergei; Kaneko, Chris R.S.; Fukushima, Kikuro
2006-01-01
In order to see clearly when a target is moving slowly, primates with high acuity foveae use smooth-pursuit and vergence eye movements. The former rotates both eyes in the same direction to track target motion in frontal planes, while the latter rotates the left and right eyes in opposite directions to track target motion in depth. Together, these two systems pursue targets precisely and maintain their images on the foveae of both eyes. During head movements, both systems must interact with the vestibular system to minimize slip of the retinal images. The primate frontal cortex contains two pursuit-related areas: the caudal part of the frontal eye fields (FEF) and the supplementary eye fields (SEF). Evoked potential studies have demonstrated vestibular projections to both areas, and pursuit neurons in both areas respond to vestibular stimulation. The majority of FEF pursuit neurons code parameters of pursuit such as pursuit and vergence eye velocity, gaze velocity, and retinal image motion for target velocity in frontal and depth planes. Moreover, vestibular inputs contribute to the predictive pursuit responses of FEF neurons. In contrast, the majority of SEF pursuit neurons do not code pursuit metrics, and many SEF neurons are reported to be active in more complex tasks. These results suggest that FEF and SEF pursuit neurons are involved in different aspects of vestibular-pursuit interactions and that eye velocity coding of SEF pursuit neurons is specialized for the task condition. PMID:16917164
The Role of Face Familiarity in Eye Tracking of Faces by Individuals with Autism Spectrum Disorders
Dawson, Geraldine; Webb, Sara; Murias, Michael; Munson, Jeffrey; Panagiotides, Heracles; Aylward, Elizabeth
2010-01-01
It has been shown that individuals with autism spectrum disorders (ASD) demonstrate normal activation in the fusiform gyrus when viewing familiar, but not unfamiliar faces. The current study utilized eye tracking to investigate patterns of attention underlying familiar versus unfamiliar face processing in ASD. Eye movements of 18 typically developing participants and 17 individuals with ASD were recorded while passively viewing three face categories: unfamiliar non-repeating faces, a repeating highly familiar face, and a repeating previously unfamiliar face. Results suggest that individuals with ASD do not exhibit more normative gaze patterns when viewing familiar faces. A second task assessed facial recognition accuracy and response time for familiar and novel faces. The groups did not differ on accuracy or reaction times. PMID:18306030
Exploring What’s Missing: What Do Target Absent Trials Reveal About Autism Search Superiority?
Keehn, Brandon; Joseph, Robert M.
2016-01-01
We used eye-tracking to investigate the roles of enhanced discrimination and peripheral selection in superior visual search in autism spectrum disorder (ASD). Children with ASD were faster at visual search than their typically developing peers. However, group differences in performance and eye-movements did not vary with the level of difficulty of discrimination or selection. Rather, consistent with prior ASD research, group differences were mainly the effect of faster performance on target-absent trials. Eye-tracking revealed a lack of left-visual-field search asymmetry in ASD, which may confer an additional advantage when the target is absent. Lastly, ASD symptomatology was positively associated with search superiority, the mechanisms of which may shed light on the atypical brain organization that underlies social-communicative impairment in ASD. PMID:26762114
Lencer, Rebekka; Keedy, Sarah K.; Reilly, James L.; McDonough, Bruce E.; Harris, Margret S. H.; Sprenger, Andreas; Sweeney, John A.
2011-01-01
Visual motion processing and its use for pursuit eye movement control represent a valuable model for studying the use of sensory input for action planning. In psychotic disorders, alterations of visual motion perception have been suggested to cause pursuit eye tracking deficits. We evaluated this system in functional neuroimaging studies of untreated first-episode schizophrenia patients (N=24), psychotic bipolar disorder patients (N=13) and healthy controls (N=20). During a passive visual motion processing task, both patient groups showed reduced activation in the posterior parietal projection fields of motion-sensitive extrastriate area V5, but not in V5 itself. This suggests reduced bottom-up transfer of visual motion information from extrastriate cortex to perceptual systems in parietal association cortex. During active pursuit, activation was enhanced in anterior intraparietal sulcus and insula in both patient groups, and in dorsolateral prefrontal cortex and dorsomedial thalamus in schizophrenia patients. This may result from increased demands on sensorimotor systems for pursuit control due to the limited availability of perceptual motion information about target speed and tracking error. Visual motion information transfer deficits to higher-level association cortex may contribute to well-established pursuit tracking abnormalities, and perhaps to a wider array of alterations in perception and action planning in psychotic disorders. PMID:21873035
ERIC Educational Resources Information Center
Vabalas, Andrius; Freeth, Megan
2016-01-01
The current study investigated whether the amount of autistic traits shown by an individual is associated with viewing behaviour during a face-to-face interaction. The eye movements of 36 neurotypical university students were recorded using a mobile eye-tracking device. High amounts of autistic traits were neither associated with reduced looking…
Stuart, Samuel; Hickey, Aodhán; Galna, Brook; Lord, Sue; Rochester, Lynn; Godfrey, Alan
2017-01-01
Detection of saccades (fast eye-movements) within raw mobile electrooculography (EOG) data involves complex algorithms which typically process data acquired during seated static tasks only. Processing of data during dynamic tasks such as walking is relatively rare and complex, particularly in older adults or people with Parkinson's disease (PD). Development of algorithms that can be easily implemented to detect saccades is required. This study aimed to develop an algorithm for the detection and measurement of saccades in EOG data during static (sitting) and dynamic (walking) tasks, in older adults and PD. Eye-tracking via mobile EOG and an infra-red (IR) eye-tracker (with video) was performed with a group of older adults (n = 10) and PD participants (n = 10) (≥50 years). Horizontal saccades made between targets set 5°, 10° and 15° apart were first measured while seated. Horizontal saccades were then measured while a participant walked and executed a 40° turn left and right. The EOG algorithm was evaluated by comparing the number of correct saccade detections and agreement (ICC(2,1)) between output from visual inspection of eye-tracker videos and the IR eye-tracker. The EOG algorithm detected 75-92% of saccades compared to video inspection and IR output during static testing, with fair to excellent agreement (ICC(2,1) = 0.49-0.93). However, during walking, EOG saccade detection reduced to 42-88% compared to video inspection or IR output, with poor to excellent agreement (ICC(2,1) = 0.13-0.88) between methodologies. The algorithm was robust during seated testing but less so during walking, which was likely due to increased measurement and analysis error with a dynamic task. Future studies may consider a combination of EOG and IR for comprehensive measurement.
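The sketch below illustrates the general family of method the abstract describes: a velocity-threshold saccade detector for a calibrated horizontal EOG trace (differentiate, smooth, threshold, merge supra-threshold runs into events). It is an assumption-laden illustration rather than the published algorithm; the threshold, smoothing window, minimum duration, and the synthetic signal are arbitrary choices.

```python
# Velocity-threshold saccade detection for a 1-D calibrated EOG trace (sketch only).
import numpy as np

def detect_saccades(eog_deg, fs, vel_thresh=30.0, min_dur_ms=10.0):
    """Return (onset_idx, offset_idx, amplitude_deg) tuples for candidate saccades.

    eog_deg    : 1-D gaze angle signal in degrees (calibrated EOG)
    fs         : sampling rate in Hz
    vel_thresh : velocity threshold in deg/s (assumed value)
    """
    vel = np.gradient(eog_deg) * fs                       # deg/s
    vel = np.convolve(vel, np.ones(5) / 5, mode="same")   # light smoothing
    fast = np.abs(vel) > vel_thresh
    edges = np.diff(fast.astype(int))                     # find contiguous runs
    onsets = np.where(edges == 1)[0] + 1
    offsets = np.where(edges == -1)[0] + 1
    if fast[0]:
        onsets = np.r_[0, onsets]
    if fast[-1]:
        offsets = np.r_[offsets, fast.size]
    min_len = int(min_dur_ms / 1000 * fs)
    return [(a, b, eog_deg[b - 1] - eog_deg[a])
            for a, b in zip(onsets, offsets) if (b - a) >= min_len]

# Synthetic example: two 10-degree, 30 ms saccades sampled at 500 Hz
fs = 500
t = np.arange(0, 2, 1 / fs)
ramp = lambda t0: np.clip((t - t0) / 0.03, 0, 1)
sig = 10 * ramp(0.5) + 10 * ramp(1.2)
print(detect_saccades(sig, fs))
```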
Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays
Padmanaban, Nitish; Konrad, Robert; Stramer, Tal; Wetzstein, Gordon
2017-01-01
From the desktop to the laptop to the mobile device, personal computing platforms evolve over time. Moving forward, wearable computing is widely expected to be integral to consumer electronics and beyond. The primary interface between a wearable computer and a user is often a near-eye display. However, current generation near-eye displays suffer from multiple limitations: they are unable to provide fully natural visual cues and comfortable viewing experiences for all users. At their core, many of the issues with near-eye displays are caused by limitations in conventional optics. Current displays cannot reproduce the changes in focus that accompany natural vision, and they cannot support users with uncorrected refractive errors. With two prototype near-eye displays, we show how these issues can be overcome using display modes that adapt to the user via computational optics. By using focus-tunable lenses, mechanically actuated displays, and mobile gaze-tracking technology, these displays can be tailored to correct common refractive errors and provide natural focus cues by dynamically updating the system based on where a user looks in a virtual scene. Indeed, the opportunities afforded by recent advances in computational optics open up the possibility of creating a computing platform in which some users may experience better quality vision in the virtual world than in the real one. PMID:28193871
Attentional biases in body dysmorphic disorder (BDD): Eye-tracking using the emotional Stroop task.
Toh, Wei Lin; Castle, David J; Rossell, Susan L
2017-04-01
Body dysmorphic disorder (BDD) is characterised by repetitive behaviours and/or mental acts occurring in response to preoccupations with perceived defects or flaws in physical appearance. This study aimed to examine attentional biases in BDD via the emotional Stroop task with two modifications: i) incorporating an eye-tracking paradigm, and ii) employing an obsessive-compulsive disorder (OCD) control group. Twenty-one BDD, 19 OCD and 21 healthy control (HC) participants, who were age-, sex-, and IQ-matched, were included. A card version of the emotional Stroop task was employed based on seven 10-word lists: (i) BDD-positive, (ii) BDD-negative, (iii) OCD-checking, (iv) OCD-washing, (v) general positive, (vi) general threat, and (vii) neutral (as baseline). Participants were asked to read aloud words and word colours consecutively, thereby yielding accuracy and latency scores. Eye-tracking parameters were also measured. Participants with BDD exhibited significant Stroop interference for BDD-negative words relative to HC participants, as shown by extended colour-naming latencies. In contrast, the OCD group did not exhibit Stroop interference for OCD-related or general threat words. Only mild eye-tracking anomalies were uncovered in the clinical groups. Inspection of individual scanning styles and fixation heat maps however revealed that viewing strategies adopted by the clinical groups were generally disorganised, with avoidance of certain disorder-relevant words and considerable visual attention devoted to non-salient card regions. The operation of attentional biases to negative disorder-specific words was corroborated in BDD. Future replication studies using other paradigms are vital, given potential ambiguities inherent in emotional Stroop task interpretation. Copyright © 2017 Elsevier Inc. All rights reserved.
Chang, Franklin; Rowland, Caroline; Ferguson, Heather; Pine, Julian
2017-01-01
We used eye-tracking to investigate if and when children show an incremental bias to assume that the first noun phrase in a sentence is the agent (first-NP-as-agent bias) while processing the meaning of English active and passive transitive sentences. We also investigated whether children can override this bias to successfully distinguish active from passive sentences, after processing the remainder of the sentence frame. For this second question we used eye-tracking (Study 1) and forced-choice pointing (Study 2). For both studies, we used a paradigm in which participants simultaneously saw two novel actions with reversed agent-patient relations while listening to active and passive sentences. We compared English-speaking 25-month-olds and 41-month-olds in between-subjects sentence structure conditions (Active Transitive Condition vs. Passive Condition). A permutation analysis found that both age groups showed a bias to incrementally map the first noun in a sentence onto an agent role. Regarding the second question, 25-month-olds showed some evidence of distinguishing the two structures in the eye-tracking study. However, the 25-month-olds did not distinguish active from passive sentences in the forced choice pointing task. In contrast, the 41-month-old children did reanalyse their initial first-NP-as-agent bias to the extent that they clearly distinguished between active and passive sentences both in the eye-tracking data and in the pointing task. The results are discussed in relation to the development of syntactic (re)parsing. PMID:29049390
Perceptions of Harm and Addiction of Snus: An Exploratory Study
Kaufman, Annette R.; Grenen, Emily; Grady, Meredith; Leyva, Bryan; Ferrer, Rebecca A.
2017-01-01
Tobacco companies in the United States are prohibited from making reduced harm claims without filing a modified risk tobacco product application with the Food and Drug Administration and obtaining an order to market as such. However, it is possible that product marketing may suggest reduced risk to individuals. This study examines perceptions, in particular those related to harm and addiction, of snus print advertisements using a combination of eye-tracking, survey, and semistructured interviews. Participants were 22 male smokers ages 19–29 (M = 26.64, SD = 2.92). Five snus advertisements were each displayed for 20 s and eye movements were tracked. Participants responded to questions about harm and addiction after each advertisement and interviews were conducted after seeing all advertisements. For each advertisement, descriptive statistics were calculated and regression analyses predicted harm and addiction perceptions from eye tracking areas of interest (e.g., warning label). Qualitative data were analyzed using inductive/deductive thematic analysis. For certain advertisements, areas of interest were significantly associated with harm and/or addiction perceptions. For example, higher total fixation duration on the graphic in the Smokeless for Smokers advertisement was associated with decreased perceptions of addiction (B = −.360, p = .048). Qualitative themes emerged and in many instances corroborated quantitative results. This study indicates that for some advertisements, attention on certain areas (measured through eye tracking) is associated with perceptions among young male smokers. Understanding how smokers perceive and understand products after viewing advertisements may inform regulations regarding claims about product harm and addiction and may guide public health efforts to educate smokers on the risks of emerging products. PMID:28068113
Placebo effects in spider phobia: an eye-tracking experiment.
Gremsl, Andreas; Schwab, Daniela; Höfler, Carina; Schienle, Anne
2018-01-05
Several eye-tracking studies have revealed that spider phobic patients show a typical hypervigilance-avoidance pattern when confronted with images of spiders. The present experiment investigated if this pattern can be changed via placebo treatment. We conducted an eye-tracking experiment with 37 women with spider phobia. They looked at picture pairs (a spider paired with a neutral picture) for 7 s each in a retest design: once with and once without a placebo pill presented along with the verbal suggestion that it can reduce phobic symptoms. The placebo was labelled as Propranolol, a beta-blocker that has been successfully used to treat spider phobia. In the placebo condition, both the fixation count and the dwell time on the spider pictures increased, especially in the second half of the presentation time. This was associated with a slight decrease in self-reported symptom severity. In summary, we were able to show that a placebo was able to positively influence visual avoidance in spider phobia. This effect might help to overcome apprehension about engaging in exposure therapy, which is present in many phobic patients.
Richmond, Jenny L; Power, Jessica
2014-09-01
Relational memory, or the ability to bind components of an event into a network of linked representations, is a primary function of the hippocampus. Here we extend eye-tracking research showing that infants are capable of forming memories for the relation between arbitrarily paired scenes and faces, by looking at age-related changes in relational memory over the first year of life. Six- and 12-month-old infants were familiarized with pairs of faces and scenes before being tested with arrays of three familiar faces that were presented on a familiar scene. Preferential looking at the face that matches the scene is typically taken as evidence of relational memory. The results showed that while 6-month-olds showed very early preferential looking when face/scene pairs were tested immediately, 12-month-olds did not exhibit evidence of relational memory either immediately or after a short delay. Theoretical implications for the functional development of the hippocampus and practical implications for the use of eye tracking to measure memory during early life are discussed. © 2014 Wiley Periodicals, Inc.
Bilingualism influences inhibitory control in auditory comprehension.
Blumenfeld, Henrike K; Marian, Viorica
2011-02-01
Bilinguals have been shown to outperform monolinguals at suppressing task-irrelevant information. The present study aimed to identify how processing linguistic ambiguity during auditory comprehension may be associated with inhibitory control. Monolinguals and bilinguals listened to words in their native language (English) and identified them among four pictures while their eye-movements were tracked. Each target picture (e.g., hamper) appeared together with a similar-sounding within-language competitor picture (e.g., hammer) and two neutral pictures. Following each eye-tracking trial, priming probe trials indexed residual activation of target words, and residual inhibition of competitor words. Eye-tracking showed similar within-language competition across groups; priming showed stronger competitor inhibition in monolinguals than in bilinguals, suggesting differences in how inhibitory control was used to resolve within-language competition. Notably, correlation analyses revealed that inhibition performance on a nonlinguistic Stroop task was related to linguistic competition resolution in bilinguals but not in monolinguals. Together, monolingual-bilingual comparisons suggest that cognitive control mechanisms can be shaped by linguistic experience. Copyright © 2010 Elsevier B.V. All rights reserved.
Eye Tracking and Pupillometry are Indicators of Dissociable Latent Decision Processes
Cavanagh, James F.; Wiecki, Thomas V.; Kochar, Angad; Frank, Michael J.
2014-01-01
Can you predict what someone is going to do just by watching them? This is certainly difficult: it would require a clear mapping between observable indicators and unobservable cognitive states. In this report we demonstrate how this is possible by monitoring eye gaze and pupil dilation, which predict dissociable biases during decision making. We quantified decision making using the Drift Diffusion Model (DDM), which provides an algorithmic account of how evidence accumulation and response caution contribute to decisions through separate latent parameters of drift rate and decision threshold, respectively. We used a hierarchical Bayesian estimation approach to assess the single trial influence of observable physiological signals on these latent DDM parameters. Increased eye gaze dwell time specifically predicted an increased drift rate toward the fixated option, irrespective of the value of the option. In contrast, greater pupil dilation specifically predicted an increase in decision threshold during difficult decisions. These findings suggest that eye tracking and pupillometry reflect the operations of dissociated latent decision processes. PMID:24548281
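To make the reported relationship concrete, the toy simulation below shows how a drift diffusion process can be biased in the way the abstract describes: a gaze dwell-time asymmetry raises the drift rate toward the fixated option, while pupil dilation raises the decision threshold. This is not the authors' hierarchical Bayesian model; the function names, coefficients, and noise settings are illustrative assumptions.

```python
# Toy DDM with gaze-modulated drift and pupil-modulated threshold (sketch only).
import numpy as np

rng = np.random.default_rng(1)

def simulate_ddm_trial(dwell_bias, pupil_z, v0=0.0, b_gaze=1.5, a0=1.0,
                       b_pupil=0.4, dt=0.001, noise=1.0, t_max=5.0):
    v = v0 + b_gaze * dwell_bias          # drift toward option A (> 0) or B (< 0)
    a = a0 + b_pupil * max(pupil_z, 0.0)  # wider boundaries under larger pupils
    x, t = 0.0, 0.0
    while abs(x) < a and t < t_max:
        x += v * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return ("A" if x >= a else "B" if x <= -a else "timeout"), t

# Strong gaze bias toward A plus large pupils: mostly A choices, slower responses
trials = [simulate_ddm_trial(dwell_bias=0.4, pupil_z=1.5) for _ in range(200)]
print(sum(c == "A" for c, _ in trials) / 200, np.mean([rt for _, rt in trials]))
```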
Eye gaze tracking using correlation filters
NASA Astrophysics Data System (ADS)
Karakaya, Mahmut; Bolme, David; Boehnen, Chris
2014-03-01
In this paper, we studied a method for eye gaze tracking that provides gaze estimation from a standard webcam with a zoom lens and reduces the setup and calibration requirements for new users. Specifically, we have developed a gaze estimation method based on the relative locations of points on the top of the eyelid and the eye corners. The gaze estimation method in this paper is based on the distances between the top point of the eyelid and the eye corners detected by correlation filters. Advanced correlation filters were found to provide facial landmark detections that are accurate enough to determine the subject's gaze direction to within approximately 4-5 degrees, although calibration errors often produce a larger overall shift in the estimates. This corresponds approximately to a circle of diameter 2 inches for a screen that is at arm's length from the subject. At this accuracy it is possible to determine which regions of text or images the subject is looking at, but it falls short of being able to determine which word the subject has looked at.
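One plausible (hypothetical) way to turn such landmark detections into a gaze estimate is sketched below: normalized offsets of the upper-eyelid apex relative to the eye corners are mapped to gaze angles by a least-squares calibration fit. The feature definitions, the linear mapping, and all numbers are assumptions for illustration, not the paper's method.

```python
# Landmark-distance gaze estimation with a simple calibration fit (sketch only).
import numpy as np

def gaze_features(eyelid_top, corner_left, corner_right):
    eyelid_top, corner_left, corner_right = map(np.asarray, (eyelid_top, corner_left, corner_right))
    width = np.linalg.norm(corner_right - corner_left)
    mid = 0.5 * (corner_left + corner_right)
    rel = (eyelid_top - mid) / width          # offsets normalised by eye width
    return np.array([rel[0], rel[1], 1.0])    # horizontal, vertical offset + bias term

def calibrate(features, gaze_angles_deg):
    # least-squares fit: angles ~ F @ W, with W a 3x2 matrix (horizontal/vertical gaze)
    W, *_ = np.linalg.lstsq(np.vstack(features), np.asarray(gaze_angles_deg), rcond=None)
    return W

def estimate(W, eyelid_top, corner_left, corner_right):
    return gaze_features(eyelid_top, corner_left, corner_right) @ W

# Calibration with three known fixation targets (toy pixel coordinates and angles)
feats = [gaze_features((52, 38), (40, 45), (64, 45)),
         gaze_features((55, 37), (40, 45), (64, 45)),
         gaze_features((52, 35), (40, 45), (64, 45))]
angles = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]
W = calibrate(feats, angles)
print(estimate(W, (54, 36), (40, 45), (64, 45)))
```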
Patt, Virginie M; Thomas, Michael L; Minassian, Arpi; Geyer, Mark A; Brown, Gregory G; Perry, William
2014-01-01
The neurocognitive processes involved during classic spatial working memory (SWM) assessment were investigated by examining naturally preferred eye movement strategies. Cognitively healthy adult volunteers were tested in a computerized version of the Corsi Block-Tapping Task (a spatial span task requiring the short-term maintenance of a series of locations presented in a specific order) coupled with eye tracking. Modeling analysis was developed to characterize eye-tracking patterns across all task phases, including encoding, retention, and recall. Results revealed a natural preference for local gaze maintenance during both encoding and retention, with fewer than 40% fixated targets. These findings contrasted with the stimulus retracing pattern expected during recall as a result of task demands, with 80% fixated targets. Along with participants' self-reported strategies of mentally "making shapes," these results suggest the involvement of covert attention shifts and higher-order cognitive Gestalt processes during spatial span tasks, challenging the instrument's validity as a single measure of SWM storage capacity.
NASA Astrophysics Data System (ADS)
Claus, Daniel; Reichert, Carsten; Herkommer, Alois
2017-05-01
This paper relates to the improvement of conventional surgical stereo microscopy via the application of digital recording devices and adaptive optics. The research is aimed at improving the working conditions of the surgeon during the operation, such that free head movement is possible. The depth cues known from conventional stereo microscopy in interaction with the human eye's functionality, such as convergence, disparity, angular elevation, parallax, and accommodation, are implemented in a digital recording system via adaptive optomechanical components. Two laterally moving pupil apertures are used as a digital implementation of the eye's vergence and head motion. The natural eye's accommodation is mimicked via the application of a tunable lens. Additionally, another system has been built, which enables tracking the surgeon's eye pupil through a digital displaying stereoscopic microscope to supply the necessary information for steering the recording system. The optomechanical design and experimental results for both systems, the digital recording stereoscopic microscope and the pupil tracking system, are shown.
Gao, Xiao; Deng, Xiao; Yang, Jia; Liang, Shuang; Liu, Jie; Chen, Hong
2014-12-01
Visual attentional bias plays an important role in appearance-related social comparisons. However, owing to limitations of the experimental paradigms or analysis methods used in previous studies, the time course of attentional bias to thin and fat body images among women with body dissatisfaction (BD) has remained unclear. Using a free-viewing task combined with eye movement tracking, and based on event-related analyses of the critical first eye movement events as well as epoch-related analyses of gaze durations, the current study investigated different attentional bias components to body shape/part images during a 15-s presentation among 34 high-BD and 34 non-BD young women. In comparison to the controls, women with BD showed sustained maintenance biases on thin and fat body images during both early automatic and late strategic processing stages. This study highlights a clear need for research on the dynamics of attentional biases related to body image and eating disturbances. Copyright © 2014 Elsevier Ltd. All rights reserved.
Kelly, Michael
2017-05-15
This technical report details the results of an uncontrolled study of EyeGuide Focus, a 10-second concussion management tool that relies on eye tracking to determine potential impairment of visual attention, often an indicator of mild traumatic brain injury (mTBI). Essentially, people who can keep steady and accurate visual attention on a moving object in their environment are likely to be unimpaired. However, if, after a potential mTBI event, subjects cannot keep attention on a moving object in the normal way demonstrated on their previous healthy baseline tests, this may indicate possible neurological impairment. Now deployed at multiple locations across the United States, Focus (EyeGuide, Lubbock, Texas, United States) has to date recorded more than 4,000 test scores. Our analysis of these results shows the promise of Focus as a low-cost, ocular-based impairment test for assessing potential neurological impairment caused by mTBI in subjects ages eight and older. PMID:28630809
Jordan, Kirsten; Fromberger, Peter; Laubinger, Helge; Dechent, Peter; Müller, Jürgen L
2014-05-17
Antiandrogen therapy (ADT) has been used for 30 years to treat pedophilic patients. The aim of the treatment is a reduction in sexual drive and, in consequence, a reduced risk of recidivism. Yet the therapeutic success of antiandrogens is uncertain, especially regarding recidivism. Meta-analyses and reviews report only moderate and often mutually inconsistent effects. Based on the case of a 47-year-old exclusively pedophilic forensic inpatient, we examined the effectiveness of a new eye tracking method and a new functional magnetic resonance imaging (fMRI) design in regard to the evaluation of ADT in pedophiles. We analyzed the potential of these methods in exploring the impact of ADT on automatic and controlled attentional processes in pedophiles. Eye tracking and fMRI measures were conducted before the initial ADT as well as four months after the onset of ADT. The patient simultaneously viewed an image of a child and an image of an adult while eye movements were measured. During the fMRI measure the same stimuli were presented subliminally. Eye movements demonstrated that controlled attentional processes change under ADT, whereas automatic processes remained mostly unchanged. We assume that these results reflect either the increased ability of the patient to control his eye movements while viewing prepubertal stimuli or his better ability to manipulate his answer in a socially desirable manner. Unchanged automatic attentional processes could reflect the stable pedophilic preference of the patient. Using fMRI, the subliminal presentation of sexually relevant stimuli led to changed activation patterns under the influence of ADT in occipital and parietal brain regions, the hippocampus, and also in the orbitofrontal cortex. We suggest that even at an unconscious level ADT can lead to changed processing of sexually relevant stimuli, reflecting changes of cognitive and perceptive automatic processes. We are convinced that our experimental designs using eye tracking and fMRI could prospectively provide valuable additional information in the evaluation of ADT in paraphilic patients and sex offenders. However, given the limited significance of this single case study, these first results are preliminary and further studies have to be conducted with healthy subjects and patients. PMID:24885644
Evaluating Silent Reading Performance with an Eye Tracking System in Patients with Glaucoma
Murata, Noriaki; Fukuchi, Takeo
2017-01-01
Objective To investigate the relationship between silent reading performance and visual field defects in patients with glaucoma using an eye tracking system. Methods Fifty glaucoma patients (Group G; mean age, 52.2 years, standard deviation: 11.4 years) and 20 normal controls (Group N; mean age, 46.9 years; standard deviation: 17.2 years) were included in the study. All participants in Group G had early to advanced glaucomatous visual field defects but better than 20/20 visual acuity in both eyes. Participants silently read Japanese articles written horizontally while the eye tracking system monitored and calculated reading duration per 100 characters, number of fixations per 100 characters, and mean fixation duration, which were compared with mean deviation and visual field index values from Humphrey visual field testing (24–2 and 10–2 Swedish interactive threshold algorithm standard) of the right versus left eye and the better versus worse eye. Results There was a statistically significant difference between Groups G and N in mean fixation duration (G, 233.4 msec; N, 215.7 msec; P = 0.010). Within Group G, significant correlations were observed between reading duration and 24–2 right mean deviation (rs = -0.280, P = 0.049), 24–2 right visual field index (rs = -0.306, P = 0.030), 24–2 worse visual field index (rs = -0.304, P = 0.032), and 10–2 worse mean deviation (rs = -0.326, P = 0.025). Significant correlations were observed between mean fixation duration and 10–2 left mean deviation (rs = -0.294, P = 0.045) and 10–2 worse mean deviation (rs = -0.306, P = 0.037), respectively. Conclusions The severity of visual field defects may influence some aspects of reading performance. At least concerning silent reading, the visual field of the worse eye is an essential element of smoothness of reading. PMID:28095478
Marx, Svenja; Respondek, Gesine; Stamelou, Maria; Dowiasch, Stefan; Stoll, Josef; Bremmer, Frank; Oertel, Wolfgang H.; Höglinger, Günter U.; Einhäuser, Wolfgang
2012-01-01
Background: The decreased ability to carry out vertical saccades is a key symptom of Progressive Supranuclear Palsy (PSP). Objective measurement devices can help to reliably detect subtle eye movement disturbances to improve sensitivity and specificity of the clinical diagnosis. The present study aims at transferring findings from restricted stationary video-oculography (VOG) to a wearable head-mounted device, which can be readily applied in clinical practice. Methods: We investigated the eye movements in 10 possible or probable PSP patients, 11 Parkinson's disease (PD) patients, and 10 age-matched healthy controls (HCs) using a mobile, gaze-driven video camera setup (EyeSeeCam). Ocular movements were analyzed during a standardized fixation protocol and in an unrestricted real-life scenario while walking along a corridor. Results: The EyeSeeCam detected prominent impairment of both saccade velocity and amplitude in PSP patients, differentiating them from PD and HCs. Differences were particularly evident for saccades in the vertical plane, and stronger for saccades than for other eye movements. Differences were more pronounced during the standardized protocol than in the real-life scenario. Conclusions: Combined analysis of saccade velocity and saccade amplitude during the fixation protocol with the EyeSeeCam provides a simple, rapid (<20 s), and reliable tool to differentiate clinically established PSP patients from PD and HCs. As such, our findings prepare the ground for using wearable eye-tracking in patients with uncertain diagnoses. PMID:23248593
Naicker, Preshanta; Anoopkumar-Dukie, Shailendra; Grant, Gary D; Modenese, Luca; Kavanagh, Justin J
2017-02-01
Anticholinergic medications largely exert their effects due to actions on the muscarinic receptor, which mediates the functions of acetylcholine in the peripheral and central nervous systems. In the central nervous system, acetylcholine plays an important role in the modulation of movement. This study investigated the effects of over-the-counter medications with varying degrees of central anticholinergic properties on fixation stability, saccadic response time and the dynamics associated with this eye movement during a temporally-cued visual reaction time task, in order to establish the significance of central cholinergic pathways in influencing eye movements during reaction time tasks. Twenty-two participants were recruited into the placebo-controlled, human double-blind, four-way crossover investigation. Eye tracking technology recorded eye movements while participants reacted to visual stimuli following temporally informative and uninformative cues. The task was performed pre-ingestion as well as 0.5 and 2 h post-ingestion of promethazine hydrochloride (strong centrally acting anticholinergic), hyoscine hydrobromide (moderate centrally acting anticholinergic), hyoscine butylbromide (anticholinergic devoid of central properties) and a placebo. Promethazine decreased fixation stability during the reaction time task. In addition, promethazine was the only drug to increase saccadic response time during temporally informative and uninformative cued trials, whereby effects on response time were more pronounced following temporally informative cues. Promethazine also decreased saccadic amplitude and increased saccadic duration during the temporally-cued reaction time task. Collectively, the results of the study highlight the significant role that central cholinergic pathways play in the control of eye movements during tasks that involve stimulus identification and motor responses following temporal cues.
Real-time tracking of visually attended objects in virtual environments and its application to LOD.
Lee, Sungkil; Kim, Gerard Jounghyun; Choi, Seungmoon
2009-01-01
This paper presents a real-time framework for computationally tracking objects visually attended by the user while navigating in interactive virtual environments. In addition to the conventional bottom-up (stimulus-driven) saliency map, the proposed framework uses top-down (goal-directed) contexts inferred from the user's spatial and temporal behaviors, and identifies the most plausibly attended objects among candidates in the object saliency map. The computational framework was implemented on the GPU, exhibiting high computational performance adequate for interactive virtual environments. A user experiment was also conducted to evaluate the prediction accuracy of the tracking framework by comparing objects regarded as visually attended by the framework to actual human gaze collected with an eye tracker. The results indicated that the accuracy was at a level well supported by the theory of human cognition for visually identifying single and multiple attentive targets, especially owing to the addition of top-down contextual information. Finally, we demonstrate how the visual attention tracking framework can be applied to managing the level of detail in virtual environments, without any hardware for head or eye tracking.
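A hedged sketch of the general scheme described above is given below: each candidate object's bottom-up saliency is modulated by simple top-down factors derived from the user's spatial and temporal behaviour (angular distance from the view direction and recency of attention). The factor definitions, weights, and field names are illustrative assumptions, not the paper's implementation.

```python
# Combining bottom-up saliency with top-down behavioural context (sketch only).
import math

def attention_score(obj, now, w_sal=1.0, w_view=1.0, w_recency=0.5):
    view = math.exp(-obj["angle_from_view_deg"] / 15.0)        # spatial context
    recency = math.exp(-(now - obj["last_attended_s"]) / 5.0)  # temporal context
    return w_sal * obj["saliency"] + w_view * view + w_recency * recency

objects = [
    {"name": "lamp", "saliency": 0.9, "angle_from_view_deg": 40.0, "last_attended_s": 0.0},
    {"name": "door", "saliency": 0.4, "angle_from_view_deg": 3.0,  "last_attended_s": 9.0},
]
now = 10.0
# The object with the highest combined score is taken as the plausibly attended one
print(max(objects, key=lambda o: attention_score(o, now))["name"])
```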
Effects of anger and sadness on attentional patterns in decision making: an eye-tracking study.
Xing, Cai
2014-02-01
Past research examining the effect of anger and sadness on decision making has associated anger with a relatively more heuristic decision-making approach. However, it is unclear whether angry and sad individuals differ while attending to decision-relevant information. An eye-tracking experiment (N=87) was conducted to examine the role of attention in links between emotion and decision making. Angry individuals looked more and earlier toward heuristic cues while making decisions, whereas sad individuals did not show such bias. Implications for designing persuasive messages and studying motivated visual processing were discussed.
Oculomotor Behavior Metrics Change According to Circadian Phase and Time Awake
NASA Technical Reports Server (NTRS)
Flynn-Evans, Erin E.; Tyson, Terence L.; Cravalho, Patrick; Feick, Nathan; Stone, Leland S.
2017-01-01
There is a need for non-invasive, objective measures to forecast performance impairment arising from sleep loss and circadian misalignment, particularly in safety-sensitive occupations. Eye-tracking devices have been used in some operational scenarios, but such devices typically focus on eyelid closures and slow rolling eye movements and are susceptible to the intrusion of head movement artifacts. We hypothesized that an expanded suite of oculomotor behavior metrics, collected during a visual tracking task, would change according to circadian phase and time awake, and could be used as a marker of performance impairment.
Guérard, Katherine; Tremblay, Sébastien; Saint-Aubin, Jean
2009-10-01
Serial memory for spatial locations increases as the distance between successive stimulus locations decreases. This effect, known as the path length effect [Parmentier, F. B. R., Elford, G., & Maybery, M. T. (2005). Transitional information in spatial serial memory: Path characteristics affect recall performance. Journal of Experimental Psychology: Learning, Memory & Cognition, 31, 412-427], was investigated in a systematic manner using eye tracking and interference procedures to explore the mechanisms responsible for the processing of spatial information. In Experiment 1, eye movements were monitored during a spatial serial recall task, in which participants have to remember the locations of spatially and temporally separated dots on the screen. In the experimental conditions, eye movements were suppressed by requiring participants to incessantly move their eyes between irrelevant locations. Ocular suppression abolished the path length effect whether eye movements were prevented during item presentation or during a 7-s retention interval. In Experiment 2, articulatory suppression was combined with a spatial serial recall task. Although articulatory suppression impaired performance, it did not alter the path length effect. Our results suggest that rehearsal plays a key role in serial memory for spatial information, though the effect of path length seems to involve other processes located at encoding, such as the time spent fixating each location and perceptual organization.
Mapping and correcting the influence of gaze position on pupil size measurements
Petrov, Alexander A.
2015-01-01
Pupil size is correlated with a wide variety of important cognitive variables and is increasingly being used by cognitive scientists. Pupil data can be recorded inexpensively and non-invasively by many commonly used video-based eye-tracking cameras. Despite the relative ease of data collection and increasing prevalence of pupil data in the cognitive literature, researchers often underestimate the methodological challenges associated with controlling for confounds that can result in misinterpretation of their data. One serious confound that is often not properly controlled is pupil foreshortening error (PFE)—the foreshortening of the pupil image as the eye rotates away from the camera. Here we systematically map PFE using an artificial eye model and then apply a geometric model correction. Three artificial eyes with different fixed pupil sizes were used to systematically measure changes in pupil size as a function of gaze position with a desktop EyeLink 1000 tracker. A grid-based map of pupil measurements was recorded with each artificial eye across three experimental layouts of the eye-tracking camera and display. Large, systematic deviations in pupil size were observed across all nine maps. The measured PFE was corrected by a geometric model that expressed the foreshortening of the pupil area as a function of the cosine of the angle between the eye-to-camera axis and the eye-to-stimulus axis. The model reduced the root mean squared error of pupil measurements by 82.5 % when the model parameters were pre-set to the physical layout dimensions, and by 97.5 % when they were optimized to fit the empirical error surface. PMID:25953668
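The abstract states the correction explicitly: measured pupil area is foreshortened roughly by the cosine of the angle between the eye-to-camera axis and the eye-to-stimulus axis. A minimal sketch of that cosine correction is given below; the 3-D positions in the example are made up, and the published model additionally fits free parameters to the empirical error surface.

```python
# Cosine correction of pupil foreshortening error (sketch only).
import numpy as np

def pfe_correct(measured_area, eye_pos, camera_pos, stimulus_pos):
    eye, cam, stim = map(np.asarray, (eye_pos, camera_pos, stimulus_pos))
    to_cam = cam - eye
    to_stim = stim - eye
    cos_theta = np.dot(to_cam, to_stim) / (np.linalg.norm(to_cam) * np.linalg.norm(to_stim))
    # measured_area ~ true_area * cos(theta), so divide to approximately undo it
    return measured_area / cos_theta

# Example: gaze 20 cm to the right of a screen-centre camera, both about 60 cm away
print(pfe_correct(measured_area=12.0,
                  eye_pos=(0, 0, 0),
                  camera_pos=(0, -5, 60),
                  stimulus_pos=(20, 0, 60)))
```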
Hand-eye calibration for rigid laparoscopes using an invariant point.
Thompson, Stephen; Stoyanov, Danail; Schneider, Crispin; Gurusamy, Kurinchi; Ourselin, Sébastien; Davidson, Brian; Hawkes, David; Clarkson, Matthew J
2016-06-01
Laparoscopic liver resection has significant advantages over open surgery due to less patient trauma and faster recovery times, yet it can be difficult due to the restricted field of view and lack of haptic feedback. Image guidance provides a potential solution but one current challenge is in accurate "hand-eye" calibration, which determines the position and orientation of the laparoscope camera relative to the tracking markers. In this paper, we propose a simple and clinically feasible calibration method based on a single invariant point. The method requires no additional hardware, can be constructed by theatre staff during surgical setup, requires minimal image processing and can be visualised in real time. Real-time visualisation allows the surgical team to assess the calibration accuracy before use in surgery. In addition, in the laboratory, we have developed a laparoscope with an electromagnetic tracking sensor attached to the camera end and an optical tracking marker attached to the distal end. This enables a comparison of tracking performance. We have evaluated our method in the laboratory and compared it to two widely used methods, "Tsai's method" and "direct" calibration. The new method is of comparable accuracy to existing methods, and we show RMS projected error due to calibration of 1.95 mm for optical tracking and 0.85 mm for EM tracking, versus 4.13 and 1.00 mm respectively, using existing methods. The new method has also been shown to be workable under sterile conditions in the operating room. We have proposed a new method of hand-eye calibration, based on a single invariant point. Initial experience has shown that the method provides visual feedback, satisfactory accuracy and can be performed during surgery. We also show that an EM sensor placed near the camera would provide significantly improved image overlay accuracy.
A unified dynamic neural field model of goal directed eye movements
NASA Astrophysics Data System (ADS)
Quinton, J. C.; Goffart, L.
2018-01-01
Primates heavily rely on their visual system, which exploits signals of graded precision based on the eccentricity of the target in the visual field. The interactions with the environment involve actively selecting and focusing on visual targets or regions of interest, instead of contemplating an omnidirectional visual flow. Eye-movements specifically allow foveating targets and tracking their motion. Once a target is brought within the central visual field, eye-movements are usually classified into catch-up saccades (jumping from one orientation or fixation to another) and smooth pursuit (continuously tracking a target with low velocity). Building on existing dynamic neural field equations, we introduce a novel model that incorporates internal projections to better estimate the current target location (associated to a peak of activity). Such an estimate is then used to trigger an eye movement, leading to qualitatively different behaviours depending on the dynamics of the whole oculomotor system: (1) fixational eye-movements due to small variations in the weights of projections when the target is stationary, (2) interceptive and catch-up saccades when peaks build and relax on the neural field, (3) smooth pursuit when the peak stabilises near the centre of the field, the system reaching a fixed point attractor. Learning is nevertheless required for tracking a rapidly moving target, and the proposed model thus replicates recent results in the monkey, in which repeated exercise permits the maintenance of the target within the central visual field at its current (here-and-now) location, despite the delays involved in transmitting retinal signals to the oculomotor neurons.
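For readers unfamiliar with the formalism, the class of model the abstract builds on is the Amari-type dynamic neural field, in which a peak of activity u(x, t) over target position encodes the current estimate of the target location; the paper's specific internal-projection terms are not reproduced here.

```latex
% Generic dynamic neural field equation (Amari form); u is field activity over
% (retinotopic) target position x, w a lateral interaction kernel, f a firing-rate
% nonlinearity, I(x,t) the visual input, h the resting level, and tau a time constant.
\tau \,\frac{\partial u(x,t)}{\partial t}
  = -u(x,t) + \int w(x - x')\, f\!\big(u(x',t)\big)\, dx' + I(x,t) + h
```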
A Pilot Study of Horizontal Head and Eye Rotations in Baseball Batting.
Fogt, Nick; Persson, Tyler W
2017-08-01
The purpose of the study was to measure and compare horizontal head and eye tracking movements as baseball batters "took" pitches and swung at baseball pitches. Two former college baseball players were tested in two conditions. A pitching machine was used to project tennis balls toward the subjects. In the first condition, subjects acted as if they were taking (i.e., not swinging) the pitches. In the second condition, subjects attempted to bat the pitched balls. Head movements were measured with an inertial sensor; eye movements were measured with a video eye tracker. For each condition, the relationship between the horizontal head and eye rotations was similar for the two subjects, as were the overall head-, eye-, and gaze-tracking strategies. In the "take" condition, head movements in the direction of the ball were larger than eye movements for much of the pitch trajectory. Large eye movements occurred only late in the pitch trajectory. Gaze was directed near the ball until approximately 150 milliseconds before the ball arrived at the batter, at which time gaze was directed ahead of the ball to a location near that occupied when the ball crosses the plate. In the "swing" condition, head movements in the direction of the ball were larger than eye movements throughout the pitch trajectory. Gaze was directed near the ball until approximately 50 to 60 milliseconds prior to pitch arrival at the batter. Horizontal head rotations were larger than horizontal eye rotations in both the "take" and "swing" conditions. Gaze was directed ahead of the ball late in the pitch trajectory in the "take" condition, whereas gaze was directed near the ball throughout much of the pitch trajectory in the "swing" condition.
NIR tracking assists sports medicine in junior basketball training
NASA Astrophysics Data System (ADS)
Paeglis, Roberts; Bluss, Kristaps; Rudzitis, Andris; Spunde, Andris; Brice, Tamara; Nitiss, Edgars
2011-07-01
We recorded eye movements of eight elite junior basketball players. We hypothesized that a more stable gaze is correlated with a better shot rate. Upon preliminary testing we invited male juniors whose eyes could be reliably tracked in a game situation. To this end, we used a head-mounted, video-based eye tracker. The participants had no record of ocular or other health issues. No significant differences were found between shots made with and without the tracker cap; paired-samples t-tests yielded p = .130 and p = .900 (both > .05) for the far and middle range shots, respectively. The players made 40 shots from common far and middle range locations, 5 and 4 meters respectively for players aged 14 years. As expected, a statistically significant correlation was found between gaze fixation duration (in milliseconds) and the far and middle range shot rates, r = .782, p = .03. Notably, juniors who fixated longer before a shot had a more stable fixation, i.e., a lower gaze dispersion (in tracker screen pixels), r = -.786, p = .02. This finding was augmented by the observation that gaze dispersion while aiming at the basket was lower (i.e., gaze more stable) in those who were more likely to score. We derived a regression equation linking fixation duration to shot success. We advocate infrared eye tracking as a means to monitor player selection and training success.
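As an illustration only, a regression of the kind mentioned above (linking fixation duration to shot success) could be derived as follows; the numbers are placeholders for the example and are not the study's data.

```python
# Fit an illustrative linear regression of shot rate on pre-shot fixation duration.
import numpy as np

fix_ms = np.array([180, 220, 260, 300, 340, 380, 420, 460])        # fixation duration (ms)
shot_rate = np.array([0.35, 0.38, 0.44, 0.47, 0.52, 0.55, 0.58, 0.63])  # made-shot proportion

slope, intercept = np.polyfit(fix_ms, shot_rate, 1)                # least-squares line
print(f"shot_rate = {intercept:.3f} + {slope:.5f} * fixation_ms")
```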
Eye tracking and gating system for proton therapy of orbital tumors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, Dongho; Yoo, Seung Hoon; Moon, Sung Ho
2012-07-15
Purpose: A new motion-based gated proton therapy for the treatment of orbital tumors using a real-time eye-tracking system was designed and evaluated. Methods: We developed our system by image-pattern matching, using a normalized cross-correlation technique with LabVIEW 8.6 and Vision Assistant 8.6 (National Instruments, Austin, TX). To measure the pixel spacing of an image consistently, four different calibration modes, namely point-detection, edge-detection, line-measurement, and manual measurement, were suggested and used. After these methods were applied to proton therapy, gating was performed, and radiation dose distributions were evaluated. Results: Moving phantom verification measurements resulted in errors of less than 0.1 mm for given ranges of translation. Dosimetric evaluation of the beam-gating system versus nongated treatment delivery with a moving phantom shows that while there was only 0.83 mm growth in lateral penumbra for gated radiotherapy, there was 4.95 mm growth in lateral penumbra in the case of nongated exposure. Analysis of the clinical results suggests that average eye movement differs distinctly between patients, measuring 0.44 mm, 0.45 mm, and 0.86 mm for the three patients, respectively. Conclusions: The developed automatic eye-tracking-based beam-gating system enabled us to perform high-precision proton radiotherapy of orbital tumors.
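Since the abstract names normalized cross-correlation pattern matching as the core of the tracking, the sketch below shows a bare-bones NCC template search over a frame, plus a tolerance check of the kind a gating signal could use. The pure-NumPy implementation, the toy image, and the gating tolerance are illustrative assumptions, not the clinical LabVIEW implementation.

```python
# Normalized cross-correlation template search with a simple gating check (sketch only).
import numpy as np

def ncc_match(frame, template):
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best, best_pos = -2.0, (0, 0)
    for y in range(frame.shape[0] - th + 1):
        for x in range(frame.shape[1] - tw + 1):
            patch = frame[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            score = (p * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best

def within_gate(pos, ref_pos, tol_px):
    # Beam-on only while the tracked eye stays within a tolerance of the planned position
    return abs(pos[0] - ref_pos[0]) <= tol_px and abs(pos[1] - ref_pos[1]) <= tol_px

# Toy frame: a small Gaussian blob that the template should relocate
yy, xx = np.mgrid[0:5, 0:5]
blob = np.exp(-((yy - 2) ** 2 + (xx - 2) ** 2) / 2.0)
frame = np.zeros((40, 40))
frame[12:17, 20:25] = blob
pos, score = ncc_match(frame, blob)
print(pos, score, within_gate(pos, ref_pos=(12, 20), tol_px=1))
```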
Unification of automatic target tracking and automatic target recognition
NASA Astrophysics Data System (ADS)
Schachter, Bruce J.
2014-06-01
The subject being addressed is how an automatic target tracker (ATT) and an automatic target recognizer (ATR) can be fused together so tightly and so well that their distinctiveness becomes lost in the merger. This has historically not been the case outside of biology and a few academic papers. The biological model of ATT∪ATR arises from dynamic patterns of activity distributed across many neural circuits and structures (including the retina). The information that the brain receives from the eyes is "old news" at the time that it receives it. The eyes and brain forecast a tracked object's future position, rather than relying on received retinal position. Anticipation of the next moment, building up a consistent perception, is accomplished under difficult conditions: motion (eyes, head, body, scene background, target) and processing limitations (neural noise, delays, eye jitter, distractions). Not only does the human vision system surmount these problems, but it has innate mechanisms to exploit motion in support of target detection and classification. Biological vision doesn't normally operate on snapshots. Feature extraction, detection and recognition are spatiotemporal. When vision is viewed as a spatiotemporal process, target detection, recognition, tracking, event detection and activity recognition do not seem as distinct as they are in current ATT and ATR designs. They appear as similar mechanisms taking place at varying time scales. A framework is provided for unifying ATT and ATR.
Investigation of visual fatigue/discomfort generated by S3D video using eye-tracking data
NASA Astrophysics Data System (ADS)
Iatsun, Iana; Larabi, Mohamed-Chaker; Fernandez-Maloigne, Christine
2013-03-01
Stereoscopic 3D is undoubtedly one of the most attractive types of content. It has been deployed intensively during the last decade through movies and games. Among the advantages of 3D are the strong involvement of viewers and the increased feeling of presence. However, the health effects that can be generated by 3D are still not precisely known. For example, visual fatigue and visual discomfort are among the symptoms that an observer may feel. In this paper, we propose an investigation of visual fatigue generated by 3D video watching, with the help of eye-tracking. On the one hand, a questionnaire covering the most frequent symptoms linked with 3D is used in order to measure their variation over time. On the other hand, visual characteristics such as pupil diameter, eye movements (fixations and saccades) and eye blinking have been explored thanks to data provided by the eye-tracker. The statistical analysis showed an important link between blinking duration and number of saccades with visual fatigue, while pupil diameter and fixations are not precise enough and are highly dependent on content. Finally, time and content play an important role in the growth of visual fatigue due to 3D watching.
Influence of social presence on eye movements in visual search tasks.
Liu, Na; Yu, Ruifeng
2017-12-01
This study employed an eye-tracking technique to investigate the influence of social presence on eye movements in visual search tasks. A total of 20 male subjects performed visual search tasks in a 2 (target presence: present vs. absent) × 2 (task complexity: complex vs. simple) × 2 (social presence: alone vs. a human audience) within-subject experiment. Results indicated that the presence of an audience could evoke a social facilitation effect on response time in visual search tasks. Compared with working alone, the participants made fewer and shorter fixations, larger saccades and shorter scan path in simple search tasks and more and longer fixations, smaller saccades and longer scan path in complex search tasks when working with an audience. The saccade velocity and pupil diameter in the audience-present condition were larger than those in the working-alone condition. No significant change in target fixation number was observed between two social presence conditions. Practitioner Summary: This study employed an eye-tracking technique to examine the influence of social presence on eye movements in visual search tasks. Results clarified the variation mechanism and characteristics of oculomotor scanning induced by social presence in visual search.
Electronic eye occluder with time-counting and reflection control
NASA Astrophysics Data System (ADS)
Karitans, V.; Ozolinsh, M.; Kuprisha, G.
2008-09-01
In pediatric ophthalmology, 2-3% of all children are affected by a visual pathology, amblyopia. It develops if a clear image is not presented to the retina during an early stage of the development of the visual system. A common way of treating this pathology is to cover the better-seeing eye to force the "lazy" eye to learn to see. However, children are often reluctant to wear such an occluder because they are ashamed or simply because they find it inconvenient. This makes it necessary to track the occlusion regime, because poor occlusion results hint that the actual regime of occlusion is not what the optometrist has recommended. We designed an electronic eye occluder that allows the occlusion regime to be tracked. We employ the real-time clock DS1302, which provides time information from seconds to years. Data are stored in the internal memory (EEPROM) of the microcontroller. The MCU (PIC16F676) switches on only if a mechanical switch is closed and the temperature has reached a satisfactory level. Occlusion is registered between the time points at which the infrared signal appears and disappears.
DOT National Transportation Integrated Search
1971-07-01
A previous CAMI laboratory investigation showed that alcohol impairs the ability of men to suppress vestibular nystagmus while visually fixating on a cockpit instrument, thus degrading visual tracking performance (eye-hand coordination) during angula...