Sample records for face identification eye-tracking

  1. The Role of Face Familiarity in Eye Tracking of Faces by Individuals with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Sterling, Lindsey; Dawson, Geraldine; Webb, Sara; Murias, Michael; Munson, Jeffrey; Panagiotides, Heracles; Aylward, Elizabeth

    2008-01-01

    It has been shown that individuals with autism spectrum disorders (ASD) demonstrate normal activation in the fusiform gyrus when viewing familiar, but not unfamiliar faces. The current study utilized eye tracking to investigate patterns of attention underlying familiar versus unfamiliar face processing in ASD. Eye movements of 18 typically…

  2. Webcam mouse using face and eye tracking in various illumination environments.

    PubMed

    Lin, Yuan-Pin; Chao, Yi-Ping; Lin, Chung-Chih; Chen, Jyh-Horng

    2005-01-01

    Nowadays, thanks to improved computer performance and the widespread availability of webcam devices, it has become possible to capture users' gestures for human-computer interaction with a PC via webcam. However, illumination variation can dramatically decrease the stability and accuracy of skin-based face tracking systems, especially on notebook or portable platforms. In this study we present an effective illumination recognition technique, combining a K-Nearest Neighbor (KNN) classifier with an adaptive skin model, to realize a real-time tracking system. We have demonstrated that the accuracy of face detection based on the KNN classifier is higher than 92% in various illumination environments. In a real-time implementation, the system successfully tracks the user's face and eye features at 15 fps on standard notebook platforms. Although the KNN classifier is initialized with only five environments, the system permits users to define and add their own environments to the classifier for computer access. Finally, based on this efficient tracking algorithm, we have developed a "Webcam Mouse" system that controls the PC cursor using face and eye tracking. Preliminary studies with "point and click" style PC web games also show promising future applications in consumer electronics markets.
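
The KNN-based illumination recognition described in this record can be sketched in a few lines. The environment labels, the two-dimensional feature vectors, and all sample values below are hypothetical stand-ins; the paper's actual feature set is not reproduced here.

```python
import math

# Hypothetical training data: (feature vector, environment label).
# The features are illustrative stand-ins for whatever illumination
# statistics the original system extracts (e.g. mean luminance, color cast).
TRAINING = [
    ((0.15, 0.40), "dim-indoor"),
    ((0.45, 0.42), "office"),
    ((0.70, 0.38), "daylight"),
    ((0.85, 0.55), "backlit"),
    ((0.30, 0.70), "fluorescent"),
]

def knn_classify(feature, training=TRAINING, k=3):
    """Return the majority environment label among the k nearest neighbors."""
    dists = sorted((math.dist(feature, f), label) for f, label in training)
    votes = {}
    for _, label in dists[:k]:
        votes[label] = votes.get(label, 0) + 1
    # Ties resolve to the nearest neighbor's label (first inserted).
    return max(votes, key=votes.get)
```

User-defined environments, as the abstract mentions, would simply be appended to the training list at runtime.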

  3. An eye tracking system for monitoring face scanning patterns reveals the enhancing effect of oxytocin on eye contact in common marmosets.

    PubMed

    Kotani, Manato; Shimono, Kohei; Yoneyama, Toshihiro; Nakako, Tomokazu; Matsumoto, Kenji; Ogi, Yuji; Konoike, Naho; Nakamura, Katsuki; Ikeda, Kazuhito

    2017-09-01

    Eye tracking systems are used to investigate eye position and gaze patterns, presumed to reflect eye contact in humans. Eye contact is a useful biomarker of social communication and is known to be deficient in patients with autism spectrum disorders (ASDs). Interestingly, the same eye tracking systems have been used to directly compare face scanning patterns in some non-human primates to those in humans. Thus, eye tracking is expected to be a useful translational technique for investigating not only social attention and visual interest, but also the effects of psychiatric drugs, such as oxytocin, a neuropeptide that regulates social behavior. In this study, we report a newly established method for eye tracking in common marmosets, New World primates that, like humans, use eye contact as a means of communication. Our investigation aimed to characterize these primates' face scanning patterns and to evaluate the effects of oxytocin on their eye contact behavior. We found that normal common marmosets spend more time viewing the eye region of a marmoset picture than the mouth region or a scrambled picture. In the oxytocin experiment, the change in eyes/face ratio was significantly greater in the oxytocin group than in the vehicle group. Moreover, the oxytocin-induced increase in the change in eyes/face ratio was completely blocked by the oxytocin receptor antagonist L-368,899. These results indicate that eye tracking in common marmosets may be useful for evaluating drug candidates targeting psychiatric conditions, especially ASDs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Event-related potential and eye tracking evidence of the developmental dynamics of face processing.

    PubMed

    Meaux, Emilie; Hernandez, Nadia; Carteau-Martin, Isabelle; Martineau, Joëlle; Barthélémy, Catherine; Bonnet-Brilhault, Frédérique; Batty, Magali

    2014-04-01

    Although the wide neural network and specific processes related to faces have been revealed, the process by which face-processing ability develops remains unclear. An interest in faces appears early in infancy, and developmental findings to date have suggested a long maturation process of the mechanisms involved in face processing. These developmental changes may be supported by the acquisition of more efficient strategies to process faces (theory of expertise) and by the maturation of the face neural network identified in adults. This study aimed to clarify the link between event-related potential (ERP) development in response to faces and the behavioral changes in the way faces are scanned throughout childhood. Twenty-six young children (4-10 years of age) were included in two experimental paradigms, the first exploring ERPs during face processing, the second investigating the visual exploration of faces using an eye-tracking system. The results confirmed significant age-related changes in visual ERPs (P1, N170 and P2). Moreover, an increased interest in the eye region and an attentional shift from the mouth to the eyes were also revealed. The proportion of early fixations on the eye region was correlated with N170 and P2 characteristics, highlighting a link between the development of ERPs and gaze behavior. We suggest that these overall developmental dynamics may be sustained by a gradual, experience-dependent specialization in face processing (i.e. acquisition of face expertise), which produces a more automatic and efficient network associated with effortless identification of faces, and allows the emergence of human-specific social and communication skills. © 2014 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  5. Through the eyes of the own-race bias: eye-tracking and pupillometry during face recognition.

    PubMed

    Wu, Esther Xiu Wen; Laeng, Bruno; Magnussen, Svein

    2012-01-01

    People are generally better at remembering faces of their own race than faces of a different race, an effect known as the own-race bias (ORB). We used eye-tracking and pupillometry to investigate whether Caucasian and Asian face stimuli elicited different patterns of looking in Caucasian participants performing a face-memory task. Consistent with the ORB effect, we found better recognition performance and shorter response times for own-race faces than for other-race faces. In addition, at encoding, eye movements and pupillary responses to Asian faces (i.e., the other race) differed from those to Caucasian faces (i.e., the own race). Processing of own-race faces was characterized by more active scanning, with a larger number of shorter fixations and more frequent saccades. Moreover, pupillary diameters were larger when viewing other-race than own-race faces, suggesting a greater cognitive effort when encoding other-race faces.
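
Fixation counts and durations like those reported in records of this kind are typically derived from raw gaze samples by a fixation-detection pass. Below is a minimal sketch of the standard dispersion-threshold (I-DT) approach; the thresholds and sample data are hypothetical, and the record does not state which algorithm or parameters were actually used.

```python
def dispersion(window):
    """Spatial spread of a window of (x, y) gaze samples: (max-min x) + (max-min y)."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(points, max_disp=1.0, min_len=5):
    """Dispersion-threshold (I-DT) fixation detection.

    points: list of (x, y) gaze samples.
    Returns a list of ((centroid_x, centroid_y), n_samples) fixations.
    """
    fixations = []
    start = 0
    while start + min_len <= len(points):
        end = start + min_len
        if dispersion(points[start:end]) > max_disp:
            start += 1          # window too spread out: advance one sample
            continue
        # Grow the window while the samples stay tightly clustered.
        while end < len(points) and dispersion(points[start:end + 1]) <= max_disp:
            end += 1
        window = points[start:end]
        cx = sum(p[0] for p in window) / len(window)
        cy = sum(p[1] for p in window) / len(window)
        fixations.append(((cx, cy), len(window)))
        start = end
    return fixations
```

Fixation count and mean duration (given the tracker's sampling rate) then fall out directly from the returned list.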

  6. The Role of Face Familiarity in Eye Tracking of Faces by Individuals with Autism Spectrum Disorders

    PubMed Central

    Dawson, Geraldine; Webb, Sara; Murias, Michael; Munson, Jeffrey; Panagiotides, Heracles; Aylward, Elizabeth

    2010-01-01

    It has been shown that individuals with autism spectrum disorders (ASD) demonstrate normal activation in the fusiform gyrus when viewing familiar, but not unfamiliar faces. The current study utilized eye tracking to investigate patterns of attention underlying familiar versus unfamiliar face processing in ASD. Eye movements of 18 typically developing participants and 17 individuals with ASD were recorded while passively viewing three face categories: unfamiliar non-repeating faces, a repeating highly familiar face, and a repeating previously unfamiliar face. Results suggest that individuals with ASD do not exhibit more normative gaze patterns when viewing familiar faces. A second task assessed facial recognition accuracy and response time for familiar and novel faces. The groups did not differ on accuracy or reaction times. PMID:18306030

  7. Eye-Tracking, Autonomic, and Electrophysiological Correlates of Emotional Face Processing in Adolescents with Autism Spectrum Disorder

    PubMed Central

    Wagner, Jennifer B.; Hirsch, Suzanna B.; Vogel-Farley, Vanessa K.; Redcay, Elizabeth; Nelson, Charles A.

    2014-01-01

    Individuals with autism spectrum disorder (ASD) often have difficulty with social-emotional cues. This study examined the neural, behavioral, and autonomic correlates of emotional face processing in adolescents with ASD and typical development (TD) using eye-tracking and event-related potentials (ERPs) across two different paradigms. Scanning of faces was similar across groups in the first task, but the second task found that face-sensitive ERPs varied with emotional expressions only in TD. Further, ASD showed enhanced neural responding to non-social stimuli. In TD only, attention to eyes during eye-tracking related to faster face-sensitive ERPs in a separate task; in ASD, a significant positive association was found between autonomic activity and attention to mouths. Overall, ASD showed an atypical pattern of emotional face processing, with reduced neural differentiation between emotions and a reduced relationship between gaze behavior and neural processing of faces. PMID:22684525

  8. Initial eye movements during face identification are optimal and similar across cultures

    PubMed Central

    Or, Charles C.-F.; Peterson, Matthew F.; Eckstein, Miguel P.

    2015-01-01

    Culture influences not only human high-level cognitive processes but also low-level perceptual operations. Some perceptual operations, such as initial eye movements to faces, are critical for extraction of information supporting evolutionarily important tasks such as face identification. The extent of cultural effects on these crucial perceptual processes is unknown. Here, we report that the first gaze location for face identification was similar across East Asian and Western Caucasian cultural groups: Both fixated a featureless point between the eyes and the nose, with smaller between-group than within-group differences and with a small horizontal difference across cultures (8% of the interocular distance). We also show that individuals of both cultural groups initially fixated at a slightly higher point on Asian faces than on Caucasian faces. The initial fixations were found to be both fundamental in acquiring the majority of information for face identification and optimal, as accuracy deteriorated when observers held their gaze away from their preferred fixations. An ideal observer that integrated facial information with the human visual system's varying spatial resolution across the visual field showed a similar information distribution across faces of both races and predicted initial human fixations. The model consistently replicated the small vertical difference between human fixations to Asian and Caucasian faces but did not predict the small horizontal leftward bias of Caucasian observers. Together, the results suggest that initial eye movements during face identification may be driven by brain mechanisms aimed at maximizing accuracy, and less influenced by culture. The findings increase our understanding of the interplay between the brain's aims to optimally accomplish basic perceptual functions and to respond to sociocultural influences. PMID:26382003

  9. Eye tracking reveals a crucial role for facial motion in recognition of faces by infants

    PubMed Central

    Xiao, Naiqi G.; Quinn, Paul C.; Liu, Shaoying; Ge, Liezhong; Pascalis, Olivier; Lee, Kang

    2015-01-01

    Current knowledge about face processing in infancy comes largely from studies using static face stimuli, but faces that infants see in the real world are mostly moving ones. To bridge this gap, 3-, 6-, and 9-month-old Asian infants (N = 118) were familiarized with either moving or static Asian female faces and then their face recognition was tested with static face images. Eye tracking methodology was used to record eye movements during familiarization and test phases. The results showed a developmental change in eye movement patterns, but only for the moving faces. In addition, the more infants shifted their fixations across facial regions, the better was their face recognition, but only for the moving faces. The results suggest that facial movement influences the way faces are encoded from early in development. PMID:26010387

  10. Eye tracking reveals a crucial role for facial motion in recognition of faces by infants.

    PubMed

    Xiao, Naiqi G; Quinn, Paul C; Liu, Shaoying; Ge, Liezhong; Pascalis, Olivier; Lee, Kang

    2015-06-01

    Current knowledge about face processing in infancy comes largely from studies using static face stimuli, but faces that infants see in the real world are mostly moving ones. To bridge this gap, 3-, 6-, and 9-month-old Asian infants (N = 118) were familiarized with either moving or static Asian female faces, and then their face recognition was tested with static face images. Eye-tracking methodology was used to record eye movements during the familiarization and test phases. The results showed a developmental change in eye movement patterns, but only for the moving faces. In addition, the more infants shifted their fixations across facial regions, the better their face recognition was, but only for the moving faces. The results suggest that facial movement influences the way faces are encoded from early in development. (c) 2015 APA, all rights reserved).

  11. Statistical Analysis of Online Eye and Face-tracking Applications in Marketing

    NASA Astrophysics Data System (ADS)

    Liu, Xuan

    Eye-tracking and face-tracking technology have been widely adopted to study viewers' attention and emotional response. In this dissertation, we apply these two technologies to investigate effective online content designed to attract and direct attention and to engage viewers' emotional responses. In the first part of the dissertation, we conduct a series of experiments that use eye-tracking technology to explore how online models' facial cues affect users' attention on static e-commerce websites. The joint effects on attention of two facial cues, gaze direction and facial expression, are estimated by Bayesian ANOVA, allowing various distributional assumptions. We also consider the similarities and differences in the effects of facial cues between American and Chinese consumers. This study offers insights on how to attract and retain customers' attention for advertisers that use static advertisements on various websites or ad networks. In the second part of the dissertation, we conduct a face-tracking study investigating the relation between participants' emotional responses while watching comedy movie trailers and their intentions to watch the actual movies. Viewers' facial expressions are collected in real time and converted to emotional responses with algorithms based on a facial coding system. To analyze the data, we propose a joint modeling method that links viewers' longitudinal emotion measurements and their watching intentions. This research provides recommendations to filmmakers on how to improve the effectiveness of movie trailers and how to boost audiences' desire to watch the movies.

  12. ATTENTION BIAS OF ANXIOUS YOUTH DURING EXTENDED EXPOSURE OF EMOTIONAL FACE PAIRS: AN EYE-TRACKING STUDY

    PubMed Central

    Shechner, Tomer; Jarcho, Johanna M.; Britton, Jennifer C.; Leibenluft, Ellen; Pine, Daniel S.; Nelson, Eric E.

    2012-01-01

    Background: Previous studies demonstrate that anxiety is characterized by biased attention toward threats, typically measured by differences in motor reaction time to threat and neutral cues. Using eye-tracking methodology, the current study measured attention biases in anxious and nonanxious youth, using unrestricted free viewing of angry, happy, and neutral faces. Methods: Eighteen anxious and 15 nonanxious youth (8–17 years old) passively viewed angry-neutral and happy-neutral face pairs for 10 s while their eye movements were recorded. Results: Anxious youth displayed a greater attention bias toward angry faces than nonanxious youth, and this bias occurred in the earliest phases of stimulus presentation. Specifically, anxious youth were more likely to direct their first fixation to angry faces, and they made faster fixations to angry than neutral faces. Conclusions: Consistent with findings from earlier reaction-time studies, the current study shows that anxious youth, like anxious adults, exhibit biased orienting to threat-related stimuli. This study adds to the existing literature by documenting that threat biases in eye-tracking patterns are manifest at initial attention orienting. PMID:22815254

  13. Dynamic Eye Tracking Based Metrics for Infant Gaze Patterns in the Face-Distractor Competition Paradigm

    PubMed Central

    Ahtola, Eero; Stjerna, Susanna; Yrttiaho, Santeri; Nelson, Charles A.; Leppänen, Jukka M.; Vanhatalo, Sampsa

    2014-01-01

    Objective: To develop new standardized eye-tracking-based measures and metrics for infants' gaze dynamics in the face-distractor competition paradigm. Method: Eye tracking data were collected from two samples of healthy 7-month-old infants (total n = 45), as well as one sample of 5-month-old infants (n = 22), in a paradigm with a picture of a face or a non-face pattern as a central stimulus and a geometric shape as a lateral stimulus. The data were analyzed using conventional measures of infants' initial disengagement from the central to the lateral stimulus (i.e., saccadic reaction time and probability) and, additionally, novel measures reflecting infants' gaze dynamics after the initial disengagement (i.e., cumulative allocation of attention to the central vs. peripheral stimulus). Results: The results showed that the initial saccade away from the centrally presented stimulus is followed by a rapid re-engagement of attention with the central stimulus, leading to a cumulative preference for the central stimulus over the lateral stimulus over time. This pattern tended to be stronger for salient facial expressions than for non-face patterns, was replicable across two independent samples of 7-month-old infants, and differentiated between 7- and 5-month-old infants. Conclusion: The results suggest that eye-tracking-based assessments of infants' cumulative preference for faces over time can be readily parameterized and standardized, and may provide valuable techniques for future studies examining normative developmental changes in preference for social signals. Significance: Standardized measures of early developing face preferences may have the potential to become surrogate biomarkers of neurocognitive and social development. PMID:24845102
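
The disengagement and cumulative-preference measures described above can be approximated from a stream of labeled gaze samples. This is an illustrative sketch only, with an assumed 60 Hz sampling rate and simplified stimulus labels, not the paper's exact parameterization.

```python
def gaze_metrics(samples, rate_hz=60):
    """Compute simple disengagement and cumulative-preference metrics from a
    stream of gaze labels ('central', 'lateral', or None for off-stimulus).

    Returns (saccadic_reaction_time_s, central_preference).
    """
    # Saccadic reaction time: time until gaze first lands on the lateral stimulus.
    srt = next((i / rate_hz for i, s in enumerate(samples) if s == "lateral"), None)
    central = samples.count("central")
    lateral = samples.count("lateral")
    total = central + lateral
    # Cumulative preference: share of stimulus-directed gaze spent on the
    # central (face) stimulus across the whole trial.
    preference = central / total if total else None
    return srt, preference
```

A trial where the infant disengages briefly and then re-engages with the central face yields a short reaction time together with a high central preference, mirroring the pattern the record reports.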

  14. Do Faces Capture the Attention of Individuals with Williams Syndrome or Autism? Evidence from Tracking Eye Movements

    ERIC Educational Resources Information Center

    Riby, Deborah M.; Hancock, Peter J. B.

    2009-01-01

    The neuro-developmental disorders of Williams syndrome (WS) and autism can reveal key components of social cognition. Eye-tracking techniques were applied in two tasks exploring attention to pictures containing faces. Images were (i) scrambled pictures containing faces or (ii) pictures of scenes with embedded faces. Compared to individuals who…

  15. Visual Processing of Faces in Individuals with Fragile X Syndrome: An Eye Tracking Study

    ERIC Educational Resources Information Center

    Farzin, Faraz; Rivera, Susan M.; Hessl, David

    2009-01-01

    Gaze avoidance is a hallmark behavioral feature of fragile X syndrome (FXS), but little is known about whether abnormalities in the visual processing of faces, including disrupted autonomic reactivity, may underlie this behavior. Eye tracking was used to record fixations and pupil diameter while adolescents and young adults with FXS and sex- and…

  16. Technology survey on video face tracking

    NASA Astrophysics Data System (ADS)

    Zhang, Tong; Gomes, Herman Martins

    2014-03-01

    With the pervasiveness of monitoring cameras installed in public areas, schools, hospitals, work places and homes, video analytics technologies for interpreting these video contents are becoming increasingly relevant to people's lives. Among such technologies, human face detection and tracking (and face identification in many cases) are particularly useful in various application scenarios. While plenty of research has been conducted on face tracking and many promising approaches have been proposed, there are still significant challenges in recognizing and tracking people in videos with uncontrolled capturing conditions, largely due to pose and illumination variations, as well as occlusions and cluttered backgrounds. It is especially complex to track and identify multiple people simultaneously in real time due to the large amount of computation involved. In this paper, we present a survey of literature and software published or developed in recent years on the face tracking topic. The survey covers the following topics: 1) mainstream and state-of-the-art face tracking methods, including features used to model the targets and metrics used for tracking; 2) face identification and face clustering from face sequences; and 3) software packages or demonstrations that are available for algorithm development or trial. A number of publicly available databases for face tracking are also introduced.

  17. Eye-Tracking, Autonomic, and Electrophysiological Correlates of Emotional Face Processing in Adolescents with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Wagner, Jennifer B.; Hirsch, Suzanna B.; Vogel-Farley, Vanessa K.; Redcay, Elizabeth; Nelson, Charles A.

    2013-01-01

    Individuals with autism spectrum disorder (ASD) often have difficulty with social-emotional cues. This study examined the neural, behavioral, and autonomic correlates of emotional face processing in adolescents with ASD and typical development (TD) using eye-tracking and event-related potentials (ERPs) across two different paradigms. Scanning of…

  18. Face landmark point tracking using LK pyramid optical flow

    NASA Astrophysics Data System (ADS)

    Zhang, Gang; Tang, Sikan; Li, Jiaquan

    2018-04-01

    LK pyramid optical flow is an effective method for object tracking in video; in this paper, it is used for face landmark point tracking. Seven landmark points are considered: the outer and inner corners of the left eye, the inner and outer corners of the right eye, the tip of the nose, and the left and right corners of the mouth. The landmark points are marked by hand in the first frame, and tracking performance is analyzed over the subsequent frames. Two kinds of conditions are considered: single factors (normalized case; pose variation with slow movement; expression variation; illumination variation; occlusion; frontal face with rapid movement; posed face with rapid movement) and combinations of factors (pose and illumination variation; pose and expression variation; pose variation and occlusion; illumination and expression variation; expression variation and occlusion). Global and local measures are introduced to evaluate tracking performance under the individual factors and their combinations. The global measures comprise the number of images aligned successfully, the average alignment error, and the number of images aligned before failure; the local measures comprise the number of images aligned successfully and the average alignment error for individual components of the face. To test landmark tracking performance under the different cases, experiments were carried out on image sequences gathered by us. Results show that the LK pyramid optical flow method can track face landmark points under the normalized case, expression variation, illumination variation that does not affect facial details, and pose variation, and that the factors and their combinations affect alignment performance differently for different landmark points.
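
The core of the method named above is the per-window Lucas-Kanade step; the pyramid adds coarse-to-fine refinement on top of it. A single-level sketch on synthetic grayscale data follows. In practice one would use a library implementation such as OpenCV's calcOpticalFlowPyrLK; this standalone version is only illustrative.

```python
def lk_flow(img1, img2, x, y, win=2):
    """Estimate the displacement (dx, dy) of point (x, y) between two grayscale
    frames with a single-level Lucas-Kanade step over a (2*win+1)^2 window.
    img1, img2: 2D lists indexed as img[row][col]."""
    sxx = sxy = syy = sxt = syt = 0.0
    for j in range(y - win, y + win + 1):
        for i in range(x - win, x + win + 1):
            ix = (img1[j][i + 1] - img1[j][i - 1]) / 2.0   # horizontal gradient
            iy = (img1[j + 1][i] - img1[j - 1][i]) / 2.0   # vertical gradient
            it = img2[j][i] - img1[j][i]                   # temporal derivative
            sxx += ix * ix; sxy += ix * iy; syy += iy * iy
            sxt += ix * it; syt += iy * it
    det = sxx * syy - sxy * sxy
    if abs(det) < 1e-9:
        return None  # aperture problem: flow not recoverable in this window
    # Solve the 2x2 normal equations [sxx sxy; sxy syy] d = -[sxt; syt].
    dx = -(syy * sxt - sxy * syt) / det
    dy = -(sxx * syt - sxy * sxt) / det
    return dx, dy
```

Tracking a landmark across a sequence amounts to applying this step (pyramidally, and iteratively per level) to each hand-marked point in turn, using the estimate from frame t as the starting point in frame t+1.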

  19. The Importance of the Eye Area in Face Identification Abilities and Visual Search Strategies in Persons with Asperger Syndrome

    ERIC Educational Resources Information Center

    Falkmer, Marita; Larsson, Matilda; Bjallmark, Anna; Falkmer, Torbjorn

    2010-01-01

    Partly claimed to explain social difficulties observed in people with Asperger syndrome, face identification and visual search strategies become important. Previous research findings are, however, disparate. In order to explore face identification abilities and visual search strategies, with special focus on the importance of the eye area, 24…

  20. Emerging applications of eye-tracking technology in dermatology.

    PubMed

    John, Kevin K; Jensen, Jakob D; King, Andy J; Pokharel, Manusheela; Grossman, Douglas

    2018-04-06

    Eye-tracking technology has been used within a multitude of disciplines to provide data linking eye movements to visual processing of various stimuli (e.g., x-rays, situational positioning, printed information, and warnings). Despite the benefits provided by eye-tracking in allowing for the identification and quantification of visual attention, the discipline of dermatology has yet to see broad application of the technology. Notwithstanding dermatologists' heavy reliance upon visual patterns and cues to discriminate between benign and atypical nevi, literature that applies eye-tracking to the study of dermatology is sparse, and literature specific to patient-initiated behaviors, such as skin self-examination (SSE), is largely non-existent. The current article provides a review of eye-tracking research in various medical fields, culminating in a discussion of current applications and advantages of eye-tracking for dermatology research. Copyright © 2018 Japanese Society for Investigative Dermatology. Published by Elsevier B.V. All rights reserved.

  1. Learning optimal eye movements to unusual faces

    PubMed Central

    Peterson, Matthew F.; Eckstein, Miguel P.

    2014-01-01

    Eye movements, which guide the fovea’s high resolution and computational power to relevant areas of the visual scene, are integral to efficient, successful completion of many visual tasks. How humans modify their eye movements through experience with their perceptual environments, and its functional role in learning new tasks, has not been fully investigated. Here, we used a face identification task where only the mouth discriminated exemplars to assess if, how, and when eye movement modulation may mediate learning. By interleaving trials of unconstrained eye movements with trials of forced fixation, we attempted to separate the contributions of eye movements and covert mechanisms to performance improvements. Without instruction, a majority of observers substantially increased accuracy and learned to direct their initial eye movements towards the optimal fixation point. The proximity of an observer’s default face identification eye movement behavior to the new optimal fixation point and the observer’s peripheral processing ability were predictive of performance gains and eye movement learning. After practice in a subsequent condition in which observers were directed to fixate different locations along the face, including the relevant mouth region, all observers learned to make eye movements to the optimal fixation point. In this fully learned state, augmented fixation strategy accounted for 43% of total efficiency improvements while covert mechanisms accounted for the remaining 57%. The findings suggest a critical role for eye movement planning to perceptual learning, and elucidate factors that can predict when and how well an observer can learn a new task with unusual exemplars. PMID:24291712

  2. Tracking the truth: the effect of face familiarity on eye fixations during deception.

    PubMed

    Millen, Ailsa E; Hope, Lorraine; Hillstrom, Anne P; Vrij, Aldert

    2017-05-01

    In forensic investigations, suspects sometimes conceal recognition of a familiar person to protect co-conspirators or hide knowledge of a victim. The current experiment sought to determine whether eye fixations could be used to identify memory of known persons when lying about recognition of faces. Participants' eye movements were monitored whilst they lied and told the truth about recognition of faces that varied in familiarity (newly learned, famous celebrities, personally known). Memory detection by eye movements during recognition of personally familiar and famous celebrity faces was negligibly affected by lying, thereby demonstrating that detection of memory during lies is influenced by the prior learning of the face. By contrast, eye movements did not reveal lies robustly for newly learned faces. These findings support the use of eye movements as markers of memory during concealed recognition but also suggest caution when familiarity is only a consequence of one brief exposure.

  3. Task-irrelevant own-race faces capture attention: eye-tracking evidence.

    PubMed

    Cao, Rong; Wang, Shuzhen; Rao, Congquan; Fu, Jia

    2013-04-01

    To investigate attentional capture by the race of a face, the current study recorded saccade latencies in an inhibition of return (IOR) task. Compared to Caucasian (other-race) faces, Chinese (own-race) faces elicited longer saccade latencies. This phenomenon disappeared when faces were inverted. The results indicate that own-race faces capture attention automatically through high-level configural processing. © 2013 The Authors. Scandinavian Journal of Psychology © 2013 The Scandinavian Psychological Associations.

  4. Biracial and Monoracial Infant Own-Race Face Perception: An Eye Tracking Study

    PubMed Central

    Gaither, Sarah E.; Pauker, Kristin; Johnson, Scott P.

    2012-01-01

    We know early experience plays a crucial role in the development of face processing, but we know little about how infants learn to distinguish faces from different races, especially for non-Caucasian populations. Moreover, it is unknown whether differential processing of different race faces observed in typically-studied monoracial infants extends to biracial infants as well. Thus, we investigated 3-month-old Caucasian, Asian and biracial (Caucasian-Asian) infants’ ability to distinguish Caucasian and Asian faces. Infants completed two within-subject, infant-controlled habituation sequences and test trials as an eye tracker recorded looking times and scanning patterns. Examination of individual differences revealed significant positive correlations between own-race novelty preference and scanning frequency between eye and mouth regions of own-race habituation stimuli for Caucasian and Asian infants, suggesting that facility in own-race face discrimination stems from active inspection of internal facial features in these groups. Biracial infants, however, showed the opposite effect: An “own-race” novelty preference was associated with reduced scanning between eye and mouth regions of “own-race” habituation stimuli, suggesting that biracial infants use a distinct approach to processing frequently encountered faces. Future directions for investigating face processing development in biracial populations are discussed. PMID:23106731

  5. Biracial and monoracial infant own-race face perception: an eye tracking study.

    PubMed

    Gaither, Sarah E; Pauker, Kristin; Johnson, Scott P

    2012-11-01

    We know that early experience plays a crucial role in the development of face processing, but we know little about how infants learn to distinguish faces from different races, especially for non-Caucasian populations. Moreover, it is unknown whether differential processing of different race faces observed in typically studied monoracial infants extends to biracial infants as well. Thus, we investigated 3-month-old Caucasian, Asian and biracial (Caucasian-Asian) infants' ability to distinguish Caucasian and Asian faces. Infants completed two within-subject, infant-controlled habituation sequences and test trials as an eye tracker recorded looking times and scanning patterns. Examination of individual differences revealed significant positive correlations between own-race novelty preference and scanning frequency between eye and mouth regions of own-race habituation stimuli for Caucasian and Asian infants, suggesting that facility in own-race face discrimination stems from active inspection of internal facial features in these groups. Biracial infants, however, showed the opposite effect: An 'own-race' novelty preference was associated with reduced scanning between eye and mouth regions of 'own-race' habituation stimuli, suggesting that biracial infants use a distinct approach to processing frequently encountered faces. Future directions for investigating face processing development in biracial populations are discussed. © 2012 Blackwell Publishing Ltd.

  6. Using eye tracking to test for individual differences in attention to attractive faces

    PubMed Central

    Valuch, Christian; Pflüger, Lena S.; Wallner, Bernard; Laeng, Bruno; Ansorge, Ulrich

    2015-01-01

    We assessed individual differences in visual attention toward faces in relation to their attractiveness via saccadic reaction times. Motivated by the aim to understand individual differences in attention to faces, we tested three hypotheses: (a) Attractive faces hold or capture attention more effectively than less attractive faces; (b) men show a stronger bias toward attractive opposite-sex faces than women; and (c) blue-eyed men show a stronger bias toward blue-eyed than brown-eyed feminine faces. The latter test was included because prior research suggested a high effect size. Our data supported hypotheses (a) and (b) but not (c). By conducting separate tests for disengagement of attention and attention capture, we found that individual differences exist at distinct stages of attentional processing but these differences are of varying robustness and importance. In our conclusion, we also advocate the use of linear mixed effects models as the most appropriate statistical approach for studying inter-individual differences in visual attention with naturalistic stimuli. PMID:25698993

  7. Using eye tracking to test for individual differences in attention to attractive faces.

    PubMed

    Valuch, Christian; Pflüger, Lena S; Wallner, Bernard; Laeng, Bruno; Ansorge, Ulrich

    2015-01-01

    We assessed individual differences in visual attention toward faces in relation to their attractiveness via saccadic reaction times. Motivated by the aim to understand individual differences in attention to faces, we tested three hypotheses: (a) Attractive faces hold or capture attention more effectively than less attractive faces; (b) men show a stronger bias toward attractive opposite-sex faces than women; and (c) blue-eyed men show a stronger bias toward blue-eyed than brown-eyed feminine faces. The latter test was included because prior research suggested a high effect size. Our data supported hypotheses (a) and (b) but not (c). By conducting separate tests for disengagement of attention and attention capture, we found that individual differences exist at distinct stages of attentional processing but these differences are of varying robustness and importance. In our conclusion, we also advocate the use of linear mixed effects models as the most appropriate statistical approach for studying inter-individual differences in visual attention with naturalistic stimuli.

  8. Similarity and Difference in the Processing of Same- and Other-Race Faces as Revealed by Eye Tracking in 4- to 9-Month-Olds

    ERIC Educational Resources Information Center

    Liu, Shaoying; Quinn, Paul C.; Wheeler, Andrea; Xiao, Naiqi; Ge, Liezhong; Lee, Kang

    2011-01-01

    Fixation duration for same-race (i.e., Asian) and other-race (i.e., Caucasian) female faces by Asian infant participants between 4 and 9 months of age was investigated with an eye-tracking procedure. The age range tested corresponded with prior reports of processing differences between same- and other-race faces observed in behavioral looking time…

  9. Brief Report: Patterns of Eye Movements in Face to Face Conversation Are Associated with Autistic Traits--Evidence from a Student Sample

    ERIC Educational Resources Information Center

    Vabalas, Andrius; Freeth, Megan

    2016-01-01

    The current study investigated whether the amount of autistic traits shown by an individual is associated with viewing behaviour during a face-to-face interaction. The eye movements of 36 neurotypical university students were recorded using a mobile eye-tracking device. High amounts of autistic traits were neither associated with reduced looking…

  10. Infants' experience-dependent processing of male and female faces: Insights from eye tracking and event-related potentials

    PubMed Central

    Righi, Giulia; Westerlund, Alissa; Congdon, Eliza L.; Troller-Renfree, Sonya; Nelson, Charles A.

    2013-01-01

    The goal of the present study was to investigate infants’ processing of female and male faces. We used an event-related potential (ERP) priming task, as well as a visual-paired comparison (VPC) eye tracking task, to explore how 7-month-old “female expert” infants differed in their responses to faces of different genders. Female faces elicited larger N290 amplitudes than male faces. Furthermore, infants showed a priming effect for female faces only, whereby the N290 was significantly more negative for novel females compared to primed female faces. The VPC experiment was designed to test whether infants could reliably discriminate between two female and two male faces. Analyses showed that infants were able to differentiate faces of both genders. The results of the present study suggest that 7-month-olds with a large amount of female face experience show a processing advantage for forming a neural representation of female faces, compared to male faces. However, the enhanced neural sensitivity to the repetition of female faces is not due to the infants' inability to discriminate male faces. Instead, the combination of results from the two tasks suggests that the differential processing of female faces may be a signature of expert-level processing. PMID:24200421

  11. Eye-catching odors: olfaction elicits sustained gazing to faces and eyes in 4-month-old infants.

    PubMed

    Durand, Karine; Baudouin, Jean-Yves; Lewkowicz, David J; Goubet, Nathalie; Schaal, Benoist

    2013-01-01

    This study investigated whether an odor can affect infants' attention to visually presented objects and whether it can selectively direct visual gaze at visual targets as a function of their meaning. Four-month-old infants (n = 48) were exposed to their mother's body odors while their visual exploration was recorded with an eye-movement tracking system. Two groups of infants, who were assigned to either an odor condition or a control condition, looked at a scene composed of still pictures of faces and cars. As expected, infants looked longer at the faces than at the cars, but this spontaneous preference for faces was significantly enhanced in the presence of the odor. As expected also, when looking at the face, the infants looked longer at the eyes than at any other facial region, but, again, they looked at the eyes significantly longer in the presence of the odor. Thus, 4-month-old infants are sensitive to the contextual effects of odors while looking at faces. This suggests that early social attention to faces is mediated by visual as well as non-visual cues.
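
    Analyses like the looking-time comparisons above rest on assigning each gaze sample to an area of interest (AOI) and accumulating dwell time. A minimal, illustrative sketch follows; the AOI names and rectangle format are assumptions, not the authors' actual analysis code.

    ```python
    def dwell_times(gaze_samples, aois, dt):
        """Accumulate gaze time per area of interest (AOI).

        gaze_samples: iterable of (x, y) screen coordinates, sampled every dt seconds.
        aois: dict mapping an AOI name to a bounding box (x0, y0, x1, y1).
        Returns total seconds of gaze falling inside each AOI.
        """
        totals = {name: 0.0 for name in aois}
        for x, y in gaze_samples:
            for name, (x0, y0, x1, y1) in aois.items():
                if x0 <= x <= x1 and y0 <= y <= y1:
                    totals[name] += dt
        return totals
    ```

    Comparing, say, a "face" AOI against a "car" AOI across odor and control groups then reduces to comparing these per-condition totals.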

  12. Face in profile view reduces perceived facial expression intensity: an eye-tracking study.

    PubMed

    Guo, Kun; Shaw, Heather

    2015-02-01

    Recent studies measuring the facial expressions of emotion have focused primarily on the perception of frontal face images. As we frequently encounter expressive faces from different viewing angles, a mechanism allowing invariant expression perception would be advantageous to our social interactions. Although a couple of studies have indicated comparable expression categorization accuracy across viewpoints, it is unknown how perceived expression intensity and associated gaze behaviour change across viewing angles. Differences could arise because the diagnostic cues from local facial features for decoding expressions could vary with viewpoint. Here we manipulated the orientation of faces (frontal, mid-profile, and profile view) displaying six common facial expressions of emotion, and measured participants' expression categorization accuracy, perceived expression intensity, and associated gaze patterns. In comparison with frontal faces, profile faces slightly reduced identification rates for disgust and sad expressions, but significantly decreased perceived intensity for all tested expressions. Although viewpoint had a quantitative, expression-specific influence on the proportion of fixations directed at local facial features, the qualitative gaze distribution within facial features (e.g., the eyes tended to attract the highest proportion of fixations, followed by the nose and then the mouth region) was independent of viewpoint and expression type. Our results suggest that viewpoint-invariant facial expression processing is categorical, and could be linked to a viewpoint-invariant holistic gaze strategy for extracting expressive facial cues. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Trustworthy-Looking Face Meets Brown Eyes

    PubMed Central

    Kleisner, Karel; Priplatova, Lenka; Frost, Peter; Flegr, Jaroslav

    2013-01-01

    We tested whether eye color influences perception of trustworthiness. Facial photographs of 40 female and 40 male students were rated for perceived trustworthiness. Eye color had a significant effect, the brown-eyed faces being perceived as more trustworthy than the blue-eyed ones. Geometric morphometrics, however, revealed significant correlations between eye color and face shape. Thus, face shape likewise had a significant effect on perceived trustworthiness but only for male faces, the effect for female faces not being significant. To determine whether perception of trustworthiness was being influenced primarily by eye color or by face shape, we recolored the eyes on the same male facial photos and repeated the test procedure. Eye color now had no effect on perceived trustworthiness. We concluded that although the brown-eyed faces were perceived as more trustworthy than the blue-eyed ones, it was not brown eye color per se that caused the stronger perception of trustworthiness but rather the facial features associated with brown eyes. PMID:23326406

  14. Eye movement identification based on accumulated time feature

    NASA Astrophysics Data System (ADS)

    Guo, Baobao; Wu, Qiang; Sun, Jiande; Yan, Hua

    2017-06-01

    Eye movement is a relatively new feature for biometric recognition; it has several advantages over features such as fingerprints, faces, and irises. It is not merely a static characteristic but a combination of brain activity and muscle behavior, which makes it effective against spoofing attacks. In addition, eye movements can be combined with faces, irises, and other features recorded from the face region in multimodal systems. In this paper, we conduct an exploratory study of eye movement identification, using the eye movement datasets provided by Komogortsev et al. in 2011, with different classification methods. Saccade and fixation times are extracted from the eye movement data as features. Furthermore, a performance analysis was conducted on different classification methods (BP, RBF, and Elman neural networks, and SVM) in order to provide a reference for future research in this field.
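
    The feature extraction described above (saccade and fixation times) can be sketched with a simple velocity-threshold (I-VT) rule. The threshold, sampling rate, and the nearest-centroid baseline below are illustrative assumptions, not the paper's actual pipeline or its classifiers (BP, RBF, Elman, SVM).

    ```python
    import numpy as np

    def extract_durations(x, y, fs=250.0, vel_thresh=30.0):
        """Split a gaze trace into fixation and saccade samples with a simple
        velocity-threshold (I-VT) rule and return total durations in seconds.
        vel_thresh is in the same units as x/y per second (e.g. deg/s)."""
        vx = np.diff(x) * fs
        vy = np.diff(y) * fs
        speed = np.hypot(vx, vy)       # sample-to-sample gaze speed
        saccade = speed > vel_thresh   # fast samples count as saccadic
        dt = 1.0 / fs
        return {
            "fixation_time": float(np.sum(~saccade) * dt),
            "saccade_time": float(np.sum(saccade) * dt),
        }

    def nearest_centroid_predict(train_feats, train_labels, test_feat):
        """Assign test_feat to the identity whose mean feature vector is closest."""
        labels = sorted(set(train_labels))
        centroids = {
            lab: np.mean([f for f, l in zip(train_feats, train_labels) if l == lab], axis=0)
            for lab in labels
        }
        return min(labels, key=lambda lab: np.linalg.norm(test_feat - centroids[lab]))
    ```

    In practice, per-subject feature vectors built this way would be fed to the stronger classifiers the paper evaluates.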

  15. Driver face tracking using semantics-based feature of eyes on single FPGA

    NASA Astrophysics Data System (ADS)

    Yu, Ying-Hao; Chen, Ji-An; Ting, Yi-Siang; Kwok, Ngaiming

    2017-06-01

    Tracking the driver's face is essential for driving-safety control. Such systems are usually designed with complicated face-recognition algorithms running on powerful computers. The design problem concerns not only the detection rate but also component damage under harsh conditions of vibration, heat, and humidity. A feasible strategy to counteract this damage is to integrate the entire system into a single chip, minimizing installation size, weight, power consumption, and exposure to air. Meanwhile, a methodology is also indispensable to overcome the dilemma between the low computing capability of a low-end chip and real-time performance. In this paper, a novel driver face tracking system is proposed that employs semantics-based vague image representation (SVIR) for minimal hardware resource usage on an FPGA while guaranteeing real-time performance. Our experimental results indicate that the proposed face tracking system is viable and promising for future smart car designs.

  16. Is the Thatcher Illusion Modulated by Face Familiarity? Evidence from an Eye Tracking Study

    PubMed Central

    2016-01-01

    Thompson (1980) first detected and described the Thatcher Illusion, in which participants instantly perceive an upright face with inverted eyes and mouth as grotesque, but fail to do so when the same face is inverted. One prominent but controversial explanation is that the processing of configural information is disrupted in inverted faces. Studies investigating the Thatcher Illusion have used either famous or non-famous faces. Highly familiar faces are often thought to be processed in a pronounced configural mode, so they seem ideal candidates to be tested against unfamiliar faces in a single Thatcher study, but this has not been addressed so far. In our study, participants evaluated 16 famous and 16 non-famous faces for their grotesqueness. We tested whether familiarity (famous/non-famous faces) modulates reaction times, correctness of grotesqueness assessments (accuracy), and eye movement patterns for the factors orientation (upright/inverted) and Thatcherisation (Thatcherised/non-Thatcherised). On a behavioural level, familiarity effects were only observable via face inversion (higher accuracy and sensitivity for famous compared to non-famous faces) but not via Thatcherisation. Regarding eye movements, however, Thatcherisation influenced the scanning of famous and non-famous faces, for instance, in scanning the mouth region of the presented faces (higher number, duration and dwell time of fixations for famous compared to non-famous faces if Thatcherised). Altogether, famous faces seem to be processed in a more elaborate, more expertise-based way than non-famous faces, whereas non-famous, inverted faces seem to cause difficulties in accurate and sensitive processing. Results are further discussed in light of existing studies of familiar vs. unfamiliar face processing. PMID:27776145

  17. Rotational symmetric HMD with eye-tracking capability

    NASA Astrophysics Data System (ADS)

    Liu, Fangfang; Cheng, Dewen; Wang, Qiwei; Wang, Yongtian

    2016-10-01

    As an important auxiliary function of head-mounted displays (HMDs), eye tracking plays a significant role in intelligent human-machine interaction. In this paper, an eye-tracking HMD system (ET-HMD) is designed based on a rotationally symmetric optical system, with tracking based on the pupil-corneal reflection principle. The ET-HMD system comprises three optical paths for virtual display, infrared illumination, and eye tracking. The display optics is shared by all three paths and consists of four spherical lenses. For the eye-tracking path, an extra imaging lens is added to match the image sensor and achieve eye tracking. The display optics provides users a 40° diagonal FOV with a 0.61″ OLED, a 19 mm eye clearance, and a 10 mm exit pupil diameter. The eye-tracking path can capture a 15 mm × 15 mm area of the user's eye. The average MTF is above 0.1 at 26 lp/mm for the display path and exceeds 0.2 at 46 lp/mm for the eye-tracking path. Eye illumination is simulated in LightTools with an eye model and an 850 nm near-infrared LED (NIR-LED). The simulation results show that the NIR-LED illumination covers the area of the eye model through the display optics, which is sufficient for eye tracking. An integrated optical HMD system with an eye-tracking feature can help improve the user's HMD experience.

  18. Exploring the time course of face matching: temporal constraints impair unfamiliar face identification under temporally unconstrained viewing.

    PubMed

    Ozbek, Müge; Bindemann, Markus

    2011-10-01

    The identification of unfamiliar faces has been studied extensively with matching tasks, in which observers decide if pairs of photographs depict the same person (identity matches) or different people (mismatches). In experimental studies in this field, performance is usually self-paced under the assumption that this will encourage best-possible accuracy. Here, we examined the temporal characteristics of this task by limiting display times and tracking observers' eye movements. Observers were required to make match/mismatch decisions to pairs of faces shown for 200, 500, 1000, or 2000 ms, or for an unlimited duration. Peak accuracy was reached within 2000 ms and two fixations to each face. However, intermixing exposure conditions produced a context effect that generally reduced accuracy on identity mismatch trials, even when unlimited viewing of faces was possible. These findings indicate that less than 2 s are required for face matching when exposure times are variable, but temporal constraints should be avoided altogether if accuracy is truly paramount. The implications of these findings are discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Eye-Catching Odors: Olfaction Elicits Sustained Gazing to Faces and Eyes in 4-Month-Old Infants

    PubMed Central

    Lewkowicz, David J.; Goubet, Nathalie; Schaal, Benoist

    2013-01-01

    This study investigated whether an odor can affect infants' attention to visually presented objects and whether it can selectively direct visual gaze at visual targets as a function of their meaning. Four-month-old infants (n = 48) were exposed to their mother's body odors while their visual exploration was recorded with an eye-movement tracking system. Two groups of infants, who were assigned to either an odor condition or a control condition, looked at a scene composed of still pictures of faces and cars. As expected, infants looked longer at the faces than at the cars, but this spontaneous preference for faces was significantly enhanced in the presence of the odor. As expected also, when looking at the face, the infants looked longer at the eyes than at any other facial region, but, again, they looked at the eyes significantly longer in the presence of the odor. Thus, 4-month-old infants are sensitive to the contextual effects of odors while looking at faces. This suggests that early social attention to faces is mediated by visual as well as non-visual cues. PMID:24015175

  20. Abnormality in face scanning by children with autism spectrum disorder is limited to the eye region: Evidence from multi-method analyses of eye tracking data

    PubMed Central

    Yi, Li; Fan, Yuebo; Quinn, Paul C.; Feng, Cong; Huang, Dan; Li, Jiao; Mao, Guoquan; Lee, Kang

    2012-01-01

    There has been considerable controversy regarding whether children with autism spectrum disorder (ASD) and typically developing (TD) children show different eye movement patterns when processing faces. We investigated ASD and age- and IQ-matched TD children's scanning of faces using a novel multi-method approach. We found that ASD children spent less time looking at the whole face generally. After controlling for this difference, ASD children's fixations on the other face parts, except for the eye region, and their scanning paths between face parts were comparable to either the age-matched or IQ-matched TD group. In contrast, in the eye region, ASD children's scanning differed significantly from that of both TD groups: (a) ASD children fixated significantly less on the right eye (from the observer's view); (b) ASD children's fixations were more biased towards the left eye region; and (c) ASD children fixated below the left eye, whereas TD children fixated on the pupil region of the eye. Thus, ASD children do not have a general abnormality in face scanning. Rather, their abnormality is limited to the eye region, likely due to their strong tendency to avoid eye contact. PMID:23929830

  1. Eye-tracking the own-race bias in face recognition: revealing the perceptual and socio-cognitive mechanisms.

    PubMed

    Hills, Peter J; Pake, J Michael

    2013-12-01

    Own-race faces are recognised more accurately than other-race faces and may even be viewed differently as measured by an eye-tracker (Goldinger, Papesh, & He, 2009). Alternatively, observer race might direct eye-movements (Blais, Jack, Scheepers, Fiset, & Caldara, 2008). Observer differences in eye-movements are likely to be based on experience of the physiognomic characteristics that are differentially discriminating for Black and White faces. Two experiments are reported that employed standard old/new recognition paradigms in which Black and White observers viewed Black and White faces with their eye-movements recorded. Experiment 1 showed that there were observer race differences in terms of the features scanned but observers employed the same strategy across different types of faces. Experiment 2 demonstrated that other-race faces could be recognised more accurately if participants had their first fixation directed to more diagnostic features using fixation crosses. These results are entirely consistent with those presented by Blais et al. (2008) and with the perceptual interpretation that the own-race bias is due to inappropriate attention allocated to the facial features (Hills & Lewis, 2006, 2011). Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Objective eye-gaze behaviour during face-to-face communication with proficient alaryngeal speakers: a preliminary study.

    PubMed

    Evitts, Paul; Gallop, Robert

    2011-01-01

    There is a large body of research demonstrating the impact of visual information on speaker intelligibility in both normal and disordered speaker populations. However, there is minimal information on which specific visual features listeners find salient during conversational discourse. To investigate listeners' eye-gaze behaviour during face-to-face conversation with normal, laryngeal and proficient alaryngeal speakers. Sixty participants individually participated in a 10-min conversation with one of four speakers (typical laryngeal, tracheoesophageal, oesophageal, electrolaryngeal; 15 participants randomly assigned to one mode of speech). All speakers were > 85% intelligible and were judged to be 'proficient' by two certified speech-language pathologists. Participants were fitted with a head-mounted eye-gaze tracking device (Mobile Eye, ASL) that calculated the region of interest and mean duration of eye-gaze. Self-reported gaze behaviour was also obtained following the conversation using a 10 cm visual analogue scale. While listening, participants viewed the lower facial region of the oesophageal speaker more than the normal or tracheoesophageal speaker. Results of non-hierarchical cluster analyses showed that while listening, the pattern of eye-gaze was predominantly directed at the lower face of the oesophageal and electrolaryngeal speaker and more evenly dispersed among the background, lower face, and eyes of the normal and tracheoesophageal speakers. Finally, results show a low correlation between self-reported eye-gaze behaviour and objective regions of interest data. Overall, results suggest similar eye-gaze behaviour when healthy controls converse with normal and tracheoesophageal speakers and that participants had significantly different eye-gaze patterns when conversing with an oesophageal speaker. Results are discussed in terms of existing eye-gaze data and its potential implications on auditory-visual speech perception. 
© 2011 Royal College of Speech & Language Therapists.

  3. Early sensitivity for eyes within faces: a new neuronal account of holistic and featural processing

    PubMed Central

    Nemrodov, Dan; Anderson, Thomas; Preston, Frank F.; Itier, Roxane J.

    2017-01-01

    Eyes are central to face processing; however, their role in early face encoding, as reflected by the N170 ERP component, is unclear. Using eye tracking to enforce fixation on specific facial features, we found that the N170 was larger for fixation on the eyes compared to fixation on the forehead, nasion, nose or mouth, which all yielded similar amplitudes. This eye sensitivity was seen in both upright and inverted faces and was lost in eyeless faces, demonstrating that it was due to the presence of eyes at the fovea. Upright eyeless faces elicited the largest N170 at nose fixation. Importantly, the N170 face inversion effect (FIE) was strongly attenuated in eyeless faces when fixation was on the eyes, less attenuated for nose fixation, and normal when fixation was on the mouth. These results suggest the impact of eye removal on the N170 FIE is a function of the angular distance between the fixated feature and the eye location. We propose the Lateral Inhibition, Face Template and Eye Detector based (LIFTED) model, which accounts for all the present N170 results, including the FIE and its interaction with eye removal. Although eyes elicit the largest N170 response, reflecting the activity of an eye detector, the processing of upright faces is holistic and entails an inhibitory mechanism from neurons coding parafoveal information onto neurons coding foveal information. The LIFTED model provides a neuronal account of the holistic and featural processing involved in upright and inverted faces and offers precise predictions for further testing. PMID:24768932

  4. The Face Perception System becomes Species-Specific at 3 Months: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Di Giorgio, Elisa; Meary, David; Pascalis, Olivier; Simion, Francesca

    2013-01-01

    The current study aimed at investigating own- vs. other-species preferences in 3-month-old infants. The infants' eye movements were recorded during a visual preference paradigm to assess whether they show a preference for own-species faces when contrasted with other-species faces. Human and monkey faces, equated for all low-level perceptual…

  5. Magnetic eye tracking in mice

    PubMed Central

    Payne, Hannah L

    2017-01-01

    Eye movements provide insights about a wide range of brain functions, from sensorimotor integration to cognition; hence, the measurement of eye movements is an important tool in neuroscience research. We describe a method, based on magnetic sensing, for measuring eye movements in head-fixed and freely moving mice. A small magnet was surgically implanted on the eye, and changes in the magnet angle as the eye rotated were detected by a magnetic field sensor. Systematic testing demonstrated high-resolution measurement of eye position (<0.1°). Magnetic eye tracking offers several advantages over the well-established eye coil and video-oculography methods. Most notably, it provides the first method for reliable, high-resolution measurement of eye movements in freely moving mice, revealing increased eye movements and altered binocular coordination compared to head-fixed mice. Overall, magnetic eye tracking provides a lightweight, inexpensive, easily implemented, and high-resolution method suitable for a wide range of applications. PMID:28872455
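
    Converting the raw magnetic sensor reading into an eye angle requires a calibration mapping. The paper's exact procedure is not given in this record; a generic least-squares linear calibration can be sketched as follows (purely illustrative, with made-up calibration pairs):

    ```python
    def calibrate_linear(readings, angles):
        """Fit angle = a * reading + b by least squares from calibration pairs,
        and return the fitted mapping as a callable."""
        n = len(readings)
        mx = sum(readings) / n
        my = sum(angles) / n
        a = sum((x - mx) * (y - my) for x, y in zip(readings, angles)) / \
            sum((x - mx) ** 2 for x in readings)
        b = my - a * mx
        return lambda r: a * r + b
    ```

    Calibration pairs would come from known eye positions, e.g. simultaneous video-oculography during setup.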

  6. An Eye Tracking Investigation of Attentional Biases towards Affect in Young Children

    ERIC Educational Resources Information Center

    Burris, Jessica L.; Barry-Anwar, Ryan A.; Rivera, Susan M.

    2017-01-01

    This study examines attentional biases in the presence of angry, happy and neutral faces using a modified eye tracking version of the dot probe task (DPT). Participants were 111 young children between 9 and 48 months. Children passively viewed an affective attention bias task that consisted of a face pairing (neutral paired with either neutral,…

  7. 3D ocular ultrasound using gaze tracking on the contralateral eye: a feasibility study.

    PubMed

    Afsham, Narges; Najafi, Mohammad; Abolmaesumi, Purang; Rohling, Robert

    2011-01-01

    A gaze-deviated examination of the eye with a 2D ultrasound transducer is a common and informative ophthalmic test; however, the complex task of estimating the pose of the ultrasound images relative to the eye complicates 3D interpretation. To tackle this challenge, a novel system for 3D image reconstruction based on gaze tracking of the contralateral eye has been proposed. The gaze fixates on several target points and, for each fixation, the pose of the examined eye is inferred from the gaze tracking. A single-camera system has been developed for pose estimation combined with subject-specific parameter identification. The ultrasound images are then transformed into the coordinate system of the examined eye to create a 3D volume. The accuracy of the proposed gaze tracking system and the pose estimation of the eye have been validated in a set of experiments. Overall system errors, including pose estimation and calibration, are 3.12 mm and 4.68°.
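
    The reconstruction step described above, transforming each 2D ultrasound image into the coordinate system of the examined eye given a per-fixation pose, amounts to applying a rigid transform. A minimal sketch (the function name and the planar z = 0 embedding of image points are assumptions for illustration):

    ```python
    import numpy as np

    def to_eye_frame(points_img, R, t):
        """Map 2D ultrasound image points (probe frame, embedded in 3D with z = 0)
        into the eye coordinate frame using a rigid pose: rotation matrix R (3x3)
        and translation t (3,), estimated once per fixation."""
        pts = np.column_stack([points_img, np.zeros(len(points_img))])
        return pts @ R.T + t
    ```

    Stacking the transformed slices from several fixations then yields the 3D volume.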

  8. Brief Report: Patterns of Eye Movements in Face to Face Conversation are Associated with Autistic Traits: Evidence from a Student Sample.

    PubMed

    Vabalas, Andrius; Freeth, Megan

    2016-01-01

    The current study investigated whether the amount of autistic traits shown by an individual is associated with viewing behaviour during a face-to-face interaction. The eye movements of 36 neurotypical university students were recorded using a mobile eye-tracking device. High amounts of autistic traits were neither associated with reduced looking to the social partner overall, nor with reduced looking to the face. However, individuals who were high in autistic traits exhibited reduced visual exploration during the face-to-face interaction overall, as demonstrated by shorter and less frequent saccades. Visual exploration was not related to social anxiety. This study suggests that there are systematic individual differences in visual exploration during social interactions and these are related to amount of autistic traits.

  9. Looking to the eyes influences the processing of emotion on face-sensitive event-related potentials in 7-month-old infants.

    PubMed

    Vanderwert, Ross E; Westerlund, Alissa; Montoya, Lina; McCormick, Sarah A; Miguel, Helga O; Nelson, Charles A

    2015-10-01

    Previous studies in infants have shown that face-sensitive components of the ongoing electroencephalogram (the event-related potential, or ERP) are larger in amplitude to negative emotions (e.g., fear, anger) versus positive emotions (e.g., happy). However, it is still unclear whether the negative emotions linked with the face or the negative emotions alone contribute to these amplitude differences. We simultaneously recorded infant looking behaviors (via eye-tracking) and face-sensitive ERPs while 7-month-old infants viewed human faces or animals displaying happy, fear, or angry expressions. We observed that the amplitude of the N290 was greater (i.e., more negative) to angry animals compared to happy or fearful animals; no such differences were obtained for human faces. Eye-tracking data highlighted the importance of the eye region in processing emotional human faces. Infants that spent more time looking to the eye region of human faces showing fearful or angry expressions had greater N290 or P400 amplitudes, respectively. © 2014 Wiley Periodicals, Inc.

  10. Correlations between psychometric schizotypy, scan path length, fixations on the eyes and face recognition.

    PubMed

    Hills, Peter J; Eaton, Elizabeth; Pake, J Michael

    2016-01-01

    Psychometric schizotypy in the general population correlates negatively with face recognition accuracy, potentially due to deficits in inhibition, social withdrawal, or eye-movement abnormalities. We report an eye-tracking face recognition study in which participants were required to match one of two faces (target and distractor) to a cue face presented immediately before. All faces could be presented with or without paraphernalia (e.g., hats, glasses, facial hair). Results showed that paraphernalia distracted participants, and that the most distracting condition was when the cue and the distractor face had paraphernalia but the target face did not. There was no correlation between distractibility and participants' scores on the Schizotypal Personality Questionnaire (SPQ). Schizotypy was negatively correlated with the proportion of time spent fixating on the eyes and positively correlated with not fixating on any feature. It was also negatively correlated with scan path length, which in turn correlated with face recognition accuracy. These results are interpreted as schizotypal traits being associated with a restricted scan path, leading to face recognition deficits.

  11. MR-Compatible Integrated Eye Tracking System

    DTIC Science & Technology

    2016-03-10

    This instrumentation grant was used to purchase a state-of-the-art, high-resolution video eye tracker. Report title: MR-Compatible Integrated Eye Tracking System. Keywords: video eye tracking, eye movements, visual search, camouflage-breaking.

  12. Double attention bias for positive and negative emotional faces in clinical depression: evidence from an eye-tracking study.

    PubMed

    Duque, Almudena; Vázquez, Carmelo

    2015-03-01

    According to cognitive models, attentional biases in depression play key roles in the onset and subsequent maintenance of the disorder. The present study examines the processing of emotional facial expressions (happy, angry, and sad) in depressed and non-depressed adults. Sixteen unmedicated patients with Major Depressive Disorder (MDD) and 34 never-depressed controls (ND) completed an eye-tracking task to assess different components of visual attention (orienting and maintenance of attention) in the processing of emotional faces. Compared to ND participants, participants with MDD showed a negative attentional bias in attentional maintenance indices (i.e. first fixation duration and total fixation time) for sad faces. This attentional bias was positively associated with the severity of depressive symptoms. Furthermore, the MDD group spent marginally less time viewing happy faces than the ND group. No differences were found between the groups with respect to angry faces or orienting attention indices. The current study is limited by its cross-sectional design. These results support the notion that attentional biases in depression are specific to depression-related information and that they operate in later stages of the deployment of attention. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Advances in Eye Tracking in Infancy Research

    ERIC Educational Resources Information Center

    Oakes, Lisa M.

    2012-01-01

    In 2004, McMurray and Aslin edited for "Infancy" a special section on eye tracking. The articles in that special issue revealed the enormous promise of automatic eye tracking with young infants and demonstrated that eye-tracking procedures can provide significant insight into the emergence of cognitive, social, and emotional processing in infancy.…

  14. Before your very eyes: the value and limitations of eye tracking in medical education.

    PubMed

    Kok, Ellen M; Jarodzka, Halszka

    2017-01-01

    Medicine is a highly visual discipline. Physicians from many specialties constantly use visual information in diagnosis and treatment. However, they are often unable to explain how they use this information. Consequently, it is unclear how to train medical students in this visual processing. Eye tracking is a research technique that may offer answers to these open questions, as it enables researchers to investigate such visual processes directly by measuring eye movements. This may help researchers understand the processes that support or hinder a particular learning outcome. In this article, we clarify the value and limitations of eye tracking for medical education researchers. For example, eye tracking can clarify how experience with medical images mediates diagnostic performance and how students engage with learning materials. Furthermore, eye tracking can also be used directly for training purposes by displaying eye movements of experts in medical images. Eye movements reflect cognitive processes, but cognitive processes cannot be directly inferred from eye-tracking data. In order to interpret eye-tracking data properly, theoretical models must always be the basis for designing experiments as well as for analysing and interpreting eye-tracking data. The interpretation of eye-tracking data is further supported by sound experimental design and methodological triangulation. © 2016 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  15. 29 CFR 1926.102 - Eye and face protection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 8 2010-07-01 2010-07-01 false Eye and face protection. 1926.102 Section 1926.102 Labor... § 1926.102 Eye and face protection. (a) General. (1) Employees shall be provided with eye and face protection equipment when machines or operations present potential eye or face injury from physical, chemical...

  16. 29 CFR 1926.102 - Eye and face protection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 8 2011-07-01 2011-07-01 false Eye and face protection. 1926.102 Section 1926.102 Labor... § 1926.102 Eye and face protection. (a) General. (1) Employees shall be provided with eye and face protection equipment when machines or operations present potential eye or face injury from physical, chemical...

  17. Orienting to Eye Gaze and Face Processing

    ERIC Educational Resources Information Center

    Tipples, Jason

    2005-01-01

    The author conducted 7 experiments to examine possible interactions between orienting to eye gaze and specific forms of face processing. Participants classified a letter following either an upright or inverted face with averted, uninformative eye gaze. Eye gaze orienting effects were recorded for upright and inverted faces, irrespective of whether…

  18. Robust Eye Center Localization through Face Alignment and Invariant Isocentric Patterns

    PubMed Central

    Teng, Dongdong; Chen, Dihu; Tan, Hongzhou

    2015-01-01

    The localization of eye centers is a very useful cue for numerous applications such as face recognition, facial expression recognition, and the early screening of neurological pathologies. Several methods relying on available light for accurate eye-center localization have been exploited. However, despite the considerable improvements that eye-center localization systems have undergone in recent years, few of these developments address the challenges posed by non-frontal (profile) faces. In this paper, we first use the explicit shape regression method to obtain a rough location of the eye centers. Because this method extracts global information from the human face, it is robust against changes in the eye region, and we exploit this robustness as a constraint. To locate the eye centers accurately, we employ isophote curvature features, whose accuracy has been demonstrated in a previous study. Applying these features yields a series of candidate eye-center locations; among these, the estimates that minimize the reconstruction error between the two methods are taken as the closest approximation to the eye-center locations. We thus combine explicit shape regression and isophote curvature feature analysis to achieve robustness and accuracy, respectively. In practical experiments, we use the BioID and FERET datasets to test our approach, obtaining accurate eye-center locations while retaining robustness against changes in scale and pose. In addition, we apply our method to non-frontal faces to test its robustness and accuracy, which are essential in gaze estimation but have seldom been addressed in previous works. Through extensive experimentation, we show that the proposed method achieves a significant improvement in accuracy and robustness over state-of-the-art techniques, ranking second in terms of accuracy.
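    The final selection step described in this abstract can be sketched as follows. This is a minimal illustration: interpreting the "reconstruction error" between the two methods as the Euclidean distance from each isophote-curvature candidate to the shape-regression estimate is our assumption, not the paper's exact criterion.

```python
import math

def select_eye_center(regression_estimate, isophote_candidates):
    """Pick, from the isophote-curvature candidate locations, the one
    closest to the rough eye-center estimate produced by explicit shape
    regression. Euclidean distance stands in for the paper's
    'reconstruction error' (an illustrative assumption)."""
    rx, ry = regression_estimate
    return min(isophote_candidates,
               key=lambda c: math.hypot(c[0] - rx, c[1] - ry))
```

    For example, with a regression estimate of (10, 10) and candidates [(0, 0), (9, 11), (30, 30)], the sketch returns (9, 11).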

  19. Using eye movements as an index of implicit face recognition in autism spectrum disorder.

    PubMed

    Hedley, Darren; Young, Robyn; Brewer, Neil

    2012-10-01

    Individuals with an autism spectrum disorder (ASD) typically show impairment on face recognition tasks. Performance has usually been assessed using overt, explicit recognition tasks. Here, a complementary method involving eye tracking was used to examine implicit face recognition in participants with ASD and in an intelligence quotient-matched non-ASD control group. Differences in eye movement indices between target and foil faces were used as an indicator of implicit face recognition. Explicit face recognition was assessed using old-new discrimination and reaction time measures. Stimuli were faces of studied (target) or unfamiliar (foil) persons. Target images at test were either identical to the images presented at study or altered by changing the lighting, pose, or by masking with visual noise. Participants with ASD performed worse than controls on the explicit recognition task. Eye movement-based measures, however, indicated that implicit recognition may not be affected to the same degree as explicit recognition. Autism Res 2012, 5: 363-379. © 2012 International Society for Autism Research, Wiley Periodicals, Inc.

  20. Is eye to eye contact really threatening and avoided in social anxiety?--An eye-tracking and psychophysiology study.

    PubMed

    Wieser, Matthias J; Pauli, Paul; Alpers, Georg W; Mühlberger, Andreas

    2009-01-01

    The effects of direct and averted gaze on autonomic arousal and gaze behavior in social anxiety were investigated using a new paradigm including animated movie stimuli and eye-tracking methodology. While high, medium, and low socially anxious (HSA vs. MSA vs. LSA) women watched animated movie clips, in which faces responded to the gaze of the participants with either direct or averted gaze, their eye movements, heart rate (HR) and skin conductance responses (SCR) were continuously recorded. Groups did not differ in their gaze behavior concerning direct vs. averted gaze, but high socially anxious women tended to fixate the eye region of the presented face longer than MSA and LSA, respectively. Furthermore, they responded to direct gaze with more pronounced cardiac acceleration. This physiological finding indicates that direct gaze may be a fear-relevant feature for socially anxious individuals in social interaction. However, this seems not to result in gaze avoidance. Future studies should examine the role of gaze direction and its interaction with facial expressions in social anxiety and its consequences for avoidance behavior and fear responses. Additionally, further research is needed to clarify the role of gaze perception in social anxiety.

  1. Active eye-tracking improves LASIK results.

    PubMed

    Lee, Yuan-Chieh

    2007-06-01

    To study the advantage of active eye tracking for photorefractive surgery. In a prospective, double-masked study, LASIK for myopia and myopic astigmatism was performed in 50 patients using the ALLEGRETTO WAVE version 1007. All patients received LASIK with full comprehension of the importance of fixation during the procedure. All surgical procedures were performed by a single surgeon. The eye tracker was turned off in one group (n = 25) and kept on in the other (n = 25). Preoperatively and 3 months postoperatively, patients underwent a standard ophthalmic examination, which included corneal topography. Among patients treated with the eye tracker off, all had uncorrected visual acuity (UCVA) of 20/40 or better and 64% had 20/20 or better. Compared with the patients treated with the eye tracker on, they had higher residual cylindrical astigmatism (P < .05). Those treated with the eye tracker on achieved better UCVA and best spectacle-corrected visual acuity (P < .05). Spherical error and potential visual acuity (TMS-II) were not significantly different between the groups. The flying-spot system can achieve a fair result without active eye tracking, but active eye tracking helps improve the visual outcome and reduces postoperative cylindrical astigmatism.

  2. 29 CFR 1918.101 - Eye and face protection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Eye and face protection. 1918.101 Section 1918.101 Labor... (CONTINUED) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Personal Protective Equipment § 1918.101 Eye and... uses appropriate eye and/or face protection when the employee is exposed to an eye or face hazard, and...

  3. 29 CFR 1918.101 - Eye and face protection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 7 2011-07-01 2011-07-01 false Eye and face protection. 1918.101 Section 1918.101 Labor... (CONTINUED) SAFETY AND HEALTH REGULATIONS FOR LONGSHORING Personal Protective Equipment § 1918.101 Eye and... uses appropriate eye and/or face protection when the employee is exposed to an eye or face hazard, and...

  4. 29 CFR 1918.101 - Eye and face protection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 7 2014-07-01 2014-07-01 false Eye and face protection. 1918.101 Section 1918.101 Labor... face protection. (a) The employer shall ensure that: (1)(i) Employers must ensure that each employee uses appropriate eye and/or face protection when the employee is exposed to an eye or face hazard, and...

  5. 29 CFR 1918.101 - Eye and face protection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 7 2012-07-01 2012-07-01 false Eye and face protection. 1918.101 Section 1918.101 Labor... face protection. (a) The employer shall ensure that: (1)(i) Employers must ensure that each employee uses appropriate eye and/or face protection when the employee is exposed to an eye or face hazard, and...

  6. 29 CFR 1918.101 - Eye and face protection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 7 2013-07-01 2013-07-01 false Eye and face protection. 1918.101 Section 1918.101 Labor... face protection. (a) The employer shall ensure that: (1)(i) Employers must ensure that each employee uses appropriate eye and/or face protection when the employee is exposed to an eye or face hazard, and...

  7. Eyes only? Perceiving eye contact is neither sufficient nor necessary for attentional capture by face direction.

    PubMed

    Böckler, Anne; van der Wel, Robrecht P R D; Welsh, Timothy N

    2015-09-01

    Direct eye contact and motion onset both constitute powerful cues that capture attention. Recent research suggests that (social) gaze and (non-social) motion onset influence information processing in parallel, even when combined as sudden onset direct gaze cues (i.e., faces suddenly establishing eye contact). The present study investigated the role of eye visibility for attention capture by these sudden onset face cues. To this end, face direction was manipulated (away or towards onlooker) while faces had closed eyes (eliminating visibility of eyes, Experiment 1), wore sunglasses (eliminating visible eyes, but allowing for the expectation of eyes to be open, Experiment 2), and were inverted with visible eyes (disrupting the integration of eyes and faces, Experiment 3). Participants classified targets appearing on one of four faces. Initially, two faces were oriented towards participants and two faces were oriented away from participants. Simultaneous to target presentation, one averted face became directed and one directed face became averted. Attention capture by face direction (i.e., facilitation for faces directed towards participants) was absent when eyes were closed, but present when faces wore sunglasses. Sudden onset direct faces can, hence, induce attentional capture, even when lacking eye cues. Inverted faces, by contrast, did not elicit attentional capture. Thus, when eyes cannot be integrated into a holistic face representation they are not sufficient to capture attention. Overall, the results suggest that visibility of eyes is neither necessary nor sufficient for the sudden direct face effect. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Personal identification by eyes.

    PubMed

    Marinović, Dunja; Njirić, Sanja; Coklo, Miran; Muzić, Vedrana

    2011-09-01

    Identification of persons through the eyes falls within the field of biometrics. Many security systems are based on biometric methods of personal identification, used to verify that a person is who they claim to be. The human eye contains an extremely large number of individual characteristics that make it particularly suitable for the process of identifying a person. Today, the eye is considered to be one of the most reliable body parts for human identification, and systems using iris recognition are among the most secure biometric systems.

  9. 33 CFR 142.27 - Eye and face protection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Eye and face protection. 142.27... Eye and face protection. (a) Personnel engaged in or observing welding, grinding, machining, chipping, handling hazardous materials, or acetylene burning or cutting shall wear the eye and face protector...

  10. 33 CFR 142.27 - Eye and face protection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 2 2011-07-01 2011-07-01 false Eye and face protection. 142.27... Eye and face protection. (a) Personnel engaged in or observing welding, grinding, machining, chipping, handling hazardous materials, or acetylene burning or cutting shall wear the eye and face protector...

  11. 33 CFR 142.27 - Eye and face protection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 2 2013-07-01 2013-07-01 false Eye and face protection. 142.27... Eye and face protection. (a) Personnel engaged in or observing welding, grinding, machining, chipping, handling hazardous materials, or acetylene burning or cutting shall wear the eye and face protector...

  12. 33 CFR 142.27 - Eye and face protection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 2 2014-07-01 2014-07-01 false Eye and face protection. 142.27... Eye and face protection. (a) Personnel engaged in or observing welding, grinding, machining, chipping, handling hazardous materials, or acetylene burning or cutting shall wear the eye and face protector...

  13. 33 CFR 142.27 - Eye and face protection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 2 2012-07-01 2012-07-01 false Eye and face protection. 142.27... Eye and face protection. (a) Personnel engaged in or observing welding, grinding, machining, chipping, handling hazardous materials, or acetylene burning or cutting shall wear the eye and face protector...

  14. 49 CFR 214.117 - Eye and face protection.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Eye and face protection. 214.117 Section 214.117..., DEPARTMENT OF TRANSPORTATION RAILROAD WORKPLACE SAFETY Bridge Worker Safety Standards § 214.117 Eye and face protection. (a) Railroad bridge workers shall be provided and shall wear eye and face protection equipment...

  15. 49 CFR 214.117 - Eye and face protection.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Eye and face protection. 214.117 Section 214.117..., DEPARTMENT OF TRANSPORTATION RAILROAD WORKPLACE SAFETY Bridge Worker Safety Standards § 214.117 Eye and face protection. (a) Railroad bridge workers shall be provided and shall wear eye and face protection equipment...

  16. 49 CFR 214.117 - Eye and face protection.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Eye and face protection. 214.117 Section 214.117..., DEPARTMENT OF TRANSPORTATION RAILROAD WORKPLACE SAFETY Bridge Worker Safety Standards § 214.117 Eye and face protection. (a) Railroad bridge workers shall be provided and shall wear eye and face protection equipment...

  17. 49 CFR 214.117 - Eye and face protection.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Eye and face protection. 214.117 Section 214.117..., DEPARTMENT OF TRANSPORTATION RAILROAD WORKPLACE SAFETY Bridge Worker Safety Standards § 214.117 Eye and face protection. (a) Railroad bridge workers shall be provided and shall wear eye and face protection equipment...

  18. 49 CFR 214.117 - Eye and face protection.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Eye and face protection. 214.117 Section 214.117..., DEPARTMENT OF TRANSPORTATION RAILROAD WORKPLACE SAFETY Bridge Worker Safety Standards § 214.117 Eye and face protection. (a) Railroad bridge workers shall be provided and shall wear eye and face protection equipment...

  19. Video-based eye tracking for neuropsychiatric assessment.

    PubMed

    Adhikari, Sam; Stark, David E

    2017-01-01

    This paper presents a video-based eye-tracking method, ideally deployed via a mobile device or laptop-based webcam, as a tool for measuring brain function. Eye movements and pupillary motility are tightly regulated by brain circuits, are subtly perturbed by many disease states, and are measurable using video-based methods. Quantitative measurement of eye movement by readily available webcams may enable early detection and diagnosis, as well as remote/serial monitoring, of neurological and neuropsychiatric disorders. We successfully extracted computational and semantic features for 14 testing sessions, comprising 42 individual video blocks and approximately 17,000 image frames generated across several days of testing. Here, we demonstrate the feasibility of collecting video-based eye-tracking data from a standard webcam in order to assess psychomotor function. Furthermore, we were able to demonstrate through systematic analysis of this data set that eye-tracking features (in particular, radial and tangential variance on a circular visual-tracking paradigm) predict performance on well-validated psychomotor tests. © 2017 New York Academy of Sciences.
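    The circular visual-tracking features named above (radial and tangential variance) can be illustrated with a small sketch. How the error is decomposed here, and all function and variable names, are our assumptions about what those features mean, not the authors' implementation:

```python
import math

def radial_tangential_variance(gaze, target, center):
    """Decompose each gaze sample's error, relative to the moving target
    on a circular tracking path, into radial and tangential components,
    and return the variance of each. gaze and target are lists of (x, y)
    samples; center is the circle's center. This decomposition is our
    reading of the 'radial and tangential variance' features named in
    the abstract, not the authors' code."""
    radial, tangential = [], []
    cx, cy = center
    for (gx, gy), (tx, ty) in zip(gaze, target):
        # unit vectors at the target's position on the circle
        ang = math.atan2(ty - cy, tx - cx)
        ur = (math.cos(ang), math.sin(ang))    # radial (outward)
        ut = (-math.sin(ang), math.cos(ang))   # tangential
        ex, ey = gx - tx, gy - ty              # gaze error vector
        radial.append(ex * ur[0] + ey * ur[1])
        tangential.append(ex * ut[0] + ey * ut[1])

    def var(v):
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / len(v)

    return var(radial), var(tangential)
```

    Large radial variance would indicate gaze wandering off the circle, while large tangential variance would indicate leading or lagging the target along it.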

  20. 29 CFR 1917.91 - Eye and face protection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 7 2010-07-01 2010-07-01 false Eye and face protection. 1917.91 Section 1917.91 Labor... (CONTINUED) MARINE TERMINALS Personal Protection § 1917.91 Eye and face protection. (a)(1)(i) The employer shall ensure that each affected employee uses protective eye and face protection devices that comply...

  1. 29 CFR 1917.91 - Eye and face protection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 7 2011-07-01 2011-07-01 false Eye and face protection. 1917.91 Section 1917.91 Labor... (CONTINUED) MARINE TERMINALS Personal Protection § 1917.91 Eye and face protection. (a)(1)(i) The employer shall ensure that each affected employee uses protective eye and face protection devices that comply...

  2. Attachment Avoidance Is Significantly Related to Attentional Preference for Infant Faces: Evidence from Eye Movement Data

    PubMed Central

    Jia, Yuncheng; Cheng, Gang; Zhang, Dajun; Ta, Na; Xia, Mu; Ding, Fangyuan

    2017-01-01

    Objective: To determine the influence of adult attachment orientations on infant preference. Methods: We adopted eye-tracking technology to monitor childless college women’s eye movements when looking at pairs of faces, including one adult face (man or woman) and one infant face, with three different expressions (happy, sadness, and neutral). The participants (N = 150; 84% Han ethnicity) were aged 18–29 years (M = 19.22, SD = 1.72). A random intercepts multilevel linear regression analysis was used to assess the unique contribution of attachment avoidance, determined using the Experiences in Close Relationships scale, to preference for infant faces. Results: Women with higher attachment avoidance showed less infant preference, as shown by less sustained overt attentional bias to the infant face than the adult face based on fixation time and count. Conclusion: Adult attachment might be related to infant preference according to eye movement indices. Women with higher attachment avoidance may lack attentional preference for infant faces. The findings may aid the treatment and remediation of the interactions between children and mothers with insecure attachment. PMID:28184210

  3. Attachment Avoidance Is Significantly Related to Attentional Preference for Infant Faces: Evidence from Eye Movement Data.

    PubMed

    Jia, Yuncheng; Cheng, Gang; Zhang, Dajun; Ta, Na; Xia, Mu; Ding, Fangyuan

    2017-01-01

    Objective: To determine the influence of adult attachment orientations on infant preference. Methods: We adopted eye-tracking technology to monitor childless college women's eye movements when looking at pairs of faces, including one adult face (man or woman) and one infant face, with three different expressions (happy, sadness, and neutral). The participants ( N = 150; 84% Han ethnicity) were aged 18-29 years ( M = 19.22, SD = 1.72). A random intercepts multilevel linear regression analysis was used to assess the unique contribution of attachment avoidance, determined using the Experiences in Close Relationships scale, to preference for infant faces. Results: Women with higher attachment avoidance showed less infant preference, as shown by less sustained overt attentional bias to the infant face than the adult face based on fixation time and count. Conclusion: Adult attachment might be related to infant preference according to eye movement indices. Women with higher attachment avoidance may lack attentional preference for infant faces. The findings may aid the treatment and remediation of the interactions between children and mothers with insecure attachment.

  4. An Application for Driver Drowsiness Identification based on Pupil Detection using IR Camera

    NASA Astrophysics Data System (ADS)

    Kumar, K. S. Chidanand; Bhowmick, Brojeshwar

    A driver drowsiness identification system is proposed that generates an alarm when the driver falls asleep while driving. A number of different physical phenomena can be monitored and measured in order to detect driver drowsiness in a vehicle. This paper presents a methodology for driver drowsiness identification using an IR camera by detecting and tracking the pupils. The face region is first determined using the Euler number and template matching. The pupils are then located within the face region. In subsequent video frames, the pupils are tracked to determine whether the eyes are open or closed. If the eyes remain closed for several consecutive frames, the driver is judged to be fatigued and an alarm is generated.
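    The consecutive-closed-frames rule this record describes can be sketched as follows. The threshold of 15 frames (roughly 0.5 s at 30 fps) and the boolean eye-state input are illustrative assumptions; the paper does not specify these values, and the per-frame open/closed decision would come from the pupil tracker.

```python
# Minimal sketch of the drowsiness rule: if the eyes remain closed for
# several consecutive video frames, raise an alarm. The threshold is an
# illustrative assumption, not a value from the paper.
CLOSED_FRAMES_THRESHOLD = 15

def drowsiness_alarm(eye_states):
    """eye_states: iterable of booleans, True = eyes closed in that frame.
    Returns the frame index at which the alarm fires, or None."""
    consecutive_closed = 0
    for frame_idx, closed in enumerate(eye_states):
        consecutive_closed = consecutive_closed + 1 if closed else 0
        if consecutive_closed >= CLOSED_FRAMES_THRESHOLD:
            return frame_idx
    return None
```

    A blink (a short run of closed frames) resets nothing permanent: the counter simply restarts, so only a sustained closure trips the alarm.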

  5. Eye Tracking: A Brief Guide for Developmental Researchers

    ERIC Educational Resources Information Center

    Feng, Gary

    2011-01-01

    Eye tracking offers a powerful research tool for developmental scientists. In this brief article, the author introduces the methodology and issues associated with its applications in developmental research, beginning with an overview of eye movements and eye-tracking technologies, followed by examples of how it is used to study the developing mind…

  6. Reasoning strategies with rational numbers revealed by eye tracking.

    PubMed

    Plummer, Patrick; DeWolf, Melissa; Bassok, Miriam; Gordon, Peter C; Holyoak, Keith J

    2017-07-01

    Recent research has begun to investigate the impact of different formats for rational numbers on the processes by which people make relational judgments about quantitative relations. DeWolf, Bassok, and Holyoak (Journal of Experimental Psychology: General, 144(1), 127-150, 2015) found that accuracy on a relation identification task was highest when fractions were presented with countable sets, whereas accuracy was relatively low for all conditions where decimals were presented. However, it is unclear what processing strategies underlie these disparities in accuracy. We report an experiment that used eye-tracking methods to externalize the strategies that are evoked by different types of rational numbers for different types of quantities (discrete vs. continuous). Results showed that eye-movement behavior during the task was jointly determined by image and number format. Discrete images elicited a counting strategy for both fractions and decimals, but this strategy led to higher accuracy only for fractions. Continuous images encouraged magnitude estimation and comparison, but to a greater degree for decimals than fractions. This strategy led to decreased accuracy for both number formats. By analyzing participants' eye movements when they viewed a relational context and made decisions, we were able to obtain an externalized representation of the strategic choices evoked by different ontological types of entities and different types of rational numbers. Our findings using eye-tracking measures enable us to go beyond previous studies based on accuracy data alone, demonstrating that quantitative properties of images and the different formats for rational numbers jointly influence strategies that generate eye-movement behavior.

  7. Child attention to pain and pain tolerance are dependent upon anxiety and attention control: An eye-tracking study.

    PubMed

    Heathcote, L C; Lau, J Y F; Mueller, S C; Eccleston, C; Fox, E; Bosmans, M; Vervoort, T

    2017-02-01

    Pain is common and can be debilitating in childhood. Theoretical models propose that attention to pain plays a key role in pain outcomes, however, very little research has investigated this in youth. This study examined how anxiety-related variables and attention control interacted to predict children's attention to pain cues using eye-tracking methodology, and their pain tolerance on the cold pressor test (CPT). Children aged 8-17 years had their eye-gaze tracked whilst they viewed photographs of other children displaying painful facial expressions during the CPT, before completing the CPT themselves. Children also completed self-report measures of anxiety and attention control. Findings indicated that anxiety and attention control did not impact children's initial fixations on pain or neutral faces, but did impact how long they dwelled on pain versus neutral faces. For children reporting low levels of attention control, higher anxiety was associated with less dwell time on pain faces as opposed to neutral faces, and the opposite pattern was observed for children with high attention control. Anxiety and attention control also interacted to predict pain outcomes. For children with low attention control, increasing anxiety was associated with anticipating more pain and tolerating pain for less time. This is the first study to examine children's attention to pain cues using eye-tracking technology in the context of a salient painful experience. Data suggest that attention control is an important moderator of anxiety on multiple outcomes relevant to young people's pain experiences. This study uses eye tracking to study attention to pain cues in children. Attention control is an important moderator of anxiety on attention bias to pain and tolerance of cold pressor pain in youth. © 2016 European Pain Federation - EFIC®.

  8. Eye contrast polarity is critical for face recognition by infants.

    PubMed

    Otsuka, Yumiko; Motoyoshi, Isamu; Hill, Harold C; Kobayashi, Megumi; Kanazawa, So; Yamaguchi, Masami K

    2013-07-01

    Just as faces share the same basic arrangement of features, with two eyes above a nose above a mouth, human eyes all share the same basic contrast polarity relations, with a sclera lighter than an iris and a pupil, and this is unique among primates. The current study examined whether this bright-dark relationship of sclera to iris plays a critical role in face recognition from early in development. Specifically, we tested face discrimination in 7- and 8-month-old infants while independently manipulating the contrast polarity of the eye region and of the rest of the face. This gave four face contrast polarity conditions: fully positive condition, fully negative condition, positive face with negated eyes ("negative eyes") condition, and negated face with positive eyes ("positive eyes") condition. In a familiarization and novelty preference procedure, we found that 7- and 8-month-olds could discriminate between faces only when the contrast polarity of the eyes was preserved (positive) and that this did not depend on the contrast polarity of the rest of the face. This demonstrates the critical role of eye contrast polarity for face recognition in 7- and 8-month-olds and is consistent with previous findings for adults. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. Implicit negative affect predicts attention to sad faces beyond self-reported depressive symptoms in healthy individuals: An eye-tracking study.

    PubMed

    Bodenschatz, Charlott Maria; Skopinceva, Marija; Kersting, Anette; Quirin, Markus; Suslow, Thomas

    2018-04-04

Cognitive theories of depression assume biased attention towards mood-congruent information as a central vulnerability and maintaining factor. Among other symptoms, depression is characterized by excessive negative affect (NA). Yet, little is known about the impact of naturally occurring NA on the allocation of attention to emotional information. This study investigates how implicit and explicit NA, as well as self-reported depressive symptoms, predict attentional biases in a sample of healthy individuals (N = 104). Attentional biases were assessed using eye-tracking during a free viewing task in which images of sad, angry, happy and neutral faces were shown simultaneously. Participants' implicit affectivity was measured indirectly using the Implicit Positive and Negative Affect Test. Questionnaires were administered to assess actual and habitual explicit NA and the presence of depressive symptoms. Higher levels of depressive symptoms were associated with sustained attention to sad faces and reduced attention to happy faces. Implicit but not explicit NA significantly predicted gaze behavior towards sad faces independently of depressive symptoms. The present study supports the idea that naturally occurring implicit NA is associated with attention allocation to dysphoric facial expressions. The findings demonstrate the utility of implicit affectivity measures in studying individual differences in depression-relevant attentional biases and cognitive vulnerability. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. A laser-based eye-tracking system.

    PubMed

    Irie, Kenji; Wilson, Bruce A; Jones, Richard D; Bones, Philip J; Anderson, Tim J

    2002-11-01

This paper reports on the development of a new eye-tracking system for noninvasive recording of eye movements. The eye tracker uses a flying-spot laser to selectively image landmarks on the eye and, subsequently, measure horizontal, vertical, and torsional eye movements. Considerable work was required to overcome the adverse effects of specular reflection of the flying spot from the surface of the eye onto the sensing elements of the eye tracker. These effects have been largely overcome, and the eye tracker has been used to document eye movement abnormalities, such as abnormal torsional pulsion of saccades, in the clinical setting.

  11. Processing of Emotional Faces in Patients with Chronic Pain Disorder: An Eye-Tracking Study.

    PubMed

    Giel, Katrin Elisabeth; Paganini, Sarah; Schank, Irena; Enck, Paul; Zipfel, Stephan; Junne, Florian

    2018-01-01

Problems in emotion processing potentially contribute to the development and maintenance of chronic pain. Theories focusing on attentional processing have suggested that dysfunctional attention deployment toward emotional information, i.e., attentional biases for negative emotions, might constitute one potential developmental and/or maintenance factor of chronic pain. We assessed self-reported alexithymia and attentional orienting to and maintenance on emotional stimuli using eye tracking in 17 patients with chronic pain disorder (CP) and two age- and sex-matched control groups: 17 healthy individuals (HC) and 17 individuals matched to CP according to depressive symptoms (DC). In a choice viewing paradigm, a dot indicated the position of the emotional picture in the next trial to allow for strategic attention deployment. Picture pairs consisted of a happy or sad facial expression and a neutral facial expression of the same individual. Participants were asked to explore picture pairs freely. CP and DC groups reported higher alexithymia than the HC group. HC showed a previously reported emotionality bias by preferentially orienting to the emotional face and preferentially maintaining on the happy face. CP and DC participants showed no facilitated early attention to sad facial expressions; DC participants also showed no facilitated early attention to happy facial expressions, whereas CP and HC participants did. We found no group differences in attentional maintenance. Our findings are in line with the large clinical overlap between pain and depression. The blunted initial reaction to sadness could be interpreted as a failure of the attentional system to attend to evolutionarily salient emotional stimuli or as an attempt to suppress negative emotions. These difficulties in emotion processing might contribute to the etiology or maintenance of chronic pain and depression.

  12. Who is the Usual Suspect? Evidence of a Selection Bias Toward Faces That Make Direct Eye Contact in a Lineup Task

    PubMed Central

    van Golde, Celine; Verstraten, Frans A. J.

    2017-01-01

The speed and ease with which we recognize the faces of our friends and family members belies the difficulty we have recognizing less familiar individuals. Nonetheless, overconfidence in our ability to recognize faces has carried over into various aspects of our legal system; for instance, eyewitness identification serves a critical role in criminal proceedings. For this reason, understanding the perceptual and psychological processes that underlie false identification is of the utmost importance. Gaze direction is a salient social signal, and direct eye contact, in particular, is thought to capture attention. Here, we tested the hypothesis that differences in gaze direction may influence difficult decisions in a lineup context. In a series of experiments, we show that when a group of faces differed in their gaze direction, the faces that were making eye contact with the participants were more likely to be misidentified. Interestingly, this bias disappeared when the faces were presented with their eyes closed. These findings open a critical conversation between social neuroscience and forensic psychology, and imply that direct eye contact may (wrongly) increase the perceived familiarity of a face. PMID:28203355

  13. Appearance-based multimodal human tracking and identification for healthcare in the digital home.

    PubMed

    Yang, Mau-Tsuen; Huang, Shen-Yen

    2014-08-05

There is an urgent need for intelligent home surveillance systems to provide home security, monitor health conditions, and detect emergencies of family members. A fundamental problem in realizing the power of these intelligent services is how to detect, track, and identify people at home. Compared to RFID tags that need to be worn all the time, vision-based sensors provide a natural and nonintrusive solution. Observing that body appearance and body build, as well as the face, provide valuable cues for human identification, we model and record multi-view faces, full-body colors, and shapes of family members in an appearance database using two Kinects located at a home's entrance. The Kinects and another set of color cameras installed in other parts of the house are then used to detect, track, and identify people by matching the captured color images with the registered templates in the appearance database. People are detected and tracked by multisensor fusion (Kinects and color cameras) using a Kalman filter that can handle duplicate or partial measurements. People are identified by multimodal fusion (face, body appearance, and silhouette) using track-based majority voting. Moreover, the appearance-based human detection, tracking, and identification modules cooperate seamlessly and benefit from each other. Experimental results show the effectiveness of human tracking across multiple sensors and of human identification using multi-view faces, full-body clothing, and silhouettes. The proposed home surveillance system can be applied to domestic applications in digital home security and intelligent healthcare.
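The track-based majority voting used for identification can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: the function name and the (label, confidence) input format are assumptions, and ties are broken here by summed confidence over the track.

```python
from collections import Counter

def track_majority_vote(frame_predictions):
    """Fuse per-frame identity guesses along one track into a single label.

    frame_predictions: list of (label, confidence) tuples, one per frame in
    which any modality (face, body appearance, silhouette) produced a match.
    The most frequent label wins; ties are broken by total confidence.
    """
    votes = Counter(label for label, _ in frame_predictions)
    top_count = max(votes.values())
    tied = [label for label, n in votes.items() if n == top_count]
    if len(tied) == 1:
        return tied[0]
    # Break ties by summed confidence over the track.
    conf = {label: 0.0 for label in tied}
    for label, c in frame_predictions:
        if label in conf:
            conf[label] += c
    return max(conf, key=conf.get)
```

For example, a track whose frames matched "alice" twice and "bob" once resolves to "alice" regardless of the individual confidences.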

  15. Combining user logging with eye tracking for interactive and dynamic applications.

    PubMed

    Ooms, Kristien; Coltekin, Arzu; De Maeyer, Philippe; Dupont, Lien; Fabrikant, Sara; Incoul, Annelies; Kuhn, Matthias; Slabbinck, Hendrik; Vansteenkiste, Pieter; Van der Haegen, Lise

    2015-12-01

User evaluations of interactive and dynamic applications face various challenges related to the active nature of these displays. For example, users can often zoom and pan on digital products, and these interactions change the extent and/or level of detail of the stimulus. Therefore, in eye tracking studies, when a user's gaze is at a particular screen position (gaze position) over a period of time, the information contained at this position may have changed. Such digital activities are commonplace in modern life, yet it has been difficult to automatically compare the changing information at the viewed position, especially across many participants. Existing solutions typically involve tedious and time-consuming manual work. In this article, we propose a methodology that can overcome this problem. By combining eye tracking with user logging (mouse and keyboard actions), demonstrated here with cartographic products, we are able to accurately reference screen coordinates to geographic coordinates. This referencing approach allows researchers to know which geographic object (location or attribute) corresponds to the gaze coordinates at all times. We tested the proposed approach through two case studies, and discuss the advantages and disadvantages of the applied methodology. Furthermore, the applicability of the proposed approach is discussed with respect to other fields of research that use eye tracking, namely marketing, sports and movement sciences, and experimental psychology. From these case studies and discussions, we conclude that combining eye tracking and user-logging data is an essential step forward in efficiently studying user behavior with interactive and dynamic stimuli in multiple research fields.
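The screen-to-geographic referencing step can be illustrated with a minimal sketch. This is not the authors' code: it assumes the logged pan/zoom events have already been replayed into a geographic bounding box (the current viewport) for each timestamp, and it uses a simple linear, unprojected mapping.

```python
def gaze_to_geo(gx, gy, screen_w, screen_h, viewport):
    """Map a gaze sample in screen pixels to map coordinates.

    viewport: (min_lon, min_lat, max_lon, max_lat) of the map extent
    currently on screen, reconstructed from the logged pan/zoom events.
    Screen y runs downward while latitude runs upward, hence the flip.
    """
    min_lon, min_lat, max_lon, max_lat = viewport
    lon = min_lon + (gx / screen_w) * (max_lon - min_lon)
    lat = max_lat - (gy / screen_h) * (max_lat - min_lat)
    return lon, lat
```

With an 800x600 screen showing the extent (0, 0, 10, 10), a gaze sample at the screen center maps to the geographic center (5.0, 5.0).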

  16. Music-Elicited Emotion Identification Using Optical Flow Analysis of Human Face

    NASA Astrophysics Data System (ADS)

    Kniaz, V. V.; Smirnova, Z. N.

    2015-05-01

Human emotion identification from image sequences is in high demand nowadays. Possible applications range from the automatic smile-shutter function of consumer-grade digital cameras to Biofied Building technologies, which enable communication between a building space and its residents. The highly perceptual nature of human emotions makes their classification and identification complex. The main question arises from the subjective quality of emotional classification of events that elicit human emotions. A variety of methods for the formal classification of emotions have been developed in musical psychology. This work focuses on identifying human emotions evoked by musical pieces using human face tracking and optical flow analysis. A facial-feature tracking algorithm used for estimating facial feature speed and position is presented. Facial features were extracted from each image sequence using human face tracking with local binary pattern (LBP) features. Accurate relative speeds of facial features were estimated using optical flow analysis. The obtained relative positions and speeds were used as the output facial emotion vector. The algorithm was tested using original software and recorded image sequences. The proposed technique proves robust for identifying human emotions elicited by musical pieces. The estimated models could be used for human emotion identification from image sequences in such fields as emotion-based musical backgrounds or mood-dependent radio.
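The position-plus-speed emotion vector can be sketched as follows. This is a hedged illustration, not the paper's implementation: it assumes the features have already been localized (e.g., by the LBP-based tracker) and approximates the optical flow by finite differences of tracked feature positions between consecutive frames.

```python
import numpy as np

def feature_motion_vector(prev_pts, curr_pts, fps):
    """Build a simple motion descriptor from tracked facial features.

    prev_pts, curr_pts: (N, 2) arrays of feature positions (pixels) in
    consecutive frames; fps: frame rate. Returns the current positions
    concatenated with per-feature speeds (pixels/s), analogous to the
    position-plus-speed emotion vector the abstract describes.
    """
    prev_pts = np.asarray(prev_pts, dtype=float)
    curr_pts = np.asarray(curr_pts, dtype=float)
    displacement = curr_pts - prev_pts            # per-feature flow
    speed = np.linalg.norm(displacement, axis=1) * fps
    return np.concatenate([curr_pts.ravel(), speed])
```

A feature that moved by the pixel vector (3, 4) between frames at 30 fps yields a speed of 150 pixels/s; a stationary feature yields 0.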

  17. Contribution of malocclusion and female facial attractiveness to smile esthetics evaluated by eye tracking.

    PubMed

    Richards, Michael R; Fields, Henry W; Beck, F Michael; Firestone, Allen R; Walther, Dirk B; Rosenstiel, Stephen; Sacksteder, James M

    2015-04-01

    There is disagreement in the literature concerning the importance of the mouth in overall facial attractiveness. Eye tracking provides an objective method to evaluate what people see. The objective of this study was to determine whether dental and facial attractiveness alters viewers' visual attention in terms of which area of the face (eyes, nose, mouth, chin, ears, or other) is viewed first, viewed the greatest number of times, and viewed for the greatest total time (duration) using eye tracking. Seventy-six viewers underwent 1 eye tracking session. Of these, 53 were white (49% female, 51% male). Their ages ranged from 18 to 29 years, with a mean of 19.8 years, and none were dental professionals. After being positioned and calibrated, they were shown 24 unique female composite images, each image shown twice for reliability. These images reflected a repaired unilateral cleft lip or 3 grades of dental attractiveness similar to those of grades 1 (near ideal), 7 (borderline treatment need), and 10 (definite treatment need) as assessed in the aesthetic component of the Index of Orthodontic Treatment Need (AC-IOTN). The images were then embedded in faces of 3 levels of attractiveness: attractive, average, and unattractive. During viewing, data were collected for the first location, frequency, and duration of each viewer's gaze. Observer reliability ranged from 0.58 to 0.92 (intraclass correlation coefficients) but was less than 0.07 (interrater) for the chin, which was eliminated from the study. Likewise, reliability for the area of first fixation was kappa less than 0.10 for both intrarater and interrater reliabilities; the area of first fixation was also removed from the data analysis. Repeated-measures analysis of variance showed a significant effect (P <0.001) for level of attractiveness by malocclusion by area of the face. For both number of fixations and duration of fixations, the eyes overwhelmingly were most salient, with the mouth receiving the second most

  18. Eye Tracking Detects Disconjugate Eye Movements Associated with Structural Traumatic Brain Injury and Concussion

    PubMed Central

Samadani, Uzma; Ritlop, Robert; Reyes, Marleen; Nehrbass, Elena; Li, Meng; Lamm, Elizabeth; Schneider, Julia; Shimunov, David; Sava, Maria; Kolecki, Radek; Burris, Paige; Altomare, Lindsey; Mehmood, Talha; Smith, Theodore; Huang, Jason H.; McStay, Christopher; Todd, S. Rob; Qian, Meng; Kondziolka, Douglas; Wall, Stephen; Huang, Paul

    2015-01-01

Disconjugate eye movements have been associated with traumatic brain injury since ancient times. Ocular motility dysfunction may be present in up to 90% of patients with concussion or blast injury. We developed an algorithm for eye tracking in which the Cartesian coordinates of the right and left pupils are tracked over 200 sec and compared to each other as a subject watches a short film clip moving inside an aperture on a computer screen. We prospectively eye tracked 64 normal healthy noninjured control subjects and compared findings to 75 trauma subjects with either a positive head computed tomography (CT) scan (n=13), negative head CT (n=39), or nonhead injury (n=23) to determine whether eye tracking would reveal the disconjugate gaze associated with both structural brain injury and concussion. Tracking metrics were then correlated to the clinical concussion measure Sport Concussion Assessment Tool 3 (SCAT3) in trauma patients. Five out of five measures of horizontal disconjugacy were increased in positive and negative head CT patients relative to noninjured control subjects. Only one of five vertical disconjugacy measures was significantly increased in brain-injured patients relative to controls. Linear regression analysis of all 75 trauma patients demonstrated that three metrics for horizontal disconjugacy negatively correlated with SCAT3 symptom severity score and positively correlated with total Standardized Assessment of Concussion score. Abnormal eye-tracking metrics improved over time toward baseline in brain-injured subjects observed in follow-up. Eye tracking may help quantify the severity of ocular motility disruption associated with concussion and structural brain injury. PMID:25582436
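One plausible form of the horizontal disconjugacy metrics described above can be sketched as follows. The exact definition (variance of the left-right horizontal difference over the recording) is an assumption for illustration, not necessarily the authors' formulation.

```python
import statistics

def horizontal_disconjugacy(left_x, right_x):
    """One plausible disconjugacy metric: the variance of the difference
    between left- and right-pupil horizontal positions over the recording.
    Conjugate eyes keep this difference nearly constant, so a larger
    variance suggests disconjugate gaze.
    """
    diffs = [l - r for l, r in zip(left_x, right_x)]
    return statistics.pvariance(diffs)
```

Perfectly conjugate eyes that move together yield a variance of zero, however large the movements themselves are.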

  20. Infant Eye-Tracking in the Context of Goal-Directed Actions

    ERIC Educational Resources Information Center

    Corbetta, Daniela; Guan, Yu; Williams, Joshua L.

    2012-01-01

    This paper presents two methods that we applied to our research to record infant gaze in the context of goal-oriented actions using different eye-tracking devices: head-mounted and remote eye-tracking. For each type of eye-tracking system, we discuss their advantages and disadvantages, describe the particular experimental setups we used to study…

  1. Real time eye tracking using Kalman extended spatio-temporal context learning

    NASA Astrophysics Data System (ADS)

    Munir, Farzeen; Minhas, Fayyaz ul Amir Asfar; Jalil, Abdul; Jeon, Moongu

    2017-06-01

Real time eye tracking has numerous applications in human-computer interaction, such as mouse cursor control in a computer system, and is useful for persons with muscular or motion impairments. However, tracking the movement of the eye is complicated by occlusion due to blinking, head movement, screen glare, rapid eye movements, etc. In this work, we present the algorithmic and construction details of a real time eye tracking system. Our proposed system extends spatio-temporal context learning with Kalman filtering. Spatio-temporal context learning offers state-of-the-art accuracy in general object tracking, but its performance suffers under object occlusion. The addition of the Kalman filter allows the proposed method to model the dynamics of eye motion and provide robust eye tracking in cases of occlusion. We demonstrate the effectiveness of this tracking technique by controlling the computer cursor in real time with eye movements.
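The Kalman extension can be illustrated with a minimal constant-velocity filter over the eye center. This is a sketch under stated assumptions (the state layout, noise parameters, and class name are all illustrative, not the paper's code), showing the key idea: when detection fails during a blink, the update step is skipped and the track coasts on the motion model.

```python
import numpy as np

class EyeKalman:
    """Constant-velocity Kalman filter over the eye-center position."""

    def __init__(self, x0, y0, dt=1 / 30, q=1e-2, r=1.0):
        self.x = np.array([x0, y0, 0.0, 0.0])   # state: [x, y, vx, vy]
        self.P = np.eye(4)                      # state covariance
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt        # position += velocity * dt
        self.H = np.eye(2, 4)                   # we observe position only
        self.Q = q * np.eye(4)                  # process noise
        self.R = r * np.eye(2)                  # measurement noise

    def step(self, measurement=None):
        # Predict.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update only when the detector found the eye; during a blink or
        # occlusion (measurement is None) we coast on the prediction.
        if measurement is not None:
            z = np.asarray(measurement, dtype=float)
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ (z - self.H @ self.x)
            self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2].copy()
```

Feeding the filter a rightward-moving eye and then a frame with no detection, the predicted position keeps moving rightward instead of collapsing onto the last measurement.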

  2. Eye-Tracking Study of Complexity in Gas Law Problems

    ERIC Educational Resources Information Center

    Tang, Hui; Pienta, Norbert

    2012-01-01

    This study, part of a series investigating students' use of online tools to assess problem solving, uses eye-tracking hardware and software to explore the effect of problem difficulty and cognitive processes when students solve gas law word problems. Eye movements are indices of cognition; eye-tracking data typically include the location,…

  3. Evaluation of the Tobii EyeX Eye tracking controller and Matlab toolkit for research.

    PubMed

    Gibaldi, Agostino; Vanegas, Mauricio; Bex, Peter J; Maiello, Guido

    2017-06-01

The Tobii EyeX Controller is a new low-cost binocular eye tracker marketed for integration in gaming and consumer applications. The manufacturers claim that the system was conceived for natural eye-gaze interaction, does not require continuous recalibration, and allows moderate head movements. The Controller is provided with an SDK to foster the development of new eye tracking applications. We review the characteristics of the device for its possible use in scientific research. We develop and evaluate an open source Matlab Toolkit that can be employed to interface with the EyeX device for gaze recording in behavioral experiments. The Toolkit provides calibration procedures tailored to both binocular and monocular experiments, as well as procedures to evaluate other eye tracking devices. The observed performance of the EyeX (i.e., accuracy < 0.6°, precision < 0.25°, latency < 50 ms, and sampling frequency ≈55 Hz) is sufficient for some classes of research application. The device can be successfully employed to measure fixation parameters and saccadic, smooth pursuit, and vergence eye movements. However, the relatively low sampling rate and moderate precision limit the suitability of the EyeX for monitoring microsaccadic eye movements or for real-time gaze-contingent stimulus control. For these applications, research-grade, high-cost eye tracking technology may still be necessary. Therefore, despite its limitations with respect to high-end devices, the EyeX has the potential to further the dissemination of eye tracking technology to a broad audience, and could be a valuable asset in consumer and gaming applications as well as a subset of basic and clinical research settings.
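The accuracy and precision figures quoted above can be computed from calibration-point recordings roughly as follows. This is an illustrative sketch, not the authors' toolkit code: it assumes gaze samples are already expressed in degrees of visual angle and adopts sample-to-sample RMS as the precision definition (other definitions, such as the standard deviation within a fixation, are also common).

```python
import math

def gaze_accuracy(samples, target):
    """Accuracy: mean angular offset of gaze samples from the known
    target position (all values in degrees of visual angle)."""
    return sum(math.hypot(x - target[0], y - target[1])
               for x, y in samples) / len(samples)

def gaze_precision_rms(samples):
    """Precision: RMS of the sample-to-sample angular displacement,
    one common definition in eye-tracker evaluation."""
    deltas = [math.hypot(x2 - x1, y2 - y1)
              for (x1, y1), (x2, y2) in zip(samples, samples[1:])]
    return math.sqrt(sum(d * d for d in deltas) / len(deltas))
```

A steady 1° offset from the target gives an accuracy of 1.0° and a precision of 0.0°, illustrating that the two metrics capture independent error sources.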

  4. Sad people are more accurate at expression identification with a smaller own-ethnicity bias than happy people.

    PubMed

    Hills, Peter J; Hill, Dominic M

    2017-07-12

Sad individuals perform more accurately at face identity recognition (Hills, Werno, & Lewis, 2011), possibly because they scan more of the face during encoding. During expression identification tasks, sad individuals do not fixate on the eyes as much as happier individuals (Wu, Pu, Allen, & Pauli, 2012). Fixating on features other than the eyes leads to a reduced own-ethnicity bias (Hills & Lewis, 2006). This background suggests that sad individuals would not view the eyes as much as happy individuals, which would result in improved expression recognition and a reduced own-ethnicity bias. This prediction was tested using an expression identification task with eye tracking. We demonstrate that sad-induced participants show enhanced expression recognition and a reduced own-ethnicity bias relative to happy-induced participants, owing to their scanning more facial features. We conclude that mood affects eye movements and face encoding by causing a wider sampling strategy and deeper encoding of facial features diagnostic for expression identification.

  5. 33 CFR 150.609 - When is eye and face protection required?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false When is eye and face protection... SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: OPERATIONS Workplace Safety and Health Eyes and Face § 150.609 When is eye and face protection required? The operator must provide eye and face protectors...

  6. 33 CFR 150.609 - When is eye and face protection required?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 2 2011-07-01 2011-07-01 false When is eye and face protection... SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: OPERATIONS Workplace Safety and Health Eyes and Face § 150.609 When is eye and face protection required? The operator must provide eye and face protectors...

  7. 33 CFR 150.609 - When is eye and face protection required?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 2 2014-07-01 2014-07-01 false When is eye and face protection... SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: OPERATIONS Workplace Safety and Health Eyes and Face § 150.609 When is eye and face protection required? The operator must provide eye and face protectors...

  8. 33 CFR 150.609 - When is eye and face protection required?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 2 2013-07-01 2013-07-01 false When is eye and face protection... SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: OPERATIONS Workplace Safety and Health Eyes and Face § 150.609 When is eye and face protection required? The operator must provide eye and face protectors...

  9. 33 CFR 150.609 - When is eye and face protection required?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 2 2012-07-01 2012-07-01 false When is eye and face protection... SECURITY (CONTINUED) DEEPWATER PORTS DEEPWATER PORTS: OPERATIONS Workplace Safety and Health Eyes and Face § 150.609 When is eye and face protection required? The operator must provide eye and face protectors...

  10. The "Eye Avoidance" Hypothesis of Autism Face Processing.

    PubMed

    Tanaka, James W; Sung, Andrew

    2016-05-01

Although a growing body of research indicates that children with autism spectrum disorder (ASD) exhibit selective deficits in their ability to recognize facial identities and expressions, the source of their face impairment is, as yet, undetermined. In this paper, we consider three possible accounts of the autism face deficit: (1) the holistic hypothesis, (2) the local perceptual bias hypothesis, and (3) the eye avoidance hypothesis. A review of the literature indicates that, contrary to the holistic hypothesis, there is little evidence to suggest that individuals with autism fail to perceive faces holistically. The local perceptual bias account also fails to explain the selective advantage that ASD individuals demonstrate for objects and their selective disadvantage for faces. The eye avoidance hypothesis provides a plausible explanation of face recognition deficits in which individuals with ASD avoid the eye region because it is perceived as socially threatening. Direct eye contact elicits an increased physiological response, as indicated by heightened skin conductance and amygdala activity. For individuals with autism, avoiding the eyes is an adaptive strategy; however, this approach interferes with the ability to process facial cues of identity, expressions, and intentions, exacerbating the social challenges for persons with ASD.

  11. Tracking with the mind's eye

    NASA Technical Reports Server (NTRS)

    Krauzlis, R. J.; Stone, L. S.

    1999-01-01

The two components of voluntary tracking eye movements in primates, pursuit and saccades, are generally viewed as relatively independent oculomotor subsystems that move the eyes in different ways using independent visual information. Although saccades have long been known to be guided by visual processes related to perception and cognition, only recently have psychophysical and physiological studies provided compelling evidence that pursuit is also guided by such higher-order visual processes, rather than by the raw retinal stimulus. Pursuit and saccades also do not appear to be entirely independent anatomical systems, but involve overlapping neural mechanisms that might be important for coordinating these two types of eye movement during the tracking of a selected visual object. Given that the recovery of objects from real-world images is inherently ambiguous, guiding both pursuit and saccades with perception could represent an explicit strategy for ensuring that these two motor actions are driven by a single visual interpretation.

  12. Active eye-tracking for an adaptive optics scanning laser ophthalmoscope

    PubMed Central

    Sheehy, Christy K.; Tiruveedhula, Pavan; Sabesan, Ramkumar; Roorda, Austin

    2015-01-01

    We demonstrate a system that combines a tracking scanning laser ophthalmoscope (TSLO) and an adaptive optics scanning laser ophthalmoscope (AOSLO) system resulting in both optical (hardware) and digital (software) eye-tracking capabilities. The hybrid system employs the TSLO for active eye-tracking at a rate up to 960 Hz for real-time stabilization of the AOSLO system. AOSLO videos with active eye-tracking signals showed, at most, an amplitude of motion of 0.20 arcminutes for horizontal motion and 0.14 arcminutes for vertical motion. Subsequent real-time digital stabilization limited residual motion to an average of only 0.06 arcminutes (a 95% reduction). By correcting for high amplitude, low frequency drifts of the eye, the active TSLO eye-tracking system enabled the AOSLO system to capture high-resolution retinal images over a larger range of motion than previously possible with just the AOSLO imaging system alone. PMID:26203370

  13. A face versus non-face context influences amygdala responses to masked fearful eye whites.

    PubMed

    Kim, M Justin; Solomon, Kimberly M; Neta, Maital; Davis, F Caroline; Oler, Jonathan A; Mazzulla, Emily C; Whalen, Paul J

    2016-12-01

    The structure of the mask stimulus is crucial in backward masking studies and we recently demonstrated such an effect when masking faces. Specifically, we showed that activity of the amygdala is increased to fearful facial expressions masked with neutral faces and decreased to fearful expressions masked with a pattern mask; critically, both masked conditions discriminated fearful expressions from happy expressions. Given this finding, we sought to test whether masked fearful eye whites would produce a similar profile of amygdala response in a face vs non-face context. During functional magnetic resonance imaging scanning sessions, 30 participants viewed fearful or happy eye whites masked with either neutral faces or pattern images. Results indicated amygdala activity was increased to fearful vs happy eye whites in the face mask condition, but decreased to fearful vs happy eye whites in the pattern mask condition, effectively replicating and expanding our previous report. Our data support the idea that the amygdala is responsive to fearful eye whites, but that the nature of this activity observed in a backward masking design depends on the mask stimulus. © The Author (2016). Published by Oxford University Press.

  14. Measuring social attention and motivation in autism spectrum disorder using eye-tracking: Stimulus type matters.

    PubMed

    Chevallier, Coralie; Parish-Morris, Julia; McVey, Alana; Rump, Keiran M; Sasson, Noah J; Herrington, John D; Schultz, Robert T

    2015-10-01

    Autism Spectrum Disorder (ASD) is characterized by social impairments that have been related to deficits in social attention, including diminished gaze to faces. Eye-tracking studies are commonly used to examine social attention and social motivation in ASD, but they vary in sensitivity. In this study, we hypothesized that the ecological nature of the social stimuli would affect participants' social attention, with gaze behavior during more naturalistic scenes being most predictive of ASD vs. typical development. Eighty-one children with and without ASD participated in three eye-tracking tasks that differed in the ecological relevance of the social stimuli. In the "Static Visual Exploration" task, static images of objects and people were presented; in the "Dynamic Visual Exploration" task, video clips of individual faces and objects were presented side-by-side; in the "Interactive Visual Exploration" task, video clips of children playing with objects in a naturalistic context were presented. Our analyses uncovered a three-way interaction between Task, Social vs. Object Stimuli, and Diagnosis. This interaction was driven by group differences on only one task: the Interactive task. Bayesian analyses confirmed that the other two tasks were insensitive to group membership. In addition, receiver operating characteristic analyses demonstrated that, unlike the other two tasks, the Interactive task had significant classification power. The ecological relevance of social stimuli is an important factor to consider for eye-tracking studies aiming to measure social attention and motivation in ASD. © 2015 International Society for Autism Research, Wiley Periodicals, Inc.

  15. Dense-HOG-based drift-reduced 3D face tracking for infant pain monitoring

    NASA Astrophysics Data System (ADS)

    Saeijs, Ronald W. J. J.; Tjon A Ten, Walther E.; de With, Peter H. N.

    2017-03-01

    This paper presents a new algorithm for 3D face tracking intended for clinical infant pain monitoring. The algorithm uses a cylinder head model and 3D head pose recovery by alignment of dynamically extracted templates based on dense-HOG features. The algorithm includes extensions for drift reduction, using re-registration in combination with multi-pose state estimation by means of a square-root unscented Kalman filter. The paper reports experimental results on videos of moving infants in hospital who are relaxed or in pain. Results show good tracking behavior for poses up to 50 degrees from upright-frontal. In terms of eye location error relative to inter-ocular distance, the mean tracking error is below 9%.
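
    The dense-HOG features that the tracker aligns can be illustrated with a minimal, self-contained sketch: one magnitude-weighted gradient-orientation histogram per non-overlapping image cell. This is a textbook reconstruction of the feature family only, not the authors' implementation; the cell size and bin count below are illustrative defaults.

```python
import numpy as np

def hog_cell_histograms(img, cell=8, bins=9):
    """Dense-HOG sketch: one unsigned gradient-orientation histogram
    (angles folded into [0, 180)) per cell, with votes weighted by
    gradient magnitude."""
    gy, gx = np.gradient(img.astype(float))          # per-axis image gradients
    mag = np.hypot(gx, gy)                           # gradient magnitude
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0     # unsigned orientation
    bin_idx = np.minimum((ang / (180.0 / bins)).astype(int), bins - 1)
    ch, cw = img.shape[0] // cell, img.shape[1] // cell
    hist = np.zeros((ch, cw, bins))
    for i in range(ch):
        for j in range(cw):
            sl = np.s_[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            hist[i, j] = np.bincount(bin_idx[sl].ravel(),
                                     weights=mag[sl].ravel(),
                                     minlength=bins)
    return hist

# A vertical step edge produces purely horizontal gradients, so every
# cell the edge crosses votes into the 0-degree bin.
edge = np.zeros((16, 16))
edge[:, 8:] = 1.0
features = hog_cell_histograms(edge)    # shape (2, 2, 9)
```

    In a tracker of this kind, histograms pooled densely over a template would be compared between frames to align the head model; the re-registration and square-root unscented Kalman filtering stages are beyond this sketch.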

  16. High-Speed Noninvasive Eye-Tracking System

    NASA Technical Reports Server (NTRS)

    Talukder, Ashit; LaBaw, Clayton; Michael-Morookian, John; Monacos, Steve; Serviss, Orin

    2007-01-01

    The figure schematically depicts a system of electronic hardware and software that noninvasively tracks the direction of a person's gaze in real time. Like prior commercial noninvasive eye-tracking systems, this system is based on (1) illumination of an eye by a low-power infrared light-emitting diode (LED); (2) acquisition of video images of the pupil, iris, and cornea in the reflected infrared light; (3) digitization of the images; and (4) processing the digital image data to determine the direction of gaze from the centroids of the pupil and cornea in the images. Relative to the prior commercial systems, the present system operates at much higher speed and thereby offers enhanced capability for applications that involve human-computer interactions, including typing and computer command and control by handicapped individuals, and eye-based diagnosis of physiological disorders that affect gaze responses.
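
    The centroid computation in steps (3)-(4) can be sketched as follows. The synthetic frame, threshold values, and geometry are illustrative assumptions, not taken from the NASA system; a real tracker adds per-user calibration to map the pupil-glint offset to a gaze angle.

```python
import numpy as np

def centroid(mask):
    """Centroid (row, col) of the True pixels in a binary mask."""
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

def gaze_offset(ir_frame, pupil_thresh=50, glint_thresh=220):
    """Vector from the corneal-reflection (glint) centroid to the pupil
    centroid.  In pupil/corneal-reflection tracking this offset, after
    calibration, maps approximately linearly to gaze direction.
    Thresholds are illustrative, not from the cited system."""
    pupil = centroid(ir_frame < pupil_thresh)    # pupil appears dark in IR
    glint = centroid(ir_frame > glint_thresh)    # corneal reflection is bright
    return pupil[0] - glint[0], pupil[1] - glint[1]

# Synthetic 100x100 IR frame: mid-gray background, dark pupil disc at
# (row 40, col 60), small bright glint at (row 50, col 50).
frame = np.full((100, 100), 128.0)
yy, xx = np.mgrid[0:100, 0:100]
frame[(yy - 40) ** 2 + (xx - 60) ** 2 <= 15 ** 2] = 20.0
frame[(yy - 50) ** 2 + (xx - 50) ** 2 <= 2 ** 2] = 250.0
dy, dx = gaze_offset(frame)   # offset points from the glint toward the pupil
```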

  17. 29 CFR 1917.91 - Eye and face protection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) MARINE TERMINALS Personal Protection § 1917.91 Eye and face protection. (a)(1)(i) The employer... requirements covering eye protection against radiant energy, see § 1917.152(h). (b) Eye protection equipment...

  18. 29 CFR 1917.91 - Eye and face protection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) MARINE TERMINALS Personal Protection § 1917.91 Eye and face protection. (a)(1)(i) The employer... requirements covering eye protection against radiant energy, see § 1917.152(h). (b) Eye protection equipment...

  19. 29 CFR 1917.91 - Eye and face protection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) MARINE TERMINALS Personal Protection § 1917.91 Eye and face protection. (a)(1)(i) The employer... requirements covering eye protection against radiant energy, see § 1917.152(h). (b) Eye protection equipment...

  20. Three-Dimensional Eye Tracking in a Surgical Scenario.

    PubMed

    Bogdanova, Rositsa; Boulanger, Pierre; Zheng, Bin

    2015-10-01

    Eye tracking has been widely used in studying the eye behavior of surgeons in the past decade. Most eye-tracking data are reported in a 2-dimensional (2D) fashion, and data for describing surgeons' behaviors on stereoperception are often missed. With the introduction of stereoscopes in laparoscopic procedures, there is an increasing need for studying the depth perception of surgeons under 3D image-guided surgery. We developed a new algorithm for the computation of convergence points in stereovision by measuring surgeons' interpupillary distance, the distance to the view target, and the difference between gaze locations of the 2 eyes. To test the feasibility of our new algorithm, we recruited 10 individuals to watch stereograms using binocular disparity and asked them to develop stereoperception using a cross-eyed viewing technique. Participants' eye motions were recorded by the Tobii eye tracker while they performed the trials. Convergence points between normal and stereo-viewing conditions were computed using the developed algorithm. All 10 participants were able to develop stereovision after a short period of training. During stereovision, participants' eye convergence points were 14 ± 1 cm in front of their eyes, which was significantly closer than the convergence points under the normal viewing condition (77 ± 20 cm). By applying our method of calculating convergence points using eye tracking, we were able to characterize the differences in eye movement patterns of human operators between the normal and stereovision conditions. Knowledge from this study can be applied to the design of surgical visual systems, with the goal of improving surgical performance and patient safety. © The Author(s) 2015.
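
    The three inputs named above (interpupillary distance, distance to the display, and the gap between the two eyes' on-screen gaze points) determine the convergence point by intersecting the two gaze rays. The following is a plausible reconstruction from similar triangles, not the authors' published algorithm; all units and example values are illustrative.

```python
def convergence_depth(ipd_cm, screen_cm, gaze_left_x, gaze_right_x):
    """Depth (cm, measured from the eyes) of the point where the two
    gaze rays cross.

    Model: the eyes sit at x = -ipd/2 and x = +ipd/2 in the z = 0 plane,
    and the display is the plane z = screen_cm.  gaze_left_x and
    gaze_right_x are the horizontal on-screen gaze coordinates of the
    left and right eye.  Intersecting the rays gives
        z = screen * ipd / (ipd + disparity),
    where disparity > 0 means the gaze is crossed (stereo viewing).
    """
    disparity = gaze_left_x - gaze_right_x
    return screen_cm * ipd_cm / (ipd_cm + disparity)

# Both eyes on the same on-screen point: convergence lies at the screen.
normal = convergence_depth(6.3, 77.0, 0.0, 0.0)
# Crossed gaze (left eye looks to the right of the right eye's gaze
# point): the convergence point lands well in front of the screen.
crossed = convergence_depth(6.3, 77.0, 14.175, -14.175)
```

    The sanity checks mirror the abstract's conditions: zero disparity reproduces the viewing distance, and crossed gaze pulls the convergence point toward the viewer.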

  1. Eye-Tracking Study on Facial Emotion Recognition Tasks in Individuals with High-Functioning Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Tsang, Vicky

    2018-01-01

    The eye-tracking experiment was carried out to assess fixation duration and scan paths that individuals with and without high-functioning autism spectrum disorders employed when identifying simple and complex emotions. Participants viewed human photos of facial expressions and decided on the identification of emotion, the negative-positive emotion…

  2. Using Eye-Tracking in Applied Linguistics and Second Language Research

    ERIC Educational Resources Information Center

    Conklin, Kathy; Pellicer-Sánchez, Ana

    2016-01-01

    With eye-tracking technology the eye is thought to give researchers a window into the mind. Importantly, eye-tracking has significant advantages over traditional online processing measures: chiefly that it allows for more "natural" processing as it does not require a secondary task, and that it provides a very rich moment-to-moment data…

  3. Eye Tracking Outcomes in Tobacco Control Regulation and Communication: A Systematic Review.

    PubMed

    Meernik, Clare; Jarman, Kristen; Wright, Sarah Towner; Klein, Elizabeth G; Goldstein, Adam O; Ranney, Leah

    2016-10-01

    In this paper we synthesize the evidence from eye tracking research in tobacco control to inform tobacco regulatory strategies and tobacco communication campaigns. We systematically searched 11 databases for studies that reported eye tracking outcomes with regard to tobacco regulation and communication. Two coders independently reviewed studies for inclusion and abstracted study characteristics and findings. Eighteen studies met full criteria for inclusion. Eye tracking studies on health warnings consistently showed these warnings often were ignored, though eye tracking demonstrated that novel warnings, graphic warnings, and plain packaging can increase attention toward warnings. Eye tracking also revealed that greater visual attention to warnings on advertisements and packages consistently was associated with cognitive processing as measured by warning recall. Eye tracking is a valid indicator of attention, cognitive processing, and memory. The use of this technology in tobacco control research complements existing methods in tobacco regulatory and communication science; it also can be used to examine the effects of health warnings and other tobacco product communications on consumer behavior in experimental settings prior to the implementation of novel health communication policies. However, the utility of eye tracking will be enhanced by the standardization of methodology and reporting metrics.

  4. Eyes wide shut: implied social presence, eye tracking and attention.

    PubMed

    Risko, Evan F; Kingstone, Alan

    2011-02-01

    People often behave differently when they know they are being watched. Here, we report the first investigation of whether such social presence effects also influence looking behavior, a popular measure of attention allocation. We demonstrate that wearing an eye tracker, an implied social presence, leads individuals to avoid looking at particular stimuli. These results demonstrate that an implied social presence, here an eye tracker, can alter looking behavior. These data provide a new manipulation of social attention, as well as presenting a methodological challenge to researchers using eye tracking.

  5. Age differences in accuracy and choosing in eyewitness identification and face recognition.

    PubMed

    Searcy, J H; Bartlett, J C; Memon, A

    1999-05-01

    Studies of aging and face recognition show age-related increases in false recognitions of new faces. To explore implications of this false alarm effect, we had young and senior adults perform (1) three eyewitness identification tasks, using both target-present and target-absent lineups, and (2) an old/new recognition task in which a study list of faces was followed by a test including old and new faces, along with conjunctions of old faces. Compared with the young, seniors had lower accuracy and higher choosing rates on the lineups, and they also falsely recognized more new faces on the recognition test. However, after screening for perceptual processing deficits, there was no age difference in false recognition of conjunctions, or in discriminating old faces from conjunctions. We conclude that the false alarm effect generalizes to lineup identification, but does not extend to conjunction faces. The findings are consistent with age-related deficits in recollection of context and relative age invariance in perceptual integrative processes underlying the experience of familiarity.

  6. A pilot study of eye-tracking devices in intensive care.

    PubMed

    Garry, Jonah; Casey, Kelly; Cole, Therese Kling; Regensburg, Angela; McElroy, Colleen; Schneider, Eric; Efron, David; Chi, Albert

    2016-03-01

    Eye-tracking devices have been suggested as a means of improving communication and psychosocial status among patients in the intensive care unit (ICU). This study was undertaken to explore the psychosocial impact and communication effects of eye-tracking devices in the ICU. A convenience sample of patients in the medical ICU, surgical ICU, and neurosciences critical care unit were enrolled prospectively. Patients participated in 5 guided sessions of 45 minutes each with the eye-tracking computer. After completion of the sessions, the Psychosocial Impact of Assistive Devices Scale (PIADS) was used to evaluate the device from the patient's perspective. All patients who participated in the study were able to communicate basic needs to nursing staff and family. Delirium as assessed by the Confusion Assessment Method for the Intensive Care Unit was present in 4 patients at recruitment and none after training. The device's overall psychosocial impact ranged from neutral (-0.29) to strongly positive (2.76). Compared with the absence of intervention (0 = no change), patients exposed to eye-tracking computers demonstrated a positive mean overall impact score (PIADS = 1.30; P = .004). This finding was present in mean scores for each PIADS domain: competence = 1.26, adaptability = 1.60, and self-esteem = 1.02 (all P < .01). There is a population of patients in the ICU whose psychosocial status, delirium, and communication ability may be enhanced by eye-tracking devices. These 3 outcomes are intertwined with ICU patient outcomes and indirectly suggest that eye-tracking devices might improve outcomes. A more in-depth exploration of the population to be targeted, the device's limitations, and the benefits of eye-tracking devices in the ICU is warranted. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. The “eye avoidance” hypothesis of autism face processing

    PubMed Central

    Tanaka, James W.; Sung, Andrew

    2013-01-01

    Although a growing body of research indicates that children with autism spectrum disorder (ASD) exhibit selective deficits in their ability to recognize facial identities and expressions, the source of their face impairment is, as yet, undetermined. In this paper, we consider three possible accounts of the autism face deficit: 1) the holistic hypothesis, 2) the local perceptual bias hypothesis and 3) the eye avoidance hypothesis. A review of the literature indicates that contrary to the holistic hypothesis, there is little evidence to suggest that individuals with autism do not perceive faces holistically. The local perceptual bias account also fails to explain the selective advantage that ASD individuals demonstrate for objects and their selective disadvantage for faces. The eye avoidance hypothesis provides a plausible explanation of face recognition deficits where individuals with ASD avoid the eye region because it is perceived as socially threatening. Direct eye contact elicits a heightened physiological response as indicated by heightened skin conductance and increased amygdala activity. For individuals with autism, avoiding the eyes is an adaptive strategy; however, this approach interferes with the ability to decode facial cues of identity, expression, and intentions, exacerbating the social challenges for persons with ASD. PMID:24150885

  8. Age-related differences in memory expression during infancy: using eye-tracking to measure relational memory in 6- and 12-month-olds.

    PubMed

    Richmond, Jenny L; Power, Jessica

    2014-09-01

    Relational memory, or the ability to bind components of an event into a network of linked representations, is a primary function of the hippocampus. Here we extend eye-tracking research showing that infants are capable of forming memories for the relation between arbitrarily paired scenes and faces, by looking at age-related changes in relational memory over the first year of life. Six- and 12-month-old infants were familiarized with pairs of faces and scenes before being tested with arrays of three familiar faces that were presented on a familiar scene. Preferential looking at the face that matches the scene is typically taken as evidence of relational memory. The results showed that while 6-month-olds showed preferential looking when face/scene pairs were tested immediately, 12-month-olds did not exhibit evidence of relational memory either immediately or after a short delay. Theoretical implications for the functional development of the hippocampus and practical implications for the use of eye tracking to measure memory during early life are discussed. © 2014 Wiley Periodicals, Inc.

  9. Preserved search asymmetry in the detection of fearful faces among neutral faces in individuals with Williams syndrome revealed by measurement of both manual responses and eye tracking.

    PubMed

    Hirai, Masahiro; Muramatsu, Yukako; Mizuno, Seiji; Kurahashi, Naoko; Kurahashi, Hirokazu; Nakamura, Miho

    2017-01-01

    Individuals with Williams syndrome (WS) exhibit an atypical social phenotype termed hypersociability. One theory accounting for hypersociability presumes an atypical function of the amygdala, which processes fear-related information. However, evidence is lacking regarding the detection mechanisms of fearful faces in individuals with WS. Here, we introduce a visual search paradigm to elucidate the mechanisms for detecting fearful faces by evaluating search asymmetry, i.e., whether reaction times change when the roles of target and distractors are swapped. Eye movements can reveal subtle atypical attentional properties, whereas manual responses alone cannot capture atypical attentional profiles toward faces in individuals with WS. Therefore, we measured both eye movements and manual responses of individuals with WS and typically developed children and adults in visual searches for a fearful face among neutral faces or a neutral face among fearful faces. Two task measures, namely reaction time and performance accuracy, were analyzed for each stimulus, as well as gaze behavior and the initial fixation onset latency. Overall, reaction times in the WS group and the mentally age-matched control group were significantly longer than those in the chronologically age-matched group. We observed a search asymmetry effect in all groups: when a neutral target facial expression was presented among fearful faces, reaction times were significantly prolonged in comparison with when a fearful target facial expression was displayed among neutral distractor faces. Furthermore, the first fixation onset latency of eye movement toward a target facial expression showed a similar tendency to the manual responses. Although overall responses in detecting fearful faces by individuals with WS are slower than those of control groups, search asymmetry was observed. Therefore, cognitive mechanisms underlying the detection of fearful faces seem to be typical in individuals with WS.

  10. Eye Tracking Outcomes in Tobacco Control Regulation and Communication: A Systematic Review

    PubMed Central

    Meernik, Clare; Jarman, Kristen; Wright, Sarah Towner; Klein, Elizabeth G.; Goldstein, Adam O.; Ranney, Leah

    2016-01-01

    Objective: In this paper we synthesize the evidence from eye tracking research in tobacco control to inform tobacco regulatory strategies and tobacco communication campaigns. Methods: We systematically searched 11 databases for studies that reported eye tracking outcomes with regard to tobacco regulation and communication. Two coders independently reviewed studies for inclusion and abstracted study characteristics and findings. Results: Eighteen studies met full criteria for inclusion. Eye tracking studies on health warnings consistently showed these warnings often were ignored, though eye tracking demonstrated that novel warnings, graphic warnings, and plain packaging can increase attention toward warnings. Eye tracking also revealed that greater visual attention to warnings on advertisements and packages consistently was associated with cognitive processing as measured by warning recall. Conclusions: Eye tracking is a valid indicator of attention, cognitive processing, and memory. The use of this technology in tobacco control research complements existing methods in tobacco regulatory and communication science; it also can be used to examine the effects of health warnings and other tobacco product communications on consumer behavior in experimental settings prior to the implementation of novel health communication policies. However, the utility of eye tracking will be enhanced by the standardization of methodology and reporting metrics. PMID:27668270

  11. Attentional bias scores in patients with depression and effects of age: a controlled, eye-tracking study.

    PubMed

    Lu, Shengfu; Xu, Jiying; Li, Mi; Xue, Jia; Lu, Xiaofeng; Feng, Lei; Fu, Bingbing; Wang, Gang; Zhong, Ning; Hu, Bin

    2017-10-01

    Objective: To compare the attentional bias of depressed patients and non-depressed control subjects and examine the effects of age using eye-tracking technology in a free-viewing set of tasks. Methods: Patients with major depressive disorder (MDD) and non-depressed control subjects completed an eye-tracking task to assess attentional processing of negative, positive and neutral facial expressions. In this cross-sectional study, the tasks were separated into two types (neutral versus happy faces and neutral versus sad faces) and assessed in two age groups ('young' [18-30 years] and 'middle-aged' [31-55 years]). Results: Compared with non-depressed control subjects (n = 75), patients with MDD (n = 90) had significantly reduced positive attentional bias and enhanced negative attentional bias irrespective of age. The positive attentional bias in 'middle-aged' patients with MDD was significantly lower than in 'young' patients, although there was no difference between the two age groups in negative attentional bias. Conclusions: These results confirm that there are emotional attentional biases in patients with MDD and that positive attentional biases are influenced by age.

  12. Scanning mid-IR laser apparatus with eye tracking for refractive surgery

    NASA Astrophysics Data System (ADS)

    Telfair, William B.; Yoder, Paul R., Jr.; Bekker, Carsten; Hoffman, Hanna J.; Jensen, Eric F.

    1999-06-01

    A robust, real-time, dynamic eye tracker has been integrated with the short pulse mid-infrared laser scanning delivery system previously described. This system employs a Q-switched Nd:YAG laser pumped optical parametric oscillator operating at 2.94 micrometers. Previous ablation studies on human cadaver eyes and in-vivo cat eyes demonstrated very smooth ablations with extremely low damage levels similar to results with an excimer. A 4-month healing study with cats indicated no adverse healing effects. In order to treat human eyes, the tracker is required because the eyes move during the procedure due to both voluntary and involuntary motions such as breathing, heartbeat, drift, loss of fixation, saccades and microsaccades. Eye tracking techniques from the literature were compared. A limbus tracking system was best for this application. Temporal and spectral filtering techniques were implemented to reduce tracking errors, reject stray light, and increase the signal-to-noise ratio. The expanded-capability system (IRVision AccuScan 2000 Laser System) has been tested in the lab on simulated eye targets, glass eyes, cadaver eyes, and live human subjects. Circular targets ranging from 10-mm to 14-mm diameter were successfully tracked. The tracker performed beyond expectations while the system performed myopic photorefractive keratectomy procedures on several legally blind human subjects.

  13. Eye Contact Facilitates Awareness of Faces during Interocular Suppression

    ERIC Educational Resources Information Center

    Stein, Timo; Senju, Atsushi; Peelen, Marius V.; Sterzer, Philipp

    2011-01-01

    Eye contact captures attention and receives prioritized visual processing. Here we asked whether eye contact might be processed outside conscious awareness. Faces with direct and averted gaze were rendered invisible using interocular suppression. In two experiments we found that faces with direct gaze overcame such suppression more rapidly than…

  14. Video-Based Eye Tracking in Sex Research: A Systematic Literature Review.

    PubMed

    Wenzlaff, Frederike; Briken, Peer; Dekker, Arne

    2015-12-21

    Although eye tracking has been used for decades, it has gained popularity in the area of sex research only recently. The aim of this article is to examine the potential merits of eye tracking for this field. We present a systematic review of the current use of video-based eye-tracking technology in this area, evaluate the findings, and identify future research opportunities. A total of 34 relevant studies published between 2006 and 2014 were identified for inclusion by means of online databases and other methods. We grouped them into three main areas of research: body perception and attractiveness, forensic research, and sexual orientation. Despite the methodological and theoretical differences across the studies, eye tracking has been shown to be a promising tool for sex research. The article suggests there is much potential for further studies to employ this technique because it is noninvasive and yet still allows for the assessment of both conscious and unconscious perceptional processes. Furthermore, eye tracking can be implemented in investigations of various theoretical backgrounds, ranging from biology to the social sciences.

  15. Unaware person recognition from the body when face identification fails.

    PubMed

    Rice, Allyson; Phillips, P Jonathon; Natu, Vaidehi; An, Xiaobo; O'Toole, Alice J

    2013-11-01

    How does one recognize a person when face identification fails? Here, we show that people rely on the body but are unaware of doing so. State-of-the-art face-recognition algorithms were used to select images of people with almost no useful identity information in the face. Recognition of the face alone in these cases was near chance level, but recognition of the person was accurate. Accuracy in identifying the person without the face was identical to that in identifying the whole person. Paradoxically, people reported relying heavily on facial features over noninternal face and body features in making their identity decisions. Eye movements indicated otherwise, with gaze duration and fixations shifting adaptively toward the body and away from the face when the body was a better indicator of identity than the face. This shift occurred with no cost to accuracy or response time. Human identity processing may be partially inaccessible to conscious awareness.

  16. Optimal Eye-Gaze Fixation Position for Face-Related Neural Responses

    PubMed Central

    Zerouali, Younes; Lina, Jean-Marc; Jemel, Boutheina

    2013-01-01

    It is generally agreed that some features of a face, namely the eyes, are more salient than others as indexed by behavioral diagnosticity, gaze-fixation patterns and evoked-neural responses. However, because previous studies used unnatural stimuli, there is no evidence so far that the early encoding of a whole face in the human brain is based on the eyes or other facial features. To address this issue, scalp electroencephalogram (EEG) and eye gaze-fixations were recorded simultaneously in a gaze-contingent paradigm while observers viewed faces. We found that the N170 indexing the earliest face-sensitive response in the human brain was largest when the fixation position was located around the nasion. Interestingly, for inverted faces, this optimal fixation position was more variable, but mainly clustered in the upper part of the visual field (around the mouth). These observations extend the findings of recent behavioral studies, suggesting that the early encoding of a face, as indexed by the N170, is not driven by the eyes per se, but rather arises from a general perceptual setting (upper-visual field advantage) coupled with the alignment of a face stimulus to a stored face template. PMID:23762224

  17. Optimal eye-gaze fixation position for face-related neural responses.

    PubMed

    Zerouali, Younes; Lina, Jean-Marc; Jemel, Boutheina

    2013-01-01

    It is generally agreed that some features of a face, namely the eyes, are more salient than others as indexed by behavioral diagnosticity, gaze-fixation patterns and evoked-neural responses. However, because previous studies used unnatural stimuli, there is no evidence so far that the early encoding of a whole face in the human brain is based on the eyes or other facial features. To address this issue, scalp electroencephalogram (EEG) and eye gaze-fixations were recorded simultaneously in a gaze-contingent paradigm while observers viewed faces. We found that the N170 indexing the earliest face-sensitive response in the human brain was largest when the fixation position was located around the nasion. Interestingly, for inverted faces, this optimal fixation position was more variable, but mainly clustered in the upper part of the visual field (around the mouth). These observations extend the findings of recent behavioral studies, suggesting that the early encoding of a face, as indexed by the N170, is not driven by the eyes per se, but rather arises from a general perceptual setting (upper-visual field advantage) coupled with the alignment of a face stimulus to a stored face template.

  18. Etracker: A Mobile Gaze-Tracking System with Near-Eye Display Based on a Combined Gaze-Tracking Algorithm.

    PubMed

    Li, Bin; Fu, Hong; Wen, Desheng; Lo, WaiLun

    2018-05-19

    Eye tracking technology has become increasingly important for psychological analysis, medical diagnosis, driver assistance systems, and many other applications. Various gaze-tracking models have been established by previous researchers. However, there is currently no near-eye display system with accurate gaze-tracking performance and a convenient user experience. In this paper, we constructed a complete prototype of the mobile gaze-tracking system 'Etracker' with a near-eye viewing device for human gaze tracking. We proposed a combined gaze-tracking algorithm in which a convolutional neural network is used to remove blinking images and predict coarse gaze position, and a geometric model is then defined for accurate human gaze tracking. Moreover, we proposed using the mean value of gazes to resolve pupil center changes caused by nystagmus in the calibration algorithm, so that an individual user only needs to calibrate the first time, which makes our system more convenient. The experiments on gaze data from 26 participants show that the eye center detection accuracy is 98% and that Etracker can provide an average gaze accuracy of 0.53° at a rate of 30–60 Hz.
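
    The calibration idea named in the abstract — averaging gaze samples so that nystagmus-induced pupil-center jitter cancels out — can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; the function name and coordinate values are invented:

```python
from statistics import mean

def stable_pupil_center(samples):
    """Average pupil-center samples recorded while the user fixates one
    calibration target. Averaging suppresses the small involuntary
    oscillations (nystagmus) that shift the instantaneous pupil center."""
    xs, ys = zip(*samples)
    return (mean(xs), mean(ys))

# Noisy fixation samples on a target nominally at (100.0, 50.0)
samples = [(99.5, 50.5), (100.5, 49.5), (100.25, 50.25), (99.75, 49.75)]
center = stable_pupil_center(samples)
print(center)  # → (100.0, 50.0)
```

    With one stable center stored per calibration target, later sessions can reuse the mapping, which is what lets a user calibrate only once.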

  19. Eye Tracking Dysfunction in Schizophrenia: Characterization and Pathophysiology

    PubMed Central

    Sereno, Anne B.; Gooding, Diane C.; O’Driscoll, Gillian A.

    2011-01-01

    Eye tracking dysfunction (ETD) is one of the most widely replicated behavioral deficits in schizophrenia and is over-represented in clinically unaffected first-degree relatives of schizophrenia patients. Here, we provide an overview of research relevant to the characterization and pathophysiology of this impairment. Deficits are most robust in the maintenance phase of pursuit, particularly during the tracking of predictable target movement. Impairments are also found in pursuit initiation and correlate with performance on tests of motion processing, implicating early sensory processing of motion signals. Taken together, the evidence suggests that ETD involves higher-order structures, including the frontal eye fields, which adjust the gain of the pursuit response to visual and anticipated target movement, as well as early parts of the pursuit pathway, including motion areas (the middle temporal area and the adjacent medial superior temporal area). Broader application of localizing behavioral paradigms in patient and family studies would be advantageous for refining the eye tracking phenotype for genetic studies. PMID:21312405

  20. How children with specific language impairment view social situations: an eye tracking study.

    PubMed

    Hosozawa, Mariko; Tanaka, Kyoko; Shimizu, Toshiaki; Nakano, Tamami; Kitazawa, Shigeru

    2012-06-01

    Children with specific language impairment (SLI) face risks for social difficulties. However, the nature and developmental course of these difficulties remain unclear. Gaze behaviors have been studied by using eye tracking among those with autism spectrum disorders (ASDs). Using this method, we compared the gaze behaviors of children with SLI with those of individuals with ASD and typically developing (TD) children to explore the social perception of children with SLI. The eye gazes of 66 children (16 with SLI, 25 with ASD, and 25 TD) were studied while viewing videos of social interactions. Gaze behaviors were summarized with multidimensional scaling, and participants with similar gaze behaviors were represented proximally in a 2-dimensional plane. The SLI and TD groups each formed a cluster near the center of the multidimensional scaling plane, whereas the ASD group was distributed around the periphery. Frame-by-frame analyses showed that children with SLI and TD children viewed faces in a manner consistent with the story line, but children with ASD devoted less attention to faces and social interactions. During speech scenes, children with SLI were significantly more fixated on the mouth, whereas TD children viewed the eyes and the mouth. Children with SLI viewed social situations in ways similar to those of TD children but different from those of children with ASD. However, children with SLI concentrated on the speaker's mouth, possibly to compensate for audiovisual processing deficits. Because eyes carry important information, this difference may influence the social development of children with SLI.

  1. Children with Autism Spectrum Disorder scan own-race faces differently from other-race faces.

    PubMed

    Yi, Li; Quinn, Paul C; Fan, Yuebo; Huang, Dan; Feng, Cong; Joseph, Lisa; Li, Jiao; Lee, Kang

    2016-01-01

    It has been well documented that people recognize and scan other-race faces differently from faces of their own race. The current study examined whether this cross-racial difference in face processing found in the typical population also exists in individuals with Autism Spectrum Disorder (ASD). Participants included 5- to 10-year-old children with ASD (n=29), typically developing (TD) children matched on chronological age (n=29), and TD children matched on nonverbal IQ (n=29). Children completed a face recognition task in which they were asked to memorize and recognize both own- and other-race faces while their eye movements were tracked. We found no recognition advantage for own-race faces relative to other-race faces in any of the three groups. However, eye-tracking results indicated that, similar to TD children, children with ASD exhibited a cross-racial face-scanning pattern: they looked at the eyes of other-race faces longer than at those of own-race faces, whereas they looked at the mouth of own-race faces longer than at that of other-race faces. The findings suggest that although children with ASD have difficulty with processing some aspects of faces, their ability to process face race information is relatively spared. Copyright © 2015 Elsevier Inc. All rights reserved.
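
    Eye-tracking comparisons like the one above reduce to summing fixation durations inside rectangular areas of interest (AOIs) such as the eyes and mouth. A minimal sketch; the AOI boxes and fixation data are made up, since the study's actual AOI definitions are not given in the abstract:

```python
def dwell_proportions(fixations, aois):
    """Proportion of total fixation time spent in each area of interest.

    fixations: list of (x, y, duration_ms) tuples.
    aois: dict mapping AOI name -> (x0, y0, x1, y1) bounding box.
    Fixations landing outside every AOI still count toward total time.
    """
    total = sum(d for _, _, d in fixations)
    props = {name: 0.0 for name in aois}
    for x, y, dur in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                props[name] += dur
                break
    return {name: t / total for name, t in props.items()}

aois = {"eyes": (20, 10, 80, 30), "mouth": (35, 60, 65, 80)}
fixations = [(50, 20, 300), (40, 70, 200), (50, 90, 100)]  # last is off-AOI
props = dwell_proportions(fixations, aois)
print(props)  # eyes get 0.5 of total time, mouth ≈ 0.33
```

    Comparing such proportions between own-race and other-race faces, per group, yields the cross-racial scanning patterns the study reports.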

  2. Reading Mathematics Representations: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Andrá, Chiara; Lindström, Paulina; Arzarello, Ferdinando; Holmqvist, Kenneth; Robutti, Ornella; Sabena, Cristina

    2015-01-01

    We use eye tracking as a method to examine how different mathematical representations of the same mathematical object are attended to by students. The results of this study show that there is a meaningful difference in the eye movements between formulas and graphs. This difference can be understood in terms of the cultural and social shaping of…

  3. Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI

    DTIC Science & Technology

    2017-06-01

    Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI. Principal investigator report, Award Number W81XWH-13-1-0095; reporting period 08 MAR 2016 – 07 MAR 2017. Supporting data: none.

  4. Eye-Tracking Evidence that Happy Faces Impair Verbal Message Comprehension: The Case of Health Warnings in Direct-to-Consumer Pharmaceutical Television Commercials

    PubMed Central

    Russell, Cristel Antonia; Swasy, John L.; Russell, Dale Wesley; Engel, Larry

    2017-01-01

    Risk warning or disclosure information in advertising is only effective in correcting consumers’ judgments if enough cognitive capacity is available to process that information. Hence, comprehension of verbal warnings in TV commercials may suffer if accompanied by positive visual elements. This research addresses this concern about cross-modality interference in the context of direct-to-consumer (DTC) pharmaceutical commercials in the United States by experimentally testing whether positive facial expressions reduce consumers’ understanding of the mandated health warning. A content analysis of a sample of DTC commercials reveals that positive facial expressions are more prevalent during the verbal warning act of the commercials than during the other acts. An eye-tracking experiment conducted with specially produced DTC commercials, which vary the valence of characters’ facial expressions during the health warning, provides evidence that happy faces reduce objective comprehension of the warning. PMID:29269979

  5. Eye and Face Protection in School Science

    ERIC Educational Resources Information Center

    Kaufman, Jim

    2006-01-01

    Choosing what eye and face protection to provide for the high school science laboratory is often a challenge. Science teachers and school administrators may not fully understand the relevant safety regulations and standards or be able to correctly identify the various types of eye protection devices. Although some schools have received training…

  6. Efficient visual information for unfamiliar face matching despite viewpoint variations: It's not in the eyes!

    PubMed

    Royer, Jessica; Blais, Caroline; Barnabé-Lortie, Vincent; Carré, Mélissa; Leclerc, Josiane; Fiset, Daniel

    2016-06-01

    Faces are encountered in highly diverse angles in real-world settings. Despite this considerable diversity, most individuals are able to easily recognize familiar faces. The vast majority of studies in the field of face recognition have nonetheless focused almost exclusively on frontal views of faces. Indeed, a number of authors have investigated the diagnostic facial features for the recognition of frontal views of faces previously encoded in this same view. However, the nature of the information useful for identity matching when the encoded face and test face differ in viewing angle remains mostly unexplored. The present study addresses this issue using individual differences and bubbles, a method that pinpoints the facial features effectively used in a visual categorization task. Our results indicate that the use of features located in the center of the face, the lower left portion of the nose area and the center of the mouth, are significantly associated with individual efficiency to generalize a face's identity across different viewpoints. However, as faces become more familiar, the reliance on this area decreases, while the diagnosticity of the eye region increases. This suggests that a certain distinction can be made between the visual mechanisms subtending viewpoint invariance and face recognition in the case of unfamiliar face identification. Our results further support the idea that the eye area may only come into play when the face stimulus is particularly familiar to the observer. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Eye Gaze during Observation of Static Faces in Deaf People

    PubMed Central

    Watanabe, Katsumi; Matsuda, Tetsuya; Nishioka, Tomoyuki; Namatame, Miki

    2011-01-01

    Knowing where people look when viewing faces provides an objective measure into the part of information entering the visual system as well as into the cognitive strategy involved in facial perception. In the present study, we recorded the eye movements of 20 congenitally deaf (10 male and 10 female) and 23 (11 male and 12 female) normal-hearing Japanese participants while they evaluated the emotional valence of static face stimuli. While no difference was found in the evaluation scores, the eye movements during facial observations differed among participant groups. The deaf group looked at the eyes more frequently and for longer duration than the nose whereas the hearing group focused on the nose (or the central region of face) more than the eyes. These results suggest that the strategy employed to extract visual information when viewing static faces may differ between deaf and hearing people. PMID:21359223

  8. Eye/head tracking technology to improve HCI with iPad applications.

    PubMed

    Lopez-Basterretxea, Asier; Mendez-Zorrilla, Amaia; Garcia-Zapirain, Begoña

    2015-01-22

    In order to improve human computer interaction (HCI) for people with special needs, this paper presents an alternative form of interaction, which uses the iPad's front camera and eye/head tracking technology. With this capability operating in the background, the user can control already developed or new iPad applications by moving their eyes and/or head. Many techniques are currently used to detect facial features such as the eyes or even the face itself. Open-source libraries exist for this purpose, such as OpenCV, which enable very reliable and accurate detection algorithms, such as Haar cascades, to be applied using very high-level programming. All processing is undertaken in real time, so it is important to pay close attention to the limited resources (processing capacity) of devices such as the iPad. The system was validated in tests involving 22 users of different ages and characteristics (people with dark and light-colored eyes and with/without glasses). These tests assessed user/device interaction and ascertained whether the system works properly. The system obtained an accuracy of between 60% and 100% in the three test exercises taken into consideration. The results showed that the Haar cascade was highly effective, detecting faces in 100% of cases, unlike the eyes and pupil, where interference (light and shade) reduced effectiveness. In addition to ascertaining the effectiveness of the system via these exercises, the demo application has also helped to show that user constraints need not affect the enjoyment and use of a particular type of technology. In short, the results obtained are encouraging, and these systems may continue to be developed if extended and updated in the future.

  9. Eye/Head Tracking Technology to Improve HCI with iPad Applications

    PubMed Central

    Lopez-Basterretxea, Asier; Mendez-Zorrilla, Amaia; Garcia-Zapirain, Begoña

    2015-01-01

    In order to improve human computer interaction (HCI) for people with special needs, this paper presents an alternative form of interaction, which uses the iPad's front camera and eye/head tracking technology. With this capability operating in the background, the user can control already developed or new iPad applications by moving their eyes and/or head. Many techniques are currently used to detect facial features such as the eyes or even the face itself. Open-source libraries exist for this purpose, such as OpenCV, which enable very reliable and accurate detection algorithms, such as Haar cascades, to be applied using very high-level programming. All processing is undertaken in real time, so it is important to pay close attention to the limited resources (processing capacity) of devices such as the iPad. The system was validated in tests involving 22 users of different ages and characteristics (people with dark and light-colored eyes and with/without glasses). These tests assessed user/device interaction and ascertained whether the system works properly. The system obtained an accuracy of between 60% and 100% in the three test exercises taken into consideration. The results showed that the Haar cascade was highly effective, detecting faces in 100% of cases, unlike the eyes and pupil, where interference (light and shade) reduced effectiveness. In addition to ascertaining the effectiveness of the system via these exercises, the demo application has also helped to show that user constraints need not affect the enjoyment and use of a particular type of technology. In short, the results obtained are encouraging, and these systems may continue to be developed if extended and updated in the future. PMID:25621603

  10. Eye coding mechanisms in early human face event-related potentials.

    PubMed

    Rousselet, Guillaume A; Ince, Robin A A; van Rijsbergen, Nicola J; Schyns, Philippe G

    2014-11-10

    In humans, the N170 event-related potential (ERP) is an integrated measure of cortical activity that varies in amplitude and latency across trials. Researchers often conjecture that N170 variations reflect cortical mechanisms of stimulus coding for recognition. Here, to settle the conjecture and understand cortical information processing mechanisms, we unraveled the coding function of N170 latency and amplitude variations in possibly the simplest socially important natural visual task: face detection. On each experimental trial, 16 observers saw face and noise pictures sparsely sampled with small Gaussian apertures. Reverse-correlation methods coupled with information theory revealed that the presence of the eye specifically covaries with behavioral and neural measurements: the left eye strongly modulates reaction times and lateral electrodes represent mainly the presence of the contralateral eye during the rising part of the N170, with maximum sensitivity before the N170 peak. Furthermore, single-trial N170 latencies code more about the presence of the contralateral eye than N170 amplitudes and early latencies are associated with faster reaction times. The absence of these effects in control images that did not contain a face refutes alternative accounts based on retinal biases or allocation of attention to the eye location on the face. We conclude that the rising part of the N170, roughly 120-170 ms post-stimulus, is a critical time-window in human face processing mechanisms, reflecting predominantly, in a face detection task, the encoding of a single feature: the contralateral eye. © 2014 ARVO.

  11. Eye Movements during Multiple Object Tracking: Where Do Participants Look?

    ERIC Educational Resources Information Center

    Fehd, Hilda M.; Seiffert, Adriane E.

    2008-01-01

    Similar to the eye movements you might make when viewing a sports game, this experiment investigated where participants tend to look while keeping track of multiple objects. While eye movements were recorded, participants tracked either 1 or 3 of 8 red dots that moved randomly within a square box on a black background. Results indicated that…

  12. Real-time eye tracking for the assessment of driver fatigue.

    PubMed

    Xu, Junli; Min, Jianliang; Hu, Jianfeng

    2018-04-01

    Eye-tracking is an important approach to collecting evidence regarding participants' driving fatigue. In this contribution, the authors present a non-intrusive system for evaluating driver fatigue by tracking eye movement behaviours. A real-time eye-tracker was used to monitor participants' eye state and collect eye-movement data. These data provide insights for assessing participants' fatigue state during monotonous driving. In this study, ten healthy subjects performed continuous simulated driving for 1-2 h with eye state monitoring on a driving simulator, and the fixation duration and pupil area were recorded with an eye-movement tracking device. To achieve a good cost-performance ratio and fast computation time, the fuzzy K-nearest neighbour was employed to evaluate and analyse the influence of different participants on the variations in drivers' fixation duration and pupil area. The findings of this study indicate that there are significant differences in the domain-value distribution of the pupil area between the normal and fatigued driving states. Results also suggest that the recognition accuracy by jackknife validation reaches about 89% on average, demonstrating significant potential for real-time application of the proposed approach and its capability to detect driver fatigue.
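
    Fuzzy K-nearest-neighbour classification, as named above, replaces the hard majority vote of plain k-NN with distance-weighted class memberships. A toy sketch under the assumption of two features (fixation duration, pupil area); the training values are invented and the paper's actual feature scaling and membership scheme may differ:

```python
import math

def fuzzy_knn(train, query, k=3, m=2):
    """Fuzzy k-NN (in the spirit of Keller et al.): return class
    memberships in [0, 1], weighted by inverse distance, instead of a
    hard vote. train: list of ((fixation_s, pupil_mm2), label) pairs."""
    nearest = sorted((math.dist(x, query), label) for x, label in train)[:k]
    exp = 2 / (m - 1)                       # fuzzifier-controlled weighting
    weights = {}
    for d, label in nearest:
        w = 1 / (d ** exp + 1e-12)          # guard against exact matches
        weights[label] = weights.get(label, 0.0) + w
    total = sum(weights.values())
    return {label: w / total for label, w in weights.items()}

# Toy features: (mean fixation duration in s, pupil area in mm^2)
train = [((0.20, 12.0), "alert"), ((0.25, 11.5), "alert"),
         ((0.60, 16.0), "fatigued"), ((0.55, 15.5), "fatigued")]
memberships = fuzzy_knn(train, (0.58, 15.8))
print(max(memberships, key=memberships.get))  # → fatigued
```

    Leave-one-out (jackknife) validation, as used in the paper, would simply call this classifier once per subject with that subject's data held out.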

  13. Real-time eye tracking for the assessment of driver fatigue

    PubMed Central

    Xu, Junli; Min, Jianliang; Hu, Jianfeng

    2018-01-01

    Eye-tracking is an important approach to collecting evidence regarding participants’ driving fatigue. In this contribution, the authors present a non-intrusive system for evaluating driver fatigue by tracking eye movement behaviours. A real-time eye-tracker was used to monitor participants’ eye state and collect eye-movement data. These data provide insights for assessing participants’ fatigue state during monotonous driving. In this study, ten healthy subjects performed continuous simulated driving for 1–2 h with eye state monitoring on a driving simulator, and the fixation duration and pupil area were recorded with an eye-movement tracking device. To achieve a good cost-performance ratio and fast computation time, the fuzzy K-nearest neighbour was employed to evaluate and analyse the influence of different participants on the variations in drivers’ fixation duration and pupil area. The findings of this study indicate that there are significant differences in the domain-value distribution of the pupil area between the normal and fatigued driving states. Results also suggest that the recognition accuracy by jackknife validation reaches about 89% on average, demonstrating significant potential for real-time application of the proposed approach and its capability to detect driver fatigue. PMID:29750113

  14. Screening for Dyslexia Using Eye Tracking during Reading.

    PubMed

    Nilsson Benfatto, Mattias; Öqvist Seimyr, Gustaf; Ygge, Jan; Pansell, Tony; Rydberg, Agneta; Jacobson, Christer

    2016-01-01

    Dyslexia is a neurodevelopmental reading disability estimated to affect 5-10% of the population. While there is yet no full understanding of the cause of dyslexia, or agreement on its precise definition, it is certain that many individuals suffer persistent problems in learning to read for no apparent reason. Although it is generally agreed that early intervention is the best form of support for children with dyslexia, there is still a lack of efficient and objective means to help identify those at risk during the early years of school. Here we show that it is possible to identify 9-10 year old individuals at risk of persistent reading difficulties by using eye tracking during reading to probe the processes that underlie reading ability. In contrast to current screening methods, which rely on oral or written tests, eye tracking does not depend on the subject to produce some overt verbal response and thus provides a natural means to objectively assess the reading process as it unfolds in real-time. Our study is based on a sample of 97 high-risk subjects with early identified word decoding difficulties and a control group of 88 low-risk subjects. These subjects were selected from a larger population of 2165 school children attending second grade. Using predictive modeling and statistical resampling techniques, we develop classification models from eye tracking records less than one minute in duration and show that the models are able to differentiate high-risk subjects from low-risk subjects with high accuracy. Although dyslexia is fundamentally a language-based learning disability, our results suggest that eye movements in reading can be highly predictive of individual reading ability and that eye tracking can be an efficient means to identify children at risk of long-term reading difficulties.

  15. Objective assessment of the contribution of dental esthetics and facial attractiveness in men via eye tracking.

    PubMed

    Baker, Robin S; Fields, Henry W; Beck, F Michael; Firestone, Allen R; Rosenstiel, Stephen F

    2018-04-01

    Recently, greater emphasis has been placed on smile esthetics in dentistry. Eye tracking has been used to objectively evaluate attention to the dentition (mouth) in female models with different levels of dental esthetics quantified by the aesthetic component of the Index of Orthodontic Treatment Need (IOTN). This has not been accomplished in men. Our objective was to determine the visual attention to the mouth in men with different levels of dental esthetics (IOTN levels) and background facial attractiveness, for both male and female raters, using eye tracking. Facial images of men rated as unattractive, average, and attractive were digitally manipulated and paired with validated oral images, IOTN levels 1 (no treatment need), 7 (borderline treatment need), and 10 (definite treatment need). Sixty-four raters meeting the inclusion criteria were included in the data analysis. Each rater was calibrated in the eye tracker and randomly viewed the composite images for 3 seconds, twice for reliability. Reliability was good or excellent (intraclass correlation coefficients, 0.6-0.9). Significant interactions were observed with factorial repeated-measures analysis of variance and the Tukey-Kramer method for density and duration of fixations in the interactions of model facial attractiveness by area of the face (P <0.0001, P <0.0001, respectively), dental esthetics (IOTN) by area of the face (P <0.0001, P <0.0001, respectively), and rater sex by area of the face (P = 0.0166, P = 0.0290, respectively). For area by facial attractiveness, the hierarchy of visual attention in unattractive and attractive models was eye, mouth, and nose, but for men of average attractiveness, it was mouth, eye, and nose. For dental esthetics by area, at IOTN 7, the mouth had significantly more visual attention than it did at IOTN 1 and significantly more than the nose. At IOTN 10, the mouth received significantly more attention than at IOTN 7 and surpassed the nose and eye…

  16. An eye-tracking paradigm for analyzing the processing time of sentences with different linguistic complexities.

    PubMed

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics.
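
    The analysis described — an average tendency-to-fixate curve across trials and a "decision moment" where that tendency crosses a criterion — can be sketched as follows. The threshold and the trial data are invented, and the paper's actual estimator may differ:

```python
def decision_moment(trials, threshold=0.75):
    """Average tendency to fixate the target picture over time, plus the
    first sample at which it crosses `threshold` -- a simplified
    stand-in for the paradigm's decision moment.

    trials: list of equal-length sequences of 1 (target fixated) / 0.
    Returns (curve, index); index is None if the threshold is never met.
    """
    n = len(trials)
    curve = [sum(t[i] for t in trials) / n for i in range(len(trials[0]))]
    for i, p in enumerate(curve):
        if p >= threshold:
            return curve, i
    return curve, None

trials = [
    [0, 0, 1, 1, 1],   # each row: one sentence presentation
    [0, 1, 1, 1, 1],
    [0, 0, 0, 1, 1],
    [1, 0, 1, 1, 1],
]
curve, idx = decision_moment(trials)
print(curve, idx)  # → [0.25, 0.25, 0.75, 1.0, 1.0] 2
```

    Comparing the crossing index between sentence structures is then a direct way to express processing-time differences as a function of linguistic complexity.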

  17. An Eye-Tracking Paradigm for Analyzing the Processing Time of Sentences with Different Linguistic Complexities

    PubMed Central

    Wendt, Dorothea; Brand, Thomas; Kollmeier, Birger

    2014-01-01

    An eye-tracking paradigm was developed for use in audiology in order to enable online analysis of the speech comprehension process. This paradigm should be useful in assessing impediments in speech processing. In this paradigm, two scenes, a target picture and a competitor picture, were presented simultaneously with an aurally presented sentence that corresponded to the target picture. At the same time, eye fixations were recorded using an eye-tracking device. The effect of linguistic complexity on language processing time was assessed from eye fixation information by systematically varying linguistic complexity. This was achieved with a sentence corpus containing seven German sentence structures. A novel data analysis method computed the average tendency to fixate the target picture as a function of time during sentence processing. This allowed identification of the point in time at which the participant understood the sentence, referred to as the decision moment. Systematic differences in processing time were observed as a function of linguistic complexity. These differences in processing time may be used to assess the efficiency of cognitive processes involved in resolving linguistic complexity. Thus, the proposed method enables a temporal analysis of the speech comprehension process and has potential applications in speech audiology and psychoacoustics. PMID:24950184

  18. Real-time eye motion correction in phase-resolved OCT angiography with tracking SLO

    PubMed Central

    Braaf, Boy; Vienola, Kari V.; Sheehy, Christy K.; Yang, Qiang; Vermeer, Koenraad A.; Tiruveedhula, Pavan; Arathorn, David W.; Roorda, Austin; de Boer, Johannes F.

    2012-01-01

    In phase-resolved OCT angiography blood flow is detected from phase changes in between A-scans that are obtained from the same location. In ophthalmology, this technique is vulnerable to eye motion. We address this problem by combining inter-B-scan phase-resolved OCT angiography with real-time eye tracking. A tracking scanning laser ophthalmoscope (TSLO) at 840 nm provided eye tracking functionality and was combined with a phase-stabilized optical frequency domain imaging (OFDI) system at 1040 nm. Real-time eye tracking corrected eye drift and prevented discontinuity artifacts from (micro)saccadic eye motion in OCT angiograms. This improved the OCT spot stability on the retina and consequently reduced the phase-noise, thereby enabling the detection of slower blood flows by extending the inter-B-scan time interval. In addition, eye tracking enabled the easy compounding of multiple data sets from the fovea of a healthy volunteer to create high-quality eye motion artifact-free angiograms. High-quality images are presented of two distinct layers of vasculature in the retina and the dense vasculature of the choroid. Additionally we present, for the first time, a phase-resolved OCT angiogram of the mesh-like network of the choriocapillaris containing typical pore openings. PMID:23304647

  19. EEG and Eye Tracking Demonstrate Vigilance Enhancement with Challenge Integration

    PubMed Central

    Bodala, Indu P.; Li, Junhua; Thakor, Nitish V.; Al-Nashash, Hasan

    2016-01-01

    Maintaining vigilance is possibly the first requirement for surveillance tasks where personnel are faced with monotonous yet intensive monitoring tasks. Decrement in vigilance in such situations could result in dangerous consequences such as accidents, loss of life and system failure. In this paper, we investigate the possibility of enhancing vigilance or sustained attention using “challenge integration,” a strategy that integrates a primary task with challenging stimuli. A primary surveillance task (identifying an intruder in a simulated factory environment) and a challenge stimulus (periods of rain obscuring the surveillance scene) were employed to test the changes in vigilance levels. The effect of integrating challenging events (resulting from artificially simulated rain) into the task was compared to the initial monotonous phase. EEG and eye tracking data were collected and analyzed for n = 12 subjects. Frontal midline theta power and the frontal theta to parietal alpha power ratio, which are used as measures of engagement and attention allocation, show an increase due to challenge integration (p < 0.05 in each case). Relative delta band power of EEG also shows statistically significant suppression on the frontoparietal and occipital cortices due to challenge integration (p < 0.05). Saccade amplitude, saccade velocity and blink rate obtained from eye tracking data exhibit statistically significant changes during the challenge phase of the experiment (p < 0.05 in each case). From the correlation analysis between the statistically significant measures of eye tracking and EEG, we infer that saccade amplitude and saccade velocity decrease with vigilance decrement along with frontal midline theta and the frontal theta to parietal alpha ratio. Conversely, blink rate and relative delta power increase with vigilance decrement. However, these measures exhibit a reverse trend when the challenge stimulus appears in the task, suggesting vigilance enhancement. Moreover, the mean…
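
    Band-power measures such as frontal-midline theta and the theta/alpha ratio reduce to summing spectral power over frequency bands. A self-contained sketch using a naive DFT on synthetic data; this is illustrative only (real EEG pipelines typically use Welch's method or multitapers, and the electrodes, bands, and signal here are invented):

```python
import math

def band_power(signal, fs, lo, hi):
    """Power in the [lo, hi] Hz band via a naive DFT.

    O(N^2), so only suitable for short epochs, but dependency-free."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if lo <= f <= hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            power += (re * re + im * im) / n
    return power

fs = 128
t = [i / fs for i in range(fs)]          # one second of synthetic "EEG"
eeg = [math.sin(2 * math.pi * 6 * x) + 0.3 * math.sin(2 * math.pi * 10 * x)
       for x in t]                        # strong 6 Hz theta, weak 10 Hz alpha

theta = band_power(eeg, fs, 4, 8)        # frontal-midline theta band
alpha = band_power(eeg, fs, 8, 13)       # parietal alpha band
print(theta > alpha)  # → True
```

    The engagement index used in the paper is then just a ratio of two such quantities computed from different electrode groups.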

  20. Long-range eye tracking: A feasibility study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jayaweera, S.K.; Lu, Shin-yee

    1994-08-24

    The design considerations for a long-range, Purkinje-effect-based video tracking system using current technology are presented. Past work, current experiments, and future directions are thoroughly discussed, with an emphasis on digital signal processing techniques and obstacles. It has been determined that while a robust, efficient, long-range, and non-invasive eye tracking system will be difficult to develop, such a project is indeed feasible.

  1. Vestibulo-Cervico-Ocular Responses and Tracking Eye Movements after Prolonged Exposure to Microgravity

    NASA Technical Reports Server (NTRS)

    Kornilova, L. N.; Naumov, I. A.; Azarov, K. A.; Sagalovitch, S. V.; Reschke, Millard F.; Kozlovskaya, I. B.

    2007-01-01

    The vestibular function and tracking eye movements were investigated in 12 Russian crew members of ISS missions on days 1(2), 4(5-6), and 8(9-10) after prolonged exposure to microgravity (126 to 195 days). The spontaneous oculomotor activity, static torsional otolith-cervico-ocular reflex, dynamic vestibulo-cervico-ocular responses, vestibular reactivity, tracking eye movements, and gaze-holding were studied using videooculography (VOG) and electrooculography (EOG) for parallel eye movement recording. On post-flight days 1-2 (R+1-2) some cosmonauts demonstrated: - an increased spontaneous oculomotor activity (floating eye movements, spontaneous nystagmus of the typical and atypical form, square wave jerks, gaze nystagmus) with the head held in the vertical position; - suppressed otolith function (absent or reduced by one half amplitude of torsional compensatory eye counter-rolling) with the head inclined statically right- or leftward by 30°; - increased vestibular reactivity (lowered threshold and increased intensity of the vestibular nystagmus) during head turns around the longitudinal body axis at 0.125 Hz; - a significant change in the accuracy, velocity, and temporal characteristics of the eye tracking. The pattern, depth, dynamics, and velocity of the vestibular function and tracking eye movements recovery varied with individual participants in the investigation. However, there were also regular responses during readaptation to the normal gravity: - suppression of the otolith function was typically accompanied by an exaggerated vestibular reactivity; - the structure of visual tracking (the accuracy of fixational eye rotations, smooth tracking, and gaze-holding) was disturbed (the appearance of correcting saccades, the transition of smooth tracking to saccadic tracking) only in those cosmonauts who, in parallel to an increased reactivity of the vestibular input, also had central changes in the oculomotor system (spontaneous nystagmus, gaze nystagmus).

  2. Neural correlates of the eye dominance effect in human face perception: the left-visual-field superiority for faces revisited.

    PubMed

    Jung, Wookyoung; Kang, Joong-Gu; Jeon, Hyeonjin; Shim, Miseon; Sun Kim, Ji; Leem, Hyun-Sung; Lee, Seung-Hwan

    2017-08-01

    Faces are processed best when they are presented in the left visual field (LVF), a phenomenon known as LVF superiority. Although one eye contributes more when perceiving faces, it is unclear how the dominant eye (DE), the eye we unconsciously use when performing a monocular task, affects face processing. Here, we examined the influence of the DE on the LVF superiority for faces using event-related potentials. Twenty left-eye-dominant (LDE group) and 23 right-eye-dominant (RDE group) participants performed the experiments. Face stimuli were randomly presented in the LVF or right visual field (RVF). The RDE group exhibited significantly larger N170 amplitudes compared with the LDE group. Faces presented in the LVF elicited N170 amplitudes that were significantly more negative in the RDE group than they were in the LDE group, whereas the amplitudes elicited by stimuli presented in the RVF were equivalent between the groups. The LVF superiority was maintained in the RDE group but not in the LDE group. Our results provide the first neural evidence of the DE's effects on the LVF superiority for faces. We propose that the RDE may be more biologically specialized for face processing. © The Author (2017). Published by Oxford University Press.

  3. Neural correlates of the eye dominance effect in human face perception: the left-visual-field superiority for faces revisited

    PubMed Central

    Jung, Wookyoung; Kang, Joong-Gu; Jeon, Hyeonjin; Shim, Miseon; Sun Kim, Ji; Leem, Hyun-Sung

    2017-01-01

    Abstract Faces are processed best when they are presented in the left visual field (LVF), a phenomenon known as LVF superiority. Although one eye contributes more when perceiving faces, it is unclear how the dominant eye (DE), the eye we unconsciously use when performing a monocular task, affects face processing. Here, we examined the influence of the DE on the LVF superiority for faces using event-related potentials. Twenty left-eye-dominant (LDE group) and 23 right-eye-dominant (RDE group) participants performed the experiments. Face stimuli were randomly presented in the LVF or right visual field (RVF). The RDE group exhibited significantly larger N170 amplitudes compared with the LDE group. Faces presented in the LVF elicited N170 amplitudes that were significantly more negative in the RDE group than they were in the LDE group, whereas the amplitudes elicited by stimuli presented in the RVF were equivalent between the groups. The LVF superiority was maintained in the RDE group but not in the LDE group. Our results provide the first neural evidence of the DE’s effects on the LVF superiority for faces. We propose that the RDE may be more biologically specialized for face processing. PMID:28379584

  4. How a Hat May Affect 3-Month-Olds' Recognition of a Face: An Eye-Tracking Study

    PubMed Central

    Bulf, Hermann; Valenza, Eloisa; Turati, Chiara

    2013-01-01

    Recent studies have shown that infants’ face recognition rests on a robust face representation that is resilient to a variety of facial transformations such as rotations in depth, motion, occlusion or deprivation of inner/outer features. Here, we investigated whether 3-month-old infants’ ability to represent the invariant aspects of a face is affected by the presence of an external add-on element, i.e. a hat. Using a visual habituation task, three experiments were carried out in which face recognition was investigated by manipulating the presence/absence of a hat during face encoding (i.e. habituation phase) and face recognition (i.e. test phase). An eye-tracker system was used to record the time infants spent looking at face-relevant information compared to the hat. The results showed that infants’ face recognition was not affected by the presence of the external element when the type of the hat did not vary between the habituation and test phases, and when both the novel and the familiar face wore the same hat during the test phase (Experiment 1). Infants’ ability to recognize the invariant aspects of a face was preserved also when the hat was absent in the habituation phase and the same hat was shown only during the test phase (Experiment 2). Conversely, when the novel face identity competed with a novel hat, the hat triggered the infants’ attention, interfering with the recognition process and preventing the infants’ preference for the novel face during the test phase (Experiment 3). Findings from the current study shed light on how faces and objects are processed when they are simultaneously presented in the same visual scene, contributing to an understanding of how infants respond to the multiple and composite information available in their surrounding environment. PMID:24349378

  5. How a hat may affect 3-month-olds' recognition of a face: an eye-tracking study.

    PubMed

    Bulf, Hermann; Valenza, Eloisa; Turati, Chiara

    2013-01-01

    Recent studies have shown that infants' face recognition rests on a robust face representation that is resilient to a variety of facial transformations such as rotations in depth, motion, occlusion or deprivation of inner/outer features. Here, we investigated whether 3-month-old infants' ability to represent the invariant aspects of a face is affected by the presence of an external add-on element, i.e. a hat. Using a visual habituation task, three experiments were carried out in which face recognition was investigated by manipulating the presence/absence of a hat during face encoding (i.e. habituation phase) and face recognition (i.e. test phase). An eye-tracker system was used to record the time infants spent looking at face-relevant information compared to the hat. The results showed that infants' face recognition was not affected by the presence of the external element when the type of the hat did not vary between the habituation and test phases, and when both the novel and the familiar face wore the same hat during the test phase (Experiment 1). Infants' ability to recognize the invariant aspects of a face was preserved also when the hat was absent in the habituation phase and the same hat was shown only during the test phase (Experiment 2). Conversely, when the novel face identity competed with a novel hat, the hat triggered the infants' attention, interfering with the recognition process and preventing the infants' preference for the novel face during the test phase (Experiment 3). Findings from the current study shed light on how faces and objects are processed when they are simultaneously presented in the same visual scene, contributing to an understanding of how infants respond to the multiple and composite information available in their surrounding environment.

  6. An eye tracking study of bloodstain pattern analysts during pattern classification.

    PubMed

    Arthur, R M; Hoogenboom, J; Green, R D; Taylor, M C; de Bruin, K G

    2018-05-01

    Bloodstain pattern analysis (BPA) is the forensic discipline concerned with the classification and interpretation of bloodstains and bloodstain patterns at the crime scene. At present, it is unclear exactly which stain or pattern properties and their associated values are most relevant to analysts when classifying a bloodstain pattern. Eye tracking technology has been widely used to investigate human perception and cognition. Its application to forensics, however, is limited. This is the first study to use eye tracking as a tool for gaining access to the mindset of the bloodstain pattern expert. An eye tracking method was used to follow the gaze of 24 bloodstain pattern analysts during an assigned task of classifying a laboratory-generated test bloodstain pattern. With the aid of an automated image-processing methodology, the properties of selected features of the pattern were quantified leading to the delineation of areas of interest (AOIs). Eye tracking data were collected for each AOI and combined with verbal statements made by analysts after the classification task to determine the critical range of values for relevant diagnostic features. Eye-tracking data indicated that there were four main regions of the pattern that analysts were most interested in. Within each region, individual elements or groups of elements that exhibited features associated with directionality, size, colour and shape appeared to capture the most interest of analysts during the classification task. The study showed that the eye movements of trained bloodstain pattern experts and their verbal descriptions of a pattern were well correlated.
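    The per-AOI gaze statistics collected in studies like this one are typically aggregates of raw fixations over delineated areas of interest. A minimal sketch, assuming a simplified data format (each fixation as an (x, y, duration) tuple and each AOI as a rectangular box; both are illustrative assumptions, not this study's pipeline):

    ```python
    def dwell_times(fixations, aois):
        """Total fixation duration (ms) inside each area of interest.

        fixations: iterable of (x, y, duration_ms) tuples.
        aois: dict mapping AOI name -> (xmin, ymin, xmax, ymax) box.
        A fixation is credited to the first AOI containing it;
        fixations outside every AOI accumulate under "outside".
        """
        totals = {name: 0.0 for name in aois}
        totals["outside"] = 0.0
        for x, y, dur in fixations:
            for name, (x0, y0, x1, y1) in aois.items():
                if x0 <= x <= x1 and y0 <= y <= y1:
                    totals[name] += dur
                    break
            else:
                totals["outside"] += dur
        return totals

    # Hypothetical fixations and a single hypothetical AOI.
    fix = [(120, 80, 200), (130, 85, 150), (400, 300, 300)]
    aois = {"region_A": (100, 50, 200, 120)}
    print(dwell_times(fix, aois))  # {'region_A': 350.0, 'outside': 300.0}
    ```

    Real pipelines use polygonal AOIs derived from image processing (as this study did), but the accumulation step is the same.
    
    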

  7. The µ-opioid system promotes visual attention to faces and eyes.

    PubMed

    Chelnokova, Olga; Laeng, Bruno; Løseth, Guro; Eikemo, Marie; Willoch, Frode; Leknes, Siri

    2016-12-01

    Paying attention to others' faces and eyes is a cornerstone of human social behavior. The µ-opioid receptor (MOR) system, central to social reward-processing in rodents and primates, has been proposed to mediate the capacity for affiliative reward in humans. We assessed the role of the human MOR system in visual exploration of faces and eyes of conspecifics. Thirty healthy males received a novel, bidirectional battery of psychopharmacological treatment (an MOR agonist, a non-selective opioid antagonist, or placebo, on three separate days). Eye-movements were recorded while participants viewed facial photographs. We predicted that the MOR system would promote visual exploration of faces, and hypothesized that MOR agonism would increase, whereas antagonism decrease overt attention to the information-rich eye region. The expected linear effect of MOR manipulation on visual attention to the stimuli was observed, such that MOR agonism increased while antagonism decreased visual exploration of faces and overt attention to the eyes. The observed effects suggest that the human MOR system promotes overt visual attention to socially significant cues, in line with theories linking reward value to gaze control and target selection. Enhanced attention to others' faces and eyes represents a putative behavioral mechanism through which the human MOR system promotes social interest. © The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  8. Can human eyes prevent perceptual narrowing for monkey faces in human infants?

    PubMed

    Damon, Fabrice; Bayet, Laurie; Quinn, Paul C; Hillairet de Boisferon, Anne; Méary, David; Dupierrix, Eve; Lee, Kang; Pascalis, Olivier

    2015-07-01

    Perceptual narrowing has been observed in human infants for monkey faces: 6-month-olds can discriminate between them, whereas older infants from 9 months of age display difficulty discriminating between them. The difficulty infants from 9 months have processing monkey faces has not been clearly identified. It could be due to the structural characteristics of monkey faces, particularly the key facial features that differ from human faces. The current study aimed to investigate whether the information conveyed by the eyes is of importance. We examined whether the presence of Caucasian human eyes in monkey faces allows recognition to be maintained in 6-month-olds and facilitates recognition in 9- and 12-month-olds. Our results revealed that the presence of human eyes in monkey faces maintains recognition for those faces at 6 months of age and partially facilitates recognition of those faces at 9 months of age, but not at 12 months of age. The findings are interpreted in the context of perceptual narrowing and suggest that the attenuation of processing of other-species faces is not reversed by the presence of human eyes. © 2015 Wiley Periodicals, Inc.

  9. On Biometrics With Eye Movements.

    PubMed

    Zhang, Youming; Juhola, Martti

    2017-09-01

    Eye movements are a relatively novel data source for biometric identification. As video cameras used for eye tracking become smaller and more efficient, this data source could offer interesting opportunities for the development of eye movement biometrics. In this paper, we study primarily biometric identification, seen as a classification task with multiple classes, and secondarily biometric verification, considered as binary classification. Our research is based on saccadic eye movement signals measured from 109 young subjects. To test the measured data, we use a procedure of biometric identification according to the one-versus-one (subject) principle. In a development from our previous research, which also involved biometric verification based on saccadic eye movements, we now apply another eye movement tracker device with a higher sampling frequency of 250 Hz. The results obtained are good, with correct identification rates of 80-90% at best.
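    The abstract does not detail the classifier used, so purely as an illustration of identification over per-subject saccade features, here is a minimal nearest-centroid sketch (subject IDs, feature choices, and values are all hypothetical):

    ```python
    from math import dist

    def enroll(samples_by_subject):
        """Build a per-subject template: the mean feature vector.

        samples_by_subject: dict mapping subject id -> list of feature
        vectors, e.g. (saccade amplitude in deg, peak velocity in deg/s).
        """
        templates = {}
        for subject, samples in samples_by_subject.items():
            n = len(samples)
            templates[subject] = tuple(sum(v) / n for v in zip(*samples))
        return templates

    def identify(features, templates):
        """Return the enrolled subject whose template is nearest."""
        return min(templates, key=lambda s: dist(features, templates[s]))

    # Hypothetical enrollment data: (amplitude, peak velocity) per saccade.
    enrolled = {
        "s1": [(2.1, 310.0), (2.3, 330.0)],
        "s2": [(4.8, 460.0), (5.2, 440.0)],
    }
    templates = enroll(enrolled)
    print(identify((2.0, 320.0), templates))  # s1
    ```

    A production system would normalize features to a common scale before computing distances, since amplitude and velocity have very different ranges.
    
    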

  10. Face pose tracking using the four-point algorithm

    NASA Astrophysics Data System (ADS)

    Fung, Ho Yin; Wong, Kin Hong; Yu, Ying Kin; Tsui, Kwan Pang; Kam, Ho Chuen

    2017-06-01

    In this paper, we have developed an algorithm to track the pose of a human face robustly and efficiently. Face pose estimation is very useful in many applications such as building virtual reality systems and creating an alternative input method for the disabled. Firstly, we have modified a face detection toolbox called DLib for the detection of a face in front of a camera. The detected face features are passed to a pose estimation method, known as the four-point algorithm, for pose computation. The theory applied and the technical problems encountered during system development are discussed in the paper. It is demonstrated that the system is able to track the pose of a face in real time using a consumer grade laptop computer.

  11. The effect of emotionally valenced eye region images on visuocortical processing of surprised faces.

    PubMed

    Li, Shuaixia; Li, Ping; Wang, Wei; Zhu, Xiangru; Luo, Wenbo

    2018-05-01

    In this study, we presented pictorial representations of happy, neutral, and fearful expressions projected in the eye regions to determine whether the eye region alone is sufficient to produce a context effect. Participants were asked to judge the valence of surprised faces that had been preceded by a picture of an eye region. Behavioral results showed that affective ratings of surprised faces were context dependent. Prime-related ERPs with presentation of happy eyes elicited a larger P1 than those for neutral and fearful eyes, likely due to the recognition advantage provided by a happy expression. Target-related ERPs showed that surprised faces in the context of fearful and happy eyes elicited dramatically larger C1 than those in the neutral context, which reflected the modulation by predictions during the earliest stages of face processing. There were larger N170 with neutral and fearful eye contexts compared to the happy context, suggesting faces were being integrated with contextual threat information. The P3 component exhibited enhanced brain activity in response to faces preceded by happy and fearful eyes compared with neutral eyes, indicating motivated attention processing may be involved at this stage. Altogether, these results indicate for the first time that the influence of isolated eye regions on the perception of surprised faces involves preferential processing at the early stages and elaborate processing at the late stages. Moreover, higher cognitive processes such as predictions and attention can modulate face processing from the earliest stages in a top-down manner. © 2017 Society for Psychophysiological Research.

  12. Understanding eye movements in face recognition using hidden Markov models.

    PubMed

    Chuk, Tim; Chan, Antoni B; Hsiao, Janet H

    2014-09-16

    We use a hidden Markov model (HMM) based approach to analyze eye movement data in face recognition. HMMs are statistical models that are specialized in handling time-series data. We conducted a face recognition task with Asian participants, and model each participant's eye movement pattern with an HMM, which summarized the participant's scan paths in face recognition with both regions of interest and the transition probabilities among them. By clustering these HMMs, we showed that participants' eye movements could be categorized into holistic or analytic patterns, demonstrating significant individual differences even within the same culture. Participants with the analytic pattern had longer response times, but did not differ significantly in recognition accuracy from those with the holistic pattern. We also found that correct and wrong recognitions were associated with distinctive eye movement patterns; the difference between the two patterns lies in the transitions rather than locations of the fixations alone. © 2014 ARVO.
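    The transition-probability component of such an HMM can be estimated directly from ROI-labeled scan paths by counting ROI-to-ROI transitions. A minimal sketch (ROI labels and paths are hypothetical; a full HMM would additionally model emission distributions over fixation locations and be fit with Baum-Welch):

    ```python
    from collections import defaultdict

    def transition_probs(scanpaths):
        """Maximum-likelihood ROI-to-ROI transition probabilities.

        scanpaths: list of fixation sequences, each a list of ROI labels,
        e.g. ["eyes", "nose", "mouth"]. Returns a nested dict where
        probs[a][b] = P(next fixation in b | current fixation in a).
        """
        counts = defaultdict(lambda: defaultdict(int))
        for path in scanpaths:
            for a, b in zip(path, path[1:]):
                counts[a][b] += 1
        return {
            a: {b: c / sum(nxt.values()) for b, c in nxt.items()}
            for a, nxt in counts.items()
        }

    # Two hypothetical scan paths over facial ROIs.
    paths = [["eyes", "nose", "eyes"], ["eyes", "mouth"]]
    p = transition_probs(paths)
    print(p["eyes"])  # {'nose': 0.5, 'mouth': 0.5}
    ```

    Clustering participants by such matrices (as the study does with full HMMs) then amounts to comparing these distributions across individuals.
    
    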

  13. Cable-driven elastic parallel humanoid head with face tracking for Autism Spectrum Disorder interventions.

    PubMed

    Su, Hao; Dickstein-Fischer, Laurie; Harrington, Kevin; Fu, Qiushi; Lu, Weina; Huang, Haibo; Cole, Gregory; Fischer, Gregory S

    2010-01-01

    This paper presents the development of a new prismatic actuation approach and its application in human-safe humanoid head design. To reduce actuator output impedance and mitigate unexpected external shocks, the prismatic actuation method uses cables to drive a piston with a preloaded spring. By leveraging the advantages of parallel manipulators and cable-driven mechanisms, the developed neck has a parallel manipulator embodiment with two cable-driven limbs embedded with preloaded springs and one passive limb. The eye mechanism is adapted for a low-cost webcam with a succinct "ball-in-socket" structure. Based on human head anatomy and biomimetics, the neck has 3 degrees of freedom (DOF) of motion: pan, tilt, and one decoupled roll, while each eye has independent pan and synchronous tilt motion (3-DOF eyes). A Kalman filter based face tracking algorithm is implemented to interact with the human. This neck and eye structure is translatable to other human-safe humanoid robots. The robot's appearance reflects a non-threatening image of a penguin, which can be translated into a possible therapeutic intervention for children with Autism Spectrum Disorders.

  14. Elevated intracranial pressure and reversible eye-tracking changes detected while viewing a film clip.

    PubMed

    Kolecki, Radek; Dammavalam, Vikalpa; Bin Zahid, Abdullah; Hubbard, Molly; Choudhry, Osamah; Reyes, Marleen; Han, ByoungJun; Wang, Tom; Papas, Paraskevi Vivian; Adem, Aylin; North, Emily; Gilbertson, David T; Kondziolka, Douglas; Huang, Jason H; Huang, Paul P; Samadani, Uzma

    2018-03-01

    OBJECTIVE The precise threshold differentiating normal and elevated intracranial pressure (ICP) is variable among individuals. In the context of several pathophysiological conditions, elevated ICP leads to abnormalities in global cerebral functioning and impacts the function of cranial nerves (CNs), either or both of which may contribute to ocular dysmotility. The purpose of this study was to assess the impact of elevated ICP on eye-tracking performed while patients were watching a short film clip. METHODS Awake patients requiring placement of an ICP monitor for clinical purposes underwent eye tracking while watching a 220-second continuously playing video moving around the perimeter of a viewing monitor. Pupil position was recorded at 500 Hz and metrics associated with each eye individually and both eyes together were calculated. Linear regression with generalized estimating equations was performed to test the association of eye-tracking metrics with changes in ICP. RESULTS Eye tracking was performed at ICP levels ranging from -3 to 30 mm Hg in 23 patients (12 women, 11 men, mean age 46.8 years) on 55 separate occasions. Eye-tracking measures correlating with CN function linearly decreased with increasing ICP (p < 0.001). Measures for CN VI were most prominently affected. The area under the curve (AUC) for eye-tracking metrics to discriminate between ICP < 12 and ≥ 12 mm Hg was 0.798. To discriminate an ICP < 15 from ≥ 15 mm Hg the AUC was 0.833, and to discriminate ICP < 20 from ≥ 20 mm Hg the AUC was 0.889. CONCLUSIONS Increasingly elevated ICP was associated with increasingly abnormal eye tracking detected while patients were watching a short film clip. These results suggest that eye tracking may be used as a noninvasive, automatable means to quantitate the physiological impact of elevated ICP, which has clinical application for assessment of shunt malfunction, pseudotumor cerebri, concussion, and prevention of second-impact syndrome.
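    The reported AUCs summarize how well a continuous eye-tracking metric separates two ICP ranges. A minimal sketch of AUC via the Mann-Whitney formulation (all scores hypothetical; in the study some metrics decrease with rising ICP, so a real analysis would orient the metric so higher means more abnormal):

    ```python
    def auc(scores_neg, scores_pos):
        """Area under the ROC curve via the Mann-Whitney U statistic.

        scores_neg: metric values for the negative class (e.g. ICP below
        a threshold); scores_pos: values for the positive class. Equals
        the probability that a random positive case scores higher than a
        random negative one; ties count as one half.
        """
        wins = 0.0
        for p in scores_pos:
            for n in scores_neg:
                if p > n:
                    wins += 1.0
                elif p == n:
                    wins += 0.5
        return wins / (len(scores_pos) * len(scores_neg))

    # Hypothetical eye-tracking abnormality scores by ICP group.
    low_icp = [0.1, 0.2, 0.3, 0.4]
    high_icp = [0.35, 0.5, 0.6]
    print(round(auc(low_icp, high_icp), 3))  # 0.917
    ```

    An AUC of 0.5 means the metric carries no discriminative information; values approaching 1.0, like the 0.889 reported for the 20 mm Hg threshold, indicate strong separation.
    
    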

  15. In vivo imaging of palisades of Vogt in dry eye versus normal subjects using en-face spectral-domain optical coherence tomography

    PubMed Central

    Djerada, Zoubir; Liang, Hong; El Sanharawi, Mohamed; Labbé, Antoine; Baudouin, Christophe

    2017-01-01

    Purpose To evaluate a possible clinical application of spectral-domain optical coherence tomography (SD-OCT) using en-face module for the imaging of the corneoscleral limbus in normal subjects and dry eye patients. Patients and methods Seventy-six subjects were included in this study. Seventy eyes of 35 consecutive patients with dry eye disease and 82 eyes of 41 healthy control subjects were investigated. All subjects were examined with the Avanti RTVue® anterior segment OCT. En-face OCT images of the corneoscleral limbus were acquired in four quadrants (inferior, superior, nasal and temporal) and then were analyzed semi-quantitatively according to whether or not palisades of Vogt (POV) were visible. En-face OCT images were then compared to in vivo confocal microscopy (IVCM) in eleven eyes of 7 healthy and dry eye patients. Results En-face SD-OCT showed POV as a radially oriented network, located in superficial corneoscleral limbus, with a good correlation with IVCM features. It provided an easy and reproducible identification of POV without any special preparation or any direct contact, with a grading scale from 0 (no visualization) to 3 (high visualization). The POV were found predominantly in superior (P<0.001) and inferior (P<0.001) quadrants when compared to the nasal and temporal quadrants for all subjects examined. The visibility score decreased with age (P<0.001) and was lower in dry eye patients (P<0.01). In addition, the score decreased in accordance with the severity of dry eye disease (P<0.001). Conclusion En-face SD-OCT is a non-contact imaging technique that can be used to evaluate the POV, thus providing valuable information about differences in the limbal anatomy of dry eye patients as compared to healthy patients. PMID:29176786

  16. In vivo imaging of palisades of Vogt in dry eye versus normal subjects using en-face spectral-domain optical coherence tomography.

    PubMed

    Ghouali, Wajdene; Tahiri Joutei Hassani, Rachid; Djerada, Zoubir; Liang, Hong; El Sanharawi, Mohamed; Labbé, Antoine; Baudouin, Christophe

    2017-01-01

    To evaluate a possible clinical application of spectral-domain optical coherence tomography (SD-OCT) using en-face module for the imaging of the corneoscleral limbus in normal subjects and dry eye patients. Seventy-six subjects were included in this study. Seventy eyes of 35 consecutive patients with dry eye disease and 82 eyes of 41 healthy control subjects were investigated. All subjects were examined with the Avanti RTVue® anterior segment OCT. En-face OCT images of the corneoscleral limbus were acquired in four quadrants (inferior, superior, nasal and temporal) and then were analyzed semi-quantitatively according to whether or not palisades of Vogt (POV) were visible. En-face OCT images were then compared to in vivo confocal microscopy (IVCM) in eleven eyes of 7 healthy and dry eye patients. En-face SD-OCT showed POV as a radially oriented network, located in superficial corneoscleral limbus, with a good correlation with IVCM features. It provided an easy and reproducible identification of POV without any special preparation or any direct contact, with a grading scale from 0 (no visualization) to 3 (high visualization). The POV were found predominantly in superior (P<0.001) and inferior (P<0.001) quadrants when compared to the nasal and temporal quadrants for all subjects examined. The visibility score decreased with age (P<0.001) and was lower in dry eye patients (P<0.01). In addition, the score decreased in accordance with the severity of dry eye disease (P<0.001). En-face SD-OCT is a non-contact imaging technique that can be used to evaluate the POV, thus providing valuable information about differences in the limbal anatomy of dry eye patients as compared to healthy patients.

  17. Perception of direct vs. averted gaze in portrait paintings: An fMRI and eye-tracking study.

    PubMed

    Kesner, Ladislav; Grygarová, Dominika; Fajnerová, Iveta; Lukavský, Jiří; Nekovářová, Tereza; Tintěra, Jaroslav; Zaytseva, Yuliya; Horáček, Jiří

    2018-06-15

    In this study, we use separate eye-tracking measurements and functional magnetic resonance imaging to investigate the neuronal and behavioral response to painted portraits with direct versus averted gaze. We further explored modulatory effects of several painting characteristics (premodern vs modern period, influence of style and pictorial context). In the fMRI experiment, we show that the direct versus averted gaze elicited increased activation in lingual and inferior occipital and the fusiform face area, as well as in several areas involved in attentional and social cognitive processes, especially the theory of mind: angular gyrus/temporo-parietal junction, inferior frontal gyrus and dorsolateral prefrontal cortex. The additional eye-tracking experiment showed that participants spent more time viewing the portrait's eyes and mouth when the portrait's gaze was directed towards the observer. These results suggest that static and, in some cases, highly stylized depictions of human beings in artistic portraits elicit brain activation commensurate with the experience of being observed by a watchful intelligent being. They thus involve observers in implicit inferences of the painted subject's mental states and emotions. We further confirm the substantial influence of representational medium on brain activity. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Eye-Tracking in the Study of Visual Expertise: Methodology and Approaches in Medicine

    ERIC Educational Resources Information Center

    Fox, Sharon E.; Faulkner-Jones, Beverly E.

    2017-01-01

    Eye-tracking is the measurement of eye motions and point of gaze of a viewer. Advances in this technology have been essential to our understanding of many forms of visual learning, including the development of visual expertise. In recent years, these studies have been extended to the medical professions, where eye-tracking technology has helped us…

  19. Summary of tracking and identification methods

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Yang, Chun; Kadar, Ivan

    2014-06-01

    Over the last two decades, many solutions have arisen to combine target tracking estimation with classification methods. Target tracking includes developments from linear to non-linear and Gaussian to non-Gaussian processing. Pattern recognition includes detection, classification, recognition, and identification methods. Integrating tracking and pattern recognition has resulted in numerous approaches and this paper seeks to organize the various approaches. We discuss the terminology so as to have a common framework for various standards such as the NATO STANAG 4162 - Identification Data Combining Process. In a use case, we provide a comparative example highlighting that location information (as an example) with additional mission objectives from geographical, human, social, cultural, and behavioral modeling is needed to determine identification as classification alone does not allow determining identification or intent.

  20. [Slowing down the flow of facial information enhances facial scanning in children with autism spectrum disorders: A pilot eye tracking study].

    PubMed

    Charrier, A; Tardif, C; Gepner, B

    2017-02-01

    Face and gaze avoidance are among the most characteristic and salient symptoms of autism spectrum disorders (ASD). Studies using eye tracking have highlighted early and lifelong ASD-specific abnormalities in attention to faces, such as decreased attention to internal facial features. These specificities could be partly explained by disorders in the perception and integration of rapid and complex information, such as that conveyed by facial movements and, more broadly, by the biological and physical environment. We therefore tested whether slowing down facial dynamics may improve the way children with ASD attend to a face. We used eye tracking to examine the gaze patterns of children with ASD aged 3 to 8 (n=23) and TD controls (n=29) while viewing the face of a speaker telling a story. The story was divided into 6 sequences that were randomly displayed at 3 different speeds: real-time speed (RT), a slow speed (S70=70% of RT speed), and a very slow speed (S50=50% of RT speed). S70 and S50 were displayed using software called Logiral™, which slows down visual and auditory stimuli simultaneously without tone distortion. The visual scene was divided into four regions of interest (ROI): the eyes region; the mouth region; the whole face region; and outside the face region. The total time, number, and mean duration of visual fixations on the whole visual scene and the four ROI were measured between and within the two groups. Compared to TD children, children with ASD spent significantly less time attending to the visual scenes and, when they looked at the scene, they spent less time scanning the speaker's face in general and her mouth in particular, and more time looking outside the facial area. Within the ASD group, the mean duration of fixation increased on the whole scene, and particularly on the mouth area, in S50 compared to RT. Children with mild autism spent more time looking at the face than the two other groups of ASD children, and spent more time attending to the face and

  1. Anxiety and Sensitivity to Eye Gaze in Emotional Faces

    ERIC Educational Resources Information Center

    Holmes, Amanda; Richards, Anne; Green, Simon

    2006-01-01

    This paper reports three studies in which stronger orienting to perceived eye gaze direction was revealed when observers viewed faces showing fearful or angry, compared with happy or neutral, emotional expressions. Gaze-related spatial cueing effects to laterally presented fearful faces and centrally presented angry faces were also modulated by…

  2. Are 6-month-old human infants able to transfer emotional information (happy or angry) from voices to faces? An eye-tracking study.

    PubMed

    Palama, Amaya; Malsert, Jennifer; Gentaz, Edouard

    2018-01-01

    The present study examined whether 6-month-old infants could transfer amodal information (i.e., information independent of sensory modality) from emotional voices to emotional faces. Sequences of successive emotional stimuli (from one sensory modality, auditory, to another, visual), corresponding to a cross-modal transfer, were displayed to 24 infants. Each sequence presented a single emotional (angry or happy) or neutral voice, followed by the simultaneous presentation of two static emotional faces (angry or happy, congruent or incongruent with the emotional voice). Eye movements in response to the visual stimuli were recorded with an eye tracker. First, results suggested no difference in infants' looking time to the happy or angry face after listening to the neutral voice or the angry voice. After listening to the happy voice, however, infants looked longer at the incongruent angry face (the mouth area in particular) than at the congruent happy face. These results reveal that a cross-modal transfer (from the auditory to the visual modality) is possible for 6-month-old infants only after the presentation of a happy voice, suggesting that they recognize this emotion amodally.
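The looking-time contrast described above can be expressed as a simple preferential-looking score. The function and the example times below are hypothetical, for illustration only:

```python
# Sketch: proportion of looking time on the incongruent face, a common
# way to summarize infant preferential-looking data.

def preference_score(incongruent_ms, congruent_ms):
    """Proportion of total looking time spent on the incongruent face.

    Returns 0.5 (no preference) when there is no looking time at all.
    """
    total = incongruent_ms + congruent_ms
    return incongruent_ms / total if total else 0.5

# Hypothetical trial: 3200 ms on the incongruent face, 1800 ms on the congruent one
print(preference_score(3200, 1800))  # 0.64
```

Scores reliably above 0.5 across infants would indicate the kind of incongruency preference reported after the happy voice.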

  3. Using Cognitive Task Analysis and Eye Tracking to Understand Imagery Analysis

    DTIC Science & Technology

    2006-01-01

    Using Cognitive Task Analysis and Eye Tracking to Understand Imagery Analysis. Laura Kurland, Abigail Gertner, Tom Bartee, Michael Chisholm and… have used these to study the analysts' search behavior in detail. 2 EXPERIMENT Using a Cognitive Task Analysis (CTA) framework for knowledge…

  4. Loneliness and Hypervigilance to Social Cues in Females: An Eye-Tracking Study

    PubMed Central

    Lodder, Gerine M. A.; Scholte, Ron H. J.; Clemens, Ivar A. H.; Engels, Rutger C. M. E.; Goossens, Luc; Verhagen, Maaike

    2015-01-01

    The goal of the present study was to examine whether lonely individuals differ from nonlonely individuals in their overt visual attention to social cues. Previous studies showed that loneliness was related to biased post-attentive processing of social cues (e.g., negative interpretation bias), but research on whether lonely and nonlonely individuals also show differences in an earlier information processing stage (gazing behavior) is very limited. A sample of 25 lonely and 25 nonlonely students took part in an eye-tracking study consisting of four tasks. We measured gazing (duration, number of fixations and first fixation) at the eyes, nose and mouth region of faces expressing emotions (Task 1), at emotion quadrants (anger, fear, happiness and neutral expression) (Task 2), at quadrants with positive and negative social and nonsocial images (Task 3), and at the facial area of actors in video clips with positive and negative content (Task 4). In general, participants tended to gaze most often and longest at areas that conveyed most social information, such as the eye region of the face (T1), and social images (T3). Participants gazed most often and longest at happy faces (T2) in still images, and more often and longer at the facial area in negative than in positive video clips (T4). No differences occurred between lonely and nonlonely participants in their gazing times and frequencies, nor in their first fixations on social cues in the four different tasks. Based on this study, we found no evidence that overt visual attention to social cues differs between lonely and nonlonely individuals. This implies that biases in social information processing of lonely individuals may be limited to other phases of social information processing. Alternatively, biased overt attention to social cues may only occur under specific conditions, for specific stimuli or for specific lonely individuals. PMID:25915656

  5. Gender Classification Based on Eye Movements: A Processing Effect During Passive Face Viewing

    PubMed Central

    Sammaknejad, Negar; Pouretemad, Hamidreza; Eslahchi, Changiz; Salahirad, Alireza; Alinejad, Ashkan

    2017-01-01

    Studies have revealed superior face recognition skills in females, partially due to their different eye movement strategies when encoding faces. In the current study, we utilized these slight but important differences and proposed a model that estimates the gender of viewers and classifies them into two subgroups, males and females. An eye tracker recorded participants' eye movements while they viewed images of faces. Regions of interest (ROIs) were defined for each face. Results showed that the gender dissimilarity in eye movements was not due to differences in the frequency of fixations in the ROIs per se. Instead, it was caused by dissimilarity in saccade paths between the ROIs. The difference was enhanced when saccades were directed towards the eyes: females showed a significant increase in transitions from other ROIs to the eyes. Consequently, extracting temporal transition information from saccade paths through a transition probability matrix, similar to a first-order Markov chain model, significantly improved the accuracy of the gender classification results. PMID:29071007
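The transition-probability idea above, a first-order Markov model over ROI-to-ROI saccades, can be sketched as follows. The ROI labels and the example fixation sequence are invented for illustration; the paper's actual feature set is not reproduced here.

```python
# Sketch: row-normalized ROI-to-ROI transition matrix from a fixation
# sequence, i.e. a first-order Markov chain over regions of interest.

def transition_matrix(sequence, rois):
    idx = {r: i for i, r in enumerate(rois)}
    n = len(rois)
    counts = [[0] * n for _ in range(n)]
    # Count each consecutive ROI-to-ROI transition (saccade)
    for a, b in zip(sequence, sequence[1:]):
        counts[idx[a]][idx[b]] += 1
    # Normalize each row to probabilities; rows with no outgoing
    # transitions are left as zeros.
    probs = []
    for row in counts:
        total = sum(row)
        probs.append([c / total if total else 0.0 for c in row])
    return probs

rois = ["eyes", "nose", "mouth"]
seq = ["nose", "eyes", "nose", "mouth", "eyes"]  # hypothetical scanpath
P = transition_matrix(seq, rois)
print(P[1][0])  # 0.5: probability of a nose -> eyes transition
```

A classifier could then use the entries of such matrices (e.g. other-ROI-to-eyes probabilities) as features distinguishing male and female viewers.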

  7. Context Effects and Spoken Word Recognition of Chinese: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Yip, Michael C. W.; Zhai, Mingjun

    2018-01-01

    This study examined the time-course of context effects on spoken word recognition during Chinese sentence processing. We recruited 60 native Mandarin listeners to participate in an eye-tracking experiment. In this eye-tracking experiment, listeners were told to listen to a sentence carefully, which ended with a Chinese homophone, and look at…

  8. Application of TrackEye in equine locomotion research.

    PubMed

    Drevemo, S; Roepstorff, L; Kallings, P; Johnston, C J

    1993-01-01

    TrackEye is an analysis system applicable to equine biokinematic studies. It covers the whole process from digitization of images through automatic target tracking to analysis. Key components of the system are an image workstation for processing video images and a high-resolution film-to-video scanner for 16-mm film. A recording module controls the input device and handles the capture of image sequences into a videodisc system, and a tracking module follows reference markers automatically. The system offers flexible analysis, including calculations of marker displacements, distances and joint angles, velocities, and accelerations. TrackEye was used to study the effects of phenylbutazone on fetlock and carpal joint angle movements in a horse with mild lameness caused by osteoarthritis in the fetlock joint of a forelimb. Significant differences, most evident before treatment, were observed in the minimum fetlock and carpal joint angles when contralateral limbs were compared (p < 0.001). The minimum fetlock angle and the minimum carpal joint angle were significantly greater in the lame limb before treatment compared to those 6, 37 and 49 h after the last treatment (p < 0.001).

  9. The Role of Eyes and Mouth in the Memory of a Face

    ERIC Educational Resources Information Center

    McKelvie, Stuart J.

    1976-01-01

    Investigates the relative importance that the eyes and mouth play in the representation in memory of a human face. Systematically applies two kinds of transformation--masking the eyes or the mouths on photographs of faces--and observes the effects on recognition. (Author/RK)

  10. Eye Tracking Metrics for Workload Estimation in Flight Deck Operation

    NASA Technical Reports Server (NTRS)

    Ellis, Kyle; Schnell, Thomas

    2010-01-01

    Flight decks of the future are being enhanced through improved avionics that adapt to both aircraft and operator state. Eye tracking allows for non-invasive analysis of pilot eye movements, from which a set of metrics can be derived to effectively and reliably characterize workload. This research identifies eye tracking metrics that correlate to aircraft automation conditions, and identifies the correlation of pilot workload to the same automation conditions. Saccade length was used as an indirect index of pilot workload: pilots in the fully automated condition were observed to have, on average, larger saccadic movements than in the guidance and manual flight conditions. The data set itself also provides a general model of human eye movement behavior, and thus, ostensibly, of visual attention distribution in the cockpit for approach-to-land tasks with various levels of automation, by means of the same metrics used for workload algorithm development.
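Saccade length as a workload index, as used above, reduces to a simple geometric computation over successive fixation positions. A minimal sketch, with hypothetical screen coordinates:

```python
# Sketch: mean saccade amplitude as the mean Euclidean distance between
# consecutive fixation centers (in screen pixels here; degrees of visual
# angle would require display geometry).

import math

def mean_saccade_length(fixations):
    """fixations: chronologically ordered list of (x, y) fixation centers."""
    if len(fixations) < 2:
        return 0.0  # no saccades with fewer than two fixations
    dists = [math.dist(a, b) for a, b in zip(fixations, fixations[1:])]
    return sum(dists) / len(dists)

path = [(0, 0), (3, 4), (3, 4), (6, 8)]  # hypothetical scanpath
print(mean_saccade_length(path))  # ≈ 3.33 (mean of distances 5, 0, 5)
```

Under the study's finding, this statistic would come out larger, on average, in the fully automated condition than in guidance or manual flight.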

  11. Online Learners’ Reading Ability Detection Based on Eye-Tracking Sensors

    PubMed Central

    Zhan, Zehui; Zhang, Lei; Mei, Hu; Fong, Patrick S. W.

    2016-01-01

    The detection of university online learners' reading ability is generally problematic and time-consuming. Eye-tracking sensors were therefore employed in this study to record temporal and spatial human eye movements. Learners' pupils, blinks, fixations, saccades, and regressions were recognized as primary indicators for detecting reading abilities. A computational model was established from the empirical eye-tracking data by applying a multi-feature regularization machine learning mechanism based on a low-rank constraint. The model presents good generalization ability, with an error of only 4.9% when randomly run 100 times. It has obvious advantages in saving time and improving precision, with only 20 min of testing required to predict an individual learner's reading ability. PMID:27626418

  12. Tracking and recognition face in videos with incremental local sparse representation model

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Wang, Yunhong; Zhang, Zhaoxiang

    2013-10-01

    This paper addresses the problem of tracking and recognizing faces via incremental local sparse representation. First, a robust face tracking algorithm is proposed that employs a local sparse appearance model and covariance pooling. In the subsequent face recognition stage, a novel template update strategy that combines incremental subspace learning adapts the template to appearance changes and reduces the influence of occlusion and illumination variation. This leads to robust video-based face tracking and recognition with desirable performance. In the experiments, we test the quality of face recognition in real-world noisy videos on the YouTube database, which includes 47 celebrities. Our proposed method produces a high face recognition rate of 95% across all videos. The proposed face tracking and recognition algorithms are also tested on a set of noisy videos under heavy occlusion and illumination variation. The tracking results on challenging benchmark videos demonstrate that the proposed tracking algorithm performs favorably against several state-of-the-art methods. On the challenging dataset in which faces undergo occlusion and illumination variation, and in tracking and recognition experiments under significant pose variation on the University of California, San Diego (Honda/UCSD) database, our proposed method also consistently demonstrates a high recognition rate.

  13. Like a rolling stone: naturalistic visual kinematics facilitate tracking eye movements.

    PubMed

    Souto, David; Kerzel, Dirk

    2013-02-06

    Newtonian physics constrains object kinematics in the real world. We asked whether eye movements towards tracked objects depend on their compliance with those constraints. In particular, the force of gravity constrains round objects to roll on the ground with a particular combination of rotational and translational motion. We measured tracking eye movements towards rolling objects. We found that objects whose rotational and translational motion was congruent with an object rolling on the ground elicited faster tracking eye movements during pursuit initiation than incongruent stimuli. Relative to a condition without a rotational component, we essentially obtained benefits of congruence and, to a lesser extent, costs from incongruence. Anticipatory pursuit responses showed no congruence effect, suggesting that the effect is based on visually driven predictions, not on velocity storage. We suggest that the eye movement system incorporates information about object kinematics acquired by a lifetime of experience with visual stimuli obeying the laws of Newtonian physics.

  14. Automated night/day standoff detection, tracking, and identification of personnel for installation protection

    NASA Astrophysics Data System (ADS)

    Lemoff, Brian E.; Martin, Robert B.; Sluch, Mikhail; Kafka, Kristopher M.; McCormick, William; Ice, Robert

    2013-06-01

    The capability to positively and covertly identify people at a safe distance, 24 hours per day, could provide a valuable advantage in protecting installations, both domestically and in an asymmetric warfare environment. This capability would enable installation security officers to identify known bad actors from a safe distance, even if they are approaching under cover of darkness. We will describe an active-SWIR imaging system being developed to automatically detect, track, and identify people at long range using computer face recognition. The system illuminates the target with an eye-safe and invisible SWIR laser beam to provide consistent high-resolution imagery night and day. SWIR facial imagery produced by the system is matched against a watch-list of mug shots using computer face recognition algorithms. The current system relies on an operator to point the camera and to review and interpret the face recognition results. Automation software is being developed that will allow the system to be cued to a location by an external system, automatically detect a person, track the person as they move, zoom in on the face, select good facial images, and process the face recognition results, producing alarms and sharing data with other systems when people are detected and identified. Progress on the automation of this system will be presented along with experimental night-time face recognition results at distance.

  15. Effects of visual expertise on a novel eye-size illusion: Implications for holistic face processing

    PubMed Central

    Fu, Genyue; Dong, Yan; Quinn, Paul C.; Xiao, Wen S.; Wang, Qiandong; Chen, Guowei; Pascalis, Olivier; Lee, Kang

    2015-01-01

    The present study examined the effect of visual experience on the magnitude of a novel eye-size illusion: when the size of a face’s frame is increased or decreased but eye size is unchanged, observers judge the size of the eyes to be different from that in the original face frame. In the current study, we asked Chinese and Caucasian participants to judge eye size in different pairs of faces and measured the magnitude of the illusion when the faces were own- or other-age (adult vs. infant faces) and when the faces were own- or other-race (Chinese vs. Caucasian faces). We found an other-age effect and an other-race effect with the eye-size illusion: The illusion was more pronounced with own-race and own-age faces than with other-race and other-age faces. These findings taken together suggest that visual experience with faces influences the magnitude of this novel illusion. Extensive experience with certain face categories strengthens the illusion in the context of these categories, but lack of it reduces the magnitude of the illusion. Our results further imply that holistic processing may play an important role in engendering the eye-size illusion. PMID:26048685

  16. A relationship between eye movement patterns and performance in a precognitive tracking task

    NASA Technical Reports Server (NTRS)

    Repperger, D. W.; Hartzell, E. J.

    1977-01-01

    Eye movements made by various subjects in the performance of a precognitive tracking task are studied. The tracking task, presented by an antiaircraft artillery (AAA) simulator, has an input forcing function represented by a deterministic aircraft fly-by. The performance of subjects is ranked by two metrics. Good, mediocre, and poor trackers are selected for analysis based on performance during the difficult segment of the tracking task and over replications. Using phase planes to characterize both the eye movement patterns and the displayed error signal, a simple metric is developed to study these patterns. Two characterizations of eye movement strategies are defined and quantified. Using these two types of eye strategies, two conclusions are obtained about good, mediocre, and poor trackers. First, trackers who used a fixed strategy consistently performed better. Second, the best fixed strategy is defined as a Crosshair Fixator.

  17. Mentalizing eye contact with a face on a video: Gaze direction does not influence autonomic arousal.

    PubMed

    Lyyra, Pessi; Myllyneva, Aki; Hietanen, Jari K

    2018-04-26

    Recent research has revealed enhanced autonomic and subjective responses to eye contact only when perceiving another live person. However, these enhanced responses to eye contact are abolished if the viewer believes that the other person is not able to look back at the viewer. We set out to investigate whether this "genuine" eye contact effect can be reproduced with pre-recorded videos of stimulus persons. Autonomic responses, gaze behavior, and subjective self-assessments were measured while participants viewed pre-recorded video persons with direct or averted gaze, imagined that the video person was real, and mentalized that the person could or could not see them. Pre-recorded videos did not evoke a physiological or subjective eye contact effect similar to that previously observed with live persons, not even when the participants were mentalizing being seen by the person. Gaze tracking results showed, however, increased attention allocation to faces with direct gaze compared to averted gaze. The results suggest that elicitation of physiological arousal in response to genuine eye contact requires the spontaneous experience of seeing and of being seen by another individual. © 2018 Scandinavian Psychological Associations and John Wiley & Sons Ltd.

  18. Precedence of the eye region in neural processing of faces

    PubMed Central

    Issa, Elias; DiCarlo, James

    2012-01-01

    Functional magnetic resonance imaging (fMRI) has revealed multiple subregions in monkey inferior temporal cortex (IT) that are selective for images of faces over other objects. The earliest of these subregions, the posterior lateral face patch (PL), has not been studied previously at the neurophysiological level. Perhaps not surprisingly, we found that PL contains a high concentration of ‘face selective’ cells when tested with standard image sets comparable to those used previously to define the region at the level of fMRI. However, we here report that several different image sets and analytical approaches converge to show that nearly all face selective PL cells are driven by the presence of a single eye in the context of a face outline. Most strikingly, images containing only an eye, even when incorrectly positioned in an outline, drove neurons nearly as well as full face images, and face images lacking only this feature led to longer latency responses. Thus, bottom-up face processing is relatively local and linearly integrates features, consistent with parts-based models, grounding investigation of how the presence of a face is first inferred in the IT face processing hierarchy. PMID:23175821

  19. Some effects of alcohol and eye movements on cross-race face learning.

    PubMed

    Harvey, Alistair J

    2014-01-01

    This study examines the impact of acute alcohol intoxication on visual scanning in cross-race face learning. The eye movements of a group of white British participants were recorded as they encoded a series of own- and different-race faces, under alcohol and placebo conditions. Intoxication reduced the rate and extent of visual scanning during face encoding, reorienting the focus of foveal attention away from the eyes and towards the nose. Differences in encoding eye movements also varied between own- and different-race face conditions as a function of alcohol. Fixations to both face types were less frequent and more lingering following intoxication, but in the placebo condition this was only the case for different-race faces. While reducing visual scanning, however, alcohol had no adverse effect on memory; only the encoding restrictions associated with sober different-race face processing led to poorer recognition. These results support perceptual expertise accounts of own-race face processing, but suggest that the adverse effects of alcohol on face learning published previously are not caused by foveal encoding restrictions. The implications of these findings for alcohol myopia theory are discussed.

  20. Eye-tracking-based assessment of cognitive function in low-resource settings.

    PubMed

    Forssman, Linda; Ashorn, Per; Ashorn, Ulla; Maleta, Kenneth; Matchado, Andrew; Kortekangas, Emma; Leppänen, Jukka M

    2017-04-01

    Early development of neurocognitive functions in infants can be compromised by poverty, malnutrition and lack of adequate stimulation. Optimal management of neurodevelopmental problems in infants requires assessment tools that can be used early in life, and are objective and applicable across economic, cultural and educational settings. The present study examined the feasibility of infrared eye tracking as a novel and highly automated technique for assessing visual-orienting and sequence-learning abilities as well as attention to facial expressions in young (9-month-old) infants. Techniques piloted in a high-resource laboratory setting in Finland (N=39) were subsequently field-tested in a community health centre in rural Malawi (N=40). Parents' perception of the acceptability of the method (Finland 95%, Malawi 92%) and percentages of infants completing the whole eye-tracking test (Finland 95%, Malawi 90%) were high, and percentages of valid test trials (Finland 69-85%, Malawi 68-73%) were satisfactory at both sites. Test completion rates were slightly higher for eye tracking (90%) than for traditional observational tests (87%) in Malawi. The predicted response pattern indicative of specific cognitive function was replicated in Malawi, but Malawian infants exhibited lower response rates and slower processing speed across tasks. High test completion rates and the replication of the predicted test patterns in a novel environment in Malawi support the feasibility of eye tracking as a technique for assessing infant development in low-resource settings. Further research is needed on the test-retest stability and predictive validity of eye-tracking scores in low-income settings. Published by the BMJ Publishing Group Limited.

  1. You Look Familiar: How Malaysian Chinese Recognize Faces

    PubMed Central

    Tan, Chrystalle B. Y.; Stephen, Ian D.; Whitehead, Ross; Sheppard, Elizabeth

    2012-01-01

    East Asian and white Western observers employ different eye movement strategies for a variety of visual processing tasks, including face processing. Recent eye tracking studies on face recognition found that East Asians tend to integrate information holistically by focusing on the nose while white Westerners perceive faces featurally by moving between the eyes and mouth. The current study examines the eye movement strategy that Malaysian Chinese participants employ when recognizing East Asian, white Western, and African faces. Rather than adopting the Eastern or Western fixation pattern, Malaysian Chinese participants use a mixed strategy by focusing on the eyes and nose more than the mouth. The combination of Eastern and Western strategies proved advantageous in participants' ability to recognize East Asian and white Western faces, suggesting that individuals learn to use fixation patterns that are optimized for recognizing the faces with which they are more familiar. PMID:22253762

  2. Use of Cognitive and Metacognitive Strategies in Online Search: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Zhou, Mingming; Ren, Jing

    2016-01-01

    This study used eye-tracking technology to track students' eye movements while searching information on the web. The research question guiding this study was "Do students with different search performance levels have different visual attention distributions while searching information online? If yes, what are the patterns for high and low…

  3. Using eye-tracking technology for communication in Rett syndrome: perceptions of impact.

    PubMed

    Vessoyan, Kelli; Steckle, Gill; Easton, Barb; Nichols, Megan; Mok Siu, Victoria; McDougall, Janette

    2018-04-27

    Studies have investigated the use of eye-tracking technology to assess cognition in individuals with Rett syndrome, but few have looked at this access method for communication for this group. Loss of speech, decreased hand use, and severe motor apraxia significantly impact functional communication for this population. Eye gaze is one modality that may be used successfully by individuals with Rett syndrome. This multiple case study explored whether using eye-tracking technology, with ongoing support from a team of augmentative and alternative communication (AAC) therapists, could help four participants with Rett syndrome meet individualized communication goals. Two secondary objectives were to examine parents' perspectives on (a) the psychosocial impact of their child's use of the technology, and (b) satisfaction with using the technology. All four participants were rated by the treating therapists to have made improvement on their goals. According to both quantitative findings and descriptive information, eye-tracking technology was viewed by parents as contributing to participants' improved psychosocial functioning. Parents reported being highly satisfied with both the device and the clinical services received. This study provides initial evidence that eye-tracking may be perceived as a worthwhile and potentially satisfactory technology to support individuals with Rett syndrome in communicating. Future, more rigorous research that addresses the limitations of a case study design is required to substantiate study findings.

  4. Looking at My Own Face: Visual Processing Strategies in Self–Other Face Recognition

    PubMed Central

    Chakraborty, Anya; Chakrabarti, Bhismadev

    2018-01-01

    We live in an age of ‘selfies.’ Yet how we look at our own faces has seldom been systematically investigated. In this study we test whether the visual processing of the highly familiar self-face differs from that of other faces, using psychophysics and eye-tracking. This paradigm also enabled us to test the association between the psychophysical properties of self-face representation and the visual processing strategies involved in self-face recognition. Thirty-three adults performed a self-face recognition task on a series of self-other face morphs with simultaneous eye-tracking. Participants were found to look longer at the lower part of the face for the self-face compared to other faces. Participants with a more distinct self-face representation, as indexed by a steeper slope of the psychometric response curve for self-face recognition, were found to look longer at the upper part of faces identified as ‘self’ vs. those identified as ‘other’. This result indicates that self-face representation can influence where we look when we process our own vs. others’ faces. We also investigated the association of autism-related traits with self-face processing metrics, since autism has previously been associated with atypical self-processing. The study did not find any self-face-specific association with autistic traits, suggesting that autism-related features may be related to self-processing in a domain-specific manner. PMID:29487554

  5. Optimizations and Applications in Head-Mounted Video-Based Eye Tracking

    ERIC Educational Resources Information Center

    Li, Feng

    2011-01-01

    Video-based eye tracking techniques have become increasingly attractive in many research fields, such as visual perception and human-computer interface design. The technique primarily relies on the positional difference between the center of the eye's pupil and the first-surface reflection at the cornea, the corneal reflection (CR). This…

  6. A Novel Hybrid Mental Spelling Application Based on Eye Tracking and SSVEP-Based BCI

    PubMed Central

    Stawicki, Piotr; Gembler, Felix; Rezeika, Aya; Volosyak, Ivan

    2017-01-01

    Steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs), as well as eye-tracking devices, provide a pathway for re-establishing communication for people with severe disabilities. We fused these control techniques into a novel eye-tracking/SSVEP hybrid system, which utilizes eye tracking for initial rough selection and SSVEP technology for fine target activation. Based on our previous studies, only four stimuli were used for the SSVEP aspect, granting sufficient control for most BCI users. As eye-tracking data are not used for the activation of letters, false positives due to inappropriate dwell times are avoided. This novel approach combines the high speed of eye-tracking systems and the high classification accuracies of low-target SSVEP-based BCIs, leading to an optimal combination of the two methods. We evaluated the accuracy and speed of the proposed hybrid system with a 30-target spelling application implementing all three control approaches (pure eye tracking, SSVEP, and the hybrid system) with 32 participants. Although the highest information transfer rates (ITRs) were achieved with pure eye tracking, a considerable number of subjects were not able to gain sufficient control over the stand-alone eye-tracking device or the pure SSVEP system (78.13% and 75% of the participants reached reliable control, respectively). In this respect, the proposed hybrid was the most universal (over 90% of users achieved reliable control) and outperformed the pure SSVEP system in terms of speed and user friendliness. The presented hybrid system might offer communication to a wider range of users than the standard techniques. PMID:28379187
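The ITRs mentioned above are conventionally computed with the standard Wolpaw formula: bits per selection multiplied by the selection rate. A hedged sketch, with example values (30 targets, 90% accuracy, one selection every 4 s) chosen for illustration rather than taken from the paper:

```python
# Sketch: Wolpaw information transfer rate (ITR) in bits per minute for
# an N-target speller with selection accuracy P.

import math

def itr_bits_per_min(n_targets, accuracy, seconds_per_selection):
    p, n = accuracy, n_targets
    if p >= 1.0:
        bits = math.log2(n)  # perfect accuracy: log2(N) bits per selection
    else:
        bits = (math.log2(n) + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits * (60.0 / seconds_per_selection)

# e.g. a 30-target speller at 90% accuracy, one selection every 4 s
print(round(itr_bits_per_min(30, 0.9, 4.0), 2))  # 59.28
```

The formula assumes errors are uniformly distributed over the remaining N-1 targets, which is an approximation for both gaze-based and SSVEP selection.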

  7. Clutter in electronic medical records: examining its performance and attentional costs using eye tracking.

    PubMed

    Moacdieh, Nadine; Sarter, Nadine

    2015-06-01

    The objective was to use eye tracking to trace the underlying changes in attention allocation associated with the performance effects of clutter, stress, and task difficulty in visual search and noticing tasks. Clutter can degrade performance in complex domains, yet more needs to be known about the associated changes in attention allocation, particularly in the presence of stress and for different tasks. Frequently used and relatively simple eye tracking metrics do not effectively capture the various effects of clutter, which is critical for comprehensively analyzing clutter and developing targeted, real-time countermeasures. Electronic medical records (EMRs) were chosen as the application domain for this research. Clutter, stress, and task difficulty were manipulated, and physicians' performance on search and noticing tasks was recorded. Several eye tracking metrics were used to trace attention allocation throughout those tasks, and subjective data were gathered via a debriefing questionnaire. Clutter degraded performance in terms of response time and noticing accuracy. These decrements were largely accentuated by high stress and task difficulty. Eye tracking revealed the underlying attentional mechanisms, and several display-independent metrics were shown to be significant indicators of the effects of clutter. Eye tracking provides a promising means to understand in detail (offline) and prevent (in real time) major performance breakdowns due to clutter. Display designers need to be aware of the risks of clutter in EMRs and other complex displays and can use the identified eye tracking metrics to evaluate and/or adjust their display. © 2015, Human Factors and Ergonomics Society.

  8. Eye tracking measures of uncertainty during perceptual decision making.

    PubMed

    Brunyé, Tad T; Gardony, Aaron L

    2017-10-01

    Perceptual decision making involves gathering and interpreting sensory information to effectively categorize the world and inform behavior. Consider, for instance, a radiologist distinguishing the presence versus absence of a tumor, or a luggage screener categorizing objects as threatening or non-threatening. In many cases, sensory information is not sufficient to reliably disambiguate the nature of a stimulus, and the resulting decisions are made under conditions of uncertainty. The present study asked whether several oculomotor metrics might prove sensitive to transient states of uncertainty during perceptual decision making. Participants viewed images with varying visual clarity and were asked to categorize them as faces or houses and rate the certainty of their decisions, while we used eye tracking to monitor fixations, saccades, blinks, and pupil diameter. Results demonstrated that decision certainty influenced several oculomotor variables, including fixation frequency and duration; the frequency, peak velocity, and amplitude of saccades; and phasic pupil diameter. Whereas most measures tended to change linearly with decision certainty, pupil diameter revealed more nuanced and dynamic information about the time course of perceptual decision making. Together, the results demonstrate robust alterations in eye movement behavior as a function of decision certainty and attention demands, and suggest that monitoring oculomotor variables during applied task performance may prove valuable for identifying and remediating transient states of uncertainty. Published by Elsevier B.V.

  9. Online webcam-based eye tracking in cognitive science: A first look.

    PubMed

    Semmelmann, Kilian; Weigelt, Sarah

    2018-04-01

    Online experimentation is emerging in many areas of cognitive psychology as a viable alternative or supplement to classical in-lab experimentation. While performance- and reaction-time-based paradigms are covered in recent studies, one instrument of cognitive psychology has not received much attention up to now: eye tracking. In this study, we used JavaScript-based eye tracking algorithms recently made available by Papoutsaki et al. (International Joint Conference on Artificial Intelligence, 2016) together with consumer-grade webcams to investigate the potential of online eye tracking to benefit from the common advantages of online data collection. We compared three in-lab conducted tasks (fixation, pursuit, and free viewing) with online-acquired data to analyze the spatial precision in the first two, and the replicability of well-known gazing patterns in the third task. Our results indicate that in-lab data exhibit an offset of about 172 px (15% of screen size, 3.94° visual angle) in the fixation task, while online data are slightly less accurate (18% of screen size, 207 px) and show higher variance. The same results were found for the pursuit task, with a constant offset during the stimulus movement (211 px in-lab, 216 px online). In the free-viewing task, we were able to replicate the high attention attribution to the eyes (28.25%) compared to other key regions like the nose (9.71%) and mouth (4.00%). Overall, we found web technology-based eye tracking to be suitable for all three tasks and are confident that the required hard- and software will be improved continuously to support even more sophisticated experimental paradigms in all of cognitive psychology.
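
The reported pixel offsets can be expressed in visual angle by scaling with the in-lab calibration reported above (172 px ≈ 3.94°), under the assumptions of small angles and identical screen size and viewing distance; a minimal sketch:

```python
# Small-angle conversion between pixel offset and visual angle, anchored
# to the in-lab figure above (172 px corresponds to 3.94 degrees).
DEG_PER_PX = 3.94 / 172  # assumes same screen and viewing distance online

def px_to_deg(offset_px):
    """Convert a fixation offset in pixels to degrees of visual angle."""
    return offset_px * DEG_PER_PX

print(round(px_to_deg(207), 2))  # online fixation offset of 207 px -> 4.74
```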

  10. Eye tracking a self-moved target with complex hand-target dynamics

    PubMed Central

    Landelle, Caroline; Montagnini, Anna; Madelain, Laurent

    2016-01-01

    Previous work has shown that the ability to track a moving target with the eye is substantially improved when the target is self-moved by the subject's hand compared with when it is externally moved. Here, we explored a situation in which the mapping between hand movement and target motion was perturbed by simulating an elastic relationship between the hand and target. Our objective was to determine whether the predictive mechanisms driving eye-hand coordination could be updated to accommodate these complex hand-target dynamics. To fully appreciate the behavioral effects of this perturbation, we compared eye tracking performance when self-moving a target with a rigid (simple) mapping and a spring mapping, as well as when the subject tracked target trajectories that he/she had previously generated when using the rigid or spring mapping. Concerning the rigid mapping, our results confirmed that smooth pursuit was more accurate when the target was self-moved than externally moved. In contrast, with the spring mapping, eye tracking initially had similarly low spatial accuracy (though shorter temporal lag) in the self-moved versus externally moved conditions. However, within ∼5 min of practice, smooth pursuit improved in the self-moved spring condition, up to a level similar to the self-moved rigid condition. Subsequently, when the mapping unexpectedly switched from spring to rigid, the eye initially followed the expected target trajectory and not the real one, thereby suggesting that subjects used an internal representation of the new hand-target dynamics. Overall, these results emphasize the stunning adaptability of smooth pursuit when self-maneuvering objects with complex dynamics. PMID:27466129

  11. Designs and Algorithms to Map Eye Tracking Data with Dynamic Multielement Moving Objects.

    PubMed

    Kang, Ziho; Mandal, Saptarshi; Crutchfield, Jerry; Millan, Angel; McClung, Sarah N

    2016-01-01

    Design concepts and algorithms were developed to address the eye tracking analysis issues that arise when (1) participants interrogate dynamic multielement objects that can overlap on the display and (2) the visual angle error of the eye trackers prevents the recovery of exact eye fixation coordinates. These issues were addressed by (1) developing dynamic areas of interest (AOIs) in the form of either convex or rectangular shapes to represent the moving and shape-changing multielement objects, (2) introducing the concept of AOI gap tolerance (AGT), which controls the size of the AOIs to address the overlapping and visual angle error issues, and (3) finding a near-optimal AGT value. The approach was tested in the context of air traffic control (ATC) operations, where air traffic control specialists (ATCSs) interrogated multiple moving aircraft on a radar display to detect and control the aircraft for the purpose of maintaining safe and expeditious air transportation. In addition, we show how eye tracking analysis results can differ based on how dynamic AOIs are defined to determine eye fixations on moving objects. The results serve as a framework to more accurately analyze eye tracking data and to better support the analysis of human performance.
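
The dynamic-AOI idea above can be sketched in a few lines: an axis-aligned box around an object's display elements, padded on every side by the gap tolerance, against which fixations are tested. Coordinates and the AGT value are hypothetical, and the paper's convex-shaped AOIs and near-optimal AGT search are not shown:

```python
def make_aoi(points, agt):
    """Axis-aligned bounding box around an object's display elements,
    expanded on every side by the AOI gap tolerance (AGT, in pixels)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs) - agt, min(ys) - agt, max(xs) + agt, max(ys) + agt)

def hits(aoi, fixation):
    """Test whether a fixation coordinate falls inside the AOI."""
    x0, y0, x1, y1 = aoi
    fx, fy = fixation
    return x0 <= fx <= x1 and y0 <= fy <= y1

# Hypothetical aircraft symbol + data block corners on a radar display.
elements = [(100, 100), (120, 100), (100, 140)]
aoi = make_aoi(elements, agt=10)
print(hits(aoi, (95, 95)))   # True: within the 10 px tolerance
print(hits(aoi, (80, 95)))   # False: beyond the expanded box
```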

  12. NORTH SIDE FACING TRACK, SHOWING ELECTRICAL BOX AND CONCRETE VAULT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    NORTH SIDE FACING TRACK, SHOWING ELECTRICAL BOX AND CONCRETE VAULT - Edwards Air Force Base, South Base Sled Track, Electrical Distribution Station, South side of Sled Track, Lancaster, Los Angeles County, CA

  13. Implications of comorbidity for genetic studies of bipolar disorder: P300 and eye tracking as biological markers for illness.

    PubMed

    Blackwood, D H; Sharp, C W; Walker, M T; Doody, G A; Glabus, M F; Muir, W J

    1996-06-01

    In large families with affective illness, identification of a biological variable is needed that reflects brain dysfunction at an earlier point than symptom development. Eye movement disorder, a possible vulnerability marker in schizophrenia, is less clearly associated with affective illness, although a subgroup of affective disorders shows smooth-pursuit eye movement disorder. The auditory P300 event-related potential may be a useful marker for risk to schizophrenia, but a role in bipolar illness is less certain. The distribution of these two biological variables and their association with symptoms in two multiply affected bipolar families is described. In a single, five-generation family identified for linkage studies through two bipolar I (BPI) probands, 128 members (including 20 spouses) were interviewed. The 108 related individuals had diagnoses of BPI (7), bipolar II (2), cyclothymia (3), or major depressive disorder (19). Eight others had generalised anxiety (1), minor depression (5), intermittent depression (1), or alcoholism (1). Sixty-nine subjects had no psychiatric diagnosis. P300 latency (81) and eye tracking (71) were recorded from a subgroup of relatives within the pedigree. Eye tracking was abnormal in 11 of 71 relatives (15.5%) and was bimodally distributed. In these 11 relatives, clinical diagnoses included minor depression (1), alcoholism (1) and generalised anxiety disorder (1). P300 latency was normally distributed and did not differ from controls. In a second family in which five of seven siblings have BPI illness, P300 latency and eye movement disorder were found in affected relatives and in some unaffected offspring. In these large families, clinical diagnoses of general anxiety, alcoholism and minor depression, when associated with eye tracking abnormality, may be considered alternative clinical manifestations of the same trait that in other relatives is expressed as bipolar illness.

  14. Optics of the human cornea influence the accuracy of stereo eye-tracking methods: a simulation study.

    PubMed

    Barsingerhorn, A D; Boonstra, F N; Goossens, H H L M

    2017-02-01

    Current stereo eye-tracking methods model the cornea as a sphere with one refractive surface. However, the human cornea is slightly aspheric and has two refractive surfaces. Here we used ray tracing and the Navarro eye model to study how these optical properties affect the accuracy of different stereo eye-tracking methods. We found that pupil size, gaze direction, and head position all influence the reconstruction of gaze. Resulting errors amount to ±1.0 degrees at best. This shows that stereo eye-tracking may be an option if reliable calibration is not possible, but the applied eye model should account for the actual optics of the cornea.
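
The single-surface refraction that such ray-tracing simulations build on can be sketched with the vector form of Snell's law. The corneal index of 1.376 is a standard textbook value; the flat-surface 2D setup here is a deliberate simplification of the curved, two-surface geometry the study actually models:

```python
import math

def refract(d, n, n1, n2):
    """Refract unit direction d at a surface with unit normal n (pointing
    into medium 1), for refractive indices n1 -> n2 (vector Snell's law)."""
    cos_i = -(d[0] * n[0] + d[1] * n[1])
    r = n1 / n2
    k = 1 - r * r * (1 - cos_i * cos_i)
    if k < 0:
        return None  # total internal reflection
    c = r * cos_i - math.sqrt(k)
    return (r * d[0] + c * n[0], r * d[1] + c * n[1])

# Ray entering the cornea (n ~ 1.376) from air at 30 degrees incidence.
d_in = (math.sin(math.radians(30)), -math.cos(math.radians(30)))
out = refract(d_in, (0.0, 1.0), 1.0, 1.376)
print(round(math.degrees(math.asin(out[0])), 2))  # ~21.31 deg, per Snell
```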

  15. 3. NORTH FRONT, BULLET GLASS OBSERVATION WINDOWS FACE SLED TRACK. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. NORTH FRONT, BULLET GLASS OBSERVATION WINDOWS FACE SLED TRACK. - Edwards Air Force Base, South Base Sled Track, Instrumentation & Control Building, South of Sled Track, Station "50" area, Lancaster, Los Angeles County, CA

  16. Updating OSHA Standards Based on National Consensus Standards; Eye and Face Protection. Final rule.

    PubMed

    2016-03-25

    On March 13, 2015, OSHA published in the Federal Register a notice of proposed rulemaking (NPRM) to revise its eye and face protection standards for general industry, shipyard employment, marine terminals, longshoring, and construction by updating the references to national consensus standards approved by the American National Standards Institute (ANSI). OSHA received no significant objections from commenters and therefore is adopting the amendments as proposed. This final rule updates the references in OSHA's eye and face standards to reflect the most recent edition of the ANSI/International Safety Equipment Association (ISEA) eye and face protection standard. It removes the oldest-referenced edition of the same ANSI standard. It also amends other provisions of the construction eye and face protection standard to bring them into alignment with OSHA's general industry and maritime standards.

  17. Eye gaze tracking using correlation filters

    NASA Astrophysics Data System (ADS)

    Karakaya, Mahmut; Bolme, David; Boehnen, Chris

    2014-03-01

    In this paper, we studied a method for eye gaze tracking that provides gaze estimation from a standard webcam with a zoom lens and reduces the setup and calibration requirements for new users. Specifically, we have developed a gaze estimation method based on the relative locations of points on the top of the eyelid and the eye corners. The gaze estimation method in this paper is based on the distances between the top point of the eyelid and the eye corners detected by the correlation filters. Advanced correlation filters were found to provide facial landmark detections that are accurate enough to determine the subject's gaze direction to within approximately 4-5 degrees, although calibration errors often produce a larger overall shift in the estimates. This corresponds to a circle of approximately 2 inches in diameter for a screen at arm's length from the subject. At this accuracy it is possible to determine which regions of text or images the subject is looking at, but it falls short of determining which word the subject has looked at.
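
The reported accuracy figure can be checked with simple trigonometry: an angular error θ at viewing distance d subtends an on-screen uncertainty circle of diameter 2·d·tan(θ/2). The 25-inch arm's-length distance is an assumed value, not one stated in the record:

```python
import math

def error_circle_diameter(angle_deg, distance):
    """Diameter of the on-screen uncertainty circle for a given angular
    gaze-estimation error, for a screen at the given viewing distance."""
    return 2 * distance * math.tan(math.radians(angle_deg) / 2)

# Assumed arm's-length viewing distance of ~25 inches, 4.5 degree error.
d = error_circle_diameter(4.5, 25.0)
print(round(d, 2))  # ~1.96 inches, consistent with the ~2 inch figure
```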

  18. Eye Gaze Tracking using Correlation Filters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karakaya, Mahmut; Boehnen, Chris Bensing; Bolme, David S

    In this paper, we studied a method for eye gaze tracking that provides gaze estimation from a standard webcam with a zoom lens and reduces the setup and calibration requirements for new users. Specifically, we have developed a gaze estimation method based on the relative locations of points on the top of the eyelid and the eye corners. The gaze estimation method in this paper is based on the distances between the top point of the eyelid and the eye corners detected by the correlation filters. Advanced correlation filters were found to provide facial landmark detections that are accurate enough to determine the subject's gaze direction to within approximately 4-5 degrees, although calibration errors often produce a larger overall shift in the estimates. This corresponds to a circle of approximately 2 inches in diameter for a screen at arm's length from the subject. At this accuracy it is possible to determine which regions of text or images the subject is looking at, but it falls short of determining which word the subject has looked at.

  19. Caucasian Infants Scan Own- and Other-Race Faces Differently

    PubMed Central

    Wheeler, Andrea; Anzures, Gizelle; Quinn, Paul C.; Pascalis, Olivier; Omrin, Danielle S.; Lee, Kang

    2011-01-01

    Young infants are known to prefer own-race faces to other race faces and recognize own-race faces better than other-race faces. However, it is entirely unclear as to whether infants also attend to different parts of own- and other-race faces differently, which may provide an important clue as to how and why the own-race face recognition advantage emerges so early. The present study used eye tracking methodology to investigate whether 6- to 10-month-old Caucasian infants (N = 37) have differential scanning patterns for dynamically displayed own- and other-race faces. We found that even though infants spent a similar amount of time looking at own- and other-race faces, with increased age, infants increasingly looked longer at the eyes of own-race faces and less at the mouths of own-race faces. These findings suggest experience-based tuning of the infant's face processing system to optimally process own-race faces that are different in physiognomy from other-race faces. In addition, the present results, taken together with recent own- and other-race eye tracking findings with infants and adults, provide strong support for an enculturation hypothesis that East Asians and Westerners may be socialized to scan faces differently due to each culture's conventions regarding mutual gaze during interpersonal communication. PMID:21533235

  20. Eye gaze tracking based on the shape of pupil image

    NASA Astrophysics Data System (ADS)

    Wang, Rui; Qiu, Jian; Luo, Kaiqing; Peng, Li; Han, Peng

    2018-01-01

    Eye trackers are important instruments for research in psychology, widely used in attention, visual perception, reading, and other fields of research. Because of its potential role in human-computer interaction, eye gaze tracking has been a topic of research in many fields over the last decades. Nowadays, with the development of technology, non-intrusive methods are more and more welcome. In this paper, we present a method based on the shape of the pupil image to estimate the gaze point of human eyes without any intrusive devices such as a hat or a pair of glasses. After applying an ellipse fitting algorithm to the captured pupil image, we can determine the direction of fixation from the shape of the pupil. The innovative aspect of this method is that it exploits the shape of the pupil, avoiding more complicated algorithms. The proposed approach is helpful for the study of eye gaze tracking: it needs only one camera, without infrared light, and infers the gaze direction from changes in the shape of the pupil, with no additional equipment required.
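
A gaze cue can in principle be recovered from a fitted pupil ellipse by assuming the pupil is a circle viewed at an angle, so that the cosine of the off-axis angle equals the minor-to-major axis ratio. This is a hypothetical sketch of that geometric idea, not the authors' actual algorithm (which must also resolve the direction of tilt):

```python
import math

def gaze_angle_from_ellipse(major_axis, minor_axis):
    """Angle between the pupil's optical axis and the camera axis,
    inferred from the foreshortening of an assumed circular pupil:
    cos(theta) = minor / major."""
    return math.degrees(math.acos(minor_axis / major_axis))

# Hypothetical fitted ellipse axes in pixels.
print(round(gaze_angle_from_ellipse(40.0, 20.0), 1))  # 60.0 deg off-axis
print(round(gaze_angle_from_ellipse(40.0, 40.0), 1))  # 0.0: facing camera
```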

  1. Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI

    DTIC Science & Technology

    2016-04-01

    and we anticipate the initiation of the neuroimaging portion of the study early in Year 3. The fMRI task has been completed and is in beta testing...neurocognitive test battery, and self-report measures of cognitive efficacy. We will also include functional magnetic resonance imaging (fMRI) and...fMRI and DTI will provide an objective basis for cross-validating the EEG and eye tracking system. Both the EEG and eye tracking data will be

  2. Optical eye tracking system for real-time noninvasive tumor localization in external beam radiotherapy.

    PubMed

    Via, Riccardo; Fassi, Aurora; Fattori, Giovanni; Fontana, Giulia; Pella, Andrea; Tagaste, Barbara; Riboldi, Marco; Ciocca, Mario; Orecchia, Roberto; Baroni, Guido

    2015-05-01

    External beam radiotherapy currently represents an important therapeutic strategy for the treatment of intraocular tumors. Accurate target localization and efficient compensation of involuntary eye movements are crucial to avoid deviations in dose distribution with respect to the treatment plan. This paper describes an eye tracking system (ETS) based on noninvasive infrared video imaging. The system was designed for capturing three-dimensional (3D) ocular motion and provides an on-line estimation of intraocular lesion position based on a priori knowledge coming from volumetric imaging. Eye tracking is performed by localizing cornea and pupil centers on stereo images captured by two calibrated video cameras, exploiting eye reflections produced by infrared illumination. Additionally, torsional eye movements are detected by template matching in the iris region of eye images. This information allows estimating the 3D position and orientation of the eye by means of a local eye reference system. By combining ETS measurements with volumetric imaging for treatment planning [computed tomography (CT) and magnetic resonance (MR)], one is able to map the position of the lesion to be treated in local eye coordinates, thus enabling real-time tumor referencing during treatment setup and irradiation. Experimental tests on an eye phantom and seven healthy subjects were performed to assess ETS tracking accuracy. Measurements on the phantom showed an overall median accuracy within 0.16 mm and 0.40° for translations and rotations, respectively. Torsional movements were affected by 0.28° median uncertainty. On healthy subjects, the gaze direction error ranged between 0.19° and 0.82° at a median working distance of 29 cm. The median processing time of the eye tracking algorithm was 18.60 ms, thus allowing eye monitoring up to 50 Hz. A noninvasive ETS prototype was designed to perform real-time target localization and eye movement monitoring during ocular radiotherapy treatments. The

  3. Eye Tracking and Head Movement Detection: A State-of-Art Survey

    PubMed Central

    2013-01-01

    Eye-gaze detection and tracking have been an active research field in past years, as they add convenience to a variety of applications. They are considered a significant untraditional method of human-computer interaction. Head movement detection has also received researchers' attention and interest, as it has been found to be a simple and effective interaction method. Both technologies are considered the easiest alternative interface methods. They serve a wide range of severely disabled people who are left with minimal motor abilities. For both eye tracking and head movement detection, several different approaches have been proposed and used to implement different algorithms for these technologies. Despite the amount of research done on both technologies, researchers are still trying to find robust methods to use effectively in various applications. This paper presents a state-of-the-art survey of eye tracking and head movement detection methods proposed in the literature. Examples of different fields of application for both technologies, such as human-computer interaction, driving assistance systems, and assistive technologies, are also investigated. PMID:27170851

  4. Optimization of illumination schemes in a head-mounted display integrated with eye tracking capabilities

    NASA Astrophysics Data System (ADS)

    Pansing, Craig W.; Hua, Hong; Rolland, Jannick P.

    2005-08-01

    Head-mounted display (HMD) technologies find a variety of applications in the field of 3D virtual and augmented environments, 3D scientific visualization, as well as wearable displays. While most of the current HMDs use head pose to approximate line of sight, we propose to investigate approaches and designs for integrating eye tracking capability into HMDs from a low-level system design perspective and to explore schemes for optimizing system performance. In this paper, we particularly propose to optimize the illumination scheme, which is a critical component in designing an eye tracking-HMD (ET-HMD) integrated system. An optimal design can improve not only eye tracking accuracy, but also robustness. Using LightTools, we present the simulation of a complete eye illumination and imaging system using an eye model along with multiple near infrared LED (IRLED) illuminators and imaging optics, showing the irradiance variation of the different eye structures. The simulation of dark pupil effects along with multiple 1st-order Purkinje images will be presented. A parametric analysis is performed to investigate the relationships between the IRLED configurations and the irradiance distribution at the eye, and a set of optimal configuration parameters is recommended. The analysis will be further refined by actual eye image acquisition and processing.

  5. Dissociable Frontal Controls during Visible and Memory-guided Eye-Tracking of Moving Targets

    PubMed Central

    Ding, Jinhong; Powell, David; Jiang, Yang

    2009-01-01

    When tracking visible or occluded moving targets, several frontal regions, including the frontal eye fields (FEF), dorsolateral prefrontal cortex (DLPFC), and anterior cingulate cortex (ACC), are involved in smooth pursuit eye movements (SPEM). To investigate how these areas play different roles in predicting future locations of moving targets, twelve healthy college students participated in a smooth pursuit task with visible and occluded targets. Their eye movements and brain responses measured by event-related functional MRI were simultaneously recorded. Our results show that different visual cues resulted in time discrepancies between physical and estimated pursuit time only when the moving dot was occluded. Velocity gain during the visible phase was higher than during the occlusion phase. We found bilateral FEF association with eye movement whether moving targets were visible or occluded. However, the DLPFC and ACC showed increased activity when tracking and predicting locations of occluded moving targets, and were suppressed during smooth pursuit of visible targets. When visual cues were increasingly available, less activation in the DLPFC and the ACC was observed. Additionally, there was a significant hemisphere effect in DLPFC, where the right DLPFC showed significantly increased responses over the left when pursuing occluded moving targets. Correlation results revealed that DLPFC, the right DLPFC in particular, communicates more with FEF during tracking of occluded moving targets (from memory), whereas the ACC modulates FEF more during tracking of visible targets (likely related to visual attention). Our results suggest that DLPFC and ACC modulate FEF and cortical networks differentially during visible and memory-guided eye tracking of moving targets. PMID:19434603

  6. Identification of Emotional Facial Expressions: Effects of Expression, Intensity, and Sex on Eye Gaze.

    PubMed

    Wells, Laura Jean; Gillespie, Steven Mark; Rotshtein, Pia

    2016-01-01

    The identification of emotional expressions is vital for social interaction, and can be affected by various factors, including the expressed emotion, the intensity of the expression, the sex of the face, and the gender of the observer. This study investigates how these factors affect the speed and accuracy of expression recognition, as well as dwell time on the two most significant areas of the face: the eyes and the mouth. Participants were asked to identify expressions from female and male faces displaying six expressions (anger, disgust, fear, happiness, sadness, and surprise), each with three levels of intensity (low, moderate, and normal). Overall, responses were fastest and most accurate for happy expressions, but slowest and least accurate for fearful expressions. More intense expressions were also classified most accurately. Reaction time showed a different pattern, with slowest response times recorded for expressions of moderate intensity. Overall, responses were slowest, but also most accurate, for female faces. Relative to male observers, women showed greater accuracy and speed when recognizing female expressions. Dwell time analyses revealed that attention to the eyes was about three times greater than on the mouth, with fearful eyes in particular attracting longer dwell times. The mouth region was attended to the most for fearful, angry, and disgusted expressions and least for surprise. These results extend upon previous findings to show important effects of expression, emotion intensity, and sex on expression recognition and gaze behaviour, and may have implications for understanding the ways in which emotion recognition abilities break down.

  7. Identification of Emotional Facial Expressions: Effects of Expression, Intensity, and Sex on Eye Gaze

    PubMed Central

    Rotshtein, Pia

    2016-01-01

    The identification of emotional expressions is vital for social interaction, and can be affected by various factors, including the expressed emotion, the intensity of the expression, the sex of the face, and the gender of the observer. This study investigates how these factors affect the speed and accuracy of expression recognition, as well as dwell time on the two most significant areas of the face: the eyes and the mouth. Participants were asked to identify expressions from female and male faces displaying six expressions (anger, disgust, fear, happiness, sadness, and surprise), each with three levels of intensity (low, moderate, and normal). Overall, responses were fastest and most accurate for happy expressions, but slowest and least accurate for fearful expressions. More intense expressions were also classified most accurately. Reaction time showed a different pattern, with slowest response times recorded for expressions of moderate intensity. Overall, responses were slowest, but also most accurate, for female faces. Relative to male observers, women showed greater accuracy and speed when recognizing female expressions. Dwell time analyses revealed that attention to the eyes was about three times greater than on the mouth, with fearful eyes in particular attracting longer dwell times. The mouth region was attended to the most for fearful, angry, and disgusted expressions and least for surprise. These results extend upon previous findings to show important effects of expression, emotion intensity, and sex on expression recognition and gaze behaviour, and may have implications for understanding the ways in which emotion recognition abilities break down. PMID:27942030

  8. Effects of phencyclidine, secobarbital and diazepam on eye tracking in rhesus monkeys.

    PubMed

    Ando, K; Johanson, C E; Levy, D L; Yasillo, N J; Holzman, P S; Schuster, C R

    1983-01-01

    Rhesus monkeys were trained to track a moving disk using a procedure in which responses on a lever were reinforced with water delivery only when the disk, oscillating in a horizontal plane on a screen at a frequency of 0.4 Hz in a visual angle of 20 degrees, dimmed for a brief period. Pursuit eye movements were recorded by electrooculography (EOG). IM phencyclidine, secobarbital, and diazepam injections decreased the number of reinforced lever presses in a dose-related manner. Both secobarbital and diazepam produced episodic jerky-pursuit eye movements, while phencyclidine had no consistent effects on eye movements. Lever pressing was disrupted at doses which had little effect on the quality of smooth-pursuit eye movements in some monkeys. This separation was particularly pronounced with diazepam. The similarities of the drug effects on smooth-pursuit eye movements between the present study and human studies indicate that the present method using rhesus monkeys may be useful for predicting drug effects on eye tracking and oculomotor function in humans.

  9. Designs and Algorithms to Map Eye Tracking Data with Dynamic Multielement Moving Objects

    PubMed Central

    Mandal, Saptarshi

    2016-01-01

    Design concepts and algorithms were developed to address the eye tracking analysis issues that arise when (1) participants interrogate dynamic multielement objects that can overlap on the display and (2) the visual angle error of the eye trackers prevents the recovery of exact eye fixation coordinates. These issues were addressed by (1) developing dynamic areas of interest (AOIs) in the form of either convex or rectangular shapes to represent the moving and shape-changing multielement objects, (2) introducing the concept of AOI gap tolerance (AGT), which controls the size of the AOIs to address the overlapping and visual angle error issues, and (3) finding a near-optimal AGT value. The approach was tested in the context of air traffic control (ATC) operations, where air traffic control specialists (ATCSs) interrogated multiple moving aircraft on a radar display to detect and control the aircraft for the purpose of maintaining safe and expeditious air transportation. In addition, we show how eye tracking analysis results can differ based on how dynamic AOIs are defined to determine eye fixations on moving objects. The results serve as a framework to more accurately analyze eye tracking data and to better support the analysis of human performance. PMID:27725830

  10. Do the Eyes Have It? Using Eye Tracking to Assess Students' Cognitive Dimensions

    ERIC Educational Resources Information Center

    Nisiforou, Efi A.; Laghos, Andrew

    2013-01-01

    Field dependence/independence (FD/FI) is a significant dimension of cognitive styles. The paper presents results of a study that seeks to identify individuals' level of field independence during visual stimulus tasks processing. Specifically, it examined the relationship between the Hidden Figure Test (HFT) scores and the eye tracking metrics.…

  11. Can eye-tracking technology improve situational awareness in paramedic clinical education?

    PubMed

    Williams, Brett; Quested, Andrew; Cooper, Simon

    2013-01-01

    Human factors play a significant part in clinical error. Situational awareness (SA) means being aware of one's surroundings, comprehending the present situation, and being able to predict outcomes. It is a key human skill that, when properly applied, is associated with reducing medical error. Eye-tracking technology can be used to provide an objective and qualitative measure of the initial perception component of SA. Feedback from eye-tracking technology can be used to improve the understanding and teaching of SA in clinical contexts and consequently has the potential to reduce clinician error and the concomitant adverse events.

  12. A psychotechnological review on eye-tracking systems: towards user experience.

    PubMed

    Mele, Maria Laura; Federici, Stefano

    2012-07-01

    The aim of the present work is to provide a critical review of the international literature on eye-tracking technologies, focusing on those features that characterize them as 'psychotechnologies'. A critical literature review was conducted through the main psychology, engineering, and computer science databases, following specific inclusion and exclusion criteria. A total of 46 matches from 1998 to 2010 were selected for content analysis. The results are divided into four broad thematic areas. We found that, although there is growing attention to end users, most of the studies reviewed in this work are far from adopting holistic human-computer interaction models that include both the individual differences and the needs of users. The user is often considered only as a measurement object of the functioning of the technological system and not as a real alter ego in the intrasystemic interaction. In order to fully benefit from the communicative functions of gaze, research on eye tracking must emphasize user experience. Eye-tracking systems will become an effective assistive technology for integration, adaptation, and neutralization of environmental barriers only when a holistic model can be applied to both the design process and the assessment of the functional components of the interaction.

  13. Face-Scanning Behavior to Silently-Talking Faces in 12-Month-Old Infants: The Impact of Pre-Exposed Auditory Speech

    ERIC Educational Resources Information Center

    Kubicek, Claudia; de Boisferon, Anne Hillairet; Dupierrix, Eve; Loevenbruck, Helene; Gervain, Judit; Schwarzer, Gudrun

    2013-01-01

    The present eye-tracking study aimed to investigate the impact of auditory speech information on 12-month-olds' gaze behavior to silently-talking faces. We examined German infants' face-scanning behavior to side-by-side presentation of a bilingual speaker's face silently speaking German utterances on one side and French on the other side, before…

  14. Measuring Human Performance in Simulated Nuclear Power Plant Control Rooms Using Eye Tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kovesdi, Casey Robert; Rice, Brandon Charles; Bower, Gordon Ross

    Control room modernization will be an important part of life extension for the existing light water reactor fleet. As part of modernization efforts, personnel will need to gain a full understanding of how control room technologies affect the performance of human operators. Recent advances in technology enable the use of eye tracking to continuously measure an operator’s eye movements, which correlate with a variety of human performance constructs such as situation awareness and workload. This report describes eye tracking metrics in the context of how they will be used in nuclear power plant control room simulator studies.

  15. [A tracking function of human eye in microgravity and during readaptation to earth's gravity].

    PubMed

    Kornilova, L N

    2001-01-01

    The paper summarizes results of electrooculographic recordings of all modes of visual tracking: fixation eye movements (saccades); smooth pursuit of linearly, pendulum-like, and circularly moving point stimuli; and pursuit of vertically moving foveoretinal optokinetic stimuli. It also presents threshold values and amplification coefficients of the optokinetic nystagmus during tracking of linear movement of foveoretinal optokinetic stimuli. Investigations were performed aboard the Salyut and Mir space stations with the participation of 31 cosmonauts, of whom 27 made long-term (76 to 438 days) and 4 made short-term (7 to 9 days) missions. It was shown that in space flight the saccadic structure of the tracking reaction does not change; however, corrective movements (additional microsaccades to achieve tracking) appeared in 47% of observations at the onset of flight and in 76% of observations on months 3 to 6 of space flight. After landing, the structure of vertical saccades was found to be altered in half of the cosmonauts. Both in and after flight, reverse nystagmus was present along with gaze nystagmus during static saccades in 22% of observations (7 cosmonauts). The amplitude of tracking of vertically, diagonally, or circularly moving stimuli was significantly reduced as time on mission increased. Early in flight (40% of the cosmonauts) and shortly afterwards (21% of the cosmonauts), the structure of the smooth tracking reaction broke down completely; that is, the eye followed the stimulus with micro- or macrosaccades. The structure of smooth eye tracking recovered on flight days 6-8 and on postflight days 3-4. However, in 46% of the cosmonauts on long-term missions the structure of smooth eye tracking was disturbed periodically, i.e., smooth tracking was replaced by saccadic tracking.

  16. Comparison of Predictable Smooth Ocular and Combined Eye-Head Tracking Behaviour in Patients with Lesions Affecting the Brainstem and Cerebellum

    NASA Technical Reports Server (NTRS)

    Grant, Michael P.; Leigh, R. John; Seidman, Scott H.; Riley, David E.; Hanna, Joseph P.

    1992-01-01

    We compared the ability of eight normal subjects and 15 patients with brainstem or cerebellar disease to follow a moving visual stimulus smoothly with either the eyes alone or with combined eye-head tracking. The visual stimulus was either a laser spot (horizontal and vertical planes) or a large rotating disc (torsional plane), which moved at one sinusoidal frequency for each subject. The visually enhanced Vestibulo-Ocular Reflex (VOR) was also measured in each plane. In the horizontal and vertical planes, we found that if tracking gain (gaze velocity/target velocity) for smooth pursuit was close to 1, the gain of combined eye-head tracking was similar. If the tracking gain during smooth pursuit was less than about 0.7, combined eye-head tracking was usually superior. Most patients, irrespective of diagnosis, showed combined eye-head tracking that was superior to smooth pursuit; only two patients showed the converse. In the torsional plane, in which optokinetic responses were weak, combined eye-head tracking was much superior, and this was the case in both subjects and patients. We found that a linear model, in which an internal ocular tracking signal cancelled the VOR, could account for our findings in most normal subjects in the horizontal and vertical planes, but not in the torsional plane. The model failed to account for tracking behaviour in most patients in any plane, and suggested that the brain may use additional mechanisms to reduce the internal gain of the VOR during combined eye-head tracking. Our results confirm that certain patients who show impairment of smooth-pursuit eye movements preserve their ability to smoothly track a moving target with combined eye-head tracking.
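
    The tracking-gain measure above (gaze velocity/target velocity) can be illustrated with a minimal sketch on synthetic sinusoidal velocity traces. The RMS-ratio estimator and the signal parameters are assumptions for illustration, not the authors' analysis pipeline.

```python
# Minimal sketch of a tracking-gain computation: the ratio of gaze
# velocity to target velocity for a sinusoidal pursuit stimulus.
import numpy as np

t = np.linspace(0, 2, 400)                       # 2 s of samples
target_vel = 10 * np.sin(2 * np.pi * 1.0 * t)    # deg/s, 1 Hz sinusoid
gaze_vel = 0.8 * target_vel                      # simulated pursuit, gain 0.8

# Estimate gain as the ratio of RMS velocities over the trial.
gain = np.sqrt(np.mean(gaze_vel**2)) / np.sqrt(np.mean(target_vel**2))
print(round(gain, 2))   # -> 0.8
```

    By the abstract's criterion, a gain of 0.8 sits between the two regimes it describes: below roughly 0.7, combined eye-head tracking was usually superior to pursuit alone.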

  17. Detection of differential viewing patterns to erotic and non-erotic stimuli using eye-tracking methodology.

    PubMed

    Lykins, Amy D; Meana, Marta; Kambe, Gretchen

    2006-10-01

    As a first step in the investigation of the role of visual attention in the processing of erotic stimuli, eye-tracking methodology was employed to measure eye movements during erotic scene presentation. Because eye-tracking is a novel methodology in sexuality research, we attempted to determine whether the eye-tracker could detect differences (should they exist) in visual attention to erotic and non-erotic scenes. A total of 20 men and 20 women were presented with a series of erotic and non-erotic images and tracked their eye movements during image presentation. Comparisons between erotic and non-erotic image groups showed significant differences on two of three dependent measures of visual attention (number of fixations and total time) in both men and women. As hypothesized, there was a significant Stimulus x Scene Region interaction, indicating that participants visually attended to the body more in the erotic stimuli than in the non-erotic stimuli, as evidenced by a greater number of fixations and longer total time devoted to that region. These findings provide support for the application of eye-tracking methodology as a measure of visual attentional capture in sexuality research. Future applications of this methodology to expand our knowledge of the role of cognition in sexuality are suggested.

  18. Oxytocin Reduces Face Processing Time but Leaves Recognition Accuracy and Eye-Gaze Unaffected.

    PubMed

    Hubble, Kelly; Daughters, Katie; Manstead, Antony S R; Rees, Aled; Thapar, Anita; van Goozen, Stephanie H M

    2017-01-01

    Previous studies have found that oxytocin (OXT) can improve the recognition of emotional facial expressions; it has been proposed that this effect is mediated by an increase in attention to the eye region of faces. Nevertheless, evidence in support of this claim is inconsistent, and few studies have directly tested the effect of oxytocin on emotion recognition via altered eye gaze. In a double-blind, within-subjects, randomized controlled experiment, 40 healthy male participants received 24 IU intranasal OXT and placebo in two identical experimental sessions separated by a 2-week interval. Visual attention to the eye region was assessed on both occasions while participants completed a static facial emotion recognition task using medium-intensity facial expressions. Although OXT had no effect on emotion recognition accuracy, recognition performance was improved because face processing was faster across emotions under the influence of OXT. This effect was marginally significant (p < .06). Consistent with a previous study using dynamic stimuli, OXT had no effect on eye-gaze patterns when viewing static emotional faces, and this was not related to recognition accuracy or face processing time. These findings suggest that OXT-induced enhancement of facial emotion recognition is not necessarily mediated by an increase in attention to the eye region of faces, as previously assumed. We discuss several methodological issues which may explain discrepant findings and suggest that the effect of OXT on visual attention may differ depending on task requirements. (JINS, 2017, 23, 23-33).

  19. Combining EEG and eye tracking: identification, characterization, and correction of eye movement artifacts in electroencephalographic data

    PubMed Central

    Plöchl, Michael; Ossandón, José P.; König, Peter

    2012-01-01

    Eye movements introduce large artifacts into electroencephalographic (EEG) recordings and thus render data analysis difficult or even impossible. Trials contaminated by eye movement and blink artifacts have to be discarded; hence, in standard EEG paradigms subjects are required to fixate on the screen. To overcome this restriction, several correction methods, including regression and blind source separation, have been proposed. Yet no standard automated procedure has been established. By simultaneously recording eye movements and 64-channel EEG during a guided eye movement paradigm, we investigate and review the properties of eye movement artifacts, including corneo-retinal dipole changes, saccadic spike potentials, and eyelid artifacts, and study their interrelations during different types of eye and eyelid movements. In concordance with earlier studies, our results confirm that these artifacts arise from different independent sources and that, depending on electrode site, gaze direction, and choice of reference, these sources contribute differently to the measured signal. We assess the respective implications for artifact correction methods and therefore compare the performance of two prominent approaches, namely linear regression and independent component analysis (ICA). We show and discuss that, due to the independence of eye artifact sources, regression-based correction methods inevitably over- or under-correct individual artifact components, while ICA is in principle suited to address such mixtures of different types of artifacts. Finally, we propose an algorithm which uses eye tracker information to objectively identify eye-artifact related ICA components (ICs) in an automated manner. In the data presented here, the algorithm performed very similarly to human experts when those were given both the topographies of the ICs and their respective activations in a large number of trials. Moreover, it performed more reliably and was almost twice as effective as human experts.
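
    The final step described above, using eye tracker information to pick out eye-artifact ICs, can be sketched on synthetic data. The correlation-threshold criterion, the threshold value, and the simulated signals are assumptions for illustration, not the authors' algorithm.

```python
# Hedged sketch: flag ICA components whose activations covary strongly
# with a simultaneously recorded eye-tracker trace. Synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n_samples = 2000
gaze_x = rng.standard_normal(n_samples).cumsum()   # simulated horizontal gaze

# Three simulated IC activations: one driven by gaze, two unrelated.
ics = np.vstack([
    0.9 * gaze_x + rng.standard_normal(n_samples),  # eye-artifact component
    rng.standard_normal(n_samples),
    rng.standard_normal(n_samples),
])

def flag_eye_ics(ics, gaze, threshold=0.5):
    """Indices of ICs whose activation correlates with the gaze trace."""
    flagged = []
    for i, ic in enumerate(ics):
        r = np.corrcoef(ic, gaze)[0, 1]
        if abs(r) > threshold:
            flagged.append(i)
    return flagged

print(flag_eye_ics(ics, gaze_x))   # -> [0]
```

    Flagged components would then be removed before back-projecting the remaining ICs to clean the EEG.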

  20. Children and Adults Scan Faces of Own and Other Races Differently

    PubMed Central

    Hu, Chao; Wang, Qiandong; Fu, Genyue; Quinn, Paul C.; Lee, Kang

    2014-01-01

    Extensive behavioral and neural evidence suggests that processing of own-race faces differs from that of other-race faces in both adults and infants. However, little research has examined whether and how children scan faces of own and other races differently for face recognition. In this eye-tracking study, Chinese children aged from 4 to 7 years and Chinese adults were asked to remember Chinese and Caucasian faces. None of the participants had any direct contact with foreign individuals. Multi-method analyses of eye-tracking data revealed that regardless of age group, proportional fixation duration on the eyes of Chinese faces was significantly lower than that on the eyes of Caucasian faces, whereas proportional fixation duration on the nose and mouth of Chinese faces was significantly higher than that on the nose and mouth of Caucasian faces. In addition, the amplitude of saccades on Chinese faces was significantly lower than that on Caucasian faces, potentially reflecting finer-grained processing for own-race faces. Moreover, adults’ fixation duration/saccade numbers on the whole faces, proportional fixation percentage on the nose, proportional number of saccades between AOIs, and accuracy in recognizing faces were higher than those of children. These results together demonstrated that an abundance of visual experience with own-race faces and a lack of it with other-race faces may result in differential facial scanning in both adults and children. Furthermore, the increased experience of processing faces may result in a more holistic and advanced scanning strategy in Chinese adults. PMID:24929225

  1. Measuring vigilance decrement using computer vision assisted eye tracking in dynamic naturalistic environments.

    PubMed

    Bodala, Indu P; Abbasi, Nida I; Yu Sun; Bezerianos, Anastasios; Al-Nashash, Hasan; Thakor, Nitish V

    2017-07-01

    Eye tracking offers a practical solution for monitoring cognitive performance in real-world tasks. However, eye tracking in dynamic environments is difficult due to the high spatial and temporal variation of stimuli, and requires thorough investigation. In this paper, we study the possibility of developing a novel computer-vision-assisted eye tracking analysis using fixations. Eye movement data were obtained from a long-duration naturalistic driving experiment. The scale-invariant feature transform (SIFT) algorithm was implemented using the VLFeat toolbox to identify multiple areas of interest (AOIs). A new measure called 'fixation score' was defined to capture the dynamics of fixation position between the target AOI and the non-target AOIs. The fixation score is maximal when the subjects focus on the target AOI and diminishes when they gaze at the non-target AOIs. A statistically significant negative correlation was found between fixation score and reaction time data (r = -0.2253, p < 0.05). This implies that with vigilance decrement, the fixation score decreases as visual attention shifts away from the target objects, resulting in an increase in reaction time.
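
    The abstract does not give the fixation score's formula, so the following is only an illustrative sketch in its stated spirit: maximal when a fixation lands on the target AOI and diminishing as gaze drifts toward non-target AOIs. The linear decay and parameter names are assumptions.

```python
# Illustrative 'fixation score': 1.0 at the target AOI center,
# falling linearly to 0 at a cutoff distance. The decay function
# is an assumption, not the paper's definition.
import math

def fixation_score(fix, target_center, max_dist):
    """Score a fixation point by its distance to the target AOI center."""
    d = math.dist(fix, target_center)
    return max(0.0, 1.0 - d / max_dist)

print(fixation_score((100, 100), (100, 100), max_dist=200))  # on target -> 1.0
print(fixation_score((100, 300), (100, 100), max_dist=200))  # far away -> 0.0
```

    A per-trial average of such scores could then be correlated against reaction times, as the reported r = -0.2253 was.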

  2. A model that integrates eye velocity commands to keep track of smooth eye displacements.

    PubMed

    Blohm, Gunnar; Optican, Lance M; Lefèvre, Philippe

    2006-08-01

    Past results have reported conflicting findings on the oculomotor system's ability to keep track of smooth eye movements in darkness. Whereas some results indicate that saccades cannot compensate for smooth eye displacements, others report that memory-guided saccades during smooth pursuit are spatially correct. Recently, it was shown that the amount of time before the saccade made a difference: short-latency saccades were retinotopically coded, whereas long-latency saccades were spatially coded. Here, we propose a model of the saccadic system that can explain the available experimental data. The novel part of this model consists of a delayed integration of efferent smooth eye velocity commands. Two alternative physiologically realistic neural mechanisms for this integration stage are proposed. Model simulations accurately reproduced prior findings. Thus, this model reconciles the earlier contradictory reports from the literature about compensation for smooth eye movements before saccades because it involves a slow integration process.

  3. Visual attention on a respiratory function monitor during simulated neonatal resuscitation: an eye-tracking study.

    PubMed

    Katz, Trixie A; Weinberg, Danielle D; Fishman, Claire E; Nadkarni, Vinay; Tremoulet, Patrice; Te Pas, Arjan B; Sarcevic, Aleksandra; Foglia, Elizabeth E

    2018-06-14

    A respiratory function monitor (RFM) may improve positive pressure ventilation (PPV) technique, but many providers do not use RFM data appropriately during delivery room resuscitation. We sought to use eye-tracking technology to identify the RFM parameters that neonatal providers view most commonly during simulated PPV. Mixed methods study. Neonatal providers performed RFM-guided PPV on a neonatal manikin while wearing eye-tracking glasses to quantify visual attention on displayed RFM parameters (ie, exhaled tidal volume, flow, leak). Participants subsequently provided qualitative feedback on the eye-tracking glasses. Level 3 academic neonatal intensive care unit. Twenty neonatal resuscitation providers. Visual attention: overall gaze sample percentage; total gaze duration, visit count and average visit duration for each displayed RFM parameter. Qualitative feedback: willingness to wear eye-tracking glasses during clinical resuscitation. Twenty providers participated in this study. The mean gaze sample captured was 93% (SD 4%). The exhaled tidal volume waveform was the RFM parameter with the highest total gaze duration (median 23%, IQR 13-51%), highest visit count (median 5.17 per 10 s, IQR 2.82-6.16) and longest visit duration (median 0.48 s, IQR 0.38-0.81 s). All participants were willing to wear the glasses during clinical resuscitation. Wearable eye-tracking technology is feasible for identifying gaze fixation on the RFM display and is well accepted by providers. Neonatal providers look at exhaled tidal volume more than any other RFM parameter. Future applications of eye-tracking technology include use during clinical resuscitation.
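
    The visual-attention metrics named above (total gaze duration, visit count, average visit duration) can be computed from a stream of gaze samples labeled with the RFM parameter being viewed. The sample labels, sampling interval, and function names below are assumptions for illustration.

```python
# Sketch of per-AOI gaze metrics from labeled gaze samples.
# A "visit" starts whenever the gaze label changes to a new AOI.

def visit_metrics(labels, dt_ms):
    """labels: per-sample AOI label (or None); dt_ms: ms per sample."""
    metrics = {}
    prev = None
    for lab in labels:
        if lab is not None:
            m = metrics.setdefault(lab, {"duration": 0.0, "visits": 0})
            m["duration"] += dt_ms
            if lab != prev:
                m["visits"] += 1      # new visit begins on label change
        prev = lab
    for m in metrics.values():
        m["avg_visit"] = m["duration"] / m["visits"]
    return metrics

# Hypothetical 10 Hz stream: tidal volume, flow, off-screen, tidal again.
samples = ["tidal"] * 6 + ["flow"] * 2 + [None] * 2 + ["tidal"] * 4
m = visit_metrics(samples, dt_ms=100)
print(m["tidal"])   # -> {'duration': 1000.0, 'visits': 2, 'avg_visit': 500.0}
```

    The same bookkeeping scales to real eye-tracker exports, where each sample carries a timestamp and an AOI hit label.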

  4. Optical eye tracking system for real-time noninvasive tumor localization in external beam radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Via, Riccardo, E-mail: riccardo.via@polimi.it; Fassi, Aurora; Fattori, Giovanni

    Purpose: External beam radiotherapy currently represents an important therapeutic strategy for the treatment of intraocular tumors. Accurate target localization and efficient compensation of involuntary eye movements are crucial to avoid deviations in dose distribution with respect to the treatment plan. This paper describes an eye tracking system (ETS) based on noninvasive infrared video imaging. The system was designed for capturing the tridimensional (3D) ocular motion and provides an on-line estimation of intraocular lesions position based on a priori knowledge coming from volumetric imaging. Methods: Eye tracking is performed by localizing cornea and pupil centers on stereo images captured by two calibrated video cameras, exploiting eye reflections produced by infrared illumination. Additionally, torsional eye movements are detected by template matching in the iris region of eye images. This information allows estimating the 3D position and orientation of the eye by means of an eye local reference system. By combining ETS measurements with volumetric imaging for treatment planning [computed tomography (CT) and magnetic resonance (MR)], one is able to map the position of the lesion to be treated in local eye coordinates, thus enabling real-time tumor referencing during treatment setup and irradiation. Experimental tests on an eye phantom and seven healthy subjects were performed to assess ETS tracking accuracy. Results: Measurements on phantom showed an overall median accuracy within 0.16 mm and 0.40° for translations and rotations, respectively. Torsional movements were affected by 0.28° median uncertainty. On healthy subjects, the gaze direction error ranged between 0.19° and 0.82° at a median working distance of 29 cm. The median processing time of the eye tracking algorithm was 18.60 ms, thus allowing eye monitoring up to 50 Hz. Conclusions: A noninvasive ETS prototype was designed to perform real-time target localization and eye movement

  5. NMR Spectra through the Eyes of a Student: Eye Tracking Applied to NMR Items

    ERIC Educational Resources Information Center

    Topczewski, Joseph J.; Topczewski, Anna M.; Tang, Hui; Kendhammer, Lisa K.; Pienta, Norbert J.

    2017-01-01

    Nuclear magnetic resonance spectroscopy (NMR) plays a key role in introductory organic chemistry, spanning theory, concepts, and experimentation. Therefore, it is imperative that the instruction methods for NMR are both efficient and effective. By utilizing eye tracking equipment, the researchers were able to monitor how second-semester organic…

  6. High-resolution eye tracking using V1 neuron activity

    PubMed Central

    McFarland, James M.; Bondy, Adrian G.; Cumming, Bruce G.; Butts, Daniel A.

    2014-01-01

    Studies of high-acuity visual cortical processing have been limited by the inability to track eye position with sufficient accuracy to precisely reconstruct the visual stimulus on the retina. As a result, studies on primary visual cortex (V1) have been performed almost entirely on neurons outside the high-resolution central portion of the visual field (the fovea). Here we describe a procedure for inferring eye position using multi-electrode array recordings from V1 coupled with nonlinear stimulus processing models. We show that this method can be used to infer eye position with one arc-minute accuracy – significantly better than conventional techniques. This allows for analysis of foveal stimulus processing, and provides a means to correct for eye-movement induced biases present even outside the fovea. This method could thus reveal critical insights into the role of eye movements in cortical coding, as well as their contribution to measures of cortical variability. PMID:25197783

  7. Auditory display as feedback for a novel eye-tracking system for sterile operating room interaction.

    PubMed

    Black, David; Unger, Michael; Fischer, Nele; Kikinis, Ron; Hahn, Horst; Neumuth, Thomas; Glaser, Bernhard

    2018-01-01

    The growing number of technical systems in the operating room has increased attention on developing touchless interaction methods for sterile conditions. However, touchless interaction paradigms lack the tactile feedback found in common input devices such as mice and keyboards. We propose a novel touchless eye-tracking interaction system with auditory display as a feedback method for completing typical operating room tasks. Auditory display provides feedback concerning the selected input into the eye-tracking system as well as a confirmation of the system response. An eye-tracking system with a novel auditory display using both earcons and parameter-mapping sonification was developed to allow touchless interaction for six typical scrub nurse tasks. An evaluation with novice participants compared auditory display with visual display with respect to reaction time and a series of subjective measures. When using auditory display to substitute for the lost tactile feedback during eye-tracking interaction, participants exhibit reduced reaction time compared to using visual-only display. In addition, the auditory feedback led to lower subjective workload and higher usefulness and system acceptance ratings. Due to the absence of tactile feedback for eye-tracking and other touchless interaction methods, auditory display is shown to be a useful and necessary addition to new interaction concepts for the sterile operating room, reducing reaction times while improving subjective measures, including usefulness, user satisfaction, and cognitive workload.

  8. Eye-Hand Synergy and Intermittent Behaviors during Target-Directed Tracking with Visual and Non-visual Information

    PubMed Central

    Huang, Chien-Ting; Hwang, Ing-Shiou

    2012-01-01

    Visual feedback and non-visual information play different roles in tracking of an external target. This study explored the respective roles of the visual and non-visual information in eleven healthy volunteers who coupled the manual cursor to a rhythmically moving target of 0.5 Hz under three sensorimotor conditions: eye-alone tracking (EA), eye-hand tracking with visual feedback of manual outputs (EH tracking), and the same tracking without such feedback (EHM tracking). Tracking error, kinematic variables, and movement intermittency (saccade and speed pulse) were contrasted among tracking conditions. The results showed that EHM tracking exhibited larger pursuit gain, less tracking error, and less movement intermittency for the ocular plant than EA tracking. With the vision of manual cursor, EH tracking achieved superior tracking congruency of the ocular and manual effectors with smaller movement intermittency than EHM tracking, except that the rate precision of manual action was similar for both types of tracking. The present study demonstrated that visibility of manual consequences altered mutual relationships between movement intermittency and tracking error. The speed pulse metrics of manual output were linked to ocular tracking error, and saccade events were time-locked to the positional error of manual tracking during EH tracking. In conclusion, peripheral non-visual information is critical to smooth pursuit characteristics and rate control of rhythmic manual tracking. Visual information adds to eye-hand synchrony, underlying improved amplitude control and elaborate error interpretation during oculo-manual tracking. PMID:23236498

  9. Can Individuals with Autism Abstract Prototypes of Natural Faces?

    ERIC Educational Resources Information Center

    Gastgeb, Holly Zajac; Wilkinson, Desiree A.; Minshew, Nancy J.; Strauss, Mark S.

    2011-01-01

    There is a growing amount of evidence suggesting that individuals with autism have difficulty with face processing. One basic cognitive ability that may underlie face processing difficulties is the ability to abstract a prototype. The current study examined prototype formation with natural faces using eye-tracking in high-functioning adults with…

  10. Using eye tracking to identify faking attempts during penile plethysmography assessment.

    PubMed

    Trottier, Dominique; Rouleau, Joanne-Lucine; Renaud, Patrice; Goyette, Mathieu

    2014-01-01

    Penile plethysmography (PPG) is considered the most rigorous method for sexual interest assessment. Nevertheless, it is subject to faking attempts by participants, which compromises the internal validity of the instrument. To date, various attempts have been made to limit voluntary control of sexual response during PPG assessments, without satisfactory results. This exploratory research examined eye-tracking technologies' ability to identify the presence of cognitive strategies responsible for erectile inhibition during PPG assessment. Eye movements and penile responses for 20 subjects were recorded while exploring animated human-like computer-generated stimuli in a virtual environment under three distinct viewing conditions: (a) the free visual exploration of a preferred sexual stimulus without erectile inhibition; (b) the viewing of a preferred sexual stimulus with erectile inhibition; and (c) the free visual exploration of a non-preferred sexual stimulus. Results suggest that attempts to control erectile responses generate specific eye-movement variations, characterized by a general deceleration of the exploration process and limited exploration of the erogenous zone. Findings indicate that recording eye movements can provide significant information on the presence of competing covert processes responsible for erectile inhibition. The use of eye-tracking technologies during PPG could therefore lead to improved internal validity of the plethysmographic procedure.

  11. APPLICATION OF EYE TRACKING FOR MEASUREMENT AND EVALUATION IN HUMAN FACTORS STUDIES IN CONTROL ROOM MODERNIZATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kovesdi, C.; Spielman, Z.; LeBlanc, K.

    An important element of human factors engineering (HFE) pertains to measurement and evaluation (M&E). The role of HFE M&E should be integrated throughout the entire control room modernization (CRM) process and be used for human-system performance evaluation and for diagnostic purposes in resolving potential human engineering deficiencies (HEDs) and other human-machine interface (HMI) design issues. NUREG-0711 describes how HFE in CRM should employ a hierarchical set of measures, particularly during integrated system validation (ISV), including plant performance, personnel task performance, situation awareness, cognitive workload, and anthropometric/physiological factors. Historically, subjective measures have been used primarily because they are easier to collect and do not require specialized equipment. However, relying solely on subjective measures in M&E has pitfalls that negatively impact reliability, sensitivity, and objectivity. As part of comprehensively capturing a diverse set of measures that strengthen findings and inferences about the benefits of emerging technologies like advanced displays, this paper discusses the value of using eye tracking as an objective method in M&E. A brief description of eye tracking technology and relevant eye tracking measures is provided. Additionally, technical considerations and the unique challenges of using eye tracking in full-scale simulations are addressed. Finally, this paper shares preliminary findings regarding the use of a wearable eye tracking system in a full-scale simulator study. These findings should help guide future full-scale simulator studies using eye tracking as a methodology to evaluate human-system performance.

  12. Understanding Health Literacy Measurement Through Eye Tracking

    PubMed Central

    Mackert, Michael; Champlin, Sara E.; Pasch, Keryn E.; Weiss, Barry D.

    2013-01-01

    This study used eye-tracking technology to explore how individuals with different levels of health literacy view health-related information. The authors recruited 25 university administrative staff (more likely to have adequate health literacy skills) and 25 adults enrolled in an adult literacy program (more likely to have limited health literacy skills). The authors administered the Newest Vital Sign (NVS) health literacy assessment to each participant. The assessment involves having individuals answer questions about a nutrition label while viewing the label. The authors used computerized eye-tracking technology to measure the amount of time each participant spent fixating on nutrition label information that was relevant to the questions being asked and the amount of time they spent viewing nonrelevant information. Results showed that lower NVS scores were significantly associated with more time spent on information not relevant for answering the NVS items. This finding suggests that efforts to improve health literacy measurement should include the ability to differentiate not just between individuals who have difficulty interpreting and using health information, but also between those who have difficulty finding relevant information. In addition, this finding suggests that health education material should minimize the inclusion of nonrelevant information. PMID:24093355

  13. Eye-tracking for clinical decision support: A method to capture automatically what physicians are viewing in the EMR.

    PubMed

    King, Andrew J; Hochheiser, Harry; Visweswaran, Shyam; Clermont, Gilles; Cooper, Gregory F

    2017-01-01

    Eye-tracking is a valuable research tool that is used in laboratory and limited field environments. We take steps toward developing methods that enable widespread adoption of eye-tracking and its real-time application in clinical decision support. Eye-tracking will enhance awareness and enable intelligent views, more precise alerts, and other forms of decision support in the Electronic Medical Record (EMR). We evaluated a low-cost eye-tracking device and found the device's accuracy to be non-inferior to a more expensive device. We also developed and evaluated an automatic method for mapping eye-tracking data to interface elements in the EMR (e.g., a displayed laboratory test value). Mapping was 88% accurate across the six participants in our experiment. Finally, we piloted the use of the low-cost device and the automatic mapping method to label training data for a Learning EMR (LEMR), a system that highlights the EMR elements a physician is predicted to use.
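
    The mapping step described above, assigning raw gaze coordinates to EMR interface elements, reduces to a hit test against element bounding boxes. The sketch below is an illustrative reconstruction, not the authors' implementation; the element names, coordinates, and accuracy metric are hypothetical.

```python
# Map gaze samples to the interface element whose bounding box contains them.
# Boxes are (left, top, right, bottom) in screen pixels; layout is hypothetical.
ELEMENTS = {
    "lab_value_sodium": (100, 200, 300, 230),
    "lab_value_potassium": (100, 240, 300, 270),
    "vitals_panel": (320, 200, 600, 400),
}

def map_gaze_to_element(x, y, elements=ELEMENTS):
    """Return the name of the element under the gaze point, or None."""
    for name, (left, top, right, bottom) in elements.items():
        if left <= x <= right and top <= y <= bottom:
            return name
    return None

def mapping_accuracy(samples, labels):
    """Fraction of gaze samples mapped to their true (hand-labelled) element."""
    hits = sum(map_gaze_to_element(x, y) == truth
               for (x, y), truth in zip(samples, labels))
    return hits / len(samples)
```

    Against a hand-labelled sample set, `mapping_accuracy` would yield the kind of per-participant percentage the abstract reports.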

  14. Eye-tracking for clinical decision support: A method to capture automatically what physicians are viewing in the EMR

    PubMed Central

    King, Andrew J.; Hochheiser, Harry; Visweswaran, Shyam; Clermont, Gilles; Cooper, Gregory F.

    2017-01-01

    Eye-tracking is a valuable research tool that is used in laboratory and limited field environments. We take steps toward developing methods that enable widespread adoption of eye-tracking and its real-time application in clinical decision support. Eye-tracking will enhance awareness and enable intelligent views, more precise alerts, and other forms of decision support in the Electronic Medical Record (EMR). We evaluated a low-cost eye-tracking device and found the device’s accuracy to be non-inferior to a more expensive device. We also developed and evaluated an automatic method for mapping eye-tracking data to interface elements in the EMR (e.g., a displayed laboratory test value). Mapping was 88% accurate across the six participants in our experiment. Finally, we piloted the use of the low-cost device and the automatic mapping method to label training data for a Learning EMR (LEMR), a system that highlights the EMR elements a physician is predicted to use. PMID:28815151

  15. Mobile Eye Tracking Reveals Little Evidence for Age Differences in Attentional Selection for Mood Regulation

    PubMed Central

    Isaacowitz, Derek M.; Livingstone, Kimberly M.; Harris, Julia A.; Marcotte, Stacy L.

    2014-01-01

    We report two studies representing the first use of mobile eye tracking to study emotion regulation across adulthood. Past research on age differences in attentional deployment using stationary eye tracking has found older adults show relatively more positive looking, and seem to benefit more mood-wise from this looking pattern, compared to younger adults. However, these past studies have greatly constrained the stimuli participants can look at, despite real-world settings providing numerous possibilities for what to choose to look at. We therefore used mobile eye tracking to study age differences in attentional selection, as indicated by fixation patterns to stimuli of different valence freely chosen by the participant. In contrast to stationary eye tracking studies of attentional deployment, Study 1 showed that younger and older individuals generally selected similar proportions of valenced stimuli, and attentional selection had similar effects on mood across age groups. Study 2 replicated this pattern with an adult lifespan sample including middle-aged individuals. Emotion regulation-relevant attention may thus differ depending on whether stimuli are freely chosen or not. PMID:25527965

  16. Processing Control Information in a Nominal Control Construction: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Kwon, Nayoung; Sturt, Patrick

    2016-01-01

    In an eye-tracking experiment, we examined the processing of the nominal control construction. Participants' eye-movements were monitored while they read sentences that included either giver control nominals (e.g. "promise" in "Luke's promise to Sophia to photograph himself") or recipient control nominals (e.g. "plea"…

  17. Making Tracks on Mars (left-eye)

    NASA Technical Reports Server (NTRS)

    2004-01-01

    NASA's Mars Exploration Rover Spirit has been making tracks on Mars for seven months now, well beyond its original 90-day mission. The rover traveled more than 3 kilometers (2 miles) to reach the 'Columbia Hills' pictured here. In this 360-degree view of the rolling martian terrain, its wheel tracks can be seen approaching from the northwest (right side of image).

    Spirit's navigation camera took the images that make up this mosaic on sols 210 and 213 (Aug. 5 and Aug. 8, 2004). The rover is now conducting scientific studies of the local geology on the 'Clovis' outcrop of the 'West Spur' region of the 'Columbia Hills.' The view is presented in a cylindrical-perspective projection with geometrical seam correction. This is the left-eye view of a stereo pair. Scientists plan for Spirit to take a color panoramic image from this location.

  18. Keeping an eye on pain: investigating visual attention biases in individuals with chronic pain using eye-tracking methodology

    PubMed Central

    Fashler, Samantha R; Katz, Joel

    2016-01-01

    Attentional biases to painful stimuli are evident in individuals with chronic pain, although the directional tendency of these biases (ie, toward or away from threat-related stimuli) remains unclear. This study used eye-tracking technology, a measure of visual attention, to evaluate the attentional patterns of individuals with and without chronic pain during exposure to injury-related and neutral pictures. Individuals with (N=51) and without chronic pain (N=62) completed a dot-probe task using injury-related and neutral pictures while their eye movements were recorded. Mixed-design analysis of variance evaluated the interaction between group (chronic pain, pain-free) and picture type (injury-related, neutral). Reaction time results showed that regardless of chronic pain status, participants responded faster to trials with neutral stimuli in comparison to trials that included injury-related pictures. Eye-tracking measures showed within-group differences whereby injury-related pictures received more frequent fixations and visits, as well as longer average visit durations. Between-group differences showed that individuals with chronic pain had fewer fixations and shorter average visit durations for all stimuli. An examination of how biases change over the time-course of stimulus presentation showed that during the late phase of attention, individuals with chronic pain had longer average gaze durations on injury pictures relative to pain-free individuals. The results show the advantage of incorporating eye-tracking methodology when examining attentional biases, and suggest future avenues of research. PMID:27570461

  19. Towards free 3D end-point control for robotic-assisted human reaching using binocular eye tracking.

    PubMed

    Maimon-Dror, Roni O; Fernandez-Quesada, Jorge; Zito, Giuseppe A; Konnaris, Charalambos; Dziemian, Sabine; Faisal, A Aldo

    2017-07-01

    Eye movements are the only directly observable behavioural signals that are highly correlated with actions at the task level and that precede body movements, thus reflecting action intentions. Moreover, eye movements are preserved in many movement disorders leading to paralysis, including stroke, spinal cord injury, Parkinson's disease, multiple sclerosis, and muscular dystrophy, and after amputation. Despite this benefit, eye tracking is not widely used as a control interface for robotic systems in movement-impaired patients, owing to poor human-robot interfaces. We demonstrate here how combining 3D gaze tracking, using our GT3D binocular eye tracker, with a custom-designed 3D head tracking system and calibration method enables continuous 3D end-point control of a robotic arm support system. Users can move their own hand to any location in the workspace simply by looking at the target and winking once. This purely eye-tracking-based system lets the end-user retain free head movement and yet achieves high spatial end-point accuracy, on the order of 6 cm RMSE in each dimension with a standard deviation of 4 cm. 3D calibration is achieved by moving the robot along a three-dimensional space-filling Peano curve while the user tracks it with their eyes. This results in a fully automated calibration procedure that yields several thousand calibration points, versus the dozen or so points of standard approaches, resulting in beyond state-of-the-art 3D accuracy and precision.
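
    The reported accuracy figure, RMSE per dimension between gaze-selected end-points and the intended 3D targets, can be computed as follows. A minimal sketch under the assumption that matched end-point/target pairs are given in centimetres; it is not the GT3D calibration pipeline itself.

```python
import math

def rmse_per_dimension(endpoints, targets):
    """Root-mean-square error in each of x, y, z between measured
    3-D end-points and their intended targets (same units, e.g. cm)."""
    n = len(endpoints)
    errors = []
    for dim in range(3):
        squared = sum((e[dim] - t[dim]) ** 2 for e, t in zip(endpoints, targets))
        errors.append(math.sqrt(squared / n))
    return errors
```

    A value around 6 cm in each dimension would correspond to the accuracy the abstract reports.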

  20. Mobile Eye Tracking Methodology in Informal E-Learning in Social Groups in Technology-Enhanced Science Centres

    ERIC Educational Resources Information Center

    Magnussen, Rikke; Zachariassen, Maria; Kharlamov, Nikita; Larsen, Birger

    2017-01-01

    This paper presents a methodological discussion of the potential and challenges of involving mobile eye tracking technology in studies of knowledge generation and learning in a science centre context. The methodological exploration is based on eye-tracking studies of audience interaction and knowledge generation in the technology-enhanced health…

  1. Using eye tracking technology to compare the effectiveness of malignant hyperthermia cognitive aid design.

    PubMed

    King, Roderick; Hanhan, Jaber; Harrison, T Kyle; Kou, Alex; Howard, Steven K; Borg, Lindsay K; Shum, Cynthia; Udani, Ankeet D; Mariano, Edward R

    2018-05-15

    Malignant hyperthermia is a rare but potentially fatal complication of anesthesia, and several different cognitive aids designed to facilitate a timely and accurate response to this crisis currently exist. Eye tracking technology can measure voluntary and involuntary eye movements, gaze fixation within an area of interest, and speed of visual response and has been used to a limited extent in anesthesiology. With eye tracking technology, we compared the accessibility of five malignant hyperthermia cognitive aids by collecting gaze data from twelve volunteer participants. Recordings were reviewed and annotated to measure the time required for participants to locate objects on the cognitive aid to provide an answer; cumulative time to answer was the primary outcome. For the primary outcome, there were differences detected between cumulative time to answer survival curves (P < 0.001). Participants demonstrated the shortest cumulative time to answer when viewing the Society for Pediatric Anesthesia (SPA) cognitive aid compared to four other publicly available cognitive aids for malignant hyperthermia, and this outcome was not influenced by the anesthesiologists' years of experience. This is the first study to utilize eye tracking technology in a comparative evaluation of cognitive aid design, and our experience suggests that there may be additional applications of eye tracking technology in healthcare and medical education. Potentially advantageous design features of the SPA cognitive aid include a single page, linear layout, and simple typescript with minimal use of single color blocking.

  2. ENLARGEMENT OF FOVEAL AVASCULAR ZONE IN DIABETIC EYES EVALUATED BY EN FACE OPTICAL COHERENCE TOMOGRAPHY ANGIOGRAPHY.

    PubMed

    Takase, Noriaki; Nozaki, Miho; Kato, Aki; Ozeki, Hironori; Yoshida, Munenori; Ogura, Yuichiro

    2015-11-01

    To evaluate the area of the foveal avascular zone (FAZ) detected by en face OCTA (AngioVue, Avanti OCT; Optovue) in healthy and diabetic eyes. Retrospective chart review of patients who underwent fundus examination including en face OCTA. Eyes with proliferative diabetic retinopathy and a history of laser photocoagulation were excluded. The FAZ areas in the superficial and deep plexus layers were measured and evaluated using ImageJ software. The FAZ area in the superficial layer was 0.25 ± 0.06 mm² in healthy eyes (n = 19), whereas it was 0.37 ± 0.07 mm² in diabetic eyes without retinopathy (n = 24) and 0.38 ± 0.11 mm² in eyes with diabetic retinopathy (n = 20). Diabetic eyes showed statistically significant FAZ enlargement compared with healthy eyes, regardless of the presence of retinopathy (P < 0.01). The FAZ area in the deep plexus layer was also significantly larger in diabetic eyes than in healthy eyes (P < 0.01). Our data suggest that diabetic eyes show retinal microcirculation impairment in the macula even before retinopathy develops. En face OCTA is a useful noninvasive screening tool for detecting early microcirculatory disturbance in patients with diabetes.
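
    The area measurement itself amounts to counting segmented FAZ pixels and scaling by the pixel footprint, which is what an ImageJ area measurement does internally. A minimal sketch, assuming a hypothetical binary mask and image scale, not the authors' ImageJ workflow:

```python
def area_mm2(mask, mm_per_px):
    """Area of a binary segmentation mask (rows of 0/1 values) in mm²,
    given the image scale in millimetres per pixel."""
    n_pixels = sum(sum(row) for row in mask)
    return n_pixels * mm_per_px ** 2
```

    For instance, a 100-pixel FAZ region at an assumed 0.05 mm/px scale corresponds to 0.25 mm², the mean superficial FAZ area reported for healthy eyes.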

  3. Accounting for direction and speed of eye motion in planning visually guided manual tracking.

    PubMed

    Leclercq, Guillaume; Blohm, Gunnar; Lefèvre, Philippe

    2013-10-01

    Accurate motor planning in a dynamic environment is a critical skill for humans because we are often required to react quickly and adequately to the visual motion of objects. Moreover, we are often in motion ourselves, and this complicates motor planning. Indeed, the retinal and spatial motions of an object are different because of the retinal motion component induced by self-motion. Many studies have investigated motion perception during smooth pursuit and concluded that eye velocity is partially taken into account by the brain. Here we investigate whether the eye velocity during ongoing smooth pursuit is taken into account for the planning of visually guided manual tracking. We had 10 human participants manually track a target while in steady-state smooth pursuit toward another target such that the difference between the retinal and spatial target motion directions could be large, depending on both the direction and the speed of the eye. We used a measure of initial arm movement direction to quantify whether motor planning occurred in retinal coordinates (not accounting for eye motion) or was spatially correct (incorporating eye velocity). Results showed that the eye velocity was nearly fully taken into account by the neuronal areas involved in the visuomotor velocity transformation (between 75% and 102%). In particular, these neuronal pathways accounted for the nonlinear effects due to the relative velocity between the target and the eye. In conclusion, the brain network transforming visual motion into a motor plan for manual tracking adequately uses extraretinal signals about eye velocity.
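
    The retinal/spatial distinction above is vector addition: the spatial target velocity equals the retinal image velocity plus the eye velocity, and a planner that compensates eye motion with gain between 0 and 1 recovers that fraction of the difference. A toy illustration of this arithmetic, not the study's model:

```python
def spatial_velocity(retinal, eye):
    """Spatial target velocity = retinal image velocity + eye velocity."""
    return tuple(r + e for r, e in zip(retinal, eye))

def planned_velocity(retinal, eye, gain):
    """Motor plan with partial compensation of eye velocity.
    gain = 0 -> pure retinal coordinates; gain = 1 -> fully spatial."""
    return tuple(r + gain * e for r, e in zip(retinal, eye))
```

    With the gains of roughly 0.75 to 1.0 reported above, the planned direction lies close to the true spatial direction rather than the retinal one.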

  4. Intuitive Face Judgments Rely on Holistic Eye Movement Pattern.

    PubMed

    Mega, Laura F; Volz, Kirsten G

    2017-01-01

    Non-verbal signals such as facial expressions are of paramount importance for social encounters. Their perception predominantly occurs without conscious awareness and is effortlessly integrated into social interactions. In other words, face perception is intuitive. Contrary to classical intuition tasks, this work investigates intuitive processes in the realm of everyday social judgments. Two differently instructed groups of participants judged the authenticity of emotional facial expressions, while their eye movements were recorded: an 'intuitive group,' instructed to rely on their "gut feeling" for the authenticity judgments, and a 'deliberative group,' instructed to make their judgments after careful analysis of the face. Pixel-wise statistical maps of the resulting eye movements revealed a differential viewing pattern, wherein the intuitive judgments relied on fewer, longer and more centrally located fixations. These markers have been associated with a global/holistic viewing strategy. The holistic pattern of intuitive face judgments is in line with evidence showing that intuition is related to processing the "gestalt" of an object, rather than focusing on details. Our work thereby provides further evidence that intuitive processes are characterized by holistic perception, in an understudied and real world domain of intuition research.

  5. Eye-movement strategies in developmental prosopagnosia and "super" face recognition.

    PubMed

    Bobak, Anna K; Parris, Benjamin A; Gregory, Nicola J; Bennetts, Rachel J; Bate, Sarah

    2017-02-01

    Developmental prosopagnosia (DP) is a cognitive condition characterized by a severe deficit in face recognition. Few investigations have examined whether impairments at the early stages of processing may underpin the condition, and it is also unknown whether DP is simply the "bottom end" of the typical face-processing spectrum. To address these issues, we monitored the eye-movements of DPs, typical perceivers, and "super recognizers" (SRs) while they viewed a set of static images displaying people engaged in naturalistic social scenarios. Three key findings emerged: (a) Individuals with more severe prosopagnosia spent less time examining the internal facial region, (b) as observed in acquired prosopagnosia, some DPs spent less time examining the eyes and more time examining the mouth than controls, and (c) SRs spent more time examining the nose-a measure that also correlated with face recognition ability in controls. These findings support previous suggestions that DP is a heterogeneous condition, but suggest that at least the most severe cases represent a group of individuals that qualitatively differ from the typical population. While SRs seem to merely be those at the "top end" of normal, this work identifies the nose as a critical region for successful face recognition.

  6. Tracking the Eye Movement of Four Years Old Children Learning Chinese Words

    ERIC Educational Resources Information Center

    Lin, Dan; Chen, Guangyao; Liu, Yingyi; Liu, Jiaxin; Pan, Jue; Mo, Lei

    2018-01-01

    Storybook reading is the major source of literacy exposure for beginning readers. The present study tracked 4-year-old Chinese children's eye movements while they were reading simulated storybook pages. Their eye-movement patterns were examined in relation to their word learning gains. The same reading list, consisting of 20 two-character Chinese…

  7. Remote vs. head-mounted eye-tracking: a comparison using radiologists reading mammograms

    NASA Astrophysics Data System (ADS)

    Mello-Thoms, Claudia; Gur, David

    2007-03-01

    Eye position monitoring has been used for decades in Radiology in order to determine how radiologists interpret medical images. Using these devices several discoveries about the perception/decision making process have been made, such as the importance of comparisons of perceived abnormalities with selected areas of the background, the likelihood that a true lesion will attract visual attention early in the reading process, and the finding that most misses attract prolonged visual dwell, often comparable to dwell in the location of reported lesions. However, eye position tracking is a cumbersome process, which often requires the observer to wear helmet gear containing the eye tracker per se and a magnetic head tracker, which allows for the computation of head position. Observers tend to complain of fatigue after wearing the gear for a prolonged time. Recently, with the advances made in remote eye-tracking, the use of head-mounted systems seemed destined to become a thing of the past. In this study we evaluated a remote eye tracking system, and compared it to a head-mounted system, as radiologists read a case set of one-view mammograms on a high-resolution display. We compared visual search parameters between the two systems, such as time to hit the location of the lesion for the first time, amount of dwell time in the location of the lesion, total time analyzing the image, etc. We also evaluated the observers' impressions of both systems, and what their perceptions were of the restrictions of each system.

  8. Face value: eye movements and the evaluation of facial crowds in social anxiety.

    PubMed

    Lange, Wolf-Gero; Heuer, Kathrin; Langner, Oliver; Keijsers, Ger P J; Becker, Eni S; Rinck, Mike

    2011-09-01

    Scientific evidence is equivocal on whether Social Anxiety Disorder (SAD) is characterized by a biased negative evaluation of (grouped) facial expressions, even though it is assumed that such a bias plays a crucial role in the maintenance of the disorder. To shed light on the underlying mechanisms of face evaluation in social anxiety, the eye movements of 22 highly socially anxious (SAs) and 21 non-anxious controls (NACs) were recorded while they rated the degree of friendliness of neutral-angry and smiling-angry face combinations. While the Crowd Rating Task data showed no significant differences between SAs and NACs, the resultant eye-movement patterns revealed that SAs, compared to NACs, looked away faster when the face first fixated was angry. Additionally, in SAs the proportion of fixated angry faces was significantly higher than for other expressions. Independent of social anxiety, these fixated angry faces were the best predictor of subsequent affect ratings for either group. Angry faces influence attentional processes such as eye movements in SAs and by doing so reflect biased evaluations. As these processes do not correlate with explicit ratings of faces, however, it remains unclear at what point implicit attentional behaviors lead to anxiety-prone behaviors and the maintenance of SAD. The relevance of these findings is discussed in the light of the current theories. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Assessing the Potential Use of Eye-Tracking Triangulation for Evaluating the Usability of an Online Diabetes Exercise System.

    PubMed

    Schaarup, Clara; Hartvigsen, Gunnar; Larsen, Lars Bo; Tan, Zheng-Hua; Årsand, Eirik; Hejlesen, Ole Kristian

    2015-01-01

    The Online Diabetes Exercise System was developed to motivate people with Type 2 diabetes to do a 25-minute low-volume high-intensity interval training program. In a previous multi-method evaluation of the system, several usability issues were identified and corrected. Despite the thorough testing, it was unclear whether all usability problems had been identified using the multi-method evaluation. Our hypothesis was that adding eye-tracking triangulation to the multi-method evaluation would increase the accuracy and completeness of usability testing of the system. The study design was an eye-tracking triangulation: conventional eye-tracking with predefined tasks followed by the Post-Experience Eye-Tracked Protocol (PEEP). Six Areas of Interest formed the basis for the PEEP session. The eye-tracking triangulation gave objective and subjective results, which are believed to be highly relevant for designing, implementing, evaluating, and optimizing systems in the field of health informatics. Future work should include testing the method on a larger and more representative group of users and applying the method to different system types.

  10. Prior Knowledge and Online Inquiry-Based Science Reading: Evidence from Eye Tracking

    ERIC Educational Resources Information Center

    Ho, Hsin Ning Jessie; Tsai, Meng-Jung; Wang, Ching-Yeh; Tsai, Chin-Chung

    2014-01-01

    This study employed eye-tracking technology to examine how students with different levels of prior knowledge process text and data diagrams when reading a web-based scientific report. Students' visual behaviors were tracked and recorded when they read a report demonstrating the relationship between the greenhouse effect and global climate…

  11. Correlation peak analysis applied to a sequence of images using two different filters for eye tracking model

    NASA Astrophysics Data System (ADS)

    Patrón, Verónica A.; Álvarez Borrego, Josué; Coronel Beltrán, Ángel

    2015-09-01

    Eye tracking has many useful applications that range from biometrics to face recognition and human-computer interaction. The analysis of the characteristics of the eyes has become one of the methods used to locate the eyes and track the point of gaze. Characteristics such as the contrast between the iris and the sclera, the shape, and the distribution of colors and dark/light zones in the area are the starting point for these analyses. In this work, the focus is on the contrast between the iris and the sclera, performing a correlation in the frequency domain. The images were acquired with an ordinary camera, which was used to capture images of thirty-one volunteers. The reference image shows the subject looking at a point directly ahead, at a 0° angle. Sequences of images were then taken with the subject looking at different angles. These images were processed in MATLAB, obtaining the maximum correlation peak for each image using two different filters. Each filter was analyzed, and the one giving the best performance in terms of the utility of the data was selected; the results are displayed in graphs that show the decay of the correlation peak as the eye moves progressively through different angles. These data will be used to obtain a mathematical model or function that establishes a relationship between the angle of vision (AOV) and the maximum correlation peak (MCP). The model will be tested using input images from subjects not contained in the initial database, making it possible to predict the angle of vision from the maximum correlation peak data.
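
    The core operation, locating the maximum correlation peak in the frequency domain, can be illustrated in one dimension: take the DFT of each signal, multiply one spectrum by the conjugate of the other, invert, and read off the maximum. A stdlib-only sketch using a naive O(N²) DFT for clarity; the paper's two-dimensional filtered correlators are not reproduced here.

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def correlation_peak(reference, probe):
    """Circular cross-correlation computed in the frequency domain.
    Returns (peak_magnitude, lag): the lag is the circular shift of
    `probe` relative to `reference` that maximizes their match."""
    A = dft(reference)
    B = dft(probe)
    c = idft([b * a.conjugate() for a, b in zip(A, B)])
    magnitudes = [abs(v) for v in c]
    peak = max(magnitudes)
    return peak, magnitudes.index(peak)
```

    The peak magnitude is the quantity whose decay with viewing angle the abstract proposes to model.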

  12. A MATLAB-based eye tracking control system using non-invasive helmet head restraint in the macaque.

    PubMed

    De Luna, Paolo; Mohamed Mustafar, Mohamed Faiz Bin; Rainer, Gregor

    2014-09-30

    Tracking eye position is vital for behavioral and neurophysiological investigations in systems and cognitive neuroscience. Infrared camera systems now available can be used for eye tracking without the need to surgically implant magnetic search coils. These systems are generally employed with rigid head fixation in monkeys, which maintains the eye in a constant position and facilitates eye tracking. We investigate the use of non-rigid head fixation using a helmet that constrains only general head orientation and allows some freedom of movement. We present a MATLAB software solution to gather and process eye position data, present visual stimuli, interact with various devices, provide experimenter feedback, and store data for offline analysis. Our software solution achieves excellent timing performance through the use of data streaming, instead of the traditionally employed data-storage mode, for processing analog eye position data. We present behavioral data from two monkeys, demonstrating that adequate performance levels can be achieved on a simple fixation paradigm, and show how performance depends on parameters such as fixation window size. Our findings suggest that non-rigid head restraint can be employed for behavioral training and testing on a variety of gaze-dependent visual paradigms, reducing the need for rigid head restraint systems for some applications. While developed for the macaque monkey, our system can of course work equally well for human eye tracking applications where head constraint is undesirable. Copyright © 2014. Published by Elsevier B.V.
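
    Fixation-window scoring of the kind described (did gaze stay within a window around the fixation target for the required duration?) can be sketched as below. This is a simplified stand-in for the MATLAB streaming implementation, with hypothetical units of degrees and milliseconds and a circular rather than configurable window.

```python
import math

def held_fixation(samples, target, radius_deg, min_hold_ms, sample_ms):
    """True if gaze stayed within `radius_deg` of `target` for at least
    `min_hold_ms` consecutive milliseconds.

    `samples` is a stream of (x, y) gaze positions in degrees, sampled
    every `sample_ms` milliseconds.
    """
    needed = math.ceil(min_hold_ms / sample_ms)  # consecutive in-window samples
    run = 0
    for x, y in samples:
        if math.hypot(x - target[0], y - target[1]) <= radius_deg:
            run += 1
            if run >= needed:
                return True
        else:
            run = 0  # any excursion outside the window resets the hold
    return False
```

    Shrinking `radius_deg` tightens the task in the same way as reducing the fixation window size discussed in the abstract.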

  13. Eye-Tracking as a Tool to Evaluate Functional Ability in Everyday Tasks in Glaucoma.

    PubMed

    Kasneci, Enkelejda; Black, Alex A; Wood, Joanne M

    2017-01-01

    To date, few studies have investigated the eye movement patterns of individuals with glaucoma while they undertake everyday tasks in real-world settings. While some of these studies have reported possible compensatory gaze patterns in those with glaucoma who demonstrated good task performance despite their visual field loss, little is known about the complex interaction between field loss and visual scanning strategies and the impact on task performance and, consequently, on quality of life. We review existing approaches that have quantified the effect of glaucomatous visual field defects on the ability to undertake everyday activities through the use of eye movement analysis. Furthermore, we discuss current developments in eye-tracking technology and the potential for combining eye-tracking with virtual reality and advanced analytical approaches. Recent technological developments suggest that systems based on eye-tracking have the potential to assist individuals with glaucomatous loss to maintain or even improve their performance on everyday tasks and hence enhance their long-term quality of life. We discuss novel approaches for studying the visual search behavior of individuals with glaucoma that have the potential to assist individuals with glaucoma, through the use of personalized programs that take into consideration the individual characteristics of their remaining visual field and visual search behavior.

  14. Eye-Tracking as a Tool to Evaluate Functional Ability in Everyday Tasks in Glaucoma

    PubMed Central

    Black, Alex A.

    2017-01-01

    To date, few studies have investigated the eye movement patterns of individuals with glaucoma while they undertake everyday tasks in real-world settings. While some of these studies have reported possible compensatory gaze patterns in those with glaucoma who demonstrated good task performance despite their visual field loss, little is known about the complex interaction between field loss and visual scanning strategies and the impact on task performance and, consequently, on quality of life. We review existing approaches that have quantified the effect of glaucomatous visual field defects on the ability to undertake everyday activities through the use of eye movement analysis. Furthermore, we discuss current developments in eye-tracking technology and the potential for combining eye-tracking with virtual reality and advanced analytical approaches. Recent technological developments suggest that systems based on eye-tracking have the potential to assist individuals with glaucomatous loss to maintain or even improve their performance on everyday tasks and hence enhance their long-term quality of life. We discuss novel approaches for studying the visual search behavior of individuals with glaucoma that have the potential to assist individuals with glaucoma, through the use of personalized programs that take into consideration the individual characteristics of their remaining visual field and visual search behavior. PMID:28293433

  15. Facial perception of conspecifics: chimpanzees (Pan troglodytes) preferentially attend to proper orientation and open eyes.

    PubMed

    Hirata, Satoshi; Fuwa, Koki; Sugama, Keiko; Kusunoki, Kiyo; Fujita, Shin

    2010-09-01

    This paper reports on the use of an eye-tracking technique to examine how chimpanzees look at facial photographs of conspecifics. Six chimpanzees viewed a sequence of pictures presented on a monitor while their eye movements were measured by an eye tracker. The pictures presented conspecific faces with open or closed eyes in an upright or inverted orientation in a frame. The results demonstrated that chimpanzees looked at the eyes, nose, and mouth more frequently than would be expected on the basis of random scanning of faces. More specifically, they looked at the eyes longer than they looked at the nose and mouth when photographs of upright faces with open eyes were presented, suggesting that particular attention to the eyes represents a spontaneous face-scanning strategy shared among monkeys, apes, and humans. In contrast to the results obtained for upright faces with open eyes, the viewing times for the eyes, nose, and mouth of inverted faces with open eyes did not differ from one another. The viewing times for the eyes, nose, and mouth of faces with closed eyes did not differ when faces with closed eyes were presented in either an upright or inverted orientation. These results suggest the possibility that open eyes play an important role in the configural processing of faces and that chimpanzees perceive and process open and closed eyes differently.

  16. Modeling Face Identification Processing in Children and Adults.

    ERIC Educational Resources Information Center

    Schwarzer, Gudrun; Massaro, Dominic W.

    2001-01-01

    Two experiments studied whether and how 5-year-olds integrate single facial features to identify faces. Results indicated that children could evaluate and integrate information from eye and mouth features to identify a face when salience of features was varied. A weighted Fuzzy Logical Model of Perception fit better than a Single Channel Model,…

  17. Eye-movement assessment of the time course in facial expression recognition: Neurophysiological implications.

    PubMed

    Calvo, Manuel G; Nummenmaa, Lauri

    2009-12-01

    Happy, surprised, disgusted, angry, sad, fearful, and neutral faces were presented extrafoveally, with fixations on faces allowed or not. The faces were preceded by a cue word that designated the face to be saccaded in a two-alternative forced-choice discrimination task (2AFC; Experiments 1 and 2), or were followed by a probe word for recognition (Experiment 3). Eye tracking was used to decompose the recognition process into stages. Relative to the other expressions, happy faces (1) were identified faster (as early as 160 msec from stimulus onset) in extrafoveal vision, as revealed by shorter saccade latencies in the 2AFC task; (2) required less encoding effort, as indexed by shorter first fixations and dwell times; and (3) required less decision-making effort, as indicated by fewer refixations on the face after the recognition probe was presented. This reveals a happy-face identification advantage both prior to and during overt attentional processing. The results are discussed in relation to prior neurophysiological findings on latencies in facial expression recognition.

  18. Subtitles and Eye Tracking: Reading and Performance

    ERIC Educational Resources Information Center

    Kruger, Jan-Louis; Steyn, Faans

    2014-01-01

    This article presents an experimental study to investigate whether subtitle reading has a positive impact on academic performance. In the absence of reliable indexes of reading behavior in dynamic texts, the article first formulates and validates an index to measure the reading of text, such as subtitles on film. Eye-tracking measures (fixations…

  19. Performance Testing Updates in Head, Face, and Eye Protection

    PubMed Central

    2001-01-01

    Objective: To describe the evolution and implementation of standards for head, face, and eye protection in sports. Background: Recent changes in testing standards for head, face, and eye protection include the development of new equipment, the mandating of tougher standards, and the coverage of additional products by these standards, all in an effort to improve athletes' safety and reduce their risk of injury. The person selecting equipment needs to understand these standards, how they are developed for each piece of equipment, and which standards the piece of equipment is purported to meet. Conclusions/Recommendations: The sports medicine clinician must recommend only the use of personal protective equipment that meets a current standard; must ensure that the equipment is maintained in its original form and that all parts and labels are present; and must ascertain that equipment is refurbished by a qualified reconditioner. By following these guidelines, we improve sport safety for our athletes and lessen their risk of injury. PMID:12937504

  20. Instructional Suggestions Supporting Science Learning in Digital Environments Based on a Review of Eye-Tracking Studies

    ERIC Educational Resources Information Center

    Yang, Fang-Ying; Tsai, Meng-Jung; Chiou, Guo-Li; Lee, Silvia Wen-Yu; Chang, Cheng-Chieh; Chen, Li-Ling

    2018-01-01

    The main purpose of this study was to provide instructional suggestions for supporting science learning in digital environments based on a review of eye tracking studies in e-learning related areas. Thirty-three eye-tracking studies from 2005 to 2014 were selected from the Social Science Citation Index (SSCI) database for review. Through a…

  1. Using Eye Tracking as a Tool to Teach Informatics Students the Importance of User Centered Design

    ERIC Educational Resources Information Center

    Gelderblom, Helene; Adebesin, Funmi; Brosens, Jacques; Kruger, Rendani

    2017-01-01

    In this article the authors describe how they incorporate eye tracking in a human-computer interaction (HCI) course that forms part of a postgraduate Informatics degree. The focus is on an eye tracking assignment that involves student groups performing usability evaluation studies for real world clients. Over the past three years the authors have…

  2. Eye Tracking Based Control System for Natural Human-Computer Interaction

    PubMed Central

    Lin, Shu-Fan

    2017-01-01

    Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. To improve the reliability, mobility, and usability of eye-tracking techniques in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode using only the user's eyes. The usage flow of the proposed system is designed to follow natural human habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of different difficulty (searching for an article and browsing multimedia web pages) were performed to compare the proposed eye control tool with an existing system. Technology Acceptance Model (TAM) measures were used to evaluate the perceived effectiveness of the system. The results demonstrate that the proposed system is highly effective with regard to usability and interface design. PMID:29403528

  3. Eye Tracking Based Control System for Natural Human-Computer Interaction.

    PubMed

    Zhang, Xuebai; Liu, Xiaolong; Yuan, Shyan-Ming; Lin, Shu-Fan

    2017-01-01

    Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. To improve the reliability, mobility, and usability of eye-tracking techniques in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode using only the user's eyes. The usage flow of the proposed system is designed to follow natural human habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of different difficulty (searching for an article and browsing multimedia web pages) were performed to compare the proposed eye control tool with an existing system. Technology Acceptance Model (TAM) measures were used to evaluate the perceived effectiveness of the system. The results demonstrate that the proposed system is highly effective with regard to usability and interface design.
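    Eye-controlled mouse replacements of this kind commonly trigger a "click" when gaze dwells within a small region for a threshold time. The paper above does not publish its algorithm, so the sketch below is a generic dwell-click detector over a stream of gaze samples; the radius and time thresholds are invented for illustration.

    ```python
    # Generic dwell-click detection over a stream of (t, x, y) gaze samples.
    # Parameter values are illustrative, not the paper's actual thresholds.
    DWELL_RADIUS = 30.0   # pixels: gaze must stay within this radius
    DWELL_TIME = 1.0      # seconds: ... for this long to count as a click

    def detect_dwell_clicks(samples):
        """Return (t, x, y) of each dwell-click in a gaze sample stream."""
        clicks = []
        anchor = None  # (t, x, y) where the current dwell began
        for t, x, y in samples:
            if anchor is None:
                anchor = (t, x, y)
                continue
            t0, x0, y0 = anchor
            if (x - x0) ** 2 + (y - y0) ** 2 > DWELL_RADIUS ** 2:
                anchor = (t, x, y)          # gaze moved away: restart dwell
            elif t - t0 >= DWELL_TIME:
                clicks.append((t, x0, y0))  # held long enough: emit a click
                anchor = None               # require a fresh dwell next time

        return clicks

    # 30 Hz samples: steady fixation at (100, 100) for ~1.5 s, then a saccade.
    fix = [(i / 30, 100.0, 100.0) for i in range(45)]
    sacc = [(45 / 30 + i / 30, 400.0, 300.0) for i in range(10)]
    print(len(detect_dwell_clicks(fix + sacc)))  # 1
    ```

    Resetting the anchor after each click prevents a sustained fixation from firing repeatedly; a magnifier module like the one described would sit on top of this by enlarging the region around the dwell point before the click is committed.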

  4. Patterns of Visual Attention to Faces and Objects in Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    McPartland, James C.; Webb, Sara Jane; Keehn, Brandon; Dawson, Geraldine

    2011-01-01

    This study used eye-tracking to examine visual attention to faces and objects in adolescents with autism spectrum disorder (ASD) and typical peers. Point of gaze was recorded during passive viewing of images of human faces, inverted human faces, monkey faces, three-dimensional curvilinear objects, and two-dimensional geometric patterns.…

  5. Eye Tracking of Occluded Self-Moved Targets: Role of Haptic Feedback and Hand-Target Dynamics.

    PubMed

    Danion, Frederic; Mathew, James; Flanagan, J Randall

    2017-01-01

    Previous studies on smooth pursuit eye movements have shown that humans can continue to track the position of their hand, or a target controlled by the hand, after it is occluded, thereby demonstrating that arm motor commands contribute to the prediction of target motion driving pursuit eye movements. Here, we investigated this predictive mechanism by manipulating both the complexity of the hand-target mapping and the provision of haptic feedback. Two hand-target mappings were used, either a rigid (simple) one in which hand and target motion matched perfectly or a nonrigid (complex) one in which the target behaved as a mass attached to the hand by means of a spring. Target animation was obtained by asking participants to oscillate a lightweight robotic device that provided (or not) haptic feedback consistent with the target dynamics. Results showed that as long as 7 s after target occlusion, smooth pursuit continued to be the main contributor to total eye displacement (∼60%). However, the accuracy of eye-tracking varied substantially across experimental conditions. In general, eye-tracking was less accurate under the nonrigid mapping, as reflected by higher positional and velocity errors. Interestingly, haptic feedback helped to reduce the detrimental effects of target occlusion when participants used the nonrigid mapping, but not when they used the rigid one. Overall, we conclude that the ability to maintain smooth pursuit in the absence of visual information can extend to complex hand-target mappings, but the provision of haptic feedback is critical for the maintenance of accurate eye-tracking performance.

  6. Eye Tracking of Occluded Self-Moved Targets: Role of Haptic Feedback and Hand-Target Dynamics

    PubMed Central

    Mathew, James

    2017-01-01

    Previous studies on smooth pursuit eye movements have shown that humans can continue to track the position of their hand, or a target controlled by the hand, after it is occluded, thereby demonstrating that arm motor commands contribute to the prediction of target motion driving pursuit eye movements. Here, we investigated this predictive mechanism by manipulating both the complexity of the hand-target mapping and the provision of haptic feedback. Two hand-target mappings were used, either a rigid (simple) one in which hand and target motion matched perfectly or a nonrigid (complex) one in which the target behaved as a mass attached to the hand by means of a spring. Target animation was obtained by asking participants to oscillate a lightweight robotic device that provided (or not) haptic feedback consistent with the target dynamics. Results showed that as long as 7 s after target occlusion, smooth pursuit continued to be the main contributor to total eye displacement (∼60%). However, the accuracy of eye-tracking varied substantially across experimental conditions. In general, eye-tracking was less accurate under the nonrigid mapping, as reflected by higher positional and velocity errors. Interestingly, haptic feedback helped to reduce the detrimental effects of target occlusion when participants used the nonrigid mapping, but not when they used the rigid one. Overall, we conclude that the ability to maintain smooth pursuit in the absence of visual information can extend to complex hand-target mappings, but the provision of haptic feedback is critical for the maintenance of accurate eye-tracking performance. PMID:28680964

  7. Face identification with frequency domain matched filtering in mobile environments

    NASA Astrophysics Data System (ADS)

    Lee, Dong-Su; Woo, Yong-Hyun; Yeom, Seokwon; Kim, Shin-Hwan

    2012-06-01

    Face identification at a distance is very challenging because captured images are often degraded by blur and noise. Furthermore, computational resources and memory are often limited in mobile environments, which makes developing a real-time face identification system on a mobile device especially demanding. This paper discusses face identification based on frequency-domain matched filtering in mobile environments. Face identification is performed by a linear or phase-only matched filter followed by sequential verification stages. The candidate window regions are determined by the major peaks of the linear or phase-only matched filtering output. The sequential stages comprise a skin-color test and an edge-mask filtering test, which verify the color and shape information of the candidate regions in order to remove false alarms. All algorithms are built on a mobile device using the Android platform. The preliminary results show that face identification of East Asian people can be performed successfully in mobile environments.
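    The frequency-domain matching step described above can be sketched as follows. This is a minimal illustration of phase-only matched filtering, not the authors' implementation: the template spectrum is normalized to unit magnitude so only phase contributes, and the major peak of the inverse transform marks a candidate window location.

    ```python
    import numpy as np

    def phase_only_matched_filter(scene, template):
        """Locate a template in a scene via a phase-only matched filter.

        The scene spectrum is multiplied by the phase-normalized conjugate
        of the template spectrum; the inverse FFT's strongest peak gives
        the candidate window position (row, col).
        """
        F_scene = np.fft.fft2(scene)
        F_tmpl = np.fft.fft2(template, s=scene.shape)  # zero-pad template
        # Phase-only filter: conjugate spectrum divided by its magnitude
        # (epsilon guards against division by zero).
        pof = np.conj(F_tmpl) / (np.abs(F_tmpl) + 1e-12)
        corr = np.real(np.fft.ifft2(F_scene * pof))
        peak = tuple(int(v) for v in np.unravel_index(np.argmax(corr), corr.shape))
        return corr, peak

    # Toy example: embed an 8x8 patch in a 64x64 scene and recover it.
    rng = np.random.default_rng(0)
    template = rng.random((8, 8))
    scene = np.zeros((64, 64))
    scene[20:28, 30:38] = template
    _, peak = phase_only_matched_filter(scene, template)
    print(peak)  # (20, 30)
    ```

    The phase-only variant produces a much sharper peak than a plain linear matched filter at the cost of noise sensitivity, which is presumably why the paper pairs the peak detection with skin-color and edge-mask verification stages.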

  8. Placebo effects in spider phobia: an eye-tracking experiment.

    PubMed

    Gremsl, Andreas; Schwab, Daniela; Höfler, Carina; Schienle, Anne

    2018-01-05

    Several eye-tracking studies have revealed that spider phobic patients show a typical hypervigilance-avoidance pattern when confronted with images of spiders. The present experiment investigated if this pattern can be changed via placebo treatment. We conducted an eye-tracking experiment with 37 women with spider phobia. They looked at picture pairs (a spider paired with a neutral picture) for 7 s each in a retest design: once with and once without a placebo pill presented along with the verbal suggestion that it can reduce phobic symptoms. The placebo was labelled as Propranolol, a beta-blocker that has been successfully used to treat spider phobia. In the placebo condition, both the fixation count and the dwell time on the spider pictures increased, especially in the second half of the presentation time. This was associated with a slight decrease in self-reported symptom severity. In summary, we were able to show that a placebo was able to positively influence visual avoidance in spider phobia. This effect might help to overcome apprehension about engaging in exposure therapy, which is present in many phobic patients.

  9. A novel thermal face recognition approach using face pattern words

    NASA Astrophysics Data System (ADS)

    Zheng, Yufeng

    2010-04-01

    A reliable thermal face recognition system can enhance national security applications such as prevention against terrorism, surveillance, monitoring, and tracking, especially at nighttime. The system can be applied at airports, customs, or high-alert facilities (e.g., nuclear power plants) 24 hours a day. In this paper, we propose a novel face recognition approach utilizing thermal (long-wave infrared) face images that can automatically identify a subject at both daytime and nighttime. With a properly acquired thermal image (as a query image) in the monitoring zone, the following processes are employed: normalization and denoising, face detection, face alignment, face masking, Gabor wavelet transform, face pattern word (FPW) creation, and face identification by similarity measure (Hamming distance). If eyeglasses are present on a subject's face, an eyeglasses mask is automatically extracted from the query face image and applied to all FPWs being compared (no further transforms are needed). A high identification rate (97.44% with Top-1 match) has been achieved with the proposed approach on our preliminary face dataset of 39 subjects, regardless of operating time and glasses-wearing condition.
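    The final identification step, comparing binary face pattern words by Hamming distance under an optional eyeglasses mask, can be sketched as below. The FPW construction itself (Gabor wavelet transform and binarization) is not shown, and the function names are illustrative rather than the paper's code.

    ```python
    import numpy as np

    def hamming_similarity(fpw_a, fpw_b, mask=None):
        """Fraction of agreeing bits between two binary face pattern words.

        An optional boolean mask (e.g., an eyeglasses mask) excludes
        unreliable bits from the comparison; only bits where the mask is
        True contribute to the score.
        """
        a = np.asarray(fpw_a, dtype=bool)
        b = np.asarray(fpw_b, dtype=bool)
        valid = np.ones_like(a) if mask is None else np.asarray(mask, dtype=bool)
        agree = (a == b) & valid
        return agree.sum() / valid.sum()

    # Toy example: two 16-bit patterns differing in their first 4 bits.
    a = np.ones(16, dtype=int)
    b = a.copy()
    b[:4] = 0
    print(hamming_similarity(a, b))  # 0.75
    # Masking out the 4 disputed bits restores a perfect match.
    mask = np.concatenate([np.zeros(4, dtype=bool), np.ones(12, dtype=bool)])
    print(hamming_similarity(a, b, mask))  # 1.0
    ```

    Identification then amounts to taking the enrolled FPW with the highest masked similarity to the query (equivalently, the lowest Hamming distance), which matches the Top-1 evaluation reported in the abstract.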

  10. Extracting information of fixational eye movements through pupil tracking

    NASA Astrophysics Data System (ADS)

    Xiao, JiangWei; Qiu, Jian; Luo, Kaiqin; Peng, Li; Han, Peng

    2018-01-01

    Human eyes are never completely static, even when they are fixating a stationary point. These irregular, small movements, which consist of micro-tremors, micro-saccades, and drifts, prevent the fading of the images that enter our eyes. The importance of researching fixational eye movements has been experimentally demonstrated recently. However, the characteristics of fixational eye movements and their roles in visual processing have not been explained clearly, because until now these signals could hardly be completely extracted. In this paper, we developed a new eye movement detection device with a high-speed camera. The device includes a beam splitter mirror, an infrared light source, and a high-speed digital video camera with a frame rate of 200 Hz. To avoid the influence of head shaking, we made the device wearable by fixing the camera on a safety helmet. Using this device, pupil-tracking experiments were conducted. By localizing the pupil center and performing spectrum analysis, the envelope frequency spectra of micro-saccades, micro-tremors, and drifts are clearly revealed. The experimental results show that the device is feasible and effective, so it can be applied in further characteristic analysis.
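    As an illustration of the kind of spectrum analysis described, the sketch below recovers the dominant frequency of a simulated tremor-like oscillation from a pupil-position trace sampled at the camera's 200 Hz frame rate. The signal parameters are invented for illustration, not physiological measurements; note that a 200 Hz frame rate bounds recoverable frequencies at the 100 Hz Nyquist limit, which is what makes micro-tremor (often cited near 80-90 Hz) reachable at all.

    ```python
    import numpy as np

    fs = 200.0                       # camera frame rate, samples per second
    t = np.arange(0, 2.0, 1.0 / fs)  # 2 s of pupil-position samples

    # Simulated trace: an 85 Hz tremor-like oscillation plus a slow drift.
    tremor = 0.05 * np.sin(2 * np.pi * 85 * t)
    drift = 0.002 * t
    trace = tremor + drift

    # Magnitude spectrum of the mean-subtracted trace.
    spectrum = np.abs(np.fft.rfft(trace - trace.mean()))
    freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)

    # Dominant non-DC component should sit at the tremor frequency.
    dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
    print(dominant)  # 85.0
    ```

    On real traces one would first separate micro-saccades (large, fast jumps) from the continuous tremor/drift signal before transforming, since their broadband energy otherwise dominates the spectrum.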

  11. Intuitive Face Judgments Rely on Holistic Eye Movement Pattern

    PubMed Central

    Mega, Laura F.; Volz, Kirsten G.

    2017-01-01

    Non-verbal signals such as facial expressions are of paramount importance for social encounters. Their perception predominantly occurs without conscious awareness and is effortlessly integrated into social interactions. In other words, face perception is intuitive. Contrary to classical intuition tasks, this work investigates intuitive processes in the realm of everyday social judgments. Two differently instructed groups of participants judged the authenticity of emotional facial expressions, while their eye movements were recorded: an ‘intuitive group,’ instructed to rely on their “gut feeling” for the authenticity judgments, and a ‘deliberative group,’ instructed to make their judgments after careful analysis of the face. Pixel-wise statistical maps of the resulting eye movements revealed a differential viewing pattern, wherein the intuitive judgments relied on fewer, longer and more centrally located fixations. These markers have been associated with a global/holistic viewing strategy. The holistic pattern of intuitive face judgments is in line with evidence showing that intuition is related to processing the “gestalt” of an object, rather than focusing on details. Our work thereby provides further evidence that intuitive processes are characterized by holistic perception, in an understudied and real-world domain of intuition research. PMID:28676773

  12. Biracial and Monoracial Infant Own-Race Face Perception: An Eye Tracking Study

    ERIC Educational Resources Information Center

    Gaither, Sarah E.; Pauker, Kristin; Johnson, Scott P.

    2012-01-01

    We know that early experience plays a crucial role in the development of face processing, but we know little about how infants learn to distinguish faces from different races, especially for non-Caucasian populations. Moreover, it is unknown whether differential processing of different race faces observed in typically studied monoracial infants…

  13. Oculomatic: High speed, reliable, and accurate open-source eye tracking for humans and non-human primates.

    PubMed

    Zimmermann, Jan; Vazquez, Yuriria; Glimcher, Paul W; Pesaran, Bijan; Louie, Kenway

    2016-09-01

    Video-based noninvasive eye trackers are an extremely useful tool for many areas of research. Many open-source eye trackers are available but current open-source systems are not designed to track eye movements with the temporal resolution required to investigate the mechanisms of oculomotor behavior. Commercial systems are available but employ closed source hardware and software and are relatively expensive, limiting wide-spread use. Here we present Oculomatic, an open-source software and modular hardware solution to eye tracking for use in humans and non-human primates. Oculomatic features high temporal resolution (up to 600Hz), real-time eye tracking with high spatial accuracy (<0.5°), and low system latency (∼1.8ms, 0.32ms STD) at a relatively low-cost. Oculomatic compares favorably to our existing scleral search-coil system while being fully non invasive. We propose that Oculomatic can support a wide range of research into the properties and neural mechanisms of oculomotor behavior. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Binocular eye movement control and motion perception: what is being tracked?

    PubMed

    van der Steen, Johannes; Dits, Joyce

    2012-10-19

    We investigated under what conditions humans can make independent slow phase eye movements. The ability to make independent movements of the two eyes is generally attributed to a few specialized lateral-eyed animal species, for example chameleons. In our study, we showed that humans can also move the eyes in different directions: to maintain binocular retinal correspondence, independent slow phase movements of each eye are produced. We used the scleral search coil method to measure binocular eye movements in response to dichoptically viewed visual stimuli oscillating in orthogonal directions. Correlated stimuli led to orthogonal slow eye movements, while the binocularly perceived motion was the vector sum of the motion presented to each eye. The importance of binocular fusion for the independence of the movements of the two eyes was investigated with anti-correlated stimuli. The global motion pattern of anti-correlated dichoptic stimuli was perceived as an oblique oscillatory motion and also resulted in a conjugate oblique motion of the eyes. We propose that the ability to make independent slow phase eye movements in humans is used to maintain binocular retinal correspondence. Eye-of-origin and binocular information are used during the processing of binocular visual information, and it is decided at an early stage whether binocular or monocular motion information and independent slow phase eye movements of each eye are produced during binocular tracking.

  15. Binocular Eye Movement Control and Motion Perception: What Is Being Tracked?

    PubMed Central

    van der Steen, Johannes; Dits, Joyce

    2012-01-01

    Purpose. We investigated under what conditions humans can make independent slow phase eye movements. The ability to make independent movements of the two eyes is generally attributed to a few specialized lateral-eyed animal species, for example chameleons. In our study, we showed that humans can also move the eyes in different directions: to maintain binocular retinal correspondence, independent slow phase movements of each eye are produced. Methods. We used the scleral search coil method to measure binocular eye movements in response to dichoptically viewed visual stimuli oscillating in orthogonal directions. Results. Correlated stimuli led to orthogonal slow eye movements, while the binocularly perceived motion was the vector sum of the motion presented to each eye. The importance of binocular fusion for the independence of the movements of the two eyes was investigated with anti-correlated stimuli. The global motion pattern of anti-correlated dichoptic stimuli was perceived as an oblique oscillatory motion and also resulted in a conjugate oblique motion of the eyes. Conclusions. We propose that the ability to make independent slow phase eye movements in humans is used to maintain binocular retinal correspondence. Eye-of-origin and binocular information are used during the processing of binocular visual information, and it is decided at an early stage whether binocular or monocular motion information and independent slow phase eye movements of each eye are produced during binocular tracking. PMID:22997286

  16. Autistic Symptomatology, Face Processing Abilities, and Eye Fixation Patterns

    ERIC Educational Resources Information Center

    Kirchner, Jennifer C.; Hatri, Alexander; Heekeren, Hauke R.; Dziobek, Isabel

    2011-01-01

    Deviant gaze behavior is a defining characteristic of autism. Its relevance as a pathophysiological mechanism, however, remains unknown. In the present study, we compared eye fixations of 20 adults with autism and 21 controls while they were engaged in taking the Multifaceted Empathy Test (MET). Additional measures of face emotion and identity…

  17. The utility of modeling word identification from visual input within models of eye movements in reading

    PubMed Central

    Bicknell, Klinton; Levy, Roger

    2012-01-01

    Decades of empirical work have shown that a range of eye movement phenomena in reading are sensitive to the details of the process of word identification. Despite this, major models of eye movement control in reading do not explicitly model word identification from visual input. This paper presents an argument for developing models of eye movements that do include detailed models of word identification. Specifically, we argue that insights into eye movement behavior can be gained by understanding which phenomena naturally arise from an account in which the eyes move for efficient word identification, and that one important use of such models is to test which eye movement phenomena can be understood this way. As an extended case study, we present evidence from an extension of a previous model of eye movement control in reading that does explicitly model word identification from visual input, Mr. Chips (Legge, Klitz, & Tjan, 1997), to test two proposals for the effect of using linguistic context on reading efficiency. PMID:23074362

  18. Social and attention-to-detail subclusters of autistic traits differentially predict looking at eyes and face identity recognition ability.

    PubMed

    Davis, Joshua; McKone, Elinor; Zirnsak, Marc; Moore, Tirin; O'Kearney, Richard; Apthorp, Deborah; Palermo, Romina

    2017-02-01

    This study distinguished between different subclusters of autistic traits in the general population and examined the relationships between these subclusters, looking at the eyes of faces, and the ability to recognize facial identity. Using the Autism Spectrum Quotient (AQ) measure in a university-recruited sample, we separate the social aspects of autistic traits (i.e., those related to communication and social interaction; AQ-Social) from the non-social aspects, particularly attention-to-detail (AQ-Attention). We provide the first evidence that these social and non-social aspects are associated differentially with looking at eyes: While AQ-Social showed the commonly assumed tendency towards reduced looking at eyes, AQ-Attention was associated with increased looking at eyes. We also report that higher attention-to-detail (AQ-Attention) was then indirectly related to improved face recognition, mediated by increased number of fixations to the eyes during face learning. Higher levels of socially relevant autistic traits (AQ-Social) trended in the opposite direction towards being related to poorer face recognition (significantly so in females on the Cambridge Face Memory Test). There was no evidence of any mediated relationship between AQ-Social and face recognition via reduced looking at the eyes. These different effects of AQ-Attention and AQ-Social suggest face-processing studies in Autism Spectrum Disorder might similarly benefit from considering symptom subclusters. Additionally, concerning mechanisms of face recognition, our results support the view that more looking at eyes predicts better face memory. © 2016 The British Psychological Society.

  19. Development of Face Recognition in Infant Chimpanzees (Pan Troglodytes)

    ERIC Educational Resources Information Center

    Myowa-Yamakoshi, M.; Yamaguchi, M.K.; Tomonaga, M.; Tanaka, M.; Matsuzawa, T.

    2005-01-01

    In this paper, we assessed the developmental changes in face recognition by three infant chimpanzees aged 1-18 weeks, using preferential-looking procedures that measured the infants' eye- and head-tracking of moving stimuli. In Experiment 1, we prepared photographs of the mother of each infant and an ''average'' chimpanzee face using…

  20. The added value of eye-tracking in diagnosing dyscalculia: a case study

    PubMed Central

    van Viersen, Sietske; Slot, Esther M.; Kroesbergen, Evelyn H.; van't Noordende, Jaccoline E.; Leseman, Paul P. M.

    2013-01-01

    The present study compared eye movements and performance of a 9-year-old girl with Developmental Dyscalculia (DD) on a series of number line tasks to those of a group of typically developing (TD) children (n = 10), in order to answer the question whether eye-tracking data from number line estimation tasks can be a useful tool to discriminate between TD children and children with a number processing deficit. Quantitative results indicated that the child with dyscalculia performed worse on all symbolic number line tasks compared to the control group, indicated by a low linear fit (R2) and a low accuracy measured by mean percent absolute error. In contrast to the control group, her magnitude representations seemed to be better represented by a logarithmic than a linear fit. Furthermore, qualitative analyses on the data of the child with dyscalculia revealed more unidentifiable fixation patterns in the processing of multi-digit numbers and more dysfunctional estimation strategy use in one third of the estimation trials as opposed to ~10% in the control group. In line with her dyscalculia diagnosis, these results confirm the difficulties with spatially representing and manipulating numerosities on a number line, resulting in inflexible and inadequate estimation or processing strategies. It can be concluded from this case study that eye-tracking data can be used to discern different number processing and estimation strategies in TD children and children with a number processing deficit. Hence, eye-tracking data in combination with number line estimation tasks might be a valuable and promising addition to current diagnostic measures. PMID:24098294

  1. The added value of eye-tracking in diagnosing dyscalculia: a case study.

    PubMed

    van Viersen, Sietske; Slot, Esther M; Kroesbergen, Evelyn H; Van't Noordende, Jaccoline E; Leseman, Paul P M

    2013-01-01

    The present study compared eye movements and performance of a 9-year-old girl with Developmental Dyscalculia (DD) on a series of number line tasks to those of a group of typically developing (TD) children (n = 10), in order to answer the question whether eye-tracking data from number line estimation tasks can be a useful tool to discriminate between TD children and children with a number processing deficit. Quantitative results indicated that the child with dyscalculia performed worse on all symbolic number line tasks compared to the control group, indicated by a low linear fit (R2) and a low accuracy measured by mean percent absolute error. In contrast to the control group, her magnitude representations seemed to be better represented by a logarithmic than a linear fit. Furthermore, qualitative analyses on the data of the child with dyscalculia revealed more unidentifiable fixation patterns in the processing of multi-digit numbers and more dysfunctional estimation strategy use in one third of the estimation trials as opposed to ~10% in the control group. In line with her dyscalculia diagnosis, these results confirm the difficulties with spatially representing and manipulating numerosities on a number line, resulting in inflexible and inadequate estimation or processing strategies. It can be concluded from this case study that eye-tracking data can be used to discern different number processing and estimation strategies in TD children and children with a number processing deficit. Hence, eye-tracking data in combination with number line estimation tasks might be a valuable and promising addition to current diagnostic measures.
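    The linear-versus-logarithmic fit (R2) and mean percent absolute error (PAE) used in this study are standard number-line measures; the sketch below is a generic reconstruction of them, not the authors' analysis code, and the example estimates are invented.

    ```python
    import numpy as np

    def number_line_fits(targets, estimates, line_max=100):
        """Return (R2 of linear fit, R2 of logarithmic fit, mean PAE).

        targets: the numbers to be placed on a 0..line_max number line.
        estimates: the positions a child actually marked.
        PAE = mean(|estimate - target| / line_max) * 100.
        """
        x = np.asarray(targets, dtype=float)
        y = np.asarray(estimates, dtype=float)

        def r_squared(pred):
            ss_res = np.sum((y - pred) ** 2)
            ss_tot = np.sum((y - y.mean()) ** 2)
            return 1.0 - ss_res / ss_tot

        lin_pred = np.polyval(np.polyfit(x, y, 1), x)
        log_pred = np.polyval(np.polyfit(np.log(x), y, 1), np.log(x))
        pae = np.mean(np.abs(y - x) / line_max) * 100.0
        return r_squared(lin_pred), r_squared(log_pred), pae

    # Toy example: compressed, log-like estimates on a 0-100 line.
    targets = np.array([2, 5, 10, 25, 50, 75, 100])
    estimates = np.array([10, 25, 40, 55, 70, 80, 85])
    r2_lin, r2_log, pae = number_line_fits(targets, estimates)
    print(r2_lin < r2_log)  # True: the log model fits these estimates better
    ```

    A profile like this toy one, where the logarithmic R2 exceeds the linear R2 and PAE is high, is the quantitative pattern the abstract reports for the child with dyscalculia relative to the typically developing controls.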

  2. The Development of Emotional Face and Eye Gaze Processing

    ERIC Educational Resources Information Center

    Hoehl, Stefanie; Striano, Tricia

    2010-01-01

    Recent research has demonstrated that infants' attention towards novel objects is affected by an adult's emotional expression and eye gaze toward the object. The current event-related potential (ERP) study investigated how infants at 3, 6, and 9 months of age process fearful compared to neutral faces looking toward objects or averting gaze away…

  3. Weight and See: Loading Working Memory Improves Incidental Identification of Irrelevant Faces

    PubMed Central

    Carmel, David; Fairnie, Jake; Lavie, Nilli

    2012-01-01

    Are task-irrelevant stimuli processed to a level enabling individual identification? This question is central both for perceptual processing models and for applied settings (e.g., eye-witness testimony). Lavie’s load theory proposes that working memory actively maintains attentional prioritization of relevant over irrelevant information. Loading working memory thus impairs attentional prioritization, leading to increased processing of task-irrelevant stimuli. Previous research has shown that increased working memory load leads to greater interference effects from response-competing distractors. Here we test the novel prediction that increased processing of irrelevant stimuli under high working memory load should lead to a greater likelihood of incidental identification of entirely irrelevant stimuli. To test this, we asked participants to perform a word-categorization task while ignoring task-irrelevant images. The categorization task was performed during the retention interval of a working memory task with either low or high load (defined by memory set size). Following the final experimental trial, a surprise question assessed incidental identification of the irrelevant image. Loading working memory was found to improve identification of task-irrelevant faces, but not of building stimuli (shown in a separate experiment to be less distracting). These findings suggest that working memory plays a critical role in determining whether distracting stimuli will be subsequently identified. PMID:22912623

  4. Tracking without perceiving: a dissociation between eye movements and motion perception.

    PubMed

    Spering, Miriam; Pomplun, Marc; Carrasco, Marisa

    2011-02-01

    Can people react to objects in their visual field that they do not consciously perceive? We investigated how visual perception and motor action respond to moving objects whose visibility is reduced, and we found a dissociation between motion processing for perception and for action. We compared motion perception and eye movements evoked by two orthogonally drifting gratings, each presented separately to a different eye. The strength of each monocular grating was manipulated by inducing adaptation to one grating prior to the presentation of both gratings. Reflexive eye movements tracked the vector average of both gratings (pattern motion) even though perceptual responses followed one motion direction exclusively (component motion). Observers almost never perceived pattern motion. This dissociation implies the existence of visual-motion signals that guide eye movements in the absence of a corresponding conscious percept.

  5. Frequency analysis of gaze points with CT colonography interpretation using eye gaze tracking system

    NASA Astrophysics Data System (ADS)

    Tsutsumi, Shoko; Tamashiro, Wataru; Sato, Mitsuru; Okajima, Mika; Ogura, Toshihiro; Doi, Kunio

    2017-03-01

It is important to investigate the gaze points of experts in order to help trainees understand the image interpretation process. We investigated gaze points in the CT colonography (CTC) interpretation process and analyzed the difference in gaze points between experts and trainees. In this study, we attempted to understand how trainees can be improved to the level achieved by experts in viewing of CTC. We used an eye gaze point sensing system, Gazefineder (JVCKENWOOD Corporation, Tokyo, Japan), which can detect the pupil point and corneal reflection point by dark pupil eye tracking. This system provides gaze-point images and Excel data files. The subjects were radiological technologists either experienced or inexperienced in reading CTC. We performed observer studies in reading virtual pathology images and examined the observers' image interpretation processes using gaze-point data. Furthermore, we performed a frequency analysis of the eye-tracking data using the Fast Fourier Transform (FFT). We were able to understand the difference in gaze points between experts and trainees by use of the frequency analysis. The trainee's data contained large amounts of both high-frequency and low-frequency components. In contrast, both components were relatively low for the expert. Regarding the amount of eye movement in every 0.02-second interval, we found that the expert tended to interpret images slowly and calmly. On the other hand, the trainee moved their eyes quickly and scanned wide areas. We can assess the difference in gaze points on CTC between experts and trainees by use of the eye gaze point sensing system and the frequency analysis. The potential improvements in CTC interpretation for trainees can be evaluated by using gaze-point data.
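The FFT-based gaze analysis described in this record can be sketched as follows. The sampling rate (50 Hz, i.e., one sample per 0.02 s) matches the interval mentioned in the abstract, but the function and the synthetic scanpath are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np

def gaze_frequency_spectrum(series, fs=50.0):
    """Amplitude spectrum of frame-to-frame gaze movement.

    series: 1-D gaze coordinates sampled at fs Hz.  The per-frame
    displacement (first difference) is mean-centred and passed through
    a real FFT; large high-frequency components indicate rapid, jittery
    eye movement, large low-frequency components indicate slow sweeps
    over wide areas.
    """
    v = np.diff(np.asarray(series, dtype=float))  # movement per frame
    v = v - v.mean()                              # drop the DC component
    amp = np.abs(np.fft.rfft(v)) / len(v)
    freqs = np.fft.rfftfreq(len(v), d=1.0 / fs)
    return freqs, amp

# Synthetic horizontal scanpath: a slow sweep plus 10 Hz jitter.
t = np.arange(0.0, 4.0, 1.0 / 50.0)
x = 100.0 * t + np.sin(2.0 * np.pi * 10.0 * t)
freqs, amp = gaze_frequency_spectrum(x)
peak_freq = freqs[np.argmax(amp)]  # the jitter frequency dominates
```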

  6. How are learning strategies reflected in the eyes? Combining results from self-reports and eye-tracking.

    PubMed

    Catrysse, Leen; Gijbels, David; Donche, Vincent; De Maeyer, Sven; Lesterhuis, Marije; Van den Bossche, Piet

    2018-03-01

Up until now, empirical studies in the Student Approaches to Learning field have mainly focused on the use of self-report instruments, such as interviews and questionnaires, to uncover differences in students' general preferences towards learning strategies, and less on the use of task-specific and online measures. This study aimed at extending current research on students' learning strategies by combining general and task-specific measurements of students' learning strategies using both offline and online measures. We wanted to clarify how students process learning contents and to what extent this is related to their self-reported learning strategies. Twenty students with different generic learning profiles (according to self-report questionnaires) read an expository text, while their eye movements were registered, to answer questions on the content afterwards. Eye-tracking data were analysed with generalized linear mixed-effects models. The results indicate that students with an all-high profile, combining both deep and surface learning strategies, spend more time on rereading the text than students with an all-low profile, scoring low on both learning strategies. This study showed that eye-tracking can be used to distinguish highly strategic students, characterized by the use of cognitive processing and regulation strategies, from low strategic students, characterized by a lack of cognitive and regulation strategies. These students processed the expository text in a manner consistent with their self-reports. © 2017 The British Psychological Society.

  7. Eyes and ears: Using eye tracking and pupillometry to understand challenges to speech recognition.

    PubMed

    Van Engen, Kristin J; McLaughlin, Drew J

    2018-05-04

    Although human speech recognition is often experienced as relatively effortless, a number of common challenges can render the task more difficult. Such challenges may originate in talkers (e.g., unfamiliar accents, varying speech styles), the environment (e.g. noise), or in listeners themselves (e.g., hearing loss, aging, different native language backgrounds). Each of these challenges can reduce the intelligibility of spoken language, but even when intelligibility remains high, they can place greater processing demands on listeners. Noisy conditions, for example, can lead to poorer recall for speech, even when it has been correctly understood. Speech intelligibility measures, memory tasks, and subjective reports of listener difficulty all provide critical information about the effects of such challenges on speech recognition. Eye tracking and pupillometry complement these methods by providing objective physiological measures of online cognitive processing during listening. Eye tracking records the moment-to-moment direction of listeners' visual attention, which is closely time-locked to unfolding speech signals, and pupillometry measures the moment-to-moment size of listeners' pupils, which dilate in response to increased cognitive load. In this paper, we review the uses of these two methods for studying challenges to speech recognition. Copyright © 2018. Published by Elsevier B.V.

  8. Structural functional associations of the orbit in thyroid eye disease: Kalman filters to track extraocular rectal muscles

    NASA Astrophysics Data System (ADS)

    Chaganti, Shikha; Nelson, Katrina; Mundy, Kevin; Luo, Yifu; Harrigan, Robert L.; Damon, Steve; Fabbri, Daniel; Mawn, Louise; Landman, Bennett

    2016-03-01

Pathologies of the optic nerve and orbit impact millions of Americans and quantitative assessment of the orbital structures on 3-D imaging would provide objective markers to enhance diagnostic accuracy, improve timely intervention, and eventually preserve visual function. Recent studies have shown that the multi-atlas methodology is suitable for identifying orbital structures, but challenges arise in the identification of the individual extraocular rectus muscles that control eye movement. This is increasingly problematic in diseased eyes, where these muscles often appear to fuse at the back of the orbit (at the resolution of clinical computed tomography imaging) due to inflammation or crowding. We propose the use of Kalman filters to track the muscles in three dimensions to refine multi-atlas segmentation and resolve ambiguity due to imaging resolution, noise, and artifacts. The purpose of our study is to investigate a method of automatically generating orbital metrics from CT imaging and demonstrate the utility of the approach by correlating structural metrics of the eye orbit with clinical data and visual function measures in subjects with thyroid eye disease. The pilot study demonstrates that automatically calculated orbital metrics are strongly correlated with several clinical characteristics. Moreover, it is shown that the superior, inferior, medial and lateral rectus muscles obtained using Kalman filters are each correlated with different categories of functional deficit. These findings serve as a foundation for further investigation into the use of CT imaging in the study, analysis and diagnosis of ocular diseases, specifically thyroid eye disease.
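A minimal constant-velocity Kalman filter of the kind this record describes can track a muscle centroid slice by slice and bridge slices where the muscles appear fused and no centroid can be measured. All names and noise settings below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def kalman_track(observations, q=1e-3, r=0.25):
    """Constant-velocity Kalman filter over 2-D centroids per CT slice.

    observations: list of (x, y) centroids or None for slices where the
    muscle could not be separated (e.g. fused at the orbital apex);
    those slices are bridged by prediction alone.
    """
    F = np.eye(4)
    F[0, 2] = F[1, 3] = 1.0                       # position += velocity
    H = np.zeros((2, 4))
    H[0, 0] = H[1, 1] = 1.0                       # we observe position only
    Q = q * np.eye(4)                             # process noise
    R = r * np.eye(2)                             # measurement noise
    first = next(o for o in observations if o is not None)
    x = np.array([first[0], first[1], 0.0, 0.0])  # state: x, y, vx, vy
    P = np.eye(4)
    track = []
    for z in observations:
        x = F @ x                                 # predict
        P = F @ P @ F.T + Q
        if z is not None:                         # update when measured
            y = np.asarray(z, dtype=float) - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ y
            P = (np.eye(4) - K @ H) @ P
        track.append((x[0], x[1]))
    return track

# Centroids drifting linearly through slices, with a two-slice gap.
observations = [(0.0, 0.0), (1.0, 2.0), (2.0, 4.0), None, None,
                (5.0, 10.0), (6.0, 12.0)]
track = kalman_track(observations)
```

The filter carries the estimated velocity through the unmeasured slices, so the track stays close to the true trajectory across the gap and snaps back once measurements resume.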

  9. New Eye-Tracking Techniques May Revolutionize Mental Health Screening

    DTIC Science & Technology

    2015-11-04

health? Recent progress in eye-tracking techniques is opening new avenues for quantitative, objective, simple, inexpensive, and rapid evaluation ...to check with your doctor whether any corrective action should be taken. What if similar devices could be made available for the evaluation of mental... evaluations, especially for those disorders for which a clear chemical, genetic, morphological, physiological, or histological biomarker has not yet

  10. Eye-tracking novice and expert geologist groups in the field and laboratory

    NASA Astrophysics Data System (ADS)

    Cottrell, R. D.; Evans, K. M.; Jacobs, R. A.; May, B. B.; Pelz, J. B.; Rosen, M. R.; Tarduno, J. A.; Voronov, J.

    2010-12-01

We are using an Active Vision approach to learn how novices and expert geologists acquire visual information in the field. The Active Vision approach emphasizes that visual perception is an active process wherein new information is acquired about a particular environment through exploratory eye movements. Eye movements are not only influenced by physical stimuli, but are also strongly influenced by high-level perceptual and cognitive processes. Eye-tracking data were collected on ten novices (undergraduate geology students) and three experts during a 10-day field trip across California focused on neotectonics. In addition, high-resolution panoramic images were captured at each key locality for use in a semi-immersive laboratory environment. Examples of each data type will be presented. The number of observers will be increased in subsequent field trips, but expert/novice differences are already apparent in the first set of individual eye-tracking records, including gaze time, gaze pattern and object recognition. We will review efforts to quantify these patterns, and the development of semi-immersive environments to display geologic scenes. The research is a collaborative effort between Earth scientists, cognitive scientists, and imaging scientists at the University of Rochester and the Rochester Institute of Technology, with funding from the National Science Foundation.

  11. How Visual Search Relates to Visual Diagnostic Performance: A Narrative Systematic Review of Eye-Tracking Research in Radiology

    ERIC Educational Resources Information Center

    van der Gijp, A.; Ravesloot, C. J.; Jarodzka, H.; van der Schaaf, M. F.; van der Schaaf, I. C.; van Schaik, J. P.; ten Cate, Th. J.

    2017-01-01

    Eye tracking research has been conducted for decades to gain understanding of visual diagnosis such as in radiology. For educational purposes, it is important to identify visual search patterns that are related to high perceptual performance and to identify effective teaching strategies. This review of eye-tracking literature in the radiology…

  12. Cooperative multisensor system for real-time face detection and tracking in uncontrolled conditions

    NASA Astrophysics Data System (ADS)

    Marchesotti, Luca; Piva, Stefano; Turolla, Andrea; Minetti, Deborah; Regazzoni, Carlo S.

    2005-03-01

The presented work describes an innovative architecture for multi-sensor distributed video surveillance applications. The aim of the system is to track moving objects in outdoor environments with a cooperative strategy exploiting two video cameras. The system also exhibits the capacity of focusing its attention on the faces of detected pedestrians, collecting snapshot frames of face images by segmenting and tracking them over time at different resolutions. The system is designed to employ two video cameras in a cooperative client/server structure: the first camera monitors the entire area of interest and detects the moving objects using change detection techniques. The detected objects are tracked over time and their position is indicated on a map representing the monitored area. The objects' coordinates are sent to the server sensor in order to point its zooming optics towards the moving object. The second camera tracks the objects at high resolution. Like the client camera, this sensor is calibrated, and the position of the object detected on the image plane reference system is translated into coordinates referred to the same area map. In the map's common reference system, data fusion techniques are applied to achieve a more precise and robust estimation of the objects' track and to perform face detection and tracking. The work's novelty and strength reside in the cooperative multi-sensor approach, in the high-resolution long-distance tracking, and in the automatic collection of biometric data, such as a person's face clip, for recognition purposes.

  13. The geometric preference subtype in ASD: identifying a consistent, early-emerging phenomenon through eye tracking.

    PubMed

    Moore, Adrienne; Wozniak, Madeline; Yousef, Andrew; Barnes, Cindy Carter; Cha, Debra; Courchesne, Eric; Pierce, Karen

    2018-01-01

    the new Complex Social GeoPref test), eye tracking of toddlers can accurately identify a specific ASD "GeoPref" subtype with elevated symptom severity. The GeoPref tests are predictive of ASD at the individual subject level and thus potentially useful for various clinical applications (e.g., early identification, prognosis, or development of subtype-specific treatments).

  14. The role of eye fixation in memory enhancement under stress - An eye tracking study.

    PubMed

    Herten, Nadja; Otto, Tobias; Wolf, Oliver T

    2017-04-01

In a stressful situation, attention is shifted to potentially relevant stimuli. Recent studies from our laboratory revealed that stressed participants perform better in a recognition task involving objects from the stressful episode. In order to characterize the role of a stress-induced alteration in visual exploration, the present study investigated whether participants experiencing a laboratory social stress situation differ in their fixation behaviour from participants in a control group. Further, we aimed at shedding light on the relation between fixation behaviour and the obtained memory measures. We randomly assigned 32 male and 31 female participants to a control or a stress condition consisting of the Trier Social Stress Test (TSST), a public speaking paradigm causing social evaluative threat. In an established 'friendly' control condition (f-TSST), participants talked to a friendly committee. During both conditions, the committee members used ten office items (central objects) while another ten objects were present without being used (peripheral objects). Participants wore eye tracking glasses recording their fixations. On the next day, participants performed free recall and recognition tasks involving the objects present the day before. Stressed participants showed enhanced memory for central objects, accompanied by longer fixation times and a larger number of fixations on these objects. In contrast, fixation towards the committee faces showed the reverse pattern; here, control participants exhibited longer fixations. Fixation indices and memory measures were, however, not correlated with each other. Psychosocial stress is associated with altered fixation behaviour. Longer fixation on objects related to the stressful situation may reflect enhanced encoding, whereas diminished face fixation suggests gaze avoidance of aversive, socially threatening stimuli.
Modified visual exploration should be considered in future stress research, in particular when focussing on memory for a

  15. Tracking Without Perceiving: A Dissociation Between Eye Movements and Motion Perception

    PubMed Central

    Spering, Miriam; Pomplun, Marc; Carrasco, Marisa

    2011-01-01

    Can people react to objects in their visual field that they do not consciously perceive? We investigated how visual perception and motor action respond to moving objects whose visibility is reduced, and we found a dissociation between motion processing for perception and for action. We compared motion perception and eye movements evoked by two orthogonally drifting gratings, each presented separately to a different eye. The strength of each monocular grating was manipulated by inducing adaptation to one grating prior to the presentation of both gratings. Reflexive eye movements tracked the vector average of both gratings (pattern motion) even though perceptual responses followed one motion direction exclusively (component motion). Observers almost never perceived pattern motion. This dissociation implies the existence of visual-motion signals that guide eye movements in the absence of a corresponding conscious percept. PMID:21189353

  16. An Eye Tracking Study on the Perception and Comprehension of Unimodal and Bimodal Linguistic Inputs by Deaf Adolescents.

    PubMed

    Mastrantuono, Eliana; Saldaña, David; Rodríguez-Ortiz, Isabel R

    2017-01-01

    An eye tracking experiment explored the gaze behavior of deaf individuals when perceiving language in spoken and sign language only, and in sign-supported speech (SSS). Participants were deaf ( n = 25) and hearing ( n = 25) Spanish adolescents. Deaf students were prelingually profoundly deaf individuals with cochlear implants (CIs) used by age 5 or earlier, or prelingually profoundly deaf native signers with deaf parents. The effectiveness of SSS has rarely been tested within the same group of children for discourse-level comprehension. Here, video-recorded texts, including spatial descriptions, were alternately transmitted in spoken language, sign language and SSS. The capacity of these communicative systems to equalize comprehension in deaf participants with that of spoken language in hearing participants was tested. Within-group analyses of deaf participants tested if the bimodal linguistic input of SSS favored discourse comprehension compared to unimodal languages. Deaf participants with CIs achieved equal comprehension to hearing controls in all communicative systems while deaf native signers with no CIs achieved equal comprehension to hearing participants if tested in their native sign language. Comprehension of SSS was not increased compared to spoken language, even when spatial information was communicated. Eye movements of deaf and hearing participants were tracked and data of dwell times spent looking at the face or body area of the sign model were analyzed. Within-group analyses focused on differences between native and non-native signers. Dwell times of hearing participants were equally distributed across upper and lower areas of the face while deaf participants mainly looked at the mouth area; this could enable information to be obtained from mouthings in sign language and from lip-reading in SSS and spoken language. Few fixations were directed toward the signs, although these were more frequent when spatial language was transmitted. Both native and

  17. Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI

    DTIC Science & Technology

    2016-04-01

but these delays are nearing resolution and we anticipate the initiation of the neuroimaging portion of the study early in Year 3. The fMRI task...resonance imaging (fMRI) and diffusion tensor imaging (DTI) to characterize the extent of functional cortical recruitment and white matter injury...respectively. The inclusion of fMRI and DTI will provide an objective basis for cross-validating the EEG and eye tracking system. Both the EEG and eye

  18. Tracking the Eye Movement of Four Years Old Children Learning Chinese Words.

    PubMed

    Lin, Dan; Chen, Guangyao; Liu, Yingyi; Liu, Jiaxin; Pan, Jue; Mo, Lei

    2018-02-01

Storybook reading is the major source of literacy exposure for beginning readers. The present study tracked 4-year-old Chinese children's eye movements while they were reading simulated storybook pages. Their eye-movement patterns were examined in relation to their word learning gains. The same reading list, consisting of 20 two-character Chinese words, was used in the pretest, the 5-min eye-tracking learning session, and the posttest. Additionally, visual spatial skill and phonological awareness were assessed in the pretest as cognitive controls. The results showed that the children's attention was quickly captured by the pictures, on which their attention was focused most of the time, with only 13% of the time spent looking at words. Moreover, significant learning gains in word reading were observed from pretest to posttest after the 5-min exposure to simulated storybook pages with the words, pictures, and pronunciations of the two-character words present. Furthermore, the children's attention to words significantly predicted posttest reading beyond socioeconomic status, age, visual spatial skill, phonological awareness, and pretest reading performance. This eye-movement evidence from children as young as four years reading a non-alphabetic script (i.e., Chinese) demonstrates that children can learn words effectively with minimal exposure and little instruction; these findings suggest that learning to read requires attention to the words themselves. The study contributes to our understanding of early reading acquisition with eye-movement evidence from beginning readers.

  19. Comparison of smooth pursuit and combined eye-head tracking in human subjects with deficient labyrinthine function

    NASA Technical Reports Server (NTRS)

    Leigh, R. J.; Thurston, S. E.; Sharpe, J. A.; Ranalli, P. J.; Hamid, M. A.

    1987-01-01

    The effects of deficient labyrinthine function on smooth visual tracking with the eyes and head were investigated, using ten patients with bilateral peripheral vestibular disease and ten normal controls. Active, combined eye-head tracking (EHT) was significantly better in patients than smooth pursuit with the eyes alone, whereas normal subjects pursued equally well in both cases. Compensatory eye movements during active head rotation in darkness were always less in patients than in normal subjects. These data were used to examine current hypotheses that postulate central cancellation of the vestibulo-ocular reflex (VOR) during EHT. A model that proposes summation of an integral smooth pursuit command and VOR/compensatory eye movements is consistent with the findings. Observation of passive EHT (visual fixation of a head-fixed target during en bloc rotation) appears to indicate that in this mode parametric gain changes contribute to modulation of the VOR.
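The summation hypothesis tested in this record can be illustrated with a back-of-the-envelope gain model. This is a sketch of the linear-summation idea only; the function name and gain values are assumptions, not the paper's model. Pursuit is assumed to act on whatever target motion is left over after the head contribution, while the VOR slow phase subtracts head velocity from gaze:

```python
def eht_gain(pursuit_gain, vor_gain, head_gain=0.0):
    """Steady-state gaze gain during combined eye-head tracking (EHT)
    under a linear summation model: gaze = pursuit + head + VOR slow
    phase.  head_gain is head velocity / target velocity (0 means
    eyes-only smooth pursuit).  With an intact VOR (gain ~1) head
    movement is cancelled, so EHT equals pursuit alone; with an absent
    VOR (gain ~0) the head movement adds directly to gaze.
    """
    head_contribution = (1.0 - vor_gain) * head_gain
    # pursuit drives the residual target motion not covered by the head
    pursuit_contribution = pursuit_gain * (1.0 - head_contribution)
    return pursuit_contribution + head_contribution

normal_pursuit  = eht_gain(0.9, 1.0, head_gain=0.0)  # 0.90: eyes alone
normal_eht      = eht_gain(0.9, 1.0, head_gain=1.0)  # 0.90: VOR cancels head
patient_pursuit = eht_gain(0.5, 0.0, head_gain=0.0)  # 0.50: deficient pursuit
patient_eht     = eht_gain(0.5, 0.0, head_gain=1.0)  # 1.00: head adds to gaze
```

Under these toy gains the model reproduces the study's pattern: EHT outperforms pursuit in labyrinthine-deficient patients but matches pursuit in normal subjects.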

  20. Face Inversion Disproportionately Disrupts Sensitivity to Vertical over Horizontal Changes in Eye Position

    ERIC Educational Resources Information Center

    Crookes, Kate; Hayward, William G.

    2012-01-01

    Presenting a face inverted (upside down) disrupts perceptual sensitivity to the spacing between the features. Recently, it has been shown that this disruption is greater for vertical than horizontal changes in eye position. One explanation for this effect proposed that inversion disrupts the processing of long-range (e.g., eye-to-mouth distance)…

  1. Bumetanide for autism: more eye contact, less amygdala activation.

    PubMed

    Hadjikhani, Nouchine; Åsberg Johnels, Jakob; Lassalle, Amandine; Zürcher, Nicole R; Hippolyte, Loyse; Gillberg, Christopher; Lemonnier, Eric; Ben-Ari, Yehezkel

    2018-02-26

We recently showed that constraining eye contact leads to an exaggerated increase in amygdala activation in autism. Here, in a proof-of-concept pilot study, we demonstrate that administration of bumetanide (an NKCC1 chloride importer antagonist that restores GABAergic inhibition) normalizes the level of amygdala activation during constrained eye contact with dynamic emotional face stimuli in autism. In addition, eye-tracking data reveal that bumetanide administration increases the time spent in spontaneous eye gaze in a free-viewing mode of the same face stimuli. In keeping with clinical trials, our data support the Excitatory/Inhibitory dysfunction hypothesis in autism, and indicate that bumetanide may improve specific aspects of social processing in autism. Future double-blind placebo-controlled studies with larger cohorts of participants will help clarify the mechanisms of bumetanide action in autism.

  2. Sight-Reading Expertise: Cross-Modality Integration Investigated Using Eye Tracking

    ERIC Educational Resources Information Center

    Drai-Zerbib, Veronique; Baccino, Thierry; Bigand, Emmanuel

    2012-01-01

    It is often said that experienced musicians are capable of hearing what they read (and vice versa). This suggests that they are able to process and to integrate multimodal information. The present study investigates this issue with an eye-tracking technique. Two groups of musicians chosen on the basis of their level of expertise (experts,…

  3. Aesthetic phenomena as supernormal stimuli: the case of eye, lip, and lower-face size and roundness in artistic portraits.

    PubMed

    Costa, Marco; Corazza, Leonardo

    2006-01-01

    In the first study, eye and lip size and roundness, and lower-face roundness were compared between a control sample of 289 photographic portraits and an experimental sample of 776 artistic portraits covering the whole period of the history of art. Results showed that eye roundness, lip roundness, eye height, eye width, and lip height were significantly enhanced in artistic portraits compared to photographic ones. Lip width and lower-face roundness, on the contrary, were less prominent in artistic than in photographic portraits. In a second study, forty-two art academy students were requested to draw two self-portraits, one with a mirror and one without (from memory). Eye, lip, and lower-face roundness in artistic self-portraits was compared to the same features derived from photographic portraits of the participants. The results obtained confirmed those found in the first study. Eye and lip size and roundness were greater in artistic self-portraits, while lower-face roundness was significantly reduced. The same degree of modification was found also when a mirror was available to the subjects. In a third study the effect of lower-face roundness on the perception of attractiveness was assessed: fifty-three participants had to adjust the face width of 24 photographic portraits in order to achieve the highest level of attractiveness. Participants contracted the face width by a mean value of 5.26%, showing a preference for a reduced lower-face roundness. All results are discussed in terms of the importance of the 'supernormalisation' process as a means of assigning aesthetic value to perceptual stimuli.

  4. What interests them in the pictures?--differences in eye-tracking between rhesus monkeys and humans.

    PubMed

    Hu, Ying-Zhou; Jiang, Hui-Hui; Liu, Ci-Rong; Wang, Jian-Hong; Yu, Cheng-Yang; Carlson, Synnöve; Yang, Shang-Chuan; Saarinen, Veli-Matti; Rizak, Joshua D; Tian, Xiao-Guang; Tan, Hen; Chen, Zhu-Yue; Ma, Yuan-Ye; Hu, Xin-Tian

    2013-10-01

    Studies estimating eye movements have demonstrated that non-human primates have fixation patterns similar to humans at the first sight of a picture. In the current study, three sets of pictures containing monkeys, humans or both were presented to rhesus monkeys and humans. The eye movements on these pictures by the two species were recorded using a Tobii eye-tracking system. We found that monkeys paid more attention to the head and body in pictures containing monkeys, whereas both monkeys and humans paid more attention to the head in pictures containing humans. The humans always concentrated on the eyes and head in all the pictures, indicating the social role of facial cues in society. Although humans paid more attention to the hands than monkeys, both monkeys and humans were interested in the hands and what was being done with them in the pictures. This may suggest the importance and necessity of hands for survival. Finally, monkeys scored lower in eye-tracking when fixating on the pictures, as if they were less interested in looking at the screen than humans. The locations of fixation in monkeys may provide insight into the role of eye movements in an evolutionary context.

  5. Development of Visual Preference for Own- versus Other-Race Faces in Infancy

    ERIC Educational Resources Information Center

    Liu, Shaoying; Xiao, Wen Sara; Xiao, Naiqi G.; Quinn, Paul C.; Zhang, Yueyan; Chen, Hui; Ge, Liezhong; Pascalis, Olivier; Lee, Kang

    2015-01-01

    Previous research has shown that 3-month-olds prefer own- over other-race faces. The current study used eye-tracking methodology to examine how this visual preference develops with age beyond 3 months and how infants differentially scan between own- and other-race faces when presented simultaneously. We showed own- versus other-race face pairs to…

  6. The eye-tracking computer device for communication in amyotrophic lateral sclerosis.

    PubMed

    Spataro, R; Ciriacono, M; Manno, C; La Bella, V

    2014-07-01

To explore the effectiveness of communication and the variables affecting eye-tracking computer system (ETCS) utilization in patients with late-stage amyotrophic lateral sclerosis (ALS), we performed a telephone survey of 30 non-demented patients with advanced ALS who had been provided with an ETCS device. Median age at interview was 55 years (IQR = 48-62), with a relatively high education level (13 years, IQR = 8-13). A one-off interview was conducted, and answers were provided with the help of the caregiver. The interview included items about demographic and clinical variables affecting daily ETCS utilization. The median time of ETCS device possession was 15 months (IQR = 9-20). The actual daily utilization was 300 min (IQR = 100-720), mainly for communication with relatives/caregivers, internet surfing, e-mailing, and social networking. 23.3% of patients with ALS (n = 7) had low daily ETCS utilization; the most commonly reported causes were eye-gaze tiredness and oculomotor dysfunction. The eye-tracking computer system is a valuable device for AAC in patients with ALS, and it can be operated with good performance. The development of oculomotor impairment may limit its functional use. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. Eye-Tracking Provides a Sensitive Measure of Exploration Deficits After Acute Right MCA Stroke

    PubMed Central

    Delazer, Margarete; Sojer, Martin; Ellmerer, Philipp; Boehme, Christian; Benke, Thomas

    2018-01-01

This eye-tracking study aimed to assess spatial biases in visual exploration in patients after acute right MCA (middle cerebral artery) stroke. Patients affected by unilateral neglect show less functional recovery and experience severe difficulties in everyday life. Thus, accurate diagnosis is essential, and specific treatment is required. Early assessment is of high importance as rehabilitative interventions are more effective when applied soon after stroke. Previous research has shown that deficits may be overlooked when classical paper-and-pencil tasks are used for diagnosis. Conversely, eye-tracking allows direct monitoring of visual exploration patterns. We hypothesized that the analysis of eye-tracking provides more sensitive measures of spatial exploration deficits after right middle cerebral artery stroke. Twenty-two patients with right MCA stroke (median 5 days after stroke) and 28 healthy controls were included. Lesions were confirmed by MRI/CCT. Groups performed comparably in the Mini–Mental State Examination (patients and controls median 29) and in a screening of executive functions. Eleven patients scored at ceiling in neglect screening tasks; 11 showed minimal to severe signs of unilateral visual neglect. An overlap plot based on MRI and CCT imaging showed lesions in the temporo–parieto–frontal cortex, basal ganglia, and adjacent white matter tracts. Visual exploration was evaluated in two eye-tracking tasks, one assessing free visual exploration of photographs, the other visual search using symbols and letters. An index of fixation asymmetries proved to be a sensitive measure of spatial exploration deficits. Both patient groups showed a marked exploration bias to the right when looking at complex photographs. A single-case analysis confirmed that most of the patients who showed no neglect in screening tasks also performed outside the range of controls in free exploration. The analysis of patients scoring at ceiling in neglect…

  8. Anxiety symptoms and children's eye gaze during fear learning.

    PubMed

    Michalska, Kalina J; Machlin, Laura; Moroney, Elizabeth; Lowet, Daniel S; Hettema, John M; Roberson-Nay, Roxann; Averbeck, Bruno B; Brotman, Melissa A; Nelson, Eric E; Leibenluft, Ellen; Pine, Daniel S

    2017-11-01

    The eye region of the face is particularly relevant for decoding threat-related signals, such as fear. However, it is unclear if gaze patterns to the eyes can be influenced by fear learning. Previous studies examining gaze patterns in adults find an association between anxiety and eye gaze avoidance, although no studies to date examine how associations between anxiety symptoms and eye-viewing patterns manifest in children. The current study examined the effects of learning and trait anxiety on eye gaze using a face-based fear conditioning task developed for use in children. Participants were 82 youth from a general population sample of twins (aged 9-13 years), exhibiting a range of anxiety symptoms. Participants underwent a fear conditioning paradigm where the conditioned stimuli (CS+) were two neutral faces, one of which was randomly selected to be paired with an aversive scream. Eye tracking, physiological, and subjective data were acquired. Children and parents reported their child's anxiety using the Screen for Child Anxiety Related Emotional Disorders. Conditioning influenced eye gaze patterns in that children looked longer and more frequently to the eye region of the CS+ than CS- face; this effect was present only during fear acquisition, not at baseline or extinction. Furthermore, consistent with past work in adults, anxiety symptoms were associated with eye gaze avoidance. Finally, gaze duration to the eye region mediated the effect of anxious traits on self-reported fear during acquisition. Anxiety symptoms in children relate to face-viewing strategies deployed in the context of a fear learning experiment. This relationship may inform attempts to understand the relationship between pediatric anxiety symptoms and learning. © 2017 Association for Child and Adolescent Mental Health.

  9. Measuring advertising effectiveness in Travel 2.0 websites through eye-tracking technology.

    PubMed

    Muñoz-Leiva, Francisco; Hernández-Méndez, Janet; Gómez-Carmona, Diego

    2018-03-06

    The advent of Web 2.0 is changing tourists' behaviors, prompting them to take on a more active role in preparing their travel plans. It is also leading tourism companies to have to adapt their marketing strategies to different online social media. The present study analyzes advertising effectiveness in social media in terms of customers' visual attention and self-reported memory (recall). Data were collected through a within-subjects and between-groups design based on eye-tracking technology, followed by a self-administered questionnaire. Participants were instructed to visit three Travel 2.0 websites (T2W), including a hotel's blog, social network profile (Facebook), and virtual community profile (Tripadvisor). Overall, the results revealed greater advertising effectiveness in the case of the hotel social network; and visual attention measures based on eye-tracking data differed from measures of self-reported recall. Visual attention to the ad banner was paid at a low level of awareness, which explains why the associations with the ad did not activate its subsequent recall. The paper offers a pioneering attempt in the application of eye-tracking technology, and examines the possible impact of visual marketing stimuli on user T2W-related behavior. The practical implications identified in this research, along with its limitations and future research opportunities, are of interest both for further theoretical development and practical application. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Mapping the emotional face. How individual face parts contribute to successful emotion recognition.

    PubMed

    Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna

    2017-01-01

Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have frequently been shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, at a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, making it possible to visualize the importance of different face areas for each expression. Overall, observers relied mostly on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed the expressions to be grouped in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with the highest diagnostic value for expression identification were typically located in areas corresponding to action units from the Facial Action Coding System. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eye or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation.

  11. Mapping the emotional face. How individual face parts contribute to successful emotion recognition

    PubMed Central

    Wegrzyn, Martin; Vogt, Maria; Kireclioglu, Berna; Schneider, Julia; Kissler, Johanna

    2017-01-01

Which facial features allow human observers to successfully recognize expressions of emotion? While the eyes and mouth have frequently been shown to be of high importance, research on facial action units has made more precise predictions about the areas involved in displaying each emotion. The present research investigated, at a fine-grained level, which physical features are most relied on when decoding facial expressions. In the experiment, individual faces expressing the basic emotions according to Ekman were hidden behind a mask of 48 tiles, which was sequentially uncovered. Participants were instructed to stop the sequence as soon as they recognized the facial expression and assign it the correct label. For each part of the face, its contribution to successful recognition was computed, making it possible to visualize the importance of different face areas for each expression. Overall, observers relied mostly on the eye and mouth regions when successfully recognizing an emotion. Furthermore, the difference in the importance of eyes and mouth allowed the expressions to be grouped in a continuous space, ranging from sadness and fear (reliance on the eyes) to disgust and happiness (mouth). The face parts with the highest diagnostic value for expression identification were typically located in areas corresponding to action units from the Facial Action Coding System. A similarity analysis of the usefulness of different face parts for expression recognition demonstrated that faces cluster according to the emotion they express, rather than by low-level physical features. Also, expressions relying more on the eye or mouth region were in close proximity in the constructed similarity space. These analyses help to better understand how human observers process expressions of emotion, by delineating the mapping from facial features to psychological representation. PMID:28493921

  12. Interior detail of main entry with railroad tracks; camera facing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior detail of main entry with railroad tracks; camera facing east. - Mare Island Naval Shipyard, Mechanics Shop, Waterfront Avenue, west side between A Street & Third Street, Vallejo, Solano County, CA

  13. Comparing Eye Tracking with Electrooculography for Measuring Individual Sentence Comprehension Duration

    PubMed Central

    Müller, Jana Annina; Wendt, Dorothea; Kollmeier, Birger; Brand, Thomas

    2016-01-01

    The aim of this study was to validate a procedure for performing the audio-visual paradigm introduced by Wendt et al. (2015) with reduced practical challenges. The original paradigm records eye fixations using an eye tracker and calculates the duration of sentence comprehension based on a bootstrap procedure. In order to reduce practical challenges, we first reduced the measurement time by evaluating a smaller measurement set with fewer trials. The results of 16 listeners showed effects comparable to those obtained when testing the original full measurement set on a different collective of listeners. Secondly, we introduced electrooculography as an alternative technique for recording eye movements. The correlation between the results of the two recording techniques (eye tracker and electrooculography) was r = 0.97, indicating that both methods are suitable for estimating the processing duration of individual participants. Similar changes in processing duration arising from sentence complexity were found using the eye tracker and the electrooculography procedure. Thirdly, the time course of eye fixations was estimated with an alternative procedure, growth curve analysis, which is more commonly used in recent studies analyzing eye tracking data. The results of the growth curve analysis were compared with the results of the bootstrap procedure. Both analysis methods show similar processing durations. PMID:27764125

  14. Reading the Mind in the Eyes or reading between the lines? Theory of Mind predicts collective intelligence equally well online and face-to-face.

    PubMed

    Engel, David; Woolley, Anita Williams; Jing, Lisa X; Chabris, Christopher F; Malone, Thomas W

    2014-01-01

    Recent research with face-to-face groups found that a measure of general group effectiveness (called "collective intelligence") predicted a group's performance on a wide range of different tasks. The same research also found that collective intelligence was correlated with the individual group members' ability to reason about the mental states of others (an ability called "Theory of Mind" or "ToM"). Since ToM was measured in this work by a test that requires participants to "read" the mental states of others from looking at their eyes (the "Reading the Mind in the Eyes" test), it is uncertain whether the same results would emerge in online groups where these visual cues are not available. Here we find that: (1) a collective intelligence factor characterizes group performance approximately as well for online groups as for face-to-face groups; and (2) surprisingly, the ToM measure is equally predictive of collective intelligence in both face-to-face and online groups, even though the online groups communicate only via text and never see each other at all. This provides strong evidence that ToM abilities are just as important to group performance in online environments with limited nonverbal cues as they are face-to-face. It also suggests that the Reading the Mind in the Eyes test measures a deeper, domain-independent aspect of social reasoning, not merely the ability to recognize facial expressions of mental states.

  15. Driver fatigue detection based on eye state.

    PubMed

    Lin, Lizong; Huang, Chao; Ni, Xiaopeng; Wang, Jiawen; Zhang, Hao; Li, Xiao; Qian, Zhiqin

    2015-01-01

Nowadays, more and more traffic accidents occur because of driver fatigue. In order to reduce and prevent such accidents, this study developed a machine-vision-based detection method using the PERCLOS (percentage of eye closure time) parameter. It determined whether a driver's eyes were in a fatigue state according to the PERCLOS value. The overall workflow included face detection and tracking, detection and location of the human eye, human eye tracking, eye state recognition, and driver fatigue testing. The key aspects of the detection system were the detection and location of the human eyes and the driver fatigue test. The simplified method of measuring the driver's PERCLOS value was to calculate the ratio of frames in which the eyes were closed to the total number of frames over a given period. If the eyes were closed in more frames than the set threshold, the system would alert the driver. Many experiments showed that, in addition to the simple detection algorithm, the rapid computing speed, and the high detection and recognition accuracy, the system met the real-time requirements of a driver fatigue detection system.
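    The PERCLOS measure described in this abstract reduces to a ratio of closed-eye frames over a window of video frames, compared against an alert threshold. A minimal sketch (the function names, the 90-frame window, and the 0.4 threshold are illustrative assumptions, not values from the paper):

    ```python
    def perclos(eye_closed_frames, window):
        """Fraction of the most recent `window` frames in which the eyes
        were detected as closed.

        eye_closed_frames: iterable of booleans, one per video frame
        (True = eyes closed in that frame).
        """
        frames = list(eye_closed_frames)[-window:]
        return sum(frames) / len(frames)

    def is_fatigued(eye_closed_frames, window=90, threshold=0.4):
        """Alert when the closed-eye ratio over the window exceeds the threshold."""
        return perclos(eye_closed_frames, window) > threshold
    ```

    For example, at 30 fps a 90-frame window covers 3 seconds; if the eyes are closed in 40 of those 90 frames, PERCLOS is about 0.44 and the sketch would raise an alert.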

  16. Joint Transform Correlation for face tracking: elderly fall detection application

    NASA Astrophysics Data System (ADS)

    Katz, Philippe; Aron, Michael; Alfalou, Ayman

    2013-03-01

In this paper, an iterative tracking algorithm based on a non-linear JTC (Joint Transform Correlator) architecture and enhanced by a digital image processing method is proposed and validated. This algorithm is based on the computation of a correlation plane in which the reference image is updated at each frame. For that purpose, we use the JTC technique in real time to track a patient (the target image) in a room fitted with a video camera. The correlation plane is used to localize the target image in the current video frame (frame i). Then, the reference image to be exploited in the next frame (frame i+1) is updated according to the previous one (frame i). To validate our algorithm, our work is divided into two parts: (i) a large study based on different sequences with several situations and different JTC parameters is carried out in order to quantify their effects on the tracking performance (decimation, non-linearity coefficient, size of the correlation plane, size of the region of interest...); (ii) the tracking algorithm is integrated into an elderly fall detection application. The first reference image is a face detected by means of Haar descriptors, which is then localized in each new video image by our tracking method. In order to avoid a bad update of the reference frame, a method based on a comparison of image intensity histograms is proposed and integrated into our algorithm. This step ensures robust tracking of the reference frame. This article focuses on optimisation and evaluation of the face-tracking step. A supplementary fall detection step, based on vertical acceleration and position, will be added and studied in further work.
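    The safeguard mentioned in the abstract, comparing intensity histograms before accepting a new reference frame, can be sketched as follows. The paper does not specify its similarity metric or threshold; histogram intersection and the 0.8 cutoff below are assumptions for illustration:

    ```python
    def intensity_histogram(pixels, bins=16, max_val=256):
        """Normalized grayscale histogram of an image given as a flat pixel list."""
        hist = [0] * bins
        for p in pixels:
            hist[p * bins // max_val] += 1
        total = len(pixels)
        return [h / total for h in hist]

    def histogram_intersection(h1, h2):
        """Similarity in [0, 1]; 1.0 means identical distributions."""
        return sum(min(a, b) for a, b in zip(h1, h2))

    def update_reference(reference, candidate, threshold=0.8):
        """Accept the candidate as the new tracking reference only if its
        intensity histogram is close enough to the current reference's;
        otherwise keep the old reference to avoid drift."""
        sim = histogram_intersection(intensity_histogram(reference),
                                     intensity_histogram(candidate))
        return candidate if sim >= threshold else reference
    ```

    The design intent is that a badly localized candidate (e.g., background instead of the face) produces a very different intensity distribution and is rejected, so the tracker does not lock onto the wrong region.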

  17. A geometric method for computing ocular kinematics and classifying gaze events using monocular remote eye tracking in a robotic environment.

    PubMed

    Singh, Tarkeshwar; Perry, Christopher M; Herter, Troy M

    2016-01-26

Robotic and virtual-reality systems offer tremendous potential for improving assessment and rehabilitation of neurological disorders affecting the upper extremity. A key feature of these systems is that visual stimuli are often presented within the same workspace as the hands (i.e., peripersonal space). Integrating video-based remote eye tracking with robotic and virtual-reality systems can provide an additional tool for investigating how cognitive processes influence visuomotor learning and rehabilitation of the upper extremity. However, remote eye tracking systems typically compute ocular kinematics by assuming eye movements are made in a plane with constant depth (e.g. frontal plane). When visual stimuli are presented at variable depths (e.g. transverse plane), eye movements have a vergence component that may influence reliable detection of gaze events (fixations, smooth pursuits and saccades). To our knowledge, there are no available methods to classify gaze events in the transverse plane for monocular remote eye tracking systems. Here we present a geometrical method to compute ocular kinematics from a monocular remote eye tracking system when visual stimuli are presented in the transverse plane. We then use the obtained kinematics to compute velocity-based thresholds that allow us to accurately identify onsets and offsets of fixations, saccades and smooth pursuits. Finally, we validate our algorithm by comparing the gaze events computed by the algorithm with those obtained from the eye-tracking software and manual digitization. Within the transverse plane, our algorithm reliably differentiates saccades from fixations (static visual stimuli) and smooth pursuits from saccades and fixations when visual stimuli are dynamic. The proposed methods provide advancements for examining eye movements in robotic and virtual-reality systems. Our methods can also be used with other video-based or tablet-based systems in which eye movements are performed in peripersonal space.
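    The velocity-based thresholding this abstract describes can be illustrated in its simplest planar form: compute gaze speed between successive samples and label each interval as fixation or saccade. This is a generic sketch, not the authors' algorithm (which additionally handles vergence and smooth pursuit); the 30 deg/s cutoff is a common convention assumed here:

    ```python
    import math

    def classify_gaze(samples, rate_hz, saccade_thresh=30.0):
        """Label each inter-sample interval 'fixation' or 'saccade'.

        samples: list of (x, y) gaze positions in degrees of visual angle.
        rate_hz: sampling rate of the eye tracker in Hz.
        """
        dt = 1.0 / rate_hz
        labels = []
        for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
            speed = math.hypot(x1 - x0, y1 - y0) / dt  # angular speed, deg/s
            labels.append("saccade" if speed > saccade_thresh else "fixation")
        return labels
    ```

    At 100 Hz, a 0.05-degree step between samples corresponds to 5 deg/s (fixation-level jitter), while a 2-degree step corresponds to 200 deg/s, well into the saccadic range.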

  18. Eye gaze tracking for endoscopic camera positioning: an application of a hardware/software interface developed to automate Aesop.

    PubMed

    Ali, S M; Reisner, L A; King, B; Cao, A; Auner, G; Klein, M; Pandya, A K

    2008-01-01

    A redesigned motion control system for the medical robot Aesop allows automating and programming its movements. An IR eye tracking system has been integrated with this control interface to implement an intelligent, autonomous eye gaze-based laparoscopic positioning system. A laparoscopic camera held by Aesop can be moved based on the data from the eye tracking interface to keep the user's gaze point region at the center of a video feedback monitor. This system setup provides autonomous camera control that works around the surgeon, providing an optimal robotic camera platform.

  19. Eye-Tracking as a Tool in Process-Oriented Reading Test Validation

    ERIC Educational Resources Information Center

    Solheim, Oddny Judith; Uppstad, Per Henning

    2011-01-01

    The present paper addresses the continuous need for methodological reflection on how to validate inferences made on the basis of test scores. Validation is a process that requires many lines of evidence. In this article we discuss the potential of eye tracking methodology in process-oriented reading test validation. Methodological considerations…

  20. Storyline Visualizations of Eye Tracking of Movie Viewing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balint, John T.; Arendt, Dustin L.; Blaha, Leslie M.

    Storyline visualizations offer an approach that promises to capture the spatio-temporal characteristics of individual observers and simultaneously illustrate emerging group behaviors. We develop a visual analytics approach to parsing, aligning, and clustering fixation sequences from eye tracking data. Visualization of the results captures the similarities and differences across a group of observers performing a common task. We apply our storyline approach to visualize gaze patterns of people watching dynamic movie clips. Storylines mitigate some of the shortcomings of existent spatio-temporal visualization techniques and, importantly, continue to highlight individual observer behavioral dynamics.

  1. Face Processing: Models For Recognition

    NASA Astrophysics Data System (ADS)

    Turk, Matthew A.; Pentland, Alexander P.

    1990-03-01

    The human ability to process faces is remarkable. We can identify perhaps thousands of faces learned throughout our lifetime and read facial expression to understand such subtle qualities as emotion. These skills are quite robust, despite sometimes large changes in the visual stimulus due to expression, aging, and distractions such as glasses or changes in hairstyle or facial hair. Computers which model and recognize faces will be useful in a variety of applications, including criminal identification, human-computer interface, and animation. We discuss models for representing faces and their applicability to the task of recognition, and present techniques for identifying faces and detecting eye blinks.

  2. A Novel Eye-Tracking Method to Assess Attention Allocation in Individuals with and without Aphasia Using a Dual-Task Paradigm

    PubMed Central

    Heuer, Sabine; Hallowell, Brooke

    2015-01-01

    Numerous authors report that people with aphasia have greater difficulty allocating attention than people without neurological disorders. Studying how attention deficits contribute to language deficits is important. However, existing methods for indexing attention allocation in people with aphasia pose serious methodological challenges. Eye-tracking methods have great potential to address such challenges. We developed and assessed the validity of a new dual-task method incorporating eye tracking to assess attention allocation. Twenty-six adults with aphasia and 33 control participants completed auditory sentence comprehension and visual search tasks. To test whether the new method validly indexes well-documented patterns in attention allocation, demands were manipulated by varying task complexity in single- and dual-task conditions. Differences in attention allocation were indexed via eye-tracking measures. For all participants significant increases in attention allocation demands were observed from single- to dual-task conditions and from simple to complex stimuli. Individuals with aphasia had greater difficulty allocating attention with greater task demands. Relationships between eye-tracking indices of comprehension during single and dual tasks and standardized testing were examined. Results support the validity of the novel eye-tracking method for assessing attention allocation in people with and without aphasia. Clinical and research implications are discussed. PMID:25913549

  3. Web Camera Based Eye Tracking to Assess Visual Memory on a Visual Paired Comparison Task.

    PubMed

    Bott, Nicholas T; Lange, Alex; Rentz, Dorene; Buffalo, Elizabeth; Clopton, Paul; Zola, Stuart

    2017-01-01

Background: Web cameras are increasingly part of the standard hardware of most smart devices. Eye movements can often provide a noninvasive "window on the brain," and the recording of eye movements using web cameras is a burgeoning area of research. Objective: This study investigated a novel methodology for administering a visual paired comparison (VPC) decisional task using a web camera. To further assess this method, we examined the correlation between a standard eye-tracking camera automated scoring procedure [obtaining images at 60 frames per second (FPS)] and a manually scored procedure using a built-in laptop web camera (obtaining images at 3 FPS). Methods: This was an observational study of 54 clinically normal older adults. Subjects completed three in-clinic visits with simultaneous recording of eye movements on a VPC decision task by a standard eye tracker camera and a built-in laptop-based web camera. Inter-rater reliability was analyzed using Siegel and Castellan's kappa formula. Pearson correlations were used to investigate the correlation between VPC performance using a standard eye tracker camera and a built-in web camera. Results: Strong associations were observed on VPC mean novelty preference score between the 60 FPS eye tracker and 3 FPS built-in web camera at each of the three visits (r = 0.88-0.92). Inter-rater agreement of web camera scoring at each time point was high (κ = 0.81-0.88). There were strong relationships on VPC mean novelty preference score between 10, 5, and 3 FPS training sets (r = 0.88-0.94). Significantly fewer data quality issues were encountered using the built-in web camera. Conclusions: Human scoring of a VPC decisional task using a built-in laptop web camera correlated strongly with automated scoring of the same task using a standard high frame rate eye tracker camera. While this method is not suitable for eye tracking paradigms requiring the collection and analysis of fine-grained metrics, such as fixation points, built…

  4. Face Age and Eye Gaze Influence Older Adults' Emotion Recognition.

    PubMed

    Campbell, Anna; Murray, Janice E; Atkinson, Lianne; Ruffman, Ted

    2017-07-01

    Eye gaze has been shown to influence emotion recognition. In addition, older adults (over 65 years) are not as influenced by gaze direction cues as young adults (18-30 years). Nevertheless, these differences might stem from the use of young to middle-aged faces in emotion recognition research because older adults have an attention bias toward old-age faces. Therefore, using older face stimuli might allow older adults to process gaze direction cues to influence emotion recognition. To investigate this idea, young and older adults completed an emotion recognition task with young and older face stimuli displaying direct and averted gaze, assessing labeling accuracy for angry, disgusted, fearful, happy, and sad faces. Direct gaze rather than averted gaze improved young adults' recognition of emotions in young and older faces, but for older adults this was true only for older faces. The current study highlights the impact of stimulus face age and gaze direction on emotion recognition in young and older adults. The use of young face stimuli with direct gaze in most research might contribute to age-related emotion recognition differences. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  5. Adolescents' attention to responsibility messages in magazine alcohol advertisements: an eye-tracking approach.

    PubMed

    Thomsen, Steven R; Fulton, Kristi

    2007-07-01

To investigate whether adolescent readers attend to responsibility or moderation messages (e.g., "drink responsibly") included in magazine advertisements for alcoholic beverages and to assess the association between attention and the ability to accurately recall the content of these messages. An integrated head-eye tracking system (ASL Eye-TRAC 6000) was used to measure the eye movements, including fixations and fixation duration, of a group of 63 adolescents (ages 12-14 years) as they viewed six print advertisements for alcoholic beverages. Immediately after the eye-tracking sessions, participants completed a masked-recall exercise. Overall, the responsibility or moderation messages were the least frequently viewed textual or visual areas of the advertisements. Participants spent an average of only 0.35 seconds, or 7% of the total viewing time, fixating on each responsibility message. Beverage bottles, product logos, and cartoon illustrations were the most frequently viewed elements of the advertisements. Among those participants who fixated at least once on an advertisement's warning message, only a relatively small percentage were able to recall its general concept or restate it verbatim in the masked recall test. Voluntary responsibility or moderation messages failed to capture the attention of teenagers who participated in this study and need to be typographically modified to be more effective.

  6. Applying face identification to detecting hijacking of airplane

    NASA Astrophysics Data System (ADS)

    Luo, Xuanwen; Cheng, Qiang

    2004-09-01

That terrorists hijacked airplanes and crashed them into the World Trade Center was a disaster for civilization. Preventing hijackings is critical to homeland security. Reporting a hijacking in time, limiting the terrorists' ability to operate the plane if one occurs, and landing the plane at the nearest airport could be an efficient way to avoid such misery. Image processing techniques for human face recognition or identification could be used for this task. Before the plane takes off, the face images of the pilots are input into a face identification system installed in the airplane. A camera in front of the pilot's seat keeps capturing the pilot's face during the flight and comparing it with the pre-input pilot face images. If a different face is detected, a warning signal is sent to the ground automatically. At the same time, the automatic cruise system is started or the plane is controlled from the ground. The terrorists will have no control over the plane. The plane will be landed at the nearest or most appropriate airport under the control of the ground or the cruise system. This technique could also be used in the automobile industry as an image key to prevent car theft.

  7. Perceptual impairment in face identification with poor sleep

    PubMed Central

    Beattie, Louise; Walsh, Darragh; McLaren, Jessica; Biello, Stephany M.

    2016-01-01

    Previous studies have shown impaired memory for faces following restricted sleep. However, it is not known whether lack of sleep impairs performance on face identification tasks that do not rely on recognition memory, despite these tasks being more prevalent in security and forensic professions—for example, in photo-ID checks at national borders. Here we tested whether poor sleep affects accuracy on a standard test of face-matching ability that does not place demands on memory: the Glasgow Face-Matching Task (GFMT). In Experiment 1, participants who reported sleep disturbance consistent with insomnia disorder show impaired accuracy on the GFMT when compared with participants reporting normal sleep behaviour. In Experiment 2, we then used a sleep diary method to compare GFMT accuracy in a control group to participants reporting poor sleep on three consecutive nights—and again found lower accuracy scores in the short sleep group. In both experiments, reduced face-matching accuracy in those with poorer sleep was not associated with lower confidence in their decisions, carrying implications for occupational settings where identification errors made with high confidence can have serious outcomes. These results suggest that sleep-related impairments in face memory reflect difficulties in perceptual encoding of identity, and point towards metacognitive impairment in face matching following poor sleep. PMID:27853547

  8. Effects of Detailed Illustrations on Science Learning: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Lin, Yu Ying; Holmqvist, Kenneth; Miyoshi, Kiyofumi; Ashida, Hiroshi

    2017-01-01

    The eye-tracking method was used to assess the influence of detailed, colorful illustrations on reading behaviors and learning outcomes. Based on participants' subjective ratings in a pre-study, we selected eight one-page human anatomy lessons. In the main study, participants learned these eight human anatomy lessons; four were accompanied by…

  9. Basic Number Processing Deficits in Developmental Dyscalculia: Evidence from Eye Tracking

    ERIC Educational Resources Information Center

    Moeller, K.; Neuburger, S.; Kaufmann, L.; Landerl, K.; Nuerk, H. C.

    2009-01-01

    Recent research suggests that developmental dyscalculia is associated with a subitizing deficit (i.e., the inability to quickly enumerate small sets of up to 3 objects). However, the nature of this deficit has not previously been investigated. In the present study the eye-tracking methodology was employed to clarify whether (a) the subitizing…

  10. The Ontogeny of Face Recognition: Eye Contact and Sweet Taste Induce Face Preference in 9- and 12-Week-Old Human Infants.

    ERIC Educational Resources Information Center

    Blass, Elliott M.; Camp, Carole A.

    2001-01-01

    Calm or crying 9- and 12-week-olds sat facing a researcher who gazed into their eyes or at their forehead and delivered either a sucrose solution or pacifier or delivered nothing. Found that combining sweet taste and eye contact was necessary and sufficient for calm 9- and 12-week-olds to form a preference for the researcher, but not for crying…

  11. Eye tracking and gating system for proton therapy of orbital tumors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Dongho; Yoo, Seung Hoon; Moon, Sung Ho

    2012-07-15

    Purpose: A new motion-based gated proton therapy for the treatment of orbital tumors using a real-time eye-tracking system was designed and evaluated. Methods: We developed our system by image-pattern matching, using a normalized cross-correlation technique with LabVIEW 8.6 and Vision Assistant 8.6 (National Instruments, Austin, TX). To measure the pixel spacing of an image consistently, four different calibration modes, namely point-detection, edge-detection, line-measurement, and manual measurement, were suggested and used. After these methods were applied to proton therapy, gating was performed, and radiation dose distributions were evaluated. Results: Moving phantom verification measurements resulted in errors of less than 0.1 mm for given ranges of translation. Dosimetric evaluation of the beam-gating system versus nongated treatment delivery with a moving phantom shows that while there was only 0.83 mm growth in lateral penumbra for gated radiotherapy, there was 4.95 mm growth in lateral penumbra in the case of nongated exposure. The analysis of clinical results suggests that average eye movement depends distinctly on each patient, measuring 0.44 mm, 0.45 mm, and 0.86 mm for three patients, respectively. Conclusions: The developed automatic eye-tracking based beam-gating system enabled us to perform high-precision proton radiotherapy of orbital tumors.
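The normalized cross-correlation underlying this image-pattern matching can be sketched in a few lines (a minimal illustration, not the authors' LabVIEW/Vision Assistant implementation; the synthetic frame and template below are assumptions):

```python
import numpy as np

def normalized_cross_correlation(image, template):
    """Slide `template` over `image` and return the NCC score map.

    Scores lie in [-1, 1]; the maximum marks the best match, which a
    tracker can compare against the reference eye position each frame.
    """
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    scores = np.zeros((ih - th + 1, iw - tw + 1))
    for y in range(scores.shape[0]):
        for x in range(scores.shape[1]):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            scores[y, x] = (p * t).sum() / denom if denom > 0 else 0.0
    return scores

# Toy example: locate a bright "pupil" region in a synthetic frame.
frame = np.zeros((20, 20))
frame[8:12, 5:9] = 1.0                     # 4x4 bright block
template = frame[7:13, 4:10].copy()        # 6x6 window around the block
scores = normalized_cross_correlation(frame, template)
y, x = np.unravel_index(np.argmax(scores), scores.shape)
```

In practice the exhaustive double loop would be replaced by an FFT-based correlation for speed; the brute-force version above just makes the score definition explicit.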

  12. An evaluation of eye tracking technology in the assessment of 12 lead electrocardiography interpretation.

    PubMed

    Breen, Cathal J; Bond, Raymond; Finlay, Dewar

    2014-01-01

    This study investigated eye tracking technology in teaching 12 lead electrocardiography interpretation to Healthcare Scientist students. Participants (n=33) interpreted ten 12 lead ECG recordings and were randomized to receive objective individual appraisal of their efforts either in traditional didactic format or via eye tracker software. All participants reported the experience as positive for improving their ECG interpretation competency. ECG analysis time ranged between 13.2 and 59.5s. The rhythm strip was the most commonly studied lead and was fixated on for the longest duration (mean 9.9s). Lead I was studied for the shortest duration (mean 0.25s). Feedback using eye tracking data during ECG interpretation did not produce any significant variation between the assessment marks of the study and control groups (p=0.32). Although the hypothesis of this study was rejected, active teaching and early feedback practices are recommended within this discipline. Copyright © 2014 Elsevier Inc. All rights reserved.

  13. Ethnicity identification from face images

    NASA Astrophysics Data System (ADS)

    Lu, Xiaoguang; Jain, Anil K.

    2004-08-01

    Human facial images provide demographic information, such as ethnicity and gender; ethnicity and gender, in turn, play an important role in face-related applications. The image-based ethnicity identification problem is addressed in a machine learning framework. A Linear Discriminant Analysis (LDA) based scheme is presented for the two-class (Asian vs. non-Asian) ethnicity classification task. Multiscale analysis is applied to the input facial images, and an ensemble framework, which integrates the LDA analysis of the input face images at different scales, is proposed to further improve classification performance. The product rule is used as the combination strategy in the ensemble. Experimental results based on a face database containing 263 subjects (2,630 face images, with equal balance between the two classes) are promising, indicating that LDA and the proposed ensemble framework have sufficient discriminative power for the ethnicity classification problem. The normalized ethnicity classification scores can be helpful in facial identity recognition: used as a "soft" biometric, face matching scores can be updated based on the output of the ethnicity classification module. In other words, the ethnicity classifier does not have to be perfect to be useful in practice.
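The multiscale LDA ensemble with a product-rule combination described above can be sketched as follows (a minimal illustration on synthetic features; the feature dimensions, the sigmoid score-to-posterior mapping, and the class setup are assumptions, not the authors' implementation):

```python
import numpy as np

def fit_lda(X0, X1):
    """Two-class Fisher LDA: returns projection vector w and threshold c."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter, lightly regularized so the solve is well-posed.
    Sw = np.cov(X0.T) + np.cov(X1.T) + 1e-6 * np.eye(X0.shape[1])
    w = np.linalg.solve(Sw, m1 - m0)
    c = w @ (m0 + m1) / 2.0
    return w, c

def posterior(x, w, c):
    """Map the signed LDA score to a pseudo-posterior via a sigmoid."""
    return 1.0 / (1.0 + np.exp(-(w @ x - c)))

rng = np.random.default_rng(0)
# Hypothetical features extracted at two image scales.
scales = []
for dim in (8, 4):
    X0 = rng.normal(0.0, 1.0, (200, dim))   # class 0 training features
    X1 = rng.normal(1.5, 1.0, (200, dim))   # class 1 training features
    scales.append((fit_lda(X0, X1), X1[0])) # classifier + one test sample

# Product rule: multiply per-scale posteriors for each class, pick the larger.
p1 = np.prod([posterior(x, w, c) for (w, c), x in scales])
p0 = np.prod([1.0 - posterior(x, w, c) for (w, c), x in scales])
label = int(p1 > p0)
```

The product rule implicitly assumes the per-scale classifiers make (near-)independent errors, which is what lets the ensemble outperform any single scale.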

  14. Spontaneous Attention to Faces in Asperger Syndrome Using Ecologically Valid Static Stimuli

    ERIC Educational Resources Information Center

    Hanley, Mary; McPhillips, Martin; Mulhern, Gerry; Riby, Deborah M.

    2013-01-01

    Previous eye tracking research on the allocation of attention to social information by individuals with autism spectrum disorders is equivocal and may be in part a consequence of variation in stimuli used between studies. The current study explored attention allocation to faces, and within faces, by individuals with Asperger syndrome using a range…

  15. Agent tracking: a psycho-historical theory of the identification of living and social agents.

    PubMed

    Bullot, Nicolas J

    To explain agent-identification behaviours, universalist theories in the biological and cognitive sciences have posited mental mechanisms thought to be universal to all humans, such as agent detection and face recognition mechanisms. These universalist theories have paid little attention to how particular sociocultural or historical contexts interact with the psychobiological processes of agent-identification. In contrast to universalist theories, contextualist theories appeal to particular historical and sociocultural contexts for explaining agent-identification. Contextualist theories tend to adopt idiographic methods aimed at recording the heterogeneity of human behaviours across history, space, and cultures. Defenders of the universalist approach tend to criticise idiographic methods because such methods can lead to relativism or may lack generality. To overcome explanatory limitations of proposals that adopt either universalist or contextualist approaches in isolation, I propose a philosophical model that integrates contributions from both traditions: the psycho-historical theory of agent-identification. This theory investigates how the tracking processes that humans use for identifying agents interact with the unique socio-historical contexts that support agent-identification practices. In integrating hypotheses about the history of agents with psychological and epistemological principles regarding agent-identification, the theory can generate novel hypotheses regarding the distinction between recognition-based, heuristic-based, and explanation-based agent-identification.

  16. Face recognition for criminal identification: An implementation of principal component analysis for face recognition

    NASA Astrophysics Data System (ADS)

    Abdullah, Nurul Azma; Saidi, Md. Jamri; Rahman, Nurul Hidayah Ab; Wen, Chuah Chai; Hamid, Isredza Rahmi A.

    2017-10-01

    In practice, identification of criminals in Malaysia is done through thumbprint identification. However, this type of identification is constrained, as criminals nowadays are increasingly careful not to leave thumbprints at the scene. With the advent of security technology, cameras, especially CCTV, have been installed in many public and private areas to provide surveillance. CCTV footage can be used to identify suspects at the scene. However, because little software has been developed to automatically detect the similarity between a photo in the footage and recorded photos of criminals, law enforcement still relies on thumbprint identification. In this paper, an automated facial recognition system for a criminal database is proposed using the well-known Principal Component Analysis approach. The system is able to detect and recognize faces automatically, helping law enforcement to identify suspects when no thumbprint is present at the scene. The results show that about 80% of input photos can be matched with the template data.
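A minimal sketch of the Principal Component Analysis ("eigenfaces") matching approach named above, using random vectors in place of real face images (all data, dimensions, and names here are hypothetical, not the authors' system):

```python
import numpy as np

def train_eigenfaces(faces, k):
    """PCA on flattened face images: returns the mean face and top-k components."""
    mean = faces.mean(axis=0)
    centered = faces - mean
    # Rows of Vt are principal components (orthonormal directions).
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    return mean, Vt[:k]

def project(face, mean, components):
    """Coordinates of a face in the k-dimensional eigenface space."""
    return components @ (face - mean)

def match(face, gallery, mean, components):
    """Index of the enrolled gallery face nearest in eigenface space."""
    q = project(face, mean, components)
    dists = [np.linalg.norm(q - project(g, mean, components)) for g in gallery]
    return int(np.argmin(dists))

rng = np.random.default_rng(1)
gallery = rng.normal(size=(10, 64))        # 10 hypothetical enrolled "faces"
mean, comps = train_eigenfaces(gallery, k=5)
probe = gallery[3] + rng.normal(scale=0.01, size=64)  # noisy photo of face 3
idx = match(probe, gallery, mean, comps)
```

Real systems flatten, say, 100x100-pixel crops into 10,000-dimensional vectors and keep a few dozen components; the nearest-neighbor step and a distance threshold then decide whether a CCTV face matches an enrolled record.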

  17. Disentangling the initiation from the response in joint attention: an eye-tracking study in toddlers with autism spectrum disorders.

    PubMed

    Billeci, L; Narzisi, A; Campatelli, G; Crifaci, G; Calderoni, S; Gagliano, A; Calzone, C; Colombi, C; Pioggia, G; Muratori, F

    2016-05-17

    Joint attention (JA), whose deficit is an early risk marker for autism spectrum disorder (ASD), has two dimensions: (1) responding to JA and (2) initiating JA. Eye-tracking technology has largely been used to investigate responding JA, but rarely to study initiating JA, especially in young children with ASD. The aim of this study was to describe the differences in the visual patterns of toddlers with ASD and those with typical development (TD) during both responding JA and initiating JA tasks. Eye-tracking technology was used to monitor the gaze of 17 children with ASD and 15 age-matched children with TD during the presentation of short video sequences involving one responding JA and two initiating JA tasks (initiating JA-1 and initiating JA-2). Gaze accuracy, transitions and fixations were analyzed. No differences were found in the responding JA task between children with ASD and those with TD, whereas, in the initiating JA tasks, different patterns of fixation and transitions were shown between the groups. These results suggest that children with ASD and those with TD show different visual patterns when they are expected to initiate joint attention but not when they respond to joint attention. We hypothesized that differences in transitions and fixations are linked to ASD impairments in visual disengagement from faces, in global scanning of the scene and in the ability to anticipate an object's action.

  18. Talent Identification in Track and Field.

    ERIC Educational Resources Information Center

    Henson, Phillip; And Others

    Talent identification in most sports occurs through mass participation and the process of natural selection; track and field does not enjoy such widespread participation. This paper reports on a project undertaken for the following purposes: improve the means by which youth with the potential for high level performance can be identified; develop…

  19. An Eye Tracking Study on the Perception and Comprehension of Unimodal and Bimodal Linguistic Inputs by Deaf Adolescents

    PubMed Central

    Mastrantuono, Eliana; Saldaña, David; Rodríguez-Ortiz, Isabel R.

    2017-01-01

    An eye tracking experiment explored the gaze behavior of deaf individuals when perceiving language in spoken and sign language only, and in sign-supported speech (SSS). Participants were deaf (n = 25) and hearing (n = 25) Spanish adolescents. Deaf students were prelingually profoundly deaf individuals with cochlear implants (CIs) used by age 5 or earlier, or prelingually profoundly deaf native signers with deaf parents. The effectiveness of SSS has rarely been tested within the same group of children for discourse-level comprehension. Here, video-recorded texts, including spatial descriptions, were alternately transmitted in spoken language, sign language and SSS. The capacity of these communicative systems to equalize comprehension in deaf participants with that of spoken language in hearing participants was tested. Within-group analyses of deaf participants tested if the bimodal linguistic input of SSS favored discourse comprehension compared to unimodal languages. Deaf participants with CIs achieved equal comprehension to hearing controls in all communicative systems while deaf native signers with no CIs achieved equal comprehension to hearing participants if tested in their native sign language. Comprehension of SSS was not increased compared to spoken language, even when spatial information was communicated. Eye movements of deaf and hearing participants were tracked and data of dwell times spent looking at the face or body area of the sign model were analyzed. Within-group analyses focused on differences between native and non-native signers. Dwell times of hearing participants were equally distributed across upper and lower areas of the face while deaf participants mainly looked at the mouth area; this could enable information to be obtained from mouthings in sign language and from lip-reading in SSS and spoken language. Few fixations were directed toward the signs, although these were more frequent when spatial language was transmitted. 
Both native and non…

  20. Disk space and load time requirements for eye movement biometric databases

    NASA Astrophysics Data System (ADS)

    Kasprowski, Pawel; Harezlak, Katarzyna

    2016-06-01

    Biometric identification is a very popular area of interest nowadays. Problems with the so-called physiological methods, like fingerprint or iris recognition, have resulted in increased attention to methods measuring behavioral patterns. Eye movement based biometric (EMB) identification is one of the interesting behavioral methods, and due to the intensive development of eye tracking devices it has become possible to define new methods for eye movement signal processing. Such methods should be supported by efficient storage used to collect eye movement data and provide it for further analysis. The aim of this research was to evaluate various setups enabling such a storage choice. Various aspects were taken into consideration, such as disk space usage and the time required for loading and saving the whole data set or chosen parts of it.

  1. The efficacy of obtaining genetic-based identifications from putative wolverine snow tracks

    Treesearch

    Todd J. Ulizio; John R. Squires; Daniel H. Pletscher; Michael K. Schwartz; James J. Claar; Leonard F. Ruggiero

    2006-01-01

    Snow-track surveys to detect rare carnivores require unequivocal species identification because of management and political ramifications associated with the presence of such species. Collecting noninvasive genetic samples from putative wolverine (Gulo gulo) snow tracks is an effective method for providing definitive species identification for use in presence-...

  2. Eye-Tracking as a Measure of Responsiveness to Joint Attention in Infants at Risk for Autism

    ERIC Educational Resources Information Center

    Navab, Anahita; Gillespie-Lynch, Kristen; Johnson, Scott P.; Sigman, Marian; Hutman, Ted

    2012-01-01

    Reduced responsiveness to joint attention (RJA), as assessed by the Early Social Communication Scales (ESCS), is predictive of both subsequent language difficulties and autism diagnosis. Eye-tracking measurement of RJA is a promising prognostic tool because it is highly precise and standardized. However, the construct validity of eye-tracking…

  3. Visual Attention for Solving Multiple-Choice Science Problem: An Eye-Tracking Analysis

    ERIC Educational Resources Information Center

    Tsai, Meng-Jung; Hou, Huei-Tse; Lai, Meng-Lung; Liu, Wan-Yi; Yang, Fang-Ying

    2012-01-01

    This study employed an eye-tracking technique to examine students' visual attention when solving a multiple-choice science problem. Six university students participated in a problem-solving task to predict occurrences of landslide hazards from four images representing four combinations of four factors. Participants' responses and visual attention…

  4. Mouse cursor movement and eye tracking data as an indicator of pathologists' attention when viewing digital whole slide images.

    PubMed

    Raghunath, Vignesh; Braxton, Melissa O; Gagnon, Stephanie A; Brunyé, Tad T; Allison, Kimberly H; Reisch, Lisa M; Weaver, Donald L; Elmore, Joann G; Shapiro, Linda G

    2012-01-01

    Digital pathology has the potential to dramatically alter the way pathologists work, yet little is known about pathologists' viewing behavior while interpreting digital whole slide images. While tracking pathologist eye movements when viewing digital slides may be the most direct method of capturing pathologists' viewing strategies, this technique is cumbersome and technically challenging to use in remote settings. Tracking pathologist mouse cursor movements may serve as a practical method of studying digital slide interpretation, and mouse cursor data may illuminate pathologists' viewing strategies and time expenditures in their interpretive workflow. To evaluate the utility of mouse cursor movement data, in addition to eye-tracking data, in studying pathologists' attention and viewing behavior. Pathologists (N = 7) viewed 10 digital whole slide images of breast tissue that were selected using a random stratified sampling technique to include a range of breast pathology diagnoses (benign/atypia, carcinoma in situ, and invasive breast cancer). A panel of three expert breast pathologists established a consensus diagnosis for each case using a modified Delphi approach. Participants' foveal vision was tracked using SensoMotoric Instruments RED 60 Hz eye-tracking system. Mouse cursor movement was tracked using a custom MATLAB script. Data on eye-gaze and mouse cursor position were gathered at fixed intervals and analyzed using distance comparisons and regression analyses by slide diagnosis and pathologist expertise. Pathologists' accuracy (defined as percent agreement with the expert consensus diagnoses) and efficiency (accuracy and speed) were also analyzed. Mean viewing time per slide was 75.2 seconds (SD = 38.42). Accuracy (percent agreement with expert consensus) by diagnosis type was: 83% (benign/atypia); 48% (carcinoma in situ); and 93% (invasive). Spatial coupling was close between eye-gaze and mouse cursor positions (highest frequency ∆x was 4.00px (SD = 16
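The spatial coupling between eye-gaze and cursor positions can be quantified, for example, as the mean Euclidean distance between time-aligned samples (a simplified stand-in for the paper's distance comparisons; the coordinates below are made up, and real data would first need resampling to a common clock):

```python
import math

def mean_coupling_distance(gaze, cursor):
    """Mean Euclidean distance (px) between time-aligned gaze and
    cursor samples; smaller values indicate tighter eye-mouse coupling."""
    return sum(math.dist(g, c) for g, c in zip(gaze, cursor)) / len(gaze)

# Hypothetical (x, y) pixel positions sampled at the same instants.
gaze = [(100, 100), (110, 105), (120, 110)]
cursor = [(103, 104), (110, 105), (117, 106)]
d = mean_coupling_distance(gaze, cursor)
```

Per-axis statistics like the Δx the abstract reports would replace `math.dist` with the signed difference of the x coordinates.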

  5. Geometry and Gesture-Based Features from Saccadic Eye-Movement as a Biometric in Radiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammond, Tracy; Tourassi, Georgia; Yoon, Hong-Jun

    In this study, we present a novel application of sketch gesture recognition to eye-movement data for biometric identification and the estimation of task expertise. The study was performed for the task of mammographic screening with simultaneous viewing of four coordinated breast views, as typically done in clinical practice. Eye-tracking data and diagnostic decisions collected for 100 mammographic cases (25 normal, 25 benign, 50 malignant) and 10 readers (three board-certified radiologists and seven radiology residents) formed the corpus for this study. Sketch gesture recognition techniques were employed to extract geometric and gesture-based features from saccadic eye-movements. Our results show that saccadic eye-movement, characterized using sketch-based features, results in more accurate models for predicting individual identity and level of expertise than more traditional eye-tracking features.

  6. How Young Children View Mathematical Representations: A Study Using Eye-Tracking Technology

    ERIC Educational Resources Information Center

    Bolden, David; Barmby, Patrick; Raine, Stephanie; Gardner, Matthew

    2015-01-01

    Background: It has been shown that mathematical representations can aid children's understanding of mathematical concepts but that children can sometimes have difficulty in interpreting them correctly. New advances in eye-tracking technology can help in this respect because it allows data to be gathered concerning children's focus of attention and…

  7. How visual search relates to visual diagnostic performance: a narrative systematic review of eye-tracking research in radiology.

    PubMed

    van der Gijp, A; Ravesloot, C J; Jarodzka, H; van der Schaaf, M F; van der Schaaf, I C; van Schaik, J P J; Ten Cate, Th J

    2017-08-01

    Eye tracking research has been conducted for decades to gain understanding of visual diagnosis such as in radiology. For educational purposes, it is important to identify visual search patterns that are related to high perceptual performance and to identify effective teaching strategies. This review of eye-tracking literature in the radiology domain aims to identify visual search patterns associated with high perceptual performance. Databases PubMed, EMBASE, ERIC, PsycINFO, Scopus and Web of Science were searched using 'visual perception' OR 'eye tracking' AND 'radiology' and synonyms. Two authors independently screened search results and included eye tracking studies concerning visual skills in radiology published between January 1, 1994 and July 31, 2015. Two authors independently assessed study quality with the Medical Education Research Study Quality Instrument, and extracted study data with respect to design, participant and task characteristics, and variables. A thematic analysis was conducted to extract and arrange study results, and a textual narrative synthesis was applied for data integration and interpretation. The search resulted in 22 relevant full-text articles. Thematic analysis resulted in six themes that informed the relation between visual search and level of expertise: (1) time on task, (2) eye movement characteristics of experts, (3) differences in visual attention, (4) visual search patterns, (5) search patterns in cross sectional stack imaging, and (6) teaching visual search strategies. Expert search was found to be characterized by a global-focal search pattern, which represents an initial global impression, followed by a detailed, focal search-to-find mode. Specific task-related search patterns, like drilling through CT scans and systematic search in chest X-rays, were found to be related to high expert levels. One study investigated teaching of visual search strategies, and did not find a significant effect on perceptual performance. Eye

  8. Quantifying Eye Tracking Between Skilled Nurses and Nursing Students in Intravenous Injection.

    PubMed

    Maekawa, Yasuko; Majima, Yukie; Soga, Masato

    2016-01-01

    In nursing education, it is important that nursing students acquire appropriate nursing knowledge and skills, which include the empirical tacit knowledge of skilled nurses. Verbalizing such knowledge is difficult. We therefore focused on eye movements during skill performance by expert nurses and nursing students, since vision is said to account for more than 70% of all sensory information. To support the learning of tacit nursing skills, we analyzed differences between the two groups, including gaze, using measurements from an eye mark recorder. The results show that when the nurses address the needle-insertion part of the task in particular, they move their eyes safely, surely, and economically in line with the purposes of their tasks.

  9. Looking at faces from different angles: Europeans fixate different features in Asian and Caucasian faces.

    PubMed

    Brielmann, Aenne A; Bülthoff, Isabelle; Armann, Regine

    2014-07-01

    Race categorization of faces is a fast and automatic process and is known to affect further face processing profoundly and at earliest stages. Whether processing of own- and other-race faces might rely on different facial cues, as indicated by diverging viewing behavior, is much under debate. We therefore aimed to investigate two open questions in our study: (1) Do observers consider information from distinct facial features informative for race categorization or do they prefer to gain global face information by fixating the geometrical center of the face? (2) Does the fixation pattern, or, if facial features are considered relevant, do these features differ between own- and other-race faces? We used eye tracking to test where European observers look when viewing Asian and Caucasian faces in a race categorization task. Importantly, in order to disentangle centrally located fixations from those towards individual facial features, we presented faces in frontal, half-profile and profile views. We found that observers showed no general bias towards looking at the geometrical center of faces, but rather directed their first fixations towards distinct facial features, regardless of face race. However, participants looked at the eyes more often in Caucasian faces than in Asian faces, and there were significantly more fixations to the nose for Asian compared to Caucasian faces. Thus, observers rely on information from distinct facial features rather than facial information gained by centrally fixating the face. To what extent specific features are looked at is determined by the face's race. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  10. Face exploration dynamics differentiate men and women.

    PubMed

    Coutrot, Antoine; Binetti, Nicola; Harrison, Charlotte; Mareschal, Isabelle; Johnston, Alan

    2016-11-01

    The human face is central to our everyday social interactions. Recent studies have shown that while gazing at faces, each one of us has a particular eye-scanning pattern, highly stable across time. Although variables such as culture or personality have been shown to modulate gaze behavior, we still don't know what shapes these idiosyncrasies. Moreover, most previous observations rely on static analyses of small-sized eye-position data sets averaged across time. Here, we probe the temporal dynamics of gaze to explore what information can be extracted about the observers and what is being observed. Controlling for any stimuli effect, we demonstrate that among many individual characteristics, the gender of both the participant (gazer) and the person being observed (actor) are the factors that most influence gaze patterns during face exploration. We record and exploit the largest set of eye-tracking data (405 participants, 58 nationalities) from participants watching videos of another person. Using novel data-mining techniques, we show that female gazers follow a much more exploratory scanning strategy than males. Moreover, female gazers watching female actresses look more at the eye on the left side. These results have strong implications in every field using gaze-based models from computer vision to clinical psychology.

  11. Hidden Communicative Competence: Case Study Evidence Using Eye-Tracking and Video Analysis

    ERIC Educational Resources Information Center

    Grayson, Andrew; Emerson, Anne; Howard-Jones, Patricia; O'Neil, Lynne

    2012-01-01

    A facilitated communication (FC) user with an autism spectrum disorder produced sophisticated texts by pointing, with physical support, to letters on a letterboard while their eyes were tracked and while their pointing movements were video recorded. This FC user has virtually no independent means of expression, and is held to have no literacy…

  12. 14. VIEW SHOWING UPSTREAM FACE OF HORSE MESA. TRACK FROM ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. VIEW SHOWING UPSTREAM FACE OF HORSE MESA. TRACK FROM AGGREGATE BARGES TO MIXING PLANT IS AT LOWER LEFT, RIGHT SPILLWAY CHUTE IS TAKING FORM AT UPPER RIGHT April 29, 1927 - Horse Mesa Dam, Salt River, 65 miles East of Phoenix, Phoenix, Maricopa County, AZ

  13. iTrack: instrumented mobile electrooculography (EOG) eye-tracking in older adults and Parkinson's disease.

    PubMed

    Stuart, Samuel; Hickey, Aodhán; Galna, Brook; Lord, Sue; Rochester, Lynn; Godfrey, Alan

    2017-01-01

    Detection of saccades (fast eye-movements) within raw mobile electrooculography (EOG) data involves complex algorithms which typically process data acquired during seated static tasks only. Processing of data during dynamic tasks such as walking is relatively rare and complex, particularly in older adults or people with Parkinson's disease (PD). Development of algorithms that can be easily implemented to detect saccades is required. This study aimed to develop an algorithm for the detection and measurement of saccades in EOG data during static (sitting) and dynamic (walking) tasks, in older adults and PD. Eye-tracking via mobile EOG and an infra-red (IR) eye-tracker (with video) was performed with a group of older adults (n = 10) and PD participants (n = 10) (⩾50 years). Horizontal saccades made between targets set 5°, 10° and 15° apart were first measured while seated. Horizontal saccades were then measured while a participant walked and executed a 40° turn left and right. The EOG algorithm was evaluated by comparing the number of correct saccade detections and agreement (ICC(2,1)) between output from visual inspection of eye-tracker videos and the IR eye-tracker. The EOG algorithm detected 75-92% of saccades compared to video inspection and IR output during static testing, with fair to excellent agreement (ICC(2,1) 0.49-0.93). However, during walking, EOG saccade detection reduced to 42-88% compared to video inspection or IR output, with poor to excellent agreement (ICC(2,1) 0.13-0.88) between methodologies. The algorithm was robust during seated testing but less so during walking, which was likely due to increased measurement and analysis error with a dynamic task. Future studies may consider a combination of EOG and IR for comprehensive measurement.
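A common way to implement saccade detection of the kind the study describes is a simple velocity threshold (I-VT); the sketch below is a generic stand-in under assumed units and thresholds, not the published iTrack algorithm:

```python
def detect_saccades(signal, fs, vel_thresh):
    """Velocity-threshold saccade detection on a 1-D EOG trace.

    Samples whose absolute velocity exceeds `vel_thresh` (signal
    units per second) are grouped into saccade intervals, returned
    as (start, end) sample indices.
    """
    velocity = [(signal[i + 1] - signal[i]) * fs for i in range(len(signal) - 1)]
    saccades, start = [], None
    for i, v in enumerate(velocity):
        if abs(v) > vel_thresh:
            if start is None:
                start = i          # saccade onset
        elif start is not None:
            saccades.append((start, i))  # saccade offset
            start = None
    if start is not None:
        saccades.append((start, len(velocity)))
    return saccades

# Synthetic trace sampled at 100 Hz: fixation, fast 10-unit jump, fixation.
trace = [0.0] * 10 + [2.0, 6.0, 10.0] + [10.0] * 10
events = detect_saccades(trace, fs=100, vel_thresh=50.0)
```

During walking, motion artifacts raise the baseline velocity, which is one reason a fixed threshold degrades on dynamic tasks; adaptive thresholds or EOG+IR fusion, as the authors suggest, mitigate this.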

  14. Directional templates for real-time detection of coronal axis rotated faces

    NASA Astrophysics Data System (ADS)

    Perez, Claudio A.; Estevez, Pablo A.; Garate, Patricio

    2004-10-01

    Real-time face and iris detection in video images has gained renewed attention because of multiple possible applications in studying eye function, drowsiness detection, virtual keyboard interfaces, face recognition, video processing and multimedia retrieval. In this paper, a study is presented on using directional templates for the detection of faces rotated about the coronal axis. The templates are built by extracting the directional image information from the regions of the eyes, nose and mouth. The face position is determined by computing a line integral using the templates over the face directional image; the line integral reaches a maximum when it coincides with the face position. An improvement in localization selectivity is shown through the increased value of the line integral computed with the directional template. Improvements in the line integral value with respect to face size and face rotation angle were also found. Based on these results, the new templates should improve selectivity and hence provide the means to restrict computations to a smaller number of templates and to restrict the region of search during the face and eye tracking procedure. The proposed method runs in real time, is completely non-invasive, and was applied with no background limitation under normal indoor illumination conditions.

  15. Eye Tracking Reveals a Crucial Role for Facial Motion in Recognition of Faces by Infants

    ERIC Educational Resources Information Center

    Xiao, Naiqi G.; Quinn, Paul C.; Liu, Shaoying; Ge, Liezhong; Pascalis, Olivier; Lee, Kang

    2015-01-01

    Current knowledge about face processing in infancy comes largely from studies using static face stimuli, but faces that infants see in the real world are mostly moving ones. To bridge this gap, 3-, 6-, and 9-month-old Asian infants (N = 118) were familiarized with either moving or static Asian female faces, and then their face recognition was…

  16. Modulations of eye movement patterns by spatial filtering during the learning and testing phases of an old/new face recognition task.

    PubMed

    Lemieux, Chantal L; Collin, Charles A; Nelson, Elizabeth A

    2015-02-01

    In two experiments, we examined the effects of varying the spatial frequency (SF) content of face images on eye movements during the learning and testing phases of an old/new recognition task. At both learning and testing, participants were presented with face stimuli band-pass filtered to 11 different SF bands, as well as an unfiltered baseline condition. We found that eye movements varied significantly as a function of SF. Specifically, the frequency of transitions between facial features showed a band-pass pattern, with more transitions for middle-band faces (≈5-20 cycles/face) than for low-band (≈<5 cpf) or high-band (≈>20 cpf) ones. These findings were similar for the learning and testing phases. The distributions of transitions across facial features were similar for the middle-band, high-band, and unfiltered faces, showing a concentration on the eyes and mouth; conversely, low-band faces elicited mostly transitions involving the nose and nasion. The eye movement patterns elicited by low, middle, and high bands are similar to those previous researchers have suggested reflect holistic, configural, and featural processing, respectively. More generally, our results are compatible with the hypotheses that eye movements are functional, and that the visual system makes flexible use of visuospatial information in face processing. Finally, our finding that only middle spatial frequencies yielded the same number and distribution of fixations as unfiltered faces adds more evidence to the idea that these frequencies are especially important for face recognition, and reveals a possible mediator for the superior performance that they elicit.
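
    Band-pass filtering a face image to a given range of cycles per face can be sketched with an FFT mask. The function below is a generic illustration, not the filter used in the study; in particular, the hard radial cutoff is an assumption (published filters often use smoother transitions).

```python
import numpy as np

def bandpass_face(img, low_cpf, high_cpf):
    """Keep only spatial frequencies between low_cpf and high_cpf
    (cycles per face, taken here as cycles per image) via an FFT mask."""
    h, w = img.shape
    fy = np.fft.fftfreq(h) * h                 # cycles per image, y axis
    fx = np.fft.fftfreq(w) * w                 # cycles per image, x axis
    radius = np.sqrt(fx[None, :] ** 2 + fy[:, None] ** 2)
    mask = (radius >= low_cpf) & (radius <= high_cpf)
    spectrum = np.fft.fft2(img)
    return np.real(np.fft.ifft2(spectrum * mask))
```

    With this convention, the study's middle band would correspond roughly to `bandpass_face(img, 5, 20)` on an image cropped to the face.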

  17. 77 FR 35983 - Agency Information Collection Activities; Proposed Collection; Comment Request; Eye Tracking...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-15

    ... also help improve questionnaire design. Different respondents may pay differing degrees of attention to... and strategies for improving the design (Refs. 5 and 6). Finally, eye tracking data can provide... design elements (e.g., prominence, text vs. graphics) will cause variations in information seeking. To...

  18. [Eye lens radiation exposure during ureteroscopy with and without a face protection shield: Investigations on a phantom model].

    PubMed

    Zöller, G; Figel, M; Denk, J; Schulz, K; Sabo, A

    2016-03-01

    Eye lens radiation exposure during radiologically guided endoscopic procedures may result in radiation-induced cataracts; therefore, we investigated ocular radiation exposure during ureteroscopy on a phantom model. Using an Alderson phantom model and eye lens dosimeters, we measured the ocular radiation exposure as a function of the number of X-ray images and the duration of fluoroscopic imaging. The measurements were done with and without a face protection shield. We demonstrated that significant ocular radiation exposure can occur, depending on the number of X-ray images and the duration of fluoroscopy. Eye lens doses up to 0.025 mSv were recorded even with modern digital X-ray systems. Using face protection shields, this ocular radiation exposure can be reduced to a minimum. The International Commission on Radiological Protection (ICRP) recommendation of a mean eye lens dose of 20 mSv/year may be exceeded during repeated ureteroscopy by a high-volume surgeon. Using a face protection shield, the eye lens dose during ureteroscopy could be reduced to a minimum in a phantom model. Further investigations will show whether these results can be transferred to real-life ureteroscopic procedures.

  19. The influence of variations in eating disorder-related symptoms on processing of emotional faces in a non-clinical female sample: An eye-tracking study.

    PubMed

    Sharpe, Emma; Wallis, Deborah J; Ridout, Nathan

    2016-06-30

    This study aimed to: (i) determine if the attention bias towards angry faces reported in eating disorders generalises to a non-clinical sample varying in eating disorder-related symptoms; (ii) examine if the bias occurs during initial orientation or later strategic processing; and (iii) confirm previous findings of impaired facial emotion recognition in non-clinical disordered eating. Fifty-two females viewed a series of face-pairs (happy or angry paired with neutral) whilst their attentional deployment was continuously monitored using an eye-tracker. They subsequently identified the emotion portrayed in a separate series of faces. The highest (n=18) and lowest scorers (n=17) on the Eating Disorders Inventory (EDI) were compared on the attention and facial emotion recognition tasks. Those with relatively high scores exhibited impaired facial emotion recognition, confirming previous findings in similar non-clinical samples. They also displayed biased attention away from emotional faces during later strategic processing, which is consistent with previously observed impairments in clinical samples. These differences were related to drive-for-thinness. Although we found no evidence of a bias towards angry faces, it is plausible that the observed impairments in emotion recognition and avoidance of emotional faces could disrupt social functioning and act as a risk factor for the development of eating disorders. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. Eye Tracking and Pupillometry are Indicators of Dissociable Latent Decision Processes

    PubMed Central

    Cavanagh, James F.; Wiecki, Thomas V.; Kochar, Angad; Frank, Michael J.

    2014-01-01

    Can you predict what someone is going to do just by watching them? This is certainly difficult: it would require a clear mapping between observable indicators and unobservable cognitive states. In this report we demonstrate how this is possible by monitoring eye gaze and pupil dilation, which predict dissociable biases during decision making. We quantified decision making using the Drift Diffusion Model (DDM), which provides an algorithmic account of how evidence accumulation and response caution contribute to decisions through separate latent parameters of drift rate and decision threshold, respectively. We used a hierarchical Bayesian estimation approach to assess the single trial influence of observable physiological signals on these latent DDM parameters. Increased eye gaze dwell time specifically predicted an increased drift rate toward the fixated option, irrespective of the value of the option. In contrast, greater pupil dilation specifically predicted an increase in decision threshold during difficult decisions. These findings suggest that eye tracking and pupillometry reflect the operations of dissociated latent decision processes. PMID:24548281
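
    A minimal Euler-Maruyama simulation of one DDM trial shows how the latent parameters interact. Here `dwell_bias` is a hypothetical stand-in for the gaze-dwell effect on drift rate described above, and a pupil-linked increase in caution would instead raise `threshold`; the hierarchical Bayesian estimation used in the study is not reproduced.

```python
import numpy as np

def simulate_ddm(drift, threshold, dwell_bias=0.0, dt=0.001, noise=1.0,
                 max_t=5.0, rng=None):
    """Simulate one drift-diffusion trial.

    Evidence starts at 0 and accumulates at rate (drift + dwell_bias)
    plus Gaussian noise until it crosses +threshold (fixated option)
    or -threshold. Returns (choice: +1/-1/0 on timeout, RT in seconds).
    """
    rng = rng or np.random.default_rng()
    x, t = 0.0, 0.0
    sqrt_dt = np.sqrt(dt)
    while t < max_t:
        x += (drift + dwell_bias) * dt + noise * sqrt_dt * rng.standard_normal()
        t += dt
        if x >= threshold:
            return 1, t
        if x <= -threshold:
            return -1, t
    return 0, t       # no boundary reached within max_t
```

    Raising `dwell_bias` pushes choices toward the fixated option without changing caution; raising `threshold` slows responses and reduces errors, which is the dissociation the eye-gaze and pupil measures are argued to track.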

  1. The eyes have it: Using eye tracking to inform information processing strategies in multi-attributes choices.

    PubMed

    Ryan, Mandy; Krucien, Nicolas; Hermens, Frouke

    2018-04-01

    Although choice experiments (CEs) are widely applied in economics to study choice behaviour, understanding of how individuals process attribute information remains limited. We show how eye-tracking methods can provide insight into how decisions are made. Participants completed a CE while their eye movements were recorded. Results show that although the information presented guided participants' decisions, there were also several processing biases at work. Evidence was found of (a) top-to-bottom, (b) left-to-right, and (c) first-to-last order biases. Experimental factors (whether attributes are defined as "best" or "worst," choice task complexity, and attribute ordering) also influence information processing. How individuals visually process attribute information was shown to be related to their choices. Implications for the design and analysis of CEs and future research are discussed. Copyright © 2017 John Wiley & Sons, Ltd.

  2. Deficits in Cross-Race Face Learning: Insights From Eye Movements and Pupillometry

    PubMed Central

    Goldinger, Stephen D.; He, Yi; Papesh, Megan H.

    2010-01-01

    The own-race bias (ORB) is a well-known finding wherein people are better able to recognize and discriminate own-race faces, relative to cross-race faces. In 2 experiments, participants viewed Asian and Caucasian faces, in preparation for recognition memory tests, while their eye movements and pupil diameters were continuously monitored. In Experiment 1 (with Caucasian participants), systematic differences emerged in both measures as a function of depicted race: While encoding cross-race faces, participants made fewer (and longer) fixations, they preferentially attended to different sets of features, and their pupils were more dilated, all relative to own-race faces. Also, in both measures, a pattern emerged wherein some participants reduced their apparent encoding effort to cross-race faces over trials. In Experiment 2 (with Asian participants), the authors observed the same patterns, although the ORB favored the opposite set of faces. Taken together, the results suggest that the ORB appears during initial perceptual encoding. Relative to own-race face encoding, cross-race encoding requires greater effort, which may reduce vigilance in some participants. PMID:19686008

  3. Similarity-Based Interference during Language Comprehension: Evidence from Eye Tracking during Reading

    ERIC Educational Resources Information Center

    Gordon, Peter C.; Hendrick, Randall; Johnson, Marcus; Lee, Yoonhyoung

    2006-01-01

    The nature of working memory operation during complex sentence comprehension was studied by means of eye-tracking methodology. Readers had difficulty when the syntax of a sentence required them to hold 2 similar noun phrases (NPs) in working memory before syntactically and semantically integrating either of the NPs with a verb. In sentence …

  4. Measuring and tracking eye movements of a behaving archer fish by real-time stereo vision.

    PubMed

    Ben-Simon, Avi; Ben-Shahar, Ohad; Segev, Ronen

    2009-11-15

    The archer fish (Toxotes chatareus) exhibits unique visual behavior: it is able to aim at insects resting on foliage above the water level, shoot them down with a squirt of water, and then feed on them. This extreme behavior requires excellent visual acuity, learning, and tight synchronization between the visual system and body motion. It also raises many important questions, such as how the fish compensates for air-water refraction and what neural mechanisms underlie target acquisition. While many such questions remain open, significant insights towards solving them can be obtained by tracking the eye and body movements of freely behaving fish. Unfortunately, existing tracking methods suffer from either a high level of invasiveness or low resolution. Here, we present a video-based eye tracking method for accurately and remotely measuring the eye and body movements of a freely moving, behaving fish. Based on a stereo vision system and a unique triangulation method that corrects for air-glass-water refraction, we are able to measure the full three-dimensional pose of the fish eye and body with high temporal and spatial resolution. Our method, being generic, can be applied to studying the behavior of marine animals in general. We demonstrate how data collected by our method may be used to show that the hunting behavior of the archer fish consists of surfacing while rotating the body around the direction of the fish's fixed gaze towards the target, until the snout reaches the correct shooting position at water level.
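
    The air-water refraction the fish must compensate for follows Snell's law. The sketch below converts the apparent elevation of an aerial target, as seen along a ray underwater, into the true in-air angle; it is a simplified illustration that ignores the glass wall of the tank (which the authors' triangulation also handles).

```python
import numpy as np

N_AIR, N_WATER = 1.0, 1.333  # refractive indices

def apparent_to_true_angle(apparent_deg):
    """Convert the in-water ray angle (degrees from the surface normal)
    to the true in-air angle of the target, via Snell's law:
    n_air * sin(theta_air) = n_water * sin(theta_water).
    """
    theta_w = np.radians(apparent_deg)
    sin_air = (N_WATER / N_AIR) * np.sin(theta_w)
    if sin_air > 1.0:
        # beyond the critical angle (~48.6 deg) there is no refracted ray
        raise ValueError("beyond critical angle; no refracted ray")
    return np.degrees(np.arcsin(sin_air))
```

    Because water is optically denser, the in-air angle is always larger than the underwater viewing angle, so an uncorrected shot would miss high targets.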

  5. What Do Infants See in Faces? ERP Evidence of Different Roles of Eyes and Mouth for Face Perception in 9-Month-Old Infants

    ERIC Educational Resources Information Center

    Key, Alexandra P. F.; Stone, Wendy; Williams, Susan M.

    2009-01-01

    The study examined whether face-specific perceptual brain mechanisms in 9-month-old infants are differentially sensitive to changes in individual facial features (eyes versus mouth) and whether sensitivity to such changes is related to infants' social and communicative skills. Infants viewed photographs of a smiling unfamiliar female face. On 30%…

  6. Face-off: A new identification procedure for child eyewitnesses.

    PubMed

    Price, Heather L; Fitzgerald, Ryan J

    2016-09-01

    In 2 experiments, we introduce a new "face-off" procedure for child eyewitness identifications. The new procedure, which is premised on reducing the stimulus set size, was compared with the showup and simultaneous procedures in Experiment 1 and with modified versions of the simultaneous and elimination procedures in Experiment 2. Several benefits of the face-off procedure were observed: it was significantly more diagnostic than the showup procedure; it led to significantly more correct rejections of target-absent lineups than the simultaneous procedures in both experiments, and it led to greater information gain than the modified elimination and simultaneous procedures. The face-off procedure led to consistently more conservative responding than the simultaneous procedures in both experiments. Given the commonly cited concern that children are too lenient in their decision criteria for identification tasks, the face-off procedure may offer a concrete technique to reduce children's high choosing rates. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  7. Neural mechanisms of eye contact when listening to another person talking

    PubMed Central

    Borowiak, Kamila; Tudge, Luke; Otto, Carolin; von Kriegstein, Katharina

    2017-01-01

    Abstract Eye contact occurs frequently and voluntarily during face-to-face verbal communication. However, the neural mechanisms underlying eye contact when it is accompanied by spoken language remain unexplored to date. Here we used a novel approach, fixation-based event-related functional magnetic resonance imaging (fMRI), to simulate the listener making eye contact with a speaker during verbal communication. Participants’ eye movements and fMRI data were recorded simultaneously while they were freely viewing a pre-recorded speaker talking. The eye tracking data were then used to define events for the fMRI analyses. The results showed that eye contact in contrast to mouth fixation involved visual cortical areas (cuneus, calcarine sulcus), brain regions related to theory of mind/intentionality processing (temporoparietal junction, posterior superior temporal sulcus, medial prefrontal cortex) and the dorsolateral prefrontal cortex. In addition, increased effective connectivity was found between these regions for eye contact in contrast to mouth fixations. The results provide first evidence for neural mechanisms underlying eye contact when watching and listening to another person talking. The network we found might be well suited for processing the intentions of communication partners during eye contact in verbal communication. PMID:27576745

  8. Looking at Movies and Cartoons: Eye-Tracking Evidence from Williams Syndrome and Autism

    ERIC Educational Resources Information Center

    Riby, D.; Hancock, P. J. B.

    2009-01-01

    Background: Autism and Williams syndrome (WS) are neuro-developmental disorders associated with distinct social phenotypes. While individuals with autism show a lack of interest in socially important cues, individuals with WS often show increased interest in socially relevant information. Methods: The current eye-tracking study explores how…

  9. Building face composites can harm lineup identification performance.

    PubMed

    Wells, Gary L; Charman, Steve D; Olson, Elizabeth A

    2005-09-01

    Face composite programs permit eyewitnesses to build likenesses of target faces by selecting facial features and combining them into an intact face. Research has shown that these composites are generally poor likenesses of the target face. Two experiments tested the proposition that this composite-building process could harm the builder's memory for the face. In Experiment 1 (n = 150), the authors used 50 different faces and found that the building of a composite reduced the chances that the person could later identify the original face from a lineup when compared with no composite control conditions or with yoked composite-exposure control conditions. In Experiment 2 (n = 200), the authors found that this effect generalized to a simulated-crime video, but mistaken identifications from target-absent lineups were not inflated by composite building. Copyright 2005 APA, all rights reserved.

  10. The specificity of attentional biases by type of gambling: An eye-tracking study.

    PubMed

    McGrath, Daniel S; Meitner, Amadeus; Sears, Christopher R

    2018-01-01

    A growing body of research indicates that gamblers develop an attentional bias for gambling-related stimuli. Compared to research on substance use, however, few studies have examined attentional biases in gamblers using eye-gaze tracking, which has many advantages over other measures of attention. In addition, previous studies of attentional biases in gamblers have not directly matched type of gambler with personally-relevant gambling cues. The present study investigated the specificity of attentional biases for individual types of gambling using an eye-gaze tracking paradigm. Three groups of participants (poker players, video lottery terminal/slot machine players, and non-gambling controls) took part in one test session in which they viewed 25 sets of four images (poker, VLTs/slot machines, bingo, and board games). Participants' eye fixations were recorded throughout each 8-second presentation of the four images. The results indicated that, as predicted, the two gambling groups preferentially attended to their primary form of gambling, whereas control participants attended to board games more than gambling images. The findings have clinical implications for the treatment of individuals with gambling disorder. Understanding the importance of personally-salient gambling cues will inform the development of effective attentional bias modification treatments for problem gamblers.

  12. Route planning with transportation network maps: an eye-tracking study.

    PubMed

    Grison, Elise; Gyselinck, Valérie; Burkhardt, Jean-Marie; Wiener, Jan Malte

    2017-09-01

    Planning routes using transportation network maps is a common task that has received little attention in the literature. Here, we present a novel eye-tracking paradigm to investigate the psychological processes and mechanisms involved in such route planning. In the experiment, participants were first presented with an origin and destination pair before we presented them with fictitious public transportation maps. Their task was to find the connecting route that required the minimum number of transfers. Based on participants' gaze behaviour, each trial was split into two phases: (1) the search for origin and destination phase, i.e., the initial phase of the trial until participants gazed at both origin and destination at least once, and (2) the route planning and selection phase. Comparisons of other eye-tracking measures between these phases and the time to complete them, which depended on the complexity of the planning task, suggest that these two phases are indeed distinct and supported by different cognitive processes. For example, participants spent more time attending to the centre of the map during the initial search phase, before directing their attention to connecting stations, where transitions between lines were possible. Our results provide novel insights into the psychological processes involved in route planning from maps. The findings are discussed in relation to current theories of route planning.

  13. SacLab: A toolbox for saccade analysis to increase usability of eye tracking systems in clinical ophthalmology practice.

    PubMed

    Cercenelli, Laura; Tiberi, Guido; Corazza, Ivan; Giannaccare, Giuseppe; Fresina, Michela; Marcelli, Emanuela

    2017-01-01

    Many open source software packages have been recently developed to expand the usability of eye tracking systems to study oculomotor behavior, but none of these is specifically designed to encompass all the main functions required for creating eye tracking tests and for providing the automatic analysis of saccadic eye movements. The aim of this study is to introduce SacLab, an intuitive, freely-available MATLAB toolbox based on Graphical User Interfaces (GUIs) that we have developed to increase the usability of the ViewPoint EyeTracker (Arrington Research, Scottsdale, AZ, USA) in clinical ophthalmology practice. SacLab consists of four processing modules that enable the user to easily create visual stimuli tests (Test Designer), record saccadic eye movements (Data Recorder), analyze the recorded data to automatically extract saccadic parameters of clinical interest (Data Analyzer) and provide an aggregate analysis from multiple eye movements recordings (Saccade Analyzer), without requiring any programming effort by the user. A demo application of SacLab to carry out eye tracking tests for the analysis of horizontal saccades was reported. We tested the usability of SacLab toolbox with three ophthalmologists who had no programming experience; the ophthalmologists were briefly trained in the use of SacLab GUIs and were asked to perform the demo application. The toolbox gained an enthusiastic feedback from all the clinicians in terms of intuitiveness, ease of use and flexibility. Test creation and data processing were accomplished in 52±21s and 46±19s, respectively, using the SacLab GUIs. SacLab may represent a useful tool to ease the application of the ViewPoint EyeTracker system in clinical routine in ophthalmology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Anticipatory Effects of Intonation: Eye Movements during Instructed Visual Search

    ERIC Educational Resources Information Center

    Ito, Kiwako; Speer, Shari R.

    2008-01-01

    Three eye-tracking experiments investigated the role of pitch accents during online discourse comprehension. Participants faced a grid with ornaments, and followed prerecorded instructions such as "Next, hang the blue ball" to decorate holiday trees. Experiment 1 demonstrated a processing advantage for felicitous as compared to infelicitous uses…

  15. Surface ablation with iris recognition and dynamic rotational eye tracking-based tissue saving treatment with the Technolas 217z excimer laser.

    PubMed

    Prakash, Gaurav; Agarwal, Amar; Kumar, Dhivya Ashok; Jacob, Soosan; Agarwal, Athiya; Maity, Amrita

    2011-03-01

    To evaluate the visual and refractive outcomes and expected benefits of Tissue Saving Treatment algorithm-guided surface ablation with iris recognition and dynamic rotational eye tracking. This prospective, interventional case series comprised 122 eyes (70 patients). Pre- and postoperative assessment included uncorrected distance visual acuity (UDVA), corrected distance visual acuity (CDVA), refraction, and higher order aberrations. All patients underwent Tissue Saving Treatment algorithm-guided surface ablation with iris recognition and dynamic rotational eye tracking using the Technolas 217z 100-Hz excimer platform (Technolas Perfect Vision GmbH). Follow-up was performed up to 6 months postoperatively. Theoretical benefit analysis was performed to evaluate the algorithm's outcomes compared to others. Preoperative spherocylindrical power was sphere -3.62 ± 1.60 diopters (D) (range: 0 to -6.75 D), cylinder -1.15 ± 1.00 D (range: 0 to -3.50 D), and spherical equivalent -4.19 ± 1.60 D (range: -7.75 to -2.00 D). At 6 months, 91% (111/122) of eyes were within ± 0.50 D of attempted correction. Postoperative UDVA was comparable to preoperative CDVA at 1 month (P=.47) and progressively improved at 6 months (P=.004). Two eyes lost one line of CDVA at 6 months. Theoretical benefit analysis revealed that of 101 eyes with astigmatism, 29 would have had cyclotorsion-induced astigmatism of ≥ 10% if iris recognition and dynamic rotational eye tracking were not used. Furthermore, the mean percentage decrease in maximum depth of ablation by using the Tissue Saving Treatment was 11.8 ± 2.9% over Aspheric, 17.8 ± 6.2% over Personalized, and 18.2 ± 2.8% over Planoscan algorithms. Tissue saving surface ablation with iris recognition and dynamic rotational eye tracking was safe and effective in this series of eyes. Copyright 2011, SLACK Incorporated.

  16. Face, neck, and eye protection: adapting body armour to counter the changing patterns of injuries on the battlefield.

    PubMed

    Breeze, J; Horsfall, I; Hepper, A; Clasper, J

    2011-12-01

    Recent international papers have suggested an urgent need for new methods of protecting the face, neck, and eyes in battle. We made a systematic analysis to identify all papers that reported the incidence and mortality of combat wounds to the face, eyes, or neck in the 21st century, and any papers that described methods of protecting the face, neck, or eyes. Neck wounds were found in 2-11% of injuries in battle, and associated with high mortality, but no new methods of protecting the neck were identified. Facial wounds were found in 6-30% of injuries in battle, but despite the psychological effects of this type of injury only one paper suggested methods for protection. If soldiers wore existing eye protection they potentially reduced the mean incidence of eye injuries in combat from the 4.5% found in this analysis to 0.5%. Given the need to balance protection with the functional requirements of the individual soldier, a multidisciplinary approach is required. Military surgeons are well placed to work with material scientists and biomechanical engineers to suggest modifications to the design of both personal and vehicle-mounted protection. Further research is needed to find out how effective current methods of protecting the neck are, and to develop innovative methods of protecting the vulnerable regions of the neck and face. Crown Copyright © 2010. Published by Elsevier Ltd. All rights reserved.

  17. Mouse cursor movement and eye tracking data as an indicator of pathologists’ attention when viewing digital whole slide images

    PubMed Central

    Raghunath, Vignesh; Braxton, Melissa O.; Gagnon, Stephanie A.; Brunyé, Tad T.; Allison, Kimberly H.; Reisch, Lisa M.; Weaver, Donald L.; Elmore, Joann G.; Shapiro, Linda G.

    2012-01-01

    Context: Digital pathology has the potential to dramatically alter the way pathologists work, yet little is known about pathologists’ viewing behavior while interpreting digital whole slide images. While tracking pathologist eye movements when viewing digital slides may be the most direct method of capturing pathologists’ viewing strategies, this technique is cumbersome and technically challenging to use in remote settings. Tracking pathologist mouse cursor movements may serve as a practical method of studying digital slide interpretation, and mouse cursor data may illuminate pathologists’ viewing strategies and time expenditures in their interpretive workflow. Aims: To evaluate the utility of mouse cursor movement data, in addition to eye-tracking data, in studying pathologists’ attention and viewing behavior. Settings and Design: Pathologists (N = 7) viewed 10 digital whole slide images of breast tissue that were selected using a random stratified sampling technique to include a range of breast pathology diagnoses (benign/atypia, carcinoma in situ, and invasive breast cancer). A panel of three expert breast pathologists established a consensus diagnosis for each case using a modified Delphi approach. Materials and Methods: Participants’ foveal vision was tracked using SensoMotoric Instruments RED 60 Hz eye-tracking system. Mouse cursor movement was tracked using a custom MATLAB script. Statistical Analysis Used: Data on eye-gaze and mouse cursor position were gathered at fixed intervals and analyzed using distance comparisons and regression analyses by slide diagnosis and pathologist expertise. Pathologists’ accuracy (defined as percent agreement with the expert consensus diagnoses) and efficiency (accuracy and speed) were also analyzed. Results: Mean viewing time per slide was 75.2 seconds (SD = 38.42). Accuracy (percent agreement with expert consensus) by diagnosis type was: 83% (benign/atypia); 48% (carcinoma in situ); and 93% (invasive). Spatial
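
    The distance comparisons between gaze and cursor positions described in the record's analysis section can be sketched as follows. The specific summary statistics here (mean Euclidean distance and per-axis Pearson correlation) are illustrative choices, not necessarily the exact analyses the authors used.

```python
import numpy as np

def gaze_cursor_agreement(gaze_xy, cursor_xy):
    """Summarise how closely the mouse cursor follows gaze.

    gaze_xy, cursor_xy: (n, 2) arrays of screen positions sampled at the
    same fixed intervals. Returns (mean Euclidean distance, Pearson
    correlation of the x coordinates; y is analogous).
    """
    diffs = np.linalg.norm(gaze_xy - cursor_xy, axis=1)  # per-sample distance
    r_x = np.corrcoef(gaze_xy[:, 0], cursor_xy[:, 0])[0, 1]
    return diffs.mean(), r_x
```

    A small mean distance together with a high correlation would support using cursor position as a practical proxy for visual attention in remote settings where eye tracking is impractical.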

  18. Feature saliency in judging the sex and familiarity of faces.

    PubMed

    Roberts, T; Bruce, V

    1988-01-01

    Two experiments are reported on the effect of feature masking on judgements of the sex and familiarity of faces. In experiment 1 the effect of masking the eyes, nose, or mouth of famous and nonfamous, male and female faces on response times in two tasks was investigated. In the first, recognition, task only masking of the eyes had a significant effect on response times. In the second, sex-judgement, task masking of the nose gave rise to a significant and large increase in response times. In experiment 2 it was found that when facial features were presented in isolation in a sex-judgement task, responses to noses were at chance level, unlike those for eyes or mouths. It appears that visual information available from the nose in isolation from the rest of the face is not sufficient for sex judgement, yet masking of the nose may disrupt the extraction of information about the overall topography of the face, information that may be more useful for sex judgement than for identification of a face.

  19. An Eye on Trafficking Genes: Identification of Four Eye Color Mutations in Drosophila

    PubMed Central

    Grant, Paaqua; Maga, Tara; Loshakov, Anna; Singhal, Rishi; Wali, Aminah; Nwankwo, Jennifer; Baron, Kaitlin; Johnson, Diana

    2016-01-01

    Genes that code for proteins involved in organelle biogenesis and intracellular trafficking produce products that are critical to normal cell function. Conserved orthologs of these are present in most or all eukaryotes, including Drosophila melanogaster. Some of these genes were originally identified as eye color mutants with decreases in both types of pigments found in the fly eye. Using these criteria, four eye color mutations that are not annotated in the genome sequence (chocolate, maroon, mahogany, and red Malpighian tubules) were molecularly mapped and their genome sequences evaluated. Mapping was performed using deletion analysis and complementation tests. chocolate is an allele of the VhaAC39-1 gene, which is an ortholog of the Vacuolar H+ ATPase AC39 subunit 1. maroon corresponds to the Vps16A gene and its product is part of the HOPS complex, which participates in transport and organelle fusion. red Malpighian tubule is the CG12207 gene, which encodes a protein of unknown function that includes a LysM domain. mahogany is the CG13646 gene, which is predicted to be an amino acid transporter. The strategy of identifying eye color genes based on perturbations in quantities of both types of eye color pigments has proven useful in identifying proteins involved in trafficking and biogenesis of lysosome-related organelles. Mutants of these genes can form the basis of valuable in vivo models to understand these processes. PMID:27558665

  20. Eye tracking, strategies, and sex differences in virtual navigation.

    PubMed

    Andersen, Nicolas E; Dahmani, Louisa; Konishi, Kyoko; Bohbot, Véronique D

    2012-01-01

    Reports of sex differences in wayfinding have typically used paradigms sensitive to the female advantage (navigation by landmarks) or sensitive to the male advantage (navigation by cardinal directions, Euclidean coordinates, environmental geometry, and absolute distances). The current virtual navigation paradigm allowed both men and women an equal advantage. We studied sex differences by systematically varying the number of landmarks. Eye tracking was used to quantify sex differences in landmark utilisation as participants solved an eight-arm radial maze task within different virtual environments. To solve the task, participants were required to remember the locations of target objects within environments containing 0, 2, 4, 6, or 8 landmarks. We found that, as the number of landmarks available in the environment increases, the proportion of time men and women spend looking at landmarks and the number of landmarks they use to find their way both increase. Eye tracking confirmed that women rely more on landmarks to navigate, although landmark fixations were also associated with an increase in task completion time. Sex differences in navigational behaviour occurred only in environments devoid of landmarks and disappeared in environments containing multiple landmarks. Moreover, women showed sustained landmark-oriented gaze, while men's decreased over time. Finally, we found that men and women use spatial and response strategies to the same extent. Together, these results shed new light on the discrepancy in landmark utilisation between men and women and help explain the differences in navigational behaviour previously reported. Copyright © 2011 Elsevier Inc. All rights reserved.

  1. An Eye for Words: Gauging the Role of Attention in Incidental L2 Vocabulary Acquisition by Means of Eye-Tracking

    ERIC Educational Resources Information Center

    Godfroid, Aline; Boers, Frank; Housen, Alex

    2013-01-01

    This eye-tracking study tests the hypothesis that more attention leads to more learning, following claims that attention to new language elements in the input results in their initial representation in long-term memory (i.e., intake; Robinson, 2003; Schmidt, 1990, 2001). Twenty-eight advanced learners of English read English texts that contained…

  2. Emotional Expression and Heart Rate in High-Risk Infants during the Face-To-Face/Still-Face

    PubMed Central

    Mattson, Whitney I.; Ekas, Naomi V.; Lambert, Brittany; Tronick, Ed; Lester, Barry M.; Messinger, Daniel S.

    2013-01-01

    In infants, eye constriction—the Duchenne marker—and mouth opening appear to index the intensity of both positive and negative facial expressions. We combined eye constriction and mouth opening that co-occurred with smiles and cry-faces (respectively, the prototypic expressions of infant joy and distress) to measure emotional expression intensity. Expression intensity and heart rate were measured throughout the Face-to-Face/Still Face (FFSF) in a sample of infants with prenatal cocaine exposure who were at risk for developmental difficulties. Smiles declined and cry-faces increased in the still-face episode, but the distribution of eye constriction and mouth opening in smiles and cry-faces did not differ across episodes of the FFSF. As time elapsed in the still-face episode, potential indices of intensity increased: cry-faces were more likely to be accompanied by eye constriction and mouth opening. During cry-faces there were also moderately stable individual differences in the quantity of eye constriction and mouth opening. Infant heart rate was higher during cry-faces and lower during smiles, but did not vary with intensity of expression or by episode. In sum, infants express more intense negative affect as the still-face progresses, but do not show clear differences in expressive intensity between episodes of the FFSF. PMID:24095807

  3. Linguistic Complexity and Information Structure in Korean: Evidence from Eye-Tracking during Reading

    ERIC Educational Resources Information Center

    Lee, Yoonhyoung; Lee, Hanjung; Gordon, Peter C.

    2007-01-01

    The nature of the memory processes that support language comprehension and the manner in which information packaging influences online sentence processing were investigated in three experiments that used eye-tracking during reading to measure the ease of understanding complex sentences in Korean. All three experiments examined reading of embedded…

  4. Eye-tracking of visual attention in web-based assessment using the Force Concept Inventory

    NASA Astrophysics Data System (ADS)

    Han, Jing; Chen, Li; Fu, Zhao; Fritchman, Joseph; Bao, Lei

    2017-07-01

    This study used eye-tracking technology to investigate students’ visual attention while taking the Force Concept Inventory (FCI) in a web-based interface. Eighty-nine university students were randomly selected into a pre-test group and a post-test group. Students took the 30-question FCI on a computer equipped with an eye-tracker. There were seven weeks of instruction between the pre- and post-test data collection. Students’ performance on the FCI improved significantly from pre-test to post-test. Meanwhile, the eye-tracking results reveal that the time students spent on taking the FCI test was not affected by student performance and did not change from pre-test to post-test. Analysis of students’ attention to answer choices shows that on the pre-test students primarily focused on the naïve choices and ignored the expert choices. On the post-test, although students had shifted their primary attention to the expert choices, they still kept a high level of attention to the naïve choices, indicating significant conceptual mixing and competition during problem solving. Outcomes of this study provide new insights on students’ conceptual development in learning physics.

  5. The socialization effect on decision making in the Prisoner's Dilemma game: An eye-tracking study

    PubMed Central

    Myagkov, Mikhail G.; Harriff, Kyle

    2017-01-01

    We used a mobile eye-tracking system (in the form of glasses) to study the characteristics of visual perception in decision making in the Prisoner's Dilemma game. In each experiment, one of the 12 participants was equipped with eye-tracking glasses. The experiment was conducted in three stages: an anonymous Individual Game stage against a randomly chosen partner (one of the 12 other participants of the experiment); a Socialization stage, in which the participants were divided into two groups; and a Group Game stage, in which the participants played with partners in the groups. After each round, the respondent received information about his or her personal score in the last round and the overall winner of the game at the moment. The study shows that eye-tracking systems can be used for studying the process of decision making and forecasting. The total viewing time and the time of fixation on areas corresponding to noncooperative decisions are related to the participants’ overall level of cooperation. The increase in the total viewing time and the time of fixation on the areas of noncooperative choice is due to a preference for noncooperative decisions and a decrease in the overall level of cooperation. The number of fixations on the group attributes is associated with group identity, but does not necessarily lead to cooperative behavior. PMID:28394939

  6. Recognition and identification of famous faces in patients with unilateral temporal lobe epilepsy.

    PubMed

    Seidenberg, Michael; Griffith, Randall; Sabsevitz, David; Moran, Maria; Haltiner, Alan; Bell, Brian; Swanson, Sara; Hammeke, Thomas; Hermann, Bruce

    2002-01-01

    We examined the performance of 21 patients with unilateral temporal lobe epilepsy (TLE) and hippocampal damage (10 left, 11 right) and 10 age-matched controls on the recognition and identification (name and occupation) of well-known faces. Famous face stimuli were selected from four time periods: the 1970s, 1980s, 1990-1994, and 1995-1996. Differential patterns of performance were observed for the left and right TLE groups across distinct face processing components. The left TLE group showed a selective impairment in naming famous faces while performing similarly to the controls in face recognition and semantic identification (i.e., occupation). In contrast, the right TLE group was impaired across all components of face memory: face recognition, semantic identification, and face naming. Face naming impairment in the left TLE group was characterized by a temporal gradient, with better naming performance for famous faces from more distant time periods. Findings are discussed in terms of the role of the temporal lobe system in the acquisition, retention, and retrieval of face semantic networks, and the differential effects of lateralized temporal lobe lesions on this process.

  7. Explaining Sad People's Memory Advantage for Faces.

    PubMed

    Hills, Peter J; Marquardt, Zoe; Young, Isabel; Goodenough, Imogen

    2017-01-01

    Sad people recognize faces more accurately than happy people (Hills et al., 2011). We devised four hypotheses for this finding, which were tested against one another in the current study: (1) sad people engage more in the expert processing associated with faces; (2) sad people are motivated to be more accurate than happy people in an attempt to repair their mood; (3) sad people have a defocused attentional strategy that allows more information about a face to be encoded; and (4) sad people scan more of the face than happy people, leading to more facial features being encoded. In Experiment 1, we found that dysphoria (sad mood often associated with depression) was not correlated with the face-inversion effect (a measure of expert processing) nor with response times, but was correlated with defocused attention and recognition accuracy. Experiment 2 established that dysphoric participants detected changes made to more facial features than happy participants. In Experiment 3, using eye-tracking, we found that sad-induced participants sampled more of the face whilst avoiding the eyes. Experiment 4 showed that sad-induced people demonstrated a smaller own-ethnicity bias. These results indicate that sad people allocate attention to faces differently than happy and neutral people.

  8. Remote eye care screening for rural veterans with Technology-based Eye Care Services: a quality improvement project.

    PubMed

    Maa, April Y; Wojciechowski, Barbara; Hunt, Kelly; Dismuke, Clara; Janjua, Rabeea; Lynch, Mary G

    2017-01-01

    Veterans are at high risk for eye disease because of age and comorbid conditions. Access to eye care is challenging within the entire Veterans Hospital Administration's network of hospitals and clinics in the USA because it is the third busiest outpatient clinical service and growing at a rate of 9% per year. Rural and highly rural veterans face many more barriers to accessing eye care because of distance, cost to travel, and difficulty finding care in the community as many live in medically underserved areas. Also, rural veterans may be diagnosed in later stages of eye disease than their non-rural counterparts due to lack of access to specialty care. In March 2015, Technology-based Eye Care Services (TECS) was launched from the Atlanta Veterans Affairs (VA) as a quality improvement project to provide eye screening services for rural veterans. By tracking multiple measures including demographic and access to care metrics, data shows that TECS significantly improved access to care, with 33% of veterans receiving same-day access and >98% of veterans receiving an appointment within 30 days of request. TECS also provided care to a significant percentage of homeless veterans, 10.6% of the patients screened. Finally, TECS reduced healthcare costs, saving the VA up to US$148 per visit and approximately US$52 per patient in round trip travel reimbursements when compared to completing a face-to-face exam at the medical center. Overall savings to the VA system in this early phase of TECS totaled US$288,400, about US$41,200 per month. Other healthcare facilities may be able to use a similar protocol to extend care to at-risk patients.

  9. Efficient human face detection in infancy.

    PubMed

    Jakobsen, Krisztina V; Umstead, Lindsey; Simpson, Elizabeth A

    2016-01-01

    Adults detect conspecific faces more efficiently than heterospecific faces; however, the development of this own-species bias (OSB) remains unexplored. We tested whether 6- and 11-month-olds exhibit OSB in their attention to human and animal faces in complex visual displays with high perceptual load (25 images competing for attention). Infants (n = 48) and adults (n = 43) passively viewed arrays containing a face among 24 non-face distractors while we measured their gaze with remote eye tracking. While OSB is typically not observed until about 9 months, we found that, already by 6 months, human faces were more likely to be detected, were detected more quickly (attention capture), and received longer looks (attention holding) than animal faces. These data suggest that 6-month-olds already exhibit OSB in face detection efficiency, consistent with perceptual attunement. This specialization may reflect the biological importance of detecting conspecific faces, a foundational ability for early social interactions. © 2015 Wiley Periodicals, Inc.

  10. Eye tracking to evaluate evidence recognition in crime scene investigations.

    PubMed

    Watalingam, Renuka Devi; Richetelli, Nicole; Pelz, Jeff B; Speir, Jacqueline A

    2017-11-01

    Crime scene analysts are the core of criminal investigations; decisions made at the scene greatly affect the speed of analysis and the quality of conclusions, thereby directly impacting the successful resolution of a case. If an examiner fails to recognize the pertinence of an item on scene, the analyst's theory regarding the crime will be limited. Conversely, unselective evidence collection will most likely include irrelevant material, thus increasing a forensic laboratory's backlog and potentially sending the investigation into an unproductive and costly direction. Therefore, it is critical that analysts recognize and properly evaluate forensic evidence that can assess the relative support of differing hypotheses related to event reconstruction. With this in mind, the aim of this study was to determine if quantitative eye tracking data and qualitative reconstruction accuracy could be used to distinguish investigator expertise. In order to assess this, 32 participants were successfully recruited and categorized as experts or trained novices based on their practical experiences and educational backgrounds. Each volunteer then processed a mock crime scene while wearing a mobile eye tracker, wherein visual fixations, durations, search patterns, and reconstruction accuracy were evaluated. The eye tracking data (dwell time and task percentage on areas of interest or AOIs) were compared using Earth Mover's Distance (EMD) and the Needleman-Wunsch (N-W) algorithm, revealing significant group differences for both search duration (EMD), as well as search sequence (N-W). More specifically, experts exhibited greater dissimilarity in search duration, but greater similarity in search sequences than their novice counterparts. In addition to the quantitative visual assessment of examiner variability, each participant's reconstruction skill was assessed using a 22-point binary scoring system, in which significant group differences were detected as a function of total
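
    The Needleman-Wunsch algorithm mentioned above is a global sequence alignment method; applied to scanpaths, each fixation is labeled by its AOI and two label sequences are scored against each other, so that similar search sequences receive high alignment scores. A minimal dynamic-programming sketch (the match, mismatch, and gap values are illustrative defaults, not those used in the study):

```python
def needleman_wunsch(seq_a, seq_b, match=1, mismatch=-1, gap=-1):
    """Global alignment score between two AOI label sequences."""
    n, m = len(seq_a), len(seq_b)
    # score[i][j]: best score aligning seq_a[:i] with seq_b[:j].
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if seq_a[i - 1] == seq_b[j - 1]
                                          else mismatch)
            score[i][j] = max(diag,               # align the two labels
                              score[i - 1][j] + gap,   # gap in seq_b
                              score[i][j - 1] + gap)   # gap in seq_a
    return score[n][m]
```

    Identical sequences score highest; each substitution or gap lowers the score, which is how the study could compare experts' and novices' search orders.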

  11. The Reliability, Validity, and Normative Data of Interpupillary Distance and Pupil Diameter Using Eye-Tracking Technology

    PubMed Central

    Murray, Nicholas P.; Hunfalvay, Melissa; Bolte, Takumi

    2017-01-01

    Purpose The purpose of this study was to determine the reliability of interpupillary distance (IPD) and pupil diameter (PD) measures using an infrared eye tracker and central point stimuli. Validity of the test compared to known clinical tools was determined, and normative data was established against which individuals can measure themselves. Methods Participants (416) across various demographics were examined for normative data. Of these, 50 were examined for reliability and validity. Validity for IPD measured the test (RightEye IPD/PD) against the PL850 Pupilometer and the Essilor Digital CRP. For PD, the test was measured against the Rosenbaum Pocket Vision Screener (RPVS). Reliability was analyzed with intraclass correlation coefficients (ICC) between trials with Cronbach's alpha (CA) and the standard error of measurement for each ICC. Convergent validity was investigated by calculating the bivariate correlation coefficient. Results Reliability results were strong (CA > 0.7) for all measures. High positive significant correlations were found between the RightEye IPD test and the PL850 Pupilometer (P < 0.001) and Essilor Digital CRP (P < 0.001) and for the RightEye PD test and the RPVS (P < 0.001). Conclusions Using infrared eye tracking and the RightEye IPD/PD test stimuli, reliable and accurate measures of IPD and PD were found. Results from normative data showed an adequate comparison for people with normal vision development. Translational Relevance Results revealed a central point of fixation may remove variability in examining PD reliably using infrared eye tracking when consistent environmental and experimental procedures are conducted. PMID:28685104
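
    Cronbach's alpha, used above for the trial-to-trial reliability analysis, can be computed directly from its definition: alpha = k/(k-1) × (1 − sum of item variances / variance of totals), where k is the number of items (here, trials). A small sketch using population variances (the study's exact computation may differ):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    items: one inner list of scores per item (e.g. per trial),
    aligned across the same participants."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-participant totals
    item_var = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))
```

    Values above 0.7, as reported in the record, are conventionally read as acceptable reliability.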

  12. Evaluating Silent Reading Performance with an Eye Tracking System in Patients with Glaucoma

    PubMed Central

    Murata, Noriaki; Fukuchi, Takeo

    2017-01-01

    Objective To investigate the relationship between silent reading performance and visual field defects in patients with glaucoma using an eye tracking system. Methods Fifty glaucoma patients (Group G; mean age, 52.2 years, standard deviation: 11.4 years) and 20 normal controls (Group N; mean age, 46.9 years; standard deviation: 17.2 years) were included in the study. All participants in Group G had early to advanced glaucomatous visual field defects but better than 20/20 visual acuity in both eyes. Participants silently read Japanese articles written horizontally while the eye tracking system monitored and calculated reading duration per 100 characters, number of fixations per 100 characters, and mean fixation duration, which were compared with mean deviation and visual field index values from Humphrey visual field testing (24–2 and 10–2 Swedish interactive threshold algorithm standard) of the right versus left eye and the better versus worse eye. Results There was a statistically significant difference between Groups G and N in mean fixation duration (G, 233.4 msec; N, 215.7 msec; P = 0.010). Within Group G, significant correlations were observed between reading duration and 24–2 right mean deviation (rs = -0.280, P = 0.049), 24–2 right visual field index (rs = -0.306, P = 0.030), 24–2 worse visual field index (rs = -0.304, P = 0.032), and 10–2 worse mean deviation (rs = -0.326, P = 0.025). Significant correlations were observed between mean fixation duration and 10–2 left mean deviation (rs = -0.294, P = 0.045) and 10–2 worse mean deviation (rs = -0.306, P = 0.037), respectively. Conclusions The severity of visual field defects may influence some aspects of reading performance. At least concerning silent reading, the visual field of the worse eye is an essential element of smoothness of reading. PMID:28095478

  13. Incidental L2 Vocabulary Acquisition "from" and "while" Reading: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Pellicer-Sánchez, Ana

    2016-01-01

    Previous studies have shown that reading is an important source of incidental second language (L2) vocabulary acquisition. However, we still do not have a clear picture of what happens when readers encounter unknown words. Combining offline (vocabulary tests) and online (eye-tracking) measures, the incidental acquisition of vocabulary knowledge…

  14. Is having similar eye movement patterns during face learning and recognition beneficial for recognition performance? Evidence from hidden Markov modeling.

    PubMed

    Chuk, Tim; Chan, Antoni B; Hsiao, Janet H

    2017-12-01

    The hidden Markov model (HMM)-based approach for eye movement analysis is able to reflect individual differences in both spatial and temporal aspects of eye movements. Here we used this approach to understand the relationship between eye movements during face learning and recognition, and its association with recognition performance. We discovered holistic (i.e., mainly looking at the face center) and analytic (i.e., specifically looking at the two eyes in addition to the face center) patterns during both learning and recognition. Although for both learning and recognition, participants who adopted analytic patterns had better recognition performance than those with holistic patterns, a significant positive correlation between the likelihood of participants' patterns being classified as analytic and their recognition performance was only observed during recognition. Significantly more participants adopted holistic patterns during learning than recognition. Interestingly, about 40% of the participants used different patterns between learning and recognition, and among them 90% switched their patterns from holistic at learning to analytic at recognition. In contrast to the scan path theory, which posits that eye movements during learning have to be recapitulated during recognition for the recognition to be successful, participants who used the same or different patterns during learning and recognition did not differ in recognition performance. The similarity between their learning and recognition eye movement patterns also did not correlate with their recognition performance. These findings suggested that perceptuomotor memory elicited by eye movement patterns during learning does not play an important role in recognition. In contrast, the retrieval of diagnostic information for recognition, such as the eyes for face recognition, is a better predictor for recognition performance. Copyright © 2017 Elsevier Ltd. All rights reserved.
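
    The record above classifies eye-movement patterns with hidden Markov models. The core decoding step of a discrete-emission HMM, recovering the most likely hidden state sequence from observed fixation regions, is the Viterbi algorithm. A minimal sketch (the "H"/"A" state labels for holistic and analytic modes, the observation alphabet, and all probabilities are illustrative assumptions, not the paper's fitted model):

```python
def viterbi(observations, states, start_p, trans_p, emit_p):
    """Most likely hidden state sequence for a discrete-emission HMM."""
    # V[t][s]: probability of the best path ending in state s at time t.
    V = [{s: start_p[s] * emit_p[s][observations[0]] for s in states}]
    back = [{}]
    for t in range(1, len(observations)):
        V.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][observations[t]], p)
                for p in states)
            V[t][s] = prob
            back[t][s] = prev
    # Trace back from the most probable final state.
    state = max(V[-1], key=V[-1].get)
    path = [state]
    for t in range(len(observations) - 1, 0, -1):
        state = back[t][state]
        path.append(state)
    return path[::-1]
```

    With face-center fixations emitted mostly by a "holistic" state and eye-region fixations by an "analytic" state, a run of center fixations followed by eye fixations decodes to a holistic-then-analytic state sequence.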

  15. Integrated Eye Tracking and Neural Monitoring for Enhanced Assessment of Mild TBI

    DTIC Science & Technology

    2015-04-01

    virtual reality driving simulator data acquisition. Data collection for the pilot study is nearly complete and data analyses are currently under way... Training for primary study procedures including neuropsychological testing, eye-tracking, virtual reality driving simulator, and EEG data acquisition is... the virtual reality driving simulator. Participants are instructed to drive along a coastal highway while performing the target detection task

  16. Quantifying Novice and Expert Differences in Visual Diagnostic Reasoning in Veterinary Pathology Using Eye-Tracking Technology.

    PubMed

    Warren, Amy L; Donnon, Tyrone L; Wagg, Catherine R; Priest, Heather; Fernandez, Nicole J

    2018-01-18

    Visual diagnostic reasoning is the cognitive process by which pathologists reach a diagnosis based on visual stimuli (cytologic, histopathologic, or gross imagery). Currently, there is little to no literature examining visual reasoning in veterinary pathology. The objective of the study was to use eye tracking to establish baseline quantitative and qualitative differences between the visual reasoning processes of novice and expert veterinary pathologists viewing cytology specimens. Novice and expert participants were each shown 10 cytology images and asked to formulate a diagnosis while wearing eye-tracking equipment (10 slides) and while concurrently verbalizing their thought processes using the think-aloud protocol (5 slides). Compared to novices, experts demonstrated significantly higher diagnostic accuracy (p<.017), shorter time to diagnosis (p<.017), and a higher percentage of time spent viewing areas of diagnostic interest (p<.017). Experts elicited more key diagnostic features in the think-aloud protocol and had more efficient patterns of eye movement. These findings suggest that experts' fast time to diagnosis, efficient eye-movement patterns, and preference for viewing areas of interest support system 1 (pattern-recognition) reasoning and script-inductive knowledge structures, with system 2 (analytic) reasoning used to verify the diagnosis.

  17. Low frequency rTMS over posterior parietal cortex impairs smooth pursuit eye tracking.

    PubMed

    Hutton, Samuel B; Weekes, Brendan S

    2007-11-01

    The role of the posterior parietal cortex in smooth pursuit eye movements remains unclear. We used low frequency repetitive transcranial magnetic stimulation (rTMS) to study the cognitive and neural systems involved in the control of smooth pursuit eye movements. Eighteen participants were tested on two separate occasions. On each occasion we measured smooth pursuit eye tracking before and after 6 min of 1 Hz rTMS delivered at 90% of motor threshold. Low frequency rTMS over the posterior parietal cortex led to a significant reduction in smooth pursuit velocity gain, whereas rTMS over the motor cortex had no effect on gain. We conclude that low frequency offline rTMS is a potentially useful tool with which to explore the cortical systems involved in oculomotor control.

  18. EEG and Eye Tracking Signatures of Target Encoding during Structured Visual Search

    PubMed Central

    Brouwer, Anne-Marie; Hogervorst, Maarten A.; Oudejans, Bob; Ries, Anthony J.; Touryan, Jonathan

    2017-01-01

    EEG and eye tracking variables are potential sources of information about the underlying processes of target detection and storage during visual search. Fixation duration, pupil size and event related potentials (ERPs) locked to the onset of fixation or saccade (saccade-related potentials, SRPs) have been reported to differ dependent on whether a target or a non-target is currently fixated. Here we focus on the question of whether these variables also differ between targets that are subsequently reported (hits) and targets that are not (misses). Observers were asked to scan 15 locations that were consecutively highlighted for 1 s in pseudo-random order. Highlighted locations displayed either a target or a non-target stimulus with two, three or four targets per trial. After scanning, participants indicated which locations had displayed a target. To induce memory encoding failures, participants concurrently performed an aurally presented math task (high load condition). In a low load condition, participants ignored the math task. As expected, more targets were missed in the high compared with the low load condition. For both conditions, eye tracking features distinguished better between hits and misses than between targets and non-targets (with larger pupil size and shorter fixations for missed compared with correctly encoded targets). In contrast, SRP features distinguished better between targets and non-targets than between hits and misses (with average SRPs showing larger P300 waveforms for targets than for non-targets). Single trial classification results were consistent with these averages. This work suggests complementary contributions of eye and EEG measures in potential applications to support search and detect tasks. SRPs may be useful to monitor what objects are relevant to an observer, and eye variables may indicate whether the observer should be reminded of them later. PMID:28559807

  19. Preliminary Experience Using Eye-Tracking Technology to Differentiate Novice and Expert Image Interpretation for Ultrasound-Guided Regional Anesthesia.

    PubMed

    Borg, Lindsay K; Harrison, T Kyle; Kou, Alex; Mariano, Edward R; Udani, Ankeet D; Kim, T Edward; Shum, Cynthia; Howard, Steven K

    2018-02-01

    Objective measures are needed to guide the novice's pathway to expertise. Within and outside medicine, eye tracking has been used for both training and assessment. We designed this study to test the hypothesis that eye tracking may differentiate novices from experts in static image interpretation for ultrasound (US)-guided regional anesthesia. We recruited novice anesthesiology residents and regional anesthesiology experts. Participants wore eye-tracking glasses, were shown 5 sonograms of US-guided regional anesthesia, and were asked a series of anatomy-based questions related to each image while their eye movements were recorded. The answer to each question was a location on the sonogram, defined as the area of interest (AOI). The primary outcome was the total gaze time in the AOI (seconds). Secondary outcomes were the total gaze time outside the AOI (seconds), total time to answer (seconds), and time to first fixation on the AOI (seconds). Five novices and 5 experts completed the study. Although the gaze time (mean ± SD) in the AOI was not different between groups (7 ± 4 seconds for novices and 7 ± 3 seconds for experts; P = .150), the gaze time outside the AOI was greater for novices (75 ± 18 versus 44 ± 4 seconds for experts; P = .005). The total time to answer and total time to first fixation in the AOI were both shorter for experts. Experts in US-guided regional anesthesia take less time to identify sonoanatomy and spend less unfocused time away from a target compared to novices. Eye tracking is a potentially useful tool to differentiate novices from experts in the domain of US image interpretation. © 2017 by the American Institute of Ultrasound in Medicine.

  20. Emotional expression and heart rate in high-risk infants during the face-to-face/still-face.

    PubMed

    Mattson, Whitney I; Ekas, Naomi V; Lambert, Brittany; Tronick, Ed; Lester, Barry M; Messinger, Daniel S

    2013-12-01

    In infants, eye constriction (the Duchenne marker) and mouth opening appear to index the intensity of both positive and negative facial expressions. We combined eye constriction and mouth opening that co-occurred with smiles and cry-faces (respectively, the prototypic expressions of infant joy and distress) to measure emotional expression intensity. Expression intensity and heart rate were measured throughout the face-to-face/still-face (FFSF) in a sample of infants with prenatal cocaine exposure who were at risk for developmental difficulties. Smiles declined and cry-faces increased in the still-face episode, but the distribution of eye constriction and mouth opening in smiles and cry-faces did not differ across episodes of the FFSF. As time elapsed in the still-face episode, potential indices of intensity increased: cry-faces were more likely to be accompanied by eye constriction and mouth opening. During cry-faces there were also moderately stable individual differences in the quantity of eye constriction and mouth opening. Infant heart rate was higher during cry-faces and lower during smiles, but did not vary with intensity of expression or by episode. In sum, infants express more intense negative affect as the still-face progresses, but do not show clear differences in expressive intensity between episodes of the FFSF. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. Eye tracking and pupillometry are indicators of dissociable latent decision processes.

    PubMed

    Cavanagh, James F; Wiecki, Thomas V; Kochar, Angad; Frank, Michael J

    2014-08-01

    Can you predict what people are going to do just by watching them? This is certainly difficult: it would require a clear mapping between observable indicators and unobservable cognitive states. In this report, we demonstrate how this is possible by monitoring eye gaze and pupil dilation, which predict dissociable biases during decision making. We quantified decision making using the drift diffusion model (DDM), which provides an algorithmic account of how evidence accumulation and response caution contribute to decisions through separate latent parameters of drift rate and decision threshold, respectively. We used a hierarchical Bayesian estimation approach to assess the single trial influence of observable physiological signals on these latent DDM parameters. Increased eye gaze dwell time specifically predicted an increased drift rate toward the fixated option, irrespective of the value of the option. In contrast, greater pupil dilation specifically predicted an increase in decision threshold during difficult decisions. These findings suggest that eye tracking and pupillometry reflect the operations of dissociated latent decision processes. PsycINFO Database Record (c) 2014 APA, all rights reserved.
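
    The DDM described here maps drift rate to the speed of evidence accumulation and decision threshold to response caution. A minimal single-trial forward simulation of that idea, using a simple Euler scheme (the authors fit parameters with hierarchical Bayesian estimation; this sketch only illustrates the generative model):

```python
import random

def simulate_ddm_trial(drift, threshold, dt=0.001, noise=1.0, rng=None):
    """One drift-diffusion trial: evidence starts at 0 and accumulates
    with mean rate `drift` plus Gaussian noise until it crosses
    +threshold (upper response) or -threshold (lower response).
    Returns (choice, reaction_time)."""
    rng = rng or random.Random()
    evidence, t = 0.0, 0.0
    while abs(evidence) < threshold:
        evidence += drift * dt + noise * (dt ** 0.5) * rng.gauss(0, 1)
        t += dt
    return (1 if evidence > 0 else 0, t)
```

    A higher drift (e.g., from longer gaze dwell on an option) makes the upper response faster and more likely; a higher threshold (as suggested for pupil dilation on hard trials) slows responses but makes them less noise-driven.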

  2. When Art Moves the Eyes: A Behavioral and Eye-Tracking Study

    PubMed Central

    Massaro, Davide; Savazzi, Federica; Di Dio, Cinzia; Freedberg, David; Gallese, Vittorio; Gilli, Gabriella; Marchetti, Antonella

    2012-01-01

    The aim of this study was to investigate, using eye-tracking technique, the influence of bottom-up and top-down processes on visual behavior while subjects, naïve to art criticism, were presented with representational paintings. Forty-two subjects viewed color and black and white paintings (Color) categorized as dynamic or static (Dynamism) (bottom-up processes). Half of the images represented natural environments and half human subjects (Content); all stimuli were displayed under aesthetic and movement judgment conditions (Task) (top-down processes). Results on gazing behavior showed that content-related top-down processes prevailed over low-level visually-driven bottom-up processes when a human subject is represented in the painting. On the contrary, bottom-up processes, mediated by low-level visual features, particularly affected gazing behavior when looking at nature-content images. We discuss our results proposing a reconsideration of the definition of content-related top-down processes in accordance with the concept of embodied simulation in art perception. PMID:22624007

  4. Attentional bias to betel quid cues: An eye tracking study.

    PubMed

    Shen, Bin; Chiu, Meng-Chun; Li, Shuo-Heng; Huang, Guo-Joe; Liu, Ling-Jun; Ho, Ming-Chou

    2016-09-01

    The World Health Organization regards betel quid as a human carcinogen, and DSM-IV and ICD-10 dependence symptoms may develop with heavy use. This study, conducted in central Taiwan, investigated whether betel quid chewers exhibit overt orienting and selectively respond to betel quid cues. Twenty-four male chewers' and 23 male nonchewers' eye movements to betel-quid-related pictures and matched pictures were assessed during a visual probe task. The eye movement indices showed that betel quid chewers were more likely to initially direct their gaze to the betel quid cues, t(23) = 3.70, p < .01, d = .75, spent more time on them, F(1, 23) = 4.58, p < .05, η² = .17, and fixated them more often, F(1, 23) = 5.18, p < .05, η² = .18. The visual probe index (response time) failed to detect the chewers' attentional bias. The current study provided the first eye movement evidence of betel quid chewers' attentional bias: the chewers (but not the nonchewers) were more likely to initially direct their gaze to the betel quid cues, spent more time on them, and fixated them more often. These findings suggest that when attention is measured directly through eye tracking, the methodology may be more sensitive for detecting attentional biases in betel quid chewers. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  5. Reading the Mind in the Eyes or Reading between the Lines? Theory of Mind Predicts Collective Intelligence Equally Well Online and Face-To-Face

    PubMed Central

    Engel, David; Woolley, Anita Williams; Jing, Lisa X.; Chabris, Christopher F.; Malone, Thomas W.

    2014-01-01

    Recent research with face-to-face groups found that a measure of general group effectiveness (called “collective intelligence”) predicted a group’s performance on a wide range of different tasks. The same research also found that collective intelligence was correlated with the individual group members’ ability to reason about the mental states of others (an ability called “Theory of Mind” or “ToM”). Since ToM was measured in this work by a test that requires participants to “read” the mental states of others from looking at their eyes (the “Reading the Mind in the Eyes” test), it is uncertain whether the same results would emerge in online groups where these visual cues are not available. Here we find that: (1) a collective intelligence factor characterizes group performance approximately as well for online groups as for face-to-face groups; and (2) surprisingly, the ToM measure is equally predictive of collective intelligence in both face-to-face and online groups, even though the online groups communicate only via text and never see each other at all. This provides strong evidence that ToM abilities are just as important to group performance in online environments with limited nonverbal cues as they are face-to-face. It also suggests that the Reading the Mind in the Eyes test measures a deeper, domain-independent aspect of social reasoning, not merely the ability to recognize facial expressions of mental states. PMID:25514387

  6. Eye-Tracking Technology and the Dynamics of Natural Gaze Behavior in Sports: A Systematic Review of 40 Years of Research.

    PubMed

    Kredel, Ralf; Vater, Christian; Klostermann, André; Hossner, Ernst-Joachim

    2017-01-01

    Reviewing 60 studies on natural gaze behavior in sports, it becomes clear that, over the last 40 years, the use of eye-tracking devices has considerably increased. Specifically, this review reveals the large variance of methods applied, analyses performed, and measures derived within the field. The results of sub-sample analyses suggest that sports-related eye-tracking research strives for ecologically valid test settings (i.e., viewing conditions and response modes) on the one hand, and for experimental control along with high measurement accuracy (i.e., controlled test conditions with high-frequency eye-trackers linked to algorithmic analyses) on the other. To meet both demands, some promising methodological compromises have been proposed, in particular the integration of robust mobile eye-trackers in motion-capture systems. However, as the fundamental trade-off between laboratory and field research cannot be solved by technological means, researchers need to carefully weigh the arguments for one or the other approach by accounting for the respective consequences. Nevertheless, for future research on dynamic gaze behavior in sports, further development of current mobile eye-tracking methodology seems highly advisable to allow for the acquisition and algorithmic analysis of larger amounts of gaze data and, further, to increase the explanatory power of the derived results.

  7. 78 FR 71621 - Agency Information Collection Activities; Proposed Collection; Comment Request; Eye Tracking...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-29

    ... notice. This notice solicits comments on research entitled, ``Eye Tracking Study of Direct-to-Consumer... the FDA to conduct research relating to health information. Section 1003(d)(2)(C) of the Federal Food, Drug, and Cosmetic Act (the FD&C Act) (21 U.S.C. 393(b)(2)(c)) authorizes FDA to conduct research...

  8. Through Their Eyes: Tracking the Gaze of Students in a Geology Field Course

    ERIC Educational Resources Information Center

    Maltese, Adam V.; Balliet, Russell N.; Riggs, Eric M.

    2013-01-01

    The focus of this research was to investigate how students learn to do fieldwork through observation. This study addressed the following questions: (1) Can mobile eye-tracking devices provide a robust source of data to investigate the observations and workflow of novice students while participating in a field exercise? If so, what are the…

  9. Neural mechanisms of eye contact when listening to another person talking.

    PubMed

    Jiang, Jing; Borowiak, Kamila; Tudge, Luke; Otto, Carolin; von Kriegstein, Katharina

    2017-02-01

    Eye contact occurs frequently and voluntarily during face-to-face verbal communication. However, the neural mechanisms underlying eye contact when it is accompanied by spoken language remain unexplored to date. Here we used a novel approach, fixation-based event-related functional magnetic resonance imaging (fMRI), to simulate the listener making eye contact with a speaker during verbal communication. Participants' eye movements and fMRI data were recorded simultaneously while they were freely viewing a pre-recorded speaker talking. The eye tracking data were then used to define events for the fMRI analyses. The results showed that eye contact in contrast to mouth fixation involved visual cortical areas (cuneus, calcarine sulcus), brain regions related to theory of mind/intentionality processing (temporoparietal junction, posterior superior temporal sulcus, medial prefrontal cortex) and the dorsolateral prefrontal cortex. In addition, increased effective connectivity was found between these regions for eye contact in contrast to mouth fixations. The results provide first evidence for neural mechanisms underlying eye contact when watching and listening to another person talking. The network we found might be well suited for processing the intentions of communication partners during eye contact in verbal communication. © The Author (2016). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  10. Auditory noise increases the allocation of attention to the mouth, and the eyes pay the price: An eye-tracking study.

    PubMed

    Król, Magdalena Ewa

    2018-01-01

    We investigated the effect of auditory noise added to speech on patterns of looking at faces in 40 toddlers. We hypothesised that noise would increase the difficulty of processing speech, making children allocate more attention to the mouth of the speaker to gain visual speech cues from mouth movements. We also hypothesised that this shift would cause a decrease in fixation time to the eyes, potentially decreasing the ability to monitor gaze. We found that adding noise increased the number of fixations to the mouth area, at the price of a decreased number of fixations to the eyes. Thus, to our knowledge, this is the first study demonstrating a mouth-eyes trade-off between attention allocated to social cues coming from the eyes and linguistic cues coming from the mouth. We also found that children with higher word recognition proficiency and higher average pupil response had an increased likelihood of fixating the mouth, compared to the eyes and the rest of the screen, indicating stronger motivation to decode the speech.

  12. Reproducibility of retinal nerve fiber layer thickness measures using eye tracking in children with nonglaucomatous optic neuropathy.

    PubMed

    Rajjoub, Raneem D; Trimboli-Heidler, Carmelina; Packer, Roger J; Avery, Robert A

    2015-01-01

    To determine the intra- and intervisit reproducibility of circumpapillary retinal nerve fiber layer (RNFL) thickness measures using eye tracking-assisted spectral-domain optical coherence tomography (SD OCT) in children with nonglaucomatous optic neuropathy. Prospective longitudinal study. Circumpapillary RNFL thickness measures were acquired with SD OCT using the eye-tracking feature at 2 separate study visits. Children with normal and abnormal vision (visual acuity ≥ 0.2 logMAR above normal and/or visual field loss) who demonstrated clinical and radiographic stability were enrolled. Intra- and intervisit reproducibility was calculated for the global average and 9 anatomic sectors by calculating the coefficient of variation and intraclass correlation coefficient. Forty-two subjects (median age 8.6 years, range 3.9-18.2 years) met inclusion criteria and contributed 62 study eyes. Both the abnormal and normal vision cohorts demonstrated the lowest intravisit coefficient of variation for the global RNFL thickness. Intervisit reproducibility remained good for those with normal and abnormal vision, although small but statistically significant increases in the coefficient of variation were observed for multiple anatomic sectors in both cohorts. The magnitude of visual acuity loss was significantly associated with the global (β = 0.026, P < .01) and temporal sector coefficients of variation (β = 0.099, P < .01). SD OCT with eye tracking demonstrates highly reproducible RNFL thickness measures. Subjects with vision loss demonstrate greater intra- and intervisit variability than those with normal vision. Copyright © 2015 Elsevier Inc. All rights reserved.
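
    The reproducibility statistic used here can be illustrated with the within-eye coefficient of variation. A minimal sketch, assuming repeated global RNFL thickness values in microns for one eye; this is not necessarily the authors' exact computation:

```python
import statistics

def coefficient_of_variation(measures):
    """Within-subject coefficient of variation (%) across repeated
    scans: the sample SD of the repeats divided by their mean."""
    mean = statistics.mean(measures)
    return 100.0 * statistics.stdev(measures) / mean

# e.g., three same-visit global RNFL scans (microns) for one eye:
# coefficient_of_variation([101.2, 100.5, 101.9])
```

    Lower values indicate tighter test-retest agreement; the study compares these values within and across visits and between vision cohorts.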

  13. Face inversion effects in autism: a combined looking time and pupillometric study.

    PubMed

    Falck-Ytter, Terje

    2008-10-01

    Previous research has found that in typically developing individuals, behavioral performance declines and electrophysiological brain responses are altered when the face is inverted. Such effects are generally attributed to disruption of configural information. Individuals with autism spectrum disorder (ASD) have been found to show less pronounced inversion effects, a result in line with the view that featural processing of faces is enhanced in ASD. No study has determined if, or how, such local bias is reflected in the eye movements used in face observation. In this eye tracking study, looking time and pupil dilation were investigated during the presentation of upright and inverted faces in preschool children with ASD and typically developing preschoolers. On average, both children with ASD and typically developing children looked less at the face and the eye areas during inverted presentations than during upright presentations. Nevertheless, individuals with ASD had a stronger tendency than typically developing children to look at the same face features during upright and inverted presentations, which is suggestive of a local bias. Pupil dilation, reflecting increased processing load, was larger for inverted than upright faces in the ASD group only, and pupillary inversion effects were stronger in ASD than in typically developing children.

  14. Differential emotion attribution to neutral faces of own and other races.

    PubMed

    Hu, Chao S; Wang, Qiandong; Han, Tong; Weare, Ethan; Fu, Genyue

    2017-02-01

    Past research has demonstrated differential recognition of emotion on faces of different races. This paper reports the first study to explore differential emotion attribution to neutral faces of different races. Chinese and Caucasian adults viewed a series of Chinese and Caucasian neutral faces and judged their outward facial expression: neutral, positive, or negative. The results showed that both Chinese and Caucasian viewers perceived more Chinese faces than Caucasian faces as neutral. Nevertheless, Chinese viewers attributed positive emotion to Caucasian faces more than to Chinese faces, whereas Caucasian viewers attributed negative emotion to Caucasian faces more than to Chinese faces. Moreover, Chinese viewers attributed negative and neutral emotion to the faces of both races with similar frequency, whereas Caucasian viewers mostly attributed neutral emotion to the faces. These differences between Chinese and Caucasian viewers may be due to differential visual experience, culture, racial stereotypes, or expectations of the experiment. We also used eye tracking among the Chinese participants to explore the relationship between face-processing strategy and emotion attribution to neutral faces. The results showed that the interaction between emotion attribution and face race had a significant effect on face-processing strategy measures such as the fixation proportion on the eyes and saccade amplitude. Additionally, pupil size was larger when processing Caucasian faces than when processing Chinese faces.

  15. The Processing of Human Emotional Faces by Pet and Lab Dogs: Evidence for Lateralization and Experience Effects

    PubMed Central

    Barber, Anjuli L. A.; Randi, Dania; Müller, Corsin A.; Huber, Ludwig

    2016-01-01

    Of all non-human animals, dogs are very likely the best decoders of human behavior. In addition to a high sensitivity to human attentive status and to ostensive cues, they are able to distinguish between individual human faces and even between human facial expressions. However, so far little is known about how they process human faces and to what extent this is influenced by experience. Here we present an eye-tracking study with dogs from two different living environments with varying experience with humans: pet and lab dogs. The dogs were shown pictures of familiar and unfamiliar human faces expressing four different emotions. The results, extracted from several different eye-tracking measurements, revealed pronounced differences in the face processing of pet and lab dogs, thus indicating an influence of the amount of exposure to humans. In addition, there was some evidence for the influence of both the familiarity and the emotional expression of the face, and strong evidence for a left gaze bias. These findings, together with recent evidence for the dog's ability to discriminate human facial expressions, indicate that dogs are sensitive to some emotions expressed in human faces. PMID:27074009

  16. Explaining Sad People’s Memory Advantage for Faces

    PubMed Central

    Hills, Peter J.; Marquardt, Zoe; Young, Isabel; Goodenough, Imogen

    2017-01-01

    Sad people recognize faces more accurately than happy people (Hills et al., 2011). We devised four hypotheses for this finding that are tested against each other in the current study: (1) sad people engage in more of the expert processing associated with faces; (2) sad people are motivated to be more accurate than happy people in an attempt to repair their mood; (3) sad people have a defocused attentional strategy that allows more information about a face to be encoded; and (4) sad people scan more of the face than happy people, leading to more facial features being encoded. In Experiment 1, we found that dysphoria (sad mood often associated with depression) was not correlated with the face-inversion effect (a measure of expert processing) or with response times, but was correlated with defocused attention and recognition accuracy. Experiment 2 established that dysphoric participants detected changes made to more facial features than happy participants. In Experiment 3, using eye tracking, we found that sad-induced participants sampled more of the face whilst avoiding the eyes. Experiment 4 showed that sad-induced people demonstrated a smaller own-ethnicity bias. These results indicate that sad people allocate attention to faces differently than happy and neutral people. PMID:28261138

  17. Peer Assessment of Webpage Design: Behavioral Sequential Analysis Based on Eye-Tracking Evidence

    ERIC Educational Resources Information Center

    Hsu, Ting-Chia; Chang, Shao-Chen; Liu, Nan-Cen

    2018-01-01

    This study employed an eye-tracking machine to record the process of peer assessment. Each web page was divided into several regions of interest (ROIs) based on the frame design and content. A total of 49 undergraduate students with a visual learning style participated in the experiment. This study investigated the peer assessment attitudes of the…

  18. Eye-Tracking Analysis of the Figures of Anti-Smoking Health Promoting Periodical's Illustrations

    ERIC Educational Resources Information Center

    Maródi, Ágnes; Devosa, Iván; Steklács, János; Fáyné-Dombi, Alice; Buzas, Zsuzsanna; Vanya, Melinda

    2015-01-01

    Nowadays new education technologies and e-communication devices give new measuring and assessing tools for researchers. Eye-tracking is one of these new methods in education. In our study we assessed 4 figures from the anti-smoking heath issues of National Institute for Health Development. In the study 22 students were included from a 7th grade…

  19. The Impact of Early Bilingualism on Face Recognition Processes.

    PubMed

    Kandel, Sonia; Burfin, Sabine; Méary, David; Ruiz-Tada, Elisa; Costa, Albert; Pascalis, Olivier

    2016-01-01

    Early linguistic experience has an impact on the way we decode audiovisual speech in face-to-face communication. The present study examined whether differences in visual speech decoding could be linked to a broader difference in face processing. To identify a phoneme, we have to analyse the speaker's face to focus on the relevant cues for speech decoding (e.g., locating the mouth with respect to the eyes). Face recognition processes were investigated through two classic effects in face recognition studies: the Other-Race Effect (ORE) and the Inversion Effect. Bilingual and monolingual participants did a face recognition task with Caucasian faces (own race), Chinese faces (other race), and cars that were presented in an Upright or Inverted position. The results revealed that monolinguals exhibited the classic ORE; bilinguals did not. Overall, bilinguals were slower than monolinguals. These results suggest that bilinguals' face processing abilities differ from monolinguals'. Early exposure to more than one language may lead to a perceptual organization that goes beyond language processing and could extend to face analysis. We hypothesize that these differences could be due to the fact that bilinguals focus on different parts of the face than monolinguals, making them more efficient in other-race face processing but slower overall. However, more studies using eye-tracking techniques are necessary to confirm this explanation.

  20. Benchmark Eye Movement Effects during Natural Reading in Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Howard, Philippa L.; Liversedge, Simon P.; Benson, Valerie

    2017-01-01

    In 2 experiments, eye tracking methodology was used to assess on-line lexical, syntactic and semantic processing in autism spectrum disorder (ASD). In Experiment 1, lexical identification was examined by manipulating the frequency of target words. Both typically developed (TD) and ASD readers showed normal frequency effects, suggesting that the…

  1. Prioritized Identification of Attractive and Romantic Partner Faces in Rapid Serial Visual Presentation.

    PubMed

    Nakamura, Koyo; Arai, Shihoko; Kawabata, Hideaki

    2017-11-01

    People are sensitive to facial attractiveness because it is an important biological and social signal. As such, our perceptual and attentional system seems biased toward attractive faces. We tested whether attractive faces capture attention and enhance memory access in an involuntary manner using a dual-task rapid serial visual presentation (dtRSVP) paradigm, wherein multiple faces were successively presented for 120 ms each. In Experiment 1, participants (N = 26) were required to identify two female faces embedded in a stream of animal faces serving as distractors. The results revealed that identification of the second female target (T2) was better when it was attractive than when it was neutral or unattractive. In Experiment 2, we investigated whether perceived attractiveness affects T2 identification (N = 27). To this end, we performed another dtRSVP task with participants who were in a heterosexual romantic relationship, wherein T2 was their romantic partner's face. The results demonstrated that a romantic partner's face was correctly identified more often than the face of a friend or an unknown person. Furthermore, the greater the intensity of passionate love participants felt for their partner (as measured by the Passionate Love Scale), the more often they correctly identified their partner's face. Our experiments indicate that attractive and romantic partners' faces facilitate face identification in an involuntary manner.

  2. Attentional biases in body dysmorphic disorder (BDD): Eye-tracking using the emotional Stroop task.

    PubMed

    Toh, Wei Lin; Castle, David J; Rossell, Susan L

    2017-04-01

    Body dysmorphic disorder (BDD) is characterised by repetitive behaviours and/or mental acts occurring in response to preoccupations with perceived defects or flaws in physical appearance. This study aimed to examine attentional biases in BDD via the emotional Stroop task with two modifications: i) incorporating an eye-tracking paradigm, and ii) employing an obsessive-compulsive disorder (OCD) control group. Twenty-one BDD, 19 OCD, and 21 healthy control (HC) participants, who were age-, sex-, and IQ-matched, were included. A card version of the emotional Stroop task was employed based on seven 10-word lists: (i) BDD-positive, (ii) BDD-negative, (iii) OCD-checking, (iv) OCD-washing, (v) general positive, (vi) general threat, and (vii) neutral (as baseline). Participants were asked to read aloud words and word colours consecutively, thereby yielding accuracy and latency scores. Eye-tracking parameters were also measured. Participants with BDD exhibited significant Stroop interference for BDD-negative words relative to HC participants, as shown by extended colour-naming latencies. In contrast, the OCD group did not exhibit Stroop interference for OCD-related or general threat words. Only mild eye-tracking anomalies were uncovered in the clinical groups. Inspection of individual scanning styles and fixation heat maps, however, revealed that the viewing strategies adopted by the clinical groups were generally disorganised, with avoidance of certain disorder-relevant words and considerable visual attention devoted to non-salient card regions. The operation of attentional biases to negative disorder-specific words was corroborated in BDD. Future replication studies using other paradigms are vital, given potential ambiguities inherent in interpreting the emotional Stroop task. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Eyelid contour detection and tracking for startle research related eye-blink measurements from high-speed video records.

    PubMed

    Bernard, Florian; Deuter, Christian Eric; Gemmar, Peter; Schachinger, Hartmut

    2013-10-01

    Using the positions of the eyelids is an effective and contact-free way to measure startle-induced eye-blinks, which play an important role in human psychophysiological research. To the best of our knowledge, no conveniently usable methods exist for efficiently detecting and tracking exact eyelid contours in image sequences captured at high speed. In this publication, a semi-automatic, model-based eyelid contour detection and tracking algorithm for the analysis of high-speed video recordings from an eye tracker is presented. Because a large number of images had been acquired prior to method development, it was important that our technique be able to deal with images recorded without any special parametrisation of the eye tracker. The method entails pupil detection and specular reflection removal, and makes use of dynamic model adaptation. In a proof-of-concept study we achieved a correct detection rate of 90.6%. With this approach, we provide a feasible method to accurately assess eye-blinks from high-speed video recordings. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
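
    The pupil-detection step of such a pipeline can be sketched as dark-blob localization in a grayscale eye image. The threshold rule below is an illustrative assumption, not the authors' method:

```python
import numpy as np

def detect_pupil(gray, frac=0.3):
    """Crude pupil localizer for an IR eye image: assume the pupil is
    the darkest blob, threshold a fraction of the way up the intensity
    range, and return the centroid (row, col) of the dark pixels."""
    lo, hi = float(gray.min()), float(gray.max())
    thresh = lo + frac * (hi - lo)
    rows, cols = np.nonzero(gray <= thresh)
    return rows.mean(), cols.mean()
```

    A real implementation would additionally mask out specular reflections (bright spots inside the pupil) before taking the centroid, as the paper describes.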

  4. Can Children with Autism Spectrum Disorders "Hear" a Speaking Face?

    ERIC Educational Resources Information Center

    Irwin, Julia R.; Tornatore, Lauren A.; Brancazio, Lawrence; Whalen, D. H.

    2011-01-01

    This study used eye-tracking methodology to assess audiovisual speech perception in 26 children ranging in age from 5 to 15 years, half with autism spectrum disorders (ASD) and half with typical development. Given the characteristic reduction in gaze to the faces of others in children with ASD, it was hypothesized that they would show reduced…

  5. Automatic vasculature identification in coronary angiograms by adaptive geometrical tracking.

    PubMed

    Xiao, Ruoxiu; Yang, Jian; Goyal, Mahima; Liu, Yue; Wang, Yongtian

    2013-01-01

    Owing to the uneven distribution of contrast agent and the perspective projection of X-ray imaging, the vasculature in angiographic images has low contrast and is generally superimposed on other organic tissues; it is therefore very difficult to identify the vasculature and quantitatively estimate blood flow directly from angiographic images. In this paper, we propose a fully automatic algorithm named adaptive geometrical vessel tracking (AGVT) for coronary artery identification in X-ray angiograms. Initially, a ridge enhancement (RE) image is obtained utilizing multiscale Hessian information. Then, automatic initialization procedures, including seed point detection and initial direction determination, are performed on the RE image. The extracted ridge points are adjusted to the geometrical centerline points adaptively through diameter estimation. Bifurcations are identified by discriminating the connecting relationships of the tracked ridge points. Finally, all tracked centerlines are merged and smoothed by classifying the connecting components on the vascular structures. Synthetic angiographic images and clinical angiograms are used to evaluate the performance of the proposed algorithm, which is compared with two other vascular tracking techniques in terms of efficiency and accuracy, demonstrating successful application of the proposed segmentation and extraction scheme in vasculature identification.
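
    The ridge enhancement (RE) step can be sketched with a Hessian eigenvalue response. The sketch below is single-scale for brevity (the paper uses multiscale Hessian information) and assumes bright curvilinear structures on a darker background:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def ridge_enhance(image, sigma=2.0):
    """Single-scale Hessian ridge response: on the centerline of a
    bright ridge, the Hessian's most negative eigenvalue has large
    magnitude, so -lambda_min (clamped at 0) highlights ridges."""
    # Gaussian-smoothed second derivatives (order is per-axis: rows, cols).
    Iyy = gaussian_filter(image, sigma, order=(2, 0))
    Ixx = gaussian_filter(image, sigma, order=(0, 2))
    Ixy = gaussian_filter(image, sigma, order=(1, 1))
    # Closed-form eigenvalues of the 2x2 Hessian [[Iyy, Ixy], [Ixy, Ixx]].
    disc = np.sqrt(((Iyy - Ixx) / 2.0) ** 2 + Ixy ** 2)
    lam_min = (Iyy + Ixx) / 2.0 - disc
    return np.maximum(-lam_min, 0.0)
```

    A multiscale version would evaluate this response over a range of sigmas (matching expected vessel radii) and keep the per-pixel maximum.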

  6. Factors Influencing the Use of Captions by Foreign Language Learners: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Winke, Paula; Gass, Susan; Sydorenko, Tetyana

    2013-01-01

    This study investigates caption-reading behavior by foreign language (L2) learners and, through eye-tracking methodology, explores the extent to which the relationship between the native and target language affects that behavior. Second-year (4th semester) English-speaking learners of Arabic, Chinese, Russian, and Spanish watched 2 videos…

  7. Looking while Listening and Speaking: Eye-to-Face Gaze in Adolescents with and without Traumatic Brain Injury

    ERIC Educational Resources Information Center

    Turkstra, Lyn S.

    2005-01-01

    Purpose: The purpose of this study was to address the lack of quantitative data on eye-to-face gaze (also known as eye contact) in the literature on pragmatic communication. The study focused on adolescents and young adults with traumatic brain injury (TBI), as gaze often is included in social skills intervention in this population. Method: Gaze…

  8. An Eye-Tracking Investigation of Written Sarcasm Comprehension: The Roles of Familiarity and Context

    ERIC Educational Resources Information Center

Țurcan, Alexandra; Filik, Ruth

    2016-01-01

    This article addresses a current theoretical debate between the standard pragmatic model, the graded salience hypothesis, and the implicit display theory, by investigating the roles of the context and of the properties of the sarcastic utterance itself in the comprehension of a sarcastic remark. Two eye-tracking experiments were conducted where we…

  9. A system for tracking and recognizing pedestrian faces using a network of loosely coupled cameras

    NASA Astrophysics Data System (ADS)

    Gagnon, L.; Laliberté, F.; Foucher, S.; Branzan Albu, A.; Laurendeau, D.

    2006-05-01

A face recognition module has been developed for an intelligent multi-camera video surveillance system. The module can recognize a pedestrian face in terms of six basic emotions and the neutral state. Detection of the face and facial features (eyes, nasal root, nose, and mouth) is performed first using cascades of boosted classifiers. These features are used to normalize the pose and dimensions of the face image. Gabor filters are then sampled on a regular grid covering the face image to build a facial feature vector that feeds a nearest-neighbor classifier with a cosine-distance similarity measure for facial expression interpretation and face model construction. A graphical user interface allows the user to adjust the module parameters.
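The expression-interpretation step pairs Gabor feature vectors with a nearest-neighbor classifier under a cosine similarity measure. A minimal sketch of such a classifier follows; the toy feature vectors and class labels are illustrative stand-ins, not actual Gabor responses from the module.

```python
import numpy as np

def cosine_distance(a, b):
    """1 - cos(angle) between two feature vectors; 0 means identical direction."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return 1.0 - float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

class CosineNearestNeighbor:
    """Template matching: store labelled feature vectors, then assign a query
    the label of its nearest stored template under cosine distance."""

    def __init__(self):
        self.templates, self.labels = [], []

    def add(self, features, label):
        self.templates.append(np.asarray(features, float))
        self.labels.append(label)

    def classify(self, features):
        distances = [cosine_distance(features, t) for t in self.templates]
        return self.labels[int(np.argmin(distances))]
```

Cosine distance is a natural choice here because it compares the direction of the feature vectors rather than their magnitude, making the match less sensitive to overall contrast differences between face images.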

  10. Tracking Students' Eye-Movements When Reading Learning Objects on Mobile Phones: A Discourse Analysis of Luganda Language Teacher-Trainees' Reflective Observations

    ERIC Educational Resources Information Center

    Kabugo, David; Muyinda, Paul B.; Masagazi, Fred. M.; Mugagga, Anthony M.; Mulumba, Mathias B.

    2016-01-01

    Although eye-tracking technologies such as Tobii-T120/TX and Eye-Tribe are steadily becoming ubiquitous, and while their appropriation in education can aid teachers to collect robust information on how students move their eyes when reading and engaging with different learning objects, many teachers of Luganda language are yet to gain experiences…

  11. Different foveal schisis patterns in each retinal layer in eyes with hereditary juvenile retinoschisis evaluated by en-face optical coherence tomography.

    PubMed

    Yoshida-Uemura, Tomoyo; Katagiri, Satoshi; Yokoi, Tadashi; Nishina, Sachiko; Azuma, Noriyuki

    2017-04-01

To analyze the structures of schisis in eyes with hereditary juvenile retinoschisis using en-face optical coherence tomography (OCT) imaging. In this retrospective observational study, we reviewed the medical records of patients with hereditary juvenile retinoschisis who underwent comprehensive ophthalmic examinations including swept-source OCT. OCT images were obtained from 16 eyes of nine boys (mean age ± standard deviation, 10.6 ± 4.0 years). The horizontal OCT images at the fovea showed inner nuclear layer (INL) schisis in one eye (6.3%), ganglion cell layer (GCL) and INL schisis in 12 eyes (75.0%), INL and outer plexiform layer (OPL) schisis in two eyes (12.5%), and GCL, INL, and OPL schisis in one eye (6.3%). En-face OCT images showed characteristic schisis patterns in each retinal layer: multiple hyporeflective holes in the parafoveal region in the GCL, a spoke-like pattern in the foveal region and a reticular pattern in the parafoveal region in the INL, and multiple hyporeflective polygonal cavities with partitions in the OPL. Our results using en-face OCT imaging clarified the different patterns of schisis formation among the GCL, INL, and OPL, leading to a fuller understanding of retinal structure in hereditary juvenile retinoschisis.

  12. Development of a novel visuomotor integration paradigm by integrating a virtual environment with mobile eye-tracking and motion-capture systems

    PubMed Central

    Miller, Haylie L.; Bugnariu, Nicoleta; Patterson, Rita M.; Wijayasinghe, Indika; Popa, Dan O.

    2018-01-01

    Visuomotor integration (VMI), the use of visual information to guide motor planning, execution, and modification, is necessary for a wide range of functional tasks. To comprehensively, quantitatively assess VMI, we developed a paradigm integrating virtual environments, motion-capture, and mobile eye-tracking. Virtual environments enable tasks to be repeatable, naturalistic, and varied in complexity. Mobile eye-tracking and minimally-restricted movement enable observation of natural strategies for interacting with the environment. This paradigm yields a rich dataset that may inform our understanding of VMI in typical and atypical development. PMID:29876370

  13. Social attention in ASD: A review and meta-analysis of eye-tracking studies.

    PubMed

    Chita-Tegmark, Meia

    2016-01-01

Determining whether social attention is reduced in autism spectrum disorder (ASD), and what factors influence it, is important to our theoretical understanding of developmental trajectories of ASD and to designing targeted interventions. This meta-analysis examines data from 38 articles that used eye-tracking methods to compare individuals with ASD and typically developing (TD) controls. The impact of eight factors on the size of the group difference in social attention is evaluated: age, non-verbal IQ matching, verbal IQ matching, motion, social content, ecological validity, audio input, and attention bids. Results show that individuals with ASD spend less time attending to social stimuli than TD controls, with a mean effect size of 0.55. Social attention in ASD was most affected when stimuli had high social content (showed more than one person). This meta-analysis provides an opportunity to survey the eye-tracking research on social attention in ASD and to outline future research directions, in particular the study of social attention in the context of stimuli with high social content. Copyright © 2015 Elsevier Ltd. All rights reserved.
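The reported mean effect size of 0.55 is a standardized mean difference. One common per-study estimator for such group comparisons, before meta-analytic aggregation, is Cohen's d with a pooled standard deviation; the exact estimator used across the 38 articles may differ, so this sketch is illustrative only.

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference between two groups, using the pooled SD."""
    pooled_sd = math.sqrt(
        ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2)
    )
    return (mean1 - mean2) / pooled_sd
```

For example, if TD controls spend 10 s attending to social stimuli (SD 2, n = 20) and an ASD group spends 8.9 s (SD 2, n = 20), d = (10 - 8.9) / 2 = 0.55, matching the magnitude reported here.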

  14. Variations of pupil centration and their effects on video eye tracking.

    PubMed

    Wildenmann, Ulrich; Schaeffel, Frank

    2013-11-01

To evaluate measurement errors that are introduced in video eye tracking when pupil centration changes with pupil size. Software was developed under Visual C++ to track both the pupil centre and the corneal centre at an 87 Hz sampling rate, at baseline pupil sizes of 4.75 mm (800 lux room illuminance) and while pupil constrictions were elicited by a flashlight. Corneal centres were determined by a circle fit through the pixels detected at the corneal margin by an edge detection algorithm. Standard deviations for repeated measurements were ± 0.04 mm for horizontal pupil centre position, ± 0.04 mm for horizontal corneal centre position, ± 0.03 mm for vertical pupil centre position, and ± 0.05 mm for vertical corneal centre position. Ten subjects were tested (five female, five male, age 25-58 years). At 4 mm pupil sizes, the pupils were nasally decentred relative to the corneal centre by 0.18 ± 0.19 mm in the right eyes and -0.14 ± 0.22 mm in the left eyes. Vertical decentrations were 0.30 ± 0.30 mm and 0.27 ± 0.29 mm, respectively, always in a superior direction. At baseline pupil sizes (the natural pupil sizes at 800 lux) of 4.75 ± 0.52 mm, the decentrations became smaller (right and left eyes: horizontal 0.17 ± 0.20 mm and -0.12 ± 0.22 mm; vertical 0.26 ± 0.28 mm and 0.20 ± 0.25 mm). While pupil decentration changed minimally in eight of the subjects, it shifted considerably in two others. Averaged over all subjects, the shift of the pupil centre position per millimetre of pupil constriction was not significant (right and left eyes: -0.03 ± 0.07 mm and 0.03 ± 0.04 mm nasally per mm of pupil size change, respectively, and -0.04 ± 0.06 mm and -0.05 ± 0.12 mm superiorly). The direction and magnitude of the changes in pupil centration could not be predicted from the initial decentration at baseline pupil sizes. In line with data in the literature, the pupil centre was significantly decentred relative to the corneal centre in the nasal and superior direction. Pupil…

  15. The feasibility of automated eye tracking with the Early Childhood Vigilance Test of attention in younger HIV-exposed Ugandan children.

    PubMed

    Boivin, Michael J; Weiss, Jonathan; Chhaya, Ronak; Seffren, Victoria; Awadu, Jorem; Sikorskii, Alla; Giordani, Bruno

    2017-07-01

Tobii eye tracking was compared with webcam-based observer scoring on an animation-viewing measure of attention (Early Childhood Vigilance Test; ECVT) to evaluate the feasibility of automating measurement and scoring. Outcomes from both scoring approaches were compared with the Mullen Scales of Early Learning (MSEL), Color-Object Association Test (COAT), and Behavior Rating Inventory of Executive Function for preschool children (BRIEF-P). A total of 44 children aged 44 to 65 months were evaluated with the ECVT, COAT, MSEL, and BRIEF-P. Tobii X2-30 portable infrared cameras were programmed to monitor pupil direction during the 6-min ECVT animation, and the results were compared with observer-based PROCODER webcam scoring. Children watched 78% of the cartoon by the Tobii measure compared with 67% by webcam scoring, although the two measures were highly correlated (r = .90, p = .001); two such measures can be highly correlated even if one is consistently higher than the other (Bergemann et al., 2012). Both the Tobii and webcam ECVT measures significantly correlated with COAT immediate recall (r = .37, p = .02 vs. r = .38, p = .01, respectively) and total recall (r = .33, p = .06 vs. r = .42, p = .005). However, neither the Tobii eye-tracking nor the PROCODER webcam ECVT measure of attention correlated with the MSEL composite cognitive performance or the BRIEF-P global executive composite. ECVT scoring using Tobii eye tracking is feasible with at-risk very young African children and consistent with webcam-based scoring in its correspondence to other neurocognitive performance-based measures. By automating measurement and scoring, eye-tracking technologies can improve efficiency and help standardize ECVT testing of attention in younger children. This holds promise for other neurodevelopmental tests where eye movements, tracking, and gaze length can provide important behavioral markers of neuropsychological and neurodevelopmental processes.

  16. An Eye-Tracking Study of Learning from Science Text with Concrete and Abstract Illustrations

    ERIC Educational Resources Information Center

    Mason, Lucia; Pluchino, Patrik; Tornatora, Maria Caterina; Ariasi, Nicola

    2013-01-01

    This study investigated the online process of reading and the offline learning from an illustrated science text. The authors examined the effects of using a concrete or abstract picture to illustrate a text and adopted eye-tracking methodology to trace text and picture processing. They randomly assigned 59 eleventh-grade students to 3 reading…

  17. Predictive factor analysis for successful performance of iris recognition-assisted dynamic rotational eye tracking during laser in situ keratomileusis.

    PubMed

    Prakash, Gaurav; Ashok Kumar, Dhivya; Agarwal, Amar; Jacob, Soosan; Sarvanan, Yoga; Agarwal, Athiya

    2010-02-01

To analyze the predictive factors associated with success of iris recognition and dynamic rotational eye tracking on a laser in situ keratomileusis (LASIK) platform with active assessment and correction of intraoperative cyclotorsion. Interventional case series. Two hundred seventy-five eyes of 142 consecutive candidates underwent LASIK with attempted iris recognition and dynamic rotational tracking on the Technolas 217z100 platform (Technolas Perfect Vision, St Louis, Missouri, USA) at a tertiary care ophthalmic hospital. The main outcome measures were age, gender, flap creation method (femtosecond, microkeratome, epi-LASIK), success of static rotational tracking, ablation algorithm, pulses, and depth; preablation and intraablation rotational activity were analyzed and evaluated using regression models. Preablation static iris recognition was successful in 247 eyes, with no difference among flap creation methods (P = .6). Age (partial correlation, -0.16; P = .014), number of pulses (partial correlation, 0.39; P = 1.6 x 10(-8)), and gender (P = .02) were significant predictive factors for the amount of intraoperative cyclodeviation. Tracking difficulties that required linking the ablation to a newly acquired intraoperative iris image were more frequent with femtosecond-assisted flaps (P = 2.8 x 10(-7)) and increased with the amount of intraoperative cyclotorsion (P = .02). However, the number of cases with nonresolvable failure of intraoperative rotational tracking was similar across the 3 flap creation methods (P = .22). Intraoperative cyclotorsional activity depends on age, gender, and the duration of ablation (pulses delivered). Femtosecond flaps do not appear to have a disadvantage relative to microkeratome flaps as far as iris recognition and the success of intraoperative dynamic rotational tracking are concerned. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  18. Computer vision enhances mobile eye-tracking to expose expert cognition in natural-scene visual-search tasks

    NASA Astrophysics Data System (ADS)

    Keane, Tommy P.; Cahill, Nathan D.; Tarduno, John A.; Jacobs, Robert A.; Pelz, Jeff B.

    2014-02-01

Mobile eye-tracking provides a rare opportunity to record and elucidate cognition in action. In our research, we are searching for patterns in, and distinctions between, the visual-search performance of experts and novices in the geosciences. Traveling to regions shaped by various geological processes as part of an introductory field studies course in geology, we record the gaze patterns of experts and novices when they are asked to determine the modes of geological activity that formed the scene presented to them. Recording eye video and scene video in natural settings generates complex imagery that requires advanced applications of computer vision research to generate registrations and mappings between the views of separate observers. By developing such mappings, we can place many observers into a single mathematical space in which to spatio-temporally analyze inter- and intra-subject fixations, saccades, and head motions. While working towards perfecting these mappings, we developed an updated experimental setup that allows us to statistically analyze intra-subject eye-movement events without the need for a common domain. Through such analyses we are finding statistical differences between novices and experts in these visual-search tasks. In the course of this research we have developed a unified, open-source software framework for the processing, visualization, and interaction of mobile eye-tracking data and high-resolution panoramic imagery.

  19. A resource for assessing information processing in the developing brain using EEG and eye tracking

    PubMed Central

    Langer, Nicolas; Ho, Erica J.; Alexander, Lindsay M.; Xu, Helen Y.; Jozanovic, Renee K.; Henin, Simon; Petroni, Agustin; Cohen, Samantha; Marcelle, Enitan T.; Parra, Lucas C.; Milham, Michael P.; Kelly, Simon P.

    2017-01-01

    We present a dataset combining electrophysiology and eye tracking intended as a resource for the investigation of information processing in the developing brain. The dataset includes high-density task-based and task-free EEG, eye tracking, and cognitive and behavioral data collected from 126 individuals (ages: 6–44). The task battery spans both the simple/complex and passive/active dimensions to cover a range of approaches prevalent in modern cognitive neuroscience. The active task paradigms facilitate principled deconstruction of core components of task performance in the developing brain, whereas the passive paradigms permit the examination of intrinsic functional network activity during varying amounts of external stimulation. Alongside these neurophysiological data, we include an abbreviated cognitive test battery and questionnaire-based measures of psychiatric functioning. We hope that this dataset will lead to the development of novel assays of neural processes fundamental to information processing, which can be used to index healthy brain development as well as detect pathologic processes. PMID:28398357

  20. A resource for assessing information processing in the developing brain using EEG and eye tracking.

    PubMed

    Langer, Nicolas; Ho, Erica J; Alexander, Lindsay M; Xu, Helen Y; Jozanovic, Renee K; Henin, Simon; Petroni, Agustin; Cohen, Samantha; Marcelle, Enitan T; Parra, Lucas C; Milham, Michael P; Kelly, Simon P

    2017-04-11

    We present a dataset combining electrophysiology and eye tracking intended as a resource for the investigation of information processing in the developing brain. The dataset includes high-density task-based and task-free EEG, eye tracking, and cognitive and behavioral data collected from 126 individuals (ages: 6-44). The task battery spans both the simple/complex and passive/active dimensions to cover a range of approaches prevalent in modern cognitive neuroscience. The active task paradigms facilitate principled deconstruction of core components of task performance in the developing brain, whereas the passive paradigms permit the examination of intrinsic functional network activity during varying amounts of external stimulation. Alongside these neurophysiological data, we include an abbreviated cognitive test battery and questionnaire-based measures of psychiatric functioning. We hope that this dataset will lead to the development of novel assays of neural processes fundamental to information processing, which can be used to index healthy brain development as well as detect pathologic processes.

  1. Feature instructions improve face-matching accuracy

    PubMed Central

    Bindemann, Markus

    2018-01-01

    Identity comparisons of photographs of unfamiliar faces are prone to error but important for applied settings, such as person identification at passport control. Finding techniques to improve face-matching accuracy is therefore an important contemporary research topic. This study investigated whether matching accuracy can be improved by instruction to attend to specific facial features. Experiment 1 showed that instruction to attend to the eyebrows enhanced matching accuracy for optimized same-day same-race face pairs but not for other-race faces. By contrast, accuracy was unaffected by instruction to attend to the eyes, and declined with instruction to attend to ears. Experiment 2 replicated the eyebrow-instruction improvement with a different set of same-race faces, comprising both optimized same-day and more challenging different-day face pairs. These findings suggest that instruction to attend to specific features can enhance face-matching accuracy, but feature selection is crucial and generalization across face sets may be limited. PMID:29543822

  2. A comparison study of visually stimulated brain-computer and eye-tracking interfaces

    NASA Astrophysics Data System (ADS)

    Suefusa, Kaori; Tanaka, Toshihisa

    2017-06-01

Objective. Brain-computer interfacing (BCI) based on visual stimuli detects which target on a screen a user is focusing on. Detection of the gazed target can also be achieved by tracking gaze position with a video camera, a technique known as eye-tracking, in eye-tracking interfaces (ETIs). The two types of interface have been developed in different communities, so little work comprehensively comparing them has been reported. This paper quantitatively compares the performance of the two interfaces on the same experimental platform. Specifically, our study focuses on two major paradigms of BCI and ETI: steady-state visual evoked potential-based BCIs and dwelling-based ETIs. Approach. Recognition accuracy and the information transfer rate were measured by giving subjects the task of selecting one of four targets by gazing at it. The targets were displayed in three different sizes (with sides 20, 40 and 60 mm long) to evaluate performance with respect to target size. Main results. The experimental results showed that the BCI was comparable to the ETI in terms of accuracy and information transfer rate. In particular, when the target was relatively small, the BCI performed significantly better than the ETI. Significance. Knowing which of the two interfaces works better in a given situation would not only enable us to improve interface design but would also allow the appropriate interface to be chosen for the situation; specifically, one can choose an interface based on the size of the screen that displays the targets.
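The information transfer rate used to compare such gaze-selection interfaces is typically the Wolpaw estimate, which combines the number of targets, selection accuracy, and selection time. The paper's exact computation is not reproduced here, so this standard formulation is an assumption:

```python
import math

def wolpaw_itr(n_targets, accuracy, selection_time_s):
    """Wolpaw information transfer rate, in bits/min, for an n-choice task."""
    n, p = n_targets, accuracy
    if p <= 1.0 / n:  # at or below chance level: no information transferred
        return 0.0
    bits = math.log2(n)
    if p < 1.0:       # entropy penalty terms vanish at perfect accuracy
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * 60.0 / selection_time_s
```

For the four-target task used in this study, a perfectly accurate selection every 4 s would carry log2(4) = 2 bits per selection, i.e. 30 bits/min; any misclassifications reduce the bits per selection below 2.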

  3. State-dependent alterations in inhibitory control and emotional face identification in seasonal affective disorder.

    PubMed

    Hjordt, Liv V; Stenbæk, Dea S; Madsen, Kathrine Skak; Mc Mahon, Brenda; Jensen, Christian G; Vestergaard, Martin; Hageman, Ida; Meder, David; Hasselbalch, Steen G; Knudsen, Gitte M

    2017-04-01

Depressed individuals often exhibit impaired inhibition of negative input and impaired identification of positive stimuli, but it is unclear whether this is a state or a trait feature. Here we exploited a naturalistic model, individuals with seasonal affective disorder (SAD), to study this feature longitudinally. The goal of this study was to examine seasonal changes in inhibitory control and identification of emotional faces in individuals with SAD. Twenty-nine individuals diagnosed with winter-SAD and 30 demographically matched controls with no seasonality symptoms twice completed, in winter and in summer, an emotional Go/NoGo task requiring inhibition of prepotent responses to emotional facial expressions, and an emotional face identification task. In winter, individuals with SAD showed an impaired ability to inhibit responses to angry (p = .0006) and sad faces (p = .011) and decreased identification of happy faces (p = .032) compared with controls. In summer, individuals with SAD and controls performed similarly on these tasks (ps > .24). We provide novel evidence that inhibition of angry and sad faces and identification of happy faces are impaired in SAD in the symptomatic phase, but not in the remitted phase. These affective biases in cognitive processing constitute state-dependent features of SAD. Our data show that reinstatement of normal affective cognition should be possible and would constitute a major goal of psychiatric treatment to improve the quality of life of these patients. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. Rehabilitation of face-processing skills in an adolescent with prosopagnosia: Evaluation of an online perceptual training programme.

    PubMed

    Bate, Sarah; Bennetts, Rachel; Mole, Joseph A; Ainge, James A; Gregory, Nicola J; Bobak, Anna K; Bussunt, Amanda

    2015-01-01

    In this paper we describe the case of EM, a female adolescent who acquired prosopagnosia following encephalitis at the age of eight. Initial neuropsychological and eye-movement investigations indicated that EM had profound difficulties in face perception as well as face recognition. EM underwent 14 weeks of perceptual training in an online programme that attempted to improve her ability to make fine-grained discriminations between faces. Following training, EM's face perception skills had improved, and the effect generalised to untrained faces. Eye-movement analyses also indicated that EM spent more time viewing the inner facial features post-training. Examination of EM's face recognition skills revealed an improvement in her recognition of personally-known faces when presented in a laboratory-based test, although the same gains were not noted in her everyday experiences with these faces. In addition, EM did not improve on a test assessing the recognition of newly encoded faces. One month after training, EM had maintained the improvement on the eye-tracking test, and to a lesser extent, her performance on the familiar faces test. This pattern of findings is interpreted as promising evidence that the programme can improve face perception skills, and with some adjustments, may at least partially improve face recognition skills.

  5. Looking but Not Seeing: Atypical Visual Scanning and Recognition of Faces in 2 and 4-Year-Old Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Chawarska, Katarzyna; Shic, Frederick

    2009-01-01

    This study used eye-tracking to examine visual scanning and recognition of faces by 2- and 4-year-old children with autism spectrum disorder (ASD) (N = 44) and typically developing (TD) controls (N = 30). TD toddlers at both age levels scanned and recognized faces similarly. Toddlers with ASD looked increasingly away from faces with age,…

  6. Discourse intervention strategies in Alzheimer's disease: Eye-tracking and the effect of visual cues in conversation.

    PubMed

    Brandão, Lenisa; Monção, Ana Maria; Andersson, Richard; Holmqvist, Kenneth

    2014-01-01

    The goal of this study was to investigate whether on-topic visual cues can serve as aids for the maintenance of discourse coherence and informativeness in autobiographical narratives of persons with Alzheimer's disease (AD). The experiment consisted of three randomized conversation conditions: one without prompts, showing a blank computer screen; an on-topic condition, showing a picture and a sentence about the conversation; and an off-topic condition, showing a picture and a sentence which were unrelated to the conversation. Speech was recorded while visual attention was examined using eye tracking to measure how long participants looked at cues and the face of the listener. Results suggest that interventions using visual cues in the form of images and written information are useful to improve discourse informativeness in AD. This study demonstrated the potential of using images and short written messages as means of compensating for the cognitive deficits which underlie uninformative discourse in AD. Future studies should further investigate the efficacy of language interventions based in the use of these compensation strategies for AD patients and their family members and friends.

  7. Face recognition system and method using face pattern words and face pattern bytes

    DOEpatents

    Zheng, Yufeng

    2014-12-23

The present invention provides a novel system and method for identifying individuals through face recognition based on facial features. The system and method of the invention create facial features, or face patterns, called face pattern words and face pattern bytes for face identification. The invention also provides pattern recognition for identification other than face recognition. The invention further provides a means for identifying individuals based on visible and/or thermal images of those individuals, utilizing computer software implemented by instructions on a computer or computer system and a computer-readable medium containing instructions for face recognition and identification.

  8. Morphosyntactic Development in a Second Language: An Eye-Tracking Study on the Role of Attention

    ERIC Educational Resources Information Center

    Issa, Bernard Ibrahim, II

    2015-01-01

    One common claim in second language (L2) acquisition research is that attention is crucial for development to occur. Although previous empirical research supports this claim, methodological approaches have not been able to directly measure attention. This thesis utilized eye-tracking to directly measure attention and thus provide converging…

  9. Using Dual Eye-Tracking Measures to Differentiate between Collaboration on Procedural and Conceptual Learning Activities

    ERIC Educational Resources Information Center

    Belenky, Daniel; Ringenberg, Michael; Olsen, Jennifer; Aleven, Vincent; Rummel, Nikol

    2013-01-01

    Dual eye-tracking measures enable novel ways to test predictions about collaborative learning. For example, the research project we are engaging in uses measures of gaze recurrence to help understand how collaboration may differ when students are completing various learning activities focused on different learning objectives. Specifically, we…

  10. Intentional Response Distortion on Personality Tests: Using Eye-Tracking to Understand Response Processes when Faking

    ERIC Educational Resources Information Center

    van Hooft, Edwin A. J.; Born, Marise Ph.

    2012-01-01

    Intentional response distortion or faking among job applicants completing measures such as personality and integrity tests is a concern in personnel selection. The present study aimed to investigate whether eye-tracking technology can improve our understanding of the response process when faking. In an experimental within-participants design, a…

  11. Pupil Tracking for Real-Time Motion Corrected Anterior Segment Optical Coherence Tomography

    PubMed Central

    Carrasco-Zevallos, Oscar M.; Nankivil, Derek; Viehland, Christian; Keller, Brenton; Izatt, Joseph A.

    2016-01-01

    Volumetric acquisition with anterior segment optical coherence tomography (ASOCT) is necessary to obtain accurate representations of the tissue structure and to account for asymmetries of the anterior eye anatomy. Additionally, recent interest in imaging of anterior segment vasculature and aqueous humor flow resulted in application of OCT angiography techniques to generate en face and 3D micro-vasculature maps of the anterior segment. Unfortunately, ASOCT structural and vasculature imaging systems do not capture volumes instantaneously and are subject to motion artifacts due to involuntary eye motion that may hinder their accuracy and repeatability. Several groups have demonstrated real-time tracking for motion-compensated in vivo OCT retinal imaging, but these techniques are not applicable in the anterior segment. In this work, we demonstrate a simple and low-cost pupil tracking system integrated into a custom swept-source OCT system for real-time motion-compensated anterior segment volumetric imaging. Pupil oculography hardware coaxial with the swept-source OCT system enabled fast detection and tracking of the pupil centroid. The pupil tracking ASOCT system with a field of view of 15 x 15 mm achieved diffraction-limited imaging over a lateral tracking range of +/- 2.5 mm and was able to correct eye motion at up to 22 Hz. Pupil tracking ASOCT offers a novel real-time motion compensation approach that may facilitate accurate and reproducible anterior segment imaging. PMID:27574800
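The pupil detection that drives the tracker can be sketched as dark-region segmentation on an infrared oculography frame: in IR imaging the pupil is the darkest region, so a threshold plus a centroid suffices for a first estimate. The actual system's detection pipeline is not detailed here, and the threshold value is an illustrative assumption.

```python
import numpy as np

def pupil_centroid(frame, threshold=50):
    """Estimate the pupil centre as the centroid of dark pixels.

    Returns the (x, y) centre in pixel coordinates, or None if no pixel
    falls below the intensity threshold (e.g. during a blink).
    """
    mask = np.asarray(frame) < threshold
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)          # row/column indices of dark pixels
    return float(xs.mean()), float(ys.mean())
```

Running such a detector per frame at camera rate yields the centroid displacement signal that a motion-compensation loop can feed to the OCT scanners; the reported ± 2.5 mm range and 22 Hz correction bandwidth would then be set by the optics and the actuator loop, not by the centroid computation itself.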

  12. Discrimination between smiling faces: Human observers vs. automated face analysis.

    PubMed

    Del Líbano, Mario; Calvo, Manuel G; Fernández-Martín, Andrés; Recio, Guillermo

    2018-05-11

    This study investigated (a) how prototypical happy faces (with happy eyes and a smile) can be discriminated from blended expressions with a smile but non-happy eyes, depending on type and intensity of the eye expression; and (b) how smile discrimination differs for human perceivers versus automated face analysis, depending on affective valence and morphological facial features. Human observers categorized faces as happy or non-happy, or rated their valence. Automated analysis (FACET software) computed seven expressions (including joy/happiness) and 20 facial action units (AUs). Physical properties (low-level image statistics and visual saliency) of the face stimuli were controlled. Results revealed, first, that some blended expressions (especially, with angry eyes) had lower discrimination thresholds (i.e., they were identified as "non-happy" at lower non-happy eye intensities) than others (especially, with neutral eyes). Second, discrimination sensitivity was better for human perceivers than for automated FACET analysis. As an additional finding, affective valence predicted human discrimination performance, whereas morphological AUs predicted FACET discrimination. FACET can be a valid tool for categorizing prototypical expressions, but is currently more limited than human observers for discrimination of blended expressions. Configural processing facilitates detection of in/congruence(s) across regions, and thus detection of non-genuine smiling faces (due to non-happy eyes). Copyright © 2018 Elsevier B.V. All rights reserved.

  13. A Random Finite Set Approach to Space Junk Tracking and Identification

    DTIC Science & Technology

    2014-09-03

Final report, covering 31 January 2013 to 29 April 2014. Title: A Random Finite Set Approach to Space Junk Tracking and Identification. Contract number: FA2386-13... Authors: Ba-Ngu Vo and Ba-Tuong Vo, Department of ...

  14. Use of Eye Tracking as an Innovative Instructional Method in Surgical Human Anatomy.

    PubMed

    Sánchez-Ferrer, María Luísa; Grima-Murcia, María Dolores; Sánchez-Ferrer, Francisco; Hernández-Peñalver, Ana Isabel; Fernández-Jover, Eduardo; Sánchez Del Campo, Francisco

Tobii glasses can record corneal infrared light reflection to track pupil position and to map gaze focus in the video recording. Eye tracking has been proposed for use in training and coaching as a visually guided control interface. The aim of our study was to test the potential use of these glasses in various situations: explanations of anatomical structures on tablet-type electronic devices, explanations of anatomical models and dissected cadavers, and during the prosection thereof. An additional aim of the study was to test the use of the glasses during laparoscopies performed on Thiel-embalmed cadavers (which allow pneumoinsufflation and exact reproduction of the laparoscopic surgical technique). The device was also tried out in actual surgery (both laparoscopy and open surgery). We performed a pilot study using the Tobii glasses in the dissection room at our School of Medicine and in the operating room at our hospital. To evaluate usefulness, a survey was designed for use among students, instructors, and practicing physicians. The results were satisfactory, with the usefulness of this tool supported by more than 80% positive responses to most questions. Surgeons reported no inconvenience, and patient safety was ensured in the real laparoscopy. To our knowledge, this is the first publication to demonstrate the usefulness of eye tracking in practical instruction of human anatomy, as well as in teaching clinical anatomy and surgical techniques in the dissection and operating rooms. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  15. Effects of Different Multimedia Presentations on Viewers' Information-Processing Activities Measured by Eye-Tracking Technology

    ERIC Educational Resources Information Center

    Chuang, Hsueh-Hua; Liu, Han-Chin

    2012-01-01

    This study implemented eye-tracking technology to understand the impact of different multimedia instructional materials, i.e., five successive pages versus a single page with the same amount of information, on information-processing activities in 21 non-science-major college students. The findings showed that students demonstrated the same number…

  16. Theories of Spoken Word Recognition Deficits in Aphasia: Evidence from Eye-Tracking and Computational Modeling

    ERIC Educational Resources Information Center

    Mirman, Daniel; Yee, Eiling; Blumstein, Sheila E.; Magnuson, James S.

    2011-01-01

    We used eye-tracking to investigate lexical processing in aphasic participants by examining the fixation time course for rhyme (e.g., "carrot-parrot") and cohort (e.g., "beaker-beetle") competitors. Broca's aphasic participants exhibited larger rhyme competition effects than age-matched controls. A re-analysis of previously reported data (Yee,…

  17. Investigating the Effect of Complexity Factors in Stoichiometry Problems Using Logistic Regression and Eye Tracking

    ERIC Educational Resources Information Center

    Tang, Hui; Kirk, John; Pienta, Norbert J.

    2014-01-01

    This paper includes two experiments, one investigating complexity factors in stoichiometry word problems, and the other identifying students' problem-solving protocols by using eye-tracking technology. The word problems used in this study had five different complexity factors, which were randomly assigned by a Web-based tool that we developed. The…

  18. Eye tracking and climate change: How is climate literacy information processed?

    NASA Astrophysics Data System (ADS)

    Williams, C. C.; McNeal, K. S.

    2011-12-01

    The population of the Southeastern United States is perceived to be resistant to information regarding global climate change. The Climate Literacy Partnership in the Southeast (CLiPSE) project was formed to provide a resource for climate science information. As part of this project, we are evaluating the way that education materials influence the interpretation of climate change related information. At Mississippi State University, a study is being conducted examining how individuals from the Southeastern United States process climate change information and whether or not the interaction with such information impacts the interpretation of subsequent climate change related information. By observing the patterns both before and after an educational intervention, we are able to evaluate the effectiveness of the climate change information on an individual's interpretation of related information. Participants in this study view figures describing various types of climate change related information (CO2 emissions, sea levels, etc.) while their eye movements are tracked to determine a baseline for the way that they process this type of graphical data. Specifically, we are examining time spent viewing and number of fixations on critical portions of the figures prior to exposure to an educational document on climate change. Following the baseline period, we provide participants with portions of a computerized version of Climate Literacy: The Essential Principles of Climate Sciences that the participants read at their own pace while their eye movements are monitored. Participants are told that they will be given a test on the material after reading the resource. After reading the excerpt, participants are presented with a new set of climate change related figures to interpret (with eye tracking) along with a series of questions regarding information contained in the resource. We plan to evaluate changes that occur in the way that climate change related information is

  19. Contributions of individual face features to face discrimination.

    PubMed

    Logan, Andrew J; Gordon, Gael E; Loffler, Gunter

    2017-08-01

    Faces are highly complex stimuli that contain a host of information. Such complexity poses the following questions: (a) do observers exhibit preferences for specific information? (b) how does sensitivity to individual face parts compare? These questions were addressed by quantifying sensitivity to different face features. Discrimination thresholds were determined for synthetic faces under the following conditions: (i) 'full face': all face features visible; (ii) 'isolated feature': single feature presented in isolation; (iii) 'embedded feature': all features visible, but only one feature modified. Mean threshold elevations for isolated features, relative to full-faces, were 0.84x, 1.08, 2.12, 3.34, 4.07 and 4.47 for head-shape, hairline, nose, mouth, eyes and eyebrows respectively. Hence, when two full faces can be discriminated at threshold, the difference between the eyes is about four times less than what is required when discriminating between isolated eyes. In all cases, sensitivity was higher when features were presented in isolation than when they were embedded within a face context (threshold elevations of 0.94x, 1.74, 2.67, 2.90, 5.94 and 9.94). This reveals a specific pattern of sensitivity to face information. Observers are between two and four times more sensitive to external than internal features. The pattern for internal features (higher sensitivity for the nose, compared to mouth, eyes and eyebrows) is consistent with lower sensitivity for those parts affected by facial dynamics (e.g. facial expressions). That isolated features are easier to discriminate than embedded features supports a holistic face processing mechanism which impedes extraction of information about individual features from full faces. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Head-mounted eye tracking: a new method to describe infant looking.

    PubMed

    Franchak, John M; Kretch, Kari S; Soska, Kasey C; Adolph, Karen E

    2011-01-01

    Despite hundreds of studies describing infants' visual exploration of experimental stimuli, researchers know little about where infants look during everyday interactions. The current study describes the first method for studying visual behavior during natural interactions in mobile infants. Six 14-month-old infants wore a head-mounted eye-tracker that recorded gaze during free play with mothers. Results revealed that infants' visual exploration is opportunistic and depends on the availability of information and the constraints of infants' own bodies. Looks to mothers' faces were rare following infant-directed utterances but more likely if mothers were sitting at infants' eye level. Gaze toward the destination of infants' hand movements was common during manual actions and crawling, but looks toward obstacles during leg movements were less frequent. © 2011 The Authors. Child Development © 2011 Society for Research in Child Development, Inc.

  1. Eyes on the bodies: an eye tracking study on deployment of visual attention among females with body dissatisfaction.

    PubMed

    Gao, Xiao; Deng, Xiao; Yang, Jia; Liang, Shuang; Liu, Jie; Chen, Hong

    2014-12-01

Visual attentional bias has important functions during appearance-related social comparisons. However, owing to the limitations of the experimental paradigms or analysis methods used in previous studies, the time course of attentional bias to thin and fat body images among women with body dissatisfaction (BD) has remained unclear. Using a free-viewing task combined with eye-movement tracking, and based on event-related analyses of the critical first eye-movement events as well as epoch-related analyses of gaze durations, the current study investigated different attentional bias components to body shape/part images during a 15-s presentation time among 34 high-BD and 34 non-BD young women. In comparison to the controls, women with BD showed sustained maintenance biases toward thin and fat body images during both the early automatic and the late strategic processing stages. This study highlights a clear need for research on the dynamics of attentional biases related to body image and eating disturbances. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Positive facial affect facilitates the identification of famous faces.

    PubMed

    Gallegos, Diana R; Tranel, Daniel

    2005-06-01

    Several convergent lines of evidence have suggested that the presence of an emotion signal in a visual stimulus can influence processing of that stimulus. In the current study, we picked up on this idea, and explored the hypothesis that the presence of an emotional facial expression (happiness) would facilitate the identification of familiar faces. We studied two groups of normal participants (overall N=54), and neurological patients with either left (n=8) or right (n=10) temporal lobectomies. Reaction times were measured while participants named familiar famous faces that had happy expressions or neutral expressions. In support of the hypothesis, naming was significantly faster for the happy faces, and this effect obtained in the normal participants and in both patient groups. In the patients with left temporal lobectomies, the effect size for this facilitation was large (d=0.87), suggesting that this manipulation might have practical implications for helping such patients compensate for the types of naming defects that often accompany their brain damage. Consistent with other recent work, our findings indicate that emotion can facilitate visual identification, perhaps via a modulatory influence of the amygdala on extrastriate cortex.

  3. Algorithms for High-Speed Noninvasive Eye-Tracking System

    NASA Technical Reports Server (NTRS)

    Talukder, Ashit; Morookian, John-Michael; Lambert, James

    2010-01-01

Two image-data-processing algorithms are essential to the successful operation of a system of electronic hardware and software that noninvasively tracks the direction of a person's gaze in real time. The system was described in "High-Speed Noninvasive Eye-Tracking System" (NPO-30700), NASA Tech Briefs, Vol. 31, No. 8 (August 2007), page 51. To recapitulate from the cited article: like prior commercial noninvasive eye-tracking systems, this system is based on (1) illumination of an eye by a low-power infrared light-emitting diode (LED); (2) acquisition of video images of the pupil, iris, and cornea in the reflected infrared light; (3) digitization of the images; and (4) processing the digital image data to determine the direction of gaze from the centroids of the pupil and cornea in the images. Most of the prior commercial noninvasive eye-tracking systems rely on standard video cameras, which operate at frame rates of about 30 Hz. Such systems are limited to slow, full-frame operation. The video camera in the present system includes a charge-coupled-device (CCD) image detector plus electronic circuitry capable of implementing an advanced control scheme that effects readout from a small region of interest (ROI), or subwindow, of the full image. Inasmuch as the image features of interest (the cornea and pupil) typically occupy a small part of the camera frame, this ROI capability can be exploited to determine the direction of gaze at a high frame rate by repeatedly reading out only the ROI that contains the cornea and pupil. One of the present algorithms exploits the ROI capability. The algorithm takes horizontal row slices and takes advantage of the symmetry of the pupil and cornea circles and of the gray-scale contrasts of the pupil and cornea with respect to other parts of the eye. The algorithm determines which horizontal image slices contain the pupil and cornea and, on each valid slice, the end coordinates of the pupil and cornea.
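
    The row-slice scheme described in this abstract can be illustrated compactly. The snippet below is a toy reconstruction on a synthetic frame, not the NASA implementation: `pupil_from_row_slices`, the dark threshold, and the image are all assumptions made for the sketch.

    ```python
    import numpy as np

    def pupil_from_row_slices(img, dark_thresh=50):
        """Estimate the pupil centre by scanning horizontal row slices.

        On each row, pixels darker than dark_thresh form a candidate
        cross-section of the pupil; the midpoint of that dark run exploits
        the circle's left-right symmetry. Run midpoints, weighted by run
        length, are averaged into a centroid estimate."""
        xs, ys, weights = [], [], []
        for y, row in enumerate(img):
            cols = np.flatnonzero(row < dark_thresh)
            if cols.size == 0:
                continue  # this slice does not cross the pupil
            xs.append((cols[0] + cols[-1]) / 2.0)  # midpoint of the dark run
            ys.append(y)
            weights.append(cols.size)
        if not xs:
            return None
        w = np.asarray(weights, dtype=float)
        return float(np.average(xs, weights=w)), float(np.average(ys, weights=w))

    # Synthetic 100x100 frame: bright background, dark "pupil" centred at (40, 60)
    img = np.full((100, 100), 200, dtype=np.uint8)
    yy, xx = np.mgrid[0:100, 0:100]
    img[(xx - 40) ** 2 + (yy - 60) ** 2 <= 10 ** 2] = 20

    cx, cy = pupil_from_row_slices(img)
    print(round(cx), round(cy))  # → 40 60
    ```

    Because only the rows (or ROI) containing the pupil need to be read out and scanned, this style of slice-based detection is compatible with the high-frame-rate subwindow readout the abstract describes.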

  4. Spontaneous attention to faces in Asperger syndrome using ecologically valid static stimuli.

    PubMed

    Hanley, Mary; McPhillips, Martin; Mulhern, Gerry; Riby, Deborah M

    2013-11-01

    Previous eye tracking research on the allocation of attention to social information by individuals with autism spectrum disorders is equivocal and may be in part a consequence of variation in stimuli used between studies. The current study explored attention allocation to faces, and within faces, by individuals with Asperger syndrome using a range of static stimuli where faces were either viewed in isolation or viewed in the context of a social scene. Results showed that faces were viewed typically by the individuals with Asperger syndrome when presented in isolation, but attention to the eyes was significantly diminished in comparison to age and IQ-matched typical viewers when faces were viewed as part of social scenes. We show that when using static stimuli, there is evidence of atypicality for individuals with Asperger syndrome depending on the extent of social context. Our findings shed light on the previous explanations of gaze behaviour that have emphasised the role of movement in atypicalities of social attention in autism spectrum disorders and highlight the importance of consideration of the realistic portrayal of social information for future studies.

  5. Tracking the eye non-invasively: simultaneous comparison of the scleral search coil and optical tracking techniques in the macaque monkey

    PubMed Central

    Kimmel, Daniel L.; Mammo, Dagem; Newsome, William T.

    2012-01-01

    From human perception to primate neurophysiology, monitoring eye position is critical to the study of vision, attention, oculomotor control, and behavior. Two principal techniques for the precise measurement of eye position—the long-standing sclera-embedded search coil and more recent optical tracking techniques—are in use in various laboratories, but no published study compares the performance of the two methods simultaneously in the same primates. Here we compare two popular systems—a sclera-embedded search coil from C-N-C Engineering and the EyeLink 1000 optical system from SR Research—by recording simultaneously from the same eye in the macaque monkey while the animal performed a simple oculomotor task. We found broad agreement between the two systems, particularly in positional accuracy during fixation, measurement of saccade amplitude, detection of fixational saccades, and sensitivity to subtle changes in eye position from trial to trial. Nonetheless, certain discrepancies persist, particularly elevated saccade peak velocities, post-saccadic ringing, influence of luminance change on reported position, and greater sample-to-sample variation in the optical system. Our study shows that optical performance now rivals that of the search coil, rendering optical systems appropriate for many if not most applications. This finding is consequential, especially for animal subjects, because the optical systems do not require invasive surgery for implantation and repair of search coils around the eye. Our data also allow laboratories using the optical system in human subjects to assess the strengths and limitations of the technique for their own applications. PMID:22912608

  6. Reconceptualizing Reactivity of Think-Alouds and Eye Tracking: Absence of Evidence Is Not Evidence of Absence

    ERIC Educational Resources Information Center

    Godfroid, Aline; Spino, Le Anne

    2015-01-01

    This study extends previous reactivity research on the cognitive effects of think-alouds to include eye-tracking methodology. Unlike previous studies, we supplemented traditional superiority tests with equivalence tests, because only the latter are conceptually appropriate for demonstrating nonreactivity. Advanced learners of English read short…

  7. Attentional processing of other's facial display of pain: an eye tracking study.

    PubMed

    Vervoort, Tine; Trost, Zina; Prkachin, Kenneth M; Mueller, Sven C

    2013-06-01

    The present study investigated the role of observer pain catastrophizing and personal pain experience as possible moderators of attention to varying levels of facial pain expression in others. Eye movements were recorded as a direct and continuous index of attention allocation in a sample of 35 undergraduate students while viewing slides presenting picture pairs consisting of a neutral face combined with either a low, moderate, or high expressive pain face. Initial orienting of attention was measured as latency and duration of first fixation to 1 of 2 target images (i.e., neutral face vs pain face). Attentional maintenance was measured by gaze duration. With respect to initial orienting to pain, findings indicated that participants reporting low catastrophizing directed their attention more quickly to pain faces than to neutral faces, with fixation becoming increasingly faster with increasing levels of facial pain expression. In comparison, participants reporting high levels of catastrophizing showed decreased tendency to initially orient to pain faces, fixating equally quickly on neutral and pain faces. Duration of the first fixation revealed no significant effects. With respect to attentional maintenance, participants reporting high catastrophizing and pain intensity demonstrated significantly longer gaze duration for all face types (neutral and pain expression), relative to low catastrophizing counterparts. Finally, independent of catastrophizing, higher reported pain intensity contributed to decreased attentional maintenance to pain faces vs neutral faces. Theoretical implications and further research directions are discussed. Copyright © 2013 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.
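
    The two gaze indices this study relies on (latency of the first fixation on a target region, and total gaze duration there) are easy to state precisely. Below is a minimal sketch with hypothetical fixation events; the function name and data are illustrative, not the authors' analysis code.

    ```python
    def first_fixation_latency_and_dwell(fixations, region):
        """fixations: list of (onset_ms, duration_ms, region_label) in trial order.

        Returns the onset of the first fixation landing on `region` (an index
        of initial orienting) and the summed duration of all fixations on it
        (attentional maintenance, i.e. gaze duration)."""
        latency = next((onset for onset, _, r in fixations if r == region), None)
        dwell = sum(dur for _, dur, r in fixations if r == region)
        return latency, dwell

    # One hypothetical trial: a pain face and a neutral face shown side by side
    trial = [(180, 220, 'pain'), (400, 150, 'neutral'), (550, 300, 'pain')]
    print(first_fixation_latency_and_dwell(trial, 'pain'))  # → (180, 520)
    ```

    Comparing these two numbers across face types (neutral vs. low, moderate, or high pain expression) and across groups yields exactly the orienting and maintenance contrasts reported in the abstract.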

  8. Social Experience Does Not Abolish Cultural Diversity in Eye Movements

    PubMed Central

    Kelly, David J.; Jack, Rachael E.; Miellet, Sébastien; De Luca, Emanuele; Foreman, Kay; Caldara, Roberto

    2011-01-01

    Adults from Eastern (e.g., China) and Western (e.g., USA) cultural groups display pronounced differences in a range of visual processing tasks. For example, the eye movement strategies used for information extraction during a variety of face processing tasks (e.g., identification and facial expressions of emotion categorization) differs across cultural groups. Currently, many of the differences reported in previous studies have asserted that culture itself is responsible for shaping the way we process visual information, yet this has never been directly investigated. In the current study, we assessed the relative contribution of genetic and cultural factors by testing face processing in a population of British Born Chinese adults using face recognition and expression classification tasks. Contrary to predictions made by the cultural differences framework, the majority of British Born Chinese adults deployed “Eastern” eye movement strategies, while approximately 25% of participants displayed “Western” strategies. Furthermore, the cultural eye movement strategies used by individuals were consistent across recognition and expression tasks. These findings suggest that “culture” alone cannot straightforwardly account for diversity in eye movement patterns. Instead a more complex understanding of how the environment and individual experiences can influence the mechanisms that govern visual processing is required. PMID:21886626

  9. When is protection from impact needed for the face as well as the eyes in occupational environments?

    PubMed

    Dain, Stephen J; Huang, Rose; Tiao, Aimee; Chou, B Ralph

    2018-05-01

    The most commonly identified reason for requiring or using occupational eye and face protection is for protection against flying objects. Standards vary on what risk may require protection of the eyes alone and what requires protection for the whole face. Information on the minimum energy transfer for face damage to occur is not well-established. The heads of pigs were used as the common model for human skin. A 6 mm steel ball projected at velocities between 45 and 135 m/s was directed at the face area. Examples of impacts were filmed with a high-speed camera and the resulting damage was rated visually on a scale from 1 (no visible damage) to 5 (penetrated the skin and embedded in the flesh). The results for the cheek area indicate that 85 m/s is the velocity above which damage is more likely to occur unless the skin near the lip is included. For damage to the lip area to be avoided, the velocity needs to be 60 m/s or less. The present data support a maximum impact velocity of 85 m/s, provided the thinner and more vulnerable skin of the lids and orbital adnexa is protected. If the coverage area does not extend to the orbital adnexa, then the absolute upper limit for the velocity is 60 m/s. At this stage, eye-only protection, as represented by the lowest level of impact test in the standards in the form of a drop ball test, is not in question. © 2017 Optometry Australia.

  10. Suicidal ideation and attentional biases in children: An eye-tracking study.

    PubMed

    Tsypes, Aliona; Owens, Max; Gibb, Brandon E

    2017-11-01

Despite theoretical and empirical evidence for a heightened responsiveness to signals of social threat in suicidal individuals, no studies to date have examined whether this responsiveness might also manifest in the form of specific biases in attention to interpersonal stimuli. The current study, therefore, examined the presence and nature of attentional biases for facial expressions of emotion in children with and without a history of suicidal ideation (SI). Participants were 88 children (44 with a history of SI and 44 demographically and clinically matched controls without such history) recruited from the community. The average age of children was 9.26 years (44.3% female; 67.0% Caucasian). Children's history of SI was assessed via structured interviews with children and their parents. Attentional biases were assessed using a dot probe task that included fearful, happy, and sad facial stimuli, with both eye-tracking and reaction-time indices of attentional bias. Children with a history of SI exhibited significantly greater gaze duration toward fearful faces. The findings appeared to be at least partially independent of children's history of major depression or anxiety disorders or their current depressive or anxious symptoms. The study is limited by its cross-sectional design, which precludes any causal conclusions regarding the role of attentional biases in future suicide risk. Our results suggest that children with a history of SI exhibit biases in sustained attention toward socially threatening facial expressions. Pending replication, these findings might represent a new avenue for suicide risk assessment and intervention. Copyright © 2017 Elsevier B.V. All rights reserved.
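
    The reaction-time side of a dot probe index can be illustrated with a toy computation. This is a generic sketch of the standard bias score with hypothetical trial data; `attentional_bias_score` is not from the study.

    ```python
    from statistics import mean

    def attentional_bias_score(trials):
        """Reaction-time bias index for a dot probe task.

        Each trial is (probe_location, rt_ms), where probe_location is
        'emotional' if the probe replaced the emotional face and 'neutral'
        otherwise. A positive score means faster responses when the probe
        appears at the emotional face, i.e. attention was biased toward it."""
        rt_emotional = [rt for loc, rt in trials if loc == 'emotional']
        rt_neutral = [rt for loc, rt in trials if loc == 'neutral']
        return mean(rt_neutral) - mean(rt_emotional)

    # Hypothetical trials: slightly faster when the probe replaces the fearful face
    trials = [('emotional', 510), ('emotional', 495),
              ('neutral', 540), ('neutral', 525)]
    print(attentional_bias_score(trials))  # → 30.0
    ```

    Eye-tracking indices such as the greater gaze duration toward fearful faces reported above complement this score, since reaction times alone cannot distinguish initial orienting from sustained attention.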

  11. Fixations to the eyes aids in facial encoding; covertly attending to the eyes does not.

    PubMed

    Laidlaw, Kaitlin E W; Kingstone, Alan

    2017-02-01

When looking at images of faces, people will often focus their fixations on the eyes. It has previously been demonstrated that the eyes convey important information that may improve later facial recognition. Whether this advantage requires that the eyes be fixated, or merely attended to covertly (i.e., while looking elsewhere), is unclear from previous work. While attending to the eyes covertly without fixating them may be sufficient, the act of using overt attention to fixate the eyes may improve the processing of important details used for later recognition. In the present study, participants were shown a series of faces and, in Experiment 1, asked to attend to them normally while avoiding looking at either the eyes or, as a control, the mouth (overt attentional avoidance condition); or, in Experiment 2, to fixate the center of the face while covertly attending to either the eyes or the mouth (covert attention condition). After the first phase, participants were asked to perform an old/new face recognition task. We demonstrate that (a) when fixations to the eyes are avoided during initial viewing, subsequent face discrimination suffers, and (b) covert attention to the eyes alone is insufficient to improve face discrimination performance. Together, these findings demonstrate that fixating the eyes provides an encoding advantage that is not afforded by covert attention alone. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Detection of third and sixth cranial nerve palsies with a novel method for eye tracking while watching a short film clip

    PubMed Central

    Samadani, Uzma; Farooq, Sameer; Ritlop, Robert; Warren, Floyd; Reyes, Marleen; Lamm, Elizabeth; Alex, Anastasia; Nehrbass, Elena; Kolecki, Radek; Jureller, Michael; Schneider, Julia; Chen, Agnes; Shi, Chen; Mendhiratta, Neil; Huang, Jason H.; Qian, Meng; Kwak, Roy; Mikheev, Artem; Rusinek, Henry; George, Ajax; Fergus, Robert; Kondziolka, Douglas; Huang, Paul P.; Smith, R. Theodore

    2015-01-01

    OBJECT Automated eye movement tracking may provide clues to nervous system function at many levels. Spatial calibration of the eye tracking device requires the subject to have relatively intact ocular motility that implies function of cranial nerves (CNs) III (oculomotor), IV (trochlear), and VI (abducent) and their associated nuclei, along with the multiple regions of the brain imparting cognition and volition. The authors have developed a technique for eye tracking that uses temporal rather than spatial calibration, enabling detection of impaired ability to move the pupil relative to normal (neurologically healthy) control volunteers. This work was performed to demonstrate that this technique may detect CN palsies related to brain compression and to provide insight into how the technique may be of value for evaluating neuropathological conditions associated with CN palsy, such as hydrocephalus or acute mass effect. METHODS The authors recorded subjects’ eye movements by using an Eyelink 1000 eye tracker sampling at 500 Hz over 200 seconds while the subject viewed a music video playing inside an aperture on a computer monitor. The aperture moved in a rectangular pattern over a fixed time period. This technique was used to assess ocular motility in 157 neurologically healthy control subjects and 12 patients with either clinical CN III or VI palsy confirmed by neuro-ophthalmological examination, or surgically treatable pathological conditions potentially impacting these nerves. The authors compared the ratio of vertical to horizontal eye movement (height/width defined as aspect ratio) in normal and test subjects. RESULTS In 157 normal controls, the aspect ratio (height/width) for the left eye had a mean value ± SD of 1.0117 ± 0.0706. For the right eye, the aspect ratio had a mean of 1.0077 ± 0.0679 in these 157 subjects. There was no difference between sexes or ages. A patient with known CN VI palsy had a significantly increased aspect ratio (1.39), whereas 2

  13. Detection of third and sixth cranial nerve palsies with a novel method for eye tracking while watching a short film clip.

    PubMed

    Samadani, Uzma; Farooq, Sameer; Ritlop, Robert; Warren, Floyd; Reyes, Marleen; Lamm, Elizabeth; Alex, Anastasia; Nehrbass, Elena; Kolecki, Radek; Jureller, Michael; Schneider, Julia; Chen, Agnes; Shi, Chen; Mendhiratta, Neil; Huang, Jason H; Qian, Meng; Kwak, Roy; Mikheev, Artem; Rusinek, Henry; George, Ajax; Fergus, Robert; Kondziolka, Douglas; Huang, Paul P; Smith, R Theodore

    2015-03-01

    Automated eye movement tracking may provide clues to nervous system function at many levels. Spatial calibration of the eye tracking device requires the subject to have relatively intact ocular motility that implies function of cranial nerves (CNs) III (oculomotor), IV (trochlear), and VI (abducent) and their associated nuclei, along with the multiple regions of the brain imparting cognition and volition. The authors have developed a technique for eye tracking that uses temporal rather than spatial calibration, enabling detection of impaired ability to move the pupil relative to normal (neurologically healthy) control volunteers. This work was performed to demonstrate that this technique may detect CN palsies related to brain compression and to provide insight into how the technique may be of value for evaluating neuropathological conditions associated with CN palsy, such as hydrocephalus or acute mass effect. The authors recorded subjects' eye movements by using an Eyelink 1000 eye tracker sampling at 500 Hz over 200 seconds while the subject viewed a music video playing inside an aperture on a computer monitor. The aperture moved in a rectangular pattern over a fixed time period. This technique was used to assess ocular motility in 157 neurologically healthy control subjects and 12 patients with either clinical CN III or VI palsy confirmed by neuro-ophthalmological examination, or surgically treatable pathological conditions potentially impacting these nerves. The authors compared the ratio of vertical to horizontal eye movement (height/width defined as aspect ratio) in normal and test subjects. In 157 normal controls, the aspect ratio (height/width) for the left eye had a mean value ± SD of 1.0117 ± 0.0706. For the right eye, the aspect ratio had a mean of 1.0077 ± 0.0679 in these 157 subjects. There was no difference between sexes or ages. A patient with known CN VI palsy had a significantly increased aspect ratio (1.39), whereas 2 patients with known CN III
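
    The vertical-to-horizontal aspect ratio described in these two records lends itself to a quick numerical sketch. The code below is an illustration under stated assumptions, not the authors' pipeline: `aspect_ratio` and `z_vs_controls` are hypothetical names, the robust-percentile extent is our choice rather than the paper's method, and the gaze traces are simulated rather than recorded.

    ```python
    import numpy as np

    def aspect_ratio(gaze_x, gaze_y):
        """Height/width of the region swept by the gaze while the aperture
        traces its rectangle. Robust percentiles stand in for raw min/max
        to damp blinks and stray samples (our assumption)."""
        width = np.percentile(gaze_x, 97.5) - np.percentile(gaze_x, 2.5)
        height = np.percentile(gaze_y, 97.5) - np.percentile(gaze_y, 2.5)
        return height / width

    def z_vs_controls(ratio, mu=1.0117, sd=0.0706):
        """Standard score against the left-eye normative values quoted above."""
        return (ratio - mu) / sd

    # Simulated healthy viewer: samples spread over the full aperture path
    rng = np.random.default_rng(0)
    x = rng.uniform(-5.0, 5.0, 2000)
    y = rng.uniform(-5.0, 5.0, 2000)
    healthy = aspect_ratio(x, y)  # close to 1, as in the 157 controls

    # Simulated abducens (CN VI) palsy: lateral excursion curtailed on one side,
    # so the horizontal extent shrinks and the ratio rises (cf. the 1.39 case)
    palsy = aspect_ratio(np.clip(x, -5.0, 1.5), y)
    ```

    A z-score from `z_vs_controls(palsy)` well above 2 would flag the trace as abnormal relative to the normative distribution, which is the screening logic the temporal-calibration technique enables without requiring spatial calibration.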

  14. Delayed Anticipatory Spoken Language Processing in Adults with Dyslexia—Evidence from Eye-tracking.

    PubMed

    Huettig, Falk; Brouwer, Susanne

    2015-05-01

    It is now well established that anticipation of upcoming input is a key characteristic of spoken language comprehension. It has also frequently been observed that literacy influences spoken language processing. Here, we investigated whether anticipatory spoken language processing is related to individuals' word reading abilities. Dutch adults with dyslexia and a control group participated in two eye-tracking experiments. Experiment 1 was conducted to assess whether adults with dyslexia show the typical language-mediated eye gaze patterns. Eye movements of both adults with and without dyslexia closely replicated earlier research: spoken language is used to direct attention to relevant objects in the environment in a closely time-locked manner. In Experiment 2, participants received instructions (e.g., 'Kijk naar de(COM) afgebeelde piano(COM)', look at the displayed piano) while viewing four objects. Articles (Dutch 'het' or 'de') were gender marked such that the article agreed in gender only with the target, and thus, participants could use gender information from the article to predict the target object. The adults with dyslexia anticipated the target objects but much later than the controls. Moreover, participants' word reading scores correlated positively with their anticipatory eye movements. We conclude by discussing the mechanisms by which reading abilities may influence predictive language processing. Copyright © 2015 John Wiley & Sons, Ltd.

  15. How Listening to Music Affects Reading: Evidence From Eye Tracking.

    PubMed

    Zhang, Han; Miller, Kevin; Cleveland, Raymond; Cortina, Kai

    2018-02-01

    The current research looked at how listening to music affects eye movements when college students read natural passages for comprehension. Two studies found that the effects of music depend on both the frequency of the word and the dynamics of the music. Study 1 showed that lexical and linguistic features of the text remained highly robust predictors of looking times, even in the music condition. However, under music exposure, (a) readers produced more rereading, and (b) gaze durations on very low-frequency words were less well predicted by word length, suggesting disrupted sublexical processing. Study 2 showed that these effects were exacerbated for a short period as soon as a new song came into play. Our results suggested that word recognition generally stayed on track despite music exposure and that extensive rereading can, to some extent, compensate for disruption. However, an irrelevant auditory signal may impair sublexical processing of low-frequency words during first-pass reading, especially when the auditory signal changes dramatically. These eye movement patterns are different from those observed in some other scenarios in which reading comprehension is impaired, including mindless reading. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  16. Three-dimensional face pose detection and tracking using monocular videos: tool and application.

    PubMed

    Dornaika, Fadi; Raducanu, Bogdan

    2009-08-01

    Recently, we have proposed a real-time tracker that simultaneously tracks the 3-D head pose and facial actions in monocular video sequences that can be provided by low-quality cameras. This paper has two main contributions. First, we propose an automatic 3-D face pose initialization scheme for the real-time tracker by adopting a 2-D face detector and an eigenface system. Second, we use the proposed methods, the initialization and tracking, for enhancing the human-machine interaction functionality of an AIBO robot. More precisely, we show how the orientation of the robot's camera (or any active vision system) can be controlled through the estimation of the user's head pose. Applications based on head-pose imitation such as telepresence, virtual reality, and video games can directly exploit the proposed techniques. Experiments on real videos confirm the robustness and usefulness of the proposed methods.

  17. Tracking by Identification Using Computer Vision and Radio

    PubMed Central

    Mandeljc, Rok; Kovačič, Stanislav; Kristan, Matej; Perš, Janez

    2013-01-01

    We present a novel system for detection, localization, and tracking of multiple people, which fuses a multi-view computer vision approach with a radio-based localization system. The proposed fusion combines the best of both worlds: the excellent localization of computer vision and the strong identity information provided by the radio system. It is therefore able to perform tracking by identification, which makes it impervious to propagated identity switches. We present a comprehensive methodology for evaluating systems that perform person localization in a world coordinate system and use it to evaluate the proposed system as well as its components. Experimental results on a challenging indoor dataset, which involves multiple people walking around a realistically cluttered room, confirm that the proposed fusion significantly outperforms its individual components. Compared to the radio-based system, it achieves better localization results, while at the same time successfully preventing the propagation of identity switches that occur in pure computer-vision-based tracking. PMID:23262485
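    The core idea of tracking by identification, attaching persistent radio-supplied identities to anonymous vision detections, can be sketched as an assignment problem. The paper's actual fusion is more elaborate (probabilistic, multi-view); the minimal version below, with assumed function and variable names, simply matches each camera detection to the nearest radio tag by minimizing total distance with the Hungarian algorithm:

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def assign_identities(cv_positions, radio_positions, radio_ids):
        """Match anonymous computer-vision detections to identity-bearing
        radio localizations by minimizing total Euclidean distance.

        cv_positions:    (N, 2) array of CV-detected positions (world coords)
        radio_positions: (M, 2) array of radio-localized positions
        radio_ids:       list of M identity labels
        Returns a dict mapping CV detection index -> identity label.
        """
        # Pairwise distance matrix: rows are CV detections, columns radio tags
        cost = np.linalg.norm(
            cv_positions[:, None, :] - radio_positions[None, :, :], axis=-1)
        rows, cols = linear_sum_assignment(cost)
        return {r: radio_ids[c] for r, c in zip(rows, cols)}
    ```

    Because the identity comes from the radio tag at every frame rather than from frame-to-frame data association, a momentary mismatch cannot propagate into later frames, which is the property the abstract highlights.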

  18. CON4EI: Development of testing strategies for hazard identification and labelling for serious eye damage and eye irritation of chemicals.

    PubMed

    Adriaens, E; Verstraelen, S; Alépée, N; Kandarova, H; Drzewiecka, A; Gruszka, K; Guest, R; Willoughby, J A; Van Rompay, A R

    2018-06-01

    Assessment of acute eye irritation potential is part of the international regulatory requirements for safety testing of chemicals. In recent decades, many efforts have been made in the search for alternative methods to replace the regulatory in vivo Draize rabbit eye test (OECD TG 405). Complete replacement of the in vivo Draize rabbit eye test has not yet been achieved. The main objective of the CEFIC-LRI-AIMT6-VITO CON4EI (CONsortium for in vitro Eye Irritation testing strategy) project was to develop tiered testing strategies for serious eye damage and eye irritation assessment that can lead to complete replacement of OECD TG 405. A set of 80 reference chemicals, balanced by important drivers of classification and physical state, was tested with seven test methods. Based on the results of this project, three different strategies were suggested: a standalone strategy (EpiOcular ET-50), a two-tiered strategy, and a three-tiered strategy, each of which can distinguish between Cat 1 chemicals, Cat 2 chemicals, and chemicals that do not require classification (No Cat). The two-tiered and three-tiered strategies use an RhCE test method (EpiOcular EIT or SkinEthic™ EIT) at the bottom (identification of No Cat) in combination with the BCOP LLBO (two-tiered strategy) or the BCOP OP-KIT and SMI (three-tiered strategy) at the top (identification of Cat 1). The proposed strategies correctly identified 71.1%-82.9% of Cat 1, 64.2%-68.5% of Cat 2, and ≥80% of No Cat chemicals. Similar results were obtained for the Top-Down and Bottom-Up approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.
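    The control flow of the two-tiered strategy (Bottom-Up direction) can be made concrete with a short sketch. This is illustrative only: the boolean inputs stand in for each assay's prediction, and the actual prediction models, cut-offs, and Top-Down ordering are defined by the respective test guidelines, not reproduced here.

    ```python
    def two_tiered_classification(rhce_irritant, bcop_cat1=None):
        """Bottom-Up control flow of a two-tiered CON4EI-style strategy.

        rhce_irritant: bottom-tier RhCE result (EpiOcular EIT or
                       SkinEthic EIT); False screens out No Cat chemicals.
        bcop_cat1:     top-tier BCOP LLBO result, only needed when the
                       bottom tier is positive; True flags serious damage.
        Returns the UN GHS category label.
        """
        if not rhce_irritant:      # bottom tier negative: stop, no label
            return "No Cat"
        if bcop_cat1:              # top tier positive: serious eye damage
            return "Cat 1"
        return "Cat 2"             # irritant, but not serious damage
    ```

    The three-tiered variant would insert an additional assay (BCOP OP-KIT plus SMI) in place of the single top-tier call, but the short-circuiting structure, in which each tier only runs when the previous one could not decide, is the same.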

  19. Using Eye Tracking to Investigate Semantic and Spatial Representations of Scientific Diagrams during Text-Diagram Integration

    ERIC Educational Resources Information Center

    Jian, Yu-Cin; Wu, Chao-Jung

    2015-01-01

    We investigated strategies used by readers when reading a science article with a diagram and assessed whether semantic and spatial representations were constructed while reading the diagram. Seventy-one undergraduate participants read a scientific article while tracking their eye movements and then completed a reading comprehension test. Our…

  20. Using Eye Tracking to Understand the Responses of Learners to Vocabulary Learning Strategy Instruction and Use

    ERIC Educational Resources Information Center

    Liu, Pei-Lin

    2014-01-01

    This study examined the influence of morphological instruction in an eye-tracking English vocabulary recognition task. Sixty-eight freshmen enrolled in an English course and received either traditional or morphological instruction for learning English vocabulary. The experimental part of the study was conducted over two-hour class periods for…