Sample records for eye gaze patterns

  1. Dynamic modeling of patient and physician eye gaze to understand the effects of electronic health records on doctor-patient communication and attention.

    PubMed

    Montague, Enid; Asan, Onur

    2014-03-01

    The aim of this study was to examine eye gaze patterns between patients and physicians while electronic health records were used to support patient care. Eye gaze provides an indication of physician attention to the patient, patient/physician interaction, and physician behaviors such as searching for and documenting information. A field study was conducted in which 100 patient visits were observed and video recorded in a primary care clinic. Videos were then coded for gaze behaviors: patients' and physicians' gaze at each other and at artifacts such as electronic health records was coded using a pre-established objective coding scheme. Gaze data were then analyzed using lag sequential methods. Results showed that several eye gaze patterns are significantly dependent on each other. All doctor-initiated gaze patterns were followed by patient gaze patterns. Some patient-initiated gaze patterns were also significantly followed by doctor gaze patterns, unlike the findings in previous studies. Health information technology appears to contribute to some of the new significant patterns that have emerged. Gaze patterns related to technology also differed from patterns identified in studies with paper charts. Several patient-doctor-technology sequences were also significant. Electronic health records affect the patient-physician eye contact dynamic differently than paper charts. This study identified several patterns of patient-physician interaction with electronic health record systems. Consistent with previous studies, physician-initiated gaze is an important driver of the interactions between patient and physician and between patient and technology. Published by Elsevier Ireland Ltd.
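The lag sequential methods mentioned in this record test whether one coded gaze event (e.g., "doctor gazes patient") is followed by another more often than chance. A minimal sketch of a lag-1 sequential analysis over a coded event stream is below; the codes, toy sequence, and function name are illustrative, not taken from the study:

```python
import math
from collections import Counter

def lag1_sequential(events):
    """Lag-1 sequential analysis: for each ordered pair of codes
    (A, B), compare how often B immediately follows A against B's
    overall base rate, expressed as an approximate z-score
    (observed minus expected count, over the binomial SD)."""
    pairs = list(zip(events, events[1:]))
    transitions = Counter(pairs)
    n = len(pairs)                            # total transitions
    given = Counter(a for a, _ in pairs)      # times each code leads
    target = Counter(b for _, b in pairs)     # times each code follows
    z = {}
    for (a, b), obs in transitions.items():
        p_b = target[b] / n                   # base rate of code B
        expected = given[a] * p_b             # expected follows of A
        sd = math.sqrt(given[a] * p_b * (1 - p_b))
        z[(a, b)] = (obs - expected) / sd if sd else 0.0
    return z

# Toy sequence of gaze codes: D = doctor gazes patient,
# P = patient gazes doctor, T = doctor gazes technology.
scores = lag1_sequential(list("DPDPTTDPDPTTDP"))
# D -> P occurs on every opportunity, so its z-score is strongly positive.
```

Pairs with large positive z-scores are the "significantly dependent" transitions the abstract refers to; a full analysis would use adjusted residuals and correct for multiple comparisons.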

  2. Dynamic modeling of patient and physician eye gaze to understand the effects of electronic health records on doctor-patient communication and attention

    PubMed Central

    Montague, Enid; Asan, Onur

    2014-01-01

    Objective The aim of this study was to examine eye gaze patterns between patients and physicians while electronic health records were used to support patient care. Background Eye gaze provides an indication of physician attention to the patient, patient/physician interaction, and physician behaviors such as searching for and documenting information. Methods A field study was conducted in which 100 patient visits were observed and video recorded in a primary care clinic. Videos were then coded for gaze behaviors: patients' and physicians' gaze at each other and at artifacts such as electronic health records was coded using a pre-established objective coding scheme. Gaze data were then analyzed using lag sequential methods. Results Results showed that several eye gaze patterns are significantly dependent on each other. All doctor-initiated gaze patterns were followed by patient gaze patterns. Some patient-initiated gaze patterns were also significantly followed by doctor gaze patterns, unlike the findings in previous studies. Health information technology appears to contribute to some of the new significant patterns that have emerged. Gaze patterns related to technology also differed from patterns identified in studies with paper charts. Several patient-doctor-technology sequences were also significant. Electronic health records affect the patient-physician eye contact dynamic differently than paper charts. Conclusion This study identified several patterns of patient-physician interaction with electronic health record systems. Consistent with previous studies, physician-initiated gaze is an important driver of the interactions between patient and physician and between patient and technology. PMID:24380671

  3. A comparison of facial color pattern and gazing behavior in canid species suggests gaze communication in gray wolves (Canis lupus).

    PubMed

    Ueda, Sayoko; Kumagai, Gaku; Otaki, Yusuke; Yamaguchi, Shinya; Kohshima, Shiro

    2014-01-01

    As facial color pattern around the eyes has been suggested to serve various adaptive functions related to the gaze signal, we compared the patterns among 25 canid species, focusing on the gaze signal, to estimate the function of facial color pattern in these species. The facial color patterns of the studied species could be categorized into the following three types based on contrast indices relating to the gaze signal: A-type (both pupil position in the eye outline and eye position in the face are clear), B-type (only the eye position is clear), and C-type (both the pupil and eye position are unclear). A-type faces with light-colored irises were observed in most studied species of the wolf-like clade and some of the red fox-like clade. A-type faces tended to be observed in species living in family groups all year-round, whereas B-type faces tended to be seen in solo/pair-living species. The duration of gazing behavior during which the facial gaze-signal is displayed to the other individual was longest in gray wolves with typical A-type faces, of intermediate length in fennec foxes with typical B-type faces, and shortest in bush dogs with typical C-type faces. These results suggest that the facial color pattern of canid species is related to their gaze communication and that canids with A-type faces, especially gray wolves, use the gaze signal in conspecific communication.

  4. Anxiety symptoms and children's eye gaze during fear learning.

    PubMed

    Michalska, Kalina J; Machlin, Laura; Moroney, Elizabeth; Lowet, Daniel S; Hettema, John M; Roberson-Nay, Roxann; Averbeck, Bruno B; Brotman, Melissa A; Nelson, Eric E; Leibenluft, Ellen; Pine, Daniel S

    2017-11-01

    The eye region of the face is particularly relevant for decoding threat-related signals, such as fear. However, it is unclear if gaze patterns to the eyes can be influenced by fear learning. Previous studies examining gaze patterns in adults find an association between anxiety and eye gaze avoidance, although no studies to date examine how associations between anxiety symptoms and eye-viewing patterns manifest in children. The current study examined the effects of learning and trait anxiety on eye gaze using a face-based fear conditioning task developed for use in children. Participants were 82 youth from a general population sample of twins (aged 9-13 years), exhibiting a range of anxiety symptoms. Participants underwent a fear conditioning paradigm where the conditioned stimuli (CS+) were two neutral faces, one of which was randomly selected to be paired with an aversive scream. Eye tracking, physiological, and subjective data were acquired. Children and parents reported their child's anxiety using the Screen for Child Anxiety Related Emotional Disorders. Conditioning influenced eye gaze patterns in that children looked longer and more frequently to the eye region of the CS+ than CS- face; this effect was present only during fear acquisition, not at baseline or extinction. Furthermore, consistent with past work in adults, anxiety symptoms were associated with eye gaze avoidance. Finally, gaze duration to the eye region mediated the effect of anxious traits on self-reported fear during acquisition. Anxiety symptoms in children relate to face-viewing strategies deployed in the context of a fear learning experiment. This relationship may inform attempts to understand the relationship between pediatric anxiety symptoms and learning. © 2017 Association for Child and Adolescent Mental Health.

  5. A Comparison of Facial Color Pattern and Gazing Behavior in Canid Species Suggests Gaze Communication in Gray Wolves (Canis lupus)

    PubMed Central

    Ueda, Sayoko; Kumagai, Gaku; Otaki, Yusuke; Yamaguchi, Shinya; Kohshima, Shiro

    2014-01-01

    As facial color pattern around the eyes has been suggested to serve various adaptive functions related to the gaze signal, we compared the patterns among 25 canid species, focusing on the gaze signal, to estimate the function of facial color pattern in these species. The facial color patterns of the studied species could be categorized into the following three types based on contrast indices relating to the gaze signal: A-type (both pupil position in the eye outline and eye position in the face are clear), B-type (only the eye position is clear), and C-type (both the pupil and eye position are unclear). A-type faces with light-colored irises were observed in most studied species of the wolf-like clade and some of the red fox-like clade. A-type faces tended to be observed in species living in family groups all year-round, whereas B-type faces tended to be seen in solo/pair-living species. The duration of gazing behavior during which the facial gaze-signal is displayed to the other individual was longest in gray wolves with typical A-type faces, of intermediate length in fennec foxes with typical B-type faces, and shortest in bush dogs with typical C-type faces. These results suggest that the facial color pattern of canid species is related to their gaze communication and that canids with A-type faces, especially gray wolves, use the gaze signal in conspecific communication. PMID:24918751

  6. Gaze holding deficits discriminate early from late onset cerebellar degeneration.

    PubMed

    Tarnutzer, Alexander A; Weber, K P; Schuknecht, B; Straumann, D; Marti, S; Bertolini, G

    2015-08-01

    The vestibulo-cerebellum calibrates the output of the inherently leaky brainstem neural velocity-to-position integrator to provide stable gaze holding. In healthy humans small-amplitude centrifugal nystagmus is present at extreme gaze-angles, with a non-linear relationship between eye-drift velocity and eye eccentricity. In cerebellar degeneration this calibration is impaired, resulting in pathological gaze-evoked nystagmus (GEN). In cerebellar dysfunction, increased eye drift may be present at any gaze angle (reflecting pure scaling of the eye drift found in controls) or restricted to far-lateral gaze (reflecting changes in the shape of the non-linear relationship), and the resulting eye-drift patterns could be related to specific disorders. We recorded horizontal eye positions in 21 patients with cerebellar neurodegeneration (gaze-angle = ±40°) and clinically confirmed GEN. Eye-drift velocity, linearity and symmetry of drift were determined. MR-images were assessed for cerebellar atrophy. In our patients, the relation between eye-drift velocity and gaze eccentricity was non-linear, yielding (compared to controls) significant GEN at gaze-eccentricities ≥20°. Pure scaling was most frequently observed (n = 10/18), followed by pure shape-changing (n = 4/18) and a mixed pattern (n = 4/18). Pure shape-changing patients were significantly (p = 0.001) younger at disease-onset than pure scaling patients. Atrophy centered around the superior/dorsal vermis, flocculus/paraflocculus and dentate nucleus and did not correlate with the specific drift behaviors observed. Eye drift in cerebellar degeneration varies in magnitude; however, it retains its non-linear properties. With different drift patterns being linked to age at disease-onset, we propose that the gaze-holding pattern (scaling vs. shape-changing) may discriminate early- from late-onset cerebellar degeneration. Whether this allows a distinction among specific cerebellar disorders remains to be determined.

  7. Investigating the Association of Eye Gaze Pattern and Diagnostic Error in Mammography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voisin, Sophie; Pinto, Frank M; Xu, Songhua

    2013-01-01

    The objective of this study was to investigate the association between eye-gaze patterns and the diagnostic accuracy of radiologists for the task of assessing the likelihood of malignancy of mammographic masses. Six radiologists (2 expert breast imagers and 4 radiology residents of variable training) assessed the likelihood of malignancy of 40 biopsy-proven mammographic masses (20 malignant and 20 benign) on a computer monitor. Eye-gaze data were collected using a commercial remote eye-tracker. Upon reviewing each mass, the radiologists were also asked to provide their assessment regarding the probability of malignancy of the depicted mass as well as a rating regarding the perceived difficulty of the diagnostic task. The collected data were analyzed using established algorithms, and various quantitative metrics were extracted to characterize the recorded gaze patterns. The extracted metrics were correlated with the radiologists' diagnostic decisions and perceived complexity scores. Results showed that the visual gaze pattern of radiologists varies substantially, not only depending on their experience level but also among individuals. However, some eye gaze metrics appear to correlate with diagnostic error and perceived complexity more consistently. These results suggest that although gaze patterns are generally associated with diagnostic error and the human-perceived difficulty of the diagnostic task, there are substantial individual differences that are not explained simply by the experience level of the individual performing the diagnostic task.

  8. Optimal eye movement strategies: a comparison of neurosurgeons' gaze patterns when using a surgical microscope.

    PubMed

    Eivazi, Shahram; Hafez, Ahmad; Fuhl, Wolfgang; Afkari, Hoorieh; Kasneci, Enkelejda; Lehecka, Martin; Bednarik, Roman

    2017-06-01

    Previous studies have consistently demonstrated gaze behaviour differences related to expertise during various surgical procedures. In micro-neurosurgery, however, there is a lack of empirically demonstrated individual differences associated with visual attention. It is unknown exactly how neurosurgeons see a stereoscopic magnified view in the context of micro-neurosurgery and what this implies for medical training. We report on an investigation of the eye movement patterns in micro-neurosurgery using a state-of-the-art eye tracker. We studied the eye movements of nine neurosurgeons while performing cutting and suturing tasks under a surgical microscope. Eye-movement characteristics, such as fixation (focus level) and saccade (visual search pattern), were analysed. The results show a strong relationship between the level of microsurgical skill and the gaze pattern, with greater expertise associated with greater eye control, stability, and focus. For example, in the cutting task, well-trained surgeons' fixation durations on the operating field were twice as long as the novices' (expert, 848 ms; novice, 402 ms). Maintaining steady visual attention on the target (fixation) and being able to quickly make eye jumps from one target to another (saccades) are two important elements for the success of neurosurgery. The captured gaze patterns can be used to improve medical education, as part of an assessment system or in a gaze-training application.
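Several of these records quantify gaze through fixation counts and durations. One common way to extract fixations from raw gaze samples is a dispersion-threshold pass (I-DT style); the sketch below is illustrative, with made-up thresholds and names, and is not the pipeline any of these studies used:

```python
def _dispersion(win):
    """Bounding-box dispersion (width + height) of a sample window."""
    xs = [p[0] for p in win]
    ys = [p[1] for p in win]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=1.0, min_samples=5):
    """Dispersion-threshold fixation detection. `samples` is a list of
    (x, y) gaze coordinates at a fixed sampling rate. A fixation is a
    maximal run of at least `min_samples` points whose dispersion stays
    under `max_dispersion`. Returns (start_index, end_index) pairs."""
    fixations = []
    i, n = 0, len(samples)
    while i + min_samples <= n:
        j = i + min_samples
        if _dispersion(samples[i:j]) > max_dispersion:
            i += 1          # window too spread out: slide forward
            continue
        # grow the window while dispersion stays below threshold
        while j < n and _dispersion(samples[i:j + 1]) <= max_dispersion:
            j += 1
        fixations.append((i, j - 1))
        i = j
    return fixations

# Two stable clusters separated by a saccade-like jump yield two fixations.
gaze = [(0.0, 0.0)] * 6 + [(10.0, 10.0)] * 6
print(detect_fixations(gaze))  # -> [(0, 5), (6, 11)]
```

Fixation count is then the length of the returned list, and fixation duration follows from the index span divided by the sampling rate; velocity-threshold (I-VT) detection is a common alternative.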

  9. Objective eye-gaze behaviour during face-to-face communication with proficient alaryngeal speakers: a preliminary study.

    PubMed

    Evitts, Paul; Gallop, Robert

    2011-01-01

    There is a large body of research demonstrating the impact of visual information on speaker intelligibility in both normal and disordered speaker populations. However, there is minimal information on which specific visual features listeners find salient during conversational discourse. To investigate listeners' eye-gaze behaviour during face-to-face conversation with normal, laryngeal and proficient alaryngeal speakers. Sixty participants individually participated in a 10-min conversation with one of four speakers (typical laryngeal, tracheoesophageal, oesophageal, electrolaryngeal; 15 participants randomly assigned to one mode of speech). All speakers were > 85% intelligible and were judged to be 'proficient' by two certified speech-language pathologists. Participants were fitted with a head-mounted eye-gaze tracking device (Mobile Eye, ASL) that calculated the region of interest and mean duration of eye-gaze. Self-reported gaze behaviour was also obtained following the conversation using a 10 cm visual analogue scale. While listening, participants viewed the lower facial region of the oesophageal speaker more than the normal or tracheoesophageal speaker. Results of non-hierarchical cluster analyses showed that while listening, the pattern of eye-gaze was predominantly directed at the lower face of the oesophageal and electrolaryngeal speaker and more evenly dispersed among the background, lower face, and eyes of the normal and tracheoesophageal speakers. Finally, results show a low correlation between self-reported eye-gaze behaviour and objective regions of interest data. Overall, results suggest similar eye-gaze behaviour when healthy controls converse with normal and tracheoesophageal speakers and that participants had significantly different eye-gaze patterns when conversing with an oesophageal speaker. Results are discussed in terms of existing eye-gaze data and its potential implications on auditory-visual speech perception. © 2011 Royal College of Speech & Language Therapists.

  10. Relationship between abstract thinking and eye gaze pattern in patients with schizophrenia.

    PubMed

    Oh, Jooyoung; Chun, Ji-Won; Lee, Jung Suk; Kim, Jae-Jin

    2014-04-16

    Effective integration of visual information is necessary for abstract thinking, but patients with schizophrenia have slow eye movements and usually explore limited visual information. This study examines the relationship between abstract thinking ability and the pattern of eye gaze in patients with schizophrenia using a novel theme identification task. Twenty patients with schizophrenia and 22 healthy controls completed the theme identification task, in which subjects selected the word, out of a set of provided words, that best described the theme of a picture. Eye gaze while performing the task was recorded by an eye tracker. Patients exhibited a significantly lower correct rate for theme identification and lower fixation and saccade counts than controls. The correct rate was significantly correlated with the fixation count in patients, but not in controls. Patients with schizophrenia showed impaired abstract thinking and decreased quality of gaze, which were positively associated with each other. Theme identification and eye gaze appear to be useful as tools for the objective measurement of abstract thinking in patients with schizophrenia.

  11. Design of a Gaze-Sensitive Virtual Social Interactive System for Children With Autism

    PubMed Central

    Lahiri, Uttama; Warren, Zachary; Sarkar, Nilanjan

    2013-01-01

    Impairments in social communication skills are thought to be core deficits in children with autism spectrum disorder (ASD). In recent years, several assistive technologies, particularly Virtual Reality (VR), have been investigated to promote social interactions in this population. It is well known that children with ASD demonstrate atypical viewing patterns during social interactions, and thus monitoring eye-gaze can be valuable for designing intervention strategies. While several studies have used eye-tracking technology to monitor eye-gaze for offline analysis, there exists no real-time system that can monitor eye-gaze dynamically and provide individualized feedback. Given the promise of VR-based social interaction and the usefulness of monitoring eye-gaze in real-time, a novel VR-based dynamic eye-tracking system is developed in this work. This system, called Virtual Interactive system with Gaze-sensitive Adaptive Response Technology (VIGART), is capable of delivering individualized feedback based on a child's dynamic gaze patterns during VR-based interaction. Results are presented from a usability study with six adolescents with ASD examining the acceptability and usefulness of VIGART. The improvements in behavioral viewing and the changes in relevant eye-physiological indexes observed while participants interacted with VIGART indicate the potential of this novel technology. PMID:21609889

  12. Design of a gaze-sensitive virtual social interactive system for children with autism.

    PubMed

    Lahiri, Uttama; Warren, Zachary; Sarkar, Nilanjan

    2011-08-01

    Impairments in social communication skills are thought to be core deficits in children with autism spectrum disorder (ASD). In recent years, several assistive technologies, particularly Virtual Reality (VR), have been investigated to promote social interactions in this population. It is well known that children with ASD demonstrate atypical viewing patterns during social interactions, and thus monitoring eye-gaze can be valuable for designing intervention strategies. While several studies have used eye-tracking technology to monitor eye-gaze for offline analysis, there exists no real-time system that can monitor eye-gaze dynamically and provide individualized feedback. Given the promise of VR-based social interaction and the usefulness of monitoring eye-gaze in real-time, a novel VR-based dynamic eye-tracking system is developed in this work. This system, called Virtual Interactive system with Gaze-sensitive Adaptive Response Technology (VIGART), is capable of delivering individualized feedback based on a child's dynamic gaze patterns during VR-based interaction. Results are presented from a usability study with six adolescents with ASD examining the acceptability and usefulness of VIGART. The improvements in behavioral viewing and the changes in relevant eye-physiological indexes observed while participants interacted with VIGART indicate the potential of this novel technology. © 2011 IEEE

  13. How physician electronic health record screen sharing affects patient and doctor non-verbal communication in primary care.

    PubMed

    Asan, Onur; Young, Henry N; Chewning, Betty; Montague, Enid

    2015-03-01

    Use of electronic health records (EHRs) in primary-care exam rooms changes the dynamics of patient-physician interaction. This study examines and compares doctor-patient non-verbal communication (eye-gaze patterns) during primary care encounters for three different screen/information sharing groups: (1) active information sharing, (2) passive information sharing, and (3) technology withdrawal. Researchers video recorded 100 primary-care visits and coded the direction and duration of doctor and patient gaze. Descriptive statistics compared the length of gaze patterns as a percentage of visit length. Lag sequential analysis determined whether physician eye-gaze influenced patient eye gaze, and vice versa, and examined variations across groups. Significant differences were found in duration of gaze across groups. Lag sequential analysis found significant associations between several gaze patterns. Some, such as DGP-PGD ("doctor gaze patient" followed by "patient gaze doctor"), were significant for all groups. Others, such as DGT-PGU ("doctor gaze technology" followed by "patient gaze unknown"), were unique to one group. Some technology use styles (active information sharing) seem to create more patient engagement, while others (passive information sharing) lead to patient disengagement. Doctors can engage patients in communication by using EHRs in the visits. EHR training and design should facilitate this. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  14. Relationship between abstract thinking and eye gaze pattern in patients with schizophrenia

    PubMed Central

    2014-01-01

    Background Effective integration of visual information is necessary for abstract thinking, but patients with schizophrenia have slow eye movements and usually explore limited visual information. This study examines the relationship between abstract thinking ability and the pattern of eye gaze in patients with schizophrenia using a novel theme identification task. Methods Twenty patients with schizophrenia and 22 healthy controls completed the theme identification task, in which subjects selected the word, out of a set of provided words, that best described the theme of a picture. Eye gaze while performing the task was recorded by an eye tracker. Results Patients exhibited a significantly lower correct rate for theme identification and lower fixation and saccade counts than controls. The correct rate was significantly correlated with the fixation count in patients, but not in controls. Conclusions Patients with schizophrenia showed impaired abstract thinking and decreased quality of gaze, which were positively associated with each other. Theme identification and eye gaze appear to be useful as tools for the objective measurement of abstract thinking in patients with schizophrenia. PMID:24739356

  15. Aversive eye gaze during a speech in virtual environment in patients with social anxiety disorder.

    PubMed

    Kim, Haena; Shin, Jung Eun; Hong, Yeon-Ju; Shin, Yu-Bin; Shin, Young Seok; Han, Kiwan; Kim, Jae-Jin; Choi, Soo-Hee

    2018-03-01

    One of the main characteristics of social anxiety disorder is excessive fear of social evaluation. In such situations, anxiety can influence gaze behaviour. Thus, the current study adopted virtual reality to examine the eye gaze patterns of social anxiety disorder patients while they presented different types of speeches. A total of 79 social anxiety disorder patients and 51 healthy controls presented prepared speeches on general topics and impromptu speeches on self-related topics to a virtual audience while their eye gaze was recorded. Their presentation performance was also evaluated. Overall, social anxiety disorder patients showed less eye gaze towards the audience than healthy controls. Type of speech did not influence social anxiety disorder patients' gaze allocation towards the audience. However, patients with social anxiety disorder showed significant correlations between the amount of eye gaze towards the audience while presenting self-related speeches and social anxiety cognitions. The current study confirms that the eye gaze behaviour of social anxiety disorder patients is aversive and that their anxiety symptoms are more dependent on the nature of the topic.

  16. Vigilance and Avoidance of Threat in the Eye Movements of Children with Separation Anxiety Disorder

    ERIC Educational Resources Information Center

    In-Albon, Tina; Kossowsky, Joe; Schneider, Silvia

    2010-01-01

    The "vigilance-avoidance" attention pattern is found in anxious adults, who initially gaze more at threatening pictures than nonanxious adults (vigilance), but subsequently gaze less at them than nonanxious adults (avoidance). The present research, using eye tracking methodology, tested whether anxious children show the same pattern. Children with…

  17. Love is in the gaze: an eye-tracking study of love and sexual desire.

    PubMed

    Bolmont, Mylene; Cacioppo, John T; Cacioppo, Stephanie

    2014-09-01

    Reading other people's eyes is a valuable skill during interpersonal interaction. Although a number of studies have investigated visual patterns in relation to the perceiver's interest, intentions, and goals, little is known about eye gaze when it comes to differentiating intentions to love from intentions to lust (sexual desire). To address this question, we conducted two experiments: one testing whether the visual pattern related to the perception of love differs from that related to lust and one testing whether the visual pattern related to the expression of love differs from that related to lust. Our results show that a person's eye gaze shifts as a function of his or her goal (love vs. lust) when looking at a visual stimulus. Such identification of distinct visual patterns for love and lust could have theoretical and clinical importance in couples therapy when these two phenomena are difficult to disentangle from one another on the basis of patients' self-reports. © The Author(s) 2014.

  18. A novel attention training paradigm based on operant conditioning of eye gaze: Preliminary findings.

    PubMed

    Price, Rebecca B; Greven, Inez M; Siegle, Greg J; Koster, Ernst H W; De Raedt, Rudi

    2016-02-01

    Inability to engage with positive stimuli is a widespread problem associated with negative mood states across many conditions, from low self-esteem to anhedonic depression. Though attention retraining procedures have shown promise as interventions in some clinical populations, novel procedures may be necessary to reliably attenuate chronic negative mood in refractory clinical populations (e.g., clinical depression) through, for example, more active, adaptive learning processes. In addition, a focus on individual difference variables predicting intervention outcome may improve the ability to provide such targeted interventions efficiently. To provide preliminary proof-of-principle, we tested a novel paradigm using operant conditioning to train eye gaze patterns toward happy faces. Thirty-two healthy undergraduates were randomized to receive operant conditioning of eye gaze toward happy faces (train-happy) or neutral faces (train-neutral). At the group level, the train-happy condition attenuated sad mood increases following a stressful task, in comparison to train-neutral. In individual differences analyses, greater physiological reactivity (pupil dilation) in response to happy faces (during an emotional face-search task at baseline) predicted decreased mood reactivity after stress. These preliminary results suggest that operant conditioning of eye gaze toward happy faces buffers against stress-induced effects on mood, particularly in individuals who show sufficient baseline neural engagement with happy faces. Eye gaze patterns to emotional face arrays may have a causal relationship with mood reactivity. Personalized medicine research in depression may benefit from novel cognitive training paradigms that shape eye gaze patterns through feedback. Baseline neural function (pupil dilation) may be a key mechanism, aiding in iterative refinement of this approach. (c) 2016 APA, all rights reserved.

  19. "Avoiding or approaching eyes"? Introversion/extraversion affects the gaze-cueing effect.

    PubMed

    Ponari, Marta; Trojano, Luigi; Grossi, Dario; Conson, Massimiliano

    2013-08-01

    We investigated whether the intro-/extraversion personality dimension can influence the processing of others' eye gaze direction and emotional facial expression during a target detection task. On the basis of previous evidence showing that self-reported trait anxiety can affect gaze-cueing with emotional faces, we also verified whether trait anxiety can modulate the influence of intro-/extraversion on behavioral performance. Fearful, happy, angry or neutral faces, with either direct or averted gaze, were presented before the target appeared in spatial locations congruent or incongruent with the stimuli's eye gaze direction. Results showed a significant influence of the intro-/extraversion dimension on the gaze-cueing effect for angry, happy, and neutral faces with averted gaze. Introverts did not show the gaze congruency effect when viewing angry expressions, but did so with happy and neutral faces; extraverts showed the opposite pattern. Importantly, the influence of intro-/extraversion on gaze-cueing was not mediated by trait anxiety. These findings demonstrate that personality differences can shape the processing of interactions between relevant social signals.

  20. Eye gaze during comprehension of American Sign Language by native and beginning signers.

    PubMed

    Emmorey, Karen; Thompson, Robin; Colvin, Rachael

    2009-01-01

    An eye-tracking experiment investigated where deaf native signers (N = 9) and hearing beginning signers (N = 10) look while comprehending a short narrative and a spatial description in American Sign Language produced live by a fluent signer. Both groups fixated primarily on the signer's face (more than 80% of the time) but differed with respect to fixation location. Beginning signers fixated on or near the signer's mouth, perhaps to better perceive English mouthing, whereas native signers tended to fixate on or near the eyes. Beginning signers shifted gaze away from the signer's face more frequently than native signers, but the pattern of gaze shifts was similar for both groups. When a shift in gaze occurred, the sign narrator was almost always looking at his or her hands and was most often producing a classifier construction. We conclude that joint visual attention and attention to mouthing (for beginning signers), rather than linguistic complexity or processing load, affect gaze fixation patterns during sign language comprehension.

  1. Differential Gaze Patterns on Eyes and Mouth During Audiovisual Speech Segmentation

    PubMed Central

    Lusk, Laina G.; Mitchel, Aaron D.

    2016-01-01

    Speech is inextricably multisensory: both auditory and visual components provide critical information for all aspects of speech processing, including speech segmentation, the visual components of which have been the target of a growing number of studies. In particular, a recent study (Mitchel and Weiss, 2014) established that adults can utilize facial cues (i.e., visual prosody) to identify word boundaries in fluent speech. The current study expanded upon these results, using an eye tracker to identify highly attended facial features of the audiovisual display used in Mitchel and Weiss (2014). Subjects spent the most time watching the eyes and mouth. A significant trend in gaze durations was found with the longest gaze duration on the mouth, followed by the eyes and then the nose. In addition, eye-gaze patterns changed across familiarization as subjects learned the word boundaries, showing decreased attention to the mouth in later blocks while attention on other facial features remained consistent. These findings highlight the importance of the visual component of speech processing and suggest that the mouth may play a critical role in visual speech segmentation. PMID:26869959

  2. Face exploration dynamics differentiate men and women.

    PubMed

    Coutrot, Antoine; Binetti, Nicola; Harrison, Charlotte; Mareschal, Isabelle; Johnston, Alan

    2016-11-01

    The human face is central to our everyday social interactions. Recent studies have shown that while gazing at faces, each one of us has a particular eye-scanning pattern that is highly stable across time. Although variables such as culture or personality have been shown to modulate gaze behavior, we still don't know what shapes these idiosyncrasies. Moreover, most previous observations rely on static analyses of small-sized eye-position data sets averaged across time. Here, we probe the temporal dynamics of gaze to explore what information can be extracted about the observers and what is being observed. Controlling for any stimuli effect, we demonstrate that among many individual characteristics, the gender of both the participant (gazer) and the person being observed (actor) are the factors that most influence gaze patterns during face exploration. We record and exploit the largest set of eye-tracking data (405 participants, 58 nationalities) from participants watching videos of another person. Using novel data-mining techniques, we show that female gazers follow a much more exploratory scanning strategy than males. Moreover, female gazers watching female actresses look more at the eye on the left side. These results have strong implications in every field using gaze-based models, from computer vision to clinical psychology.

  3. Cognitive context detection in UAS operators using eye-gaze patterns on computer screens

    NASA Astrophysics Data System (ADS)

    Mannaru, Pujitha; Balasingam, Balakumar; Pattipati, Krishna; Sibley, Ciara; Coyne, Joseph

    2016-05-01

    In this paper, we demonstrate the use of eye-gaze metrics of unmanned aerial systems (UAS) operators as effective indices of their cognitive workload. Our analyses are based on an experiment where twenty participants performed pre-scripted UAS missions of three different difficulty levels by interacting with two custom designed graphical user interfaces (GUIs) that are displayed side by side. First, we compute several eye-gaze metrics, traditional eye movement metrics as well as newly proposed ones, and analyze their effectiveness as cognitive classifiers. Most of the eye-gaze metrics are computed by dividing the computer screen into "cells". Then, we perform several analyses in order to select metrics for effective cognitive context classification related to our specific application; the objectives of these analyses are to (i) identify appropriate ways to divide the screen into cells; (ii) select appropriate metrics for training and classification of cognitive features; and (iii) identify a suitable classification method.
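    As a rough illustration of the cell-based approach described above, the sketch below divides a screen into a uniform grid and tallies gaze samples per cell; the grid size, screen dimensions, and function names are illustrative assumptions, not details taken from the paper.

```python
from collections import Counter

def cell_index(x, y, screen_w, screen_h, n_cols, n_rows):
    # Map a pixel-coordinate gaze sample to its (row, col) grid cell,
    # clamping samples on the far edge into the last cell.
    col = min(int(x / screen_w * n_cols), n_cols - 1)
    row = min(int(y / screen_h * n_rows), n_rows - 1)
    return row, col

def samples_per_cell(points, screen_w, screen_h, n_cols, n_rows):
    # Tally gaze samples per cell; per-cell counts are the raw material
    # for cell-based metrics such as dwell distributions across the GUI.
    return Counter(cell_index(x, y, screen_w, screen_h, n_cols, n_rows)
                   for x, y in points)
```

    Per-cell counts like these can then be fed to whatever classifier is chosen in step (iii).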

  4. Modeling eye gaze patterns in clinician-patient interaction with lag sequential analysis.

    PubMed

    Montague, Enid; Xu, Jie; Chen, Ping-Yu; Asan, Onur; Barrett, Bruce P; Chewning, Betty

    2011-10-01

    The aim of this study was to examine whether lag sequential analysis could be used to describe eye gaze orientation between clinicians and patients in the medical encounter. This topic is particularly important as new technologies are implemented into multiuser health care settings in which trust is critical and nonverbal cues are integral to achieving trust. This analysis method could lead to design guidelines for technologies and more effective assessments of interventions. Nonverbal communication patterns are important aspects of clinician-patient interactions and may affect patient outcomes. The eye gaze behaviors of clinicians and patients in 110 videotaped medical encounters were analyzed using the lag sequential method to identify significant behavior sequences. Lag sequential analysis included both event-based lag and time-based lag. Results from event-based lag analysis showed that the patient's gaze followed that of the clinician, whereas the clinician's gaze did not follow the patient's. Time-based sequential analysis showed that responses from the patient usually occurred within 2 s after the initial behavior of the clinician. Our data suggest that the clinician's gaze significantly affects the medical encounter but that the converse is not true. Findings from this research have implications for the design of clinical work systems and modeling interactions. Similar research methods could be used to identify different behavior patterns in clinical settings (physical layout, technology, etc.) to facilitate and evaluate clinical work system designs.
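    The event-based part of the method can be sketched as counting lag-1 transitions between coded gaze behaviors and comparing them with the counts expected if the two positions were independent. This minimal version (the function name and the independence baseline are our assumptions) omits the significance testing used in the study.

```python
from collections import Counter

def lag_sequential(events, lag=1):
    # Observed vs. expected lag-k transition counts for a coded event
    # stream, e.g. ["clinician_gaze_patient", "patient_gaze_chart", ...].
    # Expected counts assume the two positions are independent; a large
    # observed-minus-expected gap flags a candidate behavior sequence.
    observed = Counter(zip(events, events[lag:]))
    freq = Counter(events)
    total = len(events)
    n_pairs = total - lag
    return {pair: (obs, freq[pair[0]] / total * freq[pair[1]] / total * n_pairs)
            for pair, obs in observed.items()}
```

    In practice each (observed, expected) pair would be converted to an adjusted residual or z-score before a sequence is declared significant.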

  5. On the use of hidden Markov models for gaze pattern modeling

    NASA Astrophysics Data System (ADS)

    Mannaru, Pujitha; Balasingam, Balakumar; Pattipati, Krishna; Sibley, Ciara; Coyne, Joseph

    2016-05-01

    Some of the conventional metrics derived from gaze patterns (on computer screens) to study visual attention, engagement and fatigue are saccade counts, nearest neighbor index (NNI) and duration of dwells/fixations. Each of these metrics has drawbacks in modeling the behavior of gaze patterns; one such drawback comes from the fact that some portions on the screen are not as important as others. This is addressed by computing the eye gaze metrics corresponding to important areas of interest (AOI) on the screen. There are some challenges in developing accurate AOI-based metrics: firstly, the definition of an AOI is always fuzzy; secondly, it is possible that the AOI may change adaptively over time. Hence, there is a need to introduce eye-gaze metrics that are aware of the AOI in the field of view; at the same time, the new metrics should be able to automatically select the AOI based on the nature of the gazes. In this paper, we propose a novel way of computing NNI based on continuous hidden Markov models (HMM) that model the gazes as 2D Gaussian observations (x-y coordinates of the gaze) with the mean at the center of the AOI and covariance that is related to the concentration of gazes. The proposed modeling allows us to accurately compute the NNI metric in the presence of multiple, undefined AOI on the screen and of intermittent casual gazing that is modeled as random gazes on the screen.
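    For reference, the conventional NNI that the paper takes as its starting point is the Clark-Evans ratio of the mean observed nearest-neighbor distance to the distance expected for the same number of points scattered uniformly over the viewing area. The sketch below is a plain-Python illustration of that baseline metric, not the authors' HMM-based variant.

```python
import math

def nearest_neighbor_index(points, area):
    # Clark-Evans NNI: mean observed nearest-neighbor distance divided by
    # the expectation 0.5 * sqrt(area / n) for n uniformly random points.
    # NNI < 1 suggests clustered gazes (dwells on an AOI); NNI near 1
    # suggests spatially random gazing over the region.
    n = len(points)
    observed = sum(
        min(math.hypot(xi - xj, yi - yj)
            for j, (xj, yj) in enumerate(points) if j != i)
        for i, (xi, yi) in enumerate(points)) / n
    expected = 0.5 * math.sqrt(area / n)
    return observed / expected
```

    The HMM formulation in the paper effectively replaces the single uniform baseline with per-AOI 2D Gaussians, so the metric stays meaningful when several undefined AOIs coexist.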

  6. Gaze stability of observers watching Op Art pictures.

    PubMed

    Zanker, Johannes M; Doyle, Melanie; Walker, Robin

    2003-01-01

    It has been the matter of some debate why we can experience vivid dynamic illusions when looking at static pictures composed from simple black and white patterns. The impression of illusory motion is particularly strong when viewing some of the works of 'Op Artists', such as Bridget Riley's painting Fall. Explanations of the illusory motion have ranged from retinal to cortical mechanisms, and an important role has been attributed to eye movements. To assess the possible contribution of eye movements to the illusory-motion percept we studied the strength of the illusion under different viewing conditions, and analysed the gaze stability of observers viewing the Riley painting and control patterns that do not produce the illusion. Whereas the illusion was reduced, but not abolished, when watching the painting through a pinhole, which reduces the effects of accommodation, it was not perceived in flash afterimages, suggesting an important role for eye movements in generating the illusion for this image. Recordings of eye movements revealed an abundance of small involuntary saccades when looking at the Riley pattern, despite the fact that gaze was kept within the dedicated fixation region. The frequency and particular characteristics of these rapid eye movements can vary considerably between different observers, but, although there was a tendency for gaze stability to deteriorate while viewing a Riley painting, there was no significant difference in saccade frequency between the stimulus and control patterns. Theoretical considerations indicate that such small image displacements can generate patterns of motion signals in a motion-detector network, which may serve as a simple and sufficient, but not necessarily exclusive, explanation for the illusion. Why such image displacements lead to perceptual results with a group of Op Art and similar patterns, but remain invisible for other stimuli, is discussed.

  7. The Pattern of Sexual Interest of Female-to-Male Transsexual Persons With Gender Identity Disorder Does Not Resemble That of Biological Men: An Eye-Tracking Study.

    PubMed

    Tsujimura, Akira; Kiuchi, Hiroshi; Soda, Tetsuji; Takezawa, Kentaro; Fukuhara, Shinichiro; Takao, Tetsuya; Sekiguchi, Yuki; Iwasa, Atsushi; Nonomura, Norio; Miyagawa, Yasushi

    2017-09-01

    Very little has been elucidated about sexual interest in female-to-male (FtM) transsexual persons. To investigate the sexual interest of FtM transsexual persons vs that of men using an eye-tracking system. The study included 15 men and 13 FtM transsexual subjects who viewed three sexual videos (clip 1: sexy clothed young woman kissing the region of the male genitals covered by underwear; clip 2: naked actor and actress kissing and touching each other; and clip 3: heterosexual intercourse between a naked actor and actress) in which several regions were designated for eye-gaze analysis in each frame. The designation of each region was not visible to the participants. Visual attention was measured across each designated region according to gaze duration. For clip 1, there was a statistically significant sex difference in the viewing pattern between men and FtM transsexual subjects. Longest gaze time was for the eyes of the actress in men, whereas it was for non-human regions in FtM transsexual subjects. For clip 2, there also was a statistically significant sex difference. Longest gaze time was for the face of the actress in men, whereas it was for non-human regions in FtM transsexual subjects, and there was a significant difference between regions with longest gaze time. The most apparent difference was in the gaze time for the body of the actor: the percentage of time spent gazing at the body of the actor was 8.35% in FtM transsexual subjects, whereas it was only 0.03% in men. For clip 3, there were no statistically significant differences in viewing patterns between men and FtM transsexual subjects, although longest gaze time was for the face of the actress in men, whereas it was for non-human regions in FtM transsexual subjects. We suggest that the characteristics of sexual interest of FtM transsexual persons are not the same as those of biological men. Tsujimura A, Kiuchi H, Soda T, et al. 
The Pattern of Sexual Interest of Female-to-Male Transsexual Persons With Gender Identity Disorder Does Not Resemble That of Biological Men: An Eye-Tracking Study. Sex Med 2017;5:e169-e174. Copyright © 2017. Published by Elsevier Inc.

  8. Neural Mechanisms Underlying Conscious and Unconscious Gaze-Triggered Attentional Orienting in Autism Spectrum Disorder

    PubMed Central

    Sato, Wataru; Kochiyama, Takanori; Uono, Shota; Yoshimura, Sayaka; Toichi, Motomi

    2017-01-01

    Impaired joint attention represents the core clinical feature of autism spectrum disorder (ASD). Behavioral studies have suggested that gaze-triggered attentional orienting is intact in response to supraliminally presented eyes but impaired in response to subliminally presented eyes in individuals with ASD. However, the neural mechanisms underlying conscious and unconscious gaze-triggered attentional orienting remain unclear. We investigated this issue in ASD and typically developing (TD) individuals using event-related functional magnetic resonance imaging. The participants viewed cue stimuli of averted or straight eye gaze direction presented either supraliminally or subliminally and then localized a target. Reaction times were shorter when eye-gaze cues were directionally valid compared with when they were neutral under the supraliminal condition in both groups; the same pattern was found in the TD group but not the ASD group under the subliminal condition. The temporo–parieto–frontal regions showed stronger activation in response to averted eyes than to straight eyes in both groups under the supraliminal condition. The left amygdala was more activated while viewing averted vs. straight eyes in the TD group than in the ASD group under the subliminal condition. These findings provide an explanation for the neural mechanisms underlying the impairment in unconscious but not conscious gaze-triggered attentional orienting in individuals with ASD and suggest possible neurological and behavioral interventions to facilitate their joint attention behaviors. PMID:28701942

  9. Gaze-evoked nystagmus induced by alcohol intoxication.

    PubMed

    Romano, Fausto; Tarnutzer, Alexander A; Straumann, Dominik; Ramat, Stefano; Bertolini, Giovanni

    2017-03-15

    The cerebellum is the core structure controlling gaze stability. Chronic cerebellar diseases and acute alcohol intoxication affect cerebellar function, inducing, among others, gaze instability as gaze-evoked nystagmus. Gaze-evoked nystagmus is characterized by increased centripetal eye-drift. It is used as an important diagnostic sign for patients with cerebellar degeneration and to assess the 'driving while intoxicated' condition. We quantified the effect of alcohol on gaze-holding using an approach allowing, for the first time, the comparison of deficits induced by alcohol intoxication and cerebellar degeneration. Our results showed that alcohol intoxication induces a two-fold increase of centripetal eye-drift. We establish analysis techniques for using controlled alcohol intake as a model to support the study of cerebellar deficits. The observed similarity between the effect of alcohol and the clinical signs observed in cerebellar patients suggests a possible pathomechanism for gaze-holding deficits. Gaze-evoked nystagmus (GEN) is an ocular-motor finding commonly observed in cerebellar disease, characterized by increased centripetal eye-drift with centrifugal correcting saccades at eccentric gaze. With cerebellar degeneration being a rare and clinically heterogeneous disease, data from patients are limited. We hypothesized that a transient inhibition of cerebellar function by defined amounts of alcohol may provide a suitable model to study gaze-holding deficits in cerebellar disease. We recorded gaze-holding at varying horizontal eye positions in 15 healthy participants before and 30 min after alcohol intake required to reach 0.6‰ blood alcohol content (BAC). Changes in ocular-motor behaviour were quantified measuring eye-drift velocity as a continuous function of gaze eccentricity over a large range (±40 deg) of horizontal gaze angles and characterized using a two-parameter tangent model. 
The effect of alcohol on gaze stability was assessed analysing: (1) overall effects on the gaze-holding system, (2) specific effects on each eye and (3) differences between gaze angles in the temporal and nasal hemifields. For all subjects, alcohol consumption induced gaze instability, causing a two-fold increase [2.21 (0.55), median (median absolute deviation); P = 0.002] of eye-drift velocity at all eccentricities. Results were confirmed analysing each eye and hemifield independently. The alcohol-induced transient global deficit in gaze-holding matched the pattern previously described in patients with late-onset cerebellar degeneration. Controlled intake of alcohol seems a suitable disease model to study cerebellar GEN. With alcohol resulting in global cerebellar hypofunction, we hypothesize that patients matching the gaze-holding behaviour observed here suffered from diffuse deficits in the gaze-holding system as well. © 2016 The Authors. The Journal of Physiology © 2016 The Physiological Society.

  10. What Do Eye Gaze Metrics Tell Us about Motor Imagery?

    PubMed

    Poiroux, Elodie; Cavaro-Ménard, Christine; Leruez, Stéphanie; Lemée, Jean Michel; Richard, Isabelle; Dinomais, Mickael

    2015-01-01

    Many of the brain structures involved in performing real movements also have increased activity during imagined movements or during motor observation, and this could be the neural substrate underlying the effects of motor imagery in motor learning or motor rehabilitation. In the absence of any objective physiological method of measurement, it is currently impossible to be sure that the patient is indeed performing the task as instructed. Eye gaze recording during a motor imagery task could be a possible way to "spy" on the activity an individual is really engaged in. The aim of the present study was to compare the pattern of eye movement metrics during motor observation, visual and kinesthetic motor imagery (VI, KI), target fixation, and mental calculation. Twenty-two healthy subjects (16 females and 6 males) were required to perform tests in five conditions using imagery in the Box and Block Test tasks following the procedure described by Liepert et al. Eye movements were analysed by a non-invasive oculometric measure (SMI RED250 system). Two parameters describing gaze pattern were calculated: the index of ocular mobility (saccade duration over saccade + fixation duration) and the number of midline crossings (i.e., the number of times the subject's gaze crossed the midline of the screen when performing the different tasks). Both parameters were significantly different between visual imagery and kinesthetic imagery, visual imagery and mental calculation, and visual imagery and target fixation. For the first time we were able to show that eye movement patterns are different during VI and KI tasks. Our results suggest gaze metric parameters could be used as an objective unobtrusive approach to assess engagement in a motor imagery task. Further studies should define how oculomotor parameters could be used as an indicator of the rehabilitation task a patient is engaged in.
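    The two gaze parameters defined in the abstract are simple to compute from labeled eye-movement episodes; a minimal sketch, with function names and the handling of on-midline samples as our assumptions:

```python
def index_of_ocular_mobility(saccade_dur, fixation_dur):
    # Saccade duration over total saccade + fixation duration,
    # per the definition given in the abstract.
    return saccade_dur / (saccade_dur + fixation_dur)

def midline_crossings(x_positions, midline):
    # Count how often the horizontal gaze position switches sides of the
    # screen midline; samples exactly on the midline are ignored.
    sides = [x < midline for x in x_positions if x != midline]
    return sum(1 for a, b in zip(sides, sides[1:]) if a != b)
```

    Both values are per-trial scalars, so comparing conditions (VI vs. KI, etc.) reduces to standard paired statistics on these outputs.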

  11. Modeling Eye Gaze Patterns in Clinician-Patient Interaction with Lag Sequential Analysis

    PubMed Central

    Montague, E; Xu, J; Asan, O; Chen, P; Chewning, B; Barrett, B

    2011-01-01

    Objective The aim of this study was to examine whether lag-sequential analysis could be used to describe eye gaze orientation between clinicians and patients in the medical encounter. This topic is particularly important as new technologies are implemented into multi-user health care settings where trust is critical and nonverbal cues are integral to achieving trust. This analysis method could lead to design guidelines for technologies and more effective assessments of interventions. Background Nonverbal communication patterns are important aspects of clinician-patient interactions and may impact patient outcomes. Method Eye gaze behaviors of clinicians and patients in 110-videotaped medical encounters were analyzed using the lag-sequential method to identify significant behavior sequences. Lag-sequential analysis included both event-based lag and time-based lag. Results Results from event-based lag analysis showed that the patients’ gaze followed that of clinicians, while clinicians did not follow patients. Time-based sequential analysis showed that responses from the patient usually occurred within two seconds after the initial behavior of the clinician. Conclusion Our data suggest that the clinician’s gaze significantly affects the medical encounter but not the converse. Application Findings from this research have implications for the design of clinical work systems and modeling interactions. Similar research methods could be used to identify different behavior patterns in clinical settings (physical layout, technology, etc.) to facilitate and evaluate clinical work system designs. PMID:22046723

  12. Attention to gaze and emotion in schizophrenia.

    PubMed

    Schwartz, Barbara L; Vaidya, Chandan J; Howard, James H; Deutsch, Stephen I

    2010-11-01

    Individuals with schizophrenia have difficulty interpreting social and emotional cues such as facial expression, gaze direction, body position, and voice intonation. Nonverbal cues are powerful social signals but are often processed implicitly, outside the focus of attention. The aim of this research was to assess implicit processing of social cues in individuals with schizophrenia. Patients with schizophrenia or schizoaffective disorder and matched controls performed a primary task of word classification with social cues in the background. Participants were asked to classify target words (LEFT/RIGHT) by pressing a key that corresponded to the word, in the context of facial expressions with eye gaze averted to the left or right. Although facial expression and gaze direction were irrelevant to the task, these facial cues influenced word classification performance. Participants were slower to classify target words (e.g., LEFT) that were incongruent to gaze direction (e.g., eyes averted to the right) compared to target words (e.g., LEFT) that were congruent to gaze direction (e.g., eyes averted to the left), but this only occurred for expressions of fear. This pattern did not differ for patients and controls. The results showed that threat-related signals capture the attention of individuals with schizophrenia. These data suggest that implicit processing of eye gaze and fearful expressions is intact in schizophrenia. (c) 2010 APA, all rights reserved.

  13. A Support System for Mouse Operations Using Eye-Gaze Input

    NASA Astrophysics Data System (ADS)

    Abe, Kiyohiko; Nakayama, Yasuhiro; Ohi, Shoichi; Ohyama, Minoru

    We have developed an eye-gaze input system for people with severe physical disabilities, such as amyotrophic lateral sclerosis (ALS) patients. This system utilizes a personal computer and a home video camera to detect eye-gaze under natural light. The system detects both vertical and horizontal eye-gaze by simple image analysis, and does not require special image processing units or sensors. Our conventional eye-gaze input system can detect horizontal eye-gaze with a high degree of accuracy. However, it can only classify vertical eye-gaze into 3 directions (up, middle and down). In this paper, we propose a new method for vertical eye-gaze detection that utilizes the limbus tracking method. As a result, our new eye-gaze input system can detect the two-dimensional coordinates of the user's gazing point. Using this method, we develop a new support system for mouse operations that can move the mouse cursor to the user's gazing point.
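    The final step, moving the cursor to the gazing point, reduces to mapping a normalized gaze estimate onto screen pixels. The clamping behavior and function name below are illustrative assumptions rather than the authors' implementation.

```python
def gaze_to_cursor(gaze_x, gaze_y, screen_w, screen_h):
    # Map a normalized gaze estimate in [0, 1] x [0, 1] to pixel
    # coordinates, clamping noisy out-of-range estimates to the edges.
    px = min(max(gaze_x, 0.0), 1.0) * (screen_w - 1)
    py = min(max(gaze_y, 0.0), 1.0) * (screen_h - 1)
    return int(round(px)), int(round(py))
```

    A real system would additionally smooth successive estimates (e.g., a moving average) so the cursor does not jitter with every micro-movement of the eye.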

  14. Upward gaze and head deviation with frontal eye field stimulation.

    PubMed

    Kaiboriboon, Kitti; Lüders, Hans O; Miller, Jonathan P; Leigh, R John

    2012-03-01

    Using electrical stimulation of the deep, most caudal part of the right frontal eye field (FEF), we demonstrate a novel pattern of vertical (upward) eye movement that was previously thought possible only by stimulating both frontal eye fields simultaneously. If stimulation was started when the subject looked laterally, the initial eye movement was back to the midline, followed by upward deviation. Our finding challenges the current view of topological organisation in the human FEF and may have general implications for concepts of topological organisation of the motor cortex, since sustained stimulation also induced upward head movements as a component of the vertical gaze shift. [Published with video sequences].

  15. The Eyes Are the Windows to the Mind: Direct Eye Gaze Triggers the Ascription of Others' Minds.

    PubMed

    Khalid, Saara; Deska, Jason C; Hugenberg, Kurt

    2016-12-01

    Eye gaze is a potent source of social information with direct eye gaze signaling the desire to approach and averted eye gaze signaling avoidance. In the current work, we proposed that eye gaze signals whether or not to impute minds into others. Across four studies, we manipulated targets' eye gaze (i.e., direct vs. averted eye gaze) and measured explicit mind ascriptions (Study 1a, Study 1b, and Study 2) and beliefs about the likelihood of targets having mind (Study 3). In all four studies, we find novel evidence that the ascription of sophisticated humanlike minds to others is signaled by the display of direct eye gaze relative to averted eye gaze. Moreover, we provide evidence suggesting that this differential mentalization is due, at least in part, to beliefs that direct gaze targets are more likely to instigate social interaction. In short, eye contact triggers mind perception. © 2016 by the Society for Personality and Social Psychology, Inc.

  16. Gaze and visual search strategies of children with Asperger syndrome/high functioning autism viewing a magic trick.

    PubMed

    Joosten, Annette; Girdler, Sonya; Albrecht, Matthew A; Horlin, Chiara; Falkmer, Marita; Leung, Denise; Ordqvist, Anna; Fleischer, Håkan; Falkmer, Torbjörn

    2016-01-01

    To examine visual search patterns and strategies used by children with and without Asperger syndrome/high functioning autism (AS/HFA) while watching a magic trick. Limited responsivity to gaze cues is hypothesised to contribute to social deficits in children with AS/HFA. Twenty-one children with AS/HFA and 31 matched peers viewed a video of a gaze-cued magic trick twice. Between the viewings, they were informed about how the trick was performed. Participants' eye movements were recorded using a head-mounted eye-tracker. Children with AS/HFA looked less frequently and had shorter fixations on the magician's direct and averted gazes during both viewings, and looked more frequently at non-gaze-cued objects and at areas outside the magician's face. After being informed of how the trick was conducted, both groups made fewer fixations on gaze-cued objects and direct gaze. Information may enhance effective visual strategies in children with and without AS/HFA.

  17. Why Do We Move Our Eyes while Trying to Remember? The Relationship between Non-Visual Gaze Patterns and Memory

    ERIC Educational Resources Information Center

    Micic, Dragana; Ehrlichman, Howard; Chen, Rebecca

    2010-01-01

    Non-visual gaze patterns (NVGPs) involve saccades and fixations that spontaneously occur in cognitive activities that are not ostensibly visual. While reasons for their appearance remain obscure, convergent empirical evidence suggests that NVGPs change according to processing requirements of tasks. We examined NVGPs in tasks with long-term memory…

  18. The Disturbance of Gaze in Progressive Supranuclear Palsy: Implications for Pathogenesis

    PubMed Central

    Chen, Athena L.; Riley, David E.; King, Susan A.; Joshi, Anand C.; Serra, Alessandro; Liao, Ke; Cohen, Mark L.; Otero-Millan, Jorge; Martinez-Conde, Susana; Strupp, Michael; Leigh, R. John

    2010-01-01

    Progressive supranuclear palsy (PSP) is a disease of later life that is currently regarded as a form of neurodegenerative tauopathy. Disturbance of gaze is a cardinal clinical feature of PSP that often helps clinicians to establish the diagnosis. Since the neurobiology of gaze control is now well understood, it is possible to use eye movements as investigational tools to understand aspects of the pathogenesis of PSP. In this review, we summarize each disorder of gaze control that occurs in PSP, drawing on our studies of 50 patients, and on reports from other laboratories that have measured the disturbances of eye movements. When these gaze disorders are approached by considering each functional class of eye movements and its neurobiological basis, a distinct pattern of eye movement deficits emerges that provides insight into the pathogenesis of PSP. Although some aspects of all forms of eye movements are affected in PSP, the predominant defects concern vertical saccades (slow and hypometric, both up and down), impaired vergence, and inability to modulate the linear vestibulo-ocular reflex appropriately for viewing distance. These vertical and vergence eye movements habitually work in concert to enable visuomotor skills that are important during locomotion with the hands free. Taken with the prominent early feature of falls, these findings suggest that PSP tauopathy impairs a recently evolved neural system concerned with bipedal locomotion in an erect posture and frequent gaze shifts between the distant environment and proximate hands. This approach provides a conceptual framework that can be used to address the nosological challenge posed by overlapping clinical and neuropathological features of neurodegenerative tauopathies. PMID:21188269

  19. Gaze patterns reveal how situation models and text representations contribute to episodic text memory.

    PubMed

    Johansson, Roger; Oren, Franziska; Holmqvist, Kenneth

    2018-06-01

    When recalling something you have previously read, to what degree will such episodic remembering activate a situation model of described events versus a memory representation of the text itself? The present study was designed to address this question by recording eye movements of participants who recalled previously read texts while looking at a blank screen. An accumulating body of research has demonstrated that spontaneous eye movements occur during episodic memory retrieval and that fixation locations from such gaze patterns to a large degree overlap with the visuospatial layout of the recalled information. Here we used this phenomenon to investigate to what degree participants' gaze patterns corresponded with the visuospatial configuration of the text itself versus a visuospatial configuration described in it. The texts to be recalled were scene descriptions, where the spatial configuration of the scene content was manipulated to be either congruent or incongruent with the spatial configuration of the text itself. Results show that participants' gaze patterns were more likely to correspond with a visuospatial representation of the described scene than with a visuospatial representation of the text itself, but also that the contribution of those representations of space is sensitive to the text content. This is the first demonstration that eye movements can be used to discriminate on which representational level texts are remembered and the findings provide novel insight into the underlying dynamics in play. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Brain stem omnipause neurons and the control of combined eye-head gaze saccades in the alert cat.

    PubMed

    Paré, M; Guitton, D

    1998-06-01

    When the head is unrestrained, rapid displacements of the visual axis-gaze shifts (eye-re-space)-are made by coordinated movements of the eyes (eye-re-head) and head (head-re-space). To address the problem of the neural control of gaze shifts, we studied and contrasted the discharges of omnipause neurons (OPNs) during a variety of combined eye-head gaze shifts and head-fixed eye saccades executed by alert cats. OPNs discharged tonically during intersaccadic intervals and at a reduced level during slow perisaccadic gaze movements sometimes accompanying saccades. Their activity ceased for the duration of the saccadic gaze shifts the animal executed, either by head-fixed eye saccades alone or by combined eye-head movements. This was true for all types of gaze shifts studied: active movements to visual targets; passive movements induced by whole-body rotation or by head rotation about stationary body; and electrically evoked movements by stimulation of the caudal part of the superior colliculus (SC), a central structure for gaze control. For combined eye-head gaze shifts, the OPN pause was therefore not correlated to the eye-in-head trajectory. For instance, in active gaze movements, the end of the pause was better correlated with the gaze end than with either the eye saccade end or the time of eye counterrotation. The hypothesis that cat OPNs participate in controlling gaze shifts is supported by these results, and also by the observation that the movements of both the eyes and the head were transiently interrupted by stimulation of OPNs during gaze shifts. However, we found that the OPN pause could be dissociated from the gaze-motor-error signal producing the gaze shift. First, OPNs resumed discharging when perturbation of head motion briefly interrupted a gaze shift before its intended amplitude was attained. 
Second, stimulation of caudal SC sites in head-free cat elicited large head-free gaze shifts consistent with the creation of a large gaze-motor-error signal. However, stimulation of the same sites in head-fixed cat produced small "goal-directed" eye saccades, and OPNs paused only for the duration of the latter; neither a pause nor an eye movement occurred when the same stimulation was applied with the eyes at the goal location. We conclude that OPNs can be controlled by neither a simple eye control system nor an absolute gaze control system. Our data cannot be accounted for by existing models describing the control of combined eye-head gaze shifts and therefore put new constraints on future models, which will have to incorporate all the various signals that act synergistically to control gaze shifts.

  1. Optimal Eye-Gaze Fixation Position for Face-Related Neural Responses

    PubMed Central

    Zerouali, Younes; Lina, Jean-Marc; Jemel, Boutheina

    2013-01-01

    It is generally agreed that some features of a face, namely the eyes, are more salient than others as indexed by behavioral diagnosticity, gaze-fixation patterns and evoked-neural responses. However, because previous studies used unnatural stimuli, there is no evidence so far that the early encoding of a whole face in the human brain is based on the eyes or other facial features. To address this issue, scalp electroencephalogram (EEG) and eye gaze-fixations were recorded simultaneously in a gaze-contingent paradigm while observers viewed faces. We found that the N170 indexing the earliest face-sensitive response in the human brain was the largest when the fixation position is located around the nasion. Interestingly, for inverted faces, this optimal fixation position was more variable, but mainly clustered in the upper part of the visual field (around the mouth). These observations extend the findings of recent behavioral studies, suggesting that the early encoding of a face, as indexed by the N170, is not driven by the eyes per se, but rather arises from a general perceptual setting (upper-visual field advantage) coupled with the alignment of a face stimulus to a stored face template. PMID:23762224

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoon, Hong-Jun; Carmichael, Tandy; Tourassi, Georgia

Two people may analyze a visual scene in two completely different ways. Our study sought to determine whether human gaze may be used to establish the identity of an individual. To accomplish this objective, we investigated the gaze patterns of twelve individuals viewing different still images with different spatial relationships. Specifically, we created 5 visual dot-pattern tests to be shown on a standard computer monitor. These tests challenged the viewer's capacity to distinguish proximity, alignment, and perceptual organization. Each test included 50 images of varying difficulty (total of 250 images). Eye-tracking data were collected from each individual while taking the tests. The eye-tracking data were converted into gaze velocities and analyzed with Hidden Markov Models to develop personalized gaze profiles. Using leave-one-out cross-validation, we observed that these personalized profiles could differentiate among the 12 users with classification accuracy ranging between 53% and 76%, depending on the test. This was statistically significantly better than random guessing (i.e., 8.3%, or 1 out of 12). Classification accuracy was higher for the tests where the users' average gaze velocity per case was lower. The study findings support the feasibility of using gaze as a biometric or personalized biomarker. These findings could have implications in Radiology training and the development of personalized e-learning environments.
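
    The identification pipeline this record describes (gaze samples → velocities → per-user statistical profile → highest-likelihood assignment) can be sketched in a few lines. This is a minimal illustration, not the authors' code: a single-Gaussian likelihood stands in for the study's Hidden Markov Models, and every function name, rate, and number below is an assumption.

    ```python
    import math

    def gaze_velocities(fixations, dt=1.0 / 60):
        # fixations: list of (x, y) screen positions sampled at a fixed rate;
        # returns point-to-point gaze speeds (pixels per second).
        return [
            math.hypot(x2 - x1, y2 - y1) / dt
            for (x1, y1), (x2, y2) in zip(fixations, fixations[1:])
        ]

    def fit_profile(velocities):
        # Single-Gaussian stand-in for the paper's per-user HMM profile.
        n = len(velocities)
        mean = sum(velocities) / n
        var = sum((v - mean) ** 2 for v in velocities) / n or 1e-9
        return mean, var

    def log_likelihood(velocities, profile):
        # Log-probability of a velocity sequence under a user's profile.
        mean, var = profile
        return sum(
            -0.5 * math.log(2 * math.pi * var) - (v - mean) ** 2 / (2 * var)
            for v in velocities
        )

    def identify(sample, profiles):
        # Assign the sample to the enrolled user whose profile scores it highest.
        return max(profiles, key=lambda user: log_likelihood(sample, profiles[user]))
    ```

    In the study itself the per-user profiles were HMMs fitted to velocity sequences; an hmmlearn-style `model.fit` / `model.score` pair would slot into `fit_profile` / `log_likelihood` here without changing the surrounding pipeline.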

  4. Gaze training enhances laparoscopic technical skill acquisition and multi-tasking performance: a randomized, controlled study.

    PubMed

    Wilson, Mark R; Vine, Samuel J; Bright, Elizabeth; Masters, Rich S W; Defriend, David; McGrath, John S

    2011-12-01

The operating room environment is replete with stressors and distractions that increase the attention demands of what are already complex psychomotor procedures. Contemporary research in other fields (e.g., sport) has revealed that gaze training interventions may support the development of robust movement skills. The current study was designed to examine the utility of gaze training for technical laparoscopic skills and to test performance under multitasking conditions. Thirty medical trainees with no laparoscopic experience were divided randomly into one of three treatment groups: gaze trained (GAZE), movement trained (MOVE), and discovery learning/control (DISCOVERY). Participants were fitted with a Mobile Eye gaze registration system, which measures eye-line of gaze at 25 Hz. Training consisted of ten repetitions of the "eye-hand coordination" task from the LAP Mentor VR laparoscopic surgical simulator while receiving instruction and video feedback (specific to each treatment condition). After training, all participants completed a control test (designed to assess learning) and a multitasking transfer test, in which they completed the procedure while performing a concurrent tone-counting task. Not only did the GAZE group learn more quickly than the MOVE and DISCOVERY groups (faster completion times in the control test), but the performance difference was even more pronounced when multitasking. Differences in gaze control (target-locking fixations), rather than tool movement measures (tool path length), underpinned this performance advantage for GAZE training. These results suggest that although the GAZE intervention focused on training gaze behavior only, there were indirect benefits for movement behaviors and performance efficiency. Additionally, focusing on a single external target when learning, rather than on complex movement patterns, may have freed up attentional resources that could be applied to concurrent cognitive tasks.

  5. Specificity of Age-Related Differences in Eye-Gaze Following: Evidence From Social and Nonsocial Stimuli.

    PubMed

    Slessor, Gillian; Venturini, Cristina; Bonny, Emily J; Insch, Pauline M; Rokaszewicz, Anna; Finnerty, Ailbhe N

    2016-01-01

    Eye-gaze following is a fundamental social skill, facilitating communication. The present series of studies explored adult age-related differences in this key social-cognitive ability. In Study 1 younger and older adult participants completed a cueing task in which eye-gaze cues were predictive or non-predictive of target location. Another eye-gaze cueing task, assessing the influence of congruent and incongruent eye-gaze cues relative to trials which provided no cue to target location, was administered in Study 2. Finally, in Study 3 the eye-gaze cue was replaced by an arrow. In Study 1 older adults showed less evidence of gaze following than younger participants when required to strategically follow predictive eye-gaze cues and when making automatic shifts of attention to non-predictive eye-gaze cues. Findings from Study 2 suggested that, unlike younger adults, older participants showed no facilitation effect and thus did not follow congruent eye-gaze cues. They also had significantly weaker attentional costs than their younger counterparts. These age-related differences were not found in the non-social arrow cueing task. Taken together these findings suggest older adults do not use eye-gaze cues to engage in joint attention, and have specific social difficulties decoding critical information from the eye region. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. Eye-Hand Coordination during Visuomotor Adaptation with Different Rotation Angles

    PubMed Central

    Rentsch, Sebastian; Rand, Miya K.

    2014-01-01

    This study examined adaptive changes of eye-hand coordination during a visuomotor rotation task. Young adults made aiming movements to targets on a horizontal plane, while looking at the rotated feedback (cursor) of hand movements on a monitor. To vary the task difficulty, three rotation angles (30°, 75°, and 150°) were tested in three groups. All groups shortened hand movement time and trajectory length with practice. However, control strategies used were different among groups. The 30° group used proportionately more implicit adjustments of hand movements than other groups. The 75° group used more on-line feedback control, whereas the 150° group used explicit strategic adjustments. Regarding eye-hand coordination, timing of gaze shift to the target was gradually changed with practice from the late to early phase of hand movements in all groups, indicating an emerging gaze-anchoring behavior. Gaze locations prior to the gaze anchoring were also modified with practice from the cursor vicinity to an area between the starting position and the target. Reflecting various task difficulties, these changes occurred fastest in the 30° group, followed by the 75° group. The 150° group persisted in gazing at the cursor vicinity. These results suggest that the function of gaze control during visuomotor adaptation changes from a reactive control for exploring the relation between cursor and hand movements to a predictive control for guiding the hand to the task goal. That gaze-anchoring behavior emerged in all groups despite various control strategies indicates a generality of this adaptive pattern for eye-hand coordination in goal-directed actions. PMID:25333942
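
    The manipulation in this record — showing the cursor as a rotated version of the hand movement — is a plane rotation of the hand displacement vector. A minimal sketch follows; the function name and sign convention are illustrative, not taken from the study.

    ```python
    import math

    def rotate_feedback(dx, dy, angle_deg):
        # Rotate the hand displacement (dx, dy) by angle_deg to produce the
        # cursor displacement shown on the monitor, as in visuomotor rotation
        # paradigms (e.g., the 30, 75, and 150 degree conditions above).
        a = math.radians(angle_deg)
        return (dx * math.cos(a) - dy * math.sin(a),
                dx * math.sin(a) + dy * math.cos(a))
    ```

    To fully counteract a rotation of angle θ, the participant must learn to aim θ degrees in the opposite direction, which is why larger angles push subjects from implicit adjustment toward explicit re-aiming strategies.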

  7. Kinematics and eye-head coordination of gaze shifts evoked from different sites in the superior colliculus of the cat.

    PubMed

    Guillaume, Alain; Pélisson, Denis

    2006-12-15

    Shifting gaze requires precise coordination of eye and head movements. It is clear that the superior colliculus (SC) is involved with saccadic gaze shifts. Here we investigate its role in controlling both eye and head movements during gaze shifts. Gaze shifts of the same amplitude can be evoked from different SC sites by controlled electrical microstimulation. To describe how the SC coordinates the eye and the head, we compare the characteristics of these amplitude-matched gaze shifts evoked from different SC sites. We show that matched amplitude gaze shifts elicited from progressively more caudal sites are progressively slower and associated with a greater head contribution. Stimulation at more caudal SC sites decreased the peak velocity of the eye but not of the head, suggesting that the lower peak gaze velocity for the caudal sites is due to the increased contribution of the slower-moving head. Eye-head coordination across the SC motor map is also indicated by the relative latencies of the eye and head movements. For some amplitudes of gaze shift, rostral stimulation evoked eye movement before head movement, whereas this reversed with caudal stimulation, which caused the head to move before the eyes. These results show that gaze shifts of similar amplitude evoked from different SC sites are produced with different kinematics and coordination of eye and head movements. In other words, gaze shifts evoked from different SC sites follow different amplitude-velocity curves, with different eye-head contributions. These findings shed light on mechanisms used by the central nervous system to translate a high-level motor representation (a desired gaze displacement on the SC map) into motor commands appropriate for the involved body segments (the eye and the head).

  8. Dissociation of eye and head components of gaze shifts by stimulation of the omnipause neuron region.

    PubMed

    Gandhi, Neeraj J; Sparks, David L

    2007-07-01

    Natural movements often include actions integrated across multiple effectors. Coordinated eye-head movements are driven by a command to shift the line of sight by a desired displacement vector. Yet because extraocular and neck motoneurons are separate entities, the gaze shift command must be separated into independent signals for eye and head movement control. We report that this separation occurs, at least partially, at or before the level of pontine omnipause neurons (OPNs). Stimulation of the OPNs prior to and during gaze shifts temporally decoupled the eye and head components by inhibiting gaze and eye saccades. In contrast, head movements were consistently initiated before gaze onset, and ongoing head movements continued along their trajectories, albeit with some characteristic modulations. After stimulation offset, a gaze shift composed of an eye saccade, and a reaccelerated head movement was produced to preserve gaze accuracy. We conclude that signals subject to OPN inhibition produce the eye-movement component of a coordinated eye-head gaze shift and are not the only signals involved in the generation of the head component of the gaze shift.

  9. Eye movements and manual interception of ballistic trajectories: effects of law of motion perturbations and occlusions.

    PubMed

    Delle Monache, Sergio; Lacquaniti, Francesco; Bosco, Gianfranco

    2015-02-01

    Manual interceptions are known to depend critically on integration of visual feedback information and experience-based predictions of the interceptive event. Within this framework, coupling between gaze and limb movements might also contribute to the interceptive outcome, since eye movements afford acquisition of high-resolution visual information. We investigated this issue by analyzing subjects' head-fixed oculomotor behavior during manual interceptions. Subjects moved a mouse cursor to intercept computer-generated ballistic trajectories either congruent with Earth's gravity or perturbed with weightlessness (0 g) or hypergravity (2 g) effects. In separate sessions, trajectories were either fully visible or occluded before interception to enforce visual prediction. Subjects' oculomotor behavior was classified in terms of amounts of time they gazed at different visual targets and of overall number of saccades. Then, by way of multivariate analyses, we assessed the following: (1) whether eye movement patterns depended on targets' laws of motion and occlusions; and (2) whether interceptive performance was related to the oculomotor behavior. First, we found that eye movement patterns depended significantly on targets' laws of motion and occlusion, suggesting predictive mechanisms. Second, subjects coupled differently oculomotor and interceptive behavior depending on whether targets were visible or occluded. With visible targets, subjects made smaller interceptive errors if they gazed longer at the mouse cursor. Instead, with occluded targets, they achieved better performance by increasing the target's tracking accuracy and by avoiding gaze shifts near interception, suggesting that precise ocular tracking provided better trajectory predictions for the interceptive response.

  10. Intact unconscious processing of eye contact in schizophrenia.

    PubMed

    Seymour, Kiley; Rhodes, Gillian; Stein, Timo; Langdon, Robyn

    2016-03-01

    The perception of eye gaze is crucial for social interaction, providing essential information about another person's goals, intentions, and focus of attention. People with schizophrenia suffer a wide range of social cognitive deficits, including abnormalities in eye gaze perception. For instance, patients have shown an increased bias to misjudge averted gaze as being directed toward them. In this study we probed early unconscious mechanisms of gaze processing in schizophrenia using a technique known as continuous flash suppression. Previous research using this technique to render faces with direct and averted gaze initially invisible reveals that direct eye contact gains privileged access to conscious awareness in healthy adults. We found that patients, as with healthy control subjects, showed the same effect: faces with direct eye gaze became visible significantly faster than faces with averted gaze. This suggests that early unconscious processing of eye gaze is intact in schizophrenia and implies that any misjudgments of gaze direction must manifest at a later conscious stage of gaze processing where deficits and/or biases in attributing mental states to gaze and/or beliefs about being watched may play a role.

  11. Influence of Eye Gaze on Spoken Word Processing: An ERP Study with Infants

    ERIC Educational Resources Information Center

    Parise, Eugenio; Handl, Andrea; Palumbo, Letizia; Friederici, Angela D.

    2011-01-01

    Eye gaze is an important communicative signal, both as mutual eye contact and as referential gaze to objects. To examine whether attention to speech versus nonspeech stimuli in 4- to 5-month-olds (n = 15) varies as a function of eye gaze, event-related brain potentials were used. Faces with mutual or averted gaze were presented in combination with…

  12. Orienting to Eye Gaze and Face Processing

    ERIC Educational Resources Information Center

    Tipples, Jason

    2005-01-01

    The author conducted 7 experiments to examine possible interactions between orienting to eye gaze and specific forms of face processing. Participants classified a letter following either an upright or inverted face with averted, uninformative eye gaze. Eye gaze orienting effects were recorded for upright and inverted faces, irrespective of whether…

  13. Talking heads or talking eyes? Effects of head orientation and sudden onset gaze cues on attention capture.

    PubMed

    van der Wel, Robrecht P; Welsh, Timothy; Böckler, Anne

    2018-01-01

    The direction of gaze towards or away from an observer has immediate effects on attentional processing in the observer. Previous research indicates that faces with direct gaze are processed more efficiently than faces with averted gaze. We recently reported additional processing advantages for faces that suddenly adopt direct gaze (abruptly shift from averted to direct gaze) relative to static direct gaze (always in direct gaze), sudden averted gaze (abruptly shift from direct to averted gaze), and static averted gaze (always in averted gaze). Because changes in gaze orientation in previous study co-occurred with changes in head orientation, it was not clear if the effect is contingent on face or eye processing, or whether it requires both the eyes and the face to provide consistent information. The present study delineates the impact of head orientation, sudden onset motion cues, and gaze cues. Participants completed a target-detection task in which head position remained in a static averted or direct orientation while sudden onset motion and eye gaze cues were manipulated within each trial. The results indicate a sudden direct gaze advantage that resulted from the additive role of motion and gaze cues. Interestingly, the orientation of the face towards or away from the observer did not influence the sudden direct gaze effect, suggesting that eye gaze cues, not face orientation cues, are critical for the sudden direct gaze effect.

  14. Contribution of the frontal eye field to gaze shifts in the head-unrestrained rhesus monkey: neuronal activity.

    PubMed

    Knight, T A

    2012-12-06

    The frontal eye field (FEF) has a strong influence on saccadic eye movements with the head restrained. With the head unrestrained, eye saccades combine with head movements to produce large gaze shifts, and microstimulation of the FEF evokes both eye and head movements. To test whether the dorsomedial FEF provides commands for the entire gaze shift or its separate eye and head components, we recorded extracellular single-unit activity in monkeys trained to make large head-unrestrained gaze shifts. We recorded 80 units active during gaze shifts, and closely examined 26 of these that discharged a burst of action potentials that preceded horizontal gaze movements. These units were movement or visuomovement related and most exhibited open movement fields with respect to amplitude. To reveal the relations of burst parameters to gaze, eye, and/or head movement metrics, we used behavioral dissociations of gaze, eye, and head movements and linear regression analyses. The burst number of spikes (NOS) was strongly correlated with movement amplitude and burst temporal parameters were strongly correlated with movement temporal metrics for eight gaze-related burst neurons and five saccade-related burst neurons. For the remaining 13 neurons, the NOS was strongly correlated with the head movement amplitude, but burst temporal parameters were most strongly correlated with eye movement temporal metrics (head-eye-related burst neurons, HEBNs). These results suggest that FEF units do not encode a command for the unified gaze shift only; instead, different units may carry signals related to the overall gaze shift or its eye and/or head components. Moreover, the HEBNs exhibit bursts whose magnitude and timing may encode a head displacement signal and a signal that influences the timing of the eye saccade, thereby serving as a mechanism for coordinating the eye and head movements of a gaze shift. Copyright © 2012 IBRO. Published by Elsevier Ltd. All rights reserved.

  15. Neural synchrony examined with magnetoencephalography (MEG) during eye gaze processing in autism spectrum disorders: preliminary findings

    PubMed Central

    2014-01-01

    Background Gaze processing deficits are a seminal, early, and enduring behavioral deficit in autism spectrum disorder (ASD); however, a comprehensive characterization of the neural processes mediating abnormal gaze processing in ASD has yet to be conducted. Methods This study investigated whole-brain patterns of neural synchrony during passive viewing of direct and averted eye gaze in ASD adolescents and young adults (M Age  = 16.6) compared to neurotypicals (NT) (M Age  = 17.5) while undergoing magnetoencephalography. Coherence between each pair of 54 brain regions within each of three frequency bands (low frequency (0 to 15 Hz), beta (15 to 30 Hz), and low gamma (30 to 45 Hz)) was calculated. Results Significantly higher coherence and synchronization in posterior brain regions (temporo-parietal-occipital) across all frequencies was evident in ASD, particularly within the low 0 to 15 Hz frequency range. Higher coherence in fronto-temporo-parietal regions was noted in NT. A significantly higher number of low frequency cross-hemispheric synchronous connections and a near absence of right intra-hemispheric coherence in the beta frequency band were noted in ASD. Significantly higher low frequency coherent activity in bilateral temporo-parieto-occipital cortical regions and higher gamma band coherence in right temporo-parieto-occipital brain regions during averted gaze was related to more severe symptomology as reported on the Autism Diagnostic Interview-Revised (ADI-R). Conclusions The preliminary results suggest a pattern of aberrant connectivity that includes higher low frequency synchronization in posterior cortical regions, lack of long-range right hemispheric beta and gamma coherence, and decreased coherence in fronto-temporo-parietal regions necessary for orienting to shifts in eye gaze in ASD; a critical behavior essential for social communication. PMID:24976870
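
    The connectivity measure used in this record — coherence between pairs of region time series, averaged within a frequency band — can be sketched with the standard library alone. This is a simplified Welch-style illustration (non-overlapping segments, naive DFT, no windowing); the segment length, sampling rate, and band edges are placeholder assumptions, not values from the study.

    ```python
    import cmath

    def _dft(seg):
        # Naive discrete Fourier transform; adequate for short segments.
        n = len(seg)
        return [sum(seg[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
                for k in range(n)]

    def band_coherence(x, y, fs, band, seg_len=64):
        # Magnitude-squared coherence |Sxy|^2 / (Sxx * Syy), with spectra
        # averaged over non-overlapping segments, then averaged across the
        # frequency bins falling inside band = (lo_hz, hi_hz).
        n_seg = min(len(x), len(y)) // seg_len
        sxx = [0.0] * seg_len
        syy = [0.0] * seg_len
        sxy = [0j] * seg_len
        for s in range(n_seg):
            xs = _dft(x[s * seg_len:(s + 1) * seg_len])
            ys = _dft(y[s * seg_len:(s + 1) * seg_len])
            for k in range(seg_len):
                sxx[k] += abs(xs[k]) ** 2
                syy[k] += abs(ys[k]) ** 2
                sxy[k] += xs[k] * ys[k].conjugate()
        coh = []
        for k in range(1, seg_len // 2):        # skip DC, keep positive freqs
            f = k * fs / seg_len
            if band[0] <= f < band[1] and sxx[k] > 0 and syy[k] > 0:
                coh.append(abs(sxy[k]) ** 2 / (sxx[k] * syy[k]))
        return sum(coh) / len(coh) if coh else 0.0
    ```

    With 54 regions, the analysis described above amounts to calling this for each of the 54 × 53 / 2 region pairs in each of the three bands (0 to 15, 15 to 30, and 30 to 45 Hz); production pipelines would typically use a windowed, FFT-based routine such as `scipy.signal.coherence` instead.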

  16. How Do We See Art: An Eye-Tracker Study

    PubMed Central

    Quiroga, Rodrigo Quian; Pedreira, Carlos

    2011-01-01

    We describe the pattern of fixations of subjects looking at figurative and abstract paintings from different artists (Molina, Mondrian, Rembrandt, della Francesca) and at modified versions in which different aspects of these art pieces were altered with simple digital manipulations. We show that the fixations of the subjects followed some general common principles (e.g., being attracted to saliency regions) but with a large variability for the figurative paintings, according to the subject’s personal appreciation and knowledge. In particular, we found different gazing patterns depending on whether the subject saw the original or the modified version of the painting first. We conclude that the study of gazing patterns obtained by using the eye-tracker technology gives a useful approach to quantify how subjects observe art. PMID:21941476

  17. Eye Gaze in Creative Sign Language

    ERIC Educational Resources Information Center

    Kaneko, Michiko; Mesch, Johanna

    2013-01-01

    This article discusses the role of eye gaze in creative sign language. Because eye gaze conveys various types of linguistic and poetic information, it is an intrinsic part of sign language linguistics in general and of creative signing in particular. We discuss various functions of eye gaze in poetic signing and propose a classification of gaze…

  18. Dog owners show experience-based viewing behaviour in judging dog face approachability.

    PubMed

    Gavin, Carla Jade; Houghton, Sarah; Guo, Kun

    2017-01-01

Our prior visual experience plays a critical role in face perception. We show superior perceptual performance for differentiating conspecific (vs non-conspecific), own-race (vs other-race) and familiar (vs unfamiliar) faces. However, it remains unclear whether our experience with faces of other species would influence our gaze allocation for extracting salient facial information. In this eye-tracking study, we asked both dog owners and non-owners to judge the approachability of human, monkey and dog faces, and systematically compared their behavioural performance and gaze pattern associated with the task. Compared to non-owners, dog owners assessed dog faces in less time and with fewer fixations, but gave higher approachability ratings. The gaze allocation within local facial features was also modulated by ownership. The average proportions of fixations and viewing time directed at the dog's mouth region were significantly lower for dog owners, and more experienced dog owners tended to look more at the dogs' eyes, suggesting the adoption of a prior experience-based viewing behaviour for assessing dog approachability. No differences in behavioural performance and gaze pattern were observed between dog owners and non-owners when judging human and monkey faces, implying that dog owners' experience-based gaze strategy for viewing dog faces was not transferable across faces of other species.

  19. Is eye to eye contact really threatening and avoided in social anxiety?--An eye-tracking and psychophysiology study.

    PubMed

    Wieser, Matthias J; Pauli, Paul; Alpers, Georg W; Mühlberger, Andreas

    2009-01-01

    The effects of direct and averted gaze on autonomic arousal and gaze behavior in social anxiety were investigated using a new paradigm including animated movie stimuli and eye-tracking methodology. While high, medium, and low socially anxious (HSA vs. MSA vs. LSA) women watched animated movie clips, in which faces responded to the gaze of the participants with either direct or averted gaze, their eye movements, heart rate (HR) and skin conductance responses (SCR) were continuously recorded. Groups did not differ in their gaze behavior concerning direct vs. averted gaze, but high socially anxious women tended to fixate the eye region of the presented face longer than MSA and LSA, respectively. Furthermore, they responded to direct gaze with more pronounced cardiac acceleration. This physiological finding indicates that direct gaze may be a fear-relevant feature for socially anxious individuals in social interaction. However, this seems not to result in gaze avoidance. Future studies should examine the role of gaze direction and its interaction with facial expressions in social anxiety and its consequences for avoidance behavior and fear responses. Additionally, further research is needed to clarify the role of gaze perception in social anxiety.

  20. Neural bases of eye and gaze processing: The core of social cognition

    PubMed Central

    Itier, Roxane J.; Batty, Magali

    2014-01-01

    Eyes and gaze are very important stimuli for human social interactions. Recent studies suggest that impairments in recognizing face identity, facial emotions or in inferring attention and intentions of others could be linked to difficulties in extracting the relevant information from the eye region including gaze direction. In this review, we address the central role of eyes and gaze in social cognition. We start with behavioral data demonstrating the importance of the eye region and the impact of gaze on the most significant aspects of face processing. We review neuropsychological cases and data from various imaging techniques such as fMRI/PET and ERP/MEG, in an attempt to best describe the spatio-temporal networks underlying these processes. The existence of a neuronal eye detector mechanism is discussed as well as the links between eye gaze and social cognition impairments in autism. We suggest impairments in processing eyes and gaze may represent a core deficiency in several other brain pathologies and may be central to abnormal social cognition. PMID:19428496

  1. Eye Contact and Fear of Being Laughed at in a Gaze Discrimination Task

    PubMed Central

    Torres-Marín, Jorge; Carretero-Dios, Hugo; Acosta, Alberto; Lupiáñez, Juan

    2017-01-01

    Current approaches conceptualize gelotophobia as a personality trait characterized by a disproportionate fear of being laughed at by others. Consistent with this perspective, gelotophobes are also described as neurotic and introverted and as having a paranoid tendency to anticipate derision and mockery. Although research on gelotophobia has progressed significantly over the past two decades, no evidence exists concerning its potential effects on reactions to eye contact. Previous research has pointed to difficulties in discriminating gaze direction as the basis of possible misinterpretations of others' intentions or mental states. The aim of the present research was to examine whether gelotophobia predisposition modulates the effects of eye contact (i.e., gaze discrimination) when processing faces portraying various emotional expressions. In two experiments, participants performed a gaze discrimination task in which they responded, as quickly and accurately as possible, to the eyes' direction on faces displaying a happy, angry, fearful, neutral, or sad emotional expression. In particular, we expected trait gelotophobia to modulate the eye contact effect, showing specific group differences in the happiness condition. The results of Study 1 (N = 40) indicated that gelotophobes made more errors than non-gelotophobes in the gaze discrimination task. In contrast to our initial hypothesis, the happiness expression did not play any special role in the observed differences between individuals with high vs. low trait gelotophobia. In Study 2 (N = 40), we replicated this pattern of gaze discrimination results, even after controlling for individuals' scores on social anxiety. Furthermore, in the second experiment we found that gelotophobes did not exhibit any problem identifying others' emotions, nor a general incorrect attribution of affective features such as valence, intensity, or arousal. Therefore, this bias in processing gaze might be related to the global processes of social cognition. Further research is needed to explore how eye contact relates to the fear of being laughed at. PMID:29167652

  2. Gaze Patterns in Auditory-Visual Perception of Emotion by Children with Hearing Aids and Hearing Children

    PubMed Central

    Wang, Yifang; Zhou, Wei; Cheng, Yanhong; Bian, Xiaoying

    2017-01-01

    This study investigated eye-movement patterns during emotion perception in children with hearing aids and hearing children. Seventy-eight participants aged 3 to 7 were asked to watch videos of a facial expression followed by an oral statement, with the two cues either congruent or incongruent in emotional valence. Results showed that while hearing children paid more attention to the upper part of the face, children with hearing aids paid more attention to the lower part of the face after the oral statement was presented, especially in the neutral facial expression/neutral oral statement condition. These results suggest that children with hearing aids have an altered eye-contact pattern with others and difficulty matching visual and voice cues in emotion perception. The negative causes and effects of these gaze patterns should be addressed in early rehabilitation for hearing-impaired children with assistive devices. PMID:29312104

  3. Etracker: A Mobile Gaze-Tracking System with Near-Eye Display Based on a Combined Gaze-Tracking Algorithm.

    PubMed

    Li, Bin; Fu, Hong; Wen, Desheng; Lo, WaiLun

    2018-05-19

    Eye tracking technology has become increasingly important for psychological analysis, medical diagnosis, driver assistance systems, and many other applications. Various gaze-tracking models have been established by previous researchers. However, there is currently no near-eye display system with accurate gaze-tracking performance and a convenient user experience. In this paper, we constructed a complete prototype of the mobile gaze-tracking system 'Etracker' with a near-eye viewing device for human gaze tracking, and we proposed a combined gaze-tracking algorithm. In this algorithm, a convolutional neural network is used to remove blinking images and predict coarse gaze position, and a geometric model is then defined for accurate gaze tracking. Moreover, we proposed using the mean value of gazes to resolve pupil center changes caused by nystagmus in the calibration algorithm, so that an individual user needs to calibrate only once, the first time the system is used, which makes the system more convenient. Experiments on gaze data from 26 participants show that the eye center detection accuracy is 98% and that Etracker provides an average gaze accuracy of 0.53° at a rate of 30-60 Hz.
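
    The calibration idea in this record, taking the mean of gaze samples so that nystagmus-induced jitter in the pupil center averages out, can be sketched in a few lines. This is an illustrative sketch only (the function name and numbers are invented); the actual Etracker pipeline additionally involves a convolutional neural network and a geometric model not shown here:

```python
import numpy as np

def calibration_center(pupil_centers):
    """Mean pupil center over a calibration fixation.

    pupil_centers: (N, 2) array of per-frame pupil-center estimates taken
    while the user fixates a known target. Averaging damps small
    involuntary movements (e.g., nystagmus), which is what allows the
    calibration to be done only once.
    """
    return np.asarray(pupil_centers, dtype=float).mean(axis=0)

# Jittery per-frame estimates around a true center of (320, 240) pixels
# average back close to that center.
rng = np.random.default_rng(1)
frames = np.array([320.0, 240.0]) + rng.normal(0.0, 2.0, (200, 2))
print(calibration_center(frames))  # close to [320. 240.]
```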

  4. Reactive and anticipatory looking in 6-month-old infants during a visual expectation paradigm.

    PubMed

    Quan, Jeffry; Bureau, Jean-François; Abdul Malik, Adam B; Wong, Johnny; Rifkin-Graboi, Anne

    2017-10-01

    This article presents data from 278 six-month-old infants who completed a visual expectation paradigm in which audiovisual stimuli were first presented randomly (random phase), and then in a spatial pattern (pattern phase). Infants' eye gaze behaviour was tracked with a 60 Hz Tobii eye-tracker in order to measure two types of looking behaviour: reactive looking (i.e., latency to shift eye gaze in reaction to the appearance of stimuli) and anticipatory looking (i.e., percentage of time spent looking at the location where the next stimulus is about to appear during the inter-stimulus interval). Data pertaining to missing data and task order effects are presented. Further analyses show that infants' reactive looking was faster in the pattern phase, compared to the random phase, and their anticipatory looking increased from random to pattern phases. Within the pattern phase, infants' reactive looking showed a quadratic trend, with reactive looking time latencies peaking in the middle portion of the phase. Similarly, within the pattern phase, infants' anticipatory looking also showed a quadratic trend, with anticipatory looking peaking during the middle portion of the phase.

  5. Fear of Negative Evaluation Influences Eye Gaze in Adolescents with Autism Spectrum Disorder: A Pilot Study

    ERIC Educational Resources Information Center

    White, Susan W.; Maddox, Brenna B.; Panneton, Robin K.

    2015-01-01

    Social anxiety is common among adolescents with Autism Spectrum Disorder (ASD). In this modest-sized pilot study, we examined the relationship between social worries and gaze patterns to static social stimuli in adolescents with ASD (n = 15) and gender-matched adolescents without ASD (control; n = 18). Among cognitively unimpaired adolescents with…

  6. Vestibulo-Cervico-Ocular Responses and Tracking Eye Movements after Prolonged Exposure to Microgravity

    NASA Technical Reports Server (NTRS)

    Kornilova, L. N.; Naumov, I. A.; Azarov, K. A.; Sagalovitch, S. V.; Reschke, Millard F.; Kozlovskaya, I. B.

    2007-01-01

    Vestibular function and tracking eye movements were investigated in 12 Russian crew members of ISS missions on days 1(2), 4(5-6), and 8(9-10) after prolonged exposure to microgravity (126 to 195 days). Spontaneous oculomotor activity, the static torsional otolith-cervico-ocular reflex, dynamic vestibulo-cervico-ocular responses, vestibular reactivity, tracking eye movements, and gaze-holding were studied using videooculography (VOG) and electrooculography (EOG) for parallel eye movement recording. On post-flight days 1-2 (R+1-2) some cosmonauts demonstrated: - increased spontaneous oculomotor activity (floating eye movements, spontaneous nystagmus of typical and atypical form, square wave jerks, gaze nystagmus) with the head held in the vertical position; - suppressed otolith function (absent, or reduced by one half in amplitude, torsional compensatory eye counter-rolling) with the head inclined statically right- or leftward by 30°; - increased vestibular reactivity (lowered threshold and increased intensity of vestibular nystagmus) during head turns around the longitudinal body axis at 0.125 Hz; - significant changes in the accuracy, velocity, and temporal characteristics of eye tracking. The pattern, depth, dynamics, and velocity of recovery of vestibular function and tracking eye movements varied across individual participants. However, there were also consistent responses during readaptation to normal gravity: - suppression of otolith function was typically accompanied by exaggerated vestibular reactivity; - the structure of visual tracking (the accuracy of fixational eye rotations, smooth tracking, and gaze-holding) was disturbed (the appearance of correcting saccades, the transition from smooth to saccadic tracking) only in those cosmonauts who, in parallel with increased reactivity of the vestibular input, also had central changes in the oculomotor system (spontaneous nystagmus, gaze nystagmus).

  7. Gaze Compensation as a Technique for Improving Hand–Eye Coordination in Prosthetic Vision

    PubMed Central

    Titchener, Samuel A.; Shivdasani, Mohit N.; Fallon, James B.; Petoe, Matthew A.

    2018-01-01

    Purpose Shifting the region-of-interest within the input image to compensate for gaze shifts (“gaze compensation”) may improve hand–eye coordination in visual prostheses that incorporate an external camera. The present study investigated the effects of eye movement on hand-eye coordination under simulated prosthetic vision (SPV), and measured the coordination benefits of gaze compensation. Methods Seven healthy-sighted subjects performed a target localization-pointing task under SPV. Three conditions were tested, modeling: retinally stabilized phosphenes (uncompensated); gaze compensation; and no phosphene movement (center-fixed). The error in pointing was quantified for each condition. Results Gaze compensation yielded a significantly smaller pointing error than the uncompensated condition for six of seven subjects, and a similar or smaller pointing error than the center-fixed condition for all subjects (two-way ANOVA, P < 0.05). Pointing error eccentricity and gaze eccentricity were moderately correlated in the uncompensated condition (azimuth: R2 = 0.47; elevation: R2 = 0.51) but not in the gaze-compensated condition (azimuth: R2 = 0.01; elevation: R2 = 0.00). Increased variability in gaze at the time of pointing was correlated with greater reduction in pointing error in the center-fixed condition compared with the uncompensated condition (R2 = 0.64). Conclusions Eccentric eye position impedes hand–eye coordination in SPV. While limiting eye eccentricity in uncompensated viewing can reduce errors, gaze compensation is effective in improving coordination for subjects unable to maintain fixation. Translational Relevance The results highlight the present necessity for suppressing eye movement and support the use of gaze compensation to improve hand–eye coordination and localization performance in prosthetic vision. PMID:29321945

  8. Frames of reference for gaze saccades evoked during stimulation of lateral intraparietal cortex.

    PubMed

    Constantin, A G; Wang, H; Martinez-Trujillo, J C; Crawford, J D

    2007-08-01

    Previous studies suggest that stimulation of lateral intraparietal cortex (LIP) evokes saccadic eye movements toward eye- or head-fixed goals, whereas most single-unit studies suggest that LIP uses an eye-fixed frame with eye-position modulations. The goal of our study was to determine the reference frame for gaze shifts evoked during LIP stimulation in head-unrestrained monkeys. Two macaques (M1 and M2) were implanted with recording chambers over the right intraparietal sulcus and with search coils for recording three-dimensional eye and head movements. The LIP region was microstimulated using pulse trains of 300 Hz, 100-150 microA, and 200 ms. Eighty-five putative LIP sites in M1 and 194 putative sites in M2 were used in our quantitative analysis throughout this study. Average amplitude of the stimulation-evoked gaze shifts was 8.67 degrees for M1 and 7.97 degrees for M2, with very small head movements. When these gaze-shift trajectories were rotated into three coordinate frames (eye, head, and body), gaze endpoint distribution for all sites was most convergent to a common point when plotted in eye coordinates. Across all sites, the eye-centered model provided a significantly better fit compared with the head, body, or fixed-vector models (where the latter model signifies no modulation of the gaze trajectory as a function of initial gaze position). Moreover, the probability of evoking a gaze shift from any one particular position was modulated by the current gaze direction (independent of saccade direction). These results provide causal evidence that the motor commands from LIP encode gaze commands in eye-fixed coordinates but are also subtly modulated by initial gaze position.

  9. Looking at Eye Gaze Processing and Its Neural Correlates in Infancy--Implications for Social Development and Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Hoehl, Stefanie; Reid, Vincent M.; Parise, Eugenio; Handl, Andrea; Palumbo, Letizia; Striano, Tricia

    2009-01-01

    The importance of eye gaze as a means of communication is indisputable. However, there is debate about whether there is a dedicated neural module, which functions as an eye gaze detector and when infants are able to use eye gaze cues in a referential way. The application of neuroscience methodologies to developmental psychology has provided new…

  10. Frequency analysis of gaze points with CT colonography interpretation using eye gaze tracking system

    NASA Astrophysics Data System (ADS)

    Tsutsumi, Shoko; Tamashiro, Wataru; Sato, Mitsuru; Okajima, Mika; Ogura, Toshihiro; Doi, Kunio

    2017-03-01

    It is important to investigate the eye-tracking gaze points of experts in order to help trainees understand the image interpretation process. We investigated gaze points during CT colonography (CTC) interpretation and analyzed the differences in gaze points between experts and trainees. In this study, we attempted to understand how trainees can be brought to the level achieved by experts in viewing CTC. We used an eye gaze point sensing system, Gazefineder (JVCKENWOOD Corporation, Tokyo, Japan), which detects the pupil and corneal reflection points by dark-pupil eye tracking and provides gaze-point images and spreadsheet data. The subjects were radiological technologists, both experienced and inexperienced in reading CTC. We performed observer studies in reading virtual pathology images and examined the observers' image interpretation process using gaze-point data. Furthermore, we performed an eye-tracking frequency analysis using the Fast Fourier Transform (FFT). The frequency analysis enabled us to understand the difference in gaze points between experts and trainees. The trainee's result contained large amounts of both high-frequency and low-frequency components, whereas both components were relatively low for the expert. Examining the amount of eye movement in each 0.02-second interval, we found that the expert tended to interpret images slowly and calmly, whereas the trainee moved the eyes quickly and scanned wide areas. We can assess the differences in gaze points on CTC between experts and trainees using the eye gaze point sensing system and frequency analysis. Gaze-point data can also be used to evaluate potential improvements in CTC interpretation by trainees.
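
    The frequency analysis described in this record, an FFT over eye movement measured in 0.02-second intervals, can be sketched as follows. The signal here is synthetic and the function name is illustrative; the record does not specify the exact preprocessing used:

```python
import numpy as np

def gaze_frequency_spectrum(movement, dt=0.02):
    """Amplitude spectrum of an eye-movement time series.

    movement: eye-movement magnitude per dt-second interval (0.02 s,
    matching the sampling interval mentioned in the record).
    Returns (frequencies in Hz, normalized amplitudes).
    """
    movement = np.asarray(movement, dtype=float)
    movement = movement - movement.mean()      # remove the DC offset
    amplitudes = np.abs(np.fft.rfft(movement)) / len(movement)
    freqs = np.fft.rfftfreq(len(movement), d=dt)
    return freqs, amplitudes

# A synthetic 2 Hz scanning rhythm produces a spectral peak at 2 Hz.
t = np.arange(500) * 0.02                      # 10 s sampled at 50 Hz
freqs, amp = gaze_frequency_spectrum(np.sin(2 * np.pi * 2.0 * t))
print(freqs[np.argmax(amp)])  # 2.0
```

    In the record's terms, a trainee's spectrum would show large amplitudes in both low- and high-frequency bands, while an expert's would be low in both.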

  11. Experimental Test of Spatial Updating Models for Monkey Eye-Head Gaze Shifts

    PubMed Central

    Van Grootel, Tom J.; Van der Willigen, Robert F.; Van Opstal, A. John

    2012-01-01

    How the brain maintains an accurate and stable representation of visual target locations despite the occurrence of saccadic gaze shifts is a classical problem in oculomotor research. Here we test and dissociate the predictions of different conceptual models for head-unrestrained gaze-localization behavior of macaque monkeys. We adopted the double-step paradigm with rapid eye-head gaze shifts to measure localization accuracy in response to flashed visual stimuli in darkness. We presented the second target flash either before (static), or during (dynamic) the first gaze displacement. In the dynamic case the brief visual flash induced a small retinal streak of up to about 20 deg at an unpredictable moment and retinal location during the eye-head gaze shift, which provides serious challenges for the gaze-control system. However, for both stimulus conditions, monkeys localized the flashed targets with accurate gaze shifts, which rules out several models of visuomotor control. First, these findings exclude the possibility that gaze-shift programming relies on retinal inputs only. Instead, they support the notion that accurate eye-head motor feedback updates the gaze-saccade coordinates. Second, in dynamic trials the visuomotor system cannot rely on the coordinates of the planned first eye-head saccade either, which rules out remapping on the basis of a predictive corollary gaze-displacement signal. Finally, because gaze-related head movements were also goal-directed, requiring continuous access to eye-in-head position, we propose that our results best support a dynamic feedback scheme for spatial updating in which visuomotor control incorporates accurate signals about instantaneous eye- and head positions rather than relative eye- and head displacements. PMID:23118883

  12. Gaze failure, drifting eye movements, and centripetal nystagmus in cerebellar disease.

    PubMed Central

    Leech, J; Gresty, M; Hess, K; Rudge, P

    1977-01-01

    Three abnormalities of eye movement in man are described which are indicative of cerebellar system disorder, namely, centripetally beating nystagmus, failure to maintain lateral gaze either in darkness or with eye closure, and slow drifting movements of the eyes in the absence of fixation. Similar eye movement signs follow cerebellectomy in the primate and the cat. These abnormalities of eye movement, together with other signs of cerebellar disease such as rebound, alternating, and gaze-paretic nystagmus, are explained by the hypothesis that the cerebellum helps to maintain lateral gaze and that brain stem mechanisms which monitor gaze position generate compensatory biases in the absence of normal cerebellar function. PMID:603785

  13. Stationary gaze entropy predicts lane departure events in sleep-deprived drivers.

    PubMed

    Shiferaw, Brook A; Downey, Luke A; Westlake, Justine; Stevens, Bronwyn; Rajaratnam, Shantha M W; Berlowitz, David J; Swann, Phillip; Howard, Mark E

    2018-02-02

    Performance decrement associated with sleep deprivation is a leading contributor to traffic accidents and fatalities. While current research has focused on eye blink parameters as physiological indicators of driver drowsiness, little is understood of how gaze behaviour alters as a result of sleep deprivation. In particular, the effect of sleep deprivation on gaze entropy has not been previously examined. In this randomised, repeated-measures study, nine healthy participants (4 male, 5 female) completed two driving sessions in a fully instrumented vehicle (one after a night of sleep deprivation and one after normal sleep) on a closed track, during which eye movement activity and lane departure events were recorded. Following sleep deprivation, the rate of fixations reduced while blink rate and duration as well as saccade amplitude increased. In addition, stationary and transition entropy of gaze increased following sleep deprivation, as well as with the amount of time driven. An increase in stationary gaze entropy in particular was associated with higher odds of a lane departure event. These results highlight how fatigue induced by sleep deprivation and time-on-task effects can impair drivers' visual awareness through disruption of gaze distribution and scanning patterns.
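
    Stationary and transition gaze entropy, as used in this record, can be computed from a sequence of gaze positions. A minimal sketch, assuming gaze coordinates normalized to [0, 1) and binned into a square grid (the study's actual binning and entropy estimator are not specified here):

```python
import numpy as np

def stationary_gaze_entropy(x, y, bins=8):
    """Shannon entropy (bits) of the spatial distribution of gaze."""
    hist, _, _ = np.histogram2d(x, y, bins=bins, range=[[0, 1], [0, 1]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]                      # drop empty grid cells
    return -np.sum(p * np.log2(p))

def transition_gaze_entropy(x, y, bins=8):
    """Shannon entropy (bits) of transitions between successive cells."""
    ix = np.minimum((np.asarray(x) * bins).astype(int), bins - 1)
    iy = np.minimum((np.asarray(y) * bins).astype(int), bins - 1)
    cells = ix * bins + iy
    counts = {}
    for pair in zip(cells[:-1], cells[1:]):
        counts[pair] = counts.get(pair, 0) + 1
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log2(p))

# A gaze stream concentrated on one region yields lower stationary
# entropy than one dispersed across the whole scene.
rng = np.random.default_rng(0)
focused = rng.normal(0.5, 0.05, (2, 500)).clip(0.0, 0.999)
spread = rng.uniform(0.0, 1.0, (2, 500))
assert stationary_gaze_entropy(*focused) < stationary_gaze_entropy(*spread)
```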

  14. Overview of Nonelectronic Eye-Gaze Communication Techniques.

    ERIC Educational Resources Information Center

    Goossens, Carol A.; Crain, Sharon S.

    1987-01-01

    The article discusses currently used eye gaze communication techniques with the severely physically disabled (eye-gaze vest, laptray, transparent display, and mirror/prism communicator), presents information regarding the types of message displays used to depict encoded material, and discusses the advantages of implementing nonelectronic eye-gaze…

  15. Gaze as a biometric

    NASA Astrophysics Data System (ADS)

    Yoon, Hong-Jun; Carmichael, Tandy R.; Tourassi, Georgia

    2014-03-01

    Two people may analyze a visual scene in two completely different ways. Our study sought to determine whether human gaze may be used to establish the identity of an individual. To accomplish this objective we investigated the gaze patterns of twelve individuals viewing still images with different spatial relationships. Specifically, we created 5 visual "dot-pattern" tests to be shown on a standard computer monitor. These tests challenged the viewer's capacity to distinguish proximity, alignment, and perceptual organization. Each test included 50 images of varying difficulty (250 images in total). Eye-tracking data were collected from each individual while taking the tests. The eye-tracking data were converted into gaze velocities and analyzed with Hidden Markov Models to develop personalized gaze profiles. Using leave-one-out cross-validation, we observed that these personalized profiles could differentiate among the 12 users with classification accuracy ranging between 53% and 76%, depending on the test. This was statistically significantly better than random guessing (i.e., 8.3%, or 1 out of 12). Classification accuracy was higher for the tests where the users' average gaze velocity per case was lower. The study findings support the feasibility of using gaze as a biometric or personalized biomarker. These findings could have implications for Radiology training and the development of personalized e-learning environments.
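
    The conversion from raw eye-tracking samples to the gaze velocities that fed the Hidden Markov Models can be sketched as below; the HMM modeling itself is library-dependent and omitted. The function name and units are illustrative:

```python
import numpy as np

def gaze_velocities(x, y, t):
    """Point-to-point gaze speeds from raw eye-tracking samples.

    x, y: gaze coordinates (pixels or degrees of visual angle);
    t: sample timestamps in seconds. Returns one speed per interval.
    """
    x, y, t = map(np.asarray, (x, y, t))
    dx, dy, dt = np.diff(x), np.diff(y), np.diff(t)
    return np.hypot(dx, dy) / dt      # Euclidean distance over time

# Samples at 50 Hz moving one unit per frame along x give a constant
# speed of about 50 units per second.
t = np.arange(5) * 0.02
v = gaze_velocities(np.arange(5.0), np.zeros(5), t)
print(v)  # each entry ≈ 50
```

    Sequences of such velocities, one stream per test, are what a Hidden Markov Model would then be trained on to form a personalized gaze profile.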

  16. Eye, head, and body coordination during large gaze shifts in rhesus monkeys: movement kinematics and the influence of posture.

    PubMed

    McCluskey, Meaghan K; Cullen, Kathleen E

    2007-04-01

    Coordinated movements of the eye, head, and body are used to redirect the axis of gaze between objects of interest. However, previous studies of eye-head gaze shifts in head-unrestrained primates generally assumed the contribution of body movement to be negligible. Here we characterized eye-head-body coordination during horizontal gaze shifts made by trained rhesus monkeys to visual targets while they sat upright in a standard primate chair and while they assumed a more natural sitting posture in a custom-designed chair. In both postures, gaze shifts were characterized by the sequential onset of eye, head, and body movements, which could be described by predictable relationships. Body motion made a small but significant contribution to gaze shifts that were ≥40 degrees in amplitude. Furthermore, as gaze shift amplitude increased (40-120 degrees), body contribution and velocity increased systematically. In contrast, peak eye and head velocities plateaued at approximately 250-300 degrees/s, and the rotation of the eye-in-orbit and head-on-body remained well within the physical limits of ocular and neck motility during large gaze shifts, saturating at approximately 35 and 60 degrees, respectively. Gaze shifts initiated with the eye more contralateral in the orbit were accompanied by smaller body as well as head movement amplitudes, and velocities were greater when monkeys were seated in the more natural body posture. Taken together, our findings show that body movement makes a predictable contribution to gaze shifts that is systematically influenced by factors such as orbital position and posture. We conclude that body movements are part of a coordinated series of motor events used to voluntarily reorient gaze and that these movements can be significant even in a typical laboratory setting. Our results emphasize the need for caution in interpreting data from neurophysiological studies of the control of saccadic eye movements and/or eye-head gaze shifts, because single neurons can code motor commands to move the body as well as the head and eyes.

  17. The Effect of Eye Contact Is Contingent on Visual Awareness

    PubMed Central

    Xu, Shan; Zhang, Shen; Geng, Haiyan

    2018-01-01

    The present study explored how eye contact at different levels of visual awareness influences gaze-induced joint attention. We adopted a spatial-cueing paradigm, in which an averted gaze was used as an uninformative central cue for a joint-attention task. Prior to the onset of the averted-gaze cue, either supraliminal (Experiment 1) or subliminal (Experiment 2) eye contact was presented. The results revealed a larger subsequent gaze-cueing effect following supraliminal eye contact compared to a no-contact condition. In contrast, the gaze-cueing effect was smaller in the subliminal eye-contact condition than in the no-contact condition. These findings suggest that the facilitation effect of eye contact on coordinating social attention depends on visual awareness. Furthermore, subliminal eye contact might have an impact on subsequent social attention processes that differ from supraliminal eye contact. This study highlights the need to further investigate the role of eye contact in implicit social cognition. PMID:29467703

  18. An eye model for uncalibrated eye gaze estimation under variable head pose

    NASA Astrophysics Data System (ADS)

    Hnatow, Justin; Savakis, Andreas

    2007-04-01

    Gaze estimation is an important component of computer vision systems that monitor human activity for surveillance, human-computer interaction, and various other applications including iris recognition. Gaze estimation methods are particularly valuable when they are non-intrusive, do not require calibration, and generalize well across users. This paper presents a novel eye model that is employed for efficiently performing uncalibrated eye gaze estimation. The proposed eye model was constructed from a geometric simplification of the eye and anthropometric data about eye feature sizes in order to circumvent the requirement of calibration procedures for each individual user. The positions of the two eye corners and the midpupil, the distance between the two eye corners, and the radius of the eye sphere are required for gaze angle calculation. The locations of the eye corners and midpupil are estimated via processing following eye detection, and the remaining parameters are obtained from anthropometric data. This eye model is easily extended to estimating eye gaze under variable head pose. The eye model was tested on still images of subjects at frontal pose (0°) and side pose (34°). An upper bound of the model's performance was obtained by manually selecting the eye feature locations. The resulting average absolute error was 2.98° for frontal pose and 2.87° for side pose. The error was consistent across subjects, which indicates that good generalization was obtained. This level of performance compares well with other gaze estimation systems that utilize a calibration procedure to measure eye features.
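
    The gaze-angle calculation from the two eye corners, the midpupil, and the anthropometric eye radius admits a simple geometric reading: the pupil's lateral offset from the eye center, taken relative to the eyeball radius, gives the rotation angle via an arcsine. The sketch below is one plausible formulation, not the paper's exact model; the names, numbers, and clamping step are illustrative:

```python
import math

def horizontal_gaze_angle(corner_left_x, corner_right_x, midpupil_x,
                          eye_radius_px):
    """Horizontal gaze angle (degrees) from 2D eye-feature positions.

    The eye center is approximated as the midpoint of the two eye
    corners; eye_radius_px would come from anthropometric data scaled
    by the measured distance between the corners.
    """
    eye_center_x = 0.5 * (corner_left_x + corner_right_x)
    offset = midpupil_x - eye_center_x
    # Clamp to the arcsin domain to tolerate feature-detection noise.
    ratio = max(-1.0, min(1.0, offset / eye_radius_px))
    return math.degrees(math.asin(ratio))

# A pupil centered between the corners looks straight ahead (0 degrees);
# an offset of half the eye radius corresponds to a 30-degree rotation.
print(horizontal_gaze_angle(100.0, 140.0, 120.0, 13.0))  # 0.0
print(horizontal_gaze_angle(100.0, 140.0, 126.5, 13.0))  # ≈ 30.0
```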

  19. Visual Data Mining: An Exploratory Approach to Analyzing Temporal Patterns of Eye Movements

    ERIC Educational Resources Information Center

    Yu, Chen; Yurovsky, Daniel; Xu, Tian

    2012-01-01

    Infant eye movements are an important behavioral resource to understand early human development and learning. But the complexity and amount of gaze data recorded from state-of-the-art eye-tracking systems also pose a challenge: how does one make sense of such dense data? Toward this goal, this article describes an interactive approach based on…

  20. Dynamic Eye Tracking Based Metrics for Infant Gaze Patterns in the Face-Distractor Competition Paradigm

    PubMed Central

    Ahtola, Eero; Stjerna, Susanna; Yrttiaho, Santeri; Nelson, Charles A.; Leppänen, Jukka M.; Vanhatalo, Sampsa

    2014-01-01

    Objective To develop new standardized eye tracking based measures and metrics for infants' gaze dynamics in the face-distractor competition paradigm. Method Eye tracking data were collected from two samples of healthy 7-month-old infants (total n = 45), as well as one sample of 5-month-old infants (n = 22), in a paradigm with a picture of a face or a non-face pattern as a central stimulus and a geometric shape as a lateral stimulus. The data were analyzed using conventional measures of infants' initial disengagement from the central to the lateral stimulus (i.e., saccadic reaction time and probability) and, additionally, novel measures reflecting infants' gaze dynamics after the initial disengagement (i.e., cumulative allocation of attention to the central vs. peripheral stimulus). Results The results showed that the initial saccade away from the centrally presented stimulus is followed by a rapid re-engagement of attention with the central stimulus, leading to a cumulative preference for the central over the lateral stimulus over time. This pattern tended to be stronger for salient facial expressions than for non-face patterns, was replicable across two independent samples of 7-month-old infants, and differentiated between 7- and 5-month-old infants. Conclusion The results suggest that eye tracking based assessments of infants' cumulative preference for faces over time can be readily parameterized and standardized, and may provide valuable techniques for future studies examining normative developmental changes in preference for social signals. Significance Standardized measures of early developing face preferences may have the potential to become surrogate biomarkers of neurocognitive and social development. PMID:24845102

  1. 3D ocular ultrasound using gaze tracking on the contralateral eye: a feasibility study.

    PubMed

    Afsham, Narges; Najafi, Mohammad; Abolmaesumi, Purang; Rohling, Robert

    2011-01-01

    A gaze-deviated examination of the eye with a 2D ultrasound transducer is a common and informative ophthalmic test; however, the complex task of estimating the pose of the ultrasound images relative to the eye hampers 3D interpretation. To tackle this challenge, a novel system for 3D image reconstruction based on gaze tracking of the contralateral eye is proposed. The gaze fixates on several target points and, for each fixation, the pose of the examined eye is inferred from the gaze tracking. A single-camera system was developed for pose estimation combined with subject-specific parameter identification. The ultrasound images are then transformed into the coordinate system of the examined eye to create a 3D volume. The accuracy of the proposed gaze tracking system and the pose estimation of the eye were validated in a set of experiments. Overall system errors, including pose estimation and calibration, are 3.12 mm and 4.68 degrees.

  2. Revisiting Patterson's Paradigm: Gaze Behaviors in Deaf Communication.

    ERIC Educational Resources Information Center

    Luciano, Jason M.

    2001-01-01

    This article explains a sequential model of eye gaze and eye contact behaviors researched among hearing populations and explores these behaviors in people with deafness. It is found that characterizations of eye contact and eye gaze behavior applied to hearing populations are not completely applicable to those with deafness. (Contains references.)…

  3. Elevated amygdala response to faces and gaze aversion in autism spectrum disorder.

    PubMed

    Tottenham, Nim; Hertzig, Margaret E; Gillespie-Lynch, Kristen; Gilhooly, Tara; Millner, Alexander J; Casey, B J

    2014-01-01

    Autism spectrum disorders (ASD) are often associated with impairments in judgment of facial expressions. This impairment is often accompanied by diminished eye contact and atypical amygdala responses to face stimuli. The current study used a within-subjects design to examine the effects of natural viewing and an experimental eye-gaze manipulation on amygdala responses to faces. Individuals with ASD showed less gaze toward the eye region of faces relative to a control group. Among individuals with ASD, reduced eye gaze was associated with higher threat ratings of neutral faces. Amygdala signal was elevated in the ASD group relative to controls. This elevated response was further potentiated by experimentally manipulating gaze to the eye region. Potentiation by the gaze manipulation was largest for those individuals who exhibited the least amount of naturally occurring gaze toward the eye region and was associated with their subjective threat ratings. Effects were largest for neutral faces, highlighting the importance of examining neutral faces in the pathophysiology of autism and questioning their use as control stimuli with this population. Overall, our findings provide support for the notion that gaze direction modulates affective response to faces in ASD.

  4. Interactions between gaze-evoked blinks and gaze shifts in monkeys.

    PubMed

    Gandhi, Neeraj J

    2012-02-01

    Rapid eyelid closure, or a blink, often accompanies head-restrained and head-unrestrained gaze shifts. This study examines the interactions between such gaze-evoked blinks and gaze shifts in monkeys. Blink probability increases with gaze amplitude and at a faster rate for head-unrestrained movements. Across animals, blink likelihood is inversely correlated with the average gaze velocity of large-amplitude control movements. Gaze-evoked blinks induce robust perturbations in eye velocity. Peak and average velocities are reduced, duration is increased, but accuracy is preserved. The temporal features of the perturbation depend on factors such as the time of blink relative to gaze onset, inherent velocity kinematics of control movements, and perhaps initial eye-in-head position. Although variable across animals, the initial effect is a reduction in eye velocity, followed by a reacceleration that yields two or more peaks in its waveform. Interestingly, head velocity is not attenuated; instead, it peaks slightly later and with a larger magnitude. Gaze latency is slightly reduced on trials with gaze-evoked blinks, although the effect was more variable during head-unrestrained movements; no reduction in head latency is observed. Preliminary data also demonstrate a similar perturbation of gaze-evoked blinks during vertical saccades. The results are compared with previously reported effects of reflexive blinks (evoked by air-puff delivered to one eye or supraorbital nerve stimulation) and discussed in terms of effects of blinks on saccadic suppression, neural correlates of the altered eye velocity signals, and implications on the hypothesis that the attenuation in eye velocity is produced by a head movement command.

  5. Normal correspondence of tectal maps for saccadic eye movements in strabismus

    PubMed Central

    Economides, John R.; Adams, Daniel L.

    2016-01-01

    The superior colliculus is a major brain stem structure for the production of saccadic eye movements. Electrical stimulation at any given point in the motor map generates saccades of defined amplitude and direction. It is unknown how this saccade map is affected by strabismus. Three macaques were raised with exotropia, an outwards ocular deviation, by detaching the medial rectus tendon in each eye at age 1 mo. The animals were able to make saccades to targets with either eye and appeared to alternate fixation freely. To probe the organization of the superior colliculus, microstimulation was applied at multiple sites, with the animals either free-viewing or fixating a target. On average, microstimulation drove nearly conjugate saccades, similar in both amplitude and direction but separated by the ocular deviation. Two monkeys showed a pattern deviation, characterized by a systematic change in the relative position of the two eyes with certain changes in gaze angle. These animals' saccades were slightly different for the right eye and left eye in their amplitude or direction. The differences were consistent with the animals' underlying pattern deviation, measured during static fixation and smooth pursuit. The tectal map for saccade generation appears to be normal in strabismus, but saccades may be affected by changes in the strabismic deviation that occur with different gaze angles. PMID:27605534

  6. Web Usability or Accessibility: Comparisons between People with and without Intellectual Disabilities in Viewing Complex Naturalistic Scenes Using Eye-Tracking Technology

    ERIC Educational Resources Information Center

    Bazar, Nancy Sceery

    2009-01-01

    The purpose of this primarily quantitative study was to compare how young adults with and without intellectual disabilities examine different types of images. Two experiments were conducted. The first, a replication and extension of a classic eye-tracking study (Yarbus, 1967), generated eye gaze patterns and data in response to questions related…

  7. Systematic Observation of an Expert Driver's Gaze Strategy—An On-Road Case Study

    PubMed Central

    Lappi, Otto; Rinkkala, Paavo; Pekkanen, Jami

    2017-01-01

    In this paper we present and qualitatively analyze an expert driver's gaze behavior in natural driving on a real road, with no specific experimental task or instruction. Previous eye tracking research on naturalistic tasks has revealed recurring patterns of gaze behavior that are surprisingly regular and repeatable. Lappi (2016) identified in the literature seven “qualitative laws of gaze behavior in the wild”: recurring patterns that tend to go together, the more so the more naturalistic the setting, all of them expected in extended sequences of fully naturalistic behavior. However, no study to date has observed all in a single experiment. Here, we wanted to do just that: present observations supporting all the “laws” in a single behavioral sequence by a single subject. We discuss the laws in terms of unresolved issues in driver modeling and open challenges for experimental and theoretical development. PMID:28496422

  8. Autistic Symptomatology, Face Processing Abilities, and Eye Fixation Patterns

    ERIC Educational Resources Information Center

    Kirchner, Jennifer C.; Hatri, Alexander; Heekeren, Hauke R.; Dziobek, Isabel

    2011-01-01

    Deviant gaze behavior is a defining characteristic of autism. Its relevance as a pathophysiological mechanism, however, remains unknown. In the present study, we compared eye fixations of 20 adults with autism and 21 controls while they were engaged in taking the Multifaceted Empathy Test (MET). Additional measures of face emotion and identity…

  9. Guiding the mind's eye: improving communication and vision by external control of the scanpath

    NASA Astrophysics Data System (ADS)

    Barth, Erhardt; Dorr, Michael; Böhme, Martin; Gegenfurtner, Karl; Martinetz, Thomas

    2006-02-01

    Larry Stark has emphasised that what we visually perceive is very much determined by the scanpath, i.e. the pattern of eye movements. Inspired by his view, we have studied the implications of the scanpath for visual communication and came up with the idea to not only sense and analyse eye movements, but also guide them by using a special kind of gaze-contingent information display. Our goal is to integrate gaze into visual communication systems by measuring and guiding eye movements. For guidance, we first predict a set of about 10 salient locations. We then change the probability for one of these candidates to be attended: for one candidate the probability is increased, for the others it is decreased. To increase saliency, for example, we add red dots that are displayed very briefly such that they are hardly perceived consciously. To decrease the probability, for example, we locally reduce the temporal frequency content. Again, if performed in a gaze-contingent fashion with low latencies, these manipulations remain unnoticed. Overall, the goal is to find the real-time video transformation minimising the difference between the actual and the desired scanpath without being obtrusive. Applications are in the area of vision-based communication (better control of what information is conveyed) and augmented vision and learning (guide a person's gaze by the gaze of an expert or a computer-vision system). We believe that our research is very much in the spirit of Larry Stark's views on visual perception and the close link between vision research and engineering.

  10. Using gaze patterns to predict task intent in collaboration.

    PubMed

    Huang, Chien-Ming; Andrist, Sean; Sauppé, Allison; Mutlu, Bilge

    2015-01-01

    In everyday interactions, humans naturally exhibit behavioral cues, such as gaze and head movements, that signal their intentions while interpreting the behavioral cues of others to predict their intentions. Such intention prediction enables each partner to adapt their behaviors to the intent of others, serving a critical role in joint action where parties work together to achieve a common goal. Among behavioral cues, eye gaze is particularly important in understanding a person's attention and intention. In this work, we seek to quantify how gaze patterns may indicate a person's intention. Our investigation was contextualized in a dyadic sandwich-making scenario in which a "worker" prepared a sandwich by adding ingredients requested by a "customer." In this context, we investigated the extent to which the customers' gaze cues serve as predictors of which ingredients they intend to request. Predictive features were derived to represent characteristics of the customers' gaze patterns. We developed a support vector machine-based (SVM-based) model that achieved 76% accuracy in predicting the customers' intended requests based solely on gaze features. Moreover, the predictor made correct predictions approximately 1.8 s before the spoken request from the customer. We further analyzed several episodes of interactions from our data to develop a deeper understanding of the scenarios where our predictor succeeded and failed in making correct predictions. These analyses revealed additional gaze patterns that may be leveraged to improve intention prediction. This work highlights gaze cues as a significant resource for understanding human intentions and informs the design of real-time recognizers of user intention for intelligent systems, such as assistive robots and ubiquitous devices, that may enable more complex capabilities and improved user experience.
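    The study's predictor was an SVM over derived gaze features; as a much simpler stand-in, the core intuition (gaze foreshadows the spoken request) can be sketched with a most-fixated-ingredient rule (hypothetical labels, not the paper's feature set):

```python
from collections import Counter

def predict_request(gaze_targets):
    """Predict the intended ingredient from recent gaze samples.

    `gaze_targets` is a time-ordered list of the ingredient the
    customer is looking at per sample (None = elsewhere). The
    published model used an SVM over richer features; this stand-in
    simply returns the most-fixated ingredient in the window.
    """
    counts = Counter(t for t in gaze_targets if t is not None)
    return counts.most_common(1)[0][0] if counts else None

# Gaze dwells mostly on 'tomato' shortly before the request is spoken:
window = ['lettuce', 'tomato', 'tomato', None, 'tomato', 'cheese']
guess = predict_request(window)
```

    Running such a rule over a sliding window ending before speech onset mirrors how the SVM could anticipate requests roughly 1.8 s early.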

  11. Aberrant face and gaze habituation in fragile x syndrome.

    PubMed

    Bruno, Jennifer Lynn; Garrett, Amy S; Quintin, Eve-Marie; Mazaika, Paul K; Reiss, Allan L

    2014-10-01

    The authors sought to investigate neural system habituation to face and eye gaze in fragile X syndrome, a disorder characterized by eye-gaze aversion, among other social and cognitive deficits. Participants (ages 15-25 years) were 30 individuals with fragile X syndrome (females, N=14) and a comparison group of 25 individuals without fragile X syndrome (females, N=12) matched for general cognitive ability and autism symptoms. Functional MRI (fMRI) was used to assess brain activation during a gaze habituation task. Participants viewed repeated presentations of four unique faces with either direct or averted eye gaze and judged the direction of eye gaze. Four participants (males, N=4/4; fragile X syndrome, N=3) were excluded because of excessive head motion during fMRI scanning. Behavioral performance did not differ between the groups. Less neural habituation (and significant sensitization) in the fragile X syndrome group was found in the cingulate gyrus, fusiform gyrus, and frontal cortex in response to all faces (direct and averted gaze). Left fusiform habituation in female participants was directly correlated with higher, more typical levels of the fragile X mental retardation protein and inversely correlated with autism symptoms. There was no evidence for differential habituation to direct gaze compared with averted gaze within or between groups. Impaired habituation and accentuated sensitization in response to face/eye gaze was distributed across multiple levels of neural processing. These results could help inform interventions, such as desensitization therapy, which may help patients with fragile X syndrome modulate anxiety and arousal associated with eye gaze, thereby improving social functioning.

  12. Locations of serial reach targets are coded in multiple reference frames.

    PubMed

    Thompson, Aidan A; Henriques, Denise Y P

    2010-12-01

    Previous work from our lab, and elsewhere, has demonstrated that remembered target locations are stored and updated in an eye-fixed reference frame. That is, reach errors systematically vary as a function of gaze direction relative to a remembered target location, not only when the target is viewed in the periphery (Bock, 1986, known as the retinal magnification effect), but also when the target has been foveated, and the eyes subsequently move after the target has disappeared but prior to reaching (e.g., Henriques, Klier, Smith, Lowy, & Crawford, 1998; Sorrento & Henriques, 2008; Thompson & Henriques, 2008). These gaze-dependent errors, following intervening eye movements, cannot be explained by representations whose frame is fixed to the head, body or even the world. However, it is unknown whether targets presented sequentially would all be coded relative to gaze (i.e., egocentrically/absolutely), or if they would be coded relative to the previous target (i.e., allocentrically/relatively). It might be expected that the reaching movements to two targets separated by 5° would differ by that distance. But, if gaze were to shift between the first and second reaches, would the movement amplitude between the targets differ? If the target locations are coded allocentrically (i.e., the location of the second target coded relative to the first) then the movement amplitude should be about 5°. But, if the second target is coded egocentrically (i.e., relative to current gaze direction), then the reaches to this target and the distances between the subsequent movements should vary systematically with gaze as described above. We found that requiring an intervening saccade to the opposite side of 2 briefly presented targets between reaches to them resulted in a pattern of reaching error that systematically varied as a function of the distance between current gaze and target, and led to a systematic change in the distance between the sequential reach endpoints as predicted by an egocentric frame anchored to the eye. However, the amount of change in this distance was smaller than predicted by a pure eye-fixed representation, suggesting that relative positions of the targets or allocentric coding was also used in sequential reach planning. The spatial coding and updating of sequential reach target locations seems to rely on a combined weighting of multiple reference frames, with one of them centered on the eye. Copyright © 2010 Elsevier Ltd. All rights reserved.

  13. Brief Report: Using a Point-of-View Camera to Measure Eye Gaze in Young Children with Autism Spectrum Disorder during Naturalistic Social Interactions--A Pilot Study

    ERIC Educational Resources Information Center

    Edmunds, Sarah R.; Rozga, Agata; Li, Yin; Karp, Elizabeth A.; Ibanez, Lisa V.; Rehg, James M.; Stone, Wendy L.

    2017-01-01

    Children with autism spectrum disorder (ASD) show reduced gaze to social partners. Eye contact during live interactions is often measured using stationary cameras that capture various views of the child, but determining a child's precise gaze target within another's face is nearly impossible. This study compared eye gaze coding derived from…

  14. Use of a Remote Eye-Tracker for the Analysis of Gaze during Treadmill Walking and Visual Stimuli Exposition.

    PubMed

    Serchi, V; Peruzzi, A; Cereatti, A; Della Croce, U

    2016-01-01

    The knowledge of the visual strategies adopted while walking in cognitively engaging environments is extremely valuable. Analyzing gaze when a treadmill and a virtual reality environment are used as motor rehabilitation tools is therefore critical. Being completely unobtrusive, remote eye-trackers are the most appropriate way to measure the point of gaze. Still, point-of-gaze measurements are affected by experimental conditions such as head range of motion and visual stimuli. This study assesses the usability limits and measurement reliability of a remote eye-tracker during treadmill walking while visual stimuli are projected. During treadmill walking, the head remained within the remote eye-tracker workspace. Generally, the quality of the point-of-gaze measurements declined as the distance from the remote eye-tracker increased, and data loss occurred for large gaze angles. The stimulus location (a dot-target) did not influence the point-of-gaze accuracy, precision, and trackability during both standing and walking. Similar results were obtained when the dot-target was replaced by a static or moving 2D target and "region of interest" analysis was applied. These findings support the feasibility of using a remote eye-tracker for the analysis of gaze during treadmill walking in virtual reality environments.
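    Accuracy and precision of point-of-gaze data are typically computed from samples around a known target; a minimal sketch using one common pair of definitions (the paper's exact formulas may differ):

```python
import numpy as np

def accuracy_precision(gaze_deg, target_deg):
    """Accuracy and precision of point-of-gaze measurements, in degrees.

    gaze_deg: (N, 2) gaze samples (horizontal, vertical visual angle).
    target_deg: (2,) true location of the dot-target.
    Accuracy   = mean angular offset of samples from the target.
    Precision  = RMS of successive sample-to-sample angular distances.
    """
    gaze = np.asarray(gaze_deg, dtype=float)
    offsets = np.linalg.norm(gaze - target_deg, axis=1)
    steps = np.linalg.norm(np.diff(gaze, axis=0), axis=1)
    return offsets.mean(), np.sqrt((steps ** 2).mean())

# Three samples while fixating a target at (0, 0) degrees:
samples = [[0.4, 0.0], [0.6, 0.0], [0.5, 0.1]]
acc, prec = accuracy_precision(samples, np.array([0.0, 0.0]))
```

    Trackability (the fraction of valid samples) would be computed separately from the tracker's validity flags.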

  15. Active head rotations and eye-head coordination

    NASA Technical Reports Server (NTRS)

    Zangemeister, W. H.; Stark, L.

    1981-01-01

    It is pointed out that head movements play an important role in gaze. The interaction between eye and head movements involves both their shared role in directing gaze and the compensatory vestibular ocular reflex. The dynamics of head trajectories are discussed, taking into account the use of parameterization to obtain the peak velocity, peak accelerations, the times of these extrema, and the duration of the movement. Attention is given to the main sequence, neck muscle EMG and details of the head-movement trajectory, types of head model accelerations, the latency of eye and head movement in coordinated gaze, gaze latency as a function of various factors, and coordinated gaze types. Clinical examples of gaze-plane analysis are considered along with the instantaneous change of compensatory eye movement (CEM) gain, and aspects of variability.
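    The parameterization mentioned above (peak velocity, peak acceleration, times of the extrema, movement duration) can be extracted from a sampled head trajectory by numerical differentiation; a sketch with an illustrative onset/offset threshold:

```python
import numpy as np

def trajectory_params(pos_deg, dt):
    """Main-sequence-style parameters from a position trace (degrees).

    Velocity and acceleration come from numerical differentiation;
    movement onset/offset use a 5% of peak-velocity threshold, which
    is an illustrative choice, not the authors' criterion.
    """
    vel = np.gradient(pos_deg, dt)
    acc = np.gradient(vel, dt)
    moving = np.abs(vel) > 0.05 * np.abs(vel).max()
    idx = np.flatnonzero(moving)
    return {
        'peak_velocity': np.abs(vel).max(),
        't_peak_velocity': np.abs(vel).argmax() * dt,
        'peak_acceleration': np.abs(acc).max(),
        'duration': (idx[-1] - idx[0] + 1) * dt,
    }

# A smooth 30-degree head movement lasting about 0.5 s, 200 Hz samples:
t = np.arange(0, 0.6, 0.005)
pos = 15.0 * (1 - np.cos(np.clip(t / 0.5, 0, 1) * np.pi))
params = trajectory_params(pos, 0.005)
```

    Plotting peak velocity against amplitude across many such movements gives the main-sequence relationship discussed in the paper.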

  16. Live interaction distinctively shapes social gaze dynamics in rhesus macaques.

    PubMed

    Dal Monte, Olga; Piva, Matthew; Morris, Jason A; Chang, Steve W C

    2016-10-01

    The dynamic interaction of gaze between individuals is a hallmark of social cognition. However, very few studies have examined social gaze dynamics after mutual eye contact during real-time interactions. We used a highly quantifiable paradigm to assess social gaze dynamics between pairs of monkeys and modeled these dynamics using an exponential decay function to investigate sustained attention after mutual eye contact. When monkeys were interacting with real partners compared with static images and movies of the same monkeys, we found a significant increase in the proportion of fixations to the eyes and a smaller dispersion of fixations around the eyes, indicating enhanced focal attention to the eye region. Notably, dominance and familiarity between the interacting pairs induced separable components of gaze dynamics that were unique to live interactions. Gaze dynamics of dominant monkeys after mutual eye contact were associated with a greater number of fixations to the eyes, whereas those of familiar pairs were associated with a faster rate of decrease in this eye-directed attention. Our findings endorse the notion that certain key aspects of social cognition are only captured during interactive social contexts and dependent on the elapsed time relative to socially meaningful events. Copyright © 2016 the American Physiological Society.
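    The exponential-decay modeling of eye-directed attention after mutual eye contact can be sketched with a simple linearized least-squares fit (a stand-in; the study's actual fitting procedure is not specified here):

```python
import numpy as np

def fit_decay(t, y):
    """Fit y = a * exp(-t / tau) by least squares on log(y).

    `t` is time since mutual eye contact (s) and `y` a positive
    attention measure (e.g. proportion of fixations to the eyes).
    Returns (a, tau); a larger tau means slower loss of eye-directed
    attention after the contact event.
    """
    slope, intercept = np.polyfit(t, np.log(y), 1)
    return np.exp(intercept), -1.0 / slope

# Synthetic noiseless trace decaying with tau = 2 s:
t = np.linspace(0, 5, 20)
a, tau = fit_decay(t, 0.8 * np.exp(-t / 2.0))
```

    Comparing fitted tau values between conditions (live partner vs. movie, familiar vs. unfamiliar pair) quantifies the rate differences reported above.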

  17. Live interaction distinctively shapes social gaze dynamics in rhesus macaques

    PubMed Central

    Piva, Matthew; Morris, Jason A.; Chang, Steve W. C.

    2016-01-01

    The dynamic interaction of gaze between individuals is a hallmark of social cognition. However, very few studies have examined social gaze dynamics after mutual eye contact during real-time interactions. We used a highly quantifiable paradigm to assess social gaze dynamics between pairs of monkeys and modeled these dynamics using an exponential decay function to investigate sustained attention after mutual eye contact. When monkeys were interacting with real partners compared with static images and movies of the same monkeys, we found a significant increase in the proportion of fixations to the eyes and a smaller dispersion of fixations around the eyes, indicating enhanced focal attention to the eye region. Notably, dominance and familiarity between the interacting pairs induced separable components of gaze dynamics that were unique to live interactions. Gaze dynamics of dominant monkeys after mutual eye contact were associated with a greater number of fixations to the eyes, whereas those of familiar pairs were associated with a faster rate of decrease in this eye-directed attention. Our findings endorse the notion that certain key aspects of social cognition are only captured during interactive social contexts and dependent on the elapsed time relative to socially meaningful events. PMID:27486105

  18. Stimulus exposure and gaze bias: a further test of the gaze cascade model.

    PubMed

    Glaholt, Mackenzie G; Reingold, Eyal M

    2009-04-01

    We tested predictions derived from the gaze cascade model of preference decision making (Shimojo, Simion, Shimojo, & Scheier, 2003; Simion & Shimojo, 2006, 2007). In each trial, participants' eye movements were monitored while they performed an eight-alternative decision task in which four of the items in the array were preexposed prior to the trial. Replicating previous findings, we found a gaze bias toward the chosen item prior to the response. However, contrary to the prediction of the gaze cascade model, preexposure of stimuli decreased, rather than increased, the magnitude of the gaze bias in preference decisions. Furthermore, unlike the prediction of the model, preexposure did not affect the likelihood of an item being chosen, and the pattern of looking behavior in preference decisions and in a non-preference control task was remarkably similar. Implications of the present findings for multistage models of decision making are discussed.

  19. Helmet Mounted Eye Tracking for Virtual Panoramic Displays. Volume 1: Review of Current Eye Movement Measurement Technology

    DTIC Science & Technology

    1989-08-01

    paths for integration with the off-aperture and dual-mirror VPD designs. PREFACE The goal of this work was to explore integration of an eye line-of-gaze ...Relationship in one plane between point-of-gaze on a flat scene and relative eye, detector, and scene positions...and eye line-of-gaze measurement. As a first step towards the design of an appropriate eye tracking system for interface with the virtual cockpit

  20. Eye gaze performance for children with severe physical impairments using gaze-based assistive technology-A longitudinal study.

    PubMed

    Borgestig, Maria; Sandqvist, Jan; Parsons, Richard; Falkmer, Torbjörn; Hemmingsson, Helena

    2016-01-01

    Gaze-based assistive technology (gaze-based AT) has the potential to provide children affected by severe physical impairments with opportunities for communication and activities. This study aimed to examine changes in eye gaze performance over time (time on task and accuracy) in children with severe physical impairments, without speaking ability, using gaze-based AT. A longitudinal study with a before and after design was conducted on 10 children (aged 1-15 years) with severe physical impairments, who were beginners to gaze-based AT at baseline. Thereafter, all children used the gaze-based AT in daily activities over the course of the study. Compass computer software was used to measure time on task and accuracy with eye selection of targets on screen, and tests were performed with the children at baseline, after 5 months, 9-11 months, and after 15-20 months. Findings showed that the children improved in time on task after 5 months and became more accurate in selecting targets after 15-20 months. This study indicates that these children with severe physical impairments, who were unable to speak, could improve in eye gaze performance. However, the children needed time to practice on a long-term basis to acquire skills needed to develop fast and accurate eye gaze performance.

  1. See You See Me: the Role of Eye Contact in Multimodal Human-Robot Interaction.

    PubMed

    Xu, Tian Linger; Zhang, Hui; Yu, Chen

    2016-05-01

    We focus on a fundamental looking behavior in human-robot interactions - gazing at each other's face. Eye contact and mutual gaze between two social partners are critical in smooth human-human interactions. Therefore, investigating at what moments and in what ways a robot should look at a human user's face as a response to the human's gaze behavior is an important topic. Toward this goal, we developed a gaze-contingent human-robot interaction system, which relied on momentary gaze behaviors from a human user to control an interacting robot in real time. Using this system, we conducted an experiment in which human participants interacted with the robot in a joint attention task. In the experiment, we systematically manipulated the robot's gaze toward the human partner's face in real time and then analyzed the human's gaze behavior as a response to the robot's gaze behavior. We found that more face looks from the robot led to more look-backs (to the robot's face) from human participants and consequently created more mutual gaze and eye contact between the two. Moreover, participants demonstrated more coordinated and synchronized multimodal behaviors between speech and gaze when more eye contact was successfully established and maintained.

  2. Transcranial magnetic stimulation over the cerebellum delays predictive head movements in the coordination of gaze.

    PubMed

    Zangemeister, W H; Nagel, M

    2001-01-01

    We investigated coordinated saccadic eye and head movements following predictive horizontal visual targets at +/- 30 degrees by applying transcranial magnetic stimulation (TMS) over the cerebellum before the start of the gaze movement in 10 young subjects. We found three effects of TMS on eye-head movements: 1. Saccadic latency effect. When stimulation took place shortly before movements commenced (75-25 ms before), significantly shorter latencies were found between predictive target presentation and initiation of saccades. Eye latencies were significantly decreased by 45 ms on average, but head latencies were not. 2. Gaze amplitude effect. Without TMS, for the 60 degrees target amplitudes, head movements usually preceded eye movements, as expected (predictive gaze type 3). With TMS 5-75 ms before the gaze movement, the number of eye movements preceding head movements by 20-50 ms was significantly increased (p < 0.001) and the delay between eye and head movements was reversed (p < 0.001), i.e. we found eye-predictive gaze type 1. 3. Saccadic peak velocity effect. For TMS 5-25 ms before the start of head movement, mean peak velocity of synkinetic eye saccades increased by 20-30% up to 600 degrees/s, compared to 350-400 degrees/s without TMS. We conclude that transient functional cerebellar deficits exerted by means of TMS can change the central synkinesis of eye-head coordination, including the preprogramming of the saccadic pulse and step of a coordinated gaze movement.

  3. Anxiety and sensitivity to gaze direction in emotionally expressive faces.

    PubMed

    Fox, Elaine; Mathews, Andrew; Calder, Andrew J; Yiend, Jenny

    2007-08-01

    This study investigated the role of neutral, happy, fearful, and angry facial expressions in enhancing orienting to the direction of eye gaze. Photographs of faces with either direct or averted gaze were presented. A target letter (T or L) appeared unpredictably to the left or the right of the face, either 300 ms or 700 ms after gaze direction changed. Response times were faster in congruent conditions (i.e., when the eyes gazed toward the target) relative to incongruent conditions (when the eyes gazed away from the target letter). Facial expression did influence reaction times, but these effects were qualified by individual differences in self-reported anxiety. High trait-anxious participants showed an enhanced orienting to the eye gaze of faces with fearful expressions relative to all other expressions. In contrast, when the eyes stared straight ahead, trait anxiety was associated with slower responding when the facial expressions depicted anger. Thus, in anxiety-prone people attention is more likely to be held by an expression of anger, whereas attention is guided more potently by fearful facial expressions. ((c) 2007 APA, all rights reserved).

  4. Observing Shared Attention Modulates Gaze Following

    ERIC Educational Resources Information Center

    Bockler, Anne; Knoblich, Gunther; Sebanz, Natalie

    2011-01-01

    Humans' tendency to follow others' gaze is considered to be rather resistant to top-down influences. However, recent evidence indicates that gaze following depends on prior eye contact with the observed agent. Does observing two people engaging in eye contact also modulate gaze following? Participants observed two faces looking at each other or…

  5. Speaking and Listening with the Eyes: Gaze Signaling during Dyadic Interactions.

    PubMed

    Ho, Simon; Foulsham, Tom; Kingstone, Alan

    2015-01-01

    Cognitive scientists have long been interested in the role that eye gaze plays in social interactions. Previous research suggests that gaze acts as a signaling mechanism and can be used to control turn-taking behaviour. However, early research on this topic employed methods of analysis that aggregated gaze information across an entire trial (or trials), which masks any temporal dynamics that may exist in social interactions. More recently, attempts have been made to understand the temporal characteristics of social gaze but little research has been conducted in a natural setting with two interacting participants. The present study combines a temporally sensitive analysis technique with modern eye tracking technology to 1) validate the overall results from earlier aggregated analyses and 2) provide insight into the specific moment-to-moment temporal characteristics of turn-taking behaviour in a natural setting. Dyads played two social guessing games (20 Questions and Heads Up) while their eyes were tracked. Our general results are in line with past aggregated data, and using cross-correlational analysis on the specific gaze and speech signals of both participants we found that 1) speakers end their turn with direct gaze at the listener and 2) the listener in turn begins to speak with averted gaze. Convergent with theoretical models of social interaction, our data suggest that eye gaze can be used to signal both the end and the beginning of a speaking turn during a social interaction. The present study offers insight into the temporal dynamics of live dyadic interactions and also provides a new method of analysis for eye gaze data when temporal relationships are of interest.
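    The cross-correlational analysis of gaze and speech signals boils down to finding the lag at which one time series best matches the other; a minimal sketch on synthetic binary signals (illustrative only, not the study's pipeline):

```python
import numpy as np

def peak_lag(a, b, dt):
    """Lag (s) at which signal `b` best matches signal `a`.

    Positive lag means b follows a. Signals are mean-centered and
    cross-correlated; `dt` is the sample period in seconds.
    """
    a = np.asarray(a, float) - np.mean(a)
    b = np.asarray(b, float) - np.mean(b)
    xcorr = np.correlate(b, a, mode='full')
    lags = np.arange(-len(a) + 1, len(b)) * dt
    return lags[np.argmax(xcorr)]

# Listener's speech onset trails speaker-directed gaze by 0.3 s
# (10 Hz samples; 1 = gaze on partner / speaking, 0 = otherwise):
gaze = np.zeros(100); gaze[20:40] = 1
speech = np.zeros(100); speech[23:43] = 1
lag = peak_lag(gaze, speech, 0.1)  # ≈ 0.3 s
```

    Applied to real dyad data, the sign and size of such lags capture the gaze-then-speak turn-taking pattern described above.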

  6. Eye gazing direction inspection based on image processing technique

    NASA Astrophysics Data System (ADS)

    Hao, Qun; Song, Yong

    2005-02-01

    According to research results in neural biology, human eyes obtain high resolution only at the center of the field of view. In our research on a Virtual Reality helmet, we aim to detect the gaze direction of human eyes in real time and feed it back to the control system to improve the resolution of the graphics at the center of the field of view. Given current display instruments, this method can balance the field of view of the virtual scene against resolution, and greatly improve the immersion of the virtual system. Therefore, detecting the gaze direction of human eyes rapidly and accurately is the basis for realizing this novel VR helmet design. In this paper, the conventional method of gaze-direction detection based on the Purkinje spot is introduced first. To overcome the disadvantages of the Purkinje-spot method, we propose a method based on image processing to detect and determine the gaze direction. The locations of the pupils and the shapes of the eye sockets change with gaze direction; by analyzing images of the eyes captured by cameras, the gaze direction of human eyes can be determined. Experiments were conducted to validate the efficiency of this method. The algorithm detects gaze direction directly from normal eye images, eliminating the need for special hardware. Experimental results show that the method is easy to implement and has high precision.
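    The core image-processing idea above (pupil location shifting with gaze direction) can be sketched as locating the pupil as the centroid of dark pixels in a grayscale eye image. This is a hypothetical minimal version, not the authors' algorithm; the array and threshold are invented for illustration.

```python
import numpy as np

def pupil_center(eye_image, threshold=50):
    """Estimate the pupil center as the centroid of dark pixels.
    eye_image: 2-D grayscale array (0 = black, 255 = white)."""
    mask = eye_image < threshold        # dark pixels ~ pupil
    ys, xs = np.nonzero(mask)           # row (y) and column (x) indices
    if len(xs) == 0:
        return None                     # no pupil found
    return float(xs.mean()), float(ys.mean())

# Hypothetical 8x8 "eye" with a dark 2x2 pupil at columns 5-6, rows 3-4.
img = np.full((8, 8), 200, dtype=np.uint8)
img[3:5, 5:7] = 10
cx, cy = pupil_center(img)
# Gaze direction would then be inferred from (cx, cy) relative to the
# eye-socket center, as the abstract describes.
```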

  7. A kinematic model for 3-D head-free gaze-shifts

    PubMed Central

    Daemi, Mehdi; Crawford, J. Douglas

    2015-01-01

    Rotations of the line of sight are mainly implemented by coordinated motion of the eyes and head. Here, we propose a model for the kinematics of three-dimensional (3-D) head-unrestrained gaze-shifts. The model was designed to account for major principles in the known behavior, such as gaze accuracy, spatiotemporal coordination of saccades with vestibulo-ocular reflex (VOR), relative eye and head contributions, the non-commutativity of rotations, and Listing's and Fick constraints for the eyes and head, respectively. The internal design of the model was inspired by known and hypothesized elements of gaze control physiology. Inputs included retinocentric location of the visual target and internal representations of initial 3-D eye and head orientation, whereas outputs were 3-D displacements of eye relative to the head and head relative to shoulder. Internal transformations decomposed the 2-D gaze command into 3-D eye and head commands with the use of three coordinated circuits: (1) a saccade generator, (2) a head rotation generator, (3) a VOR predictor. Simulations illustrate that the model can implement: (1) the correct 3-D reference frame transformations to generate accurate gaze shifts (despite variability in other parameters), (2) the experimentally verified constraints on static eye and head orientations during fixation, and (3) the experimentally observed 3-D trajectories of eye and head motion during gaze-shifts. We then use this model to simulate how 2-D eye-head coordination strategies interact with 3-D constraints to influence 3-D orientations of the eye-in-space, and the implications of this for spatial vision. PMID:26113816

  8. A Web Browsing System by Eye-gaze Input

    NASA Astrophysics Data System (ADS)

    Abe, Kiyohiko; Owada, Kosuke; Ohi, Shoichi; Ohyama, Minoru

    We have developed an eye-gaze input system for people with severe physical disabilities, such as amyotrophic lateral sclerosis (ALS) patients. This system utilizes a personal computer and a home video camera to detect eye-gaze under natural light. The system detects both vertical and horizontal eye-gaze by simple image analysis, and does not require special image processing units or sensors. We also developed the platform for eye-gaze input based on our system. In this paper, we propose a new web browsing system for physically disabled computer users as an application of the platform for eye-gaze input. The proposed web browsing system uses a method of direct indicator selection. The method categorizes indicators by their function. These indicators are hierarchized relations; users can select the felicitous function by switching indicators group. This system also analyzes the location of selectable object on web page, such as hyperlink, radio button, edit box, etc. This system stores the locations of these objects, in other words, the mouse cursor skips to the object of candidate input. Therefore it enables web browsing at a faster pace.

  9. Eye Gaze and Production Accuracy Predict English L2 Speakers' Morphosyntactic Learning

    ERIC Educational Resources Information Center

    McDonough, Kim; Trofimovich, Pavel; Dao, Phung; Dio, Alexandre

    2017-01-01

    This study investigated the relationship between second language (L2) speakers' success in learning a new morphosyntactic pattern and characteristics of one-on-one learning activities, including opportunities to comprehend and produce the target pattern, receive feedback from an interlocutor, and attend to the meaning of the pattern through self-…

  10. Gaze Toward Naturalistic Social Scenes by Individuals With Intellectual and Developmental Disabilities: Implications for Augmentative and Alternative Communication Designs.

    PubMed

    Liang, Jiali; Wilkinson, Krista

    2018-04-18

    A striking characteristic of the social communication deficits in individuals with autism is atypical patterns of eye contact during social interactions. We used eye-tracking technology to evaluate how the number of human figures depicted and the presence of sharing activity between the human figures in still photographs influenced visual attention by individuals with autism, typical development, or Down syndrome. We sought to examine visual attention to the contents of visual scene displays, a growing form of augmentative and alternative communication support. Eye-tracking technology recorded point-of-gaze while participants viewed 32 photographs in which either 2 or 3 human figures were depicted. Sharing activities between these human figures were either present or absent. The sampling rate was 60 Hz; that is, the technology gathered 60 samples of gaze behavior per second, per participant. Gaze behaviors, including latency to fixate and time spent fixating, were quantified. The overall gaze behaviors were quite similar across groups, regardless of the social content depicted. However, individuals with autism were significantly slower than the other groups in latency to first view the human figures, especially when there were 3 people depicted in the photographs (as compared with 2 people). When participants' own viewing pace was considered, individuals with autism resembled those with Down syndrome. The current study supports the inclusion of social content with various numbers of human figures and sharing activities between human figures into visual scene displays, regardless of the population served. Study design and reporting practices in eye-tracking literature as it relates to autism and Down syndrome are discussed. https://doi.org/10.23641/asha.6066545.
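    With a 60 Hz sampler like the one described, latency to first fixation and total fixation time on an area of interest (AOI) reduce to simple sample counting. The AOI rectangle and sample stream below are hypothetical, for illustration only:

```python
SAMPLE_RATE_HZ = 60  # 60 gaze samples per second, as in the study

def aoi_metrics(samples, aoi):
    """samples: list of (x, y) gaze points; aoi: (x0, y0, x1, y1) rectangle.
    Returns (latency_to_first_fixation_s, total_fixation_time_s)."""
    x0, y0, x1, y1 = aoi
    hits = [x0 <= x <= x1 and y0 <= y <= y1 for x, y in samples]
    if not any(hits):
        return None, 0.0                       # AOI never fixated
    latency = hits.index(True) / SAMPLE_RATE_HZ
    total = sum(hits) / SAMPLE_RATE_HZ
    return latency, total

# Hypothetical stream: gaze enters the AOI at the 6th sample (index 5)
# and stays there for 12 samples.
samples = [(0, 0)] * 5 + [(50, 50)] * 12 + [(0, 0)] * 3
latency, total = aoi_metrics(samples, aoi=(40, 40, 60, 60))
# latency = 5/60 s; total = 12/60 = 0.2 s
```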

  11. Dual Purkinje-Image Eyetracker

    DTIC Science & Technology

    1996-01-01

    Abnormal nystagmus can also be detected through the use of an eyetracker [4]. Through tracking points of eye gaze within a scene, it is possible to…moving, even when gazing. Correcting for these unpredictable micro eye movements would allow corrective procedures in eye surgery to become more accurate…victim with a screen of letters on a monitor. A calibrated eyetracker then provides a processor with information about the location of eye gaze. The

  12. Stabilization of gaze during circular locomotion in light. I. Compensatory head and eye nystagmus in the running monkey

    NASA Technical Reports Server (NTRS)

    Solomon, D.; Cohen, B.

    1992-01-01

    1. A rhesus and a cynomolgus monkey were trained to run around the perimeter of a circular platform in light. We call this "circular locomotion" because forward motion had an angular component. Head and body velocity in space were recorded with angular rate sensors and eye movements with electrooculography (EOG). From these measurements we derived signals related to the angular velocity of the eyes in the head (Eh), of the head on the body (Hb), of gaze on the body (Gb), of the body in space (Bs), of gaze in space (Gs), and of the gain of gaze (Gb/Bs). 2. The monkeys had continuous compensatory nystagmus of the head and eyes while running, which stabilized Gs during the slow phases. The eyes established and maintained compensatory gaze velocities at the beginning and end of the slow phases. The head contributed to gaze velocity during the middle of the slow phases. Slow phase Gb was as high as 250 degrees/s, and targets were fixated for gaze angles as large as 90-140 degrees. 3. Properties of the visual surround affected both the gain and strategy of gaze compensation in the one monkey tested. Gains of Eh ranged from 0.3 to 1.1 during compensatory gaze nystagmus. Gains of Hb varied around 0.3 (0.2-0.7), building to a maximum as Eh dropped while running past sectors of interest. Consistent with predictions, gaze gains varied from below to above unity, when translational and angular body movements with regard to the target were in opposite or the same directions, respectively. 4. Gaze moved in saccadic shifts in the direction of running during quick phases. Most head quick phases were small, and at times the head only paused during an eye quick phase. Eye quick phases were larger, ranging up to 60 degrees. This is larger than quick phases during passive rotation or saccades made with the head fixed. 5. These data indicate that head and eye nystagmus are natural phenomena that support gaze compensation during locomotion. Despite differential utilization of the head and eyes in various conditions, Gb compensated for Bs. There are various frames of reference on which an estimate of the angular velocity that drives the head and eyes could be based. We infer that body-in-space velocity (Bs) is likely to be represented centrally to provide this signal.
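    The derived signals in this record follow simple kinematic identities: gaze-on-body velocity is the sum of eye-in-head and head-on-body velocities, gaze-in-space adds body-in-space velocity, and gaze gain is Gb/Bs. A worked sketch with hypothetical slow-phase velocities (not the recorded data):

```python
def gaze_signals(Eh, Hb, Bs):
    """Angular-velocity bookkeeping for the signals named above (deg/s).
    Eh: eye-in-head, Hb: head-on-body, Bs: body-in-space."""
    Gb = Eh + Hb      # gaze on body
    Gs = Gb + Bs      # gaze in space (0 when gaze is stabilized)
    gain = Gb / Bs    # gain of gaze (magnitude ~1 for full compensation)
    return Gb, Gs, gain

# Hypothetical slow-phase sample: eyes and head together cancel a
# 100 deg/s body rotation, stabilizing gaze in space.
Gb, Gs, gain = gaze_signals(Eh=-70.0, Hb=-30.0, Bs=100.0)
# Gb = -100 deg/s, Gs = 0, |gain| = 1
```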

  13. Viewing condition dependence of the gaze-evoked nystagmus in Arnold Chiari type 1 malformation.

    PubMed

    Ghasia, Fatema F; Gulati, Deepak; Westbrook, Edward L; Shaikh, Aasef G

    2014-04-15

    Saccadic eye movements rapidly shift gaze to the target of interest. Once the eyes reach a given target, the brainstem ocular motor integrator utilizes feedback from various sources to assure steady gaze. One such source is the cerebellum, whose lesions can impair neural integration, leading to gaze-evoked nystagmus. Gaze-evoked nystagmus is characterized by drifts moving the eyes away from the target and a null position where the drifts are absent. The extent of impairment in the neural integration for two opposite eccentricities might determine the location of the null position. The position of the eye in the orbit might also determine the location of the null. We report this phenomenon in a patient with Arnold Chiari type 1 malformation who had intermittent esotropia and horizontal gaze-evoked nystagmus with a shift in the null position. During binocular viewing, the null was shifted to the right. During monocular viewing, when the eye under cover drifted nasally (secondary to the esotropia), the null of the gaze-evoked nystagmus reorganized toward the center. We speculate that the output of the neural integrator is altered by the conflicting eye-in-orbit positions of the two eyes secondary to the strabismus. This could explain the reorganization of the location of the null position. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Combined influence of vergence and eye position on three-dimensional vestibulo-ocular reflex in the monkey.

    PubMed

    Misslisch, H; Hess, B J M

    2002-11-01

    This study examined two kinematic features of the rotational vestibulo-ocular reflex (VOR) of the monkey in near vision. First, is there an effect of eye position on the axes of eye rotation during yaw, pitch and roll head rotations when the eyes are converged to fixate near targets? Second, do the three-dimensional positions of the left and right eye during yaw and roll head rotations obey the binocular extension of Listing's law (L2), showing eye position planes that rotate temporally by a quarter as far as the angle of horizontal vergence? Animals fixated near visual targets requiring 17 or 8.5 degrees of vergence, placed straight ahead or 20 degrees up, down, left, or right, during yaw, pitch, and roll head rotations at 1 Hz. The 17 degrees vergence experiments were performed both with and without a structured visual background; the 8.5 degrees vergence experiments were performed with a visual background only. A 40 degrees horizontal change in eye position never influenced the axis of eye rotation produced by the VOR during pitch head rotation. Eye position did not affect the VOR eye rotation axes, which stayed aligned with the yaw and roll head rotation axes, when torsional gain was high. If torsional gain was low, eccentric eye positions produced yaw and roll VOR eye rotation axes that tilted somewhat in the directions predicted by Listing's law, i.e., with or opposite to gaze during yaw or roll. These findings were seen in both visual conditions and in both vergence experiments. During yaw and roll head rotations with a 40 degrees vertical change in gaze, torsional eye position followed on average the prediction of L2: the left eye showed counterclockwise (ex-) torsion in down gaze and clockwise (in-) torsion in up gaze, and vice versa for the right eye. In other words, each eye's position plane rotated temporally by about a quarter of the horizontal vergence angle. Our results indicate that torsional gain is the central mechanism by which the brain adjusts the retinal-image-stabilizing function of the VOR in both far and near vision, and that the three-dimensional eye positions during yaw and roll head rotations in near vision follow, on average, the predictions of L2, a kinematic pattern that is maintained by the saccadic/quick-phase system.

  15. Collective Behaviour in Video Viewing: A Thermodynamic Analysis of Gaze Position.

    PubMed

    Burleson-Lesser, Kate; Morone, Flaviano; DeGuzman, Paul; Parra, Lucas C; Makse, Hernán A

    2017-01-01

    Videos and commercials produced for large audiences can elicit mixed opinions. We wondered whether this diversity is also reflected in the way individuals watch the videos. To answer this question, we presented 65 commercials with high production value to 25 individuals while recording their eye movements, and asked them to provide preference ratings for each video. We find that gaze positions for the most popular videos are highly correlated. To explain the correlations of eye movements, we model them as "interactions" between individuals. A thermodynamic analysis of these interactions shows that they approach a "critical" point such that any stronger interaction would put all viewers into lock-step and any weaker interaction would fully randomise patterns. At this critical point, groups with similar collective behaviour in viewing patterns emerge while maintaining diversity between groups. Our results suggest that popularity of videos is already evident in the way we look at them, and that we maintain diversity in viewing behaviour even as distinct patterns of groups emerge. Our results can be used to predict popularity of videos and commercials at the population level from the collective behaviour of the eye movements of a few viewers.
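    The inter-viewer correlation of gaze positions that this analysis builds on can be sketched as the mean pairwise Pearson correlation across viewers' gaze traces. The traces below are synthetic and purely illustrative, not the study's data:

```python
import numpy as np

def mean_pairwise_correlation(gaze_traces):
    """gaze_traces: array of shape (n_viewers, n_timepoints) holding, e.g.,
    horizontal gaze position over time. Returns the mean pairwise Pearson r
    across all viewer pairs."""
    r = np.corrcoef(gaze_traces)      # n_viewers x n_viewers matrix
    iu = np.triu_indices(r.shape[0], k=1)   # upper triangle, no diagonal
    return float(r[iu].mean())

# Hypothetical: three viewers tracking the same on-screen motion plus noise.
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 6, 200))
viewers = signal + 0.1 * rng.standard_normal((3, 200))
r_popular = mean_pairwise_correlation(viewers)  # high for "locked-in" viewing
```

    In the study's terms, stronger interactions push this correlation toward lock-step (r near 1), while weaker ones randomize it toward 0.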

  16. Intermediate view synthesis for eye-gazing

    NASA Astrophysics Data System (ADS)

    Baek, Eu-Ttuem; Ho, Yo-Sung

    2015-01-01

    Nonverbal communication, also known as body language, is an important form of communication. Nonverbal behaviors such as posture, eye contact, and gestures send strong messages. In regard to nonverbal communication, eye contact is one of the most important forms that an individual can use. However, lack of eye contact occurs when we use video conferencing system. The disparity between locations of the eyes and a camera gets in the way of eye contact. The lock of eye gazing can give unapproachable and unpleasant feeling. In this paper, we proposed an eye gazing correction for video conferencing. We use two cameras installed at the top and the bottom of the television. The captured two images are rendered with 2D warping at virtual position. We implement view morphing to the detected face, and synthesize the face and the warped image. Experimental results verify that the proposed system is effective in generating natural gaze-corrected images.

  17. Gaze entropy reflects surgical task load.

    PubMed

    Di Stasi, Leandro L; Diaz-Piedra, Carolina; Rieiro, Héctor; Sánchez Carrión, José M; Martin Berrido, Mercedes; Olivares, Gonzalo; Catena, Andrés

    2016-11-01

    Task (over-)load imposed on surgeons is a main contributing factor to surgical errors. Recent research has shown that gaze metrics represent a valid and objective index to assess operator task load in non-surgical scenarios. Thus, gaze metrics have the potential to improve workplace safety by providing accurate measurements of task load variations. However, the direct relationship between gaze metrics and surgical task load has not yet been investigated. We studied the effects of surgical task complexity on the gaze metrics of surgical trainees. We recorded the eye movements of 18 surgical residents, using a mobile eye tracker system, during the performance of three high-fidelity virtual simulations of laparoscopic exercises of increasing complexity: the Clip Applying exercise, the Cutting Big exercise, and the Translocation of Objects exercise. We also measured performance accuracy and subjective ratings of complexity. Gaze entropy and velocity increased linearly with task complexity: visual exploration patterns became less stereotyped (i.e., more random) and faster during the more complex exercises. Residents performed the Clip Applying and Cutting Big exercises better than the Translocation of Objects exercise, and their perceived task complexity differed accordingly. Our data show that gaze metrics are a valid and reliable surgical task load index. These findings could help improve patient safety by providing accurate measurements of surgeon task (over-)load and might provide future indices to assess residents' learning curves, independently of expensive virtual simulators or time-consuming expert evaluation.
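    Gaze entropy as used here is typically the Shannon entropy of the distribution of fixations over screen regions: higher entropy means a more random, less stereotyped scan pattern. A minimal sketch (the grid size and fixation lists are hypothetical, not the study's parameters):

```python
import math
from collections import Counter

def gaze_entropy(fixations, n_cols=4, n_rows=4, width=1.0, height=1.0):
    """Shannon entropy (bits) of fixation counts over an n_cols x n_rows grid
    covering a width x height screen. fixations: list of (x, y) points."""
    cells = Counter(
        (min(int(x / width * n_cols), n_cols - 1),
         min(int(y / height * n_rows), n_rows - 1))
        for x, y in fixations
    )
    total = sum(cells.values())
    return -sum((c / total) * math.log2(c / total) for c in cells.values())

# Hypothetical: all fixations in one cell vs. spread over four cells.
stereotyped = [(0.1, 0.1)] * 8
random_ish  = [(0.1, 0.1), (0.9, 0.1), (0.1, 0.9), (0.9, 0.9)] * 2
e_low = gaze_entropy(stereotyped)   # 0.0 bits: fully stereotyped
e_high = gaze_entropy(random_ish)   # 2.0 bits: uniform over 4 cells
```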

  18. Measure and Analysis of a Gaze Position Using Infrared Light Technique

    DTIC Science & Technology

    2001-10-25

    MEASURE AND ANALYSIS OF A GAZE POSITION USING INFRARED LIGHT TECHNIQUE Z. Ramdane-Cherif, A. Naït-Ali, J. F. Motsch, M. O. Krebs, INSERM E 01-17…also proposes a method to correct head movements. Keywords: eye movement, gaze tracking, visual scan path, spatial mapping. INTRODUCTION The eye gaze…tracking has been used for clinical purposes to detect illnesses, such as nystagmus, unusual eye movements and many others [1][2][3]. It is also used

  19. Visual–Motor Transformations Within Frontal Eye Fields During Head-Unrestrained Gaze Shifts in the Monkey

    PubMed Central

    Sajad, Amirsaman; Sadeh, Morteza; Keith, Gerald P.; Yan, Xiaogang; Wang, Hongying; Crawford, John Douglas

    2015-01-01

    A fundamental question in sensorimotor control concerns the transformation of spatial signals from the retina into eye and head motor commands required for accurate gaze shifts. Here, we investigated these transformations by identifying the spatial codes embedded in visually evoked and movement-related responses in the frontal eye fields (FEFs) during head-unrestrained gaze shifts. Monkeys made delayed gaze shifts to the remembered location of briefly presented visual stimuli, with delay serving to dissociate visual and movement responses. A statistical analysis of nonparametric model fits to response field data from 57 neurons (38 with visual and 49 with movement activities) eliminated most effector-specific, head-fixed, and space-fixed models, but confirmed the dominance of eye-centered codes observed in head-restrained studies. More importantly, the visual response encoded target location, whereas the movement response mainly encoded the final position of the imminent gaze shift (including gaze errors). This spatiotemporal distinction between target and gaze coding was present not only at the population level, but even at the single-cell level. We propose that an imperfect visual–motor transformation occurs during the brief memory interval between perception and action, and further transformations from the FEF's eye-centered gaze motor code to effector-specific codes in motor frames occur downstream in the subcortical areas. PMID:25491118

  20. See You See Me: the Role of Eye Contact in Multimodal Human-Robot Interaction

    PubMed Central

    XU, TIAN (LINGER); ZHANG, HUI; YU, CHEN

    2016-01-01

    We focus on a fundamental looking behavior in human-robot interactions – gazing at each other’s face. Eye contact and mutual gaze between two social partners are critical in smooth human-human interactions. Therefore, investigating at what moments and in what ways a robot should look at a human user’s face as a response to the human’s gaze behavior is an important topic. Toward this goal, we developed a gaze-contingent human-robot interaction system, which relied on momentary gaze behaviors from a human user to control an interacting robot in real time. Using this system, we conducted an experiment in which human participants interacted with the robot in a joint attention task. In the experiment, we systematically manipulated the robot’s gaze toward the human partner’s face in real time and then analyzed the human’s gaze behavior as a response to the robot’s gaze behavior. We found that more face looks from the robot led to more look-backs (to the robot’s face) from human participants and consequently created more mutual gaze and eye contact between the two. Moreover, participants demonstrated more coordinated and synchronized multimodal behaviors between speech and gaze when more eye contact was successfully established and maintained. PMID:28966875

  1. Speaking and Listening with the Eyes: Gaze Signaling during Dyadic Interactions

    PubMed Central

    Ho, Simon; Foulsham, Tom; Kingstone, Alan

    2015-01-01

    Cognitive scientists have long been interested in the role that eye gaze plays in social interactions. Previous research suggests that gaze acts as a signaling mechanism and can be used to control turn-taking behaviour. However, early research on this topic employed methods of analysis that aggregated gaze information across an entire trial (or trials), which masks any temporal dynamics that may exist in social interactions. More recently, attempts have been made to understand the temporal characteristics of social gaze but little research has been conducted in a natural setting with two interacting participants. The present study combines a temporally sensitive analysis technique with modern eye tracking technology to 1) validate the overall results from earlier aggregated analyses and 2) provide insight into the specific moment-to-moment temporal characteristics of turn-taking behaviour in a natural setting. Dyads played two social guessing games (20 Questions and Heads Up) while their eyes were tracked. Our general results are in line with past aggregated data, and using cross-correlational analysis on the specific gaze and speech signals of both participants we found that 1) speakers end their turn with direct gaze at the listener and 2) the listener in turn begins to speak with averted gaze. Convergent with theoretical models of social interaction, our data suggest that eye gaze can be used to signal both the end and the beginning of a speaking turn during a social interaction. The present study offers insight into the temporal dynamics of live dyadic interactions and also provides a new method of analysis for eye gaze data when temporal relationships are of interest. PMID:26309216

  2. Gaze toward Naturalistic Social Scenes by Individuals with Intellectual and Developmental Disabilities: Implications for Augmentative and Alternative Communication Designs

    ERIC Educational Resources Information Center

    Liang, Jiali; Wilkinson, Krista

    2018-01-01

    Purpose: A striking characteristic of the social communication deficits in individuals with autism is atypical patterns of eye contact during social interactions. We used eye-tracking technology to evaluate how the number of human figures depicted and the presence of sharing activity between the human figures in still photographs influenced visual…

  3. Real-time estimation of horizontal gaze angle by saccade integration using in-ear electrooculography

    PubMed Central

    2018-01-01

    The manuscript proposes and evaluates a real-time algorithm for estimating eye gaze angle based solely on single-channel electrooculography (EOG), which can be obtained directly from the ear canal using conductive ear moulds. In contrast to conventional high-pass filtering, we used an algorithm that calculates absolute eye gaze angle via statistical analysis of detected saccades. The estimated eye positions of the new algorithm were still noisy, but performance in terms of Pearson product-moment correlation coefficients was significantly better than the conventional approach in some instances. The results suggest that in-ear EOG signals captured with conductive ear moulds could serve as a basis for light-weight, portable horizontal eye gaze angle estimation suitable for a broad range of applications, for instance, allowing hearing aids to steer the directivity of their microphones in the direction of the user's eye gaze. PMID:29304120

  4. Real-time estimation of horizontal gaze angle by saccade integration using in-ear electrooculography.

    PubMed

    Hládek, Ľuboš; Porr, Bernd; Brimijoin, W Owen

    2018-01-01

    The manuscript proposes and evaluates a real-time algorithm for estimating eye gaze angle based solely on single-channel electrooculography (EOG), which can be obtained directly from the ear canal using conductive ear moulds. In contrast to conventional high-pass filtering, we used an algorithm that calculates absolute eye gaze angle via statistical analysis of detected saccades. The estimated eye positions of the new algorithm were still noisy, but performance in terms of Pearson product-moment correlation coefficients was significantly better than the conventional approach in some instances. The results suggest that in-ear EOG signals captured with conductive ear moulds could serve as a basis for light-weight, portable horizontal eye gaze angle estimation suitable for a broad range of applications, for instance, allowing hearing aids to steer the directivity of their microphones in the direction of the user's eye gaze.
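    The saccade-integration idea (as opposed to plain high-pass filtering) can be sketched as detecting velocity spikes in the EOG trace and summing their amplitudes to track absolute gaze angle. The threshold, calibration gain, and signal below are hypothetical assumptions, not the published algorithm:

```python
def estimate_gaze_angle(eog, threshold=5.0, gain=1.0):
    """Integrate detected saccades in a single-channel EOG trace.
    eog: list of voltage-like samples; returns a per-sample gaze angle
    estimate. Sample-to-sample jumps larger than `threshold` are treated
    as saccades and accumulated; slow drift (the usual EOG problem for
    direct integration) is ignored."""
    angle = 0.0
    angles = [angle]
    for prev, cur in zip(eog, eog[1:]):
        step = cur - prev
        if abs(step) > threshold:   # velocity spike -> saccade
            angle += gain * step    # add calibrated saccade amplitude
        angles.append(angle)
    return angles

# Hypothetical trace: two rightward saccades of 10 units each, riding on a
# slow drift of 0.1 units/sample that the method discards.
eog = [0, 0.1, 0.2, 10.2, 10.3, 10.4, 20.4, 20.5]
angles = estimate_gaze_angle(eog)
# Final estimate ~20.0: both saccades summed, drift ignored.
```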

  5. Deficits in eye gaze during negative social interactions in patients with schizophrenia.

    PubMed

    Choi, Soo-Hee; Ku, Jeonghun; Han, Kiwan; Kim, Eosu; Kim, Sun I; Park, Junyoung; Kim, Jae-Jin

    2010-11-01

    Impaired social functioning has been reported in patients with schizophrenia. This study aimed to examine characteristics of interpersonal behaviors in patients with schizophrenia during various social interactions using a virtual reality system. Twenty-six patients and 26 controls engaged in virtual conversation tasks, including 3 positive and 3 negative emotion-laden conversations. Eye gaze and other behavioral parameters were recorded during the listening and answering phases. The amount of eye gaze was smaller in the patients than in the controls. A significant interaction effect of group status and emotional type was found for the listening phase. The amount of eye gaze in the patients inversely correlated with self-rated scores of assertiveness for the listening phase. These results suggest that the patients displayed inadequate augmentation of eye gaze during negative emotional situations. These deficits should be considered in treatment and social skills training for patients with schizophrenia.

  6. Cultural differences in gaze and emotion recognition: Americans contrast more than Chinese.

    PubMed

    Stanley, Jennifer Tehan; Zhang, Xin; Fung, Helene H; Isaacowitz, Derek M

    2013-02-01

    We investigated the influence of contextual expressions on emotion recognition accuracy and gaze patterns among American and Chinese participants. We expected Chinese participants would be more influenced by, and attend more to, contextual information than Americans. Consistent with our hypothesis, Americans were more accurate than Chinese participants at recognizing emotions embedded in the context of other emotional expressions. Eye-tracking data suggest that, for some emotions, Americans attended more to the target faces, and they made more gaze transitions to the target face than Chinese. For all emotions except anger and disgust, Americans appeared to use more of a contrasting strategy where each face was individually contrasted with the target face, compared with Chinese who used less of a contrasting strategy. Both cultures were influenced by contextual information, although the benefit of contextual information depended upon the perceptual dissimilarity of the contextual emotions to the target emotion and the gaze pattern employed during the recognition task. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  7. Cultural Differences in Gaze and Emotion Recognition: Americans Contrast More than Chinese

    PubMed Central

    Tehan Stanley, Jennifer; Zhang, Xin; Fung, Helene H.; Isaacowitz, Derek M.

    2014-01-01

    We investigated the influence of contextual expressions on emotion recognition accuracy and gaze patterns among American and Chinese participants. We expected Chinese participants would be more influenced by, and attend more to, contextual information than Americans. Consistent with our hypothesis, Americans were more accurate than Chinese participants at recognizing emotions embedded in the context of other emotional expressions. Eye tracking data suggest that, for some emotions, Americans attended more to the target faces and made more gaze transitions to the target face than Chinese. For all emotions except anger and disgust, Americans appeared to use more of a contrasting strategy where each face was individually contrasted with the target face, compared with Chinese who used less of a contrasting strategy. Both cultures were influenced by contextual information, although the benefit of contextual information depended upon the perceptual dissimilarity of the contextual emotions to the target emotion and the gaze pattern employed during the recognition task. PMID:22889414

  8. Eye-tracking novice and expert geologist groups in the field and laboratory

    NASA Astrophysics Data System (ADS)

    Cottrell, R. D.; Evans, K. M.; Jacobs, R. A.; May, B. B.; Pelz, J. B.; Rosen, M. R.; Tarduno, J. A.; Voronov, J.

    2010-12-01

    We are using an Active Vision approach to learn how novices and expert geologists acquire visual information in the field. The Active Vision approach emphasizes that visual perception is an active process wherein new information is acquired about a particular environment through exploratory eye movements. Eye movements are not only influenced by physical stimuli, but are also strongly influenced by high-level perceptual and cognitive processes. Eye-tracking data were collected on ten novices (undergraduate geology students) and three experts during a 10-day field trip across California focused on neotectonics. In addition, high-resolution panoramic images were captured at each key locality for use in a semi-immersive laboratory environment. Examples of each data type will be presented. The number of observers will be increased in subsequent field trips, but expert/novice differences are already apparent in the first set of individual eye-tracking records, including gaze time, gaze pattern, and object recognition. We will review efforts to quantify these patterns and the development of semi-immersive environments to display geologic scenes. The research is a collaborative effort between Earth scientists, cognitive scientists, and imaging scientists at the University of Rochester and the Rochester Institute of Technology, with funding from the National Science Foundation.

  9. Perception and Processing of Faces in the Human Brain Is Tuned to Typical Feature Locations

    PubMed Central

    Schwarzkopf, D. Samuel; Alvarez, Ivan; Lawson, Rebecca P.; Henriksson, Linda; Kriegeskorte, Nikolaus; Rees, Geraint

    2016-01-01

    Faces are salient social stimuli whose features attract a stereotypical pattern of fixations. The implications of this gaze behavior for perception and brain activity are largely unknown. Here, we characterize and quantify a retinotopic bias implied by typical gaze behavior toward faces, which leads to eyes and mouth appearing most often in the upper and lower visual field, respectively. We found that the adult human visual system is tuned to these contingencies. In two recognition experiments, recognition performance for isolated face parts was better when they were presented at typical, rather than reversed, visual field locations. The recognition cost of reversed locations was equal to ∼60% of that for whole face inversion in the same sample. Similarly, an fMRI experiment showed that patterns of activity evoked by eye and mouth stimuli in the right inferior occipital gyrus could be separated with significantly higher accuracy when these features were presented at typical, rather than reversed, visual field locations. Our findings demonstrate that human face perception is determined not only by the local position of features within a face context, but also by whether features appear at the typical retinotopic location given normal gaze behavior. Such location sensitivity may reflect fine-tuning of category-specific visual processing to retinal input statistics. Our findings further suggest that retinotopic heterogeneity might play a role in face inversion effects and in understanding conditions affecting gaze behavior toward faces, such as autism spectrum disorders and congenital prosopagnosia. SIGNIFICANCE STATEMENT Faces attract our attention and trigger stereotypical patterns of visual fixations, concentrating on inner features, like eyes and mouth. Here we show that the visual system represents face features better when they are shown at retinal positions where they typically fall during natural vision. When facial features were shown at typical (rather than reversed) visual field locations, they were discriminated better by humans and could be decoded with higher accuracy from brain activity patterns in the right occipital face area. This suggests that brain representations of face features do not cover the visual field uniformly. It may help us understand the well-known face-inversion effect and conditions affecting gaze behavior toward faces, such as prosopagnosia and autism spectrum disorders. PMID:27605606

  10. The role of emotion in learning trustworthiness from eye-gaze: Evidence from facial electromyography

    PubMed Central

    Manssuer, Luis R.; Pawling, Ralph; Hayes, Amy E.; Tipper, Steven P.

    2016-01-01

    Gaze direction can be used to rapidly and reflexively lead or mislead others’ attention as to the location of important stimuli. When perception of gaze direction is congruent with the location of a target, responses are faster than when it is incongruent. Faces that consistently gaze congruently are also judged more trustworthy than faces that consistently gaze incongruently. However, it is unclear how gaze-cues elicit changes in trust. We measured facial electromyography (EMG) during an identity-contingent gaze-cueing task to examine whether embodied emotional reactions to gaze-cues mediate trust learning. Gaze-cueing effects were equivalent regardless of whether or not participants showed trust learning in the expected direction. In contrast, we found distinctly different patterns of EMG activity in these two populations. In a further experiment we showed that the learning effects were specific to viewing faces, as no changes in liking were detected when viewing arrows that evoked similar attentional orienting responses. These findings implicate embodied emotion in learning trust from identity-contingent gaze-cueing, possibly due to the social value of shared attention or deception rather than domain-general attentional orienting. PMID:27153239

  11. Patterns of Visual Attention to Faces and Objects in Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    McPartland, James C.; Webb, Sara Jane; Keehn, Brandon; Dawson, Geraldine

    2011-01-01

    This study used eye-tracking to examine visual attention to faces and objects in adolescents with autism spectrum disorder (ASD) and typical peers. Point of gaze was recorded during passive viewing of images of human faces, inverted human faces, monkey faces, three-dimensional curvilinear objects, and two-dimensional geometric patterns.…

  12. Dogs Evaluate Threatening Facial Expressions by Their Biological Validity – Evidence from Gazing Patterns

    PubMed Central

    Somppi, Sanni; Törnqvist, Heini; Kujala, Miiamaaria V.; Hänninen, Laura; Krause, Christina M.; Vainio, Outi

    2016-01-01

    Appropriate response to companions’ emotional signals is important for all social creatures. The emotional expressions of humans and non-human animals have analogies in their form and function, suggesting shared evolutionary roots, but very little is known about how animals other than primates view and process facial expressions. In primates, threat-related facial expressions evoke exceptional viewing patterns compared with neutral or positive stimuli. Here, we explore whether domestic dogs (Canis familiaris) have such an attentional bias toward threatening social stimuli and whether observed emotional expressions affect dogs’ gaze fixation distribution among the facial features (eyes, midface and mouth). We recorded the voluntary eye gaze of 31 domestic dogs during viewing of facial photographs of humans and dogs with three emotional expressions (threatening, pleasant and neutral). We found that dogs’ gaze fixations spread systematically among facial features. The distribution of fixations was altered by the seen expression, but the eyes were the most probable targets of the first fixations and gathered longer looking durations than the mouth regardless of the viewed expression. The examination of the inner facial features as a whole revealed more pronounced scanning differences among expressions. This suggests that dogs do not base their perception of facial expressions on the viewing of single structures, but on the interpretation of the composition formed by the eyes, midface and mouth. Dogs evaluated social threat rapidly, and this evaluation led to an attentional bias that was dependent on the depicted species: threatening conspecifics’ faces evoked heightened attention, whereas threatening human faces instead evoked an avoidance response. We propose that threatening signals carrying differential biological validity are processed via distinctive neurocognitive pathways. Both of these mechanisms may have an adaptive significance for domestic dogs. The findings provide a novel perspective on understanding the processing of emotional expressions and sensitivity to social threat in non-primates. PMID:26761433

  13. The importance of the eyes: communication skills in infants of blind parents.

    PubMed

    Senju, Atsushi; Tucker, Leslie; Pasco, Greg; Hudry, Kristelle; Elsabbagh, Mayada; Charman, Tony; Johnson, Mark H

    2013-06-07

    The effects of selectively different experience of eye contact and gaze behaviour on the early development of five sighted infants of blind parents were investigated. Infants were assessed longitudinally at 6-10, 12-15 and 24-47 months. Face scanning and gaze following were assessed using eye tracking. In addition, established measures of autistic-like behaviours and standardized tests of cognitive, motor and linguistic development, as well as observations of naturalistic parent-child interaction were collected. These data were compared with those obtained from a larger group of sighted infants of sighted parents. Infants with blind parents did not show an overall decrease in eye contact or gaze following when they observed sighted adults on video or in live interactions, nor did they show any autistic-like behaviours. However, they directed their own eye gaze somewhat less frequently towards their blind mothers and also showed improved performance in visual memory and attention at younger ages. Being reared with significantly reduced experience of eye contact and gaze behaviour does not preclude sighted infants from developing typical gaze processing and other social-communication skills. Indeed, the need to switch between different types of communication strategy may actually enhance other skills during development.

  14. Facial Expression Training Optimises Viewing Strategy in Children and Adults

    PubMed Central

    Pollux, Petra M. J.; Hall, Sophie; Guo, Kun

    2014-01-01

    This study investigated whether training-related improvements in facial expression categorization are facilitated by spontaneous changes in gaze behaviour in adults and nine-year-old children. Four sessions of a self-paced, free-viewing training task required participants to categorize happy, sad and fear expressions with varying intensities. No instructions about eye movements were given. Eye movements were recorded in the first and fourth training session. New faces were introduced in session four to establish transfer effects of learning. Adults focused most on the eyes in all sessions, and the post-training increase in expression categorization accuracy coincided with a strengthening of this eye bias in gaze allocation. In children, training-related behavioural improvements coincided with an overall shift in gaze focus towards the eyes (resulting in more adult-like gaze distributions) and towards the mouth for happy faces in the second fixation. Gaze distributions were not influenced by the expression intensity or by the introduction of new faces. It was proposed that training enhanced the use of a uniform, predominantly eyes-biased, gaze strategy in children in order to optimise extraction of relevant cues for discrimination between subtle facial expressions. PMID:25144680

  15. Mentalizing eye contact with a face on a video: Gaze direction does not influence autonomic arousal.

    PubMed

    Lyyra, Pessi; Myllyneva, Aki; Hietanen, Jari K

    2018-04-26

    Recent research has revealed enhanced autonomic and subjective responses to eye contact only when perceiving another live person. However, these enhanced responses to eye contact are abolished if the viewer believes that the other person is not able to look back at the viewer. We sought to investigate whether this "genuine" eye contact effect can be reproduced with pre-recorded videos of stimulus persons. Autonomic responses, gaze behavior, and subjective self-assessments were measured while participants viewed pre-recorded video persons with direct or averted gaze, imagined that the video person was real, and mentalized that the person could or could not see them. Pre-recorded videos did not evoke the physiological or subjective eye contact effects previously observed with live persons, not even when the participants were mentalizing being seen by the person. Gaze-tracking results showed, however, increased attention allocation to faces with direct gaze compared to averted gaze directions. The results suggest that elicitation of the physiological arousal in response to genuine eye contact seems to require the spontaneous experience of seeing and of being seen by another individual. © 2018 Scandinavian Psychological Associations and John Wiley & Sons Ltd.

  16. The imitation game: Effects of social cues on 'imitation' are domain-general in nature.

    PubMed

    Marsh, Lauren E; Bird, Geoffrey; Catmur, Caroline

    2016-10-01

    Imitation has been hailed as 'social glue', facilitating rapport with others. Previous studies suggest that social cues modulate imitation, but the mechanism of such modulation remains underspecified. Here we examine the locus, specificity, and neural basis of the social control of imitation. Social cues (group membership and eye gaze) were manipulated during an imitation task in which imitative and spatial compatibility could be measured independently. Participants were faster to perform compatible than incompatible movements in both spatial and imitative domains. However, only spatial compatibility was modulated by social cues: an interaction between group membership and eye gaze revealed more spatial compatibility for ingroup members with direct gaze and outgroup members with averted gaze. The fMRI data were consistent with this finding. Regions associated with the control of imitative responding (temporoparietal junction, inferior frontal gyrus) were more active during imitatively incompatible than imitatively compatible trials. However, this activity was not modulated by social cues. In contrast, an interaction between group, gaze and spatial compatibility was found in the dorsolateral prefrontal cortex in a pattern consistent with reaction times. This region may be exerting control over the motor system to modulate response inhibition. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Coordination of eye and head components of movements evoked by stimulation of the paramedian pontine reticular formation.

    PubMed

    Gandhi, Neeraj J; Barton, Ellen J; Sparks, David L

    2008-07-01

    Constant frequency microstimulation of the paramedian pontine reticular formation (PPRF) in head-restrained monkeys evokes a constant velocity eye movement. Since the PPRF receives significant projections from structures that control coordinated eye-head movements, we asked whether stimulation of the pontine reticular formation in the head-unrestrained animal generates a combined eye-head movement or only an eye movement. Microstimulation of most sites yielded a constant-velocity gaze shift executed as a coordinated eye-head movement, although eye-only movements were evoked from some sites. The eye and head contributions to the stimulation-evoked movements varied across stimulation sites and were drastically different from the lawful relationship observed for visually-guided gaze shifts. These results indicate that the microstimulation activated elements that issued movement commands to the extraocular and, for most sites, neck motoneurons. In addition, the stimulation-evoked changes in gaze were similar in the head-restrained and head-unrestrained conditions despite the assortment of eye and head contributions, suggesting that the vestibulo-ocular reflex (VOR) gain must be near unity during the coordinated eye-head movements evoked by stimulation of the PPRF. These findings contrast with the attenuation of VOR gain associated with visually-guided gaze shifts and suggest that the vestibulo-ocular pathway processes volitional and PPRF stimulation-evoked gaze shifts differently.

  18. Oxytocin increases attention to the eyes and selectively enhances self-reported affective empathy for fear.

    PubMed

    Hubble, Kelly; Daughters, Katie; Manstead, Antony S R; Rees, Aled; Thapar, Anita; van Goozen, Stephanie H M

    2017-11-01

    Oxytocin (OXT) has previously been implicated in a range of prosocial behaviors such as trust and emotion recognition. Nevertheless, recent studies have questioned the evidence for this link. In addition, there has been relatively little conclusive research on the effect of OXT on empathic ability and such studies as there are have not examined the mechanisms through which OXT might affect empathy, or whether OXT selectively facilitates empathy for specific emotions. In the current study, we used eye-tracking to assess attention to socially relevant information while participants viewed dynamic, empathy-inducing video clips, in which protagonists expressed sadness, happiness, pain or fear. In a double-blind, within-subjects, randomized control trial, 40 healthy male participants received 24 IU intranasal OXT or placebo in two identical experimental sessions, separated by a 2-week interval. OXT led to an increase in time spent fixating upon the eye-region of the protagonist's face across emotions. OXT also selectively enhanced self-reported affective empathy for fear, but did not affect cognitive or affective empathy for other emotions. Nevertheless, there was no positive relationship between eye-gaze patterns and affective empathy, suggesting that although OXT influences eye-gaze and may enhance affective empathy for fear, these two systems are independent. Future studies need to further examine the effect of OXT on eye-gaze to fully ascertain whether this can explain the improvements in emotional behavior. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Oxytocin Reduces Face Processing Time but Leaves Recognition Accuracy and Eye-Gaze Unaffected.

    PubMed

    Hubble, Kelly; Daughters, Katie; Manstead, Antony S R; Rees, Aled; Thapar, Anita; van Goozen, Stephanie H M

    2017-01-01

    Previous studies have found that oxytocin (OXT) can improve the recognition of emotional facial expressions; it has been proposed that this effect is mediated by an increase in attention to the eye-region of faces. Nevertheless, evidence in support of this claim is inconsistent, and few studies have directly tested the effect of oxytocin on emotion recognition via altered eye-gaze. Methods: In a double-blind, within-subjects, randomized control experiment, 40 healthy male participants received 24 IU intranasal OXT and placebo in two identical experimental sessions separated by a 2-week interval. Visual attention to the eye-region was assessed on both occasions while participants completed a static facial emotion recognition task using medium intensity facial expressions. Although OXT had no effect on emotion recognition accuracy, face processing was faster across emotions under the influence of OXT; this effect was marginally significant (p < .06). Consistent with a previous study using dynamic stimuli, OXT had no effect on eye-gaze patterns when viewing static emotional faces, and this was not related to recognition accuracy or face processing time. These findings suggest that OXT-induced enhanced facial emotion recognition is not necessarily mediated by an increase in attention to the eye-region of faces, as previously assumed. We discuss several methodological issues which may explain discrepant findings and suggest that the effect of OXT on visual attention may differ depending on task requirements. (JINS, 2017, 23, 23-33).

  20. Visual-Motor Transformations Within Frontal Eye Fields During Head-Unrestrained Gaze Shifts in the Monkey.

    PubMed

    Sajad, Amirsaman; Sadeh, Morteza; Keith, Gerald P; Yan, Xiaogang; Wang, Hongying; Crawford, John Douglas

    2015-10-01

    A fundamental question in sensorimotor control concerns the transformation of spatial signals from the retina into eye and head motor commands required for accurate gaze shifts. Here, we investigated these transformations by identifying the spatial codes embedded in visually evoked and movement-related responses in the frontal eye fields (FEFs) during head-unrestrained gaze shifts. Monkeys made delayed gaze shifts to the remembered location of briefly presented visual stimuli, with delay serving to dissociate visual and movement responses. A statistical analysis of nonparametric model fits to response field data from 57 neurons (38 with visual and 49 with movement activities) eliminated most effector-specific, head-fixed, and space-fixed models, but confirmed the dominance of eye-centered codes observed in head-restrained studies. More importantly, the visual response encoded target location, whereas the movement response mainly encoded the final position of the imminent gaze shift (including gaze errors). This spatiotemporal distinction between target and gaze coding was present not only at the population level, but even at the single-cell level. We propose that an imperfect visual-motor transformation occurs during the brief memory interval between perception and action, and further transformations from the FEF's eye-centered gaze motor code to effector-specific codes in motor frames occur downstream in the subcortical areas. © The Author 2014. Published by Oxford University Press.

  1. Teachers' Experiences of Using Eye Gaze-Controlled Computers for Pupils with Severe Motor Impairments and without Speech

    ERIC Educational Resources Information Center

    Rytterström, Patrik; Borgestig, Maria; Hemmingsson, Helena

    2016-01-01

    The purpose of this study is to explore teachers' experiences of using eye gaze-controlled computers with pupils with severe disabilities. Technology to control a computer with eye gaze is a fast growing field and has promising implications for people with severe disabilities. This is a new assistive technology and a new learning situation for…

  2. Looking while Listening and Speaking: Eye-to-Face Gaze in Adolescents with and without Traumatic Brain Injury

    ERIC Educational Resources Information Center

    Turkstra, Lyn S.

    2005-01-01

    Purpose: The purpose of this study was to address the lack of quantitative data on eye-to-face gaze (also known as eye contact) in the literature on pragmatic communication. The study focused on adolescents and young adults with traumatic brain injury (TBI), as gaze often is included in social skills intervention in this population. Method: Gaze…

  3. Automatic attentional orienting to other people's gaze in schizophrenia.

    PubMed

    Langdon, Robyn; Seymour, Kiley; Williams, Tracey; Ward, Philip B

    2017-08-01

    Explicit tests of social cognition have revealed pervasive deficits in schizophrenia. Less is known of automatic social cognition in schizophrenia. We used a spatial orienting task to investigate automatic shifts of attention cued by another person's eye gaze in 29 patients and 28 controls. Central photographic images of a face with eyes shifted left or right, or looking straight ahead, preceded targets that appeared left or right of the cue. To examine automatic effects, cue direction was non-predictive of target location. Cue-target intervals were 100, 300, and 800 ms. In non-social control trials, arrows replaced eye-gaze cues. Both groups showed automatic attentional orienting indexed by faster reaction times (RTs) when arrows were congruent with target location across all cue-target intervals. Similar congruency effects were seen for eye-shift cues at 300 and 800 ms intervals, but patients showed significantly larger congruency effects at 800 ms, which were driven by delayed responses to incongruent target locations. At short 100-ms cue-target intervals, neither group showed faster RTs for congruent than for incongruent eye-shift cues, but patients were significantly slower to detect targets after direct-gaze cues. These findings conflict with previous studies using schematic line drawings of eye-shifts that have found automatic attentional orienting to be reduced in schizophrenia. Instead, our data indicate that patients display abnormalities in responding to gaze direction at various stages of gaze processing, reflected by a stronger preferential capture of attention by another person's direct eye contact at initial stages of gaze processing and by difficulties disengaging from a gazed-at location once shared attention is established.

  4. Development of Gaze Following Abilities in Wolves (Canis Lupus)

    PubMed Central

    Range, Friederike; Virányi, Zsófia

    2011-01-01

    The ability to coordinate with others' head and eye orientation to look in the same direction is considered a key step towards an understanding of others' mental states, like attention and intention. Here, we investigated the ontogeny and habituation patterns of gaze following into distant space and behind barriers in nine hand-raised wolves. We found that these wolves could use conspecific as well as human gaze cues even in the barrier task, which is thought to be more cognitively advanced than gazing into distant space. Moreover, while gaze following into distant space was already present at the age of 14 weeks and subjects did not habituate to repeated cues, gazing around a barrier developed considerably later and animals quickly habituated, supporting the hypothesis that different cognitive mechanisms may underlie the two gaze following modalities. More importantly, this study demonstrated that following another individual's gaze around a barrier is not restricted to primates and corvids but is also present in canines, with remarkable between-group similarities in the ontogeny of this behaviour. This sheds new light on the evolutionary origins of and selective pressures on gaze following abilities as well as on the sensitivity of domestic dogs towards human communicative cues. PMID:21373192

  5. Tracking down the path of memory: eye scanpaths facilitate retrieval of visuospatial information.

    PubMed

    Bochynska, Agata; Laeng, Bruno

    2015-09-01

    Recent research points to a crucial role of eye fixations on the same spatial locations where an item appeared when learned, for the successful retrieval of stored information (e.g., Laeng et al. in Cognition 131:263-283, 2014. doi: 10.1016/j.cognition.2014.01.003 ). However, evidence about whether the specific temporal sequence (i.e., scanpath) of these eye fixations is also relevant for the accuracy of memory remains unclear. In the current study, eye fixations were recorded while participants looked at a checkerboard-like pattern. In a recognition session (48 h later), animations were shown where each square that formed the pattern was presented one by one, either according to the same, idiosyncratic, temporal sequence in which the squares were originally viewed by each participant or in a shuffled sequence, although the squares were, in both conditions, always in their correct positions. Afterward, participants judged whether they had seen the same pattern before or not. Showing the elements serially according to the original scanpath's sequence yielded significantly better recognition performance than the shuffled condition. In a forced fixation condition, where the gaze was maintained on the center of the screen, the memory-accuracy advantage for same versus shuffled scanpaths disappeared. In conclusion, gaze scanpaths (i.e., the order of fixations, not simply their positions) are functional to visual memory, and physically reenacting the original, embodied perception can facilitate retrieval.
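
    The same-versus-shuffled manipulation above turns on quantifying how far a replayed presentation order departs from an observer's original fixation sequence. As an illustrative sketch only (this is not the authors' analysis, and the function and variable names are hypothetical), an edit distance over sequences of fixated squares captures exactly the order information that the shuffled condition destroys:

```python
# Hedged illustration: Levenshtein distance between two scanpaths,
# each encoded as a list of fixated region indices (e.g., square
# numbers on a checkerboard-like pattern). Names are hypothetical.

def scanpath_edit_distance(original, replayed):
    """Minimum number of insertions, deletions, and substitutions
    needed to turn `original` into `replayed`."""
    m, n = len(original), len(replayed)
    # prev[j] holds the distance between original[:i-1] and replayed[:j].
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if original[i - 1] == replayed[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # deletion
                          curr[j - 1] + 1,     # insertion
                          prev[j - 1] + cost)  # substitution
        prev = curr
    return prev[n]

# A replay in the original order has distance 0; a shuffled replay
# of the very same squares does not.
same = scanpath_edit_distance([3, 1, 4, 2], [3, 1, 4, 2])      # 0
shuffled = scanpath_edit_distance([3, 1, 4, 2], [2, 4, 1, 3])  # 4
```

    A metric like this only scores fixation order; it deliberately ignores spatial position, which in the study was held constant across conditions.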

  6. Does the implicit models of leadership influence the scanning of other-race faces in adults?

    PubMed Central

    Densten, Iain L.; Borrowman, Luc

    2017-01-01

    The current study aims to identify the relationships between implicit leadership theory (ILT) prototypes/anti-prototypes and five facial features (i.e., nasion, upper nose, lower nose, and upper lip) of a leader from a different race than respondents. A sample of 81 Asian respondents viewed a 30-second video of a Caucasian female who, in a non-engaging manner, talked about her career achievements. As participants watched the video, their eye movements were recorded via an eye-tracking device. While previous research has identified that ILT influences perceptual and attitudinal ratings of leaders, the current study extends these findings by confirming the impact of ILT on the gaze patterns of other-race participants, who appear to adopt System 1-type thinking. This study advances our understanding of how cognitive categories or schemas influence the physicality of individuals (i.e., eye gaze or movements). Finally, this study confirms that individual ILT factors have a relationship with the eye movements of participants and suggests future research directions. PMID:28686605

  7. Allocentrically implied target locations are updated in an eye-centred reference frame.

    PubMed

    Thompson, Aidan A; Glover, Christopher V; Henriques, Denise Y P

    2012-04-18

    When reaching to remembered target locations following an intervening eye movement, a systematic pattern of error is found, indicating eye-centred updating of visuospatial memory. Here we investigated whether implicit targets, defined only by allocentric visual cues, are also updated in an eye-centred reference frame as explicit targets are. Participants viewed vertical bars separated by varying distances, and horizontal lines of equivalently varying lengths, implying a "target" location at the midpoint of the stimulus. After determining the implied "target" location from only the allocentric stimuli provided, participants saccaded to an eccentric location and reached to the remembered "target" location. Irrespective of the type of stimulus, reaching errors to these implicit targets are gaze-dependent and do not differ from those found when reaching to remembered explicit targets. Implicit target locations are coded and updated as a function of relative gaze direction with respect to those implied locations, just as explicit targets are, even though no target is specifically represented. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  9. Nucleus prepositus hypoglossi lesions produce a unique ocular motor syndrome

    PubMed Central

    Kim, Sung-Hee; Zee, David S.; du Lac, Sascha; Kim, Hyo Jung

    2016-01-01

    Objective: To describe the ocular motor abnormalities in 9 patients with a lesion involving the nucleus prepositus hypoglossi (NPH), a key constituent of a vestibular-cerebellar-brainstem neural network that ensures that the eyes are held steady in all positions of gaze. Methods: We recorded eye movements, including the vestibulo-ocular reflex during head impulses, in patients with vertigo and a lesion involving the NPH. Results: Our patients showed an ipsilesional-beating spontaneous nystagmus, horizontal gaze-evoked nystagmus more intense on looking toward the ipsilesional side, impaired pursuit more to the ipsilesional side, central patterns of head-shaking nystagmus, contralateral eye deviation, and decreased vestibulo-ocular reflex gain during contralesionally directed head impulses. Conclusions: We attribute these findings to an imbalance in the NPH–inferior olive–flocculus–vestibular nucleus loop, and the ocular motor abnormalities provide a new brainstem localization for patients with acute vertigo. PMID:27733568

  10. Eye gaze tracking using correlation filters

    NASA Astrophysics Data System (ADS)

    Karakaya, Mahmut; Bolme, David; Boehnen, Chris

    2014-03-01

    In this paper, we studied a method for eye gaze tracking that provides gaze estimation from a standard webcam with a zoom lens and reduces the setup and calibration requirements for new users. Specifically, we have developed a gaze estimation method based on the relative locations of points on the top of the eyelid and the eye corners. The gaze estimation method in this paper is based on the distances between the top point of the eyelid and the eye corners detected by the correlation filters. Advanced correlation filters were found to provide facial landmark detections that are accurate enough to determine the subject's gaze direction to within approximately 4-5 degrees, although calibration errors often produce a larger overall shift in the estimates. This corresponds approximately to a circle 2 inches in diameter on a screen at arm's length from the subject. At this accuracy it is possible to determine which regions of text or images the subject is looking at, but it falls short of being able to determine which word the subject has looked at.
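
    The distance-based estimator described in this record can be illustrated with a minimal sketch. The abstract does not give the exact mapping from landmark distances to gaze angles, so the normalized-offset projection and the `gain_deg` scale factor below are illustrative assumptions, and landmark detection (done with correlation filters in the paper) is assumed to have happened upstream.

```python
import numpy as np

def gaze_angle_from_landmarks(eyelid_top, inner_corner, outer_corner,
                              gain_deg=30.0):
    """Estimate a horizontal gaze angle (degrees) from 2D eye landmarks.

    The top-of-eyelid point shifts toward the corner the eye is looking
    at; we express its position as a normalized offset along the
    corner-to-corner axis and scale it to degrees (hypothetical gain).
    """
    eyelid_top = np.asarray(eyelid_top, dtype=float)
    inner = np.asarray(inner_corner, dtype=float)
    outer = np.asarray(outer_corner, dtype=float)
    axis = outer - inner                      # eye axis vector
    width = np.linalg.norm(axis)
    # Project the eyelid point onto the corner-to-corner axis: 0..1
    t = np.dot(eyelid_top - inner, axis) / (width ** 2)
    # Centered offset: 0 when the eyelid apex is midway between corners
    offset = t - 0.5
    return gain_deg * offset
```

    For example, an eyelid apex midway between the corners yields 0 degrees, and an apex three-quarters of the way toward the outer corner yields a quarter of `gain_deg`.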

  11. Decline of vertical gaze and convergence with aging.

    PubMed

    Oguro, Hiroaki; Okada, Kazunori; Suyama, Nobuo; Yamashita, Kazuya; Yamaguchi, Shuhei; Kobayashi, Shotai

    2004-01-01

    Disturbance of vertical eye movement and ocular convergence is often observed in elderly people, but little is known about its frequency. The purpose of this study was to investigate age-associated changes in vertical eye movement and convergence in healthy elderly people, using a digital video camera system. We analyzed vertical eye movements and convergence in 113 neurologically normal elderly subjects (mean age 70 years) in comparison with 20 healthy young controls (mean age 32 years). The range of vertical eye movement was analyzed quantitatively and convergence was analyzed qualitatively. In the elderly subjects, the angle of vertical gaze decreased with advancing age and it was significantly smaller than that of the younger subjects. The mean angle of upward gaze was significantly smaller than that of downward gaze for both young and elderly subjects. Upward gaze impairment became apparent in subjects in their 70s, and downward gaze impairment in subjects in their 60s. Disturbance in convergence also increased with advancing age, and was found in 40.7% of the elderly subjects. These findings indicate that the mechanisms of age-related change are different for upward and downward vertical gaze. Digital video camera monitoring was useful for assessing and monitoring eye movements. Copyright 2004 S. Karger AG, Basel

  12. Quantifying the cognitive cost of laparo-endoscopic single-site surgeries: Gaze-based indices.

    PubMed

    Di Stasi, Leandro L; Díaz-Piedra, Carolina; Ruiz-Rabelo, Juan Francisco; Rieiro, Héctor; Sanchez Carrion, Jose M; Catena, Andrés

    2017-11-01

    Despite the growing interest in the laparo-endoscopic single-site surgery (LESS) procedure, LESS presents multiple difficulties and challenges that are likely to increase the surgeon's cognitive cost, in terms of both cognitive load and performance. Nevertheless, there is currently no objective index capable of assessing the surgeon's cognitive cost while performing LESS. We assessed whether gaze-based indices might offer unique and unbiased measures to quantify LESS complexity and its cognitive cost. We expect the assessment of surgeons' cognitive cost to improve patient safety by measuring fitness-for-duty and reducing surgeon overload. Using a wearable eye-tracker device, we measured gaze entropy and velocity of surgical trainees and attending surgeons during two surgical procedures (LESS vs. multiport laparoscopy surgery [MPS]). None of the participants had previous experience with LESS. They performed two exercises with different complexity levels (low: Pattern Cut vs. high: Peg Transfer). We also collected performance and subjective data. LESS caused higher cognitive demand than MPS, as indicated by increased gaze entropy in both surgical trainees and attending surgeons (the exploration pattern became more random). Furthermore, gaze velocity was higher (the exploration pattern became more rapid) for the LESS procedure independently of the surgeon's expertise. Perceived task complexity and laparoscopic accuracy confirmed the gaze-based results. Gaze-based indices have great potential as objective and non-intrusive measures to assess surgeons' cognitive cost and fitness-for-duty. Furthermore, gaze-based indices might play a relevant role in defining future guidelines on surgeons' examinations to mark their achievements during training (e.g., analyzing surgical learning curves). Copyright © 2017 Elsevier Ltd. All rights reserved.
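
    The gaze-entropy measure used in this record can be sketched in a few lines. The abstract does not specify the exact formulation (stationary vs. transition entropy), so the Shannon entropy over fixated areas of interest below is one common definition, not necessarily the authors' implementation.

```python
import math
from collections import Counter

def gaze_entropy(fixation_aois):
    """Shannon entropy (in bits) of the distribution of fixations
    over areas of interest (AOIs). Higher entropy corresponds to a
    more random, less structured exploration pattern."""
    counts = Counter(fixation_aois)
    n = len(fixation_aois)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

    Fixations spread evenly over four AOIs give 2 bits; fixations confined to a single AOI give 0 bits.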

  13. Drivers’ Visual Search Patterns during Overtaking Maneuvers on Freeway

    PubMed Central

    Zhang, Wenhui; Dai, Jing; Pei, Yulong; Li, Penghui; Yan, Ying; Chen, Xinqiang

    2016-01-01

    Drivers gather traffic information primarily by means of their vision. Especially during complicated maneuvers, such as overtaking, they need to perceive a variety of characteristics, including the lateral and longitudinal distances to other vehicles, the speed of other vehicles, lane occupancy, and so on, to avoid crashes. The primary objective of this study is to examine the appropriate visual search patterns during overtaking maneuvers on freeways. We designed a series of driving-simulator experiments in which the type and speed of the leading vehicle were considered as two influential factors. One hundred and forty participants took part in the study. The participants overtook the leading vehicles as they normally would, and their eye movements were collected with an eye tracker. The results show that participants’ gaze durations and saccade durations followed normal distribution patterns and that saccade angles followed a log-normal distribution pattern. It was observed that the type of leading vehicle significantly impacted the drivers’ gaze duration and gaze frequency. As the speed of a leading vehicle increased, subjects’ saccade durations became longer and saccade angles became larger. In addition, the initial and destination lanes were found to be key areas with the highest visual allocation proportion, accounting for more than 65% of total visual allocation. Subjects tended to shift their viewpoints more frequently between the initial lane and destination lane in order to search for crucial traffic information. However, they seldom shifted their viewpoints directly between the two wing mirrors. PMID:27869764

  14. iShadow: Design of a Wearable, Real-Time Mobile Gaze Tracker.

    PubMed

    Mayberry, Addison; Hu, Pan; Marlin, Benjamin; Salthouse, Christopher; Ganesan, Deepak

    2014-06-01

    Continuous, real-time tracking of eye gaze is valuable in a variety of scenarios including hands-free interaction with the physical world, detection of unsafe behaviors, leveraging visual context for advertising, life logging, and others. While eye tracking is commonly used in clinical trials and user studies, it has not bridged the gap to everyday consumer use. The challenge is that a real-time eye tracker is a power-hungry and computation-intensive device which requires continuous sensing of the eye using an imager running at many tens of frames per second, and continuous processing of the image stream using sophisticated gaze estimation algorithms. Our key contribution is the design of an eye tracker that dramatically reduces the sensing and computation needs for eye tracking, thereby achieving orders-of-magnitude reductions in power consumption and form factor. The key idea is that eye images are extremely redundant; therefore, we can estimate gaze using a small subset of carefully chosen pixels per frame. We instantiate this idea in a prototype hardware platform equipped with a low-power image sensor that provides random access to pixel values, a low-power ARM Cortex-M3 microcontroller, and a Bluetooth radio to communicate with a mobile phone. The sparse pixel-based gaze estimation algorithm is a multi-layer neural network learned using a state-of-the-art sparsity-inducing regularization function that minimizes the gaze prediction error while simultaneously minimizing the number of pixels used. Our results show that we can operate at roughly 70 mW of power, while continuously estimating eye gaze at a rate of 30 Hz with errors of roughly 3 degrees.
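
    The sparsity-inducing idea in this record can be sketched as a group-lasso penalty over each pixel's outgoing first-layer weights: driving a whole group to zero means that pixel never needs to be read from the sensor. The two-layer architecture and the penalty form below are illustrative assumptions; the abstract states only that a sparsity-inducing regularizer is used.

```python
import numpy as np

def forward(X, W1, b1, W2, b2):
    """Two-layer MLP mapping per-frame pixel vectors to (x, y) gaze."""
    H = np.tanh(X @ W1 + b1)
    return H @ W2 + b2

def group_sparse_penalty(W1):
    """Group-lasso penalty: the L2 norm of each pixel's outgoing
    weight row, summed over pixels. Minimizing it drives whole rows
    to zero, dropping those pixels from the sensor read-out."""
    return float(np.sqrt((W1 ** 2).sum(axis=1)).sum())

def active_pixels(W1, tol=1e-6):
    """Count pixels whose outgoing weights are not numerically zero."""
    return int((np.sqrt((W1 ** 2).sum(axis=1)) > tol).sum())
```

    Training would minimize the gaze prediction error plus a multiple of `group_sparse_penalty(W1)`, trading accuracy against the number of pixels sampled per frame.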

  15. iShadow: Design of a Wearable, Real-Time Mobile Gaze Tracker

    PubMed Central

    Mayberry, Addison; Hu, Pan; Marlin, Benjamin; Salthouse, Christopher; Ganesan, Deepak

    2015-01-01

    Continuous, real-time tracking of eye gaze is valuable in a variety of scenarios including hands-free interaction with the physical world, detection of unsafe behaviors, leveraging visual context for advertising, life logging, and others. While eye tracking is commonly used in clinical trials and user studies, it has not bridged the gap to everyday consumer use. The challenge is that a real-time eye tracker is a power-hungry and computation-intensive device which requires continuous sensing of the eye using an imager running at many tens of frames per second, and continuous processing of the image stream using sophisticated gaze estimation algorithms. Our key contribution is the design of an eye tracker that dramatically reduces the sensing and computation needs for eye tracking, thereby achieving orders-of-magnitude reductions in power consumption and form factor. The key idea is that eye images are extremely redundant; therefore, we can estimate gaze using a small subset of carefully chosen pixels per frame. We instantiate this idea in a prototype hardware platform equipped with a low-power image sensor that provides random access to pixel values, a low-power ARM Cortex-M3 microcontroller, and a Bluetooth radio to communicate with a mobile phone. The sparse pixel-based gaze estimation algorithm is a multi-layer neural network learned using a state-of-the-art sparsity-inducing regularization function that minimizes the gaze prediction error while simultaneously minimizing the number of pixels used. Our results show that we can operate at roughly 70 mW of power, while continuously estimating eye gaze at a rate of 30 Hz with errors of roughly 3 degrees. PMID:26539565

  16. Training for eye contact modulates gaze following in dogs.

    PubMed

    Wallis, Lisa J; Range, Friederike; Müller, Corsin A; Serisier, Samuel; Huber, Ludwig; Virányi, Zsófia

    2015-08-01

    Following human gaze in dogs and human infants can be considered a socially facilitated orientation response, which in object choice tasks is modulated by human-given ostensive cues. Despite their similarities to human infants, and extensive skills in reading human cues in foraging contexts, no evidence that dogs follow gaze into distant space has been found. We re-examined this question, and additionally whether dogs' propensity to follow gaze was affected by age and/or training to pay attention to humans. We tested a cross-sectional sample of 145 border collies aged 6 months to 14 years with different amounts of training over their lives. The dogs' gaze-following response in test and control conditions before and after training for initiating eye contact with the experimenter was compared with that of a second group of 13 border collies trained to touch a ball with their paw. Our results provide the first evidence that dogs can follow human gaze into distant space. Although we found no age effect on gaze following, the youngest and oldest age groups were more distractible, which resulted in a higher number of looks in the test and control conditions. Extensive lifelong formal training as well as short-term training for eye contact decreased dogs' tendency to follow gaze and increased their duration of gaze to the face. The reduction in gaze following after training for eye contact cannot be explained by fatigue or short-term habituation, as in the second group gaze following increased after a different training of the same length. Training for eye contact created a competing tendency to fixate the face, which prevented the dogs from following the directional cues. We conclude that following human gaze into distant space in dogs is modulated by training, which may explain why dogs perform poorly in comparison to other species in this task.

  17. Investigating Gaze of Children with ASD in Naturalistic Settings

    PubMed Central

    Noris, Basilio; Nadel, Jacqueline; Barker, Mandy; Hadjikhani, Nouchine; Billard, Aude

    2012-01-01

    Background Visual behavior is known to be atypical in Autism Spectrum Disorders (ASD). Monitor-based eye-tracking studies have measured several of these atypicalities in individuals with autism. While atypical behaviors are known to be accentuated during natural interactions, few studies have examined gaze behavior in natural interactions. In this study we focused on i) whether findings obtained in laboratory settings are also visible in a naturalistic interaction; and ii) whether new atypical elements appear when studying visual behavior across the whole field of view. Methodology/Principal Findings Ten children with ASD and ten typically developing children participated in a dyadic interaction with an experimenter administering items from the Early Social Communication Scale (ESCS). The children wore a novel head-mounted eye-tracker, measuring gaze direction and the presence of faces across the child's field of view. The analysis of gaze episodes to faces revealed that children with ASD looked at the experimenter significantly less often and for shorter periods of time. The analysis of gaze patterns across the child's field of view revealed that children with ASD looked downwards and made more extensive use of their lateral field of view when exploring the environment. Conclusions/Significance The data gathered in naturalistic settings confirm findings previously obtained only in monitor-based studies. Moreover, the study allowed us to observe a generalized strategy of lateral gaze in children with ASD when they were looking at objects in their environment. PMID:23028494

  18. Anxiety and Sensitivity to Eye Gaze in Emotional Faces

    ERIC Educational Resources Information Center

    Holmes, Amanda; Richards, Anne; Green, Simon

    2006-01-01

    This paper reports three studies in which stronger orienting to perceived eye gaze direction was revealed when observers viewed faces showing fearful or angry, compared with happy or neutral, emotional expressions. Gaze-related spatial cueing effects to laterally presented fearful faces and centrally presented angry faces were also modulated by…

  19. Sex differences in visual attention to sexually explicit videos: a preliminary study.

    PubMed

    Tsujimura, Akira; Miyagawa, Yasushi; Takada, Shingo; Matsuoka, Yasuhiro; Takao, Tetsuya; Hirai, Toshiaki; Matsushita, Masateru; Nonomura, Norio; Okuyama, Akihiko

    2009-04-01

    Although men appear to be more interested in sexual stimuli than women, this difference is not completely understood. Eye-tracking technology has been used to investigate visual attention to still sexual images; however, it has not been applied to moving sexual images. To investigate whether a sex difference exists in visual attention to sexual videos, 11 male and 11 female healthy volunteers were studied using our new methodology. The subjects viewed two sexual videos (one depicting sexual intercourse and one not) in which several regions were designated for eye-gaze analysis in each frame. Visual attention was measured across each designated region according to gaze duration. Sex differences, the region attracting the most attention, and the visually favored sex were evaluated. In the non-intercourse clip, gaze time for the face and body of the actress was significantly shorter among women than among men. Gaze time for the face and body of the actor and for nonhuman regions was significantly longer for women than for men. The region attracting the most attention was the face of the actress for both men and women. Men viewed the opposite sex for a significantly longer period than did women, and women viewed their own sex for a significantly longer period than did men. However, gaze times for the clip showing intercourse were not significantly different between the sexes. A sex difference existed in visual attention to a sexual video without heterosexual intercourse; men viewed the opposite sex for longer periods than did women, and women viewed the same sex for longer periods than did men. There was no statistically significant sex difference in viewing patterns for a sexual video showing heterosexual intercourse, and we speculate that men and women may have similar visual attention patterns if the sexual stimuli are sufficiently explicit.

  20. A Pilot Study of Horizontal Head and Eye Rotations in Baseball Batting.

    PubMed

    Fogt, Nick; Persson, Tyler W

    2017-08-01

    The purpose of the study was to measure and compare horizontal head and eye tracking movements as baseball batters "took" pitches and swung at baseball pitches. Two former college baseball players were tested in two conditions. A pitching machine was used to project tennis balls toward the subjects. In the first condition, subjects acted as if they were taking (i.e., not swinging) the pitches. In the second condition, subjects attempted to bat the pitched balls. Head movements were measured with an inertial sensor; eye movements were measured with a video eye tracker. For each condition, the relationship between the horizontal head and eye rotations was similar for the two subjects, as were the overall head-, eye-, and gaze-tracking strategies. In the "take" condition, head movements in the direction of the ball were larger than eye movements for much of the pitch trajectory. Large eye movements occurred only late in the pitch trajectory. Gaze was directed near the ball until approximately 150 milliseconds before the ball arrived at the batter, at which time gaze was directed ahead of the ball to a location near that occupied when the ball crosses the plate. In the "swing" condition, head movements in the direction of the ball were larger than eye movements throughout the pitch trajectory. Gaze was directed near the ball until approximately 50 to 60 milliseconds prior to pitch arrival at the batter. Horizontal head rotations were larger than horizontal eye rotations in both the "take" and "swing" conditions. Gaze was directed ahead of the ball late in the pitch trajectory in the "take" condition, whereas gaze was directed near the ball throughout much of the pitch trajectory in the "swing" condition.

  1. CULTURAL DISPLAY RULES DRIVE EYE GAZE DURING THINKING.

    PubMed

    McCarthy, Anjanie; Lee, Kang; Itakura, Shoji; Muir, Darwin W

    2006-11-01

    The authors measured the eye gaze displays of Canadian, Trinidadian, and Japanese participants as they answered questions for which they either knew, or had to derive, the answers. When they knew the answers, Trinidadians maintained the most eye contact, whereas Japanese maintained the least. When thinking about the answers to questions, Canadians and Trinidadians looked up, whereas Japanese looked down. Thus, for humans, gaze displays while thinking are at least in part culturally determined.

  2. A model of face selection in viewing video stories.

    PubMed

    Suda, Yuki; Kitazawa, Shigeru

    2015-01-19

    When typical adults watch TV programs, they show surprisingly stereotyped gaze behaviours, as indicated by the almost simultaneous shifts of their gazes from one face to another. However, a standard saliency model based on low-level physical features alone failed to explain such typical gaze behaviours. To find rules that explain the typical gaze behaviours, we examined temporo-spatial gaze patterns in adults while they viewed video clips with human characters that were played with or without sound, and in the forward or reverse direction. We here show the following: 1) the "peak" face scanpath, which followed the face that attracted the largest number of views but ignored other objects in the scene, still retained the key features of actual scanpaths; 2) gaze behaviours remained unchanged whether the sound was provided or not; 3) the gaze behaviours were sensitive to time reversal; and 4) nearly 60% of the variance of gaze behaviours was explained by face saliency, defined as a function of a face's size, novelty, head movements, and mouth movements. These results suggest that humans share a face-oriented network that integrates several visual features of multiple faces and directs our eyes to the most salient face at each moment.

  3. I spy with my little eye: Analysis of airline pilots' gaze patterns in a manual instrument flight scenario.

    PubMed

    Haslbeck, Andreas; Zhang, Bo

    2017-09-01

    The aim of this study was to analyze pilots' visual scanning in a manual approach and landing scenario. Manual flying skills suffer from the increasing use of automation, and this decline particularly affects long-haul pilots, who have only a few opportunities to practice these skills. Airline pilots representing different levels of practice (short-haul vs. long-haul) had to perform a manual raw-data precision approach while their visual scanning was recorded by an eye-tracking device. The analysis of gaze patterns, which are based on predominant saccades, revealed one main group of saccades among long-haul pilots. In contrast, short-haul pilots showed more balanced scanning using two different groups of saccades. Short-haul pilots generally demonstrated better manual flight performance, and within this group, one type of scan pattern was found to facilitate the manual landing task more. Long-haul pilots tend to utilize visual scanning behaviors that are inappropriate for the manual ILS landing task. This lack of skills needs to be addressed by providing specific training and more practice. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Gaze-stabilizing deficits and latent nystagmus in monkeys with brief, early-onset visual deprivation: eye movement recordings.

    PubMed

    Tusa, R J; Mustari, M J; Burrows, A F; Fuchs, A F

    2001-08-01

    The normal development and the capacity to calibrate gaze-stabilizing systems may depend on normal vision during infancy. At the end of 1 yr of dark rearing, cats have gaze-stabilizing deficits similar to that of the newborn human infant including decreased monocular optokinetic nystagmus (OKN) in the nasal to temporal (N-T) direction and decreased velocity storage in the vestibuloocular reflex (VOR). The purpose of this study is to determine to what extent restricted vision during the first 2 mo of life in monkeys affects the development of gaze-stabilizing systems. The eyelids of both eyes were sutured closed in three rhesus monkeys (Macaca mulatta) at birth. Eyelids were opened at 25 days in one monkey and 40 and 55 days in the other two animals. Eye movements were recorded from each eye using scleral search coils. The VOR, OKN, and fixation were examined at 6 and 12 mo of age. We also examined ocular alignment, refraction, and visual acuity in these animals. At 1 yr of age, visual acuity ranged from 0.3 to 0.6 LogMAR (20/40-20/80). All animals showed a defect in monocular OKN in the N-T direction. The velocity-storage component of OKN (i.e., OKAN) was the most impaired. All animals had a mild reduction in VOR gain but had a normal time constant. The animals deprived for 40 and 55 days had a persistent strabismus. All animals showed a nystagmus similar to latent nystagmus (LN) in human subjects. The amount of LN and OKN defect correlated positively with the duration of deprivation. In addition, the animal deprived for 55 days demonstrated a pattern of nystagmus similar to congenital nystagmus in human subjects. We found that restricted visual input during the first 2 mo of life impairs certain gaze-stabilizing systems and causes LN in primates.

  5. Gaze patterns hold key to unlocking successful search strategies and increasing polyp detection rate in colonoscopy.

    PubMed

    Lami, Mariam; Singh, Harsimrat; Dilley, James H; Ashraf, Hajra; Edmondon, Matthew; Orihuela-Espina, Felipe; Hoare, Jonathan; Darzi, Ara; Sodergren, Mikael H

    2018-02-07

    The adenoma detection rate (ADR) is an important quality indicator in colonoscopy. The aim of this study was to evaluate the changes in visual gaze patterns (VGPs) with increasing polyp detection rate (PDR), a surrogate marker of ADR. 18 endoscopists participated in the study. VGPs were measured using eye-tracking technology during the withdrawal phase of colonoscopy. VGPs were characterized using two analyses: screen-based and anatomical. Eye-tracking parameters were used to characterize performance, which was further substantiated using hidden Markov model (HMM) analysis. Subjects with higher PDRs spent more time viewing the outer ring of the 3 × 3 grid in both analyses (screen-based: r = 0.56, P = 0.02; anatomical: r = 0.62, P < 0.01). Fixation distribution to the "bottom U" of the screen in the screen-based analysis was positively correlated with PDR (r = 0.62, P = 0.01). HMM analysis demarcated the VGPs into three PDR groups. This study defined distinct VGPs that are associated with expert behavior. These data may allow the introduction of visual gaze training within structured training programs, and have implications for adoption in higher-level assessment. © Georg Thieme Verlag KG Stuttgart · New York.
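
    The 3 × 3 grid analysis in this record can be sketched as follows. Cell boundaries are assumed to be uniform over the screen, an illustrative choice since the abstract does not specify the binning.

```python
import numpy as np

def grid_distribution(points, width, height):
    """Fraction of fixation points falling in each cell of a
    3x3 grid covering a screen of the given pixel size."""
    pts = np.asarray(points, dtype=float)
    cols = np.minimum((pts[:, 0] / width * 3).astype(int), 2)
    rows = np.minimum((pts[:, 1] / height * 3).astype(int), 2)
    dist = np.zeros((3, 3))
    for r, c in zip(rows, cols):
        dist[r, c] += 1
    return dist / len(pts)

def outer_ring_fraction(dist):
    """Share of fixations outside the central cell -- the quantity
    the study found to correlate with polyp detection rate."""
    return float(dist.sum() - dist[1, 1])
```

    A fixation log can then be reduced to a single outer-ring score per endoscopist and correlated against PDR, mirroring the screen-based analysis above.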

  6. Live Speech Driven Head-and-Eye Motion Generators.

    PubMed

    Le, Binh H; Ma, Xiaohan; Deng, Zhigang

    2012-11-01

    This paper describes a fully automated framework to generate realistic head motion, eye gaze, and eyelid motion simultaneously based on live (or recorded) speech input. Its central idea is to learn separate yet interrelated statistical models for each component (head motion, gaze, or eyelid motion) from a prerecorded facial motion data set: 1) Gaussian Mixture Models and a gradient descent optimization algorithm are employed to generate head motion from speech features; 2) a Nonlinear Dynamic Canonical Correlation Analysis model is used to synthesize eye gaze from head motion and speech features; and 3) non-negative linear regression is used to model voluntary eyelid motion, and a log-normal distribution is used to describe involuntary eye blinks. Several user studies were conducted to evaluate the effectiveness of the proposed speech-driven head and eye motion generator using the well-established paired comparison methodology. Our evaluation results clearly show that this approach can significantly outperform state-of-the-art head and eye motion generation algorithms. In addition, a novel mocap+video hybrid data acquisition technique is introduced to record high-fidelity head movement, eye gaze, and eyelid motion simultaneously.

  7. The Development of Emotional Face and Eye Gaze Processing

    ERIC Educational Resources Information Center

    Hoehl, Stefanie; Striano, Tricia

    2010-01-01

    Recent research has demonstrated that infants' attention towards novel objects is affected by an adult's emotional expression and eye gaze toward the object. The current event-related potential (ERP) study investigated how infants at 3, 6, and 9 months of age process fearful compared to neutral faces looking toward objects or averting gaze away…

  8. Eye Gaze Tracking using Correlation Filters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karakaya, Mahmut; Boehnen, Chris Bensing; Bolme, David S

    In this paper, we studied a method for eye gaze tracking that provides gaze estimation from a standard webcam with a zoom lens and reduces the setup and calibration requirements for new users. Specifically, we have developed a gaze estimation method based on the relative locations of points on the top of the eyelid and the eye corners. The gaze estimation method in this paper is based on the distances between the top point of the eyelid and the eye corners detected by the correlation filters. Advanced correlation filters were found to provide facial landmark detections that are accurate enough to determine the subject's gaze direction to within approximately 4-5 degrees, although calibration errors often produce a larger overall shift in the estimates. This corresponds approximately to a circle 2 inches in diameter on a screen at arm's length from the subject. At this accuracy it is possible to determine which regions of text or images the subject is looking at, but it falls short of being able to determine which word the subject has looked at.

  9. Coordinated Flexibility: How Initial Gaze Position Modulates Eye-Hand Coordination and Reaching

    ERIC Educational Resources Information Center

    Adam, Jos J.; Buetti, Simona; Kerzel, Dirk

    2012-01-01

    Reaching to targets in space requires the coordination of eye and hand movements. In two experiments, we recorded eye and hand kinematics to examine the role of gaze position at target onset on eye-hand coordination and reaching performance. Experiment 1 showed that with eyes and hand aligned on the same peripheral start location, time lags…

  10. Joint Attention without Gaze Following: Human Infants and Their Parents Coordinate Visual Attention to Objects through Eye-Hand Coordination

    PubMed Central

    Yu, Chen; Smith, Linda B.

    2013-01-01

    The coordination of visual attention among social partners is central to many components of human behavior and human development. Previous research has focused on one pathway to the coordination of looking behavior by social partners, gaze following. The extant evidence shows that even very young infants follow the direction of another's gaze but they do so only in highly constrained spatial contexts because gaze direction is not a spatially precise cue as to the visual target and not easily used in spatially complex social interactions. Our findings, derived from the moment-to-moment tracking of eye gaze of one-year-olds and their parents as they actively played with toys, provide evidence for an alternative pathway, through the coordination of hands and eyes in goal-directed action. In goal-directed actions, the hands and eyes of the actor are tightly coordinated both temporally and spatially, and thus, in contexts including manual engagement with objects, hand movements and eye movements provide redundant information about where the eyes are looking. Our findings show that one-year-olds rarely look to the parent's face and eyes in these contexts but rather infants and parents coordinate looking behavior without gaze following by attending to objects held by the self or the social partner. This pathway, through eye-hand coupling, leads to coordinated joint switches in visual attention and to an overall high rate of looking at the same object at the same time, and may be the dominant pathway through which physically active toddlers align their looking behavior with a social partner. PMID:24236151

  11. Eye gaze tracking based on the shape of pupil image

    NASA Astrophysics Data System (ADS)

    Wang, Rui; Qiu, Jian; Luo, Kaiqing; Peng, Li; Han, Peng

    2018-01-01

    Eye trackers are important instruments for research in psychology and are widely used in studies of attention, visual perception, reading, and other fields. Because of its potential role in human-computer interaction, eye gaze tracking has been a topic of research in many fields over the last decades. Nowadays, with the development of technology, non-intrusive methods are increasingly welcomed. In this paper, we present a method based on the shape of the pupil image to estimate the gaze point of human eyes without any intrusive devices such as a hat or a pair of glasses. After applying an ellipse-fitting algorithm to the captured pupil image, we can determine the direction of fixation from the shape of the pupil. The innovative aspect of this method is that using the shape of the pupil avoids much more complicated algorithms. The proposed approach is helpful for the study of eye gaze tracking: it requires only one camera, without infrared light, and determines the direction of gaze from changes in the shape of the pupil; no additional conditions are required.
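
    The shape-based idea in this record rests on a standard foreshortening relation: a circular pupil viewed off-axis projects to an ellipse whose minor/major axis ratio equals the cosine of the viewing angle. The sketch below estimates the axes from the covariance of boundary points as a stand-in for the paper's ellipse-fitting step; the exact fitting algorithm and gaze mapping are not specified in the abstract.

```python
import numpy as np

def pupil_axes(points):
    """Estimate the pupil's major and minor semi-axes (up to a common
    scale) from boundary points, via the eigenvalues of their
    covariance matrix."""
    pts = np.asarray(points, dtype=float)
    evals = np.sort(np.linalg.eigvalsh(np.cov(pts.T)))[::-1]
    return np.sqrt(evals)

def gaze_tilt_deg(points):
    """Viewing angle from the projected ellipse's axis ratio:
    cos(theta) = minor / major for a circular pupil."""
    a, b = pupil_axes(points)
    return float(np.degrees(np.arccos(np.clip(b / a, 0.0, 1.0))))
```

    Boundary points sampled from a 2:1 ellipse yield a tilt of 60 degrees, while points on a circle yield approximately 0; the axis ratio gives only the magnitude of the tilt, so the ellipse orientation would be needed to recover the gaze direction in 2D.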

  12. Head eye co-ordination and gaze stability in subjects with persistent whiplash associated disorders.

    PubMed

    Treleaven, Julia; Jull, Gwendolen; Grip, Helena

    2011-06-01

    Symptoms of dizziness, unsteadiness and visual disturbances are frequent complaints in persons with persistent whiplash associated disorders. This study investigated eye-head co-ordination and gaze stability in subjects with persistent whiplash (n = 20) and asymptomatic controls (n = 20). Wireless motion sensors and electro-oculography were used to measure: head rotation during unconstrained head movement, head rotation during gaze stability, and sequential head and eye movements. Ten control subjects participated in a repeatability study (two occasions one week apart). Between-day repeatability was acceptable (ICC > 0.6) for most measures. Compared to the control group, the whiplash group had a significantly smaller maximal eye angle to the left, a reduced range of head movement during the gaze stability task, and decreased velocity of head movement in the head-eye co-ordination and gaze stability tasks (p < 0.01). In the whiplash group there were significant correlations (r > 0.55) between both unrestrained neck movement and neck pain, and head movement range and velocity. Deficits in gaze stability and head-eye co-ordination may be related to disturbed reflex activity associated with decreased head range of motion and/or neck pain. Further research is required to explore the mechanisms behind these deficits, the nature of changes over time and the tests' ability to measure change in response to rehabilitation. Crown Copyright © 2010. Published by Elsevier Ltd. All rights reserved.

  13. Measurement of ocular aberrations in downward gaze using a modified clinical aberrometer

    PubMed Central

    Ghosh, Atanu; Collins, Michael J; Read, Scott A; Davis, Brett A; Iskander, D. Robert

    2011-01-01

    Changes in corneal optics have been measured after downward gaze. However, ocular aberrations during downward gaze have not been previously measured. A commercial Shack-Hartmann aberrometer (COAS-HD) was modified by adding a relay lens system and a rotatable beam splitter to allow on-axis aberration measurements in primary gaze and downward gaze with binocular fixation. Measurements with the modified aberrometer (COAS-HD relay system) in primary and downward gaze were validated against a conventional aberrometer. In human eyes, there were significant changes (p<0.05) in defocus C(2,0), primary astigmatism C(2,2) and vertical coma C(3,−1) in downward gaze (25 degrees) compared to primary gaze, indicating the potential influence of biomechanical forces on the optics of the eye in downward gaze. To demonstrate a further clinical application of this modified aberrometer, we measured ocular aberrations when wearing a progressive addition lens (PAL) in primary gaze (0 degree), 15 degrees downward gaze and 25 degrees downward gaze. PMID:21412451

  14. Real-time recording and classification of eye movements in an immersive virtual environment.

    PubMed

    Diaz, Gabriel; Cooper, Joseph; Kit, Dmitry; Hayhoe, Mary

    2013-10-10

    Despite the growing popularity of virtual reality environments, few laboratories are equipped to investigate eye movements within these environments. This primer is intended to reduce the time and effort required to incorporate eye-tracking equipment into a virtual reality environment. We discuss issues related to the initial startup and provide algorithms necessary for basic analysis. Algorithms are provided for the calculation of gaze angle within a virtual world using a monocular eye-tracker in a three-dimensional environment. In addition, we provide algorithms for the calculation of the angular distance between the gaze and a relevant virtual object and for the identification of fixations, saccades, and pursuit eye movements. Finally, we provide tools that temporally synchronize gaze data and the visual stimulus and enable real-time assembly of a video-based record of the experiment using the Quicktime MOV format, available at http://sourceforge.net/p/utdvrlibraries/. This record contains the visual stimulus, the gaze cursor, and associated numerical data and can be used for data exportation, visual inspection, and validation of calculated gaze movements.
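    Two of the quantities described above, the angular distance between the gaze ray and a virtual object and velocity-based event labels, can be illustrated with a minimal sketch. The function names and the 100 deg/s velocity threshold below are illustrative choices, not the paper's published algorithms.

```python
import numpy as np

def angular_distance(gaze_dir, eye_pos, object_pos):
    """Angle (degrees) between the gaze ray and the eye-to-object ray."""
    g = np.asarray(gaze_dir, dtype=float)
    v = np.asarray(object_pos, dtype=float) - np.asarray(eye_pos, dtype=float)
    cosang = np.dot(g, v) / (np.linalg.norm(g) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

def classify_samples(angles_deg, timestamps, saccade_thresh=100.0):
    """Velocity-threshold labelling (I-VT style): 'saccade' above the
    threshold in deg/s, otherwise 'fixation'; one label per interval."""
    velocities = np.abs(np.diff(angles_deg)) / np.diff(timestamps)
    return ["saccade" if v > saccade_thresh else "fixation" for v in velocities]

# Eye at the origin looking down +z; object 10 degrees off-axis.
d = angular_distance([0, 0, 1], [0, 0, 0],
                     [np.tan(np.radians(10)), 0, 1])
print(round(d, 1))  # 10.0

# 60 Hz trace: slow drift, then a fast 3-degree jump between samples.
ts = np.array([0.0, 1 / 60, 2 / 60, 3 / 60])
ang = np.array([0.0, 0.1, 3.1, 3.2])
labels = classify_samples(ang, ts)
print(labels)  # ['fixation', 'saccade', 'fixation']
```

    A smooth-pursuit class, as in the paper, would add an intermediate velocity band between the fixation and saccade thresholds.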

  15. Real-time recording and classification of eye movements in an immersive virtual environment

    PubMed Central

    Diaz, Gabriel; Cooper, Joseph; Kit, Dmitry; Hayhoe, Mary

    2013-01-01

    Despite the growing popularity of virtual reality environments, few laboratories are equipped to investigate eye movements within these environments. This primer is intended to reduce the time and effort required to incorporate eye-tracking equipment into a virtual reality environment. We discuss issues related to the initial startup and provide algorithms necessary for basic analysis. Algorithms are provided for the calculation of gaze angle within a virtual world using a monocular eye-tracker in a three-dimensional environment. In addition, we provide algorithms for the calculation of the angular distance between the gaze and a relevant virtual object and for the identification of fixations, saccades, and pursuit eye movements. Finally, we provide tools that temporally synchronize gaze data and the visual stimulus and enable real-time assembly of a video-based record of the experiment using the Quicktime MOV format, available at http://sourceforge.net/p/utdvrlibraries/. This record contains the visual stimulus, the gaze cursor, and associated numerical data and can be used for data exportation, visual inspection, and validation of calculated gaze movements. PMID:24113087

  16. GazeParser: an open-source and multiplatform library for low-cost eye tracking and analysis.

    PubMed

    Sogo, Hiroyuki

    2013-09-01

    Eye movement analysis is an effective method for research on visual perception and cognition. However, recordings of eye movements present practical difficulties related to the cost of the recording devices and the programming of device controls for use in experiments. GazeParser is an open-source library for low-cost eye tracking and data analysis; it consists of a video-based eyetracker and libraries for data recording and analysis. The libraries are written in Python and can be used in conjunction with PsychoPy and VisionEgg experimental control libraries. Three eye movement experiments are reported on performance tests of GazeParser. These showed that the means and standard deviations for errors in sampling intervals were less than 1 ms. Spatial accuracy ranged from 0.7° to 1.2°, depending on participant. In gap/overlap tasks and antisaccade tasks, the latency and amplitude of the saccades detected by GazeParser agreed with those detected by a commercial eyetracker. These results showed that the GazeParser demonstrates adequate performance for use in psychological experiments.

  17. Contribution of the cerebellar flocculus to gaze control during active head movements

    NASA Technical Reports Server (NTRS)

    Belton, T.; McCrea, R. A.; Peterson, B. W. (Principal Investigator)

    1999-01-01

    The flocculus and ventral paraflocculus are adjacent regions of the cerebellar cortex that are essential for controlling smooth pursuit eye movements and for altering the performance of the vestibulo-ocular reflex (VOR). The question addressed in this study is whether these regions of the cerebellum are more globally involved in controlling gaze, regardless of whether eye or active head movements are used to pursue moving visual targets. Single-unit recordings were obtained from Purkinje (Pk) cells in the floccular region of squirrel monkeys that were trained to fixate and pursue small visual targets. Cell firing rate was recorded during smooth pursuit eye movements, cancellation of the VOR, combined eye-head pursuit, and spontaneous gaze shifts in the absence of targets. Pk cells were found to be much less sensitive to gaze velocity during combined eye-head pursuit than during ocular pursuit. They were not sensitive to gaze or head velocity during gaze saccades. Temporary inactivation of the floccular region by muscimol injection compromised ocular pursuit but had little effect on the ability of monkeys to pursue visual targets with head movements or to cancel the VOR during active head movements. Thus the signals produced by Pk cells in the floccular region are necessary for controlling smooth pursuit eye movements but not for coordinating gaze during active head movements. The results imply that individual functional modules in the cerebellar cortex are less involved in the global organization and coordination of movements than with parametric control of movements produced by a specific part of the body.

  18. Simulating hemispatial neglect with virtual reality.

    PubMed

    Baheux, Kenji; Yoshizawa, Makoto; Yoshida, Yasuko

    2007-07-19

    Hemispatial neglect is a cognitive disorder defined as a lack of attention for stimuli contra-lateral to the brain lesion. The assessment is traditionally done with basic pencil and paper tests and the rehabilitation programs are generally not well adapted. We propose a virtual reality system featuring an eye-tracking device for a better characterization of the neglect that will lead to new rehabilitation techniques. This paper presents a comparison of eye-gaze patterns of healthy subjects, patients and healthy simulated patients on a virtual line bisection test. The task was also executed with a reduced visual field condition hoping that fewer stimuli would limit the neglect. We found that patients and healthy simulated patients had similar eye-gaze patterns. However, while the reduced visual field condition had no effect on the healthy simulated patients, it actually had a negative impact on the patients. We discuss the reasons for these differences and how they relate to the limitations of the neglect simulation. We argue that with some improvements the technique could be used to determine the potential of new rehabilitation techniques and also help the rehabilitation staff or the patient's relatives to better understand the neglect condition.

  19. Parent Perception of Two Eye-Gaze Control Technology Systems in Young Children with Cerebral Palsy: Pilot Study.

    PubMed

    Karlsson, Petra; Wallen, Margaret

    2017-01-01

    Eye-gaze control technology enables people with significant physical disability to access computers for communication, play, learning and environmental control. This pilot study used a multiple case study design with repeated baseline assessment and parents' evaluations to compare two eye-gaze control technology systems to identify any differences in factors such as ease of use and impact of the systems for their young children. Five children, aged 3 to 5 years, with dyskinetic cerebral palsy, and their families participated. Overall, families were satisfied with both the Tobii PCEye Go and myGaze® eye tracker, found them easy to position and use, and children learned to operate them quickly. This technology provides young children with important opportunities for learning, play, leisure, and developing communication.

  20. Recognition of Emotion from Facial Expressions with Direct or Averted Eye Gaze and Varying Expression Intensities in Children with Autism Disorder and Typically Developing Children

    PubMed Central

    Tell, Dina; Davidson, Denise; Camras, Linda A.

    2014-01-01

    Eye gaze direction and expression intensity effects on emotion recognition in children with autism disorder and typically developing children were investigated. Children with autism disorder and typically developing children identified happy and angry expressions equally well. Children with autism disorder, however, were less accurate in identifying fear expressions across intensities and eye gaze directions. Children with autism disorder rated expressions with direct eyes, and 50% expressions, as more intense than typically developing children. A trend was also found for sad expressions, as children with autism disorder were less accurate in recognizing sadness at 100% intensity with direct eyes than typically developing children. Although the present research showed that children with autism disorder are sensitive to eye gaze direction, impairments in the recognition of fear, and possibly sadness, exist. Furthermore, children with autism disorder and typically developing children perceive the intensity of emotional expressions differently. PMID:24804098

  1. Predicting Aggressive Tendencies by Visual Attention Bias Associated with Hostile Emotions

    PubMed Central

    Lin, Ping-I; Hsieh, Cheng-Da; Juan, Chi-Hung; Hossain, Md Monir; Erickson, Craig A.; Lee, Yang-Han; Su, Mu-Chun

    2016-01-01

    The goal of the current study is to clarify the relationship between social information processing (e.g., visual attention to cues of hostility, hostility attribution bias, and facial expression emotion labeling) and aggressive tendencies. Thirty adults were recruited for an eye-tracking study that measured various components of social information processing. Baseline aggressive tendencies were measured using the Buss-Perry Aggression Questionnaire (AQ). Visual attention towards hostile objects was measured as the proportion of eye gaze fixation duration on cues of hostility. Hostility attribution bias was measured with the rating results for emotions of characters in the images. The results show that the eye gaze duration on hostile characters was significantly inversely correlated with the AQ score, as was eye contact with an angry face. The eye gaze duration on hostile objects was not significantly associated with hostility attribution bias, although hostility attribution bias was significantly positively associated with the AQ score. Our findings suggest that eye gaze fixation time towards non-hostile cues may predict aggressive tendencies. PMID:26901770

  2. Predicting Aggressive Tendencies by Visual Attention Bias Associated with Hostile Emotions.

    PubMed

    Lin, Ping-I; Hsieh, Cheng-Da; Juan, Chi-Hung; Hossain, Md Monir; Erickson, Craig A; Lee, Yang-Han; Su, Mu-Chun

    2016-01-01

    The goal of the current study is to clarify the relationship between social information processing (e.g., visual attention to cues of hostility, hostility attribution bias, and facial expression emotion labeling) and aggressive tendencies. Thirty adults were recruited for an eye-tracking study that measured various components of social information processing. Baseline aggressive tendencies were measured using the Buss-Perry Aggression Questionnaire (AQ). Visual attention towards hostile objects was measured as the proportion of eye gaze fixation duration on cues of hostility. Hostility attribution bias was measured with the rating results for emotions of characters in the images. The results show that the eye gaze duration on hostile characters was significantly inversely correlated with the AQ score, as was eye contact with an angry face. The eye gaze duration on hostile objects was not significantly associated with hostility attribution bias, although hostility attribution bias was significantly positively associated with the AQ score. Our findings suggest that eye gaze fixation time towards non-hostile cues may predict aggressive tendencies.
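    The attention measure used here, the proportion of total fixation duration spent on hostile cues, can be sketched in a few lines. The AOI labels and data layout below are hypothetical, chosen only to illustrate the computation.

```python
def hostile_dwell_proportion(fixations, hostile_aois):
    """Proportion of total fixation time spent on hostile-cue AOIs.

    `fixations` is a sequence of (aoi_label, duration_ms) pairs;
    `hostile_aois` is the set of labels coded as hostile cues.
    """
    total = sum(dur for _, dur in fixations)
    hostile = sum(dur for aoi, dur in fixations if aoi in hostile_aois)
    return hostile / total if total else 0.0

# Hypothetical trial: three fixations over two areas of interest.
fixes = [("angry_face", 300), ("neutral_face", 500), ("angry_face", 200)]
prop = hostile_dwell_proportion(fixes, {"angry_face"})
print(prop)  # 0.5
```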

  3. The impact of fatigue on latent print examinations as revealed by behavioral and eye gaze testing.

    PubMed

    Busey, Thomas; Swofford, Henry J; Vanderkolk, John; Emerick, Brandi

    2015-06-01

    Eye tracking and behavioral methods were used to assess the effects of fatigue on performance in latent print examiners. Eye gaze was measured both before and after a fatiguing exercise involving fine-grained examination decisions. The eye tracking tasks used similar images, often laterally reversed versions of previously viewed prints, which holds image detail constant while minimizing prior recognition. These methods, as well as a within-subject design with fine grained analyses of the eye gaze data, allow fairly strong conclusions despite a relatively small subject population. Consistent with the effects of fatigue on practitioners in other fields such as radiology, behavioral performance declined with fatigue, and the eye gaze statistics suggested a smaller working memory capacity. Participants also terminated the search/examination process sooner when fatigued. However, fatigue did not produce changes in inter-examiner consistency as measured by the Earth Mover Metric. Implications for practice are discussed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
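    The Earth Mover metric mentioned above quantifies how far one distribution of gaze must be "moved" to match another. The study applied it to examiners' gaze maps; the sketch below is only a one-dimensional illustration over unit-spaced, equal-mass histograms, where the metric reduces to the sum of absolute CDF differences.

```python
def emd_1d(p, q):
    """Earth mover's distance between two 1-D histograms of equal total
    mass with unit-spaced bins: the sum of absolute CDF differences."""
    total, cdf_diff = 0.0, 0.0
    for pi, qi in zip(p, q):
        cdf_diff += pi - qi   # running difference of the two CDFs
        total += abs(cdf_diff)
    return total

# Two examiners' normalised fixation histograms over four image regions:
# shifting half the mass one bin to the right costs 0.5 * 2 bins = 1.0.
val = emd_1d([0.5, 0.5, 0.0, 0.0], [0.0, 0.5, 0.5, 0.0])
print(val)  # 1.0
```

    Unlike simple histogram overlap, this metric stays sensitive to *how far apart* two examiners' fixation clusters are, not just whether they coincide.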

  4. Effects of Peripheral Eccentricity and Head Orientation on Gaze Discrimination.

    PubMed

    Palanica, Adam; Itier, Roxane J

    2014-01-01

    Visual search tasks support a special role for direct gaze in human cognition, while classic gaze judgment tasks suggest the congruency between head orientation and gaze direction plays a central role in gaze perception. Moreover, whether gaze direction can be accurately discriminated in the periphery using covert attention is unknown. In the present study, individual faces in frontal and in deviated head orientations with a direct or an averted gaze were flashed for 150 ms across the visual field; participants focused on a centred fixation while judging the gaze direction. Gaze discrimination speed and accuracy varied with head orientation and eccentricity. The limit of accurate gaze discrimination was less than ±6° eccentricity. Response times suggested a processing facilitation for direct gaze in the fovea, irrespective of head orientation; however, by ±3° eccentricity, head orientation started biasing gaze judgments, and this bias increased with eccentricity. Results also suggested a special processing of frontal heads with direct gaze in central vision, rather than a general congruency effect between eye and head cues. Thus, while both head and eye cues contribute to gaze discrimination, their roles differ with eccentricity.

  5. Eye gaze tracking for endoscopic camera positioning: an application of a hardware/software interface developed to automate Aesop.

    PubMed

    Ali, S M; Reisner, L A; King, B; Cao, A; Auner, G; Klein, M; Pandya, A K

    2008-01-01

    A redesigned motion control system for the medical robot Aesop allows automating and programming its movements. An IR eye tracking system has been integrated with this control interface to implement an intelligent, autonomous eye gaze-based laparoscopic positioning system. A laparoscopic camera held by Aesop can be moved based on the data from the eye tracking interface to keep the user's gaze point region at the center of a video feedback monitor. This system setup provides autonomous camera control that works around the surgeon, providing an optimal robotic camera platform.

  6. Reactions to Eye Contact Initiated by Physically Attractive and Unattractive Men and Women.

    ERIC Educational Resources Information Center

    Rall, Marilyn; And Others

    1984-01-01

    Explored the effects of physical attractiveness and eye gaze on subjects' (N=93) reactions to both male and female confederates. Results showed that subjects reported discomfort with the gaze of both the attractive and the unattractive confederates and that women tended to receive slightly more return gazes than did men. (LLL)

  7. Gaze Duration Biases for Colours in Combination with Dissonant and Consonant Sounds: A Comparative Eye-Tracking Study with Orangutans.

    PubMed

    Mühlenbeck, Cordelia; Liebal, Katja; Pritsch, Carla; Jacobsen, Thomas

    2015-01-01

    Research on colour preferences in humans and non-human primates suggests similar patterns of biases for and avoidance of specific colours, indicating that these colours are connected to a psychological reaction. Similarly, in the acoustic domain, approach reactions to consonant sounds (considered as positive) and avoidance reactions to dissonant sounds (considered as negative) have been found in human adults and children, and it has been demonstrated that non-human primates are able to discriminate between consonant and dissonant sounds. Yet it remains unclear whether the visual and acoustic approach-avoidance patterns remain consistent when both types of stimuli are combined, how they relate to and influence each other, and whether these are similar for humans and other primates. Therefore, to investigate whether gaze duration biases for colours are similar across primates and whether reactions to consonant and dissonant sounds cumulate with reactions to specific colours, we conducted an eye-tracking study in which we compared humans with one species of great apes, the orangutans. We presented four different colours either in isolation or in combination with consonant and dissonant sounds. We hypothesised that the viewing time for specific colours should be influenced by dissonant sounds and that previously existing avoidance behaviours with regard to colours should be intensified, reflecting their association with negative acoustic information. The results showed that the humans had constant gaze durations which were independent of the auditory stimulus, with a clear avoidance of yellow. In contrast, the orangutans did not show any clear gaze duration bias or avoidance of colours, and they were also not influenced by the auditory stimuli. In conclusion, our findings only partially support the previously identified pattern of biases for and avoidance of specific colours in humans and do not confirm such a pattern for orangutans.

  8. Gaze Duration Biases for Colours in Combination with Dissonant and Consonant Sounds: A Comparative Eye-Tracking Study with Orangutans

    PubMed Central

    Mühlenbeck, Cordelia; Liebal, Katja; Pritsch, Carla; Jacobsen, Thomas

    2015-01-01

    Research on colour preferences in humans and non-human primates suggests similar patterns of biases for and avoidance of specific colours, indicating that these colours are connected to a psychological reaction. Similarly, in the acoustic domain, approach reactions to consonant sounds (considered as positive) and avoidance reactions to dissonant sounds (considered as negative) have been found in human adults and children, and it has been demonstrated that non-human primates are able to discriminate between consonant and dissonant sounds. Yet it remains unclear whether the visual and acoustic approach–avoidance patterns remain consistent when both types of stimuli are combined, how they relate to and influence each other, and whether these are similar for humans and other primates. Therefore, to investigate whether gaze duration biases for colours are similar across primates and whether reactions to consonant and dissonant sounds cumulate with reactions to specific colours, we conducted an eye-tracking study in which we compared humans with one species of great apes, the orangutans. We presented four different colours either in isolation or in combination with consonant and dissonant sounds. We hypothesised that the viewing time for specific colours should be influenced by dissonant sounds and that previously existing avoidance behaviours with regard to colours should be intensified, reflecting their association with negative acoustic information. The results showed that the humans had constant gaze durations which were independent of the auditory stimulus, with a clear avoidance of yellow. In contrast, the orangutans did not show any clear gaze duration bias or avoidance of colours, and they were also not influenced by the auditory stimuli. In conclusion, our findings only partially support the previously identified pattern of biases for and avoidance of specific colours in humans and do not confirm such a pattern for orangutans. PMID:26466351

  9. Keeping an Eye on Noisy Movements: On Different Approaches to Perceptual-Motor Skill Research and Training.

    PubMed

    Dicks, Matt; Button, Chris; Davids, Keith; Chow, Jia Yi; van der Kamp, John

    2017-04-01

    Contemporary theorizing on the complementary nature of perception and action in expert performance has led to different emphases in the study of movement coordination and gaze behavior. On the one hand, coordination research has examined the role of variability in movement control, evidencing that variability facilitates individualized adaptations during both learning and performance. On the other hand, and at odds with this principle, the majority of gaze behavior studies have tended to average data over participants and trials, proposing the importance of universal 'optimal' gaze patterns in a given task, for all performers, irrespective of stage of learning. In this article, we discuss new lines of inquiry with the aim of reconciling these two distinct approaches. We consider the role of inter- and intra-individual variability in gaze behaviors and suggest directions for future research.

  10. In the presence of conflicting gaze cues, fearful expression and eye-size guide attention.

    PubMed

    Carlson, Joshua M; Aday, Jacob

    2017-10-19

    Humans are social beings that often interact in multi-individual environments. As such, we are frequently confronted with nonverbal social signals, including eye-gaze direction, from multiple individuals. Yet, the factors that allow for the prioritisation of certain gaze cues over others are poorly understood. Using a modified conflicting gaze paradigm, we tested the hypothesis that fearful gaze would be favoured amongst competing gaze cues. We further hypothesised that this effect is related to the increased sclera exposure, which is characteristic of fearful expressions. Across three experiments, we found that fearful, but not happy, gaze guides observers' attention over competing non-emotional gaze. The guidance of attention by fearful gaze appears to be linked to increased sclera exposure. However, differences in sclera exposure do not prioritise competing gazes of other types. Thus, fearful gaze guides attention among competing cues and this effect is facilitated by increased sclera exposure - but increased sclera exposure per se does not guide attention. The prioritisation of fearful gaze over non-emotional gaze likely represents an adaptive means of selectively attending to survival-relevant spatial locations.

  11. Modeling eye-head gaze shifts in multiple contexts without motor planning

    PubMed Central

    Haji-Abolhassani, Iman; Guitton, Daniel

    2016-01-01

    During gaze shifts, the eyes and head collaborate to rapidly capture a target (saccade) and fixate it. Accordingly, models of gaze shift control should embed both saccadic and fixation modes and a mechanism for switching between them. We demonstrate a model in which the eye and head platforms are driven by a shared gaze error signal. To limit the number of free parameters, we implement a model reduction approach in which steady-state cerebellar effects at each of their projection sites are lumped with the parameter of that site. The model topology is consistent with anatomy and neurophysiology, and can replicate eye-head responses observed in multiple experimental contexts: 1) observed gaze characteristics across species and subjects can emerge from this structure with minor parametric changes; 2) gaze can move to a goal while in the fixation mode; 3) ocular compensation for head perturbations during saccades could rely on vestibular-only cells in the vestibular nuclei with postulated projections to burst neurons; 4) two nonlinearities suffice, i.e., the experimentally-determined mapping of tectoreticular cells onto brain stem targets and the increased recruitment of the head for larger target eccentricities; 5) the effects of initial conditions on eye/head trajectories are due to neural circuit dynamics, not planning; and 6) “compensatory” ocular slow phases exist even after semicircular canal plugging, because of interconnections linking eye-head circuits. Our model structure also simulates classical vestibulo-ocular reflex and pursuit nystagmus, and provides novel neural circuit and behavioral predictions, notably that both eye-head coordination and segmental limb coordination are possible without trajectory planning. PMID:27440248
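    The core idea, a single shared gaze-error signal driving both the eye and head plants, with a switch between saccadic and fixation modes and VOR-like ocular counter-rotation against ongoing head motion, can be caricatured in a few lines. The gains, threshold, and first-order plants below are illustrative stand-ins, not the paper's fitted model.

```python
def gaze_shift(target, dt=0.001, duration=0.5, k_eye=150.0, k_head=20.0):
    """Toy shared-gaze-error controller (illustrative only).

    Saccadic mode: both eye and head are driven by gaze error.
    Fixation mode: residual head drive continues, and the eye
    counter-rotates against it (VOR-like), stabilising gaze.
    Angles in degrees; returns (eye, head, gaze) at the end.
    """
    eye = head = 0.0
    for _ in range(int(duration / dt)):
        err = target - (eye + head)        # one shared gaze error
        if abs(err) > 0.5:                 # saccadic mode
            eye += k_eye * err * dt
            head += k_head * err * dt
        else:                              # fixation mode with VOR
            head_vel = k_head * err        # residual head drive
            head += head_vel * dt
            eye += (k_eye * err - head_vel) * dt
    return eye, head, eye + head

e, h, g = gaze_shift(30.0)
print(round(g, 1))  # 30.0
```

    Even this caricature reproduces one signature of eye-head gaze shifts: gaze lands on target while the head is still moving, with the eye absorbing the difference.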

  12. Eye Contact Facilitates Awareness of Faces during Interocular Suppression

    ERIC Educational Resources Information Center

    Stein, Timo; Senju, Atsushi; Peelen, Marius V.; Sterzer, Philipp

    2011-01-01

    Eye contact captures attention and receives prioritized visual processing. Here we asked whether eye contact might be processed outside conscious awareness. Faces with direct and averted gaze were rendered invisible using interocular suppression. In two experiments we found that faces with direct gaze overcame such suppression more rapidly than…

  13. An eye on reactor and computer control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schryver, J.; Knee, B.

    1992-01-01

    At ORNL, computer software has been developed to enable an improved eye-gaze measurement technology. Such an innovation could be the basis for advanced eye-gaze systems that may have applications in reactor control, software development, cognitive engineering, evaluation of displays, prediction of mental workloads, and military target recognition.

  14. Children's Knowledge of Deceptive Gaze Cues and Its Relation to Their Actual Lying Behavior

    ERIC Educational Resources Information Center

    McCarthy, Anjanie; Lee, Kang

    2009-01-01

    Eye gaze plays a pivotal role during communication. When interacting deceptively, it is commonly believed that the deceiver will break eye contact and look downward. We examined whether children's gaze behavior when lying is consistent with this belief. In our study, 7- to 15-year-olds and adults answered questions truthfully ("Truth" questions)…

  15. Remote gaze tracking system for 3D environments.

    PubMed

    Liu, Congcong; Herrup, Karl; Shi, Bertram E

    2017-07-01

    Eye tracking systems are typically divided into two categories: remote and mobile. Remote systems, where the eye tracker is located near the object being viewed by the subject, have the advantage of being less intrusive, but are typically used for tracking gaze points on fixed two dimensional (2D) computer screens. Mobile systems such as eye tracking glasses, where the eye tracker is attached to the subject, are more intrusive, but are better suited for cases where subjects are viewing objects in the three dimensional (3D) environment. In this paper, we describe how remote gaze tracking systems developed for 2D computer screens can be used to track gaze points in a 3D environment. The system is non-intrusive. It compensates for small head movements by the user, so that the head need not be stabilized by a chin rest or bite bar. The system maps the 3D gaze points of the user onto 2D images from a scene camera and is also located remotely from the subject. Measurement results from this system indicate that it is able to estimate gaze points in the scene camera to within one degree over a wide range of head positions.
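    The final mapping step, projecting an estimated 3D gaze point into the 2D scene-camera image, can be sketched with a standard pinhole camera model. The intrinsic matrix and extrinsics below are hypothetical, not the authors' calibration.

```python
import numpy as np

def project_gaze_to_scene(gaze_point_3d, K, R, t):
    """Map a 3D gaze point (tracker coordinates) onto the 2D image of a
    scene camera with intrinsics K and extrinsics [R | t] (pinhole model)."""
    p_cam = R @ np.asarray(gaze_point_3d, dtype=float) + t
    uvw = K @ p_cam                      # homogeneous image coordinates
    return uvw[:2] / uvw[2]              # perspective divide -> pixels

# Hypothetical calibration: 640x480 camera, 500 px focal length,
# scene camera aligned with the tracker coordinate frame.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)
uv = project_gaze_to_scene([0.1, 0.0, 1.0], K, R, t)
print(uv)  # [370. 240.]
```

    In a real system K, R, and t would come from a one-time camera calibration, and the sub-degree accuracy reported above bounds how far `uv` may drift from the true gaze pixel.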

  16. Speaker gaze increases information coupling between infant and adult brains.

    PubMed

    Leong, Victoria; Byrne, Elizabeth; Clackson, Kaili; Georgieva, Stanimira; Lam, Sarah; Wass, Sam

    2017-12-12

    When infants and adults communicate, they exchange social signals of availability and communicative intention such as eye gaze. Previous research indicates that when communication is successful, close temporal dependencies arise between adult speakers' and listeners' neural activity. However, it is not known whether similar neural contingencies exist within adult-infant dyads. Here, we used dual-electroencephalography to assess whether direct gaze increases neural coupling between adults and infants during screen-based and live interactions. In experiment 1 (n = 17), infants viewed videos of an adult who was singing nursery rhymes with (i) direct gaze (looking forward), (ii) indirect gaze (head and eyes averted by 20°), or (iii) direct-oblique gaze (head averted but eyes orientated forward). In experiment 2 (n = 19), infants viewed the same adult in a live context, singing with direct or indirect gaze. Gaze-related changes in adult-infant neural network connectivity were measured using partial directed coherence. Across both experiments, the adult had a significant (Granger) causal influence on infants' neural activity, which was stronger during direct and direct-oblique gaze relative to indirect gaze. During live interactions, infants also influenced the adult more during direct than indirect gaze. Further, infants vocalized more frequently during live direct gaze, and individual infants who vocalized longer also elicited stronger synchronization from the adult. These results demonstrate that direct gaze strengthens bidirectional adult-infant neural connectivity during communication. Thus, ostensive social signals could act to bring brains into mutual temporal alignment, creating a joint-networked state that is structured to facilitate information transfer during early communication and learning. Copyright © 2017 the Author(s). Published by PNAS.

  17. Speaker gaze increases information coupling between infant and adult brains

    PubMed Central

    Leong, Victoria; Byrne, Elizabeth; Clackson, Kaili; Georgieva, Stanimira; Lam, Sarah

    2017-01-01

    When infants and adults communicate, they exchange social signals of availability and communicative intention such as eye gaze. Previous research indicates that when communication is successful, close temporal dependencies arise between adult speakers’ and listeners’ neural activity. However, it is not known whether similar neural contingencies exist within adult–infant dyads. Here, we used dual-electroencephalography to assess whether direct gaze increases neural coupling between adults and infants during screen-based and live interactions. In experiment 1 (n = 17), infants viewed videos of an adult who was singing nursery rhymes with (i) direct gaze (looking forward), (ii) indirect gaze (head and eyes averted by 20°), or (iii) direct-oblique gaze (head averted but eyes orientated forward). In experiment 2 (n = 19), infants viewed the same adult in a live context, singing with direct or indirect gaze. Gaze-related changes in adult–infant neural network connectivity were measured using partial directed coherence. Across both experiments, the adult had a significant (Granger) causal influence on infants’ neural activity, which was stronger during direct and direct-oblique gaze relative to indirect gaze. During live interactions, infants also influenced the adult more during direct than indirect gaze. Further, infants vocalized more frequently during live direct gaze, and individual infants who vocalized longer also elicited stronger synchronization from the adult. These results demonstrate that direct gaze strengthens bidirectional adult–infant neural connectivity during communication. Thus, ostensive social signals could act to bring brains into mutual temporal alignment, creating a joint-networked state that is structured to facilitate information transfer during early communication and learning. PMID:29183980

  18. The Effectiveness of Gaze-Contingent Control in Computer Games.

    PubMed

    Orlov, Paul A; Apraksin, Nikolay

    2015-01-01

    Eye-tracking technology and gaze-contingent control in human-computer interaction have become an objective reality. This article reports on a series of eye-tracking experiments in which we concentrated on one aspect of gaze-contingent interaction: its effectiveness compared with mouse-based control in a computer strategy game. We propose a measure for evaluating the effectiveness of interaction based on the "time of recognition" of a game unit. In this article, we use this measure to compare gaze- and mouse-contingent systems, and we present an analysis of the differences as a function of the number of game units. Our results indicate that the performance of gaze-contingent interaction is typically higher than that of mouse manipulation in a visual search task. When tested on 60 subjects, the results showed that the effectiveness of gaze-contingent systems was over 1.5 times higher. In addition, we found that eye behavior stays quite stable with or without mouse interaction. © The Author(s) 2015.
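    The abstract does not give the effectiveness measure in closed form. One plausible reading, sketched below under that assumption, is effectiveness as the inverse of the mean time of recognition of a game unit, with the gaze/mouse comparison expressed as a ratio. All timing data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical recognition times (seconds) per trial for each modality.
t_gaze = rng.normal(0.8, 0.1, 200).clip(0.2)
t_mouse = rng.normal(1.3, 0.2, 200).clip(0.2)

def effectiveness(times):
    """Effectiveness as the inverse mean time-of-recognition of a game unit
    (an assumed operationalization, not the paper's exact formula)."""
    return 1.0 / np.mean(times)

ratio = effectiveness(t_gaze) / effectiveness(t_mouse)
print(f"gaze/mouse effectiveness ratio: {ratio:.2f}")
```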

  19. A model of face selection in viewing video stories

    PubMed Central

    Suda, Yuki; Kitazawa, Shigeru

    2015-01-01

    When typical adults watch TV programs, they show surprisingly stereotyped gaze behaviours, as indicated by the almost simultaneous shifts of their gazes from one face to another. However, a standard saliency model based on low-level physical features alone failed to explain such typical gaze behaviours. To find rules that explain the typical gaze behaviours, we examined temporo-spatial gaze patterns in adults while they viewed video clips with human characters that were played with or without sound, and in the forward or reverse direction. We here show the following: 1) the “peak” face scanpath, which followed the face that attracted the largest number of views but ignored other objects in the scene, still retained the key features of actual scanpaths, 2) gaze behaviours remained unchanged whether the sound was provided or not, 3) the gaze behaviours were sensitive to time reversal, and 4) nearly 60% of the variance of gaze behaviours was explained by the face saliency that was defined as a function of its size, novelty, head movements, and mouth movements. These results suggest that humans share a face-oriented network that integrates several visual features of multiple faces, and directs our eyes to the most salient face at each moment. PMID:25597621
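    The face-saliency model described above (a function of size, novelty, head movement, and mouth movement) can be sketched as a weighted sum, with an argmax selecting the "peak" face at each frame. The weights and feature values below are invented for illustration; the paper fits the model to observed gaze rather than fixing weights by hand.

```python
import numpy as np

# Hypothetical per-face features for one video frame:
# columns = size, novelty, head_movement, mouth_movement (each normalized 0-1).
faces = np.array([
    [0.30, 0.0, 0.1, 0.8],   # face A: small but talking
    [0.60, 0.0, 0.0, 0.0],   # face B: large, static
    [0.20, 1.0, 0.4, 0.0],   # face C: newly appeared, moving its head
])

# Illustrative weights (assumed, not the paper's fitted values).
w = np.array([0.4, 0.3, 0.15, 0.15])

saliency = faces @ w
peak = int(np.argmax(saliency))   # the face the "peak" scanpath would target
print(saliency.round(3), "-> peak face:", peak)
```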

  20. Implicit prosody mining based on the human eye image capture technology

    NASA Astrophysics Data System (ADS)

    Gao, Pei-pei; Liu, Feng

    2013-08-01

    The technology of eye tracking has become one of the main methods for analyzing recognition issues in human-computer interaction. Human eye image capture is the key problem in eye tracking. Based on further research, a new human-computer interaction method is introduced to enrich the forms of speech synthesis. We propose a method of Implicit Prosody mining based on human eye image capture technology: parameters are extracted from images of the eyes during reading to control and drive prosody generation in speech synthesis, and to establish a prosodic model with high simulation accuracy. The duration model is a key issue for prosody generation. For the duration model, this paper puts forward a new approach for obtaining the gaze duration of the eyes during reading based on eye image capture technology, and for synchronously controlling this gaze duration and the pronunciation duration in speech synthesis. The movement of the eyes during reading is a comprehensive, multi-factor interactive process involving gaze, twitching, and backsight. Therefore, how to extract the appropriate information from images of the eyes must be considered, and the gaze regularity of the eyes must be obtained as a reference for modeling. Based on an analysis of three current eye-movement control models and the characteristics of Implicit Prosody reading, the relative independence between the text speech-processing system and the eye-movement control system is discussed. It is shown that, under the same text-familiarity condition, the gaze duration of the eyes during reading and the internal voice pronunciation duration are synchronous. An eye gaze duration model based on the prosodic structure of the Chinese language is presented to replace previous methods of machine learning and probability forecasting, to obtain readers' real internal reading rhythm, and to synthesize speech with personalized rhythm. This research will enrich human-computer interactive forms and has practical significance and application prospects for assisted speech interaction for the disabled. Experiments show that Implicit Prosody mining based on human eye image capture technology gives the synthesized speech more flexible expression.
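    The claimed synchrony between reading gaze duration and internal pronunciation duration suggests a simple duration-control rule, sketched below under the assumption that baseline syllable durations are rescaled so a prosodic unit's total pronunciation time matches the measured gaze duration. The function and all values are illustrative, not the paper's model.

```python
def sync_durations(gaze_ms, base_syllable_ms):
    """Rescale baseline syllable durations so the unit's total pronunciation
    time matches the reader's measured gaze duration on that prosodic unit."""
    scale = gaze_ms / sum(base_syllable_ms)
    return [d * scale for d in base_syllable_ms]

# Hypothetical prosodic word of three syllables, baseline 180 ms each;
# the reader gazed at this word for 650 ms while reading.
durs = sync_durations(650, [180, 180, 180])
print([round(d, 1) for d in durs], "total:", round(sum(durs), 1))
```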

  1. In the eye of the beholder: eye contact increases resistance to persuasion.

    PubMed

    Chen, Frances S; Minson, Julia A; Schöne, Maren; Heinrichs, Markus

    2013-11-01

    Popular belief holds that eye contact increases the success of persuasive communication, and prior research suggests that speakers who direct their gaze more toward their listeners are perceived as more persuasive. In contrast, we demonstrate that more eye contact between the listener and speaker during persuasive communication predicts less attitude change in the direction advocated. In Study 1, participants freely watched videos of speakers expressing various views on controversial sociopolitical issues. Greater direct gaze at the speaker's eyes was associated with less attitude change in the direction advocated by the speaker. In Study 2, we instructed participants to look at either the eyes or the mouths of speakers presenting arguments counter to participants' own attitudes. Intentionally maintaining direct eye contact led to less persuasion than did gazing at the mouth. These findings suggest that efforts at increasing eye contact may be counterproductive across a variety of persuasion contexts.

  2. Effects of Peripheral Eccentricity and Head Orientation on Gaze Discrimination

    PubMed Central

    Palanica, Adam; Itier, Roxane J.

    2017-01-01

    Visual search tasks support a special role for direct gaze in human cognition, while classic gaze judgment tasks suggest the congruency between head orientation and gaze direction plays a central role in gaze perception. Moreover, whether gaze direction can be accurately discriminated in the periphery using covert attention is unknown. In the present study, individual faces in frontal and in deviated head orientations with a direct or an averted gaze were flashed for 150 ms across the visual field; participants focused on a centred fixation while judging the gaze direction. Gaze discrimination speed and accuracy varied with head orientation and eccentricity. The limit of accurate gaze discrimination was less than ±6° eccentricity. Response times suggested a processing facilitation for direct gaze in fovea, irrespective of head orientation, however, by ±3° eccentricity, head orientation started biasing gaze judgments, and this bias increased with eccentricity. Results also suggested a special processing of frontal heads with direct gaze in central vision, rather than a general congruency effect between eye and head cues. Thus, while both head and eye cues contribute to gaze discrimination, their role differs with eccentricity. PMID:28344501

  3. Gaze Estimation for Off-Angle Iris Recognition Based on the Biometric Eye Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karakaya, Mahmut; Barstow, Del R; Santos-Villalobos, Hector J

    Iris recognition is among the highest-accuracy biometrics. However, its accuracy relies on controlled, high-quality capture data and is negatively affected by several factors such as angle, occlusion, and dilation. Non-ideal iris recognition is a new research focus in biometrics. In this paper, we present a gaze estimation method designed for use in an off-angle iris recognition framework based on the ANONYMIZED biometric eye model. Gaze estimation is an important prerequisite step to correct off-angle iris images. To achieve an accurate frontal reconstruction of an off-angle iris image, we first need to estimate the eye gaze direction from elliptical features of the iris image. Typically, additional information such as well-controlled light sources, head-mounted equipment, and multiple cameras is not available. Our approach utilizes only the iris and pupil boundary segmentation, allowing it to be applicable to all iris capture hardware. We compare the boundaries with a look-up table generated using our biologically inspired biometric eye model and find the closest feature point in the look-up table to estimate the gaze. Based on results from real images, the proposed method shows effective gaze estimation accuracy for our biometric eye model, with an average error of approximately 3.5 degrees over a 50-degree range.
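    The look-up-table step can be sketched with a simplified projection model: a circular boundary viewed off-axis by angle θ projects to an ellipse with minor/major axis ratio of approximately cos θ. The paper's biometric eye model is more sophisticated (it accounts for corneal refraction and eye anatomy); the cosine LUT below is only a stand-in to show the nearest-neighbor lookup.

```python
import math

# Look-up table from gaze angle (degrees) to expected minor/major axis
# ratio of the projected iris boundary, using the simple cos(angle) model.
lut = [(angle, math.cos(math.radians(angle))) for angle in range(0, 51)]

def estimate_gaze(minor, major):
    """Return the LUT gaze angle whose predicted axis ratio is closest to
    the observed ellipse ratio from iris/pupil boundary segmentation."""
    ratio = minor / major
    return min(lut, key=lambda entry: abs(entry[1] - ratio))[0]

# An iris imaged ~30 degrees off-axis: major axis 100 px,
# minor axis ~ 100 * cos(30 deg) ~ 86.6 px.
print(estimate_gaze(86.6, 100.0))  # -> 30
```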

  4. An eye tracking study of bloodstain pattern analysts during pattern classification.

    PubMed

    Arthur, R M; Hoogenboom, J; Green, R D; Taylor, M C; de Bruin, K G

    2018-05-01

    Bloodstain pattern analysis (BPA) is the forensic discipline concerned with the classification and interpretation of bloodstains and bloodstain patterns at the crime scene. At present, it is unclear exactly which stain or pattern properties and their associated values are most relevant to analysts when classifying a bloodstain pattern. Eye tracking technology has been widely used to investigate human perception and cognition. Its application to forensics, however, is limited. This is the first study to use eye tracking as a tool for gaining access to the mindset of the bloodstain pattern expert. An eye tracking method was used to follow the gaze of 24 bloodstain pattern analysts during an assigned task of classifying a laboratory-generated test bloodstain pattern. With the aid of an automated image-processing methodology, the properties of selected features of the pattern were quantified leading to the delineation of areas of interest (AOIs). Eye tracking data were collected for each AOI and combined with verbal statements made by analysts after the classification task to determine the critical range of values for relevant diagnostic features. Eye-tracking data indicated that there were four main regions of the pattern that analysts were most interested in. Within each region, individual elements or groups of elements that exhibited features associated with directionality, size, colour and shape appeared to capture the most interest of analysts during the classification task. The study showed that the eye movements of trained bloodstain pattern experts and their verbal descriptions of a pattern were well correlated.
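    The AOI-based aggregation of eye tracking data described above can be sketched as a dwell-time tally: each fixation is assigned to any area of interest containing it, and fixation durations are summed per AOI. The AOI rectangles and fixation list below are invented for illustration; the study derived its AOIs from an automated image-processing pipeline over the bloodstain pattern.

```python
# Areas of interest as (x_min, y_min, x_max, y_max) rectangles in pixels.
aois = {
    "region_1": (0, 0, 200, 150),
    "region_2": (250, 40, 400, 180),
}

fixations = [  # (x, y, duration_ms)
    (50, 60, 310), (120, 90, 220), (300, 100, 480), (500, 400, 150),
]

def dwell_times(fixations, aois):
    """Total fixation duration (ms) landing inside each AOI."""
    totals = {name: 0 for name in aois}
    for x, y, dur in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dur
    return totals

print(dwell_times(fixations, aois))  # -> {'region_1': 530, 'region_2': 480}
```

    Note that the last fixation falls outside every AOI and is simply dropped, which is the usual convention for dwell-time measures.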

  5. Pursuit Eye-Movements in Curve Driving Differentiate between Future Path and Tangent Point Models

    PubMed Central

    Lappi, Otto; Pekkanen, Jami; Itkonen, Teemu H.

    2013-01-01

    For nearly 20 years, looking at the tangent point on the road edge has been prominent in models of visual orientation in curve driving. It is the most common interpretation of the commonly observed pattern of car drivers looking through a bend, or at the apex of the curve. Indeed, in the visual science literature, visual orientation towards the inside of a bend has become known as “tangent point orientation”. Yet, it remains to be empirically established whether it is the tangent point the drivers are looking at, or whether some other reference point on the road surface, or several reference points, are being targeted in addition to, or instead of, the tangent point. Recently discovered optokinetic pursuit eye-movements during curve driving can provide complementary evidence over and above traditional gaze-position measures. This paper presents the first detailed quantitative analysis of pursuit eye movements elicited by curvilinear optic flow in real driving. The data implicates the far zone beyond the tangent point as an important gaze target area during steady-state cornering. This is in line with the future path steering models, but difficult to reconcile with any pure tangent point steering model. We conclude that the tangent point steering models do not provide a general explanation of eye movement and steering during a curve driving sequence and cannot be considered uncritically as the default interpretation when the gaze position distribution is observed to be situated in the region of the curve apex. PMID:23894300

  6. Atypical Visual Orienting to Eye Gaze and Arrow Cues in Children with High Functioning Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Stauder, Johannes E. A.; Bosch, Claudia P. A.; Nuij, Hiske A. M.

    2011-01-01

    Although children with autism often fail to follow the gaze of others in natural situations, they are sensitive to directional cues conveyed by eye movements. This suggests that the low-level aspects of gaze cueing are intact in persons with autism, while higher-level social skills like joint attention and attribution of desire and intention are…

  7. Deep Gaze Velocity Analysis During Mammographic Reading for Biometric Identification of Radiologists

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoon, Hong-Jun; Alamudun, Folami T.; Hudson, Kathy

    Several studies have confirmed that the gaze velocity of the human eye can be utilized as a behavioral biometric or personalized biomarker. In this study, we leverage the local feature representation capacity of convolutional neural networks (CNNs) for eye gaze velocity analysis as the basis for biometric identification of radiologists performing breast cancer screening. Using gaze data collected from 10 radiologists reading 100 mammograms of various diagnoses, we compared the performance of a CNN-based classification algorithm with two deep learning classifiers, deep neural network and deep belief network, and a previously presented hidden Markov model classifier. The study showed that the CNN classifier is superior to the alternative classification methods, based on macro F1-scores derived from 10-fold cross-validation experiments. Our results further support the efficacy of eye gaze velocity as a biometric identifier of medical imaging experts.

  8. Deep Gaze Velocity Analysis During Mammographic Reading for Biometric Identification of Radiologists

    DOE PAGES

    Yoon, Hong-Jun; Alamudun, Folami T.; Hudson, Kathy; ...

    2018-01-24

    Several studies have confirmed that the gaze velocity of the human eye can be utilized as a behavioral biometric or personalized biomarker. In this study, we leverage the local feature representation capacity of convolutional neural networks (CNNs) for eye gaze velocity analysis as the basis for biometric identification of radiologists performing breast cancer screening. Using gaze data collected from 10 radiologists reading 100 mammograms of various diagnoses, we compared the performance of a CNN-based classification algorithm with two deep learning classifiers, deep neural network and deep belief network, and a previously presented hidden Markov model classifier. The study showed that the CNN classifier is superior to the alternative classification methods, based on macro F1-scores derived from 10-fold cross-validation experiments. Our results further support the efficacy of eye gaze velocity as a biometric identifier of medical imaging experts.
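    The comparison metric named above, macro F1-score, is the unweighted mean of per-class F1 scores, which keeps any single radiologist identity from dominating the average. A minimal sketch (the 10-identity toy labels are illustrative, not the study's data):

```python
import numpy as np

def macro_f1(y_true, y_pred, n_classes):
    """Macro F1: unweighted mean of per-class F1 scores."""
    f1s = []
    for c in range(n_classes):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return float(np.mean(f1s))

# Toy check: 10 radiologist identities, 4 samples each, one misidentification.
y_true = np.repeat(np.arange(10), 4)
y_pred = y_true.copy()
y_pred[0] = 1
print(round(macro_f1(y_true, y_pred, 10), 3))  # -> 0.975
```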

  9. Spatiotemporal commonalities of fronto-parietal activation in attentional orienting triggered by supraliminal and subliminal gaze cues: An event-related potential study.

    PubMed

    Uono, Shota; Sato, Wataru; Sawada, Reiko; Kochiyama, Takanori; Toichi, Motomi

    2018-05-04

    Eye gaze triggers attentional shifts with and without conscious awareness. It remains unclear whether the spatiotemporal patterns of electric neural activity are the same for conscious and unconscious attentional shifts. Thus, the present study recorded event-related potentials (ERPs) and evaluated the neural activation involved in attentional orienting induced by subliminal and supraliminal gaze cues. Nonpredictive gaze cues were presented in the central field of vision, and participants were asked to detect a subsequent peripheral target. The mean reaction time was shorter for congruent gaze cues than for incongruent gaze cues under both presentation conditions, indicating that both types of cues reliably trigger attentional orienting. The ERP analysis revealed that averted versus straight gaze induced greater negative deflection in the bilateral fronto-central and temporal regions between 278 and 344 ms under both supraliminal and subliminal presentation conditions. Supraliminal cues, irrespective of gaze direction, induced a greater negative amplitude than did subliminal cues at the right posterior cortices at a peak of approximately 170 ms and in the 200-300 ms range. These results suggest that similar spatial and temporal fronto-parietal activity is involved in attentional orienting triggered by both supraliminal and subliminal gaze cues, although inputs from different visual processing routes (cortical and subcortical regions) may trigger activity in the attentional network. Copyright © 2018 Elsevier B.V. All rights reserved.
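    The windowed mean-amplitude measure used in the ERP analysis (e.g. the 278-344 ms fronto-central window) can be sketched as a masked average over the epoch; the sampling rate and toy waveform below are assumptions, not the study's recordings.

```python
import numpy as np

def window_mean_amplitude(erp, times_ms, t0, t1):
    """Mean ERP amplitude (microvolts) within the [t0, t1] ms window."""
    mask = (times_ms >= t0) & (times_ms <= t1)
    return float(erp[mask].mean())

# Illustrative single-channel epoch: 1000 ms sampled at 500 Hz, with a
# toy negative deflection peaking near 310 ms.
times = np.arange(0, 1000, 2.0)
erp = -5.0 * np.exp(-((times - 310) / 40.0) ** 2)

print(round(window_mean_amplitude(erp, times, 278, 344), 2))
```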

  10. Impaired reflexive orienting to social cues in attention deficit hyperactivity disorder.

    PubMed

    Marotta, Andrea; Casagrande, Maria; Rosa, Caterina; Maccari, Lisa; Berloco, Bianca; Pasini, Augusto

    2014-08-01

    The present study investigated whether another person's social attention, specifically the direction of their eye gaze, and non-social directional cues triggered reflexive orienting in individuals with Attention Deficit Hyperactivity Disorder (ADHD) and age-matched controls. A choice reaction-time task and a detection task were used in which eye gaze, arrow, and peripheral cues correctly (congruent) or incorrectly (incongruent) signalled target location. Independently of the type of task, differences between groups were specific to the cue condition. Typically developing individuals shifted attention to the location cued by both social and non-social cues, whereas the ADHD group showed evidence of reflexive orienting only to locations previously cued by non-social stimuli (arrow and peripheral cues) and failed to show such an orienting effect in response to social eye gaze cues. The absence of a reflexive orienting effect for eye gaze cues observed in the participants with ADHD may reflect an attentional impairment in responding to socially relevant information.
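    The congruency manipulation above is typically summarized as a cueing effect: mean reaction time on incongruent trials minus mean reaction time on congruent trials, computed per cue type. A positive effect indicates attention was reflexively shifted to the cued location. The RT values below are invented for illustration.

```python
import numpy as np

def cueing_effect(rt_congruent, rt_incongruent):
    """Reflexive-orienting index: mean RT cost (ms) of invalidly cued targets."""
    return float(np.mean(rt_incongruent) - np.mean(rt_congruent))

# Hypothetical detection RTs (ms) for one typically developing participant.
arrow_cong, arrow_incong = [412, 398, 405], [441, 452, 436]
gaze_cong, gaze_incong = [408, 415, 402], [444, 438, 451]

print("arrow effect:", cueing_effect(arrow_cong, arrow_incong))  # 38.0 ms
print("gaze effect:", cueing_effect(gaze_cong, gaze_incong))     # 36.0 ms
```

    In the study's terms, the ADHD group showed a near-zero effect for gaze cues while retaining a positive effect for arrow and peripheral cues.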

  11. Baby schema in human and animal faces induces cuteness perception and gaze allocation in children.

    PubMed

    Borgi, Marta; Cogliati-Dezza, Irene; Brelsford, Victoria; Meints, Kerstin; Cirulli, Francesca

    2014-01-01

    The baby schema concept was originally proposed as a set of infantile traits with high appeal for humans, subsequently shown to elicit caretaking behavior and to affect cuteness perception and attentional processes. However, it is unclear whether the response to the baby schema may be extended to the human-animal bond context. Moreover, questions remain as to whether the cute response is constant and persistent or whether it changes with development. In the present study we parametrically manipulated the baby schema in images of humans, dogs, and cats. We analyzed responses of 3-6 year-old children, using both explicit (i.e., cuteness ratings) and implicit (i.e., eye gaze patterns) measures. By means of eye-tracking, we assessed children's preferential attention to images varying only for the degree of baby schema and explored participants' fixation patterns during a cuteness task. For comparative purposes, cuteness ratings were also obtained in a sample of adults. Overall our results show that the response to an infantile facial configuration emerges early during development. In children, the baby schema affects both cuteness perception and gaze allocation to infantile stimuli and to specific facial features, an effect not simply limited to human faces. In line with previous research, results confirm human positive appraisal toward animals and inform both educational and therapeutic interventions involving pets, helping to minimize risk factors (e.g., dog bites).

  12. Neural activity in the posterior superior temporal region during eye contact perception correlates with autistic traits.

    PubMed

    Hasegawa, Naoya; Kitamura, Hideaki; Murakami, Hiroatsu; Kameyama, Shigeki; Sasagawa, Mutsuo; Egawa, Jun; Endo, Taro; Someya, Toshiyuki

    2013-08-09

    The present study investigated the relationship between neural activity associated with gaze processing and autistic traits in typically developed subjects using magnetoencephalography. Autistic traits in 24 typically developed college students with normal intelligence were assessed using the Autism Spectrum Quotient (AQ). The Minimum Current Estimates method was applied to estimate the cortical sources of magnetic responses to gaze stimuli. These stimuli consisted of apparent motion of the eyes, displaying direct or averted gaze motion. Results revealed gaze-related brain activations in the 150-250 ms time window in the right posterior superior temporal sulcus (pSTS), and in the 150-450 ms time window in medial prefrontal regions. In addition, the mean amplitude in the 150-250 ms time window in the right pSTS region was modulated by gaze direction, and its activity in response to direct gaze stimuli correlated with AQ score. pSTS activation in response to direct gaze is thought to be related to higher-order social processes. Thus, these results suggest that brain activity linking eye contact and social signals is associated with autistic traits in a typical population. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  13. Increased Eye Contact during Conversation Compared to Play in Children with Autism

    ERIC Educational Resources Information Center

    Jones, Rebecca M.; Southerland, Audrey; Hamo, Amarelle; Carberry, Caroline; Bridges, Chanel; Nay, Sarah; Stubbs, Elizabeth; Komarow, Emily; Washington, Clay; Rehg, James M.; Lord, Catherine; Rozga, Agata

    2017-01-01

    Children with autism have atypical gaze behavior but it is unknown whether gaze differs during distinct types of reciprocal interactions. Typically developing children (N = 20) and children with autism (N = 20) (4-13 years) made similar amounts of eye contact with an examiner during a conversation. Surprisingly, there was minimal eye contact…

  14. Quantitative measurement of eyestrain on 3D stereoscopic display considering the eye foveation model and edge information.

    PubMed

    Heo, Hwan; Lee, Won Oh; Shin, Kwang Yong; Park, Kang Ryoung

    2014-05-15

    We propose a new method for measuring the degree of eyestrain on 3D stereoscopic displays using a glasses-type eye-tracking device. Our study is novel in the following four ways: first, the circular area where a user's gaze position exists is defined based on the calculated gaze position and gaze estimation error. Within this circular area, the position where edge strength is maximized can be detected, and we determine this position as the gaze position that has a higher probability of being the correct one. Based on this gaze point, the eye foveation model is defined. Second, we quantitatively evaluate the correlation between the degree of eyestrain and the causal factors of visual fatigue, such as the degree of change of stereoscopic disparity (CSD), stereoscopic disparity (SD), frame cancellation effect (FCE), and edge component (EC) of the 3D stereoscopic display using the eye foveation model. Third, by comparing the eyestrain in conventional 3D video and an experimental 3D sample video, we analyze the characteristics of eyestrain according to various factors and types of 3D video. Fourth, by comparing the eyestrain with and without compensation for saccadic eye movements in 3D video, we analyze the characteristics of eyestrain according to the types of eye movements in 3D video. Experimental results show that the degree of CSD causes more eyestrain than the other factors.
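    The first contribution above, snapping the estimated gaze point to the strongest edge inside the gaze-error circle, can be sketched as a masked argmax over an edge-strength map. The toy edge map and radius below are assumptions for illustration.

```python
import numpy as np

def refine_gaze(edge_strength, gaze_xy, radius):
    """Within the error circle around the estimated gaze position, pick the
    pixel with maximal edge strength as the more probable gaze point."""
    h, w = edge_strength.shape
    ys, xs = np.mgrid[0:h, 0:w]
    inside = (xs - gaze_xy[0]) ** 2 + (ys - gaze_xy[1]) ** 2 <= radius ** 2
    masked = np.where(inside, edge_strength, -np.inf)
    y, x = np.unravel_index(np.argmax(masked), masked.shape)
    return int(x), int(y)

# Toy edge map: a strong vertical edge at column 12, estimated gaze at (10, 10).
edges = np.zeros((20, 20))
edges[:, 12] = 1.0
print(refine_gaze(edges, gaze_xy=(10, 10), radius=3))
```

    The refined x-coordinate lands on the edge column (12) because the tracker's raw estimate (10, 10) lies within `radius` of it.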

  15. To Gaze or Not to Gaze: Visual Communication in Eastern Zaire. Sociolinguistic Working Paper Number 87.

    ERIC Educational Resources Information Center

    Blakely, Thomas D.

    The nature of gazing at someone or something, as a form of communication among the Bahemba people in eastern Zaire, is analyzed across a range of situations. Variations of steady gazing, a common eye contact routine, are outlined, including: (1) negative non-gazing or glance routines, especially in situations in which gazing would ordinarily…

  16. Gaze shifts and fixations dominate gaze behavior of walking cats

    PubMed Central

    Rivers, Trevor J.; Sirota, Mikhail G.; Guttentag, Andrew I.; Ogorodnikov, Dmitri A.; Shah, Neet A.; Beloozerova, Irina N.

    2014-01-01

    Vision is important for locomotion in complex environments. How it is used to guide stepping is not well understood. We used an eye search coil technique combined with an active marker-based head recording system to characterize the gaze patterns of cats walking over terrains of different complexity: (1) on a flat surface in the dark when no visual information was available, (2) on the flat surface in light when visual information was available but not required, (3) along the highly structured but regular and familiar surface of a horizontal ladder, a task for which visual guidance of stepping was required, and (4) along a pathway cluttered with many small stones, an irregularly structured surface that was new each day. Three cats walked in a 2.5 m corridor, and 958 passages were analyzed. Gaze activity during the time when the gaze was directed at the walking surface was subdivided into four behaviors based on speed of gaze movement along the surface: gaze shift (fast movement), gaze fixation (no movement), constant gaze (movement at the body’s speed), and slow gaze (the remainder). We found that gaze shifts and fixations dominated the cats’ gaze behavior during all locomotor tasks, jointly occupying 62–84% of the time when the gaze was directed at the surface. As visual complexity of the surface and demand on visual guidance of stepping increased, cats spent more time looking at the surface, looked closer to them, and switched between gaze behaviors more often. During both visually guided locomotor tasks, gaze behaviors predominantly followed a repeated cycle of forward gaze shift followed by fixation. We call this behavior “gaze stepping”. Each gaze shift took gaze to a site approximately 75–80 cm in front of the cat, which the cat reached in 0.7–1.2 s and 1.1–1.6 strides. Constant gaze occupied only 5–21% of the time cats spent looking at the walking surface. PMID:24973656
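    The four-way partition of gaze behavior by speed of gaze movement along the walking surface can be sketched as a simple classifier; the numeric thresholds below are illustrative assumptions, not the values used in the study.

```python
def classify_gaze(gaze_speed, body_speed,
                  fix_thresh=5.0, fast_thresh=100.0, constant_tol=0.2):
    """Label one gaze sample by its speed along the walking surface (cm/s):
    near-zero -> fixation, very fast -> gaze shift, close to the body's
    speed -> constant gaze, everything else -> slow gaze."""
    if abs(gaze_speed) <= fix_thresh:
        return "fixation"
    if abs(gaze_speed) >= fast_thresh:
        return "gaze shift"
    if abs(gaze_speed - body_speed) <= constant_tol * body_speed:
        return "constant gaze"
    return "slow gaze"

body = 60.0  # hypothetical cat walking at 60 cm/s
print([classify_gaze(v, body) for v in (0.0, 250.0, 58.0, 30.0)])
```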

  17. Young Children Show a Dissociation in Looking and Pointing Behavior in Falling Events

    ERIC Educational Resources Information Center

    Lee, Vivian; Kuhlmeier, Valerie A.

    2013-01-01

    Studies of social cognitive reasoning have demonstrated instances of children engaging in eye gaze patterns toward correct answers even when pointing or verbal responses are directed toward incorrect answers. Findings such as these have spawned seminal theories, yet no consensus has been reached regarding the characteristics of the knowledge…

  18. Design of a virtual reality based adaptive response technology for children with autism.

    PubMed

    Lahiri, Uttama; Bekele, Esubalew; Dohrmann, Elizabeth; Warren, Zachary; Sarkar, Nilanjan

    2013-01-01

    Children with autism spectrum disorder (ASD) demonstrate potent impairments in social communication skills including atypical viewing patterns during social interactions. Recently, several assistive technologies, particularly virtual reality (VR), have been investigated to address specific social deficits in this population. Some studies have coupled eye-gaze monitoring mechanisms to design intervention strategies. However, presently available systems are designed to primarily chain learning via aspects of one's performance only which affords restricted range of individualization. The presented work seeks to bridge this gap by developing a novel VR-based interactive system with Gaze-sensitive adaptive response technology that can seamlessly integrate VR-based tasks with eye-tracking techniques to intelligently facilitate engagement in tasks relevant to advancing social communication skills. Specifically, such a system is capable of objectively identifying and quantifying one's engagement level by measuring real-time viewing patterns, subtle changes in eye physiological responses, as well as performance metrics in order to adaptively respond in an individualized manner to foster improved social communication skills among the participants. The developed system was tested through a usability study with eight adolescents with ASD. The results indicate the potential of the system to promote improved social task performance along with socially-appropriate mechanisms during VR-based social conversation tasks.
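    A hypothetical sketch of the gaze-sensitive adaptive loop described above: an engagement index combining viewing patterns, an eye-physiology term, and task performance, feeding a difficulty-adaptation rule. The index, weights, and thresholds are all invented; the abstract does not specify the system's actual formulas.

```python
def engagement_index(frac_on_face, pupil_dilation_z, task_score,
                     weights=(0.5, 0.2, 0.3)):
    """Hypothetical composite engagement level in [0, 1]: fraction of time
    gazing at the partner's face, a squashed pupil-response z-score, and
    task performance, linearly combined."""
    w_gaze, w_pupil, w_perf = weights
    pupil = min(max((pupil_dilation_z + 2) / 4, 0.0), 1.0)  # map z to [0, 1]
    return w_gaze * frac_on_face + w_pupil * pupil + w_perf * task_score

def adapt_difficulty(level, engagement, low=0.4, high=0.75):
    """Individualized adaptation: ease the task when engagement drops,
    advance it when engagement is high."""
    if engagement < low:
        return max(1, level - 1)
    if engagement > high:
        return level + 1
    return level

e = engagement_index(0.8, 0.5, 0.9)
print(round(e, 3), "-> next level:", adapt_difficulty(3, e))
```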

  19. Design of a Virtual Reality Based Adaptive Response Technology for Children With Autism

    PubMed Central

    Lahiri, Uttama; Bekele, Esubalew; Dohrmann, Elizabeth; Warren, Zachary; Sarkar, Nilanjan

    2013-01-01

    Children with autism spectrum disorder (ASD) demonstrate potent impairments in social communication skills including atypical viewing patterns during social interactions. Recently, several assistive technologies, particularly virtual reality (VR), have been investigated to address specific social deficits in this population. Some studies have coupled eye-gaze monitoring mechanisms to design intervention strategies. However, presently available systems are designed to primarily chain learning via aspects of one’s performance only which affords restricted range of individualization. The presented work seeks to bridge this gap by developing a novel VR-based interactive system with Gaze-sensitive adaptive response technology that can seamlessly integrate VR-based tasks with eye-tracking techniques to intelligently facilitate engagement in tasks relevant to advancing social communication skills. Specifically, such a system is capable of objectively identifying and quantifying one’s engagement level by measuring real-time viewing patterns, subtle changes in eye physiological responses, as well as performance metrics in order to adaptively respond in an individualized manner to foster improved social communication skills among the participants. The developed system was tested through a usability study with eight adolescents with ASD. The results indicate the potential of the system to promote improved social task performance along with socially-appropriate mechanisms during VR-based social conversation tasks. PMID:23033333

  20. Social communication with virtual agents: The effects of body and gaze direction on attention and emotional responding in human observers.

    PubMed

    Marschner, Linda; Pannasch, Sebastian; Schulz, Johannes; Graupner, Sven-Thomas

    2015-08-01

    In social communication, the gaze direction of other persons provides important information to perceive and interpret their emotional response. Previous research investigated the influence of gaze by manipulating mutual eye contact. Therefore, gaze and body direction have been changed as a whole, resulting in only congruent gaze and body directions (averted or directed) of another person. Here, we aimed to disentangle these effects by using short animated sequences of virtual agents posing with either direct or averted body or gaze. Attention allocation by means of eye movements, facial muscle response, and emotional experience to agents of different gender and facial expressions were investigated. Eye movement data revealed longer fixation durations, i.e., a stronger allocation of attention, when gaze and body direction were not congruent with each other or when both were directed towards the observer. This suggests that direct interaction as well as incongruous signals increase the demands of attentional resources in the observer. For the facial muscle response, only the reaction of muscle zygomaticus major revealed an effect of body direction, expressed by stronger activity in response to happy expressions for direct compared to averted gaze when the virtual character's body was directed towards the observer. Finally, body direction also influenced the emotional experience ratings towards happy expressions. While earlier findings suggested that mutual eye contact is the main source for increased emotional responding and attentional allocation, the present results indicate that direction of the virtual agent's body and head also plays a minor but significant role. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Learning rational temporal eye movement strategies.

    PubMed

    Hoppe, David; Rothkopf, Constantin A

    2016-07-19

    During active behavior humans redirect their gaze several times every second within the visual environment. Where we look within static images is highly efficient, as quantified by computational models of human gaze shifts in visual search and face recognition tasks. However, when we shift gaze is mostly unknown despite its fundamental importance for survival in a dynamic world. It has been suggested that during naturalistic visuomotor behavior gaze deployment is coordinated with task-relevant events, often predictive of future events, and studies in sportsmen suggest that timing of eye movements is learned. Here we establish that humans efficiently learn to adjust the timing of eye movements in response to environmental regularities when monitoring locations in the visual scene to detect probabilistically occurring events. To detect the events humans adopt strategies that can be understood through a computational model that includes perceptual and acting uncertainties, a minimal processing time, and, crucially, the intrinsic costs of gaze behavior. Thus, subjects traded off event detection rate with behavioral costs of carrying out eye movements. Remarkably, based on this rational bounded actor model the time course of learning the gaze strategies is fully explained by an optimal Bayesian learner with humans' characteristic uncertainty in time estimation, the well-known scalar law of biological timing. Taken together, these findings establish that the human visual system is highly efficient in learning temporal regularities in the environment and that it can use these regularities to control the timing of eye movements to detect behaviorally relevant events.
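    The "scalar law" of biological timing invoked above can be illustrated with a short simulation: if the noise in an interval estimate has a standard deviation proportional to the interval (a constant Weber fraction), then the coefficient of variation stays roughly constant across intervals. This is a minimal sketch of that property only, not the authors' model; the Weber fraction and sample size below are arbitrary illustrative choices.

```python
import random
import statistics

# Sketch of scalar timing: estimate noise grows in proportion to the
# interval being timed, so the coefficient of variation (CV) is constant.
# weber_fraction = 0.15 is an illustrative value, not taken from the paper.

def timing_estimates(interval, weber_fraction=0.15, n=50_000, seed=1):
    """Draw n noisy estimates of `interval` with sd proportional to it."""
    rng = random.Random(seed)
    return [rng.gauss(interval, weber_fraction * interval) for _ in range(n)]

for interval in (0.5, 1.0, 2.0):  # seconds
    est = timing_estimates(interval)
    cv = statistics.stdev(est) / statistics.fmean(est)
    print(f"{interval:.1f} s: CV ~ {cv:.3f}")  # roughly the same for all intervals
```

    The constant CV is what lets a single noise parameter explain learning curves across different event timings.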

  2. Primate social attention: Species differences and effects of individual experience in humans, great apes, and macaques

    PubMed Central

    Kano, Fumihiro; Shepherd, Stephen V.; Hirata, Satoshi; Call, Josep

    2018-01-01

    When viewing social scenes, humans and nonhuman primates focus on particular features, such as the models’ eyes, mouth, and action targets. Previous studies reported that such viewing patterns vary significantly across individuals in humans, and also across closely-related primate species. However, the nature of these individual and species differences remains unclear, particularly among nonhuman primates. In large samples of human and nonhuman primates, we examined species differences and the effects of experience on patterns of gaze toward social movies. Experiment 1 examined the species differences across rhesus macaques, nonhuman apes (bonobos, chimpanzees, and orangutans), and humans while they viewed movies of various animals’ species-typical behaviors. We found that each species had distinct viewing patterns of the models’ faces, eyes, mouths, and action targets. Experiment 2 tested the effect of individuals’ experience on chimpanzee and human viewing patterns. We presented movies depicting natural behaviors of chimpanzees to three groups of chimpanzees (individuals from a zoo, a sanctuary, and a research institute) differing in their early social and physical experiences. We also presented the same movies to human adults and children differing in their expertise with chimpanzees (experts vs. novices) or movie-viewing generally (adults vs. preschoolers). Individuals varied within each species in their patterns of gaze toward models’ faces, eyes, mouths, and action targets depending on their unique individual experiences. We thus found that the viewing patterns for social stimuli are both individual- and species-specific in these closely-related primates. Such individual/species-specificities are likely related to both individual experience and species-typical temperament, suggesting that primate individuals acquire their unique attentional biases through both ontogeny and evolution. 
Such unique attentional biases may help them learn efficiently about their particular social environments. PMID:29474416

  3. Primate social attention: Species differences and effects of individual experience in humans, great apes, and macaques.

    PubMed

    Kano, Fumihiro; Shepherd, Stephen V; Hirata, Satoshi; Call, Josep

    2018-01-01

    When viewing social scenes, humans and nonhuman primates focus on particular features, such as the models' eyes, mouth, and action targets. Previous studies reported that such viewing patterns vary significantly across individuals in humans, and also across closely-related primate species. However, the nature of these individual and species differences remains unclear, particularly among nonhuman primates. In large samples of human and nonhuman primates, we examined species differences and the effects of experience on patterns of gaze toward social movies. Experiment 1 examined the species differences across rhesus macaques, nonhuman apes (bonobos, chimpanzees, and orangutans), and humans while they viewed movies of various animals' species-typical behaviors. We found that each species had distinct viewing patterns of the models' faces, eyes, mouths, and action targets. Experiment 2 tested the effect of individuals' experience on chimpanzee and human viewing patterns. We presented movies depicting natural behaviors of chimpanzees to three groups of chimpanzees (individuals from a zoo, a sanctuary, and a research institute) differing in their early social and physical experiences. We also presented the same movies to human adults and children differing in their expertise with chimpanzees (experts vs. novices) or movie-viewing generally (adults vs. preschoolers). Individuals varied within each species in their patterns of gaze toward models' faces, eyes, mouths, and action targets depending on their unique individual experiences. We thus found that the viewing patterns for social stimuli are both individual- and species-specific in these closely-related primates. Such individual/species-specificities are likely related to both individual experience and species-typical temperament, suggesting that primate individuals acquire their unique attentional biases through both ontogeny and evolution. 
Such unique attentional biases may help them learn efficiently about their particular social environments.

  4. Spatial updating depends on gaze direction even after loss of vision.

    PubMed

    Reuschel, Johanna; Rösler, Frank; Henriques, Denise Y P; Fiehler, Katja

    2012-02-15

    Direction of gaze (eye angle + head angle) has been shown to be important for representing space for action, implying a crucial role of vision for spatial updating. However, blind people have no access to vision yet are able to perform goal-directed actions successfully. Here, we investigated the role of visual experience for localizing and updating targets as a function of intervening gaze shifts in humans. People who differed in visual experience (late blind, congenitally blind, or sighted) were briefly presented with a proprioceptive reach target while facing it. Before they reached to the target's remembered location, they turned their head toward an eccentric direction that also induced corresponding eye movements in sighted and late blind individuals. We found that reaching errors varied systematically as a function of shift in gaze direction only in participants with early visual experience (sighted and late blind). In the late blind, this effect was solely present in people with moveable eyes but not in people with at least one glass eye. Our results suggest that the effect of gaze shifts on spatial updating develops on the basis of visual experience early in life and remains even after loss of vision as long as feedback from the eyes and head is available.

  5. 3D gaze tracking method using Purkinje images on eye optical model and pupil

    NASA Astrophysics Data System (ADS)

    Lee, Ji Woo; Cho, Chul Woo; Shin, Kwang Yong; Lee, Eui Chul; Park, Kang Ryoung

    2012-05-01

    Gaze tracking detects the position at which a user is looking. Most research on gaze estimation has focused on calculating the X, Y gaze position on a 2D plane. However, as the importance of stereoscopic displays and 3D applications has increased greatly, research into 3D gaze estimation of not only the X, Y gaze position but also the Z gaze position has gained attention for the development of next-generation interfaces. In this paper, we propose a new method for estimating the 3D gaze position based on the illuminative reflections (Purkinje images) on the surfaces of the cornea and lens, considering the 3D optical structure of the human eye model. This research is novel in the following four ways compared with previous work. First, we theoretically analyze the generated models of Purkinje images based on the 3D human eye model for 3D gaze estimation. Second, the positions of the first and fourth Purkinje images relative to the pupil center, the inter-distance between these two Purkinje images, and the pupil size are used as the features for calculating the Z gaze position. The pupil size is used on the basis of the fact that pupil accommodation occurs according to gaze position in the Z direction. Third, with these features as inputs, the final Z gaze position is calculated using a multi-layered perceptron (MLP). Fourth, the X, Y gaze position on the 2D plane is calculated from the position of the pupil center via a geometric transform that takes the calculated Z gaze position into account. Experimental results showed that the average errors of the 3D gaze estimation were about 0.96° (0.48 cm) on the X-axis, 1.60° (0.77 cm) on the Y-axis, and 4.59 cm along the Z-axis in 3D space.
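    As a rough illustration of the feature pipeline described in this record, the sketch below builds the stated feature vector (positions of the first and fourth Purkinje images relative to the pupil center, their inter-distance, and pupil size) and passes it through a small MLP. This is not the authors' implementation: the function names, network size, and untrained random weights are assumptions for illustration only; a real system would train the weights on calibration data.

```python
import math
import random

# Hypothetical sketch of Purkinje-image features -> MLP -> Z gaze estimate.

def purkinje_features(p1, p4, pupil_center, pupil_size):
    """Feature vector: 1st and 4th Purkinje image positions relative to the
    pupil center, inter-Purkinje distance, and pupil size (image units)."""
    dx1, dy1 = p1[0] - pupil_center[0], p1[1] - pupil_center[1]
    dx4, dy4 = p4[0] - pupil_center[0], p4[1] - pupil_center[1]
    inter = math.hypot(p1[0] - p4[0], p1[1] - p4[1])
    return [dx1, dy1, dx4, dy4, inter, pupil_size]

def mlp_forward(x, w1, w2):
    """One hidden layer with tanh units and a linear output unit."""
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x))) for row in w1]
    return sum(wo * h for wo, h in zip(w2, hidden))

# Untrained random weights, for shape illustration only.
rng = random.Random(42)
n_in, n_hidden = 6, 4
w1 = [[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
w2 = [rng.uniform(-1, 1) for _ in range(n_hidden)]

x = purkinje_features(p1=(12.0, 8.0), p4=(10.5, 7.2),
                      pupil_center=(10.0, 7.0), pupil_size=3.1)
z_estimate = mlp_forward(x, w1, w2)
print(z_estimate)  # a scalar Z estimate (weights untrained, so value is arbitrary)
```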

  6. Integrating a Motion Base into a CAVE Automatic Virtual Environment: Phase 1

    DTIC Science & Technology

    2001-07-01

    this, a CAVE system must perform well in the following motion-related areas: visual gaze stability, simulator sickness, realism (or face validity) ... and performance validity. Visual gaze stability, the ability to maintain eye fixation on a particular target, depends upon human ... reflexes such as the vestibulo-ocular reflex (VOR) and the optokinetic nystagmus (OKN). The VOR is a reflex that counter-rotates the eye relative to the...

  7. Effects of Observing Eye Contact on Gaze Following in High-Functioning Autism

    ERIC Educational Resources Information Center

    Böckler, Anne; Timmermans, Bert; Sebanz, Natalie; Vogeley, Kai; Schilbach, Leonhard

    2014-01-01

    Observing eye contact between others enhances the tendency to subsequently follow their gaze and has been suggested to function as a social signal that adds meaning to an upcoming action or event. The present study investigated effects of observed eye contact in high-functioning autism (HFA). Two faces on a screen either looked at or away from…

  8. A Statistical Physics Perspective to Understand Social Visual Attention in Autism Spectrum Disorder.

    PubMed

    Liberati, Alessio; Fadda, Roberta; Doneddu, Giuseppe; Congiu, Sara; Javarone, Marco A; Striano, Tricia; Chessa, Alessandro

    2017-08-01

    This study investigated social visual attention in children with Autism Spectrum Disorder (ASD) and with typical development (TD) in the light of Brockmann and Geisel's model of visual attention. The probability distribution of gaze movements and clustering of gaze points, registered with eye-tracking technology, was studied during a free visual exploration of a gaze stimulus. A data-driven analysis of the distribution of eye movements was chosen to overcome any possible methodological problems related to the subjective expectations of the experimenters about the informative contents of the image in addition to a computational model to simulate group differences. Analysis of the eye-tracking data indicated that the scanpaths of children with TD and ASD were characterized by eye movements geometrically equivalent to Lévy flights. Children with ASD showed a higher frequency of long saccadic amplitudes compared with controls. A clustering analysis revealed a greater dispersion of eye movements for these children. Modeling of the results indicated higher values of the model parameter modulating the dispersion of eye movements for children with ASD. Together, the experimental results and the model point to a greater dispersion of gaze points in ASD.
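    The Lévy-flight account above can be sketched with a small simulation: saccade amplitudes drawn from a heavy-tailed power law P(l) ∝ l^-(1+μ), where a smaller tail index μ yields more long saccades and greater dispersion, qualitatively matching the pattern reported for the ASD group. The parameter values below are illustrative assumptions, not fitted values from the study.

```python
import random

# Heavy-tailed saccade amplitudes: random.Random.paretovariate(mu) samples
# from a Pareto distribution with tail index mu (minimum amplitude 1.0).
# Smaller mu -> heavier tail -> more long saccades, i.e. greater dispersion.

def sample_amplitudes(mu, n, seed=0):
    """Draw n saccade amplitudes (arbitrary units) with tail index mu."""
    rng = random.Random(seed)
    return [rng.paretovariate(mu) for _ in range(n)]

def long_saccade_rate(amplitudes, threshold=5.0):
    """Fraction of saccades longer than `threshold`."""
    return sum(a > threshold for a in amplitudes) / len(amplitudes)

typical   = sample_amplitudes(mu=2.0, n=10_000)  # lighter tail (illustrative)
dispersed = sample_amplitudes(mu=1.2, n=10_000)  # heavier tail (illustrative)
print(long_saccade_rate(typical), long_saccade_rate(dispersed))
```

    The second rate comes out clearly higher: lowering the tail-index parameter alone reproduces a higher frequency of long saccadic amplitudes and a greater spread of gaze points.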

  9. Trained Eyes: Experience Promotes Adaptive Gaze Control in Dynamic and Uncertain Visual Environments

    PubMed Central

    Taya, Shuichiro; Windridge, David; Osman, Magda

    2013-01-01

    Current eye-tracking research suggests that our eyes make anticipatory movements to a location that is relevant for a forthcoming task. Moreover, there is evidence to suggest that anticipatory gaze control can improve with practice. However, these findings are largely limited to situations where participants are actively engaged in a task. We ask: does experience modulate anticipatory gaze control while passively observing a visual scene? To tackle this we tested people with varying degrees of experience of tennis, in order to uncover potential associations between experience and eye movement behaviour while they watched tennis videos. The number, size, and accuracy of saccades (rapid eye movements) made around ‘events’ critical to the scene context (i.e., hits and bounces) were analysed. Overall, we found that experience improved anticipatory eye movements while watching tennis clips. In general, those with extensive experience showed greater accuracy of saccades to upcoming event locations; this was particularly prevalent for events in the scene that carried high uncertainty (i.e., ball bounces). The results indicate that, even when passively observing, our gaze control system utilizes prior relevant knowledge in order to anticipate upcoming uncertain event locations. PMID:23951147

  10. Assessing the precision of gaze following using a stereoscopic 3D virtual reality setting.

    PubMed

    Atabaki, Artin; Marciniak, Karolina; Dicke, Peter W; Thier, Peter

    2015-07-01

    Despite the ecological importance of gaze following, little is known about the underlying neuronal processes that allow us to extract gaze direction from the geometric features of a conspecific's eyes and head. In order to understand the neuronal mechanisms underlying this ability, a careful description of the capacity and the limitations of gaze following at the behavioral level is needed. Previous studies of gaze following that relied on naturalistic settings have the disadvantage of allowing only very limited control of potentially relevant visual features guiding gaze following, such as the contrast of iris and sclera or the shape of the eyelids; in the case of photographs, they also lack depth. Hence, in order to get full control of potentially relevant features, we decided to study gaze following of human observers guided by the gaze of a human avatar seen stereoscopically. To this end we established a stereoscopic 3D virtual reality setup in which we tested human subjects' ability to detect which target a human avatar was looking at. Following the gaze of the avatar showed all the features of the gaze following of a natural person, namely a substantial degree of precision associated with a consistent pattern of systematic deviations from the target. Poor stereo vision affected performance surprisingly little (only in certain experimental conditions). Only gaze following guided by targets at larger downward eccentricities exhibited a differential effect of the presence or absence of accompanying movements of the avatar's eyelids and eyebrows. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Visual Representation of Eye Gaze Is Coded by a Nonopponent Multichannel System

    ERIC Educational Resources Information Center

    Calder, Andrew J.; Jenkins, Rob; Cassel, Anneli; Clifford, Colin W. G.

    2008-01-01

    To date, there is no functional account of the visual perception of gaze in humans. Previous work has demonstrated that left gaze and right gaze are represented by separate mechanisms. However, these data are consistent with either a multichannel system comprising separate channels for distinct gaze directions (e.g., left, direct, and right) or an…

  12. Matching the oculomotor drive during head-restrained and head-unrestrained gaze shifts in monkey.

    PubMed

    Bechara, Bernard P; Gandhi, Neeraj J

    2010-08-01

    High-frequency burst neurons in the pons provide the eye velocity command (equivalently, the primary oculomotor drive) to the abducens nucleus for generation of the horizontal component of both head-restrained (HR) and head-unrestrained (HU) gaze shifts. We sought to characterize how gaze and its eye-in-head component differ when an "identical" oculomotor drive is used to produce HR and HU movements. To address this objective, the activities of pontine burst neurons were recorded during horizontal HR and HU gaze shifts. The burst profile recorded on each HU trial was compared with the burst waveform of every HR trial obtained for the same neuron. The oculomotor drive was assumed to be comparable for the pair yielding the lowest root-mean-squared error. For matched pairs of HR and HU trials, the peak eye-in-head velocity was substantially smaller in the HU condition, and the reduction was usually greater than the peak head velocity of the HU trial. A time-varying attenuation index, defined as the difference between the HR and HU eye velocity waveforms divided by head velocity [α = (E(hr) − E(hu))/H], was computed. The index was variable at the onset of the gaze shift, but it settled at values several times greater than 1. The index then decreased gradually during the movement and stabilized at 1 around the end of the gaze shift. These results imply that substantial attenuation in eye velocity occurs, at least partially, downstream of the burst neurons. We speculate on the potential roles of burst-tonic neurons in the neural integrator and various cell types in the vestibular nuclei in mediating the attenuation in eye velocity in the presence of head movements.
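    A minimal sketch of the attenuation-index computation, assuming matched HR and HU velocity waveforms sampled at the same time points (the function name and the values below are synthetic illustrations, not data or code from the study):

```python
# Attenuation index alpha(t) = (E_hr(t) - E_hu(t)) / H(t), computed sample
# by sample for matched head-restrained / head-unrestrained trials.

def attenuation_index(e_hr, e_hu, h, eps=1e-9):
    """e_hr: HR eye-in-head velocity (deg/s); e_hu: matched HU eye-in-head
    velocity (deg/s); h: HU head velocity (deg/s). Samples where head
    velocity is ~0 are returned as None (index undefined there)."""
    return [
        (hr - hu) / hv if abs(hv) > eps else None
        for hr, hu, hv in zip(e_hr, e_hu, h)
    ]

# Synthetic waveforms: the index starts well above 1 during the movement
# and settles at 1 around the end of the gaze shift.
e_hr = [400.0, 350.0, 300.0, 100.0]
e_hu = [250.0, 230.0, 240.0, 60.0]
h    = [50.0, 60.0, 60.0, 40.0]
print(attenuation_index(e_hr, e_hu, h))  # [3.0, 2.0, 1.0, 1.0]
```

    An index above 1 means the drop in eye velocity exceeds the concurrent head velocity, which is the observation that points to attenuation downstream of the burst neurons.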

  13. Perceptual impairment and psychomotor control in virtual laparoscopic surgery.

    PubMed

    Wilson, Mark R; McGrath, John S; Vine, Samuel J; Brewer, James; Defriend, David; Masters, Richard S W

    2011-07-01

    It is recognised that one of the major difficulties in performing laparoscopic surgery is the translation of two-dimensional video image information to a three-dimensional working area. However, research has tended to ignore the gaze and eye-hand coordination strategies employed by laparoscopic surgeons as they attempt to overcome these perceptual constraints. This study sought to examine if measures related to tool movements, gaze strategy, and eye-hand coordination (the quiet eye) differentiate between experienced and novice operators performing a two-handed manoeuvres task on a virtual reality laparoscopic surgical simulator (LAP Mentor™). Twenty-five right-handed surgeons were categorised as being either experienced (having led more than 60 laparoscopic procedures) or novice (having performed fewer than 10 procedures) operators. The 10 experienced and 15 novice surgeons completed the "two-hand manoeuvres" task from the LAP Mentor basic skills learning environment while wearing a gaze registration system. Performance, movement, gaze, and eye-hand coordination parameters were recorded and compared between groups. The experienced surgeons completed the task significantly more quickly than the novices, used significantly fewer movements, and displayed shorter tool paths. Gaze analyses revealed that experienced surgeons spent significantly more time fixating the target locations than novices, who split their time between focusing on the targets and tracking the tools. A more detailed analysis of a difficult subcomponent of the task revealed that experienced operators used a significantly longer aiming fixation (the quiet eye period) to guide precision grasping movements and hence needed fewer grasp attempts. 
The findings of the study provide further support for the utility of examining strategic gaze behaviour and eye-hand coordination measures to help further our understanding of how experienced surgeons attempt to overcome the perceptual difficulties inherent in the laparoscopic environment.

  14. Quiet eye gaze behavior of expert, and near-expert, baseball plate umpires.

    PubMed

    Millslagle, Duane G; Hines, Bridget B; Smith, Melissa S

    2013-02-01

    The quiet eye gaze behavior of 4 near-expert and 4 expert baseball umpires who called balls and strikes in simulated pitch-hit situations was assessed with a mobile eye cornea tracker system. Statistical analyses of the umpires' gaze behavior (fixation/pursuit tracking, saccades, and blinks), including onset, duration, offset, and frequency, were performed between and within 4 stages (pitcher's preparation, pitcher's delivery, ball in flight, and umpire call) by umpire skill level. The results indicated that the quiet eye of expert umpires at onset of the pitcher's release point occurred earlier and was longer in duration than that of near-expert umpires. The area outside the pitcher's ball release point may be the key environmental cue for the behind-the-plate umpire.

  15. Avoidance of a moving threat in the common chameleon (Chamaeleo chamaeleon): rapid tracking by body motion and eye use.

    PubMed

    Lev-Ari, Tidhar; Lustig, Avichai; Ketter-Katz, Hadas; Baydach, Yossi; Katzir, Gadi

    2016-08-01

    A chameleon (Chamaeleo chamaeleon) on a perch responds to a nearby threat by moving to the side of the perch opposite the threat, while bilaterally compressing its abdomen, thus minimizing its exposure to the threat. If the threat moves, the chameleon pivots around the perch to maintain its hidden position. How precise is the body rotation and what are the patterns of eye movement during avoidance? Just-hatched chameleons, placed on a vertical perch, on the side roughly opposite to a visual threat, adjusted their position to precisely opposite the threat. If the threat were moved on a horizontal arc at angular velocities of up to 85°/s, the chameleons co-rotated smoothly so that (1) the angle of the sagittal plane of the head relative to the threat and (2) the direction of monocular gaze, were positively and significantly correlated with threat angular position. Eye movements were role-dependent: the eye toward which the threat moved maintained a stable gaze on it, while the contralateral eye scanned the surroundings. This is the first description, to our knowledge, of such a response in a non-flying terrestrial vertebrate, and it is discussed in terms of possible underlying control systems.

  16. Watching Eyes effects: When others meet the self.

    PubMed

    Conty, Laurence; George, Nathalie; Hietanen, Jari K

    2016-10-01

    The perception of direct gaze-that is, of another individual's gaze directed at the observer-is known to influence a wide range of cognitive processes and behaviors. We present a new theoretical proposal to provide a unified account of these effects. We argue that direct gaze first captures the beholder's attention and then triggers self-referential processing, i.e., a heightened processing of stimuli in relation with the self. Self-referential processing modulates incoming information processing and leads to the Watching Eyes effects, which we classify into four main categories: the enhancement of self-awareness, memory effects, the activation of pro-social behavior, and positive appraisals of others. We advance that the belief to be the object of another's attention is embedded in direct gaze perception and gives direct gaze its self-referential power. Finally, we stress that the Watching Eyes effects reflect a positive impact on human cognition; therefore, they may have a therapeutic potential, which future research should delineate. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Deep Learning-Based Gaze Detection System for Automobile Drivers Using a NIR Camera Sensor.

    PubMed

    Naqvi, Rizwan Ali; Arsalan, Muhammad; Batchuluun, Ganbayar; Yoon, Hyo Sik; Park, Kang Ryoung

    2018-02-03

    A paradigm shift is required to prevent the increasing automobile accident deaths that are mostly due to the inattentive behavior of drivers. Knowledge of gaze region can provide valuable information regarding a driver's point of attention. Accurate and inexpensive gaze classification systems in cars can improve safe driving. However, monitoring real-time driving behaviors and conditions presents some challenges: dizziness due to long drives, extreme lighting variations, glasses reflections, and occlusions. Past studies on gaze detection in cars have been chiefly based on head movements. The margin of error in gaze detection increases when drivers gaze at objects by moving their eyes without moving their heads. To solve this problem, a pupil center corneal reflection (PCCR)-based method has been considered. However, the error in accurately detecting the pupil center and corneal reflection center increases in a car environment due to various environmental light changes, reflections on glasses surfaces, and motion and optical blurring of the captured eye image. In addition, existing PCCR-based methods require initial user calibration, which is difficult to perform in a car environment. To address this issue, we propose a deep learning-based gaze detection method using a near-infrared (NIR) camera sensor that considers driver head and eye movement and does not require any initial user calibration. The proposed system is evaluated on our self-constructed database as well as on the open Columbia gaze dataset (CAVE-DB). The proposed method demonstrated greater accuracy than previous gaze classification methods.

  18. Deep Learning-Based Gaze Detection System for Automobile Drivers Using a NIR Camera Sensor

    PubMed Central

    Naqvi, Rizwan Ali; Arsalan, Muhammad; Batchuluun, Ganbayar; Yoon, Hyo Sik; Park, Kang Ryoung

    2018-01-01

    A paradigm shift is required to prevent the increasing automobile accident deaths that are mostly due to the inattentive behavior of drivers. Knowledge of gaze region can provide valuable information regarding a driver's point of attention. Accurate and inexpensive gaze classification systems in cars can improve safe driving. However, monitoring real-time driving behaviors and conditions presents some challenges: dizziness due to long drives, extreme lighting variations, glasses reflections, and occlusions. Past studies on gaze detection in cars have been chiefly based on head movements. The margin of error in gaze detection increases when drivers gaze at objects by moving their eyes without moving their heads. To solve this problem, a pupil center corneal reflection (PCCR)-based method has been considered. However, the error in accurately detecting the pupil center and corneal reflection center increases in a car environment due to various environmental light changes, reflections on glasses surfaces, and motion and optical blurring of the captured eye image. In addition, existing PCCR-based methods require initial user calibration, which is difficult to perform in a car environment. To address this issue, we propose a deep learning-based gaze detection method using a near-infrared (NIR) camera sensor that considers driver head and eye movement and does not require any initial user calibration. The proposed system is evaluated on our self-constructed database as well as on the open Columbia gaze dataset (CAVE-DB). The proposed method demonstrated greater accuracy than previous gaze classification methods. PMID:29401681

  19. Stabilization of gaze during circular locomotion in darkness. II. Contribution of velocity storage to compensatory eye and head nystagmus in the running monkey

    NASA Technical Reports Server (NTRS)

    Solomon, D.; Cohen, B.

    1992-01-01

    1. Yaw eye in head (Eh) and head on body velocities (Hb) were measured in two monkeys that ran around the perimeter of a circular platform in darkness. The platform was stationary or could be counterrotated to reduce body velocity in space (Bs) while increasing gait velocity on the platform (Bp). The animals were also rotated while seated in a primate chair at eccentric locations to provide linear and angular accelerations similar to those experienced while running. 2. Both animals had head and eye nystagmus while running in darkness during which slow phase gaze velocity on the body (Gb) partially compensated for body velocity in space (Bs). The eyes, driven by the vestibuloocular reflex (VOR), supplied high-frequency characteristics, bringing Gb up to compensatory levels at the beginning and end of the slow phases. The head provided substantial gaze compensation during the slow phases, probably through the vestibulocollic reflex (VCR). Synchronous eye and head quick phases moved gaze in the direction of running. Head movements occurred consistently only when animals were running. This indicates that active body and limb motion may be essential for inducing the head-eye gaze synergy. 3. Gaze compensation was good when running in both directions in one animal and in one direction in the other animal. The animals had long VOR time constants in these directions. The VOR time constant was short to one side in one animal, and it had poor gaze compensation in this direction. Postlocomotory nystagmus was weaker after running in directions with a long VOR time constant than when the animals were passively rotated in darkness. We infer that velocity storage in the vestibular system had been activated to produce continuous Eh and Hb during running and to counteract postrotatory afterresponses. 4. Continuous compensatory gaze nystagmus was not produced by passive eccentric rotation with the head stabilized or free. 
This indicates that an aspect of active locomotion, most likely somatosensory feedback, was responsible for activating velocity storage. 5. Nystagmus was compared when an animal ran in darkness and in light. The beat frequency of eye and head nystagmus was lower, and the quick phases were larger, in darkness. The duration of head and eye quick phases covaried. Eye quick phases were larger when animals ran in darkness than when they were passively rotated. The maximum velocity and duration of eye quick phases were the same in both conditions. 6. The platform was counterrotated under one monkey in darkness while it ran in the direction of its long vestibular time constant. (ABSTRACT TRUNCATED AT 400 WORDS)

  20. Eye contact with neutral and smiling faces: effects on autonomic responses and frontal EEG asymmetry

    PubMed Central

    Pönkänen, Laura M.; Hietanen, Jari K.

    2012-01-01

    In our previous studies we have shown that seeing another person “live” with a direct vs. averted gaze results in enhanced skin conductance responses (SCRs) indicating autonomic arousal and in greater relative left-sided frontal activity in the electroencephalography (asymmetry in the alpha-band power), associated with approach motivation. In our studies, however, the stimulus persons had a neutral expression. In real-life social interaction, eye contact is often associated with a smile, which is another signal of the sender's approach-related motivation. A smile could, therefore, enhance the affective-motivational responses to eye contact. In the present study, we investigated whether the facial expression (neutral vs. social smile) would modulate autonomic arousal and frontal EEG alpha-band asymmetry to seeing a direct vs. an averted gaze in faces presented “live” through a liquid crystal (LC) shutter. The results showed that the SCRs were greater for the direct than the averted gaze and that the effect of gaze direction was more pronounced for a smiling than a neutral face. However, in this study, gaze direction and facial expression did not affect the frontal EEG asymmetry, although, for gaze direction, we found a marginally significant correlation between the degree of an overall bias for asymmetric frontal activity and the degree to which direct gaze elicited stronger left-sided frontal activity than did averted gaze. PMID:22586387

  1. Method of Menu Selection by Gaze Movement Using AC EOG Signals

    NASA Astrophysics Data System (ADS)

    Kanoh, Shin'ichiro; Futami, Ryoko; Yoshinobu, Tatsuo; Hoshimiya, Nozomu

A method to detect the direction and the distance of voluntary eye gaze movements from EOG (electrooculogram) signals was proposed and tested. In this method, AC-amplified vertical and horizontal transient EOG signals were classified by direction (8 classes) and distance (2 classes) of voluntary eye gaze movement. The horizontal and vertical EOG values at each sampling time during a gaze movement were treated as a two-dimensional vector, and the center of gravity of the sample vectors whose norms exceeded 80% of the maximum norm was used as the feature vector to be classified. Using the k-nearest neighbor algorithm for classification, the average correct detection rates for the individual subjects were 98.9%, 98.7%, and 94.4%, respectively. This method avoids strict EOG-based eye tracking, which requires DC amplification of very small signals. It could be useful for developing robust human interface systems based on menu selection for severely paralyzed patients.
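The feature extraction and classification steps described above can be sketched in Python. This is a minimal numpy illustration, not the authors' code; the function names and the synthetic data are my own:

```python
import numpy as np

def eog_feature(h, v, frac=0.8):
    """Feature vector for one gaze movement: center of gravity of the
    2-D (horizontal, vertical) EOG samples whose norm is at least
    `frac` of the maximum norm, as described in the abstract."""
    pts = np.column_stack([h, v])
    norms = np.linalg.norm(pts, axis=1)
    keep = pts[norms >= frac * norms.max()]
    return keep.mean(axis=0)

def knn_classify(x, train_X, train_y, k=3):
    """Plain k-nearest-neighbor vote using Euclidean distance."""
    d = np.linalg.norm(train_X - x, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]
```

With 8 directions and 2 distances, the training labels would span 16 classes; each class prototype is the feature vector computed from labeled calibration movements.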

  2. Eye-gaze control of the computer interface: Discrimination of zoom intent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, J.H.; Schryver, J.C.

    1993-10-01

An analysis methodology and associated experiment were developed to assess whether definable and repeatable signatures of eye-gaze characteristics are evident, preceding a decision to zoom-in, zoom-out, or not to zoom at a computer interface. This user intent discrimination procedure can have broad application in disability aids and telerobotic control. Eye-gaze was collected from 10 subjects in a controlled experiment, requiring zoom decisions. The eye-gaze data were clustered, then fed into a multiple discriminant analysis (MDA) for optimal definition of heuristics separating the zoom-in, zoom-out, and no-zoom conditions. Confusion matrix analyses showed that a number of variable combinations classified at a statistically significant level, but practical significance was more difficult to establish. Composite contour plots demonstrated the regions in parameter space consistently assigned by the MDA to unique zoom conditions. Peak classification occurred at about 1200-1600 msec. Improvements in the methodology to achieve practical real-time zoom control are considered.
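The study's pipeline (clustering followed by multi-class MDA) is not reproduced here; as a simplified illustration of the discriminant step only, a two-class Fisher linear discriminant separating, say, zoom-in from no-zoom feature vectors might look like this. All names and data are hypothetical:

```python
import numpy as np

def fisher_lda(X0, X1):
    """Two-class Fisher linear discriminant: returns a weight vector w
    and threshold c such that sign(x @ w - c) separates the classes.
    (The study used multi-class MDA; this two-class version only
    illustrates the idea.)"""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter: sum of the per-class covariance matrices.
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    w = np.linalg.solve(Sw, m1 - m0)
    c = w @ (m0 + m1) / 2.0
    return w, c
```

In practice the features would be per-cluster gaze statistics (e.g., fixation duration, dispersion) rather than the made-up 2-D points used for testing here.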

  3. Disentangling the initiation from the response in joint attention: an eye-tracking study in toddlers with autism spectrum disorders.

    PubMed

    Billeci, L; Narzisi, A; Campatelli, G; Crifaci, G; Calderoni, S; Gagliano, A; Calzone, C; Colombi, C; Pioggia, G; Muratori, F

    2016-05-17

Joint attention (JA), whose deficit is an early risk marker for autism spectrum disorder (ASD), has two dimensions: (1) responding to JA and (2) initiating JA. Eye-tracking technology has largely been used to investigate responding JA, but rarely to study initiating JA, especially in young children with ASD. The aim of this study was to describe the differences in the visual patterns of toddlers with ASD and those with typical development (TD) during both responding JA and initiating JA tasks. Eye-tracking technology was used to monitor the gaze of 17 children with ASD and 15 age-matched children with TD during the presentation of short video sequences involving one responding JA and two initiating JA tasks (initiating JA-1 and initiating JA-2). Gaze accuracy, transitions and fixations were analyzed. No differences were found in the responding JA task between children with ASD and those with TD, whereas, in the initiating JA tasks, different patterns of fixation and transitions were shown between the groups. These results suggest that children with ASD and those with TD show different visual patterns when they are expected to initiate joint attention but not when they respond to joint attention. We hypothesized that differences in transitions and fixations are linked to ASD impairments in visual disengagement from the face, in global scanning of the scene and in the ability to anticipate the object's action.

  4. The Role of Face Familiarity in Eye Tracking of Faces by Individuals with Autism Spectrum Disorders

    PubMed Central

    Dawson, Geraldine; Webb, Sara; Murias, Michael; Munson, Jeffrey; Panagiotides, Heracles; Aylward, Elizabeth

    2010-01-01

    It has been shown that individuals with autism spectrum disorders (ASD) demonstrate normal activation in the fusiform gyrus when viewing familiar, but not unfamiliar faces. The current study utilized eye tracking to investigate patterns of attention underlying familiar versus unfamiliar face processing in ASD. Eye movements of 18 typically developing participants and 17 individuals with ASD were recorded while passively viewing three face categories: unfamiliar non-repeating faces, a repeating highly familiar face, and a repeating previously unfamiliar face. Results suggest that individuals with ASD do not exhibit more normative gaze patterns when viewing familiar faces. A second task assessed facial recognition accuracy and response time for familiar and novel faces. The groups did not differ on accuracy or reaction times. PMID:18306030

  5. Quantitative Measurement of Eyestrain on 3D Stereoscopic Display Considering the Eye Foveation Model and Edge Information

    PubMed Central

    Heo, Hwan; Lee, Won Oh; Shin, Kwang Yong; Park, Kang Ryoung

    2014-01-01

We propose a new method for measuring the degree of eyestrain on 3D stereoscopic displays using a glasses-type eye tracking device. Our study is novel in the following four ways: first, the circular area where a user's gaze position exists is defined based on the calculated gaze position and gaze estimation error. Within this circular area, the position where edge strength is maximized can be detected, and we take this position as the gaze position that has a higher probability of being the correct one. Based on this gaze point, the eye foveation model is defined. Second, we quantitatively evaluate the correlation between the degree of eyestrain and the causal factors of visual fatigue, such as the degree of change of stereoscopic disparity (CSD), stereoscopic disparity (SD), frame cancellation effect (FCE), and edge component (EC) of the 3D stereoscopic display, using the eye foveation model. Third, by comparing the eyestrain in conventional 3D video and an experimental 3D sample video, we analyze the characteristics of eyestrain according to various factors and types of 3D video. Fourth, by comparing the eyestrain with and without compensation for saccadic eye movements in 3D video, we analyze the characteristics of eyestrain according to the types of eye movements in 3D video. Experimental results show that the degree of CSD causes more eyestrain than the other factors. PMID:24834910
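The first step, snapping the estimated gaze point to the strongest edge inside the gaze-error circle, can be sketched as follows. This is a minimal numpy illustration under assumed inputs (a precomputed edge-magnitude map and an error radius); the authors' actual implementation may differ:

```python
import numpy as np

def refine_gaze(edge_map, gaze_xy, radius):
    """Within a circle of `radius` pixels around the estimated gaze
    position (the gaze-error region described above), pick the pixel
    with the highest edge strength as the more probable gaze point.
    `edge_map` is a 2-D array of edge magnitudes."""
    h, w = edge_map.shape
    ys, xs = np.mgrid[0:h, 0:w]
    inside = (xs - gaze_xy[0]) ** 2 + (ys - gaze_xy[1]) ** 2 <= radius ** 2
    # Exclude pixels outside the error circle from the search.
    masked = np.where(inside, edge_map, -np.inf)
    y, x = np.unravel_index(np.argmax(masked), masked.shape)
    return x, y
```

The refined point would then anchor the foveation model used in the eyestrain analysis.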

  6. Pursuit tracks chase: exploring the role of eye movements in the detection of chasing

    PubMed Central

    Träuble, Birgit

    2015-01-01

We explore the role of eye movements in a chase detection task. Unlike previous studies, which focused on overall performance as indicated by response speed and chase detection accuracy, we decompose the search process into gaze events such as smooth eye movements and use a data-driven approach to separately describe these gaze events. We measured eye movements of four human subjects engaged in a chase detection task displayed on a computer screen. The subjects were asked to detect two chasing rings among twelve other randomly moving rings. Using principal component analysis and support vector machines, we looked at the template and classification images that describe various stages of the detection process. We showed that the subjects mostly search for pairs of rings that move one after another in the same direction at a distance of 3.5–3.8 degrees. To find such pairs, the subjects first looked for regions with a high ring density and then pursued the rings in this region. Most of these pursued groups consisted of two rings. Three subjects preferred to pursue a pair as a single object, while the remaining subject pursued the group by alternating gaze between the two individual rings. In the discussion, we argue that subjects do not compare the movement of the pursued pair to a singular preformed template that describes a chasing motion. Rather, subjects bring certain hypotheses about what motion may qualify as a chase and then, through feedback, learn to look for a motion pattern that maximizes their performance. PMID:26401454

  7. Spatial Coding of Eye Movements Relative to Perceived Orientations During Roll Tilt with Different Gravitoinertial Loads

    NASA Technical Reports Server (NTRS)

    Wood, Scott; Clement, Gilles

    2013-01-01

The purpose of this study was to examine the spatial coding of eye movements during roll tilt relative to perceived orientations while free-floating during the microgravity phase of parabolic flight or during head tilt in normal gravity. Binocular videographic recordings obtained in darkness from six subjects allowed us to quantify the mean deviations in gaze trajectories along both horizontal and vertical coordinates relative to the aircraft and head orientations. Both variability and curvature of gaze trajectories increased during roll tilt compared to the upright position. The saccades were less accurate during parabolic flight compared to measurements obtained in normal gravity. The trajectories of saccades along perceived horizontal orientations tended to deviate in the same direction as the head tilt, while the deviations in gaze trajectories along the perceived vertical orientations deviated in the opposite direction relative to the head tilt. Although subjects were instructed to look off in the distance while performing the eye movements, fixation distance varied with vertical gaze direction independent of whether the saccades were made along perceived aircraft or head orientations. This coupling of horizontal vergence with vertical gaze is in a direction consistent with the vertical slant of the horopter. The increased errors in gaze trajectories along both perceived orientations during microgravity can be attributed to the otolith's role in the spatial coding of eye movements.

  8. Human-like object tracking and gaze estimation with PKD android

    PubMed Central

Wijayasinghe, Indika B.; Miller, Haylie L.; Das, Sumit K.; Bugnariu, Nicoleta L.; Popa, Dan O.

    2018-01-01

As the use of robots increases for tasks that require human-robot interactions, it is vital that robots exhibit and understand human-like cues for effective communication. In this paper, we describe the implementation of object tracking capability on the Philip K. Dick (PKD) android and a gaze tracking algorithm, both of which further robot capabilities with regard to human communication. PKD's ability to track objects with human-like head postures is achieved with visual feedback from a Kinect system and an eye camera. The goal of object tracking with human-like gestures is twofold: to facilitate better human-robot interactions and to enable PKD as a human gaze emulator for future studies. The gaze tracking system employs a mobile eye tracking system (ETG; SensoMotoric Instruments) and a motion capture system (Cortex; Motion Analysis Corp.) for tracking the head orientations. Objects to be tracked are displayed by a virtual reality system, the Computer Assisted Rehabilitation Environment (CAREN; MotekForce Link). The gaze tracking algorithm converts eye tracking data and head orientations to gaze information, facilitating two objectives: to evaluate the performance of the object tracking system for PKD and to use the gaze information to predict the intentions of the user, enabling the robot to understand physical cues by humans. PMID:29416193

  9. Human-like object tracking and gaze estimation with PKD android

    NASA Astrophysics Data System (ADS)

    Wijayasinghe, Indika B.; Miller, Haylie L.; Das, Sumit K.; Bugnariu, Nicoleta L.; Popa, Dan O.

    2016-05-01

    As the use of robots increases for tasks that require human-robot interactions, it is vital that robots exhibit and understand human-like cues for effective communication. In this paper, we describe the implementation of object tracking capability on Philip K. Dick (PKD) android and a gaze tracking algorithm, both of which further robot capabilities with regard to human communication. PKD's ability to track objects with human-like head postures is achieved with visual feedback from a Kinect system and an eye camera. The goal of object tracking with human-like gestures is twofold: to facilitate better human-robot interactions and to enable PKD as a human gaze emulator for future studies. The gaze tracking system employs a mobile eye tracking system (ETG; SensoMotoric Instruments) and a motion capture system (Cortex; Motion Analysis Corp.) for tracking the head orientations. Objects to be tracked are displayed by a virtual reality system, the Computer Assisted Rehabilitation Environment (CAREN; MotekForce Link). The gaze tracking algorithm converts eye tracking data and head orientations to gaze information facilitating two objectives: to evaluate the performance of the object tracking system for PKD and to use the gaze information to predict the intentions of the user, enabling the robot to understand physical cues by humans.

  10. Can gaze-contingent mirror-feedback from unfamiliar faces alter self-recognition?

    PubMed

    Estudillo, Alejandro J; Bindemann, Markus

    2017-05-01

This study focuses on learning of the self, by examining how human observers update internal representations of their own face. For this purpose, we present a novel gaze-contingent paradigm, in which an onscreen face mimics observers' own eye-gaze behaviour (in the congruent condition), moves its eyes in directions different from those of the observers (incongruent condition), or remains static and unresponsive (neutral condition). Across three experiments, the mimicry of the onscreen face did not affect observers' perceptual self-representations. However, this paradigm influenced observers' reports of their own face. This effect was such that observers felt the onscreen face to be their own and that, if the onscreen gaze had moved of its own accord, observers expected their own eyes to move too. The theoretical implications of these findings are discussed.

  11. Nasal Oxytocin Treatment Biases Dogs’ Visual Attention and Emotional Response toward Positive Human Facial Expressions

    PubMed Central

    Somppi, Sanni; Törnqvist, Heini; Topál, József; Koskela, Aija; Hänninen, Laura; Krause, Christina M.; Vainio, Outi

    2017-01-01

    The neuropeptide oxytocin plays a critical role in social behavior and emotion regulation in mammals. The aim of this study was to explore how nasal oxytocin administration affects gazing behavior during emotional perception in domestic dogs. Looking patterns of dogs, as a measure of voluntary attention, were recorded during the viewing of human facial expression photographs. The pupil diameters of dogs were also measured as a physiological index of emotional arousal. In a placebo-controlled within-subjects experimental design, 43 dogs, after having received either oxytocin or placebo (saline) nasal spray treatment, were presented with pictures of unfamiliar male human faces displaying either a happy or an angry expression. We found that, depending on the facial expression, the dogs’ gaze patterns were affected selectively by oxytocin treatment. After receiving oxytocin, dogs fixated less often on the eye regions of angry faces and revisited (glanced back at) more often the eye regions of smiling (happy) faces than after the placebo treatment. Furthermore, following the oxytocin treatment dogs fixated and revisited the eyes of happy faces significantly more often than the eyes of angry faces. The analysis of dogs’ pupil diameters during viewing of human facial expressions indicated that oxytocin may also have a modulatory effect on dogs’ emotional arousal. While subjects’ pupil sizes were significantly larger when viewing angry faces than happy faces in the control (placebo treatment) condition, oxytocin treatment not only eliminated this effect but caused an opposite pupil response. Overall, these findings suggest that nasal oxytocin administration selectively changes the allocation of attention and emotional arousal in domestic dogs. Oxytocin has the potential to decrease vigilance toward threatening social stimuli and increase the salience of positive social stimuli thus making eye gaze of friendly human faces more salient for dogs. 
Our study provides further support for the role of the oxytocinergic system in the social perception abilities of domestic dogs. We propose that oxytocin modulates fundamental emotional processing in dogs through a mechanism that may facilitate communication between humans and dogs. PMID:29089919

  12. The Foundations of Social Cognition: Studies on Face/Voice Integration in Newborn Infants

    ERIC Educational Resources Information Center

    Streri, Arlette; Coulon, Marion; Guellai, Bahia

    2013-01-01

    A series of studies on newborns' abilities for recognizing speaking faces has been performed in order to identify the fundamental cues of social cognition. We used audiovisual dynamic faces rather than photographs or patterns of faces. Direct eye gaze and speech addressed to newborns, in interactive situations, appear to be two good candidates for…

  13. Do Gaze Cues in Complex Scenes Capture and Direct the Attention of High Functioning Adolescents with ASD? Evidence from Eye-Tracking

    ERIC Educational Resources Information Center

    Freeth, M.; Chapman, P.; Ropar, D.; Mitchell, P.

    2010-01-01

    Visual fixation patterns whilst viewing complex photographic scenes containing one person were studied in 24 high-functioning adolescents with Autism Spectrum Disorders (ASD) and 24 matched typically developing adolescents. Over two different scene presentation durations both groups spent a large, strikingly similar proportion of their viewing…

  14. Balance, mobility and gaze stability deficits remain following surgical removal of vestibular schwannoma (acoustic neuroma): an observational study.

    PubMed

    Choy, Nancy Low; Johnson, Natalie; Treleaven, Julia; Jull, Gwendolen; Panizza, Benedict; Brown-Rothwell, David

    2006-01-01

This observational study asked whether residual deficits in balance, mobility, and gaze stability remain after surgical removal of vestibular schwannoma. Participants were twelve people with a mean age of 52 years who had undergone surgical removal of vestibular schwannoma at least three months previously and had not undergone vestibular rehabilitation; twelve age- and gender-matched healthy people acted as controls. Handicap due to dizziness, balance, mobility, and gaze stability were measured. Handicap due to dizziness was moderate for the clinical group. The clinical group swayed significantly more than the controls in comfortable stance: firm surface with eyes open and with visual conflict (p < 0.05); foam surface with eyes closed (p < 0.05) and with visual conflict (p < 0.05); and in feet-together stance: firm surface with eyes closed (p < 0.05), foam surface with eyes open (p < 0.05) and eyes closed (p < 0.01). They displayed a higher rate of failure for timed stance and gaze stability (p < 0.05) than the controls. Step Test (p < 0.01), Tandem Walk Test (p < 0.05) and Dynamic Gait Index (p < 0.01) scores were also significantly reduced compared with controls. There was a significant correlation between handicap due to dizziness and the inability to maintain balance in single-limb and tandem stance (r = 0.68, p = 0.02) and the ability to maintain gaze stability during passive head movement (r = 0.78, p = 0.02). A prospective study is required to evaluate whether vestibular rehabilitation can ameliorate dizziness and improve balance, mobility, and gaze stability for this clinical group.

  15. What You Learn is What You See: Using Eye Movements to Study Infant Cross-Situational Word Learning

    PubMed Central

    Smith, Linda

    2016-01-01

    Recent studies show that both adults and young children possess powerful statistical learning capabilities to solve the word-to-world mapping problem. However, the underlying mechanisms that make statistical learning possible and powerful are not yet known. With the goal of providing new insights into this issue, the research reported in this paper used an eye tracker to record the moment-by-moment eye movement data of 14-month-old babies in statistical learning tasks. Various measures are applied to such fine-grained temporal data, such as looking duration and shift rate (the number of shifts in gaze from one visual object to the other) trial by trial, showing different eye movement patterns between strong and weak statistical learners. Moreover, an information-theoretic measure is developed and applied to gaze data to quantify the degree of learning uncertainty trial by trial. Next, a simple associative statistical learning model is applied to eye movement data and these simulation results are compared with empirical results from young children, showing strong correlations between these two. This suggests that an associative learning mechanism with selective attention can provide a cognitively plausible model of cross-situational statistical learning. The work represents the first steps to use eye movement data to infer underlying real-time processes in statistical word learning. PMID:22213894
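The information-theoretic uncertainty measure is not spelled out in the abstract; one plausible reading, quantifying trial-by-trial learning uncertainty as the Shannon entropy of the looking-time distribution over the candidate referents, can be sketched as:

```python
import numpy as np

def gaze_entropy(looking_times):
    """Shannon entropy (bits) of the proportion of looking time spent
    on each candidate object in a trial. High entropy = looking is
    spread evenly (high uncertainty); zero entropy = all looking on
    one object. This is an assumed formulation, not the paper's."""
    p = np.asarray(looking_times, dtype=float)
    p = p / p.sum()
    p = p[p > 0]  # 0 * log(0) is taken as 0
    return float(-(p * np.log2(p)).sum())
```

Applied trial by trial, such a measure would let strong and weak statistical learners be compared on how quickly their looking converges on one referent.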

  16. Variability of eye movements when viewing dynamic natural scenes.

    PubMed

    Dorr, Michael; Martinetz, Thomas; Gegenfurtner, Karl R; Barth, Erhardt

    2010-08-26

    How similar are the eye movement patterns of different subjects when free viewing dynamic natural scenes? We collected a large database of eye movements from 54 subjects on 18 high-resolution videos of outdoor scenes and measured their variability using the Normalized Scanpath Saliency, which we extended to the temporal domain. Even though up to about 80% of subjects looked at the same image region in some video parts, variability usually was much greater. Eye movements on natural movies were then compared with eye movements in several control conditions. "Stop-motion" movies had almost identical semantic content as the original videos but lacked continuous motion. Hollywood action movie trailers were used to probe the upper limit of eye movement coherence that can be achieved by deliberate camera work, scene cuts, etc. In a "repetitive" condition, subjects viewed the same movies ten times each over the course of 2 days. Results show several systematic differences between conditions both for general eye movement parameters such as saccade amplitude and fixation duration and for eye movement variability. Most importantly, eye movements on static images are initially driven by stimulus onset effects and later, more so than on continuous videos, by subject-specific idiosyncrasies; eye movements on Hollywood movies are significantly more coherent than those on natural movies. We conclude that the stimuli types often used in laboratory experiments, static images and professionally cut material, are not very representative of natural viewing behavior. All stimuli and gaze data are publicly available at http://www.inb.uni-luebeck.de/tools-demos/gaze.
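The Normalized Scanpath Saliency used here can be sketched as follows; applying it frame by frame gives the temporal extension the authors describe. This is a minimal numpy illustration, not the authors' code:

```python
import numpy as np

def nss(saliency_map, fixations):
    """Normalized Scanpath Saliency: z-score the map (zero mean, unit
    variance), then average the z-values at the fixated pixels.
    `fixations` is a list of (row, col) gaze positions; for measuring
    inter-subject variability the map itself can be built from the
    other subjects' gaze positions."""
    z = (saliency_map - saliency_map.mean()) / saliency_map.std()
    return float(np.mean([z[r, c] for r, c in fixations]))
```

Positive NSS means fixations land on above-average regions of the map; values near zero indicate chance-level agreement.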

  17. Biasing moral decisions by exploiting the dynamics of eye gaze.

    PubMed

    Pärnamets, Philip; Johansson, Petter; Hall, Lars; Balkenius, Christian; Spivey, Michael J; Richardson, Daniel C

    2015-03-31

    Eye gaze is a window onto cognitive processing in tasks such as spatial memory, linguistic processing, and decision making. We present evidence that information derived from eye gaze can be used to change the course of individuals' decisions, even when they are reasoning about high-level, moral issues. Previous studies have shown that when an experimenter actively controls what an individual sees the experimenter can affect simple decisions with alternatives of almost equal valence. Here we show that if an experimenter passively knows when individuals move their eyes the experimenter can change complex moral decisions. This causal effect is achieved by simply adjusting the timing of the decisions. We monitored participants' eye movements during a two-alternative forced-choice task with moral questions. One option was randomly predetermined as a target. At the moment participants had fixated the target option for a set amount of time we terminated their deliberation and prompted them to choose between the two alternatives. Although participants were unaware of this gaze-contingent manipulation, their choices were systematically biased toward the target option. We conclude that even abstract moral cognition is partly constituted by interactions with the immediate environment and is likely supported by gaze-dependent decision processes. By tracking the interplay between individuals, their sensorimotor systems, and the environment, we can influence the outcome of a decision without directly manipulating the content of the information available to them.
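The gaze-contingent timing manipulation, interrupting deliberation once cumulative fixation on the predetermined target reaches a threshold, can be sketched as below. The sample period and dwell criterion are illustrative values, not the study's actual parameters:

```python
def prompt_time(gaze_stream, target, dwell_ms, dt_ms=20):
    """Return the time (ms) at which deliberation would be interrupted:
    the first moment cumulative fixation time on the predetermined
    `target` option reaches `dwell_ms`. `gaze_stream` yields the
    region fixated at each sample ('A', 'B', or None). Returns None
    if the criterion is never met."""
    dwell, t = 0, 0
    for region in gaze_stream:
        t += dt_ms
        if region == target:
            dwell += dt_ms
        if dwell >= dwell_ms:
            return t
    return None
```

In the experiment, the prompt to choose would be issued at the returned time, biasing choices toward whichever option happened to be designated as the target.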

  18. Constraining eye movement in individuals with Parkinson's disease during walking turns.

    PubMed

    Ambati, V N Pradeep; Saucedo, Fabricio; Murray, Nicholas G; Powell, Douglas W; Reed-Jones, Rebecca J

    2016-10-01

Walking and turning is a movement that places individuals with Parkinson's disease (PD) at increased risk for fall-related injury. However, turning is an essential movement in activities of daily living, accounting for up to 45% of the total steps taken in a given day. Hypotheses regarding how turning is controlled suggest an essential role for anticipatory eye movements in providing feedforward information for body coordination. However, little research has investigated control of turning in individuals with PD with specific consideration for eye movements. The purpose of this study was to examine eye movement behavior and body segment coordination in individuals with PD during walking turns. Three experimental groups, a group of individuals with PD, a group of healthy young adults (YAC), and a group of healthy older adults (OAC), performed walking and turning tasks under two visual conditions: free gaze and fixed gaze. Whole-body motion capture and eye tracking characterized body segment coordination and eye movement behavior during walking trials. Statistical analysis revealed significant main effects of group (PD, YAC, and OAC) and visual condition (free and fixed gaze) on the timing of segment rotation and horizontal eye movement. Within-group comparisons revealed that the timing of eye and head movement was significantly different between the free and fixed gaze conditions for YAC (p < 0.001) and OAC (p < 0.05), but not for the PD group (p > 0.05). In addition, while intersegment timings (reflecting segment coordination) were significantly different for YAC and OAC during free gaze (p < 0.05), they were not significantly different in PD. These results suggest that individuals with PD do not make anticipatory eye and head movements ahead of turning and that this may result in altered segment coordination during turning.
As such, eye movements may be an important addition to training programs for those with PD, possibly promoting better coordination during turning and potentially reducing the risk of falls.

  19. Eye-Pursuit and Reafferent Head Movement Signals Carried by Pursuit Neurons in the Caudal Part of the Frontal Eye Fields during Head-Free Pursuit

    PubMed Central

    Kasahara, Satoshi; Akao, Teppei; Kurkin, Sergei; Peterson, Barry W.

    2009-01-01

    Eye and head movements are coordinated during head-free pursuit. To examine whether pursuit neurons in frontal eye fields (FEF) carry gaze-pursuit commands that drive both eye-pursuit and head-pursuit, monkeys whose heads were free to rotate about a vertical axis were trained to pursue a juice feeder with their head and a target with their eyes. Initially the feeder and target moved synchronously with the same visual angle. FEF neurons responding to this gaze-pursuit were tested for eye-pursuit of target motion while the feeder was stationary and for head-pursuit while the target was stationary. The majority of pursuit neurons exhibited modulation during head-pursuit, but their preferred directions during eye-pursuit and head-pursuit were different. Although peak modulation occurred during head movements, the onset of discharge usually was not aligned with the head movement onset. The minority of neurons whose discharge onset was so aligned discharged after the head movement onset. These results do not support the idea that the head-pursuit–related modulation reflects head-pursuit commands. Furthermore, modulation similar to that during head-pursuit was obtained by passive head rotation on stationary trunk. Our results suggest that FEF pursuit neurons issue gaze or eye movement commands during gaze-pursuit and that the head-pursuit–related modulation primarily reflects reafferent signals resulting from head movements. PMID:18483002

  20. Trajectory prediction of saccadic eye movements using a compressed exponential model

    PubMed Central

    Han, Peng; Saunders, Daniel R.; Woods, Russell L.; Luo, Gang

    2013-01-01

    Gaze-contingent display paradigms play an important role in vision research. The time delay due to data transmission from eye tracker to monitor may lead to a misalignment between the gaze direction and image manipulation during eye movements, and therefore compromise the contingency. We present a method to reduce this misalignment by using a compressed exponential function to model the trajectories of saccadic eye movements. Our algorithm was evaluated using experimental data from 1,212 saccades ranging from 3° to 30°, which were collected with an EyeLink 1000 and a Dual-Purkinje Image (DPI) eye tracker. The model fits eye displacement with a high agreement (R2 > 0.96). When assuming a 10-millisecond time delay, prediction of 2D saccade trajectories using our model could reduce the misalignment by 30% to 60% with the EyeLink tracker and 20% to 40% with the DPI tracker for saccades larger than 8°. Because a certain number of samples are required for model fitting, the prediction did not offer improvement for most small saccades and the early stages of large saccades. Evaluation was also performed for a simulated 100-Hz gaze-contingent display using the prerecorded saccade data. With prediction, the percentage of misalignment larger than 2° dropped from 45% to 20% for EyeLink and 42% to 26% for DPI data. These results suggest that the saccade-prediction algorithm may help create more accurate gaze-contingent displays. PMID:23902753
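A compressed-exponential fit of this kind can be sketched as follows, using one common parameterisation, x(t) = A(1 - exp(-(t/tau)**beta)); the paper's exact form and fitting procedure may differ. Amplitude A enters linearly and is solved in closed form, while tau and beta are found by grid search (a simple stand-in for a proper nonlinear least-squares routine):

```python
import numpy as np

def fit_compressed_exp(t, x, taus, betas):
    """Least-squares fit of x(t) = A * (1 - exp(-(t/tau)**beta)) to a
    saccade displacement trace. Returns (A, tau, beta). `taus` and
    `betas` are candidate grids; A is solved analytically for each
    (tau, beta) pair."""
    best = None
    for tau in taus:
        for beta in betas:
            f = 1.0 - np.exp(-(t / tau) ** beta)
            A = (x @ f) / (f @ f)          # closed-form linear coefficient
            err = np.sum((x - A * f) ** 2)
            if best is None or err < best[0]:
                best = (err, A, tau, beta)
    return best[1:]
```

Given the first few samples of an ongoing saccade, the fitted curve can then be extrapolated ahead by the display latency to predict where gaze will be when the frame is actually shown.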

  1. Figure-ground activity in V1 and guidance of saccadic eye movements.

    PubMed

    Supèr, Hans

    2006-01-01

    Every day we shift our gaze about 150,000 times, mostly without noticing it. The directions of these gaze shifts are not random but are guided by sensory information and internal factors. After each movement the eyes hold still for a brief moment so that visual information at the center of gaze can be processed in detail. This means that visual information at the saccade target location is sufficient to accurately guide the gaze shift, yet is not sufficiently processed to be fully perceived. In this paper I discuss the possible role of activity in the primary visual cortex (V1), in particular figure-ground activity, in oculomotor behavior. Figure-ground activity occurs during the late response period of V1 neurons and correlates with perception. The strength of figure-ground responses predicts the direction and moment of saccadic eye movements. The superior colliculus, a gaze control center that integrates visual and motor signals, receives direct anatomical connections from V1. These projections may convey the perceptual information required for appropriate gaze shifts. In conclusion, figure-ground activity in V1 may act as an intermediate component linking visual and motor signals.

  2. Gaze‐evoked nystagmus induced by alcohol intoxication

    PubMed Central

    Tarnutzer, Alexander A.; Straumann, Dominik; Ramat, Stefano; Bertolini, Giovanni

    2017-01-01

    Key points: The cerebellum is the core structure controlling gaze stability. Chronic cerebellar diseases and acute alcohol intoxication affect cerebellar function, inducing, among other deficits, gaze instability in the form of gaze-evoked nystagmus. Gaze-evoked nystagmus is characterized by increased centripetal eye-drift. It is used as an important diagnostic sign in patients with cerebellar degeneration and to assess the 'driving while intoxicated' condition. We quantified the effect of alcohol on gaze-holding using an approach that allows, for the first time, comparison of the deficits induced by alcohol intoxication with those of cerebellar degeneration. Our results showed that alcohol intoxication induces a two-fold increase in centripetal eye-drift. We establish analysis techniques for using controlled alcohol intake as a model to support the study of cerebellar deficits. The observed similarity between the effect of alcohol and the clinical signs observed in cerebellar patients suggests a possible pathomechanism for gaze-holding deficits. Abstract: Gaze-evoked nystagmus (GEN) is an ocular-motor finding commonly observed in cerebellar disease, characterized by increased centripetal eye-drift with centrifugal correcting saccades at eccentric gaze. With cerebellar degeneration being a rare and clinically heterogeneous disease, data from patients are limited. We hypothesized that a transient inhibition of cerebellar function by defined amounts of alcohol may provide a suitable model to study gaze-holding deficits in cerebellar disease. We recorded gaze-holding at varying horizontal eye positions in 15 healthy participants before and 30 min after the alcohol intake required to reach 0.6‰ blood alcohol content (BAC). Changes in ocular-motor behaviour were quantified by measuring eye-drift velocity as a continuous function of gaze eccentricity over a large range (±40 deg) of horizontal gaze angles and characterized using a two-parameter tangent model. The effect of alcohol on gaze stability was assessed by analysing: (1) overall effects on the gaze-holding system, (2) specific effects on each eye and (3) differences between gaze angles in the temporal and nasal hemifields. For all subjects, alcohol consumption induced gaze instability, causing a two-fold increase [2.21 (0.55), median (median absolute deviation); P = 0.002] in eye-drift velocity at all eccentricities. Results were confirmed by analysing each eye and hemifield independently. The alcohol-induced transient global deficit in gaze-holding matched the pattern previously described in patients with late-onset cerebellar degeneration. Controlled intake of alcohol seems a suitable disease model for studying cerebellar GEN. With alcohol resulting in global cerebellar hypofunction, we hypothesize that patients matching the gaze-holding behaviour observed here suffered from diffuse deficits in the gaze-holding system as well. PMID:27981586
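    The abstract names a two-parameter tangent model for eye-drift velocity as a function of gaze eccentricity but does not give its parameterization, so the form below is an illustrative assumption: v(θ) = gain · tan(θ / curvature). Under this form, doubling the gain reproduces a uniform two-fold increase in drift velocity at every eccentricity, as reported for the alcohol condition.

```python
import math

def drift_velocity(ecc_deg, gain, curvature_deg):
    # Hypothetical two-parameter tangent model: centripetal eye-drift
    # velocity (deg/s) as a function of gaze eccentricity (deg).
    # `gain` scales the drift; `curvature_deg` sets how sharply it grows
    # toward eccentric gaze (the tan argument must stay below pi/2).
    return gain * math.tan(ecc_deg / curvature_deg)

# "Alcohol" condition modeled as a doubled gain: the ratio to baseline
# is then exactly 2 at every eccentricity tested (5-40 deg).
baseline = [drift_velocity(e, gain=0.5, curvature_deg=30.0) for e in range(5, 45, 5)]
alcohol = [drift_velocity(e, gain=1.0, curvature_deg=30.0) for e in range(5, 45, 5)]
```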

  3. The effects of social pressure and emotional expression on the cone of gaze in patients with social anxiety disorder.

    PubMed

    Harbort, Johannes; Spiegel, Julia; Witthöft, Michael; Hecht, Heiko

    2017-06-01

    Patients with social anxiety disorder suffer from pronounced fears in social situations. As gaze perception is crucial in these situations, we examined which factors influence the range of gaze directions within which mutual gaze is experienced (the cone of gaze). The social stimulus was modified by changing the number of people (heads) present and the emotional expression of their faces. Participants completed a psychophysical task in which they had to adjust the eyes of a virtual head to gaze at the edge of the range where mutual eye contact was experienced. The number of heads affected the width of the gaze cone: the more heads, the wider the gaze cone. The emotional expression of the virtual head had no consistent effect on the width of the gaze cone; it did, however, affect the emotional state of the participants. Angry expressions produced the highest arousal values. The highest valence emerged from happy faces, the lowest from angry faces. These results suggest that the widening of the gaze cone in social anxiety disorder is not primarily mediated by altered emotional reactivity. Implications for gaze assessment and gaze training in therapeutic contexts are discussed. Due to interindividual variability, enlarged gaze cones are not necessarily indicative of social anxiety disorder; they merely constitute a correlate at the group level. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. An Open Conversation on Using Eye-Gaze Methods in Studies of Neurodevelopmental Disorders

    ERIC Educational Resources Information Center

    Venker, Courtney E.; Kover, Sara T.

    2015-01-01

    Purpose: Eye-gaze methods have the potential to advance the study of neurodevelopmental disorders. Despite their increasing use, challenges arise in using these methods with individuals with neurodevelopmental disorders and in reporting sufficient methodological detail such that the resulting research is replicable and interpretable. Method: This…

  5. Evolution of Biological Image Stabilization.

    PubMed

    Hardcastle, Ben J; Krapp, Holger G

    2016-10-24

    The use of vision to coordinate behavior requires an efficient control design that stabilizes the world on the retina or directs the gaze towards salient features in the surroundings. With a level gaze, visual processing tasks are simplified and behaviorally relevant features from the visual environment can be extracted. No matter how simple or sophisticated the eye design, mechanisms have evolved across phyla to stabilize gaze. In this review, we describe functional similarities in eyes and gaze stabilization reflexes, emphasizing their fundamental role in transforming sensory information into motor commands that support postural and locomotor control. We then focus on gaze stabilization design in flying insects and detail some of the underlying principles. Systems analysis reveals that gaze stabilization often involves several sensory modalities, including vision itself, and makes use of feedback as well as feedforward signals. Independent of phylogenetic distance, the physical interaction between an animal and its natural environment - its available senses and how it moves - appears to shape the adaptation of all aspects of gaze stabilization. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Eye-Tracking Technology and the Dynamics of Natural Gaze Behavior in Sports: A Systematic Review of 40 Years of Research.

    PubMed

    Kredel, Ralf; Vater, Christian; Klostermann, André; Hossner, Ernst-Joachim

    2017-01-01

    Reviewing 60 studies on natural gaze behavior in sports makes clear that the use of eye-tracking devices has increased considerably over the last 40 years. Specifically, this review reveals the large variance of methods applied, analyses performed, and measures derived within the field. The results of sub-sample analyses suggest that sports-related eye-tracking research strives, on the one hand, for ecologically valid test settings (i.e., viewing conditions and response modes) and, on the other, for experimental control along with high measurement accuracy (i.e., controlled test conditions with high-frequency eye-trackers linked to algorithmic analyses). To meet both demands, some promising methodological compromises have been proposed, in particular the integration of robust mobile eye-trackers into motion-capture systems. However, as the fundamental trade-off between laboratory and field research cannot be solved by technological means, researchers need to weigh the arguments for one or the other approach carefully, accounting for the respective consequences. Nevertheless, for future research on dynamic gaze behavior in sports, further development of current mobile eye-tracking methodology seems highly advisable, both to allow the acquisition and algorithmic analysis of larger amounts of gaze data and to increase the explanatory power of the derived results.

  7. Seductive Eyes: Attractiveness and Direct Gaze Increase Desire for Associated Objects

    ERIC Educational Resources Information Center

    Strick, Madelijn; Holland, Rob W.; van Knippenberg, Ad

    2008-01-01

    Recent research in neuroscience shows that observing attractive faces with direct gaze is more rewarding than observing attractive faces with averted gaze. On the basis of this research, it was hypothesized that object evaluations can be enhanced by associating them with attractive faces displaying direct gaze. In a conditioning paradigm, novel…

  8. Eye position modulates the electromyographic responses of neck muscles to electrical stimulation of the superior colliculus in the alert cat.

    PubMed

    Hadjidimitrakis, K; Moschovakis, A K; Dalezios, Y; Grantyn, A

    2007-05-01

    Rapid gaze shifts are often accomplished with coordinated movements of the eyes and head, the relative amplitudes of which depend on the starting position of the eyes. The size of gaze shifts is determined by the superior colliculus (SC), but additional processing in the lower brain stem is needed to determine the relative contributions of the eye and head components. Models of eye-head coordination often assume that the strength of the command sent to the head controllers is modified by a signal indicative of eye position. Evidence in favor of this hypothesis was recently obtained in a study of phasic electromyographic (EMG) responses to stimulation of the SC in head-restrained monkeys (Corneil et al. in J Neurophysiol 88:2000-2018, 2002b). Because the patterns of eye-head coordination are not the same in all species, and because the eye position sensitivity of phasic EMG responses has not been systematically investigated in cats, in the present study we used cats to address this issue. We electrically stimulated the intermediate and deep layers of the caudal SC in alert cats and recorded the EMG responses of neck muscles with horizontal and vertical pulling directions. Our data demonstrate that phasic, short-latency EMG responses can be modulated by eye position such that they increase as the eye occupies more and more eccentric positions in the pulling direction of the muscle tested. However, the influence of eye position is rather modest, typically accounting for only 10-50% of the variance in EMG response amplitude. Responses evoked from several SC sites were not modulated by eye position.

  9. Time in the eye of the beholder: Gaze position reveals spatial-temporal associations during encoding and memory retrieval of future and past.

    PubMed

    Martarelli, Corinna S; Mast, Fred W; Hartmann, Matthias

    2017-01-01

    Time is grounded in various ways, and previous studies point to a "mental time line" with the past associated with the left side and the future with the right. In this study, we investigated whether spontaneous eye movements on a blank screen would follow a mental time line during encoding, free recall, and recognition of past and future items. In all three stages of processing, gaze position was more rightward for future items than for past items. Moreover, horizontal gaze position during encoding predicted horizontal gaze position during free recall and recognition. We conclude that the mental time line and the stored gaze position during encoding assist memory retrieval of past versus future items. Our findings highlight the spatial nature of temporal representations.

  10. A closer look at the size of the gaze-liking effect: a preregistered replication.

    PubMed

    Tipples, Jason; Pecchinenda, Anna

    2018-04-30

    This study is a direct replication of the gaze-liking effect using the same design, stimuli and procedure. The gaze-liking effect describes the tendency for people to rate objects as more likeable when they have recently seen a person repeatedly gaze toward, rather than away from, the object. However, as subsequent studies show considerable variability in the size of this effect, we sampled a larger number of participants (N = 98) than the original study (N = 24) to gain a more precise estimate of the gaze-liking effect size. Our results indicate a much smaller standardised effect size (dz = 0.02) than that of the original study (dz = 0.94). Our smaller effect size was not due to general insensitivity to eye-gaze effects, because the same sample showed a clear (dz = 1.09) gaze-cuing effect: faster reaction times when eyes looked toward vs away from target objects. We discuss the implications of our findings for future studies wishing to study the gaze-liking effect.

  11. Orienting in Response to Gaze and the Social Use of Gaze among Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Rombough, Adrienne; Iarocci, Grace

    2013-01-01

    Potential relations between gaze cueing, social use of gaze, and ability to follow line of sight were examined in children with autism and typically developing peers. Children with autism (mean age = 10 years) demonstrated intact gaze cueing. However, they preferred to follow arrows instead of eyes to infer mental state, and showed decreased…

  12. Automatic Mechanisms for Social Attention Are Culturally Penetrable.

    PubMed

    Cohen, Adam S; Sasaki, Joni Y; German, Tamsin C; Kim, Heejung S

    2017-01-01

    Are mechanisms for social attention influenced by culture? Evidence that social attention is triggered automatically by bottom-up gaze cues and is uninfluenced by top-down verbal instructions may suggest it operates in the same way everywhere. Yet considerations from evolutionary and cultural psychology suggest that specific aspects of one's cultural background may have consequence for the way mechanisms for social attention develop and operate. In more interdependent cultures, the scope of social attention may be broader, focusing on more individuals and relations between those individuals. We administered a multi-gaze cueing task requiring participants to fixate a foreground face flanked by background faces and measured shifts in attention using eye tracking. For European Americans, gaze cueing did not depend on the direction of background gaze cues, suggesting foreground gaze alone drives automatic attention shifting; for East Asians, cueing patterns differed depending on whether the foreground cue matched or mismatched background cues, suggesting foreground and background gaze information were integrated. These results demonstrate that cultural background influences the social attention system by shifting it into a narrow or broad mode of operation and, importantly, provides evidence challenging the assumption that mechanisms underlying automatic social attention are necessarily rigid and impenetrable to culture. Copyright © 2015 Cognitive Science Society, Inc.

  13. Visual Foraging With Fingers and Eye Gaze

    PubMed Central

    Thornton, Ian M.; Smith, Irene J.; Chetverikov, Andrey; Kristjánsson, Árni

    2016-01-01

    A popular model of the function of selective visual attention involves search where a single target is to be found among distractors. For many scenarios, a more realistic model involves search for multiple targets of various types, since natural tasks typically do not involve a single target. Here we present results from a novel multiple-target foraging paradigm. We compare finger foraging where observers cancel a set of predesignated targets by tapping them, to gaze foraging where observers cancel items by fixating them for 100 ms. During finger foraging, for most observers, there was a large difference between foraging based on a single feature, where observers switch easily between target types, and foraging based on a conjunction of features where observers tended to stick to one target type. The pattern was notably different during gaze foraging where these condition differences were smaller. Two conclusions follow: (a) The fact that a sizeable number of observers (in particular during gaze foraging) had little trouble switching between different target types raises challenges for many prominent theoretical accounts of visual attention and working memory. (b) While caveats must be noted for the comparison of gaze and finger foraging, the results suggest that selection mechanisms for gaze and pointing have different operational constraints. PMID:27433323

  14. Surgical planning and innervation in pontine gaze palsy with ipsilateral esotropia.

    PubMed

    Somer, Deniz; Cinar, Fatma Gul; Kaderli, Ahmet; Ornek, Firdevs

    2016-10-01

    To discuss surgical intervention strategies among patients with horizontal gaze palsy and concurrent esotropia. Five consecutive patients with dorsal pontine lesions are presented. Each patient had horizontal gaze palsy with symptomatic diplopia as a consequence of esotropia in primary gaze and an anomalous head turn to attain single binocular vision. Clinical findings in the first 2 patients led us to presume there was complete loss of rectus muscle function from rectus muscle palsy. Based on this assumption, medial rectus recessions with simultaneous partial vertical muscle transposition (VRT) on the eye ipsilateral to the gaze palsy and recession-resection surgery on the contralateral eye were performed, resulting in significant motility limitation. Sequential recession-resection surgery without simultaneous VRT in the 3rd patient created an unexpected motility improvement toward the side of the gaze palsy, an observation differentiating rectus muscle palsy from paresis. The recession-combined-with-VRT approach in the esotropic eye was abandoned for subsequent patients. Simultaneous recession-resection surgery without VRT in the next 2 patients resulted in alleviation of head postures, resolution of esotropia, and substantial motility improvements toward the ipsilateral hemifield of the gaze palsy, without limitations in adduction or vertical deviations. Ocular misalignment and abnormal head posture resulting from conjugate gaze palsy can be successfully treated by basic recession-resection surgery, with the advantage of increasing versions toward the ipsilateral side of the gaze palsy. Improved motility after surgery presumably represents paresis, not "paralysis," with residual innervation in the rectus muscles. Copyright © 2016 American Association for Pediatric Ophthalmology and Strabismus. Published by Elsevier Inc. All rights reserved.

  15. Eye size and visual acuity influence vestibular anatomy in mammals.

    PubMed

    Kemp, Addison D; Christopher Kirk, E

    2014-04-01

    The semicircular canals of the inner ear detect head rotations and trigger compensatory movements that stabilize gaze and help maintain visual fixation. Mammals with large eyes and high visual acuity require precise gaze stabilization mechanisms because they experience diminished visual functionality at low thresholds of uncompensated motion. Because semicircular canal radius of curvature is a primary determinant of canal sensitivity, species with large canal radii are expected to be capable of more precise gaze stabilization than species with small canal radii. Here, we examine the relationship between mean semicircular canal radius of curvature, eye size, and visual acuity in a large sample of mammals. Our results demonstrate that eye size and visual acuity both explain a significant proportion of the variance in mean canal radius of curvature after statistically controlling for the effects of body mass and phylogeny. These findings suggest that variation in mean semicircular canal radius of curvature among mammals is partly the result of selection for improved gaze stabilization in species with large eyes and acute vision. Our results also provide a possible functional explanation for the small semicircular canal radii of fossorial mammals and plesiadapiforms. Copyright © 2014 Wiley Periodicals, Inc.

  16. An eye tracking system for monitoring face scanning patterns reveals the enhancing effect of oxytocin on eye contact in common marmosets.

    PubMed

    Kotani, Manato; Shimono, Kohei; Yoneyama, Toshihiro; Nakako, Tomokazu; Matsumoto, Kenji; Ogi, Yuji; Konoike, Naho; Nakamura, Katsuki; Ikeda, Kazuhito

    2017-09-01

    Eye tracking systems are used to investigate eye position and gaze patterns presumed to constitute eye contact in humans. Eye contact is a useful biomarker of social communication and is known to be deficient in patients with autism spectrum disorders (ASDs). Interestingly, the same eye tracking systems have been used to directly compare face scanning patterns in some non-human primates to those in humans. Thus, eye tracking is expected to be a useful translational technique for investigating not only social attention and visual interest, but also the effects of psychiatric drugs, such as oxytocin, a neuropeptide that regulates social behavior. In this study, we report on a newly established method for eye tracking in common marmosets, unique New World primates that, like humans, use eye contact as a means of communication. Our investigation was aimed at characterizing these primates' face scanning patterns and evaluating the effects of oxytocin on their eye contact behavior. We found that normal common marmosets spend more time viewing the eye region of a common marmoset picture than the mouth region or a scrambled picture. In the oxytocin experiment, the change in eyes/face ratio was significantly greater in the oxytocin group than in the vehicle group. Moreover, the oxytocin-induced increase in the change in eyes/face ratio was completely blocked by the oxytocin receptor antagonist L-368,899. These results indicate that eye tracking in common marmosets may be useful for evaluating drug candidates targeting psychiatric conditions, especially ASDs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Oxytocin increases eye contact during a real-time, naturalistic social interaction in males with and without autism

    PubMed Central

    Auyeung, B; Lombardo, M V; Heinrichs, M; Chakrabarti, B; Sule, A; Deakin, J B; Bethlehem, R A I; Dickens, L; Mooney, N; Sipple, J A N; Thiemann, P; Baron-Cohen, S

    2015-01-01

    Autism spectrum conditions (autism) affect ~1% of the population and are characterized by deficits in social communication. Oxytocin has been widely reported to affect social-communicative function and its neural underpinnings. Here we report the first evidence that intranasal oxytocin administration improves a core problem that individuals with autism have in using eye contact appropriately in real-world social settings. A randomized double-blind, placebo-controlled, within-subjects design is used to examine how intranasal administration of 24 IU of oxytocin affects gaze behavior for 32 adult males with autism and 34 controls in a real-time interaction with a researcher. This interactive paradigm bypasses many of the limitations encountered with conventional static or computer-based stimuli. Eye movements are recorded using eye tracking, providing an objective measurement of looking patterns. The measure is shown to be sensitive to the reduced eye contact commonly reported in autism, with the autism group spending less time looking to the eye region of the face than controls. Oxytocin administration selectively enhanced gaze to the eyes in both the autism and control groups (transformed mean eye-fixation difference per second=0.082; 95% CI:0.025–0.14, P=0.006). Within the autism group, oxytocin has the most effect on fixation duration in individuals with impaired levels of eye contact at baseline (Cohen's d=0.86). These findings demonstrate that the potential benefits of oxytocin in autism extend to a real-time interaction, providing evidence of a therapeutic effect in a key aspect of social communication. PMID:25668435

  19. A 2D eye gaze estimation system with low-resolution webcam images

    NASA Astrophysics Data System (ADS)

    Ince, Ibrahim Furkan; Kim, Jin Woo

    2011-12-01

    In this article, a low-cost system for 2D eye gaze estimation with low-resolution webcam images is presented. Two algorithms are proposed for this purpose: one for eyeball detection with a stable approximate pupil center, and one for detecting the direction of eye movements. The eyeball is detected using the deformable angular integral search by minimum intensity (DAISMI) algorithm. The deformable template-based 2D gaze estimation (DTBGE) algorithm is employed as a noise filter for producing stable movement decisions. While DTBGE employs binary images, DAISMI employs gray-scale images. Right and left eye estimates are evaluated separately. DAISMI finds the stable approximate pupil-center location by calculating the mass center of the eyeball border vertices, which is used for initial deformable template alignment. DTBGE starts with this initial alignment and updates the template alignment with the resulting eye movements and eyeball size frame by frame. The horizontal and vertical deviation of eye movements relative to eyeball size is treated as directly proportional to the deviation of cursor movements for a given screen size and resolution. The core advantage of the system is that it does not employ the real pupil center as a reference point for gaze estimation, which makes it more robust against corneal reflection. Visual angle accuracy is used for evaluating and benchmarking the system. The effectiveness of the proposed system is demonstrated and experimental results are shown.
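    Two geometric steps in this record, taking the mass center of the detected eyeball-border vertices as the approximate pupil center, and mapping eye deviation proportionally onto cursor deviation, can be sketched as follows. The function names and the simple linear mapping are illustrative assumptions, not the paper's implementation.

```python
def approximate_pupil_center(border_vertices):
    # Mass center (centroid) of the eyeball-border vertices found by the
    # border search; serves as a stable reference point for alignment.
    n = len(border_vertices)
    cx = sum(x for x, _ in border_vertices) / n
    cy = sum(y for _, y in border_vertices) / n
    return cx, cy

def gaze_to_cursor(dx, dy, eyeball_w, eyeball_h, screen_w, screen_h):
    # Treat eye deviation (dx, dy), expressed relative to eyeball size,
    # as directly proportional to cursor deviation from screen center.
    return (screen_w / 2 + dx / eyeball_w * screen_w,
            screen_h / 2 + dy / eyeball_h * screen_h)
```

    With zero deviation the cursor sits at screen center; a deviation of one full eyeball width would sweep the cursor across the full screen width.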

  20. Eye Contact Is Crucial for Referential Communication in Pet Dogs.

    PubMed

    Savalli, Carine; Resende, Briseida; Gaunet, Florence

    2016-01-01

    Dogs discriminate human direction of attention cues, such as body, gaze, head and eye orientation, in several circumstances. Eye contact particularly seems to provide information on human readiness to communicate; when there is such an ostensive cue, dogs tend to follow human communicative gestures more often. However, little is known about how such cues influence the production of communicative signals (e.g. gaze alternation and sustained gaze) in dogs. In the current study, in order to get an unreachable food, dogs needed to communicate with their owners in several conditions that differ according to the direction of owners' visual cues, namely gaze, head, eyes, and availability to make eye contact. Results provided evidence that pet dogs did not rely on details of owners' direction of visual attention. Instead, they relied on the whole combination of visual cues and especially on the owners' availability to make eye contact. Dogs increased visual communicative behaviors when they established eye contact with their owners, a different strategy compared to apes and baboons, that intensify vocalizations and gestures when human is not visually attending. The difference in strategy is possibly due to distinct status: domesticated vs wild. Results are discussed taking into account the ecological relevance of the task since pet dogs live in human environment and face similar situations on a daily basis during their lives.

  1. Visual attention on a respiratory function monitor during simulated neonatal resuscitation: an eye-tracking study.

    PubMed

    Katz, Trixie A; Weinberg, Danielle D; Fishman, Claire E; Nadkarni, Vinay; Tremoulet, Patrice; Te Pas, Arjan B; Sarcevic, Aleksandra; Foglia, Elizabeth E

    2018-06-14

    A respiratory function monitor (RFM) may improve positive pressure ventilation (PPV) technique, but many providers do not use RFM data appropriately during delivery room resuscitation. We sought to use eye-tracking technology to identify the RFM parameters that neonatal providers view most commonly during simulated PPV. Mixed methods study. Neonatal providers performed RFM-guided PPV on a neonatal manikin while wearing eye-tracking glasses to quantify visual attention on displayed RFM parameters (ie, exhaled tidal volume, flow, leak). Participants subsequently provided qualitative feedback on the eye-tracking glasses. Level 3 academic neonatal intensive care unit. Twenty neonatal resuscitation providers. Visual attention: overall gaze sample percentage; total gaze duration, visit count and average visit duration for each displayed RFM parameter. Qualitative feedback: willingness to wear eye-tracking glasses during clinical resuscitation. Twenty providers participated in this study. The mean gaze sample captured was 93% (SD 4%). The exhaled tidal volume waveform was the RFM parameter with the highest total gaze duration (median 23%, IQR 13-51%), the highest visit count (median 5.17 per 10 s, IQR 2.82-6.16) and the longest visit duration (median 0.48 s, IQR 0.38-0.81 s). All participants were willing to wear the glasses during clinical resuscitation. Wearable eye-tracking technology is feasible for identifying gaze fixation on the RFM display and is well accepted by providers. Neonatal providers look at exhaled tidal volume more than any other RFM parameter. Future applications of eye-tracking technology include use during clinical resuscitation. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
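    The three attention measures reported here, total gaze duration, visit count, and average visit duration per area of interest (AOI), can be derived from a stream of labeled gaze samples. A minimal sketch, assuming the data arrive as chronological (timestamp, AOI-label) pairs; this data format is an assumption, not the study's:

```python
def aoi_metrics(samples, aoi_of_interest):
    # samples: chronological list of (timestamp_s, aoi_label) gaze samples.
    # A "visit" is a maximal run of consecutive samples on the same AOI.
    # The final sample has no following timestamp, so it adds no duration.
    total = 0.0
    visits = 0
    prev_aoi = None
    for i in range(len(samples) - 1):
        t, aoi = samples[i]
        dt = samples[i + 1][0] - t
        if aoi == aoi_of_interest:
            total += dt
            if aoi != prev_aoi:
                visits += 1
        prev_aoi = aoi
    avg = total / visits if visits else 0.0
    return {"total_duration": total, "visit_count": visits,
            "average_visit_duration": avg}
```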

  2. Adaptive eye-gaze tracking using neural-network-based user profiles to assist people with motor disability.

    PubMed

    Sesin, Anaelis; Adjouadi, Malek; Cabrerizo, Mercedes; Ayala, Melvin; Barreto, Armando

    2008-01-01

    This study developed an adaptive real-time human-computer interface (HCI) that serves as an assistive technology tool for people with severe motor disability. The proposed HCI design uses eye gaze as the primary computer input device. Controlling the mouse cursor with raw eye coordinates results in sporadic motion of the pointer because of the saccadic nature of the eye. Even though eye movements are subtle and completely imperceptible under normal circumstances, they considerably affect the accuracy of an eye-gaze-based HCI. The proposed HCI system is novel because it adapts to each specific user's different and potentially changing jitter characteristics through the configuration and training of an artificial neural network (ANN) that is structured to minimize the mouse jitter. This task is based on feeding the ANN a user's initially recorded eye-gaze behavior through a short training session. The ANN finds the relationship between the gaze coordinates and the mouse cursor position based on the multilayer perceptron model. An embedded graphical interface is used during the training session to generate user profiles that make up these unique ANN configurations. The results with 12 subjects in test 1, which involved following a moving target, showed an average jitter reduction of 35%; the results with 9 subjects in test 2, which involved following the contour of a square object, showed an average jitter reduction of 53%. For both results, the outcomes led to trajectories that were significantly smoother and able to reach fixed or moving targets with relative ease and within a 5% error margin or deviation from desired trajectories. The positive effects of such jitter reduction are presented graphically for visual appreciation.
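    The jitter problem and its reduction can be illustrated numerically. The published system trains a per-user multilayer perceptron; the exponential filter below is only a simple stand-in to show how smoothing raw coordinates steadies the cursor:

```python
def jitter(path):
    """Mean city-block step size between successive cursor samples."""
    return sum(abs(x2 - x1) + abs(y2 - y1)
               for (x1, y1), (x2, y2) in zip(path, path[1:])) / (len(path) - 1)

def smooth_gaze(samples, alpha=0.3):
    """Exponentially smooth raw (x, y) gaze samples; a stand-in for
    the per-user ANN mapping described in the study."""
    sx, sy = samples[0]
    out = []
    for x, y in samples:
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        out.append((sx, sy))
    return out

# Saccadic noise alternating around a fixated point at (100, 100):
raw = [(100 + (5 if i % 2 else -5), 100) for i in range(20)]
print(jitter(raw), jitter(smooth_gaze(raw)))
```

    A percentage jitter reduction like the 35% and 53% figures above would be computed as the relative drop from jitter(raw) to jitter(smoothed).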

  3. Does the 'P300' speller depend on eye gaze?

    NASA Astrophysics Data System (ADS)

    Brunner, P.; Joshi, S.; Briskin, S.; Wolpaw, J. R.; Bischof, H.; Schalk, G.

    2010-10-01

    Many people affected by debilitating neuromuscular disorders such as amyotrophic lateral sclerosis, brainstem stroke or spinal cord injury are impaired in their ability to, or are even unable to, communicate. A brain-computer interface (BCI) uses brain signals, rather than muscles, to re-establish communication with the outside world. One particular BCI approach is the so-called 'P300 matrix speller' that was first described by Farwell and Donchin (1988 Electroencephalogr. Clin. Neurophysiol. 70 510-23). It has been widely assumed that this method does not depend on the ability to focus on the desired character, because it was thought that it relies primarily on the P300-evoked potential and minimally, if at all, on other EEG features such as the visual-evoked potential (VEP). This issue is highly relevant for the clinical application of this BCI method, because eye movements may be impaired or lost in the relevant user population. This study investigated the extent to which the performance in a 'P300' speller BCI depends on eye gaze. We evaluated the performance of 17 healthy subjects using a 'P300' matrix speller under two conditions. Under one condition ('letter'), the subjects focused their eye gaze on the intended letter, while under the second condition ('center'), the subjects focused their eye gaze on a fixation cross that was located in the center of the matrix. The results show that the performance of the 'P300' matrix speller in normal subjects depends in considerable measure on gaze direction. They thereby disprove a widespread assumption in BCI research, and suggest that this BCI might function more effectively for people who retain some eye-movement control. The applicability of these findings to people with severe neuromuscular disabilities (particularly in eye-movements) remains to be determined.

  4. Storyline Visualizations of Eye Tracking of Movie Viewing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balint, John T.; Arendt, Dustin L.; Blaha, Leslie M.

    Storyline visualizations offer an approach that promises to capture the spatio-temporal characteristics of individual observers and simultaneously illustrate emerging group behaviors. We develop a visual analytics approach to parsing, aligning, and clustering fixation sequences from eye tracking data. Visualization of the results captures the similarities and differences across a group of observers performing a common task. We apply our storyline approach to visualize gaze patterns of people watching dynamic movie clips. Storylines mitigate some of the shortcomings of existing spatio-temporal visualization techniques and, importantly, continue to highlight individual observer behavioral dynamics.
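    One common way to make fixation sequences comparable before clustering (not necessarily the authors' exact alignment method) is to encode each observer's scanpath as a string of AOI labels and group observers by edit distance:

```python
def edit_dist(a, b):
    """Levenshtein distance between two AOI-label sequences."""
    m = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        prev, m[0] = m[0], i
        for j, cb in enumerate(b, 1):
            prev, m[j] = m[j], min(m[j] + 1, m[j - 1] + 1, prev + (ca != cb))
    return m[-1]

def greedy_cluster(scanpaths, threshold=2):
    """Group scanpaths within `threshold` edits of a cluster's first
    member; a simplified stand-in for the paper's clustering step."""
    clusters = []
    for s in scanpaths:
        for c in clusters:
            if edit_dist(s, c[0]) <= threshold:
                c.append(s)
                break
        else:
            clusters.append([s])
    return clusters

# Hypothetical scanpaths over screen regions A-D for four observers:
print(greedy_cluster(["ABAB", "ABAA", "CCCD", "CCCC"]))
```

    Each resulting cluster would then be drawn as one converging bundle of lines in the storyline view.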

  5. Countermanding eye-head gaze shifts in humans: marching orders are delivered to the head first.

    PubMed

    Corneil, Brian D; Elsley, James K

    2005-07-01

    The countermanding task requires subjects to cancel a planned movement on appearance of a stop signal, providing insights into response generation and suppression. Here, we studied human eye-head gaze shifts in a countermanding task with targets located beyond the horizontal oculomotor range. Consistent with head-restrained saccadic countermanding studies, the proportion of gaze shifts on stop trials increased the longer the stop signal was delayed after target presentation, and gaze shift stop-signal reaction times (SSRTs: a derived statistic measuring how long it takes to cancel a movement) averaged approximately 120 ms across seven subjects. We also observed a marked proportion of trials (13% of all stop trials) during which gaze remained stable but the head moved toward the target. Such head movements were more common at intermediate stop signal delays. We never observed the converse sequence wherein gaze moved while the head remained stable. SSRTs for head movements averaged approximately 190 ms or approximately 70-75 ms longer than gaze SSRTs. Although our findings are inconsistent with a single race to threshold as proposed for controlling saccadic eye movements, movement parameters on stop trials attested to interactions consistent with a race model architecture. To explain our data, we tested two extensions to the saccadic race model. The first assumed that gaze shifts and head movements are controlled by parallel but independent races. The second model assumed that gaze shifts and head movements are controlled by a single race, preceded by terminal ballistic intervals not under inhibitory control, and that the head-movement branch is activated at a lower threshold. Although simulations of both models produced acceptable fits to the empirical data, we favor the second alternative as it is more parsimonious with recent findings in the oculomotor system. Using the second model, estimates for gaze and head ballistic intervals were approximately 25 and 90 ms, respectively, consistent with the known physiology of the final motor paths. Further, the threshold of the head movement branch was estimated to be 85% of that required to activate gaze shifts. From these results, we conclude that a commitment to a head movement is made in advance of gaze shifts and that the comparative SSRT differences result primarily from biomechanical differences inherent to eye and head motion.
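    The race-model logic described above can be sketched numerically: a movement escapes inhibition when the go process finishes before stop-signal onset plus the SSRT, so the probability of responding on a stop trial rises with stop-signal delay. The parameter values below are illustrative, not the paper's fits:

```python
import random

def p_respond(ssd, ssrt=0.12, go_mu=0.30, go_sd=0.05, n=20000, seed=1):
    """Monte Carlo estimate of P(respond) on a stop trial under an
    independent race: a Gaussian go finish time (illustrative) races
    stop-signal onset (ssd) plus stop-signal reaction time (ssrt)."""
    rng = random.Random(seed)
    wins = sum(rng.gauss(go_mu, go_sd) < ssd + ssrt for _ in range(n))
    return wins / n

for ssd in (0.05, 0.15, 0.25):            # stop-signal delays, seconds
    print(f"SSD {ssd:.2f}s -> P(respond) {p_respond(ssd):.3f}")
```

    Extending this with a lower threshold, or a longer uninhibitable ballistic interval, for the head branch would capture the observed trials in which the head moves while gaze stays stable.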

  6. Understanding How Adolescents with Autism Respond to Facial Expressions in Virtual Reality Environments

    PubMed Central

    Bekele, Esubalew; Zheng, Zhi; Swanson, Amy; Crittendon, Julie; Warren, Zachary; Sarkar, Nilanjan

    2013-01-01

    Autism Spectrum Disorders (ASD) are characterized by atypical patterns of behaviors and impairments in social communication. Among the fundamental social impairments in the ASD population are challenges in appropriately recognizing and responding to facial expressions. Traditional intervention approaches often require intensive support and well-trained therapists to address core deficits, with many with ASD having tremendous difficulty accessing such care due to lack of available trained therapists as well as intervention costs. As a result, emerging technology such as virtual reality (VR) has the potential to offer useful technology-enabled intervention systems. In this paper, an innovative VR-based facial emotional expression presentation system was developed that allows monitoring of eye gaze and physiological signals related to emotion identification to explore new efficient therapeutic paradigms. A usability study of this new system involving ten adolescents with ASD and ten typically developing adolescents as a control group was performed. The eye tracking and physiological data were analyzed to determine intragroup and intergroup variations of gaze and physiological patterns. Performance data, eye tracking indices and physiological features indicated that there were differences in the way adolescents with ASD process and recognize emotional faces compared to their typically developing peers. These results will be used in the future for an online adaptive VR-based multimodal social interaction system to improve emotion recognition abilities of individuals with ASD. PMID:23428456

  7. Understanding how adolescents with autism respond to facial expressions in virtual reality environments.

    PubMed

    Bekele, Esubalew; Zheng, Zhi; Swanson, Amy; Crittendon, Julie; Warren, Zachary; Sarkar, Nilanjan

    2013-04-01

    Autism Spectrum Disorders (ASD) are characterized by atypical patterns of behaviors and impairments in social communication. Among the fundamental social impairments in the ASD population are challenges in appropriately recognizing and responding to facial expressions. Traditional intervention approaches often require intensive support and well-trained therapists to address core deficits, with many with ASD having tremendous difficulty accessing such care due to lack of available trained therapists as well as intervention costs. As a result, emerging technology such as virtual reality (VR) has the potential to offer useful technology-enabled intervention systems. In this paper, an innovative VR-based facial emotional expression presentation system was developed that allows monitoring of eye gaze and physiological signals related to emotion identification to explore new efficient therapeutic paradigms. A usability study of this new system involving ten adolescents with ASD and ten typically developing adolescents as a control group was performed. The eye tracking and physiological data were analyzed to determine intragroup and intergroup variations of gaze and physiological patterns. Performance data, eye tracking indices and physiological features indicated that there were differences in the way adolescents with ASD process and recognize emotional faces compared to their typically developing peers. These results will be used in the future for an online adaptive VR-based multimodal social interaction system to improve emotion recognition abilities of individuals with ASD.

  8. Gaze pursuit responses in nucleus reticularis tegmenti pontis of head-unrestrained macaques.

    PubMed

    Suzuki, David A; Betelak, Kathleen F; Yee, Robert D

    2009-01-01

    Eye-head gaze pursuit-related activity was recorded in rostral portions of the nucleus reticularis tegmenti pontis (rNRTP) in alert macaques. The head was unrestrained in the horizontal plane, and macaques were trained to pursue a moving target either with their head, with the eyes stationary in the orbits, or with their eyes, with their head voluntarily held stationary in space. Head-pursuit-related modulations in rNRTP activity were observed with some cells exhibiting increases in firing rate with increases in head-pursuit frequency. For many units, this head-pursuit response appeared to saturate at higher frequencies (>0.6 Hz). The response phase relative to peak head-pursuit velocity formed a continuum, containing cells that could encode head-pursuit velocity and those encoding head-pursuit acceleration. The latter cells did not exhibit head position-related activity. Sensitivities were calculated with respect to peak head-pursuit velocity and averaged 1.8 spikes/s/deg/s. Of the cells that were tested for both head- and eye-pursuit-related activity, 86% exhibited responses to both head- and eye-pursuit and therefore carried a putative gaze-pursuit signal. For these gaze-pursuit units, the ratio of head to eye response sensitivities averaged approximately 1.4. Pursuit eccentricity seemed to affect head-pursuit response amplitude even in the absence of a head position response per se. The results indicated that rNRTP is a strong candidate for the source of an active head-pursuit signal that projects to the cerebellum, specifically to the target-velocity and gaze-velocity Purkinje cells that have been observed in vermal lobules VI and VII.

  9. Gorillas with white sclera: A naturally occurring variation in a morphological trait linked to social cognitive functions.

    PubMed

    Mayhew, Jessica A; Gómez, Juan-Carlos

    2015-08-01

    Human eye morphology is considered unique among the primates in that humans possess larger width/height ratios (WHR), expose a greater amount of visible sclera (SSI; width of exposed eyeball/width of visible iris), and critically, have a white sclera due to a lack of pigmentation. White sclera in humans amplifies gaze direction, whereas the all-dark eyes of apes are hypothesized to conceal gaze from others. This study examines WHR and SSI in humans (N = 13) and gorillas (N = 85) engaged in direct and averted gazes and introduces a qualitative assessment of sclera color to evaluate variations in sclera pigmentation. The results confirm previous findings that humans possess a larger WHR than gorillas but indicate that humans and gorillas display similar amounts of visible sclera. Additionally, 72% (N = 124) of gorilla eyes in this sample deviated from the assumed all-dark eye condition. This questions whether gaze camouflage is the primary function of darkened sclera in non-human primates or whether other functional roles can be ascribed to the sclera, light or dark. We argue that white sclera evolved to amplify direct gazes in humans, which would have played a significant role in the development of ostensive communication, which is communication that both shows something and shows the intention to show something. We conclude that the horizontal elongation of the human eye, rather than sclera color, more reliably distinguishes human from great ape eyes, represented here by gorillas. © 2015 Wiley Periodicals, Inc.

  10. Face-to-face interference in typical and atypical development

    PubMed Central

    Riby, Deborah M; Doherty-Sneddon, Gwyneth; Whittle, Lisa

    2012-01-01

    Visual communication cues facilitate interpersonal communication. It is important that we look at faces to retrieve and subsequently process such cues. It is also important that we sometimes look away from faces as they increase cognitive load that may interfere with online processing. Indeed, when typically developing individuals hold face gaze it interferes with task completion. In this novel study we quantify face interference for the first time in Williams syndrome (WS) and Autism Spectrum Disorder (ASD). These disorders of development impact on cognition and social attention, but how do faces interfere with cognitive processing? Individuals developing typically as well as those with ASD (n = 19) and WS (n = 16) were recorded during a question and answer session that involved mathematics questions. In phase 1 gaze behaviour was not manipulated, but in phase 2 participants were required to maintain eye contact with the experimenter at all times. Looking at faces decreased task accuracy for individuals who were developing typically. Critically, the same pattern was seen in WS and ASD, whereby task performance decreased when participants were required to hold face gaze. The results show that looking at faces interferes with task performance in all groups. This finding requires the caveat that individuals with WS and ASD found it harder than individuals who were developing typically to maintain eye contact throughout the interaction. Individuals with ASD struggled to hold eye contact at all points of the interaction while those with WS found it especially difficult when thinking. PMID:22356183

  11. Biases in orienting and maintenance of attention among weight dissatisfied women: an eye-movement study.

    PubMed

    Gao, Xiao; Wang, Quanchuan; Jackson, Todd; Zhao, Guang; Liang, Yi; Chen, Hong

    2011-04-01

    Despite evidence indicating fatness and thinness information are processed differently among weight-preoccupied and eating disordered individuals, the exact nature of these attentional biases is not clear. In this research, eye movement (EM) tracking assessed biases in specific component processes of visual attention (i.e., orientation, detection, maintenance and disengagement of gaze) in relation to body-related stimuli among 20 weight dissatisfied (WD) and 20 weight satisfied young women. Eye movements were recorded while participants completed a dot-probe task that featured fatness-neutral and thinness-neutral word pairs. Compared to controls, WD women were more likely to direct their initial gaze toward fatness words, had a shorter mean latency of first fixation on both fatness and thinness words, had longer first fixation on fatness words but shorter first fixation on thinness words, and shorter total gaze duration on thinness words. Reaction time data showed a maintenance bias towards fatness words among the WD women. In sum, results indicated WD women show initial orienting, speeded detection and initial maintenance biases towards fat body words in addition to a speeded detection - avoidance pattern of biases in relation to thin body words. Overall, these results highlight the utility of EM-tracking as a means of identifying subtle attentional biases among weight dissatisfied women drawn from a non-clinical setting and the need to assess attentional biases as a dynamic process. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Face in profile view reduces perceived facial expression intensity: an eye-tracking study.

    PubMed

    Guo, Kun; Shaw, Heather

    2015-02-01

    Recent studies measuring the facial expressions of emotion have focused primarily on the perception of frontal face images. As we frequently encounter expressive faces from different viewing angles, having a mechanism which allows invariant expression perception would be advantageous to our social interactions. Although a couple of studies have indicated comparable expression categorization accuracy across viewpoints, it is unknown how perceived expression intensity and associated gaze behaviour change across viewing angles. Differences could arise because diagnostic cues from local facial features for decoding expressions could vary with viewpoints. Here we manipulated orientation of faces (frontal, mid-profile, and profile view) displaying six common facial expressions of emotion, and measured participants' expression categorization accuracy, perceived expression intensity and associated gaze patterns. In comparison with frontal faces, profile faces slightly reduced identification rates for disgust and sad expressions, but significantly decreased perceived intensity for all tested expressions. Although viewpoint had a quantitative, expression-specific influence on the proportion of fixations directed at local facial features, the qualitative gaze distribution within facial features (e.g., the eyes tended to attract the highest proportion of fixations, followed by the nose and then the mouth region) was independent of viewpoint and expression type. Our results suggest that facial expression processing is viewpoint-invariant at the level of categorical perception, which could be linked to a viewpoint-invariant holistic gaze strategy for extracting expressive facial cues. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Understanding the Referential Nature of Looking: Infants' Preference for Object-Directed Gaze

    ERIC Educational Resources Information Center

    Senju, Atsushi; Csibra, Gergely; Johnson, Mark H.

    2008-01-01

    In four experiments, we investigated whether 9-month-old infants are sensitive to the relationship between gaze direction and object location and whether this sensitivity depends on the presence of communicative cues like eye contact. Infants observed a face, which repeatedly shifted its eyes either toward, or away from, unpredictably appearing…

  14. Learning to Interact with a Computer by Gaze

    ERIC Educational Resources Information Center

    Aoki, Hirotaka; Hansen, John Paulin; Itoh, Kenji

    2008-01-01

    The aim of this paper is to examine the learning processes that subjects undertake when they start using gaze as computer input. A 7-day experiment with eight Japanese students was carried out to record novice users' eye movement data during typing of 110 sentences. The experiment revealed that inefficient eye movements were dramatically reduced…

  15. Inhibition of Return in Response to Eye Gaze and Peripheral Cues in Young People with Asperger's Syndrome

    ERIC Educational Resources Information Center

    Marotta, Andrea; Pasini, Augusto; Ruggiero, Sabrina; Maccari, Lisa; Rosa, Caterina; Lupianez, Juan; Casagrande, Maria

    2013-01-01

    Inhibition of return (IOR) reflects slower reaction times to stimuli presented in previously attended locations. In this study, we examined this inhibitory after-effect using two different cue types, eye-gaze and standard peripheral cues, in individuals with Asperger's syndrome and typically developing individuals. Typically developing…

  16. Self-Monitoring of Gaze in High Functioning Autism

    ERIC Educational Resources Information Center

    Grynszpan, Ouriel; Nadel, Jacqueline; Martin, Jean-Claude; Simonin, Jerome; Bailleul, Pauline; Wang, Yun; Gepner, Daniel; Le Barillier, Florence; Constant, Jacques

    2012-01-01

    Atypical visual behaviour has been recently proposed to account for much of social misunderstanding in autism. Using an eye-tracking system and a gaze-contingent lens display, the present study explores self-monitoring of eye motion in two conditions: free visual exploration and guided exploration via blurring the visual field except for the focal…

  17. Semantic Preview Benefit in Eye Movements during Reading: A Parafoveal Fast-Priming Study

    ERIC Educational Resources Information Center

    Hohenstein, Sven; Laubrock, Jochen; Kliegl, Reinhold

    2010-01-01

    Eye movements in reading are sensitive to foveal and parafoveal word features. Whereas the influence of orthographic or phonological parafoveal information on gaze control is undisputed, there has been no reliable evidence for early parafoveal extraction of semantic information in alphabetic script. Using a novel combination of the gaze-contingent…

  18. The Development of Joint Visual Attention: A Longitudinal Study of Gaze following during Interactions with Mothers and Strangers

    ERIC Educational Resources Information Center

    Gredeback, Gustaf; Fikke, Linn; Melinder, Annika

    2010-01-01

    Two- to 8-month-old infants interacted with their mother or a stranger in a prospective longitudinal gaze following study. Gaze following, as assessed by eye tracking, emerged between 2 and 4 months and stabilized between 6 and 8 months of age. Overall, infants followed the gaze of a stranger more than they followed the gaze of their mothers,…

  19. High-Speed Noninvasive Eye-Tracking System

    NASA Technical Reports Server (NTRS)

    Talukder, Ashit; LaBaw, Clayton; Michael-Morookian, John; Monacos, Steve; Serviss, Orin

    2007-01-01

    The figure schematically depicts a system of electronic hardware and software that noninvasively tracks the direction of a person's gaze in real time. Like prior commercial noninvasive eye-tracking systems, this system is based on (1) illumination of an eye by a low-power infrared light-emitting diode (LED); (2) acquisition of video images of the pupil, iris, and cornea in the reflected infrared light; (3) digitization of the images; and (4) processing the digital image data to determine the direction of gaze from the centroids of the pupil and cornea in the images. Relative to the prior commercial systems, the present system operates at much higher speed and thereby offers enhanced capability for applications that involve human-computer interactions, including typing and computer command and control by handicapped individuals, and eye-based diagnosis of physiological disorders that affect gaze responses.
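    Step (4) is commonly implemented by mapping the pupil-centre minus corneal-glint vector to screen coordinates through a per-user calibration. A schematic sketch; the linear calibration constants are hypothetical stand-ins for values fitted during a calibration routine:

```python
def gaze_point(pupil, glint, cal):
    """Map the pupil-minus-glint centroid vector (image pixels) to a
    screen coordinate using per-axis linear gains and offsets."""
    (gx, ox), (gy, oy) = cal
    dx, dy = pupil[0] - glint[0], pupil[1] - glint[1]
    return (gx * dx + ox, gy * dy + oy)

cal = ((120.0, 640.0), (110.0, 360.0))    # hypothetical calibration fit
print(gaze_point((312.0, 240.5), (310.0, 239.0), cal))
```

    Real systems typically use a higher-order (e.g. quadratic) mapping and re-fit it per user, but the pupil-glint difference vector is the key quantity because it is robust to small head movements.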

  20. Eye Movements Affect Postural Control in Young and Older Females

    PubMed Central

    Thomas, Neil M.; Bampouras, Theodoros M.; Donovan, Tim; Dewhurst, Susan

    2016-01-01

    Visual information is used for postural stabilization in humans. However, little is known about how eye movements prevalent in everyday life interact with the postural control system in older individuals. Therefore, the present study assessed the effects of stationary gaze fixations, smooth pursuits, and saccadic eye movements, with combinations of absent, fixed and oscillating large-field visual backgrounds to generate different forms of retinal flow, on postural control in healthy young and older females. Participants were presented with computer generated visual stimuli, whilst postural sway and gaze fixations were simultaneously assessed with a force platform and eye tracking equipment, respectively. The results showed that fixed backgrounds and stationary gaze fixations attenuated postural sway. In contrast, oscillating backgrounds and smooth pursuits increased postural sway. There were no differences regarding saccades. There were also no differences in postural sway or gaze errors between age groups in any visual condition. The stabilizing effect of the fixed visual stimuli show how retinal flow and extraocular factors guide postural adjustments. The destabilizing effect of oscillating visual backgrounds and smooth pursuits may be related to more challenging conditions for determining body shifts from retinal flow, and more complex extraocular signals, respectively. Because the older participants matched the young group's performance in all conditions, decreases of posture and gaze control during stance may not be a direct consequence of healthy aging. Further research examining extraocular and retinal mechanisms of balance control and the effects of eye movements, during locomotion, is needed to better inform fall prevention interventions. PMID:27695412

  1. Eye Movements Affect Postural Control in Young and Older Females.

    PubMed

    Thomas, Neil M; Bampouras, Theodoros M; Donovan, Tim; Dewhurst, Susan

    2016-01-01

    Visual information is used for postural stabilization in humans. However, little is known about how eye movements prevalent in everyday life interact with the postural control system in older individuals. Therefore, the present study assessed the effects of stationary gaze fixations, smooth pursuits, and saccadic eye movements, with combinations of absent, fixed and oscillating large-field visual backgrounds to generate different forms of retinal flow, on postural control in healthy young and older females. Participants were presented with computer generated visual stimuli, whilst postural sway and gaze fixations were simultaneously assessed with a force platform and eye tracking equipment, respectively. The results showed that fixed backgrounds and stationary gaze fixations attenuated postural sway. In contrast, oscillating backgrounds and smooth pursuits increased postural sway. There were no differences regarding saccades. There were also no differences in postural sway or gaze errors between age groups in any visual condition. The stabilizing effect of the fixed visual stimuli show how retinal flow and extraocular factors guide postural adjustments. The destabilizing effect of oscillating visual backgrounds and smooth pursuits may be related to more challenging conditions for determining body shifts from retinal flow, and more complex extraocular signals, respectively. Because the older participants matched the young group's performance in all conditions, decreases of posture and gaze control during stance may not be a direct consequence of healthy aging. Further research examining extraocular and retinal mechanisms of balance control and the effects of eye movements, during locomotion, is needed to better inform fall prevention interventions.
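    Postural sway recorded with a force platform is typically summarized from the centre-of-pressure (COP) trace. Two standard summary measures are sketched below; the study's exact outcome variables may differ:

```python
def sway_metrics(cop):
    """Path length and RMS displacement of (x, y) centre-of-pressure
    samples; larger values indicate greater postural sway."""
    mx = sum(x for x, _ in cop) / len(cop)
    my = sum(y for _, y in cop) / len(cop)
    path = sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
               for (x1, y1), (x2, y2) in zip(cop, cop[1:]))
    rms = (sum((x - mx) ** 2 + (y - my) ** 2 for x, y in cop) / len(cop)) ** 0.5
    return path, rms

print(sway_metrics([(0.0, 0.0), (3.0, 4.0), (0.0, 0.0)]))  # toy COP trace, cm
```

    Comparing these metrics across the visual conditions (fixed vs oscillating background, fixation vs smooth pursuit) is what quantifies the attenuation and amplification of sway reported above.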

  2. Eye Gaze Correlates of Motor Impairment in VR Observation of Motor Actions.

    PubMed

    Alves, J; Vourvopoulos, A; Bernardino, A; Bermúdez I Badia, S

    2016-01-01

    This article is part of the Focus Theme of Methods of Information in Medicine on "Methodologies, Models and Algorithms for Patients Rehabilitation". The objective was to identify eye gaze correlates of motor impairment in a virtual reality motor observation task in a study with healthy participants and stroke patients. Participants consisted of a group of healthy subjects (N = 20) and a group of stroke survivors (N = 10). Both groups were required to observe a simple reach-and-grab and place-and-release task in a virtual environment. Additionally, healthy subjects were required to observe the task in a normal condition and a constrained movement condition. Eye movements were recorded during the observation task for later analysis. For healthy participants, results showed differences in gaze metrics when comparing the normal and arm-constrained conditions. Differences in gaze metrics were also found when comparing dominant and non-dominant arm for saccades and smooth pursuit events. For stroke patients, results showed longer smooth pursuit segments in action observation when observing the paretic arm, thus providing evidence that the affected circuitry may be activated for eye gaze control during observation of the simulated motor action. This study suggests that neural motor circuits are involved, at multiple levels, in observation of motor actions displayed in a virtual reality environment. Thus, eye tracking combined with action observation tasks in a virtual reality display can be used to monitor motor deficits derived from stroke, and consequently can also be used for rehabilitation of stroke patients.

  3. How children with specific language impairment view social situations: an eye tracking study.

    PubMed

    Hosozawa, Mariko; Tanaka, Kyoko; Shimizu, Toshiaki; Nakano, Tamami; Kitazawa, Shigeru

    2012-06-01

    Children with specific language impairment (SLI) face risks for social difficulties. However, the nature and developmental course of these difficulties remain unclear. Gaze behaviors have been studied by using eye tracking among those with autism spectrum disorders (ASDs). Using this method, we compared the gaze behaviors of children with SLI with those of individuals with ASD and typically developing (TD) children to explore the social perception of children with SLI. The eye gazes of 66 children (16 with SLI, 25 with ASD, and 25 TD) were studied while viewing videos of social interactions. Gaze behaviors were summarized with multidimensional scaling, and participants with similar gaze behaviors were represented proximally in a 2-dimensional plane. The SLI and TD groups each formed a cluster near the center of the multidimensional scaling plane, whereas the ASD group was distributed around the periphery. Frame-by-frame analyses showed that children with SLI and TD children viewed faces in a manner consistent with the story line, but children with ASD devoted less attention to faces and social interactions. During speech scenes, children with SLI were significantly more fixated on the mouth, whereas TD children viewed the eyes and the mouth. Children with SLI viewed social situations in ways similar to those of TD children but different from those of children with ASD. However, children with SLI concentrated on the speaker's mouth, possibly to compensate for audiovisual processing deficits. Because eyes carry important information, this difference may influence the social development of children with SLI.
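    The multidimensional-scaling summary places participants with similar gaze behaviour near each other, which reduces to a pairwise distance over per-participant gaze vectors. The proportions below are hypothetical, chosen only to echo the qualitative pattern reported (SLI near TD but mouth-weighted; ASD far from both):

```python
def euclid(u, v):
    """Euclidean distance between two gaze-proportion vectors."""
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

# Hypothetical proportions of viewing time on (eyes, mouth, non-face):
gaze = {
    "TD":  (0.45, 0.35, 0.20),
    "SLI": (0.25, 0.55, 0.20),   # mouth-weighted during speech scenes
    "ASD": (0.15, 0.15, 0.70),   # less attention to faces
}
print(euclid(gaze["SLI"], gaze["TD"]), euclid(gaze["SLI"], gaze["ASD"]))
```

    MDS then embeds such a distance matrix in two dimensions so that small distances appear as proximity on the plane, producing the central SLI/TD clusters and peripheral ASD points described above.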

  4. Spatial transformations between superior colliculus visual and motor response fields during head-unrestrained gaze shifts.

    PubMed

    Sadeh, Morteza; Sajad, Amirsaman; Wang, Hongying; Yan, Xiaogang; Crawford, John Douglas

    2015-12-01

    We previously reported that visuomotor activity in the superior colliculus (SC)--a key midbrain structure for the generation of rapid eye movements--preferentially encodes target position relative to the eye (Te) during low-latency head-unrestrained gaze shifts (DeSouza et al., 2011). Here, we trained two monkeys to perform head-unrestrained gaze shifts after a variable post-stimulus delay (400-700 ms), to test whether temporally separated SC visual and motor responses show different spatial codes. Target positions, final gaze positions and various frames of reference (eye, head, and space) were dissociated through natural (untrained) trial-to-trial variations in behaviour. 3D eye and head orientations were recorded, and 2D response field data were fitted against multiple models by use of a statistical method reported previously (Keith et al., 2009). Of 60 neurons, 17 showed a visual response, 12 showed a motor response, and 31 showed both visual and motor responses. The combined visual response field population (n = 48) showed a significant preference for Te, which was also preferred in each visual subpopulation. In contrast, the motor response field population (n = 43) showed a preference for final (relative to initial) gaze position models, and the Te model was statistically eliminated in the motor-only population. There was also a significant shift of coding from the visual to motor response within visuomotor neurons. These data confirm that SC response fields are gaze-centred, and show a target-to-gaze transformation between visual and motor responses. Thus, visuomotor transformations can occur between, and even within, neurons within a single frame of reference and brain structure. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  5. Spatial coding of eye movements relative to perceived earth and head orientations during static roll tilt

    NASA Technical Reports Server (NTRS)

    Wood, S. J.; Paloski, W. H.; Reschke, M. F.

    1998-01-01

    The purpose of this study was to examine the spatial coding of eye movements during static roll tilt (up to +/-45 degrees) relative to perceived earth and head orientations. Binocular videographic recordings obtained in darkness from eight subjects allowed us to quantify the mean deviations in gaze trajectories along both horizontal and vertical coordinates relative to the true earth and head orientations. We found that both variability and curvature of gaze trajectories increased with roll tilt. The trajectories of eye movements made along the perceived earth-horizontal (PEH) were more accurate than movements along the perceived head-horizontal (PHH). The trajectories of both PEH and PHH saccades tended to deviate in the same direction as the head tilt. The deviations in gaze trajectories along the perceived earth-vertical (PEV) and perceived head-vertical (PHV) were both similar to the PHH orientation, except that saccades along the PEV deviated in the opposite direction relative to the head tilt. The magnitude of deviations along the PEV, PHH, and PHV corresponded to perceptual overestimations of roll tilt obtained from verbal reports. Both PEV gaze trajectories and perceptual estimates of tilt orientation were different following clockwise rather than counterclockwise tilt rotation; however, the PEH gaze trajectories were less affected by the direction of tilt rotation. Our results suggest that errors in gaze trajectories along PEV and perceived head orientations increase during roll tilt in a similar way to perceptual errors of tilt orientation. Although PEH and PEV gaze trajectories became nonorthogonal during roll tilt, we conclude that the spatial coding of eye movements during roll tilt is overall more accurate for the perceived earth reference frame than for the perceived head reference frame.
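
    The frame comparisons above reduce to planar rotations: the sketch below (not the authors' analysis code; coordinates and angles are made up) shows how a saccade that is horizontal in head coordinates deviates from earth-horizontal by exactly the roll angle.

```python
import math

def rotate(point, angle_deg):
    """Rotate a 2D point counterclockwise by angle_deg degrees."""
    a = math.radians(angle_deg)
    x, y = point
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a))

def trajectory_deviation(start, end):
    """Angle (degrees) of a saccade vector from the horizontal axis."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    return math.degrees(math.atan2(dy, dx))

# A saccade that is horizontal in head coordinates...
head_start, head_end = (0.0, 0.0), (10.0, 0.0)

# ...viewed in earth coordinates when the head is rolled 45 degrees:
roll = 45.0
earth_start = rotate(head_start, roll)
earth_end = rotate(head_end, roll)

# Its deviation from earth-horizontal equals the roll angle.
deviation = trajectory_deviation(earth_start, earth_end)
```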

  6. PyGaze: an open-source, cross-platform toolbox for minimal-effort programming of eyetracking experiments.

    PubMed

    Dalmaijer, Edwin S; Mathôt, Sebastiaan; Van der Stigchel, Stefan

    2014-12-01

    The PyGaze toolbox is an open-source software package for Python, a high-level programming language. It is designed for creating eyetracking experiments in Python syntax with the least possible effort, and it offers programming ease and script readability without constraining functionality and flexibility. PyGaze can be used for visual and auditory stimulus presentation; for response collection via keyboard, mouse, joystick, and other external hardware; and for the online detection of eye movements using a custom algorithm. A wide range of eyetrackers of different brands (EyeLink, SMI, and Tobii systems) are supported. The novelty of PyGaze lies in providing an easy-to-use layer on top of the many different software libraries that are required for implementing eyetracking experiments. Essentially, PyGaze is a software bridge for eyetracking research.
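
    The abstract does not describe PyGaze's custom online detection algorithm; purely as an illustration of the general idea (not PyGaze's actual implementation), a minimal velocity-threshold (I-VT) classifier might look like this:

```python
def classify_ivt(samples, dt, velocity_threshold):
    """Label each inter-sample interval 'saccade' or 'fixation' by
    comparing point-to-point velocity against a fixed threshold.

    samples: list of (x, y) gaze positions (e.g. in degrees)
    dt: sampling interval in seconds
    velocity_threshold: deg/s above which an interval is a saccade
    """
    labels = []
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        labels.append('saccade' if velocity > velocity_threshold
                      else 'fixation')
    return labels

# Steady gaze, a fast jump, then steady again (500 Hz, 30 deg/s threshold):
gaze = [(0.0, 0.0), (0.01, 0.0), (5.0, 0.0), (5.01, 0.0)]
events = classify_ivt(gaze, dt=0.002, velocity_threshold=30.0)
```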

  7. Gaze Behavior in One-Handed Catching and Its Relation with Interceptive Performance: What the Eyes Can't Tell

    PubMed Central

    Cesqui, Benedetta; Mezzetti, Maura; Lacquaniti, Francesco; d'Avella, Andrea

    2015-01-01

    In ball sports, it is usually acknowledged that expert athletes track the ball more accurately than novices. However, there is also evidence that keeping the eyes on the ball is not always necessary for interception. Here we aimed at gaining new insights into the extent to which ocular pursuit performance is related to catching performance. To this end, we analyzed eye and head movements of nine subjects catching a ball projected by an actuated launching apparatus. Four different ball flight durations and two different ball arrival heights were tested and the quality of ocular pursuit was characterized by means of several timing and accuracy parameters. Catching performance differed across subjects and depended on ball flight characteristics. All subjects showed a similar sequence of eye movement events and a similar modulation of the timing of these events in relation to the characteristics of the ball trajectory. On a trial-by-trial basis there was a significant relationship only between pursuit duration and catching performance, confirming that keeping the eyes on the ball longer increases catching success probability. Ocular pursuit parameter values and their dependence on flight conditions as well as the eye and head contributions to gaze shift differed across subjects. However, the observed average individual ocular behavior and the eye-head coordination patterns were not directly related to the individual catching performance. These results suggest that several oculomotor strategies may be used to gather information on ball motion, and that factors unrelated to eye movements may underlie the observed differences in interceptive performance. PMID:25793989

  8. Novel Eye Movement Disorders in Whipple's Disease-Staircase Horizontal Saccades, Gaze-Evoked Nystagmus, and Esotropia.

    PubMed

    Shaikh, Aasef G; Ghasia, Fatema F

    2017-01-01

    Whipple's disease, a rare systemic infectious disorder, is complicated by the involvement of the central nervous system in about 5% of cases. Oscillations of the eyes and the jaw, called oculo-masticatory myorhythmia, are pathognomonic of central nervous system involvement but are often absent. Typical manifestations of central nervous system Whipple's disease are cognitive impairment, parkinsonism mimicking progressive supranuclear palsy with vertical saccade slowing, and up-gaze range limitation. We describe a unique patient with central nervous system Whipple's disease who had typical features, including parkinsonism, cognitive impairment, and up-gaze limitation; but also had diplopia, esotropia with mild horizontal (abduction more than adduction) limitation, and vertigo. The patient also had gaze-evoked nystagmus and staircase horizontal saccades. The latter were thought to be due to mal-programmed small saccades followed by a series of corrective saccades. The saccades were disconjugate due to the concurrent strabismus. Also, we noted disconjugacy in the slow phase of gaze-evoked nystagmus. The disconjugacy of the slow phase of gaze-evoked nystagmus was larger during the monocular viewing condition. We propose that interaction of the strabismic drifts of the covered eyes and the nystagmus drift, putatively at the final common pathway, might lead to such disconjugacy.

  9. Limitations of gaze transfer: without visual context, eye movements do not help to coordinate joint action, whereas mouse movements do.

    PubMed

    Müller, Romy; Helmert, Jens R; Pannasch, Sebastian

    2014-10-01

    Remote cooperation can be improved by transferring the gaze of one participant to the other. However, based on a partner's gaze, an interpretation of his communicative intention can be difficult. Thus, gaze transfer has been inferior to mouse transfer in remote spatial referencing tasks where locations had to be pointed out explicitly. Given that eye movements serve as an indicator of visual attention, it remains to be investigated whether gaze and mouse transfer differentially affect the coordination of joint action when the situation demands an understanding of the partner's search strategies. In the present study, a gaze or mouse cursor was transferred from a searcher to an assistant in a hierarchical decision task. The assistant could use this cursor to guide his movement of a window which continuously opened up the display parts the searcher needed to find the right solution. In this context, we investigated how the ease of using gaze transfer depended on whether a link could be established between the partner's eye movements and the objects he was looking at. Therefore, in addition to the searcher's cursor, the assistant either saw the positions of these objects or only a grey background. When the objects were visible, performance and the number of spoken words were similar for gaze and mouse transfer. However, without them, gaze transfer resulted in longer solution times and more verbal effort as participants relied more strongly on speech to coordinate the window movement. Moreover, an analysis of the spatio-temporal coupling of the transmitted cursor and the window indicated that when no visual object information was available, assistants confidently followed the searcher's mouse but not his gaze cursor. Once again, the results highlight the importance of carefully considering task characteristics when applying gaze transfer in remote cooperation. Copyright © 2013 Elsevier B.V. All rights reserved.
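
    The spatio-temporal coupling of the transmitted cursor and the window can be illustrated by estimating the lag at which one trajectory best matches the other. This is a toy sketch with made-up data, not the authors' actual analysis:

```python
def best_lag(leader, follower, max_lag):
    """Return the lag (in samples) at which `follower` best matches
    `leader`, scored by a simple dot product of aligned samples."""
    best, best_score = 0, float('-inf')
    for lag in range(max_lag + 1):
        score = sum(a * b for a, b in zip(leader, follower[lag:]))
        if score > best_score:
            best, best_score = lag, score
    return best

# The "window" trajectory follows the "cursor" three samples later:
cursor = [0, 0, 0, 1, 2, 3, 2, 1, 0, 0, 0, 0]
window = [0, 0, 0, 0, 0, 0, 1, 2, 3, 2, 1, 0]
lag = best_lag(cursor, window, max_lag=6)
```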

  10. Abnormal social reward processing in autism as indexed by pupillary responses to happy faces

    PubMed Central

    2012-01-01

    Background Individuals with Autism Spectrum Disorders (ASD) typically show impaired eye contact during social interactions. From a young age, they look less at faces than typically developing (TD) children and tend to avoid direct gaze. However, the reason for this behavior remains controversial; ASD children might avoid eye contact because they perceive the eyes as aversive or because they do not find social engagement through mutual gaze rewarding. Methods We monitored pupillary diameter as a measure of autonomic response in children with ASD (n = 20, mean age = 12.4) and TD controls (n = 18, mean age = 13.7) while they looked at faces displaying different emotions. Each face displayed happy, fearful, angry or neutral emotions with the gaze either directed to or averted from the subjects. Results Overall, children with ASD and TD controls showed similar pupillary responses; however, they differed significantly in their sensitivity to gaze direction for happy faces. Specifically, pupillary diameter increased among TD children when viewing happy faces with direct gaze as compared to those with averted gaze, whereas children with ASD did not show such sensitivity to gaze direction. We found no group differences in fixation that could explain the differential pupillary responses. There was no effect of gaze direction on pupil diameter for negative affect or neutral faces among either the TD or ASD group. Conclusions We interpret the increased pupillary diameter to happy faces with direct gaze in TD children to reflect the intrinsic reward value of a smiling face looking directly at an individual. The lack of this effect in children with ASD is consistent with the hypothesis that individuals with ASD may have reduced sensitivity to the reward value of social stimuli. PMID:22958650

  11. Are fixations in static natural scenes a useful predictor of attention in the real world?

    PubMed

    Foulsham, Tom; Kingstone, Alan

    2017-06-01

    Research investigating scene perception normally involves laboratory experiments using static images. Much has been learned about how observers look at pictures of the real world and the attentional mechanisms underlying this behaviour. However, the use of static, isolated pictures as a proxy for studying everyday attention in real environments has led to the criticism that such experiments are artificial. We report a new study that tests the extent to which the real world can be reduced to simpler laboratory stimuli. We recorded the gaze of participants walking on a university campus with a mobile eye tracker, and then showed static frames from this walk to new participants, in either a random or sequential order. The aim was to compare the gaze of participants walking in the real environment with fixations on pictures of the same scene. The data show that picture order affects interobserver fixation consistency and changes looking patterns. Critically, while fixations on the static images overlapped significantly with the actual real-world eye movements, they did so no more than a model that assumed a general bias to the centre. Remarkably, a model that simply takes into account where the eyes are normally positioned in the head, independent of what is actually in the scene, does far better than any other model. These data reveal that viewing patterns to static scenes are a relatively poor proxy for predicting real world eye movement behaviour, while raising intriguing possibilities for how to best measure attention in everyday life. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
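
    The centre-bias baseline mentioned above can be sketched as scoring fixations under a Gaussian centred on the display. The screen size, sigma, and fixation coordinates below are hypothetical, not the study's parameters:

```python
import math

def central_bias_score(fixations, width, height, sigma):
    """Mean likelihood of fixations under a Gaussian centred on screen."""
    cx, cy = width / 2.0, height / 2.0
    scores = [math.exp(-((x - cx) ** 2 + (y - cy) ** 2)
                       / (2 * sigma ** 2))
              for x, y in fixations]
    return sum(scores) / len(scores)

# Fixations near the screen centre score higher than peripheral ones:
central = [(640, 360), (630, 370)]
peripheral = [(40, 40), (1240, 680)]
hi = central_bias_score(central, 1280, 720, sigma=200)
lo = central_bias_score(peripheral, 1280, 720, sigma=200)
```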

  12. [Eye contact effects: A therapeutic issue?].

    PubMed

    Baltazar, M; Conty, L

    2016-12-01

    The perception of a direct gaze - that is, of another individual's gaze directed at the observer that leads to eye contact - is known to influence a wide range of cognitive processes and behaviors. We stress that these effects mainly reflect positive impacts on human cognition and may thus be used as relevant tools for therapeutic purposes. In this review, we aim (1) to provide an exhaustive review of eye contact effects while discussing the limits of the dominant models used to explain these effects, (2) to illustrate the therapeutic potential of eye contact by targeting those pathologies that show both preserved gaze processing and deficits in one or several functions that are targeted by the eye contact effects, and (3) to propose concrete ways in which eye contact could be employed as a therapeutic tool. (1) We regroup the variety of eye contact effects into four categories, including memory effects, activation of prosocial behavior, positive appraisals of self and others, and the enhancement of self-awareness. We emphasize that the models proposed to account for these effects have poor predictive value and that further description of these effects is needed. (2) We then emphasize that people with pathologies that affect memory, social behavior, self and/or other appraisal, and self-awareness could benefit from eye contact effects. We focus on depression, autism and Alzheimer's disease to illustrate our proposal. To our knowledge, no anomaly of eye contact has been reported in depression. Patients suffering from Alzheimer's disease, at the early and moderate stages, have been shown to maintain a normal amount of eye contact with their interlocutor. Whether gaze processing in autism is preserved or altered remains controversial: on the first view, individuals are thought to avoid or omit gazing at another's eyes, while on the second, individuals are considered unable to process the gaze of others.
We adopt the first stance, following the view that people with autism are not interested in processing social signals such as gaze but could do so efficiently if properly motivated. For each pathology we emphasize that eye contact could be used, for example, to enhance sensitivity to bodily states, thus improving emotional decision making (in autism); to lead to more positive appraisal of the self and others (in depression); to improve memory performance (in Alzheimer's disease); and, more generally, to motivate the recipient to engage in the therapeutic process. (3) Finally, we propose two concrete ways to employ eye contact effects as a therapeutic tool. The first is to develop cognitive-behavioral tools to learn and/or motivate the recipient to create frequent and prolonged periods of eye contact. The second is to raise awareness among caregivers of the beneficial effects of eye contact and to teach them how to use eye contact to reach its optimum effect. Future investigations are, however, needed to explore the ways in which eye contact effects can be efficiently integrated into therapeutic strategies, as well as to identify the clinical populations that can benefit from such therapeutic interventions. Copyright © 2016 L’Encéphale, Paris. Published by Elsevier Masson SAS. All rights reserved.

  13. Language/Culture Modulates Brain and Gaze Processes in Audiovisual Speech Perception.

    PubMed

    Hisanaga, Satoko; Sekiyama, Kaoru; Igasaki, Tomohiko; Murayama, Nobuki

    2016-10-13

    Several behavioural studies have shown that the interplay between voice and face information in audiovisual speech perception is not universal. Native English speakers (ESs) are influenced by visual mouth movement to a greater degree than native Japanese speakers (JSs) when listening to speech. However, the biological basis of these group differences is unknown. Here, we demonstrate the time-varying processes of group differences in terms of event-related brain potentials (ERP) and eye gaze for audiovisual and audio-only speech perception. On a behavioural level, while congruent mouth movement shortened the ESs' response time for speech perception, the opposite effect was observed in JSs. Eye-tracking data revealed a gaze bias to the mouth for the ESs but not the JSs, especially before the audio onset. Additionally, the ERP P2 amplitude indicated that ESs processed multisensory speech more efficiently than auditory-only speech; however, the JSs exhibited the opposite pattern. Taken together, the ESs' early visual attention to the mouth was likely to promote phonetic anticipation, which was not the case for the JSs. These results clearly indicate the impact of language and/or culture on multisensory speech processing, suggesting that linguistic/cultural experiences lead to the development of unique neural systems for audiovisual speech perception.

  14. Are Eyes Windows to a Deceiver's Soul? Children's Use of Another's Eye Gaze Cues in a Deceptive Situation

    ERIC Educational Resources Information Center

    Freire, Alejo; Eskritt, Michelle; Lee, Kang

    2004-01-01

    Three experiments examined 3- to 5-year-olds' use of eye gaze cues to infer truth in a deceptive situation. Children watched a video of an actor who hid a toy in 1 of 3 cups. In Experiments 1 and 2, the actor claimed ignorance about the toy's location but looked toward 1 of the cups, without (Experiment 1) and with (Experiment 2) head movement. In…

  15. A new mapping function in table-mounted eye tracker

    NASA Astrophysics Data System (ADS)

    Tong, Qinqin; Hua, Xiao; Qiu, Jian; Luo, Kaiqing; Peng, Li; Han, Peng

    2018-01-01

    The eye tracker is a relatively new human-computer interaction apparatus that has attracted much attention in recent years. Eye tracking technology obtains the subject's current "visual attention (gaze)" direction using mechanical, electronic, optical, image-processing, and other means of detection. The mapping function is one of the key elements of the image processing and determines the accuracy of the whole eye-tracker system. In this paper, we present a new mapping model based on the relationship among the eyes, the camera, and the screen at which the eye gazes. Firstly, according to the geometrical relationship among the eyes, the camera, and the screen, the framework of the mapping function between the pupil center and the screen coordinates is constructed. Secondly, to simplify the vector inversion in the mapping function, the coordinates of the eyes, the camera, and the screen were modeled as coaxial systems. To verify the mapping function, a corresponding experiment was implemented, and the new model was also compared with the traditional quadratic polynomial function. The results show that our approach can improve the accuracy of gaze-point determination. Compared with other methods, this mapping function is simple and valid.
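
    The traditional quadratic polynomial mapping that the authors compare against can be sketched as a least-squares calibration from pupil-centre coordinates to a screen coordinate. The calibration grid, ground-truth coefficients, and function names below are illustrative assumptions, not the paper's model:

```python
def features(px, py):
    """Second-order polynomial terms of a pupil-centre coordinate."""
    return [1.0, px, py, px * py, px * px, py * py]

def solve(A, b):
    """Solve the square system A x = b by Gauss-Jordan elimination."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def fit(pupil_points, screen_coords):
    """Least-squares fit (normal equations) of one screen coordinate."""
    X = [features(px, py) for px, py in pupil_points]
    n = len(X[0])
    AtA = [[sum(row[i] * row[j] for row in X) for j in range(n)]
           for i in range(n)]
    Atb = [sum(row[i] * s for row, s in zip(X, screen_coords))
           for i in range(n)]
    return solve(AtA, Atb)

# Hypothetical ground-truth mapping, sampled on a 3 x 3 calibration grid;
# the fit recovers the coefficients exactly (the grid is full rank).
true_coeffs = [640.0, 540.0, 8.0, 10.0, -20.0, 5.0]
pupil = [(x, y) for y in (-1.0, 0.0, 1.0) for x in (-1.0, 0.0, 1.0)]
screen_x = [sum(c * f for c, f in zip(true_coeffs, features(px, py)))
            for px, py in pupil]
coeffs = fit(pupil, screen_x)
```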

  16. The Head Tracks and Gaze Predicts: How the World’s Best Batters Hit a Ball

    PubMed Central

    Mann, David L.; Spratford, Wayne; Abernethy, Bruce

    2013-01-01

    Hitters in fast ball-sports do not align their gaze with the ball throughout ball-flight; rather, they use predictive eye movement strategies that contribute towards their level of interceptive skill. Existing studies claim that (i) baseball and cricket batters cannot track the ball because it moves too quickly to be tracked by the eyes, and that consequently (ii) batters do not – and possibly cannot – watch the ball at the moment they hit it. However, to date no studies have examined the gaze of truly elite batters. We examined the eye and head movements of two of the world’s best cricket batters and found that both claims do not apply to these batters. Remarkably, the batters coupled the rotation of their head to the movement of the ball, ensuring the ball remained in a consistent direction relative to their head. As a result, the ball could be followed if the batters simply moved their head and kept their eyes still. Instead of doing so, we show the elite batters used distinctive eye movement strategies, usually relying on two predictive saccades to anticipate (i) the location of ball-bounce, and (ii) the location of bat-ball contact, ensuring they could direct their gaze towards the ball as they hit it. These specific head and eye movement strategies play important functional roles in contributing towards interceptive expertise. PMID:23516460
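
    The head-coupling strategy can be illustrated with simple angle bookkeeping: if head orientation tracks the ball's bearing, no eye-in-head rotation is needed to keep the ball in a constant direction relative to the head. The positions below are made up for illustration:

```python
import math

def bearing_deg(origin, target):
    """Direction of `target` from `origin`, in degrees."""
    return math.degrees(math.atan2(target[1] - origin[1],
                                   target[0] - origin[0]))

head_pos = (0.0, 0.0)
ball_path = [(10.0, 0.0), (8.0, 2.0), (6.0, 4.0)]   # fabricated samples

# Head rotation coupled to the ball: head orientation equals the ball's
# bearing at every instant, so the eye-in-head angle needed to fixate
# the ball stays zero...
coupled_head = [bearing_deg(head_pos, b) for b in ball_path]
coupled_eye = [bearing_deg(head_pos, b) - h
               for b, h in zip(ball_path, coupled_head)]

# ...whereas with the head held fixed, the eyes must rotate through the
# full change in ball direction:
fixed_head = coupled_head[0]
uncoupled_eye = [bearing_deg(head_pos, b) - fixed_head for b in ball_path]
```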

  17. An exploratory study on the driving method of speech synthesis based on the human eye reading imaging data

    NASA Astrophysics Data System (ADS)

    Gao, Pei-pei; Liu, Feng

    2016-10-01

    With the development of information technology and artificial intelligence, speech synthesis plays a significant role in the field of human-computer interaction techniques. However, the main problem with current speech synthesis techniques is a lack of naturalness and expressiveness, so the output is not yet close to the standard of natural language. Another problem is that human-computer interaction based on speech synthesis is too monotonous to realize a mechanism of subjective user drive. This thesis introduces the historical development of speech synthesis and summarizes the general process of this technique. It is pointed out that the prosody generation module is an important part of the speech synthesis process. On the basis of further research, using the rules of eye activity during reading to control and drive prosody generation is introduced as a new human-computer interaction method that enriches the synthesized form. In this article, the present situation of speech synthesis technology is reviewed in detail. On the premise that eye gaze data can be extracted, a speech synthesis method driven in real time by the eye movement signal is proposed that can express the speaker's real speech rhythm. That is, while the reader silently reads a corpus, reading information such as the gaze duration per prosodic unit is captured, and a hierarchical prosodic duration model is established to determine the duration parameters of the synthesized speech. Finally, analysis verifies the feasibility of the above method.
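
    The proposed duration model can be caricatured as a per-unit mapping that scales silent-reading dwell times into synthesis duration parameters. The unit names, dwell times, and `speech_rate` parameter below are assumptions for illustration, not the thesis's actual model:

```python
def duration_parameters(gaze_durations_ms, speech_rate=1.0):
    """Map per-prosodic-unit gaze dwell times (ms, from silent reading)
    to duration parameters for the synthesizer via simple scaling.

    speech_rate > 1 speeds the speech up; < 1 slows it down.
    """
    return {unit: dwell / speech_rate
            for unit, dwell in gaze_durations_ms.items()}

# Fabricated dwell times for three prosodic units:
gaze = {'unit-1': 420.0, 'unit-2': 610.0, 'unit-3': 350.0}
params = duration_parameters(gaze, speech_rate=1.25)
```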

  18. The effect of arousal and eye gaze direction on trust evaluations of stranger's faces: A potential pathway to paranoid thinking.

    PubMed

    Abbott, Jennie; Middlemiss, Megan; Bruce, Vicki; Smailes, David; Dudley, Robert

    2018-09-01

    When asked to evaluate faces of strangers, people with paranoia show a tendency to rate others as less trustworthy. The present study investigated the impact of arousal on this interpersonal bias, and whether this bias was specific to evaluations of trust or additionally affected other trait judgements. The study also examined the impact of eye gaze direction, as direct eye gaze has been shown to heighten arousal. In two experiments, non-clinical participants completed face rating tasks before and after either an arousal manipulation or control manipulation. Experiment one examined the effects of heightened arousal on judgements of trustworthiness. Experiment two examined the specificity of the bias, and the impact of gaze direction. Experiment one indicated that the arousal manipulation led to lower trustworthiness ratings. Experiment two showed that heightened arousal reduced trust evaluations of trustworthy faces, particularly trustworthy faces with averted gaze. The control group rated trustworthy faces with direct gaze as more trustworthy post-manipulation. There was some evidence that attractiveness ratings were affected similarly to the trust judgements, whereas judgements of intelligence were not affected by higher arousal. In both studies, participants reported low levels of arousal even after the manipulation and the use of a non-clinical sample limits the generalisability to clinical samples. There is a complex interplay between arousal, evaluations of trustworthiness and gaze direction. Heightened arousal influences judgements of trustworthiness, but within the context of face type and gaze direction. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Exploring the Potential Relationship between Eye Gaze and English L2 Speakers' Responses to Recasts

    ERIC Educational Resources Information Center

    McDonough, Kim; Crowther, Dustin; Kielstra, Paula; Trofimovich, Pavel

    2015-01-01

    This exploratory study investigated whether joint attention through eye gaze was predictive of second language (L2) speakers' responses to recasts. L2 English learners (N = 20) carried out communicative tasks with research assistants who provided feedback in response to non-targetlike (non-TL) forms. Their interaction was audio-recorded and their…

  20. Can Gaze Avoidance Explain Why Individuals with Asperger's Syndrome Can't Recognise Emotions from Facial Expressions?

    ERIC Educational Resources Information Center

    Sawyer, Alyssa C. P.; Williamson, Paul; Young, Robyn L.

    2012-01-01

    Research has shown that individuals with Autism Spectrum Disorders (ASD) have difficulties recognising emotions from facial expressions. Since eye contact is important for accurate emotion recognition, and individuals with ASD tend to avoid eye contact, this tendency for gaze aversion has been proposed as an explanation for the emotion recognition…

  1. An Exploration of the Use of Eye Gaze and Gestures in Females with Rett Syndrome

    ERIC Educational Resources Information Center

    Urbanowicz, Anna; Downs, Jenny; Girdler, Sonya; Ciccone, Natalie; Leonard, Helen

    2016-01-01

    Purpose: This study investigated the communicative use of eye gaze and gestures in females with Rett syndrome. Method: Data on 151 females with Rett syndrome participating in the Australian Rett Syndrome Database was used in this study. Items from the Communication and Symbolic Behavior Scales Developmental Profile Infant-Toddler Checklist…

  2. Eye Gaze versus Arrows as Spatial Cues: Two Qualitatively Different Modes of Attentional Selection

    ERIC Educational Resources Information Center

    Marotta, Andrea; Lupianez, Juan; Martella, Diana; Casagrande, Maria

    2012-01-01

    This study aimed to evaluate the type of attentional selection (location- and/or object-based) triggered by two different types of central noninformative cues: eye gaze and arrows. Two rectangular objects were presented in the visual field, and subjects' attention was directed to the end of a rectangle via the observation of noninformative…

  3. Gaze Stabilization During Locomotion Requires Full Body Coordination

    NASA Technical Reports Server (NTRS)

    Mulavara, A. P.; Miller, C. A.; Houser, J.; Richards, J. T.; Bloomberg, J. J.

    2001-01-01

    Maintaining gaze stabilization during locomotion places substantial demands on multiple sensorimotor subsystems for precise coordination. Gaze stabilization during locomotion requires eye-head-trunk coordination (Bloomberg, et al., 1997) as well as the regulation of energy flow or shock-wave transmission through the body at high impact phases with the support surface (McDonald, et al., 1997). Allowing these excessive transmissions of energy to reach the head may compromise gaze stability. Impairments in these mechanisms may lead to the oscillopsia and decreased dynamic visual acuity seen in crewmembers returning from short and long duration spaceflight, as well as in patients with vestibular disorders (Hillman, et al., 1999). Thus, we hypothesize that stabilized gaze during locomotion results from full-body coordination of the eye-head-trunk system combined with the lower limb apparatus. The goal of this study was to determine how multiple, interdependent full-body sensorimotor subsystems aiding gaze stabilization during locomotion are functionally coordinated, and how they adaptively respond to spaceflight.

  4. Gaze Pursuit Responses in Nucleus Reticularis Tegmenti Pontis of Head-Unrestrained Macaques

    PubMed Central

    Suzuki, David A.; Betelak, Kathleen F.; Yee, Robert D.

    2009-01-01

    Eye-head gaze pursuit–related activity was recorded in rostral portions of the nucleus reticularis tegmenti pontis (rNRTP) in alert macaques. The head was unrestrained in the horizontal plane, and macaques were trained to pursue a moving target either with their head, with the eyes stationary in the orbits, or with their eyes, with their head voluntarily held stationary in space. Head-pursuit–related modulations in rNRTP activity were observed with some cells exhibiting increases in firing rate with increases in head-pursuit frequency. For many units, this head-pursuit response appeared to saturate at higher frequencies (>0.6 Hz). The response phase relative to peak head-pursuit velocity formed a continuum, containing cells that could encode head-pursuit velocity and those encoding head-pursuit acceleration. The latter cells did not exhibit head position–related activity. Sensitivities were calculated with respect to peak head-pursuit velocity and averaged 1.8 spikes/s/deg/s. Of the cells that were tested for both head- and eye-pursuit–related activity, 86% exhibited responses to both head- and eye-pursuit and therefore carried a putative gaze-pursuit signal. For these gaze-pursuit units, the ratio of head to eye response sensitivities averaged ∼1.4. Pursuit eccentricity seemed to affect head-pursuit response amplitude even in the absence of a head position response per se. The results indicated that rNRTP is a strong candidate for the source of an active head-pursuit signal that projects to the cerebellum, specifically to the target-velocity and gaze-velocity Purkinje cells that have been observed in vermal lobules VI and VII. PMID:18987125
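
    The reported sensitivity (spikes/s per deg/s) is, in effect, a regression slope of firing rate against peak head-pursuit velocity. A sketch with fabricated numbers, chosen here so the slope matches the abstract's average value of 1.8:

```python
def slope(xs, ys):
    """Least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Peak head velocity (deg/s) vs firing rate (spikes/s), hypothetical unit:
velocity = [10.0, 20.0, 30.0, 40.0]
rate = [38.0, 56.0, 74.0, 92.0]
sensitivity = slope(velocity, rate)   # spikes/s per deg/s
```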

  5. Eye contact perception in the West and East: a cross-cultural study.

    PubMed

    Uono, Shota; Hietanen, Jari K

    2015-01-01

    This study investigated whether eye contact perception differs in people with different cultural backgrounds. Finnish (European) and Japanese (East Asian) participants were asked to determine whether Finnish and Japanese neutral faces with various gaze directions were looking at them. Further, participants rated the face stimuli for emotion and other affect-related dimensions. The results indicated that Finnish viewers had a smaller bias toward judging slightly averted gazes as directed at them when judging Finnish rather than Japanese faces, while the bias of Japanese viewers did not differ between faces from their own and other cultural backgrounds. This may be explained by Westerners experiencing more eye contact in their daily life leading to larger visual experience of gaze perception generally, and to more accurate perception of eye contact with people from their own cultural background particularly. The results also revealed cultural differences in the perception of emotion from neutral faces that could also contribute to the bias in eye contact perception.

  6. Eye gaze correction with stereovision for video-teleconferencing.

    PubMed

    Yang, Ruigang; Zhang, Zhengyou

    2004-07-01

    The lack of eye contact in desktop video teleconferencing substantially reduces the effectiveness of video contents. While expensive and bulky hardware is available on the market to correct eye gaze, researchers have been trying to provide a practical software-based solution to bring video-teleconferencing one step closer to the mass market. This paper presents a novel approach: Based on stereo analysis combined with rich domain knowledge (a personalized face model), we synthesize, using graphics hardware, a virtual video that maintains eye contact. A 3D stereo head tracker with a personalized face model is used to compute initial correspondences across two views. More correspondences are then added through template and feature matching. Finally, all the correspondence information is fused together for view synthesis using view morphing techniques. The combined methods greatly enhance the accuracy and robustness of the synthesized views. Our current system is able to generate an eye-gaze corrected video stream at five frames per second on a commodity 1 GHz PC.
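
    The record's view-synthesis step rests on view morphing: once correspondences between the two camera views are established, an intermediate, eye-contact-preserving view is produced by interpolating corresponding point positions. A minimal sketch of that interpolation step (function name and data are illustrative, not the paper's implementation):

```python
# Hedged sketch: the core interpolation step of view morphing between two
# rectified views. Corresponding points are blended by a factor alpha in
# [0, 1]; alpha = 0.5 yields the virtual "between-camera" view that
# restores apparent eye contact. Names and data are illustrative only.

def morph_points(points_left, points_right, alpha=0.5):
    """Interpolate corresponding (x, y) points between two rectified views."""
    return [
        ((1 - alpha) * xl + alpha * xr, (1 - alpha) * yl + alpha * yr)
        for (xl, yl), (xr, yr) in zip(points_left, points_right)
    ]

# A point seen at x=100 in the left view and x=120 in the right view
# appears at x=110 in the synthesized middle view.
left = [(100.0, 50.0)]
right = [(120.0, 50.0)]
print(morph_points(left, right))  # [(110.0, 50.0)]
```

    In the full pipeline described by the record, this geometric blend is followed by blending pixel intensities and re-projecting through the personalized face model; the sketch covers only the position interpolation.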

  7. Gaze and viewing angle influence visual stabilization of upright posture

    PubMed Central

    Ustinova, KI; Perkins, J

    2011-01-01

    Focusing gaze on a target helps stabilize upright posture. We investigated how this visual stabilization can be affected by observing a target presented under different gaze and viewing angles. In a series of 10-second trials, participants (N = 20, 29.3 ± 9 years of age) stood on a force plate and fixed their gaze on a figure presented on a screen at a distance of 1 m. The figure changed position (gaze angle: eye level (0°), 25° up or down), vertical body orientation (viewing angle: at eye level but rotated 25° as if leaning toward or away from the participant), or both (gaze and viewing angle: 25° up or down with the rotation equivalent of a natural visual perspective). Amplitude of participants’ sagittal displacement, surface area, and angular position of the center of gravity (COG) were compared. Results showed decreased COG velocity and amplitude for up and down gaze angles. Changes in viewing angles resulted in altered body alignment and increased amplitude of COG displacement. No significant changes in postural stability were observed when both gaze and viewing angles were altered. Results suggest that both the gaze angle and viewing perspective may be essential variables of the visuomotor system modulating postural responses. PMID:22398978

  8. Face Age and Eye Gaze Influence Older Adults' Emotion Recognition.

    PubMed

    Campbell, Anna; Murray, Janice E; Atkinson, Lianne; Ruffman, Ted

    2017-07-01

    Eye gaze has been shown to influence emotion recognition. In addition, older adults (over 65 years) are not as influenced by gaze direction cues as young adults (18-30 years). Nevertheless, these differences might stem from the use of young to middle-aged faces in emotion recognition research because older adults have an attention bias toward old-age faces. Therefore, using older face stimuli might allow older adults to process gaze direction cues to influence emotion recognition. To investigate this idea, young and older adults completed an emotion recognition task with young and older face stimuli displaying direct and averted gaze, assessing labeling accuracy for angry, disgusted, fearful, happy, and sad faces. Direct gaze rather than averted gaze improved young adults' recognition of emotions in young and older faces, but for older adults this was true only for older faces. The current study highlights the impact of stimulus face age and gaze direction on emotion recognition in young and older adults. The use of young face stimuli with direct gaze in most research might contribute to age-related emotion recognition differences. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  9. Optics of the human cornea influence the accuracy of stereo eye-tracking methods: a simulation study.

    PubMed

    Barsingerhorn, A D; Boonstra, F N; Goossens, H H L M

    2017-02-01

    Current stereo eye-tracking methods model the cornea as a sphere with one refractive surface. However, the human cornea is slightly aspheric and has two refractive surfaces. Here we used ray-tracing and the Navarro eye model to study how these optical properties affect the accuracy of different stereo eye-tracking methods. We found that pupil size, gaze direction and head position all influence the reconstruction of gaze. The resulting errors amount to ±1.0 degrees at best. This shows that stereo eye-tracking may be an option if reliable calibration is not possible, but the applied eye model should account for the actual optics of the cornea.
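
    The ray-tracing underlying such a simulation applies Snell's law at each refractive surface of the eye model. As a hedged illustration of that basic operation (2D vector form of Snell's law; the function name is ours, and 1.376 is a commonly quoted corneal refractive index, not necessarily the value used in the study):

```python
import math

# Hedged sketch: 2D vector form of Snell's law, the elementary operation a
# ray-tracing eye model applies at each corneal surface. n1 and n2 are the
# refractive indices before and after the surface; the ray direction and
# surface normal are unit 2D vectors. Illustrative only, not the Navarro
# eye model itself.

def refract(direction, normal, n1, n2):
    dx, dy = direction
    nx, ny = normal
    cos_i = -(dx * nx + dy * ny)           # cosine of the incidence angle
    ratio = n1 / n2
    sin2_t = ratio**2 * (1.0 - cos_i**2)   # Snell: sin(t) = ratio * sin(i)
    if sin2_t > 1.0:
        return None                        # total internal reflection
    cos_t = math.sqrt(1.0 - sin2_t)
    return (ratio * dx + (ratio * cos_i - cos_t) * nx,
            ratio * dy + (ratio * cos_i - cos_t) * ny)

# A ray hitting the surface head-on passes through unbent.
print(refract((1.0, 0.0), (-1.0, 0.0), 1.0, 1.376))  # (1.0, 0.0)
```

    Tracing a gaze ray through two such surfaces (anterior and posterior cornea) rather than one is exactly the refinement the record argues stereo eye trackers need.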

  10. Differences in gaze anticipation for locomotion with and without vision

    PubMed Central

    Authié, Colas N.; Hilt, Pauline M.; N'Guyen, Steve; Berthoz, Alain; Bennequin, Daniel

    2015-01-01

    Previous experimental studies have shown a spontaneous anticipation of locomotor trajectory by the head and gaze direction during human locomotion. This anticipatory behavior could serve several functions: an optimal selection of visual information, for instance through landmarks and optic flow, as well as trajectory planning and motor control. This would imply that anticipation remains in darkness but with different characteristics. We asked 10 participants to walk along two predefined complex trajectories (limaçon and figure eight) without any cue on the trajectory to follow. Two visual conditions were used: (i) in light and (ii) in complete darkness with eyes open. The whole body kinematics were recorded by motion capture, along with the participant's right eye movements. We showed that in darkness and in light, horizontal gaze anticipates the orientation of the head which itself anticipates the trajectory direction. However, the horizontal angular anticipation decreases by a half in darkness for both gaze and head. In both visual conditions we observed an eye nystagmus with similar properties (frequency and amplitude). The main difference comes from the fact that in light, there is a shift of the orientations of the eye nystagmus and the head in the direction of the trajectory. These results suggest that a fundamental function of gaze is to represent self motion, stabilize the perception of space during locomotion, and to simulate the future trajectory, regardless of the vision condition. PMID:26106313

  11. NIR tracking assists sports medicine in junior basketball training

    NASA Astrophysics Data System (ADS)

    Paeglis, Roberts; Bluss, Kristaps; Rudzitis, Andris; Spunde, Andris; Brice, Tamara; Nitiss, Edgars

    2011-07-01

    We recorded eye movements of eight elite junior basketball players, hypothesizing that a more stable gaze is correlated with a better shot rate. After preliminary testing we invited male juniors whose eyes could be reliably tracked in a game situation, using a head-mounted video-based eye tracker. The participants had no record of ocular or other health issues. No significant differences were found between shots made with and without the tracker cap; paired-samples t-tests yielded p = .130 for the far-range and p = .900 for the middle-range shots. The players made 40 shots from common far- and middle-range locations, 5 and 4 meters respectively for players aged 14 years. As expected, a statistical correlation was found between gaze fixation duration (in milliseconds) and the far- and middle-range shot rates, r = .782, p = .03. Notably, juniors who fixated longer before a shot had a more stable fixation, that is, a lower gaze dispersion (in tracker screen pixels), r = -.786, p = .02. This finding was augmented by the observation that gaze dispersion while aiming at the basket was smaller (i.e., gaze was more stable) in those who were more likely to score. We derived a regression equation linking fixation duration to shot success. We advocate infrared eye tracking as a means to monitor player selection and training success.
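
    The abstract mentions a regression equation linking fixation duration to shot success but does not report its coefficients. A sketch of how such an equation is obtained by ordinary least squares (the data points below are invented for illustration; they are not the study's measurements):

```python
# Hedged sketch: ordinary least-squares fit of shot success rate on
# pre-shot fixation duration, the kind of regression equation the record
# mentions. The data points are invented; the study's actual coefficients
# are not given in the abstract.

def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# Fixation duration (ms) vs. shot success rate (illustrative values only).
durations = [200.0, 300.0, 400.0, 500.0]
success = [0.40, 0.50, 0.60, 0.70]
slope, intercept = fit_line(durations, success)
print(f"slope={slope:.4f} intercept={intercept:.2f}")  # slope=0.0010 intercept=0.20
```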

  12. Holistic integration of gaze cues in visual face and body perception: Evidence from the composite design.

    PubMed

    Vrancken, Leia; Germeys, Filip; Verfaillie, Karl

    2017-01-01

    A considerable amount of research on identity recognition and emotion identification with the composite design points to the holistic processing of these aspects in faces and bodies. In this paradigm, the interference from a nonattended face half on the perception of the attended half is taken as evidence for holistic processing (i.e., a composite effect). Far less research, however, has been dedicated to the concept of gaze. Nonetheless, gaze perception is a substantial component of face and body perception, and holds critical information for everyday communicative interactions. Furthermore, the ability of human observers to detect direct versus averted eye gaze is effortless, perhaps similar to identity perception and emotion recognition. However, the hypothesis of holistic perception of eye gaze has never been tested directly. Research on gaze perception with the composite design could facilitate further systematic comparison with other aspects of face and body perception that have been investigated using the composite design (i.e., identity and emotion). In the present research, a composite design was administered to assess holistic processing of gaze cues in faces (Experiment 1) and bodies (Experiment 2). Results confirmed that eye and head orientation (Experiment 1A) and head and body orientation (Experiment 2A) are integrated in a holistic manner. However, the composite effect was not completely disrupted by inversion (Experiments 1B and 2B), a finding that will be discussed together with implications for future research.

  13. Mouse cursor movement and eye tracking data as an indicator of pathologists’ attention when viewing digital whole slide images

    PubMed Central

    Raghunath, Vignesh; Braxton, Melissa O.; Gagnon, Stephanie A.; Brunyé, Tad T.; Allison, Kimberly H.; Reisch, Lisa M.; Weaver, Donald L.; Elmore, Joann G.; Shapiro, Linda G.

    2012-01-01

    Context: Digital pathology has the potential to dramatically alter the way pathologists work, yet little is known about pathologists’ viewing behavior while interpreting digital whole slide images. While tracking pathologist eye movements when viewing digital slides may be the most direct method of capturing pathologists’ viewing strategies, this technique is cumbersome and technically challenging to use in remote settings. Tracking pathologist mouse cursor movements may serve as a practical method of studying digital slide interpretation, and mouse cursor data may illuminate pathologists’ viewing strategies and time expenditures in their interpretive workflow. Aims: To evaluate the utility of mouse cursor movement data, in addition to eye-tracking data, in studying pathologists’ attention and viewing behavior. Settings and Design: Pathologists (N = 7) viewed 10 digital whole slide images of breast tissue that were selected using a random stratified sampling technique to include a range of breast pathology diagnoses (benign/atypia, carcinoma in situ, and invasive breast cancer). A panel of three expert breast pathologists established a consensus diagnosis for each case using a modified Delphi approach. Materials and Methods: Participants’ foveal vision was tracked using SensoMotoric Instruments RED 60 Hz eye-tracking system. Mouse cursor movement was tracked using a custom MATLAB script. Statistical Analysis Used: Data on eye-gaze and mouse cursor position were gathered at fixed intervals and analyzed using distance comparisons and regression analyses by slide diagnosis and pathologist expertise. Pathologists’ accuracy (defined as percent agreement with the expert consensus diagnoses) and efficiency (accuracy and speed) were also analyzed. Results: Mean viewing time per slide was 75.2 seconds (SD = 38.42). Accuracy (percent agreement with expert consensus) by diagnosis type was: 83% (benign/atypia); 48% (carcinoma in situ); and 93% (invasive). 
Spatial coupling was close between eye-gaze and mouse cursor positions (highest frequency ∆x was 4.00px (SD = 16.10), and ∆y was 37.50px (SD = 28.08)). Mouse cursor position moderately predicted eye gaze patterns (Rx = 0.33 and Ry = 0.21). Conclusions: Data detailing mouse cursor movements may be a useful addition to future studies of pathologists’ accuracy and efficiency when using digital pathology. PMID:23372984
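
    The reported Rx and Ry values are per-axis associations between cursor and gaze coordinates. A minimal sketch of that computation (Pearson correlation per axis; the sample coordinates are invented, not the study's data):

```python
import math

# Hedged sketch: per-axis Pearson correlation between mouse-cursor and
# eye-gaze samples, the statistic behind the record's Rx and Ry values.
# The sample coordinates below are invented for illustration.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

cursor_x = [10.0, 20.0, 30.0, 40.0]   # cursor x positions (px)
gaze_x = [12.0, 24.0, 31.0, 45.0]     # gaze x positions (px)
print(round(pearson(cursor_x, gaze_x), 3))  # 0.993
```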

  15. Individual Differences in Face Processing: Infants' Scanning Patterns and Pupil Dilations Are Influenced by the Distribution of Parental Leave

    ERIC Educational Resources Information Center

    Gredeback, Gustaf; Eriksson, Malin; Schmitow, Clara; Laeng, Bruno; Stenberg, Gunilla

    2012-01-01

    Fourteen-month-old infants were presented with static images of happy, neutral, and fearful emotional facial expressions in an eye-tracking paradigm. The emotions were expressed by the infant's own parents as well as a male and female stranger (parents of another participating infant). Rather than measuring the duration of gaze in particular areas…

  16. Culture and Listeners' Gaze Responses to Stuttering

    ERIC Educational Resources Information Center

    Zhang, Jianliang; Kalinowski, Joseph

    2012-01-01

    Background: It is frequently observed that listeners demonstrate gaze aversion to stuttering. This response may have profound social/communicative implications for both fluent and stuttering individuals. However, there is a lack of empirical examination of listeners' eye gaze responses to stuttering, and it is unclear whether cultural background…

  17. Interaction between gaze and visual and proprioceptive position judgements.

    PubMed

    Fiehler, Katja; Rösler, Frank; Henriques, Denise Y P

    2010-06-01

    There is considerable evidence that targets for action are represented in a dynamic gaze-centered frame of reference, such that each gaze shift requires an internal updating of the target. Here, we investigated the effect of eye movements on the spatial representation of targets used for position judgements. Participants had their hand passively placed to a location, and then judged whether this location was left or right of a remembered visual or remembered proprioceptive target, while gaze direction was varied. Estimates of position of the remembered targets relative to the unseen position of the hand were assessed with an adaptive psychophysical procedure. These positional judgements significantly varied relative to gaze for both remembered visual and remembered proprioceptive targets. Our results suggest that relative target positions may also be represented in eye-centered coordinates. This implies similar spatial reference frames for action control and space perception when positions are coded relative to the hand.

  18. Dysthyroid Orbitopathy Presenting with Gaze-Evoked Amaurosis: Case Report and Review of the Literature.

    PubMed

    Orlans, Harry O; Bremner, Fion D

    2015-01-01

    Gaze-evoked amaurosis (GEA) describes visual loss associated with eccentric gaze that recovers when the eye is returned to primary position. Here we describe an unusual case of bilateral GEA as the presenting feature of dysthyroid orbitopathy. This is only the third such case to be reported in the literature and the first to feature bilateral GEA in all positions of gaze without accompanying proptosis or ophthalmoplegia. A 50-year-old man who had recently commenced treatment for thyrotoxicosis presented with a 3-week history of typical GEA in both eyes in all positions of gaze. He subsequently developed a bilateral compressive optic neuropathy which was only partially responsive to high dose steroid therapy. Although an uncommon presenting feature of dysthyroid orbitopathy, GEA is an ominous symptom that may precede sight-threatening optic nerve compromise. When present, early immunosuppressive and/or decompressive treatment should be considered.

  19. A neural-based remote eye gaze tracker under natural head motion.

    PubMed

    Torricelli, Diego; Conforto, Silvia; Schmid, Maurizio; D'Alessio, Tommaso

    2008-10-01

    A novel approach to view-based eye gaze tracking for human computer interface (HCI) is presented. The proposed method combines different techniques to address the problems of head motion, illumination and usability in the framework of low cost applications. Feature detection and tracking algorithms have been designed to obtain an automatic setup and strengthen the robustness to light conditions. An extensive analysis of neural solutions has been performed to deal with the non-linearity associated with gaze mapping under free-head conditions. No specific hardware, such as infrared illumination or high-resolution cameras, is needed; rather, a simple commercial webcam working in the visible light spectrum suffices. The system is able to classify the gaze direction of the user over a 15-zone graphical interface, with a success rate of 95% and a global accuracy of around 2 degrees, comparable with the vast majority of existing remote gaze trackers.
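
    The system's output stage assigns a gaze estimate to one of 15 interface zones. As a much-simplified, hedged stand-in for the paper's neural mapping (the 5×3 grid layout and all names here are assumptions), the zone-classification step alone can be sketched as:

```python
# Hedged sketch: classifying a normalized gaze estimate (x, y in [0, 1])
# into one of 15 screen zones, assumed here to form a 5-column x 3-row
# grid. The paper maps image features to zones with a neural network
# under free-head conditions; this stand-in illustrates only the final
# zone-assignment step.

COLS, ROWS = 5, 3

def gaze_zone(x, y):
    """Return the zone index (0..14), row-major, for a normalized gaze point."""
    col = min(int(x * COLS), COLS - 1)   # clamp x = 1.0 into the last column
    row = min(int(y * ROWS), ROWS - 1)
    return row * COLS + col

print(gaze_zone(0.5, 0.5))  # centre of the screen -> zone 7
```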

  20. Evidence for impairments in using static line drawings of eye gaze cues to orient visual-spatial attention in children with high functioning autism.

    PubMed

    Goldberg, Melissa C; Mostow, Allison J; Vecera, Shaun P; Larson, Jennifer C Gidley; Mostofsky, Stewart H; Mahone, E Mark; Denckla, Martha B

    2008-09-01

    We examined the ability to use static line drawings of eye gaze cues to orient visual-spatial attention in children with high functioning autism (HFA) compared to typically developing children (TD). The task was organized such that on valid trials, gaze cues were directed toward the same spatial location as the appearance of an upcoming target, while on invalid trials gaze cues were directed to an opposite location. Unlike TD children, children with HFA showed no advantage in reaction time (RT) on valid trials compared to invalid trials (i.e., no significant validity effect). The two stimulus onset asynchronies (200 ms, 700 ms) did not differentially affect these findings. The results suggest that children with HFA show impairments in utilizing static line drawings of gaze cues to orient visual-spatial attention.

  1. Simple gaze-contingent cues guide eye movements in a realistic driving simulator

    NASA Astrophysics Data System (ADS)

    Pomarjanschi, Laura; Dorr, Michael; Bex, Peter J.; Barth, Erhardt

    2013-03-01

    Looking at the right place at the right time is a critical component of driving skill. Therefore, gaze guidance has the potential to become a valuable driving assistance system. In previous work, we have already shown that complex gaze-contingent stimuli can guide attention and reduce the number of accidents in a simple driving simulator. We here set out to investigate whether cues that are simple enough to be implemented in a real car can also capture gaze during a more realistic driving task in a high-fidelity driving simulator. We used a state-of-the-art, wide-field-of-view driving simulator with an integrated eye tracker. Gaze-contingent warnings were implemented using two arrays of light-emitting diodes horizontally fitted below and above the simulated windshield. Thirteen volunteering subjects drove along predetermined routes in a simulated environment populated with autonomous traffic. Warnings were triggered during the approach to half of the intersections, cueing either towards the right or to the left. The remaining intersections were not cued, and served as controls. The analysis of the recorded gaze data revealed that the gaze-contingent cues did indeed have a gaze guiding effect, triggering a significant shift in gaze position towards the highlighted direction. This gaze shift was not accompanied by changes in driving behaviour, suggesting that the cues do not interfere with the driving task itself.

  2. Virtual social interactions in social anxiety--the impact of sex, gaze, and interpersonal distance.

    PubMed

    Wieser, Matthias J; Pauli, Paul; Grosseibl, Miriam; Molzow, Ina; Mühlberger, Andreas

    2010-10-01

    In social interactions, interpersonal distance between interaction partners plays an important role in determining the status of the relationship. Interpersonal distance is an important nonverbal behavior, and is used to regulate personal space in a complex interplay with other nonverbal behaviors such as eye gaze. In social anxiety, studies regarding the impact of interpersonal distance on within-situation avoidance behavior are so far rare. Thus the present study aimed to scrutinize the relationship between gaze direction, sex, interpersonal distance, and social anxiety in social interactions. Social interactions were modeled in a virtual-reality (VR) environment, where 20 low and 19 high socially anxious women were confronted with approaching male and female characters, who stopped in front of the participant, either some distance away or close to them, and displayed either a direct or an averted gaze. Gaze and head movements, as well as heart rate, were measured as indices of avoidance behavior and fear reactions. High socially anxious participants showed a complex pattern of avoidance behavior: when the avatar was standing farther away, high socially anxious women avoided gaze contact with male avatars showing a direct gaze. Furthermore, they showed avoidance behavior (backward head movements) in response to male avatars showing a direct gaze, regardless of the interpersonal distance. Overall, the current study showed that VR social interactions can be a very useful tool for investigating avoidance behavior of socially anxious individuals in highly controlled situations. This might also be the first step in using VR social interactions in clinical protocols for the therapy of social anxiety disorder.

  3. No Evidence of Emotional Dysregulation or Aversion to Mutual Gaze in Preschoolers with Autism Spectrum Disorder: An Eye-Tracking Pupillometry Study

    ERIC Educational Resources Information Center

    Nuske, Heather J.; Vivanti, Giacomo; Dissanayake, Cheryl

    2015-01-01

    The "gaze aversion hypothesis" suggests that people with Autism Spectrum Disorder (ASD) avoid mutual gaze because they experience it as hyper-arousing. To test this hypothesis we showed mutual and averted gaze stimuli to 23 mixed-ability preschoolers with ASD (mean Mullen DQ = 68) and 21 typically-developing preschoolers, aged…

  4. Disentangling working memory processes during spatial span assessment: a modeling analysis of preferred eye movement strategies.

    PubMed

    Patt, Virginie M; Thomas, Michael L; Minassian, Arpi; Geyer, Mark A; Brown, Gregory G; Perry, William

    2014-01-01

    The neurocognitive processes involved during classic spatial working memory (SWM) assessment were investigated by examining naturally preferred eye movement strategies. Cognitively healthy adult volunteers were tested in a computerized version of the Corsi Block-Tapping Task--a spatial span task requiring the short term maintenance of a series of locations presented in a specific order--coupled with eye tracking. Modeling analysis was developed to characterize eye-tracking patterns across all task phases, including encoding, retention, and recall. Results revealed a natural preference for local gaze maintenance during both encoding and retention, with fewer than 40% fixated targets. These findings contrasted with the stimulus retracing pattern expected during recall as a result of task demands, with 80% fixated targets. Along with participants' self-reported strategies of mentally "making shapes," these results suggest the involvement of covert attention shifts and higher order cognitive Gestalt processes during spatial span tasks, challenging instrument validity as a single measure of SWM storage capacity.

  5. An Exploration of the Use of Eye-Gaze Tracking to Study Problem-Solving on Standardized Science Assessments

    ERIC Educational Resources Information Center

    Tai, Robert H.; Loehr, John F.; Brigham, Frederick J.

    2006-01-01

    This pilot study investigated the capacity of eye-gaze tracking to identify differences in problem-solving behaviours within a group of individuals who possessed varying degrees of knowledge and expertise in three disciplines of science (biology, chemistry and physics). The six participants, all pre-service science teachers, completed an 18-item…

  6. Real-Time Mutual Gaze Perception Enhances Collaborative Learning and Collaboration Quality

    ERIC Educational Resources Information Center

    Schneider, Bertrand; Pea, Roy

    2013-01-01

    In this paper we present the results of an eye-tracking study on collaborative problem-solving dyads. Dyads remotely collaborated to learn from contrasting cases involving basic concepts about how the human brain processes visual information. In one condition, dyads saw the eye gazes of their partner on the screen; in a control group, they did not…

  7. A Bilingual Advantage in 54-Month-Olds' Use of Referential Cues in Fast Mapping

    ERIC Educational Resources Information Center

    Yow, W. Quin; Li, Xiaoqian; Lam, Sarah; Gliga, Teodora; Chong, Yap Seng; Kwek, Kenneth; Broekman, Birit F. P.

    2017-01-01

    Research has demonstrated a bilingual advantage in how young children use referential cues such as eye gaze and pointing gesture to locate an object or to categorize objects. This study investigated the use of referential cues (i.e. eye gaze) in fast mapping in three groups of children that differed in their language exposure. One hundred and…

  8. Eye Gaze Metrics Reflect a Shared Motor Representation for Action Observation and Movement Imagery

    ERIC Educational Resources Information Center

    McCormick, Sheree A.; Causer, Joe; Holmes, Paul S.

    2012-01-01

    Action observation (AO) and movement imagery (MI) have been reported to share similar neural networks. This study investigated the congruency between AO and MI using the eye gaze metrics, dwell time and fixation number. A simple reach-grasp-place arm movement was observed and, in a second condition, imagined where the movement was presented from…

  9. The Relationship between Children's Gaze Reporting and Theory of Mind

    ERIC Educational Resources Information Center

    D'Entremont, Barbara; Seamans, Elizabeth; Boudreau, Elyse

    2012-01-01

    Seventy-nine 3- and 4-year-old children were tested on gaze-reporting ability and Wellman and Liu's (2004) continuous measure of theory of mind (ToM). Children were better able to report where someone was looking when eye and head direction were provided as a cue compared with when only eye direction cues were provided. With the exception of…

  10. Perception of direct vs. averted gaze in portrait paintings: An fMRI and eye-tracking study.

    PubMed

    Kesner, Ladislav; Grygarová, Dominika; Fajnerová, Iveta; Lukavský, Jiří; Nekovářová, Tereza; Tintěra, Jaroslav; Zaytseva, Yuliya; Horáček, Jiří

    2018-06-15

    In this study, we use separate eye-tracking measurements and functional magnetic resonance imaging to investigate the neuronal and behavioral response to painted portraits with direct versus averted gaze. We further explored modulatory effects of several painting characteristics (premodern vs modern period, influence of style and pictorial context). In the fMRI experiment, we show that direct versus averted gaze elicited increased activation in the lingual and inferior occipital gyri and the fusiform face area, as well as in several areas involved in attentional and social cognitive processes, especially the theory of mind: angular gyrus/temporo-parietal junction, inferior frontal gyrus and dorsolateral prefrontal cortex. The additional eye-tracking experiment showed that participants spent more time viewing the portrait's eyes and mouth when the portrait's gaze was directed towards the observer. These results suggest that static and, in some cases, highly stylized depictions of human beings in artistic portraits elicit brain activation commensurate with the experience of being observed by a watchful intelligent being. They thus involve observers in implicit inferences of the painted subject's mental states and emotions. We further confirm the substantial influence of representational medium on brain activity. Copyright © 2018 Elsevier Inc. All rights reserved.

  11. An automatic calibration procedure for remote eye-gaze tracking systems.

    PubMed

    Model, Dmitri; Guestrin, Elias D; Eizenman, Moshe

    2009-01-01

    Remote gaze estimation systems use calibration procedures to estimate subject-specific parameters that are needed for the calculation of the point-of-gaze. In these procedures, subjects are required to fixate on a specific point or points at specific time instances. Advanced remote gaze estimation systems can estimate the optical axis of the eye without any personal calibration procedure, but use a single calibration point to estimate the angle between the optical axis and the visual axis (line-of-sight). This paper presents a novel automatic calibration procedure that does not require active user participation. To estimate the angles between the optical and visual axes of each eye, this procedure minimizes the distance between the intersections of the visual axes of the left and right eyes with the surface of a display while subjects look naturally at the display (e.g., watching a video clip). Simulation results demonstrate that the performance of the algorithm improves as the range of viewing angles increases. For a subject sitting 75 cm in front of an 80 cm x 60 cm display (40" TV) the standard deviation of the error in the estimation of the angles between the optical and visual axes is 0.5 degrees.
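
    The minimization at the heart of this procedure can be illustrated with a small numerical sketch. Everything below is an illustrative simplification, not the authors' implementation: the geometry is flattened to 2-D screen coordinates, the offsets and the sample generator are invented, and in this toy model only the *relative* offset between the two eyes is identifiable (the full 3-D procedure, with varying viewing geometry, recovers each eye's angles separately).

```python
import math
import random

D = 75.0  # viewing distance in cm, as in the abstract


def deg_to_cm(deg):
    """Screen displacement produced by an angular offset at distance D."""
    return D * math.tan(math.radians(deg))


# Hypothetical "true" angles (deg) between optical and visual axes per eye.
true_left, true_right = (-2.0, 1.0), (2.5, -0.5)

random.seed(0)
samples = []
for _ in range(200):
    # A point on the 80 cm x 60 cm display the subject actually looks at.
    gx = random.uniform(-40.0, 40.0)
    gy = random.uniform(-30.0, 30.0)
    # Optical-axis screen intersections: visual point minus the offset effect.
    left = (gx - deg_to_cm(true_left[0]), gy - deg_to_cm(true_left[1]))
    right = (gx - deg_to_cm(true_right[0]), gy - deg_to_cm(true_right[1]))
    samples.append((left, right))

# Calibration objective: offsets that bring the corrected left- and right-eye
# intersections together.  In this flat 2-D toy model the least-squares
# solution is simply the mean separation of the two optical intersections.
n = len(samples)
rel_x = sum(r[0] - l[0] for l, r in samples) / n
rel_y = sum(r[1] - l[1] for l, r in samples) / n
est_rel = (math.degrees(math.atan2(rel_x, D)),
           math.degrees(math.atan2(rel_y, D)))
print("estimated relative offset (deg):", est_rel)
```

    Because the left- and right-eye errors enter the objective symmetrically, real implementations rely on depth and viewing-angle variation, which is consistent with the abstract's observation that performance improves as the range of viewing angles increases.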

  12. The Development of Mentalistic Gaze Understanding

    ERIC Educational Resources Information Center

    Doherty, Martin J.

    2006-01-01

    Very young infants are sensitive to and follow other people's gaze. By 18 months, children, like chimpanzees, apparently represent the spatial relationship between viewer and object viewed: they can follow eye direction alone, and react appropriately if the other's gaze is blocked by occluding barriers. This paper assesses when children represent…

  13. Quantifying gaze and mouse interactions on spatial visual interfaces with a new movement analytics methodology

    PubMed Central

    2017-01-01

    Eye movements provide insights into what people pay attention to, and therefore are commonly included in a variety of human-computer interaction studies. Eye movement recording devices (eye trackers) produce gaze trajectories, that is, sequences of gaze locations on the screen. Despite recent technological developments that have enabled more affordable hardware, gaze data are still costly and time consuming to collect; some therefore propose using mouse movements instead, which are easy to collect automatically and on a large scale. If and how these two movement types are linked, however, is less clear and highly debated. We address this problem in two ways. First, we introduce a new movement analytics methodology to quantify the level of dynamic interaction between the gaze and the mouse pointer on the screen. Our method uses a volumetric representation of movement, the space-time densities, which allows us to calculate interaction levels between two physically different types of movement. We describe the method and compare the results with existing dynamic interaction methods from movement ecology. The sensitivity to method parameters is evaluated on simulated trajectories where we can control interaction levels. Second, we perform an experiment with eye and mouse tracking to generate real data with real levels of interaction, to apply and test our new methodology on a real case. Further, as our experimental task mimics route-tracing on a map, it is more than a data collection exercise: it simultaneously allows us to investigate the actual connection between the eye and the mouse. We find that there seems to be a natural coupling when the eyes are not under conscious control, but that this coupling breaks down when participants are instructed to move their eyes intentionally. Based on these observations, we tentatively suggest that for natural tracing tasks, mouse tracking could potentially provide similar information as eye-tracking and therefore be used as a proxy for attention. However, more research is needed to confirm this. PMID:28777822

  14. Quantifying gaze and mouse interactions on spatial visual interfaces with a new movement analytics methodology.

    PubMed

    Demšar, Urška; Çöltekin, Arzu

    2017-01-01

    Eye movements provide insights into what people pay attention to, and therefore are commonly included in a variety of human-computer interaction studies. Eye movement recording devices (eye trackers) produce gaze trajectories, that is, sequences of gaze locations on the screen. Despite recent technological developments that have enabled more affordable hardware, gaze data are still costly and time consuming to collect; some therefore propose using mouse movements instead, which are easy to collect automatically and on a large scale. If and how these two movement types are linked, however, is less clear and highly debated. We address this problem in two ways. First, we introduce a new movement analytics methodology to quantify the level of dynamic interaction between the gaze and the mouse pointer on the screen. Our method uses a volumetric representation of movement, the space-time densities, which allows us to calculate interaction levels between two physically different types of movement. We describe the method and compare the results with existing dynamic interaction methods from movement ecology. The sensitivity to method parameters is evaluated on simulated trajectories where we can control interaction levels. Second, we perform an experiment with eye and mouse tracking to generate real data with real levels of interaction, to apply and test our new methodology on a real case. Further, as our experimental task mimics route-tracing on a map, it is more than a data collection exercise: it simultaneously allows us to investigate the actual connection between the eye and the mouse. We find that there seems to be a natural coupling when the eyes are not under conscious control, but that this coupling breaks down when participants are instructed to move their eyes intentionally. Based on these observations, we tentatively suggest that for natural tracing tasks, mouse tracking could potentially provide similar information as eye-tracking and therefore be used as a proxy for attention. However, more research is needed to confirm this.
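
    The authors' space-time density method is too involved for a short sketch, but the simpler proximity-style interaction measures from movement ecology that it is compared against can be illustrated as follows. All trajectories, thresholds, and names here are hypothetical.

```python
import math
import random


def interaction_index(traj_a, traj_b, dist_thresh=50.0):
    """Fraction of synchronized samples where the two trajectories lie
    within dist_thresh pixels of each other (a proximity-based measure)."""
    hits = sum(1 for (ax, ay), (bx, by) in zip(traj_a, traj_b)
               if math.hypot(ax - bx, ay - by) <= dist_thresh)
    return hits / min(len(traj_a), len(traj_b))


random.seed(1)
# Simulated gaze sweeping across an 800 x 600 px screen.
gaze = [(t * 4.0, 300 + 80 * math.sin(t / 20)) for t in range(200)]
# "Coupled" condition: the mouse tracks the gaze with small jitter.
mouse_coupled = [(x + random.gauss(0, 10), y + random.gauss(0, 10))
                 for x, y in gaze]
# "Decoupled" condition: the mouse wanders independently.
mouse_free = [(random.uniform(0, 800), random.uniform(0, 600))
              for _ in range(200)]

print(interaction_index(gaze, mouse_coupled))  # high: natural coupling
print(interaction_index(gaze, mouse_free))     # low: coupling broken
```

    The abstract's finding, a coupling that holds under natural viewing but breaks under intentional eye movement, would show up in such an index as a drop between the two conditions.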

  15. A preliminary study of right hemisphere cognitive deficits and impaired social judgments among young people with Asperger syndrome.

    PubMed

    Ellis, Hadyn D; Ellis, Diane M; Fraser, William; Deb, Shoumitro

    1994-10-01

    Seven children and young adults with definite signs of Asperger syndrome were administered a battery of tests designed to assess intelligence, left and right cerebral hemisphere functioning, the ability to discriminate eye gaze, and social judgment. The subjects showed a non-significant tendency toward higher verbal IQ than visual IQ, and their right hemisphere functioning seemed impaired. They were also poorer at discriminating eye gaze and revealed difficulties in making hypothetical social judgments. The data are considered with reference to Rourke's (1988) work on non-verbal learning disabilities, together with the ideas of Tantam (1992) on the "social gaze response" and Baron-Cohen's (1993) Eye-Direction Detector model. The possible links between social judgment and theory of mind (Frith, 1991) are briefly explored.

  16. Spatial Frequency Requirements and Gaze Strategy in Visual-Only and Audiovisual Speech Perception

    PubMed Central

    Wilson, Amanda H.; Paré, Martin; Munhall, Kevin G.

    2016-01-01

    Purpose The aim of this article is to examine the effects of visual image degradation on performance and gaze behavior in audiovisual and visual-only speech perception tasks. Method We presented vowel–consonant–vowel utterances visually filtered at a range of frequencies in visual-only, audiovisual congruent, and audiovisual incongruent conditions (Experiment 1; N = 66). In Experiment 2 (N = 20), participants performed a visual-only speech perception task and in Experiment 3 (N = 20) an audiovisual task while having their gaze behavior monitored using eye-tracking equipment. Results In the visual-only condition, increasing image resolution led to monotonic increases in performance, and proficient speechreaders were more affected by the removal of high spatial frequency information than were poor speechreaders. The McGurk effect also increased with increasing visual resolution, although it was less affected by the removal of high-frequency information. Observers tended to fixate on the mouth more in visual-only perception, but gaze toward the mouth did not correlate with accuracy of silent speechreading or the magnitude of the McGurk effect. Conclusions The results suggest that individual differences in silent speechreading and the McGurk effect are not related. This conclusion is supported by differential influences of high-resolution visual information on the 2 tasks and differences in the pattern of gaze. PMID:27537379

  17. Eye Movements in Risky Choice

    PubMed Central

    Hermens, Frouke; Matthews, William J.

    2015-01-01

    Abstract We asked participants to make simple risky choices while we recorded their eye movements. We built a complete statistical model of the eye movements and found very little systematic variation in eye movements over the time course of a choice or across the different choices. The only exceptions were finding more (of the same) eye movements when choice options were similar, and an emerging gaze bias in which people looked more at the gamble they ultimately chose. These findings are inconsistent with prospect theory, the priority heuristic, or decision field theory. However, the eye movements made during a choice have a large relationship with the final choice, and this is mostly independent from the contribution of the actual attribute values in the choice options. That is, eye movements tell us not just about the processing of attribute values but also are independently associated with choice. The pattern is simple—people choose the gamble they look at more often, independently of the actual numbers they see—and this pattern is simpler than predicted by decision field theory, decision by sampling, and the parallel constraint satisfaction model. © 2015 The Authors. Journal of Behavioral Decision Making published by John Wiley & Sons Ltd. PMID:27522985

  18. Art critic: Multisignal vision and speech interaction system in a gaming context.

    PubMed

    Reale, Michael J; Liu, Peng; Yin, Lijun; Canavan, Shaun

    2013-12-01

    True immersion of a player within a game can only occur when the simulated world looks and behaves as close to reality as possible. This implies that the game must correctly read and understand, among other things, the player's focus, attitude toward the objects/persons in focus, gestures, and speech. In this paper, we propose a novel system that integrates eye gaze estimation, head pose estimation, facial expression recognition, speech recognition, and text-to-speech components for use in real-time games. Both the eye gaze and head pose components utilize underlying 3-D models, and our novel head pose estimation algorithm uniquely combines scene flow with a generic head model. The facial expression recognition module uses the local binary patterns with three orthogonal planes approach on the 2-D shape index domain rather than the pixel domain, resulting in improved classification. Our system has also been extended to use a pan-tilt-zoom camera driven by the Kinect, allowing us to track a moving player. A test game, Art Critic, is also presented, which not only demonstrates the utility of our system but also provides a template for player/non-player character (NPC) interaction in a gaming context. The player alters his/her view of the 3-D world using head pose, looks at paintings/NPCs using eye gaze, and makes an evaluation based on the player's expression and speech. The NPC artist will respond with facial expression and synthetic speech based on its personality. Both qualitative and quantitative evaluations of the system are performed to illustrate the system's effectiveness.

  19. Studies of the Ability to Hold the Eye in Eccentric Gaze: Measurements in Normal Subjects with the Head Erect

    NASA Technical Reports Server (NTRS)

    Reschke, Millard F.; Somers, Jeffrey T.; Feiveson, Alan H.; Leigh, R. John; Wood, Scott J.; Paloski, William H.; Kornilova, Ludmila

    2006-01-01

    We studied the ability to hold the eyes in eccentric horizontal or vertical gaze angles in 68 normal humans, age range 19-56. Subjects attempted to sustain visual fixation of a briefly flashed target located 30° in the horizontal plane and 15° in the vertical plane in a dark environment. Conventionally, the ability to hold eccentric gaze is estimated by fitting centripetal eye drifts with exponential curves and calculating the time constant (t_c) of these slow phases of gaze-evoked nystagmus. Although the distribution of time-constant measurements in our normal subjects was extremely skewed due to occasional test runs that exhibited near-perfect stability (large t_c values), we found that log10(t_c) was approximately normally distributed within classes of target direction. Therefore, statistical estimation and inference on the effect of target direction was performed on values of z ≡ log10(t_c). Subjects showed considerable variation in their eye-drift performance over repeated trials; nonetheless, statistically significant differences emerged: values of t_c were significantly higher for gaze elicited to targets in the horizontal plane than in the vertical plane (P < 10^-5), suggesting eccentric gaze-holding is more stable in the horizontal than in the vertical plane. Furthermore, centrifugal eye drifts were observed in 13.3, 16.0 and 55.6% of cases for horizontal, upgaze and downgaze tests, respectively. Fifth-percentile values of the time constant were estimated to be 10.2 s, 3.3 s and 3.8 s for horizontal, upward and downward gaze, respectively. The difference between horizontal and vertical gaze-holding may be ascribed to separate components of the velocity-to-position neural integrator for eye movements, and to differences in orbital mechanics. 
    Our statistical method for representing the range of normal eccentric gaze stability can be readily applied in a clinical setting to patients who have been exposed to environments that may have modified their central integrators and thus require monitoring. Patients with gaze-evoked nystagmus can be flagged by comparison with the normative criteria established above.
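
    The conventional time-constant estimate described above amounts to a log-linear least-squares fit of the centripetal drift. A minimal sketch follows; the drift samples and noise model are simulated, not the study's recordings.

```python
import math
import random


def fit_time_constant(times, positions):
    """Least-squares fit of ln(position) = ln(E) - t/tc for an exponential
    centripetal drift; returns the time constant tc in seconds."""
    ys = [math.log(p) for p in positions]
    n = len(times)
    mx, my = sum(times) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(times, ys))
             / sum((x - mx) ** 2 for x in times))
    return -1.0 / slope


random.seed(2)
true_tc = 10.2  # s; the fifth-percentile horizontal value reported above
ts = [i * 0.1 for i in range(1, 101)]          # 10 s of samples at 10 Hz
pos = [30.0 * math.exp(-t / true_tc) * math.exp(random.gauss(0, 0.01))
       for t in ts]                            # drift from 30 deg eccentricity

tc = fit_time_constant(ts, pos)
print(round(tc, 1), round(math.log10(tc), 2))  # tc and the z = log10(tc) value
```

    The log10 transform applied afterwards is what makes the skewed distribution of t_c approximately normal across runs, as the abstract notes.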

  20. Real-time computer-based visual feedback improves visual acuity in downbeat nystagmus - a pilot study.

    PubMed

    Teufel, Julian; Bardins, S; Spiegel, Rainer; Kremmyda, O; Schneider, E; Strupp, M; Kalla, R

    2016-01-04

    Patients with downbeat nystagmus syndrome suffer from oscillopsia, which leads to an unstable visual perception and therefore impaired visual acuity. The aim of this study was to use real-time computer-based visual feedback to compensate for the destabilizing slow-phase eye movements. Patients sat in front of a computer screen with the head fixed on a chin rest. Eye movements were recorded by an eye tracking system (EyeSeeCam®). We tested visual acuity with a fixed Landolt C (static condition) and during a real-time feedback-driven condition (dynamic) in gaze straight ahead and in 20° sideward gaze. In the dynamic condition, the Landolt C moved according to the slow-phase eye velocity of the downbeat nystagmus. The Shapiro-Wilk test was used to test for normal distribution and one-way ANOVA for comparisons. Ten patients with downbeat nystagmus were included in the study. Median age was 76 years and the median duration of symptoms was 6.3 years (SD ± 3.1 y). The mean slow-phase velocity was moderate during gaze straight ahead (1.44°/s, SD ± 1.18°/s) and increased significantly in sideward gaze (mean left 3.36°/s; right 3.58°/s). In gaze straight ahead, we found no difference between the static and the feedback-driven condition. In sideward gaze, visual acuity improved in five out of ten subjects during the feedback-driven condition (p = 0.043). This study provides proof of concept that non-invasive real-time computer-based visual feedback compensates for the slow-phase velocity (SPV) in downbeat nystagmus (DBN). Therefore, real-time visual feedback may be a promising aid for patients suffering from oscillopsia and impaired text reading on screen. Recent technological advances in the area of virtual reality displays might soon render this approach feasible in fully mobile settings.
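
    The feedback-driven condition amounts to shifting the optotype in step with the measured slow-phase eye velocity, so that the slow drift no longer moves the image across the retina. A minimal sketch, with the sampling interval assumed and the velocity value taken from the abstract:

```python
def update_target(target_deg, slow_phase_vel_dps, dt):
    """Shift the optotype by the slow-phase displacement over one frame."""
    return target_deg + slow_phase_vel_dps * dt


dt = 1 / 220.0           # assumed frame interval of an EyeSeeCam-class tracker
spv = 3.5                # deg/s, roughly the sideward-gaze slow-phase velocity
eye, target = 0.0, 0.0
retinal_slip = []
for _ in range(220):     # one second of simulated slow phase
    eye += spv * dt                          # the eye drifts
    target = update_target(target, spv, dt)  # the optotype follows
    retinal_slip.append(target - eye)        # residual image motion

print(max(abs(s) for s in retinal_slip))     # 0.0: image stabilized on retina
```

    In practice the quick phases of the nystagmus and measurement noise would limit the compensation, which may be why the benefit appeared mainly in sideward gaze, where the slow-phase velocity was largest.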

  1. Eyes on the Mind: Investigating the Influence of Gaze Dynamics on the Perception of Others in Real-Time Social Interaction

    PubMed Central

    Pfeiffer, Ulrich J.; Schilbach, Leonhard; Jording, Mathis; Timmermans, Bert; Bente, Gary; Vogeley, Kai

    2012-01-01

    Social gaze provides a window into the interests and intentions of others and allows us to actively point out our own. It enables us to engage in triadic interactions involving human actors and physical objects and to build an indispensable basis for coordinated action and collaborative efforts. The object-related aspect of gaze in combination with the fact that any motor act of looking encompasses both input and output of the minds involved makes this non-verbal cue system particularly interesting for research in embodied social cognition. Social gaze comprises several core components, such as gaze-following or gaze aversion. Gaze-following can result in situations of either “joint attention” or “shared attention.” The former describes situations in which the gaze-follower is aware of sharing a joint visual focus with the gazer. The latter refers to a situation in which gazer and gaze-follower focus on the same object and both are aware of their reciprocal awareness of this joint focus. Here, a novel interactive eye-tracking paradigm suited for studying triadic interactions was used to explore two aspects of social gaze. Experiments 1a and 1b assessed how the latency of another person’s gaze reactions (i.e., gaze-following or gaze aversion) affected participants’ sense of agency, which was measured by their experience of relatedness of these reactions. Results demonstrate that both timing and congruency of a gaze reaction as well as the other’s action options influence the sense of agency. Experiment 2 explored differences in gaze dynamics when participants were asked to establish either joint or shared attention. Findings indicate that establishing shared attention takes longer and requires a larger number of gaze shifts as compared to joint attention, which more closely seems to resemble simple visual detection. Taken together, novel insights into the sense of agency and the awareness of others in gaze-based interaction are provided. PMID:23227017

  2. Neurons in the human amygdala encode face identity, but not gaze direction.

    PubMed

    Mormann, Florian; Niediek, Johannes; Tudusciuc, Oana; Quesada, Carlos M; Coenen, Volker A; Elger, Christian E; Adolphs, Ralph

    2015-11-01

    The amygdala is important for face processing, and direction of eye gaze is one of the most socially salient facial signals. Recording from over 200 neurons in the amygdala of neurosurgical patients, we found robust encoding of the identity of neutral-expression faces, but not of their direction of gaze. Processing of gaze direction may rely on a predominantly cortical network rather than the amygdala.

  3. The Brainstem Switch for Gaze Shifts in Humans

    DTIC Science & Technology

    2001-10-25

    THE BRAINSTEM SWITCH FOR GAZE SHIFTS IN HUMANS A. N. Kumar1, R. J. Leigh1,2, S. Ramat3 Department of 1Biomedical Engineering, Case...omnipause neurons during gaze shifts. Using the scleral search coil technique, eye movements were measured in seven normal subjects, as they made...voluntary, disjunctive gaze shifts comprising saccades and vergence movements. Conjugate oscillations of small amplitude and high frequency were identified

  4. Culture, gaze and the neural processing of fear expressions

    PubMed Central

    Franklin, Robert G.; Rule, Nicholas O.; Freeman, Jonathan B.; Kveraga, Kestutis; Hadjikhani, Nouchine; Yoshikawa, Sakiko; Ambady, Nalini

    2010-01-01

    The direction of others’ eye gaze has important influences on how we perceive their emotional expressions. Here, we examined differences in neural activation to direct- versus averted-gaze fear faces as a function of culture of the participant (Japanese versus US Caucasian), culture of the stimulus face (Japanese versus US Caucasian), and the relation between the two. We employed a previously validated paradigm to examine differences in neural activation in response to rapidly presented direct- versus averted-fear expressions, finding clear evidence for a culturally determined role of gaze in the processing of fear. Greater neural responsivity was apparent to averted- versus direct-gaze fear in several regions related to face and emotion processing, including bilateral amygdalae, when posed on same-culture faces, whereas greater response to direct- versus averted-gaze fear was apparent in these same regions when posed on other-culture faces. We also found preliminary evidence for intercultural variation including differential responses across participants to Japanese versus US Caucasian stimuli, and to a lesser degree differences in how Japanese and US Caucasian participants responded to these stimuli. These findings reveal a meaningful role of culture in the processing of eye gaze and emotion, and highlight their interactive influences in neural processing. PMID:20019073

  5. Gaze Estimation Method Using Analysis of Electrooculogram Signals and Kinect Sensor

    PubMed Central

    Tanno, Koichi

    2017-01-01

    A gaze estimation system is one of the communication methods for severely disabled people who cannot perform gestures or speech. We previously developed an eye tracking method using a compact and light electrooculogram (EOG) signal, but its accuracy is not very high. In the present study, we conducted experiments to investigate the EOG component most strongly correlated with changes in eye movement. The experiments were of two types: viewing objects with eye movements only, and viewing objects with both face and eye movements. The experimental results show the feasibility of an eye tracking method using EOG signals and a Kinect sensor. PMID:28912800
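
    One plausible way to combine the two signal sources, sketched under the assumed simplification that horizontal and vertical EOG amplitudes map linearly onto eye-in-head rotation, is to add the Kinect-derived head pose to the EOG-derived eye rotation. The gain constant and function names are hypothetical, not from the paper.

```python
# Assumed per-subject calibration constant: degrees of eye rotation per mV
# of EOG amplitude (a simple linear EOG model, hypothetical value).
EOG_GAIN = 20.0


def gaze_direction(head_yaw_deg, head_pitch_deg, eog_h_mv, eog_v_mv):
    """Combine Kinect head pose with EOG-derived eye-in-head rotation."""
    return (head_yaw_deg + EOG_GAIN * eog_h_mv,
            head_pitch_deg + EOG_GAIN * eog_v_mv)


# Head turned 10 deg right and 5 deg down, eyes rotated further by EOG.
print(gaze_direction(10.0, -5.0, 0.5, -0.2))  # → (20.0, -9.0)
```

    Letting the Kinect carry the large head rotations and the EOG carry only the residual eye-in-head component is one way such a hybrid could offset the limited accuracy of EOG alone.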

  6. Gaze Behavior Consistency among Older and Younger Adults When Looking at Emotional Faces

    PubMed Central

    Chaby, Laurence; Hupont, Isabelle; Avril, Marie; Luherne-du Boullay, Viviane; Chetouani, Mohamed

    2017-01-01

    The identification of non-verbal emotional signals, and especially of facial expressions, is essential for successful social communication among humans. Previous research has reported an age-related decline in facial emotion identification, and argued for socio-emotional or aging-brain model explanations. However, perceptual differences in the gaze strategies that accompany facial emotion processing with advancing age remain under-explored. In this study, 22 young (mean age 22.2 years) and 22 older (mean age 70.4 years) adults were instructed to look at basic facial expressions while their gaze movements were recorded by an eye-tracker. Participants were then asked to identify each emotion, and the unbiased hit rate was applied as the performance measure. Gaze data were first analyzed using traditional measures of fixations over two preferential regions of the face (upper and lower areas) for each emotion. Then, to better capture core gaze changes with advancing age, spatio-temporal gaze behaviors were examined in greater depth using data-driven analysis (dimension reduction, clustering). Results first confirmed that older adults performed worse than younger adults at identifying facial expressions, except for “joy” and “disgust,” and this was accompanied by a gaze preference toward the lower face. Interestingly, this phenomenon was maintained during the whole time course of stimulus presentation. More importantly, trials corresponding to older adults were more tightly clustered, suggesting that the gaze behavior patterns of older adults are more consistent than those of younger adults. This study demonstrates that, confronted with emotional faces, younger and older adults do not prioritize or ignore the same facial areas. Older adults mainly adopted a focused-gaze strategy, consisting of focusing only on the lower part of the face throughout the whole stimulus display time. This consistency may constitute a robust and distinctive “social signature” of emotional identification in aging. Younger adults, however, were more dispersed in terms of gaze behavior and used a more exploratory-gaze strategy, consisting of repeatedly visiting both facial areas. PMID:28450841

  7. Assessing Self-Awareness through Gaze Agency

    PubMed Central

    Crespi, Sofia Allegra; de’Sperati, Claudio

    2016-01-01

    We define gaze agency as the awareness of the causal effect of one’s own eye movements in gaze-contingent environments, which might soon become a widespread reality with the diffusion of gaze-operated devices. Here we propose a method for measuring gaze agency based on self-monitoring propensity and sensitivity. In one task, naïve observers watched bouncing balls on a computer monitor with the goal of discovering the cause of concurrently presented beeps, which were generated in real time by their saccades or by other events (Discovery Task). We manipulated observers’ self-awareness by pre-exposing them to a condition in which beeps depended on gaze direction or by focusing their attention on their own eyes. These manipulations increased the propensity to agency discovery. In a second task, which served to monitor agency sensitivity at the sensori-motor level, observers were explicitly asked to detect gaze agency (Detection Task). Both tasks turned out to be well suited to measuring both increases and decreases of gaze agency. We did not find evident oculomotor correlates of agency discovery or detection. A strength of our approach is that it probes self-monitoring propensity, which is difficult to evaluate with traditional tasks based on bodily agency. In addition to putting a lens on this novel cognitive function, measuring gaze agency could reveal subtle self-awareness deficits in pathological conditions and during development. PMID:27812138

  8. Physiological Correlates of Social Avoidance Behavior in Children and Adolescents with Fragile X Syndrome

    ERIC Educational Resources Information Center

    Hall, Scott S.; Lightbody, Amy A.; Huffman, Lynne C.; Lazzeroni, Laura C.; Reiss, Allan L.

    2009-01-01

    The heart rate and eye-gaze avoidance of 50 boys and girls with fragile X syndrome were monitored, and it was found that those with this condition had significantly higher heart rates, lower vagal tone, and lower heart rate variability estimates when compared with their siblings. Eye-gaze avoidance decreased slightly over the course of the 25-minute…

  9. Eye Gaze During Face Processing in Children and Adolescents with 22q11.2 Deletion Syndrome

    ERIC Educational Resources Information Center

    Glaser, Bronwyn; Debbane, Martin; Ottet, Marie-Christine; Vuilleumier, Patrik; Zesiger, Pascal; Antonarakis, Stylianos E.; Eliez, Stephan

    2010-01-01

    Objective: The 22q11.2 deletion syndrome (22q11DS) is a neurogenetic syndrome with high risk for the development of psychiatric disorder. There is interest in identifying reliable markers for measuring and monitoring socio-emotional impairments in 22q11DS during development. The current study investigated eye gaze as a potential marker during a…

  10. Brief Report: Broad Autism Phenotype in Adults Is Associated with Performance on an Eye-Tracking Measure of Joint Attention

    ERIC Educational Resources Information Center

    Swanson, Meghan R.; Siller, Michael

    2014-01-01

    The current study takes advantage of modern eye-tracking technology and evaluates how individuals allocate their attention when viewing social videos that display an adult model who is gazing at a series of targets that appear and disappear in the four corners of the screen (congruent condition), or gazing elsewhere (incongruent condition). Data…

  11. A software module for implementing auditory and visual feedback on a video-based eye tracking system

    NASA Astrophysics Data System (ADS)

    Rosanlall, Bharat; Gertner, Izidor; Geri, George A.; Arrington, Karl F.

    2016-05-01

    We describe here the design and implementation of a software module that provides both auditory and visual feedback of the eye position measured by a commercially available eye tracking system. The present audio-visual feedback module (AVFM) serves as an extension to the Arrington Research ViewPoint EyeTracker, but it can be easily modified for use with other similar systems. Two modes of audio feedback and one mode of visual feedback are provided in reference to a circular area-of-interest (AOI). Auditory feedback can be either a click tone emitted when the user's gaze point enters or leaves the AOI, or a sinusoidal waveform with frequency inversely proportional to the distance from the gaze point to the center of the AOI. Visual feedback is in the form of a small circular light patch that is presented whenever the gaze-point is within the AOI. The AVFM processes data that are sent to a dynamic-link library by the EyeTracker. The AVFM's multithreaded implementation also allows real-time data collection (1 kHz sampling rate) and graphics processing that allow display of the current/past gaze-points as well as the AOI. The feedback provided by the AVFM described here has applications in military target acquisition and personnel training, as well as in visual experimentation, clinical research, marketing research, and sports training.
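
    The three feedback modes described above can be sketched as a per-sample decision function. The AOI geometry and the tone-frequency mapping constants below are illustrative assumptions, not values from the ViewPoint EyeTracker or the AVFM itself.

```python
import math

AOI_CENTER = (400.0, 300.0)  # px, hypothetical circular area-of-interest
AOI_RADIUS = 60.0            # px
F_MAX, K = 1000.0, 5.0       # assumed tone-frequency mapping parameters


def in_aoi(gaze):
    """True when the gaze point lies inside the circular AOI."""
    return math.hypot(gaze[0] - AOI_CENTER[0],
                      gaze[1] - AOI_CENTER[1]) <= AOI_RADIUS


def tone_hz(gaze):
    """Sinusoid frequency inversely related to distance from the AOI center."""
    d = math.hypot(gaze[0] - AOI_CENTER[0], gaze[1] - AOI_CENTER[1])
    return F_MAX / (1.0 + K * d / AOI_RADIUS)


def feedback(prev_gaze, gaze):
    """Per-sample feedback: click on AOI boundary crossing, light patch
    while inside, and the current sinusoid frequency."""
    events = []
    if in_aoi(gaze) != in_aoi(prev_gaze):
        events.append("click")        # click-tone mode: enter/leave the AOI
    if in_aoi(gaze):
        events.append("light_patch")  # visual mode: patch shown inside AOI
    return events, tone_hz(gaze)


print(feedback((500.0, 300.0), (420.0, 300.0)))  # a sample entering the AOI
```

    In a real system this function would be called once per tracker sample (1 kHz in the module described above), with the events dispatched to the audio and graphics threads.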

  12. Effect of direct eye contact in PTSD related to interpersonal trauma: an fMRI study of activation of an innate alarm system.

    PubMed

    Steuwe, Carolin; Daniels, Judith K; Frewen, Paul A; Densmore, Maria; Pannasch, Sebastian; Beblo, Thomas; Reiss, Jeffrey; Lanius, Ruth A

    2014-01-01

    In healthy individuals, direct eye contact initially leads to activation of a fast subcortical pathway, which then modulates a cortical route eliciting social cognitive processes. The aim of this study was to gain insight into the neurobiological effects of direct eye-to-eye contact using a virtual reality paradigm in individuals with posttraumatic stress disorder (PTSD) related to prolonged childhood abuse. We examined 16 healthy comparison subjects and 16 patients with a primary diagnosis of PTSD using a virtual reality functional magnetic resonance imaging paradigm involving direct vs averted gaze (happy, sad, neutral) as developed by Schrammel et al. in 2009. Irrespective of the displayed emotion, controls exhibited an increased blood oxygenation level-dependent response during direct vs averted gaze within the dorsomedial prefrontal cortex, left temporoparietal junction and right temporal pole. Under the same conditions, individuals with PTSD showed increased activation within the superior colliculus (SC)/periaqueductal gray (PAG) and locus coeruleus. Our findings suggest that healthy controls react to the exposure of direct gaze with an activation of a cortical route that enhances evaluative 'top-down' processes underlying social interactions. In individuals with PTSD, however, direct gaze leads to sustained activation of a subcortical route of eye-contact processing, an innate alarm system involving the SC and the underlying circuits of the PAG.

  13. Effect of direct eye contact in PTSD related to interpersonal trauma: an fMRI study of activation of an innate alarm system

    PubMed Central

    Steuwe, Carolin; Daniels, Judith K.; Frewen, Paul A.; Densmore, Maria; Pannasch, Sebastian; Beblo, Thomas; Reiss, Jeffrey; Lanius, Ruth A.

    2014-01-01

    In healthy individuals, direct eye contact initially leads to activation of a fast subcortical pathway, which then modulates a cortical route eliciting social cognitive processes. The aim of this study was to gain insight into the neurobiological effects of direct eye-to-eye contact using a virtual reality paradigm in individuals with posttraumatic stress disorder (PTSD) related to prolonged childhood abuse. We examined 16 healthy comparison subjects and 16 patients with a primary diagnosis of PTSD using a virtual reality functional magnetic resonance imaging paradigm involving direct vs averted gaze (happy, sad, neutral) as developed by Schrammel et al. in 2009. Irrespective of the displayed emotion, controls exhibited an increased blood oxygenation level-dependent response during direct vs averted gaze within the dorsomedial prefrontal cortex, left temporoparietal junction and right temporal pole. Under the same conditions, individuals with PTSD showed increased activation within the superior colliculus (SC)/periaqueductal gray (PAG) and locus coeruleus. Our findings suggest that healthy controls react to the exposure of direct gaze with an activation of a cortical route that enhances evaluative ‘top–down’ processes underlying social interactions. In individuals with PTSD, however, direct gaze leads to sustained activation of a subcortical route of eye-contact processing, an innate alarm system involving the SC and the underlying circuits of the PAG. PMID:22977200

  14. Deficient gaze pattern during virtual multiparty conversation in patients with schizophrenia.

    PubMed

    Han, Kiwan; Shin, Jungeun; Yoon, Sang Young; Jang, Dong-Pyo; Kim, Jae-Jin

    2014-06-01

    Virtual reality has been used to measure abnormal social characteristics, particularly in one-to-one situations. In real life, however, conversations with multiple companions are common and more complicated than two-party conversations. In this study, we explored the features of social behaviors in patients with schizophrenia during virtual multiparty conversations. Twenty-three patients with schizophrenia and 22 healthy controls performed the virtual three-party conversation task, which included leading and aiding avatars, positive- and negative-emotion-laden situations, and listening and speaking phases. Patients showed a significant negative correlation in the listening phase between the amount of gaze on the between-avatar space and reasoning ability, and demonstrated increased gaze on the between-avatar space in the speaking phase that was uncorrelated with attentional ability. These results suggest that patients with schizophrenia have active avoidance of eye contact during three-party conversations. Virtual reality may provide a useful way to measure abnormal social characteristics during multiparty conversations in schizophrenia. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. The EyeHarp: A Gaze-Controlled Digital Musical Instrument

    PubMed Central

    Vamvakousis, Zacharias; Ramirez, Rafael

    2016-01-01

We present and evaluate the EyeHarp, a new gaze-controlled Digital Musical Instrument, which aims to enable people with severe motor disabilities to learn, perform, and compose music using only their gaze as the control mechanism. It consists of (1) a step-sequencer layer, which serves for constructing chords/arpeggios, and (2) a melody layer, for playing melodies and changing the chords/arpeggios. We conducted a pilot evaluation of the EyeHarp involving 39 participants with no disabilities, from both a performer and an audience perspective. In the first case, eight people with normal vision and no motor disability participated in a music-playing session in which both quantitative and qualitative data were collected. In the second case, 31 people qualitatively evaluated the EyeHarp in a concert setting consisting of two parts: a solo performance part and an ensemble (EyeHarp, two guitars, and flute) performance part. The obtained results indicate that, like traditional musical instruments, the proposed digital musical instrument has a steep learning curve and allows the performer to produce expressive performances, as judged from both the performer and audience perspectives. PMID:27445885

  16. The German Version of the Gaze Anxiety Rating Scale (GARS): Reliability and Validity

    PubMed Central

    Domes, Gregor; Marx, Lisa; Spenthof, Ines; Heinrichs, Markus

    2016-01-01

Objective: Fear of eye gaze and avoidance of eye contact are core features of social anxiety disorders (SAD). To measure self-reported fear and avoidance of eye gaze, the Gaze Anxiety Rating Scale (GARS) has been developed and validated in recent years in its English version. The main objectives of the present study were to psychometrically evaluate the German translation of the GARS concerning its reliability, factorial structure, and validity. Methods: Three samples of participants were enrolled in the study. (1) A non-patient sample (n = 353) completed the GARS and a set of trait questionnaires to assess internal consistency, test-retest reliability, factorial structure, and concurrent and divergent validity. (2) A sample of patients with SAD (n = 33) was compared to a healthy control group (n = 30) regarding their scores on the GARS and the trait measures. Results: The German GARS fear and avoidance scales exhibited excellent internal consistency and high stability over 2 and 4 months, as did the original version. The English version’s factorial structure was replicated, yielding two categories of situations: (1) everyday situations and (2) situations involving high evaluative threat. GARS fear and avoidance displayed convergent validity with trait measures of social anxiety and were markedly higher in patients with GSAD than in healthy controls. Fear and avoidance of eye contact in situations involving high levels of evaluative threat related more closely to social anxiety than to gaze anxiety in everyday situations. Conclusions: The German version of the GARS has demonstrated reliability and validity similar to the original version, and is thus well suited to capture fear and avoidance of eye contact in different social situations as a valid self-report measure of social anxiety and related disorders in the social domain for use in both clinical practice and research. PMID:26937638

  17. The German Version of the Gaze Anxiety Rating Scale (GARS): Reliability and Validity.

    PubMed

    Domes, Gregor; Marx, Lisa; Spenthof, Ines; Heinrichs, Markus

    2016-01-01

    Fear of eye gaze and avoidance of eye contact are core features of social anxiety disorders (SAD). To measure self-reported fear and avoidance of eye gaze, the Gaze Anxiety Rating Scale (GARS) has been developed and validated in recent years in its English version. The main objectives of the present study were to psychometrically evaluate the German translation of the GARS concerning its reliability, factorial structure, and validity. Three samples of participants were enrolled in the study. (1) A non-patient sample (n = 353) completed the GARS and a set of trait questionnaires to assess internal consistency, test-retest reliability, factorial structure, and concurrent and divergent validity. (2) A sample of patients with SAD (n = 33) was compared to a healthy control group (n = 30) regarding their scores on the GARS and the trait measures. The German GARS fear and avoidance scales exhibited excellent internal consistency and high stability over 2 and 4 months, as did the original version. The English version's factorial structure was replicated, yielding two categories of situations: (1) everyday situations and (2) situations involving high evaluative threat. GARS fear and avoidance displayed convergent validity with trait measures of social anxiety and were markedly higher in patients with GSAD than in healthy controls. Fear and avoidance of eye contact in situations involving high levels of evaluative threat related more closely to social anxiety than to gaze anxiety in everyday situations. The German version of the GARS has demonstrated reliability and validity similar to the original version, and is thus well suited to capture fear and avoidance of eye contact in different social situations as a valid self-report measure of social anxiety and related disorders in the social domain for use in both clinical practice and research.

  18. A geometric method for computing ocular kinematics and classifying gaze events using monocular remote eye tracking in a robotic environment.

    PubMed

    Singh, Tarkeshwar; Perry, Christopher M; Herter, Troy M

    2016-01-26

Robotic and virtual-reality systems offer tremendous potential for improving assessment and rehabilitation of neurological disorders affecting the upper extremity. A key feature of these systems is that visual stimuli are often presented within the same workspace as the hands (i.e., peripersonal space). Integrating video-based remote eye tracking with robotic and virtual-reality systems can provide an additional tool for investigating how cognitive processes influence visuomotor learning and rehabilitation of the upper extremity. However, remote eye tracking systems typically compute ocular kinematics by assuming eye movements are made in a plane with constant depth (e.g. frontal plane). When visual stimuli are presented at variable depths (e.g. transverse plane), eye movements have a vergence component that may influence reliable detection of gaze events (fixations, smooth pursuits and saccades). To our knowledge, there are no available methods to classify gaze events in the transverse plane for monocular remote eye tracking systems. Here we present a geometrical method to compute ocular kinematics from a monocular remote eye tracking system when visual stimuli are presented in the transverse plane. We then use the obtained kinematics to compute velocity-based thresholds that allow us to accurately identify onsets and offsets of fixations, saccades and smooth pursuits. Finally, we validate our algorithm by comparing the gaze events computed by the algorithm with those obtained from the eye-tracking software and manual digitization. Within the transverse plane, our algorithm reliably differentiates saccades from fixations (static visual stimuli) and smooth pursuits from saccades and fixations when visual stimuli are dynamic. The proposed methods provide advancements for examining eye movements in robotic and virtual-reality systems. Our methods can also be used with other video-based or tablet-based systems in which eye movements are performed in a peripersonal plane with variable depth.
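The velocity-threshold classification step described above can be sketched as a simple labeling pass over angular-velocity samples. The threshold values and the three-way split below are illustrative assumptions, not the thresholds computed by the paper's geometric method.

```python
def classify_gaze_samples(velocities, saccade_thresh=100.0, pursuit_thresh=20.0):
    """Label each angular-velocity sample (deg/s) as a saccade, smooth pursuit,
    or fixation using two velocity thresholds.

    The thresholds here are placeholder values; in practice they would be
    derived from the ocular kinematics, as the paper's method does.
    """
    labels = []
    for v in velocities:
        if v >= saccade_thresh:
            labels.append("saccade")
        elif v >= pursuit_thresh:
            labels.append("pursuit")
        else:
            labels.append("fixation")
    return labels
```

Onsets and offsets of each gaze event then fall out of the label sequence: an event starts where the label changes and ends where it changes again.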

  19. Videos of conspecifics elicit interactive looking patterns and facial expressions in monkeys

    PubMed Central

    Mosher, Clayton P.; Zimmerman, Prisca E.; Gothard, Katalin M.

    2014-01-01

A broader understanding of the neural basis of social behavior in primates requires the use of species-specific stimuli that elicit spontaneous, but reproducible and tractable behaviors. In this context of natural behaviors, individual variation can further inform about the factors that influence social interactions. To approximate natural social interactions similar to those documented by field studies, we used unedited video footage to elicit spontaneous facial expressions and looking patterns from viewer monkeys in a laboratory setting. Three adult male monkeys, previously behaviorally and genetically (5-HTTLPR) characterized (Gibboni et al., 2009), were monitored while they watched 10 s video segments depicting unfamiliar monkeys (movie monkeys) displaying affiliative, neutral, and aggressive behaviors. The gaze and head orientation of the movie monkeys alternated between ‘averted’ and ‘directed’ at the viewer. The viewers were not reinforced for watching the movies, thus their looking patterns indicated their interest and social engagement with the stimuli. The behavior of the movie monkey accounted for differences in the looking patterns and facial expressions displayed by the viewers. We also found multiple significant differences in the behavior of the viewers that correlated with their interest in these stimuli. These socially relevant dynamic stimuli elicited spontaneous social behaviors, such as eye-contact-induced reciprocation of facial expression, gaze aversion, and gaze following, that were previously not observed in response to static images. This approach opens a unique opportunity for understanding the mechanisms that trigger spontaneous social behaviors in humans and non-human primates. PMID:21688888

  20. Responding to Other People's Direct Gaze: Alterations in Gaze Behavior in Infants at Risk for Autism Occur on Very Short Timescales

    ERIC Educational Resources Information Center

    Nyström, Pär; Bölte, Sven; Falck-Ytter, Terje; Achermann, Sheila; Andersson Konke, Linn; Brocki, Karin; Cauvet, Elodie; Gredebäck, Gustaf; Lundin Kleberg, Johan; Nilsson Jobs, Elisabeth; Thorup, Emilia; Zander, Eric

    2017-01-01

    Atypical gaze processing has been reported in children with autism spectrum disorders (ASD). Here we explored how infants at risk for ASD respond behaviorally to others' direct gaze. We assessed 10-month-olds with a sibling with ASD (high risk group; n = 61) and a control group (n = 18) during interaction with an adult. Eye-tracking revealed less…

  1. Auditory, Vestibular and Cognitive Effects due to Repeated Blast Exposure on the Warfighter

    DTIC Science & Technology

    2012-10-01

Gaze Horizontal (Left and Right) Description: The primary purpose of the Gaze Horizontal subtest was to detect nystagmus when the head is fixed and the eyes are gazing off center from the primary (straight ahead) gaze position. This test is designed...physiological target area and examiner instructions for testing): Spontaneous Nystagmus Smooth Harmonic Acceleration (.01, .08, .32, .64, 1.75

  2. Toddler learning from video: Effect of matched pedagogical cues.

    PubMed

    Lauricella, Alexis R; Barr, Rachel; Calvert, Sandra L

    2016-11-01

    Toddlers learn about their social world by following visual and verbal cues from adults, but they have difficulty transferring what they see in one context to another (e.g., from a screen to real life). Therefore, it is important to understand how the use of matched pedagogical cues, specifically adult eye gaze and language, influence toddlers' imitation from live and digital presentations. Fifteen- and 18-month-old toddlers (N=123) were randomly assigned to one of four experimental conditions or a baseline control condition. The four experimental conditions differed as a function of the interactive cues (audience gaze with interactive language or object gaze with non-interactive language) and presentation type (live or video). Results indicate that toddlers' successfully imitate a task when eye gaze was directed at the object or at the audience and equally well when the task was demonstrated live or via video. All four experimental conditions performed significantly better than the baseline control, indicating learned behavior. Additionally, results demonstrate that girls attended more to the demonstrations and outperformed the boys on the imitation task. In sum, this study demonstrates that young toddlers can learn from video when the models use matched eye gaze and verbal cues, providing additional evidence for ways in which the transfer deficit effect can be ameliorated. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Varix of the vortex vein ampulla simulating choroidal melanoma: report of four cases.

    PubMed

    Gündüz, K; Shields, C L; Shields, J A

    1998-01-01

    Varix of the vortex vein ampulla is a condition that can cause diagnostic confusion with choroidal melanoma. A case series review was performed from the Ocular Oncology Service, Wills Eye Hospital. In all four cases, the patients were referred with the diagnosis of a small choroidal melanoma. The lesions were located in the nasal quadrant of the fundus near the equator. One patient had two lesions in the same quadrant. In all cases, the fundus lesion became more prominent when the eye gazed in the direction of the lesion and diminished in primary gaze. The mass measured up to 6.0 mm in base diameter and 2.5 mm in thickness in proper gaze. B-scan ultrasonography showed acoustic solidity and gaze-evoked dynamic enlargement of the lesion. Indocyanine green angiography demonstrated early pooling of dye and gaze-evoked fluctuation of the hyperfluorescence in the lesion. Color Doppler imaging, performed in one patient, showed a vascular lesion of venous origin that filled when the eye was placed in the direction of the lesion. Varix of the vortex vein is a condition that should be considered in the differential diagnosis of equatorial small choroidal melanoma. The dynamic nature of the lesion is characteristic and diagnostic.

  4. Hierarchical control of two-dimensional gaze saccades

    PubMed Central

    Optican, Lance M.; Blohm, Gunnar; Lefèvre, Philippe

    2014-01-01

    Coordinating the movements of different body parts is a challenging process for the central nervous system because of several problems. Four of these main difficulties are: first, moving one part can move others; second, the parts can have different dynamics; third, some parts can have different motor goals; and fourth, some parts may be perturbed by outside forces. Here, we propose a novel approach for the control of linked systems with feedback loops for each part. The proximal parts have separate goals, but critically the most distal part has only the common goal. We apply this new control policy to eye-head coordination in two-dimensions, specifically head-unrestrained gaze saccades. Paradoxically, the hierarchical structure has controllers for the gaze and the head, but not for the eye (the most distal part). Our simulations demonstrate that the proposed control structure reproduces much of the published empirical data about gaze movements, e.g., it compensates for perturbations, accurately reaches goals for gaze and head from arbitrary initial positions, simulates the nine relationships of the head-unrestrained main sequence, and reproduces observations from lesion and single-unit recording experiments. We conclude by showing how our model can be easily extended to control structures with more linked segments, such as the control of coordinated eye on head on trunk movements. PMID:24062206

  5. Remote gaze tracking system on a large display.

    PubMed

    Lee, Hyeon Chang; Lee, Won Oh; Cho, Chul Woo; Gwon, Su Yeong; Park, Kang Ryoung; Lee, Heekyung; Cha, Jihun

    2013-10-07

    We propose a new remote gaze tracking system as an intelligent TV interface. Our research is novel in the following three ways: first, because a user can sit at various positions in front of a large display, the capture volume of the gaze tracking system should be greater, so the proposed system includes two cameras which can be moved simultaneously by panning and tilting mechanisms, a wide view camera (WVC) for detecting eye position and an auto-focusing narrow view camera (NVC) for capturing enlarged eye images. Second, in order to remove the complicated calibration between the WVC and NVC and to enhance the capture speed of the NVC, these two cameras are combined in a parallel structure. Third, the auto-focusing of the NVC is achieved on the basis of both the user's facial width in the WVC image and a focus score calculated on the eye image of the NVC. Experimental results showed that the proposed system can be operated with a gaze tracking accuracy of ±0.737°~±0.775° and a speed of 5~10 frames/s.

  6. Remote Gaze Tracking System on a Large Display

    PubMed Central

    Lee, Hyeon Chang; Lee, Won Oh; Cho, Chul Woo; Gwon, Su Yeong; Park, Kang Ryoung; Lee, Heekyung; Cha, Jihun

    2013-01-01

    We propose a new remote gaze tracking system as an intelligent TV interface. Our research is novel in the following three ways: first, because a user can sit at various positions in front of a large display, the capture volume of the gaze tracking system should be greater, so the proposed system includes two cameras which can be moved simultaneously by panning and tilting mechanisms, a wide view camera (WVC) for detecting eye position and an auto-focusing narrow view camera (NVC) for capturing enlarged eye images. Second, in order to remove the complicated calibration between the WVC and NVC and to enhance the capture speed of the NVC, these two cameras are combined in a parallel structure. Third, the auto-focusing of the NVC is achieved on the basis of both the user's facial width in the WVC image and a focus score calculated on the eye image of the NVC. Experimental results showed that the proposed system can be operated with a gaze tracking accuracy of ±0.737°∼±0.775° and a speed of 5∼10 frames/s. PMID:24105351

  7. Seductive eyes: attractiveness and direct gaze increase desire for associated objects.

    PubMed

    Strick, Madelijn; Holland, Rob W; van Knippenberg, Ad

    2008-03-01

    Recent research in neuroscience shows that observing attractive faces with direct gaze is more rewarding than observing attractive faces with averted gaze. On the basis of this research, it was hypothesized that object evaluations can be enhanced by associating them with attractive faces displaying direct gaze. In a conditioning paradigm, novel objects were associated with either attractive or unattractive female faces, either displaying direct or averted gaze. An affective priming task showed more positive automatic evaluations of objects that were paired with attractive faces with direct gaze than attractive faces with averted gaze and unattractive faces, irrespective of gaze direction. Participants' self-reported desire for the objects matched the affective priming data. The results are discussed against the background of recent findings on affective consequences of gaze cueing.

  8. Surface coverage with single vs. multiple gaze surface topography to fit scleral lenses.

    PubMed

    DeNaeyer, Gregory; Sanders, Donald R; Farajian, Timothy S

    2017-06-01

To determine surface coverage of measurements using the sMap3D® corneo-scleral topographer in patients presenting for scleral lens fitting. Twenty-five eyes of 23 scleral lens patients were examined. Up-gaze, straight-gaze, and down-gaze positions of each eye were "stitched" into a single map. The percentage surface coverage between 10mm and 20mm diameter circles from corneal center was compared between the straight-gaze and stitched images. Scleral toricity magnitude was calculated at 100% coverage and at the same diameter after 50% of the data was removed. At a 10mm diameter from corneal center, the straight-gaze and stitched images both had 100% coverage. At the 14, 15, 16, 18 and 20mm diameters, the straight-gaze image only covered 68%, 53%, 39%, 18%, and 6% of the ocular surface diameters while the stitched image covered 98%, 96%, 93%, 75%, and 32%, respectively. In the case showing the most scleral coverage at 16mm (straight-gaze), there was only 75% coverage (straight-gaze) compared to 100% (stitched image); the case with the least coverage had 7% (straight gaze) and 92% (stitched image). The 95% limits of agreement between the 50% and 100% coverage scleral toricity was between -1.4D (50% coverage value larger) and 1.2D (100% coverage larger), a 2.6D spread. The absolute difference between 50% and 100% coverage scleral toricity was ≥0.50D in 28% and ≥1.0D in 16% of cases. It appears that a single straight-gaze image would introduce significant measurement inaccuracy in fitting scleral lenses using the sMap3D while a 3-gaze stitched image would not. Copyright © 2017 British Contact Lens Association. Published by Elsevier Ltd. All rights reserved.

  9. Discriminating between intentional and unintentional gaze fixation using multimodal-based fuzzy logic algorithm for gaze tracking system with NIR camera sensor

    NASA Astrophysics Data System (ADS)

    Naqvi, Rizwan Ali; Park, Kang Ryoung

    2016-06-01

    Gaze tracking systems are widely used in human-computer interfaces, interfaces for the disabled, game interfaces, and for controlling home appliances. Most studies on gaze detection have focused on enhancing its accuracy, whereas few have considered the discrimination of intentional gaze fixation (looking at a target to activate or select it) from unintentional fixation while using gaze detection systems. Previous research methods based on the use of a keyboard or mouse button, eye blinking, and the dwell time of gaze position have various limitations. Therefore, we propose a method for discriminating between intentional and unintentional gaze fixation using a multimodal fuzzy logic algorithm applied to a gaze tracking system with a near-infrared camera sensor. Experimental results show that the proposed method outperforms the conventional method for determining gaze fixation.
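A fuzzy-logic discrimination step of the kind described above can be sketched with triangular membership functions over fixation features. Everything below is an illustrative assumption rather than the paper's actual design: the choice of dwell time and gaze dispersion as inputs, the membership shapes and breakpoints, and the min-rule combination.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)


def intentional_score(dwell_ms, dispersion_deg):
    """Fuzzy score in [0, 1] that a gaze fixation is intentional.

    Combines a "long dwell" antecedent with a "spatially stable" antecedent
    using min as the fuzzy AND. Breakpoints are illustrative placeholders.
    """
    long_dwell = tri(dwell_ms, 300.0, 800.0, 1300.0)
    stable = tri(dispersion_deg, -1.0, 0.0, 1.0)
    return min(long_dwell, stable)


def is_intentional(dwell_ms, dispersion_deg, cutoff=0.5):
    """Defuzzify by thresholding the fuzzy score."""
    return intentional_score(dwell_ms, dispersion_deg) >= cutoff
```

Compared with a hard dwell-time cutoff, the fuzzy score degrades gracefully near the boundaries, which is the usual motivation for fuzzy rules in this setting.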

  10. The Influences of Face Inversion and Facial Expression on Sensitivity to Eye Contact in High-Functioning Adults with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Vida, Mark D.; Maurer, Daphne; Calder, Andrew J.; Rhodes, Gillian; Walsh, Jennifer A.; Pachai, Matthew V.; Rutherford, M. D.

    2013-01-01

    We examined the influences of face inversion and facial expression on sensitivity to eye contact in high-functioning adults with and without an autism spectrum disorder (ASD). Participants judged the direction of gaze of angry, fearful, and neutral faces. In the typical group only, the range of directions of gaze leading to the perception of eye…

  11. Gaze-contingent control for minimally invasive robotic surgery.

    PubMed

    Mylonas, George P; Darzi, Ara; Yang, Guang Zhong

    2006-09-01

    Recovering tissue depth and deformation during robotically assisted minimally invasive procedures is an important step towards motion compensation, stabilization and co-registration with preoperative data. This work demonstrates that eye gaze derived from binocular eye tracking can be effectively used to recover 3D motion and deformation of the soft tissue. A binocular eye-tracking device was integrated into the stereoscopic surgical console. After calibration, the 3D fixation point of the participating subjects could be accurately resolved in real time. A CT-scanned phantom heart model was used to demonstrate the accuracy of gaze-contingent depth extraction and motion stabilization of the soft tissue. The dynamic response of the oculomotor system was assessed with the proposed framework by using autoregressive modeling techniques. In vivo data were also used to perform gaze-contingent decoupling of cardiac and respiratory motion. Depth reconstruction, deformation tracking, and motion stabilization of the soft tissue were possible with binocular eye tracking. The dynamic response of the oculomotor system was able to cope with frequencies likely to occur under most routine minimally invasive surgical operations. The proposed framework presents a novel approach towards the tight integration of a human and a surgical robot where interaction in response to sensing is required to be under the control of the operating surgeon.

  12. The role of uncertainty and reward on eye movements in a virtual driving task

    PubMed Central

    Sullivan, Brian T.; Johnson, Leif; Rothkopf, Constantin A.; Ballard, Dana; Hayhoe, Mary

    2012-01-01

    Eye movements during natural tasks are well coordinated with ongoing task demands and many variables could influence gaze strategies. Sprague and Ballard (2003) proposed a gaze-scheduling model that uses a utility-weighted uncertainty metric to prioritize fixations on task-relevant objects and predicted that human gaze should be influenced by both reward structure and task-relevant uncertainties. To test this conjecture, we tracked the eye movements of participants in a simulated driving task where uncertainty and implicit reward (via task priority) were varied. Participants were instructed to simultaneously perform a Follow Task where they followed a lead car at a specific distance and a Speed Task where they drove at an exact speed. We varied implicit reward by instructing the participants to emphasize one task over the other and varied uncertainty in the Speed Task with the presence or absence of uniform noise added to the car's velocity. Subjects' gaze data were classified for the image content near fixation and segmented into looks. Gaze measures, including look proportion, duration and interlook interval, showed that drivers more closely monitor the speedometer if it had a high level of uncertainty, but only if it was also associated with high task priority or implicit reward. The interaction observed appears to be an example of a simple mechanism whereby the reduction of visual uncertainty is gated by behavioral relevance. This lends qualitative support for the primary variables controlling gaze allocation proposed in the Sprague and Ballard model. PMID:23262151
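The gaze-scheduling idea tested above (fixations prioritized by utility-weighted uncertainty) can be caricatured in a few lines. This is a deliberate simplification of the Sprague and Ballard model, and the product scoring rule and dict keys are illustrative assumptions.

```python
def next_fixation(tasks):
    """Return the task whose priority-weighted uncertainty is largest.

    Each task is a dict with "name", "priority" (implicit reward), and
    "uncertainty"; the winner is where gaze should go next under a
    utility-weighted-uncertainty arbitration rule.
    """
    return max(tasks, key=lambda t: t["priority"] * t["uncertainty"])
```

Under this rule, noise added to the lead car's velocity raises the Speed Task's uncertainty, but it only wins fixations when its priority (implicit reward) is also high, matching the gating the study observed.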

  13. Gaze perception in social anxiety and social anxiety disorder

    PubMed Central

    Schulze, Lars; Renneberg, Babette; Lobmaier, Janek S.

    2013-01-01

Clinical observations suggest abnormal gaze perception to be an important indicator of social anxiety disorder (SAD). Experimental research has so far paid relatively little attention to gaze perception in SAD. In this article we first discuss gaze perception in healthy human beings before reviewing self-referential and threat-related biases of gaze perception in clinical and non-clinical socially anxious samples. Relative to controls, socially anxious individuals exhibit an enhanced self-directed perception of gaze directions and demonstrate a pronounced fear of direct eye contact, though findings are less consistent regarding the avoidance of mutual gaze in SAD. Prospects for future research and clinical implications are discussed. PMID:24379776

  14. Social and Non-Social Cueing of Visuospatial Attention in Autism and Typical Development

    ERIC Educational Resources Information Center

    Pruett, John R.; LaMacchia, Angela; Hoertel, Sarah; Squire, Emma; McVey, Kelly; Todd, Richard D.; Constantino, John N.; Petersen, Steven E.

    2011-01-01

    Three experiments explored attention to eye gaze, which is incompletely understood in typical development and is hypothesized to be disrupted in autism. Experiment 1 (n = 26 typical adults) involved covert orienting to box, arrow, and gaze cues at two probabilities and cue-target times to test whether reorienting for gaze is endogenous, exogenous,…

  15. Direct Gaze Modulates Face Recognition in Young Infants

    ERIC Educational Resources Information Center

    Farroni, Teresa; Massaccesi, Stefano; Menon, Enrica; Johnson, Mark H.

    2007-01-01

    From birth, infants prefer to look at faces that engage them in direct eye contact. In adults, direct gaze is known to modulate the processing of faces, including the recognition of individuals. In the present study, we investigate whether direction of gaze has any effect on face recognition in four-month-old infants. Four-month infants were shown…

  16. Improving data retention in EEG research with children using child-centered eye tracking

    PubMed Central

    Maguire, Mandy J.; Magnon, Grant; Fitzhugh, Anna E.

    2014-01-01

Background: Event Related Potentials (ERPs) elicited by visual stimuli have increased our understanding of developmental disorders and adult cognitive abilities for decades; however, these studies are very difficult with populations who cannot sustain visual attention such as infants and young children. Current methods for studying such populations include requiring a button response, which may be impossible for some participants, and experimenter monitoring, which is subject to error, highly variable, and spatially imprecise. New Method: We developed a child-centered methodology to integrate EEG data acquisition and eye-tracking technologies that uses “attention-getters” in which stimulus display is contingent upon the child’s gaze. The goal was to increase the number of trials retained. Additionally, we used the eye-tracker to categorize and analyze the EEG data based on gaze to specific areas of the visual display, compared to analyzing based on stimulus presentation. Results Compared with Existing Methods: The number of trials retained was substantially improved using the child-centered methodology compared to a button-press response in 7–8 year olds. In contrast, analyzing the EEG based on eye gaze to specific points within the visual display as opposed to stimulus presentation provided too few trials for reliable interpretation. Conclusions: By using the linked EEG-eye-tracker we significantly increased data retention. With this method, studies can be completed with fewer participants and a wider range of populations. However, caution should be used when epoching based on participants’ eye gaze because, in this case, this technique provided substantially fewer trials. PMID:25251555

  17. Gaze direction differentially affects avoidance tendencies to happy and angry faces in socially anxious individuals.

    PubMed

    Roelofs, Karin; Putman, Peter; Schouten, Sonja; Lange, Wolf-Gero; Volman, Inge; Rinck, Mike

    2010-04-01

    Increasing evidence indicates that eye gaze direction affects the processing of emotional faces in anxious individuals. However, the effects of eye gaze direction on the behavioral responses elicited by emotional faces, such as avoidance behavior, remain largely unexplored. We administered an Approach-Avoidance Task (AAT) in high (HSA) and low socially anxious (LSA) individuals. All participants responded to photographs of angry, happy and neutral faces (presented with direct and averted gaze) by either pushing a joystick away from them (avoidance) or pulling it towards them (approach). Compared to LSA, HSA were faster in avoiding than approaching angry faces. Most crucially, this avoidance tendency was only present when the perceived anger was directed towards the subject (direct gaze) and not when the gaze of the face-stimulus was averted. In contrast, HSA individuals tended to avoid happy faces irrespective of gaze direction. Neutral faces elicited no approach-avoidance tendencies. Thus, avoidance of angry faces in social anxiety as measured by AA-tasks reflects avoidance of subject-directed anger and not of negative stimuli in general. In addition, although both anger and joy are considered to reflect approach-related emotions, gaze direction did not affect HSA's avoidance of happy faces, suggesting differential mechanisms affecting responses to happy and angry faces in social anxiety. 2009 Elsevier Ltd. All rights reserved.

  18. Temporal dynamics underlying the modulation of social status on social attention.

    PubMed

    Dalmaso, Mario; Galfano, Giovanni; Coricelli, Carol; Castelli, Luigi

    2014-01-01

    Fixating someone who suddenly moves their eyes is known to trigger a corresponding shift of attention in the observer. This phenomenon, known as the gaze-cueing effect, can be modulated as a function of the social status of the individual depicted in the cueing face. Here, in two experiments, we investigated the temporal dynamics underlying this modulation. To this end, a gaze-cueing paradigm was implemented in which centrally-placed faces depicting high- and low-status individuals suddenly shifted the eyes towards a location either spatially congruent or incongruent with that occupied by a subsequent target stimulus. Social status was manipulated by presenting fictitious Curricula Vitae before the experimental phase. In Experiment 1, in which two temporal intervals (50 ms vs. 900 ms) occurred between the direct-gaze face and the averted-gaze face onsets, a stronger gaze-cueing effect in response to high-status faces than low-status faces was observed, irrespective of the time participants were allowed for extracting social information. In Experiment 2, in which two temporal intervals (200 ms vs. 1000 ms) occurred between the averted-gaze face and target onset, a stronger gaze-cueing effect for high-status faces was observed at the shorter interval only. Taken together, these results suggest that information regarding social status is extracted from faces rapidly (Experiment 1), and that the tendency to selectively attend to the locations gazed at by high-status individuals may decay with time (Experiment 2).
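    The measure behind these results reduces to a simple reaction-time contrast. As a hedged sketch (the function name and inputs are illustrative; a real analysis would first discard errors and RT outliers and compute the contrast per condition cell):

```python
import numpy as np

def gaze_cueing_effect(rts_ms, congruent):
    """Gaze-cueing effect: mean RT on trials where the target appeared
    opposite the gazed-at location (incongruent) minus mean RT on
    trials where it appeared at the gazed-at location (congruent).
    A positive value means attention shifted with the seen gaze."""
    rts = np.asarray(rts_ms, dtype=float)
    congruent = np.asarray(congruent, dtype=bool)
    return rts[~congruent].mean() - rts[congruent].mean()
```

    In a design like the one above, this difference would be computed separately for each cell (high vs. low status crossed with short vs. long interval); a "stronger gaze-cueing effect for high-status faces" corresponds to a larger value of this contrast in the high-status cells.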

  19. Eye Movements in Darkness Modulate Self-Motion Perception.

    PubMed

    Clemens, Ivar Adrianus H; Selen, Luc P J; Pomante, Antonella; MacNeilage, Paul R; Medendorp, W Pieter

    2017-01-01

    During self-motion, humans typically move the eyes to maintain fixation on the stationary environment around them. These eye movements could in principle be used to estimate self-motion, but their impact on perception is unknown. We had participants judge self-motion during different eye-movement conditions in the absence of full-field optic flow. In a two-alternative forced choice task, participants indicated whether the second of two successive passive lateral whole-body translations was longer or shorter than the first. This task was used in two experiments. In the first (n = 8), eye movements were constrained differently in the two translation intervals by presenting either a world-fixed or body-fixed fixation point or no fixation point at all (allowing free gaze). Results show that perceived translations were shorter with a body-fixed than a world-fixed fixation point. A linear model indicated that eye-movement signals received a weight of ∼25% for the self-motion percept. This model was independently validated in the trials without a fixation point (free gaze). In the second experiment (n = 10), gaze was free during both translation intervals. Results show that the translation with the larger eye-movement excursion was judged to be larger more often than predicted by chance, based on an oculomotor choice probability analysis. We conclude that eye-movement signals influence self-motion perception, even in the absence of visual stimulation.

  1. Psychomotor control in a virtual laparoscopic surgery training environment: gaze control parameters differentiate novices from experts.

    PubMed

    Wilson, Mark; McGrath, John; Vine, Samuel; Brewer, James; Defriend, David; Masters, Richard

    2010-10-01

    Surgical simulation is increasingly used to facilitate the adoption of technical skills during surgical training. This study sought to determine if gaze control parameters could differentiate between the visual control of experienced and novice operators performing an eye-hand coordination task on a virtual reality laparoscopic surgical simulator (LAP Mentor™). Typically adopted hand movement metrics reflect only one half of the eye-hand coordination relationship; therefore, little is known about how hand movements are guided and controlled by vision. A total of 14 right-handed surgeons were categorised as being either experienced (having led more than 70 laparoscopic procedures) or novice (having performed fewer than 10 procedures) operators. The eight experienced and six novice surgeons completed the eye-hand coordination task from the LAP Mentor basic skills package while wearing a gaze registration system. A variety of performance, movement, and gaze parameters were recorded and compared between groups. The experienced surgeons completed the task significantly more quickly than the novices, but only the economy of movement of the left tool differentiated skill level from the LAP Mentor parameters. Gaze analyses revealed that experienced surgeons spent significantly more time fixating the target locations than novices, who split their time between focusing on the targets and tracking the tools. The findings of the study provide support for the utility of assessing strategic gaze behaviour to better understand the way in which surgeons utilise visual information to plan and control tool movements in a virtual reality laparoscopic environment. It is hoped that by better understanding the limitations of the psychomotor system, effective gaze training programs may be developed.

  3. Eye-hand coordination during a double-step task: evidence for a common stochastic accumulator

    PubMed Central

    Gopal, Atul

    2015-01-01

    Many studies of reaching and pointing have shown significant spatial and temporal correlations between eye and hand movements. Nevertheless, it remains unclear whether these correlations are incidental, arising from common inputs (independent model); whether these correlations represent an interaction between otherwise independent eye and hand systems (interactive model); or whether these correlations arise from a single dedicated eye-hand system (common command model). Subjects were instructed to redirect gaze and pointing movements in a double-step task in an attempt to decouple eye-hand movements and causally distinguish between the three architectures. We used a drift-diffusion framework in the context of a race model, which has been previously used to explain redirect behavior for eye and hand movements separately, to predict the pattern of eye-hand decoupling. We found that the common command architecture could best explain the observed frequency of different eye and hand response patterns to the target step. A common stochastic accumulator for eye-hand coordination also predicts comparable variances, despite significant differences in the means of the eye and hand reaction time (RT) distributions, a prediction that we tested. Consistent with this prediction, we observed that the variances of the eye and hand RTs were similar, despite much larger hand RTs (∼90 ms). Moreover, changes in mean eye RTs, which also increased eye RT variance, produced a similar increase in mean and variance of the associated hand RT. Taken together, these data suggest that a dedicated circuit underlies coordinated eye-hand planning. PMID:26084906
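    The variance prediction of the common-command architecture can be illustrated with a toy simulation. This is not the authors' model; the drift, noise, threshold, and efferent delays below are invented for illustration. The point is structural: if one noisy accumulator triggers both effectors and only fixed output delays differ, eye and hand RT means differ while their trial-to-trial variances match.

```python
import numpy as np

rng = np.random.default_rng(0)

def shared_accumulator_rts(n_trials=200, drift=0.2, noise=1.0,
                           threshold=30.0, eye_delay=50.0,
                           hand_delay=140.0):
    """Simulate a 'common command' architecture: a single
    drift-diffusion accumulator crosses threshold at a shared time,
    and eye/hand RTs differ only by fixed efferent delays (in ms).
    Means then differ by hand_delay - eye_delay, but the
    trial-to-trial variances are identical."""
    eye_rts, hand_rts = [], []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while x < threshold:  # accumulate noisy evidence in 1 ms steps
            x += drift + noise * rng.standard_normal()
            t += 1.0
        eye_rts.append(t + eye_delay)
        hand_rts.append(t + hand_delay)
    return np.array(eye_rts), np.array(hand_rts)
```

    Two independent accumulators (one per effector) would instead produce uncorrelated RTs whose variances need not match, which is the contrast that makes the observed equal-variance, shifted-mean pattern diagnostic.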

  4. Emerging Technologies Look Deeper into the Eyes to Catch Signs of Disease

    MedlinePlus

    …to eye gazing. Adaptive optics (AO) is one technology helping to overcome this problem. It deals with…

  5. "Look who's talking!" Gaze Patterns for Implicit and Explicit Audio-Visual Speech Synchrony Detection in Children With High-Functioning Autism.

    PubMed

    Grossman, Ruth B; Steinhart, Erin; Mitchell, Teresa; McIlvane, William

    2015-06-01

    Conversation requires integration of information from faces and voices to fully understand the speaker's message. To detect auditory-visual asynchrony of speech, listeners must integrate visual movements of the face, particularly the mouth, with auditory speech information. Individuals with autism spectrum disorder may be less successful at such multisensory integration, despite their demonstrated preference for looking at the mouth region of a speaker. We showed participants (individuals with and without high-functioning autism (HFA) aged 8-19) a split-screen video of two identical individuals speaking side by side. Only one of the speakers was in synchrony with the corresponding audio track and synchrony switched between the two speakers every few seconds. Participants were asked to watch the video without further instructions (implicit condition) or to specifically watch the in-synch speaker (explicit condition). We recorded which part of the screen and face their eyes targeted. Both groups looked at the in-synch video significantly more with explicit instructions. However, participants with HFA looked at the in-synch video less than typically developing (TD) peers and did not increase their gaze time as much as TD participants in the explicit task. Importantly, the HFA group looked significantly less at the mouth than their TD peers, and significantly more at non-face regions of the image. There were no between-group differences for eye-directed gaze. Overall, individuals with HFA spend less time looking at the crucially important mouth region of the face during auditory-visual speech integration, which is maladaptive gaze behavior for this type of task. © 2015 International Society for Autism Research, Wiley Periodicals, Inc.

  6. Attentional blink in adults with attention-deficit hyperactivity disorder. Influence of eye movements.

    PubMed

    Armstrong, I T; Munoz, D P

    2003-09-01

    The attentional blink paradigm tests attention by overloading it: a list of stimuli is presented very rapidly one after another at the same location on a computer screen, each item overwriting the last, and participants monitor the list using two criteria [e.g. detect the target (red letter) and identify the probe (letter p)]. If the interval between the target and the probe is greater than about 500 ms, both are usually reported correctly, but, when the interval between the target and the probe is within 200-500 ms, report of the probe declines. This decline is the attentional blink, an interval of time when attention is supposedly switching from the first criterion to the second. The attentional blink paradigm should be difficult to perform correctly without vigilantly attending to the rapidly presented list. Vigilance tasks are often used to assess attention-deficit hyperactivity disorder (ADHD). Symptoms of the disorder include hyperactivity and attentional dysfunction; however, some people with ADHD also have difficulty maintaining gaze at a fixed location. We tested 15 adults with ADHD and their age- and sex-matched controls, measuring accuracy and gaze stability during the attentional blink task. ADHD participants reported fewer targets and probes, took longer to recover from the attentional blink, made more eye movements, and made identification errors consistent with non-perception of the letter list. In contrast, errors made by control participants were consistent with guessing (i.e., report of a letter immediately preceding or succeeding the correct letter). Excessive eye movements result in poorer performance for all participants; however, error patterns confirm that the weak performance of ADHD participants may be related to gaze instability as well as to attentional dysfunction.

  7. Dynamic sound localization in cats

    PubMed Central

    Ruhland, Janet L.; Jones, Amy E.

    2015-01-01

    Sound localization in cats and humans relies on head-centered acoustic cues. Studies have shown that humans are able to localize sounds during rapid head movements that are directed toward the target or other objects of interest. We studied whether cats are able to utilize similar dynamic acoustic cues to localize acoustic targets delivered during rapid eye-head gaze shifts. We trained cats with visual-auditory two-step tasks in which we presented a brief sound burst during saccadic eye-head gaze shifts toward a prior visual target. No consistent or significant differences in accuracy or precision were found between this dynamic task (2-step saccade) and the comparable static task (single saccade when the head is stable) in either horizontal or vertical direction. Cats appear to be able to process dynamic auditory cues and execute complex motor adjustments to accurately localize auditory targets during rapid eye-head gaze shifts. PMID:26063772

  8. Evaluation of the Tobii EyeX Eye tracking controller and Matlab toolkit for research.

    PubMed

    Gibaldi, Agostino; Vanegas, Mauricio; Bex, Peter J; Maiello, Guido

    2017-06-01

    The Tobii EyeX Controller is a new low-cost binocular eye tracker marketed for integration in gaming and consumer applications. The manufacturers claim that the system was conceived for natural eye gaze interaction, does not require continuous recalibration, and allows moderate head movements. The Controller is provided with an SDK to foster the development of new eye tracking applications. We review the characteristics of the device for its possible use in scientific research. We develop and evaluate an open source Matlab Toolkit that can be employed to interface with the EyeX device for gaze recording in behavioral experiments. The Toolkit provides calibration procedures tailored to both binocular and monocular experiments, as well as procedures to evaluate other eye tracking devices. The observed performance of the EyeX (i.e. accuracy < 0.6°, precision < 0.25°, latency < 50 ms and sampling frequency ≈55 Hz), is sufficient for some classes of research application. The device can be successfully employed to measure fixation parameters, saccadic, smooth pursuit and vergence eye movements. However, the relatively low sampling rate and moderate precision limit the suitability of the EyeX for monitoring micro-saccadic eye movements or for real-time gaze-contingent stimulus control. For these applications, research grade, high-cost eye tracking technology may still be necessary. Therefore, despite its limitations with respect to high-end devices, the EyeX has the potential to further the dissemination of eye tracking technology to a broad audience, and could be a valuable asset in consumer and gaming applications as well as a subset of basic and clinical research settings.
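    The microsaccade caveat in this record follows from simple arithmetic. As a hedged sketch (the ~20 ms event duration is a typical figure from the eye-movement literature, not from this paper):

```python
def samples_per_event(sampling_hz, event_duration_ms):
    """Expected number of gaze samples falling inside an eye-movement
    event of the given duration, ignoring timestamp jitter."""
    return sampling_hz * event_duration_ms / 1000.0

# A microsaccade lasting roughly 20 ms yields only about one sample at
# the EyeX's ~55 Hz, versus 20 samples on a 1000 Hz research-grade
# tracker, so velocity-based microsaccade detection is out of reach.
```

    The same arithmetic applies to gaze-contingent control: at ≈55 Hz the inter-sample interval alone is ~18 ms, on top of the <50 ms reported latency.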

  9. Multimodal Language Learner Interactions via Desktop Videoconferencing within a Framework of Social Presence: Gaze

    ERIC Educational Resources Information Center

    Satar, H. Muge

    2013-01-01

    Desktop videoconferencing (DVC) offers many opportunities for language learning through its multimodal features. However, it also brings some challenges such as gaze and mutual gaze, that is, eye-contact. This paper reports some of the findings of a PhD study investigating social presence in DVC interactions of English as a Foreign Language (EFL)…

  10. Anticipating Intentional Actions: The Effect of Eye Gaze Direction on the Judgment of Head Rotation

    ERIC Educational Resources Information Center

    Hudson, Matthew; Liu, Chang Hong; Jellema, Tjeerd

    2009-01-01

    Using a representational momentum paradigm, this study investigated the hypothesis that judgments of how far another agent's head has rotated are influenced by the perceived gaze direction of the head. Participants observed a video-clip of a face rotating 60° towards them starting from the left or right profile view. The gaze direction of…

  11. The effect of face eccentricity on the perception of gaze direction.

    PubMed

    Todorović, Dejan

    2009-01-01

    The perception of a looker's gaze direction depends not only on iris eccentricity (the position of the looker's irises within the sclera) but also on the orientation of the lookers' head. One among several potential cues of head orientation is face eccentricity, the position of the inner features of the face (eyes, nose, mouth) within the head contour, as viewed by the observer. For natural faces this cue is confounded with many other head-orientation cues, but in schematic faces it can be studied in isolation. Salient novel illustrations of the effectiveness of face eccentricity are 'Necker faces', which involve equal iris eccentricities but multiple perceived gaze directions. In four experiments, iris and face eccentricity in schematic faces were manipulated, revealing strong and consistent effects of face eccentricity on perceived gaze direction, with different types of tasks. An additional experiment confirmed the 'Mona Lisa' effect with this type of stimuli. Face eccentricity most likely acted as a simple but robust cue of head turn. A simple computational account of combined effects of cues of eye and head turn on perceived gaze direction is presented, including a formal condition for the perception of direct gaze. An account of the 'Mona Lisa' effect is presented.

  12. Exogenous orienting of attention depends upon the ability to execute eye movements.

    PubMed

    Smith, Daniel T; Rorden, Chris; Jackson, Stephen R

    2004-05-04

    Shifts of attention can be made overtly by moving the eyes or covertly with attention being allocated to a region of space that does not correspond to the current direction of gaze. However, the precise relationship between eye movements and the covert orienting of attention remains controversial. The influential premotor theory proposes that the covert orienting of attention is produced by the programming of (unexecuted) eye movements and thus predicts a strong relationship between the ability to execute eye movements and the operation of spatial attention. Here, we demonstrate for the first time that impaired spatial attention is observed in an individual (AI) who is neurologically healthy but who cannot execute eye movements as a result of a congenital impairment in the elasticity of her eye muscles. This finding provides direct support for the role of the eye-movement system in the covert orienting of attention and suggests that whereas intact cortical structures may be necessary for normal attentional reflexes, they are not sufficient. The ability to move our eyes is essential for the development of normal patterns of spatial attention.

  13. Who is the Usual Suspect? Evidence of a Selection Bias Toward Faces That Make Direct Eye Contact in a Lineup Task

    PubMed Central

    van Golde, Celine; Verstraten, Frans A. J.

    2017-01-01

    The speed and ease with which we recognize the faces of our friends and family members belies the difficulty we have recognizing less familiar individuals. Nonetheless, overconfidence in our ability to recognize faces has carried over into various aspects of our legal system; for instance, eyewitness identification serves a critical role in criminal proceedings. For this reason, understanding the perceptual and psychological processes that underlie false identification is of the utmost importance. Gaze direction is a salient social signal and direct eye contact, in particular, is thought to capture attention. Here, we tested the hypothesis that differences in gaze direction may influence difficult decisions in a lineup context. In a series of experiments, we show that when a group of faces differed in their gaze direction, the faces that were making eye contact with the participants were more likely to be misidentified. Interestingly, this bias disappeared when the faces were presented with their eyes closed. These findings open a critical conversation between social neuroscience and forensic psychology, and imply that direct eye contact may (wrongly) increase the perceived familiarity of a face. PMID:28203355

  14. Genuine eye contact elicits self-referential processing.

    PubMed

    Hietanen, Jonne O; Hietanen, Jari K

    2017-05-01

    The effect of eye contact on self-awareness was investigated with implicit measures based on the use of first-person singular pronouns in sentences. The measures were proposed to tap into self-referential processing, that is, information processing associated with self-awareness. In addition, participants filled in a questionnaire measuring explicit self-awareness. In Experiment 1, the stimulus was a video clip showing another person and, in Experiment 2, the stimulus was a live person. In both experiments, participants were divided into two groups and presented with the stimulus person either making eye contact or gazing downward, depending on the group assignment. During the task, the gaze stimulus was presented before each trial of the pronoun-selection task. Eye contact was found to increase the use of first-person pronouns, but only when participants were facing a real person, not when they were looking at a video of a person. No difference in self-reported self-awareness was found between the two gaze direction groups in either experiment. The results indicate that eye contact elicits self-referential processing, but the effect may be stronger, or possibly limited to, live interaction. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Timing of gazes in child dialogues: a time-course analysis of requests and back channelling in referential communication.

    PubMed

    Sandgren, Olof; Andersson, Richard; van de Weijer, Joost; Hansson, Kristina; Sahlén, Birgitta

    2012-01-01

    This study investigates gaze behaviour in child dialogues. In earlier studies the authors have investigated the use of requests for clarification and responses in order to study the co-creation of understanding in a referential communication task. By adding eye tracking, this line of research is now expanded to include non-verbal contributions in conversation. The aims were to investigate the timing of gazes in face-to-face interaction and to relate gaze behaviour to the use of requests for clarification. Eight conversational pairs of typically developing 10- to 15-year-olds participated. The pairs (director and executor) performed a referential communication task requiring the description of faces. During the dialogues both participants wore head-mounted eye trackers. All gazes were recorded and categorized according to the area fixated (Task, Face, Off). The verbal context for all instances of gaze at the partner's face was identified and categorized using time-course analysis. The results showed that the executor spent almost 90% of the time fixating the gaze on the task, 10% on the director's face and less than 0.5% elsewhere. Turn shifts, primarily requests for clarification, and back channelling significantly predicted the executors' gaze to the face of the task director. The distribution of types of requests showed that requests for previously unmentioned information were significantly more likely to be associated with gaze at the director. The study shows that the executors' gaze at the director accompanies important dynamic shifts in the dialogue. The association with requests for clarification indicates that gaze at the director can be used to monitor the response with two modalities. Furthermore, the significantly higher association with requests for previously unmentioned information indicates that gaze may be used to emphasize the verbal content.
The results will be used as a reference for studies of gaze behaviour in clinical populations with hearing and language impairments. © 2012 Royal College of Speech and Language Therapists.

  16. Spatial eye–hand coordination during bimanual reaching is not systematically coded in either LIP or PRR

    PubMed Central

    Snyder, Lawrence H.

    2018-01-01

    We often orient to where we are about to reach. Spatial and temporal correlations in eye and arm movements may depend on the posterior parietal cortex (PPC). Spatial representations of saccade and reach goals preferentially activate cells in the lateral intraparietal area (LIP) and the parietal reach region (PRR), respectively. With unimanual reaches, eye and arm movement patterns are highly stereotyped. This makes it difficult to study the neural circuits involved in coordination. Here, we employ bimanual reaching to two different targets. Animals naturally make a saccade first to one target and then the other, resulting in different patterns of limb–gaze coordination on different trials. Remarkably, neither LIP nor PRR cells code which target the eyes will move to first. These results suggest that the parietal cortex plays at best only a permissive role in some aspects of eye–hand coordination and makes the role of LIP in saccade generation unclear. PMID:29610356

  17. Analogical Reasoning in Children With Autism Spectrum Disorder: Evidence From an Eye-Tracking Approach

    PubMed Central

    Tan, Enda; Wu, Xueyuan; Nishida, Tracy; Huang, Dan; Chen, Zhe; Yi, Li

    2018-01-01

    The present study examined analogical reasoning in children with autism spectrum disorder (ASD) and its relationship with cognitive and executive functioning and processing strategies. Our findings showed that although children with ASD were less competent in solving analogical problems than typically developing children, this inferior performance was attributable to general cognitive impairments. Eye-movement analyses revealed that children with ASD paid less attention to relational items and showed fewer gaze shifts between relational locations. Nevertheless, these eye-movement patterns did not predict autistic children’s behavioral performance. Together, our findings suggest that ASD per se does not entail impairments in analogical reasoning. The inferior performance of autistic children on analogical reasoning tasks is attributable to deficits in general cognitive and executive functioning. PMID:29899718

  18. Using eye tracking and gaze pattern analysis to test a "dirty bomb" decision aid in a pilot RCT in urban adults with limited literacy.

    PubMed

    Bass, Sarah Bauerle; Gordon, Thomas F; Gordon, Ryan; Parvanta, Claudia

    2016-06-08

    Eye tracking is commonly used in marketing to understand complex responses to materials, but has not been used to understand how low-literacy adults access health information or its relationship to decision making. This study assessed how participants use a literacy appropriate "dirty bomb" decision aid. Participants were randomized to receive a CDC "factsheet" (n = 21) or literacy-appropriate aid (n = 29) shown on a computer screen. Using 7 content-similar slides, gaze patterns, mean pupil fixation time and mean overall time reading and looking at slides were compared. Groups were also compared by literacy level and effect on 'confidence of knowledge' and intended behavior. Results revealed differing abilities to read densely written material. Intervention participants more precisely followed text on 4 of 7 content-similar slides compared to control participants whose gaze patterns indicated unread text, or repeated attempts at reading the same text, suggesting difficulty in understanding key preparedness messages. Controls had significantly longer pupil fixations on 5 of 7 slides and spent more overall time on every slide. In those with very low literacy, intervention participants were more likely than controls to say they understood what a "dirty bomb" is and how to respond if one should occur. Results indicate limited-literacy adults, especially those with very low literacy, may not be able to understand how to respond during a "dirty bomb" using available materials, making them vulnerable to negative health events. This study provides insights into how individuals perceive and process risk communication messages, illustrating a rich and nuanced understanding of the qualitative experience of a limited literacy population with written materials. 
It also demonstrates the feasibility of using these methods on a wider scale to develop more effective health and risk communication messages designed to increase knowledge of and compliance with general health guidelines, and enhance decision making. This has application for those with learning disabilities, those with limited media-literacy skills, and those needing to access the diverse array of assistive technologies now available. Eye tracking is thus a practical approach to understanding these diverse needs to ensure the development of cogent and salient communication.
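    The group comparisons above (mean pupil fixation time, mean overall time per slide) are simple two-sample contrasts. A minimal sketch with a hand-rolled Welch's t-statistic and entirely hypothetical per-slide timings; the abstract does not state which statistical test the study used:

```python
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Welch's t-statistic for two independent samples (unequal variances)."""
    va, vb = variance(a), variance(b)       # sample variances
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / sqrt(va / na + vb / nb)

# Hypothetical per-slide mean fixation times in seconds (not the study's data).
control      = [4.2, 5.1, 4.8, 5.5, 4.9]    # CDC factsheet group
intervention = [3.1, 3.4, 2.9, 3.6, 3.2]    # literacy-appropriate aid group

# Positive t here indicates controls fixated longer, as the study reports.
print(round(welch_t(control, intervention), 2))
```

    Degrees of freedom (and hence a p-value) would come from the Welch-Satterthwaite formula; the sketch stops at the statistic itself.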

  19. Oxytocin enhances gaze-following responses to videos of natural social behavior in adult male rhesus monkeys

    PubMed Central

    Putnam, P.T.; Roman, J.M.; Zimmerman, P.E.; Gothard, K.M.

    2017-01-01

    Gaze following is a basic building block of social behavior that has been observed in multiple species, including primates. The absence of gaze following is associated with abnormal development of social cognition, such as in autism spectrum disorders (ASD). Some social deficits in ASD, including the failure to look at eyes and the inability to recognize facial expressions, are ameliorated by intranasal administration of oxytocin (IN-OT). Here we tested the hypothesis that IN-OT might enhance social processes that require active engagement with a social partner, such as gaze following. Alternatively, IN-OT may only enhance the perceptual salience of the eyes, and may not modify behavioral responses to social signals. To test this hypothesis, we presented four monkeys with videos of conspecifics displaying natural behaviors. Each video was viewed multiple times before and after the monkeys received intranasally either 50 IU of OT or saline. We found that despite a gradual decrease in attention to the repeated viewing of the same videos (habituation), IN-OT consistently increased the frequency of gaze following saccades. Further analysis confirmed that these behaviors did not occur randomly, but rather predictably in response to the same segments of the videos. These findings suggest that in response to more naturalistic social stimuli IN-OT enhances the propensity to interact with a social partner rather than merely elevating the perceptual salience of the eyes. In light of these findings, gaze following may serve as a metric for pro-social effects of oxytocin that target social action more than social perception. PMID:27343726

  20. Predicting diagnostic error in Radiology via eye-tracking and image analytics: Application in mammography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voisin, Sophie; Pinto, Frank M; Morin-Ducote, Garnetta

    2013-01-01

    Purpose: The primary aim of the present study was to test the feasibility of predicting diagnostic errors in mammography by merging radiologists' gaze behavior and image characteristics. A secondary aim was to investigate group-based and personalized predictive models for radiologists of variable experience levels. Methods: The study was performed for the clinical task of assessing the likelihood of malignancy of mammographic masses. Eye-tracking data and diagnostic decisions for 40 cases were acquired from 4 Radiology residents and 2 breast imaging experts as part of an IRB-approved pilot study. Gaze behavior features were extracted from the eye-tracking data. Computer-generated and BI-RADS image features were extracted from the images. Finally, machine learning algorithms were used to merge gaze and image features for predicting human error. Feature selection was thoroughly explored to determine the relative contribution of the various features. Group-based and personalized user modeling was also investigated. Results: Diagnostic error can be predicted reliably by merging gaze behavior characteristics from the radiologist and textural characteristics from the image under review. Leveraging data collected from multiple readers produced a reasonable group model (AUC = 0.79). Personalized user modeling was far more accurate for the more experienced readers (average AUC of 0.837 ± 0.029) than for the less experienced ones (average AUC of 0.667 ± 0.099). The best performing group-based and personalized predictive models involved combinations of both gaze and image features. Conclusions: Diagnostic errors in mammography can be predicted reliably by leveraging the radiologist's gaze behavior and image content.
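    A sketch of the merge-and-predict idea: combine a gaze feature and an image-texture feature into one score per case, then evaluate with a rank-based AUC. The feature values, weights, and labels below are invented for illustration; the study's actual classifiers and feature sets are not specified in the abstract:

```python
def auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) formulation."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical cases: (gaze dwell time in s, image texture score, error label).
cases = [
    (12.0, 0.8, 1), (3.0, 0.2, 0), (9.5, 0.7, 1),
    (4.1, 0.3, 0), (7.8, 0.6, 1), (2.5, 0.1, 0),
]
# Naive linear merge of a gaze feature and an image feature into one score.
scores = [0.05 * gaze + texture for gaze, texture, _ in cases]
labels = [y for _, _, y in cases]
print(auc(labels, scores))
```

    A real pipeline would learn the weights (the study used machine learning with feature selection) and cross-validate per reader to get the group versus personalized AUCs reported above.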

  1. Coordinates of Human Visual and Inertial Heading Perception.

    PubMed

    Crane, Benjamin Thomas

    2015-01-01

    Heading estimation involves both inertial and visual cues. Inertial motion is sensed by the labyrinth, somatic sensation by the body, and optic flow by the retina. Because the eye and head are mobile, these stimuli are sensed relative to different reference frames, and it remains unclear whether perception occurs in a common reference frame. Recent neurophysiologic evidence has suggested the reference frames remain separate even at higher levels of processing but has not addressed the resulting perception. Seven human subjects experienced a 2-s, 16 cm/s translation and/or a visual stimulus corresponding with this translation. For each condition, 72 stimuli (360° in 5° increments) were delivered in random order. After each stimulus the subject identified the perceived heading using a mechanical dial. Some trial blocks included interleaved conditions in which the influence of ±28° of gaze and/or head position was examined. The observations were fit using a two-degree-of-freedom population vector decoder (PVD) model which considered the relative sensitivity to lateral motion and coordinate system offset. For visual stimuli, gaze shifts caused shifts in perceived heading estimates in the direction opposite the gaze shift in all subjects. These perceptual shifts averaged 13 ± 2° for eye-only gaze shifts and 17 ± 2° for eye-head gaze shifts. This finding indicates visual headings are biased towards retinal coordinates. Similar gaze and head direction shifts prior to inertial headings had no significant influence on heading direction. Thus inertial headings are perceived in body-centered coordinates. Combined visual and inertial stimuli yielded intermediate results.

  2. Coordinates of Human Visual and Inertial Heading Perception

    PubMed Central

    Crane, Benjamin Thomas

    2015-01-01

    Heading estimation involves both inertial and visual cues. Inertial motion is sensed by the labyrinth, somatic sensation by the body, and optic flow by the retina. Because the eye and head are mobile, these stimuli are sensed relative to different reference frames, and it remains unclear whether perception occurs in a common reference frame. Recent neurophysiologic evidence has suggested the reference frames remain separate even at higher levels of processing but has not addressed the resulting perception. Seven human subjects experienced a 2-s, 16 cm/s translation and/or a visual stimulus corresponding with this translation. For each condition, 72 stimuli (360° in 5° increments) were delivered in random order. After each stimulus the subject identified the perceived heading using a mechanical dial. Some trial blocks included interleaved conditions in which the influence of ±28° of gaze and/or head position was examined. The observations were fit using a two-degree-of-freedom population vector decoder (PVD) model which considered the relative sensitivity to lateral motion and coordinate system offset. For visual stimuli, gaze shifts caused shifts in perceived heading estimates in the direction opposite the gaze shift in all subjects. These perceptual shifts averaged 13 ± 2° for eye-only gaze shifts and 17 ± 2° for eye-head gaze shifts. This finding indicates visual headings are biased towards retinal coordinates. Similar gaze and head direction shifts prior to inertial headings had no significant influence on heading direction. Thus inertial headings are perceived in body-centered coordinates. Combined visual and inertial stimuli yielded intermediate results. PMID:26267865

  3. Unchanging visions: the effects and limitations of ocular stillness

    PubMed Central

    Macknik, Stephen L.

    2017-01-01

    Scientists have pondered the perceptual effects of ocular motion, and those of its counterpart, ocular stillness, for over 200 years. The unremitting ‘trembling of the eye’ that occurs even during gaze fixation was first noted by Jurin in 1738. In 1794, Erasmus Darwin documented that gaze fixation produces perceptual fading, a phenomenon rediscovered in 1804 by Ignaz Paul Vital Troxler. Studies in the twentieth century established that Jurin's ‘eye trembling’ consisted of three main types of ‘fixational’ eye movements, now called microsaccades (or fixational saccades), drifts and tremor. Yet, owing to the constant and minute nature of these motions, the study of their perceptual and physiological consequences has met significant technological challenges. Studies starting in the 1950s and continuing to the present have attempted to study vision during retinal stabilization—a technique that consists of shifting any and all visual stimuli presented to the eye in such a way as to nullify all concurrent eye movements—providing a tantalizing glimpse of vision in the absence of change. No research to date has achieved perfect retinal stabilization, however, and so other work has devised substitute ways to counteract eye motion, such as by studying the perception of afterimages or of the entoptic images formed by retinal vessels, which are completely stable with respect to the eye. Yet other research has taken the alternative tack to control eye motion by behavioural instruction to fix one's gaze or to keep one's gaze still, during concurrent physiological and/or psychophysical measurements. Here, we review the existing data—from historical and contemporary studies that have aimed to nullify or minimize eye motion—on the perceptual and physiological consequences of perfect versus imperfect fixation. We also discuss the accuracy, quality and stability of ocular fixation, and the bottom–up and top–down influences that affect fixation behaviour.
This article is part of the themed issue ‘Movement suppression: brain mechanisms for stopping and stillness’. PMID:28242737

  4. Enhanced Video-Oculography System

    NASA Technical Reports Server (NTRS)

    Moore, Steven T.; MacDougall, Hamish G.

    2009-01-01

    A previously developed video-oculography system has been enhanced for use in measuring vestibulo-ocular reflexes of a human subject in a centrifuge, motor vehicle, or other setting. The system as previously developed included a lightweight digital video camera mounted on goggles. The left eye was illuminated by an infrared light-emitting diode via a dichroic mirror, and the camera captured images of the left eye in infrared light. To extract eye-movement data, the digitized video images were processed by software running in a laptop computer. Eye movements were calibrated by having the subject view a target pattern, fixed with respect to the subject's head, generated by a goggle-mounted laser with a diffraction grating. The system as enhanced includes a second camera for imaging the scene from the subject's perspective, and two inertial measurement units (IMUs) for measuring linear accelerations and rates of rotation for computing head movements. One IMU is mounted on the goggles, the other on the centrifuge or vehicle frame. All eye-movement and head-motion data are time-stamped. In addition, the subject's point of regard is superimposed on each scene image to enable analysis of patterns of gaze in real time.

  5. Preliminary Experience Using Eye-Tracking Technology to Differentiate Novice and Expert Image Interpretation for Ultrasound-Guided Regional Anesthesia.

    PubMed

    Borg, Lindsay K; Harrison, T Kyle; Kou, Alex; Mariano, Edward R; Udani, Ankeet D; Kim, T Edward; Shum, Cynthia; Howard, Steven K

    2018-02-01

    Objective measures are needed to guide the novice's pathway to expertise. Within and outside medicine, eye tracking has been used for both training and assessment. We designed this study to test the hypothesis that eye tracking may differentiate novices from experts in static image interpretation for ultrasound (US)-guided regional anesthesia. We recruited novice anesthesiology residents and regional anesthesiology experts. Participants wore eye-tracking glasses, were shown 5 sonograms of US-guided regional anesthesia, and were asked a series of anatomy-based questions related to each image while their eye movements were recorded. The answer to each question was a location on the sonogram, defined as the area of interest (AOI). The primary outcome was the total gaze time in the AOI (seconds). Secondary outcomes were the total gaze time outside the AOI (seconds), total time to answer (seconds), and time to first fixation on the AOI (seconds). Five novices and 5 experts completed the study. Although the gaze time (mean ± SD) in the AOI was not different between groups (7 ± 4 seconds for novices and 7 ± 3 seconds for experts; P = .150), the gaze time outside the AOI was greater for novices (75 ± 18 versus 44 ± 4 seconds for experts; P = .005). The total time to answer and total time to first fixation in the AOI were both shorter for experts. Experts in US-guided regional anesthesia take less time to identify sonoanatomy and spend less unfocused time away from a target compared to novices. Eye tracking is a potentially useful tool to differentiate novices from experts in the domain of US image interpretation. © 2017 by the American Institute of Ultrasound in Medicine.
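    The primary outcome above (gaze time in the area of interest) reduces to summing fixation durations whose location falls inside a region. A minimal sketch with hypothetical fixation records and a rectangular AOI; the study's actual AOI definitions and eye-tracker output format are not given in the abstract:

```python
def gaze_time_in_aoi(fixations, aoi):
    """Sum durations of fixations whose centre falls inside a rectangular AOI.

    fixations: list of (x, y, duration_s) tuples; aoi: (x0, y0, x1, y1).
    """
    x0, y0, x1, y1 = aoi
    return sum(d for x, y, d in fixations if x0 <= x <= x1 and y0 <= y <= y1)

# Hypothetical fixations on a sonogram (pixel coordinates, seconds).
fixations = [(120, 80, 0.30), (300, 210, 0.45), (310, 220, 0.25), (50, 400, 0.60)]
aoi = (280, 190, 340, 240)   # e.g. the target structure on the image

in_aoi = gaze_time_in_aoi(fixations, aoi)
out_aoi = sum(d for _, _, d in fixations) - in_aoi
print(round(in_aoi, 2), round(out_aoi, 2))
```

    The study's other outcomes (time to first fixation in the AOI, total time to answer) follow the same pattern: scan the fixation stream in time order instead of summing.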

  6. Use of Speaker’s Gaze and Syntax in Verb Learning

    PubMed Central

    Nappa, Rebecca; Wessel, Allison; McEldoon, Katherine L.; Gleitman, Lila R.; Trueswell, John C.

    2013-01-01

    Speaker eye gaze and gesture are known to help child and adult listeners establish communicative alignment and learn object labels. Here we consider how learners use these cues, along with linguistic information, to acquire abstract relational verbs. Test items were perspective verb pairs (e.g., chase/flee, win/lose), which pose a special problem for observational accounts of word learning because their situational contexts overlap very closely; the learner must infer the speaker’s chosen perspective on the event. Two cues to the speaker’s perspective on a depicted event were compared and combined: (a) the speaker’s eye gaze to an event participant (e.g., looking at the Chaser vs. looking at the Flee-er) and (b) the speaker’s linguistic choice of which event participant occupies Subject position in his utterance. Participants (3-, 4-, and 5-year-olds) were eye-tracked as they watched a series of videos of a man describing drawings of perspective events (e.g., a rabbit chasing an elephant). The speaker looked at one of the two characters and then uttered either an utterance that was referentially uninformative (He’s mooping him) or informative (The rabbit’s mooping the elephant/The elephant’s mooping the rabbit) because of the syntactic positioning of the nouns. Eye-tracking results showed that all participants regardless of age followed the speaker’s gaze in both uninformative and informative contexts. However, verb-meaning choices were responsive to speaker’s gaze direction only in the linguistically uninformative condition. In the presence of a linguistically informative context, effects of speaker gaze on meaning were minimal for the youngest children to nonexistent for the older populations. Thus children, like adults, can use multiple cues to inform verb-meaning choice but rapidly learn that the syntactic positioning of referring expressions is an especially informative source of evidence for these decisions. PMID:24465183

  7. Effect of 3,4-diaminopyridine on the postural control in patients with downbeat nystagmus.

    PubMed

    Sprenger, Andreas; Zils, Elisabeth; Rambold, Holger; Sander, Thurid; Helmchen, Christoph

    2005-04-01

    Downbeat nystagmus (DBN) is a common, usually persistent ocular motor sign in vestibulocerebellar midline lesions. Postural imbalance in DBN may increase on lateral gaze when downbeat nystagmus increases. 3,4-Diaminopyridine (3,4-DAP) has been shown to suppress the slow-phase velocity component of downbeat nystagmus and its gravity-dependent component with concomitant improvement of oscillopsia. Because the pharmacological effect is thought to be caused by improvement of the vestibulocerebellar Purkinje cell activity, the effect of 3,4-DAP on the postural control of patients with downbeat nystagmus syndrome was examined. Eye movements were recorded with the video-based Eyelink II system. Postural sway and pathway were assessed by posturography in lateral gaze in the light and on eye closure. Two out of four patients showed an improvement of the area of postural sway by 57% of control (baseline) on eye closure. In contrast, downbeat nystagmus in gaze straight ahead and on lateral gaze did not benefit in these two patients, implying a specific influence of 3,4-DAP on the vestibulocerebellar control of posture. It was concluded that 3,4-DAP may particularly influence the postural performance in patients with downbeat nystagmus.

  8. Reading strategies in infantile nystagmus syndrome.

    PubMed

    Thomas, Mervyn G; Gottlob, Irene; McLean, Rebecca J; Maconachie, Gail; Kumar, Anil; Proudlock, Frank A

    2011-10-17

    The adaptive strategies adopted by individuals with infantile nystagmus syndrome (INS) during reading are not clearly understood. Eye movement recordings were used to identify ocular motor strategies used by patients with INS during reading. Eye movements were recorded at 500 Hz in 25 volunteers with INS and 7 controls when reading paragraphs of text centered at horizontal gaze angles of -20°, -10°, 0°, 10°, and 20°. At each location, reading speeds were measured, along with logMAR visual acuity and nystagmus during gaze-holding. Adaptive strategies were identified from slow and quick-phase patterns in the nystagmus waveform. Median reading speeds were 204.3 words per minute in individuals with INS and 273.6 words per minute in controls. Adaptive strategies included (1) suppression of corrective quick phases allowing involuntary slow phases to achieve the desired goal, (2) voluntarily changing the character of the involuntary slow phases using quick phases, and (3) correction of involuntary slow phases using quick phases. Several individuals with INS read more rapidly than healthy control volunteers. These findings demonstrate that volunteers with INS learn to manipulate their nystagmus using a range of strategies to acquire visual information from the text. These strategies include taking advantage of the stereotypical and periodic nature of involuntary eye movements to allow the involuntary eye movements to achieve the desired goal. The versatility of these adaptations yields reading speeds in those with nystagmus that are often much better than might be expected, given the degree of foveal and ocular motor deficits.

  9. INFRARED- BASED BLINK DETECTING GLASSES FOR FACIAL PACING: TOWARDS A BIONIC BLINK

    PubMed Central

    Frigerio, Alice; Hadlock, Tessa A; Murray, Elizabeth H; Heaton, James T

    2015-01-01

    IMPORTANCE Facial paralysis remains one of the most challenging conditions to effectively manage, often causing life-altering deficits in both function and appearance. Facial rehabilitation via pacing and robotic technology has great yet unmet potential. A critical first step towards reanimating symmetrical facial movement in cases of unilateral paralysis is the detection of healthy movement to use as a trigger for stimulated movement. OBJECTIVE To test a blink detection system that can be attached to standard eyeglasses and used as part of a closed-loop facial pacing system. DESIGN Standard safety glasses were equipped with an infrared (IR) emitter/detector pair oriented horizontally across the palpebral fissure, creating a monitored IR beam that became interrupted when the eyelids closed. SETTING Tertiary care Facial Nerve Center. PARTICIPANTS 24 healthy volunteers. MAIN OUTCOME MEASURE Video-quantified blinking was compared with both IR sensor signal magnitude and rate of change in healthy participants with their gaze in repose, while they shifted gaze from central to far peripheral positions, and during the production of particular facial expressions. RESULTS Blink detection based on signal magnitude achieved 100% sensitivity in forward gaze, but generated false detections on downward gaze. Calculations of peak rate of signal change (first derivative) typically distinguished blinks from gaze-related lid movements. During forward gaze, 87% of detected blink events were true positives, 11% were false positives, and 2% were false negatives. Of the 11% false positives, 6% were associated with partial eyelid closures. During gaze changes, false blink detection occurred 6.3% of the time during lateral eye movements, 10.4% during upward movements, 46.5% during downward movements, and 5.6% for movements from an upward or downward gaze back to the primary gaze. Facial expressions disrupted sensor output if they caused substantial squinting or shifted the glasses.
CONCLUSION AND RELEVANCE Our blink detection system provides a reliable, non-invasive indication of eyelid closure using an invisible light beam passing in front of the eye. Future versions will aim to mitigate detection errors by using multiple IR emitter/detector pairs mounted on the glasses, and alternative frame designs may reduce shifting of the sensors relative to the eye during facial movements. PMID:24699708
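    The magnitude-plus-derivative rule described above can be sketched in a few lines: flag a blink only when the IR signal is interrupted and the drop was fast, so slow gaze-related lid movements are rejected. This is an illustrative reconstruction, not the authors' algorithm; the thresholds, sampling rate, and traces are invented:

```python
def detect_blinks(signal, dt, mag_thresh, rate_thresh):
    """Return sample indices where the signal is both below a magnitude
    threshold and falling faster than a rate threshold (first derivative).
    Thresholds here are illustrative, not the published values."""
    blinks = []
    for i in range(1, len(signal)):
        rate = (signal[i] - signal[i - 1]) / dt
        if signal[i] < mag_thresh and rate < -rate_thresh:
            blinks.append(i)
    return blinks

# Synthetic 100 Hz traces: a blink is a sharp one-sample dip; a downward
# gaze produces a gradual decline of similar final magnitude.
fast = [1.0, 1.0, 0.2, 1.0, 1.0]
slow = [1.0, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4]
dt = 0.01

print(detect_blinks(fast, dt, mag_thresh=0.5, rate_thresh=20))
print(detect_blinks(slow, dt, mag_thresh=0.5, rate_thresh=20))
```

    The slow trace eventually crosses the magnitude threshold but never the rate threshold, which mirrors why the derivative criterion reduced (though did not eliminate) the downward-gaze false positives reported above.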

  10. Amygdala lesions in rhesus macaques decrease attention to threat

    PubMed Central

    Dal Monte, Olga; Costa, Vincent D.; Noble, Pamela L.; Murray, Elisabeth A.; Averbeck, Bruno B.

    2015-01-01

    Evidence from animal and human studies has suggested that the amygdala plays a role in detecting threat and in directing attention to the eyes. Nevertheless, there has been no systematic investigation of whether the amygdala specifically facilitates attention to the eyes or whether other features can also drive attention via amygdala processing. The goal of the present study was to examine the effects of amygdala lesions in rhesus monkeys on attentional capture by specific facial features, as well as gaze patterns and changes in pupil dilation during free viewing. Here we show reduced attentional capture by threat stimuli, specifically the mouth, and reduced exploration of the eyes in free viewing in monkeys with amygdala lesions. Our findings support a role for the amygdala in detecting threat signals and in directing attention to the eye region of faces when freely viewing different expressions. PMID:26658670

  11. Eye-Tracking, Autonomic, and Electrophysiological Correlates of Emotional Face Processing in Adolescents with Autism Spectrum Disorder

    PubMed Central

    Wagner, Jennifer B.; Hirsch, Suzanna B.; Vogel-Farley, Vanessa K.; Redcay, Elizabeth; Nelson, Charles A.

    2014-01-01

    Individuals with autism spectrum disorder (ASD) often have difficulty with social-emotional cues. This study examined the neural, behavioral, and autonomic correlates of emotional face processing in adolescents with ASD and typical development (TD) using eye-tracking and event-related potentials (ERPs) across two different paradigms. Scanning of faces was similar across groups in the first task, but the second task found that face-sensitive ERPs varied with emotional expressions only in TD. Further, ASD showed enhanced neural responding to non-social stimuli. In TD only, attention to eyes during eye-tracking related to faster face-sensitive ERPs in a separate task; in ASD, a significant positive association was found between autonomic activity and attention to mouths. Overall, ASD showed an atypical pattern of emotional face processing, with reduced neural differentiation between emotions and a reduced relationship between gaze behavior and neural processing of faces. PMID:22684525

  12. I Reach Faster When I See You Look: Gaze Effects in Human-Human and Human-Robot Face-to-Face Cooperation.

    PubMed

    Boucher, Jean-David; Pattacini, Ugo; Lelong, Amelie; Bailly, Gerrard; Elisei, Frederic; Fagel, Sascha; Dominey, Peter Ford; Ventre-Dominey, Jocelyne

    2012-01-01

    Human-human interaction in natural environments relies on a variety of perceptual cues. Humanoid robots are becoming increasingly refined in their sensorimotor capabilities, and thus should now be able to manipulate and exploit these social cues in cooperation with their human partners. Previous studies have demonstrated that people follow human and robot gaze, and that it can help them to cope with spatially ambiguous language. Our goal is to extend these findings into the domain of action, to determine how human and robot gaze can influence the speed and accuracy of human action. We report on results from a human-human cooperation experiment demonstrating that an agent's vision of her/his partner's gaze can significantly improve that agent's performance in a cooperative task. We then implement a heuristic capability to generate such gaze cues by a humanoid robot that engages in the same cooperative interaction. The subsequent human-robot experiments demonstrate that a human agent can indeed exploit the predictive gaze of their robot partner in a cooperative task. This allows us to render the humanoid robot more human-like in its ability to communicate with humans. The long term objectives of the work are thus to identify social cooperation cues, and to validate their pertinence through implementation in a cooperative robot. The current research provides the robot with the capability to produce appropriate speech and gaze cues in the context of human-robot cooperation tasks. Gaze is manipulated in three conditions: Full gaze (coordinated eye and head), eyes hidden with sunglasses, and head fixed. We demonstrate the pertinence of these cues in terms of statistical measures of action times for humans in the context of a cooperative task, as gaze significantly facilitates cooperation as measured by human response times.

  13. Videos of conspecifics elicit interactive looking patterns and facial expressions in monkeys.

    PubMed

    Mosher, Clayton P; Zimmerman, Prisca E; Gothard, Katalin M

    2011-08-01

    A broader understanding of the neural basis of social behavior in primates requires the use of species-specific stimuli that elicit spontaneous, but reproducible and tractable behaviors. In this context of natural behaviors, individual variation can further inform about the factors that influence social interactions. To approximate natural social interactions similar to those documented by field studies, we used unedited video footage to induce spontaneous facial expressions and looking patterns in viewer monkeys in the laboratory setting. Three adult male monkeys (Macaca mulatta), previously behaviorally and genetically (5-HTTLPR) characterized, were monitored while they watched 10 s video segments depicting unfamiliar monkeys (movie monkeys) displaying affiliative, neutral, and aggressive behaviors. The gaze and head orientation of the movie monkeys alternated between "averted" and "directed" at the viewer. The viewers were not reinforced for watching the movies, thus their looking patterns indicated their interest and social engagement with the stimuli. The behavior of the movie monkey accounted for differences in the looking patterns and facial expressions displayed by the viewers. We also found multiple significant differences in the behavior of the viewers that correlated with their interest in these stimuli. These socially relevant dynamic stimuli elicited spontaneous social behaviors, such as eye-contact induced reciprocation of facial expression, gaze aversion, and gaze following, that were previously not observed in response to static images. This approach opens a unique opportunity to understanding the mechanisms that trigger spontaneous social behaviors in humans and nonhuman primates. (PsycINFO Database Record (c) 2011 APA, all rights reserved).

  14. Eye-head coordination during free exploration in human and cat.

    PubMed

    Einhäuser, Wolfgang; Moeller, Gudrun U; Schumann, Frank; Conradt, Jörg; Vockeroth, Johannes; Bartl, Klaus; Schneider, Erich; König, Peter

    2009-05-01

    Eye, head, and body movements jointly control the direction of gaze and the stability of retinal images in most mammalian species. The contribution of the individual movement components, however, will largely depend on the ecological niche the animal occupies and the layout of the animal's retina, in particular its photoreceptor density distribution. Here the relative contribution of eye-in-head and head-in-world movements in cats is measured, and the results are compared to recent human data. For the cat, a lightweight custom-made head-mounted video setup was used (CatCam). Human data were acquired with the novel EyeSeeCam device, which measures eye position to control a gaze-contingent camera in real time. For both species, analysis was based on simultaneous recordings of eye and head movements during free exploration of a natural environment. Despite the substantial differences in ecological niche, photoreceptor density, and saccade frequency, eye-movement characteristics in both species are remarkably similar. Coordinated eye and head movements dominate the dynamics of the retinal input. Interestingly, compensatory (gaze-stabilizing) movements play a more dominant role in humans than they do in cats. This finding was interpreted to be a consequence of substantially different timescales for head movements, with cats' head movements showing about a 5-fold faster dynamics than humans. For both species, models and laboratory experiments therefore need to account for this rich input dynamic to obtain validity for ecologically realistic settings.
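    The decomposition this comparison relies on is additive in the simple 1-D case: gaze-in-world is eye-in-head plus head-in-world, and a compensatory (gaze-stabilizing) eye movement counter-rotates against the head so that the sum stays constant. A toy sketch with invented angle traces in degrees:

```python
def gaze_in_world(eye_in_head, head_in_world):
    """1-D approximation: gaze direction is the sum of eye and head angles."""
    return [e + h for e, h in zip(eye_in_head, head_in_world)]

# Invented traces: the head rotates while a perfectly compensatory eye
# movement (as in the vestibulo-ocular reflex) counter-rotates.
head = [0.0, 5.0, 10.0, 15.0]
eye  = [0.0, -5.0, -10.0, -15.0]

print(gaze_in_world(eye, head))   # gaze direction stays put
```

    Real recordings mix such compensatory segments with coordinated eye-head shifts that move gaze; the study's point is that the balance between the two differs between humans and cats.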

  15. Symptoms elicited in persons with vestibular dysfunction while performing gaze movements in optic flow environments

    PubMed Central

    Whitney, Susan L.; Sparto, Patrick J.; Cook, James R.; Redfern, Mark S.; Furman, Joseph M.

    2016-01-01

    Introduction People with vestibular disorders often experience space and motion discomfort when exposed to moving or highly textured visual scenes. The purpose of this study was to measure the type and severity of symptoms in people with vestibular dysfunction during coordinated head and eye movements in optic flow environments. Methods Seven subjects with vestibular disorders and 25 controls viewed four different full-field optic flow environments on six different visits. The optic flow environments consisted of textures with various contrasts and spatial frequencies. Subjects performed 8 gaze movement tasks, including eye saccades, gaze saccades, and gaze stabilization tasks. Subjects reported symptoms using Subjective Units of Discomfort (SUD) and the Simulator Sickness Questionnaire (SSQ). Self-reported dizziness handicap and space and motion discomfort were also measured. Results/Conclusion Subjects with vestibular disorders had greater discomfort and experienced greater oculomotor and disorientation symptoms. The magnitude of the symptoms increased during each visit, but did not depend on the optic flow condition. Subjects who reported greater dizziness handicap and space motion discomfort had greater severity of symptoms during the experiment. Symptoms of fatigue, difficulty focusing, and dizziness during the experiment were evident. Compared with controls, subjects with vestibular disorders had less head movement during the gaze saccade tasks. Overall, performance of gaze pursuit and gaze stabilization tasks in moving visual environments elicited greater symptoms in subjects with vestibular disorders compared with healthy subjects. PMID:23549055

  16. Social and Non-Social Cueing of Visuospatial Attention in Autism and Typical Development

    PubMed Central

    Pruett, John R.; LaMacchia, Angela; Hoertel, Sarah; Squire, Emma; McVey, Kelly; Todd, Richard D.; Constantino, John N.; Petersen, Steven E.

    2013-01-01

    Three experiments explored attention to eye gaze, which is incompletely understood in typical development and is hypothesized to be disrupted in autism. Experiment 1 (n=26 typical adults) involved covert orienting to box, arrow, and gaze cues at two probabilities and cue-target times to test whether reorienting for gaze is endogenous, exogenous, or unique; experiment 2 (total n=80: male and female children and adults) studied age and sex effects on gaze cueing. Gaze cueing appears endogenous and may strengthen in typical development. Experiment 3 tested exogenous, endogenous, and/or gaze-based orienting in 25 typical and 27 Autistic Spectrum Disorder (ASD) children. ASD children made more saccades, slowing their reaction times; however, exogenous and endogenous orienting, including gaze cueing, appear intact in ASD. PMID:20809377

  17. A novel approach to training attention and gaze in ASD: A feasibility and efficacy pilot study.

    PubMed

    Chukoskie, Leanne; Westerfield, Marissa; Townsend, Jeanne

    2018-05-01

    In addition to the social, communicative and behavioral symptoms that define the disorder, individuals with ASD have difficulty re-orienting attention quickly and accurately. Similarly, fast re-orienting saccadic eye movements are also inaccurate and more variable in both endpoint and timing. Atypical gaze and attention are among the earliest symptoms observed in ASD. Disruption of these foundation skills critically affects the development of higher level cognitive and social behavior. We propose that interventions aimed at these early deficits that support social and cognitive skills will be broadly effective. We conducted a pilot clinical trial designed to demonstrate the feasibility and preliminary efficacy of using gaze-contingent video games for low-cost in-home training of attention and eye movement. Eight adolescents with ASD participated in an 8-week training, with pre-, mid- and post-testing of eye movement and attention control. Six of the eight adolescents completed the 8 weeks of training, and all six showed improvement in attention (orienting, disengagement), eye movement control, or both. All game systems remained intact for the duration of training and all participants could use the system independently. We delivered a robust, low-cost, gaze-contingent game system for home use that, in our pilot training sample, improved the attention orienting and eye movement performance of adolescent participants in 8 weeks of training. We are currently conducting a clinical trial to replicate these results and to examine what, if any, aspects of training transfer to more real-world tasks. © 2017 Wiley Periodicals, Inc. Develop Neurobiol 78: 546-554, 2018.

  18. Development of a low cost pupillometer-eyetracker and applications

    NASA Astrophysics Data System (ADS)

    Bianchetti, Arturo; Perez, Liliana I.; Comastri, Silvia A.

    2013-11-01

    The determination of ocular pupil diameter and gaze direction is important in various psychophysical and cognitive tests and can be accomplished using commercial, academic or open-source devices. In this work we develop a table-top pupillometer-eyetracker termed Blick; the hardware costs 50 dollars and the software is open source (https://github.com/abianchetti/blick). The hardware is mounted in a portable holder and comprises an illumination system (two infrared LEDs generating 0.13 W/m2 at 22 cm) and a detection system (containing a USB camera, an infrared filter and a 16 mm lens system). The software, programmed in C++ using the OpenCV and cvblob libraries, processes eye images in real time and supplies plots and tables of pupil diameter and gaze direction and a video of the eye. As applications, capturing the right eye of six young emmetropes and after performing the pixel-mm and homographic calibrations (required to determine diameter in mm and gaze direction), we conduct three tests. The corresponding tasks are to detect mistakes in three series of four poker cards, to recognize letters F between distractors and to write a sentence via eye movements using Blick as eye tracker and the tool Dasher (MacKay's Cambridge Group). We obtain that Blick's performance is satisfactory (errors being 0.05 mm in pupil diameter and 1 degree in gaze direction); that there are slight pupil dilations when subjects used to playing cards find mistakes and when some subjects find targets; and, finally, that Blick can be employed as an eyetracker to allow communication of disabled persons.
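
    The two calibrations named in this abstract (a pixel-to-mm scale for pupil diameter and a homography for gaze direction) can be sketched in a few lines. This is a minimal illustration with invented numbers, not code from the Blick repository:

```python
# Hypothetical illustration of the two calibration steps described above;
# the scale factor, homography, and measurements are made-up example values.

def mm_per_pixel(ref_width_mm: float, ref_width_px: float) -> float:
    """Scale factor from a reference object of known physical width."""
    return ref_width_mm / ref_width_px

def pupil_diameter_mm(pupil_px: float, scale: float) -> float:
    """Convert a pupil diameter measured in pixels to millimetres."""
    return pupil_px * scale

def apply_homography(H, x, y):
    """Map an image point (x, y) to screen coordinates via a 3x3 homography."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    u = (H[0][0] * x + H[0][1] * y + H[0][2]) / w
    v = (H[1][0] * x + H[1][1] * y + H[1][2]) / w
    return u, v

scale = mm_per_pixel(ref_width_mm=12.0, ref_width_px=240.0)   # 0.05 mm/px
diameter = pupil_diameter_mm(80.0, scale)                     # 4.0 mm
# A pure-scaling homography (screen pixels are 2x camera pixels here):
H = [[2.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 1.0]]
gaze = apply_homography(H, 320.0, 240.0)                      # (640.0, 480.0)
```

    In the actual device the homography would be estimated from a several-point calibration grid rather than written down by hand.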

  19. Can gaze avoidance explain why individuals with Asperger's syndrome can't recognise emotions from facial expressions?

    PubMed

    Sawyer, Alyssa C P; Williamson, Paul; Young, Robyn L

    2012-04-01

    Research has shown that individuals with Autism Spectrum Disorders (ASD) have difficulties recognising emotions from facial expressions. Since eye contact is important for accurate emotion recognition, and individuals with ASD tend to avoid eye contact, this tendency for gaze aversion has been proposed as an explanation for the emotion recognition deficit. This explanation was investigated using a newly developed emotion and mental state recognition task. Individuals with Asperger's Syndrome were less accurate at recognising emotions and mental states, but did not show evidence of gaze avoidance compared to individuals without Asperger's Syndrome. This suggests that the way individuals with Asperger's Syndrome look at faces cannot account for the difficulty they have recognising expressions.

  20. Eye-Tracking as a Tool to Evaluate Functional Ability in Everyday Tasks in Glaucoma.

    PubMed

    Kasneci, Enkelejda; Black, Alex A; Wood, Joanne M

    2017-01-01

    To date, few studies have investigated the eye movement patterns of individuals with glaucoma while they undertake everyday tasks in real-world settings. While some of these studies have reported possible compensatory gaze patterns in those with glaucoma who demonstrated good task performance despite their visual field loss, little is known about the complex interaction between field loss and visual scanning strategies and the impact on task performance and, consequently, on quality of life. We review existing approaches that have quantified the effect of glaucomatous visual field defects on the ability to undertake everyday activities through the use of eye movement analysis. Furthermore, we discuss current developments in eye-tracking technology and the potential for combining eye-tracking with virtual reality and advanced analytical approaches. Recent technological developments suggest that systems based on eye-tracking have the potential to assist individuals with glaucomatous loss to maintain or even improve their performance on everyday tasks and hence enhance their long-term quality of life. We discuss novel approaches for studying the visual search behavior of individuals with glaucoma that have the potential to assist individuals with glaucoma, through the use of personalized programs that take into consideration the individual characteristics of their remaining visual field and visual search behavior.

  1. Eye-Tracking as a Tool to Evaluate Functional Ability in Everyday Tasks in Glaucoma

    PubMed Central

    Black, Alex A.

    2017-01-01

    To date, few studies have investigated the eye movement patterns of individuals with glaucoma while they undertake everyday tasks in real-world settings. While some of these studies have reported possible compensatory gaze patterns in those with glaucoma who demonstrated good task performance despite their visual field loss, little is known about the complex interaction between field loss and visual scanning strategies and the impact on task performance and, consequently, on quality of life. We review existing approaches that have quantified the effect of glaucomatous visual field defects on the ability to undertake everyday activities through the use of eye movement analysis. Furthermore, we discuss current developments in eye-tracking technology and the potential for combining eye-tracking with virtual reality and advanced analytical approaches. Recent technological developments suggest that systems based on eye-tracking have the potential to assist individuals with glaucomatous loss to maintain or even improve their performance on everyday tasks and hence enhance their long-term quality of life. We discuss novel approaches for studying the visual search behavior of individuals with glaucoma that have the potential to assist individuals with glaucoma, through the use of personalized programs that take into consideration the individual characteristics of their remaining visual field and visual search behavior. PMID:28293433

  2. Neurocognitive mechanisms of gaze-expression interactions in face processing and social attention

    PubMed Central

    Graham, Reiko; LaBar, Kevin S.

    2012-01-01

    The face conveys a rich source of non-verbal information used during social communication. While research has revealed how specific facial channels such as emotional expression are processed, little is known about the prioritization and integration of multiple cues in the face during dyadic exchanges. Classic models of face perception have emphasized the segregation of dynamic versus static facial features along independent information processing pathways. Here we review recent behavioral and neuroscientific evidence suggesting that within the dynamic stream, concurrent changes in eye gaze and emotional expression can yield early independent effects on face judgments and covert shifts of visuospatial attention. These effects are partially segregated within initial visual afferent processing volleys, but are subsequently integrated in limbic regions such as the amygdala or via reentrant visual processing volleys. This spatiotemporal pattern may help to resolve otherwise perplexing discrepancies across behavioral studies of emotional influences on gaze-directed attentional cueing. Theoretical explanations of gaze-expression interactions are discussed, with special consideration of speed-of-processing (discriminability) and contextual (ambiguity) accounts. Future research in this area promises to reveal the mental chronometry of face processing and interpersonal attention, with implications for understanding how social referencing develops in infancy and is impaired in autism and other disorders of social cognition. PMID:22285906

  3. Frontal view reconstruction for iris recognition

    DOEpatents

    Santos-Villalobos, Hector J; Bolme, David S; Boehnen, Chris Bensing

    2015-02-17

    Iris recognition can be accomplished for a wide variety of eye images by correcting input images with an off-angle gaze. A variety of techniques can be employed, including limbus modeling, corneal refraction modeling, optical flow, genetic algorithms, aspherical eye modeling, and ray tracing. Precomputed transforms can enhance performance for use in commercial applications. With application of the technologies, images with significantly unfavorable gaze angles can be successfully recognized.

  4. Analysis of eye-tracking experiments performed on a Tobii T60

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banks, David C

    2008-01-01

    Commercial eye-gaze trackers have the potential to be an important tool for quantifying the benefits of new visualization techniques. The expense of such trackers has made their use relatively infrequent in visualization studies. As such, it is difficult for researchers to compare multiple devices: obtaining several demonstration models is impractical in cost and time, and quantitative measures from real-world use are not readily available. In this paper, we present a sample protocol to determine the accuracy of a gaze-tracking device.
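
    An accuracy protocol of this kind generally reduces to comparing recorded gaze samples against known target locations. A minimal sketch in degrees of visual angle, with hypothetical sample data and no ties to any particular tracker's API:

```python
import math

def angular_error(gaze, target):
    """Offset in degrees between one gaze sample and the true target location."""
    return math.hypot(gaze[0] - target[0], gaze[1] - target[1])

def accuracy(samples, target):
    """Mean angular error over a fixation: the usual 'accuracy' figure."""
    return sum(angular_error(s, target) for s in samples) / len(samples)

def precision_rms(samples):
    """RMS of successive sample-to-sample distances: the usual 'precision' figure."""
    d = [angular_error(a, b) for a, b in zip(samples, samples[1:])]
    return math.sqrt(sum(x * x for x in d) / len(d))

# Hypothetical fixation data while the subject looks at a target at (0, 0) deg:
fixation = [(0.5, 0.0), (0.5, 0.1), (0.5, -0.1)]
```

    Here a systematic 0.5-degree offset shows up in the accuracy figure, while sample scatter shows up in precision; a full protocol would repeat this over a grid of target positions.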

  5. [Case of acute ophthalmoparesis with gaze nystagmus].

    PubMed

    Ikuta, Naomi; Tada, Yukiko; Koga, Michiaki

    2012-01-01

    A 61-year-old man developed double vision subsequent to a diarrheal illness. Mixed horizontal-vertical gaze palsy in both eyes, diminution of tendon reflexes, and gaze nystagmus were noted. His horizontal gaze palsy was accompanied by gaze nystagmus in the abducent direction, indicative of a disturbance in the central nervous system. Neither limb weakness nor ataxia was noted. Serum anti-GQ1b antibody was detected. Brain magnetic resonance imaging (MRI) findings were normal. The patient was diagnosed as having acute ophthalmoparesis. The ophthalmoparesis and nystagmus gradually disappeared within 3 months. The accompanying nystagmus suggests that central nervous system disturbance may also be present in acute ophthalmoparesis.

  6. The Microstructure of Infants' Gaze as They View Adult Shifts in Overt Attention

    ERIC Educational Resources Information Center

    Gredeback, Gustaf; Theuring, Carolin; Hauf, Petra; Kenward, Ben

    2008-01-01

    We presented infants (5, 6, 9, and 12 months old) with movies in which a female model turned toward and fixated 1 of 2 toys placed on a table. Infants' gaze was measured using a Tobii 1750 eye tracker. Six-, 9-, and 12-month-olds' first gaze shift from the model's face (after the model started turning) was directed to the attended toy. The…

  7. The interaction of pupil response with the vergence system.

    PubMed

    Feil, Moritz; Moser, Barbara; Abegg, Mathias

    2017-11-01

    A gaze shift from a target at distance to a target at near leads to pupillary constriction. The regulation of this pupillary near response is poorly understood. We investigated the impact of accommodation, convergence, and proximity on pupillary diameter. We recorded pupil size and vergence eye movements with an infrared eye tracker. We determined the pupillary response in four conditions: (1) after a gaze shift from far to near without accommodation, (2) after a gaze shift from far to near with neither accommodation nor convergence, (3) after accommodation alone, and (4) after accommodation with convergence without a gaze shift to near. These responses were compared to the pupil response of a full near response and to a gaze shift from one far target to another. We found a reliable pupillary near response. The removal of both accommodation and convergence in a gaze shift from far to near abolished the pupillary near response. Accommodation alone did not induce pupillary constriction, while convergence and accommodation together induced a pupil response similar to the full near response. The main trigger for the pupillary response seems to be convergence. Neither accommodation nor proximity alone induces a significant pupillary constriction. This suggests that the miosis of the near triad is closely coupled to the vergence system rather than being independently regulated.

  8. Emotion Unchained: Facial Expression Modulates Gaze Cueing under Cognitive Load.

    PubMed

    Pecchinenda, Anna; Petrucci, Manuel

    2016-01-01

    Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information.

  9. Emotion Unchained: Facial Expression Modulates Gaze Cueing under Cognitive Load

    PubMed Central

    Petrucci, Manuel

    2016-01-01

    Direction of eye gaze cues spatial attention, and typically this cueing effect is not modulated by the expression of a face unless top-down processes are explicitly or implicitly involved. To investigate the role of cognitive control on gaze cueing by emotional faces, participants performed a gaze cueing task with happy, angry, or neutral faces under high (i.e., counting backward by 7) or low cognitive load (i.e., counting forward by 2). Results show that high cognitive load enhances gaze cueing effects for angry facial expressions. In addition, cognitive load reduces gaze cueing for neutral faces, whereas happy facial expressions and gaze affected object preferences regardless of load. This evidence clearly indicates a differential role of cognitive control in processing gaze direction and facial expression, suggesting that under typical conditions, when we shift attention based on social cues from another person, cognitive control processes are used to reduce interference from emotional information. PMID:27959925

  10. Rett syndrome: basic features of visual processing-a pilot study of eye-tracking.

    PubMed

    Djukic, Aleksandra; Valicenti McDermott, Maria; Mavrommatis, Kathleen; Martins, Cristina L

    2012-07-01

    Consistently observed "strong eye gaze" has not been validated as a means of communication in girls with Rett syndrome, ubiquitously affected by apraxia, unable to reply either verbally or manually to questions during formal psychologic assessment. We examined nonverbal cognitive abilities and basic features of visual processing (visual discrimination attention/memory) by analyzing patterns of visual fixation in 44 girls with Rett syndrome, compared with typical control subjects. To determine features of visual fixation patterns, multiple pictures (with the location of the salient and presence/absence of novel stimuli as variables) were presented on the screen of a TS120 eye-tracker. Of the 44, 35 (80%) calibrated and exhibited meaningful patterns of visual fixation. They looked longer at salient stimuli (cartoon, 2.8 ± 2 seconds S.D., vs shape, 0.9 ± 1.2 seconds S.D.; P = 0.02), regardless of their position on the screen. They recognized novel stimuli, decreasing the fixation time on the central image when another image appeared on the periphery of the slide (2.7 ± 1 seconds S.D. vs 1.8 ± 1 seconds S.D., P = 0.002). Eye-tracking provides a feasible method for cognitive assessment and new insights into the "hidden" abilities of individuals with Rett syndrome. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Single-plane compensatory phase shift of head and eye oscillations in infantile nystagmus syndrome.

    PubMed

    Anagnostou, Evangelos; Spengos, Konstantinos; Anastasopoulos, Dimitri

    2011-09-15

    A 43-year-old man with infantile nystagmus syndrome complained of "head tremor" that would occur during attempted reading. Three-dimensional, combined eye and head recordings were performed with the magnetic search coil technique in two conditions: 1) looking straight ahead under photopic conditions without a particular attentional focus and 2) reading a simple text held one meter away. A mainly vertical-horizontal spontaneous nystagmus was evident in both conditions, whereas head nodding emerged in the second condition. The head oscillated only in the vertical plane, and concomitant analysis of eye and head displacement revealed a counterphase, compensatory pattern of the first harmonic of the INS waveform. This was verified by the significant negative peak of the cross-correlogram at zero lag. Eye-in-space (gaze) displacement during nystagmic oscillations was thereby reduced, suggesting a central adaptive behavior that may have evolved to partly compensate for the abnormal eye movements during reading. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Driving experience and special skills reflected in eye movements

    NASA Astrophysics Data System (ADS)

    Paeglis, Roberts; Bluss, Kristaps; Atvars, Aigars

    2011-10-01

    When driving a vehicle, people use central vision both to plan ahead and to monitor their performance feedback (research by Donges, 1978 [1], and after). Discussion is ongoing as to whether eye movements do more than gather information; they may also prepare subsequent body movements such as steering. Different paradigms exist to explore vision in driving. Our approach was to quantify the eye movements and fixation patterns of individuals of different proficiency: a driving learner, a novice, an experienced driver, and a European-level car racer. For safety reasons we started by asking them to follow a video tour through a familiar city, recorded with a remote infrared eye tracker sampling at 250 Hz. We report that the gaze strategy of an experienced driver differs qualitatively from that of an automobile sports master. Quantitative differences only were found between the latter and a driving learner or a novice driver. Experience in a motor action provides skills different from sports training. We aim to test this finding in real-world driving.

  13. Auditory noise increases the allocation of attention to the mouth, and the eyes pay the price: An eye-tracking study.

    PubMed

    Król, Magdalena Ewa

    2018-01-01

    We investigated the effect of auditory noise added to speech on patterns of looking at faces in 40 toddlers. We hypothesised that noise would increase the difficulty of processing speech, making children allocate more attention to the mouth of the speaker to gain visual speech cues from mouth movements. We also hypothesised that this shift would cause a decrease in fixation time to the eyes, potentially decreasing the ability to monitor gaze. We found that adding noise increased the number of fixations to the mouth area, at the price of a decreased number of fixations to the eyes. Thus, to our knowledge, this is the first study demonstrating a mouth-eyes trade-off between attention allocated to social cues coming from the eyes and linguistic cues coming from the mouth. We also found that children with higher word recognition proficiency and higher average pupil response had an increased likelihood of fixating the mouth, compared to the eyes and the rest of the screen, indicating stronger motivation to decode the speech.

  14. Auditory noise increases the allocation of attention to the mouth, and the eyes pay the price: An eye-tracking study

    PubMed Central

    2018-01-01

    We investigated the effect of auditory noise added to speech on patterns of looking at faces in 40 toddlers. We hypothesised that noise would increase the difficulty of processing speech, making children allocate more attention to the mouth of the speaker to gain visual speech cues from mouth movements. We also hypothesised that this shift would cause a decrease in fixation time to the eyes, potentially decreasing the ability to monitor gaze. We found that adding noise increased the number of fixations to the mouth area, at the price of a decreased number of fixations to the eyes. Thus, to our knowledge, this is the first study demonstrating a mouth-eyes trade-off between attention allocated to social cues coming from the eyes and linguistic cues coming from the mouth. We also found that children with higher word recognition proficiency and higher average pupil response had an increased likelihood of fixating the mouth, compared to the eyes and the rest of the screen, indicating stronger motivation to decode the speech. PMID:29558514

  15. Modification of Eccentric Gaze-Holding

    NASA Technical Reports Server (NTRS)

    Reschke, M. F.; Paloski, W. H.; Somers, J. T.; Leigh, R. J.; Wood, S. J.; Kornilova, L.

    2006-01-01

    Clear vision and accurate localization of objects in the environment are prerequisites for reliable performance of motor tasks. Space flight confronts the crewmember with a stimulus rearrangement that requires adaptation to function effectively with the new requirements of altered spatial orientation and motor coordination. Adaptation and motor learning driven by the effects of cerebellar disorders may share some of the same demands that face our astronauts. One measure of spatial localization shared by the astronauts and those suffering from cerebellar disorders that is easily quantified, and for which a neurobiological substrate has been identified, is the control of the angle of gaze (the "line of sight"). The disturbances of gaze control that have been documented to occur in astronauts and cosmonauts, both in-flight and postflight, can be directly related to changes in the extrinsic gravitational environment and intrinsic proprioceptive mechanisms, thus lending themselves to description by simple non-linear statistical models. Because of the necessity of developing robust normal response populations and normative populations against which abnormal responses can be evaluated, the basic models can be formulated using normal, non-astronaut test subjects and subsequently extended using centrifugation techniques to alter the gravitational and proprioceptive environment of these subjects. Further tests and extensions of the models can be made by studying abnormalities of gaze control in patients with cerebellar disease. A series of investigations was conducted in which a total of 62 subjects were tested to: (1) define eccentric gaze-holding parameters in a normative population, and (2) explore the effects of linear acceleration on gaze-holding parameters. For these studies gaze-holding was evaluated with the subjects seated upright (the normative values), rolled 45 degrees to both the left and right, or pitched back 30 and 90 degrees.
In a separate study the further effects of acceleration on gaze stability were examined during centrifugation (+2 Gx and +2 Gz) using a total of 23 subjects. In all of our investigations eccentric gaze-holding was established by having the subjects acquire an eccentric target (+/-30 degrees horizontal, +/-15 degrees vertical) that was flashed for 750 msec in an otherwise dark room. Subjects were instructed to hold gaze on the remembered position of the flashed target for 20 sec. Immediately following the 20 sec period, subjects were cued to return to the remembered center position and to hold gaze there for an additional 20 sec. Following this 20 sec period the center target was briefly flashed and the subject made any corrective eye movement back to the true center position. Conventionally, the ability to hold eccentric gaze is estimated by fitting the natural log of centripetal eye drifts by linear regression and calculating the time constant (tau (sub c)) of these slow phases of "gaze-evoked nystagmus". However, because our normative subjects sometimes showed essentially no drift (tau (sub c) = infinity), statistical estimation and inference on the effect of target direction was performed on values of the decay constant theta = 1/(tau (sub c)), which we found was well modeled by a gamma distribution. Subjects showed substantial variance of their eye drifts, which were centrifugal in approximately 20% of cases, and >40% for down gaze. Using the ensuing estimated gamma distributions, we were able to conclude that rightward and leftward gaze holding were not significantly different, but that upward gaze holding was significantly worse than downward (p<0.05). We also concluded that vertical gaze holding was significantly worse than horizontal (p<0.05). In the case of left and right roll, we found that both had a similar improvement to horizontal gaze holding (p<0.05), but did not have a significant effect on vertical gaze holding. 
For pitch tilts, both tilt angles significantly decreased gaze-holding ability in all directions (p<0.05). Finally, we found that hyper-g centrifugation significantly decreased gaze-holding ability in the vertical plane. The main findings of this study are as follows: (1) vertical gaze-holding is less stable than horizontal, (2) gaze-holding to upward targets is less stable than to downward targets, (3) tilt affects gaze-holding, and (4) hyper-g affects gaze-holding. This difference between horizontal and vertical gaze-holding may be ascribed to separate components of the velocity-to-position neural integrator for eye movements, and to differences in orbital mechanics. The differences between upward and downward gaze-holding may be ascribed to an inherent vertical imbalance in the vestibular system. Because whole body tilt and hyper-g affect gaze-holding, it is implied that the otolith organs have direct connections to the neural integrator, and further studies of astronaut gaze-holding are warranted. Our statistical method for representing the range of normal eccentric gaze stability can be readily applied to normals who may be exposed to environments which may modify the central integrator and require monitoring, and to evaluate patients with gaze-evoked nystagmus by comparison with the normative criteria established above.
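
    The time-constant estimate this abstract describes (linear regression on the natural log of centripetal eye drift, with the decay constant theta = 1/tau as the analysis variable) can be sketched as follows. The drift data here are synthetic and noiseless, chosen purely to illustrate the fit:

```python
import math

def fit_time_constant(times, positions):
    """Least-squares fit of ln(position) vs time; the slope is -1/tau,
    so the gaze-holding time constant is tau = -1/slope."""
    logs = [math.log(p) for p in positions]
    n = len(times)
    mt = sum(times) / n
    ml = sum(logs) / n
    slope = (sum((t - mt) * (l - ml) for t, l in zip(times, logs))
             / sum((t - mt) ** 2 for t in times))
    return -1.0 / slope

# Synthetic eccentric-gaze drift: tau = 10 s, starting at 30 deg eccentricity.
tau_true = 10.0
times = [i * 0.5 for i in range(40)]
positions = [30.0 * math.exp(-t / tau_true) for t in times]
tau_hat = fit_time_constant(times, positions)
theta = 1.0 / tau_hat  # decay constant used for the gamma-distribution stats
```

    Working with theta rather than tau is what lets the no-drift case (tau = infinity) enter the statistics as a finite value, theta = 0.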

  16. Predicting diagnostic error in radiology via eye-tracking and image analytics: Preliminary investigation in mammography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Voisin, Sophie; Tourassi, Georgia D.; Pinto, Frank

    2013-10-15

    Purpose: The primary aim of the present study was to test the feasibility of predicting diagnostic errors in mammography by merging radiologists’ gaze behavior and image characteristics. A secondary aim was to investigate group-based and personalized predictive models for radiologists of variable experience levels. Methods: The study was performed for the clinical task of assessing the likelihood of malignancy of mammographic masses. Eye-tracking data and diagnostic decisions for 40 cases were acquired from four radiology residents and two breast imaging experts as part of an IRB-approved pilot study. Gaze behavior features were extracted from the eye-tracking data. Computer-generated and BIRADS image features were extracted from the images. Finally, machine learning algorithms were used to merge gaze and image features for predicting human error. Feature selection was thoroughly explored to determine the relative contribution of the various features. Group-based and personalized user modeling was also investigated. Results: Machine learning can be used to predict diagnostic error by merging gaze behavior characteristics from the radiologist and textural characteristics from the image under review. Leveraging data collected from multiple readers produced a reasonable group model [area under the ROC curve (AUC) = 0.792 ± 0.030]. Personalized user modeling was far more accurate for the more experienced readers (AUC = 0.837 ± 0.029) than for the less experienced ones (AUC = 0.667 ± 0.099). The best performing group-based and personalized predictive models involved combinations of both gaze and image features. Conclusions: Diagnostic errors in mammography can be predicted to a good extent by leveraging the radiologists’ gaze behavior and image content.
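
    The AUC figures quoted above measure how well the predicted error scores rank actual misses above correct reads. One standard way to compute AUC, the rank-based (Mann-Whitney) formulation, is sketched below with toy labels and scores, not the study's data:

```python
def roc_auc(labels, scores):
    """AUC via the Mann-Whitney formulation: the probability that a randomly
    chosen positive case scores higher than a randomly chosen negative one,
    counting ties as half a win."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: labels mark diagnostic errors, scores are model predictions.
auc = roc_auc([1, 1, 0, 0], [0.9, 0.4, 0.6, 0.1])  # 0.75
```

    An AUC of 0.5 corresponds to chance-level ranking, which is why values such as 0.79 and 0.84 in the abstract indicate useful predictive signal.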

  17. Video attention deviation estimation using inter-frame visual saliency map analysis

    NASA Astrophysics Data System (ADS)

    Feng, Yunlong; Cheung, Gene; Le Callet, Patrick; Ji, Yusheng

    2012-01-01

    A viewer's visual attention during video playback is the matching of his eye gaze movement to the changing video content over time. If the gaze movement matches the video content (e.g., follows a rolling soccer ball), then the viewer keeps his visual attention. If the gaze location moves from one video object to another, then the viewer shifts his visual attention. A video that causes a viewer to shift his attention often is a "busy" video. Determining which video content is busy is an important practical problem: a busy video makes it difficult for an encoder to deploy region-of-interest (ROI)-based bit allocation, and hard for a content provider to insert additional overlays like advertisements, making the video even busier. One way to determine the busyness of video content is to conduct eye gaze experiments with a sizable group of test subjects, but this is time-consuming and not cost-effective. In this paper, we propose an alternative method to determine the busyness of video, formally called video attention deviation (VAD): analyze the spatial visual saliency maps of the video frames across time. We first derive transition probabilities of a Markov model for eye gaze using saliency maps of a number of consecutive frames. We then compute the steady state probability of the saccade state in the model, which is our estimate of VAD. We demonstrate that the computed steady state probability for saccade using saliency map analysis matches that computed using actual gaze traces for a range of videos with different degrees of busyness. Further, our analysis can also be used to segment video into shorter clips of different degrees of busyness by computing the Kullback-Leibler divergence using consecutive motion-compensated saliency maps.
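
    For a two-state fixation/saccade chain, the steady-state probability the authors compute has a closed form, and the Kullback-Leibler divergence they use for segmentation is a one-liner. A minimal sketch with made-up transition probabilities (the paper estimates these from saliency maps, which is not reproduced here):

```python
import math

def saccade_steady_state(p_fix_to_sac: float, p_sac_to_fix: float) -> float:
    """Stationary probability of the 'saccade' state in a two-state Markov
    chain with transitions fixation->saccade (a) and saccade->fixation (b):
    pi_saccade = a / (a + b)."""
    a, b = p_fix_to_sac, p_sac_to_fix
    return a / (a + b)

def kl_divergence(p, q):
    """KL divergence D(p || q) between two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical transition probabilities: the resulting steady-state
# saccade probability plays the role of the busyness (VAD) estimate.
vad = saccade_steady_state(0.1, 0.4)  # 0.2
```

    A busier video yields a larger fixation-to-saccade probability and hence a larger steady-state saccade probability; identical consecutive saliency maps give zero KL divergence, marking a segment boundary-free stretch.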

  18. Age Deficits in Facial Affect Recognition: The Influence of Dynamic Cues.

    PubMed

    Grainger, Sarah A; Henry, Julie D; Phillips, Louise H; Vanman, Eric J; Allen, Roy

    2017-07-01

    Older adults have difficulties in identifying most facial expressions of emotion. However, most aging studies have presented static photographs of intense expressions, whereas in everyday experience people see emotions that develop and change. The present study was designed to assess whether age-related difficulties with emotion recognition are reduced when more ecologically valid (i.e., dynamic) stimuli are used. We examined the effect of stimulus format (i.e., static vs. dynamic) on facial affect recognition in two separate studies that included independent samples and distinct stimulus sets. In addition to younger and older participants, a middle-aged group was included in Study 1, and eye gaze patterns were assessed in Study 2. Across both studies, older adults performed worse than younger adults on measures of facial affect recognition. In Study 1, older and middle-aged adults benefited from dynamic stimuli, but only when the emotional displays were subtle. Younger adults gazed more at the eye region of the face relative to older adults (Study 2), but dynamic presentation increased attention towards the eye region for younger adults only. Together, these studies provide important and novel insights into the specific circumstances in which older adults may be expected to experience difficulties in perceiving facial emotions. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  19. Eye-Gaze Analysis of Facial Emotion Recognition and Expression in Adolescents with ASD.

    PubMed

    Wieckowski, Andrea Trubanova; White, Susan W

    2017-01-01

    Impaired emotion recognition and expression in individuals with autism spectrum disorder (ASD) may contribute to observed social impairment. The aim of this study was to examine the role of visual attention directed toward nonsocial aspects of a scene as a possible mechanism underlying recognition and expressive ability deficiency in ASD. One recognition and two expression tasks were administered. Recognition was assessed in a forced-choice paradigm, and expression was assessed during scripted and free-choice response (in response to emotional stimuli) tasks in youth with ASD (n = 20) and an age-matched sample of typically developing youth (n = 20). During stimulus presentation prior to response in each task, participants' eye gaze was tracked. Youth with ASD were less accurate at identifying disgust and sadness in the recognition task. They fixated less to the eye region of stimuli showing surprise. A group difference was found during the free-choice response task, such that those with ASD expressed emotion less clearly, but not during the scripted task. Results suggest altered eye gaze to the mouth region, but not the eye region, as a candidate mechanism for decreased ability to recognize or express emotion. Findings inform our understanding of the association between social attention and emotion recognition and expression deficits.

  20. Eyes that bind us: Gaze leading induces an implicit sense of agency.

    PubMed

    Stephenson, Lisa J; Edwards, S Gareth; Howard, Emma E; Bayliss, Andrew P

    2018-03-01

    Humans feel a sense of agency over the effects their motor system causes. This is the case for manual actions such as pushing buttons, kicking footballs, and all acts that affect the physical environment. We ask whether initiating joint attention - causing another person to follow our eye movement - can elicit an implicit sense of agency over this congruent gaze response. Eye movements themselves cannot directly affect the physical environment, but joint attention is an example of how eye movements can indirectly cause social outcomes. Here we show that leading the gaze of an on-screen face induces an underestimation of the temporal gap between action and consequence (Experiments 1 and 2). This underestimation effect, named 'temporal binding,' is thought to be a measure of an implicit sense of agency. Experiment 3 asked whether merely making an eye movement in a non-agentic, non-social context might also affect temporal estimation, and no reliable effects were detected, implying that inconsequential oculomotor acts do not reliably affect temporal estimations under these conditions. Together, these findings suggest that an implicit sense of agency is generated when initiating joint attention interactions. This is important for understanding how humans can efficiently detect and understand the social consequences of their actions. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Risk and Ambiguity in Information Seeking: Eye Gaze Patterns Reveal Contextual Behavior in Dealing with Uncertainty.

    PubMed

    Wittek, Peter; Liu, Ying-Hsang; Darányi, Sándor; Gedeon, Tom; Lim, Ik Soo

    2016-01-01

    Information foraging connects optimal foraging theory in ecology with how humans search for information. The theory suggests that, following an information scent, the information seeker must optimize the tradeoff between exploration by repeated steps in the search space vs. exploitation, using the resources encountered. We conjecture that this tradeoff characterizes how a user deals with uncertainty and its two aspects, risk and ambiguity in economic theory. Risk is related to the perceived quality of the actually visited patch of information, and can be reduced by exploiting and understanding the patch to a better extent. Ambiguity, on the other hand, is the opportunity cost of having higher quality patches elsewhere in the search space. The aforementioned tradeoff depends on many attributes, including traits of the user: at the two extreme ends of the spectrum, analytic and wholistic searchers employ entirely different strategies. The former type focuses on exploitation first, interspersed with bouts of exploration, whereas the latter type prefers to explore the search space first and consume later. Our findings from an eye-tracking study of experts' interactions with novel search interfaces in the biomedical domain suggest that user traits of cognitive styles and perceived search task difficulty are significantly correlated with eye gaze and search behavior. We also demonstrate that perceived risk shifts the balance between exploration and exploitation in either type of users, tilting it against vs. in favor of ambiguity minimization. Since the pattern of behavior in information foraging is quintessentially sequential, risk and ambiguity minimization cannot happen simultaneously, leading to a fundamental limit on how good such a tradeoff can be. This in turn connects information seeking with the emergent field of quantum decision theory.
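    The exploration-exploitation tradeoff described above can be illustrated with a toy simulation. This is not the study's model: the patch qualities, the single exploration-rate parameter, and the epsilon-greedy policy are all assumptions introduced purely to make the tradeoff concrete.

```python
# Toy illustration: an information seeker choosing, at each step, between
# exploiting the current information patch and exploring for a new one.
import numpy as np

rng = np.random.default_rng(7)
patch_quality = rng.uniform(0.2, 1.0, size=10)  # hypothetical patch values
epsilon = 0.3                                   # exploration probability
current, gained = 0, 0.0

for _ in range(1000):
    if rng.random() < epsilon:                  # explore: jump to a random patch
        current = rng.integers(len(patch_quality))
    gained += patch_quality[current]            # exploit: consume current patch

print(round(gained / 1000, 2))
```

    In the abstract's terms, a low epsilon corresponds to the "analytic" searcher (mostly exploitation, reducing risk about the current patch) and a high epsilon to the "wholistic" searcher (mostly exploration, reducing ambiguity about what lies elsewhere); because steps are sequential, no single epsilon minimizes both at once.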

  2. Selective Visual Attention during Mirror Exposure in Anorexia and Bulimia Nervosa.

    PubMed

    Tuschen-Caffier, Brunna; Bender, Caroline; Caffier, Detlef; Klenner, Katharina; Braks, Karsten; Svaldi, Jennifer

    2015-01-01

    Cognitive theories suggest that body dissatisfaction results from the activation of maladaptive appearance schemata, which guide mental processes such as selective attention to shape and weight-related information. In line with this, the present study hypothesized that patients with anorexia nervosa (AN) and bulimia nervosa (BN) are characterized by increased visual attention for the most dissatisfying/ugly body part compared to their most satisfying/beautiful body part, while a more balanced viewing pattern was expected for controls without eating disorders (CG). Eye movements were recorded in a group of patients with AN (n = 16), BN (n = 16) and a CG (n = 16) in an ecologically valid setting, i.e., during a 3-min mirror exposure. Evidence was found that patients with AN and BN display longer and more frequent gazes towards the most dissatisfying relative to the most satisfying and towards their most ugly compared to their most beautiful body parts, whereas the CG showed a more balanced gaze pattern. The results converge with theoretical models that emphasize the role of information processing in the maintenance of body dissatisfaction. Given the etiological importance of body dissatisfaction in the development of eating disorders, future studies should focus on the modification of the reported patterns.

  3. Comparing spatially static and dynamic vibrotactile take-over requests in the driver seat.

    PubMed

    Petermeijer, S M; Cieler, S; de Winter, J C F

    2017-02-01

    Vibrotactile stimuli can be effective as warning signals, but their effectiveness as directional take-over requests in automated driving is yet unknown. This study aimed to investigate the correct response rate, reaction times, and eye and head orientation for static versus dynamic directional take-over requests presented via vibrating motors in the driver seat. In a driving simulator, eighteen participants performed three sessions: 1) a session involving no driving (Baseline), 2) driving a highly automated car without additional task (HAD), and 3) driving a highly automated car while performing a mentally demanding task (N-Back). Per session, participants received four directional static (in the left or right part of the seat) and four dynamic (moving from one side towards the opposite left or right of the seat) take-over requests via two 6×4 motor matrices embedded in the seat back and bottom. In the Baseline condition, participants reported whether the cue was left or right, and in the HAD and N-Back conditions participants had to change lanes to the left or to the right according to the directional cue. The correct response rate was operationalized as the accuracy of the self-reported direction (Baseline session) and the accuracy of the lane change direction (HAD & N-Back sessions). The results showed that the correct response rate ranged between 94% for static patterns in the Baseline session and 74% for dynamic patterns in the N-Back session, although these effects were not statistically significant. Steering wheel touch and steering input reaction times were approximately 200 ms faster for static patterns than for dynamic ones. Eye tracking results revealed a correspondence between head/eye-gaze direction and lane change direction, and showed that head and eye-gaze movements were initiated faster for static vibrations than for dynamic ones.
In conclusion, vibrotactile stimuli presented via the driver seat are effective as warnings, but their effectiveness as directional take-over requests may be limited. The present study may encourage further investigation into how to get drivers safely back into the loop. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Follow My Eyes: The Gaze of Politicians Reflexively Captures the Gaze of Ingroup Voters

    PubMed Central

    Liuzza, Marco Tullio; Cazzato, Valentina; Vecchione, Michele; Crostella, Filippo; Caprara, Gian Vittorio; Aglioti, Salvatore Maria

    2011-01-01

    Studies in human and non-human primates indicate that basic socio-cognitive operations are inherently linked to the power of gaze in capturing reflexively the attention of an observer. Although monkey studies indicate that the automatic tendency to follow the gaze of a conspecific is modulated by the leader-follower social status, evidence for such effects in humans is meager. Here, we used a gaze following paradigm where the directional gaze of right- or left-wing Italian political characters could influence the oculomotor behavior of ingroup or outgroup voters. We show that the gaze of Berlusconi, the right-wing leader currently dominating the Italian political landscape, potentiates and inhibits gaze following behavior in ingroup and outgroup voters, respectively. Importantly, the higher the perceived similarity in personality traits between voters and Berlusconi, the stronger the gaze interference effect. Thus, higher-order social variables such as political leadership and affiliation prepotently affect reflexive shifts of attention. PMID:21957479

  5. The Expressive Gaze Model: Using Gaze to Express Emotion

    DTIC Science & Technology

    2010-07-01

    World of Warcraft or Oblivion, have thousands of computer-controlled nonplayer characters with which users can interact. Producing hand-generated...increasing to the right and the vertical increasing upward. In both cases, 0 degrees is straight ahead. Although the mechanical limits of human eye...to gaze from a target directly in front of her to one 60 degrees to her right, while performing these behaviors in a manner that expressed the de

  6. The reticular formation.

    PubMed

    Horn, Anja K E

    2006-01-01

    The reticular formation of the brainstem contains functional cell groups that are important for the control of eye, head, or lid movements. The mesencephalic reticular formation is primarily involved in the control of vertical gaze, the paramedian pontine reticular formation in horizontal gaze, and the medullary pontine reticular formation in head movements and gaze holding. In this chapter, the locations, connections, and histochemical properties of the functional cell groups are reviewed and correlated with specific subdivisions of the reticular formation.

  7. Eye-gaze determination of user intent at the computer interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, J.H.; Schryver, J.C.

    1993-12-31

    Determination of user intent at the computer interface through eye-gaze monitoring can significantly aid applications for the disabled, as well as telerobotics and process control interfaces. Whereas current eye-gaze control applications are limited to object selection and x/y gazepoint tracking, a methodology was developed here to discriminate a more abstract interface operation: zooming-in or out. This methodology first collects samples of eye-gaze location looking at controlled stimuli, at 30 Hz, just prior to a user's decision to zoom. The sample is broken into data frames, or temporal snapshots. Within a data frame, all spatial samples are connected into a minimum spanning tree, then clustered, according to user-defined parameters. Each cluster is mapped to one in the prior data frame, and statistics are computed from each cluster. These characteristics include cluster size, position, and pupil size. A multiple discriminant analysis uses these statistics both within and between data frames to formulate optimal rules for assigning the observations into zoom-in, zoom-out, or no-zoom conditions. The statistical procedure effectively generates heuristics for future assignments, based upon these variables. Future work will enhance the accuracy and precision of the modeling technique, and will empirically test users in controlled experiments.
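    The clustering step described above (connect the gaze samples of one data frame into a minimum spanning tree, then split it into clusters with a user-defined parameter) can be sketched as follows. The gaze samples and the distance threshold are hypothetical; the original work's exact clustering rule is not specified in the abstract.

```python
# Sketch: cluster one data frame of 2-D gaze samples by building a minimum
# spanning tree over pairwise distances and cutting its long edges.
import numpy as np
from scipy.sparse.csgraph import connected_components, minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(1)
# Hypothetical gaze samples forming two fixation clusters on screen.
pts = np.vstack([rng.normal(0, 5, (15, 2)), rng.normal(200, 5, (15, 2))])

dist = squareform(pdist(pts))            # pairwise Euclidean distances
mst = minimum_spanning_tree(dist).toarray()

threshold = 50.0                         # user-defined edge-length cutoff
mst[mst > threshold] = 0.0               # cut edges longer than the threshold
n_clusters, labels = connected_components(mst != 0, directed=False)
print(n_clusters)                        # → 2 for these two well-separated groups
```

    Per-cluster statistics such as size and centroid position (the inputs to the discriminant analysis) can then be computed by grouping the samples on `labels`.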

  8. A Gaze Independent Brain-Computer Interface Based on Visual Stimulation through Closed Eyelids

    NASA Astrophysics Data System (ADS)

    Hwang, Han-Jeong; Ferreria, Valeria Y.; Ulrich, Daniel; Kilic, Tayfun; Chatziliadis, Xenofon; Blankertz, Benjamin; Treder, Matthias

    2015-10-01

    A classical brain-computer interface (BCI) based on visual event-related potentials (ERPs) is of limited application value for paralyzed patients with severe oculomotor impairments. In this study, we introduce a novel gaze independent BCI paradigm that can be potentially used for such end-users because visual stimuli are administered on closed eyelids. The paradigm involved verbally presented questions with 3 possible answers. Online BCI experiments were conducted with twelve healthy subjects, where they selected one option by attending to one of three different visual stimuli. It was confirmed that typical cognitive ERPs can be evidently modulated by the attention of a target stimulus in eyes-closed and gaze independent condition, and further classified with high accuracy during online operation (74.58% ± 17.85 s.d.; chance level 33.33%), demonstrating the effectiveness of the proposed novel visual ERP paradigm. Also, stimulus-specific eye movements observed during stimulation were verified as reflex responses to light stimuli, and they did not contribute to classification. To the best of our knowledge, this study is the first to show the possibility of using a gaze independent visual ERP paradigm in an eyes-closed condition, thereby providing another communication option for severely locked-in patients suffering from complex ocular dysfunctions.

  9. Selective looking at natural scenes: Hedonic content and gender.

    PubMed

    Bradley, Margaret M; Costa, Vincent D; Lang, Peter J

    2015-10-01

    Choice viewing behavior when looking at affective scenes was assessed to examine differences due to hedonic content and gender by monitoring eye movements in a selective looking paradigm. On each trial, participants viewed a pair of pictures that included a neutral picture together with an affective scene depicting either contamination, mutilation, threat, food, nude males, or nude females. The duration of time that gaze was directed to each picture in the pair was determined from eye fixations. Results indicated that viewing choices varied with both hedonic content and gender. Initially, gaze duration for both men and women was heightened when viewing all affective contents, but was subsequently followed by significant avoidance of scenes depicting contamination or nude males. Gender differences were most pronounced when viewing pictures of nude females, with men continuing to devote longer gaze time to pictures of nude females throughout viewing, whereas women avoided scenes of nude people, whether male or female, later in the viewing interval. For women, reported disgust of sexual activity was also inversely related to gaze duration for nude scenes. Taken together, selective looking as indexed by eye movements reveals differential perceptual intake as a function of specific content, gender, and individual differences. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. A new system for quantitative evaluation of infant gaze capabilities in a wide visual field.

    PubMed

    Pratesi, Andrea; Cecchi, Francesca; Beani, Elena; Sgandurra, Giuseppina; Cioni, Giovanni; Laschi, Cecilia; Dario, Paolo

    2015-09-07

    The visual assessment of infants poses specific challenges: many techniques that are used on adults are based on the patient's response, and are not suitable for infants. Significant advances in eye tracking have made the assessment of infant visual capabilities easier; however, eye tracking still requires the subject's collaboration in most cases, limiting its application in infant research. Moreover, there is a lack of transferability to clinical practice, so the need emerges for a new tool to measure the paradigms and explore the most common visual competences in a wide visual field. This work presents the design, development and preliminary testing of a new system for measuring an infant's gaze in the wide visual field, called CareToy C: CareToy for Clinics. The system is based on a commercial eye tracker (SmartEye) with six cameras running at 60 Hz, suitable for measuring an infant's gaze. In order to stimulate the infant visually and audibly, a mechanical structure has been designed to support five speakers and five screens at a specific distance (60 cm) and angle: one in the centre, two on the right-hand side and two on the left (at 30° and 60°, respectively). Different tasks have been designed in order to evaluate the system's capability to assess the infant's gaze movements during different conditions (such as gap, overlap or audio-visual paradigms). Nine healthy infants aged 4-10 months were assessed as they performed the visual tasks at random. We developed a system able to measure an infant's gaze in a wide visual field covering a total visual range of ±60° from the centre with an intermediate evaluation at ±30°. Moreover, the same system, thanks to different integrated software, was able to provide different visual paradigms (such as gap, overlap and audio-visual), assessing and comparing different visual and multisensory sub-competencies.
The proposed system enabled the integration of a commercial eye tracker into a purposive setup in a smart and innovative way. The proposed system is suitable for measuring and evaluating an infant's gaze capabilities in a wide visual field, in order to provide quantitative data that can enrich the clinical assessment.

  11. Association of predeployment gaze bias for emotion stimuli with later symptoms of PTSD and depression in soldiers deployed in Iraq.

    PubMed

    Beevers, Christopher G; Lee, Han-Joo; Wells, Tony T; Ellis, Alissa J; Telch, Michael J

    2011-07-01

    Biased processing of emotion stimuli is thought to confer vulnerability to psychopathology, but few longitudinal studies of this link have been conducted. The authors examined the relationship between predeployment gaze bias for emotion stimuli and later symptoms of posttraumatic stress disorder (PTSD) and depression in soldiers deployed to Iraq. An eye-tracking paradigm was used to assess line of gaze in 139 soldiers while they viewed a two-by-two matrix of fearful, sad, happy, and neutral facial expressions before they were deployed to Iraq. Once they were deployed, the soldiers periodically reported on their levels of war zone stress exposure and symptoms of PTSD and depression. War zone stress exposure predicted higher scores on PTSD and depression symptom measures; however, eye gaze bias moderated this relationship. In soldiers with war zone stress exposure, shorter mean fixation time when viewing fearful faces predicted higher PTSD symptom scores, and greater total fixation time and longer mean fixation time for sad faces predicted higher depressive symptom scores. Biased processing of emotion stimuli, as measured by gaze bias, appears to confer vulnerability to symptoms of PTSD and depression in soldiers who experience war zone stress.

  12. Solar retinopathy. A study from Nepal and from Germany.

    PubMed

    Rai, N; Thuladar, L; Brandt, F; Arden, G B; Berninger, T A

    1998-01-01

    A total of 319 patients with solar retinopathy were seen in an eye clinic in Nepal within 20 months. All patients had either a positive history of sun-gazing or typical circumscribed scars in the foveal area. In more than 80% of the patients the visual acuity was 6/12 or better and did not deteriorate over time. 126 (40%) patients had a history of gazing at the sun during an eclipse, 33 (10%) were sun worshipers and 4 (1%) were in both categories. Three years later, 29 patients were re-examined in a follow-up study. Only 16 had had visual disturbances directly after they had gazed into the sun. No colour vision defects were seen in any of the 44 affected eyes when tested with Panel D 15, while four patients (6 eyes) had some uncertainty with the tritan plates of the Ishihara test charts. Metamorphopsia was recorded in 11 eyes. Five German patients with solar retinopathy were examined in more detail. Colour contrast sensitivity (CCS) was tested for the central and the peripheral visual field. CCS for the tritan axis was raised in all patients for the central visual field, while it was normal for the peripheral visual field.

  13. Visual laterality in belugas (Delphinapterus leucas) and Pacific white-sided dolphins (Lagenorhynchus obliquidens) when viewing familiar and unfamiliar humans.

    PubMed

    Yeater, Deirdre B; Hill, Heather M; Baus, Natalie; Farnell, Heather; Kuczaj, Stan A

    2014-11-01

    Lateralization of cognitive processes and motor functions has been demonstrated in a number of species, including humans, elephants, and cetaceans. For example, bottlenose dolphins (Tursiops truncatus) have exhibited preferential eye use during a variety of cognitive tasks. The present study investigated the possibility of visual lateralization in 12 belugas (Delphinapterus leucas) and six Pacific white-sided dolphins (Lagenorhynchus obliquidens) located at two separate marine mammal facilities. During free swim periods, the belugas and Pacific white-sided dolphins were presented a familiar human, an unfamiliar human, or no human during 10-15 min sessions. Session videos were coded for gaze duration, eye presentation at approach, and eye preference while viewing each stimulus. Although we did not find any clear group level lateralization, we found individual left eye lateralized preferences related to social stimuli for most belugas and some Pacific white-sided dolphins. Differences in gaze durations were also observed. The majority of individual belugas had longer gaze durations for unfamiliar rather than familiar stimuli. These results suggest that lateralization occurs during visual processing of human stimuli in belugas and Pacific white-sided dolphins and that these species can distinguish between familiar and unfamiliar humans.

  14. Gaze Cueing of Attention

    PubMed Central

    Frischen, Alexandra; Bayliss, Andrew P.; Tipper, Steven P.

    2007-01-01

    During social interactions, people’s eyes convey a wealth of information about their direction of attention and their emotional and mental states. This review aims to provide a comprehensive overview of past and current research into the perception of gaze behavior and its effect on the observer. This encompasses the perception of gaze direction and its influence on perception of the other person, as well as gaze-following behavior such as joint attention, in infant, adult, and clinical populations. Particular focus is given to the gaze-cueing paradigm that has been used to investigate the mechanisms of joint attention. The contribution of this paradigm has been significant and will likely continue to advance knowledge across diverse fields within psychology and neuroscience. PMID:17592962

  15. Assessing visual attention using eye tracking sensors in intelligent cognitive therapies based on serious games.

    PubMed

    Frutos-Pascual, Maite; Garcia-Zapirain, Begonya

    2015-05-12

    This study examines the use of eye tracking sensors as a means to identify children's behavior in attention-enhancement therapies. For this purpose, a set of data collected from 32 children with different attention skills is analyzed during their interaction with a set of puzzle games. The authors of this study hypothesize that participants with better performance may have quantifiably different eye-movement patterns from users with poorer results. The use of eye trackers outside the research community may help to extend their potential with available intelligent therapies, bringing state-of-the-art technologies to users. The use of gaze data constitutes a new information source in intelligent therapies that may help to build new approaches that are fully-customized to final users' needs. This may be achieved by implementing machine learning algorithms for classification. The initial study of the dataset has proven a 0.88 (±0.11) classification accuracy with a random forest classifier, using cross-validation and hierarchical tree-based feature selection. Further approaches need to be examined in order to establish more detailed attention behaviors and patterns among children with and without attention problems.
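    The reported analysis (a random forest classifier evaluated with cross-validation on gaze-derived features) can be sketched in a few lines. The synthetic data, feature set, and labels below are stand-ins, not the study's dataset, and the hierarchical tree-based feature selection step is omitted for brevity.

```python
# Minimal sketch: cross-validated random forest on gaze-derived features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_children = 32                          # matches the study's sample size
X = rng.normal(size=(n_children, 5))     # hypothetical: fixation duration, saccade rate, ...
# Synthetic, balanced label (e.g., better vs. poorer performance) tied to one feature.
y = (X[:, 0] > np.median(X[:, 0])).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)    # 5-fold cross-validated accuracy
print(round(float(scores.mean()), 2))
```

    The ±0.11 spread reported in the abstract corresponds to the variability of such per-fold scores, which is substantial with only 32 participants.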

  16. Assessing Visual Attention Using Eye Tracking Sensors in Intelligent Cognitive Therapies Based on Serious Games

    PubMed Central

    Frutos-Pascual, Maite; Garcia-Zapirain, Begonya

    2015-01-01

    This study examines the use of eye tracking sensors as a means to identify children's behavior in attention-enhancement therapies. For this purpose, a set of data collected from 32 children with different attention skills is analyzed during their interaction with a set of puzzle games. The authors of this study hypothesize that participants with better performance may have quantifiably different eye-movement patterns from users with poorer results. The use of eye trackers outside the research community may help to extend their potential with available intelligent therapies, bringing state-of-the-art technologies to users. The use of gaze data constitutes a new information source in intelligent therapies that may help to build new approaches that are fully-customized to final users' needs. This may be achieved by implementing machine learning algorithms for classification. The initial study of the dataset has proven a 0.88 (±0.11) classification accuracy with a random forest classifier, using cross-validation and hierarchical tree-based feature selection. Further approaches need to be examined in order to establish more detailed attention behaviors and patterns among children with and without attention problems. PMID:25985158

  17. How Beauty Determines Gaze! Facial Attractiveness and Gaze Duration in Images of Real World Scenes

    PubMed Central

    Mitrovic, Aleksandra; Goller, Jürgen

    2016-01-01

    We showed that the looking time spent on faces is a valid covariate of beauty by testing the relation between facial attractiveness and gaze behavior. We presented natural scenes which always pictured two people, encompassing a wide range of facial attractiveness. Employing measurements of eye movements in a free viewing paradigm, we found a linear relation between facial attractiveness and gaze behavior: The more attractive the face, the longer and the more often it was looked at. In line with evolutionary approaches, the positive relation was particularly pronounced when participants viewed other sex faces. PMID:27698984

  18. Oculomotor Apraxia

    MedlinePlus

    ... a defect in, the control of voluntary purposeful eye movement. Children with this condition have difficulty moving their ... to compensate for this inability to initiate horizontal eye movements away from the straight-ahead gaze position. Typically, ...

  19. Is gaze following purely reflexive or goal-directed instead? Revisiting the automaticity of orienting attention by gaze cues.

    PubMed

    Ricciardelli, Paola; Carcagno, Samuele; Vallar, Giuseppe; Bricolo, Emanuela

    2013-01-01

    Distracting gaze has been shown to elicit automatic gaze following. However, it is still debated whether the effects of perceived gaze are a simple automatic spatial orienting response or are instead sensitive to the context (i.e. goals and task demands). In three experiments, we investigated the conditions under which gaze following occurs. Participants were instructed to saccade towards one of two lateral targets. A face distracter, always present in the background, could gaze towards: (a) a task-relevant target (a "matching" goal-directed gaze shift), congruent or incongruent with the instructed direction; (b) a task-irrelevant target, orthogonal to the one instructed (a "non-matching" goal-directed gaze shift); or (c) an empty spatial location (no goal-directed gaze shift). Eye movement recordings showed faster saccadic latencies in correct trials in congruent conditions, especially when the distracting gaze shift occurred before the instruction to make a saccade. Interestingly, while participants made a higher proportion of gaze-following errors (i.e. errors in the direction of the distracting gaze) in the incongruent conditions when the distracter's gaze shift preceded the instruction onset, indicating automatic gaze following, they never followed the distracting gaze when it was directed towards an empty location or a stimulus that was never the target. Taken together, these findings suggest that gaze following is likely to be a product of both automatic and goal-driven orienting mechanisms.

  20. Identification of Emotional Facial Expressions: Effects of Expression, Intensity, and Sex on Eye Gaze.

    PubMed

    Wells, Laura Jean; Gillespie, Steven Mark; Rotshtein, Pia

    2016-01-01

The identification of emotional expressions is vital for social interaction, and can be affected by various factors, including the expressed emotion, the intensity of the expression, the sex of the face, and the gender of the observer. This study investigates how these factors affect the speed and accuracy of expression recognition, as well as dwell time on the two most significant areas of the face: the eyes and the mouth. Participants were asked to identify expressions from female and male faces displaying six expressions (anger, disgust, fear, happiness, sadness, and surprise), each with three levels of intensity (low, moderate, and normal). Overall, responses were fastest and most accurate for happy expressions, but slowest and least accurate for fearful expressions. More intense expressions were also classified more accurately. Reaction time showed a different pattern, with the slowest response times recorded for expressions of moderate intensity. Overall, responses were slowest, but also most accurate, for female faces. Relative to male observers, women showed greater accuracy and speed when recognizing female expressions. Dwell time analyses revealed that attention to the eyes was about three times greater than to the mouth, with fearful eyes in particular attracting longer dwell times. The mouth region was attended to most for fearful, angry, and disgusted expressions and least for surprise. These results extend previous findings by showing important effects of expression, emotion intensity, and sex on expression recognition and gaze behaviour, and may have implications for understanding the ways in which emotion recognition abilities break down.

  2. Infant Eyes: A Window on Cognitive Development

    ERIC Educational Resources Information Center

    Aslin, Richard N.

    2012-01-01

    Eye-trackers suitable for use with infants are now marketed by several commercial vendors. As eye-trackers become more prevalent in infancy research, there is the potential for users to be unaware of dangers lurking "under the hood" if they assume the eye-tracker introduces no errors in measuring infants' gaze. Moreover, the influx of voluminous…

  3. Differences in gaze behaviour of expert and junior surgeons performing open inguinal hernia repair.

    PubMed

    Tien, Tony; Pucher, Philip H; Sodergren, Mikael H; Sriskandarajah, Kumuthan; Yang, Guang-Zhong; Darzi, Ara

    2015-02-01

Various fields have used gaze behaviour to evaluate task proficiency. This may also apply to surgery for the assessment of technical skill, but has not previously been explored in live surgery. The aim was to assess differences in gaze behaviour between expert and junior surgeons during open inguinal hernia repair. Gaze behaviour of expert and junior surgeons (defined by operative experience) performing the operation was recorded using eye-tracking glasses (SMI Eye Tracking Glasses 2.0, SensoMotoric Instruments, Germany). Primary endpoints were fixation frequency (steady eye gaze rate) and dwell time (duration of fixations and saccades), analysed for designated areas of interest in the subject's visual field. Secondary endpoints were maximum pupil size, pupil rate of change (frequency of change in pupil size) and pupil entropy (predictability of pupil change). The NASA TLX scale measured perceived workload. Recorded metrics were compared between groups for the entire procedure and for comparable procedural segments. Twenty-five cases were recorded and 13 operations from 9 surgeons were analysed, giving 630 min of data recorded at 30 Hz. Experts demonstrated higher fixation frequency (median [IQR] 1.86 [0.3] vs 0.96 [0.3]; P = 0.006) and dwell time on the operative site during application of mesh (792 [159] vs 469 [109] s; P = 0.028) and closure of the external oblique (1.79 [0.2] vs 1.20 [0.6]; P = 0.003) (625 [154] vs 448 [147] s; P = 0.032), and dwelled more on the sterile field during cutting of mesh (716 [173] vs 268 [297] s; P = 0.019). NASA TLX scores indicated experts found the procedure less mentally demanding than juniors (3 [2] vs 12 [5.2]; P = 0.038). No subjects reported problems with wearing the device or obstruction of view. Use of portable eye-tracking technology in open surgery is feasible without impinging on surgical performance. Differences in gaze behaviour during open inguinal hernia repair can be seen between expert and junior surgeons and may have uses for the assessment of surgical skill.
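The primary endpoints above, fixation frequency and dwell time per area of interest (AOI), can be illustrated with a minimal sketch. The input format (a list of per-fixation AOI labels and durations) is an assumption, not the SMI toolchain output used in the study, and dwell time is simplified here to summed fixation durations, whereas the study's definition also includes saccades:

```python
from collections import defaultdict

def aoi_metrics(fixations, recording_seconds):
    """Per-AOI fixation frequency (fixations/s) and dwell time (s).

    fixations: list of (aoi_label, duration_seconds) tuples, one per fixation.
    recording_seconds: total duration of the analysed segment.
    """
    counts = defaultdict(int)
    dwell = defaultdict(float)
    for aoi, duration in fixations:
        counts[aoi] += 1          # one more fixation on this AOI
        dwell[aoi] += duration    # accumulate time spent on this AOI
    return {
        aoi: {"fixation_frequency": counts[aoi] / recording_seconds,
              "dwell_time": dwell[aoi]}
        for aoi in counts
    }
```

Metrics of this shape could then be compared between expert and junior groups per procedural segment.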

  4. Judgments at Gaze Value: Gaze Cuing in Banner Advertisements, Its Effect on Attention Allocation and Product Judgments.

    PubMed

    Palcu, Johanna; Sudkamp, Jennifer; Florack, Arnd

    2017-01-01

Banner advertising is a popular means of promoting products and brands online. Although banner advertisements are often designed to be particularly attention grabbing, they frequently go unnoticed. Applying an eye-tracking procedure, the present research aimed (a) to determine whether presenting human faces (static or animated) in banner advertisements is an adequate tool for capturing consumers' attention and thus overcoming the frequently observed phenomenon of banner blindness, (b) to examine whether the gaze of a featured face possesses the ability to direct consumers' attention toward specific elements (i.e., the product) in an advertisement, and (c) to establish whether the gaze direction of an advertised face influences consumers' subsequent evaluation of the advertised product. We recorded participants' eye gaze while they viewed a fictional online shopping page displaying banner advertisements that featured either no human face or a human face that was either static or animated and involved different gaze directions (toward or away from the advertised product). Moreover, we asked participants to subsequently evaluate a set of products, one of which was the product previously featured in the banner advertisement. Results showed that, when advertisements included a human face, participants' attention was more attracted by, and they looked longer at, animated compared with static banner advertisements. Moreover, when a face gazed toward the product region, participants' likelihood of looking at the advertised product increased regardless of whether the face was animated or not. Most important, gaze direction influenced subsequent product evaluations; that is, consumers indicated a higher intention to buy a product when it was previously presented in a banner advertisement that featured a face that gazed toward the product. The results suggest that while animation in banner advertising constitutes a salient feature that captures consumers' visual attention, gaze cuing can be an effective tool for driving viewers' attention toward specific elements in the advertisement and even shaping consumers' intentions to purchase the advertised product.

  6. I Reach Faster When I See You Look: Gaze Effects in Human–Human and Human–Robot Face-to-Face Cooperation

    PubMed Central

    Boucher, Jean-David; Pattacini, Ugo; Lelong, Amelie; Bailly, Gerard; Elisei, Frederic; Fagel, Sascha; Dominey, Peter Ford; Ventre-Dominey, Jocelyne

    2012-01-01

    Human–human interaction in natural environments relies on a variety of perceptual cues. Humanoid robots are becoming increasingly refined in their sensorimotor capabilities, and thus should now be able to manipulate and exploit these social cues in cooperation with their human partners. Previous studies have demonstrated that people follow human and robot gaze, and that it can help them to cope with spatially ambiguous language. Our goal is to extend these findings into the domain of action, to determine how human and robot gaze can influence the speed and accuracy of human action. We report on results from a human–human cooperation experiment demonstrating that an agent’s vision of her/his partner’s gaze can significantly improve that agent’s performance in a cooperative task. We then implement a heuristic capability to generate such gaze cues by a humanoid robot that engages in the same cooperative interaction. The subsequent human–robot experiments demonstrate that a human agent can indeed exploit the predictive gaze of their robot partner in a cooperative task. This allows us to render the humanoid robot more human-like in its ability to communicate with humans. The long term objectives of the work are thus to identify social cooperation cues, and to validate their pertinence through implementation in a cooperative robot. The current research provides the robot with the capability to produce appropriate speech and gaze cues in the context of human–robot cooperation tasks. Gaze is manipulated in three conditions: Full gaze (coordinated eye and head), eyes hidden with sunglasses, and head fixed. We demonstrate the pertinence of these cues in terms of statistical measures of action times for humans in the context of a cooperative task, as gaze significantly facilitates cooperation as measured by human response times. PMID:22563315

  7. Eye Movement in Response to Single and Multiple Targets

    DTIC Science & Technology

    1985-02-01

pursuit control system. METHOD The SVFB technique was described in detail elsewhere (Zeevi et al., 1979). Displaying, to the subject, the point of gaze, in... The subject was presented with his point of gaze using the unconditioned SVFB signal (gain = 1, eccentric bias = 0). The SVFB signal was locked on the... superimposing the SVFB on the target, is gazing away from it and thus achieves eccentric fixation (Zeevi et al., 1979). As the subject moves from one

  8. The anatomy and physiology of the ocular motor system.

    PubMed

    Horn, Anja K E; Leigh, R John

    2011-01-01

    Accurate diagnosis of abnormal eye movements depends upon knowledge of the purpose, properties, and neural substrate of distinct functional classes of eye movement. Here, we summarize current concepts of the anatomy of eye movement control. Our approach is bottom-up, starting with the extraocular muscles and their innervation by the cranial nerves. Second, we summarize the neural circuits in the pons underlying horizontal gaze control, and the midbrain connections that coordinate vertical and torsional movements. Third, the role of the cerebellum in governing and optimizing eye movements is presented. Fourth, each area of cerebral cortex contributing to eye movements is discussed. Last, descending projections from cerebral cortex, including basal ganglionic circuits that govern different components of gaze, and the superior colliculus, are summarized. At each stage of this review, the anatomical scheme is used to predict the effects of lesions on the control of eye movements, providing clinical-anatomical correlation. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Boldness psychopathic traits predict reduced gaze toward fearful eyes in men with a history of violence.

    PubMed

    Gillespie, Steven M; Rotshtein, Pia; Beech, Anthony R; Mitchell, Ian J

    2017-09-01

Research with developmental and adult samples has shown a relationship between psychopathic traits and reduced eye gaze. However, these relationships had not previously been investigated in forensic samples. Here we examined the eye movements of male violent offenders during an emotion recognition task. Violent offenders performed similarly to non-offending controls, and their eye movements varied with the emotion and intensity of the facial expression. In the violent offender group, Boldness psychopathic traits, but not Meanness or Disinhibition, were associated with reduced dwell time and fixation counts, and slower first fixation latencies, on the eyes compared with the mouth. These results are the first to show a relationship between psychopathic traits and reduced attention to the eyes in a forensic sample, and suggest that Boldness is associated with difficulties in orienting attention toward emotionally salient aspects of the face. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  10. Out of sight, out of mind: racial retrieval cues increase the accessibility of social justice concepts.

    PubMed

    Salter, Phia S; Kelley, Nicholas J; Molina, Ludwin E; Thai, Luyen T

    2017-09-01

    Photographs provide critical retrieval cues for personal remembering, but few studies have considered this phenomenon at the collective level. In this research, we examined the psychological consequences of visual attention to the presence (or absence) of racially charged retrieval cues within American racial segregation photographs. We hypothesised that attention to racial retrieval cues embedded in historical photographs would increase social justice concept accessibility. In Study 1, we recorded gaze patterns with an eye-tracker among participants viewing images that contained racial retrieval cues or were digitally manipulated to remove them. In Study 2, we manipulated participants' gaze behaviour by either directing visual attention toward racial retrieval cues, away from racial retrieval cues, or directing attention within photographs where racial retrieval cues were missing. Across Studies 1 and 2, visual attention to racial retrieval cues in photographs documenting historical segregation predicted social justice concept accessibility.

  11. Individual differences in children's pronoun processing during reading: Detection of incongruence is associated with higher reading fluency and more regressions.

    PubMed

    Eilers, Sarah; Tiffin-Richards, Simon P; Schroeder, Sascha

    2018-05-10

In two eye-tracking experiments, we tested fourth graders' and adults' sensitivity to gender feature mismatches during reading of pronouns, and their susceptibility to interference from feature-matching entities in the sentence. In Experiment 1, we showed children and adults two-phrase sentences such as "Leon{m}/Lisa{f} shooed away the sparrow{m}/the seagull{f} and then he{m} ate the tasty sandwich." Eye-tracking measures showed no qualitative differences between children's and adults' processing of the pronouns. Both age groups showed longer gaze durations on subject-mismatching than on matching pronouns, and there was no evidence of interference from a gender-matching object. Strikingly, in contrast to the adults, not all fourth graders reported detection of the subject gender mismatch. In Experiment 2, we replicated the earlier results with a larger sample of children (N = 75) and found that only half of the fourth graders detected the gender mismatch during reading. The detectors' reading pattern at the pronoun differed from that of the non-detectors. Children who reported detection of the mismatch showed a reading pattern more similar to the adults'. Children who did not report detection of the mismatch had comparatively longer gaze durations and were less likely to make regressions directly at the pronoun. We conclude that children who read more fluently use their available processing resources to immediately repair grammatical inconsistencies encountered in a text. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. How preview space/time translates into preview cost/benefit for fixation durations during reading.

    PubMed

    Kliegl, Reinhold; Hohenstein, Sven; Yan, Ming; McDonald, Scott A

    2013-01-01

Eye-movement control during reading depends on foveal and parafoveal information. If the parafoveal preview of the next word is suppressed, reading is less efficient. A linear mixed model (LMM) reanalysis of McDonald (2006) confirmed his observation that preview benefit may be limited to parafoveal words that have been selected as the saccade target. Going beyond the original analyses, in the same LMM, we examined how the preview effect (i.e., the difference in single-fixation duration, SFD, between random-letter and identical preview) depends on the gaze duration on the pretarget word and on the amplitude of the saccade moving the eye onto the target word. There were two key results: (a) The shorter the saccade amplitude (i.e., the larger the preview space), the shorter a subsequent SFD with an identical preview; this association was not observed with a random-letter preview. (b) However, the longer the gaze duration on the pretarget word, the longer the subsequent SFD on the target, with the difference between random-letter string and identical previews increasing with preview time. A third pattern, increasing cost of a random-letter string in the parafovea associated with shorter saccade amplitudes, was observed for target gaze durations. Thus, LMMs revealed that preview effects, which are typically summarized under "preview benefit", are a complex mixture of preview cost and preview benefit and vary with preview space and preview time. The consequence for reading is that parafoveal preview may not only facilitate, but also interfere with, lexical access.
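The preview effect defined above, the SFD difference between random-letter and identical previews, can be sketched as a simple cell-means difference. This is an illustration with a hypothetical trial format, not the linear mixed model used in the reanalysis, which additionally conditioned on pretarget gaze duration and saccade amplitude:

```python
def preview_effect(trials):
    """Mean single-fixation duration (SFD) difference, in ms:
    random-letter preview minus identical preview (positive = preview benefit).

    trials: list of (preview_condition, sfd_ms) pairs, where
    preview_condition is 'identical' or 'random'.
    """
    by_condition = {"identical": [], "random": []}
    for condition, sfd in trials:
        by_condition[condition].append(sfd)
    mean = lambda xs: sum(xs) / len(xs)
    return mean(by_condition["random"]) - mean(by_condition["identical"])
```

An LMM would estimate this same contrast while accounting for subject and item variance and the continuous covariates.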

  13. 4-aminopyridine restores vertical and horizontal neural integrator function in downbeat nystagmus.

    PubMed

    Kalla, Roger; Glasauer, Stefan; Büttner, Ulrich; Brandt, Thomas; Strupp, Michael

    2007-09-01

Downbeat nystagmus (DBN), the most common form of acquired fixation nystagmus, is often caused by cerebellar degeneration, especially if the vestibulo-cerebellum is involved. The upward ocular drift in DBN has a spontaneous and a vertical gaze-evoked component. Since cerebellar involvement is suspected to be the underlying pathomechanism of DBN, we tested in 15 patients with DBN whether the application of the potassium-channel blocker 4-aminopyridine (4-AP), which increases the excitability of cerebellar Purkinje cells as shown in animal experiments, reduces the vertical ocular drift leading to nystagmus. Fifteen age-matched healthy subjects served as the control group. 4-AP may affect spontaneous drift or gaze-evoked drift by either enhancing visual fixation ability or restoring vision-independent gaze holding. We therefore recorded 3D slow-phase eye movements using search coils during attempted fixation in nine different eye positions, with or without a continuously visible target, before and 45 min after ingestion of 10 mg 4-AP. Since the effect of 4-AP may depend on the associated etiology, we divided our patients into three groups (cerebellar atrophy, n = 4; idiopathic DBN, n = 5; other etiology, n = 6). 4-AP decreased DBN during gaze straight ahead in 12 of 15 patients. Statistical analysis showed that improvement occurred predominantly in patients with cerebellar atrophy, in whom the drift was reduced from -4.99 +/- 1.07 deg/s (mean +/- SE) before treatment to -0.60 +/- 0.82 deg/s afterwards. Regression analysis of slow-phase velocity (SPV) in different eye positions revealed that vertical and horizontal gaze-evoked drift was significantly reduced independently of patient group and, on average, restored perfect gaze holding. Since the observed improvements were independent of target visibility, 4-AP improved fixation by restoring gaze-holding ability. All in all, the present study demonstrates that 4-AP has a differential effect on DBN: drift with gaze straight ahead was predominantly reduced in patients with cerebellar atrophy, but less so in the remaining patients; 4-AP on average improved neural integrator function, i.e. gaze-evoked drift, regardless of etiology. Our results thus show that 4-AP was a successful treatment option in the majority of DBN patients, possibly by increasing Purkinje cell excitability in the cerebellar flocculi. It may work best when DBN is associated with cerebellar atrophy. Furthermore, 4-AP may be a promising treatment option for patients with a dominant gaze-evoked component of nystagmus, regardless of its etiology.
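The regression step described above, fitting slow-phase velocity against eye position, separates the gaze-evoked component (the slope) from the spontaneous component (the intercept, drift at gaze straight ahead). A minimal least-squares sketch with hypothetical values, not the authors' analysis code:

```python
def drift_components(eye_pos_deg, spv_deg_s):
    """Ordinary least-squares fit of slow-phase velocity (SPV) on eye position.

    Returns (slope, intercept):
      slope     ~ gaze-evoked drift coefficient (1/s), related to the leak of
                  the neural integrator (0 means perfect gaze holding);
      intercept ~ spontaneous drift at gaze straight ahead (deg/s).
    """
    n = len(eye_pos_deg)
    mean_x = sum(eye_pos_deg) / n
    mean_y = sum(spv_deg_s) / n
    sxx = sum((x - mean_x) ** 2 for x in eye_pos_deg)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(eye_pos_deg, spv_deg_s))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x
```

Comparing slope and intercept before and after drug ingestion would distinguish an effect on gaze-evoked drift from one on spontaneous drift.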

  14. Audience gaze while appreciating a multipart musical performance.

    PubMed

    Kawase, Satoshi; Obata, Satoshi

    2016-11-01

Visual information has been observed to be crucial for audience members during musical performances. The present study used an eye tracker to investigate audience members' gazes while appreciating an audiovisual musical ensemble performance, based on evidence that the melody part dominates auditory attention when listening to multipart music containing different melody lines, and on the joint-attention theory of gaze. We presented singing performances by a female duo. The main findings were as follows: (1) the melody part (soprano) attracted more visual attention than the accompaniment part (alto) throughout the piece; (2) joint attention emerged when the singers shifted their gazes toward their co-performer, suggesting that inter-performer gazing interactions playing a spotlight role mediated performer-audience visual interaction; and (3) musical part (melody or accompaniment) strongly influenced the total duration of audience gazes, while the spotlight effect of gaze was limited to just after the singers' gaze shifts. Copyright © 2016. Published by Elsevier Inc.

  15. Reliability and Validity of Gaze-Dependent Functional Vision Space: A Novel Metric Quantifying Visual Function in Infantile Nystagmus Syndrome.

    PubMed

    Roberts, Tawna L; Kester, Kristi N; Hertle, Richard W

    2018-04-01

This study presents the test-retest reliability of optotype visual acuity (OVA) across 60° of horizontal gaze position in patients with infantile nystagmus syndrome (INS), and demonstrates the validity of the metric gaze-dependent functional vision space (GDFVS) in patients with INS. In experiment 1, OVA was measured twice in seven horizontal gaze positions, from 30° left to 30° right in 10° steps, in 20 subjects with INS and 14 without INS. Test-retest reliability was assessed using the intraclass correlation coefficient (ICC) in each gaze position. The OVA area under the curve (AUC) was calculated with horizontal eye position on the x-axis and logMAR visual acuity on the y-axis, and then converted to GDFVS. In experiment 2, the validity of GDFVS was determined over 40° of horizontal gaze by applying the 95% limits of agreement from experiment 1 to pre- and post-treatment GDFVS values from 85 patients with INS. In experiment 1, test-retest reliability for OVA was high (ICC ≥ 0.88), as the difference between test and retest was on average less than 0.1 logMAR in each gaze position. In experiment 2, as a group, INS subjects had a significant increase (P < 0.001) in the size of their GDFVS that exceeded the 95% limits of agreement found during test-retest. OVA is a reliable measure in INS patients across 60° of horizontal gaze position, and GDFVS is a valid clinical method for quantifying OVA as a function of eye position in INS patients. This method captures the dynamic nature of OVA in INS patients and may be a valuable measure for quantifying visual function in patients with INS, particularly for quantifying change as part of clinical studies.
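The AUC step described above (logMAR acuity on the y-axis against horizontal gaze position on the x-axis) can be sketched with the trapezoid rule. The function name and input format are illustrative, and the abstract does not specify how AUC is converted to GDFVS, so this sketch stops at the AUC:

```python
def acuity_auc(gaze_deg, logmar):
    """Trapezoid-rule area under the acuity curve.

    gaze_deg: horizontal gaze positions in degrees, sorted ascending
              (e.g. -30 to +30 in 10-degree steps).
    logmar:   logMAR visual acuity measured at each gaze position.
    """
    area = 0.0
    points = list(zip(gaze_deg, logmar))
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0  # trapezoid between adjacent gazes
    return area
```

Because lower logMAR means better acuity, a smaller area over a fixed gaze range corresponds to better gaze-dependent visual function.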

  16. Visual attention in mixed-gender groups

    PubMed Central

    Amon, Mary Jean

    2015-01-01

    A basic principle of objectification theory is that a mere glance from a stranger represents the potential to be sexualized, triggering women to take on the perspective of others and become vigilant to their appearance. However, research has yet to document gendered gaze patterns in social groups. The present study examined visual attention in groups of varying gender composition to understand how gender and minority status influence gaze behavior. One hundred undergraduates enrolled in psychology courses were photographed, and an additional 76 participants viewed groupings of these photographs while their point of gaze was recorded using a remote eye-tracking device. Participants were not told that their gaze was being recorded. Women were viewed more frequently and for longer periods of time than men in mixed-gender groups. Women were also more likely to be looked at first and last by observers. Men spent more time attending to pictures of women when fewer women were in the group. The opposite effect was found for pictures of men, such that male pictures were viewed less when fewer pictures of men were in the group. Female observers spent more time looking at men compared to male observers, and male observers spent more time looking at women than female observers, though both female and male observers looked at women more than men overall. Consistent with objectification theory, women's appearance garners more attention and interest in mixed-gender social groups. PMID:25628589

  17. [Webino syndrome caused by meningovascular syphilis. A rare entity with an unexpected cause].

    PubMed

    Rodríguez Calvo de Mora, M; Rodríguez Moreno, G; España Contreras, M

    2014-05-01

The patient is a 57-year-old obese and hypertensive male. His chief complaints were double vision and dizziness. Ocular motility examination showed mild exodeviation of both eyes in the primary gaze position, more predominant in the left eye. The exotropia was noticeably more evident on attempted upgaze. On horizontal gaze, the abducting eye deviated fully, but the adducting eye did not cross the midline. Nystagmus in the abducting eye and convergence impairment were found. Pupil size and pupillary testing were normal. Ataxia and areflexia were also present. Bilateral internuclear ophthalmoplegia was suspected, and imaging and laboratory tests were performed. The CAT scan showed a right occipital hypo-attenuated lesion. On the MRI scan, a mesencephalic subacute ischemic lesion was found, involving the medial rectus sub-nuclei. Blood and cerebrospinal fluid tests for syphilis were positive. Bilateral internuclear ophthalmoplegia is a very uncommon, and difficult to diagnose, condition. In the reported case the lesion involved the medial rectus sub-nuclei. This could explain the exotropia in the primary gaze position, and supports that it is not possible to exclude involvement of the medial rectus sub-nuclei in webino syndrome. The rapid identification of the pathology contributed to the better prognosis of the patient. Copyright © 2012 Sociedad Española de Oftalmología. Published by Elsevier Espana. All rights reserved.

  18. Transition from Target to Gaze Coding in Primate Frontal Eye Field during Memory Delay and Memory-Motor Transformation.

    PubMed

    Sajad, Amirsaman; Sadeh, Morteza; Yan, Xiaogang; Wang, Hongying; Crawford, John Douglas

    2016-01-01

    The frontal eye fields (FEFs) participate in both working memory and sensorimotor transformations for saccades, but their role in integrating these functions through time remains unclear. Here, we tracked FEF spatial codes through time using a novel analytic method applied to the classic memory-delay saccade task. Three-dimensional recordings of head-unrestrained gaze shifts were made in two monkeys trained to make gaze shifts toward briefly flashed targets after a variable delay (450-1500 ms). A preliminary analysis of visual and motor response fields in 74 FEF neurons eliminated most potential models for spatial coding at the neuron population level, as in our previous study (Sajad et al., 2015). We then focused on the spatiotemporal transition from an eye-centered target code (T; preferred in the visual response) to an eye-centered intended gaze position code (G; preferred in the movement response) during the memory delay interval. We treated neural population codes as a continuous spatiotemporal variable by dividing the space spanning T and G into intermediate T-G models and dividing the task into discrete steps through time. We found that FEF delay activity, especially in visuomovement cells, progressively transitions from T through intermediate T-G codes that approach, but do not reach, G. This was followed by a final discrete transition from these intermediate T-G delay codes to a "pure" G code in movement cells without delay activity. These results demonstrate that FEF activity undergoes a series of sensory-memory-motor transformations, including a dynamically evolving spatial memory signal and an imperfect memory-to-motor transformation.

  19. Loneliness and Hypervigilance to Social Cues in Females: An Eye-Tracking Study

    PubMed Central

    Lodder, Gerine M. A.; Scholte, Ron H. J.; Clemens, Ivar A. H.; Engels, Rutger C. M. E.; Goossens, Luc; Verhagen, Maaike

    2015-01-01

    The goal of the present study was to examine whether lonely individuals differ from nonlonely individuals in their overt visual attention to social cues. Previous studies showed that loneliness was related to biased post-attentive processing of social cues (e.g., negative interpretation bias), but research on whether lonely and nonlonely individuals also show differences in an earlier information processing stage (gazing behavior) is very limited. A sample of 25 lonely and 25 nonlonely students took part in an eye-tracking study consisting of four tasks. We measured gazing (duration, number of fixations and first fixation) at the eyes, nose and mouth region of faces expressing emotions (Task 1), at emotion quadrants (anger, fear, happiness and neutral expression) (Task 2), at quadrants with positive and negative social and nonsocial images (Task 3), and at the facial area of actors in video clips with positive and negative content (Task 4). In general, participants tended to gaze most often and longest at areas that conveyed most social information, such as the eye region of the face (T1), and social images (T3). Participants gazed most often and longest at happy faces (T2) in still images, and more often and longer at the facial area in negative than in positive video clips (T4). No differences occurred between lonely and nonlonely participants in their gazing times and frequencies, nor in their first fixations at social cues in the four different tasks. Based on this study, we found no evidence that overt visual attention to social cues differs between lonely and nonlonely individuals. This implies that biases in social information processing of lonely individuals may be limited to other phases of social information processing. Alternatively, biased overt attention to social cues may only occur under specific conditions, for specific stimuli or for specific lonely individuals. PMID:25915656
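    The gaze measures used in these tasks (dwell duration, number of fixations, and first fixation within an area of interest) can be computed from a list of detected fixations. A minimal sketch with hypothetical screen coordinates and a rectangular AOI; the field names and example values are illustrative, not the authors' pipeline:

    ```python
    from dataclasses import dataclass

    @dataclass
    class Fixation:
        x: float            # screen coordinates (px)
        y: float
        onset_ms: float     # fixation onset relative to trial start
        duration_ms: float

    def aoi_metrics(fixations, aoi):
        """Dwell time, fixation count, and whether the trial's first fixation
        landed inside a rectangular area of interest (x0, y0, x1, y1)."""
        x0, y0, x1, y1 = aoi
        inside = [f for f in fixations if x0 <= f.x <= x1 and y0 <= f.y <= y1]
        first = min(fixations, key=lambda f: f.onset_ms) if fixations else None
        return {
            "dwell_ms": sum(f.duration_ms for f in inside),
            "n_fixations": len(inside),
            "first_fixation_in_aoi": bool(first and first in inside),
        }

    # e.g. the eye region of a face stimulus (hypothetical coordinates)
    fixes = [Fixation(300, 200, 0, 180), Fixation(310, 210, 200, 220),
             Fixation(500, 600, 450, 150)]
    print(aoi_metrics(fixes, (250, 150, 400, 300)))
    ```

    Group comparisons like those in the study would then aggregate these per-trial metrics per participant before testing lonely versus nonlonely groups.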

  20. A 6-year-old girl with restricted upward gaze of her right eye.

    PubMed

    Tuli, Sanjeev; Tuli, Sonal

    2012-08-01

    Brown syndrome is an incomitant strabismus syndrome characterized by inability of the eye to elevate during adduction.
    • Primary Brown syndrome is thought to occur due to the inability of the superior oblique tendon to stretch. However, there are many secondary causes of this condition that must be ruled out.
    • Despite significant misalignment of the eyes during upgaze, patients with Brown syndrome usually do not have decreased vision or diplopia in primary gaze.
    • Unlike paralytic strabismus, forced duction tests demonstrate restriction, and a Parks three-step test does not demonstrate a paralytic muscle.
    • Spontaneous resolution is frequent, and surgical management typically is not indicated because of the high incidence of postoperative symptomatic superior oblique palsy.
