Sample records for eye movement-based interaction

  1. Effects of saccadic bilateral eye movements on episodic and semantic autobiographical memory fluency.

    PubMed

    Parker, Andrew; Parkin, Adam; Dagnall, Neil

    2013-01-01

    Performing a sequence of fast saccadic horizontal eye movements has been shown to facilitate performance on a range of cognitive tasks, including the retrieval of episodic memories. One explanation for these effects is based on the hypothesis that saccadic eye movements increase hemispheric interaction, and that such interactions are important for particular types of memory. The aim of the current research was to assess the effect of horizontal saccadic eye movements on the retrieval of both episodic autobiographical memory (event/incident based memory) and semantic autobiographical memory (fact based memory) over recent and more distant time periods. It was found that saccadic eye movements facilitated the retrieval of episodic autobiographical memories (over all time periods) but not semantic autobiographical memories. In addition, eye movements did not enhance the retrieval of non-autobiographical semantic memory. This finding illustrates a dissociation between the episodic and semantic characteristics of personal memory and is considered within the context of hemispheric contributions to episodic memory performance.

  2. Effects of Saccadic Bilateral Eye Movements on Episodic and Semantic Autobiographical Memory Fluency

    PubMed Central

    Parker, Andrew; Parkin, Adam; Dagnall, Neil

    2013-01-01

    Performing a sequence of fast saccadic horizontal eye movements has been shown to facilitate performance on a range of cognitive tasks, including the retrieval of episodic memories. One explanation for these effects is based on the hypothesis that saccadic eye movements increase hemispheric interaction, and that such interactions are important for particular types of memory. The aim of the current research was to assess the effect of horizontal saccadic eye movements on the retrieval of both episodic autobiographical memory (event/incident based memory) and semantic autobiographical memory (fact based memory) over recent and more distant time periods. It was found that saccadic eye movements facilitated the retrieval of episodic autobiographical memories (over all time periods) but not semantic autobiographical memories. In addition, eye movements did not enhance the retrieval of non-autobiographical semantic memory. This finding illustrates a dissociation between the episodic and semantic characteristics of personal memory and is considered within the context of hemispheric contributions to episodic memory performance. PMID:24133435

  3. Implicit prosody mining based on the human eye image capture technology

    NASA Astrophysics Data System (ADS)

    Gao, Pei-pei; Liu, Feng

    2013-08-01

    Eye tracking has become one of the main methods for analyzing recognition issues in human-computer interaction, and capturing images of the human eye is the key problem in eye tracking. Building on this, a new human-computer interaction method is introduced to enrich the forms of speech synthesis. We propose a method of implicit prosody mining based on human eye image capture: parameters are extracted from images of the eyes during reading to control and drive prosody generation in speech synthesis and to establish a prosodic model with high simulation accuracy. The duration model is a key issue for prosody generation. For this model, the paper puts forward a new idea for obtaining gaze duration during reading from captured eye images and for synchronously controlling this duration and the pronunciation duration in speech synthesis. Eye movement during reading is a complex, multi-factor interactive process involving fixations, saccades, and regressions; therefore, the appropriate information must be extracted from the eye images, and the regularities of gaze must be obtained as references for modeling. Based on an analysis of three current eye movement control models and the characteristics of implicit prosody in reading, the relative independence between the text speech-processing system and the eye movement control system is discussed. It is shown that, under the same text-familiarity condition, gaze duration during reading and the duration of inner-voice pronunciation are synchronous. An eye gaze duration model based on the levels of Chinese prosodic structure is presented to replace previous machine-learning and probability-forecasting methods, to capture readers' real internal reading rhythm, and to synthesize speech with a personalized rhythm. This research enriches the forms of human-computer interaction and has practical significance and application prospects for assisted speech interaction for people with disabilities. Experiments show that implicit prosody mining based on human eye image capture gives the synthesized speech more flexible expression.
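
    The duration-synchronization idea above can be illustrated with a small sketch. Assuming word-level gaze durations have already been extracted from the captured eye images, the snippet below rescales the baseline phone durations of each word so that its synthesized duration matches the measured gaze duration; the proportional-scaling rule and all numbers are illustrative assumptions, not the duration model described in the record.

    ```python
    # Minimal sketch: stretch or compress the baseline phone durations of a word
    # so that its total synthesized duration matches the reader's gaze duration.
    # The values and the proportional-scaling rule are illustrative assumptions.

    def scale_word_durations(baseline_ms, gaze_ms):
        """baseline_ms: per-phone durations for one word (ms).
        gaze_ms: total gaze duration measured on that word (ms).
        Returns per-phone durations rescaled to sum to gaze_ms."""
        total = sum(baseline_ms)
        return [d * gaze_ms / total for d in baseline_ms]

    # A word whose default synthesis lasts 300 ms but was fixated for 420 ms.
    print(scale_word_durations([80, 120, 100], 420))  # [112.0, 168.0, 140.0]
    ```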

  4. Quantifying gaze and mouse interactions on spatial visual interfaces with a new movement analytics methodology

    PubMed Central

    2017-01-01

    Eye movements provide insights into what people pay attention to, and therefore are commonly included in a variety of human-computer interaction studies. Eye movement recording devices (eye trackers) produce gaze trajectories, that is, sequences of gaze location on the screen. Despite recent technological developments that enabled more affordable hardware, gaze data are still costly and time-consuming to collect; therefore, some propose using mouse movements instead. These are easy to collect automatically and on a large scale. If and how these two movement types are linked, however, is less clear and highly debated. We address this problem in two ways. First, we introduce a new movement analytics methodology to quantify the level of dynamic interaction between the gaze and the mouse pointer on the screen. Our method uses a volumetric representation of movement, the space-time densities, which allows us to calculate interaction levels between two physically different types of movement. We describe the method and compare the results with existing dynamic interaction methods from movement ecology. The sensitivity to method parameters is evaluated on simulated trajectories where we can control interaction levels. Second, we perform an experiment with eye and mouse tracking to generate real data with real levels of interaction, to apply and test our new methodology on a real case. Further, as our experimental task mimics route-tracing on a map, it is more than a data collection exercise: it simultaneously allows us to investigate the actual connection between the eye and the mouse. We find that there seems to be a natural coupling when the eyes are not under conscious control, but that this coupling breaks down when participants are instructed to move them intentionally. Based on these observations, we tentatively suggest that for natural tracing tasks, mouse tracking could potentially provide similar information to eye tracking and therefore be used as a proxy for attention. However, more research is needed to confirm this. PMID:28777822

  5. Quantifying gaze and mouse interactions on spatial visual interfaces with a new movement analytics methodology.

    PubMed

    Demšar, Urška; Çöltekin, Arzu

    2017-01-01

    Eye movements provide insights into what people pay attention to, and therefore are commonly included in a variety of human-computer interaction studies. Eye movement recording devices (eye trackers) produce gaze trajectories, that is, sequences of gaze location on the screen. Despite recent technological developments that enabled more affordable hardware, gaze data are still costly and time-consuming to collect; therefore, some propose using mouse movements instead. These are easy to collect automatically and on a large scale. If and how these two movement types are linked, however, is less clear and highly debated. We address this problem in two ways. First, we introduce a new movement analytics methodology to quantify the level of dynamic interaction between the gaze and the mouse pointer on the screen. Our method uses a volumetric representation of movement, the space-time densities, which allows us to calculate interaction levels between two physically different types of movement. We describe the method and compare the results with existing dynamic interaction methods from movement ecology. The sensitivity to method parameters is evaluated on simulated trajectories where we can control interaction levels. Second, we perform an experiment with eye and mouse tracking to generate real data with real levels of interaction, to apply and test our new methodology on a real case. Further, as our experimental task mimics route-tracing on a map, it is more than a data collection exercise: it simultaneously allows us to investigate the actual connection between the eye and the mouse. We find that there seems to be a natural coupling when the eyes are not under conscious control, but that this coupling breaks down when participants are instructed to move them intentionally. Based on these observations, we tentatively suggest that for natural tracing tasks, mouse tracking could potentially provide similar information to eye tracking and therefore be used as a proxy for attention. However, more research is needed to confirm this.
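
    As a rough illustration of quantifying gaze-mouse interaction, the sketch below computes a simple proximity-based interaction index over synchronized gaze and mouse trajectories. It is a crude stand-in for the space-time-density method described above, closer in spirit to the movement-ecology measures the authors compare against; the 100-pixel radius and the toy trajectories are assumptions.

    ```python
    import numpy as np

    def proximity_interaction(gaze_xy, mouse_xy, radius=100.0):
        """Fraction of synchronized samples in which gaze and mouse pointer
        lie within `radius` pixels of each other."""
        gaze_xy = np.asarray(gaze_xy, float)
        mouse_xy = np.asarray(mouse_xy, float)
        dist = np.linalg.norm(gaze_xy - mouse_xy, axis=1)
        return float(np.mean(dist <= radius))

    # Two toy trajectories sampled at the same four timestamps (pixels).
    gaze  = [(100, 100), (150, 120), (400, 300), (420, 310)]
    mouse = [(110, 105), (160, 130), (200, 200), (415, 305)]
    print(proximity_interaction(gaze, mouse))  # 0.75
    ```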

  6. Dynamic interactions of eye and head movements when reading with single-vision and progressive lenses in a simulated computer-based environment.

    PubMed

    Han, Ying; Ciuffreda, Kenneth J; Selenow, Arkady; Ali, Steven R

    2003-04-01

    To assess dynamic interactions of eye and head movements during return-sweep saccades (RSS) when reading with single-vision (SVL) versus progressive-addition (PAL) lenses in a simulated computer-based business environment. Horizontal eye and head movements were recorded objectively and simultaneously at a rate of 60 Hz during reading of single-page (SP; 14 degrees horizontal [H]) and double-page (DP; 37 degrees H) formats at 60 cm with binocular viewing. Subjects included 11 individuals with normal presbyopic vision aged 45 to 71 years selected by convenience sampling from a clinic population. Reading was performed with three types of spectacle lenses with a different clear near field of view (FOV): a SVL (60 degrees H clear FOV), a PAL-I with a relatively wide intermediate zone (7.85 mm; 18 degrees H clear FOV), and a PAL-II with a relatively narrow intermediate zone (5.60 mm; 13 degrees H clear FOV). Eye movements were initiated before head movements in the SP condition, and the reverse was found in the DP condition, with all three lens types. Duration of eye movements increased as the zone of clear vision decreased in the SP condition, and they were longer with the PALs than with the SVL in the DP condition. Gaze stabilization occurred later with the PALs than with the SVL in both the SP and DP conditions. The duration of head movements was longer with the PAL-II than with the SVL in both the SP and DP conditions. Eye movement peak velocity was greater with the SVL than the PALs in the DP condition. Eye movement and head movement strategies and timing were contingent on viewing conditions. The longer eye movement duration and gaze-stabilization times suggested that additional eye movements were needed to locate the clear-vision zone and commence reading after the RSS. Head movements with PALs for the SP condition were similarly optically induced. These eye movement and head movement results may contribute to the reduced reading rate and related symptoms reported by some PAL wearers. The dynamic interactions of eye movements and head movements during reading with the PALs appear to be a sensitive indicator of the effect of lens optical design parameters on overall reading performance, because the movements can discriminate between SVL and PAL designs and at times even between PALs.

  7. Visual Data Mining: An Exploratory Approach to Analyzing Temporal Patterns of Eye Movements

    ERIC Educational Resources Information Center

    Yu, Chen; Yurovsky, Daniel; Xu, Tian

    2012-01-01

    Infant eye movements are an important behavioral resource to understand early human development and learning. But the complexity and amount of gaze data recorded from state-of-the-art eye-tracking systems also pose a challenge: how does one make sense of such dense data? Toward this goal, this article describes an interactive approach based on…

  8. Using Eye Movement Desensitization and Reprocessing To Enhance Treatment of Couples.

    ERIC Educational Resources Information Center

    Protinsky, Howard; Sparks, Jennifer; Flemke, Kimberly

    2001-01-01

    Eye Movement Desensitization and Reprocessing (EMDR) as a clinical technique may enhance treatment effectiveness when applied in couple therapy that is emotionally and experientially oriented. Clinical experience indicates EMDR-based interventions are useful for accessing and reprocessing intense emotions in couple interactions. EMDR can amplify…

  9. Using Eye Movement to Control a Computer: A Design for a Lightweight Electro-Oculogram Electrode Array and Computer Interface

    PubMed Central

    Iáñez, Eduardo; Azorin, Jose M.; Perez-Vidal, Carlos

    2013-01-01

    This paper describes a human-computer interface based on electro-oculography (EOG) that allows interaction with a computer using eye movement. The EOG registers the movement of the eye by measuring, through electrodes, the potential difference between the cornea and the retina. A new pair of EOG glasses has been designed to improve the user's comfort and to remove the manual procedure of placing the EOG electrodes around the user's eyes. The interface, which includes the EOG electrodes, uses a new processing algorithm that detects the gaze direction and eye blinks from the EOG signals. The system reliably enabled subjects to control the movement of a dot on a video screen. PMID:23843986
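
    A minimal sketch of how gaze direction and blinks might be read from a two-channel EOG stream using simple amplitude thresholds. The sampling rate and threshold values are assumptions, and this is not the processing algorithm used by the interface described above.

    ```python
    import numpy as np

    FS = 250              # assumed sampling rate (Hz)
    SACC_THRESH = 80.0    # assumed horizontal-EOG step threshold (microvolts)
    BLINK_THRESH = 300.0  # assumed vertical-EOG spike threshold (microvolts)

    def classify_window(h_eog, v_eog):
        """Roughly classify a short EOG window as 'left', 'right', 'blink'
        or 'none' from amplitude steps on the two channels."""
        h, v = np.asarray(h_eog, float), np.asarray(v_eog, float)
        if v.max() - v.min() > BLINK_THRESH:                # large vertical spike
            return "blink"
        step = h[-FS // 10:].mean() - h[:FS // 10].mean()   # end minus start
        if step > SACC_THRESH:
            return "right"
        if step < -SACC_THRESH:
            return "left"
        return "none"

    # Synthetic 1-second window with a 120-microvolt rightward step.
    t = np.arange(FS)
    h = np.where(t < FS // 2, 0.0, 120.0)
    print(classify_window(h, np.zeros(FS)))  # right
    ```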

  10. Removing the Interdependency between Horizontal and Vertical Eye-Movement Components in Electrooculograms

    PubMed Central

    Chang, Won-Du; Cha, Ho-Seung; Im, Chang-Hwan

    2016-01-01

    This paper introduces a method to remove the unwanted interdependency between vertical and horizontal eye-movement components in electrooculograms (EOGs). EOGs have been widely used to estimate eye movements without a camera in a variety of human-computer interaction (HCI) applications using pairs of electrodes generally attached either above and below the eye (vertical EOG) or to the left and right of the eyes (horizontal EOG). It has been well documented that the vertical EOG component has less stability than the horizontal EOG one, making accurate estimation of the vertical location of the eyes difficult. To address this issue, an experiment was designed in which ten subjects participated. Visual inspection of the recorded EOG signals showed that the vertical EOG component is highly influenced by horizontal eye movements, whereas the horizontal EOG is rarely affected by vertical eye movements. Moreover, the results showed that this interdependency could be effectively removed by introducing an individual constant value. It is therefore expected that the proposed method can enhance the overall performance of practical EOG-based eye-tracking systems. PMID:26907271
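
    The individual-constant correction reported above can be sketched directly: estimate a constant c by least squares during a calibration block that contains only horizontal eye movements (an assumed procedure), then subtract c times the horizontal channel from the vertical one.

    ```python
    import numpy as np

    def estimate_crosstalk_constant(h_cal, v_cal):
        """Least-squares constant c such that v ~ c * h during a calibration
        block containing only horizontal eye movements."""
        h, v = np.asarray(h_cal, float), np.asarray(v_cal, float)
        return float(np.dot(h, v) / np.dot(h, h))

    def remove_crosstalk(h, v, c):
        """Subtract the horizontally induced component from the vertical EOG."""
        return np.asarray(v, float) - c * np.asarray(h, float)

    # Synthetic calibration: the vertical channel leaks 20% of the horizontal one.
    h_cal = np.array([0.0, 50.0, 100.0, -50.0, -100.0])
    v_cal = 0.2 * h_cal
    c = estimate_crosstalk_constant(h_cal, v_cal)   # ~0.2
    print(remove_crosstalk(h_cal, v_cal, c))        # ~[0. 0. 0. 0. 0.]
    ```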

  11. Target Selection by the Frontal Cortex during Coordinated Saccadic and Smooth Pursuit Eye Movements

    ERIC Educational Resources Information Center

    Srihasam, Krishna; Bullock, Daniel; Grossberg, Stephen

    2009-01-01

    Oculomotor tracking of moving objects is an important component of visually based cognition and planning. Such tracking is achieved by a combination of saccades and smooth-pursuit eye movements. In particular, the saccadic and smooth-pursuit systems interact to often choose the same target, and to maximize its visibility through time. How do…

  12. Feature Selection in Classification of Eye Movements Using Electrooculography for Activity Recognition

    PubMed Central

    Mala, S.; Latha, K.

    2014-01-01

    Activity recognition is needed in many applications, for example, surveillance systems, patient monitoring, and human-computer interfaces. Feature selection plays an important role in activity recognition, data mining, and machine learning. To select a subset of features, Differential Evolution (DE), an efficient evolutionary optimizer, is used to find informative features from eye movements recorded using electrooculography (EOG). Many researchers use EOG signals in human-computer interaction with various computational intelligence methods to analyze eye movements. The proposed system involves analysis of EOG signals using clearness-based features, minimum-redundancy maximum-relevance features, and Differential Evolution-based features. This work concentrates on the DE-based feature selection algorithm in order to improve classification for accurate activity recognition. PMID:25574185

  13. Feature selection in classification of eye movements using electrooculography for activity recognition.

    PubMed

    Mala, S; Latha, K

    2014-01-01

    Activity recognition is needed in many applications, for example, surveillance systems, patient monitoring, and human-computer interfaces. Feature selection plays an important role in activity recognition, data mining, and machine learning. To select a subset of features, Differential Evolution (DE), an efficient evolutionary optimizer, is used to find informative features from eye movements recorded using electrooculography (EOG). Many researchers use EOG signals in human-computer interaction with various computational intelligence methods to analyze eye movements. The proposed system involves analysis of EOG signals using clearness-based features, minimum-redundancy maximum-relevance features, and Differential Evolution-based features. This work concentrates on the DE-based feature selection algorithm in order to improve classification for accurate activity recognition.
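
    A minimal sketch of Differential-Evolution-based feature selection in the spirit of the two records above, assuming SciPy, scikit-learn, and a synthetic stand-in for the EOG feature matrix. Each DE candidate is a continuous mask thresholded at 0.5 and scored by cross-validated k-NN accuracy; the classifier and the fitness function are assumptions, not the authors' exact setup.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    # Synthetic stand-in for an EOG feature matrix: 200 windows x 15 features.
    X, y = make_classification(n_samples=200, n_features=15, n_informative=5,
                               random_state=0)

    def cost(mask):
        """Negative cross-validated accuracy using the features whose
        mask value exceeds 0.5."""
        selected = mask > 0.5
        if not selected.any():
            return 1.0  # penalize empty feature subsets
        return -cross_val_score(KNeighborsClassifier(), X[:, selected], y, cv=5).mean()

    result = differential_evolution(cost, bounds=[(0, 1)] * X.shape[1],
                                    maxiter=10, popsize=5, seed=0)
    print("selected features:", np.where(result.x > 0.5)[0])
    print("cross-validated accuracy:", -result.fun)
    ```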

  14. Vestibular-Related Frontal Cortical Areas and Their Roles in Smooth-Pursuit Eye Movements: Representation of Neck Velocity, Neck-Vestibular Interactions, and Memory-Based Smooth-Pursuit

    PubMed Central

    Fukushima, Kikuro; Fukushima, Junko; Warabi, Tateo

    2011-01-01

    Smooth-pursuit eye movements are voluntary responses to small slow-moving objects in the fronto-parallel plane. They evolved in primates, who possess high-acuity foveae, to ensure clear vision about the moving target. The primate frontal cortex contains two smooth-pursuit related areas; the caudal part of the frontal eye fields (FEF) and the supplementary eye fields (SEF). Both areas receive vestibular inputs. We review functional differences between the two areas in smooth-pursuit. Most FEF pursuit neurons signal pursuit parameters such as eye velocity and gaze-velocity, and are involved in canceling the vestibulo-ocular reflex by linear addition of vestibular and smooth-pursuit responses. In contrast, gaze-velocity signals are rarely represented in the SEF. Most FEF pursuit neurons receive neck velocity inputs, while discharge modulation during pursuit and trunk-on-head rotation adds linearly. Linear addition also occurs between neck velocity responses and vestibular responses during head-on-trunk rotation in a task-dependent manner. During cross-axis pursuit–vestibular interactions, vestibular signals effectively initiate predictive pursuit eye movements. Most FEF pursuit neurons discharge during the interaction training after the onset of pursuit eye velocity, making their involvement unlikely in the initial stages of generating predictive pursuit. Comparison of representative signals in the two areas and the results of chemical inactivation during a memory-based smooth-pursuit task indicate they have different roles; the SEF plans smooth-pursuit including working memory of motion–direction, whereas the caudal FEF generates motor commands for pursuit eye movements. Patients with idiopathic Parkinson’s disease were asked to perform this task, since impaired smooth-pursuit and visual working memory deficit during cognitive tasks have been reported in most patients. Preliminary results suggested specific roles of the basal ganglia in memory-based smooth-pursuit. PMID:22174706

  15. Intersegmental Eye-Head-Body Interactions during Complex Whole Body Movements

    PubMed Central

    von Laßberg, Christoph; Beykirch, Karl A.; Mohler, Betty J.; Bülthoff, Heinrich H.

    2014-01-01

    Using state-of-the-art technology, interactions of eye, head and intersegmental body movements were analyzed for the first time during multiple twisting somersaults of high-level gymnasts. With this aim, we used a unique combination of a 16-channel infrared kinemetric system; a three-dimensional video kinemetric system; wireless electromyography; and a specialized wireless sport-video-oculography system, which was able to capture and calculate precise oculomotor data under conditions of rapid multiaxial acceleration. All data were synchronized and integrated in a multimodal software tool for three-dimensional analysis. During specific phases of the recorded movements, a previously unknown eye-head-body interaction was observed. The phenomenon was marked by a prolonged and complete suppression of gaze-stabilizing eye movements, in favor of a tight coupling with the head, spine and joint movements of the gymnasts. Potential reasons for these observations are discussed with regard to earlier findings and integrated within a functional model. PMID:24763143

  16. Eye/Brain/Task Testbed And Software

    NASA Technical Reports Server (NTRS)

    Janiszewski, Thomas; Mainland, Nora; Roden, Joseph C.; Rothenheber, Edward H.; Ryan, Arthur M.; Stokes, James M.

    1994-01-01

    Eye/brain/task (EBT) testbed records electroencephalograms, movements of eyes, and structures of tasks to provide comprehensive data on neurophysiological experiments. Intended to serve continuing effort to develop means for interactions between human brain waves and computers. Software library associated with testbed provides capabilities to recall collected data, to process data on movements of eyes, to correlate eye-movement data with electroencephalographic data, and to present data graphically. Cognitive processes investigated in ways not previously possible.

  17. Using eye movement desensitization and reprocessing to enhance treatment of couples.

    PubMed

    Protinsky, H; Sparks, J; Flemke, K

    2001-04-01

    Eye Movement Desensitization and Reprocessing (EMDR) as a clinical technique may enhance treatment effectiveness when applied within a couple therapy approach that is emotionally and experientially oriented. Clinical experience indicates that EMDR-based interventions are useful for accessing, activating, tolerating, and reprocessing the intense emotions that often fuel dysfunctional couple interactions. Using EMDR within conjoint sessions to reprocess negative emotions can amplify intimacy, increase connection, and subsequently lead to a change in problematic relationship patterns.

  18. Video-Based Eye Tracking to Detect the Attention Shift: A Computer Classroom Context-Aware System

    ERIC Educational Resources Information Center

    Kuo, Yung-Lung; Lee, Jiann-Shu; Hsieh, Min-Chai

    2014-01-01

    Eye and head movements are evoked in response to obvious shifts of visual attention. However, there has been little progress so far on the causes of absent-mindedness. The paper proposes an attention-awareness system that captures the interaction of eye gaze and head pose under various kinds of attentional switching in a computer classroom.…

  19. Interactions Dominate the Dynamics of Visual Cognition

    PubMed Central

    Stephen, Damian G.; Mirman, Daniel

    2010-01-01

    Many cognitive theories have described behavior as the summation of independent contributions from separate components. Contrasting views have emphasized the importance of multiplicative interactions and emergent structure. We describe a statistical approach to distinguishing additive and multiplicative processes and apply it to the dynamics of eye movements during classic visual cognitive tasks. The results reveal interaction-dominant dynamics in eye movements in each of the three tasks, and that fine-grained eye movements are modulated by task constraints. These findings reveal the interactive nature of cognitive processing and are consistent with theories that view cognition as an emergent property of processes that are broadly distributed over many scales of space and time rather than a componential assembly line. PMID:20070957

  20. Evaluation of an eye-pointer interaction device for human-computer interaction.

    PubMed

    Cáceres, Enrique; Carrasco, Miguel; Ríos, Sebastián

    2018-03-01

    Advances in eye-tracking technology have led to better human-computer interaction and make it possible to control a computer without any kind of physical contact. This research describes the transformation of a commercial eye tracker into an alternative peripheral device for human-computer interaction, implementing a pointer that needs only the eye movements of a user facing a computer screen, thus replacing the need to control the software by hand movements. The experiment was performed with 30 test individuals who used the prototype with a set of educational videogames. The results show that, although most of the test subjects would prefer a mouse to control the pointer, the prototype tested has an empirical precision similar to that of the mouse, both when controlling its movements and when clicking on a point of the screen.
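
    One common way to turn such a gaze-driven pointer into mouse clicks is dwell-time selection; the sketch below emits a click whenever gaze stays within a small radius long enough. The 60 Hz sampling rate, 40-pixel tolerance, and 800 ms dwell threshold are assumptions, not necessarily the mechanism used in the prototype above.

    ```python
    def dwell_clicks(gaze_samples, fs=60, radius=40.0, dwell_ms=800):
        """Return (x, y) click positions whenever gaze stays within `radius`
        pixels of a point for at least `dwell_ms` milliseconds.
        gaze_samples: iterable of (x, y) screen coordinates sampled at `fs` Hz."""
        needed = int(dwell_ms * fs / 1000)      # samples required to dwell
        anchor, count, clicks = None, 0, []
        for x, y in gaze_samples:
            near = (anchor is not None and
                    (x - anchor[0]) ** 2 + (y - anchor[1]) ** 2 <= radius ** 2)
            if near:
                count += 1
                if count == needed:             # fire once per dwell
                    clicks.append(anchor)
            else:
                anchor, count = (x, y), 1
        return clicks

    # One second of gaze jittering around a point -> a single click there.
    print(dwell_clicks([(200 + i % 3, 300) for i in range(60)]))  # [(200, 300)]
    ```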

  1. The oculomotor role of the pontine nuclei and the nucleus reticularis tegmenti pontis.

    PubMed

    Thier, Peter; Möck, Martin

    2006-01-01

    Cerebral cortex and the cerebellum interact closely in order to facilitate spatial orientation and the generation of motor behavior, including eye movements. This interaction is based on a massive projection system that allows the exchange of signals between the two cortices. This cerebro-cerebellar communication system includes several intercalated brain stem nuclei, whose eminent role in the organization of oculomotor behavior has only recently become apparent. This review focuses on the two major nuclei of this group taking a precerebellar position, the pontine nuclei and the nucleus reticularis tegmenti pontis, both intimately involved in the visual guidance of eye movements.

  2. The interaction of Bayesian priors and sensory data and its neural circuit implementation in visually-guided movement

    PubMed Central

    Yang, Jin; Lee, Joonyeol; Lisberger, Stephen G.

    2012-01-01

    Sensory-motor behavior results from a complex interaction of noisy sensory data with priors based on recent experience. By varying the stimulus form and contrast for the initiation of smooth pursuit eye movements in monkeys, we show that visual motion inputs compete with two independent priors: one prior biases eye speed toward zero; the other prior attracts eye direction according to the past several days’ history of target directions. The priors bias the speed and direction of the initiation of pursuit for the weak sensory data provided by the motion of a low-contrast sine wave grating. However, the priors have relatively little effect on pursuit speed and direction when the visual stimulus arises from the coherent motion of a high-contrast patch of dots. For any given stimulus form, the mean and variance of eye speed co-vary in the initiation of pursuit, as expected for signal-dependent noise. This relationship suggests that pursuit implements a trade-off between movement accuracy and variation, reducing both when the sensory signals are noisy. The tradeoff is implemented as a competition of sensory data and priors that follows the rules of Bayesian estimation. Computer simulations show that the priors can be understood as direction specific control of the strength of visual-motor transmission, and can be implemented in a neural-network model that makes testable predictions about the population response in the smooth eye movement region of the frontal eye fields. PMID:23223286
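
    The competition between a slow-speed prior and noisy sensory data follows standard reliability-weighted (Bayesian) estimation, which a short sketch can make concrete. The prior mean of 0 deg/s and the standard deviations standing in for high- and low-contrast stimuli are illustrative assumptions, not values fitted in the study.

    ```python
    def posterior_speed(sensory_speed, sensory_sd, prior_mean=0.0, prior_sd=4.0):
        """Gaussian combination of a slow-speed prior with a noisy sensory
        speed estimate; returns the posterior mean (deg/s)."""
        w_sens = 1.0 / sensory_sd ** 2
        w_prior = 1.0 / prior_sd ** 2
        return (w_sens * sensory_speed + w_prior * prior_mean) / (w_sens + w_prior)

    # Target moving at 10 deg/s: a reliable (high-contrast) estimate barely
    # yields to the prior, a noisy (low-contrast) one is pulled toward zero.
    print(posterior_speed(10.0, sensory_sd=1.0))   # ~9.4 deg/s
    print(posterior_speed(10.0, sensory_sd=8.0))   # 2.0 deg/s
    ```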

  3. Interactions dominate the dynamics of visual cognition.

    PubMed

    Stephen, Damian G; Mirman, Daniel

    2010-04-01

    Many cognitive theories have described behavior as the summation of independent contributions from separate components. Contrasting views have emphasized the importance of multiplicative interactions and emergent structure. We describe a statistical approach to distinguishing additive and multiplicative processes and apply it to the dynamics of eye movements during classic visual cognitive tasks. The results reveal interaction-dominant dynamics in eye movements in each of the three tasks, and that fine-grained eye movements are modulated by task constraints. These findings reveal the interactive nature of cognitive processing and are consistent with theories that view cognition as an emergent property of processes that are broadly distributed over many scales of space and time rather than a componential assembly line.

  4. Eye Movements During Everyday Behavior Predict Personality Traits.

    PubMed

    Hoppe, Sabrina; Loetscher, Tobias; Morey, Stephanie A; Bulling, Andreas

    2018-01-01

    Besides allowing us to perceive our surroundings, eye movements are also a window into our mind and a rich source of information on who we are, how we feel, and what we do. Here we show that eye movements during an everyday task predict aspects of our personality. We tracked eye movements of 42 participants while they ran an errand on a university campus and subsequently assessed their personality traits using well-established questionnaires. Using a state-of-the-art machine learning method and a rich set of features encoding different eye movement characteristics, we were able to reliably predict four of the Big Five personality traits (neuroticism, extraversion, agreeableness, conscientiousness) as well as perceptual curiosity only from eye movements. Further analysis revealed new relations between previously neglected eye movement characteristics and personality. Our findings demonstrate a considerable influence of personality on everyday eye movement control, thereby complementing earlier studies in laboratory settings. Improving automatic recognition and interpretation of human social signals is an important endeavor, enabling innovative design of human-computer systems capable of sensing spontaneous natural user behavior to facilitate efficient interaction and personalization.

  5. Eye Movements During Everyday Behavior Predict Personality Traits

    PubMed Central

    Hoppe, Sabrina; Loetscher, Tobias; Morey, Stephanie A.; Bulling, Andreas

    2018-01-01

    Besides allowing us to perceive our surroundings, eye movements are also a window into our mind and a rich source of information on who we are, how we feel, and what we do. Here we show that eye movements during an everyday task predict aspects of our personality. We tracked eye movements of 42 participants while they ran an errand on a university campus and subsequently assessed their personality traits using well-established questionnaires. Using a state-of-the-art machine learning method and a rich set of features encoding different eye movement characteristics, we were able to reliably predict four of the Big Five personality traits (neuroticism, extraversion, agreeableness, conscientiousness) as well as perceptual curiosity only from eye movements. Further analysis revealed new relations between previously neglected eye movement characteristics and personality. Our findings demonstrate a considerable influence of personality on everyday eye movement control, thereby complementing earlier studies in laboratory settings. Improving automatic recognition and interpretation of human social signals is an important endeavor, enabling innovative design of human–computer systems capable of sensing spontaneous natural user behavior to facilitate efficient interaction and personalization. PMID:29713270
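
    Pipelines like the one in the two records above rest on features summarizing eye movement characteristics. The sketch below applies a simple velocity-threshold (I-VT style) segmentation to a gaze trace and returns a few summary statistics of the kind a downstream classifier could consume; the threshold and the particular statistics are assumptions, not the paper's feature set.

    ```python
    import numpy as np

    def eye_movement_features(x, y, fs=60.0, sacc_vel=30.0):
        """Summary statistics from a gaze trace in degrees, sampled at fs Hz.
        Inter-sample velocities above `sacc_vel` deg/s count as saccadic,
        the rest as fixational (simple I-VT style rule)."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        vel = np.hypot(np.diff(x), np.diff(y)) * fs       # deg/s between samples
        is_sacc = vel > sacc_vel
        n_onsets = int(np.sum(np.diff(is_sacc.astype(int)) == 1))
        return {
            "fixation_fraction": float(np.mean(~is_sacc)),
            "mean_velocity": float(vel.mean()),
            "peak_velocity": float(vel.max()),
            "saccades_per_second": n_onsets * fs / len(vel),
        }

    # One second of mostly stable gaze containing a single fast 5-degree jump.
    x = np.concatenate([np.full(30, 0.0), np.linspace(0, 5, 5), np.full(25, 5.0)])
    print(eye_movement_features(x, np.zeros(60)))
    ```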

  6. Visuo-Vestibular Interactions

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Session TA3 includes short reports covering: (1) Vestibulo-Oculomotor Interaction in Long-Term Microgravity; (2) Effects of Weightlessness on the Spatial Orientation of Visually Induced Eye Movements; (3) Adaptive Modification of the Three-Dimensional Vestibulo-Ocular Reflex during Prolonged Microgravity; (4) The Dynamic Change of Brain Potential Related to Selective Attention to Visual Signals from Left and Right Visual Fields; (5) Locomotor Errors Caused by Vestibular Suppression; and (6) A Novel, Image-Based Technique for Three-Dimensional Eye Measurement.

  7. Interacting with mobile devices by fusion eye and hand gestures recognition systems based on decision tree approach

    NASA Astrophysics Data System (ADS)

    Elleuch, Hanene; Wali, Ali; Samet, Anis; Alimi, Adel M.

    2017-03-01

    Two recognition systems, one for eye gestures and one for hand gestures, are used to control mobile devices. Based on real-time video captured by the device's camera, the first system recognizes the motion of the user's eyes and the second detects static hand gestures. To avoid confusion between natural and intentional movements, we developed a system that fuses the decisions coming from the eye and hand gesture recognition systems. The fusion stage is based on a decision-tree approach. We conducted a study with 5 volunteers, and the results show that our system is robust and competitive.
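
    A minimal sketch of decision-level fusion with a decision tree, assuming each recognizer outputs a gesture label and a confidence score. The feature encoding, the synthetic training rule, and the scikit-learn classifier are illustrative assumptions rather than the authors' implementation.

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    # Each row: [eye_label, eye_confidence, hand_label, hand_confidence].
    # Target: 0 = ignore (unintentional), 1 = trust eye, 2 = trust hand.
    rng = np.random.default_rng(0)
    X = rng.uniform(size=(200, 4))
    X[:, 0] = rng.integers(0, 4, 200)            # eye gesture label
    X[:, 2] = rng.integers(0, 4, 200)            # hand gesture label
    y = np.where(X[:, 1] < 0.3, np.where(X[:, 3] < 0.3, 0, 2), 1)

    fusion = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)

    # Confident eye gesture, hesitant hand gesture -> trust the eye decision.
    print(fusion.predict([[2, 0.9, 1, 0.2]]))    # [1]
    ```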

  8. Eye-hand coordination during a double-step task: evidence for a common stochastic accumulator

    PubMed Central

    Gopal, Atul

    2015-01-01

    Many studies of reaching and pointing have shown significant spatial and temporal correlations between eye and hand movements. Nevertheless, it remains unclear whether these correlations are incidental, arising from common inputs (independent model); whether these correlations represent an interaction between otherwise independent eye and hand systems (interactive model); or whether these correlations arise from a single dedicated eye-hand system (common command model). Subjects were instructed to redirect gaze and pointing movements in a double-step task in an attempt to decouple eye-hand movements and causally distinguish between the three architectures. We used a drift-diffusion framework in the context of a race model, which has been previously used to explain redirect behavior for eye and hand movements separately, to predict the pattern of eye-hand decoupling. We found that the common command architecture could best explain the observed frequency of different eye and hand response patterns to the target step. A common stochastic accumulator for eye-hand coordination also predicts comparable variances, despite significant difference in the means of the eye and hand reaction time (RT) distributions, which we tested. Consistent with this prediction, we observed that the variances of the eye and hand RTs were similar, despite much larger hand RTs (∼90 ms). Moreover, changes in mean eye RTs, which also increased eye RT variance, produced a similar increase in mean and variance of the associated hand RT. Taken together, these data suggest that a dedicated circuit underlies coordinated eye-hand planning. PMID:26084906
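
    The common-command prediction that eye and hand reaction times share their variability despite different means can be illustrated by simulating one noisy accumulator per trial whose threshold crossing triggers the eye, with the hand following after a fixed efferent delay. All parameter values below are assumptions chosen for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def common_accumulator_rts(n_trials=2000, drift=0.4, noise=1.0,
                               threshold=30.0, dt=1.0, hand_delay=90.0):
        """Simulate a shared drift-diffusion accumulator (1 ms steps); the eye
        moves at threshold crossing, the hand `hand_delay` ms later."""
        eye_rt = np.empty(n_trials)
        for i in range(n_trials):
            evidence, t = 0.0, 0.0
            while evidence < threshold:
                evidence += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
                t += dt
            eye_rt[i] = t
        return eye_rt, eye_rt + hand_delay

    eye, hand = common_accumulator_rts()
    print("mean RTs (ms):", eye.mean(), hand.mean())   # hand ~90 ms slower
    print("RT variances:", eye.var(), hand.var())      # identical: fixed delay
    ```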

  9. Eye-movements and Voice as Interface Modalities to Computer Systems

    NASA Astrophysics Data System (ADS)

    Farid, Mohsen M.; Murtagh, Fionn D.

    2003-03-01

    We investigate the visual and vocal modalities of interaction with computer systems. We focus on the integration of visual and vocal interfaces as replacement and/or additional modalities to enhance human-computer interaction. We present a new framework for employing eye gaze as an interface modality. While voice commands, as a means of interacting with computers, have been around for a number of years, integrating the vocal interface with a visual interface that detects the user's eye movements through an eye-tracking device is novel and promises to open horizons for new applications where a hand-mouse interface provides little or no apparent support for the task to be accomplished. We present an array of applications to illustrate the new framework and eye-voice integration.

  10. Real time eye tracking using Kalman extended spatio-temporal context learning

    NASA Astrophysics Data System (ADS)

    Munir, Farzeen; Minhas, Fayyaz ul Amir Asfar; Jalil, Abdul; Jeon, Moongu

    2017-06-01

    Real-time eye tracking has numerous applications in human-computer interaction, such as mouse cursor control in a computer system, and is useful for persons with muscular or motion impairments. However, tracking the movement of the eye is complicated by occlusion due to blinking, head movement, screen glare, rapid eye movements, etc. In this work, we present the algorithmic and construction details of a real-time eye tracking system. Our proposed system extends spatio-temporal context learning with Kalman filtering. Spatio-temporal context learning offers state-of-the-art accuracy in general object tracking, but its performance suffers under object occlusion. Adding the Kalman filter allows the proposed method to model the dynamics of eye motion and provide robust eye tracking in cases of occlusion. We demonstrate the effectiveness of this tracking technique by controlling the computer cursor in real time with eye movements.
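
    A minimal constant-velocity Kalman filter sketch showing how the prediction step can bridge blink-related dropouts in an eye-position stream. It stands in for, but is not, the Kalman-extended spatio-temporal context learner described above, and the noise parameters are assumptions.

    ```python
    import numpy as np

    def track_eye(measurements, dt=1 / 30, q=50.0, r=4.0):
        """Constant-velocity Kalman filter over 2-D eye positions (pixels).
        `measurements` holds (x, y) tuples, or None for occluded frames such
        as blinks; during occlusion only the prediction step runs."""
        F = np.eye(4); F[0, 2] = F[1, 3] = dt          # state: x, y, vx, vy
        H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1.0
        Q, R = q * np.eye(4), r * np.eye(2)
        x, P = np.zeros(4), np.eye(4) * 1e3
        track = []
        for z in measurements:
            x, P = F @ x, F @ P @ F.T + Q              # predict
            if z is not None:                          # update when visible
                innov = np.asarray(z, float) - H @ x
                S = H @ P @ H.T + R
                K = P @ H.T @ np.linalg.inv(S)
                x = x + K @ innov
                P = (np.eye(4) - K @ H) @ P
            track.append(x[:2].copy())
        return np.array(track)

    # Gaze drifting rightward with a three-frame blink in the middle.
    obs = [(100, 200), (110, 200), (120, 200), None, None, None, (160, 200)]
    print(track_eye(obs).round(1))
    ```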

  11. Eye Tracking and Head Movement Detection: A State-of-Art Survey

    PubMed Central

    2013-01-01

    Eye-gaze detection and tracking have been an active research field in recent years, as they add convenience to a variety of applications, and they are considered a significant non-traditional method of human-computer interaction. Head movement detection has also received researchers' attention and interest, as it has been found to be a simple and effective interaction method. Both technologies are considered among the easiest alternative interface methods. They serve a wide range of severely disabled people who are left with minimal motor abilities. For both eye tracking and head movement detection, several different approaches have been proposed and used to implement different algorithms. Despite the amount of research done on both technologies, researchers are still trying to find robust methods that can be used effectively in various applications. This paper presents a state-of-the-art survey of eye tracking and head movement detection methods proposed in the literature. Examples of different fields of application for both technologies, such as human-computer interaction, driving assistance systems, and assistive technologies, are also investigated. PMID:27170851

  12. MR-eyetracker: a new method for eye movement recording in functional magnetic resonance imaging.

    PubMed

    Kimmig, H; Greenlee, M W; Huethe, F; Mergner, T

    1999-06-01

    We present a method for recording saccadic and pursuit eye movements in the magnetic resonance tomograph designed for visual functional magnetic resonance imaging (fMRI) experiments. To reliably classify brain areas as pursuit or saccade related it is important to carefully measure the actual eye movements. For this purpose, infrared light, created outside the scanner by light-emitting diodes (LEDs), is guided via optic fibers into the head coil and onto the eye of the subject. Two additional fiber optical cables pick up the light reflected by the iris. The illuminating and detecting cables are mounted in a plastic eyepiece that is manually lowered to the level of the eye. By means of differential amplification, we obtain a signal that covaries with the horizontal position of the eye. Calibration of eye position within the scanner yields an estimate of eye position with a resolution of 0.2 degrees at a sampling rate of 1000 Hz. Experiments are presented that employ echoplanar imaging with 12 image planes through visual, parietal and frontal cortex while subjects performed saccadic and pursuit eye movements. The distribution of BOLD (blood oxygen level dependent) responses is shown to depend on the type of eye movement performed. Our method yields high temporal and spatial resolution of the horizontal component of eye movements during fMRI scanning. Since the signal is purely optical, there is no interaction between the eye movement signals and the echoplanar images. This reasonably priced eye tracker can be used to control eye position and monitor eye movements during fMRI.

  13. Form-To-Expectation Matching Effects on First-Pass Eye Movement Measures During Reading

    PubMed Central

    Farmer, Thomas A.; Yan, Shaorong; Bicknell, Klinton; Tanenhaus, Michael K.

    2015-01-01

    Recent EEG/MEG studies suggest that when contextual information is highly predictive of some property of a linguistic signal, expectations generated from context can be translated into surprisingly low-level estimates of the physical form-based properties likely to occur in subsequent portions of the unfolding signal. Whether form-based expectations are generated and assessed during natural reading, however, remains unclear. We monitored eye movements while participants read phonologically typical and atypical nouns in noun-predictive contexts (Experiment 1), demonstrating that when a noun is strongly expected, fixation durations on first-pass eye movement measures, including first fixation duration, gaze duration, and go-past times, are shorter for nouns with category typical form-based features. In Experiments 2 and 3, typical and atypical nouns were placed in sentential contexts normed to create expectations of variable strength for a noun. Context and typicality interacted significantly at gaze duration. These results suggest that during reading, form-based expectations that are translated from higher-level category-based expectancies can facilitate the processing of a word in context, and that their effect on lexical processing is graded based on the strength of category expectancy. PMID:25915072

  14. Active head rotations and eye-head coordination

    NASA Technical Reports Server (NTRS)

    Zangemeister, W. H.; Stark, L.

    1981-01-01

    It is pointed out that head movements play an important role in gaze. The interaction between eye and head movements involves both their shared role in directing gaze and the compensatory vestibular ocular reflex. The dynamics of head trajectories are discussed, taking into account the use of parameterization to obtain the peak velocity, peak accelerations, the times of these extrema, and the duration of the movement. Attention is given to the main sequence, neck muscle EMG and details of the head-movement trajectory, types of head model accelerations, the latency of eye and head movement in coordinated gaze, gaze latency as a function of various factors, and coordinated gaze types. Clinical examples of gaze-plane analysis are considered along with the instantaneous change of compensatory eye movement (CEM) gain, and aspects of variability.

  15. iTemplate: A template-based eye movement data analysis approach.

    PubMed

    Xiao, Naiqi G; Lee, Kang

    2018-02-08

    Current eye movement data analysis methods rely on defining areas of interest (AOIs). Due to the fact that AOIs are created and modified manually, variances in their size, shape, and location are unavoidable. These variances affect not only the consistency of the AOI definitions, but also the validity of the eye movement analyses based on the AOIs. To reduce the variances in AOI creation and modification and achieve a procedure to process eye movement data with high precision and efficiency, we propose a template-based eye movement data analysis method. Using a linear transformation algorithm, this method registers the eye movement data from each individual stimulus to a template. Thus, users only need to create one set of AOIs for the template in order to analyze eye movement data, rather than creating a unique set of AOIs for all individual stimuli. This change greatly reduces the error caused by the variance from manually created AOIs and boosts the efficiency of the data analysis. Furthermore, this method can help researchers prepare eye movement data for some advanced analysis approaches, such as iMap. We have developed software (iTemplate) with a graphic user interface to make this analysis method available to researchers.
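
    The registration step can be sketched as a least-squares 2-D affine fit from matched landmark points, applied to fixation coordinates so that a single set of template AOIs can be reused across stimuli. The landmark-based estimation and the toy coordinates are assumptions about how such a linear transform might be obtained, not iTemplate's actual algorithm.

    ```python
    import numpy as np

    def fit_affine(src, dst):
        """Least-squares 2-D affine transform mapping `src` points onto `dst`.
        Returns a 2x3 matrix A such that dst ~ A @ [x, y, 1]."""
        src, dst = np.asarray(src, float), np.asarray(dst, float)
        X = np.hstack([src, np.ones((len(src), 1))])      # (n, 3)
        B, *_ = np.linalg.lstsq(X, dst, rcond=None)       # (3, 2)
        return B.T                                        # (2, 3)

    def apply_affine(A, points):
        pts = np.asarray(points, float)
        return pts @ A[:, :2].T + A[:, 2]

    # Landmarks on one stimulus and on the template (illustrative values:
    # the stimulus is shifted by (12, -8) pixels relative to the template).
    stim_marks = [(100, 100), (300, 100), (200, 250)]
    tmpl_marks = [(112, 92), (312, 92), (212, 242)]
    A = fit_affine(stim_marks, tmpl_marks)

    # Map fixations recorded on the stimulus into template space,
    # yielding (162, 172) and (272, 112).
    print(apply_affine(A, [(150, 180), (260, 120)]).round(1))
    ```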

  16. Eye movement identification based on accumulated time feature

    NASA Astrophysics Data System (ADS)

    Guo, Baobao; Wu, Qiang; Sun, Jiande; Yan, Hua

    2017-06-01

    Eye movement is a new kind of feature for biometric recognition; it has many advantages compared with other features such as fingerprint, face, and iris. It is not only a static characteristic but also a combination of brain activity and muscle behavior, which makes it effective in preventing spoofing attacks. In addition, eye movements can be combined with the face, iris, and other features recorded from the face region in multimodal systems. In this paper, we conduct an exploratory study on eye movement identification with different classification methods, based on the eye movement datasets provided by Komogortsev et al. in 2011. Saccade and fixation times are extracted from the eye movement data as eye movement features. Furthermore, a performance analysis is conducted on different classification methods, such as BP, RBF, Elman, and SVM, in order to provide a reference for future research in this field.

  17. Binocular fusion and invariant category learning due to predictive remapping during scanning of a depthful scene with eye movements

    PubMed Central

    Grossberg, Stephen; Srinivasan, Karthik; Yazdanbakhsh, Arash

    2015-01-01

    How does the brain maintain stable fusion of 3D scenes when the eyes move? Every eye movement causes each retinal position to process a different set of scenic features, and thus the brain needs to binocularly fuse new combinations of features at each position after an eye movement. Despite these breaks in retinotopic fusion due to each movement, previously fused representations of a scene in depth often appear stable. The 3D ARTSCAN neural model proposes how the brain does this by unifying concepts about how multiple cortical areas in the What and Where cortical streams interact to coordinate processes of 3D boundary and surface perception, spatial attention, invariant object category learning, predictive remapping, eye movement control, and learned coordinate transformations. The model explains data from single neuron and psychophysical studies of covert visual attention shifts prior to eye movements. The model further clarifies how perceptual, attentional, and cognitive interactions among multiple brain regions (LGN, V1, V2, V3A, V4, MT, MST, PPC, LIP, ITp, ITa, SC) may accomplish predictive remapping as part of the process whereby view-invariant object categories are learned. These results build upon earlier neural models of 3D vision and figure-ground separation and the learning of invariant object categories as the eyes freely scan a scene. A key process concerns how an object's surface representation generates a form-fitting distribution of spatial attention, or attentional shroud, in parietal cortex that helps maintain the stability of multiple perceptual and cognitive processes. Predictive eye movement signals maintain the stability of the shroud, as well as of binocularly fused perceptual boundaries and surface representations. PMID:25642198

  18. Binocular fusion and invariant category learning due to predictive remapping during scanning of a depthful scene with eye movements.

    PubMed

    Grossberg, Stephen; Srinivasan, Karthik; Yazdanbakhsh, Arash

    2014-01-01

    How does the brain maintain stable fusion of 3D scenes when the eyes move? Every eye movement causes each retinal position to process a different set of scenic features, and thus the brain needs to binocularly fuse new combinations of features at each position after an eye movement. Despite these breaks in retinotopic fusion due to each movement, previously fused representations of a scene in depth often appear stable. The 3D ARTSCAN neural model proposes how the brain does this by unifying concepts about how multiple cortical areas in the What and Where cortical streams interact to coordinate processes of 3D boundary and surface perception, spatial attention, invariant object category learning, predictive remapping, eye movement control, and learned coordinate transformations. The model explains data from single neuron and psychophysical studies of covert visual attention shifts prior to eye movements. The model further clarifies how perceptual, attentional, and cognitive interactions among multiple brain regions (LGN, V1, V2, V3A, V4, MT, MST, PPC, LIP, ITp, ITa, SC) may accomplish predictive remapping as part of the process whereby view-invariant object categories are learned. These results build upon earlier neural models of 3D vision and figure-ground separation and the learning of invariant object categories as the eyes freely scan a scene. A key process concerns how an object's surface representation generates a form-fitting distribution of spatial attention, or attentional shroud, in parietal cortex that helps maintain the stability of multiple perceptual and cognitive processes. Predictive eye movement signals maintain the stability of the shroud, as well as of binocularly fused perceptual boundaries and surface representations.

  19. Human amygdala activation during rapid eye movements of rapid eye movement sleep: an intracranial study.

    PubMed

    Corsi-Cabrera, María; Velasco, Francisco; Del Río-Portilla, Yolanda; Armony, Jorge L; Trejo-Martínez, David; Guevara, Miguel A; Velasco, Ana L

    2016-10-01

    The amygdaloid complex plays a crucial role in processing emotional signals and in the formation of emotional memories. Neuroimaging studies have shown human amygdala activation during rapid eye movement sleep (REM). Stereotactically implanted electrodes for presurgical evaluation in epileptic patients provide a unique opportunity to directly record amygdala activity. The present study analysed amygdala activity associated with REM sleep eye movements on the millisecond scale. We propose that phasic activation associated with rapid eye movements may provide the amygdala with endogenous excitation during REM sleep. Standard polysomnography and stereo-electroencephalographic (SEEG) activity in the left amygdala were recorded simultaneously during spontaneous sleep in four patients. Time-frequency analysis and absolute power of gamma activity were obtained for 250 ms time windows preceding and following eye movement onset in REM sleep, and in spontaneous waking eye movements in the dark. Absolute power of the 44-48 Hz band increased significantly during the 250 ms time window after the onset of REM sleep rapid eye movements, but not during waking eye movements. Transient activation of the amygdala provides physiological support for the proposed participation of the amygdala in emotional expression, in the emotional content of dreams and for the reactivation and consolidation of emotional memories during REM sleep, as well as for next-day emotional regulation, and its possible role in the bidirectional interaction between REM sleep and such sleep disorders as nightmares, anxiety and post-traumatic sleep disorder. These results provide unique, direct evidence of increased activation of the human amygdala time-locked to REM sleep rapid eye movements.

  20. On Biometrics With Eye Movements.

    PubMed

    Zhang, Youming; Juhola, Martti

    2017-09-01

    Eye movements are a relatively novel data source for biometric identification. As video cameras applied to eye tracking become smaller and more efficient, this data source could offer interesting opportunities for the development of eye movement biometrics. In this paper, we primarily study biometric identification, seen as a multi-class classification task, and secondarily biometric verification, considered as binary classification. Our research is based on saccadic eye movement measurements from 109 young subjects. To test the measured data, we use a biometric identification procedure following the one-versus-one (subject) principle. In a development from our previous research, which also involved biometric verification based on saccadic eye movements, we now apply another eye tracker device with a higher sampling frequency of 250 Hz. The results obtained are good, with correct identification rates of 80-90% at best.

  1. Eye Tracking Based Control System for Natural Human-Computer Interaction

    PubMed Central

    Lin, Shu-Fan

    2017-01-01

    Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. In order to improve the reliability, mobility, and usability of eye tracking techniques in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode using only the user's eyes. The usage flow of the proposed system is designed to follow natural human habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of different difficulty (article searching and multimedia web browsing) were performed to compare the proposed eye control tool with an existing system. Technology Acceptance Model (TAM) measures were used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design. PMID:29403528

  2. Eye Tracking Based Control System for Natural Human-Computer Interaction.

    PubMed

    Zhang, Xuebai; Liu, Xiaolong; Yuan, Shyan-Ming; Lin, Shu-Fan

    2017-01-01

    Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. In order to improve the reliability, mobility, and usability of eye tracking techniques in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode using only the user's eyes. The usage flow of the proposed system is designed to follow natural human habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of different difficulty (article searching and multimedia web browsing) were performed to compare the proposed eye control tool with an existing system. Technology Acceptance Model (TAM) measures were used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design.

  3. Generating and Describing Affective Eye Behaviors

    NASA Astrophysics Data System (ADS)

    Mao, Xia; Li, Zheng

    The manner of a person's eye movements conveys much nonverbal information and emotional intent beyond speech. This paper describes work on expressing emotion through eye behaviors in virtual agents, based on parameters selected from the AU-coded facial expression database and real-time eye movement data (pupil size, blink rate, and saccades). A rule-based approach that generates primary emotions (joyful, sad, angry, afraid, disgusted, and surprised) and intermediate emotions (emotions that can be represented as mixtures of two primary emotions) using MPEG-4 FAPs (facial animation parameters) is introduced. In addition, based on our research, a scripting tool named EEMML (Emotional Eye Movement Markup Language), which enables authors to describe and generate the emotional eye movements of virtual agents, is proposed.
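
    The rule-based idea can be sketched as a lookup of eye-behavior parameters per primary emotion plus a weighted blend for intermediate emotions. The parameter values below are invented placeholders, not values from the AU-coded database or the EEMML specification.

    ```python
    # Placeholder parameter tables (pupil scale, blinks per minute, saccade
    # amplitude in degrees); the numbers are illustrative assumptions only.
    PRIMARY = {
        "joyful": {"pupil": 1.15, "blink_rate": 18, "saccade_amp": 6.0},
        "sad":    {"pupil": 0.95, "blink_rate": 10, "saccade_amp": 2.5},
        "angry":  {"pupil": 1.10, "blink_rate": 26, "saccade_amp": 4.0},
        "afraid": {"pupil": 1.25, "blink_rate": 30, "saccade_amp": 7.5},
    }

    def mix_emotions(a, b, weight=0.5):
        """Intermediate emotion as a weighted blend of two primary emotions."""
        pa, pb = PRIMARY[a], PRIMARY[b]
        return {k: (1 - weight) * pa[k] + weight * pb[k] for k in pa}

    # e.g. a bittersweet state blending 60% joyful with 40% sad.
    print(mix_emotions("joyful", "sad", weight=0.4))
    ```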

  4. Smooth pursuitlike eye movements evoked by microstimulation in macaque nucleus reticularis tegmenti pontis.

    PubMed

    Yamada, T; Suzuki, D A; Yee, R D

    1996-11-01

    1. Smooth pursuit-like eye movements were evoked with low-current microstimulation delivered to rostral portions of the nucleus reticularis tegmenti pontis (rNRTP) in alert macaques. Microstimulation sites were selected by the observation of modulations in single-cell firing rates that were correlated with periodic smooth-pursuit eye movements. Current intensities ranged from 10 to 120 microA and were routinely < 40 microA. Microstimulation was delivered either in the dark with no fixation, 100 ms after a fixation target was extinguished, or during maintained fixation of a stationary or moving target. Evoked eye movements also were studied under open-loop conditions with the target image stabilized on the retina. 2. Eye movements evoked in the absence of a target rapidly accelerated to a constant velocity that was maintained for the duration of the microstimulation. Evoked eye speeds ranged from 3.7 to 23 deg/s and averaged 11 deg/s. Evoked eye speed appeared to be linearly related to initial eye position, with a sensitivity to initial eye position that averaged 0.23 deg.s-1.deg-1. While some horizontal and oblique smooth eye movements were elicited, microstimulation resulted in upward eye movements in 89% of the sites. 3. Evoked eye speed was found to depend on microstimulation pulse frequency and current intensity. Within limits, evoked eye speed increased with increases in stimulation frequency or current intensity. For stimulation frequencies < 300-400 Hz, only smooth pursuit-like eye movements were evoked. At higher stimulation frequencies, accompanying saccades were consistently elicited. 4. Feedback of retinal image motion interacted with the evoked eye movements to decrease eye speed if the visual motion was in the direction opposite to the evoked pursuit-like eye movements. 5. The results implicate rNRTP as part of the neuronal substrate that controls smooth-pursuit eye movements. NRTP appears to be divided functionally into a rostral, pursuit-related portion and a caudal, saccade-related area. rNRTP is a component of a corticopontocerebellar circuit that presumably involves the pursuit area of the frontal eye field and that parallels the middle and medial superior temporal cerebral cortical/dorsolateral pontine nucleus (MT/MST-DLPN-cerebellum) pathway also known to be involved in regulating smooth-pursuit eye movements.

  5. Brief Report: Patterns of Eye Movements in Face to Face Conversation Are Associated with Autistic Traits--Evidence from a Student Sample

    ERIC Educational Resources Information Center

    Vabalas, Andrius; Freeth, Megan

    2016-01-01

    The current study investigated whether the amount of autistic traits shown by an individual is associated with viewing behaviour during a face-to-face interaction. The eye movements of 36 neurotypical university students were recorded using a mobile eye-tracking device. High amounts of autistic traits were neither associated with reduced looking…

  6. The Relation between Reading Skills and Eye Movement Patterns in Adolescent Readers: Evidence from a Regular Orthography

    PubMed Central

    Krieber, Magdalena; Bartl-Pokorny, Katrin D.; Pokorny, Florian B.; Einspieler, Christa; Langmann, Andrea; Körner, Christof; Falck-Ytter, Terje; Marschik, Peter B.

    2016-01-01

    Over the past decades, the relation between reading skills and eye movement behavior has been well documented in English-speaking cohorts. As English and German differ substantially with regard to orthographic complexity (i.e. grapheme-phoneme correspondence), we aimed to delineate specific characteristics of how reading speed and reading comprehension interact with eye movements in typically developing German-speaking (Austrian) adolescents. Eye movements of 22 participants (14 females; mean age = 13;6 years;months) were tracked while they were performing three tasks, namely silently reading words, texts, and pseudowords. Their reading skills were determined by means of a standardized German reading speed and reading comprehension assessment (Lesegeschwindigkeits- und -verständnistest für Klassen 6−12). We found that (a) reading skills were associated with various eye movement parameters in each of the three reading tasks; (b) better reading skills were associated with an increased efficiency of eye movements, but were primarily linked to spatial reading parameters, such as the number of fixations per word, the total number of saccades and saccadic amplitudes; (c) reading speed was a more reliable predictor for eye movement parameters than reading comprehension; (d) eye movements were highly correlated across reading tasks, which indicates consistent reading performances. Contrary to findings in English-speaking cohorts, the reading skills neither consistently correlated with temporal eye movement parameters nor with the number or percentage of regressions made while performing any of the three reading tasks. These results indicate that, although reading skills are associated with eye movement patterns irrespective of language, the temporal and spatial characteristics of this association may vary with orthographic consistency. PMID:26727255

  7. The vestibular-related frontal cortex and its role in smooth-pursuit eye movements and vestibular-pursuit interactions

    PubMed Central

    Fukushima, Junko; Akao, Teppei; Kurkin, Sergei; Kaneko, Chris R.S.; Fukushima, Kikuro

    2006-01-01

    In order to see clearly when a target is moving slowly, primates with high acuity foveae use smooth-pursuit and vergence eye movements. The former rotates both eyes in the same direction to track target motion in frontal planes, while the latter rotates left and right eyes in opposite directions to track target motion in depth. Together, these two systems pursue targets precisely and maintain their images on the foveae of both eyes. During head movements, both systems must interact with the vestibular system to minimize slip of the retinal images. The primate frontal cortex contains two pursuit-related areas: the caudal part of the frontal eye fields (FEF) and the supplementary eye fields (SEF). Evoked potential studies have demonstrated vestibular projections to both areas, and pursuit neurons in both areas respond to vestibular stimulation. The majority of FEF pursuit neurons code parameters of pursuit such as pursuit and vergence eye velocity, gaze velocity, and retinal image motion for target velocity in frontal and depth planes. Moreover, vestibular inputs contribute to the predictive pursuit responses of FEF neurons. In contrast, the majority of SEF pursuit neurons do not code pursuit metrics, and many SEF neurons are reported to be active in more complex tasks. These results suggest that FEF- and SEF-pursuit neurons are involved in different aspects of vestibular-pursuit interactions and that eye velocity coding of SEF pursuit neurons is specialized for the task condition. PMID:16917164

  8. Image-based computer-assisted diagnosis system for benign paroxysmal positional vertigo

    NASA Astrophysics Data System (ADS)

    Kohigashi, Satoru; Nakamae, Koji; Fujioka, Hiromu

    2005-04-01

    We developed an image-based computer-assisted diagnosis system for benign paroxysmal positional vertigo (BPPV) that consists of a balance control system simulator, a 3D eye movement simulator, and a method for extracting the nystagmus response directly from an eye movement image sequence. In the system, the causes and conditions of BPPV are estimated by searching a database for the record that matches the nystagmus response extracted from the observed eye image sequence of the patient with BPPV. The database includes the nystagmus responses for simulated eye movement sequences. The eye movement velocity is obtained using the balance control system simulator, which allows us to simulate BPPV under various conditions such as canalithiasis, cupulolithiasis, number of otoconia, and otoconium size. The eye movement image sequence is then displayed on a CRT by the 3D eye movement simulator. The nystagmus responses are extracted from the image sequence by the proposed method and stored in the database. To enhance diagnostic accuracy, the nystagmus response for a newly simulated sequence is matched with that for the observed sequence, and the causes and conditions of BPPV are estimated from the matched simulation conditions. We applied the system to two real eye movement image sequences from patients with BPPV to show its validity.
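    A minimal sketch of the matching step is given below: an observed nystagmus response (here a slow-phase velocity trace) is compared against simulated traces stored in a database and the best-matching simulation conditions are returned. The traces, condition labels, and L2 distance are assumptions for illustration, not the system's actual representation.

    ```python
    # Nearest-neighbour matching of an observed nystagmus trace against a small
    # database of simulated traces; all numbers here are invented placeholders.
    import numpy as np

    t = np.linspace(0.0, 10.0, 200)                       # seconds
    database = {
        ("canalithiasis", "posterior"):   30 * np.exp(-t / 2.0),
        ("cupulolithiasis", "posterior"): 15 * np.exp(-t / 6.0),
        ("canalithiasis", "horizontal"):  45 * np.exp(-t / 1.5),
    }

    def best_match(observed):
        """Return the simulated condition whose trace is closest in L2 distance."""
        return min(database.items(),
                   key=lambda kv: np.linalg.norm(kv[1] - observed))[0]

    observed = 28 * np.exp(-t / 2.1) + np.random.default_rng(1).normal(0, 1, t.size)
    print(best_match(observed))                           # ('canalithiasis', 'posterior')
    ```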

  9. Looking around: 35 years of oculomotor modeling

    NASA Technical Reports Server (NTRS)

    Young, L. R.

    1995-01-01

    Eye movements have attracted an unusually large number of researchers from many disparate fields, especially over the past 35 years. The lure of this system stemmed from its apparent simplicity of description, measurement, and analysis, as well as the promise of providing a "window in the mind." Investigators in areas ranging from biological control systems and neurological diagnosis to applications in advertising and flight simulation expected eye movements to provide clear indicators of what the sensory-motor system was accomplishing and what the brain found to be of interest. The parallels between compensatory eye movements and perception of spatial orientation have been a subject for active study in visual-vestibular interaction, where substantial knowledge has accumulated through experiments largely guided by the challenge of proving or disproving model predictions. Even though oculomotor control has arguably benefited more from systems theory than any other branch of motor control, many of the original goals remain largely unfulfilled. This paper considers some of the promising potential benefits of eye movement research and compares accomplishments with anticipated results. Four topics are considered in greater detail: (i) the definition of oculomotor system input and output, (ii) optimization of the eye movement system, (iii) the relationship between compensatory eye movements and spatial orientation through the "internal model," and (iv) the significance of eye movements as measured in (outer) space.

  10. Looking around: 35 years of oculomotor modeling.

    PubMed

    Young, L R

    1995-01-01

    Eye movements have attracted an unusually large number of researchers from many disparate fields, especially over the past 35 years. The lure of this system stemmed from its apparent simplicity of description, measurement, and analysis, as well as the promise of providing a "window in the mind." Investigators in areas ranging from biological control systems and neurological diagnosis to applications in advertising and flight simulation expected eye movements to provide clear indicators of what the sensory-motor system was accomplishing and what the brain found to be of interest. The parallels between compensatory eye movements and perception of spatial orientation have been a subject for active study in visual-vestibular interaction, where substantial knowledge has accumulated through experiments largely guided by the challenge of proving or disproving model predictions. Even though oculomotor control has arguably benefited more from systems theory than any other branch of motor control, many of the original goals remain largely unfulfilled. This paper considers some of the promising potential benefits of eye movement research and compares accomplishments with anticipated results. Four topics are considered in greater detail: (i) the definition of oculomotor system input and output, (ii) optimization of the eye movement system, (iii) the relationship between compensatory eye movements and spatial orientation through the "internal model," and (iv) the significance of eye movements as measured in (outer) space.

  11. Attachment Avoidance Is Significantly Related to Attentional Preference for Infant Faces: Evidence from Eye Movement Data

    PubMed Central

    Jia, Yuncheng; Cheng, Gang; Zhang, Dajun; Ta, Na; Xia, Mu; Ding, Fangyuan

    2017-01-01

    Objective: To determine the influence of adult attachment orientations on infant preference. Methods: We adopted eye-tracking technology to monitor childless college women’s eye movements when looking at pairs of faces, including one adult face (man or woman) and one infant face, with three different expressions (happy, sadness, and neutral). The participants (N = 150; 84% Han ethnicity) were aged 18–29 years (M = 19.22, SD = 1.72). A random intercepts multilevel linear regression analysis was used to assess the unique contribution of attachment avoidance, determined using the Experiences in Close Relationships scale, to preference for infant faces. Results: Women with higher attachment avoidance showed less infant preference, as shown by less sustained overt attentional bias to the infant face than the adult face based on fixation time and count. Conclusion: Adult attachment might be related to infant preference according to eye movement indices. Women with higher attachment avoidance may lack attentional preference for infant faces. The findings may aid the treatment and remediation of the interactions between children and mothers with insecure attachment. PMID:28184210

  12. Attachment Avoidance Is Significantly Related to Attentional Preference for Infant Faces: Evidence from Eye Movement Data.

    PubMed

    Jia, Yuncheng; Cheng, Gang; Zhang, Dajun; Ta, Na; Xia, Mu; Ding, Fangyuan

    2017-01-01

    Objective: To determine the influence of adult attachment orientations on infant preference. Methods: We adopted eye-tracking technology to monitor childless college women's eye movements when looking at pairs of faces, including one adult face (man or woman) and one infant face, with three different expressions (happy, sadness, and neutral). The participants (N = 150; 84% Han ethnicity) were aged 18-29 years (M = 19.22, SD = 1.72). A random intercepts multilevel linear regression analysis was used to assess the unique contribution of attachment avoidance, determined using the Experiences in Close Relationships scale, to preference for infant faces. Results: Women with higher attachment avoidance showed less infant preference, as shown by less sustained overt attentional bias to the infant face than the adult face based on fixation time and count. Conclusion: Adult attachment might be related to infant preference according to eye movement indices. Women with higher attachment avoidance may lack attentional preference for infant faces. The findings may aid the treatment and remediation of the interactions between children and mothers with insecure attachment.
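    For readers unfamiliar with the analysis named above, the sketch below fits a random-intercepts multilevel model to simulated data with statsmodels; the variable names, effect sizes, and simulated preference index are illustrative only and do not reproduce the study's data or exact model specification.

    ```python
    # Hedged sketch of a random-intercepts model relating attachment avoidance to
    # an infant-face preference index (all data simulated for illustration).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(42)
    n_subj, trials = 50, 12
    subject = np.repeat(np.arange(n_subj), trials)
    avoidance = np.repeat(rng.normal(3.0, 1.0, n_subj), trials)      # ECR avoidance score
    subj_intercept = np.repeat(rng.normal(0.0, 0.08, n_subj), trials)

    # Preference index: proportion of fixation time on the infant face per trial;
    # the negative avoidance slope mirrors the direction of the reported effect.
    preference = (0.60 - 0.04 * avoidance + subj_intercept
                  + rng.normal(0, 0.05, n_subj * trials))

    df = pd.DataFrame({"subject": subject, "avoidance": avoidance,
                       "preference": preference})
    model = smf.mixedlm("preference ~ avoidance", df, groups=df["subject"]).fit()
    print(model.summary())
    ```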

  13. ECEM (Eye Closure, Eye Movements): application to depersonalization disorder.

    PubMed

    Harriet, E Hollander

    2009-10-01

    Eye Closure, Eye Movements (ECEM) is a hypnotically-based approach to treatment that incorporates eye movements adapted from the Eye Movement Desensitization and Reprocessing (EMDR) protocol in conjunction with hypnosis for the treatment of depersonalization disorder. Depersonalization disorder has been differentiated from post-traumatic stress disorders and has recently been conceptualized as a subtype of panic disorder (Baker et al., 2003; David, Phillips, Medford, & Sierra, 2004; Segui et al., 2000). During ECEM, while remaining in a hypnotic state, clients self-generated six to seven trials of eye movements to reduce anticipatory anxiety associated with depersonalization disorder. Eye movements were also used to process triggers that elicited breath holding, often followed by episodes of depersonalization. Hypnotic suggestions were used to reverse core symptoms of depersonalization, subjectively described as "feeling unreal" (Simeon et al., 1997).

  14. Effect of bilateral eye movements on frontal interhemispheric gamma EEG coherence: implications for EMDR therapy.

    PubMed

    Propper, Ruth E; Pierce, Jenna; Geisler, Mark W; Christman, Stephen D; Bellorado, Nathan

    2007-09-01

    The use of bilateral eye movements (EMs) is an important component of Eye Movement Desensitization and Reprocessing (EMDR) therapy for posttraumatic stress disorder. The neural mechanisms underlying EMDR remain unclear. However, prior behavioral work looking at the effects of bilateral EMs on the retrieval of episodic memories suggests that the EMs enhance interhemispheric interaction. The present study examined the effects of the EMs used in EMDR on interhemispheric electroencephalogram coherence. Relative to non-eye-movement controls, engaging in bilateral EMs led to decreased interhemispheric gamma electroencephalogram coherence. Implications for future work on EMDR and episodic memory are discussed.
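    Interhemispheric coherence of this kind can be estimated with Welch-based spectral coherence between homologous left and right channels, as in the generic sketch below. The channel names, sampling rate, and gamma band edges are assumptions, not the study's recording parameters.

    ```python
    # Minimal sketch: gamma-band coherence between a left and a right frontal
    # EEG channel; the synthetic signals share a 40 Hz component plus noise.
    import numpy as np
    from scipy.signal import coherence

    fs = 250.0                                   # sampling rate (Hz), assumed
    t = np.arange(0, 20, 1 / fs)
    rng = np.random.default_rng(7)
    shared = np.sin(2 * np.pi * 40 * t)          # common gamma-band component
    left_f3 = shared + rng.normal(0, 1.0, t.size)
    right_f4 = shared + rng.normal(0, 1.0, t.size)

    f, cxy = coherence(left_f3, right_f4, fs=fs, nperseg=512)
    gamma = (f >= 30) & (f <= 45)
    print("mean gamma coherence:", cxy[gamma].mean())
    ```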

  15. Investigating the causes of wrap-up effects: evidence from eye movements and E-Z Reader.

    PubMed

    Warren, Tessa; White, Sarah J; Reichle, Erik D

    2009-04-01

    Wrap-up effects in reading have traditionally been thought to reflect increased processing associated with intra- and inter-clause integration (Just, M. A., & Carpenter, P. A. (1980). A theory of reading: From eye fixations to comprehension. Psychological Review, 87(4), 329-354; Rayner, K., Kambe, G., & Duffy, S. A. (2000). The effect of clause wrap-up on eye movements during reading. The Quarterly Journal of Experimental Psychology, 53A(4), 1061-1080; cf. Hirotani, M., Frazier, L., & Rayner, K. (2006). Punctuation and intonation effects on clause and sentence wrap-up: Evidence from eye movements. Journal of Memory and Language, 54, 425-443). We report an eye-tracking experiment with a strong manipulation of integrative complexity at a critical word that was either sentence-final, ended a comma-marked clause, or was not comma-marked. Although both complexity and punctuation had reliable effects, they did not interact in any eye-movement measure. These results as well as simulations using the E-Z Reader model of eye-movement control (Reichle, E. D., Warren, T., & McConnell, K. (2009). Using E-Z Reader to model the effects of higher-level language processing on eye movements during reading. Psychonomic Bulletin & Review, 16(1), 1-20) suggest that traditional accounts of clause wrap-up are incomplete.

  16. Geometry and Gesture-Based Features from Saccadic Eye-Movement as a Biometric in Radiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hammond, Tracy; Tourassi, Georgia; Yoon, Hong-Jun

    In this study, we present a novel application of sketch gesture recognition to eye movements for biometric identification and estimation of task expertise. The study was performed for the task of mammographic screening with simultaneous viewing of four coordinated breast views, as typically done in clinical practice. Eye-tracking data and diagnostic decisions collected for 100 mammographic cases (25 normal, 25 benign, 50 malignant) and 10 readers (three board-certified radiologists and seven radiology residents) formed the corpus for this study. Sketch gesture recognition techniques were employed to extract geometric and gesture-based features from saccadic eye movements. Our results show that saccadic eye movements, characterized using sketch-based features, result in more accurate models for predicting individual identity and level of expertise than more traditional eye-tracking features.
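    As a rough illustration of geometry-based saccade features, the sketch below computes amplitude, direction, and a simple curvature index for a single saccade trajectory. This feature set is an assumption for illustration and is not the sketch-gesture feature set used in the study.

    ```python
    # Simple geometric features for one saccade trajectory (gaze points in
    # screen or degree coordinates); illustrative, not the paper's features.
    import numpy as np

    def saccade_features(xy):
        """xy: (n, 2) array of gaze points for a single saccade."""
        start, end = xy[0], xy[-1]
        chord = np.linalg.norm(end - start)                         # straight-line amplitude
        path = np.sum(np.linalg.norm(np.diff(xy, axis=0), axis=1))  # travelled path length
        direction = np.degrees(np.arctan2(*(end - start)[::-1]))    # angle of the chord
        curvature = path / chord if chord > 0 else np.nan           # >1 indicates a curved saccade
        return {"amplitude": chord, "direction_deg": direction, "curvature": curvature}

    trajectory = np.array([[0, 0], [2, 1], [5, 3], [9, 4], [12, 4]], dtype=float)
    print(saccade_features(trajectory))
    ```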

  17. Magnetic eye tracking in mice

    PubMed Central

    Payne, Hannah L

    2017-01-01

    Eye movements provide insights about a wide range of brain functions, from sensorimotor integration to cognition; hence, the measurement of eye movements is an important tool in neuroscience research. We describe a method, based on magnetic sensing, for measuring eye movements in head-fixed and freely moving mice. A small magnet was surgically implanted on the eye, and changes in the magnet angle as the eye rotated were detected by a magnetic field sensor. Systematic testing demonstrated high resolution measurements of eye position of <0.1°. Magnetic eye tracking offers several advantages over the well-established eye coil and video-oculography methods. Most notably, it provides the first method for reliable, high-resolution measurement of eye movements in freely moving mice, revealing increased eye movements and altered binocular coordination compared to head-fixed mice. Overall, magnetic eye tracking provides a lightweight, inexpensive, easily implemented, and high-resolution method suitable for a wide range of applications. PMID:28872455

  18. Learning to Interact with a Computer by Gaze

    ERIC Educational Resources Information Center

    Aoki, Hirotaka; Hansen, John Paulin; Itoh, Kenji

    2008-01-01

    The aim of this paper is to examine the learning processes that subjects undertake when they start using gaze as computer input. A 7-day experiment with eight Japanese students was carried out to record novice users' eye movement data during typing of 110 sentences. The experiment revealed that inefficient eye movements were dramatically reduced…

  19. Interaction of visual and vestibular stimulation on spatial coordinates for eye movements in rabbits.

    PubMed

    Pettorossi, V E; Errico, P; Ferraresi, A; Minciotti, M; Barmack, N H

    1998-07-01

    Researchers investigated how vestibular and optokinetic signals alter the spatial transformation of the coordinate system that governs the spatial orientation of reflexive eye movements. Also examined were the effects of sensory stimulation when vestibular and optokinetic signals act synergistically and when the two signals are in conflict.

  20. Effects of handedness & saccadic bilateral eye movements on the specificity of past autobiographical memory & episodic future thinking.

    PubMed

    Parker, Andrew; Parkin, Adam; Dagnall, Neil

    2017-06-01

    The present research investigated the effects of personal handedness and saccadic eye movements on the specificity of past autobiographical memory and episodic future thinking. Handedness and saccadic eye movements have been hypothesised to share a common functional basis in that both influence cognition through hemispheric interaction. The technique used to elicit autobiographical memory and episodic future thought involved a cued sentence completion procedure that allowed for the production of memories spanning the highly specific to the very general. Experiment 1 found that mixed-handed (vs. right handed) individuals generated more specific past autobiographical memories, but equivalent numbers of specific future predictions. Experiment 2 demonstrated that following 30s of bilateral (horizontal) saccades, more specific cognitions about both the past and future were generated. These findings extend previous research by showing that more distinct and episodic-like information pertaining to the self can be elicited by either mixed-handedness or eye movements. The results are discussed in relation to hemispheric interaction and top-down influences in the control of memory retrieval. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Eye movements: The past 25 years

    PubMed Central

    Kowler, Eileen

    2011-01-01

    This article reviews the past 25 years of research on eye movements (1986–2011). Emphasis is on three oculomotor behaviors: gaze control, smooth pursuit and saccades, and on their interactions with vision. Focus over the past 25 years has remained on the fundamental and classical questions: What are the mechanisms that keep gaze stable with either stationary or moving targets? How does the motion of the image on the retina affect vision? Where do we look – and why – when performing a complex task? How can the world appear clear and stable despite continual movements of the eyes? The past 25 years of investigation of these questions has seen progress and transformations at all levels due to new approaches (behavioral, neural and theoretical) aimed at studying how eye movements cope with real-world visual and cognitive demands. The work has led to a better understanding of how prediction, learning and attention work with sensory signals to contribute to the effective operation of eye movements in visually rich environments. PMID:21237189

  2. Touch and Gesture-Based Language Learning: Some Possible Avenues for Research and Classroom Practice

    ERIC Educational Resources Information Center

    Reinders, Hayo

    2014-01-01

    Our interaction with digital resources is becoming increasingly based on touch, gestures, and now also eye movement. Many everyday consumer electronics products already include touch-based interfaces, from e-book readers to tablets, and from the latest personal computers to the GPS system in your car. What implications do these new forms of…

  3. Effects of Bilateral Eye Movements on Gist Based False Recognition in the DRM Paradigm

    ERIC Educational Resources Information Center

    Parker, Andrew; Dagnall, Neil

    2007-01-01

    The effects of saccadic bilateral (horizontal) eye movements on gist-based false recognition were investigated. Following exposure to lists of words related to a critical but non-studied word, participants were asked to engage in 30s of bilateral vs. vertical vs. no eye movements. Subsequent testing of recognition memory revealed that those who…

  4. Computational Model-Based Prediction of Human Episodic Memory Performance Based on Eye Movements

    NASA Astrophysics Data System (ADS)

    Sato, Naoyuki; Yamaguchi, Yoko

    Subjects' episodic memory performance is not simply reflected by eye movements. We use a ‘theta phase coding’ model of the hippocampus to predict subjects' memory performance from their eye movements. Results demonstrate the ability of the model to predict subjects' memory performance. These studies provide a novel approach to computational modeling in the human-machine interface.

  5. Predictors of verb-mediated anticipatory eye movements in the visual world.

    PubMed

    Hintz, Florian; Meyer, Antje S; Huettig, Falk

    2017-09-01

    Many studies have demonstrated that listeners use information extracted from verbs to guide anticipatory eye movements to objects in the visual context that satisfy the selection restrictions of the verb. An important question is what underlies such verb-mediated anticipatory eye gaze. Based on empirical and theoretical suggestions, we investigated the influence of 5 potential predictors of this behavior: functional associations and general associations between verb and target object, as well as the listeners' production fluency, receptive vocabulary knowledge, and nonverbal intelligence. In 3 eye-tracking experiments, participants looked at sets of 4 objects and listened to sentences where the final word was predictable or not predictable (e.g., "The man peels/draws an apple"). On predictable trials, only the target object, but not the distractors, was functionally and associatively related to the verb. In Experiments 1 and 2, objects were presented before the verb was heard. In Experiment 3, participants were given a short preview of the display after the verb was heard. Functional associations and receptive vocabulary were found to be important predictors of verb-mediated anticipatory eye gaze, independent of the amount of contextual visual input. General word associations did not predict anticipatory eye movements, and nonverbal intelligence was only a very weak predictor. Participants' production fluency correlated positively with the likelihood of anticipatory eye movements when participants were given the long but not the short visual display preview. These findings fit best with a pluralistic approach to predictive language processing in which multiple mechanisms, mediating factors, and situational context dynamically interact. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. A laser-based eye-tracking system.

    PubMed

    Irie, Kenji; Wilson, Bruce A; Jones, Richard D; Bones, Philip J; Anderson, Tim J

    2002-11-01

    This paper reports on the development of a new eye-tracking system for noninvasive recording of eye movements. The eye tracker uses a flying-spot laser to selectively image landmarks on the eye and, subsequently, measure horizontal, vertical, and torsional eye movements. Considerable work was required to overcome the adverse effects of specular reflection of the flying-spot from the surface of the eye onto the sensing elements of the eye tracker. These effects have been largely overcome, and the eye-tracker has been used to document eye movement abnormalities, such as abnormal torsional pulsion of saccades, in the clinical setting.

  7. Context effects on smooth pursuit and manual interception of a disappearing target.

    PubMed

    Kreyenmeier, Philipp; Fooken, Jolande; Spering, Miriam

    2017-07-01

    In our natural environment, we interact with moving objects that are surrounded by richly textured, dynamic visual contexts. Yet most laboratory studies on vision and movement show visual objects in front of uniform gray backgrounds. Context effects on eye movements have been widely studied, but it is less well known how visual contexts affect hand movements. Here we ask whether eye and hand movements integrate motion signals from target and context similarly or differently, and whether context effects on eye and hand change over time. We developed a track-intercept task requiring participants to track the initial launch of a moving object ("ball") with smooth pursuit eye movements. The ball disappeared after a brief presentation, and participants had to intercept it in a designated "hit zone." In two experiments ( n = 18 human observers each), the ball was shown in front of a uniform or a textured background that either was stationary or moved along with the target. Eye and hand movement latencies and speeds were similarly affected by the visual context, but eye and hand interception (eye position at time of interception, and hand interception timing error) did not differ significantly between context conditions. Eye and hand interception timing errors were strongly correlated on a trial-by-trial basis across all context conditions, highlighting the close relation between these responses in manual interception tasks. Our results indicate that visual contexts similarly affect eye and hand movements but that these effects may be short-lasting, affecting movement trajectories more than movement end points. NEW & NOTEWORTHY In a novel track-intercept paradigm, human observers tracked a briefly shown object moving across a textured, dynamic context and intercepted it with their finger after it had disappeared. Context motion significantly affected eye and hand movement latency and speed, but not interception accuracy; eye and hand position at interception were correlated on a trial-by-trial basis. Visual context effects may be short-lasting, affecting movement trajectories more than movement end points. Copyright © 2017 the American Physiological Society.

  8. Real-time inference of word relevance from electroencephalogram and eye gaze

    NASA Astrophysics Data System (ADS)

    Wenzel, M. A.; Bogojeski, M.; Blankertz, B.

    2017-10-01

    Objective. Brain-computer interfaces can potentially map the subjective relevance of the visual surroundings, based on neural activity and eye movements, in order to infer the interest of a person in real-time. Approach. Readers looked for words belonging to one out of five semantic categories, while a stream of words passed at different locations on the screen. It was estimated in real-time which words and thus which semantic category interested each reader based on the electroencephalogram (EEG) and the eye gaze. Main results. Words that were subjectively relevant could be decoded online from the signals. The estimation resulted in an average rank of 1.62 for the category of interest among the five categories after a hundred words had been read. Significance. It was demonstrated that the interest of a reader can be inferred online from EEG and eye tracking signals, which can potentially be used in novel types of adaptive software, which enrich the interaction by adding implicit information about the interest of the user to the explicit interaction. The study is characterised by the following novelties. Interpretation with respect to the word meaning was necessary in contrast to the usual practice in brain-computer interfacing where stimulus recognition is sufficient. The typical counting task was avoided because it would not be sensible for implicit relevance detection. Several words were displayed at the same time, in contrast to the typical sequences of single stimuli. Neural activity was related with eye tracking to the words, which were scanned without restrictions on the eye movements.

  9. Eye movements reflect and shape strategies in fraction comparison.

    PubMed

    Ischebeck, Anja; Weilharter, Marina; Körner, Christof

    2016-01-01

    The comparison of fractions is a difficult task that can often be facilitated by separately comparing components (numerators and denominators) of the fractions--that is, by applying so-called component-based strategies. The usefulness of such strategies depends on the type of fraction pair to be compared. We investigated the temporal organization and the flexibility of strategy deployment in fraction comparison by evaluating sequences of eye movements in 20 young adults. We found that component-based strategies could account for the response times and the overall number of fixations observed for the different fraction pairs. The analysis of eye movement sequences showed that the initial eye movements in a trial were characterized by stereotypical scanning patterns indicative of an exploratory phase that served to establish the kind of fraction pair presented. Eye movements that followed this phase adapted to the particular type of fraction pair and indicated the deployment of specific comparison strategies. These results demonstrate that participants employ eye movements systematically to support strategy use in fraction comparison. Participants showed a remarkable flexibility to adapt to the most efficient strategy on a trial-by-trial basis. Our results confirm the value of eye movement measurements in the exploration of strategic adaptation in complex tasks.

  10. Interaction between Visual- and Goal-Related Neuronal Signals on the Trajectories of Saccadic Eye Movements

    ERIC Educational Resources Information Center

    White, Brian J.; Theeuwes, Jan; Munoz, Douglas P.

    2012-01-01

    During natural viewing, the trajectories of saccadic eye movements often deviate dramatically from a straight-line path between objects. In human studies, saccades have been shown to deviate toward or away from salient visual distractors depending on visual- and goal-related parameters, but the neurophysiological basis for this is not well…

  11. Effects of Macrophage Depletion on Sleep in Mice

    PubMed Central

    Ames, Conner; Boland, Erin; Szentirmai, Éva

    2016-01-01

    The reciprocal interaction between the immune system and sleep regulation has been widely acknowledged but the cellular mechanisms that underpin this interaction are not completely understood. In the present study, we investigated the role of macrophages in sleep loss- and cold exposure-induced sleep and body temperature responses. Macrophage apoptosis was induced in mice by systemic injection of clodronate-containing liposomes (CCL). We report that CCL treatment induced an immediate and transient increase in non-rapid-eye movement sleep (NREMS) and fever accompanied by decrease in rapid-eye movement sleep, motor activity and NREMS delta power. Chronically macrophage-depleted mice had attenuated NREMS rebound after sleep deprivation compared to normal mice. Cold-induced increase in wakefulness and decrease in NREMS, rapid-eye movement sleep and body temperature were significantly enhanced in macrophage-depleted mice indicating increased cold sensitivity. These findings provide further evidence for the reciprocal interaction among the immune system, sleep and metabolism, and identify macrophages as one of the key cellular elements in this interplay. PMID:27442442

  12. Dissociable Stages of Problem Solving (I): Temporal Characteristics Revealed by Eye-Movement Analyses

    ERIC Educational Resources Information Center

    Nitschke, Kai; Ruh, Nina; Kappler, Sonja; Stahl, Christoph; Kaller, Christoph P.

    2012-01-01

    Understanding the functional neuroanatomy of planning and problem solving may substantially benefit from better insight into the chronology of the cognitive processes involved. Based on the assumption that regularities in cognitive processing are reflected in overtly observable eye-movement patterns, here we recorded eye movements while…

  13. A unified dynamic neural field model of goal directed eye movements

    NASA Astrophysics Data System (ADS)

    Quinton, J. C.; Goffart, L.

    2018-01-01

    Primates rely heavily on their visual system, which exploits signals of graded precision based on the eccentricity of the target in the visual field. Interactions with the environment involve actively selecting and focusing on visual targets or regions of interest, instead of contemplating an omnidirectional visual flow. Eye movements specifically allow foveating targets and tracking their motion. Once a target is brought within the central visual field, eye movements are usually classified into catch-up saccades (jumping from one orientation or fixation to another) and smooth pursuit (continuously tracking a target with low velocity). Building on existing dynamic neural field equations, we introduce a novel model that incorporates internal projections to better estimate the current target location (associated with a peak of activity). This estimate is then used to trigger an eye movement, leading to qualitatively different behaviours depending on the dynamics of the whole oculomotor system: (1) fixational eye movements due to small variations in the weights of projections when the target is stationary, (2) interceptive and catch-up saccades when peaks build and relax on the neural field, and (3) smooth pursuit when the peak stabilises near the centre of the field, the system reaching a fixed-point attractor. Learning is nevertheless required for tracking a rapidly moving target, and the proposed model thus replicates recent results in the monkey, in which repeated exercise permits the maintenance of the target within the central visual field at its current (here-and-now) location, despite the delays involved in transmitting retinal signals to the oculomotor neurons.
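    The model builds on dynamic neural field equations; a generic one-dimensional Amari-type field is sketched below, with a stationary target input that makes the field relax to a single activity peak (the estimated target location). The kernel shape, gains, and input are illustrative choices, not the authors' model or parameters.

    ```python
    # Generic 1-D dynamic neural field: local excitation, broader inhibition,
    # and a Gaussian target input; the field relaxes to a peak at the target.
    import numpy as np

    n = 181                                     # field over -90..90 deg of gaze
    x = np.linspace(-90, 90, n)
    u = np.full(n, -1.0)                        # membrane potential, resting below 0
    h, tau, dt = -1.0, 10.0, 1.0

    def gauss(d, sigma):
        return np.exp(-d ** 2 / (2 * sigma ** 2))

    # Lateral interaction kernel: difference of Gaussians.
    dist = x[:, None] - x[None, :]
    w = 1.0 * gauss(dist, 5.0) - 0.5 * gauss(dist, 20.0)

    target_input = 3.0 * gauss(x - 15.0, 4.0)   # visual target at +15 deg

    for _ in range(200):                        # relax the field toward equilibrium
        f_u = 1.0 / (1.0 + np.exp(-u))          # sigmoidal firing rate
        u += (-u + h + target_input + w @ f_u / n) * (dt / tau)

    print("decoded target location (deg):", x[np.argmax(u)])
    ```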

  14. Interactions of cervico-ocular and vestibulo-ocular fast-phase signals in the control of eye position in rabbits.

    PubMed Central

    Barmack, N H; Errico, P; Ferraresi, A; Pettorossi, V E

    1989-01-01

    1. Eye movements in unanaesthetized rabbits were studied during horizontal neck-proprioceptive stimulation (movement of the body with respect to the fixed head), when this stimulation was given alone and when it was given simultaneously with vestibular stimulation (rotation of the head-body). The effect of neck-proprioceptive stimulation on modifying the anticompensatory fast-phase eye movements (AFPs) evoked by vestibular stimulation was studied with a 'conditioning-test' protocol; the 'conditioning' stimulus was a neck-proprioceptive signal evoked by a step-like change in body position with respect to the head and the 'test' stimulus was a vestibular signal evoked by a step rotation of the head-body. 2. The influence of eye position and direction of slow eye movements on the occurrence of compensatory fast-phase eye movements (CFPs) evoked by neck-proprioceptive stimulation was also examined. 3. The anticompensatory fast phase (AFP) evoked by vestibular stimulation was attenuated by a preceding neck-proprioceptive stimulus which when delivered alone evoked compensatory slow-phase eye movements (CSP) in the same direction as the CSP evoked by vestibular stimulation. Conversely, the vestibularly evoked AFP was potentiated by a neck-proprioceptive stimulus which evoked CSPs opposite to that of vestibularly evoked CSPs. 4. Eccentric initial eye positions increased the probability of occurrence of midline-directed compensatory fast-phase eye movements (CFPs) evoked by appropriate neck-proprioceptive stimulation. 5. The gain of the horizontal cervico-ocular reflex (GHCOR) was measured from the combined changes in eye position resulting from AFPs and CSPs. GHCOR was potentiated during simultaneous vestibular stimulation. This enhancement of GHCOR occurred at neck-proprioceptive stimulus frequencies which, in the absence of conjoint vestibular stimulation, do not evoke CSPs. PMID:2795479

  15. Interactions between gaze-evoked blinks and gaze shifts in monkeys.

    PubMed

    Gandhi, Neeraj J

    2012-02-01

    Rapid eyelid closure, or a blink, often accompanies head-restrained and head-unrestrained gaze shifts. This study examines the interactions between such gaze-evoked blinks and gaze shifts in monkeys. Blink probability increases with gaze amplitude and at a faster rate for head-unrestrained movements. Across animals, blink likelihood is inversely correlated with the average gaze velocity of large-amplitude control movements. Gaze-evoked blinks induce robust perturbations in eye velocity. Peak and average velocities are reduced, duration is increased, but accuracy is preserved. The temporal features of the perturbation depend on factors such as the time of blink relative to gaze onset, inherent velocity kinematics of control movements, and perhaps initial eye-in-head position. Although variable across animals, the initial effect is a reduction in eye velocity, followed by a reacceleration that yields two or more peaks in its waveform. Interestingly, head velocity is not attenuated; instead, it peaks slightly later and with a larger magnitude. Gaze latency is slightly reduced on trials with gaze-evoked blinks, although the effect was more variable during head-unrestrained movements; no reduction in head latency is observed. Preliminary data also demonstrate a similar perturbation of gaze-evoked blinks during vertical saccades. The results are compared with previously reported effects of reflexive blinks (evoked by air-puff delivered to one eye or supraorbital nerve stimulation) and discussed in terms of effects of blinks on saccadic suppression, neural correlates of the altered eye velocity signals, and implications on the hypothesis that the attenuation in eye velocity is produced by a head movement command.

  16. The brain stem saccadic burst generator encodes gaze in three-dimensional space.

    PubMed

    Van Horn, Marion R; Sylvestre, Pierre A; Cullen, Kathleen E

    2008-05-01

    When we look between objects located at different depths, the horizontal movement of each eye is different from that of the other, yet temporally synchronized. Traditionally, a vergence-specific neuronal subsystem, independent of other oculomotor subsystems, has been thought to generate all eye movements in depth. However, recent studies have challenged this view by unmasking interactions between vergence and saccadic eye movements during disconjugate saccades. Here, we combined experimental and modeling approaches to address whether the premotor command to generate disconjugate saccades originates exclusively in "vergence centers." We found that the brain stem burst generator, which is commonly assumed to drive only the conjugate component of eye movements, carries substantial vergence-related information during disconjugate saccades. Notably, facilitated vergence velocities during disconjugate saccades were synchronized with the burst onset of excitatory and inhibitory brain stem saccadic burst neurons (SBNs). Furthermore, the time-varying discharge properties of the majority of SBNs (>70%) preferentially encoded the dynamics of an individual eye during disconjugate saccades. When these experimental results were implemented in a computer-based simulation, to further evaluate the contribution of the saccadic burst generator to generating disconjugate saccades, we found that it carries all the vergence drive necessary to shape the activity of the abducens motoneurons to which it projects. Taken together, our results provide evidence that the premotor commands from the brain stem saccadic circuitry to the target motoneurons are sufficient to ensure the accurate control of gaze shifts in three dimensions.

  17. Reading "sun" and looking up: the influence of language on saccadic eye movements in the vertical dimension.

    PubMed

    Dudschig, Carolin; Souman, Jan; Lachmair, Martin; de la Vega, Irmgard; Kaup, Barbara

    2013-01-01

    Traditionally, language processing has been attributed to a separate system in the brain, which supposedly works in an abstract propositional manner. However, there is increasing evidence suggesting that language processing is strongly interrelated with sensorimotor processing. Evidence for such an interrelation is typically drawn from interactions between language and perception or action. In the current study, the effect of words that refer to entities in the world with a typical location (e.g., sun, worm) on the planning of saccadic eye movements was investigated. Participants had to perform a lexical decision task on visually presented words and non-words. They responded by moving their eyes to a target in an upper (lower) screen position for a word (non-word) or vice versa. Eye movements were faster to locations compatible with the word's referent in the real world. These results provide evidence for the importance of linguistic stimuli in directing eye movements, even if the words do not directly transfer directional information.

  18. Eye movements reflect and shape strategies in fraction comparison

    PubMed Central

    Ischebeck, Anja; Weilharter, Marina; Körner, Christof

    2016-01-01

    The comparison of fractions is a difficult task that can often be facilitated by separately comparing components (numerators and denominators) of the fractions—that is, by applying so-called component-based strategies. The usefulness of such strategies depends on the type of fraction pair to be compared. We investigated the temporal organization and the flexibility of strategy deployment in fraction comparison by evaluating sequences of eye movements in 20 young adults. We found that component-based strategies could account for the response times and the overall number of fixations observed for the different fraction pairs. The analysis of eye movement sequences showed that the initial eye movements in a trial were characterized by stereotypical scanning patterns indicative of an exploratory phase that served to establish the kind of fraction pair presented. Eye movements that followed this phase adapted to the particular type of fraction pair and indicated the deployment of specific comparison strategies. These results demonstrate that participants employ eye movements systematically to support strategy use in fraction comparison. Participants showed a remarkable flexibility to adapt to the most efficient strategy on a trial-by-trial basis. Our results confirm the value of eye movement measurements in the exploration of strategic adaptation in complex tasks. PMID:26039819

  19. Cognitive processes involved in smooth pursuit eye movements: behavioral evidence, neural substrate and clinical correlation

    PubMed Central

    Fukushima, Kikuro; Fukushima, Junko; Warabi, Tateo; Barnes, Graham R.

    2013-01-01

    Smooth-pursuit eye movements allow primates to track moving objects. Efficient pursuit requires appropriate target selection and predictive compensation for inherent processing delays. Prediction depends on expectation of future object motion, storage of motion information and use of extra-retinal mechanisms in addition to visual feedback. We present behavioral evidence of how cognitive processes are involved in predictive pursuit in normal humans and then describe neuronal responses in monkeys and behavioral responses in patients using a new technique to test these cognitive controls. The new technique examines the neural substrate of working memory and movement preparation for predictive pursuit by using a memory-based task in macaque monkeys trained to pursue (go) or not pursue (no-go) according to a go/no-go cue, in a direction based on memory of a previously presented visual motion display. Single-unit task-related neuronal activity was examined in medial superior temporal cortex (MST), supplementary eye fields (SEF), caudal frontal eye fields (FEF), cerebellar dorsal vermis lobules VI–VII, caudal fastigial nuclei (cFN), and floccular region. Neuronal activity reflecting working memory of visual motion direction and go/no-go selection was found predominantly in SEF, cerebellar dorsal vermis and cFN, whereas movement preparation related signals were found predominantly in caudal FEF and the same cerebellar areas. Chemical inactivation produced effects consistent with differences in signals represented in each area. When applied to patients with Parkinson's disease (PD), the task revealed deficits in movement preparation but not working memory. In contrast, patients with frontal cortical or cerebellar dysfunction had high error rates, suggesting impaired working memory. We show how neuronal activity may be explained by models of retinal and extra-retinal interaction in target selection and predictive control and thus aid understanding of underlying pathophysiology. PMID:23515488

  20. Evidence of common and separate eye and hand accumulators underlying flexible eye-hand coordination

    PubMed Central

    Jana, Sumitash; Gopal, Atul

    2016-01-01

    Eye and hand movements are initiated by anatomically separate regions in the brain, and yet these movements can be flexibly coupled and decoupled, depending on the need. The computational architecture that enables this flexible coupling of independent effectors is not understood. Here, we studied the computational architecture that enables flexible eye-hand coordination using a drift diffusion framework, which predicts that the variability of the reaction time (RT) distribution scales with its mean. We show that a common stochastic accumulator to threshold, followed by a noisy effector-dependent delay, explains eye-hand RT distributions and their correlation in a visual search task that required decision-making, while an interactive eye and hand accumulator model did not. In contrast, in an eye-hand dual task, an interactive model better predicted the observed correlations and RT distributions than a common accumulator model. Notably, these two models could only be distinguished on the basis of the variability and not the means of the predicted RT distributions. Additionally, signatures of separate initiation signals were also observed in a small fraction of trials in the visual search task, implying that these distinct computational architectures were not a manifestation of the task design per se. Taken together, our results suggest two unique computational architectures for eye-hand coordination, with task context biasing the brain toward instantiating one of the two architectures. NEW & NOTEWORTHY Previous studies on eye-hand coordination have considered mainly the means of eye and hand reaction time (RT) distributions. Here, we leverage the approximately linear relationship between the mean and standard deviation of RT distributions, as predicted by the drift-diffusion model, to propose the existence of two distinct computational architectures underlying coordinated eye-hand movements. These architectures, for the first time, provide a computational basis for the flexible coupling between eye and hand movements. PMID:27784809
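    The mean-variance scaling that motivates this analysis can be illustrated with a simple accumulator-to-threshold simulation: if the drift rate is scaled across conditions while its relative trial-to-trial variability stays fixed, the RT distributions become scaled copies of one another, so the standard deviation grows linearly with the mean. The ballistic-rise simplification and all parameters below are assumptions for illustration, not the authors' model.

    ```python
    # Accumulator-to-threshold RT sketch with trial-to-trial drift variability;
    # scaling the mean drift rescales the whole RT distribution, so sd/mean stays
    # constant across conditions (illustrative parameters only).
    import numpy as np

    rng = np.random.default_rng(11)
    threshold = 1.0

    def simulate_rts(mean_drift, cv=0.2, n_trials=5000):
        """Ballistic rise to threshold; drift varies from trial to trial."""
        drift = rng.normal(mean_drift, cv * mean_drift, n_trials)
        drift = drift[drift > 0]            # discard non-rising trials
        return threshold / drift            # RT on each trial (seconds)

    for mean_drift in (5.0, 3.0, 2.0):      # e.g. easy -> hard conditions
        rt = simulate_rts(mean_drift)
        print(f"mean={rt.mean()*1000:5.0f} ms   sd={rt.std()*1000:5.0f} ms   "
              f"sd/mean={rt.std()/rt.mean():.2f}")
    ```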

  1. Idiosyncratic characteristics of saccadic eye movements when viewing different visual environments.

    PubMed

    Andrews, T J; Coppola, D M

    1999-08-01

    Eye position was recorded in different viewing conditions to assess whether the temporal and spatial characteristics of saccadic eye movements in different individuals are idiosyncratic. Our aim was to determine the degree to which oculomotor control is based on endogenous factors. A total of 15 naive subjects viewed five visual environments: (1) The absence of visual stimulation (i.e. a dark room); (2) a repetitive visual environment (i.e. simple textured patterns); (3) a complex natural scene; (4) a visual search task; and (5) reading text. Although differences in visual environment had significant effects on eye movements, idiosyncrasies were also apparent. For example, the mean fixation duration and size of an individual's saccadic eye movements when passively viewing a complex natural scene covaried significantly with those same parameters in the absence of visual stimulation and in a repetitive visual environment. In contrast, an individual's spatio-temporal characteristics of eye movements during active tasks such as reading text or visual search covaried together, but did not correlate with the pattern of eye movements detected when viewing a natural scene, simple patterns or in the dark. These idiosyncratic patterns of eye movements in normal viewing reveal an endogenous influence on oculomotor control. The independent covariance of eye movements during different visual tasks shows that saccadic eye movements during active tasks like reading or visual search differ from those engaged during the passive inspection of visual scenes.

  2. Biometric recognition via texture features of eye movement trajectories in a visual searching task.

    PubMed

    Li, Chunyong; Xue, Jiguo; Quan, Cheng; Yue, Jingwei; Zhang, Chenggang

    2018-01-01

    Biometric recognition technology based on eye-movement dynamics has been in development for more than ten years. Different visual tasks, feature extraction methods and feature recognition methods have been proposed to improve the performance of eye movement biometric systems. However, the correct identification and verification rates, especially in long-term experiments, as well as the effects of visual tasks and of eye trackers' temporal and spatial resolution, are still the foremost considerations in eye movement biometrics. With a focus on these issues, we propose a new visual searching task for eye movement data collection and a new class of eye movement features for biometric recognition. To demonstrate the improvement gained by using this visual searching task in eye movement biometrics, three other eye movement feature extraction methods were also tested on our eye movement datasets. Compared with the original results, all three methods yielded better results, as expected. In addition, the biometric performance of these four feature extraction methods was compared using the equal error rate (EER) and the Rank-1 identification rate (Rank-1 IR), and the texture features introduced in this paper were ultimately shown to offer some advantages with regard to long-term stability and robustness over time and spatial precision. Finally, the results of combining these methods with a score-level fusion method indicated that multi-biometric approaches perform better in most cases.

  3. Biometric recognition via texture features of eye movement trajectories in a visual searching task

    PubMed Central

    Li, Chunyong; Xue, Jiguo; Quan, Cheng; Yue, Jingwei

    2018-01-01

    Biometric recognition technology based on eye-movement dynamics has been in development for more than ten years. Different visual tasks, feature extraction methods and feature recognition methods have been proposed to improve the performance of eye movement biometric systems. However, the correct identification and verification rates, especially in long-term experiments, as well as the effects of visual tasks and of eye trackers' temporal and spatial resolution, are still the foremost considerations in eye movement biometrics. With a focus on these issues, we propose a new visual searching task for eye movement data collection and a new class of eye movement features for biometric recognition. To demonstrate the improvement gained by using this visual searching task in eye movement biometrics, three other eye movement feature extraction methods were also tested on our eye movement datasets. Compared with the original results, all three methods yielded better results, as expected. In addition, the biometric performance of these four feature extraction methods was compared using the equal error rate (EER) and the Rank-1 identification rate (Rank-1 IR), and the texture features introduced in this paper were ultimately shown to offer some advantages with regard to long-term stability and robustness over time and spatial precision. Finally, the results of combining these methods with a score-level fusion method indicated that multi-biometric approaches perform better in most cases. PMID:29617383
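    For reference, the equal error rate used above is the operating point at which the false acceptance and false rejection rates coincide. The sketch below computes an EER from simulated genuine and impostor score distributions; the scores are placeholders, not outputs of the texture-feature method described in the paper.

    ```python
    # EER from genuine (same-person) and impostor (different-person) similarity
    # scores; the score distributions are simulated for illustration only.
    import numpy as np

    rng = np.random.default_rng(5)
    genuine = rng.normal(0.75, 0.10, 500)
    impostor = rng.normal(0.45, 0.12, 5000)

    thresholds = np.linspace(0.0, 1.0, 1001)
    far = np.array([(impostor >= t).mean() for t in thresholds])   # false acceptance rate
    frr = np.array([(genuine < t).mean() for t in thresholds])     # false rejection rate

    i = np.argmin(np.abs(far - frr))
    print(f"EER ~ {(far[i] + frr[i]) / 2:.3f} at threshold {thresholds[i]:.2f}")
    ```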

  4. New methods for the assessment of accommodative convergence.

    PubMed

    Asakawa, Ken; Ishikawa, Hitoshi; Shoji, Nobuyuki

    2009-01-01

    The authors introduced a new objective method for measuring horizontal eye movements based on the first Purkinje image, using infrared charge-coupled device (CCD) cameras, and compared the results with stimulus accommodative convergence to accommodation (AC/A) ratios determined by a standard gradient method. The study included 20 patients, 5 to 9 years old, who had intermittent exotropia (10 eyes) and accommodative esotropia (10 eyes). Measurement of horizontal eye movements in millimeters (mm), based on the first Purkinje image, was obtained with a TriIRIS C9000 instrument (Hamamatsu Photonics K.K., Hamamatsu, Japan). The stimulus AC/A ratio was determined with the far gradient method. The average values of horizontal eye movement (mm) and eye deviation (Delta) were obtained (a) before and (b) after an accommodative stimulus of 3.00 diopters (D), and both the horizontal eye movement per diopter (mm/D) and the stimulus AC/A ratio (Delta/D) were calculated with the same formula: (b - a)/3. The average values of the horizontal eye movements and the stimulus AC/A ratio were 0.5 mm/D and 3.8 Delta/D, respectively. Correlation analysis showed a strong positive correlation between these two parameters (r = 0.92). Moreover, horizontal eye movements are directly proportional to the AC/A ratio measured with the gradient method. The methods used in this study allow objective recordings of accommodative convergence to be obtained in many clinical situations. Copyright 2009, SLACK Incorporated.
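
    The per-diopter quantities above follow directly from the stated formula (b - a)/3. A minimal sketch of that arithmetic, with hypothetical variable names:

```python
def per_diopter_change(before, after, stimulus_diopters=3.0):
    """Change per diopter of accommodative stimulus: (b - a) / stimulus.

    Used both for the Purkinje-image eye movement (mm -> mm/D) and for the
    eye deviation (prism diopters -> Delta/D, i.e. the stimulus AC/A ratio).
    """
    return (after - before) / stimulus_diopters

# Example: deviation changes from 12 to 23 prism diopters after a 3.00 D stimulus.
print(per_diopter_change(12, 23))  # ~3.7 Delta/D
```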

  5. Brief Report: Patterns of Eye Movements in Face to Face Conversation are Associated with Autistic Traits: Evidence from a Student Sample.

    PubMed

    Vabalas, Andrius; Freeth, Megan

    2016-01-01

    The current study investigated whether the amount of autistic traits shown by an individual is associated with viewing behaviour during a face-to-face interaction. The eye movements of 36 neurotypical university students were recorded using a mobile eye-tracking device. High amounts of autistic traits were neither associated with reduced looking to the social partner overall, nor with reduced looking to the face. However, individuals who were high in autistic traits exhibited reduced visual exploration during the face-to-face interaction overall, as demonstrated by shorter and less frequent saccades. Visual exploration was not related to social anxiety. This study suggests that there are systematic individual differences in visual exploration during social interactions and these are related to amount of autistic traits.

  6. Anatomy of emotion: a 3D study of facial mimicry.

    PubMed

    Ferrario, V F; Sforza, C

    2007-01-01

    Alterations in facial motion severely impair the quality of life and social interaction of patients, and an objective grading of facial function is necessary. A method for the non-invasive detection of 3D facial movements was developed. Sequences of six standardized facial movements (maximum smile; free smile; surprise with closed mouth; surprise with open mouth; right side eye closure; left side eye closure) were recorded in 20 healthy young adults (10 men, 10 women) using an optoelectronic motion analyzer. For each subject, 21 cutaneous landmarks were identified by 2-mm reflective markers, and their 3D movements during each facial animation were computed. Three repetitions of each expression were recorded (within-session error), and four separate sessions were used (between-session error). To assess the within-session error, the technical error of the measurement (random error, TEM) was computed separately for each sex, movement and landmark. To assess the between-session repeatability, the standard deviation among the mean displacements of each landmark (four independent sessions) was computed for each movement. TEM for the single landmarks ranged between 0.3 and 9.42 mm (intrasession error). The sex- and movement-related differences were statistically significant (two-way analysis of variance, p=0.003 for sex comparison, p=0.009 for the six movements, p<0.001 for the sex x movement interaction). Among four different (independent) sessions, the left eye closure had the worst repeatability, the right eye closure had the best one; the differences among various movements were statistically significant (one-way analysis of variance, p=0.041). In conclusion, the current protocol demonstrated a sufficient repeatability for a future clinical application. Great care should be taken to assure a consistent marker positioning in all the subjects.
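
    The abstract reports TEM values but not the formula used. The sketch below uses one standard definition of the technical error of measurement, the pooled within-subject standard deviation over repetitions, which reduces to Dahlberg's sqrt(sum(d^2)/2n) for two repeats; the array layout and toy numbers are illustrative assumptions.

```python
import numpy as np

def technical_error_of_measurement(x):
    """TEM for repeated measurements.

    `x` is an (n_subjects, n_repetitions) array of the same quantity, e.g. one
    landmark's 3D displacement for one facial movement measured three times.
    """
    x = np.asarray(x, dtype=float)
    n, k = x.shape
    within = x - x.mean(axis=1, keepdims=True)
    return np.sqrt((within ** 2).sum() / (n * (k - 1)))

# Three repetitions of a landmark displacement (mm) for 4 subjects (toy numbers).
x = np.array([[10.1, 10.4, 9.9],
              [12.0, 11.6, 11.9],
              [ 8.7,  9.0,  8.8],
              [11.2, 11.5, 11.1]])
print(f"TEM = {technical_error_of_measurement(x):.2f} mm")
```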

  7. Model simulation studies to clarify the effect on saccadic eye movements of initial condition velocities set by the Vestibular Ocular Reflex (VOR)

    NASA Technical Reports Server (NTRS)

    Nam, M. H.; Winters, J. M.; Stark, L.

    1981-01-01

    Voluntary active head rotations produced vestibulo-ocular reflex eye movements (VOR) with the subject viewing a fixation target. When this target jumped, the size of the refixation saccades was a function of the ongoing initial velocity of the eye. Saccades made against the VOR were larger in magnitude. Simulation of a reciprocally innervated eye movement model provided results comparable to the experimental data. Most of the experimental effect appeared to be due to linear summation for saccades of 5 and 10 degree magnitude. For small saccades of 2.5 degrees, peripheral nonlinear interaction of state variables in the neuromuscular plant also played a role, as shown by comparable behavior in the simulated model with known controller signals.

  8. CUE: counterfeit-resistant usable eye movement-based authentication via oculomotor plant characteristics and complex eye movement patterns

    NASA Astrophysics Data System (ADS)

    Komogortsev, Oleg V.; Karpov, Alexey; Holland, Corey D.

    2012-06-01

    The widespread use of computers throughout modern society introduces the necessity for usable and counterfeit-resistant authentication methods to ensure secure access to personal resources such as bank accounts, e-mail, and social media. Current authentication methods require tedious memorization of lengthy pass phrases, are often prone to shoulder-surfing, and may be easily replicated (either by counterfeiting parts of the human body or by guessing an authentication token based on readily available information). This paper describes preliminary work toward a counterfeit-resistant usable eye movement-based (CUE) authentication method. CUE does not require any passwords (improving the memorability aspect of the authentication system), and aims to provide high resistance to spoofing and shoulder-surfing by employing the combined biometric capabilities of two behavioral biometric traits: 1) oculomotor plant characteristics (OPC) which represent the internal, non-visible, anatomical structure of the eye; 2) complex eye movement patterns (CEM) which represent the strategies employed by the brain to guide visual attention. Both OPC and CEM are extracted from the eye movement signal provided by an eye tracking system. Preliminary results indicate that the fusion of OPC and CEM traits is capable of providing a 30% reduction in authentication error when compared to the authentication accuracy of individual traits.
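
    The abstract reports that fusing OPC and CEM scores reduces authentication error but does not state the fusion rule. A weighted-sum fusion after min-max normalization is a common baseline and is sketched below purely as an illustration; the weights and normalization are assumptions, not the paper's method.

```python
import numpy as np

def min_max_normalize(scores):
    """Map raw comparison scores onto [0, 1] so the two traits share a scale."""
    scores = np.asarray(scores, dtype=float)
    return (scores - scores.min()) / (scores.max() - scores.min() + 1e-12)

def fuse_scores(opc_scores, cem_scores, w_opc=0.5):
    """Weighted-sum score-level fusion of OPC and CEM comparison scores."""
    return w_opc * min_max_normalize(opc_scores) + (1.0 - w_opc) * min_max_normalize(cem_scores)
```

    Fused scores of this kind can then be evaluated with the same EER routine sketched earlier in this listing.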

  9. Effect of viewing distance on the generation of vertical eye movements during locomotion

    NASA Technical Reports Server (NTRS)

    Moore, S. T.; Hirasaki, E.; Cohen, B.; Raphan, T.

    1999-01-01

    Vertical head and eye coordination was studied as a function of viewing distance during locomotion. Vertical head translation and pitch movements were measured using a video motion analysis system (Optotrak 3020). Vertical eye movements were recorded using a video-based pupil tracker (Iscan). Subjects (five) walked on a linear treadmill at a speed of 1.67 m/s (6 km/h) while viewing a target screen placed at distances ranging from 0.25 to 2.0 m at 0.25-m intervals. The predominant frequency of vertical head movement was 2 Hz. In accordance with previous studies, there was a small head pitch rotation, which was compensatory for vertical head translation. The magnitude of the vertical head movements and the phase relationship between head translation and pitch were little affected by viewing distance, and tended to orient the naso-occipital axis of the head at a point approximately 1 m in front of the subject (the head fixation distance or HFD). In contrast, eye velocity was significantly affected by viewing distance. When viewing a far (2-m) target, vertical eye velocity was 180 degrees out of phase with head pitch velocity, with a gain of 0.8. This indicated that the angular vestibulo-ocular reflex (aVOR) was generating the eye movement response. The major finding was that, at a close viewing distance (0.25 m), eye velocity was in phase with head pitch and compensatory for vertical head translation, suggesting that activation of the linear vestibulo-ocular reflex (lVOR) was contributing to the eye movement response. There was also a threefold increase in the magnitude of eye velocity when viewing near targets, which was consistent with the goal of maintaining gaze on target. The required vertical lVOR sensitivity to cancel an unmodified aVOR response and generate the observed eye velocity magnitude for near targets was almost 3 times that previously measured. Supplementary experiments were performed utilizing body-fixed active head pitch rotations at 1 and 2 Hz while viewing a head-fixed target. Results indicated that the interaction of smooth pursuit and the aVOR during visual suppression could modify both the gain and phase characteristics of the aVOR at frequencies encountered during locomotion. When walking, targets located closer than the HFD (1.0 m) would appear to move in the same direction as the head pitch, resulting in suppression of the aVOR. The results of the head-fixed target experiment suggest that phase modification of the aVOR during visual suppression could play a role in generating eye movements consistent with the goal of maintaining gaze on targets closer than the HFD, which would augment the lVOR response.
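
    Gain and phase between head pitch velocity and eye velocity at the dominant 2 Hz locomotion frequency can be estimated by fitting a sinusoid to each signal. The sketch below shows one such least-squares estimate; the sampling rate, variable names and single-frequency assumption are mine, not details from the paper.

```python
import numpy as np

def gain_and_phase(head_vel, eye_vel, fs, freq=2.0):
    """Gain and phase (deg) of eye velocity relative to head velocity at one frequency.

    Both signals are fit with a sinusoid at `freq` Hz by linear least squares;
    gain is the amplitude ratio, phase the difference of the fitted phases.
    """
    t = np.arange(len(head_vel)) / fs
    X = np.column_stack([np.sin(2 * np.pi * freq * t),
                         np.cos(2 * np.pi * freq * t),
                         np.ones_like(t)])
    (a_h, b_h, _), *_ = np.linalg.lstsq(X, head_vel, rcond=None)
    (a_e, b_e, _), *_ = np.linalg.lstsq(X, eye_vel, rcond=None)
    gain = np.hypot(a_e, b_e) / np.hypot(a_h, b_h)
    phase = np.degrees(np.arctan2(b_e, a_e) - np.arctan2(b_h, a_h))
    return gain, (phase + 180) % 360 - 180  # wrap phase to [-180, 180)

# A gain near 0.8 with a phase near 180 deg would indicate aVOR-dominated compensation.
```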

  10. An exploratory study on the driving method of speech synthesis based on the human eye reading imaging data

    NASA Astrophysics Data System (ADS)

    Gao, Pei-pei; Liu, Feng

    2016-10-01

    With the development of information technology and artificial intelligence, speech synthesis plays a significant role in the field of human-computer interaction. However, the main problem with current speech synthesis techniques is a lack of naturalness and expressiveness, so synthesized speech does not yet approach the standard of natural language. Another problem is that human-computer interaction based on speech synthesis is too monotonous to realize a mechanism of subjective, user-driven control. This paper introduces the historical development of speech synthesis and summarizes the general process of the technique, pointing out that the prosody generation module is an important part of speech synthesis. On the basis of further research, using the rules of eye activity during reading to control and drive prosody generation is introduced as a new human-computer interaction method that enriches the forms of synthesis. The present state of speech synthesis technology is reviewed in detail. On the premise that eye gaze data can be extracted, a speech synthesis method driven in real time by the eye movement signal, and capable of expressing the speaker's real speech rhythm, is proposed. That is, while the reader silently reads a corpus, reading information such as the gaze duration per prosodic unit is captured, and a hierarchical prosodic duration model is established to determine the duration parameters of the synthesized speech. Finally, the feasibility of the above method is verified through analysis.
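
    The paper builds a hierarchical duration model from per-prosodic-unit gaze durations, but the abstract does not give the model's form. The sketch below therefore shows only the simplest conceivable mapping from gaze durations to synthesized-speech duration targets; the scale factor and floor value are invented for illustration.

```python
def duration_parameters(gaze_durations_ms, speech_rate_scale=1.0, floor_ms=80):
    """Map per-prosodic-unit gaze durations to target speech durations (ms).

    A minimal linear mapping: each prosodic unit's synthesized duration is the
    reader's gaze duration on that unit, scaled and clipped to a floor value.
    """
    return [max(floor_ms, speech_rate_scale * d) for d in gaze_durations_ms]

# e.g. gaze durations (ms) on three prosodic units during silent reading
print(duration_parameters([420, 260, 610], speech_rate_scale=0.9))
```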

  11. Collective Behaviour in Video Viewing: A Thermodynamic Analysis of Gaze Position.

    PubMed

    Burleson-Lesser, Kate; Morone, Flaviano; DeGuzman, Paul; Parra, Lucas C; Makse, Hernán A

    2017-01-01

    Videos and commercials produced for large audiences can elicit mixed opinions. We wondered whether this diversity is also reflected in the way individuals watch the videos. To answer this question, we presented 65 commercials with high production value to 25 individuals while recording their eye movements, and asked them to provide preference ratings for each video. We find that gaze positions for the most popular videos are highly correlated. To explain the correlations of eye movements, we model them as "interactions" between individuals. A thermodynamic analysis of these interactions shows that they approach a "critical" point such that any stronger interaction would put all viewers into lock-step and any weaker interaction would fully randomise patterns. At this critical point, groups with similar collective behaviour in viewing patterns emerge while maintaining diversity between groups. Our results suggest that popularity of videos is already evident in the way we look at them, and that we maintain diversity in viewing behaviour even as distinct patterns of groups emerge. Our results can be used to predict popularity of videos and commercials at the population level from the collective behaviour of the eye movements of a few viewers.
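
    The study models gaze coordination as pairwise "interactions" analysed thermodynamically; that machinery is beyond a short example, but a much simpler proxy, the mean pairwise correlation of gaze traces across viewers, conveys the basic quantity being related to popularity ratings. The array layout below is an assumption.

```python
import numpy as np

def mean_pairwise_gaze_correlation(gaze_x):
    """Mean pairwise Pearson correlation of horizontal gaze traces across viewers.

    `gaze_x` is an (n_viewers, n_samples) array for one video; NaNs should be
    interpolated or removed beforehand.
    """
    c = np.corrcoef(gaze_x)
    iu = np.triu_indices_from(c, k=1)
    return c[iu].mean()

# One gaze-similarity value per video can then be correlated with that video's
# mean preference rating to test whether popular videos are viewed more alike.
```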

  12. Visual-vestibular interaction

    NASA Technical Reports Server (NTRS)

    Young, Laurence R.; Merfeld, D.

    1994-01-01

    Significant progress was achieved during the period of this grant on a number of different fronts. A list of publications, abstracts, and theses supported by this grant is provided at the end of this document. The completed studies focused on three general areas: eye movements induced by dynamic linear acceleration, eye movements and vection reports induced by visual roll stimulation, and the separation of gravito-inertial force into central estimates of gravity and linear acceleration.

  13. Exploring the potential of analysing visual search behaviour data using FROC (free-response receiver operating characteristic) method: an initial study

    NASA Astrophysics Data System (ADS)

    Dong, Leng; Chen, Yan; Dias, Sarah; Stone, William; Dias, Joseph; Rout, John; Gale, Alastair G.

    2017-03-01

    Visual search techniques and FROC analysis have been widely used in radiology to understand medical image perceptual behaviour and diagnostic performance. The potential of exploiting the advantages of both methodologies is of great interest to medical researchers. In this study, eye tracking data from eight dental practitioners were investigated; the visual search measures and their analyses are considered here. Each participant interpreted 20 dental radiographs chosen by an expert dental radiologist. Various eye movement measurements were obtained based on image area of interest (AOI) information. FROC analysis was then carried out using these eye movement measurements as a direct input source, and the performance of FROC methods using different input parameters was tested. The results showed significant differences in FROC measures, based on eye movement data, between groups with different experience levels: the area under the curve (AUC) score showed higher values for the experienced group for the measurements of fixation and dwell time. Positive correlations were also found between AUC scores from the FROC analysis based on eye movement data and those from rating-based FROC. FROC analysis using eye movement measurements as input variables can therefore act as a potential performance indicator for assessment in medical imaging interpretation and for evaluating training procedures. These visual search data analyses point to new ways of combining eye movement data and FROC methods to provide an alternative dimension for assessing performance and visual search behaviour in medical imaging perceptual tasks.
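
    FROC analysis with eye-movement measurements as the input ratings can be pictured as follows: each fixation cluster on a radiograph becomes a "mark", its rating is an eye-movement measure such as dwell time, and the rating threshold is swept to trace out the curve. The sketch below is a generic FROC computation under that reading, not the specific pipeline used in the study.

```python
def froc_points(marks, n_images, n_lesions):
    """FROC operating points from eye-movement-derived marks.

    Each mark is (rating, lesion_id): `rating` could be dwell time or fixation
    count on an AOI, and `lesion_id` is None for marks outside any lesion AOI.
    Sweeping the rating threshold from high to low yields pairs of
    (false positives per image, fraction of lesions localized).
    """
    marks = sorted(marks, key=lambda m: -m[0])
    found, fps, points = set(), 0, []
    for rating, lesion_id in marks:
        if lesion_id is None:
            fps += 1
        else:
            found.add(lesion_id)  # each lesion counts at most once
        points.append((fps / n_images, len(found) / n_lesions))
    return points

# Toy example: two true marks on lesions 0 and 1, one false mark.
print(froc_points([(320, 0), (150, None), (90, 1)], n_images=20, n_lesions=4))
```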

  14. Combining EEG and eye movement recording in free viewing: Pitfalls and possibilities.

    PubMed

    Nikolaev, Andrey R; Meghanathan, Radha Nila; van Leeuwen, Cees

    2016-08-01

    Co-registration of EEG and eye movement has promise for investigating perceptual processes in free viewing conditions, provided certain methodological challenges can be addressed. Most of these arise from the self-paced character of eye movements in free viewing conditions. Successive eye movements occur within short time intervals. Their evoked activity is likely to distort the EEG signal during fixation. Due to the non-uniform distribution of fixation durations, these distortions are systematic, survive across-trials averaging, and can become a source of confounding. We illustrate this problem with effects of sequential eye movements on the evoked potentials and time-frequency components of EEG and propose a solution based on matching of eye movement characteristics between experimental conditions. The proposal leads to a discussion of which eye movement characteristics are to be matched, depending on the EEG activity of interest. We also compare segmentation of EEG into saccade-related epochs relative to saccade and fixation onsets and discuss the problem of baseline selection and its solution. Further recommendations are given for implementing EEG-eye movement co-registration in free viewing conditions. By resolving some of the methodological problems involved, we aim to facilitate the transition from the traditional stimulus-response paradigm to the study of visual perception in more naturalistic conditions. Copyright © 2016 Elsevier Inc. All rights reserved.
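
    Segmenting EEG into fixation-locked (or saccade-locked) epochs is the basic operation behind the co-registration analyses discussed here. A minimal sketch, assuming EEG as a channels-by-samples array and fixation onsets in seconds from the eye tracker; baseline correction and the recommended matching of fixation-duration distributions between conditions are left out.

```python
import numpy as np

def fixation_locked_epochs(eeg, fixation_onsets, fs, tmin=-0.2, tmax=0.4):
    """Cut EEG (channels x samples) into epochs time-locked to fixation onsets (s).

    Epochs that would run past the recording edges are skipped.
    """
    i0, i1 = int(tmin * fs), int(tmax * fs)
    epochs = []
    for onset in fixation_onsets:
        s = int(round(onset * fs))
        if s + i0 >= 0 and s + i1 <= eeg.shape[1]:
            epochs.append(eeg[:, s + i0 : s + i1])
    return np.stack(epochs) if epochs else np.empty((0, eeg.shape[0], i1 - i0))

# Toy usage: 64 channels, 10 s at 500 Hz, three fixation onsets.
rng = np.random.default_rng(0)
print(fixation_locked_epochs(rng.normal(size=(64, 5000)), [1.2, 3.4, 7.8], fs=500).shape)
```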

  15. The Frequency-Predictability Interaction in Reading: It Depends Where You're Coming from

    ERIC Educational Resources Information Center

    Hand, Christopher J.; Miellet, Sebastien; O'Donnell, Patrick J.; Sereno, Sara C.

    2010-01-01

    A word's frequency of occurrence and its predictability from a prior context are key factors determining how long the eyes remain on that word in normal reading. Past reaction-time and eye movement research can be distinguished by whether these variables, when combined, produce interactive or additive results, respectively. Our study addressed…

  16. Evaluation of Eye Metrics as a Detector of Fatigue

    DTIC Science & Technology

    2010-03-01

    …eyeglass frames. The cameras are angled upward toward the eyes and extract real-time pupil diameter, eye-lid movement, and eye-ball movement. … Because the cameras were mounted on eyeglass-like frames, the system was able to continuously monitor the eye throughout all sessions. Overall, the … of “fitness for duty” testing and “real-time monitoring” of operator performance has been slow (Institute of Medicine, 2004). Oculometric-based …

  17. Lexical and Post-Lexical Complexity Effects on Eye Movements in Reading

    PubMed Central

    Warren, Tessa; Reichle, Erik D.; Patson, Nikole D.

    2011-01-01

    The current study investigated how a post-lexical complexity manipulation followed by a lexical complexity manipulation affects eye movements during reading. Both manipulations caused disruption in all measures on the manipulated words, but the patterns of spill-over differed. Critically, the effects of the two kinds of manipulations did not interact, and there was no evidence that post-lexical processing difficulty delayed lexical processing on the next word (c.f. Henderson & Ferreira, 1990). This suggests that post-lexical processing of one word and lexical processing of the next can proceed independently and likely in parallel. This finding is consistent with the assumptions of the E-Z Reader model of eye movement control in reading (Reichle, Warren, & McConnell, 2009). PMID:21603125

  18. Signal-dependent noise determines motor planning

    NASA Astrophysics Data System (ADS)

    Harris, Christopher M.; Wolpert, Daniel M.

    1998-08-01

    When we make saccadic eye movements or goal-directed arm movements, there is an infinite number of possible trajectories that the eye or arm could take to reach the target. However, humans show highly stereotyped trajectories in which velocity profiles of both the eye and hand are smooth and symmetric for brief movements. Here we present a unifying theory of eye and arm movements based on the single physiological assumption that the neural control signals are corrupted by noise whose variance increases with the size of the control signal. We propose that in the presence of such signal-dependent noise, the shape of a trajectory is selected to minimize the variance of the final eye or arm position. This minimum-variance theory accurately predicts the trajectories of both saccades and arm movements and the speed-accuracy trade-off described by Fitts' law. These profiles are robust to changes in the dynamics of the eye or arm, as found empirically. Moreover, the relation between path curvature and hand velocity during drawing movements reproduces the empirical `two-thirds power law'. This theory provides a simple and powerful unifying perspective for both eye and arm movement control.
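
    The core argument, that signal-dependent noise penalizes large control signals, can be illustrated with a toy Monte-Carlo simulation on a pure integrator plant (not the paper's eye or arm model): two command profiles produce the same mean endpoint, but the briefer, larger command accumulates far more endpoint variance.

```python
import numpy as np

rng = np.random.default_rng(1)

def endpoint_variance(u, k=0.2, n_trials=5000, dt=0.001):
    """Monte-Carlo endpoint variance for a toy integrator plant driven by command u.

    Each command sample is corrupted by Gaussian noise whose standard deviation
    grows with the command size (signal-dependent noise, std = k * |u|).
    """
    noise = rng.normal(0.0, 1.0, size=(n_trials, len(u))) * (k * np.abs(u))
    endpoints = ((u + noise) * dt).sum(axis=1)
    return endpoints.var()

t = np.linspace(0, 1, 1000)
amplitude = 10.0
flat = np.full_like(t, amplitude)                    # constant-velocity command
impulsive = np.where(t < 0.1, amplitude / 0.1, 0.0)  # same displacement, brief large command

print(endpoint_variance(flat), endpoint_variance(impulsive))
# The impulsive command reaches the same endpoint on average but with roughly
# ten times the variance, showing why large control signals are costly under
# signal-dependent noise.
```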

  19. Observers' cognitive states modulate how visual inputs relate to gaze control.

    PubMed

    Kardan, Omid; Henderson, John M; Yourganov, Grigori; Berman, Marc G

    2016-09-01

    Previous research has shown that eye-movements change depending on both the visual features of our environment, and the viewer's top-down knowledge. One important question that is unclear is the degree to which the visual goals of the viewer modulate how visual features of scenes guide eye-movements. Here, we propose a systematic framework to investigate this question. In our study, participants performed 3 different visual tasks on 135 scenes: search, memorization, and aesthetic judgment, while their eye-movements were tracked. Canonical correlation analyses showed that eye-movements were reliably more related to low-level visual features at fixations during the visual search task compared to the aesthetic judgment and scene memorization tasks. Different visual features also had different relevance to eye-movements between tasks. This modulation of the relationship between visual features and eye-movements by task was also demonstrated with classification analyses, where classifiers were trained to predict the viewing task based on eye movements and visual features at fixations. Feature loadings showed that the visual features at fixations could signal task differences independent of temporal and spatial properties of eye-movements. When classifying across participants, edge density and saliency at fixations were as important as eye-movements in the successful prediction of task, with entropy and hue also being significant, but with smaller effect sizes. When classifying within participants, brightness and saturation were also significant contributors. Canonical correlation and classification results, together with a test of moderation versus mediation, suggest that the cognitive state of the observer moderates the relationship between stimulus-driven visual features and eye-movements. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
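
    Canonical correlation between per-fixation visual features and per-fixation eye-movement measures, of the kind used in this study, can be computed with a standard CCA implementation. The sketch below uses scikit-learn on synthetic data; the feature names in the comments are only examples of the kind reported.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# X: per-fixation visual features (e.g., edge density, saliency, entropy, hue, brightness)
# Y: per-fixation eye-movement measures (e.g., fixation duration, saccade amplitude)
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
Y = 0.5 * X[:, :2] + rng.normal(size=(500, 2))  # toy data with shared structure

cca = CCA(n_components=2)
Xc, Yc = cca.fit_transform(X, Y)
canonical_corrs = [np.corrcoef(Xc[:, i], Yc[:, i])[0, 1] for i in range(2)]
print(canonical_corrs)
# Comparing these correlations across tasks (search vs. memorization vs. aesthetic
# judgment) is the kind of contrast the study reports.
```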

  20. Beyond qualitative and subjective techniques to assess usability of banking interfaces for senior citizens.

    PubMed

    Laparra-Hernández, José; Medina, Enric; Sancho, María; Soriano, Carolina; Durá, Juanvi; Barberà-Guillem, Ricard; Poveda-Puente, Rakel

    2015-01-01

    Senior citizens can benefit from banking services, but a lack of usability hampers this possibility. New approaches based on physiological responses, eye tracking and user movement analysis can provide more information during interface interaction. This research shows how these measures differ depending on users' knowledge and use of technology, gender, and the type of interface.

  1. Oxytocin increases eye contact during a real-time, naturalistic social interaction in males with and without autism

    PubMed Central

    Auyeung, B; Lombardo, M V; Heinrichs, M; Chakrabarti, B; Sule, A; Deakin, J B; Bethlehem, R A I; Dickens, L; Mooney, N; Sipple, J A N; Thiemann, P; Baron-Cohen, S

    2015-01-01

    Autism spectrum conditions (autism) affect ~1% of the population and are characterized by deficits in social communication. Oxytocin has been widely reported to affect social-communicative function and its neural underpinnings. Here we report the first evidence that intranasal oxytocin administration improves a core problem that individuals with autism have in using eye contact appropriately in real-world social settings. A randomized double-blind, placebo-controlled, within-subjects design is used to examine how intranasal administration of 24 IU of oxytocin affects gaze behavior for 32 adult males with autism and 34 controls in a real-time interaction with a researcher. This interactive paradigm bypasses many of the limitations encountered with conventional static or computer-based stimuli. Eye movements are recorded using eye tracking, providing an objective measurement of looking patterns. The measure is shown to be sensitive to the reduced eye contact commonly reported in autism, with the autism group spending less time looking to the eye region of the face than controls. Oxytocin administration selectively enhanced gaze to the eyes in both the autism and control groups (transformed mean eye-fixation difference per second=0.082; 95% CI:0.025–0.14, P=0.006). Within the autism group, oxytocin has the most effect on fixation duration in individuals with impaired levels of eye contact at baseline (Cohen's d=0.86). These findings demonstrate that the potential benefits of oxytocin in autism extend to a real-time interaction, providing evidence of a therapeutic effect in a key aspect of social communication. PMID:25668435

  2. Oxytocin increases eye contact during a real-time, naturalistic social interaction in males with and without autism.

    PubMed

    Auyeung, B; Lombardo, M V; Heinrichs, M; Chakrabarti, B; Sule, A; Deakin, J B; Bethlehem, R A I; Dickens, L; Mooney, N; Sipple, J A N; Thiemann, P; Baron-Cohen, S

    2015-02-10

    Autism spectrum conditions (autism) affect ~1% of the population and are characterized by deficits in social communication. Oxytocin has been widely reported to affect social-communicative function and its neural underpinnings. Here we report the first evidence that intranasal oxytocin administration improves a core problem that individuals with autism have in using eye contact appropriately in real-world social settings. A randomized double-blind, placebo-controlled, within-subjects design is used to examine how intranasal administration of 24 IU of oxytocin affects gaze behavior for 32 adult males with autism and 34 controls in a real-time interaction with a researcher. This interactive paradigm bypasses many of the limitations encountered with conventional static or computer-based stimuli. Eye movements are recorded using eye tracking, providing an objective measurement of looking patterns. The measure is shown to be sensitive to the reduced eye contact commonly reported in autism, with the autism group spending less time looking to the eye region of the face than controls. Oxytocin administration selectively enhanced gaze to the eyes in both the autism and control groups (transformed mean eye-fixation difference per second=0.082; 95% CI:0.025-0.14, P=0.006). Within the autism group, oxytocin has the most effect on fixation duration in individuals with impaired levels of eye contact at baseline (Cohen's d=0.86). These findings demonstrate that the potential benefits of oxytocin in autism extend to a real-time interaction, providing evidence of a therapeutic effect in a key aspect of social communication.

  3. Eye-Tracking as a Tool to Evaluate Functional Ability in Everyday Tasks in Glaucoma.

    PubMed

    Kasneci, Enkelejda; Black, Alex A; Wood, Joanne M

    2017-01-01

    To date, few studies have investigated the eye movement patterns of individuals with glaucoma while they undertake everyday tasks in real-world settings. While some of these studies have reported possible compensatory gaze patterns in those with glaucoma who demonstrated good task performance despite their visual field loss, little is known about the complex interaction between field loss and visual scanning strategies and the impact on task performance and, consequently, on quality of life. We review existing approaches that have quantified the effect of glaucomatous visual field defects on the ability to undertake everyday activities through the use of eye movement analysis. Furthermore, we discuss current developments in eye-tracking technology and the potential for combining eye-tracking with virtual reality and advanced analytical approaches. Recent technological developments suggest that systems based on eye-tracking have the potential to assist individuals with glaucomatous loss to maintain or even improve their performance on everyday tasks and hence enhance their long-term quality of life. We discuss novel approaches for studying the visual search behavior of individuals with glaucoma that have the potential to assist individuals with glaucoma, through the use of personalized programs that take into consideration the individual characteristics of their remaining visual field and visual search behavior.

  4. Eye-Tracking as a Tool to Evaluate Functional Ability in Everyday Tasks in Glaucoma

    PubMed Central

    Black, Alex A.

    2017-01-01

    To date, few studies have investigated the eye movement patterns of individuals with glaucoma while they undertake everyday tasks in real-world settings. While some of these studies have reported possible compensatory gaze patterns in those with glaucoma who demonstrated good task performance despite their visual field loss, little is known about the complex interaction between field loss and visual scanning strategies and the impact on task performance and, consequently, on quality of life. We review existing approaches that have quantified the effect of glaucomatous visual field defects on the ability to undertake everyday activities through the use of eye movement analysis. Furthermore, we discuss current developments in eye-tracking technology and the potential for combining eye-tracking with virtual reality and advanced analytical approaches. Recent technological developments suggest that systems based on eye-tracking have the potential to assist individuals with glaucomatous loss to maintain or even improve their performance on everyday tasks and hence enhance their long-term quality of life. We discuss novel approaches for studying the visual search behavior of individuals with glaucoma that have the potential to assist individuals with glaucoma, through the use of personalized programs that take into consideration the individual characteristics of their remaining visual field and visual search behavior. PMID:28293433

  5. Investigating the effect of Eye Movement Desensitization and Reprocessing (EMDR) on postoperative pain intensity in adolescents undergoing surgery: a randomized controlled trial.

    PubMed

    Maroufi, Mohsen; Zamani, Shahla; Izadikhah, Zahra; Marofi, Maryam; O'Connor, Peter

    2016-09-01

    To investigate the efficacy of Eye Movement Desensitization and Reprocessing for postoperative pain management in adolescents. Eye Movement Desensitization and Reprocessing is an inexpensive, non-pharmacological intervention that has successfully been used to treat chronic pain. It holds promise in the treatment of acute, postsurgical pain based on its purported effects on the brain and nervous system. A randomized controlled trial was used. Fifty-six adolescent surgical patients aged between 12-18 years were allocated to gender-balanced Eye Movement Desensitization and Reprocessing (treatment) or non-Eye Movement Desensitization and Reprocessing (control) groups. Pain was measured using the Wong-Baker FACES(®) Pain Rating Scale (WBFS) before and after the intervention (or non-intervention for the control group). A Wilcoxon signed-rank test demonstrated that the Eye Movement Desensitization and Reprocessing group experienced a significant reduction in pain intensity after treatment intervention, whereas the control group did not. Additionally, a Mann-Whitney U-test showed that, while there was no significant difference between the two groups at time 1, there was a significant difference in pain intensity between the two groups at time 2, with the Eye Movement Desensitization and Reprocessing group experiencing lower levels of pain. These results suggest that Eye Movement Desensitization and Reprocessing may be an effective treatment modality for postoperative pain. © 2016 John Wiley & Sons Ltd.

  6. Eye movement analysis for activity recognition using electrooculography.

    PubMed

    Bulling, Andreas; Ward, Jamie A; Gellersen, Hans; Tröster, Gerhard

    2011-04-01

    In this work, we investigate eye movement analysis as a new sensing modality for activity recognition. Eye movement data were recorded using an electrooculography (EOG) system. We first describe and evaluate algorithms for detecting three eye movement characteristics from EOG signals (saccades, fixations, and blinks) and propose a method for assessing repetitive patterns of eye movements. We then devise 90 different features based on these characteristics and select a subset of them using minimum redundancy maximum relevance (mRMR) feature selection. We validate the method using an eight participant study in an office environment using an example set of five activity classes: copying a text, reading a printed paper, taking handwritten notes, watching a video, and browsing the Web. We also include periods with no specific activity (the NULL class). Using a support vector machine (SVM) classifier and person-independent (leave-one-person-out) training, we obtain an average precision of 76.1 percent and recall of 70.5 percent over all classes and participants. The work demonstrates the promise of eye-based activity recognition (EAR) and opens up discussion on the wider applicability of EAR to other activities that are difficult, or even impossible, to detect using common sensing modalities.
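
    Two ingredients of this pipeline, saccade detection from the EOG derivative and person-independent (leave-one-person-out) evaluation of an SVM, are sketched below. The velocity threshold, kernel and feature layout are illustrative choices, not the values reported in the paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

def detect_saccades(eog, fs, vel_thresh=30.0):
    """Mark samples as saccadic when the EOG derivative exceeds a velocity threshold.

    `eog` is a 1-D horizontal EOG trace in degrees; the threshold (deg/s) is a
    typical illustrative value.
    """
    velocity = np.gradient(eog) * fs
    return np.abs(velocity) > vel_thresh

def evaluate_person_independent(X, y, groups):
    """Leave-one-person-out accuracy for per-window feature vectors X,
    activity labels y and participant ids `groups`."""
    clf = SVC(kernel="linear")
    return cross_val_score(clf, X, y, cv=LeaveOneGroupOut(), groups=groups).mean()
```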

  7. Eye-head coordination during free exploration in human and cat.

    PubMed

    Einhäuser, Wolfgang; Moeller, Gudrun U; Schumann, Frank; Conradt, Jörg; Vockeroth, Johannes; Bartl, Klaus; Schneider, Erich; König, Peter

    2009-05-01

    Eye, head, and body movements jointly control the direction of gaze and the stability of retinal images in most mammalian species. The contribution of the individual movement components, however, will largely depend on the ecological niche the animal occupies and the layout of the animal's retina, in particular its photoreceptor density distribution. Here the relative contribution of eye-in-head and head-in-world movements in cats is measured, and the results are compared to recent human data. For the cat, a lightweight custom-made head-mounted video setup was used (CatCam). Human data were acquired with the novel EyeSeeCam device, which measures eye position to control a gaze-contingent camera in real time. For both species, analysis was based on simultaneous recordings of eye and head movements during free exploration of a natural environment. Despite the substantial differences in ecological niche, photoreceptor density, and saccade frequency, eye-movement characteristics in both species are remarkably similar. Coordinated eye and head movements dominate the dynamics of the retinal input. Interestingly, compensatory (gaze-stabilizing) movements play a more dominant role in humans than they do in cats. This finding was interpreted to be a consequence of substantially different timescales for head movements, with cats' head movements showing about a 5-fold faster dynamics than humans. For both species, models and laboratory experiments therefore need to account for this rich input dynamic to obtain validity for ecologically realistic settings.

  8. Objective Methods to Test Visual Dysfunction in the Presence of Cognitive Impairment

    DTIC Science & Technology

    2015-12-01

    the eye and 3) purposeful eye movements to track targets that are resolved. Major Findings: Three major objective tests of vision were successfully...developed and optimized to detect disease. These were 1) the pupil light reflex (either comparing the two eyes or independently evaluating each eye ...separately for retina or optic nerve damage, 2) eye movement based analysis of target acquisition, fixation, and eccentric viewing as a means of

  9. The Decline of Comprehension-Based Silent Reading Efficiency in the United States: A Comparison of Current Data with Performance in 1960

    ERIC Educational Resources Information Center

    Spichtig, Alexandra N.; Hiebert, Elfrieda H.; Vorstius, Christian; Pascoe, Jeffrey P.; Pearson, P. David; Radach, Ralph

    2016-01-01

    The present study measured the comprehension-based silent reading efficiency of U.S. students in grades 2, 4, 6, 8, 10, and 12. Students read standardized grade-level passages while an eye movement recording system was used to measure reading rate, fixations (eye stops) per word, fixation durations, and regressions (right-to-left eye movements)…

  10. Sex differences in objective measures of sleep in post-traumatic stress disorder and healthy control subjects.

    PubMed

    Richards, Anne; Metzler, Thomas J; Ruoff, Leslie M; Inslicht, Sabra S; Rao, Madhu; Talbot, Lisa S; Neylan, Thomas C

    2013-12-01

    A growing literature shows prominent sex effects for risk for post-traumatic stress disorder and associated medical comorbid burden. Previous research indicates that post-traumatic stress disorder is associated with reduced slow wave sleep, which may have implications for overall health, and abnormalities in rapid eye movement sleep, which have been implicated in specific post-traumatic stress disorder symptoms, but most research has been conducted in male subjects. We therefore sought to compare objective measures of sleep in male and female post-traumatic stress disorder subjects with age- and sex-matched control subjects. We used a cross-sectional, 2 × 2 design (post-traumatic stress disorder/control × female/male) involving 83 medically healthy, non-medicated adults aged 19-39 years in the inpatient sleep laboratory. Visual electroencephalographic analysis demonstrated that post-traumatic stress disorder was associated with lower slow wave sleep duration (F(3,82) = 7.63, P = 0.007) and slow wave sleep percentage (F(3,82) = 6.11, P = 0.016). There was also a group × sex interaction effect for rapid eye movement sleep duration (F(3,82) = 4.08, P = 0.047) and rapid eye movement sleep percentage (F(3,82) = 4.30, P = 0.041), explained by greater rapid eye movement sleep in post-traumatic stress disorder females compared to control females, a difference not seen in male subjects. Quantitative electroencephalography analysis demonstrated that post-traumatic stress disorder was associated with lower energy in the delta spectrum (F(3,82) = 6.79, P = 0.011) in non-rapid eye movement sleep. Slow wave sleep and delta findings were more pronounced in males. Removal of post-traumatic stress disorder subjects with comorbid major depressive disorder, who had greater post-traumatic stress disorder severity, strengthened delta effects but reduced rapid eye movement effects to non-significance. These findings support previous evidence that post-traumatic stress disorder is associated with impairment in the homeostatic function of sleep, especially in men with the disorder. These findings suggest that group × sex interaction effects on rapid eye movement may occur with more severe post-traumatic stress disorder or with post-traumatic stress disorder comorbid with major depressive disorder. © 2013 European Sleep Research Society.

  11. A Model of the Spatio-temporal Dynamics of Drosophila Eye Disc Development.

    PubMed

    Fried, Patrick; Sánchez-Aragón, Máximo; Aguilar-Hidalgo, Daniel; Lehtinen, Birgitta; Casares, Fernando; Iber, Dagmar

    2016-09-01

    Patterning and growth are linked during early development and have to be tightly controlled to result in a functional tissue or organ. During the development of the Drosophila eye, this linkage is particularly clear: the growth of the eye primordium mainly results from proliferating cells ahead of the morphogenetic furrow (MF), a moving signaling wave that sweeps across the tissue from the posterior to the anterior side, that induces proliferating cells anterior to it to differentiate and become cell cycle quiescent in its wake. Therefore, final eye disc size depends on the proliferation rate of undifferentiated cells and on the speed with which the MF sweeps across the eye disc. We developed a spatio-temporal model of the growing eye disc based on the regulatory interactions controlled by the signals Decapentaplegic (Dpp), Hedgehog (Hh) and the transcription factor Homothorax (Hth) and explored how the signaling patterns affect the movement of the MF and impact on eye disc growth. We used published and new quantitative data to parameterize the model. In particular, two crucial parameter values, the degradation rate of Hth and the diffusion coefficient of Hh, were measured. The model is able to reproduce the linear movement of the MF and the termination of growth of the primordium. We further show that the model can explain several mutant phenotypes, but fails to reproduce the previously observed scaling of the Dpp gradient in the anterior compartment.

  12. An Examination of Cognitive Processing of Multimedia Information Based on Viewers' Eye Movements

    ERIC Educational Resources Information Center

    Liu, Han-Chin; Chuang, Hsueh-Hua

    2011-01-01

    This study utilized qualitative and quantitative designs and eye-tracking technology to understand how viewers process multimedia information. Eye movement data were collected from eight college students (non-science majors) while they were viewing web pages containing different types of text and illustrations depicting the mechanism of…

  13. Predictors of Verb-Mediated Anticipatory Eye Movements in the Visual World

    ERIC Educational Resources Information Center

    Hintz, Florian; Meyer, Antje S.; Huettig, Falk

    2017-01-01

    Many studies have demonstrated that listeners use information extracted from verbs to guide anticipatory eye movements to objects in the visual context that satisfy the selection restrictions of the verb. An important question is what underlies such verb-mediated anticipatory eye gaze. Based on empirical and theoretical suggestions, we…

  14. Head-mounted eye tracking: a new method to describe infant looking.

    PubMed

    Franchak, John M; Kretch, Kari S; Soska, Kasey C; Adolph, Karen E

    2011-01-01

    Despite hundreds of studies describing infants' visual exploration of experimental stimuli, researchers know little about where infants look during everyday interactions. The current study describes the first method for studying visual behavior during natural interactions in mobile infants. Six 14-month-old infants wore a head-mounted eye-tracker that recorded gaze during free play with mothers. Results revealed that infants' visual exploration is opportunistic and depends on the availability of information and the constraints of infants' own bodies. Looks to mothers' faces were rare following infant-directed utterances but more likely if mothers were sitting at infants' eye level. Gaze toward the destination of infants' hand movements was common during manual actions and crawling, but looks toward obstacles during leg movements were less frequent. © 2011 The Authors. Child Development © 2011 Society for Research in Child Development, Inc.

  15. Understanding eye movements in face recognition using hidden Markov models.

    PubMed

    Chuk, Tim; Chan, Antoni B; Hsiao, Janet H

    2014-09-16

    We use a hidden Markov model (HMM) based approach to analyze eye movement data in face recognition. HMMs are statistical models that are specialized in handling time-series data. We conducted a face recognition task with Asian participants, and modeled each participant's eye movement pattern with an HMM, which summarized the participant's scan paths in face recognition with both regions of interest and the transition probabilities among them. By clustering these HMMs, we showed that participants' eye movements could be categorized into holistic or analytic patterns, demonstrating significant individual differences even within the same culture. Participants with the analytic pattern had longer response times, but did not differ significantly in recognition accuracy from those with the holistic pattern. We also found that correct and wrong recognitions were associated with distinctive eye movement patterns; the difference between the two patterns lies in the transitions rather than locations of the fixations alone. © 2014 ARVO.
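
    Fitting a hidden Markov model to fixation sequences, with hidden states playing the role of regions of interest, can be done with the hmmlearn package as sketched below. The number of states, the synthetic data and the clustering step mentioned in the final comment are assumptions for illustration, not the paper's exact setup.

```python
import numpy as np
from hmmlearn import hmm

# Each row of `fixations` is an (x, y) fixation position; `lengths` gives the
# number of fixations per trial so sequences are not concatenated across trials.
rng = np.random.default_rng(0)
rois = np.array([[220, 180], [320, 180], [270, 300]])  # e.g. two eyes and the mouth
fixations = rois[rng.integers(0, 3, 300)] + rng.normal(0, 15, (300, 2))
lengths = [30] * 10

model = hmm.GaussianHMM(n_components=3, covariance_type="full", n_iter=100, random_state=0)
model.fit(fixations, lengths)

print(np.round(model.transmat_, 2))  # transition probabilities between hidden ROIs
print(np.round(model.means_, 1))     # ROI centres learned from the data
# Per-participant models like this one can then be clustered (e.g., on their
# parameters or pairwise log-likelihoods) into holistic vs. analytic groups.
```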

  16. Rapid and coordinated processing of global motion images by local clusters of retinal ganglion cells.

    PubMed

    Matsumoto, Akihiro; Tachibana, Masao

    2017-01-01

    Even when the body is stationary, the whole retinal image is always in motion by fixational eye movements and saccades that move the eye between fixation points. Accumulating evidence indicates that the brain is equipped with specific mechanisms for compensating for the global motion induced by these eye movements. However, it is not yet fully understood how the retina processes global motion images during eye movements. Here we show that global motion images evoke novel coordinated firing in retinal ganglion cells (GCs). We simultaneously recorded the firing of GCs in the goldfish isolated retina using a multi-electrode array, and classified each GC based on the temporal profile of its receptive field (RF). A moving target that accompanied the global motion (simulating a saccade following a period of fixational eye movements) modulated the RF properties and evoked synchronized and correlated firing among local clusters of the specific GCs. Our findings provide a novel concept for retinal information processing during eye movements.

  17. Contextual effects on smooth-pursuit eye movements.

    PubMed

    Spering, Miriam; Gegenfurtner, Karl R

    2007-02-01

    Segregating a moving object from its visual context is particularly relevant for the control of smooth-pursuit eye movements. We examined the interaction between a moving object and a stationary or moving visual context to determine the role of the context motion signal in driving pursuit. Eye movements were recorded from human observers to a medium-contrast Gaussian dot that moved horizontally at constant velocity. A peripheral context consisted of two vertically oriented sinusoidal gratings, one above and one below the stimulus trajectory, that were either stationary or drifted into the same or opposite direction as that of the target at different velocities. We found that a stationary context impaired pursuit acceleration and velocity and prolonged pursuit latency. A drifting context enhanced pursuit performance, irrespective of its motion direction. This effect was modulated by context contrast and orientation. When a context was briefly perturbed to move faster or slower eye velocity changed accordingly, but only when the context was drifting along with the target. Perturbing a context into the direction orthogonal to target motion evoked a deviation of the eye opposite to the perturbation direction. We therefore provide evidence for the use of absolute and relative motion cues, or motion assimilation and motion contrast, for the control of smooth-pursuit eye movements.

  18. Fundamental Visual Representations of Social Cognition in ASD

    DTIC Science & Technology

    2016-12-01

    …visual adaptation functions in Autism, again pointing to basic sensory processing anomalies in this population. Our research team is developing … challenging-to-test ASD pediatric population. Subject terms: Autism, Visual Adaptation, Retinotopy, Social Communication, Eye-movements, fMRI, EEG, ERP. … social interaction are a hallmark symptom of Autism, and the lack of appropriate eye-contact during interpersonal interactions is an oft-noted feature …

  19. Combining EEG and eye tracking: identification, characterization, and correction of eye movement artifacts in electroencephalographic data

    PubMed Central

    Plöchl, Michael; Ossandón, José P.; König, Peter

    2012-01-01

    Eye movements introduce large artifacts to electroencephalographic recordings (EEG) and thus render data analysis difficult or even impossible. Trials contaminated by eye movement and blink artifacts have to be discarded, hence in standard EEG-paradigms subjects are required to fixate on the screen. To overcome this restriction, several correction methods including regression and blind source separation have been proposed. Yet, there is no automated standard procedure established. By simultaneously recording eye movements and 64-channel-EEG during a guided eye movement paradigm, we investigate and review the properties of eye movement artifacts, including corneo-retinal dipole changes, saccadic spike potentials and eyelid artifacts, and study their interrelations during different types of eye- and eyelid movements. In concordance with earlier studies, our results confirm that these artifacts arise from different independent sources and that depending on electrode site, gaze direction, and choice of reference these sources contribute differently to the measured signal. We assess the respective implications for artifact correction methods and therefore compare the performance of two prominent approaches, namely linear regression and independent component analysis (ICA). We show and discuss that due to the independence of eye artifact sources, regression-based correction methods inevitably over- or under-correct individual artifact components, while ICA is in principle suited to address such mixtures of different types of artifacts. Finally, we propose an algorithm, which uses eye tracker information to objectively identify eye-artifact related ICA-components (ICs) in an automated manner. In the data presented here, the algorithm performed very similarly to human experts when the experts were given both the topographies of the ICs and their respective activations in a large number of trials. Moreover, it performed more reliably and was almost twice as effective as human experts when the latter had to base their decision on IC topographies only. Furthermore, a receiver operating characteristic (ROC) analysis demonstrated an optimal balance of false positives and false negatives, with an area under the curve (AUC) of more than 0.99. Removing the automatically detected ICs from the data resulted in removal or substantial suppression of ocular artifacts including microsaccadic spike potentials, while the relevant neural signal remained unaffected. In conclusion, the present work aims at a better understanding of individual eye movement artifacts, their interrelations and the respective implications for eye artifact correction. Additionally, the proposed ICA-procedure provides a tool for optimized detection and correction of eye movement-related artifact components. PMID:23087632
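
    The proposed automated selection of eye-artifact ICs uses eye-tracker-defined intervals. A minimal sketch of that idea is a saccade-versus-fixation variance ratio per component, as below; the exact criterion and the 1.1 threshold are illustrative stand-ins rather than the published algorithm's precise parameters.

```python
import numpy as np

def eye_artifact_components(ic_activations, saccade_mask, fixation_mask, ratio_thresh=1.1):
    """Flag ICA components whose activity is stronger during saccades than fixations.

    `ic_activations` is (n_components, n_samples); the two boolean masks mark
    samples belonging to saccade and fixation intervals from the eye tracker.
    Components with a variance ratio above the threshold are returned.
    """
    var_sacc = ic_activations[:, saccade_mask].var(axis=1)
    var_fix = ic_activations[:, fixation_mask].var(axis=1)
    return np.where(var_sacc / var_fix > ratio_thresh)[0]
```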

  20. Disk space and load time requirements for eye movement biometric databases

    NASA Astrophysics Data System (ADS)

    Kasprowski, Pawel; Harezlak, Katarzyna

    2016-06-01

    Biometric identification is a very popular area of interest nowadays. Problems with the so-called physiological methods, like fingerprints or iris recognition, have resulted in increased attention being paid to methods measuring behavioral patterns. Eye movement based biometric (EMB) identification is one of the interesting behavioral methods and, due to the intensive development of eye tracking devices, it has become possible to define new methods for eye movement signal processing. Such methods should be supported by efficient storage used to collect eye movement data and provide it for further analysis. The aim of this research was to evaluate various setups enabling such a storage choice. Various aspects were taken into consideration, such as disk space usage and the time required to load and save the whole data set or chosen parts of it.
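
    A minimal way to compare storage setups along the dimensions mentioned (disk space, save time, load time) is to write the same recording in a text format and a binary format and time both, as sketched below. The formats, file names and toy recording are illustrative only, not the setups evaluated in the paper.

```python
import os
import time
import numpy as np

def benchmark_storage(samples, prefix="emb_sample"):
    """Disk space (bytes) and save/load times (s) for plain-text vs. binary storage
    of one eye-movement recording (columns: timestamp, x, y)."""
    results = {}

    t0 = time.perf_counter()
    np.savetxt(prefix + ".csv", samples, delimiter=",")
    save_csv = time.perf_counter() - t0
    t0 = time.perf_counter()
    np.loadtxt(prefix + ".csv", delimiter=",")
    load_csv = time.perf_counter() - t0
    results["csv"] = (os.path.getsize(prefix + ".csv"), save_csv, load_csv)

    t0 = time.perf_counter()
    np.save(prefix + ".npy", samples)
    save_npy = time.perf_counter() - t0
    t0 = time.perf_counter()
    np.load(prefix + ".npy")
    load_npy = time.perf_counter() - t0
    results["npy"] = (os.path.getsize(prefix + ".npy"), save_npy, load_npy)

    return results

# 60 s of 1 kHz gaze data as a toy recording.
print(benchmark_storage(np.random.default_rng(0).random((60_000, 3))))
```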

  1. Eye Movements in Darkness Modulate Self-Motion Perception.

    PubMed

    Clemens, Ivar Adrianus H; Selen, Luc P J; Pomante, Antonella; MacNeilage, Paul R; Medendorp, W Pieter

    2017-01-01

    During self-motion, humans typically move the eyes to maintain fixation on the stationary environment around them. These eye movements could in principle be used to estimate self-motion, but their impact on perception is unknown. We had participants judge self-motion during different eye-movement conditions in the absence of full-field optic flow. In a two-alternative forced choice task, participants indicated whether the second of two successive passive lateral whole-body translations was longer or shorter than the first. This task was used in two experiments. In the first ( n = 8), eye movements were constrained differently in the two translation intervals by presenting either a world-fixed or body-fixed fixation point or no fixation point at all (allowing free gaze). Results show that perceived translations were shorter with a body-fixed than a world-fixed fixation point. A linear model indicated that eye-movement signals received a weight of ∼25% for the self-motion percept. This model was independently validated in the trials without a fixation point (free gaze). In the second experiment ( n = 10), gaze was free during both translation intervals. Results show that the translation with the larger eye-movement excursion was judged more often to be larger than chance, based on an oculomotor choice probability analysis. We conclude that eye-movement signals influence self-motion perception, even in the absence of visual stimulation.

  2. Eye Movements in Darkness Modulate Self-Motion Perception

    PubMed Central

    Pomante, Antonella

    2017-01-01

    Abstract During self-motion, humans typically move the eyes to maintain fixation on the stationary environment around them. These eye movements could in principle be used to estimate self-motion, but their impact on perception is unknown. We had participants judge self-motion during different eye-movement conditions in the absence of full-field optic flow. In a two-alternative forced choice task, participants indicated whether the second of two successive passive lateral whole-body translations was longer or shorter than the first. This task was used in two experiments. In the first (n = 8), eye movements were constrained differently in the two translation intervals by presenting either a world-fixed or body-fixed fixation point or no fixation point at all (allowing free gaze). Results show that perceived translations were shorter with a body-fixed than a world-fixed fixation point. A linear model indicated that eye-movement signals received a weight of ∼25% for the self-motion percept. This model was independently validated in the trials without a fixation point (free gaze). In the second experiment (n = 10), gaze was free during both translation intervals. Results show that the translation with the larger eye-movement excursion was judged more often to be larger than chance, based on an oculomotor choice probability analysis. We conclude that eye-movement signals influence self-motion perception, even in the absence of visual stimulation. PMID:28144623
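
    The oculomotor choice probability analysis mentioned for the free-gaze experiment can be read as an ROC-area statistic: how often a trial judged "larger" carried the larger eye-movement excursion. A generic estimator of that quantity is sketched below; the paper's exact computation may differ.

```python
import numpy as np

def choice_probability(excursions_chose_larger, excursions_chose_smaller):
    """Probability that a randomly drawn eye-movement excursion from 'judged larger'
    trials exceeds one from 'judged smaller' trials (a Mann-Whitney / ROC-area estimate).
    """
    a = np.asarray(excursions_chose_larger, dtype=float)
    b = np.asarray(excursions_chose_smaller, dtype=float)
    greater = (a[:, None] > b[None, :]).mean()
    ties = (a[:, None] == b[None, :]).mean()
    return greater + 0.5 * ties

# Values reliably above 0.5 indicate that larger eye-movement excursions go with
# "larger translation" judgements, as reported for the free-gaze experiment.
print(choice_probability([2.1, 1.8, 2.5, 1.9], [1.6, 1.7, 2.0, 1.4]))
```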

  3. Neural Dynamics of Object-Based Multifocal Visual Spatial Attention and Priming: Object Cueing, Useful-Field-of-View, and Crowding

    ERIC Educational Resources Information Center

    Foley, Nicholas C.; Grossberg, Stephen; Mingolla, Ennio

    2012-01-01

    How are spatial and object attention coordinated to achieve rapid object learning and recognition during eye movement search? How do prefrontal priming and parietal spatial mechanisms interact to determine the reaction time costs of intra-object attention shifts, inter-object attention shifts, and shifts between visible objects and covertly cued…

  4. A relationship between eye movement patterns and performance in a precognitive tracking task

    NASA Technical Reports Server (NTRS)

    Repperger, D. W.; Hartzell, E. J.

    1977-01-01

    Eye movements made by various subjects in the performance of a precognitive tracking task are studied. The tracking task, presented by an antiaircraft artillery (AAA) simulator, has an input forcing function represented by a deterministic aircraft fly-by. The performance of subjects is ranked by two metrics. Good, mediocre, and poor trackers are selected for analysis based on performance during the difficult segment of the tracking task and over replications. Using phase planes to characterize both the eye movement patterns and the displayed error signal, a simple metric is developed to study these patterns. Two characterizations of eye movement strategies are defined and quantified. Using these two types of eye strategies, two conclusions are obtained about good, mediocre, and poor trackers. First, trackers who used a fixed eye-movement strategy consistently performed better. Second, the best fixed strategy is defined as a Crosshair Fixator.
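
    A phase plane here is simply the trajectory of a signal plotted against its time derivative. The sketch below builds that representation and one possible scalar summary of it; the dispersion metric is an invented illustration, not the metric defined in the report.

```python
import numpy as np

def phase_plane(signal, fs):
    """Return (value, derivative) pairs, i.e. the phase-plane trajectory of a signal."""
    signal = np.asarray(signal, dtype=float)
    return np.column_stack([signal, np.gradient(signal) * fs])

def phase_plane_dispersion(signal, fs):
    """RMS distance of the phase-plane trajectory from its centroid.

    A consistent (fixed-strategy) eye trace should give similar values across
    replications of the same fly-by segment.
    """
    pp = phase_plane(signal, fs)
    return np.sqrt(((pp - pp.mean(axis=0)) ** 2).sum(axis=1).mean())
```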

  5. Is there a common motor dysregulation in sleepwalking and REM sleep behaviour disorder?

    PubMed

    Haridi, Mehdi; Weyn Banningh, Sebastian; Clé, Marion; Leu-Semenescu, Smaranda; Vidailhet, Marie; Arnulf, Isabelle

    2017-10-01

    This study sought to determine if there is any overlap between the two major non-rapid eye movement and rapid eye movement parasomnias, i.e. sleepwalking/sleep terrors and rapid eye movement sleep behaviour disorder. We assessed adult patients with sleepwalking/sleep terrors using rapid eye movement sleep behaviour disorder screening questionnaires and determined if they had enhanced muscle tone during rapid eye movement sleep. Conversely, we assessed rapid eye movement sleep behaviour disorder patients using the Paris Arousal Disorders Severity Scale and determined if they had more N3 awakenings. The 251 participants included 64 patients with rapid eye movement sleep behaviour disorder (29 with idiopathic rapid eye movement sleep behaviour disorder and 35 with rapid eye movement sleep behaviour disorder associated with Parkinson's disease), 62 patients with sleepwalking/sleep terrors, 66 old healthy controls (age-matched with the rapid eye movement sleep behaviour disorder group) and 59 young healthy controls (age-matched with the sleepwalking/sleep terrors group). They completed the rapid eye movement sleep behaviour disorder screening questionnaire, rapid eye movement sleep behaviour disorder single question and Paris Arousal Disorders Severity Scale. In addition, all the participants underwent a video-polysomnography. The sleepwalking/sleep terrors patients scored positive on rapid eye movement sleep behaviour disorder scales and had a higher percentage of 'any' phasic rapid eye movement sleep without atonia when compared with controls; however, these patients did not have higher tonic rapid eye movement sleep without atonia or complex behaviours during rapid eye movement sleep. Patients with rapid eye movement sleep behaviour disorder had moderately elevated scores on the Paris Arousal Disorders Severity Scale but did not exhibit more N3 arousals (suggestive of non-rapid eye movement parasomnia) than the control group. These results indicate that dream-enacting behaviours (assessed by rapid eye movement sleep behaviour disorder screening questionnaires) are commonly reported by sleepwalking/sleep terrors patients, thus decreasing the questionnaire's specificity. Furthermore, sleepwalking/sleep terrors patients have excessive twitching during rapid eye movement sleep, which may result either from a higher dreaming activity in rapid eye movement sleep or from a more generalised non-rapid eye movement/rapid eye movement motor dyscontrol during sleep. © 2017 European Sleep Research Society.

  6. The right look for the job: decoding cognitive processes involved in the task from spatial eye-movement patterns.

    PubMed

    Król, Magdalena Ewa; Król, Michał

    2018-02-20

    The aim of the study was not only to demonstrate whether eye-movement-based task decoding was possible but also to investigate whether eye-movement patterns can be used to identify the cognitive processes behind the tasks. We compared eye-movement patterns elicited under different task conditions, with tasks differing systematically with regard to the types of cognitive processes involved in solving them. We used four tasks, differing along two dimensions: spatial (global vs. local) processing (Navon, Cognit Psychol, 9(3):353-383 1977) and semantic (deep vs. shallow) processing (Craik and Lockhart, J Verbal Learn Verbal Behav, 11(6):671-684 1972). We used eye-movement patterns obtained from two time periods: the fixation cross preceding the target stimulus and the target stimulus itself. We found significant effects of both spatial and semantic processing, but in the case of the latter, the effect might be an artefact of insufficient task control. We found above-chance task classification accuracy for both time periods: 51.4% for the period of stimulus presentation and 34.8% for the period of fixation cross presentation. Therefore, we show that the task can to some extent be decoded from the preparatory eye-movements made before the stimulus is displayed. This suggests that anticipatory eye-movements reflect the visual scanning strategy employed for the task at hand. Finally, this study also demonstrates that decoding is possible even from very scant eye-movement data, similar to Coco and Keller, J Vis 14(3):11-11 (2014). This means that task decoding is not limited to tasks that naturally take longer to perform and yield multi-second eye-movement recordings.
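
    As a rough illustration of this kind of eye-movement-based task decoding, the sketch below trains a standard classifier on a few per-trial gaze features and reports cross-validated accuracy against the 25% chance level for four tasks. The feature set, classifier choice, and synthetic data are assumptions for illustration, not the authors' pipeline.

```python
# Illustrative sketch only: decode task identity from a few per-trial gaze
# features with a standard classifier. The features, classifier, and data are
# placeholder assumptions, not the authors' pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Toy data: one row per trial (e.g. mean fixation duration, mean saccade
# amplitude, fixation dispersion); labels are the four tasks of a 2 x 2 design.
X = rng.normal(size=(200, 3))
y = rng.integers(0, 4, size=200)

acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
print(f"cross-validated decoding accuracy: {acc:.2f} (chance = 0.25)")
```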

  7. Maximum entropy perception-action space: a Bayesian model of eye movement selection

    NASA Astrophysics Data System (ADS)

    Colas, Francis; Bessière, Pierre; Girard, Benoît

    2011-03-01

    In this article, we investigate the issue of the selection of eye movements in a free-eye Multiple Object Tracking task. We propose a Bayesian model of retinotopic maps with a complex logarithmic mapping. This model is structured in two parts: a representation of the visual scene, and a decision model based on the representation. We compare different decision models based on different features of the representation and we show that taking into account uncertainty helps predict the eye movements of subjects recorded in a psychophysics experiment. Finally, based on experimental data, we postulate that the complex logarithmic mapping has functional relevance, as the density of objects in this space is more uniform than expected. This may indicate that the representation space and control strategies are such that the object density is of maximum entropy.
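
    The complex logarithmic mapping mentioned above can be illustrated compactly: retinal positions z = x + iy are mapped to w = log(z + a), which compresses the periphery. In the sketch below the foveal parameter a is an arbitrary placeholder, not a value taken from the paper.

```python
# Sketch of a complex logarithmic (log-polar) retinotopic mapping: retinal
# positions z = x + iy map to w = log(z + a), compressing the periphery.
# The foveal parameter `a` is an arbitrary placeholder, not the paper's value.
import numpy as np

def complex_log_map(x, y, a=0.5):
    """Map retinal coordinates (degrees) to log-polar space."""
    return np.log(np.asarray(x, dtype=float) + 1j * np.asarray(y, dtype=float) + a)

ecc = np.array([1.0, 6.0, 11.0, 16.0, 21.0])        # equal 5-degree steps in eccentricity
print(np.diff(np.real(complex_log_map(ecc, 0.0))))  # mapped steps shrink toward the periphery
```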

  8. Can Changes in Eye Movement Scanning Alter the Age-Related Deficit in Recognition Memory?

    PubMed Central

    Chan, Jessica P. K.; Kamino, Daphne; Binns, Malcolm A.; Ryan, Jennifer D.

    2011-01-01

    Older adults typically exhibit poorer face recognition compared to younger adults. These recognition differences may be due to underlying age-related changes in eye movement scanning. We examined whether older adults’ recognition could be improved by yoking their eye movements to those of younger adults. Participants studied younger and older faces, under free viewing conditions (bases), through a gaze-contingent moving window (own), or a moving window which replayed the eye movements of a base participant (yoked). During the recognition test, participants freely viewed the faces with no viewing restrictions. Own-age recognition biases were observed for older adults in all viewing conditions, suggesting that this effect occurs independently of scanning. Participants in the bases condition had the highest recognition accuracy, and participants in the yoked condition were more accurate than participants in the own condition. Among yoked participants, recognition did not depend on age of the base participant. These results suggest that successful encoding for all participants requires the bottom-up contribution of peripheral information, regardless of the locus of control of the viewer. Although altering the pattern of eye movements did not increase recognition, the amount of sampling of the face during encoding predicted subsequent recognition accuracy for all participants. Increased sampling may confer some advantages for subsequent recognition, particularly for people who have declining memory abilities. PMID:21687460

  9. Young children with autism spectrum disorder use predictive eye movements in action observation.

    PubMed

    Falck-Ytter, Terje

    2010-06-23

    Does a dysfunction in the mirror neuron system (MNS) underlie the social symptoms defining autism spectrum disorder (ASD)? Research suggests that the MNS matches observed actions to motor plans for similar actions, and that these motor plans include directions for predictive eye movements when observing goal-directed actions. Thus, one important question is whether children with ASD use predictive eye movements in action observation. Young children with ASD as well as typically developing children and adults were shown videos in which an actor performed object-directed actions (human agent condition). Children with ASD were also shown control videos showing objects moving by themselves (self-propelled condition). Gaze was measured using a corneal reflection technique. Children with ASD and typically developing individuals used strikingly similar goal-directed eye movements when observing others' actions in the human agent condition. Gaze was reactive in the self-propelled condition, suggesting that prediction is linked to seeing a hand-object interaction. This study does not support the view that ASD is characterized by a global dysfunction in the MNS.

  10. Using eye movements to explore mental representations of space.

    PubMed

    Fourtassi, Maryam; Rode, Gilles; Pisella, Laure

    2017-06-01

    Visual mental imagery is a cognitive experience characterised by the activation of the mental representation of an object or scene in the absence of the corresponding stimulus. According to the analogical theory, mental representations have a pictorial nature that preserves the spatial characteristics of the environment that is mentally represented. This cognitive experience shares many similarities with the experience of visual perception, including eye movements. The mental visualisation of a scene is accompanied by eye movements that reflect the spatial content of the mental image, and which can mirror the deformations of this mental image with respect to the real image, such as asymmetries or size reduction. The present article offers a concise overview of the main theories explaining the interactions between eye movements and mental representations, with some examples of the studies supporting them. It also aims to explain how ocular-tracking could be a useful tool in exploring the dynamics of spatial mental representations, especially in pathological situations where these representations can be altered, for instance in unilateral spatial neglect. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  11. Readers in Adult Basic Education.

    PubMed

    Barnes, Adrienne E; Kim, Young-Suk; Tighe, Elizabeth L; Vorstius, Christian

    The present study explored the reading skills of a sample of 48 adults enrolled in a basic education program in northern Florida, United States. Previous research has reported on reading component skills for students in adult education settings, but little is known about eye movement patterns or their relation to reading skills for this population. In this study, reading component skills including decoding, language comprehension, and reading fluency are reported, as are eye movement variables for connected-text oral reading. Eye movement comparisons between individuals with higher and lower oral reading fluency revealed within- and between-subject effects for word frequency and word length as well as group and word frequency interactions. Bivariate correlations indicated strong relations between component skills of reading, eye movement measures, and both the Test of Adult Basic Education (Reading subtest) and the Woodcock-Johnson III Diagnostic Reading Battery Passage Comprehension assessments. Regression analyses revealed the utility of decoding, language comprehension, and lexical activation time for predicting achievement on both the Woodcock-Johnson III Passage Comprehension and the Test of Adult Basic Education Reading Comprehension.

  12. Cognitive context detection in UAS operators using eye-gaze patterns on computer screens

    NASA Astrophysics Data System (ADS)

    Mannaru, Pujitha; Balasingam, Balakumar; Pattipati, Krishna; Sibley, Ciara; Coyne, Joseph

    2016-05-01

    In this paper, we demonstrate the use of eye-gaze metrics of unmanned aerial systems (UAS) operators as effective indices of their cognitive workload. Our analyses are based on an experiment where twenty participants performed pre-scripted UAS missions of three different difficulty levels by interacting with two custom-designed graphical user interfaces (GUIs) that are displayed side by side. First, we compute several eye-gaze metrics, traditional eye movement metrics as well as newly proposed ones, and analyze their effectiveness as cognitive classifiers. Most of the eye-gaze metrics are computed by dividing the computer screen into "cells". Then, we perform several analyses in order to select metrics for effective cognitive context classification related to our specific application; the objectives of these analyses are to (i) identify appropriate ways to divide the screen into cells; (ii) select appropriate metrics for training and classification of cognitive features; and (iii) identify a suitable classification method.
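
    One simple cell-based gaze metric of the kind described above is a per-cell fixation count over a grid covering the screen. The sketch below is illustrative only; the grid size, screen resolution, and fixation data are placeholders rather than the study's settings.

```python
# Sketch under assumptions: a simple cell-based gaze metric, counting fixations
# per cell of a grid covering the screen. Grid size, screen resolution, and the
# fixation coordinates are placeholders, not the study's settings.
import numpy as np

def fixations_per_cell(fix_xy, screen=(1920, 1080), grid=(8, 6)):
    """Count fixations (pixel coordinates) falling into each cell of a grid."""
    fix_xy = np.asarray(fix_xy, dtype=float)
    cols = np.clip((fix_xy[:, 0] / screen[0] * grid[0]).astype(int), 0, grid[0] - 1)
    rows = np.clip((fix_xy[:, 1] / screen[1] * grid[1]).astype(int), 0, grid[1] - 1)
    counts = np.zeros((grid[1], grid[0]), dtype=int)   # rows x columns
    np.add.at(counts, (rows, cols), 1)
    return counts

fixations = [(100, 200), (1500, 900), (960, 540), (970, 530)]
print(fixations_per_cell(fixations))
```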

  13. Re-Engineering the Stomatopod Eye, Nature’s Most Comprehensive Visual Sensor

    DTIC Science & Technology

    2013-08-22

    degree of polarisation difference. Stomatopods (Right) also react to similar looming stimuli by retreating or showing eye movements. However, the... N Roberts, NJ Marshall 2013 "Varying degrees of polarization vision in octopus: interaction between degree and angle of polarization in contrast

  14. Gender Classification Based on Eye Movements: A Processing Effect During Passive Face Viewing

    PubMed Central

    Sammaknejad, Negar; Pouretemad, Hamidreza; Eslahchi, Changiz; Salahirad, Alireza; Alinejad, Ashkan

    2017-01-01

    Studies have revealed superior face recognition skills in females, partially due to their different eye movement strategies when encoding faces. In the current study, we utilized these slight but important differences and proposed a model that estimates the gender of the viewers and classifies them into two subgroups, males and females. An eye tracker recorded participants' eye movements while they viewed images of faces. Regions of interest (ROIs) were defined for each face. Results showed that the gender dissimilarity in eye movements was not due to differences in frequency of fixations in the ROIs per se. Instead, it was caused by dissimilarity in saccade paths between the ROIs. The difference was enhanced when saccades were towards the eyes. Females showed a significant increase in transitions from other ROIs to the eyes. Consequently, the extraction of temporal transient information of saccade paths through a transition probability matrix, similar to a first-order Markov chain model, significantly improved the accuracy of the gender classification results. PMID:29071007

  15. Gender Classification Based on Eye Movements: A Processing Effect During Passive Face Viewing.

    PubMed

    Sammaknejad, Negar; Pouretemad, Hamidreza; Eslahchi, Changiz; Salahirad, Alireza; Alinejad, Ashkan

    2017-01-01

    Studies have revealed superior face recognition skills in females, partially due to their different eye movement strategies when encoding faces. In the current study, we utilized these slight but important differences and proposed a model that estimates the gender of the viewers and classifies them into two subgroups, males and females. An eye tracker recorded participants' eye movements while they viewed images of faces. Regions of interest (ROIs) were defined for each face. Results showed that the gender dissimilarity in eye movements was not due to differences in frequency of fixations in the ROIs per se. Instead, it was caused by dissimilarity in saccade paths between the ROIs. The difference was enhanced when saccades were towards the eyes. Females showed a significant increase in transitions from other ROIs to the eyes. Consequently, the extraction of temporal transient information of saccade paths through a transition probability matrix, similar to a first-order Markov chain model, significantly improved the accuracy of the gender classification results.
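
    The transition-probability feature described in both records above can be sketched in a few lines: the sequence of fixated regions of interest is treated as a first-order Markov chain, and the row-normalised transition counts form the feature matrix. The ROI labels and example scanpath below are invented for illustration, not taken from the study.

```python
# Minimal sketch (our construction, not the authors' code) of the transition
# probability matrix: the sequence of fixated ROIs is treated as a first-order
# Markov chain and transition counts are row-normalised. ROI names and the
# example scanpath are invented for illustration.
import numpy as np

ROIS = ["eyes", "nose", "mouth", "other"]

def transition_matrix(roi_sequence, rois=ROIS):
    """Row-normalised counts of transitions between consecutively fixated ROIs."""
    index = {roi: i for i, roi in enumerate(rois)}
    counts = np.zeros((len(rois), len(rois)))
    for current, following in zip(roi_sequence[:-1], roi_sequence[1:]):
        counts[index[current], index[following]] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

scanpath = ["eyes", "nose", "eyes", "mouth", "eyes", "other", "eyes"]
print(transition_matrix(scanpath))
```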

  16. Saccadic eye movement performance as an indicator of driving ability in elderly drivers.

    PubMed

    Schmitt, Kai-Uwe; Seeger, Rolf; Fischer, Hartmut; Lanz, Christian; Muser, Markus; Walz, Felix; Schwarz, Urs

    2015-01-01

    Regular checking of the fitness to drive of elderly car-license holders is required in some countries, and this will become increasingly important as more countries face aging populations. The present study investigated whether the analysis of saccadic eye movements could be used as a screening method for the assessment of driving ability. Three different paradigms (prosaccades, antisaccades, and visuovisual interactive (VVI) saccades) were used to test saccadic eye movements in 144 participants split into four groups: elderly drivers who came to the attention of road authorities for suspected lack of fitness to drive, a group of elderly drivers who served as a comparison group, a group of neurology patients with established brain lesion diagnoses, and a young comparison group. The group of elderly drivers with suspected deficits in driving skills also underwent a medical examination and a practical on-road driving test. The results of the saccadic eye tests of the different groups were compared. Antisaccade results indicated a strong link to driving behaviour: elderly drivers who were not fit to drive exhibited a poor performance on the antisaccade task and the performance in the VVI task was also clearly poorer in this group. Testing saccadic eye movements appears to be a promising and efficient method for screening large numbers of people such as elderly drivers. This study indicated a link between antisaccade performance and the ability to drive. Hence, measuring saccadic eye movements should be considered as a tool for screening the fitness to drive.

  17. An ocular biomechanic model for dynamic simulation of different eye movements.

    PubMed

    Iskander, J; Hossny, M; Nahavandi, S; Del Porto, L

    2018-04-11

    Simulating and analysing eye movements is useful for assessing the visual system's contribution to discomfort with respect to body movements, especially in virtual environments where simulation sickness might occur. It can also be used in the design of eye prostheses or humanoid robot eyes. In this paper, we present two biomechanic ocular models that are easily integrated into the available musculoskeletal models. The model was previously used to simulate eye-head coordination. The models are used to simulate and analyse eye movements. The proposed models are based on physiological and kinematic properties of the human eye. They incorporate an eye-globe, orbital suspension tissues and six muscles with their connective tissues (pulleys). Pulleys were incorporated in the rectus and inferior oblique muscles. The two proposed models are the passive pulleys and the active pulleys models. Dynamic simulations of different eye movements, including fixation, saccade and smooth pursuit, are performed to validate both models. The resultant force-length curves of the models were similar to the experimental data. The simulation results show that the proposed models are suitable to generate eye movement simulations with results comparable to other musculoskeletal models. The maximum kinematic root mean square error (RMSE) is 5.68° and 4.35° for the passive and active pulley models, respectively. The analysis of the muscle forces showed realistic muscle activation with increased muscle synergy in the active pulley model. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Neural Correlates of Fixation Duration during Real-world Scene Viewing: Evidence from Fixation-related (FIRE) fMRI.

    PubMed

    Henderson, John M; Choi, Wonil

    2015-06-01

    During active scene perception, our eyes move from one location to another via saccadic eye movements, with the eyes fixating objects and scene elements for varying amounts of time. Much of the variability in fixation duration is accounted for by attentional, perceptual, and cognitive processes associated with scene analysis and comprehension. For this reason, current theories of active scene viewing attempt to account for the influence of attention and cognition on fixation duration. Yet almost nothing is known about the neurocognitive systems associated with variation in fixation duration during scene viewing. We addressed this topic using fixation-related fMRI, which involves coregistering high-resolution eye tracking and magnetic resonance scanning to conduct event-related fMRI analysis based on characteristics of eye movements. We observed that activation in visual and prefrontal executive control areas was positively correlated with fixation duration, whereas activation in ventral areas associated with scene encoding and medial superior frontal and paracentral regions associated with changing action plans was negatively correlated with fixation duration. The results suggest that fixation duration in scene viewing is controlled by cognitive processes associated with real-time scene analysis interacting with motor planning, consistent with current computational models of active vision for scene perception.

  19. Design, simulation and evaluation of uniform magnetic field systems for head-free eye movement recordings with scleral search coils.

    PubMed

    Eibenberger, Karin; Eibenberger, Bernhard; Rucci, Michele

    2016-08-01

    The precise measurement of eye movements is important for investigating vision, oculomotor control and vestibular function. The magnetic scleral search coil technique is one of the most precise measurement techniques for recording eye movements with very high spatial (≈ 1 arcmin) and temporal (>kHz) resolution. The technique is based on measuring the voltage induced in a search coil by a large magnetic field. This search coil is embedded in a contact lens worn by a human subject. The measured voltage is in direct relationship to the orientation of the eye in space. This requires a magnetic field with a high homogeneity in the center, since otherwise the field inhomogeneity would give the false impression of a rotation of the eye due to a translational movement of the head. To circumvent this problem, a bite bar typically restricts head movement to a minimum. However, the need often emerges to precisely record eye movements under natural viewing conditions. To this end, one needs a magnetic field that is uniform over a large area. In this paper, we present the numerical and finite element simulations of the magnetic flux density of different coil geometries that could be used for search coil recordings. Based on the results, we built a 2.2 × 2.2 × 2.2 meter coil frame with a set of 3 × 4 coils to generate a 3D magnetic field and compared the measured flux density with our simulation results. In agreement with simulation results, the system yields a highly uniform field enabling high-resolution recordings of eye movements.

  20. Tonic and phasic phenomena underlying eye movements during sleep in the cat

    PubMed Central

    Márquez-Ruiz, Javier; Escudero, Miguel

    2008-01-01

    Mammalian sleep is not a homogenous state, and different variables have traditionally been used to distinguish different periods during sleep. Of these variables, eye movement is one of the most paradigmatic, and has been used to differentiate between the so-called rapid eye movement (REM) and non-REM (NREM) sleep periods. Despite this, eye movements during sleep are poorly understood, and the behaviour of the oculomotor system remains almost unknown. In the present work, we recorded binocular eye movements during the sleep–wake cycle of adult cats by the scleral search-coil technique. During alertness, eye movements consisted of conjugated saccades and eye fixations. During NREM sleep, eye movements were slow and mostly unconjugated. The two eyes moved upwardly and in the abducting direction, producing a tonic divergence and elevation of the visual axis. During the transition period between NREM and REM sleep, rapid monocular eye movements of low amplitude in the abducting direction occurred in coincidence with ponto-geniculo-occipital waves. Along REM sleep, the eyes tended to maintain a tonic convergence and depression, broken by high-frequency bursts of complex rapid eye movements. In the horizontal plane, each eye movement in the burst comprised two consecutive movements in opposite directions, which were more evident in the eye that performed the abducting movements. In the vertical plane, rapid eye movements were always upward. Comparisons of the characteristics of eye movements during the sleep–wake cycle reveal the uniqueness of eye movements during sleep, and the noteworthy existence of tonic and phasic phenomena in the oculomotor system, not observed until now. PMID:18499729

  1. Tonic and phasic phenomena underlying eye movements during sleep in the cat.

    PubMed

    Márquez-Ruiz, Javier; Escudero, Miguel

    2008-07-15

    Mammalian sleep is not a homogenous state, and different variables have traditionally been used to distinguish different periods during sleep. Of these variables, eye movement is one of the most paradigmatic, and has been used to differentiate between the so-called rapid eye movement (REM) and non-REM (NREM) sleep periods. Despite this, eye movements during sleep are poorly understood, and the behaviour of the oculomotor system remains almost unknown. In the present work, we recorded binocular eye movements during the sleep-wake cycle of adult cats by the scleral search-coil technique. During alertness, eye movements consisted of conjugated saccades and eye fixations. During NREM sleep, eye movements were slow and mostly unconjugated. The two eyes moved upwardly and in the abducting direction, producing a tonic divergence and elevation of the visual axis. During the transition period between NREM and REM sleep, rapid monocular eye movements of low amplitude in the abducting direction occurred in coincidence with ponto-geniculo-occipital waves. Along REM sleep, the eyes tended to maintain a tonic convergence and depression, broken by high-frequency bursts of complex rapid eye movements. In the horizontal plane, each eye movement in the burst comprised two consecutive movements in opposite directions, which were more evident in the eye that performed the abducting movements. In the vertical plane, rapid eye movements were always upward. Comparisons of the characteristics of eye movements during the sleep-wake cycle reveal the uniqueness of eye movements during sleep, and the noteworthy existence of tonic and phasic phenomena in the oculomotor system, not observed until now.

  2. Toward statistical modeling of saccadic eye-movement and visual saliency.

    PubMed

    Sun, Xiaoshuai; Yao, Hongxun; Ji, Rongrong; Liu, Xian-Ming

    2014-11-01

    In this paper, we present a unified statistical framework for modeling both saccadic eye movements and visual saliency. By analyzing the statistical properties of human eye fixations on natural images, we found that human attention is sparsely distributed and usually deployed to locations with abundant structural information. These observations inspired us to model saccadic behavior and visual saliency based on super-Gaussian component (SGC) analysis. Our model sequentially obtains SGCs using projection pursuit, and generates eye movements by selecting the location with the maximum SGC response. Besides simulating human saccadic behavior, we also demonstrated effectiveness and robustness superior to the state of the art by carrying out extensive experiments on synthetic patterns and human eye fixation benchmarks. Multiple key issues in saliency modeling research, such as individual differences and the effects of scale and blur, are explored in this paper. Based on extensive qualitative and quantitative experimental results, we show the promising potential of statistical approaches for human behavior research.
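
    As a loose sketch of the super-Gaussian component idea (not the authors' algorithm), one can search for a highly kurtotic projection of image-patch features and then pick the location with the strongest response along that projection as the next fixation. Everything below, including the random-search stand-in for projection pursuit and the synthetic patches, is an illustrative assumption.

```python
# Loose sketch of the super-Gaussian component idea (not the authors' method):
# search random unit projections of patch features for the most kurtotic
# (super-Gaussian) one, then fixate the location with the strongest response.
# The synthetic patches and random-search "projection pursuit" are assumptions.
import numpy as np

def excess_kurtosis(v):
    v = (v - v.mean()) / (v.std() + 1e-12)
    return float((v ** 4).mean() - 3.0)

def next_fixation_index(patches, n_directions=200, seed=0):
    """patches: (n_locations, feature_dim); returns index of the chosen location."""
    rng = np.random.default_rng(seed)
    directions = rng.normal(size=(n_directions, patches.shape[1]))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    responses = patches @ directions.T                       # (n_locations, n_directions)
    best = int(np.argmax([excess_kurtosis(responses[:, j]) for j in range(n_directions)]))
    return int(np.argmax(np.abs(responses[:, best])))

patches = np.random.default_rng(1).normal(size=(500, 64))
print("next fixation at location index:", next_fixation_index(patches))
```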

  3. Is adaptation to perceived interocular differences in height explained by vertical fusional eye movements?

    PubMed

    Maier, Felix M; Schaeffel, Frank

    2013-07-24

    The aim was to find out whether adaptation to a vertical prism involves more than fusional vertical eye movements. Adaptation to a vertical base-up 3 prism diopter prism was measured in a custom-programmed Maddox test in nine visually normal emmetropic subjects (mean age 27.0 ± 2.8 years). Vertical eye movements were binocularly measured in six of the subjects with a custom-programmed binocular video eye tracker. In the Maddox test, some subjects adjusted the perceived height as expected from the power of the prism while others appeared to ignore the prism. After 15 minutes of adaptation, the interocular difference in perceived height was reduced by 51% on average (from 0.86° to 0.44°). The larger the initially perceived difference in height in a subject, the larger the amplitude of adaptation. Eye tracking showed that the prism generated divergent vertical eye movements of 1.2° on average, which was less than expected from its power. Differences in eye elevation were maintained as long as the prism was in place. Small angles of lateral head tilt generated large interocular differences in eye elevation, much larger than the effects introduced by the prism. Vertical differences in retinal image height were compensated by vertical fusional eye movements, but some subjects responded poorly to a vertical prism in both experiments; fusional eye movements were generally too small to realign both foveae with the fixation target; and the prism adaptation in the Maddox test was fully explained by the changes in vertical eye position, suggesting that no further adaptational mechanism may be involved.

  4. The role of eye movements in depth from motion parallax during infancy

    PubMed Central

    Nawrot, Elizabeth; Nawrot, Mark

    2013-01-01

    Motion parallax is a motion-based, monocular depth cue that uses an object's relative motion and velocity as a cue to relative depth. In adults, and in monkeys, a smooth pursuit eye movement signal is used to disambiguate the depth-sign provided by these relative motion cues. The current study investigates infants' perception of depth from motion parallax and the development of two oculomotor functions, smooth pursuit and the ocular following response (OFR) eye movements. Infants 8 to 20 weeks of age were presented with three tasks in a single session: depth from motion parallax, smooth pursuit tracking, and OFR to translation. The development of smooth pursuit was significantly related to age, as was sensitivity to motion parallax. OFR eye movements also corresponded to both age and smooth pursuit gain, with groups of infants demonstrating asymmetric function in both types of eye movements. These results suggest that the development of the eye movement system may play a crucial role in the sensitivity to depth from motion parallax in infancy. Moreover, describing the development of these oculomotor functions in relation to depth perception may aid in the understanding of certain visual dysfunctions. PMID:24353309

  5. Eye movements evoked by electrical microstimulation of the mesencephalic reticular formation in goldfish.

    PubMed

    Luque, M A; Pérez-Pérez, M P; Herrero, L; Waitzman, D M; Torres, B

    2006-02-01

    Anatomical studies in goldfish show that the tectofugal axons provide a large number of boutons within the mesencephalic reticular formation. Electrical stimulation, reversible inactivation and cell recording in the primate central mesencephalic reticular formation have suggested that it participates in the control of rapid eye movements (saccades). Moreover, the role of this tecto-recipient area in the generation of saccadic eye movements in fish is unknown. In this study we show that the electrical microstimulation of the mesencephalic reticular formation of goldfish evoked short latency saccadic eye movements in any direction (contraversive or ipsiversive, upward or downward). Movements of the eyes were usually disjunctive. Based on the location of the sites from which eye movements were evoked and the preferred saccade direction, eye movements were divided into different groups: pure vertical saccades were mainly elicited from the rostral mesencephalic reticular formation, while oblique and pure horizontal were largely evoked from middle and caudal mesencephalic reticular formation zones. The direction and amplitude of pure vertical and horizontal saccades were unaffected by initial eye position. However the amplitude, but not the direction of most oblique saccades was systematically modified by initial eye position. At the same time, the amplitude of elicited saccades did not vary in any consistent manner along either the anteroposterior, dorsoventral or mediolateral axes (i.e. there was no topographic organization of the mesencephalic reticular formation with respect to amplitude). In addition to these groups of movements, we found convergent and goal-directed saccades evoked primarily from the anterior and posterior mesencephalic reticular formation, respectively. Finally, the metric and kinetic characteristics of saccades could be manipulated by changes in the stimulation parameters. We conclude that the mesencephalic reticular formation in goldfish shares physiological functions that correspond closely with those found in mammals.

  6. What is the role of the film viewer? The effects of narrative comprehension and viewing task on gaze control in film.

    PubMed

    Hutson, John P; Smith, Tim J; Magliano, Joseph P; Loschky, Lester C

    2017-01-01

    Film is ubiquitous, but the processes that guide viewers' attention while viewing film narratives are poorly understood. In fact, many film theorists and practitioners disagree on whether the film stimulus (bottom-up) or the viewer (top-down) is more important in determining how we watch movies. Reading research has shown a strong connection between eye movements and comprehension, and scene perception studies have shown strong effects of viewing tasks on eye movements, but such idiosyncratic top-down control of gaze in film would be anathema to the universal control mainstream filmmakers typically aim for. Thus, in two experiments we tested whether the eye movements and comprehension relationship similarly held in a classic film example, the famous opening scene of Orson Welles' Touch of Evil (Welles & Zugsmith, Touch of Evil, 1958). Comprehension differences were compared with more volitionally controlled task-based effects on eye movements. To investigate the effects of comprehension on eye movements during film viewing, we manipulated viewers' comprehension by starting participants at different points in a film, and then tracked their eyes. Overall, the manipulation created large differences in comprehension, but only produced modest differences in eye movements. To amplify top-down effects on eye movements, a task manipulation was designed to prioritize peripheral scene features: a map task. This task manipulation created large differences in eye movements when compared to participants freely viewing the clip for comprehension. Thus, to allow for strong, volitional top-down control of eye movements in film, task manipulations need to make features that are important to narrative comprehension irrelevant to the viewing task. The evidence provided by this experimental case study suggests that filmmakers' belief in their ability to create systematic gaze behavior across viewers is confirmed, but that this does not indicate universally similar comprehension of the film narrative.

  7. Controlling a human-computer interface system with a novel classification method that uses electrooculography signals.

    PubMed

    Wu, Shang-Lin; Liao, Lun-De; Lu, Shao-Wei; Jiang, Wei-Ling; Chen, Shi-An; Lin, Chin-Teng

    2013-08-01

    Electrooculography (EOG) signals can be used to control human-computer interface (HCI) systems, if properly classified. The ability to measure and process these signals may help HCI users to overcome many of the physical limitations and inconveniences in daily life. However, there are currently no effective multidirectional classification methods for monitoring eye movements. Here, we describe a classification method used in a wireless EOG-based HCI device for detecting eye movements in eight directions. This device includes wireless EOG signal acquisition components, wet electrodes and an EOG signal classification algorithm. The EOG classification algorithm is based on extracting features from the electrical signals corresponding to eight directions of eye movement (up, down, left, right, up-left, down-left, up-right, and down-right) and blinking. The recognition and processing of these eight different features were achieved in real-life conditions, demonstrating that this device can reliably measure the features of EOG signals. This system and its classification procedure provide an effective method for identifying eye movements. Additionally, it may be applied to study eye functions in real-life conditions in the near future.
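
    A toy illustration of multidirectional EOG classification (not the classification method proposed in the paper) is to threshold the peak horizontal and vertical EOG amplitudes and map their sign pattern to one of eight directions; the threshold and amplitudes below are placeholder values.

```python
# Toy illustration only (not the paper's classifier): threshold the peak
# horizontal and vertical EOG amplitudes and map their sign pattern to one of
# eight directions. The threshold and amplitudes are placeholder values.
import numpy as np

DIRECTIONS = {(1, 0): "right", (-1, 0): "left", (0, 1): "up", (0, -1): "down",
              (1, 1): "up-right", (-1, 1): "up-left",
              (1, -1): "down-right", (-1, -1): "down-left"}

def classify_direction(h_eog_uv, v_eog_uv, threshold_uv=50.0):
    """Classify one eye movement from peak horizontal/vertical EOG amplitudes (microvolts)."""
    h = 0 if abs(h_eog_uv) < threshold_uv else int(np.sign(h_eog_uv))
    v = 0 if abs(v_eog_uv) < threshold_uv else int(np.sign(v_eog_uv))
    return DIRECTIONS.get((h, v), "no movement / blink candidate")

print(classify_direction(120.0, -90.0))  # -> "down-right"
```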

  8. An eye movement study for identification of suitable font characters for presentation on a computer screen.

    PubMed

    Banerjee, Jayeeta; Majumdar, Dhurjati; Majumdar, Deepti; Pal, Madhu Sudan

    2010-06-01

    We are experiencing a shift of media: from printed paper to the computer screen. This transition is modifying how we read and understand text. It is very difficult to draw conclusions about the suitability of font characters based upon subjective evaluation methods alone. The present study evaluates the effect of font type on human cognitive workload during the perception of individual alphabets on a computer screen. Twenty-six young subjects volunteered for this study. Subjects were shown individual characters of different font types while their eye movements were recorded with a binocular eye movement recorder. The results showed that eye movement parameters such as pupil diameter, number of fixations, and fixation duration were lower for the font type Verdana. The present study therefore recommends the use of Verdana for the presentation of individual alphabets on electronic displays in order to reduce cognitive workload.

  9. Like a rolling stone: naturalistic visual kinematics facilitate tracking eye movements.

    PubMed

    Souto, David; Kerzel, Dirk

    2013-02-06

    Newtonian physics constrains object kinematics in the real world. We asked whether eye movements towards tracked objects depend on their compliance with those constraints. In particular, the force of gravity constrains round objects to roll on the ground with a particular rotational and translational motion. We measured tracking eye movements towards rolling objects. We found that objects with rotational and translational motion that was congruent with an object rolling on the ground elicited faster tracking eye movements during pursuit initiation than incongruent stimuli. When these objects were compared with a condition in which there was no rotational component, we essentially obtained benefits of congruence and, to a lesser extent, costs from incongruence. Anticipatory pursuit responses showed no congruence effect, suggesting that the effect is based on visually-driven predictions, not on velocity storage. We suggest that the eye movement system incorporates information about object kinematics acquired by a lifetime of experience with visual stimuli obeying the laws of Newtonian physics.

  10. Eye-movements and ongoing task processing.

    PubMed

    Burke, David T; Meleger, Alec; Schneider, Jeffrey C; Snyder, Jim; Dorvlo, Atsu S S; Al-Adawi, Samir

    2003-06-01

    This study tests the relation between eye-movements and thought processing. Subjects were given specific modality tasks (visual, gustatory, kinesthetic) and assessed on whether they responded with distinct eye-movements. Some subjects' eye-movements reflected ongoing thought processing. Instead of a universal pattern, as suggested by the neurolinguistic programming hypothesis, this study yielded subject-specific idiosyncratic eye-movements across all modalities. Included is a discussion of the neurolinguistic programming hypothesis regarding eye-movements and its implications for the eye-movement desensitization and reprocessing theory.

  11. GOM-Face: GKP, EOG, and EMG-based multimodal interface with application to humanoid robot control.

    PubMed

    Nam, Yunjun; Koo, Bonkon; Cichocki, Andrzej; Choi, Seungjin

    2014-02-01

    We present a novel human-machine interface, called GOM-Face, and its application to humanoid robot control. The GOM-Face bases its interfacing on three electric potentials measured on the face: 1) the glossokinetic potential (GKP), which reflects tongue movement; 2) the electrooculogram (EOG), which reflects eye movement; and 3) the electromyogram (EMG), which reflects teeth clenching. Each potential has been individually used for assistive interfacing to provide persons with limb motor disabilities or even complete quadriplegia an alternative communication channel. However, to the best of our knowledge, GOM-Face is the first interface that exploits all these potentials together. We resolved the interference between GKP and EOG by extracting discriminative features from two covariance matrices: a tongue-movement-only data matrix and an eye-movement-only data matrix. With this feature extraction method, GOM-Face can detect four kinds of horizontal tongue or eye movements with an accuracy of 86.7% within 2.77 s. We demonstrated the applicability of GOM-Face to humanoid robot control: users were able to communicate with the robot by selecting from a predefined menu using eye and tongue movements.

  12. Individual differences in language ability are related to variation in word recognition, not speech perception: evidence from eye movements.

    PubMed

    McMurray, Bob; Munson, Cheyenne; Tomblin, J Bruce

    2014-08-01

    The authors examined speech perception deficits associated with individual differences in language ability, contrasting auditory, phonological, or lexical accounts by asking whether lexical competition is differentially sensitive to fine-grained acoustic variation. Adolescents with a range of language abilities (N = 74, including 35 impaired) participated in an experiment based on McMurray, Tanenhaus, and Aslin (2002). Participants heard tokens from six 9-step voice onset time (VOT) continua spanning 2 words (beach/peach, beak/peak, etc.) while viewing a screen containing pictures of those words and 2 unrelated objects. Participants selected the referent while eye movements to each picture were monitored as a measure of lexical activation. Fixations were examined as a function of both VOT and language ability. Eye movements were sensitive to within-category VOT differences: As VOT approached the boundary, listeners made more fixations to the competing word. This did not interact with language ability, suggesting that language impairment is not associated with differential auditory sensitivity or phonetic categorization. Listeners with poorer language skills showed heightened competitor fixations overall, suggesting a deficit in lexical processes. Language impairment may be better characterized by a deficit in lexical competition (inability to suppress competing words), rather than differences in phonological categorization or auditory abilities.

  13. Evaluating the influence of motor control on selective attention through a stochastic model: the paradigm of motor control dysfunction in cerebellar patient.

    PubMed

    Veneri, Giacomo; Federico, Antonio; Rufa, Alessandra

    2014-01-01

    Attention allows us to selectively process the vast amount of information with which we are confronted, prioritizing some aspects of information and ignoring others by focusing on a certain location or aspect of the visual scene. Selective attention is guided by two cognitive mechanisms: saliency of the image (bottom up) and endogenous mechanisms (top down). These two mechanisms interact to direct attention and plan eye movements; then, the movement profile is sent to the motor system, which must constantly update the command needed to produce the desired eye movement. A new approach is described here to study how the eye motor control could influence this selection mechanism in clinical behavior: two groups of patients (SCA2 and late onset cerebellar ataxia LOCA) with well-known problems of motor control were studied; patients performed a cognitively demanding task; the results were compared to a stochastic model based on Monte Carlo simulations and a group of healthy subjects. The analytical procedure evaluated some energy functions for understanding the process. The implemented model suggested that patients performed an optimal visual search, reducing intrinsic noise sources. Our findings theorize a strict correlation between the "optimal motor system" and the "optimal stimulus encoders."

  14. Eyes that bind us: Gaze leading induces an implicit sense of agency.

    PubMed

    Stephenson, Lisa J; Edwards, S Gareth; Howard, Emma E; Bayliss, Andrew P

    2018-03-01

    Humans feel a sense of agency over the effects their motor system causes. This is the case for manual actions such as pushing buttons, kicking footballs, and all acts that affect the physical environment. We ask whether initiating joint attention - causing another person to follow our eye movement - can elicit an implicit sense of agency over this congruent gaze response. Eye movements themselves cannot directly affect the physical environment, but joint attention is an example of how eye movements can indirectly cause social outcomes. Here we show that leading the gaze of an on-screen face induces an underestimation of the temporal gap between action and consequence (Experiments 1 and 2). This underestimation effect, named 'temporal binding,' is thought to be a measure of an implicit sense of agency. Experiment 3 asked whether merely making an eye movement in a non-agentic, non-social context might also affect temporal estimation, and no reliable effects were detected, implying that inconsequential oculomotor acts do not reliably affect temporal estimations under these conditions. Together, these findings suggest that an implicit sense of agency is generated when initiating joint attention interactions. This is important for understanding how humans can efficiently detect and understand the social consequences of their actions. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Eye movements and hazard perception in active and passive driving

    PubMed Central

    Mackenzie, Andrew K.; Harris, Julie M.

    2015-01-01

    ABSTRACT Differences in eye movement patterns are often found when comparing passive viewing paradigms to actively engaging in everyday tasks. Arguably, investigations into visuomotor control should therefore be most useful when conducted in settings that incorporate the intrinsic link between vision and action. We present a study that compares oculomotor behaviour and hazard reaction times across a simulated driving task and a comparable, but passive, video-based hazard perception task. We found that participants scanned the road less during the active driving task and fixated closer to the front of the vehicle. Participants were also slower to detect the hazards in the driving task. Our results suggest that the interactivity of simulated driving places greater demands upon the visual and attention systems than simply viewing driving movies does. We offer insights into why these differences occur and explore the possible implications of such findings within the wider context of driver training and assessment. PMID:26681913

  16. Memory and decision making in the frontal cortex during visual motion processing for smooth pursuit eye movements.

    PubMed

    Shichinohe, Natsuko; Akao, Teppei; Kurkin, Sergei; Fukushima, Junko; Kaneko, Chris R S; Fukushima, Kikuro

    2009-06-11

    Cortical motor areas are thought to contribute "higher-order processing," but what that processing might include is unknown. Previous studies of the smooth pursuit-related discharge of supplementary eye field (SEF) neurons have not distinguished activity associated with the preparation for pursuit from discharge related to processing or memory of the target motion signals. Using a memory-based task designed to separate these components, we show that the SEF contains signals coding retinal image-slip-velocity, memory, and assessment of visual motion direction, the decision of whether to pursue, and the preparation for pursuit eye movements. Bilateral muscimol injection into SEF resulted in directional errors in smooth pursuit, errors of whether to pursue, and impairment of initial correct eye movements. These results suggest an important role for the SEF in memory and assessment of visual motion direction and the programming of appropriate pursuit eye movements.

  17. Face recognition increases during saccade preparation.

    PubMed

    Lin, Hai; Rizak, Joshua D; Ma, Yuan-ye; Yang, Shang-chuan; Chen, Lin; Hu, Xin-tian

    2014-01-01

    Face perception is integral to the human perception system, as it underlies social interactions. Saccadic eye movements are frequently made to bring interesting visual information, such as faces, onto the fovea for detailed processing. Just before eye movement onset, the processing of some basic features of an object, such as its orientation, improves at the saccade landing point. Interestingly, there is also evidence that indicates faces are processed in early visual processing stages similarly to basic features. However, it is not known whether this early enhancement of processing includes face recognition. In this study, three experiments were performed to map the timing of face presentation to the beginning of the eye movement in order to evaluate pre-saccadic face recognition. Faces were found to be processed similarly to simple objects immediately prior to saccadic movements. Starting ∼120 ms before a saccade to a target face, independent of whether or not the face was surrounded by other faces, face recognition gradually improved and the critical spacing of crowding decreased as saccade onset approached. These results suggest that an upcoming saccade prepares the visual system for new information about faces at the saccade landing site and may reduce the background in a crowd to target the intended face. This indicates an important role of pre-saccadic eye movement signals in human face recognition.

  18. Development and learning of saccadic eye movements in 7- to 42-month-old children.

    PubMed

    Alahyane, Nadia; Lemoine-Lardennois, Christelle; Tailhefer, Coline; Collins, Thérèse; Fagard, Jacqueline; Doré-Mazars, Karine

    2016-01-01

    From birth, infants move their eyes to explore their environment, interact with it, and progressively develop a multitude of motor and cognitive abilities. The characteristics and development of oculomotor control in early childhood remain poorly understood today. Here, we examined reaction time and amplitude of saccadic eye movements in 93 7- to 42-month-old children while they oriented toward visual animated cartoon characters appearing at unpredictable locations on a computer screen over 140 trials. Results revealed that saccade performance is immature in children compared to a group of adults: Saccade reaction times were longer, and saccade amplitude relative to target location (10° eccentricity) was shorter. Results also indicated that performance is flexible in children. Although saccade reaction time decreased as age increased, suggesting developmental improvements in saccade control, saccade amplitude gradually improved over trials. Moreover, similar to adults, children were able to modify saccade amplitude based on the visual error made in the previous trial. This second set of results suggests that short visual experience and/or rapid sensorimotor learning are functional in children and can also affect saccade performance.

  19. Lossless compression of otoneurological eye movement signals.

    PubMed

    Tossavainen, Timo; Juhola, Martti

    2002-12-01

    We studied the performance of several lossless compression algorithms on eye movement signals recorded in otoneurological balance and other physiological laboratories. Despite the wide use of these signals, their compression had not been studied prior to our research. The compression methods were based on the common model of using a predictor to decorrelate the input and an entropy coder to encode the residual. We found that these eye movement signals, recorded at 400 Hz and with 13-bit amplitude resolution, could be losslessly compressed with a compression ratio of about 2.7.
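
    The predictor-plus-entropy-coder scheme can be sketched as follows: a first-order (delta) predictor decorrelates the samples, and the empirical entropy of the residual bounds the achievable lossless bit rate. The 400 Hz sampling rate and 13-bit resolution are taken from the abstract; the signal below is synthetic, so the printed ratio is only indicative.

```python
# Toy sketch of the predictor + entropy-coder scheme: a first-order (delta)
# predictor decorrelates the samples, and the empirical entropy of the residual
# bounds the achievable lossless bit rate. The 400 Hz / 13-bit figures come from
# the abstract; the signal is synthetic, so the printed ratio is only indicative.
import numpy as np

def entropy_bits_per_sample(symbols):
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
t = np.arange(4000) / 400.0                                   # 10 s at 400 Hz
signal = np.round(2000 * np.sin(2 * np.pi * 0.3 * t) + rng.normal(0, 3, t.size)).astype(int)

residual = np.diff(signal, prepend=signal[0])                 # first-order prediction error
print(f"estimated compression ratio: {13.0 / entropy_bits_per_sample(residual):.2f}")
```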

  20. Method of Menu Selection by Gaze Movement Using AC EOG Signals

    NASA Astrophysics Data System (ADS)

    Kanoh, Shin'ichiro; Futami, Ryoko; Yoshinobu, Tatsuo; Hoshimiya, Nozomu

    A method to detect the direction and the distance of voluntary eye gaze movements from EOG (electrooculogram) signals was proposed and tested. In this method, AC-amplified vertical and horizontal transient EOG signals were classified into 8-class directions and 2-class distances of voluntary eye gaze movements. The horizontal and vertical EOGs during an eye gaze movement were treated at each sampling time as a two-dimensional vector, and the center of gravity of the sample vectors whose norms were more than 80% of the maximum norm was used as the feature vector to be classified. Using the k-nearest neighbor algorithm for classification, the averaged correct detection rates for each subject were 98.9%, 98.7%, and 94.4%, respectively. This method avoids strict EOG-based eye tracking, which requires DC amplification of a very small signal. It would be useful for developing robust human interfacing systems based on menu selection for severely paralyzed patients.
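
    The feature extraction and classification steps described above can be sketched directly: each sample of the horizontal and vertical AC EOG is treated as a 2-D vector, the centre of gravity of the samples whose norm reaches 80% of the maximum is taken as the feature, and a k-nearest-neighbour classifier assigns the direction and distance class. The training examples and class labels below are invented for illustration.

```python
# Minimal sketch of the feature extraction and classification described above:
# keep the (horizontal, vertical) EOG samples whose norm reaches 80% of the
# maximum, use their centre of gravity as the feature, and classify it with
# k-nearest neighbours. Training examples and class labels are invented.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def gaze_feature(h_eog, v_eog):
    """Centre of gravity of the strongest (h, v) EOG sample vectors."""
    vectors = np.column_stack([h_eog, v_eog])
    norms = np.linalg.norm(vectors, axis=1)
    return vectors[norms >= 0.8 * norms.max()].mean(axis=0)

# Placeholder training set: feature vectors and (direction, distance) labels.
X_train = np.array([[100, 0], [0, 100], [-100, 0], [0, -100], [200, 0]])
y_train = ["right-near", "up-near", "left-near", "down-near", "right-far"]
knn = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)

h = np.array([5.0, 60.0, 110.0, 95.0, 10.0])   # one movement's horizontal EOG samples
v = np.array([0.0, 5.0, 8.0, 4.0, 1.0])        # corresponding vertical EOG samples
print(knn.predict([gaze_feature(h, v)]))       # -> ['right-near']
```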

  1. The real-time prediction and inhibition of linguistic outcomes: Effects of language and literacy skill.

    PubMed

    Kukona, Anuenue; Braze, David; Johns, Clinton L; Mencl, W Einar; Van Dyke, Julie A; Magnuson, James S; Pugh, Kenneth R; Shankweiler, Donald P; Tabor, Whitney

    2016-11-01

    Recent studies have found considerable individual variation in language comprehenders' predictive behaviors, as revealed by their anticipatory eye movements during language comprehension. The current study investigated the relationship between these predictive behaviors and the language and literacy skills of a diverse, community-based sample of young adults. We found that rapid automatized naming (RAN) was a key determinant of comprehenders' prediction ability (e.g., as reflected in predictive eye movements to a white cake on hearing "The boy will eat the white…"). Simultaneously, comprehension-based measures predicted participants' ability to inhibit eye movements to objects that shared features with predictable referents but were implausible completions (e.g., as reflected in eye movements to a white but inedible car). These findings suggest that the excitatory and inhibitory mechanisms that support prediction during language processing are closely linked with specific cognitive abilities that support literacy. We show that a self-organizing cognitive architecture captures this pattern of results. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. A geometric method for computing ocular kinematics and classifying gaze events using monocular remote eye tracking in a robotic environment.

    PubMed

    Singh, Tarkeshwar; Perry, Christopher M; Herter, Troy M

    2016-01-26

    Robotic and virtual-reality systems offer tremendous potential for improving assessment and rehabilitation of neurological disorders affecting the upper extremity. A key feature of these systems is that visual stimuli are often presented within the same workspace as the hands (i.e., peripersonal space). Integrating video-based remote eye tracking with robotic and virtual-reality systems can provide an additional tool for investigating how cognitive processes influence visuomotor learning and rehabilitation of the upper extremity. However, remote eye tracking systems typically compute ocular kinematics by assuming eye movements are made in a plane with constant depth (e.g. frontal plane). When visual stimuli are presented at variable depths (e.g. transverse plane), eye movements have a vergence component that may influence reliable detection of gaze events (fixations, smooth pursuits and saccades). To our knowledge, there are no available methods to classify gaze events in the transverse plane for monocular remote eye tracking systems. Here we present a geometrical method to compute ocular kinematics from a monocular remote eye tracking system when visual stimuli are presented in the transverse plane. We then use the obtained kinematics to compute velocity-based thresholds that allow us to accurately identify onsets and offsets of fixations, saccades and smooth pursuits. Finally, we validate our algorithm by comparing the gaze events computed by the algorithm with those obtained from the eye-tracking software and manual digitization. Within the transverse plane, our algorithm reliably differentiates saccades from fixations (static visual stimuli) and smooth pursuits from saccades and fixations when visual stimuli are dynamic. The proposed methods provide advancements for examining eye movements in robotic and virtual-reality systems. Our methods can also be used with other video-based or tablet-based systems in which eye movements are performed in a peripersonal plane with variable depth.
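
    A common building block for the velocity-based event detection mentioned above is a simple velocity-threshold labelling of gaze samples, sketched below under assumed values; the sampling rate and threshold are placeholders, and distinguishing smooth pursuit would require an additional intermediate velocity band, which is omitted here.

```python
# Sketch under assumed values: a velocity-threshold (I-VT style) labelling of
# gaze samples into fixations and saccades, the kind of velocity-based event
# detection the abstract builds on. Sampling rate and threshold are placeholders;
# smooth pursuit would need an extra intermediate velocity band, omitted here.
import numpy as np

def label_gaze_events(gaze_deg, fs=500.0, saccade_threshold_dps=30.0):
    """gaze_deg: (n_samples, 2) gaze angles in degrees; returns per-sample labels."""
    velocity = np.linalg.norm(np.gradient(gaze_deg, 1.0 / fs, axis=0), axis=1)
    return np.where(velocity > saccade_threshold_dps, "saccade", "fixation")

gaze = np.vstack([np.zeros((100, 2)),                      # fixation at (0, 0)
                  np.linspace([0, 0], [10, 0], 20),        # 10 deg saccade in 40 ms
                  np.tile([10.0, 0.0], (100, 1))])         # fixation at (10, 0)
print(np.unique(label_gaze_events(gaze), return_counts=True))
```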

  3. Modelling eye movements in a categorical search task

    PubMed Central

    Zelinsky, Gregory J.; Adeli, Hossein; Peng, Yifan; Samaras, Dimitris

    2013-01-01

    We introduce a model of eye movements during categorical search, the task of finding and recognizing categorically defined targets. It extends a previous model of eye movements during search (target acquisition model, TAM) by using distances from a support vector machine classification boundary to create probability maps indicating pixel-by-pixel evidence for the target category in search images. Other additions include functionality enabling target-absent searches, and a fixation-based blurring of the search images now based on a mapping between visual and collicular space. We tested this model on images from a previously conducted variable set-size (6/13/20) present/absent search experiment where participants searched for categorically defined teddy bear targets among random category distractors. The model not only captured target-present/absent set-size effects, but also accurately predicted for all conditions the numbers of fixations made prior to search judgements. It also predicted the percentages of first eye movements during search landing on targets, a conservative measure of search guidance. Effects of set size on false negative and false positive errors were also captured, but error rates in general were overestimated. We conclude that visual features discriminating a target category from non-targets can be learned and used to guide eye movements during categorical search. PMID:24018720
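    As a rough illustration of the central ingredient described above (not the authors' TAM extension itself), signed distances from an SVM decision boundary over image patches can be squashed into a target-evidence map; the feature extraction step is omitted and replaced with placeholder arrays:

    ```python
    import numpy as np
    from sklearn.svm import LinearSVC

    # Placeholder patch features and labels stand in for real target/non-target
    # training data extracted from the search images.
    rng = np.random.default_rng(0)
    train_feats = rng.normal(size=(200, 16))
    train_labels = rng.integers(0, 2, size=200)
    clf = LinearSVC().fit(train_feats, train_labels)

    def evidence_map(patch_feats, grid_shape):
        """Turn signed distances from the SVM boundary into (0, 1) evidence
        values, one per patch, reshaped to the patch grid."""
        d = clf.decision_function(patch_feats)                 # signed distances
        return (1.0 / (1.0 + np.exp(-d))).reshape(grid_shape)  # logistic squashing

    prob_map = evidence_map(rng.normal(size=(20 * 30, 16)), grid_shape=(20, 30))
    ```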

  4. The Coordinated Interplay of Scene, Utterance, and World Knowledge: Evidence from Eye Tracking

    ERIC Educational Resources Information Center

    Knoeferle, Pia; Crocker, Matthew W.

    2006-01-01

    Two studies investigated the interaction between utterance and scene processing by monitoring eye movements in agent-action-patient events, while participants listened to related utterances. The aim of Experiment 1 was to determine if and when depicted events are used for thematic role assignment and structural disambiguation of temporarily…

  5. Diagnostic accuracy of eye movements in assessing pedophilia.

    PubMed

    Fromberger, Peter; Jordan, Kirsten; Steinkrauss, Henrike; von Herder, Jakob; Witzel, Joachim; Stolpmann, Georg; Kröner-Herwig, Birgit; Müller, Jürgen Leo

    2012-07-01

    Given that recurrent sexual interest in prepubescent children is one of the strongest single predictors for pedosexual offense recidivism, valid and reliable diagnosis of pedophilia is of particular importance. Nevertheless, current assessment methods still fail to fulfill psychometric quality criteria. The aim of the study was to evaluate the diagnostic accuracy of eye-movement parameters in regard to pedophilic sexual preferences. Eye movements were measured while 22 pedophiles (according to ICD-10 F65.4 diagnosis), 8 non-pedophilic forensic controls, and 52 healthy controls simultaneously viewed the picture of a child and the picture of an adult. Fixation latency was assessed as a parameter for automatic attentional processes and relative fixation time to account for controlled attentional processes. Receiver operating characteristic (ROC) analyses, which are based on calculated age-preference indices, were carried out to determine the classifier performance. Cross-validation using the leave-one-out method was used to test the validity of classifiers. Pedophiles showed significantly shorter fixation latencies and significantly longer relative fixation times for child stimuli than either of the control groups. Classifier performance analysis revealed an area under the curve (AUC) = 0.902 for fixation latency and an AUC = 0.828 for relative fixation time. The eye-tracking method based on fixation latency discriminated between pedophiles and non-pedophiles with a sensitivity of 86.4% and a specificity of 90.0%. Cross-validation demonstrated good validity of eye-movement parameters. Despite some methodological limitations, measuring eye movements seems to be a promising approach to assess deviant pedophilic interests. Eye movements, which represent automatic attentional processes, demonstrated high diagnostic accuracy. © 2012 International Society for Sexual Medicine.
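    The classifier-performance analysis reported above combines a single preference index with ROC analysis and leave-one-out validation. A minimal sketch of that style of analysis follows; the index values are simulated placeholders, not study data:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve
    from sklearn.model_selection import LeaveOneOut

    # Hypothetical age-preference index per participant (higher = stronger
    # preference for child stimuli) and diagnostic label (1 = pedophile).
    rng = np.random.default_rng(1)
    index = np.concatenate([rng.normal(1.0, 0.5, 22), rng.normal(0.0, 0.5, 60)])
    label = np.concatenate([np.ones(22), np.zeros(60)])

    auc = roc_auc_score(label, index)

    # Leave-one-out: pick the Youden-optimal threshold on n-1 participants,
    # then classify the held-out participant with it.
    hits = []
    for train_idx, test_idx in LeaveOneOut().split(index):
        fpr, tpr, thr = roc_curve(label[train_idx], index[train_idx])
        best = thr[np.argmax(tpr - fpr)]                # Youden's J statistic
        pred = (index[test_idx] >= best).astype(float)
        hits.append(pred == label[test_idx])
    print(f"AUC = {auc:.3f}, leave-one-out accuracy = {np.mean(hits):.3f}")
    ```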

  6. View-invariant object category learning, recognition, and search: how spatial and object attention are coordinated using surface-based attentional shrouds.

    PubMed

    Fazl, Arash; Grossberg, Stephen; Mingolla, Ennio

    2009-02-01

    How does the brain learn to recognize an object from multiple viewpoints while scanning a scene with eye movements? How does the brain avoid the problem of erroneously classifying parts of different objects together? How are attention and eye movements intelligently coordinated to facilitate object learning? A neural model provides a unified mechanistic explanation of how spatial and object attention work together to search a scene and learn what is in it. The ARTSCAN model predicts how an object's surface representation generates a form-fitting distribution of spatial attention, or "attentional shroud". All surface representations dynamically compete for spatial attention to form a shroud. The winning shroud persists during active scanning of the object. The shroud maintains sustained activity of an emerging view-invariant category representation while multiple view-specific category representations are learned and are linked through associative learning to the view-invariant object category. The shroud also helps to restrict scanning eye movements to salient features on the attended object. Object attention plays a role in controlling and stabilizing the learning of view-specific object categories. Spatial attention hereby coordinates the deployment of object attention during object category learning. Shroud collapse releases a reset signal that inhibits the active view-invariant category in the What cortical processing stream. Then a new shroud, corresponding to a different object, forms in the Where cortical processing stream, and search using attention shifts and eye movements continues to learn new objects throughout a scene. The model mechanistically clarifies basic properties of attention shifts (engage, move, disengage) and inhibition of return. It simulates human reaction time data about object-based spatial attention shifts, and learns with 98.1% accuracy and a compression of 430 on a letter database whose letters vary in size, position, and orientation. The model provides a powerful framework for unifying many data about spatial and object attention, and their interactions during perception, cognition, and action.

  7. Smooth-pursuit eye-movement deficits with chemical lesions in macaque nucleus reticularis tegmenti pontis.

    PubMed

    Suzuki, D A; Yamada, T; Hoedema, R; Yee, R D

    1999-09-01

    Anatomic and neuronal recordings suggest that the nucleus reticularis tegmenti pontis (NRTP) of macaques may be a major pontine component of a cortico-ponto-cerebellar pathway that subserves the control of smooth-pursuit eye movements. The existence of such a pathway was implicated by the lack of permanent pursuit impairment after bilateral lesions in the dorsolateral pontine nucleus. To provide more direct evidence that NRTP is involved with regulating smooth-pursuit eye movements, chemical lesions were made in macaque NRTP by injecting either lidocaine or ibotenic acid. Injection sites first were identified by the recording of smooth-pursuit-related modulations in neuronal activity. The resulting lesions caused significant deficits in both the maintenance and the initiation of smooth-pursuit eye movements. After lesion formation, the gain of constant-velocity, maintained smooth-pursuit eye movements decreased, on the average, by 44%. Recovery of the ability to maintain smooth-pursuit eye movements occurred over approximately 3 days when maintained pursuit gains attained normal values. The step-ramp, "Rashbass" task was used to investigate the effects of the lesions on the initiation of smooth-pursuit eye movements. Eye accelerations averaged over the initial 80 ms of pursuit initiation were determined and found to be decremented, on the average, by 48% after the administration of ibotenic acid. Impairments in the initiation and maintenance of smooth-pursuit eye movements were directional in nature. Upward pursuit seemed to be the most vulnerable and was impaired in all cases independent of lesioning agent and type of pursuit investigated. Downward smooth pursuit seemed more resistant to the effects of chemical lesions in NRTP. Impairments in horizontal tracking were observed with examples of deficits in ipsilaterally and contralaterally directed pursuit. The results provide behavioral support for the physiologically and anatomic-based conclusion that NRTP is a component of a cortico-ponto-cerebellar circuit that presumably involves the pursuit area of the frontal eye field (FEF) and projects to ocular motor-related areas of the cerebellum. This FEF-NRTP-cerebellum path would parallel a middle and medial superior temporal cerebral cortical area-dorsolateral pontine nucleus-cerebellum pathway also known to be involved with regulating smooth-pursuit eye movements.

  8. Abelson Interactor 1 (Abi1) and Its Interaction with Wiskott-Aldrich Syndrome Protein (Wasp) Are Critical for Proper Eye Formation in Xenopus Embryos*

    PubMed Central

    Singh, Arvinder; Winterbottom, Emily F.; Ji, Yon Ju; Hwang, Yoo-Seok; Daar, Ira O.

    2013-01-01

    Abl interactor 1 (Abi1) is a scaffold protein that plays a central role in the regulation of actin cytoskeleton dynamics as a constituent of several key protein complexes, and homozygous loss of this protein leads to embryonic lethality in mice. Because this scaffold protein has been shown in cultured cells to be a critical component of pathways controlling cell migration and actin regulation at cell-cell contacts, we were interested to investigate the in vivo role of Abi1 in morphogenesis during the development of Xenopus embryos. Using morpholino-mediated translation inhibition, we demonstrate that knockdown of Abi1 in the whole embryo, or specifically in eye field progenitor cells, leads to disruption of eye morphogenesis. Moreover, signaling through the Src homology 3 domain of Abi1 is critical for proper movement of retinal progenitor cells into the eye field and their appropriate differentiation, and this process is dependent upon an interaction with the nucleation-promoting factor Wasp (Wiskott-Aldrich syndrome protein). Collectively, our data demonstrate that the Abi1 scaffold protein is an essential regulator of cell movement processes required for normal eye development in Xenopus embryos and specifically requires an Src homology 3 domain-dependent interaction with Wasp to regulate this complex morphogenetic process. PMID:23558677

  9. Interaction of the body, head, and eyes during walking and turning

    NASA Technical Reports Server (NTRS)

    Imai, T.; Moore, S. T.; Raphan, T.; Cohen, B.

    2001-01-01

    Body, head, and eye movements were measured in five subjects during straight walking and while turning corners. The purpose was to determine how well the head and eyes followed the linear trajectory of the body in space and whether head orientation followed changes in the gravito-inertial acceleration vector (GIA). Head and body movements were measured with a video-based motion analysis system and horizontal, vertical, and torsional eye movements with video-oculography. During straight walking, there was lateral body motion at the stride frequency, which was at half the frequency of stepping. The GIA oscillated about the direction of heading, according to the acceleration and deceleration associated with heel strike and toe flexion, and the body yawed in concert with stepping. Despite the linear and rotatory motions of the head and body, the head pointed along the forward motion of the body during straight walking. The head pitch/roll component appeared to compensate for vertical and horizontal acceleration of the head rather than orienting to the tilt of the GIA or anticipating it. When turning corners, subjects walked on a 50-cm radius over two steps or on a 200-cm radius in five to seven steps. Maximum centripetal accelerations in sharp turns were ca.0.4 g, which tilted the GIA ca.21 degrees with regard to the heading. This was anticipated by a roll tilt of the head of up to 8 degrees. The eyes rolled 1-1.5 degrees and moved down into the direction of linear acceleration during the tilts of the GIA. Yaw head deviations moved smoothly through the turn, anticipating the shift in lateral body trajectory by as much as 25 degrees. The trunk did not anticipate the change in trajectory. Thus, in contrast to straight walking, the tilt axes of the head and the GIA tended to align during turns. Gaze was stable in space during the slow phases and jumped forward in saccades along the trajectory, leading it by larger angles when the angular velocity of turning was greater. The anticipatory roll head movements during turning are likely to be utilized to overcome inertial forces that would destabilize balance during turning. The data show that compensatory eye, head, and body movements stabilize gaze during straight walking, while orienting mechanisms direct the eyes, head, and body to tilts of the GIA in space during turning.
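    The reported relation between the roughly 0.4 g peak centripetal acceleration and the roughly 21 degree tilt of the GIA follows directly from the vector sum of gravity and centripetal acceleration; a one-line check:

    ```python
    import numpy as np

    g = 9.81                               # gravity, m/s^2
    a_c = 0.4 * g                          # reported peak centripetal acceleration
    tilt_deg = np.degrees(np.arctan2(a_c, g))
    print(f"GIA tilt relative to heading: {tilt_deg:.1f} deg")   # ~21.8 deg
    ```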

  10. Video-based eye tracking for neuropsychiatric assessment.

    PubMed

    Adhikari, Sam; Stark, David E

    2017-01-01

    This paper presents a video-based eye-tracking method, ideally deployed via a mobile device or laptop-based webcam, as a tool for measuring brain function. Eye movements and pupillary motility are tightly regulated by brain circuits, are subtly perturbed by many disease states, and are measurable using video-based methods. Quantitative measurement of eye movement by readily available webcams may enable early detection and diagnosis, as well as remote/serial monitoring, of neurological and neuropsychiatric disorders. We successfully extracted computational and semantic features for 14 testing sessions, comprising 42 individual video blocks and approximately 17,000 image frames generated across several days of testing. Here, we demonstrate the feasibility of collecting video-based eye-tracking data from a standard webcam in order to assess psychomotor function. Furthermore, we were able to demonstrate through systematic analysis of this data set that eye-tracking features (in particular, radial and tangential variance on a circular visual-tracking paradigm) predict performance on well-validated psychomotor tests. © 2017 New York Academy of Sciences.
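    The radial and tangential variance features mentioned above are not spelled out here; one plausible reading, sketched below under that assumption, decomposes gaze error around the circular target path into components along and across the instantaneous radius:

    ```python
    import numpy as np

    def radial_tangential_variance(gaze_xy, target_xy, center):
        """Project gaze error (gaze - target) onto the radial and tangential
        directions of a circular target path and return the two variances."""
        err = gaze_xy - target_xy
        radial_dir = target_xy - center
        radial_dir /= np.linalg.norm(radial_dir, axis=1, keepdims=True)
        tangential_dir = np.column_stack([-radial_dir[:, 1], radial_dir[:, 0]])
        radial_err = np.sum(err * radial_dir, axis=1)
        tangential_err = np.sum(err * tangential_dir, axis=1)
        return np.var(radial_err), np.var(tangential_err)

    # Synthetic example: gaze noisily tracks a circle of radius 200 px.
    rng = np.random.default_rng(2)
    ang = np.linspace(0, 2 * np.pi, 400)
    target = np.column_stack([200 * np.cos(ang), 200 * np.sin(ang)])
    gaze = target + rng.normal(0, 5, target.shape)
    print(radial_tangential_variance(gaze, target, center=np.zeros(2)))
    ```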

  11. Predicting the Valence of a Scene from Observers’ Eye Movements

    PubMed Central

    R.-Tavakoli, Hamed; Atyabi, Adham; Rantanen, Antti; Laukka, Seppo J.; Nefti-Meziani, Samia; Heikkilä, Janne

    2015-01-01

    Multimedia analysis benefits from understanding the emotional content of a scene in a variety of tasks such as video genre classification and content-based image retrieval. Recently, there has been an increasing interest in applying human bio-signals, particularly eye movements, to recognize the emotional gist of a scene such as its valence. In order to determine the emotional category of images using eye movements, the existing methods often learn a classifier using several features that are extracted from eye movements. Although it has been shown that eye movement is potentially useful for recognition of scene valence, the contribution of each feature is not well-studied. To address the issue, we study the contribution of features extracted from eye movements in the classification of images into pleasant, neutral, and unpleasant categories. We assess ten features and their fusion. The features are histogram of saccade orientation, histogram of saccade slope, histogram of saccade length, histogram of saccade duration, histogram of saccade velocity, histogram of fixation duration, fixation histogram, top-ten salient coordinates, and saliency map. We utilize a machine learning approach to analyze the performance of features by learning a support vector machine and exploiting various feature fusion schemes. The experiments reveal that ‘saliency map’, ‘fixation histogram’, ‘histogram of fixation duration’, and ‘histogram of saccade slope’ are the most contributing features. The selected features signify the influence of fixation information and angular behavior of eye movements in the recognition of the valence of images. PMID:26407322

  12. Is having similar eye movement patterns during face learning and recognition beneficial for recognition performance? Evidence from hidden Markov modeling.

    PubMed

    Chuk, Tim; Chan, Antoni B; Hsiao, Janet H

    2017-12-01

    The hidden Markov model (HMM)-based approach for eye movement analysis is able to reflect individual differences in both spatial and temporal aspects of eye movements. Here we used this approach to understand the relationship between eye movements during face learning and recognition, and its association with recognition performance. We discovered holistic (i.e., mainly looking at the face center) and analytic (i.e., specifically looking at the two eyes in addition to the face center) patterns during both learning and recognition. Although for both learning and recognition, participants who adopted analytic patterns had better recognition performance than those with holistic patterns, a significant positive correlation between the likelihood of participants' patterns being classified as analytic and their recognition performance was only observed during recognition. Significantly more participants adopted holistic patterns during learning than recognition. Interestingly, about 40% of the participants used different patterns between learning and recognition, and among them 90% switched their patterns from holistic at learning to analytic at recognition. In contrast to the scan path theory, which posits that eye movements during learning have to be recapitulated during recognition for the recognition to be successful, participants who used the same or different patterns during learning and recognition did not differ in recognition performance. The similarity between their learning and recognition eye movement patterns also did not correlate with their recognition performance. These findings suggested that perceptuomotor memory elicited by eye movement patterns during learning does not play an important role in recognition. In contrast, the retrieval of diagnostic information for recognition, such as the eyes for face recognition, is a better predictor for recognition performance. Copyright © 2017 Elsevier Ltd. All rights reserved.
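    The study's analysis builds on an HMM toolbox with clustering across participants; a stripped-down sketch of the core step, fitting a single participant's fixation sequences with hmmlearn, is given below (the region layout and state count are illustrative assumptions):

    ```python
    import numpy as np
    from hmmlearn import hmm

    # Synthetic fixation (x, y) sequences for one participant, one array per
    # trial, alternating between a face-center region and an eye region.
    rng = np.random.default_rng(3)
    def fake_trial():
        center = rng.normal([256, 230], 20, size=(3, 2))
        eyes = rng.normal([256, 150], 15, size=(3, 2))
        return np.vstack([center, eyes])

    trials = [fake_trial() for _ in range(40)]
    X = np.vstack(trials)                    # stacked fixations
    lengths = [len(tr) for tr in trials]     # trial boundaries for hmmlearn

    # Hidden states correspond to regions of interest (2D Gaussians); transition
    # probabilities capture the temporal order of fixations between them.
    model = hmm.GaussianHMM(n_components=2, covariance_type="full",
                            n_iter=100, random_state=0)
    model.fit(X, lengths)
    print("ROI centers:\n", model.means_)
    print("Transition matrix:\n", np.round(model.transmat_, 2))
    ```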

  13. Eye movement training is most effective when it involves a task-relevant sensorimotor decision.

    PubMed

    Fooken, Jolande; Lalonde, Kathryn M; Mann, Gurkiran K; Spering, Miriam

    2018-04-01

    Eye and hand movements are closely linked when performing everyday actions. We conducted a perceptual-motor training study to investigate mutually beneficial effects of eye and hand movements, asking whether training in one modality benefits performance in the other. Observers had to predict the future trajectory of a briefly presented moving object, and intercept it at its assumed location as accurately as possible with their finger. Eye and hand movements were recorded simultaneously. Different training protocols either included eye movements or a combination of eye and hand movements with or without external performance feedback. Eye movement training did not transfer across modalities: Irrespective of feedback, finger interception accuracy and precision improved after training that involved the hand, but not after isolated eye movement training. Conversely, eye movements benefited from hand movement training or when external performance feedback was given, thus improving only when an active interceptive task component was involved. These findings indicate only limited transfer across modalities. However, they reveal the importance of creating a training task with an active sensorimotor decision to improve the accuracy and precision of eye and hand movements.

  14. Three-dimensional organization of otolith-ocular reflexes in rhesus monkeys. I. Linear acceleration responses during off-vertical axis rotation

    NASA Technical Reports Server (NTRS)

    Angelaki, D. E.; Hess, B. J.

    1996-01-01

    1. The dynamic properties of otolith-ocular reflexes elicited by sinusoidal linear acceleration along the three cardinal head axes were studied during off-vertical axis rotations in rhesus monkeys. As the head rotates in space at constant velocity about an off-vertical axis, otolith-ocular reflexes are elicited in response to the sinusoidally varying linear acceleration (gravity) components along the interaural, nasooccipital, or vertical head axis. Because the frequency of these sinusoidal stimuli is proportional to the velocity of rotation, rotation at low and moderately fast speeds allows the study of the mid-and low-frequency dynamics of these otolith-ocular reflexes. 2. Animals were rotated in complete darkness in the yaw, pitch, and roll planes at velocities ranging between 7.4 and 184 degrees/s. Accordingly, otolith-ocular reflexes (manifested as sinusoidal modulations in eye position and/or slow-phase eye velocity) were quantitatively studied for stimulus frequencies ranging between 0.02 and 0.51 Hz. During yaw and roll rotation, torsional, vertical, and horizontal slow-phase eye velocity was sinusoidally modulated as a function of head position. The amplitudes of these responses were symmetric for rotations in opposite directions. In contrast, mainly vertical slow-phase eye velocity was modulated during pitch rotation. This modulation was asymmetric for rotations in opposite direction. 3. Each of these response components in a given rotation plane could be associated with an otolith-ocular response vector whose sensitivity, temporal phase, and spatial orientation were estimated on the basis of the amplitude and phase of sinusoidal modulations during both directions of rotation. Based on this analysis, which was performed either for slow-phase eye velocity alone or for total eye excursion (including both slow and fast eye movements), two distinct response patterns were observed: 1) response vectors with pronounced dynamics and spatial/temporal properties that could be characterized as the low-frequency range of "translational" otolith-ocular reflexes; and 2) response vectors associated with an eye position modulation in phase with head position ("tilt" otolith-ocular reflexes). 4. The responses associated with two otolith-ocular vectors with pronounced dynamics consisted of horizontal eye movements evoked as a function of gravity along the interaural axis and vertical eye movements elicited as a function of gravity along the vertical head axis. Both responses were characterized by a slow-phase eye velocity sensitivity that increased three- to five-fold and large phase changes of approximately 100-180 degrees between 0.02 and 0.51 Hz. These dynamic properties could suggest nontraditional temporal processing in utriculoocular and sacculoocular pathways, possibly involving spatiotemporal otolith-ocular interactions. 5. The two otolith-ocular vectors associated with eye position responses in phase with head position (tilt otolith-ocular reflexes) consisted of torsional eye movements in response to gravity along the interaural axis, and vertical eye movements in response to gravity along the nasooccipital head axis. These otolith-ocular responses did not result from an otolithic effect on slow eye movements alone. Particularly at high frequencies (i.e., high speed rotations), saccades were responsible for most of the modulation of torsional and vertical eye position, which was relatively large (on average +/- 8-10 degrees/g) and remained independent of frequency. 
Such reflex dynamics can be simulated by a direct coupling of primary otolith afferent inputs to the oculomotor plant. (ABSTRACT TRUNCATED).
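    The sensitivities and phases quoted above come from sinusoidal modulation of slow-phase eye velocity at the rotation frequency; the generic estimation step can be sketched as a least-squares sine/cosine fit (the synthetic trace below is not study data):

    ```python
    import numpy as np

    def fit_sinusoid(eye_velocity, t, freq_hz):
        """Least-squares fit of A*sin + B*cos at the stimulus frequency.
        Returns amplitude (sensitivity, input units) and phase (deg)."""
        w = 2 * np.pi * freq_hz
        design = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
        (a, b, _), *_ = np.linalg.lstsq(design, eye_velocity, rcond=None)
        return np.hypot(a, b), np.degrees(np.arctan2(b, a))

    # Synthetic slow-phase velocity modulated at 0.2 Hz with noise.
    t = np.arange(0, 60, 0.01)
    vel = 12 * np.sin(2 * np.pi * 0.2 * t + np.radians(30)) + np.random.normal(0, 2, t.size)
    print(fit_sinusoid(vel, t, 0.2))     # recovers ~12 deg/s amplitude, ~30 deg phase
    ```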

  15. Evaluation of the Tobii EyeX Eye tracking controller and Matlab toolkit for research.

    PubMed

    Gibaldi, Agostino; Vanegas, Mauricio; Bex, Peter J; Maiello, Guido

    2017-06-01

    The Tobii Eyex Controller is a new low-cost binocular eye tracker marketed for integration in gaming and consumer applications. The manufacturers claim that the system was conceived for natural eye gaze interaction, does not require continuous recalibration, and allows moderate head movements. The Controller is provided with a SDK to foster the development of new eye tracking applications. We review the characteristics of the device for its possible use in scientific research. We develop and evaluate an open source Matlab Toolkit that can be employed to interface with the EyeX device for gaze recording in behavioral experiments. The Toolkit provides calibration procedures tailored to both binocular and monocular experiments, as well as procedures to evaluate other eye tracking devices. The observed performance of the EyeX (i.e. accuracy < 0.6°, precision < 0.25°, latency < 50 ms and sampling frequency ≈55 Hz), is sufficient for some classes of research application. The device can be successfully employed to measure fixation parameters, saccadic, smooth pursuit and vergence eye movements. However, the relatively low sampling rate and moderate precision limit the suitability of the EyeX for monitoring micro-saccadic eye movements or for real-time gaze-contingent stimulus control. For these applications, research grade, high-cost eye tracking technology may still be necessary. Therefore, despite its limitations with respect to high-end devices, the EyeX has the potential to further the dissemination of eye tracking technology to a broad audience, and could be a valuable asset in consumer and gaming applications as well as a subset of basic and clinical research settings.
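    Accuracy and precision figures of the kind reported for the EyeX can be computed from fixation samples around a known target; the definitions below follow common eye-tracking usage and are not necessarily the Toolkit's exact implementation:

    ```python
    import numpy as np

    def accuracy_precision(gaze_deg, target_deg):
        """Accuracy: mean angular offset from the target (deg).
        Precision: RMS of sample-to-sample angular differences (deg)."""
        offsets = np.linalg.norm(gaze_deg - target_deg, axis=1)
        accuracy = offsets.mean()
        steps = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1)
        precision = np.sqrt(np.mean(steps ** 2))
        return accuracy, precision

    # Synthetic samples around a target at (5, 0) degrees of visual angle.
    rng = np.random.default_rng(4)
    gaze = np.array([5.0, 0.0]) + rng.normal(0, 0.2, size=(55, 2))
    print(accuracy_precision(gaze, np.array([5.0, 0.0])))
    ```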

  16. Ictal SPECT in patients with rapid eye movement sleep behaviour disorder.

    PubMed

    Mayer, Geert; Bitterlich, Marion; Kuwert, Torsten; Ritt, Philipp; Stefan, Hermann

    2015-05-01

    Rapid eye movement sleep behaviour disorder is a rapid eye movement parasomnia clinically characterized by acting out dreams due to disinhibition of muscle tone in rapid eye movement sleep. Up to 80-90% of the patients with rapid eye movement sleep behaviour disorder develop neurodegenerative disorders within 10-15 years after symptom onset. The disorder is reported in 45-60% of all narcoleptic patients. Whether rapid eye movement sleep behaviour disorder is also a predictor for neurodegeneration in narcolepsy is not known. Although the pathophysiology causing the disinhibition of muscle tone in rapid eye movement sleep behaviour disorder has been studied extensively in animals, little is known about the mechanisms in humans. Most of the human data are from imaging or post-mortem studies. Recent studies show altered functional connectivity between substantia nigra and striatum in patients with rapid eye movement sleep behaviour disorder. We were interested to study which regions are activated in rapid eye movement sleep behaviour disorder during actual episodes by performing ictal single photon emission tomography. We studied one patient with idiopathic rapid eye movement sleep behaviour disorder, one with Parkinson's disease and rapid eye movement sleep behaviour disorder, and two patients with narcolepsy and rapid eye movement sleep behaviour disorder. All patients underwent extended video polysomnography. The tracer was injected after at least 10 s of consecutive rapid eye movement sleep and 10 s of disinhibited muscle tone accompanied by movements registered by an experienced sleep technician. Ictal single photon emission tomography displayed the same activation in the bilateral premotor areas, the interhemispheric cleft, the periaqueductal area, the dorsal and ventral pons and the anterior lobe of the cerebellum in all patients. Our study shows that in patients with Parkinson's disease and rapid eye movement sleep behaviour disorder-in contrast to wakefulness-the neural activity generating movement during episodes of rapid eye movement sleep behaviour disorder bypasses the basal ganglia, a mechanism that is shared by patients with idiopathic rapid eye movement sleep behaviour disorder and narcolepsy patients with rapid eye movement sleep behaviour disorder. © The Author (2015). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. Recognition method of construction conflict based on driver's eye movement.

    PubMed

    Xu, Yi; Li, Shiwu; Gao, Song; Tan, Derong; Guo, Dong; Wang, Yuqiong

    2018-04-01

    Drivers' eye movement data in simulated construction conflicts at different speeds were collected and analyzed to find the relationship between the drivers' eye movement and the construction conflict. On the basis of this relationship, the peak point of the wavelet-processed pupil diameter, the first point on the left side of the peak point, and the first blink point after the peak point are selected as key points for locating construction conflict periods. On the basis of the key points and the GSA, a construction conflict recognition method, called the CCFRM, is proposed, and its construction conflict recognition speed and location accuracy are verified. The good performance of the CCFRM verified the feasibility of the proposed key points in construction conflict recognition. Copyright © 2018 Elsevier Ltd. All rights reserved.
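    The CCFRM's processing chain is only outlined above; as a generic sketch of the wavelet-smoothing and peak-location step applied to a pupil-diameter trace (the wavelet choice, decomposition level, and prominence value are illustrative assumptions):

    ```python
    import numpy as np
    import pywt
    from scipy.signal import find_peaks

    def wavelet_smooth(pupil, wavelet="db4", level=4):
        """Keep only the coarse approximation of the pupil-diameter signal."""
        coeffs = pywt.wavedec(pupil, wavelet, level=level)
        coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]   # drop detail bands
        return pywt.waverec(coeffs, wavelet)[: len(pupil)]

    # Synthetic 60 s pupil trace at 60 Hz with a dilation "event" around t = 30 s.
    t = np.arange(0, 60, 1 / 60)
    rng = np.random.default_rng(5)
    pupil = 4 + 0.5 * np.exp(-((t - 30) ** 2) / 8) + rng.normal(0, 0.05, t.size)

    smooth = wavelet_smooth(pupil)
    peaks, _ = find_peaks(smooth, prominence=0.3)
    print("candidate conflict peaks at t =", t[peaks])
    ```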

  18. Evaluation of search strategies for microcalcifications and masses in 3D images

    NASA Astrophysics Data System (ADS)

    Eckstein, Miguel P.; Lago, Miguel A.; Abbey, Craig K.

    2018-03-01

    Medical imaging is quickly evolving towards 3D image modalities such as computed tomography (CT), magnetic resonance imaging (MRI) and digital breast tomosynthesis (DBT). These 3D image modalities add volumetric information but further increase the need for radiologists to search through the image data set. Although much is known about search strategies in 2D images, less is known about the functional consequences of different 3D search strategies. We instructed readers to use two different search strategies: drillers had their eye movements restricted to a few regions while they quickly scrolled through the image stack, whereas scanners explored the 2D slices with eye movements. We used real-time eye position monitoring to ensure observers followed the drilling or the scanning strategy while approximately preserving the percentage of the volumetric data covered by the useful field of view. We investigated search for two signals: a simulated microcalcification and a larger simulated mass. Results show an interaction between the search strategy and lesion type. In particular, scanning provided significantly better detectability for microcalcifications at the cost of 5 times more search time, while there was little change in the detectability for the larger simulated masses. Analyses of eye movements support the hypothesis that the effectiveness of a search strategy in 3D imaging arises from the interaction of the fixational sampling of visual information and the signals' visibility in the visual periphery.

  19. Optical eye tracking system for real-time noninvasive tumor localization in external beam radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Via, Riccardo, E-mail: riccardo.via@polimi.it; Fassi, Aurora; Fattori, Giovanni

    Purpose: External beam radiotherapy currently represents an important therapeutic strategy for the treatment of intraocular tumors. Accurate target localization and efficient compensation of involuntary eye movements are crucial to avoid deviations in dose distribution with respect to the treatment plan. This paper describes an eye tracking system (ETS) based on noninvasive infrared video imaging. The system was designed for capturing the tridimensional (3D) ocular motion and provides an on-line estimation of intraocular lesions position based on a priori knowledge coming from volumetric imaging. Methods: Eye tracking is performed by localizing cornea and pupil centers on stereo images captured by two calibrated video cameras, exploiting eye reflections produced by infrared illumination. Additionally, torsional eye movements are detected by template matching in the iris region of eye images. This information allows estimating the 3D position and orientation of the eye by means of an eye local reference system. By combining ETS measurements with volumetric imaging for treatment planning [computed tomography (CT) and magnetic resonance (MR)], one is able to map the position of the lesion to be treated in local eye coordinates, thus enabling real-time tumor referencing during treatment setup and irradiation. Experimental tests on an eye phantom and seven healthy subjects were performed to assess ETS tracking accuracy. Results: Measurements on phantom showed an overall median accuracy within 0.16 mm and 0.40° for translations and rotations, respectively. Torsional movements were affected by 0.28° median uncertainty. On healthy subjects, the gaze direction error ranged between 0.19° and 0.82° at a median working distance of 29 cm. The median processing time of the eye tracking algorithm was 18.60 ms, thus allowing eye monitoring up to 50 Hz. Conclusions: A noninvasive ETS prototype was designed to perform real-time target localization and eye movement monitoring during ocular radiotherapy treatments. The device aims at improving state-of-the-art invasive procedures based on surgical implantation of radiopaque clips and repeated acquisition of X-ray images, with expected positive effects on treatment quality and patient outcome.

  20. Optical eye tracking system for real-time noninvasive tumor localization in external beam radiotherapy.

    PubMed

    Via, Riccardo; Fassi, Aurora; Fattori, Giovanni; Fontana, Giulia; Pella, Andrea; Tagaste, Barbara; Riboldi, Marco; Ciocca, Mario; Orecchia, Roberto; Baroni, Guido

    2015-05-01

    External beam radiotherapy currently represents an important therapeutic strategy for the treatment of intraocular tumors. Accurate target localization and efficient compensation of involuntary eye movements are crucial to avoid deviations in dose distribution with respect to the treatment plan. This paper describes an eye tracking system (ETS) based on noninvasive infrared video imaging. The system was designed for capturing the tridimensional (3D) ocular motion and provides an on-line estimation of intraocular lesions position based on a priori knowledge coming from volumetric imaging. Eye tracking is performed by localizing cornea and pupil centers on stereo images captured by two calibrated video cameras, exploiting eye reflections produced by infrared illumination. Additionally, torsional eye movements are detected by template matching in the iris region of eye images. This information allows estimating the 3D position and orientation of the eye by means of an eye local reference system. By combining ETS measurements with volumetric imaging for treatment planning [computed tomography (CT) and magnetic resonance (MR)], one is able to map the position of the lesion to be treated in local eye coordinates, thus enabling real-time tumor referencing during treatment setup and irradiation. Experimental tests on an eye phantom and seven healthy subjects were performed to assess ETS tracking accuracy. Measurements on phantom showed an overall median accuracy within 0.16 mm and 0.40° for translations and rotations, respectively. Torsional movements were affected by 0.28° median uncertainty. On healthy subjects, the gaze direction error ranged between 0.19° and 0.82° at a median working distance of 29 cm. The median processing time of the eye tracking algorithm was 18.60 ms, thus allowing eye monitoring up to 50 Hz. A noninvasive ETS prototype was designed to perform real-time target localization and eye movement monitoring during ocular radiotherapy treatments. The device aims at improving state-of-the-art invasive procedures based on surgical implantation of radiopaque clips and repeated acquisition of X-ray images, with expected positive effects on treatment quality and patient outcome.

  1. The influence of clutter on real-world scene search: evidence from search efficiency and eye movements.

    PubMed

    Henderson, John M; Chanceaux, Myriam; Smith, Tim J

    2009-01-23

    We investigated the relationship between visual clutter and visual search in real-world scenes. Specifically, we investigated whether visual clutter, indexed by feature congestion, sub-band entropy, and edge density, correlates with search performance as assessed both by traditional behavioral measures (response time and error rate) and by eye movements. Our results demonstrate that clutter is related to search performance. These results hold for both traditional search measures and for eye movements. The results suggest that clutter may serve as an image-based proxy for search set size in real-world scenes.
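    Feature congestion and sub-band entropy require more machinery, but edge density, the third clutter index named above, reduces to the fraction of edge pixels; a sketch with scikit-image follows (the example images and response times are placeholders, not study data):

    ```python
    import numpy as np
    from skimage import data
    from skimage.color import rgb2gray
    from skimage.feature import canny

    def edge_density(image_rgb, sigma=2.0):
        """Fraction of pixels marked as edges by a Canny detector."""
        edges = canny(rgb2gray(image_rgb), sigma=sigma)
        return edges.mean()

    # Correlate the clutter index with per-scene search times.
    scenes = [data.astronaut(), data.chelsea(), data.coffee()]   # stand-in scenes
    clutter = np.array([edge_density(s) for s in scenes])
    search_rt = np.array([1.9, 1.2, 2.4])                        # placeholder RTs, s
    print("r =", np.corrcoef(clutter, search_rt)[0, 1])
    ```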

  2. Sensor-Based Interactive Balance Training with Visual Joint Movement Feedback for Improving Postural Stability in Diabetics with Peripheral Neuropathy: A Randomized Controlled Trial.

    PubMed

    Grewal, Gurtej Singh; Schwenk, Michael; Lee-Eng, Jacqueline; Parvaneh, Saman; Bharara, Manish; Menzies, Robert A; Talal, Talal K; Armstrong, David G; Najafi, Bijan

    2015-01-01

    Individuals with diabetic peripheral neuropathy (DPN) have deficits in sensory and motor skills leading to inadequate proprioceptive feedback, impaired postural balance and higher fall risk. This study investigated the effect of sensor-based interactive balance training on postural stability and daily physical activity in older adults with diabetes. Thirty-nine older adults with DPN were enrolled (age 63.7 ± 8.2 years, BMI 30.6 ± 6, 54% females) and randomized to either an intervention (IG) or a control (CG) group. The IG received sensor-based interactive exercise training tailored for people with diabetes (twice a week for 4 weeks). The exercises focused on shifting weight and crossing virtual obstacles. Body-worn sensors were implemented to acquire kinematic data and provide real-time joint visual feedback during the training. Outcome measurements included changes in center of mass (CoM) sway, ankle and hip joint sway measured during a balance test while the eyes were open and closed at baseline and after the intervention. Daily physical activities were also measured during a 48-hour period at baseline and at follow-up. Analysis of covariance was performed for the post-training outcome comparison. Compared with the CG, the patients in the IG showed a significantly reduced CoM sway (58.31%; p = 0.009), ankle sway (62.7%; p = 0.008) and hip joint sway (72.4%; p = 0.017) during the balance test with open eyes. The ankle sway was also significantly reduced in the IG group (58.8%; p = 0.037) during measurements while the eyes were closed. The number of steps walked showed a substantial but nonsignificant increase (+27.68%; p = 0.064) in the IG following training. The results of this randomized controlled trial demonstrate that people with DPN can significantly improve their postural balance with diabetes-specific, tailored, sensor-based exercise training. The results promote the use of wearable technology in exercise training; however, future studies comparing this technology with commercially available systems are required to evaluate the benefit of interactive visual joint movement feedback. © 2015 S. Karger AG, Basel.

  3. Visual Contrast Sensitivity in Early-Stage Parkinson's Disease.

    PubMed

    Ming, Wendy; Palidis, Dimitrios J; Spering, Miriam; McKeown, Martin J

    2016-10-01

    Visual impairments are frequent in Parkinson's disease (PD) and impact normal functioning in daily activities. Visual contrast sensitivity is a powerful nonmotor sign for discriminating PD patients from controls. However, it is usually assessed with static visual stimuli. Here we examined the interaction between perception and eye movements in static and dynamic contrast sensitivity tasks in a cohort of mildly impaired, early-stage PD patients. Patients (n = 13) and healthy age-matched controls (n = 12) viewed stimuli of various spatial frequencies (0-8 cyc/deg) and speeds (0°/s, 10°/s, 30°/s) on a computer monitor. Detection thresholds were determined by asking participants to adjust luminance contrast until they could just barely see the stimulus. Eye position was recorded with a video-based eye tracker. Patients' static contrast sensitivity was impaired in the intermediate spatial-frequency range and this impairment correlated with fixational instability. However, dynamic contrast sensitivity and patients' smooth pursuit were relatively normal. An independent component analysis revealed contrast sensitivity profiles differentiating patients and controls. Our study simultaneously assesses perceptual contrast sensitivity and eye movements in PD, revealing a possible link between fixational instability and perceptual deficits. Spatiotemporal contrast sensitivity profiles may represent an easily measurable metric as a component of a broader combined biometric for nonmotor features observed in PD.

  4. Binocular eye movement control and motion perception: what is being tracked?

    PubMed

    van der Steen, Johannes; Dits, Joyce

    2012-10-19

    We investigated under what conditions humans can make independent slow phase eye movements. The ability to make independent movements of the two eyes generally is attributed to few specialized lateral eyed animal species, for example chameleons. In our study, we showed that humans also can move the eyes in different directions. To maintain binocular retinal correspondence independent slow phase movements of each eye are produced. We used the scleral search coil method to measure binocular eye movements in response to dichoptically viewed visual stimuli oscillating in orthogonal direction. Correlated stimuli led to orthogonal slow eye movements, while the binocularly perceived motion was the vector sum of the motion presented to each eye. The importance of binocular fusion on independency of the movements of the two eyes was investigated with anti-correlated stimuli. The perceived global motion pattern of anti-correlated dichoptic stimuli was perceived as an oblique oscillatory motion, as well as resulted in a conjugate oblique motion of the eyes. We propose that the ability to make independent slow phase eye movements in humans is used to maintain binocular retinal correspondence. Eye-of-origin and binocular information are used during the processing of binocular visual information, and it is decided at an early stage whether binocular or monocular motion information and independent slow phase eye movements of each eye are produced during binocular tracking.

  5. Binocular Eye Movement Control and Motion Perception: What Is Being Tracked?

    PubMed Central

    van der Steen, Johannes; Dits, Joyce

    2012-01-01

    Purpose. We investigated under what conditions humans can make independent slow phase eye movements. The ability to make independent movements of the two eyes generally is attributed to few specialized lateral eyed animal species, for example chameleons. In our study, we showed that humans also can move the eyes in different directions. To maintain binocular retinal correspondence independent slow phase movements of each eye are produced. Methods. We used the scleral search coil method to measure binocular eye movements in response to dichoptically viewed visual stimuli oscillating in orthogonal direction. Results. Correlated stimuli led to orthogonal slow eye movements, while the binocularly perceived motion was the vector sum of the motion presented to each eye. The importance of binocular fusion on independency of the movements of the two eyes was investigated with anti-correlated stimuli. The perceived global motion pattern of anti-correlated dichoptic stimuli was perceived as an oblique oscillatory motion, as well as resulted in a conjugate oblique motion of the eyes. Conclusions. We propose that the ability to make independent slow phase eye movements in humans is used to maintain binocular retinal correspondence. Eye-of-origin and binocular information are used during the processing of binocular visual information, and it is decided at an early stage whether binocular or monocular motion information and independent slow phase eye movements of each eye are produced during binocular tracking. PMID:22997286

  6. Research on wheelchair robot control system based on EOG

    NASA Astrophysics Data System (ADS)

    Xu, Wang; Chen, Naijian; Han, Xiangdong; Sun, Jianbo

    2018-04-01

    The paper describes an intelligent wheelchair control system based on EOG that can help disabled people improve their independence in daily living. The system acquires the EOG signal from the user, detects the number of blinks and the direction of glancing, and then sends commands to the wheelchair robot via RS-232 to control it. The control system combines EOG signal processing with human-computer interaction technology, so that conscious eye movements can be used to control the wheelchair robot.
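    No implementation details are given in the abstract; a toy sketch of the control loop it describes might look as follows, where the thresholds, channel layout, serial port name, and single-byte command protocol are all assumptions:

    ```python
    import numpy as np
    import serial  # pyserial; the port name and baud rate below are assumptions

    def detect_command(eog_h, eog_v, glance_thresh=200e-6, blink_thresh=400e-6):
        """Map a short window of horizontal/vertical EOG (volts) to a command."""
        if np.max(np.abs(eog_v)) > blink_thresh:     # large vertical spike = blink
            return b"S"                              # stop
        if np.max(eog_h) > glance_thresh:            # rightward glance
            return b"R"
        if np.min(eog_h) < -glance_thresh:           # leftward glance
            return b"L"
        return b"F"                                  # default: keep moving forward

    # One loop iteration: read a window of samples, classify, send via RS-232.
    window_h = np.random.normal(0, 50e-6, 250)       # placeholder EOG samples
    window_v = np.random.normal(0, 50e-6, 250)
    with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
        port.write(detect_command(window_h, window_v))
    ```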

  7. VARIABILITY OF VISUAL FIELD MEASUREMENTS IS CORRELATED WITH THE GRADIENT OF VISUAL SENSITIVITY

    PubMed Central

    Wyatt, Harry J.; Dul, Mitchell W.; Swanson, William H.

    2007-01-01

    Conventional static automated perimetry provides important clinical information, but its utility is limited by considerable test-retest variability. Fixational eye movements during testing could contribute to variability. To assess this possibility, it is important to know how much sensitivity change would be caused by a given eye movement. To investigate this, we have evaluated the gradient, the rate at which sensitivity changes with location. We tested one eye each, twice within 3 weeks, of 29 patients with glaucoma, 17 young normal subjects and 13 older normal subjects. The 10-2 test pattern with the SITA Standard algorithm was used to assess sensitivity at locations with 2° spacing. Variability and gradient were calculated at individual test locations. Matrix correlations were determined between variability and gradient, and were substantial for the patients with glaucoma. The results were consistent with a substantial contribution to test-retest variability from small fixational eye movements interacting with visual field gradient. Successful characterization of the gradient of sensitivity appears to require sampling at relatively close spacing, as in the 10-2 test pattern. PMID:17320924

  8. Variability of visual field measurements is correlated with the gradient of visual sensitivity.

    PubMed

    Wyatt, Harry J; Dul, Mitchell W; Swanson, William H

    2007-03-01

    Conventional static automated perimetry provides important clinical information, but its utility is limited by considerable test-retest variability. Fixational eye movements during testing could contribute to variability. To assess this possibility, it is important to know how much sensitivity change would be caused by a given eye movement. To investigate this, we have evaluated the gradient, the rate at which sensitivity changes with location. We tested one eye each, twice within 3 weeks, of 29 patients with glaucoma, 17 young normal subjects and 13 older normal subjects. The 10-2 test pattern with the SITA Standard algorithm was used to assess sensitivity at locations with 2 degrees spacing. Variability and gradient were calculated at individual test locations. Matrix correlations were determined between variability and gradient, and were substantial for the patients with glaucoma. The results were consistent with a substantial contribution to test-retest variability from small fixational eye movements interacting with visual field gradient. Successful characterization of the gradient of sensitivity appears to require sampling at relatively close spacing, as in the 10-2 test pattern.
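    The gradient analyzed above is the rate of change of sensitivity across the 2-degree-spaced 10-2 grid; a minimal sketch of computing it and correlating it with test-retest variability follows (the toy grid stands in for real perimetry data):

    ```python
    import numpy as np

    def local_gradient(sensitivity_db, spacing_deg=2.0):
        """Magnitude of the spatial gradient of sensitivity (dB per degree)
        on a regular visual-field grid."""
        gy, gx = np.gradient(sensitivity_db, spacing_deg)
        return np.hypot(gx, gy)

    # Two visits on a toy 10 x 10 grid standing in for the 10-2 pattern.
    rng = np.random.default_rng(6)
    visit1 = 30 - 0.8 * np.abs(np.arange(10) - 5)[None, :] + rng.normal(0, 1, (10, 10))
    visit2 = visit1 + rng.normal(0, 1.5, (10, 10))

    variability = np.abs(visit2 - visit1)            # test-retest difference
    gradient = local_gradient((visit1 + visit2) / 2)
    r = np.corrcoef(gradient.ravel(), variability.ravel())[0, 1]
    print(f"gradient-variability correlation r = {r:.2f}")
    ```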

  9. The perception of heading during eye movements

    NASA Technical Reports Server (NTRS)

    Royden, Constance S.; Banks, Martin S.; Crowell, James A.

    1992-01-01

    Warren and Hannon (1988, 1990), while studying the perception of heading during eye movements, concluded that people do not require extraretinal information to judge heading with eye/head movements present. Here, heading judgments are examined at higher, more typical eye movement velocities than the extremely slow tracking eye movements used by Warren and Hannon. It is found that people require extraretinal information about eye position to perceive heading accurately under many viewing conditions.

  10. The Eye of a Mathematical Physicist

    NASA Astrophysics Data System (ADS)

    Hepp, Klaus

    2009-03-01

    In this essay we are searching for neural correlates of `doing mathematical physics'. We introduce a toy model of a mathematical physicist, a brain connected with the outside world only by vision and saccadic eye movements and interacting with a computer screen. First, we describe the neuroanatomy of the visuo-saccadic system and Listing's law, which binds saccades and the optics of the eye. Then we explain space-time transformations in the superior colliculus, the performance of a canonical cortical circuit in the frontal eye field and finally the recurrent interaction of both areas, which leads to a coherent percept of space in spite of saccades. This sets the stage in the brain for doing mathematical physics, which is analyzed in simple examples.

  11. Eye Movements during Silent and Oral Reading in a Regular Orthography: Basic Characteristics and Correlations with Childhood Cognitive Abilities and Adolescent Reading Skills

    PubMed Central

    Krieber, Magdalena; Bartl-Pokorny, Katrin D.; Pokorny, Florian B.; Zhang, Dajie; Landerl, Karin; Körner, Christof; Pernkopf, Franz; Pock, Thomas; Einspieler, Christa; Marschik, Peter B.

    2017-01-01

    The present study aimed to define differences between silent and oral reading with respect to spatial and temporal eye movement parameters. Eye movements of 22 German-speaking adolescents (14 females; mean age = 13;6 years;months) were recorded while reading an age-appropriate text silently and orally. Preschool cognitive abilities were assessed at the participants’ age of 5;7 (years;months) using the Kaufman Assessment Battery for Children. The participants’ reading speed and reading comprehension at the age of 13;6 (years;months) were determined using a standardized inventory to evaluate silent reading skills in German readers (Lesegeschwindigkeits- und -verständnistest für Klassen 6–12). The results show that (i) reading mode significantly influenced both spatial and temporal characteristics of eye movement patterns; (ii) articulation decreased the consistency of intraindividual reading performances with regard to a significant number of eye movement parameters; (iii) reading skills predicted the majority of eye movement parameters during silent reading, but influenced only a restricted number of eye movement parameters when reading orally; (iv) differences with respect to a subset of eye movement parameters increased with reading skills; (v) an overall preschool cognitive performance score predicted reading skills at the age of 13;6 (years;months), but not eye movement patterns during either silent or oral reading. However, we found a few significant correlations between preschool performances on subscales of sequential and simultaneous processing and eye movement parameters for both reading modes. Overall, the findings suggest that eye movement patterns depend on the reading mode. Preschool cognitive abilities were more closely related to eye movement patterns of oral than silent reading, while reading skills predicted eye movement patterns during silent reading, but less so during oral reading. PMID:28151950

  12. Analysis of Eye Movements and Linguistic Boundaries in a Text for the Investigation of Japanese Reading Processes

    NASA Astrophysics Data System (ADS)

    Tera, Akemi; Shirai, Kiyoaki; Yuizono, Takaya; Sugiyama, Kozo

    In order to investigate reading processes of Japanese language learners, we have conducted an experiment to record eye movements during Japanese text reading using an eye-tracking system. We showed that Japanese native speakers use “forward and backward jumping eye movements” frequently[13],[14]. In this paper, we analyzed further the same eye tracking data. Our goal is to examine whether Japanese learners fix their eye movements at boundaries of linguistic units such as words, phrases or clauses when they start or end “backward jumping”. We consider conventional linguistic boundaries as well as boundaries empirically defined based on the entropy of the N-gram model. Another goal is to examine the relation between the entropy of the N-gram model and the depth of syntactic structures of sentences. Our analysis shows that (1) Japanese learners often fix their eyes at linguistic boundaries, (2) the average of the entropy is the greatest at the fifth depth of syntactic structures.
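    The N-gram model used for the boundary analysis is not specified in detail here; a minimal character-bigram sketch of the next-symbol entropy, the kind of quantity that can be compared against boundary positions, is given below (the toy corpus is illustrative):

    ```python
    import math
    from collections import Counter, defaultdict

    def next_char_entropy(corpus, text):
        """For each position in `text`, entropy (bits) of the next character
        given the current character, estimated from bigram counts in `corpus`."""
        bigrams = defaultdict(Counter)
        for a, b in zip(corpus, corpus[1:]):
            bigrams[a][b] += 1

        entropies = []
        for ch in text[:-1]:
            counts = bigrams[ch]
            total = sum(counts.values())
            if total == 0:
                entropies.append(float("nan"))
                continue
            h = -sum((c / total) * math.log2(c / total) for c in counts.values())
            entropies.append(h)
        return entropies

    corpus = "私は本を読む。私は水を飲む。彼は本を読む。"   # tiny toy corpus
    print(next_char_entropy(corpus, "私は本を読む。"))
    ```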

  13. Towards free 3D end-point control for robotic-assisted human reaching using binocular eye tracking.

    PubMed

    Maimon-Dror, Roni O; Fernandez-Quesada, Jorge; Zito, Giuseppe A; Konnaris, Charalambos; Dziemian, Sabine; Faisal, A Aldo

    2017-07-01

    Eye-movements are the only directly observable behavioural signals that are highly correlated with actions at the task level, and proactive of body movements and thus reflect action intentions. Moreover, eye movements are preserved in many movement disorders leading to paralysis (or amputees) from stroke, spinal cord injury, Parkinson's disease, multiple sclerosis, and muscular dystrophy among others. Despite this benefit, eye tracking is not widely used as a control interface for robotic systems in movement-impaired patients due to poor human-robot interfaces. We demonstrate here how combining 3D gaze tracking using our GT3D binocular eye tracker with a custom-designed 3D head tracking system and calibration method enables continuous 3D end-point control of a robotic arm support system. The users can move their own hand to any location of the workspace by simply looking at the target and winking once. This purely eye-tracking-based system enables the end-user to retain free head movement and yet achieves high spatial end-point accuracy on the order of 6 cm RMSE error in each dimension and standard deviation of 4 cm. 3D calibration is achieved by moving the robot along a 3-dimensional space-filling Peano curve while the user tracks it with their eyes. This results in a fully automated calibration procedure that yields several thousand calibration points versus standard approaches using a dozen points, resulting in beyond state-of-the-art 3D accuracy and precision.
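    The GT3D calibration along a Peano curve is specific to the authors' hardware; the underlying idea of regressing binocular gaze features onto known 3D target positions can be sketched with an off-the-shelf regression (the feature construction and units below are assumptions):

    ```python
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    # Calibration samples: binocular gaze features (toy stand-ins for left/right
    # gaze angles and a vergence proxy) paired with known 3D target positions (cm).
    rng = np.random.default_rng(7)
    targets = rng.uniform(-30, 30, size=(3000, 3))               # space-filling sweep
    gaze_feats = np.column_stack([targets[:, 0] / 40, targets[:, 1] / 40,
                                  1.0 / (targets[:, 2] + 60)])    # toy optics
    gaze_feats += rng.normal(0, 0.01, gaze_feats.shape)           # measurement noise

    X_tr, X_te, y_tr, y_te = train_test_split(gaze_feats, targets, random_state=0)
    model = make_pipeline(PolynomialFeatures(degree=2), Ridge(alpha=1e-3))
    model.fit(X_tr, y_tr)

    rmse = np.sqrt(np.mean((model.predict(X_te) - y_te) ** 2, axis=0))
    print("per-axis RMSE (cm):", np.round(rmse, 2))
    ```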

  14. Eye Movement Correlates of Acquired Central Dyslexia

    ERIC Educational Resources Information Center

    Schattka, Kerstin I.; Radach, Ralph; Huber, Walter

    2010-01-01

    Based on recent progress in theory and measurement techniques, the analysis of eye movements has become one of the major methodological tools in experimental reading research. Our work uses this approach to advance the understanding of impaired information processing in acquired central dyslexia of stroke patients with aphasia. Up to now there has…

  15. Inactivation of Semicircular Canals Causes Adaptive Increases in Otolith-driven Tilt Responses

    NASA Technical Reports Server (NTRS)

    Angelaki, Dora E.; Newlands, Shawn D.; Dickman, J. David

    2002-01-01

    Growing experimental and theoretical evidence suggests a functional synergy in the processing of otolith and semicircular canal signals for the generation of the vestibulo-ocular reflexes (VORs). In this study we have further tested this functional interaction by quantifying the adaptive changes in the otolith-ocular system during both rotational and translational movements after surgical inactivation of the semicircular canals. For 0.1-0.5 Hz (stimuli for which there is no recovery of responses from the plugged canals), pitch and roll VOR gains recovered during earth-horizontal (but not earth-vertical) axis rotations. Corresponding changes were also observed in eye movements elicited by translational motion (0.1-5 Hz). Specifically, torsional eye movements increased during lateral motion, whereas vertical eye movements increased during fore-aft motion. The findings indicate that otolith signals can be adapted according to a compromised strategy that leads to improved gaze stabilization during motion. Because canal-plugged animals permanently lose the ability to discriminate gravitoinertial accelerations, adapted animals can use the presence of gravity through otolith-driven tilt responses to assist gaze stabilization during earth-horizontal axis rotations.

  16. Eye Movement Disorders

    MedlinePlus

    ... t work properly. There are many kinds of eye movement disorders. Two common ones are Strabismus - a disorder ... of the eyes, sometimes called "dancing eyes" Some eye movement disorders are present at birth. Others develop over ...

  17. Semantic guidance of eye movements in real-world scenes

    PubMed Central

    Hwang, Alex D.; Wang, Hsueh-Cheng; Pomplun, Marc

    2011-01-01

    The perception of objects in our visual world is influenced by not only their low-level visual features such as shape and color, but also their high-level features such as meaning and semantic relations among them. While it has been shown that low-level features in real-world scenes guide eye movements during scene inspection and search, the influence of semantic similarity among scene objects on eye movements in such situations has not been investigated. Here we study guidance of eye movements by semantic similarity among objects during real-world scene inspection and search. By selecting scenes from the LabelMe object-annotated image database and applying Latent Semantic Analysis (LSA) to the object labels, we generated semantic saliency maps of real-world scenes based on the semantic similarity of scene objects to the currently fixated object or the search target. An ROC analysis of these maps as predictors of subjects’ gaze transitions between objects during scene inspection revealed a preference for transitions to objects that were semantically similar to the currently inspected one. Furthermore, during the course of a scene search, subjects’ eye movements were progressively guided toward objects that were semantically similar to the search target. These findings demonstrate substantial semantic guidance of eye movements in real-world scenes and show its importance for understanding real-world attentional control. PMID:21426914

  18. Semantic guidance of eye movements in real-world scenes.

    PubMed

    Hwang, Alex D; Wang, Hsueh-Cheng; Pomplun, Marc

    2011-05-25

    The perception of objects in our visual world is influenced by not only their low-level visual features such as shape and color, but also their high-level features such as meaning and semantic relations among them. While it has been shown that low-level features in real-world scenes guide eye movements during scene inspection and search, the influence of semantic similarity among scene objects on eye movements in such situations has not been investigated. Here we study guidance of eye movements by semantic similarity among objects during real-world scene inspection and search. By selecting scenes from the LabelMe object-annotated image database and applying latent semantic analysis (LSA) to the object labels, we generated semantic saliency maps of real-world scenes based on the semantic similarity of scene objects to the currently fixated object or the search target. An ROC analysis of these maps as predictors of subjects' gaze transitions between objects during scene inspection revealed a preference for transitions to objects that were semantically similar to the currently inspected one. Furthermore, during the course of a scene search, subjects' eye movements were progressively guided toward objects that were semantically similar to the search target. These findings demonstrate substantial semantic guidance of eye movements in real-world scenes and show its importance for understanding real-world attentional control. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Extended Fitts' model of pointing time in eye-gaze input system - Incorporating effects of target shape and movement direction into modeling.

    PubMed

    Murata, Atsuo; Fukunaga, Daichi

    2018-04-01

    This study investigated the effects of target shape and movement direction on pointing time using an eye-gaze input system, and extended Fitts' model to incorporate these factors and enhance its predictive power. The target shape, the target size, the movement distance, and the direction of target presentation were set as within-subject experimental variables. The target shapes included a circle and rectangles with aspect ratios of 1:1, 1:2, 1:3, and 1:4. The movement direction included eight directions: upper, lower, left, right, upper left, upper right, lower left, and lower right. On the basis of these data, a generalized and extended Fitts' model was developed that takes the movement direction and the target shape into account. The generalized and extended model was found to fit the experimental data better and to be more effective for predicting pointing time in a variety of human-computer interaction (HCI) tasks using an eye-gaze input system. Copyright © 2017. Published by Elsevier Ltd.
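
    The abstract does not give the exact functional form of the extended model, but the general approach can be sketched as augmenting the classic Fitts regression, MT = a + b * log2(D/W + 1), with additional regressors for movement direction and target aspect ratio. The code below fits such an augmented model by ordinary least squares on hypothetical pointing-time data; the specific extra predictors are assumptions for illustration, not the authors' terms.

        import numpy as np

        def index_of_difficulty(distance, width):
            """Shannon formulation of Fitts' index of difficulty."""
            return np.log2(distance / width + 1.0)

        def fit_extended_fitts(distance, width, aspect_ratio, direction_deg, mt):
            """Regress pointing time on ID plus shape and direction terms.
            The extra predictors (aspect ratio, sin/cos of direction) are
            illustrative only."""
            ID = index_of_difficulty(distance, width)
            theta = np.deg2rad(direction_deg)
            X = np.column_stack([
                np.ones_like(ID),      # intercept a
                ID,                    # classic Fitts slope b
                aspect_ratio,          # target-shape term
                np.sin(theta),         # direction terms
                np.cos(theta),
            ])
            coef, *_ = np.linalg.lstsq(X, mt, rcond=None)
            return coef

        # Hypothetical data: 8 directions x several target geometries.
        rng = np.random.default_rng(1)
        n = 200
        D = rng.uniform(100, 600, n)          # movement distance (px)
        W = rng.uniform(20, 80, n)            # target width (px)
        AR = rng.choice([1, 2, 3, 4], n)      # rectangle aspect ratio
        ang = rng.choice(np.arange(0, 360, 45), n)
        MT = 300 + 150 * index_of_difficulty(D, W) + 20 * AR + rng.normal(0, 30, n)
        print(fit_extended_fitts(D, W, AR, ang, MT))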

  20. Pioneers of eye movement research

    PubMed Central

    Wade, Nicholas J

    2010-01-01

    Recent advances in the technology affording eye movement recordings carry the risk of neglecting past achievements. Without the assistance of this modern armoury, great strides were made in describing the ways the eyes move. For Aristotle the fundamental features of eye movements were binocular, and he described the combined functions of the eyes. This was later given support using simple procedures like placing a finger over the eyelid of the closed eye and culminated in Hering's law of equal innervation. However, the overriding concern in the 19th century was with eye position rather than eye movements. Appreciating discontinuities of eye movements arose from studies of vertigo. The characteristics of nystagmus were recorded before those of saccades and fixations. Eye movements during reading were described by Hering and by Lamare in 1879; both used similar techniques of listening to sounds made during contractions of the extraocular muscles. Photographic records of eye movements during reading were made by Dodge early in the 20th century, and this stimulated research using a wider array of patterns. In the mid-20th century attention shifted to the stability of the eyes during fixation, with the emphasis on involuntary movements. The contributions of pioneers from Aristotle to Yarbus are outlined. PMID:23396982

  1. Measuring eye movements during locomotion: filtering techniques for obtaining velocity signals from a video-based eye monitor

    NASA Technical Reports Server (NTRS)

    Das, V. E.; Thomas, C. W.; Zivotofsky, A. Z.; Leigh, R. J.

    1996-01-01

    Video-based eye-tracking systems are especially suited to studying eye movements during naturally occurring activities such as locomotion, but eye velocity records suffer from broad band noise that is not amenable to conventional filtering methods. We evaluated the effectiveness of combined median and moving-average filters by comparing prefiltered and postfiltered records made synchronously with a video eye-tracker and the magnetic search coil technique, which is relatively noise free. Root-mean-square noise was reduced by half, without distorting the eye velocity signal. To illustrate the practical use of this technique, we studied normal subjects and patients with deficient labyrinthine function and compared their ability to hold gaze on a visual target that moved with their heads (cancellation of the vestibulo-ocular reflex). Patients and normal subjects performed similarly during active head rotation but, during locomotion, patients held their eyes more steadily on the visual target than did subjects.
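
    A combined median and moving-average filter of the kind described can be implemented in a few lines; the window lengths below are placeholders, not the values used in the study.

        import numpy as np

        def median_filter(x, k):
            """Running median of odd window length k (edges padded by reflection)."""
            pad = k // 2
            xp = np.pad(x, pad, mode="reflect")
            return np.array([np.median(xp[i:i + k]) for i in range(len(x))])

        def moving_average(x, k):
            """Running mean of window length k."""
            return np.convolve(x, np.ones(k) / k, mode="same")

        def filter_eye_velocity(velocity, median_k=5, mean_k=5):
            """Median filter first (removes impulsive video noise), then smooth."""
            return moving_average(median_filter(velocity, median_k), mean_k)

        # Example: a noisy 60 Hz eye-velocity trace with occasional spikes.
        rng = np.random.default_rng(2)
        t = np.arange(0, 2, 1 / 60)
        clean = 10 * np.sin(2 * np.pi * 0.5 * t)          # deg/s
        noisy = clean + rng.normal(0, 4, t.size) + (rng.random(t.size) < 0.05) * 40
        smoothed = filter_eye_velocity(noisy)
        print(np.sqrt(((smoothed - clean) ** 2).mean()))   # RMS error after filtering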

  2. Towards understanding addiction factors of mobile devices: An eye tracking study on effect of screen size.

    PubMed

    Wibirama, Sunu; Nugroho, Hanung A

    2017-07-01

    Mobile device addiction has become an important research topic in cognitive science, mental health, and human-machine interaction. Previous work observed mobile device addiction by logging device activity. Although immersion has been identified as a significant predictor of video game addiction, the addiction factors of mobile devices had not previously been investigated with behavioral measurements. In this research, we demonstrate the use of eye tracking to observe the effect of screen size on the experience of immersion. We compared subjective judgments with eye movement analysis. Non-parametric analysis of immersion scores shows that screen size affects the experience of immersion (p < 0.05). Furthermore, our experimental results suggest that fixational eye movements may be used as an indicator in future investigations of mobile device addiction. The results are also useful for developing guidelines and intervention strategies to deal with smartphone addiction.

  3. A 2D eye gaze estimation system with low-resolution webcam images

    NASA Astrophysics Data System (ADS)

    Ince, Ibrahim Furkan; Kim, Jin Woo

    2011-12-01

    In this article, a low-cost system for 2D eye gaze estimation with low-resolution webcam images is presented. Two algorithms are proposed for this purpose: one for eyeball detection with a stable approximate pupil center, and the other for detecting the direction of eye movements. The eyeball is detected using the deformable angular integral search by minimum intensity (DAISMI) algorithm. The deformable template-based 2D gaze estimation (DTBGE) algorithm is employed as a noise filter for producing stable movement decisions. While DTBGE employs binary images, DAISMI employs gray-scale images. Right and left eye estimates are evaluated separately. DAISMI finds the stable approximate pupil-center location by calculating the mass center of the eyeball border vertices, which is used for the initial deformable template alignment. DTBGE starts from this initial alignment and updates the template alignment with the resulting eye movements and eyeball size frame by frame. The horizontal and vertical deviation of eye movements, normalized by eyeball size, is treated as directly proportional to the deviation of cursor movements for a given screen size and resolution. The core advantage of the system is that it does not employ the real pupil center as a reference point for gaze estimation, which makes it more robust to corneal reflection. Visual angle accuracy is used to evaluate and benchmark the system, and experimental results demonstrating its effectiveness are presented.
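
    The proportional mapping described, from pupil-center deviation normalized by eyeball size to cursor deviation on a screen of known resolution, can be sketched as follows; the function name, gain constant, and default screen size are illustrative assumptions, not the authors' implementation.

        def gaze_to_cursor(dx_eye, dy_eye, eyeball_w, eyeball_h,
                           screen_w=1920, screen_h=1080, gain=1.0):
            """Map pupil-center deviation (pixels, relative to the estimated
            stable center) to a cursor position, scaling by eyeball size so
            the mapping is roughly proportional, as described above."""
            cx, cy = screen_w / 2, screen_h / 2
            x = cx + gain * (dx_eye / eyeball_w) * screen_w
            y = cy + gain * (dy_eye / eyeball_h) * screen_h
            # Clamp to the visible screen area.
            return (min(max(x, 0), screen_w - 1), min(max(y, 0), screen_h - 1))

        # A 3-pixel rightward deviation of the pupil center in a 60x40-pixel
        # eyeball region moves the cursor ~96 pixels right of screen center.
        print(gaze_to_cursor(3, 0, eyeball_w=60, eyeball_h=40))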

  4. Eye movement perimetry in glaucoma.

    PubMed

    Trope, G E; Eizenman, M; Coyle, E

    1989-08-01

    Present-day computerized perimetry is often inaccurate and unreliable owing to the need to maintain central fixation over long periods while repressing the normal response to presentation of peripheral stimuli. We tested a new method of perimetry that does not require prolonged central fixation. During this test eye movements were encouraged on presentation of a peripheral target. Twenty-three eyes were studied with an Octopus perimeter, with a technician monitoring eye movements. The sensitivity was 100% and the specificity 23%. The low specificity was due to the technician's inability to accurately monitor small eye movements in the central 6 degrees field. If small eye movements are monitored accurately with an eye tracker, eye movement perimetry could become an alternative method to standard perimetry.

  5. Three-dimensional ocular kinematics during eccentric rotations: evidence for functional rather than mechanical constraints

    NASA Technical Reports Server (NTRS)

    Angelaki, Dora E.

    2003-01-01

    Previous studies have reported that the translational vestibuloocular reflex (TVOR) follows a three-dimensional (3D) kinematic behavior that is more similar to that of visually guided eye movements, such as pursuit, than to that of the rotational VOR (RVOR). Accordingly, TVOR rotation axes tilted with eye position toward an eye-fixed reference frame rather than staying relatively fixed in the head as in the RVOR. This difference arises because, contrary to the RVOR, where peripheral image stability is functionally important, the TVOR, like pursuit and saccades, serves to stabilize images on the fovea. During most natural head and body movements, both VORs are simultaneously activated. In the present study, we have investigated in rhesus monkeys the 3D kinematics of the combined VOR during yaw rotation about eccentric axes. The experiments were motivated by and quantitatively compared with the predictions of two distinct hypotheses. According to the first (fixed-rule) hypothesis, an eye-position-dependent torsion is computed downstream of a site for RVOR/TVOR convergence, and the combined VOR axis would tilt through an angle that is proportional to gaze angle and independent of the relative RVOR/TVOR contributions to the total eye movement. This hypothesis would be consistent with the recently postulated mechanical constraints imposed by extraocular muscle pulleys. According to the second (image-stabilization) hypothesis, an eye-position-dependent torsion is computed separately for the RVOR and the TVOR components, implying a processing that takes place upstream of a site for RVOR/TVOR convergence. The latter hypothesis is based on the functional requirement that the 3D kinematics of the combined VOR should be governed by the need to keep images stable on the fovea, with slip on the peripheral retina being dependent on the different functional goals of the two VORs. In contrast to the fixed-rule hypothesis, the data demonstrated a variable eye-position-dependent torsion for the combined VOR that was different for synergistic versus antagonistic RVOR/TVOR interactions. Furthermore, not only were the eye-velocity tilt slopes of the combined VOR as much as 10 times larger than what would be expected based on extraocular muscle pulley location, but also eye velocity during antagonistic RVOR/TVOR combinations often tilted opposite to gaze. These results are qualitatively and quantitatively consistent with the image-stabilization hypothesis, suggesting that the eye-position-dependent torsion is computed separately for the RVOR and the TVOR and that the 3D kinematics of the combined VOR are dependent on functional rather than mechanical constraints.

  6. How detrimental is eye movement during photorefractive keratectomy to the patient's postoperative vision?

    NASA Astrophysics Data System (ADS)

    Taylor, Natalie M.; van Saarloos, Paul P.; Eikelboom, Robert H.

    2000-06-01

    This study aimed to gauge the effect of the patient's eye movement during photorefractive keratectomy (PRK) on postoperative vision. A computer simulation of both the PRK procedure and the visual outcome was performed. The PRK simulation incorporated the pattern of movement of the laser beam to perform a given correction, the beam characteristics, an initial corneal profile, and an eye movement scenario, and generated the corrected corneal profile. The regrowth of the epithelium was simulated by selecting the smoothing filter which, when applied to a corrected cornea with no patient eye movement, produced ray-tracing results similar to the original corneal model. Ray tracing of several objects, such as letters of various contrasts and sizes, was performed to assess the quality of the postoperative vision. Eye movement scenarios included no eye movement, constant decentration, and normally distributed random eye movement of varying magnitudes. Random eye movement of even small amounts, such as 50 microns, reduces the contrast sensitivity of the image. Constant decentration decenters the projected image on the retina and in extreme cases can lead to astigmatism. Eye movements of the magnitude expected during laser refractive surgery have minimal effect on the final visual outcome.

  7. Coordination of eye and head components of movements evoked by stimulation of the paramedian pontine reticular formation.

    PubMed

    Gandhi, Neeraj J; Barton, Ellen J; Sparks, David L

    2008-07-01

    Constant frequency microstimulation of the paramedian pontine reticular formation (PPRF) in head-restrained monkeys evokes a constant velocity eye movement. Since the PPRF receives significant projections from structures that control coordinated eye-head movements, we asked whether stimulation of the pontine reticular formation in the head-unrestrained animal generates a combined eye-head movement or only an eye movement. Microstimulation of most sites yielded a constant-velocity gaze shift executed as a coordinated eye-head movement, although eye-only movements were evoked from some sites. The eye and head contributions to the stimulation-evoked movements varied across stimulation sites and were drastically different from the lawful relationship observed for visually-guided gaze shifts. These results indicate that the microstimulation activated elements that issued movement commands to the extraocular and, for most sites, neck motoneurons. In addition, the stimulation-evoked changes in gaze were similar in the head-restrained and head-unrestrained conditions despite the assortment of eye and head contributions, suggesting that the vestibulo-ocular reflex (VOR) gain must be near unity during the coordinated eye-head movements evoked by stimulation of the PPRF. These findings contrast the attenuation of VOR gain associated with visually-guided gaze shifts and suggest that the vestibulo-ocular pathway processes volitional and PPRF stimulation-evoked gaze shifts differently.

  8. Individual predictions of eye-movements with dynamic scenes

    NASA Astrophysics Data System (ADS)

    Barth, Erhardt; Drewes, Jan; Martinetz, Thomas

    2003-06-01

    We present a model that predicts saccadic eye-movements and can be tuned to a particular human observer who is viewing a dynamic sequence of images. Our work is motivated by applications that involve gaze-contingent interactive displays on which information is displayed as a function of gaze direction. The approach therefore differs from standard approaches in two ways: (1) we deal with dynamic scenes, and (2) we provide means of adapting the model to a particular observer. As an indicator for the degree of saliency we evaluate the intrinsic dimension of the image sequence within a geometric approach implemented by using the structure tensor. Out of these candidate saliency-based locations, the currently attended location is selected according to a strategy found by supervised learning. The data are obtained with an eye-tracker and subjects who view video sequences. The selection algorithm receives candidate locations of current and past frames and a limited history of locations attended in the past. We use a linear mapping that is obtained by minimizing the quadratic difference between the predicted and the actually attended location by gradient descent. Being linear, the learned mapping can be quickly adapted to the individual observer.
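
    The learning step described, a linear map from candidate salient locations and a short fixation history to the next attended location, fitted by gradient descent on the squared prediction error, can be sketched as follows. The feature layout, learning rate, and epoch count are assumptions for illustration.

        import numpy as np

        def train_linear_predictor(features, attended, lr=0.05, epochs=500):
            """Fit W minimizing ||features @ W - attended||^2 by batch gradient descent.
            features: (N, F) candidate-location and history features per frame;
            attended: (N, 2) location actually attended on the next frame."""
            N, F = features.shape
            W = np.zeros((F, 2))
            for _ in range(epochs):
                err = features @ W - attended          # (N, 2) prediction error
                grad = 2.0 * features.T @ err / N      # gradient of mean squared error
                W -= lr * grad
            return W

        # Synthetic example with a hypothetical 6-dimensional feature vector.
        rng = np.random.default_rng(3)
        X = rng.normal(size=(1000, 6))
        W_true = rng.normal(size=(6, 2))
        Y = X @ W_true + rng.normal(scale=0.05, size=(1000, 2))
        W_hat = train_linear_predictor(X, Y)
        print(np.abs(W_hat - W_true).max())   # small residual error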

  9. Eye movement-invariant representations in the human visual system.

    PubMed

    Nishimoto, Shinji; Huth, Alexander G; Bilenko, Natalia Y; Gallant, Jack L

    2017-01-01

    During natural vision, humans make frequent eye movements but perceive a stable visual world. It is therefore likely that the human visual system contains representations of the visual world that are invariant to eye movements. Here we present an experiment designed to identify visual areas that might contain eye-movement-invariant representations. We used functional MRI to record brain activity from four human subjects who watched natural movies. In one condition subjects were required to fixate steadily, and in the other they were allowed to freely make voluntary eye movements. The movies used in each condition were identical. We reasoned that the brain activity recorded in a visual area that is invariant to eye movement should be similar under fixation and free viewing conditions. In contrast, activity in a visual area that is sensitive to eye movement should differ between fixation and free viewing. We therefore measured the similarity of brain activity across repeated presentations of the same movie within the fixation condition, and separately between the fixation and free viewing conditions. The ratio of these measures was used to determine which brain areas are most likely to contain eye movement-invariant representations. We found that voxels located in early visual areas are strongly affected by eye movements, while voxels in ventral temporal areas are only weakly affected by eye movements. These results suggest that the ventral temporal visual areas contain a stable representation of the visual world that is invariant to eye movements made during natural vision.
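
    The voxel-wise measure described, comparing response similarity across repeats within fixation against similarity between fixation and free viewing, can be sketched as a simple ratio of correlations over hypothetical response time courses. The exact direction of the ratio and use of Pearson correlation are assumptions for illustration.

        import numpy as np

        def invariance_ratio(fix_rep1, fix_rep2, fix_mean, free):
            """Per-voxel ratio of fixation-vs-free-viewing similarity to
            within-fixation (repeat-to-repeat) similarity. Inputs are
            (time, voxels) responses to the same movie; higher values
            suggest responses less affected by eye movements."""
            n_vox = fix_rep1.shape[1]
            within = np.empty(n_vox)
            across = np.empty(n_vox)
            for v in range(n_vox):
                within[v] = np.corrcoef(fix_rep1[:, v], fix_rep2[:, v])[0, 1]
                across[v] = np.corrcoef(fix_mean[:, v], free[:, v])[0, 1]
            return across / within

        # Synthetic demo: 300 time points, 5 voxels sharing a common signal.
        rng = np.random.default_rng(4)
        signal = rng.normal(size=(300, 5))
        fix1 = signal + 0.5 * rng.normal(size=(300, 5))
        fix2 = signal + 0.5 * rng.normal(size=(300, 5))
        free = signal + 0.5 * rng.normal(size=(300, 5))
        print(invariance_ratio(fix1, fix2, (fix1 + fix2) / 2, free))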

  10. Learning optimal eye movements to unusual faces

    PubMed Central

    Peterson, Matthew F.; Eckstein, Miguel P.

    2014-01-01

    Eye movements, which guide the fovea’s high resolution and computational power to relevant areas of the visual scene, are integral to efficient, successful completion of many visual tasks. How humans modify their eye movements through experience with their perceptual environments, and its functional role in learning new tasks, has not been fully investigated. Here, we used a face identification task where only the mouth discriminated exemplars to assess if, how, and when eye movement modulation may mediate learning. By interleaving trials of unconstrained eye movements with trials of forced fixation, we attempted to separate the contributions of eye movements and covert mechanisms to performance improvements. Without instruction, a majority of observers substantially increased accuracy and learned to direct their initial eye movements towards the optimal fixation point. The proximity of an observer’s default face identification eye movement behavior to the new optimal fixation point and the observer’s peripheral processing ability were predictive of performance gains and eye movement learning. After practice in a subsequent condition in which observers were directed to fixate different locations along the face, including the relevant mouth region, all observers learned to make eye movements to the optimal fixation point. In this fully learned state, augmented fixation strategy accounted for 43% of total efficiency improvements while covert mechanisms accounted for the remaining 57%. The findings suggest a critical role for eye movement planning to perceptual learning, and elucidate factors that can predict when and how well an observer can learn a new task with unusual exemplars. PMID:24291712

  11. Volitional control of anticipatory ocular pursuit responses under stabilised image conditions in humans.

    PubMed

    Barnes, G; Goodbody, S; Collins, S

    1995-01-01

    Ocular pursuit responses have been examined in humans in three experiments in which the pursuit target image has been fully or partially stabilised on the fovea by feeding a recorded eye movement signal back to drive the target motion. The objective was to establish whether subjects could volitionally control smooth eye movement to reproduce trajectories of target motion in the absence of a concurrent target motion stimulus. In experiment 1 subjects were presented with a target moving with a triangular waveform in the horizontal axis with a frequency of 0.325 Hz and velocities of +/- 10-50 degrees/s. The target was illuminated twice per cycle for pulse durations (PD) of 160-640 ms as it passed through the centre position; otherwise subjects were in darkness. Subjects initially tracked the target motion in a conventional closed-loop mode for four cycles. Prior to the next target presentation the target image was stabilised on the fovea, so that any target motion generated resulted solely from volitional eye movement. Subjects continued to make anticipatory smooth eye movements both to the left and the right with a velocity trajectory similar to that observed in the closed-loop phase. Peak velocity in the stabilised-image mode was highly correlated with that in the prior closed-loop phase, but was slightly less (84% on average). In experiment 2 subjects were presented with a continuously illuminated target that was oscillated sinusoidally at frequencies of 0.2-1.34 Hz and amplitudes of +/- 5-20 degrees. After four cycles of closed-loop stimulation the image was stabilised on the fovea at the time of peak target displacement. Subjects continued to generate an oscillatory smooth eye velocity pattern that mimicked the sinusoidal motion of the previous closed-loop phase for at least three further cycles. The peak eye velocity generated ranged from 57-95% of that in the closed-loop phase at frequencies up to 0.8 Hz but decreased significantly at 1.34 Hz. In experiment 3 subjects were presented with a stabilised display throughout and generated smooth eye movements with peak velocity up to 84 degrees/s in the complete absence of any prior external target motion stimulus, by transferring their attention alternately to left and right of the centre of the display. Eye velocity was found to be dependent on the eccentricity of the centre of attention and the frequency of alternation. When the target was partially stabilised on the retina by feeding back only a proportion (Kf = 0.6-0.9) of the eye movement signal to drive the target, subjects were still able to generate smooth movements at will, even though the display did not move as far or as fast as the eye. Peak eye velocity decreased as Kf decreased, suggesting that there was a continuous competitive interaction between the volitional drive and the visual feedback provided by the relative motion of the display with respect to the retina. These results support the evidence for two separate mechanisms of smooth eye movement control in ocular pursuit: reflex control from retinal velocity error feedback and volitional control from an internal source. Arguments are presented to indicate how smooth pursuit may be controlled by matching a voluntarily initiated estimate of the required smooth movement, normally derived from storage of past re-afferent information, against current visual feedback information. Such a mechanism allows preemptive smooth eye movements to be made that can overcome the inherent delays in the visual feedback pathway.

  12. Hybrid EEG--Eye Tracker: Automatic Identification and Removal of Eye Movement and Blink Artifacts from Electroencephalographic Signal.

    PubMed

    Mannan, Malik M Naeem; Kim, Shinjung; Jeong, Myung Yung; Kamran, M Ahmad

    2016-02-19

    Contamination of eye movement and blink artifacts in electroencephalogram (EEG) recordings makes the analysis of EEG data more difficult and could result in misleading findings. Efficient removal of these artifacts from EEG data is an essential step in improving classification accuracy for the development of brain-computer interfaces (BCIs). In this paper, we propose an automatic framework based on independent component analysis (ICA) and system identification to identify and remove ocular artifacts from EEG data using a hybrid EEG and eye tracker system. The performance of the proposed algorithm is illustrated using experimental and standard EEG datasets. The proposed algorithm not only removes the ocular artifacts from the artifactual zone but also preserves the neuronal-activity-related EEG signals in the non-artifactual zone. Comparison with two state-of-the-art techniques, namely ADJUST-based ICA and REGICA, reveals the significantly improved performance of the proposed algorithm for removing eye movement and blink artifacts from EEG data. Additionally, results demonstrate that the proposed algorithm achieves lower relative error and higher mutual information values between the corrected EEG and artifact-free EEG data.
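
    The identification rule in the paper is more elaborate, but the general pipeline, decomposing the EEG with ICA, flagging components that track a concurrent eye-movement reference channel, and reconstructing without them, can be sketched with scikit-learn's FastICA. The correlation threshold and synthetic data are placeholders.

        import numpy as np
        from sklearn.decomposition import FastICA

        def remove_ocular_components(eeg, eye_ref, threshold=0.4, n_components=None):
            """Remove ICA components correlated with an eye-movement reference.
            eeg: (n_samples, n_channels) EEG; eye_ref: (n_samples,) gaze or EOG
            trace recorded synchronously. Returns the reconstructed EEG."""
            ica = FastICA(n_components=n_components, random_state=0, max_iter=1000)
            sources = ica.fit_transform(eeg)              # (n_samples, n_components)
            for c in range(sources.shape[1]):
                r = np.corrcoef(sources[:, c], eye_ref)[0, 1]
                if abs(r) > threshold:                    # flagged as ocular
                    sources[:, c] = 0.0
            return ica.inverse_transform(sources)

        # Synthetic demo: 4 "EEG" channels contaminated by a shared ocular signal.
        rng = np.random.default_rng(5)
        n = 5000
        neural = rng.normal(size=(n, 4))
        ocular = np.cumsum(rng.normal(size=n))
        ocular -= ocular.mean()
        eeg = neural + np.outer(ocular, [0.8, 0.6, 0.2, 0.1])
        cleaned = remove_ocular_components(eeg, ocular)
        print(np.corrcoef(cleaned[:, 0], ocular)[0, 1])   # much closer to zero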

  13. The Role of Linear Acceleration in Visual-Vestibular Interactions and Implications in Aircraft Operations

    NASA Technical Reports Server (NTRS)

    Correia, Manning J.; Luke, Brian L.; McGrath, Braden J.; Clark, John B.; Rupert, Angus H.

    1996-01-01

    While considerable attention has been given to visual-vestibular interaction (VVI) during angular motion of the head, as might occur during an aircraft spin, much less attention has been given to VVI during linear motion of the head. Such interaction might occur, for example, while viewing a stationary or moving display during vertical take-off and landing operations. Research into linear VVI, particularly during prolonged periods of linear acceleration, has been hampered by the unavailability of a programmable translator capable of large excursions. We collaborated with Otis Elevator Co. and used their research tower and elevator, whose motion could be digitally programmed, to vertically translate human subjects over a distance of 92.3 meters with a peak linear acceleration of 2 m/s^2. During pulsatile or sinusoidal translation, the subjects viewed moving stripes (an optokinetic stimulus) or a fixed point source (a light-emitting diode display), respectively. It was generally found that: the direction of linear acceleration relative to the cardinal head axes and the direction of the slow component of optokinetic nystagmus (OKN) determined the extent of VVI during concomitant stripe motion and linear acceleration; acceleration along the z head axis (A_z) produced the largest VVI, particularly when the slow component of OKN was in the same direction as the eye movements produced by the linear acceleration; and eye movements produced by linear acceleration were suppressed by viewing a fixed target at frequencies below 10 Hz, but above this frequency the suppression produced by VVI was removed. Finally, as demonstrated in non-human primates, vergence of the eyes appears to modulate the vertical eye movement response to linear acceleration in humans.

  14. Fetal eye movements on magnetic resonance imaging.

    PubMed

    Woitek, Ramona; Kasprian, Gregor; Lindner, Christian; Stuhr, Fritz; Weber, Michael; Schöpf, Veronika; Brugger, Peter C; Asenbaum, Ulrika; Furtner, Julia; Bettelheim, Dieter; Seidl, Rainer; Prayer, Daniela

    2013-01-01

    Eye movements are the physical expression of upper fetal brainstem function. Our aim was to identify and differentiate specific types of fetal eye movement patterns using dynamic MRI sequences. Their occurrence as well as the presence of conjugated eyeball motion and consistently parallel eyeball position was systematically analyzed. Dynamic SSFP sequences were acquired in 72 singleton fetuses (17-40 GW, three age groups [17-23 GW, 24-32 GW, 33-40 GW]). Fetal eye movements were evaluated according to a modified classification originally published by Birnholz (1981): Type 0: no eye movements; Type I: single transient deviations; Type Ia: fast deviation, slower reposition; Type Ib: fast deviation, fast reposition; Type II: single prolonged eye movements; Type III: complex sequences; and Type IV: nystagmoid. In 95.8% of fetuses, the evaluation of eye movements was possible using MRI, with a mean acquisition time of 70 seconds. Due to head motion, 4.2% of the fetuses and 20.1% of all dynamic SSFP sequences were excluded. Eye movements were observed in 45 fetuses (65.2%). Significant differences between the age groups were found for Type I (p = 0.03), Type Ia (p = 0.031), and Type IV eye movements (p = 0.033). Consistently parallel bulbs were found in 27.3-45%. In human fetuses, different eye movement patterns can be identified and described by MRI in utero. In addition to the originally classified eye movement patterns, a novel subtype has been observed, which apparently characterizes an important step in fetal brainstem development. We evaluated, for the first time, eyeball position in fetuses. Ultimately, the assessment of fetal eye movements by MRI yields the potential to identify early signs of brainstem dysfunction, as encountered in brain malformations such as Chiari II or molar tooth malformations.

  15. Soft, Conformal Bioelectronics for a Wireless Human-Wheelchair Interface

    PubMed Central

    Mishra, Saswat; Norton, James J. S.; Lee, Yongkuk; Lee, Dong Sup; Agee, Nicolas; Chen, Yanfei; Chun, Youngjae; Yeo, Woon-Hong

    2017-01-01

    There are more than 3 million people in the world whose mobility relies on wheelchairs. Recent advances in engineering technology enable more intuitive, easy-to-use rehabilitation systems. A human-machine interface that uses non-invasive, electrophysiological signals can allow systematic interaction between humans and devices; for example, eye movement-based wheelchair control. However, existing machine-interface platforms are obtrusive, uncomfortable, and often cause skin irritation, as they require a metal electrode affixed to the skin with a gel and acrylic pad. Here, we introduce a bioelectronic system that makes dry, conformal contact with the skin. The mechanically comfortable sensor records high-fidelity electrooculograms, comparable to those from a conventional gel electrode. Quantitative signal analysis and infrared thermographs show the advantages of the soft biosensor for an ergonomic human-machine interface. A classification algorithm with an optimized set of features achieves an accuracy of 94% across five eye movements. A Bluetooth-enabled system incorporating the soft bioelectronics demonstrates precise, hands-free control of a robotic wheelchair via electrooculograms. PMID:28152485

  16. General purpose algorithms for characterization of slow and fast phase nystagmus

    NASA Technical Reports Server (NTRS)

    Lessard, Charles S.

    1987-01-01

    Toward a better understanding of the vestibular and optokinetic systems and their roles in space motion sickness, eye movement responses to various dynamic stimuli are measured. The vestibulo-ocular reflex (VOR) and the optokinetic response, as these eye movement responses are known, consist of slow-phase and fast-phase nystagmus. The specific objective is to develop the software programs necessary to characterize the vestibulo-ocular and optokinetic responses by distinguishing between the two phases of nystagmus. The overall program is designed to handle large volumes of highly variable data with minimal operator interaction. The programs include digital filters, differentiation, identification of fast phases, and reconstruction of the slow phase with a least-squares fit, such that sinusoidal or pseudorandom data may be processed with accurate results. The resultant waveform, the slow-phase eye velocity, serves as input to the spectral analysis programs previously developed for NASA to analyze nystagmus responses to pseudorandom angular velocity inputs.
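
    The core of such a program, velocity estimation, fast-phase detection by a velocity threshold, and reconstruction of slow-phase velocity by excising the fast phases, can be sketched as follows. The sampling rate, threshold, and interpolation scheme are placeholders rather than the values used in the NASA software.

        import numpy as np

        def slow_phase_velocity(position, fs=500.0, vel_threshold=80.0):
            """Return eye velocity with fast phases (quick resets) removed.
            position: eye position in degrees sampled at fs Hz; samples whose
            absolute velocity exceeds vel_threshold deg/s are treated as fast
            phases and interpolated from neighbouring slow-phase samples."""
            velocity = np.gradient(position) * fs            # deg/s
            fast = np.abs(velocity) > vel_threshold
            slow_idx = np.flatnonzero(~fast)
            cleaned = velocity.copy()
            cleaned[fast] = np.interp(np.flatnonzero(fast), slow_idx,
                                      velocity[slow_idx])
            return cleaned

        # Demo: 1 Hz sinusoidal slow phase with resetting quick phases.
        fs = 500.0
        t = np.arange(0, 2, 1 / fs)
        position = 10 * np.sin(2 * np.pi * 1.0 * t)
        for i in range(250, len(t), 250):      # a resetting saccade every 0.5 s
            position[i:] -= 5.0
        spv = slow_phase_velocity(position, fs)
        print(spv[:5])                         # slow-phase velocity, deg/s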

  17. Immaturity of the Oculomotor Saccade and Vergence Interaction in Dyslexic Children: Evidence from a Reading and Visual Search Study

    PubMed Central

    Bucci, Maria Pia; Nassibi, Naziha; Gerard, Christophe-Loic; Bui-Quoc, Emmanuel; Seassau, Magali

    2012-01-01

    Studies comparing binocular eye movements during reading and visual search in dyslexic children are, to our knowledge, nonexistent. In the present study we examined ocular motor characteristics in dyslexic children versus two groups of non-dyslexic children matched for chronological age or reading age. Binocular eye movements were recorded with an infrared system (mobileEBT®, e(ye)BRAIN) in twelve dyslexic children (mean age 11 years) and in groups of chronological-age-matched (N = 9) and reading-age-matched (N = 10) non-dyslexic children. Two visual tasks were used: text reading and visual search. Independently of the task, the ocular motor behavior of dyslexic children was similar to that reported for reading-age-matched non-dyslexic children: more numerous and longer fixations, as well as poor binocular coordination during and after the saccades. In contrast, chronological-age-matched non-dyslexic children showed fewer and shorter fixations in the reading task than in the visual search task; furthermore, their saccades were well yoked in both tasks. The atypical eye movement patterns observed in dyslexic children suggest a deficiency in visual attentional processing as well as an immaturity of the interaction between the ocular motor saccade and vergence systems. PMID:22438934

  18. Contribution of the frontal eye field to gaze shifts in the head-unrestrained rhesus monkey: neuronal activity.

    PubMed

    Knight, T A

    2012-12-06

    The frontal eye field (FEF) has a strong influence on saccadic eye movements with the head restrained. With the head unrestrained, eye saccades combine with head movements to produce large gaze shifts, and microstimulation of the FEF evokes both eye and head movements. To test whether the dorsomedial FEF provides commands for the entire gaze shift or its separate eye and head components, we recorded extracellular single-unit activity in monkeys trained to make large head-unrestrained gaze shifts. We recorded 80 units active during gaze shifts, and closely examined 26 of these that discharged a burst of action potentials that preceded horizontal gaze movements. These units were movement or visuomovement related and most exhibited open movement fields with respect to amplitude. To reveal the relations of burst parameters to gaze, eye, and/or head movement metrics, we used behavioral dissociations of gaze, eye, and head movements and linear regression analyses. The burst number of spikes (NOS) was strongly correlated with movement amplitude and burst temporal parameters were strongly correlated with movement temporal metrics for eight gaze-related burst neurons and five saccade-related burst neurons. For the remaining 13 neurons, the NOS was strongly correlated with the head movement amplitude, but burst temporal parameters were most strongly correlated with eye movement temporal metrics (head-eye-related burst neurons, HEBNs). These results suggest that FEF units do not encode a command for the unified gaze shift only; instead, different units may carry signals related to the overall gaze shift or its eye and/or head components. Moreover, the HEBNs exhibit bursts whose magnitude and timing may encode a head displacement signal and a signal that influences the timing of the eye saccade, thereby serving as a mechanism for coordinating the eye and head movements of a gaze shift. Copyright © 2012 IBRO. Published by Elsevier Ltd. All rights reserved.

  19. Effects of bilateral eye movements on the retrieval of item, associative, and contextual information.

    PubMed

    Parker, Andrew; Relph, Sarah; Dagnall, Neil

    2008-01-01

    Two experiments are reported that investigate the effects of saccadic bilateral eye movements on the retrieval of item, associative, and contextual information. Experiment 1 compared the effects of bilateral versus vertical versus no eye movements on tests of item recognition, followed by remember-know responses and associative recognition. Supporting previous research, bilateral eye movements enhanced item recognition by increasing the hit rate and decreasing the false alarm rate. Analysis of remember-know responses indicated that eye movement effects were accompanied by increases in remember responses. The test of associative recognition found that bilateral eye movements increased correct responses to intact pairs and decreased false alarms to rearranged pairs. Experiment 2 assessed the effects of eye movements on the recall of intrinsic (color) and extrinsic (spatial location) context. Bilateral eye movements increased correct recall for both types of context. The results are discussed within the framework of dual-process models of memory and the possible neural underpinnings of these effects are considered.

  20. System for assisted mobility using eye movements based on electrooculography.

    PubMed

    Barea, Rafael; Boquete, Luciano; Mazo, Manuel; López, Elena

    2002-12-01

    This paper describes an eye-control method based on electrooculography (EOG) to develop a system for assisted mobility. One of its most important features is its modularity, making it adaptable to the particular needs of each user according to the type and degree of handicap involved. An eye model based on the electrooculographic signal is proposed and its validity is studied. Several human-machine interfaces (HMIs) based on EOG are discussed, with our study focusing on guiding and controlling a wheelchair for disabled people, where control is effected by eye movements within the socket. Different techniques and guidance strategies are then presented, with comments on the advantages and disadvantages of each. The system consists of a standard electric wheelchair with an on-board computer, sensors, and a graphical user interface run by the computer. In addition, this eye-control method can be applied to handling graphical interfaces, where the eye is used as a computer mouse. Results obtained show that this control technique could be useful in multiple applications, such as mobility and communication aids for handicapped persons.
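
    A minimal version of the eye-movement-to-command stage, thresholding the horizontal and vertical EOG derivations to issue guidance commands, might look like the following. The channel polarities, threshold value, and command names are illustrative assumptions, not the authors' implementation.

        def eog_to_command(h_eog_uv, v_eog_uv, threshold_uv=150.0):
            """Map a pair of horizontal/vertical EOG amplitudes (microvolts,
            baseline-corrected) to a discrete guidance command."""
            if h_eog_uv > threshold_uv:
                return "TURN_RIGHT"
            if h_eog_uv < -threshold_uv:
                return "TURN_LEFT"
            if v_eog_uv > threshold_uv:
                return "FORWARD"
            if v_eog_uv < -threshold_uv:
                return "STOP"
            return "HOLD"          # gaze near centre: keep the current state

        # A large rightward deviation of the eyes within the socket:
        print(eog_to_command(220.0, 10.0))      # -> TURN_RIGHT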

  1. The adaptive nature of eye movements in linguistic tasks: how payoff and architecture shape speed-accuracy trade-offs.

    PubMed

    Lewis, Richard L; Shvartsman, Michael; Singh, Satinder

    2013-07-01

    We explore the idea that eye-movement strategies in reading are precisely adapted to the joint constraints of task structure, task payoff, and processing architecture. We present a model of saccadic control that separates a parametric control policy space from a parametric machine architecture, the latter based on a small set of assumptions derived from research on eye movements in reading (Engbert, Nuthmann, Richter, & Kliegl, 2005; Reichle, Warren, & McConnell, 2009). The eye-control model is embedded in a decision architecture (a machine and policy space) that is capable of performing a simple linguistic task integrating information across saccades. Model predictions are derived by jointly optimizing the control of eye movements and task decisions under payoffs that quantitatively express different desired speed-accuracy trade-offs. The model yields distinct eye-movement predictions for the same task under different payoffs, including single-fixation durations, frequency effects, accuracy effects, and list position effects, and their modulation by task payoff. The predictions are compared to, and found to accord with, eye-movement data obtained from human participants performing the same task under the same payoffs, but they are found not to accord as well when the assumptions concerning payoff optimization and processing architecture are varied. These results extend work on rational analysis of oculomotor control and adaptation of reading strategy (Bicknell & Levy; McConkie, Rayner, & Wilson, 1973; Norris, 2009; Wotschack, 2009) by providing evidence for adaptation at low levels of saccadic control that is shaped by quantitatively varying task demands and the dynamics of processing architecture. Copyright © 2013 Cognitive Science Society, Inc.

  2. How Individual Differences Interact with Task Demands in Text Processing

    ERIC Educational Resources Information Center

    Wang, Zuowei; Sabatini, John; O'Reilly, Tenaha; Feng, Gary

    2017-01-01

    Reading is affected by both situational requirements and one's cognitive skills. The current study investigated how individual differences interacted with task requirements to determine reading behavior and outcome. We recorded the eye movements of college students, who differed in reading efficiency, while they completed a multiple-choice (MC)…

  3. Neurons in the monkey amygdala detect eye-contact during naturalistic social interactions

    PubMed Central

    Mosher, Clayton P.; Zimmerman, Prisca E.; Gothard, Katalin M.

    2014-01-01

    Primates explore the visual world through eye-movement sequences. Saccades bring details of interest into the fovea while fixations stabilize the image [1]. During natural vision, social primates direct their gaze at the eyes of others to communicate their own emotions and intentions and to gather information about the mental states of others [2]. Direct gaze is an integral part of facial expressions that signals cooperation or conflict over resources and social status [3-6]. Despite the great importance of making and breaking eye contact in the behavioral repertoire of primates, little is known about the neural substrates that support these behaviors. Here we show that the monkey amygdala contains neurons that respond selectively to fixations on the eyes of others and to eye contact. These “eye cells” share several features with the canonical, visually responsive neurons in the monkey amygdala; however, they respond to the eyes only when they fall within the fovea of the viewer, either as a result of a deliberate saccade, or as eyes move into the fovea of the viewer during a fixation intended to explore a different feature. The presence of eyes in peripheral vision fails to activate the eye cells. These findings link the primate amygdala to eye movements involved in the exploration and selection of details in visual scenes that contain socially and emotionally salient features. PMID:25283782

  4. Neurons in the monkey amygdala detect eye contact during naturalistic social interactions.

    PubMed

    Mosher, Clayton P; Zimmerman, Prisca E; Gothard, Katalin M

    2014-10-20

    Primates explore the visual world through eye-movement sequences. Saccades bring details of interest into the fovea, while fixations stabilize the image. During natural vision, social primates direct their gaze at the eyes of others to communicate their own emotions and intentions and to gather information about the mental states of others. Direct gaze is an integral part of facial expressions that signals cooperation or conflict over resources and social status. Despite the great importance of making and breaking eye contact in the behavioral repertoire of primates, little is known about the neural substrates that support these behaviors. Here we show that the monkey amygdala contains neurons that respond selectively to fixations on the eyes of others and to eye contact. These "eye cells" share several features with the canonical, visually responsive neurons in the monkey amygdala; however, they respond to the eyes only when they fall within the fovea of the viewer, either as a result of a deliberate saccade or as eyes move into the fovea of the viewer during a fixation intended to explore a different feature. The presence of eyes in peripheral vision fails to activate the eye cells. These findings link the primate amygdala to eye movements involved in the exploration and selection of details in visual scenes that contain socially and emotionally salient features. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Relationship Between Eye Movement and Cognitive Information Acquisition Utilizing an Unobtrusive Eye Movement Monitoring Device.

    ERIC Educational Resources Information Center

    Nesbit, Larry L.

    A research study was designed to test the relationship between the number of eye fixations and amount of learning as determined by a criterion referenced posttest. The study sought to answer the following questions: (1) Are differences in eye movement indices related to the posttest score? (2) Do differences in eye movement indices of subjects…

  6. Transcranial magnetic stimulation over the cerebellum delays predictive head movements in the coordination of gaze.

    PubMed

    Zangemeister, W H; Nagel, M

    2001-01-01

    We investigated coordinated saccadic eye and head movements following predictive horizontal visual targets at +/- 30 degrees by applying transcranial magnetic stimulation (TMS) over the cerebellum before the start of the gaze movement in 10 young subjects. We found three effects of TMS on eye-head movements: 1. Saccadic latency effect. When stimulation took place shortly before movements commenced (75-25 ms before), significantly shorter latencies were found between predictive target presentation and initiation of saccades. Eye latencies were significantly decreased by 45 ms on average, but head latencies were not. 2. Gaze amplitude effect. Without TMS, for the 60 degrees target amplitudes, head movements usually preceded eye movements, as expected (predictive gaze type 3). With TMS 5-75 ms before the gaze movement, the number of eye movements preceding head movements by 20-50 ms was significantly increased (p < 0.001) and the delay between eye and head movements was reversed (p < 0.001), i.e. we found eye-predictive gaze type 1. 3. Saccadic peak velocity effect. For TMS 5-25 ms before the start of head movement, mean peak velocity of synkinetic eye saccades increased by 20-30% up to 600 degrees/s, compared to 350-400 degrees/s without TMS. We conclude that transient functional cerebellar deficits exerted by means of TMS can change the central synkinesis of eye-head coordination, including the preprogramming of the saccadic pulse and step of a coordinated gaze movement.

  7. Distinct eye movement patterns enhance dynamic visual acuity.

    PubMed

    Palidis, Dimitrios J; Wyder-Hodge, Pearson A; Fooken, Jolande; Spering, Miriam

    2017-01-01

    Dynamic visual acuity (DVA) is the ability to resolve fine spatial detail in dynamic objects during head fixation, or in static objects during head or body rotation. This ability is important for many activities such as ball sports, and a close relation has been shown between DVA and sports expertise. DVA tasks involve eye movements, yet, it is unclear which aspects of eye movements contribute to successful performance. Here we examined the relation between DVA and the kinematics of smooth pursuit and saccadic eye movements in a cohort of 23 varsity baseball players. In a computerized dynamic-object DVA test, observers reported the location of the gap in a small Landolt-C ring moving at various speeds while eye movements were recorded. Smooth pursuit kinematics (eye latency, acceleration, velocity gain, position error) and the direction and amplitude of saccadic eye movements were linked to perceptual performance. Results reveal that distinct eye movement patterns (minimizing eye position error, tracking smoothly, and inhibiting reverse saccades) were related to dynamic visual acuity. The close link between eye movement quality and DVA performance has important implications for the development of perceptual training programs to improve DVA.
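
    The pursuit kinematics named above (latency, velocity gain, position error) can be computed from eye and target position traces roughly as follows; the velocity criterion and the assumption of a constant-velocity (ramp) target are simplifications for illustration.

        import numpy as np

        def pursuit_metrics(eye_pos, target_pos, fs=1000.0, vel_criterion=3.0):
            """Crude pursuit kinematics from position traces sampled at fs Hz,
            assuming a constant-velocity (ramp) target.
            Latency: first sample where eye speed exceeds vel_criterion deg/s.
            Gain: median eye/target velocity ratio after pursuit onset.
            Position error: mean absolute eye-target position difference."""
            eye_vel = np.gradient(eye_pos) * fs
            tgt_vel = np.gradient(target_pos) * fs
            onset = int(np.argmax(np.abs(eye_vel) > vel_criterion))
            gain = float(np.median(eye_vel[onset:] / tgt_vel[onset:]))
            err = float(np.mean(np.abs(eye_pos - target_pos)))
            return onset / fs, gain, err

        # Demo: target ramp at 15 deg/s, eye starts 120 ms later with gain 0.9.
        fs = 1000.0
        t = np.arange(0, 1, 1 / fs)
        target = 15.0 * t
        eye = np.where(t > 0.12, 0.9 * 15.0 * (t - 0.12), 0.0)
        print(pursuit_metrics(eye, target, fs))   # ~ (0.12, 0.9, position error)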

  8. Distinct eye movement patterns enhance dynamic visual acuity

    PubMed Central

    Palidis, Dimitrios J.; Wyder-Hodge, Pearson A.; Fooken, Jolande; Spering, Miriam

    2017-01-01

    Dynamic visual acuity (DVA) is the ability to resolve fine spatial detail in dynamic objects during head fixation, or in static objects during head or body rotation. This ability is important for many activities such as ball sports, and a close relation has been shown between DVA and sports expertise. DVA tasks involve eye movements, yet, it is unclear which aspects of eye movements contribute to successful performance. Here we examined the relation between DVA and the kinematics of smooth pursuit and saccadic eye movements in a cohort of 23 varsity baseball players. In a computerized dynamic-object DVA test, observers reported the location of the gap in a small Landolt-C ring moving at various speeds while eye movements were recorded. Smooth pursuit kinematics—eye latency, acceleration, velocity gain, position error—and the direction and amplitude of saccadic eye movements were linked to perceptual performance. Results reveal that distinct eye movement patterns—minimizing eye position error, tracking smoothly, and inhibiting reverse saccades—were related to dynamic visual acuity. The close link between eye movement quality and DVA performance has important implications for the development of perceptual training programs to improve DVA. PMID:28187157

  9. Eye Movements Affect Postural Control in Young and Older Females

    PubMed Central

    Thomas, Neil M.; Bampouras, Theodoros M.; Donovan, Tim; Dewhurst, Susan

    2016-01-01

    Visual information is used for postural stabilization in humans. However, little is known about how eye movements prevalent in everyday life interact with the postural control system in older individuals. Therefore, the present study assessed the effects of stationary gaze fixations, smooth pursuits, and saccadic eye movements, with combinations of absent, fixed and oscillating large-field visual backgrounds to generate different forms of retinal flow, on postural control in healthy young and older females. Participants were presented with computer-generated visual stimuli, whilst postural sway and gaze fixations were simultaneously assessed with a force platform and eye tracking equipment, respectively. The results showed that fixed backgrounds and stationary gaze fixations attenuated postural sway. In contrast, oscillating backgrounds and smooth pursuits increased postural sway. There were no differences regarding saccades. There were also no differences in postural sway or gaze errors between age groups in any visual condition. The stabilizing effect of the fixed visual stimuli shows how retinal flow and extraocular factors guide postural adjustments. The destabilizing effect of oscillating visual backgrounds and smooth pursuits may be related to more challenging conditions for determining body shifts from retinal flow, and more complex extraocular signals, respectively. Because the older participants matched the young group's performance in all conditions, decreases in postural and gaze control during stance may not be a direct consequence of healthy aging. Further research examining extraocular and retinal mechanisms of balance control, and the effects of eye movements during locomotion, is needed to better inform fall prevention interventions. PMID:27695412

  10. Eye Movements Affect Postural Control in Young and Older Females.

    PubMed

    Thomas, Neil M; Bampouras, Theodoros M; Donovan, Tim; Dewhurst, Susan

    2016-01-01

    Visual information is used for postural stabilization in humans. However, little is known about how eye movements prevalent in everyday life interact with the postural control system in older individuals. Therefore, the present study assessed the effects of stationary gaze fixations, smooth pursuits, and saccadic eye movements, with combinations of absent, fixed and oscillating large-field visual backgrounds to generate different forms of retinal flow, on postural control in healthy young and older females. Participants were presented with computer-generated visual stimuli, whilst postural sway and gaze fixations were simultaneously assessed with a force platform and eye tracking equipment, respectively. The results showed that fixed backgrounds and stationary gaze fixations attenuated postural sway. In contrast, oscillating backgrounds and smooth pursuits increased postural sway. There were no differences regarding saccades. There were also no differences in postural sway or gaze errors between age groups in any visual condition. The stabilizing effect of the fixed visual stimuli shows how retinal flow and extraocular factors guide postural adjustments. The destabilizing effect of oscillating visual backgrounds and smooth pursuits may be related to more challenging conditions for determining body shifts from retinal flow, and more complex extraocular signals, respectively. Because the older participants matched the young group's performance in all conditions, decreases in postural and gaze control during stance may not be a direct consequence of healthy aging. Further research examining extraocular and retinal mechanisms of balance control, and the effects of eye movements during locomotion, is needed to better inform fall prevention interventions.

  11. Whisking.

    PubMed

    Sofroniew, Nicholas J; Svoboda, Karel

    2015-02-16

    Eyes may be 'the window to the soul' in humans, but whiskers provide a better path to the inner lives of rodents. The brain has remarkable abilities to focus its limited resources on information that matters, while ignoring a cacophony of distractions. While inspecting a visual scene, primates foveate to multiple salient locations, for example mouths and eyes in images of people, and ignore the rest. Similar processes have now been observed and studied in rodents in the context of whisker-based tactile sensation. Rodents use their mechanosensitive whiskers for a diverse range of tactile behaviors such as navigation, object recognition and social interactions. These animals move their whiskers in a purposive manner to locations of interest. The shapes of whiskers, as well as their movements, are exquisitely adapted for tactile exploration in the dark tight burrows where many rodents live. By studying whisker movements during tactile behaviors, we can learn about the tactile information available to rodents through their whiskers and how rodents direct their attention. In this primer, we focus on how the whisker movements of rats and mice are providing clues about the logic of active sensation and the underlying neural mechanisms. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Adaptive control for eye-gaze input system

    NASA Astrophysics Data System (ADS)

    Zhao, Qijie; Tu, Dawei; Yin, Hairong

    2004-01-01

    The characteristics of vision-based human-computer interaction systems are analyzed, along with their practical applications and current limiting factors, and information-processing methods are put forward. To make communication flexible and spontaneous, algorithms for adaptive control of the user's head movement were designed, and event-based methods and an object-oriented computer language were used to develop the system software. Experimental testing showed that, under the given conditions, these methods and algorithms meet the needs of the HCI.

  13. Initial component control in disparity vergence: a model-based study.

    PubMed

    Horng, J L; Semmlow, J L; Hung, G K; Ciuffreda, K J

    1998-02-01

    The dual-mode theory for the control of disparity-vergence eye movements states that two components control the response to a step change in disparity. The initial component uses a motor preprogram to drive the eyes to an approximate final position. This initial component is followed by activation of a late component operating under visual feedback control that reduces residual disparity to within fusional limits. A quantitative model based on a pulse-step controller, similar to that postulated for saccadic eye movements, has been developed to represent the initial component. This model, an adaptation of one developed by Zee et al. [1], provides accurate simulations of isolated initial component movements and is compatible with the known underlying neurophysiology and existing neurophysiological data. The model has been employed to investigate the difference in dynamics between convergent and divergent movements. Results indicate that the pulse-control component active in convergence is reduced or absent from the control signals of divergence movements. This suggests somewhat different control structures of convergence versus divergence, and is consistent with other directional asymmetries seen in horizontal vergence.
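
    As a rough illustration of the pulse-step idea described in this record, the following sketch drives a first-order ocular plant with a brief pulse followed by a sustained step. The gains, pulse duration and plant time constant are illustrative assumptions, not parameters from the model of Horng et al. or Zee et al.

```python
import numpy as np

def simulate_pulse_step(target_deg=4.0, pulse_gain=2.4, pulse_dur=0.08,
                        tau_plant=0.15, dt=0.001, t_end=0.6):
    """Vergence angle over time for a pulse-step motor command (toy model)."""
    t = np.arange(0.0, t_end, dt)
    # Pulse: brief, high-amplitude drive; step: sustained drive holding the new angle.
    command = np.where(t < pulse_dur, pulse_gain * target_deg, target_deg)
    angle = np.zeros_like(t)
    for i in range(1, len(t)):
        # First-order ocular plant: tau * d(angle)/dt = command - angle
        angle[i] = angle[i - 1] + dt * (command[i - 1] - angle[i - 1]) / tau_plant
    return t, angle

t, angle = simulate_pulse_step()
print(round(float(angle[-1]), 2))  # ends close to the 4-deg convergence target
```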

  14. Eye movements and attention in reading, scene perception, and visual search.

    PubMed

    Rayner, Keith

    2009-08-01

    Eye movements are now widely used to investigate cognitive processes during reading, scene perception, and visual search. In this article, research on the following topics is reviewed with respect to reading: (a) the perceptual span (or span of effective vision), (b) preview benefit, (c) eye movement control, and (d) models of eye movements. Related issues with respect to eye movements during scene perception and visual search are also reviewed. It is argued that research on eye movements during reading has been somewhat advanced over research on eye movements in scene perception and visual search and that some of the paradigms developed to study reading should be more widely adopted in the study of scene perception and visual search. Research dealing with "real-world" tasks and research utilizing the visual-world paradigm are also briefly discussed.

  15. Real-time recording and classification of eye movements in an immersive virtual environment.

    PubMed

    Diaz, Gabriel; Cooper, Joseph; Kit, Dmitry; Hayhoe, Mary

    2013-10-10

    Despite the growing popularity of virtual reality environments, few laboratories are equipped to investigate eye movements within these environments. This primer is intended to reduce the time and effort required to incorporate eye-tracking equipment into a virtual reality environment. We discuss issues related to the initial startup and provide algorithms necessary for basic analysis. Algorithms are provided for the calculation of gaze angle within a virtual world using a monocular eye-tracker in a three-dimensional environment. In addition, we provide algorithms for the calculation of the angular distance between the gaze and a relevant virtual object and for the identification of fixations, saccades, and pursuit eye movements. Finally, we provide tools that temporally synchronize gaze data and the visual stimulus and enable real-time assembly of a video-based record of the experiment using the Quicktime MOV format, available at http://sourceforge.net/p/utdvrlibraries/. This record contains the visual stimulus, the gaze cursor, and associated numerical data and can be used for data exportation, visual inspection, and validation of calculated gaze movements.

  16. Real-time recording and classification of eye movements in an immersive virtual environment

    PubMed Central

    Diaz, Gabriel; Cooper, Joseph; Kit, Dmitry; Hayhoe, Mary

    2013-01-01

    Despite the growing popularity of virtual reality environments, few laboratories are equipped to investigate eye movements within these environments. This primer is intended to reduce the time and effort required to incorporate eye-tracking equipment into a virtual reality environment. We discuss issues related to the initial startup and provide algorithms necessary for basic analysis. Algorithms are provided for the calculation of gaze angle within a virtual world using a monocular eye-tracker in a three-dimensional environment. In addition, we provide algorithms for the calculation of the angular distance between the gaze and a relevant virtual object and for the identification of fixations, saccades, and pursuit eye movements. Finally, we provide tools that temporally synchronize gaze data and the visual stimulus and enable real-time assembly of a video-based record of the experiment using the Quicktime MOV format, available at http://sourceforge.net/p/utdvrlibraries/. This record contains the visual stimulus, the gaze cursor, and associated numerical data and can be used for data exportation, visual inspection, and validation of calculated gaze movements. PMID:24113087
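
    To illustrate the kind of computation such algorithms perform (this is not the toolbox at the SourceForge link above), a minimal sketch of the angular distance between a monocular gaze ray and a virtual object, given positions in a common world frame:

```python
import numpy as np

def unit(v):
    """Return v normalised to unit length."""
    return v / np.linalg.norm(v)

def angular_distance_deg(eye_pos, gaze_point, object_pos):
    """Angle (degrees) between the gaze ray and the eye-to-object ray."""
    gaze_dir = unit(np.asarray(gaze_point) - np.asarray(eye_pos))
    obj_dir = unit(np.asarray(object_pos) - np.asarray(eye_pos))
    cosang = np.clip(np.dot(gaze_dir, obj_dir), -1.0, 1.0)
    return np.degrees(np.arccos(cosang))

# Example: eye at origin, gaze at a point 2 m ahead, object slightly to the right.
print(angular_distance_deg([0, 0, 0], [0, 0, 2.0], [0.2, 0, 2.0]))  # ~5.7 deg
```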

  17. 21 CFR 886.1510 - Eye movement monitor.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    21 CFR Title 21, Food and Drugs (2011-04-01 edition), Medical Devices, Ophthalmic Devices, Diagnostic Devices, § 886.1510 Eye movement monitor. (a) Identification. An eye movement monitor is an AC-powered device with an electrode intended to measure and record...

  18. 21 CFR 886.1510 - Eye movement monitor.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    21 CFR Title 21, Food and Drugs (2010-04-01 edition), Medical Devices, Ophthalmic Devices, Diagnostic Devices, § 886.1510 Eye movement monitor. (a) Identification. An eye movement monitor is an AC-powered device with an electrode intended to measure and record...

  19. A Review on Eye Movement Studies in Childhood and Adolescent Psychiatry

    ERIC Educational Resources Information Center

    Rommelse, Nanda N. J.; Van der Stigchel, Stefan; Sergeant, Joseph A.

    2008-01-01

    The neural substrates of eye movement measures are largely known. Therefore, measurement of eye movements in psychiatric disorders may provide insight into the underlying neuropathology of these disorders. Visually guided saccades, antisaccades, memory guided saccades, and smooth pursuit eye movements will be reviewed in various childhood…

  20. Normal Morning Melanin-Concentrating Hormone Levels and No Association with Rapid Eye Movement or Non-Rapid Eye Movement Sleep Parameters in Narcolepsy Type 1 and Type 2

    PubMed Central

    Schrölkamp, Maren; Jennum, Poul J.; Gammeltoft, Steen; Holm, Anja; Kornum, Birgitte R.; Knudsen, Stine

    2017-01-01

    Study Objectives: Other than hypocretin-1 (HCRT-1) deficiency in narcolepsy type 1 (NT1), the neurochemical imbalance of NT1 and narcolepsy type 2 (NT2) with normal HCRT-1 levels is largely unknown. The neuropeptide melanin-concentrating hormone (MCH) is mainly secreted during sleep and is involved in rapid eye movement (REM) and non-rapid eye movement (NREM) sleep regulation. Hypocretin neurons reciprocally interact with MCH neurons. We hypothesized that altered MCH secretion contributes to the symptoms and sleep abnormalities of narcolepsy and that this is reflected in morning cerebrospinal fluid (CSF) MCH levels, in contrast to previously reported normal evening/afternoon levels. Methods: Lumbar CSF and plasma were collected from 07:00 to 10:00 from 57 patients with narcolepsy (subtypes: 47 NT1; 10 NT2) diagnosed according to International Classification of Sleep Disorders, Third Edition (ICSD-3) and 20 healthy controls. HCRT-1 and MCH levels were quantified by radioimmunoassay and correlated with clinical symptoms, polysomnography (PSG), and Multiple Sleep Latency Test (MSLT) parameters. Results: CSF and plasma MCH levels were not significantly different between narcolepsy patients regardless of ICSD-3 subtype, HCRT-1 levels, or compared to controls. CSF MCH and HCRT-1 levels were not significantly correlated. Multivariate regression models of CSF MCH levels, age, sex, and body mass index predicting clinical, PSG, and MSLT parameters did not reveal any significant associations to CSF MCH levels. Conclusions: Our study shows that MCH levels in CSF collected in the morning are normal in narcolepsy and not associated with the clinical symptoms, REM sleep abnormalities, nor number of muscle movements during REM or NREM sleep of the patients. We conclude that morning lumbar CSF MCH measurement is not an informative diagnostic marker for narcolepsy. Citation: Schrölkamp M, Jennum PJ, Gammeltoft S, Holm A, Kornum BR, Knudsen S. Normal morning melanin-concentrating hormone levels and no association with rapid eye movement or non-rapid eye movement sleep parameters in narcolepsy type 1 and type 2. J Clin Sleep Med. 2017;13(2):235–243. PMID:27855741

  1. Development and validation of an algorithm for the study of sleep using a biometric shirt in young healthy adults.

    PubMed

    Pion-Massicotte, Joëlle; Godbout, Roger; Savard, Pierre; Roy, Jean-François

    2018-02-23

    Portable polysomnography is often too complex and cumbersome for recording sleep at home. We recorded sleep with a biometric shirt (electrocardiogram sensors, respiratory inductance plethysmography bands and an accelerometer) in 21 healthy young adults studied in a sleep laboratory for two consecutive nights, together with standard polysomnography. Polysomnographic recordings were scored using standard methods. An algorithm was developed to classify the biometric shirt recordings into rapid eye movement sleep, non-rapid eye movement sleep and wake. The algorithm was based on breathing rate, heart rate variability and body movement, and included a correction for sleep onset and offset. The overall mean percentage of agreement between the two sets of recordings was 77.4%; when non-rapid eye movement and rapid eye movement sleep epochs were grouped together, it increased to 90.8%. The overall kappa coefficient was 0.53. Five of the seven sleep variables were significantly correlated. The findings of this pilot study indicate that this simple portable system could be used to estimate the general sleep pattern of young healthy adults. © 2018 European Sleep Research Society.
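
    For illustration only, a toy rule-based epoch classifier in the spirit of the algorithm described above; the feature names and thresholds are assumptions, not the published, validated rules.

```python
import numpy as np

WAKE, NREM, REM = "wake", "NREM", "REM"

def classify_epoch(movement_count, breathing_cv, hrv_lf_hf):
    """Label one 30-s epoch from shirt-derived features (hypothetical rules)."""
    if movement_count > 5:                       # frequent body movement -> likely awake
        return WAKE
    if hrv_lf_hf > 2.0 and breathing_cv > 0.15:  # sympathetic-dominant HRV, irregular breathing
        return REM
    return NREM                                  # otherwise assume NREM

# Toy usage over three epochs of a night
features = [(8, 0.05, 1.0), (1, 0.08, 0.8), (0, 0.20, 2.5)]
print([classify_epoch(*f) for f in features])    # ['wake', 'NREM', 'REM']
```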

  2. Enhancing Assessments of Mental Health Programs and Program Planning

    DTIC Science & Technology

    2012-06-01

    Fragmentary excerpts from the report reference treatment modalities including Eye Movement Desensitization Reprocessing (EMDR) and Cognitive Processing Therapy (CPT), an abbreviation list (Department of Defense Instruction, Front End Assessment, Field Manual, Forward Operating Base), and effect-size figures for EMDR, group, and other therapies attributed to Roberts and Schnurr (2012, Slide 64).

  3. Treatment of a Patient with Borderline Personality Disorder Based on Phase-Oriented Model of Eye Movement Desensitization and Reprocessing (EMDR): A Case Report.

    PubMed

    Momeni Safarabad, Nahid; Asgharnejad Farid, Ali-Asghar; Gharraee, Banafsheh; Habibi, Mojtaba

    2018-01-01

    Objective: This study aimed to report the effect of the 3-phase model of eye movement desensitization and reprocessing in the treatment of a patient with borderline personality disorder. Method: A 33-year-old female, who met the DSM-IV-TR criteria for borderline personality disorder, received 20 sessions of therapy based on the 3-phase model of eye movement desensitization and reprocessing. The Borderline Personality Disorder Checklist (BPD-Checklist), Dissociative Experience Scale (DES-II), Beck Depression Inventory-II (BDI-II), and Beck Anxiety Inventory (BAI) were completed by the patient at all treatment phases and at the 3-month follow-up. Results: The patient's pretest scores were 161, 44, 37, and 38 for the BPD-Checklist, DES-II, BDI-II, and BAI, respectively. After treatment, these scores decreased substantially (to 69, 14, 6, and 10, respectively). The patient thus exhibited improvement in borderline personality, dissociative, depressive, and anxiety symptoms, which was maintained at the 3-month follow-up. Conclusion: The results support a positive effect of the phase-oriented model of eye movement desensitization and reprocessing on borderline personality disorder.

  4. Treatment of a Patient with Borderline Personality Disorder Based on Phase-Oriented Model of Eye Movement Desensitization and Reprocessing (EMDR): A Case Report

    PubMed Central

    Momeni Safarabad, Nahid; Asgharnejad Farid, Ali-Asghar; Gharraee, Banafsheh; Habibi, Mojtaba

    2018-01-01

    Objective: This study aimed to report the effect of the 3-phase model of eye movement desensitization and reprocessing in the treatment of a patient with borderline personality disorder. Method: A 33-year-old female, who met the DSM-IV-TR criteria for borderline personality disorder, received 20 sessions of therapy based on the 3-phase model of eye movement desensitization and reprocessing. The Borderline Personality Disorder Checklist (BPD-Checklist), Dissociative Experience Scale (DES-II), Beck Depression Inventory-II (BDI-II), and Beck Anxiety Inventory (BAI) were completed by the patient at all treatment phases and at the 3-month follow-up. Results: The patient's pretest scores were 161, 44, 37, and 38 for the BPD-Checklist, DES-II, BDI-II, and BAI, respectively. After treatment, these scores decreased substantially (to 69, 14, 6, and 10, respectively). The patient thus exhibited improvement in borderline personality, dissociative, depressive, and anxiety symptoms, which was maintained at the 3-month follow-up. Conclusion: The results support a positive effect of the phase-oriented model of eye movement desensitization and reprocessing on borderline personality disorder. PMID:29892320

  5. Direct evidence for a position input to the smooth pursuit system.

    PubMed

    Blohm, Gunnar; Missal, Marcus; Lefèvre, Philippe

    2005-07-01

    When objects move in our environment, the orientation of the visual axis in space requires the coordination of two types of eye movements: saccades and smooth pursuit. The principal input to the saccadic system is position error, whereas it is velocity error for the smooth pursuit system. Recently, it has been shown that catch-up saccades to moving targets are triggered and programmed by using velocity error in addition to position error. Here, we show that, when a visual target is flashed during ongoing smooth pursuit, it evokes a smooth eye movement toward the flash. The velocity of this evoked smooth movement is proportional to the position error of the flash; it is neither influenced by the velocity of the ongoing smooth pursuit eye movement nor by the occurrence of a saccade, but the effect is absent if the flash is ignored by the subject. Furthermore, the response started around 85 ms after the flash presentation and decayed with an average time constant of 276 ms. Thus this is the first direct evidence of a position input to the smooth pursuit system. This study shows further evidence for a coupling between saccadic and smooth pursuit systems. It also suggests that there is an interaction between position and velocity error signals in the control of more complex movements.
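
    The reported numbers suggest a simple descriptive model of the evoked response: zero velocity until roughly 85 ms after the flash, then a velocity proportional to the flash's position error decaying with a roughly 276 ms time constant. A hedged sketch follows; the proportionality gain is an assumed placeholder, not a value from the study.

```python
import numpy as np

def evoked_velocity(t_ms, position_error_deg, gain=0.3,
                    latency_ms=85.0, tau_ms=276.0):
    """Smooth eye velocity (deg/s) evoked by a target flashed at time 0 (toy model)."""
    t = np.asarray(t_ms, dtype=float)
    active = t >= latency_ms
    v = np.zeros_like(t)
    # Velocity scales with the flash's position error and decays exponentially.
    v[active] = gain * position_error_deg * np.exp(-(t[active] - latency_ms) / tau_ms)
    return v

t = np.arange(0, 800, 10)  # ms
print(evoked_velocity(t, position_error_deg=10.0)[:12])
```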

  6. Lateral eye-movement responses to visual stimuli.

    PubMed

    Wilbur, M P; Roberts-Wilbur, J

    1985-08-01

    The association of left lateral eye-movement with emotionality or arousal of affect and of right lateral eye-movement with cognitive/interpretive operations and functions was investigated. Participants were junior and senior students enrolled in an undergraduate course in developmental psychology. There were 37 women and 13 men, ranging from 19 to 45 yr. of age. Using videotaped lateral eye-movements of 50 participants' responses to 15 visually presented stimuli (precategorized as neutral, emotional, or intellectual), content and statistical analyses supported the association between left lateral eye-movement and emotional arousal and between right lateral eye-movement and cognitive functions. Precategorized visual stimuli included items such as a ball (neutral), gun (emotional), and calculator (intellectual). The findings are congruent with existing lateral eye-movement literature and also are additive by using visual stimuli that do not require the explicit response or implicit processing of verbal questioning.

  7. Effects of background motion on eye-movement information.

    PubMed

    Nakamura, S

    1997-02-01

    The effect of background stimulus on eye-movement information was investigated by analyzing the underestimation of the target velocity during pursuit eye movement (Aubert-Fleishl paradox). In the experiment, a striped pattern with various brightness contrasts and spatial frequencies was used as a background stimulus, which was moved at various velocities. Analysis showed that the perceived velocity of the pursuit target, which indicated the magnitudes of eye-movement information, decreased when the background stripes moved in the same direction as eye movement at higher velocities and increased when the background moved in the opposite direction. The results suggest that the eye-movement information varied as a linear function of the velocity of the motion of the background retinal image (optic flow). In addition, the effectiveness of optic flow on eye-movement information was determined by the attributes of the background stimulus such as the brightness contrast or the spatial frequency of the striped pattern.

  8. Eye vs. Text Movement: Which Technique Leads to Faster Reading Comprehension?

    ERIC Educational Resources Information Center

    Abdellah, Antar Solhy

    2009-01-01

    Eye fixation is a frequent problem that faces foreign language learners and hinders the flow of their reading comprehension. Although students are usually advised to read fast/skim to overcome this problem, eye fixation persists. The present study investigates the effect of using a paper-based program as compared to a computer-based software in…

  9. The utility of modeling word identification from visual input within models of eye movements in reading

    PubMed Central

    Bicknell, Klinton; Levy, Roger

    2012-01-01

    Decades of empirical work have shown that a range of eye movement phenomena in reading are sensitive to the details of the process of word identification. Despite this, major models of eye movement control in reading do not explicitly model word identification from visual input. This paper presents an argument for developing models of eye movements that do include detailed models of word identification. Specifically, we argue that insights into eye movement behavior can be gained by understanding which phenomena naturally arise from an account in which the eyes move for efficient word identification, and that one important use of such models is to test which eye movement phenomena can be understood this way. As an extended case study, we present evidence from an extension of a previous model of eye movement control in reading that does explicitly model word identification from visual input, Mr. Chips (Legge, Klitz, & Tjan, 1997), to test two proposals for the effect of using linguistic context on reading efficiency. PMID:23074362

  10. A review on eye movement studies in childhood and adolescent psychiatry.

    PubMed

    Rommelse, Nanda N J; Van der Stigchel, Stefan; Sergeant, Joseph A

    2008-12-01

    The neural substrates of eye movement measures are largely known. Therefore, measurement of eye movements in psychiatric disorders may provide insight into the underlying neuropathology of these disorders. Visually guided saccades, antisaccades, memory guided saccades, and smooth pursuit eye movements will be reviewed in various childhood psychiatric disorders. The four aims of this review are (1) to give a thorough overview of eye movement studies in a wide array of psychiatric disorders occurring during childhood and adolescence (attention-deficit/hyperactivity disorder, oppositional deviant disorder and conduct disorder, autism spectrum disorders, reading disorder, childhood-onset schizophrenia, Tourette's syndrome, obsessive compulsive disorder, and anxiety and depression), (2) to discuss the specificity and overlap of eye movement findings across disorders and paradigms, (3) to discuss the developmental aspects of eye movement abnormalities in childhood and adolescence psychiatric disorders, and (4) to present suggestions for future research. In order to make this review of interest to a broad audience, attention will be given to the clinical manifestation of the disorders and the theoretical background of the eye movement paradigms.

  11. Spatial constancy mechanisms in motor control

    PubMed Central

    Medendorp, W. Pieter

    2011-01-01

    The success of the human species in interacting with the environment depends on the ability to maintain spatial stability despite the continuous changes in sensory and motor inputs owing to movements of eyes, head and body. In this paper, I will review recent advances in the understanding of how the brain deals with the dynamic flow of sensory and motor information in order to maintain spatial constancy of movement goals. The first part summarizes studies in the saccadic system, showing that spatial constancy is governed by a dynamic feed-forward process, by gaze-centred remapping of target representations in anticipation of and across eye movements. The subsequent sections relate to other oculomotor behaviour, such as eye–head gaze shifts, smooth pursuit and vergence eye movements, and their implications for feed-forward mechanisms for spatial constancy. Work that studied the geometric complexities in spatial constancy and saccadic guidance across head and body movements, distinguishing between self-generated and passively induced motion, indicates that both feed-forward and sensory feedback processing play a role in spatial updating of movement goals. The paper ends with a discussion of the behavioural mechanisms of spatial constancy for arm motor control and their physiological implications for the brain. Taken together, the emerging picture is that the brain computes an evolving representation of three-dimensional action space, whose internal metric is updated in a nonlinear way, by optimally integrating noisy and ambiguous afferent and efferent signals. PMID:21242137
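
    A toy illustration of the gaze-centred remapping idea discussed above: a remembered target location stored relative to gaze is updated by subtracting the upcoming saccade vector, so that it remains accurate after the eye movement. This is a deliberately simplified, purely geometric sketch, not a model of the neural computation.

```python
import numpy as np

def remap_target(target_gaze_centred, saccade_vector):
    """Predictively update a gaze-centred target location across a saccade."""
    return np.asarray(target_gaze_centred) - np.asarray(saccade_vector)

# Target 10 deg right of fixation; a 10-deg rightward saccade brings it to the fovea.
print(remap_target([10.0, 0.0], [10.0, 0.0]))  # [0. 0.]
```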

  12. The organisation of spatial and temporal relations in memory.

    PubMed

    Rondina, Renante; Curtiss, Kaitlin; Meltzer, Jed A; Barense, Morgan D; Ryan, Jennifer D

    2017-04-01

    Episodic memories are comprised of details of "where" and "when"; spatial and temporal relations, respectively. However, evidence from behavioural, neuropsychological, and neuroimaging studies has provided mixed interpretations about how memories for spatial and temporal relations are organised-they may be hierarchical, fully interactive, or independent. In the current study, we examined the interaction of memory for spatial and temporal relations. Using explicit reports and eye-tracking, we assessed younger and older adults' memory for spatial and temporal relations of objects that were presented singly across time in unique spatial locations. Explicit change detection of spatial relations was affected by a change in temporal relations, but explicit change detection of temporal relations was not affected by a change in spatial relations. Younger and older adults showed eye movement evidence of incidental memory for temporal relations, but only younger adults showed eye movement evidence of incidental memory for spatial relations. Together, these findings point towards a hierarchical organisation of relational memory. The implications of these findings are discussed in the context of the neural mechanisms that may support such a hierarchical organisation of memory.

  13. Eye Movement Patterns of the Elderly during Stair Descent: Effect of Illumination

    NASA Astrophysics Data System (ADS)

    Kasahara, Satoko; Okabe, Sonoko; Nakazato, Naoko; Ohno, Yuko

    The relationship between the eye movement pattern during stair descent and illumination was studied in 4 elderly people in comparison with that in 5 young people. The illumination condition was light (85.0±30.9 lx) or dark (0.7±0.3 lx), and data of eye movements were obtained using an eye mark recorder. A flight of 15 steps was used for the experiment, and data on 3 steps in the middle, on which the descent movements were stabilized, were analyzed. The elderly subjects pointed their eyes mostly directly in front in the facial direction regardless of the illumination condition, but the young subjects tended to look down under the light condition. The young subjects are considered to have confirmed the safety of the front by peripheral vision, checked the stepping surface by central vision, and still maintained the upright position without leaning forward during stair descent. The elderly subjects, in contrast, always looked at the visual target by central vision even under the light condition and leaned forward. The range of eye movements was larger vertically than horizontally in both groups, and a characteristic eye movement pattern of repeating a vertical shuttle movement synchronous with descent of each step was observed. Under the dark condition, the young subjects widened the range of vertical eye movements and reduced duration of fixation. The elderly subjects showed no change in the range of eye movements but increased duration of fixation during stair descent. These differences in the eye movements are considered to be compensatory reactions to narrowing of the vertical visual field, reduced dark adaptation, and reduced dynamic visual acuity due to aging. These characteristics of eye movements of the elderly lead to an anteriorly leaned posture and lack of attention to the front during stair descent.

  14. Fetal Eye Movements on Magnetic Resonance Imaging

    PubMed Central

    Woitek, Ramona; Kasprian, Gregor; Lindner, Christian; Stuhr, Fritz; Weber, Michael; Schöpf, Veronika; Brugger, Peter C.; Asenbaum, Ulrika; Furtner, Julia; Bettelheim, Dieter; Seidl, Rainer; Prayer, Daniela

    2013-01-01

    Objectives Eye movements are the physical expression of upper fetal brainstem function. Our aim was to identify and differentiate specific types of fetal eye movement patterns using dynamic MRI sequences. Their occurrence as well as the presence of conjugated eyeball motion and consistently parallel eyeball position was systematically analyzed. Methods Dynamic SSFP sequences were acquired in 72 singleton fetuses (17–40 GW, three age groups [17–23 GW, 24–32 GW, 33–40 GW]). Fetal eye movements were evaluated according to a modified classification originally published by Birnholz (1981): Type 0: no eye movements; Type I: single transient deviations; Type Ia: fast deviation, slower reposition; Type Ib: fast deviation, fast reposition; Type II: single prolonged eye movements; Type III: complex sequences; and Type IV: nystagmoid. Results In 95.8% of fetuses, the evaluation of eye movements was possible using MRI, with a mean acquisition time of 70 seconds. Due to head motion, 4.2% of the fetuses and 20.1% of all dynamic SSFP sequences were excluded. Eye movements were observed in 45 fetuses (65.2%). Significant differences between the age groups were found for Type I (p = 0.03), Type Ia (p = 0.031), and Type IV eye movements (p = 0.033). Consistently parallel bulbs were found in 27.3–45%. Conclusions In human fetuses, different eye movement patterns can be identified and described by MRI in utero. In addition to the originally classified eye movement patterns, a novel subtype has been observed, which apparently characterizes an important step in fetal brainstem development. We evaluated, for the first time, eyeball position in fetuses. Ultimately, the assessment of fetal eye movements by MRI yields the potential to identify early signs of brainstem dysfunction, as encountered in brain malformations such as Chiari II or molar tooth malformations. PMID:24194885

  15. Evidence that non-dreamers do dream: a REM sleep behaviour disorder model.

    PubMed

    Herlin, Bastien; Leu-Semenescu, Smaranda; Chaumereuil, Charlotte; Arnulf, Isabelle

    2015-12-01

    To determine whether non-dreamers do not produce dreams or do not recall them, subjects were identified with no dream recall with dreamlike behaviours during rapid eye movement sleep behaviour disorder, which is typically characterised by dream-enacting behaviours congruent with sleep mentation. All consecutive patients with idiopathic rapid eye movement sleep behaviour disorder or rapid eye movement sleep behaviour disorder associated with Parkinson's disease who underwent a video-polysomnography were interviewed regarding the presence or absence of dream recall, retrospectively or upon spontaneous arousals. The patients with no dream recall for at least 10 years, and never-ever recallers were compared with dream recallers with rapid eye movement sleep behaviour disorder regarding their clinical, cognitive and sleep features. Of the 289 patients with rapid eye movement sleep behaviour disorder, eight (2.8%) patients had no dream recall, including four (1.4%) patients who had never ever recalled dreams, and four patients who had no dream recall for 10-56 years. All non-recallers exhibited, daily or almost nightly, several complex, scenic and dreamlike behaviours and speeches, which were also observed during rapid eye movement sleep on video-polysomnography (arguing, fighting and speaking). They did not recall a dream following sudden awakenings from rapid eye movement sleep. These eight non-recallers with rapid eye movement sleep behaviour disorder did not differ in terms of cognition, clinical, treatment or sleep measures from the 17 dreamers with rapid eye movement sleep behaviour disorder matched for age, sex and disease. The scenic dreamlike behaviours reported and observed during rapid eye movement sleep in the rare non-recallers with rapid eye movement sleep behaviour disorder (even in the never-ever recallers) provide strong evidence that non-recallers produce dreams, but do not recall them. Rapid eye movement sleep behaviour disorder provides a new model to evaluate cognitive processing during dreaming and subsequent recall. © 2015 European Sleep Research Society.

  16. Analysis of Students' Eye Movement in Relation to Contents of Multimedia Lecture

    NASA Astrophysics Data System (ADS)

    Murakami, Masayuki; Kakusho, Koh; Minoh, Michihiko

    In this article, we report an analysis of how students' eye movements are affected by the content of a lecture, to be used as a standard for selecting images for distance learning and WBT. We classified lecture content into nine parts: introduction, presentation, explanation, illustration, assertion, query, reply, question, and response. We analyzed students' eye movements in the multimedia lecture "Japanese Economics", a distance lecture between Kyoto University and UCLA. The analysis yielded the following characteristics of eye movement for each course process in a practical lecture. Introduction: students gaze at the lecturer first to establish an advance organizer, then look at the material. Presentation: they mainly stare at the material and occasionally glance at the lecturer to supplement gaps in understanding with information the lecturer provides. Explanation: staring time is longer than in other course process categories, and students stare at the object they regard as important. Illustration: students stare at the material, which offers the main information source. Assertion: they gaze at the lecturer because of the interaction between lecturer and students. Question-and-answer: students generally look at the speaker, but for a "query" about the material they shift their focus rapidly back and forth between material and lecturer to gather information from both. Our research thus suggests a practical guide for the choice of image information.

  17. Electrooculography-based continuous eye-writing recognition system for efficient assistive communication systems

    PubMed Central

    Shinozaki, Takahiro

    2018-01-01

    Human-computer interface systems whose input is based on eye movements can serve as a means of communication for patients with locked-in syndrome. Eye-writing is one such system; users can input characters by moving their eyes to follow the lines of the strokes corresponding to characters. Although this input method makes it easy for patients to get started because of their familiarity with handwriting, existing eye-writing systems suffer from slow input rates because they require a pause between input characters to simplify the automatic recognition process. In this paper, we propose a continuous eye-writing recognition system that achieves a rapid input rate because it accepts characters eye-written continuously, with no pauses. For recognition purposes, the proposed system first detects eye movements using electrooculography (EOG), and then a hidden Markov model (HMM) is applied to model the EOG signals and recognize the eye-written characters. Additionally, this paper investigates an EOG adaptation that uses a deep neural network (DNN)-based HMM. Experiments with six participants showed an average input speed of 27.9 character/min using Japanese Katakana as the input target characters. A Katakana character-recognition error rate of only 5.0% was achieved using 13.8 minutes of adaptation data. PMID:29425248
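
    As a hedged sketch of the HMM portion of such a recognizer (not the authors' system, and omitting the DNN-based adaptation), one Gaussian HMM per character could be trained on EOG feature sequences and decoding done by maximum log-likelihood. This assumes the hmmlearn package; feature extraction and data are placeholders.

```python
import numpy as np
from hmmlearn import hmm

def train_character_models(training_data, n_states=5):
    """training_data: {char: list of (T_i, 2) EOG feature sequences}."""
    models = {}
    for char, sequences in training_data.items():
        X = np.vstack(sequences)                    # stack sequences for this character
        lengths = [len(s) for s in sequences]       # tell the HMM where each sequence ends
        model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        model.fit(X, lengths)
        models[char] = model
    return models

def recognise(models, sequence):
    """Return the character whose HMM gives the highest log-likelihood."""
    return max(models, key=lambda c: models[c].score(sequence))
```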

  18. Instrument Display Visual Angles for Conventional Aircraft and the MQ-9 Ground Control Station

    NASA Technical Reports Server (NTRS)

    Kamine, Tovy Haber; Bendrick, Gregg A.

    2008-01-01

    Aircraft instrument panels should be designed such that primary displays are in optimal viewing location to minimize pilot perception and response time. Human Factors engineers define three zones (i.e. cones) of visual location: 1) "Easy Eye Movement" (foveal vision); 2) "Maximum Eye Movement" (peripheral vision with saccades), and 3) "Head Movement" (head movement required). Instrument display visual angles were measured to determine how well conventional aircraft (T-34, T-38, F-15B, F-16XL, F/A-18A, U-2D, ER-2, King Air, G-III, B-52H, DC-10, B747-SCA) and the MQ-9 ground control station (GCS) complied with these standards, and how they compared with each other. Selected instrument parameters included: attitude, pitch, bank, power, airspeed, altitude, vertical speed, heading, turn rate, slip/skid, AOA, flight path, latitude, longitude, course, bearing, range and time. Vertical and horizontal visual angles for each component were measured from the pilot's eye position in each system. The vertical visual angles of displays in conventional aircraft lay within the cone of "Easy Eye Movement" for all but three of the parameters measured, and almost all of the horizontal visual angles fell within this range. All conventional vertical and horizontal visual angles lay within the cone of "Maximum Eye Movement". However, most instrument vertical visual angles of the MQ-9 GCS lay outside the cone of "Easy Eye Movement", though all were within the cone of "Maximum Eye Movement". All the horizontal visual angles for the MQ-9 GCS were within the cone of "Easy Eye Movement". Most instrument displays in conventional aircraft lay within the cone of "Easy Eye Movement", though mission-critical instruments sometimes displaced less important instruments outside this area. Many of the MQ-9 GCS systems lay outside this area. Specific training for MQ-9 pilots may be needed to avoid increased response time and potential error during flight. The learning objectives include: 1) Know three physiologic cones of eye/head movement; 2) Understand how instrument displays comply with these design principles in conventional aircraft and an uninhabited aerial vehicle system. Which of the following is NOT a recognized physiologic principle of instrument display design? 1) Cone of Easy Eye Movement 2) Cone of Binocular Eye Movement 3) Cone of Maximum Eye Movement 4) Cone of Head Movement 5) None of the above. Answer: 2) Cone of Binocular Eye Movement
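
    The geometry behind such measurements is simple: the angular eccentricity of a display element offset from the straight-ahead line of sight is atan(offset / distance). A brief sketch follows; the cone limits used below are illustrative placeholders, not the design standard cited in the record.

```python
import math

def eccentricity_deg(offset_m, distance_m):
    """Angular eccentricity (deg) of a display element offset from the line of sight."""
    return math.degrees(math.atan(offset_m / distance_m))

def classify_zone(angle_deg, easy=15.0, maximum=35.0):
    """Map an angle to a viewing zone; the limits here are hypothetical placeholders."""
    if angle_deg <= easy:
        return "Easy Eye Movement"
    if angle_deg <= maximum:
        return "Maximum Eye Movement"
    return "Head Movement"

a = eccentricity_deg(0.20, 0.75)      # a display 20 cm off-axis viewed from 75 cm
print(round(a, 1), classify_zone(a))  # 14.9 Easy Eye Movement
```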

  19. Stride-Cycle Influences on Goal-Directed Head Movements Made During Walking

    NASA Technical Reports Server (NTRS)

    Peters, Brian T.; vanEmmerik, Richard E. A.; Bloomberg, Jacob J.

    2006-01-01

    Horizontal head movements were studied in six subjects as they made rapid horizontal gaze adjustments while walking. The aim of the present research was to determine if gait-cycle events alter the head movement response to a visual target acquisition task. Gaze shifts of approximately 40deg were elicited by a step change in the position of a visual target from a central location to a second location in the left or right horizontal periphery. The timing of the target position change was constrained to occur at 25,50,75 and 100% of the stride cycle. The trials were randomly presented as the subjects walked on a treadmill at their preferred speed (range: 1.25 to 1.48 m/s, mean: 1.39 +/- 0.09 m/s ) . Analyses focused on the movement onset latencies of the head and eyes and on the peak velocity and saccade amplitude of the head movement response. A comparison of the group means indicated that the head movement onset lagged the eye onset (262 ms versus 252 ms). The head and eye movement onset latencies were not affected by either the direction of the target change nor the point in the gait cycle during which the target relocation occurred. However, the presence of an interaction between the gait cycle events and the direction of the visual target shift indicates that the peak head saccade velocity and head saccade amplitude are affected by the natural head oscillations that occur while walking.

  20. A novel EOG/EEG hybrid human-machine interface adopting eye movements and ERPs: application to robot control.

    PubMed

    Ma, Jiaxin; Zhang, Yu; Cichocki, Andrzej; Matsuno, Fumitoshi

    2015-03-01

    This study presents a novel human-machine interface (HMI) based on both electrooculography (EOG) and electroencephalography (EEG). This hybrid interface works in two modes: an EOG mode recognizes eye movements such as blinks, and an EEG mode detects event-related potentials (ERPs) such as the P300. While eye movements and ERPs have each been used separately to implement assistive interfaces that help patients with motor disabilities perform daily tasks, the proposed hybrid interface integrates them so that the two signal types complement each other, providing better efficiency and a wider scope of application. In this study, we design a threshold algorithm that can recognize four kinds of eye movements: blink, wink, gaze, and frown. In addition, an oddball paradigm with inverted-face stimuli is used to evoke multiple ERP components, including the P300, N170, and VPP. To verify the effectiveness of the proposed system, two online experiments were carried out: one controlling a multifunctional humanoid robot and the other controlling four mobile robots. In both experiments, the subjects completed the tasks effectively using the proposed interface, and the best completion times were relatively short and close to those achieved by manual operation.
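
    For illustration, a toy threshold rule in the spirit of the four-way eye movement detector described above; the channel names and thresholds are assumptions, not the authors' calibrated parameters.

```python
import numpy as np

def classify_eog_event(v_left, v_right, h_eog,
                       vert_thr=150.0, horiz_thr=80.0, frown_thr=-120.0):
    """Classify one detected EOG deflection (amplitudes in microvolts, hypothetical units)."""
    vl, vr = np.max(v_left), np.max(v_right)
    h_peak = np.max(np.abs(h_eog))
    if vl > vert_thr and vr > vert_thr:
        return "blink"                    # upward deflection on both eyes
    if (vl > vert_thr) != (vr > vert_thr):
        return "wink"                     # upward deflection on one eye only
    if h_peak > horiz_thr:
        return "gaze"                     # sustained horizontal deviation
    if min(np.min(v_left), np.min(v_right)) < frown_thr:
        return "frown"                    # shared downward deflection
    return "none"
```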

  1. The Impact of Multimedia Effect on Science Learning: Evidence from Eye Movements

    ERIC Educational Resources Information Center

    She, Hsiao-Ching; Chen, Yi-Zen

    2009-01-01

    This study examined how middle school students constructed their understanding of the mitosis and meiosis processes at a molecular level through multimedia learning materials presented in different interaction and sensory modality modes. A two (interaction modes: animation/simulation) by two (sensory modality modes: narration/on-screen text)…

  2. Quality of life in patients with an idiopathic rapid eye movement sleep behaviour disorder in Korea.

    PubMed

    Kim, Keun Tae; Motamedi, Gholam K; Cho, Yong Won

    2017-08-01

    There have been few quality of life studies in patients with idiopathic rapid eye movement sleep behaviour disorder. We compared the quality of life in idiopathic rapid eye movement sleep behaviour disorder patients to healthy controls, patients with hypertension, type 2 diabetes mellitus without complication and idiopathic restless legs syndrome. Sixty patients with idiopathic rapid eye movement sleep behaviour disorder (24 female; mean age: 61.43 ± 8.99) were enrolled retrospectively. The diagnosis was established based on sleep history, overnight polysomnography, neurological examination and Mini-Mental State Examination to exclude secondary rapid eye movement sleep behavior disorder. All subjects completed questionnaires, including the Short Form 36-item Health Survey for quality of life. The total quality of life score in idiopathic rapid eye movement sleep behaviour disorder (70.63 ± 20.83) was lower than in the healthy control group (83.38 ± 7.96) but higher than in the hypertension (60.55 ± 24.82), diabetes mellitus (62.42 ± 19.37) and restless legs syndrome (61.77 ± 19.25) groups. The total score of idiopathic rapid eye movement sleep behaviour disorder patients had a negative correlation with the Pittsburgh Sleep Quality Index (r = -0.498, P < 0.001), Insomnia Severity Index (r = -0.645, P < 0.001) and the Beck Depression Inventory-2 (r = -0.694, P < 0.001). Multiple regression showed a negative correlation between the Short Form 36-item Health Survey score and the Insomnia Severity Index (β = -1.100, P = 0.001) and Beck Depression Inventory-2 (β = -1.038, P < 0.001). Idiopathic rapid eye movement sleep behaviour disorder had a significant negative impact on quality of life, although this effect was less than that of other chronic disorders. This negative effect might be related to a depressive mood associated with the disease. © 2016 European Sleep Research Society.

  3. Hybrid EEG—Eye Tracker: Automatic Identification and Removal of Eye Movement and Blink Artifacts from Electroencephalographic Signal

    PubMed Central

    Mannan, Malik M. Naeem; Kim, Shinjung; Jeong, Myung Yung; Kamran, M. Ahmad

    2016-01-01

    Contamination by eye movement and blink artifacts in electroencephalogram (EEG) recordings makes the analysis of EEG data more difficult and could result in misleading findings. Efficient removal of these artifacts from EEG data is an essential step in improving classification accuracy for developing brain-computer interfaces (BCI). In this paper, we propose an automatic framework based on independent component analysis (ICA) and system identification to identify and remove ocular artifacts from EEG data using a hybrid EEG and eye tracker system. The performance of the proposed algorithm is illustrated using experimental and standard EEG datasets. The proposed algorithm not only removes the ocular artifacts from the artifactual zone but also preserves the neuronal-activity-related EEG signals in the non-artifactual zone. The comparison with two state-of-the-art techniques, namely ADJUST-based ICA and REGICA, reveals the significantly improved performance of the proposed algorithm for removing eye movement and blink artifacts from EEG data. Additionally, the results demonstrate that the proposed algorithm achieves lower relative error and higher mutual information values between corrected EEG and artifact-free EEG data. PMID:26907276
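
    A generic sketch of ICA-based ocular artifact removal (not the authors' framework, nor the ADJUST or REGICA pipelines): decompose the EEG into independent components, drop components that correlate strongly with a reference eye signal, and reconstruct. This assumes scikit-learn's FastICA; the correlation threshold is an illustrative choice.

```python
import numpy as np
from sklearn.decomposition import FastICA

def remove_ocular_components(eeg, eye_ref, corr_thr=0.7):
    """eeg: (n_samples, n_channels) array; eye_ref: (n_samples,) EOG or gaze trace."""
    ica = FastICA(n_components=eeg.shape[1], random_state=0)
    sources = ica.fit_transform(eeg)                       # (n_samples, n_components)
    # Correlate each independent component with the reference eye signal.
    corr = np.array([abs(np.corrcoef(s, eye_ref)[0, 1]) for s in sources.T])
    sources[:, corr > corr_thr] = 0.0                      # zero out ocular components
    return ica.inverse_transform(sources)                  # cleaned EEG
```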

  4. The Human Engineering Eye Movement Measurement Research Facility.

    DTIC Science & Technology

    1985-04-01

    Fragmentary excerpts from the report: ...tracked reliably. When tracking is disrupted (e.g., by gross and sudden head movements, gross change in the head position, sneezing, prolonged eye...) ...these are density and "busyness" of the slides (stimulus material), as well as consistency between successive... change the material being projected based on the subject's previous performance. The minicomputer relays the calibrated data to one of the magnetic...

  5. View-Invariant Object Category Learning, Recognition, and Search: How Spatial and Object Attention are Coordinated Using Surface-Based Attentional Shrouds

    ERIC Educational Resources Information Center

    Fazl, Arash; Grossberg, Stephen; Mingolla, Ennio

    2009-01-01

    How does the brain learn to recognize an object from multiple viewpoints while scanning a scene with eye movements? How does the brain avoid the problem of erroneously classifying parts of different objects together? How are attention and eye movements intelligently coordinated to facilitate object learning? A neural model provides a unified…

  6. ECEM (eye closure eye movements): integrating aspects of EMDR with hypnosis for treatment of trauma.

    PubMed

    Hollander, H E; Bender, S S

    2001-01-01

    The paper addresses distinctions between hypnotic interventions and Eye Movement Desensitizing and Reprocessing (EMDR) and discusses their effect on persons who have symptoms of Posttraumatic Stress Disorder (PTSD). Eye movements in hypnosis and EMDR are considered in terms of the different ways they may affect responses in treatment. A treatment intervention within hypnosis called ECEM (Eye Closure, Eye Movements) is described. ECEM can be used for patients with histories of trauma who did not benefit adequately from either interventions in hypnosis or the EMDR treatment protocol used separately. In ECEM the eye movement variable of EMDR is integrated within a hypnosis protocol to enhance benefits of hypnosis and reduce certain risks of EMDR.

  7. Visual Discomfort From Flash Afterimages of Riloid Patterns.

    PubMed

    O'Hare, Louise

    2017-06-01

    Op-art-based stimuli have been shown to be uncomfortable, possibly due to a combination of fixational eye movements (microsaccades) and excessive cortical responses. Efforts have been made to measure illusory phenomena arising from these stimuli in the absence of microsaccades, but there has been no attempt thus far to decouple the effects of the cortical response from the effect of fixational eye movements. This study uses flash afterimages to stabilise the image on the retina and thus reduce the systematic effect of eye movements, in order to investigate the role of the brain in discomfort from op-art-based stimuli. There was a relationship between spatial frequency and the magnitude of the P300 response, showing a similar pattern to that of discomfort judgements, which suggests that there might be a role of discomfort and excessive neural responses independently from the effects of microsaccades.

  8. Development of a Computer Writing System Based on EOG

    PubMed Central

    López, Alberto; Ferrero, Francisco; Yangüela, David; Álvarez, Constantina; Postolache, Octavian

    2017-01-01

    The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders. PMID:28672863

  9. Development of a Computer Writing System Based on EOG.

    PubMed

    López, Alberto; Ferrero, Francisco; Yangüela, David; Álvarez, Constantina; Postolache, Octavian

    2017-06-26

    The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders.
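
    As a toy sketch of the graphical-interface layer of such a system (the layout and event codes are illustrative assumptions, not the published interface), discrete eye-movement events could steer a cursor over a virtual keyboard, with a blink selecting the highlighted key:

```python
# Hypothetical 5x5 letter grid; real systems would include the full alphabet and controls.
KEYBOARD = [list("ABCDE"), list("FGHIJ"), list("KLMNO"), list("PQRST"), list("UVWXY")]

def type_with_eye_codes(codes):
    """codes: sequence of 'up'/'down'/'left'/'right'/'blink' events from the EOG stage."""
    row, col, text = 0, 0, []
    moves = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}
    for code in codes:
        if code == "blink":
            text.append(KEYBOARD[row][col])          # select the highlighted key
        elif code in moves:
            dr, dc = moves[code]
            row = min(max(row + dr, 0), len(KEYBOARD) - 1)
            col = min(max(col + dc, 0), len(KEYBOARD[0]) - 1)
    return "".join(text)

print(type_with_eye_codes(["right", "blink", "down", "down", "blink"]))  # "BL"
```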

  10. A stochastic model for eye movements during fixation on a stationary target.

    NASA Technical Reports Server (NTRS)

    Vasudevan, R.; Phatak, A. V.; Smith, J. D.

    1971-01-01

    A stochastic model describing small eye movements occurring during steady fixation on a stationary target is presented. Based on eye movement data for steady gaze, the model has a hierarchical structure; the principal level represents the random motion of the image point within a local area of fixation, while the higher level mimics the jump processes involved in transitions from one local area to another. Target image motion within a local area is described by a Langevin-like stochastic differential equation taking into consideration the microsaccadic jumps pictured as being due to point processes and the high frequency muscle tremor, represented as a white noise. The transform of the probability density function for local area motion is obtained, leading to explicit expressions for their means and moments. Evaluation of these moments based on the model is comparable with experimental results.
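
    A simulation sketch of this kind of model: within a local fixation area, position follows a Langevin-like drift with white-noise tremor, punctuated by Poisson-timed microsaccadic jumps. All parameter values below are illustrative assumptions, not those fitted in the paper.

```python
import numpy as np

def simulate_fixation(t_end=2.0, dt=0.001, alpha=20.0, sigma=0.05,
                      jump_rate=1.5, jump_sd=0.2, seed=0):
    """Return time (s) and 1-D eye position (deg) during steady fixation (toy model)."""
    rng = np.random.default_rng(seed)
    n = int(t_end / dt)
    x = np.zeros(n)
    for i in range(1, n):
        drift = -alpha * x[i - 1] * dt                    # pull back toward the fixation point
        tremor = sigma * np.sqrt(dt) * rng.standard_normal()
        # Microsaccadic jumps arrive as a Poisson process with rate jump_rate per second.
        jump = rng.normal(0.0, jump_sd) if rng.random() < jump_rate * dt else 0.0
        x[i] = x[i - 1] + drift + tremor + jump
    return np.arange(n) * dt, x

t, x = simulate_fixation()
print(round(float(np.std(x)), 3))   # small dispersion around the fixation point
```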

  11. Prevalence and phenomenology of eye tics in Gilles de la Tourette syndrome.

    PubMed

    Martino, Davide; Cavanna, Andrea E; Robertson, Mary M; Orth, Michael

    2012-10-01

    Eye tics seem to be common in Gilles de la Tourette syndrome (GTS). We analyzed the frequency and clinical characteristics of eye tics in 212 GTS patients. Of the 212 patients, 201 (94.8 %) reported eye tics in their life-time; 166 (78.3 %) reported eye movement tics (rolling eyes up/down, eyes looking sideways, staring), and 194 (91.5 %) eyelid/eyebrow movement tics (frowning, raising eyebrows, blinking or winking). Patients with eye movement tics were younger at age of GTS onset (7.1 ± 4 years) than those without (8.9 ± 6.8; p = 0.024). Tic severity positively correlated to lifetime history of eye and/or eyelid/eyebrow movement tics. Our data confirm that eye and eyelid/eyebrow movement tics are very common in GTS, and most patients have several types of eye tics over time. Eye tic phenomenology was similar in patients with or without co-morbidity. Eye tics are therefore likely to be a core feature of GTS and should be routinely evaluated in order to strengthen the clinician's confidence in diagnosing GTS.

  12. Three dimensional eye movements of squirrel monkeys following postrotatory tilt

    NASA Technical Reports Server (NTRS)

    Merfeld, D. M.; Young, L. R.; Paige, G. D.; Tomko, D. L.

    1993-01-01

    Three-dimensional squirrel monkey eye movements were recorded during and immediately following rotation around an earth-vertical yaw axis (160 degrees/s steady state, 100 degrees/s2 acceleration and deceleration). To study interactions between the horizontal angular vestibulo-ocular reflex (VOR) and head orientation, postrotatory VOR alignment was changed relative to gravity by tilting the head out of the horizontal plane (pitch or roll tilt between 15 degrees and 90 degrees) immediately after cessation of motion. Results showed that in addition to post rotatory horizontal nystagmus, vertical nystagmus followed tilts to the left or right (roll), and torsional nystagmus followed forward or backward (pitch) tilts. When the time course and spatial orientation of eye velocity were considered in three dimensions, the axis of eye rotation always shifted toward alignment with gravity, and the postrotatory horizontal VOR decay was accelerated by the tilts. These phenomena may reflect a neural process that resolves the sensory conflict induced by this postrotatory tilt paradigm.

  13. Application of eye movement measuring system OBER 2 to medicine and technology

    NASA Astrophysics Data System (ADS)

    Ober, Jozef; Hajda, Janusz; Loska, Jacek; Jamicki, Michal

    1997-08-01

    The OBER 2 is an infrared-light eye movement measuring system that works with IBM PC compatible computers. As one of the safest systems for measuring eye movements, it uses a very short infrared flash time (80 microseconds per measurement point). The system has an advanced analog-digital controller that includes background suppression and prediction mechanisms, eliminating slow changes and fluctuations of external illumination at frequencies up to 100 Hz with an effectiveness better than 40 dB. The active measurement axis, the sampling rate (25 - 4000 Hz), and the start and stop of a measurement can all be set from the PC, making it possible to control the measurement externally in real time. With proper gain control, a temporal and positional resolution of 0.5 minute of arc can be achieved even for large eye movement amplitudes (plus or minus 20 degrees of visual angle). The whole communication system can also be driven directly by eye movements in real time. The key to practical applications is the automatic selection of the most essential elements of eye movement, both those individual to each person and those that occur for every person in particular situations, independently of personal features. Hence one research topic being conducted is personal identification based on individual eye movement features. Another is a research project on falling-asleep detection, which can be applied to warn drivers before they fall asleep while driving. Combined with a suitable expert system, this measuring system can also be used to detect dyslexia and other disabilities of the visual system.

  14. Grey matter density changes of structures involved in Posttraumatic Stress Disorder (PTSD) after recovery following Eye Movement Desensitization and Reprocessing (EMDR) therapy.

    PubMed

    Boukezzi, Sarah; El Khoury-Malhame, Myriam; Auzias, Guillaume; Reynaud, Emmanuelle; Rousseau, Pierre-François; Richard, Emmanuel; Zendjidjian, Xavier; Roques, Jacques; Castelli, Nathalie; Correard, Nadia; Guyon, Valérie; Gellato, Caroline; Samuelian, Jean-Claude; Cancel, Aida; Comte, Magali; Latinus, Marianne; Guedj, Eric; Khalfa, Stéphanie

    2017-08-30

    Recovery of stress-induced structural alterations in Posttraumatic Stress Disorder (PTSD) remains largely unexplored. This study aimed to determine whether symptom improvement is associated with grey matter (GM) density changes of brain structures involved in PTSD. Two groups of PTSD patients were involved in this study. The first group was treated with Eye Movement Desensitization and Reprocessing (EMDR) therapy and recovered from their symptoms (recovery group) (n = 11); patients were scanned prior to therapy (T1), and one week (T2) and five months (T3) after the end of therapy. The second group comprised patients who followed a supportive therapy and remained symptomatic (wait-list group) (n = 7). They were scanned at three time-steps mimicking the same inter-scan intervals. Voxel-based morphometry (VBM) was used to characterize GM density evolution. GM density values showed a significant group-by-time interaction effect between T1 and T3 in prefrontal cortex areas. These interaction effects were driven by a GM density increase in the recovery group with respect to the wait-list group. Symptom removal goes hand-in-hand with GM density enhancement of structures involved in emotional regulation. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  15. Variability of eye movements when viewing dynamic natural scenes.

    PubMed

    Dorr, Michael; Martinetz, Thomas; Gegenfurtner, Karl R; Barth, Erhardt

    2010-08-26

    How similar are the eye movement patterns of different subjects when free viewing dynamic natural scenes? We collected a large database of eye movements from 54 subjects on 18 high-resolution videos of outdoor scenes and measured their variability using the Normalized Scanpath Saliency, which we extended to the temporal domain. Even though up to about 80% of subjects looked at the same image region in some video parts, variability was usually much greater. Eye movements on natural movies were then compared with eye movements in several control conditions. "Stop-motion" movies had semantic content almost identical to that of the original videos but lacked continuous motion. Hollywood action movie trailers were used to probe the upper limit of eye movement coherence that can be achieved by deliberate camera work, scene cuts, etc. In a "repetitive" condition, subjects viewed the same movies ten times each over the course of 2 days. Results show several systematic differences between conditions both for general eye movement parameters such as saccade amplitude and fixation duration and for eye movement variability. Most importantly, eye movements on static images are initially driven by stimulus onset effects and later, more so than on continuous videos, by subject-specific idiosyncrasies; eye movements on Hollywood movies are significantly more coherent than those on natural movies. We conclude that the stimulus types often used in laboratory experiments, static images and professionally cut material, are not very representative of natural viewing behavior. All stimuli and gaze data are publicly available at http://www.inb.uni-luebeck.de/tools-demos/gaze.
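
    A minimal sketch of the Normalized Scanpath Saliency idea extended to video, computed per frame in a leave-one-subject-out fashion, is shown below. The array layout, frame size, and smoothing width are assumptions for illustration, not the authors' implementation.

```python
# A rough sketch of a temporally extended Normalized Scanpath Saliency (NSS):
# for each video frame, one subject's gaze sample is scored against a smoothed
# fixation map built from the remaining subjects. `gaze` is a hypothetical
# array of shape (n_subjects, n_frames, 2) holding (x, y) pixel coordinates
# assumed to lie inside the frame.
import numpy as np
from scipy.ndimage import gaussian_filter

def frame_nss(gaze_xy, others_xy, frame_shape, sigma=25):
    """NSS of one gaze point against a map built from the other subjects."""
    h, w = frame_shape
    density = np.zeros((h, w))
    for x, y in others_xy:                      # accumulate the others' gaze points
        density[int(round(y)), int(round(x))] += 1
    density = gaussian_filter(density, sigma)   # smooth into a continuous map
    density = (density - density.mean()) / (density.std() + 1e-12)  # z-score
    x, y = gaze_xy
    return density[int(round(y)), int(round(x))]

def temporal_nss(gaze, frame_shape):
    """Leave-one-subject-out NSS averaged over subjects and frames."""
    n_subj, n_frames, _ = gaze.shape
    scores = []
    for t in range(n_frames):
        for s in range(n_subj):
            others = np.delete(gaze[:, t, :], s, axis=0)
            scores.append(frame_nss(gaze[s, t], others, frame_shape))
    return float(np.mean(scores))
```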

  16. The effect of image sharpness on quantitative eye movement data and on image quality evaluation while viewing natural images

    NASA Astrophysics Data System (ADS)

    Vuori, Tero; Olkkonen, Maria

    2006-01-01

    The aim of the study is to test both customer image quality rating (subjective image quality) and physical measurement of user behavior (eye movement tracking) to find customer satisfaction differences in imaging technologies. A methodological aim is to find out whether eye movements could be used quantitatively in image quality preference studies. In general, we want to map objective or physically measurable image quality to subjective evaluations and eye movement data. We conducted a series of image quality tests, in which the test subjects evaluated image quality while we recorded their eye movements. Results show that eye movement parameters consistently change according to the instructions given to the user and according to physical image quality; e.g., saccade duration increased with increasing blur. Results indicate that eye movement tracking could be used to differentiate the image quality evaluation strategies that users adopt. Results also show that eye movements can help in mapping between technological and subjective image quality. Furthermore, these results give some empirical emphasis to top-down perception processes in image quality perception and evaluation by showing differences between perceptual processes in situations where the cognitive task varies.

  17. The "hypnotic state" and eye movements: Less there than meets the eye?

    PubMed Central

    Nordhjem, Barbara; Marcusson-Clavertz, David; Holmqvist, Kenneth

    2017-01-01

    Responsiveness to hypnotic procedures has been related to unusual eye behaviors for centuries. Kallio and collaborators claimed recently that they had found a reliable index for "the hypnotic state" through eye-tracking methods. Whether or not hypnotic responding involves a special state of consciousness has been part of a contentious debate in the field, so the potential validity of their claim would constitute a landmark. However, their conclusion was based on 1 highly hypnotizable individual compared with 14 controls who were not measured on hypnotizability. We sought to replicate their results with a sample screened for High (n = 16) or Low (n = 13) hypnotizability. We used a factorial 2 (high vs. low hypnotizability) x 2 (hypnosis vs. resting conditions) counterbalanced order design with these eye-tracking tasks: Fixation, Saccade, Optokinetic nystagmus (OKN), Smooth pursuit, and Antisaccade (the first three tasks had been used in Kallio et al.'s experiment). Highs reported being more deeply in hypnosis than Lows but only in the hypnotic condition, as expected. There were no significant main or interaction effects for the Fixation, OKN, or Smooth pursuit tasks. For the Saccade task both Highs and Lows had smaller saccades during hypnosis, and in the Antisaccade task both groups had slower Antisaccades during hypnosis. Although a couple of results suggest that a hypnotic condition may produce reduced eye motility, the lack of significant interactions (e.g., showing only Highs expressing a particular eye behavior during hypnosis) does not support the claim that eye behaviors (at least as measured with the techniques used) are an indicator of a "hypnotic state." Our results do not preclude the possibility that in a more spontaneous or different setting the experience of being hypnotized might relate to specific eye behaviors. PMID:28846696

  18. Interaction of semicircular canal stimulation with carotid baroreceptor reflex control of heart rate

    NASA Technical Reports Server (NTRS)

    Convertino, V. A.

    1998-01-01

    The carotid-cardiac baroreflex contributes to the prediction of orthostatic tolerance; experimental attenuation of the reflex response leads to orthostatic hypotension in humans and animals. Anecdotal observations indicate that rotational head movements about the vertical axis of the body can also induce orthostatic bradycardia and hypotension through increased parasympathetic activity. We therefore measured the chronotropic response to carotid baroreceptor stimulation in 12 men during varying conditions of vestibulo-oculomotor stimulation to test the hypothesis that stimulation of the semicircular canals associated with head movements in the yaw plane inhibits cardioacceleration through a vagally mediated baroreflex. Carotid-cardiac baroreflex response was assessed by plotting R-R intervals (ms) at each of 8 neck pressure steps with their respective carotid distending pressures (mmHg). Calculated baroreflex gain (maximal slope of the stimulus-response relationship) was measured under 4 experimental conditions: 1) sinusoidal whole-body yaw rotation of the subject in the dark without visual fixation (combined vestibular-oculomotor stimulation); 2) yaw oscillation of the subject while tracking a small head-fixed light moving with the subject (vestibular stimulation without eye movements); 3) subject stationary while fixating on a small light oscillating in yaw at the same frequency, peak acceleration, and velocity as the chair (eye movements without vestibular stimulation); and 4) subject stationary in the dark (no eye or head motion). Head motion alone and with eye movement reduced baseline baroreflex responsiveness to the same stimulus by 30%. Inhibition of cardioacceleration during rotational head movements may have significant impact on functional performance in aerospace environments, particularly in high-performance aircraft pilots during high angular acceleration in aerial combat maneuvers or in astronauts upon return from spaceflight who already have attenuated baroreflex functions.
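
    The gain measure described above (the maximal slope of the R-R interval versus carotid distending pressure relationship across the eight neck-pressure steps) can be sketched as follows; the pressure and R-R values are invented for illustration.

```python
# Sketch of the gain computation described above: the steepest slope of the
# R-R interval vs. carotid distending pressure curve across the 8 neck
# pressure steps. The numbers below are invented for illustration only.
import numpy as np

carotid_pressure = np.array([40, 55, 70, 85, 100, 115, 130, 145])  # mmHg
rr_interval = np.array([620, 640, 680, 760, 870, 950, 990, 1010])  # ms

# Slope between successive steps (ms/mmHg); the baroreflex gain is taken as
# the maximal slope of this stimulus-response relationship.
slopes = np.diff(rr_interval) / np.diff(carotid_pressure)
print(f"Estimated baroreflex gain: {slopes.max():.2f} ms/mmHg")
```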

  19. Dissociation of eye and head components of gaze shifts by stimulation of the omnipause neuron region.

    PubMed

    Gandhi, Neeraj J; Sparks, David L

    2007-07-01

    Natural movements often include actions integrated across multiple effectors. Coordinated eye-head movements are driven by a command to shift the line of sight by a desired displacement vector. Yet because extraocular and neck motoneurons are separate entities, the gaze shift command must be separated into independent signals for eye and head movement control. We report that this separation occurs, at least partially, at or before the level of pontine omnipause neurons (OPNs). Stimulation of the OPNs prior to and during gaze shifts temporally decoupled the eye and head components by inhibiting gaze and eye saccades. In contrast, head movements were consistently initiated before gaze onset, and ongoing head movements continued along their trajectories, albeit with some characteristic modulations. After stimulation offset, a gaze shift composed of an eye saccade and a reaccelerated head movement was produced to preserve gaze accuracy. We conclude that signals subject to OPN inhibition produce the eye-movement component of a coordinated eye-head gaze shift and are not the only signals involved in the generation of the head component of the gaze shift.

  20. Temporal eye movement strategies during naturalistic viewing

    PubMed Central

    Wang, Helena X.; Freeman, Jeremy; Merriam, Elisha P.; Hasson, Uri; Heeger, David J.

    2011-01-01

    The deployment of eye movements to complex spatiotemporal stimuli likely involves a variety of cognitive factors. However, eye movements to movies are surprisingly reliable both within and across observers. We exploited and manipulated that reliability to characterize observers’ temporal viewing strategies. Introducing cuts and scrambling the temporal order of the resulting clips systematically changed eye movement reliability. We developed a computational model that exhibited this behavior and provided an excellent fit to the measured eye movement reliability. The model assumed that observers searched for, found, and tracked a point-of-interest, and that this process reset when there was a cut. The model did not require that eye movements depend on temporal context in any other way, and it managed to describe eye movements consistently across different observers and two movie sequences. Thus, we found no evidence for the integration of information over long time scales (greater than a second). The results are consistent with the idea that observers employ a simple tracking strategy even while viewing complex, engaging naturalistic stimuli. PMID:22262911

  1. Head position modulates optokinetic nystagmus

    PubMed Central

    Ferraresi, A.; Botti, F. M.; Panichi, R.; Barmack, N. H.

    2011-01-01

    Orientation and movement rely on both visual and vestibular information mapped in separate coordinate systems. Here, we examine how coordinate systems interact to guide eye movements of rabbits. We exposed rabbits to continuous horizontal optokinetic stimulation (HOKS) at 5°/s to evoke horizontal eye movements, while they were statically or dynamically roll-tilted about the longitudinal axis. During monocular or binocular HOKS, when the rabbit was roll-tilted 30° onto the side of the eye stimulated in the posterior → anterior (P → A) direction, slow phase eye velocity (SPEV) increased by 3.5–5°/s. When the rabbit was roll-tilted 30° onto the side of the eye stimulated in the A → P direction, SPEV decreased to ~2.5°/s. We also tested the effect of roll-tilt after prolonged optokinetic stimulation had induced a negative optokinetic afternystagmus (OKAN II). In this condition, the SPEV occurred in the dark, “open loop.” Modulation of SPEV of OKAN II depended on the direction of the nystagmus and was consistent with that observed during “closed loop” HOKS. Dynamic roll-tilt influenced SPEV evoked by HOKS in a similar way. The amplitude and the phase of SPEV depended on the frequency of vestibular oscillation and on HOKS velocity. We conclude that the change in the linear acceleration of the gravity vector with respect to the head during roll-tilt modulates the gain of SPEV depending on its direction. This modulation improves gaze stability at different image retinal slip velocities caused by head roll-tilt during centric or eccentric head movement. PMID:21735244

  2. Head position modulates optokinetic nystagmus.

    PubMed

    Pettorossi, V E; Ferraresi, A; Botti, F M; Panichi, R; Barmack, N H

    2011-08-01

    Orientation and movement rely on both visual and vestibular information mapped in separate coordinate systems. Here, we examine how coordinate systems interact to guide eye movements of rabbits. We exposed rabbits to continuous horizontal optokinetic stimulation (HOKS) at 5°/s to evoke horizontal eye movements, while they were statically or dynamically roll-tilted about the longitudinal axis. During monocular or binocular HOKS, when the rabbit was roll-tilted 30° onto the side of the eye stimulated in the posterior → anterior (P → A) direction, slow phase eye velocity (SPEV) increased by 3.5-5°/s. When the rabbit was roll-tilted 30° onto the side of the eye stimulated in the A → P direction, SPEV decreased to ~2.5°/s. We also tested the effect of roll-tilt after prolonged optokinetic stimulation had induced a negative optokinetic afternystagmus (OKAN II). In this condition, the SPEV occurred in the dark, "open loop." Modulation of SPEV of OKAN II depended on the direction of the nystagmus and was consistent with that observed during "closed loop" HOKS. Dynamic roll-tilt influenced SPEV evoked by HOKS in a similar way. The amplitude and the phase of SPEV depended on the frequency of vestibular oscillation and on HOKS velocity. We conclude that the change in the linear acceleration of the gravity vector with respect to the head during roll-tilt modulates the gain of SPEV depending on its direction. This modulation improves gaze stability at different image retinal slip velocities caused by head roll-tilt during centric or eccentric head movement.

  3. Spatial and Temporal Eye–Hand Coordination Relies on the Parietal Reach Region

    PubMed Central

    Hauschild, Markus; Wilke, Melanie; Andersen, Richard A.

    2014-01-01

    Coordinated eye movements are crucial for precision control of our hands. A commonly believed neural mechanism underlying eye–hand coordination is interaction between the neural networks controlling each effector, exchanging, and matching information, such as movement target location and onset time. Alternatively, eye–hand coordination may result simply from common inputs to independent eye and hand control pathways. Thus far, it remains unknown whether and where either of these two possible mechanisms exists. A candidate location for the former mechanism, interpathway communication, includes the posterior parietal cortex (PPC) where distinct effector-specific areas reside. If the PPC were within the network for eye–hand coordination, perturbing it would affect both eye and hand movements that are concurrently planned. In contrast, if eye–hand coordination arises solely from common inputs, perturbing one effector pathway, e.g., the parietal reach region (PRR), would not affect the other effector. To test these hypotheses, we inactivated part of PRR in the macaque, located in the medial bank of the intraparietal sulcus encompassing the medial intraparietal area and area 5V. When each effector moved alone, PRR inactivation shortened reach but not saccade amplitudes, compatible with the known reach-selective activity of PRR. However, when both effectors moved concurrently, PRR inactivation shortened both reach and saccade amplitudes, and decoupled their reaction times. Therefore, consistent with the interpathway communication hypothesis, we propose that the planning of concurrent eye and hand movements causes the spatial information in PRR to influence the otherwise independent eye control pathways, and that their temporal coupling requires an intact PRR. PMID:25232123

  4. EEG Negativity in Fixations Used for Gaze-Based Control: Toward Converting Intentions into Actions with an Eye-Brain-Computer Interface

    PubMed Central

    Shishkin, Sergei L.; Nuzhdin, Yuri O.; Svirin, Evgeny P.; Trofimov, Alexander G.; Fedorova, Anastasia A.; Kozyrskiy, Bogdan L.; Velichkovsky, Boris M.

    2016-01-01

    We usually look at an object when we are going to manipulate it. Thus, eye tracking can be used to communicate intended actions. An effective human-machine interface, however, should be able to differentiate intentional and spontaneous eye movements. We report an electroencephalogram (EEG) marker that differentiates gaze fixations used for control from spontaneous fixations involved in visual exploration. Eight healthy participants played a game with their eye movements only. Their gaze-synchronized EEG data (fixation-related potentials, FRPs) were collected during the game's control-on and control-off conditions. A slow negative wave with a maximum in the parietooccipital region was present in each participant's averaged FRPs in the control-on condition and was absent or had much lower amplitude in the control-off condition. This wave was similar but not identical to stimulus-preceding negativity, a slow negative wave that can be observed during feedback expectation. Classification of intentional vs. spontaneous fixations was based on amplitude features from 13 EEG channels using 300 ms length segments free from electrooculogram contamination (200–500 ms relative to the fixation onset). For the first fixations in the fixation triplets required to make moves in the game, classified against control-off data, a committee of greedy classifiers provided 0.90 ± 0.07 specificity and 0.38 ± 0.14 sensitivity. Similar (slightly lower) results were obtained for the shrinkage Linear Discriminant Analysis (LDA) classifier. The second and third fixations in the triplets were classified at a lower rate. We expect that, with improved feature sets and classifiers, a hybrid dwell-based Eye-Brain-Computer Interface (EBCI) can be built using the FRP difference between the intended and spontaneous fixations. If this direction of BCI development proves successful, such a multimodal interface may improve the fluency of interaction and can possibly become the basis for a new input device for paralyzed and healthy users, the EBCI “Wish Mouse.” PMID:27917105
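
    As a rough sketch of the classification step described above (amplitude features from 13 EEG channels, shrinkage LDA, cross-validation), the following uses scikit-learn on placeholder data; the feature extraction, labels, and epoch window are assumptions, not the authors' pipeline.

```python
# Sketch of classifying intentional vs. spontaneous fixations from
# fixation-related EEG amplitude features with a shrinkage LDA. The data,
# labels, and the 200-500 ms epoch window are placeholders, not the
# authors' recordings or preprocessing.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical features: one mean amplitude per EEG channel (13 channels)
# in the 200-500 ms window after fixation onset, for 200 fixations.
X = rng.normal(size=(200, 13))
y = rng.integers(0, 2, size=200)   # 1 = control-on fixation, 0 = control-off

# The 'lsqr' solver with automatic (Ledoit-Wolf) shrinkage regularizes the
# covariance estimate, which helps with few trials and correlated channels.
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
print("Mean CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```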

  5. Acting without seeing: eye movements reveal visual processing without awareness.

    PubMed

    Spering, Miriam; Carrasco, Marisa

    2015-04-01

    Visual perception and eye movements are considered to be tightly linked. Diverse fields, ranging from developmental psychology to computer science, utilize eye tracking to measure visual perception. However, this prevailing view has been challenged by recent behavioral studies. Here, we review converging evidence revealing dissociations between the contents of perceptual awareness and different types of eye movement. Such dissociations reveal situations in which eye movements are sensitive to particular visual features that fail to modulate perceptual reports. We also discuss neurophysiological, neuroimaging, and clinical studies supporting the role of subcortical pathways for visual processing without awareness. Our review links awareness to perceptual-eye movement dissociations and furthers our understanding of the brain pathways underlying vision and movement with and without awareness. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Acting without seeing: Eye movements reveal visual processing without awareness

    PubMed Central

    Spering, Miriam; Carrasco, Marisa

    2015-01-01

    Visual perception and eye movements are considered to be tightly linked. Diverse fields, ranging from developmental psychology to computer science, utilize eye tracking to measure visual perception. However, this prevailing view has been challenged by recent behavioral studies. We review converging evidence revealing dissociations between the contents of perceptual awareness and different types of eye movements. Such dissociations reveal situations in which eye movements are sensitive to particular visual features that fail to modulate perceptual reports. We also discuss neurophysiological, neuroimaging and clinical studies supporting the role of subcortical pathways for visual processing without awareness. Our review links awareness to perceptual-eye movement dissociations and furthers our understanding of the brain pathways underlying vision and movement with and without awareness. PMID:25765322

  7. Looking for a Location: Dissociated Effects of Event-Related Plausibility and Verb-Argument Information on Predictive Processing in Aphasia.

    PubMed

    Hayes, Rebecca A; Dickey, Michael Walsh; Warren, Tessa

    2016-12-01

    This study examined the influence of verb-argument information and event-related plausibility on prediction of upcoming event locations in people with aphasia, as well as older and younger, neurotypical adults. It investigated how these types of information interact during anticipatory processing and how the ability to take advantage of the different types of information is affected by aphasia. This study used a modified visual-world task to examine eye movements and offline photo selection. Twelve adults with aphasia (aged 54-82 years) as well as 44 young adults (aged 18-31 years) and 18 older adults (aged 50-71 years) participated. Neurotypical adults used verb argument status and plausibility information to guide both eye gaze (a measure of anticipatory processing) and image selection (a measure of ultimate interpretation). Argument status did not affect the behavior of people with aphasia in either measure. There was only limited evidence of interaction between these 2 factors in eye gaze data. Both event-related plausibility and verb-based argument status contributed to anticipatory processing of upcoming event locations among younger and older neurotypical adults. However, event-related likelihood had a much larger role in the performance of people with aphasia than did verb-based knowledge regarding argument structure.

  8. SmartEye and Polhemus data for vestibulo-ocular reflex and optokinetic reflex model.

    PubMed

    Le, Anh Son; Aoki, Hirofumi

    2018-06-01

    In this data article, the dataset includes raw head and eye movement data collected with Polhemus (Polhemus Inc) and SmartEye (Smart Eye AB) equipment. Subjects holding a driver's license participated in the experiment, which was conducted with a driving simulator whose vehicle motion was controlled by CarSim (Mechanical Simulation Co., Ann Arbor, MI). The data set contains not only eye and head movement but also eye gaze, pupil diameter, saccades, and so on. It can be used for parameter identification of the vestibulo-ocular reflex (VOR) model, for simulating eye movement, and for other analyses related to eye movement.

  9. Classification of visual and linguistic tasks using eye-movement features.

    PubMed

    Coco, Moreno I; Keller, Frank

    2014-03-07

    The role of the task has received special attention in visual-cognition research because it can provide causal explanations of goal-directed eye-movement responses. The dependency between visual attention and task suggests that eye movements can be used to classify the task being performed. A recent study by Greene, Liu, and Wolfe (2012), however, fails to achieve accurate classification of visual tasks based on eye-movement features. In the present study, we hypothesize that tasks can be successfully classified when they differ with respect to the involvement of other cognitive domains, such as language processing. We extract the eye-movement features used by Greene et al. as well as additional features from the data of three different tasks: visual search, object naming, and scene description. First, we demonstrated that eye-movement responses make it possible to characterize the goals of these tasks. Then, we trained three different types of classifiers and predicted the task participants performed with an accuracy well above chance (a maximum of 88% for visual search). An analysis of the relative importance of features for classification accuracy reveals that just one feature, i.e., initiation time, is sufficient for above-chance performance (a maximum of 79% accuracy in object naming). Crucially, this feature is independent of task duration, which differs systematically across the three tasks we investigated. Overall, the best task classification performance was obtained with a set of seven features that included both spatial information (e.g., entropy of attention allocation) and temporal components (e.g., total fixation on objects) of the eye-movement record. This result confirms the task-dependent allocation of visual attention and extends previous work by showing that task classification is possible when tasks differ in the cognitive processes involved (purely visual tasks such as search vs. communicative tasks such as scene description).
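
    A minimal sketch of the classification setup described above (several classifier types applied to aggregate eye-movement features such as initiation time and attention entropy) might look like the following; feature names and values are hypothetical placeholders, not the published dataset.

```python
# Sketch of predicting the task (search, naming, description) from aggregate
# eye-movement features with several classifier types. Feature names follow
# those discussed above; the values and labels are simulated placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n = 300
X = np.column_stack([
    rng.normal(250, 60, n),    # initiation time (ms)
    rng.normal(2.0, 0.5, n),   # entropy of attention allocation
    rng.normal(1500, 400, n),  # total fixation time on objects (ms)
    rng.normal(220, 40, n),    # mean fixation duration (ms)
])
y = rng.choice(["search", "naming", "description"], n)

for clf in (LogisticRegression(max_iter=1000), SVC(), GradientBoostingClassifier()):
    model = make_pipeline(StandardScaler(), clf)
    print(type(clf).__name__, cross_val_score(model, X, y, cv=5).mean())
```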

  10. Gaze failure, drifting eye movements, and centripetal nystagmus in cerebellar disease.

    PubMed Central

    Leech, J; Gresty, M; Hess, K; Rudge, P

    1977-01-01

    Three abnormalities of eye movement in man are described which are indicative of cerebellar system disorder, namely, centripetally beating nystagmus, failure to maintain lateral gaze either in darkness or with eye closure, and slow drifting movements of the eyes in the absence of fixation. Similar eye movement signs follow cerebellectomy in the primate and the cat. These abnormalities of eye movement, together with other signs of cerebellar disease, such as rebound, alternating, and gaze paretic nystagmus, are explained by the hypothesis that the cerebellum helps to maintain lateral gaze and that brain stem mechanisms which monitor gaze position generate compensatory biases in the absence of normal cerebellar function. PMID:603785

  11. Exogenous orienting of attention depends upon the ability to execute eye movements.

    PubMed

    Smith, Daniel T; Rorden, Chris; Jackson, Stephen R

    2004-05-04

    Shifts of attention can be made overtly by moving the eyes or covertly with attention being allocated to a region of space that does not correspond to the current direction of gaze. However, the precise relationship between eye movements and the covert orienting of attention remains controversial. The influential premotor theory proposes that the covert orienting of attention is produced by the programming of (unexecuted) eye movements and thus predicts a strong relationship between the ability to execute eye movements and the operation of spatial attention. Here, we demonstrate for the first time that impaired spatial attention is observed in an individual (AI) who is neurologically healthy but who cannot execute eye movements as a result of a congenital impairment in the elasticity of her eye muscles. This finding provides direct support for the role of the eye-movement system in the covert orienting of attention and suggests that whereas intact cortical structures may be necessary for normal attentional reflexes, they are not sufficient. The ability to move our eyes is essential for the development of normal patterns of spatial attention.

  12. Extraocular muscle function testing

    MedlinePlus

    ... may result in double vision or rapid, uncontrolled eye movements . Normal Results Normal movement of the eyes in all directions. What Abnormal Results Mean Eye movement disorders may be due to abnormalities of the ...

  13. Internuclear Ophthalmoplegia

    MedlinePlus

    ... Nerve Disorders Internuclear ophthalmoplegia is impairment of horizontal eye movements caused by damage to certain connections between nerve ... include Lyme disease, tumors, and head injuries. Horizontal eye movements are impaired, but vertical eye movements are not. ...

  14. Eye Movements to Natural Images as a Function of Sex and Personality

    PubMed Central

    Mercer Moss, Felix Joseph; Baddeley, Roland; Canagarajah, Nishan

    2012-01-01

    Women and men are different. As humans are highly visual animals, these differences should be reflected in the pattern of eye movements they make when interacting with the world. We examined fixation distributions of 52 women and men while viewing 80 natural images and found systematic differences in their spatial and temporal characteristics. The most striking of these was that women looked away and usually below many objects of interest, particularly when rating images in terms of their potency. We also found reliable differences correlated with the images' semantic content, the observers' personality, and how the images were semantically evaluated. Information theoretic techniques showed that many of these differences increased with viewing time. These effects were not small: the fixations to a single action or romance film image allow the classification of the sex of an observer with 64% accuracy. While men and women may live in the same environment, what they see in this environment is reliably different. Our findings have important implications for both past and future eye movement research while confirming the significant role individual differences play in visual attention. PMID:23248740

  15. Exploring responses to art in adolescence: a behavioral and eye-tracking study.

    PubMed

    Savazzi, Federica; Massaro, Davide; Di Dio, Cinzia; Gallese, Vittorio; Gilli, Gabriella; Marchetti, Antonella

    2014-01-01

    Adolescence is a peculiar age mainly characterized by physical and psychological changes that may affect the perception of one's own and others' body. This perceptual peculiarity may influence the way in which bottom-up and top-down processes interact and, consequently, the perception and evaluation of art. This study is aimed at investigating, by means of the eye-tracking technique, the visual explorative behavior of adolescents while looking at paintings. Sixteen color paintings, categorized as dynamic and static, were presented to twenty adolescents; half of the images represented natural environments and half human individuals; all stimuli were displayed under aesthetic and movement judgment tasks. Participants' ratings revealed that, generally, nature images are explicitly evaluated as more appealing than human images. Eye movement data, on the other hand, showed that the human body exerts a strong power in orienting and attracting visual attention and that, in adolescence, it plays a fundamental role during aesthetic experience. In particular, adolescents seem to approach human-content images by giving priority to elements calling forth movement and action, supporting the embodiment theory of aesthetic perception.

  16. Exploring Responses to Art in Adolescence: A Behavioral and Eye-Tracking Study

    PubMed Central

    Savazzi, Federica; Massaro, Davide; Di Dio, Cinzia; Gallese, Vittorio; Gilli, Gabriella; Marchetti, Antonella

    2014-01-01

    Adolescence is a peculiar age mainly characterized by physical and psychological changes that may affect the perception of one's own and others' body. This perceptual peculiarity may influence the way in which bottom-up and top-down processes interact and, consequently, the perception and evaluation of art. This study is aimed at investigating, by means of the eye-tracking technique, the visual explorative behavior of adolescents while looking at paintings. Sixteen color paintings, categorized as dynamic and static, were presented to twenty adolescents; half of the images represented natural environments and half human individuals; all stimuli were displayed under aesthetic and movement judgment tasks. Participants' ratings revealed that, generally, nature images are explicitly evaluated as more appealing than human images. Eye movement data, on the other hand, showed that the human body exerts a strong power in orienting and attracting visual attention and that, in adolescence, it plays a fundamental role during aesthetic experience. In particular, adolescents seem to approach human-content images by giving priority to elements calling forth movement and action, supporting the embodiment theory of aesthetic perception. PMID:25048813

  17. Dissemination Of Evidence-Based CBT Intervention Components: Online Self-Administered Training For Providers Treating Military Deployment-Related PTSD

    DTIC Science & Technology

    2010-08-01

    eye movement desensitization and reprocessing (EMDR), fluoxetine, and pill placebo in the treatment of posttraumatic...stress disorder: treatment effects and long-term maintenance. J Clin Psychiatry;68(1):37-46. 2007. 5. Shapiro F. Eye movement desensitization and ... reprocessing: Basic principles, protocols, and procedures (2nd edition). New York: Guilford Press. 2001. 6. Monson CM, Schnurr PP, Resick

  18. An eye movement based reading intervention in lexical and segmental readers with acquired dyslexia.

    PubMed

    Ablinger, Irene; von Heyden, Kerstin; Vorstius, Christian; Halm, Katja; Huber, Walter; Radach, Ralph

    2014-01-01

    Due to their brain damage, aphasic patients with acquired dyslexia often rely to a greater extent on lexical or segmental reading procedures. Thus, therapy intervention is mostly targeted at the more impaired reading strategy. In the present work we introduce a novel therapy approach based on real-time measurement of patients' eye movements as they attempt to read words. More specifically, an eye movement contingent technique of stepwise letter de-masking was used to support sequential reading, whereas fixation-dependent initial masking of non-central letters stimulated a lexical (parallel) reading strategy. Four lexical and four segmental readers with acquired central dyslexia received our intensive reading intervention. All participants showed remarkable improvements, as evident in reduced total reading time, a reduced number of fixations per word and improved reading accuracy. Both types of intervention led to item-specific training effects in all subjects. A generalisation to untrained items was only found in segmental readers after the lexical training. Eye movement analyses were also used to compare word processing before and after therapy, indicating that all patients, with one exception, maintained their preferred reading strategy. However, in several cases the balance between sequential and lexical processing became less extreme, indicating a more effective individual interplay of both word processing routes.

  19. Assessing visual attention using eye tracking sensors in intelligent cognitive therapies based on serious games.

    PubMed

    Frutos-Pascual, Maite; Garcia-Zapirain, Begonya

    2015-05-12

    This study examines the use of eye tracking sensors as a means to identify children's behavior in attention-enhancement therapies. For this purpose, a set of data collected from 32 children with different attention skills is analyzed during their interaction with a set of puzzle games. The authors of this study hypothesize that participants with better performance may have quantifiably different eye-movement patterns from users with poorer results. The use of eye trackers outside the research community may help to extend their potential with available intelligent therapies, bringing state-of-the-art technologies to users. The use of gaze data constitutes a new information source in intelligent therapies that may help to build new approaches that are fully-customized to final users' needs. This may be achieved by implementing machine learning algorithms for classification. The initial study of the dataset has proven a 0.88 (±0.11) classification accuracy with a random forest classifier, using cross-validation and hierarchical tree-based feature selection. Further approaches need to be examined in order to establish more detailed attention behaviors and patterns among children with and without attention problems.
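
    A minimal sketch of the reported analysis pattern (tree-based feature selection followed by a random forest evaluated with cross-validation) is given below, using invented gaze-derived features and group labels in place of the actual dataset.

```python
# Sketch of the reported analysis pattern: tree-based feature selection
# followed by a random forest classifier evaluated with cross-validation.
# The gaze-derived features and group labels below are invented stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
X = rng.normal(size=(32, 20))      # 32 children x 20 gaze-derived features
y = rng.integers(0, 2, size=32)    # attention-skill group label

model = make_pipeline(
    SelectFromModel(RandomForestClassifier(n_estimators=100, random_state=0)),
    RandomForestClassifier(n_estimators=200, random_state=0),
)
scores = cross_val_score(model, X, y, cv=5)
print(f"Accuracy: {scores.mean():.2f} (+/- {scores.std():.2f})")
```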

  20. Assessing Visual Attention Using Eye Tracking Sensors in Intelligent Cognitive Therapies Based on Serious Games

    PubMed Central

    Frutos-Pascual, Maite; Garcia-Zapirain, Begonya

    2015-01-01

    This study examines the use of eye tracking sensors as a means to identify children's behavior in attention-enhancement therapies. For this purpose, a set of data collected from 32 children with different attention skills is analyzed during their interaction with a set of puzzle games. The authors of this study hypothesize that participants with better performance may have quantifiably different eye-movement patterns from users with poorer results. The use of eye trackers outside the research community may help to extend their potential with available intelligent therapies, bringing state-of-the-art technologies to users. The use of gaze data constitutes a new information source in intelligent therapies that may help to build new approaches that are fully-customized to final users' needs. This may be achieved by implementing machine learning algorithms for classification. The initial study of the dataset has proven a 0.88 (±0.11) classification accuracy with a random forest classifier, using cross-validation and hierarchical tree-based feature selection. Further approaches need to be examined in order to establish more detailed attention behaviors and patterns among children with and without attention problems. PMID:25985158

  1. The improvement of movement and speech during rapid eye movement sleep behaviour disorder in multiple system atrophy.

    PubMed

    De Cock, Valérie Cochen; Debs, Rachel; Oudiette, Delphine; Leu, Smaranda; Radji, Fatai; Tiberge, Michel; Yu, Huan; Bayard, Sophie; Roze, Emmanuel; Vidailhet, Marie; Dauvilliers, Yves; Rascol, Olivier; Arnulf, Isabelle

    2011-03-01

    Multiple system atrophy is an atypical parkinsonism characterized by severe motor disabilities that are poorly levodopa responsive. Most patients develop rapid eye movement sleep behaviour disorder. Because parkinsonism is absent during rapid eye movement sleep behaviour disorder in patients with Parkinson's disease, we studied the movements of patients with multiple system atrophy during rapid eye movement sleep. Forty-nine non-demented patients with multiple system atrophy and 49 patients with idiopathic Parkinson's disease were interviewed along with their 98 bed partners using a structured questionnaire. They rated the quality of movements, vocal and facial expressions during rapid eye movement sleep behaviour disorder as better than, equal to or worse than the same activities in an awake state. Sleep and movements were monitored using video-polysomnography in 22/49 patients with multiple system atrophy and in 19/49 patients with Parkinson's disease. These recordings were analysed for the presence of parkinsonism and cerebellar syndrome during rapid eye movement sleep movements. Clinical rapid eye movement sleep behaviour disorder was observed in 43/49 (88%) patients with multiple system atrophy. Reports from the 31/43 bed partners who were able to evaluate movements during sleep indicate that 81% of the patients showed some form of improvement during rapid eye movement sleep behaviour disorder. These included improved movement (73% of patients: faster, 67%; stronger, 52%; and smoother, 26%), improved speech (59% of patients: louder, 55%; more intelligible, 17%; and better articulated, 36%) and normalized facial expression (50% of patients). The rate of improvement was higher in Parkinson's disease than in multiple system atrophy, but no further difference was observed between the two forms of multiple system atrophy (predominant parkinsonism versus cerebellar syndrome). Video-monitored movements during rapid eye movement sleep in patients with multiple system atrophy revealed more expressive faces, and movements that were faster and more ample in comparison with facial expression and movements during wakefulness. These movements were still somewhat jerky but lacked any visible parkinsonism. Cerebellar signs were not assessable. We conclude that parkinsonism also disappears during rapid eye movement sleep behaviour disorder in patients with multiple system atrophy, but this improvement is not due to enhanced dopamine transmission because these patients are not levodopa-sensitive. These data suggest that these movements are not influenced by extrapyramidal regions; however, the influence of abnormal cerebellar control remains unclear. The transient disappearance of parkinsonism here is all the more surprising since no treatment (even dopaminergic) provides a real benefit in this disabling disease.

  2. Detection of differential viewing patterns to erotic and non-erotic stimuli using eye-tracking methodology.

    PubMed

    Lykins, Amy D; Meana, Marta; Kambe, Gretchen

    2006-10-01

    As a first step in the investigation of the role of visual attention in the processing of erotic stimuli, eye-tracking methodology was employed to measure eye movements during erotic scene presentation. Because eye-tracking is a novel methodology in sexuality research, we attempted to determine whether the eye-tracker could detect differences (should they exist) in visual attention to erotic and non-erotic scenes. A total of 20 men and 20 women were presented with a series of erotic and non-erotic images while their eye movements were tracked during image presentation. Comparisons between erotic and non-erotic image groups showed significant differences on two of three dependent measures of visual attention (number of fixations and total time) in both men and women. As hypothesized, there was a significant Stimulus x Scene Region interaction, indicating that participants visually attended to the body more in the erotic stimuli than in the non-erotic stimuli, as evidenced by a greater number of fixations and longer total time devoted to that region. These findings provide support for the application of eye-tracking methodology as a measure of visual attentional capture in sexuality research. Future applications of this methodology to expand our knowledge of the role of cognition in sexuality are suggested.

  3. Gain Modulation as a Mechanism for Coding Depth from Motion Parallax in Macaque Area MT

    PubMed Central

    Kim, HyungGoo R.; Angelaki, Dora E.

    2017-01-01

    Observer translation produces differential image motion between objects that are located at different distances from the observer's point of fixation [motion parallax (MP)]. However, MP can be ambiguous with respect to depth sign (near vs far), and this ambiguity can be resolved by combining retinal image motion with signals regarding eye movement relative to the scene. We have previously demonstrated that both extra-retinal and visual signals related to smooth eye movements can modulate the responses of neurons in area MT of macaque monkeys, and that these modulations generate neural selectivity for depth sign. However, the neural mechanisms that govern this selectivity have remained unclear. In this study, we analyze responses of MT neurons as a function of both retinal velocity and direction of eye movement, and we show that smooth eye movements modulate MT responses in a systematic, temporally precise, and directionally specific manner to generate depth-sign selectivity. We demonstrate that depth-sign selectivity is primarily generated by multiplicative modulations of the response gain of MT neurons. Through simulations, we further demonstrate that depth can be estimated reasonably well by a linear decoding of a population of MT neurons with response gains that depend on eye velocity. Together, our findings provide the first mechanistic description of how visual cortical neurons signal depth from MP. SIGNIFICANCE STATEMENT Motion parallax is a monocular cue to depth that commonly arises during observer translation. To compute from motion parallax whether an object appears nearer or farther than the point of fixation requires combining retinal image motion with signals related to eye rotation, but the neurobiological mechanisms have remained unclear. This study provides the first mechanistic account of how this interaction takes place in the responses of cortical neurons. Specifically, we show that smooth eye movements modulate the gain of responses of neurons in area MT in a directionally specific manner to generate selectivity for depth sign from motion parallax. We also show, through simulations, that depth could be estimated from a population of such gain-modulated neurons. PMID:28739582
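
    As a toy illustration of the mechanism described above, the sketch below simulates units whose retinal-velocity tuning is multiplicatively scaled by an eye-velocity-dependent gain and then decodes depth sign linearly from the population. All tuning parameters and the stimulus model are invented and far simpler than the recorded MT data.

```python
# Toy simulation of gain modulation: each unit's retinal-velocity tuning is
# scaled multiplicatively by a gain that depends on eye velocity, and depth
# sign (near vs. far) is decoded linearly from the population. Retinal
# velocity alone is ambiguous here because the same image motion can arise
# from near or far objects depending on the direction of the eye movement.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n_units, n_trials = 50, 400
pref_vel = rng.uniform(-10, 10, n_units)      # preferred retinal velocity (deg/s)
gain_slope = rng.uniform(-0.1, 0.1, n_units)  # sensitivity of gain to eye velocity

eye_vel = rng.uniform(-5, 5, n_trials)        # pursuit eye velocity (deg/s)
depth_sign = rng.choice([-1, 1], n_trials)    # -1 = near, +1 = far
retinal_vel = depth_sign * eye_vel            # simplified motion-parallax geometry

tuning = np.exp(-(retinal_vel[:, None] - pref_vel[None, :]) ** 2 / 20.0)
gain = 1.0 + gain_slope[None, :] * eye_vel[:, None]     # multiplicative modulation
rates = tuning * gain + 0.05 * rng.normal(size=(n_trials, n_units))

decoder = LogisticRegression(max_iter=1000).fit(rates[:200], depth_sign[:200])
print("Depth-sign decoding accuracy:", decoder.score(rates[200:], depth_sign[200:]))
```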

  4. VisualEyes: a modular software system for oculomotor experimentation.

    PubMed

    Guo, Yi; Kim, Eun H; Alvarez, Tara L

    2011-03-25

    Eye movement studies have provided a strong foundation for forming an understanding of how the brain acquires visual information in both the normal and dysfunctional brain.(1) However, development of a platform to stimulate and store eye movements can require substantial programming, time and costs. Many systems do not offer the flexibility to program numerous stimuli for a variety of experimental needs. However, the VisualEyes System has a flexible architecture, allowing the operator to choose any background and foreground stimulus, program one or two screens for tandem or opposing eye movements and stimulate the left and right eye independently. This system can significantly reduce the programming development time needed to conduct an oculomotor study. The VisualEyes System will be discussed in three parts: 1) the oculomotor recording device to acquire eye movement responses, 2) the VisualEyes software written in LabView, to generate an array of stimuli and store responses as text files and 3) offline data analysis. Eye movements can be recorded by several types of instrumentation, such as a limbus tracking system, a sclera search coil, or a video image system. Typical eye movement stimuli such as saccadic steps, vergent ramps and vergent steps with the corresponding responses will be shown. In this video report, we demonstrate the flexibility of a system to create numerous visual stimuli and record eye movements that can be utilized by basic scientists and clinicians to study healthy as well as clinical populations.

  5. Evidence-based ergonomics. A comparison of Japanese and American office layouts.

    PubMed

    Noro, Kageyu; Fujimaki, Goroh; Kishi, Shinsuke

    2003-01-01

    There is a variety of alternatives in office layouts. Yet the theoretical basis and criteria for predicting how well these layouts accommodate employees are poorly understood. The objective of this study was to evaluate criteria for selecting office layouts. Intensive computer workers worked in simulated office layouts in a controlled experimental laboratory. Eye movement measures indicate that knowledge work requires both concentration and interaction. Findings pointed to one layout as providing optimum balance between these 2 requirements. Recommendations for establishing a theoretical basis and design criteria for selecting office layouts based on work style are suggested.

  6. Eye Movements and the Use of Parafoveal Word Length Information in Reading

    PubMed Central

    Juhasz, Barbara J.; White, Sarah J.; Liversedge, Simon P.; Rayner, Keith

    2009-01-01

    Eye movements were monitored in 4 experiments that explored the role of parafoveal word length in reading. The experiments employed a type of compound word where the deletion of a letter results in 2 short words (e.g., backhand, back and). The boundary technique (K. Rayner, 1975) was employed to manipulate word length information in the parafovea. Accuracy of the parafoveal word length preview significantly affected landing positions and fixation durations. This disruption was larger for 2-word targets, but the results demonstrated that this interaction was not due to the morphological status of the target words. Manipulation of sentence context also demonstrated that parafoveal word length information can be used in combination with sentence context to narrow down lexical candidates. The 4 experiments converge in demonstrating that an important role of parafoveal word length information is to direct the eyes to the center of the parafoveal word. PMID:19045993

  7. Basal Ganglia Neuronal Activity during Scanning Eye Movements in Parkinson’s Disease

    PubMed Central

    Sieger, Tomáš; Bonnet, Cecilia; Serranová, Tereza; Wild, Jiří; Novák, Daniel; Růžička, Filip; Urgošík, Dušan; Růžička, Evžen; Gaymard, Bertrand; Jech, Robert

    2013-01-01

    The oculomotor role of the basal ganglia has been supported by extensive evidence, although their role in scanning eye movements is poorly understood. Nineteen Parkinson's disease patients, who underwent implantation of deep brain stimulation electrodes, were investigated with simultaneous intraoperative microelectrode recordings and single channel electrooculography in a scanning eye movement task by viewing a series of colored pictures selected from the International Affective Picture System. Four patients additionally underwent a visually guided saccade task. Microelectrode recordings were analyzed selectively from the subthalamic nucleus, substantia nigra pars reticulata and from the globus pallidus by the WaveClus program, which allowed for detection and sorting of individual neurons. The relationship between neuronal firing rate and eye movements was studied by crosscorrelation analysis. Out of 183 neurons that were detected, 130 were found in the subthalamic nucleus, 30 in the substantia nigra and 23 in the globus pallidus. Twenty percent of the neurons in each of these structures showed eye movement-related activity. Neurons related to scanning eye movements were mostly unrelated to the visually guided saccades. We conclude that a relatively large number of basal ganglia neurons are involved in eye motion control. Surprisingly, neurons related to scanning eye movements differed from neurons activated during saccades, suggesting functional specialization and segregation of both systems for eye movement control. PMID:24223158
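
    The cross-correlation analysis mentioned above can be sketched as follows, with simulated spikes and a simulated eye-velocity trace standing in for the recorded signals; the bin width and lag range are arbitrary choices.

```python
# Sketch of a firing-rate / eye-velocity cross-correlation: spikes are binned
# at the eye-trace sampling rate, both signals are z-scored, and the
# correlation is scanned over lags. The spikes and eye trace are simulated.
import numpy as np

def rate_eye_xcorr(spike_times, eye_velocity, fs=200.0, max_lag_s=0.5):
    """Normalized cross-correlation; positive lags mean the rate leads the eye."""
    n = len(eye_velocity)
    rate, _ = np.histogram(spike_times, bins=np.arange(n + 1) / fs)
    rate = (rate - rate.mean()) / (rate.std() + 1e-12)
    vel = (eye_velocity - eye_velocity.mean()) / (eye_velocity.std() + 1e-12)
    max_lag = int(max_lag_s * fs)
    lags = np.arange(-max_lag, max_lag + 1)
    xcorr = np.array([np.mean(rate[max(0, -k):n - max(0, k)] *
                              vel[max(0, k):n - max(0, -k)]) for k in lags])
    return lags / fs, xcorr

# Simulated example: spiking whose rate leads a 0.5 Hz eye-velocity
# oscillation by about 50 ms.
rng = np.random.default_rng(4)
fs, dur = 200.0, 10.0
t = np.arange(0, dur, 1 / fs)
eye_vel = np.sin(2 * np.pi * 0.5 * t) + 0.2 * rng.normal(size=t.size)
drive = 20 * (1 + np.sin(2 * np.pi * 0.5 * (t + 0.05)))    # firing rate in spikes/s
spike_times = t[rng.random(t.size) < drive / fs]            # thinned Poisson-like spikes
lags, xc = rate_eye_xcorr(spike_times, eye_vel, fs=fs)
print("Peak correlation at lag (s):", lags[np.argmax(xc)])
```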

  8. A common control signal and a ballistic stage can explain the control of coordinated eye-hand movements.

    PubMed

    Gopal, Atul; Murthy, Aditya

    2016-06-01

    Voluntary control has been extensively studied in the context of eye and hand movements made in isolation, yet little is known about the nature of control during eye-hand coordination. We probed this with a redirect task. Here subjects had to make reaching/pointing movements accompanied by coordinated eye movements but had to change their plans when the target occasionally changed its position during some trials. Using a race model framework, we found that separate effector-specific mechanisms may be recruited to control eye and hand movements when executed in isolation but when the same effectors are coordinated a unitary mechanism to control coordinated eye-hand movements is employed. Specifically, we found that performance curves were distinct for the eye and hand when these movements were executed in isolation but were comparable when they were executed together. Second, the time to switch motor plans, called the target step reaction time, was different in the eye-alone and hand-alone conditions but was similar in the coordinated condition under the assumption of a ballistic stage of ∼40 ms, on average. Interestingly, the existence of this ballistic stage could predict the extent of eye-hand dissociations seen in individual subjects. Finally, when subjects were explicitly instructed to control specifically a single effector (eye or hand), redirecting one effector had a strong effect on the performance of the other effector. Taken together, these results suggest that a common control signal and a ballistic stage are recruited when coordinated eye-hand movement plans require alteration. Copyright © 2016 the American Physiological Society.
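
    The target step reaction time in redirect tasks is often estimated within a race-model framework analogous to the stop-signal paradigm; whether the authors used exactly this integration method is an assumption. A minimal sketch with invented reaction times:

```python
# Sketch of the race-model "integration" estimate of the target step reaction
# time (TSRT): the percentile of the no-step RT distribution matching the
# fraction of non-compensated responses at a given step delay, minus that
# delay. Whether this exact method was used above is an assumption; the
# numbers below are invented.
import numpy as np

def tsrt_integration(no_step_rts, p_noncompensated, step_delay):
    rts = np.sort(np.asarray(no_step_rts))
    idx = int(np.ceil(p_noncompensated * len(rts))) - 1
    return rts[max(idx, 0)] - step_delay

rng = np.random.default_rng(6)
no_step_rts = rng.normal(420, 60, 500)    # ms, trials where the target never stepped
print("TSRT estimate (ms):", tsrt_integration(no_step_rts, 0.35, step_delay=100))
```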

  9. A common control signal and a ballistic stage can explain the control of coordinated eye-hand movements

    PubMed Central

    Gopal, Atul

    2016-01-01

    Voluntary control has been extensively studied in the context of eye and hand movements made in isolation, yet little is known about the nature of control during eye-hand coordination. We probed this with a redirect task. Here subjects had to make reaching/pointing movements accompanied by coordinated eye movements but had to change their plans when the target occasionally changed its position during some trials. Using a race model framework, we found that separate effector-specific mechanisms may be recruited to control eye and hand movements when executed in isolation but when the same effectors are coordinated a unitary mechanism to control coordinated eye-hand movements is employed. Specifically, we found that performance curves were distinct for the eye and hand when these movements were executed in isolation but were comparable when they were executed together. Second, the time to switch motor plans, called the target step reaction time, was different in the eye-alone and hand-alone conditions but was similar in the coordinated condition under the assumption of a ballistic stage of ∼40 ms, on average. Interestingly, the existence of this ballistic stage could predict the extent of eye-hand dissociations seen in individual subjects. Finally, when subjects were explicitly instructed to control specifically a single effector (eye or hand), redirecting one effector had a strong effect on the performance of the other effector. Taken together, these results suggest that a common control signal and a ballistic stage are recruited when coordinated eye-hand movement plans require alteration. PMID:26888104

  10. Word processing during reading sentences in patients with schizophrenia: evidences from the eyetracking technique.

    PubMed

    Fernández, Gerardo; Sapognikoff, Marcelo; Guinjoan, Salvador; Orozco, David; Agamennoni, Osvaldo

    2016-07-01

    The current study analyzes the effect of word properties (i.e., word length, word frequency and word predictability) on the eye movement behavior of patients with schizophrenia (SZ) compared to age-matched controls. Eighteen SZ patients and 40 age-matched controls participated in the study. Eye movements were recorded during the reading of regular sentences using the eyetracking technique, and eye movement analyses were performed using linear mixed models. The analysis revealed that patients with SZ made fewer single fixations and more second-pass fixations than healthy individuals (Controls). In addition, SZ patients showed an increase in gaze duration compared to Controls. Interestingly, the effects of current word frequency and current word length on processing were similar in Controls and SZ patients. The high rate of second-pass fixations and the low rate of single fixations might reveal impairments in working memory when integrating neighboring words. In contrast, word frequency and length processing might require less complex mechanisms, which were functioning in SZ patients. To the best of our knowledge, this is the first study measuring how patients with SZ dynamically process well-defined words embedded in regular sentences. The findings suggest that evaluation of the resulting changes in eye movement behavior may supplement current symptom-based diagnosis. Copyright © 2016 Elsevier Inc. All rights reserved.
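
    A minimal sketch of a linear mixed model of the kind mentioned above (gaze duration predicted by group, word frequency, and word length, with random intercepts per subject) is shown below using statsmodels; the column names and simulated data are hypothetical.

```python
# Sketch of a linear mixed model for gaze durations with fixed effects of
# group, word frequency and word length, and random intercepts per subject.
# Column names and the simulated data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 1000
subj = rng.integers(0, 58, n)                              # 18 SZ + 40 control subjects
data = pd.DataFrame({
    "subject": subj.astype(str),
    "group": np.where(subj < 18, "SZ", "control"),
    "log_word_freq": rng.normal(3.0, 1.0, n),
    "word_length": rng.integers(2, 12, n),
})
data["gaze_duration"] = (                                  # simulated outcome (ms)
    250 + 20 * (data["group"] == "SZ") - 15 * data["log_word_freq"]
    + 8 * data["word_length"] + rng.normal(0, 40, n)
)

model = smf.mixedlm("gaze_duration ~ group + log_word_freq + word_length",
                    data, groups=data["subject"]).fit()
print(model.summary())
```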

  11. Genetics Home Reference: Duane-radial ray syndrome

    MedlinePlus

    ... condition is characterized by a particular problem with eye movement called Duane anomaly (also known as Duane syndrome). ... the improper development of certain nerves that control eye movement. Duane anomaly limits outward eye movement (toward the ...

  12. Eye movements reduce vividness and emotionality of "flashforwards".

    PubMed

    Engelhard, Iris M; van den Hout, Marcel A; Janssen, Wilco C; van der Beek, Jorinde

    2010-05-01

    Earlier studies have shown that eye movements during retrieval of disturbing images about past events reduce their vividness and emotionality, which may be due to both tasks competing for working memory resources. This study examined whether eye movements reduce vividness and emotionality of visual distressing images about feared future events: "flashforwards". A non-clinical sample was asked to select two images of feared future events, which were self-rated for vividness and emotionality. These images were retrieved while making eye movements or without a concurrent secondary task, and then vividness and emotionality were rated again. Relative to the no-dual task condition, eye movements while thinking of future-oriented images resulted in decreased ratings of image vividness and emotional intensity. Apparently, eye movements reduce vividness and emotionality of visual images about past and future feared events. This is in line with a working memory account of the beneficial effects of eye movements, which predicts that any task that taxes working memory during retrieval of disturbing mental images will be beneficial. Copyright 2010 Elsevier Ltd. All rights reserved.

  13. Novel methodology to examine cognitive and experiential factors in language development: combining eye-tracking and LENA technology

    PubMed Central

    Odean, Rosalie; Nazareth, Alina; Pruden, Shannon M.

    2015-01-01

    Developmental systems theory posits that development cannot be segmented by influences acting in isolation, but should be studied through a scientific lens that highlights the complex interactions between these forces over time (Overton, 2013a). This poses a unique challenge for developmental psychologists studying complex processes like language development. In this paper, we advocate for the combining of highly sophisticated data collection technologies in an effort to move toward a more systemic approach to studying language development. We investigate the efficiency and appropriateness of combining eye-tracking technology and the LENA (Language Environment Analysis) system, an automated language analysis tool, in an effort to explore the relation between language processing in early development, and external dynamic influences like parent and educator language input in the home and school environments. Eye-tracking allows us to study language processing via eye movement analysis; these eye movements have been linked to both conscious and unconscious cognitive processing, and thus provide one means of evaluating cognitive processes underlying language development that does not require the use of subjective parent reports or checklists. The LENA system, on the other hand, provides automated language output that describes a child’s language-rich environment. In combination, these technologies provide critical information not only about a child’s language processing abilities but also about the complexity of the child’s language environment. Thus, when used in conjunction these technologies allow researchers to explore the nature of interacting systems involved in language development. PMID:26379591

  14. Eye Movement Control during Reading: I. The Location of Initial Eye Fixations on Words. Technical Report No. 406.

    ERIC Educational Resources Information Center

    McConkie, G. W.; And Others

    Sixty-six college students read two chapters from a contemporary novel while their eye movements were monitored. The eye movement data were analyzed to identify factors that influence the location of a reader's initial eye fixation on a word. When the data were partitioned according to the location of the prior fixation (i.e., launch site), the…

  15. Proprioceptive role for palisade endings in extraocular muscles: evidence from the Jendrassik Maneuver.

    PubMed

    Niechwiej-Szwedo, E; González, E; Bega, S; Verrier, M C; Wong, A M; Steinbach, M J

    2006-07-01

    A proprioceptive hypothesis for the control of eye movements has been recently proposed based on neuroanatomical tracing studies. It has been suggested that the non-twitch motoneurons could be involved in modulating the gain of sensory feedback from the eye muscles, analogous to the gamma (γ) motoneurons which control the gain of proprioceptive feedback in skeletal muscles. We conducted behavioral and psychophysical experiments to test the above hypothesis using the Jendrassik Maneuver (JM) to alter the activity of gamma motoneurons. It was hypothesized that the JM would alter the proprioceptive feedback from the eye muscles, which would result in misregistration of eye position and mislocalization of targets. In the first experiment, vergence eye movements and pointing responses were examined. Data showed that the JM affected the localization responses but not the actual eye position. Perceptual judgments were tested in the second experiment, and the results showed that targets were perceived as farther when the afferent feedback was altered by the JM. Overall, the results from the two experiments showed that eye position was perceived as more divergent with the JM, but the actual eye movements were not affected. We tested this further in Experiment 3 by examining the effect of JM on the amplitude and velocity of saccadic eye movements. As expected, there were no significant differences in saccadic parameters between the control and experimental conditions. Overall, the present study provides novel insight into the mechanism which may be involved in the use of sensory feedback from the eye muscles. Data from the first two experiments support the hypothesis that the JM alters the registered eye position, as evidenced by the localization errors. We propose that the altered eye position signal is due to the effect of the JM which changes the gain of the sensory feedback from the eye muscles, possibly via the activity of non-twitch motoneurons.

  16. An eye movement pre-training fosters the comprehension of processes and functions in technical systems.

    PubMed

    Skuballa, Irene T; Fortunski, Caroline; Renkl, Alexander

    2015-01-01

    The main research goal of the present study was to investigate to what extent pre-training of eye movements can facilitate knowledge acquisition in multimedia learning (pre-training principle). We combined considerations from research on eye movement modeling and pre-training to design and test a non-verbal, eye movement-based pre-training. Participants in the experimental condition watched an animated circle moving in close spatial resemblance to a static visualization of a solar plant that was accompanied by a narration in a subsequently presented learning environment. This training was expected to foster top-down processes, as reflected in gaze behavior during learning, and to enhance knowledge acquisition. We compared two groups (N = 45): participants in the experimental condition received the pre-training in a first step and processed the learning material in a second step, whereas the control group underwent the second step without any pre-training. The pre-training group outperformed the control group in learning outcomes, particularly in knowledge about processes and functions of the solar plant. However, the superior learning outcomes of the pre-training group could not be explained by eye-movement patterns. Furthermore, the pre-training moderated the relationship between experienced stress and learning outcomes: in the control group, high stress levels hindered learning, which was not found for the pre-training group. On a delayed posttest, participants were asked to draw a picture of the learning content. Although the effect of training on the quality of the drawings was non-significant, learning outcomes at the first testing time were associated with process-related aspects of drawing quality in the pre-training group. Overall, non-verbal pre-training is a successful instructional intervention for promoting learning in novices, although these processes were not directly reflected in learners' eye movement behavior during learning.

  17. Postural Effects on Pharyngeal Protective Reflex Mechanisms

    PubMed Central

    Malhotra, Atul; Trinder, John; Fogel, Robert; Stanchina, Michael; Patel, Sanjay R.; Schory, Karen; Kleverlaan, Darci; White, David P.

    2012-01-01

    Study Objectives Pharyngeal muscle dilators are important in obstructive sleep apnea pathogenesis because the failure of protective reflexes involving these muscles yields pharyngeal collapse. Conflicting results exist in the literature regarding the responsiveness of these muscles during stable non-rapid eye movement sleep. However, variations in posture in previous studies may have influenced these findings. We hypothesized that tongue protruder muscles are maximally responsive to negative pressure pulses during supine sleep, when posterior tongue displacement yields pharyngeal occlusion. Design We studied all subjects in the supine and lateral postures during wakefulness and stable non-rapid eye movement sleep by measuring genioglossus and tensor palatini electromyograms during basal breathing and following negative pressure pulses. Setting Upper-airway physiology laboratory of Sleep Medicine Division, Brigham and Women’s Hospital. Subjects/Participants 17 normal subjects. Measurements and Results We observed an increase in genioglossal responsiveness to negative pressure pulses in sleep as compared to wakefulness in supine subjects (3.9 ± 1.1 percentage of maximum [%max] vs 4.4 ± 1.0 %max) but a decrease in the lateral decubitus position (4.1 ± 1.0 %max vs 1.5 ± 0.4 %max), the interaction effect being significant. Despite this augmented reflex, collapsibility, as measured during negative pressure pulses, increased more while subjects were in the supine position as compared with the lateral decubitus position. While the interaction between wake-sleep state and position was also significant for the tensor palatini, the effect was weaker than for genioglossus, although, for tensor palatini, baseline activity was markedly reduced during non-rapid eye movement sleep as compared with wakefulness. Conclusions We conclude that body posture does have an important impact on genioglossal responsiveness to negative pressure pulses during non-rapid eye movement sleep. We speculate that this mechanism works to prevent pharyngeal occlusion when the upper airway is most vulnerable to collapse, e.g., during supine sleep. PMID:15532204

  18. A common stochastic accumulator with effector-dependent noise can explain eye-hand coordination

    PubMed Central

    Gopal, Atul; Viswanathan, Pooja

    2015-01-01

    The computational architecture that enables the flexible coupling between otherwise independent eye and hand effector systems is not understood. Using a drift diffusion framework, in which the variability of the reaction time (RT) distribution scales with mean RT, we tested the ability of a common stochastic accumulator to explain eye-hand coordination. Combining behavior, computational modeling, and electromyography, we show how a single stochastic accumulation to threshold, followed by noisy effector-dependent delays, explains eye-hand RT distributions and their correlation, whereas an alternative model with independent, interacting eye and hand accumulators does not. Interestingly, the common accumulator model did not explain the RT distributions of the same subjects when they made eye and hand movements in isolation. Taken together, these data suggest that a dedicated circuit underlies coordinated eye-hand planning. PMID:25568161
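
    A minimal simulation of the common-accumulator account described above, with all parameters hypothetical: a single noisy accumulation to threshold is shared by both effectors and followed by independent effector-specific delays.

        import numpy as np

        rng = np.random.default_rng(1)

        def common_accumulator_rts(n=5000, threshold=1.0, drift=6.0, drift_sd=1.5,
                                   eye_delay=(0.05, 0.01), hand_delay=(0.12, 0.03)):
            """Single accumulation to threshold shared by both effectors, followed by
            independent, noisy effector-specific delays (all parameters hypothetical)."""
            rates = np.clip(rng.normal(drift, drift_sd, n), 0.1, None)  # trial-to-trial drift variability
            decision_time = threshold / rates                           # time for the shared signal to hit threshold
            eye_rt = decision_time + rng.normal(*eye_delay, n)
            hand_rt = decision_time + rng.normal(*hand_delay, n)
            return eye_rt, hand_rt

        eye_rt, hand_rt = common_accumulator_rts()
        print("eye-hand RT correlation:", np.corrcoef(eye_rt, hand_rt)[0, 1].round(2))

    Because the decision time is shared, the simulated eye and hand RTs co-vary strongly; replacing the shared stage with two independent accumulators would drive this correlation toward zero, which is the contrast the study exploits.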

  19. Premotor neurons encode torsional eye velocity during smooth-pursuit eye movements

    NASA Technical Reports Server (NTRS)

    Angelaki, Dora E.; Dickman, J. David

    2003-01-01

    Responses to horizontal and vertical ocular pursuit and head and body rotation in multiple planes were recorded in eye movement-sensitive neurons in the rostral vestibular nuclei (VN) of two rhesus monkeys. When tested during pursuit through primary eye position, the majority of the cells preferred either horizontal or vertical target motion. During pursuit of targets that moved horizontally at different vertical eccentricities or vertically at different horizontal eccentricities, eye angular velocity has been shown to include a torsional component the amplitude of which is proportional to half the gaze angle ("half-angle rule" of Listing's law). Approximately half of the neurons, the majority of which were characterized as "vertical" during pursuit through primary position, exhibited significant changes in their response gain and/or phase as a function of gaze eccentricity during pursuit, as if they were also sensitive to torsional eye velocity. Multiple linear regression analysis revealed a significant contribution of torsional eye movement sensitivity to the responsiveness of the cells. These findings suggest that many VN neurons encode three-dimensional angular velocity, rather than the two-dimensional derivative of eye position, during smooth-pursuit eye movements. Although no clear clustering of pursuit preferred-direction vectors along the semicircular canal axes was observed, the sensitivity of VN neurons to torsional eye movements might reflect a preservation of similar premotor coding of visual and vestibular-driven slow eye movements for both lateral-eyed and foveate species.
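
    As a rough formalization (not taken from the article), the half-angle rule states that during eccentric pursuit the eye's angular velocity axis tilts out of Listing's plane by half the gaze eccentricity θ, so the torsional component of eye velocity is approximately

        \omega_{\mathrm{tor}} \;=\; \lVert \boldsymbol{\omega} \rVert \, \sin\!\left(\tfrac{\theta}{2}\right) \;\approx\; \tfrac{\theta}{2}\, \lVert \boldsymbol{\omega} \rVert \quad (\theta \text{ in radians}),

    which is the gaze-dependent torsional signal to which roughly half of the recorded vestibular nuclei neurons appeared sensitive.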

  20. High-Speed Video-Oculography for Measuring Three-Dimensional Rotation Vectors of Eye Movements in Mice

    PubMed Central

    Takeda, Noriaki; Uno, Atsuhiko; Inohara, Hidenori; Shimada, Shoichi

    2016-01-01

    Background The mouse is the most commonly used animal model in biomedical research because of recent advances in molecular genetic techniques. Studies of eye movement in mice are common in fields such as ophthalmology (vision), neuro-otology (the vestibulo-ocular reflex, VOR), neurology (the cerebellum’s role in movement), and psychology (attention). Recording eye movements in mice, however, is technically difficult. Methods We developed a new algorithm for analyzing the three-dimensional (3D) rotation vector of eye movement in mice using high-speed video-oculography (VOG). The algorithm makes it possible to analyze the gain and phase of the VOR using the eye’s angular velocity around the axis of eye rotation. Results When mice were rotated at 0.5 Hz and 2.5 Hz around the earth-vertical axis with their heads in a 30° nose-down position, the vertical components of their left eye movements were in phase with the horizontal components. The VOR gain was 0.42 at 0.5 Hz and 0.74 at 2.5 Hz, and the phase lead of the eye movement relative to the turntable was 16.1° at 0.5 Hz and 4.88° at 2.5 Hz. Conclusions To the best of our knowledge, this is the first report of an algorithm for calculating a 3D rotation vector of eye movement in mice from high-speed VOG. We developed a technique for analyzing the 3D rotation vector of eye movements in mice with a high-speed infrared CCD camera and conclude that the technique is suitable for analyzing eye movements in mice. We also include with this article C++ source code that calculates the 3D rotation vector of eye position from the two-dimensional coordinates of the pupil and an iris freckle in the image. PMID:27023859
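
    As an illustration of what a 3D rotation-vector representation involves (a simplified sketch, not the authors' C++ algorithm), the rotation vector taking a reference gaze direction to the current gaze direction can be computed from the axis and angle between the two unit vectors; torsion, which the authors recover from an iris freckle, is omitted here.

        import numpy as np

        def rotation_vector(gaze, gaze_ref=np.array([0.0, 0.0, 1.0])):
            """Axis-angle rotation vector taking the reference gaze direction to the
            current gaze direction (torsion would add a third degree of freedom)."""
            gaze = gaze / np.linalg.norm(gaze)
            axis = np.cross(gaze_ref, gaze)
            s = np.linalg.norm(axis)          # sin(angle)
            c = np.dot(gaze_ref, gaze)        # cos(angle)
            angle = np.arctan2(s, c)
            if s < 1e-12:
                return np.zeros(3)
            return axis / s * angle           # rotation vector: axis * angle (rad)

        # Example: eye rotated 20 degrees away from the reference direction.
        g = np.array([np.sin(np.radians(20)), 0.0, np.cos(np.radians(20))])
        print(np.degrees(rotation_vector(g)))  # ~[0, 20, 0], depending on axis convention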

  1. Effect of glaucoma on eye movement patterns and laboratory-based hazard detection ability

    PubMed Central

    Black, Alex A.; Wood, Joanne M.

    2017-01-01

    Purpose The mechanisms underlying the elevated crash rates of older drivers with glaucoma are poorly understood. A key driving skill is timely detection of hazards; however, the hazard detection ability of drivers with glaucoma has been largely unexplored. This study assessed the eye movement patterns and visual predictors of performance on a laboratory-based hazard detection task in older drivers with glaucoma. Methods Participants included 30 older drivers with glaucoma (71±7 years; average better-eye mean deviation (MD) = −3.1±3.2 dB; average worse-eye MD = −11.9±6.2 dB) and 25 age-matched controls (72±7 years). Visual acuity, contrast sensitivity, visual fields, useful field of view (UFoV; processing speeds), and motion sensitivity were assessed. Participants completed a computerised Hazard Perception Test (HPT) while their eye movements were recorded using a desk-mounted Tobii TX300 eye-tracking system. The HPT comprises a series of real-world traffic videos recorded from the driver’s perspective; participants responded to road hazards appearing in the videos, and hazard response times were determined. Results Participants with glaucoma exhibited an average of 0.42 seconds delay in hazard response time (p = 0.001), smaller saccades (p = 0.010), and delayed first fixation on hazards (p<0.001) compared to controls. Importantly, larger saccades were associated with faster hazard responses in the glaucoma group (p = 0.004), but not in the control group (p = 0.19). Across both groups, significant visual predictors of hazard response times included motion sensitivity, UFoV, and worse-eye MD (p<0.05). Conclusions Older drivers with glaucoma had delayed hazard response times compared to controls, with associated changes in eye movement patterns. The association between larger saccades and faster hazard response time in the glaucoma group may represent a compensatory behaviour to facilitate improved performance. PMID:28570621

  2. Constraining eye movement in individuals with Parkinson's disease during walking turns.

    PubMed

    Ambati, V N Pradeep; Saucedo, Fabricio; Murray, Nicholas G; Powell, Douglas W; Reed-Jones, Rebecca J

    2016-10-01

    Walking and turning is a movement that places individuals with Parkinson's disease (PD) at increased risk of fall-related injury. However, turning is an essential movement in activities of daily living, making up to 45% of the total steps taken in a given day. Hypotheses regarding how turning is controlled suggest an essential role of anticipatory eye movements in providing feedforward information for body coordination. However, little research has investigated the control of turning in individuals with PD with specific consideration of eye movements. The purpose of this study was to examine eye movement behavior and body segment coordination in individuals with PD during walking turns. Three experimental groups, individuals with PD, healthy young adult controls (YAC), and healthy older adult controls (OAC), performed walking and turning tasks under two visual conditions: free gaze and fixed gaze. Whole-body motion capture and eye tracking characterized body segment coordination and eye movement behavior during walking trials. Statistical analysis revealed significant main effects of group (PD, YAC, and OAC) and visual condition (free and fixed gaze) on the timing of segment rotation and horizontal eye movement. Within-group comparisons revealed that the timing of eye and head movements differed significantly between the free and fixed gaze conditions for YAC (p < 0.001) and OAC (p < 0.05), but not for the PD group (p > 0.05). In addition, while intersegment timings (reflecting segment coordination) differed significantly for YAC and OAC during free gaze (p < 0.05), they did not in PD. These results suggest that individuals with PD do not make anticipatory eye and head movements ahead of turning and that this may result in altered segment coordination during turning. As such, eye movements may be an important addition to training programs for those with PD, possibly promoting better coordination during turning and potentially reducing the risk of falls.

  3. The anatomy and physiology of the ocular motor system.

    PubMed

    Horn, Anja K E; Leigh, R John

    2011-01-01

    Accurate diagnosis of abnormal eye movements depends upon knowledge of the purpose, properties, and neural substrate of distinct functional classes of eye movement. Here, we summarize current concepts of the anatomy of eye movement control. Our approach is bottom-up, starting with the extraocular muscles and their innervation by the cranial nerves. Second, we summarize the neural circuits in the pons underlying horizontal gaze control, and the midbrain connections that coordinate vertical and torsional movements. Third, the role of the cerebellum in governing and optimizing eye movements is presented. Fourth, each area of cerebral cortex contributing to eye movements is discussed. Last, descending projections from cerebral cortex, including basal ganglionic circuits that govern different components of gaze, and the superior colliculus, are summarized. At each stage of this review, the anatomical scheme is used to predict the effects of lesions on the control of eye movements, providing clinical-anatomical correlation. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. An Integrative Model for the Neural Mechanism of Eye Movement Desensitization and Reprocessing (EMDR).

    PubMed

    Coubard, Olivier A

    2016-01-01

    Since the seminal report by Shapiro that bilateral stimulation induces cognitive and emotional changes, 26 years of basic and clinical research have examined the effects of Eye Movement Desensitization and Reprocessing (EMDR) in anxiety disorders, particularly in post-traumatic stress disorder (PTSD). The present article aims at a better understanding of the neural mechanism of EMDR. I first review procedural aspects of the EMDR protocol and theoretical hypotheses about EMDR effects, and develop the reasons why the scientific community is still divided about EMDR. I then move from psychology to physiology, describing the interaction between eye movements and emotion from the physiological viewpoint, and introduce theoretical and technical tools used in movement research to re-examine the neural mechanism of EMDR. Using a recent physiological model of the neuropsychological architecture of motor and cognitive control, the Threshold Interval Modulation with Early Release-Rate of rIse Deviation with Early Release (TIMER-RIDER) model, I explore how attentional control and bilateral stimulation may contribute to EMDR effects. These effects may be obtained by two processes acting in parallel: (i) enhancement of the activity level of the attentional control component; and (ii) bilateral stimulation in any sensorimotor modality, both resulting in lower inhibition, enabling dysfunctional information to be processed and anxiety to be reduced. The TIMER-RIDER model offers quantitative predictions about EMDR effects for future research on its underlying physiological mechanisms.

  5. Rehearsal in serial memory for visual-spatial information: evidence from eye movements.

    PubMed

    Tremblay, Sébastien; Saint-Aubin, Jean; Jalbert, Annie

    2006-06-01

    It is well established that rote rehearsal plays a key role in serial memory for lists of verbal items. Although a great deal of research has informed us about the nature of verbal rehearsal, much less attention has been devoted to rehearsal in serial memory for visual-spatial information. Using the dot task (a visual-spatial analogue of the classical verbal serial recall task) with delayed recall, performance and eye-tracking data were recorded in order to establish whether visual-spatial rehearsal could be evidenced by eye movements. The use of eye movements as a form of rehearsal is detectable (Experiment 1), and it seems to contribute to serial memory performance over and above rehearsal based on shifts of spatial attention (Experiments 1 and 2).

  6. Smooth-pursuit eye-movement-related neuronal activity in macaque nucleus reticularis tegmenti pontis.

    PubMed

    Suzuki, David A; Yamada, Tetsuto; Yee, Robert D

    2003-04-01

    Neuronal responses that were observed during smooth-pursuit eye movements were recorded from cells in rostral portions of the nucleus reticularis tegmenti pontis (rNRTP). The responses were categorized as smooth-pursuit eye velocity (78%) or eye acceleration (22%). A separate population of rNRTP cells encoded static eye position. The sensitivity to pursuit eye velocity averaged 0.81 spikes/s per deg/s, whereas the average sensitivity to pursuit eye acceleration was 0.20 spikes/s per deg/s^2. Of the eye-velocity cells with horizontal preferences for pursuit responses, 56% were optimally responsive to contraversive smooth-pursuit eye movements and 44% preferred ipsiversive pursuit. For cells with vertical pursuit preferences, 61% preferred upward pursuit and 39% preferred downward pursuit. The direction selectivity was broad, with 50% of the maximal response amplitude observed for directions of smooth pursuit up to +/-85 degrees away from the optimal direction. The activities of some rNRTP cells were linearly related to eye position with an average sensitivity of 2.1 spikes/s per deg. In some cells, the magnitude of the response during smooth-pursuit eye movements was affected by the position of the eyes even though these cells did not encode eye position. On average, pursuit centered to one side of screen center elicited a response that was 73% of the response amplitude obtained with tracking centered at screen center. For pursuit centered on the opposite side, the average response was 127% of the response obtained at screen center. The results provide a neuronal rationale for the slow, pursuit-like eye movements evoked with rNRTP microstimulation and for the deficits in smooth-pursuit eye movements observed with ibotenic acid injection into rNRTP. More globally, the results support the notion of a frontal and supplementary eye field-rNRTP-cerebellum pathway involved with controlling smooth-pursuit eye movements.
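
    Taken at face value, the reported average sensitivities correspond to a simple linear rate model of the kind typically fitted to such data. For a hypothetical cell carrying velocity, acceleration, and position signals at the quoted average gains,

        r(t) \;\approx\; r_0 + 0.81\,\dot{E}(t) + 0.20\,\ddot{E}(t) + 2.1\,E(t),

    with E in deg, \dot{E} in deg/s, \ddot{E} in deg/s^2, and r in spikes/s; the actual cells in the study carried only subsets of these signals.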

  7. Believing androids - fMRI activation in the right temporo-parietal junction is modulated by ascribing intentions to non-human agents.

    PubMed

    Özdem, Ceylan; Wiese, Eva; Wykowska, Agnieszka; Müller, Hermann; Brass, Marcel; Van Overwalle, Frank

    2017-10-01

    Attributing mind to interaction partners has been shown to increase the social relevance we ascribe to others' actions and to modulate the amount of attention dedicated to them. However, it remains unclear how the relationship between higher-order mind attribution and lower-level attention processes is established in the brain. In this neuroimaging study, participants saw images of an anthropomorphic robot that moved its eyes left- or rightwards to signal the appearance of an upcoming stimulus in the same (valid cue) or opposite location (invalid cue). Independently, participants' beliefs about the intentionality underlying the observed eye movements were manipulated by describing the eye movements as under human control or preprogrammed. As expected, we observed a validity effect behaviorally and neurologically (increased response times and activation in the invalid vs. valid condition). More importantly, we observed that this effect was more pronounced for the condition in which the robot's behavior was believed to be controlled by a human, as opposed to preprogrammed. This interaction effect between cue validity and belief was, however, only found at the neural level and was manifested as a significant increase of activation in bilateral anterior temporoparietal junction.

  8. Factors influencing the shear rate acting on silicone oil to cause silicone oil emulsification.

    PubMed

    Chan, Yau Kei; Cheung, Ning; Wong, David

    2014-10-30

    The shear force between the silicone oil (SO) bubble and the aqueous phase during eye movements may underlie the development of SO emulsification. This study examined factors that may affect the shear force induced by eye movements. A surface-modified model eye chamber was subjected to large-amplitude eye movements (amplitude 90°, angular velocity 360°/s, duration 300 ms). Agarose indentations were introduced to mimic the effect of an encircling scleral buckle. Two SOs (1300 and 5000 centistokes [cSt]), three volumes (3, 4, and 5 mL), and two eye chambers (with and without indentation) were tested. Video recording was used to capture the movements of SO inside the model chamber under the various conditions. The presence of an indentation within the eye chamber significantly reduced the velocity of SO movements relative to the chamber movements (P < 0.001). To a lesser extent, an increase in viscosity also significantly reduced the relative movements. No significant effect was observed for the extent of SO fill in the chamber. Our experimental model suggests that an indentation within an eye, such as that created by scleral buckling, may have the greatest influence in reducing the shear force induced by eye movements. Therefore, using an encircling scleral buckle may be similarly or more effective than using SO of higher viscosity in lowering the propensity for SO emulsification. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.
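
    In a simplified Couette approximation (an illustration, not part of the study), the shear stress exerted on the oil surface by a thin aqueous layer of thickness h moving at relative velocity Δv is

        \tau \;=\; \mu_{\mathrm{aq}}\,\dot{\gamma} \;\approx\; \mu_{\mathrm{aq}}\,\frac{\Delta v}{h},

    so any factor that lowers the relative velocity between the oil bubble and the chamber wall, such as an indentation or a more viscous (and hence more slowly moving) oil, lowers the shear available to emulsify the oil.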

  9. Development of a novel visuomotor integration paradigm by integrating a virtual environment with mobile eye-tracking and motion-capture systems

    PubMed Central

    Miller, Haylie L.; Bugnariu, Nicoleta; Patterson, Rita M.; Wijayasinghe, Indika; Popa, Dan O.

    2018-01-01

    Visuomotor integration (VMI), the use of visual information to guide motor planning, execution, and modification, is necessary for a wide range of functional tasks. To comprehensively, quantitatively assess VMI, we developed a paradigm integrating virtual environments, motion-capture, and mobile eye-tracking. Virtual environments enable tasks to be repeatable, naturalistic, and varied in complexity. Mobile eye-tracking and minimally-restricted movement enable observation of natural strategies for interacting with the environment. This paradigm yields a rich dataset that may inform our understanding of VMI in typical and atypical development. PMID:29876370

  10. Eye movements when viewing advertisements

    PubMed Central

    Higgins, Emily; Leinenger, Mallorie; Rayner, Keith

    2013-01-01

    In this selective review, we examine key findings on eye movements when viewing advertisements. We begin with a brief, general introduction to the properties and neural underpinnings of saccadic eye movements. Next, we provide an overview of eye movement behavior during reading, scene perception, and visual search, since each of these activities is, at various times, involved in viewing ads. We then review the literature on eye movements when viewing print ads and warning labels (of the kind that appear on alcohol and tobacco ads), before turning to a consideration of advertisements in dynamic media (television and the Internet). Finally, we propose topics and methodological approaches that may prove to be useful in future research. PMID:24672500

  11. Cue-dependent memory-based smooth-pursuit in normal human subjects: importance of extra-retinal mechanisms for initial pursuit.

    PubMed

    Ito, Norie; Barnes, Graham R; Fukushima, Junko; Fukushima, Kikuro; Warabi, Tateo

    2013-08-01

    Using a cue-dependent memory-based smooth-pursuit task previously applied to monkeys, we examined the effects of visual motion-memory on smooth-pursuit eye movements in normal human subjects and compared the results with those of the trained monkeys. The results were also compared with those obtained during simple ramp-pursuit, which did not require visual motion-memory. During memory-based pursuit, all subjects exhibited virtually no errors in either pursuit-direction or go/no-go selection. Tracking eye movements were similar in humans and monkeys, but differed between the two tasks: during memory-based pursuit, latencies of the pursuit and corrective saccades were prolonged, initial pursuit eye velocity and acceleration were lower, peak velocities were lower, and the time to reach peak velocity was longer. These characteristics were similar to anticipatory pursuit initiated by extra-retinal components during the initial extinction task of Barnes and Collins (J Neurophysiol 100:1135-1146, 2008b). We suggest that the differences between the two tasks reflect differences in the contribution of extra-retinal and retinal components. This interpretation is supported by two further observations: (1) when the correct spot was popped out to enhance retinal image-motion inputs during memory-based pursuit, pursuit eye velocities approached those during simple ramp-pursuit, and (2) when spot motion was initially blanked during memory-based pursuit, pursuit components appeared in the correct direction. Our results show the importance of extra-retinal mechanisms for initial pursuit during memory-based pursuit, which include priming effects and extra-retinal drive components. Comparison with monkey studies of neuronal responses and model analyses suggests possible pathways for these extra-retinal mechanisms.

  12. Gaze-contingent displays: a review.

    PubMed

    Duchowski, Andrew T; Cournia, Nathan; Murphy, Hunter

    2004-12-01

    Gaze-contingent displays (GCDs) attempt to balance the amount of information displayed against the visual information processing capacity of the observer through real-time eye movement sensing. Based on the assumed knowledge of the instantaneous location of the observer's focus of attention, GCD content can be "tuned" through several display processing means. Screen-based displays alter pixel-level information, generally matching the resolvability of the human retina in an effort to maximize bandwidth. Model-based displays alter geometric-level primitives along similar goals. Attentive user interfaces (AUIs) manage object-level entities (e.g., windows, applications) depending on the assumed attentive state of the observer. Such real-time display manipulation is generally achieved through non-contact, unobtrusive tracking of the observer's eye movements. This paper briefly reviews past and present display techniques as well as emerging graphics and eye tracking technology for GCD development.
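
    A sketch of the screen-based variant described above: relative display resolution is computed per pixel from eccentricity using a simple acuity falloff (the half-resolution constant e2 is an assumed value), and the resulting map would then drive per-pixel blur or mipmap selection on each new gaze sample.

        import numpy as np

        def resolution_map(width, height, gaze_xy, px_per_deg=30.0, e2=2.3):
            """Relative resolution (1.0 at the gaze point, falling off with eccentricity)
            following a simple acuity model R(e) = e2 / (e2 + e)."""
            xs, ys = np.meshgrid(np.arange(width), np.arange(height))
            ecc_deg = np.hypot(xs - gaze_xy[0], ys - gaze_xy[1]) / px_per_deg
            return e2 / (e2 + ecc_deg)

        # Each frame, the map could select a blur radius or mipmap level per pixel,
        # conditioned on the latest gaze sample from the eye tracker.
        m = resolution_map(1920, 1080, gaze_xy=(960, 540))
        print(m.max(), m.min())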

  13. Visual field recovery after vision restoration therapy (VRT) is independent of eye movements: an eye tracker study.

    PubMed

    Kasten, Erich; Bunzenthal, Ulrike; Sabel, Bernhard A

    2006-11-25

    It has been argued that patients with visual field defects compensate for their deficit by making more frequent eye movements toward the hemianopic field and that visual field enlargements found after vision restoration therapy (VRT) may be an artefact of such eye movements. In order to determine if this was correct, we recorded eye movements in hemianopic subjects before and after VRT. Visual fields were measured in subjects with homonymous visual field defects (n=15) caused by trauma, cerebral ischemia or haemorrhage (lesion age >6 months). Visual field charts were plotted using both high-resolution perimetry (HRP) and conventional perimetry before and after a 3-month period of VRT, with eye movements being recorded with a 2D-eye tracker. This permitted quantification of eye positions and measurements of deviation from fixation. VRT led to significant visual field enlargements as indicated by an increase of stimulus detection of 3.8% when tested using HRP and about 2.2% (OD) and 3.5% (OS) fewer misses with conventional perimetry. Eye movements were expressed as the standard deviations (S.D.) of the eye position recordings from fixation. Before VRT, the S.D. was +/-0.82 degrees horizontally and +/-1.16 degrees vertically; after VRT, it was +/-0.68 degrees and +/-1.39 degrees, respectively. A cluster analysis of the horizontal eye movements before VRT showed three types of subjects with (i) small (n=7), (ii) medium (n=7) or (iii) large fixation instability (n=1). Saccades were directed equally to the right or the left side; i.e., with no preference toward the blind hemifield. After VRT, many subjects showed a smaller variability of horizontal eye movements. Before VRT, 81.6% of the recorded eye positions were found within a range of 1 degrees horizontally from fixation, whereas after VRT, 88.3% were within that range. In the 2 degrees range, we found 94.8% before and 98.9% after VRT. Subjects moved their eyes 5 degrees or more 0.3% of the time before VRT versus 0.1% after VRT. Thus, in this study, subjects with homonymous visual field defects who were attempting to fixate a central target while their fields were being plotted, typically showed brief horizontal shifts with no preference toward or away from the blind hemifield. These eye movements were usually less than 1 degrees from fixation. Large saccades toward the blind field after VRT were very rare. VRT has no effect on either the direction or the amplitude of horizontal eye movements during visual field testing. These results argue against the theory that the visual field enlargements are artefacts induced by eye movements.

  14. GazeParser: an open-source and multiplatform library for low-cost eye tracking and analysis.

    PubMed

    Sogo, Hiroyuki

    2013-09-01

    Eye movement analysis is an effective method for research on visual perception and cognition. However, recording eye movements presents practical difficulties related to the cost of the recording devices and the programming of device controls for use in experiments. GazeParser is an open-source library for low-cost eye tracking and data analysis; it consists of a video-based eyetracker and libraries for data recording and analysis. The libraries are written in Python and can be used in conjunction with the PsychoPy and VisionEgg experimental control libraries. Three eye movement experiments testing the performance of GazeParser are reported. These showed that the means and standard deviations of the errors in sampling intervals were less than 1 ms. Spatial accuracy ranged from 0.7° to 1.2°, depending on the participant. In gap/overlap tasks and antisaccade tasks, the latency and amplitude of the saccades detected by GazeParser agreed with those detected by a commercial eyetracker. These results show that GazeParser demonstrates adequate performance for use in psychological experiments.
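
    Spatial accuracy figures like the 0.7°-1.2° quoted above are typically obtained by converting on-screen pixel errors at validation targets into visual angle; a generic sketch of that conversion (not GazeParser's own API, with assumed screen geometry) is shown below.

        import numpy as np

        def pixels_to_degrees(dist_px, screen_width_px=1920,
                              screen_width_cm=52.0, viewing_cm=60.0):
            """Convert an on-screen distance in pixels to visual angle in degrees."""
            dist_cm = dist_px * screen_width_cm / screen_width_px
            return np.degrees(2 * np.arctan(dist_cm / (2 * viewing_cm)))

        # Mean angular error over validation targets (gaze and targets in pixels).
        gaze = np.array([[962, 541], [330, 205]])
        targets = np.array([[960, 540], [320, 200]])
        errors = [pixels_to_degrees(np.linalg.norm(g - t)) for g, t in zip(gaze, targets)]
        print(np.mean(errors))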

  15. Eye Movements of Online Chinese Learners

    ERIC Educational Resources Information Center

    Stickler, Ursula; Shi, Lijing

    2015-01-01

    Although online tutorials are becoming commonplace for language teaching, very few studies to date have provided insights into learners' behaviours in synchronous online interactions from their own perspective. This study employs eyetracking technology to investigate ten learners' attention during synchronous online language learning in a…

  16. The cell adhesion molecules Echinoid and Friend of Echinoid coordinate cell adhesion and cell signaling to regulate the fidelity of ommatidial rotation in the Drosophila eye.

    PubMed

    Fetting, Jennifer L; Spencer, Susan A; Wolff, Tanya

    2009-10-01

    Directed cellular movements are a universal feature of morphogenesis in multicellular organisms. Differential adhesion between the stationary and motile cells promotes these cellular movements to effect spatial patterning of cells. A prominent feature of Drosophila eye development is the 90 degrees rotational movement of the multicellular ommatidial precursors within a matrix of stationary cells. We demonstrate that the cell adhesion molecules Echinoid (Ed) and Friend of Echinoid (Fred) act throughout ommatidial rotation to modulate the degree of ommatidial precursor movement. We propose that differential levels of Ed and Fred between stationary and rotating cells at the initiation of rotation create a permissive environment for cell movement, and that uniform levels in these two populations later contribute to stopping the movement. Based on genetic data, we propose that ed and fred impart a second, independent, 'brake-like' contribution to this process via Egfr signaling. Ed and Fred are localized in largely distinct and dynamic patterns throughout rotation. However, ed and fred are required in only a subset of cells - photoreceptors R1, R7 and R6 - for normal rotation, cells that have only recently been linked to a role in planar cell polarity (PCP). This work also provides the first demonstration of a requirement for cone cells in the ommatidial rotation aspect of PCP. ed and fred also genetically interact with the PCP genes, but affect only the degree-of-rotation aspect of the PCP phenotype. Significantly, we demonstrate that at least one PCP protein, Stbm, is required in R7 to control the degree of ommatidial rotation.

  17. Cranial mononeuropathy III

    MedlinePlus

    ... is one of the cranial nerves that control eye movement. Causes may include: Brain aneurysm Infections Abnormal blood ... show: Enlarged (dilated) pupil of the affected eye Eye movement abnormalities Eyes that are not aligned Your health ...

  18. Comprehensive Oculomotor Behavioral Response Assessment (COBRA)

    NASA Technical Reports Server (NTRS)

    Stone, Leland S. (Inventor); Liston, Dorion B. (Inventor)

    2017-01-01

    An eye movement-based methodology and assessment tool may be used to quantify many aspects of human dynamic visual processing using a relatively simple and short oculomotor task, noninvasive video-based eye tracking, and validated oculometric analysis techniques. By examining the eye movement responses to a task including a radially-organized, appropriately randomized sequence of Rashbass-like step-ramp pursuit-tracking trials, distinct performance measurements may be generated that may be associated with, for example, pursuit initiation (e.g., latency and open-loop pursuit acceleration), steady-state tracking (e.g., gain, catch-up saccade amplitude, and the proportion of the steady-state response consisting of smooth movement), direction tuning (e.g., oblique effect amplitude, horizontal-vertical asymmetry, and direction noise), and speed tuning (e.g., speed responsiveness and noise). This quantitative approach may provide fast results (e.g., a multi-dimensional set of oculometrics and a single scalar impairment index) that can be interpreted by someone without a high degree of scientific sophistication or extensive training.
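
    Several of the steady-state oculometrics listed above reduce to simple ratios. For example, steady-state pursuit gain is commonly computed as desaccaded eye velocity divided by target velocity over a fixed analysis window; the sketch below uses assumed parameters and is a generic illustration, not the COBRA implementation itself.

        import numpy as np

        def pursuit_gain(eye_pos_deg, target_vel_deg_s, fs=500.0, window=(0.4, 0.8)):
            """Steady-state pursuit gain: mean eye velocity / target velocity.
            eye_pos_deg should already have saccades removed (desaccaded trace)."""
            vel = np.gradient(eye_pos_deg) * fs            # deg/s
            i0, i1 = int(window[0] * fs), int(window[1] * fs)
            return np.nanmean(vel[i0:i1]) / target_vel_deg_s

        # Synthetic example: eye tracking a 15 deg/s ramp with 90% gain.
        t = np.arange(0, 1.0, 1 / 500.0)
        eye = 0.9 * 15.0 * t
        print(pursuit_gain(eye, 15.0))   # ~0.9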

  19. Using E-Z Reader to Simulate Eye Movements in Nonreading Tasks: A Unified Framework for Understanding the Eye-Mind Link

    ERIC Educational Resources Information Center

    Reichle, Erik D.; Pollatsek, Alexander; Rayner, Keith

    2012-01-01

    Nonreading tasks that share some (but not all) of the task demands of reading have often been used to make inferences about how cognition influences when the eyes move during reading. In this article, we use variants of the E-Z Reader model of eye-movement control in reading to simulate eye-movement behavior in several of these tasks, including…

  20. The coeruleus/subcoeruleus complex in idiopathic rapid eye movement sleep behaviour disorder.

    PubMed

    Ehrminger, Mickael; Latimier, Alice; Pyatigorskaya, Nadya; Garcia-Lorenzo, Daniel; Leu-Semenescu, Smaranda; Vidailhet, Marie; Lehericy, Stéphane; Arnulf, Isabelle

    2016-04-01

    Idiopathic rapid eye movement sleep behaviour disorder is characterized by nocturnal violence, increased muscle tone during rapid eye movement sleep and the lack of any other neurological disease. However, idiopathic rapid eye movement sleep behaviour disorder can precede parkinsonism and dementia by several years. Using 3 T magnetic resonance imaging and neuromelanin-sensitive sequences, we previously found that the signal intensity was reduced in the locus coeruleus/subcoeruleus area of patients with Parkinson's disease and rapid eye movement sleep behaviour disorder. Here, we studied the integrity of the locus coeruleus/subcoeruleus complex with neuromelanin-sensitive imaging in 21 patients with idiopathic rapid eye movement sleep behaviour disorder and compared the results with those from 21 age- and gender-matched healthy volunteers. All subjects underwent a clinical examination, motor, cognitive, autonomous, psychological, olfactory and colour vision tests, and rapid eye movement sleep characterization using video-polysomnography and 3 T magnetic resonance imaging. The patients more frequently had preclinical markers of alpha-synucleinopathies, including constipation, olfactory deficits, orthostatic hypotension, and subtle motor impairment. Using neuromelanin-sensitive imaging, reduced signal intensity was identified in the locus coeruleus/subcoeruleus complex of the patients with idiopathic rapid eye movement sleep behaviour disorder. The mean sensitivity of the visual analyses of the signal performed by neuroradiologists who were blind to the clinical diagnoses was 82.5%, and the specificity was 81% for the identification of idiopathic rapid eye movement sleep behaviour disorder. The results confirm that this complex is affected in idiopathic rapid eye movement sleep behaviour disorder (to the same degree as it is affected in Parkinson's disease). Neuromelanin-sensitive imaging provides an early marker of non-dopaminergic alpha-synucleinopathy that can be detected on an individual basis. © The Author (2016). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  1. Cervico-ocular coordination during neck rotation is distorted in people with whiplash-associated disorders.

    PubMed

    Bexander, Catharina S M; Hodges, Paul W

    2012-03-01

    People with whiplash-associated disorders (WAD) not only suffer from neck/head pain, but commonly report deficits in eye movement control. Recent work has highlighted a strong relationship between eye and neck muscle activation in pain-free subjects. It is possible that WAD may disrupt the intricate coordination between eye and neck movement. Electromyographic activity (EMG) of muscles that rotate the cervical spine to the right (left sternocleidomastoid, right obliquus capitis inferior (OI), right splenius capitis (SC) and right multifidus (MF)) was recorded in nine people with chronic WAD. Cervical rotation was performed with five gaze conditions involving different gaze directions relative to cervical rotation. The relationship between eye position/movement and neck muscle activity was contrasted with previous observations from pain-free controls. Three main differences were observed in WAD. First, the superficial muscle SC was active with both directions of cervical rotation in contrast to activity only with right rotation in pain-free controls. Second, activity of OI and MF varied between directions of cervical rotation, unlike the non-direction-specific activity in controls. Third, the effect of horizontal gaze direction on neck muscle EMG was augmented compared to controls. These observations provide evidence of redistribution of activity between neck muscles during cervical rotation and increased interaction between eye and neck muscle activity in people with WAD. These changes in cervico-ocular coordination may underlie clinical symptoms reported by people with WAD that involve visual deficits and changes in function during cervical rotation such as postural control.

  2. Eye Movements in Risky Choice

    PubMed Central

    Hermens, Frouke; Matthews, William J.

    2015-01-01

    Abstract We asked participants to make simple risky choices while we recorded their eye movements. We built a complete statistical model of the eye movements and found very little systematic variation in eye movements over the time course of a choice or across the different choices. The only exceptions were finding more (of the same) eye movements when choice options were similar, and an emerging gaze bias in which people looked more at the gamble they ultimately chose. These findings are inconsistent with prospect theory, the priority heuristic, or decision field theory. However, the eye movements made during a choice have a large relationship with the final choice, and this is mostly independent from the contribution of the actual attribute values in the choice options. That is, eye movements tell us not just about the processing of attribute values but also are independently associated with choice. The pattern is simple—people choose the gamble they look at more often, independently of the actual numbers they see—and this pattern is simpler than predicted by decision field theory, decision by sampling, and the parallel constraint satisfaction model. © 2015 The Authors. Journal of Behavioral Decision Making published by John Wiley & Sons Ltd. PMID:27522985

  3. The role of eye movements in decision making and the prospect of exposure effects.

    PubMed

    Bird, Gary D; Lauwereyns, Johan; Crawford, Matthew T

    2012-05-01

    The aim of the current study was to follow on from previous findings that eye movements can have a causal influence on preference formation. Shimojo et al. (2003) previously found that faces that were presented for a longer duration in a two alternative forced choice task were more likely to be judged as more attractive. This effect only occurred when an eye movement was made towards the faces (with no effect when faces were centrally presented). The current study replicated Shimojo et al.'s (2003) design, whilst controlling for potential inter-stimulus interference in central presentations. As per previous findings, when eye movements were made towards the stimuli, faces that were presented for longer durations were preferred. However, faces that were centrally presented (thus not requiring an eye movement) were also preferred in the current study. The presence of an exposure duration effect for centrally presented faces casts doubt on the necessity of the eye movement in this decision making process and has implications for decision theories that place an emphasis on the role of eye movements in decision making. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    NASA Astrophysics Data System (ADS)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease, as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in tracking the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among these recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 subjects watched a short (>120 s) animation clip. In response to the animated clip, the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP), and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.
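
    A minimal sketch of one common EOG analysis step consistent with the approach described above: after calibration to degrees, saccades can be flagged with a simple velocity threshold (the threshold and sampling rate here are assumptions).

        import numpy as np

        def detect_saccades(eog_deg, fs=250.0, vel_thresh=30.0):
            """Flag saccades in a calibrated EOG trace (degrees) using a velocity
            threshold in deg/s; returns approximate (onset, offset) sample indices."""
            vel = np.gradient(eog_deg) * fs
            fast = np.abs(vel) > vel_thresh
            edges = np.flatnonzero(np.diff(fast.astype(int)))
            if fast[0]:
                edges = np.r_[0, edges]
            if fast[-1]:
                edges = np.r_[edges, fast.size - 1]
            return list(zip(edges[::2], edges[1::2]))

        # Synthetic trace: fixation, a 10-degree saccade, fixation again.
        sig = np.concatenate([np.zeros(100), np.linspace(0, 10, 10), np.full(100, 10.0)])
        print(detect_saccades(sig))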

  5. Where’s Waldo? How perceptual, cognitive, and emotional brain processes cooperate during learning to categorize and find desired objects in a cluttered scene

    PubMed Central

    Chang, Hung-Cheng; Grossberg, Stephen; Cao, Yongqiang

    2014-01-01

    The Where’s Waldo problem concerns how individuals can rapidly learn to search a scene to detect, attend, recognize, and look at a valued target object in it. This article develops the ARTSCAN Search neural model to clarify how brain mechanisms across the What and Where cortical streams are coordinated to solve the Where’s Waldo problem. The What stream learns positionally-invariant object representations, whereas the Where stream controls positionally-selective spatial and action representations. The model overcomes deficiencies of these computationally complementary properties through What and Where stream interactions. Where stream processes of spatial attention and predictive eye movement control modulate What stream processes whereby multiple view- and positionally-specific object categories are learned and associatively linked to view- and positionally-invariant object categories through bottom-up and attentive top-down interactions. Gain fields control the coordinate transformations that enable spatial attention and predictive eye movements to carry out this role. What stream cognitive-emotional learning processes enable the focusing of motivated attention upon the invariant object categories of desired objects. What stream cognitive names or motivational drives can prime a view- and positionally-invariant object category of a desired target object. A volitional signal can convert these primes into top-down activations that can, in turn, prime What stream view- and positionally-specific categories. When it also receives bottom-up activation from a target, such a positionally-specific category can cause an attentional shift in the Where stream to the positional representation of the target, and an eye movement can then be elicited to foveate it. These processes describe interactions among brain regions that include visual cortex, parietal cortex, inferotemporal cortex, prefrontal cortex (PFC), amygdala, basal ganglia (BG), and superior colliculus (SC). PMID:24987339
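
    The gain fields mentioned above are usually written as a multiplicative combination of a retinotopic tuning curve and an eye-position gain (a generic formulation, not specific to ARTSCAN Search):

        r_i(\mathbf{x}, \mathbf{e}) \;=\; f_i(\mathbf{x} - \mathbf{x}_i)\, g_i(\mathbf{e}),

    where x is the retinal location of the stimulus and e the eye position; a population of such cells can be read out in head-centered coordinates (approximately x + e), which is what allows spatial attention and saccade targets to be remapped across eye movements.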

  6. Paroxysmal eye–head movements in Glut1 deficiency syndrome

    PubMed Central

    Engelstad, Kristin; Kane, Steven A.; Goldberg, Michael E.; De Vivo, Darryl C.

    2017-01-01

    Objective: To describe a characteristic paroxysmal eye–head movement disorder that occurs in infants with Glut1 deficiency syndrome (Glut1 DS). Methods: We retrospectively reviewed the medical charts of 101 patients with Glut1 DS to obtain clinical data about episodic abnormal eye movements and analyzed video recordings of 18 eye movement episodes from 10 patients. Results: A documented history of paroxysmal abnormal eye movements was found in 32/101 patients (32%), and a detailed description was available in 18 patients, presented here. Episodes started before age 6 months in 15/18 patients (83%), and preceded the onset of seizures in 10/16 patients (63%) who experienced both types of episodes. Eye movement episodes resolved, with or without treatment, by 6 years of age in 7/8 patients with documented long-term course. Episodes were brief (usually <5 minutes). Video analysis revealed that the eye movements were rapid, multidirectional, and often accompanied by a head movement in the same direction. Eye movements were separated by clear intervals of fixation, usually ranging from 200 to 800 ms. The movements were consistent with eye–head gaze saccades. These movements can be distinguished from opsoclonus by the presence of a clear intermovement fixation interval and the association of a same-direction head movement. Conclusions: Paroxysmal eye–head movements, for which we suggest the term aberrant gaze saccades, are an early symptom of Glut1 DS in infancy. Recognition of the episodes will facilitate prompt diagnosis of this treatable neurodevelopmental disorder. PMID:28341645

  7. CEFR and Eye Movement Characteristics during EFL Reading: The Case of Intermediate Readers

    ERIC Educational Resources Information Center

    Dolgunsöz, Emrah; Sariçoban, Arif

    2016-01-01

    This study primarily aims to (1) examine the relationship between foreign language reading proficiency and eye movements during reading, and (2) to describe eye movement differences between two CEFR proficiency groups (B1 and B2) by using eye tracking technique. 57 learners of EFL were tested under two experimental conditions: Natural L2 reading…

  8. Food-Predicting Stimuli Differentially Influence Eye Movements and Goal-Directed Behavior in Normal-Weight, Overweight, and Obese Individuals

    PubMed Central

    Lehner, Rea; Balsters, Joshua H.; Bürgler, Alexandra; Hare, Todd A.; Wenderoth, Nicole

    2017-01-01

    Obese individuals have been shown to exhibit abnormal sensitivity to rewards and reward-predicting cues, such as the food-associated cues frequently used in advertisements. It has also been shown that food-associated cues can increase goal-directed behavior, but it is currently unknown whether this effect differs between normal-weight, overweight, and obese individuals. Here, we investigate this question by using a Pavlovian-to-instrumental transfer (PIT) task in normal-weight (N = 20), overweight (N = 17), and obese (N = 17) individuals. Furthermore, we applied eye tracking during Pavlovian conditioning to measure the participants’ conditioned response as a proxy of the incentive salience of the predicted reward. Our results show that the goal-directed behavior of overweight individuals was more strongly influenced by food-predicting cues (i.e., stronger PIT effect) than that of normal-weight and obese individuals (p < 0.001). The weight groups were matched for age, gender, education, and parental education. Eye movements during Pavlovian conditioning also differed between weight categories (p < 0.05) and were used to categorize individuals based on their fixation style into “high eye index” versus “low eye index” as well. Our main finding was that the fixation style exhibited a complex interaction with the weight category. Furthermore, we found that normal-weight individuals of the group “high eye index” had higher body mass index within the healthy range than individuals of the group “low eye index” (p < 0.001), but this relationship was not found within the overweight or obese groups (p > 0.646). Our findings are largely consistent with the incentive sensitization theory predicting that overweight individuals are more susceptible to food-related cues than normal-weight controls. However, this hypersensitivity might be reduced in obese individuals, possibly due to habitual/compulsive overeating or differences in reward valuation. PMID:29180968

  9. Food-Predicting Stimuli Differentially Influence Eye Movements and Goal-Directed Behavior in Normal-Weight, Overweight, and Obese Individuals.

    PubMed

    Lehner, Rea; Balsters, Joshua H; Bürgler, Alexandra; Hare, Todd A; Wenderoth, Nicole

    2017-01-01

    Obese individuals have been shown to exhibit abnormal sensitivity to rewards and reward-predicting cues, such as the food-associated cues frequently used in advertisements. It has also been shown that food-associated cues can increase goal-directed behavior, but it is currently unknown whether this effect differs between normal-weight, overweight, and obese individuals. Here, we investigate this question by using a Pavlovian-to-instrumental transfer (PIT) task in normal-weight (N = 20), overweight (N = 17), and obese (N = 17) individuals. Furthermore, we applied eye tracking during Pavlovian conditioning to measure the participants' conditioned response as a proxy of the incentive salience of the predicted reward. Our results show that the goal-directed behavior of overweight individuals was more strongly influenced by food-predicting cues (i.e., a stronger PIT effect) than that of normal-weight and obese individuals (p < 0.001). The weight groups were matched for age, gender, education, and parental education. Eye movements during Pavlovian conditioning also differed between weight categories (p < 0.05) and were additionally used to categorize individuals, based on their fixation style, into "high eye index" versus "low eye index". Our main finding was that the fixation style exhibited a complex interaction with the weight category. Furthermore, we found that normal-weight individuals of the group "high eye index" had higher body mass index within the healthy range than individuals of the group "low eye index" (p < 0.001), but this relationship was not found within the overweight or obese groups (p > 0.646). Our findings are largely consistent with the incentive sensitization theory predicting that overweight individuals are more susceptible to food-related cues than normal-weight controls. However, this hypersensitivity might be reduced in obese individuals, possibly due to habitual/compulsive overeating or differences in reward valuation.

  10. Vasoactive intestinal polypeptide microinjections into the oral pontine tegmentum enhance rapid eye movement sleep in the rat.

    PubMed

    Bourgin, P; Lebrand, C; Escourrou, P; Gaultier, C; Franc, B; Hamon, M; Adrien, J

    1997-03-01

    Rapid eye movement sleep can be elicited in the rat by microinjection of the cholinergic agonist carbachol into the oral pontine reticular nucleus. Intracerebroventricular administration, during the light period, of vasoactive intestinal peptide enhances rapid eye movement sleep in several species. Since this peptide is co-localized with acetylcholine in many neurons in the central nervous system, it was assumed that the oral pontine tegmentum could also be one target for vasoactive intestinal peptide to induce rapid eye movement sleep. This hypothesis was tested by recording the sleep-wakefulness cycle in freely-moving rats injected with vasoactive intestinal peptide or its fragments (1-12 and 10-28) directly into the oral pontine reticular nucleus. When administered into the posterior part of this nucleus, vasoactive intestinal peptide at 1 and 10 ng (in 0.1 microliter of saline), but not its fragments, induced a 2-fold enhancement of rapid eye movement sleep during 4 h, at the expense of wakefulness. At the dose of 10 ng, a significant increase in rapid eye movement sleep persisted for up to 8 h. Moreover, when the peptide was injected into the centre of the positive zone, rapid eye movement sleep was enhanced during three to eight consecutive days. These data provide the first evidence that rapid eye movement sleep can be elicited both short- and long-term by a single intracerebral microinjection of vasoactive intestinal peptide. Peptidergic mechanisms, possibly in association with cholinergic mechanisms, within the caudal part of the oral pontine reticular nucleus may play a critical role in the long-term regulation of rapid eye movement sleep in rats.

  11. Modeling the Scheduling of Eye Movements and Manual Responses in Performing a Sequence of Discrete Tasks

    NASA Technical Reports Server (NTRS)

    Wu, Shu-Chieh; Remington, Roger W.; Lewis, Richard

    2006-01-01

    Common tasks in daily life are often accomplished by a sequence of actions that interleave information acquisition through the eyes and action execution by the hands. How are eye movements coordinated with the release of manual responses and how may their coordination be represented at the level of component mental operations? We have previously presented data from a typing-like task requiring separate choice responses to a series of five stimuli. We found a consistent pattern of results in both motor and ocular timing, and hypothesized possible relationships among underlying components. Here we report a model of that task, which demonstrates how the observed timing of eye movements to successive stimuli could be accounted for by assuming two systems: an open-loop system generating saccades at a periodic rate, and a closed-loop system commanding a saccade based on stimulus processing. We relate this model to models of reading and discuss the motivation for dual control.
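
    As a rough illustration of the dual-control idea described above (a hedged sketch, not the authors' published model), the following Python fragment triggers each saccade by whichever system finishes first: an open-loop periodic timer or a closed-loop stimulus-processing stage. All timing parameters and function names are illustrative assumptions.

    ```python
    import random

    def simulate_saccade_times(n_stimuli, period=0.35, proc_mean=0.25, proc_sd=0.05, seed=0):
        """Toy dual-control saccade scheduler.

        Each saccade is launched by whichever system finishes first:
        - an open-loop timer firing at a roughly periodic rate, or
        - a closed-loop trigger that waits for stimulus processing to complete.
        All parameter values are illustrative, not fitted to the reported data.
        """
        rng = random.Random(seed)
        t = 0.0
        saccade_times = []
        for _ in range(n_stimuli):
            open_loop_interval = period                           # fixed pacing
            closed_loop_interval = rng.gauss(proc_mean, proc_sd)  # stimulus processing time
            t += min(open_loop_interval, max(closed_loop_interval, 0.05))  # earlier trigger wins
            saccade_times.append(round(t, 3))
        return saccade_times

    print(simulate_saccade_times(5))  # saccade onsets paced by the faster of the two systems
    ```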

  12. Updating visual memory across eye movements for ocular and arm motor control.

    PubMed

    Thompson, Aidan A; Henriques, Denise Y P

    2008-11-01

    Remembered object locations are stored in an eye-fixed reference frame, so that every time the eyes move, spatial representations must be updated for the arm-motor system to reflect the target's new relative position. To date, studies have not investigated how the brain updates these spatial representations during other types of eye movements, such as smooth-pursuit. Further, it is unclear what information is used in spatial updating. To address these questions we investigated whether remembered locations of pointing targets are updated following smooth-pursuit eye movements, as they are following saccades, and also investigated the role of visual information in estimating eye-movement amplitude for updating spatial memory. Misestimates of eye-movement amplitude were induced when participants visually tracked stimuli presented with a background that moved in either the same or opposite direction of the eye before pointing or looking back to the remembered target location. We found that gaze-dependent pointing errors were similar following saccades and smooth-pursuit and that incongruent background motion did result in a misestimate of eye-movement amplitude. However, the background motion had no effect on spatial updating for pointing, but did when subjects made a return saccade, suggesting that the oculomotor and arm-motor systems may rely on different sources of information for spatial updating.

  13. Effects of background stimulation upon eye-movement information.

    PubMed

    Nakamura, S

    1996-04-01

    To investigate the effects of background stimulation upon eye-movement information (EMI), the perceived deceleration of the target motion during pursuit eye movement (Aubert-Fleischl paradox) was analyzed. In the experiment, a striped pattern was used as a background stimulus with various brightness contrasts and spatial frequencies for serially manipulating the attributes of the background stimulus. Analysis showed that the retinal-image motion of the background stimulus (optic flow) affected eye-movement information and that the effects of optic flow became stronger when high contrast and low spatial frequency stripes were presented as the background stimulus. In conclusion, optic flow is one source of eye-movement information in determining real object motion, and the effectiveness of optic flow depends on the attributes of the background stimulus.

  14. A Pilot Study of Horizontal Head and Eye Rotations in Baseball Batting.

    PubMed

    Fogt, Nick; Persson, Tyler W

    2017-08-01

    The purpose of the study was to measure and compare horizontal head and eye tracking movements as baseball batters "took" pitches and swung at baseball pitches. Two former college baseball players were tested in two conditions. A pitching machine was used to project tennis balls toward the subjects. In the first condition, subjects acted as if they were taking (i.e., not swinging) the pitches. In the second condition, subjects attempted to bat the pitched balls. Head movements were measured with an inertial sensor; eye movements were measured with a video eye tracker. For each condition, the relationship between the horizontal head and eye rotations was similar for the two subjects, as were the overall head-, eye-, and gaze-tracking strategies. In the "take" condition, head movements in the direction of the ball were larger than eye movements for much of the pitch trajectory. Large eye movements occurred only late in the pitch trajectory. Gaze was directed near the ball until approximately 150 milliseconds before the ball arrived at the batter, at which time gaze was directed ahead of the ball to a location near that occupied when the ball crosses the plate. In the "swing" condition, head movements in the direction of the ball were larger than eye movements throughout the pitch trajectory. Gaze was directed near the ball until approximately 50 to 60 milliseconds prior to pitch arrival at the batter. Horizontal head rotations were larger than horizontal eye rotations in both the "take" and "swing" conditions. Gaze was directed ahead of the ball late in the pitch trajectory in the "take" condition, whereas gaze was directed near the ball throughout much of the pitch trajectory in the "swing" condition.

  15. Detecting eye movements in dynamic environments.

    PubMed

    Reimer, Bryan; Sodhi, Manbir

    2006-11-01

    To take advantage of the increasing number of in-vehicle devices, automobile drivers must divide their attention between primary (driving) and secondary (operating in-vehicle device) tasks. In dynamic environments such as driving, however, it is not easy to identify and quantify how a driver focuses on the various tasks he/she is simultaneously engaged in, including the distracting tasks. Measures derived from the driver's scan path have been used as correlates of driver attention. This article presents a methodology for analyzing eye positions, which are discrete samples of a subject's scan path, in order to categorize driver eye movements. Previous methods of analyzing eye positions recorded in a dynamic environment have relied completely on the manual identification of the focus of visual attention from a point of regard superimposed on a video of a recorded scene, failing to utilize information regarding movement structure in the raw recorded eye positions. Although effective, these methods are too time consuming to apply to the large data sets required to identify subtle differences between drivers, road conditions, and levels of distraction. The aim of the methods presented in this article is to extend the degree of automation in the processing of eye movement data by proposing a methodology for eye movement analysis that extends automated fixation identification to include smooth and saccadic movements. By identifying eye movements in the recorded eye positions, a method of reducing the analysis of scene video to a finite search space is presented. The implementation of a software tool for the eye movement analysis is described, including an example from an on-road test-driving sample.
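
    The extension of automated fixation identification to smooth and saccadic movements can be pictured as a velocity-threshold pass over the recorded eye positions. The sketch below is a generic illustration under assumed thresholds and variable names, not the software tool described in the article.

    ```python
    import numpy as np

    def classify_eye_movements(x, y, t, saccade_thresh=100.0, pursuit_thresh=10.0):
        """Label each inter-sample interval as fixation, smooth pursuit, or saccade
        from point-to-point velocity (deg/s). Thresholds are assumed for illustration."""
        x, y, t = (np.asarray(v, dtype=float) for v in (x, y, t))
        dt = np.diff(t)
        velocity = np.hypot(np.diff(x), np.diff(y)) / dt        # x, y in degrees
        labels = np.full(velocity.shape, "fixation", dtype=object)
        labels[velocity > pursuit_thresh] = "smooth_pursuit"
        labels[velocity > saccade_thresh] = "saccade"
        return velocity, labels

    # Toy 250 Hz trace in degrees of visual angle
    t = np.arange(5) * 0.004
    x = np.array([0.00, 0.05, 0.10, 2.00, 4.00])
    y = np.zeros_like(x)
    velocity, labels = classify_eye_movements(x, y, t)
    print(list(zip(velocity.round(1), labels)))
    ```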

  16. Object based implicit contextual learning: a study of eye movements.

    PubMed

    van Asselen, Marieke; Sampaio, Joana; Pina, Ana; Castelo-Branco, Miguel

    2011-02-01

    Implicit contextual cueing refers to a top-down mechanism in which visual search is facilitated by learned contextual features. In the current study we aimed to investigate the mechanism underlying implicit contextual learning using object information as a contextual cue. Therefore, we measured eye movements during an object-based contextual cueing task. We demonstrated that visual search is facilitated by repeated object information and that this reduction in response times is associated with shorter fixation durations. This indicates that by memorizing associations between objects in our environment we can recognize objects faster, thereby facilitating visual search.

  17. Methodological Aspects of Cognitive Rehabilitation with Eye Movement Desensitization and Reprocessing (EMDR)

    PubMed Central

    Zarghi, Afsaneh; Zali, Alireza; Tehranidost, Mehdi

    2013-01-01

    A variety of nervous system components, such as the medulla, pons, midbrain, cerebellum, basal ganglia, and the parietal, frontal and occipital lobes, play a role in Eye Movement Desensitization and Reprocessing (EMDR). The eye movements are performed to draw the client's attention to an external stimulus while the client concentrates on a certain internal subject; eye movement guided by the therapist is the most common attention stimulus. The role of eye movement has been documented previously in relation to cognitive processing mechanisms. A series of systematic experiments has shown that the eyes’ spontaneous movement is associated with emotional and cognitive changes and results in decreased excitement, flexibility in attention, memory processing, and enhanced semantic recall. Eye movement also decreases the memory's image clarity and the accompanying excitement. By using EMDR, we can reach parts of memory that were previously inaccessible or emotionally intolerable. Various studies emphasize the effectiveness of EMDR in treating and curing phobias, pain, and dependent personality disorders. Consequently, due to the involvement of multiple neural system components, this palliative method of treatment can also help to rehabilitate the neuro-cognitive system. PMID:25337334

  18. Predictive saccade in the absence of smooth pursuit: interception of moving targets in the archer fish.

    PubMed

    Ben-Simon, Avi; Ben-Shahar, Ohad; Vasserman, Genadiy; Segev, Ronen

    2012-12-15

    Interception of fast-moving targets is a demanding task many animals solve. To handle it successfully, mammals employ both saccadic and smooth pursuit eye movements in order to confine the target to their area centralis. But how can non-mammalian vertebrates, which lack smooth pursuit, intercept moving targets? We studied this question by exploring eye movement strategies employed by archer fish, an animal that possesses an area centralis, lacks smooth pursuit eye movements, but can intercept moving targets by shooting jets of water at them. We tracked the gaze direction of fish during interception of moving targets and found that they employ saccadic eye movements based on prediction of target position when it is hit. The fish fixates on the target's initial position for ∼0.2 s from the onset of its motion, a time period used to predict whether a shot can be made before the projection of the target exits the area centralis. If the prediction indicates otherwise, the fish performs a saccade that overshoots the center of gaze beyond the present target projection on the retina, such that after the saccade the moving target remains inside the area centralis long enough to prepare and perform a shot. These results add to the growing body of knowledge on biological target tracking and may shed light on the mechanism underlying this behavior in other animals with no neural system for the generation of smooth pursuit eye movements.

  19. Learning rational temporal eye movement strategies.

    PubMed

    Hoppe, David; Rothkopf, Constantin A

    2016-07-19

    During active behavior humans redirect their gaze several times every second within the visual environment. Where we look within static images is highly efficient, as quantified by computational models of human gaze shifts in visual search and face recognition tasks. However, when we shift gaze is mostly unknown despite its fundamental importance for survival in a dynamic world. It has been suggested that during naturalistic visuomotor behavior gaze deployment is coordinated with task-relevant events, often predictive of future events, and studies in sportsmen suggest that timing of eye movements is learned. Here we establish that humans efficiently learn to adjust the timing of eye movements in response to environmental regularities when monitoring locations in the visual scene to detect probabilistically occurring events. To detect the events humans adopt strategies that can be understood through a computational model that includes perceptual and acting uncertainties, a minimal processing time, and, crucially, the intrinsic costs of gaze behavior. Thus, subjects traded off event detection rate with behavioral costs of carrying out eye movements. Remarkably, based on this rational bounded actor model the time course of learning the gaze strategies is fully explained by an optimal Bayesian learner with humans' characteristic uncertainty in time estimation, the well-known scalar law of biological timing. Taken together, these findings establish that the human visual system is highly efficient in learning temporal regularities in the environment and that it can use these regularities to control the timing of eye movements to detect behaviorally relevant events.

  20. Measuring the effect of multiple eye fixations on memory for visual attributes.

    PubMed

    Palmer, J; Ames, C T

    1992-09-01

    Because of limited peripheral vision, many visual tasks depend on multiple eye fixations. Good performance in such tasks demonstrates that some memory must survive from one fixation to the next. One factor that must influence performance is the degree to which multiple eye fixations interfere with the critical memories. In the present study, the amount of interference was measured by comparing visual discriminations based on multiple fixations to visual discriminations based on a single fixation. The procedure resembled partial report, but used a discrimination measure. In the prototype study, two lines were presented, followed by a single line and a cue. The cue pointed toward one of the positions of the first two lines. Observers were required to judge if the single line in the second display was longer or shorter than the cued line of the first display. These judgments were used to estimate a length threshold. The critical manipulation was to instruct observers either to maintain fixation between the lines of the first display or to fixate each line in sequence. The results showed an advantage for multiple fixations despite the intervening eye movements. In fact, thresholds for the multiple-fixation condition were nearly as good as those in a control condition where the lines were foveally viewed without eye movements. Thus, eye movements had little or no interfering effect in this task. Additional studies generalized the procedure and the stimuli. In conclusion, information about a variety of size and shape attributes was remembered with essentially no interference across eye fixations.

  1. The analysis of the influence of fractal structure of stimuli on fractal dynamics in fixational eye movements and EEG signal

    NASA Astrophysics Data System (ADS)

    Namazi, Hamidreza; Kulish, Vladimir V.; Akrami, Amin

    2016-05-01

    One of the major challenges in vision research is to analyze the effect of visual stimuli on human vision. However, no relationship has yet been discovered between the structure of the visual stimulus and the structure of fixational eye movements. This study reveals the plasticity of human fixational eye movements in relation to the ‘complex’ visual stimulus. We demonstrated that the fractal temporal structure of visual dynamics shifts towards the fractal dynamics of the visual stimulus (image). The results showed that images with higher complexity (higher fractality) cause fixational eye movements with lower fractality. Considering the brain as the main part of the nervous system engaged in eye movements, we also analyzed the electroencephalogram (EEG) signal recorded during fixation. We found a coupling between the fractality of the image, the EEG, and the fixational eye movements. The relationship observed in this research can be further investigated and applied to the treatment of different vision disorders.
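
    Fractality of a fixational eye-movement trace is typically summarized by a scaling exponent. The sketch below shows one generic estimator (rescaled-range analysis of a one-dimensional position series); it is an assumed illustration and not necessarily the fractal measure used by the authors.

    ```python
    import numpy as np

    def hurst_rs(series, min_chunk=8):
        """Rescaled-range (R/S) estimate of the Hurst exponent of a 1-D series,
        e.g. a horizontal fixational eye-position trace. Generic estimator only."""
        series = np.asarray(series, dtype=float)
        n = len(series)
        sizes = [int(s) for s in 2 ** np.arange(3, int(np.log2(n)) + 1)
                 if s >= min_chunk and s <= n // 2]
        log_sizes, log_rs = [], []
        for size in sizes:
            rs_values = []
            for start in range(0, n - size + 1, size):
                chunk = series[start:start + size]
                deviations = np.cumsum(chunk - chunk.mean())
                spread = deviations.max() - deviations.min()
                if chunk.std() > 0:
                    rs_values.append(spread / chunk.std())
            if rs_values:
                log_sizes.append(np.log(size))
                log_rs.append(np.log(np.mean(rs_values)))
        slope, _ = np.polyfit(log_sizes, log_rs, 1)   # slope approximates the Hurst exponent
        return float(slope)

    rng = np.random.default_rng(2)
    print(round(hurst_rs(rng.normal(size=4096)), 2))  # white noise should give roughly 0.5
    ```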

  2. Countermanding eye-head gaze shifts in humans: marching orders are delivered to the head first.

    PubMed

    Corneil, Brian D; Elsley, James K

    2005-07-01

    The countermanding task requires subjects to cancel a planned movement on appearance of a stop signal, providing insights into response generation and suppression. Here, we studied human eye-head gaze shifts in a countermanding task with targets located beyond the horizontal oculomotor range. Consistent with head-restrained saccadic countermanding studies, the proportion of gaze shifts on stop trials increased the longer the stop signal was delayed after target presentation, and gaze shift stop-signal reaction times (SSRTs: a derived statistic measuring how long it takes to cancel a movement) averaged approximately 120 ms across seven subjects. We also observed a marked proportion of trials (13% of all stop trials) during which gaze remained stable but the head moved toward the target. Such head movements were more common at intermediate stop signal delays. We never observed the converse sequence wherein gaze moved while the head remained stable. SSRTs for head movements averaged approximately 190 ms or approximately 70-75 ms longer than gaze SSRTs. Although our findings are inconsistent with a single race to threshold as proposed for controlling saccadic eye movements, movement parameters on stop trials attested to interactions consistent with a race model architecture. To explain our data, we tested two extensions to the saccadic race model. The first assumed that gaze shifts and head movements are controlled by parallel but independent races. The second model assumed that gaze shifts and head movements are controlled by a single race, preceded by terminal ballistic intervals not under inhibitory control, and that the head-movement branch is activated at a lower threshold. Although simulations of both models produced acceptable fits to the empirical data, we favor the second alternative as it is more parsimonious with recent findings in the oculomotor system. Using the second model, estimates for gaze and head ballistic intervals were approximately 25 and 90 ms, respectively, consistent with the known physiology of the final motor paths. Further, the threshold of the head movement branch was estimated to be 85% of that required to activate gaze shifts. From these results, we conclude that a commitment to a head movement is made in advance of gaze shifts and that the comparative SSRT differences result primarily from biomechanical differences inherent to eye and head motion.
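
    The stop-signal reaction time (SSRT) reported above is a derived statistic. One common way to derive it, shown here as a hedged sketch rather than the authors' exact procedure, is the integration method: at each stop-signal delay, take the go-RT quantile matching the probability of responding and subtract the delay.

    ```python
    import numpy as np

    def ssrt_integration(go_rts, stop_signal_delays, p_respond):
        """Integration-method SSRT estimate: at each stop-signal delay (SSD), take
        the go-RT quantile equal to the probability of responding on stop trials
        and subtract the SSD. Values and structure are illustrative only."""
        go_rts = np.sort(np.asarray(go_rts, dtype=float))
        estimates = []
        for ssd, p in zip(stop_signal_delays, p_respond):
            index = min(max(int(np.ceil(p * len(go_rts))) - 1, 0), len(go_rts) - 1)
            estimates.append(go_rts[index] - ssd)
        return float(np.mean(estimates))

    rng = np.random.default_rng(1)
    go_rts = rng.normal(450, 60, size=200)            # toy go reaction times (ms)
    print(round(ssrt_integration(go_rts, [150, 200, 250], [0.3, 0.5, 0.7]), 1))
    ```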

  3. Comparing two types of engineering visualizations: task-related manipulations matter.

    PubMed

    Cölln, Martin C; Kusch, Kerstin; Helmert, Jens R; Kohler, Petra; Velichkovsky, Boris M; Pannasch, Sebastian

    2012-01-01

    This study focuses on the comparison of traditional engineering drawings with a CAD (computer aided design) visualization in terms of user performance and eye movements in an applied context. Twenty-five students of mechanical engineering completed search tasks for measures in two distinct depictions of a car engine component (engineering drawing vs. CAD model). Besides spatial dimensionality, the display types most notably differed in terms of information layout, access and interaction options. The CAD visualization yielded better performance, if users directly manipulated the object, but was inferior, if employed in a conventional static manner, i.e. inspecting only predefined views. An additional eye movement analysis revealed longer fixation durations and a stronger increase of task-relevant fixations over time when interacting with the CAD visualization. This suggests a more focused extraction and filtering of information. We conclude that the three-dimensional CAD visualization can be advantageous if its ability to manipulate is used.

  4. Estimation of Delta Wave by Mutual Information of Heartbeat During Sleep

    NASA Astrophysics Data System (ADS)

    Kurihara, Yosuke; Watanabe, Kajiro; Kobayashi, Kazuyuki; Tanaka, Hiroshi

    The quality of sleep is evaluated based on the sleep stages judged by the R-K method or the manual of the American Academy of Sleep Medicine. The brainwaves, eye movements, and chin EMG of sleeping subjects are used for the judgment. These methods, however, require electrodes to be attached to the head and face to obtain the brainwaves, eye movements, and chin EMG, making the measurements troublesome to perform on a daily basis. If non-invasive measurement of brainwaves, eye movements, and chin EMG were feasible, or if equivalent data could be estimated from other bio-signals, monitoring the quality of daily sleep, which influences health, would become easy. In this paper, we show that the appearance rate of delta wave occurrences, which is closely related to the depth of sleep, can be estimated based on the average amount of mutual information calculated from pulse wave signals and body movements measured non-invasively by the pneumatic method. As a result, the root mean square error between the appearance rate of delta wave occurrences measured with polysomnography and the estimated value was 14.93%.
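
    The estimate rests on the average mutual information between two non-invasively measured signals. A minimal histogram-based estimator is sketched below; the binning, features, and variable names are illustrative assumptions rather than the authors' exact pipeline.

    ```python
    import numpy as np

    def mutual_information(x, y, bins=8):
        """Histogram estimate of mutual information (bits) between two signals,
        e.g. a per-epoch pulse-wave feature and a body-movement feature."""
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nonzero = pxy > 0
        return float(np.sum(pxy[nonzero] * np.log2(pxy[nonzero] / (px @ py)[nonzero])))

    rng = np.random.default_rng(0)
    pulse = rng.normal(size=1000)                          # toy pulse-wave feature
    movement = 0.6 * pulse + 0.8 * rng.normal(size=1000)   # correlated toy movement feature
    print(round(mutual_information(pulse, movement), 3))
    ```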

  5. Looking for a Location: Dissociated Effects of Event-Related Plausibility and Verb–Argument Information on Predictive Processing in Aphasia

    PubMed Central

    Dickey, Michael Walsh; Warren, Tessa

    2016-01-01

    Purpose This study examined the influence of verb–argument information and event-related plausibility on prediction of upcoming event locations in people with aphasia, as well as older and younger, neurotypical adults. It investigated how these types of information interact during anticipatory processing and how the ability to take advantage of the different types of information is affected by aphasia. Method This study used a modified visual-world task to examine eye movements and offline photo selection. Twelve adults with aphasia (aged 54–82 years) as well as 44 young adults (aged 18–31 years) and 18 older adults (aged 50–71 years) participated. Results Neurotypical adults used verb argument status and plausibility information to guide both eye gaze (a measure of anticipatory processing) and image selection (a measure of ultimate interpretation). Argument status did not affect the behavior of people with aphasia in either measure. There was only limited evidence of interaction between these 2 factors in eye gaze data. Conclusions Both event-related plausibility and verb-based argument status contributed to anticipatory processing of upcoming event locations among younger and older neurotypical adults. However, event-related likelihood had a much larger role in the performance of people with aphasia than did verb-based knowledge regarding argument structure. PMID:27997951

  6. Intuitive Face Judgments Rely on Holistic Eye Movement Pattern

    PubMed Central

    Mega, Laura F.; Volz, Kirsten G.

    2017-01-01

    Non-verbal signals such as facial expressions are of paramount importance for social encounters. Their perception predominantly occurs without conscious awareness and is effortlessly integrated into social interactions. In other words, face perception is intuitive. Contrary to classical intuition tasks, this work investigates intuitive processes in the realm of every-day type social judgments. Two differently instructed groups of participants judged the authenticity of emotional facial expressions, while their eye movements were recorded: an ‘intuitive group,’ instructed to rely on their “gut feeling” for the authenticity judgments, and a ‘deliberative group,’ instructed to make their judgments after careful analysis of the face. Pixel-wise statistical maps of the resulting eye movements revealed a differential viewing pattern, wherein the intuitive judgments relied on fewer, longer and more centrally located fixations. These markers have been associated with a global/holistic viewing strategy. The holistic pattern of intuitive face judgments is in line with evidence showing that intuition is related to processing the “gestalt” of an object, rather than focusing on details. Our work thereby provides further evidence that intuitive processes are characterized by holistic perception, in an understudied and real world domain of intuition research. PMID:28676773

  7. Intuitive Face Judgments Rely on Holistic Eye Movement Pattern.

    PubMed

    Mega, Laura F; Volz, Kirsten G

    2017-01-01

    Non-verbal signals such as facial expressions are of paramount importance for social encounters. Their perception predominantly occurs without conscious awareness and is effortlessly integrated into social interactions. In other words, face perception is intuitive. Contrary to classical intuition tasks, this work investigates intuitive processes in the realm of every-day type social judgments. Two differently instructed groups of participants judged the authenticity of emotional facial expressions, while their eye movements were recorded: an 'intuitive group,' instructed to rely on their "gut feeling" for the authenticity judgments, and a 'deliberative group,' instructed to make their judgments after careful analysis of the face. Pixel-wise statistical maps of the resulting eye movements revealed a differential viewing pattern, wherein the intuitive judgments relied on fewer, longer and more centrally located fixations. These markers have been associated with a global/holistic viewing strategy. The holistic pattern of intuitive face judgments is in line with evidence showing that intuition is related to processing the "gestalt" of an object, rather than focusing on details. Our work thereby provides further evidence that intuitive processes are characterized by holistic perception, in an understudied and real world domain of intuition research.

  8. Zinc-containing yeast extract promotes nonrapid eye movement sleep in mice.

    PubMed

    Cherasse, Yoan; Saito, Hitomi; Nagata, Nanae; Aritake, Kosuke; Lazarus, Michael; Urade, Yoshihiro

    2015-10-01

    Zinc is an essential trace element for humans and animals, being located, among other places, in the synaptic vesicles of cortical glutamatergic neurons and hippocampal mossy fibers in the brain. Extracellular zinc has the potential to interact with and modulate many different synaptic targets, including glutamate and GABA receptors. Because of the central role of these neurotransmitters in brain activity, we examined in this study the sleep-promoting activity of zinc by monitoring locomotor activity and electroencephalogram after its administration to mice. Zinc-containing yeast extract (40 and 80 mg/kg) dose dependently increased the total amount of nonrapid eye movement sleep and decreased the locomotor activity. However, this preparation did not change the amount of rapid eye movement sleep or show any adverse effects such as rebound of insomnia during a period of 24 h following the induction of sleep; whereas the extracts containing other divalent cations (manganese, iron, and copper) did not decrease the locomotor activity. This is the first evidence that zinc can induce sleep. Our data open the way to new types of food supplements designed to improve sleep.

  9. Eye movement instructions modulate motion illusion and body sway with Op Art.

    PubMed

    Kapoula, Zoï; Lang, Alexandre; Vernet, Marine; Locher, Paul

    2015-01-01

    Op Art generates illusory visual motion. It has been proposed that eye movements participate in such illusion. This study examined the effect of eye movement instructions (fixation vs. free exploration) on the sensation of motion as well as the body sway of subjects viewing Op Art paintings. Twenty-eight healthy adults in orthostatic stance were successively exposed to three visual stimuli consisting of one figure representing a cross (baseline condition) and two Op Art paintings providing a sense of motion in depth: Bridget Riley's Movements in Squares and Akiyoshi Kitaoka's Rollers. Before their exposure to the Op Art images, participants were instructed either to fixate at the center of the image (fixation condition) or to explore the artwork (free viewing condition). Posture was measured for 30 s per condition using a body-fixed sensor (accelerometer). The major finding of this study is that the two Op Art paintings induced a larger antero-posterior body sway, both in terms of speed and displacement, and an increased motion illusion in the free viewing condition as compared to the fixation condition. For body sway, this effect was significant for the Riley painting, while for motion illusion this effect was significant for Kitaoka's image. These results are attributed to macro-saccades presumably occurring under free viewing instructions, and most likely to the small vergence drifts during fixations following the saccades; such movements, in interaction with the visual properties of each image, would increase either the illusory motion sensation or the antero-posterior body sway.

  10. Effects of phencyclidine, secobarbital and diazepam on eye tracking in rhesus monkeys.

    PubMed

    Ando, K; Johanson, C E; Levy, D L; Yasillo, N J; Holzman, P S; Schuster, C R

    1983-01-01

    Rhesus monkeys were trained to track a moving disk using a procedure in which responses on a lever were reinforced with water delivery only when the disk, oscillating in a horizontal plane on a screen at a frequency of 0.4 Hz in a visual angle of 20 degrees, dimmed for a brief period. Pursuit eye movements were recorded by electrooculography (EOG). IM phencyclidine, secobarbital, and diazepam injections decreased the number of reinforced lever presses in a dose-related manner. Both secobarbital and diazepam produced episodic jerky-pursuit eye movements, while phencyclidine had no consistent effects on eye movements. Lever pressing was disrupted at doses which had little effect on the quality of smooth-pursuit eye movements in some monkeys. This separation was particularly pronounced with diazepam. The similarities of the drug effects on smooth-pursuit eye movements between the present study and human studies indicate that the present method using rhesus monkeys may be useful for predicting drug effects on eye tracking and oculomotor function in humans.

  11. Eye-Pursuit and Reafferent Head Movement Signals Carried by Pursuit Neurons in the Caudal Part of the Frontal Eye Fields during Head-Free Pursuit

    PubMed Central

    Kasahara, Satoshi; Akao, Teppei; Kurkin, Sergei; Peterson, Barry W.

    2009-01-01

    Eye and head movements are coordinated during head-free pursuit. To examine whether pursuit neurons in frontal eye fields (FEF) carry gaze-pursuit commands that drive both eye-pursuit and head-pursuit, monkeys whose heads were free to rotate about a vertical axis were trained to pursue a juice feeder with their head and a target with their eyes. Initially the feeder and target moved synchronously with the same visual angle. FEF neurons responding to this gaze-pursuit were tested for eye-pursuit of target motion while the feeder was stationary and for head-pursuit while the target was stationary. The majority of pursuit neurons exhibited modulation during head-pursuit, but their preferred directions during eye-pursuit and head-pursuit were different. Although peak modulation occurred during head movements, the onset of discharge usually was not aligned with the head movement onset. The minority of neurons whose discharge onset was so aligned discharged after the head movement onset. These results do not support the idea that the head-pursuit–related modulation reflects head-pursuit commands. Furthermore, modulation similar to that during head-pursuit was obtained by passive head rotation on stationary trunk. Our results suggest that FEF pursuit neurons issue gaze or eye movement commands during gaze-pursuit and that the head-pursuit–related modulation primarily reflects reafferent signals resulting from head movements. PMID:18483002

  12. Visual and non-visual motion information processing during pursuit eye tracking in schizophrenia and bipolar disorder.

    PubMed

    Trillenberg, Peter; Sprenger, Andreas; Talamo, Silke; Herold, Kirsten; Helmchen, Christoph; Verleger, Rolf; Lencer, Rebekka

    2017-04-01

    Despite many reports on visual processing deficits in psychotic disorders, studies are needed on the integration of visual and non-visual components of eye movement control to improve the understanding of sensorimotor information processing in these disorders. Non-visual inputs to eye movement control include prediction of future target velocity from extrapolation of past visual target movement and anticipation of future target movements. It is unclear whether non-visual input is impaired in patients with schizophrenia. We recorded smooth pursuit eye movements in 21 patients with schizophrenia spectrum disorder, 22 patients with bipolar disorder, and 24 controls. In a foveo-fugal ramp task, the target was either continuously visible or was blanked during movement. We determined peak gain (measuring overall performance), initial eye acceleration (measuring visually driven pursuit), deceleration after target extinction (measuring prediction), eye velocity drifts before onset of target visibility (measuring anticipation), and residual gain during blanking intervals (measuring anticipation and prediction). In both patient groups, initial eye acceleration was decreased and the ability to adjust eye acceleration to increasing target acceleration was impaired. In contrast, neither deceleration nor eye drift velocity was reduced in patients, implying unimpaired non-visual contributions to pursuit drive. Disturbances of eye movement control in psychotic disorders appear to be a consequence of deficits in sensorimotor transformation rather than a pure failure in adding cognitive contributions to pursuit drive in higher-order cortical circuits. More generally, this deficit might reflect a fundamental imbalance between processing external input and acting according to internal preferences.

  13. Advances in Relating Eye Movements and Cognition

    ERIC Educational Resources Information Center

    Hayhoe, Mary M.

    2004-01-01

    Measurement of eye movements is a powerful tool for investigating perceptual and cognitive function in both infants and adults. Straightforwardly, eye movements provide a multifaceted measure of performance. For example, the location of fixations, their duration, time of occurrence, and accuracy all are potentially revealing and often allow…

  14. Web Camera Based Eye Tracking to Assess Visual Memory on a Visual Paired Comparison Task.

    PubMed

    Bott, Nicholas T; Lange, Alex; Rentz, Dorene; Buffalo, Elizabeth; Clopton, Paul; Zola, Stuart

    2017-01-01

    Background: Web cameras are increasingly part of the standard hardware of most smart devices. Eye movements can often provide a noninvasive "window on the brain," and the recording of eye movements using web cameras is a burgeoning area of research. Objective: This study investigated a novel methodology for administering a visual paired comparison (VPC) decisional task using a web camera. To further assess this method, we examined the correlation between a standard eye-tracking camera automated scoring procedure [obtaining images at 60 frames per second (FPS)] and a manually scored procedure using a built-in laptop web camera (obtaining images at 3 FPS). Methods: This was an observational study of 54 clinically normal older adults. Subjects completed three in-clinic visits with simultaneous recording of eye movements on a VPC decision task by a standard eye tracker camera and a built-in laptop-based web camera. Inter-rater reliability was analyzed using Siegel and Castellan's kappa formula. Pearson correlations were used to investigate the correlation between VPC performance using a standard eye tracker camera and a built-in web camera. Results: Strong associations were observed on VPC mean novelty preference score between the 60 FPS eye tracker and 3 FPS built-in web camera at each of the three visits (r = 0.88-0.92). Inter-rater agreement of web camera scoring at each time point was high (κ = 0.81-0.88). There were strong relationships on VPC mean novelty preference score between 10, 5, and 3 FPS training sets (r = 0.88-0.94). Significantly fewer data quality issues were encountered using the built-in web camera. Conclusions: Human scoring of a VPC decisional task using a built-in laptop web camera correlated strongly with automated scoring of the same task using a standard high frame rate eye tracker camera. While this method is not suitable for eye tracking paradigms requiring the collection and analysis of fine-grained metrics, such as fixation points, built-in web cameras are a standard feature of most smart devices (e.g., laptops, tablets, smart phones) and can be effectively employed to track eye movements on decisional tasks with high accuracy and minimal cost.
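
    The reported agreement statistics can be reproduced in outline with a Pearson correlation for the continuous novelty-preference scores and a chance-corrected kappa for categorical scoring decisions. The sketch below uses plain Cohen's kappa as a simpler stand-in for the Siegel and Castellan formula cited in the study, with toy data only.

    ```python
    import numpy as np

    def pearson_r(a, b):
        """Pearson correlation between two score series."""
        a = np.asarray(a, dtype=float) - np.mean(a)
        b = np.asarray(b, dtype=float) - np.mean(b)
        return float((a @ b) / np.sqrt((a @ a) * (b @ b)))

    def cohens_kappa(rater1, rater2):
        """Unweighted Cohen's kappa for two categorical raters (toy stand-in)."""
        categories = sorted(set(rater1) | set(rater2))
        n = len(rater1)
        observed = sum(r1 == r2 for r1, r2 in zip(rater1, rater2)) / n
        expected = sum((rater1.count(c) / n) * (rater2.count(c) / n) for c in categories)
        return (observed - expected) / (1 - expected)

    # Toy novelty-preference scores from the two recording methods
    tracker_scores = [0.62, 0.55, 0.71, 0.48, 0.66]
    webcam_scores = [0.60, 0.57, 0.69, 0.50, 0.64]
    print(round(pearson_r(tracker_scores, webcam_scores), 2))

    # Toy categorical scoring decisions from two human raters
    rater_a = ["novel", "familiar", "novel", "novel", "familiar"]
    rater_b = ["novel", "familiar", "familiar", "novel", "familiar"]
    print(round(cohens_kappa(rater_a, rater_b), 2))
    ```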

  15. Eye movements during spoken word recognition in Russian children.

    PubMed

    Sekerina, Irina A; Brooks, Patricia J

    2007-09-01

    This study explores incremental processing in spoken word recognition in Russian 5- and 6-year-olds and adults using free-viewing eye-tracking. Participants viewed scenes containing pictures of four familiar objects and clicked on a target embedded in a spoken instruction. In the cohort condition, two object names shared identical three-phoneme onsets. In the noncohort condition, all object names had unique onsets. Coarse-grain analyses of eye movements indicated that adults produced looks to the competitor on significantly more cohort trials than on noncohort trials, whereas children surprisingly failed to demonstrate cohort competition due to widespread exploratory eye movements across conditions. Fine-grain analyses, in contrast, showed a similar time course of eye movements across children and adults, but with cohort competition lingering more than 1s longer in children. The dissociation between coarse-grain and fine-grain eye movements indicates a need to consider multiple behavioral measures in making developmental comparisons in language processing.

  16. Hawk Eyes I: Diurnal Raptors Differ in Visual Fields and Degree of Eye Movement

    PubMed Central

    O'Rourke, Colleen T.; Hall, Margaret I.; Pitlik, Todd; Fernández-Juricic, Esteban

    2010-01-01

    Background Different strategies to search and detect prey may place specific demands on sensory modalities. We studied visual field configuration, degree of eye movement, and orbit orientation in three diurnal raptors belonging to the Accipitridae and Falconidae families. Methodology/Principal Findings We used an ophthalmoscopic reflex technique and an integrated 3D digitizer system. We found inter-specific variation in visual field configuration and degree of eye movement, but not in orbit orientation. Red-tailed Hawks have relatively small binocular areas (∼33°) and wide blind areas (∼82°), but intermediate degree of eye movement (∼5°), which underscores the importance of lateral vision rather than binocular vision to scan for distant prey in open areas. Cooper's Hawks have relatively wide binocular fields (∼36°), small blind areas (∼60°), and high degree of eye movement (∼8°), which may increase visual coverage and enhance prey detection in closed habitats. Additionally, we found that Cooper's Hawks can visually inspect the items held in the tip of the bill, which may facilitate food handling. American Kestrels have intermediate-sized binocular and lateral areas that may be used in prey detection at different distances through stereopsis and motion parallax; whereas the low degree of eye movement (∼1°) may help stabilize the image when hovering above prey before an attack. Conclusions We conclude that: (a) there are between-species differences in visual field configuration in these diurnal raptors; (b) these differences are consistent with prey searching strategies and degree of visual obstruction in the environment (e.g., open and closed habitats); (c) variations in the degree of eye movement between species appear associated with foraging strategies; and (d) the size of the binocular and blind areas in hawks can vary substantially due to eye movements. Inter-specific variation in visual fields and eye movements can influence behavioral strategies to visually search for and track prey while perching. PMID:20877645

  17. Hawk eyes I: diurnal raptors differ in visual fields and degree of eye movement.

    PubMed

    O'Rourke, Colleen T; Hall, Margaret I; Pitlik, Todd; Fernández-Juricic, Esteban

    2010-09-22

    Different strategies to search and detect prey may place specific demands on sensory modalities. We studied visual field configuration, degree of eye movement, and orbit orientation in three diurnal raptors belonging to the Accipitridae and Falconidae families. We used an ophthalmoscopic reflex technique and an integrated 3D digitizer system. We found inter-specific variation in visual field configuration and degree of eye movement, but not in orbit orientation. Red-tailed Hawks have relatively small binocular areas (∼33°) and wide blind areas (∼82°), but intermediate degree of eye movement (∼5°), which underscores the importance of lateral vision rather than binocular vision to scan for distant prey in open areas. Cooper's Hawks have relatively wide binocular fields (∼36°), small blind areas (∼60°), and high degree of eye movement (∼8°), which may increase visual coverage and enhance prey detection in closed habitats. Additionally, we found that Cooper's Hawks can visually inspect the items held in the tip of the bill, which may facilitate food handling. American Kestrels have intermediate-sized binocular and lateral areas that may be used in prey detection at different distances through stereopsis and motion parallax; whereas the low degree of eye movement (∼1°) may help stabilize the image when hovering above prey before an attack. We conclude that: (a) there are between-species differences in visual field configuration in these diurnal raptors; (b) these differences are consistent with prey searching strategies and degree of visual obstruction in the environment (e.g., open and closed habitats); (c) variations in the degree of eye movement between species appear associated with foraging strategies; and (d) the size of the binocular and blind areas in hawks can vary substantially due to eye movements. Inter-specific variation in visual fields and eye movements can influence behavioral strategies to visually search for and track prey while perching.

  18. 2015 Summer Series - Lee Stone - Brain Function Through the Eyes of the Beholder

    NASA Image and Video Library

    2015-06-09

    The Visuomotor Control Laboratory (VCL) at NASA Ames conducts neuroscience research on the link between eye movements and brain function to provide an efficient and quantitative means of monitoring human perceptual performance. The VCL aims to make dramatic improvements in mission success through analysis, experimentation, and modeling of human performance and human-automation interaction. Dr. Lee Stone elaborates on how this research is conducted and how it contributes to NASA's mission and advances human-centered design and operations of complex aerospace systems.

  19. A Proposed Treatment for Visual Field Loss caused by Traumatic Brain Injury using Interactive Visuotactile Virtual Environment

    NASA Astrophysics Data System (ADS)

    Farkas, Attila J.; Hajnal, Alen; Shiratuddin, Mohd F.; Szatmary, Gabriella

    In this paper, we propose a novel approach of using interactive virtual environment technology in Vision Restoration Therapy for visual field loss caused by Traumatic Brain Injury. We call the new system the Interactive Visuotactile Virtual Environment, and it holds the promise of expanding the scope of existing rehabilitation techniques. Traditional vision rehabilitation methods are based on passive psychophysical training procedures and can last up to six months before any modest improvements can be seen in patients. A highly immersive and interactive virtual environment will allow the patient to practice everyday activities such as object identification and object manipulation through the use of 3D motion-sensing handheld devices such as a data glove or the Nintendo Wiimote. Employing both perceptual and action components in the training procedures holds the promise of more efficient sensorimotor rehabilitation. Increased stimulation of visual and sensorimotor areas of the brain should facilitate a comprehensive recovery of visuomotor function by exploiting the plasticity of the central nervous system. Integrated with a motion tracking system and an eye tracking device, the interactive virtual environment allows for the creation and manipulation of a wide variety of stimuli, as well as real-time recording of hand, eye and body movements and their coordination. The goal of the project is to design a cost-effective and efficient vision restoration system.

  20. Instrument Display Visual Angles for Conventional Aircraft and the MQ-9 Ground Control Station

    NASA Technical Reports Server (NTRS)

    Bendrick, Gregg A.; Kamine, Tovy Haber

    2008-01-01

    Aircraft instrument panels should be designed such that primary displays are in optimal viewing location to minimize pilot perception and response time. Human Factors engineers define three zones (i.e. "cones") of visual location: 1) "Easy Eye Movement" (foveal vision); 2) "Maximum Eye Movement" (peripheral vision with saccades), and 3) "Head Movement" (head movement required). Instrument display visual angles were measured to determine how well conventional aircraft (T-34, T-38, F- 15B, F-16XL, F/A-18A, U-2D, ER-2, King Air, G-III, B-52H, DC-10, B747-SCA) and the MQ-9 ground control station (GCS) complied with these standards, and how they compared with each other. Methods: Selected instrument parameters included: attitude, pitch, bank, power, airspeed, altitude, vertical speed, heading, turn rate, slip/skid, AOA, flight path, latitude, longitude, course, bearing, range and time. Vertical and horizontal visual angles for each component were measured from the pilot s eye position in each system. Results: The vertical visual angles of displays in conventional aircraft lay within the cone of "Easy Eye Movement" for all but three of the parameters measured, and almost all of the horizontal visual angles fell within this range. All conventional vertical and horizontal visual angles lay within the cone of "Maximum Eye Movement". However, most instrument vertical visual angles of the MQ-9 GCS lay outside the cone of "Easy Eye Movement", though all were within the cone of "Maximum Eye Movement". All the horizontal visual angles for the MQ-9 GCS were within the cone of "Easy Eye Movement". Discussion: Most instrument displays in conventional aircraft lay within the cone of "Easy Eye Movement", though mission-critical instruments sometimes displaced less important instruments outside this area. Many of the MQ-9 GCS systems lay outside this area. Specific training for MQ-9 pilots may be needed to avoid increased response time and potential error during flight.
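
    The visual-angle measurements behind this comparison reduce to simple trigonometry: the angle subtended by a display element is the arctangent of its offset from the design eye point divided by the viewing distance. The sketch below illustrates this with assumed layout values and assumed cone limits, not the standards or measurements used in the study.

    ```python
    import math

    def visual_angles(dx, dy, viewing_distance):
        """Horizontal and vertical visual angles (deg) of a display element offset
        dx laterally and dy vertically from the design eye point (same length units)."""
        horizontal = math.degrees(math.atan2(abs(dx), viewing_distance))
        vertical = math.degrees(math.atan2(abs(dy), viewing_distance))
        return horizontal, vertical

    def viewing_zone(angle_deg, easy_limit=15.0, max_limit=35.0):
        """Classify an angle into the three cones; the limits here are assumed
        illustrative values, not the human-factors standards applied in the study."""
        if angle_deg <= easy_limit:
            return "Easy Eye Movement"
        if angle_deg <= max_limit:
            return "Maximum Eye Movement"
        return "Head Movement"

    # Toy layout: an instrument 20 cm left of and 28 cm below the eye point at 71 cm
    h, v = visual_angles(dx=20.0, dy=28.0, viewing_distance=71.0)
    print(round(h, 1), viewing_zone(h))
    print(round(v, 1), viewing_zone(v))
    ```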

  1. Moving the eye of the beholder. Motor components in vision determine aesthetic preference.

    PubMed

    Topolinski, Sascha

    2010-09-01

    Perception entails not only sensory input (e.g., merely seeing), but also subsidiary motor processes (e.g., moving the eyes); such processes have been neglected in research on aesthetic preferences. To fill this gap, the present research manipulated the fluency of perceptual motor processes independently from sensory input and predicted that this increased fluency would result in increased aesthetic preference for stimulus movements that elicited the same motor movements as had been previously trained. Specifically, addressing the muscles that move the eyes, I trained participants to follow a stimulus movement without actually seeing it. Experiment 1 demonstrated that ocular-muscle training resulted in the predicted increase in preference for trained stimulus movements compared with untrained stimulus movements, although participants had not previously seen any of the movements. Experiments 2 and 3 showed that actual motor matching and not perceptual similarity drove this effect. Thus, beauty may be not only in the eye of the beholder, but also in the eyes' movements.

  2. Short-latency primate vestibuloocular responses during translation

    NASA Technical Reports Server (NTRS)

    Angelaki, D. E.; McHenry, M. Q.

    1999-01-01

    Short-lasting, transient head displacements and near target fixation were used to measure the latency and early response gain of vestibularly evoked eye movements during lateral and fore-aft translations in rhesus monkeys. The latency of the horizontal eye movements elicited during lateral motion was 11.9 +/- 5.4 ms. Viewing distance-dependent behavior was seen as early as the beginning of the response profile. For fore-aft motion, latencies were different for forward and backward displacements. Latency averaged 7.1 +/- 9.3 ms during forward motion (same for both eyes) and 12.5 +/- 6.3 ms for the adducting eye (e.g., left eye during right fixation) during backward motion. Latencies during backward motion were significantly longer for the abducting eye (18.9 +/- 9.8 ms). Initial acceleration gains of the two eyes were generally larger than unity but asymmetric. Specifically, gains were consistently larger for abducting than adducting eye movements. The large initial acceleration gains tended to compensate for the response latencies such that the early eye movement response approached, albeit consistently incompletely, that required for maintaining visual acuity during the movement. These short-latency vestibuloocular responses could complement the visually generated optic flow responses that have been shown to exhibit much longer latencies.

  3. Analysis of EEG Related Saccadic Eye Movement

    NASA Astrophysics Data System (ADS)

    Funase, Arao; Kuno, Yoshiaki; Okuma, Shigeru; Yagi, Tohru

    Our final goal is to establish a model of saccadic eye movement that connects the saccade and the electroencephalogram (EEG). As a first step toward this goal, we recorded and analyzed saccade-related EEG. In the study reported in this paper, we tried to detect an EEG component that is peculiar to eye movement. In these experiments, each subject was instructed to direct their eyes toward visual targets (LEDs) or toward the direction of sound sources (buzzers). In the control cases, the EEG was recorded with no eye movements. In the visual experiments, we found that the EEG potential changed sharply over the occipital lobe just before eye movement, and similar results were observed in the auditory experiments. In the visual and auditory experiments without eye movement, no such sharp change was observed. Moreover, when the subject moved his/her eyes toward a right-side target, a change in EEG potential was found over the right occipital lobe; conversely, when the subject moved his/her eyes toward a left-side target, a sharp change in EEG potential was found over the left occipital lobe.

  4. One-Step "Change" and "Compare" Word Problems: Focusing on Eye-Movements

    ERIC Educational Resources Information Center

    Moutsios-Rentzos, Andreas; Stamatis, Panagiotis J.

    2015-01-01

    Introduction. In this study, we focus on the relationship between the students' mathematical thinking and their non-mechanically identified eye movements, with the purpose of gaining a deeper understanding of the students' reasoning processes and investigating the feasibility of incorporating eye-movement information in everyday pedagogy. Method.…

  5. Horizontal Saccadic Eye Movements Enhance the Retrieval of Landmark Shape and Location Information

    ERIC Educational Resources Information Center

    Brunye, Tad T.; Mahoney, Caroline R.; Augustyn, Jason S.; Taylor, Holly A.

    2009-01-01

    Recent work has demonstrated that horizontal saccadic eye movements enhance verbal episodic memory retrieval, particularly in strongly right-handed individuals. The present experiments test three primary assumptions derived from this research. First, horizontal eye movements should facilitate episodic memory for both verbal and non-verbal…

  6. Eye Movement as an Indicator of Sensory Components in Thought.

    ERIC Educational Resources Information Center

    Buckner, Michael; And Others

    1987-01-01

    Investigated Neuro-Linguistic Programming eye movement model's claim that specific eye movements are indicative of specific sensory components in thought. Agreement between students' (N=48) self-reports and trained observers' records support visual and auditory portions of model; do not support kinesthetic portion. Interrater agreement supports…

  7. Adaptation to vestibular disorientation. III, Influence on adaptation of interrupting nystagmic eye movements with opposing stimuli.

    DOT National Transportation Integrated Search

    1966-09-01

    Failure of adaptation of nystagmic eye movements to occur under certain conditions of stimulation by angular acceleration has been ascribed to a failure to allow the eye-movement response to run its course. In this study, 3 groups of subjects were te...

  8. Sigmund Exner's (1887) Einige Beobachtungen über Bewegungsnachbilder (Some Observations on Movement Aftereffects): An Illustrated Translation With Commentary.

    PubMed

    Verstraten, Frans A J; Niehorster, Diederick C; van de Grind, Wim A; Wade, Nicholas J

    2015-10-01

    In his original contribution, Exner's principal concern was a comparison between the properties of different aftereffects, and particularly to determine whether aftereffects of motion were similar to those of color and whether they could be encompassed within a unified physiological framework. Despite the fact that he was unable to answer his main question, there are some excellent, so far unknown, contributions in Exner's paper. For example, he describes observations that can be related to binocular interaction, not only in motion aftereffects but also in rivalry. To the best of our knowledge, Exner provides the first description of binocular rivalry induced by differently moving patterns in each eye, for motion as well as for their aftereffects. Moreover, apart from several known, but beautifully addressed, phenomena he makes a clear distinction between motion in depth based on stimulus properties and motion in depth based on the interpretation of motion. That is, the experience of movement, as distinct from the perception of movement. The experience, unlike the perception, did not result in a motion aftereffect in depth.

  9. Sigmund Exner’s (1887) Einige Beobachtungen über Bewegungsnachbilder (Some Observations on Movement Aftereffects): An Illustrated Translation With Commentary

    PubMed Central

    Niehorster, Diederick C.; van de Grind, Wim A.; Wade, Nicholas J.

    2015-01-01

    In his original contribution, Exner’s principal concern was a comparison between the properties of different aftereffects, and particularly to determine whether aftereffects of motion were similar to those of color and whether they could be encompassed within a unified physiological framework. Despite the fact that he was unable to answer his main question, there are some excellent—so far unknown—contributions in Exner’s paper. For example, he describes observations that can be related to binocular interaction, not only in motion aftereffects but also in rivalry. To the best of our knowledge, Exner provides the first description of binocular rivalry induced by differently moving patterns in each eye, for motion as well as for their aftereffects. Moreover, apart from several known, but beautifully addressed, phenomena he makes a clear distinction between motion in depth based on stimulus properties and motion in depth based on the interpretation of motion. That is, the experience of movement, as distinct from the perception of movement. The experience, unlike the perception, did not result in a motion aftereffect in depth. PMID:27648213

  10. Does the perception of moving eyes trigger reflexive visual orienting in autism?

    PubMed Central

    Swettenham, John; Condie, Samantha; Campbell, Ruth; Milne, Elizabeth; Coleman, Mike

    2003-01-01

    Does movement of the eyes in one or another direction function as an automatic attentional cue to a location of interest? Two experiments explored the directional movement of the eyes in a full face for speed of detection of an aftercoming location target in young people with autism and in control participants. Our aim was to investigate whether a low-level perceptual impairment underlies the delay in gaze following characteristic of autism. The participants' task was to detect a target appearing on the left or right of the screen either 100 ms or 800 ms after a face cue appeared with eyes averting to the left or right. Despite instructions to ignore eye-movement in the face cue, people with autism and control adolescents were quicker to detect targets that had been preceded by an eye movement cue congruent with target location compared with targets preceded by an incongruent eye movement cue. The attention shifts are thought to be reflexive because the cue was to be ignored, and because the effect was found even when cue-target duration was short (100 ms). Because (experiment two) the effect persisted even when the face was inverted, it would seem that the direction of movement of eyes can provide a powerful (involuntary) cue to a location. PMID:12639330

  11. Three-dimensional organization of vestibular-related eye movements to off-vertical axis rotation and linear translation in pigeons

    NASA Technical Reports Server (NTRS)

    Dickman, J. D.; Angelaki, D. E.

    1999-01-01

    During linear accelerations, compensatory reflexes should continually occur in order to maintain objects of visual interest as stable images on the retina. In the present study, the three-dimensional organization of the vestibulo-ocular reflex in pigeons was quantitatively examined during linear accelerations produced by constant velocity off-vertical axis yaw rotations and translational motion in darkness. With off-vertical axis rotations, sinusoidally modulated eye-position and velocity responses were observed in all three components, with the vertical and torsional eye movements predominating the response. Peak torsional and vertical eye positions occurred when the head was oriented with the lateral visual axis of the right eye directed orthogonal to or aligned with the gravity vector, respectively. No steady-state horizontal nystagmus was obtained with any of the rotational velocities (8-58 degrees /s) tested. During translational motion, delivered along or perpendicular to the lateral visual axis, vertical and torsional eye movements were elicited. No significant horizontal eye movements were observed during lateral translation at frequencies up to 3 Hz. These responses suggest that, in pigeons, all linear accelerations generate eye movements that are compensatory to the direction of actual or perceived tilt of the head relative to gravity. In contrast, no translational horizontal eye movements, which are known to be compensatory to lateral translational motion in primates, were observed under the present experimental conditions.

  12. Brain stem omnipause neurons and the control of combined eye-head gaze saccades in the alert cat.

    PubMed

    Paré, M; Guitton, D

    1998-06-01

    When the head is unrestrained, rapid displacements of the visual axis-gaze shifts (eye-re-space)-are made by coordinated movements of the eyes (eye-re-head) and head (head-re-space). To address the problem of the neural control of gaze shifts, we studied and contrasted the discharges of omnipause neurons (OPNs) during a variety of combined eye-head gaze shifts and head-fixed eye saccades executed by alert cats. OPNs discharged tonically during intersaccadic intervals and at a reduced level during slow perisaccadic gaze movements sometimes accompanying saccades. Their activity ceased for the duration of the saccadic gaze shifts the animal executed, either by head-fixed eye saccades alone or by combined eye-head movements. This was true for all types of gaze shifts studied: active movements to visual targets; passive movements induced by whole-body rotation or by head rotation about stationary body; and electrically evoked movements by stimulation of the caudal part of the superior colliculus (SC), a central structure for gaze control. For combined eye-head gaze shifts, the OPN pause was therefore not correlated to the eye-in-head trajectory. For instance, in active gaze movements, the end of the pause was better correlated with the gaze end than with either the eye saccade end or the time of eye counterrotation. The hypothesis that cat OPNs participate in controlling gaze shifts is supported by these results, and also by the observation that the movements of both the eyes and the head were transiently interrupted by stimulation of OPNs during gaze shifts. However, we found that the OPN pause could be dissociated from the gaze-motor-error signal producing the gaze shift. First, OPNs resumed discharging when perturbation of head motion briefly interrupted a gaze shift before its intended amplitude was attained. Second, stimulation of caudal SC sites in head-free cat elicited large head-free gaze shifts consistent with the creation of a large gaze-motor-error signal. However, stimulation of the same sites in head-fixed cat produced small "goal-directed" eye saccades, and OPNs paused only for the duration of the latter; neither a pause nor an eye movement occurred when the same stimulation was applied with the eyes at the goal location. We conclude that OPNs can be controlled by neither a simple eye control system nor an absolute gaze control system. Our data cannot be accounted for by existing models describing the control of combined eye-head gaze shifts and therefore put new constraints on future models, which will have to incorporate all the various signals that act synergistically to control gaze shifts.

  13. An Integrative Model for the Neural Mechanism of Eye Movement Desensitization and Reprocessing (EMDR)

    PubMed Central

    Coubard, Olivier A.

    2016-01-01

    Since the seminal report by Shapiro that bilateral stimulation induces cognitive and emotional changes, 26 years of basic and clinical research have examined the effects of Eye Movement Desensitization and Reprocessing (EMDR) in anxiety disorders, particularly in post-traumatic stress disorder (PTSD). The present article aims at a better understanding of the EMDR neural mechanism. I first review procedural aspects of the EMDR protocol and theoretical hypotheses about EMDR effects, and develop the reasons why the scientific community is still divided about EMDR. I then slide from psychology to physiology, describing the eye movement/emotion interaction from the physiological viewpoint, and introduce theoretical and technical tools used in movement research to re-examine the EMDR neural mechanism. Using a recent physiological model for the neuropsychological architecture of motor and cognitive control, the Threshold Interval Modulation with Early Release-Rate of rIse Deviation with Early Release (TIMER-RIDER) model, I explore how attentional control and bilateral stimulation may contribute to EMDR effects. These effects may be obtained by two processes acting in parallel: (i) enhancement of the activity level of the attentional control component; and (ii) bilateral stimulation in any sensorimotor modality, both resulting in lower inhibition that enables dysfunctional information to be processed and anxiety to be reduced. The TIMER-RIDER model offers quantitative predictions about EMDR effects for future research on its underlying physiological mechanisms. PMID:27092064

  14. Eye-hand coupling during closed-loop drawing: evidence of shared motor planning?

    PubMed

    Reina, G Anthony; Schwartz, Andrew B

    2003-04-01

    Previous paradigms have used reaching movements to study coupling of eye-hand kinematics. In the present study, we investigated eye-hand kinematics as curved trajectories were drawn at normal speeds. Eye and hand movements were tracked as a monkey traced ellipses and circles with the hand in free space while viewing the hand's position on a computer monitor. The results demonstrate that the movement of the hand was smooth and obeyed the 2/3 power law. Eye position, however, was restricted to 2-3 clusters along the hand's trajectory and fixed approximately 80% of the time in one of these clusters. The eye remained stationary as the hand moved away from the fixation for up to 200 ms and saccaded ahead of the hand position to the next fixation along the trajectory. The movement from one fixation cluster to another consistently occurred just after the tangential hand velocity had reached a local minimum, but before the next segment of the hand's trajectory began. The next fixation point was close to an area of high curvature along the hand's trajectory even though the hand had not reached that point along the path. A visuo-motor illusion of hand movement demonstrated that the eye movement was influenced by hand movement and not simply by visual input. During the task, neural activity of pre-motor cortex (area F4) was recorded using extracellular electrodes and used to construct a population vector of the hand's trajectory. The results suggest that the saccade onset is correlated in time with maximum curvature in the population vector trajectory for the hand movement. We hypothesize that eye and arm movements may have common, or shared, information in forming their motor plans.

  15. A Novel Wearable Forehead EOG Measurement System for Human Computer Interfaces

    PubMed Central

    Heo, Jeong; Yoon, Heenam; Park, Kwang Suk

    2017-01-01

    Amyotrophic lateral sclerosis (ALS) patients whose voluntary muscles are paralyzed commonly communicate with the outside world using eye movement. There have been many efforts to support this method of communication by tracking or detecting eye movement. An electrooculogram (EOG), an electro-physiological signal, is generated by eye movements and can be measured with electrodes placed around the eye. In this study, we proposed a new practical electrode position on the forehead to measure EOG signals, and we developed a wearable forehead EOG measurement system for use in Human Computer/Machine interfaces (HCIs/HMIs). Four electrodes, including the ground electrode, were placed on the forehead. The two channels were arranged vertically and horizontally, sharing a positive electrode. Additionally, a real-time eye movement classification algorithm was developed based on the characteristics of the forehead EOG. Three applications were employed to evaluate the proposed system: a virtual keyboard using a modified Bremen BCI speller and an automatic sequential row-column scanner, and a drivable power wheelchair. The mean typing speeds of the modified Bremen brain–computer interface (BCI) speller and automatic row-column scanner were 10.81 and 7.74 letters per minute, and the mean classification accuracies were 91.25% and 95.12%, respectively. In the power wheelchair demonstration, the user drove the wheelchair through an 8-shape course without collision with obstacles. PMID:28644398

  16. A Novel Wearable Forehead EOG Measurement System for Human Computer Interfaces.

    PubMed

    Heo, Jeong; Yoon, Heenam; Park, Kwang Suk

    2017-06-23

    Amyotrophic lateral sclerosis (ALS) patients whose voluntary muscles are paralyzed commonly communicate with the outside world using eye movement. There have been many efforts to support this method of communication by tracking or detecting eye movement. An electrooculogram (EOG), an electro-physiological signal, is generated by eye movements and can be measured with electrodes placed around the eye. In this study, we proposed a new practical electrode position on the forehead to measure EOG signals, and we developed a wearable forehead EOG measurement system for use in Human Computer/Machine interfaces (HCIs/HMIs). Four electrodes, including the ground electrode, were placed on the forehead. The two channels were arranged vertically and horizontally, sharing a positive electrode. Additionally, a real-time eye movement classification algorithm was developed based on the characteristics of the forehead EOG. Three applications were employed to evaluate the proposed system: a virtual keyboard using a modified Bremen BCI speller and an automatic sequential row-column scanner, and a drivable power wheelchair. The mean typing speeds of the modified Bremen brain-computer interface (BCI) speller and automatic row-column scanner were 10.81 and 7.74 letters per minute, and the mean classification accuracies were 91.25% and 95.12%, respectively. In the power wheelchair demonstration, the user drove the wheelchair through an 8-shape course without collision with obstacles.
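
    The study above develops a real-time algorithm that turns two forehead EOG channels into discrete eye-movement commands. As a rough, hedged illustration of one way such a classifier could work, the Python sketch below applies simple amplitude thresholds to a short two-channel window of synthetic, baseline-corrected EOG; the threshold values and window handling are assumptions for illustration only, not the algorithm reported in the study.

      # Minimal sketch: threshold-based classification of a two-channel forehead EOG
      # window (hypothetical thresholds, synthetic data); illustrative only.
      import numpy as np

      BLINK_TH = 350.0   # microvolts; vertical-channel spike threshold (assumed)
      MOVE_TH = 120.0    # microvolts; sustained deflection threshold (assumed)

      def classify_window(h_eog, v_eog):
          """Classify one short window of horizontal/vertical EOG into a command."""
          if np.max(np.abs(v_eog)) > BLINK_TH:
              return "blink"
          h_peak = h_eog[np.argmax(np.abs(h_eog))]
          v_peak = v_eog[np.argmax(np.abs(v_eog))]
          if abs(h_peak) < MOVE_TH and abs(v_peak) < MOVE_TH:
              return "rest"
          if abs(h_peak) >= abs(v_peak):
              return "right" if h_peak > 0 else "left"
          return "up" if v_peak > 0 else "down"

      # Toy usage: a rightward, saccade-like deflection on the horizontal channel.
      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 0.5, 125)
      h = 200.0 * (t > 0.2)                 # step-like horizontal deflection
      v = 10.0 * rng.standard_normal(125)   # low-amplitude vertical noise
      print(classify_window(h, v))          # expected: "right"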

  17. Eye, head, and body coordination during large gaze shifts in rhesus monkeys: movement kinematics and the influence of posture.

    PubMed

    McCluskey, Meaghan K; Cullen, Kathleen E

    2007-04-01

    Coordinated movements of the eye, head, and body are used to redirect the axis of gaze between objects of interest. However, previous studies of eye-head gaze shifts in head-unrestrained primates generally assumed the contribution of body movement to be negligible. Here we characterized eye-head-body coordination during horizontal gaze shifts made by trained rhesus monkeys to visual targets while they sat upright in a standard primate chair and assumed a more natural sitting posture in a custom-designed chair. In both postures, gaze shifts were characterized by the sequential onset of eye, head, and body movements, which could be described by predictable relationships. Body motion made a small but significant contribution to gaze shifts that were > or =40 degrees in amplitude. Furthermore, as gaze shift amplitude increased (40-120 degrees ), body contribution and velocity increased systematically. In contrast, peak eye and head velocities plateaued at velocities of approximately 250-300 degrees /s, and the rotation of the eye-in-orbit and head-on-body remained well within the physical limits of ocular and neck motility during large gaze shifts, saturating at approximately 35 and 60 degrees , respectively. Gaze shifts initiated with the eye more contralateral in the orbit were accompanied by smaller body as well as head movement amplitudes and velocities were greater when monkeys were seated in the more natural body posture. Taken together, our findings show that body movement makes a predictable contribution to gaze shifts that is systematically influenced by factors such as orbital position and posture. We conclude that body movements are part of a coordinated series of motor events that are used to voluntarily reorient gaze and that these movements can be significant even in a typical laboratory setting. Our results emphasize the need for caution in the interpretation of data from neurophysiological studies of the control of saccadic eye movements and/or eye-head gaze shifts because single neurons can code motor commands to move the body as well as the head and eyes.

  18. Expansion of visual space during optokinetic afternystagmus (OKAN).

    PubMed

    Kaminiarz, André; Krekelberg, Bart; Bremmer, Frank

    2008-05-01

    The mechanisms underlying visual perceptual stability are usually investigated using voluntary eye movements. In such studies, errors in perceptual stability during saccades and pursuit are commonly interpreted as mismatches between actual eye position and eye-position signals in the brain. The generality of this interpretation could in principle be tested by investigating spatial localization during reflexive eye movements whose kinematics are very similar to those of voluntary eye movements. Accordingly, in this study, we determined mislocalization of flashed visual targets during optokinetic afternystagmus (OKAN). These eye movements are quite unique in that they occur in complete darkness and are generated by subcortical control mechanisms. We found that during horizontal OKAN slow phases, subjects mislocalize targets away from the fovea in the horizontal direction. This corresponds to a perceived expansion of visual space and is unlike mislocalization found for any other voluntary or reflexive eye movement. Around the OKAN fast phases, we found a bias in the direction of the fast phase prior to its onset and opposite to the fast-phase direction thereafter. Such a biphasic modulation has also been reported in the temporal vicinity of saccades and during optokinetic nystagmus (OKN). A direct comparison, however, showed that the modulation during OKAN was much larger and occurred earlier relative to fast-phase onset than during OKN. A simple mismatch between the current eye position and the eye-position signal in the brain is unlikely to explain such disparate results across similar eye movements. Instead, these data support the view that mislocalization arises from errors in eye-centered position information.

  19. An eye movement pre-training fosters the comprehension of processes and functions in technical systems

    PubMed Central

    Skuballa, Irene T.; Fortunski, Caroline; Renkl, Alexander

    2015-01-01

    The main research goal of the present study was to investigate the extent to which pre-training eye movements can facilitate knowledge acquisition in multimedia (pre-training principle). We combined considerations from research on eye movement modeling and pre-training to design and test a non-verbal eye movement-based pre-training. Participants in the experimental condition watched an animated circle moving in close spatial resemblance to a static visualization of a solar plant accompanied by a narration in a subsequently presented learning environment. This training was expected to foster top-down processes as reflected in gaze behavior during the learning process and enhance knowledge acquisition. We compared two groups (N = 45): participants in the experimental condition received pre-training in a first step and processed the learning material in a second step, whereas the control group underwent the second step without any pre-training. The pre-training group outperformed the control group in their learning outcomes, particularly in knowledge about processes and functions of the solar plant. However, the superior learning outcomes in the pre-training group could not be explained by eye-movement patterns. Furthermore, the pre-training moderated the relationship between experienced stress and learning outcomes. In the control group, high stress levels hindered learning, which was not found for the pre-training group. On a delayed posttest, participants were requested to draw a picture of the learning content. Despite a non-significant effect of training on the quality of the drawings, in the pre-training condition learning outcomes at the first testing time were associated with process-related aspects of drawing quality. Overall, non-verbal pre-training is a successful instructional intervention to promote learning processes in novices, although these processes were not directly reflected in learners' eye movement behavior during learning. PMID:26029138

  20. The Cerebellar Dysplasia of Chiari II Malformation as Revealed by Eye Movements

    PubMed Central

    Salman, Michael S.; Dennis, Maureen; Sharpe, James A.

    2011-01-01

    Introduction Chiari type II malformation (CII) is a developmental deformity of the hindbrain. We have previously reported that many patients with CII have impaired smooth pursuit, while few make inaccurate saccades or have an abnormal vestibulo-ocular reflex. In contrast, saccadic adaptation and visual fixation are normal. In this report, we correlate results from several eye movement studies with neuroimaging in CII. We present a model for structural changes within the cerebellum in CII. Methods Saccades, smooth pursuit, the vestibulo-ocular reflex, and visual fixation were recorded in 21 patients with CII, aged 8–19 years and 39 age-matched controls, using an infrared eye tracker. Qualitative and quantitative MRI data were correlated with eye movements in 19 CII patients and 28 controls. Results Nine patients with CII had abnormal eye movements. Smooth pursuit gain was subnormal in eight, saccadic accuracy abnormal in four, and vestibulo-ocular reflex gain abnormal in three. None had fixation instability. Patients with CII had a significantly smaller cerebellar volume than controls, and those with normal eye motion had an expanded midsagittal vermis compared to controls. However, patients with abnormal eye movements had a smaller (non-expanded) midsagittal vermis area, posterior fossa area and medial cerebellar volumes than CII patients with normal eye movements. Conclusions The deformity of CII affects the structure and function of the cerebellum selectively and differently in those with abnormal eye movements. We propose that the vermis can expand when compressed within a small posterior fossa in some CII patients, thus sparing its ocular motor functions. PMID:19960749

  1. Eye-tracking the own-race bias in face recognition: revealing the perceptual and socio-cognitive mechanisms.

    PubMed

    Hills, Peter J; Pake, J Michael

    2013-12-01

    Own-race faces are recognised more accurately than other-race faces and may even be viewed differently as measured by an eye-tracker (Goldinger, Papesh, & He, 2009). Alternatively, observer race might direct eye-movements (Blais, Jack, Scheepers, Fiset, & Caldara, 2008). Observer differences in eye-movements are likely to be based on experience of the physiognomic characteristics that are differentially discriminating for Black and White faces. Two experiments are reported that employed standard old/new recognition paradigms in which Black and White observers viewed Black and White faces with their eye-movements recorded. Experiment 1 showed that there were observer race differences in terms of the features scanned but observers employed the same strategy across different types of faces. Experiment 2 demonstrated that other-race faces could be recognised more accurately if participants had their first fixation directed to more diagnostic features using fixation crosses. These results are entirely consistent with those presented by Blais et al. (2008) and with the perceptual interpretation that the own-race bias is due to inappropriate attention allocated to the facial features (Hills & Lewis, 2006, 2011). Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Asymmetries in the Control of Saccadic Eye Movements to Bifurcating Targets.

    ERIC Educational Resources Information Center

    Zeevi, Yehoshua Y.; And Others

    The examination of saccadic eye movements--rapid shifts in gaze from one visual area of interest to another--is useful in studying pilot's visual learning in flight simulator training. Saccadic eye movements are the basic oculomotor response associated with the acquisition of visual information and provide an objective measure of higher perceptual…

  3. The Role of Eye Movement Driven Attention in Functional Strabismic Amblyopia

    PubMed Central

    2015-01-01

    Strabismic amblyopia “blunt vision” is a developmental anomaly that affects binocular vision and results in lowered visual acuity. Strabismus is a term for a misalignment of the visual axes and is usually characterized by impaired ability of the strabismic eye to take up fixation. Such impaired fixation is usually a function of the temporally and spatially impaired binocular eye movements that normally underlie binocular shifts in visual attention. In this review, we discuss how abnormal eye movement function in children with misaligned eyes influences the development of normal binocular visual attention and results in deficits in visual function such as depth perception. We also discuss how eye movement function deficits in adult amblyopia patients can also lead to other abnormalities in visual perception. Finally, we examine how the nonamblyopic eye of an amblyope is also affected in strabismic amblyopia. PMID:25838941

  4. Effects of reward on the accuracy and dynamics of smooth pursuit eye movements.

    PubMed

    Brielmann, Aenne A; Spering, Miriam

    2015-08-01

    Reward modulates behavioral choices and biases goal-oriented behavior, such as eye or hand movements, toward locations or stimuli associated with higher rewards. We investigated reward effects on the accuracy and timing of smooth pursuit eye movements in 4 experiments. Eye movements were recorded in participants tracking a moving visual target on a computer monitor. Before target motion onset, a monetary reward cue indicated whether participants could earn money by tracking accurately, or whether the trial was unrewarded (Experiments 1 and 2, n = 11 each). Reward significantly improved eye-movement accuracy across different levels of task difficulty. Improvements were seen even in the earliest phase of the eye movement, within 70 ms of tracking onset, indicating that reward impacts visual-motor processing at an early level. We obtained similar findings when reward was not precued but explicitly associated with the pursuit target (Experiment 3, n = 16); critically, these results were not driven by stimulus prevalence or other factors such as preparation or motivation. Numerical cues (Experiment 4, n = 9) were not effective. (c) 2015 APA, all rights reserved.

  5. Diurnal variation of eye movement and heart rate variability in the human fetus at term.

    PubMed

    Morokuma, S; Horimoto, N; Satoh, S; Nakano, H

    2001-07-01

    To elucidate diurnal variations in eye movement and fetal heart rate (FHR) variability in the term fetus, we observed these two parameters continuously for 24 h, using real-time ultrasound and Doppler cardiotocograph, respectively. Studied were five uncomplicated fetuses at term. The time series data of the presence and absence of eye movement and mean FHR value for each 1 min were analyzed using the maximum entropy method (MEM) and subsequent nonlinear least squares fitting. According to the power value of eye movement, all five cases were classified into two groups: three cases in the large power group and two cases in the small power group. The acrophases of eye movement and FHR variability in the large power group were close, thereby implying the existence of a diurnal rhythm in both these parameters and also that they are synchronized. In the small power group, the acrophases were separated. The synchronization of eye movement and FHR variability in the large power group suggests that these phenomena are governed by a common central mechanism related to diurnal rhythm generation.
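
    The analysis described above estimates acrophases with the maximum entropy method followed by nonlinear least squares fitting. As a simpler illustrative stand-in, the sketch below fits a 24-hour cosinor model to a minute-by-minute series by linear least squares and reads off the mesor, amplitude, and acrophase; the data and noise level are synthetic assumptions, not values from the study.

      # Minimal sketch: fit y = M + A*cos(w*(t - acrophase)) with a 24 h period by
      # rewriting it as a linear model in cos(w*t) and sin(w*t). Illustrative only.
      import numpy as np

      def cosinor_24h(t_hours, y):
          w = 2.0 * np.pi / 24.0
          X = np.column_stack([np.ones_like(t_hours),
                               np.cos(w * t_hours),
                               np.sin(w * t_hours)])
          mesor, beta, gamma = np.linalg.lstsq(X, y, rcond=None)[0]
          amplitude = np.hypot(beta, gamma)
          acrophase_hours = (np.arctan2(gamma, beta) / w) % 24.0  # time of peak
          return mesor, amplitude, acrophase_hours

      # Toy usage: 24 h of 1-min samples with a peak at hour 15 plus noise.
      rng = np.random.default_rng(1)
      t = np.arange(0.0, 24.0, 1.0 / 60.0)
      y = (5.0 + 2.0 * np.cos(2.0 * np.pi / 24.0 * (t - 15.0))
           + 0.3 * rng.standard_normal(t.size))
      print(cosinor_24h(t, y))   # acrophase should come out near 15 h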

  6. Are smooth pursuit eye movements altered in chronic whiplash-associated disorders? A cross-sectional study.

    PubMed

    Kongsted, A; Jørgensen, L V; Bendix, T; Korsholm, L; Leboeuf-Yde, C

    2007-11-01

    To evaluate whether smooth pursuit eye movements differed between patients with long-lasting whiplash-associated disorders and controls when using a purely computerized method for the eye movement analysis. Cross-sectional study comparing patients with whiplash-associated disorders and controls who had not been exposed to head or neck trauma and had no notable neck complaints. Smooth pursuit eye movements were registered while the subjects were seated with and without rotated cervical spine. Thirty-four patients with whiplash-associated disorders with symptoms more than six months after a car collision and 60 controls. Smooth pursuit eye movements were almost identical in patients with chronic whiplash-associated disorders and controls, both when the neck was rotated and in the neutral position. Disturbed smooth pursuit eye movements do not appear to be a distinct feature in patients with chronic whiplash-associated disorders. This is in contrast to results of previous studies and may be due to the fact that analyses were performed in a computerized and objective manner. Other possible reasons for the discrepancy to previous studies are discussed.

  7. Opening a Window into Reading Development: Eye Movements' Role Within a Broader Literacy Research Framework.

    PubMed

    Miller, Brett; O'Donnell, Carol

    2013-01-01

    The cumulative body of eye movement research provides significant insight into how readers process text. The heart of this work spans roughly 40 years reflecting the maturity of both the topics under study and experimental approaches used to investigate reading. Recent technological advancements offer increased flexibility to the field providing the potential to more concertedly study reading and literacy from an individual differences perspective. Historically, eye movement research focused far less on developmental issues related to individual differences in reading; however, this issue and the broader change it represents signal a meaningful transition inclusive of individual differences. The six papers in this special issue signify the recent, increased attention to and recognition of eye movement research's transition to emphasize individual differences in reading while appreciating early contributions (e.g., Rayner, 1986) in this direction. We introduce these six papers and provide some historical context for the use of eye movement methodology to examine reading and context for the eye movement field's early transition to examining individual differences, culminating in future research recommendations.

  8. The Trajectories of Saccadic Eye Movements.

    ERIC Educational Resources Information Center

    Bahill, A. Terry; Stark, Lawrence

    1979-01-01

    Investigates the trajectories of saccadic eye movements, the control signals of the eye, and nature of the mechanisms that generate them, using the techniques of bioengineering in collecting the data. (GA)

  9. Strabismus

    MedlinePlus

    ... do not aim in the same direction Uncoordinated eye movements (eyes do not move together) Loss of vision ... Stahl ED, Ariss MM, Lindquist TP. Disorders of eye movement and alignment. In: Kliegman RM, Stanton BF, St. ...

  10. Volitional and Real-Time Control Cursor Based on Eye Movement Decoding Using a Linear Decoding Model

    PubMed Central

    Zhang, Cheng

    2016-01-01

    The aim of this study is to build a linear decoding model that reveals the relationship between movement information and EOG (electrooculogram) data in order to control a cursor online and continuously with blinks and eye pursuit movements. First, a blink detection method is proposed to detect voluntary single eye blink or double-blink information in the EOG. Then, a linear decoding model of the time series is developed to predict the position of gaze, and the model parameters are calibrated by the RLS (Recursive Least Squares) algorithm; in addition, the decoding accuracy is assessed through a cross-validation procedure. Additionally, subsection processing, increment control, and online calibration are presented to realize the online control. Finally, the technology is applied to the volitional and online control of a cursor to hit multiple predefined targets. Experimental results show that the blink detection algorithm performs well, with a voluntary blink detection rate over 95%. By combining the merits of blinks and smooth pursuit movements, eye movement information can be decoded with good fidelity, with an average Pearson correlation coefficient of up to 0.9592, and all signal-to-noise ratios are greater than 0. The novel system allows people to successfully and economically control a cursor online with a hit rate of 98%. PMID:28058044
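
    The abstract above mentions calibrating a linear gaze-decoding model with the RLS (Recursive Least Squares) algorithm. The sketch below is a generic, exponentially weighted RLS fit of a linear map from feature vectors to 2-D positions, run on synthetic data; the feature definitions, forgetting factor, and dimensions are assumptions for illustration, not details taken from the study.

      # Minimal sketch: exponentially weighted Recursive Least Squares (RLS)
      # calibration of a linear decoder x_t ~= W @ f_t on synthetic data.
      import numpy as np

      def rls_fit(features, targets, lam=0.99, delta=1e3):
          """Fit targets ~= W @ features sample by sample with forgetting factor lam."""
          n_feat, n_out = features.shape[1], targets.shape[1]
          W = np.zeros((n_out, n_feat))
          P = np.eye(n_feat) * delta              # inverse correlation estimate
          for f, x in zip(features, targets):
              f = f.reshape(-1, 1)
              k = P @ f / (lam + f.T @ P @ f)     # gain vector
              err = x.reshape(-1, 1) - W @ f      # a priori prediction error
              W += (k @ err.T).T                  # update decoder weights
              P = (P - k @ f.T @ P) / lam         # update inverse correlation
          return W

      # Toy usage: 500 samples of 4 EOG-derived features -> 2-D cursor position.
      rng = np.random.default_rng(2)
      F = rng.standard_normal((500, 4))
      true_W = rng.standard_normal((2, 4))
      X = F @ true_W.T + 0.05 * rng.standard_normal((500, 2))
      W_hat = rls_fit(F, X)
      print(np.max(np.abs(W_hat - true_W)))       # small error expected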

  11. A 3D character animation engine for multimodal interaction on mobile devices

    NASA Astrophysics Data System (ADS)

    Sandali, Enrico; Lavagetto, Fabio; Pisano, Paolo

    2005-03-01

    Talking virtual characters are graphical simulations of real or imaginary persons that enable natural and pleasant multimodal interaction with the user, by means of voice, eye gaze, facial expression and gestures. This paper presents an implementation of a 3D virtual character animation and rendering engine, compliant with the MPEG-4 standard, running on Symbian-based SmartPhones. Real-time animation of virtual characters on mobile devices represents a challenging task, since many limitations must be taken into account with respect to processing power, graphics capabilities, disk space and execution memory size. The proposed optimization techniques make it possible to overcome these issues, guaranteeing smooth and synchronous animation of facial expressions and lip movements on mobile phones such as Sony-Ericsson's P800 and Nokia's 6600. The animation engine is specifically targeted at the development of new "Over The Air" services, based on embodied conversational agents, with applications in entertainment (interactive story tellers), navigation aids (virtual guides to web sites and mobile services), news casting (virtual newscasters) and education (interactive virtual teachers).

  12. Kinematics and eye-head coordination of gaze shifts evoked from different sites in the superior colliculus of the cat.

    PubMed

    Guillaume, Alain; Pélisson, Denis

    2006-12-15

    Shifting gaze requires precise coordination of eye and head movements. It is clear that the superior colliculus (SC) is involved with saccadic gaze shifts. Here we investigate its role in controlling both eye and head movements during gaze shifts. Gaze shifts of the same amplitude can be evoked from different SC sites by controlled electrical microstimulation. To describe how the SC coordinates the eye and the head, we compare the characteristics of these amplitude-matched gaze shifts evoked from different SC sites. We show that matched amplitude gaze shifts elicited from progressively more caudal sites are progressively slower and associated with a greater head contribution. Stimulation at more caudal SC sites decreased the peak velocity of the eye but not of the head, suggesting that the lower peak gaze velocity for the caudal sites is due to the increased contribution of the slower-moving head. Eye-head coordination across the SC motor map is also indicated by the relative latencies of the eye and head movements. For some amplitudes of gaze shift, rostral stimulation evoked eye movement before head movement, whereas this reversed with caudal stimulation, which caused the head to move before the eyes. These results show that gaze shifts of similar amplitude evoked from different SC sites are produced with different kinematics and coordination of eye and head movements. In other words, gaze shifts evoked from different SC sites follow different amplitude-velocity curves, with different eye-head contributions. These findings shed light on mechanisms used by the central nervous system to translate a high-level motor representation (a desired gaze displacement on the SC map) into motor commands appropriate for the involved body segments (the eye and the head).

  13. Is eye to eye contact really threatening and avoided in social anxiety?--An eye-tracking and psychophysiology study.

    PubMed

    Wieser, Matthias J; Pauli, Paul; Alpers, Georg W; Mühlberger, Andreas

    2009-01-01

    The effects of direct and averted gaze on autonomic arousal and gaze behavior in social anxiety were investigated using a new paradigm including animated movie stimuli and eye-tracking methodology. While high, medium, and low socially anxious (HSA vs. MSA vs. LSA) women watched animated movie clips, in which faces responded to the gaze of the participants with either direct or averted gaze, their eye movements, heart rate (HR) and skin conductance responses (SCR) were continuously recorded. Groups did not differ in their gaze behavior concerning direct vs. averted gaze, but high socially anxious women tended to fixate the eye region of the presented face longer than MSA and LSA, respectively. Furthermore, they responded to direct gaze with more pronounced cardiac acceleration. This physiological finding indicates that direct gaze may be a fear-relevant feature for socially anxious individuals in social interaction. However, this seems not to result in gaze avoidance. Future studies should examine the role of gaze direction and its interaction with facial expressions in social anxiety and its consequences for avoidance behavior and fear responses. Additionally, further research is needed to clarify the role of gaze perception in social anxiety.

  14. Design of a Gaze-Sensitive Virtual Social Interactive System for Children With Autism

    PubMed Central

    Lahiri, Uttama; Warren, Zachary; Sarkar, Nilanjan

    2013-01-01

    Impairments in social communication skills are thought to be core deficits in children with autism spectrum disorder (ASD). In recent years, several assistive technologies, particularly Virtual Reality (VR), have been investigated to promote social interactions in this population. It is well known that children with ASD demonstrate atypical viewing patterns during social interactions and thus monitoring eye-gaze can be valuable to design intervention strategies. While several studies have used eye-tracking technology to monitor eye-gaze for offline analysis, there exists no real-time system that can monitor eye-gaze dynamically and provide individualized feedback. Given the promise of VR-based social interaction and the usefulness of monitoring eye-gaze in real-time, a novel VR-based dynamic eye-tracking system is developed in this work. This system, called Virtual Interactive system with Gaze-sensitive Adaptive Response Technology (VIGART), is capable of delivering individualized feedback based on a child’s dynamic gaze patterns during VR-based interaction. Results from a usability study with six adolescents with ASD are presented that examines the acceptability and usefulness of VIGART. The results in terms of improvement in behavioral viewing and changes in relevant eye physiological indexes of participants while interacting with VIGART indicate the potential of this novel technology. PMID:21609889

  15. Design of a gaze-sensitive virtual social interactive system for children with autism.

    PubMed

    Lahiri, Uttama; Warren, Zachary; Sarkar, Nilanjan

    2011-08-01

    Impairments in social communication skills are thought to be core deficits in children with autism spectrum disorder (ASD). In recent years, several assistive technologies, particularly Virtual Reality (VR), have been investigated to promote social interactions in this population. It is well known that children with ASD demonstrate atypical viewing patterns during social interactions and thus monitoring eye-gaze can be valuable to design intervention strategies. While several studies have used eye-tracking technology to monitor eye-gaze for offline analysis, there exists no real-time system that can monitor eye-gaze dynamically and provide individualized feedback. Given the promise of VR-based social interaction and the usefulness of monitoring eye-gaze in real-time, a novel VR-based dynamic eye-tracking system is developed in this work. This system, called Virtual Interactive system with Gaze-sensitive Adaptive Response Technology (VIGART), is capable of delivering individualized feedback based on a child's dynamic gaze patterns during VR-based interaction. Results from a usability study with six adolescents with ASD are presented that examines the acceptability and usefulness of VIGART. The results in terms of improvement in behavioral viewing and changes in relevant eye physiological indexes of participants while interacting with VIGART indicate the potential of this novel technology. © 2011 IEEE

  16. Spasmus nutans

    MedlinePlus

    ... infants and young children. It involves rapid, uncontrolled eye movements, head bobbing, and, sometimes, holding the neck in ... spasmus nutans include: Small, quick, side-to-side eye movements called nystagmus (both eyes are involved, but each ...

  17. Brown Syndrome

    MedlinePlus

    ... Does Brown syndrome cause eye problems besides abnormal eye movements? In the more severely affected cases of Brown ... acquired and congenital cases. In congenital cases, the eye movement problem is usually constant and unlikely to resolve ...

  18. A multimodal dataset for authoring and editing multimedia content: The MAMEM project.

    PubMed

    Nikolopoulos, Spiros; Petrantonakis, Panagiotis C; Georgiadis, Kostas; Kalaganis, Fotis; Liaros, Georgios; Lazarou, Ioulietta; Adam, Katerina; Papazoglou-Chalikias, Anastasios; Chatzilari, Elisavet; Oikonomou, Vangelis P; Kumar, Chandan; Menges, Raphael; Staab, Steffen; Müller, Daniel; Sengupta, Korok; Bostantjopoulou, Sevasti; Katsarou, Zoe; Zeilig, Gabi; Plotnik, Meir; Gotlieb, Amihai; Kizoni, Racheli; Fountoukidou, Sofia; Ham, Jaap; Athanasiou, Dimitrios; Mariakaki, Agnes; Comanducci, Dario; Sabatini, Edoardo; Nistico, Walter; Plank, Markus; Kompatsiaris, Ioannis

    2017-12-01

    We present a dataset that combines multimodal biosignals and eye tracking information gathered under a human-computer interaction framework. The dataset was developed within the MAMEM project, which aims to endow people with motor disabilities with the ability to edit and author multimedia content through mental commands and gaze activity. The dataset includes EEG, eye-tracking, and physiological (GSR and heart rate) signals collected from 34 individuals (18 able-bodied and 16 motor-impaired). Data were collected during interaction with a specifically designed interface for web browsing and multimedia content manipulation and during imaginary movement tasks. The presented dataset will contribute towards the development and evaluation of modern human-computer interaction systems that would foster the integration of people with severe motor impairments back into society.

  19. Eye movements reveal sexually dimorphic deficits in children with fetal alcohol spectrum disorder

    PubMed Central

    Paolozza, Angelina; Munn, Rebecca; Munoz, Douglas P.; Reynolds, James N.

    2015-01-01

    Background: We examined the accuracy and characteristics of saccadic eye movements in children with fetal alcohol spectrum disorder (FASD) compared with typically developing control children. Previous studies have found that children with FASD produce saccades that are quantifiably different from controls. Additionally, animal studies have found sex-based differences for behavioral effects after prenatal alcohol exposure. Therefore, we hypothesized that eye movement measures will show sexually dimorphic results. Methods: Children (aged 5–18 years) with FASD (n = 71) and typically developing controls (n = 113) performed a visually-guided saccade task. Saccade metrics and behavior were analyzed for sex and group differences. Results: Female control participants had greater amplitude saccades than control males or females with FASD. Accuracy was significantly poorer in the FASD group, especially in males, which introduced significantly greater variability in the data. Therefore, we conducted additional analyses including only those trials in which the first saccade successfully reached the target within a ± 1° window. In this restricted amplitude dataset, the females with FASD made saccades with significantly lower velocity and longer duration, whereas the males with FASD did not differ from the control group. Additionally, the mean and peak deceleration were selectively decreased in the females with FASD. Conclusions: These data support the hypothesis that children with FASD exhibit specific deficits in eye movement control and sensory-motor integration associated with cerebellar and/or brain stem circuits. Moreover, prenatal alcohol exposure may have a sexually dimorphic impact on eye movement metrics, with males and females exhibiting differential patterns of deficit. PMID:25814922
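
    As a concrete (though simplified) illustration of the saccade metrics analyzed in studies like this one, the sketch below extracts amplitude, duration, peak velocity, and peak deceleration from a one-dimensional eye position trace using a fixed velocity threshold. The sampling rate and threshold are assumed values, and a real pipeline would add filtering, blink handling, and calibration.

      # Minimal sketch: velocity-threshold saccade detection and basic metrics
      # from a 1-D eye position trace (assumed sampling rate and threshold).
      import numpy as np

      FS = 500.0       # Hz, assumed sampling rate
      VEL_TH = 30.0    # deg/s, assumed detection threshold

      def saccade_metrics(position_deg):
          vel = np.gradient(position_deg) * FS    # deg/s
          acc = np.gradient(vel) * FS             # deg/s^2
          moving = np.abs(vel) > VEL_TH
          out, i = [], 0
          while i < len(moving):
              if moving[i]:
                  j = i
                  while j < len(moving) and moving[j]:
                      j += 1
                  along = acc[i:j] * np.sign(vel[i:j])   # acceleration along movement
                  out.append({
                      "amplitude_deg": abs(position_deg[j - 1] - position_deg[i]),
                      "duration_ms": (j - i) / FS * 1000.0,
                      "peak_velocity": np.max(np.abs(vel[i:j])),
                      "peak_deceleration": np.min(along),  # most negative value
                  })
                  i = j
              else:
                  i += 1
          return out

      # Toy usage: a smoothed 10-degree step resembling a single saccade.
      t = np.arange(0.0, 0.4, 1.0 / FS)
      pos = 10.0 / (1.0 + np.exp(-(t - 0.2) * 150.0))
      print(saccade_metrics(pos))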

  20. Eye Movement Indices in the Study of Depressive Disorder

    PubMed Central

    LI, Yu; XU, Yangyang; XIA, Mengqing; ZHANG, Tianhong; WANG, Junjie; LIU, Xu; HE, Yongguang; WANG, Jijun

    2016-01-01

    Background Impaired cognition is one of the most common core symptoms of depressive disorder. Eye movement testing mainly reflects patients’ cognitive functions, such as cognition, memory, attention, recognition, and recall. This type of testing has great potential to improve theories related to cognitive functioning in depressive episodes as well as potential in its clinical application. Aims This study investigated whether eye movement indices of patients with unmedicated depressive disorder were abnormal or not, as well as the relationship between these indices and mental symptoms. Methods Sixty patients with depressive disorder and sixty healthy controls (who were matched by gender, age and years of education) were recruited, and completed eye movement tests including three tasks: fixation task, saccade task and free-view task. The EyeLink desktop eye tracking system was employed to collect eye movement information, and analyze the eye movement indices of the three tasks between the two groups. Results (1) In the fixation task, compared to healthy controls, patients with depressive disorder showed more fixations, shorter fixation durations, more saccades and longer saccadic lengths; (2) In the saccade task, patients with depressive disorder showed longer anti-saccade latencies and smaller anti-saccade peak velocities; (3) In the free-view task, patients with depressive disorder showed fewer saccades and longer mean fixation durations; (4) Correlation analysis showed that there was a negative correlation between the pro-saccade amplitude and anxiety symptoms, and a positive correlation between the anti-saccade latency and anxiety symptoms. The depression symptoms were negatively correlated with fixation times, saccades, and saccadic paths respectively in the free-view task; while the mean fixation duration and depression symptoms showed a positive correlation. Conclusion Compared to healthy controls, patients with depressive disorder showed significantly abnormal eye movement indices. In addition patients’ anxiety and depression symptoms and eye movement indices were correlated. The pathological meaning of these phenomena deserve further exploration. PMID:28638208

  1. The horizontal and vertical cervico-ocular reflexes of the rabbit.

    PubMed

    Barmack, N H; Nastos, M A; Pettorossi, V E

    1981-11-16

    Horizontal and vertical cervico-ocular reflexes of the rabbit (HCOR, VCOR) were evoked by sinusoidal oscillation of the body about the vertical and longitudinal axes while the head was fixed. These reflexes were studied over a frequency range of 0.005-0.800 Hz and at stimulus amplitudes of +/- 10 degrees. When the body of the rabbit was rotated horizontally clockwise around the fixed head, clockwise conjugate eye movements were evoked. When the body was rotated about the longitudinal axis onto the right side, the right eye rotated down and the left eye rotated up. The mean gain of the HCOR (eye velocity/body velocity) rose from 0.21 at 0.005 Hz to 0.27 at 0.020 Hz and then declined to 0.06 at 0.3 Hz. The gain of the VCOR was less than the gain of the HCOR by a factor of 2-3. The HCOR was measured separately and in combination with the horizontal vestibulo-ocular reflex (HVOR). These reflexes combine linearly. The relative movements of the first 3 cervical vertebrae during stimulation of the HCOR and VCOR were measured. For the HCOR, the largest angular displacement (74%) occurs between C1 and C2. For the VCOR, the largest relative angular displacement (45%) occurs between C2 and C3. Step horizontal clockwise rotation of the head and body (HVOR) evoked low velocity counterclockwise eye movements followed by fast clockwise (resetting) eye movements. Step horizontal clockwise rotation of the body about the fixed head (HCOR) evoked low velocity clockwise eye movements which were followed by fast clockwise eye movements. Step horizontal clockwise rotation of the head about the fixed body (HCOR + HVOR) evoked low velocity counterclockwise eye movements which were not interrupted by fast clockwise eye movements. These data provide further evidence for a linear combination of independent HCOR and HVOR signals.

  2. Eye Movement Indices in the Study of Depressive Disorder.

    PubMed

    Li, Yu; Xu, Yangyang; Xia, Mengqing; Zhang, Tianhong; Wang, Junjie; Liu, Xu; He, Yongguang; Wang, Jijun

    2016-12-25

    Impaired cognition is one of the most common core symptoms of depressive disorder. Eye movement testing mainly reflects patients' cognitive functions, such as cognition, memory, attention, recognition, and recall. This type of testing has great potential to improve theories related to cognitive functioning in depressive episodes as well as potential in its clinical application. This study investigated whether eye movement indices of patients with unmedicated depressive disorder were abnormal or not, as well as the relationship between these indices and mental symptoms. Sixty patients with depressive disorder and sixty healthy controls (who were matched by gender, age and years of education) were recruited, and completed eye movement tests including three tasks: fixation task, saccade task and free-view task. The EyeLink desktop eye tracking system was employed to collect eye movement information, and analyze the eye movement indices of the three tasks between the two groups. (1) In the fixation task, compared to healthy controls, patients with depressive disorder showed more fixations, shorter fixation durations, more saccades and longer saccadic lengths; (2) In the saccade task, patients with depressive disorder showed longer anti-saccade latencies and smaller anti-saccade peak velocities; (3) In the free-view task, patients with depressive disorder showed fewer saccades and longer mean fixation durations; (4) Correlation analysis showed that there was a negative correlation between the pro-saccade amplitude and anxiety symptoms, and a positive correlation between the anti-saccade latency and anxiety symptoms. The depression symptoms were negatively correlated with fixation times, saccades, and saccadic paths respectively in the free-view task; while the mean fixation duration and depression symptoms showed a positive correlation. Compared to healthy controls, patients with depressive disorder showed significantly abnormal eye movement indices. In addition patients' anxiety and depression symptoms and eye movement indices were correlated. The pathological meaning of these phenomena deserve further exploration.

  3. Effect of limbal marking prior to laser ablation on the magnitude of cyclotorsional error.

    PubMed

    Chen, Xiangjun; Stojanovic, Aleksandar; Stojanovic, Filip; Eidet, Jon Roger; Raeder, Sten; Øritsland, Haakon; Utheim, Tor Paaske

    2012-05-01

    To evaluate the residual registration error after limbal-marking-based manual adjustment in cyclotorsional tracker-controlled laser refractive surgery. Two hundred eyes undergoing custom surface ablation with the iVIS Suite (iVIS Technologies) were divided into limbal marked (marked) and non-limbal marked (unmarked) groups. Iris registration information was acquired preoperatively from all eyes. Preoperatively, the horizontal axis was recorded in the marked group for use in manual cyclotorsional alignment prior to surgical iris registration. During iris registration, the preoperative iris information was compared to the eye-tracker captured image. The magnitudes of the registration error angle and cyclotorsional movement during the subsequent laser ablation were recorded and analyzed. Mean magnitude of registration error angle (absolute value) was 1.82°±1.31° (range: 0.00° to 5.50°) and 2.90°±2.40° (range: 0.00° to 13.50°) for the marked and unmarked groups, respectively (P<.001). Mean magnitude of cyclotorsional movement during the laser ablation (absolute value) was 1.15°±1.34° (range: 0.00° to 7.00°) and 0.68°±0.97° (range: 0.00° to 6.00°) for the marked and unmarked groups, respectively (P=.005). Forty-six percent and 60% of eyes had registration error >2°, whereas 22% and 20% of eyes had cyclotorsional movement during ablation >2° in the marked and unmarked groups, respectively. Limbal-marking-based manual alignment prior to laser ablation significantly reduced cyclotorsional registration error. However, residual registration misalignment and cyclotorsional movements remained during ablation. Copyright 2012, SLACK Incorporated.

  4. Assessment of Attentional Workload while Driving by Eye-fixation-related Potentials

    NASA Astrophysics Data System (ADS)

    Takeda, Yuji; Yoshitsugu, Noritoshi; Itoh, Kazuya; Kanamori, Nobuhiro

    How do drivers cope with the attentional workload of in-vehicle information technology? In the present study, we propose a new psychophysiological measure for assessing drivers' attention: the eye-fixation-related potential (EFRP). The EFRP is a type of event-related brain potential that can be measured in situations involving eye movements and that reflects how closely observers examine visual information at the fixated position. In the experiment, the effects of verbal and spatial working memory load during simulated driving were examined by measuring the number of saccadic eye movements and the EFRP as indices of drivers' attention. The results showed that spatial working memory load affected both the number of saccadic eye movements and the amplitude of the P100 component of the EFRP, whereas verbal working memory load affected only the number of saccadic eye movements. This implies that drivers can time-share between driving and a verbal working memory task, but that a decline in the accuracy of visual processing during driving is unavoidable under spatial working memory load. The present study suggests that the EFRP can provide a new index of drivers' attention beyond saccadic eye movements.

  5. EYE MOVEMENT RECORDING AND NONLINEAR DYNAMICS ANALYSIS – THE CASE OF SACCADES#

    PubMed Central

    Aştefănoaei, Corina; Pretegiani, Elena; Optican, L.M.; Creangă, Dorina; Rufa, Alessandra

    2015-01-01

    Evidence of a chaotic trend in eye movement dynamics was examined in a saccadic time series collected from a healthy human subject. Saccades are high-velocity eye movements of very short duration; their recording is relatively accessible, so the resulting data series can be studied computationally to better understand neural processing in a motor system. The aim of this study was to assess the degree of complexity in eye movement dynamics. To do this, we analyzed a saccadic time series recorded with an infrared camera eye tracker from a healthy human subject in a special experimental arrangement that provides continuous records of eye position, covering both saccades (eye-shifting movements) and fixations (focusing on regions of interest, with rapid, small fluctuations). The semi-quantitative, nonlinear-dynamics approach to studying eye function used in this paper was based on several computational tests derived from chaos theory (power spectrum, state-space portrait and its fractal dimension, Hurst exponent, and largest Lyapunov exponent). A highly complex dynamical trend was found. The largest Lyapunov exponent test suggested bi-stability of the cellular membrane resting potential during the saccadic experiment. PMID:25698889
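
    As an illustration of one of the chaos-theory measures listed above, here is a minimal sketch of a rescaled-range (R/S) estimate of the Hurst exponent for a one-dimensional series; the signal is synthetic and the implementation is a textbook approximation, not the authors' analysis code.

    ```python
    # Hedged sketch: rescaled-range (R/S) estimate of the Hurst exponent of a
    # 1-D series (e.g., an eye-position record). Synthetic data; illustrative only.
    import numpy as np

    def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
        x = np.asarray(x, dtype=float)
        log_n, log_rs = [], []
        for n in window_sizes:
            rs_vals = []
            for start in range(0, len(x) - n + 1, n):
                seg = x[start:start + n]
                dev = np.cumsum(seg - seg.mean())  # cumulative deviations from the mean
                r = dev.max() - dev.min()          # range of the cumulative deviations
                s = seg.std(ddof=1)                # standard deviation of the segment
                if s > 0:
                    rs_vals.append(r / s)
            if rs_vals:
                log_n.append(np.log(n))
                log_rs.append(np.log(np.mean(rs_vals)))
        return np.polyfit(log_n, log_rs, 1)[0]     # slope of log(R/S) versus log(n)

    rng = np.random.default_rng(1)
    series = rng.standard_normal(4096)             # synthetic stand-in for a recorded series
    print(f"Estimated Hurst exponent: {hurst_rs(series):.2f}")  # ~0.5 for white noise
    ```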

  6. Magnifying visual target information and the role of eye movements in motor sequence learning.

    PubMed

    Massing, Matthias; Blandin, Yannick; Panzer, Stefan

    2016-01-01

    An experiment investigated the influence of eye movements on learning a simple motor sequence task when the visual display was magnified. The task was to reproduce a 1300 ms spatial-temporal pattern of elbow flexions and extensions. The spatial-temporal pattern was displayed in front of the participants. Participants were randomly assigned to four groups differing in eye movement instructions (free to use their eyes/instructed to fixate) and visual display size (small/magnified). All participants had to perform a pre-test, an acquisition phase, a delayed retention test, and a transfer test. The results indicated that participants in each practice condition increased their performance during acquisition. The participants who were permitted to use their eyes in the magnified visual display outperformed those who were instructed to fixate on the magnified visual display. When a small visual display was used, the instruction to fixate induced no performance decrements compared to participants who were permitted to use their eyes during acquisition. The findings demonstrated that a spatial-temporal pattern can be learned without eye movements, but being permitted to use eye movements facilitates response production when the visual angle is increased. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Nonretinotopic visual processing in the brain.

    PubMed

    Melcher, David; Morrone, Maria Concetta

    2015-01-01

    A basic principle in visual neuroscience is the retinotopic organization of neural receptive fields. Here, we review behavioral, neurophysiological, and neuroimaging evidence for nonretinotopic processing of visual stimuli. A number of behavioral studies have shown perception depending on object or external-space coordinate systems, in addition to retinal coordinates. Both single-cell neurophysiology and neuroimaging have provided evidence for the modulation of neural firing by gaze position and processing of visual information based on craniotopic or spatiotopic coordinates. Transient remapping of the spatial and temporal properties of neurons contingent on saccadic eye movements has been demonstrated in visual cortex, as well as frontal and parietal areas involved in saliency/priority maps, and is a good candidate to mediate some of the spatial invariance demonstrated by perception. Recent studies suggest that spatiotopic selectivity depends on a low spatial resolution system of maps that operates over a longer time frame than retinotopic processing and is strongly modulated by high-level cognitive factors such as attention. The interaction of an initial and rapid retinotopic processing stage, tied to new fixations, and a longer lasting but less precise nonretinotopic level of visual representation could underlie the perception of both a detailed and a stable visual world across saccadic eye movements.

  8. Eye and Head Response to Peripheral Targets

    DTIC Science & Technology

    1989-08-01

    nystagmus movements of the eyes. These movements tend to be oscillatory or unstable in nature and can be elicited in three ways: stimuli 2 in the...Hall and Cusack, 1972). Nystagmus can best be described through example. As mentioned previously, the compensatory eye movements serve to stabilize...movements are what are referred to as nystagmus. The direction of the nystagmus is identified by the movement of the fast phase, that is, the direction

  9. Eye muscle repair

    MedlinePlus

    ... and physical exam before the procedure Orthoptic measurements (eye movement measurements) Always tell your child's health care provider: ... D, Plummer LS, Stass-Isern M. Disorders of eye movement and alignment. In: Kliegman RM, Stanton BF, St. ...

  10. Very Slow Search and Reach: Failure to Maximize Expected Gain in an Eye-Hand Coordination Task

    PubMed Central

    Zhang, Hang; Morvan, Camille; Etezad-Heydari, Louis-Alexandre; Maloney, Laurence T.

    2012-01-01

    We examined an eye-hand coordination task where optimal visual search and hand movement strategies were inter-related. Observers were asked to find and touch a target among five distractors on a touch screen. Their reward for touching the target was reduced by an amount proportional to how long they took to locate and reach to it. Coordinating the eye and the hand appropriately would markedly reduce the search-reach time. Using statistical decision theory we derived the sequence of interrelated eye and hand movements that would maximize expected gain and we predicted how hand movements should change as the eye gathered further information about target location. We recorded human observers' eye movements and hand movements and compared them with the optimal strategy that would have maximized expected gain. We found that most observers failed to adopt the optimal search-reach strategy. We analyze and describe the strategies they did adopt. PMID:23071430
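
    A toy sketch of the expected-gain criterion described above: because the reward shrinks in proportion to total search-plus-reach time, candidate strategies can be scored by their mean gain. The timing distributions and cost parameters below are invented for illustration and do not represent the study's derived optimal strategy.

    ```python
    # Hedged sketch: scoring search-reach strategies by expected gain when the
    # reward decays linearly with total time. All numbers are invented.
    import numpy as np

    def expected_gain(total_times_s, reward=100.0, cost_per_s=20.0):
        """Mean gain when gain = reward - cost_per_s * total_time."""
        return float(np.mean(reward - cost_per_s * np.asarray(total_times_s)))

    rng = np.random.default_rng(2)
    # Strategy A: launch the reach early, toward the display centre, and correct in flight.
    early_reach_times = rng.normal(2.0, 0.3, 10_000)
    # Strategy B: wait until the eye has located the target, then reach directly.
    wait_then_reach_times = rng.normal(2.6, 0.2, 10_000)

    print("early reach :", round(expected_gain(early_reach_times), 1))
    print("wait & reach:", round(expected_gain(wait_then_reach_times), 1))
    ```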

  11. Eye Movements Reveal the Influence of Event Structure on Reading Behavior.

    PubMed

    Swets, Benjamin; Kurby, Christopher A

    2016-03-01

    When we read narrative texts such as novels and newspaper articles, we segment information presented in such texts into discrete events, with distinct boundaries between those events. But do our eyes reflect this event structure while reading? This study examines whether eye movements during the reading of discourse reveal how readers respond online to event structure. Participants read narrative passages as we monitored their eye movements. Several measures revealed that event structure predicted eye movements. In two experiments, we found that both early and overall reading times were longer for event boundaries. We also found that regressive saccades were more likely to land on event boundaries, but that readers were less likely to regress out of an event boundary. Experiment 2 also demonstrated that tracking event structure carries a working memory load. Eye movements provide a rich set of online data to test the cognitive reality of event segmentation during reading. Copyright © 2015 Cognitive Science Society, Inc.

  12. Time-course of eye movement-related decrease in vividness and emotionality of unpleasant autobiographical memories.

    PubMed

    Smeets, Monique A M; Dijs, M Willem; Pervan, Iva; Engelhard, Iris M; van den Hout, Marcel A

    2012-01-01

    The time-course of changes in the vividness and emotionality of unpleasant autobiographical memories associated with making eye movements (eye movement desensitisation and reprocessing, EMDR) was investigated. Participants retrieved unpleasant autobiographical memories and rated their vividness and emotionality prior to and following 96 seconds of making eye movements (EM) or keeping the eyes stationary (ES), as well as at 2, 4, 6, and 10 seconds into the intervention and then at regular, larger intervals throughout the 96-second intervention. Results revealed that, relative to the ES group, emotionality showed a significant drop only after 74 seconds, whereas vividness showed a significant drop as early as 2 seconds into the intervention. These results support the view that emotionality is reduced only after vividness has dropped. The results are discussed in light of working memory theory and visual imagery theory, according to which eye movements, which also tax working memory, interfere with the regular refreshing of the visual memory needed to maintain it in working memory, thereby affecting vividness first.

  13. Clinical-Radiologic Correlation of Extraocular Eye Movement Disorders: Seeing beneath the Surface.

    PubMed

    Thatcher, Joshua; Chang, Yu-Ming; Chapman, Margaret N; Hovis, Keegan; Fujita, Akifumi; Sobel, Rachel; Sakai, Osamu

    2016-01-01

    Extraocular eye movement disorders are relatively common and may be a significant source of discomfort and morbidity for patients. The presence of restricted eye movement can be detected clinically with quick, easily performed, noninvasive maneuvers that assess medial, lateral, upward, and downward gaze. However, detecting the presence of ocular dysmotility may not be sufficient to pinpoint the exact cause of eye restriction. Imaging plays an important role in excluding, in some cases, and detecting, in others, a specific cause responsible for the clinical presentation. However, the radiologist should be aware that the imaging findings in many of these conditions when taken in isolation from the clinical history and symptoms are often nonspecific. Normal eye movements are directly controlled by the ocular motor cranial nerves (CN III, IV, and VI) in coordination with indirect input or sensory stimuli derived from other cranial nerves. Specific causes of ocular dysmotility can be localized to the cranial nerve nuclei in the brainstem, the cranial nerve pathways in the peripheral nervous system, and the extraocular muscles in the orbit, with disease at any of these sites manifesting clinically as an eye movement disorder. A thorough understanding of central nervous system anatomy, cranial nerve pathways, and orbital anatomy, as well as familiarity with patterns of eye movement restriction, are necessary for accurate detection of radiologic abnormalities that support a diagnostic source of the suspected extraocular movement disorder. © RSNA, 2016.

  14. The contributions of eye movements to the efficacy of brief exposure treatment for reducing fear of public speaking.

    PubMed

    Carrigan, M H; Levis, D J

    1999-01-01

    The present study was designed to isolate the effects of the eye-movement component of the Eye Movement Desensitization and Reprocessing (EMDR) procedure in the treatment of fear of public speaking. Seventy-one undergraduate psychology students who responded in a fearful manner on the Fear Survey Schedule II and on a standardized, self-report measure of public speaking anxiety (Personal Report of Confidence as a Speaker; PRCS) were randomly assigned to one of four groups in a 2x2 factorial design. The two independent variables assessed were treatment condition (imagery plus eye movements vs. imagery alone) and type of imagery (fear-relevant vs. relaxing). Dependent variables assessed were self-reported and physiological anxiety during exposure and behavioral indices of anxiety while giving a speech. Although process measures indicated exposure to fear-relevant imagery increased anxiety during the procedure, no significant differences among groups were found on any of the outcome measures, except that participants who received eye movements were less likely to give a speech posttreatment than participants who did not receive eye movements. Addition of the eye movements to the experimental procedure did not result in enhancement of fear reduction. It was concluded, consistent with the results of past research, that previously reported positive effects of the EMDR procedure may be largely due to exposure to conditioned stimuli.

  15. A MATLAB-based eye tracking control system using non-invasive helmet head restraint in the macaque.

    PubMed

    De Luna, Paolo; Mohamed Mustafar, Mohamed Faiz Bin; Rainer, Gregor

    2014-09-30

    Tracking eye position is vital for behavioral and neurophysiological investigations in systems and cognitive neuroscience. Infrared camera systems that are now available can be used for eye tracking without the need to surgically implant magnetic search coils. These systems are generally employed using rigid head fixation in monkeys, which maintains the eye in a constant position and facilitates eye tracking. We investigate the use of non-rigid head fixation using a helmet that constrains only general head orientation and allows some freedom of movement. We present a MATLAB software solution to gather and process eye position data, present visual stimuli, interact with various devices, provide experimenter feedback and store data for offline analysis. Our software solution achieves excellent timing performance due to the use of data streaming, instead of the traditionally employed data storage mode for processing analog eye position data. We present behavioral data from two monkeys, demonstrating that adequate performance levels can be achieved on a simple fixation paradigm and show how performance depends on parameters such as fixation window size. Our findings suggest that non-rigid head restraint can be employed for behavioral training and testing on a variety of gaze-dependent visual paradigms, reducing the need for rigid head restraint systems for some applications. Although developed for the macaque monkey, our system can of course work equally well for human eye tracking applications in which head restraint is undesirable. Copyright © 2014. Published by Elsevier B.V.
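
    A minimal sketch of the streaming idea described above, in which each eye-position sample is processed as it arrives rather than stored in a block for later analysis; the sample source, sampling rate, and fixation-window parameters are placeholders, not the published MATLAB implementation.

    ```python
    # Hedged sketch: act on each eye-position sample immediately (streaming)
    # instead of buffering a block and analysing it afterwards. Placeholder values.
    import math
    import random
    import time

    FIX_X, FIX_Y = 0.0, 0.0   # fixation point (deg); placeholder
    WINDOW_RADIUS = 2.0       # fixation window radius (deg); placeholder

    def read_sample():
        """Stand-in for reading one analog eye-position sample from a tracker."""
        return random.gauss(FIX_X, 0.4), random.gauss(FIX_Y, 0.4)

    def stream_fixation_check(duration_s=1.0, rate_hz=500):
        """Return False as soon as gaze leaves the window, e.g. to abort the trial."""
        for _ in range(int(duration_s * rate_hz)):
            x, y = read_sample()
            if math.hypot(x - FIX_X, y - FIX_Y) > WINDOW_RADIUS:
                return False
            time.sleep(1.0 / rate_hz)
        return True

    print("fixation held:", stream_fixation_check())
    ```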

  16. Rapid Eye Movements (REMs) and visual dream recall in both congenitally blind and sighted subjects

    NASA Astrophysics Data System (ADS)

    Bértolo, Helder; Mestre, Tiago; Barrio, Ana; Antona, Beatriz

    2017-08-01

    Our objective was to evaluate rapid eye movements (REMs) associated with visual dream recall in congenitally blind (CB) and sighted (CS) subjects. Polysomnographic recordings were performed in the subjects' homes during two consecutive nights. REMs were detected by visual inspection of both EOG channels (EOG-H, EOG-V) and further classified as occurring in isolation or in bursts. Dream recall was defined by the existence of a dream report. The two groups were compared using t-tests as well as a two-way ANOVA with a post-hoc Fisher test (with diagnosis (blind vs. sighted) and dream recall (yes or no) as factors, as a function of time). The average number of REM awakenings per subject and the recall ability were identical in both groups. CB subjects had a lower REM density than CS subjects; the same applied to REM bursts and isolated eye movements. In the two-way ANOVA, REM bursts and REM density differed significantly with positive dream recall, mainly for the CB group, and with diagnosis; furthermore, for both features significant results were obtained for the interaction of time, recall, and diagnosis, with the interaction of recall and time being stronger. In line with previous findings, the data show that blind subjects have lower REM density. However, the ability to recall dreams is identical in congenitally blind subjects and sighted controls. In both groups, visual dream recall is associated with an increase in REM bursts and REM density, and REM bursts also differ in their temporal profile. Overall, visual dream recall is associated with increased REM activity.

  17. Eye gaze tracking for endoscopic camera positioning: an application of a hardware/software interface developed to automate Aesop.

    PubMed

    Ali, S M; Reisner, L A; King, B; Cao, A; Auner, G; Klein, M; Pandya, A K

    2008-01-01

    A redesigned motion control system for the medical robot Aesop allows automating and programming its movements. An IR eye tracking system has been integrated with this control interface to implement an intelligent, autonomous eye gaze-based laparoscopic positioning system. A laparoscopic camera held by Aesop can be moved based on the data from the eye tracking interface to keep the user's gaze point region at the center of a video feedback monitor. This system setup provides autonomous camera control that works around the surgeon, providing an optimal robotic camera platform.
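
    A minimal sketch of the gaze-centering idea described above: the offset between the gaze point and the image centre is mapped to a pan/tilt velocity command through simple proportional control with a dead zone. The gains, units, and thresholds are placeholders, and this is not the Aesop control interface.

    ```python
    # Hedged sketch: proportional control that recentres the gazed-at region.
    # Gains, units, and the dead zone are illustrative placeholders.
    def camera_velocity_command(gaze_px, frame_size=(640, 480), gain=0.002, dead_zone_px=40):
        """Map the gaze point (pixels) to a pan/tilt velocity command."""
        cx, cy = frame_size[0] / 2.0, frame_size[1] / 2.0
        ex, ey = gaze_px[0] - cx, gaze_px[1] - cy   # gaze offset from the image centre
        if abs(ex) < dead_zone_px and abs(ey) < dead_zone_px:
            return 0.0, 0.0                         # gaze already near the centre
        return gain * ex, gain * ey                 # sign convention depends on the mount

    print(camera_velocity_command((520, 300)))      # -> (0.4, 0.12)
    ```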

  18. Tracking the Eye Movement of Four Years Old Children Learning Chinese Words

    ERIC Educational Resources Information Center

    Lin, Dan; Chen, Guangyao; Liu, Yingyi; Liu, Jiaxin; Pan, Jue; Mo, Lei

    2018-01-01

    Storybook reading is the major source of literacy exposure for beginning readers. The present study tracked 4-year-old Chinese children's eye movements while they were reading simulated storybook pages. Their eye-movement patterns were examined in relation to their word learning gains. The same reading list, consisting of 20 two-character Chinese…

  19. Influence of Eye Movements, Auditory Perception, and Phonemic Awareness in the Reading Process

    ERIC Educational Resources Information Center

    Megino-Elvira, Laura; Martín-Lobo, Pilar; Vergara-Moragues, Esperanza

    2016-01-01

    The authors' aim was to analyze the relationship of eye movements, auditory perception, and phonemic awareness with the reading process. The instruments used were the King-Devick Test (saccade eye movements), the PAF test (auditory perception), the PFC (phonemic awareness), the PROLEC-R (lexical process), the Canals reading speed test, and the…

  20. Children's Eye Movements, Miscue Analysis Patterns, and Retellings When Reading a Counterpoint Picture Book

    ERIC Educational Resources Information Center

    Liwanag, Maria Perpetua Socorro U.; Pelatti, Christina Yeager; Martens, Ray; Martens, Prisca

    2016-01-01

    This study incorporated eye movement miscue analysis to investigate two second-graders' oral reading and comprehension of a counterpoint picture book. Findings suggest the second-graders' strategies when reading the written and pictorial text affected their comprehension as opposed to the number and location of their eye movements. Specifically,…

  1. Eye Movements during Scene Recollection Have a Functional Role, but They Are Not Reinstatements of Those Produced during Encoding

    ERIC Educational Resources Information Center

    Johansson, Roger; Holsanova, Jana; Dewhurst, Richard; Holmqvist, Kenneth

    2012-01-01

    Current debate in mental imagery research revolves around the perceptual and cognitive role of eye movements to "nothing" (Ferreira, Apel, & Henderson, 2008; Richardson, Altmann, Spivey, & Hoover, 2009). While it is established that eye movements are comparable when inspecting a scene (or hearing a scene description) as when…

  2. Validity of Eye Movement Methods and Indices for Capturing Semantic (Associative) Priming Effects

    ERIC Educational Resources Information Center

    Odekar, Anshula; Hallowell, Brooke; Kruse, Hans; Moates, Danny; Lee, Chao-Yang

    2009-01-01

    Purpose: The purpose of this investigation was to evaluate the usefulness of eye movement methods and indices as a tool for studying priming effects by verifying whether eye movement indices capture semantic (associative) priming effects in a visual cross-format (written word to semantically related picture) priming paradigm. Method: In the…

  3. Learning to See: Guiding Students' Attention via a Model's Eye Movements Fosters Learning

    ERIC Educational Resources Information Center

    Jarodzka, Halszka; van Gog, Tamara; Dorr, Michael; Scheiter, Katharina; Gerjets, Peter

    2013-01-01

    This study investigated how to teach perceptual tasks, that is, classifying fish locomotion, through eye movement modeling examples (EMME). EMME consisted of a replay of eye movements of a didactically behaving domain expert (model), which had been recorded while he executed the task, superimposed onto the video stimulus. Seventy-five students…

  4. Eye Movements during Multiple Object Tracking: Where Do Participants Look?

    ERIC Educational Resources Information Center

    Fehd, Hilda M.; Seiffert, Adriane E.

    2008-01-01

    Similar to the eye movements you might make when viewing a sports game, this experiment investigated where participants tend to look while keeping track of multiple objects. While eye movements were recorded, participants tracked either 1 or 3 of 8 red dots that moved randomly within a square box on a black background. Results indicated that…

  5. Individual Differences in Fifth Graders' Literacy and Academic Language Predict Comprehension Monitoring Development: An Eye-Movement Study

    ERIC Educational Resources Information Center

    Connor, Carol McDonald; Radach, Ralph; Vorstius, Christian; Day, Stephanie L.; McLean, Leigh; Morrison, Frederick J.

    2015-01-01

    In this study, we investigated fifth graders' (n = 52) fall literacy, academic language, and motivation and how these skills predicted fall and spring comprehension monitoring on an eye movement task. Comprehension monitoring was defined as the identification and repair of misunderstandings when reading text. In the eye movement task, children…

  6. Exploring Cultural Variation in Eye Movements on a Web Page between Americans and Koreans

    ERIC Educational Resources Information Center

    Yang, Changwoo

    2009-01-01

    This study explored differences in eye movement on a Web page between members of two different cultures to provide insight and guidelines for implementation of global Web site development. More specifically, the research examines whether differences of eye movement exist between the two cultures (American vs. Korean) when viewing a Web page, and…

  7. Secondary-Task Effects on Learning with Multimedia: An Investigation through Eye-Movement Analysis

    ERIC Educational Resources Information Center

    Acarturk, Cengiz; Ozcelik, Erol

    2017-01-01

    This study investigates secondary-task interference on eye movements through learning with multimedia. We focus on the relationship between the influence of the secondary task on the eye movements of learners, and the learning outcomes as measured by retention, matching, and transfer. Half of the participants performed a spatial tapping task while…

  8. Initial Scene Representations Facilitate Eye Movement Guidance in Visual Search

    ERIC Educational Resources Information Center

    Castelhano, Monica S.; Henderson, John M.

    2007-01-01

    What role does the initial glimpse of a scene play in subsequent eye movement guidance? In 4 experiments, a brief scene preview was followed by object search through the scene via a small moving window that was tied to fixation position. Experiment 1 demonstrated that the scene preview resulted in more efficient eye movements compared with a…

  9. Attention Switching during Scene Perception: How Goals Influence the Time Course of Eye Movements across Advertisements

    ERIC Educational Resources Information Center

    Wedel, Michel; Pieters, Rik; Liechty, John

    2008-01-01

    Eye movements across advertisements express a temporal pattern of bursts of respectively relatively short and long saccades, and this pattern is systematically influenced by activated scene perception goals. This was revealed by a continuous-time hidden Markov model applied to eye movements of 220 participants exposed to 17 ads under a…

  10. Using Eye Movements to Model the Sequence of Text-Picture Processing for Multimedia Comprehension

    ERIC Educational Resources Information Center

    Mason, L.; Scheiter, K.; Tornatora, M. C.

    2017-01-01

    This study used eye movement modeling examples (EMME) to support students' integrative processing of verbal and graphical information during the reading of an illustrated text. EMME consists of a replay of eye movements of a model superimposed onto the materials that are processed for accomplishing the task. Specifically, the study investigated…

  11. Pursuit Eye Movements

    NASA Technical Reports Server (NTRS)

    Krauzlis, Rich; Stone, Leland; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    When viewing objects, primates use a combination of saccadic and pursuit eye movements to stabilize the retinal image of the object of regard within the high-acuity region near the fovea. Although these movements involve widespread regions of the nervous system, they mix seamlessly in normal behavior. Saccades are discrete movements that quickly direct the eyes toward a visual target, thereby translating the image of the target from an eccentric retinal location to the fovea. In contrast, pursuit is a continuous movement that slowly rotates the eyes to compensate for the motion of the visual target, minimizing the blur that can compromise visual acuity. While other mammalian species can generate smooth optokinetic eye movements - which track the motion of the entire visual surround - only primates can smoothly pursue a single small element within a complex visual scene, regardless of the motion elsewhere on the retina. This ability likely reflects the greater ability of primates to segment the visual scene, to identify individual visual objects, and to select a target of interest.

  12. The effect of age and sex on facial mimicry: a three-dimensional study in healthy adults.

    PubMed

    Sforza, C; Mapelli, A; Galante, D; Moriconi, S; Ibba, T M; Ferraro, L; Ferrario, V F

    2010-10-01

    To assess sex- and age-related characteristics in standardized facial movements, 40 healthy adults (20 men, 20 women; aged 20-50 years) performed seven standardized facial movements (maximum smile; free smile; "surprise" with closed mouth; "surprise" with open mouth; eye closure; right- and left-side eye closures). The three-dimensional coordinates of 21 soft tissue facial landmarks were recorded by a motion analyser, their movements computed, and asymmetry indices calculated. Within each movement, total facial mobility was independent from sex and age (analysis of variance, p>0.05). Asymmetry indices of the eyes and mouth were similar in both sexes (p>0.05). Age significantly influenced eye and mouth asymmetries of the right-side eye closure, and eye asymmetry of the surprise movement. On average, the asymmetry indices of the symmetric movements were always lower than 8%, and most did not deviate from the expected value of 0 (Student's t). Larger asymmetries were found for the asymmetric eye closures (eyes, up to 50%, p<0.05; mouth, up to 30%, p<0.05 only in the 20-30-year-old subjects). In conclusion, sex and age had a limited influence on total facial motion and asymmetry in normal adult men and women. Copyright © 2010 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  13. Difference in Visual Processing Assessed by Eye Vergence Movements

    PubMed Central

    Solé Puig, Maria; Puigcerver, Laura; Aznar-Casanova, J. Antonio; Supèr, Hans

    2013-01-01

    Orienting visual attention is closely linked to the oculomotor system. For example, a shift of attention is usually followed by a saccadic eye movement and can be revealed by micro saccades. Recently we reported a novel role of another type of eye movement, namely eye vergence, in orienting visual attention. Shifts in visuospatial attention are characterized by the response modulation to a selected target. However, unlike (micro-) saccades, eye vergence movements do not carry spatial information (except for depth) and are thus not specific to a particular visual location. To further understand the role of eye vergence in visual attention, we tested subjects with different perceptual styles. Perceptual style refers to the characteristic way individuals perceive environmental stimuli, and is characterized by a spatial difference (local vs. global) in perceptual processing. We tested field independent (local; FI) and field dependent (global; FD) observers in a cue/no-cue task and a matching task. We found that FI observers responded faster and had stronger modulation in eye vergence in both tasks than FD subjects. The results may suggest that eye vergence modulation may relate to the trade-off between the size of spatial region covered by attention and the processing efficiency of sensory information. Alternatively, vergence modulation may have a role in the switch in cortical state to prepare the visual system for new incoming sensory information. In conclusion, vergence eye movements may be added to the growing list of functions of fixational eye movements in visual perception. However, further studies are needed to elucidate its role. PMID:24069140

  14. Anticipatory Eye Movements in Interleaving Templates of Human Behavior

    NASA Technical Reports Server (NTRS)

    Matessa, Michael

    2004-01-01

    Performance modeling has been made easier by architectures which package psychological theory for reuse at useful levels of abstraction. CPM-GOMS uses templates of behavior to package at a task level (e.g., mouse move-click, typing) predictions of lower-level cognitive, perceptual, and motor resource use. CPM-GOMS also has a theory for interleaving resource use between templates. One example of interleaving is anticipatory eye movements. This paper describes the use of ACT-Stitch, a framework for translating CPM-GOMS templates and interleaving theory into ACT-R, to model anticipatory eye movements in skilled behavior. The anticipatory eye movements explain performance in a well-practiced perceptual/motor task, and the interleaving theory is supported with results from an eye-tracking experiment.

  15. Human-Computer Interface Controlled by Horizontal Directional Eye Movements and Voluntary Blinks Using AC EOG Signals

    NASA Astrophysics Data System (ADS)

    Kajiwara, Yusuke; Murata, Hiroaki; Kimura, Haruhiko; Abe, Koji

    Research on eye-gaze human-computer interfaces as communication support tools for cases of amyotrophic lateral sclerosis (ALS) has been active. However, since these interfaces cannot distinguish voluntary from involuntary eye movements, their performance is still not sufficient for practical use. This paper presents a high-performance human-computer interface system that combines high-quality recognition of horizontal directional eye movements and voluntary blinks. Experimental results show that, compared with an existing system that recognizes horizontal and vertical directional eye movements in addition to voluntary blinks, the number of incorrect inputs is decreased by 35.1% and character input is sped up by 17.4%.
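
    A minimal sketch of one way such recognition could work: a single baseline-corrected EOG epoch is classified as a left or right movement or a blink by comparing signed channel peaks against thresholds. The channel conventions and thresholds are invented, and the published system's recognition method may differ.

    ```python
    # Hedged sketch: threshold-based classification of one EOG epoch.
    # Thresholds and channel conventions are illustrative placeholders.
    import numpy as np

    def classify_eog_event(horizontal, vertical, move_thresh=80.0, blink_thresh=150.0):
        """horizontal/vertical: baseline-corrected EOG epochs in microvolts."""
        h_peak = horizontal[np.argmax(np.abs(horizontal))]  # signed horizontal peak
        v_peak = vertical[np.argmax(np.abs(vertical))]      # signed vertical peak
        if v_peak > blink_thresh:      # blinks show up as large vertical deflections
            return "blink"
        if h_peak > move_thresh:
            return "right"
        if h_peak < -move_thresh:
            return "left"
        return "none"

    t = np.linspace(0.0, 1.0, 250)
    print(classify_eog_event(120 * np.sin(np.pi * t), 10 * np.sin(np.pi * t)))  # -> "right"
    ```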

  16. A Relationship Between Visual Complexity and Aesthetic Appraisal of Car Front Images: An Eye-Tracker Study.

    PubMed

    Chassy, Philippe; Lindell, Trym A E; Jones, Jessica A; Paramei, Galina V

    2015-01-01

    Image aesthetic pleasure (AP) is conjectured to be related to image visual complexity (VC). The aim of the present study was to investigate whether (a) two image attributes, AP and VC, are reflected in eye-movement parameters; and (b) subjective measures of AP and VC are related. Participants (N=26) explored car front images (M=50) while their eye movements were recorded. Following image exposure (10 seconds), its VC and AP were rated. Fixation count was found to positively correlate with the subjective VC and its objective proxy, JPEG compression size, suggesting that this eye-movement parameter can be considered an objective behavioral measure of VC. AP, in comparison, positively correlated with average dwelling time. Subjective measures of AP and VC were related too, following an inverted U-shape function best-fit by a quadratic equation. In addition, AP was found to be modulated by car prestige. Our findings reveal a close relationship between subjective and objective measures of complexity and aesthetic appraisal, which is interpreted within a prototype-based theory framework. © The Author(s) 2015.
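
    A minimal sketch of the objective complexity proxy mentioned above: compress each image as JPEG, take the resulting file size as the complexity measure, and correlate it with per-image fixation counts. The file names and fixation counts are hypothetical; the sketch assumes Pillow and SciPy are available.

    ```python
    # Hedged sketch: JPEG-compressed size as a visual complexity proxy,
    # correlated with fixation counts. File names and counts are placeholders.
    import io

    from PIL import Image
    from scipy.stats import pearsonr

    def jpeg_size_bytes(path, quality=75):
        """Size of the image after JPEG compression at a fixed quality."""
        buf = io.BytesIO()
        Image.open(path).convert("RGB").save(buf, format="JPEG", quality=quality)
        return len(buf.getvalue())

    image_paths = ["car_front_01.png", "car_front_02.png", "car_front_03.png"]  # hypothetical
    fixation_counts = [14, 22, 18]                                              # hypothetical

    sizes = [jpeg_size_bytes(p) for p in image_paths]
    r, p = pearsonr(sizes, fixation_counts)
    print(f"fixation count vs JPEG size: r = {r:.2f}, p = {p:.3f}")
    ```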

  17. Using eye movements as an index of implicit face recognition in autism spectrum disorder.

    PubMed

    Hedley, Darren; Young, Robyn; Brewer, Neil

    2012-10-01

    Individuals with an autism spectrum disorder (ASD) typically show impairment on face recognition tasks. Performance has usually been assessed using overt, explicit recognition tasks. Here, a complementary method involving eye tracking was used to examine implicit face recognition in participants with ASD and in an intelligence quotient-matched non-ASD control group. Differences in eye movement indices between target and foil faces were used as an indicator of implicit face recognition. Explicit face recognition was assessed using old-new discrimination and reaction time measures. Stimuli were faces of studied (target) or unfamiliar (foil) persons. Target images at test were either identical to the images presented at study or altered by changing the lighting, pose, or by masking with visual noise. Participants with ASD performed worse than controls on the explicit recognition task. Eye movement-based measures, however, indicated that implicit recognition may not be affected to the same degree as explicit recognition. Autism Res 2012, 5: 363-379. © 2012 International Society for Autism Research, Wiley Periodicals, Inc.

  18. Is Making a Risky Choice Based on a Weighting and Adding Process? An Eye-Tracking Investigation

    ERIC Educational Resources Information Center

    Su, Yin; Rao, Li-Lin; Sun, Hong-Yue; Du, Xue-Lei; Li, Xingshan; Li, Shu

    2013-01-01

    The debate about whether making a risky choice is based on a weighting and adding process has a long history and is still unresolved. To address this long-standing controversy, we developed a comparative paradigm. Participants' eye movements in 2 risky choice tasks that required participants to choose between risky options in single-play and…

  19. Wernicke-Korsakoff syndrome

    MedlinePlus

    ... cause leg tremor Vision changes such as abnormal eye movements (back and forth movements called nystagmus), double vision , ... may show damage to many nerve systems: Abnormal eye movement Decreased or abnormal reflexes Fast pulse (heart rate) ...

  20. How young adults with autism spectrum disorder watch and interpret pragmatically complex scenes.

    PubMed

    Lönnqvist, Linda; Loukusa, Soile; Hurtig, Tuula; Mäkinen, Leena; Siipo, Antti; Väyrynen, Eero; Palo, Pertti; Laukka, Seppo; Mämmelä, Laura; Mattila, Marja-Leena; Ebeling, Hanna

    2017-11-01

    The aim of the current study was to investigate subtle characteristics of social perception and interpretation in high-functioning individuals with autism spectrum disorders (ASDs), and to study the relation between watching and interpreting. As a novelty, we used an approach that combined moment-by-moment eye tracking and verbal assessment. Sixteen young adults with ASD and 16 neurotypical control participants watched a video depicting a complex communication situation while their eye movements were tracked. The participants also completed a verbal task with questions related to the pragmatic content of the video. We compared verbal task scores and eye movements between groups, and assessed correlations between task performance and eye movements. Individuals with ASD had more difficulty than the controls in interpreting the video, and during two short moments there were significant group differences in eye movements. Additionally, we found significant correlations between verbal task scores and moment-level eye movement in the ASD group, but not among the controls. We concluded that participants with ASD had slight difficulties in understanding the pragmatic content of the video stimulus and attending to social cues, and that the connection between pragmatic understanding and eye movements was more pronounced for participants with ASD than for neurotypical participants.

  1. Microgravity

    NASA Image and Video Library

    2003-01-22

    One concern about human adaptation to space is how returning from the microgravity of orbit to Earth can affect an astronaut's ability to fly safely. There are monitors and infrared video cameras to measure eye movements without having to affect the crew member. A computer screen provides moving images which the eye tracks while the brain determines what it is seeing. A video camera records movement of the subject's eyes. Researchers can then correlate perception and response. Test subjects perceive different images when a moving object is covered by a mask that is visible or invisible (above). Early results challenge the accepted theory that smooth pursuit -- the fluid eye movement that humans and primates have -- does not involve the higher brain. NASA results show that: Eye movement can predict human perceptual performance, smooth pursuit and saccadic (quick or ballistic) movement share some signal pathways, and common factors can make both smooth pursuit and visual perception produce errors in motor responses.

  2. Understanding Visible Perception

    NASA Technical Reports Server (NTRS)

    2003-01-01

    One concern about human adaptation to space is how returning from the microgravity of orbit to Earth can affect an astronaut's ability to fly safely. There are monitors and infrared video cameras to measure eye movements without having to affect the crew member. A computer screen provides moving images which the eye tracks while the brain determines what it is seeing. A video camera records movement of the subject's eyes. Researchers can then correlate perception and response. Test subjects perceive different images when a moving object is covered by a mask that is visible or invisible (above). Early results challenge the accepted theory that smooth pursuit -- the fluid eye movement that humans and primates have -- does not involve the higher brain. NASA results show that: Eye movement can predict human perceptual performance, smooth pursuit and saccadic (quick or ballistic) movement share some signal pathways, and common factors can make both smooth pursuit and visual perception produce errors in motor responses.

  3. Rapid eye movements during sleep in mice: High trait-like stability qualifies rapid eye movement density for characterization of phenotypic variation in sleep patterns of rodents

    PubMed Central

    2011-01-01

    Background In humans, rapid eye movement (REM) density during REM sleep plays a prominent role in psychiatric diseases. In depression especially, increased REM density is a vulnerability marker. In clinical practice and research, the measurement of REM density is highly standardized. In basic animal research, almost no tools are available to obtain and systematically evaluate eye movement data, although this would increase comparability between human and animal sleep studies. Methods We obtained standardized electroencephalographic (EEG), electromyographic (EMG) and electrooculographic (EOG) signals from freely behaving mice. EOG electrodes were bilaterally and chronically implanted with placement of the electrodes directly between the musculus rectus superior and musculus rectus lateralis. After recovery, EEG, EMG and EOG signals were obtained for four days. Subsequent to the implantation process, we developed and validated an Eye Movement scoring in Mice Algorithm (EMMA) to detect REMs as singularities of the EOG signal, based on wavelet methodology. Results The distribution of wakefulness, non-REM (NREM) sleep and rapid eye movement (REM) sleep was typical of nocturnal rodents, with small amounts of wakefulness and large amounts of NREM sleep during the light period and reversed proportions during the dark period. REM sleep was distributed correspondingly. REM density was significantly higher during REM sleep than during NREM sleep. REM bursts were detected more often at the end of the dark period than at the beginning of the light period. During REM sleep REM density showed an ultradian course, and during NREM sleep REM density peaked at the beginning of the dark period. Concerning individual eye movements, REM duration was longer and amplitude was lower during REM sleep than during NREM sleep. The majority of single REMs and REM bursts were associated with micro-arousals during NREM sleep, but not during REM sleep. Conclusions Sleep-stage specific distributions of REMs in mice correspond to human REM density during sleep. REM density, now also assessable in animal models through our approach, is increased in humans after acute stress, during PTSD and in depression. This relationship can now be exploited to match animal models more closely to clinical situations, especially in animal models of depression. PMID:22047102
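
    A minimal sketch of the general idea behind detecting REMs as singularities of the EOG: convolve the signal with a derivative-of-Gaussian kernel (a simple wavelet-like edge detector) and flag samples whose response exceeds a threshold. The signal and all parameters are synthetic placeholders; this is not the published EMMA algorithm.

    ```python
    # Hedged sketch: flag rapid EOG deflections via a derivative-of-Gaussian
    # filter response. Synthetic data; parameters are illustrative only.
    import numpy as np

    def detect_rem_candidates(eog, fs=250, scale_s=0.04, thresh_sd=4.0):
        t = np.arange(-3 * scale_s, 3 * scale_s, 1.0 / fs)
        kernel = -t * np.exp(-t**2 / (2 * scale_s**2))       # wavelet-like edge detector
        response = np.convolve(eog, kernel, mode="same")
        threshold = thresh_sd * np.std(response)
        return np.flatnonzero(np.abs(response) > threshold)  # candidate REM samples

    rng = np.random.default_rng(3)
    fs = 250
    eog = rng.normal(0.0, 5.0, 10 * fs)   # 10 s of baseline EOG noise (microvolts)
    eog[1250:1260] += 120.0               # one synthetic rapid deflection
    print(detect_rem_candidates(eog, fs).size > 0)   # True: the deflection is flagged
    ```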

  4. Eye movement instructions modulate motion illusion and body sway with Op Art

    PubMed Central

    Kapoula, Zoï; Lang, Alexandre; Vernet, Marine; Locher, Paul

    2015-01-01

    Op Art generates illusory visual motion. It has been proposed that eye movements participate in such illusions. This study examined the effect of eye movement instructions (fixation vs. free exploration) on the sensation of motion as well as the body sway of subjects viewing Op Art paintings. Twenty-eight healthy adults in orthostatic stance were successively exposed to three visual stimuli consisting of one figure representing a cross (baseline condition) and two Op Art paintings providing a sense of motion in depth, Bridget Riley’s Movements in Squares and Akiyoshi Kitaoka’s Rollers. Before their exposure to the Op Art images, participants were instructed either to fixate at the center of the image (fixation condition) or to explore the artwork (free viewing condition). Posture was measured for 30 s per condition using a body-fixed sensor (accelerometer). The major finding of this study is that the two Op Art paintings induced a larger antero-posterior body sway, both in terms of speed and displacement, and an increased motion illusion in the free viewing condition as compared to the fixation condition. For body sway, this effect was significant for the Riley painting, while for motion illusion this effect was significant for Kitaoka’s image. These results are attributed to macro-saccades presumably occurring under free viewing instructions, and most likely to the small vergence drifts during fixations following the saccades; such movements, in interaction with the visual properties of each image, would increase either the illusory motion sensation or the antero-posterior body sway. PMID:25859197

  5. Using Multiple Ways to Investigate Cognitive Load Theory in the Context of Physics Instruction

    NASA Astrophysics Data System (ADS)

    Zu, Tianlong

    Cognitive load theory (CLT) (Sweller 1988, 1998, 2010) provides a guiding framework for designing instructional materials. CLT differentiates three subtypes of cognitive load: intrinsic, extraneous, and germane cognitive load. The three cognitive loads are theorized in terms of the number of simultaneously processed elements in working memory. Intrinsic cognitive load depends on the number of interacting elements in the instructional material that are related to the learning objective. Extraneous cognitive load refers to the mental resources allocated to processing unnecessary information that does not contribute to learning, caused by non-optimal instructional procedures; it is determined by the number of interacting elements that are not related to the learning goal. Both intrinsic and extraneous load vary according to the prior knowledge of learners. Germane cognitive load is indirectly related to interacting elements: it represents the cognitive resources deployed for processing intrinsic load, chunking information, and constructing and automating schemas, and it is related to the learner's level of motivation. Given this triarchic model of cognitive load and the different roles these loads play in learning activities, different learning outcomes can be expected depending on the characteristics of the educational materials, learner characteristics, and the instructional setting. In three experiments, we investigated cognitive load theory using different approaches. Given the triarchic nature of the cognitive load construct, it is critical to find non-intrusive ways to measure cognitive load. In study one, we replicated and extended a previous landmark study to investigate the use of eye-movement-related metrics to measure the three kinds of cognitive load independently. We also collected students' working memory capacity using a cognitive operation-span task. Two of the three types of cognitive load (intrinsic and extraneous) were directly manipulated, and the third type (germane) was indirectly ascertained. We found that different eye-movement-based parameters were most sensitive to different types of cognitive load. These results indicate that it is possible to monitor the three kinds of cognitive load separately using eye movement parameters. We also compared the up-to-date cognitive load theory model with an alternative model using a multi-level model analysis and found that Sweller's (2010) model is supported by our data. In educational settings, active-learning-based methodologies such as peer instruction have been shown to be effective in facilitating students' conceptual understanding. In study two, we examined the effect of peer interaction on students' conceptual test performance from a cognitive load perspective. Based on the literature, a self-reported cognitive load survey was developed to measure each type of cognitive load. We found that a certain level of prior knowledge is necessary for peer interaction to work and that peer interaction is effective mainly through significantly decreasing the intrinsic load experienced by students, even though it may increase the extraneous load. In study three, we compared the effects of guided instruction in the form of worked examples using narrated, animated video solutions and semi-guided instruction using visual cues on students' performance, shifts of visual attention during transfer, and extraneous cognitive load during learning. We found that multimedia video solutions can be more effective than visual cues in promoting learners' transfer performance. We also found evidence that guided instruction in the form of multimedia video solutions can decrease students' extraneous cognitive load during learning more than semi-guided instruction using visual cues.

  6. Smooth pursuit eye movements and schizophrenia: literature review.

    PubMed

    Franco, J G; de Pablo, J; Gaviria, A M; Sepúlveda, E; Vilella, E

    2014-09-01

    To review the scientific literature on the relationship between impairment of smooth pursuit eye movements and schizophrenia. This is a narrative review that includes historical articles, reports of basic and clinical investigations, systematic reviews, and meta-analyses on the topic. Up to 80% of schizophrenic patients have impaired smooth pursuit eye movements. Despite the diversity of test protocols, 65% of patients and controls are correctly classified by their overall performance during pursuit. Smooth pursuit eye movements depend on the ability to anticipate the target's velocity and on visual feedback, as well as on learning and attention. The neuroanatomy implicated in smooth pursuit overlaps to some extent with frontal cortex zones associated with certain clinical and neuropsychological characteristics of schizophrenia; therefore, some specific components of smooth pursuit anomalies could serve as biomarkers of the disease. Due to their sedative effect, antipsychotics have a deleterious effect on smooth pursuit eye movements, so these movements cannot be used to evaluate the efficacy of currently available treatments. Standardized evaluation of smooth pursuit eye movements in schizophrenia will make it possible to use specific aspects of pursuit as biomarkers for the study of its genetics, psychopathology, or neuropsychology. Copyright © 2013 Sociedad Española de Oftalmología. Published by Elsevier Espana. All rights reserved.

  7. Geometric adjustments to account for eye eccentricity in processing horizontal and vertical eye and head movement data

    NASA Technical Reports Server (NTRS)

    Huebner, W. P.; Paloski, W. H.; Reschke, M. F.; Bloomberg, J. J.

    1995-01-01

    Neglecting the eccentric position of the eyes in the head can lead to erroneous interpretation of ocular motor data, particularly for near targets. We discuss the geometric effects that eye eccentricity has on the processing of target-directed eye and head movement data, and we highlight two approaches to processing and interpreting such data. The first approach involves determining the true position of the target with respect to the location of the eyes in space for evaluating the efficacy of gaze, and it allows calculation of retinal error directly from measured eye, head, and target data. The second approach effectively eliminates eye eccentricity effects by adjusting measured eye movement data to yield equivalent responses relative to a specified reference location (such as the center of head rotation). This latter technique can be used to standardize measured eye movement signals, enabling waveforms collected under different experimental conditions to be directly compared, both with the measured target signals and with each other. Mathematical relationships describing these approaches are presented for horizontal and vertical rotations, for both tangential and circumferential display screens, and efforts are made to describe the sensitivity of parameter variations on the calculated results.
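
    A minimal sketch of the underlying geometry: for a near target, the horizontal angle measured from the centre of head rotation can differ appreciably from the angle measured from an eye displaced laterally and forward from that centre. The offsets and distances below are illustrative, not values from the paper.

    ```python
    # Hedged sketch: direction to a near target from the head centre versus
    # from an eccentrically placed eye. Offsets are illustrative placeholders.
    import math

    def horizontal_angle_deg(target_x, target_z, origin_x=0.0, origin_z=0.0):
        """Horizontal angle (deg) to a target at (x, z) metres; z is the forward distance."""
        return math.degrees(math.atan2(target_x - origin_x, target_z - origin_z))

    target = (0.05, 0.30)   # 5 cm to the right, 30 cm ahead of the centre of head rotation
    head_centre_angle = horizontal_angle_deg(*target)
    eye_angle = horizontal_angle_deg(*target, origin_x=0.03, origin_z=0.09)  # hypothetical eye offset

    print(f"from head centre: {head_centre_angle:.1f} deg, from the eye: {eye_angle:.1f} deg")
    ```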

  8. Effects of aging, word frequency, and text stimulus quality on reading across the adult lifespan: Evidence from eye movements.

    PubMed

    Warrington, Kayleigh L; McGowan, Victoria A; Paterson, Kevin B; White, Sarah J

    2018-04-19

    Reductions in stimulus quality may disrupt the reading performance of older adults more when compared with young adults because of sensory declines that begin early in middle age. However, few studies have investigated adult age differences in the effects of stimulus quality on reading, and none have examined how this affects lexical processing and eye movement control. Accordingly, we report two experiments that examine the effects of reduced stimulus quality on the eye movements of young (18-24 years), middle-aged (41-51 years), and older (65+ years) adult readers. In Experiment 1, participants read sentences that contained a high- or low-frequency critical word and that were presented normally or with contrast reduced so that words appeared faint. Experiment 2 further investigated effects of reduced stimulus quality using a gaze-contingent technique to present upcoming text normally or with contrast reduced. Typical patterns of age-related reading difficulty (e.g., slower reading, more regressions) were observed in both experiments. In addition, eye movements were disrupted more for older than younger adults when all text (Experiment 1) or just upcoming text (Experiment 2) appeared faint. Moreover, there was an interaction between stimulus quality and word frequency (Experiment 1), such that readers fixated faint low-frequency words for disproportionately longer. Crucially, this effect was similar across all age groups. Thus, although older readers suffer more from reduced stimulus quality, this additional difficulty primarily affects their visual processing of text. These findings have important implications for understanding the role of stimulus quality on reading behavior across the lifespan. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  9. Joint Attention without Gaze Following: Human Infants and Their Parents Coordinate Visual Attention to Objects through Eye-Hand Coordination

    PubMed Central

    Yu, Chen; Smith, Linda B.

    2013-01-01

    The coordination of visual attention among social partners is central to many components of human behavior and human development. Previous research has focused on one pathway to the coordination of looking behavior by social partners, gaze following. The extant evidence shows that even very young infants follow the direction of another's gaze but they do so only in highly constrained spatial contexts because gaze direction is not a spatially precise cue as to the visual target and not easily used in spatially complex social interactions. Our findings, derived from the moment-to-moment tracking of eye gaze of one-year-olds and their parents as they actively played with toys, provide evidence for an alternative pathway, through the coordination of hands and eyes in goal-directed action. In goal-directed actions, the hands and eyes of the actor are tightly coordinated both temporally and spatially, and thus, in contexts including manual engagement with objects, hand movements and eye movements provide redundant information about where the eyes are looking. Our findings show that one-year-olds rarely look to the parent's face and eyes in these contexts but rather infants and parents coordinate looking behavior without gaze following by attending to objects held by the self or the social partner. This pathway, through eye-hand coupling, leads to coordinated joint switches in visual attention and to an overall high rate of looking at the same object at the same time, and may be the dominant pathway through which physically active toddlers align their looking behavior with a social partner. PMID:24236151

  10. Contribution of the cerebellar flocculus to gaze control during active head movements

    NASA Technical Reports Server (NTRS)

    Belton, T.; McCrea, R. A.; Peterson, B. W. (Principal Investigator)

    1999-01-01

    The flocculus and ventral paraflocculus are adjacent regions of the cerebellar cortex that are essential for controlling smooth pursuit eye movements and for altering the performance of the vestibulo-ocular reflex (VOR). The question addressed in this study is whether these regions of the cerebellum are more globally involved in controlling gaze, regardless of whether eye or active head movements are used to pursue moving visual targets. Single-unit recordings were obtained from Purkinje (Pk) cells in the floccular region of squirrel monkeys that were trained to fixate and pursue small visual targets. Cell firing rate was recorded during smooth pursuit eye movements, cancellation of the VOR, combined eye-head pursuit, and spontaneous gaze shifts in the absence of targets. Pk cells were found to be much less sensitive to gaze velocity during combined eye-head pursuit than during ocular pursuit. They were not sensitive to gaze or head velocity during gaze saccades. Temporary inactivation of the floccular region by muscimol injection compromised ocular pursuit but had little effect on the ability of monkeys to pursue visual targets with head movements or to cancel the VOR during active head movements. Thus the signals produced by Pk cells in the floccular region are necessary for controlling smooth pursuit eye movements but not for coordinating gaze during active head movements. The results imply that individual functional modules in the cerebellar cortex are less involved in the global organization and coordination of movements than with parametric control of movements produced by a specific part of the body.

  11. Saccades and Vergence Performance in a Population of Children with Vertigo and Clinically Assessed Abnormal Vergence Capabilities

    PubMed Central

    Bucci, Maria Pia; Kapoula, Zoï; Bui-Quoc, Emmanuel; Bouet, Aurelie; Wiener-Vacher, Sylvette

    2011-01-01

    Purpose: Early studies reported some abnormalities in saccade and vergence eye movements in children with vertigo and vergence deficiencies. The purpose of this study was to further examine saccade and vergence performance in a population of 44 children (mean age: 12.3±1.6 years) with vertigo symptoms and with different levels of vergence abnormalities, as assessed by static orthoptic examination (near point of convergence, prism bar and cover-uncover test). Methods: Three groups were identified on the basis of the orthoptic tests: group 1 (n = 13) with vergence spasms and mildly perturbed orthoptic scores, group 2 (n = 14) with moderately perturbed orthoptic scores, and group 3 (n = 17) with severely perturbed orthoptic scores. Data were compared to those recorded from 28 healthy children of similar ages. Latency, accuracy and peak velocity of saccades and vergence movements were measured in two different conditions: gap (fixation offset 200 ms prior to target onset) and simultaneous paradigms. Binocular horizontal movements were recorded by a photoelectric device. Results: The group 2 children with vergence abnormalities showed significantly longer latencies than normal children in several types of the eye movements recorded. For all three groups of children with vergence abnormalities, the gain was poor, particularly for vergence movements. The peak velocity values did not differ between the different groups of children examined. Interpretation: Eye movement measures together with static orthoptic evaluation allowed us to better identify children with vergence abnormalities based on their slow initiation of eye movements. Overall, these findings support the hypothesis of a central deficit in the programming and triggering of saccades and vergence in these children. PMID:21858007

  12. The Neural Basis of Smooth Pursuit Eye Movements in the Rhesus Monkey Brain

    ERIC Educational Resources Information Center

    Ilg, Uwe J.; Thier, Peter

    2008-01-01

    Smooth pursuit eye movements are performed to prevent retinal image blur of a moving object. Rhesus monkeys can perform smooth pursuit eye movements much as humans do, even when the pursuit target is not a simple moving dot. Therefore, the study of the neuronal responses as well as the consequences of…

  13. Mental Imagery as Revealed by Eye Movements and Spoken Predicates: A Test of Neurolinguistic Programming.

    ERIC Educational Resources Information Center

    Elich, Matthew; And Others

    1985-01-01

    Tested Bandler and Grinder's proposal that eye movement direction and spoken predicates are indicative of sensory modality of imagery. Subjects reported images in the three modes, but no relation between imagery and eye movements or predicates was found. Visual images were most vivid and often reported. Most subjects rated themselves as visual,…

  14. Evaluating and Reporting Data Quality in Eye Movement Research. Technical Report No. 193.

    ERIC Educational Resources Information Center

    McConkie, George W.

    Stressing that it is necessary to have information about the quality of eye movement data in order to judge the degree of confidence one should have in the results of an experiment using eye movement records as data, this report suggests ways for assessing and reporting such information. Specifically, the report deals with three areas: (1)…

  15. The Importance of Reading Naturally: Evidence from Combined Recordings of Eye Movements and Electric Brain Potentials

    ERIC Educational Resources Information Center

    Metzner, Paul; von der Malsburg, Titus; Vasishth, Shravan; Rösler, Frank

    2017-01-01

    How important is the ability to freely control eye movements for reading comprehension? And how does the parser make use of this freedom? We investigated these questions using coregistration of eye movements and event-related brain potentials (ERPs) while participants read either freely or in a computer-controlled word-by-word format (also known…

  16. The Strategic Organization of Skill

    NASA Technical Reports Server (NTRS)

    Roberts, Ralph

    1996-01-01

    Eye-movement software was developed in addition to several studies that focused on expert-novice differences in the acquisition and organization of skill. These studies focused on how increasingly complex strategies utilize and incorporate visual look-ahead to calibrate action. Software for collecting, calibrating, and scoring eye-movements was refined and updated. Some new algorithms were developed for analyzing corneal-reflection eye movement data that detect the location of saccadic eye movements in space and time. Two full-scale studies were carried out which examined how experts use foveal and peripheral vision to acquire information about upcoming environmental circumstances in order to plan future action(s) accordingly.
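    As an illustration of the kind of saccade-location analysis mentioned above, the following is a minimal sketch of a velocity-threshold detector for calibrated gaze samples; the sampling rate and velocity threshold are illustrative assumptions, not parameters taken from this work.

        import numpy as np

        def detect_saccades(x_deg, y_deg, fs_hz=500.0, vel_thresh_deg_s=30.0):
            # Point-to-point angular velocity (deg/s) from successive gaze samples.
            vel = np.hypot(np.diff(x_deg), np.diff(y_deg)) * fs_hz
            fast = vel > vel_thresh_deg_s
            # Contiguous runs of supra-threshold velocity are candidate saccades.
            edges = np.diff(fast.astype(int))
            starts = np.where(edges == 1)[0] + 1
            ends = np.where(edges == -1)[0] + 1
            if fast[0]:
                starts = np.r_[0, starts]
            if fast[-1]:
                ends = np.r_[ends, fast.size]
            return list(zip(starts, ends))   # (start, end) sample indices

    Onsets and offsets located this way can then be mapped back to screen or scene coordinates to place each saccade in space and time.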

  17. Nystagmus as a Sign of Labyrinthine Disorders: Three-Dimensional Analysis of Nystagmus

    PubMed Central

    2008-01-01

    To diagnose the pathological condition of vertiginous patients, detailed observation of nystagmus, together with examination of body equilibrium and other neurotological tests, is essential. Recording eye movements precisely has long been a goal of researchers and clinicians interested in eye movement analysis, and it requires identifying the optimal recording method. In this review, the author introduces a new method, the three-dimensional analysis of vestibular-induced eye movements, and discusses its advantages and limitations. PMID:19434275

  18. Correlation of climbing perception and eye movements during daytime and nighttime takeoffs using a flight simulator.

    PubMed

    Tamura, Atsushi; Wada, Yoshiro; Shimizu, Naoki; Inui, Takuo; Shiotani, Akihiro

    2016-01-01

    This study suggests that the subjective climbing perception can be quantitatively evaluated using values calculated from induced eye movements, and the findings may aid in the detection of pilots who are susceptible to spatial disorientation in a screening test. The climbing perception experienced by a pilot during takeoff at night is stronger than that experienced during the day. To investigate this illusion, this study assessed eye movements and analyzed their correlation with subjective climbing perception during daytime and nighttime takeoffs. Eight male volunteers participated in this study. A simulated aircraft takeoff environment was created using a flight simulator and the maximum slow-phase velocities and vestibulo-ocular reflex gain of vertical eye movements were calculated during takeoff simulation. Four of the eight participants reported that their perception of climbing at night was stronger, while the other four reported that there was no difference between day and night. These perceptions were correlated with eye movements; participants with a small difference in the maximum slow-phase velocities of their downward eye movements between daytime and nighttime takeoffs indicated that their perception of climbing was the same under the two conditions.
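    For reference to the quantities reported above, here is a minimal sketch of one conventional way to express a vestibulo-ocular reflex (VOR) gain from desaccaded (slow-phase) eye velocity and head or simulated-motion velocity; the regression-through-the-origin estimate is an assumption for illustration, not the authors' exact procedure.

        import numpy as np

        def vor_gain(slow_phase_eye_vel_deg_s, head_vel_deg_s):
            # Least-squares slope (through the origin) relating eye to head velocity;
            # the compensatory eye movement is opposite in sign, so report magnitude.
            slope = (np.dot(slow_phase_eye_vel_deg_s, head_vel_deg_s)
                     / np.dot(head_vel_deg_s, head_vel_deg_s))
            return abs(slope)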

  19. Measuring saccade peak velocity using a low-frequency sampling rate of 50 Hz.

    PubMed

    Wierts, Roel; Janssen, Maurice J A; Kingma, Herman

    2008-12-01

    During the last decades, small head-mounted video eye trackers have been developed to record eye movements. Real-time systems with a low sampling frequency of 50/60 Hz are used in clinical vestibular practice but are generally considered unsuitable for measuring fast eye movements. In this paper, it is shown that saccadic eye movements with an amplitude of at least 5 degrees can, to a good approximation, be considered bandwidth limited up to a frequency of 25-30 Hz. Using the Nyquist theorem to reconstruct saccadic eye movement signals at higher temporal resolutions, it is shown that accurate values for saccade peak velocities recorded at 50 Hz can be obtained, whereas saccade peak accelerations and decelerations cannot. In conclusion, and in contrast to what has been stated until now, video eye trackers sampling at 50/60 Hz are appropriate for detecting clinically relevant saccade peak velocities.
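    A minimal sketch of the band-limited reconstruction argued for above, assuming a calibrated 50 Hz position trace in degrees; the upsampling factor is arbitrary, and scipy's Fourier-based resampling stands in for whichever interpolation the authors used.

        import numpy as np
        from scipy.signal import resample

        def peak_saccade_velocity(position_deg, fs_hz=50.0, upsample=10):
            # Band-limited (sinc-style) interpolation to a finer time base,
            # then the maximum absolute derivative as peak velocity (deg/s).
            pos_up = resample(position_deg, len(position_deg) * upsample)
            vel = np.diff(pos_up) * fs_hz * upsample
            return np.max(np.abs(vel))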

  20. Tracking the truth: the effect of face familiarity on eye fixations during deception.

    PubMed

    Millen, Ailsa E; Hope, Lorraine; Hillstrom, Anne P; Vrij, Aldert

    2017-05-01

    In forensic investigations, suspects sometimes conceal recognition of a familiar person to protect co-conspirators or hide knowledge of a victim. The current experiment sought to determine whether eye fixations could be used to identify memory of known persons when lying about recognition of faces. Participants' eye movements were monitored whilst they lied and told the truth about recognition of faces that varied in familiarity (newly learned, famous celebrities, personally known). Memory detection by eye movements during recognition of personally familiar and famous celebrity faces was negligibly affected by lying, thereby demonstrating that detection of memory during lies is influenced by the prior learning of the face. By contrast, eye movements did not reveal lies robustly for newly learned faces. These findings support the use of eye movements as markers of memory during concealed recognition but also suggest caution when familiarity is only a consequence of one brief exposure.

  1. Real-time lexical comprehension in young children learning American Sign Language.

    PubMed

    MacDonald, Kyle; LaMarr, Todd; Corina, David; Marchman, Virginia A; Fernald, Anne

    2018-04-16

    When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the visual world. This language-vision interaction can provide insights into children's developing efficiency in language comprehension. But how does language influence visual attention when the linguistic signal and the visual world are both processed via the visual channel? Here, we measured eye movements during real-time comprehension of a visual-manual language, American Sign Language (ASL), by 29 native ASL-learning children (16-53 mos, 16 deaf, 13 hearing) and 16 fluent deaf adult signers. All signers showed evidence of rapid, incremental language comprehension, tending to initiate an eye movement before sign offset. Deaf and hearing ASL-learners showed similar gaze patterns, suggesting that the in-the-moment dynamics of eye movements during ASL processing are shaped by the constraints of processing a visual language in real time and not by differential access to auditory information in day-to-day life. Finally, variation in children's ASL processing was positively correlated with age and vocabulary size. Thus, despite competition for attention within a single modality, the timing and accuracy of visual fixations during ASL comprehension reflect information processing skills that are important for language acquisition regardless of language modality. © 2018 John Wiley & Sons Ltd.

  2. Eye-movements intervening between two successive sounds disrupt comparisons of auditory location

    PubMed Central

    Pavani, Francesco; Husain, Masud; Driver, Jon

    2008-01-01

    Many studies have investigated how saccades may affect the internal representation of visual locations across eye-movements. Here we studied instead whether eye-movements can affect auditory spatial cognition. In two experiments, participants judged the relative azimuth (same/different) of two successive sounds presented from a horizontal array of loudspeakers, separated by a 2.5-s delay. Eye-position was either held constant throughout the trial (being directed in a fixed manner to the far left or right of the loudspeaker array), or had to be shifted to the opposite side of the array during the retention delay between the two sounds, after the first sound but before the second. Loudspeakers were either visible (Experiment 1) or occluded from sight (Experiment 2). In both cases, shifting eye-position during the silent delay-period affected auditory performance in the successive auditory comparison task, even though the auditory inputs to be judged were equivalent. Sensitivity (d′) for the auditory discrimination was disrupted, specifically when the second sound shifted in the opposite direction to the intervening eye-movement with respect to the first sound. These results indicate that eye-movements affect internal representation of auditory location. PMID:18566808

  3. Eye-movements intervening between two successive sounds disrupt comparisons of auditory location.

    PubMed

    Pavani, Francesco; Husain, Masud; Driver, Jon

    2008-08-01

    Many studies have investigated how saccades may affect the internal representation of visual locations across eye-movements. Here, we studied instead whether eye-movements can affect auditory spatial cognition. In two experiments, participants judged the relative azimuth (same/different) of two successive sounds presented from a horizontal array of loudspeakers, separated by a 2.5-s delay. Eye-position was either held constant throughout the trial (being directed in a fixed manner to the far left or right of the loudspeaker array) or had to be shifted to the opposite side of the array during the retention delay between the two sounds, after the first sound but before the second. Loudspeakers were either visible (Experiment 1) or occluded from sight (Experiment 2). In both cases, shifting eye-position during the silent delay-period affected auditory performance in the successive auditory comparison task, even though the auditory inputs to be judged were equivalent. Sensitivity (d') for the auditory discrimination was disrupted, specifically when the second sound shifted in the opposite direction to the intervening eye-movement with respect to the first sound. These results indicate that eye-movements affect internal representation of auditory location.
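    The sensitivity measure (d') referred to in the two records above is the standard signal-detection index; a minimal sketch of its computation for same/different judgements follows, with a log-linear correction and placeholder counts that are not data from these experiments.

        from scipy.stats import norm

        def d_prime(hits, misses, false_alarms, correct_rejections):
            # Log-linear correction avoids infinite z-scores at rates of 0 or 1.
            hit_rate = (hits + 0.5) / (hits + misses + 1)
            fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
            return norm.ppf(hit_rate) - norm.ppf(fa_rate)

        print(d_prime(hits=40, misses=10, false_alarms=12, correct_rejections=38))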

  4. Tracking Students' Cognitive Processes during Program Debugging--An Eye-Movement Approach

    ERIC Educational Resources Information Center

    Lin, Yu-Tzu; Wu, Cheng-Chih; Hou, Ting-Yun; Lin, Yu-Chih; Yang, Fang-Ying; Chang, Chia-Hu

    2016-01-01

    This study explores students' cognitive processes while debugging programs by using an eye tracker. Students' eye movements during debugging were recorded by an eye tracker to investigate whether and how high- and low-performance students act differently during debugging. Thirty-eight computer science undergraduates were asked to debug two C…

  5. Eye-Movement Patterns Are Associated with Communicative Competence in Autistic Spectrum Disorders

    ERIC Educational Resources Information Center

    Norbury, Courtenay Frazier; Brock, Jon; Cragg, Lucy; Einav, Shiri; Griffiths, Helen; Nation, Kate

    2009-01-01

    Background: Investigations using eye-tracking have reported reduced fixations to salient social cues such as eyes when participants with autism spectrum disorders (ASD) view social scenes. However, these studies have not distinguished different cognitive phenotypes. Methods: The eye-movements of 28 teenagers with ASD and 18 typically developing…

  6. Improving Silent Reading Performance through Feedback on Eye Movements: A Feasibility Study

    ERIC Educational Resources Information Center

    Korinth, Sebastian P.; Fiebach, Christian J.

    2018-01-01

    This feasibility study investigated if feedback about individual eye movements, reflecting varying word processing stages, can improve reading performance. Twenty-five university students read 90 newspaper articles during 9 eye-tracking sessions. Training group participants (n = 12) were individually briefed before each session, which eye movement…

  7. Normal Morning Melanin-Concentrating Hormone Levels and No Association with Rapid Eye Movement or Non-Rapid Eye Movement Sleep Parameters in Narcolepsy Type 1 and Type 2.

    PubMed

    Schrölkamp, Maren; Jennum, Poul J; Gammeltoft, Steen; Holm, Anja; Kornum, Birgitte R; Knudsen, Stine

    2017-02-15

    Other than hypocretin-1 (HCRT-1) deficiency in narcolepsy type 1 (NT1), the neurochemical imbalance of NT1 and narcolepsy type 2 (NT2) with normal HCRT-1 levels is largely unknown. The neuropeptide melanin-concentrating hormone (MCH) is mainly secreted during sleep and is involved in rapid eye movement (REM) and non-rapid eye movement (NREM) sleep regulation. Hypocretin neurons reciprocally interact with MCH neurons. We hypothesized that altered MCH secretion contributes to the symptoms and sleep abnormalities of narcolepsy and that this is reflected in morning cerebrospinal fluid (CSF) MCH levels, in contrast to previously reported normal evening/afternoon levels. Lumbar CSF and plasma were collected from 07:00 to 10:00 from 57 patients with narcolepsy (subtypes: 47 NT1; 10 NT2) diagnosed according to International Classification of Sleep Disorders, Third Edition (ICSD-3) and 20 healthy controls. HCRT-1 and MCH levels were quantified by radioimmunoassay and correlated with clinical symptoms, polysomnography (PSG), and Multiple Sleep Latency Test (MSLT) parameters. CSF and plasma MCH levels did not differ significantly among narcolepsy patients (regardless of ICSD-3 subtype or HCRT-1 levels) or between patients and controls. CSF MCH and HCRT-1 levels were not significantly correlated. Multivariate regression models of CSF MCH levels, age, sex, and body mass index predicting clinical, PSG, and MSLT parameters did not reveal any significant associations with CSF MCH levels. Our study shows that MCH levels in CSF collected in the morning are normal in narcolepsy and not associated with the patients' clinical symptoms, REM sleep abnormalities, or number of muscle movements during REM or NREM sleep. We conclude that morning lumbar CSF MCH measurement is not an informative diagnostic marker for narcolepsy. © 2017 American Academy of Sleep Medicine.

  8. Parafoveal Processing Affects Outgoing Saccade Length during the Reading of Chinese

    ERIC Educational Resources Information Center

    Liu, Yanping; Reichle, Erik D.; Li, Xingshan

    2015-01-01

    Participants' eye movements were measured while reading Chinese sentences in which target-word frequency and the availability of parafoveal processing were manipulated using a gaze-contingent boundary paradigm. The results of this study indicate that preview availability and its interaction with word frequency modulated the length of the saccades…

  9. Cognitive Mechanisms Underlying Action Prediction in Children and Adults with Autism Spectrum Condition

    ERIC Educational Resources Information Center

    Schuwerk, Tobias; Sodian, Beate; Paulus, Markus

    2016-01-01

    Recent research suggests that impaired action prediction is at the core of social interaction deficits in autism spectrum condition (ASC). Here, we targeted two cognitive mechanisms that are thought to underlie the prediction of others' actions: statistical learning and efficiency considerations. We measured proactive eye movements of 10-year-old…

  10. A Computational Model of Active Vision for Visual Search in Human-Computer Interaction

    DTIC Science & Technology

    2010-08-01

    processors that interact with the production rules to produce behavior, and (c) parameters that constrain the behavior of the model (e.g., the...velocity of a saccadic eye movement). While the parameters can be task-specific, the majority of the parameters are usually fixed across a wide variety...previously estimated durations. Hooge and Erkelens (1996) review these four explanations of fixation duration control. A variety of research

  11. The effects of reading comprehension and launch site on frequency-predictability interactions during paragraph reading.

    PubMed

    Whitford, Veronica; Titone, Debra

    2014-01-01

    We used eye movement measures of paragraph reading to examine whether word frequency and predictability interact during the earliest stages of lexical processing, with a specific focus on whether these effects are modulated by individual differences in reading comprehension or launch site (i.e., the saccade length between the prior and currently fixated word, a proxy for the amount of parafoveal word processing). The joint impact of frequency and predictability on reading will elucidate whether these variables additively or multiplicatively affect the earliest stages of lexical access, which, in turn, has implications for computational models of eye movements during reading. Linear mixed effects models revealed additive effects during both early- and late-stage reading, where predictability effects were comparable for low- and high-frequency words. Moreover, less cautious readers (e.g., readers who engaged in skimming, scanning, or mindless reading) demonstrated smaller frequency effects than more cautious readers. Taken together, our findings suggest that during extended reading, frequency and predictability exert additive influences on lexical and postlexical processing, and that individual differences in reading comprehension modulate sensitivity to the effects of word frequency.
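    To illustrate the additive-versus-multiplicative contrast described above, here is a minimal sketch of comparing a purely additive linear mixed-effects model with one including a frequency x predictability interaction; the file name, column names, and by-subject random-intercept structure are assumptions for illustration, not the authors' analysis.

        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("fixations.csv")   # hypothetical file, one row per fixation

        # Additive model vs. a model allowing a frequency x predictability interaction.
        additive = smf.mixedlm("gaze_duration ~ log_frequency + predictability",
                               df, groups=df["subject"]).fit()
        interactive = smf.mixedlm("gaze_duration ~ log_frequency * predictability",
                                  df, groups=df["subject"]).fit()
        print(additive.summary())
        print(interactive.summary())

    Comparable predictability coefficients across frequency levels, and no reliable interaction term, are what an additive account predicts.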

  12. Eye movements and abducens motoneuron behavior after cholinergic activation of the nucleus reticularis pontis caudalis.

    PubMed

    Márquez-Ruiz, Javier; Escudero, Miguel

    2010-11-01

    The aim of this work was to characterize eye movements and abducens (ABD) motoneuron behavior after cholinergic activation of the nucleus reticularis pontis caudalis (NRPC). Six female adult cats were prepared for chronic recording of eye movements (using the scleral search-coil technique), electroencephalography, electromyography, ponto-geniculo-occipital (PGO) waves in the lateral geniculate nucleus, and ABD motoneuron activities after microinjections of the cholinergic agonist carbachol into the NRPC. Unilateral microinjections of carbachol in the NRPC induced tonic and phasic phenomena in the oculomotor system. Tonic effects consisted of ipsiversive rotation to the injected side, convergence, and downward rotation of the eyes. Phasic effects consisted of bursts of rhythmic rapid eye movements directed contralaterally to the injected side along with PGO-like waves in the lateral geniculate and ABD nuclei. Although tonic effects were dependent on the level of drowsiness, phasic effects were always present and appeared along with normal saccades when the animal was vigilant. ABD motoneurons showed phasic activities associated with ABD PGO-like waves during bursts of rapid eye movements, and tonic and phasic activities related to eye position and velocity during alertness. The cholinergic activation of the NRPC induces oculomotor phenomena that are somewhat similar to those described during REM sleep. A precise comparison of the dynamics and timing of the eye movements further suggests that a temporal organization of both NRPCs is needed to reproduce the complexity of the oculomotor behavior during REM sleep.

  13. Gaze-independent brain-computer interfaces based on covert attention and feature attention

    NASA Astrophysics Data System (ADS)

    Treder, M. S.; Schmidt, N. M.; Blankertz, B.

    2011-10-01

    There is evidence that conventional visual brain-computer interfaces (BCIs) based on event-related potentials cannot be operated efficiently when eye movements are not allowed. To overcome this limitation, the aim of this study was to develop a visual speller that does not require eye movements. Three different variants of a two-stage visual speller based on covert spatial attention and non-spatial feature attention (i.e. attention to colour and form) were tested in an online experiment with 13 healthy participants. All participants achieved highly accurate BCI control. They could select one out of thirty symbols (chance level 3.3%) with mean accuracies of 88%-97% for the different spellers. The best results were obtained for a speller that was operated using non-spatial feature attention only. These results show that, using feature attention, it is possible to realize high-accuracy, fast-paced visual spellers that have a large vocabulary and are independent of eye gaze.
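    The arithmetic behind the 3.3% chance level, and the two-stage selection it implies, can be sketched as follows; the 6 x 5 group layout and the decoding callback are assumptions for illustration, not the actual speller design.

        SYMBOLS = list("ABCDEFGHIJKLMNOPQRSTUVWXYZ123_.")[:30]   # 30-symbol vocabulary
        GROUPS = [SYMBOLS[i:i + 5] for i in range(0, 30, 5)]     # 6 groups of 5

        def select_symbol(decode_stage):
            # decode_stage(options) -> index chosen by the (hypothetical) classifier.
            group = GROUPS[decode_stage(GROUPS)]   # stage 1: attended group
            return group[decode_stage(group)]      # stage 2: attended item in group

        # Two unbiased stages give a chance level of (1/6) * (1/5) = 1/30, i.e. ~3.3%.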

  14. Punctal function in lacrimal drainage: the 'pipette sign' and functional ectropion.

    PubMed

    Beigi, Bijan; Gupta, Deepak; Luo, Yvonne H-L; Saldana, Manuel; Georgalas, Ilias; Kalantzis, George; El-Hindy, Nabil

    2015-07-01

    The aim was to assess the movements of the inferior punctum during blinking and discuss pertinent clinical applications. This is a prospective, non-comparative observational case-series examining the function of the inferior punctum during blinking using video recordings of the blinking action at the slitlamp with slow-motion analysis and comparison. In all 56 eyes of 28 patients, supero-medial movement of the lower punctum toward the medial canthus, together with a medially directed protrusion of the inferior punctum, was noted. It was also noted that the punctum blanched during this projectile movement compared to the rest of the lid margin. Simultaneous posterior rotation of the punctum was also observed in 48 eyes (85.7 per cent; 23 right eyes and 25 left eyes), resulting in apposition of the punctum to the lacus lacrimalis. In eight eyes (14.3 per cent; five right eyes and three left) from six patients, co-existence of medial punctal ectropion led to failure of internal rotation of the punctum during blinking, even though punctal 'pipette formation' was preserved. These six patients all suffered from epiphora in the affected eyes. The presence of 'pipette' formation was calculated to have a sensitivity of 80 per cent and specificity of 100 per cent for punctal ectropion in our series. A two-tailed Fisher exact test showed that, based on our 56 eyes, these results were statistically significant (p < 0.0001). The inferior punctum plays an active and important role in the drainage of tears by the mechanism of supero-medial movement and medially directed protrusion ('pipetting action'), failure of which contributes to epiphora. This is a highly specific sign and should be sought in the evaluation of epiphora, even in the absence of frank ectropion. In punctal stenosis, where locating the punctal orifice proves difficult, inducing the pipette sign will help in its identification. © 2015 The Authors. Clinical and Experimental Optometry © 2015 Optometry Australia.
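    For the diagnostic figures quoted above, a minimal sketch of computing sensitivity, specificity, and a two-tailed Fisher exact test from a 2 x 2 table follows; the counts are placeholders chosen only to illustrate the formulas, not the study's raw data.

        from scipy.stats import fisher_exact

        def sens_spec(tp, fn, tn, fp):
            # Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
            return tp / (tp + fn), tn / (tn + fp)

        tp, fn, tn, fp = 8, 2, 46, 0            # placeholder 2 x 2 counts
        sensitivity, specificity = sens_spec(tp, fn, tn, fp)
        odds_ratio, p_value = fisher_exact([[tp, fn], [fp, tn]], alternative="two-sided")
        print(sensitivity, specificity, p_value)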

  15. Eye movements and manual interception of ballistic trajectories: effects of law of motion perturbations and occlusions.

    PubMed

    Delle Monache, Sergio; Lacquaniti, Francesco; Bosco, Gianfranco

    2015-02-01

    Manual interceptions are known to depend critically on integration of visual feedback information and experience-based predictions of the interceptive event. Within this framework, coupling between gaze and limb movements might also contribute to the interceptive outcome, since eye movements afford acquisition of high-resolution visual information. We investigated this issue by analyzing subjects' head-fixed oculomotor behavior during manual interceptions. Subjects moved a mouse cursor to intercept computer-generated ballistic trajectories either congruent with Earth's gravity or perturbed with weightlessness (0 g) or hypergravity (2 g) effects. In separate sessions, trajectories were either fully visible or occluded before interception to enforce visual prediction. Subjects' oculomotor behavior was classified in terms of the amount of time they gazed at different visual targets and the overall number of saccades. Then, by way of multivariate analyses, we assessed the following: (1) whether eye movement patterns depended on targets' laws of motion and occlusions; and (2) whether interceptive performance was related to the oculomotor behavior. First, we found that eye movement patterns depended significantly on targets' laws of motion and occlusion, suggesting predictive mechanisms. Second, subjects coupled oculomotor and interceptive behavior differently depending on whether targets were visible or occluded. With visible targets, subjects made smaller interceptive errors if they gazed longer at the mouse cursor. Instead, with occluded targets, they achieved better performance by increasing the target's tracking accuracy and by avoiding gaze shifts near interception, suggesting that precise ocular tracking provided better trajectory predictions for the interceptive response.

  16. Opening a Window into Reading Development: Eye Movements’ Role Within a Broader Literacy Research Framework

    PubMed Central

    Miller, Brett; O’Donnell, Carol

    2013-01-01

    The cumulative body of eye movement research provides significant insight into how readers process text. The heart of this work spans roughly 40 years reflecting the maturity of both the topics under study and experimental approaches used to investigate reading. Recent technological advancements offer increased flexibility to the field providing the potential to more concertedly study reading and literacy from an individual differences perspective. Historically, eye movement research focused far less on developmental issues related to individual differences in reading; however, this issue and the broader change it represents signal a meaningful transition inclusive of individual differences. The six papers in this special issue signify the recent, increased attention to and recognition of eye movement research’s transition to emphasize individual differences in reading while appreciating early contributions (e.g., Rayner, 1986) in this direction. We introduce these six papers and provide some historical context for the use of eye movement methodology to examine reading and context for the eye movement field’s early transition to examining individual differences, culminating in future research recommendations. PMID:24391304

  17. Electroencephalographic prodromal markers of dementia across conscious states in Parkinson’s disease

    PubMed Central

    Latreille, Véronique; Gaudet-Fex, Benjamin; Rodrigues-Brazète, Jessica; Panisset, Michel; Chouinard, Sylvain; Postuma, Ronald B.

    2016-01-01

    In Parkinson’s disease, electroencephalographic abnormalities during wakefulness and non-rapid eye movement sleep (spindles) were found to be predictive biomarkers of dementia. Because rapid eye movement sleep is regulated by the cholinergic system, which shows early degeneration in Parkinson’s disease with cognitive impairment, anomalies during this sleep stage might mirror dementia development. In this prospective study, we examined baseline electroencephalographic absolute spectral power across three states of consciousness (non-rapid eye movement sleep, rapid eye movement sleep, and wakefulness) in 68 non-demented patients with Parkinson’s disease and 44 healthy controls. All participants underwent baseline polysomnographic recordings and a comprehensive neuropsychological assessment. Power spectral analyses were performed on standard frequency bands. Dominant occipital frequency during wakefulness and ratios of slow-to-fast frequencies during rapid eye movement sleep and wakefulness were also computed. At follow-up (an average of 4.5 years after baseline), 18 patients with Parkinson’s disease had developed dementia and 50 patients remained dementia-free. In rapid eye movement sleep, patients with Parkinson’s disease who later developed dementia showed, at baseline, higher absolute power in delta and theta bands and a higher slowing ratio, especially in temporal, parietal, and occipital regions, compared to patients who remained dementia-free and controls. In non-rapid eye movement sleep, lower baseline sigma power in parietal cortical regions also predicted development of dementia. During wakefulness, patients with Parkinson’s disease who later developed dementia showed lower dominant occipital frequency as well as higher delta power and a higher slowing ratio compared to patients who remained dementia-free and controls. At baseline, higher slowing ratios in temporo-occipital regions during rapid eye movement sleep were associated with poor performance on visuospatial tests in patients with Parkinson’s disease. Using receiver operating characteristic curves, we found that the best predictors of dementia in Parkinson’s disease were rapid eye movement sleep slowing ratios in posterior regions, wakefulness slowing ratios in temporal areas, and lower dominant occipital frequency. These results suggest that electroencephalographic slowing during sleep is a promising new predictive biomarker for Parkinson’s disease dementia, perhaps as a marker of cholinergic denervation. PMID:26912643
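    The spectral measures used above (absolute band power and a slow-to-fast "slowing ratio") can be sketched for a single EEG channel as follows; the band edges and Welch parameters are common conventions assumed for illustration, not necessarily those of the study.

        import numpy as np
        from scipy.signal import welch

        BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

        def band_powers(eeg_uv, fs_hz):
            # Welch power spectral density, then absolute power per band by integration.
            freqs, psd = welch(eeg_uv, fs=fs_hz, nperseg=int(4 * fs_hz))
            return {name: np.trapz(psd[(freqs >= lo) & (freqs < hi)],
                                   freqs[(freqs >= lo) & (freqs < hi)])
                    for name, (lo, hi) in BANDS.items()}

        def slowing_ratio(powers):
            # Ratio of slow (delta + theta) to fast (alpha + beta) power.
            return (powers["delta"] + powers["theta"]) / (powers["alpha"] + powers["beta"])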

  18. Parafoveal magnification: visual acuity does not modulate the perceptual span in reading.

    PubMed

    Miellet, Sébastien; O'Donnell, Patrick J; Sereno, Sara C

    2009-06-01

    Models of eye guidance in reading rely on the concept of the perceptual span: the amount of information perceived during a single eye fixation, which is considered to be a consequence of visual and attentional constraints. To directly investigate attentional mechanisms underlying the perceptual span, we implemented a new reading paradigm, parafoveal magnification (PM), that compensates for how visual acuity drops off as a function of retinal eccentricity. On each fixation and in real time, parafoveal text is magnified to equalize its perceptual impact with that of concurrent foveal text. Experiment 1 demonstrated that PM does not increase the amount of text that is processed, supporting an attentional-based account of eye movements in reading. Experiment 2 explored a contentious issue that differentiates competing models of eye movement control and showed that, even when parafoveal information is enlarged, visual attention in reading is allocated in a serial fashion from word to word.
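    One simple way to realize the compensation described above is to scale character size linearly with eccentricity, size(e) = size0 * (1 + e / E2); the E2 constant below is a commonly cited value assumed for illustration and is not necessarily the scaling used in the PM paradigm.

        def magnified_size(base_size_deg, eccentricity_deg, e2_deg=2.0):
            # Linear acuity-compensation scaling: rendered size grows with eccentricity.
            return base_size_deg * (1.0 + eccentricity_deg / e2_deg)

        # e.g., a 0.3 deg character at 6 deg eccentricity is drawn at 1.2 deg.
        print(magnified_size(0.3, 6.0))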

  19. A Model-Based Approach for the Measurement of Eye Movements Using Image Processing

    NASA Technical Reports Server (NTRS)

    Sung, Kwangjae; Reschke, Millard F.

    1997-01-01

    This paper describes a video eye-tracking algorithm which searches for the best fit of the pupil modeled as a circular disk. The algorithm is robust to common image artifacts such as the droopy eyelids and light reflections while maintaining the measurement resolution available by the centroid algorithm. The presented algorithm is used to derive the pupil size and center coordinates, and can be combined with iris-tracking techniques to measure ocular torsion. A comparison search method of pupil candidates using pixel coordinate reference lookup tables optimizes the processing requirements for a least square fit of the circular disk model. This paper includes quantitative analyses and simulation results for the resolution and the robustness of the algorithm. The algorithm presented in this paper provides a platform for a noninvasive, multidimensional eye measurement system which can be used for clinical and research applications requiring the precise recording of eye movements in three-dimensional space.
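    A minimal sketch of fitting a circular-disk pupil model by least squares is given below; it uses an algebraic circle fit to candidate edge pixels and does not reproduce the paper's lookup-table comparison search or robustness machinery.

        import numpy as np

        def fit_pupil_circle(edge_x, edge_y):
            # Algebraic fit of x^2 + y^2 + D*x + E*y + F = 0 to candidate edge pixels.
            x = np.asarray(edge_x, dtype=float)
            y = np.asarray(edge_y, dtype=float)
            A = np.column_stack([x, y, np.ones_like(x)])
            b = -(x ** 2 + y ** 2)
            (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
            cx, cy = -D / 2.0, -E / 2.0
            radius = np.sqrt(cx ** 2 + cy ** 2 - F)
            return cx, cy, radius   # pupil center coordinates and radius (pixels)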

  20. Performing saccadic eye movements or blinking improves postural control.

    PubMed

    Rougier, Patrice; Garin, Mélanie

    2007-07-01

    To determine the relationship between eye movements and postural control during undisturbed upright stance, 15 young, healthy individuals were tested in various conditions, including imposed blinking patterns and horizontal and vertical saccadic eye movements. The center of pressure (CP) trajectories were recorded via a force platform on which the participants stood upright and were used to estimate, via a low-pass filter, the vertically projected movements of the center of gravity (CGv) and, consequently, the difference CP-CGv. A frequency analysis shows that regular bilateral blinking does not produce a significant change in postural control. In contrast, performing saccadic eye movements reduces the amplitude of both the CGv and CP-CGv movements, principally along the antero-posterior axis. This result supports the theory that some ocular movements may modify postural control during the maintenance of upright standing in human participants.
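    A minimal sketch of the CP-to-CGv decomposition described above follows; it uses a zero-phase Butterworth low-pass filter whose cut-off and order are assumptions for illustration, not the authors' exact estimation method.

        from scipy.signal import butter, filtfilt

        def decompose_cp(cp_mm, fs_hz=40.0, cutoff_hz=0.5):
            # The low-frequency component approximates the vertically projected
            # center-of-gravity motion (CGv); the residual is the CP-CGv difference.
            b, a = butter(2, cutoff_hz / (fs_hz / 2.0), btype="low")
            cgv = filtfilt(b, a, cp_mm)
            return cgv, cp_mm - cgv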
