VisualEyes: a modular software system for oculomotor experimentation.
Guo, Yi; Kim, Eun H; Alvarez, Tara L
2011-03-25
Eye movement studies have provided a strong foundation for understanding how the brain acquires visual information in both the normal and dysfunctional brain.(1) However, development of a platform to stimulate and store eye movements can require substantial programming, time and costs. Many systems do not offer the flexibility to program numerous stimuli for a variety of experimental needs. In contrast, the VisualEyes System has a flexible architecture, allowing the operator to choose any background and foreground stimulus, program one or two screens for tandem or opposing eye movements and stimulate the left and right eye independently. This system can significantly reduce the programming development time needed to conduct an oculomotor study. The VisualEyes System will be discussed in three parts: 1) the oculomotor recording device to acquire eye movement responses, 2) the VisualEyes software, written in LabVIEW, to generate an array of stimuli and store responses as text files and 3) offline data analysis. Eye movements can be recorded by several types of instrumentation such as: a limbus tracking system, a scleral search coil, or a video image system. Typical eye movement stimuli such as saccadic steps, vergence ramps and vergence steps, with the corresponding responses, will be shown. In this video report, we demonstrate the flexibility of a system to create numerous visual stimuli and record eye movements that can be utilized by basic scientists and clinicians to study healthy as well as clinical populations.
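The stimulus classes named in this abstract (saccadic steps, vergence ramps and vergence steps) and the text-file storage of responses can be illustrated with a brief sketch; the sampling rate, amplitudes, onset times and file layout below are hypothetical choices, not taken from the VisualEyes implementation.

```python
import numpy as np

def saccadic_step(t, onset_s, amplitude_deg):
    """Step change in target position at onset_s (degrees)."""
    return np.where(t >= onset_s, amplitude_deg, 0.0)

def vergence_ramp(t, onset_s, rate_deg_per_s):
    """Target vergence angle increasing linearly after onset_s."""
    return np.clip(t - onset_s, 0.0, None) * rate_deg_per_s

# Hypothetical trial parameters (not from the paper).
fs = 500.0                       # sampling rate, Hz
t = np.arange(0.0, 3.0, 1.0 / fs)
step = saccadic_step(t, onset_s=1.0, amplitude_deg=10.0)
ramp = vergence_ramp(t, onset_s=1.0, rate_deg_per_s=2.0)

# Store the stimulus (and, in a real run, the recorded eye-position
# response) as columns of a plain text file, mirroring the text-file
# storage described in the abstract.
np.savetxt("trial_001.txt",
           np.column_stack([t, step, ramp]),
           header="time_s saccade_step_deg vergence_ramp_deg")
```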
Tonic and phasic phenomena underlying eye movements during sleep in the cat
Márquez-Ruiz, Javier; Escudero, Miguel
2008-01-01
Mammalian sleep is not a homogenous state, and different variables have traditionally been used to distinguish different periods during sleep. Of these variables, eye movement is one of the most paradigmatic, and has been used to differentiate between the so-called rapid eye movement (REM) and non-REM (NREM) sleep periods. Despite this, eye movements during sleep are poorly understood, and the behaviour of the oculomotor system remains almost unknown. In the present work, we recorded binocular eye movements during the sleep–wake cycle of adult cats by the scleral search-coil technique. During alertness, eye movements consisted of conjugated saccades and eye fixations. During NREM sleep, eye movements were slow and mostly unconjugated. The two eyes moved upwardly and in the abducting direction, producing a tonic divergence and elevation of the visual axis. During the transition period between NREM and REM sleep, rapid monocular eye movements of low amplitude in the abducting direction occurred in coincidence with ponto-geniculo-occipital waves. Along REM sleep, the eyes tended to maintain a tonic convergence and depression, broken by high-frequency bursts of complex rapid eye movements. In the horizontal plane, each eye movement in the burst comprised two consecutive movements in opposite directions, which were more evident in the eye that performed the abducting movements. In the vertical plane, rapid eye movements were always upward. Comparisons of the characteristics of eye movements during the sleep–wake cycle reveal the uniqueness of eye movements during sleep, and the noteworthy existence of tonic and phasic phenomena in the oculomotor system, not observed until now. PMID:18499729
De Cock, Valérie Cochen; Debs, Rachel; Oudiette, Delphine; Leu, Smaranda; Radji, Fatai; Tiberge, Michel; Yu, Huan; Bayard, Sophie; Roze, Emmanuel; Vidailhet, Marie; Dauvilliers, Yves; Rascol, Olivier; Arnulf, Isabelle
2011-03-01
Multiple system atrophy is an atypical parkinsonism characterized by severe motor disabilities that are poorly levodopa responsive. Most patients develop rapid eye movement sleep behaviour disorder. Because parkinsonism is absent during rapid eye movement sleep behaviour disorder in patients with Parkinson's disease, we studied the movements of patients with multiple system atrophy during rapid eye movement sleep. Forty-nine non-demented patients with multiple system atrophy and 49 patients with idiopathic Parkinson's disease were interviewed along with their 98 bed partners using a structured questionnaire. They rated the quality of movements, vocal and facial expressions during rapid eye movement sleep behaviour disorder as better than, equal to or worse than the same activities in an awake state. Sleep and movements were monitored using video-polysomnography in 22/49 patients with multiple system atrophy and in 19/49 patients with Parkinson's disease. These recordings were analysed for the presence of parkinsonism and cerebellar syndrome during rapid eye movement sleep movements. Clinical rapid eye movement sleep behaviour disorder was observed in 43/49 (88%) patients with multiple system atrophy. Reports from the 31/43 bed partners who were able to evaluate movements during sleep indicate that 81% of the patients showed some form of improvement during rapid eye movement sleep behaviour disorder. These included improved movement (73% of patients: faster, 67%; stronger, 52%; and smoother, 26%), improved speech (59% of patients: louder, 55%; more intelligible, 17%; and better articulated, 36%) and normalized facial expression (50% of patients). The rate of improvement was higher in Parkinson's disease than in multiple system atrophy, but no further difference was observed between the two forms of multiple system atrophy (predominant parkinsonism versus cerebellar syndrome). Video-monitored movements during rapid eye movement sleep in patients with multiple system atrophy revealed more expressive faces, and movements that were faster and more ample in comparison with facial expression and movements during wakefulness. These movements were still somewhat jerky but lacked any visible parkinsonism. Cerebellar signs were not assessable. We conclude that parkinsonism also disappears during rapid eye movement sleep behaviour disorder in patients with multiple system atrophy, but this improvement is not due to enhanced dopamine transmission because these patients are not levodopa-sensitive. These data suggest that these movements are not influenced by extrapyramidal regions; however, the influence of abnormal cerebellar control remains unclear. The transient disappearance of parkinsonism here is all the more surprising since no treatment (even dopaminergic) provides a real benefit in this disabling disease.
Image-based computer-assisted diagnosis system for benign paroxysmal positional vertigo
NASA Astrophysics Data System (ADS)
Kohigashi, Satoru; Nakamae, Koji; Fujioka, Hiromu
2005-04-01
We developed an image-based computer-assisted diagnosis system for benign paroxysmal positional vertigo (BPPV) that consists of a balance control system simulator, a 3D eye movement simulator, and a method for extracting the nystagmus response directly from an eye movement image sequence. In the system, the causes and conditions of BPPV are estimated by searching the database for the record matching the nystagmus response for the observed eye image sequence of the patient with BPPV. The database includes the nystagmus responses for simulated eye movement sequences. The eye movement velocity is obtained by using the balance control system simulator, which allows us to simulate BPPV under various conditions such as canalithiasis, cupulolithiasis, number of otoconia, otoconium size, and so on. The eye movement image sequence is then displayed on the CRT by the 3D eye movement simulator. The nystagmus responses are extracted from the image sequence by the proposed method and are stored in the database. In order to enhance the diagnosis accuracy, the nystagmus response for a newly simulated sequence is matched with that for the observed sequence. From the matched simulation conditions, the causes and conditions of BPPV are estimated. We apply our image-based computer-assisted diagnosis system to two real eye movement image sequences from patients with BPPV to show its validity.
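A minimal sketch of the database-matching step described above, assuming the observed and simulated nystagmus responses are resampled to a common rate and compared with a normalized correlation score; the condition keys and signal shapes are invented for illustration.

```python
import numpy as np

def match_nystagmus(observed, database):
    """Return the key of the simulated nystagmus response most similar to
    the observed trace (both sampled at the same rate).  Similarity is
    measured here with the Pearson correlation coefficient; the real
    system may use a different metric."""
    best_key, best_score = None, -np.inf
    obs = (observed - observed.mean()) / (observed.std() + 1e-12)
    for key, simulated in database.items():
        sim = (simulated - simulated.mean()) / (simulated.std() + 1e-12)
        score = float(np.mean(obs * sim))      # correlation coefficient
        if score > best_score:
            best_key, best_score = key, score
    return best_key, best_score

# Hypothetical database of simulated slow-phase velocity traces,
# keyed by the simulated condition (canal, mechanism, otoconia count).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 1000)
database = {
    ("posterior", "canalithiasis", 5): np.exp(-1.0 * t) * np.sin(2 * np.pi * t),
    ("posterior", "cupulolithiasis", 5): np.exp(-0.2 * t) * np.sin(2 * np.pi * t),
}
observed = np.exp(-0.9 * t) * np.sin(2 * np.pi * t) + 0.05 * rng.standard_normal(t.size)
print(match_nystagmus(observed, database))
```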
A laser-based eye-tracking system.
Irie, Kenji; Wilson, Bruce A; Jones, Richard D; Bones, Philip J; Anderson, Tim J
2002-11-01
This paper reports on the development of a new eye-tracking system for noninvasive recording of eye movements. The eye tracker uses a flying-spot laser to selectively image landmarks on the eye and, subsequently, measure horizontal, vertical, and torsional eye movements. Considerable work was required to overcome the adverse effects of specular reflection of the flying-spot from the surface of the eye onto the sensing elements of the eye tracker. These effects have been largely overcome, and the eye-tracker has been used to document eye movement abnormalities, such as abnormal torsional pulsion of saccades, in the clinical setting.
Looking around: 35 years of oculomotor modeling
NASA Technical Reports Server (NTRS)
Young, L. R.
1995-01-01
Eye movements have attracted an unusually large number of researchers from many disparate fields, especially over the past 35 years. The lure of this system stemmed from its apparent simplicity of description, measurement, and analysis, as well as the promise of providing a "window in the mind." Investigators in areas ranging from biological control systems and neurological diagnosis to applications in advertising and flight simulation expected eye movements to provide clear indicators of what the sensory-motor system was accomplishing and what the brain found to be of interest. The parallels between compensatory eye movements and perception of spatial orientation have been a subject for active study in visual-vestibular interaction, where substantial knowledge has accumulated through experiments largely guided by the challenge of proving or disproving model predictions. Even though oculomotor control has arguably benefited more from systems theory than any other branch of motor control, many of the original goals remain largely unfulfilled. This paper considers some of the promising potential benefits of eye movement research and compares accomplishments with anticipated results. Four topics are considered in greater detail: (i) the definition of oculomotor system input and output, (ii) optimization of the eye movement system, (iii) the relationship between compensatory eye movements and spatial orientation through the "internal model," and (iv) the significance of eye movements as measured in (outer) space.
Zarghi, Afsaneh; Zali, Alireza; Tehranidost, Mehdi
2013-01-01
A variety of nervous system components, such as the medulla, pons, midbrain, cerebellum, basal ganglia, and the parietal, frontal and occipital lobes, have a role in Eye Movement Desensitization and Reprocessing (EMDR) processes. The eye movement is performed to attract the client's attention to an external stimulus while the client simultaneously concentrates on a certain internal subject. Eye movement guided by the therapist is the most common attention stimulus. The role of eye movement has been documented previously in relation to cognitive processing mechanisms. A series of systematic experiments has shown that the eyes' spontaneous movement is associated with emotional and cognitive changes and results in decreased excitement, flexibility in attention, memory processing, and enhanced semantic recall. Eye movement also decreases the memory's image clarity and the accompanying excitement. By using EMDR, we can reach parts of memory that were previously inaccessible and emotionally intolerable. Various studies emphasize the effectiveness of EMDR in treating and curing phobias, pain, and dependent personality disorders. Consequently, due to the involvement of multiple neural system components, this palliative method of treatment can also help to rehabilitate the neuro-cognitive system. PMID:25337334
Eye movement-invariant representations in the human visual system.
Nishimoto, Shinji; Huth, Alexander G; Bilenko, Natalia Y; Gallant, Jack L
2017-01-01
During natural vision, humans make frequent eye movements but perceive a stable visual world. It is therefore likely that the human visual system contains representations of the visual world that are invariant to eye movements. Here we present an experiment designed to identify visual areas that might contain eye-movement-invariant representations. We used functional MRI to record brain activity from four human subjects who watched natural movies. In one condition subjects were required to fixate steadily, and in the other they were allowed to freely make voluntary eye movements. The movies used in each condition were identical. We reasoned that the brain activity recorded in a visual area that is invariant to eye movement should be similar under fixation and free viewing conditions. In contrast, activity in a visual area that is sensitive to eye movement should differ between fixation and free viewing. We therefore measured the similarity of brain activity across repeated presentations of the same movie within the fixation condition, and separately between the fixation and free viewing conditions. The ratio of these measures was used to determine which brain areas are most likely to contain eye movement-invariant representations. We found that voxels located in early visual areas are strongly affected by eye movements, while voxels in ventral temporal areas are only weakly affected by eye movements. These results suggest that the ventral temporal visual areas contain a stable representation of the visual world that is invariant to eye movements made during natural vision.
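The within-condition versus between-condition similarity ratio described above can be sketched as follows, assuming voxel time courses are compared with Pearson correlation (the abstract does not state the exact similarity measure), with synthetic data standing in for the fMRI responses.

```python
import numpy as np

def invariance_ratio(fix_run1, fix_run2, free_run):
    """Per-voxel ratio used to screen for eye-movement-invariant areas.

    Each argument is a (timepoints x voxels) response matrix to the same
    movie.  Similarity is taken here as the Pearson correlation of voxel
    time courses across runs (an assumption)."""
    def voxelwise_corr(a, b):
        a = a - a.mean(axis=0)
        b = b - b.mean(axis=0)
        num = (a * b).sum(axis=0)
        den = np.sqrt((a ** 2).sum(axis=0) * (b ** 2).sum(axis=0)) + 1e-12
        return num / den

    within = voxelwise_corr(fix_run1, fix_run2)    # fixation vs. fixation
    across = voxelwise_corr(fix_run1, free_run)    # fixation vs. free viewing
    # Ratios near 1 suggest responses largely unaffected by eye movements.
    return across / (within + 1e-12)

rng = np.random.default_rng(0)
shared = rng.standard_normal((200, 50))            # movie-driven signal
ratio = invariance_ratio(shared + 0.1 * rng.standard_normal((200, 50)),
                         shared + 0.1 * rng.standard_normal((200, 50)),
                         shared + 0.8 * rng.standard_normal((200, 50)))
print(ratio.mean())
```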
Real time eye tracking using Kalman extended spatio-temporal context learning
NASA Astrophysics Data System (ADS)
Munir, Farzeen; Minhas, Fayyaz ul Amir Asfar; Jalil, Abdul; Jeon, Moongu
2017-06-01
Real time eye tracking has numerous applications in human computer interaction such as a mouse cursor control in a computer system. It is useful for persons with muscular or motion impairments. However, tracking the movement of the eye is complicated by occlusion due to blinking, head movement, screen glare, rapid eye movements, etc. In this work, we present the algorithmic and construction details of a real time eye tracking system. Our proposed system is an extension of Spatio-Temporal context learning through Kalman Filtering. Spatio-Temporal Context Learning offers state of the art accuracy in general object tracking but its performance suffers due to object occlusion. Addition of the Kalman filter allows the proposed method to model the dynamics of the motion of the eye and provide robust eye tracking in cases of occlusion. We demonstrate the effectiveness of this tracking technique by controlling the computer cursor in real time by eye movements.
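A minimal sketch of the Kalman-filtering idea described above: a constant-velocity filter over the 2D eye centre that coasts on its prediction whenever the detector loses the eye (e.g. during a blink). The state model and noise parameters are placeholder assumptions; the authors' spatio-temporal context learner itself is not reproduced here.

```python
import numpy as np

class EyeKalman:
    """Constant-velocity Kalman filter over the 2D eye-centre position.

    State: [x, y, vx, vy].  During a blink or other occlusion the
    measurement update is skipped and the filter coasts on its
    prediction, which is the role the Kalman filter plays in the
    tracker described in the abstract above."""

    def __init__(self, dt=1 / 30, q=1e-2, r=2.0):
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = dt          # x += vx*dt, y += vy*dt
        self.H = np.zeros((2, 4))
        self.H[0, 0] = self.H[1, 1] = 1.0         # we observe position only
        self.Q = q * np.eye(4)                    # process noise (assumed)
        self.R = r * np.eye(2)                    # measurement noise (assumed)
        self.x = np.zeros(4)
        self.P = np.eye(4) * 10.0

    def step(self, measurement=None):
        # Predict.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update only when the detector returned an eye position.
        if measurement is not None:
            z = np.asarray(measurement, dtype=float)
            y = z - self.H @ self.x
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ y
            self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                          # filtered eye centre

kf = EyeKalman()
for z in [(100, 120), (102, 121), None, None, (108, 124)]:   # None = blink
    print(kf.step(z))
```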
Instrument Display Visual Angles for Conventional Aircraft and the MQ-9 Ground Control Station
NASA Technical Reports Server (NTRS)
Kamine, Tovy Haber; Bendrick, Gregg A.
2008-01-01
Aircraft instrument panels should be designed such that primary displays are in optimal viewing location to minimize pilot perception and response time. Human Factors engineers define three zones (i.e. "cones") of visual location: 1) "Easy Eye Movement" (foveal vision); 2) "Maximum Eye Movement" (peripheral vision with saccades), and 3) "Head Movement" (head movement required). Instrument display visual angles were measured to determine how well conventional aircraft (T-34, T-38, F-15B, F-16XL, F/A-18A, U-2D, ER-2, King Air, G-III, B-52H, DC-10, B747-SCA) and the MQ-9 ground control station (GCS) complied with these standards, and how they compared with each other. Selected instrument parameters included: attitude, pitch, bank, power, airspeed, altitude, vertical speed, heading, turn rate, slip/skid, AOA, flight path, latitude, longitude, course, bearing, range and time. Vertical and horizontal visual angles for each component were measured from the pilot's eye position in each system. The vertical visual angles of displays in conventional aircraft lay within the cone of "Easy Eye Movement" for all but three of the parameters measured, and almost all of the horizontal visual angles fell within this range. All conventional vertical and horizontal visual angles lay within the cone of "Maximum Eye Movement". However, most instrument vertical visual angles of the MQ-9 GCS lay outside the cone of "Easy Eye Movement", though all were within the cone of "Maximum Eye Movement". All the horizontal visual angles for the MQ-9 GCS were within the cone of "Easy Eye Movement". Most instrument displays in conventional aircraft lay within the cone of "Easy Eye Movement", though mission-critical instruments sometimes displaced less important instruments outside this area. Many of the MQ-9 GCS systems lay outside this area. Specific training for MQ-9 pilots may be needed to avoid increased response time and potential error during flight. The learning objectives include: 1) Know three physiologic cones of eye/head movement; 2) Understand how instrument displays comply with these design principles in conventional aircraft and an uninhabited aerial vehicle system. Which of the following is NOT a recognized physiologic principle of instrument display design? 1) Cone of Easy Eye Movement 2) Cone of Binocular Eye Movement 3) Cone of Maximum Eye Movement 4) Cone of Head Movement 5) None of the above. Answer: 2) Cone of Binocular Eye Movement
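A small sketch of the underlying geometry: the visual angle of a display element relative to the design eye point, classified against the three cones named above. The numeric cone limits and the example offsets below are illustrative placeholders, not the Human Factors thresholds or measurements used in the study.

```python
import math

def visual_angle_deg(offset_mm, eye_distance_mm):
    """Visual angle subtended by a display offset at a given viewing distance."""
    return math.degrees(math.atan2(offset_mm, eye_distance_mm))

def classify_cone(angle_deg, easy_limit=15.0, max_limit=35.0):
    """Assign an angle to one of the three zones named in the abstract.

    The numeric limits here are illustrative placeholders, not the values
    prescribed by the Human Factors standards cited in the study."""
    a = abs(angle_deg)
    if a <= easy_limit:
        return "Easy Eye Movement"
    if a <= max_limit:
        return "Maximum Eye Movement"
    return "Head Movement"

# Example: a display element 180 mm left of and 90 mm below the design
# eye point, viewed from 700 mm (hypothetical geometry).
h = visual_angle_deg(-180, 700)
v = visual_angle_deg(-90, 700)
print(f"horizontal {h:.1f} deg -> {classify_cone(h)}")
print(f"vertical   {v:.1f} deg -> {classify_cone(v)}")
```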
NASA Astrophysics Data System (ADS)
Kajiwara, Yusuke; Murata, Hiroaki; Kimura, Haruhiko; Abe, Koji
As a communication support tool for cases of amyotrophic lateral sclerosis (ALS), research on eye-gaze human-computer interfaces has been active. However, since voluntary and involuntary eye movements cannot be distinguished in these interfaces, their performance is still not sufficient for practical use. This paper presents a high-performance human-computer interface system which unites high-quality recognition of horizontal directional eye movements and voluntary blinks. The experimental results have shown that the number of incorrect inputs is decreased by 35.1% compared with an existing system which recognizes horizontal and vertical directional eye movements in addition to voluntary blinks, and that character input is sped up by 17.4% over the existing system.
Basal Ganglia Neuronal Activity during Scanning Eye Movements in Parkinson’s Disease
Sieger, Tomáš; Bonnet, Cecilia; Serranová, Tereza; Wild, Jiří; Novák, Daniel; Růžička, Filip; Urgošík, Dušan; Růžička, Evžen; Gaymard, Bertrand; Jech, Robert
2013-01-01
The oculomotor role of the basal ganglia has been supported by extensive evidence, although their role in scanning eye movements is poorly understood. Nineteen Parkinson's disease patients, who underwent implantation of deep brain stimulation electrodes, were investigated with simultaneous intraoperative microelectrode recordings and single-channel electrooculography in a scanning eye movement task by viewing a series of colored pictures selected from the International Affective Picture System. Four patients additionally underwent a visually guided saccade task. Microelectrode recordings were analyzed selectively from the subthalamic nucleus, the substantia nigra pars reticulata and the globus pallidus by the WaveClus program, which allowed for detection and sorting of individual neurons. The relationship between neuronal firing rate and eye movements was studied by cross-correlation analysis. Out of 183 neurons that were detected, 130 were found in the subthalamic nucleus, 30 in the substantia nigra and 23 in the globus pallidus. Twenty percent of the neurons in each of these structures showed eye movement-related activity. Neurons related to scanning eye movements were mostly unrelated to the visually guided saccades. We conclude that a relatively large number of basal ganglia neurons are involved in eye motion control. Surprisingly, neurons related to scanning eye movements differed from neurons activated during saccades, suggesting functional specialization and segregation of both systems for eye movement control. PMID:24223158
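The cross-correlation analysis mentioned above can be sketched as follows, assuming spike times are binned at the electrooculogram sampling rate and both signals are z-scored; the bin width, lag range and toy data are illustrative assumptions.

```python
import numpy as np

def rate_eog_crosscorr(spike_times_s, eog, fs, max_lag_s=0.5):
    """Normalized cross-correlation between a neuron's firing rate and an
    EOG trace sampled at fs Hz, for lags up to +/- max_lag_s."""
    n = eog.size
    # Bin spikes at the EOG sampling rate to get an instantaneous rate.
    rate, _ = np.histogram(spike_times_s, bins=n, range=(0.0, n / fs))
    rate = rate.astype(float) * fs
    # Z-score both signals so the correlation is scale-free.
    rate = (rate - rate.mean()) / (rate.std() + 1e-12)
    eog_z = (eog - eog.mean()) / (eog.std() + 1e-12)
    max_lag = int(max_lag_s * fs)
    lags = np.arange(-max_lag, max_lag + 1)
    cc = np.array([np.mean(rate[max(0, -k):n - max(0, k)] *
                           eog_z[max(0, k):n - max(0, -k)]) for k in lags])
    return lags / fs, cc

# Toy data: a neuron whose firing rate is weakly modulated by eye position.
fs = 500.0
t = np.arange(0, 20, 1 / fs)
eog = np.sin(2 * np.pi * 0.3 * t)
rng = np.random.default_rng(1)
spikes = t[rng.random(t.size) < 0.02 * (1 + eog)]
lags_s, cc = rate_eog_crosscorr(spikes, eog, fs)
print(lags_s[np.argmax(cc)], cc.max())
```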
Eye movement identification based on accumulated time feature
NASA Astrophysics Data System (ADS)
Guo, Baobao; Wu, Qiang; Sun, Jiande; Yan, Hua
2017-06-01
Eye movement is a new kind of feature for biometric recognition; it has many advantages compared with other features such as fingerprint, face, and iris. It is not only a sort of static characteristic, but also a combination of brain activity and muscle behavior, which makes it effective in preventing spoofing attacks. In addition, eye movements can be incorporated with faces, irises and other features recorded from the face region into multimodal systems. In this paper, we conduct an exploratory study on eye movement identification based on the eye movement datasets provided by Komogortsev et al. in 2011, using different classification methods. The durations of saccades and fixations are extracted from the eye movement data as the eye movement features. Furthermore, a performance analysis was conducted on different classification methods, such as BP, RBF, Elman and SVM classifiers, in order to provide a reference for future research in this field.
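A minimal sketch of the accumulated-time feature idea, assuming a simple velocity-threshold (I-VT) segmentation into fixations and saccades and a scikit-learn SVM; the threshold, toy data and classifier settings are assumptions, and the BP, RBF and Elman networks compared in the paper are not reproduced here.

```python
import numpy as np
from sklearn.svm import SVC

def accumulated_time_features(x_deg, y_deg, fs, vel_threshold_deg_s=30.0):
    """Total fixation time and total saccade time (seconds) from a gaze
    trace, using a velocity-threshold (I-VT) segmentation.
    The 30 deg/s threshold is a common default, assumed here."""
    vx = np.gradient(x_deg) * fs
    vy = np.gradient(y_deg) * fs
    speed = np.hypot(vx, vy)
    is_saccade = speed > vel_threshold_deg_s
    saccade_time = is_saccade.sum() / fs
    fixation_time = (~is_saccade).sum() / fs
    return np.array([fixation_time, saccade_time])

# Toy example: features from several recordings per subject, then an SVM.
rng = np.random.default_rng(0)
X, y = [], []
for subject in range(2):
    for _ in range(10):
        t = np.arange(0, 5, 1 / 250)
        x_pos = np.cumsum(rng.normal(0, 0.05 + 0.05 * subject, t.size))
        y_pos = np.cumsum(rng.normal(0, 0.05, t.size))
        X.append(accumulated_time_features(x_pos, y_pos, fs=250))
        y.append(subject)
clf = SVC(kernel="rbf").fit(np.array(X), np.array(y))
print(clf.score(np.array(X), np.array(y)))
```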
Gaze failure, drifting eye movements, and centripetal nystagmus in cerebellar disease.
Leech, J; Gresty, M; Hess, K; Rudge, P
1977-01-01
Three abnormalities of eye movement in man are described which are indicative of cerebellar system disorder, namely, centripetally beating nystagmus, failure to maintain lateral gaze either in darkness or with eye closure, and slow drifting movements of the eyes in the absence of fixation. Similar eye movement signs follow cerebellectomy in the primate and the cat. These abnormalities of eye movement, together with other signs of cerebellar disease, such as rebound, alternating, and gaze-paretic nystagmus, are explained by the hypothesis that the cerebellum helps to maintain lateral gaze and that brain stem mechanisms which monitor gaze position generate compensatory biases in the absence of normal cerebellar function. PMID:603785
Effect of a Hypocretin/Orexin Antagonist on Neurocognitive Performance
2015-11-01
...somnolence without cataplexy and, in rat, decreases active wake and increases the time spent in non-rapid eye movement (NREM) and (REM) sleep (Brisbare-Roch ... system results in a narcoleptic phenotype characterized by excessive sleepiness, fragmented sleep, abnormally timed rapid eye movement (REM) sleep, and ... spent in non-rapid eye movement (NREM) and (REM) sleep with differential effects on various neurotransmitter systems. To date, no studies have reported ...
Fixational Eye Movements in the Earliest Stage of Metazoan Evolution
Bielecki, Jan; Høeg, Jens T.; Garm, Anders
2013-01-01
All known photoreceptor cells adapt to constant light stimuli, fading the retinal image when exposed to an immobile visual scene. Counter strategies are therefore necessary to prevent blindness, and in mammals this is accomplished by fixational eye movements. Cubomedusae occupy a key position for understanding the evolution of complex visual systems and their eyes are assumedly subject to the same adaptive problems as the vertebrate eye, but lack motor control of their visual system. The morphology of the visual system of cubomedusae ensures a constant orientation of the eyes and a clear division of the visual field, but thereby also a constant retinal image when exposed to stationary visual scenes. Here we show that bell contractions used for swimming in the medusae refresh the retinal image in the upper lens eye of Tripedalia cystophora. This strongly suggests that strategies comparable to fixational eye movements have evolved at the earliest metazoan stage to compensate for the intrinsic property of the photoreceptors. Since the timing and amplitude of the rhopalial movements concur with the spatial and temporal resolution of the eye it circumvents the need for post processing in the central nervous system to remove image blur. PMID:23776673
Updating visual memory across eye movements for ocular and arm motor control.
Thompson, Aidan A; Henriques, Denise Y P
2008-11-01
Remembered object locations are stored in an eye-fixed reference frame, so that every time the eyes move, spatial representations must be updated for the arm-motor system to reflect the target's new relative position. To date, studies have not investigated how the brain updates these spatial representations during other types of eye movements, such as smooth-pursuit. Further, it is unclear what information is used in spatial updating. To address these questions we investigated whether remembered locations of pointing targets are updated following smooth-pursuit eye movements, as they are following saccades, and also investigated the role of visual information in estimating eye-movement amplitude for updating spatial memory. Misestimates of eye-movement amplitude were induced when participants visually tracked stimuli presented with a background that moved in either the same or opposite direction of the eye before pointing or looking back to the remembered target location. We found that gaze-dependent pointing errors were similar following saccades and smooth-pursuit and that incongruent background motion did result in a misestimate of eye-movement amplitude. However, the background motion had no effect on spatial updating for pointing, but did when subjects made a return saccade, suggesting that the oculomotor and arm-motor systems may rely on different sources of information for spatial updating.
Eye Movements During Everyday Behavior Predict Personality Traits.
Hoppe, Sabrina; Loetscher, Tobias; Morey, Stephanie A; Bulling, Andreas
2018-01-01
Besides allowing us to perceive our surroundings, eye movements are also a window into our mind and a rich source of information on who we are, how we feel, and what we do. Here we show that eye movements during an everyday task predict aspects of our personality. We tracked eye movements of 42 participants while they ran an errand on a university campus and subsequently assessed their personality traits using well-established questionnaires. Using a state-of-the-art machine learning method and a rich set of features encoding different eye movement characteristics, we were able to reliably predict four of the Big Five personality traits (neuroticism, extraversion, agreeableness, conscientiousness) as well as perceptual curiosity only from eye movements. Further analysis revealed new relations between previously neglected eye movement characteristics and personality. Our findings demonstrate a considerable influence of personality on everyday eye movement control, thereby complementing earlier studies in laboratory settings. Improving automatic recognition and interpretation of human social signals is an important endeavor, enabling innovative design of human-computer systems capable of sensing spontaneous natural user behavior to facilitate efficient interaction and personalization.
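The general pipeline (summary eye-movement features per recording, then a standard classifier) can be sketched as below; the specific features, toy labels and the random-forest classifier are stand-ins, not the feature set or learning method used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(fix_durations_s, sacc_amplitudes_deg, blink_count, window_s):
    """Summary statistics for one time window of an eye-movement recording.
    The statistics chosen here are illustrative; the study used a much
    richer feature set."""
    return np.array([
        np.mean(fix_durations_s), np.std(fix_durations_s),
        len(fix_durations_s) / window_s,
        np.mean(sacc_amplitudes_deg), np.std(sacc_amplitudes_deg),
        blink_count / window_s,
    ])

# Toy labels: e.g. low vs. high score on one trait (invented data).
rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1):
    for _ in range(40):
        fixes = rng.gamma(2.0, 0.15 + 0.05 * label, size=30)
        saccs = rng.gamma(2.0, 2.0 + 0.5 * label, size=30)
        X.append(window_features(fixes, saccs, blink_count=rng.poisson(5),
                                 window_s=60))
        y.append(label)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(clf.score(X, y))
```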
Olsen, Rosanna K; Sebanayagam, Vinoja; Lee, Yunjo; Moscovitch, Morris; Grady, Cheryl L; Rosenbaum, R Shayna; Ryan, Jennifer D
2016-12-01
There is consistent agreement regarding the positive relationship between cumulative eye movement sampling and subsequent recognition, but the role of the hippocampus in this sampling behavior is currently unknown. It is also unclear whether the eye movement repetition effect, i.e., fewer fixations to repeated, compared to novel, stimuli, depends on explicit recognition and/or an intact hippocampal system. We investigated the relationship between cumulative sampling, the eye movement repetition effect, subsequent memory, and the hippocampal system. Eye movements were monitored in a developmental amnesic case (H.C.), whose hippocampal system is compromised, and in a group of typically developing participants while they studied single faces across multiple blocks. The faces were studied from the same viewpoint or different viewpoints and were subsequently tested with the same or different viewpoint. Our previous work suggested that hippocampal representations support explicit recognition for information that changes viewpoint across repetitions (Olsen et al., 2015). Here, examination of eye movements during encoding indicated that greater cumulative sampling was associated with better memory among controls. Increased sampling, however, was not associated with better explicit memory in H.C., suggesting that increased sampling only improves memory when the hippocampal system is intact. The magnitude of the repetition effect was not correlated with cumulative sampling, nor was it related reliably to subsequent recognition. These findings indicate that eye movements collect information that can be used to strengthen memory representations that are later available for conscious remembering, whereas eye movement repetition effects reflect a processing change due to experience that does not necessarily reflect a memory representation that is available for conscious appraisal. Lastly, H.C. demonstrated a repetition effect for fixed viewpoint faces but not for variable viewpoint faces, which suggests that repetition effects are differentially supported by neocortical and hippocampal systems, depending upon the representational nature of the underlying memory trace. Copyright © 2016 Elsevier Ltd. All rights reserved.
A 2D eye gaze estimation system with low-resolution webcam images
NASA Astrophysics Data System (ADS)
Ince, Ibrahim Furkan; Kim, Jin Woo
2011-12-01
In this article, a low-cost system for 2D eye gaze estimation with low-resolution webcam images is presented. Two algorithms are proposed for this purpose: one for eyeball detection with a stable approximate pupil center, and the other for detecting the direction of eye movements. The eyeball is detected using the deformable angular integral search by minimum intensity (DAISMI) algorithm. The deformable template-based 2D gaze estimation (DTBGE) algorithm is employed as a noise filter for making stable movement decisions. While DTBGE employs binary images, DAISMI employs gray-scale images. Right- and left-eye estimates are evaluated separately. DAISMI finds the stable approximate pupil-center location by calculating the mass center of the eyeball border vertices, which is then employed for initial deformable template alignment. DTBGE starts running with the initial alignment and updates the template alignment with the resulting eye movements and eyeball size frame by frame. The horizontal and vertical deviation of eye movements, normalized by eyeball size, is treated as directly proportional to the deviation of cursor movements for a given screen size and resolution. The core advantage of the system is that it does not employ the real pupil center as a reference point for gaze estimation, which makes it more robust against corneal reflection. Visual angle accuracy is used for the evaluation and benchmarking of the system. The effectiveness of the proposed system is presented and experimental results are shown.
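A minimal sketch of the proportional mapping described above, assuming DAISMI/DTBGE have already produced an approximate pupil centre, an eyeball centre and an eyeball size in image pixels; the gain and screen dimensions are hypothetical.

```python
def gaze_to_cursor(pupil_center, eyeball_center, eyeball_radius_px,
                   screen_w=1920, screen_h=1080, gain=1.0):
    """Map the deviation of the approximate pupil centre from the eyeball
    centre (both in image pixels) to a cursor position, treating the
    deviation, normalized by eyeball size, as directly proportional to
    cursor displacement, as the abstract describes.

    Gain and screen size are illustrative; the DAISMI/DTBGE detection
    step is assumed to have produced the two centre estimates."""
    dx = (pupil_center[0] - eyeball_center[0]) / eyeball_radius_px
    dy = (pupil_center[1] - eyeball_center[1]) / eyeball_radius_px
    cx = screen_w / 2 + gain * dx * (screen_w / 2)
    cy = screen_h / 2 + gain * dy * (screen_h / 2)
    # Clamp to the visible screen.
    return (min(max(cx, 0), screen_w - 1), min(max(cy, 0), screen_h - 1))

print(gaze_to_cursor(pupil_center=(64, 40), eyeball_center=(60, 42),
                     eyeball_radius_px=20))
```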
Intersegmental Eye-Head-Body Interactions during Complex Whole Body Movements
von Laßberg, Christoph; Beykirch, Karl A.; Mohler, Betty J.; Bülthoff, Heinrich H.
2014-01-01
Using state-of-the-art technology, interactions of eye, head and intersegmental body movements were analyzed for the first time during multiple twisting somersaults of high-level gymnasts. With this aim, we used a unique combination of a 16-channel infrared kinemetric system; a three-dimensional video kinemetric system; wireless electromyography; and a specialized wireless sport-video-oculography system, which was able to capture and calculate precise oculomotor data under conditions of rapid multiaxial acceleration. All data were synchronized and integrated in a multimodal software tool for three-dimensional analysis. During specific phases of the recorded movements, a previously unknown eye-head-body interaction was observed. The phenomenon was marked by a prolonged and complete suppression of gaze-stabilizing eye movements, in favor of a tight coupling with the head, spine and joint movements of the gymnasts. Potential reasons for these observations are discussed with regard to earlier findings and integrated within a functional model. PMID:24763143
Application of eye movement measuring system OBER 2 to medicine and technology
NASA Astrophysics Data System (ADS)
Ober, Jozef; Hajda, Janusz; Loska, Jacek; Jamicki, Michal
1997-08-01
The OBER 2 is an infrared-light eye movement measuring system that works with IBM PC compatible computers. As one of the safest systems for measuring eye movement, it uses a very short infrared flash time (80 microseconds for each measured point). The system has an advanced analog-digital controller, which includes background suppression and prediction mechanisms guaranteeing elimination of slow changes and fluctuations of external illumination at frequencies up to 100 Hz, with an effectiveness better than 40 dB. The active measurement axis, the sampling rate (25-4000 Hz), and the start and stop of a measurement can all be set from the PC, which makes it possible to control the measurement environment in real time. By proper control of the gain it is possible to obtain a high temporal and positional resolution of 0.5 minute of arc, even for large amplitudes of eye movement (plus or minus 20 degrees of visual angle). The whole communication system can also be driven directly by eye movement in real time. The possibility of automatically selecting the most essential elements of eye movement, both those that are individual to each person and those that occur for every person in particular life situations independently of personal features, is a key to practical application. Hence one of the research topics being conducted is personal identification based on personal features. Another task is a research project on falling-asleep detection, which can be applied to warn drivers before they fall asleep while driving. This measuring system, with a proper expert system, can also be used to detect dyslexia and other disabilities of the visual system.
Biometric recognition via texture features of eye movement trajectories in a visual searching task.
Li, Chunyong; Xue, Jiguo; Quan, Cheng; Yue, Jingwei; Zhang, Chenggang
2018-01-01
Biometric recognition technology based on eye-movement dynamics has been in development for more than ten years. Different visual tasks, feature extraction and feature recognition methods are proposed to improve the performance of eye movement biometric system. However, the correct identification and verification rates, especially in long-term experiments, as well as the effects of visual tasks and eye trackers' temporal and spatial resolution are still the foremost considerations in eye movement biometrics. With a focus on these issues, we proposed a new visual searching task for eye movement data collection and a new class of eye movement features for biometric recognition. In order to demonstrate the improvement of this visual searching task being used in eye movement biometrics, three other eye movement feature extraction methods were also tested on our eye movement datasets. Compared with the original results, all three methods yielded better results as expected. In addition, the biometric performance of these four feature extraction methods was also compared using the equal error rate (EER) and Rank-1 identification rate (Rank-1 IR), and the texture features introduced in this paper were ultimately shown to offer some advantages with regard to long-term stability and robustness over time and spatial precision. Finally, the results of different combinations of these methods with a score-level fusion method indicated that multi-biometric methods perform better in most cases.
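The two evaluation ingredients mentioned above, equal error rate and score-level fusion, can be sketched as follows for similarity scores where higher means a more likely genuine match; the threshold-sweep EER and min-max sum-rule fusion are common choices assumed here, not necessarily the paper's exact procedures, and the score distributions are invented.

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """EER from similarity scores (higher = more likely genuine).
    Sweeps thresholds over the observed scores; adequate for a sketch,
    though production systems interpolate the ROC curve."""
    genuine = np.asarray(genuine, float)
    impostor = np.asarray(impostor, float)
    best_gap, eer = np.inf, None
    for thr in np.sort(np.concatenate([genuine, impostor])):
        far = np.mean(impostor >= thr)   # false accept rate
        frr = np.mean(genuine < thr)     # false reject rate
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2
    return eer

def sum_rule_fusion(matcher_scores, weights=None):
    """Score-level fusion: min-max normalize each matcher's scores over
    the whole score set, then take a (weighted) sum."""
    weights = weights or [1.0] * len(matcher_scores)
    fused = np.zeros(len(matcher_scores[0]))
    for w, s in zip(weights, matcher_scores):
        s = np.asarray(s, float)
        fused += w * (s - s.min()) / (s.max() - s.min() + 1e-12)
    return fused

rng = np.random.default_rng(0)
gen1, imp1 = rng.normal(0.70, 0.10, 500), rng.normal(0.40, 0.10, 500)  # matcher 1
gen2, imp2 = rng.normal(0.65, 0.12, 500), rng.normal(0.45, 0.12, 500)  # matcher 2
fused = sum_rule_fusion([np.concatenate([gen1, imp1]),
                         np.concatenate([gen2, imp2])])
print("EER matcher 1:", equal_error_rate(gen1, imp1))
print("EER fused    :", equal_error_rate(fused[:500], fused[500:]))
```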
Exogenous orienting of attention depends upon the ability to execute eye movements.
Smith, Daniel T; Rorden, Chris; Jackson, Stephen R
2004-05-04
Shifts of attention can be made overtly by moving the eyes or covertly with attention being allocated to a region of space that does not correspond to the current direction of gaze. However, the precise relationship between eye movements and the covert orienting of attention remains controversial. The influential premotor theory proposes that the covert orienting of attention is produced by the programming of (unexecuted) eye movements and thus predicts a strong relationship between the ability to execute eye movements and the operation of spatial attention. Here, we demonstrate for the first time that impaired spatial attention is observed in an individual (AI) who is neurologically healthy but who cannot execute eye movements as a result of a congenital impairment in the elasticity of her eye muscles. This finding provides direct support for the role of the eye-movement system in the covert orienting of attention and suggests that whereas intact cortical structures may be necessary for normal attentional reflexes, they are not sufficient. The ability to move our eyes is essential for the development of normal patterns of spatial attention.
Straube, A; Bronstein, A; Straumann, D
2012-01-01
The ocular motor system consists of several subsystems, including the vestibular ocular (nystagmus) system, the saccade system, the pursuit system, the fixation and gaze-holding system and the vergence system. All these subsystems aid the stabilization of images on the retina during eye and head movements, and any kind of disturbance of one of the systems can cause instability of the eyes (e.g. nystagmus) or an inadequate eye movement causing a mismatch between head and eye movement (e.g. bilateral vestibular failure). In both situations, the subjects experience a movement of the world (oscillopsia) which is quite disturbing. New insights into the patho-physiology of some of the ocular motor disorders have helped to establish new treatment options, in particular in downbeat nystagmus, upbeat nystagmus, periodic alternating nystagmus, acquired pendular nystagmus and paroxysmal vestibular episodes/attacks. The patho-physiology of these disorders and the current literature on treatment options are discussed, and practical treatment recommendations are given in the paper. © 2011 The Author(s). European Journal of Neurology © 2011 EFNS.
Clinical-Radiologic Correlation of Extraocular Eye Movement Disorders: Seeing beneath the Surface.
Thatcher, Joshua; Chang, Yu-Ming; Chapman, Margaret N; Hovis, Keegan; Fujita, Akifumi; Sobel, Rachel; Sakai, Osamu
2016-01-01
Extraocular eye movement disorders are relatively common and may be a significant source of discomfort and morbidity for patients. The presence of restricted eye movement can be detected clinically with quick, easily performed, noninvasive maneuvers that assess medial, lateral, upward, and downward gaze. However, detecting the presence of ocular dysmotility may not be sufficient to pinpoint the exact cause of eye restriction. Imaging plays an important role in excluding, in some cases, and detecting, in others, a specific cause responsible for the clinical presentation. However, the radiologist should be aware that the imaging findings in many of these conditions when taken in isolation from the clinical history and symptoms are often nonspecific. Normal eye movements are directly controlled by the ocular motor cranial nerves (CN III, IV, and VI) in coordination with indirect input or sensory stimuli derived from other cranial nerves. Specific causes of ocular dysmotility can be localized to the cranial nerve nuclei in the brainstem, the cranial nerve pathways in the peripheral nervous system, and the extraocular muscles in the orbit, with disease at any of these sites manifesting clinically as an eye movement disorder. A thorough understanding of central nervous system anatomy, cranial nerve pathways, and orbital anatomy, as well as familiarity with patterns of eye movement restriction, are necessary for accurate detection of radiologic abnormalities that support a diagnostic source of the suspected extraocular movement disorder. © RSNA, 2016.
The anatomy and physiology of the ocular motor system.
Horn, Anja K E; Leigh, R John
2011-01-01
Accurate diagnosis of abnormal eye movements depends upon knowledge of the purpose, properties, and neural substrate of distinct functional classes of eye movement. Here, we summarize current concepts of the anatomy of eye movement control. Our approach is bottom-up, starting with the extraocular muscles and their innervation by the cranial nerves. Second, we summarize the neural circuits in the pons underlying horizontal gaze control, and the midbrain connections that coordinate vertical and torsional movements. Third, the role of the cerebellum in governing and optimizing eye movements is presented. Fourth, each area of cerebral cortex contributing to eye movements is discussed. Last, descending projections from cerebral cortex, including basal ganglionic circuits that govern different components of gaze, and the superior colliculus, are summarized. At each stage of this review, the anatomical scheme is used to predict the effects of lesions on the control of eye movements, providing clinical-anatomical correlation. Copyright © 2011 Elsevier B.V. All rights reserved.
Towards free 3D end-point control for robotic-assisted human reaching using binocular eye tracking.
Maimon-Dror, Roni O; Fernandez-Quesada, Jorge; Zito, Giuseppe A; Konnaris, Charalambos; Dziemian, Sabine; Faisal, A Aldo
2017-07-01
Eye-movements are the only directly observable behavioural signals that are highly correlated with actions at the task level, and proactive of body movements, and thus reflect action intentions. Moreover, eye movements are preserved in many movement disorders leading to paralysis (or amputees) from stroke, spinal cord injury, Parkinson's disease, multiple sclerosis, and muscular dystrophy among others. Despite this benefit, eye tracking is not widely used as a control interface for robotic interfaces in movement-impaired patients due to poor human-robot interfaces. We demonstrate here how combining 3D gaze tracking using our GT3D binocular eye tracker with a custom-designed 3D head tracking system and calibration method enables continuous 3D end-point control of a robotic arm support system. The users can move their own hand to any location of the workspace by simply looking at the target and winking once. This purely eye-tracking-based system enables the end-user to retain free head movement and yet achieves high spatial end-point accuracy on the order of 6 cm RMSE error in each dimension and a standard deviation of 4 cm. 3D calibration is achieved by moving the robot along a 3-dimensional space-filling Peano curve while the user is tracking it with their eyes. This results in a fully automated calibration procedure that yields several thousand calibration points versus standard approaches using a dozen points, resulting in beyond state-of-the-art 3D accuracy and precision.
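A short sketch of the accuracy figures quoted above (per-dimension RMSE and standard deviation of the 3D end-point error); the trial data below are invented placeholders.

```python
import numpy as np

def endpoint_error_stats(estimated_xyz, true_xyz):
    """Per-dimension RMSE and standard deviation of 3D end-point error.
    Both inputs are (n_trials x 3) arrays in metres; these are the two
    accuracy figures quoted in the abstract."""
    err = np.asarray(estimated_xyz, float) - np.asarray(true_xyz, float)
    rmse = np.sqrt(np.mean(err ** 2, axis=0))
    sd = np.std(err, axis=0)
    return rmse, sd

rng = np.random.default_rng(0)
truth = rng.uniform(-0.3, 0.3, size=(100, 3))            # hypothetical targets (m)
estimate = truth + rng.normal(0, 0.05, size=(100, 3))    # hypothetical gaze estimates
rmse, sd = endpoint_error_stats(estimate, truth)
print("RMSE per axis (m):", rmse, " SD per axis (m):", sd)
```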
Instrument Display Visual Angles for Conventional Aircraft and the MQ-9 Ground Control Station
NASA Technical Reports Server (NTRS)
Bendrick, Gregg A.; Kamine, Tovy Haber
2008-01-01
Aircraft instrument panels should be designed such that primary displays are in optimal viewing location to minimize pilot perception and response time. Human Factors engineers define three zones (i.e. "cones") of visual location: 1) "Easy Eye Movement" (foveal vision); 2) "Maximum Eye Movement" (peripheral vision with saccades), and 3) "Head Movement" (head movement required). Instrument display visual angles were measured to determine how well conventional aircraft (T-34, T-38, F-15B, F-16XL, F/A-18A, U-2D, ER-2, King Air, G-III, B-52H, DC-10, B747-SCA) and the MQ-9 ground control station (GCS) complied with these standards, and how they compared with each other. Methods: Selected instrument parameters included: attitude, pitch, bank, power, airspeed, altitude, vertical speed, heading, turn rate, slip/skid, AOA, flight path, latitude, longitude, course, bearing, range and time. Vertical and horizontal visual angles for each component were measured from the pilot's eye position in each system. Results: The vertical visual angles of displays in conventional aircraft lay within the cone of "Easy Eye Movement" for all but three of the parameters measured, and almost all of the horizontal visual angles fell within this range. All conventional vertical and horizontal visual angles lay within the cone of "Maximum Eye Movement". However, most instrument vertical visual angles of the MQ-9 GCS lay outside the cone of "Easy Eye Movement", though all were within the cone of "Maximum Eye Movement". All the horizontal visual angles for the MQ-9 GCS were within the cone of "Easy Eye Movement". Discussion: Most instrument displays in conventional aircraft lay within the cone of "Easy Eye Movement", though mission-critical instruments sometimes displaced less important instruments outside this area. Many of the MQ-9 GCS systems lay outside this area. Specific training for MQ-9 pilots may be needed to avoid increased response time and potential error during flight.
Difference in Visual Processing Assessed by Eye Vergence Movements
Solé Puig, Maria; Puigcerver, Laura; Aznar-Casanova, J. Antonio; Supèr, Hans
2013-01-01
Orienting visual attention is closely linked to the oculomotor system. For example, a shift of attention is usually followed by a saccadic eye movement and can be revealed by micro saccades. Recently we reported a novel role of another type of eye movement, namely eye vergence, in orienting visual attention. Shifts in visuospatial attention are characterized by the response modulation to a selected target. However, unlike (micro-) saccades, eye vergence movements do not carry spatial information (except for depth) and are thus not specific to a particular visual location. To further understand the role of eye vergence in visual attention, we tested subjects with different perceptual styles. Perceptual style refers to the characteristic way individuals perceive environmental stimuli, and is characterized by a spatial difference (local vs. global) in perceptual processing. We tested field independent (local; FI) and field dependent (global; FD) observers in a cue/no-cue task and a matching task. We found that FI observers responded faster and had stronger modulation in eye vergence in both tasks than FD subjects. The results may suggest that eye vergence modulation may relate to the trade-off between the size of spatial region covered by attention and the processing efficiency of sensory information. Alternatively, vergence modulation may have a role in the switch in cortical state to prepare the visual system for new incoming sensory information. In conclusion, vergence eye movements may be added to the growing list of functions of fixational eye movements in visual perception. However, further studies are needed to elucidate its role. PMID:24069140
Kukona, Anuenue; Tabor, Whitney
2011-01-01
The visual world paradigm presents listeners with a challenging problem: they must integrate two disparate signals, the spoken language and the visual context, in support of action (e.g., complex movements of the eyes across a scene). We present Impulse Processing, a dynamical systems approach to incremental eye movements in the visual world that suggests a framework for integrating language, vision, and action generally. Our approach assumes that impulses driven by the language and the visual context impinge minutely on a dynamical landscape of attractors corresponding to the potential eye-movement behaviors of the system. We test three unique predictions of our approach in an empirical study in the visual world paradigm, and describe an implementation in an artificial neural network. We discuss the Impulse Processing framework in relation to other models of the visual world paradigm. PMID:21609355
Chang, Won-Du; Cha, Ho-Seung; Im, Chang-Hwan
2016-01-01
This paper introduces a method to remove the unwanted interdependency between vertical and horizontal eye-movement components in electrooculograms (EOGs). EOGs have been widely used to estimate eye movements without a camera in a variety of human-computer interaction (HCI) applications using pairs of electrodes generally attached either above and below the eye (vertical EOG) or to the left and right of the eyes (horizontal EOG). It has been well documented that the vertical EOG component has less stability than the horizontal EOG one, making accurate estimation of the vertical location of the eyes difficult. To address this issue, an experiment was designed in which ten subjects participated. Visual inspection of the recorded EOG signals showed that the vertical EOG component is highly influenced by horizontal eye movements, whereas the horizontal EOG is rarely affected by vertical eye movements. Moreover, the results showed that this interdependency could be effectively removed by introducing an individual constant value. It is therefore expected that the proposed method can enhance the overall performance of practical EOG-based eye-tracking systems. PMID:26907271
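A minimal sketch of the correction described above: a single per-subject constant scales the horizontal EOG and is subtracted from the vertical channel. Estimating that constant by least squares on a horizontal-only calibration segment is an assumption; the paper states only that one individual constant suffices.

```python
import numpy as np

def fit_crosstalk_constant(v_eog, h_eog):
    """Estimate the individual constant k such that the vertical channel
    recorded during purely horizontal eye movements is approximated by
    k * horizontal channel (least squares)."""
    h = np.asarray(h_eog, float)
    v = np.asarray(v_eog, float)
    return float(np.dot(h, v) / (np.dot(h, h) + 1e-12))

def correct_vertical(v_eog, h_eog, k):
    """Remove the horizontal contribution from the vertical EOG."""
    return np.asarray(v_eog, float) - k * np.asarray(h_eog, float)

# Toy calibration: the subject makes left-right saccades only, yet the
# vertical channel shows a scaled copy of the horizontal signal.
rng = np.random.default_rng(0)
h_cal = np.repeat(rng.choice([-200.0, 200.0], size=20), 50)   # uV, hypothetical
v_cal = 0.18 * h_cal + rng.normal(0, 5, h_cal.size)
k = fit_crosstalk_constant(v_cal, h_cal)
print(round(k, 3))                      # recovered individual constant
print(correct_vertical(v_cal, h_cal, k).std())
```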
NASA Astrophysics Data System (ADS)
Namazi, Hamidreza; Kulish, Vladimir V.; Akrami, Amin
2016-05-01
One of the major challenges in vision research is to analyze the effect of visual stimuli on human vision. However, no relationship has yet been discovered between the structure of the visual stimulus and the structure of fixational eye movements. This study reveals the plasticity of human fixational eye movements in relation to the ‘complex’ visual stimulus. We demonstrated that the fractal temporal structure of visual dynamics shifts towards the fractal dynamics of the visual stimulus (image). The results showed that images with higher complexity (higher fractality) cause fixational eye movements with lower fractality. Considering the brain as the main part of the nervous system engaged in eye movements, we also analyzed the electroencephalogram (EEG) signal recorded during fixation. We found a coupling between the fractality of the image, the EEG and the fixational eye movements. The capability observed in this research can be further investigated and applied to the treatment of different vision disorders.
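The abstract does not specify the fractality measure; as one illustration of how the fractality of a fixational eye-position trace could be quantified, the sketch below implements the Higuchi fractal dimension, a standard estimator that is not necessarily the one used by the authors:

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    # Higuchi (1988) fractal dimension estimate of a 1-D time series.
    x = np.asarray(x, dtype=float)
    n = x.size
    lk = []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)          # sub-series x[m], x[m+k], ...
            if idx.size < 2:
                continue
            dist = np.sum(np.abs(np.diff(x[idx])))
            # Normalized curve length of this sub-series.
            lengths.append(dist * (n - 1) / ((idx.size - 1) * k * k))
        lk.append(np.mean(lengths))
    k_vals = np.arange(1, k_max + 1)
    # The fractal dimension is the slope of log L(k) versus log(1/k).
    slope, _ = np.polyfit(np.log(1.0 / k_vals), np.log(lk), 1)
    return slope

# Sanity check: a Brownian-like trace (cumulative white noise) should give a
# dimension near 1.5; smoother traces give values closer to 1.
rng = np.random.default_rng(0)
print(higuchi_fd(np.cumsum(rng.standard_normal(2000))))
```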
Evaluation of Eye Metrics as a Detector of Fatigue
2010-03-01
eyeglass frames. The cameras are angled upward toward the eyes and extract real-time pupil diameter, eye-lid movement, and eye-ball movement. The... because the cameras were mounted on eyeglass-like frames, the system was able to continuously monitor the eye throughout all sessions. Overall, the... of “fitness for duty” testing and “real-time monitoring” of operator performance has been slow (Institute of Medicine, 2004). Oculometric-based
NASA Astrophysics Data System (ADS)
Komogortsev, Oleg V.; Karpov, Alexey; Holland, Corey D.
2012-06-01
The widespread use of computers throughout modern society introduces the necessity for usable and counterfeit-resistant authentication methods to ensure secure access to personal resources such as bank accounts, e-mail, and social media. Current authentication methods require tedious memorization of lengthy pass phrases, are often prone to shoulder-surfing, and may be easily replicated (either by counterfeiting parts of the human body or by guessing an authentication token based on readily available information). This paper describes preliminary work toward a counterfeit-resistant usable eye movement-based (CUE) authentication method. CUE does not require any passwords (improving the memorability aspect of the authentication system), and aims to provide high resistance to spoofing and shoulder-surfing by employing the combined biometric capabilities of two behavioral biometric traits: 1) oculomotor plant characteristics (OPC), which represent the internal, non-visible, anatomical structure of the eye; 2) complex eye movement patterns (CEM), which represent the strategies employed by the brain to guide visual attention. Both OPC and CEM are extracted from the eye movement signal provided by an eye tracking system. Preliminary results indicate that the fusion of OPC and CEM traits is capable of providing a 30% reduction in authentication error when compared to the authentication accuracy of individual traits.
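The fusion rule is not given in the abstract; purely as a generic illustration of score-level fusion of two behavioral biometric traits, the sketch below normalizes and combines hypothetical OPC and CEM match scores (bounds, weight and acceptance threshold are assumptions):

```python
import numpy as np

def normalize(score, lo, hi):
    # Map a raw match score onto [0, 1] using bounds estimated from
    # enrollment data, so heterogeneous trait scores become comparable.
    return float(np.clip((score - lo) / (hi - lo), 0.0, 1.0))

def fuse(opc_score, cem_score, opc_bounds, cem_bounds, w_opc=0.5):
    # Score-level fusion: weighted sum of the two normalized trait scores.
    return (w_opc * normalize(opc_score, *opc_bounds)
            + (1.0 - w_opc) * normalize(cem_score, *cem_bounds))

# Illustrative decision on a single authentication attempt.
fused = fuse(opc_score=0.72, cem_score=0.65,
             opc_bounds=(0.2, 0.9), cem_bounds=(0.1, 0.8))
print("accept" if fused > 0.6 else "reject")
```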
Eye-hand coordination during a double-step task: evidence for a common stochastic accumulator
Gopal, Atul
2015-01-01
Many studies of reaching and pointing have shown significant spatial and temporal correlations between eye and hand movements. Nevertheless, it remains unclear whether these correlations are incidental, arising from common inputs (independent model); whether these correlations represent an interaction between otherwise independent eye and hand systems (interactive model); or whether these correlations arise from a single dedicated eye-hand system (common command model). Subjects were instructed to redirect gaze and pointing movements in a double-step task in an attempt to decouple eye-hand movements and causally distinguish between the three architectures. We used a drift-diffusion framework in the context of a race model, which has been previously used to explain redirect behavior for eye and hand movements separately, to predict the pattern of eye-hand decoupling. We found that the common command architecture could best explain the observed frequency of different eye and hand response patterns to the target step. A common stochastic accumulator for eye-hand coordination also predicts comparable variances, despite significant difference in the means of the eye and hand reaction time (RT) distributions, which we tested. Consistent with this prediction, we observed that the variances of the eye and hand RTs were similar, despite much larger hand RTs (∼90 ms). Moreover, changes in mean eye RTs, which also increased eye RT variance, produced a similar increase in mean and variance of the associated hand RT. Taken together, these data suggest that a dedicated circuit underlies coordinated eye-hand planning. PMID:26084906
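A toy simulation of the key prediction, under strongly simplified assumptions (a LATER-style accumulator with trial-to-trial rate variability rather than the authors' fitted drift-diffusion race model): a single shared accumulator with effector-specific delays produces eye and hand reaction times that differ in mean but not in spread.

```python
import numpy as np

rng = np.random.default_rng(1)

def common_accumulator_rts(n_trials=5000, threshold=1.0,
                           rate_mean=0.005, rate_sd=0.001,
                           eye_delay=0.0, hand_delay=90.0, dt=1.0):
    # One decision signal per trial rises to threshold at a rate that varies
    # across trials; the same crossing time triggers both effectors, each
    # followed by its own fixed efferent delay (values in ms, illustrative).
    rates = rng.normal(rate_mean, rate_sd, n_trials)
    decision_time = threshold / np.maximum(rates, 1e-6) * dt
    return decision_time + eye_delay, decision_time + hand_delay

eye_rt, hand_rt = common_accumulator_rts()
print(f"eye RT:  mean {eye_rt.mean():.0f} ms, sd {eye_rt.std():.0f} ms")
print(f"hand RT: mean {hand_rt.mean():.0f} ms, sd {hand_rt.std():.0f} ms")
# The shared accumulator is the only source of variability, so the two RT
# distributions differ in mean (by the delay difference) but not in variance.
```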
Singh, Tarkeshwar; Perry, Christopher M; Herter, Troy M
2016-01-26
Robotic and virtual-reality systems offer tremendous potential for improving assessment and rehabilitation of neurological disorders affecting the upper extremity. A key feature of these systems is that visual stimuli are often presented within the same workspace as the hands (i.e., peripersonal space). Integrating video-based remote eye tracking with robotic and virtual-reality systems can provide an additional tool for investigating how cognitive processes influence visuomotor learning and rehabilitation of the upper extremity. However, remote eye tracking systems typically compute ocular kinematics by assuming eye movements are made in a plane with constant depth (e.g. frontal plane). When visual stimuli are presented at variable depths (e.g. transverse plane), eye movements have a vergence component that may influence reliable detection of gaze events (fixations, smooth pursuits and saccades). To our knowledge, there are no available methods to classify gaze events in the transverse plane for monocular remote eye tracking systems. Here we present a geometrical method to compute ocular kinematics from a monocular remote eye tracking system when visual stimuli are presented in the transverse plane. We then use the obtained kinematics to compute velocity-based thresholds that allow us to accurately identify onsets and offsets of fixations, saccades and smooth pursuits. Finally, we validate our algorithm by comparing the gaze events computed by the algorithm with those obtained from the eye-tracking software and manual digitization. Within the transverse plane, our algorithm reliably differentiates saccades from fixations (static visual stimuli) and smooth pursuits from saccades and fixations when visual stimuli are dynamic. The proposed methods provide advancements for examining eye movements in robotic and virtual-reality systems. Our methods can also be used with other video-based or tablet-based systems in which eye movements are performed in a peripersonal plane with variable depth.
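As a simplified illustration of velocity-threshold classification only (not the authors' geometrical correction for vergence in the transverse plane), the sketch below labels samples of an angular gaze trace as fixation, smooth pursuit or saccade; both thresholds are assumptions:

```python
import numpy as np

def classify_gaze_samples(azimuth_deg, elevation_deg, fs,
                          pursuit_thresh=5.0, saccade_thresh=30.0):
    # Per-sample angular speed (deg/s) from azimuth/elevation traces, then a
    # simple two-threshold labeling; thresholds are illustrative and would be
    # tuned to the tracker noise and viewing geometry in practice.
    speed = np.hypot(np.gradient(azimuth_deg), np.gradient(elevation_deg)) * fs
    labels = np.full(speed.size, "fixation", dtype=object)
    labels[speed >= pursuit_thresh] = "smooth_pursuit"
    labels[speed >= saccade_thresh] = "saccade"
    return labels, speed

# Synthetic 200-ms trace at 500 Hz containing an 8-deg horizontal step.
fs = 500.0
t = np.arange(0.0, 0.2, 1.0 / fs)
azimuth = np.where(t < 0.1, 0.0, 8.0)
elevation = np.zeros_like(azimuth)
labels, speed = classify_gaze_samples(azimuth, elevation, fs)
print(labels[48:53])   # samples around the step are labeled "saccade"
```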
Real-Time Control of a Video Game Using Eye Movements and Two Temporal EEG Sensors.
Belkacem, Abdelkader Nasreddine; Saetia, Supat; Zintus-art, Kalanyu; Shin, Duk; Kambara, Hiroyuki; Yoshimura, Natsue; Berrached, Nasreddine; Koike, Yasuharu
2015-01-01
EEG-controlled gaming applications range widely from strictly medical to completely nonmedical applications. Games can provide not only entertainment but also strong motivation for practicing, thereby achieving better control with a rehabilitation system. In this paper we present real-time control of a video game with eye movements for an asynchronous and noninvasive communication system using two temporal EEG sensors. We used wavelets to detect the instances of eye movement and time-series characteristics to distinguish between six classes of eye movement. A control interface was developed to test the proposed algorithm in real-time experiments with open and closed eyes. Using visual feedback, a mean classification accuracy of 77.3% was obtained for control with six commands, and a mean classification accuracy of 80.2% was obtained using auditory feedback for control with five commands. The algorithm was then applied to control the direction and speed of character movement in a two-dimensional video game. Results showed that the proposed algorithm had an efficient response speed and timing with a bit rate of 30 bits/min, demonstrating its efficacy and robustness in real-time control.
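A rough sketch of the detection stage, assuming the eye-movement artifact appears as a large transient in the difference of the two temporal channels and that a Haar wavelet decomposition (PyWavelets) can expose it; the decomposition level and threshold factor are illustrative, not the paper's parameters:

```python
import numpy as np
import pywt  # PyWavelets

def detect_eye_movement_onsets(ch_left, ch_right, level=3, k=4.0):
    # The difference of the two temporal channels emphasizes horizontal
    # eye-movement artifacts over symmetric brain activity.
    diff = np.asarray(ch_left, float) - np.asarray(ch_right, float)
    coeffs = pywt.wavedec(diff, "haar", level=level)
    detail = coeffs[1]                               # coarsest detail band
    sigma = np.median(np.abs(detail)) / 0.6745       # robust noise estimate
    hits = np.flatnonzero(np.abs(detail) > k * sigma)
    return hits * 2 ** level                         # approximate sample indices

# Synthetic example: opposite-polarity deflections on the two channels
# starting at sample 603 of a 1200-sample record.
rng = np.random.default_rng(3)
n, onset = 1200, 603
common = rng.standard_normal(n) * 5.0
artifact = np.where(np.arange(n) >= onset, 40.0, 0.0)
left = common + rng.standard_normal(n) * 2.0 + artifact
right = common + rng.standard_normal(n) * 2.0 - artifact
print(detect_eye_movement_onsets(left, right))       # index near 600
```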
Maroufi, Mohsen; Zamani, Shahla; Izadikhah, Zahra; Marofi, Maryam; O'Connor, Peter
2016-09-01
To investigate the efficacy of Eye Movement Desensitization and Reprocessing for postoperative pain management in adolescents. Eye Movement Desensitization and Reprocessing is an inexpensive, non-pharmacological intervention that has successfully been used to treat chronic pain. It holds promise in the treatment of acute, postsurgical pain based on its purported effects on the brain and nervous system. A randomized controlled trial was used. Fifty-six adolescent surgical patients aged between 12-18 years were allocated to gender-balanced Eye Movement Desensitization and Reprocessing (treatment) or non-Eye Movement Desensitization and Reprocessing (control) groups. Pain was measured using the Wong-Baker FACES(®) Pain Rating Scale (WBFS) before and after the intervention (or non-intervention for the control group). A Wilcoxon signed-rank test demonstrated that the Eye Movement Desensitization and Reprocessing group experienced a significant reduction in pain intensity after treatment intervention, whereas the control group did not. Additionally, a Mann-Whitney U-test showed that, while there was no significant difference between the two groups at time 1, there was a significant difference in pain intensity between the two groups at time 2, with the Eye Movement Desensitization and Reprocessing group experiencing lower levels of pain. These results suggest that Eye Movement Desensitization and Reprocessing may be an effective treatment modality for postoperative pain. © 2016 John Wiley & Sons Ltd.
Wu, Shang-Lin; Liao, Lun-De; Lu, Shao-Wei; Jiang, Wei-Ling; Chen, Shi-An; Lin, Chin-Teng
2013-08-01
Electrooculography (EOG) signals can be used to control human-computer interface (HCI) systems, if properly classified. The ability to measure and process these signals may help HCI users to overcome many of the physical limitations and inconveniences in daily life. However, there are currently no effective multidirectional classification methods for monitoring eye movements. Here, we describe a classification method used in a wireless EOG-based HCI device for detecting eye movements in eight directions. This device includes wireless EOG signal acquisition components, wet electrodes and an EOG signal classification algorithm. The EOG classification algorithm is based on extracting features from the electrical signals corresponding to eight directions of eye movement (up, down, left, right, up-left, down-left, up-right, and down-right) and blinking. The recognition and processing of these eight different features were achieved in real-life conditions, demonstrating that this device can reliably measure the features of EOG signals. This system and its classification procedure provide an effective method for identifying eye movements. Additionally, it may be applied to study eye functions in real-life conditions in the near future.
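The published feature set is not reproduced here; as a naive illustration, the sketch below maps one detected pair of horizontal and vertical EOG deflections to one of the eight directions by angle binning (the amplitude threshold is an assumption, and blinks would need separate handling):

```python
import numpy as np

DIRECTIONS = ["right", "up-right", "up", "up-left",
              "left", "down-left", "down", "down-right"]

def classify_direction(h_amp, v_amp, min_amp=50.0):
    # h_amp / v_amp: signed peak deflections (e.g., in microvolts) of the
    # horizontal and vertical EOG channels for one detected movement.
    if np.hypot(h_amp, v_amp) < min_amp:
        return None                      # too small; blinks handled elsewhere
    angle = np.degrees(np.arctan2(v_amp, h_amp)) % 360.0
    return DIRECTIONS[int((angle + 22.5) // 45) % 8]

print(classify_direction(80.0, 5.0))     # "right"
print(classify_direction(-60.0, 60.0))   # "up-left"
```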
Pharmacological Treatment Effects on Eye Movement Control
ERIC Educational Resources Information Center
Reilly, James L.; Lencer, Rebekka; Bishop, Jeffrey R.; Keedy, Sarah; Sweeney, John A.
2008-01-01
The increasing use of eye movement paradigms to assess the functional integrity of brain systems involved in sensorimotor and cognitive processing in clinical disorders requires greater attention to effects of pharmacological treatments on these systems. This is needed to better differentiate disease and medication effects in clinical samples, to…
Enhanced Video-Oculography System
NASA Technical Reports Server (NTRS)
Moore, Steven T.; MacDougall, Hamish G.
2009-01-01
A previously developed video-oculography system has been enhanced for use in measuring vestibulo-ocular reflexes of a human subject in a centrifuge, motor vehicle, or other setting. The system as previously developed included a lightweight digital video camera mounted on goggles. The left eye was illuminated by an infrared light-emitting diode via a dichroic mirror, and the camera captured images of the left eye in infrared light. To extract eye-movement data, the digitized video images were processed by software running in a laptop computer. Eye movements were calibrated by having the subject view a target pattern, fixed with respect to the subject's head, generated by a goggle-mounted laser with a diffraction grating. The system as enhanced includes a second camera for imaging the scene from the subject's perspective, and two inertial measurement units (IMUs) for measuring linear accelerations and rates of rotation for computing head movements. One IMU is mounted on the goggles, the other on the centrifuge or vehicle frame. All eye-movement and head-motion data are time-stamped. In addition, the subject's point of regard is superimposed on each scene image to enable analysis of patterns of gaze in real time.
Brain stem omnipause neurons and the control of combined eye-head gaze saccades in the alert cat.
Paré, M; Guitton, D
1998-06-01
When the head is unrestrained, rapid displacements of the visual axis-gaze shifts (eye-re-space)-are made by coordinated movements of the eyes (eye-re-head) and head (head-re-space). To address the problem of the neural control of gaze shifts, we studied and contrasted the discharges of omnipause neurons (OPNs) during a variety of combined eye-head gaze shifts and head-fixed eye saccades executed by alert cats. OPNs discharged tonically during intersaccadic intervals and at a reduced level during slow perisaccadic gaze movements sometimes accompanying saccades. Their activity ceased for the duration of the saccadic gaze shifts the animal executed, either by head-fixed eye saccades alone or by combined eye-head movements. This was true for all types of gaze shifts studied: active movements to visual targets; passive movements induced by whole-body rotation or by head rotation about stationary body; and electrically evoked movements by stimulation of the caudal part of the superior colliculus (SC), a central structure for gaze control. For combined eye-head gaze shifts, the OPN pause was therefore not correlated to the eye-in-head trajectory. For instance, in active gaze movements, the end of the pause was better correlated with the gaze end than with either the eye saccade end or the time of eye counterrotation. The hypothesis that cat OPNs participate in controlling gaze shifts is supported by these results, and also by the observation that the movements of both the eyes and the head were transiently interrupted by stimulation of OPNs during gaze shifts. However, we found that the OPN pause could be dissociated from the gaze-motor-error signal producing the gaze shift. First, OPNs resumed discharging when perturbation of head motion briefly interrupted a gaze shift before its intended amplitude was attained. Second, stimulation of caudal SC sites in head-free cat elicited large head-free gaze shifts consistent with the creation of a large gaze-motor-error signal. However, stimulation of the same sites in head-fixed cat produced small "goal-directed" eye saccades, and OPNs paused only for the duration of the latter; neither a pause nor an eye movement occurred when the same stimulation was applied with the eyes at the goal location. We conclude that OPNs can be controlled by neither a simple eye control system nor an absolute gaze control system. Our data cannot be accounted for by existing models describing the control of combined eye-head gaze shifts and therefore put new constraints on future models, which will have to incorporate all the various signals that act synergistically to control gaze shifts.
Iáñez, Eduardo; Azorin, Jose M.; Perez-Vidal, Carlos
2013-01-01
This paper describes a human-computer interface based on electro-oculography (EOG) that allows interaction with a computer using eye movement. The EOG registers the movement of the eye by measuring, through electrodes, the difference of potential between the cornea and the retina. A new pair of EOG glasses have been designed to improve the user's comfort and to remove the manual procedure of placing the EOG electrodes around the user's eye. The interface, which includes the EOG electrodes, uses a new processing algorithm that is able to detect the gaze direction and the blink of the eyes from the EOG signals. The system reliably enabled subjects to control the movement of a dot on a video screen. PMID:23843986
Contextual effects on motion perception and smooth pursuit eye movements.
Spering, Miriam; Gegenfurtner, Karl R
2008-08-15
Smooth pursuit eye movements are continuous, slow rotations of the eyes that allow us to follow the motion of a visual object of interest. These movements are closely related to sensory inputs from the visual motion processing system. To track a moving object in the natural environment, its motion first has to be segregated from the motion signals provided by surrounding stimuli. Here, we review experiments on the effect of the visual context on motion processing with a focus on the relationship between motion perception and smooth pursuit eye movements. While perception and pursuit are closely linked, we show that they can behave quite distinctly when required by the visual context.
NASA Technical Reports Server (NTRS)
Beutter, Brent R.; Stone, Leland S.
1997-01-01
Although numerous studies have examined the relationship between smooth-pursuit eye movements and motion perception, it remains unresolved whether a common motion-processing system subserves both perception and pursuit. To address this question, we simultaneously recorded perceptual direction judgments and the concomitant smooth eye movement response to a plaid stimulus that we have previously shown generates systematic perceptual errors. We measured the perceptual direction biases psychophysically and the smooth eye-movement direction biases using two methods (standard averaging and oculometric analysis). We found that the perceptual and oculomotor biases were nearly identical, suggesting that pursuit and perception share a critical motion processing stage, perhaps in area MT or MST of extrastriate visual cortex.
NASA Technical Reports Server (NTRS)
Beutter, B. R.; Stone, L. S.
1998-01-01
Although numerous studies have examined the relationship between smooth-pursuit eye movements and motion perception, it remains unresolved whether a common motion-processing system subserves both perception and pursuit. To address this question, we simultaneously recorded perceptual direction judgments and the concomitant smooth eye-movement response to a plaid stimulus that we have previously shown generates systematic perceptual errors. We measured the perceptual direction biases psychophysically and the smooth eye-movement direction biases using two methods (standard averaging and oculometric analysis). We found that the perceptual and oculomotor biases were nearly identical, suggesting that pursuit and perception share a critical motion processing stage, perhaps in area MT or MST of extrastriate visual cortex.
NASA Technical Reports Server (NTRS)
Krauzlis, Rich; Stone, Leland; Null, Cynthia H. (Technical Monitor)
1998-01-01
When viewing objects, primates use a combination of saccadic and pursuit eye movements to stabilize the retinal image of the object of regard within the high-acuity region near the fovea. Although these movements involve widespread regions of the nervous system, they mix seamlessly in normal behavior. Saccades are discrete movements that quickly direct the eyes toward a visual target, thereby translating the image of the target from an eccentric retinal location to the fovea. In contrast, pursuit is a continuous movement that slowly rotates the eyes to compensate for the motion of the visual target, minimizing the blur that can compromise visual acuity. While other mammalian species can generate smooth optokinetic eye movements - which track the motion of the entire visual surround - only primates can smoothly pursue a single small element within a complex visual scene, regardless of the motion elsewhere on the retina. This ability likely reflects the greater ability of primates to segment the visual scene, to identify individual visual objects, and to select a target of interest.
Measuring saccade peak velocity using a low-frequency sampling rate of 50 Hz.
Wierts, Roel; Janssen, Maurice J A; Kingma, Herman
2008-12-01
During the last decades, small head-mounted video eye trackers have been developed in order to record eye movements. Real-time systems with a low sampling frequency of 50/60 Hz are used in clinical vestibular practice, but are generally considered unsuitable for measuring fast eye movements. In this paper, it is shown that saccadic eye movements with an amplitude of at least 5 degrees can, to a good approximation, be considered band-limited up to a frequency of 25-30 Hz. Using the Nyquist theorem to reconstruct saccadic eye movement signals at higher temporal resolution, it is shown that accurate values for saccade peak velocities recorded at 50 Hz can be obtained, whereas saccade peak accelerations and decelerations cannot. In conclusion, video eye trackers sampling at 50/60 Hz are appropriate for detecting clinically relevant saccade peak velocities, in contrast to what has been stated until now.
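A minimal sketch of the underlying idea: if the saccade is effectively band-limited below the Nyquist frequency of a 50 Hz recording, band-limited interpolation can reconstruct a denser trace from which peak velocity is better estimated. The saccade waveform and upsampling factor are illustrative:

```python
import numpy as np
from scipy.signal import resample

fs = 50.0                                   # low sampling rate (Hz)
t = np.arange(0.0, 0.4, 1.0 / fs)
tau = 0.01
# Illustrative 10-deg saccade with a smooth sigmoidal position profile;
# its true peak velocity is amplitude / (4 * tau) = 250 deg/s.
position = 10.0 / (1.0 + np.exp(-(t - 0.2) / tau))

# Band-limited (FFT) interpolation; a linear trend is removed first so the
# periodic extension implied by the FFT has no wrap-around discontinuity.
factor = 16                                 # reconstruct at 800 Hz
trend = np.linspace(position[0], position[-1], position.size)
pos_hi = resample(position - trend, position.size * factor)
pos_hi += np.linspace(position[0], position[-1], pos_hi.size)
fs_hi = fs * factor

peak_raw = np.max(np.diff(position)) * fs
peak_hi = np.max(np.diff(pos_hi)) * fs_hi
print(f"peak velocity from the raw 50 Hz samples: {peak_raw:.0f} deg/s")
print(f"peak velocity after reconstruction:       {peak_hi:.0f} deg/s")
```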
Cognitive Control of Saccadic Eye Movements
ERIC Educational Resources Information Center
Hutton, S. B.
2008-01-01
The saccadic eye movement system provides researchers with a powerful tool with which to explore the cognitive control of behaviour. It is a behavioural system whose limited output can be measured with exceptional precision, and whose input can be controlled and manipulated in subtle ways. A range of cognitive processes (notably those involved in…
Saccadic Eye Movements Impose a Natural Bottleneck on Visual Short-Term Memory
ERIC Educational Resources Information Center
Ohl, Sven; Rolfs, Martin
2017-01-01
Visual short-term memory (VSTM) is a crucial repository of information when events unfold rapidly before our eyes, yet it maintains only a fraction of the sensory information encoded by the visual system. Here, we tested the hypothesis that saccadic eye movements provide a natural bottleneck for the transition of fragile content in sensory memory…
Bourgin, P; Lebrand, C; Escourrou, P; Gaultier, C; Franc, B; Hamon, M; Adrien, J
1997-03-01
Rapid eye movement sleep can be elicited in the rat by microinjection of the cholinergic agonist carbachol into the oral pontine reticular nucleus. Intracerebroventricular administration, during the light period, of vasoactive intestinal peptide enhances rapid eye movement sleep in several species. Since this peptide is co-localized with acetylcholine in many neurons in the central nervous system, it was assumed that the oral pontine tegmentum could also be one target for vasoactive intestinal peptide to induce rapid eye movement sleep. This hypothesis was tested by recording the sleep-wakefulness cycle in freely-moving rats injected with vasoactive intestinal peptide or its fragments (1-12 and 10-28) directly into the oral pontine reticular nucleus. When administered into the posterior part of this nucleus, vasoactive intestinal peptide at 1 and 10 ng (in 0.1 microliter of saline), but not its fragments, induced a 2-fold enhancement of rapid eye movement sleep during 4 h, at the expense of wakefulness. At the dose of 10 ng, a significant increase in rapid eye movement sleep persisted for up to 8 h. Moreover, when the peptide was injected into the centre of the positive zone, rapid eye movement sleep was enhanced for three to eight consecutive days. These data provide the first evidence that rapid eye movement sleep can be elicited in both the short and the long term by a single intracerebral microinjection of vasoactive intestinal peptide. Peptidergic mechanisms, possibly in association with cholinergic mechanisms, within the caudal part of the oral pontine reticular nucleus may play a critical role in the long-term regulation of rapid eye movement sleep in rats.
EYE MOVEMENT RECORDING AND NONLINEAR DYNAMICS ANALYSIS – THE CASE OF SACCADES
Aştefănoaei, Corina; Pretegiani, Elena; Optican, L.M.; Creangă, Dorina; Rufa, Alessandra
2015-01-01
Evidence of a chaotic behavioral trend in eye movement dynamics was examined in the case of a saccadic temporal series collected from a healthy human subject. Saccades are high-velocity eye movements of very short duration; their recording is relatively accessible, so the resulting data series can be studied computationally to understand neural processing in a motor system. The aim of this study was to assess the complexity degree in the eye movement dynamics. To do this we analyzed the saccadic temporal series recorded with an infrared camera eye tracker from a healthy human subject in a special experimental arrangement which provides continuous records of eye position, both saccades (eye shifting movements) and fixations (focusing over regions of interest, with rapid, small fluctuations). The semi-quantitative approach used in this paper to study eye function from the viewpoint of non-linear dynamics was based on several computational tests (power spectrum, state-space portrait and its fractal dimension, Hurst exponent and largest Lyapunov exponent) derived from chaos theory. A highly complex dynamical trend was found. The largest Lyapunov exponent test suggested bi-stability of the cellular membrane resting potential during the saccadic experiment. PMID:25698889
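As an example of one of the listed tests, the sketch below estimates the Hurst exponent of a recorded series by classical rescaled-range (R/S) analysis; the window sizes are illustrative and this is not the authors' exact implementation:

```python
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    # Classical rescaled-range (R/S) estimate of the Hurst exponent.
    x = np.asarray(x, dtype=float)
    rs_means = []
    for n in window_sizes:
        rs = []
        for start in range(0, x.size - n + 1, n):
            seg = x[start:start + n]
            dev = np.cumsum(seg - seg.mean())      # cumulative deviations
            r = dev.max() - dev.min()              # range
            s = seg.std()                          # scale
            if s > 0:
                rs.append(r / s)
        rs_means.append(np.mean(rs))
    # H is the slope of log(R/S) versus log(window size).
    h, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return h

# Sanity check: uncorrelated noise should give H near 0.5 (classical R/S has
# a small-sample bias that can push the estimate somewhat higher).
rng = np.random.default_rng(0)
print(hurst_rs(rng.standard_normal(4096)))
```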
Gaze Estimation Method Using Analysis of Electrooculogram Signals and Kinect Sensor
Tanno, Koichi
2017-01-01
A gaze estimation system is one of the communication methods for severely disabled people who cannot perform gestures and speech. We previously developed an eye tracking method using a compact and light electrooculogram (EOG) signal, but its accuracy is not very high. In the present study, we conducted experiments to investigate the EOG component strongly correlated with the change of eye movements. The experiments in this study are of two types: experiments to see objects only by eye movements and experiments to see objects by face and eye movements. The experimental results show the possibility of an eye tracking method using EOG signals and a Kinect sensor. PMID:28912800
Eibenberger, Karin; Eibenberger, Bernhard; Rucci, Michele
2016-08-01
The precise measurement of eye movements is important for investigating vision, oculomotor control and vestibular function. The magnetic scleral search coil technique is one of the most precise measurement techniques for recording eye movements with very high spatial (≈ 1 arcmin) and temporal (>kHz) resolution. The technique is based on measuring the voltage induced in a search coil by a large magnetic field. This search coil is embedded in a contact lens worn by a human subject. The measured voltage is in direct relationship to the orientation of the eye in space. This requires a magnetic field with a high homogeneity in the center, since otherwise the field inhomogeneity would give the false impression of a rotation of the eye due to a translational movement of the head. To circumvent this problem, a bite bar typically restricts head movement to a minimum. However, the need often emerges to precisely record eye movements under natural viewing conditions. To this end, one needs a magnetic field that is uniform over a large area. In this paper, we present numerical and finite element simulations of the magnetic flux density of different coil geometries that could be used for search coil recordings. Based on the results, we built a 2.2 × 2.2 × 2.2 meter coil frame with a set of 3 × 4 coils to generate a 3D magnetic field and compared the measured flux density with our simulation results. In agreement with simulation results, the system yields a highly uniform field enabling high-resolution recordings of eye movements.
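The authors simulate their own square multi-coil geometry; purely to illustrate why geometry matters for homogeneity, the sketch below compares the on-axis flux density of a single circular loop with that of a Helmholtz pair, using the standard analytic on-axis formula (all dimensions are illustrative):

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

def loop_axial_field(z, radius=1.0, current=1.0, z0=0.0):
    # On-axis flux density of a single circular loop centered at z0.
    return MU0 * current * radius**2 / (2.0 * (radius**2 + (z - z0)**2) ** 1.5)

def helmholtz_axial_field(z, radius=1.0, current=1.0):
    # Two coaxial loops separated by one radius (Helmholtz condition).
    return (loop_axial_field(z, radius, current, -radius / 2)
            + loop_axial_field(z, radius, current, +radius / 2))

z = np.linspace(-0.2, 0.2, 201)          # +/- 20 cm around the center
single = loop_axial_field(z)
pair = helmholtz_axial_field(z)
# Relative field variation over the central region: the pair is far more uniform.
print(f"single loop variation:    {100 * (single.max() - single.min()) / single.mean():.2f}%")
print(f"Helmholtz pair variation: {100 * (pair.max() - pair.min()) / pair.mean():.2f}%")
```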
Is there a common motor dysregulation in sleepwalking and REM sleep behaviour disorder?
Haridi, Mehdi; Weyn Banningh, Sebastian; Clé, Marion; Leu-Semenescu, Smaranda; Vidailhet, Marie; Arnulf, Isabelle
2017-10-01
This study sought to determine if there is any overlap between the two major non-rapid eye movement and rapid eye movement parasomnias, i.e. sleepwalking/sleep terrors and rapid eye movement sleep behaviour disorder. We assessed adult patients with sleepwalking/sleep terrors using rapid eye movement sleep behaviour disorder screening questionnaires and determined if they had enhanced muscle tone during rapid eye movement sleep. Conversely, we assessed rapid eye movement sleep behaviour disorder patients using the Paris Arousal Disorders Severity Scale and determined if they had more N3 awakenings. The 251 participants included 64 patients with rapid eye movement sleep behaviour disorder (29 with idiopathic rapid eye movement sleep behaviour disorder and 35 with rapid eye movement sleep behaviour disorder associated with Parkinson's disease), 62 patients with sleepwalking/sleep terrors, 66 old healthy controls (age-matched with the rapid eye movement sleep behaviour disorder group) and 59 young healthy controls (age-matched with the sleepwalking/sleep terrors group). They completed the rapid eye movement sleep behaviour disorder screening questionnaire, rapid eye movement sleep behaviour disorder single question and Paris Arousal Disorders Severity Scale. In addition, all the participants underwent a video-polysomnography. The sleepwalking/sleep terrors patients scored positive on rapid eye movement sleep behaviour disorder scales and had a higher percentage of 'any' phasic rapid eye movement sleep without atonia when compared with controls; however, these patients did not have higher tonic rapid eye movement sleep without atonia or complex behaviours during rapid eye movement sleep. Patients with rapid eye movement sleep behaviour disorder had moderately elevated scores on the Paris Arousal Disorders Severity Scale but did not exhibit more N3 arousals (suggestive of non-rapid eye movement parasomnia) than the control group. These results indicate that dream-enacting behaviours (assessed by rapid eye movement sleep behaviour disorder screening questionnaires) are commonly reported by sleepwalking/sleep terrors patients, thus decreasing the questionnaire's specificity. Furthermore, sleepwalking/sleep terrors patients have excessive twitching during rapid eye movement sleep, which may result either from a higher dreaming activity in rapid eye movement sleep or from a more generalised non-rapid eye movement/rapid eye movement motor dyscontrol during sleep. © 2017 European Sleep Research Society.
An ocular biomechanic model for dynamic simulation of different eye movements.
Iskander, J; Hossny, M; Nahavandi, S; Del Porto, L
2018-04-11
Simulating and analysing eye movement is useful for assessing the visual system's contribution to discomfort with respect to body movements, especially in virtual environments where simulation sickness might occur. It can also be used in the design of an eye prosthesis or a humanoid robot eye. In this paper, we present two biomechanic ocular models that are easily integrated into the available musculoskeletal models. The model was previously used to simulate eye-head coordination. The models are used to simulate and analyse eye movements. The proposed models are based on physiological and kinematic properties of the human eye. They incorporate an eye-globe, orbital suspension tissues and six muscles with their connective tissues (pulleys). Pulleys were incorporated in rectus and inferior oblique muscles. The two proposed models are the passive pulley and the active pulley models. Dynamic simulations of different eye movements, including fixation, saccade and smooth pursuit, are performed to validate both models. The resultant force-length curves of the models were similar to the experimental data. The simulation results show that the proposed models are suitable to generate eye movement simulations with results comparable to other musculoskeletal models. The maximum kinematic root mean square errors (RMSEs) are 5.68° and 4.35° for the passive and active pulley models, respectively. The analysis of the muscle forces showed realistic muscle activation with increased muscle synergy in the active pulley model. Copyright © 2018 Elsevier Ltd. All rights reserved.
Bueeler, Michael; Mrochen, Michael
2005-01-01
The aim of this theoretical work was to investigate the robustness of scanning spot laser treatments with different laser spot diameters and peak ablation depths in case of incomplete compensation of eye movements due to eye-tracker latency. Scanning spot corrections of 3rd to 5th Zernike order wavefront errors were numerically simulated. Measured eye-movement data were used to calculate the positioning error of each laser shot assuming eye-tracker latencies of 0, 5, 30, and 100 ms, and for the case of no eye tracking. The single spot ablation depth ranged from 0.25 to 1.0 microm and the spot diameter from 250 to 1000 microm. The quality of the ablation was rated by the postoperative surface variance and the Strehl intensity ratio, which was calculated after a low-pass filter was applied to simulate epithelial surface smoothing. Treatments performed with nearly ideal eye tracking (latency approximately 0) provide the best results with a small laser spot (0.25 mm) and a small ablation depth (0.25 microm). However, combinations of a large spot diameter (1000 microm) and a small ablation depth per pulse (0.25 microm) yield better results for latencies above a certain threshold to be determined specifically. Treatments performed with tracker latencies on the order of 100 ms yield results similar to treatments done completely without eye-movement compensation. CONCLUSIONS: Reduction of spot diameter was shown to make the correction more susceptible to eye-movement-induced error. A smaller spot size is only beneficial when eye movement is neutralized with a tracking system with a latency <5 ms.
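A toy illustration of the latency effect, not the authors' ablation simulation: each pulse is aimed at where the eye was one latency interval earlier, so the per-pulse placement error equals the eye displacement over that interval (the synthetic eye trace below is an assumption):

```python
import numpy as np

def placement_errors(eye_x, eye_y, fs, latency_s):
    # Eye position sampled at fs; the tracker reports positions latency_s old,
    # so each pulse lands offset by the eye movement during that latency.
    lag = int(round(latency_s * fs))
    if lag == 0:
        return np.zeros(eye_x.size - 1)
    dx = eye_x[lag:] - eye_x[:-lag]
    dy = eye_y[lag:] - eye_y[:-lag]
    return np.hypot(dx, dy)

# Synthetic drifting/jittering eye trace (mm), 1 kHz, 2 s.
rng = np.random.default_rng(2)
fs = 1000.0
eye_x = np.cumsum(rng.standard_normal(2000)) * 0.002
eye_y = np.cumsum(rng.standard_normal(2000)) * 0.002
for latency in (0.005, 0.030, 0.100):
    err = placement_errors(eye_x, eye_y, fs, latency)
    print(f"latency {1000 * latency:.0f} ms -> RMS placement error "
          f"{np.sqrt(np.mean(err ** 2)):.3f} mm")
```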
Visual Data Mining: An Exploratory Approach to Analyzing Temporal Patterns of Eye Movements
ERIC Educational Resources Information Center
Yu, Chen; Yurovsky, Daniel; Xu, Tian
2012-01-01
Infant eye movements are an important behavioral resource to understand early human development and learning. But the complexity and amount of gaze data recorded from state-of-the-art eye-tracking systems also pose a challenge: how does one make sense of such dense data? Toward this goal, this article describes an interactive approach based on…
Rhesus Monkeys Behave As If They Perceive the Duncker Illusion
Zivotofsky, A. Z.; Goldberg, M. E.; Powell, K. D.
2008-01-01
The visual system uses the pattern of motion on the retina to analyze the motion of objects in the world, and the motion of the observer him/herself. Distinguishing between retinal motion evoked by movement of the retina in space and retinal motion evoked by movement of objects in the environment is computationally difficult, and the human visual system frequently misinterprets the meaning of retinal motion. In this study, we demonstrate that the visual system of the Rhesus monkey also misinterprets retinal motion. We show that monkeys erroneously report the trajectories of pursuit targets or their own pursuit eye movements during an epoch of smooth pursuit across an orthogonally moving background. Furthermore, when they make saccades to the spatial location of stimuli that flashed early in an epoch of smooth pursuit or fixation, they make large errors that appear to take into account the erroneous smooth eye movement that they report in the first experiment, and not the eye movement that they actually make. PMID:16102233
Alkan, Yelda; Biswal, Bharat B.; Alvarez, Tara L.
2011-01-01
Purpose: Eye movement research has traditionally studied solely saccade and/or vergence eye movements by isolating these systems within a laboratory setting. While the neural correlates of saccadic eye movements are established, few studies have quantified the functional activity of vergence eye movements using fMRI. This study mapped the neural substrates of vergence eye movements and compared them to saccades to elucidate the spatial commonality and differentiation between these systems. Methodology: The stimulus was presented in a block design where the ‘off’ stimulus was a sustained fixation and the ‘on’ stimulus was random vergence or saccadic eye movements. Data were collected with a 3T scanner. A general linear model (GLM) was used in conjunction with cluster size to determine significantly active regions. A paired t-test of the GLM beta weight coefficients was computed between the saccade and vergence functional activities to test the hypothesis that vergence and saccadic stimulation would have spatial differentiation in addition to shared neural substrates. Results: Segregated functional activation was observed within the frontal eye fields where a portion of the functional activity from the vergence task was located anterior to the saccadic functional activity (z>2.3; p<0.03). An area within the midbrain was significantly correlated with the experimental design for the vergence but not the saccade data set. Similar functional activation was observed within the following regions of interest: the supplementary eye field, dorsolateral prefrontal cortex, ventral lateral prefrontal cortex, lateral intraparietal area, cuneus, precuneus, anterior and posterior cingulates, and cerebellar vermis. The functional activity from these regions was not different between the vergence and saccade data sets assessed by analyzing the beta weights of the paired t-test (p>0.2). Conclusion: Functional MRI can elucidate the differences between the vergence and saccade neural substrates within the frontal eye fields and midbrain. PMID:22073141
The role of eye movements in depth from motion parallax during infancy
Nawrot, Elizabeth; Nawrot, Mark
2013-01-01
Motion parallax is a motion-based, monocular depth cue that uses an object's relative motion and velocity as a cue to relative depth. In adults, and in monkeys, a smooth pursuit eye movement signal is used to disambiguate the depth-sign provided by these relative motion cues. The current study investigates infants' perception of depth from motion parallax and the development of two oculomotor functions, smooth pursuit and the ocular following response (OFR) eye movements. Infants 8 to 20 weeks of age were presented with three tasks in a single session: depth from motion parallax, smooth pursuit tracking, and OFR to translation. The development of smooth pursuit was significantly related to age, as was sensitivity to motion parallax. OFR eye movements also corresponded to both age and smooth pursuit gain, with groups of infants demonstrating asymmetric function in both types of eye movements. These results suggest that the development of the eye movement system may play a crucial role in the sensitivity to depth from motion parallax in infancy. Moreover, describing the development of these oculomotor functions in relation to depth perception may aid in the understanding of certain visual dysfunctions. PMID:24353309
Like a rolling stone: naturalistic visual kinematics facilitate tracking eye movements.
Souto, David; Kerzel, Dirk
2013-02-06
Newtonian physics constrains object kinematics in the real world. We asked whether eye movements towards tracked objects depend on their compliance with those constraints. In particular, the force of gravity constrains round objects to roll on the ground with a particular rotational and translational motion. We measured tracking eye movements towards rolling objects. We found that objects with rotational and translational motion that was congruent with an object rolling on the ground elicited faster tracking eye movements during pursuit initiation than incongruent stimuli. When we compared these conditions with a condition without any rotational component, we essentially obtained benefits of congruence and, to a lesser extent, costs from incongruence. Anticipatory pursuit responses showed no congruence effect, suggesting that the effect is based on visually-driven predictions, not on velocity storage. We suggest that the eye movement system incorporates information about object kinematics acquired by a lifetime of experience with visual stimuli obeying the laws of Newtonian physics.
Saccadic eye movement applications for psychiatric disorders
Bittencourt, Juliana; Velasques, Bruna; Teixeira, Silmar; Basile, Luis F; Salles, José Inácio; Nardi, Antonio Egídio; Budde, Henning; Cagy, Mauricio; Piedade, Roberto; Ribeiro, Pedro
2013-01-01
Objective: The study presented here analyzed the patterns of relationship between oculomotor performance and psychopathology, focusing on depression, bipolar disorder, schizophrenia, attention-deficit hyperactivity disorder, and anxiety disorder. Methods: Scientific articles published from 1967 to 2013 in the PubMed/Medline, ISI Web of Knowledge, Cochrane, and SciELO databases were reviewed. Results: Saccadic eye movement appears to be heavily involved in the psychiatric diseases covered in this review via a direct mechanism. The changes seen across studies in the execution of eye movement tasks by patients with psychopathologies confirm that eye movement is associated with the cognitive and motor system. Conclusion: Saccadic eye movement changes appear to be heavily involved in the psychiatric disorders covered in this review and may be considered a possible marker of some disorders. The few existing studies that approach the topic demonstrate a need to improve the experimental paradigms, as well as the methods of analysis. Most of them report behavioral variables (latency/reaction time), though electrophysiological measures are absent. PMID:24072973
EMDR Effects on Pursuit Eye Movements
Kapoula, Zoi; Yang, Qing; Bonnet, Audrey; Bourtoire, Pauline; Sandretto, Jean
2010-01-01
This study aimed to objectively assess the quality of smooth pursuit eye movements in a standard laboratory task before and after an Eye Movement Desensitization and Reprocessing (EMDR) session conducted with seven healthy volunteers. EMDR was applied to autobiographic worries causing moderate distress. The EMDR session was complete in 5 out of the 7 cases; distress measured by SUDS (Subjective Units of Discomfort Scale) decreased to a near-zero value. Smooth pursuit eye movements were recorded by an Eyelink II video system before and after EMDR. For the five complete sessions, pursuit eye movements improved after the EMDR session. Notably, the number of saccade intrusions—catch-up saccades (CUS)—decreased and, reciprocally, there was an increase in the smooth components of the pursuit. Such an increase in the smoothness of the pursuit presumably reflects an improvement in the use of visual attention needed to follow the target accurately. Perhaps EMDR reduces distress, thereby activating a cholinergic effect known to improve ocular pursuit. PMID:20505828
Hawk Eyes I: Diurnal Raptors Differ in Visual Fields and Degree of Eye Movement
O'Rourke, Colleen T.; Hall, Margaret I.; Pitlik, Todd; Fernández-Juricic, Esteban
2010-01-01
Background: Different strategies to search and detect prey may place specific demands on sensory modalities. We studied visual field configuration, degree of eye movement, and orbit orientation in three diurnal raptors belonging to the Accipitridae and Falconidae families. Methodology/Principal Findings: We used an ophthalmoscopic reflex technique and an integrated 3D digitizer system. We found inter-specific variation in visual field configuration and degree of eye movement, but not in orbit orientation. Red-tailed Hawks have relatively small binocular areas (∼33°) and wide blind areas (∼82°), but intermediate degree of eye movement (∼5°), which underscores the importance of lateral vision rather than binocular vision to scan for distant prey in open areas. Cooper's Hawks have relatively wide binocular fields (∼36°), small blind areas (∼60°), and high degree of eye movement (∼8°), which may increase visual coverage and enhance prey detection in closed habitats. Additionally, we found that Cooper's Hawks can visually inspect the items held in the tip of the bill, which may facilitate food handling. American Kestrels have intermediate-sized binocular and lateral areas that may be used in prey detection at different distances through stereopsis and motion parallax, whereas the low degree of eye movement (∼1°) may help stabilize the image when hovering above prey before an attack. Conclusions: We conclude that: (a) there are between-species differences in visual field configuration in these diurnal raptors; (b) these differences are consistent with prey searching strategies and degree of visual obstruction in the environment (e.g., open and closed habitats); (c) variations in the degree of eye movement between species appear associated with foraging strategies; and (d) the size of the binocular and blind areas in hawks can vary substantially due to eye movements. Inter-specific variation in visual fields and eye movements can influence behavioral strategies to visually search for and track prey while perching. PMID:20877645
Via, Riccardo; Fassi, Aurora; Fattori, Giovanni; Fontana, Giulia; Pella, Andrea; Tagaste, Barbara; Riboldi, Marco; Ciocca, Mario; Orecchia, Roberto; Baroni, Guido
2015-05-01
External beam radiotherapy currently represents an important therapeutic strategy for the treatment of intraocular tumors. Accurate target localization and efficient compensation of involuntary eye movements are crucial to avoid deviations in dose distribution with respect to the treatment plan. This paper describes an eye tracking system (ETS) based on noninvasive infrared video imaging. The system was designed for capturing the tridimensional (3D) ocular motion and provides an on-line estimation of intraocular lesions position based on a priori knowledge coming from volumetric imaging. Eye tracking is performed by localizing cornea and pupil centers on stereo images captured by two calibrated video cameras, exploiting eye reflections produced by infrared illumination. Additionally, torsional eye movements are detected by template matching in the iris region of eye images. This information allows estimating the 3D position and orientation of the eye by means of an eye local reference system. By combining ETS measurements with volumetric imaging for treatment planning [computed tomography (CT) and magnetic resonance (MR)], one is able to map the position of the lesion to be treated in local eye coordinates, thus enabling real-time tumor referencing during treatment setup and irradiation. Experimental tests on an eye phantom and seven healthy subjects were performed to assess ETS tracking accuracy. Measurements on phantom showed an overall median accuracy within 0.16 mm and 0.40° for translations and rotations, respectively. Torsional movements were affected by 0.28° median uncertainty. On healthy subjects, the gaze direction error ranged between 0.19° and 0.82° at a median working distance of 29 cm. The median processing time of the eye tracking algorithm was 18.60 ms, thus allowing eye monitoring up to 50 Hz. A noninvasive ETS prototype was designed to perform real-time target localization and eye movement monitoring during ocular radiotherapy treatments. The device aims at improving state-of-the-art invasive procedures based on surgical implantation of radiopaque clips and repeated acquisition of X-ray images, with expected positive effects on treatment quality and patient outcome.
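A highly simplified sketch of the geometric core, assuming the 3D centers of corneal curvature and of the pupil have already been triangulated from the calibrated stereo pair; the optical axis is then the line through these two points, ignoring torsion and the offset between optical and visual axes:

```python
import numpy as np

def optical_axis(cornea_center, pupil_center):
    # Unit vector along the eye's optical axis: the line through the 3D center
    # of corneal curvature and the pupil center (same units, e.g. mm), both
    # assumed already triangulated from the calibrated stereo cameras.
    v = np.asarray(pupil_center, float) - np.asarray(cornea_center, float)
    return v / np.linalg.norm(v)

def rotation_from_reference_deg(axis, reference_axis):
    # Eye rotation relative to a reference direction recorded, for example,
    # while the subject fixates a known calibration target.
    cosang = np.clip(np.dot(axis, reference_axis), -1.0, 1.0)
    return np.degrees(np.arccos(cosang))

ref = optical_axis([0.0, 0.0, 0.0], [0.0, 0.0, 4.5])      # primary position
cur = optical_axis([0.0, 0.0, 0.0], [0.8, 0.0, 4.43])     # after a small gaze shift
print(f"eye rotation: {rotation_from_reference_deg(cur, ref):.1f} deg")
```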
The Right Track for Vision Correction
NASA Technical Reports Server (NTRS)
2003-01-01
More and more people are putting away their eyeglasses and contact lenses as a result of laser vision correction surgery. LASIK, the most widely performed version of this surgical procedure, improves vision by reshaping the cornea, the clear front surface of the eye, using an excimer laser. One excimer laser system, Alcon's LADARVision 4000, utilizes a laser radar (LADAR) eye tracking device that gives it unmatched precision. During LASIK surgery, laser pulses must be accurately placed to reshape the cornea. A challenge to this procedure is the patient's constant eye movement. A person's eyes make small, involuntary movements known as saccadic movements about 100 times per second. Since the saccadic movements will not stop during LASIK surgery, most excimer laser systems use an eye tracking device that measures the movements and guides the placement of the laser beam. LADARVision's eye tracking device stems from the LADAR technology originally developed through several Small Business Innovation Research (SBIR) contracts with NASA's Johnson Space Center and the U.S. Department of Defense's Ballistic Missile Defense Office (BMDO). In the 1980s, Johnson awarded Autonomous Technologies Corporation a Phase I SBIR contract to develop technology for autonomous rendezvous and docking of space vehicles to service satellites. During Phase II of the Johnson SBIR contract, Autonomous Technologies developed a prototype range and velocity imaging LADAR to demonstrate technology that could be used for this purpose.
Guillaume, Alain; Pélisson, Denis
2006-12-15
Shifting gaze requires precise coordination of eye and head movements. It is clear that the superior colliculus (SC) is involved with saccadic gaze shifts. Here we investigate its role in controlling both eye and head movements during gaze shifts. Gaze shifts of the same amplitude can be evoked from different SC sites by controlled electrical microstimulation. To describe how the SC coordinates the eye and the head, we compare the characteristics of these amplitude-matched gaze shifts evoked from different SC sites. We show that matched amplitude gaze shifts elicited from progressively more caudal sites are progressively slower and associated with a greater head contribution. Stimulation at more caudal SC sites decreased the peak velocity of the eye but not of the head, suggesting that the lower peak gaze velocity for the caudal sites is due to the increased contribution of the slower-moving head. Eye-head coordination across the SC motor map is also indicated by the relative latencies of the eye and head movements. For some amplitudes of gaze shift, rostral stimulation evoked eye movement before head movement, whereas this reversed with caudal stimulation, which caused the head to move before the eyes. These results show that gaze shifts of similar amplitude evoked from different SC sites are produced with different kinematics and coordination of eye and head movements. In other words, gaze shifts evoked from different SC sites follow different amplitude-velocity curves, with different eye-head contributions. These findings shed light on mechanisms used by the central nervous system to translate a high-level motor representation (a desired gaze displacement on the SC map) into motor commands appropriate for the involved body segments (the eye and the head).
Eye-movements and ongoing task processing.
Burke, David T; Meleger, Alec; Schneider, Jeffrey C; Snyder, Jim; Dorvlo, Atsu S S; Al-Adawi, Samir
2003-06-01
This study tests the relation between eye-movements and thought processing. Subjects were given specific modality tasks (visual, gustatory, kinesthetic) and assessed on whether they responded with distinct eye-movements. Some subjects' eye-movements reflected ongoing thought processing. Instead of a universal pattern, as suggested by the neurolinguistic programming hypothesis, this study yielded subject-specific idiosyncratic eye-movements across all modalities. Included is a discussion of the neurolinguistic programming hypothesis regarding eye-movements and its implications for the eye-movement desensitization and reprocessing theory.
Shinozaki, Takahiro
2018-01-01
Human-computer interface systems whose input is based on eye movements can serve as a means of communication for patients with locked-in syndrome. Eye-writing is one such system; users can input characters by moving their eyes to follow the lines of the strokes corresponding to characters. Although this input method makes it easy for patients to get started because of their familiarity with handwriting, existing eye-writing systems suffer from slow input rates because they require a pause between input characters to simplify the automatic recognition process. In this paper, we propose a continuous eye-writing recognition system that achieves a rapid input rate because it accepts characters eye-written continuously, with no pauses. For recognition purposes, the proposed system first detects eye movements using electrooculography (EOG), and then a hidden Markov model (HMM) is applied to model the EOG signals and recognize the eye-written characters. Additionally, this paper investigates an EOG adaptation that uses a deep neural network (DNN)-based HMM. Experiments with six participants showed an average input speed of 27.9 character/min using Japanese Katakana as the input target characters. A Katakana character-recognition error rate of only 5.0% was achieved using 13.8 minutes of adaptation data. PMID:29425248
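The recognition stage described above (one HMM per character, with EOG as the observation sequence) can be sketched roughly as follows. This is not the authors' code; the hmmlearn package, the synthetic traces, and all parameter values are assumptions for illustration only.

```python
# Minimal sketch of HMM-based classification of eye-written characters from
# two-channel EOG: one GaussianHMM is trained per character, and a new trace
# is assigned to the character whose model gives the highest log-likelihood.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(0)

def synthetic_trace(char, n=60):
    """Fake 2-channel EOG (horizontal, vertical) trace for a 'character'."""
    drift = {"A": (1.0, 0.5), "B": (-0.5, 1.0)}[char]
    t = np.linspace(0, 1, n)[:, None]
    return t * np.array(drift) + 0.05 * rng.standard_normal((n, 2))

def train_models(chars, n_train=10):
    models = {}
    for c in chars:
        traces = [synthetic_trace(c) for _ in range(n_train)]
        X = np.vstack(traces)
        lengths = [len(tr) for tr in traces]
        m = hmm.GaussianHMM(n_components=4, covariance_type="diag",
                            n_iter=30, random_state=0)
        m.fit(X, lengths)
        models[c] = m
    return models

def classify(models, trace):
    """Pick the character whose HMM scores the trace highest."""
    return max(models, key=lambda c: models[c].score(trace))

models = train_models(["A", "B"])
print(classify(models, synthetic_trace("A")))   # expected to print 'A'
```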
NASA Technical Reports Server (NTRS)
Schmid, R. M.
1973-01-01
The vestibulo-ocular system is examined from the standpoint of system theory. The evolution of a mathematical model of the vestibulo-ocular system in an attempt to match more and more experimental data is followed step by step. The final model explains many characteristics of the eye movement in vestibularly induced nystagmus. The analysis of the dynamic behavior of the model at the different stages of its development is illustrated in time domain, mainly in a qualitative way.
Hill, N Jeremy; Moinuddin, Aisha; Häuser, Ann-Katrin; Kienzle, Stephan; Schalk, Gerwin
2012-01-01
Most brain-computer interface (BCI) systems require users to modulate brain signals in response to visual stimuli. Thus, they may not be useful to people with limited vision, such as those with severe paralysis. One important approach for overcoming this issue is auditory streaming, an approach whereby a BCI system is driven by shifts of attention between two simultaneously presented auditory stimulus streams. Motivated by the long-term goal of translating such a system into a reliable, simple yes-no interface for clinical usage, we aim to answer two main questions. First, we asked which of two previously published variants provides superior performance: a fixed-phase (FP) design in which the streams have equal period and opposite phase, or a drifting-phase (DP) design where the periods are unequal. We found FP to be superior to DP (p = 0.002): average performance levels were 80 and 72% correct, respectively. We were also able to show, in a pilot with one subject, that auditory streaming can support continuous control and neurofeedback applications: by shifting attention between ongoing left and right auditory streams, the subject was able to control the position of a paddle in a computer game. Second, we examined whether the system is dependent on eye movements, since it is known that eye movements and auditory attention may influence each other, and any dependence on the ability to move one's eyes would be a barrier to translation to paralyzed users. We discovered that, despite instructions, some subjects did make eye movements that were indicative of the direction of attention. However, there was no correlation, across subjects, between the reliability of the eye movement signal and the reliability of the BCI system, indicating that our system was configured to work independently of eye movement. Together, these findings are an encouraging step forward toward BCIs that provide practical communication and control options for the most severely paralyzed users.
Eye Movement Indices in the Study of Depressive Disorder
LI, Yu; XU, Yangyang; XIA, Mengqing; ZHANG, Tianhong; WANG, Junjie; LIU, Xu; HE, Yongguang; WANG, Jijun
2016-01-01
Background. Impaired cognition is one of the most common core symptoms of depressive disorder. Eye movement testing mainly reflects patients’ cognitive functions, such as cognition, memory, attention, recognition, and recall. This type of testing has great potential to improve theories related to cognitive functioning in depressive episodes as well as potential in its clinical application. Aims. This study investigated whether eye movement indices of patients with unmedicated depressive disorder were abnormal or not, as well as the relationship between these indices and mental symptoms. Methods. Sixty patients with depressive disorder and sixty healthy controls (who were matched by gender, age and years of education) were recruited, and completed eye movement tests including three tasks: fixation task, saccade task and free-view task. The EyeLink desktop eye tracking system was employed to collect eye movement information, and analyze the eye movement indices of the three tasks between the two groups. Results. (1) In the fixation task, compared to healthy controls, patients with depressive disorder showed more fixations, shorter fixation durations, more saccades and longer saccadic lengths; (2) In the saccade task, patients with depressive disorder showed longer anti-saccade latencies and smaller anti-saccade peak velocities; (3) In the free-view task, patients with depressive disorder showed fewer saccades and longer mean fixation durations; (4) Correlation analysis showed that there was a negative correlation between the pro-saccade amplitude and anxiety symptoms, and a positive correlation between the anti-saccade latency and anxiety symptoms. The depression symptoms were negatively correlated with fixation times, saccades, and saccadic paths respectively in the free-view task; while the mean fixation duration and depression symptoms showed a positive correlation. Conclusion. Compared to healthy controls, patients with depressive disorder showed significantly abnormal eye movement indices. In addition, patients’ anxiety and depression symptoms and eye movement indices were correlated. The pathological meaning of these phenomena deserves further exploration. PMID:28638208
Eye Movement Indices in the Study of Depressive Disorder.
Li, Yu; Xu, Yangyang; Xia, Mengqing; Zhang, Tianhong; Wang, Junjie; Liu, Xu; He, Yongguang; Wang, Jijun
2016-12-25
Impaired cognition is one of the most common core symptoms of depressive disorder. Eye movement testing mainly reflects patients' cognitive functions, such as cognition, memory, attention, recognition, and recall. This type of testing has great potential to improve theories related to cognitive functioning in depressive episodes as well as potential in its clinical application. This study investigated whether eye movement indices of patients with unmedicated depressive disorder were abnormal or not, as well as the relationship between these indices and mental symptoms. Sixty patients with depressive disorder and sixty healthy controls (who were matched by gender, age and years of education) were recruited, and completed eye movement tests including three tasks: fixation task, saccade task and free-view task. The EyeLink desktop eye tracking system was employed to collect eye movement information, and analyze the eye movement indices of the three tasks between the two groups. (1) In the fixation task, compared to healthy controls, patients with depressive disorder showed more fixations, shorter fixation durations, more saccades and longer saccadic lengths; (2) In the saccade task, patients with depressive disorder showed longer anti-saccade latencies and smaller anti-saccade peak velocities; (3) In the free-view task, patients with depressive disorder showed fewer saccades and longer mean fixation durations; (4) Correlation analysis showed that there was a negative correlation between the pro-saccade amplitude and anxiety symptoms, and a positive correlation between the anti-saccade latency and anxiety symptoms. The depression symptoms were negatively correlated with fixation times, saccades, and saccadic paths respectively in the free-view task; while the mean fixation duration and depression symptoms showed a positive correlation. Compared to healthy controls, patients with depressive disorder showed significantly abnormal eye movement indices. In addition, patients' anxiety and depression symptoms and eye movement indices were correlated. The pathological meaning of these phenomena deserves further exploration.
ERIC Educational Resources Information Center
Spichtig, Alexandra N.; Hiebert, Elfrieda H.; Vorstius, Christian; Pascoe, Jeffrey P.; Pearson, P. David; Radach, Ralph
2016-01-01
The present study measured the comprehension-based silent reading efficiency of U.S. students in grades 2, 4, 6, 8, 10, and 12. Students read standardized grade-level passages while an eye movement recording system was used to measure reading rate, fixations (eye stops) per word, fixation durations, and regressions (right-to-left eye movements)…
Márquez-Ruiz, Javier; Escudero, Miguel
2010-11-01
The aim of this work was to characterize eye movements and abducens (ABD) motoneuron behavior after cholinergic activation of the nucleus reticularis pontis caudalis (NRPC). Six female adult cats were prepared for chronic recording of eye movements (using the scleral search-coil technique), electroencephalography, electromyography, ponto-geniculo-occipital (PGO) waves in the lateral geniculate nucleus, and ABD motoneuron activities after microinjections of the cholinergic agonist carbachol into the NRPC. Unilateral microinjections of carbachol in the NRPC induced tonic and phasic phenomena in the oculomotor system. Tonic effects consisted of ipsiversive rotation to the injected side, convergence, and downward rotation of the eyes. Phasic effects consisted of bursts of rhythmic rapid eye movements directed contralaterally to the injected side along with PGO-like waves in the lateral geniculate and ABD nuclei. Although tonic effects were dependent on the level of drowsiness, phasic effects were always present and appeared along with normal saccades when the animal was vigilant. ABD motoneurons showed phasic activities associated with ABD PGO-like waves during bursts of rapid eye movements, and tonic and phasic activities related to eye position and velocity during alertness. The cholinergic activation of the NRPC induces oculomotor phenomena that are somewhat similar to those described during REM sleep. A precise comparison of the dynamics and timing of the eye movements further suggests that a temporal organization of both NRPCs is needed to reproduce the complexity of the oculomotor behavior during REM sleep.
Tracing Attention and the Activation Flow of Spoken Word Planning Using Eye Movements
ERIC Educational Resources Information Center
Roelofs, Ardi
2008-01-01
The flow of activation from concepts to phonological forms within the word production system was examined in 3 experiments. In Experiment 1, participants named pictures while ignoring superimposed distractor pictures that were semantically related, phonologically related, or unrelated. Eye movements and naming latencies were recorded. The…
NASA Technical Reports Server (NTRS)
Wu, Shu-Chieh; Remington, Roger W.; Lewis, Richard
2006-01-01
Common tasks in daily life are often accomplished by a sequence of actions that interleave information acquisition through the eyes and action execution by the hands. How are eye movements coordinated with the release of manual responses and how may their coordination be represented at the level of component mental operations? We have previously presented data from a typing-like task requiring separate choice responses to a series of five stimuli. We found a consistent pattern of results in both motor and ocular timing, and hypothesized possible relationships among underlying components. Here we report a model of that task, which demonstrates how the observed timing of eye movements to successive stimuli could be accounted for by assuming two systems: an open-loop system generating saccades at a periodic rate, and a closed-loop system commanding a saccade based on stimulus processing. We relate this model to models of reading and discuss the motivation for dual control.
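Purely as an illustration of the dual-control idea (the authors' actual model is not reproduced here), one simple way to combine an open-loop periodic saccade generator with a closed-loop, processing-contingent trigger is a race in which each fixation ends at whichever signal arrives first. All parameter values below are hypothetical.

```python
# Illustrative sketch only: fixation duration on each stimulus is taken as the
# minimum of an open-loop timer (fixed period) and a closed-loop signal that
# fires when stimulus processing completes (here a noisy completion time).
import numpy as np

rng = np.random.default_rng(3)

def fixation_durations(n_stimuli=5, period_ms=250.0,
                       proc_mean_ms=220.0, proc_sd_ms=40.0):
    """Fixation duration per stimulus = min(open-loop timer, processing time)."""
    timer = np.full(n_stimuli, period_ms)                          # open-loop component
    processing = rng.normal(proc_mean_ms, proc_sd_ms, n_stimuli)   # closed-loop component
    return np.minimum(timer, processing)

print(np.round(fixation_durations(), 1))   # ms spent on each of the five stimuli
```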
Analyzing complex gaze behavior in the natural world
NASA Astrophysics Data System (ADS)
Pelz, Jeff B.; Kinsman, Thomas B.; Evans, Karen M.
2011-03-01
The history of eye-movement research extends back at least to 1794, when Erasmus Darwin (Charles' grandfather) published Zoonomia, including descriptions of eye movements due to self-motion. But research on eye movements was restricted to the laboratory for 200 years, until Michael Land built the first wearable eyetracker at the University of Sussex and published the seminal paper "Where we look when we steer" [1]. In the intervening centuries, we learned a tremendous amount about the mechanics of the oculomotor system and how it responds to isolated stimuli, but virtually nothing about how we actually use our eyes to explore, gather information, navigate, and communicate in the real world. Inspired by Land's work, we have been working to extend knowledge in these areas by developing hardware, algorithms, and software that have allowed researchers to ask questions about how we actually use vision in the real world. Central to that effort are new methods for analyzing the volumes of data that come from the experiments made possible by the new systems. We describe a number of recent experiments and SemantiCode, a new program that supports assisted coding of eye-movement data collected in unrestricted environments.
Eye movement analysis for activity recognition using electrooculography.
Bulling, Andreas; Ward, Jamie A; Gellersen, Hans; Tröster, Gerhard
2011-04-01
In this work, we investigate eye movement analysis as a new sensing modality for activity recognition. Eye movement data were recorded using an electrooculography (EOG) system. We first describe and evaluate algorithms for detecting three eye movement characteristics from EOG signals-saccades, fixations, and blinks-and propose a method for assessing repetitive patterns of eye movements. We then devise 90 different features based on these characteristics and select a subset of them using minimum redundancy maximum relevance (mRMR) feature selection. We validate the method using an eight participant study in an office environment using an example set of five activity classes: copying a text, reading a printed paper, taking handwritten notes, watching a video, and browsing the Web. We also include periods with no specific activity (the NULL class). Using a support vector machine (SVM) classifier and person-independent (leave-one-person-out) training, we obtain an average precision of 76.1 percent and recall of 70.5 percent over all classes and participants. The work demonstrates the promise of eye-based activity recognition (EAR) and opens up discussion on the wider applicability of EAR to other activities that are difficult, or even impossible, to detect using common sensing modalities.
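The person-independent evaluation scheme mentioned above (leave-one-person-out training of an SVM on eye-movement features) can be sketched as follows. This is not the authors' pipeline; the features, labels, and parameters below are random placeholders standing in for the saccade, fixation, and blink features.

```python
# Minimal sketch: leave-one-person-out cross-validation of an SVM activity
# classifier, so that each participant's data are held out in turn.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n_participants, n_windows, n_features = 8, 40, 30
X = rng.standard_normal((n_participants * n_windows, n_features))
y = rng.integers(0, 6, size=len(X))            # 5 activity classes + NULL class
groups = np.repeat(np.arange(n_participants), n_windows)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print("mean accuracy over held-out participants:", scores.mean())
```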
Mala, S.; Latha, K.
2014-01-01
Activity recognition is needed in many applications, for example, surveillance systems, patient monitoring, and human-computer interfaces. Feature selection plays an important role in activity recognition, data mining, and machine learning. To select a subset of informative features from eye movements recorded using electrooculography (EOG), Differential Evolution (DE), an efficient evolutionary optimizer, is used. Many researchers use EOG signals in human-computer interactions with various computational intelligence methods to analyze eye movements. The proposed system involves analysis of EOG signals using clearness based features, minimum redundancy maximum relevance features, and Differential Evolution based features. This work concentrates on the DE-based feature selection algorithm in order to improve classification accuracy for activity recognition. PMID:25574185
Mala, S; Latha, K
2014-01-01
Activity recognition is needed in many applications, for example, surveillance systems, patient monitoring, and human-computer interfaces. Feature selection plays an important role in activity recognition, data mining, and machine learning. To select a subset of informative features from eye movements recorded using electrooculography (EOG), Differential Evolution (DE), an efficient evolutionary optimizer, is used. Many researchers use EOG signals in human-computer interactions with various computational intelligence methods to analyze eye movements. The proposed system involves analysis of EOG signals using clearness based features, minimum redundancy maximum relevance features, and Differential Evolution based features. This work concentrates on the DE-based feature selection algorithm in order to improve classification accuracy for activity recognition.
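A rough sketch of wrapper-style feature selection with Differential Evolution, in the spirit of the approach described above but not the authors' exact algorithm: continuous candidate vectors are thresholded into binary feature masks and scored by cross-validated classification error. The data, the 0.5 threshold, and the subset-size penalty are illustrative assumptions.

```python
# Minimal sketch: DE searches over continuous weights in [0, 1]; weights above
# 0.5 switch a feature on, and fitness is cross-validated error plus a small
# penalty for large subsets. Data are synthetic placeholders.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(2)
n_samples, n_features = 120, 12
X = rng.standard_normal((n_samples, n_features))
y = (X[:, 0] + 0.5 * X[:, 3] + 0.1 * rng.standard_normal(n_samples) > 0).astype(int)

def fitness(weights):
    mask = weights > 0.5
    if not mask.any():
        return 1.0                                    # empty subset: worst score
    acc = cross_val_score(SVC(kernel="linear"), X[:, mask], y, cv=3).mean()
    return (1.0 - acc) + 0.01 * mask.sum()            # error + subset-size penalty

result = differential_evolution(fitness, bounds=[(0, 1)] * n_features,
                                maxiter=5, popsize=8, seed=0, polish=False)
print("selected features:", np.where(result.x > 0.5)[0])
```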
Strabismus and the Oculomotor System: Insights from Macaque Models
Das, Vallabh E.
2017-01-01
Disrupting binocular vision in infancy leads to strabismus and oftentimes to a variety of associated visual sensory deficits and oculomotor abnormalities. Investigation of this disorder has been aided by the development of various animal models, each of which has advantages and disadvantages. In comparison to studies of binocular visual responses in cortical structures, investigations of neural oculomotor structures that mediate the misalignment and abnormalities of eye movements have been more recent, and these studies have shown that different brain areas are intimately involved in driving several aspects of the strabismic condition, including horizontal misalignment, dissociated deviations, A and V patterns of strabismus, disconjugate eye movements, nystagmus, and fixation switch. The responses of cells in visual and oculomotor areas that potentially drive the sensory deficits and also eye alignment and eye movement abnormalities follow a general theme of disrupted calibration, lower sensitivity, and poorer specificity compared with the normally developed visual oculomotor system. PMID:28532347
Rewards modulate saccade latency but not exogenous spatial attention.
Dunne, Stephen; Ellison, Amanda; Smith, Daniel T
2015-01-01
The eye movement system is sensitive to reward. However, whilst the eye movement system is extremely flexible, the extent to which changes to oculomotor behavior induced by reward paradigms persist beyond the training period or transfer to other oculomotor tasks is unclear. To address these issues, we examined the effects of presenting feedback representing small monetary rewards at particular spatial locations on the latency of saccadic eye movements, the time-course of learning and extinction of these effects, and the influence of rewarding saccades on exogenous spatial attention and oculomotor inhibition of return. Reward feedback produced a relative facilitation of saccadic latency in a stimulus-driven saccade task, which persisted for three blocks of extinction trials. However, this hemifield-specific effect failed to transfer to peripheral cueing tasks. We conclude that rewarding specific spatial locations is unlikely to induce long-term, systemic changes to the human oculomotor or attention systems.
Development of a Computer Writing System Based on EOG
López, Alberto; Ferrero, Francisco; Yangüela, David; Álvarez, Constantina; Postolache, Octavian
2017-01-01
The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders. PMID:28672863
Development of a Computer Writing System Based on EOG.
López, Alberto; Ferrero, Francisco; Yangüela, David; Álvarez, Constantina; Postolache, Octavian
2017-06-26
The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders.
Saccadic eye movements analysis as a measure of drug effect on central nervous system function.
Tedeschi, G; Quattrone, A; Bonavita, V
1986-04-01
Peak saccadic velocity (PSV) and saccade duration (SD) of horizontal eye movements are demonstrably under the control of specific brain stem structures. Experimental and clinical evidence suggests the existence of an immediate premotor system for saccade generation located in the paramedian pontine reticular formation (PPRF). Effects on saccadic eye movements have been studied in normal volunteers with barbiturates, benzodiazepines, amphetamine and ethanol. On two occasions, computer analysis of PSV, SD, saccade reaction time (SRT) and saccade accuracy (SA) was carried out in comparison with more traditional methods of assessment of human psychomotor performance such as choice reaction time (CRT) and critical flicker fusion threshold (CFFT). The computer system proved to be a highly sensitive and objective method for measuring drug effect on central nervous system (CNS) function. It allows almost continuous sampling of data and appears to be particularly suitable for studying rapidly changing drug effects on the CNS.
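As an illustration of the quantities analyzed above (not the authors' software), peak saccadic velocity and saccade duration can be extracted from an eye-position trace with a simple velocity-threshold criterion; the synthetic trace and the 30 deg/s threshold below are assumptions.

```python
# Minimal sketch: compute PSV and SD from a horizontal eye-position trace.
# The trace is a synthetic 10-degree saccade sampled at 1 kHz.
import numpy as np

fs = 1000.0                                            # sampling rate (Hz)
t = np.arange(0, 0.3, 1 / fs)
position = 10.0 / (1 + np.exp(-(t - 0.15) / 0.008))    # deg, sigmoid "saccade"

velocity = np.gradient(position, 1 / fs)               # deg/s
moving = velocity > 30.0                               # onset/offset threshold
onset = np.argmax(moving)                              # first sample above threshold
offset = len(moving) - np.argmax(moving[::-1]) - 1     # last sample above threshold

psv = velocity[onset:offset + 1].max()                 # peak saccadic velocity
sd_ms = (offset - onset) / fs * 1000.0                 # saccade duration in ms
print(f"PSV = {psv:.0f} deg/s, SD = {sd_ms:.0f} ms")
```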
Eye movement training is most effective when it involves a task-relevant sensorimotor decision.
Fooken, Jolande; Lalonde, Kathryn M; Mann, Gurkiran K; Spering, Miriam
2018-04-01
Eye and hand movements are closely linked when performing everyday actions. We conducted a perceptual-motor training study to investigate mutually beneficial effects of eye and hand movements, asking whether training in one modality benefits performance in the other. Observers had to predict the future trajectory of a briefly presented moving object, and intercept it at its assumed location as accurately as possible with their finger. Eye and hand movements were recorded simultaneously. Different training protocols either included eye movements or a combination of eye and hand movements with or without external performance feedback. Eye movement training did not transfer across modalities: Irrespective of feedback, finger interception accuracy and precision improved after training that involved the hand, but not after isolated eye movement training. Conversely, eye movements benefited from hand movement training or when external performance feedback was given, thus improving only when an active interceptive task component was involved. These findings indicate only limited transfer across modalities. However, they reveal the importance of creating a training task with an active sensorimotor decision to improve the accuracy and precision of eye and hand movements.
Fukushima, Junko; Akao, Teppei; Kurkin, Sergei; Kaneko, Chris R.S.; Fukushima, Kikuro
2006-01-01
In order to see clearly when a target is moving slowly, primates with high acuity foveae use smooth-pursuit and vergence eye movements. The former rotates both eyes in the same direction to track target motion in frontal planes, while the latter rotates left and right eyes in opposite directions to track target motion in depth. Together, these two systems pursue targets precisely and maintain their images on the foveae of both eyes. During head movements, both systems must interact with the vestibular system to minimize slip of the retinal images. The primate frontal cortex contains two pursuit-related areas; the caudal part of the frontal eye fields (FEF) and supplementary eye fields (SEF). Evoked potential studies have demonstrated vestibular projections to both areas and pursuit neurons in both areas respond to vestibular stimulation. The majority of FEF pursuit neurons code parameters of pursuit such as pursuit and vergence eye velocity, gaze velocity, and retinal image motion for target velocity in frontal and depth planes. Moreover, vestibular inputs contribute to the predictive pursuit responses of FEF neurons. In contrast, the majority of SEF pursuit neurons do not code pursuit metrics and many SEF neurons are reported to be active in more complex tasks. These results suggest that FEF- and SEF-pursuit neurons are involved in different aspects of vestibular-pursuit interactions and that eye velocity coding of SEF pursuit neurons is specialized for the task condition. PMID:16917164
Ictal SPECT in patients with rapid eye movement sleep behaviour disorder.
Mayer, Geert; Bitterlich, Marion; Kuwert, Torsten; Ritt, Philipp; Stefan, Hermann
2015-05-01
Rapid eye movement sleep behaviour disorder is a rapid eye movement parasomnia clinically characterized by acting out dreams due to disinhibition of muscle tone in rapid eye movement sleep. Up to 80-90% of the patients with rapid eye movement sleep behaviour disorder develop neurodegenerative disorders within 10-15 years after symptom onset. The disorder is reported in 45-60% of all narcoleptic patients. Whether rapid eye movement sleep behaviour disorder is also a predictor for neurodegeneration in narcolepsy is not known. Although the pathophysiology causing the disinhibition of muscle tone in rapid eye movement sleep behaviour disorder has been studied extensively in animals, little is known about the mechanisms in humans. Most of the human data are from imaging or post-mortem studies. Recent studies show altered functional connectivity between substantia nigra and striatum in patients with rapid eye movement sleep behaviour disorder. We were interested to study which regions are activated in rapid eye movement sleep behaviour disorder during actual episodes by performing ictal single photon emission tomography. We studied one patient with idiopathic rapid eye movement sleep behaviour disorder, one with Parkinson's disease and rapid eye movement sleep behaviour disorder, and two patients with narcolepsy and rapid eye movement sleep behaviour disorder. All patients underwent extended video polysomnography. The tracer was injected after at least 10 s of consecutive rapid eye movement sleep and 10 s of disinhibited muscle tone accompanied by movements registered by an experienced sleep technician. Ictal single photon emission tomography displayed the same activation in the bilateral premotor areas, the interhemispheric cleft, the periaqueductal area, the dorsal and ventral pons and the anterior lobe of the cerebellum in all patients. Our study shows that in patients with Parkinson's disease and rapid eye movement sleep behaviour disorder-in contrast to wakefulness-the neural activity generating movement during episodes of rapid eye movement sleep behaviour disorder bypasses the basal ganglia, a mechanism that is shared by patients with idiopathic rapid eye movement sleep behaviour disorder and narcolepsy patients with rapid eye movement sleep behaviour disorder. © The Author (2015). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Dudschig, Carolin; Souman, Jan; Lachmair, Martin; de la Vega, Irmgard; Kaup, Barbara
2013-01-01
Traditionally, language processing has been attributed to a separate system in the brain, which supposedly works in an abstract propositional manner. However, there is increasing evidence suggesting that language processing is strongly interrelated with sensorimotor processing. Evidence for such an interrelation is typically drawn from interactions between language and perception or action. In the current study, the effect of words that refer to entities in the world with a typical location (e.g., sun, worm) on the planning of saccadic eye movements was investigated. Participants had to perform a lexical decision task on visually presented words and non-words. They responded by moving their eyes to a target in an upper (lower) screen position for a word (non-word) or vice versa. Eye movements were faster to locations compatible with the word's referent in the real world. These results provide evidence for the importance of linguistic stimuli in directing eye movements, even if the words do not directly transfer directional information.
Pettorossi, V E; Errico, P; Ferraresi, A; Minciotti, M; Barmack, N H
1998-07-01
Researchers investigated how vestibular and optokinetic signals alter the spatial transformation of the coordinate system that governs the spatial orientation of reflexive eye movements. Also examined were the effects of sensory stimulation when vestibular and optokinetic signals act synergistically and when the two signals are in conflict.
Eye-movements and Voice as Interface Modalities to Computer Systems
NASA Astrophysics Data System (ADS)
Farid, Mohsen M.; Murtagh, Fionn D.
2003-03-01
We investigate the visual and vocal modalities of interaction with computer systems. We focus our attention on the integration of visual and vocal interface as possible replacement and/or additional modalities to enhance human-computer interaction. We present a new framework for employing eye gaze as a modality of interface. While voice commands, as means of interaction with computers, have been around for a number of years, integration of both the vocal interface and the visual interface, in terms of detecting user's eye movements through an eye-tracking device, is novel and promises to open the horizons for new applications where a hand-mouse interface provides little or no apparent support to the task to be accomplished. We present an array of applications to illustrate the new framework and eye-voice integration.
NASA Astrophysics Data System (ADS)
Tera, Akemi; Shirai, Kiyoaki; Yuizono, Takaya; Sugiyama, Kozo
In order to investigate reading processes of Japanese language learners, we have conducted an experiment to record eye movements during Japanese text reading using an eye-tracking system. We showed that Japanese native speakers use “forward and backward jumping eye movements” frequently[13],[14]. In this paper, we analyzed further the same eye tracking data. Our goal is to examine whether Japanese learners fix their eye movements at boundaries of linguistic units such as words, phrases or clauses when they start or end “backward jumping”. We consider conventional linguistic boundaries as well as boundaries empirically defined based on the entropy of the N-gram model. Another goal is to examine the relation between the entropy of the N-gram model and the depth of syntactic structures of sentences. Our analysis shows that (1) Japanese learners often fix their eyes at linguistic boundaries, (2) the average of the entropy is the greatest at the fifth depth of syntactic structures.
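The N-gram entropy referred to above can be illustrated with a small sketch (not the authors' code): the conditional entropy of the next character given the preceding context, estimated from a corpus, with high-entropy positions treated as empirical boundary candidates. The sample text and the choice of N are placeholders.

```python
# Minimal sketch: estimate H(next character | preceding N-1 characters) from a
# text and list the most "uncertain" contexts, which can serve as empirically
# defined boundary candidates.
import math
from collections import Counter, defaultdict

def ngram_entropy(text, n=2):
    """Return {context: H(next char | context)} for (n-1)-character contexts."""
    counts = defaultdict(Counter)
    for i in range(len(text) - n + 1):
        context, nxt = text[i:i + n - 1], text[i + n - 1]
        counts[context][nxt] += 1
    entropy = {}
    for context, ctr in counts.items():
        total = sum(ctr.values())
        entropy[context] = -sum((c / total) * math.log2(c / total)
                                for c in ctr.values())
    return entropy

sample = "the cat sat on the mat and the cat ran"
H = ngram_entropy(sample, n=3)
print(sorted(H.items(), key=lambda kv: -kv[1])[:5])   # highest-entropy contexts
```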
A Novel Wearable Forehead EOG Measurement System for Human Computer Interfaces
Heo, Jeong; Yoon, Heenam; Park, Kwang Suk
2017-01-01
Amyotrophic lateral sclerosis (ALS) patients whose voluntary muscles are paralyzed commonly communicate with the outside world using eye movement. There have been many efforts to support this method of communication by tracking or detecting eye movement. An electrooculogram (EOG), an electro-physiological signal, is generated by eye movements and can be measured with electrodes placed around the eye. In this study, we proposed a new practical electrode position on the forehead to measure EOG signals, and we developed a wearable forehead EOG measurement system for use in Human Computer/Machine interfaces (HCIs/HMIs). Four electrodes, including the ground electrode, were placed on the forehead. The two channels were arranged vertically and horizontally, sharing a positive electrode. Additionally, a real-time eye movement classification algorithm was developed based on the characteristics of the forehead EOG. Three applications were employed to evaluate the proposed system: a virtual keyboard using a modified Bremen BCI speller and an automatic sequential row-column scanner, and a drivable power wheelchair. The mean typing speeds of the modified Bremen brain–computer interface (BCI) speller and automatic row-column scanner were 10.81 and 7.74 letters per minute, and the mean classification accuracies were 91.25% and 95.12%, respectively. In the power wheelchair demonstration, the user drove the wheelchair through an 8-shape course without collision with obstacles. PMID:28644398
A Novel Wearable Forehead EOG Measurement System for Human Computer Interfaces.
Heo, Jeong; Yoon, Heenam; Park, Kwang Suk
2017-06-23
Amyotrophic lateral sclerosis (ALS) patients whose voluntary muscles are paralyzed commonly communicate with the outside world using eye movement. There have been many efforts to support this method of communication by tracking or detecting eye movement. An electrooculogram (EOG), an electro-physiological signal, is generated by eye movements and can be measured with electrodes placed around the eye. In this study, we proposed a new practical electrode position on the forehead to measure EOG signals, and we developed a wearable forehead EOG measurement system for use in Human Computer/Machine interfaces (HCIs/HMIs). Four electrodes, including the ground electrode, were placed on the forehead. The two channels were arranged vertically and horizontally, sharing a positive electrode. Additionally, a real-time eye movement classification algorithm was developed based on the characteristics of the forehead EOG. Three applications were employed to evaluate the proposed system: a virtual keyboard using a modified Bremen BCI speller and an automatic sequential row-column scanner, and a drivable power wheelchair. The mean typing speeds of the modified Bremen brain-computer interface (BCI) speller and automatic row-column scanner were 10.81 and 7.74 letters per minute, and the mean classification accuracies were 91.25% and 95.12%, respectively. In the power wheelchair demonstration, the user drove the wheelchair through an 8-shape course without collision with obstacles.
Naicker, Preshanta; Anoopkumar-Dukie, Shailendra; Grant, Gary D; Modenese, Luca; Kavanagh, Justin J
2017-02-01
Anticholinergic medications largely exert their effects due to actions on the muscarinic receptor, which mediates the functions of acetylcholine in the peripheral and central nervous systems. In the central nervous system, acetylcholine plays an important role in the modulation of movement. This study investigated the effects of over-the-counter medications with varying degrees of central anticholinergic properties on fixation stability, saccadic response time and the dynamics associated with this eye movement during a temporally-cued visual reaction time task, in order to establish the significance of central cholinergic pathways in influencing eye movements during reaction time tasks. Twenty-two participants were recruited into the placebo-controlled, human double-blind, four-way crossover investigation. Eye tracking technology recorded eye movements while participants reacted to visual stimuli following temporally informative and uninformative cues. The task was performed pre-ingestion as well as 0.5 and 2 h post-ingestion of promethazine hydrochloride (strong centrally acting anticholinergic), hyoscine hydrobromide (moderate centrally acting anticholinergic), hyoscine butylbromide (anticholinergic devoid of central properties) and a placebo. Promethazine decreased fixation stability during the reaction time task. In addition, promethazine was the only drug to increase saccadic response time during temporally informative and uninformative cued trials, whereby effects on response time were more pronounced following temporally informative cues. Promethazine also decreased saccadic amplitude and increased saccadic duration during the temporally-cued reaction time task. Collectively, the results of the study highlight the significant role that central cholinergic pathways play in the control of eye movements during tasks that involve stimulus identification and motor responses following temporal cues.
Lightweight helmet-mounted eye movement measurement system
NASA Technical Reports Server (NTRS)
Barnes, J. A.
1978-01-01
The helmet-mounted eye movement measuring system weighs 1,530 grams; the weight of the present aviators' helmet in standard form with the visor is 1,545 grams. The optical head is a standard NAC Eye-Mark. This optical head was mounted on a magnesium yoke, which in turn was attached to a slide cam mounted on the flight helmet. The slide cam allows one to adjust the eye-to-optics system distance quite easily and to secure it so that the system will remain in calibration. The design of the yoke and slide cam is such that the subject can, in an emergency, move the optical head forward and upward to the stowed and locked position atop the helmet. This feature was necessary for flight safety. The television camera that is used in the system is a solid-state General Electric TN-2000 with a charge-injection device imager in place of a vidicon tube.
Abnormal Fixational Eye Movements in Amblyopia.
Shaikh, Aasef G; Otero-Millan, Jorge; Kumar, Priyanka; Ghasia, Fatema F
2016-01-01
Fixational saccades shift the foveal image to counteract visual fading related to neural adaptation. Drifts are slow eye movements between two adjacent fixational saccades. We quantified fixational saccades and asked whether their changes could be attributed to pathologic drifts seen in amblyopia, one of the most common causes of blindness in childhood. Thirty-six pediatric subjects with varying severity of amblyopia and eleven healthy age-matched controls held their gaze on a visual target. Eye movements were measured with high-resolution video-oculography during fellow eye-viewing and amblyopic eye-viewing conditions. Fixational saccades and drifts were analyzed in the amblyopic and fellow eye and compared with controls. We found an increase in the amplitude with decreased frequency of fixational saccades in children with amblyopia. These alterations in fixational eye movements correlated with the severity of their amblyopia. There was also an increase in eye position variance during drifts in amblyopes. There was no correlation between the eye position variance or the eye velocity during ocular drifts and the amplitude of the subsequent fixational saccade. Our findings suggest that abnormalities in fixational saccades in amblyopia are independent of the ocular drift. This investigation of amblyopia in the pediatric age group quantitatively characterizes the fixation instability. Impaired properties of fixational saccades could be the consequence of abnormal processing and reorganization of the visual system in amblyopia. A paucity of visual feedback during the amblyopic eye-viewing condition may contribute to the increased eye position variance and drift velocity.
Abnormal Fixational Eye Movements in Amblyopia
Shaikh, Aasef G.; Otero-Millan, Jorge; Kumar, Priyanka; Ghasia, Fatema F.
2016-01-01
Purpose. Fixational saccades shift the foveal image to counteract visual fading related to neural adaptation. Drifts are slow eye movements between two adjacent fixational saccades. We quantified fixational saccades and asked whether their changes could be attributed to pathologic drifts seen in amblyopia, one of the most common causes of blindness in childhood. Methods. Thirty-six pediatric subjects with varying severity of amblyopia and eleven healthy age-matched controls held their gaze on a visual target. Eye movements were measured with high-resolution video-oculography during fellow eye-viewing and amblyopic eye-viewing conditions. Fixational saccades and drifts were analyzed in the amblyopic and fellow eye and compared with controls. Results. We found an increase in the amplitude with decreased frequency of fixational saccades in children with amblyopia. These alterations in fixational eye movements correlated with the severity of their amblyopia. There was also an increase in eye position variance during drifts in amblyopes. There was no correlation between the eye position variance or the eye velocity during ocular drifts and the amplitude of the subsequent fixational saccade. Our findings suggest that abnormalities in fixational saccades in amblyopia are independent of the ocular drift. Discussion. This investigation of amblyopia in the pediatric age group quantitatively characterizes the fixation instability. Impaired properties of fixational saccades could be the consequence of abnormal processing and reorganization of the visual system in amblyopia. A paucity of visual feedback during the amblyopic eye-viewing condition may contribute to the increased eye position variance and drift velocity. PMID:26930079
Direct evidence for a position input to the smooth pursuit system.
Blohm, Gunnar; Missal, Marcus; Lefèvre, Philippe
2005-07-01
When objects move in our environment, the orientation of the visual axis in space requires the coordination of two types of eye movements: saccades and smooth pursuit. The principal input to the saccadic system is position error, whereas it is velocity error for the smooth pursuit system. Recently, it has been shown that catch-up saccades to moving targets are triggered and programmed by using velocity error in addition to position error. Here, we show that, when a visual target is flashed during ongoing smooth pursuit, it evokes a smooth eye movement toward the flash. The velocity of this evoked smooth movement is proportional to the position error of the flash; it is neither influenced by the velocity of the ongoing smooth pursuit eye movement nor by the occurrence of a saccade, but the effect is absent if the flash is ignored by the subject. Furthermore, the response started around 85 ms after the flash presentation and decayed with an average time constant of 276 ms. Thus this is the first direct evidence of a position input to the smooth pursuit system. This study shows further evidence for a coupling between saccadic and smooth pursuit systems. It also suggests that there is an interaction between position and velocity error signals in the control of more complex movements.
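A minimal quantitative sketch of the reported effect, using an assumed functional form rather than the authors' model: an evoked smooth eye velocity proportional to the flash's position error that begins roughly 85 ms after the flash and decays exponentially with a time constant of about 276 ms. The gain value is hypothetical.

```python
# Illustrative sketch: evoked smooth eye velocity as a delayed, exponentially
# decaying response scaled by the flash's position error.
import numpy as np

def evoked_velocity(t_ms, position_error_deg, gain=1.0,
                    latency_ms=85.0, tau_ms=276.0):
    """Smooth eye velocity (deg/s) evoked by a flashed target at a given error."""
    t = np.asarray(t_ms, dtype=float)
    active = t >= latency_ms
    v = np.zeros_like(t)
    v[active] = gain * position_error_deg * np.exp(-(t[active] - latency_ms) / tau_ms)
    return v

t = np.arange(0, 600, 10)                 # ms after the flash
print(evoked_velocity(t, position_error_deg=5.0)[:10])
```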
NASA Technical Reports Server (NTRS)
Kornilova, L. N.; Naumov, I. A.; Azarov, K. A.; Sagalovitch, S. V.; Reschke, Millard F.; Kozlovskaya, I. B.
2007-01-01
The vestibular function and tracking eye movements were investigated in 12 Russian crew members of ISS missions on days 1(2), 4(5-6), and 8(9-10) after prolonged exposure to microgravity (126 to 195 days). The spontaneous oculomotor activity, static torsional otolith-cervico-ocular reflex, dynamic vestibulo-cervico-ocular responses, vestibular reactivity, tracking eye movements, and gaze-holding were studied using videooculography (VOG) and electrooculography (EOG) for parallel eye movement recording. On post-flight days 1-2 (R+1-2) some cosmonauts demonstrated: - an increased spontaneous oculomotor activity (floating eye movements, spontaneous nystagmus of the typical and atypical form, square wave jerks, gaze nystagmus) with the head held in the vertical position; - suppressed otolith function (absent or reduced by one half amplitude of torsional compensatory eye counter-rolling) with the head inclined statically right- or leftward by 30°; - increased vestibular reactivity (lowered threshold and increased intensity of the vestibular nystagmus) during head turns around the longitudinal body axis at 0.125 Hz; - a significant change in the accuracy, velocity, and temporal characteristics of the eye tracking. The pattern, depth, dynamics, and velocity of the vestibular function and tracking eye movements recovery varied with individual participants in the investigation. However, there were also regular responses during readaptation to the normal gravity: - suppression of the otolith function was typically accompanied by an exaggerated vestibular reactivity; - the structure of visual tracking (the accuracy of fixational eye rotations, smooth tracking, and gaze-holding) was disturbed (the appearance of correcting saccades, the transition of smooth tracking to saccadic tracking) only in those cosmonauts who, in parallel to an increased reactivity of the vestibular input, also had central changes in the oculomotor system (spontaneous nystagmus, gaze nystagmus).
Binocular eye movement control and motion perception: what is being tracked?
van der Steen, Johannes; Dits, Joyce
2012-10-19
We investigated under what conditions humans can make independent slow phase eye movements. The ability to make independent movements of the two eyes generally is attributed to a few specialized lateral-eyed animal species, for example chameleons. In our study, we showed that humans also can move the eyes in different directions. To maintain binocular retinal correspondence, independent slow phase movements of each eye are produced. We used the scleral search coil method to measure binocular eye movements in response to dichoptically viewed visual stimuli oscillating in orthogonal directions. Correlated stimuli led to orthogonal slow eye movements, while the binocularly perceived motion was the vector sum of the motion presented to each eye. The importance of binocular fusion for the independence of the movements of the two eyes was investigated with anti-correlated stimuli. The global motion pattern of anti-correlated dichoptic stimuli was perceived as an oblique oscillatory motion and also resulted in a conjugate oblique motion of the eyes. We propose that the ability to make independent slow phase eye movements in humans is used to maintain binocular retinal correspondence. Eye-of-origin and binocular information are used during the processing of binocular visual information, and it is decided at an early stage whether binocular or monocular motion information and independent slow phase eye movements of each eye are produced during binocular tracking.
Binocular Eye Movement Control and Motion Perception: What Is Being Tracked?
van der Steen, Johannes; Dits, Joyce
2012-01-01
Purpose. We investigated under what conditions humans can make independent slow phase eye movements. The ability to make independent movements of the two eyes generally is attributed to a few specialized lateral-eyed animal species, for example chameleons. In our study, we showed that humans also can move the eyes in different directions. To maintain binocular retinal correspondence, independent slow phase movements of each eye are produced. Methods. We used the scleral search coil method to measure binocular eye movements in response to dichoptically viewed visual stimuli oscillating in orthogonal directions. Results. Correlated stimuli led to orthogonal slow eye movements, while the binocularly perceived motion was the vector sum of the motion presented to each eye. The importance of binocular fusion for the independence of the movements of the two eyes was investigated with anti-correlated stimuli. The global motion pattern of anti-correlated dichoptic stimuli was perceived as an oblique oscillatory motion and also resulted in a conjugate oblique motion of the eyes. Conclusions. We propose that the ability to make independent slow phase eye movements in humans is used to maintain binocular retinal correspondence. Eye-of-origin and binocular information are used during the processing of binocular visual information, and it is decided at an early stage whether binocular or monocular motion information and independent slow phase eye movements of each eye are produced during binocular tracking. PMID:22997286
Hypothesis test for synchronization: twin surrogates revisited.
Romano, M Carmen; Thiel, Marco; Kurths, Jürgen; Mergenthaler, Konstantin; Engbert, Ralf
2009-03-01
The method of twin surrogates has been introduced to test for phase synchronization of complex systems in the case of passive experiments. In this paper we derive new analytical expressions for the number of twins depending on the size of the neighborhood, as well as on the length of the trajectory. This allows us to determine the optimal parameters for the generation of twin surrogates. Furthermore, we determine the quality of the twin surrogates with respect to several linear and nonlinear statistics depending on the parameters of the method. In the second part of the paper we perform a hypothesis test for phase synchronization in the case of experimental data from fixational eye movements. These miniature eye movements have been shown to play a central role in neural information processing underlying the perception of static visual scenes. The high number of data sets (21 subjects and 30 trials per person) allows us to compare the generated twin surrogates with the "natural" surrogates that correspond to the different trials. We show that the generated twin surrogates reproduce very well all linear and nonlinear characteristics of the underlying experimental system. The synchronization analysis of fixational eye movements by means of twin surrogates reveals that the synchronization between the left and right eye is significant, indicating that either the centers in the brain stem generating fixational eye movements are closely linked, or, alternatively that there is only one center controlling both eyes.
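For orientation, the synchronization statistic such a test is typically built around can be sketched as follows (the twin-surrogate generation itself is not reproduced here): the mean phase coherence between two signals, with instantaneous phases obtained via the Hilbert transform. The signals below are synthetic stand-ins for left- and right-eye fixational movement traces.

```python
# Minimal sketch: mean phase coherence R between two signals; R close to 1
# indicates strong phase locking, R close to 0 indicates none.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(4)
t = np.linspace(0, 10, 2000)
common = np.sin(2 * np.pi * 1.3 * t)                 # shared oscillatory component
left = common + 0.3 * rng.standard_normal(t.size)    # "left eye" trace
right = common + 0.3 * rng.standard_normal(t.size)   # "right eye" trace

phi_l = np.angle(hilbert(left))
phi_r = np.angle(hilbert(right))
R = np.abs(np.mean(np.exp(1j * (phi_l - phi_r))))    # mean phase coherence
print(f"mean phase coherence R = {R:.2f}")
```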
A model that integrates eye velocity commands to keep track of smooth eye displacements.
Blohm, Gunnar; Optican, Lance M; Lefèvre, Philippe
2006-08-01
Past results have reported conflicting findings on the oculomotor system's ability to keep track of smooth eye movements in darkness. Whereas some results indicate that saccades cannot compensate for smooth eye displacements, others report that memory-guided saccades during smooth pursuit are spatially correct. Recently, it was shown that the amount of time before the saccade made a difference: short-latency saccades were retinotopically coded, whereas long-latency saccades were spatially coded. Here, we propose a model of the saccadic system that can explain the available experimental data. The novel part of this model consists of a delayed integration of efferent smooth eye velocity commands. Two alternative physiologically realistic neural mechanisms for this integration stage are proposed. Model simulations accurately reproduced prior findings. Thus, this model reconciles the earlier contradictory reports from the literature about compensation for smooth eye movements before saccades because it involves a slow integration process.
A unified dynamic neural field model of goal directed eye movements
NASA Astrophysics Data System (ADS)
Quinton, J. C.; Goffart, L.
2018-01-01
Primates heavily rely on their visual system, which exploits signals of graded precision based on the eccentricity of the target in the visual field. The interactions with the environment involve actively selecting and focusing on visual targets or regions of interest, instead of contemplating an omnidirectional visual flow. Eye-movements specifically allow foveating targets and tracking their motion. Once a target is brought within the central visual field, eye-movements are usually classified into catch-up saccades (jumping from one orientation or fixation to another) and smooth pursuit (continuously tracking a target with low velocity). Building on existing dynamic neural field equations, we introduce a novel model that incorporates internal projections to better estimate the current target location (associated with a peak of activity). Such an estimate is then used to trigger an eye movement, leading to qualitatively different behaviours depending on the dynamics of the whole oculomotor system: (1) fixational eye-movements due to small variations in the weights of projections when the target is stationary, (2) interceptive and catch-up saccades when peaks build and relax on the neural field, (3) smooth pursuit when the peak stabilises near the centre of the field, the system reaching a fixed point attractor. Learning is nevertheless required for tracking a rapidly moving target, and the proposed model thus replicates recent results in the monkey, in which repeated exercise permits the maintenance of the target within the central visual field at its current (here-and-now) location, despite the delays involved in transmitting retinal signals to the oculomotor neurons.
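A compact sketch of the kind of Amari-type dynamic neural field equation such models build on (not the specific model of this paper): tau du/dt = -u + w * f(u) + input, with a local-excitation/surround-inhibition kernel w and a sigmoid output f. Kernel widths, gains, and the resting level below are illustrative choices.

```python
# Minimal sketch: a 1D dynamic neural field driven by a localized input; a
# self-sustained peak of activity forms at the stimulated location.
import numpy as np

def simulate_field(stimulus, steps=200, dt=1.0, tau=10.0):
    n = stimulus.size
    x = np.arange(n) - n // 2
    # Local excitation minus broader inhibition ("Mexican hat" interaction).
    kernel = 1.0 * np.exp(-x**2 / (2 * 3.0**2)) - 0.5 * np.exp(-x**2 / (2 * 9.0**2))
    u = np.zeros(n)                                   # field activity
    f = lambda a: 1.0 / (1.0 + np.exp(-5.0 * a))      # sigmoid firing rate
    for _ in range(steps):
        lateral = np.convolve(f(u), kernel, mode="same")
        u += dt / tau * (-u + lateral + stimulus - 1.0)   # -1.0: resting level
    return u

stim = np.zeros(101)
stim[60] = 3.0                                        # localized target input
u = simulate_field(stim)
print("peak location:", int(np.argmax(u)))            # peak at the stimulated site
```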
A novel approach to training attention and gaze in ASD: A feasibility and efficacy pilot study.
Chukoskie, Leanne; Westerfield, Marissa; Townsend, Jeanne
2018-05-01
In addition to the social, communicative and behavioral symptoms that define the disorder, individuals with ASD have difficulty re-orienting attention quickly and accurately. Similarly, fast re-orienting saccadic eye movements are also inaccurate and more variable in both endpoint and timing. Atypical gaze and attention are among the earliest symptoms observed in ASD. Disruption of these foundation skills critically affects the development of higher level cognitive and social behavior. We propose that interventions aimed at these early deficits that support social and cognitive skills will be broadly effective. We conducted a pilot clinical trial designed to demonstrate the feasibility and preliminary efficacy of using gaze-contingent video games for low-cost in-home training of attention and eye movement. Eight adolescents with ASD participated in an 8-week training, with pre-, mid- and post-testing of eye movement and attention control. Six of the eight adolescents completed the 8 weeks of training and all six showed improvement in attention (orienting, disengagement), eye movement control, or both. All game systems remained intact for the duration of training and all participants could use the system independently. We delivered a robust, low-cost, gaze-contingent game system for home use that, in our pilot training sample, improved the attention orienting and eye movement performance of adolescent participants in 8 weeks of training. We are currently conducting a clinical trial to replicate these results and to examine what, if any, aspects of training transfer to more real-world tasks. © 2017 Wiley Periodicals, Inc. Develop Neurobiol 78: 546-554, 2018.
The perception of heading during eye movements
NASA Technical Reports Server (NTRS)
Royden, Constance S.; Banks, Martin S.; Crowell, James A.
1992-01-01
Warren and Hannon (1988, 1990), while studying the perception of heading during eye movements, concluded that people do not require extraretinal information to judge heading with eye/head movements present. Here, heading judgments are examined at higher, more typical eye movement velocities than the extremely slow tracking eye movements used by Warren and Hannon. It is found that people require extraretinal information about eye position to perceive heading accurately under many viewing conditions.
Electroencephalographic prodromal markers of dementia across conscious states in Parkinson’s disease
Latreille, Véronique; Gaudet-Fex, Benjamin; Rodrigues-Brazète, Jessica; Panisset, Michel; Chouinard, Sylvain; Postuma, Ronald B.
2016-01-01
In Parkinson’s disease, electroencephalographic abnormalities during wakefulness and non-rapid eye movement sleep (spindles) were found to be predictive biomarkers of dementia. Because rapid eye movement sleep is regulated by the cholinergic system, which shows early degeneration in Parkinson’s disease with cognitive impairment, anomalies during this sleep stage might mirror dementia development. In this prospective study, we examined baseline electroencephalographic absolute spectral power across three states of consciousness (non-rapid eye movement sleep, rapid eye movement sleep, and wakefulness) in 68 non-demented patients with Parkinson’s disease and 44 healthy controls. All participants underwent baseline polysomnographic recordings and a comprehensive neuropsychological assessment. Power spectral analyses were performed on standard frequency bands. Dominant occipital frequency during wakefulness and ratios of slow-to-fast frequencies during rapid eye movement sleep and wakefulness were also computed. At follow-up (an average 4.5 years after baseline), 18 patients with Parkinson’s disease had developed dementia and 50 patients remained dementia-free. In rapid eye movement sleep, patients with Parkinson’s disease who later developed dementia showed, at baseline, higher absolute power in delta and theta bands and a higher slowing ratio, especially in temporal, parietal, and occipital regions, compared to patients who remained dementia-free and controls. In non-rapid eye movement sleep, lower baseline sigma power in parietal cortical regions also predicted development of dementia. During wakefulness, patients with Parkinson’s disease who later developed dementia showed lower dominant occipital frequency as well as higher delta and slowing ratio compared to patients who remained dementia-free and controls. At baseline, higher slowing ratios in temporo-occipital regions during rapid eye movement sleep were associated with poor performance on visuospatial tests in patients with Parkinson’s disease. Using receiver operating characteristic curves, we found that the best predictors of dementia in Parkinson’s disease were rapid eye movement sleep slowing ratios in posterior regions, wakefulness slowing ratios in temporal areas, and lower dominant occipital frequency. These results suggest that electroencephalographic slowing during sleep is a promising new predictive biomarker for Parkinson’s disease dementia, perhaps as a marker of cholinergic denervation. PMID:26912643
Krieber, Magdalena; Bartl-Pokorny, Katrin D.; Pokorny, Florian B.; Zhang, Dajie; Landerl, Karin; Körner, Christof; Pernkopf, Franz; Pock, Thomas; Einspieler, Christa; Marschik, Peter B.
2017-01-01
The present study aimed to define differences between silent and oral reading with respect to spatial and temporal eye movement parameters. Eye movements of 22 German-speaking adolescents (14 females; mean age = 13;6 years;months) were recorded while reading an age-appropriate text silently and orally. Preschool cognitive abilities were assessed at the participants’ age of 5;7 (years;months) using the Kaufman Assessment Battery for Children. The participants’ reading speed and reading comprehension at the age of 13;6 (years;months) were determined using a standardized inventory to evaluate silent reading skills in German readers (Lesegeschwindigkeits- und -verständnistest für Klassen 6–12). The results show that (i) reading mode significantly influenced both spatial and temporal characteristics of eye movement patterns; (ii) articulation decreased the consistency of intraindividual reading performances with regard to a significant number of eye movement parameters; (iii) reading skills predicted the majority of eye movement parameters during silent reading, but influenced only a restricted number of eye movement parameters when reading orally; (iv) differences with respect to a subset of eye movement parameters increased with reading skills; (v) an overall preschool cognitive performance score predicted reading skills at the age of 13;6 (years;months), but not eye movement patterns during either silent or oral reading. However, we found a few significant correlations between preschool performances on subscales of sequential and simultaneous processing and eye movement parameters for both reading modes. Overall, the findings suggest that eye movement patterns depend on the reading mode. Preschool cognitive abilities were more closely related to eye movement patterns of oral than silent reading, while reading skills predicted eye movement patterns during silent reading, but less so during oral reading. PMID:28151950
Mannan, Malik M Naeem; Kim, Shinjung; Jeong, Myung Yung; Kamran, M Ahmad
2016-02-19
Contamination of eye movement and blink artifacts in electroencephalogram (EEG) recordings makes the analysis of EEG data more difficult and could result in misleading findings. Efficient removal of these artifacts from EEG data is an essential step in improving classification accuracy to develop the brain-computer interface (BCI). In this paper, we propose an automatic framework based on independent component analysis (ICA) and system identification to identify and remove ocular artifacts from EEG data using a hybrid EEG and eye tracker system. The performance of the proposed algorithm is illustrated using experimental and standard EEG datasets. The proposed algorithm not only removes the ocular artifacts from the artifactual zone but also preserves the neuronal-activity-related EEG signals in the non-artifactual zone. Comparison with two state-of-the-art techniques, namely ADJUST-based ICA and REGICA, reveals the significantly improved performance of the proposed algorithm for removing eye movement and blink artifacts from EEG data. Additionally, the results demonstrate that the proposed algorithm achieves lower relative error and higher mutual information values between corrected EEG and artifact-free EEG data.
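A minimal Python sketch of the ICA-correlation step such a pipeline rests on (flagging independent components that track a synchronized ocular reference and reconstructing the EEG without them); the array shapes, the 0.6 correlation threshold, and the omission of the system-identification stage are assumptions made here for illustration, not the authors' exact algorithm:

    import numpy as np
    from sklearn.decomposition import FastICA

    def remove_ocular_components(eeg, ocular_ref, corr_threshold=0.6):
        """Zero out independent components correlated with an ocular reference.
        eeg: (n_channels, n_samples) array; ocular_ref: (n_samples,) EOG/eye-tracker trace."""
        ica = FastICA(n_components=eeg.shape[0], random_state=0)
        sources = ica.fit_transform(eeg.T)              # (n_samples, n_components)
        corrs = np.array([abs(np.corrcoef(sources[:, k], ocular_ref)[0, 1])
                          for k in range(sources.shape[1])])
        sources[:, corrs > corr_threshold] = 0.0        # suppress ocular components
        return ica.inverse_transform(sources).T         # cleaned EEG, (n_channels, n_samples)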
Saccade Preparation Is Required for Exogenous Attention but Not Endogenous Attention or IOR
ERIC Educational Resources Information Center
Smith, Daniel T.; Schenk, Thomas; Rorden, Chris
2012-01-01
Covert attention is tightly coupled with the control of eye movements, but there is controversy about how tight this coupling is. The premotor theory of attention proposes that activation of the eye movement system is necessary to produce shifts of attention. In this study, we experimentally prevented healthy participants from planning or…
Target Selection by the Frontal Cortex during Coordinated Saccadic and Smooth Pursuit Eye Movements
ERIC Educational Resources Information Center
Srihasam, Krishna; Bullock, Daniel; Grossberg, Stephen
2009-01-01
Oculomotor tracking of moving objects is an important component of visually based cognition and planning. Such tracking is achieved by a combination of saccades and smooth-pursuit eye movements. In particular, the saccadic and smooth-pursuit systems interact to often choose the same target, and to maximize its visibility through time. How do…
Eye movement analysis of reading from computer displays, eReaders and printed books.
Zambarbieri, Daniela; Carniglia, Elena
2012-09-01
To compare eye movements during silent reading of three eBooks and a printed book. The three different eReading tools were a desktop PC, iPad tablet and Kindle eReader. Video-oculographic technology was used for recording eye movements. In the case of reading from the computer display the recordings were made by a video camera placed below the computer screen, whereas for reading from the iPad tablet, eReader and printed book the recording system was worn by the subject and had two cameras: one for recording the movement of the eyes and the other for recording the scene in front of the subject. Data analysis provided quantitative information in terms of number of fixations, their duration, and the direction of the movement, the latter to distinguish between fixations and regressions. Mean fixation duration was different only in reading from the computer display, and was similar for the Tablet, eReader and printed book. The percentage of regressions with respect to the total amount of fixations was comparable for eReading tools and the printed book. The analysis of eye movements during reading an eBook from different eReading tools suggests that subjects' reading behaviour is similar to reading from a printed book. © 2012 The College of Optometrists.
Murata, Atsuo; Fukunaga, Daichi
2018-04-01
This study attempted to investigate the effects of the target shape and the movement direction on the pointing time using an eye-gaze input system, and to extend Fitts' model so that these factors are incorporated into the model and the predictive power of Fitts' model is enhanced. The target shape, the target size, the movement distance, and the direction of target presentation were set as within-subject experimental variables. The target shapes included a circle and rectangles with an aspect ratio of 1:1, 1:2, 1:3, and 1:4. The movement direction included eight directions: upper, lower, left, right, upper left, upper right, lower left, and lower right. On the basis of the data for identifying the effects of the target shape and the movement direction on the pointing time, an attempt was made to develop a generalized and extended Fitts' model that took into account the movement direction and the target shape. As a result, the generalized and extended model was found to fit the experimental data better and to be more effective for predicting the pointing time for a variety of human-computer interaction (HCI) tasks using an eye-gaze input system. Copyright © 2017. Published by Elsevier Ltd.
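As background, the classical Fitts' law predicts movement time from an index of difficulty, MT = a + b*log2(D/W + 1). A hedged sketch of how shape and direction terms could be folded into a least-squares fit of that kind; the extra regressors (target aspect ratio and the sine/cosine of movement direction) and coefficient names are illustrative assumptions, not the published extended model:

    import numpy as np

    def fit_extended_fitts(D, W, aspect, angle_rad, MT):
        """Least-squares fit of MT = a + b*ID + c*aspect + d*cos(angle) + e*sin(angle),
        with ID = log2(D/W + 1). All arguments are 1-D NumPy arrays of equal length."""
        ID = np.log2(D / W + 1.0)
        X = np.column_stack([np.ones_like(ID), ID, aspect,
                             np.cos(angle_rad), np.sin(angle_rad)])
        coef, *_ = np.linalg.lstsq(X, MT, rcond=None)
        return coef  # [a, b, c, d, e]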
... t work properly. There are many kinds of eye movement disorders. Two common ones are Strabismus - a disorder ... of the eyes, sometimes called "dancing eyes" Some eye movement disorders are present at birth. Others develop over ...
Melek, Nélida B; Blanco, Susana; Garcia, Horacio
2006-01-01
These two eye movements have not been previously studied in this condition by this method. Five cases were studied. Both visual acuity and eye examination of anterior and posterior segments were normal. A Nicolet Nystar Plus system with chloride silver electrodes was used to record the EOG. Of the two systems under study, the smooth pursuit system showed the most relevant anomalies, both in the Duane's eye and in the apparently healthy eye. No correlation was found between the pursuit and optokinetic nystagmus disorders. In some cases, more significant abnormalities were observed in the clinically normal eye. The results clearly demonstrated a significant impairment of the pursuit system. This suggests that this motor disorder is not exclusively caused by hypoplasia or aplasia of the nucleus of the abducens nerve (VIth cranial nerve). These abnormalities might be related to a poor development of the rhombencephalon since both supramotor nuclei as well as the pathways of this system arise from this region of the embryonic brain. In the particular case of OKN, the supramotor nuclei have a different origin. Therefore, these systems might be affected differently.
Pioneers of eye movement research
Wade, Nicholas J
2010-01-01
Recent advances in the technology affording eye movement recordings carry the risk of neglecting past achievements. Without the assistance of this modern armoury, great strides were made in describing the ways the eyes move. For Aristotle the fundamental features of eye movements were binocular, and he described the combined functions of the eyes. This was later given support using simple procedures like placing a finger over the eyelid of the closed eye and culminated in Hering's law of equal innervation. However, the overriding concern in the 19th century was with eye position rather than eye movements. Appreciating discontinuities of eye movements arose from studies of vertigo. The characteristics of nystagmus were recorded before those of saccades and fixations. Eye movements during reading were described by Hering and by Lamare in 1879; both used similar techniques of listening to sounds made during contractions of the extraocular muscles. Photographic records of eye movements during reading were made by Dodge early in the 20th century, and this stimulated research using a wider array of patterns. In the mid-20th century attention shifted to the stability of the eyes during fixation, with the emphasis on involuntary movements. The contributions of pioneers from Aristotle to Yarbus are outlined. PMID:23396982
Face recognition increases during saccade preparation.
Lin, Hai; Rizak, Joshua D; Ma, Yuan-ye; Yang, Shang-chuan; Chen, Lin; Hu, Xin-tian
2014-01-01
Face perception is integral to the human perceptual system as it underlies social interactions. Saccadic eye movements are frequently made to bring interesting visual information, such as faces, onto the fovea for detailed processing. Just before eye movement onset, the processing of some basic features of an object, such as its orientation, improves at the saccade landing point. Interestingly, there is also evidence that indicates faces are processed in early visual processing stages similar to basic features. However, it is not known whether this early enhancement of processing includes face recognition. In this study, three experiments were performed to map the timing of face presentation to the beginning of the eye movement in order to evaluate pre-saccadic face recognition. Faces were found to be processed similarly to simple objects immediately prior to saccadic movements. Starting ∼120 ms before a saccade to a target face, independent of whether or not the face was surrounded by other faces, face recognition gradually improved and the critical spacing of the crowding decreased as saccade onset approached. These results suggest that an upcoming saccade prepares the visual system for new information about faces at the saccade landing site and may reduce the background in a crowd to target the intended face. This indicates an important role of pre-saccadic eye movement signals in human face recognition.
Eye movement perimetry in glaucoma.
Trope, G E; Eizenman, M; Coyle, E
1989-08-01
Present-day computerized perimetry is often inaccurate and unreliable owing to the need to maintain central fixation over long periods while repressing the normal response to presentation of peripheral stimuli. We tested a new method of perimetry that does not require prolonged central fixation. During this test eye movements were encouraged on presentation of a peripheral target. Twenty-three eyes were studied with an Octopus perimeter, with a technician monitoring eye movements. The sensitivity was 100% and the specificity 23%. The low specificity was due to the technician's inability to accurately monitor small eye movements in the central 6 degrees field. If small eye movements are monitored accurately with an eye tracker, eye movement perimetry could become an alternative method to standard perimetry.
NASA Astrophysics Data System (ADS)
Huang, Hongxin; Toyoda, Haruyoshi; Inoue, Takashi
2017-09-01
The performance of an adaptive optics scanning laser ophthalmoscope (AO-SLO) using a liquid crystal on silicon spatial light modulator and Shack-Hartmann wavefront sensor was investigated. The system achieved high-resolution and high-contrast images of human retinas by dynamic compensation for the aberrations in the eyes. Retinal structures such as photoreceptor cells, blood vessels, and nerve fiber bundles, as well as blood flow, could be observed in vivo. We also investigated involuntary eye movements and ascertained microsaccades and drifts using both the retinal images and the aberrations recorded simultaneously. Furthermore, we measured the interframe displacement of retinal images and found that during eye drift, the displacement has a linear relationship with the residual low-order aberration. The estimated duration and cumulative displacement of the drift were within the ranges estimated by a video tracking technique. The AO-SLO would not only be used for the early detection of eye diseases, but would also offer a new approach for involuntary eye movement research.
NASA Astrophysics Data System (ADS)
Taylor, Natalie M.; van Saarloos, Paul P.; Eikelboom, Robert H.
2000-06-01
This study aimed to gauge the effect of the patient's eye movement during photorefractive keratectomy (PRK) on post-operative vision. A computer simulation of both the PRK procedure and the visual outcome has been performed. The PRK simulation incorporated the pattern of movement of the laser beam to perform a given correction, the beam characteristics, an initial corneal profile, and an eye movement scenario; and generated the corrected corneal profile. The regrowth of the epithelium was simulated by selecting the smoothing filter which, when applied to a corrected cornea with no patient eye movement, produced similar ray tracing results to the original corneal model. Ray tracing of several objects, such as letters of various contrasts and sizes, was performed to assess the quality of the post-operative vision. Eye movement scenarios included no eye movement, constant decentration and normally distributed random eye movement of varying magnitudes. Random eye movement of even small amounts, such as 50 microns, reduces the contrast sensitivity of the image. Constant decentration decenters the projected image on the retina, and in extreme cases can lead to astigmatism. Eye movements of the magnitude expected during laser refractive surgery have minimal effect on the final visual outcome.
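A minimal sketch of how the three eye-movement scenarios described above might be generated for such a simulation (no movement, constant decentration, normally distributed random movement per laser pulse); the 50 µm figure echoes the magnitude discussed, while the function name and default offset are assumptions for illustration:

    import numpy as np

    def eye_movement_scenario(n_pulses, mode="random", sigma_um=50.0,
                              offset_um=(100.0, 0.0), seed=0):
        """Return per-pulse (x, y) eye displacements in micrometres."""
        if mode == "none":
            return np.zeros((n_pulses, 2))              # perfectly steady eye
        if mode == "decentration":
            return np.tile(offset_um, (n_pulses, 1))    # constant offset of the ablation
        rng = np.random.default_rng(seed)
        return rng.normal(0.0, sigma_um, size=(n_pulses, 2))  # Gaussian jitter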
Gandhi, Neeraj J; Barton, Ellen J; Sparks, David L
2008-07-01
Constant frequency microstimulation of the paramedian pontine reticular formation (PPRF) in head-restrained monkeys evokes a constant velocity eye movement. Since the PPRF receives significant projections from structures that control coordinated eye-head movements, we asked whether stimulation of the pontine reticular formation in the head-unrestrained animal generates a combined eye-head movement or only an eye movement. Microstimulation of most sites yielded a constant-velocity gaze shift executed as a coordinated eye-head movement, although eye-only movements were evoked from some sites. The eye and head contributions to the stimulation-evoked movements varied across stimulation sites and were drastically different from the lawful relationship observed for visually-guided gaze shifts. These results indicate that the microstimulation activated elements that issued movement commands to the extraocular and, for most sites, neck motoneurons. In addition, the stimulation-evoked changes in gaze were similar in the head-restrained and head-unrestrained conditions despite the assortment of eye and head contributions, suggesting that the vestibulo-ocular reflex (VOR) gain must be near unity during the coordinated eye-head movements evoked by stimulation of the PPRF. These findings contrast the attenuation of VOR gain associated with visually-guided gaze shifts and suggest that the vestibulo-ocular pathway processes volitional and PPRF stimulation-evoked gaze shifts differently.
Trained Eyes: Experience Promotes Adaptive Gaze Control in Dynamic and Uncertain Visual Environments
Taya, Shuichiro; Windridge, David; Osman, Magda
2013-01-01
Current eye-tracking research suggests that our eyes make anticipatory movements to a location that is relevant for a forthcoming task. Moreover, there is evidence to suggest that with more practice anticipatory gaze control can improve. However, these findings are largely limited to situations where participants are actively engaged in a task. We ask: does experience modulate anticipatory gaze control while passively observing a visual scene? To tackle this we tested people with varying degrees of experience of tennis, in order to uncover potential associations between experience and eye movement behaviour while they watched tennis videos. The number, size, and accuracy of saccades (rapid eye-movements) made around ‘events’ that are critical for the scene context (i.e. hits and bounces) were analysed. Overall, we found that experience improved anticipatory eye-movements while watching tennis clips. In general, those with extensive experience showed greater accuracy of saccades to upcoming event locations; this was particularly prevalent for events in the scene that carried high uncertainty (i.e. ball bounces). The results indicate that, even when passively observing, our gaze control system utilizes prior relevant knowledge in order to anticipate upcoming uncertain event locations. PMID:23951147
Learning optimal eye movements to unusual faces
Peterson, Matthew F.; Eckstein, Miguel P.
2014-01-01
Eye movements, which guide the fovea’s high resolution and computational power to relevant areas of the visual scene, are integral to efficient, successful completion of many visual tasks. How humans modify their eye movements through experience with their perceptual environments, and its functional role in learning new tasks, has not been fully investigated. Here, we used a face identification task where only the mouth discriminated exemplars to assess if, how, and when eye movement modulation may mediate learning. By interleaving trials of unconstrained eye movements with trials of forced fixation, we attempted to separate the contributions of eye movements and covert mechanisms to performance improvements. Without instruction, a majority of observers substantially increased accuracy and learned to direct their initial eye movements towards the optimal fixation point. The proximity of an observer’s default face identification eye movement behavior to the new optimal fixation point and the observer’s peripheral processing ability were predictive of performance gains and eye movement learning. After practice in a subsequent condition in which observers were directed to fixate different locations along the face, including the relevant mouth region, all observers learned to make eye movements to the optimal fixation point. In this fully learned state, augmented fixation strategy accounted for 43% of total efficiency improvements while covert mechanisms accounted for the remaining 57%. The findings suggest a critical role for eye movement planning to perceptual learning, and elucidate factors that can predict when and how well an observer can learn a new task with unusual exemplars. PMID:24291712
Nonhuman Primate Studies to Advance Vision Science and Prevent Blindness.
Mustari, Michael J
2017-12-01
Most primate behavior is dependent on high acuity vision. Optimal visual performance in primates depends heavily upon frontally placed eyes, retinal specializations, and binocular vision. To see an object clearly its image must be placed on or near the fovea of each eye. The oculomotor system is responsible for maintaining precise eye alignment during fixation and generating eye movements to track moving targets. The visual system of nonhuman primates has a similar anatomical organization and functional capability to that of humans. This allows results obtained in nonhuman primates to be applied to humans. The visual and oculomotor systems of primates are immature at birth and sensitive to the quality of binocular visual and eye movement experience during the first months of life. Disruption of postnatal experience can lead to problems in eye alignment (strabismus), amblyopia, unsteady gaze (nystagmus), and defective eye movements. Recent studies in nonhuman primates have begun to discover the neural mechanisms associated with these conditions. In addition, genetic defects that target the retina can lead to blindness. A variety of approaches including gene therapy, stem cell treatment, neuroprosthetics, and optogenetics are currently being used to restore function associated with retinal diseases. Nonhuman primates often provide the best animal model for advancing fundamental knowledge and developing new treatments and cures for blinding diseases. © The Author(s) 2017. Published by Oxford University Press on behalf of the National Academy of Sciences. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Fetal eye movements on magnetic resonance imaging.
Woitek, Ramona; Kasprian, Gregor; Lindner, Christian; Stuhr, Fritz; Weber, Michael; Schöpf, Veronika; Brugger, Peter C; Asenbaum, Ulrika; Furtner, Julia; Bettelheim, Dieter; Seidl, Rainer; Prayer, Daniela
2013-01-01
Eye movements are the physical expression of upper fetal brainstem function. Our aim was to identify and differentiate specific types of fetal eye movement patterns using dynamic MRI sequences. Their occurrence as well as the presence of conjugated eyeball motion and consistently parallel eyeball position was systematically analyzed. Dynamic SSFP sequences were acquired in 72 singleton fetuses (17-40 GW, three age groups [17-23 GW, 24-32 GW, 33-40 GW]). Fetal eye movements were evaluated according to a modified classification originally published by Birnholz (1981): Type 0: no eye movements; Type I: single transient deviations; Type Ia: fast deviation, slower reposition; Type Ib: fast deviation, fast reposition; Type II: single prolonged eye movements; Type III: complex sequences; and Type IV: nystagmoid. In 95.8% of fetuses, the evaluation of eye movements was possible using MRI, with a mean acquisition time of 70 seconds. Due to head motion, 4.2% of the fetuses and 20.1% of all dynamic SSFP sequences were excluded. Eye movements were observed in 45 fetuses (65.2%). Significant differences between the age groups were found for Type I (p = 0.03), Type Ia (p = 0.031), and Type IV eye movements (p = 0.033). Consistently parallel bulbs were found in 27.3-45%. In human fetuses, different eye movement patterns can be identified and described by MRI in utero. In addition to the originally classified eye movement patterns, a novel subtype has been observed, which apparently characterizes an important step in fetal brainstem development. We evaluated, for the first time, eyeball position in fetuses. Ultimately, the assessment of fetal eye movements by MRI yields the potential to identify early signs of brainstem dysfunction, as encountered in brain malformations such as Chiari II or molar tooth malformations.
Yamada, T; Suzuki, D A; Yee, R D
1996-11-01
1. Smooth pursuit-like eye movements were evoked with low current microstimulation delivered to rostral portions of the nucleus reticularis tegmenti pontis (rNRTP) in alert macaques. Microstimulation sites were selected by the observation of modulations in single-cell firing rates that were correlated with periodic smooth-pursuit eye movements. Current intensities ranged from 10 to 120 microA and were routinely < 40 microA. Microstimulation was delivered either in the dark with no fixation, 100 ms after a fixation target was extinguished, or during maintained fixation of a stationary or moving target. Evoked eye movements also were studied under open-loop conditions with the target image stabilized on the retina. 2. Eye movements evoked in the absence of a target rapidly accelerated to a constant velocity that was maintained for the duration of the microstimulation. Evoked eye speeds ranged from 3.7 to 23 deg/s and averaged 11 deg/s. Evoked eye speed appeared to be linearly related to initial eye position with a sensitivity to initial eye position that averaged 0.23 deg.s-1.deg-1. While some horizontal and oblique smooth eye movements were elicited, microstimulation resulted in upward eye movements in 89% of the sites. 3. Evoked eye speed was found to be dependent on microstimulation pulse frequency and current intensity. Within limits, evoked eye speed increased with increases in stimulation frequency or current intensity. For stimulation frequencies < 300-400 Hz, only smooth pursuit-like eye movements were evoked. At higher stimulation frequencies, accompanying saccades consistently were elicited. 4. Feedback of retinal image motion interacted with the evoked eye movements to decrease eye speed if the visual motion was in the opposite direction to the evoked, pursuit-like eye movements. 5. The results implicate rNRTP as part of the neuronal substrate that controls smooth-pursuit eye movements. NRTP appears to be divided functionally into a rostral, pursuit-related portion and a caudal, saccade-related area. rNRTP is a component of a corticopontocerebellar circuit that presumably involves the pursuit area of the frontal eye field and that parallels the middle and medial superior temporal cerebral cortical/dorsolateral pontine nucleus (MT/MST-DLPN-cerebellum) pathway known to be involved also with regulating smooth-pursuit eye movements.
ECEM (Eye Closure, Eye Movements): application to depersonalization disorder.
Hollander, Harriet E
2009-10-01
Eye Closure, Eye Movements (ECEM) is a hypnotically based approach to treatment that incorporates eye movements adapted from the Eye Movement Desensitization and Reprocessing (EMDR) protocol in conjunction with hypnosis for the treatment of depersonalization disorder. Depersonalization disorder has been differentiated from post-traumatic stress disorders and has recently been conceptualized as a subtype of panic disorder (Baker et al., 2003; David, Phillips, Medford, & Sierra, 2004; Segui et al., 2000). During ECEM, while remaining in a hypnotic state, clients self-generated six to seven trials of eye movements to reduce anticipatory anxiety associated with depersonalization disorder. Eye movements were also used to process triggers that elicited breath holding, often followed by episodes of depersonalization. Hypnotic suggestions were used to reverse core symptoms of depersonalization, subjectively described as "feeling unreal" (Simeon et al., 1997).
Pion-Massicotte, Joëlle; Godbout, Roger; Savard, Pierre; Roy, Jean-François
2018-02-23
Portable polysomnography is often too complex and cumbersome for recording sleep at home. We recorded sleep using a biometric shirt (electrocardiogram sensors, respiratory inductance plethysmography bands and an accelerometer) in 21 healthy young adults studied in a sleep laboratory for two consecutive nights, together with standard polysomnography. Polysomnographic recordings were scored using standard methods. An algorithm was developed to classify the biometric shirt recordings into rapid eye movement sleep, non-rapid eye movement sleep and wake. The algorithm was based on breathing rate, heart rate variability and body movement, and included a correction for sleep onset and offset. The overall mean percentage of agreement between the two sets of recordings was 77.4%; when non-rapid eye movement and rapid eye movement sleep epochs were grouped together, it increased to 90.8%. The overall kappa coefficient was 0.53. Five of the seven sleep variables were significantly correlated. The findings of this pilot study indicate that this simple portable system could be used to estimate the general sleep pattern of young healthy adults. © 2018 European Sleep Research Society.
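A minimal rule-based sketch in the spirit of the classification described (wake / REM / non-REM from movement, heart-rate variability and breathing features); the feature names and thresholds are illustrative assumptions, not the authors' validated algorithm, which also corrected for sleep onset and offset:

    def classify_epoch(movement, hrv_rmssd_s, breathing_rate_bpm,
                       move_thresh=0.5, hrv_thresh=0.05, br_thresh=16.0):
        """Classify one 30-s epoch from biometric-shirt features (thresholds illustrative)."""
        if movement > move_thresh:
            return "wake"        # sustained body movement
        if hrv_rmssd_s > hrv_thresh and breathing_rate_bpm > br_thresh:
            return "REM"         # more variable heart rate and faster, less regular breathing
        return "NREM"            # otherwise quiet, regular sleep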
Knight, T A
2012-12-06
The frontal eye field (FEF) has a strong influence on saccadic eye movements with the head restrained. With the head unrestrained, eye saccades combine with head movements to produce large gaze shifts, and microstimulation of the FEF evokes both eye and head movements. To test whether the dorsomedial FEF provides commands for the entire gaze shift or its separate eye and head components, we recorded extracellular single-unit activity in monkeys trained to make large head-unrestrained gaze shifts. We recorded 80 units active during gaze shifts, and closely examined 26 of these that discharged a burst of action potentials that preceded horizontal gaze movements. These units were movement or visuomovement related and most exhibited open movement fields with respect to amplitude. To reveal the relations of burst parameters to gaze, eye, and/or head movement metrics, we used behavioral dissociations of gaze, eye, and head movements and linear regression analyses. The burst number of spikes (NOS) was strongly correlated with movement amplitude and burst temporal parameters were strongly correlated with movement temporal metrics for eight gaze-related burst neurons and five saccade-related burst neurons. For the remaining 13 neurons, the NOS was strongly correlated with the head movement amplitude, but burst temporal parameters were most strongly correlated with eye movement temporal metrics (head-eye-related burst neurons, HEBNs). These results suggest that FEF units do not encode a command for the unified gaze shift only; instead, different units may carry signals related to the overall gaze shift or its eye and/or head components. Moreover, the HEBNs exhibit bursts whose magnitude and timing may encode a head displacement signal and a signal that influences the timing of the eye saccade, thereby serving as a mechanism for coordinating the eye and head movements of a gaze shift. Copyright © 2012 IBRO. Published by Elsevier Ltd. All rights reserved.
Parker, Andrew; Relph, Sarah; Dagnall, Neil
2008-01-01
Two experiments are reported that investigate the effects of saccadic bilateral eye movements on the retrieval of item, associative, and contextual information. Experiment 1 compared the effects of bilateral versus vertical versus no eye movements on tests of item recognition, followed by remember-know responses and associative recognition. Supporting previous research, bilateral eye movements enhanced item recognition by increasing the hit rate and decreasing the false alarm rate. Analysis of remember-know responses indicated that eye movement effects were accompanied by increases in remember responses. The test of associative recognition found that bilateral eye movements increased correct responses to intact pairs and decreased false alarms to rearranged pairs. Experiment 2 assessed the effects of eye movements on the recall of intrinsic (color) and extrinsic (spatial location) context. Bilateral eye movements increased correct recall for both types of context. The results are discussed within the framework of dual-process models of memory and the possible neural underpinnings of these effects are considered.
Eivazi, Shahram; Hafez, Ahmad; Fuhl, Wolfgang; Afkari, Hoorieh; Kasneci, Enkelejda; Lehecka, Martin; Bednarik, Roman
2017-06-01
Previous studies have consistently demonstrated gaze behaviour differences related to expertise during various surgical procedures. In micro-neurosurgery, however, there is a lack of evidence of empirically demonstrated individual differences associated with visual attention. It is unknown exactly how neurosurgeons see a stereoscopic magnified view in the context of micro-neurosurgery and what this implies for medical training. We report on an investigation of the eye movement patterns in micro-neurosurgery using a state-of-the-art eye tracker. We studied the eye movements of nine neurosurgeons while performing cutting and suturing tasks under a surgical microscope. Eye-movement characteristics, such as fixation (focus level) and saccade (visual search pattern), were analysed. The results show a strong relationship between the level of microsurgical skill and the gaze pattern, with greater expertise associated with greater eye control, stability, and focus in eye behaviour. For example, in the cutting task, well-trained surgeons increased their fixation durations on the operating field twice as much as the novices (expert, 848 ms; novice, 402 ms). Maintaining steady visual attention on the target (fixation), as well as being able to quickly make eye jumps from one target to another (saccades), are two important elements for the success of neurosurgery. The captured gaze patterns can be used to improve medical education, as part of an assessment system or in a gaze-training application.
Eye movement abnormalities in Hermansky-Pudlak syndrome.
Gradstein, Libe; FitzGibbon, Edmond J; Tsilou, Ekaterini T; Rubin, Benjamin I; Huizing, Marjan; Gahl, William A
2005-08-01
Hermansky-Pudlak Syndrome (HPS) is a type of oculocutaneous albinism associated with a bleeding diathesis and pulmonary fibrosis. Although it is known that patients with HPS exhibit nystagmus, the nature of these abnormal eye movements has not been studied. Twenty-seven patients with HPS, diagnosed by platelet morphology and genetic analysis, underwent a systemic evaluation and complete eye examination. Twenty-five had eye movement recordings using magnetic search coil, infrared, or video oculography. All patients had iris transillumination, foveal hypoplasia, and variable hypopigmentation in skin and eyes. All had bleeding tendencies, and 2 reported excessive bleeding during strabismus surgery. Nine patients had pulmonary fibrosis. Visual acuities ranged from 20/20- to 20/320. Twenty patients had strabismus despite 6 having strabismus surgery previously. Ocular oscillations consistent with congenital nystagmus (CN) were clinically evident in 24 of 27 patients, and half showed periodic alternating nystagmus. In 3 patients without CN, eye movement recordings revealed minimal end-gaze nystagmus, square-wave jerks, drift during fixation and saccades, and low-gain pursuit. These patients had melanin in the posterior pole and better visual acuities than the others (P = 0.002). Most patients with HPS have CN, and many have periodic alternating nystagmus. Some have subtle eye movement abnormalities without clinically evident nystagmus, which can obscure the diagnosis, especially if hypopigmentation is mild. Absence of clinical nystagmus in a child with HPS suggests good vision. Patients with albinism, especially before surgery, should be evaluated for HPS to prevent life-threatening complications.
Payne, Hannah L
2017-01-01
Eye movements provide insights about a wide range of brain functions, from sensorimotor integration to cognition; hence, the measurement of eye movements is an important tool in neuroscience research. We describe a method, based on magnetic sensing, for measuring eye movements in head-fixed and freely moving mice. A small magnet was surgically implanted on the eye, and changes in the magnet angle as the eye rotated were detected by a magnetic field sensor. Systematic testing demonstrated high resolution measurements of eye position of <0.1°. Magnetic eye tracking offers several advantages over the well-established eye coil and video-oculography methods. Most notably, it provides the first method for reliable, high-resolution measurement of eye movements in freely moving mice, revealing increased eye movements and altered binocular coordination compared to head-fixed mice. Overall, magnetic eye tracking provides a lightweight, inexpensive, easily implemented, and high-resolution method suitable for a wide range of applications. PMID:28872455
Low frequency rTMS over posterior parietal cortex impairs smooth pursuit eye tracking.
Hutton, Samuel B; Weekes, Brendan S
2007-11-01
The role of the posterior parietal cortex in smooth pursuit eye movements remains unclear. We used low frequency repetitive transcranial magnetic stimulation (rTMS) to study the cognitive and neural systems involved in the control of smooth pursuit eye movements. Eighteen participants were tested on two separate occasions. On each occasion we measured smooth pursuit eye tracking before and after 6 min of 1 Hz rTMS delivered at 90% of motor threshold. Low frequency rTMS over the posterior parietal cortex led to a significant reduction in smooth pursuit velocity gain, whereas rTMS over the motor cortex had no effect on gain. We conclude that low frequency offline rTMS is a potentially useful tool with which to explore the cortical systems involved in oculomotor control.
fMRI evidence for sensorimotor transformations in human cortex during smooth pursuit eye movements.
Kimmig, H; Ohlendorf, S; Speck, O; Sprenger, A; Rutschmann, R M; Haller, S; Greenlee, M W
2008-01-01
Smooth pursuit eye movements (SP) are driven by moving objects. The pursuit system processes the visual input signals and transforms this information into an oculomotor output signal. Despite the object's movement on the retina and the eyes' movement in the head, we are able to locate the object in space implying coordinate transformations from retinal to head and space coordinates. To test for the visual and oculomotor components of SP and the possible transformation sites, we investigated three experimental conditions: (I) fixation of a stationary target with a second target moving across the retina (visual), (II) pursuit of the moving target with the second target moving in phase (oculomotor), (III) pursuit of the moving target with the second target remaining stationary (visuo-oculomotor). Precise eye movement data were simultaneously measured with the fMRI data. Visual components of activation during SP were located in the motion-sensitive, temporo-parieto-occipital region MT+ and the right posterior parietal cortex (PPC). Motor components comprised more widespread activation in these regions and additional activations in the frontal and supplementary eye fields (FEF, SEF), the cingulate gyrus and precuneus. The combined visuo-oculomotor stimulus revealed additional activation in the putamen. Possible transformation sites were found in MT+ and PPC. The MT+ activation evoked by the motion of a single visual dot was very localized, while the activation of the same single dot motion driving the eye was rather extended across MT+. The eye movement information appeared to be dispersed across the visual map of MT+. This could be interpreted as a transfer of the one-dimensional eye movement information into the two-dimensional visual map. Potentially, the dispersed information could be used to remap MT+ to space coordinates rather than retinal coordinates and to provide the basis for a motor output control. A similar interpretation holds for our results in the PPC region.
ERIC Educational Resources Information Center
Nesbit, Larry L.
A research study was designed to test the relationship between the number of eye fixations and amount of learning as determined by a criterion referenced posttest. The study sought to answer the following questions: (1) Are differences in eye movement indices related to the posttest score? (2) Do differences in eye movement indices of subjects…
Zangemeister, W H; Nagel, M
2001-01-01
We investigated coordinated saccadic eye and head movements following predictive horizontal visual targets at +/- 30 degrees by applying transcranial magnetic stimulation (TMS) over the cerebellum before the start of the gaze movement in 10 young subjects. We found three effects of TMS on eye-head movements: 1. Saccadic latency effect. When stimulation took place shortly before movements commenced (75-25 ms before), significantly shorter latencies were found between predictive target presentation and initiation of saccades. Eye latencies were significantly decreased by 45 ms on average, but head latencies were not. 2. Gaze amplitude effect. Without TMS, for the 60 degrees target amplitudes, head movements usually preceded eye movements, as expected (predictive gaze type 3). With TMS 5-75 ms before the gaze movement, the number of eye movements preceding head movements by 20-50 ms was significantly increased (p < 0.001) and the delay between eye and head movements was reversed (p < 0.001), i.e. we found eye-predictive gaze type 1. 3. Saccadic peak velocity effect. For TMS 5-25 ms before the start of head movement, mean peak velocity of synkinetic eye saccades increased by 20-30% up to 600 degrees/s, compared to 350-400 degrees/s without TMS. We conclude that transient functional cerebellar deficits exerted by means of TMS can change the central synkinesis of eye-head coordination, including the preprogramming of the saccadic pulse and step of a coordinated gaze movement.
Cercenelli, Laura; Tiberi, Guido; Corazza, Ivan; Giannaccare, Giuseppe; Fresina, Michela; Marcelli, Emanuela
2017-01-01
Many open source software packages have been recently developed to expand the usability of eye tracking systems to study oculomotor behavior, but none of these is specifically designed to encompass all the main functions required for creating eye tracking tests and for providing the automatic analysis of saccadic eye movements. The aim of this study is to introduce SacLab, an intuitive, freely available MATLAB toolbox based on Graphical User Interfaces (GUIs) that we have developed to increase the usability of the ViewPoint EyeTracker (Arrington Research, Scottsdale, AZ, USA) in clinical ophthalmology practice. SacLab consists of four processing modules that enable the user to easily create visual stimuli tests (Test Designer), record saccadic eye movements (Data Recorder), analyze the recorded data to automatically extract saccadic parameters of clinical interest (Data Analyzer) and provide an aggregate analysis from multiple eye movement recordings (Saccade Analyzer), without requiring any programming effort by the user. A demo application of SacLab to carry out eye tracking tests for the analysis of horizontal saccades is reported. We tested the usability of the SacLab toolbox with three ophthalmologists who had no programming experience; the ophthalmologists were briefly trained in the use of the SacLab GUIs and were asked to perform the demo application. The toolbox received enthusiastic feedback from all the clinicians in terms of intuitiveness, ease of use and flexibility. Test creation and data processing were accomplished in 52±21 s and 46±19 s, respectively, using the SacLab GUIs. SacLab may represent a useful tool to ease the application of the ViewPoint EyeTracker system in clinical routine in ophthalmology. Copyright © 2016 Elsevier Ltd. All rights reserved.
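A minimal sketch of the kind of automatic saccade-parameter extraction such a toolbox wraps (velocity-threshold detection of the first saccade after a target step, then latency, amplitude and peak velocity); this is generic Python, not SacLab's MATLAB interface, and the 30 deg/s criterion is an assumed value:

    import numpy as np

    def first_saccade_parameters(t, x, stim_onset, vel_thresh=30.0):
        """t: time (s); x: horizontal eye position (deg); stim_onset: target step time (s)."""
        v = np.gradient(x, t)                                   # eye velocity, deg/s
        candidates = np.where((np.abs(v) > vel_thresh) & (t > stim_onset))[0]
        if candidates.size == 0:
            return None                                         # no saccade detected
        onset = candidates[0]
        below = np.where(np.abs(v[onset:]) < vel_thresh)[0]
        offset = onset + (below[0] if below.size else len(v) - 1 - onset)
        return {"latency_s": t[onset] - stim_onset,
                "amplitude_deg": x[offset] - x[onset],
                "peak_velocity_deg_s": float(np.max(np.abs(v[onset:offset + 1])))}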
Distinct eye movement patterns enhance dynamic visual acuity
Palidis, Dimitrios J.; Wyder-Hodge, Pearson A.; Fooken, Jolande; Spering, Miriam
2017-01-01
Dynamic visual acuity (DVA) is the ability to resolve fine spatial detail in dynamic objects during head fixation, or in static objects during head or body rotation. This ability is important for many activities such as ball sports, and a close relation has been shown between DVA and sports expertise. DVA tasks involve eye movements, yet, it is unclear which aspects of eye movements contribute to successful performance. Here we examined the relation between DVA and the kinematics of smooth pursuit and saccadic eye movements in a cohort of 23 varsity baseball players. In a computerized dynamic-object DVA test, observers reported the location of the gap in a small Landolt-C ring moving at various speeds while eye movements were recorded. Smooth pursuit kinematics—eye latency, acceleration, velocity gain, position error—and the direction and amplitude of saccadic eye movements were linked to perceptual performance. Results reveal that distinct eye movement patterns—minimizing eye position error, tracking smoothly, and inhibiting reverse saccades—were related to dynamic visual acuity. The close link between eye movement quality and DVA performance has important implications for the development of perceptual training programs to improve DVA. PMID:28187157
System for assisted mobility using eye movements based on electrooculography.
Barea, Rafael; Boquete, Luciano; Mazo, Manuel; López, Elena
2002-12-01
This paper describes an eye-control method based on electrooculography (EOG) to develop a system for assisted mobility. One of its most important features is its modularity, making it adaptable to the particular needs of each user according to the type and degree of handicap involved. An eye model based on the electrooculographic signal is proposed and its validity is studied. Several human-machine interfaces (HMIs) based on EOG are discussed, with our study focusing on guiding and controlling a wheelchair for disabled people, where the control is actually effected by eye movements within the socket. Different techniques and guidance strategies are then shown with comments on the advantages and disadvantages of each one. The system consists of a standard electric wheelchair with an on-board computer, sensors and a graphic user interface run by the computer. This eye-control method can also be applied to handle graphical interfaces, where the eye is used as a computer mouse. Results obtained show that this control technique could be useful in multiple applications, such as mobility and communication aids for handicapped persons.
Method of Menu Selection by Gaze Movement Using AC EOG Signals
NASA Astrophysics Data System (ADS)
Kanoh, Shin'ichiro; Futami, Ryoko; Yoshinobu, Tatsuo; Hoshimiya, Nozomu
A method to detect the direction and the distance of voluntary eye gaze movements from EOG (electrooculogram) signals was proposed and tested. In this method, AC-amplified vertical and horizontal transient EOG signals were classified into eight directions and two distances of voluntary eye gaze movements. The horizontal and vertical EOG signals during an eye gaze movement at each sampling time were treated as a two-dimensional vector, and the center of gravity of the sample vectors whose norms were more than 80% of the maximum norm was used as the feature vector to be classified. Using k-nearest neighbor classification, the averaged correct detection rates for each subject were shown to be 98.9%, 98.7%, and 94.4%, respectively. This method avoids strict EOG-based eye tracking, which requires DC amplification of very small signals. It would be useful for developing robust human interfacing systems based on menu selection for severely paralyzed patients.
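A minimal sketch of the feature extraction and k-nearest-neighbour step just described (centre of gravity of the two-dimensional EOG samples whose norm exceeds 80% of the maximum norm, classified against labelled templates); variable names and the value of k are assumptions for illustration:

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    def gaze_feature(h_eog, v_eog, norm_frac=0.8):
        """Centre of gravity of (horizontal, vertical) EOG samples whose norm
        is at least norm_frac of the maximum norm during one gaze movement."""
        vecs = np.column_stack([h_eog, v_eog])
        norms = np.linalg.norm(vecs, axis=1)
        return vecs[norms >= norm_frac * norms.max()].mean(axis=0)

    # Labelled training movements: 8 directions x 2 distances = 16 classes.
    # features: (n_trials, 2) array of gaze_feature outputs; labels: class per trial.
    def train_gaze_classifier(features, labels, k=3):
        return KNeighborsClassifier(n_neighbors=k).fit(features, labels)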
Eye movements and attention in reading, scene perception, and visual search.
Rayner, Keith
2009-08-01
Eye movements are now widely used to investigate cognitive processes during reading, scene perception, and visual search. In this article, research on the following topics is reviewed with respect to reading: (a) the perceptual span (or span of effective vision), (b) preview benefit, (c) eye movement control, and (d) models of eye movements. Related issues with respect to eye movements during scene perception and visual search are also reviewed. It is argued that research on eye movements during reading has been somewhat advanced over research on eye movements in scene perception and visual search and that some of the paradigms developed to study reading should be more widely adopted in the study of scene perception and visual search. Research dealing with "real-world" tasks and research utilizing the visual-world paradigm are also briefly discussed.
21 CFR 886.1510 - Eye movement monitor.
Code of Federal Regulations, 2011 CFR
2011-04-01
Title 21, Food and Drugs (revised as of 2011-04-01), Medical Devices, Ophthalmic Devices, Diagnostic Devices, § 886.1510 Eye movement monitor. (a) Identification. An eye movement monitor is an AC-powered device with an electrode intended to measure and record...
21 CFR 886.1510 - Eye movement monitor.
Code of Federal Regulations, 2010 CFR
2010-04-01
Title 21, Food and Drugs (revised as of 2010-04-01), Medical Devices, Ophthalmic Devices, Diagnostic Devices, § 886.1510 Eye movement monitor. (a) Identification. An eye movement monitor is an AC-powered device with an electrode intended to measure and record...
A Review on Eye Movement Studies in Childhood and Adolescent Psychiatry
ERIC Educational Resources Information Center
Rommelse, Nanda N. J.; Van der Stigchel, Stefan; Sergeant, Joseph A.
2008-01-01
The neural substrates of eye movement measures are largely known. Therefore, measurement of eye movements in psychiatric disorders may provide insight into the underlying neuropathology of these disorders. Visually guided saccades, antisaccades, memory guided saccades, and smooth pursuit eye movements will be reviewed in various childhood…
Lanchester, B S; Mark, R F
1975-12-01
1. The path, eye and body movements of a teleost fish (the leatherjacket Acanthaluteres spilomelanurus) approaching and taking food were measured by cinematography. 2. Fixation of the food by movement of the eyes is an invariable feature of the approach. The eyes then remain aligned with the target while the body moves forward and round to bring the mouth to the food. 3. When pursuing pieces of food moving vertically at constant velocity through the water these fish normally trace out the pathway that can be calculated by assuming the fish aims constantly at the food. Predictive pathways that imply anticipation of the point of intersection with the food are not regularly seen. 4. Deviations from pursuit occur sporadically, usually in the direction of a predictive path, particularly when the fish approach falling food from below. 5. The geometry of the situation suggests that predictive paths may sometimes be generated if the alignment of eye and body during the pursuit of moving food can be delayed. In approaches from below this may be because forward movement of the fish would tend to stabilize the image of the falling food in the retina. 6. We suggest that a simple linked control system using both eye and body movements to fixate retinal images will on occasions generate predictive pathways without any need for the central nervous system to calculate them in advance.
Video-Based Eye Tracking to Detect the Attention Shift: A Computer Classroom Context-Aware System
ERIC Educational Resources Information Center
Kuo, Yung-Lung; Lee, Jiann-Shu; Hsieh, Min-Chai
2014-01-01
Eye and head movements are evoked in response to obvious visual attention shifts. However, there has been little progress so far on the causes of absent-mindedness. The paper proposes an attention awareness system that captures the conditions regarding the interaction of eye gaze and head pose under various attentional switching in a computer classroom.…
The fluid mechanics of scleral buckling surgery for the repair of retinal detachment.
Foster, William Joseph; Dowla, Nadia; Joshi, Saurabh Y; Nikolaou, Michael
2010-01-01
Scleral buckling is a common surgical technique used to treat retinal detachments that involves suturing a radial or circumferential silicone element on the sclera. Although this procedure has been performed since the 1960s, and there is a reasonable experimental model of retinal detachment, there is still debate as to how this surgery facilitates the re-attachment of the retina. Finite element calculations using the COMSOL Multiphysics system are utilized to explain the influence of the scleral buckle on the flow of sub-retinal fluid in a physical model of retinal detachment. We found that, by coupling fluid mechanics with structural mechanics, laminar fluid flow and the Bernoulli effect are necessary for a physically consistent explanation of retinal reattachment. Improved fluid outflow and retinal reattachment are found with low fluid viscosity and rapid eye movements. A simulation of saccadic eye movements was more effective in removing sub-retinal fluid than slower, reading-speed eye movements. The results of our simulations allow us to explain the physical principles behind scleral buckling surgery and provide insight that can be utilized clinically. In particular, we find that rapid eye movements facilitate more rapid retinal reattachment. This is contradictory to the conventional wisdom of attempting to minimize eye movements.
Ma, Jiaxin; Zhang, Yu; Cichocki, Andrzej; Matsuno, Fumitoshi
2015-03-01
This study presents a novel human-machine interface (HMI) based on both electrooculography (EOG) and electroencephalography (EEG). This hybrid interface works in two modes: an EOG mode recognizes eye movements such as blinks, and an EEG mode detects event related potentials (ERPs) like P300. While both eye movements and ERPs have been separately used for implementing assistive interfaces, which help patients with motor disabilities in performing daily tasks, the proposed hybrid interface integrates them together. In this way, the eye movements and ERPs complement each other, providing better efficiency and a wider scope of application. In this study, we design a threshold algorithm that can recognize four kinds of eye movements including blink, wink, gaze, and frown. In addition, an oddball paradigm with stimuli of inverted faces is used to evoke multiple ERP components including P300, N170, and VPP. To verify the effectiveness of the proposed system, two different online experiments are carried out. One is to control a multifunctional humanoid robot, and the other is to control four mobile robots. In both experiments, the subjects could complete the tasks effectively using the proposed interface, and the best completion times were relatively short, very close to those achieved by hand operation.
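The thresholds and features of the eye-movement recognition step are not given in the abstract; a hedged sketch of a generic threshold classifier over a short two-channel EOG epoch, with invented threshold values, might look as follows.

```python
# Illustrative threshold classifier for EOG epochs (horizontal and vertical
# channels, in microvolts). The event labels mirror those in the abstract
# (blink, wink, gaze, frown); all thresholds are hypothetical placeholders,
# not the values used in the study.
import numpy as np

def classify_eog_epoch(h_eog, v_eog, blink_thr=300.0, wink_thr=150.0,
                       gaze_thr=100.0, frown_thr=80.0):
    h = np.asarray(h_eog, dtype=float)
    v = np.asarray(v_eog, dtype=float)
    v_amp = v.max() - v.min()
    h_amp = h.max() - h.min()
    if v_amp > blink_thr and h_amp < gaze_thr:
        return "blink"   # large, symmetric vertical deflection
    if v_amp > wink_thr and h_amp >= gaze_thr:
        return "wink"    # vertical deflection with horizontal asymmetry
    if h_amp > gaze_thr:
        return "gaze"    # sustained horizontal shift
    if v_amp > frown_thr:
        return "frown"   # small, slow vertical change
    return "none"
```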
Decline of vertical gaze and convergence with aging.
Oguro, Hiroaki; Okada, Kazunori; Suyama, Nobuo; Yamashita, Kazuya; Yamaguchi, Shuhei; Kobayashi, Shotai
2004-01-01
Disturbance of vertical eye movement and ocular convergence is often observed in elderly people, but little is known about its frequency. The purpose of this study was to investigate age-associated changes in vertical eye movement and convergence in healthy elderly people, using a digital video camera system. We analyzed vertical eye movements and convergence in 113 neurologically normal elderly subjects (mean age 70 years) in comparison with 20 healthy young controls (mean age 32 years). The range of vertical eye movement was analyzed quantitatively and convergence was analyzed qualitatively. In the elderly subjects, the angle of vertical gaze decreased with advancing age and it was significantly smaller than that of the younger subjects. The mean angle of upward gaze was significantly smaller than that of downward gaze for both young and elderly subjects. Upward gaze impairment became apparent in subjects in their 70s, and downward gaze impairment in subjects in their 60s. Disturbance in convergence also increased with advancing age, and was found in 40.7% of the elderly subjects. These findings indicate that the mechanisms of age-related change are different for upward and downward vertical gaze. Digital video camera monitoring was useful for assessing and monitoring eye movements. Copyright 2004 S. Karger AG, Basel
MR-eyetracker: a new method for eye movement recording in functional magnetic resonance imaging.
Kimmig, H; Greenlee, M W; Huethe, F; Mergner, T
1999-06-01
We present a method for recording saccadic and pursuit eye movements in the magnetic resonance tomograph designed for visual functional magnetic resonance imaging (fMRI) experiments. To reliably classify brain areas as pursuit or saccade related it is important to carefully measure the actual eye movements. For this purpose, infrared light, created outside the scanner by light-emitting diodes (LEDs), is guided via optic fibers into the head coil and onto the eye of the subject. Two additional fiber optical cables pick up the light reflected by the iris. The illuminating and detecting cables are mounted in a plastic eyepiece that is manually lowered to the level of the eye. By means of differential amplification, we obtain a signal that covaries with the horizontal position of the eye. Calibration of eye position within the scanner yields an estimate of eye position with a resolution of 0.2 degrees at a sampling rate of 1000 Hz. Experiments are presented that employ echoplanar imaging with 12 image planes through visual, parietal and frontal cortex while subjects performed saccadic and pursuit eye movements. The distribution of BOLD (blood oxygen level dependent) responses is shown to depend on the type of eye movement performed. Our method yields high temporal and spatial resolution of the horizontal component of eye movements during fMRI scanning. Since the signal is purely optical, there is no interaction between the eye movement signals and the echoplanar images. This reasonably priced eye tracker can be used to control eye position and monitor eye movements during fMRI.
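As a rough illustration of the differential-amplification principle described above (not the authors' actual signal chain), the following sketch converts two reflected-light detector signals into horizontal eye position using a linear calibration fitted from fixations on known targets; the variable names and the least-squares calibration are assumptions.

```python
# Sketch of the differential limbus-tracking principle: the difference of the
# two reflected-light signals covaries with horizontal eye position, and a
# linear calibration maps it to degrees. Names and the calibration procedure
# are illustrative assumptions.
import numpy as np

def differential_signal(left_detector, right_detector):
    return np.asarray(left_detector, float) - np.asarray(right_detector, float)

def fit_calibration(diff_at_targets, target_angles_deg):
    # Fit angle = a * diff + b from fixations on targets at known eccentricities.
    a, b = np.polyfit(diff_at_targets, target_angles_deg, deg=1)
    return a, b

def eye_position_deg(diff_signal, a, b):
    return a * diff_signal + b

# Example: calibration fixations on targets at -10, 0 and +10 degrees.
a, b = fit_calibration([-0.8, 0.02, 0.85], [-10.0, 0.0, 10.0])
print(eye_position_deg(0.4, a, b))   # estimated horizontal eye position in degrees
```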
Depth-estimation-enabled compound eyes
NASA Astrophysics Data System (ADS)
Lee, Woong-Bi; Lee, Heung-No
2018-04-01
Most animals that have compound eyes determine object distances by using monocular cues, especially motion parallax. In artificial compound eye imaging systems inspired by natural compound eyes, object depths are typically estimated by measuring optic flow; however, this requires mechanical movement of the compound eyes or additional acquisition time. In this paper, we propose a method for estimating object depths in a monocular compound eye imaging system based on the computational compound eye (COMPU-EYE) framework. In the COMPU-EYE system, acceptance angles are considerably larger than interommatidial angles, causing overlap between the ommatidial receptive fields. In the proposed depth estimation technique, the disparities between these receptive fields are used to determine object distances. We demonstrate that the proposed depth estimation technique can estimate the distances of multiple objects.
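The COMPU-EYE reconstruction itself is not detailed here; the underlying geometric idea, distance recovered from the angular disparity between two overlapping receptive fields separated by a known baseline, can be illustrated with a toy triangulation under assumed geometry.

```python
# Toy triangulation: two neighboring ommatidia separated by a baseline view the
# same point; the angular disparity between their receptive-field centers gives
# the distance. This illustrates the geometric principle only, not the
# COMPU-EYE reconstruction algorithm.
import math

def depth_from_disparity(baseline_mm, disparity_deg):
    disparity_rad = math.radians(disparity_deg)
    if disparity_rad <= 0:
        raise ValueError("disparity must be positive and non-zero")
    return baseline_mm / (2.0 * math.tan(disparity_rad / 2.0))

print(depth_from_disparity(baseline_mm=0.5, disparity_deg=0.25))  # distance in mm
```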
Effects of Macrophage Depletion on Sleep in Mice
Ames, Conner; Boland, Erin; Szentirmai, Éva
2016-01-01
The reciprocal interaction between the immune system and sleep regulation has been widely acknowledged but the cellular mechanisms that underpin this interaction are not completely understood. In the present study, we investigated the role of macrophages in sleep loss- and cold exposure-induced sleep and body temperature responses. Macrophage apoptosis was induced in mice by systemic injection of clodronate-containing liposomes (CCL). We report that CCL treatment induced an immediate and transient increase in non-rapid-eye movement sleep (NREMS) and fever accompanied by decrease in rapid-eye movement sleep, motor activity and NREMS delta power. Chronically macrophage-depleted mice had attenuated NREMS rebound after sleep deprivation compared to normal mice. Cold-induced increase in wakefulness and decrease in NREMS, rapid-eye movement sleep and body temperature were significantly enhanced in macrophage-depleted mice indicating increased cold sensitivity. These findings provide further evidence for the reciprocal interaction among the immune system, sleep and metabolism, and identify macrophages as one of the key cellular elements in this interplay. PMID:27442442
Cognitive load disrupts implicit theory-of-mind processing.
Schneider, Dana; Lam, Rebecca; Bayliss, Andrew P; Dux, Paul E
2012-08-01
Eye movements in Sally-Anne false-belief tasks appear to reflect the ability to implicitly monitor the mental states of other individuals (theory of mind, or ToM). It has recently been proposed that an early-developing, efficient, and automatically operating ToM system subserves this ability. Surprisingly absent from the literature, however, is an empirical test of the influence of domain-general executive processing resources on this implicit ToM system. In the study reported here, a dual-task method was employed to investigate the impact of executive load on eye movements in an implicit Sally-Anne false-belief task. Under no-load conditions, adult participants displayed eye movement behavior consistent with implicit belief processing, whereas evidence for belief processing was absent for participants under cognitive load. These findings indicate that the cognitive system responsible for implicitly tracking beliefs draws at least minimally on executive processing resources. Thus, even the most low-level processing of beliefs appears to reflect a capacity-limited operation.
Implicit prosody mining based on the human eye image capture technology
NASA Astrophysics Data System (ADS)
Gao, Pei-pei; Liu, Feng
2013-08-01
Eye tracking has become one of the main methods for analysing recognition issues in human-computer interaction, and capturing images of the human eye is the key problem in eye tracking. Building on this, a new human-computer interaction method is introduced to enrich the forms of speech synthesis. We propose a method of Implicit Prosody mining based on human eye image capture technology: parameters are extracted from images of the eyes during reading to control and drive prosody generation in speech synthesis, and a prosodic model with high simulation accuracy is established. The duration model is a key issue for prosody generation. For the duration model, this paper puts forward a new idea of obtaining eye gaze duration during reading from captured eye images and synchronously controlling this duration and the pronunciation duration in speech synthesis. Eye movement during reading is a comprehensive, multi-factor interactive process involving gazes, twitches (saccades) and backsights (regressions). Therefore, how to extract the appropriate information from images of the eyes needs to be considered, and the gaze regularities of the eyes need to be obtained as references for modelling. Based on an analysis of three current eye movement control models and the characteristics of Implicit Prosody reading, the relative independence between the text speech-processing system and the eye movement control system is discussed. It is shown that, under the same level of text familiarity, eye gaze duration during reading and internal voice pronunciation duration are synchronous. An eye gaze duration model based on the prosodic structure of the Chinese language is presented, replacing previous methods of machine learning and probability forecasting, to obtain readers' real internal reading rhythm and to synthesize speech with personalized rhythm. This research will enrich human-computer interaction and has practical significance and application prospects for disability-assisted speech interaction. Experiments show that Implicit Prosody mining based on human eye image capture gives the synthesized speech more flexible expression.
Lateral eye-movement responses to visual stimuli.
Wilbur, M P; Roberts-Wilbur, J
1985-08-01
The association of left lateral eye-movement with emotionality or arousal of affect and of right lateral eye-movement with cognitive/interpretive operations and functions was investigated. Participants were junior and senior students enrolled in an undergraduate course in developmental psychology. There were 37 women and 13 men, ranging from 19 to 45 yr. of age. Using videotaped lateral eye-movements of 50 participants' responses to 15 visually presented stimuli (precategorized as neutral, emotional, or intellectual), content and statistical analyses supported the association between left lateral eye-movement and emotional arousal and between right lateral eye-movement and cognitive functions. Precategorized visual stimuli included items such as a ball (neutral), gun (emotional), and calculator (intellectual). The findings are congruent with existing lateral eye-movement literature and also are additive by using visual stimuli that do not require the explicit response or implicit processing of verbal questioning.
Effects of background motion on eye-movement information.
Nakamura, S
1997-02-01
The effect of background stimulus on eye-movement information was investigated by analyzing the underestimation of the target velocity during pursuit eye movement (Aubert-Fleishl paradox). In the experiment, a striped pattern with various brightness contrasts and spatial frequencies was used as a background stimulus, which was moved at various velocities. Analysis showed that the perceived velocity of the pursuit target, which indicated the magnitudes of eye-movement information, decreased when the background stripes moved in the same direction as eye movement at higher velocities and increased when the background moved in the opposite direction. The results suggest that the eye-movement information varied as a linear function of the velocity of the motion of the background retinal image (optic flow). In addition, the effectiveness of optic flow on eye-movement information was determined by the attributes of the background stimulus such as the brightness contrast or the spatial frequency of the striped pattern.
Bicknell, Klinton; Levy, Roger
2012-01-01
Decades of empirical work have shown that a range of eye movement phenomena in reading are sensitive to the details of the process of word identification. Despite this, major models of eye movement control in reading do not explicitly model word identification from visual input. This paper presents an argument for developing models of eye movements that do include detailed models of word identification. Specifically, we argue that insights into eye movement behavior can be gained by understanding which phenomena naturally arise from an account in which the eyes move for efficient word identification, and that one important use of such models is to test which eye movement phenomena can be understood this way. As an extended case study, we present evidence from an extension of a previous model of eye movement control in reading that does explicitly model word identification from visual input, Mr. Chips (Legge, Klitz, & Tjan, 1997), to test two proposals for the effect of using linguistic context on reading efficiency. PMID:23074362
A review on eye movement studies in childhood and adolescent psychiatry.
Rommelse, Nanda N J; Van der Stigchel, Stefan; Sergeant, Joseph A
2008-12-01
The neural substrates of eye movement measures are largely known. Therefore, measurement of eye movements in psychiatric disorders may provide insight into the underlying neuropathology of these disorders. Visually guided saccades, antisaccades, memory guided saccades, and smooth pursuit eye movements will be reviewed in various childhood psychiatric disorders. The four aims of this review are (1) to give a thorough overview of eye movement studies in a wide array of psychiatric disorders occurring during childhood and adolescence (attention-deficit/hyperactivity disorder, oppositional defiant disorder and conduct disorder, autism spectrum disorders, reading disorder, childhood-onset schizophrenia, Tourette's syndrome, obsessive compulsive disorder, and anxiety and depression), (2) to discuss the specificity and overlap of eye movement findings across disorders and paradigms, (3) to discuss the developmental aspects of eye movement abnormalities in childhood and adolescent psychiatric disorders, and (4) to present suggestions for future research. In order to make this review of interest to a broad audience, attention will be given to the clinical manifestation of the disorders and the theoretical background of the eye movement paradigms.
Peters, Denise M; McPherson, Aaron K; Fletcher, Blake; McClenaghan, Bruce A; Fritz, Stacy L
2013-09-01
The use of video gaming as a therapeutic intervention has increased in popularity; however, the number of repetitions in comparison with traditional therapy methods has yet to be investigated. The primary purpose of this study was to document and compare the number of repetitions performed while playing 1 of 2 video gaming systems for a time frame similar to that of a traditional therapy session in individuals with chronic stroke. Twelve participants with chronic stroke (mean age, 66.8 ± 8.2 years; time poststroke, 19.2 ± 15.4 months) completed video game play sessions, using either the Nintendo Wii or the Playstation 2 EyeToy. A total of 203 sessions were captured on video record; of these, 50 sessions for each gaming system were randomly selected for analysis. For each selected record, active upper and lower extremity repetitions were counted for a 36-minute segment of the recorded session. The Playstation 2 EyeToy group produced an average of 302.5 (228.1) upper extremity active movements and 189.3 (98.3) weight shifts, significantly higher than the Nintendo Wii group, which produced an average of 61.9 (65.7) upper extremity active movements and 109.7 (78.5) weight shifts. No significant differences were found in steps and other lower extremity active movements between the 2 systems. The Playstation 2 EyeToy group produced more upper extremity active movements and weight shifting movements than the Nintendo Wii group; the number and type of repetitions varied across games. Active gaming (specifically Playstation 2 EyeToy) provided more upper extremity repetitions than those reported in the literature by using traditional therapy, suggesting that it may be a modality to promote increased active movements in individuals poststroke.
Idiosyncratic characteristics of saccadic eye movements when viewing different visual environments.
Andrews, T J; Coppola, D M
1999-08-01
Eye position was recorded in different viewing conditions to assess whether the temporal and spatial characteristics of saccadic eye movements in different individuals are idiosyncratic. Our aim was to determine the degree to which oculomotor control is based on endogenous factors. A total of 15 naive subjects viewed five visual environments: (1) The absence of visual stimulation (i.e. a dark room); (2) a repetitive visual environment (i.e. simple textured patterns); (3) a complex natural scene; (4) a visual search task; and (5) reading text. Although differences in visual environment had significant effects on eye movements, idiosyncrasies were also apparent. For example, the mean fixation duration and size of an individual's saccadic eye movements when passively viewing a complex natural scene covaried significantly with those same parameters in the absence of visual stimulation and in a repetitive visual environment. In contrast, an individual's spatio-temporal characteristics of eye movements during active tasks such as reading text or visual search covaried together, but did not correlate with the pattern of eye movements detected when viewing a natural scene, simple patterns or in the dark. These idiosyncratic patterns of eye movements in normal viewing reveal an endogenous influence on oculomotor control. The independent covariance of eye movements during different visual tasks shows that saccadic eye movements during active tasks like reading or visual search differ from those engaged during the passive inspection of visual scenes.
... are recorded as warm and cool water or air are gently introduced into each ear canal. Since the eyes and ears work in coordination through the nervous system, measurement of eye movements can be used to test ...
Eye Movement Patterns of the Elderly during Stair Descent:Effect of Illumination
NASA Astrophysics Data System (ADS)
Kasahara, Satoko; Okabe, Sonoko; Nakazato, Naoko; Ohno, Yuko
The relationship between the eye movement pattern during stair descent and illumination was studied in 4 elderly people in comparison with that in 5 young people. The illumination condition was light (85.0±30.9 lx) or dark (0.7±0.3 lx), and data of eye movements were obtained using an eye mark recorder. A flight of 15 steps was used for the experiment, and data on 3 steps in the middle, on which the descent movements were stabilized, were analyzed. The elderly subjects mostly directed their gaze straight ahead, in line with the facial direction, regardless of the illumination condition, but the young subjects tended to look down under the light condition. The young subjects are considered to have confirmed the safety of the front by peripheral vision, checked the stepping surface by central vision, and still maintained the upright position without leaning forward during stair descent. The elderly subjects, in contrast, always looked at the visual target by central vision even under the light condition and leaned forward. The range of eye movements was larger vertically than horizontally in both groups, and a characteristic eye movement pattern of repeating a vertical shuttle movement synchronous with descent of each step was observed. Under the dark condition, the young subjects widened the range of vertical eye movements and reduced duration of fixation. The elderly subjects showed no change in the range of eye movements but increased duration of fixation during stair descent. These differences in the eye movements are considered to be compensatory reactions to narrowing of the vertical visual field, reduced dark adaptation, and reduced dynamic visual acuity due to aging. These characteristics of eye movements of the elderly lead to an anteriorly leaned posture and lack of attention to the front during stair descent.
Fetal Eye Movements on Magnetic Resonance Imaging
Woitek, Ramona; Kasprian, Gregor; Lindner, Christian; Stuhr, Fritz; Weber, Michael; Schöpf, Veronika; Brugger, Peter C.; Asenbaum, Ulrika; Furtner, Julia; Bettelheim, Dieter; Seidl, Rainer; Prayer, Daniela
2013-01-01
Objectives Eye movements are the physical expression of upper fetal brainstem function. Our aim was to identify and differentiate specific types of fetal eye movement patterns using dynamic MRI sequences. Their occurrence as well as the presence of conjugated eyeball motion and consistently parallel eyeball position was systematically analyzed. Methods Dynamic SSFP sequences were acquired in 72 singleton fetuses (17–40 GW, three age groups [17–23 GW, 24–32 GW, 33–40 GW]). Fetal eye movements were evaluated according to a modified classification originally published by Birnholz (1981): Type 0: no eye movements; Type I: single transient deviations; Type Ia: fast deviation, slower reposition; Type Ib: fast deviation, fast reposition; Type II: single prolonged eye movements; Type III: complex sequences; and Type IV: nystagmoid. Results In 95.8% of fetuses, the evaluation of eye movements was possible using MRI, with a mean acquisition time of 70 seconds. Due to head motion, 4.2% of the fetuses and 20.1% of all dynamic SSFP sequences were excluded. Eye movements were observed in 45 fetuses (65.2%). Significant differences between the age groups were found for Type I (p = 0.03), Type Ia (p = 0.031), and Type IV eye movements (p = 0.033). Consistently parallel bulbs were found in 27.3–45%. Conclusions In human fetuses, different eye movement patterns can be identified and described by MRI in utero. In addition to the originally classified eye movement patterns, a novel subtype has been observed, which apparently characterizes an important step in fetal brainstem development. We evaluated, for the first time, eyeball position in fetuses. Ultimately, the assessment of fetal eye movements by MRI yields the potential to identify early signs of brainstem dysfunction, as encountered in brain malformations such as Chiari II or molar tooth malformations. PMID:24194885
Evidence that non-dreamers do dream: a REM sleep behaviour disorder model.
Herlin, Bastien; Leu-Semenescu, Smaranda; Chaumereuil, Charlotte; Arnulf, Isabelle
2015-12-01
To determine whether non-dreamers do not produce dreams or do not recall them, subjects were identified with no dream recall with dreamlike behaviours during rapid eye movement sleep behaviour disorder, which is typically characterised by dream-enacting behaviours congruent with sleep mentation. All consecutive patients with idiopathic rapid eye movement sleep behaviour disorder or rapid eye movement sleep behaviour disorder associated with Parkinson's disease who underwent a video-polysomnography were interviewed regarding the presence or absence of dream recall, retrospectively or upon spontaneous arousals. The patients with no dream recall for at least 10 years, and never-ever recallers were compared with dream recallers with rapid eye movement sleep behaviour disorder regarding their clinical, cognitive and sleep features. Of the 289 patients with rapid eye movement sleep behaviour disorder, eight (2.8%) patients had no dream recall, including four (1.4%) patients who had never ever recalled dreams, and four patients who had no dream recall for 10-56 years. All non-recallers exhibited, daily or almost nightly, several complex, scenic and dreamlike behaviours and speeches, which were also observed during rapid eye movement sleep on video-polysomnography (arguing, fighting and speaking). They did not recall a dream following sudden awakenings from rapid eye movement sleep. These eight non-recallers with rapid eye movement sleep behaviour disorder did not differ in terms of cognition, clinical, treatment or sleep measures from the 17 dreamers with rapid eye movement sleep behaviour disorder matched for age, sex and disease. The scenic dreamlike behaviours reported and observed during rapid eye movement sleep in the rare non-recallers with rapid eye movement sleep behaviour disorder (even in the never-ever recallers) provide strong evidence that non-recallers produce dreams, but do not recall them. Rapid eye movement sleep behaviour disorder provides a new model to evaluate cognitive processing during dreaming and subsequent recall. © 2015 European Sleep Research Society.
The Disturbance of Gaze in Progressive Supranuclear Palsy: Implications for Pathogenesis
Chen, Athena L.; Riley, David E.; King, Susan A.; Joshi, Anand C.; Serra, Alessandro; Liao, Ke; Cohen, Mark L.; Otero-Millan, Jorge; Martinez-Conde, Susana; Strupp, Michael; Leigh, R. John
2010-01-01
Progressive supranuclear palsy (PSP) is a disease of later life that is currently regarded as a form of neurodegenerative tauopathy. Disturbance of gaze is a cardinal clinical feature of PSP that often helps clinicians to establish the diagnosis. Since the neurobiology of gaze control is now well understood, it is possible to use eye movements as investigational tools to understand aspects of the pathogenesis of PSP. In this review, we summarize each disorder of gaze control that occurs in PSP, drawing on our studies of 50 patients, and on reports from other laboratories that have measured the disturbances of eye movements. When these gaze disorders are approached by considering each functional class of eye movements and its neurobiological basis, a distinct pattern of eye movement deficits emerges that provides insight into the pathogenesis of PSP. Although some aspects of all forms of eye movements are affected in PSP, the predominant defects concern vertical saccades (slow and hypometric, both up and down), impaired vergence, and inability to modulate the linear vestibulo-ocular reflex appropriately for viewing distance. These vertical and vergence eye movements habitually work in concert to enable visuomotor skills that are important during locomotion with the hands free. Taken with the prominent early feature of falls, these findings suggest that PSP tauopathy impairs a recently evolved neural system concerned with bipedal locomotion in an erect posture and frequent gaze shifts between the distant environment and proximate hands. This approach provides a conceptual framework that can be used to address the nosological challenge posed by overlapping clinical and neuropathological features of neurodegenerative tauopathies. PMID:21188269
Eye movement accuracy determines natural interception strategies.
Fooken, Jolande; Yeo, Sang-Hoon; Pai, Dinesh K; Spering, Miriam
2016-11-01
Eye movements aid visual perception and guide actions such as reaching or grasping. Most previous work on eye-hand coordination has focused on saccadic eye movements. Here we show that smooth pursuit eye movement accuracy strongly predicts both interception accuracy and the strategy used to intercept a moving object. We developed a naturalistic task in which participants (n = 42 varsity baseball players) intercepted a moving dot (a "2D fly ball") with their index finger in a designated "hit zone." Participants were instructed to track the ball with their eyes, but were only shown its initial launch (100-300 ms). Better smooth pursuit resulted in more accurate interceptions and determined the strategy used for interception, i.e., whether interception was early or late in the hit zone. Even though early and late interceptors showed equally accurate interceptions, they may have relied on distinct tactics: early interceptors used cognitive heuristics, whereas late interceptors' performance was best predicted by pursuit accuracy. Late interception may be beneficial in real-world tasks as it provides more time for decision and adjustment. Supporting this view, baseball players who were more senior were more likely to be late interceptors. Our findings suggest that interception strategies are optimally adapted to the proficiency of the pursuit system.
Visual and vestibular induced eye movements in verbal children and adults with autism
Furman, Joseph M.; Osorio, Maria Joana; Minshew, Nancy J.
2016-01-01
This study investigated several types of eye movements that rely specifically on the function of brainstem-cerebellar pathways (vestibulo-ocular reflexes) or on widely distributed pathways of the brain (horizontal pursuit and saccadic eye movements). Although eye movements that rely on higher brain regions have been studied fairly extensively in autism, eye movements dependent on the brainstem and cerebellum have not. Objective This study assessed the functionality of vestibular, pursuit and saccade circuitry in autism across a wide age range. Methods Subjects were 79 individuals with autism (AUT) and 62 controls (CON) aged 5 to 52 years with IQ scores > 70. For vestibular testing, earth-vertical axis rotation was performed in darkness and in a lighted visual surround with a fixation target. Ocular motor testing included assessment of horizontal saccades and horizontal smooth pursuit. Results No between-group differences were found in vestibular reflexes or in mean saccade velocity or accuracy. Saccade latency was increased in the AUT group, with significant age-related effects in the 8-18 year old subgroups. There was a trend toward decreased pursuit gain without age effects. Conclusions Normal vestibular-induced eye movements and normal saccade accuracy and velocity provide the most substantial evidence to date of the functional integrity of brainstem and cerebellar pathways in autism, suggesting that the histopathological abnormalities described in these structures may not be associated with intrinsic dysfunction but rather reflect developmental alterations related to forebrain cortical systems formation. Increased saccade latency with age effects adds to the extensive existing evidence of altered function and maturation of cortical systems in autism. PMID:25846907
Neuro-Linguistic Programming: Eye Movements as Indicators of Representational Systems.
1984-09-01
Recoverable bibliographic fragments from this report: Elizabeth A. Beck, "Test of the Eye Movement Hypothesis of Neurolinguistic Programing: A Rebuttal of Conclusions," Perceptual and Motor Skills, 58: 175; Meta Publications, 1980; Maron, Davida, "Neurolinguistic Programming: The Answer to Change?" Training and Development; "Neurolinguistic Programming," Perceptual and Motor Skills, 51: 230 (April 1980).
Why we forget our dreams: Acetylcholine and norepinephrine in wakefulness and REM sleep.
Becchetti, Andrea; Amadeo, Alida
2016-01-01
The ascending fibers releasing norepinephrine and acetylcholine are highly active during wakefulness. In contrast, during rapid-eye-movement sleep, the neocortical tone is sustained mainly by acetylcholine. By comparing the different physiological features of the norepinephrine and acetylcholine systems in the light of the GANE (glutamate amplifies noradrenergic effects) model, we suggest how to interpret some functional differences between waking and rapid-eye-movement sleep.
NASA Astrophysics Data System (ADS)
Nguyen, T. A. K.; DiGiovanna, J.; Cavuscens, S.; Ranieri, M.; Guinand, N.; van de Berg, R.; Carpaneto, J.; Kingma, H.; Guyot, J.-P.; Micera, S.; Perez Fornos, A.
2016-08-01
Objective. The vestibular system provides essential information about balance and spatial orientation via the brain to other sensory and motor systems. Bilateral vestibular loss significantly reduces quality of life, but vestibular implants (VIs) have demonstrated potential to restore lost function. However, optimal electrical stimulation strategies have not yet been identified in patients. In this study, we compared the two most common strategies, pulse amplitude modulation (PAM) and pulse rate modulation (PRM), in patients. Approach. Four subjects with a modified cochlear implant including electrodes targeting the peripheral vestibular nerve branches were tested. Charge-equivalent PAM and PRM were applied after adaptation to baseline stimulation. Vestibulo-ocular reflex eye movement responses were recorded to evaluate stimulation efficacy during acute clinical testing sessions. Main results. PAM evoked larger amplitude eye movement responses than PRM. Eye movement response axes for lateral canal stimulation were marginally better aligned with PRM than with PAM. A neural network model was developed for the tested stimulation strategies to provide insights on possible neural mechanisms. This model suggested that PAM would consistently cause a larger ensemble firing rate of neurons and thus larger responses than PRM. Significance. Due to the larger magnitude of eye movement responses, our findings strongly suggest PAM as the preferred strategy for initial VI modulation.
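As a schematic of the two strategies compared above, the sketch below maps a head angular-velocity signal to either pulse amplitude (PAM) or pulse rate (PRM) around a fixed baseline; all baselines, gains, and limits are hypothetical placeholders, not the clinical parameters used with the patients.

```python
# Schematic PAM vs PRM encoding of head angular velocity for one vestibular
# implant channel. Baseline rate/amplitude and modulation gains are invented
# placeholders for illustration only.
import numpy as np

BASELINE_RATE_PPS = 200.0   # pulses per second at rest (hypothetical)
BASELINE_AMP_UA = 150.0     # pulse amplitude in microamps at rest (hypothetical)

def pam(head_velocity_dps, gain_ua_per_dps=0.5):
    """Pulse amplitude modulation: rate fixed, amplitude follows head velocity."""
    amp = BASELINE_AMP_UA + gain_ua_per_dps * np.asarray(head_velocity_dps, float)
    return np.clip(amp, 0.0, 400.0), BASELINE_RATE_PPS

def prm(head_velocity_dps, gain_pps_per_dps=1.0):
    """Pulse rate modulation: amplitude fixed, rate follows head velocity."""
    rate = BASELINE_RATE_PPS + gain_pps_per_dps * np.asarray(head_velocity_dps, float)
    return BASELINE_AMP_UA, np.clip(rate, 0.0, 400.0)

print(pam(60.0))   # (amplitude in microamps, rate in pps)
print(prm(60.0))
```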
What interests them in the pictures?--differences in eye-tracking between rhesus monkeys and humans.
Hu, Ying-Zhou; Jiang, Hui-Hui; Liu, Ci-Rong; Wang, Jian-Hong; Yu, Cheng-Yang; Carlson, Synnöve; Yang, Shang-Chuan; Saarinen, Veli-Matti; Rizak, Joshua D; Tian, Xiao-Guang; Tan, Hen; Chen, Zhu-Yue; Ma, Yuan-Ye; Hu, Xin-Tian
2013-10-01
Studies estimating eye movements have demonstrated that non-human primates have fixation patterns similar to humans at the first sight of a picture. In the current study, three sets of pictures containing monkeys, humans or both were presented to rhesus monkeys and humans. The eye movements on these pictures by the two species were recorded using a Tobii eye-tracking system. We found that monkeys paid more attention to the head and body in pictures containing monkeys, whereas both monkeys and humans paid more attention to the head in pictures containing humans. The humans always concentrated on the eyes and head in all the pictures, indicating the social role of facial cues in society. Although humans paid more attention to the hands than monkeys, both monkeys and humans were interested in the hands and what was being done with them in the pictures. This may suggest the importance and necessity of hands for survival. Finally, monkeys scored lower in eye-tracking when fixating on the pictures, as if they were less interested in looking at the screen than humans. The locations of fixation in monkeys may provide insight into the role of eye movements in an evolutionary context.
ECEM (eye closure eye movements): integrating aspects of EMDR with hypnosis for treatment of trauma.
Hollander, H E; Bender, S S
2001-01-01
The paper addresses distinctions between hypnotic interventions and Eye Movement Desensitizing and Reprocessing (EMDR) and discusses their effect on persons who have symptoms of Posttraumatic Stress Disorder (PTSD). Eye movements in hypnosis and EMDR are considered in terms of the different ways they may affect responses in treatment. A treatment intervention within hypnosis called ECEM (Eye Closure, Eye Movements) is described. ECEM can be used for patients with histories of trauma who did not benefit adequately from either interventions in hypnosis or the EMDR treatment protocol used separately. In ECEM the eye movement variable of EMDR is integrated within a hypnosis protocol to enhance benefits of hypnosis and reduce certain risks of EMDR.
Prevalence and phenomenology of eye tics in Gilles de la Tourette syndrome.
Martino, Davide; Cavanna, Andrea E; Robertson, Mary M; Orth, Michael
2012-10-01
Eye tics seem to be common in Gilles de la Tourette syndrome (GTS). We analyzed the frequency and clinical characteristics of eye tics in 212 GTS patients. Of the 212 patients, 201 (94.8 %) reported eye tics in their life-time; 166 (78.3 %) reported eye movement tics (rolling eyes up/down, eyes looking sideways, staring), and 194 (91.5 %) eyelid/eyebrow movement tics (frowning, raising eyebrows, blinking or winking). Patients with eye movement tics were younger at age of GTS onset (7.1 ± 4 years) than those without (8.9 ± 6.8; p = 0.024). Tic severity positively correlated to lifetime history of eye and/or eyelid/eyebrow movement tics. Our data confirm that eye and eyelid/eyebrow movement tics are very common in GTS, and most patients have several types of eye tics over time. Eye tic phenomenology was similar in patients with or without co-morbidity. Eye tics are therefore likely to be a core feature of GTS and should be routinely evaluated in order to strengthen the clinician's confidence in diagnosing GTS.
Variability of eye movements when viewing dynamic natural scenes.
Dorr, Michael; Martinetz, Thomas; Gegenfurtner, Karl R; Barth, Erhardt
2010-08-26
How similar are the eye movement patterns of different subjects when free viewing dynamic natural scenes? We collected a large database of eye movements from 54 subjects on 18 high-resolution videos of outdoor scenes and measured their variability using the Normalized Scanpath Saliency, which we extended to the temporal domain. Even though up to about 80% of subjects looked at the same image region in some video parts, variability usually was much greater. Eye movements on natural movies were then compared with eye movements in several control conditions. "Stop-motion" movies had almost identical semantic content as the original videos but lacked continuous motion. Hollywood action movie trailers were used to probe the upper limit of eye movement coherence that can be achieved by deliberate camera work, scene cuts, etc. In a "repetitive" condition, subjects viewed the same movies ten times each over the course of 2 days. Results show several systematic differences between conditions both for general eye movement parameters such as saccade amplitude and fixation duration and for eye movement variability. Most importantly, eye movements on static images are initially driven by stimulus onset effects and later, more so than on continuous videos, by subject-specific idiosyncrasies; eye movements on Hollywood movies are significantly more coherent than those on natural movies. We conclude that the stimuli types often used in laboratory experiments, static images and professionally cut material, are not very representative of natural viewing behavior. All stimuli and gaze data are publicly available at http://www.inb.uni-luebeck.de/tools-demos/gaze.
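The Normalized Scanpath Saliency used above scores fixated locations against a map normalized to zero mean and unit variance; a minimal single-frame sketch, assuming a NumPy saliency (or gaze-density) map and pixel fixation coordinates, is shown below. The paper's temporal extension would apply this frame by frame.

```python
# Minimal Normalized Scanpath Saliency (NSS): z-score the map, then average the
# values at the fixated pixels. This sketch handles a single frame.
import numpy as np

def nss(saliency_map, fixations_xy):
    s = np.asarray(saliency_map, dtype=float)
    z = (s - s.mean()) / (s.std() + 1e-12)      # zero mean, unit variance
    xs = np.asarray([x for x, _ in fixations_xy], dtype=int)
    ys = np.asarray([y for _, y in fixations_xy], dtype=int)
    return float(z[ys, xs].mean())              # mean z-value at fixated pixels
```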
Corsi-Cabrera, María; Velasco, Francisco; Del Río-Portilla, Yolanda; Armony, Jorge L; Trejo-Martínez, David; Guevara, Miguel A; Velasco, Ana L
2016-10-01
The amygdaloid complex plays a crucial role in processing emotional signals and in the formation of emotional memories. Neuroimaging studies have shown human amygdala activation during rapid eye movement sleep (REM). Stereotactically implanted electrodes for presurgical evaluation in epileptic patients provide a unique opportunity to directly record amygdala activity. The present study analysed amygdala activity associated with REM sleep eye movements on the millisecond scale. We propose that phasic activation associated with rapid eye movements may provide the amygdala with endogenous excitation during REM sleep. Standard polysomnography and stereo-electroencephalograph (SEEG) were recorded simultaneously during spontaneous sleep in the left amygdala of four patients. Time-frequency analysis and absolute power of gamma activity were obtained for 250 ms time windows preceding and following eye movement onset in REM sleep, and in spontaneous waking eye movements in the dark. Absolute power of the 44-48 Hz band increased significantly during the 250 ms time window after REM sleep rapid eye movements onset, but not during waking eye movements. Transient activation of the amygdala provides physiological support for the proposed participation of the amygdala in emotional expression, in the emotional content of dreams and for the reactivation and consolidation of emotional memories during REM sleep, as well as for next-day emotional regulation, and its possible role in the bidirectional interaction between REM sleep and such sleep disorders as nightmares, anxiety and post-traumatic sleep disorder. These results provide unique, direct evidence of increased activation of the human amygdala time-locked to REM sleep rapid eye movements. © 2016 European Sleep Research Society.
Overview of sleep: the neurologic processes of the sleep-wake cycle.
Scammell, Thomas E
2015-05-01
Sleep problems are common in adults and should be treated to improve overall health and safety. To choose the best treatment for patients with sleep problems, clinicians should understand the sleep-wake cycle and the stages of rapid eye movement and non-rapid eye movement sleep as well as the neurologic pathways of sleep and wake systems. The sleep- and wake-promoting systems are mutually inhibitory, with the predominantly active system determining if a person is awake or asleep. The orexin system also plays an important role in the stabilization of the sleep-wake cycle. © Copyright 2015 Physicians Postgraduate Press, Inc.
iTemplate: A template-based eye movement data analysis approach.
Xiao, Naiqi G; Lee, Kang
2018-02-08
Current eye movement data analysis methods rely on defining areas of interest (AOIs). Due to the fact that AOIs are created and modified manually, variances in their size, shape, and location are unavoidable. These variances affect not only the consistency of the AOI definitions, but also the validity of the eye movement analyses based on the AOIs. To reduce the variances in AOI creation and modification and achieve a procedure to process eye movement data with high precision and efficiency, we propose a template-based eye movement data analysis method. Using a linear transformation algorithm, this method registers the eye movement data from each individual stimulus to a template. Thus, users only need to create one set of AOIs for the template in order to analyze eye movement data, rather than creating a unique set of AOIs for all individual stimuli. This change greatly reduces the error caused by the variance from manually created AOIs and boosts the efficiency of the data analysis. Furthermore, this method can help researchers prepare eye movement data for some advanced analysis approaches, such as iMap. We have developed software (iTemplate) with a graphic user interface to make this analysis method available to researchers.
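The registration step described above maps each stimulus's gaze coordinates into a common template space through a linear transformation; a hedged sketch, assuming a few landmark correspondences per stimulus and a least-squares affine fit (not the iTemplate implementation itself), could look like this.

```python
# Sketch of template registration for eye movement data: fit a 2-D affine
# transform from landmark correspondences between an individual stimulus and
# the template, then map all fixation coordinates through it so that a single
# AOI set applies. Landmark choice and the least-squares fit are assumptions.
import numpy as np

def fit_affine(src_pts, dst_pts):
    src = np.asarray(src_pts, float)
    dst = np.asarray(dst_pts, float)
    A = np.hstack([src, np.ones((len(src), 1))])       # rows of [x, y, 1]
    params, *_ = np.linalg.lstsq(A, dst, rcond=None)   # (3, 2) affine matrix
    return params

def to_template(fixations_xy, affine_params):
    fix = np.asarray(fixations_xy, float)
    A = np.hstack([fix, np.ones((len(fix), 1))])
    return A @ affine_params                           # fixations in template space
```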
NASA Astrophysics Data System (ADS)
Vuori, Tero; Olkkonen, Maria
2006-01-01
The aim of the study is to test both customer image quality rating (subjective image quality) and physical measurement of user behavior (eye movements tracking) to find customer satisfaction differences in imaging technologies. Methodological aim is to find out whether eye movements could be quantitatively used in image quality preference studies. In general, we want to map objective or physically measurable image quality to subjective evaluations and eye movement data. We conducted a series of image quality tests, in which the test subjects evaluated image quality while we recorded their eye movements. Results show that eye movement parameters consistently change according to the instructions given to the user, and according to physical image quality, e.g. saccade duration increased with increasing blur. Results indicate that eye movement tracking could be used to differentiate image quality evaluation strategies that the users have. Results also show that eye movements would help mapping between technological and subjective image quality. Furthermore, these results give some empirical emphasis to top-down perception processes in image quality perception and evaluation by showing differences between perceptual processes in situations when cognitive task varies.
Gandhi, Neeraj J; Sparks, David L
2007-07-01
Natural movements often include actions integrated across multiple effectors. Coordinated eye-head movements are driven by a command to shift the line of sight by a desired displacement vector. Yet because extraocular and neck motoneurons are separate entities, the gaze shift command must be separated into independent signals for eye and head movement control. We report that this separation occurs, at least partially, at or before the level of pontine omnipause neurons (OPNs). Stimulation of the OPNs prior to and during gaze shifts temporally decoupled the eye and head components by inhibiting gaze and eye saccades. In contrast, head movements were consistently initiated before gaze onset, and ongoing head movements continued along their trajectories, albeit with some characteristic modulations. After stimulation offset, a gaze shift composed of an eye saccade, and a reaccelerated head movement was produced to preserve gaze accuracy. We conclude that signals subject to OPN inhibition produce the eye-movement component of a coordinated eye-head gaze shift and are not the only signals involved in the generation of the head component of the gaze shift.
Temporal eye movement strategies during naturalistic viewing
Wang, Helena X.; Freeman, Jeremy; Merriam, Elisha P.; Hasson, Uri; Heeger, David J.
2011-01-01
The deployment of eye movements to complex spatiotemporal stimuli likely involves a variety of cognitive factors. However, eye movements to movies are surprisingly reliable both within and across observers. We exploited and manipulated that reliability to characterize observers’ temporal viewing strategies. Introducing cuts and scrambling the temporal order of the resulting clips systematically changed eye movement reliability. We developed a computational model that exhibited this behavior and provided an excellent fit to the measured eye movement reliability. The model assumed that observers searched for, found, and tracked a point-of-interest, and that this process reset when there was a cut. The model did not require that eye movements depend on temporal context in any other way, and it managed to describe eye movements consistently across different observers and two movie sequences. Thus, we found no evidence for the integration of information over long time scales (greater than a second). The results are consistent with the idea that observers employ a simple tracking strategy even while viewing complex, engaging naturalistic stimuli. PMID:22262911
On Biometrics With Eye Movements.
Zhang, Youming; Juhola, Martti
2017-09-01
Eye movements are a relatively novel data source for biometric identification. When video cameras applied to eye tracking become smaller and more efficient, this data source could offer interesting opportunities for the development of eye movement biometrics. In this paper, we study primarily biometric identification as seen as a classification task of multiple classes, and secondarily biometric verification considered as binary classification. Our research is based on the saccadic eye movement signal measurements from 109 young subjects. In order to test the data measured, we use a procedure of biometric identification according to the one-versus-one (subject) principle. In a development from our previous research, which also involved biometric verification based on saccadic eye movements, we now apply another eye movement tracker device with a higher sampling frequency of 250 Hz. The results obtained are good, with correct identification rates at 80-90% at their best.
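Identification "according to the one-versus-one (subject) principle" can be illustrated with a standard multiclass classifier over per-trial saccade feature vectors; the toy features, data, and classifier below are assumptions for illustration, not the authors' pipeline.

```python
# Illustrative one-vs-one identification over saccade feature vectors (e.g.,
# amplitude, peak velocity, duration, aggregated per trial). SVC in
# scikit-learn performs a one-vs-one decomposition internally; the feature set
# and synthetic data are placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_subjects, trials_per_subject = 10, 40
X = rng.normal(size=(n_subjects * trials_per_subject, 6))   # toy saccade features
y = np.repeat(np.arange(n_subjects), trials_per_subject)    # subject labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(decision_function_shape="ovo"))
clf.fit(X_tr, y_tr)
print("identification accuracy:", clf.score(X_te, y_te))
```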
Acting without seeing: eye movements reveal visual processing without awareness.
Spering, Miriam; Carrasco, Marisa
2015-04-01
Visual perception and eye movements are considered to be tightly linked. Diverse fields, ranging from developmental psychology to computer science, utilize eye tracking to measure visual perception. However, this prevailing view has been challenged by recent behavioral studies. Here, we review converging evidence revealing dissociations between the contents of perceptual awareness and different types of eye movement. Such dissociations reveal situations in which eye movements are sensitive to particular visual features that fail to modulate perceptual reports. We also discuss neurophysiological, neuroimaging, and clinical studies supporting the role of subcortical pathways for visual processing without awareness. Our review links awareness to perceptual-eye movement dissociations and furthers our understanding of the brain pathways underlying vision and movement with and without awareness. Copyright © 2015 Elsevier Ltd. All rights reserved.
SmartEye and Polhemus data for vestibulo-ocular reflex and optokinetic reflex model.
Le, Anh Son; Aoki, Hirofumi
2018-06-01
This data article presents a dataset of raw head and eye movement data collected with Polhemus (Polhemus Inc.) and SmartEye (Smart Eye AB) equipment. Subjects holding a driver's license participated in the experiment, which was conducted on a driving simulator whose vehicle motion was controlled by CarSim (Mechanical Simulation Co., Ann Arbor, MI). The dataset contains not only eye and head movements but also eye gaze, pupil diameter, saccades, and so on. It can be used for parameter identification of vestibulo-ocular reflex (VOR) models, for simulating eye movements, and for other analyses related to eye movement.
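One stated use of the dataset is parameter identification for a VOR model; a minimal sketch, assuming synchronized head and eye angular-velocity traces and a single-gain, zero-latency reflex model (a deliberate simplification, not the full model), is given below.

```python
# Minimal VOR parameter identification: the ideal reflex drives eye velocity
# opposite to head velocity, eye_vel ≈ -g * head_vel. Fit the gain g by least
# squares from synchronized head and eye velocity traces.
import numpy as np

def fit_vor_gain(head_velocity, eye_velocity):
    h = np.asarray(head_velocity, float)
    e = np.asarray(eye_velocity, float)
    return -np.dot(h, e) / np.dot(h, h)    # closed-form least-squares gain

# Example with synthetic data: true gain 0.95 plus measurement noise.
t = np.linspace(0, 10, 1000)
head = 30 * np.sin(2 * np.pi * 0.5 * t)
eye = -0.95 * head + np.random.default_rng(1).normal(0, 1, t.size)
print(fit_vor_gain(head, eye))             # close to 0.95
```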
Ali, S M; Reisner, L A; King, B; Cao, A; Auner, G; Klein, M; Pandya, A K
2008-01-01
A redesigned motion control system for the medical robot Aesop allows automating and programming its movements. An IR eye tracking system has been integrated with this control interface to implement an intelligent, autonomous eye gaze-based laparoscopic positioning system. A laparoscopic camera held by Aesop can be moved based on the data from the eye tracking interface to keep the user's gaze point region at the center of a video feedback monitor. This system setup provides autonomous camera control that works around the surgeon, providing an optimal robotic camera platform.
Real-time eye tracking for the assessment of driver fatigue.
Xu, Junli; Min, Jianliang; Hu, Jianfeng
2018-04-01
Eye tracking is an important approach for collecting evidence of driving fatigue. In this contribution, the authors present a non-intrusive system for evaluating driver fatigue by tracking eye movement behaviours. A real-time eye tracker was used to monitor participants' eye state and collect eye-movement data, which give insight into participants' fatigue state during monotonous driving. In this study, ten healthy subjects performed continuous simulated driving for 1-2 h on a driving simulator with eye state monitoring, and features of fixation time and pupil area were recorded using an eye movement tracking device. To achieve a good cost-performance ratio and fast computation time, the fuzzy K-nearest neighbour was employed to evaluate and analyse the influence of different participants on the variations in the drivers' fixation duration and pupil area. The findings indicate significant differences in the value distribution of the pupil area between normal and fatigued driving states. The results also suggest that the recognition accuracy under jackknife validation reaches about 89% on average, indicating significant potential for real-time application of the proposed approach and its capability of detecting driver fatigue.
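The fuzzy K-nearest neighbour step is not specified in detail in the abstract; a compact sketch of a Keller-style fuzzy k-NN decision rule over fixation-duration and pupil-area features, with invented toy data, is shown below.

```python
# Compact fuzzy k-NN (Keller et al. style) for two-class fatigue detection from
# features such as mean fixation duration and pupil area. Crisp training
# memberships and m = 2 are simplifying assumptions; this is an illustration,
# not the authors' implementation.
import numpy as np

def fuzzy_knn_predict(X_train, y_train, x, k=5, m=2.0, eps=1e-9):
    X_train = np.asarray(X_train, float)
    d = np.linalg.norm(X_train - np.asarray(x, float), axis=1)
    nn = np.argsort(d)[:k]                              # indices of the k nearest neighbors
    w = 1.0 / (d[nn] ** (2.0 / (m - 1.0)) + eps)        # distance-based weights
    labels = np.asarray(y_train)[nn]
    classes = np.unique(y_train)
    memberships = {c: float(np.sum(w[labels == c]) / np.sum(w)) for c in classes}
    return max(memberships, key=memberships.get), memberships

# Toy usage: label 0 = alert, 1 = fatigued; features = [fixation_time, pupil_area].
X = [[0.25, 12.0], [0.28, 11.5], [0.55, 8.0], [0.60, 7.5]]
y = [0, 0, 1, 1]
print(fuzzy_knn_predict(X, y, x=[0.5, 8.2], k=3))
```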
NASA Astrophysics Data System (ADS)
Elleuch, Hanene; Wali, Ali; Samet, Anis; Alimi, Adel M.
2017-03-01
Two systems of eye and hand gesture recognition are used to control mobile devices. Based on real-time video streaming captured from the device's camera, the first system recognizes the motion of the user's eyes and the second detects static hand gestures. To avoid any confusion between natural and intentional movements, we developed a system that fuses the decisions coming from the eye and hand gesture recognition systems. The fusion stage was based on a decision-tree approach. We conducted a study on 5 volunteers, and the results show that our system is robust and competitive.
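A decision-tree fusion of the two recognizers' outputs could look like the following sketch. The feature layout (per-event confidences plus a motion-energy cue) and the toy labelling rule are assumptions for illustration, not the authors' actual feature set.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical per-event features: [eye-gesture confidence, hand-gesture confidence,
# head/body motion energy]. Label 1 = intentional command, 0 = natural movement.
rng = np.random.default_rng(1)
n = 200
X = np.column_stack([rng.uniform(0, 1, n), rng.uniform(0, 1, n), rng.uniform(0, 1, n)])
y = ((X[:, 0] > 0.6) & (X[:, 1] > 0.6) & (X[:, 2] < 0.4)).astype(int)  # toy rule

fusion = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# At run time, fuse the two recognizers' outputs into a single accept/reject decision.
event = np.array([[0.8, 0.7, 0.2]])
print("trigger command" if fusion.predict(event)[0] else "ignore as natural movement")
```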
The control of voluntary eye movements: new perspectives.
Krauzlis, Richard J
2005-04-01
Primates use two types of voluntary eye movements to track objects of interest: pursuit and saccades. Traditionally, these two eye movements have been viewed as distinct systems that are driven automatically by low-level visual inputs. However, two sets of findings argue for a new perspective on the control of voluntary eye movements. First, recent experiments have shown that pursuit and saccades are not controlled by entirely different neural pathways but are controlled by similar networks of cortical and subcortical regions and, in some cases, by the same neurons. Second, pursuit and saccades are not automatic responses to retinal inputs but are regulated by a process of target selection that involves a basic form of decision making. The selection process itself is guided by a variety of complex processes, including attention, perception, memory, and expectation. Together, these findings indicate that pursuit and saccades share a similar functional architecture. These points of similarity may hold the key for understanding how neural circuits negotiate the links between the many higher order functions that can influence behavior and the singular and coordinated motor actions that follow.
Young children with autism spectrum disorder use predictive eye movements in action observation.
Falck-Ytter, Terje
2010-06-23
Does a dysfunction in the mirror neuron system (MNS) underlie the social symptoms defining autism spectrum disorder (ASD)? Research suggests that the MNS matches observed actions to motor plans for similar actions, and that these motor plans include directions for predictive eye movements when observing goal-directed actions. Thus, one important question is whether children with ASD use predictive eye movements in action observation. Young children with ASD as well as typically developing children and adults were shown videos in which an actor performed object-directed actions (human agent condition). Children with ASD were also shown control videos showing objects moving by themselves (self-propelled condition). Gaze was measured using a corneal reflection technique. Children with ASD and typically developing individuals used strikingly similar goal-directed eye movements when observing others' actions in the human agent condition. Gaze was reactive in the self-propelled condition, suggesting that prediction is linked to seeing a hand-object interaction. This study does not support the view that ASD is characterized by a global dysfunction in the MNS.
Eye Tracking and Head Movement Detection: A State-of-Art Survey
2013-01-01
Eye-gaze detection and tracking have been an active research field in the past years as it adds convenience to a variety of applications. It is considered a significant untraditional method of human computer interaction. Head movement detection has also received researchers' attention and interest as it has been found to be a simple and effective interaction method. Both technologies are considered the easiest alternative interface methods. They serve a wide range of severely disabled people who are left with minimal motor abilities. For both eye tracking and head movement detection, several different approaches have been proposed and used to implement different algorithms for these technologies. Despite the amount of research done on both technologies, researchers are still trying to find robust methods to use effectively in various applications. This paper presents a state-of-art survey for eye tracking and head movement detection methods proposed in the literature. Examples of different fields of applications for both technologies, such as human-computer interaction, driving assistance systems, and assistive technologies are also investigated. PMID:27170851
Caffeine increases the velocity of rapid eye movements in unfatigued humans.
Connell, Charlotte J W; Thompson, Benjamin; Turuwhenua, Jason; Hess, Robert F; Gant, Nicholas
2017-08-01
Caffeine is a widely used dietary stimulant that can reverse the effects of fatigue on cognitive, motor and oculomotor function. However, few studies have examined the effect of caffeine on the oculomotor system when homeostasis has not been disrupted by physical fatigue. This study examined the influence of a moderate dose of caffeine on oculomotor control and visual perception in participants who were not fatigued. Within a placebo-controlled crossover design, 13 healthy adults ingested caffeine (5 mg·kg⁻¹ body mass) and were tested over 3 h. Eye movements, including saccades, smooth pursuit and optokinetic nystagmus, were measured using infrared oculography. Caffeine was associated with higher peak saccade velocities (472 ± 60°·s⁻¹) compared to placebo (455 ± 62°·s⁻¹). Quick phases of optokinetic nystagmus were also significantly faster with caffeine, whereas pursuit eye movements were unchanged. Non-oculomotor perceptual tasks (global motion and global orientation processing) were unaffected by caffeine. These results show that oculomotor control is modulated by a moderate dose of caffeine in unfatigued humans. These effects are detectable in the kinematics of rapid eye movements, whereas pursuit eye movements and visual perception are unaffected. Oculomotor functions may be sensitive to changes in central catecholamines mediated via caffeine's action as an adenosine antagonist, even when participants are not fatigued.
Extraocular muscle function testing
... may result in double vision or rapid, uncontrolled eye movements. Normal Results Normal movement of the eyes in all directions. What Abnormal Results Mean Eye movement disorders may be due to abnormalities of the ...
... Nerve Disorders Internuclear ophthalmoplegia is impairment of horizontal eye movements caused by damage to certain connections between nerve ... include Lyme disease, tumors, and head injuries. Horizontal eye movements are impaired, but vertical eye movements are not. ...
NASA Technical Reports Server (NTRS)
Das, V. E.; Thomas, C. W.; Zivotofsky, A. Z.; Leigh, R. J.
1996-01-01
Video-based eye-tracking systems are especially suited to studying eye movements during naturally occurring activities such as locomotion, but eye velocity records suffer from broad band noise that is not amenable to conventional filtering methods. We evaluated the effectiveness of combined median and moving-average filters by comparing prefiltered and postfiltered records made synchronously with a video eye-tracker and the magnetic search coil technique, which is relatively noise free. Root-mean-square noise was reduced by half, without distorting the eye velocity signal. To illustrate the practical use of this technique, we studied normal subjects and patients with deficient labyrinthine function and compared their ability to hold gaze on a visual target that moved with their heads (cancellation of the vestibulo-ocular reflex). Patients and normal subjects performed similarly during active head rotation but, during locomotion, patients held their eyes more steadily on the visual target than did subjects.
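The combined median and moving-average filtering evaluated here can be approximated in a few lines of NumPy/SciPy; the kernel lengths and the synthetic trace below are arbitrary placeholders rather than the values used by the authors.

```python
import numpy as np
from scipy.signal import medfilt

def denoise_eye_velocity(v, median_width=5, ma_width=5):
    """Median filter (removes impulsive, broad-band spikes) followed by a
    moving average (smooths residual noise), applied to an eye-velocity trace."""
    v_med = medfilt(v, kernel_size=median_width)      # kernel_size must be odd
    kernel = np.ones(ma_width) / ma_width
    return np.convolve(v_med, kernel, mode="same")

# Synthetic example: 10 deg/s sinusoidal eye velocity plus broad-band noise.
t = np.arange(0, 5, 1 / 60.0)                         # 60 Hz video sampling
clean = 10 * np.sin(2 * np.pi * 0.5 * t)
noisy = clean + np.random.default_rng(2).normal(0, 3, t.size)
filtered = denoise_eye_velocity(noisy)

def rms(x):
    return np.sqrt(np.mean((x - clean) ** 2))

print(f"RMS noise before: {rms(noisy):.2f} deg/s, after: {rms(filtered):.2f} deg/s")
```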
Pursuit Latency for Chromatic Targets
NASA Technical Reports Server (NTRS)
Mulligan, Jeffrey B.; Ellis, Stephen R. (Technical Monitor)
1998-01-01
The temporal dynamics of eye movement response to a change in direction of stimulus motion have been used to compare the processing speeds of different types of stimuli (Mulligan, ARVO '97). In this study, the pursuit response to colored targets was measured to test the hypothesis that the slow response of the chromatic system (as measured using traditional temporal sensitivity measures such as contrast sensitivity) results in increased eye movement latencies. Subjects viewed a small (0.4 deg) Gaussian spot which moved downward at a speed of 6.6 deg/sec. At a variable time during the trajectory, the dot's direction of motion changed by 30 degrees, either to the right or left. Subjects were instructed to pursue the spot. Eye movements were measured using a video ophthalmoscope with an angular resolution of approximately 1 arc min and a temporal sampling rate of 60 Hz. Stimuli were modulated in chrominance for a variety of hue directions, combined with a range of small luminance increments and decrements, to ensure that some of the stimuli fell in the subjects' equiluminance planes. The smooth portions of the resulting eye movement traces were fit by convolving the stimulus velocity with an exponential having variable onset latency, time constant and amplitude. Smooth eye movements with few saccades were observed for all stimuli. Pursuit responses to stimuli having a significant luminance component are well-fit by exponentials having latencies and time constants on the order of 100 msec. Increases in pursuit response latency on the order of 100-200 msec are observed in response to certain stimuli, which occur in pairs of complementary hues, corresponding to the intersection of the stimulus section with the subjects' equiluminant plane. Smooth eye movements can be made in response to purely chromatic stimuli, but are slower than responses to stimuli with a luminance component.
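The fitting model described above (an exponential with variable onset latency, time constant and amplitude applied to the velocity step) can be sketched as a parametric fit; the sampling rate, noise level, and parameter values below are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def pursuit_step_response(t, amplitude, latency, tau):
    """Eye-velocity change after a step change in target velocity at t = 0:
    zero until the onset latency, then an exponential rise toward the amplitude."""
    out = np.zeros_like(t)
    late = t >= latency
    out[late] = amplitude * (1.0 - np.exp(-(t[late] - latency) / tau))
    return out

# Synthetic 60 Hz trace: a 3.3 deg/s horizontal velocity step (a 30 deg turn of a
# 6.6 deg/s trajectory), 120 ms latency, 100 ms time constant, plus noise.
t = np.arange(0, 1.0, 1 / 60.0)
true = pursuit_step_response(t, 3.3, 0.12, 0.10)
eye = true + np.random.default_rng(3).normal(0, 0.3, t.size)

p0 = [3.0, 0.1, 0.1]
bounds = ([0.0, 0.0, 0.01], [10.0, 0.5, 1.0])
(amp, lat, tau), _ = curve_fit(pursuit_step_response, t, eye, p0=p0, bounds=bounds)
print(f"amplitude {amp:.2f} deg/s, latency {lat*1000:.0f} ms, tau {tau*1000:.0f} ms")
```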
Skuballa, Irene T; Fortunski, Caroline; Renkl, Alexander
2015-01-01
The main research goal of the present study was to investigate to what extent pre-training eye movements can facilitate knowledge acquisition in multimedia (pre-training principle). We combined considerations from research on eye movement modeling and pre-training to design and test a non-verbal eye movement-based pre-training. Participants in the experimental condition watched an animated circle moving in close spatial resemblance to a static visualization of a solar plant accompanied by a narration in a subsequently presented learning environment. This training was expected to foster top-down processes as reflected in gaze behavior during the learning process and enhance knowledge acquisition. We compared two groups (N = 45): participants in the experimental condition received pre-training in a first step and processed the learning material in a second step, whereas the control group underwent the second step without any pre-training. The pre-training group outperformed the control group in their learning outcomes, particularly in knowledge about processes and functions of the solar plant. However, the superior learning outcomes in the pre-training group could not be explained by eye-movement patterns. Furthermore, the pre-training moderated the relationship between experienced stress and learning outcomes. In the control group, high stress levels hindered learning, which was not found for the pre-training group. On a delayed posttest, participants were requested to draw a picture of the learning content. Despite a non-significant effect of training on the quality of drawings, the pre-training group showed associations between learning outcomes at the first testing time and process-related aspects of the quality of their drawings. Overall, non-verbal pre-training is a successful instructional intervention to promote learning processes in novices, although these processes were not directly reflected in learners' eye movement behavior during learning.
Compensating For Movement Of Eye In Laser Surgery
NASA Technical Reports Server (NTRS)
Juday, Richard D.
1991-01-01
Conceptual system for laser surgery of retina includes subsystem that tracks position of retina. Tracking signal used to control galvanometer-driven mirrors keeping laser aimed at desired spot on retina as eye moves. Alternatively or additionally, indication of position used to prevent firing of laser when eye moved too far from proper aiming position.
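The two safeguards described, mirror steering from the tracking signal and inhibition of firing when the eye drifts too far, amount to a gated proportional loop. The sketch below uses hypothetical names, gains, and limits.

```python
def laser_control_step(retina_offset_deg, k_p=0.8, max_safe_offset_deg=0.5):
    """One hypothetical control cycle: steer the galvanometer mirrors toward the
    tracked retinal target, and enable firing only while the measured eye offset
    stays inside a safe window (all values here are placeholders)."""
    mirror_command_deg = tuple(k_p * axis for axis in retina_offset_deg)
    fire_enabled = all(abs(axis) <= max_safe_offset_deg for axis in retina_offset_deg)
    return mirror_command_deg, fire_enabled

# Small drift: correct the mirrors and keep firing; large movement: correct but inhibit.
print(laser_control_step((0.1, -0.2)))   # approx ((0.08, -0.16), True)
print(laser_control_step((1.5, 0.3)))    # approx ((1.2, 0.24), False)
```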
Observers' cognitive states modulate how visual inputs relate to gaze control.
Kardan, Omid; Henderson, John M; Yourganov, Grigori; Berman, Marc G
2016-09-01
Previous research has shown that eye-movements change depending on both the visual features of our environment, and the viewer's top-down knowledge. One important question that is unclear is the degree to which the visual goals of the viewer modulate how visual features of scenes guide eye-movements. Here, we propose a systematic framework to investigate this question. In our study, participants performed 3 different visual tasks on 135 scenes: search, memorization, and aesthetic judgment, while their eye-movements were tracked. Canonical correlation analyses showed that eye-movements were reliably more related to low-level visual features at fixations during the visual search task compared to the aesthetic judgment and scene memorization tasks. Different visual features also had different relevance to eye-movements between tasks. This modulation of the relationship between visual features and eye-movements by task was also demonstrated with classification analyses, where classifiers were trained to predict the viewing task based on eye movements and visual features at fixations. Feature loadings showed that the visual features at fixations could signal task differences independent of temporal and spatial properties of eye-movements. When classifying across participants, edge density and saliency at fixations were as important as eye-movements in the successful prediction of task, with entropy and hue also being significant, but with smaller effect sizes. When classifying within participants, brightness and saturation were also significant contributors. Canonical correlation and classification results, together with a test of moderation versus mediation, suggest that the cognitive state of the observer moderates the relationship between stimulus-driven visual features and eye-movements. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
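The canonical correlation step relating eye-movement measures to visual features at fixations can be sketched with scikit-learn; the feature names, dimensions, and synthetic data below are placeholders, not the study's variables or analysis code.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(4)
n_fix = 500
# Hypothetical visual features at fixated locations: edge density, saliency, entropy, hue.
visual = rng.normal(size=(n_fix, 4))
# Hypothetical eye-movement measures: fixation duration, incoming saccade amplitude.
eye = np.column_stack([
    0.6 * visual[:, 0] + rng.normal(scale=0.8, size=n_fix),   # duration tracks edge density
    -0.4 * visual[:, 1] + rng.normal(scale=0.8, size=n_fix),  # amplitude tracks saliency
])

cca = CCA(n_components=2).fit(visual, eye)
U, V = cca.transform(visual, eye)
canonical_r = [np.corrcoef(U[:, i], V[:, i])[0, 1] for i in range(U.shape[1])]
print("canonical correlations:", np.round(canonical_r, 2))
```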
Krieber, Magdalena; Bartl-Pokorny, Katrin D.; Pokorny, Florian B.; Einspieler, Christa; Langmann, Andrea; Körner, Christof; Falck-Ytter, Terje; Marschik, Peter B.
2016-01-01
Over the past decades, the relation between reading skills and eye movement behavior has been well documented in English-speaking cohorts. As English and German differ substantially with regard to orthographic complexity (i.e. grapheme-phoneme correspondence), we aimed to delineate specific characteristics of how reading speed and reading comprehension interact with eye movements in typically developing German-speaking (Austrian) adolescents. Eye movements of 22 participants (14 females; mean age = 13;6 years;months) were tracked while they were performing three tasks, namely silently reading words, texts, and pseudowords. Their reading skills were determined by means of a standardized German reading speed and reading comprehension assessment (Lesegeschwindigkeits- und -verständnistest für Klassen 6−12). We found that (a) reading skills were associated with various eye movement parameters in each of the three reading tasks; (b) better reading skills were associated with an increased efficiency of eye movements, but were primarily linked to spatial reading parameters, such as the number of fixations per word, the total number of saccades and saccadic amplitudes; (c) reading speed was a more reliable predictor for eye movement parameters than reading comprehension; (d) eye movements were highly correlated across reading tasks, which indicates consistent reading performances. Contrary to findings in English-speaking cohorts, the reading skills neither consistently correlated with temporal eye movement parameters nor with the number or percentage of regressions made while performing any of the three reading tasks. These results indicate that, although reading skills are associated with eye movement patterns irrespective of language, the temporal and spatial characteristics of this association may vary with orthographic consistency. PMID:26727255
Han, Ying; Ciuffreda, Kenneth J; Selenow, Arkady; Ali, Steven R
2003-04-01
To assess dynamic interactions of eye and head movements during return-sweep saccades (RSS) when reading with single-vision (SVL) versus progressive-addition (PAL) lenses in a simulated computer-based business environment. Horizontal eye and head movements were recorded objectively and simultaneously at a rate of 60 Hz during reading of single-page (SP; 14 degrees horizontal [H]) and double-page (DP; 37 degrees H) formats at 60 cm with binocular viewing. Subjects included 11 individuals with normal presbyopic vision aged 45 to 71 years selected by convenience sampling from a clinic population. Reading was performed with three types of spectacle lenses with a different clear near field of view (FOV): a SVL (60 degrees H clear FOV), a PAL-I with a relatively wide intermediate zone (7.85 mm; 18 degrees H clear FOV), and a PAL-II with a relatively narrow intermediate zone (5.60 mm; 13 degrees H clear FOV). Eye movements were initiated before head movements in the SP condition, and the reverse was found in the DP condition, with all three lens types. Duration of eye movements increased as the zone of clear vision decreased in the SP condition, and they were longer with the PALs than with the SVL in the DP condition. Gaze stabilization occurred later with the PALs than with the SVL in both the SP and DP conditions. The duration of head movements was longer with the PAL-II than with the SVL in both the SP and DP conditions. Eye movement peak velocity was greater with the SVL than the PALs in the DP condition. Eye movement and head movement strategies and timing were contingent on viewing conditions. The longer eye movement duration and gaze-stabilization times suggested that additional eye movements were needed to locate the clear-vision zone and commence reading after the RSS. Head movements with PALs for the SP condition were similarly optically induced. These eye movement and head movement results may contribute to the reduced reading rate and related symptoms reported by some PAL wearers. The dynamic interactions of eye movements and head movements during reading with the PALs appear to be a sensitive indicator of the effect of lens optical design parameters on overall reading performance, because the movements can discriminate between SVL and PAL designs and at times even between PALs.
Miyata, Hiromitsu; Minagawa-Kawai, Yasuyo; Watanabe, Shigeru; Sasaki, Toyofumi; Ueda, Kazuhiro
2012-01-01
Background A growing body of evidence suggests that meditative training enhances perception and cognition. In Japan, the Park-Sasaki method of speed-reading involves organized visual training while forming both a relaxed and concentrated state of mind, as in meditation. The present study examined relationships between reading speed, sentence comprehension, and eye movements while reading short Japanese novels. In addition to normal untrained readers, three middle-level trainees and one high-level expert on this method were included for the two case studies. Methodology/Principal Findings In Study 1, three of 17 participants were middle-level trainees on the speed-reading method. Immediately after reading each story once on a computer monitor, participants answered true or false questions regarding the content of the novel. Eye movements while reading were recorded using an eye-tracking system. Results revealed higher reading speed and lower comprehension scores in the trainees than in the untrained participants. Furthermore, eye-tracking data by untrained participants revealed multiple correlations between reading speed, accuracy and eye-movement measures, with faster readers showing shorter fixation durations and larger saccades in X than slower readers. In Study 2, participants included a high-level expert and 14 untrained students. The expert showed higher reading speed and statistically comparable, although numerically lower, comprehension scores compared with the untrained participants. During test sessions this expert moved her eyes along a nearly straight horizontal line as a first pass, without moving her eyes over the whole sentence display as did the untrained students. Conclusions/Significance In addition to revealing correlations between speed, comprehension and eye movements in reading Japanese contemporary novels by untrained readers, we describe cases of speed-reading trainees regarding relationships between these variables. The trainees overall tended to show poor performance influenced by the speed-accuracy trade-off, although this trade-off may be reduced in the case of at least one high-level expert. PMID:22590519
Combining EEG and eye movement recording in free viewing: Pitfalls and possibilities.
Nikolaev, Andrey R; Meghanathan, Radha Nila; van Leeuwen, Cees
2016-08-01
Co-registration of EEG and eye movement has promise for investigating perceptual processes in free viewing conditions, provided certain methodological challenges can be addressed. Most of these arise from the self-paced character of eye movements in free viewing conditions. Successive eye movements occur within short time intervals. Their evoked activity is likely to distort the EEG signal during fixation. Due to the non-uniform distribution of fixation durations, these distortions are systematic, survive across-trials averaging, and can become a source of confounding. We illustrate this problem with effects of sequential eye movements on the evoked potentials and time-frequency components of EEG and propose a solution based on matching of eye movement characteristics between experimental conditions. The proposal leads to a discussion of which eye movement characteristics are to be matched, depending on the EEG activity of interest. We also compare segmentation of EEG into saccade-related epochs relative to saccade and fixation onsets and discuss the problem of baseline selection and its solution. Further recommendations are given for implementing EEG-eye movement co-registration in free viewing conditions. By resolving some of the methodological problems involved, we aim to facilitate the transition from the traditional stimulus-response paradigm to the study of visual perception in more naturalistic conditions. Copyright © 2016 Elsevier Inc. All rights reserved.
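One way to implement the proposed matching of eye-movement characteristics between conditions is to subsample epochs so that the fixation-duration histograms agree bin by bin. The binning, distributions, and function below are assumptions for illustration, not the authors' procedure.

```python
import numpy as np

def match_fixation_durations(dur_a, dur_b, bin_ms=50, rng=None):
    """Subsample fixation epochs from two conditions so their duration histograms
    match bin by bin; returns indices into each condition to keep."""
    rng = rng or np.random.default_rng(0)
    edges = np.arange(0, max(dur_a.max(), dur_b.max()) + bin_ms, bin_ms)
    keep_a, keep_b = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        ia = np.flatnonzero((dur_a >= lo) & (dur_a < hi))
        ib = np.flatnonzero((dur_b >= lo) & (dur_b < hi))
        n = min(ia.size, ib.size)                    # keep equal counts per bin
        if n == 0:
            continue
        keep_a.extend(rng.choice(ia, n, replace=False))
        keep_b.extend(rng.choice(ib, n, replace=False))
    return np.array(keep_a), np.array(keep_b)

# Example: condition A has systematically longer fixations than condition B.
rng = np.random.default_rng(5)
a = rng.gamma(shape=6, scale=50, size=1000)          # durations in ms
b = rng.gamma(shape=5, scale=50, size=1000)
ka, kb = match_fixation_durations(a, b, rng=rng)
print(f"means before: {a.mean():.0f} vs {b.mean():.0f} ms; "
      f"after matching: {a[ka].mean():.0f} vs {b[kb].mean():.0f} ms")
```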
Gopal, Atul; Murthy, Aditya
2016-06-01
Voluntary control has been extensively studied in the context of eye and hand movements made in isolation, yet little is known about the nature of control during eye-hand coordination. We probed this with a redirect task. Here subjects had to make reaching/pointing movements accompanied by coordinated eye movements but had to change their plans when the target occasionally changed its position during some trials. Using a race model framework, we found that separate effector-specific mechanisms may be recruited to control eye and hand movements when executed in isolation but when the same effectors are coordinated a unitary mechanism to control coordinated eye-hand movements is employed. Specifically, we found that performance curves were distinct for the eye and hand when these movements were executed in isolation but were comparable when they were executed together. Second, the time to switch motor plans, called the target step reaction time, was different in the eye-alone and hand-alone conditions but was similar in the coordinated condition under assumption of a ballistic stage of ∼40 ms, on average. Interestingly, the existence of this ballistic stage could predict the extent of eye-hand dissociations seen in individual subjects. Finally, when subjects were explicitly instructed to control specifically a single effector (eye or hand), redirecting one effector had a strong effect on the performance of the other effector. Taken together, these results suggest that a common control signal and a ballistic stage are recruited when coordinated eye-hand movement plans require alteration. Copyright © 2016 the American Physiological Society.
Spatiotemporal Filter for Visual Motion Integration from Pursuit Eye Movements in Humans and Monkeys
Liu, Bing
2017-01-01
Despite the enduring interest in motion integration, a direct measure of the space–time filter that the brain imposes on a visual scene has been elusive. This is perhaps because of the challenge of estimating a 3D function from perceptual reports in psychophysical tasks. We take a different approach. We exploit the close connection between visual motion estimates and smooth pursuit eye movements to measure stimulus–response correlations across space and time, computing the linear space–time filter for global motion direction in humans and monkeys. Although derived from eye movements, we find that the filter predicts perceptual motion estimates quite well. To distinguish visual from motor contributions to the temporal duration of the pursuit motion filter, we recorded single-unit responses in the monkey middle temporal cortical area (MT). We find that pursuit response delays are consistent with the distribution of cortical neuron latencies and that temporal motion integration for pursuit is consistent with a short integration MT subpopulation. Remarkably, the visual system appears to preferentially weight motion signals across a narrow range of foveal eccentricities rather than uniformly over the whole visual field, with a transiently enhanced contribution from locations along the direction of motion. We find that the visual system is most sensitive to motion falling at approximately one-third the radius of the stimulus aperture. Hypothesizing that the visual drive for pursuit is related to the filtered motion energy in a motion stimulus, we compare measured and predicted eye acceleration across several other target forms. SIGNIFICANCE STATEMENT A compact model of the spatial and temporal processing underlying global motion perception has been elusive. We used visually driven smooth eye movements to find the 3D space–time function that best predicts both eye movements and perception of translating dot patterns. We found that the visual system does not appear to use all available motion signals uniformly, but rather weights motion preferentially in a narrow band at approximately one-third the radius of the stimulus. Although not universal, the filter predicts responses to other types of stimuli, demonstrating a remarkable degree of generalization that may lead to a deeper understanding of visual motion processing. PMID:28003348
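For the temporal part, estimating such a linear filter reduces to regressing eye velocity on time-lagged copies of the stimulus motion signal. The sketch below recovers a temporal kernel by least squares from synthetic data; the sampling rate, kernel shape, and lag count are assumptions, and this is not the authors' exact procedure.

```python
import numpy as np

def estimate_temporal_filter(stimulus, eye_velocity, n_lags):
    """Least-squares estimate of an FIR temporal filter mapping a stimulus motion
    signal to eye velocity; kernel[j] is the weight at a lag of j samples."""
    rows = len(stimulus) - n_lags + 1
    X = np.column_stack([stimulus[n_lags - 1 - j : n_lags - 1 - j + rows]
                         for j in range(n_lags)])
    y = eye_velocity[n_lags - 1:]
    kernel, *_ = np.linalg.lstsq(X, y, rcond=None)
    return kernel

# Synthetic data: eye velocity is the stimulus convolved with a delayed, smooth kernel.
fs = 500
rng = np.random.default_rng(6)
stim = rng.normal(size=5000)
lags = np.arange(60)
true_kernel = np.exp(-(lags - 25) ** 2 / (2 * 6.0 ** 2)) / 15.0      # peak ~50 ms at 500 Hz
eye = np.convolve(stim, true_kernel)[:len(stim)] + rng.normal(scale=0.2, size=len(stim))

est = estimate_temporal_filter(stim, eye, n_lags=60)
print(f"estimated kernel peaks at lag {est.argmax()} samples (~{est.argmax() / fs * 1000:.0f} ms)")
```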
1989-08-01
paths for integration with the off-aperture and dual-mirror VPD designs. PREFACE The goal of this work was to explore integration of an eye line-of-gaze ... Relationship in one plane between point-of-gaze on a flat scene and relative eye, detector, and scene positions ... and eye line-of-gaze measurement. As a first step towards the design of an appropriate eye tracking system for interface with the virtual cockpit
Genetics Home Reference: Duane-radial ray syndrome
... condition is characterized by a particular problem with eye movement called Duane anomaly (also known as Duane syndrome). ... the improper development of certain nerves that control eye movement. Duane anomaly limits outward eye movement (toward the ...
Eye movements reduce vividness and emotionality of "flashforwards".
Engelhard, Iris M; van den Hout, Marcel A; Janssen, Wilco C; van der Beek, Jorinde
2010-05-01
Earlier studies have shown that eye movements during retrieval of disturbing images about past events reduce their vividness and emotionality, which may be due to both tasks competing for working memory resources. This study examined whether eye movements reduce vividness and emotionality of visual distressing images about feared future events: "flashforwards". A non-clinical sample was asked to select two images of feared future events, which were self-rated for vividness and emotionality. These images were retrieved while making eye movements or without a concurrent secondary task, and then vividness and emotionality were rated again. Relative to the no-dual task condition, eye movements while thinking of future-oriented images resulted in decreased ratings of image vividness and emotional intensity. Apparently, eye movements reduce vividness and emotionality of visual images about past and future feared events. This is in line with a working memory account of the beneficial effects of eye movements, which predicts that any task that taxes working memory during retrieval of disturbing mental images will be beneficial. Copyright 2010 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
McConkie, G. W.; And Others
Sixty-six college students read two chapters from a contemporary novel while their eye movements were monitored. The eye movement data were analyzed to identify factors that influence the location of a reader's initial eye fixation on a word. When the data were partitioned according to the location of the prior fixation (i.e., launch site), the…
Context effects on smooth pursuit and manual interception of a disappearing target.
Kreyenmeier, Philipp; Fooken, Jolande; Spering, Miriam
2017-07-01
In our natural environment, we interact with moving objects that are surrounded by richly textured, dynamic visual contexts. Yet most laboratory studies on vision and movement show visual objects in front of uniform gray backgrounds. Context effects on eye movements have been widely studied, but it is less well known how visual contexts affect hand movements. Here we ask whether eye and hand movements integrate motion signals from target and context similarly or differently, and whether context effects on eye and hand change over time. We developed a track-intercept task requiring participants to track the initial launch of a moving object ("ball") with smooth pursuit eye movements. The ball disappeared after a brief presentation, and participants had to intercept it in a designated "hit zone." In two experiments ( n = 18 human observers each), the ball was shown in front of a uniform or a textured background that either was stationary or moved along with the target. Eye and hand movement latencies and speeds were similarly affected by the visual context, but eye and hand interception (eye position at time of interception, and hand interception timing error) did not differ significantly between context conditions. Eye and hand interception timing errors were strongly correlated on a trial-by-trial basis across all context conditions, highlighting the close relation between these responses in manual interception tasks. Our results indicate that visual contexts similarly affect eye and hand movements but that these effects may be short-lasting, affecting movement trajectories more than movement end points. NEW & NOTEWORTHY In a novel track-intercept paradigm, human observers tracked a briefly shown object moving across a textured, dynamic context and intercepted it with their finger after it had disappeared. Context motion significantly affected eye and hand movement latency and speed, but not interception accuracy; eye and hand position at interception were correlated on a trial-by-trial basis. Visual context effects may be short-lasting, affecting movement trajectories more than movement end points. Copyright © 2017 the American Physiological Society.
Influence of virtual reality on postural stability during movements of quiet stance.
Horlings, Corinne G C; Carpenter, Mark G; Küng, Ursula M; Honegger, Flurin; Wiederhold, Brenda; Allum, John H J
2009-02-27
Balance problems during virtual reality (VR) have been mentioned in the literature but seldom investigated despite the increased use of VR systems as a training or rehabilitation tool. We examined the influence of VR on body sway under different stance conditions. Seventeen young subjects performed four tasks (standing with feet close together or tandem stance on firm and foam surfaces for 60 s) under three visual conditions: eyes open (EO) without VR, eyes closed, or while viewing a virtual reality scene which moved with body movements. Angular velocity transducers mounted on the shoulder provided measures of body sway in the roll and pitch plane. VR caused increased pitch and roll angles and angular velocities compared to EO. The effects of VR were, for the most part, indistinguishable from the eyes-closed condition. Use of a foam surface increased sway compared to a firm surface under eyes-closed and VR conditions. During the movements of quiet stance, VR caused an increase in postural sway similar in amplitude to that caused by closing the eyes. This increased sway was present irrespective of stance surface, but was greatest on foam.
System identification of perilymphatic fistula in an animal model
NASA Technical Reports Server (NTRS)
Wall, C. 3rd; Casselbrant, M. L.
1992-01-01
An acute animal model has been developed in the chinchilla for the study of perilymphatic fistulas. Micropunctures were made in three sites to simulate bony, round window, and oval window fistulas. The eye movements in response to pressure applied to the external auditory canal were recorded after micropuncture induction and in preoperative controls. The main pressure stimulus was a pseudorandom binary sequence (PRBS) that rapidly changed between plus and minus 200 mm of water. The PRBS stimulus, with its wide frequency bandwidth, produced responses clearly above the preoperative baseline in 78 percent of the runs. The response was better between 0.5 and 3.3 Hz than it was below 0.5 Hz. The direction of horizontal eye movement was toward the side of the fistula with positive pressure applied in 92 percent of the runs. Vertical eye movements were also observed. The ratio of vertical eye displacement to horizontal eye displacement depended upon the site of the micropuncture induction. Thus, such a ratio measurement may be clinically useful in the noninvasive localization of perilymphatic fistulas in humans.
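System identification with a pseudorandom binary pressure stimulus can be illustrated with SciPy: generate a maximum-length sequence and estimate the frequency response from cross- and auto-spectra. The sampling rate, stand-in plant, and noise level below are placeholders, not the study's recording parameters.

```python
import numpy as np
from scipy.signal import max_len_seq, welch, csd, lfilter

fs = 50.0                                     # sampling rate in Hz (illustrative)
# Pseudorandom binary sequence switching between +200 and -200 mm of water.
prbs = (2 * max_len_seq(10, length=4096)[0].astype(float) - 1) * 200.0

# Stand-in "eye-movement response": a first-order low-pass system plus noise.
b, a = [0.15], [1.0, -0.85]
eye = lfilter(b, a, prbs) + np.random.default_rng(7).normal(scale=5.0, size=prbs.size)

# Non-parametric frequency-response estimate H(f) = Pxy(f) / Pxx(f).
f, Pxx = welch(prbs, fs=fs, nperseg=512)
_, Pxy = csd(prbs, eye, fs=fs, nperseg=512)
H = Pxy / Pxx
band = (f >= 0.5) & (f <= 3.3)
print("mean gain in the 0.5-3.3 Hz band:", np.abs(H[band]).mean())
```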
FEFsem neuronal response during combined volitional and reflexive pursuit.
Bakst, Leah; Fleuriet, Jérome; Mustari, Michael J
2017-05-01
Although much is known about volitional and reflexive smooth eye movements individually, much less is known about how they are coordinated. It is hypothesized that separate cortico-ponto-cerebellar loops subserve these different types of smooth eye movements. Specifically, the MT-MST-DLPN pathway is thought to be critical for ocular following eye movements, whereas the FEF-NRTP pathway is understood to be vital for volitional smooth pursuit. However, the role that these loops play in combined volitional and reflexive behavior is unknown. We used a large, textured background moving in conjunction with a small target spot to investigate the eye movements evoked by a combined volitional and reflexive pursuit task. We also assessed the activity of neurons in the smooth eye movement subregion of the frontal eye field (FEFsem). We hypothesized that the pursuit system would show less contribution from the volitional pathway in this task, owing to the increased involvement of the reflexive pathway. In accordance with this hypothesis, a majority of FEFsem neurons (63%) were less active during pursuit maintenance in a combined volitional and reflexive pursuit task than during purely volitional pursuit. Interestingly and surprisingly, the neuronal response to the addition of the large-field motion was highly correlated with the neuronal response to a target blink. This suggests that FEFsem neuronal responses to these different perturbations-whether the addition or subtraction of retinal input-may be related. We conjecture that these findings are due to changing weights of both the volitional and reflexive pathways, as well as retinal and extraretinal signals.
Premotor neurons encode torsional eye velocity during smooth-pursuit eye movements
NASA Technical Reports Server (NTRS)
Angelaki, Dora E.; Dickman, J. David
2003-01-01
Responses to horizontal and vertical ocular pursuit and head and body rotation in multiple planes were recorded in eye movement-sensitive neurons in the rostral vestibular nuclei (VN) of two rhesus monkeys. When tested during pursuit through primary eye position, the majority of the cells preferred either horizontal or vertical target motion. During pursuit of targets that moved horizontally at different vertical eccentricities or vertically at different horizontal eccentricities, eye angular velocity has been shown to include a torsional component the amplitude of which is proportional to half the gaze angle ("half-angle rule" of Listing's law). Approximately half of the neurons, the majority of which were characterized as "vertical" during pursuit through primary position, exhibited significant changes in their response gain and/or phase as a function of gaze eccentricity during pursuit, as if they were also sensitive to torsional eye velocity. Multiple linear regression analysis revealed a significant contribution of torsional eye movement sensitivity to the responsiveness of the cells. These findings suggest that many VN neurons encode three-dimensional angular velocity, rather than the two-dimensional derivative of eye position, during smooth-pursuit eye movements. Although no clear clustering of pursuit preferred-direction vectors along the semicircular canal axes was observed, the sensitivity of VN neurons to torsional eye movements might reflect a preservation of similar premotor coding of visual and vestibular-driven slow eye movements for both lateral-eyed and foveate species.
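As an aside, the half-angle rule referred to above is commonly stated as follows (a standard textbook formulation, not a formula quoted from this paper):

```latex
% Half-angle rule (approximate form, angles in radians): for a horizontal eye
% velocity \dot{\phi}_H at vertical gaze eccentricity \theta_V, the angular
% velocity axis tilts out of Listing's plane by \theta_V/2, so the torsional
% component is roughly
\omega_{\mathrm{tor}} \approx \tan\!\left(\tfrac{\theta_V}{2}\right)\dot{\phi}_H
                      \approx \tfrac{\theta_V}{2}\,\dot{\phi}_H .
```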
Takeda, Noriaki; Uno, Atsuhiko; Inohara, Hidenori; Shimada, Shoichi
2016-01-01
Background The mouse is the most commonly used animal model in biomedical research because of recent advances in molecular genetic techniques. Studies related to eye movement in mice are common in fields such as ophthalmology relating to vision, neuro-otology relating to the vestibulo-ocular reflex (VOR), neurology relating to the cerebellum’s role in movement, and psychology relating to attention. Recording eye movements in mice, however, is technically difficult. Methods We developed a new algorithm for analyzing the three-dimensional (3D) rotation vector of eye movement in mice using high-speed video-oculography (VOG). The algorithm made it possible to analyze the gain and phase of VOR using the eye’s angular velocity around the axis of eye rotation. Results When mice were rotated at 0.5 Hz and 2.5 Hz around the earth’s vertical axis with their heads in a 30° nose-down position, the vertical components of their left eye movements were in phase with the horizontal components. The VOR gain was 0.42 at 0.5 Hz and 0.74 at 2.5 Hz, and the phase lead of the eye movement against the turntable was 16.1° at 0.5 Hz and 4.88° at 2.5 Hz. Conclusions To the best of our knowledge, this is the first report of this algorithm being used to calculate a 3D rotation vector of eye movement in mice using high-speed VOG. We developed a technique for analyzing the 3D rotation vector of eye movements in mice with a high-speed infrared CCD camera. We concluded that the technique is suitable for analyzing eye movements in mice. C++ source code that calculates the 3D rotation vectors of the eye position from the two-dimensional coordinates of the pupil and the iris freckle in the image is also included with this article. PMID:27023859
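VOR gain and phase values of this kind can be computed by fitting sinusoids at the turntable frequency to head and eye angular velocity. The sketch below uses least-squares sine/cosine regression on synthetic traces whose parameters merely echo the abstract; it only illustrates the analysis and is not the article's C++ code.

```python
import numpy as np

def sinusoid_fit(signal, t, freq_hz):
    """Least-squares fit of a sinusoid at a known frequency; returns (amplitude, phase_deg)."""
    w = 2 * np.pi * freq_hz
    X = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
    (a, b, _), *_ = np.linalg.lstsq(X, signal, rcond=None)
    return np.hypot(a, b), np.degrees(np.arctan2(b, a))

# Synthetic 0.5 Hz rotation: 40 deg/s peak head velocity; the eye counter-rotates
# with gain 0.42 and a 16.1 deg phase lead (numbers echo the abstract for illustration).
fs, f0 = 200.0, 0.5
t = np.arange(0, 10, 1 / fs)
head = 40 * np.sin(2 * np.pi * f0 * t)
eye = -0.42 * 40 * np.sin(2 * np.pi * f0 * t + np.radians(16.1))

amp_h, ph_h = sinusoid_fit(head, t, f0)
amp_e, ph_e = sinusoid_fit(eye, t, f0)
gain = amp_e / amp_h
# Phase of the eye relative to the ideal compensatory (anti-phase) response.
phase_lead = (ph_e - ph_h) % 360 - 180
print(f"VOR gain {gain:.2f}, phase lead {phase_lead:.1f} deg")
```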
Eyes that bind us: Gaze leading induces an implicit sense of agency.
Stephenson, Lisa J; Edwards, S Gareth; Howard, Emma E; Bayliss, Andrew P
2018-03-01
Humans feel a sense of agency over the effects their motor system causes. This is the case for manual actions such as pushing buttons, kicking footballs, and all acts that affect the physical environment. We ask whether initiating joint attention - causing another person to follow our eye movement - can elicit an implicit sense of agency over this congruent gaze response. Eye movements themselves cannot directly affect the physical environment, but joint attention is an example of how eye movements can indirectly cause social outcomes. Here we show that leading the gaze of an on-screen face induces an underestimation of the temporal gap between action and consequence (Experiments 1 and 2). This underestimation effect, named 'temporal binding,' is thought to be a measure of an implicit sense of agency. Experiment 3 asked whether merely making an eye movement in a non-agentic, non-social context might also affect temporal estimation, and no reliable effects were detected, implying that inconsequential oculomotor acts do not reliably affect temporal estimations under these conditions. Together, these findings suggest that an implicit sense of agency is generated when initiating joint attention interactions. This is important for understanding how humans can efficiently detect and understand the social consequences of their actions. Copyright © 2017 Elsevier B.V. All rights reserved.
Constraining eye movement in individuals with Parkinson's disease during walking turns.
Ambati, V N Pradeep; Saucedo, Fabricio; Murray, Nicholas G; Powell, Douglas W; Reed-Jones, Rebecca J
2016-10-01
Walking and turning is a movement that places individuals with Parkinson's disease (PD) at increased risk for fall-related injury. However, turning is an essential movement in activities of daily living, making up to 45% of the total steps taken in a given day. Hypotheses regarding how turning is controlled suggest an essential role of anticipatory eye movements to provide feedforward information for body coordination. However, little research has investigated control of turning in individuals with PD with specific consideration for eye movements. The purpose of this study was to examine eye movement behavior and body segment coordination in individuals with PD during walking turns. Three experimental groups, a group of individuals with PD, a group of healthy young adults (YAC), and a group of healthy older adults (OAC), performed walking and turning tasks under two visual conditions: free gaze and fixed gaze. Whole-body motion capture and eye tracking characterized body segment coordination and eye movement behavior during walking trials. Statistical analysis revealed significant main effects of group (PD, YAC, and OAC) and visual condition (free and fixed gaze) on timing of segment rotation and horizontal eye movement. Within-group comparisons revealed that the timing of eye and head movements was significantly different between the free and fixed gaze conditions for YAC (p < 0.001) and OAC (p < 0.05), but not for the PD group (p > 0.05). In addition, while intersegment timings (reflecting segment coordination) were significantly different for YAC and OAC during free gaze (p < 0.05), they were not significantly different in PD. These results suggest individuals with PD do not make anticipatory eye and head movements ahead of turning and that this may result in altered segment coordination during turning. As such, eye movements may be an important addition to training programs for those with PD, possibly promoting better coordination during turning and potentially reducing the risk of falls.
Effect of viewing distance on the generation of vertical eye movements during locomotion
NASA Technical Reports Server (NTRS)
Moore, S. T.; Hirasaki, E.; Cohen, B.; Raphan, T.
1999-01-01
Vertical head and eye coordination was studied as a function of viewing distance during locomotion. Vertical head translation and pitch movements were measured using a video motion analysis system (Optotrak 3020). Vertical eye movements were recorded using a video-based pupil tracker (Iscan). Subjects (five) walked on a linear treadmill at a speed of 1.67 m/s (6 km/h) while viewing a target screen placed at distances ranging from 0.25 to 2.0 m at 0.25-m intervals. The predominant frequency of vertical head movement was 2 Hz. In accordance with previous studies, there was a small head pitch rotation, which was compensatory for vertical head translation. The magnitude of the vertical head movements and the phase relationship between head translation and pitch were little affected by viewing distance, and tended to orient the naso-occipital axis of the head at a point approximately 1 m in front of the subject (the head fixation distance or HFD). In contrast, eye velocity was significantly affected by viewing distance. When viewing a far (2-m) target, vertical eye velocity was 180 degrees out of phase with head pitch velocity, with a gain of 0.8. This indicated that the angular vestibulo-ocular reflex (aVOR) was generating the eye movement response. The major finding was that, at a close viewing distance (0.25 m), eye velocity was in phase with head pitch and compensatory for vertical head translation, suggesting that activation of the linear vestibulo-ocular reflex (lVOR) was contributing to the eye movement response. There was also a threefold increase in the magnitude of eye velocity when viewing near targets, which was consistent with the goal of maintaining gaze on target. The required vertical lVOR sensitivity to cancel an unmodified aVOR response and generate the observed eye velocity magnitude for near targets was almost 3 times that previously measured. Supplementary experiments were performed utilizing body-fixed active head pitch rotations at 1 and 2 Hz while viewing a head-fixed target. Results indicated that the interaction of smooth pursuit and the aVOR during visual suppression could modify both the gain and phase characteristics of the aVOR at frequencies encountered during locomotion. When walking, targets located closer than the HFD (1.0 m) would appear to move in the same direction as the head pitch, resulting in suppression of the aVOR. The results of the head-fixed target experiment suggest that phase modification of the aVOR during visual suppression could play a role in generating eye movements consistent with the goal of maintaining gaze on targets closer than the HFD, which would augment the lVOR response.
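The viewing-distance dependence reported here follows from the geometry of gaze stabilization; as a hedged restatement of the standard relation (not a formula from the paper), a target at distance d and a vertical head translation h require an eye rotation of roughly h/d:

```latex
% Ideal gaze stabilization for a target at viewing distance d during a vertical
% head translation h(t), small-angle approximation:
\theta(t) = \arctan\!\frac{h(t)}{d} \approx \frac{h(t)}{d}
\qquad\Longrightarrow\qquad
\dot{\theta}(t) \approx \frac{\dot{h}(t)}{d},
% so the required lVOR eye velocity scales inversely with viewing distance
% (e.g., 0.25 m versus 2.0 m implies an eightfold larger ideal response).
```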
Suzuki, David A; Yamada, Tetsuto; Yee, Robert D
2003-04-01
Neuronal responses that were observed during smooth-pursuit eye movements were recorded from cells in rostral portions of the nucleus reticularis tegmenti pontis (rNRTP). The responses were categorized as smooth-pursuit eye velocity (78%) or eye acceleration (22%). A separate population of rNRTP cells encoded static eye position. The sensitivity to pursuit eye velocity averaged 0.81 spikes/s per deg/s, whereas the average sensitivity to pursuit eye acceleration was 0.20 spikes/s per deg/s². Of the eye-velocity cells with horizontal preferences for pursuit responses, 56% were optimally responsive to contraversive smooth-pursuit eye movements and 44% preferred ipsiversive pursuit. For cells with vertical pursuit preferences, 61% preferred upward pursuit and 39% preferred downward pursuit. The direction selectivity was broad with 50% of the maximal response amplitude observed for directions of smooth pursuit up to ±85 degrees away from the optimal direction. The activities of some rNRTP cells were linearly related to eye position with an average sensitivity of 2.1 spikes/s per deg. In some cells, the magnitude of the response during smooth-pursuit eye movements was affected by the position of the eyes even though these cells did not encode eye position. On average, pursuit centered to one side of screen center elicited a response that was 73% of the response amplitude obtained with tracking centered at screen center. For pursuit centered on the opposite side, the average response was 127% of the response obtained at screen center. The results provide a neuronal rationale for the slow, pursuit-like eye movements evoked with rNRTP microstimulation and for the deficits in smooth-pursuit eye movements observed with ibotenic acid injection into rNRTP. More globally, the results support the notion of a frontal and supplementary eye field-rNRTP-cerebellum pathway involved with controlling smooth-pursuit eye movements.
Action perception as hypothesis testing.
Donnarumma, Francesco; Costantini, Marcello; Ambrosini, Ettore; Friston, Karl; Pezzulo, Giovanni
2017-04-01
We present a novel computational model that describes action perception as an active inferential process that combines motor prediction (the reuse of our own motor system to predict perceived movements) and hypothesis testing (the use of eye movements to disambiguate amongst hypotheses). The system uses a generative model of how (arm and hand) actions are performed to generate hypothesis-specific visual predictions, and directs saccades to the most informative places of the visual scene to test these predictions - and underlying hypotheses. We test the model using eye movement data from a human action observation study. In both the human study and our model, saccades are proactive whenever context affords accurate action prediction; but uncertainty induces a more reactive gaze strategy, via tracking the observed movements. Our model offers a novel perspective on action observation that highlights its active nature based on prediction dynamics and hypothesis testing. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
Raghavan, Ramanujan T; Joshua, Mati
2017-10-01
We investigated the composition of preparatory activity of frontal eye field (FEF) neurons in monkeys performing a pursuit target selection task. In response to the orthogonal motion of a large and a small reward target, monkeys initiated pursuit biased toward the direction of large reward target motion. FEF neurons exhibited robust preparatory activity preceding movement initiation in this task. Preparatory activity consisted of two components, ramping activity that was constant across target selection conditions, and a flat offset in firing rates that signaled the target selection condition. Ramping activity accounted for 50% of the variance in the preparatory activity and was linked most strongly, on a trial-by-trial basis, to pursuit eye movement latency rather than to its direction or gain. The offset in firing rates that discriminated target selection conditions accounted for 25% of the variance in the preparatory activity and was commensurate with a winner-take-all representation, signaling the direction of large reward target motion rather than a representation that matched the parameters of the upcoming movement. These results offer new insights into the role that the frontal eye fields play in target selection and pursuit control. They show that preparatory activity in the FEF signals more strongly when to move rather than where or how to move and suggest that structures outside the FEF augment its contributions to the target selection process. NEW & NOTEWORTHY We used the smooth pursuit eye movement system to link patterns of preparatory activity in the frontal eye fields to movement during a target selection task. The dominant pattern was a ramping signal that did not discriminate between selection conditions and was linked, on a trial-by-trial basis, to movement latency. A weaker pattern was composed of a constant signal that discriminated between selection conditions but was only weakly linked to the movement parameters. Copyright © 2017 the American Physiological Society.
Factors influencing the shear rate acting on silicone oil to cause silicone oil emulsification.
Chan, Yau Kei; Cheung, Ning; Wong, David
2014-10-30
The shear force between silicone oil (SO) bubble and aqueous during eye movements may underlie the development of SO emulsification. This study examines factors that may affect such shear force induced by eye movements. A surface-modified model eye chamber was put under large-amplitude eye movements (amplitude 90°, angular velocity 360°/s, and a duration 300 ms). Agarose-made indentations were introduced to mimic the effect of encircling scleral buckle. Two SOs (1300 and 5000 centistokes [cSt]), three volumes (3, 4, and 5 mL), and two eye chambers (with and without indentation) were tested. Video recording was used to capture the movements of SO inside the model chamber under various conditions. The presence of indentation within the eye chamber significantly reduced the velocity of SO movements relative to the eye chamber movements (P < 0.001). To a lesser extent, an increase in viscosity also had a significant effect in reducing the relative movements. No significant effect was observed for the extent of SO fill in the chamber. Our experimental model suggests indentation within an eye, such as that created by scleral buckling, may have the greatest influence in reducing shear force induced by eye movements. Therefore, using an encircling scleral buckle may be similarly or more effective than using SO with higher viscosity in lowering the propensity to SO emulsification. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.
Does the implicit models of leadership influence the scanning of other-race faces in adults?
Densten, Iain L.; Borrowman, Luc
2017-01-01
The current study aims to identify the relationships between implicit leadership theory (ILT) prototypes/anti-prototypes and five facial features (i.e., nasion, upper nose, lower nose, and upper lip) of a leader from a different race than the respondents. A sample of 81 Asian respondents viewed a 30-second video of a Caucasian female who, in a non-engaging manner, talked about her career achievements. As participants watched the video, their eye movements were recorded with an eye-tracking device. While previous research has identified that ILT influences perceptual and attitudinal ratings of leaders, the current study extends these findings by confirming the impact of ILT on the gaze patterns of other-race participants, who appear to adopt System 1-type thinking. This study advances our understanding of how cognitive categories or schemas influence the physicality of individuals (i.e., eye gaze or movements). Finally, this study confirms that individual ILT factors have a relationship with the eye movements of participants and suggests future research directions. PMID:28686605
Eye movements during mental time travel follow a diagonal line.
Hartmann, Matthias; Martarelli, Corinna S; Mast, Fred W; Stocker, Kurt
2014-11-01
Recent research showed that past events are associated with the back and left side, whereas future events are associated with the front and right side of space. These spatial-temporal associations have an impact on our sensorimotor system: thinking about one's past and future leads to subtle body sways in the sagittal dimension of space (Miles, Nind, & Macrae, 2010). In this study we investigated whether mental time travel leads to sensorimotor correlates in the horizontal dimension of space. Participants were asked to mentally displace themselves into the past or future while measuring their spontaneous eye movements on a blank screen. Eye gaze was directed more rightward and upward when thinking about the future than when thinking about the past. Our results provide further insight into the spatial nature of temporal thoughts, and show that not only body, but also eye movements follow a (diagonal) "time line" during mental time travel. Copyright © 2014 Elsevier Inc. All rights reserved.
Eye movements when viewing advertisements
Higgins, Emily; Leinenger, Mallorie; Rayner, Keith
2013-01-01
In this selective review, we examine key findings on eye movements when viewing advertisements. We begin with a brief, general introduction to the properties and neural underpinnings of saccadic eye movements. Next, we provide an overview of eye movement behavior during reading, scene perception, and visual search, since each of these activities is, at various times, involved in viewing ads. We then review the literature on eye movements when viewing print ads and warning labels (of the kind that appear on alcohol and tobacco ads), before turning to a consideration of advertisements in dynamic media (television and the Internet). Finally, we propose topics and methodological approaches that may prove to be useful in future research. PMID:24672500
Evaluation of the Tobii EyeX Eye tracking controller and Matlab toolkit for research.
Gibaldi, Agostino; Vanegas, Mauricio; Bex, Peter J; Maiello, Guido
2017-06-01
The Tobii EyeX Controller is a new low-cost binocular eye tracker marketed for integration in gaming and consumer applications. The manufacturers claim that the system was conceived for natural eye gaze interaction, does not require continuous recalibration, and allows moderate head movements. The Controller is provided with an SDK to foster the development of new eye tracking applications. We review the characteristics of the device for its possible use in scientific research. We develop and evaluate an open source Matlab Toolkit that can be employed to interface with the EyeX device for gaze recording in behavioral experiments. The Toolkit provides calibration procedures tailored to both binocular and monocular experiments, as well as procedures to evaluate other eye tracking devices. The observed performance of the EyeX (i.e., accuracy < 0.6°, precision < 0.25°, latency < 50 ms, and sampling frequency ≈55 Hz) is sufficient for some classes of research application. The device can be successfully employed to measure fixation parameters, saccadic, smooth pursuit and vergence eye movements. However, the relatively low sampling rate and moderate precision limit the suitability of the EyeX for monitoring micro-saccadic eye movements or for real-time gaze-contingent stimulus control. For these applications, research grade, high-cost eye tracking technology may still be necessary. Therefore, despite its limitations with respect to high-end devices, the EyeX has the potential to further the dissemination of eye tracking technology to a broad audience, and could be a valuable asset in consumer and gaming applications as well as a subset of basic and clinical research settings.
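For context, accuracy and precision figures of the kind quoted above can be computed from raw fixation samples along the following lines. This is a generic sketch using common definitions (mean angular offset for accuracy, sample-to-sample RMS for precision); the function name is invented and this is not the toolkit's own evaluation code.

```python
import numpy as np

def accuracy_and_precision(gaze_deg, target_deg):
    """Estimate eye-tracker accuracy and precision from a fixation recording.

    gaze_deg:   (n_samples, 2) gaze positions in degrees of visual angle.
    target_deg: (2,) known position of the fixation target in degrees.
    Accuracy is the mean angular offset from the target; precision is the RMS
    of sample-to-sample angular differences. Definitions follow common
    practice and may differ in detail from the toolkit's own routines.
    """
    offsets = np.linalg.norm(gaze_deg - target_deg, axis=1)
    accuracy = offsets.mean()
    step = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1)
    precision_rms = np.sqrt(np.mean(step ** 2))
    return accuracy, precision_rms

# Example: one second of simulated 55 Hz fixation data around a target at (5, 0) deg
rng = np.random.default_rng(0)
gaze = np.array([5.3, 0.1]) + 0.2 * rng.standard_normal((55, 2))
print(accuracy_and_precision(gaze, np.array([5.0, 0.0])))
```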
Head position modulates optokinetic nystagmus.
Pettorossi, V E; Ferraresi, A; Botti, F M; Panichi, R; Barmack, N H
2011-08-01
Orientation and movement rely on both visual and vestibular information mapped in separate coordinate systems. Here, we examine how coordinate systems interact to guide eye movements of rabbits. We exposed rabbits to continuous horizontal optokinetic stimulation (HOKS) at 5°/s to evoke horizontal eye movements, while they were statically or dynamically roll-tilted about the longitudinal axis. During monocular or binocular HOKS, when the rabbit was roll-tilted 30° onto the side of the eye stimulated in the posterior → anterior (P → A) direction, slow phase eye velocity (SPEV) increased by 3.5-5°/s. When the rabbit was roll-tilted 30° onto the side of the eye stimulated in the A → P direction, SPEV decreased to ~2.5°/s. We also tested the effect of roll-tilt after prolonged optokinetic stimulation had induced a negative optokinetic afternystagmus (OKAN II). In this condition, the SPEV occurred in the dark, "open loop." Modulation of SPEV of OKAN II depended on the direction of the nystagmus and was consistent with that observed during "closed loop" HOKS. Dynamic roll-tilt influenced SPEV evoked by HOKS in a similar way. The amplitude and the phase of SPEV depended on the frequency of vestibular oscillation and on HOKS velocity. We conclude that the change in the linear acceleration of the gravity vector with respect to the head during roll-tilt modulates the gain of SPEV depending on its direction. This modulation improves gaze stability at different image retinal slip velocities caused by head roll-tilt during centric or eccentric head movement.
Eye-head coordination during free exploration in human and cat.
Einhäuser, Wolfgang; Moeller, Gudrun U; Schumann, Frank; Conradt, Jörg; Vockeroth, Johannes; Bartl, Klaus; Schneider, Erich; König, Peter
2009-05-01
Eye, head, and body movements jointly control the direction of gaze and the stability of retinal images in most mammalian species. The contribution of the individual movement components, however, will largely depend on the ecological niche the animal occupies and the layout of the animal's retina, in particular its photoreceptor density distribution. Here the relative contribution of eye-in-head and head-in-world movements in cats is measured, and the results are compared to recent human data. For the cat, a lightweight custom-made head-mounted video setup was used (CatCam). Human data were acquired with the novel EyeSeeCam device, which measures eye position to control a gaze-contingent camera in real time. For both species, analysis was based on simultaneous recordings of eye and head movements during free exploration of a natural environment. Despite the substantial differences in ecological niche, photoreceptor density, and saccade frequency, eye-movement characteristics in both species are remarkably similar. Coordinated eye and head movements dominate the dynamics of the retinal input. Interestingly, compensatory (gaze-stabilizing) movements play a more dominant role in humans than they do in cats. This finding was interpreted to be a consequence of substantially different timescales for head movements, with cats' head movements showing dynamics about five times faster than humans'. For both species, models and laboratory experiments therefore need to account for this rich input dynamic to obtain validity for ecologically realistic settings.
Kasten, Erich; Bunzenthal, Ulrike; Sabel, Bernhard A
2006-11-25
It has been argued that patients with visual field defects compensate for their deficit by making more frequent eye movements toward the hemianopic field and that visual field enlargements found after vision restoration therapy (VRT) may be an artefact of such eye movements. In order to determine if this was correct, we recorded eye movements in hemianopic subjects before and after VRT. Visual fields were measured in subjects with homonymous visual field defects (n=15) caused by trauma, cerebral ischemia or haemorrhage (lesion age >6 months). Visual field charts were plotted using both high-resolution perimetry (HRP) and conventional perimetry before and after a 3-month period of VRT, with eye movements being recorded with a 2D-eye tracker. This permitted quantification of eye positions and measurements of deviation from fixation. VRT led to significant visual field enlargements as indicated by an increase of stimulus detection of 3.8% when tested using HRP and about 2.2% (OD) and 3.5% (OS) fewer misses with conventional perimetry. Eye movements were expressed as the standard deviations (S.D.) of the eye position recordings from fixation. Before VRT, the S.D. was ±0.82° horizontally and ±1.16° vertically; after VRT, it was ±0.68° and ±1.39°, respectively. A cluster analysis of the horizontal eye movements before VRT showed three types of subjects with (i) small (n=7), (ii) medium (n=7) or (iii) large fixation instability (n=1). Saccades were directed equally to the right or the left side; i.e., with no preference toward the blind hemifield. After VRT, many subjects showed a smaller variability of horizontal eye movements. Before VRT, 81.6% of the recorded eye positions were found within a range of 1° horizontally from fixation, whereas after VRT, 88.3% were within that range. In the 2° range, we found 94.8% before and 98.9% after VRT. Subjects moved their eyes 5° or more 0.3% of the time before VRT versus 0.1% after VRT. Thus, in this study, subjects with homonymous visual field defects who were attempting to fixate a central target while their fields were being plotted typically showed brief horizontal shifts with no preference toward or away from the blind hemifield. These eye movements were usually less than 1° from fixation. Large saccades toward the blind field after VRT were very rare. VRT has no effect on either the direction or the amplitude of horizontal eye movements during visual field testing. These results argue against the theory that the visual field enlargements are artefacts induced by eye movements.
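The fixation-stability measures reported above (standard deviation of eye position and the percentage of samples within fixed angular ranges) are straightforward to compute. The following is a minimal sketch, with the function name and array layout assumed for illustration rather than taken from the study.

```python
import numpy as np

def fixation_stability(eye_pos_deg):
    """Summarize fixation stability from eye positions relative to the target.

    eye_pos_deg: (n_samples, 2) horizontal/vertical eye positions in degrees,
    relative to the fixation target. Returns the horizontal and vertical
    standard deviations and the percentage of samples whose horizontal
    deviation stays within 1, 2 and 5 degrees (thresholds mirror the
    abstract's; this is an illustrative sketch, not the study's code).
    """
    horiz, vert = eye_pos_deg[:, 0], eye_pos_deg[:, 1]
    sd_h, sd_v = horiz.std(), vert.std()
    within = {r: 100.0 * np.mean(np.abs(horiz) <= r) for r in (1.0, 2.0, 5.0)}
    return sd_h, sd_v, within
```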
Ben-Simon, Avi; Ben-Shahar, Ohad; Vasserman, Genadiy; Segev, Ronen
2012-12-15
Interception of fast-moving targets is a demanding task many animals solve. To handle it successfully, mammals employ both saccadic and smooth pursuit eye movements in order to confine the target to their area centralis. But how can non-mammalian vertebrates, which lack smooth pursuit, intercept moving targets? We studied this question by exploring eye movement strategies employed by archer fish, an animal that possesses an area centralis, lacks smooth pursuit eye movements, but can intercept moving targets by shooting jets of water at them. We tracked the gaze direction of fish during interception of moving targets and found that they employ saccadic eye movements based on prediction of target position when it is hit. The fish fixates on the target's initial position for ∼0.2 s from the onset of its motion, a time period used to predict whether a shot can be made before the projection of the target exits the area centralis. If the prediction indicates otherwise, the fish performs a saccade that overshoots the center of gaze beyond the present target projection on the retina, such that after the saccade the moving target remains inside the area centralis long enough to prepare and perform a shot. These results add to the growing body of knowledge on biological target tracking and may shed light on the mechanism underlying this behavior in other animals with no neural system for the generation of smooth pursuit eye movements.
Learning rational temporal eye movement strategies.
Hoppe, David; Rothkopf, Constantin A
2016-07-19
During active behavior humans redirect their gaze several times every second within the visual environment. Where we look within static images is highly efficient, as quantified by computational models of human gaze shifts in visual search and face recognition tasks. However, when we shift gaze is mostly unknown despite its fundamental importance for survival in a dynamic world. It has been suggested that during naturalistic visuomotor behavior gaze deployment is coordinated with task-relevant events, often predictive of future events, and studies in sportsmen suggest that timing of eye movements is learned. Here we establish that humans efficiently learn to adjust the timing of eye movements in response to environmental regularities when monitoring locations in the visual scene to detect probabilistically occurring events. To detect the events humans adopt strategies that can be understood through a computational model that includes perceptual and acting uncertainties, a minimal processing time, and, crucially, the intrinsic costs of gaze behavior. Thus, subjects traded off event detection rate with behavioral costs of carrying out eye movements. Remarkably, based on this rational bounded actor model the time course of learning the gaze strategies is fully explained by an optimal Bayesian learner with humans' characteristic uncertainty in time estimation, the well-known scalar law of biological timing. Taken together, these findings establish that the human visual system is highly efficient in learning temporal regularities in the environment and that it can use these regularities to control the timing of eye movements to detect behaviorally relevant events.
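The trade-off the model formalizes, between event detection rate and the cost of carrying out eye movements under scalar timing noise, can be conveyed with a deliberately simplified toy simulation. This is not the authors' rational bounded actor model; the detection rule and every parameter value below are invented for illustration.

```python
import numpy as np

def expected_utility(check_interval, event_duration=0.5, reward=1.0,
                     cost_per_movement=0.1, weber_fraction=0.15, n_sim=10_000):
    """Toy trade-off between event detection and the cost of eye movements.

    The observer re-checks a location every `check_interval` seconds; the
    produced intervals carry scalar timing noise (std proportional to the
    interval). An event lasting `event_duration` seconds starts at a random
    phase within the interval and is detected if the next check falls inside
    it. A simplified stand-in for the model in the abstract.
    """
    rng = np.random.default_rng(1)
    produced = check_interval * (1 + weber_fraction * rng.standard_normal(n_sim))
    produced = np.clip(produced, 1e-3, None)
    event_onset = rng.uniform(0.0, produced)        # event phase within each interval
    detection_rate = np.mean(produced - event_onset <= event_duration)
    movements_per_second = 1.0 / check_interval
    return detection_rate * reward - movements_per_second * cost_per_movement

intervals = np.linspace(0.2, 3.0, 50)
best = max(intervals, key=expected_utility)
print(f"utility-maximizing checking interval ≈ {best:.2f} s")
```

Short intervals detect nearly every event but pay a high movement cost; long intervals are cheap but miss events, so an intermediate interval maximizes the toy utility, which is the qualitative point of the rational-strategy account.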
Effects of diphenhydramine on human eye movements.
Hopfenbeck, J R; Cowley, D S; Radant, A; Greenblatt, D J; Roy-Byrne, P P
1995-04-01
Peak saccadic eye movement velocity (SEV) and average smooth pursuit gain (SP) are reduced in a dose-dependent manner by diazepam and provide reliable, quantitative measures of benzodiazepine agonist effects. To evaluate the specificity of these eye movement effects for agents acting at the central GABA-benzodiazepine receptor complex and the role of sedation in benzodiazepine effects, we studied eye movement effects of diphenhydramine, a sedating drug which does not act at the GABA-benzodiazepine receptor complex. Ten healthy males, aged 19-28 years, with no history of axis I psychiatric disorders or substance abuse, received 50 mg/70 kg intravenous diphenhydramine or a similar volume of saline on separate days 1 week apart. SEV, saccade latency and accuracy, SP, self-rated sedation, and short-term memory were assessed at baseline and at 5, 15, 30, 45, 60, 90 and 120 min after drug administration. Compared with placebo, diphenhydramine produced significant SEV slowing, and increases in saccade latency and self-rated sedation. There was no significant effect of diphenhydramine on smooth pursuit gain, saccade accuracy, or short-term memory. These results suggest that, like diazepam, diphenhydramine causes sedation, SEV slowing, and an increase in saccade latency. Since the degree of diphenhydramine-induced sedation was not correlated with changes in SEV or saccade latency, slowing of saccadic eye movements is unlikely to be attributable to sedation alone. Unlike diazepam, diphenhydramine does not impair smooth pursuit gain, saccadic accuracy, or memory. Different neurotransmitter systems may influence the neural pathways involved in SEV and smooth pursuit gain.
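The two oculomotor endpoints used here, peak saccadic velocity and smooth pursuit gain, can be computed from calibrated eye-position traces roughly as follows. This is an assumed sketch (including the 50 deg/s saccade cutoff), not the study's scoring procedure.

```python
import numpy as np

def peak_saccadic_velocity(eye_pos_deg, fs_hz):
    """Peak eye velocity (deg/s) from a 1-D eye-position trace sampled at fs_hz."""
    velocity = np.gradient(eye_pos_deg) * fs_hz
    return np.max(np.abs(velocity))

def smooth_pursuit_gain(eye_pos_deg, target_pos_deg, fs_hz):
    """Pursuit gain as the regression slope of eye velocity on target velocity,
    with samples above 50 deg/s crudely excluded as catch-up saccades.
    Illustrative only; the study's exact criteria are not in the abstract."""
    eye_vel = np.gradient(eye_pos_deg) * fs_hz
    target_vel = np.gradient(target_pos_deg) * fs_hz
    keep = np.abs(eye_vel) < 50.0
    return np.sum(eye_vel[keep] * target_vel[keep]) / np.sum(target_vel[keep] ** 2)
```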
Eye/Brain/Task Testbed And Software
NASA Technical Reports Server (NTRS)
Janiszewski, Thomas; Mainland, Nora; Roden, Joseph C.; Rothenheber, Edward H.; Ryan, Arthur M.; Stokes, James M.
1994-01-01
Eye/brain/task (EBT) testbed records electroencephalograms, movements of eyes, and structures of tasks to provide comprehensive data on neurophysiological experiments. Intended to serve continuing effort to develop means for interactions between human brain waves and computers. Software library associated with testbed provides capabilities to recall collected data, to process data on movements of eyes, to correlate eye-movement data with electroencephalographic data, and to present data graphically. Cognitive processes investigated in ways not previously possible.
ERIC Educational Resources Information Center
Reichle, Erik D.; Pollatsek, Alexander; Rayner, Keith
2012-01-01
Nonreading tasks that share some (but not all) of the task demands of reading have often been used to make inferences about how cognition influences when the eyes move during reading. In this article, we use variants of the E-Z Reader model of eye-movement control in reading to simulate eye-movement behavior in several of these tasks, including…
The coeruleus/subcoeruleus complex in idiopathic rapid eye movement sleep behaviour disorder.
Ehrminger, Mickael; Latimier, Alice; Pyatigorskaya, Nadya; Garcia-Lorenzo, Daniel; Leu-Semenescu, Smaranda; Vidailhet, Marie; Lehericy, Stéphane; Arnulf, Isabelle
2016-04-01
Idiopathic rapid eye movement sleep behaviour disorder is characterized by nocturnal violence, increased muscle tone during rapid eye movement sleep and the lack of any other neurological disease. However, idiopathic rapid eye movement sleep behaviour disorder can precede parkinsonism and dementia by several years. Using 3 T magnetic resonance imaging and neuromelanin-sensitive sequences, we previously found that the signal intensity was reduced in the locus coeruleus/subcoeruleus area of patients with Parkinson's disease and rapid eye movement sleep behaviour disorder. Here, we studied the integrity of the locus coeruleus/subcoeruleus complex with neuromelanin-sensitive imaging in 21 patients with idiopathic rapid eye movement sleep behaviour disorder and compared the results with those from 21 age- and gender-matched healthy volunteers. All subjects underwent a clinical examination, motor, cognitive, autonomous, psychological, olfactory and colour vision tests, and rapid eye movement sleep characterization using video-polysomnography and 3 T magnetic resonance imaging. The patients more frequently had preclinical markers of alpha-synucleinopathies, including constipation, olfactory deficits, orthostatic hypotension, and subtle motor impairment. Using neuromelanin-sensitive imaging, reduced signal intensity was identified in the locus coeruleus/subcoeruleus complex of the patients with idiopathic rapid eye movement sleep behaviour disorder. The mean sensitivity of the visual analyses of the signal performed by neuroradiologists who were blind to the clinical diagnoses was 82.5%, and the specificity was 81% for the identification of idiopathic rapid eye movement sleep behaviour disorder. The results confirm that this complex is affected in idiopathic rapid eye movement sleep behaviour disorder (to the same degree as it is affected in Parkinson's disease). Neuromelanin-sensitive imaging provides an early marker of non-dopaminergic alpha-synucleinopathy that can be detected on an individual basis. © The Author (2016). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Hermens, Frouke; Matthews, William J.
2015-01-01
We asked participants to make simple risky choices while we recorded their eye movements. We built a complete statistical model of the eye movements and found very little systematic variation in eye movements over the time course of a choice or across the different choices. The only exceptions were finding more (of the same) eye movements when choice options were similar, and an emerging gaze bias in which people looked more at the gamble they ultimately chose. These findings are inconsistent with prospect theory, the priority heuristic, or decision field theory. However, the eye movements made during a choice have a large relationship with the final choice, and this is mostly independent from the contribution of the actual attribute values in the choice options. That is, eye movements tell us not just about the processing of attribute values but also are independently associated with choice. The pattern is simple—people choose the gamble they look at more often, independently of the actual numbers they see—and this pattern is simpler than predicted by decision field theory, decision by sampling, and the parallel constraint satisfaction model. © 2015 The Authors. Journal of Behavioral Decision Making published by John Wiley & Sons Ltd. PMID:27522985
The role of eye movements in decision making and the prospect of exposure effects.
Bird, Gary D; Lauwereyns, Johan; Crawford, Matthew T
2012-05-01
The aim of the current study was to follow on from previous findings that eye movements can have a causal influence on preference formation. Shimojo et al. (2003) previously found that faces that were presented for a longer duration in a two alternative forced choice task were more likely to be judged as more attractive. This effect only occurred when an eye movement was made towards the faces (with no effect when faces were centrally presented). The current study replicated Shimojo et al.'s (2003) design, whilst controlling for potential inter-stimuli interference in central presentations. As per previous findings, when eye movements were made towards the stimuli, faces that were presented for longer durations were preferred. However, faces that were centrally presented (thus not requiring an eye movement) were also preferred in the current study. The presence of an exposure duration effect for centrally presented faces casts doubt on the necessity of the eye movement in this decision making process and has implications for decision theories that place an emphasis on the role of eye movements in decision making. Copyright © 2012 Elsevier Ltd. All rights reserved.
Quantitative analysis on electrooculography (EOG) for neurodegenerative disease
NASA Astrophysics Data System (ADS)
Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.
2007-11-01
Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among these available recording techniques, EOG is a major source for monitoring abnormal eye movements. In this real-time quantitative analysis study, methods that capture the characteristics of eye movements were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 subjects watched a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.
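As a concrete illustration of the kind of quantitative EOG processing described here, a basic velocity-threshold saccade detector might look as follows. The calibration factor, threshold, and function name are assumptions; the study's actual feature-extraction methods are not specified in the abstract.

```python
import numpy as np

def detect_saccades_eog(eog_uv, fs_hz, deg_per_uv=0.05, velocity_threshold=30.0):
    """Flag saccadic samples in a horizontal EOG channel by velocity thresholding.

    eog_uv: 1-D EOG signal in microvolts; deg_per_uv is a per-subject
    calibration factor converting EOG amplitude to gaze angle (placeholder
    value). Returns a boolean array marking samples whose angular velocity
    exceeds the threshold in deg/s. A minimal sketch of one common approach,
    not the algorithm used in the study above.
    """
    angle_deg = eog_uv * deg_per_uv
    velocity = np.abs(np.gradient(angle_deg)) * fs_hz
    return velocity > velocity_threshold

# Example: count saccadic samples in a simulated 10-s recording at 250 Hz
rng = np.random.default_rng(0)
signal = np.cumsum(rng.standard_normal(2500))       # stand-in for an EOG trace
print(detect_saccades_eog(signal, fs_hz=250).sum())
```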
Eye movements reflect and shape strategies in fraction comparison.
Ischebeck, Anja; Weilharter, Marina; Körner, Christof
2016-01-01
The comparison of fractions is a difficult task that can often be facilitated by separately comparing components (numerators and denominators) of the fractions--that is, by applying so-called component-based strategies. The usefulness of such strategies depends on the type of fraction pair to be compared. We investigated the temporal organization and the flexibility of strategy deployment in fraction comparison by evaluating sequences of eye movements in 20 young adults. We found that component-based strategies could account for the response times and the overall number of fixations observed for the different fraction pairs. The analysis of eye movement sequences showed that the initial eye movements in a trial were characterized by stereotypical scanning patterns indicative of an exploratory phase that served to establish the kind of fraction pair presented. Eye movements that followed this phase adapted to the particular type of fraction pair and indicated the deployment of specific comparison strategies. These results demonstrate that participants employ eye movements systematically to support strategy use in fraction comparison. Participants showed a remarkable flexibility to adapt to the most efficient strategy on a trial-by-trial basis. Our results confirm the value of eye movement measurements in the exploration of strategic adaptation in complex tasks.
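The transition counts analyzed in this study (eye movements between the components of the two fractions) can be tallied from a labeled fixation sequence with a few lines of code. The AOI labels below are invented for illustration, and the snippet requires Python 3.10+ for itertools.pairwise.

```python
from collections import Counter
from itertools import pairwise  # Python 3.10+

def count_transitions(fixation_aois):
    """Count transitions between areas of interest from a fixation sequence.

    fixation_aois: list of AOI labels, one per fixation, e.g. "num_left",
    "num_right", "den_left", "den_right" for the two fractions' components.
    Consecutive fixations on the same AOI are not counted as transitions.
    Labels are hypothetical; the study's AOI scheme may differ.
    """
    return Counter((a, b) for a, b in pairwise(fixation_aois) if a != b)

# Two transitions counted; the repeated fixation on "num_right" is ignored
print(count_transitions(["num_left", "num_right", "num_right", "den_left"]))
```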
Signal-dependent noise determines motor planning
NASA Astrophysics Data System (ADS)
Harris, Christopher M.; Wolpert, Daniel M.
1998-08-01
When we make saccadic eye movements or goal-directed arm movements, there is an infinite number of possible trajectories that the eye or arm could take to reach the target. However, humans show highly stereotyped trajectories in which velocity profiles of both the eye and hand are smooth and symmetric for brief movements. Here we present a unifying theory of eye and arm movements based on the single physiological assumption that the neural control signals are corrupted by noise whose variance increases with the size of the control signal. We propose that in the presence of such signal-dependent noise, the shape of a trajectory is selected to minimize the variance of the final eye or arm position. This minimum-variance theory accurately predicts the trajectories of both saccades and arm movements and the speed-accuracy trade-off described by Fitts' law. These profiles are robust to changes in the dynamics of the eye or arm, as found empirically. Moreover, the relation between path curvature and hand velocity during drawing movements reproduces the empirical "two-thirds power law". This theory provides a simple and powerful unifying perspective for both eye and arm movement control.
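The core assumption, that motor noise has a standard deviation proportional to the size of the control signal so that larger commands produce more variable endpoints, can be demonstrated with a toy simulation. This is only an illustration of the noise model, not the paper's full minimum-variance trajectory optimization.

```python
import numpy as np

def endpoint_variance(command, noise_scale=0.2, n_trials=5000, seed=0):
    """Variance of the final position of a 1-D movement whose control signal
    is corrupted by noise with standard deviation proportional to the
    signal's magnitude. A toy illustration of signal-dependent noise.

    command: 1-D array of control-signal values over time (arbitrary units).
    """
    rng = np.random.default_rng(seed)
    noise = noise_scale * np.abs(command) * rng.standard_normal((n_trials, command.size))
    endpoints = np.sum(command + noise, axis=1)     # integrate command plus noise
    return endpoints.var()

small = np.full(100, 0.1)                           # weak, sustained command
large = 10 * small                                  # same shape, ten times larger
print(endpoint_variance(small), endpoint_variance(large))  # variance grows ~100-fold
```

Because endpoint variability grows with command size, trajectories that minimize final-position variance end up smooth and bell-shaped, which is the intuition behind the theory summarized above.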
Chronic Organic Solvent Exposure Changes Visual Tracking in Men and Women.
de Oliveira, Ana R; Campos Neto, Armindo de Arruda; Bezerra de Medeiros, Paloma C; de Andrade, Michael J O; Dos Santos, Natanael A
2017-01-01
Organic solvents can change CNS sensory and motor function. Eye-movement analyses can be important tools when investigating the neurotoxic changes that result from chronic organic solvent exposure. The current research measured the eye-movement patterns of men and women with and without histories of chronic organic solvent exposure. A total of 44 volunteers between 18 and 41 years old participated in this study; 22 were men (11 exposed and 11 controls), and 22 were women (11 exposed and 11 controls). Eye movement was evaluated using a 250-Hz High-Speed Video Eye Tracker Toolbox (Cambridge Research Systems) via an image of a maze. Specific body indices of exposed and non-exposed men and women were measured with an Inbody 720 to determine whether the differences in eye-movement patterns were associated with body composition. The data were analyzed using IBM SPSS Statistics version 20.0.0. The results indicated that exposed adults showed significantly more fixations (t = 3.82; p = 0.001; r = 0.51) and longer fixations (t = 4.27; p = 0.001; r = 0.54) than their non-exposed counterparts. Comparisons within men (i.e., exposed and non-exposed) showed significant differences in the number of fixations (t = 2.21; p = 0.04; r = 0.20) and duration of fixations (t = 3.29; p = 0.001; r = 0.35). The same was true for exposed vs. non-exposed women, who showed significant differences in the number of fixations (t = 3.10; p = 0.001; r = 0.32) and fixation durations (t = 2.76; p = 0.01; r = 0.28). However, the results did not show significant differences between exposed women and men in the number and duration of fixations. No correlations were found between eye-movement pattern and body composition measures (p > 0.05). These results suggest that chronic organic solvent exposure affects eye movements, regardless of sex and body composition, and that eye tracking contributes to the investigation of the visual information processing disorders acquired by workers exposed to organic solvents.
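The group comparisons reported here pair an independent-samples t-test with an effect size r. The following sketch reproduces that computation; the data in the example are made up, and the function is illustrative rather than the study's SPSS analysis.

```python
import numpy as np
from scipy import stats

def compare_groups(exposed, control):
    """Independent-samples t-test plus the effect size r used in the study,
    computed as r = sqrt(t^2 / (t^2 + df)). The inputs would be, e.g.,
    fixation counts per participant in each group."""
    t, p = stats.ttest_ind(exposed, control)
    df = len(exposed) + len(control) - 2
    r = np.sqrt(t ** 2 / (t ** 2 + df))
    return t, p, r

# Example with made-up fixation counts for 11 exposed and 11 control participants
rng = np.random.default_rng(3)
print(compare_groups(rng.normal(120, 15, 11), rng.normal(95, 15, 11)))
```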
Contrast and assimilation in motion perception and smooth pursuit eye movements.
Spering, Miriam; Gegenfurtner, Karl R
2007-09-01
The analysis of visual motion serves many different functions ranging from object motion perception to the control of self-motion. The perception of visual motion and the oculomotor tracking of a moving object are known to be closely related and are assumed to be controlled by shared brain areas. We compared perceived velocity and the velocity of smooth pursuit eye movements in human observers in a paradigm that required the segmentation of target object motion from context motion. In each trial, a pursuit target and a visual context were independently perturbed simultaneously to briefly increase or decrease in speed. Observers had to accurately track the target and estimate target speed during the perturbation interval. Here we show that the same motion signals are processed in fundamentally different ways for perception and steady-state smooth pursuit eye movements. For the computation of perceived velocity, motion of the context was subtracted from target motion (motion contrast), whereas pursuit velocity was determined by the motion average (motion assimilation). We conclude that the human motion system uses these computations to optimally accomplish different functions: image segmentation for object motion perception and velocity estimation for the control of smooth pursuit eye movements.
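The two computations contrasted in this abstract, subtraction of context motion for perception versus averaging for pursuit, can be written out explicitly. The weights below are placeholders for illustration; the study estimated the actual contributions psychophysically.

```python
def perceived_velocity(target_vel, context_vel, contrast_weight=1.0):
    """Motion contrast: perception subtracts context motion from target motion."""
    return target_vel - contrast_weight * context_vel

def pursuit_velocity(target_vel, context_vel, context_weight=0.5):
    """Motion assimilation: pursuit follows a weighted average of target and context."""
    return (1 - context_weight) * target_vel + context_weight * context_vel

# A +2 deg/s context perturbation during 10 deg/s target motion (illustrative numbers):
print(perceived_velocity(10, 2))   # perceived target speed is reduced by the context
print(pursuit_velocity(10, 2))     # pursuit velocity is pulled toward the context
```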
Paroxysmal eye–head movements in Glut1 deficiency syndrome
Engelstad, Kristin; Kane, Steven A.; Goldberg, Michael E.; De Vivo, Darryl C.
2017-01-01
Objective: To describe a characteristic paroxysmal eye–head movement disorder that occurs in infants with Glut1 deficiency syndrome (Glut1 DS). Methods: We retrospectively reviewed the medical charts of 101 patients with Glut1 DS to obtain clinical data about episodic abnormal eye movements and analyzed video recordings of 18 eye movement episodes from 10 patients. Results: A documented history of paroxysmal abnormal eye movements was found in 32/101 patients (32%), and a detailed description was available in 18 patients, presented here. Episodes started before age 6 months in 15/18 patients (83%), and preceded the onset of seizures in 10/16 patients (63%) who experienced both types of episodes. Eye movement episodes resolved, with or without treatment, by 6 years of age in 7/8 patients with documented long-term course. Episodes were brief (usually <5 minutes). Video analysis revealed that the eye movements were rapid, multidirectional, and often accompanied by a head movement in the same direction. Eye movements were separated by clear intervals of fixation, usually ranging from 200 to 800 ms. The movements were consistent with eye–head gaze saccades. These movements can be distinguished from opsoclonus by the presence of a clear intermovement fixation interval and the association of a same-direction head movement. Conclusions: Paroxysmal eye–head movements, for which we suggest the term aberrant gaze saccades, are an early symptom of Glut1 DS in infancy. Recognition of the episodes will facilitate prompt diagnosis of this treatable neurodevelopmental disorder. PMID:28341645
CEFR and Eye Movement Characteristics during EFL Reading: The Case of Intermediate Readers
ERIC Educational Resources Information Center
Dolgunsöz, Emrah; Sariçoban, Arif
2016-01-01
This study primarily aims to (1) examine the relationship between foreign language reading proficiency and eye movements during reading, and (2) to describe eye movement differences between two CEFR proficiency groups (B1 and B2) by using eye tracking technique. 57 learners of EFL were tested under two experimental conditions: Natural L2 reading…
Król, Magdalena Ewa; Król, Michał
2018-02-20
The aim of the study was not only to demonstrate whether eye-movement-based task decoding was possible but also to investigate whether eye-movement patterns can be used to identify the cognitive processes behind the tasks. We compared eye-movement patterns elicited under different task conditions, with tasks differing systematically with regard to the types of cognitive processes involved in solving them. We used four tasks, differing along two dimensions: spatial (global vs. local) processing (Navon, Cognit Psychol, 9(3):353-383 1977) and semantic (deep vs. shallow) processing (Craik and Lockhart, J Verbal Learn Verbal Behav, 11(6):671-684 1972). We used eye-movement patterns obtained from two time periods: the fixation cross preceding the target stimulus and the target stimulus itself. We found significant effects of both spatial and semantic processing, but in the case of the latter, the effect might be an artefact of insufficient task control. We found above-chance task classification accuracy for both time periods: 51.4% for the period of stimulus presentation and 34.8% for the period of fixation cross presentation. Therefore, we show that the task can be decoded to some extent from preparatory eye movements before the stimulus is displayed. This suggests that anticipatory eye movements reflect the visual scanning strategy employed for the task at hand. Finally, this study also demonstrates that decoding is possible even from very scant eye-movement data, similar to Coco and Keller, J Vis 14(3):11-11 (2014). This means that task decoding is not limited to tasks that naturally take longer to perform and yield multi-second eye-movement recordings.
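A generic version of the decoding analysis, classifying the task from per-trial eye-movement features and comparing accuracy against chance, could be set up as follows. The feature set and classifier are assumptions, not the pipeline used in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def decode_task(features, task_labels):
    """Cross-validated task decoding from eye-movement features.

    features: (n_trials, n_features) array of per-trial descriptors such as
    fixation count, mean fixation duration, and mean saccade amplitude.
    task_labels: (n_trials,) array of task identifiers.
    Returns mean classification accuracy, to be compared against the chance
    level (25% for four tasks). Generic sketch; the study's own classifier
    and features are not specified here.
    """
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, features, task_labels, cv=5).mean()

# Example with random features (accuracy should hover around chance, ~0.25)
rng = np.random.default_rng(0)
print(decode_task(rng.standard_normal((200, 6)), rng.integers(0, 4, 200)))
```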
Effects of background stimulation upon eye-movement information.
Nakamura, S
1996-04-01
To investigate the effects of background stimulation upon eye-movement information (EMI), the perceived deceleration of the target motion during pursuit eye movement (the Aubert-Fleischl paradox) was analyzed. In the experiment, a striped pattern was used as a background stimulus with various brightness contrasts and spatial frequencies for serially manipulating the attributes of the background stimulus. Analysis showed that the retinal-image motion of the background stimulus (optic flow) affected eye-movement information and that the effects of optic flow became stronger when high contrast and low spatial frequency stripes were presented as the background stimulus. In conclusion, optic flow is one source of eye-movement information in determining real object motion, and the effectiveness of optic flow depends on the attributes of the background stimulus.
A Pilot Study of Horizontal Head and Eye Rotations in Baseball Batting.
Fogt, Nick; Persson, Tyler W
2017-08-01
The purpose of the study was to measure and compare horizontal head and eye tracking movements as baseball batters "took" pitches and swung at baseball pitches. Two former college baseball players were tested in two conditions. A pitching machine was used to project tennis balls toward the subjects. In the first condition, subjects acted as if they were taking (i.e., not swinging) the pitches. In the second condition, subjects attempted to bat the pitched balls. Head movements were measured with an inertial sensor; eye movements were measured with a video eye tracker. For each condition, the relationship between the horizontal head and eye rotations was similar for the two subjects, as were the overall head-, eye-, and gaze-tracking strategies. In the "take" condition, head movements in the direction of the ball were larger than eye movements for much of the pitch trajectory. Large eye movements occurred only late in the pitch trajectory. Gaze was directed near the ball until approximately 150 milliseconds before the ball arrived at the batter, at which time gaze was directed ahead of the ball to a location near that occupied when the ball crosses the plate. In the "swing" condition, head movements in the direction of the ball were larger than eye movements throughout the pitch trajectory. Gaze was directed near the ball until approximately 50 to 60 milliseconds prior to pitch arrival at the batter. Horizontal head rotations were larger than horizontal eye rotations in both the "take" and "swing" conditions. Gaze was directed ahead of the ball late in the pitch trajectory in the "take" condition, whereas gaze was directed near the ball throughout much of the pitch trajectory in the "swing" condition.
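The gaze measures discussed here follow from adding the head-in-world and eye-in-head rotations. A minimal sketch of that bookkeeping, with a synthetic example rather than the study's data, is shown below.

```python
import numpy as np

def gaze_relative_to_ball(head_deg, eye_deg, ball_deg):
    """Combine horizontal head and eye rotations into gaze and express gaze
    relative to the ball's direction (positive = gaze ahead of the ball).

    All inputs are 1-D sequences of horizontal angles (degrees) on a common
    time base; eye rotation is eye-in-head, so gaze = head + eye. A simple
    sketch of the bookkeeping described above, not the study's analysis code.
    """
    gaze_deg = np.asarray(head_deg) + np.asarray(eye_deg)
    return gaze_deg - np.asarray(ball_deg)

# Example: late in the trajectory, a positive value means gaze has jumped ahead
print(gaze_relative_to_ball([10, 20, 30], [1, 2, 8], [11, 22, 33]))
```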
Detecting eye movements in dynamic environments.
Reimer, Bryan; Sodhi, Manbir
2006-11-01
To take advantage of the increasing number of in-vehicle devices, automobile drivers must divide their attention between primary (driving) and secondary (operating in-vehicle device) tasks. In dynamic environments such as driving, however, it is not easy to identify and quantify how a driver focuses on the various tasks he/she is simultaneously engaged in, including the distracting tasks. Measures derived from the driver's scan path have been used as correlates of driver attention. This article presents a methodology for analyzing eye positions, which are discrete samples of a subject's scan path, in order to categorize driver eye movements. Previous methods of analyzing eye positions recorded in a dynamic environment have relied completely on the manual identification of the focus of visual attention from a point of regard superimposed on a video of a recorded scene, failing to utilize information regarding movement structure in the raw recorded eye positions. Although effective, these methods are too time consuming to be easily used when the large data sets that would be required to identify subtle differences between drivers, under different road conditions, and with different levels of distraction are processed. The aim of the methods presented in this article is to extend the degree of automation in the processing of eye movement data by proposing a methodology for eye movement analysis that extends automated fixation identification to include smooth and saccadic movements. By identifying eye movements in the recorded eye positions, a method of reducing the analysis of scene video to a finite search space is presented. The implementation of a software tool for the eye movement analysis is described, including an example from an on-road test-driving sample.
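The extension of fixation identification to saccadic and smooth movements that the article proposes can be caricatured with a simple velocity-band classifier. The thresholds below are generic textbook values, and the article's actual algorithm is more elaborate.

```python
import numpy as np

def classify_samples(gaze_deg, fs_hz, fixation_max=5.0, saccade_min=100.0):
    """Label each gaze sample as fixation, smooth pursuit, or saccade from its
    angular velocity, extending simple velocity-threshold (I-VT) fixation
    detection with an intermediate smooth-movement band. Thresholds are
    typical illustrative values, not those used in the article above.

    gaze_deg: (n_samples, 2) gaze position in degrees; fs_hz: sampling rate.
    """
    velocity = np.linalg.norm(np.gradient(gaze_deg, axis=0), axis=1) * fs_hz
    labels = np.full(velocity.shape, "pursuit", dtype=object)
    labels[velocity < fixation_max] = "fixation"
    labels[velocity > saccade_min] = "saccade"
    return labels
```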
Eye Gaze during Observation of Static Faces in Deaf People
Watanabe, Katsumi; Matsuda, Tetsuya; Nishioka, Tomoyuki; Namatame, Miki
2011-01-01
Knowing where people look when viewing faces provides an objective measure of the information entering the visual system as well as of the cognitive strategy involved in facial perception. In the present study, we recorded the eye movements of 20 congenitally deaf (10 male and 10 female) and 23 (11 male and 12 female) normal-hearing Japanese participants while they evaluated the emotional valence of static face stimuli. While no difference was found in the evaluation scores, the eye movements during facial observations differed among participant groups. The deaf group looked at the eyes more frequently and for longer duration than the nose, whereas the hearing group focused on the nose (or the central region of the face) more than the eyes. These results suggest that the strategy employed to extract visual information when viewing static faces may differ between deaf and hearing people. PMID:21359223
Effects of Saccadic Bilateral Eye Movements on Episodic and Semantic Autobiographical Memory Fluency
Parker, Andrew; Parkin, Adam; Dagnall, Neil
2013-01-01
Performing a sequence of fast saccadic horizontal eye movements has been shown to facilitate performance on a range of cognitive tasks, including the retrieval of episodic memories. One explanation for these effects is based on the hypothesis that saccadic eye movements increase hemispheric interaction, and that such interactions are important for particular types of memory. The aim of the current research was to assess the effect of horizontal saccadic eye movements on the retrieval of both episodic autobiographical memory (event/incident based memory) and semantic autobiographical memory (fact based memory) over recent and more distant time periods. It was found that saccadic eye movements facilitated the retrieval of episodic autobiographical memories (over all time periods) but not semantic autobiographical memories. In addition, eye movements did not enhance the retrieval of non-autobiographical semantic memory. This finding illustrates a dissociation between the episodic and semantic characteristics of personal memory and is considered within the context of hemispheric contributions to episodic memory performance. PMID:24133435
Eye movement related brain responses to emotional scenes during free viewing
Simola, Jaana; Torniainen, Jari; Moisala, Mona; Kivikangas, Markus; Krause, Christina M.
2013-01-01
Emotional stimuli are preferentially processed over neutral stimuli. Previous studies, however, disagree on whether emotional stimuli capture attention preattentively or whether the processing advantage is dependent on allocation of attention. The present study investigated attention and emotion processes by measuring brain responses related to eye movement events while 11 participants viewed images selected from the International Affective Picture System (IAPS). Brain responses to emotional stimuli were compared between serial and parallel presentation. An “emotional” set included one image with high positive or negative valence among neutral images. A “neutral” set comprised four neutral images. The participants were asked to indicate which picture—if any—was emotional and to rate that picture on valence and arousal. In the serial condition, the event-related potentials (ERPs) were time-locked to the stimulus onset. In the parallel condition, the ERPs were time-locked to the first eye entry on an image. The eye movement results showed facilitated processing of emotional, especially unpleasant information. The EEG results in both presentation conditions showed that the LPP (“late positive potential”) amplitudes at 400–500 ms were enlarged for the unpleasant and pleasant pictures as compared to neutral pictures. Moreover, the unpleasant scenes elicited stronger responses than pleasant scenes. The ERP results did not support parafoveal emotional processing, although the eye movement results suggested faster attention capture by emotional stimuli. Our findings, thus, suggested that emotional processing depends on overt attentional resources engaged in the processing of emotional content. The results also indicate that brain responses to emotional images can be analyzed time-locked to eye movement events, although the response amplitudes were larger during serial presentation. PMID:23970856
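The key methodological point, analyzing brain responses time-locked to eye-movement events rather than to stimulus onset, amounts to epoching the EEG around fixation onsets. A bare-bones sketch is shown below; baseline correction and artifact rejection are omitted, and the function name and window are assumptions rather than the study's pipeline.

```python
import numpy as np

def fixation_related_potentials(eeg, fixation_onsets, fs_hz, tmin=-0.2, tmax=0.8):
    """Average EEG epochs time-locked to fixation onsets (eye-movement events).

    eeg: (n_channels, n_samples) continuous EEG; fixation_onsets: sample
    indices of the first eye entry on each image. Returns the
    (n_channels, n_times) average, from which a window such as 400-500 ms can
    be read out as an LPP-like measure. Minimal sketch, not the study's code.
    """
    pre, post = int(-tmin * fs_hz), int(tmax * fs_hz)
    epochs = [eeg[:, s - pre:s + post] for s in fixation_onsets
              if s - pre >= 0 and s + post <= eeg.shape[1]]
    return np.mean(epochs, axis=0)
```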
Lemieux, Chantal L; Collin, Charles A; Nelson, Elizabeth A
2015-02-01
In two experiments, we examined the effects of varying the spatial frequency (SF) content of face images on eye movements during the learning and testing phases of an old/new recognition task. At both learning and testing, participants were presented with face stimuli band-pass filtered to 11 different SF bands, as well as an unfiltered baseline condition. We found that eye movements varied significantly as a function of SF. Specifically, the frequency of transitions between facial features showed a band-pass pattern, with more transitions for middle-band faces (≈5-20 cycles/face) than for low-band (≈<5 cpf) or high-band (≈>20 cpf) ones. These findings were similar for the learning and testing phases. The distributions of transitions across facial features were similar for the middle-band, high-band, and unfiltered faces, showing a concentration on the eyes and mouth; conversely, low-band faces elicited mostly transitions involving the nose and nasion. The eye movement patterns elicited by low, middle, and high bands are similar to those previous researchers have suggested reflect holistic, configural, and featural processing, respectively. More generally, our results are compatible with the hypotheses that eye movements are functional, and that the visual system makes flexible use of visuospatial information in face processing. Finally, our finding that only middle spatial frequencies yielded the same number and distribution of fixations as unfiltered faces adds more evidence to the idea that these frequencies are especially important for face recognition, and reveals a possible mediator for the superior performance that they elicit.
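Band-pass filtering a face image in units of cycles per face, as used to create the SF conditions above, can be sketched with a hard annular mask in the Fourier domain. The assumption that the face spans the whole square image (so cycles/image equals cycles/face) and the ideal filter are simplifications relative to the study's stimuli.

```python
import numpy as np

def bandpass_face(image, low_cpf, high_cpf):
    """Band-pass filter a square face image between two spatial frequencies
    given in cycles per face, using a hard annular mask in the Fourier domain.
    Assumes the face spans the whole image; a smoother filter may have been
    used in the study.
    """
    n = image.shape[0]
    fy, fx = np.meshgrid(np.fft.fftfreq(n) * n, np.fft.fftfreq(n) * n, indexing="ij")
    radius = np.sqrt(fx ** 2 + fy ** 2)             # radial frequency in cycles/image
    mask = (radius >= low_cpf) & (radius <= high_cpf)
    spectrum = np.fft.fft2(image)
    return np.real(np.fft.ifft2(spectrum * mask))

# Example: keep the ~5-20 cycles/face band of a random 256x256 stand-in image
face = np.random.default_rng(0).random((256, 256))
middle_band = bandpass_face(face, 5, 20)
```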
Effects of phencyclidine, secobarbital and diazepam on eye tracking in rhesus monkeys.
Ando, K; Johanson, C E; Levy, D L; Yasillo, N J; Holzman, P S; Schuster, C R
1983-01-01
Rhesus monkeys were trained to track a moving disk using a procedure in which responses on a lever were reinforced with water delivery only when the disk, oscillating in a horizontal plane on a screen at a frequency of 0.4 Hz in a visual angle of 20 degrees, dimmed for a brief period. Pursuit eye movements were recorded by electrooculography (EOG). IM phencyclidine, secobarbital, and diazepam injections decreased the number of reinforced lever presses in a dose-related manner. Both secobarbital and diazepam produced episodic jerky-pursuit eye movements, while phencyclidine had no consistent effects on eye movements. Lever pressing was disrupted at doses which had little effect on the quality of smooth-pursuit eye movements in some monkeys. This separation was particularly pronounced with diazepam. The similarities of the drug effects on smooth-pursuit eye movements between the present study and human studies indicate that the present method using rhesus monkeys may be useful for predicting drug effects on eye tracking and oculomotor function in humans.
Kasahara, Satoshi; Akao, Teppei; Kurkin, Sergei; Peterson, Barry W.
2009-01-01
Eye and head movements are coordinated during head-free pursuit. To examine whether pursuit neurons in frontal eye fields (FEF) carry gaze-pursuit commands that drive both eye-pursuit and head-pursuit, monkeys whose heads were free to rotate about a vertical axis were trained to pursue a juice feeder with their head and a target with their eyes. Initially the feeder and target moved synchronously with the same visual angle. FEF neurons responding to this gaze-pursuit were tested for eye-pursuit of target motion while the feeder was stationary and for head-pursuit while the target was stationary. The majority of pursuit neurons exhibited modulation during head-pursuit, but their preferred directions during eye-pursuit and head-pursuit were different. Although peak modulation occurred during head movements, the onset of discharge usually was not aligned with the head movement onset. The minority of neurons whose discharge onset was so aligned discharged after the head movement onset. These results do not support the idea that the head-pursuit–related modulation reflects head-pursuit commands. Furthermore, modulation similar to that during head-pursuit was obtained by passive head rotation on stationary trunk. Our results suggest that FEF pursuit neurons issue gaze or eye movement commands during gaze-pursuit and that the head-pursuit–related modulation primarily reflects reafferent signals resulting from head movements. PMID:18483002
Trillenberg, Peter; Sprenger, Andreas; Talamo, Silke; Herold, Kirsten; Helmchen, Christoph; Verleger, Rolf; Lencer, Rebekka
2017-04-01
Despite many reports on visual processing deficits in psychotic disorders, studies are needed on the integration of visual and non-visual components of eye movement control to improve the understanding of sensorimotor information processing in these disorders. Non-visual inputs to eye movement control include prediction of future target velocity from extrapolation of past visual target movement and anticipation of future target movements. It is unclear whether non-visual input is impaired in patients with schizophrenia. We recorded smooth pursuit eye movements in 21 patients with schizophrenia spectrum disorder, 22 patients with bipolar disorder, and 24 controls. In a foveo-fugal ramp task, the target was either continuously visible or was blanked during movement. We determined peak gain (measuring overall performance), initial eye acceleration (measuring visually driven pursuit), deceleration after target extinction (measuring prediction), eye velocity drifts before onset of target visibility (measuring anticipation), and residual gain during blanking intervals (measuring anticipation and prediction). In both patient groups, initial eye acceleration was decreased and the ability to adjust eye acceleration to increasing target acceleration was impaired. In contrast, neither deceleration nor eye drift velocity was reduced in patients, implying unimpaired non-visual contributions to pursuit drive. Disturbances of eye movement control in psychotic disorders appear to be a consequence of deficits in sensorimotor transformation rather than a pure failure in adding cognitive contributions to pursuit drive in higher-order cortical circuits. More generally, this deficit might reflect a fundamental imbalance between processing external input and acting according to internal preferences.
Eye Carduino: A Car Control System using Eye Movements
NASA Astrophysics Data System (ADS)
Kumar, Arjun; Nagaraj, Disha; Louzardo, Joel; Hegde, Rajeshwari
2011-12-01
Modern automotive systems are rapidly becoming highly defined and characterized by embedded electronics and software. With new technologies, the vehicle industry is facing new opportunities and also new challenges. Electronics have improved the performance of vehicles and at the same time, new more complex applications are introduced. Examples of high level applications include adaptive cruise control and electronic stability programs (ESP). Further, a modern vehicle does not have to be merely a means of transportation, but can be a web integrated media centre. This paper explains the implementation of a vehicle control using only eye movements. The EyeWriter's native hardware and software work to return the co-ordinates of where the user is looking. These co-ordinates are then used to control the car. A centre-point is defined on the screen. The higher on the screen the user's gaze is, the faster the car will accelerate. Braking is done by looking below centre. Steering is done by looking left and right on the screen.
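The control scheme described (accelerate by looking above centre, brake below, steer left/right) reduces to a simple mapping from gaze coordinates to commands. The sketch below assumes normalized screen coordinates and invented scaling, since the system's actual interface is not given in the abstract.

```python
def gaze_to_controls(gaze_x, gaze_y, center_x=0.5, center_y=0.5, max_throttle=1.0):
    """Map normalized gaze coordinates (0-1, origin at top-left) to car commands
    following the scheme in the abstract: gaze above centre accelerates (more
    so the higher the gaze), gaze below centre brakes, and horizontal gaze
    steers. Scaling and the coordinate convention are assumptions."""
    vertical_offset = center_y - gaze_y             # positive when looking above centre
    throttle = max(0.0, min(max_throttle, vertical_offset * 2 * max_throttle))
    brake = max(0.0, min(1.0, -vertical_offset * 2))
    steering = (gaze_x - center_x) * 2              # -1 (full left) to +1 (full right)
    return {"throttle": throttle, "brake": brake, "steering": steering}

print(gaze_to_controls(0.8, 0.3))                   # look upper-right: accelerate and steer right
```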
Advances in Relating Eye Movements and Cognition
ERIC Educational Resources Information Center
Hayhoe, Mary M.
2004-01-01
Measurement of eye movements is a powerful tool for investigating perceptual and cognitive function in both infants and adults. Straightforwardly, eye movements provide a multifaceted measure of performance. For example, the location of fixations, their duration, time of occurrence, and accuracy all are potentially revealing and often allow…
Generating and Describing Affective Eye Behaviors
NASA Astrophysics Data System (ADS)
Mao, Xia; Li, Zheng
The manner of a person's eye movement conveys much nonverbal information and emotional intent beyond speech. This paper describes work on expressing emotion through eye behaviors in virtual agents based on the parameters selected from the AU-Coded facial expression database and real-time eye movement data (pupil size, blink rate and saccade). A rule-based approach to generating primary (joyful, sad, angry, afraid, disgusted and surprised) and intermediate emotions (emotions that can be represented as the mixture of two primary emotions) utilizing the MPEG4 FAPs (facial animation parameters) is introduced. Meanwhile, based on our research, a scripting tool, named EEMML (Emotional Eye Movement Markup Language), that enables authors to describe and generate emotional eye movements of virtual agents, is proposed.
Eye movements during spoken word recognition in Russian children.
Sekerina, Irina A; Brooks, Patricia J
2007-09-01
This study explores incremental processing in spoken word recognition in Russian 5- and 6-year-olds and adults using free-viewing eye-tracking. Participants viewed scenes containing pictures of four familiar objects and clicked on a target embedded in a spoken instruction. In the cohort condition, two object names shared identical three-phoneme onsets. In the noncohort condition, all object names had unique onsets. Coarse-grain analyses of eye movements indicated that adults produced looks to the competitor on significantly more cohort trials than on noncohort trials, whereas children surprisingly failed to demonstrate cohort competition due to widespread exploratory eye movements across conditions. Fine-grain analyses, in contrast, showed a similar time course of eye movements across children and adults, but with cohort competition lingering more than 1s longer in children. The dissociation between coarse-grain and fine-grain eye movements indicates a need to consider multiple behavioral measures in making developmental comparisons in language processing.
Eye Movements in Darkness Modulate Self-Motion Perception.
Clemens, Ivar Adrianus H; Selen, Luc P J; Pomante, Antonella; MacNeilage, Paul R; Medendorp, W Pieter
2017-01-01
During self-motion, humans typically move the eyes to maintain fixation on the stationary environment around them. These eye movements could in principle be used to estimate self-motion, but their impact on perception is unknown. We had participants judge self-motion during different eye-movement conditions in the absence of full-field optic flow. In a two-alternative forced choice task, participants indicated whether the second of two successive passive lateral whole-body translations was longer or shorter than the first. This task was used in two experiments. In the first ( n = 8), eye movements were constrained differently in the two translation intervals by presenting either a world-fixed or body-fixed fixation point or no fixation point at all (allowing free gaze). Results show that perceived translations were shorter with a body-fixed than a world-fixed fixation point. A linear model indicated that eye-movement signals received a weight of ∼25% for the self-motion percept. This model was independently validated in the trials without a fixation point (free gaze). In the second experiment ( n = 10), gaze was free during both translation intervals. Results show that the translation with the larger eye-movement excursion was judged more often to be larger than chance, based on an oculomotor choice probability analysis. We conclude that eye-movement signals influence self-motion perception, even in the absence of visual stimulation.
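One simple reading of the linear model reported here is a weighted combination of a vestibular (body-motion) estimate and an oculomotor displacement estimate, with the eye-movement signal weighted at roughly 25%. The sketch below only illustrates that interpretation; the weight, cue values, and function name are placeholders, not the authors' model code.

```python
def perceived_translation(vestibular_cm, eye_signal_cm, w_eye=0.25):
    """Weighted-cue sketch: the percept is a linear mix of a vestibular
    estimate and an eye-movement-derived estimate of the translation."""
    return w_eye * eye_signal_cm + (1 - w_eye) * vestibular_cm

# A body-fixed fixation point implies ~zero eye excursion, so the same
# physical translation is perceived as shorter than with a world-fixed point.
print(perceived_translation(vestibular_cm=20.0, eye_signal_cm=0.0))   # body-fixed
print(perceived_translation(vestibular_cm=20.0, eye_signal_cm=20.0))  # world-fixed
```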
Hutson, John P; Smith, Tim J; Magliano, Joseph P; Loschky, Lester C
2017-01-01
Film is ubiquitous, but the processes that guide viewers' attention while viewing film narratives are poorly understood. In fact, many film theorists and practitioners disagree on whether the film stimulus (bottom-up) or the viewer (top-down) is more important in determining how we watch movies. Reading research has shown a strong connection between eye movements and comprehension, and scene perception studies have shown strong effects of viewing tasks on eye movements, but such idiosyncratic top-down control of gaze in film would be anathema to the universal control mainstream filmmakers typically aim for. Thus, in two experiments we tested whether the eye movements and comprehension relationship similarly held in a classic film example, the famous opening scene of Orson Welles' Touch of Evil (Welles & Zugsmith, Touch of Evil, 1958). Comprehension differences were compared with more volitionally controlled task-based effects on eye movements. To investigate the effects of comprehension on eye movements during film viewing, we manipulated viewers' comprehension by starting participants at different points in a film, and then tracked their eyes. Overall, the manipulation created large differences in comprehension, but only produced modest differences in eye movements. To amplify top-down effects on eye movements, a task manipulation was designed to prioritize peripheral scene features: a map task. This task manipulation created large differences in eye movements when compared to participants freely viewing the clip for comprehension. Thus, to allow for strong, volitional top-down control of eye movements in film, task manipulations need to make features that are important to narrative comprehension irrelevant to the viewing task. The evidence provided by this experimental case study suggests that filmmakers' belief in their ability to create systematic gaze behavior across viewers is confirmed, but that this does not indicate universally similar comprehension of the film narrative.
Assisting autistic children with wireless EOG technology.
Rapela, Joaquin; Lin, Tsong-Yan; Westerfield, Marissa; Jung, Tzyy-Ping; Townsend, Jeanne
2012-01-01
We propose a novel intervention to train the speed and accuracy of attention orienting and eye movements in Autism Spectrum Disorder (ASD). Training eye movements and attention could not only affect those important functions directly, but could also result in broader improvement of social communication skills. To this end, we describe a system that would allow children with ASD to improve their fixation skills while playing a computer game controlled by an eye tracker. Because this intervention will probably be time-consuming, the system should be designed to be used at home. To make this possible, we propose an implementation based on wireless and dry electrooculography (EOG) technology. If successful, this system would offer an approach to therapy that could improve clinical and behavioral function in children and adults with ASD. As our initial steps in this direction, here we describe the design of a computer game to be used in this system, and the prediction of gaze position from EOG data recorded while a subject played this game.
The oculomotor role of the pontine nuclei and the nucleus reticularis tegmenti pontis.
Thier, Peter; Möck, Martin
2006-01-01
Cerebral cortex and the cerebellum interact closely in order to facilitate spatial orientation and the generation of motor behavior, including eye movements. This interaction is based on a massive projection system that allows the exchange of signals between the two cortices. This cerebro-cerebellar communication system includes several intercalated brain stem nuclei, whose eminent role in the organization of oculomotor behavior has only recently become apparent. This review focuses on the two major nuclei of this group taking a precerebellar position, the pontine nuclei and the nucleus reticularis tegmenti pontis, both intimately involved in the visual guidance of eye movements.
Development and experimentation of an eye/brain/task testbed
NASA Technical Reports Server (NTRS)
Harrington, Nora; Villarreal, James
1987-01-01
The principal objective is to develop a laboratory testbed that will provide a unique capability to elicit, control, record, and analyze the relationship of operator task loading, operator eye movement, and operator brain wave data in a computer system environment. The ramifications of an integrated eye/brain monitor for the man-machine interface are staggering. The success of such a system would benefit users in space and defense applications, paraplegics, and the monitoring of boring screens (nuclear power plants, air defense, etc.).
Consequences of Traumatic Brain Injury for Human Vergence Dynamics
Tyler, Christopher W.; Likova, Lora T.; Mineff, Kristyo N.; Elsaid, Anas M.; Nicholas, Spero C.
2015-01-01
Purpose: Traumatic brain injury involving loss of consciousness has focal effects in the human brainstem, suggesting that it may have particular consequences for eye movement control. This hypothesis was investigated by measurements of vergence eye movement parameters. Methods: Disparity vergence eye movements were measured for a population of 123 normally sighted individuals, 26 of whom had suffered diffuse traumatic brain injury (dTBI) in the past, while the remainder served as controls. Vergence tracking responses were measured to sinusoidal disparity modulation of a random-dot field. Disparity vergence step responses were characterized in terms of their dynamic parameters separately for the convergence and divergence directions. Results: The control group showed notable differences between convergence and divergence dynamics. The dTBI group showed significantly abnormal vergence behavior on many of the dynamic parameters. Conclusion: The results support the hypothesis that occult injury to the oculomotor control system is a common residual outcome of dTBI. PMID:25691880
Effects of directional uncertainty on visually-guided joystick pointing.
Berryhill, Marian; Kveraga, Kestutis; Hughes, Howard C
2005-02-01
Reaction times generally follow the predictions of Hick's law as stimulus-response uncertainty increases, although notable exceptions include the oculomotor system. Saccadic and smooth pursuit eye movement reaction times are independent of stimulus-response uncertainty. Previous research showed that joystick pointing to targets, a motor analog of saccadic eye movements, is only modestly affected by increased stimulus-response uncertainty; however, a no-uncertainty condition (simple reaction time to 1 possible target) was not included. Here, we re-evaluate manual joystick pointing including a no-uncertainty condition. Analysis indicated simple joystick pointing reaction times were significantly faster than choice reaction times. Choice reaction times (2, 4, or 8 possible target locations) only slightly increased as the number of possible targets increased. These data suggest that, as with joystick tracking (a motor analog of smooth pursuit eye movements), joystick pointing is more closely approximated by a simple/choice step function than the log function predicted by Hick's law.
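For reference, Hick's law predicts that reaction time grows with the logarithm of the number of stimulus-response alternatives, RT = a + b·log2(n + 1), whereas the pattern described here is closer to a step between simple and choice conditions. The snippet below merely contrasts the two shapes; the coefficients and step values are illustrative, not fitted to the study's data.

```python
import math

def hick_rt(n_alternatives, a=0.20, b=0.15):
    """Hick's law: RT (s) grows with log2(n + 1); a and b are illustrative."""
    return a + b * math.log2(n_alternatives + 1)

def step_rt(n_alternatives, simple=0.25, choice=0.35):
    """Simple/choice step function: one RT for n = 1, another for n > 1."""
    return simple if n_alternatives == 1 else choice

for n in (1, 2, 4, 8):
    print(n, round(hick_rt(n), 3), step_rt(n))
```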
Eying the future: Eye movement in past and future thinking.
El Haj, Mohamad; Lenoble, Quentin
2017-06-07
We investigated eye movement during past and future thinking. Participants were invited to retrieve past events and to imagine future events while their scan path was recorded by an eye-tracker. Past thinking triggered more fixations (p < .05) and higher saccade counts (p < .05) than future thinking. Past and future thinking triggered a similar duration of fixations and saccades, as well as a similar amplitude of saccades. Interestingly, participants rated past thinking as more vivid than future thinking (p < .01). Therefore, the vividness of past thinking seems to be accompanied by an increased number of fixations and saccades. Fixations and saccades in past thinking can be interpreted as an attempt by the visual system to find (through saccades) and activate (through fixations) stored memory representations. The same interpretation can be applied to future thinking, as this ability requires activation of past experiences. However, future thinking triggers fewer fixations and saccades than past thinking: this may be due to its decreased demand on visual imagery, but could also be related to a potentially deleterious effect of eye movements on the spatial imagery required for future thinking.
Moving the eye of the beholder. Motor components in vision determine aesthetic preference.
Topolinski, Sascha
2010-09-01
Perception entails not only sensory input (e.g., merely seeing), but also subsidiary motor processes (e.g., moving the eyes); such processes have been neglected in research on aesthetic preferences. To fill this gap, the present research manipulated the fluency of perceptual motor processes independently from sensory input and predicted that this increased fluency would result in increased aesthetic preference for stimulus movements that elicited the same motor movements as had been previously trained. Specifically, addressing the muscles that move the eyes, I trained participants to follow a stimulus movement without actually seeing it. Experiment 1 demonstrated that ocular-muscle training resulted in the predicted increase in preference for trained stimulus movements compared with untrained stimulus movements, although participants had not previously seen any of the movements. Experiments 2 and 3 showed that actual motor matching and not perceptual similarity drove this effect. Thus, beauty may be not only in the eye of the beholder, but also in the eyes' movements.
Short-latency primate vestibuloocular responses during translation
NASA Technical Reports Server (NTRS)
Angelaki, D. E.; McHenry, M. Q.
1999-01-01
Short-lasting, transient head displacements and near target fixation were used to measure the latency and early response gain of vestibularly evoked eye movements during lateral and fore-aft translations in rhesus monkeys. The latency of the horizontal eye movements elicited during lateral motion was 11.9 +/- 5.4 ms. Viewing distance-dependent behavior was seen as early as the beginning of the response profile. For fore-aft motion, latencies were different for forward and backward displacements. Latency averaged 7.1 +/- 9.3 ms during forward motion (same for both eyes) and 12.5 +/- 6.3 ms for the adducting eye (e.g., left eye during right fixation) during backward motion. Latencies during backward motion were significantly longer for the abducting eye (18.9 +/- 9.8 ms). Initial acceleration gains of the two eyes were generally larger than unity but asymmetric. Specifically, gains were consistently larger for abducting than adducting eye movements. The large initial acceleration gains tended to compensate for the response latencies such that the early eye movement response approached, albeit consistently incompletely, that required for maintaining visual acuity during the movement. These short-latency vestibuloocular responses could complement the visually generated optic flow responses that have been shown to exhibit much longer latencies.
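The viewing-distance dependence of the translational vestibulo-ocular response follows from simple geometry: for a target straight ahead at distance d, a lateral head velocity v requires an angular eye velocity of roughly v/d radians per second. The snippet below works through that relationship for a near and a far target; it is a geometric sketch, not the authors' analysis, and the velocities and distances are made up for illustration.

```python
import math

def required_eye_velocity(head_velocity_m_s, viewing_distance_m):
    """Ideal compensatory angular eye velocity (deg/s) for lateral
    translation with a target straight ahead: omega ~ v / d (rad/s)."""
    return math.degrees(head_velocity_m_s / viewing_distance_m)

for d in (0.2, 2.0):   # near vs far fixation target
    print(f"d = {d} m -> {required_eye_velocity(0.3, d):.1f} deg/s")
```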
Analysis of EEG Related Saccadic Eye Movement
NASA Astrophysics Data System (ADS)
Funase, Arao; Kuno, Yoshiaki; Okuma, Shigeru; Yagi, Tohru
Our final goal is to establish a model of saccadic eye movement that connects the saccade and the electroencephalogram (EEG). As a first step toward this goal, we recorded and analyzed saccade-related EEG. In the study reported in this paper, we tried to detect EEG activity that is peculiar to eye movement. In these experiments, each subject was instructed to point their eyes toward visual targets (LEDs) or toward the direction of sound sources (buzzers). In the control cases, the EEG was recorded with no eye movements. As a result, in the visual experiments we found that the EEG potential changed sharply over the occipital lobe just before eye movement. In the auditory experiments, similar results were observed. In the visual and auditory experiments without eye movement, no such sharp change in the EEG was observed. Moreover, when the subject moved his/her eyes toward a right-side target, a sharp change in EEG potential was found over the right occipital lobe; conversely, when the subject moved his/her eyes toward a left-side target, a sharp change in EEG potential was found over the left occipital lobe.
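A common way to look for activity that precedes eye movement, as described here, is to epoch the EEG around each saccade onset and average across events. The following is a generic sketch of that procedure on synthetic data; the function name, array shapes, sampling rate, and event times are assumptions for illustration, not the authors' analysis pipeline.

```python
import numpy as np

def saccade_locked_average(eeg, onsets, fs=1000, pre=0.5, post=0.2):
    """Average single-channel EEG epochs time-locked to saccade onsets.

    eeg:    1-D array (volts), sampled at fs Hz
    onsets: saccade onset times in samples
    Returns the mean waveform from `pre` s before to `post` s after onset.
    """
    n_pre, n_post = int(pre * fs), int(post * fs)
    epochs = [eeg[t - n_pre:t + n_post] for t in onsets
              if t - n_pre >= 0 and t + n_post <= len(eeg)]
    return np.mean(epochs, axis=0)

# Synthetic demo: noise plus a small negative ramp just before each "saccade".
rng = np.random.default_rng(0)
fs, eeg = 1000, rng.normal(0, 1e-6, 60_000)
onsets = np.arange(2000, 58_000, 2000)
for t in onsets:
    eeg[t - 200:t] += np.linspace(0, -5e-6, 200)   # pre-saccadic potential
avg = saccade_locked_average(eeg, onsets, fs)
print(avg.shape, avg[:5])
```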
One-Step "Change" and "Compare" Word Problems: Focusing on Eye-Movements
ERIC Educational Resources Information Center
Moutsios-Rentzos, Andreas; Stamatis, Panagiotis J.
2015-01-01
Introduction. In this study, we focus on the relationship between the students' mathematical thinking and their non-mechanically identified eye-movements with the purpose to gain deeper understanding about the students' reasoning processes and to investigate the feasibility of incorporating eye-movement information in everyday pedagogy. Method.…
ERIC Educational Resources Information Center
Nitschke, Kai; Ruh, Nina; Kappler, Sonja; Stahl, Christoph; Kaller, Christoph P.
2012-01-01
Understanding the functional neuroanatomy of planning and problem solving may substantially benefit from better insight into the chronology of the cognitive processes involved. Based on the assumption that regularities in cognitive processing are reflected in overtly observable eye-movement patterns, here we recorded eye movements while…
Horizontal Saccadic Eye Movements Enhance the Retrieval of Landmark Shape and Location Information
ERIC Educational Resources Information Center
Brunye, Tad T.; Mahoney, Caroline R.; Augustyn, Jason S.; Taylor, Holly A.
2009-01-01
Recent work has demonstrated that horizontal saccadic eye movements enhance verbal episodic memory retrieval, particularly in strongly right-handed individuals. The present experiments test three primary assumptions derived from this research. First, horizontal eye movements should facilitate episodic memory for both verbal and non-verbal…
Eye Movement as an Indicator of Sensory Components in Thought.
ERIC Educational Resources Information Center
Buckner, Michael; And Others
1987-01-01
Investigated Neuro-Linguistic Programming eye movement model's claim that specific eye movements are indicative of specific sensory components in thought. Agreement between students' (N=48) self-reports and trained observers' records support visual and auditory portions of model; do not support kinesthetic portion. Interrater agreement supports…
DOT National Transportation Integrated Search
1966-09-01
Failure of adaptation of nystagmic eye movements to occur under certain conditions of stimulation by angular acceleration has been ascribed to a failure to allow the eye-movement response to run its course. In this study, 3 groups of subjects were te...
Skuballa, Irene T.; Fortunski, Caroline; Renkl, Alexander
2015-01-01
The main research goal of the present study was to investigate to what extent pre-training of eye movements can facilitate knowledge acquisition in multimedia learning (pre-training principle). We combined considerations from research on eye movement modeling and pre-training to design and test a non-verbal, eye movement-based pre-training. Participants in the experimental condition watched an animated circle moving in close spatial resemblance to a static visualization of a solar plant accompanied by a narration in a subsequently presented learning environment. This training was expected to foster top-down processes, as reflected in gaze behavior during the learning process, and to enhance knowledge acquisition. We compared two groups (N = 45): participants in the experimental condition received pre-training in a first step and processed the learning material in a second step, whereas the control group underwent the second step without any pre-training. The pre-training group outperformed the control group in their learning outcomes, particularly in knowledge about processes and functions of the solar plant. However, the superior learning outcomes in the pre-training group could not be explained by eye-movement patterns. Furthermore, the pre-training moderated the relationship between experienced stress and learning outcomes: in the control group, high stress levels hindered learning, which was not found for the pre-training group. On a delayed posttest, participants were requested to draw a picture of the learning content. Despite a non-significant effect of training on the quality of drawings, the pre-training group showed associations between learning outcomes at the first testing time and process-related aspects of the quality of their drawings. Overall, non-verbal pre-training is a successful instructional intervention for promoting learning processes in novices, although these processes were not directly reflected in learners' eye-movement behavior during learning. PMID:26029138
Eye movements reflect and shape strategies in fraction comparison
Ischebeck, Anja; Weilharter, Marina; Körner, Christof
2016-01-01
The comparison of fractions is a difficult task that can often be facilitated by separately comparing components (numerators and denominators) of the fractions—that is, by applying so-called component-based strategies. The usefulness of such strategies depends on the type of fraction pair to be compared. We investigated the temporal organization and the flexibility of strategy deployment in fraction comparison by evaluating sequences of eye movements in 20 young adults. We found that component-based strategies could account for the response times and the overall number of fixations observed for the different fraction pairs. The analysis of eye movement sequences showed that the initial eye movements in a trial were characterized by stereotypical scanning patterns indicative of an exploratory phase that served to establish the kind of fraction pair presented. Eye movements that followed this phase adapted to the particular type of fraction pair and indicated the deployment of specific comparison strategies. These results demonstrate that participants employ eye movements systematically to support strategy use in fraction comparison. Participants showed a remarkable flexibility to adapt to the most efficient strategy on a trial-by-trial basis. Our results confirm the value of eye movement measurements in the exploration of strategic adaptation in complex tasks. PMID:26039819
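To make the notion of a component-based strategy concrete, the sketch below compares two fractions by their numerators or denominators when one component is shared, and falls back to cross-multiplication otherwise. This is a generic illustration of the strategy class (for positive fractions), not the specific strategy taxonomy or stimuli used in the study.

```python
from fractions import Fraction

def compare_componentwise(a_num, a_den, b_num, b_den):
    """Return -1, 0, or 1 for a < b, a == b, a > b, using component-based
    shortcuts where possible (assumes positive numerators and denominators)."""
    if a_den == b_den:                   # same denominator: compare numerators
        diff = a_num - b_num
    elif a_num == b_num:                 # same numerator: larger denominator is smaller
        diff = b_den - a_den
    else:                                # fallback: cross-multiplication (holistic)
        diff = a_num * b_den - b_num * a_den
    return (diff > 0) - (diff < 0)

assert compare_componentwise(3, 7, 5, 7) == -1   # 3/7 < 5/7
assert compare_componentwise(3, 5, 3, 8) == 1    # 3/5 > 3/8
assert compare_componentwise(2, 3, 5, 8) == (Fraction(2, 3) > Fraction(5, 8))
```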
DOT National Transportation Integrated Search
1995-08-01
Performance of operators in aviation systems is highly dependent on their ability to visually scan information sources, identify problematic situations, and respond appropriately. Scanning behavior has often been mentioned as a contributing factor in...
Volitional and Real-Time Control Cursor Based on Eye Movement Decoding Using a Linear Decoding Model
Zhang, Cheng
2016-01-01
The aim of this study is to build a linear decoding model that reveals the relationship between movement information and EOG (electrooculogram) data, in order to control a cursor online and continuously with blinks and eye-pursuit movements. First, a blink detection method is proposed to extract voluntary single-blink or double-blink information from the EOG. Then, a linear decoding model of the time series is developed to predict the position of gaze, and the model parameters are calibrated by the RLS (Recursive Least Squares) algorithm; decoding accuracy is assessed through a cross-validation procedure. Additionally, subsection processing, increment control, and online calibration are presented to realize online control. Finally, the technique is applied to volitional, online control of a cursor to hit multiple predefined targets. Experimental results show that the blink detection algorithm performs well, with a voluntary blink detection rate over 95%. By combining the merits of blinks and smooth pursuit movements, the movement information of the eyes can be decoded with good fidelity, with an average Pearson correlation coefficient of up to 0.9592, and all signal-to-noise ratios greater than 0. The novel system allows people to successfully and economically control a cursor online with a hit rate of 98%. PMID:28058044
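The abstract names a linear decoding model calibrated with recursive least squares (RLS). Below is a generic RLS sketch for mapping an EOG feature vector to a two-dimensional gaze/cursor position; the class name, dimensions, forgetting factor, and synthetic data are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

class RLSDecoder:
    """Recursive least squares for y ~ W @ x (x: EOG features, y: x/y position)."""

    def __init__(self, n_features, n_outputs=2, lam=0.99, delta=100.0):
        self.lam = lam                               # forgetting factor
        self.W = np.zeros((n_outputs, n_features))   # decoding weights
        self.P = np.eye(n_features) * delta          # inverse correlation estimate

    def update(self, x, y):
        x = x.reshape(-1, 1)
        k = self.P @ x / (self.lam + x.T @ self.P @ x)   # gain vector
        err = y - (self.W @ x).ravel()                   # prediction error
        self.W += np.outer(err, k.ravel())               # weight update
        self.P = (self.P - k @ x.T @ self.P) / self.lam
        return err

    def predict(self, x):
        return self.W @ x

# Synthetic check: recover a known linear mapping from noisy samples.
rng = np.random.default_rng(1)
true_W = rng.normal(size=(2, 4))
dec = RLSDecoder(n_features=4)
for _ in range(500):
    x = rng.normal(size=4)
    y = true_W @ x + rng.normal(scale=0.01, size=2)
    dec.update(x, y)
print(np.round(dec.W - true_W, 3))   # should be close to zero
```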
Does the perception of moving eyes trigger reflexive visual orienting in autism?
Swettenham, John; Condie, Samantha; Campbell, Ruth; Milne, Elizabeth; Coleman, Mike
2003-01-01
Does movement of the eyes in one or another direction function as an automatic attentional cue to a location of interest? Two experiments explored the directional movement of the eyes in a full face for speed of detection of an aftercoming location target in young people with autism and in control participants. Our aim was to investigate whether a low-level perceptual impairment underlies the delay in gaze following characteristic of autism. The participants' task was to detect a target appearing on the left or right of the screen either 100 ms or 800 ms after a face cue appeared with eyes averting to the left or right. Despite instructions to ignore eye-movement in the face cue, people with autism and control adolescents were quicker to detect targets that had been preceded by an eye movement cue congruent with target location compared with targets preceded by an incongruent eye movement cue. The attention shifts are thought to be reflexive because the cue was to be ignored, and because the effect was found even when cue-target duration was short (100 ms). Because (experiment two) the effect persisted even when the face was inverted, it would seem that the direction of movement of eyes can provide a powerful (involuntary) cue to a location. PMID:12639330
Investigating the causes of wrap-up effects: evidence from eye movements and E-Z Reader.
Warren, Tessa; White, Sarah J; Reichle, Erik D
2009-04-01
Wrap-up effects in reading have traditionally been thought to reflect increased processing associated with intra- and inter-clause integration (Just, M. A. & Carpenter, P. A. (1980). A theory of reading: From eye fixations to comprehension. Psychological Review,87(4), 329-354; Rayner, K., Kambe, G., & Duffy, S. A. (2000). The effect of clause wrap-up on eye movements during reading. The Quarterly Journal of Experimental Psychology,53A(4), 1061-1080; cf. Hirotani, M., Frazier, L., & Rayner, K. (2006). Punctuation and intonation effects on clause and sentence wrap-up: Evidence from eye movements. Journal of Memory and Language,54, 425-443). We report an eye-tracking experiment with a strong manipulation of integrative complexity at a critical word that was either sentence-final, ended a comma-marked clause, or was not comma-marked. Although both complexity and punctuation had reliable effects, they did not interact in any eye-movement measure. These results as well as simulations using the E-Z Reader model of eye-movement control (Reichle, E. D., Warren, T., & McConnell, K. (2009). Using E-Z Reader to model the effects of higher-level language processing on eye movements during reading. Psychonomic Bulletin & Review,16(1), 1-20) suggest that traditional accounts of clause wrap-up are incomplete.
NASA Technical Reports Server (NTRS)
Dickman, J. D.; Angelaki, D. E.
1999-01-01
During linear accelerations, compensatory reflexes should continually occur in order to maintain objects of visual interest as stable images on the retina. In the present study, the three-dimensional organization of the vestibulo-ocular reflex in pigeons was quantitatively examined during linear accelerations produced by constant velocity off-vertical axis yaw rotations and translational motion in darkness. With off-vertical axis rotations, sinusoidally modulated eye-position and velocity responses were observed in all three components, with the vertical and torsional eye movements predominating the response. Peak torsional and vertical eye positions occurred when the head was oriented with the lateral visual axis of the right eye directed orthogonal to or aligned with the gravity vector, respectively. No steady-state horizontal nystagmus was obtained with any of the rotational velocities (8-58 degrees /s) tested. During translational motion, delivered along or perpendicular to the lateral visual axis, vertical and torsional eye movements were elicited. No significant horizontal eye movements were observed during lateral translation at frequencies up to 3 Hz. These responses suggest that, in pigeons, all linear accelerations generate eye movements that are compensatory to the direction of actual or perceived tilt of the head relative to gravity. In contrast, no translational horizontal eye movements, which are known to be compensatory to lateral translational motion in primates, were observed under the present experimental conditions.
New methods for the assessment of accommodative convergence.
Asakawa, Ken; Ishikawa, Hitoshi; Shoji, Nobuyuki
2009-01-01
The authors introduced a new objective method for measuring horizontal eye movements based on the first Purkinje image, using infrared charge-coupled device (CCD) cameras, and compared stimulus accommodative convergence to accommodation (AC/A) ratios as determined by a standard gradient method. The study included 20 patients, 5 to 9 years old, who had intermittent exotropia (10 eyes) or accommodative esotropia (10 eyes). Measurement of horizontal eye movements in millimeters (mm), based on the first Purkinje image, was obtained with a TriIRIS C9000 instrument (Hamamatsu Photonics K.K., Hamamatsu, Japan). The stimulus AC/A ratio was determined with the far gradient method. From the horizontal eye movement (mm) and the eye deviation (Δ) measured (a) before and (b) after an accommodative stimulus of 3.00 diopters (D), the horizontal eye movement response (mm/D) and the stimulus AC/A ratio (Δ/D) were calculated as (b − a)/3. The average values of the horizontal eye movements and the stimulus AC/A ratio were 0.5 mm/D and 3.8 Δ/D, respectively. Correlation analysis showed a strong positive correlation between these two parameters (r = 0.92). Moreover, horizontal eye movements are directly proportional to the AC/A ratio measured with the gradient method. The methods used in this study allow objective recordings of accommodative convergence to be obtained in many clinical situations.
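The gradient calculation reported here is a short piece of arithmetic: the value measured before the accommodative stimulus is subtracted from the value measured after it, and the difference is divided by the stimulus (3.00 D). A minimal sketch follows; the example numbers are invented and merely chosen to echo the reported averages.

```python
def gradient_ratio(before, after, stimulus_d=3.0):
    """Gradient method: change in response per diopter of accommodative
    stimulus, i.e. (after - before) / stimulus. Works for eye deviation in
    prism diopters (AC/A, delta/D) or Purkinje-image eye movement in mm (mm/D)."""
    return (after - before) / stimulus_d

# Illustrative numbers only (not from the study):
print(gradient_ratio(before=2.0, after=13.4))   # AC/A ~ 3.8 delta/D
print(gradient_ratio(before=0.1, after=1.6))    # ~0.5 mm/D of eye movement
```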
PERCEPTION AND TELEVISION--PHYSIOLOGICAL FACTORS OF TELEVISION VIEWING.
ERIC Educational Resources Information Center
GUBA, EGON; AND OTHERS
AN EXPERIMENTAL SYSTEM WAS DEVELOPED FOR RECORDING EYE-MOVEMENT DATA. RAW DATA WERE IN THE FORM OF MOTION PICTURES TAKEN OF THE MONITOR OF A CLOSED LOOP TELEVISION SYSTEM. A TELEVISION CAMERA WAS MOUNTED ON THE SUBJECTS' FIELD OF VIEW. THE EYE MARKER APPEARED AS A SMALL SPOT OF LIGHT AND INDICATED THE POINT IN THE VISUAL FIELD AT WHICH THE SUBJECT…
Luque, M A; Pérez-Pérez, M P; Herrero, L; Waitzman, D M; Torres, B
2006-02-01
Anatomical studies in goldfish show that the tectofugal axons provide a large number of boutons within the mesencephalic reticular formation. Electrical stimulation, reversible inactivation and cell recording in the primate central mesencephalic reticular formation have suggested that it participates in the control of rapid eye movements (saccades). Moreover, the role of this tecto-recipient area in the generation of saccadic eye movements in fish is unknown. In this study we show that the electrical microstimulation of the mesencephalic reticular formation of goldfish evoked short latency saccadic eye movements in any direction (contraversive or ipsiversive, upward or downward). Movements of the eyes were usually disjunctive. Based on the location of the sites from which eye movements were evoked and the preferred saccade direction, eye movements were divided into different groups: pure vertical saccades were mainly elicited from the rostral mesencephalic reticular formation, while oblique and pure horizontal were largely evoked from middle and caudal mesencephalic reticular formation zones. The direction and amplitude of pure vertical and horizontal saccades were unaffected by initial eye position. However the amplitude, but not the direction of most oblique saccades was systematically modified by initial eye position. At the same time, the amplitude of elicited saccades did not vary in any consistent manner along either the anteroposterior, dorsoventral or mediolateral axes (i.e. there was no topographic organization of the mesencephalic reticular formation with respect to amplitude). In addition to these groups of movements, we found convergent and goal-directed saccades evoked primarily from the anterior and posterior mesencephalic reticular formation, respectively. Finally, the metric and kinetic characteristics of saccades could be manipulated by changes in the stimulation parameters. We conclude that the mesencephalic reticular formation in goldfish shares physiological functions that correspond closely with those found in mammals.
Neurophysiology and Neuroanatomy of Smooth Pursuit in Humans
ERIC Educational Resources Information Center
Lencer, Rebekka; Trillenberg, Peter
2008-01-01
Smooth pursuit eye movements enable us to focus our eyes on moving objects by utilizing well-established mechanisms of visual motion processing, sensorimotor transformation and cognition. Novel smooth pursuit tasks and quantitative measurement techniques can help unravel the different smooth pursuit components and complex neural systems involved…
ERIC Educational Resources Information Center
Vilppu, Henna; Mikkilä-Erdmann, Mirjamaija; Södervik, Ilona; Österholm-Matikainen, Erika
2017-01-01
This study used the eye-tracking method to explore how the level of expertise influences reading, and solving, two written patient cases on cardiac failure and pulmonary embolus. Eye-tracking is a fairly commonly used method in medical education research, but it has been primarily applied to studies analyzing the processing of visualizations, such…
Eye-hand coupling during closed-loop drawing: evidence of shared motor planning?
Reina, G Anthony; Schwartz, Andrew B
2003-04-01
Previous paradigms have used reaching movements to study coupling of eye-hand kinematics. In the present study, we investigated eye-hand kinematics as curved trajectories were drawn at normal speeds. Eye and hand movements were tracked as a monkey traced ellipses and circles with the hand in free space while viewing the hand's position on a computer monitor. The results demonstrate that the movement of the hand was smooth and obeyed the 2/3 power law. Eye position, however, was restricted to 2-3 clusters along the hand's trajectory and fixed approximately 80% of the time in one of these clusters. The eye remained stationary as the hand moved away from the fixation for up to 200 ms and saccaded ahead of the hand position to the next fixation along the trajectory. The movement from one fixation cluster to another consistently occurred just after the tangential hand velocity had reached a local minimum, but before the next segment of the hand's trajectory began. The next fixation point was close to an area of high curvature along the hand's trajectory even though the hand had not reached that point along the path. A visuo-motor illusion of hand movement demonstrated that the eye movement was influenced by hand movement and not simply by visual input. During the task, neural activity of pre-motor cortex (area F4) was recorded using extracellular electrodes and used to construct a population vector of the hand's trajectory. The results suggest that the saccade onset is correlated in time with maximum curvature in the population vector trajectory for the hand movement. We hypothesize that eye and arm movements may have common, or shared, information in forming their motor plans.
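The two-thirds power law mentioned above relates tangential hand velocity to path curvature, v(t) = k·κ(t)^(-1/3) (equivalently, angular velocity A = k·C^(2/3)). The sketch below estimates the exponent from a sampled elliptical trajectory by regressing log v on log κ; it is a generic check of the law on synthetic data, not the authors' analysis.

```python
import numpy as np

# A point tracing an ellipse at constant angular rate satisfies the
# two-thirds power law exactly (v = k * curvature**(-1/3)), which makes it
# a convenient synthetic test of the fitting procedure.
t = np.linspace(0, 2 * np.pi, 2000, endpoint=False)
a, b = 3.0, 1.5                       # ellipse semi-axes (arbitrary units)
x, y = a * np.cos(t), b * np.sin(t)

dt = t[1] - t[0]
dx, dy = np.gradient(x, dt), np.gradient(y, dt)
ddx, ddy = np.gradient(dx, dt), np.gradient(dy, dt)

v = np.hypot(dx, dy)                              # tangential velocity
kappa = np.abs(dx * ddy - dy * ddx) / v**3        # curvature

# Fit log v = log k + beta * log kappa; beta should be close to -1/3.
beta, log_k = np.polyfit(np.log(kappa), np.log(v), 1)
print(f"fitted exponent beta = {beta:.3f} (two-thirds law predicts -0.333)")
```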
Bonhage, Corinna E; Mueller, Jutta L; Friederici, Angela D; Fiebach, Christian J
2015-07-01
It is widely agreed upon that linguistic predictions are an integral part of language comprehension. Yet, experimental proof of their existence remains challenging. Here, we introduce a new predictive eye gaze reading task combining eye tracking and functional magnetic resonance imaging (fMRI) that allows us to infer the existence and timing of linguistic predictions via anticipatory eye-movements. Participants read different types of word sequences (i.e., regular sentences, meaningless jabberwocky sentences, non-word lists) up to the pre-final word. The final target word was displayed with a temporal delay, and its screen position was dependent on the syntactic word category (nouns vs verbs). During the delay, anticipatory eye-movements into the correct target word area were indicative of linguistic predictions. For fMRI analysis, the predictive sentence conditions were contrasted to the non-word condition, with the anticipatory eye-movements specifying differences in timing across conditions. A conjunction analysis of both sentence conditions revealed the neural substrate of word category prediction, namely a distributed network of cortical and subcortical brain regions including language systems, basal ganglia, thalamus, and hippocampus. Direct contrasts between the regular sentence condition and the jabberwocky condition indicate that prediction of word category in meaningless jabberwocky sentences relies on classical left-hemispheric language systems involving Brodmann's area 44/45 in the left inferior frontal gyrus, left superior temporal areas, and the dorsal caudate nucleus. Regular sentences, in contrast, allowed for the prediction of specific words. Word-specific predictions were specifically associated with more widely distributed temporal and parietal cortical systems, most prominently in the right hemisphere. Our results support the presence of linguistic predictions during sentence processing and demonstrate the validity of the predictive eye gaze paradigm for measuring syntactic and semantic aspects of linguistic predictions, as well as for investigating their neural substrates.
Eye movement control in reading unspaced text: the case of the Japanese script.
Kajii, N; Nazir, T A; Osaka, N
2001-09-01
The present study examines the landing-site distributions of the eyes during natural reading of Japanese script: a script that mixes three different writing systems (Kanji, Hiragana, and Katakana) and that lacks regular spacing between words. The results show a clear preference of the eyes to land at the beginning rather than the center of the word. In addition, it was found that the eyes land on Kanji characters more frequently than on Hiragana or Katakana characters. Further analysis of two- and three-character words indicated that the eyes' landing-site distribution differs depending on the type of characters in the word: the eyes prefer to land at the word beginning only when the initial character of the word is a Kanji character. For pure Hiragana words, the proportion of initial fixations did not differ between character positions. Thus, as already indicated by Kambe (National Institute of Japanese Language Report 85 (1986) 29), the visual distinctiveness of the three Japanese scripts plays a role in guiding eye movements in reading Japanese.
Suzuki, D A; Yamada, T; Hoedema, R; Yee, R D
1999-09-01
Anatomic and neuronal recordings suggest that the nucleus reticularis tegmenti pontis (NRTP) of macaques may be a major pontine component of a cortico-ponto-cerebellar pathway that subserves the control of smooth-pursuit eye movements. The existence of such a pathway was implicated by the lack of permanent pursuit impairment after bilateral lesions in the dorsolateral pontine nucleus. To provide more direct evidence that NRTP is involved with regulating smooth-pursuit eye movements, chemical lesions were made in macaque NRTP by injecting either lidocaine or ibotenic acid. Injection sites first were identified by the recording of smooth-pursuit-related modulations in neuronal activity. The resulting lesions caused significant deficits in both the maintenance and the initiation of smooth-pursuit eye movements. After lesion formation, the gain of constant-velocity, maintained smooth-pursuit eye movements decreased, on the average, by 44%. Recovery of the ability to maintain smooth-pursuit eye movements occurred over approximately 3 days when maintained pursuit gains attained normal values. The step-ramp, "Rashbass" task was used to investigate the effects of the lesions on the initiation of smooth-pursuit eye movements. Eye accelerations averaged over the initial 80 ms of pursuit initiation were determined and found to be decremented, on the average, by 48% after the administration of ibotenic acid. Impairments in the initiation and maintenance of smooth-pursuit eye movements were directional in nature. Upward pursuit seemed to be the most vulnerable and was impaired in all cases independent of lesioning agent and type of pursuit investigated. Downward smooth pursuit seemed more resistant to the effects of chemical lesions in NRTP. Impairments in horizontal tracking were observed with examples of deficits in ipsilaterally and contralaterally directed pursuit. The results provide behavioral support for the physiologically and anatomic-based conclusion that NRTP is a component of a cortico-ponto-cerebellar circuit that presumably involves the pursuit area of the frontal eye field (FEF) and projects to ocular motor-related areas of the cerebellum. This FEF-NRTP-cerebellum path would parallel a middle and medial superior temporal cerebral cortical area-dorsolateral pontine nucleus-cerebellum pathway also known to be involved with regulating smooth-pursuit eye movements.
McCluskey, Meaghan K; Cullen, Kathleen E
2007-04-01
Coordinated movements of the eye, head, and body are used to redirect the axis of gaze between objects of interest. However, previous studies of eye-head gaze shifts in head-unrestrained primates generally assumed the contribution of body movement to be negligible. Here we characterized eye-head-body coordination during horizontal gaze shifts made by trained rhesus monkeys to visual targets while they sat upright in a standard primate chair or assumed a more natural sitting posture in a custom-designed chair. In both postures, gaze shifts were characterized by the sequential onset of eye, head, and body movements, which could be described by predictable relationships. Body motion made a small but significant contribution to gaze shifts that were ≥40 degrees in amplitude. Furthermore, as gaze shift amplitude increased (40-120 degrees), body contribution and velocity increased systematically. In contrast, peak eye and head velocities plateaued at velocities of approximately 250-300 degrees/s, and the rotation of the eye-in-orbit and head-on-body remained well within the physical limits of ocular and neck motility during large gaze shifts, saturating at approximately 35 and 60 degrees, respectively. Gaze shifts initiated with the eye more contralateral in the orbit were accompanied by smaller body as well as head movement amplitudes and velocities, and body movements were greater when monkeys were seated in the more natural body posture. Taken together, our findings show that body movement makes a predictable contribution to gaze shifts that is systematically influenced by factors such as orbital position and posture. We conclude that body movements are part of a coordinated series of motor events that are used to voluntarily reorient gaze and that these movements can be significant even in a typical laboratory setting. Our results emphasize the need for caution in the interpretation of data from neurophysiological studies of the control of saccadic eye movements and/or eye-head gaze shifts because single neurons can code motor commands to move the body as well as the head and eyes.
Expansion of visual space during optokinetic afternystagmus (OKAN).
Kaminiarz, André; Krekelberg, Bart; Bremmer, Frank
2008-05-01
The mechanisms underlying visual perceptual stability are usually investigated using voluntary eye movements. In such studies, errors in perceptual stability during saccades and pursuit are commonly interpreted as mismatches between actual eye position and eye-position signals in the brain. The generality of this interpretation could in principle be tested by investigating spatial localization during reflexive eye movements whose kinematics are very similar to those of voluntary eye movements. Accordingly, in this study, we determined mislocalization of flashed visual targets during optokinetic afternystagmus (OKAN). These eye movements are quite unique in that they occur in complete darkness and are generated by subcortical control mechanisms. We found that during horizontal OKAN slow phases, subjects mislocalize targets away from the fovea in the horizontal direction. This corresponds to a perceived expansion of visual space and is unlike mislocalization found for any other voluntary or reflexive eye movement. Around the OKAN fast phases, we found a bias in the direction of the fast phase prior to its onset and opposite to the fast-phase direction thereafter. Such a biphasic modulation has also been reported in the temporal vicinity of saccades and during optokinetic nystagmus (OKN). A direct comparison, however, showed that the modulation during OKAN was much larger and occurred earlier relative to fast-phase onset than during OKN. A simple mismatch between the current eye position and the eye-position signal in the brain is unlikely to explain such disparate results across similar eye movements. Instead, these data support the view that mislocalization arises from errors in eye-centered position information.
Maier, Felix M; Schaeffel, Frank
2013-07-24
The aim was to find out whether adaptation to a vertical prism involves more than vertical fusional eye movements. Adaptation to a vertical base-up 3 prism diopter prism was measured with a custom-programmed Maddox test in nine visually normal emmetropic subjects (mean age 27.0 ± 2.8 years). Vertical eye movements were measured binocularly in six of the subjects with a custom-programmed binocular video eye tracker. In the Maddox test, some subjects adjusted the perceived height as expected from the power of the prism, while others appeared to ignore the prism. After 15 minutes of adaptation, the interocular difference in perceived height was reduced by 51% on average (from 0.86° to 0.44°). The larger the initially perceived difference in height in a subject, the larger the amplitude of adaptation. Eye tracking showed that the prism generated divergent vertical eye movements of 1.2° on average, which was less than expected from its power. Differences in eye elevation were maintained as long as the prism was in place. Small angles of lateral head tilt generated large interocular differences in eye elevation, much larger than the effects introduced by the prism. In conclusion, vertical differences in retinal image height were compensated by vertical fusional eye movements, but some subjects responded poorly to a vertical prism in both experiments; fusional eye movements were generally too small to realign both foveae with the fixation target; and the prism adaptation in the Maddox test was fully explained by the changes in vertical eye position, suggesting that no further adaptational mechanism is involved.
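For context on the magnitudes reported: a prism diopter displaces the image by 1 cm at 1 m, so the angular deviation of a 3 prism diopter prism is arctan(3/100) ≈ 1.7°, which is why the observed vertical divergence of about 1.2° is described as smaller than expected from the prism's power. A short arithmetic check (standard optics, not the paper's code):

```python
import math

def prism_diopters_to_degrees(prism_d):
    """A prism diopter displaces a ray by 1 cm at 1 m, so the angular
    deviation in degrees is arctan(prism_d / 100)."""
    return math.degrees(math.atan(prism_d / 100.0))

expected = prism_diopters_to_degrees(3.0)
observed = 1.2
print(f"expected ~{expected:.2f} deg, observed ~{observed:.1f} deg "
      f"({100 * observed / expected:.0f}% of the prism's power)")
```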
Barmack, N H; Errico, P; Ferraresi, A; Pettorossi, V E
1989-01-01
1. Eye movements in unanaesthetized rabbits were studied during horizontal neck-proprioceptive stimulation (movement of the body with respect to the fixed head), when this stimulation was given alone and when it was given simultaneously with vestibular stimulation (rotation of the head-body). The effect of neck-proprioceptive stimulation on modifying the anticompensatory fast-phase eye movements (AFPs) evoked by vestibular stimulation was studied with a 'conditioning-test' protocol; the 'conditioning' stimulus was a neck-proprioceptive signal evoked by a step-like change in body position with respect to the head and the 'test' stimulus was a vestibular signal evoked by a step rotation of the head-body. 2. The influence of eye position and direction of slow eye movements on the occurrence of compensatory fast-phase eye movements (CFPs) evoked by neck-proprioceptive stimulation was also examined. 3. The anticompensatory fast phase (AFP) evoked by vestibular stimulation was attenuated by a preceding neck-proprioceptive stimulus which when delivered alone evoked compensatory slow-phase eye movements (CSP) in the same direction as the CSP evoked by vestibular stimulation. Conversely, the vestibularly evoked AFP was potentiated by a neck-proprioceptive stimulus which evoked CSPs opposite to that of vestibularly evoked CSPs. 4. Eccentric initial eye positions increased the probability of occurrence of midline-directed compensatory fast-phase eye movements (CFPs) evoked by appropriate neck-proprioceptive stimulation. 5. The gain of the horizontal cervico-ocular reflex (GHCOR) was measured from the combined changes in eye position resulting from AFPs and CSPs. GHCOR was potentiated during simultaneous vestibular stimulation. This enhancement of GHCOR occurred at neck-proprioceptive stimulus frequencies which, in the absence of conjoint vestibular stimulation, do not evoke CSPs. PMID:2795479
Raghunath, Vignesh; Braxton, Melissa O.; Gagnon, Stephanie A.; Brunyé, Tad T.; Allison, Kimberly H.; Reisch, Lisa M.; Weaver, Donald L.; Elmore, Joann G.; Shapiro, Linda G.
2012-01-01
Context: Digital pathology has the potential to dramatically alter the way pathologists work, yet little is known about pathologists’ viewing behavior while interpreting digital whole slide images. While tracking pathologist eye movements when viewing digital slides may be the most direct method of capturing pathologists’ viewing strategies, this technique is cumbersome and technically challenging to use in remote settings. Tracking pathologist mouse cursor movements may serve as a practical method of studying digital slide interpretation, and mouse cursor data may illuminate pathologists’ viewing strategies and time expenditures in their interpretive workflow. Aims: To evaluate the utility of mouse cursor movement data, in addition to eye-tracking data, in studying pathologists’ attention and viewing behavior. Settings and Design: Pathologists (N = 7) viewed 10 digital whole slide images of breast tissue that were selected using a random stratified sampling technique to include a range of breast pathology diagnoses (benign/atypia, carcinoma in situ, and invasive breast cancer). A panel of three expert breast pathologists established a consensus diagnosis for each case using a modified Delphi approach. Materials and Methods: Participants’ foveal vision was tracked using SensoMotoric Instruments RED 60 Hz eye-tracking system. Mouse cursor movement was tracked using a custom MATLAB script. Statistical Analysis Used: Data on eye-gaze and mouse cursor position were gathered at fixed intervals and analyzed using distance comparisons and regression analyses by slide diagnosis and pathologist expertise. Pathologists’ accuracy (defined as percent agreement with the expert consensus diagnoses) and efficiency (accuracy and speed) were also analyzed. Results: Mean viewing time per slide was 75.2 seconds (SD = 38.42). Accuracy (percent agreement with expert consensus) by diagnosis type was: 83% (benign/atypia); 48% (carcinoma in situ); and 93% (invasive). Spatial coupling was close between eye-gaze and mouse cursor positions (highest frequency ∆x was 4.00px (SD = 16.10), and ∆y was 37.50px (SD = 28.08)). Mouse cursor position moderately predicted eye gaze patterns (Rx = 0.33 and Ry = 0.21). Conclusions: Data detailing mouse cursor movements may be a useful addition to future studies of pathologists’ accuracy and efficiency when using digital pathology. PMID:23372984
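The coupling analysis described above (offsets between time-aligned gaze and cursor samples, plus a regression of gaze on cursor position) can be sketched generically as follows. The function name, array shapes, and synthetic data are placeholders; the offsets used in the demo merely echo the reported values for illustration and are not the study's data or code.

```python
import numpy as np

def gaze_cursor_coupling(gaze_xy, cursor_xy):
    """Per-axis offsets and Pearson correlations between time-aligned
    gaze and mouse-cursor samples (both arrays of shape (n, 2), pixels)."""
    delta = gaze_xy - cursor_xy
    r_x = np.corrcoef(cursor_xy[:, 0], gaze_xy[:, 0])[0, 1]
    r_y = np.corrcoef(cursor_xy[:, 1], gaze_xy[:, 1])[0, 1]
    return delta.mean(axis=0), delta.std(axis=0), (r_x, r_y)

# Synthetic demo: gaze roughly follows the cursor with a noisy offset.
rng = np.random.default_rng(2)
cursor = np.cumsum(rng.normal(0, 5, size=(1000, 2)), axis=0) + 500
gaze = cursor + rng.normal([4, 37], [16, 28], size=(1000, 2))
print(gaze_cursor_coupling(gaze, cursor))
```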
The Cerebellar Dysplasia of Chiari II Malformation as Revealed by Eye Movements
Salman, Michael S.; Dennis, Maureen; Sharpe, James A.
2011-01-01
Introduction Chiari type II malformation (CII) is a developmental deformity of the hindbrain. We have previously reported that many patients with CII have impaired smooth pursuit, while few make inaccurate saccades or have an abnormal vestibulo-ocular reflex. In contrast, saccadic adaptation and visual fixation are normal. In this report, we correlate results from several eye movement studies with neuroimaging in CII. We present a model for structural changes within the cerebellum in CII. Methods Saccades, smooth pursuit, the vestibulo-ocular reflex, and visual fixation were recorded in 21 patients with CII, aged 8–19 years and 39 age-matched controls, using an infrared eye tracker. Qualitative and quantitative MRI data were correlated with eye movements in 19 CII patients and 28 controls. Results Nine patients with CII had abnormal eye movements. Smooth pursuit gain was subnormal in eight, saccadic accuracy abnormal in four, and vestibulo-ocular reflex gain abnormal in three. None had fixation instability. Patients with CII had a significantly smaller cerebellar volume than controls, and those with normal eye motion had an expanded midsagittal vermis compared to controls. However, patients with abnormal eye movements had a smaller (non-expanded) midsagittal vermis area, posterior fossa area and medial cerebellar volumes than CII patients with normal eye movements. Conclusions The deformity of CII affects the structure and function of the cerebellum selectively and differently in those with abnormal eye movements. We propose that the vermis can expand when compressed within a small posterior fossa in some CII patients, thus sparing its ocular motor functions. PMID:19960749
Asymmetries in the Control of Saccadic Eye Movements to Bifurcating Targets.
ERIC Educational Resources Information Center
Zeevi, Yehoshua Y.; And Others
The examination of saccadic eye movements--rapid shifts in gaze from one visual area of interest to another--is useful in studying pilots' visual learning in flight simulator training. Saccadic eye movements are the basic oculomotor response associated with the acquisition of visual information and provide an objective measure of higher perceptual…
The Role of Eye Movement Driven Attention in Functional Strabismic Amblyopia
2015-01-01
Strabismic amblyopia ("blunt vision") is a developmental anomaly that affects binocular vision and results in lowered visual acuity. Strabismus is a term for a misalignment of the visual axes and is usually characterized by impaired ability of the strabismic eye to take up fixation. Such impaired fixation is usually a function of the temporally and spatially impaired binocular eye movements that normally underlie binocular shifts in visual attention. In this review, we discuss how abnormal eye movement function in children with misaligned eyes influences the development of normal binocular visual attention and results in deficits in visual function such as depth perception. We also discuss how eye movement function deficits in adult amblyopia patients can also lead to other abnormalities in visual perception. Finally, we examine how the nonamblyopic eye of an amblyope is also affected in strabismic amblyopia. PMID:25838941
Effects of reward on the accuracy and dynamics of smooth pursuit eye movements.
Brielmann, Aenne A; Spering, Miriam
2015-08-01
Reward modulates behavioral choices and biases goal-oriented behavior, such as eye or hand movements, toward locations or stimuli associated with higher rewards. We investigated reward effects on the accuracy and timing of smooth pursuit eye movements in 4 experiments. Eye movements were recorded in participants tracking a moving visual target on a computer monitor. Before target motion onset, a monetary reward cue indicated whether participants could earn money by tracking accurately, or whether the trial was unrewarded (Experiments 1 and 2, n = 11 each). Reward significantly improved eye-movement accuracy across different levels of task difficulty. Improvements were seen even in the earliest phase of the eye movement, within 70 ms of tracking onset, indicating that reward impacts visual-motor processing at an early level. We obtained similar findings when reward was not precued but explicitly associated with the pursuit target (Experiment 3, n = 16); critically, these results were not driven by stimulus prevalence or other factors such as preparation or motivation. Numerical cues (Experiment 4, n = 9) were not effective.
Diurnal variation of eye movement and heart rate variability in the human fetus at term.
Morokuma, S; Horimoto, N; Satoh, S; Nakano, H
2001-07-01
To elucidate diurnal variations in eye movement and fetal heart rate (FHR) variability in the term fetus, we observed these two parameters continuously for 24 h, using real-time ultrasound and Doppler cardiotocograph, respectively. Studied were five uncomplicated fetuses at term. The time series data of the presence and absence of eye movement and mean FHR value for each 1 min were analyzed using the maximum entropy method (MEM) and subsequent nonlinear least squares fitting. According to the power value of eye movement, all five cases were classified into two groups: three cases in the large power group and two cases in the small power group. The acrophases of eye movement and FHR variability in the large power group were close, thereby implying the existence of a diurnal rhythm in both these parameters and also that they are synchronized. In the small power group, the acrophases were separated. The synchronization of eye movement and FHR variability in the large power group suggests that these phenomena are governed by a common central mechanism related to diurnal rhythm generation.
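The combination of spectral analysis and nonlinear least-squares fitting described above can be illustrated, in simplified form, by fitting a single 24-hour cosine to a per-minute time series and reading off the acrophase. This is a generic cosinor-style sketch under assumed parameter names and synthetic data, not the authors' exact pipeline.

import numpy as np
from scipy.optimize import curve_fit

def cosine_model(t_hours, mesor, amplitude, acrophase_h, period_h=24.0):
    # Single-component rhythm: mean level plus a cosine peaking at the acrophase.
    return mesor + amplitude * np.cos(2 * np.pi * (t_hours - acrophase_h) / period_h)

def fit_diurnal_rhythm(t_hours, y):
    p0 = [np.mean(y), (np.max(y) - np.min(y)) / 2, 12.0]   # rough starting values
    params, _ = curve_fit(cosine_model, t_hours, y, p0=p0)
    mesor, amplitude, acrophase_h = params
    return mesor, abs(amplitude), acrophase_h % 24

# Synthetic example: fraction of each minute with eye movement present, peaking near 15:00.
t = np.arange(0, 24, 1 / 60.0)                              # 24 h sampled once per minute
y = 0.4 + 0.2 * np.cos(2 * np.pi * (t - 15) / 24) + np.random.normal(0, 0.05, t.size)
print(fit_diurnal_rhythm(t, y))                             # acrophase should come out near 15 h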
Kongsted, A; Jørgensen, L V; Bendix, T; Korsholm, L; Leboeuf-Yde, C
2007-11-01
The aim was to evaluate whether smooth pursuit eye movements differed between patients with long-lasting whiplash-associated disorders and controls when a purely computerized method was used for the eye movement analysis. This was a cross-sectional study comparing patients with whiplash-associated disorders with controls who had not been exposed to head or neck trauma and had no notable neck complaints. Smooth pursuit eye movements were registered while the subjects were seated, both with and without the cervical spine rotated. Participants were 34 patients with whiplash-associated disorders with symptoms persisting more than six months after a car collision and 60 controls. Smooth pursuit eye movements were almost identical in patients with chronic whiplash-associated disorders and controls, both when the neck was rotated and in the neutral position. Disturbed smooth pursuit eye movements do not appear to be a distinct feature in patients with chronic whiplash-associated disorders. This is in contrast to the results of previous studies and may be due to the fact that the analyses were performed in a computerized and objective manner. Other possible reasons for the discrepancy with previous studies are discussed.
Miller, Brett; O'Donnell, Carol
2013-01-01
The cumulative body of eye movement research provides significant insight into how readers process text. The heart of this work spans roughly 40 years, reflecting the maturity of both the topics under study and the experimental approaches used to investigate reading. Recent technological advancements offer the field increased flexibility and the potential to study reading and literacy more concertedly from an individual-differences perspective. Historically, eye movement research focused far less on developmental issues related to individual differences in reading; the field is now undergoing a meaningful transition toward including individual differences. The six papers in this special issue reflect this recent, increased attention to individual differences in reading while acknowledging early contributions in this direction (e.g., Rayner, 1986). We introduce these six papers, provide historical context for the use of eye movement methodology to examine reading and for the field's early moves toward examining individual differences, and conclude with recommendations for future research.
Eye Tracking Based Control System for Natural Human-Computer Interaction
Zhang, Xuebai; Liu, Xiaolong; Yuan, Shyan-Ming; Lin, Shu-Fan
2017-01-01
Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. In order to improve the reliability, mobility, and usability of eye tracking techniques in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode using only the user's eyes. The usage flow of the proposed system is designed to follow natural human habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of different difficulty (article searching and multimedia web browsing) were performed to compare the proposed eye control tool with an existing system. Technology Acceptance Model (TAM) measures were used to evaluate the perceived effectiveness of our system. The results demonstrate that the proposed system is very effective with regard to usability and interface design. PMID:29403528
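For illustration only, one common way such a system can map gaze onto a mouse click is dwell selection: the gaze must stay within a small radius of a point for a minimum time. The sketch below assumes a generic dwell-time rule with hypothetical thresholds and is not taken from the paper.

import numpy as np

def detect_dwell_clicks(t, gaze_xy, radius_px=40.0, dwell_s=0.8):
    """t: timestamps in seconds; gaze_xy: (n, 2) gaze samples in pixels.
    Returns a list of (time, x, y) dwell-click events."""
    clicks, anchor_i = [], 0
    gaze_xy = np.asarray(gaze_xy, float)
    for i in range(1, len(t)):
        if np.hypot(*(gaze_xy[i] - gaze_xy[anchor_i])) > radius_px:
            anchor_i = i                                   # gaze left the dwell region
        elif t[i] - t[anchor_i] >= dwell_s:
            cx, cy = gaze_xy[anchor_i:i + 1].mean(axis=0)  # click at the dwell centroid
            clicks.append((t[i], cx, cy))
            anchor_i = i + 1 if i + 1 < len(t) else i      # re-arm after issuing a click
    return clicks

# Example: fixating two screen locations for 1.5 s each at 60 Hz yields two clicks.
t = [i / 60 for i in range(180)]
gaze = [(300, 300)] * 90 + [(800, 420)] * 90
print(detect_dwell_clicks(t, gaze))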
The Trajectories of Saccadic Eye Movements.
ERIC Educational Resources Information Center
Bahill, A. Terry; Stark, Lawrence
1979-01-01
Investigates the trajectories of saccadic eye movements, the control signals of the eye, and nature of the mechanisms that generate them, using the techniques of bioengineering in collecting the data. (GA)
Eye-Head Coordination in 31 Space Shuttle Astronauts during Visual Target Acquisition.
Reschke, Millard F; Kolev, Ognyan I; Clément, Gilles
2017-10-27
Between 1989 and 1995, NASA evaluated how increases in flight duration of up to 17 days affected the health and performance of Space Shuttle astronauts. Thirty-one Space Shuttle pilots participating in 17 space missions were tested at 3 different times before flight and 3 different times after flight, starting within a few hours of return to Earth. The astronauts moved their head and eyes as quickly as possible from the central fixation point to a specified target located 20°, 30°, or 60° off center. Eye movements were measured with electro-oculography (EOG). Head movements were measured with a triaxial rate sensor system mounted on a headband. The mean time to visually acquire the targets immediately after landing was 7-10% (30-34 ms) slower than mean preflight values, but results returned to baseline after 48 hours. This increase in gaze latency was due to a decrease in velocity and amplitude of both the eye saccade and head movement toward the target. Results were similar after all space missions, regardless of length.
Gaze control for an active camera system by modeling human pursuit eye movements
NASA Astrophysics Data System (ADS)
Toelg, Sebastian
1992-11-01
The ability to stabilize the image of one moving object in the presence of others by active movements of the visual sensor is an essential task for biological systems, as well as for autonomous mobile robots. An algorithm is presented that evaluates the necessary movements from acquired visual data and controls an active camera system (ACS) in a feedback loop. No a priori assumptions about the visual scene and objects are needed. The algorithm is based on functional models of human pursuit eye movements and is to a large extent influenced by structural principles of neural information processing. An intrinsic object definition based on the homogeneity of the optical flow field of relevant objects, i.e., moving mainly fronto-parallel, is used. Velocity and spatial information are processed in separate pathways, resulting in either smooth or saccadic sensor movements. The program generates a dynamic shape model of the moving object and focuses its attention on regions where the object is expected. The system proved to behave in a stable manner under real-time conditions in complex natural environments and manages general object motion. In addition, it exhibits several interesting abilities well known from psychophysics, such as catch-up saccades, grouping due to coherent motion, and optokinetic nystagmus.
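A minimal sketch of the two-pathway idea described above, under assumed gains and thresholds rather than the original algorithm: gaze velocity follows target velocity (smooth pursuit), and a discrete saccade-like correction is triggered whenever the position error grows too large.

def gaze_controller_step(gaze_pos, gaze_vel, target_pos, target_vel,
                         dt=0.02, pursuit_gain=0.9, saccade_threshold=5.0):
    """All quantities in degrees (or deg/s); returns updated (gaze_pos, gaze_vel)."""
    error = target_pos - gaze_pos
    if abs(error) > saccade_threshold:
        # Saccadic (position) pathway: correct the position error in one step.
        gaze_pos = target_pos
        gaze_vel = target_vel
    else:
        # Pursuit (velocity) pathway: velocity command from retinal slip plus
        # a small position correction, then integrate over one time step.
        gaze_vel = pursuit_gain * target_vel + 0.5 * error
        gaze_pos = gaze_pos + gaze_vel * dt
    return gaze_pos, gaze_vel

# Example: a target moving at 10 deg/s starting 8 deg away triggers one saccade,
# after which smooth pursuit keeps the residual error small.
gp, gv, tp = 0.0, 0.0, 8.0
for step in range(100):
    gp, gv = gaze_controller_step(gp, gv, tp, 10.0)
    tp += 10.0 * 0.02
print(round(tp - gp, 2))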
Binocular coordination in response to stereoscopic stimuli
NASA Astrophysics Data System (ADS)
Liversedge, Simon P.; Holliman, Nicolas S.; Blythe, Hazel I.
2009-02-01
Humans actively explore their visual environment by moving their eyes. Precise coordination of the eyes during visual scanning underlies the experience of a unified perceptual representation and is important for the perception of depth. We report data from three psychological experiments investigating human binocular coordination during visual processing of stereoscopic stimuli. In the first experiment participants were required to read sentences that contained a stereoscopically presented target word. Half of the word was presented exclusively to one eye and half exclusively to the other eye. Eye movements were recorded and showed that saccadic targeting was uninfluenced by the stereoscopic presentation, strongly suggesting that complementary retinal stimuli are perceived as a single, unified input prior to saccade initiation. In a second eye movement experiment we presented words stereoscopically to measure Panum's Fusional Area for linguistic stimuli. In the final experiment we compared binocular coordination during saccades between simple dot stimuli under 2D, stereoscopic 3D and real 3D viewing conditions. Results showed that depth-appropriate vergence movements were made during saccades and fixations to real 3D stimuli, but only during fixations on stereoscopic 3D stimuli. 2D stimuli did not induce depth vergence movements. Together, these experiments indicate that stereoscopic visual stimuli are fused when they fall within Panum's Fusional Area, and that saccade metrics are computed on the basis of a unified percept. Also, there is sensitivity to non-foveal retinal disparity in real 3D stimuli, but not in stereoscopic 3D stimuli, and the system responsible for binocular coordination responds to this during saccades as well as fixations.
Active head rotations and eye-head coordination
NASA Technical Reports Server (NTRS)
Zangemeister, W. H.; Stark, L.
1981-01-01
It is pointed out that head movements play an important role in gaze. The interaction between eye and head movements involves both their shared role in directing gaze and the compensatory vestibular ocular reflex. The dynamics of head trajectories are discussed, taking into account the use of parameterization to obtain the peak velocity, peak accelerations, the times of these extrema, and the duration of the movement. Attention is given to the main sequence, neck muscle EMG and details of the head-movement trajectory, types of head model accelerations, the latency of eye and head movement in coordinated gaze, gaze latency as a function of various factors, and coordinated gaze types. Clinical examples of gaze-plane analysis are considered along with the instantaneous change of compensatory eye movement (CEM) gain, and aspects of variability.
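The parameterization mentioned above can be sketched as follows, assuming a sampled head-position trace and a simple velocity threshold for movement onset and offset; the threshold and the synthetic example are illustrative assumptions, not the original analysis.

import numpy as np

def head_movement_parameters(t, position, vel_threshold=10.0):
    """t in seconds, position in degrees; returns main-sequence-style parameters."""
    velocity = np.gradient(position, t)          # deg/s
    acceleration = np.gradient(velocity, t)      # deg/s^2
    moving = np.abs(velocity) > vel_threshold
    if not moving.any():
        return None
    onset, offset = np.flatnonzero(moving)[[0, -1]]
    i_pv = onset + np.argmax(np.abs(velocity[onset:offset + 1]))
    i_pa = onset + np.argmax(np.abs(acceleration[onset:offset + 1]))
    return {
        "amplitude_deg": position[offset] - position[onset],
        "duration_s": t[offset] - t[onset],
        "peak_velocity_deg_s": velocity[i_pv],
        "t_peak_velocity_s": t[i_pv],
        "peak_acceleration_deg_s2": acceleration[i_pa],
        "t_peak_acceleration_s": t[i_pa],
    }

# Example: a smooth 30 deg head movement lasting about 0.4 s.
t = np.linspace(0, 1, 1000)
pos = 30 * 0.5 * (1 - np.cos(np.clip((t - 0.2) / 0.4, 0, 1) * np.pi))
print(head_movement_parameters(t, pos))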
Sleep duration varies as a function of glutamate and GABA in rat pontine reticular formation.
Watson, Christopher J; Lydic, Ralph; Baghdoyan, Helen A
2011-08-01
The oral part of the pontine reticular formation (PnO) is a component of the ascending reticular activating system and plays a role in the regulation of sleep and wakefulness. The PnO receives glutamatergic and GABAergic projections from many brain regions that regulate behavioral state. Indirect, pharmacological evidence has suggested that glutamatergic and GABAergic signaling within the PnO alters traits that characterize wakefulness and sleep. No previous studies have simultaneously measured endogenous glutamate and GABA from rat PnO in relation to sleep and wakefulness. The present study utilized in vivo microdialysis coupled on-line to capillary electrophoresis with laser-induced fluorescence to test the hypothesis that concentrations of glutamate and GABA in the PnO vary across the sleep/wake cycle. Concentrations of glutamate and GABA were significantly higher during wakefulness than during non-rapid eye movement sleep and rapid eye movement sleep. Regression analysis revealed that decreases in glutamate and GABA accounted for a significant portion of the variance in the duration of non-rapid eye movement sleep and rapid eye movement sleep episodes. These data provide novel support for the hypothesis that endogenous glutamate and GABA in the PnO contribute to the regulation of sleep duration. © 2011 The Authors. Journal of Neurochemistry © 2011 International Society for Neurochemistry.
The horizontal and vertical cervico-ocular reflexes of the rabbit.
Barmack, N H; Nastos, M A; Pettorossi, V E
1981-11-16
Horizontal and vertical cervico-ocular reflexes of the rabbit (HCOR, VCOR) were evoked by sinusoidal oscillation of the body about the vertical and longitudinal axes while the head was fixed. These reflexes were studied over a frequency range of 0.005-0.800 Hz and at stimulus amplitudes of +/- 10 degrees. When the body of the rabbit was rotated horizontally clockwise around the fixed head, clockwise conjugate eye movements were evoked. When the body was rotated about the longitudinal axis onto the right side, the right eye rotated down and the left eye rotated up. The mean gain of the HCOR (eye velocity/body velocity) rose from 0.21 at 0.005 Hz to 0.27 at 0.020 Hz and then declined to 0.06 at 0.3 Hz. The gain of the VCOR was less than the gain of the HCOR by a factor of 2-3. The HCOR was measured separately and in combination with the horizontal vestibulo-ocular reflex (HVOR). These reflexes combine linearly. The relative movements of the first 3 cervical vertebrae during stimulation of the HCOR and VCOR were measured. For the HCOR, the largest angular displacement (74%) occurs between C1 and C2. For the VCOR, the largest relative angular displacement (45%) occurs between C2 and C3. Step horizontal clockwise rotation of the head and body (HVOR) evoked low velocity counterclockwise eye movements followed by fast clockwise (resetting) eye movements. Step horizontal clockwise rotation of the body about the fixed head (HCOR) evoked low velocity clockwise eye movements which were followed by fast clockwise eye movements. Step horizontal clockwise rotation of the head about the fixed body (HCOR + HVOR) evoked low velocity counterclockwise eye movements which were not interrupted by fast clockwise eye movements. These data provide further evidence for a linear combination of independent HCOR and HVOR signals.
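The gain computation described above (eye velocity divided by body velocity at the stimulus frequency) can be sketched as a least-squares sine fit to each velocity trace. The traces and noise level below are synthetic, and the function names are assumptions of this sketch, not the original analysis.

import numpy as np

def sine_amplitude_phase(t, signal, freq_hz):
    """Least-squares amplitude and phase of `signal` at freq_hz."""
    w = 2 * np.pi * freq_hz
    design = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
    (a, b, _), *_ = np.linalg.lstsq(design, signal, rcond=None)
    return np.hypot(a, b), np.arctan2(b, a)

def reflex_gain(t, eye_velocity, body_velocity, freq_hz):
    eye_amp, eye_ph = sine_amplitude_phase(t, eye_velocity, freq_hz)
    body_amp, body_ph = sine_amplitude_phase(t, body_velocity, freq_hz)
    return eye_amp / body_amp, np.degrees(eye_ph - body_ph)

# Example at 0.02 Hz with a gain of about 0.27 plus measurement noise.
f = 0.02
t = np.arange(0, 300, 0.1)
body = 2 * np.pi * f * 10 * np.cos(2 * np.pi * f * t)      # deg/s for a +/-10 deg stimulus
eye = 0.27 * body + np.random.normal(0, 0.05, t.size)
print(reflex_gain(t, eye, body, f))                        # gain near 0.27, phase near 0 deg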
Assessment of Attentional Workload while Driving by Eye-fixation-related Potentials
NASA Astrophysics Data System (ADS)
Takeda, Yuji; Yoshitsugu, Noritoshi; Itoh, Kazuya; Kanamori, Nobuhiro
How do drivers cope with the attentional workload of in-vehicle information technology? In the present study, we propose a new psychophysiological measure for assessing drivers' attention: the eye-fixation-related potential (EFRP). The EFRP is a type of event-related brain potential that can be measured during natural eye movements and reflects how closely observers examine the visual information at the fixated position. In the experiment, the effects of verbal working memory load and spatial working memory load during simulated driving were examined by measuring the number of saccadic eye movements and the EFRP as indices of drivers' attention. The results showed that the spatial working memory load affected both the number of saccadic eye movements and the amplitude of the P100 component of the EFRP, whereas the verbal working memory load affected only the number of saccadic eye movements. This implies that drivers can time-share processing between driving and a verbal working memory task, but that a decline in the accuracy of visual processing during driving is unavoidable when a spatial working memory load is imposed. The present study suggests that the EFRP can provide a new index of drivers' attention beyond saccadic eye movements.
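A generic way to obtain a fixation-related potential, sketched below under an assumed sampling rate and epoch window (not the authors' exact pipeline), is to epoch the EEG around each fixation onset reported by the eye tracker, baseline-correct, and average.

import numpy as np

def fixation_related_potential(eeg, srate_hz, fixation_onsets_s,
                               tmin_s=-0.1, tmax_s=0.4):
    """eeg: 1-D array for one channel; returns (times_s, averaged_epoch)."""
    pre = int(round(-tmin_s * srate_hz))
    post = int(round(tmax_s * srate_hz))
    epochs = []
    for onset in fixation_onsets_s:
        i = int(round(onset * srate_hz))
        if i - pre < 0 or i + post >= eeg.size:
            continue                                # skip epochs outside the record
        epoch = eeg[i - pre:i + post].copy()
        epoch -= epoch[:pre].mean()                 # baseline-correct on the pre-fixation interval
        epochs.append(epoch)
    times = np.arange(-pre, post) / srate_hz
    return times, np.mean(epochs, axis=0)

# Example with synthetic data: a small positivity ~100 ms after each fixation onset.
srate = 500
eeg = np.random.normal(0, 5, srate * 60)
onsets = np.arange(1.0, 59.0, 0.5)
for o in onsets:
    i = int(o * srate)
    eeg[i + 40:i + 60] += 3.0                       # P100-like bump at roughly 80-120 ms
times, efrp = fixation_related_potential(eeg, srate, onsets)
print(times[np.argmax(efrp)])                       # latency of the peak, in seconds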
Magnifying visual target information and the role of eye movements in motor sequence learning.
Massing, Matthias; Blandin, Yannick; Panzer, Stefan
2016-01-01
An experiment investigated the influence of eye movements on learning a simple motor sequence task when the visual display was magnified. The task was to reproduce a 1300 ms spatial-temporal pattern of elbow flexions and extensions. The spatial-temporal pattern was displayed in front of the participants. Participants were randomly assigned to four groups differing on eye movements (free to use their eyes/instructed to fixate) and the visual display (small/magnified). All participants had to perform a pre-test, an acquisition phase, a delayed retention test, and a transfer test. The results indicated that participants in each practice condition increased their performance during acquisition. The participants who were permitted to use their eyes in the magnified visual display outperformed those who were instructed to fixate on the magnified visual display. When a small visual display was used, the instruction to fixate induced no performance decrements compared to participants who were permitted to use their eyes during acquisition. The findings demonstrated that a spatial-temporal pattern can be learned without eye movements, but being permitted to use eye movements facilitates the response production when the visual angle is increased. Copyright © 2015 Elsevier B.V. All rights reserved.
Eye and Head Response to Peripheral Targets
1989-08-01
nystagmus movements of the eyes. These movements tend to be oscillatory or unstable in nature and can be elicited in three ways: stimuli 2 in the...Hall and Cusack, 1972). Nystagmus can best be described through example. As mentioned previously, the compensatory eye movements serve to stabilize...movements are what are referred to as nystagmus. The direction of the nystagmus is identified by the movement of the fast phase, that is, the direction
Inactivation of Semicircular Canals Causes Adaptive Increases in Otolith-driven Tilt Responses
NASA Technical Reports Server (NTRS)
Angelaki, Dora E.; Newlands, Shawn D.; Dickman, J. David
2002-01-01
Growing experimental and theoretical evidence suggests a functional synergy in the processing of otolith and semicircular canal signals for the generation of the vestibulo-ocular reflexes (VORs). In this study we have further tested this functional interaction by quantifying the adaptive changes in the otolith-ocular system during both rotational and translational movements after surgical inactivation of the semicircular canals. For 0.1-0.5 Hz (stimuli for which there is no recovery of responses from the plugged canals), pitch and roll VOR gains recovered during earth-horizontal (but not earth-vertical) axis rotations. Corresponding changes were also observed in eye movements elicited by translational motion (0.1-5 Hz). Specifically, torsional eye movements increased during lateral motion, whereas vertical eye movements increased during fore-aft motion. The findings indicate that otolith signals can be adapted according to a compromised strategy that leads to improved gaze stabilization during motion. Because canal-plugged animals permanently lose the ability to discriminate gravitoinertial accelerations, adapted animals can use the presence of gravity through otolith-driven tilt responses to assist gaze stabilization during earth-horizontal axis rotations.
Matsumoto, Akihiro; Tachibana, Masao
2017-01-01
Even when the body is stationary, the whole retinal image is kept in constant motion by fixational eye movements and by the saccades that move the eye between fixation points. Accumulating evidence indicates that the brain is equipped with specific mechanisms for compensating for the global motion induced by these eye movements. However, it is not yet fully understood how the retina processes global motion images during eye movements. Here we show that global motion images evoke novel coordinated firing in retinal ganglion cells (GCs). We simultaneously recorded the firing of GCs in the isolated goldfish retina using a multi-electrode array, and classified each GC based on the temporal profile of its receptive field (RF). A moving target that accompanied the global motion (simulating a saccade following a period of fixational eye movements) modulated the RF properties and evoked synchronized and correlated firing among local clusters of the specific GCs. Our findings provide a novel concept for retinal information processing during eye movements.
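The synchronized and correlated firing reported above is commonly quantified with a cross-correlogram between spike trains. The sketch below is a generic version with assumed bin and lag settings and synthetic spike times, not the authors' analysis.

import numpy as np

def cross_correlogram(spikes_a, spikes_b, bin_s=0.005, max_lag_s=0.1, t_end=None):
    """spikes_a, spikes_b: arrays of spike times in seconds. Returns (lags_s, counts)."""
    spikes_a, spikes_b = np.asarray(spikes_a), np.asarray(spikes_b)
    if t_end is None:
        t_end = max(spikes_a.max(), spikes_b.max())
    edges = np.arange(0, t_end + bin_s, bin_s)
    a, _ = np.histogram(spikes_a, edges)
    b, _ = np.histogram(spikes_b, edges)
    n_lags = int(round(max_lag_s / bin_s))
    lags = np.arange(-n_lags, n_lags + 1)
    # Count coincidences of train A with train B shifted by each lag (in bins).
    counts = np.array([np.dot(a[max(0, -k):len(a) - max(0, k)],
                              b[max(0, k):len(b) - max(0, -k)]) for k in lags])
    return lags * bin_s, counts

# Example: two ganglion cells whose spikes tend to co-occur within a few milliseconds.
rng = np.random.default_rng(3)
base = np.sort(rng.uniform(0, 60, 600))
cell_a = base + rng.normal(0, 0.002, base.size)
cell_b = base + rng.normal(0, 0.002, base.size)
lags, counts = cross_correlogram(cell_a, cell_b, t_end=60)
print(lags[np.argmax(counts)])   # a peak near zero lag indicates synchrony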
Very Slow Search and Reach: Failure to Maximize Expected Gain in an Eye-Hand Coordination Task
Zhang, Hang; Morvan, Camille; Etezad-Heydari, Louis-Alexandre; Maloney, Laurence T.
2012-01-01
We examined an eye-hand coordination task where optimal visual search and hand movement strategies were inter-related. Observers were asked to find and touch a target among five distractors on a touch screen. Their reward for touching the target was reduced by an amount proportional to how long they took to locate and reach to it. Coordinating the eye and the hand appropriately would markedly reduce the search-reach time. Using statistical decision theory we derived the sequence of interrelated eye and hand movements that would maximize expected gain and we predicted how hand movements should change as the eye gathered further information about target location. We recorded human observers' eye movements and hand movements and compared them with the optimal strategy that would have maximized expected gain. We found that most observers failed to adopt the optimal search-reach strategy. We analyze and describe the strategies they did adopt. PMID:23071430
Eye Movements Reveal the Influence of Event Structure on Reading Behavior.
Swets, Benjamin; Kurby, Christopher A
2016-03-01
When we read narrative texts such as novels and newspaper articles, we segment information presented in such texts into discrete events, with distinct boundaries between those events. But do our eyes reflect this event structure while reading? This study examines whether eye movements during the reading of discourse reveal how readers respond online to event structure. Participants read narrative passages as we monitored their eye movements. Several measures revealed that event structure predicted eye movements. In two experiments, we found that both early and overall reading times were longer for event boundaries. We also found that regressive saccades were more likely to land on event boundaries, but that readers were less likely to regress out of an event boundary. Experiment 2 also demonstrated that tracking event structure carries a working memory load. Eye movements provide a rich set of online data to test the cognitive reality of event segmentation during reading. Copyright © 2015 Cognitive Science Society, Inc.
Smeets, Monique A M; Dijs, M Willem; Pervan, Iva; Engelhard, Iris M; van den Hout, Marcel A
2012-01-01
The time-course of changes in the vividness and emotionality of unpleasant autobiographical memories associated with making eye movements (as in eye movement desensitisation and reprocessing, EMDR) was investigated. Participants retrieved unpleasant autobiographical memories and rated their vividness and emotionality prior to and following 96 seconds of making eye movements (EM) or keeping the eyes stationary (ES), as well as at 2, 4, 6, and 10 seconds into the intervention and then at regular larger intervals throughout the 96-second intervention. Relative to the ES group, emotionality showed a significant drop only after 74 seconds, whereas vividness dropped significantly as early as 2 seconds into the intervention. These results support the view that emotionality becomes reduced only after vividness has dropped. The results are discussed in light of working memory theory and visual imagery theory, according to which the regular refreshment of the visual memory needed to maintain it in working memory is interfered with by eye movements that also tax working memory, affecting vividness first.
Plöchl, Michael; Ossandón, José P.; König, Peter
2012-01-01
Eye movements introduce large artifacts to electroencephalographic recordings (EEG) and thus render data analysis difficult or even impossible. Trials contaminated by eye movement and blink artifacts have to be discarded, hence in standard EEG paradigms subjects are required to fixate on the screen. To overcome this restriction, several correction methods including regression and blind source separation have been proposed. Yet, no automated standard procedure has been established. By simultaneously recording eye movements and 64-channel EEG during a guided eye movement paradigm, we investigate and review the properties of eye movement artifacts, including corneo-retinal dipole changes, saccadic spike potentials and eyelid artifacts, and study their interrelations during different types of eye and eyelid movements. In concordance with earlier studies, our results confirm that these artifacts arise from different independent sources and that, depending on electrode site, gaze direction, and choice of reference, these sources contribute differently to the measured signal. We assess the respective implications for artifact correction methods and therefore compare the performance of two prominent approaches, namely linear regression and independent component analysis (ICA). We show and discuss that, due to the independence of eye artifact sources, regression-based correction methods inevitably over- or under-correct individual artifact components, while ICA is in principle suited to address such mixtures of different types of artifacts. Finally, we propose an algorithm that uses eye-tracker information to objectively identify eye-artifact-related ICA components (ICs) in an automated manner. In the data presented here, the algorithm performed very similarly to human experts when those were given both the topographies of the ICs and their respective activations in a large number of trials. Moreover, it performed more reliably, and almost twice as effectively, as human experts when those had to base their decision on IC topographies only. Furthermore, a receiver operating characteristic (ROC) analysis demonstrated an optimal balance of false positives and false negatives, with an area under the curve (AUC) of more than 0.99. Removing the automatically detected ICs from the data resulted in removal or substantial suppression of ocular artifacts including microsaccadic spike potentials, while the relevant neural signal remained unaffected. In conclusion, the present work aims at a better understanding of individual eye movement artifacts, their interrelations and the respective implications for eye artifact correction. Additionally, the proposed ICA procedure provides a tool for optimized detection and correction of eye movement-related artifact components. PMID:23087632
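The core idea of the proposed procedure, identifying ocular ICs with the help of the eye tracker, can be sketched generically as follows. The correlation criterion, the threshold, and the use of scikit-learn's FastICA are assumptions made for illustration, not the authors' implementation.

import numpy as np
from sklearn.decomposition import FastICA

def find_eye_components(eeg, gaze_signal, n_components=20, corr_threshold=0.4):
    """eeg: (n_samples, n_channels); gaze_signal: (n_samples,) eye-tracker channel
    (e.g., horizontal gaze position). Returns (ica_model, sources, suspect_indices)."""
    ica = FastICA(n_components=n_components, random_state=0, max_iter=1000)
    sources = ica.fit_transform(eeg)                 # (n_samples, n_components)
    corrs = np.array([abs(np.corrcoef(sources[:, k], gaze_signal)[0, 1])
                      for k in range(sources.shape[1])])
    bad = np.flatnonzero(corrs > corr_threshold)     # ICs tracking the gaze signal
    return ica, sources, bad

def remove_components(ica, sources, bad):
    """Zero out the flagged components and reconstruct the cleaned EEG."""
    cleaned_sources = sources.copy()
    cleaned_sources[:, bad] = 0.0
    return ica.inverse_transform(cleaned_sources)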
Carrigan, M H; Levis, D J
1999-01-01
The present study was designed to isolate the effects of the eye-movement component of the Eye Movement Desensitization and Reprocessing (EMDR) procedure in the treatment of fear of public speaking. Seventy-one undergraduate psychology students who responded in a fearful manner on the Fear Survey Schedule II and on a standardized, self-report measure of public speaking anxiety (Personal Report of Confidence as a Speaker; PRCS) were randomly assigned to one of four groups in a 2x2 factorial design. The two independent variables assessed were treatment condition (imagery plus eye movements vs. imagery alone) and type of imagery (fear-relevant vs. relaxing). Dependent variables assessed were self-reported and physiological anxiety during exposure and behavioral indices of anxiety while giving a speech. Although process measures indicated exposure to fear-relevant imagery increased anxiety during the procedure, no significant differences among groups were found on any of the outcome measures, except that participants who received eye movements were less likely to give a speech posttreatment than participants who did not receive eye movements. Addition of the eye movements to the experimental procedure did not result in enhancement of fear reduction. It was concluded, consistent with the results of past research, that previously reported positive effects of the EMDR procedure may be largely due to exposure to conditioned stimuli.
The Dorsal Visual System Predicts Future and Remembers Past Eye Position
Morris, Adam P.; Bremmer, Frank; Krekelberg, Bart
2016-01-01
Eye movements are essential to primate vision but introduce potentially disruptive displacements of the retinal image. To maintain stable vision, the brain is thought to rely on neurons that carry both visual signals and information about the current direction of gaze in their firing rates. We have shown previously that these neurons provide an accurate representation of eye position during fixation, but whether they are updated fast enough during saccadic eye movements to support real-time vision remains controversial. Here we show that not only do these neurons carry a fast and accurate eye-position signal, but also that they support in parallel a range of time-lagged variants, including predictive and postdictive signals. We recorded extracellular activity in four areas of the macaque dorsal visual cortex during a saccade task, including the lateral and ventral intraparietal areas (LIP, VIP), and the middle temporal (MT) and medial superior temporal (MST) areas. As reported previously, neurons showed tonic eye-position-related activity during fixation. In addition, they showed a variety of transient changes in activity around the time of saccades, including relative suppression, enhancement, and pre-saccadic bursts for one saccade direction over another. We show that a hypothetical neuron that pools this rich population activity through a weighted sum can produce an output that mimics the true spatiotemporal dynamics of the eye. Further, with different pooling weights, this downstream eye position signal (EPS) could be updated long before (<100 ms) or after (<200 ms) an eye movement. The results suggest a flexible coding scheme in which downstream computations have access to past, current, and future eye positions simultaneously, providing a basis for visual stability and delay-free visually-guided behavior. PMID:26941617
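The weighted-sum readout described above can be illustrated with a time-lagged linear decoder fitted by least squares: positive lags read out future eye position (predictive), negative lags read out past eye position (postdictive). The synthetic firing rates and the 80-sample lag below are placeholders, not the recorded data.

import numpy as np

def fit_lagged_decoder(rates, eye_pos, lag_samples):
    """rates: (n_samples, n_neurons); eye_pos: (n_samples,).
    Returns weights (plus offset) so that rates @ weights approximates eye position
    shifted by lag_samples."""
    n = len(eye_pos)
    if lag_samples >= 0:
        X, y = rates[:n - lag_samples], eye_pos[lag_samples:]
    else:
        X, y = rates[-lag_samples:], eye_pos[:lag_samples]
    X = np.column_stack([X, np.ones(len(X))])        # add an offset term
    weights, *_ = np.linalg.lstsq(X, y, rcond=None)
    return weights

# Example: 50 neurons whose rates depend noisily on eye position 80 samples ahead.
rng = np.random.default_rng(1)
eye = np.cumsum(rng.normal(0, 0.5, 5000))            # a random-walk eye trace
gains = rng.normal(0, 1, 50)
rates = np.outer(np.roll(eye, -80), gains) + rng.normal(0, 1, (5000, 50))
w = fit_lagged_decoder(rates, eye, lag_samples=80)
pred = np.column_stack([rates[:5000 - 80], np.ones(5000 - 80)]) @ w
print(round(float(np.corrcoef(pred, eye[80:])[0, 1]), 3))   # close to 1 at the predictive lag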
Lustig, Avichai; Ketter-Katz, Hadas; Katzir, Gadi
2013-11-01
Chameleons (Chamaeleonidae, Reptilia), in common with most ectotherms, show full optic nerve decussation and sparse inter-hemispheric commissures. Chameleons are unique in their capacity for highly independent, large-amplitude eye movements. We address the question: Do common chameleons, Chamaeleo chameleon, show patterns of lateralization of motion and of eye use during detours that differ from those shown by other ectotherms? To reach a target (prey) in passing an obstacle in a Y-maze, chameleons were required to make a left or a right detour. We analyzed the direction of detours and eye use and found that: (i) individuals differed in their preferred detour direction, (ii) eye use was lateralized at the group level, with significantly longer durations of viewing the target with the right eye, compared with the left eye, (iii) during left side, but not during right side, detours the durations of viewing the target with the right eye were significantly longer than the durations with the left eye. Thus, despite the uniqueness of chameleons' visual system, they display patterns of lateralization of motion and of eye use typical of other ectotherms. These findings are discussed in relation to hemispheric functions. Copyright © 2013 Elsevier B.V. All rights reserved.
Geometry and Gesture-Based Features from Saccadic Eye-Movement as a Biometric in Radiology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hammond, Tracy; Tourassi, Georgia; Yoon, Hong-Jun
In this study, we present a novel application of sketch gesture recognition on eye movement for biometric identification and estimating task expertise. The study was performed for the task of mammographic screening with simultaneous viewing of four coordinated breast views as typically done in clinical practice. Eye-tracking data and diagnostic decisions collected for 100 mammographic cases (25 normal, 25 benign, 50 malignant) and 10 readers (three board-certified radiologists and seven radiology residents) formed the corpus for this study. Sketch gesture recognition techniques were employed to extract geometric and gesture-based features from saccadic eye movements. Our results show that saccadic eye movements, characterized using sketch-based features, result in more accurate models for predicting individual identity and level of expertise than more traditional eye-tracking features.
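The kind of geometric descriptors that might be extracted from a single saccade can be sketched as follows; the particular feature set and names are assumptions of this sketch, not the study's exact descriptors.

import numpy as np

def saccade_features(t, xy):
    """t: (n,) timestamps in s; xy: (n, 2) gaze samples in degrees or pixels."""
    xy = np.asarray(xy, float)
    displacement = xy[-1] - xy[0]
    step_lengths = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    path_length = step_lengths.sum()
    velocity = step_lengths / np.diff(t)
    return {
        "amplitude": np.linalg.norm(displacement),
        "direction_deg": np.degrees(np.arctan2(displacement[1], displacement[0])),
        "duration_s": t[-1] - t[0],
        "peak_velocity": velocity.max(),
        "mean_velocity": velocity.mean(),
        # Curviness: ratio of the path actually travelled to the straight-line distance.
        "curvature_ratio": path_length / max(np.linalg.norm(displacement), 1e-9),
    }

# Example: a slightly curved 10-unit rightward saccade lasting 50 ms.
t = np.linspace(0, 0.05, 25)
xy = np.column_stack([np.linspace(0, 10, 25), 0.5 * np.sin(np.linspace(0, np.pi, 25))])
print(saccade_features(t, xy))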
Tracking the Eye Movement of Four Years Old Children Learning Chinese Words
ERIC Educational Resources Information Center
Lin, Dan; Chen, Guangyao; Liu, Yingyi; Liu, Jiaxin; Pan, Jue; Mo, Lei
2018-01-01
Storybook reading is the major source of literacy exposure for beginning readers. The present study tracked 4-year-old Chinese children's eye movements while they were reading simulated storybook pages. Their eye-movement patterns were examined in relation to their word learning gains. The same reading list, consisting of 20 two-character Chinese…
Influence of Eye Movements, Auditory Perception, and Phonemic Awareness in the Reading Process
ERIC Educational Resources Information Center
Megino-Elvira, Laura; Martín-Lobo, Pilar; Vergara-Moragues, Esperanza
2016-01-01
The authors' aim was to analyze the relationship of eye movements, auditory perception, and phonemic awareness with the reading process. The instruments used were the King-Devick Test (saccade eye movements), the PAF test (auditory perception), the PFC (phonemic awareness), the PROLEC-R (lexical process), the Canals reading speed test, and the…
ERIC Educational Resources Information Center
Liwanag, Maria Perpetua Socorro U.; Pelatti, Christina Yeager; Martens, Ray; Martens, Prisca
2016-01-01
This study incorporated eye movement miscue analysis to investigate two second-graders' oral reading and comprehension of a counterpoint picture book. Findings suggest the second-graders' strategies when reading the written and pictorial text affected their comprehension as opposed to the number and location of their eye movements. Specifically,…
ERIC Educational Resources Information Center
Johansson, Roger; Holsanova, Jana; Dewhurst, Richard; Holmqvist, Kenneth
2012-01-01
Current debate in mental imagery research revolves around the perceptual and cognitive role of eye movements to "nothing" (Ferreira, Apel, & Henderson, 2008; Richardson, Altmann, Spivey, & Hoover, 2009). While it is established that eye movements are comparable when inspecting a scene (or hearing a scene description) as when…
Validity of Eye Movement Methods and Indices for Capturing Semantic (Associative) Priming Effects
ERIC Educational Resources Information Center
Odekar, Anshula; Hallowell, Brooke; Kruse, Hans; Moates, Danny; Lee, Chao-Yang
2009-01-01
Purpose: The purpose of this investigation was to evaluate the usefulness of eye movement methods and indices as a tool for studying priming effects by verifying whether eye movement indices capture semantic (associative) priming effects in a visual cross-format (written word to semantically related picture) priming paradigm. Method: In the…
Learning to See: Guiding Students' Attention via a Model's Eye Movements Fosters Learning
ERIC Educational Resources Information Center
Jarodzka, Halszka; van Gog, Tamara; Dorr, Michael; Scheiter, Katharina; Gerjets, Peter
2013-01-01
This study investigated how to teach perceptual tasks, that is, classifying fish locomotion, through eye movement modeling examples (EMME). EMME consisted of a replay of eye movements of a didactically behaving domain expert (model), which had been recorded while he executed the task, superimposed onto the video stimulus. Seventy-five students…
Eye Movements during Multiple Object Tracking: Where Do Participants Look?
ERIC Educational Resources Information Center
Fehd, Hilda M.; Seiffert, Adriane E.
2008-01-01
Similar to the eye movements you might make when viewing a sports game, this experiment investigated where participants tend to look while keeping track of multiple objects. While eye movements were recorded, participants tracked either 1 or 3 of 8 red dots that moved randomly within a square box on a black background. Results indicated that…
ERIC Educational Resources Information Center
Connor, Carol McDonald; Radach, Ralph; Vorstius, Christian; Day, Stephanie L.; McLean, Leigh; Morrison, Frederick J.
2015-01-01
In this study, we investigated fifth graders' (n = 52) fall literacy, academic language, and motivation and how these skills predicted fall and spring comprehension monitoring on an eye movement task. Comprehension monitoring was defined as the identification and repair of misunderstandings when reading text. In the eye movement task, children…
Exploring Cultural Variation in Eye Movements on a Web Page between Americans and Koreans
ERIC Educational Resources Information Center
Yang, Changwoo
2009-01-01
This study explored differences in eye movement on a Web page between members of two different cultures to provide insight and guidelines for implementation of global Web site development. More specifically, the research examines whether differences of eye movement exist between the two cultures (American vs. Korean) when viewing a Web page, and…
Secondary-Task Effects on Learning with Multimedia: An Investigation through Eye-Movement Analysis
ERIC Educational Resources Information Center
Acarturk, Cengiz; Ozcelik, Erol
2017-01-01
This study investigates secondary-task interference on eye movements through learning with multimedia. We focus on the relationship between the influence of the secondary task on the eye movements of learners, and the learning outcomes as measured by retention, matching, and transfer. Half of the participants performed a spatial tapping task while…
Initial Scene Representations Facilitate Eye Movement Guidance in Visual Search
ERIC Educational Resources Information Center
Castelhano, Monica S.; Henderson, John M.
2007-01-01
What role does the initial glimpse of a scene play in subsequent eye movement guidance? In 4 experiments, a brief scene preview was followed by object search through the scene via a small moving window that was tied to fixation position. Experiment 1 demonstrated that the scene preview resulted in more efficient eye movements compared with a…
ERIC Educational Resources Information Center
Wedel, Michel; Pieters, Rik; Liechty, John
2008-01-01
Eye movements across advertisements express a temporal pattern of bursts of respectively relatively short and long saccades, and this pattern is systematically influenced by activated scene perception goals. This was revealed by a continuous-time hidden Markov model applied to eye movements of 220 participants exposed to 17 ads under a…
Using Eye Movements to Model the Sequence of Text-Picture Processing for Multimedia Comprehension
ERIC Educational Resources Information Center
Mason, L.; Scheiter, K.; Tornatora, M. C.
2017-01-01
This study used eye movement modeling examples (EMME) to support students' integrative processing of verbal and graphical information during the reading of an illustrated text. EMME consists of a replay of eye movements of a model superimposed onto the materials that are processed for accomplishing the task. Specifically, the study investigated…
Hawk Eyes II: Diurnal Raptors Differ in Head Movement Strategies When Scanning from Perches
O'Rourke, Colleen T.; Pitlik, Todd; Hoover, Melissa; Fernández-Juricic, Esteban
2010-01-01
Background: Relatively little is known about the degree of inter-specific variability in visual scanning strategies in species with laterally placed eyes (e.g., birds). This is relevant because many species detect prey while perching; therefore, head movement behavior may be an indicator of prey detection rate, a central parameter in foraging models. We studied head movement strategies in three diurnal raptors belonging to the Accipitridae and Falconidae families. Methodology/Principal Findings: We used behavioral recording of individuals under field and captive conditions to calculate the rate of two types of head movements and the interval between consecutive head movements. Cooper's Hawks had the highest rate of regular head movements, which can facilitate tracking prey items in the visually cluttered environment they inhabit (e.g., forested habitats). On the other hand, Red-tailed Hawks showed long intervals between consecutive head movements, which is consistent with prey searching in less visually obstructed environments (e.g., open habitats) and with detecting prey movement from a distance with their central foveae. Finally, American Kestrels have the highest rates of translational head movements (vertical or frontal displacements of the head keeping the bill in the same direction), which have been associated with depth perception through motion parallax. Higher translational head movement rates may be a strategy to compensate for the reduced degree of eye movement of this species. Conclusions: Cooper's Hawks, Red-tailed Hawks, and American Kestrels use both regular and translational head movements, but to different extents. We conclude that these diurnal raptors have species-specific strategies to gather visual information while perching. These strategies may optimize prey search and detection with different visual systems in habitat types with different degrees of visual obstruction. PMID:20877650
The effect of age and sex on facial mimicry: a three-dimensional study in healthy adults.
Sforza, C; Mapelli, A; Galante, D; Moriconi, S; Ibba, T M; Ferraro, L; Ferrario, V F
2010-10-01
To assess sex- and age-related characteristics in standardized facial movements, 40 healthy adults (20 men, 20 women; aged 20-50 years) performed seven standardized facial movements (maximum smile; free smile; "surprise" with closed mouth; "surprise" with open mouth; eye closure; right- and left-side eye closures). The three-dimensional coordinates of 21 soft tissue facial landmarks were recorded by a motion analyser, their movements computed, and asymmetry indices calculated. Within each movement, total facial mobility was independent from sex and age (analysis of variance, p>0.05). Asymmetry indices of the eyes and mouth were similar in both sexes (p>0.05). Age significantly influenced eye and mouth asymmetries of the right-side eye closure, and eye asymmetry of the surprise movement. On average, the asymmetry indices of the symmetric movements were always lower than 8%, and most did not deviate from the expected value of 0 (Student's t). Larger asymmetries were found for the asymmetric eye closures (eyes, up to 50%, p<0.05; mouth, up to 30%, p<0.05 only in the 20-30-year-old subjects). In conclusion, sex and age had a limited influence on total facial motion and asymmetry in normal adult men and women. Copyright © 2010 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
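A common way to express such asymmetry indices is a normalized percentage of the right-left difference; the formulation below is an assumed illustration, not necessarily the exact index used in the study.

def asymmetry_index(right_movement, left_movement):
    """Total landmark displacement (e.g., mm) on each side; 0 means perfect symmetry.
    This is a generic |R - L| / (R + L) percentage, assumed for illustration."""
    total = right_movement + left_movement
    if total == 0:
        return 0.0
    return 100.0 * abs(right_movement - left_movement) / total

print(asymmetry_index(12.3, 11.8))   # small asymmetry, about 2%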
Srulijes, Karin; Mack, David J; Klenk, Jochen; Schwickert, Lars; Ihlen, Espen A F; Schwenk, Michael; Lindemann, Ulrich; Meyer, Miriam; Srijana, K C; Hobert, Markus A; Brockmann, Kathrin; Wurster, Isabel; Pomper, Jörn K; Synofzik, Matthis; Schneider, Erich; Ilg, Uwe; Berg, Daniela; Maetzler, Walter; Becker, Clemens
2015-10-09
Falls frequency increases with age and particularly in neurogeriatric cohorts. The interplay between eye movements and locomotion may contribute substantially to the occurrence of falls, but is hardly investigated. This paper provides an overview of current approaches to simultaneously measure eye and body movements, particularly for analyzing the association of vestibulo-ocular reflex (VOR) suppression, postural deficits and falls in neurogeriatric risk cohorts. Moreover, VOR suppression is measured during head-fixed target presentation and during gaze shifting while postural control is challenged. Using these approaches, we aim at identifying quantitative parameters of eye-head-coordination during postural balance and gait, as indicators of fall risk. Patients with Progressive Supranuclear Palsy (PSP) or Parkinson's disease (PD), age- and sex-matched healthy older adults, and a cohort of young healthy adults will be recruited. Baseline assessment will include a detailed clinical assessment, covering medical history, neurological examination, disease specific clinical rating scales, falls-related self-efficacy, activities of daily living, neuro-psychological screening, assessment of mobility function and a questionnaire for retrospective falls. Moreover, participants will simultaneously perform eye and head movements (fixating a head-fixed target vs. shifting gaze to light emitting diodes in order to quantify vestibulo-ocular reflex suppression ability) under different conditions (sitting, standing, or walking). An eye/head tracker synchronized with a 3-D motion analysis system will be used to quantify parameters related to eye-head-coordination, postural balance, and gait. Established outcome parameters related to VOR suppression ability (e.g., gain, saccadic reaction time, frequency of saccades) and motor related fall risk (e.g., step-time variability, postural sway) will be calculated. Falls will be assessed prospectively over 12 months via protocols and monthly telephone interviews. This study protocol describes an experimental setup allowing the analysis of simultaneously assessed eye, head and body movements. Results will improve our understanding of the influence of the interplay between eye, head and body movements on falls in geriatric high-risk cohorts.
Anticipatory Eye Movements in Interleaving Templates of Human Behavior
NASA Technical Reports Server (NTRS)
Matessa, Michael
2004-01-01
Performance modeling has been made easier by architectures that package psychological theory for reuse at useful levels of abstraction. CPM-GOMS uses templates of behavior to package predictions of lower-level cognitive, perceptual, and motor resource use at the task level (e.g., mouse move-click, typing). CPM-GOMS also has a theory for interleaving resource use between templates. One example of interleaving is anticipatory eye movements. This paper describes the use of ACT-Stitch, a framework for translating CPM-GOMS templates and the interleaving theory into ACT-R, to model anticipatory eye movements in skilled behavior. The anticipatory eye movements explain performance in a well-practiced perceptual/motor task, and the interleaving theory is supported by results from an eye-tracking experiment.
Warren, Amy L; Donnon, Tyrone L; Wagg, Catherine R; Priest, Heather; Fernandez, Nicole J
2018-01-18
Visual diagnostic reasoning is the cognitive process by which pathologists reach a diagnosis based on visual stimuli (cytologic, histopathologic, or gross imagery). Currently, there is little to no literature examining visual reasoning in veterinary pathology. The objective of the study was to use eye tracking to establish baseline quantitative and qualitative differences between the visual reasoning processes of novice and expert veterinary pathologists viewing cytology specimens. Novice and expert participants were each shown 10 cytology images and asked to formulate a diagnosis while wearing eye-tracking equipment (10 slides) and while concurrently verbalizing their thought processes using the think-aloud protocol (5 slides). Compared to novices, experts demonstrated significantly higher diagnostic accuracy (p<.017), shorter time to diagnosis (p<.017), and a higher percentage of time spent viewing areas of diagnostic interest (p<.017). Experts elicited more key diagnostic features in the think-aloud protocol and had more efficient patterns of eye movement. These findings suggest that experts' fast time to diagnosis, efficient eye-movement patterns, and preference for viewing areas of interest support system 1 (pattern-recognition) reasoning and script-inductive knowledge structures, with system 2 (analytic) reasoning used to verify the diagnosis.
NASA Technical Reports Server (NTRS)
Hayashi, Miwa; Ravinder, Ujwala; McCann, Robert S.; Beutter, Brent; Spirkovska, Lily
2009-01-01
Performance enhancements associated with selected forms of automation were quantified in a recent human-in-the-loop evaluation of two candidate operational concepts for fault management on next-generation spacecraft. The baseline concept, called Elsie, featured a full-suite of "soft" fault management interfaces. However, operators were forced to diagnose malfunctions with minimal assistance from the standalone caution and warning system. The other concept, called Besi, incorporated a more capable C&W system with an automated fault diagnosis capability. Results from analyses of participants' eye movements indicate that the greatest empirical benefit of the automation stemmed from eliminating the need for text processing on cluttered, text-rich displays.
How young adults with autism spectrum disorder watch and interpret pragmatically complex scenes.
Lönnqvist, Linda; Loukusa, Soile; Hurtig, Tuula; Mäkinen, Leena; Siipo, Antti; Väyrynen, Eero; Palo, Pertti; Laukka, Seppo; Mämmelä, Laura; Mattila, Marja-Leena; Ebeling, Hanna
2017-11-01
The aim of the current study was to investigate subtle characteristics of social perception and interpretation in high-functioning individuals with autism spectrum disorders (ASDs), and to study the relation between watching and interpreting. As a novelty, we used an approach that combined moment-by-moment eye tracking and verbal assessment. Sixteen young adults with ASD and 16 neurotypical control participants watched a video depicting a complex communication situation while their eye movements were tracked. The participants also completed a verbal task with questions related to the pragmatic content of the video. We compared verbal task scores and eye movements between groups, and assessed correlations between task performance and eye movements. Individuals with ASD had more difficulty than the controls in interpreting the video, and during two short moments there were significant group differences in eye movements. Additionally, we found significant correlations between verbal task scores and moment-level eye movement in the ASD group, but not among the controls. We concluded that participants with ASD had slight difficulties in understanding the pragmatic content of the video stimulus and attending to social cues, and that the connection between pragmatic understanding and eye movements was more pronounced for participants with ASD than for neurotypical participants.
Eye Movements Affect Postural Control in Young and Older Females
Thomas, Neil M.; Bampouras, Theodoros M.; Donovan, Tim; Dewhurst, Susan
2016-01-01
Visual information is used for postural stabilization in humans. However, little is known about how eye movements prevalent in everyday life interact with the postural control system in older individuals. Therefore, the present study assessed the effects of stationary gaze fixations, smooth pursuits, and saccadic eye movements, with combinations of absent, fixed and oscillating large-field visual backgrounds to generate different forms of retinal flow, on postural control in healthy young and older females. Participants were presented with computer generated visual stimuli, whilst postural sway and gaze fixations were simultaneously assessed with a force platform and eye tracking equipment, respectively. The results showed that fixed backgrounds and stationary gaze fixations attenuated postural sway. In contrast, oscillating backgrounds and smooth pursuits increased postural sway. There were no differences regarding saccades. There were also no differences in postural sway or gaze errors between age groups in any visual condition. The stabilizing effect of the fixed visual stimuli shows how retinal flow and extraocular factors guide postural adjustments. The destabilizing effect of oscillating visual backgrounds and smooth pursuits may be related to more challenging conditions for determining body shifts from retinal flow, and more complex extraocular signals, respectively. Because the older participants matched the young group's performance in all conditions, decreases of posture and gaze control during stance may not be a direct consequence of healthy aging. Further research examining extraocular and retinal mechanisms of balance control and the effects of eye movements, during locomotion, is needed to better inform fall prevention interventions. PMID:27695412
Eye Movements Affect Postural Control in Young and Older Females.
Thomas, Neil M; Bampouras, Theodoros M; Donovan, Tim; Dewhurst, Susan
2016-01-01
Visual information is used for postural stabilization in humans. However, little is known about how eye movements prevalent in everyday life interact with the postural control system in older individuals. Therefore, the present study assessed the effects of stationary gaze fixations, smooth pursuits, and saccadic eye movements, with combinations of absent, fixed and oscillating large-field visual backgrounds to generate different forms of retinal flow, on postural control in healthy young and older females. Participants were presented with computer generated visual stimuli, whilst postural sway and gaze fixations were simultaneously assessed with a force platform and eye tracking equipment, respectively. The results showed that fixed backgrounds and stationary gaze fixations attenuated postural sway. In contrast, oscillating backgrounds and smooth pursuits increased postural sway. There were no differences regarding saccades. There were also no differences in postural sway or gaze errors between age groups in any visual condition. The stabilizing effect of the fixed visual stimuli shows how retinal flow and extraocular factors guide postural adjustments. The destabilizing effect of oscillating visual backgrounds and smooth pursuits may be related to more challenging conditions for determining body shifts from retinal flow, and more complex extraocular signals, respectively. Because the older participants matched the young group's performance in all conditions, decreases of posture and gaze control during stance may not be a direct consequence of healthy aging. Further research examining extraocular and retinal mechanisms of balance control and the effects of eye movements, during locomotion, is needed to better inform fall prevention interventions.
Object-location binding across a saccade: A retinotopic Spatial Congruency Bias
Shafer-Skelton, Anna; Kupitz, Colin N.; Golomb, Julie D.
2017-01-01
Despite frequent eye movements that rapidly shift the locations of objects on our retinas, our visual system creates a stable perception of the world. To do this, it must convert eye-centered (retinotopic) input to world-centered (spatiotopic) percepts. Moreover, for successful behavior we must also incorporate information about object features/identities during this updating – a fundamental challenge that remains to be understood. Here we adapted a recent behavioral paradigm, the “Spatial Congruency Bias”, to investigate object-location binding across an eye movement. In two initial baseline experiments, we showed that the Spatial Congruency Bias was present for both gabor and face stimuli in addition to the object stimuli used in the original paradigm. Then, across three main experiments, we found the bias was preserved across an eye movement, but only in retinotopic coordinates: Subjects were more likely to perceive two stimuli as having the same features/identity when they were presented in the same retinotopic location. Strikingly, there was no evidence of location binding in the more ecologically relevant spatiotopic (world-centered) coordinates; the reference frame did not update to spatiotopic even at longer post-saccade delays, nor did it transition to spatiotopic with more complex stimuli (gabors, shapes, and faces all showed a retinotopic Congruency Bias). Our results suggest that object-location binding may be tied to retinotopic coordinates, and that it may need to be re-established following each eye movement rather than being automatically updated to spatiotopic coordinates. PMID:28070793
Understanding Visible Perception
NASA Technical Reports Server (NTRS)
2003-01-01
One concern about human adaptation to space is how returning from the microgravity of orbit to Earth can affect an astronaut's ability to fly safely. There are monitors and infrared video cameras to measure eye movements without having to affect the crew member. A computer screen provides moving images which the eye tracks while the brain determines what it is seeing. A video camera records movement of the subject's eyes. Researchers can then correlate perception and response. Test subjects perceive different images when a moving object is covered by a mask that is visible or invisible (above). Early results challenge the accepted theory that smooth pursuit -- the fluid eye movement that humans and primates have -- does not involve the higher brain. NASA results show that: Eye movement can predict human perceptual performance, smooth pursuit and saccadic (quick or ballistic) movement share some signal pathways, and common factors can make both smooth pursuit and visual perception produce errors in motor responses.
Understanding eye movements in face recognition using hidden Markov models.
Chuk, Tim; Chan, Antoni B; Hsiao, Janet H
2014-09-16
We used a hidden Markov model (HMM)-based approach to analyze eye movement data in face recognition. HMMs are statistical models that are specialized in handling time-series data. We conducted a face recognition task with Asian participants, and modeled each participant's eye movement pattern with an HMM, which summarized the participant's scan paths in face recognition with both regions of interest and the transition probabilities among them. By clustering these HMMs, we showed that participants' eye movements could be categorized into holistic or analytic patterns, demonstrating significant individual differences even within the same culture. Participants with the analytic pattern had longer response times, but did not differ significantly in recognition accuracy from those with the holistic pattern. We also found that correct and wrong recognitions were associated with distinctive eye movement patterns; the difference between the two patterns lies in the transitions rather than in the locations of the fixations alone. © 2014 ARVO.
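As a rough illustration of the HMM idea described above (not the authors' implementation), the sketch below fits a Gaussian HMM to one participant's fixation coordinates using the third-party hmmlearn package; hidden states stand in for data-driven regions of interest, and the learned transition matrix summarizes the scan-path dynamics. Function names, parameter values, and the synthetic data are assumptions.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM  # third-party package, standing in for the authors' own HMM code

def fit_scanpath_hmm(fixations, lengths, n_states=3):
    """Fit one participant's scan-path HMM.

    `fixations` is an (N, 2) array of fixation x/y coordinates pooled over
    trials, and `lengths` lists the number of fixations in each trial.
    Hidden states play the role of data-driven regions of interest; the
    learned transition matrix summarizes how gaze moves among them."""
    model = GaussianHMM(n_components=n_states, covariance_type="full",
                        n_iter=100, random_state=0)
    model.fit(fixations, lengths)
    return model.means_, model.transmat_

# Toy usage with synthetic fixations from two trials:
rng = np.random.default_rng(0)
fix = rng.normal(loc=[[100, 120]], scale=15, size=(40, 2))
roi_centers, transitions = fit_scanpath_hmm(fix, lengths=[20, 20])
```

Clustering such per-participant models into holistic versus analytic groups, as the study reports, would be a separate step applied to the fitted parameters.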
Chuk, Tim; Chan, Antoni B; Hsiao, Janet H
2017-12-01
The hidden Markov model (HMM)-based approach for eye movement analysis is able to reflect individual differences in both spatial and temporal aspects of eye movements. Here we used this approach to understand the relationship between eye movements during face learning and recognition, and its association with recognition performance. We discovered holistic (i.e., mainly looking at the face center) and analytic (i.e., specifically looking at the two eyes in addition to the face center) patterns during both learning and recognition. Although for both learning and recognition, participants who adopted analytic patterns had better recognition performance than those with holistic patterns, a significant positive correlation between the likelihood of participants' patterns being classified as analytic and their recognition performance was only observed during recognition. Significantly more participants adopted holistic patterns during learning than recognition. Interestingly, about 40% of the participants used different patterns between learning and recognition, and among them 90% switched their patterns from holistic at learning to analytic at recognition. In contrast to the scan path theory, which posits that eye movements during learning have to be recapitulated during recognition for the recognition to be successful, participants who used the same or different patterns during learning and recognition did not differ in recognition performance. The similarity between their learning and recognition eye movement patterns also did not correlate with their recognition performance. These findings suggested that perceptuomotor memory elicited by eye movement patterns during learning does not play an important role in recognition. In contrast, the retrieval of diagnostic information for recognition, such as the eyes for face recognition, is a better predictor for recognition performance. Copyright © 2017 Elsevier Ltd. All rights reserved.
Eye Movements of Patients with Tunnel Vision while Walking
Vargas-Martín, Fernando; Peli, Eli
2006-01-01
Purpose To determine how severe peripheral field loss (PFL) affects the dispersion of eye movements relative to the head, while walking in real environments. This information should help to better define the visual field and clearance requirements for head-mounted mobility visual aids. Methods Eye positions relative to the head were recorded in five retinitis pigmentosa patients with less than 15° of visual field and three normally-sighted people, each walking in varied environments for more than 30 minutes. The eye position recorder was made portable by modifying a head-mounted ISCAN system. Custom data processing was implemented to reject unreliable data. Sample standard deviations of eye position (dispersion) were compared across subject groups and environments. Results PFL patients exhibited narrower horizontal eye position dispersions than normally-sighted subjects (9.4° vs. 14.2°, p < 0.0001) and PFL patients’ vertical dispersions were smaller when walking indoors than outdoors (8.2° vs. 10.3°, p = 0.048). Conclusions When walking, the PFL patients did not increase their scanning eye movements to compensate for missing peripheral vision information. Their horizontal scanning was actually reduced, possibly because saccadic amplitude is limited by a lack of peripheral stimulation. The results suggest that a field-of-view as wide as 40° may be needed for closed (immersive) head-mounted mobility aids, while a much narrower display, perhaps as narrow as 20°, might be sufficient with an open design. PMID:17122116
Eye movements of patients with tunnel vision while walking.
Vargas-Martín, Fernando; Peli, Eli
2006-12-01
To determine how severe peripheral field loss (PFL) affects the dispersion of eye movements relative to the head in patients walking in real environments. This information should help to define the visual field and clearance requirements for head-mounted mobility visual aids. Eye positions relative to the head were recorded in five patients with retinitis pigmentosa who had less than 15 degrees of visual field and in three normally sighted people, each walking in varied environments for more than 30 minutes. The eye-position recorder was made portable by modifying a head-mounted system (ISCAN, Burlington, MA). Custom data processing was implemented, to reject unreliable data. Sample standard deviations of eye position (dispersion) were compared across subject groups and environments. The patients with PFL exhibited narrower horizontal eye-position dispersions than did the normally sighted subjects (9.4 degrees vs. 14.2 degrees, P < 0.0001), and the vertical dispersions of patients with PFL were smaller when they were walking indoors than when walking outdoors (8.2 degrees vs. 10.3 degrees; P = 0.048). When walking, the patients with PFL did not increase their scanning eye movements to compensate for missing peripheral vision information. Their horizontal scanning was actually reduced, possibly because of lack of peripheral stimulation. The results suggest that a field of view as wide as 40 degrees may be needed for closed (immersive) head-mounted mobility aids, whereas a much narrower display, perhaps as narrow as 20 degrees, may be sufficient with an open design.
Smooth pursuit eye movements and schizophrenia: literature review.
Franco, J G; de Pablo, J; Gaviria, A M; Sepúlveda, E; Vilella, E
2014-09-01
To review the scientific literature about the relationship between impairment of smooth pursuit eye movements and schizophrenia. Narrative review that includes historical articles, reports about basic and clinical investigation, systematic reviews, and meta-analysis on the topic. Up to 80% of schizophrenic patients have impairment of smooth pursuit eye movements. Despite the diversity of test protocols, 65% of patients and controls are correctly classified by their overall performance during this pursuit. Smooth pursuit eye movements depend on the ability to anticipate the target's velocity and the visual feedback, as well as on learning and attention. The neuroanatomy implicated in smooth pursuit overlaps to some extent with certain frontal cortex zones associated with some clinical and neuropsychological characteristics of schizophrenia; therefore, some specific components of smooth pursuit anomalies could serve as biomarkers of the disease. Due to their sedative effect, antipsychotics have a deleterious effect on smooth pursuit eye movements, thus these movements cannot be used to evaluate the efficacy of the currently available treatments. Standardized evaluation of smooth pursuit eye movements in schizophrenia will allow specific aspects of that pursuit to be used as biomarkers for the study of its genetics, psychopathology, or neuropsychology. Copyright © 2013 Sociedad Española de Oftalmología. Published by Elsevier Espana. All rights reserved.
NASA Technical Reports Server (NTRS)
Huebner, W. P.; Paloski, W. H.; Reschke, M. F.; Bloomberg, J. J.
1995-01-01
Neglecting the eccentric position of the eyes in the head can lead to erroneous interpretation of ocular motor data, particularly for near targets. We discuss the geometric effects that eye eccentricity has on the processing of target-directed eye and head movement data, and we highlight two approaches to processing and interpreting such data. The first approach involves determining the true position of the target with respect to the location of the eyes in space for evaluating the efficacy of gaze, and it allows calculation of retinal error directly from measured eye, head, and target data. The second approach effectively eliminates eye eccentricity effects by adjusting measured eye movement data to yield equivalent responses relative to a specified reference location (such as the center of head rotation). This latter technique can be used to standardize measured eye movement signals, enabling waveforms collected under different experimental conditions to be directly compared, both with the measured target signals and with each other. Mathematical relationships describing these approaches are presented for horizontal and vertical rotations, for both tangential and circumferential display screens, and the sensitivity of the calculated results to parameter variations is described.
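The abstract does not reproduce the mathematical relationships themselves; the minimal sketch below only illustrates the underlying geometric point, assuming a simple 2-D head-fixed frame with its origin at the head's rotation center. The function name, coordinate convention, and numbers are hypothetical.

```python
import numpy as np

def required_eye_angle(target_xy, eye_offset_xy):
    """Horizontal eye-in-head angle (deg) needed to foveate a target when
    the eye sits at `eye_offset_xy` rather than at the head's rotation
    center. Coordinates are metres in a head-fixed frame (x forward,
    y leftward); the frame convention is an assumption of this sketch."""
    dx = target_xy[0] - eye_offset_xy[0]
    dy = target_xy[1] - eye_offset_xy[1]
    return np.degrees(np.arctan2(dy, dx))

# An eye 3 cm to the side of the head's rotation center must rotate ~5.7 deg
# to fixate a "straight ahead" target 30 cm away, but less than 1 deg for a
# target 2 m away -- the near-target effect the abstract warns about.
print(required_eye_angle((0.30, 0.0), (0.0, 0.03)))  # ~ -5.7
print(required_eye_angle((2.00, 0.0), (0.0, 0.03)))  # ~ -0.9
```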
PC-based high-speed video-oculography for measuring rapid eye movements in mice.
Sakatani, Tomoya; Isa, Tadashi
2004-05-01
We developed a new infrared video-oculographic system for on-line tracking of the eye position in awake and head-fixed mice, with high temporal resolution (240 Hz). The system consists of a commercially available high-speed CCD camera and image-processing software written in LabVIEW running on an IBM PC with a plug-in video grabber board. This software calculates the center and area of the pupil by fitting a circular function to the pupil boundary, and allows robust and stable tracking of the eye position in small animals like mice. On-line calculation is performed to obtain a reasonable circular fit of the pupil boundary even if part of the pupil is covered by shadows or occluded by eyelids or corneal reflections. The pupil position in the 2-D video plane is converted to the rotation angle of the eyeball by estimating its rotation center based on the anatomical eyeball model. With this recording system, it is possible to perform quantitative analysis of rapid eye movements such as saccades in mice. This will provide a powerful tool for analyzing the molecular basis of oculomotor and cognitive functions by using various lines of mutant mice.
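A generic least-squares circle fit of the kind described (the algebraic Kasa fit) can be written in a few lines of NumPy. This is only a sketch of the idea, not the authors' LabVIEW implementation, and it omits the outlier handling a real pupil tracker needs for corneal reflections.

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic (Kasa) least-squares fit of a circle to pupil-boundary
    points; returns (xc, yc, r). Solves the linear system obtained from
    x^2 + y^2 + A*x + B*y + C = 0."""
    M = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x ** 2 + y ** 2)
    (A, B, C), *_ = np.linalg.lstsq(M, rhs, rcond=None)
    xc, yc = -A / 2.0, -B / 2.0
    r = np.sqrt(xc ** 2 + yc ** 2 - C)
    return xc, yc, r

# The fit tolerates a partially visible boundary, e.g. when the lid or a
# corneal reflection hides part of the pupil:
theta = np.linspace(0.3, 2.5, 60)                 # only an arc of the circle
x = 120 + 25 * np.cos(theta) + np.random.randn(60) * 0.5
y = 90 + 25 * np.sin(theta) + np.random.randn(60) * 0.5
print(fit_circle(x, y))                           # roughly (120, 90, 25)
```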
Contribution of the cerebellar flocculus to gaze control during active head movements
NASA Technical Reports Server (NTRS)
Belton, T.; McCrea, R. A.; Peterson, B. W. (Principal Investigator)
1999-01-01
The flocculus and ventral paraflocculus are adjacent regions of the cerebellar cortex that are essential for controlling smooth pursuit eye movements and for altering the performance of the vestibulo-ocular reflex (VOR). The question addressed in this study is whether these regions of the cerebellum are more globally involved in controlling gaze, regardless of whether eye or active head movements are used to pursue moving visual targets. Single-unit recordings were obtained from Purkinje (Pk) cells in the floccular region of squirrel monkeys that were trained to fixate and pursue small visual targets. Cell firing rate was recorded during smooth pursuit eye movements, cancellation of the VOR, combined eye-head pursuit, and spontaneous gaze shifts in the absence of targets. Pk cells were found to be much less sensitive to gaze velocity during combined eye-head pursuit than during ocular pursuit. They were not sensitive to gaze or head velocity during gaze saccades. Temporary inactivation of the floccular region by muscimol injection compromised ocular pursuit but had little effect on the ability of monkeys to pursue visual targets with head movements or to cancel the VOR during active head movements. Thus the signals produced by Pk cells in the floccular region are necessary for controlling smooth pursuit eye movements but not for coordinating gaze during active head movements. The results imply that individual functional modules in the cerebellar cortex are less involved in the global organization and coordination of movements than with parametric control of movements produced by a specific part of the body.
Distinct neural circuits for control of movement vs. holding still
2017-01-01
In generating a point-to-point movement, the brain does more than produce the transient commands needed to move the body part; it also produces the sustained commands that are needed to hold the body part at its destination. In the oculomotor system, these functions are mapped onto two distinct circuits: a premotor circuit that specializes in generating the transient activity that displaces the eyes and a “neural integrator” that transforms that transient input into sustained activity that holds the eyes. Different parts of the cerebellum adaptively control the motor commands during these two phases: the oculomotor vermis participates in fine tuning the transient neural signals that move the eyes, monitoring the activity of the premotor circuit via efference copy, whereas the flocculus participates in controlling the sustained neural signals that hold the eyes, monitoring the activity of the neural integrator. Here, I review the oculomotor literature and then ask whether this separation of control between moving and holding is a design principle that may be shared with other modalities of movement. To answer this question, I consider neurophysiological and psychophysical data in various species during control of head movements, arm movements, and locomotion, focusing on the brain stem, motor cortex, and hippocampus, respectively. The review of the data raises the possibility that across modalities of motor control, circuits that are responsible for producing commands that change the sensory state of a body part are distinct from those that produce commands that maintain that sensory state. PMID:28053244
NASA Technical Reports Server (NTRS)
Krauzlis, R. J.; Stone, L. S.
1999-01-01
The two components of voluntary tracking eye-movements in primates, pursuit and saccades, are generally viewed as relatively independent oculomotor subsystems that move the eyes in different ways using independent visual information. Although saccades have long been known to be guided by visual processes related to perception and cognition, only recently have psychophysical and physiological studies provided compelling evidence that pursuit is also guided by such higher-order visual processes, rather than by the raw retinal stimulus. Pursuit and saccades also do not appear to be entirely independent anatomical systems, but involve overlapping neural mechanisms that might be important for coordinating these two types of eye movement during the tracking of a selected visual object. Given that the recovery of objects from real-world images is inherently ambiguous, guiding both pursuit and saccades with perception could represent an explicit strategy for ensuring that these two motor actions are driven by a single visual interpretation.
A Model-Based Approach for the Measurement of Eye Movements Using Image Processing
NASA Technical Reports Server (NTRS)
Sung, Kwangjae; Reschke, Millard F.
1997-01-01
This paper describes a video eye-tracking algorithm which searches for the best fit of the pupil modeled as a circular disk. The algorithm is robust to common image artifacts such as droopy eyelids and light reflections while maintaining the measurement resolution available with the centroid algorithm. The presented algorithm is used to derive the pupil size and center coordinates, and can be combined with iris-tracking techniques to measure ocular torsion. A comparison search over pupil candidates using pixel coordinate reference lookup tables optimizes the processing requirements for a least-squares fit of the circular disk model. This paper includes quantitative analyses and simulation results for the resolution and the robustness of the algorithm. The algorithm presented in this paper provides a platform for a noninvasive, multidimensional eye measurement system which can be used for clinical and research applications requiring the precise recording of eye movements in three-dimensional space.
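As a hedged sketch of the disk-model search idea (not the published algorithm), the code below scores candidate disk centers and radii by how well dark pixels fill the disk interior relative to a surrounding ring; the paper's lookup-table optimization and sub-pixel least-squares fit are omitted, and the threshold, grid step, and ring width are arbitrary.

```python
import numpy as np

def best_pupil_disk(im, radii, dark_thresh=60):
    """Exhaustively score candidate disks (xc, yc, r) on a grayscale eye
    image `im`: a good pupil candidate has mostly dark pixels inside the
    disk and few dark pixels in a thin surrounding ring. This brute-force
    version only conveys the model-based idea; the published method uses
    precomputed pixel-coordinate lookup tables for speed."""
    h, w = im.shape
    yy, xx = np.mgrid[0:h, 0:w]
    dark = im < dark_thresh
    best_params, best_score = None, -np.inf
    for r in radii:
        for yc in range(r, h - r, 2):            # coarse grid for brevity
            for xc in range(r, w - r, 2):
                d2 = (xx - xc) ** 2 + (yy - yc) ** 2
                inside = d2 <= r * r
                ring = (d2 > r * r) & (d2 <= (r + 3) ** 2)
                score = dark[inside].mean() - dark[ring].mean()
                if score > best_score:
                    best_params, best_score = (xc, yc, r), score
    return best_params, best_score
```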
What Do Eye Gaze Metrics Tell Us about Motor Imagery?
Poiroux, Elodie; Cavaro-Ménard, Christine; Leruez, Stéphanie; Lemée, Jean Michel; Richard, Isabelle; Dinomais, Mickael
2015-01-01
Many of the brain structures involved in performing real movements also have increased activity during imagined movements or during motor observation, and this could be the neural substrate underlying the effects of motor imagery in motor learning or motor rehabilitation. In the absence of any objective physiological method of measurement, it is currently impossible to be sure that the patient is indeed performing the task as instructed. Eye gaze recording during a motor imagery task could be a possible way to "spy" on the activity an individual is really engaged in. The aim of the present study was to compare the pattern of eye movement metrics during motor observation, visual and kinesthetic motor imagery (VI, KI), target fixation, and mental calculation. Twenty-two healthy subjects (16 females and 6 males) were required to perform tests in five conditions using imagery in the Box and Block Test tasks following the procedure described by Liepert et al. Eye movements were analysed by a non-invasive oculometric measure (SMI RED250 system). Two parameters describing gaze pattern were calculated: the index of ocular mobility (saccade duration over saccade + fixation duration) and the number of midline crossings (i.e. the number of times the subjects' gaze crossed the midline of the screen when performing the different tasks). Both parameters were significantly different between visual imagery and kinesthetic imagery, visual imagery and mental calculation, and visual imagery and target fixation. For the first time we were able to show that eye movement patterns are different during VI and KI tasks. Our results suggest that gaze metric parameters could be used as an objective unobtrusive approach to assess engagement in a motor imagery task. Further studies should define how oculomotor parameters could be used as an indicator of the rehabilitation task a patient is engaged in.
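The two reported parameters are straightforward to compute from labeled gaze events. The sketch below assumes a hypothetical event-list format and illustrative numbers; it is not the authors' analysis code.

```python
def ocular_mobility_index(events):
    """Index of ocular mobility as defined above: total saccade duration
    divided by total saccade-plus-fixation duration. `events` is assumed
    to be a list of (kind, duration_s, x) tuples with kind in
    {'saccade', 'fixation'} and x the gaze position in normalized screen
    coordinates (0 = left edge, 1 = right edge)."""
    sacc = sum(d for kind, d, _ in events if kind == 'saccade')
    fix = sum(d for kind, d, _ in events if kind == 'fixation')
    return sacc / (sacc + fix)

def midline_crossings(events, midline=0.5):
    """Count how often successive gaze positions fall on opposite sides of
    the screen midline -- one simple reading of the second parameter."""
    xs = [x for _, _, x in events]
    return sum((a - midline) * (b - midline) < 0 for a, b in zip(xs, xs[1:]))

events = [('fixation', 0.35, 0.30), ('saccade', 0.04, 0.55),
          ('fixation', 0.40, 0.70), ('saccade', 0.03, 0.45)]
print(ocular_mobility_index(events), midline_crossings(events))
```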
Infrared-Based Blink Detecting Glasses for Facial Pacing: Towards a Bionic Blink
Frigerio, Alice; Hadlock, Tessa A; Murray, Elizabeth H; Heaton, James T
2015-01-01
IMPORTANCE: Facial paralysis remains one of the most challenging conditions to effectively manage, often causing life-altering deficits in both function and appearance. Facial rehabilitation via pacing and robotic technology has great yet unmet potential. A critical first step towards reanimating symmetrical facial movement in cases of unilateral paralysis is the detection of healthy movement to use as a trigger for stimulated movement. OBJECTIVE: To test a blink detection system that can be attached to standard eyeglasses and used as part of a closed-loop facial pacing system. DESIGN: Standard safety glasses were equipped with an infrared (IR) emitter/detector pair oriented horizontally across the palpebral fissure, creating a monitored IR beam that became interrupted when the eyelids closed. SETTING: Tertiary care Facial Nerve Center. PARTICIPANTS: 24 healthy volunteers. MAIN OUTCOME MEASURE: Video-quantified blinking was compared with both IR sensor signal magnitude and rate of change in healthy participants with their gaze in repose, while they shifted gaze from central to far peripheral positions, and during the production of particular facial expressions. RESULTS: Blink detection based on signal magnitude achieved 100% sensitivity in forward gaze, but generated false-detections on downward gaze. Calculations of peak rate of signal change (first derivative) typically distinguished blinks from gaze-related lid movements. During forward gaze, 87% of detected blink events were true positives, 11% were false positives, and 2% false negatives. Of the 11% false positives, 6% were associated with partial eyelid closures. During gaze changes, false blink detection occurred 6.3% of the time during lateral eye movements, 10.4% during upward movements, 46.5% during downward movements, and 5.6% for movements from an upward or downward gaze back to the primary gaze. Facial expressions disrupted sensor output if they caused substantial squinting or shifted the glasses. CONCLUSION AND RELEVANCE: Our blink detection system provides a reliable, non-invasive indication of eyelid closure using an invisible light beam passing in front of the eye. Future versions will aim to mitigate detection errors by using multiple IR emitter/detector pairs mounted on the glasses, and alternative frame designs may reduce shifting of the sensors relative to the eye during facial movements. PMID:24699708
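A minimal reading of the two detection rules (signal magnitude plus peak first derivative) might look like the following. The sampling rate, window length, thresholds, and the assumption that the sensor output rises when the beam is interrupted are placeholders, not the device's actual firmware.

```python
import numpy as np

def detect_blinks(ir_signal, fs, mag_thresh, slope_thresh):
    """Two-stage rule sketched from the description above, assuming the
    sensor output rises when the closing lid interrupts the IR beam:
    (1) magnitude: find samples crossing mag_thresh,
    (2) slope: keep only candidates whose peak first derivative within
        ~200 ms also exceeds slope_thresh, rejecting slow gaze-related
        lid movements. Thresholds would need per-wearer calibration."""
    signal = np.asarray(ir_signal, dtype=float)
    above = signal > mag_thresh
    onsets = np.flatnonzero(np.diff(above.astype(int)) == 1)  # rising edges
    slope = np.gradient(signal) * fs                          # units/second
    blink_times = []
    for i in onsets:
        window = slope[i:i + int(0.2 * fs)]
        if window.size and window.max() > slope_thresh:
            blink_times.append(i / fs)                        # seconds
    return blink_times
```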
Effect of glaucoma on eye movement patterns and laboratory-based hazard detection ability
Black, Alex A.; Wood, Joanne M.
2017-01-01
Purpose The mechanisms underlying the elevated crash rates of older drivers with glaucoma are poorly understood. A key driving skill is timely detection of hazards; however, the hazard detection ability of drivers with glaucoma has been largely unexplored. This study assessed the eye movement patterns and visual predictors of performance on a laboratory-based hazard detection task in older drivers with glaucoma. Methods Participants included 30 older drivers with glaucoma (71±7 years; average better-eye mean deviation (MD) = −3.1±3.2 dB; average worse-eye MD = −11.9±6.2 dB) and 25 age-matched controls (72±7 years). Visual acuity, contrast sensitivity, visual fields, useful field of view (UFoV; processing speeds), and motion sensitivity were assessed. Participants completed a computerised Hazard Perception Test (HPT) while their eye movements were recorded using a desk-mounted Tobii TX300 eye-tracking system. The HPT comprises a series of real-world traffic videos recorded from the driver’s perspective; participants responded to road hazards appearing in the videos, and hazard response times were determined. Results Participants with glaucoma exhibited an average of 0.42 seconds delay in hazard response time (p = 0.001), smaller saccades (p = 0.010), and delayed first fixation on hazards (p<0.001) compared to controls. Importantly, larger saccades were associated with faster hazard responses in the glaucoma group (p = 0.004), but not in the control group (p = 0.19). Across both groups, significant visual predictors of hazard response times included motion sensitivity, UFoV, and worse-eye MD (p<0.05). Conclusions Older drivers with glaucoma had delayed hazard response times compared to controls, with associated changes in eye movement patterns. The association between larger saccades and faster hazard response time in the glaucoma group may represent a compensatory behaviour to facilitate improved performance. PMID:28570621
Effects of Bilateral Eye Movements on Gist Based False Recognition in the DRM Paradigm
ERIC Educational Resources Information Center
Parker, Andrew; Dagnall, Neil
2007-01-01
The effects of saccadic bilateral (horizontal) eye movements on gist based false recognition was investigated. Following exposure to lists of words related to a critical but non-studied word participants were asked to engage in 30s of bilateral vs. vertical vs. no eye movements. Subsequent testing of recognition memory revealed that those who…
The Neural Basis of Smooth Pursuit Eye Movements in the Rhesus Monkey Brain
ERIC Educational Resources Information Center
Ilg, Uwe J.; Thier, Peter
2008-01-01
Smooth pursuit eye movements are performed in order to prevent retinal image blur of a moving object. Rhesus monkeys are able to perform smooth pursuit eye movements quite similar as humans, even if the pursuit target does not consist in a simple moving dot. Therefore, the study of the neuronal responses as well as the consequences of…
ERIC Educational Resources Information Center
Elich, Matthew; And Others
1985-01-01
Tested Bandler and Grinder's proposal that eye movement direction and spoken predicates are indicative of sensory modality of imagery. Subjects reported images in the three modes, but no relation between imagery and eye movements or predicates was found. Visual images were most vivid and often reported. Most subjects rated themselves as visual,…
Evaluating and Reporting Data Quality in Eye Movement Research. Technical Report No. 193.
ERIC Educational Resources Information Center
McConkie, George W.
Stressing that it is necessary to have information about the quality of eye movement data in order to judge the degree of confidence one should have in the results of an experiment using eye movement records as data, this report suggests ways for assessing and reporting such information. Specifically, the report deals with three areas: (1)…
ERIC Educational Resources Information Center
Metzner, Paul; von der Malsburg, Titus; Vasishth, Shravan; Rösler, Frank
2017-01-01
How important is the ability to freely control eye movements for reading comprehension? And how does the parser make use of this freedom? We investigated these questions using coregistration of eye movements and event-related brain potentials (ERPs) while participants read either freely or in a computer-controlled word-by-word format (also known…
Computational Model-Based Prediction of Human Episodic Memory Performance Based on Eye Movements
NASA Astrophysics Data System (ADS)
Sato, Naoyuki; Yamaguchi, Yoko
Subjects' episodic memory performance is not simply reflected by eye movements. We use a ‘theta phase coding’ model of the hippocampus to predict subjects' memory performance from their eye movements. Results demonstrate the ability of the model to predict subjects' memory performance. These studies provide a novel approach to computational modeling in the human-machine interface.
The Strategic Organization of Skill
NASA Technical Reports Server (NTRS)
Roberts, Ralph
1996-01-01
Eye-movement software was developed in addition to several studies that focused on expert-novice differences in the acquisition and organization of skill. These studies focused on how increasingly complex strategies utilize and incorporate visual look-ahead to calibrate action. Software for collecting, calibrating, and scoring eye-movements was refined and updated. Some new algorithms were developed for analyzing corneal-reflection eye movement data that detect the location of saccadic eye movements in space and time. Two full-scale studies were carried out which examined how experts use foveal and peripheral vision to acquire information about upcoming environmental circumstances in order to plan future action(s) accordingly.
Nystagmus as a Sign of Labyrinthine Disorders: Three-Dimensional Analysis of Nystagmus
2008-01-01
In order to diagnose the pathological condition of vertiginous patients, detailed observation of nystagmus, in addition to examination of body equilibrium and other neurotological tests, is essential. How to record eye movements precisely has long been a goal of researchers and clinicians interested in the analysis of eye movements, and it requires consideration of the optimal recording method. In this review, the author introduces a new method, namely the analysis of vestibular-induced eye movements in three dimensions, and discusses the advantages and limitations of this approach. PMID:19434275
Tamura, Atsushi; Wada, Yoshiro; Shimizu, Naoki; Inui, Takuo; Shiotani, Akihiro
2016-01-01
This study suggests that the subjective climbing perception can be quantitatively evaluated using values calculated from induced eye movements, and the findings may aid in the detection of pilots who are susceptible to spatial disorientation in a screening test. The climbing perception experienced by a pilot during takeoff at night is stronger than that experienced during the day. To investigate this illusion, this study assessed eye movements and analyzed their correlation with subjective climbing perception during daytime and nighttime takeoffs. Eight male volunteers participated in this study. A simulated aircraft takeoff environment was created using a flight simulator and the maximum slow-phase velocities and vestibulo-ocular reflex gain of vertical eye movements were calculated during takeoff simulation. Four of the eight participants reported that their perception of climbing at night was stronger, while the other four reported that there was no difference between day and night. These perceptions were correlated with eye movements; participants with a small difference in the maximum slow-phase velocities of their downward eye movements between daytime and nighttime takeoffs indicated that their perception of climbing was the same under the two conditions.
Execution of saccadic eye movements affects speed perception
Goettker, Alexander; Braun, Doris I.; Schütz, Alexander C.; Gegenfurtner, Karl R.
2018-01-01
Due to the foveal organization of our visual system we have to constantly move our eyes to gain precise information about our environment. Doing so massively alters the retinal input. This is problematic for the perception of moving objects, because physical motion and retinal motion become decoupled and the brain has to discount the eye movements to recover the speed of moving objects. Two different types of eye movements, pursuit and saccades, are combined for tracking. We investigated how the way we track moving targets can affect the perceived target speed. We found that the execution of corrective saccades during pursuit initiation modifies how fast the target is perceived compared with pure pursuit. When participants executed a forward (catch-up) saccade they perceived the target to be moving faster. When they executed a backward saccade they perceived the target to be moving more slowly. Variations in pursuit velocity without corrective saccades did not affect perceptual judgments. We present a model for these effects, assuming that the eye velocity signal for small corrective saccades gets integrated with the retinal velocity signal during pursuit. In our model, the execution of corrective saccades modulates the integration of these two signals by giving less weight to the retinal information around the time of corrective saccades. PMID:29440494
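One minimal reading of the integration model described above, in code form, is sketched below. The gains, the attenuation factor, and the function name are illustrative assumptions, not the authors' fitted parameters.

```python
def perceived_target_speed(eye_velocity, retinal_slip, near_corrective_saccade,
                           g_eye=1.0, g_ret=1.0, ret_attenuation=0.5):
    """Speed estimate combining an extraretinal eye-velocity signal with
    retinal slip, with the retinal signal down-weighted around small
    corrective saccades. During a forward (catch-up) saccade the eye
    briefly outruns the target, so the slip is negative and down-weighting
    it biases the estimate upward; a backward saccade has the opposite
    effect, matching the perceptual results reported above."""
    if near_corrective_saccade:
        g_ret *= ret_attenuation
    return g_eye * eye_velocity + g_ret * retinal_slip
```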
Odean, Rosalie; Nazareth, Alina; Pruden, Shannon M.
2015-01-01
Developmental systems theory posits that development cannot be segmented by influences acting in isolation, but should be studied through a scientific lens that highlights the complex interactions between these forces over time (Overton, 2013a). This poses a unique challenge for developmental psychologists studying complex processes like language development. In this paper, we advocate for the combining of highly sophisticated data collection technologies in an effort to move toward a more systemic approach to studying language development. We investigate the efficiency and appropriateness of combining eye-tracking technology and the LENA (Language Environment Analysis) system, an automated language analysis tool, in an effort to explore the relation between language processing in early development, and external dynamic influences like parent and educator language input in the home and school environments. Eye-tracking allows us to study language processing via eye movement analysis; these eye movements have been linked to both conscious and unconscious cognitive processing, and thus provide one means of evaluating cognitive processes underlying language development that does not require the use of subjective parent reports or checklists. The LENA system, on the other hand, provides automated language output that describes a child’s language-rich environment. In combination, these technologies provide critical information not only about a child’s language processing abilities but also about the complexity of the child’s language environment. Thus, when used in conjunction these technologies allow researchers to explore the nature of interacting systems involved in language development. PMID:26379591
A relationship between eye movement patterns and performance in a precognitive tracking task
NASA Technical Reports Server (NTRS)
Repperger, D. W.; Hartzell, E. J.
1977-01-01
Eye movements made by various subjects in the performance of a precognitive tracking task are studied. The tracking task presented by an antiaircraft artillery (AAA) simulator has an input forcing function represented by a deterministic aircraft fly-by. The performance of subjects is ranked by two metrics. Good, mediocre, and poor trackers are selected for analysis based on performance during the difficult segment of the tracking task and over replications. Using phase planes to characterize both the eye movement patterns and the displayed error signal, a simple metric is developed to study these patterns. Two characterizations of eye movement strategies are defined and quantified. Using these two types of eye strategies, two conclusions are obtained about good, mediocre, and poor trackers. First, subjects who used a fixed eye-movement strategy consistently performed better. Second, the best fixed strategy is the Crosshair Fixator.
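For readers unfamiliar with the construction, a phase-plane trajectory is simply a signal plotted against its time derivative. The sketch below builds that representation for a sampled eye-position or error trace; the paper's specific strategy metric on these trajectories is not reproduced.

```python
import numpy as np

def phase_plane(signal, fs):
    """Return (value, time-derivative) pairs for a sampled eye-position or
    tracking-error trace, i.e. the generic phase-plane construction.
    `fs` is the sampling rate in Hz."""
    s = np.asarray(signal, dtype=float)
    return np.column_stack([s, np.gradient(s) * fs])
```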
Tracking the truth: the effect of face familiarity on eye fixations during deception.
Millen, Ailsa E; Hope, Lorraine; Hillstrom, Anne P; Vrij, Aldert
2017-05-01
In forensic investigations, suspects sometimes conceal recognition of a familiar person to protect co-conspirators or hide knowledge of a victim. The current experiment sought to determine whether eye fixations could be used to identify memory of known persons when lying about recognition of faces. Participants' eye movements were monitored whilst they lied and told the truth about recognition of faces that varied in familiarity (newly learned, famous celebrities, personally known). Memory detection by eye movements during recognition of personally familiar and famous celebrity faces was negligibly affected by lying, thereby demonstrating that detection of memory during lies is influenced by the prior learning of the face. By contrast, eye movements did not reveal lies robustly for newly learned faces. These findings support the use of eye movements as markers of memory during concealed recognition but also suggest caution when familiarity is only a consequence of one brief exposure.
Gender Classification Based on Eye Movements: A Processing Effect During Passive Face Viewing
Sammaknejad, Negar; Pouretemad, Hamidreza; Eslahchi, Changiz; Salahirad, Alireza; Alinejad, Ashkan
2017-01-01
Studies have revealed superior face recognition skills in females, partially due to their different eye movement strategies when encoding faces. In the current study, we utilized these slight but important differences and proposed a model that estimates the gender of the viewers and classifies them into two subgroups, males and females. An eye tracker recorded participants’ eye movements while they viewed images of faces. Regions of interest (ROIs) were defined for each face. Results showed that the gender dissimilarity in eye movements was not due to differences in frequency of fixations in the ROIs per se. Instead, it was caused by dissimilarity in saccade paths between the ROIs. The difference was enhanced when saccades were towards the eyes. Females showed a significant increase in transitions from other ROIs to the eyes. Consequently, the extraction of temporal transient information of saccade paths through a transition probability matrix, similar to a first-order Markov chain model, significantly improved the accuracy of the gender classification results. PMID:29071007
Gender Classification Based on Eye Movements: A Processing Effect During Passive Face Viewing.
Sammaknejad, Negar; Pouretemad, Hamidreza; Eslahchi, Changiz; Salahirad, Alireza; Alinejad, Ashkan
2017-01-01
Studies have revealed superior face recognition skills in females, partially due to their different eye movement strategies when encoding faces. In the current study, we utilized these slight but important differences and proposed a model that estimates the gender of the viewers and classifies them into two subgroups, males and females. An eye tracker recorded participants' eye movements while they viewed images of faces. Regions of interest (ROIs) were defined for each face. Results showed that the gender dissimilarity in eye movements was not due to differences in frequency of fixations in the ROIs per se. Instead, it was caused by dissimilarity in saccade paths between the ROIs. The difference was enhanced when saccades were towards the eyes. Females showed a significant increase in transitions from other ROIs to the eyes. Consequently, the extraction of temporal transient information of saccade paths through a transition probability matrix, similar to a first-order Markov chain model, significantly improved the accuracy of the gender classification results.
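A first-order transition probability matrix of the kind described can be estimated directly from a viewer's sequence of fixated ROIs. The sketch below is a generic illustration with hypothetical ROI labels, not the authors' pipeline.

```python
import numpy as np

def roi_transition_matrix(roi_sequence, n_rois):
    """First-order Markov transition probabilities between face ROIs,
    estimated from one viewer's sequence of fixated ROI labels
    (e.g. 0 = left eye, 1 = right eye, 2 = nose, 3 = mouth; labels here
    are illustrative). T[i, j] = P(next fixation in j | current in i)."""
    counts = np.zeros((n_rois, n_rois))
    for a, b in zip(roi_sequence, roi_sequence[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums,
                     out=np.zeros_like(counts), where=row_sums > 0)

# The flattened matrix can then serve as the feature vector for a
# male/female classifier, which is the idea reported above.
features = roi_transition_matrix([2, 0, 1, 0, 2, 3, 0, 1], n_rois=4).ravel()
```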
Predicting the Valence of a Scene from Observers’ Eye Movements
R.-Tavakoli, Hamed; Atyabi, Adham; Rantanen, Antti; Laukka, Seppo J.; Nefti-Meziani, Samia; Heikkilä, Janne
2015-01-01
Multimedia analysis benefits from understanding the emotional content of a scene in a variety of tasks such as video genre classification and content-based image retrieval. Recently, there has been an increasing interest in applying human bio-signals, particularly eye movements, to recognize the emotional gist of a scene such as its valence. In order to determine the emotional category of images using eye movements, the existing methods often learn a classifier using several features that are extracted from eye movements. Although it has been shown that eye movement is potentially useful for recognition of scene valence, the contribution of each feature is not well-studied. To address the issue, we study the contribution of features extracted from eye movements in the classification of images into pleasant, neutral, and unpleasant categories. We assess ten features and their fusion. The features are histogram of saccade orientation, histogram of saccade slope, histogram of saccade length, histogram of saccade duration, histogram of saccade velocity, histogram of fixation duration, fixation histogram, top-ten salient coordinates, and saliency map. We utilize a machine learning approach to analyze the performance of features by learning a support vector machine and exploiting various feature fusion schemes. The experiments reveal that ‘saliency map’, ‘fixation histogram’, ‘histogram of fixation duration’, and ‘histogram of saccade slope’ are the most contributing features. The selected features signify the influence of fixation information and angular behavior of eye movements in the recognition of the valence of images. PMID:26407322
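A reduced sketch of the feature-extraction-plus-SVM pipeline is shown below using scikit-learn. It covers only histogram-style features with early fusion (the saliency-map and salient-coordinate features are omitted), and the histogram ranges, bin counts, and classifier settings are illustrative rather than those of the study.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def histogram_features(amplitudes, durations, orientations, fixation_durs,
                       n_bins=8):
    """Concatenate normalized histograms of a trial's saccade amplitudes,
    saccade durations, saccade orientations, and fixation durations into
    one descriptor (a reduced stand-in for the feature set studied above).
    Histogram ranges are illustrative."""
    specs = [(amplitudes, (0, 20)),          # deg
             (durations, (0, 0.1)),          # s
             (orientations, (-np.pi, np.pi)),
             (fixation_durs, (0, 1.0))]      # s
    parts = [np.histogram(v, bins=n_bins, range=rng, density=True)[0]
             for v, rng in specs]
    return np.concatenate(parts)

# Early fusion: concatenated features feed one RBF-kernel SVM that separates
# pleasant / neutral / unpleasant images (fit with clf.fit(X, y) on a
# labeled eye-movement dataset).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
```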
Disk space and load time requirements for eye movement biometric databases
NASA Astrophysics Data System (ADS)
Kasprowski, Pawel; Harezlak, Katarzyna
2016-06-01
Biometric identification is currently a very popular area of interest. Problems with the so-called physiological methods, such as fingerprint or iris recognition, have increased attention to methods that measure behavioral patterns. Eye movement based biometric (EMB) identification is one such behavioral method, and the intensive development of eye-tracking devices has made it possible to define new methods for eye movement signal processing. Such methods should be supported by efficient storage used to collect eye movement data and provide it for further analysis. The aim of the research was to evaluate various setups enabling such a storage choice. Several aspects were taken into consideration, such as disk space usage and the time required for loading and saving the whole data set or chosen parts of it.
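A crude way to compare storage setups along the dimensions mentioned (disk space, save and load time) is to benchmark each candidate format on the same recordings. The sketch below times a single option, a compressed NumPy archive, and is only meant to show the shape of such a comparison; the file name, format choice, and data sizes are assumptions.

```python
import os
import time
import numpy as np

def benchmark_npz(samples, path="em_session.npz"):
    """Measure on-disk size and save/load times for one candidate storage
    option (a compressed NumPy archive). A real comparison would repeat
    this for each format or database back end under test, and also for
    partial reads of chosen data subsets."""
    t0 = time.perf_counter()
    np.savez_compressed(path, gaze=samples)
    save_s = time.perf_counter() - t0
    size_bytes = os.path.getsize(path)
    t0 = time.perf_counter()
    _ = np.load(path)["gaze"]
    load_s = time.perf_counter() - t0
    return {"bytes": size_bytes, "save_s": save_s, "load_s": load_s}

# e.g. one minute of binocular gaze sampled at 1 kHz, four channels:
print(benchmark_npz(np.random.rand(60_000, 4).astype(np.float32)))
```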
Sensing Passive Eye Response to Impact Induced Head Acceleration Using MEMS IMUs.
Meng, Yuan; Bottenfield, Brent; Bolding, Mark; Liu, Lei; Adams, Mark L
2018-02-01
The eye may act as a surrogate for the brain in response to head acceleration during an impact. Passive eye movements in a dynamic system are sensed by microelectromechanical systems (MEMS) inertial measurement units (IMU) in this paper. The technique is validated using a three-dimensional printed scaled human skull model and on human volunteers by performing drop-and-impact experiments with ribbon-style flexible printed circuit board IMUs inserted in the eyes and reference IMUs on the heads. Data are captured by a microcontroller unit and processed using data fusion. Displacements are thus estimated and match the measured parameters. Relative accelerations and displacements of the eye to the head are computed indicating the influence of the concussion causing impacts.
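Assuming time-aligned, same-axis acceleration traces from the eye-mounted and head-mounted IMUs, eye-relative-to-head displacement over a short impact window can be approximated by double integration. The sketch below omits the bias removal and gyroscope fusion a real pipeline would need; the function name and units are assumptions.

```python
import numpy as np

def relative_displacement(a_eye, a_head, fs):
    """Double-integrate the difference between eye-mounted and head-mounted
    accelerometer traces (same axis, time-aligned, m/s^2, sampled at fs Hz)
    to approximate eye-relative-to-head displacement during an impact.
    Drift removal and fusion with the gyroscope data are omitted here."""
    dt = 1.0 / fs
    a_rel = np.asarray(a_eye, dtype=float) - np.asarray(a_head, dtype=float)
    v_rel = np.cumsum(a_rel) * dt     # m/s
    return np.cumsum(v_rel) * dt      # m
```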
Eye-movements intervening between two successive sounds disrupt comparisons of auditory location
Pavani, Francesco; Husain, Masud; Driver, Jon
2008-01-01
Summary Many studies have investigated how saccades may affect the internal representation of visual locations across eye-movements. Here we studied instead whether eye-movements can affect auditory spatial cognition. In two experiments, participants judged the relative azimuth (same/different) of two successive sounds presented from a horizontal array of loudspeakers, separated by a 2.5 secs delay. Eye-position was either held constant throughout the trial (being directed in a fixed manner to the far left or right of the loudspeaker array), or had to be shifted to the opposite side of the array during the retention delay between the two sounds, after the first sound but before the second. Loudspeakers were either visible (Experiment1) or occluded from sight (Experiment 2). In both cases, shifting eye-position during the silent delay-period affected auditory performance in the successive auditory comparison task, even though the auditory inputs to be judged were equivalent. Sensitivity (d′) for the auditory discrimination was disrupted, specifically when the second sound shifted in the opposite direction to the intervening eye-movement with respect to the first sound. These results indicate that eye-movements affect internal representation of auditory location. PMID:18566808
Eye-movements intervening between two successive sounds disrupt comparisons of auditory location.
Pavani, Francesco; Husain, Masud; Driver, Jon
2008-08-01
Many studies have investigated how saccades may affect the internal representation of visual locations across eye-movements. Here, we studied, instead, whether eye-movements can affect auditory spatial cognition. In two experiments, participants judged the relative azimuth (same/different) of two successive sounds presented from a horizontal array of loudspeakers, separated by a 2.5-s delay. Eye-position was either held constant throughout the trial (being directed in a fixed manner to the far left or right of the loudspeaker array) or had to be shifted to the opposite side of the array during the retention delay between the two sounds, after the first sound but before the second. Loudspeakers were either visible (Experiment 1) or occluded from sight (Experiment 2). In both cases, shifting eye-position during the silent delay-period affected auditory performance in the successive auditory comparison task, even though the auditory inputs to be judged were equivalent. Sensitivity (d') for the auditory discrimination was disrupted, specifically when the second sound shifted in the opposite direction to the intervening eye-movement with respect to the first sound. These results indicate that eye-movements affect internal representation of auditory location.
Budelmann, B U; Young, J Z
1993-04-29
Fourteen extraocular eye muscles are described in the decapods Loligo and Sepioteuthis, and thirteen in Sepia; they are supplied by four eye muscle nerves. The main action of most of the muscles is a linear movement of the eyeball, only three muscles produce strong rotations. The arrangement, innervation and action of the decapod eye muscles are compared with those of the seven eye muscles and seven eye muscle nerves in Octopus. The extra muscles in decapods are attached to the anterior and superior faces of the eyes. At least, the anterior muscles, and presumably also the superior muscles, are concerned with convergent eye movements for binocular vision during fixation and capture of prey by the tentacles. The remaining muscles are rather similar in the two cephalopod groups. In decapods, the anterior muscles include conjunctive muscles; these cross the midline and each presumably moves both eyes at the same time during fixation. In the squids Loligo and Sepioteuthis there is an additional superior conjunctive muscle of perhaps similar function. Some of the anterior muscles are associated with a narrow moveable plate, the trochlear cartilage; it is attached to the eyeball by trochlear membranes. Centripetal cobalt fillings showed that all four eye muscle nerves have fibres that originate from somata in the ipsilateral anterior lateral pedal lobe, which is the oculomotor centre. The somata of the individual nerves show different but overlapping distributions. Bundles of small presumably afferent fibres were seen in two of the four nerves. They do not enter the anterior lateral pedal lobe but run to the ventral magnocellular lobe; some afferent fibres enter the brachio-palliovisceral connective and run perhaps as far as the palliovisceral lobe.
Tracking Students' Cognitive Processes during Program Debugging--An Eye-Movement Approach
ERIC Educational Resources Information Center
Lin, Yu-Tzu; Wu, Cheng-Chih; Hou, Ting-Yun; Lin, Yu-Chih; Yang, Fang-Ying; Chang, Chia-Hu
2016-01-01
This study explores students' cognitive processes while debugging programs by using an eye tracker. Students' eye movements during debugging were recorded by an eye tracker to investigate whether and how high- and low-performance students act differently during debugging. Thirty-eight computer science undergraduates were asked to debug two C…
Eye-Movement Patterns Are Associated with Communicative Competence in Autistic Spectrum Disorders
ERIC Educational Resources Information Center
Norbury, Courtenay Frazier; Brock, Jon; Cragg, Lucy; Einav, Shiri; Griffiths, Helen; Nation, Kate
2009-01-01
Background: Investigations using eye-tracking have reported reduced fixations to salient social cues such as eyes when participants with autism spectrum disorders (ASD) view social scenes. However, these studies have not distinguished different cognitive phenotypes. Methods: The eye-movements of 28 teenagers with ASD and 18 typically developing…
Improving Silent Reading Performance through Feedback on Eye Movements: A Feasibility Study
ERIC Educational Resources Information Center
Korinth, Sebastian P.; Fiebach, Christian J.
2018-01-01
This feasibility study investigated if feedback about individual eye movements, reflecting varying word processing stages, can improve reading performance. Twenty-five university students read 90 newspaper articles during 9 eye-tracking sessions. Training group participants (n = 12) were individually briefed before each session, which eye movement…
A low-cost video-oculography system for vestibular function testing.
Jihwan Park; Youngsun Kong; Yunyoung Nam
2017-07-01
To keep gaze in focus during head movements, the vestibulo-ocular reflex causes the eyes to move in the direction opposite to the head movement. Disorders of the vestibular system degrade vision, causing abnormal nystagmus and dizziness. Various tests for diagnosing abnormal nystagmus have been reported, including rotating chair tests and videonystagmography. However, these tests are unsuitable for home use due to their high costs. Thus, a low-cost video-oculography system is necessary to obtain clinical features at home. In this paper, we present a low-cost video-oculography system using an infrared camera and a Raspberry Pi board for tracking the pupils and evaluating the vestibular system. Horizontal eye movement is derived from video data obtained with an infrared camera and infrared light-emitting diodes, and the velocity of head rotation is obtained from a gyroscope sensor. Each pupil was extracted using a morphology operation and a contour detection method. Rotatory chair tests were conducted with our device. To evaluate the system, gain, asymmetry, and phase were measured and compared with System 2000. The average IQR errors of gain, phase, and asymmetry were 0.81, 2.74, and 17.35, respectively. We showed that our system is able to measure clinical features.
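The morphology-plus-contour pupil extraction described above is commonly implemented with OpenCV. The sketch below shows one possible version (assuming OpenCV 4.x), with an illustrative threshold and kernel size; it is not the authors' Raspberry Pi code.

```python
import cv2
import numpy as np

def pupil_center(gray_frame, dark_thresh=50):
    """Sketch of the pupil-extraction steps described above: threshold the
    dark pupil, clean the mask with a morphological opening, take the
    largest contour, and return its centroid (x, y) in pixels. The
    threshold and kernel size are camera-dependent placeholders."""
    _, mask = cv2.threshold(gray_frame, dark_thresh, 255, cv2.THRESH_BINARY_INV)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]
```

Tracking the horizontal component of this centroid over frames, alongside the gyroscope trace, is what allows VOR gain, phase, and asymmetry to be computed.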
Oculomotor Evidence for Top-Down Control following the Initial Saccade
Siebold, Alisha; van Zoest, Wieske; Donk, Mieke
2011-01-01
The goal of the current study was to investigate how salience-driven and goal-driven processes unfold during visual search over multiple eye movements. Eye movements were recorded while observers searched for a target, which was located on (Experiment 1) or defined as (Experiment 2) a specific orientation singleton. This singleton could either be the most, medium, or least salient element in the display. Results were analyzed as a function of response time separately for initial and second eye movements. Irrespective of the search task, initial saccades elicited shortly after the onset of the search display were primarily salience-driven whereas initial saccades elicited after approximately 250 ms were completely unaffected by salience. Initial saccades were increasingly guided in line with task requirements with increasing response times. Second saccades were completely unaffected by salience and were consistently goal-driven, irrespective of response time. These results suggest that stimulus-salience affects the visual system only briefly after a visual image enters the brain and has no effect thereafter. PMID:21931603
Sunkara, Adhira
2015-01-01
As we navigate through the world, eye and head movements add rotational velocity patterns to the retinal image. When such rotations accompany observer translation, the rotational velocity patterns must be discounted to accurately perceive heading. The conventional view holds that this computation requires efference copies of self-generated eye/head movements. Here we demonstrate that the brain implements an alternative solution in which retinal velocity patterns are themselves used to dissociate translations from rotations. These results reveal a novel role for visual cues in achieving a rotation-invariant representation of heading in the macaque ventral intraparietal area. Specifically, we show that the visual system utilizes both local motion parallax cues and global perspective distortions to estimate heading in the presence of rotations. These findings further suggest that the brain is capable of performing complex computations to infer eye movements and discount their sensory consequences based solely on visual cues. DOI: http://dx.doi.org/10.7554/eLife.04693.001 PMID:25693417
Saccade preparation is required for exogenous attention but not endogenous attention or IOR.
Smith, Daniel T; Schenk, Thomas; Rorden, Chris
2012-12-01
Covert attention is tightly coupled with the control of eye movements, but there is controversy about how tight this coupling is. The premotor theory of attention proposes that activation of the eye movement system is necessary to produce shifts of attention. In this study, we experimentally prevented healthy participants from planning or executing eye movements and observed the effect on exogenous attention, inhibition of return (IOR), and endogenous attention. The participants experienced a deficit of exogenous attentional facilitation that was specific to locations that were inaccessible by saccade. In contrast, their ability to endogenously orient attention was preserved, as was IOR. These results show that (a) exogenous attention depends on motor preparation, (b) IOR is independent of motor preparation and exogenous attention, and (c) endogenous attention is independent of motor preparation. Although these data are consistent with a weak version of the premotor theory, we believe they can be better explained by a biased competition account of visual attention.
General purpose algorithms for characterization of slow and fast phase nystagmus
NASA Technical Reports Server (NTRS)
Lessard, Charles S.
1987-01-01
As part of the overall aim of better understanding the vestibular and optokinetic systems and their roles in space motion sickness, the eye movement responses to various dynamic stimuli are measured. The vestibulo-ocular reflex (VOR) and the optokinetic response, as the eye movement responses are known, consist of slow-phase and fast-phase nystagmus. The specific objective is to develop the software programs necessary to characterize the vestibulo-ocular and optokinetic responses by distinguishing between the two phases of nystagmus. The overall program must handle large volumes of highly variable data with minimal operator interaction. The programs include digital filters, differentiation, identification of fast phases, and reconstruction of the slow phase with a least-squares fit, such that sinusoidal or pseudorandom data may be processed with accurate results. The resultant waveform, the slow-phase eye velocity, serves as input data to the spectral analysis programs previously developed for NASA to analyze nystagmus responses to pseudorandom angular velocity inputs.
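A minimal sketch of such a slow/fast-phase separation pipeline is given below, assuming a uniformly sampled eye-position trace: low-pass filtering, differentiation, velocity-threshold detection of fast phases, and reconstruction of slow-phase velocity. Linear interpolation stands in here for the least-squares reconstruction mentioned in the abstract, and the cutoff and threshold values are illustrative, not the values used in the NASA programs.

```python
import numpy as np
from scipy.signal import butter, filtfilt


def slow_phase_velocity(position, fs, cutoff_hz=30.0, fast_thresh=60.0):
    """Separate slow and fast nystagmus phases from an eye-position trace.

    position : eye position in degrees, sampled at fs Hz.
    Returns the reconstructed slow-phase velocity (deg/s) and a boolean
    mask marking samples classified as fast phases.
    """
    # Zero-phase low-pass filtering of the position signal
    b, a = butter(4, cutoff_hz / (fs / 2.0), btype="low")
    pos_f = filtfilt(b, a, position)

    # Numerical differentiation -> eye velocity in deg/s
    vel = np.gradient(pos_f) * fs

    # Fast phases exceed the (assumed) velocity threshold
    fast = np.abs(vel) > fast_thresh
    if fast.all() or not fast.any():
        return vel, fast

    # Reconstruct slow-phase velocity by interpolating across fast phases
    # (interpolation used here in place of the least-squares fit)
    t = np.arange(len(vel))
    spv = vel.copy()
    spv[fast] = np.interp(t[fast], t[~fast], vel[~fast])
    return spv, fast
```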
Role of retinal slip in the prediction of target motion during smooth and saccadic pursuit.
de Brouwer, S; Missal, M; Lefèvre, P
2001-08-01
Visual tracking of moving targets requires the combination of smooth pursuit eye movements with catch-up saccades. In primates, catch-up saccades usually take place only during pursuit initiation because pursuit gain is close to unity. This contrasts with the lower and more variable gain of smooth pursuit in cats, where smooth eye movements are intermingled with catch-up saccades during steady-state pursuit. In this paper, we studied in detail the role of retinal slip in the prediction of target motion during smooth and saccadic pursuit in the cat. We found that the typical pattern of pursuit in the cat was a combination of smooth eye movements with saccades. During smooth pursuit initiation, there was a correlation between peak eye acceleration and target velocity. During pursuit maintenance, eye velocity oscillated at approximately 3 Hz around a steady-state value. The average gain of smooth pursuit was approximately 0.5. Trained cats were able to continue pursuing in the absence of a visible target, suggesting a role for prediction of future target motion in this species. The analysis of catch-up saccades showed that the smooth-pursuit motor command is added to the saccadic command during catch-up saccades and that both position error and retinal slip are taken into account in their programming. The influence of retinal slip on catch-up saccades showed that prediction about future target motion is used in the programming of catch-up saccades. Altogether, these results suggest that pursuit systems in primates and cats are qualitatively similar, with a lower average gain in the cat, and that prediction affects both saccades and smooth eye movements during pursuit.
Vernet, Marine; Quentin, Romain; Chanes, Lorena; Mitsumasu, Andres; Valero-Cabré, Antoni
2014-01-01
The planning, control and execution of eye movements in 3D space rely on a distributed system of cortical and subcortical brain regions. Within this network, the Eye Fields have been described in animals as cortical regions in which electrical stimulation is able to trigger eye movements and influence their latency or accuracy. This review focuses on the Frontal Eye Field (FEF), a “hub” region located in humans in the vicinity of the pre-central sulcus and the dorsal-most portion of the superior frontal sulcus. The straightforward localization of the FEF through electrical stimulation in animals is difficult to translate to the healthy human brain, particularly with non-invasive neuroimaging techniques. Hence, in the first part of this review, we describe attempts made to characterize the anatomical localization of this area in the human brain. The outcomes of functional Magnetic Resonance Imaging (fMRI), Magneto-encephalography (MEG) and, particularly, non-invasive mapping methods such as Transcranial Magnetic Stimulation (TMS) are described, and the variability of FEF localization across individuals and mapping techniques is discussed. In the second part of this review, we address the role of the FEF. We explore its involvement both in the physiology of fixation, saccade, pursuit, and vergence movements and in associated cognitive processes such as attentional orienting, visual awareness and perceptual modulation. Finally, in the third part, we review recent evidence suggesting the high level of malleability and plasticity of these regions and associated networks in response to non-invasive stimulation. The exploratory, diagnostic, and therapeutic interest of such interventions for the modulation and improvement of perception in 3D space is discussed. PMID:25202241
Perceptual impairment and psychomotor control in virtual laparoscopic surgery.
Wilson, Mark R; McGrath, John S; Vine, Samuel J; Brewer, James; Defriend, David; Masters, Richard S W
2011-07-01
It is recognised that one of the major difficulties in performing laparoscopic surgery is the translation of two-dimensional video image information to a three-dimensional working area. However, research has tended to ignore the gaze and eye-hand coordination strategies employed by laparoscopic surgeons as they attempt to overcome these perceptual constraints. This study sought to examine if measures related to tool movements, gaze strategy, and eye-hand coordination (the quiet eye) differentiate between experienced and novice operators performing a two-handed manoeuvres task on a virtual reality laparoscopic surgical simulator (LAP Mentor™). Twenty-five right-handed surgeons were categorised as being either experienced (having led more than 60 laparoscopic procedures) or novice (having performed fewer than 10 procedures) operators. The 10 experienced and 15 novice surgeons completed the "two-hand manoeuvres" task from the LAP Mentor basic skills learning environment while wearing a gaze registration system. Performance, movement, gaze, and eye-hand coordination parameters were recorded and compared between groups. The experienced surgeons completed the task significantly more quickly than the novices, used significantly fewer movements, and displayed shorter tool paths. Gaze analyses revealed that experienced surgeons spent significantly more time fixating the target locations than novices, who split their time between focusing on the targets and tracking the tools. A more detailed analysis of a difficult subcomponent of the task revealed that experienced operators used a significantly longer aiming fixation (the quiet eye period) to guide precision grasping movements and hence needed fewer grasp attempts. The findings of the study provide further support for the utility of examining strategic gaze behaviour and eye-hand coordination measures to help further our understanding of how experienced surgeons attempt to overcome the perceptual difficulties inherent in the laparoscopic environment.
Miller, Brett; O’Donnell, Carol
2013-01-01
The cumulative body of eye movement research provides significant insight into how readers process text. The heart of this work spans roughly 40 years reflecting the maturity of both the topics under study and experimental approaches used to investigate reading. Recent technological advancements offer increased flexibility to the field providing the potential to more concertedly study reading and literacy from an individual differences perspective. Historically, eye movement research focused far less on developmental issues related to individual differences in reading; however, this issue and the broader change it represents signal a meaningful transition inclusive of individual differences. The six papers in this special issue signify the recent, increased attention to and recognition of eye movement research’s transition to emphasize individual differences in reading while appreciating early contributions (e.g., Rayner, 1986) in this direction. We introduce these six papers and provide some historical context for the use of eye movement methodology to examine reading and context for the eye movement field’s early transition to examining individual differences, culminating in future research recommendations. PMID:24391304
Colour blindness of the movement-detecting system of the spider Cupiennius salei.
Orlando, Eva; Schmid, Axel
2011-02-15
The nocturnal wandering spider Cupiennius salei has one pair of principal eyes and three pairs of secondary eyes located on the prosoma, which differ in both morphology and function. Their spectral sensitivity, measured with intracellular recordings, is due to three different types of photoreceptors with absorbance maxima in the mid-range of the spectrum, at 480 nm and 520 nm, and in the UV at 360 nm. Based on these physiological data, colour vision might be possible. In the present study, the ability to discriminate coloured moving stimuli from grey backgrounds was tested. The perception of moving coloured stripes in front of backgrounds with 29 different grey levels was measured by using extracellular recordings from the anterior median eye muscles as a monitoring system. Each of these eyes has two muscles, which increase their activity when moving stimuli are presented in front of a secondary eye. This variation in eye muscle activity can be recorded extracellularly in a living spider using a single-channel telemetry device. If colour perception exists, the animal should be able to detect a moving coloured stripe in front of any grey level. Blue, green and red stripes were used as moving stimuli in front of all 29 grey backgrounds. The results indicate that C. salei is not able to discriminate the coloured stimuli from distinct shades of grey. The movement-detecting system of this spider therefore appears to be colour blind.
Spatial constancy mechanisms in motor control
Medendorp, W. Pieter
2011-01-01
The success of the human species in interacting with the environment depends on the ability to maintain spatial stability despite the continuous changes in sensory and motor inputs owing to movements of eyes, head and body. In this paper, I will review recent advances in the understanding of how the brain deals with the dynamic flow of sensory and motor information in order to maintain spatial constancy of movement goals. The first part summarizes studies in the saccadic system, showing that spatial constancy is governed by a dynamic feed-forward process, by gaze-centred remapping of target representations in anticipation of and across eye movements. The subsequent sections relate to other oculomotor behaviour, such as eye–head gaze shifts, smooth pursuit and vergence eye movements, and their implications for feed-forward mechanisms for spatial constancy. Work that studied the geometric complexities in spatial constancy and saccadic guidance across head and body movements, distinguishing between self-generated and passively induced motion, indicates that both feed-forward and sensory feedback processing play a role in spatial updating of movement goals. The paper ends with a discussion of the behavioural mechanisms of spatial constancy for arm motor control and their physiological implications for the brain. Taken together, the emerging picture is that the brain computes an evolving representation of three-dimensional action space, whose internal metric is updated in a nonlinear way, by optimally integrating noisy and ambiguous afferent and efferent signals. PMID:21242137
Endo, Takao; Fujikado, Takashi; Hirota, Masakazu; Kanda, Hiroyuki; Morimoto, Takeshi; Nishida, Kohji
2018-04-20
To evaluate the improvement in targeted reaching movements toward targets of various contrasts in a patient implanted with a suprachoroidal-transretinal stimulation (STS) retinal prosthesis. An STS retinal prosthesis was implanted in the right eye of a 42-year-old man with advanced Stargardt disease (visual acuity: right eye, light perception; left eye, hand motion). In localization tests during the 1-year follow-up period, the patient attempted to touch the center of a white square target (visual angle, 10°; contrast, 96, 85, or 74%) displayed at a random position on a monitor. The distance between the touched point and the center of the target (the absolute deviation) was averaged over 20 trials with the STS system on or off. With the left eye occluded, the absolute deviation was not consistently lower with the system on than off for high-contrast (96%) targets, but was consistently lower with the system on for low-contrast (74%) targets. With both eyes open, the absolute deviation was consistently lower with the system on than off for 85%-contrast targets. With the system on and 96%-contrast targets, we detected a shorter response time while covering the right eye, which had been implanted with the STS, compared to covering the left eye (2.41 ± 2.52 vs 8.45 ± 3.78 s, p < 0.01). Performance of a reaching movement improved in a patient with an STS retinal prosthesis implanted in an eye with residual natural vision. Patients with a retinal prosthesis may be able to improve their visual performance by using both artificial vision and their residual natural vision. Beginning date of the trial: Feb. 20, 2014. Date of registration: Jan. 4, 2014. Trial registration number: UMIN000012754. Registration site: UMIN Clinical Trials Registry (UMIN-CTR), http://www.umin.ac.jp/ctr/index.htm.
Performing saccadic eye movements or blinking improves postural control.
Rougier, Patrice; Garin, Mélanie
2007-07-01
To determine the relationship between eye movement and postural control in an undisturbed upright stance maintenance protocol, 15 young, healthy individuals were tested under various conditions. These conditions included imposed blinking patterns and horizontal and vertical saccadic eye movements. The trajectories of the center of pressure (CP) were recorded via a force platform on which the participants remained in an upright position. The CP trajectories were used to estimate, via a low-pass filter, the vertically projected movements of the center of gravity (CGv) and consequently the difference CP-CGv. A frequency analysis shows that regular bilateral blinking does not produce a significant change in postural control. In contrast, performing saccadic eye movements reduces the amplitude of both the basic CGv and the CP-CGv movements, principally along the antero-posterior axis. The present result supports the theory that some ocular movements may modify postural control during the maintenance of the upright standing position in human participants.
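The CP-to-CGv decomposition described above can be sketched as a simple low-pass filtering step; the filter order and cutoff in the sketch below are illustrative assumptions rather than the authors' values.

```python
import numpy as np
from scipy.signal import butter, filtfilt


def decompose_cp(cp, fs, cutoff_hz=0.5):
    """Split a centre-of-pressure trace into an estimated centre-of-gravity
    component (CGv) and the residual CP-CGv component.

    The approach treats CGv as a low-pass filtered version of CP, as in the
    abstract; cutoff_hz and the filter order here are assumed, not the
    authors' exact settings.
    """
    b, a = butter(2, cutoff_hz / (fs / 2.0), btype="low")
    cgv = filtfilt(b, a, np.asarray(cp, dtype=float), axis=0)  # slow component ~ CGv
    return cgv, cp - cgv                                       # (CGv, CP - CGv)
```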
The selective disruption of spatial working memory by eye movements
Postle, Bradley R.; Idzikowski, Christopher; Sala, Sergio Della; Logie, Robert H.; Baddeley, Alan D.
2005-01-01
In the late 1970s/early 1980s, Baddeley and colleagues conducted a series of experiments investigating the role of eye movements in visual working memory. Although only described briefly in a book (Baddeley, 1986), these studies have influenced a remarkable number of empirical and theoretical developments in fields ranging from experimental psychology to human neuropsychology to nonhuman primate electrophysiology. This paper presents, in full detail, three critical studies from this series, together with a recently performed study that includes a level of eye movement measurement and control that was not available for the older studies. Together, the results demonstrate several facts about the sensitivity of visuospatial working memory to eye movements. First, it is eye movement control, not movement per se, that produces the disruptive effects. Second, these effects are limited to working memory for locations, and do not generalize to visual working memory for shapes. Third, they can be isolated to the storage/maintenance components of working memory (e.g., to the delay period of the delayed-recognition task). These facts have important implications for models of visual working memory. PMID:16556561
Rolfs, Martin; Carrasco, Marisa
2012-01-01
Humans and other animals with foveate vision make saccadic eye movements to prioritize the visual analysis of behaviorally relevant information. Even before movement onset, visual processing is selectively enhanced at the target of a saccade, presumably gated by brain areas controlling eye movements. Here we assess concurrent changes in visual performance and perceived contrast before saccades, and show that saccade preparation enhances perception rapidly, altering early visual processing in a manner akin to increasing the physical contrast of the visual input. Observers compared orientation and contrast of a test stimulus, appearing briefly before a saccade, to a standard stimulus, presented previously during a fixation period. We found simultaneous progressive enhancement in both orientation discrimination performance and perceived contrast as time approached saccade onset. These effects were robust as early as 60 ms after the eye movement was cued, much faster than the voluntary deployment of covert attention (without eye movements), which takes ~300 ms. Our results link the dynamics of saccade preparation, visual performance, and subjective experience and show that upcoming eye movements alter visual processing by increasing the signal strength. PMID:23035086
Measuring and tracking eye movements of a behaving archer fish by real-time stereo vision.
Ben-Simon, Avi; Ben-Shahar, Ohad; Segev, Ronen
2009-11-15
The archer fish (Toxotes chatareus) exhibits unique visual behavior: it is able to aim at insects resting on the foliage above the water level, shoot them down with a squirt of water, and then feed on them. This extreme behavior requires excellent visual acuity, learning, and tight synchronization between the visual system and body motion. This behavior also raises many important questions, such as the fish's ability to compensate for air-water refraction and the neural mechanisms underlying target acquisition. While many such questions remain open, significant insights towards solving them can be obtained by tracking the eye and body movements of freely behaving fish. Unfortunately, existing tracking methods suffer from either a high level of invasiveness or low resolution. Here, we present a video-based eye tracking method for accurately and remotely measuring the eye and body movements of a freely moving, behaving fish. Based on a stereo vision system and a unique triangulation method that corrects for air-glass-water refraction, we are able to measure the full three-dimensional pose of the fish eye and body with high temporal and spatial resolution. Our method, being generic, can be applied to studying the behavior of marine animals in general. We demonstrate how data collected by our method may be used to show that the hunting behavior of the archer fish is composed of surfacing concomitant with rotating the body around the direction of the fish's fixed gaze towards the target, until the snout reaches the correct shooting position at water level.
Croft, Mary Ann; McDonald, Jared P; James, Rebecca J; Heatley, Gregg A; Lin, Ting-Li; Lütjen-Drecoll, Elke; Kaufman, Paul L
2008-12-01
To determine how surgically altering the normal relationship between the lens and the ciliary body in rhesus monkeys affects centripetal ciliary body and lens movement. In 18 rhesus monkey eyes (aged 6-27 years), accommodation was induced before and after surgery by electrical stimulation of the Edinger-Westphal nucleus. Accommodative amplitude was measured by coincidence refractometry. Goniovideography was performed before and after intra- and extracapsular lens extraction (ICLE, ECLE) and anterior regional zonulolysis (ARZ). Centripetal lens/capsule movements, centripetal ciliary process (CP) movements, and circumlental space were measured by computerized image analysis of the goniovideography images. Centripetal accommodative CP and capsule movement increased in velocity and amplitude after, compared with before, ECLE regardless of age (n = 5). The presence of the lens substance retarded capsule movement by approximately 21% in the young eyes and by approximately 62% in the older eyes. Post-ICLE compared with pre-ICLE centripetal accommodative CP movement was dampened in all eyes in which the anterior vitreous was disrupted (n = 7), but not in eyes in which the anterior vitreous was left intact (n = 2). After anterior regional zonulolysis (n = 4), lens position shifted toward the lysed quadrant during accommodation. The presence of the lens substance, capsule zonular attachments, and Wieger's ligament may play a role in centripetal CP movement. The capsule is still capable of centripetal movement in the older eye (although at a reduced capacity) and may have the ability to produce approximately 6 D of accommodation in the presence of a normal, young crystalline lens or a similar surrogate.
Upward gaze and head deviation with frontal eye field stimulation.
Kaiboriboon, Kitti; Lüders, Hans O; Miller, Jonathan P; Leigh, R John
2012-03-01
Using electrical stimulation of the deep, most caudal part of the right frontal eye field (FEF), we demonstrate a novel pattern of vertical (upward) eye movement that was previously thought possible only by stimulating both frontal eye fields simultaneously. If stimulation was started when the subject looked laterally, the initial eye movement was back to the midline, followed by upward deviation. Our finding challenges the current view of topological organisation in the human FEF and may have general implications for concepts of topological organisation of the motor cortex, since sustained stimulation also induced upward head movements as a component of the vertical gaze shift. [Published with video sequences].
Grossberg, Stephen; Srinivasan, Karthik; Yazdanbakhsh, Arash
2015-01-01
How does the brain maintain stable fusion of 3D scenes when the eyes move? Every eye movement causes each retinal position to process a different set of scenic features, and thus the brain needs to binocularly fuse new combinations of features at each position after an eye movement. Despite these breaks in retinotopic fusion due to each movement, previously fused representations of a scene in depth often appear stable. The 3D ARTSCAN neural model proposes how the brain does this by unifying concepts about how multiple cortical areas in the What and Where cortical streams interact to coordinate processes of 3D boundary and surface perception, spatial attention, invariant object category learning, predictive remapping, eye movement control, and learned coordinate transformations. The model explains data from single neuron and psychophysical studies of covert visual attention shifts prior to eye movements. The model further clarifies how perceptual, attentional, and cognitive interactions among multiple brain regions (LGN, V1, V2, V3A, V4, MT, MST, PPC, LIP, ITp, ITa, SC) may accomplish predictive remapping as part of the process whereby view-invariant object categories are learned. These results build upon earlier neural models of 3D vision and figure-ground separation and the learning of invariant object categories as the eyes freely scan a scene. A key process concerns how an object's surface representation generates a form-fitting distribution of spatial attention, or attentional shroud, in parietal cortex that helps maintain the stability of multiple perceptual and cognitive processes. Predictive eye movement signals maintain the stability of the shroud, as well as of binocularly fused perceptual boundaries and surface representations. PMID:25642198
Saccadic eye movements as a measure of residual effects: temazepam compared with other hypnotics.
Hofferberth, B; Hirschberg; Grotemeyer
1986-01-01
Eye movements are classified into two categories: rapid saccades and smooth pursuit movements. Saccades are fast conjugate eye movements with a preprogrammed direction, amplitude, and speed course; their purpose is to register new objects in the visual field. The duration and velocity of saccadic eye movements are strongly dependent on vigilance. Comparisons were made between a number of psychometric tests [the d2 Durchstreichtest (cross-out test), the Viennese determination apparatus, and flicker fusion frequency] and the velocity of fast eye movements. The results of three separate investigations are presented. Standardization was undertaken in 100 healthy volunteers; 50 male and 50 female subjects aged from 20 to over 50 years were included. In an open parallel-group study, comparisons were made between various hypnotics with different half-lives (temazepam, flunitrazepam, flurazepam, and phenobarbital). There were 10 healthy volunteers in each group, and medication was taken as a single night-time dose for 7 nights. In a double-blind study, temazepam (20 mg/day) was tested against flunitrazepam (2 mg/day). Dosing lasted 7 days. A marked impairment of saccadic eye movements was observed with flunitrazepam but not with temazepam. Of all the benzodiazepines tested, only temazepam had no influence on the parameters of the saccade test. These results can be explained by temazepam's short half-life and by the fact that no active metabolites are formed.
Benzodiazepines impair smooth pursuit eye movements.
Bittencourt, P R; Wade, P; Smith, A T; Richens, A
1983-01-01
Five healthy male volunteers received single oral doses of 10 mg diazepam, 20 mg temazepam and placebo in a double-blind, randomised fashion. Smooth pursuit eye movement velocity and serum benzodiazepine concentration were measured before and at 0.5, 1, 1.5, 2, 3, 4, 6, 9 and 12 h after administration of the treatments. A significant decrease in smooth pursuit eye movement velocity compared with placebo was observed between 0.5 and 2 h after temazepam, and between 1 and 2 h after diazepam. Smooth pursuit eye movement velocity was log-linearly correlated with serum temazepam and diazepam concentrations. The results demonstrate the relationship between serum benzodiazepine concentration and its effect on an objective measure of oculomotor performance. PMID:6133544
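The reported log-linear relation amounts to regressing pursuit velocity on the logarithm of serum concentration; a toy sketch of that fit is shown below with placeholder inputs (the study's actual statistical procedure is not reproduced here).

```python
import numpy as np


def log_linear_fit(concentration, spem_velocity):
    """Fit pursuit velocity ~ a + b * log(serum concentration).

    Both input arrays are placeholders for illustration; returns the slope,
    intercept, and Pearson r computed on the log-transformed concentrations.
    """
    x = np.log(np.asarray(concentration, dtype=float))
    y = np.asarray(spem_velocity, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    r = np.corrcoef(x, y)[0, 1]
    return slope, intercept, r
```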
Andersson, Richard; Larsson, Linnea; Holmqvist, Kenneth; Stridh, Martin; Nyström, Marcus
2017-04-01
Almost all eye-movement researchers use algorithms to parse raw data and detect distinct types of eye movement events, such as fixations, saccades, and pursuit, and then base their results on these. Surprisingly, these algorithms are rarely evaluated. We evaluated the classifications of ten eye-movement event detection algorithms, on data from an SMI HiSpeed 1250 system, and compared them to manual ratings by two human experts. The evaluation focused on fixations, saccades, and post-saccadic oscillations. The evaluation used both event duration parameters and sample-by-sample comparisons to rank the algorithms. The resulting event durations varied substantially as a function of which algorithm was used. This evaluation differed from previous evaluations by considering a relatively large set of algorithms, multiple event types, and data from both static and dynamic stimuli. The main conclusion is that current detectors of only fixations and saccades work reasonably well for static stimuli, but barely better than chance for dynamic stimuli. Differing results across evaluation methods make it difficult to select one winner for fixation detection. For saccade detection, however, the algorithm by Larsson, Nyström and Stridh (IEEE Transactions on Biomedical Engineering, 60(9):2484-2493, 2013) outperforms all other algorithms on data from both static and dynamic stimuli. The data also show how improperly selected algorithms applied to dynamic data misestimate fixation and saccade properties.
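One common way to realise the sample-by-sample comparison mentioned above is Cohen's kappa computed over per-sample event labels; the sketch below shows that calculation, without implying it is the exact agreement statistic used in the evaluation.

```python
import numpy as np


def cohens_kappa(labels_a, labels_b):
    """Sample-by-sample agreement between two event labelings,
    e.g. 'fixation' / 'saccade' / 'PSO' per raw data sample,
    corrected for chance agreement.
    """
    a = np.asarray(labels_a)
    b = np.asarray(labels_b)
    assert a.shape == b.shape, "label streams must be the same length"
    classes = np.union1d(a, b)
    po = np.mean(a == b)                        # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c)  # expected chance agreement
             for c in classes)
    return (po - pe) / (1.0 - pe)
```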
Hypothesized eye movements of neurolinguistic programming: a statistical artifact.
Farmer, A; Rooney, R; Cunningham, J R
1985-12-01
Neurolinguistic programming's hypothesized eye movements were measured independently from videotapes of 30 subjects, aged 15 to 76 yr., who were asked to recall visual pictures, recorded audio sounds, and textural objects. Chi-squared tests indicated that subjects' responses were significantly different from those predicted. When the chi-squared comparisons were weighted by the number of eye positions assigned to each modality (3 visual, 3 auditory, 1 kinesthetic), subjects' responses did not differ significantly from the expected pattern. These data indicate that the eye-movement hypothesis may represent randomly occurring rather than sensory-modality-related positions.
ERIC Educational Resources Information Center
Henderson, John M.; Nuthmann, Antje; Luke, Steven G.
2013-01-01
Recent research on eye movements during scene viewing has primarily focused on where the eyes fixate. But eye fixations also differ in their durations. Here we investigated whether fixation durations in scene viewing are under the direct and immediate control of the current visual input. Subjects freely viewed photographs of scenes in preparation…
ERIC Educational Resources Information Center
Nieuwenhuis, Sander; Elzinga, Bernet M.; Ras, Priscilla H.; Berends, Floris; Duijs, Peter; Samara, Zoe; Slagter, Heleen A.
2013-01-01
Recent research has shown superior memory retrieval when participants make a series of horizontal saccadic eye movements between the memory encoding phase and the retrieval phase compared to participants who do not move their eyes or move their eyes vertically. It has been hypothesized that the rapidly alternating activation of the two hemispheres…
Evaluating camouflage design using eye movement data.
Lin, Chiuhsiang Joe; Chang, Chi-Chan; Lee, Yung-Hui
2014-05-01
This study investigates the characteristics of eye movements during a camouflaged target search task. Camouflaged targets were randomly presented on two natural landscapes. The performance of each camouflage design was assessed by target detection hit rate, detection time, number of fixations on the display, first saccade amplitude to the target, number of fixations on the target, fixation duration on the target, and subjective ratings of search task difficulty. The results showed that the camouflage patterns significantly affected eye-movement behavior, especially first saccade amplitude and fixation duration, and these findings could be used to increase the sensitivity of camouflage assessment. We hypothesize that the assessment reflects differences in the detectability and discriminability of the camouflage patterns, which could explain the less efficient search behavior observed in the eye movements. Overall, data obtained from eye movements can significantly enhance the interpretation of the effects of different camouflage designs. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Banerjee, Jayeeta; Majumdar, Dhurjati; Majumdar, Deepti; Pal, Madhu Sudan
2010-06-01
We are experiencing a shift of media from the printed page to the computer screen. This transition is modifying how we read and understand text. It is difficult to draw conclusions about the suitability of font characters on the basis of subjective evaluation alone. The present study evaluates the effect of font type on human cognitive workload during the perception of individual alphabets on a computer screen. Twenty-six young subjects volunteered for this study. Subjects were shown individual characters of different font types while their eye movements were recorded with a binocular eye movement recorder. The results showed that several eye movement parameters, such as pupil diameter, number of fixations, and fixation duration, were lower for the font type Verdana. The present study therefore recommends the use of Verdana for the presentation of individual alphabets on electronic displays in order to reduce cognitive workload.
Supèr, Hans; van der Togt, Chris; Spekreijse, Henk; Lamme, Victor A. F.
2004-01-01
We continuously scan the visual world via rapid or saccadic eye movements. Such eye movements are guided by visual information, and thus the oculomotor structures that determine when and where to look need visual information to control the eye movements. To determine whether visual areas contain activity that may contribute to the control of eye movements, we recorded neural responses in the visual cortex of monkeys engaged in a delayed figure-ground detection task and analyzed the activity during the period of oculomotor preparation. We show that ≈100 ms before the onset of visually and memory-guided saccades, neural activity in V1 becomes stronger, with the strongest presaccadic responses found at the location of the saccade target. In addition, in memory-guided saccades the strength of presaccadic activity is correlated with the onset of the saccade. These findings indicate that the primary visual cortex contains saccade-related responses and participates in visually guided oculomotor behavior. PMID:14970334
Effects of individual differences in verbal skills on eye-movement patterns during sentence reading
Kuperman, Victor; Van Dyke, Julie A.
2011-01-01
This study is a large-scale exploration of the influence that individual reading skills exert on eye-movement behavior in sentence reading. Seventy-one non-college-bound 16- to 24-year-old speakers of English completed a battery of 18 verbal and cognitive skill assessments and read a series of sentences while their eye movements were monitored. Statistical analyses were performed to establish which tests of reading ability were predictive of eye-movement patterns across this population and how strong the effects were. We found that individual scores on rapid automatized naming and word identification tests (i) were the only participant variables with reliable predictive power throughout the time-course of reading; (ii) elicited effects that superseded in magnitude the effects of established predictors like word length or frequency; and (iii) strongly modulated the influence of word length and frequency on fixation times. We discuss the implications of our findings for the testing of reading ability, as well as for research on eye movements in reading. PMID:21709808
Eye movements: The past 25 years
Kowler, Eileen
2011-01-01
This article reviews the past 25 years of research on eye movements (1986–2011). Emphasis is on three oculomotor behaviors: gaze control, smooth pursuit and saccades, and on their interactions with vision. Focus over the past 25 years has remained on the fundamental and classical questions: What are the mechanisms that keep gaze stable with either stationary or moving targets? How does the motion of the image on the retina affect vision? Where do we look – and why – when performing a complex task? How can the world appear clear and stable despite continual movements of the eyes? The past 25 years of investigation of these questions have seen progress and transformations at all levels due to new approaches (behavioral, neural and theoretical) aimed at studying how eye movements cope with real-world visual and cognitive demands. The work has led to a better understanding of how prediction, learning and attention work with sensory signals to contribute to the effective operation of eye movements in visually rich environments. PMID:21237189
Tracking without perceiving: a dissociation between eye movements and motion perception.
Spering, Miriam; Pomplun, Marc; Carrasco, Marisa
2011-02-01
Can people react to objects in their visual field that they do not consciously perceive? We investigated how visual perception and motor action respond to moving objects whose visibility is reduced, and we found a dissociation between motion processing for perception and for action. We compared motion perception and eye movements evoked by two orthogonally drifting gratings, each presented separately to a different eye. The strength of each monocular grating was manipulated by inducing adaptation to one grating prior to the presentation of both gratings. Reflexive eye movements tracked the vector average of both gratings (pattern motion) even though perceptual responses followed one motion direction exclusively (component motion). Observers almost never perceived pattern motion. This dissociation implies the existence of visual-motion signals that guide eye movements in the absence of a corresponding conscious percept.
Application of the System Identification Technique to Goal-Directed Saccades.
1985-07-01
Saccadic eye movements are among the fastest voluntary muscle movements the human body is capable of producing and are characterized by a rapid shift of gaze… moving the target the same distance the eyeball moves. Collewijn and Van der Mark (9), in their study of the slow phase of optokinetic nystagmus, used…
Frontoparietal priority maps as biomarkers for mTBI
2016-10-01
The hypothesis being tested is that spatial attention and eye movement deficits associated with mTBI result from disruption of the gray matter and/or the white matter in cortical… Publications, conference papers, and presentations include "Visual Attention and Eye Movement Deficits in…"
Retinal image registration for eye movement estimation.
Kolar, Radim; Tornow, Ralf P; Odstrcilik, Jan
2015-01-01
This paper describes a novel methodology for eye fixation measurement using a unique video-ophthalmoscope setup and an advanced image registration approach. The representation of eye movements via the Poincaré plot is also introduced. The properties, limitations and prospects of this methodology are finally discussed.
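A Poincaré (lag-1) representation of a fixation trace, as referred to above, can be built by plotting each eye-position sample against the next and summarising the scatter; a generic sketch follows, with the usual SD1/SD2 dispersion descriptors as an assumed summary rather than the authors' exact computation.

```python
import numpy as np


def poincare_descriptors(eye_pos):
    """Lag-1 Poincare representation of an eye-position trace.

    Returns the scatter coordinates (x[n], x[n+1]) plus SD1 (short-term,
    sample-to-sample variability) and SD2 (long-term variability).
    """
    x = np.asarray(eye_pos, dtype=float)
    xn, xn1 = x[:-1], x[1:]
    sd1 = np.std(xn1 - xn) / np.sqrt(2)   # dispersion perpendicular to the identity line
    sd2 = np.std(xn1 + xn) / np.sqrt(2)   # dispersion along the identity line
    return xn, xn1, sd1, sd2
```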
A Preliminary Study: Is the Metronome Harmful or Helpful?
ERIC Educational Resources Information Center
Arthur, Patricia; Khuu, Sieu; Blom, Diana
2016-01-01
The metronome is a frequently used time-keeping tool in music instrument practice. However, if its speed is set beyond a comfortable level for the performer, their eye movement (EM) patterns can betray pressure that might have been placed on the visual processing system. The patterns of the eyes moving forward or back (saccades); when the eye…
Moreno-López, Bernardo; Escudero, Miguel; Estrada, Carmen
2002-01-01
Nitric oxide (NO) synthesis by prepositus hypoglossi (PH) neurons is necessary for the normal performance of horizontal eye movements. We have previously shown that unilateral injections of NO synthase (NOS) inhibitors into the PH nucleus of alert cats produce velocity imbalance without alteration of the eye position control, both during spontaneous eye movements and the vestibulo-ocular reflex (VOR). This NO effect is exerted on the dorsal PH neuropil, whose fibres increase their cGMP content when stimulated by NO. In an attempt to determine whether NO acts by modulation of a specific neurotransmission system, we have now compared the oculomotor effects of NOS inhibition with those produced by local blockade of glutamatergic, GABAergic or glycinergic receptors in the PH nucleus of alert cats. Both glutamatergic antagonists used, 2-amino-5-phosphonovaleric acid (APV) and 2,3-dihydro-6-nitro-7-sulphamoyl-benzo quinoxaline (NBQX), induced a nystagmus contralateral to that observed upon NOS inhibition, and caused exponential eye position drift. In contrast, bicuculline and strychnine induced eye velocity alterations similar to those produced by NOS inhibitors, suggesting that NO oculomotor effects were due to facilitation of some inhibitory input to the PH nucleus. To investigate the anatomical location of the putative NO target neurons, the retrograde tracer Fast Blue was injected in one PH nucleus, and the brainstem sections containing Fast Blue-positive neurons were stained with double immunohistochemistry for NO-sensitive cGMP and glutamic acid decarboxylase. GABAergic neurons projecting to the PH nucleus and containing NO-sensitive cGMP were found almost exclusively in the ipsilateral medial vestibular nucleus and marginal zone. The results suggest that the nitrergic PH neurons control their own firing rate by a NO-mediated facilitation of GABAergic afferents from the ipsilateral medial vestibular nucleus. This self-control mechanism could play an important role in the maintenance of the vestibular balance necessary to generate a stable and adequate eye position signal. PMID:11927688
Anticipatory Smooth Eye Movements in Autism Spectrum Disorder
Aitkin, Cordelia D.; Santos, Elio M.; Kowler, Eileen
2013-01-01
Smooth pursuit eye movements are important for vision because they maintain the line of sight on targets that move smoothly within the visual field. Smooth pursuit is driven by neural representations of motion, including a surprisingly strong influence of high-level signals representing expected motion. We studied anticipatory smooth eye movements (defined as smooth eye movements in the direction of expected future motion) produced by salient visual cues in a group of high-functioning observers with Autism Spectrum Disorder (ASD), a condition that has been associated with difficulties in either generating predictions, or translating predictions into effective motor commands. Eye movements were recorded while participants pursued the motion of a disc that moved within an outline drawing of an inverted Y-shaped tube. The cue to the motion path was a visual barrier that blocked the untraveled branch (right or left) of the tube. ASD participants showed strong anticipatory smooth eye movements whose velocity was the same as that of a group of neurotypical participants. Anticipatory smooth eye movements appeared on the very first cued trial, indicating that trial-by-trial learning was not responsible for the responses. These results are significant because they show that anticipatory capacities are intact in high-functioning ASD in cases where the cue to the motion path is highly salient and unambiguous. Once the ability to generate anticipatory pursuit is demonstrated, the study of the anticipatory responses with a variety of types of cues provides a window into the perceptual or cognitive processes that underlie the interpretation of events in natural environments or social situations. PMID:24376667
Binocular vision and eye movement disorders in older adults.
Leat, Susan J; Chan, Lisa Li-Li; Maharaj, Priya-Devi; Hrynchak, Patricia K; Mittelstaedt, Andrea; Machan, Carolyn M; Irving, Elizabeth L
2013-05-31
To determine the prevalence of binocular vision (BV) and eye movement disorders in a clinic population of older adults. Retrospective clinic data were abstracted from the files of 500 older patients seen at the University of Waterloo Optometry Clinic over a 1-year period. Stratified sampling gave equal numbers of patients in the 60 to 69, 70 to 79, and 80+ age groups. Data included age, general and ocular history and symptoms, use of antidepressants, a habit of smoking, refraction, visual acuity, and BV and eye movement status for the most recent full oculo-visual assessment and an assessment 10 years prior. The prevalence of any BV or eye movement abnormal test (AT) result, defined as a test result outside the normal range, was determined. This included strabismus (any) or phoria; incomitancy; poor pursuits; and remote near point of convergence (NPC). The prevalence of significant BV disorders (diagnostic entities, i.e., clinical conditions that may need treatment and may have functional implications) was also determined. The prevalence of any BV or eye movement AT was 41%, 44%, and 51% in the 60 to 69, 70 to 79, and 80+ age groups, respectively. These figures were lower 10 years earlier: 31%, 36%, and 40% for ages 50 to 59, 60 to 69, and 70+, respectively. The prevalence of any BV or eye movement disorder was 27%, 30%, and 38% for the three age groups and 17%, 19%, and 24% for 10 years prior. Age and use of antidepressants most commonly predicted a BV or eye movement AT or disorder. BV disorders are common among older adults.
Image processing for improved eye-tracking accuracy
NASA Technical Reports Server (NTRS)
Mulligan, J. B.; Watson, A. B. (Principal Investigator)
1997-01-01
Video cameras provide a simple, noninvasive method for monitoring a subject's eye movements. An important concept is that of the resolution of the system, which is the smallest eye movement that can be reliably detected. While hardware systems are available that estimate direction of gaze in real-time from a video image of the pupil, such systems must limit image processing to attain real-time performance and are limited to a resolution of about 10 arc minutes. Two ways to improve resolution are discussed. The first is to improve the image processing algorithms that are used to derive an estimate. Off-line analysis of the data can improve resolution by at least one order of magnitude for images of the pupil. A second avenue by which to improve resolution is to increase the optical gain of the imaging setup (i.e., the amount of image motion produced by a given eye rotation). Ophthalmoscopic imaging of retinal blood vessels provides increased optical gain and improved immunity to small head movements but requires a highly sensitive camera. The large number of images involved in a typical experiment imposes great demands on the storage, handling, and processing of data. A major bottleneck had been the real-time digitization and storage of large amounts of video imagery, but recent developments in video compression hardware have made this problem tractable at a reasonable cost. Images of both the retina and the pupil can be analyzed successfully using a basic toolbox of image-processing routines (filtering, correlation, thresholding, etc.), which are, for the most part, well suited to implementation on vectorizing supercomputers.
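The "correlation" entry in the basic toolbox mentioned above can be illustrated by phase correlation between successive retinal (or pupil) images to estimate the inter-frame image shift; the sketch below returns integer-pixel shifts only, with the sub-pixel refinement that higher resolution would require left out.

```python
import numpy as np


def estimate_shift(ref, frame):
    """Estimate the translation between two images by phase correlation.

    ref, frame : 2D arrays of the same shape (e.g. retinal vessel images).
    Returns (row_shift, col_shift) at integer-pixel precision; the sign
    convention follows the correlation peak of `frame` relative to `ref`.
    """
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(frame)
    cross_power = F1 * np.conj(F2)
    cross_power /= np.abs(cross_power) + 1e-12      # phase-correlation normalisation
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices so that shifts can be negative
    shifts = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(shifts)
```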
Deliens, Gaétane; Leproult, Rachel; Neu, Daniel; Peigneux, Philippe
2013-01-01
Study Objectives: To test the hypothesis that rapid eye movement (REM) sleep contributes to the consolidation of new memories, whereas non-rapid eye movement (NREM) sleep contributes to the prevention of retroactive interference. Design: Randomized, crossover study. Setting: Two sessions of either a morning nap or wakefulness. Participants: Twenty-five healthy young adults. Interventions: Declarative learning of word pairs followed by a nap or a wake interval, then learning of interfering word pairs and delayed recall of list A. Measurements and Results: After a restricted night (24:00-06:00), participants learned a list of word pairs (list A). They were then required to either take a nap or stay awake during 45 min, after which they learned a second list of word pairs (list B) and then had to recall list A. Fifty percent of word pairs in list B shared the first word with list A, resulting in interference. Ten subjects exhibited REM sleep whereas 13 subjects exhibited NREM stage 3 (N3) sleep. An interference effect was observed in the nap but not in the wake condition. In post-learning naps, N3 sleep was associated with a reduced interference effect, which was not the case for REM sleep. Moreover, participants exhibiting N3 sleep in the post-learning nap condition also showed a reduced interference effect in the wake condition, suggesting a higher protection ability against interference. Conclusion: Our results partly support the hypothesis that non-rapid eye movement sleep contributes in protecting novel memories against interference. However, rapid eye movement sleep-related consolidation is not evidenced. Citation: Deliens G; Leproult R; Neu D; Peigneux P. Rapid eye movement and non-rapid eye movement sleep contributions in memory consolidation and resistance to retroactive interference for verbal material. SLEEP 2013;36(12):1875-1883. PMID:24293762
Relationship between saccadic eye movements and formation of the Krukenberg's spindle-a CFD study.
Boushehrian, Hamidreza Hajiani; Abouali, Omid; Jafarpur, Khosrow; Ghaffarieh, Alireza; Ahmadi, Goodarz
2017-09-01
In this research, a series of numerical simulations for evaluating the effects of saccadic eye movement on the aqueous humour (AH) flow field and movement of pigment particles in the anterior chamber (AC) was performed. To predict the flow field of AH in the AC, the unsteady forms of continuity, momentum balance and conservation of energy equations were solved using the dynamic mesh technique for simulating the saccadic motions. Different orientations of the human eye including horizontal, vertical and angles of 10° and 20° were considered. The Lagrangian particle trajectory analysis approach was used to find the trajectories of pigment particles in the eye. Particular attention was given to the relation between the saccadic eye movement and potential formation of Krukenberg's spindle in the eye. The simulation results revealed that the natural convection flow was an effective mechanism for transferring pigment particles from the iris to near the cornea. In addition, the saccadic eye movement was the dominant mechanism for deposition of pigment particles on the cornea, which could lead to the formation of Krukenberg's spindle. The effect of amplitude of saccade motion angle in addition to the orientation of the eye on the formation of Krukenberg's spindle was investigated. © The authors 2016. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.
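The Lagrangian particle-trajectory step referred to above amounts to integrating an equation of motion for each pigment particle within the computed aqueous humour velocity field. A generic form with Stokes drag and buoyancy-corrected gravity is written below; the study's actual force balance may include additional terms that are not shown.

```latex
% Generic Lagrangian tracking of a pigment particle (diameter d_p, density rho_p)
% in the aqueous humour velocity field u_f; further force terms may apply.
\[
  m_p \frac{d\mathbf{u}_p}{dt}
    = 3\pi\mu d_p\,(\mathbf{u}_f - \mathbf{u}_p)
    + (\rho_p - \rho_f)\,\frac{\pi d_p^{3}}{6}\,\mathbf{g},
  \qquad
  \frac{d\mathbf{x}_p}{dt} = \mathbf{u}_p .
\]
```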
Exploring Eye Movements in Patients with Glaucoma When Viewing a Driving Scene
Crabb, David P.; Smith, Nicholas D.; Rauscher, Franziska G.; Chisholm, Catharine M.; Barbur, John L.; Edgar, David F.; Garway-Heath, David F.
2010-01-01
Background Glaucoma is a progressive eye disease and a leading cause of visual disability. Automated assessment of the visual field determines the different stages in the disease process: it would be desirable to link these measurements taken in the clinic with patient's actual function, or establish if patients compensate for their restricted field of view when performing everyday tasks. Hence, this study investigated eye movements in glaucomatous patients when viewing driving scenes in a hazard perception test (HPT). Methodology/Principal Findings The HPT is a component of the UK driving licence test consisting of a series of short film clips of various traffic scenes viewed from the driver's perspective each containing hazardous situations that require the camera car to change direction or slow down. Data from nine glaucomatous patients with binocular visual field defects and ten age-matched control subjects were considered (all experienced drivers). Each subject viewed 26 different films with eye movements simultaneously monitored by an eye tracker. Computer software was purpose written to pre-process the data, co-register it to the film clips and to quantify eye movements and point-of-regard (using a dynamic bivariate contour ellipse analysis). On average, and across all HPT films, patients exhibited different eye movement characteristics to controls making, for example, significantly more saccades (P<0.001; 95% confidence interval for mean increase: 9.2 to 22.4%). Whilst the average region of ‘point-of-regard’ of the patients did not differ significantly from the controls, there were revealing cases where patients failed to see a hazard in relation to their binocular visual field defect. Conclusions/Significance Characteristics of eye movement patterns in patients with bilateral glaucoma can differ significantly from age-matched controls when viewing a traffic scene. Further studies of eye movements made by glaucomatous patients could provide useful information about the definition of the visual field component required for fitness to drive. PMID:20300522
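The "bivariate contour ellipse analysis" mentioned above is, in its standard static form, an area estimate computed from the spread and correlation of gaze samples. The sketch below shows that basic formula on synthetic gaze data; the study itself used a dynamic variant, so treat this only as the underlying idea.

```python
# Illustrative sketch of a (static) bivariate contour ellipse area (BCEA) for a set of
# gaze samples; the study used a *dynamic* variant, so this shows only the basic idea.
import numpy as np

def bcea(x, y, p=0.682):
    """BCEA = 2*k*pi*sx*sy*sqrt(1 - rho^2), with k = -ln(1 - p) for coverage probability p."""
    k = -np.log(1.0 - p)
    sx, sy = np.std(x, ddof=1), np.std(y, ddof=1)
    rho = np.corrcoef(x, y)[0, 1]
    return 2.0 * k * np.pi * sx * sy * np.sqrt(1.0 - rho**2)

rng = np.random.default_rng(0)
gaze_x = rng.normal(0.0, 1.2, 500)   # degrees, synthetic example data
gaze_y = rng.normal(0.0, 0.8, 500)
print(f"BCEA ≈ {bcea(gaze_x, gaze_y):.2f} deg^2")
```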
The effect of concurrent hand movement on estimated time to contact in a prediction motion task.
Zheng, Ran; Maraj, Brian K V
2018-04-27
In many activities, we need to predict the arrival of an occluded object. This action is called prediction motion or motion extrapolation. Previous researchers have found that both eye tracking and an internal clocking model are involved in the prediction motion task. Additionally, it has been reported that concurrent hand movement facilitates eye tracking of an externally generated target in a tracking task, even if the target is occluded. The present study examined the effect of concurrent hand movement on estimated time to contact (TTC) in a prediction motion task. We found that different (accurate/inaccurate) concurrent hand movements had opposite effects on eye tracking accuracy and estimated TTC in the prediction motion task. That is, accurate concurrent hand tracking enhanced eye tracking accuracy and tended to increase the precision of estimated TTC, whereas inaccurate concurrent hand tracking decreased eye tracking accuracy and disrupted estimated TTC. However, eye tracking accuracy did not determine the precision of estimated TTC.
Dooley, K O; Farmer, A
1988-08-01
Neurolinguistic programming's hypothesized eye movements were measured independently using videotapes of 10 nonfluent aphasic and 10 control subjects matched for age and sex. Chi-squared analysis indicated that eye-position responses were significantly different for the groups. Although earlier research has not supported the hypothesized eye positions for normal subjects, the present findings support the contention that eye-position responses may differ between neurologically normal and aphasic individuals.
Objective Methods to Test Visual Dysfunction in the Presence of Cognitive Impairment
2015-12-01
Major Findings: Three major objective tests of vision were successfully developed and optimized to detect disease: 1) the pupil light reflex (either comparing the two eyes or independently evaluating each eye separately) for retina or optic nerve damage; 2) eye movement based analysis of target acquisition, fixation, and eccentric viewing; and 3) purposeful eye movements to track targets that are resolved by the eye. [Only fragmentary abstract text was recovered for this record.]
NASA Astrophysics Data System (ADS)
Tornow, Ralf P.; Milczarek, Aleksandra; Odstrcilik, Jan; Kolar, Radim
2017-07-01
A parallel video ophthalmoscope was developed to acquire short video sequences (25 fps, 250 frames) of both eyes simultaneously with exact synchronization. Video sequences were registered off-line to compensate for eye movements. From registered video sequences dynamic parameters like cardiac cycle induced reflection changes and eye movements can be calculated and compared between eyes.
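One common way to do the off-line registration described above is phase correlation between frames. The sketch below implements a plain FFT-based version on a synthetic frame pair; it is an illustration only, not the authors' pipeline.

```python
# Sketch of off-line frame registration by phase correlation (FFT-based), one common way to
# estimate the eye-movement-induced shift between video frames; the authors' actual
# registration method may differ.
import numpy as np

def phase_correlation_shift(ref, frame):
    """Estimate the integer (dy, dx) translation between `ref` and `frame`
    (same convention as the shift passed to np.roll below)."""
    F1, F2 = np.fft.fft2(ref), np.fft.fft2(frame)
    r = np.conj(F1) * F2
    r /= np.abs(r) + 1e-12                     # normalized cross-power spectrum
    corr = np.fft.ifft2(r).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > ref.shape[0] // 2: dy -= ref.shape[0]   # map wrap-around to signed shifts
    if dx > ref.shape[1] // 2: dx -= ref.shape[1]
    return dy, dx

rng = np.random.default_rng(1)
ref = rng.random((128, 128))
frame = np.roll(ref, shift=(3, -5), axis=(0, 1))    # simulated eye-movement shift
print(phase_correlation_shift(ref, frame))          # -> (3, -5)
```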
Effect of gravity on vertical eye position.
Pierrot-Deseilligny, C
2009-05-01
There is growing evidence that gravity markedly influences vertical eye position and movements. A new model for the organization of brainstem upgaze pathways is presented in this review. The crossing ventral tegmental tract (CVTT) could be the efferent tract of an "antigravitational" pathway terminating at the elevator muscle motoneurons in the third nerve nuclei and comprising, upstream, the superior vestibular nucleus and y-group, the flocculus, and the otoliths. This pathway functions in parallel to the medial longitudinal fasciculus pathways, which control vertical eye movements made to compensate for all vertical head movements and may also comprise the "gravitational" vestibular pathways, involved in the central reflection of the gravity effect. The CVTT could provide the upgaze system with the supplement of tonic activity required to counteract the gravity effect expressed in the gravitational pathway, being permanently modulated according to the static positions of the head (i.e., the instantaneous gravity vector) between a maximal activity in the upright position and a minimal activity in horizontal positions. Different types of arguments support this new model. The permanent influence of gravity on vertical eye position is strongly suggested by the vertical slow phases and nystagmus observed after rapid changes in hypo- or hypergravity. The chin-beating nystagmus, existing in normal subjects with their head in the upside-down position, suggests that gravity is not compensated for in the downgaze system. Upbeat nystagmus due to brainstem lesions, most likely affecting the CVTT circuitry, is improved when the head is in the horizontal position, suggesting that this circuitry is involved in the counteraction of gravity between the upright and horizontal positions of the head. In downbeat nystagmus due to floccular damage, in which a permanent hyperexcitation of the CVTT could exist, a marked influence of static positions of the head is also observed. Finally, the strongest argument supporting a marked role of gravity in vertical eye position is that the eye movement alterations observed in the main, typical physiological and pathological conditions are precisely those that would be expected from a direct effect of gravity on the eyeballs, with, moreover, no single alternative interpretation existing so far that could account for all these different types of findings.
Adaptive optics optical coherence tomography with dynamic retinal tracking
Kocaoglu, Omer P.; Ferguson, R. Daniel; Jonnal, Ravi S.; Liu, Zhuolin; Wang, Qiang; Hammer, Daniel X.; Miller, Donald T.
2014-01-01
Adaptive optics optical coherence tomography (AO-OCT) is a highly sensitive and noninvasive method for three dimensional imaging of the microscopic retina. Like all in vivo retinal imaging techniques, however, it suffers the effects of involuntary eye movements that occur even under normal fixation. In this study we investigated dynamic retinal tracking to measure and correct eye motion at KHz rates for AO-OCT imaging. A customized retina tracking module was integrated into the sample arm of the 2nd-generation Indiana AO-OCT system and images were acquired on three subjects. Analyses were developed based on temporal amplitude and spatial power spectra in conjunction with strip-wise registration to independently measure AO-OCT tracking performance. After optimization of the tracker parameters, the system was found to correct eye movements up to 100 Hz and reduce residual motion to 10 µm root mean square. Between session precision was 33 µm. Performance was limited by tracker-generated noise at high temporal frequencies. PMID:25071963
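The 10 µm root-mean-square residual quoted above is the kind of summary statistic the short sketch below computes from a per-strip motion trace; the trace here is synthetic, with a spread chosen purely for illustration.

```python
# Small sketch: quantify residual eye motion after tracking/registration as the
# root-mean-square (RMS) of a strip-wise motion trace. The residuals below are synthetic.
import numpy as np

rng = np.random.default_rng(2)
residual_xy_um = rng.normal(0.0, 7.0, size=(250, 2))   # per-strip residuals in micrometres
rms = np.sqrt(np.mean(np.sum(residual_xy_um**2, axis=1)))
print(f"residual motion RMS ≈ {rms:.1f} µm")
```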
Wide field-of-view bifocal eyeglasses
NASA Astrophysics Data System (ADS)
Barbero, Sergio; Rubinstein, Jacob
2015-09-01
When vision is affected simultaneously by presbyopia and myopia or hyperopia, a solution based on eyeglasses implies a surface with either segmented focal regions (e.g. bifocal lenses) or a progressive addition profile (PALs). However, both options have the drawback of reducing the field-of-view for each power position, which restricts the natural eye-head movements of the wearer. To avoid this serious limitation we propose a new solution, which is essentially a bifocal power-adjustable optical design ensuring a wide field-of-view for every viewing distance. The optical system is based on the Alvarez principle. Spherical refraction correction is considered for different eccentric gaze directions covering a field-of-view range up to 45 degrees. Eye movements during convergence for near objects are included. We designed three bifocal systems. The first one provides -3 D for far vision (myopic eye) and -1 D for near vision (+2 D addition). The second one provides a +3 D addition with -3 D for far vision. Finally, the last system is an example of reading glasses with +1 D power addition.
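As background to the Alvarez principle invoked above, the sketch below combines two complementary cubic plates shifted laterally and recovers the resulting spherical power, which in the idealized thin-plate case is P ≈ 2(n−1)aδ. The cubic coefficient, refractive index and shift are assumed values; this is an idealization, not the authors' actual design.

```python
# Idealized illustration of the Alvarez principle (not the authors' design): two complementary
# cubic plates, t(x, y) = +/- a*(x*y**2 + x**3/3), shifted by +/- d/2 along x, combine into a
# thickness profile quadratic in r, i.e. a variable spherical lens of power P ≈ 2*(n-1)*a*d.
# All numbers below are assumptions.
import numpy as np

a, n, d = 1500.0, 1.5, 0.002       # cubic coefficient (m^-2), refractive index, shift (m)

x, y = np.meshgrid(np.linspace(-0.01, 0.01, 201), np.linspace(-0.01, 0.01, 201))
sag = lambda X, Y: a * (X * Y**2 + X**3 / 3.0)
thickness = sag(x - d / 2.0, y) - sag(x + d / 2.0, y)   # plate A shifted +d/2, plate B (negated) -d/2

# Fit thickness ≈ t0 - P/(2*(n-1)) * r^2 and recover the power P (in dioptres).
r2 = (x**2 + y**2).ravel()
slope, t0 = np.polyfit(r2, thickness.ravel(), 1)
P_fit = -2.0 * (n - 1.0) * slope
print(f"fitted power: {P_fit:.2f} D, analytic 2*(n-1)*a*d = {2 * (n - 1) * a * d:.2f} D")
```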
Croft, Mary Ann; Mcdonald, Jared P.; James, Rebecca J.; Heatley, Gregg A.; Lin, Ting-Li; Lütjen-Drecoll, Elke; Kaufman, Paul L.
2009-01-01
Purpose To determine how surgically altering the normal relationship between the lens and the ciliary body in rhesus monkeys affects centripetal ciliary body and lens movement. Methods In 18 rhesus monkey eyes (aged 6–27 years), accommodation was induced before and after surgery by electrical stimulation of the Edinger-Westphal (E–W) nucleus. Accommodative amplitude was measured by coincidence refractometry. Goniovideography was performed before and after intra- and extra-capsular lens extraction (ICLE, ECLE) and anterior regional zonulolysis. Centripetal lens/capsule movements, centripetal ciliary process (CP) movements, and circumlental space were measured by computerized image analysis of the goniovideography images. Results Centripetal accommodative CP and capsule movement increased in velocity and amplitude post-ECLE compared to pre-ECLE regardless of age (n=5). The presence of the lens substance retarded capsule movement by ~21% in the young eyes and by ~62% in the older eyes. Post-ICLE compared to pre-ICLE centripetal accommodative CP movement was dampened in all eyes in which the anterior vitreous was disturbed (n=7), but not in eyes in which the anterior vitreous was left intact (n=2). Following anterior regional zonulolysis (n=4), lens position shifted toward the lysed quadrant during accommodation. Conclusions The presence of the lens substance, capsule zonular attachments, and Wiegers ligament may play a role in centripetal CP movement. The capsule is still capable of centripetal movement in the older eye (although at a reduced capacity) and may have the ability to produce ~6 diopters of accommodation in the presence of a normal young crystalline lens or a similar surrogate. PMID:18552393
Demšar, Urška; Çöltekin, Arzu
2017-01-01
Eye movements provide insights into what people pay attention to, and therefore are commonly included in a variety of human-computer interaction studies. Eye movement recording devices (eye trackers) produce gaze trajectories, that is, sequences of gaze location on the screen. Despite recent technological developments that enabled more affordable hardware, gaze data are still costly and time consuming to collect, therefore some propose using mouse movements instead. These are easy to collect automatically and on a large scale. If and how these two movement types are linked, however, is less clear and highly debated. We address this problem in two ways. First, we introduce a new movement analytics methodology to quantify the level of dynamic interaction between the gaze and the mouse pointer on the screen. Our method uses volumetric representation of movement, the space-time densities, which allows us to calculate interaction levels between two physically different types of movement. We describe the method and compare the results with existing dynamic interaction methods from movement ecology. The sensitivity to method parameters is evaluated on simulated trajectories where we can control interaction levels. Second, we perform an experiment with eye and mouse tracking to generate real data with real levels of interaction, to apply and test our new methodology on a real case. Further, as our experiment tasks mimics route-tracing when using a map, it is more than a data collection exercise and it simultaneously allows us to investigate the actual connection between the eye and the mouse. We find that there seem to be natural coupling when eyes are not under conscious control, but that this coupling breaks down when instructed to move them intentionally. Based on these observations, we tentatively suggest that for natural tracing tasks, mouse tracking could potentially provide similar information as eye-tracking and therefore be used as a proxy for attention. However, more research is needed to confirm this.
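The study above quantifies gaze-mouse coupling with volumetric space-time densities. As a much simpler conceptual stand-in, the sketch below computes a proximity-based interaction index: the fraction of time stamps at which gaze and pointer lie within a chosen screen distance of each other. The threshold and the synthetic trajectories are assumptions, and this is not the authors' method.

```python
# Simplified proximity-based stand-in for a gaze-mouse interaction measure (the paper itself
# uses space-time densities): fraction of time stamps at which gaze and pointer are close.
import numpy as np

def proximity_interaction(gaze_xy, mouse_xy, threshold_px=100.0):
    """gaze_xy, mouse_xy: (n, 2) arrays sampled at the same time stamps."""
    d = np.linalg.norm(gaze_xy - mouse_xy, axis=1)
    return np.mean(d < threshold_px)

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 600)
mouse = np.column_stack([960 + 300 * np.sin(t), 540 + 200 * np.cos(t)])   # pointer path (px)
gaze = mouse + rng.normal(0, 60, mouse.shape)                             # gaze loosely following it
print(f"interaction index ≈ {proximity_interaction(gaze, mouse):.2f}")
```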
Receptive Vocabulary Knowledge in Low-Functioning Autism as Assessed by Eye Movements, Pupillary Dilation, and Event-Related Potentials (award W81XWH-10-1-0404)
2011-06-01
[Only fragmentary abstract text was recovered for this record.]
Receptive Vocabulary Knowledge in Low-Functioning Autism as Assessed by Eye Movements, Pupillary Dilation, and Event-Related Potentials (award W81XWH-10-1-0404; principal investigator: Barry Gordon)
2012-06-01
[Only fragmentary abstract text was recovered for this record.]
2012-10-01
[Fragmentary record: a randomized controlled trial evaluating Accelerated Resolution Therapy (ART), a brief (1-5 session) treatment that uses eye movements, with reference to Eye Movement Desensitization and Reprocessing (EMDR); results on the specific therapeutic role of eye movements and the 3-month follow-up (sustainability) were pending at the time of the report.]
An information maximization model of eye movements
NASA Technical Reports Server (NTRS)
Renninger, Laura Walker; Coughlan, James; Verghese, Preeti; Malik, Jitendra
2005-01-01
We propose a sequential information maximization model as a general strategy for programming eye movements. The model reconstructs high-resolution visual information from a sequence of fixations, taking into account the fall-off in resolution from the fovea to the periphery. From this framework we get a simple rule for predicting fixation sequences: after each fixation, fixate next at the location that minimizes uncertainty (maximizes information) about the stimulus. By comparing our model performance to human eye movement data and to predictions from a saliency and random model, we demonstrate that our model is best at predicting fixation locations. Modeling additional biological constraints will improve the prediction of fixation sequences. Our results suggest that information maximization is a useful principle for programming eye movements.
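A toy rendering of the greedy rule described above (fixate next wherever the expected uncertainty reduction is largest) is sketched below. The uncertainty map, foveal kernel and update rule are simplified assumptions, not the authors' model.

```python
# Toy sketch of greedy "information maximization" fixation selection: an uncertainty map is
# repeatedly sampled through a foveated sensitivity kernel and reduced at each chosen location.
# This illustrates the principle only; it is not the authors' model.
import numpy as np

H, W, sigma = 64, 96, 8.0
yy, xx = np.mgrid[0:H, 0:W]
rng = np.random.default_rng(4)
uncertainty = rng.random((H, W))                     # stand-in for stimulus uncertainty

def foveal_gain(cy, cx):
    """Fraction of uncertainty resolved at each pixel when fixating (cy, cx)."""
    return np.exp(-((yy - cy)**2 + (xx - cx)**2) / (2.0 * sigma**2))

fixations = []
for _ in range(5):
    # Expected information gain of every candidate fixation (coarse grid, brute force)
    gain = np.array([[np.sum(uncertainty * foveal_gain(cy, cx))
                      for cx in range(0, W, 4)] for cy in range(0, H, 4)])
    cy, cx = np.unravel_index(np.argmax(gain), gain.shape)
    cy, cx = cy * 4, cx * 4
    fixations.append((cy, cx))
    uncertainty *= (1.0 - foveal_gain(cy, cx))        # information gathered at that fixation
print(fixations)
```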
Effects of aging on eye movements in the real world
Dowiasch, Stefan; Marx, Svenja; Einhäuser, Wolfgang; Bremmer, Frank
2015-01-01
The effects of aging on eye movements are well studied in the laboratory. Increased saccade latencies or decreased smooth-pursuit gain are well established findings. The question remains whether these findings are influenced by the rather untypical environment of a laboratory; that is, whether or not they transfer to the real world. We measured 34 healthy participants between the age of 25 and 85 during two everyday tasks in the real world: (I) walking down a hallway with free gaze, (II) visual tracking of an earth-fixed object while walking straight-ahead. Eye movements were recorded with a mobile light-weight eye tracker, the EyeSeeCam (ESC). We find that age significantly influences saccade parameters. With increasing age, saccade frequency, amplitude, peak velocity, and mean velocity are reduced and the velocity/amplitude distribution as well as the velocity profile become less skewed. In contrast to laboratory results on smooth pursuit, we did not find a significant effect of age on tracking eye-movements in the real world. Taken together, age-related eye-movement changes as measured in the laboratory only partly resemble those in the real world. It is well-conceivable that in the real world additional sensory cues, such as head-movement or vestibular signals, may partially compensate for age-related effects, which, according to this view, would be specific to early motion processing. In any case, our results highlight the importance of validity for natural situations when studying the impact of aging on real-life performance. PMID:25713524
NASA Astrophysics Data System (ADS)
Dong, Leng; Chen, Yan; Dias, Sarah; Stone, William; Dias, Joseph; Rout, John; Gale, Alastair G.
2017-03-01
Visual search techniques and FROC analysis have been widely used in radiology to understand medical image perceptual behaviour and diagnostic performance. The potential of exploiting the advantages of both methodologies is of great interest to medical researchers. In this study, eye tracking data from eight dental practitioners were investigated, and the visual search measures and their analyses are considered here. Each participant interpreted 20 dental radiographs chosen by an expert dental radiologist. Various eye movement measurements were obtained based on image area of interest (AOI) information. FROC analysis was then carried out using these eye movement measurements as a direct input source, and the performance of FROC methods using different input parameters was tested. The results showed significant differences in FROC measures based on eye movement data between groups with different experience levels: the area under the curve (AUC) was higher for the experienced group for the fixation and dwell time measurements. In addition, AUC scores from the eye-movement-based FROC correlated positively with those from the rating-based FROC. FROC analysis using eye movement measurements as input variables can therefore act as a potential performance indicator for assessment in medical image interpretation and for evaluating training procedures. Visual search data analyses lead to new ways of combining eye movement data and FROC methods, providing an alternative dimension for assessing performance and visual search behaviour in medical imaging perceptual tasks.
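As a simplified illustration of using an eye-movement measure as the input "rating", the sketch below computes an ordinary ROC AUC from per-AOI dwell times on synthetic data. A full FROC analysis, as used in the study, additionally handles multiple marks per image and is not reproduced here.

```python
# Simplified sketch (ROC rather than full FROC): use per-AOI dwell time as the rating and
# compute an AUC separating abnormal from normal areas of interest. Data are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
labels = np.array([1] * 20 + [0] * 60)                    # 1 = AOI contains the abnormality
dwell_ms = np.concatenate([rng.gamma(4.0, 250.0, 20),     # longer dwell on abnormal AOIs
                           rng.gamma(2.0, 200.0, 60)])
print(f"AUC from dwell time ≈ {roc_auc_score(labels, dwell_ms):.2f}")
```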
Effects of Peripheral Visual Field Loss on Eye Movements During Visual Search
Wiecek, Emily; Pasquale, Louis R.; Fiser, Jozsef; Dakin, Steven; Bex, Peter J.
2012-01-01
Natural vision involves sequential eye movements that bring the fovea to locations selected by peripheral vision. How peripheral visual field loss (PVFL) affects this process is not well understood. We examine how the location and extent of PVFL affects eye movement behavior in a naturalistic visual search task. Ten patients with PVFL and 13 normally sighted subjects with full visual fields (FVF) completed 30 visual searches monocularly. Subjects located a 4° × 4° target, pseudo-randomly selected within a 26° × 11° natural image. Eye positions were recorded at 50 Hz. Search duration, fixation duration, saccade size, and number of saccades per trial were not significantly different between PVFL and FVF groups (p > 0.1). A χ2 test showed that the distributions of saccade directions for PVFL and FVL subjects were significantly different in 8 out of 10 cases (p < 0.01). Humphrey Visual Field pattern deviations for each subject were compared with the spatial distribution of eye movement directions. There were no significant correlations between saccade directional bias and visual field sensitivity across the 10 patients. Visual search performance was not significantly affected by PVFL. An analysis of eye movement directions revealed patients with PVFL show a biased directional distribution that was not directly related to the locus of vision loss, challenging feed-forward models of eye movement control. Consequently, many patients do not optimally compensate for visual field loss during visual search. PMID:23162511
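The χ² comparison of saccade-direction distributions reported above can be set up as sketched below: directions are binned into sectors for a patient and a control and tested with a contingency-table χ². The von Mises samples stand in for real saccade directions.

```python
# Sketch of the reported chi-squared comparison: bin saccade directions (8 sectors of 45°)
# for a patient and a control and test whether the two direction distributions differ.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(6)
patient_dirs = rng.vonmises(mu=0.6, kappa=1.2, size=400)      # radians, synthetic
control_dirs = rng.vonmises(mu=0.0, kappa=0.4, size=400)
bins = np.linspace(-np.pi, np.pi, 9)
table = np.vstack([np.histogram(patient_dirs, bins)[0],
                   np.histogram(control_dirs, bins)[0]])
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")
```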
A Statistical Physics Perspective to Understand Social Visual Attention in Autism Spectrum Disorder.
Liberati, Alessio; Fadda, Roberta; Doneddu, Giuseppe; Congiu, Sara; Javarone, Marco A; Striano, Tricia; Chessa, Alessandro
2017-08-01
This study investigated social visual attention in children with Autism Spectrum Disorder (ASD) and with typical development (TD) in the light of Brockmann and Geisel's model of visual attention. The probability distribution of gaze movements and clustering of gaze points, registered with eye-tracking technology, was studied during a free visual exploration of a gaze stimulus. A data-driven analysis of the distribution of eye movements was chosen to overcome any possible methodological problems related to the subjective expectations of the experimenters about the informative contents of the image in addition to a computational model to simulate group differences. Analysis of the eye-tracking data indicated that the scanpaths of children with TD and ASD were characterized by eye movements geometrically equivalent to Lévy flights. Children with ASD showed a higher frequency of long saccadic amplitudes compared with controls. A clustering analysis revealed a greater dispersion of eye movements for these children. Modeling of the results indicated higher values of the model parameter modulating the dispersion of eye movements for children with ASD. Together, the experimental results and the model point to a greater dispersion of gaze points in ASD.
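The Lévy-flight characterization above amounts to saying that saccade amplitudes follow a heavy-tailed power law. The sketch below draws synthetic amplitude distributions for two groups and fits the exponent by maximum likelihood, so a group with more long saccades yields a smaller fitted exponent. Parameters are illustrative only.

```python
# Sketch of the statistical-physics angle: saccade amplitudes drawn from a power law
# (Lévy-like flights) and a maximum-likelihood estimate of the exponent. Data are synthetic.
import numpy as np

def sample_power_law(alpha, l_min, n, rng):
    """Inverse-transform sampling from p(l) ∝ l^(-alpha), l >= l_min (alpha > 1)."""
    u = rng.random(n)
    return l_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

def mle_exponent(l, l_min):
    """Continuous power-law MLE: alpha = 1 + n / sum(ln(l / l_min))."""
    return 1.0 + len(l) / np.sum(np.log(l / l_min))

rng = np.random.default_rng(7)
asd_like = sample_power_law(alpha=2.0, l_min=0.5, n=2000, rng=rng)   # heavier tail
td_like  = sample_power_law(alpha=2.8, l_min=0.5, n=2000, rng=rng)
print(f"fitted exponents: heavier-tailed group ≈ {mle_exponent(asd_like, 0.5):.2f}, "
      f"lighter-tailed group ≈ {mle_exponent(td_like, 0.5):.2f}")
```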
Pouw, Wim T J L; Mavilidi, Myrto-Foteini; van Gog, Tamara; Paas, Fred
2016-08-01
Non-communicative hand gestures have been found to benefit problem-solving performance. These gestures seem to compensate for limited internal cognitive capacities, such as visual working memory capacity. Yet, it is not clear how gestures might perform this cognitive function. One hypothesis is that gesturing is a means to spatially index mental simulations, thereby reducing the need for visually projecting the mental simulation onto the visual presentation of the task. If that hypothesis is correct, less eye movements should be made when participants gesture during problem solving than when they do not gesture. We therefore used mobile eye tracking to investigate the effect of co-thought gesturing and visual working memory capacity on eye movements during mental solving of the Tower of Hanoi problem. Results revealed that gesturing indeed reduced the number of eye movements (lower saccade counts), especially for participants with a relatively lower visual working memory capacity. Subsequent problem-solving performance was not affected by having (not) gestured during the mental solving phase. The current findings suggest that our understanding of gestures in problem solving could be improved by taking into account eye movements during gesturing.
Extracting information of fixational eye movements through pupil tracking
NASA Astrophysics Data System (ADS)
Xiao, JiangWei; Qiu, Jian; Luo, Kaiqin; Peng, Li; Han, Peng
2018-01-01
Human eyes are never completely static, even when fixating a stationary point. These small, irregular movements, which consist of micro-tremors, micro-saccades and drifts, prevent the fading of the images that enter our eyes. The importance of researching fixational eye movements has recently been demonstrated experimentally. However, the characteristics of fixational eye movements and their roles in visual processing have not been explained clearly, because these signals remain difficult to extract completely. In this paper, we developed a new eye movement detection device based on a high-speed camera. The device includes a beam-splitter mirror, an infrared light source and a high-speed digital video camera with a frame rate of 200 Hz. To avoid the influence of head shaking, we made the device wearable by fixing the camera on a safety helmet. Using this device, pupil-tracking experiments were conducted. By localizing the pupil center and performing spectrum analysis, the envelope frequency spectra of micro-saccades, micro-tremors and drifts are clearly revealed. The experimental results show that the device is feasible and effective and can be applied to further characterization of fixational eye movements.
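The spectral step described above can be sketched as an amplitude spectrum of a pupil-centre trace sampled at 200 Hz, as below. The trace is synthetic (slow drift plus a tremor-like component plus noise) and only illustrates where such components would appear.

```python
# Sketch of the spectral step: amplitude spectrum of a pupil-centre position trace sampled at
# 200 Hz, in which tremor (tens of Hz), drift (slow) and micro-saccades would appear.
# The trace below is synthetic.
import numpy as np

fs, T = 200.0, 5.0
t = np.arange(0, T, 1.0 / fs)
rng = np.random.default_rng(8)
trace = (0.05 * np.cumsum(rng.normal(0, 0.01, t.size))      # slow drift
         + 0.01 * np.sin(2 * np.pi * 70.0 * t)              # tremor-like component
         + rng.normal(0, 0.002, t.size))                    # measurement noise

x = trace - trace.mean()
spectrum = np.abs(np.fft.rfft(x * np.hanning(x.size))) / x.size
freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
peak = freqs[np.argmax(spectrum[freqs > 20]) + np.sum(freqs <= 20)]
print(f"dominant component above 20 Hz ≈ {peak:.1f} Hz")
```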
Biometric recognition via fixation density maps
NASA Astrophysics Data System (ADS)
Rigas, Ioannis; Komogortsev, Oleg V.
2014-05-01
This work introduces and evaluates a novel eye movement-driven biometric approach that employs eye fixation density maps for person identification. The proposed feature offers a dynamic representation of biometric identity, storing rich information regarding the behavioral and physical eye movement characteristics of individuals. The innate ability of fixation density maps to capture the spatial layout of eye movements, in conjunction with their probabilistic nature, makes them a particularly suitable option as an eye movement biometric trait when free-viewing stimuli are presented. In order to demonstrate the effectiveness of the proposed approach, the method is evaluated on three different datasets containing a wide gamut of stimulus types, such as static images, video and text segments. The obtained results indicate a minimum equal error rate (EER) of 18.3%, highlighting the potential of fixation density maps as an enhancing biometric cue in identification scenarios in dynamic visual environments.
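A fixation density map of the kind used above can be built as a duration-weighted 2-D histogram smoothed with a Gaussian. The sketch below does this for two synthetic recordings and compares them with a simple correlation, which is only a stand-in for the paper's matching procedure.

```python
# Sketch of a fixation density map (duration-weighted 2-D histogram + Gaussian smoothing)
# and a simple correlation-based similarity. Data are synthetic; the paper's pipeline is richer.
import numpy as np
from scipy.ndimage import gaussian_filter

def fixation_density_map(fix_xy, durations, shape=(72, 96), sigma=2.0):
    h, _, _ = np.histogram2d(fix_xy[:, 1], fix_xy[:, 0], bins=shape,
                             range=[[0, shape[0]], [0, shape[1]]], weights=durations)
    h = gaussian_filter(h, sigma)
    return h / (h.sum() + 1e-12)

rng = np.random.default_rng(9)
fixA = rng.normal([48, 36], [15, 10], size=(80, 2)); durA = rng.gamma(2, 150, 80)
fixB = rng.normal([50, 34], [14, 11], size=(80, 2)); durB = rng.gamma(2, 150, 80)
mA, mB = fixation_density_map(fixA, durA), fixation_density_map(fixB, durB)
similarity = np.corrcoef(mA.ravel(), mB.ravel())[0, 1]
print(f"map similarity ≈ {similarity:.2f}")
```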
Shichinohe, Natsuko; Akao, Teppei; Kurkin, Sergei; Fukushima, Junko; Kaneko, Chris R S; Fukushima, Kikuro
2009-06-11
Cortical motor areas are thought to contribute "higher-order processing," but what that processing might include is unknown. Previous studies of the smooth pursuit-related discharge of supplementary eye field (SEF) neurons have not distinguished activity associated with the preparation for pursuit from discharge related to processing or memory of the target motion signals. Using a memory-based task designed to separate these components, we show that the SEF contains signals coding retinal image-slip-velocity, memory, and assessment of visual motion direction, the decision of whether to pursue, and the preparation for pursuit eye movements. Bilateral muscimol injection into SEF resulted in directional errors in smooth pursuit, errors of whether to pursue, and impairment of initial correct eye movements. These results suggest an important role for the SEF in memory and assessment of visual motion direction and the programming of appropriate pursuit eye movements.
More to it than meets the eye: how eye movements can elucidate the development of episodic memory.
Pathman, Thanujeni; Ghetti, Simona
2016-07-01
The ability to recognise past events along with the contexts in which they occurred is a hallmark of episodic memory, a critical capacity. Eye movements have been shown to track veridical memory for the associations between events and their contexts (relational binding). Such eye-movement effects emerge several seconds before, or in the absence of, explicit response, and are linked to the integrity and function of the hippocampus. Drawing from research from infancy through late childhood, and by comparing to investigations from typical adults, patient populations, and animal models, it seems increasingly clear that eye movements reflect item-item, item-temporal, and item-spatial associations in developmental populations. We analyse this line of work, identify missing pieces in the literature and outline future avenues of research, in order to help elucidate the development of episodic memory.
Arba-Mosquera, Samuel; Aslanides, Ioannis M.
2012-01-01
Purpose To analyze the effects of Eye-Tracker performance on the pulse positioning errors during refractive surgery. Methods A comprehensive model, which directly considers eye movements, including saccades, vestibular, optokinetic, vergence, and miniature, as well as, eye-tracker acquisition rate, eye-tracker latency time, scanner positioning time, laser firing rate, and laser trigger delay have been developed. Results Eye-tracker acquisition rates below 100 Hz correspond to pulse positioning errors above 1.5 mm. Eye-tracker latency times to about 15 ms correspond to pulse positioning errors of up to 3.5 mm. Scanner positioning times to about 9 ms correspond to pulse positioning errors of up to 2 mm. Laser firing rates faster than eye-tracker acquisition rates basically duplicate pulse-positioning errors. Laser trigger delays to about 300 μs have minor to no impact on pulse-positioning errors. Conclusions The proposed model can be used for comparison of laser systems used for ablation processes. Due to the pseudo-random nature of eye movements, positioning errors of single pulses are much larger than observed decentrations in the clinical settings. There is no single parameter that ‘alone’ minimizes the positioning error. It is the optimal combination of the several parameters that minimizes the error. The results of this analysis are important to understand the limitations of correcting very irregular ablation patterns.
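A back-of-the-envelope version of the latency argument above: during the total dead time (tracker latency plus half the acquisition interval plus scanner settling) the eye keeps moving, so the pulse lands off target by roughly speed times time. The 13 mm lever arm from the eye's rotation centre to the cornea and the example saccade speed are assumptions, not the authors' model.

```python
# Back-of-the-envelope pulse positioning error (not the authors' full model): positioning
# error ≈ corneal surface speed × total dead time. Lever arm and speeds are assumptions.
import numpy as np

def pulse_error_mm(eye_speed_deg_s, latency_ms, acq_rate_hz, scanner_ms, lever_arm_mm=13.0):
    dead_time_s = latency_ms / 1000.0 + 0.5 / acq_rate_hz + scanner_ms / 1000.0
    speed_mm_s = np.deg2rad(eye_speed_deg_s) * lever_arm_mm   # angular speed -> corneal speed
    return speed_mm_s * dead_time_s

# Example: a fast 500 deg/s saccade, 1050 Hz tracker, 4 ms latency, 2 ms scanner settling
print(f"{pulse_error_mm(500, latency_ms=4, acq_rate_hz=1050, scanner_ms=2):.2f} mm")
```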
A TV Camera System Which Extracts Feature Points For Non-Contact Eye Movement Detection
NASA Astrophysics Data System (ADS)
Tomono, Akira; Iida, Muneo; Kobayashi, Yukio
1990-04-01
This paper proposes a highly efficient camera system which extracts, irrespective of background, feature points such as the pupil, the corneal reflection image and dot-marks pasted on a human face, in order to detect human eye movement by image processing. Two eye movement detection methods are suggested: one utilizing face orientation as well as pupil position, and the other utilizing the pupil and corneal reflection images. A method of extracting these feature points using LEDs as illumination devices and a new TV camera system designed to record eye movement are proposed. Two kinds of infra-red LEDs are used. These LEDs are set up a short distance apart and emit polarized light of different wavelengths. One light source beams from near the optical axis of the lens and the other is some distance from the optical axis. The LEDs are operated in synchronization with the camera. The camera includes 3 CCD image pick-up sensors and a prism system with 2 boundary layers. Incident rays are separated into 2 wavelengths by the first boundary layer of the prism. One set of rays forms an image on CCD-3. The other set is split by the half-mirror layer of the prism and forms one image that includes the regularly reflected component (a polarizing filter is placed in front of CCD-1) and another image that does not include this component (no polarizing filter in front of CCD-2). Thus, three images with different reflection characteristics are obtained by the three CCDs. Through the experiment, it is shown that two kinds of subtraction operations between the three images output from the CCDs accentuate three kinds of feature points: the pupil and corneal reflection images and the dot-marks. Since the S/N ratio of the subtracted image is extremely high, the thresholding process is simple and allows reducing the intensity of the infra-red illumination. A high-speed image processing apparatus using this camera system is described. Real-time processing of the subtraction, thresholding and centroid (gravity position) calculation of the feature points is possible.
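The subtraction-and-threshold idea above can be sketched as follows: a coaxial-illumination (bright-pupil) frame minus an off-axis (dark-pupil) frame leaves the pupil as a high-contrast blob whose centroid gives the pupil centre. The frames below are synthetic stand-ins, not output of the described camera.

```python
# Sketch of the subtraction-and-threshold idea: bright-pupil frame minus dark-pupil frame
# isolates the pupil, whose centroid is then taken as the pupil centre. Frames are synthetic.
import numpy as np

H, W = 120, 160
yy, xx = np.mgrid[0:H, 0:W]
pupil = ((yy - 60)**2 + (xx - 85)**2) < 12**2
rng = np.random.default_rng(10)
background = 80 + rng.normal(0, 5, (H, W))
bright_pupil_frame = background + 120 * pupil          # coaxial LED: retro-reflection
dark_pupil_frame = background - 30 * pupil             # off-axis LED: dark pupil

diff = bright_pupil_frame - dark_pupil_frame           # background largely cancels
mask = diff > 0.5 * diff.max()                         # simple threshold (high S/N)
cy, cx = np.argwhere(mask).mean(axis=0)
print(f"estimated pupil centre ≈ ({cx:.1f}, {cy:.1f})")  # true centre (85, 60)
```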
Cardiac autonomic denervation in Parkinson's disease is linked to REM sleep behavior disorder.
Postuma, Ronald B; Montplaisir, Jacques; Lanfranchi, Paola; Blais, Hélène; Rompré, Sylvie; Colombo, Roberto; Gagnon, Jean-François
2011-07-01
Recent studies have suggested a close connection between autonomic dysfunction and rapid eye movement sleep behavior disorder, which differs in nature from other early-stage markers of Parkinson's disease. In this study we examined the relationship between rapid eye movement sleep behavior disorder and autonomic dysfunction in Parkinson's disease as measured by cardiac beat-to-beat variability. In 53 patients with Parkinson's disease and 36 controls, electrocardiographic trace from a polysomnogram was assessed for measures of beat-to-beat RR variability including RR-standard deviation and frequency domains (low- and high-frequency components). Results were compared between patients with Parkinson's disease and controls, and between patients with Parkinson's disease with and without rapid eye movement sleep behavior disorder. On numerous cardiac autonomic measures, patients with Parkinson's disease showed clear abnormalities compared with controls. However, these abnormalities were confined only to those patients with associated rapid eye movement sleep behavior; those without were not different than controls. As with other clinical autonomic variables, cardiac autonomic denervation is predominantly associated not with Parkinson's disease itself, but with the presence of rapid eye movement sleep behavior disorder. Copyright © 2011 Movement Disorder Society.
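The beat-to-beat measures compared above (RR standard deviation and low/high-frequency power) can be computed as sketched below from an RR-interval series. The series here is synthetic, and the resampling rate and frequency bands follow common heart-rate-variability conventions rather than the paper's exact settings.

```python
# Sketch of beat-to-beat variability measures: SDNN (standard deviation of RR intervals) and
# low/high-frequency spectral power from a resampled RR tachogram. The RR series is synthetic.
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

rng = np.random.default_rng(11)
n_beats = 600
rr = (0.85 + 0.03 * np.sin(2 * np.pi * 0.25 * np.arange(n_beats) * 0.85)  # respiratory modulation
      + 0.02 * rng.normal(size=n_beats))                                  # RR intervals (s)
t_beats = np.cumsum(rr)

sdnn_ms = 1000.0 * np.std(rr, ddof=1)

# Resample the tachogram at 4 Hz and estimate the power spectrum
fs = 4.0
t_uniform = np.arange(t_beats[0], t_beats[-1], 1.0 / fs)
rr_uniform = interp1d(t_beats, rr, kind="cubic")(t_uniform)
f, pxx = welch(rr_uniform - rr_uniform.mean(), fs=fs, nperseg=256)
df = f[1] - f[0]
lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df          # low-frequency band power
hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df          # high-frequency band power
print(f"SDNN ≈ {sdnn_ms:.1f} ms, LF/HF ≈ {lf / hf:.2f}")
```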
Meal assistance robot with ultrasonic motor
NASA Astrophysics Data System (ADS)
Kodani, Yasuhiro; Tanaka, Kanya; Wakasa, Yuji; Akashi, Takuya; Oka, Masato
2007-12-01
In this paper, we have constructed a robot that helps people with upper-extremity disabilities and patients with advanced-stage amyotrophic lateral sclerosis (ALS) to eat using their residual abilities. In particular, many people suffering from advanced-stage ALS use a pacemaker and therefore need to avoid electromagnetic waves. We therefore adopt an ultrasonic motor, which does not generate electromagnetic waves, as the driving source. We also address the problems of conventional meal assistance robots. Moreover, we introduce an eye-movement interface so that people who cannot use their extremities can also operate our system. The user operates the robot not with the hands or feet but with eye movements.
Modifications of spontaneous oculomotor activity in microgravitational conditions
NASA Astrophysics Data System (ADS)
Kornilova, L. N.; Goncharenko, A. M.; Polyakov, V. V.; Grigorova, V.; Manev, A.
Investigations of spontaneous oculomotor activity were carried out before and after space flight (five cosmonauts) and during space flight (two cosmonauts) on the 3rd, 5th and 164th days of flight. Oculomotor activity was recorded by electrooculography using the automated data acquisition and processing system "Zora", based on personal computers. During and after the space flight, all the cosmonauts, whether with eyes closed or open and dark-goggled, showed a substantial increase in movement amplitude when moving the eyes to the extreme positions, especially in the vertical direction, the occurrence of corrective saccadic movements (or nystagmus), and an increase in the duration of fixation reactions.
The Face Perception System becomes Species-Specific at 3 Months: An Eye-Tracking Study
ERIC Educational Resources Information Center
Di Giorgio, Elisa; Meary, David; Pascalis, Olivier; Simion, Francesca
2013-01-01
The current study aimed at investigating own- vs. other-species preferences in 3-month-old infants. The infants' eye movements were recorded during a visual preference paradigm to assess whether they show a preference for own-species faces when contrasted with other-species faces. Human and monkey faces, equated for all low-level perceptual…
Kinematics of Visually-Guided Eye Movements
Hess, Bernhard J. M.; Thomassen, Jakob S.
2014-01-01
One of the hallmarks of an eye movement that follows Listing’s law is the half-angle rule that says that the angular velocity of the eye tilts by half the angle of eccentricity of the line of sight relative to primary eye position. Since all visually-guided eye movements in the regime of far viewing follow Listing’s law (with the head still and upright), the question about its origin is of considerable importance. Here, we provide theoretical and experimental evidence that Listing’s law results from a unique motor strategy that allows minimizing ocular torsion while smoothly tracking objects of interest along any path in visual space. The strategy consists in compounding conventional ocular rotations in meridian planes, that is in horizontal, vertical and oblique directions (which are all torsion-free) with small linear displacements of the eye in the frontal plane. Such compound rotation-displacements of the eye can explain the kinematic paradox that the fixation point may rotate in one plane while the eye rotates in other planes. Its unique signature is the half-angle law in the position domain, which means that the rotation plane of the eye tilts by half-the angle of gaze eccentricity. We show that this law does not readily generalize to the velocity domain of visually-guided eye movements because the angular eye velocity is the sum of two terms, one associated with rotations in meridian planes and one associated with displacements of the eye in the frontal plane. While the first term does not depend on eye position the second term does depend on eye position. We show that compounded rotation - displacements perfectly predict the average smooth kinematics of the eye during steady- state pursuit in both the position and velocity domain. PMID:24751602
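For orientation, the half-angle rule cited above can be stated compactly with rotation vectors, using the standard relation between a rotation vector and angular velocity. The short derivation below is background in our own notation, not a reproduction of the authors' analysis.

```latex
% Rotation-vector form of the half-angle rule (standard eye-kinematics identities, given here
% only as background to the abstract above; notation is ours).
\[
  \mathbf{r} \;=\; \tan\!\Big(\frac{\rho}{2}\Big)\,\mathbf{n},
  \qquad
  \boldsymbol{\omega} \;=\; \frac{2\big(\dot{\mathbf{r}} + \mathbf{r}\times\dot{\mathbf{r}}\big)}{1 + \lVert\mathbf{r}\rVert^{2}},
\]
where $\mathbf{r}$ is the rotation vector of the eye (rotation by angle $\rho$ about unit axis
$\mathbf{n}$) and $\boldsymbol{\omega}$ its angular velocity. If Listing's law holds,
$\mathbf{r}$ and $\dot{\mathbf{r}}$ lie in Listing's plane, so the out-of-plane (torsional)
part of $\boldsymbol{\omega}$ comes only from $\mathbf{r}\times\dot{\mathbf{r}}$. For a
movement with $\dot{\mathbf{r}}\perp\mathbf{r}$ at gaze eccentricity $\rho$,
\[
  \tan(\text{tilt})
  \;=\; \frac{\lVert\mathbf{r}\times\dot{\mathbf{r}}\rVert}{\lVert\dot{\mathbf{r}}\rVert}
  \;=\; \lVert\mathbf{r}\rVert
  \;=\; \tan\!\Big(\frac{\rho}{2}\Big),
\]
so the angular-velocity vector tilts out of Listing's plane by $\rho/2$, i.e.\ the half-angle rule.
```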
Research on driver fatigue detection
NASA Astrophysics Data System (ADS)
Zhang, Ting; Chen, Zhong; Ouyang, Chao
2018-03-01
Driver fatigue is one of the main causes of traffic accidents, so a driver fatigue detection system is of great importance for avoiding them. This paper presents a real-time method based on the fusion of multiple facial features, including eye closure, yawning and head movement. The eye state is classified as open or closed by a linear SVM classifier trained on HOG features of the detected eye. The mouth state is determined from the width-to-height ratio of the mouth. Head movement is detected from the head pitch angle calculated from facial landmarks. The driver's fatigue state is then inferred by a model trained on the above features. According to the experimental results, the driver fatigue detection achieves excellent performance, indicating that the developed method is valuable for avoiding traffic accidents caused by driver fatigue.
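The eye-state step described above (HOG features plus a linear SVM) can be sketched as below. The eye patches are random placeholders rather than frames from a driver video, and the training data and patch geometry are assumptions.

```python
# Sketch of the eye-state step: HOG features of an eye patch fed to a linear SVM that labels
# the eye open or closed. Patches are synthetic placeholders; in practice they would come
# from a face/eye detector on the driver video.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

rng = np.random.default_rng(12)

def fake_eye_patch(open_eye):
    """Stand-in 32x32 patch; 'open' eyes get a dark horizontal band (iris/pupil region)."""
    img = rng.random((32, 32)) * 0.3 + 0.5
    if open_eye:
        img[12:20, 6:26] *= 0.3
    return img

X, y = [], []
for label in (0, 1):                       # 0 = closed, 1 = open
    for _ in range(60):
        patch = fake_eye_patch(bool(label))
        X.append(hog(patch, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2)))
        y.append(label)

clf = LinearSVC(C=1.0).fit(np.array(X), np.array(y))
print("training accuracy:", clf.score(np.array(X), np.array(y)))
```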
Eye Gaze Metrics Reflect a Shared Motor Representation for Action Observation and Movement Imagery
ERIC Educational Resources Information Center
McCormick, Sheree A.; Causer, Joe; Holmes, Paul S.
2012-01-01
Action observation (AO) and movement imagery (MI) have been reported to share similar neural networks. This study investigated the congruency between AO and MI using the eye gaze metrics, dwell time and fixation number. A simple reach-grasp-place arm movement was observed and, in a second condition, imagined where the movement was presented from…
A novel role for visual perspective cues in the neural computation of depth.
Kim, HyungGoo R; Angelaki, Dora E; DeAngelis, Gregory C
2015-01-01
As we explore a scene, our eye movements add global patterns of motion to the retinal image, complicating visual motion produced by self-motion or moving objects. Conventionally, it has been assumed that extraretinal signals, such as efference copy of smooth pursuit commands, are required to compensate for the visual consequences of eye rotations. We consider an alternative possibility: namely, that the visual system can infer eye rotations from global patterns of image motion. We visually simulated combinations of eye translation and rotation, including perspective distortions that change dynamically over time. We found that incorporating these 'dynamic perspective' cues allowed the visual system to generate selectivity for depth sign from motion parallax in macaque cortical area MT, a computation that was previously thought to require extraretinal signals regarding eye velocity. Our findings suggest neural mechanisms that analyze global patterns of visual motion to perform computations that require knowledge of eye rotations.
Barmack, N H; Pettorossi, V E
1988-08-01
The influence of unilateral plugs of the left horizontal semicircular canal (LHC plugs) of rabbits on the development and compensation of asymmetric eye movements evoked by horizontal vestibular stimulation was studied. LHC plugs caused an immediate reduction of 50-65% in the gain of the horizontal vestibuloocular reflex (HVOR). This reduction in gain was achieved without altering the symmetry of the HVOR, and was accompanied by a change in the axial alignment of eye movements evoked by vestibular stimulation about the vertical (HVOR) and longitudinal (VVOR) axes. Postoperative asymmetry of eye movements developed 12-48 hr after the plugging operation. The development of asymmetry was reduced if the rabbit was restrained for 24 hr, thereby minimizing vestibular stimulation following the plugging operation. Over a 3-4 week period, the normal symmetry of eye movements was restored and the axial alignments of the HVOR and VVOR returned to the preoperative values. The gain of the HVOR did not recover. The horizontal cervicoocular reflex (HCOR) was examined before the plugging operation and after compensation of asymmetry was complete. The gain and phase of the HCOR were not altered. A relatively simple set of explanations at a cellular level is proposed to account for the induction and compensation of asymmetric eye movements following a unilateral plug of the horizontal semicircular canal.
Matsumoto, Yukiko; Takahashi, Hideyuki; Murai, Toshiya; Takahashi, Hidehiko
2015-01-01
Schizophrenia patients have impairments at several levels of cognition including visual attention (eye movements), perception, and social cognition. However, it remains unclear how lower-level cognitive deficits influence higher-level cognition. To elucidate the hierarchical path linking deficient cognitions, we focused on biological motion perception, which is involved in both the early stage of visual perception (attention) and higher social cognition, and is impaired in schizophrenia. Seventeen schizophrenia patients and 18 healthy controls participated in the study. Using point-light walker stimuli, we examined eye movements during biological motion perception in schizophrenia. We assessed relationships among eye movements, biological motion perception and empathy. In the biological motion detection task, schizophrenia patients showed lower accuracy and fixated longer than healthy controls. As opposed to controls, patients exhibiting longer fixation durations and fewer numbers of fixations demonstrated higher accuracy. Additionally, in the patient group, the correlations between accuracy and affective empathy index and between eye movement index and affective empathy index were significant. The altered gaze patterns in patients indicate that top-down attention compensates for impaired bottom-up attention. Furthermore, aberrant eye movements might lead to deficits in biological motion perception and finally link to social cognitive impairments. The current findings merit further investigation for understanding the mechanism of social cognitive training and its development. Copyright © 2014 Elsevier Ireland Ltd and the Japan Neuroscience Society. All rights reserved.
Head eye co-ordination and gaze stability in subjects with persistent whiplash associated disorders.
Treleaven, Julia; Jull, Gwendolen; Grip, Helena
2011-06-01
Symptoms of dizziness, unsteadiness and visual disturbances are frequent complaints in persons with persistent whiplash associated disorders. This study investigated eye, head co-ordination and gaze stability in subjects with persistent whiplash (n = 20) and asymptomatic controls (n = 20). Wireless motion sensors and electro-oculography were used to measure: head rotation during unconstrained head movement, head rotation during gaze stability and sequential head and eye movements. Ten control subjects participated in a repeatability study (two occasions one week apart). Between-day repeatability was acceptable (ICC > 0.6) for most measures. The whiplash group had significantly less maximal eye angle to the left, range of head movement during the gaze stability task and decreased velocity of head movement in head eye co-ordination and gaze stability tasks compared to the control group (p < 0.01). There were significant correlations (r > 0.55) between both unrestrained neck movement and neck pain and head movement and velocity in the whiplash group. Deficits in gaze stability and head eye co-ordination may be related to disturbed reflex activity associated with decreased head range of motion and/or neck pain. Further research is required to explore the mechanisms behind these deficits, the nature of changes over time and the tests' ability to measure change in response to rehabilitation. Crown Copyright © 2010. Published by Elsevier Ltd. All rights reserved.