Sample records for visual sensory processing

  1. Flexibility and Stability in Sensory Processing Revealed Using Visual-to-Auditory Sensory Substitution

    PubMed Central

    Hertz, Uri; Amedi, Amir

    2015-01-01

    The classical view of sensory processing involves independent processing in sensory cortices and multisensory integration in associative areas. This hierarchical structure has been challenged by evidence of multisensory responses in sensory areas and dynamic weighting of sensory inputs in associative areas, thus far reported independently. Here, we used a visual-to-auditory sensory substitution algorithm (SSA) to manipulate the information conveyed by sensory inputs while keeping the stimuli intact. During scan sessions before and after SSA learning, subjects were presented with visual images and auditory soundscapes. The findings reveal two dynamic processes. First, crossmodal attenuation of sensory cortices changed direction after SSA learning, from visual attenuation of the auditory cortex to auditory attenuation of the visual cortex. Second, associative areas changed their sensory response profile from responding most strongly to visual input to responding most strongly to auditory input. The interaction between these phenomena may play an important role in multisensory processing. Consistent features were also found in the sensory dominance of sensory areas and the audiovisual convergence in the associative area Middle Temporal Gyrus. These two factors allow for both stability and fast, dynamic tuning of the system when required. PMID:24518756

  2. Flexibility and Stability in Sensory Processing Revealed Using Visual-to-Auditory Sensory Substitution.

    PubMed

    Hertz, Uri; Amedi, Amir

    2015-08-01

    The classical view of sensory processing involves independent processing in sensory cortices and multisensory integration in associative areas. This hierarchical structure has been challenged by evidence of multisensory responses in sensory areas and dynamic weighting of sensory inputs in associative areas, thus far reported independently. Here, we used a visual-to-auditory sensory substitution algorithm (SSA) to manipulate the information conveyed by sensory inputs while keeping the stimuli intact. During scan sessions before and after SSA learning, subjects were presented with visual images and auditory soundscapes. The findings reveal two dynamic processes. First, crossmodal attenuation of sensory cortices changed direction after SSA learning, from visual attenuation of the auditory cortex to auditory attenuation of the visual cortex. Second, associative areas changed their sensory response profile from responding most strongly to visual input to responding most strongly to auditory input. The interaction between these phenomena may play an important role in multisensory processing. Consistent features were also found in the sensory dominance of sensory areas and the audiovisual convergence in the associative area Middle Temporal Gyrus. These two factors allow for both stability and fast, dynamic tuning of the system when required. © The Author 2014. Published by Oxford University Press.

  3. Visual perception of ADHD children with sensory processing disorder.

    PubMed

    Jung, Hyerim; Woo, Young Jae; Kang, Je Wook; Choi, Yeon Woo; Kim, Kyeong Mi

    2014-04-01

    The aim of the present study was to investigate the difference in visual perception between ADHD children with and without sensory processing disorder, and the relationship between sensory processing and visual perception in children with ADHD. Participants were 47 outpatients, aged 6-8 years, diagnosed with ADHD. After excluding those who met exclusion criteria, 38 subjects were clustered into two groups, ADHD children with and without sensory processing disorder (SPD), using the SSP reported by their parents; subjects then completed the K-DTVP-2. Spearman correlation analysis was run to determine the relationship between sensory processing and visual perception, and the Mann-Whitney U test was conducted to compare the K-DTVP-2 scores of the two groups. The ADHD children with SPD performed inferiorly to ADHD children without SPD on the 3 quotients of the K-DTVP-2. The GVP score of the K-DTVP-2 was related to the Movement Sensitivity section (r=0.368*) and the Low Energy/Weak section of the SSP (r=0.369*). The results of the present study suggest that among children with ADHD, visual perception is lower in those with co-morbid SPD. Also, visual perception may be related to sensory processing, especially reactions of the vestibular and proprioceptive senses. Regarding academic performance, it is necessary to consider how sensory processing issues affect visual perception in children with ADHD.
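
The analysis in this record (a nonparametric group comparison plus a rank correlation) can be sketched as follows. This is a minimal pure-Python illustration with hypothetical scores, not the study's data or its exact software.

```python
def mann_whitney_u(x, y):
    """U statistic for sample x: count of pairs (xi, yj) with xi > yj,
    counting ties as 0.5. (No p-value; sketch only.)"""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks,
    with average ranks assigned to ties."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg_rank = (i + j) / 2 + 1  # 1-based average rank for ties
            for k in range(i, j + 1):
                r[order[k]] = avg_rank
            i = j + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical visual-perception scores for two small groups.
with_spd = [78, 85, 81, 90, 74]
without_spd = [92, 88, 95, 84, 99]
print("U =", mann_whitney_u(with_spd, without_spd))
print("rho =", spearman_rho([30, 25, 40, 35], [60, 55, 80, 70]))
```

In practice a statistics library (e.g., `scipy.stats.mannwhitneyu` and `scipy.stats.spearmanr`) would also supply the p-values reported in such studies.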

  4. Adaptation to sensory input tunes visual cortex to criticality

    NASA Astrophysics Data System (ADS)

    Shew, Woodrow L.; Clawson, Wesley P.; Pobst, Jeff; Karimipanah, Yahya; Wright, Nathaniel C.; Wessel, Ralf

    2015-08-01

    A long-standing hypothesis at the interface of physics and neuroscience is that neural networks self-organize to the critical point of a phase transition, thereby optimizing aspects of sensory information processing. This idea is partially supported by strong evidence for critical dynamics observed in the cerebral cortex, but the impact of sensory input on these dynamics is largely unknown. Thus, the foundations of this hypothesis--the self-organization process and how it manifests during strong sensory input--remain unstudied experimentally. Here we show in visual cortex and in a computational model that strong sensory input initially elicits cortical network dynamics that are not critical, but adaptive changes in the network rapidly tune the system to criticality. This conclusion is based on observations of multifaceted scaling laws predicted to occur at criticality. Our findings establish sensory adaptation as a self-organizing mechanism that maintains criticality in visual cortex during sensory information processing.

  5. Sensory system plasticity in a visually specialized, nocturnal spider.

    PubMed

    Stafstrom, Jay A; Michalik, Peter; Hebets, Eileen A

    2017-04-21

    The interplay between an animal's environmental niche and its behavior can influence the evolutionary form and function of its sensory systems. While intraspecific variation in sensory systems has been documented across distant taxa, fewer studies have investigated how changes in behavior might relate to plasticity in sensory systems across developmental time. To investigate the relationships among behavior, peripheral sensory structures, and central processing regions in the brain, we take advantage of a dramatic within-species shift of behavior in a nocturnal, net-casting spider (Deinopis spinosa), where males cease visually-mediated foraging upon maturation. We compared eye diameters and brain region volumes across sex and life stage, the latter through micro-computed X-ray tomography. We show that mature males possess altered peripheral visual morphology when compared to their juvenile counterparts, as well as juvenile and mature females. Matching peripheral sensory structure modifications, we uncovered differences in relative investment in both lower-order and higher-order processing regions in the brain responsible for visual processing. Our study provides evidence for sensory system plasticity when individuals dramatically change behavior across life stages, uncovering new avenues of inquiry focusing on altered reliance of specific sensory information when entering a new behavioral niche.

  6. Enhanced alpha-oscillations in visual cortex during anticipation of self-generated visual stimulation.

    PubMed

    Stenner, Max-Philipp; Bauer, Markus; Haggard, Patrick; Heinze, Hans-Jochen; Dolan, Ray

    2014-11-01

    The perceived intensity of sensory stimuli is reduced when these stimuli are caused by the observer's actions. This phenomenon is traditionally explained by forward models of sensory action-outcome, which arise from motor processing. Although these forward models critically predict anticipatory modulation of sensory neural processing, neurophysiological evidence for anticipatory modulation is sparse and has not been linked to perceptual data showing sensory attenuation. By combining a psychophysical task involving contrast discrimination with source-level time-frequency analysis of MEG data, we demonstrate that the amplitude of alpha-oscillations in visual cortex is enhanced before the onset of a visual stimulus when the identity and onset of the stimulus are controlled by participants' motor actions. Critically, this prestimulus enhancement of alpha-amplitude is paralleled by psychophysical judgments of a reduced contrast for this stimulus. We suggest that alpha-oscillations in visual cortex preceding self-generated visual stimulation are a likely neurophysiological signature of motor-induced sensory anticipation and mediate sensory attenuation. We discuss our results in relation to proposals that attribute generic inhibitory functions to alpha-oscillations in prioritizing and gating sensory information via top-down control.

  7. Persistent recruitment of somatosensory cortex during active maintenance of hand images in working memory.

    PubMed

    Galvez-Pol, A; Calvo-Merino, B; Capilla, A; Forster, B

    2018-07-01

    Working memory (WM) supports temporary maintenance of task-relevant information. This process is associated with persistent activity in the sensory cortex processing the information (e.g., visual stimuli activate visual cortex). However, we argue here that more multifaceted stimuli moderate this sensory-locked activity and recruit distinctive cortices. Specifically, perception of bodies recruits somatosensory cortex (SCx) beyond early visual areas (suggesting embodiment processes). Here we explore persistent activation in processing areas beyond the sensory cortex initially relevant to the modality of the stimuli. Using visual and somatosensory evoked-potentials in a visual WM task, we isolated different levels of visual and somatosensory involvement during encoding of body and non-body-related images. Persistent activity increased in SCx only when maintaining body images in WM, whereas visual/posterior regions' activity increased significantly when maintaining non-body images. Our results bridge WM and embodiment frameworks, supporting a dynamic WM process where the nature of the information summons specific processing resources. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. Sensory Symptoms and Processing of Nonverbal Auditory and Visual Stimuli in Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Stewart, Claire R.; Sanchez, Sandra S.; Grenesko, Emily L.; Brown, Christine M.; Chen, Colleen P.; Keehn, Brandon; Velasquez, Francisco; Lincoln, Alan J.; Müller, Ralph-Axel

    2016-01-01

    Atypical sensory responses are common in autism spectrum disorder (ASD). While evidence suggests impaired auditory-visual integration for verbal information, findings for nonverbal stimuli are inconsistent. We tested for sensory symptoms in children with ASD (using the Adolescent/Adult Sensory Profile) and examined unisensory and bisensory…

  9. Task-dependent modulation of the visual sensory thalamus assists visual-speech recognition.

    PubMed

    Díaz, Begoña; Blank, Helen; von Kriegstein, Katharina

    2018-05-14

    The cerebral cortex modulates early sensory processing via feedback connections to sensory pathway nuclei. The functions of this top-down modulation for human behavior are poorly understood. Here, we show that top-down modulation of the visual sensory thalamus (the lateral geniculate body, LGN) is involved in visual-speech recognition. In two independent functional magnetic resonance imaging (fMRI) studies, LGN response increased when participants processed fast-varying features of articulatory movements required for visual-speech recognition, as compared to temporally more stable features required for face identification with the same stimulus material. The LGN response during the visual-speech task correlated positively with visual-speech recognition scores across participants. In addition, the task-dependent modulation was present for speech movements and did not occur for control conditions involving non-speech biological movements. In face-to-face communication, visual-speech recognition is used to enhance or even enable understanding of what is said. Speech recognition is commonly explained in frameworks focusing on cerebral cortex areas. Our findings suggest that task-dependent modulation at subcortical sensory stages plays an important role in communication: together with similar findings in the auditory modality, they imply that task-dependent modulation of the sensory thalami is a general mechanism for optimizing speech recognition. Copyright © 2018. Published by Elsevier Inc.

  10. Sensori-motor experience leads to changes in visual processing in the developing brain.

    PubMed

    James, Karin Harman

    2010-03-01

    Since Broca's studies on language processing, cortical functional specialization has been considered to be integral to efficient neural processing. A fundamental question in cognitive neuroscience concerns the type of learning that is required for functional specialization to develop. To address this issue with respect to the development of neural specialization for letters, we used functional magnetic resonance imaging (fMRI) to compare brain activation patterns in pre-school children before and after different letter-learning conditions: a sensori-motor group practised printing letters during the learning phase, while the control group practised visual recognition. Results demonstrated an overall left-hemisphere bias for processing letters in these pre-literate participants, but, more interestingly, showed enhanced blood oxygen-level-dependent activation in the visual association cortex during letter perception only after sensori-motor (printing) learning. It is concluded that sensori-motor experience augments processing in the visual system of pre-school children. The change of activation in these neural circuits provides important evidence that 'learning-by-doing' can lay the foundation for, and potentially strengthen, the neural systems used for visual letter recognition.

  11. On the dependence of response inhibition processes on sensory modality.

    PubMed

    Bodmer, Benjamin; Beste, Christian

    2017-04-01

    The ability to inhibit responses is a central sensorimotor function, but only recently has the importance of sensory processes for motor inhibition mechanisms come into research focus. In this regard, it remains unclear whether sensory modalities differ in their capacity to trigger response inhibition processes. On functional neuroanatomical grounds, strong differences may exist, for example, between the visual and the tactile modality. In the current study we examined which neurophysiological mechanisms and functional neuroanatomical networks are modulated during response inhibition. To this end, a Go/NoGo paradigm employing a novel combination of visual, tactile, and visuotactile stimuli was used. The data show that the tactile modality is more powerful than the visual modality in triggering response inhibition processes. However, the tactile modality loses its efficacy to trigger response inhibition processes when combined with the visual modality. This may be due to competitive mechanisms leading to a suppression of certain sensory stimuli at the response selection level. Variations in sensory modalities specifically affected conflict monitoring processes during response inhibition by modulating activity in a frontoparietal network including the right inferior frontal gyrus, the anterior cingulate cortex, and the temporoparietal junction. Attentional selection processes were not modulated. The results suggest that the functional neuroanatomical networks involved in response inhibition critically depend on the nature of the sensory input. Hum Brain Mapp 38:1941-1951, 2017. © 2017 Wiley Periodicals, Inc.

  12. Orienting attention to visual or verbal/auditory imagery differentially impairs the processing of visual stimuli.

    PubMed

    Villena-González, Mario; López, Vladimir; Rodríguez, Eugenio

    2016-05-15

    When attention is oriented toward inner thoughts, as spontaneously occurs during mind wandering, the processing of external information is attenuated. However, the potential effects of the content of thoughts on sensory attenuation are still unknown. The present study aims to assess whether the representational format of thoughts, such as visual imagery or inner speech, might differentially affect the sensory processing of external stimuli. We recorded the brain activity of 20 participants (12 women) while they were exposed to a probe visual stimulus in three different conditions: executing a task on the visual probe (externally oriented attention), and two conditions involving inward-turned attention, i.e. generating inner speech and performing visual imagery. Event-related potential results showed that the P1 amplitude, related to the sensory response, was significantly attenuated during both tasks involving inward attention compared with the external task. When the two representational formats were compared, the visual imagery condition showed stronger attenuation of sensory processing than the inner speech condition. Alpha power in visual areas was measured as an index of cortical inhibition. Larger alpha amplitude was found when participants engaged in an internal thought compared with the external task, with visual imagery showing even greater alpha power than inner speech. Our results show, for the first time to our knowledge, that visual attentional processing of external stimuli during self-generated thoughts is differentially affected by the representational format of the ongoing train of thoughts. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. The Inversion of Sensory Processing by Feedback Pathways: A Model of Visual Cognitive Functions.

    ERIC Educational Resources Information Center

    Harth, E.; And Others

    1987-01-01

    Explains the hierarchic structure of the mammalian visual system. Proposes a model in which feedback pathways serve to modify sensory stimuli in ways that enhance and complete sensory input patterns. Investigates the functioning of the system through computer simulations. (ML)

  14. Rhythmic Oscillations of Visual Contrast Sensitivity Synchronized with Action

    PubMed Central

    Tomassini, Alice; Spinelli, Donatella; Jacono, Marco; Sandini, Giulio; Morrone, Maria Concetta

    2016-01-01

    It is well known that the motor and the sensory systems structure sensory data collection and cooperate to achieve an efficient integration and exchange of information. Increasing evidence suggests that both motor and sensory functions are regulated by rhythmic processes reflecting alternating states of neuronal excitability, and these may be involved in mediating sensory-motor interactions. Here we show an oscillatory fluctuation in early visual processing time locked with the execution of voluntary action, and, crucially, even for visual stimuli irrelevant to the motor task. Human participants were asked to perform a reaching movement toward a display and judge the orientation of a Gabor patch, near contrast threshold, briefly presented at random times before and during the reaching movement. When the data are temporally aligned to the onset of movement, visual contrast sensitivity oscillates with periodicity within the theta band. Importantly, the oscillations emerge during the motor planning stage, ~500 ms before movement onset. We suggest that brain oscillatory dynamics may mediate an automatic coupling between early motor planning and early visual processing, possibly instrumental in linking and closing up the visual-motor control loop. PMID:25948254
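
The key analysis in this record, detecting rhythmic (theta-band) fluctuations in contrast sensitivity after aligning trials to movement onset, amounts to estimating Fourier amplitude at candidate frequencies. A minimal sketch on a synthetic sensitivity trace follows; the 5 Hz oscillation, sampling grid, and amplitudes are illustrative assumptions, not the study's data.

```python
import math

def dft_amplitude(signal, dt, freq):
    """Amplitude of the Fourier component at `freq` (Hz) of an
    evenly sampled, demeaned signal with sample spacing dt (s)."""
    n = len(signal)
    mean = sum(signal) / n
    re = sum((s - mean) * math.cos(2 * math.pi * freq * i * dt)
             for i, s in enumerate(signal))
    im = sum((s - mean) * math.sin(2 * math.pi * freq * i * dt)
             for i, s in enumerate(signal))
    return 2.0 * math.sqrt(re * re + im * im) / n

# Synthetic contrast-sensitivity trace: a 5 Hz (theta-band) oscillation
# around a baseline, sampled every 20 ms over a 1 s window.
dt = 0.02
trace = [1.0 + 0.3 * math.sin(2 * math.pi * 5 * i * dt) for i in range(50)]

# Scan candidate frequencies; the theta component dominates (~0.3 at 5 Hz).
for f in (2, 5, 8, 12):
    print(f"{f:2d} Hz: amplitude {dft_amplitude(trace, dt, f):.3f}")
```

With real data, the trace would be the time-binned proportion of correct orientation judgments aligned to movement onset, and significance would be assessed against surrogate (shuffled) data.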

  15. Action preparation modulates sensory perception in unseen personal space: An electrophysiological investigation.

    PubMed

    Job, Xavier E; de Fockert, Jan W; van Velzen, José

    2016-08-01

    Behavioural and electrophysiological evidence has demonstrated that preparation of goal-directed actions modulates sensory perception at the goal location before the action is executed. However, previous studies have focused on sensory perception in areas of peripersonal space. The present study investigated visual and tactile sensory processing at the goal location of upcoming movements towards the body, much of which is not visible, as well as visible peripersonal space. A motor task cued participants to prepare a reaching movement towards goals either in peripersonal space in front of them or personal space on the upper chest. In order to assess modulations of sensory perception during movement preparation, event-related potentials (ERPs) were recorded in response to task-irrelevant visual and tactile probe stimuli delivered randomly at one of the goal locations of the movements. In line with previous neurophysiological findings, movement preparation modulated visual processing at the goal of a movement in peripersonal space. Movement preparation also modulated somatosensory processing at the movement goal in personal space. The findings demonstrate that tactile perception in personal space is subject to similar top-down sensory modulation by motor preparation as observed for visual stimuli presented in peripersonal space. These findings show for the first time that the principles and mechanisms underlying adaptive modulation of sensory processing in the context of action extend to tactile perception in unseen personal space. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Contribution of amygdalar and lateral hypothalamic neurons to visual information processing of food and nonfood in monkey.

    PubMed

    Ono, T; Tamura, R; Nishijo, H; Nakamura, K; Tabuchi, E

    1989-02-01

    Visual information processing was investigated in the inferotemporal cortical (ITCx)-amygdalar (AM)-lateral hypothalamic (LHA) axis, which contributes to food-nonfood discrimination. Neuronal activity was recorded from monkey AM and LHA during discrimination of sensory stimuli including the sight of food or nonfood. The task had four phases: control, visual, bar press, and ingestion. Of 710 AM neurons tested, 220 (31.0%) responded during the visual phase: 48 (6.8%) to visual stimulation only, 13 (1.9%) to visual plus oral sensory stimulation, 142 (20.0%) to multimodal stimulation, and 17 (2.4%) to one affectively significant item. Of 669 LHA neurons tested, 106 (15.8%) responded in the visual phase. Of 80 visual-related neurons tested systematically, 33 (41.2%) responded selectively to the sight of any object predicting the availability of reward, and 47 (58.8%) responded nondifferentially to both food and nonfood. Many AM neuron responses were graded according to the degree of affective significance of the sensory stimuli (sensory-affective association), but responses of LHA food-responsive neurons did not depend on the kind of reward indicated by the sensory stimuli (stimulus-reinforcement association). Some AM and LHA food responses were modulated by extinction or reversal. Dynamic information processing in the ITCx-AM-LHA axis was investigated through reversible deficits induced by bilateral cooling of ITCx or AM. ITCx cooling suppressed discrimination by vision-responding AM neurons (8/17). AM cooling suppressed LHA responses to food (9/22). We suggest deep AM-LHA involvement in food-nonfood discrimination based on AM sensory-affective association and LHA stimulus-reinforcement association.

  17. Enhanced and bilateralized visual sensory processing in the ventral stream may be a feature of normal aging.

    PubMed

    De Sanctis, Pierfilippo; Katz, Richard; Wylie, Glenn R; Sehatpour, Pejman; Alexopoulos, George S; Foxe, John J

    2008-10-01

    Evidence has emerged for age-related amplification of basic sensory processing indexed by early components of the visual evoked potential (VEP). However, since these age-related effects have been incidental to the main focus of these studies, it is unclear whether they are performance dependent or alternately, represent intrinsic sensory processing changes. High-density VEPs were acquired from 19 healthy elderly and 15 young control participants who viewed alphanumeric stimuli in the absence of any active task. The data show both enhanced and delayed neural responses within structures of the ventral visual stream, with reduced hemispheric asymmetry in the elderly that may be indicative of a decline in hemispheric specialization. Additionally, considerably enhanced early frontal cortical activation was observed in the elderly, suggesting frontal hyper-activation. These age-related differences in early sensory processing are discussed in terms of recent proposals that normal aging involves large-scale compensatory reorganization. Our results suggest that such compensatory mechanisms are not restricted to later higher-order cognitive processes but may also be a feature of early sensory-perceptual processes.

  18. Locomotor sensory organization test: a novel paradigm for the assessment of sensory contributions in gait.

    PubMed

    Chien, Jung Hung; Eikema, Diderik-Jan Anthony; Mukherjee, Mukul; Stergiou, Nicholas

    2014-12-01

    Feedback based balance control requires the integration of visual, proprioceptive and vestibular input to detect the body's movement within the environment. When the accuracy of sensory signals is compromised, the system reorganizes the relative contributions through a process of sensory recalibration, for upright postural stability to be maintained. Whereas this process has been studied extensively in standing using the Sensory Organization Test (SOT), less is known about these processes in more dynamic tasks such as locomotion. In the present study, ten healthy young adults performed the six conditions of the traditional SOT to quantify standing postural control when exposed to sensory conflict. The same subjects performed these six conditions using a novel experimental paradigm, the Locomotor SOT (LSOT), to study dynamic postural control during walking under similar types of sensory conflict. To quantify postural control during walking, the net Center of Pressure sway variability was used. This corresponds to the Performance Index of the center of pressure trajectory, which is used to quantify postural control during standing. Our results indicate that dynamic balance control during locomotion in healthy individuals is affected by the systematic manipulation of multisensory inputs. The sway variability patterns observed during locomotion reflect similar balance performance with standing posture, indicating that similar feedback processes may be involved. However, the contribution of visual input is significantly increased during locomotion, compared to standing in similar sensory conflict conditions. The increased visual gain in the LSOT conditions reflects the importance of visual input for the control of locomotion. Since balance perturbations tend to occur in dynamic tasks and in response to environmental constraints not present during the SOT, the LSOT may provide additional information for clinical evaluation on healthy and deficient sensory processing.
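
The outcome measure described in this record, net Center of Pressure (COP) sway variability, can be sketched as the standard deviation of the resultant COP displacement about its mean. This is one common operationalization, assumed here for illustration; the coordinate values are hypothetical and the study's exact metric may differ.

```python
def sway_variability(cop_x, cop_y):
    """Sway variability of a net center-of-pressure (COP) trajectory:
    standard deviation of the resultant displacement of each sample
    from the mean COP position. One common definition, used here for
    illustration only."""
    n = len(cop_x)
    mx = sum(cop_x) / n
    my = sum(cop_y) / n
    disp = [((x - mx) ** 2 + (y - my) ** 2) ** 0.5
            for x, y in zip(cop_x, cop_y)]
    mean_disp = sum(disp) / n
    return (sum((d - mean_disp) ** 2 for d in disp) / n) ** 0.5

# Hypothetical COP samples (cm) over a few moments of stance.
cop_x = [0.0, 0.4, 0.1, -0.3, -0.2]
cop_y = [0.1, -0.2, 0.3, 0.0, -0.1]
print(round(sway_variability(cop_x, cop_y), 4))
```

A perfectly circular trajectory around the mean yields zero variability under this definition, which is why the dispersion of displacements, rather than their average magnitude, is the quantity of interest.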

  19. Processes to Preserve Spice and Herb Quality and Sensory Integrity During Pathogen Inactivation

    PubMed Central

    Moberg, Kayla; Amin, Kemia N.; Wright, Melissa; Newkirk, Jordan J.; Ponder, Monica A.; Acuff, Gary R.; Dickson, James S.

    2017-01-01

    Selected processing methods, demonstrated to be effective at reducing Salmonella, were assessed to determine if spice and herb quality was affected. Black peppercorn, cumin seed, oregano, and onion powder were irradiated to a target dose of 8 kGy. Two additional processes were examined for whole black peppercorns and cumin seeds: ethylene oxide (EtO) fumigation and vacuum-assisted steam (82.22 °C, 7.5 psia). Treated and untreated spices/herbs were compared (visual, odor) using sensory similarity testing protocols (α = 0.20; β = 0.05; proportion of discriminators: 20%) to determine if processing altered sensory quality. Analytical assessment of quality (color, water activity, and volatile chemistry) was completed. Irradiation did not alter the visual or odor sensory quality of black peppercorn, cumin seed, or oregano but created differences in onion powder, which was lighter (higher L*) and more red (higher a*) in color, and resulted in nearly complete loss of measured volatile compounds. EtO processing did not create detectable odor or appearance differences in black peppercorn; however, visual and odor sensory quality differences, supported by changes in color (higher b*; lower L*) and increased concentrations of most volatiles, were detected for cumin seeds. Steam processing of black peppercorn resulted in perceptible odor differences, supported by increased concentration of monoterpene volatiles and loss of all sesquiterpenes; only visual differences were noted for cumin seed. An important step in process validation is the verification that no effect is detectable from a sensory perspective. PMID:28407236

  20. Top-down modulation of visual and auditory cortical processing in aging.

    PubMed

    Guerreiro, Maria J S; Eck, Judith; Moerel, Michelle; Evers, Elisabeth A T; Van Gerven, Pascal W M

    2015-02-01

    Age-related cognitive decline has been accounted for by an age-related deficit in top-down attentional modulation of sensory cortical processing. In light of recent behavioral findings showing that age-related differences in selective attention are modality dependent, our goal was to investigate the role of sensory modality in age-related differences in top-down modulation of sensory cortical processing. This question was addressed by testing younger and older individuals in several memory tasks while undergoing fMRI. Throughout these tasks, perceptual features were kept constant while attentional instructions were varied, allowing us to devise all combinations of relevant and irrelevant, visual and auditory information. We found no top-down modulation of auditory sensory cortical processing in either age group. In contrast, we found top-down modulation of visual cortical processing in both age groups, and this effect did not differ between age groups. That is, older adults enhanced cortical processing of relevant visual information and suppressed cortical processing of visual distractors during auditory attention to the same extent as younger adults. The present results indicate that older adults are capable of suppressing irrelevant visual information in the context of cross-modal auditory attention, and thereby challenge the view that age-related attentional and cognitive decline is due to a general deficit in the ability to suppress irrelevant information. Copyright © 2014 Elsevier B.V. All rights reserved.

  21. Modality-specificity of sensory aging in vision and audition: evidence from event-related potentials.

    PubMed

    Ceponiene, R; Westerfield, M; Torki, M; Townsend, J

    2008-06-18

    Major accounts of aging implicate changes in the processing of external stimulus information. Little is known about differential effects of auditory and visual sensory aging, and the mechanisms of sensory aging are still poorly understood. Using event-related potentials (ERPs) elicited by unattended stimuli in younger (M=25.5 yrs) and older (M=71.3 yrs) subjects, this study examined mechanisms of sensory aging under minimized attention conditions. Auditory and visual modalities were examined to address modality-specificity vs. generality of sensory aging. Between-modality differences were robust. The earlier-latency responses (P1, N1) were unaffected in the auditory modality but were diminished in the visual modality. The auditory N2 and early visual N2 were diminished. Two similarities between the modalities were age-related enhancements in the late P2 range and a positive correlation between behavior and the early N2, the latter suggesting that the N2 may reflect long-latency inhibition of irrelevant stimuli. Since there is no evidence for salient differences in neurobiological aging between the two sensory regions, the observed between-modality differences are best explained by the differential reliance of the auditory and visual systems on attention. Visual sensory processing relies on facilitation by visuo-spatial attention, withdrawal of which appears to be more disadvantageous in older populations. In contrast, auditory processing is equipped with powerful inhibitory capacities. However, when the whole auditory modality is unattended, thalamo-cortical gating deficits may not manifest in the elderly. In contrast, ERP indices of longer-latency, stimulus-level inhibitory modulation appear to diminish with age.

  2. Sensory processing patterns predict the integration of information held in visual working memory.

    PubMed

    Lowe, Matthew X; Stevenson, Ryan A; Wilson, Kristin E; Ouslis, Natasha E; Barense, Morgan D; Cant, Jonathan S; Ferber, Susanne

    2016-02-01

Given the limited resources of visual working memory, multiple items may be remembered as an averaged group or ensemble. As a result, local information may be ill-defined, but these ensemble representations provide accurate diagnostics of the natural world by combining gist information with item-level information held in visual working memory. Some neurodevelopmental disorders are characterized by sensory processing profiles that predispose individuals to avoid or seek out sensory stimulation, fundamentally altering their perceptual experience. Here, we report that such processing styles affect the computation of ensemble statistics in the general population. We identified stable adult sensory processing patterns to demonstrate that individuals with low sensory thresholds who show a greater proclivity to engage in active response strategies to prevent sensory overstimulation are less likely to integrate mean size information across a set of similar items and are therefore more likely to be biased away from the mean size representation of an ensemble display. We therefore propose that the study of ensemble processing should extend beyond the statistics of the display and should also consider the statistics of the observer. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
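The item-versus-ensemble comparison described above can be made concrete with a minimal sketch. The function name and scoring rule below are hypothetical illustrations, not taken from the study: a report is scored as biased toward the ensemble mean when it lies closer to the display's mean size than the true item did, and biased away when it lies farther.

```python
import statistics

def bias_toward_mean(true_size, reported_size, item_sizes):
    """Positive when the reported size lies closer to the ensemble mean
    than the true item size did (pull toward the mean); negative when the
    report moved away from the mean, as described for low-threshold
    observers. Sizes are in arbitrary units."""
    ensemble_mean = statistics.fmean(item_sizes)
    return abs(true_size - ensemble_mean) - abs(reported_size - ensemble_mean)

# hypothetical display of item sizes; the probed item had size 40, mean is 25
display = [10, 20, 30, 40]
print(bias_toward_mean(40, 35, display))  # 5.0: report pulled toward the mean
print(bias_toward_mean(40, 45, display))  # -5.0: report pushed away from the mean
```

A bias score like this, computed per trial, is one simple way to quantify "biased away from the mean size representation" at the level of individual observers.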

  3. Eagle-eyed visual acuity: an experimental investigation of enhanced perception in autism.

    PubMed

    Ashwin, Emma; Ashwin, Chris; Rhydderch, Danielle; Howells, Jessica; Baron-Cohen, Simon

    2009-01-01

Anecdotal accounts of sensory hypersensitivity in individuals with autism spectrum conditions (ASC) have been noted since the first reports of the condition. Over time, empirical evidence has supported the notion that those with ASC have superior visual abilities compared with control subjects. However, it remains unclear whether these abilities are specifically the result of differences in sensory thresholds (low-level processing), rather than higher-level cognitive processes. This study investigates visual threshold in n = 15 individuals with ASC and n = 15 individuals without ASC, using a standardized optometric test, the Freiburg Visual Acuity and Contrast Test, to investigate basic low-level visual acuity. Individuals with ASC have significantly better visual acuity (20:7) compared with control subjects (20:13), acuity so superior that it lies in the region reported for birds of prey. The results of this study suggest that inclusion of sensory hypersensitivity in the diagnostic criteria for ASC may be warranted and that basic standardized tests of sensory thresholds may inform causal theories of ASC.
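To put the 20:7 figure in context, Snellen fractions convert to decimal acuity and logMAR with simple arithmetic. This is a generic optometric conversion, not part of the study's methods; 20:7 means reading at 20 feet what a standard observer reads at 7 feet.

```python
import math

def snellen_to_decimal(test_distance, letter_distance):
    """Decimal acuity from a Snellen fraction: 20/20 -> 1.0, 20/40 -> 0.5.
    Values above 1.0 are better than the nominal standard."""
    return test_distance / letter_distance

def decimal_to_logmar(decimal_acuity):
    """logMAR = -log10(decimal acuity); negative logMAR is better than 20/20."""
    return -math.log10(decimal_acuity)

asc = snellen_to_decimal(20, 7)    # ~2.86, far above the 1.0 standard
ctrl = snellen_to_decimal(20, 13)  # ~1.54, still above standard
print(round(asc, 2), round(decimal_to_logmar(asc), 2))    # 2.86 -0.46
print(round(ctrl, 2), round(decimal_to_logmar(ctrl), 2))  # 1.54 -0.19
```

On this scale both groups score better than the 20/20 standard, but the ASC mean is nearly twice the control decimal acuity.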

  4. Late development of cue integration is linked to sensory fusion in cortex.

    PubMed

    Dekker, Tessa M; Ban, Hiroshi; van der Velde, Bauke; Sereno, Martin I; Welchman, Andrew E; Nardini, Marko

    2015-11-02

    Adults optimize perceptual judgements by integrating different types of sensory information [1, 2]. This engages specialized neural circuits that fuse signals from the same [3-5] or different [6] modalities. Whereas young children can use sensory cues independently, adult-like precision gains from cue combination only emerge around ages 10 to 11 years [7-9]. Why does it take so long to make best use of sensory information? Existing data cannot distinguish whether this (1) reflects surprisingly late changes in sensory processing (sensory integration mechanisms in the brain are still developing) or (2) depends on post-perceptual changes (integration in sensory cortex is adult-like, but higher-level decision processes do not access the information) [10]. We tested visual depth cue integration in the developing brain to distinguish these possibilities. We presented children aged 6-12 years with displays depicting depth from binocular disparity and relative motion and made measurements using psychophysics, retinotopic mapping, and pattern classification fMRI. Older children (>10.5 years) showed clear evidence for sensory fusion in V3B, a visual area thought to integrate depth cues in the adult brain [3-5]. By contrast, in younger children (<10.5 years), there was no evidence for sensory fusion in any visual area. This significant age difference was paired with a shift in perceptual performance around ages 10 to 11 years and could not be explained by motion artifacts, visual attention, or signal quality differences. Thus, whereas many basic visual processes mature early in childhood [11, 12], the brain circuits that fuse cues take a very long time to develop. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.

  5. Late Development of Cue Integration Is Linked to Sensory Fusion in Cortex

    PubMed Central

    Dekker, Tessa M.; Ban, Hiroshi; van der Velde, Bauke; Sereno, Martin I.; Welchman, Andrew E.; Nardini, Marko

    2015-01-01

    Summary Adults optimize perceptual judgements by integrating different types of sensory information [1, 2]. This engages specialized neural circuits that fuse signals from the same [3, 4, 5] or different [6] modalities. Whereas young children can use sensory cues independently, adult-like precision gains from cue combination only emerge around ages 10 to 11 years [7, 8, 9]. Why does it take so long to make best use of sensory information? Existing data cannot distinguish whether this (1) reflects surprisingly late changes in sensory processing (sensory integration mechanisms in the brain are still developing) or (2) depends on post-perceptual changes (integration in sensory cortex is adult-like, but higher-level decision processes do not access the information) [10]. We tested visual depth cue integration in the developing brain to distinguish these possibilities. We presented children aged 6–12 years with displays depicting depth from binocular disparity and relative motion and made measurements using psychophysics, retinotopic mapping, and pattern classification fMRI. Older children (>10.5 years) showed clear evidence for sensory fusion in V3B, a visual area thought to integrate depth cues in the adult brain [3, 4, 5]. By contrast, in younger children (<10.5 years), there was no evidence for sensory fusion in any visual area. This significant age difference was paired with a shift in perceptual performance around ages 10 to 11 years and could not be explained by motion artifacts, visual attention, or signal quality differences. Thus, whereas many basic visual processes mature early in childhood [11, 12], the brain circuits that fuse cues take a very long time to develop. PMID:26480841

  6. Awake vs. anesthetized: layer-specific sensory processing in visual cortex and functional connectivity between cortical areas

    PubMed Central

    Sellers, Kristin K.; Bennett, Davis V.; Hutt, Axel; Williams, James H.

    2015-01-01

    During general anesthesia, global brain activity and behavioral state are profoundly altered. Yet it remains mostly unknown how anesthetics alter sensory processing across cortical layers and modulate functional cortico-cortical connectivity. To address this gap in knowledge of the micro- and mesoscale effects of anesthetics on sensory processing in the cortical microcircuit, we recorded multiunit activity and local field potential in awake and anesthetized ferrets (Mustela putoris furo) during sensory stimulation. To understand how anesthetics alter sensory processing in a primary sensory area and the representation of sensory input in higher-order association areas, we studied the local sensory responses and long-range functional connectivity of primary visual cortex (V1) and prefrontal cortex (PFC). Isoflurane combined with xylazine provided general anesthesia for all anesthetized recordings. We found that anesthetics altered the duration of sensory-evoked responses, disrupted the response dynamics across cortical layers, suppressed both multimodal interactions in V1 and sensory responses in PFC, and reduced functional cortico-cortical connectivity between V1 and PFC. Together, the present findings demonstrate altered sensory responses and impaired functional network connectivity during anesthesia at the level of multiunit activity and local field potential across cortical layers. PMID:25833839

  7. Visually Evoked 3-5 Hz Membrane Potential Oscillations Reduce the Responsiveness of Visual Cortex Neurons in Awake Behaving Mice.

    PubMed

    Einstein, Michael C; Polack, Pierre-Olivier; Tran, Duy T; Golshani, Peyman

    2017-05-17

Low-frequency membrane potential (Vm) oscillations were once thought to only occur in sleeping and anesthetized states. Recently, low-frequency Vm oscillations have been described in inactive awake animals, but it is unclear whether they shape sensory processing in neurons and whether they occur during active awake behavioral states. To answer these questions, we performed two-photon guided whole-cell Vm recordings from primary visual cortex layer 2/3 excitatory and inhibitory neurons in awake mice during passive visual stimulation and performance of visual and auditory discrimination tasks. We recorded stereotyped 3-5 Hz Vm oscillations in which the Vm baseline hyperpolarized as the Vm underwent high-amplitude rhythmic fluctuations lasting 1-2 s. When 3-5 Hz Vm oscillations coincided with visual cues, excitatory neuron responses to preferred cues were significantly reduced. Despite this disruption to sensory processing, visual cues were critical for evoking 3-5 Hz Vm oscillations when animals performed discrimination tasks and passively viewed drifting grating stimuli. Using pupillometry and animal locomotive speed as indicators of arousal, we found that 3-5 Hz oscillations were not restricted to unaroused states and that they occurred equally in aroused and unaroused states. Therefore, low-frequency Vm oscillations play a role in shaping sensory processing in visual cortical neurons, even during active wakefulness and decision making. SIGNIFICANCE STATEMENT A neuron's membrane potential (Vm) strongly shapes how information is processed in sensory cortices of awake animals. Yet, very little is known about how low-frequency Vm oscillations influence sensory processing and whether they occur in aroused awake animals. By performing two-photon guided whole-cell recordings from layer 2/3 excitatory and inhibitory neurons in the visual cortex of awake behaving animals, we found visually evoked stereotyped 3-5 Hz Vm oscillations that disrupt excitatory responsiveness to visual stimuli. Moreover, these oscillations occurred when animals were in high and low arousal states as measured by animal speed and pupillometry. These findings show, for the first time, that low-frequency Vm oscillations can significantly modulate sensory signal processing, even in awake active animals. Copyright © 2017 the authors 0270-6474/17/375084-15$15.00/0.

  8. Exploring Mechanisms Underlying Impaired Brain Function in Gulf War Illness through Advanced Network Analysis

    DTIC Science & Technology

    2017-10-01

networks of the brain responsible for visual processing, mood regulation, motor coordination, sensory processing, and language command, but increased connectivity in... For each subject, the rsFMRI voxel time-series were temporally shifted to account for differences in slice acquisition times...

  9. Processes to Preserve Spice and Herb Quality and Sensory Integrity During Pathogen Inactivation.

    PubMed

    Duncan, Susan E; Moberg, Kayla; Amin, Kemia N; Wright, Melissa; Newkirk, Jordan J; Ponder, Monica A; Acuff, Gary R; Dickson, James S

    2017-05-01

Selected processing methods, demonstrated to be effective at reducing Salmonella, were assessed to determine if spice and herb quality was affected. Black peppercorn, cumin seed, oregano, and onion powder were irradiated to a target dose of 8 kGy. Two additional processes were examined for whole black peppercorns and cumin seeds: ethylene oxide (EtO) fumigation and vacuum-assisted steam (82.22 °C, 7.5 psia). Treated and untreated spices/herbs were compared (visual, odor) using sensory similarity testing protocols (α = 0.20; β = 0.05; proportion of discriminators: 20%) to determine if processing altered sensory quality. Analytical assessment of quality (color, water activity, and volatile chemistry) was completed. Irradiation did not alter visual or odor sensory quality of black peppercorn, cumin seed, or oregano but created differences in onion powder, which was lighter (higher L*) and more red (higher a*) in color, and resulted in nearly complete loss of measured volatile compounds. EtO processing did not create detectable odor or appearance differences in black peppercorn; however, visual and odor sensory quality differences, supported by changes in color (higher b*; lower L*) and increased concentrations of most volatiles, were detected for cumin seeds. Steam processing of black peppercorn resulted in perceptible odor differences, supported by increased concentration of monoterpene volatiles and loss of all sesquiterpenes; only visual differences were noted for cumin seed. An important step in process validation is the verification that no effect is detectable from a sensory perspective. © 2017 The Authors. Journal of Food Science published by Wiley Periodicals, Inc. on behalf of Institute of Food Technologists.

  10. The trait of sensory processing sensitivity and neural responses to changes in visual scenes

    PubMed Central

    Xu, Xiaomeng; Aron, Arthur; Aron, Elaine; Cao, Guikang; Feng, Tingyong; Weng, Xuchu

    2011-01-01

    This exploratory study examined the extent to which individual differences in sensory processing sensitivity (SPS), a temperament/personality trait characterized by social, emotional and physical sensitivity, are associated with neural response in visual areas in response to subtle changes in visual scenes. Sixteen participants completed the Highly Sensitive Person questionnaire, a standard measure of SPS. Subsequently, they were tested on a change detection task while undergoing functional magnetic resonance imaging (fMRI). SPS was associated with significantly greater activation in brain areas involved in high-order visual processing (i.e. right claustrum, left occipitotemporal, bilateral temporal and medial and posterior parietal regions) as well as in the right cerebellum, when detecting minor (vs major) changes in stimuli. These findings remained strong and significant after controlling for neuroticism and introversion, traits that are often correlated with SPS. These results provide the first evidence of neural differences associated with SPS, the first direct support for the sensory aspect of this trait that has been studied primarily for its social and affective implications, and preliminary evidence for heightened sensory processing in individuals high in SPS. PMID:20203139

  11. Differential effects of ADORA2A gene variations in pre-attentive visual sensory memory subprocesses.

    PubMed

    Beste, Christian; Stock, Ann-Kathrin; Ness, Vanessa; Epplen, Jörg T; Arning, Larissa

    2012-08-01

The ADORA2A gene encodes the adenosine A2A receptor that is highly expressed in the striatum where it plays a role in modulating glutamatergic and dopaminergic transmission. Glutamatergic signaling has been suggested to play a pivotal role in cognitive functions related to the pre-attentive processing of external stimuli. Yet, the precise molecular mechanism of these processes is poorly understood. Therefore, we aimed to investigate whether ADORA2A gene variation has modulating effects on visual pre-attentive sensory memory processing. Studying two polymorphisms, rs5751876 and rs2298383, in 199 healthy control subjects who performed a partial-report paradigm, we find that ADORA2A variation is associated with differences in the efficiency of pre-attentive sensory memory sub-processes. We show that especially the initial visual availability of stimulus information is rendered more efficiently in the homozygous rare genotype groups. Processes related to the transfer of information into working memory and the duration of visual sensory (iconic) memory are compromised in the homozygous rare genotype groups. Our results show a differential genotype-dependent modulation of pre-attentive sensory memory sub-processes. Hence, we assume that this modulation may be due to differential effects of increased adenosine A2A receptor signaling on glutamatergic transmission and striatal medium spiny neuron (MSN) interaction. Copyright © 2011 Elsevier B.V. and ECNP. All rights reserved.

  12. Multisensory integration, sensory substitution and visual rehabilitation.

    PubMed

    Proulx, Michael J; Ptito, Maurice; Amedi, Amir

    2014-04-01

    Sensory substitution has advanced remarkably over the past 35 years since first introduced to the scientific literature by Paul Bach-y-Rita. In this issue dedicated to his memory, we describe a collection of reviews that assess the current state of neuroscience research on sensory substitution, visual rehabilitation, and multisensory processes. Copyright © 2014. Published by Elsevier Ltd.

  13. Touch to see: neuropsychological evidence of a sensory mirror system for touch.

    PubMed

    Bolognini, Nadia; Olgiati, Elena; Xaiz, Annalisa; Posteraro, Lucio; Ferraro, Francesco; Maravita, Angelo

    2012-09-01

    The observation of touch can be grounded in the activation of brain areas underpinning direct tactile experience, namely the somatosensory cortices. What is the behavioral impact of such a mirror sensory activity on visual perception? To address this issue, we investigated the causal interplay between observed and felt touch in right brain-damaged patients, as a function of their underlying damaged visual and/or tactile modalities. Patients and healthy controls underwent a detection task, comprising visual stimuli depicting touches or without a tactile component. Touch and No-touch stimuli were presented in egocentric or allocentric perspectives. Seeing touches, regardless of the viewing perspective, differently affects visual perception depending on which sensory modality is damaged: In patients with a selective visual deficit, but without any tactile defect, the sight of touch improves the visual impairment; this effect is associated with a lesion to the supramarginal gyrus. In patients with a tactile deficit, but intact visual perception, the sight of touch disrupts visual processing, inducing a visual extinction-like phenomenon. This disruptive effect is associated with the damage of the postcentral gyrus. Hence, a damage to the somatosensory system can lead to a dysfunctional visual processing, and an intact somatosensory processing can aid visual perception.

  14. Auditory and visual cortex of primates: a comparison of two sensory systems

    PubMed Central

    Rauschecker, Josef P.

    2014-01-01

    A comparative view of the brain, comparing related functions across species and sensory systems, offers a number of advantages. In particular, it allows separating the formal purpose of a model structure from its implementation in specific brains. Models of auditory cortical processing can be conceived by analogy to the visual cortex, incorporating neural mechanisms that are found in both the visual and auditory systems. Examples of such canonical features on the columnar level are direction selectivity, size/bandwidth selectivity, as well as receptive fields with segregated versus overlapping on- and off-sub-regions. On a larger scale, parallel processing pathways have been envisioned that represent the two main facets of sensory perception: 1) identification of objects and 2) processing of space. Expanding this model in terms of sensorimotor integration and control offers an overarching view of cortical function independent of sensory modality. PMID:25728177

  15. Visual function and cognitive speed of processing mediate age-related decline in memory span and fluid intelligence

    PubMed Central

    Clay, Olivio J.; Edwards, Jerri D.; Ross, Lesley A.; Okonkwo, Ozioma; Wadley, Virginia G.; Roth, David L.; Ball, Karlene K.

    2010-01-01

    Objectives: To evaluate the relationship between sensory and cognitive decline, particularly with respect to speed of processing, memory span, and fluid intelligence. Additionally, the common cause, sensory degradation and speed of processing hypotheses were compared. Methods: Structural equation modeling was used to investigate the complex relationships among age-related decrements in these areas. Results: Cross-sectional data analyses included 842 older adult participants (M = 73 years). After accounting for age-related declines in vision and processing speed, the direct associations between age and memory span and between age and fluid intelligence were nonsignificant. Older age was associated with visual decline, which was associated with slower speed of processing, which in turn was associated with greater cognitive deficits. Discussion: The findings support both the sensory degradation and speed of processing accounts of age-related cognitive decline. Further, the findings highlight positive aspects of normal cognitive aging in that older age may not be associated with a loss of fluid intelligence if visual sensory functioning and processing speed can be maintained. PMID:19436063
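The mediation logic behind this result can be sketched with synthetic data. This is a toy illustration under invented effect sizes, not the study's structural equation model or data: if age influences cognition only through sensory function and processing speed, the direct age effect should shrink toward zero once the mediator is statistically controlled.

```python
import random
import statistics

random.seed(0)
n = 500
age = [random.uniform(60, 90) for _ in range(n)]
# assumed causal chain (all coefficients invented): age degrades vision,
# vision slows processing speed, and speed drives the cognitive score
vision = [100 - 0.8 * a + random.gauss(0, 3) for a in age]
speed = [0.5 * v + random.gauss(0, 3) for v in vision]
cognition = [0.6 * s + random.gauss(0, 3) for s in speed]

def cov(x, y):
    mx, my = statistics.fmean(x), statistics.fmean(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)

def partial_slope(y, x1, x2):
    """OLS slope of x1 when y is regressed on x1 and x2 together."""
    s11, s22, s12 = cov(x1, x1), cov(x2, x2), cov(x1, x2)
    return (cov(x1, y) * s22 - cov(x2, y) * s12) / (s11 * s22 - s12 ** 2)

total = cov(age, cognition) / cov(age, age)    # age -> cognition, no mediator
direct = partial_slope(cognition, age, speed)  # controlling processing speed
print(abs(direct) < abs(total))  # True: the mediator carries the age effect
```

The shrinkage of `direct` relative to `total` is the regression analogue of the abstract's finding that the direct age associations became nonsignificant after accounting for vision and processing speed.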

  16. Thalamic control of sensory selection in divided attention.

    PubMed

    Wimmer, Ralf D; Schmitt, L Ian; Davidson, Thomas J; Nakajima, Miho; Deisseroth, Karl; Halassa, Michael M

    2015-10-29

    How the brain selects appropriate sensory inputs and suppresses distractors is unknown. Given the well-established role of the prefrontal cortex (PFC) in executive function, its interactions with sensory cortical areas during attention have been hypothesized to control sensory selection. To test this idea and, more generally, dissect the circuits underlying sensory selection, we developed a cross-modal divided-attention task in mice that allowed genetic access to this cognitive process. By optogenetically perturbing PFC function in a temporally precise window, the ability of mice to select appropriately between conflicting visual and auditory stimuli was diminished. Equivalent sensory thalamocortical manipulations showed that behaviour was causally dependent on PFC interactions with the sensory thalamus, not sensory cortex. Consistent with this notion, we found neurons of the visual thalamic reticular nucleus (visTRN) to exhibit PFC-dependent changes in firing rate predictive of the modality selected. visTRN activity was causal to performance as confirmed by bidirectional optogenetic manipulations of this subnetwork. Using a combination of electrophysiology and intracellular chloride photometry, we demonstrated that visTRN dynamically controls visual thalamic gain through feedforward inhibition. Our experiments introduce a new subcortical model of sensory selection, in which the PFC biases thalamic reticular subnetworks to control thalamic sensory gain, selecting appropriate inputs for further processing.

  17. Top-down beta oscillatory signaling conveys behavioral context in early visual cortex.

    PubMed

    Richter, Craig G; Coppola, Richard; Bressler, Steven L

    2018-05-03

    Top-down modulation of sensory processing is a critical neural mechanism subserving numerous important cognitive roles, one of which may be to inform lower-order sensory systems of the current 'task at hand' by conveying behavioral context to these systems. Accumulating evidence indicates that top-down cortical influences are carried by directed interareal synchronization of oscillatory neuronal populations, with recent results pointing to beta-frequency oscillations as particularly important for top-down processing. However, it remains to be determined if top-down beta-frequency oscillations indeed convey behavioral context. We measured spectral Granger Causality (sGC) using local field potentials recorded from microelectrodes chronically implanted in visual areas V1/V2, V4, and TEO of two rhesus macaque monkeys, and applied multivariate pattern analysis to the spatial patterns of top-down sGC. We decoded behavioral context by discriminating patterns of top-down (V4/TEO-to-V1/V2) beta-peak sGC for two different task rules governing correct responses to identical visual stimuli. The results indicate that top-down directed influences are carried to visual cortex by beta oscillations, and differentiate task demands even before visual stimulus processing. They suggest that top-down beta-frequency oscillatory processes coordinate processing of sensory information by conveying global knowledge states to early levels of the sensory cortical hierarchy independently of bottom-up stimulus-driven processing.

  18. Episodic Memory Retrieval Functionally Relies on Very Rapid Reactivation of Sensory Information.

    PubMed

    Waldhauser, Gerd T; Braun, Verena; Hanslmayr, Simon

    2016-01-06

    Episodic memory retrieval is assumed to rely on the rapid reactivation of sensory information that was present during encoding, a process termed "ecphory." We investigated the functional relevance of this scarcely understood process in two experiments in human participants. We presented stimuli to the left or right of fixation at encoding, followed by an episodic memory test with centrally presented retrieval cues. This allowed us to track the reactivation of lateralized sensory memory traces during retrieval. Successful episodic retrieval led to a very early (∼100-200 ms) reactivation of lateralized alpha/beta (10-25 Hz) electroencephalographic (EEG) power decreases in the visual cortex contralateral to the visual field at encoding. Applying rhythmic transcranial magnetic stimulation to interfere with early retrieval processing in the visual cortex led to decreased episodic memory performance specifically for items encoded in the visual field contralateral to the site of stimulation. These results demonstrate, for the first time, that episodic memory functionally relies on very rapid reactivation of sensory information. Remembering personal experiences requires a "mental time travel" to revisit sensory information perceived in the past. This process is typically described as a controlled, relatively slow process. However, by using electroencephalography to measure neural activity with a high time resolution, we show that such episodic retrieval entails a very rapid reactivation of sensory brain areas. Using transcranial magnetic stimulation to alter brain function during retrieval revealed that this early sensory reactivation is causally relevant for conscious remembering. These results give first neural evidence for a functional, preconscious component of episodic remembering. This provides new insight into the nature of human memory and may help in the understanding of psychiatric conditions that involve the automatic intrusion of unwanted memories. 
Copyright © 2016 the authors 0270-6474/16/360251-10$15.00/0.

  19. The dorsal raphe modulates sensory responsiveness during arousal in zebrafish

    PubMed Central

    Yokogawa, Tohei; Hannan, Markus C.; Burgess, Harold A.

    2012-01-01

    During waking behavior animals adapt their state of arousal in response to environmental pressures. Sensory processing is regulated in aroused states and several lines of evidence imply that this is mediated at least partly by the serotonergic system. However there is little information directly showing that serotonergic function is required for state-dependent modulation of sensory processing. Here we find that zebrafish larvae can maintain a short-term state of arousal during which neurons in the dorsal raphe modulate sensory responsiveness to behaviorally relevant visual cues. Following a brief exposure to water flow, larvae show elevated activity and heightened sensitivity to perceived motion. Calcium imaging of neuronal activity after flow revealed increased activity in serotonergic neurons of the dorsal raphe. Genetic ablation of these neurons abolished the increase in visual sensitivity during arousal without affecting baseline visual function or locomotor activity. We traced projections from the dorsal raphe to a major visual area, the optic tectum. Laser ablation of the tectum demonstrated that this structure, like the dorsal raphe, is required for improved visual sensitivity during arousal. These findings reveal that serotonergic neurons of the dorsal raphe have a state-dependent role in matching sensory responsiveness to behavioral context. PMID:23100441

  20. Biasing the brain's attentional set: I. cue driven deployments of intersensory selective attention.

    PubMed

    Foxe, John J; Simpson, Gregory V; Ahlfors, Seppo P; Saron, Clifford D

    2005-10-01

    Brain activity associated with directing attention to one of two possible sensory modalities was examined using high-density mapping of human event-related potentials. The deployment of selective attention was based on visually presented symbolic cue-words instructing subjects on a trial-by-trial basis, which sensory modality to attend. We measured the spatio-temporal pattern of activation in the approximately 1 second period between the cue-instruction and a subsequent compound auditory-visual imperative stimulus. This allowed us to assess the flow of processing across brain regions involved in deploying and sustaining inter-sensory selective attention, prior to the actual selective processing of the compound audio-visual target stimulus. Activity over frontal and parietal areas showed sensory specific increases in activation during the early part of the anticipatory period (~230 ms), probably representing the activation of fronto-parietal attentional deployment systems for top-down control of attention. In the later period preceding the arrival of the "to-be-attended" stimulus, sustained differential activity was seen over fronto-central regions and parieto-occipital regions, suggesting the maintenance of sensory-specific biased attentional states that would allow for subsequent selective processing. Although there was clear sensory biasing in this late sustained period, it was also clear that both sensory systems were being prepared during the cue-target period. These late sensory-specific biasing effects were also accompanied by sustained activations over frontal cortices that also showed both common and sensory specific activation patterns, suggesting that maintenance of the biased state includes top-down inputs from generators in frontal cortices, some of which are sensory-specific regions. 
These data support extensive interactions between sensory, parietal and frontal regions during processing of cue information, deployment of attention, and maintenance of the focus of attention in anticipation of impending attentionally relevant input.

  1. The answer is blowing in the wind: free-flying honeybees can integrate visual and mechano-sensory inputs for making complex foraging decisions.

    PubMed

    Ravi, Sridhar; Garcia, Jair E; Wang, Chun; Dyer, Adrian G

    2016-11-01

Bees navigate in complex environments using visual, olfactory and mechano-sensorial cues. In the lowest region of the atmosphere, the wind environment can be highly unsteady and bees employ fine motor-skills to enhance flight control. Recent work reveals sophisticated multi-modal processing of visual and olfactory channels by the bee brain to enhance foraging efficiency, but it currently remains unclear whether wind-induced mechano-sensory inputs are also integrated with visual information to facilitate decision making. Individual honeybees were trained in a linear flight arena with appetitive-aversive differential conditioning to use a context-setting cue of 3 m s⁻¹ cross-wind direction to enable decisions about either a 'blue' or 'yellow' star stimulus being the correct alternative. Colour stimuli properties were mapped in bee-specific opponent-colour spaces to validate saliency, and to thus enable rapid reverse learning. Bees were able to integrate mechano-sensory and visual information to facilitate decisions that were significantly different to chance expectation after 35 learning trials. An independent group of bees were trained to find a single rewarding colour that was unrelated to the wind direction. In these trials, wind was not used as a context-setting cue and served only as a potential distracter in identifying the relevant rewarding visual stimuli. Comparison between respective groups shows that bees can learn to integrate visual and mechano-sensory information in a non-elemental fashion, revealing an unsuspected level of sensory processing in honeybees, and adding to the growing body of knowledge on the capacity of insect brains to use multi-modal sensory inputs in mediating foraging behaviour. © 2016. Published by The Company of Biologists Ltd.
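With two colour alternatives, "significantly different to chance expectation" reduces to a binomial tail probability. The abstract does not report its exact statistics, so the counts below are purely illustrative:

```python
from math import comb

def p_at_least(k, n, p=0.5):
    """Exact one-sided binomial probability of observing k or more correct
    choices in n trials when each choice is correct with chance rate p
    (two colour alternatives -> p = 0.5)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# e.g. a bee choosing the wind-cued colour on 15 of 20 hypothetical test choices
print(round(p_at_least(15, 20), 4))  # 0.0207: unlikely under chance alone
```

A tail probability this small is what licenses the claim that choice behaviour after training departed from the 50% chance expectation.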

  2. Measuring the effect of attention on simple visual search.

    PubMed

    Palmer, J; Ames, C T; Lindsey, D T

    1993-02-01

    Set-size effects in visual search may be due to 1 or more of 3 factors: sensory processes such as lateral masking between stimuli, attentional processes limiting the perception of individual stimuli, or attentional processes affecting the decision rules for combining information from multiple stimuli. These possibilities were evaluated in tasks such as searching for a longer line among shorter lines. To evaluate sensory contributions, display set-size effects were compared with cuing conditions that held sensory phenomena constant. Similar effects for the display and cue manipulations suggested that sensory processes contributed little under the conditions of this experiment. To evaluate the contribution of decision processes, the set-size effects were modeled with signal detection theory. In these models, a decision effect alone was sufficient to predict the set-size effects without any attentional limitation due to perception.
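The decision-level account described in this abstract can be illustrated with a toy simulation. Below is a minimal, hypothetical sketch (all parameter values are assumptions for illustration, not taken from the study) of an unlimited-capacity signal detection model of yes/no search: each display item contributes an independent noisy sample, and the observer reports "target present" when the maximum sample exceeds a fixed criterion.

```python
import random

def max_rule_accuracy(set_size, d_prime=2.0, criterion=1.0,
                      trials=20_000, seed=0):
    """Proportion correct for yes/no search under an unlimited-capacity
    signal detection model: each of `set_size` items contributes an
    independent Gaussian sample, and the observer says "present" when
    the maximum sample exceeds `criterion`."""
    rng = random.Random(seed)
    correct = 0
    for t in range(trials):
        target_present = t % 2 == 0  # half the trials contain a target
        samples = [rng.gauss(0.0, 1.0) for _ in range(set_size)]
        if target_present:
            samples[0] += d_prime  # the target item has a shifted mean
        said_present = max(samples) > criterion
        correct += said_present == target_present
    return correct / trials

# Decision noise alone predicts a set-size effect: accuracy drops as
# more distractor samples get a chance to exceed the criterion.
for n in (1, 2, 4, 8):
    print(n, round(max_rule_accuracy(n), 3))
```

Accuracy declines with set size even though every item is perceived with the same fidelity, which is the sense in which a decision effect alone can mimic an attentional limit.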

  3. Sandwich masking eliminates both visual awareness of faces and face-specific brain activity through a feedforward mechanism.

    PubMed

    Harris, Joseph A; Wu, Chien-Te; Woldorff, Marty G

    2011-06-07

    It is generally agreed that considerable amounts of low-level sensory processing of visual stimuli can occur without conscious awareness. On the other hand, the degree of higher level visual processing that occurs in the absence of awareness is as yet unclear. Here, event-related potential (ERP) measures of brain activity were recorded during a sandwich-masking paradigm, a commonly used approach for attenuating conscious awareness of visual stimulus content. In particular, the present study used a combination of ERP activation contrasts to track both early sensory-processing ERP components and face-specific N170 ERP activations, in trials with versus without awareness. The electrophysiological measures revealed that the sandwich masking abolished the early face-specific N170 neural response (peaking at ~170 ms post-stimulus), an effect that paralleled the abolition of awareness of face versus non-face image content. Moreover, the masking also produced a strong attenuation of earlier feedforward visual sensory-processing signals. This early attenuation presumably resulted in insufficient information being fed into the higher level visual system pathways specific to object category processing, thus leading to unawareness of the visual object content. These results support a coupling of visual awareness and neural indices of face processing, while also demonstrating an early low-level mechanism of interference in sandwich masking.

  4. Reading with sounds: sensory substitution selectively activates the visual word form area in the blind.

    PubMed

    Striem-Amit, Ella; Cohen, Laurent; Dehaene, Stanislas; Amedi, Amir

    2012-11-08

    Using a visual-to-auditory sensory-substitution algorithm, congenitally fully blind adults were taught to read and recognize complex images using "soundscapes": sounds topographically representing images. fMRI was used to examine key questions regarding the visual word form area (VWFA): its selectivity for letters over other visual categories without visual experience, its feature tolerance for reading in a novel sensory modality, and its plasticity for scripts learned in adulthood. The blind activated the VWFA specifically and selectively during the processing of letter soundscapes relative to both textures and visually complex object categories and relative to mental imagery and semantic-content controls. Further, VWFA recruitment for reading soundscapes emerged after 2 hr of training in a blind adult on a novel script. Therefore, the VWFA shows category selectivity regardless of input sensory modality, visual experience, and long-term familiarity or expertise with the script. The VWFA may perform a flexible task-specific rather than sensory-specific computation, possibly linking letter shapes to phonology. Copyright © 2012 Elsevier Inc. All rights reserved.
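The "soundscapes" referred to here are produced by a visual-to-auditory substitution algorithm (in this literature, typically the vOICe). The sketch below is an illustrative approximation of the commonly described mapping, not the authors' implementation: columns are scanned left to right over time, row position sets tone frequency (top rows are higher pitched), and pixel brightness sets amplitude. Frequency range, scan duration, and sample rate are all assumed values.

```python
import math

def image_to_soundscape(image, duration_s=1.0, sample_rate=16000,
                        f_low=500.0, f_high=5000.0):
    """Minimal vOICe-style mapping (illustrative, not the published
    algorithm): scan columns left to right over `duration_s`; each row
    is a sine tone (top row = highest pitch) whose amplitude is the
    pixel brightness in [0, 1]."""
    n_rows, n_cols = len(image), len(image[0])
    samples_per_col = int(duration_s * sample_rate / n_cols)
    # One fixed frequency per row, log-spaced; assumes n_rows >= 2.
    freqs = [f_low * (f_high / f_low) ** ((n_rows - 1 - r) / (n_rows - 1))
             for r in range(n_rows)]
    wave = []
    for c in range(n_cols):
        for s in range(samples_per_col):
            t = (c * samples_per_col + s) / sample_rate
            v = sum(image[r][c] * math.sin(2 * math.pi * freqs[r] * t)
                    for r in range(n_rows))
            wave.append(v / n_rows)  # keep samples within [-1, 1]
    return wave

# A 4x4 "image" with a bright diagonal becomes a pitch sweep over the
# one-second scan.
img = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
wave = image_to_soundscape(img)
```

The key property preserved is topography: spatial position in the image maps lawfully onto time and frequency in the sound, which is what lets a trained listener recover shape.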

  5. Brain correlates of automatic visual change detection.

    PubMed

    Cléry, H; Andersson, F; Fonlupt, P; Gomot, M

    2013-07-15

    A number of studies support the presence of visual automatic detection of change, but little is known about the brain generators involved in such processing and about the modulation of brain activity according to the salience of the stimulus. The study presented here was designed to locate the brain activity elicited by unattended visual deviant and novel stimuli using fMRI. Seventeen adult participants were presented with a passive visual oddball sequence while performing a concurrent visual task. Variations in BOLD signal were observed in the modality-specific sensory cortex, but also in non-specific areas involved in preattentional processing of changing events. A degree-of-deviance effect was observed, since novel stimuli elicited more activity in the sensory occipital regions and at the medial frontal site than small changes. These findings could be compared to those obtained in the auditory modality and might suggest a "general" change detection process operating in several sensory modalities. Copyright © 2013 Elsevier Inc. All rights reserved.

  6. Sequential sensory and decision processing in posterior parietal cortex

    PubMed Central

    Ibos, Guilhem; Freedman, David J

    2017-01-01

    Decisions about the behavioral significance of sensory stimuli often require comparing sensory inference of what we are looking at to internal models of what we are looking for. Here, we test how neuronal selectivity for visual features is transformed into decision-related signals in posterior parietal cortex (area LIP). Monkeys performed a visual matching task that required them to detect target stimuli composed of conjunctions of color and motion-direction. Neuronal recordings from area LIP revealed two main findings. First, the sequential processing of visual features and the selection of target-stimuli suggest that LIP is involved in transforming sensory information into decision-related signals. Second, the patterns of color and motion selectivity and their impact on decision-related encoding suggest that LIP plays a role in detecting target stimuli by comparing bottom-up sensory inputs (what the monkeys were looking at) and top-down cognitive encoding inputs (what the monkeys were looking for). DOI: http://dx.doi.org/10.7554/eLife.23743.001 PMID:28418332

  7. Cross-frequency synchronization connects networks of fast and slow oscillations during visual working memory maintenance.

    PubMed

    Siebenhühner, Felix; Wang, Sheng H; Palva, J Matias; Palva, Satu

    2016-09-26

    Neuronal activity in sensory and fronto-parietal (FP) areas underlies the representation and attentional control, respectively, of sensory information maintained in visual working memory (VWM). Within these regions, beta/gamma phase-synchronization supports the integration of sensory functions, while synchronization in theta/alpha bands supports the regulation of attentional functions. A key challenge is to understand which mechanisms integrate neuronal processing across these distinct frequencies and thereby the sensory and attentional functions. We investigated whether such integration could be achieved by cross-frequency phase synchrony (CFS). Using concurrent magneto- and electroencephalography, we found that CFS was load-dependently enhanced between theta and alpha-gamma and between alpha and beta-gamma oscillations during VWM maintenance among visual, FP, and dorsal attention (DA) systems. CFS also connected the hubs of within-frequency-synchronized networks and its strength predicted individual VWM capacity. We propose that CFS integrates processing among synchronized neuronal networks from theta to gamma frequencies to link sensory and attentional functions.
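Cross-frequency phase synchrony of the kind measured in this study is commonly quantified with an n:m phase-locking value. The sketch below is a simplified illustration, assuming instantaneous phases have already been extracted (in practice via band-pass filtering and a Hilbert transform); the synthetic 6 Hz theta and 36 Hz gamma signals are hypothetical, not the study's MEG/EEG data.

```python
import cmath
import math
import random

def nm_plv(phase_slow, phase_fast, n, m):
    """n:m phase-locking value |mean(exp(i*(n*phi_fast - m*phi_slow)))|:
    1 means perfect cross-frequency locking, values near 0 mean none."""
    acc = sum(cmath.exp(1j * (n * pf - m * ps))
              for ps, pf in zip(phase_slow, phase_fast))
    return abs(acc) / len(phase_slow)

random.seed(1)
t = [k / 1000.0 for k in range(2000)]            # 2 s sampled at 1 kHz
theta = [2 * math.pi * 6 * tt for tt in t]       # 6 Hz phase ramp
gamma_locked = [6 * p + 0.3 for p in theta]      # 36 Hz, 1:6 locked to theta
gamma_free = [2 * math.pi * 36 * tt + random.uniform(0, 2 * math.pi)
              for tt in t]                       # phase-scrambled 36 Hz

locked_plv = nm_plv(theta, gamma_locked, n=1, m=6)  # ~1.0
free_plv = nm_plv(theta, gamma_free, n=1, m=6)      # near 0
```

Because the locked gamma maintains a constant phase relation to every sixth theta cycle, the 1:6 phase difference is constant and the PLV approaches 1; scrambling the gamma phase destroys that relation.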

  8. Other ways of seeing: From behavior to neural mechanisms in the online “visual” control of action with sensory substitution

    PubMed Central

    Proulx, Michael J.; Gwinnutt, James; Dell’Erba, Sara; Levy-Tzedek, Shelly; de Sousa, Alexandra A.; Brown, David J.

    2015-01-01

    Vision is the dominant sense for perception-for-action in humans and other higher primates. Advances in sight restoration now utilize the other intact senses to provide information that is normally sensed visually through sensory substitution to replace missing visual information. Sensory substitution devices translate visual information from a sensor, such as a camera or ultrasound device, into a format that the auditory or tactile systems can detect and process, so the visually impaired can see through hearing or touch. Online control of action is essential for many daily tasks such as pointing, grasping and navigating, and adapting to a sensory substitution device successfully requires extensive learning. Here we review the research on sensory substitution for vision restoration in the context of providing the means of online control for action in the blind or blindfolded. Sensory substitution devices appear to engage the neural visual system, suggesting the hypothesis that sensory substitution draws on the same underlying mechanisms as unimpaired visual control of action. Here we review the current state of the art for sensory substitution approaches to object recognition, localization, and navigation, and the potential these approaches have for revealing a metamodal behavioral and neural basis for the online control of action. PMID:26599473

  9. Which Aspects of Visual Attention Are Changed by Deafness? The Case of the Attentional Network Test

    ERIC Educational Resources Information Center

    Dye, Matthew W. G.; Baril, Dara E.; Bavelier, Daphne

    2007-01-01

    The loss of one sensory modality can lead to a reorganization of the other intact sensory modalities. In the case of individuals who are born profoundly deaf, there is growing evidence of changes in visual functions. Specifically, deaf individuals demonstrate enhanced visual processing in the periphery, and in particular enhanced peripheral visual…

  10. Visual short-term memory load reduces retinotopic cortex response to contrast.

    PubMed

    Konstantinou, Nikos; Bahrami, Bahador; Rees, Geraint; Lavie, Nilli

    2012-11-01

    Load Theory of attention suggests that high perceptual load in a task leads to reduced sensory visual cortex response to task-unrelated stimuli resulting in "load-induced blindness" [e.g., Lavie, N. Attention, distraction and cognitive control under load. Current Directions in Psychological Science, 19, 143-148, 2010; Lavie, N. Distracted and confused?: Selective attention under load. Trends in Cognitive Sciences, 9, 75-82, 2005]. Consideration of the findings that visual STM (VSTM) involves sensory recruitment [e.g., Pasternak, T., & Greenlee, M. Working memory in primate sensory systems. Nature Reviews Neuroscience, 6, 97-107, 2005] within Load Theory led us to a new hypothesis regarding the effects of VSTM load on visual processing. If VSTM load draws on sensory visual capacity, then similar to perceptual load, high VSTM load should also reduce visual cortex response to incoming stimuli leading to a failure to detect them. We tested this hypothesis with fMRI and behavioral measures of visual detection sensitivity. Participants detected the presence of a contrast increment during the maintenance delay in a VSTM task requiring maintenance of color and position. Increased VSTM load (manipulated by increased set size) led to reduced retinotopic visual cortex (V1-V3) responses to contrast as well as reduced detection sensitivity, as we predicted. Additional visual detection experiments established a clear tradeoff between the amount of information maintained in VSTM and detection sensitivity, while ruling out alternative accounts for the effects of VSTM load in terms of differential spatial allocation strategies or task difficulty. These findings extend Load Theory to demonstrate a new form of competitive interactions between early visual cortex processing and visual representations held in memory under load and provide a novel line of support for the sensory recruitment hypothesis of VSTM.
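The behavioral measure in studies like this, detection sensitivity, is standardly indexed by d′ from signal detection theory: the z-transformed hit rate minus the z-transformed false-alarm rate. A minimal sketch, where the hit and false-alarm rates are assumed numbers for illustration rather than the study's data:

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Detection sensitivity: separation of the signal and noise
    distributions in z-units, d' = z(H) - z(FA)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Hypothetical rates: under high VSTM load the sensory evidence is
# degraded, lowering hits and raising false alarms at a fixed criterion.
low_load = d_prime(0.90, 0.10)   # ~2.56
high_load = d_prime(0.75, 0.25)  # ~1.35
```

A drop in d′ with memory load, rather than a shift in criterion, is what licenses the interpretation that the sensory representation itself was degraded.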

  11. The associations between multisensory temporal processing and symptoms of schizophrenia.

    PubMed

    Stevenson, Ryan A; Park, Sohee; Cochran, Channing; McIntosh, Lindsey G; Noel, Jean-Paul; Barense, Morgan D; Ferber, Susanne; Wallace, Mark T

    2017-01-01

    Recent neurobiological accounts of schizophrenia have included an emphasis on changes in sensory processing. These sensory and perceptual deficits can have a cascading effect onto higher-level cognitive processes and clinical symptoms. One form of sensory dysfunction that has been consistently observed in schizophrenia is altered temporal processing. In this study, we investigated temporal processing within and across the auditory and visual modalities in individuals with schizophrenia (SCZ) and age-matched healthy controls. Individuals with SCZ showed auditory and visual temporal processing abnormalities, as well as multisensory temporal processing dysfunction that extended beyond that attributable to unisensory processing dysfunction. Most importantly, these multisensory temporal deficits were associated with the severity of hallucinations. This link between atypical multisensory temporal perception and clinical symptomatology suggests that clinical symptoms of schizophrenia may be at least partly a result of cascading effects from (multi)sensory disturbances. These results are discussed in terms of underlying neural bases and the possible implications for remediation. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Is Attentional Resource Allocation Across Sensory Modalities Task-Dependent?

    PubMed

    Wahn, Basil; König, Peter

    2017-01-01

    Human information processing is limited by attentional resources. That is, via attentional mechanisms, humans select a limited amount of sensory input to process while other sensory input is neglected. In multisensory research, a matter of ongoing debate is whether there are distinct pools of attentional resources for each sensory modality or whether attentional resources are shared across sensory modalities. Recent studies have suggested that attentional resource allocation across sensory modalities is in part task-dependent. That is, the recruitment of attentional resources across the sensory modalities depends on whether processing involves object-based attention (e.g., the discrimination of stimulus attributes) or spatial attention (e.g., the localization of stimuli). In the present paper, we review findings in multisensory research related to this view. For the visual and auditory sensory modalities, findings suggest that distinct resources are recruited when humans perform object-based attention tasks, whereas for the visual and tactile sensory modalities, partially shared resources are recruited. If object-based attention tasks are time-critical, shared resources are recruited across the sensory modalities. When humans perform an object-based attention task in combination with a spatial attention task, partly shared resources are recruited across the sensory modalities as well. Conversely, spatial attention tasks consistently involve attentional resources that are shared across the sensory modalities. Generally, findings suggest that the attentional system flexibly allocates attentional resources depending on task demands. We propose that such flexibility reflects a large-scale optimization strategy that minimizes the brain's costly resource expenditures and simultaneously maximizes capability to process currently relevant information.

  13. Prestimulus neural oscillations inhibit visual perception via modulation of response gain.

    PubMed

    Chaumon, Maximilien; Busch, Niko A

    2014-11-01

    The ongoing state of the brain radically affects how it processes sensory information. How does this ongoing brain activity interact with the processing of external stimuli? Spontaneous oscillations in the alpha range are thought to inhibit sensory processing, but little is known about the psychophysical mechanisms of this inhibition. We recorded ongoing brain activity with EEG while human observers performed a visual detection task with stimuli of different contrast intensities. To move beyond qualitative description, we formally compared psychometric functions obtained under different levels of ongoing alpha power and evaluated the inhibitory effect of ongoing alpha oscillations in terms of contrast or response gain models. This procedure opens the way to understanding the actual functional mechanisms by which ongoing brain activity affects visual performance. We found that strong prestimulus occipital alpha oscillations (but not more anterior mu oscillations) reduce performance most strongly for stimuli of the highest intensities tested. This inhibitory effect is best explained by a divisive reduction of response gain. Ongoing occipital alpha oscillations thus reflect changes in the visual system's input/output transformation that are independent of the sensory input to the system. They selectively scale the system's response, rather than change its sensitivity to sensory information.
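The contrast gain versus response gain comparison can be made concrete with a Naka-Rushton contrast response function, R(c) = Rmax · c^n / (c^n + c50^n), a standard choice in this literature. The sketch below uses assumed parameter values (not the study's fits) to show the distinguishing predictions: response gain scales responses divisively, so absolute losses are largest at high contrast, as reported here for alpha; contrast gain shifts the curve rightward, so losses vanish once the response saturates.

```python
def naka_rushton(c, r_max=1.0, c50=0.2, n=2.0):
    """Naka-Rushton contrast response function
    R(c) = r_max * c**n / (c**n + c50**n). Parameter values are assumed."""
    return r_max * c**n / (c**n + c50**n)

contrasts = [0.05, 0.1, 0.2, 0.4, 0.8]
baseline = [naka_rushton(c) for c in contrasts]

# Response gain model: the output is divisively scaled, so the absolute
# loss grows monotonically with contrast.
response_gain = [naka_rushton(c, r_max=1.0 / 1.5) for c in contrasts]

# Contrast gain model: c50 shifts rightward, so losses peak at mid
# contrasts and disappear once the response saturates.
contrast_gain = [naka_rushton(c, c50=0.2 * 1.5) for c in contrasts]
```

Comparing where the two curves diverge most from baseline is exactly the logic that lets the high-contrast effect reported here pick out response gain over contrast gain.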

  14. When a hit sounds like a kiss: An electrophysiological exploration of semantic processing in visual narrative.

    PubMed

    Manfredi, Mirella; Cohn, Neil; Kutas, Marta

    2017-06-01

    Researchers have long questioned whether information presented through different sensory modalities involves distinct or shared semantic systems. We investigated uni-sensory cross-modal processing by recording event-related brain potentials to words replacing the climactic event in a visual narrative sequence (comics). We compared Onomatopoeic words, which phonetically imitate action sounds (Pow!), with Descriptive words, which describe an action (Punch!), that were (in)congruent within their sequence contexts. Across two experiments, larger N400s appeared to Anomalous Onomatopoeic or Descriptive critical panels than to their congruent counterparts, reflecting a difficulty in semantic access/retrieval. Also, Descriptive words evinced a greater late frontal positivity compared to Onomatopoetic words, suggesting that, though plausible, they may be less predictable/expected in visual narratives. Our results indicate that uni-sensory cross-modal integration of word/letter-symbol strings within visual narratives elicits ERP patterns typically observed for written sentence processing, thereby suggesting the engagement of similar domain-independent integration/interpretation mechanisms. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. When a hit sounds like a kiss: an electrophysiological exploration of semantic processing in visual narrative

    PubMed Central

    Manfredi, Mirella; Cohn, Neil; Kutas, Marta

    2017-01-01

    Researchers have long questioned whether information presented through different sensory modalities involves distinct or shared semantic systems. We investigated uni-sensory cross-modal processing by recording event-related brain potentials to words replacing the climactic event in a visual narrative sequence (comics). We compared Onomatopoeic words, which phonetically imitate action sounds (Pow!), with Descriptive words, which describe an action (Punch!), that were (in)congruent within their sequence contexts. Across two experiments, larger N400s appeared to Anomalous Onomatopoeic or Descriptive critical panels than to their congruent counterparts, reflecting a difficulty in semantic access/retrieval. Also, Descriptive words evinced a greater late frontal positivity compared to Onomatopoetic words, suggesting that, though plausible, they may be less predictable/expected in visual narratives. Our results indicate that uni-sensory cross-modal integration of word/letter-symbol strings within visual narratives elicits ERP patterns typically observed for written sentence processing, thereby suggesting the engagement of similar domain-independent integration/interpretation mechanisms. PMID:28242517

  16. The involvement of central attention in visual search is determined by task demands.

    PubMed

    Han, Suk Won

    2017-04-01

    Attention, the mechanism by which a subset of sensory inputs is prioritized over others, operates at multiple processing stages. Specifically, attention enhances weak sensory signal at the perceptual stage, while it serves to select appropriate responses or consolidate sensory representations into short-term memory at the central stage. This study investigated the independence and interaction between perceptual and central attention. To do so, I used a dual-task paradigm, pairing a four-alternative choice task with a visual search task. The results showed that central attention for response selection was engaged in perceptual processing for visual search when the number of search items increased, thereby increasing the demand for serial allocation of focal attention. By contrast, central attention and perceptual attention remained independent as far as the demand for serial shifting of focal attention remained constant; decreasing stimulus contrast or increasing the set size of a parallel search did not evoke the involvement of central attention in visual search. These results suggest that the nature of concurrent visual search process plays a crucial role in the functional interaction between two different types of attention.

  17. [Sensory loss and brain reorganization].

    PubMed

    Fortin, Madeleine; Voss, Patrice; Lassonde, Maryse; Lepore, Franco

    2007-11-01

    It is without a doubt that humans are first and foremost visual beings. Even though the other sensory modalities provide us with valuable information, it is vision that generally offers the most reliable and detailed information concerning our immediate surroundings. It is therefore not surprising that nearly a third of the human brain processes, in one way or another, visual information. But what happens when the visual information no longer reaches these brain regions responsible for processing it? Indeed numerous medical conditions such as congenital glaucoma, retinitis pigmentosa and retinal detachment, to name a few, can disrupt the visual system and lead to blindness. So, do the brain areas responsible for processing visual stimuli simply shut down and become non-functional? Do they become dead weight and simply stop contributing to cognitive and sensory processes? Current data suggests that this is not the case. Quite the contrary, it would seem that congenitally blind individuals benefit from the recruitment of these areas by other sensory modalities to carry out non-visual tasks. In fact, our laboratory has been studying blindness and its consequences on both the brain and behaviour for many years now. We have shown that blind individuals demonstrate exceptional hearing abilities. This finding holds true for stimuli originating from both near and far space. It also holds true, under certain circumstances, for those who lost their sight later in life, beyond a period generally believed to limit the brain changes following the loss of sight. In the case of the early blind, we have shown that their ability to localize sounds is strongly correlated with activity in the occipital cortex (the location of the visual processing), demonstrating that these areas are functionally engaged by the task. Therefore it would seem that the plastic nature of the human brain allows them to make new use of the cerebral areas normally dedicated to visual processing.

  18. Artificial organs: recent progress in artificial hearing and vision.

    PubMed

    Ifukube, Tohru

    2009-01-01

    Artificial sensory organs are a prosthetic means of sending visual or auditory information to the brain by electrical stimulation of the optic or auditory nerves to assist visually impaired or hearing-impaired people. However, clinical application of artificial sensory organs, except for cochlear implants, is still a trial-and-error process. This is because how and where the information transmitted to the brain is processed is still unknown, and also because changes in brain function (plasticity) remain unknown, even though brain plasticity plays an important role in meaningful interpretation of new sensory stimuli. This article discusses some basic unresolved issues and potential solutions in the development of artificial sensory organs such as cochlear implants, brainstem implants, artificial vision, and artificial retinas.

  19. Mate choice in the eye and ear of the beholder? Female multimodal sensory configuration influences her preferences.

    PubMed

    Ronald, Kelly L; Fernández-Juricic, Esteban; Lucas, Jeffrey R

    2018-05-16

    A common assumption in sexual selection studies is that receivers decode signal information similarly. However, receivers may vary in how they rank signallers if signal perception varies with an individual's sensory configuration. Furthermore, receivers may vary in their weighting of different elements of multimodal signals based on their sensory configuration. This could lead to complex levels of selection on signalling traits. We tested whether multimodal sensory configuration could affect preferences for multimodal signals. We used brown-headed cowbird (Molothrus ater) females to examine how auditory sensitivity and auditory filters, which influence auditory spectral and temporal resolution, affect song preferences, and how visual spatial resolution and visual temporal resolution, which influence resolution of a moving visual signal, affect visual display preferences. Our results show that multimodal sensory configuration significantly affects preferences for male displays: females with better auditory temporal resolution preferred songs that were shorter, with lower Wiener entropy, and higher frequency; and females with better visual temporal resolution preferred males with less intense visual displays. Our findings provide new insights into mate-choice decisions and receiver signal processing. Furthermore, our results challenge a long-standing assumption in animal communication, which can affect how we address honest signalling, assortative mating and sensory drive. © 2018 The Author(s).

  20. Emotional facilitation of sensory processing in the visual cortex.

    PubMed

    Schupp, Harald T; Junghöfer, Markus; Weike, Almut I; Hamm, Alfons O

    2003-01-01

    A key function of emotion is the preparation for action. However, organization of successful behavioral strategies depends on efficient stimulus encoding. The present study tested the hypothesis that perceptual encoding in the visual cortex is modulated by the emotional significance of visual stimuli. Event-related brain potentials were measured while subjects viewed pleasant, neutral, and unpleasant pictures. Early selective encoding of pleasant and unpleasant images was associated with a posterior negativity, indicating primary sources of activation in the visual cortex. The study also replicated previous findings in that affective cues also elicited enlarged late positive potentials, indexing increased stimulus relevance at higher-order stages of stimulus processing. These results support the hypothesis that sensory encoding of affective stimuli is facilitated implicitly by natural selective attention. Thus, the affect system not only modulates motor output (i.e., favoring approach or avoidance dispositions), but already operates at an early level of sensory encoding.

  1. Making Sense of Education: Sensory Ethnography and Visual Impairment

    ERIC Educational Resources Information Center

    Morris, Ceri

    2017-01-01

    Education involves the engagement of the full range of the senses in the accomplishment of tasks and the learning of knowledge and skills. However both in pedagogical practices and in the process of educational research, there has been a tendency to privilege the visual. To explore these issues, detailed sensory ethnographic fieldwork was…

  2. Short-Term Memory for Space and Time Flexibly Recruit Complementary Sensory-Biased Frontal Lobe Attention Networks.

    PubMed

    Michalka, Samantha W; Kong, Lingqiang; Rosen, Maya L; Shinn-Cunningham, Barbara G; Somers, David C

    2015-08-19

    The frontal lobes control wide-ranging cognitive functions; however, functional subdivisions of human frontal cortex are only coarsely mapped. Here, functional magnetic resonance imaging reveals two distinct visual-biased attention regions in lateral frontal cortex, superior precentral sulcus (sPCS) and inferior precentral sulcus (iPCS), anatomically interdigitated with two auditory-biased attention regions, transverse gyrus intersecting precentral sulcus (tgPCS) and caudal inferior frontal sulcus (cIFS). Intrinsic functional connectivity analysis demonstrates that sPCS and iPCS fall within a broad visual-attention network, while tgPCS and cIFS fall within a broad auditory-attention network. Interestingly, we observe that spatial and temporal short-term memory (STM), respectively, recruit visual and auditory attention networks in the frontal lobe, independent of sensory modality. These findings not only demonstrate that both sensory modality and information domain influence frontal lobe functional organization, they also demonstrate that spatial processing co-localizes with visual processing and that temporal processing co-localizes with auditory processing in lateral frontal cortex. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Cortical Neuroprosthesis Merges Visible and Invisible Light Without Impairing Native Sensory Function

    PubMed Central

    Thomson, Eric E.; Zea, Ivan; França, Wendy

    2017-01-01

    Abstract Adult rats equipped with a sensory prosthesis, which transduced infrared (IR) signals into electrical signals delivered to somatosensory cortex (S1), took approximately 4 d to learn a four-choice IR discrimination task. Here, we show that when such IR signals are projected to the primary visual cortex (V1), rats that are pretrained in a visual-discrimination task typically learn the same IR discrimination task on their first day of training. However, without prior training on a visual discrimination task, the learning rates for S1- and V1-implanted animals converged, suggesting there is no intrinsic difference in learning rate between the two areas. We also discovered that animals were able to integrate IR information into the ongoing visual processing stream in V1, performing a visual-IR integration task in which they had to combine IR and visual information. Furthermore, when the IR prosthesis was implanted in S1, rats showed no impairment in their ability to use their whiskers to perform a tactile discrimination task. Instead, in some rats, this ability was actually enhanced. Cumulatively, these findings suggest that cortical sensory neuroprostheses can rapidly augment the representational scope of primary sensory areas, integrating novel sources of information into ongoing processing while incurring minimal loss of native function. PMID:29279860

  4. Association of visual sensory function and higher order visual processing skills with incident driving cessation

    PubMed Central

    Huisingh, Carrie; McGwin, Gerald; Owsley, Cynthia

    2017-01-01

    Background Many studies on vision and driving cessation have relied on measures of sensory function, which are insensitive to the higher-order cognitive aspects of visual processing. The purpose of this study was to examine the association between traditional measures of visual sensory function and higher-order visual processing skills with incident driving cessation in a population-based sample of older drivers. Methods Two thousand licensed drivers aged ≥70 were enrolled and followed up for three years. Tests for central vision and visual processing were administered at baseline and included visual acuity, contrast sensitivity, sensitivity in the driving visual field, visual processing speed (Useful Field of View (UFOV) Subtest 2 and Trails B), and spatial ability measured by the Visual Closure Subtest of the Motor-free Visual Perception Test. Participants self-reported the month and year of driving cessation and provided a reason for cessation. Cox proportional hazards models were used to generate crude and adjusted hazard ratios with 95% confidence intervals between visual functioning characteristics and risk of driving cessation over a three-year period. Results During the study period, 164 participants stopped driving, corresponding to a cumulative incidence of 8.5%. Impaired contrast sensitivity, visual fields, visual processing speed (UFOV and Trails B), and spatial ability were significant risk factors for subsequent driving cessation after adjusting for age, gender, marital status, number of medical conditions, and miles driven. Visual acuity impairment was not associated with driving cessation. Medical problems (63%), specifically musculoskeletal and neurological problems, as well as vision problems (17%), were cited most frequently as the reason for driving cessation. Conclusion Assessment of cognitive and visual functioning can provide useful information about subsequent risk of driving cessation among older drivers. In addition, a variety of factors, not just vision, influenced the decision to stop driving and may be amenable to intervention. PMID:27353969
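
The cumulative incidence reported above is simply the number of incident events divided by the baseline cohort within the follow-up window (the full study instead used Cox models, which additionally handle covariates and censoring). A minimal sketch of the incidence calculation, with made-up numbers rather than the study's data:

```python
def cumulative_incidence(event_times, n_at_risk, horizon):
    """Fraction of the baseline cohort experiencing the event by `horizon` (years)."""
    events = sum(1 for t in event_times if t <= horizon)
    return events / n_at_risk

# Hypothetical illustration: 17 of 200 drivers stop driving within 3 years
times = [0.5, 1.2, 2.9] * 5 + [1.1, 2.5]   # 17 event times, all within 3 years
print(cumulative_incidence(times, 200, 3.0))  # 0.085
```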

  5. Manually controlled human balancing using visual, vestibular and proprioceptive senses involves a common, low frequency neural process

    PubMed Central

    Lakie, Martin; Loram, Ian D

    2006-01-01

    Ten subjects balanced their own body or a mechanically equivalent unstable inverted pendulum by hand, through a compliant spring linkage. Their balancing process was always characterized by repeated small reciprocating hand movements. These bias adjustments were an observable sign of intermittent alterations in neural output. On average, the adjustments occurred at intervals of ∼400 ms. To generate appropriate stabilizing bias adjustments, sensory information about body or load movement is needed. Subjects used visual, vestibular or proprioceptive sensation alone and in combination to perform the tasks. We first ask: is the time between adjustments (bias duration) sensory-specific? Vision is associated with slow responses. Other senses involved with balance are known to be faster. Our second question is: does bias duration depend on sensory abundance? An appropriate bias adjustment cannot occur until unplanned motion is unambiguously perceived (a sensory threshold). The addition of more sensory data should therefore expedite action, decreasing the mean bias adjustment duration. Statistical analysis showed that (1) the mean bias adjustment duration was remarkably independent of the sensory modality and (2) the addition of one or two sensory modalities made a small, but significant, decrease in the mean bias adjustment duration. Thus, a threshold effect can alter only a very minor part of the bias duration. The bias adjustment duration in manual balancing must reflect something more than visual sensation and perceptual thresholds; our suggestion is that it is a common central motor planning process. We predict that similar processes may be identified in the control of standing. PMID:16959857

  6. Cross-frequency synchronization connects networks of fast and slow oscillations during visual working memory maintenance

    PubMed Central

    Siebenhühner, Felix; Wang, Sheng H; Palva, J Matias; Palva, Satu

    2016-01-01

    Neuronal activity in sensory and fronto-parietal (FP) areas underlies the representation and attentional control, respectively, of sensory information maintained in visual working memory (VWM). Within these regions, beta/gamma phase-synchronization supports the integration of sensory functions, while synchronization in theta/alpha bands supports the regulation of attentional functions. A key challenge is to understand which mechanisms integrate neuronal processing across these distinct frequencies and thereby the sensory and attentional functions. We investigated whether such integration could be achieved by cross-frequency phase synchrony (CFS). Using concurrent magneto- and electroencephalography, we found that CFS was load-dependently enhanced between theta and alpha–gamma and between alpha and beta-gamma oscillations during VWM maintenance among visual, FP, and dorsal attention (DA) systems. CFS also connected the hubs of within-frequency-synchronized networks and its strength predicted individual VWM capacity. We propose that CFS integrates processing among synchronized neuronal networks from theta to gamma frequencies to link sensory and attentional functions. DOI: http://dx.doi.org/10.7554/eLife.13451.001 PMID:27669146
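
Cross-frequency phase synchrony of the kind described here is commonly quantified as an n:m phase-locking value between the instantaneous phases of two band-limited signals. A minimal sketch (not the authors' MEG/EEG pipeline; the signals and parameters below are illustrative), using the Hilbert transform to extract phase:

```python
import numpy as np
from scipy.signal import hilbert

def cross_freq_plv(x_slow, x_fast, n, m):
    """n:m phase-locking value between two band-limited signals.

    Values near 1 mean n cycles of the slow oscillation stay phase-locked
    to m cycles of the fast one; values near 0 mean no consistent locking.
    """
    phi_slow = np.angle(hilbert(x_slow))
    phi_fast = np.angle(hilbert(x_fast))
    return float(np.abs(np.mean(np.exp(1j * (n * phi_slow - m * phi_fast)))))

# Illustrative signals: a 6 Hz "theta" and a phase-locked 18 Hz third harmonic
t = np.arange(0, 4, 1 / 500.0)               # 4 s sampled at 500 Hz
theta = np.sin(2 * np.pi * 6 * t)
gamma_locked = np.sin(2 * np.pi * 18 * t)    # 3:1 locked to theta
gamma_free = np.sin(2 * np.pi * 17.3 * t)    # not an integer multiple
print(cross_freq_plv(theta, gamma_locked, 3, 1))  # close to 1
print(cross_freq_plv(theta, gamma_free, 3, 1))    # much smaller
```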

  7. The cortical basis of true memory and false memory for motion.

    PubMed

    Karanian, Jessica M; Slotnick, Scott D

    2014-02-01

    Behavioral evidence indicates that false memory, like true memory, can be rich in sensory detail. By contrast, there is fMRI evidence that true memory for visual information produces greater activity in earlier visual regions than false memory, which suggests true memory is associated with greater sensory detail. However, false memory in previous fMRI paradigms may have lacked sufficient sensory detail to recruit earlier visual processing regions. To investigate this possibility in the present fMRI study, we employed a paradigm that produced feature-specific false memory with a high degree of visual detail. During the encoding phase, moving or stationary abstract shapes were presented to the left or right of fixation. During the retrieval phase, shapes from encoding were presented at fixation and participants classified each item as previously "moving" or "stationary" within each visual field. Consistent with previous fMRI findings, true memory but not false memory for motion activated motion processing region MT+, while both true memory and false memory activated later cortical processing regions. In addition, false memory but not true memory for motion activated language processing regions. The present findings indicate that true memory activates earlier visual regions to a greater degree than false memory, even under conditions of detailed retrieval. Thus, the dissociation between previous behavioral findings and fMRI findings does not appear to be task dependent. Future work will be needed to assess whether the same pattern of true memory and false memory activity is observed for different sensory modalities. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Visual noise disrupts conceptual integration in reading.

    PubMed

    Gao, Xuefei; Stine-Morrow, Elizabeth A L; Noh, Soo Rim; Eskew, Rhea T

    2011-02-01

    The Effortfulness Hypothesis suggests that sensory impairment (either simulated or age-related) may decrease capacity for semantic integration in language comprehension. We directly tested this hypothesis by measuring resource allocation to different levels of processing during reading (i.e., word vs. semantic analysis). College students read three sets of passages word-by-word, one at each of three levels of dynamic visual noise. There was a reliable interaction between processing level and noise, such that visual noise increased resources allocated to word-level processing, at the cost of attention paid to semantic analysis. Recall of the most important ideas also decreased with increasing visual noise. Results suggest that sensory challenge can impair higher-level cognitive functions in learning from text, supporting the Effortfulness Hypothesis.

  9. Differential sensory cortical involvement in auditory and visual sensorimotor temporal recalibration: Evidence from transcranial direct current stimulation (tDCS).

    PubMed

    Aytemür, Ali; Almeida, Nathalia; Lee, Kwang-Hyuk

    2017-02-01

    Adaptation to delayed sensory feedback following an action produces a subjective time compression between the action and the feedback (temporal recalibration effect, TRE). TRE is important for sensory delay compensation to maintain a relationship between causally related events. It is unclear whether TRE is a sensory modality-specific phenomenon. In 3 experiments employing a sensorimotor synchronization task, we investigated this question using cathodal transcranial direct-current stimulation (tDCS). We found that cathodal tDCS over the visual cortex, and to a lesser extent over the auditory cortex, produced decreased visual TRE. However, neither auditory nor visual cortex tDCS produced any measurable effect on auditory TRE. Our study revealed the different natures of TRE in the auditory and visual domains. Visual-motor TRE, which is more variable than auditory TRE, is a sensory modality-specific phenomenon, modulated by the auditory cortex. The robustness of auditory-motor TRE, unaffected by tDCS, suggests the dominance of the auditory system in temporal processing, by providing a frame of reference in the realignment of sensorimotor timing signals. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Attention distributed across sensory modalities enhances perceptual performance

    PubMed Central

    Mishra, Jyoti; Gazzaley, Adam

    2012-01-01

    This study investigated the interaction between top-down attentional control and multisensory processing in humans. Using semantically congruent and incongruent audiovisual stimulus streams, we found target detection to be consistently improved in the setting of distributed audiovisual attention versus focused visual attention. This performance benefit was manifested as faster reaction times for congruent audiovisual stimuli, and as accuracy improvements for incongruent stimuli, resulting in a resolution of stimulus interference. Electrophysiological recordings revealed that these behavioral enhancements were associated with reduced neural processing of both auditory and visual components of the audiovisual stimuli under distributed vs. focused visual attention. These neural changes were observed at early processing latencies, within 100–300 ms post-stimulus onset, and localized to auditory, visual, and polysensory temporal cortices. These results highlight a novel neural mechanism for top-down driven performance benefits via enhanced efficacy of sensory neural processing during distributed audiovisual attention relative to focused visual attention. PMID:22933811

  11. Sensory gain control (amplification) as a mechanism of selective attention: electrophysiological and neuroimaging evidence.

    PubMed Central

    Hillyard, S A; Vogel, E K; Luck, S J

    1998-01-01

    Both physiological and behavioral studies have suggested that stimulus-driven neural activity in the sensory pathways can be modulated in amplitude during selective attention. Recordings of event-related brain potentials indicate that such sensory gain control or amplification processes play an important role in visual-spatial attention. Combined event-related brain potential and neuroimaging experiments provide strong evidence that attentional gain control operates at an early stage of visual processing in extrastriate cortical areas. These data support early selection theories of attention and provide a basis for distinguishing between separate mechanisms of attentional suppression (of unattended inputs) and attentional facilitation (of attended inputs). PMID:9770220

  12. Central Processing Dysfunctions in Children: A Review of Research.

    ERIC Educational Resources Information Center

    Chalfant, James C.; Scheffelin, Margaret A.

    Research on central processing dysfunctions in children is reviewed in three major areas. The first, dysfunctions in the analysis of sensory information, includes auditory, visual, and haptic processing. The second, dysfunction in the synthesis of sensory information, covers multiple stimulus integration and short-term memory. The third area of…

  13. Experimental and Computational Studies of Cortical Neural Network Properties Through Signal Processing

    NASA Astrophysics Data System (ADS)

    Clawson, Wesley Patrick

    Previous studies, both theoretical and experimental, of network-level dynamics in the cerebral cortex show evidence for a statistical phenomenon called criticality: a phenomenon originally studied in the context of phase transitions in physical systems, and one associated with favorable information processing in the context of the brain. The focus of this thesis is to expand upon past results with new experimentation and modeling to show a relationship between criticality and the ability to detect and discriminate sensory input. A line of theoretical work predicts maximal sensory discrimination as a functional benefit of criticality, which can then be characterized using the mutual information between sensory input (the visual stimulus) and neural response. The primary finding of our experiments in the visual cortex of turtles and of our neuronal network modeling confirms this theoretical prediction. We show that sensory discrimination is maximized when visual cortex operates near criticality. In addition to presenting this primary finding in detail, this thesis will also address our preliminary results on change-point detection in experimentally measured cortical dynamics.
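
The mutual information between stimulus and response invoked above can be computed directly from a joint probability (or count) table. A small illustrative sketch, not the thesis's analysis code:

```python
import numpy as np

def mutual_information(joint):
    """Mutual information in bits from a joint count/probability table,
    with rows indexing stimuli and columns indexing neural responses."""
    p = np.asarray(joint, dtype=float)
    p = p / p.sum()
    ps = p.sum(axis=1, keepdims=True)   # marginal P(stimulus)
    pr = p.sum(axis=0, keepdims=True)   # marginal P(response)
    nz = p > 0                          # treat 0 * log(0) as 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (ps * pr)[nz])))

# Perfect 4-way discrimination: the response identifies the stimulus -> 2 bits
print(mutual_information(np.eye(4)))        # 2.0
# Response independent of the stimulus -> 0 bits
print(mutual_information(np.ones((4, 4))))  # 0.0
```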

  14. System and method for image mapping and visual attention

    NASA Technical Reports Server (NTRS)

    Peters, II, Richard A. (Inventor)

    2010-01-01

    A method is described for mapping dense sensory data to a Sensory Ego Sphere (SES). Methods are also described for finding and ranking areas of interest in the images that form a complete visual scene on an SES. Further, attentional processing of image data is best done by performing attentional processing on individual full-size images from the image sequence, mapping each attentional location to the nearest node, and then summing attentional locations at each node.

  15. System and method for image mapping and visual attention

    NASA Technical Reports Server (NTRS)

    Peters, II, Richard A. (Inventor)

    2011-01-01

    A method is described for mapping dense sensory data to a Sensory Ego Sphere (SES). Methods are also described for finding and ranking areas of interest in the images that form a complete visual scene on an SES. Further, attentional processing of image data is best done by performing attentional processing on individual full-size images from the image sequence, mapping each attentional location to the nearest node, and then summing all attentional locations at each node.

  16. Learning Enhances Sensory and Multiple Non-sensory Representations in Primary Visual Cortex

    PubMed Central

    Poort, Jasper; Khan, Adil G.; Pachitariu, Marius; Nemri, Abdellatif; Orsolic, Ivana; Krupic, Julija; Bauza, Marius; Sahani, Maneesh; Keller, Georg B.; Mrsic-Flogel, Thomas D.; Hofer, Sonja B.

    2015-01-01

    Summary We determined how learning modifies neural representations in primary visual cortex (V1) during acquisition of a visually guided behavioral task. We imaged the activity of the same layer 2/3 neuronal populations as mice learned to discriminate two visual patterns while running through a virtual corridor, where one pattern was rewarded. Improvements in behavioral performance were closely associated with increasingly distinguishable population-level representations of task-relevant stimuli, as a result of stabilization of existing and recruitment of new neurons selective for these stimuli. These effects correlated with the appearance of multiple task-dependent signals during learning: those that increased neuronal selectivity across the population when expert animals engaged in the task, and those reflecting anticipation or behavioral choices specifically in neuronal subsets preferring the rewarded stimulus. Therefore, learning engages diverse mechanisms that modify sensory and non-sensory representations in V1 to adjust its processing to task requirements and the behavioral relevance of visual stimuli. PMID:26051421

  17. Methods and Apparatus for Autonomous Robotic Control

    NASA Technical Reports Server (NTRS)

    Gorshechnikov, Anatoly (Inventor); Livitz, Gennady (Inventor); Versace, Massimiliano (Inventor); Palma, Jesse (Inventor)

    2017-01-01

    Sensory processing of visual, auditory, and other sensor information (e.g., visual imagery, LIDAR, RADAR) is conventionally based on "stovepiped," or isolated processing, with little interactions between modules. Biological systems, on the other hand, fuse multi-sensory information to identify nearby objects of interest more quickly, more efficiently, and with higher signal-to-noise ratios. Similarly, examples of the OpenSense technology disclosed herein use neurally inspired processing to identify and locate objects in a robot's environment. This enables the robot to navigate its environment more quickly and with lower computational and power requirements.

  18. Cross-modal cueing of attention alters appearance and early cortical processing of visual stimuli

    PubMed Central

    Störmer, Viola S.; McDonald, John J.; Hillyard, Steven A.

    2009-01-01

    The question of whether attention makes sensory impressions appear more intense has been a matter of debate for over a century. Recent psychophysical studies have reported that attention increases apparent contrast of visual stimuli, but the issue continues to be debated. We obtained converging neurophysiological evidence from human observers as they judged the relative contrast of visual stimuli presented to the left and right visual fields following a lateralized auditory cue. Cross-modal cueing of attention boosted the apparent contrast of the visual target in association with an enlarged neural response in the contralateral visual cortex that began within 100 ms after target onset. The magnitude of the enhanced neural response was positively correlated with perceptual reports of the cued target being higher in contrast. The results suggest that attention increases the perceived contrast of visual stimuli by boosting early sensory processing in the visual cortex. PMID:20007778

  19. Cross-modal cueing of attention alters appearance and early cortical processing of visual stimuli.

    PubMed

    Störmer, Viola S; McDonald, John J; Hillyard, Steven A

    2009-12-29

    The question of whether attention makes sensory impressions appear more intense has been a matter of debate for over a century. Recent psychophysical studies have reported that attention increases apparent contrast of visual stimuli, but the issue continues to be debated. We obtained converging neurophysiological evidence from human observers as they judged the relative contrast of visual stimuli presented to the left and right visual fields following a lateralized auditory cue. Cross-modal cueing of attention boosted the apparent contrast of the visual target in association with an enlarged neural response in the contralateral visual cortex that began within 100 ms after target onset. The magnitude of the enhanced neural response was positively correlated with perceptual reports of the cued target being higher in contrast. The results suggest that attention increases the perceived contrast of visual stimuli by boosting early sensory processing in the visual cortex.

  20. Shared sensory estimates for human motion perception and pursuit eye movements.

    PubMed

    Mukherjee, Trishna; Battifarano, Matthew; Simoncini, Claudio; Osborne, Leslie C

    2015-06-03

    Are sensory estimates formed centrally in the brain and then shared between perceptual and motor pathways or is centrally represented sensory activity decoded independently to drive awareness and action? Questions about the brain's information flow pose a challenge because systems-level estimates of environmental signals are only accessible indirectly as behavior. Assessing whether sensory estimates are shared between perceptual and motor circuits requires comparing perceptual reports with motor behavior arising from the same sensory activity. Extrastriate visual cortex both mediates the perception of visual motion and provides the visual inputs for behaviors such as smooth pursuit eye movements. Pursuit has been a valuable testing ground for theories of sensory information processing because the neural circuits and physiological response properties of motion-responsive cortical areas are well studied, sensory estimates of visual motion signals are formed quickly, and the initiation of pursuit is closely coupled to sensory estimates of target motion. Here, we analyzed variability in visually driven smooth pursuit and perceptual reports of target direction and speed in human subjects while we manipulated the signal-to-noise level of motion estimates. Comparable levels of variability throughout viewing time and across conditions provide evidence for shared noise sources in the perception and action pathways arising from a common sensory estimate. We found that conditions that create poor, low-gain pursuit create a discrepancy between the precision of perception and that of pursuit. Differences in pursuit gain arising from differences in optic flow strength in the stimulus reconcile much of the controversy on this topic. Copyright © 2015 the authors.

  1. Shared Sensory Estimates for Human Motion Perception and Pursuit Eye Movements

    PubMed Central

    Mukherjee, Trishna; Battifarano, Matthew; Simoncini, Claudio

    2015-01-01

    Are sensory estimates formed centrally in the brain and then shared between perceptual and motor pathways or is centrally represented sensory activity decoded independently to drive awareness and action? Questions about the brain's information flow pose a challenge because systems-level estimates of environmental signals are only accessible indirectly as behavior. Assessing whether sensory estimates are shared between perceptual and motor circuits requires comparing perceptual reports with motor behavior arising from the same sensory activity. Extrastriate visual cortex both mediates the perception of visual motion and provides the visual inputs for behaviors such as smooth pursuit eye movements. Pursuit has been a valuable testing ground for theories of sensory information processing because the neural circuits and physiological response properties of motion-responsive cortical areas are well studied, sensory estimates of visual motion signals are formed quickly, and the initiation of pursuit is closely coupled to sensory estimates of target motion. Here, we analyzed variability in visually driven smooth pursuit and perceptual reports of target direction and speed in human subjects while we manipulated the signal-to-noise level of motion estimates. Comparable levels of variability throughout viewing time and across conditions provide evidence for shared noise sources in the perception and action pathways arising from a common sensory estimate. We found that conditions that create poor, low-gain pursuit create a discrepancy between the precision of perception and that of pursuit. Differences in pursuit gain arising from differences in optic flow strength in the stimulus reconcile much of the controversy on this topic. PMID:26041919

  2. Sensory Eye Dominance in Treated Anisometropic Amblyopia

    PubMed Central

    Chen, Yao

    2017-01-01

    Amblyopia results from inadequate visual experience during the critical period of visual development. Abnormal binocular interactions are believed to play a critical role in amblyopia. These binocular deficits can often be resolved, owing to the residual visual plasticity in amblyopes. In this study, we quantitatively measured the sensory eye dominance in treated anisometropic amblyopes to determine whether they had fully recovered. Fourteen treated anisometropic amblyopes with normal or corrected to normal visual acuity participated, and their sensory eye dominance was assessed by using a binocular phase combination paradigm. We found that the two eyes were unequal in binocular combination in most (11 out of 14) of our treated anisometropic amblyopes, but none of the controls. We concluded that the treated anisometropic amblyopes, even those with a normal range of visual acuity, exhibited abnormal binocular processing. Our results thus suggest that there is potential for improvement in treated anisometropic amblyopes that may further enhance their binocular visual functioning. PMID:28573051

  3. Sensory Contributions to Impaired Emotion Processing in Schizophrenia

    PubMed Central

    Butler, Pamela D.; Abeles, Ilana Y.; Weiskopf, Nicole G.; Tambini, Arielle; Jalbrzikowski, Maria; Legatt, Michael E.; Zemon, Vance; Loughead, James; Gur, Ruben C.; Javitt, Daniel C.

    2009-01-01

    Both emotion and visual processing deficits are documented in schizophrenia, and preferential magnocellular visual pathway dysfunction has been reported in several studies. This study examined the contribution to emotion-processing deficits of magnocellular and parvocellular visual pathway function, based on stimulus properties and shape of contrast response functions. Experiment 1 examined the relationship between contrast sensitivity to magnocellular- and parvocellular-biased stimuli and emotion recognition using the Penn Emotion Recognition (ER-40) and Emotion Differentiation (EMODIFF) tests. Experiment 2 altered the contrast levels of the faces themselves to determine whether emotion detection curves would show a pattern characteristic of magnocellular neurons and whether patients would show a deficit in performance related to early sensory processing stages. Results for experiment 1 showed that patients had impaired emotion processing and a preferential magnocellular deficit on the contrast sensitivity task. Greater deficits in ER-40 and EMODIFF performance correlated with impaired contrast sensitivity to the magnocellular-biased condition, which remained significant for the EMODIFF task even when nonspecific correlations due to group were considered in a step-wise regression. Experiment 2 showed contrast response functions indicative of magnocellular processing for both groups, with patients showing impaired performance. Impaired emotion identification on this task was also correlated with magnocellular-biased visual sensory processing dysfunction. These results provide evidence for a contribution of impaired early-stage visual processing in emotion recognition deficits in schizophrenia and suggest that a bottom-up approach to remediation may be effective. PMID:19793797

  4. Sensory contributions to impaired emotion processing in schizophrenia.

    PubMed

    Butler, Pamela D; Abeles, Ilana Y; Weiskopf, Nicole G; Tambini, Arielle; Jalbrzikowski, Maria; Legatt, Michael E; Zemon, Vance; Loughead, James; Gur, Ruben C; Javitt, Daniel C

    2009-11-01

    Both emotion and visual processing deficits are documented in schizophrenia, and preferential magnocellular visual pathway dysfunction has been reported in several studies. This study examined the contribution to emotion-processing deficits of magnocellular and parvocellular visual pathway function, based on stimulus properties and shape of contrast response functions. Experiment 1 examined the relationship between contrast sensitivity to magnocellular- and parvocellular-biased stimuli and emotion recognition using the Penn Emotion Recognition (ER-40) and Emotion Differentiation (EMODIFF) tests. Experiment 2 altered the contrast levels of the faces themselves to determine whether emotion detection curves would show a pattern characteristic of magnocellular neurons and whether patients would show a deficit in performance related to early sensory processing stages. Results for experiment 1 showed that patients had impaired emotion processing and a preferential magnocellular deficit on the contrast sensitivity task. Greater deficits in ER-40 and EMODIFF performance correlated with impaired contrast sensitivity to the magnocellular-biased condition, which remained significant for the EMODIFF task even when nonspecific correlations due to group were considered in a step-wise regression. Experiment 2 showed contrast response functions indicative of magnocellular processing for both groups, with patients showing impaired performance. Impaired emotion identification on this task was also correlated with magnocellular-biased visual sensory processing dysfunction. These results provide evidence for a contribution of impaired early-stage visual processing in emotion recognition deficits in schizophrenia and suggest that a bottom-up approach to remediation may be effective.

  5. A magnetoencephalography study of multi-modal processing of pain anticipation in primary sensory cortices.

    PubMed

    Gopalakrishnan, R; Burgess, R C; Plow, E B; Floden, D P; Machado, A G

    2015-09-24

    Pain anticipation plays a critical role in pain chronification and results in disability due to pain avoidance. It is important to understand how different sensory modalities (auditory, visual or tactile) may influence pain anticipation as different strategies could be applied to mitigate anticipatory phenomena and chronification. In this study, using a countdown paradigm, we evaluated with magnetoencephalography the neural networks associated with pain anticipation elicited by different sensory modalities in normal volunteers. When encountered with well-established cues that signaled pain, visual and somatosensory cortices engaged the pain neuromatrix areas early during the countdown process, whereas the auditory cortex displayed delayed processing. In addition, during pain anticipation, the visual cortex displayed independent processing capabilities after learning the contextual meaning of cues from associative and limbic areas. Interestingly, cross-modal activation was also evident and strong when visual and tactile cues signaled upcoming pain. Dorsolateral prefrontal cortex and mid-cingulate cortex showed significant activity during pain anticipation regardless of modality. Our results show pain anticipation is processed with great time efficiency by a highly specialized and hierarchical network. The highest degree of higher-order processing is modulated by context (pain) rather than content (modality) and rests within the associative limbic regions, corroborating their intrinsic role in chronification. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.

  6. Fundamental Visual Representations of Social Cognition in ASD

    DTIC Science & Technology

    2015-10-01

    …autism spectrum disorder as assessed by high density electrical mapping… C., Russo, N. N., & Foxe, J. J. (2013). Atypical cortical representation of peripheral visual space in children with an autism spectrum disorder. European Journal of Neuroscience, 38(1), 2125-2138. …Sensory processing issues are prevalent in the autism spectrum (ASD) population, and sensory adaptation can be a potential biomarker…

  7. Multi-modal distraction: insights from children's limited attention.

    PubMed

    Matusz, Pawel J; Broadbent, Hannah; Ferrari, Jessica; Forrest, Benjamin; Merkley, Rebecca; Scerif, Gaia

    2015-03-01

    How does the multi-sensory nature of stimuli influence information processing? Cognitive systems with limited selective attention can elucidate these processes. Six-year-olds, 11-year-olds and 20-year-olds engaged in a visual search task that required them to detect a pre-defined coloured shape under conditions of low or high visual perceptual load. On each trial, a peripheral distractor that could be either compatible or incompatible with the current target colour was presented either visually, auditorily or audiovisually. Unlike unimodal distractors, audiovisual distractors elicited reliable compatibility effects across the two levels of load in adults and in the older children, but high visual load significantly reduced distraction for all children, especially the youngest participants. This study provides the first demonstration that multi-sensory distraction has powerful effects on selective attention: Adults and older children alike allocate attention to potentially relevant information across multiple senses. However, poorer attentional resources can, paradoxically, shield the youngest children from the deleterious effects of multi-sensory distraction. Furthermore, we highlight how developmental research can enrich the understanding of distinct mechanisms controlling adult selective attention in multi-sensory environments. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Model-based analysis of pattern motion processing in mouse primary visual cortex

    PubMed Central

    Muir, Dylan R.; Roth, Morgane M.; Helmchen, Fritjof; Kampa, Björn M.

    2015-01-01

    Neurons in sensory areas of neocortex exhibit responses tuned to specific features of the environment. In visual cortex, information about features such as edges or textures with particular orientations must be integrated to recognize a visual scene or object. Connectivity studies in rodent cortex have revealed that neurons make specific connections within sub-networks sharing common input tuning. In principle, this sub-network architecture enables local cortical circuits to integrate sensory information. However, whether feature integration indeed occurs locally in rodent primary sensory areas has not been examined directly. We studied local integration of sensory features in primary visual cortex (V1) of the mouse by presenting drifting grating and plaid stimuli, while recording the activity of neuronal populations with two-photon calcium imaging. Using a Bayesian model-based analysis framework, we classified single-cell responses as being selective for either individual grating components or for moving plaid patterns. Rather than relying on trial-averaged responses, our model-based framework takes into account single-trial responses and can easily be extended to consider any number of arbitrary predictive models. Our analysis method was able to successfully classify significantly more responses than traditional partial correlation (PC) analysis, and provides a rigorous statistical framework to rank any number of models and reject poorly performing models. We also found a large proportion of cells that respond strongly to only one stimulus class. In addition, a quarter of selectively responding neurons had more complex responses that could not be explained by any simple integration model. Our results show that a broad range of pattern integration processes already take place at the level of V1. This diversity of integration is consistent with processing of visual inputs by local sub-networks within V1 that are tuned to combinations of sensory features. 
PMID:26300738
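The partial correlation (PC) baseline that the record above compares against is the classic component/pattern classification of plaid responses. As a rough illustration of that baseline (not of the authors' Bayesian framework), here is a minimal Python sketch; the tuning curve, noise level, and 1.28 criterion are illustrative assumptions, and the demo cell is synthetic.

```python
import numpy as np

def partial_corr(r_xy, r_xz, r_yz):
    """Correlation of x and y with the influence of z partialled out."""
    return (r_xy - r_xz * r_yz) / np.sqrt((1 - r_xz**2) * (1 - r_yz**2))

def classify_plaid_response(resp, pred_pattern, pred_component, n_dirs):
    """Label a cell 'pattern', 'component', or 'unclassified' from its plaid
    responses and the two model predictions (classic PC analysis)."""
    r_p = np.corrcoef(resp, pred_pattern)[0, 1]
    r_c = np.corrcoef(resp, pred_component)[0, 1]
    r_pc = np.corrcoef(pred_pattern, pred_component)[0, 1]
    R_p = partial_corr(r_p, r_c, r_pc)
    R_c = partial_corr(r_c, r_p, r_pc)
    # Fisher z-transform so the two partial correlations are comparable
    z_p = np.arctanh(R_p) * np.sqrt(n_dirs - 3)
    z_c = np.arctanh(R_c) * np.sqrt(n_dirs - 3)
    crit = 1.28  # illustrative one-tailed criterion
    if z_p - z_c > crit and z_p > crit:
        return "pattern"
    if z_c - z_p > crit and z_c > crit:
        return "component"
    return "unclassified"

# Synthetic demo: a cell whose plaid responses follow the component prediction
def tune(d):
    """Idealized direction-tuning curve of the cell (hypothetical)."""
    return np.exp(2 * np.cos(np.radians(d)))

dirs = np.arange(0, 360, 30)
pred_pattern = tune(dirs)                              # respond to plaid direction
pred_component = 0.5 * (tune(dirs - 60) + tune(dirs + 60))  # respond to gratings
rng = np.random.default_rng(1)
resp = pred_component + rng.normal(0, 0.05, dirs.size)
label = classify_plaid_response(resp, pred_pattern, pred_component, dirs.size)
```

As the abstract notes, this correlation-based scheme leaves many single-trial responses unclassified, which is the gap the authors' model-based framework addresses.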

  9. Visual Perceptual Learning and Models.

    PubMed

    Dosher, Barbara; Lu, Zhong-Lin

    2017-09-15

    Visual perceptual learning through practice or training can significantly improve performance on visual tasks. Originally seen as a manifestation of plasticity in the primary visual cortex, perceptual learning is more readily understood as improvements in the function of brain networks that integrate processes, including sensory representations, decision, attention, and reward, and balance plasticity with system stability. This review considers the primary phenomena of perceptual learning, theories of perceptual learning, and perceptual learning's effect on signal and noise in visual processing and decision. Models, especially computational models, play a key role in behavioral and physiological investigations of the mechanisms of perceptual learning and for understanding, predicting, and optimizing human perceptual processes, learning, and performance. Performance improvements resulting from reweighting or readout of sensory inputs to decision provide a strong theoretical framework for interpreting perceptual learning and transfer that may prove useful in optimizing learning in real-world applications.

  10. Auditory and visual sequence learning in humans and monkeys using an artificial grammar learning paradigm.

    PubMed

    Milne, Alice E; Petkov, Christopher I; Wilson, Benjamin

    2017-07-05

Language flexibly supports the human ability to communicate using different sensory modalities, such as writing and reading in the visual modality and speaking and listening in the auditory domain. Although it has been argued that nonhuman primate communication abilities are inherently multisensory, direct behavioural comparisons between human and nonhuman primates are scant. Artificial grammar learning (AGL) tasks and statistical learning experiments can be used to emulate ordering relationships between words in a sentence. However, previous comparative work using such paradigms has primarily investigated sequence learning within a single sensory modality. We used an AGL paradigm to evaluate how humans and macaque monkeys learn and respond to identically structured sequences of either auditory or visual stimuli. In the auditory and visual experiments, we found that both species were sensitive to the ordering relationships between elements in the sequences. Moreover, the humans and monkeys produced largely similar response patterns to the visual and auditory sequences, indicating that the sequences are processed in comparable ways across the sensory modalities. These results provide evidence that human sequence processing abilities stem from an evolutionarily conserved capacity that appears to operate comparably across the sensory modalities in both human and nonhuman primates. The findings set the stage for future neurobiological studies to investigate the multisensory nature of these sequencing operations in nonhuman primates and how they compare to related processes in humans. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  11. Age-equivalent top-down modulation during cross-modal selective attention.

    PubMed

    Guerreiro, Maria J S; Anguera, Joaquin A; Mishra, Jyoti; Van Gerven, Pascal W M; Gazzaley, Adam

    2014-12-01

    Selective attention involves top-down modulation of sensory cortical areas, such that responses to relevant information are enhanced whereas responses to irrelevant information are suppressed. Suppression of irrelevant information, unlike enhancement of relevant information, has been shown to be deficient in aging. Although these attentional mechanisms have been well characterized within the visual modality, little is known about these mechanisms when attention is selectively allocated across sensory modalities. The present EEG study addressed this issue by testing younger and older participants in three different tasks: Participants attended to the visual modality and ignored the auditory modality, attended to the auditory modality and ignored the visual modality, or passively perceived information presented through either modality. We found overall modulation of visual and auditory processing during cross-modal selective attention in both age groups. Top-down modulation of visual processing was observed as a trend toward enhancement of visual information in the setting of auditory distraction, but no significant suppression of visual distraction when auditory information was relevant. Top-down modulation of auditory processing, on the other hand, was observed as suppression of auditory distraction when visual stimuli were relevant, but no significant enhancement of auditory information in the setting of visual distraction. In addition, greater visual enhancement was associated with better recognition of relevant visual information, and greater auditory distractor suppression was associated with a better ability to ignore auditory distraction. There were no age differences in these effects, suggesting that when relevant and irrelevant information are presented through different sensory modalities, selective attention remains intact in older age.

  12. Bottlenecks of Motion Processing during a Visual Glance: The Leaky Flask Model

    PubMed Central

    Öğmen, Haluk; Ekiz, Onur; Huynh, Duong; Bedell, Harold E.; Tripathy, Srimant P.

    2013-01-01

    Where do the bottlenecks for information and attention lie when our visual system processes incoming stimuli? The human visual system encodes the incoming stimulus and transfers its contents into three major memory systems with increasing time scales, viz., sensory (or iconic) memory, visual short-term memory (VSTM), and long-term memory (LTM). It is commonly believed that the major bottleneck of information processing resides in VSTM. In contrast to this view, we show major bottlenecks for motion processing prior to VSTM. In the first experiment, we examined bottlenecks at the stimulus encoding stage through a partial-report technique by delivering the cue immediately at the end of the stimulus presentation. In the second experiment, we varied the cue delay to investigate sensory memory and VSTM. Performance decayed exponentially as a function of cue delay and we used the time-constant of the exponential-decay to demarcate sensory memory from VSTM. We then decomposed performance in terms of quality and quantity measures to analyze bottlenecks along these dimensions. In terms of the quality of information, two thirds to three quarters of the motion-processing bottleneck occurs in stimulus encoding rather than memory stages. In terms of the quantity of information, the motion-processing bottleneck is distributed, with the stimulus-encoding stage accounting for one third of the bottleneck. The bottleneck for the stimulus-encoding stage is dominated by the selection compared to the filtering function of attention. We also found that the filtering function of attention is operating mainly at the sensory memory stage in a specific manner, i.e., influencing only quantity and sparing quality. These results provide a novel and more complete understanding of information processing and storage bottlenecks for motion processing. PMID:24391806

  13. Bottlenecks of motion processing during a visual glance: the leaky flask model.

    PubMed

    Öğmen, Haluk; Ekiz, Onur; Huynh, Duong; Bedell, Harold E; Tripathy, Srimant P

    2013-01-01

    Where do the bottlenecks for information and attention lie when our visual system processes incoming stimuli? The human visual system encodes the incoming stimulus and transfers its contents into three major memory systems with increasing time scales, viz., sensory (or iconic) memory, visual short-term memory (VSTM), and long-term memory (LTM). It is commonly believed that the major bottleneck of information processing resides in VSTM. In contrast to this view, we show major bottlenecks for motion processing prior to VSTM. In the first experiment, we examined bottlenecks at the stimulus encoding stage through a partial-report technique by delivering the cue immediately at the end of the stimulus presentation. In the second experiment, we varied the cue delay to investigate sensory memory and VSTM. Performance decayed exponentially as a function of cue delay and we used the time-constant of the exponential-decay to demarcate sensory memory from VSTM. We then decomposed performance in terms of quality and quantity measures to analyze bottlenecks along these dimensions. In terms of the quality of information, two thirds to three quarters of the motion-processing bottleneck occurs in stimulus encoding rather than memory stages. In terms of the quantity of information, the motion-processing bottleneck is distributed, with the stimulus-encoding stage accounting for one third of the bottleneck. The bottleneck for the stimulus-encoding stage is dominated by the selection compared to the filtering function of attention. We also found that the filtering function of attention is operating mainly at the sensory memory stage in a specific manner, i.e., influencing only quantity and sparing quality. These results provide a novel and more complete understanding of information processing and storage bottlenecks for motion processing.
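The demarcation step described in the two records above, fitting an exponential decay to partial-report performance as a function of cue delay and reading off the time constant that separates sensory memory from VSTM, can be sketched as a simple curve fit. All numbers below are illustrative, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, p_vstm, p_0, tau):
    """Partial-report accuracy vs. cue delay: exponential decay from an
    initial level p_0 (sensory memory) to an asymptote p_vstm (VSTM)."""
    return p_vstm + (p_0 - p_vstm) * np.exp(-t / tau)

# Synthetic cue-delay data (hypothetical values for illustration)
delays_ms = np.linspace(0, 1000, 11)
accuracy = decay(delays_ms, 0.55, 0.90, 250.0)

# Fit and read off the time constant demarcating the two memory stores
(p_vstm, p_0, tau), _ = curve_fit(decay, delays_ms, accuracy,
                                  p0=[0.5, 1.0, 100.0])
```

Performance at delays much shorter than `tau` is attributed mainly to sensory (iconic) memory; the asymptote `p_vstm` reflects what survives into visual short-term memory.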

  14. Spatiotemporal Processing in Crossmodal Interactions for Perception of the External World: A Review

    PubMed Central

    Hidaka, Souta; Teramoto, Wataru; Sugita, Yoichi

    2015-01-01

Research regarding crossmodal interactions has garnered much interest in the last few decades. A variety of studies have demonstrated that multisensory signals (vision, audition, tactile sensation, and so on) can perceptually interact with one another in the spatial and temporal domains. Findings regarding crossmodal interactions in the spatiotemporal domain (i.e., motion processing) have also been reported, with updates in the last few years. In this review, we summarize past and recent findings on spatiotemporal processing in crossmodal interactions regarding perception of the external world. A traditional view regarding crossmodal interactions holds that vision is superior to audition in spatial processing, but audition is dominant over vision in temporal processing. Similarly, vision is considered to have dominant effects over the other sensory modalities (i.e., visual capture) in spatiotemporal processing. However, recent findings demonstrate that sound could have a driving effect on visual motion perception. Moreover, studies regarding perceptual associative learning reported that, after association is established between a sound sequence without spatial information and visual motion information, the sound sequence could trigger visual motion perception. Other sensory information, such as motor action or smell, has also exhibited similar driving effects on visual motion perception. Additionally, recent brain imaging studies demonstrate that similar activation patterns could be observed in several brain areas, including the motion processing areas, between spatiotemporal information from different sensory modalities. Based on these findings, we suggest that multimodal information could mutually interact in spatiotemporal processing in the perception of the external world and that common underlying perceptual and neural mechanisms may exist for spatiotemporal processing. PMID:26733827

  15. A hierarchy of timescales explains distinct effects of local inhibition of primary visual cortex and frontal eye fields

    PubMed Central

Cocchi, Luca; Sale, Martin V; Gollo, Leonardo L; Bell, Peter T; Nguyen, Vinh T; Zalesky, Andrew; Breakspear, Michael; Mattingley, Jason B

    2016-01-01

    Within the primate visual system, areas at lower levels of the cortical hierarchy process basic visual features, whereas those at higher levels, such as the frontal eye fields (FEF), are thought to modulate sensory processes via feedback connections. Despite these functional exchanges during perception, there is little shared activity between early and late visual regions at rest. How interactions emerge between regions encompassing distinct levels of the visual hierarchy remains unknown. Here we combined neuroimaging, non-invasive cortical stimulation and computational modelling to characterize changes in functional interactions across widespread neural networks before and after local inhibition of primary visual cortex or FEF. We found that stimulation of early visual cortex selectively increased feedforward interactions with FEF and extrastriate visual areas, whereas identical stimulation of the FEF decreased feedback interactions with early visual areas. Computational modelling suggests that these opposing effects reflect a fast-slow timescale hierarchy from sensory to association areas. DOI: http://dx.doi.org/10.7554/eLife.15252.001 PMID:27596931

  16. A hierarchy of timescales explains distinct effects of local inhibition of primary visual cortex and frontal eye fields.

    PubMed

Cocchi, Luca; Sale, Martin V; Gollo, Leonardo L; Bell, Peter T; Nguyen, Vinh T; Zalesky, Andrew; Breakspear, Michael; Mattingley, Jason B

    2016-09-06

    Within the primate visual system, areas at lower levels of the cortical hierarchy process basic visual features, whereas those at higher levels, such as the frontal eye fields (FEF), are thought to modulate sensory processes via feedback connections. Despite these functional exchanges during perception, there is little shared activity between early and late visual regions at rest. How interactions emerge between regions encompassing distinct levels of the visual hierarchy remains unknown. Here we combined neuroimaging, non-invasive cortical stimulation and computational modelling to characterize changes in functional interactions across widespread neural networks before and after local inhibition of primary visual cortex or FEF. We found that stimulation of early visual cortex selectively increased feedforward interactions with FEF and extrastriate visual areas, whereas identical stimulation of the FEF decreased feedback interactions with early visual areas. Computational modelling suggests that these opposing effects reflect a fast-slow timescale hierarchy from sensory to association areas.

  17. Temporal recalibration of motor and visual potentials in lag adaptation in voluntary movement.

    PubMed

    Cai, Chang; Ogawa, Kenji; Kochiyama, Takanori; Tanaka, Hirokazu; Imamizu, Hiroshi

    2018-05-15

    Adaptively recalibrating motor-sensory asynchrony is critical for animals to perceive self-produced action consequences. It is controversial whether motor- or sensory-related neural circuits recalibrate this asynchrony. By combining magnetoencephalography (MEG) and functional MRI (fMRI), we investigate the temporal changes in brain activities caused by repeated exposure to a 150-ms delay inserted between a button-press action and a subsequent flash. We found that readiness potentials significantly shift later in the motor system, especially in parietal regions (average: 219.9 ms), while visually evoked potentials significantly shift earlier in occipital regions (average: 49.7 ms) in the delay condition compared to the no-delay condition. Moreover, the shift in readiness potentials, but not in visually evoked potentials, was significantly correlated with the psychophysical measure of motor-sensory adaptation. These results suggest that although both motor and sensory processes contribute to the recalibration, the motor process plays the major role, given the magnitudes of shift and the correlation with the psychophysical measure. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  18. Influences of selective adaptation on perception of audiovisual speech

    PubMed Central

    Dias, James W.; Cook, Theresa C.; Rosenblum, Lawrence D.

    2016-01-01

    Research suggests that selective adaptation in speech is a low-level process dependent on sensory-specific information shared between the adaptor and test-stimuli. However, previous research has only examined how adaptors shift perception of unimodal test stimuli, either auditory or visual. In the current series of experiments, we investigated whether adaptation to cross-sensory phonetic information can influence perception of integrated audio-visual phonetic information. We examined how selective adaptation to audio and visual adaptors shift perception of speech along an audiovisual test continuum. This test-continuum consisted of nine audio-/ba/-visual-/va/ stimuli, ranging in visual clarity of the mouth. When the mouth was clearly visible, perceivers “heard” the audio-visual stimulus as an integrated “va” percept 93.7% of the time (e.g., McGurk & MacDonald, 1976). As visibility of the mouth became less clear across the nine-item continuum, the audio-visual “va” percept weakened, resulting in a continuum ranging in audio-visual percepts from /va/ to /ba/. Perception of the test-stimuli was tested before and after adaptation. Changes in audiovisual speech perception were observed following adaptation to visual-/va/ and audiovisual-/va/, but not following adaptation to auditory-/va/, auditory-/ba/, or visual-/ba/. Adaptation modulates perception of integrated audio-visual speech by modulating the processing of sensory-specific information. The results suggest that auditory and visual speech information are not completely integrated at the level of selective adaptation. PMID:27041781

  19. The sensory timecourses associated with conscious visual item memory and source memory.

    PubMed

    Thakral, Preston P; Slotnick, Scott D

    2015-09-01

    Previous event-related potential (ERP) findings have suggested that during visual item and source memory, nonconscious and conscious sensory (occipital-temporal) activity onsets may be restricted to early (0-800 ms) and late (800-1600 ms) temporal epochs, respectively. In an ERP experiment, we tested this hypothesis by separately assessing whether the onset of conscious sensory activity was restricted to the late epoch during source (location) memory and item (shape) memory. We found that conscious sensory activity had a late (>800 ms) onset during source memory and an early (<200 ms) onset during item memory. In a follow-up fMRI experiment, conscious sensory activity was localized to BA17, BA18, and BA19. Of primary importance, the distinct source memory and item memory ERP onsets contradict the hypothesis that there is a fixed temporal boundary separating nonconscious and conscious processing during all forms of visual conscious retrieval. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Information fusion via isocortex-based Area 37 modeling

    NASA Astrophysics Data System (ADS)

    Peterson, James K.

    2004-08-01

    A simplified model of information processing in the brain can be constructed using primary sensory input from two modalities (auditory and visual) and recurrent connections to the limbic subsystem. Information fusion would then occur in Area 37 of the temporal cortex. The creation of meta concepts from the low order primary inputs is managed by models of isocortex processing. Isocortex algorithms are used to model parietal (auditory), occipital (visual), temporal (polymodal fusion) cortex and the limbic system. Each of these four modules is constructed out of five cortical stacks in which each stack consists of three vertically oriented six layer isocortex models. The input to output training of each cortical model uses the OCOS (on center - off surround) and FFP (folded feedback pathway) circuitry of (Grossberg, 1) which is inherently a recurrent network type of learning characterized by the identification of perceptual groups. Models of this sort are thus closely related to cognitive models as it is difficult to divorce the sensory processing subsystems from the higher level processing in the associative cortex. The overall software architecture presented is biologically based and is presented as a potential architectural prototype for the development of novel sensory fusion strategies. The algorithms are motivated to some degree by specific data from projects on musical composition and autonomous fine art painting programs, but only in the sense that these projects use two specific types of auditory and visual cortex data. Hence, the architectures are presented for an artificial information processing system which utilizes two disparate sensory sources. The exact nature of the two primary sensory input streams is irrelevant.

  1. Visual and cross-modal cues increase the identification of overlapping visual stimuli in Balint's syndrome.

    PubMed

    D'Imperio, Daniela; Scandola, Michele; Gobbetto, Valeria; Bulgarelli, Cristina; Salgarello, Matteo; Avesani, Renato; Moro, Valentina

    2017-10-01

Cross-modal interactions improve the processing of external stimuli, particularly when an isolated sensory modality is impaired. When information from different modalities is integrated, object recognition is facilitated, probably as a result of bottom-up and top-down processes. The aim of this study was to investigate the potential effects of cross-modal stimulation in a case of simultanagnosia. We report a detailed analysis of clinical symptoms and an 18F-fluorodeoxyglucose (FDG) brain positron emission tomography/computed tomography (PET/CT) study of a patient affected by Balint's syndrome, a rare and invasive visual-spatial disorder following bilateral parieto-occipital lesions. An experiment was conducted to investigate the effects of visual and nonvisual cues on performance in tasks involving the recognition of overlapping pictures. Four modalities of sensory cues were used: visual, tactile, olfactory, and auditory. Data from neuropsychological tests showed the presence of ocular apraxia, optic ataxia, and simultanagnosia. The results of the experiment indicate a positive effect of the cues on the recognition of overlapping pictures, not only in the identification of the congruent valid-cued stimulus (target) but also in the identification of the other, noncued stimuli. All the sensory modalities analyzed (except the auditory stimulus) were efficacious in terms of increasing visual recognition. Cross-modal integration improved the patient's ability to recognize overlapping figures. However, while in the visual unimodal modality both bottom-up (priming, familiarity effect, disengagement of attention) and top-down processes (mental representation and short-term memory, the endogenous orientation of attention) are involved, in the cross-modal integration it is semantic representations that mainly activate visual recognition processes. These results are potentially useful for the design of rehabilitation training for attentional and visual-perceptual deficits.

  2. A crossmodal crossover: opposite effects of visual and auditory perceptual load on steady-state evoked potentials to irrelevant visual stimuli.

    PubMed

    Jacoby, Oscar; Hall, Sarah E; Mattingley, Jason B

    2012-07-16

    Mechanisms of attention are required to prioritise goal-relevant sensory events under conditions of stimulus competition. According to the perceptual load model of attention, the extent to which task-irrelevant inputs are processed is determined by the relative demands of discriminating the target: the more perceptually demanding the target task, the less unattended stimuli will be processed. Although much evidence supports the perceptual load model for competing stimuli within a single sensory modality, the effects of perceptual load in one modality on distractor processing in another is less clear. Here we used steady-state evoked potentials (SSEPs) to measure neural responses to irrelevant visual checkerboard stimuli while participants performed either a visual or auditory task that varied in perceptual load. Consistent with perceptual load theory, increasing visual task load suppressed SSEPs to the ignored visual checkerboards. In contrast, increasing auditory task load enhanced SSEPs to the ignored visual checkerboards. This enhanced neural response to irrelevant visual stimuli under auditory load suggests that exhausting capacity within one modality selectively compromises inhibitory processes required for filtering stimuli in another. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Areas activated during naturalistic reading comprehension overlap topological visual, auditory, and somatomotor maps.

    PubMed

    Sood, Mariam R; Sereno, Martin I

    2016-08-01

    Cortical mapping techniques using fMRI have been instrumental in identifying the boundaries of topological (neighbor-preserving) maps in early sensory areas. The presence of topological maps beyond early sensory areas raises the possibility that they might play a significant role in other cognitive systems, and that topological mapping might help to delineate areas involved in higher cognitive processes. In this study, we combine surface-based visual, auditory, and somatomotor mapping methods with a naturalistic reading comprehension task in the same group of subjects to provide a qualitative and quantitative assessment of the cortical overlap between sensory-motor maps in all major sensory modalities, and reading processing regions. Our results suggest that cortical activation during naturalistic reading comprehension overlaps more extensively with topological sensory-motor maps than has been heretofore appreciated. Reading activation in regions adjacent to occipital lobe and inferior parietal lobe almost completely overlaps visual maps, whereas a significant portion of frontal activation for reading in dorsolateral and ventral prefrontal cortex overlaps both visual and auditory maps. Even classical language regions in superior temporal cortex are partially overlapped by topological visual and auditory maps. By contrast, the main overlap with somatomotor maps is restricted to a small region on the anterior bank of the central sulcus near the border between the face and hand representations of M-I. Hum Brain Mapp 37:2784-2810, 2016. © 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.

  4. Effects of attention and laterality on motion and orientation discrimination in deaf signers.

    PubMed

    Bosworth, Rain G; Petrich, Jennifer A F; Dobkins, Karen R

    2013-06-01

    Previous studies have asked whether visual sensitivity and attentional processing in deaf signers are enhanced or altered as a result of their different sensory experiences during development, i.e., auditory deprivation and exposure to a visual language. In particular, deaf and hearing signers have been shown to exhibit a right visual field/left hemisphere advantage for motion processing, while hearing nonsigners do not. To examine whether this finding extends to other aspects of visual processing, we compared deaf signers and hearing nonsigners on motion, form, and brightness discrimination tasks. Secondly, to examine whether hemispheric lateralities are affected by attention, we employed a dual-task paradigm to measure form and motion thresholds under "full" vs. "poor" attention conditions. Deaf signers, but not hearing nonsigners, exhibited a right visual field advantage for motion processing. This effect was also seen for form processing and not for the brightness task. Moreover, no group differences were observed in attentional effects, and the motion and form visual field asymmetries were not modulated by attention, suggesting they occur at early levels of sensory processing. In sum, the results show that processing of motion and form, believed to be mediated by dorsal and ventral visual pathways, respectively, are left-hemisphere dominant in deaf signers. Published by Elsevier Inc.

  5. Mental Imagery Induces Cross-Modal Sensory Plasticity and Changes Future Auditory Perception.

    PubMed

    Berger, Christopher C; Ehrsson, H Henrik

    2018-04-01

    Can what we imagine in our minds change how we perceive the world in the future? A continuous process of multisensory integration and recalibration is responsible for maintaining a correspondence between the senses (e.g., vision, touch, audition) and, ultimately, a stable and coherent perception of our environment. This process depends on the plasticity of our sensory systems. The so-called ventriloquism aftereffect, a shift in the perceived localization of sounds presented alone after repeated exposure to spatially mismatched auditory and visual stimuli, is a clear example of this type of plasticity in the audiovisual domain. In a series of six studies with 24 participants each, we investigated an imagery-induced ventriloquism aftereffect in which imagining a visual stimulus elicits the same frequency-specific auditory aftereffect as actually seeing one. These results demonstrate that mental imagery can recalibrate the senses and induce the same cross-modal sensory plasticity as real sensory stimuli.

  6. Expectations Do Not Alter Early Sensory Processing during Perceptual Decision-Making.

    PubMed

    Rungratsameetaweemana, Nuttida; Itthipuripat, Sirawaj; Salazar, Annalisa; Serences, John T

    2018-06-13

    Two factors play important roles in shaping perception: the allocation of selective attention to behaviorally relevant sensory features, and prior expectations about regularities in the environment. Signal detection theory proposes distinct roles of attention and expectation on decision-making such that attention modulates early sensory processing, whereas expectation influences the selection and execution of motor responses. Challenging this classic framework, recent studies suggest that expectations about sensory regularities enhance the encoding and accumulation of sensory evidence during decision-making. However, it is possible that these findings reflect well-documented attentional modulations in visual cortex. Here, we tested this framework in a group of male and female human participants by examining how expectations about stimulus features (orientation and color) and expectations about motor responses impacted electroencephalography (EEG) markers of early sensory processing and the accumulation of sensory evidence during decision-making (the early visual negative potential and the centro-parietal positive potential, respectively). We first demonstrate that these markers are sensitive to changes in the amount of sensory evidence in the display. Then we show, counter to recent findings, that neither marker is modulated by either feature or motor expectations, despite a robust effect of expectations on behavior. Instead, violating expectations about likely sensory features and motor responses impacts posterior alpha and frontal theta oscillations, signals thought to index overall processing time and cognitive conflict. These findings are inconsistent with recent theoretical accounts and suggest instead that expectations primarily influence decisions by modulating post-perceptual stages of information processing. SIGNIFICANCE STATEMENT Expectations about likely features or motor responses play an important role in shaping behavior. Classic theoretical frameworks posit that expectations modulate decision-making by biasing late stages of decision-making including the selection and execution of motor responses. In contrast, recent accounts suggest that expectations also modulate decisions by improving the quality of early sensory processing. However, these effects could instead reflect the influence of selective attention. Here we examine the effect of expectations about sensory features and motor responses on a set of electroencephalography (EEG) markers that index early sensory processing and later post-perceptual processing. Counter to recent empirical results, expectations have little effect on early sensory processing but instead modulate EEG markers of time-on-task and cognitive conflict. Copyright © 2018 the authors.

  7. Common Sense in Choice: The Effect of Sensory Modality on Neural Value Representations.

    PubMed

    Shuster, Anastasia; Levy, Dino J

    2018-01-01

    Although it is well established that the ventromedial prefrontal cortex (vmPFC) represents value using a common currency across categories of rewards, it is unknown whether the vmPFC represents value irrespective of the sensory modality in which alternatives are presented. In the current study, male and female human subjects completed a decision-making task while their neural activity was recorded using functional magnetic resonance imaging. On each trial, subjects chose between a safe alternative and a lottery, which was presented visually or aurally. A univariate conjunction analysis revealed that the anterior portion of the vmPFC tracks subjective value (SV) irrespective of the sensory modality. Using a novel cross-modality multivariate classifier, we were able to decode auditory value based on visual trials and vice versa. In addition, we found that the visual and auditory sensory cortices, which were identified using functional localizers, are also sensitive to the value of stimuli, albeit in a modality-specific manner. Whereas both primary and higher-order auditory cortices represented auditory SV (aSV), only a higher-order visual area represented visual SV (vSV). These findings expand our understanding of the common currency network of the brain and shed a new light on the interplay between sensory and value information processing.

  8. Common Sense in Choice: The Effect of Sensory Modality on Neural Value Representations

    PubMed Central

    2018-01-01

    Abstract Although it is well established that the ventromedial prefrontal cortex (vmPFC) represents value using a common currency across categories of rewards, it is unknown whether the vmPFC represents value irrespective of the sensory modality in which alternatives are presented. In the current study, male and female human subjects completed a decision-making task while their neural activity was recorded using functional magnetic resonance imaging. On each trial, subjects chose between a safe alternative and a lottery, which was presented visually or aurally. A univariate conjunction analysis revealed that the anterior portion of the vmPFC tracks subjective value (SV) irrespective of the sensory modality. Using a novel cross-modality multivariate classifier, we were able to decode auditory value based on visual trials and vice versa. In addition, we found that the visual and auditory sensory cortices, which were identified using functional localizers, are also sensitive to the value of stimuli, albeit in a modality-specific manner. Whereas both primary and higher-order auditory cortices represented auditory SV (aSV), only a higher-order visual area represented visual SV (vSV). These findings expand our understanding of the common currency network of the brain and shed a new light on the interplay between sensory and value information processing. PMID:29619408

  9. High perceptual load leads to both reduced gain and broader orientation tuning

    PubMed Central

    Stolte, Moritz; Bahrami, Bahador; Lavie, Nilli

    2014-01-01

    Due to its limited capacity, visual perception depends on the allocation of attention. The resultant phenomena of inattentional blindness, accompanied by reduced sensory visual cortex response to unattended stimuli in conditions of high perceptual load in the attended task, are now well established (Lavie, 2005; Lavie, 2010, for reviews). However, the underlying mechanisms for these effects remain to be elucidated. Specifically, is reduced perceptual processing under high perceptual load a result of reduced sensory signal gain, broader tuning, or both? We examined this question with psychophysical measures of orientation tuning under different levels of perceptual load in the task performed. Our results show that increased perceptual load leads to both reduced sensory signal and broadening of tuning. These results clarify the effects of attention on elementary visual perception and suggest that high perceptual load is critical for attentional effects on sensory tuning. PMID:24610952
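    The two mechanisms dissociated here, reduced gain versus broader tuning, can be sketched with a toy Gaussian orientation tuning curve (illustrative parameter values only, not the authors' psychophysical model):

```python
import math

def tuning_response(theta_deg, pref_deg=0.0, gain=1.0, width_deg=15.0):
    # Gaussian orientation tuning: response of a channel preferring pref_deg
    # to a stimulus at theta_deg.
    return gain * math.exp(-((theta_deg - pref_deg) ** 2) / (2 * width_deg ** 2))

# Hypothetical channels: high perceptual load reduces gain AND broadens tuning.
def low_load(theta_deg):
    return tuning_response(theta_deg, gain=1.0, width_deg=15.0)

def high_load(theta_deg):
    return tuning_response(theta_deg, gain=0.6, width_deg=25.0)

# At the preferred orientation the high-load response is weaker (reduced gain) ...
assert high_load(0) < low_load(0)
# ... yet, normalized to its own peak, it falls off more slowly (broader tuning).
assert high_load(30) / high_load(0) > low_load(30) / low_load(0)
```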

  10. Does a Sensory Processing Deficit Explain Counting Accuracy on Rapid Visual Sequencing Tasks in Adults with and without Dyslexia?

    ERIC Educational Resources Information Center

    Conlon, Elizabeth G.; Wright, Craig M.; Norris, Karla; Chekaluk, Eugene

    2011-01-01

    The experiments conducted aimed to investigate whether reduced accuracy when counting stimuli presented in rapid temporal sequence in adults with dyslexia could be explained by a sensory processing deficit, a general slowing in processing speed or difficulties shifting attention between stimuli. To achieve these aims, the influence of the…

  11. The synaptic pharmacology underlying sensory processing in the superior colliculus.

    PubMed

    Binns, K E

    1999-10-01

    The superior colliculus (SC) is one of the most ancient regions of the vertebrate central sensory system. In this hub, afferents from several sensory pathways converge, and an extensive range of neural circuits enable primary sensory processing, multi-sensory integration and the generation of motor commands for orientation behaviours. The SC has a laminar structure and is usually considered in two parts: the superficial visual layers and the deep multi-modal/motor layers. Neurones in the superficial layers integrate visual information from the retina, cortex and other sources, while the deep layers draw together data from many cortical and sub-cortical sensory areas, including the superficial layers, to generate motor commands. Functional studies in anaesthetized subjects and in slice preparations have used pharmacological tools to probe some of the SC's interacting circuits. The studies reviewed here reveal important roles for ionotropic glutamate receptors in the mediation of sensory inputs to the SC and in transmission between the superficial and deep layers. N-methyl-D-aspartate receptors appear to have special responsibility for the temporal matching of retinal and cortical activity in the superficial layers and for the integration of multiple sensory data-streams in the deep layers. Sensory responses are shaped by intrinsic inhibitory mechanisms mediated by GABA(A) and GABA(B) receptors and influenced by nicotinic acetylcholine receptors. These sensory and motor-command activities of SC neurones are modulated by levels of arousal through extrinsic connections containing GABA, serotonin and other transmitters.
It is possible to naturally stimulate many of the SC's sensory and non-sensory inputs either independently or simultaneously and this brain area is an ideal location in which to study: (a) interactions between inputs from the same sensory system; (b) the integration of inputs from several sensory systems; and (c) the influence of non-sensory systems on sensory processing.

  12. Does manipulating the speed of visual flow in virtual reality change distance estimation while walking in Parkinson's disease?

    PubMed

    Ehgoetz Martens, Kaylena A; Ellard, Colin G; Almeida, Quincy J

    2015-03-01

    Dopaminergic replacement therapy is believed to improve sensory processing in Parkinson's disease (PD), while delayed perceptual speed is thought to be caused by a predominantly cholinergic deficit; however, it is unclear whether sensory-perceptual deficits are a result of corrupt sensory processing or of a delay in updating perceived feedback during movement. The current study examined these two hypotheses by manipulating visual flow speed and dopaminergic medication to determine which influenced distance estimation in PD. Fourteen PD and sixteen healthy control (HC) participants were instructed to estimate the distance of a remembered target by walking to the position the target formerly occupied. This task was completed in virtual reality in order to manipulate the visual flow (VF) speed in real time. Three conditions were carried out: (1) BASELINE: VF speed was equal to participants' real-time movement speed; (2) SLOW: VF speed was reduced by 50%; (3) FAST: VF speed was increased by 30%. Individuals with PD performed the experiment in their ON and OFF medication states. PD participants demonstrated significantly greater judgement error during the BASELINE and FAST conditions compared to HC, and did not improve their judgement error during the SLOW condition. Additionally, PD had greater variable error during baseline compared to HC; however, during the SLOW condition, PD had significantly less variable error compared to baseline and similar variable error to HC participants. Overall, dopaminergic medication did not significantly influence judgement error. Therefore, these results suggest that corrupt processing of sensory information, rather than delayed updating of sensory feedback, is the main contributor to sensory-perceptual deficits during movement in PD.
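    The VF manipulation amounts to scaling visual flow by a gain on actual walking speed (0.5 in SLOW, 1.3 in FAST). Under an idealized flow-integration account, a walker who stops when integrated flow matches the remembered distance should overshoot with slowed flow and undershoot with sped-up flow. The sketch below (hypothetical walking speed and time step) shows that prediction; the study's point is that the PD data did not simply follow it:

```python
def walked_distance(target_m, flow_gain, speed_mps=1.2, dt=0.01):
    # Integrate perceived visual flow (actual speed * flow_gain) until it
    # matches the remembered target distance; return the distance actually walked.
    walked = seen = 0.0
    while seen < target_m:
        walked += speed_mps * dt
        seen += speed_mps * flow_gain * dt
    return walked

target = 5.0
baseline = walked_distance(target, flow_gain=1.0)  # ~5.0 m
slow = walked_distance(target, flow_gain=0.5)      # flow halved -> overshoot, ~10 m
fast = walked_distance(target, flow_gain=1.3)      # flow sped up -> undershoot, ~3.9 m
```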

  13. Visual and acoustic communication in non-human animals: a comparison.

    PubMed

    Rosenthal, G G; Ryan, M J

    2000-09-01

    The visual and auditory systems are two major sensory modalities employed in communication. Although communication in these two sensory modalities can serve analogous functions and evolve in response to similar selection forces, the two systems also operate under different constraints imposed by the environment and by the degree to which these sensory modalities are recruited for non-communication functions. The research traditions in each also tend to differ: studies of the mechanisms of acoustic communication tend to take a more reductionist tack, often concentrating on single signal parameters, whereas studies of visual communication tend to be more concerned with multivariate signal arrays in natural environments and with higher-level processing of such signals. Each research tradition would benefit from being more expansive in its approach.

  14. Visual Predictions in the Orbitofrontal Cortex Rely on Associative Content

    PubMed Central

    Chaumon, Maximilien; Kveraga, Kestutis; Barrett, Lisa Feldman; Bar, Moshe

    2014-01-01

    Predicting upcoming events from incomplete information is an essential brain function. The orbitofrontal cortex (OFC) plays a critical role in this process by facilitating recognition of sensory inputs via predictive feedback to sensory cortices. In the visual domain, the OFC is engaged by low spatial frequency (LSF) and magnocellular-biased inputs, but beyond this, we know little about the information content required to activate it. Is the OFC automatically engaged to analyze any LSF information for meaning? Or is it engaged only when LSF information matches preexisting memory associations? We tested these hypotheses and show that only LSF information that could be linked to memory associations engages the OFC. Specifically, LSF stimuli activated the OFC in 2 distinct medial and lateral regions only if they resembled known visual objects. More identifiable objects increased activity in the medial OFC, known for its function in affective responses. Furthermore, these objects also increased the connectivity of the lateral OFC with the ventral visual cortex, a crucial region for object identification. At the interface between sensory, memory, and affective processing, the OFC thus appears to be attuned to the associative content of visual information and to play a central role in visuo-affective prediction. PMID:23771980

  15. Inattentional Deafness: Visual Load Leads to Time-Specific Suppression of Auditory Evoked Responses

    PubMed Central

    Molloy, Katharine; Griffiths, Timothy D.; Lavie, Nilli

    2015-01-01

    Due to capacity limits on perception, conditions of high perceptual load lead to reduced processing of unattended stimuli (Lavie et al., 2014). Accumulating work demonstrates the effects of visual perceptual load on visual cortex responses, but the effects on auditory processing remain poorly understood. Here we establish the neural mechanisms underlying “inattentional deafness”—the failure to perceive auditory stimuli under high visual perceptual load. Participants performed a visual search task of low (target dissimilar to nontarget items) or high (target similar to nontarget items) load. On a random subset (50%) of trials, irrelevant tones were presented concurrently with the visual stimuli. Brain activity was recorded with magnetoencephalography, and time-locked responses to the visual search array and to the incidental presence of unattended tones were assessed. High, compared to low, perceptual load led to increased early visual evoked responses (within 100 ms from onset). This was accompanied by reduced early (∼100 ms from tone onset) auditory evoked activity in superior temporal sulcus and posterior middle temporal gyrus. A later suppression of the P3 “awareness” response to the tones was also observed under high load. A behavioral experiment revealed reduced tone detection sensitivity under high visual load, indicating that the reduction in neural responses was indeed associated with reduced awareness of the sounds. These findings support a neural account of shared audiovisual resources, which, when depleted under load, leads to failures of sensory perception and awareness. SIGNIFICANCE STATEMENT The present work clarifies the neural underpinning of inattentional deafness under high visual load. The findings of near-simultaneous load effects on both visual and auditory evoked responses suggest shared audiovisual processing capacity. 
Temporary depletion of shared capacity in perceptually demanding visual tasks leads to a momentary reduction in sensory processing of auditory stimuli, resulting in inattentional deafness. The dynamic “push–pull” pattern of load effects on visual and auditory processing furthers our understanding of both the neural mechanisms of attention and of cross-modal effects across visual and auditory processing. These results also offer an explanation for many previous failures to find cross-modal effects in experiments where the visual load effects may not have coincided directly with auditory sensory processing. PMID:26658858

  16. A number-form area in the blind

    PubMed Central

    Abboud, Sami; Maidenbaum, Shachar; Dehaene, Stanislas; Amedi, Amir

    2015-01-01

    Distinct preference for visual number symbols was recently discovered in the human right inferior temporal gyrus (rITG). It remains unclear how this preference emerges, what contribution shape biases make to its formation, and whether visual processing underlies it. Here we use congenital blindness as a model for brain development without visual experience. During fMRI, we present blind subjects with shapes encoded using a novel visual-to-music sensory-substitution device (The EyeMusic). Greater activation is observed in the rITG when subjects process symbols as numbers compared with control tasks on the same symbols. Using resting-state fMRI in the blind and sighted, we further show that the areas with preference for numerals and letters exhibit distinct patterns of functional connectivity with quantity- and language-processing areas, respectively. Our findings suggest that specificity in the ventral ‘visual’ stream can emerge independently of sensory modality and visual experience, under the influence of distinct connectivity patterns. PMID:25613599

  17. Vasopressin Proves Es-sense-tial: Vasopressin and the Modulation of Sensory Processing in Mammals

    PubMed Central

    Bester-Meredith, Janet K.; Fancher, Alexandria P.; Mammarella, Grace E.

    2015-01-01

    As mammals develop, they encounter increasing social complexity in the surrounding world. In order to survive, mammals must show appropriate behaviors toward their mates, offspring, and same-sex conspecifics. Although the behavioral effects of the neuropeptide arginine vasopressin (AVP) have been studied in a variety of social contexts, the effects of this neuropeptide on multimodal sensory processing have received less attention. AVP is widely distributed through sensory regions of the brain and has been demonstrated to modulate olfactory, auditory, gustatory, and visual processing. Here, we review the evidence linking AVP to the processing of social stimuli in sensory regions of the brain and explore how sensory processing can shape behavioral responses to these stimuli. In addition, we address the interplay between hormonal and neural AVP in regulating sensory processing of social cues. Because AVP pathways show plasticity during development, early life experiences may shape life-long processing of sensory information. Furthermore, disorders of social behavior such as autism and schizophrenia that have been linked with AVP also have been linked with dysfunctions in sensory processing. Together, these studies suggest that AVP’s diversity of effects on social behavior across a variety of mammalian species may result from the effects of this neuropeptide on sensory processing. PMID:25705203

  18. Parallel pathways for cross-modal memory retrieval in Drosophila.

    PubMed

    Zhang, Xiaonan; Ren, Qingzhong; Guo, Aike

    2013-05-15

    Memory-retrieval processing of cross-modal sensory preconditioning is vital for understanding the plasticity underlying the interactions between modalities. As part of the sensory preconditioning paradigm, it has been hypothesized that the conditioned response to an unreinforced cue depends on the memory of the reinforced cue via a sensory link between the two cues. To test this hypothesis, we studied cross-modal memory-retrieval processing in a genetically tractable model organism, Drosophila melanogaster. By expressing the dominant temperature-sensitive shibire(ts1) (shi(ts1)) transgene, which blocks synaptic vesicle recycling of specific neural subsets with the Gal4/UAS system at the restrictive temperature, we specifically blocked visual and olfactory memory retrieval, either alone or in combination; memory acquisition remained intact for these modalities. Blocking the memory retrieval of the reinforced olfactory cues did not impair the conditioned response to the unreinforced visual cues or vice versa, in contrast to the canonical memory-retrieval processing of sensory preconditioning. In addition, these conditioned responses can be abolished by blocking the memory retrieval of the two modalities simultaneously. In sum, our results indicated that a conditioned response to an unreinforced cue in cross-modal sensory preconditioning can be recalled through parallel pathways.

  19. Multisensory integration across the senses in young and old adults

    PubMed Central

    Mahoney, Jeannette R.; Li, Po Ching Clara; Oh-Park, Mooyeon; Verghese, Joe; Holtzer, Roee

    2011-01-01

    Stimuli are processed concurrently and across multiple sensory inputs. Here we directly compared the effect of multisensory integration (MSI) on reaction time across three paired sensory inputs in eighteen young (M=19.17 yrs) and eighteen old (M=76.44 yrs) individuals. Participants were determined to be non-demented and without any medical or psychiatric conditions that would affect their performance. Participants responded to randomly presented unisensory (auditory, visual, somatosensory) stimuli and three paired sensory inputs consisting of auditory-somatosensory (AS), auditory-visual (AV), and visual-somatosensory (VS) stimuli. Results revealed that reaction time (RT) to all multisensory pairings was significantly faster than to the constituent unisensory conditions across age groups; these findings could not be accounted for by simple probability summation. Both young and old participants responded fastest to multisensory pairings containing somatosensory input. Compared to younger adults, older adults demonstrated a significantly greater RT benefit when processing concurrent VS information. In terms of co-activation, older adults demonstrated a significant increase in the magnitude of visual-somatosensory co-activation (i.e., multisensory integration), whereas younger adults demonstrated a significant increase in the magnitude of auditory-visual and auditory-somatosensory co-activation. This study provides the first evidence in support of the facilitative effect of pairing somatosensory with visual stimuli in older adults. PMID:22024545
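    A standard way to rule out simple probability summation is Miller's (1982) race-model inequality: under a parallel race between modalities, the multisensory RT distribution can never exceed the sum of the unisensory distributions. A minimal sketch on made-up RTs (not the study's data):

```python
def ecdf(sample, t):
    # Empirical cumulative distribution function of RTs evaluated at time t.
    return sum(rt <= t for rt in sample) / len(sample)

def race_model_violation(rt_multi, rt_uni_a, rt_uni_b, times):
    # Miller (1982) race-model inequality: probability summation alone requires
    #   F_multi(t) <= F_a(t) + F_b(t)  for every t.
    # Positive values indicate co-activation beyond a parallel race.
    return [ecdf(rt_multi, t) - min(1.0, ecdf(rt_uni_a, t) + ecdf(rt_uni_b, t))
            for t in times]

# Made-up RTs (ms): the multisensory condition is faster than either unisensory one.
rt_vis = [320, 350, 365, 390, 410, 440]
rt_som = [300, 330, 360, 380, 405, 430]
rt_multi = [250, 265, 280, 300, 315, 330]

violations = race_model_violation(rt_multi, rt_vis, rt_som, range(240, 330, 10))
assert max(violations) > 0  # violation at fast RTs -> evidence of co-activation
```

    The violation is diagnostic only at the fast end of the RT distribution, which is why such tests are evaluated over a range of time points rather than on mean RT.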

  20. Predictions penetrate perception: Converging insights from brain, behaviour and disorder

    PubMed Central

    O’Callaghan, Claire; Kveraga, Kestutis; Shine, James M; Adams, Reginald B.; Bar, Moshe

    2018-01-01

    It is argued that during ongoing visual perception, the brain is generating top-down predictions to facilitate, guide and constrain the processing of incoming sensory input. Here we demonstrate that these predictions are drawn from a diverse range of cognitive processes, in order to generate the richest and most informative prediction signals. This is consistent with a central role for cognitive penetrability in visual perception. We review behavioural and mechanistic evidence that indicate a wide spectrum of domains—including object recognition, contextual associations, cognitive biases and affective state—that can directly influence visual perception. We combine these insights from the healthy brain with novel observations from neuropsychiatric disorders involving visual hallucinations, which highlight the consequences of imbalance between top-down signals and incoming sensory information. Together, these lines of evidence converge to indicate that predictive penetration, be it cognitive, social or emotional, should be considered a fundamental framework that supports visual perception. PMID:27222169

  1. Iconic Memory and Reading Performance in Nine-Year-Old Children

    ERIC Educational Resources Information Center

    Riding, R. J.; Pugh, J. C.

    1977-01-01

    The reading process incorporates three factors: images registered in visual sensory memory, semantic analysis in short-term memory, and long-term memory storage. The focus here is on the contribution of sensory memory to reading performance. (Author/RK)

  2. ON THE PERCEPTION OF PROBABLE THINGS

    PubMed Central

    Albright, Thomas D.

    2012-01-01

    Perception is influenced both by the immediate pattern of sensory inputs and by memories acquired through prior experiences with the world. Throughout much of its illustrious history, however, study of the cellular basis of perception has focused on neuronal structures and events that underlie the detection and discrimination of sensory stimuli. Relatively little attention has been paid to the means by which memories interact with incoming sensory signals. Building upon recent neurophysiological/behavioral studies of the cortical substrates of visual associative memory, I propose a specific functional process by which stored information about the world supplements sensory inputs to yield neuronal signals that can account for visual perceptual experience. This perspective represents a significant shift in the way we think about the cellular bases of perception. PMID:22542178

  3. A dual-trace model for visual sensory memory.

    PubMed

    Cappiello, Marcus; Zhang, Weiwei

    2016-11-01

    Visual sensory memory refers to a transient memory lingering briefly after the stimulus offset. Although previous literature suggests that visual sensory memory is supported by a fine-grained trace for continuous representation and a coarse-grained trace of categorical information, simultaneous separation and assessment of these traces can be difficult without a quantitative model. The present study used a continuous estimation procedure to test a novel mathematical model of the dual-trace hypothesis of visual sensory memory, according to which visual sensory memory could be modeled as a mixture of 2 von Mises (2VM) distributions differing in standard deviation. When visual sensory memory and working memory (WM) for colors were distinguished using different experimental manipulations in the first 3 experiments, the 2VM model outperformed Zhang and Luck's (2008) standard mixture model (SM), which represents a mixture of a single memory trace and random guesses, even though SM outperformed 2VM for WM. Experiment 4 generalized the advantage of 2VM over SM in fitting visual sensory memory data from color to orientation. Furthermore, a single-trace model and 4 other alternative models were ruled out, suggesting the necessity and sufficiency of dual traces for visual sensory memory. Together these results support the dual-trace model of visual sensory memory and provide a preliminary inquiry into the nature of information loss from visual sensory memory to WM. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
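    The 2VM model treats recall errors as a mixture of two von Mises distributions centered on the target: one narrow (the fine-grained trace) and one broad (the coarse categorical trace). A stdlib-only sketch of the density and the mixture log-likelihood that a fit would maximize (illustrative errors and parameter values, not the study's data):

```python
import math

def bessel_i0(x, terms=30):
    # Modified Bessel function of the first kind, order 0 (truncated series;
    # adequate for the concentrations used here).
    return sum((x / 2) ** (2 * k) / math.factorial(k) ** 2 for k in range(terms))

def von_mises_pdf(err, kappa):
    # Density of a recall error (radians) for a trace centered on the target
    # (mu = 0); kappa is the concentration (higher = narrower, i.e., smaller SD).
    return math.exp(kappa * math.cos(err)) / (2 * math.pi * bessel_i0(kappa))

def two_vm_log_likelihood(errors, w, kappa_fine, kappa_coarse):
    # Dual-trace (2VM) mixture: with probability w a response is drawn from the
    # precise fine-grained trace, otherwise from the broad coarse-grained trace.
    return sum(math.log(w * von_mises_pdf(e, kappa_fine)
                        + (1 - w) * von_mises_pdf(e, kappa_coarse))
               for e in errors)

# Illustrative recall errors (radians): mostly precise, with a few coarse responses.
errors = [0.02, -0.05, 0.10, 0.80, -0.04, 0.01, -0.90, 0.03]

# A real fit would maximize this over (w, kappa_fine, kappa_coarse); here the
# mixture beats a single broad (coarse-only, w = 0) trace on these errors.
assert (two_vm_log_likelihood(errors, 0.75, 20.0, 2.0)
        > two_vm_log_likelihood(errors, 0.0, 20.0, 2.0))
```

    Model comparison in the paper pits this two-component density against a single-trace-plus-guessing mixture; the same log-likelihood machinery applies, with the guess component replaced by a uniform density over the circle.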

  4. Identifying selective visual attention biases related to fear of pain by tracking eye movements within a dot-probe paradigm.

    PubMed

    Yang, Zhou; Jackson, Todd; Gao, Xiao; Chen, Hong

    2012-08-01

    This research examined selective biases in visual attention related to fear of pain by tracking eye movements (EM) toward pain-related stimuli among the pain-fearful. EM of 21 young adults scoring high on a fear of pain measure (H-FOP) and 20 lower-scoring (L-FOP) control participants were measured during a dot-probe task that featured sensory pain-neutral, health catastrophe-neutral and neutral-neutral word pairs. Analyses indicated that the H-FOP group was more likely to direct immediate visual attention toward sensory pain and health catastrophe words than was the L-FOP group. The H-FOP group also had comparatively shorter first fixation latencies toward sensory pain and health catastrophe words. Conversely, groups did not differ on EM indices of attentional maintenance (i.e., first fixation duration, gaze duration, and average fixation duration) or reaction times to dot probes. Finally, both groups showed a cycle of disengagement followed by re-engagement toward sensory pain words relative to other word types. In sum, this research is the first to reveal biases toward pain stimuli during very early stages of visual information processing among the highly pain-fearful and highlights the utility of EM tracking as a means to evaluate visual attention as a dynamic process in the context of FOP. Copyright © 2012 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  5. Supramodal processing optimizes visual perceptual learning and plasticity.

    PubMed

    Zilber, Nicolas; Ciuciu, Philippe; Gramfort, Alexandre; Azizi, Leila; van Wassenhove, Virginie

    2014-06-01

    Multisensory interactions are ubiquitous in cortex and it has been suggested that sensory cortices may be supramodal i.e. capable of functional selectivity irrespective of the sensory modality of inputs (Pascual-Leone and Hamilton, 2001; Renier et al., 2013; Ricciardi and Pietrini, 2011; Voss and Zatorre, 2012). Here, we asked whether learning to discriminate visual coherence could benefit from supramodal processing. To this end, three groups of participants were briefly trained to discriminate which of a red or green intermixed population of random-dot-kinematograms (RDKs) was most coherent in a visual display while being recorded with magnetoencephalography (MEG). During training, participants heard no sound (V), congruent acoustic textures (AV) or auditory noise (AVn); importantly, congruent acoustic textures shared the temporal statistics - i.e. coherence - of visual RDKs. After training, the AV group significantly outperformed participants trained in V and AVn although they were not aware of their progress. In pre- and post-training blocks, all participants were tested without sound and with the same set of RDKs. When contrasting MEG data collected in these experimental blocks, selective differences were observed in the dynamic pattern and the cortical loci responsive to visual RDKs. First and common to all three groups, vlPFC showed selectivity to the learned coherence levels whereas selectivity in visual motion area hMT+ was only seen for the AV group. Second and solely for the AV group, activity in multisensory cortices (mSTS, pSTS) correlated with post-training performances; additionally, the latencies of these effects suggested feedback from vlPFC to hMT+ possibly mediated by temporal cortices in AV and AVn groups. 
Altogether, we interpret our results in the context of the Reverse Hierarchy Theory of learning (Ahissar and Hochstein, 2004) in which supramodal processing optimizes visual perceptual learning by capitalizing on sensory-invariant representations - here, global coherence levels across sensory modalities. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Processing of Visual Imagery by an Adaptive Model of the Visual System: Its Performance and its Significance. Final Report, June 1969-March 1970.

    ERIC Educational Resources Information Center

    Tallman, Oliver H.

    A digital simulation of a model for the processing of visual images is derived from known aspects of the human visual system. The fundamental principle of computation suggested by a biological model is a transformation that distributes information contained in an input stimulus everywhere in a transform domain. Each sensory input contributes under…

  7. Conservation implications of anthropogenic impacts on visual communication and camouflage.

    PubMed

    Delhey, Kaspar; Peters, Anne

    2017-02-01

    Anthropogenic environmental impacts can disrupt the sensory environment of animals and affect important processes from mate choice to predator avoidance. Currently, these effects are best understood for auditory and chemosensory modalities, and recent reviews highlight their importance for conservation. We examined how anthropogenic changes to the visual environment (ambient light, transmission, and backgrounds) affect visual communication and camouflage and considered the implications of these effects for conservation. Human changes to the visual environment can increase predation risk by affecting camouflage effectiveness, lead to maladaptive patterns of mate choice, and disrupt mutualistic interactions between pollinators and plants. Implications for conservation are particularly evident for disrupted camouflage due to its tight links with survival. The conservation importance of impaired visual communication is less documented. The effects of anthropogenic changes on visual communication and camouflage may be severe when they affect critical processes such as pollination or species recognition. However, when impaired mate choice does not lead to hybridization, the conservation consequences are less clear. We suggest that the demographic effects of human impacts on visual communication and camouflage will be particularly strong when human-induced modifications to the visual environment are evolutionarily novel (i.e., very different from natural variation); affected species and populations have low levels of intraspecific (genotypic and phenotypic) variation and behavioral, sensory, or physiological plasticity; and the processes affected are directly related to survival (camouflage), species recognition, or number of offspring produced, rather than offspring quality or attractiveness. 
    Our findings suggest that anthropogenic effects on the visual environment may be of similar conservation importance to anthropogenic effects on other sensory modalities. © 2016 Society for Conservation Biology.

  8. Neural correlates of tactile perception during pre-, peri-, and post-movement.

    PubMed

    Juravle, Georgiana; Heed, Tobias; Spence, Charles; Röder, Brigitte

    2016-05-01

    Tactile information is differentially processed over the various phases of goal-directed movements. Here, event-related potentials (ERPs) were used to investigate the neural correlates of tactile and visual information processing during movement. Participants performed goal-directed reaches for an object placed centrally on the table in front of them. Tactile and visual stimulation (100 ms) was presented in separate trials during the different phases of the movement (i.e. preparation, execution, and post-movement). These stimuli were independently delivered to either the moving or resting hand. In a control condition, the participants only performed the movement, while omission (i.e. movement-only) ERPs were recorded. Participants were instructed to ignore the presence or absence of any sensory events and to concentrate solely on the execution of the movement. Enhanced ERPs were observed 80-200 ms after tactile stimulation, as well as 100-250 ms after visual stimulation: These modulations were greatest during the execution of the goal-directed movement, and they were effector based (i.e. significantly more negative for stimuli presented to the moving hand). Furthermore, ERPs revealed enhanced sensory processing during goal-directed movements for visual stimuli as well. Such enhanced processing of both tactile and visual information during the execution phase suggests that incoming sensory information is continuously monitored for a potential adjustment of the current motor plan. Furthermore, the results reported here also highlight a tight coupling between spatial attention and the execution of motor actions.

  9. More Than Meets the Eye: The Eye and Migraine-What You Need to Know.

    PubMed

    Digre, Kathleen B

    2018-05-02

    Migraine has long been associated with disturbances of vision, especially migraine with aura. However, the eye plays an important role in sensory processing as well. We have found that the visual quality of life is reduced in migraine. In this review, we discuss how the migraine and eye pain pathways are similar and affect many of the common complaints which are seen in ophthalmology and neuro-ophthalmology offices, such as dry eye and postoperative eye pain. We also review other related phenomena, including visual snow and photophobia, which also are related to altered sensory processing in migraine.

  10. Attention modulates specific motor cortical circuits recruited by transcranial magnetic stimulation.

    PubMed

    Mirdamadi, J L; Suzuki, L Y; Meehan, S K

    2017-09-17

    Skilled performance and acquisition are dependent upon afferent input to motor cortex. The present study used short-latency afferent inhibition (SAI) to probe how manipulation of sensory afference by attention affects different circuits projecting to pyramidal tract neurons in motor cortex. SAI was assessed in the first dorsal interosseous muscle while participants performed a low or high attention-demanding visual detection task. SAI was evoked by preceding a suprathreshold transcranial magnetic stimulus with electrical stimulation of the median nerve at the wrist. To isolate different afferent intracortical circuits in motor cortex, SAI was evoked using either posterior-anterior (PA) or anterior-posterior (AP) monophasic current. In an independent sample, somatosensory processing during the same attention-demanding visual detection tasks was assessed using somatosensory-evoked potentials (SEPs) elicited by median nerve stimulation. SAI elicited by AP TMS was reduced under high compared to low visual attention demands. SAI elicited by PA TMS was not affected by visual attention demands. SEPs revealed that the high visual attention load reduced the fronto-central P20-N30 but not the contralateral parietal N20-P25 SEP component. The P20-N30 reduction confirmed that the visual attention task altered sensory afference. The current results offer further support that PA and AP TMS recruit different neuronal circuits. AP circuits may be one substrate by which cognitive strategies shape sensorimotor processing during skilled movement by altering sensory processing in premotor areas. Copyright © 2017 IBRO. Published by Elsevier Ltd. All rights reserved.
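    SAI of the kind probed here is conventionally expressed as the conditioned MEP amplitude as a percentage of the unconditioned test MEP, with values below 100% indicating inhibition. A minimal sketch of that quantification follows; the function name and all amplitudes are hypothetical illustrations, not values from the study.

```python
import numpy as np

def sai_percent(conditioned_mep, test_mep):
    """Short-latency afferent inhibition: mean conditioned MEP amplitude
    as a percentage of the mean unconditioned test MEP amplitude.
    Values below 100 indicate inhibition."""
    return 100.0 * np.mean(conditioned_mep) / np.mean(test_mep)

# Hypothetical peak-to-peak MEP amplitudes (mV)
test_alone = np.array([1.0, 1.2, 0.9, 1.1])       # TMS alone
conditioned = np.array([0.5, 0.6, 0.4, 0.5])      # median-nerve stim + TMS

sai = sai_percent(conditioned, test_alone)
print(round(sai, 1))  # ~47.6, i.e. the conditioned response is strongly inhibited
```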

  11. Taking Attention Away from the Auditory Modality: Context-dependent Effects on Early Sensory Encoding of Speech.

    PubMed

    Xie, Zilong; Reetzke, Rachel; Chandrasekaran, Bharath

    2018-05-24

    Increasing visual perceptual load can reduce pre-attentive auditory cortical activity to sounds, a reflection of the limited and shared attentional resources for sensory processing across modalities. Here, we demonstrate that modulating visual perceptual load can impact the early sensory encoding of speech sounds, and that the impact of visual load is highly dependent on the predictability of the incoming speech stream. Participants (n = 20, 9 females) performed a visual search task of high (target similar to distractors) and low (target dissimilar to distractors) perceptual load, while early auditory electrophysiological responses were recorded to native speech sounds. Speech sounds were presented either in a 'repetitive context', or a less predictable 'variable context'. Independent of auditory stimulus context, pre-attentive auditory cortical activity was reduced during high visual load, relative to low visual load. We applied a data-driven machine learning approach to decode speech sounds from the early auditory electrophysiological responses. Decoding performance was found to be poorer under conditions of high (relative to low) visual load, when the incoming acoustic stream was predictable. When the auditory stimulus context was less predictable, decoding performance was substantially greater for the high (relative to low) visual load conditions. Our results provide support for shared attentional resources between visual and auditory modalities that substantially influence the early sensory encoding of speech signals in a context-dependent manner. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.
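    The decoding step described above can be illustrated with a minimal leave-one-out nearest-centroid classifier on synthetic single-trial responses. Everything below (feature count, noise level, the simulated data) is a hypothetical stand-in for the paper's data-driven machine-learning pipeline, not a reproduction of it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic single-trial "EEG" responses: 2 speech sounds x 20 trials,
# 16 features (e.g. amplitudes at selected electrodes/time points).
n_per_class, n_feat = 20, 16
class_means = rng.normal(0.0, 1.0, (2, n_feat))
X = np.vstack([rng.normal(class_means[c], 1.0, (n_per_class, n_feat))
               for c in (0, 1)])
y = np.repeat([0, 1], n_per_class)

def loo_nearest_centroid(X, y):
    """Leave-one-out accuracy of a nearest-centroid decoder."""
    correct = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        # Class centroids computed without the held-out trial
        cents = np.array([X[mask & (y == c)].mean(axis=0) for c in (0, 1)])
        pred = np.argmin(np.linalg.norm(cents - X[i], axis=1))
        correct += pred == y[i]
    return correct / len(y)

acc = loo_nearest_centroid(X, y)
print(round(acc, 2))
```

In the study, decoding performance of this general sort was then compared across visual-load and stimulus-context conditions.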

  12. Improvement in visual search with practice: mapping learning-related changes in neurocognitive stages of processing.

    PubMed

    Clark, Kait; Appelbaum, L Gregory; van den Berg, Berry; Mitroff, Stephen R; Woldorff, Marty G

    2015-04-01

    Practice can improve performance on visual search tasks; the neural mechanisms underlying such improvements, however, are not clear. Response time typically shortens with practice, but which components of the stimulus-response processing chain facilitate this behavioral change? Improved search performance could result from enhancements in various cognitive processing stages, including (1) sensory processing, (2) attentional allocation, (3) target discrimination, (4) motor-response preparation, and/or (5) response execution. We measured event-related potentials (ERPs) as human participants completed a five-day visual-search protocol in which they reported the orientation of a color popout target within an array of ellipses. We assessed changes in behavioral performance and in ERP components associated with various stages of processing. After practice, response time decreased in all participants (while accuracy remained consistent), and electrophysiological measures revealed modulation of several ERP components. First, amplitudes of the early sensory-evoked N1 component at 150 ms increased bilaterally, indicating enhanced visual sensory processing of the array. Second, the negative-polarity posterior-contralateral component (N2pc, 170-250 ms) was earlier and larger, demonstrating enhanced attentional orienting. Third, the amplitude of the sustained posterior contralateral negativity component (SPCN, 300-400 ms) decreased, indicating facilitated target discrimination. Finally, faster motor-response preparation and execution were observed after practice, as indicated by latency changes in both the stimulus-locked and response-locked lateralized readiness potentials (LRPs). These electrophysiological results delineate the functional plasticity in key mechanisms underlying visual search with high temporal resolution and illustrate how practice influences various cognitive and neural processing stages leading to enhanced behavioral performance. 
Copyright © 2015 the authors 0270-6474/15/355351-09$15.00/0.

  13. Hydrocortisone accelerates the decay of iconic memory traces: on the modulation of executive and stimulus-driven constituents of sensory information maintenance.

    PubMed

    Miller, Robert; Weckesser, Lisa J; Smolka, Michael N; Kirschbaum, Clemens; Plessow, Franziska

    2015-03-01

    A substantial amount of research documents the impact of glucocorticoids on higher-order cognitive functioning. By contrast, surprisingly little is known about the susceptibility of basic sensory processes to glucocorticoid exposure, given that the glucocorticoid receptor density in the human visual cortex exceeds those observed in prefrontal and most hippocampal brain regions. As executive tasks also rely on these sensory processes, the present study investigates the impact of glucocorticoid exposure on different performance parameters characterizing the maintenance and transfer of sensory information from iconic memory (IM; the sensory buffer of the visual system) to working memory (WM). Using a crossover factorial design, we administered one of three doses of hydrocortisone (0.06, 0.12, or 0.24 mg/kg body weight) and a placebo to 18 healthy young men. Thereafter, participants performed a partial report task, which was used to assess their individual ability to process sensory information. Blood samples were concurrently drawn to determine free and total cortisol concentrations. The compiled pharmacokinetic and psychophysical data demonstrate that free cortisol specifically accelerated the decay of sensory information (r=0.46) without significantly affecting the selective information transfer from IM to WM or the capacity limit of WM. Specifically, nonparametric regression revealed a sigmoid dose-response relationship between free cortisol levels during the testing period and the IM decay rates. Our findings highlight that glucocorticoid exposure may not only impact the recruitment of top-down control for the active maintenance of sensory information, but may also alter its passive (stimulus-driven) maintenance, thereby changing the availability of information prior to subsequent executive processing. Copyright © 2014 Elsevier Ltd. All rights reserved.
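    A sigmoid dose-response relationship of the kind reported can be written as a four-parameter logistic curve. The sketch below shows the functional form only; the parameter values, dose levels, and units are illustrative assumptions, not the study's fitted estimates.

```python
import numpy as np

def sigmoid_dose_response(dose, bottom, top, ec50, hill):
    """Four-parameter logistic: response rises from `bottom` to `top`,
    reaching the half-maximal effect at `ec50`, with steepness `hill`."""
    return bottom + (top - bottom) / (1.0 + (ec50 / dose) ** hill)

# Hypothetical free-cortisol levels (nmol/l) vs. IM decay rate (a.u.)
doses = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
rates = sigmoid_dose_response(doses, bottom=0.1, top=0.9, ec50=20.0, hill=2.0)
print(np.round(rates, 3))  # monotonically increasing, half-maximal at 20
```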

  14. Perceptual load interacts with stimulus processing across sensory modalities.

    PubMed

    Klemen, J; Büchel, C; Rose, M

    2009-06-01

    According to perceptual load theory, processing of task-irrelevant stimuli is limited by the perceptual load of a parallel attended task if both the task and the irrelevant stimuli are presented to the same sensory modality. However, it remains a matter of debate whether the same principles apply to cross-sensory perceptual load and, more generally, what form cross-sensory attentional modulation in early perceptual areas takes in humans. Here we addressed these questions using functional magnetic resonance imaging. Participants undertook an auditory one-back working memory task of low or high perceptual load, while concurrently viewing task-irrelevant images at one of three object visibility levels. The processing of the visual and auditory stimuli was measured in the lateral occipital cortex (LOC) and auditory cortex (AC), respectively. Cross-sensory interference with sensory processing was observed in both the LOC and AC, in accordance with previous results of unisensory perceptual load studies. The present neuroimaging results therefore warrant the extension of perceptual load theory from a unisensory to a cross-sensory context: a validation of this cross-sensory interference effect through behavioural measures would consolidate the findings.

  15. Filling gaps in visual motion for target capture

    PubMed Central

    Bosco, Gianfranco; Delle Monache, Sergio; Gravano, Silvio; Indovina, Iole; La Scaleia, Barbara; Maffei, Vincenzo; Zago, Myrka; Lacquaniti, Francesco

    2015-01-01

    A remarkable challenge our brain must face constantly when interacting with the environment is represented by ambiguous and, at times, even missing sensory information. This is particularly compelling for visual information, being the main sensory system we rely upon to gather cues about the external world. It is not uncommon, for example, that objects catching our attention may disappear temporarily from view, occluded by visual obstacles in the foreground. Nevertheless, we are often able to keep our gaze on them throughout the occlusion or even catch them on the fly in the face of the transient lack of visual motion information. This implies that the brain can fill the gaps of missing sensory information by extrapolating the object motion through the occlusion. In recent years, much experimental evidence has been accumulated that both perceptual and motor processes exploit visual motion extrapolation mechanisms. Moreover, neurophysiological and neuroimaging studies have identified brain regions potentially involved in the predictive representation of the occluded target motion. Within this framework, ocular pursuit and manual interceptive behavior have proven to be useful experimental models for investigating visual extrapolation mechanisms. Studies in these fields have pointed out that visual motion extrapolation processes depend on manifold information related to short-term memory representations of the target motion before the occlusion, as well as to longer term representations derived from previous experience with the environment. We will review recent oculomotor and manual interception literature to provide up-to-date views on the neurophysiological underpinnings of visual motion extrapolation. PMID:25755637

  16. Filling gaps in visual motion for target capture.

    PubMed

    Bosco, Gianfranco; Monache, Sergio Delle; Gravano, Silvio; Indovina, Iole; La Scaleia, Barbara; Maffei, Vincenzo; Zago, Myrka; Lacquaniti, Francesco

    2015-01-01

    A remarkable challenge our brain must face constantly when interacting with the environment is represented by ambiguous and, at times, even missing sensory information. This is particularly compelling for visual information, being the main sensory system we rely upon to gather cues about the external world. It is not uncommon, for example, that objects catching our attention may disappear temporarily from view, occluded by visual obstacles in the foreground. Nevertheless, we are often able to keep our gaze on them throughout the occlusion or even catch them on the fly in the face of the transient lack of visual motion information. This implies that the brain can fill the gaps of missing sensory information by extrapolating the object motion through the occlusion. In recent years, much experimental evidence has been accumulated that both perceptual and motor processes exploit visual motion extrapolation mechanisms. Moreover, neurophysiological and neuroimaging studies have identified brain regions potentially involved in the predictive representation of the occluded target motion. Within this framework, ocular pursuit and manual interceptive behavior have proven to be useful experimental models for investigating visual extrapolation mechanisms. Studies in these fields have pointed out that visual motion extrapolation processes depend on manifold information related to short-term memory representations of the target motion before the occlusion, as well as to longer term representations derived from previous experience with the environment. We will review recent oculomotor and manual interception literature to provide up-to-date views on the neurophysiological underpinnings of visual motion extrapolation.

  17. Subsystems of sensory attention for skilled reaching: vision for transport and pre-shaping and somatosensation for grasping, withdrawal and release.

    PubMed

    Sacrey, Lori-Ann R; Whishaw, Ian Q

    2012-06-01

    Skilled reaching is a forelimb movement in which a subject reaches for a piece of food that is placed in the mouth for eating. It is a natural movement used by many animal species and is a routine, daily activity for humans. Its prominent features include transport of the hand to a target, shaping the digits in preparation for grasping, grasping, and withdrawal of the hand to place the food in the mouth. Studies on normal human adults show that skilled reaching is mediated by at least two sensory attention processes. Hand transport to the target and hand shaping are temporally coupled with visual fixation on the target. Grasping, withdrawal, and placing the food into the mouth are associated with visual disengagement and somatosensory guidance. Studies on nonhuman animal species illustrate that shared visual and somatosensory attention likely evolved in the primate lineage. Studies on developing infants illustrate that shared attention requires both experience and maturation. Studies on subjects with Parkinson's disease and Huntington's disease illustrate that decomposition of shared attention also features compensatory visual guidance. The evolutionary, developmental, and neural control of skilled reaching suggests that associative learning processes are importantly related to normal adult attention sharing and so can be used in remediation. The economical use of sensory attention in the different phases of skilled reaching ensures efficiency in eating, reduces sensory interference between sensory reference frames, and provides efficient neural control of the advance and withdrawal components of skilled reaching movements. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Modality-specific spectral dynamics in response to visual and tactile sequential shape information processing tasks: An MEG study using multivariate pattern classification analysis.

    PubMed

    Gohel, Bakul; Lee, Peter; Jeong, Yong

    2016-08-01

    Brain regions that respond to more than one sensory modality are characterized as multisensory regions. Studies on the processing of shape or object information have revealed recruitment of the lateral occipital cortex, posterior parietal cortex, and other regions regardless of input sensory modalities. However, it remains unknown whether such regions show similar (modality-invariant) or different (modality-specific) neural oscillatory dynamics, as recorded using magnetoencephalography (MEG), in response to identical shape information processing tasks delivered to different sensory modalities. Modality-invariant or modality-specific neural oscillatory dynamics indirectly suggest modality-independent or modality-dependent participation of particular brain regions, respectively. Therefore, this study investigated the modality-specificity of neural oscillatory dynamics in the form of spectral power modulation patterns in response to visual and tactile sequential shape-processing tasks that are well-matched in terms of speed and content between the sensory modalities. Task-related changes in spectral power modulation and differences in spectral power modulation between sensory modalities were investigated at source-space (voxel) level, using a multivariate pattern classification (MVPC) approach. Additionally, whole analyses were extended from the voxel level to the independent-component level to take account of signal leakage effects caused by inverse solution. The modality-specific spectral dynamics in multisensory and higher-order brain regions, such as the lateral occipital cortex, posterior parietal cortex, inferior temporal cortex, and other brain regions, showed task-related modulation in response to both sensory modalities. This suggests modality-dependency of such brain regions on the input sensory modality for sequential shape-information processing. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Cognitive processing of visual images in migraine populations in between headache attacks.

    PubMed

    Mickleborough, Marla J S; Chapman, Christine M; Toma, Andreea S; Handy, Todd C

    2014-09-25

    People with migraine headache have altered visual sensory-level processing interictally, in between headache attacks. Here we examined the extent to which these migraine abnormalities may extend into higher visual processing, such as implicit evaluative analysis of visual images in between migraine events. Specifically, we asked two groups of participants, migraineurs (N=29) and non-migraine controls (N=29), to view a set of unfamiliar commercial logos in the context of a target identification task as the brain electrical responses to these objects were recorded via event-related potentials (ERPs). Following this task, participants individually identified those logos that they most liked or disliked. We applied a between-groups comparison of how ERP responses to logos varied as a function of hedonic evaluation. Our results suggest migraineurs have abnormal implicit evaluative processing of visual stimuli. Specifically, migraineurs lacked a bias for disliked logos found in control subjects, as measured via a late positive potential (LPP) ERP component. These results suggest post-sensory consequences of migraine in between headache events, specifically abnormal cognitive evaluative processing with a lack of normal categorical hedonic evaluation. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Feature-Selective Attentional Modulations in Human Frontoparietal Cortex.

    PubMed

    Ester, Edward F; Sutterer, David W; Serences, John T; Awh, Edward

    2016-08-03

    Control over visual selection has long been framed in terms of a dichotomy between "source" and "site," where top-down feedback signals originating in frontoparietal cortical areas modulate or bias sensory processing in posterior visual areas. This distinction is motivated in part by observations that frontoparietal cortical areas encode task-level variables (e.g., what stimulus is currently relevant or what motor outputs are appropriate), while posterior sensory areas encode continuous or analog feature representations. Here, we present evidence that challenges this distinction. We used fMRI, a roving searchlight analysis, and an inverted encoding model to examine representations of an elementary feature property (orientation) across the entire human cortical sheet while participants attended either the orientation or luminance of a peripheral grating. Orientation-selective representations were present in a multitude of visual, parietal, and prefrontal cortical areas, including portions of the medial occipital cortex, the lateral parietal cortex, and the superior precentral sulcus (thought to contain the human homolog of the macaque frontal eye fields). Additionally, representations in many-but not all-of these regions were stronger when participants were instructed to attend orientation relative to luminance. Collectively, these findings challenge models that posit a strict segregation between sources and sites of attentional control on the basis of representational properties by demonstrating that simple feature values are encoded by cortical regions throughout the visual processing hierarchy, and that representations in many of these areas are modulated by attention. Influential models of visual attention posit a distinction between top-down control and bottom-up sensory processing networks. 
These models are motivated in part by demonstrations showing that frontoparietal cortical areas associated with top-down control represent abstract or categorical stimulus information, while visual areas encode parametric feature information. Here, we show that multivariate activity in human visual, parietal, and frontal cortical areas encode representations of a simple feature property (orientation). Moreover, representations in several (though not all) of these areas were modulated by feature-based attention in a similar fashion. These results provide an important challenge to models that posit dissociable top-down control and sensory processing networks on the basis of representational properties. Copyright © 2016 the authors 0270-6474/16/368188-12$15.00/0.
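    The inverted encoding model used in studies of this kind can be sketched as a two-step linear procedure: model each voxel as a weighted sum of orientation-tuned channels, estimate the weights on training data, then invert the weights to reconstruct a channel response profile from held-out activity. The channel count, tuning shape, and all data below are synthetic illustrations, not the paper's pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
centers = np.arange(0, 180, 30)                  # 6 orientation channels (deg)

def channel_resp(theta_deg):
    """Half-rectified cosine channels raised to a power; angles are
    doubled so that tuning is 180-degree periodic, as for orientation."""
    d = np.deg2rad(2.0 * (np.atleast_1d(theta_deg)[:, None] - centers))
    return np.clip(np.cos(d), 0.0, None) ** 5    # trials x channels

# Simulate voxel responses from a random weight matrix (all numbers synthetic)
n_vox, n_train = 50, 120
W = rng.normal(0.0, 1.0, (n_vox, len(centers)))
train_oris = rng.uniform(0.0, 180.0, n_train)
C_train = channel_resp(train_oris).T             # channels x trials
B_train = W @ C_train + rng.normal(0.0, 0.1, (n_vox, n_train))

# Step 1: estimate channel weights by least squares on training data.
W_hat = B_train @ np.linalg.pinv(C_train)
# Step 2: invert the weights on a held-out trial (true orientation: 60 deg).
b_test = W @ channel_resp(60.0).T
c_hat = np.linalg.pinv(W_hat) @ b_test           # reconstructed channel profile

print(centers[np.argmax(c_hat)])                 # should peak at the 60-deg channel
```

Comparing the amplitude of such reconstructed profiles across attention conditions is, in spirit, how attentional modulation of feature representations can be assessed.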

  1. The dark side of the alpha rhythm: fMRI evidence for induced alpha modulation during complete darkness.

    PubMed

    Ben-Simon, Eti; Podlipsky, Ilana; Okon-Singer, Hadas; Gruberger, Michal; Cvetkovic, Dean; Intrator, Nathan; Hendler, Talma

    2013-03-01

    The unique role of the EEG alpha rhythm in different states of cortical activity is still debated. The main theories regarding alpha function posit either sensory processing or attention allocation as the main processes governing its modulation. Closing and opening the eyes, a well-known manipulation of the alpha rhythm, can be regarded as shifting attention between inward and outward focus, although in light it is also accompanied by a change in visual input. To disentangle the effects of attention allocation and sensory visual input on alpha modulation, 14 healthy subjects were asked to open and close their eyes during conditions of light and of complete darkness while simultaneous recordings of EEG and fMRI were acquired. Thus, during complete darkness the eyes-open condition is not related to visual input but only to attention allocation, allowing direct examination of its role in alpha modulation. A data-driven ridge regression classifier was applied to the EEG data in order to ascertain the contribution of the alpha rhythm to eyes-open/eyes-closed inference in both lighting conditions. Classifier results revealed significant alpha contribution during both light and dark conditions, suggesting that alpha rhythm modulation is closely linked to the change in the direction of attention regardless of the presence of visual sensory input. Furthermore, fMRI activation maps derived from an alpha modulation time-course during the complete darkness condition exhibited a right frontal cortical network associated with attention allocation. These findings support the importance of top-down processes such as attention allocation to alpha rhythm modulation, possibly as a prerequisite to its known bottom-up processing of sensory input. © 2012 Federation of European Neuroscience Societies and Blackwell Publishing Ltd.
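    A ridge regression classifier of the kind named above can be sketched in closed form: regress ±1 condition labels on band-power features with an L2 penalty and classify by the sign of the prediction. The features and effect sizes below are synthetic stand-ins, not the study's EEG data or its exact classifier.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic alpha-band log-power features (trials x channels): eyes-closed
# trials carry extra posterior alpha power (all numbers are hypothetical).
n, p = 100, 8
X_open = rng.normal(0.0, 1.0, (n, p))
X_closed = rng.normal(0.0, 1.0, (n, p)) + np.linspace(0.5, 2.0, p)
X = np.vstack([X_open, X_closed])
y = np.r_[np.full(n, -1.0), np.full(n, 1.0)]     # -1 eyes open, +1 eyes closed

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression, w = (X'X + lam*I)^-1 X'y, with a
    bias column (the bias is penalized too, for brevity)."""
    Xb = np.c_[X, np.ones(len(X))]
    A = Xb.T @ Xb + lam * np.eye(Xb.shape[1])
    return np.linalg.solve(A, Xb.T @ y)

w = ridge_fit(X, y)
pred = np.sign(np.c_[X, np.ones(len(X))] @ w)    # classify by sign
acc = float(np.mean(pred == y))
print(round(acc, 2))
```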

  2. Electrotactile and vibrotactile displays for sensory substitution systems

    NASA Technical Reports Server (NTRS)

    Kaczmarek, Kurt A.; Webster, John G.; Bach-Y-rita, Paul; Tompkins, Willis J.

    1991-01-01

    Sensory substitution systems provide their users with environmental information through a human sensory channel (eye, ear, or skin) different from that normally used or with the information processed in some useful way. The authors review the methods used to present visual, auditory, and modified tactile information to the skin and discuss present and potential future applications of sensory substitution, including tactile vision substitution (TVS), tactile auditory substitution, and remote tactile sensing or feedback (teletouch). The relevant sensory physiology of the skin, including the mechanisms of normal touch and the mechanisms and sensations associated with electrical stimulation of the skin using surface electrodes (electrotactile, or electrocutaneous, stimulation), is reviewed. The information-processing ability of the tactile sense and its relevance to sensory substitution is briefly summarized. The limitations of current tactile display technologies are discussed.

  3. Prefrontal contributions to visual selective attention.

    PubMed

    Squire, Ryan F; Noudoost, Behrad; Schafer, Robert J; Moore, Tirin

    2013-07-08

    The faculty of attention endows us with the capacity to process important sensory information selectively while disregarding information that is potentially distracting. Much of our understanding of the neural circuitry underlying this fundamental cognitive function comes from neurophysiological studies within the visual modality. Past evidence suggests that a principal function of the prefrontal cortex (PFC) is selective attention and that this function involves the modulation of sensory signals within posterior cortices. In this review, we discuss recent progress in identifying the specific prefrontal circuits controlling visual attention and its neural correlates within the primate visual system. In addition, we examine the persisting challenge of precisely defining how behavior should be affected when attentional function is lost.

  4. Neural time course of visually enhanced echo suppression.

    PubMed

    Bishop, Christopher W; London, Sam; Miller, Lee M

    2012-10-01

    Auditory spatial perception plays a critical role in day-to-day communication. For instance, listeners utilize acoustic spatial information to segregate individual talkers into distinct auditory "streams" to improve speech intelligibility. However, spatial localization is an exceedingly difficult task in everyday listening environments with numerous distracting echoes from nearby surfaces, such as walls. Listeners' brains overcome this unique challenge by relying on acoustic timing and, quite surprisingly, visual spatial information to suppress short-latency (1-10 ms) echoes through a process known as "the precedence effect" or "echo suppression." In the present study, we employed electroencephalography (EEG) to investigate the neural time course of echo suppression both with and without the aid of coincident visual stimulation in human listeners. We find that echo suppression is a multistage process initialized during the auditory N1 (70-100 ms) and followed by space-specific suppression mechanisms from 150 to 250 ms. Additionally, we find a robust correlate of listeners' spatial perception (i.e., suppressing or not suppressing the echo) over central electrode sites from 300 to 500 ms. Contrary to our hypothesis, vision's powerful contribution to echo suppression occurs late in processing (250-400 ms), suggesting that vision contributes primarily during late sensory or decision making processes. Together, our findings support growing evidence that echo suppression is a slow, progressive mechanism modifiable by visual influences during late sensory and decision making stages. Furthermore, our findings suggest that audiovisual interactions are not limited to early, sensory-level modulations but extend well into late stages of cortical processing.

  5. Restless 'rest': intrinsic sensory hyperactivity and disinhibition in post-traumatic stress disorder.

    PubMed

    Clancy, Kevin; Ding, Mingzhou; Bernat, Edward; Schmidt, Norman B; Li, Wen

    2017-07-01

    Post-traumatic stress disorder is characterized by exaggerated threat response, and theoretical accounts to date have focused on impaired threat processing and dysregulated prefrontal-cortex-amygdala circuitry. Nevertheless, evidence is accruing for broad, threat-neutral sensory hyperactivity in post-traumatic stress disorder. As low-level, sensory processing impacts higher-order operations, such sensory anomalies can contribute to widespread dysfunctions, presenting an additional aetiological mechanism for post-traumatic stress disorder. To elucidate a sensory pathology of post-traumatic stress disorder, we examined intrinsic visual cortical activity (based on posterior alpha oscillations) and bottom-up sensory-driven causal connectivity (Granger causality in the alpha band) during a resting state (eyes open) and a passive, serial picture viewing state. Compared to patients with generalized anxiety disorder (n = 24) and healthy control subjects (n = 20), patients with post-traumatic stress disorder (n = 25) demonstrated intrinsic sensory hyperactivity (suppressed posterior alpha power, source-localized to the visual cortex-cuneus and precuneus) and bottom-up inhibition deficits (reduced posterior→frontal Granger causality). As sensory input increased from resting to passive picture viewing, patients with post-traumatic stress disorder failed to demonstrate alpha adaptation, highlighting a rigid, set mode of sensory hyperactivity. Interestingly, patients with post-traumatic stress disorder also showed heightened frontal processing (augmented frontal gamma power, source-localized to the superior frontal gyrus and dorsal cingulate cortex), accompanied by attenuated top-down inhibition (reduced frontal→posterior causality). Importantly, not only did suppressed alpha power and bottom-up causality correlate with heightened frontal gamma power, they also correlated with increased severity of sensory and executive dysfunctions (i.e. hypervigilance and impulse control deficits, respectively). Therefore, sensory aberrations help construct a vicious cycle in post-traumatic stress disorder that is in action even at rest, implicating dysregulated triangular sensory-prefrontal-cortex-amygdala circuitry: intrinsic sensory hyperactivity and disinhibition give rise to frontal overload and disrupt executive control, fuelling and perpetuating post-traumatic stress disorder symptoms. Absent in generalized anxiety disorder, these aberrations highlight a unique sensory pathology of post-traumatic stress disorder (ruling out effects merely reflecting anxious hyperarousal), motivating new interventions targeting sensory processing and the sensory brain in these patients. © The Author (2017). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
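    In its simplest time-domain, pairwise form, Granger causality compares how much the past of one signal improves prediction of another beyond that signal's own past. The study computed a spectral (alpha-band) variant across EEG sources; the sketch below shows only the basic idea, on synthetic signals with an assumed one-way coupling.

```python
import numpy as np

rng = np.random.default_rng(3)

def granger_xy(x, y, order=2):
    """Time-domain Granger causality x -> y: log-ratio of the residual
    variance of an AR model of y from its own past (restricted) to that
    of a model that also includes the past of x (full)."""
    T = len(y)
    Y = y[order:]
    lags_y = np.column_stack([y[order - k:T - k] for k in range(1, order + 1)])
    lags_x = np.column_stack([x[order - k:T - k] for k in range(1, order + 1)])
    def resid_var(design):
        design = np.c_[design, np.ones(len(Y))]      # add intercept
        beta, *_ = np.linalg.lstsq(design, Y, rcond=None)
        return np.var(Y - design @ beta)
    return np.log(resid_var(lags_y) / resid_var(np.c_[lags_y, lags_x]))

# Toy signals: y is partly driven by lagged x (the coupling is synthetic)
T = 2000
x = rng.normal(0.0, 1.0, T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal(0.0, 1.0)

gc_xy, gc_yx = granger_xy(x, y), granger_xy(y, x)
print(gc_xy > gc_yx)  # the driving direction dominates
```

Reduced posterior→frontal causality of this general kind is what the abstract describes as a bottom-up inhibition deficit.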

  6. Visual Field Map Clusters in High-Order Visual Processing: Organization of V3A/V3B and a New Cloverleaf Cluster in the Posterior Superior Temporal Sulcus

    PubMed Central

    Barton, Brian; Brewer, Alyssa A.

    2017-01-01

    The cortical hierarchy of the human visual system has been shown to be organized around retinal spatial coordinates throughout much of low- and mid-level visual processing. These regions contain visual field maps (VFMs) that each follow the organization of the retina, with neighboring aspects of the visual field processed in neighboring cortical locations. On a larger, macrostructural scale, groups of such sensory cortical field maps (CFMs) in both the visual and auditory systems are organized into roughly circular cloverleaf clusters. CFMs within clusters tend to share properties such as receptive field distribution, cortical magnification, and processing specialization. Here we use fMRI and population receptive field (pRF) modeling to investigate the extent of VFM and cluster organization with an examination of higher-level visual processing in temporal cortex and compare these measurements to mid-level visual processing in dorsal occipital cortex. In human temporal cortex, the posterior superior temporal sulcus (pSTS) has been implicated in various neuroimaging studies as subserving higher-order vision, including face processing, biological motion perception, and multimodal audiovisual integration. In human dorsal occipital cortex, the transverse occipital sulcus (TOS) contains the V3A/B cluster, which comprises two VFMs subserving mid-level motion perception and visuospatial attention. For the first time, we present the organization of VFMs in pSTS in a cloverleaf cluster. This pSTS cluster contains four VFMs bilaterally: pSTS-1:4. We characterize these pSTS VFMs as relatively small at ∼125 mm² with relatively large pRF sizes of ∼2–8° of visual angle across the central 10° of the visual field. V3A and V3B are ∼230 mm² in surface area, with pRF sizes here similarly ∼1–8° of visual angle across the same region.
In addition, cortical magnification measurements show that a larger extent of the pSTS VFM surface areas are devoted to the peripheral visual field than those in the V3A/B cluster. Reliability measurements of VFMs in pSTS and V3A/B reveal that these cloverleaf clusters are remarkably consistent and functionally differentiable. Our findings add to the growing number of measurements of widespread sensory CFMs organized into cloverleaf clusters, indicating that CFMs and cloverleaf clusters may both be fundamental organizing principles in cortical sensory processing. PMID:28293182
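    The pRF model used in this record summarizes each cortical site's response as a Gaussian-weighted overlap with the stimulus aperture; the fitted Gaussian width is the pRF size quoted above (∼1-8°). A toy-data sketch of the forward model only (no fitting; the grid extent, centers, and sizes are illustrative assumptions):

```python
import numpy as np

def prf_response(stim, x0, y0, sigma, extent=10.0):
    """Predicted response of a population receptive field (pRF).

    The pRF is a 2D Gaussian centered at (x0, y0) in degrees of
    visual angle with size sigma; the response to a binary stimulus
    aperture is the Gaussian-weighted overlap, normalized to [0, 1].
    """
    n = stim.shape[0]
    coords = np.linspace(-extent, extent, n)
    X, Y = np.meshgrid(coords, coords)
    g = np.exp(-((X - x0) ** 2 + (Y - y0) ** 2) / (2 * sigma ** 2))
    return float((stim * g).sum() / g.sum())

# A bar covering the left half of the field drives a left-centered
# pRF strongly and a right-centered one hardly at all.
n = 101
stim = np.zeros((n, n))
stim[:, : n // 2] = 1.0  # left half of the visual field
left = prf_response(stim, x0=-5.0, y0=0.0, sigma=2.0)
right = prf_response(stim, x0=5.0, y0=0.0, sigma=2.0)
print(left > 0.9 and right < 0.1)  # True
```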

  7. The functional BDNF Val66Met polymorphism affects functions of pre-attentive visual sensory memory processes.

    PubMed

    Beste, Christian; Schneider, Daniel; Epplen, Jörg T; Arning, Larissa

    2011-01-01

    The brain-derived neurotrophic factor (BDNF), a member of the neurotrophin family, is involved in nerve growth and survival. In particular, a single nucleotide polymorphism (SNP) in the BDNF gene, Val66Met, has attracted considerable attention because of its effect on activity-dependent BDNF secretion and its link to impaired memory processes. We hypothesized that the BDNF Val66Met polymorphism may have modulatory effects on visual sensory (iconic) memory performance. Two hundred and eleven healthy German students (106 female and 105 male) were included in the data analysis. Since BDNF has also been implicated in the pathogenesis of depression, we additionally tested for possible interactions with depressive mood. The BDNF Val66Met polymorphism significantly influenced iconic-memory performance, with the combined Val/Met-Met/Met genotype group revealing less time stability of information stored in iconic memory than the Val/Val group. Furthermore, this stability was positively correlated with depressive mood exclusively in the Val/Val genotype group. Thus, these results show that the BDNF Val66Met polymorphism has an effect on pre-attentive visual sensory memory processes. Copyright © 2010 Elsevier Ltd. All rights reserved.

  8. Sensory Temporal Processing in Adults with Early Hearing Loss

    ERIC Educational Resources Information Center

    Heming, Joanne E.; Brown, Lenora N.

    2005-01-01

    This study examined tactile and visual temporal processing in adults with early loss of hearing. The tactile task consisted of punctate stimulations that were delivered to one or both hands by a mechanical tactile stimulator. Pairs of light emitting diodes were presented on a display for visual stimulation. Responses consisted of YES or NO…

  9. Sensory Mode and "Information Load": Examining the Effects of Timing on Multisensory Processing.

    ERIC Educational Resources Information Center

    Tiene, Drew

    2000-01-01

    Discussion of the development of instructional multimedia materials focuses on a study of undergraduates that examined how the use of visual icons affected learning, differences in the instructional effectiveness of visual versus auditory processing of the same information, and timing (whether simultaneous or sequential presentation is more…

  10. Omega-3 and -6 fatty acid supplementation and sensory processing in toddlers with ASD symptomology born preterm: A randomized controlled trial.

    PubMed

    Boone, Kelly M; Gracious, Barbara; Klebanoff, Mark A; Rogers, Lynette K; Rausch, Joseph; Coury, Daniel L; Keim, Sarah A

    2017-12-01

    Despite advances in the health and long-term survival of infants born preterm, they continue to face developmental challenges, including higher risk for autism spectrum disorder (ASD) and atypical sensory processing patterns. This secondary analysis aimed to describe sensory profiles and explore effects of combined dietary docosahexaenoic acid (DHA), eicosapentaenoic acid (EPA), and gamma-linolenic acid (GLA) supplementation on parent-reported sensory processing in toddlers born preterm who were exhibiting ASD symptoms. The study was a 90-day randomized, double-blind, placebo-controlled trial of 31 children aged 18-38 months who were born at ≤29 weeks' gestation. Mixed effects regression analyses followed intent-to-treat principles and explored effects on parent-reported sensory processing measured by the Infant/Toddler Sensory Profile (ITSP). Baseline ITSP scores reflected atypical sensory processing, with the majority of atypical scores falling below the mean. Sensory processing sections: auditory (above=0%, below=65%), vestibular (above=13%, below=48%), tactile (above=3%, below=35%), oral sensory (above=10%, below=26%), visual (above=10%, below=16%); sensory processing quadrants: low registration (above=3%, below=71%), sensation avoiding (above=3%, below=39%), sensory sensitivity (above=3%, below=35%), and sensation seeking (above=10%, below=19%). Twenty-eight of the 31 children randomized had complete outcome data. Although not statistically significant (p=0.13), the magnitude of the effect for reduction in behaviors associated with sensory sensitivity was medium to large (effect size=0.57). No other scale reflected a similar magnitude of effect (effect size range: 0.10 to 0.32). The findings provide support for larger randomized trials of omega fatty acid supplementation for children at risk of sensory processing difficulties, especially those born preterm. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Intracranial Cortical Responses during Visual–Tactile Integration in Humans

    PubMed Central

    Quinn, Brian T.; Carlson, Chad; Doyle, Werner; Cash, Sydney S.; Devinsky, Orrin; Spence, Charles; Halgren, Eric

    2014-01-01

    Sensory integration of touch and sight is crucial to perceiving and navigating the environment. While recent evidence from other sensory modality combinations suggests that low-level sensory areas integrate multisensory information at early processing stages, little is known about how the brain combines visual and tactile information. We investigated the dynamics of multisensory integration between vision and touch using the high spatial and temporal resolution of intracranial electrocorticography in humans. We present a novel, two-step metric for defining multisensory integration. The first step identifies candidate multisensory responses by comparing the bimodal response with the sum of the two unisensory responses. The second step eliminates the possibility that simple summation of sensory responses could be misinterpreted as an interaction. Using these criteria, averaged local field potentials and high-gamma-band power demonstrate a functional processing cascade whereby sensory integration occurs late, both anatomically and temporally, in the temporo–parieto–occipital junction (TPOJ) and dorsolateral prefrontal cortex. Results further suggest two neurophysiologically distinct and temporally separated integration mechanisms in TPOJ, while providing direct evidence for local suppression as a dominant mechanism for synthesizing visual and tactile input. These results tend to support earlier concepts of multisensory integration as relatively late and centered in tertiary multimodal association cortices. PMID:24381279
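    The first step of the two-step criterion, comparing the bimodal response against the sum of the unisensory responses, can be sketched as follows. The Gaussian response shapes and the 0.7 suppression factor are illustrative assumptions; a negative index corresponds to the sub-additive (suppressive) interactions the authors report:

```python
import numpy as np

def superadditivity_index(visual, tactile, bimodal):
    """Additive-model test for multisensory integration.

    Compares the bimodal response with the sum of the two unisensory
    responses (all arrays are trial-averaged response time courses).
    Negative values indicate sub-additive (suppressive) interactions.
    """
    return float(np.mean(bimodal - (visual + tactile)))

# Hypothetical trial-averaged responses: the bimodal response falls
# short of the linear sum, i.e. local suppression.
t = np.linspace(0, 1, 200)
visual = np.exp(-((t - 0.3) ** 2) / 0.01)
tactile = np.exp(-((t - 0.4) ** 2) / 0.01)
bimodal = 0.7 * (visual + tactile)
print(superadditivity_index(visual, tactile, bimodal) < 0)  # True
```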

  12. Visual processing speed.

    PubMed

    Owsley, Cynthia

    2013-09-20

    Older adults commonly report difficulties in visual tasks of everyday living that involve visual clutter, secondary task demands, and time sensitive responses. These difficulties often cannot be attributed to visual sensory impairment. Techniques for measuring visual processing speed under divided attention conditions and among visual distractors have been developed and have established construct validity in that those older adults performing poorly in these tests are more likely to exhibit daily visual task performance problems. Research suggests that computer-based training exercises can increase visual processing speed in older adults and that these gains transfer to enhancement of health and functioning and a slowing in functional and health decline as people grow older. Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. Perceptual congruency of audio-visual speech affects ventriloquism with bilateral visual stimuli.

    PubMed

    Kanaya, Shoko; Yokosawa, Kazuhiko

    2011-02-01

    Many studies on multisensory processes have focused on performance in simplified experimental situations, with a single stimulus in each sensory modality. However, these results cannot necessarily be applied to explain our perceptual behavior in natural scenes where various signals exist within one sensory modality. We investigated the role of audio-visual syllable congruency on participants' auditory localization bias or the ventriloquism effect using spoken utterances and two videos of a talking face. Salience of facial movements was also manipulated. Results indicated that more salient visual utterances attracted participants' auditory localization. Congruent pairing of audio-visual utterances elicited greater localization bias than incongruent pairing, while previous studies have reported little dependency on the reality of stimuli in ventriloquism. Moreover, audio-visual illusory congruency, owing to the McGurk effect, caused substantial visual interference on auditory localization. Multisensory performance appears more flexible and adaptive in this complex environment than in previous studies.

  14. Direct neural pathways convey distinct visual information to Drosophila mushroom bodies

    PubMed Central

    Vogt, Katrin; Aso, Yoshinori; Hige, Toshihide; Knapek, Stephan; Ichinose, Toshiharu; Friedrich, Anja B; Turner, Glenn C; Rubin, Gerald M; Tanimoto, Hiromu

    2016-01-01

    Previously, we demonstrated that visual and olfactory associative memories of Drosophila share mushroom body (MB) circuits (Vogt et al., 2014). Unlike for odor representation, the MB circuit for visual information has not been characterized. Here, we show that a small subset of MB Kenyon cells (KCs) selectively responds to visual but not olfactory stimulation. The dendrites of these atypical KCs form a ventral accessory calyx (vAC), distinct from the main calyx that receives olfactory input. We identified two types of visual projection neurons (VPNs) directly connecting the optic lobes and the vAC. Strikingly, these VPNs are differentially required for visual memories of color and brightness. The segregation of visual and olfactory domains in the MB allows independent processing of distinct sensory memories and may be a conserved form of sensory representations among insects. DOI: http://dx.doi.org/10.7554/eLife.14009.001 PMID:27083044

  15. Facial markings in the social cuckoo wasp Polistes sulcifer: No support for the visual deception and the assessment hypotheses.

    PubMed

    Cini, Alessandro; Ortolani, Irene; Zechini, Luigi; Cervo, Rita

    2015-02-01

    Insect social parasites have to conquer a host colony by overcoming its defensive barriers. In addition to increased fighting abilities, many social parasites evolved sophisticated sensory deception mechanisms to elude host colonies defenses by exploiting host communication channels. Recently, it has been shown that the conspicuous facial markings of a paper wasp social parasite, Polistes sulcifer, decrease the aggressiveness of host foundresses. Two main hypotheses stand as explanations of this phenomenon: visual sensory deception (i.e. the black patterning reduces host aggression by exploiting the host visual communication system) and visual quality assessment (i.e. facial markings reduce aggressiveness as they signal the increased fighting ability of parasites). Through behavioral assays and morphological measurements we tested three predictions resulting from these hypotheses and found no support either for the visual sensory deception or for the quality assessment to explain the reduction in host aggressiveness towards the parasite. Our results suggest that other discrimination processes may explain the observed phenomenon. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Collective motion in animal groups from a neurobiological perspective: the adaptive benefits of dynamic sensory loads and selective attention.

    PubMed

    Lemasson, B H; Anderson, J J; Goodwin, R A

    2009-12-21

    We explore mechanisms associated with collective animal motion by drawing on the neurobiological bases of sensory information processing and decision-making. The model uses simplified retinal processes to translate neighbor movement patterns into information through spatial signal integration and threshold responses. The structure provides a mechanism by which individuals can vary their sets of influential neighbors, a measure of an individual's sensory load. Sensory loads are correlated with group order and density, and we discuss their adaptive values in an ecological context. The model also provides a mechanism by which group members can identify, and rapidly respond to, novel visual stimuli.
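    The threshold mechanism described, in which retinal signal size determines an individual's set of influential neighbors, can be sketched by approximating each neighbor's visual angle from its distance. The body length and threshold values here are illustrative assumptions, not the model's parameters:

```python
import math

def influential_neighbors(focal, neighbors, body_length=1.0, threshold=0.05):
    """Select neighbors whose retinal image exceeds a threshold.

    Each neighbor's visual angle (radians) is approximated from its
    body length and distance; only neighbors subtending more than
    `threshold` influence the focal individual, so the sensory load
    shrinks as the group spreads out.
    """
    chosen = []
    for n in neighbors:
        d = math.dist(focal, n)
        angle = 2 * math.atan(body_length / (2 * d))
        if angle > threshold:
            chosen.append(n)
    return chosen

# A near neighbor subtends a large angle and is kept; a far one is not.
focal = (0.0, 0.0)
print(influential_neighbors(focal, [(3.0, 0.0), (40.0, 0.0)]))  # [(3.0, 0.0)]
```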

  17. "The Mask Who Wasn't There": Visual Masking Effect with the Perceptual Absence of the Mask

    ERIC Educational Resources Information Center

    Rey, Amandine Eve; Riou, Benoit; Muller, Dominique; Dabic, Stéphanie; Versace, Rémy

    2015-01-01

    Does a visual mask need to be perceptually present to disrupt processing? In the present research, we proposed to explore the link between perceptual and memory mechanisms by demonstrating that a typical sensory phenomenon (visual masking) can be replicated at a memory level. Experiment 1 highlighted an interference effect of a visual mask on the…

  18. Temporal characteristics of audiovisual information processing.

    PubMed

    Fuhrmann Alpert, Galit; Hein, Grit; Tsai, Nancy; Naumer, Marcus J; Knight, Robert T

    2008-05-14

    In complex natural environments, auditory and visual information often have to be processed simultaneously. Previous functional magnetic resonance imaging (fMRI) studies focused on the spatial localization of brain areas involved in audiovisual (AV) information processing, but the temporal characteristics of AV information flow in these regions remained unclear. In this study, we used fMRI and a novel information-theoretic approach to study the flow of AV sensory information. Subjects passively perceived sounds and images of objects presented either alone or simultaneously. Applying the measure of mutual information, we computed for each voxel the latency in which the blood oxygenation level-dependent signal had the highest information content about the preceding stimulus. The results indicate that, after AV stimulation, the earliest informative activity occurs in right Heschl's gyrus, left primary visual cortex, and the posterior portion of the superior temporal gyrus, which is known as a region involved in object-related AV integration. Informative activity in the anterior portion of superior temporal gyrus, middle temporal gyrus, right occipital cortex, and inferior frontal cortex was found at a later latency. Moreover, AV presentation resulted in shorter latencies in multiple cortical areas compared with isolated auditory or visual presentation. The results provide evidence for bottom-up processing from primary sensory areas into higher association areas during AV integration in humans and suggest that AV presentation shortens processing time in early sensory cortices.
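    The information-theoretic approach described, finding for each voxel the latency at which the response carries the most information about the preceding stimulus, can be sketched with a histogram-based estimator on synthetic data (the binning scheme and toy voxel are assumptions; the study's exact estimator is not specified here):

```python
import numpy as np

def mutual_information(labels, values, bins=8):
    """Mutual information (bits) between discrete stimulus labels and
    a continuous response, with the response discretized into bins."""
    v = np.digitize(values, np.histogram_bin_edges(values, bins))
    mi = 0.0
    for a in np.unique(labels):
        for b in np.unique(v):
            pab = np.mean((labels == a) & (v == b))
            pa = np.mean(labels == a)
            pb = np.mean(v == b)
            if pab > 0:
                mi += pab * np.log2(pab / (pa * pb))
    return mi

def most_informative_latency(labels, voxel_ts, latencies):
    """Return the latency at which the voxel response carries the
    most information about the preceding stimulus label."""
    return max(latencies, key=lambda L: mutual_information(labels, voxel_ts[:, L]))

# Hypothetical voxel: the stimulus label is only expressed 3 samples
# after onset, so the most informative latency is 3.
rng = np.random.default_rng(1)
labels = rng.integers(0, 2, 500)          # auditory vs. visual trials
voxel_ts = rng.standard_normal((500, 6))  # response at latencies 0..5
voxel_ts[:, 3] += 2.0 * labels            # informative only at latency 3
print(most_informative_latency(labels, voxel_ts, range(6)))  # 3
```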

  19. Environmental influences on neural systems of relational complexity

    PubMed Central

    Kalbfleisch, M. Layne; deBettencourt, Megan T.; Kopperman, Rebecca; Banasiak, Meredith; Roberts, Joshua M.; Halavi, Maryam

    2013-01-01

    Constructivist learning theory contends that we construct knowledge by experience and that environmental context influences learning. To explore this principle, we examined the cognitive process relational complexity (RC), defined as the number of visual dimensions considered during problem solving on a matrix reasoning task and a well-documented measure of mature reasoning capacity. We sought to determine how the visual environment influences RC by examining the influence of color and visual contrast on RC in a neuroimaging task. To specify the contributions of sensory demand and relational integration to reasoning, our participants performed a non-verbal matrix task comprised of color, no-color line, or black-white visual contrast conditions parametrically varied by complexity (relations 0, 1, 2). The use of matrix reasoning is ecologically valid for its psychometric relevance and for its potential to link the processing of psychophysically specific visual properties with various levels of RC during reasoning. The role of these elements is important because matrix tests assess intellectual aptitude based on these seemingly context-less exercises. This experiment is a first step toward examining the psychophysical underpinnings of performance on these types of problems. The importance of this is increased in light of recent evidence that intelligence can be linked to visual discrimination. We submit three main findings. First, color and black-white visual contrast (BWVC) add demand at a basic sensory level, but contributions from color and from BWVC are dissociable in cortex such that color engages a “reasoning heuristic” and BWVC engages a “sensory heuristic.” Second, color supports contextual sense-making by boosting salience resulting in faster problem solving. Lastly, when visual complexity reaches 2-relations, color and visual contrast relinquish salience to other dimensions of problem solving. PMID:24133465

  20. Impaired Visual Motor Coordination in Obese Adults.

    PubMed

    Gaul, David; Mat, Arimin; O'Shea, Donal; Issartel, Johann

    2016-01-01

    Objective. To investigate whether obesity alters the sensory motor integration process and movement outcome during a visual rhythmic coordination task. Methods. 88 participants (44 obese and 44 matched controls) sat on a chair equipped with a wrist pendulum oscillating in the sagittal plane. The task was to swing the pendulum in synchrony with a moving visual stimulus displayed on a screen. Results. Obese participants demonstrated significantly (p < 0.01) higher values for continuous relative phase (CRP), indicating a poorer level of coordination, increased movement variability (p < 0.05), and a larger movement amplitude (p < 0.05) than their healthy weight counterparts. Conclusion. These results highlight the existence of visual sensory integration deficiencies in obese participants. The obese group had greater difficulty in synchronizing their movement with a visual stimulus. Considering that visual motor coordination is an essential component of many activities of daily living, any impairment could significantly affect quality of life.
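    Continuous relative phase, the coordination measure in this record, is conventionally computed from the instantaneous phases of the two movement signals. Below is a numpy-only sketch using an FFT-based analytic signal; the synthetic stimulus/pendulum pair with a fixed quarter-cycle lag is an illustrative assumption:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (a numpy-only Hilbert transform)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(X * h)

def continuous_relative_phase(x, y):
    """Continuous relative phase (degrees, wrapped to [-180, 180))
    between two movement signals; values near 0 indicate tight
    coordination."""
    px = np.angle(analytic_signal(x - x.mean()))
    py = np.angle(analytic_signal(y - y.mean()))
    d = px - py
    return np.degrees((d + np.pi) % (2 * np.pi) - np.pi)

# Hypothetical stimulus and pendulum: a constant quarter-cycle lag
# appears as a CRP of about 45 degrees away from the window edges.
t = np.linspace(0, 10, 2000)
stimulus = np.sin(2 * np.pi * t)
pendulum = np.sin(2 * np.pi * t - np.pi / 4)
crp = continuous_relative_phase(stimulus, pendulum)
print(round(float(np.mean(crp[100:-100]))))  # 45
```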

  1. Causal evidence for retina dependent and independent visual motion computations in mouse cortex

    PubMed Central

    Hillier, Daniel; Fiscella, Michele; Drinnenberg, Antonia; Trenholm, Stuart; Rompani, Santiago B.; Raics, Zoltan; Katona, Gergely; Juettner, Josephine; Hierlemann, Andreas; Rozsa, Balazs; Roska, Botond

    2017-01-01

    How neuronal computations in the sensory periphery contribute to computations in the cortex is not well understood. We examined this question in the context of visual-motion processing in the retina and primary visual cortex (V1) of mice. We disrupted retinal direction selectivity – either exclusively along the horizontal axis using FRMD7 mutants or along all directions by ablating starburst amacrine cells – and monitored neuronal activity in layer 2/3 of V1 during stimulation with visual motion. In control mice, we found an overrepresentation of cortical cells preferring posterior visual motion, the dominant motion direction an animal experiences when it moves forward. In mice with disrupted retinal direction selectivity, the overrepresentation of posterior-motion-preferring cortical cells disappeared, and their response at higher stimulus speeds was reduced. This work reveals the existence of two functionally distinct, sensory-periphery-dependent and -independent computations of visual motion in the cortex. PMID:28530661

  2. Visual Information Present in Infragranular Layers of Mouse Auditory Cortex.

    PubMed

    Morrill, Ryan J; Hasenstaub, Andrea R

    2018-03-14

    The cerebral cortex is a major hub for the convergence and integration of signals from across the sensory modalities; sensory cortices, including primary regions, are no exception. Here we show that visual stimuli influence neural firing in the auditory cortex of awake male and female mice, using multisite probes to sample single units across multiple cortical layers. We demonstrate that visual stimuli influence firing in both primary and secondary auditory cortex. We then determine the laminar location of recording sites through electrode track tracing with fluorescent dye and optogenetic identification using layer-specific markers. Spiking responses to visual stimulation occur deep in auditory cortex and are particularly prominent in layer 6. Visual modulation of firing rate occurs more frequently at areas with secondary-like auditory responses than those with primary-like responses. Auditory cortical responses to drifting visual gratings are not orientation-tuned, unlike visual cortex responses. The deepest cortical layers thus appear to be an important locus for cross-modal integration in auditory cortex. SIGNIFICANCE STATEMENT The deepest layers of the auditory cortex are often considered its most enigmatic, possessing a wide range of cell morphologies and atypical sensory responses. Here we show that, in mouse auditory cortex, these layers represent a locus of cross-modal convergence, containing many units responsive to visual stimuli. Our results suggest that this visual signal conveys the presence and timing of a stimulus rather than specifics about that stimulus, such as its orientation. These results shed light on both how and what types of cross-modal information are integrated at the earliest stages of sensory cortical processing. Copyright © 2018 the authors 0270-6474/18/382854-09$15.00/0.

  3. Reading in the dark: neural correlates and cross-modal plasticity for learning to read entire words without visual experience.

    PubMed

    Sigalov, Nadine; Maidenbaum, Shachar; Amedi, Amir

    2016-03-01

    Cognitive neuroscience has long attempted to determine the ways in which cortical selectivity develops, and the impact of nature vs. nurture on it. Congenital blindness (CB) offers a unique opportunity to test this question, as the brains of blind individuals develop without visual experience. Here we approach this question through the reading network. Several areas in the visual cortex have been implicated as part of the reading network, and one of the main ones among them is the VWFA, which is selective to the form of letters and words. But what happens in the CB brain? On the one hand, it has been shown that cross-modal plasticity leads to the recruitment of occipital areas, including the VWFA, for linguistic tasks. On the other hand, we have recently demonstrated VWFA activity for letters in contrast to other visual categories when the information is provided via other senses such as touch or audition. Which of these tasks is more dominant? By which mechanism does the CB brain process reading? Using fMRI and visual-to-auditory sensory substitution, which transfers the topographical features of the letters, we compared reading with semantic and scrambled conditions in a group of CB individuals. We found activation in early auditory and visual cortices during the early processing phase (letter), while the later phase (word) showed VWFA and bilateral dorsal-intraparietal activations for words. This further supports the notion that many visual regions in general, even early visual areas, also maintain a predilection for task processing even when the modality is variable and in spite of putative lifelong linguistic cross-modal plasticity. Furthermore, we find that the VWFA is recruited preferentially for letter and word form, while it was not recruited, and even exhibited deactivation, for an immediately subsequent semantic task, suggesting that, despite only short sensory substitution experience, orthographic task processing can dominate semantic processing in the VWFA.
On a wider scope, this implies that at least in some cases cross-modal plasticity which enables the recruitment of areas for new tasks may be dominated by sensory independent task specific activation. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Sensory signals during active versus passive movement.

    PubMed

    Cullen, Kathleen E

    2004-12-01

    Our sensory systems are simultaneously activated as the result of our own actions and changes in the external world. The ability to distinguish self-generated sensory events from those that arise externally is thus essential for perceptual stability and accurate motor control. Recently, progress has been made towards understanding how this distinction is made. It has been proposed that an internal prediction of the consequences of our actions is compared to the actual sensory input to cancel the resultant self-generated activation. Evidence in support of this hypothesis has been obtained for early stages of sensory processing in the vestibular, visual and somatosensory systems. These findings have implications for the sensory-motor transformations that are needed to guide behavior.
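    The proposed comparison, an internal prediction of the sensory consequences of action subtracted from the actual input, can be sketched as forward-model cancellation. The linear forward model and its gain here are illustrative assumptions, not a claim about the actual vestibular transformation:

```python
import numpy as np

def reafference_cancellation(motor_commands, sensory_input, forward_model):
    """Subtract an internal prediction of self-generated sensation
    (from an efference copy of the motor command) from the actual
    sensory input, leaving the externally generated component."""
    prediction = forward_model(motor_commands)
    return sensory_input - prediction

# Hypothetical head movements: self-motion adds a sensory signal
# proportional to the command; cancellation recovers the external part.
forward_model = lambda cmd: 0.5 * cmd  # assumed gain of 0.5
commands = np.array([0.0, 1.0, 2.0, 1.0])
external = np.array([0.2, 0.0, -0.1, 0.3])
sensed = forward_model(commands) + external
residual = reafference_cancellation(commands, sensed, forward_model)
print(np.allclose(residual, external))  # True
```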

  5. Heteromodal Cortical Areas Encode Sensory-Motor Features of Word Meaning.

    PubMed

    Fernandino, Leonardo; Humphries, Colin J; Conant, Lisa L; Seidenberg, Mark S; Binder, Jeffrey R

    2016-09-21

    The capacity to process information in conceptual form is a fundamental aspect of human cognition, yet little is known about how this type of information is encoded in the brain. Although the role of sensory and motor cortical areas has been a focus of recent debate, neuroimaging studies of concept representation consistently implicate a network of heteromodal areas that seem to support concept retrieval in general rather than knowledge related to any particular sensory-motor content. We used predictive machine learning on fMRI data to investigate the hypothesis that cortical areas in this "general semantic network" (GSN) encode multimodal information derived from basic sensory-motor processes, possibly functioning as convergence-divergence zones for distributed concept representation. An encoding model based on five conceptual attributes directly related to sensory-motor experience (sound, color, shape, manipulability, and visual motion) was used to predict brain activation patterns associated with individual lexical concepts in a semantic decision task. When the analysis was restricted to voxels in the GSN, the model was able to identify the activation patterns corresponding to individual concrete concepts significantly above chance. In contrast, a model based on five perceptual attributes of the word form performed at chance level. This pattern was reversed when the analysis was restricted to areas involved in the perceptual analysis of written word forms. These results indicate that heteromodal areas involved in semantic processing encode information about the relative importance of different sensory-motor attributes of concepts, possibly by storing particular combinations of sensory and motor features. The present study used a predictive encoding model of word semantics to decode conceptual information from neural activity in heteromodal cortical areas. 
The model is based on five sensory-motor attributes of word meaning (color, shape, sound, visual motion, and manipulability) and encodes the relative importance of each attribute to the meaning of a word. This is the first demonstration that heteromodal areas involved in semantic processing can discriminate between different concepts based on sensory-motor information alone. This finding indicates that the brain represents concepts as multimodal combinations of sensory and motor representations. Copyright © 2016 the authors 0270-6474/16/369763-07$15.00/0.
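    The encoding-model logic, predicting a concept's activation pattern from five attribute ratings and then checking whether the prediction identifies the correct concept, can be sketched on synthetic data. For simplicity this sketch fits and tests on the same data, unlike a properly cross-validated analysis, and all sizes and the correlation-based identification rule are illustrative assumptions:

```python
import numpy as np

def fit_encoding_model(attributes, patterns):
    """Least-squares encoding model: maps each concept's five
    sensory-motor attribute ratings to its activation pattern."""
    return np.linalg.lstsq(attributes, patterns, rcond=None)[0]

def identify(attributes, patterns, W):
    """For each concept, check whether its predicted pattern
    correlates best with its own observed pattern rather than any
    other concept's; returns the identification accuracy."""
    pred = attributes @ W
    correct = 0
    for i in range(len(patterns)):
        r = [np.corrcoef(pred[i], patterns[j])[0, 1] for j in range(len(patterns))]
        correct += int(np.argmax(r) == i)
    return correct / len(patterns)

# Hypothetical data: 40 concepts x 5 attributes, 200 voxels whose
# responses are linear in the attributes plus noise.
rng = np.random.default_rng(2)
attrs = rng.random((40, 5))
true_W = rng.standard_normal((5, 200))
patterns = attrs @ true_W + 0.1 * rng.standard_normal((40, 200))
W = fit_encoding_model(attrs, patterns)
print(identify(attrs, patterns, W) > 0.9)  # True
```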

  6. Superior short-term learning effect of visual and sensory organisation ability when sensory information is unreliable in adolescent rhythmic gymnasts.

    PubMed

    Chen, Hui-Ya; Chang, Hsiao-Yun; Ju, Yan-Ying; Tsao, Hung-Ting

    2017-06-01

    Rhythmic gymnasts specialise in dynamic balance under sensory conditions of numerous somatosensory, visual, and vestibular stimulations. This study investigated whether adolescent rhythmic gymnasts are superior to their peers in Sensory Organisation Test (SOT) performance, which quantifies the ability to maintain standing balance in six sensory conditions, and explored whether they plateaued faster during familiarisation with the SOT. Three and six sessions of SOTs were administered to 15 female rhythmic gymnasts (15.0 ± 1.8 years) and matched peers (15.1 ± 2.1 years), respectively. The gymnasts were superior to their peers in terms of fitness measures, and their performance was better on the SOT equilibrium score when visual information was unreliable. SOT learning effects appeared in the more challenging sensory conditions between Sessions 1 and 2 and were equivalent in both groups; however, over time, the gymnasts gained marginally significantly better visual ability and relied less on visual sense when it was unreliable. In conclusion, adolescent rhythmic gymnasts have generally the same sensory organisation ability and learning rates as their peers. However, when visual information is unreliable, they have superior sensory organisation ability and learn faster to rely less on visual sense.

  7. Grouping and Segregation of Sensory Events by Actions in Temporal Audio-Visual Recalibration.

    PubMed

    Ikumi, Nara; Soto-Faraco, Salvador

    2016-01-01

    Perception in multi-sensory environments involves both grouping and segregation of events across sensory modalities. Temporal coincidence between events is considered a strong cue to resolve multisensory perception. However, differences in physical transmission and neural processing times amongst modalities complicate this picture. This is illustrated by cross-modal recalibration, whereby adaptation to audio-visual asynchrony produces shifts in perceived simultaneity. Here, we examined whether voluntary actions might serve as a temporal anchor to cross-modal recalibration in time. Participants were tested on an audio-visual simultaneity judgment task after an adaptation phase where they had to synchronize voluntary actions with audio-visual pairs presented at a fixed asynchrony (vision leading or vision lagging). Our analysis focused on the magnitude of cross-modal recalibration to the adapted audio-visual asynchrony as a function of the nature of the actions during adaptation, putatively fostering cross-modal grouping or segregation. We found larger temporal adjustments when actions promoted grouping than segregation of sensory events. However, a control experiment suggested that additional factors, such as attention to planning/execution of actions, could have an impact on recalibration effects. Contrary to the view that cross-modal temporal organization is mainly driven by external factors related to the stimulus or environment, our findings add supporting evidence for the idea that perceptual adjustments strongly depend on the observer's inner states induced by motor and cognitive demands.

  8. Grouping and Segregation of Sensory Events by Actions in Temporal Audio-Visual Recalibration

    PubMed Central

    Ikumi, Nara; Soto-Faraco, Salvador

    2017-01-01

    Perception in multi-sensory environments involves both grouping and segregation of events across sensory modalities. Temporal coincidence between events is considered a strong cue to resolve multisensory perception. However, differences in physical transmission and neural processing times amongst modalities complicate this picture. This is illustrated by cross-modal recalibration, whereby adaptation to audio-visual asynchrony produces shifts in perceived simultaneity. Here, we examined whether voluntary actions might serve as a temporal anchor to cross-modal recalibration in time. Participants were tested on an audio-visual simultaneity judgment task after an adaptation phase where they had to synchronize voluntary actions with audio-visual pairs presented at a fixed asynchrony (vision leading or vision lagging). Our analysis focused on the magnitude of cross-modal recalibration to the adapted audio-visual asynchrony as a function of the nature of the actions during adaptation, putatively fostering cross-modal grouping or segregation. We found larger temporal adjustments when actions promoted grouping of sensory events than when they promoted segregation. However, a control experiment suggested that additional factors, such as attention to planning/execution of actions, could have an impact on recalibration effects. Contrary to the view that cross-modal temporal organization is mainly driven by external factors related to the stimulus or environment, our findings add supporting evidence for the idea that perceptual adjustments strongly depend on the observer's inner states induced by motor and cognitive demands. PMID:28154529

  9. Sensory over-responsivity in adults with autism spectrum conditions.

    PubMed

    Tavassoli, Teresa; Miller, Lucy J; Schoen, Sarah A; Nielsen, Darci M; Baron-Cohen, Simon

    2014-05-01

    Anecdotal reports and empirical evidence suggest that sensory processing issues are a key feature of autism spectrum conditions. This study set out to investigate whether adults with autism spectrum conditions report more sensory over-responsivity than adults without autism spectrum conditions. Another goal of the study was to identify whether autistic traits in adults with and without autism spectrum conditions were associated with sensory over-responsivity. Adults with (n = 221) and without (n = 181) autism spectrum conditions participated in an online survey. The Autism Spectrum Quotient, the Raven Matrices and the Sensory Processing Scale were used to characterize the sample. Adults with autism spectrum conditions reported more sensory over-responsivity than control participants across various sensory domains (visual, auditory, tactile, olfactory, gustatory and proprioceptive). Sensory over-responsivity correlated positively with autistic traits (Autism Spectrum Quotient) at a significant level across groups and within groups. Adults with autism spectrum conditions experience sensory over-responsivity to daily sensory stimuli to a high degree. A positive relationship exists between sensory over-responsivity and autistic traits. Understanding sensory over-responsivity and ways of measuring it in adults with autism spectrum conditions has implications for research and clinical settings.

  10. Top-down influence on the visual cortex of the blind during sensory substitution

    PubMed Central

    Murphy, Matthew C.; Nau, Amy C.; Fisher, Christopher; Kim, Seong-Gi; Schuman, Joel S.; Chan, Kevin C.

    2017-01-01

    Visual sensory substitution devices provide a non-surgical and flexible approach to vision rehabilitation in the blind. These devices convert images taken by a camera into cross-modal sensory signals that are presented as a surrogate for direct visual input. While previous work has demonstrated that the visual cortex of blind subjects is recruited during sensory substitution, the cognitive basis of this activation remains incompletely understood. To test the hypothesis that top-down input provides a significant contribution to this activation, we performed functional MRI scanning in 11 blind (7 acquired and 4 congenital) and 11 sighted subjects under two conditions: passive listening of image-encoded soundscapes before sensory substitution training and active interpretation of the same auditory sensory substitution signals after a 10-minute training session. We found that the modulation of visual cortex activity due to active interpretation was significantly stronger in the blind over sighted subjects. In addition, congenitally blind subjects showed stronger task-induced modulation in the visual cortex than acquired blind subjects. In a parallel experiment, we scanned 18 blind (11 acquired and 7 congenital) and 18 sighted subjects at rest to investigate alterations in functional connectivity due to visual deprivation. The results demonstrated that visual cortex connectivity of the blind shifted away from sensory networks and toward known areas of top-down input. Taken together, our data support the model of the brain, including the visual system, as a highly flexible task-based and not sensory-based machine. PMID:26584776

  11. Blast exposure and dual sensory impairment: an evidence review and integrated rehabilitation approach.

    PubMed

    Saunders, Gabrielle H; Echt, Katharina V

    2012-01-01

    Combat exposures to blast can result in both peripheral damage to the ears and eyes and central damage to the auditory and visual processing areas in the brain. The functional effects of the latter include visual, auditory, and cognitive processing difficulties that manifest as deficits in attention, memory, and problem solving--symptoms similar to those seen in individuals with visual and auditory processing disorders. Coexisting damage to the auditory and visual system is referred to as dual sensory impairment (DSI). The number of Operation Iraqi Freedom/Operation Enduring Freedom Veterans with DSI is vast; yet currently no established models or guidelines exist for assessment, rehabilitation, or service-delivery practice. In this article, we review the current state of knowledge regarding blast exposure and DSI and outline the many unknowns in this area. Further, we propose a model for clinical assessment and rehabilitation of blast-related DSI that includes development of a coordinated team-based approach to target activity limitations and participation restrictions in order to enhance reintegration, recovery, and quality of life.

  12. Time-interval for integration of stabilizing haptic and visual information in subjects balancing under static and dynamic conditions

    PubMed Central

    Honeine, Jean-Louis; Schieppati, Marco

    2014-01-01

    Maintaining equilibrium is basically a sensorimotor integration task. The central nervous system (CNS) continually and selectively weights and rapidly integrates sensory inputs from multiple sources, and coordinates multiple outputs. The weighting process is based on the availability and accuracy of afferent signals at a given instant, on the time-period required to process each input, and possibly on the plasticity of the relevant pathways. The likelihood that sensory inflow changes while balancing under static or dynamic conditions is high, because subjects can pass from a dark to a well-lit environment or from a tactile-guided stabilization to loss of haptic inflow. This review article presents recent data on the temporal events accompanying sensory transition, about which basic information remains fragmentary. The processing time from sensory shift to reaching a new steady state includes the time to (a) subtract or integrate sensory inputs; (b) move from allocentric to egocentric reference or vice versa; and (c) adjust the calibration of motor activity in time and amplitude to the new sensory set. We present examples of processes of integration of posture-stabilizing information, and of the respective sensorimotor time-intervals while allowing or occluding vision or adding or subtracting tactile information. These intervals are short, on the order of 1–2 s across postural conditions, modalities, and deliberate or passive shifts. They are slightly longer for haptic than for visual shifts, slightly shorter on withdrawal than on addition of a stabilizing input, and slightly shorter for deliberate than for unexpected shifts. The delays are shortest (for haptic shifts) in blind subjects. Since automatic balance stabilization may be vulnerable to sensory-integration delays and to interference from concurrent cognitive tasks in patients with sensorimotor problems, insight into the processing time for balance control represents a critical step in the design of new balance- and locomotion-training devices. PMID:25339872

  13. The rapid distraction of attentional resources toward the source of incongruent stimulus input during multisensory conflict.

    PubMed

    Donohue, Sarah E; Todisco, Alexandra E; Woldorff, Marty G

    2013-04-01

    Neuroimaging work on multisensory conflict suggests that the relevant modality receives enhanced processing in the face of incongruency. However, the degree of stimulus processing in the irrelevant modality and the temporal cascade of the attentional modulations in either the relevant or irrelevant modalities are unknown. Here, we employed an audiovisual conflict paradigm with a sensory probe in the task-irrelevant modality (vision) to gauge the attentional allocation to that modality. ERPs were recorded as participants attended to and discriminated spoken auditory letters while ignoring simultaneous bilateral visual letter stimuli that were either fully congruent, fully incongruent, or partially incongruent (one side incongruent, one congruent) with the auditory stimulation. Half of the audiovisual letter stimuli were followed 500-700 msec later by a bilateral visual probe stimulus. As expected, ERPs to the audiovisual stimuli showed an incongruency ERP effect (fully incongruent versus fully congruent) of an enhanced, centrally distributed, negative-polarity wave starting ∼250 msec. More critically here, the sensory ERP components to the visual probes were larger when they followed fully incongruent versus fully congruent multisensory stimuli, with these enhancements greatest on fully incongruent trials with the slowest RTs. In addition, on the slowest-response partially incongruent trials, the P2 sensory component to the visual probes was larger contralateral to the preceding incongruent visual stimulus. These data suggest that, in response to conflicting multisensory stimulus input, the initial cognitive effect is a capture of attention by the incongruent irrelevant-modality input, pulling neural processing resources toward that modality, resulting in rapid enhancement, rather than rapid suppression, of that input.

  14. Processing Complex Sounds Passing through the Rostral Brainstem: The New Early Filter Model

    PubMed Central

    Marsh, John E.; Campbell, Tom A.

    2016-01-01

    The rostral brainstem receives both “bottom-up” input from the ascending auditory system and “top-down” descending corticofugal connections. Speech information passing through the inferior colliculus of elderly listeners reflects the periodicity envelope of a speech syllable. This information arguably also reflects a composite of temporal-fine-structure (TFS) information from the higher frequency vowel harmonics of that repeated syllable. The amplitude of those higher frequency harmonics, bearing even higher frequency TFS information, correlates positively with the word recognition ability of elderly listeners under reverberatory conditions. Also relevant is that working memory capacity (WMC), which is subject to age-related decline, constrains the processing of sounds at the level of the brainstem. Turning to the effects of a visually presented sensory or memory load on auditory processes, there is a load-dependent reduction of that processing, as manifest in the auditory brainstem responses (ABR) evoked by to-be-ignored clicks. Wave V decreases in amplitude with increases in the visually presented memory load. A visually presented sensory load also produces a load-dependent reduction of a slightly different sort: The sensory load of visually presented information limits the disruptive effects of background sound upon working memory performance. A new early filter model is thus advanced whereby systems within the frontal lobe (affected by sensory or memory load) cholinergically influence top-down corticofugal connections. Those corticofugal connections constrain the processing of complex sounds such as speech at the level of the brainstem. Selective attention thereby limits the distracting effects of background sound entering the higher auditory system via the inferior colliculus. Processing TFS in the brainstem relates to perception of speech under adverse conditions. Attentional selectivity is crucial when the signal heard is degraded or masked: e.g., speech in noise, speech in reverberatory environments. The assumptions of a new early filter model are consistent with these findings: A subcortical early filter, with a predictive selectivity based on acoustical (linguistic) context and foreknowledge, is under cholinergic top-down control. A prefrontal capacity limitation constrains this top-down control, which is guided by the cholinergic processing of contextual information in working memory. PMID:27242396

  15. Processing Complex Sounds Passing through the Rostral Brainstem: The New Early Filter Model.

    PubMed

    Marsh, John E; Campbell, Tom A

    2016-01-01

    The rostral brainstem receives both "bottom-up" input from the ascending auditory system and "top-down" descending corticofugal connections. Speech information passing through the inferior colliculus of elderly listeners reflects the periodicity envelope of a speech syllable. This information arguably also reflects a composite of temporal-fine-structure (TFS) information from the higher frequency vowel harmonics of that repeated syllable. The amplitude of those higher frequency harmonics, bearing even higher frequency TFS information, correlates positively with the word recognition ability of elderly listeners under reverberatory conditions. Also relevant is that working memory capacity (WMC), which is subject to age-related decline, constrains the processing of sounds at the level of the brainstem. Turning to the effects of a visually presented sensory or memory load on auditory processes, there is a load-dependent reduction of that processing, as manifest in the auditory brainstem responses (ABR) evoked by to-be-ignored clicks. Wave V decreases in amplitude with increases in the visually presented memory load. A visually presented sensory load also produces a load-dependent reduction of a slightly different sort: The sensory load of visually presented information limits the disruptive effects of background sound upon working memory performance. A new early filter model is thus advanced whereby systems within the frontal lobe (affected by sensory or memory load) cholinergically influence top-down corticofugal connections. Those corticofugal connections constrain the processing of complex sounds such as speech at the level of the brainstem. Selective attention thereby limits the distracting effects of background sound entering the higher auditory system via the inferior colliculus. Processing TFS in the brainstem relates to perception of speech under adverse conditions. Attentional selectivity is crucial when the signal heard is degraded or masked: e.g., speech in noise, speech in reverberatory environments. The assumptions of a new early filter model are consistent with these findings: A subcortical early filter, with a predictive selectivity based on acoustical (linguistic) context and foreknowledge, is under cholinergic top-down control. A prefrontal capacity limitation constrains this top-down control, which is guided by the cholinergic processing of contextual information in working memory.

  16. Effects of Attention and Laterality on Motion and Orientation Discrimination in Deaf Signers

    ERIC Educational Resources Information Center

    Bosworth, Rain G.; Petrich, Jennifer A. F.; Dobkins, Karen R.

    2013-01-01

    Previous studies have asked whether visual sensitivity and attentional processing in deaf signers are enhanced or altered as a result of their different sensory experiences during development, i.e., auditory deprivation and exposure to a visual language. In particular, deaf and hearing signers have been shown to exhibit a right visual field/left…

  17. Multichannel brain recordings in behaving Drosophila reveal oscillatory activity and local coherence in response to sensory stimulation and circuit activation

    PubMed Central

    Paulk, Angelique C.; Zhou, Yanqiong; Stratton, Peter; Liu, Li

    2013-01-01

    Neural networks in vertebrates exhibit endogenous oscillations that have been associated with functions ranging from sensory processing to locomotion. It remains unclear whether oscillations may play a similar role in the insect brain. We describe a novel “whole brain” readout for Drosophila melanogaster using a simple multichannel recording preparation to study electrical activity across the brain of flies exposed to different sensory stimuli. We recorded local field potential (LFP) activity from >2,000 registered recording sites across the fly brain in >200 wild-type and transgenic animals to uncover specific LFP frequency bands that correlate with: 1) brain region; 2) sensory modality (olfactory, visual, or mechanosensory); and 3) activity in specific neural circuits. We found endogenous and stimulus-specific oscillations throughout the fly brain. Central (higher-order) brain regions exhibited sensory modality-specific increases in power within narrow frequency bands. Conversely, in sensory brain regions such as the optic or antennal lobes, LFP coherence, rather than power, best defined sensory responses across modalities. By transiently activating specific circuits via expression of TrpA1, we found that several circuits in the fly brain modulate LFP power and coherence across brain regions and frequency domains. However, activation of a neuromodulatory octopaminergic circuit specifically increased neuronal coherence in the optic lobes during visual stimulation while decreasing coherence in central brain regions. Our multichannel recording and brain registration approach provides an effective way to track activity simultaneously across the fly brain in vivo, allowing investigation of functional roles for oscillations in processing sensory stimuli and modulating behavior. PMID:23864378
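The abstract's distinction between spectral power (where a channel's energy sits) and coherence (how consistently two channels are phase-locked per frequency band) can be illustrated in a few lines. The sketch below is not the authors' analysis pipeline: it uses hypothetical two-channel "LFP" data (a shared 30 Hz oscillation plus independent noise) and SciPy's standard spectral estimators.

```python
import numpy as np
from scipy.signal import welch, coherence

fs = 1000.0                       # sampling rate in Hz (assumed for illustration)
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(0)

# Two hypothetical LFP channels: shared 30 Hz rhythm + channel-specific noise
shared = np.sin(2 * np.pi * 30 * t)
ch1 = shared + 0.5 * rng.standard_normal(t.size)
ch2 = shared + 0.5 * rng.standard_normal(t.size)

# Power spectrum of one channel (Welch's method): energy per frequency
f, pxx = welch(ch1, fs=fs, nperseg=512)

# Magnitude-squared coherence between channels: consistency of coupling per band
f2, cxy = coherence(ch1, ch2, fs=fs, nperseg=512)

peak = cxy[np.argmin(np.abs(f2 - 30.0))]
print(f"coherence near 30 Hz: {peak:.2f}")
```

Because the two channels share only the 30 Hz component, coherence is high in that band and near chance elsewhere, even though each channel's broadband noise contributes power everywhere.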

  18. An autism-associated serotonin transporter variant disrupts multisensory processing.

    PubMed

    Siemann, J K; Muller, C L; Forsberg, C G; Blakely, R D; Veenstra-VanderWeele, J; Wallace, M T

    2017-03-21

    Altered sensory processing is observed in many children with autism spectrum disorder (ASD), with growing evidence that these impairments extend to the integration of information across the different senses (that is, multisensory function). The serotonin system has an important role in sensory development and function, and alterations of serotonergic signaling have been suggested to have a role in ASD. A gain-of-function coding variant in the serotonin transporter (SERT) associates with sensory aversion in humans, and when expressed in mice produces traits associated with ASD, including disruptions in social and communicative function and repetitive behaviors. The current study set out to test whether these mice also exhibit changes in multisensory function when compared with wild-type (WT) animals on the same genetic background. Mice were trained to respond to auditory and visual stimuli independently before being tested under visual, auditory and paired audiovisual (multisensory) conditions. WT mice exhibited significant gains in response accuracy under audiovisual conditions. In contrast, although the SERT mutant animals learned the auditory and visual tasks comparably to WT littermates, they failed to show behavioral gains under multisensory conditions. We believe these results provide the first behavioral evidence of multisensory deficits in a genetic mouse model related to ASD and implicate the serotonin system in multisensory processing and in the multisensory changes seen in ASD.

  19. Colour categories are reflected in sensory stages of colour perception when stimulus issues are resolved.

    PubMed

    Forder, Lewis; He, Xun; Franklin, Anna

    2017-01-01

    Debate exists about the time course of the effect of colour categories on visual processing. We investigated the effect of colour categories for two groups who differed in whether they categorised a blue-green boundary colour as the same- or different-category to a reliably-named blue colour and a reliably-named green colour. Colour differences were equated in just-noticeable differences to be equally discriminable. We analysed event-related potentials for these colours elicited on a passive visual oddball task and investigated the time course of categorical effects on colour processing. Support for category effects was found 100 ms after stimulus onset, and over frontal sites around 250 ms, suggesting that colour naming affects both early sensory and later stages of chromatic processing.

  20. Colour categories are reflected in sensory stages of colour perception when stimulus issues are resolved

    PubMed Central

    He, Xun; Franklin, Anna

    2017-01-01

    Debate exists about the time course of the effect of colour categories on visual processing. We investigated the effect of colour categories for two groups who differed in whether they categorised a blue-green boundary colour as the same- or different-category to a reliably-named blue colour and a reliably-named green colour. Colour differences were equated in just-noticeable differences to be equally discriminable. We analysed event-related potentials for these colours elicited on a passive visual oddball task and investigated the time course of categorical effects on colour processing. Support for category effects was found 100 ms after stimulus onset, and over frontal sites around 250 ms, suggesting that colour naming affects both early sensory and later stages of chromatic processing. PMID:28542426

  1. Memorable Audiovisual Narratives Synchronize Sensory and Supramodal Neural Responses

    PubMed Central

    2016-01-01

    Our brains integrate information across sensory modalities to generate perceptual experiences and form memories. However, it is difficult to determine the conditions under which multisensory stimulation will benefit or hinder the retrieval of everyday experiences. We hypothesized that the determining factor is the reliability of information processing during stimulus presentation, which can be measured through intersubject correlation of stimulus-evoked activity. We therefore presented biographical auditory narratives and visual animations to 72 human subjects visually, auditorily, or combined, while neural activity was recorded using electroencephalography. Memory for the narrated information, contained in the auditory stream, was tested 3 weeks later. While the visual stimulus alone led to no meaningful retrieval, this related stimulus improved memory when it was combined with the story, even when it was temporally incongruent with the audio. Further, individuals with better subsequent memory elicited neural responses during encoding that were more correlated with their peers. Surprisingly, portions of this predictive synchronized activity were present regardless of the sensory modality of the stimulus. These data suggest that the strength of sensory and supramodal activity is predictive of memory performance after 3 weeks, and that neural synchrony may explain the mnemonic benefit of the functionally uninformative visual context observed for these real-world stimuli. PMID:27844062
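Intersubject correlation of the kind described above is commonly computed leave-one-out: each subject's evoked time course is correlated with the average of everyone else's. The sketch below is a minimal illustration on simulated data, not the study's actual EEG analysis.

```python
import numpy as np

def intersubject_correlation(data):
    """Leave-one-out intersubject correlation (ISC).

    data: array of shape (n_subjects, n_timepoints) of stimulus-evoked activity.
    Returns one ISC value per subject: the Pearson correlation between that
    subject's time course and the mean time course of all remaining subjects.
    """
    n_subjects = data.shape[0]
    iscs = np.empty(n_subjects)
    for s in range(n_subjects):
        others_mean = np.delete(data, s, axis=0).mean(axis=0)
        iscs[s] = np.corrcoef(data[s], others_mean)[0, 1]
    return iscs

# Hypothetical data: a shared stimulus-driven signal plus subject-specific noise
rng = np.random.default_rng(1)
shared = rng.standard_normal(500)
subjects = shared + rng.standard_normal((20, 500))
print(intersubject_correlation(subjects).mean())
```

Subjects whose responses track the shared, stimulus-driven component more reliably receive higher ISC values, which is the sense in which better-remembering individuals were "more correlated with their peers."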

  2. The Role of Sensory-Motor Information in Object Recognition: Evidence from Category-Specific Visual Agnosia

    ERIC Educational Resources Information Center

    Wolk, D.A.; Coslett, H.B.; Glosser, G.

    2005-01-01

    The role of sensory-motor representations in object recognition was investigated in experiments involving AD, a patient with mild visual agnosia who was impaired in the recognition of visually presented living as compared to non-living entities. AD named visually presented items for which sensory-motor information was available significantly more…

  3. The effects of neck flexion on cerebral potentials evoked by visual, auditory and somatosensory stimuli and focal brain blood flow in related sensory cortices

    PubMed Central

    2012-01-01

    Background: A flexed neck posture leads to non-specific activation of the brain. Sensory evoked cerebral potentials and focal brain blood flow have been used to evaluate the activation of the sensory cortex. We investigated the effects of a flexed neck posture on the cerebral potentials evoked by visual, auditory and somatosensory stimuli and focal brain blood flow in the related sensory cortices. Methods: Twelve healthy young adults received right visual hemi-field, binaural auditory and left median nerve stimuli while sitting with the neck in a resting and flexed (20° flexion) position. Sensory evoked potentials were recorded from the right occipital region, Cz in accordance with the international 10–20 system, and 2 cm posterior from C4, during visual, auditory and somatosensory stimulations. The oxidative-hemoglobin concentration was measured in the respective sensory cortex using near-infrared spectroscopy. Results: Latencies of the late component of all sensory evoked potentials significantly shortened, and the amplitude of auditory evoked potentials increased when the neck was in a flexed position. Oxidative-hemoglobin concentrations in the left and right visual cortices were higher during visual stimulation in the flexed neck position. The left visual cortex is responsible for receiving the visual information. In addition, oxidative-hemoglobin concentrations in the bilateral auditory cortex during auditory stimulation, and in the right somatosensory cortex during somatosensory stimulation, were higher in the flexed neck position. Conclusions: Visual, auditory and somatosensory pathways were activated by neck flexion. The sensory cortices were selectively activated, reflecting the modalities in sensory projection to the cerebral cortex and inter-hemispheric connections. PMID:23199306

  4. Cross-modality Sharpening of Visual Cortical Processing through Layer 1-Mediated Inhibition and Disinhibition

    PubMed Central

    Ibrahim, Leena A.; Mesik, Lukas; Ji, Xu-ying; Fang, Qi; Li, Hai-fu; Li, Ya-tang; Zingg, Brian; Zhang, Li I.; Tao, Huizhong Whit

    2016-01-01

    Cross-modality interaction in sensory perception is advantageous for animals’ survival. How cortical sensory processing is cross-modally modulated, and what the underlying neural circuits are, remains poorly understood. In mouse primary visual cortex (V1), we discovered that orientation selectivity of layer (L)2/3 but not L4 excitatory neurons was sharpened in the presence of sound or optogenetic activation of projections from primary auditory cortex (A1) to V1. The effect was manifested by decreased average visual responses yet increased responses at the preferred orientation. It was more pronounced at lower visual contrast, and was diminished by suppressing L1 activity. L1 neurons were strongly innervated by A1-V1 axons and excited by sound, while visual responses of L2/3 vasoactive intestinal peptide (VIP) neurons were suppressed by sound, both preferentially at the cell's preferred orientation. These results suggest that the cross-modality modulation is achieved primarily through L1 neuron and L2/3 VIP-cell mediated inhibitory and disinhibitory circuits. PMID:26898778

  5. Brief monocular deprivation as an assay of short-term visual sensory plasticity in schizophrenia - "the binocular effect".

    PubMed

    Foxe, John J; Yeap, Sherlyn; Leavitt, Victoria M

    2013-01-01

    Visual sensory processing deficits are consistently observed in schizophrenia, with clear amplitude reduction of the visual evoked potential (VEP) during the initial 50-150 ms of processing. Similar deficits are seen in unaffected first-degree relatives and drug-naïve first-episode patients, pointing to these deficits as potential endophenotypic markers. Schizophrenia is also associated with deficits in neural plasticity, implicating dysfunction of both glutamatergic and GABAergic systems. Here, we sought to understand the intersection of these two domains, asking whether short-term plasticity during early visual processing is specifically affected in schizophrenia. Brief periods of monocular deprivation (MD) induce relatively rapid changes in the amplitude of the early VEP - i.e., short-term plasticity. Twenty patients and 20 non-psychiatric controls participated. VEPs were recorded during binocular viewing, and were compared to the sum of VEP responses during brief monocular viewing periods (i.e., Left-eye + Right-eye viewing). Under monocular conditions, neurotypical controls exhibited an effect that patients failed to demonstrate. That is, the amplitude of the summed monocular VEPs was robustly greater than the amplitude elicited binocularly during the initial sensory processing period. In patients, this "binocular effect" was absent. Patients were all medicated. Ideally, this study would also include first-episode unmedicated patients. These results suggest that short-term compensatory mechanisms that allow healthy individuals to generate robust VEPs in the context of MD are not effectively activated in patients with schizophrenia. This simple assay may provide a useful biomarker of short-term plasticity in the psychotic disorders and a target endophenotype for therapeutic interventions.

  6. Audio-visual speech cue combination.

    PubMed

    Arnold, Derek H; Tear, Morgan; Schindel, Ryan; Roseboom, Warrick

    2010-04-16

    Different sources of sensory information can interact, often shaping what we think we have seen or heard. This can enhance the precision of perceptual decisions relative to those made on the basis of a single source of information. From a computational perspective, there are multiple reasons why this might happen, and each predicts a different degree of enhanced precision. Relatively slight improvements can arise when perceptual decisions are made on the basis of multiple independent sensory estimates, as opposed to just one. These improvements can arise as a consequence of probability summation. Greater improvements can occur if two initially independent estimates are summated to form a single integrated code, especially if the summation is weighted in accordance with the variance associated with each independent estimate. This form of combination is often described as a Bayesian maximum likelihood estimate. Still greater improvements are possible if the two sources of information are encoded via a common physiological process. Here we show that the provision of simultaneous audio and visual speech cues can result in substantial sensitivity improvements, relative to single sensory modality based decisions. The magnitude of the improvements is greater than can be predicted on the basis of either a Bayesian maximum likelihood estimate or a probability summation. Our data suggest that primary estimates of speech content are determined by a physiological process that takes input from both visual and auditory processing, resulting in greater sensitivity than would be possible if initially independent audio and visual estimates were formed and then subsequently combined.
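The variance-weighted (maximum-likelihood) combination rule contrasted above has a simple closed form: the combined estimate's variance is the product of the single-cue variances divided by their sum, with each cue weighted by the other's variance. Two equally reliable cues can therefore improve precision by at most a factor of √2, the benchmark the authors' data exceed. A minimal sketch of the rule itself (illustrative, not the authors' model fit):

```python
import numpy as np

def mle_combined_sigma(sigma_a, sigma_v):
    """Maximum-likelihood cue combination: combined variance is
    sigma_a^2 * sigma_v^2 / (sigma_a^2 + sigma_v^2), which is always
    at or below the variance of the better single cue."""
    return np.sqrt((sigma_a**2 * sigma_v**2) / (sigma_a**2 + sigma_v**2))

def mle_weights(sigma_a, sigma_v):
    """Reliability weighting: each cue is weighted by the other's variance."""
    w_a = sigma_v**2 / (sigma_a**2 + sigma_v**2)
    return w_a, 1.0 - w_a

# Equal-reliability cues: precision improves by exactly sqrt(2)
print(mle_combined_sigma(1.0, 1.0))   # 1/sqrt(2) ~= 0.707
# Unequal cues: the combined estimate leans on the more reliable one
print(mle_weights(1.0, 0.5))          # auditory weight 0.2, visual weight 0.8
```

Observed sensitivity gains larger than this bound (and larger than probability summation predicts) are what motivate the authors' conclusion that a common physiological process encodes both inputs rather than combining two independent estimates.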

  7. The Simplest Chronoscope V: A Theory of Dual Primary and Secondary Reaction Time Systems.

    PubMed

    Montare, Alberto

    2016-12-01

    Extending work by Montare, visual simple reaction time, choice reaction time, discriminative reaction time, and overall reaction time scores obtained from college students by the simplest chronoscope (a falling meterstick) method were significantly faster as well as significantly less variable than scores of the same individuals from electromechanical reaction timers (machine method). Results supported the existence of dual reaction time systems: an ancient primary reaction time system theoretically activating the V5 parietal area of the dorsal visual stream that evolved to process significantly faster sensory-motor reactions to sudden stimulations arising from environmental objects in motion, and a secondary reaction time system theoretically activating the V4 temporal area of the ventral visual stream that subsequently evolved to process significantly slower sensory-perceptual-motor reactions to sudden stimulations arising from motionless colored objects. © The Author(s) 2016.

  8. Top-down influence on the visual cortex of the blind during sensory substitution.

    PubMed

    Murphy, Matthew C; Nau, Amy C; Fisher, Christopher; Kim, Seong-Gi; Schuman, Joel S; Chan, Kevin C

    2016-01-15

    Visual sensory substitution devices provide a non-surgical and flexible approach to vision rehabilitation in the blind. These devices convert images taken by a camera into cross-modal sensory signals that are presented as a surrogate for direct visual input. While previous work has demonstrated that the visual cortex of blind subjects is recruited during sensory substitution, the cognitive basis of this activation remains incompletely understood. To test the hypothesis that top-down input provides a significant contribution to this activation, we performed functional MRI scanning in 11 blind (7 acquired and 4 congenital) and 11 sighted subjects under two conditions: passive listening of image-encoded soundscapes before sensory substitution training and active interpretation of the same auditory sensory substitution signals after a 10-minute training session. We found that the modulation of visual cortex activity due to active interpretation was significantly stronger in blind than in sighted subjects. In addition, congenitally blind subjects showed stronger task-induced modulation in the visual cortex than acquired blind subjects. In a parallel experiment, we scanned 18 blind (11 acquired and 7 congenital) and 18 sighted subjects at rest to investigate alterations in functional connectivity due to visual deprivation. The results demonstrated that visual cortex connectivity of the blind shifted away from sensory networks and toward known areas of top-down input. Taken together, our data support the model of the brain, including the visual system, as a highly flexible, task-based rather than sensory-based machine. Copyright © 2015 Elsevier Inc. All rights reserved.
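    Image-encoded soundscapes of the kind described here are commonly produced by sweeping the image left to right, mapping pixel row to tone frequency and brightness to loudness. The sketch below illustrates that general scheme; it is an assumption-laden simplification, not the specific device or encoding used in this study.

```python
import math

def image_to_soundscape(image, duration_s=1.0, f_low=500.0, f_high=3000.0,
                        sample_rate=8000):
    """Convert a grayscale image (list of rows, values 0..1) into audio samples.

    Columns map to time (left-to-right sweep), rows to pitch (top = high),
    and pixel brightness to the amplitude of that pixel's tone.
    """
    n_rows, n_cols = len(image), len(image[0])
    n_samples = int(duration_s * sample_rate)
    samples = []
    for s in range(n_samples):
        t = s / sample_rate
        col = min(int(n_cols * t / duration_s), n_cols - 1)  # current sweep column
        acc = 0.0
        for row in range(n_rows):
            freq = f_high - (f_high - f_low) * row / max(n_rows - 1, 1)
            acc += image[row][col] * math.sin(2 * math.pi * freq * t)
        samples.append(acc / n_rows)  # normalize by number of simultaneous tones
    return samples

# A single bright pixel in the top-left: a high tone during the first half
# of the sweep, then silence.
img = [[1.0, 0.0],
       [0.0, 0.0]]
audio = image_to_soundscape(img, duration_s=0.1)
```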

  9. The Role of Working Memory in the Probabilistic Inference of Future Sensory Events.

    PubMed

    Cashdollar, Nathan; Ruhnau, Philipp; Weisz, Nathan; Hasson, Uri

    2017-05-01

    The ability to represent the emerging regularity of sensory information from the external environment has been thought to allow one to probabilistically infer future sensory occurrences and thus optimize behavior. However, the underlying neural implementation of this process is still not comprehensively understood. Through a convergence of behavioral and neurophysiological evidence, we establish that the probabilistic inference of future events is critically linked to people's ability to maintain the recent past in working memory. Magnetoencephalography recordings demonstrated that when visual stimuli occurring over an extended time series had a greater statistical regularity, individuals with higher working-memory capacity (WMC) displayed enhanced slow-wave neural oscillations in the θ frequency band (4-8 Hz) prior to, but not during, stimulus appearance. This prestimulus neural activity was specifically linked to contexts where information could be anticipated and influenced the preferential sensory processing for this visual information after its appearance. A separate behavioral study demonstrated that this process intrinsically emerges during continuous perception and underpins a realistic advantage for efficient behavioral responses. In this way, WMC optimizes the anticipation of higher level semantic concepts expected to occur in the near future. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  10. System identification and sensorimotor determinants of flight maneuvers in an insect

    NASA Astrophysics Data System (ADS)

    Sponberg, Simon; Hall, Robert; Roth, Eatai

    Locomotor maneuvers are inherently closed-loop processes. They are generally characterized by the integration of multiple sensory inputs and adaptation or learning over time. To probe sensorimotor processing, we take a system identification approach, treating the underlying physiological systems as dynamic processes and altering the feedback topology in experiment and analysis. As a model system, we use agile hawk moths (Manduca sexta), which feed from real and robotic flowers while hovering in mid-air. Moths rely on vision and mechanosensation to track floral targets and can do so at exceptionally low luminance levels, despite hovering being a mechanically unstable behavior that requires neural feedback to stabilize. By altering the sensory environment and placing mechanical and visual signals in conflict, we show that a surprisingly simple linear summation of visual and mechanosensory feedback produces a generative prediction of behavior to novel stimuli. Tracking performance is also limited more by the mechanics of flight than by the magnitude of the sensory cue. A feedback systems approach to locomotor control results in new insights into how behavior emerges from the interaction of nonlinear physiological systems.
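    A linear-summation model of this kind can be sketched in the frequency domain: if visual-only and mechanosensory-only experiments yield complex frequency responses H_v and H_m (gain and phase at one stimulus frequency), the model predicts the cue-conflict response as their weighted sum. The transfer-function values below are invented for illustration, not fitted moth data.

```python
import cmath

def predict_response(H_v, H_m, visual_stim, mech_stim):
    """Linear-summation model: the tracking response at a given frequency is
    the sum of each sensory pathway's transfer function times its input.
    All quantities are complex numbers encoding gain and phase."""
    return H_v * visual_stim + H_m * mech_stim

# Hypothetical single-frequency transfer functions (gain * phase lag).
H_v = 0.8 * cmath.exp(-1j * 0.2)  # visual pathway
H_m = 0.3 * cmath.exp(-1j * 0.5)  # mechanosensory pathway

# Cue conflict: visual and mechanical signals move in opposite directions,
# so the predicted response is the difference of the pathway outputs.
r = predict_response(H_v, H_m, 1.0, -1.0)
```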

  11. Stronger Neural Modulation by Visual Motion Intensity in Autism Spectrum Disorders

    PubMed Central

    Peiker, Ina; Schneider, Till R.; Milne, Elizabeth; Schöttle, Daniel; Vogeley, Kai; Münchau, Alexander; Schunke, Odette; Siegel, Markus; Engel, Andreas K.; David, Nicole

    2015-01-01

    Theories of autism spectrum disorders (ASD) have focused on altered perceptual integration of sensory features as a possible core deficit. Yet, there is little understanding of the neuronal processing of elementary sensory features in ASD. For typically developed individuals, we previously established a direct link between frequency-specific neural activity and the intensity of a specific sensory feature: gamma-band activity in the visual cortex increased approximately linearly with the strength of visual motion. Using magnetoencephalography (MEG), we investigated whether in individuals with ASD neural activity reflects the coherence, and thus intensity, of visual motion in a similar fashion. Thirteen adult participants with ASD and 14 control participants performed a motion direction discrimination task with increasing levels of motion coherence. A polynomial regression analysis revealed that gamma-band power increased significantly more steeply with motion coherence in ASD than in controls, suggesting excessive visual activation with increasing stimulus intensity originating from motion-responsive visual areas V3, V6, and hMT/V5. Enhanced neural responses with increasing stimulus intensity suggest an enhanced response gain in ASD. Response gain is controlled by excitatory-inhibitory interactions, which also drive high-frequency oscillations in the gamma-band. Thus, our data suggest that a disturbed excitatory-inhibitory balance underlies enhanced neural responses to coherent motion in ASD. PMID:26147342
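    The group contrast here comes down to fitting power as a polynomial function of motion coherence and comparing the fitted slopes. A sketch of that analysis with a first-order fit; the power values are invented for illustration, not the study's MEG data.

```python
import numpy as np

coherence = np.array([0.0, 0.25, 0.5, 0.75, 1.0])  # motion coherence levels

# Hypothetical gamma-band power (arbitrary units): a steeper rise in ASD.
power_control = np.array([1.0, 1.2, 1.45, 1.6, 1.85])
power_asd     = np.array([1.0, 1.5, 2.1, 2.6, 3.2])

# First-order polynomial fit; the leading coefficient is the slope, i.e.
# how strongly gamma power is modulated by stimulus intensity.
slope_control = np.polyfit(coherence, power_control, 1)[0]
slope_asd = np.polyfit(coherence, power_asd, 1)[0]
```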

  12. Areas activated during naturalistic reading comprehension overlap topological visual, auditory, and somatomotor maps

    PubMed Central

    2016-01-01

    Abstract Cortical mapping techniques using fMRI have been instrumental in identifying the boundaries of topological (neighbor‐preserving) maps in early sensory areas. The presence of topological maps beyond early sensory areas raises the possibility that they might play a significant role in other cognitive systems, and that topological mapping might help to delineate areas involved in higher cognitive processes. In this study, we combine surface‐based visual, auditory, and somatomotor mapping methods with a naturalistic reading comprehension task in the same group of subjects to provide a qualitative and quantitative assessment of the cortical overlap between sensory‐motor maps in all major sensory modalities, and reading processing regions. Our results suggest that cortical activation during naturalistic reading comprehension overlaps more extensively with topological sensory‐motor maps than has been heretofore appreciated. Reading activation in regions adjacent to occipital lobe and inferior parietal lobe almost completely overlaps visual maps, whereas a significant portion of frontal activation for reading in dorsolateral and ventral prefrontal cortex overlaps both visual and auditory maps. Even classical language regions in superior temporal cortex are partially overlapped by topological visual and auditory maps. By contrast, the main overlap with somatomotor maps is restricted to a small region on the anterior bank of the central sulcus near the border between the face and hand representations of M‐I. Hum Brain Mapp 37:2784–2810, 2016. © 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc. PMID:27061771

  13. Impaired downregulation of visual cortex during auditory processing is associated with autism symptomatology in children and adolescents with autism spectrum disorder.

    PubMed

    Jao Keehn, R Joanne; Sanchez, Sandra S; Stewart, Claire R; Zhao, Weiqi; Grenesko-Stevens, Emily L; Keehn, Brandon; Müller, Ralph-Axel

    2017-01-01

    Autism spectrum disorders (ASD) are pervasive developmental disorders characterized by impairments in language development and social interaction, along with restricted and stereotyped behaviors. These behaviors often include atypical responses to sensory stimuli; some children with ASD are easily overwhelmed by sensory stimuli, while others may seem unaware of their environment. Vision and audition are two sensory modalities important for social interactions and language, and are differentially affected in ASD. In the present study, 16 children and adolescents with ASD and 16 typically developing (TD) participants matched for age, gender, nonverbal IQ, and handedness were tested using a mixed event-related/blocked functional magnetic resonance imaging paradigm to examine basic perceptual processes that may form the foundation for later-developing cognitive abilities. Auditory (high or low pitch) and visual conditions (dot located high or low in the display) were presented, and participants indicated whether the stimuli were "high" or "low." Results for the auditory condition showed downregulated activity of the visual cortex in the TD group, but upregulation in the ASD group. This atypical activity in visual cortex was associated with autism symptomatology. These findings suggest atypical crossmodal (auditory-visual) modulation linked to sociocommunicative deficits in ASD, in agreement with the general hypothesis of low-level sensorimotor impairments affecting core symptomatology. Autism Res 2017, 10: 130-143. © 2016 International Society for Autism Research, Wiley Periodicals, Inc.

  14. Parallel pathways from whisker and visual sensory cortices to distinct frontal regions of mouse neocortex

    PubMed Central

    Sreenivasan, Varun; Kyriakatos, Alexandros; Mateo, Celine; Jaeger, Dieter; Petersen, Carl C.H.

    2016-01-01

    Abstract. The spatial organization of mouse frontal cortex is poorly understood. Here, we used voltage-sensitive dye to image electrical activity in the dorsal cortex of awake head-restrained mice. Whisker-deflection evoked the earliest sensory response in a localized region of primary somatosensory cortex and visual stimulation evoked the earliest responses in a localized region of primary visual cortex. Over the next milliseconds, the initial sensory response spread within the respective primary sensory cortex and into the surrounding higher order sensory cortices. In addition, secondary hotspots in the frontal cortex were evoked by whisker and visual stimulation, with the frontal hotspot for whisker deflection being more anterior and lateral compared to the frontal hotspot evoked by visual stimulation. Investigating axonal projections, we found that the somatosensory whisker cortex and the visual cortex directly innervated frontal cortex, with visual cortex axons innervating a region medial and posterior to the innervation from somatosensory cortex, consistent with the location of sensory responses in frontal cortex. In turn, the axonal outputs of these two frontal cortical areas innervate distinct regions of striatum, superior colliculus, and brainstem. Sensory input, therefore, appears to map onto modality-specific regions of frontal cortex, perhaps participating in distinct sensorimotor transformations, and directing distinct motor outputs. PMID:27921067

  15. A unified 3D default space consciousness model combining neurological and physiological processes that underlie conscious experience

    PubMed Central

    Jerath, Ravinder; Crawford, Molly W.; Barnes, Vernon A.

    2015-01-01

    Global Workspace Theory and Information Integration Theory are two of the most widely accepted current models of consciousness; however, these models do not address many aspects of conscious experience. We compare these models to our previously proposed consciousness model, in which the thalamus fills in processed sensory information from corticothalamic feedback loops within a proposed 3D default space, resulting in the recreation of the internal and external worlds within the mind. This 3D default space is composed of all cells of the body, which communicate via gap junctions and electrical potentials to create this unified space. We use 3D illustrations to explain how both visual and non-visual sensory information may be filled in within this dynamic space, creating a unified, seamless conscious experience. This neural sensory memory space is likely generated by baseline neural oscillatory activity from the default mode network, other salient networks, the brainstem, and the reticular activating system. PMID:26379573

  16. Electrophysiological Correlates of Automatic Visual Change Detection in School-Age Children

    ERIC Educational Resources Information Center

    Clery, Helen; Roux, Sylvie; Besle, Julien; Giard, Marie-Helene; Bruneau, Nicole; Gomot, Marie

    2012-01-01

    Automatic stimulus-change detection is usually investigated in the auditory modality by studying Mismatch Negativity (MMN). Although the change-detection process occurs in all sensory modalities, little is known about visual deviance detection, particularly regarding the development of this brain function throughout childhood. The aim of the…

  17. Stereotyped Movements among Children Who Are Visually Impaired

    ERIC Educational Resources Information Center

    Gal, Eynat; Dyck, Murray J.

    2009-01-01

    Does the severity of visual impairment affect the prevalence and severity of stereotyped movements? In this study, children who were blind or had low vision, half of whom had intellectual disabilities, were assessed. The results revealed that blindness and global delays were associated with more sensory processing dysfunction and more stereotyped…

  18. Sensory atypicalities in dyads of children with autism spectrum disorder (ASD) and their parents.

    PubMed

    Glod, Magdalena; Riby, Deborah M; Honey, Emma; Rodgers, Jacqui

    2017-03-01

    Sensory atypicalities are a common feature of autism spectrum disorder (ASD). To date, the relationship between sensory atypicalities in dyads of children with ASD and their parents has not been investigated. Exploring these relationships can contribute to an understanding of how phenotypic profiles may be inherited, and the extent to which familial factors might contribute towards children's sensory profiles and constitute an aspect of the broader autism phenotype (BAP). Parents of 44 children with ASD and 30 typically developing (TD) children, aged between 3 and 14 years, participated. Information about children's sensory experiences was collected through parent report using the Sensory Profile questionnaire. Information about parental sensory experiences was collected via self-report using the Adolescent/Adult Sensory Profile. Parents of children with ASD had significantly higher scores than parents of TD children in relation to low registration, over responsivity, and taste/smell sensory processing. Similar levels of agreement were obtained within ASD and TD parent-child dyads on a number of sensory atypicalities; nevertheless significant correlations were found between parents and children in ASD families but not TD dyads for sensation avoiding and auditory, visual, and vestibular sensory processing. The findings suggest that there are similarities in sensory processing profiles between parents and their children in both ASD and TD dyads. Familial sensory processing factors are likely to contribute towards the BAP. Further work is needed to explore genetic and environmental influences on the developmental pathways of the sensory atypicalities in ASD. Autism Res 2017, 10: 531-538. © 2016 International Society for Autism Research, Wiley Periodicals, Inc.

  19. Parallel perceptual enhancement and hierarchic relevance evaluation in an audio-visual conjunction task.

    PubMed

    Potts, Geoffrey F; Wood, Susan M; Kothmann, Delia; Martin, Laura E

    2008-10-21

    Attention directs limited-capacity information processing resources to a subset of available perceptual representations. The mechanisms by which attention selects task-relevant representations for preferential processing are not fully known. Treisman and Gelade's [Treisman, A., Gelade, G., 1980. A feature-integration theory of attention. Cognit. Psychol. 12, 97-136.] influential attention model posits that simple features are processed preattentively, in parallel, but that attention is required to serially conjoin multiple features into an object representation. Event-related potentials have provided evidence for this model, showing parallel processing of perceptual features in the posterior Selection Negativity (SN) and serial, hierarchic processing of feature conjunctions in the Frontal Selection Positivity (FSP). Most prior studies have examined conjunctions within one sensory modality, while many real-world objects have multimodal features. It is not known whether the same neural systems of posterior parallel processing of simple features and frontal serial processing of feature conjunctions seen within a sensory modality also operate on conjunctions between modalities. The current study used ERPs and simultaneously presented auditory and visual stimuli in three task conditions: Attend Auditory (auditory feature determines the target, visual features are irrelevant), Attend Visual (visual features relevant, auditory irrelevant), and Attend Conjunction (target defined by the co-occurrence of an auditory and a visual feature). In the Attend Conjunction condition, when the auditory but not the visual feature was a target there was an SN over auditory cortex; when the visual but not auditory stimulus was a target there was an SN over visual cortex; and when both auditory and visual stimuli were targets (i.e., conjunction target) there were SNs over both auditory and visual cortex, indicating parallel processing of the simple features within each modality. In contrast, an FSP was present when either the visual only or both auditory and visual features were targets, but not when only the auditory stimulus was a target, indicating that the conjunction target determination was evaluated serially and hierarchically, with visual information taking precedence. This indicates that the detection of a target defined by audio-visual conjunction is achieved via the same mechanism as within a single perceptual modality: through separate, parallel processing of the auditory and visual features and serial processing of the feature conjunction elements, rather than by evaluation of a fused multimodal percept.

  20. Weighted integration of short-term memory and sensory signals in the oculomotor system.

    PubMed

    Deravet, Nicolas; Blohm, Gunnar; de Xivry, Jean-Jacques Orban; Lefèvre, Philippe

    2018-05-01

    Oculomotor behaviors integrate sensory and prior information to overcome sensory-motor delays and noise. After much debate about this process, reliability-based integration has recently been proposed, and several models of smooth pursuit now include recurrent Bayesian integration or Kalman filtering. However, there is a lack of behavioral evidence in humans supporting these theoretical predictions. Here, we independently manipulated the reliability of visual and prior information in a smooth pursuit task. Our results show that both smooth pursuit eye velocity and catch-up saccade amplitude were modulated by the reliability of visual and prior information. We interpret these findings as the continuous reliability-based integration of a short-term memory of target motion with visual information, which supports this modeling work. Furthermore, we suggest that the saccadic and pursuit systems share this short-term memory. We propose that this short-term memory of target motion is quickly built and continuously updated, and constitutes a general building block present in all sensorimotor systems.
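    Reliability-based integration of a target-motion memory with the visual signal can be written as a precision-weighted average, the same update a Kalman filter performs at steady state. A minimal sketch; the velocity and noise values are hypothetical, not the study's data.

```python
def integrate(prior_mean, prior_sigma, visual_mean, visual_sigma):
    """Precision-weighted fusion of a short-term memory of target motion
    (the prior) with the current visual measurement. Returns the fused
    estimate and its standard deviation."""
    wp = 1.0 / prior_sigma**2   # reliability (precision) of the prior
    wv = 1.0 / visual_sigma**2  # reliability of the visual signal
    mean = (wp * prior_mean + wv * visual_mean) / (wp + wv)
    sigma = (1.0 / (wp + wv)) ** 0.5
    return mean, sigma

# Reliable vision (low sigma): the estimate tracks the visual target speed.
m_clear, s_clear = integrate(10.0, 2.0, 16.0, 0.5)
# Degraded vision (high sigma): the estimate falls back toward the prior.
m_blur, s_blur = integrate(10.0, 2.0, 16.0, 4.0)
```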

  1. Visual Cortex Plasticity: A Complex Interplay of Genetic and Environmental Influences

    PubMed Central

    Maya-Vetencourt, José Fernando; Origlia, Nicola

    2012-01-01

    The central nervous system architecture is highly dynamic and continuously modified by sensory experience through processes of neuronal plasticity. Plasticity is achieved by a complex interplay of environmental influences and physiological mechanisms that ultimately activate intracellular signal transduction pathways regulating gene expression. In addition to the remarkable variety of transcription factors and their combinatorial interaction at specific gene promoters, epigenetic mechanisms that regulate transcription have emerged as conserved processes by which the nervous system accomplishes the induction of plasticity. Experience-dependent changes of DNA methylation patterns and histone posttranslational modifications are, in fact, recruited as targets of plasticity-associated signal transduction mechanisms. Here, we shall concentrate on structural and functional consequences of early sensory deprivation in the visual system and discuss how intracellular signal transduction pathways associated with experience regulate changes of chromatin structure and gene expression patterns that underlie these plastic phenomena. Recent experimental evidence for mechanisms of cross-modal plasticity following congenital or acquired sensory deprivation both in human and animal models will be considered as well. We shall also review different experimental strategies that can be used to achieve the recovery of sensory functions after long-term deprivation in humans. PMID:22852098

  2. Multisensory integration processing during olfactory-visual stimulation-An fMRI graph theoretical network analysis.

    PubMed

    Ripp, Isabelle; Zur Nieden, Anna-Nora; Blankenagel, Sonja; Franzmeier, Nicolai; Lundström, Johan N; Freiherr, Jessica

    2018-05-07

    In this study, we aimed to understand how whole-brain neural networks compute sensory information integration, based on the olfactory and visual systems. Task-related functional magnetic resonance imaging (fMRI) data were obtained during unimodal and bimodal sensory stimulation. Based on the identification of multisensory integration processing (MIP)-specific hub-like network nodes, analyzed with network-based statistics using region-of-interest-based connectivity matrices, we conclude that the following brain areas are important for processing the presented bimodal sensory information: the right precuneus, connected contralaterally to the supramarginal gyrus, for memory-related imagery and phonology retrieval; and the left middle occipital gyrus, connected ipsilaterally to the inferior frontal gyrus via the inferior fronto-occipital fasciculus, including functional aspects of working memory. Applying graph theory to quantify the resulting complex network topologies indicates significantly increased global efficiency and clustering coefficients in networks including aspects of MIP, reflecting simultaneously better integration and segregation. Graph theoretical analysis of positive and negative network correlations, which allows inferences about excitatory and inhibitory network architectures, revealed a consistent, though not statistically significant, pattern: MIP-specific neural networks are dominated by inhibitory relationships between brain regions involved in stimulus processing. © 2018 Wiley Periodicals, Inc.
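    The two graph metrics reported here are standard: global efficiency (integration) is the mean inverse shortest-path length over node pairs, and the clustering coefficient (segregation) is the fraction of a node's neighbour pairs that are themselves connected. A stdlib-only sketch on a toy network, not the study's connectivity data:

```python
from collections import deque

def shortest_path_lengths(adj, src):
    """Breadth-first search distances from src in an unweighted graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def global_efficiency(adj):
    """Mean inverse shortest-path length over all ordered node pairs."""
    nodes = list(adj)
    n = len(nodes)
    total = 0.0
    for u in nodes:
        d = shortest_path_lengths(adj, u)
        total += sum(1.0 / d[v] for v in nodes if v != u and v in d)
    return total / (n * (n - 1))

def clustering_coefficient(adj, u):
    """Fraction of u's neighbour pairs that are directly connected."""
    nbrs = list(adj[u])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2.0 * links / (k * (k - 1))

# Toy network: a triangle is maximally clustered and maximally efficient.
triangle = {'A': {'B', 'C'}, 'B': {'A', 'C'}, 'C': {'A', 'B'}}
```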

  3. A biologically inspired neural model for visual and proprioceptive integration including sensory training.

    PubMed

    Saidi, Maryam; Towhidkhah, Farzad; Gharibzadeh, Shahriar; Lari, Abdolaziz Azizi

    2013-12-01

    Humans perceive the surrounding world by integrating information from different sensory modalities. Earlier models of multisensory integration rely mainly on traditional Bayesian inference for a single cause (source) and causal Bayesian inference for two causes (e.g., two senses such as vision and audition). In this paper, a new recurrent neural model is presented for the integration of visual and proprioceptive information. This model is based on population coding, which is able to mimic multisensory integration in neural centers of the human brain. The simulation results agree with those achieved by causal Bayesian inference. The model can also simulate the sensory training process for visual and proprioceptive information in humans. The training process in multisensory integration has received little attention in the literature. The effect of proprioceptive training on multisensory perception was investigated through a set of experiments in our previous study. The current study evaluates the effect of both modalities, i.e., visual and proprioceptive training, and compares them with each other through a set of new experiments. In these experiments, the subject was asked to move his/her hand in a circle and estimate its position. The experiments were performed on eight subjects with proprioceptive training and eight subjects with visual training. Results of the experiments show three important points: (1) the visual learning rate is significantly higher than that of proprioception; (2) mean visual and proprioceptive errors both decrease with training, but statistical analysis shows that this decrease is significant for proprioceptive error and non-significant for visual error; and (3) visual errors in the training phase, even at its beginning, are much smaller than errors in the main test stage, because in the main test the subject has to focus on two senses. The results of the experiments in this paper are in agreement with the results of the neural model simulation.
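    Population coding, the mechanism this model is built on, represents a stimulus by graded activity across neurons with overlapping tuning curves; a simple readout is the activity-weighted average of the neurons' preferred values. A minimal sketch with hypothetical tuning parameters (Gaussian tuning, hand angle in degrees), not the paper's model:

```python
import math

def population_response(x, preferred, width=15.0):
    """Gaussian tuning: each neuron fires most when the hand angle x is
    near its preferred angle, falling off with the tuning width."""
    return [math.exp(-((x - c) ** 2) / (2 * width ** 2)) for c in preferred]

def decode(rates, preferred):
    """Population-vector-style readout: activity-weighted mean of the
    neurons' preferred angles."""
    total = sum(rates)
    return sum(r * c for r, c in zip(rates, preferred)) / total

preferred = [0.0, 30.0, 60.0, 90.0, 120.0, 150.0, 180.0]  # preferred angles
rates = population_response(72.0, preferred)
estimate = decode(rates, preferred)  # close to the true angle of 72 degrees
```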

  4. Sensory and motoric influences on attention dynamics during standing balance recovery in young and older adults.

    PubMed

    Redfern, Mark S; Chambers, April J; Jennings, J Richard; Furman, Joseph M

    2017-08-01

    This study investigated the impact of attention on the sensory and motor actions during postural recovery from underfoot perturbations in young and older adults. A dual-task paradigm was used involving disjunctive and choice reaction time (RT) tasks to auditory and visual stimuli at different delays from the onset of two types of platform perturbations (rotations and translations). The RTs were increased prior to the perturbation (preparation phase) and during the immediate recovery response (response initiation) in young and older adults, but this interference dissipated rapidly after the perturbation response was initiated (<220 ms). The sensory modality of the RT task impacted the results with interference being greater for the auditory task compared to the visual task. As motor complexity of the RT task increased (disjunctive versus choice) there was greater interference from the perturbation. Finally, increasing the complexity of the postural perturbation by mixing the rotational and translational perturbations together increased interference for the auditory RT tasks, but did not affect the visual RT responses. These results suggest that sensory and motoric components of postural control are under the influence of different dynamic attentional processes.

  5. Seeing Your Error Alters My Pointing: Observing Systematic Pointing Errors Induces Sensori-Motor After-Effects

    PubMed Central

    Ronchi, Roberta; Revol, Patrice; Katayama, Masahiro; Rossetti, Yves; Farnè, Alessandro

    2011-01-01

    During the procedure of prism adaptation, subjects execute pointing movements to visual targets under a lateral optical displacement: as a consequence of the discrepancy between visual and proprioceptive inputs, their visuo-motor activity is characterized by pointing errors. The perception of such final errors triggers error-correction processes that eventually result in sensori-motor compensation, opposite to the prismatic displacement (i.e., after-effects). Here we tested whether the mere observation of erroneous pointing movements, similar to those executed during prism adaptation, is sufficient to produce adaptation-like after-effects. Neurotypical participants observed, from a first-person perspective, the examiner's arm making incorrect pointing movements that systematically overshot the visual target location to the right, thus simulating a rightward optical deviation. Three classical after-effect measures (proprioceptive, visual and visual-proprioceptive shift) were recorded before and after first-person perspective observation of pointing errors. Results showed that mere visual exposure to an arm that systematically points to the right side of a target (i.e., without error correction) produces a leftward after-effect, which mostly affects the observer's proprioceptive estimation of her body midline. In addition, being exposed to such a constant visual error induced in the observer the illusion of “feeling” the seen movement. These findings indicate that it is possible to elicit sensori-motor after-effects by mere observation of movement errors. PMID:21731649

  6. Modelling effects on grid cells of sensory input during self‐motion

    PubMed Central

    Raudies, Florian; Hinman, James R.

    2016-01-01

    Abstract The neural coding of spatial location for memory function may involve grid cells in the medial entorhinal cortex, but the mechanism of generating the spatial responses of grid cells remains unclear. This review describes some current theories and experimental data concerning the role of sensory input in generating the regular spatial firing patterns of grid cells, and changes in grid cell firing fields with movement of environmental barriers. As described here, the influence of visual features on spatial firing could involve either computations of self‐motion based on optic flow, or computations of absolute position based on the angle and distance of static visual cues. Due to anatomical selectivity of retinotopic processing, the sensory features on the walls of an environment may have a stronger effect on ventral grid cells that have wider spaced firing fields, whereas the sensory features on the ground plane may influence the firing of dorsal grid cells with narrower spacing between firing fields. These sensory influences could contribute to the potential functional role of grid cells in guiding goal‐directed navigation. PMID:27094096
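    A common idealization of a grid cell's spatial rate map, useful for the kind of modelling this review discusses, is a sum of three cosine gratings oriented 60 degrees apart, producing a hexagonal firing lattice; the spacing parameter corresponds to the dorsal-ventral gradient of field spacing mentioned above. The parameterization is standard in the modelling literature; the constants below are hypothetical.

```python
import math

def grid_firing(x, y, spacing=0.5, orientation=0.0):
    """Idealized grid-cell rate map: three plane waves 60 degrees apart.
    Peaks form a hexagonal lattice with the given spacing (metres).
    The result is normalized to the range [0, 1]."""
    k = 4 * math.pi / (math.sqrt(3) * spacing)  # wavevector magnitude
    total = 0.0
    for i in range(3):
        theta = orientation + i * math.pi / 3
        total += math.cos(k * (x * math.cos(theta) + y * math.sin(theta)))
    # Sum of the three cosines ranges over [-1.5, 3]; rescale to [0, 1].
    return (total + 1.5) / 4.5

peak = grid_firing(0.0, 0.0)  # a firing field sits at the origin
```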

  7. Functional near-infrared spectroscopy (fNIRS) brain imaging of multi-sensory integration during computerized dynamic posturography in middle-aged and older adults.

    PubMed

    Lin, Chia-Cheng; Barker, Jeffrey W; Sparto, Patrick J; Furman, Joseph M; Huppert, Theodore J

    2017-04-01

    Studies suggest that aging affects the sensory re-weighting process, but the neuroimaging evidence is minimal. Functional near-infrared spectroscopy (fNIRS) is a novel neuroimaging tool that can detect brain activity during dynamic movement conditions. In this study, fNIRS was used to investigate the hemodynamic changes in the frontal-lateral, temporal-parietal, and occipital regions of interest (ROIs) during four sensory integration conditions that manipulated visual and somatosensory feedback in 15 middle-aged and 15 older adults. The results showed that the temporal-parietal ROI was activated more when somatosensory and visual information were absent in both groups, which indicated the sole use of vestibular input for maintaining balance. While both older adults and middle-aged adults had greater activity in most brain ROIs during changes in the sensory conditions, the older adults had greater increases in the occipital and frontal-lateral ROIs. These findings suggest a cortical component to sensory re-weighting that is more distributed and requires greater attention in older adults.

  8. [Ventriloquism and audio-visual integration of voice and face].

    PubMed

    Yokosawa, Kazuhiko; Kanaya, Shoko

    2012-07-01

    Presenting synchronous auditory and visual stimuli in separate locations creates the illusion that the sound originates from the direction of the visual stimulus. Participants' auditory localization bias, called the ventriloquism effect, has revealed factors affecting the perceptual integration of audio-visual stimuli. However, many studies on audio-visual processes have focused on performance in simplified experimental situations, with a single stimulus in each sensory modality. These results cannot necessarily explain our perceptual behavior in natural scenes, where various signals exist within a single sensory modality. In the present study we report the contribution of a cognitive factor, namely the audio-visual congruency of speech, a factor that has often been underestimated in previous ventriloquism research. Thus, we investigated the contribution of speech congruency to the ventriloquism effect using a spoken utterance and two videos of a talking face. The salience of facial movements was also manipulated. As a result, when bilateral visual stimuli were presented in synchrony with a single voice, cross-modal speech congruency was found to have a significant impact on the ventriloquism effect. This result also indicated that more salient visual utterances attracted participants' auditory localization. The congruent pairing of audio-visual utterances elicited greater localization bias than did incongruent pairing, whereas previous studies have reported little dependency on the reality of stimuli in ventriloquism. Moreover, audio-visual illusory congruency, owing to the McGurk effect, caused substantial visual interference with auditory localization. This suggests that a greater flexibility in responding to multi-sensory environments exists than has been previously considered.

  9. Sensing Super-position: Visual Instrument Sensor Replacement

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Schipper, John F.

    2006-01-01

    The coming decade of fast, cheap and miniaturized electronics and sensory devices opens new pathways for the development of sophisticated equipment to overcome limitations of the human senses. This project addresses the technical feasibility of augmenting human vision through Sensing Super-position using a Visual Instrument Sensory Organ Replacement (VISOR). The current implementation of the VISOR device translates visual and other passive or active sensory instruments into sounds, which become relevant when the visual resolution is insufficient for very difficult and particular sensing tasks. A successful Sensing Super-position meets many human and pilot-vehicle system requirements. The system can be further developed into a cheap, portable, and low-power device that takes into account the limited capabilities of the human user as well as the typical characteristics of the user's dynamic environment. The system operates in real time, giving the desired information for the particular augmented sensing tasks. The Sensing Super-position device increases perceived image resolution via an auditory representation in addition to the visual representation. Auditory mapping is performed to distribute an image in time. The three-dimensional spatial brightness and multi-spectral maps of a sensed image are processed using real-time image processing techniques (e.g. histogram normalization) and transformed into a two-dimensional map of an audio signal as a function of frequency and time. This paper details the approach of developing Sensing Super-position systems as a way to augment the human vision system by exploiting the capabilities of the human hearing system as an additional neural input. The human hearing system is capable of learning to process and interpret extremely complicated and rapidly changing auditory patterns. The known capabilities of the human hearing system to learn and understand complicated auditory patterns provided the basic motivation for developing an image-to-sound mapping system.
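    The image-to-sound transformation described above (rows mapped to audio frequency, columns distributed over time, brightness mapped to amplitude) can be sketched as follows. This is a minimal illustration, not the VISOR implementation; the frequency range, sample rate, and normalization are assumptions.

```python
import numpy as np

def image_to_soundscape(image, duration=1.0, fs=8000, f_lo=200.0, f_hi=2000.0):
    """Map a 2-D brightness image to audio: columns -> time segments,
    rows -> sinusoid frequencies, pixel brightness -> sinusoid amplitude."""
    n_rows, n_cols = image.shape
    t = np.linspace(0.0, duration, int(fs * duration), endpoint=False)
    freqs = np.linspace(f_hi, f_lo, n_rows)   # top rows get high frequencies
    col_len = len(t) // n_cols
    audio = np.zeros(len(t))
    for c in range(n_cols):
        seg = slice(c * col_len, (c + 1) * col_len)
        ts = t[seg]
        # each column becomes a short chord of brightness-weighted sinusoids
        for r in range(n_rows):
            if image[r, c] > 0:
                audio[seg] += image[r, c] * np.sin(2 * np.pi * freqs[r] * ts)
    peak = np.max(np.abs(audio))
    return audio / peak if peak > 0 else audio
```

    Playing the returned array through a sound device at the chosen sample rate yields a left-to-right "scan" of the image, in the spirit of the auditory mapping the record describes.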

  10. Passive Double-Sensory Evoked Coherence Correlates with Long-Term Memory Capacity.

    PubMed

    Horwitz, Anna; Mortensen, Erik L; Osler, Merete; Fagerlund, Birgitte; Lauritzen, Martin; Benedek, Krisztina

    2017-01-01

    HIGHLIGHTS Memory correlates with the difference between single and double-sensory evoked steady-state coherence in the gamma range (ΔC). The correlation is most pronounced for the anterior brain region (ΔCA). The correlation is not driven by birth size, education, speed of processing, or intelligence. The sensitivity of ΔCA for detecting low memory capacity is 90%. Cerebral rhythmic activity and oscillations are important pathways of communication between cortical cell assemblies and may be key factors in memory. We asked whether memory performance is related to gamma coherence in a non-task sensory steady-state stimulation. We investigated 40 healthy males born in 1953 who were part of a Danish birth cohort study. Coherence was measured in the gamma range in response to a single-sensory visual stimulation (36 Hz) and a double-sensory combined audiovisual stimulation (auditive: 40 Hz; visual: 36 Hz). The individual difference in coherence (ΔC) between the bimodal and monomodal stimulation was calculated for each subject and used as the main explanatory variable. ΔC in the total brain was significantly negatively correlated with long-term verbal recall. This correlation was pronounced for the anterior region. In addition, the correlation between ΔC and long-term memory was robust when controlling for working memory, as well as a wide range of potentially confounding factors, including intelligence, length of education, speed of processing, visual attention and executive function. Moreover, we found that the difference in anterior coherence (ΔCA) is a better predictor of memory than power in multivariate models. The sensitivity of ΔCA for detecting low memory capacity is 92%. Finally, ΔCA was also associated with other types of memory: verbal learning, visual recognition, and spatial memory, and these additional correlations also remained robust when controlling for a range of potentially confounding factors. Thus, ΔC is a predictor of memory performance and may be useful in cognitive neuropsychological testing.
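    The explanatory variable ΔC, the coherence under bimodal stimulation minus the coherence under monomodal stimulation read out at the 36 Hz visual tagging frequency, can be sketched as follows. The channel pairing, sampling rate, and Welch segment length are assumptions, not details from the paper.

```python
import numpy as np
from scipy.signal import coherence

def gamma_coherence(x, y, fs, f_target=36.0, nperseg=256):
    """Magnitude-squared coherence between two EEG channels, read out at
    the steady-state stimulation frequency f_target (Hz)."""
    f, cxy = coherence(x, y, fs=fs, nperseg=nperseg)
    idx = np.argmin(np.abs(f - f_target))   # nearest frequency bin
    return cxy[idx]

def delta_c(x_mono, y_mono, x_bi, y_bi, fs, f_target=36.0):
    """Coherence difference: bimodal (audio-visual) minus monomodal (visual)."""
    return (gamma_coherence(x_bi, y_bi, fs, f_target)
            - gamma_coherence(x_mono, y_mono, fs, f_target))
```

    Per subject, ΔC would be computed for each electrode pair of interest and then averaged over a region (e.g. anterior electrodes for ΔCA).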

  11. Passive Double-Sensory Evoked Coherence Correlates with Long-Term Memory Capacity

    PubMed Central

    Horwitz, Anna; Mortensen, Erik L.; Osler, Merete; Fagerlund, Birgitte; Lauritzen, Martin; Benedek, Krisztina

    2017-01-01

    HIGHLIGHTS Memory correlates with the difference between single and double-sensory evoked steady-state coherence in the gamma range (ΔC). The correlation is most pronounced for the anterior brain region (ΔCA). The correlation is not driven by birth size, education, speed of processing, or intelligence. The sensitivity of ΔCA for detecting low memory capacity is 90%. Cerebral rhythmic activity and oscillations are important pathways of communication between cortical cell assemblies and may be key factors in memory. We asked whether memory performance is related to gamma coherence in a non-task sensory steady-state stimulation. We investigated 40 healthy males born in 1953 who were part of a Danish birth cohort study. Coherence was measured in the gamma range in response to a single-sensory visual stimulation (36 Hz) and a double-sensory combined audiovisual stimulation (auditive: 40 Hz; visual: 36 Hz). The individual difference in coherence (ΔC) between the bimodal and monomodal stimulation was calculated for each subject and used as the main explanatory variable. ΔC in the total brain was significantly negatively correlated with long-term verbal recall. This correlation was pronounced for the anterior region. In addition, the correlation between ΔC and long-term memory was robust when controlling for working memory, as well as a wide range of potentially confounding factors, including intelligence, length of education, speed of processing, visual attention and executive function. Moreover, we found that the difference in anterior coherence (ΔCA) is a better predictor of memory than power in multivariate models. The sensitivity of ΔCA for detecting low memory capacity is 92%. Finally, ΔCA was also associated with other types of memory: verbal learning, visual recognition, and spatial memory, and these additional correlations also remained robust when controlling for a range of potentially confounding factors. Thus, ΔC is a predictor of memory performance and may be useful in cognitive neuropsychological testing. PMID:29311868

  12. Cross-Modal Attention Effects in the Vestibular Cortex during Attentive Tracking of Moving Objects.

    PubMed

    Frank, Sebastian M; Sun, Liwei; Forster, Lisa; Tse, Peter U; Greenlee, Mark W

    2016-12-14

    The midposterior fundus of the Sylvian fissure in the human brain is central to the cortical processing of vestibular cues. At least two vestibular areas are located at this site: the parietoinsular vestibular cortex (PIVC) and the posterior insular cortex (PIC). It is now well established that activity in sensory systems is subject to cross-modal attention effects. Attending to a stimulus in one sensory modality enhances activity in the corresponding cortical sensory system, but simultaneously suppresses activity in other sensory systems. Here, we wanted to probe whether such cross-modal attention effects also target the vestibular system. To this end, we used a visual multiple-object tracking task. By parametrically varying the number of tracked targets, we could measure the effect of attentional load on the PIVC and the PIC while holding the perceptual load constant. Participants performed the tracking task during functional magnetic resonance imaging. Results show that, compared with passive viewing of object motion, activity during object tracking was suppressed in the PIVC and enhanced in the PIC. Greater attentional load, induced by increasing the number of tracked targets, was associated with a corresponding increase in the suppression of activity in the PIVC. Activity in the anterior part of the PIC decreased with increasing load, whereas load effects were absent in the posterior PIC. Results of a control experiment show that attention-induced suppression in the PIVC is stronger than any suppression evoked by the visual stimulus per se. Overall, our results suggest that attention has a cross-modal modulatory effect on the vestibular cortex during visual object tracking. In this study we investigate cross-modal attention effects in the human vestibular cortex. We applied the visual multiple-object tracking task because it is known to evoke attentional load effects on neural activity in visual motion-processing and attention-processing areas. 
Here we demonstrate a load-dependent effect of attention on the activation in the vestibular cortex, despite constant visual motion stimulation. We find that activity in the parietoinsular vestibular cortex is more strongly suppressed the greater the attentional load on the visual tracking task. These findings suggest cross-modal attentional modulation in the vestibular cortex. Copyright © 2016 the authors 0270-6474/16/3612720-09$15.00/0.

  13. Time-resolved neuroimaging of visual short term memory consolidation by post-perceptual attention shifts.

    PubMed

    Hecht, Marcus; Thiemann, Ulf; Freitag, Christine M; Bender, Stephan

    2016-01-15

    Post-perceptual cues can enhance visual short term memory encoding even after the offset of the visual stimulus. However, both the mechanisms by which the sensory stimulus characteristics are buffered as well as the mechanisms by which post-perceptual selective attention enhances short term memory encoding remain unclear. We analyzed late post-perceptual event-related potentials (ERPs) in visual change detection tasks (100 ms stimulus duration) by high-resolution ERP analysis to elucidate these mechanisms. The effects of early and late auditory post-cues (300 ms or 850 ms after visual stimulus onset) as well as the effects of a visual interference stimulus were examined in 27 healthy right-handed adults. Focusing attention with post-perceptual cues at both latencies significantly improved memory performance, i.e. sensory stimulus characteristics were available for up to 850 ms after stimulus presentation. Passive watching of the visual stimuli without auditory cue presentation evoked a slow negative wave (N700) over occipito-temporal visual areas. N700 was strongly reduced by a visual interference stimulus which impeded memory maintenance. In contrast, contralateral delay activity (CDA) still developed in this condition after the application of auditory post-cues and was thereby dissociated from N700. CDA and N700 seem to represent two different processes involved in short term memory encoding. While N700 could reflect visual post-processing by automatic attention attraction, CDA may reflect the top-down process of searching selectively for the required information through post-perceptual attention. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Cognitive processing in the primary visual cortex: from perception to memory.

    PubMed

    Supèr, Hans

    2002-01-01

    The primary visual cortex is the first cortical area of the visual system that receives information from the external visual world. Based on the receptive field characteristics of the neurons in this area, it has been assumed that the primary visual cortex is a pure sensory area extracting basic elements of the visual scene. This information is then subsequently processed further upstream in the higher-order visual areas and provides us with perception and storage of the visual environment. However, recent findings show that neural correlates of such perceptual and mnemonic processes are also observed in the primary visual cortex. These neural correlates are expressed in the modulated activity of the late response of a neuron to a stimulus, and most likely depend on recurrent interactions between several areas of the visual system. This favors the concept of a distributed nature of visual processing in perceptual organization.

  15. Oscillatory encoding of visual stimulus familiarity.

    PubMed

    Kissinger, Samuel T; Pak, Alexandr; Tang, Yu; Masmanidis, Sotiris C; Chubykin, Alexander A

    2018-06-18

    Familiarity of the environment changes the way we perceive and encode incoming information. However, the neural substrates underlying this phenomenon are poorly understood. Here we describe a new form of experience-dependent low frequency oscillations in the primary visual cortex (V1) of awake adult male mice. The oscillations emerged in visually evoked potentials (VEPs) and single-unit activity following repeated visual stimulation. The oscillations were sensitive to the spatial frequency content of a visual stimulus and required the muscarinic acetylcholine receptors (mAChRs) for their induction and expression. Finally, ongoing visually evoked theta (4-6 Hz) oscillations boost the VEP amplitude of incoming visual stimuli if the stimuli are presented at the high excitability phase of the oscillations. Our results demonstrate that an oscillatory code can be used to encode familiarity and serves as a gate for incoming sensory inputs. Significance Statement: Previous experience can influence the processing of incoming sensory information by the brain and alter perception. However, a mechanistic understanding of how this process takes place is lacking. We have discovered that persistent low frequency oscillations in the primary visual cortex encode information about familiarity and the spatial frequency of the stimulus. These familiarity-evoked oscillations influence neuronal responses to incoming stimuli in a way that depends on the oscillation phase. Our work demonstrates a new mechanism of visual stimulus feature detection and learning. Copyright © 2018 the authors.

  16. Nonvisual influences on visual-information processing in the superior colliculus.

    PubMed

    Stein, B E; Jiang, W; Wallace, M T; Stanford, T R

    2001-01-01

    Although visually responsive neurons predominate in the deep layers of the superior colliculus (SC), the majority of them also receive sensory inputs from nonvisual sources (i.e. auditory and/or somatosensory). Most of these 'multisensory' neurons are able to synthesize their cross-modal inputs and, as a consequence, their responses to visual stimuli can be profoundly enhanced or depressed in the presence of a nonvisual cue. Whether response enhancement or response depression is produced by this multisensory interaction is predictable based on several factors. These include: the organization of a neuron's visual and nonvisual receptive fields; the relative spatial relationships of the different stimuli (to their respective receptive fields and to one another); and whether or not the neuron is innervated by a select population of cortical neurons. The response enhancement or depression of SC neurons via multisensory integration has significant survival value via its profound impact on overt attentive/orientation behaviors. Nevertheless, these multisensory processes are not present at birth, and require an extensive period of postnatal maturation. It seems likely that the sensory experiences obtained during this period play an important role in crafting the processes underlying these multisensory interactions.

  17. Fundamental Visual Representations of Social Cognition in ASD

    DTIC Science & Technology

    2016-12-01

    ...visual adaptation functions in Autism, again pointing to basic sensory processing anomalies in this population. Our research team is developing...challenging-to-test ASD pediatric population. SUBJECT TERMS: Autism, Visual Adaptation, Retinotopy, Social Communication, Eye-movements, fMRI, EEG, ERP...social interaction are a hallmark symptom of Autism, and the lack of appropriate eye-contact during interpersonal interactions is an oft-noted feature

  18. Visual motion detection and habitat preference in Anolis lizards.

    PubMed

    Steinberg, David S; Leal, Manuel

    2016-11-01

    The perception of visual stimuli has been a major area of inquiry in sensory ecology, and much of this work has focused on coloration. However, for visually oriented organisms, the process of visual motion detection is often equally crucial to survival and reproduction. Despite the importance of motion detection to many organisms' daily activities, the degree of interspecific variation in the perception of visual motion remains largely unexplored. Furthermore, the factors driving this potential variation (e.g., ecology or evolutionary history) along with the effects of such variation on behavior are unknown. We used a behavioral assay under laboratory conditions to quantify the visual motion detection systems of three species of Puerto Rican Anolis lizard that prefer distinct structural habitat types. We then compared our results to data previously collected for anoles from Cuba, Puerto Rico, and Central America. Our findings indicate that general visual motion detection parameters are similar across species, regardless of habitat preference or evolutionary history. We argue that these conserved sensory properties may drive the evolution of visual communication behavior in this clade.

  19. Multiscale neural connectivity during human sensory processing in the brain

    NASA Astrophysics Data System (ADS)

    Maksimenko, Vladimir A.; Runnova, Anastasia E.; Frolov, Nikita S.; Makarov, Vladimir V.; Nedaivozov, Vladimir; Koronovskii, Alexey A.; Pisarchik, Alexander; Hramov, Alexander E.

    2018-05-01

    Stimulus-related brain activity is considered using wavelet-based analysis of neural interactions between occipital and parietal brain areas in the alpha (8-12 Hz) and beta (15-30 Hz) frequency bands. We show that human sensory processing related to the perception of visual stimuli induces a brain response that manifests as different modes of parieto-occipital interaction in these bands. In the alpha frequency band, the parieto-occipital neuronal network is characterized by a homogeneous increase of the interaction between all interconnected areas, both within the occipital and parietal lobes and between them. In the beta frequency band, the occipital lobe starts to play a leading role in the dynamics of the occipital-parietal network: The perception of visual stimuli excites the visual center in the occipital area and then, due to the increase of parieto-occipital interactions, this excitation is transferred to the parietal area, where the attentional center is located. In the case when stimuli are characterized by a high degree of ambiguity, we find a greater increase of the interaction between interconnected areas in the parietal lobe due to the increase of human attention. Based on the revealed mechanisms, we describe the complex response of the parieto-occipital neuronal network during the perception and primary processing of visual stimuli. The results can serve as an essential complement to the existing theory of the neural aspects of visual stimulus processing.
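    The band-limited wavelet analysis this record relies on can be sketched with a complex Morlet transform. This is a generic band-power illustration, not the authors' specific parieto-occipital interaction measure; the cycle count and frequency grid are assumptions.

```python
import numpy as np

def morlet_power(signal, fs, freqs, n_cycles=6):
    """Wavelet power: convolve the signal with complex Morlet wavelets at
    each frequency and return |amplitude|^2, shape (n_freqs, n_samples)."""
    power = np.empty((len(freqs), len(signal)))
    for i, f in enumerate(freqs):
        sigma_t = n_cycles / (2 * np.pi * f)              # wavelet width (s)
        t = np.arange(-3 * sigma_t, 3 * sigma_t, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
        wavelet /= np.sum(np.abs(wavelet))                # amplitude normalization
        analytic = np.convolve(signal, wavelet, mode="same")
        power[i] = np.abs(analytic) ** 2
    return power

def band_power(signal, fs, band):
    """Mean wavelet power inside a band, e.g. alpha = (8, 12), beta = (15, 30)."""
    freqs = np.arange(band[0], band[1] + 1, 1.0)
    return morlet_power(signal, fs, freqs).mean()
```

    Band-resolved quantities like this, computed per channel and compared across occipital and parietal sites, are the raw material for the interaction measures the abstract describes.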

  20. Validity of Sensory Systems as Distinct Constructs

    PubMed Central

    Su, Chia-Ting

    2014-01-01

    This study investigated the validity of sensory systems as distinct measurable constructs as part of a larger project examining Ayres’s theory of sensory integration. Confirmatory factor analysis (CFA) was conducted to test whether sensory questionnaire items represent distinct sensory system constructs. Data were obtained from clinical records of two age groups, 2- to 5-yr-olds (n = 231) and 6- to 10-yr-olds (n = 223). With each group, we tested several CFA models for goodness of fit with the data. The accepted model was identical for each group and indicated that tactile, vestibular–proprioceptive, visual, and auditory systems form distinct, valid factors that are not age dependent. In contrast, alternative models that grouped items according to sensory processing problems (e.g., over- or underresponsiveness within or across sensory systems) did not yield valid factors. Results indicate that distinct sensory system constructs can be measured validly using questionnaire data. PMID:25184467

  1. Multisensory Integration in Non-Human Primates during a Sensory-Motor Task

    PubMed Central

    Lanz, Florian; Moret, Véronique; Rouiller, Eric Michel; Loquet, Gérard

    2013-01-01

    Daily, our central nervous system receives inputs via several sensory modalities, processes them and integrates the information in order to produce suitable behavior. The remarkable part is that such multisensory integration brings all information into a unified percept. An approach to start investigating this property is to show that perception is better and faster when multimodal stimuli are used as compared to unimodal stimuli. This forms the first part of the present study, conducted in a non-human primate model (n = 2) engaged in a detection sensory-motor task where visual and auditory stimuli were displayed individually or simultaneously. The measured parameters were the reaction time (RT) between stimulus and onset of arm movement, success and error percentages, as well as the evolution of these parameters with training. As expected, RTs were shorter when the subjects were exposed to combined stimuli. The gains for both subjects were around 20 and 40 ms, as compared with the auditory and visual stimulus alone, respectively. Moreover, the number of correct responses increased in response to bimodal stimuli. We interpreted this multisensory advantage through the redundant signal effect, which decreases perceptual ambiguity, increases speed of stimulus detection, and improves performance accuracy. The second part of the study presents single-unit recordings derived from the premotor cortex (PM) of the same subjects during the sensory-motor task. Response patterns to sensory/multisensory stimulation are documented and the proportions of specific response types are reported. Characterization of bimodal neurons indicates a mechanism of audio-visual integration, possibly through a decrease of inhibition. Nevertheless, the neural processing leading to faster motor responses from PM as a polysensory association cortical area remains unclear. PMID:24319421
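    Redundant-signal RT gains like those reported above are commonly checked against Miller's race-model inequality, F_AV(t) <= F_A(t) + F_V(t): if the redundant-target RT distribution beats this bound, statistical facilitation alone cannot explain the speedup. The sketch below is a standard illustration of that test, not an analysis taken from this paper.

```python
import numpy as np

def race_model_violation(rt_av, rt_a, rt_v, t_grid=None):
    """Maximum violation of the race-model bound: positive values indicate
    that bimodal RTs are faster than any race between unimodal channels."""
    rt_av, rt_a, rt_v = map(np.asarray, (rt_av, rt_a, rt_v))
    if t_grid is None:
        t_grid = np.linspace(min(map(np.min, (rt_av, rt_a, rt_v))),
                             max(map(np.max, (rt_av, rt_a, rt_v))), 100)

    def cdf(rts, t):
        # empirical cumulative distribution function of the RTs
        return np.mean(rts[:, None] <= t[None, :], axis=0)

    f_av = cdf(rt_av, t_grid)
    bound = np.minimum(cdf(rt_a, t_grid) + cdf(rt_v, t_grid), 1.0)
    return np.max(f_av - bound)
```

    A positive return value over some time range is the usual evidence for true multisensory integration rather than probability summation.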

  2. Biases in rhythmic sensorimotor coordination: effects of modality and intentionality.

    PubMed

    Debats, Nienke B; Ridderikhoff, Arne; de Boer, Betteco J; Peper, C Lieke E

    2013-08-01

    Sensorimotor biases were examined for intentional (tracking task) and unintentional (distractor task) rhythmic coordination. The tracking task involved unimanual tracking of either an oscillating visual signal or the passive movements of the contralateral hand (proprioceptive signal). In both conditions the required coordination patterns (isodirectional and mirror-symmetric) were defined relative to the body midline and the hands were not visible. For proprioceptive tracking the two patterns did not differ in stability, whereas for visual tracking the isodirectional pattern was performed more stably than the mirror-symmetric pattern. However, when visual feedback about the unimanual hand movements was provided during visual tracking, the isodirectional pattern ceased to be dominant. Together these results indicated that the stability of the coordination patterns did not depend on the modality of the target signal per se, but on the combination of sensory signals that needed to be processed (unimodal vs. cross-modal). The distractor task entailed rhythmic unimanual movements during which a rhythmic visual or proprioceptive distractor signal had to be ignored. The observed biases were similar to those for intentional coordination, suggesting that intentionality did not affect the underlying sensorimotor processes qualitatively. Intentional tracking was characterized by active sensory pursuit, through muscle activity in the passively moved arm (proprioceptive tracking task) and rhythmic eye movements (visual tracking task). Presumably this pursuit afforded predictive information serving the coordination process. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Visual Working Memory Enhances the Neural Response to Matching Visual Input.

    PubMed

    Gayet, Surya; Guggenmos, Matthias; Christophel, Thomas B; Haynes, John-Dylan; Paffen, Chris L E; Van der Stigchel, Stefan; Sterzer, Philipp

    2017-07-12

    Visual working memory (VWM) is used to maintain visual information available for subsequent goal-directed behavior. The content of VWM has been shown to affect the behavioral response to concurrent visual input, suggesting that visual representations originating from VWM and from sensory input draw upon a shared neural substrate (i.e., a sensory recruitment stance on VWM storage). Here, we hypothesized that visual information maintained in VWM would enhance the neural response to concurrent visual input that matches the content of VWM. To test this hypothesis, we measured fMRI BOLD responses to task-irrelevant stimuli acquired from 15 human participants (three males) performing a concurrent delayed match-to-sample task. In this task, observers were sequentially presented with two shape stimuli and a retro-cue indicating which of the two shapes should be memorized for subsequent recognition. During the retention interval, a task-irrelevant shape (the probe) was briefly presented in the peripheral visual field, which could either match or mismatch the shape category of the memorized stimulus. We show that this probe stimulus elicited a stronger BOLD response, and allowed for increased shape-classification performance, when it matched rather than mismatched the concurrently memorized content, despite identical visual stimulation. Our results demonstrate that VWM enhances the neural response to concurrent visual input in a content-specific way. This finding is consistent with the view that neural populations involved in sensory processing are recruited for VWM storage, and it provides a common explanation for a plethora of behavioral studies in which VWM-matching visual input elicits a stronger behavioral and perceptual response. SIGNIFICANCE STATEMENT Humans heavily rely on visual information to interact with their environment and frequently must memorize such information for later use. 
Visual working memory allows for maintaining such visual information in the mind's eye after termination of its retinal input. It is hypothesized that information maintained in visual working memory relies on the same neural populations that process visual input. Accordingly, the content of visual working memory is known to affect our conscious perception of concurrent visual input. Here, we demonstrate for the first time that visual input elicits an enhanced neural response when it matches the content of visual working memory, both in terms of signal strength and information content. Copyright © 2017 the authors 0270-6474/17/376638-10$15.00/0.

  4. Sensory processing during viewing of cinematographic material: Computational modeling and functional neuroimaging

    PubMed Central

    Bordier, Cecile; Puja, Francesco; Macaluso, Emiliano

    2013-01-01

    The investigation of brain activity using naturalistic, ecologically-valid stimuli is becoming an important challenge for neuroscience research. Several approaches have been proposed, primarily relying on data-driven methods (e.g. independent component analysis, ICA). However, data-driven methods often require some post-hoc interpretation of the imaging results to draw inferences about the underlying sensory, motor or cognitive functions. Here, we propose using a biologically-plausible computational model to extract (multi-)sensory stimulus statistics that can be used for standard hypothesis-driven analyses (general linear model, GLM). We ran two separate fMRI experiments, which both involved subjects watching an episode of a TV-series. In Exp 1, we manipulated the presentation by switching on-and-off color, motion and/or sound at variable intervals, whereas in Exp 2, the video was played in the original version, with all the consequent continuous changes of the different sensory features intact. Both for vision and audition, we extracted stimulus statistics corresponding to spatial and temporal discontinuities of low-level features, as well as a combined measure related to the overall stimulus saliency. Results showed that activity in occipital visual cortex and the superior temporal auditory cortex co-varied with changes of low-level features. Visual saliency was found to further boost activity in extra-striate visual cortex plus posterior parietal cortex, while auditory saliency was found to enhance activity in the superior temporal cortex. Data-driven ICA analyses of the same datasets also identified “sensory” networks comprising visual and auditory areas, but without providing specific information about the possible underlying processes, e.g., these processes could relate to modality, stimulus features and/or saliency. 
We conclude that the combination of computational modeling and GLM enables the tracking of the impact of bottom–up signals on brain activity during viewing of complex and dynamic multisensory stimuli, beyond the capability of purely data-driven approaches. PMID:23202431
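
The modeling-plus-GLM pipeline described above reduces, at its core, to convolving a model-derived stimulus statistic with a haemodynamic response function (HRF) and entering the result as a regressor. A minimal sketch, with a synthetic saliency time course and voxel signal standing in for the real model outputs and fMRI data (all values here are illustrative, not from the study):

```python
import numpy as np
from math import gamma

def hrf(t, a1=6.0, a2=16.0):
    """Canonical double-gamma haemodynamic response function (unnormalised)."""
    return (t ** (a1 - 1) * np.exp(-t) / gamma(a1)
            - 0.1667 * t ** (a2 - 1) * np.exp(-t) / gamma(a2))

tr = 2.0                                  # repetition time (s)
n_scans = 150
rng = np.random.default_rng(0)

# Stand-in for a model-derived stimulus statistic (e.g. frame-wise visual saliency)
saliency = rng.random(n_scans)

# Convolve with the HRF and z-score to obtain a GLM regressor
kernel = hrf(np.arange(0, 32, tr))
regressor = np.convolve(saliency, kernel)[:n_scans]
regressor = (regressor - regressor.mean()) / regressor.std()

# Design matrix: saliency regressor plus intercept
X = np.column_stack([regressor, np.ones(n_scans)])

# Synthetic voxel time course that co-varies with the regressor (true beta = 0.8)
y = 0.8 * regressor + 0.2 * rng.standard_normal(n_scans)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(round(beta[0], 2))
```

The same design extends to one regressor per low-level feature (color, motion, sound) plus saliency, fitted voxel-wise.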

  5. Behavioral and Brain Measures of Phasic Alerting Effects on Visual Attention.

    PubMed

    Wiegand, Iris; Petersen, Anders; Finke, Kathrin; Bundesen, Claus; Lansner, Jon; Habekost, Thomas

    2017-01-01

    In the present study, we investigated effects of phasic alerting on visual attention in a partial report task, in which half of the displays were preceded by an auditory warning cue. Based on the computational Theory of Visual Attention (TVA), we estimated parameters of spatial and non-spatial aspects of visual attention and measured event-related lateralizations (ERLs) over visual processing areas. We found that the TVA parameter sensory effectiveness a, which is thought to reflect visual processing capacity, significantly increased with phasic alerting. By contrast, the distribution of visual processing resources according to task relevance and spatial position, as quantified in the parameters top-down control α and spatial bias w_index, was not modulated by phasic alerting. On the electrophysiological level, the latencies of ERLs in response to the task displays were reduced following the warning cue. These results suggest that phasic alerting facilitates visual processing in a general, unselective manner and that this effect originates in early stages of visual information processing.
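
In TVA, an object's processing rate is its attentional weight's share of total capacity, and the selectivity and laterality parameters mentioned above are simple ratios of such weights. A toy sketch with made-up weights (the parameter names follow TVA conventions, but all numbers are hypothetical):

```python
# Hypothetical attentional weights for targets and distractors in each hemifield
w = {("target", "left"): 0.9, ("target", "right"): 0.7,
     ("distractor", "left"): 0.3, ("distractor", "right"): 0.2}
total = sum(w.values())

# TVA rate equation: v_x = C * w_x / sum(w), where C is overall processing capacity
C = 60.0                                   # hypothetical capacity (elements/s)
v_target_left = C * w[("target", "left")] / total

# Top-down control alpha: distractor weight relative to target weight (0 = perfect selectivity)
alpha = ((w[("distractor", "left")] + w[("distractor", "right")])
         / (w[("target", "left")] + w[("target", "right")]))

# Spatial bias: share of total weight allocated to the left hemifield (0.5 = balanced)
w_left = (w[("target", "left")] + w[("distractor", "left")]) / total

print(round(v_target_left, 1), round(alpha, 3), round(w_left, 2))
```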

  6. A cross-modal investigation of the neural substrates for ongoing cognition

    PubMed Central

    Wang, Megan; He, Biyu J.

    2014-01-01

    What neural mechanisms underlie the seamless flow of our waking consciousness? A necessary albeit insufficient condition for such neural mechanisms is that they should be consistently modulated across time were a segment of the conscious stream to be repeated twice. In this study, we experimentally manipulated the content of a story that subjects followed during functional magnetic resonance imaging (fMRI), independently of the sensory input modality (visual text or auditory speech) and of attentional focus. We then extracted brain activity patterns consistently modulated across subjects by the evolving content of the story regardless of whether it was presented visually or auditorily. Specifically, in one experiment we presented the same story to different subjects via either the auditory or the visual modality. In a second experiment, we presented two different stories simultaneously, one auditorily and one visually, and manipulated the subjects' attentional focus. This experimental design allowed us to dissociate brain activities underlying modality-specific sensory processing from modality-independent story processing. We uncovered a network of brain regions consistently modulated by the evolving content of a story regardless of the sensory modality used for stimulus input, including the superior temporal sulcus/gyrus (STS/STG), the inferior frontal gyrus (IFG), the posterior cingulate cortex (PCC), the medial frontal cortex (MFC), the temporal pole (TP), and the temporoparietal junction (TPJ). Many of these regions have previously been implicated in semantic processing. Interestingly, different stories elicited similar brain activity patterns, but with subtle differences potentially attributable to varying degrees of emotional valence and self-relevance. PMID:25206347

  7. Do the Contents of Visual Working Memory Automatically Influence Attentional Selection during Visual Search?

    ERIC Educational Resources Information Center

    Woodman, Geoffrey F.; Luck, Steven J.

    2007-01-01

    In many theories of cognition, researchers propose that working memory and perception operate interactively. For example, in previous studies researchers have suggested that sensory inputs matching the contents of working memory will have an automatic advantage in the competition for processing resources. The authors tested this hypothesis by…

  8. Impaired integration of object knowledge and visual input in a case of ventral simultanagnosia with bilateral damage to area V4.

    PubMed

    Leek, E Charles; d'Avossa, Giovanni; Tainturier, Marie-Josèphe; Roberts, Daniel J; Yuen, Sung Lai; Hu, Mo; Rafal, Robert

    2012-01-01

    This study examines how brain damage can affect the cognitive processes that support the integration of sensory input and prior knowledge during shape perception. It is based on the first detailed study of acquired ventral simultanagnosia, which was found in a patient (M.T.) with posterior occipitotemporal lesions encompassing V4 bilaterally. Despite showing normal object recognition for single items in both accuracy and response times (RTs), and intact low-level vision assessed across an extensive battery of tests, M.T. was impaired in object identification with overlapping figures displays. Task performance was modulated by familiarity: Unlike controls, M.T. was faster with overlapping displays of abstract shapes than with overlapping displays of common objects. His performance with overlapping common object displays was also influenced by both the semantic relatedness and visual similarity of the display items. These findings challenge claims that visual perception is driven solely by feedforward mechanisms and show how brain damage can selectively impair high-level perceptual processes supporting the integration of stored knowledge and visual sensory input.

  9. Stream specificity and asymmetries in feature binding and content-addressable access in visual encoding and memory.

    PubMed

    Huynh, Duong L; Tripathy, Srimant P; Bedell, Harold E; Ögmen, Haluk

    2015-01-01

    Human memory is content addressable; i.e., the contents of memory can be accessed using partial information about the bound features of a stored item. In this study, we used a cross-feature cuing technique to examine how the human visual system encodes, binds, and retains information about multiple stimulus features within a set of moving objects. We sought to characterize the roles of three different features (position, color, and direction of motion, the latter two of which are processed preferentially within the ventral and dorsal visual streams, respectively) in the construction and maintenance of object representations. We investigated the extent to which these features are bound together across the following processing stages: during stimulus encoding, sensory (iconic) memory, and visual short-term memory. Whereas all features examined here can serve as cues for addressing content, their effectiveness shows asymmetries and varies according to cue-report pairings and the stage of information processing and storage. Position-based indexing theories predict that position should be more effective as a cue compared to other features. While we found a privileged role for position as a cue at the stimulus-encoding stage, position was not the privileged cue at the sensory and visual short-term memory stages. Instead, the pattern that emerged from our findings is one that mirrors the parallel processing streams in the visual system. This stream-specific binding and cuing effectiveness manifests itself in all three stages of information processing examined here. Finally, we find that the Leaky Flask model proposed in our previous study is applicable to all three features.

  10. Screening for hearing, visual and dual sensory impairment in older adults using behavioural cues: a validation study.

    PubMed

    Roets-Merken, Lieve M; Zuidema, Sytse U; Vernooij-Dassen, Myrra J F J; Kempen, Gertrudis I J M

    2014-11-01

    This study investigated the psychometric properties of the Severe Dual Sensory Loss screening tool, a tool designed to help nurses and care assistants to identify hearing, visual and dual sensory impairment in older adults. Construct validity of the Severe Dual Sensory Loss screening tool was evaluated using Cronbach's alpha and factor analysis. Interrater reliability was calculated using Kappa statistics. To evaluate the predictive validity, sensitivity and specificity were calculated by comparison with the criterion standard assessment for hearing and vision. The criterion used for hearing impairment was a hearing loss of ≥40 decibels measured by pure-tone audiometry, and the criterion for visual impairment was a visual acuity of ≤0.3 or a visual field of ≤30°. Feasibility was evaluated by the time needed to fill in the screening tool and the clarity of the instruction and items. Prevalence of dual sensory impairment was calculated. A total of 56 older adults receiving aged care and 12 of their nurses and care assistants participated in the study. Cronbach's alpha was 0.81 for the hearing subscale and 0.84 for the visual subscale. Factor analysis showed two constructs for hearing and two for vision. Kappa was 0.71 for the hearing subscale and 0.74 for the visual subscale. The predictive validity showed a sensitivity of 0.71 and a specificity of 0.72 for the hearing subscale, and a sensitivity of 0.69 and a specificity of 0.78 for the visual subscale. The optimum cut-off point for each subscale was score 1. The nurses and care assistants reported that the Severe Dual Sensory Loss screening tool was easy to use. The prevalence of hearing and vision impairment was 55% and 29%, respectively, and that of dual sensory impairment was 20%.
    The Severe Dual Sensory Loss screening tool was compared with the criterion standards for hearing and visual impairment and was found to be a valid and reliable tool, enabling nurses and care assistants to identify hearing, visual and dual sensory impairment among older adults. Copyright © 2014 Elsevier Ltd. All rights reserved.
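
The validity statistics reported above come from standard formulas: sensitivity and specificity from a 2x2 screening-versus-criterion table, and Cronbach's alpha from item variances. A small self-contained sketch (the counts are hypothetical, chosen only so the hearing-subscale example reproduces the reported sensitivity of 0.71 and specificity of 0.72):

```python
def sens_spec(tp, fp, fn, tn):
    """Sensitivity and specificity from a 2x2 screening-versus-criterion table."""
    return tp / (tp + fn), tn / (tn + fp)

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns of equal length."""
    k, n = len(items), len(items[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Hypothetical screening outcomes for 56 participants (22 true pos, 7 false pos, ...)
se, sp = sens_spec(tp=22, fp=7, fn=9, tn=18)
print(round(se, 2), round(sp, 2))          # reproduces the reported 0.71 / 0.72

# Perfectly consistent items give alpha = 1.0
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))
```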

  11. Aging and the interaction of sensory cortical function and structure.

    PubMed

    Peiffer, Ann M; Hugenschmidt, Christina E; Maldjian, Joseph A; Casanova, Ramon; Srikanth, Ryali; Hayasaka, Satoru; Burdette, Jonathan H; Kraft, Robert A; Laurienti, Paul J

    2009-01-01

    Even the healthiest older adults experience changes in cognitive and sensory function. Studies show that older adults have reduced neural responses to sensory information. However, it is well known that sensory systems do not act in isolation but function cooperatively to either enhance or suppress neural responses to individual environmental stimuli. Very little research has been dedicated to understanding how aging affects the interactions between sensory systems, especially cross-modal deactivations or the ability of one sensory system (e.g., audition) to suppress the neural responses in another sensory system cortex (e.g., vision). Such cross-modal interactions have been implicated in attentional shifts between sensory modalities and could account for increased distractibility in older adults. To assess age-related changes in cross-modal deactivations, functional MRI studies were performed in 61 adults between 18 and 80 years old during simple auditory and visual discrimination tasks. Results within visual cortex confirmed previous findings of decreased responses to visual stimuli for older adults. Age-related changes in the visual cortical response to auditory stimuli were, however, much more complex and suggested an alteration with age in the functional interactions between the senses. Ventral visual cortical regions exhibited cross-modal deactivations in younger but not older adults, whereas more dorsal aspects of visual cortex were suppressed in older but not younger adults. These differences in deactivation also remained after adjusting for age-related reductions in brain volume of sensory cortex. Thus, functional differences in cortical activity between older and younger adults cannot solely be accounted for by differences in gray matter volume. (c) 2007 Wiley-Liss, Inc.

  12. Context generalization in Drosophila visual learning requires the mushroom bodies

    NASA Astrophysics Data System (ADS)

    Liu, Li; Wolf, Reinhard; Ernst, Roman; Heisenberg, Martin

    1999-08-01

    The world is permanently changing. Laboratory experiments on learning and memory normally minimize this feature of reality, keeping all conditions except the conditioned and unconditioned stimuli as constant as possible. In the real world, however, animals need to extract from the universe of sensory signals the actual predictors of salient events by separating them from non-predictive stimuli (context). In principle, this can be achieved if only those sensory inputs that resemble the reinforcer in their temporal structure are taken as predictors. Here we study visual learning in the fly Drosophila melanogaster, using a flight simulator, and show that memory retrieval is, indeed, partially context-independent. Moreover, we show that the mushroom bodies, which are required for olfactory but not visual or tactile learning, effectively support context generalization. In visual learning in Drosophila, it appears that a facilitating effect of context cues for memory retrieval is the default state, whereas making recall context-independent requires additional processing.

  13. Strength of figure-ground activity in monkey primary visual cortex predicts saccadic reaction time in a delayed detection task.

    PubMed

    Supèr, Hans; Lamme, Victor A F

    2007-06-01

    When and where are decisions made? In the visual system a saccade, which is a fast shift of gaze toward a target in the visual scene, is the behavioral outcome of a decision. Current neurophysiological data and reaction time models show that saccadic reaction times are determined by a build-up of activity in motor-related structures, such as the frontal eye fields. These structures depend on the sensory evidence of the stimulus. Here we use a delayed figure-ground detection task to show that late modulated activity in the visual cortex (V1) predicts saccadic reaction time. This predictive activity is part of the process of figure-ground segregation and is specific for the saccade target location. These observations indicate that sensory signals are directly involved in the decision of when and where to look.

  14. Auditory and visual connectivity gradients in frontoparietal cortex

    PubMed Central

    Hellyer, Peter J.; Wise, Richard J. S.; Leech, Robert

    2016-01-01

    A frontoparietal network of brain regions is often implicated in both auditory and visual information processing. Although it is possible that the same set of multimodal regions subserves both modalities, there is increasing evidence that there is a differentiation of sensory function within frontoparietal cortex. Magnetic resonance imaging (MRI) in humans was used to investigate whether different frontoparietal regions showed intrinsic biases in connectivity with visual or auditory modalities. Structural connectivity was assessed with diffusion tractography and functional connectivity was tested using functional MRI. A dorsal–ventral gradient of function was observed, where connectivity with visual cortex dominates dorsal frontal and parietal connections, while connectivity with auditory cortex dominates ventral frontal and parietal regions. A gradient was also observed along the posterior–anterior axis, although in opposite directions in prefrontal and parietal cortices. The results suggest that the location of neural activity within frontoparietal cortex may be influenced by these intrinsic biases toward visual and auditory processing. Thus, the location of activity in frontoparietal cortex may be influenced as much by stimulus modality as by the cognitive demands of a task. It was concluded that stimulus modality was spatially encoded throughout frontal and parietal cortices, and it was speculated that such an arrangement allows for top–down modulation of modality-specific information to occur within higher-order cortex. This could provide a potentially faster and more efficient pathway by which top–down selection between sensory modalities could occur, by constraining modulations to within frontal and parietal regions, rather than long-range connections to sensory cortices. Hum Brain Mapp 38:255–270, 2017. © 2016 Wiley Periodicals, Inc. PMID:27571304

  15. Temporal Processing in the Olfactory System: Can We See a Smell?

    PubMed Central

    Gire, David H.; Restrepo, Diego; Sejnowski, Terrence J.; Greer, Charles; De Carlos, Juan A.; Lopez-Mascaraque, Laura

    2013-01-01

    Sensory processing circuits in the visual and olfactory systems receive input from complex, rapidly changing environments. Although patterns of light and plumes of odor create different distributions of activity in the retina and olfactory bulb, both structures use what appear, on the surface, to be similar temporal coding strategies to convey information to higher areas in the brain. We compare temporal coding in the early stages of the olfactory and visual systems, highlighting recent progress in understanding the role of time in olfactory coding during active sensing by behaving animals. We also examine studies that address the divergent circuit mechanisms that generate temporal codes in the two systems, and find that they provide physiological information directly related to functional questions raised by the neuroanatomical studies of Ramon y Cajal over a century ago. Consideration of differences in neural activity in sensory systems contributes to generating new approaches to understand signal processing. PMID:23664611

  16. Effects of aging on perception of motion

    NASA Astrophysics Data System (ADS)

    Kaur, Manpreet; Wilder, Joseph; Hung, George; Julesz, Bela

    1997-09-01

    Driving requires two basic visual components: 'visual sensory function' and 'higher order skills.' Among the elderly, it has been observed that when attention must be divided in the presence of multiple objects, attentional skills and relational processes are markedly impaired, along with basic visual sensory function. A high frame rate imaging system was developed to assess the elderly driver's ability to locate and distinguish computer generated images of vehicles and to determine their direction of motion in a simulated intersection. Preliminary experiments were performed at varying target speeds and angular displacements to study the effect of these parameters on motion perception. Results for subjects in four different age groups, ranging from mid-twenties to mid-sixties, show significantly better performance for the younger subjects as compared to the older ones.

  17. Negative BOLD in sensory cortices during verbal memory: a component in generating internal representations?

    PubMed

    Azulay, Haim; Striem, Ella; Amedi, Amir

    2009-05-01

    People tend to close their eyes when trying to retrieve an event or a visual image from memory. However, the brain mechanisms behind this phenomenon remain poorly understood. Recently, we showed that during visual mental imagery, auditory areas show a much more robust deactivation than during visual perception. Here we ask whether this is a special case of a more general phenomenon involving retrieval of intrinsic, internally stored information, which would result in crossmodal deactivations in other sensory cortices that are irrelevant to the task at hand. To test this hypothesis, a group of 9 sighted individuals were scanned while performing a memory retrieval task for highly abstract words (i.e., with low imaginability scores). We also scanned a group of 10 congenitally blind individuals, who by definition do not have any visual imagery per se. In sighted subjects, both auditory and visual areas were robustly deactivated during memory retrieval, whereas in the blind the auditory cortex was deactivated while visual areas, shown previously to be relevant for this task, presented a positive BOLD signal. These results suggest that deactivation may be most prominent in task-irrelevant sensory cortices whenever there is a need for retrieval or manipulation of internally stored representations. Thus, there is a task-dependent balance of activation and deactivation that might allow maximization of resources and filtering out of non-relevant information to enable allocation of attention to the required task. Furthermore, these results suggest that the balance between positive and negative BOLD might be crucial to our understanding of a large variety of intrinsic and extrinsic tasks, including high-level cognitive functions, sensory processing and multisensory integration.

  18. Linking Cognitive and Visual Perceptual Decline in Healthy Aging: The Information Degradation Hypothesis

    PubMed Central

    Monge, Zachary A.; Madden, David J.

    2016-01-01

    Several hypotheses attempt to explain the relation between cognitive and perceptual decline in aging (e.g., common-cause, sensory deprivation, cognitive load on perception, information degradation). Unfortunately, the majority of past studies examining this association have used correlational analyses, not allowing for these hypotheses to be tested sufficiently. This correlational issue is especially relevant for the information degradation hypothesis, which states that degraded perceptual signal inputs, resulting from either age-related neurobiological processes (e.g., retinal degeneration) or experimental manipulations (e.g., reduced visual contrast), lead to errors in perceptual processing, which in turn may affect non-perceptual, higher-order cognitive processes. Even though the majority of studies examining the relation between age-related cognitive and perceptual decline have been correlational, we reviewed several studies demonstrating that visual manipulations affect both younger and older adults’ cognitive performance, supporting the information degradation hypothesis and contradicting implications of other hypotheses (e.g., common-cause, sensory deprivation, cognitive load on perception). The reviewed evidence indicates the necessity to further examine the information degradation hypothesis in order to identify mechanisms underlying age-related cognitive decline. PMID:27484869

  19. Contextual signals in visual cortex.

    PubMed

    Khan, Adil G; Hofer, Sonja B

    2018-06-05

    Vision is an active process. What we perceive strongly depends on our actions, intentions and expectations. During visual processing, these internal signals therefore need to be integrated with the visual information from the retina. The mechanisms of how this is achieved by the visual system are still poorly understood. Advances in recording and manipulating neuronal activity in specific cell types and axonal projections together with tools for circuit tracing are beginning to shed light on the neuronal circuit mechanisms of how internal, contextual signals shape sensory representations. Here we review recent work, primarily in mice, that has advanced our understanding of these processes, focusing on contextual signals related to locomotion, behavioural relevance and predictions. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Visual cortex responses reflect temporal structure of continuous quasi-rhythmic sensory stimulation.

    PubMed

    Keitel, Christian; Thut, Gregor; Gross, Joachim

    2017-02-01

    Neural processing of dynamic continuous visual input, and cognitive influences thereon, are frequently studied in paradigms employing strictly rhythmic stimulation. However, the temporal structure of natural stimuli is hardly ever fully rhythmic but possesses certain spectral bandwidths (e.g. lip movements in speech, gestures). Examining periodic brain responses elicited by strictly rhythmic stimulation might thus represent ideal, yet isolated cases. Here, we tested how the visual system reflects quasi-rhythmic stimulation with frequencies continuously varying within the classical theta (4-7 Hz), alpha (8-13 Hz) and beta (14-20 Hz) bands using EEG. Our findings substantiate a systematic and sustained neural phase-locking to stimulation in all three frequency ranges. Further, we found that allocation of spatial attention enhances EEG-stimulus locking to theta- and alpha-band stimulation. Our results bridge recent findings regarding phase locking ("entrainment") to quasi-rhythmic visual input and "frequency-tagging" experiments employing strictly rhythmic stimulation. We propose that sustained EEG-stimulus locking can be considered as a continuous neural signature of the processing of dynamic sensory input in early visual cortices. Accordingly, EEG-stimulus locking serves to trace the temporal evolution of rhythmic as well as quasi-rhythmic visual input and is subject to attentional bias. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
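
EEG-stimulus locking of the kind described above is commonly quantified with a phase-locking value (PLV) between the stimulus and response phase time series. A simulated sketch, using a synthetic quasi-rhythmic alpha-band stimulus and a noisy lagged "EEG" trace (this illustrates the measure, not the study's actual analysis pipeline):

```python
import numpy as np

def analytic(x):
    """Analytic signal via FFT (same idea as scipy.signal.hilbert)."""
    n = len(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(np.fft.fft(x) * h)

fs = 250.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)

# Quasi-rhythmic stimulus: instantaneous frequency drifting inside the alpha band
freq = 10.5 + 2.0 * np.sin(2 * np.pi * 0.2 * t)     # ~8.5-12.5 Hz
stimulus = np.sin(2 * np.pi * np.cumsum(freq) / fs)

# Simulated EEG: stimulus-locked component at a ~60 ms lag plus noise
response = np.roll(stimulus, int(0.06 * fs)) + 0.4 * rng.standard_normal(len(t))
noise_only = 0.4 * rng.standard_normal(len(t))      # control with no locking

def plv(a, b):
    """Phase-locking value: resultant length of the phase-difference distribution."""
    dphi = np.angle(analytic(a)) - np.angle(analytic(b))
    return np.abs(np.mean(np.exp(1j * dphi)))

print(round(plv(stimulus, response), 2), round(plv(stimulus, noise_only), 2))
```

The locked pair yields a PLV far above the unlocked control, mirroring the sustained stimulus locking reported above.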

  1. The consequence of spatial visual processing dysfunction caused by traumatic brain injury (TBI).

    PubMed

    Padula, William V; Capo-Aponte, Jose E; Padula, William V; Singman, Eric L; Jenness, Jonathan

    2017-01-01

    A bi-modal model of visual processing is supported by research on dysfunction following traumatic brain injury (TBI). TBI causes dysfunction of visual processing affecting binocularity, spatial orientation, posture and balance. Research demonstrates that the prescription of prisms influences the plasticity between spatial visual processing and motor-sensory systems, improving visual processing and reducing symptoms following a TBI. The rationale demonstrates that visual processing underlies the functional aspects of binocularity, balance and posture. The bi-modal visual process maintains plasticity for efficiency. Its compromise causes Post Trauma Vision Syndrome (PTVS) and Visual Midline Shift Syndrome (VMSS). Rehabilitation through the use of lenses, prisms and sectoral occlusion has inter-professional implications in rehabilitation, affecting the plasticity of the bi-modal visual process and thereby improving binocularity, spatial orientation, posture and balance. Main outcomes: This review provides an opportunity to create a new perspective on the consequences of TBI for visual processing and the symptoms that are often caused by trauma. It also serves to provide a perspective on visual processing dysfunction that has potential for developing new approaches to rehabilitation. Understanding vision as a bi-modal process facilitates a new perspective of visual processing and the potential for rehabilitation following a concussion, brain injury or other neurological events.

  2. Large-scale two-photon imaging revealed super-sparse population codes in the V1 superficial layer of awake monkeys.

    PubMed

    Tang, Shiming; Zhang, Yimeng; Li, Zhihao; Li, Ming; Liu, Fang; Jiang, Hongfei; Lee, Tai Sing

    2018-04-26

    One general principle of sensory information processing is that the brain must optimize efficiency by reducing the number of neurons that process the same information. The sparseness of the sensory representations in a population of neurons reflects the efficiency of the neural code. Here, we employ large-scale two-photon calcium imaging to examine the responses of a large population of neurons within the superficial layers of area V1 with single-cell resolution, while presenting a large set of natural visual stimuli, to provide the first direct measure of population sparseness in awake primates. The results show that only 0.5% of neurons respond strongly to any given natural image, indicating a tenfold increase in the inferred sparseness over previous measurements. These population activities are nevertheless necessary and sufficient to discriminate visual stimuli with high accuracy, suggesting that the neural code in the primary visual cortex is both super-sparse and highly efficient. © 2018, Tang et al.
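
Population sparseness of the kind reported above can be expressed as the fraction of neurons exceeding a "strong response" threshold per image, optionally alongside the Treves-Rolls sparseness index. A synthetic sketch, with the response matrix constructed so that roughly 0.5% of neurons respond strongly to each image (numbers are illustrative, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(2)
n_neurons, n_images = 1000, 50

# Synthetic responses: a weak exponential background plus ~5 strong responders per image
responses = rng.exponential(0.1, size=(n_neurons, n_images))
for img in range(n_images):
    responses[rng.integers(0, n_neurons, size=5), img] += 3.0

# Population sparseness as the fraction of neurons responding above threshold per image
threshold = 1.0
frac_strong = (responses > threshold).mean(axis=0)
print(f"{frac_strong.mean():.2%} of neurons respond strongly to a given image")

# Treves-Rolls population sparseness (smaller = sparser) for the first image
r = responses[:, 0]
tr_sparseness = r.mean() ** 2 / (r ** 2).mean()
print(round(tr_sparseness, 3))
```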

  3. Gain control by layer six in cortical circuits of vision.

    PubMed

    Olsen, Shawn R; Bortone, Dante S; Adesnik, Hillel; Scanziani, Massimo

    2012-02-22

    After entering the cerebral cortex, sensory information spreads through six different horizontal neuronal layers that are interconnected by vertical axonal projections. It is believed that through these projections layers can influence each other's response to sensory stimuli, but the specific role that each layer has in cortical processing is still poorly understood. Here we show that layer six in the primary visual cortex of the mouse has a crucial role in controlling the gain of visually evoked activity in neurons of the upper layers without changing their tuning to orientation. This gain modulation results from the coordinated action of layer six intracortical projections to superficial layers and deep projections to the thalamus, with a substantial role of the intracortical circuit. This study establishes layer six as a major mediator of cortical gain modulation and suggests that it could be a node through which convergent inputs from several brain areas can regulate the earliest steps of cortical visual processing.

  4. Concept Representation Reflects Multimodal Abstraction: A Framework for Embodied Semantics

    PubMed Central

    Fernandino, Leonardo; Binder, Jeffrey R.; Desai, Rutvik H.; Pendl, Suzanne L.; Humphries, Colin J.; Gross, William L.; Conant, Lisa L.; Seidenberg, Mark S.

    2016-01-01

    Recent research indicates that sensory and motor cortical areas play a significant role in the neural representation of concepts. However, little is known about the overall architecture of this representational system, including the role played by higher level areas that integrate different types of sensory and motor information. The present study addressed this issue by investigating the simultaneous contributions of multiple sensory-motor modalities to semantic word processing. With a multivariate fMRI design, we examined activation associated with 5 sensory-motor attributes—color, shape, visual motion, sound, and manipulation—for 900 words. Regions responsive to each attribute were identified using independent ratings of the attributes' relevance to the meaning of each word. The results indicate that these aspects of conceptual knowledge are encoded in multimodal and higher level unimodal areas involved in processing the corresponding types of information during perception and action, in agreement with embodied theories of semantics. They also reveal a hierarchical system of abstracted sensory-motor representations incorporating a major division between object interaction and object perception processes. PMID:25750259

  5. Brain size and visual environment predict species differences in paper wasp sensory processing brain regions (hymenoptera: vespidae, polistinae).

    PubMed

    O'Donnell, Sean; Clifford, Marie R; DeLeon, Sara; Papa, Christopher; Zahedi, Nazaneen; Bulova, Susan J

    2013-01-01

    The mosaic brain evolution hypothesis predicts that the relative volumes of functionally distinct brain regions will vary independently and correlate with species' ecology. Paper wasp species (Hymenoptera: Vespidae, Polistinae) differ in light exposure: they construct open versus enclosed nests and one genus (Apoica) is nocturnal. We asked whether light environments were related to species differences in the size of antennal and optic processing brain tissues. Paper wasp brains have anatomically distinct peripheral and central regions that process antennal and optic sensory inputs. We measured the volumes of 4 sensory processing brain regions in paper wasp species from 13 Neotropical genera including open and enclosed nesters, and diurnal and nocturnal species. Species differed in sensory region volumes, but there was no evidence for trade-offs among sensory modalities. All sensory region volumes correlated with brain size. However, peripheral optic processing investment increased with brain size at a higher rate than peripheral antennal processing investment. Our data suggest that mosaic and concerted (size-constrained) brain evolution are not exclusive alternatives. When brain regions increase with brain size at different rates, these distinct allometries can allow for differential investment among sensory modalities. As predicted by mosaic evolution, species ecology was associated with some aspects of brain region investment. Nest architecture variation was not associated with brain investment differences, but the nocturnal genus Apoica had the largest antennal:optic volume ratio in its peripheral sensory lobes. Investment in central processing tissues was not related to nocturnality, a pattern also noted in mammals. The plasticity of neural connections in central regions may accommodate evolutionary shifts in input from the periphery with relatively minor changes in volume. © 2013 S. Karger AG, Basel.
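
The "different allometric rates" argument above is a statement about slopes of log-log regressions of region volume on brain volume. A sketch on synthetic genus-level data (the volumes and slopes are invented for illustration; a slope > 1 means the region grows disproportionately with brain size):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical brain volumes for 13 wasp genera (arbitrary units)
brain = np.linspace(1.0, 4.0, 13)

# Peripheral optic regions given a steeper allometric slope than antennal ones
optic = 0.20 * brain ** 1.3 * np.exp(0.05 * rng.standard_normal(13))
antennal = 0.15 * brain ** 0.9 * np.exp(0.05 * rng.standard_normal(13))

def allometric_slope(region, brain):
    """Slope of log(region volume) on log(brain volume); > 1 means hyperallometry."""
    X = np.column_stack([np.log(brain), np.ones(len(brain))])
    beta, *_ = np.linalg.lstsq(X, np.log(region), rcond=None)
    return beta[0]

s_optic = allometric_slope(optic, brain)
s_antennal = allometric_slope(antennal, brain)
print(round(s_optic, 2), round(s_antennal, 2))
```

With the optic slope above the antennal slope, larger-brained genera invest relatively more in optic processing, which is the pattern the abstract describes.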

  6. Location-specific effects of attention during visual short-term memory maintenance.

    PubMed

    Matsukura, Michi; Cosman, Joshua D; Roper, Zachary J J; Vatterott, Daniel B; Vecera, Shaun P

    2014-06-01

    Recent neuroimaging studies suggest that early sensory areas such as area V1 are recruited to actively maintain a selected feature of the item held in visual short-term memory (VSTM). These findings raise the possibility that visual attention operates in a similar manner across perceptual and memory representations, at least to some extent, even though memory-level and perception-level selection are functionally dissociable. If VSTM operates by retaining "reasonable copies" of scenes constructed during sensory processing (Serences et al., 2009, p. 207, the sensory recruitment hypothesis), then it is possible that selective attention can be guided by both exogenous (peripheral) and endogenous (central) cues during VSTM maintenance. Yet the results of previous studies that examined this issue are inconsistent. In the present study, we investigated whether attention can be directed, using an exogenous cue, to a specific item's location represented in VSTM in a well-controlled setting. The results from the four experiments suggest that, as observed with the endogenous cue, the exogenous cue can efficiently guide selective attention during VSTM maintenance. The finding is not only consistent with the sensory recruitment hypothesis but also validates the use of exogenous cues in past and future studies. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  7. Visual and Auditory Components in the Perception of Asynchronous Audiovisual Speech

    PubMed Central

    Alcalá-Quintana, Rocío

    2015-01-01

    Research on asynchronous audiovisual speech perception manipulates experimental conditions to observe their effects on synchrony judgments. Probabilistic models establish a link between the sensory and decisional processes underlying such judgments and the observed data, via interpretable parameters that allow testing hypotheses and making inferences about how experimental manipulations affect such processes. Two models of this type have recently been proposed, one based on independent channels and the other using a Bayesian approach. Both models are fitted here to a common data set, with a subsequent analysis of the interpretation they provide about how experimental manipulations affected the processes underlying perceived synchrony. The data consist of synchrony judgments as a function of audiovisual offset in a speech stimulus, under four within-subjects manipulations of the quality of the visual component. The Bayesian model could not accommodate asymmetric data, was rejected by goodness-of-fit statistics for 8/16 observers, and was found to be nonidentifiable, which renders its parameter estimates uninterpretable. The independent-channels model captured asymmetric data, was rejected for only 1/16 observers, and identified how sensory and decisional processes mediating asynchronous audiovisual speech perception are affected by manipulations that only alter the quality of the visual component of the speech signal. PMID:27551361

  8. Aging effects on functional auditory and visual processing using fMRI with variable sensory loading.

    PubMed

    Cliff, Michael; Joyce, Dan W; Lamar, Melissa; Dannhauser, Thomas; Tracy, Derek K; Shergill, Sukhwinder S

    2013-05-01

    Traditionally, studies investigating the functional implications of age-related structural brain alterations have focused on higher cognitive processes; by increasing stimulus load, these studies assess behavioral and neurophysiological performance. In order to understand age-related changes in these higher cognitive processes, it is crucial to examine changes in visual and auditory processes that are the gateways to higher cognitive functions. This study provides evidence for age-related functional decline in visual and auditory processing, and regional alterations in functional brain processing, using non-invasive neuroimaging. Using functional magnetic resonance imaging (fMRI), younger (n=11; mean age=31) and older (n=10; mean age=68) adults were imaged while observing flashing checkerboard images (passive visual stimuli) and hearing word lists (passive auditory stimuli) across varying stimulus presentation rates. Younger adults showed greater overall levels of temporal and occipital cortical activation than older adults for both auditory and visual stimuli. The relative change in activity as a function of stimulus presentation rate showed differences between young and older participants. In visual cortex, the older group showed a decrease in fMRI blood oxygen level dependent (BOLD) signal magnitude as stimulus frequency increased, whereas the younger group showed a linear increase. In auditory cortex, the younger group showed a relative increase as a function of word presentation rate, while older participants showed a relatively stable magnitude of fMRI BOLD response across all rates. When analyzing participants across all ages, only the auditory cortical activation showed a continuous, monotonically decreasing BOLD signal magnitude as a function of age. Our preliminary findings show an age-related decline in demand-related, passive early sensory processing. As stimulus demand increases, visual and auditory cortex do not show increases in activity in older compared to younger people. This may negatively impact the fidelity of information available to higher cognitive processing. Such evidence may inform future studies focused on cognitive decline in aging. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. Selection of multiple cued items is possible during visual short-term memory maintenance.

    PubMed

    Matsukura, Michi; Vecera, Shaun P

    2015-07-01

    Recent neuroimaging studies suggest that maintenance of a selected object feature held in visual short-term/working memory (VSTM/VWM) is supported by the same neural mechanisms that encode the sensory information. If VSTM operates by retaining "reasonable copies" of scenes constructed during sensory processing (Serences, Ester, Vogel, & Awh, 2009, p. 207, the sensory recruitment hypothesis), then attention should be able to select multiple items represented in VSTM as long as the number of these attended items does not exceed the typical VSTM capacity. It is well known that attention can select at least two noncontiguous locations at the same time during sensory processing. However, empirical reports from the studies that have examined this possibility are inconsistent. In the present study, we demonstrate that (1) attention can indeed select more than a single item during VSTM maintenance when observers are asked to recognize a set of items in the same manner in which these items were originally attended, and (2) attention can select multiple cued items regardless of whether these items are perceptually organized into a single group (contiguous locations) or not (noncontiguous locations). The results also replicate and extend the recent finding that selective attention that operates during VSTM maintenance is sensitive to the observers' goal and motivation to use the cueing information.

  10. Effects of Binaural Sensory Aids on the Development of Visual Perceptual Abilities in Visually Handicapped Infants. Final Report, April 15, 1982-November 15, 1982.

    ERIC Educational Resources Information Center

    Hart, Verna; Ferrell, Kay

    Twenty-four congenitally visually handicapped infants, aged 6-24 months, participated in a study to determine (1) those stimuli best able to elicit visual attention, (2) the stability of visual acuity over time, and (3) the effects of binaural sensory aids on both visual attention and visual acuity. Ss were dichotomized into visually handicapped…

  11. A common perceptual temporal limit of binding synchronous inputs across different sensory attributes and modalities.

    PubMed

    Fujisaki, Waka; Nishida, Shin'ya

    2010-08-07

    The human brain processes different aspects of the surrounding environment through multiple sensory modalities, and each modality can be subdivided into multiple attribute-specific channels. When the brain rebinds sensory content information ('what') across different channels, temporal coincidence ('when') along with spatial coincidence ('where') provides a critical clue. However, it remains unknown whether neural mechanisms for binding synchronous attributes are specific to each attribute combination, or universal and central. In human psychophysical experiments, we examined how combinations of visual, auditory and tactile attributes affect the temporal frequency limit of synchrony-based binding. The results indicated that the upper limits of cross-attribute binding were lower than those of within-attribute binding, and surprisingly similar for any combination of visual, auditory and tactile attributes (2-3 Hz). They are unlikely to be the limits for judging synchrony, since the temporal limit of a cross-attribute synchrony judgement was higher and varied with the modality combination (4-9 Hz). These findings suggest that cross-attribute temporal binding is mediated by a slow central process that combines separately processed 'what' and 'when' properties of a single event. While the synchrony performance reflects temporal bottlenecks existing in 'when' processing, the binding performance reflects the central temporal limit of integrating 'when' and 'what' properties.

  12. Distributed and opposing effects of incidental learning in the human brain.

    PubMed

    Hall, Michelle G; Naughtin, Claire K; Mattingley, Jason B; Dux, Paul E

    2018-06-01

    Incidental learning affords a behavioural advantage when sensory information matches regularities that have previously been encountered. Previous studies have taken a focused approach by probing the involvement of specific candidate brain regions underlying incidentally acquired memory representations, as well as expectation effects on early sensory representations. Here, we investigated the broader extent of the brain's sensitivity to violations and fulfilments of expectations, using an incidental learning paradigm in which the contingencies between target locations and target identities were manipulated without participants' overt knowledge. Multivariate analysis of functional magnetic resonance imaging data was applied to compare the consistency of neural activity for visual events that the contingency manipulation rendered likely versus unlikely. We observed widespread sensitivity to expectations across frontal, temporal, occipital, and sub-cortical areas. These activation clusters showed distinct response profiles, such that some regions displayed more reliable activation patterns under fulfilled expectations, whereas others showed more reliable patterns when expectations were violated. These findings reveal that expectations affect multiple stages of information processing during visual decision making, rather than early sensory processing stages alone. Copyright © 2018 Elsevier Inc. All rights reserved.

  13. Language Networks in Anophthalmia: Maintained Hierarchy of Processing in "Visual" Cortex

    ERIC Educational Resources Information Center

    Watkins, Kate E.; Cowey, Alan; Alexander, Iona; Filippini, Nicola; Kennedy, James M.; Smith, Stephen M.; Ragge, Nicola; Bridge, Holly

    2012-01-01

    Imaging studies in blind subjects have consistently shown that sensory and cognitive tasks evoke activity in the occipital cortex, which is normally visual. The precise areas involved and degree of activation are dependent upon the cause and age of onset of blindness. Here, we investigated the cortical language network at rest and during an…

  14. The Effect of Non-Visual Working Memory Load on Top-Down Modulation of Visual Processing

    ERIC Educational Resources Information Center

    Rissman, Jesse; Gazzaley, Adam; D'Esposito, Mark

    2009-01-01

    While a core function of the working memory (WM) system is the active maintenance of behaviorally relevant sensory representations, it is also critical that distracting stimuli are appropriately ignored. We used functional magnetic resonance imaging to examine the role of domain-general WM resources in the top-down attentional modulation of…

  15. Modulation of visual physiology by behavioral state in monkeys, mice, and flies.

    PubMed

    Maimon, Gaby

    2011-08-01

    When a monkey attends to a visual stimulus, neurons in visual cortex respond differently to that stimulus than when the monkey attends elsewhere. In the 25 years since the initial discovery, the study of attention in primates has been central to understanding flexible visual processing. Recent experiments demonstrate that visual neurons in mice and fruit flies are modulated by locomotor behaviors, like running and flying, in a manner that resembles attention-based modulations in primates. The similar findings across species argue for a more generalized view of state-dependent sensory processing and for a renewed dialogue among vertebrate and invertebrate research communities. Copyright © 2011 Elsevier Ltd. All rights reserved.

  16. Partitioning neuronal variability

    PubMed Central

    Goris, Robbe L.T.; Movshon, J. Anthony; Simoncelli, Eero P.

    2014-01-01

    Responses of sensory neurons differ across repeated measurements. This variability is usually treated as stochasticity arising within neurons or neural circuits. However, some portion of the variability arises from fluctuations in excitability due to factors that are not purely sensory, such as arousal, attention, and adaptation. To isolate these fluctuations, we developed a model in which spikes are generated by a Poisson process whose rate is the product of a drive that is sensory in origin, and a gain summarizing stimulus-independent modulatory influences on excitability. This model provides an accurate account of response distributions of visual neurons in macaque LGN, V1, V2, and MT, revealing that variability originates in large part from excitability fluctuations which are correlated over time and between neurons, and which increase in strength along the visual pathway. The model provides a parsimonious explanation for observed systematic dependencies of response variability and covariability on firing rate. PMID:24777419
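    The multiplicative-gain model this abstract describes can be sketched numerically. The following minimal simulation, which assumes for illustration a gamma-distributed gain with unit mean (the distributional choice and all parameter values are assumptions, not the authors' fitted model), shows how excitability fluctuations push spike-count variance above the Poisson level:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def modulated_poisson_counts(drive, n_trials, gain_sd=0.5):
        """Spike counts from a Poisson process whose rate is the product of a
        fixed sensory drive and a trial-by-trial multiplicative gain. The gain
        is gamma-distributed with mean 1 (an illustrative assumption)."""
        shape = 1.0 / gain_sd**2                   # gamma with mean 1, sd = gain_sd
        gain = rng.gamma(shape, scale=1.0 / shape, size=n_trials)
        return rng.poisson(drive * gain)

    drive, n = 20.0, 50_000
    counts = modulated_poisson_counts(drive, n)    # drive x gain model
    pure = rng.poisson(drive, size=n)              # within-neuron stochasticity alone

    # Gain fluctuations push the Fano factor (variance/mean) above 1,
    # mimicking the super-Poisson variability reported along the visual pathway.
    print(pure.var() / pure.mean())     # close to 1
    print(counts.var() / counts.mean()) # well above 1
    ```

    In this formulation the excess variance grows with the square of the firing rate, which is one way such a model can parsimoniously account for the rate-dependence of variability the abstract mentions.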

  17. Normalization regulates competition for visual awareness

    PubMed Central

    Ling, Sam; Blake, Randolph

    2012-01-01

    Signals in our brain are in a constant state of competition, including those that vie for motor control, sensory dominance and awareness. To shed light on the mechanisms underlying neural competition, we exploit binocular rivalry, a phenomenon that allows us to probe the competitive process that ordinarily transpires outside of our awareness. By measuring psychometric functions under different states of rivalry, we discovered a pattern of gain changes that are consistent with a model of competition in which attention interacts with normalization processes, thereby driving the ebb and flow between states of awareness. Moreover, we reveal that attention plays a crucial role in modulating competition; without attention, rivalry suppression for high-contrast stimuli is negligible. We propose a framework whereby our visual awareness of competing sensory representations is governed by a common neural computation: normalization. PMID:22884335

  18. Straightening the Eyes Doesn't Rebalance the Brain

    PubMed Central

    Zhou, Jiawei; Wang, Yonghua; Feng, Lixia; Wang, Jiafeng; Hess, Robert F.

    2017-01-01

    Surgery to align the two eyes is commonly used in treating strabismus. However, the effect of strabismic surgery on patients' binocular visual processing is not yet fully understood. In this study, we asked two questions: (1) Does realigning the eyes by strabismic surgery produce an immediate benefit to patients' sensory eye balance? (2) If not, is there a subsequent period of “alignment adaptation” akin to refractive adaptation where sensory benefits to binocular function accrue? Seventeen patients with strabismus (mean age: 17.06 ± 5.16 years) participated in our experiment. All participants had normal or corrected to normal visual acuity (LogMAR < 0.10) in the two eyes. We quantitatively measured their sensory eye balance before and after surgery using a binocular phase combination paradigm. For the seven patients whose sensory eye balance was measured before surgery, we found no significant change [t(6) = −0.92; p = 0.39] in the sensory eye balance measured 0.5–1 months after the surgery, indicating that the surgical re-alignment did not by itself produce any immediate benefit for sensory eye balance. To answer the second question, we measured 16 patients' sensory eye balance at around 5–12 months after their eyes had been surgically re-aligned and compared this with our measurements 0.5–1 months after surgery. We found no significant change [t(15) = −0.89; p = 0.39] in sensory eye balance 5–12 months after the surgery. These results suggest that strabismic surgery, while necessary, is not itself sufficient for re-establishing balanced sensory eye dominance. PMID:28955214
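    The before/after comparisons reported here are paired t-tests on per-patient balance scores with df = n − 1. A minimal sketch of that computation, using invented balance-point values for a hypothetical n = 7 pre-surgery group (not the study's data), is:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical sensory-eye-balance scores (balance points from a binocular
    # phase combination task) for 7 patients, measured before and 0.5-1 months
    # after surgery. Values are invented for illustration only.
    before = np.array([0.28, 0.35, 0.31, 0.22, 0.40, 0.30, 0.33])
    after = np.array([0.30, 0.33, 0.29, 0.25, 0.41, 0.28, 0.35])

    # Paired t-test on the per-patient differences, df = n - 1 = 6,
    # matching the form of the reported t(6) statistic.
    t_stat, p_value = stats.ttest_rel(before, after)
    print(f"t(6) = {t_stat:.2f}, p = {p_value:.2f}")
    ```

    With these invented scores the per-patient differences are small and inconsistent in sign, so the test does not reject the null of no change, which is the pattern the abstract reports.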

  19. Enhanced Visual Cortical Activation for Emotional Stimuli is Preserved in Patients with Unilateral Amygdala Resection

    PubMed Central

    Edmiston, E. Kale; McHugo, Maureen; Dukic, Mildred S.; Smith, Stephen D.; Abou-Khalil, Bassel; Eggers, Erica

    2013-01-01

    Emotionally arousing pictures induce increased activation of visual pathways relative to emotionally neutral images. A predominant model for the preferential processing and attention to emotional stimuli posits that the amygdala modulates sensory pathways through its projections to visual cortices. However, recent behavioral studies have found intact perceptual facilitation of emotional stimuli in individuals with amygdala damage. To determine the importance of the amygdala to modulations in visual processing, we used functional magnetic resonance imaging to examine visual cortical blood oxygenation level-dependent (BOLD) signal in response to emotionally salient and neutral images in a sample of human patients with unilateral medial temporal lobe resection that included the amygdala. Adults with right (n = 13) or left (n = 5) medial temporal lobe resections were compared with demographically matched healthy control participants (n = 16). In the control participants, both aversive and erotic images produced robust BOLD signal increases in bilateral primary and secondary visual cortices relative to neutral images. Similarly, all patients with amygdala resections showed enhanced visual cortical activations to erotic images both ipsilateral and contralateral to the lesion site. All but one of the amygdala resection patients showed similar enhancements to aversive stimuli and there were no significant group differences in visual cortex BOLD responses in patients compared with controls for either aversive or erotic images. Our results indicate that neither the right nor left amygdala is necessary for the heightened visual cortex BOLD responses observed during emotional stimulus presentation. These data challenge an amygdalo-centric model of emotional modulation and suggest that non-amygdalar processes contribute to the emotional modulation of sensory pathways. PMID:23825407

  20. Decoding visual object categories in early somatosensory cortex.

    PubMed

    Smith, Fraser W; Goodale, Melvyn A

    2015-04-01

    Neurons, even in the earliest sensory areas of cortex, are subject to a great deal of contextual influence from both within and across modality connections. In the present work, we investigated whether the earliest regions of somatosensory cortex (S1 and S2) would contain content-specific information about visual object categories. We reasoned that this might be possible due to the associations formed through experience that link different sensory aspects of a given object. Participants were presented with visual images of different object categories in 2 fMRI experiments. Multivariate pattern analysis revealed reliable decoding of familiar visual object category in bilateral S1 (i.e., postcentral gyri) and right S2. We further show that this decoding is observed for familiar but not unfamiliar visual objects in S1. In addition, whole-brain searchlight decoding analyses revealed several areas in the parietal lobe that could mediate the observed context effects between vision and somatosensation. These results demonstrate that even the first cortical stages of somatosensory processing carry information about the category of visually presented familiar objects. © The Author 2013. Published by Oxford University Press.

  1. Decoding Visual Object Categories in Early Somatosensory Cortex

    PubMed Central

    Smith, Fraser W.; Goodale, Melvyn A.

    2015-01-01

    Neurons, even in the earliest sensory areas of cortex, are subject to a great deal of contextual influence from both within and across modality connections. In the present work, we investigated whether the earliest regions of somatosensory cortex (S1 and S2) would contain content-specific information about visual object categories. We reasoned that this might be possible due to the associations formed through experience that link different sensory aspects of a given object. Participants were presented with visual images of different object categories in 2 fMRI experiments. Multivariate pattern analysis revealed reliable decoding of familiar visual object category in bilateral S1 (i.e., postcentral gyri) and right S2. We further show that this decoding is observed for familiar but not unfamiliar visual objects in S1. In addition, whole-brain searchlight decoding analyses revealed several areas in the parietal lobe that could mediate the observed context effects between vision and somatosensation. These results demonstrate that even the first cortical stages of somatosensory processing carry information about the category of visually presented familiar objects. PMID:24122136

  2. Comparing the visual spans for faces and letters

    PubMed Central

    He, Yingchen; Scholz, Jennifer M.; Gage, Rachel; Kallie, Christopher S.; Liu, Tingting; Legge, Gordon E.

    2015-01-01

    The visual span—the number of adjacent text letters that can be reliably recognized on one fixation—has been proposed as a sensory bottleneck that limits reading speed (Legge, Mansfield, & Chung, 2001). Like reading, searching for a face is an important daily task that involves pattern recognition. Is there a similar limitation on the number of faces that can be recognized in a single fixation? Here we report on a study in which we measured and compared the visual-span profiles for letter and face recognition. A serial two-stage model for pattern recognition was developed to interpret the data. The first stage is characterized by factors limiting recognition of isolated letters or faces, and the second stage represents the interfering effect of nearby stimuli on recognition. Our findings show that the visual span for faces is smaller than that for letters. Surprisingly, however, when differences in first-stage processing for letters and faces are accounted for, the two visual spans become nearly identical. These results suggest that the concept of visual span may describe a common sensory bottleneck that underlies different types of pattern recognition. PMID:26129858

  3. Enhanced dimension-specific visual working memory in grapheme–color synesthesia

    PubMed Central

    Terhune, Devin Blair; Wudarczyk, Olga Anna; Kochuparampil, Priya; Cohen Kadosh, Roi

    2013-01-01

    There is emerging evidence that the encoding of visual information and the maintenance of this information in a temporarily accessible state in working memory rely on the same neural mechanisms. A consequence of this overlap is that atypical forms of perception should influence working memory. We examined this by investigating whether having grapheme–color synesthesia, a condition characterized by the involuntary experience of color photisms when reading or representing graphemes, would confer benefits on working memory. Two competing hypotheses propose that superior memory in synesthesia results from information being coded in two information channels (dual-coding) or from superior dimension-specific visual processing (enhanced processing). We discriminated between these hypotheses in three n-back experiments in which controls and synesthetes viewed inducer and non-inducer graphemes and maintained color or grapheme information in working memory. Synesthetes displayed better color working memory than controls for both grapheme types, whereas the two groups did not differ in grapheme working memory. Further analyses excluded the possibilities of enhanced working memory among synesthetes being due to greater color discrimination, stimulus color familiarity, or bidirectionality. These results reveal enhanced dimension-specific visual working memory in this population and supply further evidence for a close relationship between sensory processing and the maintenance of sensory information in working memory. PMID:23892185

  4. Contributions of Low and High Spatial Frequency Processing to Impaired Object Recognition Circuitry in Schizophrenia

    PubMed Central

    Calderone, Daniel J.; Hoptman, Matthew J.; Martínez, Antígona; Nair-Collins, Sangeeta; Mauro, Cristina J.; Bar, Moshe; Javitt, Daniel C.; Butler, Pamela D.

    2013-01-01

    Patients with schizophrenia exhibit cognitive and sensory impairment, and object recognition deficits have been linked to sensory deficits. The “frame and fill” model of object recognition posits that low spatial frequency (LSF) information rapidly reaches the prefrontal cortex (PFC) and creates a general shape of an object that feeds back to the ventral temporal cortex to assist object recognition. Visual dysfunction findings in schizophrenia suggest a preferential loss of LSF information. This study used functional magnetic resonance imaging (fMRI) and resting state functional connectivity (RSFC) to investigate the contribution of visual deficits to impaired object “framing” circuitry in schizophrenia. Participants were shown object stimuli that were intact or contained only LSF or high spatial frequency (HSF) information. For controls, fMRI revealed preferential activation to LSF information in precuneus, superior temporal, and medial and dorsolateral PFC areas, whereas patients showed a preference for HSF information or no preference. RSFC revealed a lack of connectivity between early visual areas and PFC for patients. These results demonstrate impaired processing of LSF information during object recognition in schizophrenia, with patients instead displaying increased processing of HSF information. This is consistent with findings of a preference for local over global visual information in schizophrenia. PMID:22735157

  5. Feature-based attention elicits surround suppression in feature space.

    PubMed

    Störmer, Viola S; Alvarez, George A

    2014-09-08

    It is known that focusing attention on a particular feature (e.g., the color red) facilitates the processing of all objects in the visual field containing that feature [1-7]. Here, we show that such feature-based attention not only facilitates processing but also actively inhibits processing of similar, but not identical, features globally across the visual field. We combined behavior and electrophysiological recordings of frequency-tagged potentials in human observers to measure this inhibitory surround in feature space. We found that sensory signals of an attended color (e.g., red) were enhanced, whereas sensory signals of colors similar to the target color (e.g., orange) were suppressed relative to colors more distinct from the target color (e.g., yellow). Importantly, this inhibitory effect spreads globally across the visual field, thus operating independently of location. These findings suggest that feature-based attention comprises an excitatory peak surrounded by a narrow inhibitory zone in color space to attenuate the most distracting and potentially confusable stimuli during visual perception. This selection profile is akin to what has been reported for location-based attention [8-10] and thus suggests that such center-surround mechanisms are an overarching principle of attention across different domains in the human brain. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. How previous experience shapes perception in different sensory modalities

    PubMed Central

    Snyder, Joel S.; Schwiedrzik, Caspar M.; Vitela, A. Davi; Melloni, Lucia

    2015-01-01

    What has transpired immediately before has a strong influence on how sensory stimuli are processed and perceived. In particular, temporal context effects (TCEs) can be contrastive, repelling perception away from the interpretation of the context stimulus, or attractive, whereby perception repeats upon successive presentations of the same stimulus. For decades, scientists have documented contrastive and attractive TCEs mostly with simple visual stimuli. But both types of effects also occur in other modalities, e.g., audition and touch, and for stimuli of varying complexity, raising the possibility that context effects reflect general computational principles of sensory systems. Neuroimaging shows that contrastive and attractive context effects arise from neural processes in different areas of the cerebral cortex, suggesting two separate operations with distinct functional roles. Bayesian models can provide a functional account of both context effects, whereby prior experience adjusts sensory systems to optimize perception of future stimuli. PMID:26582982

  7. Is it me? Self-recognition bias across sensory modalities and its relationship to autistic traits.

    PubMed

    Chakraborty, Anya; Chakrabarti, Bhismadev

    2015-01-01

    Atypical self-processing is an emerging theme in autism research, suggested by a lower self-reference effect in memory and atypical neural responses to visual self-representations. Most research on physical self-processing in autism uses visual stimuli. However, the self is a multimodal construct, and it is therefore essential to test self-recognition in other sensory modalities as well. Self-recognition in the auditory modality remains relatively unexplored and has not been tested in relation to autism and related traits. This study investigates self-recognition in the auditory and visual domains in the general population and tests whether it is associated with autistic traits. Thirty-nine neurotypical adults participated in a two-part study. In the first session, each participant's voice was recorded and face photographed, and these were morphed with voices and faces from unfamiliar identities. In the second session, participants performed a 'self-identification' task, classifying each morph as a 'self' voice (or face) or an 'other' voice (or face). All participants also completed the Autism Spectrum Quotient (AQ). For each sensory modality, the slope of the self-recognition curve was used as the individual self-recognition metric. These two self-recognition metrics were tested for association with each other and with autistic traits. The 50% 'self' response was reached at a higher percentage of self in the auditory domain than in the visual domain (t = 3.142; P < 0.01). No significant correlation was noted between self-recognition bias across sensory modalities (τ = -0.165, P = 0.204). Higher recognition bias for self-voice was observed in individuals higher in autistic traits (τ AQ = 0.301, P = 0.008). No such correlation was observed between recognition bias for self-face and autistic traits (τ AQ = -0.020, P = 0.438). Our data show that recognition bias for physical self-representation is not related across sensory modalities. Further, individuals with higher autistic traits were better able to discriminate self from other voices, but this relation was not observed for self-face. The narrower self-other overlap in the auditory domain seen in individuals with high autistic traits could arise from the enhanced perceptual processing of auditory stimuli often observed in individuals with autism.
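    The slope-based self-recognition metric described in the record above can be sketched by fitting a logistic psychometric curve to 'self' responses across morph levels. The data, function names, and grid-search fitting procedure below are illustrative assumptions, not the study's actual analysis pipeline.

    ```python
    # Hypothetical sketch: fit a logistic psychometric curve to 'self' responses
    # as a function of % self in the morph; the slope is the recognition metric,
    # and the 50%-'self' point is where the curve crosses 0.5. Toy data only.
    import math

    def logistic(x, pse, slope):
        """P('self' response) for a morph containing x% self."""
        return 1.0 / (1.0 + math.exp(-slope * (x - pse)))

    def fit_psychometric(morph_pct, p_self):
        """Brute-force grid search; returns (pse, slope) minimising squared error."""
        best = (None, None, float("inf"))
        for pse in range(0, 101):                       # candidate 50% points (% self)
            for s in (k / 100 for k in range(1, 101)):  # candidate slopes
                err = sum((logistic(x, pse, s) - p) ** 2
                          for x, p in zip(morph_pct, p_self))
                if err < best[2]:
                    best = (pse, s, err)
        return best[0], best[1]

    # Toy data: proportion of 'self' responses at each morph level (% self).
    levels = [0, 20, 40, 60, 80, 100]
    voice = [0.02, 0.05, 0.20, 0.70, 0.95, 0.99]   # steeper, later crossover
    face  = [0.05, 0.25, 0.45, 0.60, 0.80, 0.95]   # shallower

    v_pse, v_slope = fit_psychometric(levels, voice)
    f_pse, f_slope = fit_psychometric(levels, face)
    print(v_pse > f_pse)      # 50% 'self' reached at a higher % self for voice
    print(v_slope > f_slope)  # steeper curve = sharper self/other discrimination
    ```

    A steeper fitted slope corresponds to the narrower self-other overlap the abstract associates with higher autistic traits in the voice domain.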

  8. 75 FR 54915 - Notice Pursuant to the National Cooperative Research and Production Act of 1993-Sensory System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-09

    ... DEPARTMENT OF JUSTICE Antitrust Division Notice Pursuant to the National Cooperative Research and Production Act of 1993--Sensory System for Critical Infrastructure Defect Recognition, Visualization and... Critical Infrastructure Defect Recognition, Visualization and Failure Prediction ("Sensory System") has...

  9. Sensori-Motor Experience Leads to Changes in Visual Processing in the Developing Brain

    ERIC Educational Resources Information Center

    James, Karin Harman

    2010-01-01

    Since Broca's studies on language processing, cortical functional specialization has been considered to be integral to efficient neural processing. A fundamental question in cognitive neuroscience concerns the type of learning that is required for functional specialization to develop. To address this issue with respect to the development of neural…

  10. Dynamic interactions between visual working memory and saccade target selection

    PubMed Central

    Schneegans, Sebastian; Spencer, John P.; Schöner, Gregor; Hwang, Seongmin; Hollingworth, Andrew

    2014-01-01

    Recent psychophysical experiments have shown that working memory for visual surface features interacts with saccadic motor planning, even in tasks where the saccade target is unambiguously specified by spatial cues. Specifically, a match between a memorized color and the color of either the designated target or a distractor stimulus influences saccade target selection, saccade amplitudes, and latencies in a systematic fashion. To elucidate these effects, we present a dynamic neural field model in combination with new experimental data. The model captures the neural processes underlying visual perception, working memory, and saccade planning relevant to the psychophysical experiment. It consists of a low-level visual sensory representation that interacts with two separate pathways: a spatial pathway implementing spatial attention and saccade generation, and a surface feature pathway implementing color working memory and feature attention. Due to bidirectional coupling between visual working memory and feature attention in the model, the working memory content can indirectly exert an effect on perceptual processing in the low-level sensory representation. This in turn biases saccadic movement planning in the spatial pathway, allowing the model to quantitatively reproduce the observed interaction effects. The continuous coupling between representations in the model also implies that modulation should be bidirectional, and model simulations provide specific predictions for complementary effects of saccade target selection on visual working memory. These predictions were empirically confirmed in a new experiment: Memory for a sample color was biased toward the color of a task-irrelevant saccade target object, demonstrating the bidirectional coupling between visual working memory and perceptual processing. PMID:25228628
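    The biased competition at the heart of the model above can be illustrated, in a much-reduced form, with a two-node winner-take-all sketch: two candidate saccade targets receive equal visual drive, and a small extra input to one node stands in for the working-memory (color-match) bias. The dynamics and parameters are our own illustrative choices, not the authors' dynamic neural field model.

    ```python
    # Minimal winner-take-all sketch of biased saccade target selection:
    # two nodes with mutual inhibition; one receives a small memory-match bias.
    # Parameters are illustrative assumptions, not fitted model values.

    def compete(bias, steps=400, dt=0.1):
        """Return final activations (u_other, u_match) after competition."""
        u1, u2 = 0.0, 0.0                         # u1: non-matching, u2: matching
        for _ in range(steps):
            f1, f2 = max(u1, 0.0), max(u2, 0.0)   # rectified node outputs
            du1 = -u1 + 1.0 - 1.5 * f2            # equal visual drive (1.0)...
            du2 = -u2 + 1.0 + bias - 1.5 * f1     # ...plus the memory-match bias
            u1 += dt * du1
            u2 += dt * du2
        return u1, u2

    u_other, u_match = compete(bias=0.2)
    print(u_match > u_other)   # the memory-matching target wins the competition
    ```

    Because the mutual inhibition is stronger than the self-decay, even a weak bias reliably tips the competition, mirroring how a task-irrelevant memory match can systematically influence target selection.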

  11. Difference in Visual Processing Assessed by Eye Vergence Movements

    PubMed Central

    Solé Puig, Maria; Puigcerver, Laura; Aznar-Casanova, J. Antonio; Supèr, Hans

    2013-01-01

    Orienting visual attention is closely linked to the oculomotor system. For example, a shift of attention is usually followed by a saccadic eye movement and can be revealed by microsaccades. Recently we reported a novel role of another type of eye movement, namely eye vergence, in orienting visual attention. Shifts in visuospatial attention are characterized by the response modulation to a selected target. However, unlike (micro-)saccades, eye vergence movements do not carry spatial information (except for depth) and are thus not specific to a particular visual location. To further understand the role of eye vergence in visual attention, we tested subjects with different perceptual styles. Perceptual style refers to the characteristic way individuals perceive environmental stimuli and is characterized by a spatial difference (local vs. global) in perceptual processing. We tested field-independent (local; FI) and field-dependent (global; FD) observers in a cue/no-cue task and a matching task. We found that FI observers responded faster and showed stronger modulation in eye vergence in both tasks than FD observers. The results suggest that eye vergence modulation may relate to the trade-off between the size of the spatial region covered by attention and the processing efficiency of sensory information. Alternatively, vergence modulation may have a role in the switch in cortical state that prepares the visual system for new incoming sensory information. In conclusion, vergence eye movements may be added to the growing list of functions of fixational eye movements in visual perception. However, further studies are needed to elucidate this role. PMID:24069140

  12. Impairments of Multisensory Integration and Cross-Sensory Learning as Pathways to Dyslexia

    PubMed Central

    Hahn, Noemi; Foxe, John J.; Molholm, Sophie

    2014-01-01

    Two sensory systems are intrinsic to learning to read. Written words enter the brain through the visual system and associated sounds through the auditory system. The task before the beginning reader is quite basic. She must learn correspondences between orthographic tokens and phonemic utterances, and she must do this to the point that there is seamless automatic ‘connection’ between these sensorially distinct units of language. It is self-evident then that learning to read requires formation of cross-sensory associations to the point that deeply encoded multisensory representations are attained. While the majority of individuals manage this task to a high degree of expertise, some struggle to attain even rudimentary capabilities. Why do dyslexic individuals, who learn well in myriad other domains, fail at this particular task? Here, we examine the literature as it pertains to multisensory processing in dyslexia. We find substantial support for multisensory deficits in dyslexia, and make the case that to fully understand its neurological basis, it will be necessary to thoroughly probe the integrity of auditory-visual integration mechanisms. PMID:25265514

  13. Age-Related Changes in Visual Temporal Order Judgment Performance: Relation to Sensory and Cognitive Capacities

    PubMed Central

    Busey, Thomas; Craig, James; Clark, Chris; Humes, Larry

    2010-01-01

    Five measures of temporal order judgments were obtained from 261 participants, including 146 elder, 44 middle aged, and 71 young participants. Strong age group differences were observed in all five measures, although the group differences were reduced when letter discriminability was matched for all participants. Significant relations were found between these measures of temporal processing and several cognitive and sensory assays, and structural equation modeling revealed the degree to which temporal order processing can be viewed as a latent factor that depends in part on contributions from sensory and cognitive capacities. The best-fitting model involved two different latent factors representing temporal order processing at same and different locations, and the sensory and cognitive factors were more successful predicting performance in the different location factor than the same-location factor. Processing speed, even measured using high-contrast symbols on a paper-and-pencil test, was a surprisingly strong predictor of variability in both latent factors. However, low-level sensory measures also made significant contributions to the latent factors. The results demonstrate the degree to which temporal order processing relates to other perceptual and cognitive capacities, and address the question of whether age-related declines in these capacities share a common cause. PMID:20580644

  14. Age-related changes in visual temporal order judgment performance: Relation to sensory and cognitive capacities.

    PubMed

    Busey, Thomas; Craig, James; Clark, Chris; Humes, Larry

    2010-08-06

    Five measures of temporal order judgments were obtained from 261 participants, including 146 elder, 44 middle aged, and 71 young participants. Strong age group differences were observed in all five measures, although the group differences were reduced when letter discriminability was matched for all participants. Significant relations were found between these measures of temporal processing and several cognitive and sensory assays, and structural equation modeling revealed the degree to which temporal order processing can be viewed as a latent factor that depends in part on contributions from sensory and cognitive capacities. The best-fitting model involved two different latent factors representing temporal order processing at same and different locations, and the sensory and cognitive factors were more successful predicting performance in the different location factor than the same-location factor. Processing speed, even measured using high-contrast symbols on a paper-and-pencil test, was a surprisingly strong predictor of variability in both latent factors. However, low-level sensory measures also made significant contributions to the latent factors. The results demonstrate the degree to which temporal order processing relates to other perceptual and cognitive capacities, and address the question of whether age-related declines in these capacities share a common cause. Copyright 2010 Elsevier Ltd. All rights reserved.

  15. Development of the Classroom Sensory Environment Assessment (CSEA).

    PubMed

    Kuhaneck, Heather Miller; Kelleher, Jaqueline

    2015-01-01

    The Classroom Sensory Environment Assessment (CSEA) is a tool that provides a means of understanding the impact of a classroom's sensory environment on student behavior. The purpose of the CSEA is to promote collaboration between occupational therapists and elementary education teachers. In particular, students with autism spectrum disorder included in general education classrooms may benefit from a suitable match between the sensory environment and their unique sensory preferences, created through this collaborative process. The development of the CSEA has occurred in multiple stages over 2 yr. This article reports on descriptive results for 152 classrooms and initial reliability results. Descriptive information suggests that classrooms are environments with an enormous variety of sensory experiences that can be quantified. Visual experiences are most frequent. The tool has adequate internal consistency but requires further investigation of interrater reliability and validity. Copyright © 2015 by the American Occupational Therapy Association, Inc.
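    The "internal consistency" reported in the record above is conventionally quantified with Cronbach's alpha. A minimal sketch follows, using made-up item scores rather than actual CSEA data.

    ```python
    # Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance(totals)).
    # Item scores below are invented for illustration; they are not CSEA items.

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    def cronbach_alpha(items):
        """items: one list of scores per item, aligned across respondents."""
        k = len(items)
        totals = [sum(scores) for scores in zip(*items)]   # per-respondent totals
        item_var = sum(variance(it) for it in items)
        return k / (k - 1) * (1 - item_var / variance(totals))

    # Four hypothetical items rated for six classrooms (higher = more stimulation).
    items = [
        [3, 4, 2, 5, 4, 3],
        [3, 5, 2, 4, 4, 2],
        [2, 4, 1, 5, 3, 3],
        [3, 4, 2, 4, 5, 3],
    ]
    alpha = cronbach_alpha(items)
    print(round(alpha, 2))   # values near or above 0.7 are conventionally "adequate"
    ```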

  16. Athletic training in badminton players modulates the early C1 component of visual evoked potentials: a preliminary investigation.

    PubMed

    Jin, Hua; Xu, Guiping; Zhang, John X; Ye, Zuoer; Wang, Shufang; Zhao, Lun; Lin, Chong-De; Mo, Lei

    2010-12-01

    One basic question in brain plasticity research is whether individual life experience in the normal population can affect very early sensory-perceptual processing. Athletes provide a possible model to explore plasticity of the visual cortex as athletic training in confrontational ball games is quite often accompanied by training of the visual system. We asked professional badminton players to watch video clips related to their training experience and predict where the ball would land and examined whether they differed from non-player controls in the elicited C1, a visual evoked potential indexing V1 activity. Compared with controls, the players made judgments significantly more accurately, albeit not faster. An early ERP component peaking around 65 ms post-stimulus with a scalp topography centering at the occipital pole (electrode Oz) was observed in both groups and interpreted as the C1 component. With comparable latency, amplitudes of this component were significantly larger for the players than for the non-players, suggesting that it can be modulated by long-term physical training. The results present a clear case of experience-induced brain plasticity in primary visual cortex for very early sensory processing. Copyright © 2010 Elsevier B.V. All rights reserved.

  17. Electrophysiological evidence for a self-processing advantage during audiovisual speech integration.

    PubMed

    Treille, Avril; Vilain, Coriandre; Kandel, Sonia; Sato, Marc

    2017-09-01

    Previous electrophysiological studies have provided strong evidence for early multisensory integrative mechanisms during audiovisual speech perception. From these studies, one unanswered issue is whether hearing our own voice and seeing our own articulatory gestures facilitate speech perception, possibly through better processing and integration of sensory inputs with our own sensory-motor knowledge. The present EEG study examined the impact of self-knowledge during the perception of auditory (A), visual (V) and audiovisual (AV) speech stimuli that were previously recorded from the participant or from a speaker he/she had never met. Audiovisual interactions were estimated by comparing N1 and P2 auditory evoked potentials during the bimodal condition (AV) with the sum of those observed in the unimodal conditions (A + V). In line with previous EEG studies, our results revealed an amplitude decrease of P2 auditory evoked potentials in AV compared to A + V conditions. Crucially, a temporal facilitation of N1 responses was observed during the visual perception of self speech movements compared to those of another speaker. This facilitation was negatively correlated with the saliency of visual stimuli. These results provide evidence for a temporal facilitation of the integration of auditory and visual speech signals when the visual situation involves our own speech gestures.
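    The additive comparison used in the record above (AV versus A + V) can be sketched with toy evoked responses: build unimodal waveforms, sum them, and compare peak amplitudes against a bimodal waveform with a suppressed P2. The waveforms are synthetic Gaussian components, not EEG data.

    ```python
    # Sketch of the AV vs. (A + V) additive test with synthetic ERP components.
    # All amplitudes, latencies, and widths are illustrative assumptions.
    import math

    t = list(range(300))   # 0..299 ms post-stimulus, one sample per ms

    def component(t_ms, peak_ms, width_ms, amp):
        return amp * math.exp(-((t_ms - peak_ms) ** 2) / (2 * width_ms ** 2))

    # Toy auditory ERP: negative N1 (~100 ms) then positive P2 (~200 ms).
    A = [component(x, 100, 15, -4.0) + component(x, 200, 25, 5.0) for x in t]
    V = [component(x, 170, 40, 2.0) for x in t]            # toy visual ERP
    # Bimodal response with a reduced P2, mimicking the reported AV < A + V effect.
    AV = [component(x, 100, 15, -4.0) + component(x, 200, 25, 3.5)
          + component(x, 170, 40, 2.0) for x in t]

    additive = [a + v for a, v in zip(A, V)]               # the A + V prediction
    p2_av = max(AV[150:250])                               # P2 window peak
    p2_sum = max(additive[150:250])
    print(p2_av < p2_sum)   # P2 amplitude reduced in AV relative to A + V
    ```

    A sub-additive bimodal response (AV < A + V) in this comparison is the usual electrophysiological signature of audiovisual interaction.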

  18. Higher-order neural processing tunes motion neurons to visual ecology in three species of hawkmoths.

    PubMed

    Stöckl, A L; O'Carroll, D; Warrant, E J

    2017-06-28

    To sample information optimally, sensory systems must adapt to the ecological demands of each animal species. These adaptations can occur peripherally, in the anatomical structures of sensory organs and their receptors; and centrally, as higher-order neural processing in the brain. While a rich body of investigations has focused on peripheral adaptations, our understanding is sparse when it comes to central mechanisms. We quantified how peripheral adaptations in the eyes, and central adaptations in the wide-field motion vision system, set the trade-off between resolution and sensitivity in three species of hawkmoths active at very different light levels: nocturnal Deilephila elpenor, crepuscular Manduca sexta, and diurnal Macroglossum stellatarum. Using optical measurements and physiological recordings from the photoreceptors and wide-field motion neurons in the lobula complex, we demonstrate that all three species use spatial and temporal summation to improve visual performance in dim light. The diurnal Macroglossum relies least on summation, but can only see at brighter intensities. Manduca, with large sensitive eyes, relies less on neural summation than the smaller-eyed Deilephila, but both species attain similar visual performance at nocturnal light levels. Our results reveal how the visual systems of these three hawkmoth species are intimately matched to their visual ecologies. © 2017 The Author(s).
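    The resolution-sensitivity trade-off in the record above rests on a simple statistical fact: pooling n noisy samples (across neighboring photoreceptors or time bins) raises signal-to-noise by roughly the square root of n, at the cost of spatial or temporal resolution. A toy simulation, with invented signal and noise levels:

    ```python
    # Summation sketch: averaging pooled noisy samples improves empirical SNR.
    # Signal and noise magnitudes are arbitrary illustrative values.
    import random
    import statistics

    random.seed(1)                  # reproducible draws
    SIGNAL = 1.0                    # dim-light photoreceptor signal
    NOISE_SD = 2.0                  # photon/transducer noise

    def sample():
        return SIGNAL + random.gauss(0, NOISE_SD)

    def snr(pool_n, trials=2000):
        """Empirical SNR of the mean over pool_n pooled samples."""
        means = [statistics.fmean(sample() for _ in range(pool_n))
                 for _ in range(trials)]
        return statistics.fmean(means) / statistics.stdev(means)

    s1 = snr(1)    # no summation
    s9 = snr(9)    # pool 9 receptors/time bins (e.g. a 3x3 neighborhood)
    print(s9 > s1) # summation boosts SNR, at the cost of resolution
    ```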

  19. Perceptual Literacy and the Construction of Significant Meanings within Art Education

    ERIC Educational Resources Information Center

    Cerkez, Beatriz Tomsic

    2014-01-01

    In order to verify how important the ability to process visual images and sounds in a holistic way can be, we developed an experiment based on the production and reception of an art work that was conceived as a multi-sensorial experience and implied a complex understanding of visual and auditory information. We departed from the idea that to…

  20. The strength of attentional biases reduces as visual short-term memory load increases

    PubMed Central

    Shimi, A.

    2013-01-01

    Despite our visual system receiving irrelevant input that competes with task-relevant signals, we are able to pursue our perceptual goals. Attention enhances our visual processing by biasing the processing of the input that is relevant to the task at hand. The top-down signals enabling these biases are therefore important for regulating lower level sensory mechanisms. In three experiments, we examined whether we apply similar biases to successfully maintain information in visual short-term memory (VSTM). We presented participants with targets alongside distracters and we graded their perceptual similarity to vary the extent to which they competed. Experiments 1 and 2 showed that the more items held in VSTM before the onset of the distracters, the more perceptually distinct the distracters needed to be for participants to retain the target accurately. Experiment 3 extended these behavioral findings by demonstrating that the perceptual similarity between target and distracters exerted a significantly greater effect on occipital alpha amplitudes, depending on the number of items already held in VSTM. The trade-off between VSTM load and target-distracter competition suggests that VSTM and perceptual competition share a partially overlapping mechanism, namely top-down inputs into sensory areas. PMID:23576694

  1. Proprioceptive feedback determines visuomotor gain in Drosophila

    PubMed Central

    Bartussek, Jan; Lehmann, Fritz-Olaf

    2016-01-01

    Multisensory integration is a prerequisite for effective locomotor control in most animals. Especially, the impressive aerial performance of insects relies on rapid and precise integration of multiple sensory modalities that provide feedback on different time scales. In flies, continuous visual signalling from the compound eyes is fused with phasic proprioceptive feedback to ensure precise neural activation of wing steering muscles (WSM) within narrow temporal phase bands of the stroke cycle. This phase-locked activation relies on mechanoreceptors distributed over wings and gyroscopic halteres. Here we investigate visual steering performance of tethered flying fruit flies with reduced haltere and wing feedback signalling. Using a flight simulator, we evaluated visual object fixation behaviour, optomotor altitude control and saccadic escape reflexes. The behavioural assays show an antagonistic effect of wing and haltere signalling on visuomotor gain during flight. Compared with controls, suppression of haltere feedback attenuates while suppression of wing feedback enhances the animal’s wing steering range. Our results suggest that the generation of motor commands owing to visual perception is dynamically controlled by proprioception. We outline a potential physiological mechanism based on the biomechanical properties of WSM and sensory integration processes at the level of motoneurons. Collectively, the findings contribute to our general understanding of how moving animals integrate sensory information with dynamically changing temporal structure. PMID:26909184

  2. Auditory to Visual Cross-Modal Adaptation for Emotion: Psychophysical and Neural Correlates.

    PubMed

    Wang, Xiaodong; Guo, Xiaotao; Chen, Lin; Liu, Yijun; Goldberg, Michael E; Xu, Hong

    2017-02-01

    Adaptation is fundamental in sensory processing and has been studied extensively within the same sensory modality. However, little is known about adaptation across sensory modalities, especially in the context of high-level processing, such as the perception of emotion. Previous studies have shown that prolonged exposure to a face exhibiting one emotion, such as happiness, leads to contrastive biases in the perception of subsequently presented faces toward the opposite emotion, such as sadness. Such work has shown the importance of adaptation in calibrating face perception based on prior visual exposure. In the present study, we showed for the first time that emotion-laden sounds, like laughter, adapt the visual perception of emotional faces, that is, subjects more frequently perceived faces as sad after listening to a happy sound. Furthermore, via electroencephalography recordings and event-related potential analysis, we showed that there was a neural correlate underlying the perceptual bias: There was an attenuated response occurring at ∼ 400 ms to happy test faces and a quickened response to sad test faces, after exposure to a happy sound. Our results provide the first direct evidence for a behavioral cross-modal adaptation effect on the perception of facial emotion, and its neural correlate. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  3. Managing daily life with age-related sensory loss: cognitive resources gain in importance.

    PubMed

    Heyl, Vera; Wahl, Hans-Werner

    2012-06-01

    This paper investigates the role of cognitive resources in everyday functioning, comparing visually impaired, hearing impaired, and sensory unimpaired older adults. Because cognitive resources are argued to take on increased importance among sensory impaired individuals, who may also be more aware of cognitive restrictions, in particular visually impaired individuals, we hypothesized differential relationships between resources and outcomes when comparing sensory impaired and sensory unimpaired older adults. Findings are based on samples of 121 visually impaired, 116 hearing impaired, and 150 sensory unimpaired older adults (M = 82 years). Results from a sample of 43 dual sensory impaired older adults are reported for comparison. Assessment relied on established instruments (e.g., WAIS-R, ADL/IADL). Structural equation modeling showed that cognitive resources and behavior-related everyday functioning were more strongly related in the sensory impaired groups as compared to the sensory unimpaired group. Cognitive resources and evaluation of everyday functioning were significantly linked only among the sensory impaired groups. When medical condition was controlled for, these effects persisted. It is concluded that both cognitive training as well as psychosocial support may serve as important additions to classic vision and hearing loss rehabilitation. PsycINFO Database Record (c) 2012 APA, all rights reserved

  4. Temporal Processing in the Visual Cortex of the Awake and Anesthetized Rat.

    PubMed

    Aasebø, Ida E J; Lepperød, Mikkel E; Stavrinou, Maria; Nøkkevangen, Sandra; Einevoll, Gaute; Hafting, Torkel; Fyhn, Marianne

    2017-01-01

    The activity pattern and temporal dynamics within and between neuron ensembles are essential features of information processing and believed to be profoundly affected by anesthesia. Much of our general understanding of sensory information processing, including computational models aimed at mathematically simulating sensory information processing, rely on parameters derived from recordings conducted on animals under anesthesia. Due to the high variety of neuronal subtypes in the brain, population-based estimates of the impact of anesthesia may conceal unit- or ensemble-specific effects of the transition between states. Using tetrodes chronically implanted in the primary visual cortex (V1) of rats, we conducted extracellular recordings of single units and followed the same cell ensembles in the awake and anesthetized states. We found that the transition from wakefulness to anesthesia involves unpredictable changes in temporal response characteristics. The latency of single-unit responses to visual stimulation was delayed in anesthesia, with large individual variations between units. Pair-wise correlations between units increased under anesthesia, indicating more synchronized activity. Further, the units within an ensemble show reproducible temporal activity patterns in response to visual stimuli that change between states, suggesting state-dependent sequences of activity. The current dataset, with recordings from the same neural ensembles across states, is well suited for validating and testing computational network models. This can lead to testable predictions, bring a deeper understanding of the experimental findings and improve models of neural information processing. Here, we exemplify such a workflow using a Brunel network model.
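    The pairwise-correlation comparison described in the record above can be sketched as the Pearson correlation of binned spike counts for every unit pair, contrasted between states. The spike-count data below are invented for illustration; under "anesthesia" a shared slow drive makes the units co-fluctuate, as with cortical up/down states.

    ```python
    # Mean pairwise Pearson correlation of binned spike counts, per state.
    # All counts are toy data, not recordings.

    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        vx = sum((a - mx) ** 2 for a in x)
        vy = sum((b - my) ** 2 for b in y)
        return cov / (vx * vy) ** 0.5

    def mean_pairwise_corr(units):
        rs = [pearson(units[i], units[j])
              for i in range(len(units)) for j in range(i + 1, len(units))]
        return sum(rs) / len(rs)

    # Toy binned spike counts for three units in the awake state.
    awake = [[5, 3, 6, 4, 5, 4, 6, 3],
             [2, 6, 3, 5, 2, 6, 4, 5],
             [4, 5, 3, 6, 5, 3, 6, 4]]
    # "Anesthetized" state: a shared slow drive is added to every unit.
    shared = [8, 1, 8, 1, 8, 1, 8, 1]
    anesth = [[s + u for s, u in zip(shared, row)] for row in awake]

    c_awake = mean_pairwise_corr(awake)
    c_anesth = mean_pairwise_corr(anesth)
    print(c_anesth > c_awake)   # shared drive raises pairwise correlations
    ```

    The same computation applied to real binned spike trains would reproduce the reported increase in pairwise correlations under anesthesia whenever a common slow modulation dominates the counts.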

  5. Temporal Processing in the Visual Cortex of the Awake and Anesthetized Rat

    PubMed Central

    Aasebø, Ida E. J.; Stavrinou, Maria; Nøkkevangen, Sandra; Einevoll, Gaute

    2017-01-01

    The activity pattern and temporal dynamics within and between neuron ensembles are essential features of information processing and believed to be profoundly affected by anesthesia. Much of our general understanding of sensory information processing, including computational models aimed at mathematically simulating sensory information processing, rely on parameters derived from recordings conducted on animals under anesthesia. Due to the high variety of neuronal subtypes in the brain, population-based estimates of the impact of anesthesia may conceal unit- or ensemble-specific effects of the transition between states. Using tetrodes chronically implanted in the primary visual cortex (V1) of rats, we conducted extracellular recordings of single units and followed the same cell ensembles in the awake and anesthetized states. We found that the transition from wakefulness to anesthesia involves unpredictable changes in temporal response characteristics. The latency of single-unit responses to visual stimulation was delayed in anesthesia, with large individual variations between units. Pair-wise correlations between units increased under anesthesia, indicating more synchronized activity. Further, the units within an ensemble show reproducible temporal activity patterns in response to visual stimuli that change between states, suggesting state-dependent sequences of activity. The current dataset, with recordings from the same neural ensembles across states, is well suited for validating and testing computational network models. This can lead to testable predictions, bring a deeper understanding of the experimental findings and improve models of neural information processing. Here, we exemplify such a workflow using a Brunel network model. PMID:28791331

  6. Mismatch Negativity with Visual-only and Audiovisual Speech

    PubMed Central

    Ponton, Curtis W.; Bernstein, Lynne E.; Auer, Edward T.

    2009-01-01

    The functional organization of cortical speech processing is thought to be hierarchical, increasing in complexity and proceeding from primary sensory areas centrifugally. The current study used the mismatch negativity (MMN) obtained with electrophysiology (EEG) to investigate the early latency period of visual speech processing under both visual-only (VO) and audiovisual (AV) conditions. Current density reconstruction (CDR) methods were used to model the cortical MMN generator locations. MMNs were obtained with VO and AV speech stimuli at early latencies (approximately 82-87 ms peak in time waveforms relative to the acoustic onset) and in regions of the right lateral temporal and parietal cortices. Latencies were consistent with bottom-up processing of the visible stimuli. We suggest that a visual pathway extracts phonetic cues from visible speech, and that previously reported effects of AV speech in classical early auditory areas, given later reported latencies, could be attributable to modulatory feedback from visual phonetic processing. PMID:19404730

  7. Early Visual Deprivation Alters Multisensory Processing in Peripersonal Space

    ERIC Educational Resources Information Center

    Collignon, Olivier; Charbonneau, Genevieve; Lassonde, Maryse; Lepore, Franco

    2009-01-01

    Multisensory peripersonal space develops in a maturational process that is thought to be influenced by early sensory experience. We investigated the role of vision in the effective development of audiotactile interactions in peripersonal space. Early blind (EB), late blind (LB) and sighted control (SC) participants were asked to lateralize…

  8. Postural and Cortical Responses Following Visual Occlusion in Adults with and without ASD

    ERIC Educational Resources Information Center

    Goh, Kwang Leng; Morris, Susan; Parsons, Richard; Ring, Alexander; Tan, Tele

    2018-01-01

    Autism is associated with differences in sensory processing and motor coordination. Evidence from electroencephalography suggests individual perturbation evoked response (PER) components represent specific aspects of postural disturbance processing; P1 reflects the detection and N1 reflects the evaluation of postural instability. Despite the…

  9. Motor imagery learning modulates functional connectivity of multiple brain systems in resting state.

    PubMed

    Zhang, Hang; Long, Zhiying; Ge, Ruiyang; Xu, Lele; Jin, Zhen; Yao, Li; Liu, Yijun

    2014-01-01

    Learning motor skills involves subsequent modulation of resting-state functional connectivity in the sensory-motor system. This idea was mostly derived from the investigations on motor execution learning which mainly recruits the processing of sensory-motor information. Behavioral evidence demonstrates that motor skills used in daily life can be learned through imagery procedures. However, it remains unclear whether the modulation of resting-state functional connectivity also exists in the sensory-motor system after motor imagery learning. We performed an fMRI investigation of motor imagery learning in the resting state. Based on previous studies, we identified eight sensory and cognitive resting-state networks (RSNs) corresponding to the brain systems and further explored the functional connectivity of these RSNs, assessing connectivity and network strengths before and after two weeks of consecutive learning. Two intriguing results were revealed: (1) The sensory RSNs, specifically sensory-motor and lateral visual networks exhibited greater connectivity strengths in precuneus and fusiform gyrus after learning; (2) Decreased network strength induced by learning was proved in the default mode network, a cognitive RSN. These results indicated that resting-state functional connectivity could be modulated by motor imagery learning in multiple brain systems, and such modulation displayed in the sensory-motor, visual and default brain systems may be associated with the establishment of motor schema and the regulation of introspective thought. These findings further revealed the neural substrates underlying motor skill learning and potentially provided new insights into the therapeutic benefits of motor imagery learning.

  10. Multiple Sensory-Motor Pathways Lead to Coordinated Visual Attention

    PubMed Central

    Yu, Chen; Smith, Linda B.

    2016-01-01

    Joint attention has been extensively studied in the developmental literature because of overwhelming evidence that the ability to socially coordinate visual attention to an object is essential to healthy developmental outcomes, including language learning. The goal of the present study is to understand the complex system of sensory-motor behaviors that may underlie the establishment of joint attention between parents and toddlers. In an experimental task, parents and toddlers played together with multiple toys. We objectively measured joint attention – and the sensory-motor behaviors that underlie it – using a dual head-mounted eye-tracking system and frame-by-frame coding of manual actions. By tracking the momentary visual fixations and hand actions of each participant, we precisely determined just how often they fixated on the same object at the same time, the visual behaviors that preceded joint attention, and manual behaviors that preceded and co-occurred with joint attention. We found that multiple sequential sensory-motor patterns lead to joint attention. In addition, there are developmental changes in this multi-pathway system evidenced as variations in strength among multiple routes. We propose that coordinated visual attention between parents and toddlers is primarily a sensory-motor behavior. Skill in achieving coordinated visual attention in social settings – like skills in other sensory-motor domains – emerges from multiple pathways to the same functional end. PMID:27016038

  11. Multiple Sensory-Motor Pathways Lead to Coordinated Visual Attention.

    PubMed

    Yu, Chen; Smith, Linda B

    2017-02-01

    Joint attention has been extensively studied in the developmental literature because of overwhelming evidence that the ability to socially coordinate visual attention to an object is essential to healthy developmental outcomes, including language learning. The goal of this study was to understand the complex system of sensory-motor behaviors that may underlie the establishment of joint attention between parents and toddlers. In an experimental task, parents and toddlers played together with multiple toys. We objectively measured joint attention, and the sensory-motor behaviors that underlie it, using a dual head-mounted eye-tracking system and frame-by-frame coding of manual actions. By tracking the momentary visual fixations and hand actions of each participant, we precisely determined just how often they fixated on the same object at the same time, the visual behaviors that preceded joint attention, and the manual behaviors that preceded and co-occurred with joint attention. We found that multiple sequential sensory-motor patterns lead to joint attention. In addition, there are developmental changes in this multi-pathway system, evidenced as variations in strength among multiple routes. We propose that coordinated visual attention between parents and toddlers is primarily a sensory-motor behavior. Skill in achieving coordinated visual attention in social settings, like skills in other sensory-motor domains, emerges from multiple pathways to the same functional end. Copyright © 2016 Cognitive Science Society, Inc.
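The study's central measure, both participants fixating the same object at the same time, can be sketched computationally. A minimal illustration, assuming each participant's gaze is coded as (start, end, object) fixation intervals and using a hypothetical 0.5 s minimum-overlap threshold (the interval representation and threshold are illustrative assumptions, not the study's actual coding criterion):

```python
def joint_attention_episodes(parent_fix, child_fix, min_overlap=0.5):
    """Find intervals where parent and child fixate the same object.

    Each fixation is a (start_s, end_s, object_id) tuple; min_overlap
    is the minimum shared duration (in seconds) required to count an
    overlap as a joint-attention episode.
    """
    episodes = []
    for p_start, p_end, p_obj in parent_fix:
        for c_start, c_end, c_obj in child_fix:
            if p_obj != c_obj:
                continue  # different objects: no joint attention
            start, end = max(p_start, c_start), min(p_end, c_end)
            if end - start >= min_overlap:
                episodes.append((start, end, p_obj))
    return sorted(episodes)
```

Discarding brief same-object overlaps mirrors the common practice of requiring sustained looking before counting an episode as joint attention rather than an incidental gaze crossing.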

  12. Does visual attention drive the dynamics of bistable perception?

    PubMed Central

    Dieter, Kevin C.; Brascamp, Jan; Tadin, Duje; Blake, Randolph

    2016-01-01

    How does attention interact with incoming sensory information to determine what we perceive? One domain in which this question has received serious consideration is that of bistable perception: a captivating class of phenomena that involves fluctuating visual experience in the face of physically unchanging sensory input. Here, some investigations have yielded support for the idea that attention alone determines what is seen, while others have implicated entirely attention-independent processes in driving alternations during bistable perception. We review the body of literature addressing this divide and conclude that in fact both sides are correct – depending on the form of bistable perception being considered. Converging evidence suggests that visual attention is required for alternations in the type of bistable perception called binocular rivalry, while alternations during other types of bistable perception appear to continue without requiring attention. We discuss some implications of this differential effect of attention for our understanding of the mechanisms underlying bistable perception, and examine how these mechanisms operate during our everyday visual experiences. PMID:27230785

  13. Does visual attention drive the dynamics of bistable perception?

    PubMed

    Dieter, Kevin C; Brascamp, Jan; Tadin, Duje; Blake, Randolph

    2016-10-01

    How does attention interact with incoming sensory information to determine what we perceive? One domain in which this question has received serious consideration is that of bistable perception: a captivating class of phenomena that involves fluctuating visual experience in the face of physically unchanging sensory input. Here, some investigations have yielded support for the idea that attention alone determines what is seen, while others have implicated entirely attention-independent processes in driving alternations during bistable perception. We review the body of literature addressing this divide and conclude that in fact both sides are correct, depending on the form of bistable perception being considered. Converging evidence suggests that visual attention is required for alternations in the type of bistable perception called binocular rivalry, while alternations during other types of bistable perception appear to continue without requiring attention. We discuss some implications of this differential effect of attention for our understanding of the mechanisms underlying bistable perception, and examine how these mechanisms operate during our everyday visual experiences.

  14. Into the black and back: the ecology of brain investment in Neotropical army ants (Formicidae: Dorylinae)

    NASA Astrophysics Data System (ADS)

    Bulova, S.; Purce, K.; Khodak, P.; Sulger, E.; O'Donnell, S.

    2016-04-01

    Shifts to new ecological settings can drive evolutionary changes in animal sensory systems and in the brain structures that process sensory information. We took advantage of the diverse habitat ecology of Neotropical army ants to test whether evolutionary transitions from below- to above-ground activity were associated with changes in brain structure. Our estimates of genus-typical frequencies of above-ground activity suggested a high degree of evolutionary plasticity in habitat use among Neotropical army ants. Brain structure consistently corresponded to degree of above-ground activity among genera and among species within genera. The most above-ground genera (and species) invested relatively more in visual processing brain tissues; the most subterranean species invested relatively less in central processing higher-brain centers (mushroom body calyces). These patterns suggest a strong role of sensory ecology (e.g., light levels) in selecting for army ant brain investment evolution and further suggest that the subterranean environment poses reduced cognitive challenges to workers. The highly above-ground active genus Eciton was exceptional in having relatively large brains and particularly large and structurally complex optic lobes. These patterns suggest that the transition to above-ground activity from ancestors that were largely subterranean for approximately 60 million years was followed by re-emergence of enhanced visual function in workers.

  15. Contextual effects on motion perception and smooth pursuit eye movements.

    PubMed

    Spering, Miriam; Gegenfurtner, Karl R

    2008-08-15

    Smooth pursuit eye movements are continuous, slow rotations of the eyes that allow us to follow the motion of a visual object of interest. These movements are closely related to sensory inputs from the visual motion processing system. To track a moving object in the natural environment, its motion first has to be segregated from the motion signals provided by surrounding stimuli. Here, we review experiments on the effect of the visual context on motion processing with a focus on the relationship between motion perception and smooth pursuit eye movements. While perception and pursuit are closely linked, we show that they can behave quite distinctly when required by the visual context.

  16. Parallel Processing Strategies of the Primate Visual System

    PubMed Central

    Nassi, Jonathan J.; Callaway, Edward M.

    2009-01-01

    Incoming sensory information is sent to the brain along modality-specific channels corresponding to the five senses. Each of these channels further parses the incoming signals into parallel streams to provide a compact, efficient input to the brain. Ultimately, these parallel input signals must be elaborated upon and integrated within the cortex to provide a unified and coherent percept. Recent studies in the primate visual cortex have greatly contributed to our understanding of how this goal is accomplished. Multiple strategies including retinal tiling, hierarchical and parallel processing and modularity, defined spatially and by cell type-specific connectivity, are all used by the visual system to recover the rich detail of our visual surroundings. PMID:19352403

  17. 3D hierarchical spatial representation and memory of multimodal sensory data

    NASA Astrophysics Data System (ADS)

    Khosla, Deepak; Dow, Paul A.; Huber, David J.

    2009-04-01

    This paper describes an efficient method and system for representing, processing, and understanding multi-modal sensory data. More specifically, it describes a computational method and system for processing and remembering multiple locations in multimodal sensory space (e.g., visual, auditory, somatosensory, etc.). The multimodal representation and memory is based on a biologically inspired hierarchy of spatial representations implemented with novel analogues of real representations used in the human brain. The novelty of the work is in the computationally efficient and robust spatial representation of 3D locations in multimodal sensory space, as well as an associated working memory for storage and recall of these representations at the desired level for goal-oriented action. We describe: (1) a simple and efficient method for human-like hierarchical spatial representations of sensory data and how to associate, integrate, and convert between these representations (head-centered coordinate system, body-centered coordinate system, etc.); (2) a robust method for training and learning a mapping of points in multimodal sensory space (e.g., camera-visible object positions, locations of auditory sources, etc.) to the above hierarchical spatial representations; and (3) a specification and implementation of a hierarchical spatial working memory based on the above for storage and recall at the desired level for goal-oriented action(s). This work is most useful for any machine or human-machine application that requires processing multimodal sensory inputs, making sense of them from a spatial perspective (e.g., where the sensory information is coming from with respect to the machine and its parts), and then taking some goal-oriented action based on this spatial understanding. A multi-level spatial representation hierarchy means that heterogeneous sensory inputs (e.g., visual, auditory, somatosensory, etc.) can map onto the hierarchy at different levels. When controlling various machine/robot degrees of freedom, the desired movements and actions can be computed from these different levels in the hierarchy. The most basic embodiment of this machine could be a pan-tilt camera system, an array of microphones, a machine with an arm/hand-like structure, or a robot with some or all of the above capabilities. We describe the approach and system and present preliminary results on a real robotic platform.
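Converting between the hierarchy's reference frames (head-centered to body-centered, and so on) is, at its core, a rigid transform: a rotation followed by a translation. A minimal sketch, assuming the head is rotated about the body's vertical axis by a known yaw and displaced by a known offset (the function name and parameters are illustrative, not the paper's actual interface):

```python
import numpy as np

def head_to_body(p_head, head_yaw_rad, head_offset):
    """Map a 3D point from head-centered to body-centered coordinates:
    rotate by the head's yaw about the body z-axis, then translate by
    the head's offset from the body origin."""
    c, s = np.cos(head_yaw_rad), np.sin(head_yaw_rad)
    rot_z = np.array([[c, -s, 0.0],
                      [s,  c, 0.0],
                      [0.0, 0.0, 1.0]])
    return rot_z @ np.asarray(p_head, dtype=float) + np.asarray(head_offset, dtype=float)
```

Chaining such transforms (eye-to-head, head-to-body, body-to-world) yields a multi-level hierarchy in which each sensory input enters at the frame where it is measured and is converted upward only as far as the task requires.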

  18. Neural mechanisms of human perceptual choice under focused and divided attention.

    PubMed

    Wyart, Valentin; Myers, Nicholas E; Summerfield, Christopher

    2015-02-25

    Perceptual decisions occur after the evaluation and integration of momentary sensory inputs, and dividing attention between spatially disparate sources of information impairs decision performance. However, it remains unknown whether dividing attention degrades the precision of sensory signals, precludes their conversion into decision signals, or dampens the integration of decision information toward an appropriate response. Here we recorded human electroencephalographic (EEG) activity while participants categorized one of two simultaneous and independent streams of visual gratings according to their average tilt. By analyzing trial-by-trial correlations between EEG activity and the information offered by each sample, we obtained converging behavioral and neural evidence that dividing attention between left and right visual fields does not dampen the encoding of sensory or decision information. Under divided attention, momentary decision information from both visual streams was encoded in slow parietal signals without interference but was lost downstream during their integration as reflected in motor mu- and beta-band (10-30 Hz) signals, resulting in a "leaky" accumulation process that conferred greater behavioral influence to more recent samples. By contrast, sensory inputs that were explicitly cued as irrelevant were not converted into decision signals. These findings reveal that a late cognitive bottleneck on information integration limits decision performance under divided attention, and places new capacity constraints on decision-theoretic models of information integration under cognitive load. Copyright © 2015 the authors 0270-6474/15/353485-14$15.00/0.

  19. Neural mechanisms of human perceptual choice under focused and divided attention

    PubMed Central

    Wyart, Valentin; Myers, Nicholas E.; Summerfield, Christopher

    2015-01-01

    Perceptual decisions occur after evaluation and integration of momentary sensory inputs, and dividing attention between spatially disparate sources of information impairs decision performance. However, it remains unknown whether dividing attention degrades the precision of sensory signals, precludes their conversion into decision signals, or dampens the integration of decision information towards an appropriate response. Here we recorded human electroencephalographic (EEG) activity whilst participants categorised one of two simultaneous and independent streams of visual gratings according to their average tilt. By analyzing trial-by-trial correlations between EEG activity and the information offered by each sample, we obtained converging behavioural and neural evidence that dividing attention between left and right visual fields does not dampen the encoding of sensory or decision information. Under divided attention, momentary decision information from both visual streams was encoded in slow parietal signals without interference but was lost downstream during their integration as reflected in motor mu- and beta-band (10–30 Hz) signals, resulting in a ‘leaky’ accumulation process which conferred greater behavioural influence to more recent samples. By contrast, sensory inputs that were explicitly cued as irrelevant were not converted into decision signals. These findings reveal that a late cognitive bottleneck on information integration limits decision performance under divided attention, and place new capacity constraints on decision-theoretic models of information integration under cognitive load. PMID:25716848
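The "leaky" accumulation the authors infer, integration that loses early evidence and therefore weights recent samples more heavily, can be written as a one-line recurrence. A minimal sketch (the leak value is an illustrative assumption, not a parameter fitted in the study):

```python
def leaky_accumulate(samples, leak=0.8):
    """Accumulate evidence samples into a decision variable that
    decays by a factor `leak` at each step: leak=1.0 gives perfect
    (lossless) integration, while leak<1.0 discounts older samples,
    producing a recency effect."""
    x = 0.0
    for s in samples:
        x = leak * x + s
    return x
```

With leak=0.5, the first of two unit samples contributes only half as much as the second, which is the behavioral signature of recency weighting reported under divided attention.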

  20. Working memory capacity and visual-verbal cognitive load modulate auditory-sensory gating in the brainstem: toward a unified view of attention.

    PubMed

    Sörqvist, Patrik; Stenfelt, Stefan; Rönnberg, Jerker

    2012-11-01

    Two fundamental research questions have driven attention research in the past: One concerns whether selection of relevant information among competing, irrelevant, information takes place at an early or at a late processing stage; the other concerns whether the capacity of attention is limited by a central, domain-general pool of resources or by independent, modality-specific pools. In this article, we contribute to these debates by showing that the auditory-evoked brainstem response (an early stage of auditory processing) to task-irrelevant sound decreases as a function of central working memory load (manipulated with a visual-verbal version of the n-back task). Furthermore, individual differences in central/domain-general working memory capacity modulated the magnitude of the auditory-evoked brainstem response, but only in the high working memory load condition. The results support a unified view of attention whereby the capacity of a late/central mechanism (working memory) modulates early precortical sensory processing.

  1. Mouse V1 population correlates of visual detection rely on heterogeneity within neuronal response patterns

    PubMed Central

    Montijn, Jorrit S; Goltstein, Pieter M; Pennartz, Cyriel MA

    2015-01-01

    Previous studies have demonstrated the importance of the primary sensory cortex for the detection, discrimination, and awareness of visual stimuli, but it is unknown how neuronal populations in this area process detected and undetected stimuli differently. Critical differences may reside in the mean strength of responses to visual stimuli, as reflected in bulk signals detectable in functional magnetic resonance imaging, electroencephalogram, or magnetoencephalography studies, or may be more subtly composed of differentiated activity of individual sensory neurons. Quantifying single-cell Ca2+ responses to visual stimuli recorded with in vivo two-photon imaging, we found that visual detection correlates more strongly with population response heterogeneity than with overall response strength. Moreover, neuronal populations showed consistencies in activation patterns across temporally spaced trials in association with hit responses, but not during nondetections. Contrary to models relying on temporally stable networks or bulk signaling, these results suggest that detection depends on transient differentiation in neuronal activity within cortical populations. DOI: http://dx.doi.org/10.7554/eLife.10163.001 PMID:26646184

  2. EMG-based visual-haptic biofeedback: a tool to improve motor control in children with primary dystonia.

    PubMed

    Casellato, Claudia; Pedrocchi, Alessandra; Zorzi, Giovanna; Vernisse, Lea; Ferrigno, Giancarlo; Nardocci, Nardo

    2013-05-01

    New insights suggest that dystonic motor impairments may also involve a deficit of sensory processing. In this framework, biofeedback, which makes covert physiological processes more overt, could be useful. The present work proposes an innovative integrated setup that provides the user with electromyogram (EMG)-based visual-haptic biofeedback during upper limb movements (spiral tracking tasks), to test whether augmented sensory feedback can induce motor control improvement in patients with primary dystonia. The ad hoc real-time control algorithm synchronizes the haptic loop with the EMG reading; brachioradialis EMG values were used to modify visual and haptic features of the interface: the higher the EMG level, the higher the virtual table friction, and the background color shifted proportionally from green to red. From recordings of dystonic and healthy subjects, statistical results showed that biofeedback has a significant impact, correlated with the local impairment, on dystonic muscular control. These tests demonstrated the effectiveness of biofeedback paradigms in gaining better voluntary control of specific muscles. The flexible tool developed here shows promising prospects for clinical applications and sensorimotor rehabilitation.
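The EMG-to-feedback mapping described above (more muscle activity produces more virtual friction and a redder background) can be sketched as a simple transfer function. A minimal illustration, assuming a linear mapping over a calibrated EMG range; the study does not publish its exact transfer functions, so the ranges and linearity here are assumptions:

```python
def emg_to_feedback(emg, emg_min, emg_max, friction_max=1.0):
    """Map a smoothed EMG envelope value to haptic and visual feedback.

    Returns (friction, (r, g, b)): friction grows linearly with the
    normalized EMG level, and the background color interpolates from
    green (low activity) to red (high activity).
    """
    level = (emg - emg_min) / (emg_max - emg_min)
    level = min(max(level, 0.0), 1.0)   # clamp to [0, 1]
    friction = friction_max * level
    color = (level, 1.0 - level, 0.0)   # green -> red
    return friction, color
```

In a real-time loop this function would be called once per haptic frame on the rectified, low-pass-filtered EMG signal, so that the user feels and sees their muscle activity level continuously.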

  3. Macroscopic brain dynamics during verbal and pictorial processing of affective stimuli.

    PubMed

    Keil, Andreas

    2006-01-01

    Emotions can be viewed as action dispositions, preparing an individual to act efficiently and successfully in situations of behavioral relevance. To initiate optimized behavior, it is essential to accurately process the perceptual elements indicative of emotional relevance. The present chapter discusses effects of affective content on neural and behavioral parameters of perception, across different information channels. Electrocortical data are presented from studies examining affective perception with pictures and words in different task contexts. As a main result, these data suggest that sensory facilitation has an important role in affective processing. Affective pictures appear to facilitate perception as a function of emotional arousal at multiple levels of visual analysis. If the discrimination between affectively arousing vs. nonarousing content relies on fine-grained differences, amplification of the cortical representation may occur as early as 60-90 ms after stimulus onset. Affectively arousing information as conveyed via visual verbal channels was not subject to such very early enhancement. However, electrocortical indices of lexical access and/or activation of semantic networks showed that affectively arousing content may enhance the formation of semantic representations during word encoding. It can be concluded that affective arousal is associated with activation of widespread networks, which act to optimize sensory processing. On the basis of prioritized sensory analysis for affectively relevant stimuli, subsequent steps such as working memory, motor preparation, and action may be adjusted to meet the adaptive requirements of the situation perceived.

  4. A common perceptual temporal limit of binding synchronous inputs across different sensory attributes and modalities

    PubMed Central

    Fujisaki, Waka; Nishida, Shin'ya

    2010-01-01

    The human brain processes different aspects of the surrounding environment through multiple sensory modalities, and each modality can be subdivided into multiple attribute-specific channels. When the brain rebinds sensory content information (‘what’) across different channels, temporal coincidence (‘when’) along with spatial coincidence (‘where’) provides a critical clue. It however remains unknown whether neural mechanisms for binding synchronous attributes are specific to each attribute combination, or universal and central. In human psychophysical experiments, we examined how combinations of visual, auditory and tactile attributes affect the temporal frequency limit of synchrony-based binding. The results indicated that the upper limits of cross-attribute binding were lower than those of within-attribute binding, and surprisingly similar for any combination of visual, auditory and tactile attributes (2–3 Hz). They are unlikely to be the limits for judging synchrony, since the temporal limit of a cross-attribute synchrony judgement was higher and varied with the modality combination (4–9 Hz). These findings suggest that cross-attribute temporal binding is mediated by a slow central process that combines separately processed ‘what’ and ‘when’ properties of a single event. While the synchrony performance reflects temporal bottlenecks existing in ‘when’ processing, the binding performance reflects the central temporal limit of integrating ‘when’ and ‘what’ properties. PMID:20335212

  5. Neural Dynamics Underlying Target Detection in the Human Brain

    PubMed Central

    Bansal, Arjun K.; Madhavan, Radhika; Agam, Yigal; Golby, Alexandra; Madsen, Joseph R.

    2014-01-01

    Sensory signals must be interpreted in the context of goals and tasks. To detect a target in an image, the brain compares input signals and goals to elicit the correct behavior. We examined how target detection modulates visual recognition signals by recording intracranial field potential responses from 776 electrodes in 10 epileptic human subjects. We observed reliable differences in the physiological responses to stimuli when a cued target was present versus absent. Goal-related modulation was particularly strong in the inferior temporal and fusiform gyri, two areas important for object recognition. Target modulation started more than 250 ms after stimulus onset, considerably after the onset of visual recognition signals. While broadband signals exhibited increased or decreased power, gamma frequency power showed predominantly increases during target presence. These observations support models where task goals interact with sensory inputs via top-down signals that influence the highest echelons of visual processing after the onset of selective responses. PMID:24553944

  6. Stimulus relevance modulates contrast adaptation in visual cortex

    PubMed Central

    Keller, Andreas J; Houlton, Rachael; Kampa, Björn M; Lesica, Nicholas A; Mrsic-Flogel, Thomas D; Keller, Georg B; Helmchen, Fritjof

    2017-01-01

    A general principle of sensory processing is that neurons adapt to sustained stimuli by reducing their response over time. Most of our knowledge on adaptation in single cells is based on experiments in anesthetized animals. How responses adapt in awake animals, when stimuli may be behaviorally relevant or not, remains unclear. Here we show that contrast adaptation in mouse primary visual cortex depends on the behavioral relevance of the stimulus. Cells that adapted to contrast under anesthesia maintained or even increased their activity in awake naïve mice. When engaged in a visually guided task, contrast adaptation re-occurred for stimuli that were irrelevant for solving the task. However, contrast adaptation was reversed when stimuli acquired behavioral relevance. Regulation of cortical adaptation by task demand may allow dynamic control of sensory-evoked signal flow in the neocortex. DOI: http://dx.doi.org/10.7554/eLife.21589.001 PMID:28130922

  7. Audition and vision share spatial attentional resources, yet attentional load does not disrupt audiovisual integration.

    PubMed

    Wahn, Basil; König, Peter

    2015-01-01

    Humans continuously receive and integrate information from several sensory modalities. However, attentional resources limit the amount of information that can be processed. It is not yet clear how attentional resources and multisensory processing are interrelated. Specifically, the following questions arise: (1) Are there distinct spatial attentional resources for each sensory modality? and (2) Does attentional load affect multisensory integration? We investigated these questions using a dual task paradigm: participants performed two spatial tasks (a multiple object tracking task and a localization task), either separately (single task condition) or simultaneously (dual task condition). In the multiple object tracking task, participants visually tracked a small subset of several randomly moving objects. In the localization task, participants received either visual, auditory, or redundant visual and auditory location cues. In the dual task condition, we found a substantial decrease in participants' performance relative to the results of the single task condition. Importantly, participants performed equally well in the dual task condition regardless of the location cues' modality. This result suggests that having spatial information coming from different modalities does not facilitate performance, thereby indicating shared spatial attentional resources for the auditory and visual modalities. Furthermore, we found that participants integrated redundant multisensory information similarly even when they experienced additional attentional load in the dual task condition. Overall, these findings suggest that (1) visual and auditory spatial attentional resources are shared and that (2) audiovisual integration of spatial information occurs in a pre-attentive processing stage.

  8. Processing time of addition or withdrawal of single or combined balance-stabilizing haptic and visual information

    PubMed Central

    Honeine, Jean-Louis; Crisafulli, Oscar; Sozzi, Stefania

    2015-01-01

    We investigated the integration time of haptic and visual input and their interaction during stance stabilization. Eleven subjects performed four tandem-stance conditions (60 trials each). Vision, touch, and both vision and touch were added and withdrawn. Furthermore, vision was replaced with touch and vice versa. Body sway, tibialis anterior, and peroneus longus activity were measured. Following addition or withdrawal of vision or touch, an integration time period elapsed before the earliest changes in sway were observed. Thereafter, sway varied exponentially to a new steady-state while reweighting occurred. Latencies of sway changes on sensory addition ranged from 0.6 to 1.5 s across subjects, consistently longer for touch than vision, and were regularly preceded by changes in muscle activity. Addition of vision and touch simultaneously shortened the latencies with respect to vision or touch separately, suggesting cooperation between sensory modalities. Latencies following withdrawal of vision or touch or both simultaneously were shorter than following addition. When vision was replaced with touch or vice versa, adding one modality did not interfere with the effect of withdrawal of the other, suggesting that integration of withdrawal and addition were performed in parallel. The time course of the reweighting process to reach the new steady-state was also shorter on withdrawal than addition. The effects of different sensory inputs on posture stabilization illustrate the operation of a time-consuming, possibly supraspinal process that integrates and fuses modalities for accurate balance control. This study also shows the facilitatory interaction of visual and haptic inputs in integration and reweighting of stance-stabilizing inputs. PMID:26334013

  9. The origins of metamodality in visual object area LO: Bodily topographical biases and increased functional connectivity to S1

    PubMed Central

    Tal, Zohar; Geva, Ran; Amedi, Amir

    2016-01-01

    Recent evidence from blind participants suggests that visual areas are task-oriented and sensory modality input independent rather than sensory-specific to vision. Specifically, visual areas are thought to retain their functional selectivity when using non-visual inputs (touch or sound) even without having any visual experience. However, this theory is still controversial since it is not clear whether this also characterizes the sighted brain, and whether the reported results in the sighted reflect basic fundamental a-modal processes or are an epiphenomenon to a large extent. In the current study, we addressed these questions using a series of fMRI experiments aimed to explore visual cortex responses to passive touch on various body parts and the coupling between the parietal and visual cortices as manifested by functional connectivity. We show that passive touch robustly activated the object selective parts of the lateral occipital (LO) cortex while deactivating almost all other occipital retinotopic areas. Furthermore, passive touch responses in the visual cortex were specific to hand and upper trunk stimulations. Psychophysiological interaction (PPI) analysis suggests that LO is functionally connected to the hand area in the primary somatosensory homunculus (S1), during hand and shoulder stimulations but not to any of the other body parts. We suggest that LO is a fundamental hub that serves as a node between visual-object selective areas and S1 hand representation, probably due to the critical evolutionary role of touch in object recognition and manipulation. These results might also point to a more general principle suggesting that recruitment or deactivation of the visual cortex by other sensory input depends on the ecological relevance of the information conveyed by this input to the task/computations carried out by each area or network. This is likely to rely on the unique and differential pattern of connectivity for each visual area with the rest of the brain. PMID:26673114

  10. Evaluation of Sensory Skills among Students with Visual Impairment

    ERIC Educational Resources Information Center

    Saleem, Suhib Saleem; Al-Salahat, Mohammad Mousa

    2016-01-01

    The purpose of the study was to evaluate sensory skills among students with visual impairment (SVI). The sample consisted of 30 students with blindness or low vision enrolled in mainstreaming programs at general education schools in Najran, Kingdom of Saudi Arabia. A sensory skills scale was developed. The scale consisted of 20 items was…

  11. Enhanced dimension-specific visual working memory in grapheme-color synesthesia.

    PubMed

    Terhune, Devin Blair; Wudarczyk, Olga Anna; Kochuparampil, Priya; Cohen Kadosh, Roi

    2013-10-01

    There is emerging evidence that the encoding of visual information and the maintenance of this information in a temporarily accessible state in working memory rely on the same neural mechanisms. A consequence of this overlap is that atypical forms of perception should influence working memory. We examined this by investigating whether having grapheme-color synesthesia, a condition characterized by the involuntary experience of color photisms when reading or representing graphemes, would confer benefits on working memory. Two competing hypotheses propose that superior memory in synesthesia results from information being coded in two information channels (dual-coding) or from superior dimension-specific visual processing (enhanced processing). We discriminated between these hypotheses in three n-back experiments in which controls and synesthetes viewed inducer and non-inducer graphemes and maintained color or grapheme information in working memory. Synesthetes displayed better color working memory than controls for both grapheme types, whereas the two groups did not differ in grapheme working memory. Further analyses excluded the possibilities of enhanced working memory among synesthetes being due to greater color discrimination, stimulus color familiarity, or bidirectionality. These results reveal enhanced dimension-specific visual working memory in this population and supply further evidence for a close relationship between sensory processing and the maintenance of sensory information in working memory. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.

  12. Sensory Substitution: The Spatial Updating of Auditory Scenes "Mimics" the Spatial Updating of Visual Scenes.

    PubMed

    Pasqualotto, Achille; Esenkaya, Tayfun

    2016-01-01

    Visual-to-auditory sensory substitution is used to convey visual information through audition, and it was initially created to compensate for blindness; it consists of software converting the visual images captured by a video-camera into the equivalent auditory images, or "soundscapes". Here, it was used by blindfolded sighted participants to learn the spatial position of simple shapes depicted in images arranged on the floor. Very few studies have used sensory substitution to investigate spatial representation, whereas it has been widely used to investigate object recognition. Additionally, with sensory substitution we could study the performance of participants actively exploring the environment through audition, rather than passively localizing sound sources. Blindfolded participants egocentrically learnt the position of six images by using sensory substitution, and then a judgment of relative direction (JRD) task was used to determine how this scene was represented. This task consists of imagining being in a given location, oriented in a given direction, and pointing towards the required image. Before performing the JRD task, participants explored a map that provided allocentric information about the scene. Although spatial exploration was egocentric, surprisingly we found that performance in the JRD task was better for allocentric perspectives. This suggests that the egocentric representation of the scene was updated. This result is in line with previous studies using visual and somatosensory scenes, thus supporting the notion that different sensory modalities produce equivalent spatial representation(s). Moreover, our results have practical implications for improving training methods with sensory substitution devices (SSD).

  13. Human Computation in Visualization: Using Purpose Driven Games for Robust Evaluation of Visualization Algorithms.

    PubMed

    Ahmed, N; Zheng, Ziyi; Mueller, K

    2012-12-01

    Due to the inherent characteristics of the visualization process, most of the problems in this field have strong ties with human cognition and perception. This makes the human brain and sensory system the only truly appropriate platform for evaluating and fine-tuning a new visualization method or paradigm. However, getting humans to volunteer for these purposes has always been a significant obstacle, and thus this phase of the development process has traditionally formed a bottleneck, slowing down progress in visualization research. We propose to take advantage of the newly emerging field of Human Computation (HC) to overcome these challenges. HC promotes the idea that rather than considering humans as users of the computational system, they can be made part of a hybrid computational loop consisting of traditional computation resources and the human brain and sensory system. This approach is particularly successful in cases where part of the computational problem is considered intractable using known computer algorithms but is trivial to common-sense human knowledge. In this paper, we focus on HC from the perspective of solving visualization problems, and we also outline a framework by which humans can be easily enticed to volunteer their HC resources. We introduce a purpose-driven game titled "Disguise" which serves as a prototypical example of how the evaluation of visualization algorithms can be mapped into a fun and addictive activity, allowing this task to be accomplished in an extensive yet cost-effective way. Finally, we sketch out a framework that goes beyond the pure evaluation of existing visualization methods to the design of new ones.

  14. Avian visual behavior and the organization of the telencephalon.

    PubMed

    Shimizu, Toru; Patton, Tadd B; Husband, Scott A

    2010-01-01

    Birds have excellent visual abilities that are comparable or superior to those of primates, but how the bird brain solves complex visual problems is poorly understood. More specifically, we lack knowledge about how such superb abilities are used in nature and how the brain, especially the telencephalon, is organized to process visual information. Here we review the results of several studies that examine the organization of the avian telencephalon and the relevance of visual abilities to avian social and reproductive behavior. Video playback and photographic stimuli show that birds can detect and evaluate subtle differences in local facial features of potential mates in a fashion similar to that of primates. These techniques have also revealed that birds do not attend well to global configural changes in the face, suggesting a fundamental difference between birds and primates in face perception. The telencephalon plays a major role in the visual and visuo-cognitive abilities of birds and primates, and anatomical data suggest that these animals may share similar organizational characteristics in the visual telencephalon. As is true in the primate cerebral cortex, different visual features are processed separately in the avian telencephalon where separate channels are organized in the anterior-posterior axis roughly parallel to the major laminae. Furthermore, the efferent projections from the primary visual telencephalon form an extensive column-like continuum involving the dorsolateral pallium and the lateral basal ganglia. Such a column-like organization may exist not only for vision, but for other sensory modalities and even for a continuum that links sensory and limbic areas of the avian brain. Behavioral and neural studies must be integrated in order to understand how birds have developed their amazing visual systems through 150 million years of evolution. 2010 S. Karger AG, Basel.

  15. Avian Visual Behavior and the Organization of the Telencephalon

    PubMed Central

    Shimizu, Toru; Patton, Tadd B.; Husband, Scott A.

    2010-01-01

    Birds have excellent visual abilities that are comparable or superior to those of primates, but how the bird brain solves complex visual problems is poorly understood. More specifically, we lack knowledge about how such superb abilities are used in nature and how the brain, especially the telencephalon, is organized to process visual information. Here we review the results of several studies that examine the organization of the avian telencephalon and the relevance of visual abilities to avian social and reproductive behavior. Video playback and photographic stimuli show that birds can detect and evaluate subtle differences in local facial features of potential mates in a fashion similar to that of primates. These techniques have also revealed that birds do not attend well to global configural changes in the face, suggesting a fundamental difference between birds and primates in face perception. The telencephalon plays a major role in the visual and visuo-cognitive abilities of birds and primates, and anatomical data suggest that these animals may share similar organizational characteristics in the visual telencephalon. As is true in the primate cerebral cortex, different visual features are processed separately in the avian telencephalon where separate channels are organized in the anterior-posterior axis roughly parallel to the major laminae. Furthermore, the efferent projections from the primary visual telencephalon form an extensive column-like continuum involving the dorsolateral pallium and the lateral basal ganglia. Such a column-like organization may exist not only for vision, but for other sensory modalities and even for a continuum that links sensory and limbic areas of the avian brain. Behavioral and neural studies must be integrated in order to understand how birds have developed their amazing visual systems through 150 million years of evolution. PMID:20733296

  16. Training to Facilitate Adaptation to Novel Sensory Environments

    NASA Technical Reports Server (NTRS)

    Bloomberg, J. J.; Peters, B. T.; Mulavara, A. P.; Brady, R. A.; Batson, C. D.; Ploutz-Snyder, R. J.; Cohen, H. S.

    2010-01-01

    After spaceflight, the process of readapting to Earth's gravity causes locomotor dysfunction. We are developing a gait training countermeasure to facilitate adaptive responses in locomotor function. Our training system comprises a treadmill placed on a motion-base facing a virtual visual scene, providing an unstable walking surface combined with incongruent visual flow designed to train subjects to rapidly adapt their gait patterns to changes in the sensory environment. The goal of our present study was to determine whether training improved both locomotor and dual-tasking responses to a novel sensory environment and to quantify the retention of training. Subjects completed three 30-minute training sessions during which they walked on the treadmill while receiving discordant support-surface and visual input. Control subjects walked on the treadmill without any support-surface or visual alterations. To determine the efficacy of training, all subjects were then tested using a novel visual flow and support-surface movement not previously experienced during training. This test was performed 20 minutes, 1 week, and 1, 3, and 6 months after the final training session. Stride frequency and auditory reaction time were collected as measures of postural stability and cognitive effort, respectively. Subjects who received training showed less alteration in stride frequency and auditory reaction time compared to controls. Trained subjects maintained their level of performance over 6 months. We conclude that, with training, individuals became more proficient at walking in novel discordant sensorimotor conditions and were able to devote more attention to competing tasks.

  17. Fluctuation scaling in the visual cortex at threshold

    NASA Astrophysics Data System (ADS)

    Medina, José M.; Díaz, José A.

    2016-05-01

    Fluctuation scaling relates trial-to-trial variability to the average response by a power function in many physical processes. Here we address whether fluctuation scaling holds in sensory psychophysics and its functional role in visual processing. We report experimental evidence of fluctuation scaling in human color vision and form perception at threshold. Subjects detected thresholds in a psychophysical masking experiment that is considered a standard reference for studying suppression between neurons in the visual cortex. For all subjects, the analysis of threshold variability that results from the masking task indicates that fluctuation scaling is a global property that modulates detection thresholds with a scaling exponent that departs from 2, β = 2.48 ± 0.07. We also examine a generalized version of fluctuation scaling between the sample kurtosis K and the sample skewness S of threshold distributions. We find that K and S are related and follow a unique quadratic form K = (1.19 ± 0.04)S² + (2.68 ± 0.06) that departs from the expected 4/3 power-function regime. A random multiplicative process with weak additive noise is proposed based on a Langevin-type equation. The multiplicative process provides a unifying description of fluctuation scaling and the quadratic S-K relation and is related to on-off intermittency in sensory perception. Our findings provide an insight into how the human visual system interacts with the external environment. The theoretical methods open perspectives for investigating fluctuation scaling and intermittency effects in a wide variety of natural, economic, and cognitive phenomena.
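
    The fluctuation-scaling relation in this record (trial-to-trial variance growing as a power of the mean response) can be illustrated with a minimal simulation. The sketch below assumes a purely multiplicative lognormal response model with weak additive noise, in the spirit of the proposed Langevin-type process; all parameter values and variable names are illustrative, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 20 stimulus conditions with mean responses spanning two decades.
means = np.logspace(0, 2, 20)
sigma = 0.4          # multiplicative noise strength (illustrative)
additive_sd = 0.05   # weak additive noise (illustrative)

# Random multiplicative process: response = mean * lognormal factor + additive noise.
# The -sigma**2/2 term makes the lognormal factor have unit mean.
mult = np.exp(sigma * rng.standard_normal((20, 5000)) - sigma**2 / 2)
samples = means[:, None] * mult + additive_sd * rng.standard_normal((20, 5000))

# Fluctuation scaling: fit log(variance) against log(mean); the slope is beta.
m, v = samples.mean(axis=1), samples.var(axis=1)
beta, _ = np.polyfit(np.log(m), np.log(v), 1)
# For a purely multiplicative process beta is close to 2; the weak additive
# noise only perturbs the smallest means.
```

    A process dominated by additive noise would instead give a slope near 0, so the fitted exponent is a quick diagnostic of which noise source dominates.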

  18. Sharpened cortical tuning and enhanced cortico-cortical communication contribute to the long-term neural mechanisms of visual motion perceptual learning.

    PubMed

    Chen, Nihong; Bi, Taiyong; Zhou, Tiangang; Li, Sheng; Liu, Zili; Fang, Fang

    2015-07-15

    Much has been debated about whether the neural plasticity mediating perceptual learning takes place at the sensory or decision-making stage in the brain. To investigate this, we trained human subjects in a visual motion direction discrimination task. Behavioral performance and BOLD signals were measured before, immediately after, and two weeks after training. Parallel to subjects' long-lasting behavioral improvement, the neural selectivity in V3A and the effective connectivity from V3A to IPS (intraparietal sulcus, a motion decision-making area) exhibited a persistent increase for the trained direction. Moreover, the improvement was well explained by a linear combination of the selectivity and connectivity increases. These findings suggest that the long-term neural mechanisms of motion perceptual learning are implemented by sharpening cortical tuning to trained stimuli at the sensory processing stage, as well as by optimizing the connections between sensory and decision-making areas in the brain. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Test of the neurolinguistic programming hypothesis that eye-movements relate to processing imagery.

    PubMed

    Wertheim, E H; Habib, C; Cumming, G

    1986-04-01

    Bandler and Grinder's hypothesis that eye-movements reflect sensory processing was examined. 28 volunteers first memorized and then recalled visual, auditory, and kinesthetic stimuli. Changes in eye-positions during recall were videotaped and categorized by two raters into positions hypothesized by Bandler and Grinder's model to represent visual, auditory, and kinesthetic recall. Planned contrast analyses suggested that visual stimulus items, when recalled, elicited significantly more upward eye-positions and stares than auditory and kinesthetic items. Auditory and kinesthetic items, however, did not elicit more changes in eye-position hypothesized by the model to represent auditory and kinesthetic recall, respectively.

  20. Enhanced attentional gain as a mechanism for generalized perceptual learning in human visual cortex.

    PubMed

    Byers, Anna; Serences, John T

    2014-09-01

    Learning to better discriminate a specific visual feature (i.e., a specific orientation in a specific region of space) has been associated with plasticity in early visual areas (sensory modulation) and with improvements in the transmission of sensory information from early visual areas to downstream sensorimotor and decision regions (enhanced readout). However, in many real-world scenarios that require perceptual expertise, observers need to efficiently process numerous exemplars from a broad stimulus class as opposed to just a single stimulus feature. Some previous data suggest that perceptual learning leads to highly specific neural modulations that support the discrimination of specific trained features. However, the extent to which perceptual learning acts to improve the discriminability of a broad class of stimuli via the modulation of sensory responses in human visual cortex remains largely unknown. Here, we used functional MRI and a multivariate analysis method to reconstruct orientation-selective response profiles based on activation patterns in the early visual cortex before and after subjects learned to discriminate small offsets in a set of grating stimuli that were rendered in one of nine possible orientations. Behavioral performance improved across 10 training sessions, and there was a training-related increase in the amplitude of orientation-selective response profiles in V1, V2, and V3 when orientation was task relevant compared with when it was task irrelevant. These results suggest that generalized perceptual learning can lead to modified responses in the early visual cortex in a manner that is suitable for supporting improved discriminability of stimuli drawn from a large set of exemplars. Copyright © 2014 the American Physiological Society.
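
    The multivariate reconstruction of orientation-selective response profiles described in this record is commonly implemented as an inverted (forward) encoding model; the abstract does not name the exact method, so the following is a sketch under that assumption, with simulated data and illustrative choices (basis shape, channel count, noise level):

```python
import numpy as np

rng = np.random.default_rng(0)
n_chans, n_voxels, n_train, n_test = 9, 60, 360, 90
centers = np.arange(n_chans) * 20.0  # hypothetical channel centers: 0..160 deg

def channel_basis(oris):
    # Half-wave-rectified cosine basis raised to a power, 180-deg periodic
    # (a common choice for orientation encoding models; the power is illustrative).
    d = (oris[:, None] - centers[None, :]) * np.pi / 90.0
    return np.maximum(np.cos(d), 0.0) ** 8

# Simulated data: voxel responses are weighted sums of channel responses plus noise.
W_true = rng.standard_normal((n_chans, n_voxels))
train_oris = rng.integers(0, n_chans, n_train) * 20.0
test_oris = rng.integers(0, n_chans, n_test) * 20.0
B_train = channel_basis(train_oris) @ W_true + 0.3 * rng.standard_normal((n_train, n_voxels))
B_test = channel_basis(test_oris) @ W_true + 0.3 * rng.standard_normal((n_test, n_voxels))

# Step 1: estimate channel-to-voxel weights on training data (least squares).
W_hat = np.linalg.lstsq(channel_basis(train_oris), B_train, rcond=None)[0]

# Step 2: invert the model on held-out data to reconstruct channel response profiles.
C_hat = B_test @ W_hat.T @ np.linalg.inv(W_hat @ W_hat.T)

# Recenter each trial's profile on its presented orientation and average;
# the mean profile should peak at the centered channel.
shifts = n_chans // 2 - (test_oris / 20.0).astype(int)
profile = np.mean([np.roll(C_hat[t], shifts[t]) for t in range(n_test)], axis=0)
```

    In a training study such as this one, the quantity of interest would be the amplitude of the recentered profile (peak minus baseline) compared before and after training, separately for task-relevant and task-irrelevant conditions.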

  1. Compensatory Plasticity in the Deaf Brain: Effects on Perception of Music

    PubMed Central

    Good, Arla; Reed, Maureen J.; Russo, Frank A.

    2014-01-01

    When one sense is unavailable, sensory responsibilities shift and processing of the remaining modalities becomes enhanced to compensate for missing information. This shift, referred to as compensatory plasticity, results in a unique sensory experience for individuals who are deaf, including the manner in which music is perceived. This paper evaluates the neural, behavioural and cognitive evidence for compensatory plasticity following auditory deprivation and considers how this manifests in a unique experience of music that emphasizes visual and vibrotactile modalities. PMID:25354235

  2. Sight or Scent: Lemur Sensory Reliance in Detecting Food Quality Varies with Feeding Ecology

    PubMed Central

    Rushmore, Julie; Leonhardt, Sara D.; Drea, Christine M.

    2012-01-01

    Visual and olfactory cues provide important information to foragers, yet we know little about species differences in sensory reliance during food selection. In a series of experimental foraging studies, we examined the relative reliance on vision versus olfaction in three diurnal, primate species with diverse feeding ecologies, including folivorous Coquerel's sifakas (Propithecus coquereli), frugivorous ruffed lemurs (Varecia variegata spp), and generalist ring-tailed lemurs (Lemur catta). We used animals with known color-vision status and foods for which different maturation stages (and hence quality) produce distinct visual and olfactory cues (the latter determined chemically). We first showed that lemurs preferentially selected high-quality foods over low-quality foods when visual and olfactory cues were simultaneously available for both food types. Next, using a novel apparatus in a series of discrimination trials, we either manipulated food quality (while holding sensory cues constant) or manipulated sensory cues (while holding food quality constant). Among our study subjects that showed relatively strong preferences for high-quality foods, folivores required both sensory cues combined to reliably identify their preferred foods, whereas generalists could identify their preferred foods using either cue alone, and frugivores could identify their preferred foods using olfactory, but not visual, cues alone. Moreover, when only high-quality foods were available, folivores and generalists used visual rather than olfactory cues to select food, whereas frugivores used both cue types equally. Lastly, individuals in all three of the study species predominantly relied on sight when choosing between low-quality foods, but species differed in the strength of their sensory biases. Our results generally emphasize visual over olfactory reliance in foraging lemurs, but we suggest that the relative sensory reliance of animals may vary with their feeding ecology. PMID:22870229

  3. Crossmodal Connections of Primary Sensory Cortices Largely Vanish During Normal Aging

    PubMed Central

    Henschke, Julia U.; Ohl, Frank W.; Budinger, Eike

    2018-01-01

    During aging, human response times (RTs) to unisensory and crossmodal stimuli decrease. However, the elderly benefit more from crossmodal stimulus representations than younger people. The underlying short-latency multisensory integration process is mediated by direct crossmodal connections at the level of primary sensory cortices. We investigate the age-related changes of these connections using a rodent model (Mongolian gerbil), retrograde tracer injections into the primary auditory (A1), somatosensory (S1), and visual cortex (V1), and immunohistochemistry for markers of apoptosis (Caspase-3), axonal plasticity (Growth associated protein 43, GAP 43), and a calcium-binding protein (Parvalbumin, PV). In adult animals, primary sensory cortices receive a substantial number of direct thalamic inputs from nuclei of their matched, but also from nuclei of non-matched sensory modalities. There are also direct intracortical connections among primary sensory cortices and connections with secondary sensory cortices of other modalities. In very old animals, the crossmodal connections strongly decrease in number or vanish entirely. This is likely due to a retraction of the projection neuron axonal branches rather than ongoing programmed cell death. The loss of crossmodal connections is also accompanied by changes in anatomical correlates of inhibition and excitation in the sensory thalamus and cortex. Together, the loss and restructuring of crossmodal connections during aging suggest a shift of multisensory processing from primary cortices towards other sensory brain areas in elderly individuals. PMID:29551970

  4. Crossmodal Connections of Primary Sensory Cortices Largely Vanish During Normal Aging.

    PubMed

    Henschke, Julia U; Ohl, Frank W; Budinger, Eike

    2018-01-01

    During aging, human response times (RTs) to unisensory and crossmodal stimuli decrease. However, the elderly benefit more from crossmodal stimulus representations than younger people. The underlying short-latency multisensory integration process is mediated by direct crossmodal connections at the level of primary sensory cortices. We investigate the age-related changes of these connections using a rodent model (Mongolian gerbil), retrograde tracer injections into the primary auditory (A1), somatosensory (S1), and visual cortex (V1), and immunohistochemistry for markers of apoptosis (Caspase-3), axonal plasticity (Growth associated protein 43, GAP 43), and a calcium-binding protein (Parvalbumin, PV). In adult animals, primary sensory cortices receive a substantial number of direct thalamic inputs from nuclei of their matched, but also from nuclei of non-matched sensory modalities. There are also direct intracortical connections among primary sensory cortices and connections with secondary sensory cortices of other modalities. In very old animals, the crossmodal connections strongly decrease in number or vanish entirely. This is likely due to a retraction of the projection neuron axonal branches rather than ongoing programmed cell death. The loss of crossmodal connections is also accompanied by changes in anatomical correlates of inhibition and excitation in the sensory thalamus and cortex. Together, the loss and restructuring of crossmodal connections during aging suggest a shift of multisensory processing from primary cortices towards other sensory brain areas in elderly individuals.

  5. Sensory memory for odors is encoded in spontaneous correlated activity between olfactory glomeruli.

    PubMed

    Galán, Roberto F; Weidert, Marcel; Menzel, Randolf; Herz, Andreas V M; Galizia, C Giovanni

    2006-01-01

    Sensory memory is a short-lived persistence of a sensory stimulus in the nervous system, such as iconic memory in the visual system. However, little is known about the mechanisms underlying olfactory sensory memory. We have therefore analyzed the effect of odor stimuli on the first odor-processing network in the honeybee brain, the antennal lobe, which corresponds to the vertebrate olfactory bulb. We stained output neurons with a calcium-sensitive dye and measured across-glomerular patterns of spontaneous activity before and after a stimulus. Such a single-odor presentation changed the relative timing of spontaneous activity across glomeruli in accordance with Hebb's theory of learning. Moreover, during the first few minutes after odor presentation, correlations between the spontaneous activity fluctuations suffice to reconstruct the stimulus. As spontaneous activity is ubiquitous in the brain, modifiable fluctuations could provide an ideal substrate for Hebbian reverberations and sensory memory in other neural systems.

  6. Sensory maps in the claustrum of the cat.

    PubMed

    Olson, C R; Graybiel, A M

    1980-12-04

    The claustrum is a telencephalic cell group (Fig. 1A, B) possessing widespread reciprocal connections with the neocortex. In this regard, it bears a unique and striking resemblance to the thalamus. We have now examined the anatomical ordering of pathways linking the claustrum with sensory areas of the cat neocortex and, in parallel electrophysiological experiments, have studied the functional organization of claustral sensory zones so identified. Our findings indicate that there are discrete visual and somatosensory subdivisions in the claustrum interconnected with the corresponding primary sensory areas of the neocortex and that the respective zones contain orderly retinotopic and somatotopic maps. A third claustral region receiving fibre projections from the auditory cortex in or near area Ep was found to contain neurones responsive to auditory stimulation. We conclude that loops connecting sensory areas of the neocortex with satellite zones in the claustrum contribute to the early processing of exteroceptive information by the forebrain.

  7. Auditory Sensory Substitution is Intuitive and Automatic with Texture Stimuli

    PubMed Central

    Stiles, Noelle R. B.; Shimojo, Shinsuke

    2015-01-01

    Millions of people are blind worldwide. Sensory substitution (SS) devices (e.g., vOICe) can assist the blind by encoding a video stream into a sound pattern, recruiting visual brain areas for auditory analysis via crossmodal interactions and plasticity. SS devices often require extensive training to attain limited functionality. In contrast to conventional attention-intensive SS training that starts with visual primitives (e.g., geometrical shapes), we argue that sensory substitution can be engaged efficiently by using stimuli (such as textures) associated with intrinsic crossmodal mappings. Crossmodal mappings link images with sounds and tactile patterns. We show that intuitive SS sounds can be matched to the correct images by naive sighted participants just as well as by intensively-trained participants. This result indicates that existing crossmodal interactions and amodal sensory cortical processing may be as important in the interpretation of patterns by SS as crossmodal plasticity (e.g., the strengthening of existing connections or the formation of new ones), especially at the earlier stages of SS usage. An SS training procedure based on crossmodal mappings could both considerably improve participant performance and shorten training times, thereby enabling SS devices to significantly expand blind capabilities. PMID:26490260

  8. Evidence for distinct mechanisms underlying attentional priming and sensory memory for bistable perception.

    PubMed

    Brinkhuis, M A B; Kristjánsson, Á; Brascamp, J W

    2015-08-01

    Attentional selection in visual search paradigms and perceptual selection in bistable perception paradigms show functional similarities. For example, both are sensitive to trial history: They are biased toward previously selected targets or interpretations. We investigated whether priming by target selection in visual search and sensory memory for bistable perception are related. We did this by presenting two trial types to observers. We presented either ambiguous spheres that rotated over a central axis and could be perceived as rotating in one of two directions, or search displays in which the unambiguously rotating target and distractor spheres closely resembled the two possible interpretations of the ambiguous stimulus. We interleaved both trial types within experiments, to see whether priming by target selection during search trials would affect the perceptual outcome of bistable perception and, conversely, whether sensory memory during bistable perception would affect target selection times during search. Whereas we found intertrial repetition effects among consecutive search trials and among consecutive bistable trials, we did not find cross-paradigm effects. Thus, even though we could ascertain that our experiments robustly elicited processes of both search priming and sensory memory for bistable perception, these same experiments revealed no interaction between the two.

  9. Anemonefishes rely on visual and chemical cues to correctly identify conspecifics

    NASA Astrophysics Data System (ADS)

    Johnston, Nicole K.; Dixson, Danielle L.

    2017-09-01

    Organisms rely on sensory cues to interpret their environment and make important life-history decisions. Accurate recognition is of particular importance in diverse reef environments. Most evidence on the use of sensory cues focuses on those used in predator avoidance or habitat recognition, with little information on their role in conspecific recognition. Yet conspecific recognition is essential for life-history decisions including settlement, mate choice, and dominance interactions. Using a sensory manipulated tank and a two-chamber choice flume, anemonefish conspecific response was measured in the presence and absence of chemical and/or visual cues. Experiments were then repeated in the presence or absence of two heterospecific species to evaluate whether a heterospecific fish altered the conspecific response. Anemonefishes responded to both the visual and chemical cues of conspecifics, but relied on the combination of the two cues to recognize conspecifics inside the sensory manipulated tank. These results contrast previous studies focusing on predator detection where anemonefishes were found to compensate for the loss of one sensory cue (chemical) by utilizing a second cue (visual). This lack of sensory compensation may impact the ability of anemonefishes to acclimate to changing reef environments in the future.

  10. Auditory Processing in Infancy: Do Early Abnormalities Predict Disorders of Language and Cognitive Development?

    ERIC Educational Resources Information Center

    Guzzetta, Francesco; Conti, Guido; Mercuri, Eugenio

    2011-01-01

    Increasing attention has been devoted to the maturation of sensory processing in the first year of life. While the development of cortical visual function has been thoroughly studied, much less information is available on auditory processing and its early disorders. The aim of this paper is to provide an overview of the assessment techniques for…

  11. Optimal Mixtures of Test Types in Paired-Associate Learning (Sensory Information Processing). Final Report.

    ERIC Educational Resources Information Center

    Wolford, George

    Seven experiments were run to determine the precise nature of some of the variables which affect the processing of short-term visual information. In particular, retinal location, report order, processing order, lateral masking, and redundancy were studied along with the nature of the confusion errors which are made in the full report procedure.…

  12. Perceptual Decoding Processes for Language in a Visual Mode and for Language in an Auditory Mode.

    ERIC Educational Resources Information Center

    Myerson, Rosemarie Farkas

    The purpose of this paper is to gain insight into the nature of the reading process through an understanding of the general nature of sensory processing mechanisms which reorganize and restructure input signals for central recognition, and an understanding of how the grammar of the language functions in defining the set of possible sentences in…

  13. Effects of visual working memory on brain information processing of irrelevant auditory stimuli.

    PubMed

    Qu, Jiagui; Rizak, Joshua D; Zhao, Lun; Li, Minghong; Ma, Yuanye

    2014-01-01

    Selective attention has traditionally been viewed as a sensory processing modulator that promotes cognitive processing efficiency by favoring relevant stimuli while inhibiting irrelevant stimuli. However, the cross-modal processing of irrelevant information during working memory (WM) has rarely been investigated. In this study, the modulation of irrelevant auditory information by the brain during a visual WM task was investigated. The N100 auditory evoked potential (N100-AEP) following an auditory click was used to evaluate the selective attention to auditory stimuli during WM processing and at rest. N100-AEP amplitudes were found to be significantly affected in the left-prefrontal, mid-prefrontal, right-prefrontal, left-frontal, and mid-frontal regions while performing a high WM load task. In contrast, no significant differences were found between N100-AEP amplitudes in WM states and rest states under a low WM load task in all recorded brain regions. Furthermore, no differences were found between the time latencies of N100-AEP troughs in WM states and rest states while performing either the high or low WM load task. These findings suggested that the prefrontal cortex (PFC) may integrate information from different sensory channels to protect perceptual integrity during cognitive processing.

  14. From Sensory Perception to Lexical-Semantic Processing: An ERP Study in Non-Verbal Children with Autism.

    PubMed

    Cantiani, Chiara; Choudhury, Naseem A; Yu, Yan H; Shafer, Valerie L; Schwartz, Richard G; Benasich, April A

    2016-01-01

This study examines electrocortical activity associated with visual and auditory sensory perception and lexical-semantic processing in nonverbal (NV) or minimally-verbal (MV) children with Autism Spectrum Disorder (ASD). Currently, there is no agreement on whether these children comprehend incoming linguistic information and whether their perception is comparable to that of typically developing children. Event-related potentials (ERPs) of 10 NV/MV children with ASD and 10 neurotypical children were recorded during a picture-word matching paradigm. Atypical ERP responses were evident at all levels of processing in children with ASD. Basic perceptual processing was delayed in both visual and auditory domains but overall was similar in amplitude to typically developing children. However, significant differences between groups were found at the lexical-semantic level, suggesting more atypical higher-order processes. The results suggest that although basic perception is relatively preserved in NV/MV children with ASD, higher levels of processing, including lexical-semantic functions, are impaired. The use of passive ERP paradigms that do not require active participant response shows significant potential for assessment of non-compliant populations such as NV/MV children with ASD.

  15. From Sensory Perception to Lexical-Semantic Processing: An ERP Study in Non-Verbal Children with Autism

    PubMed Central

    Cantiani, Chiara; Choudhury, Naseem A.; Yu, Yan H.; Shafer, Valerie L.; Schwartz, Richard G.; Benasich, April A.

    2016-01-01

This study examines electrocortical activity associated with visual and auditory sensory perception and lexical-semantic processing in nonverbal (NV) or minimally-verbal (MV) children with Autism Spectrum Disorder (ASD). Currently, there is no agreement on whether these children comprehend incoming linguistic information and whether their perception is comparable to that of typically developing children. Event-related potentials (ERPs) of 10 NV/MV children with ASD and 10 neurotypical children were recorded during a picture-word matching paradigm. Atypical ERP responses were evident at all levels of processing in children with ASD. Basic perceptual processing was delayed in both visual and auditory domains but overall was similar in amplitude to typically developing children. However, significant differences between groups were found at the lexical-semantic level, suggesting more atypical higher-order processes. The results suggest that although basic perception is relatively preserved in NV/MV children with ASD, higher levels of processing, including lexical-semantic functions, are impaired. The use of passive ERP paradigms that do not require active participant response shows significant potential for assessment of non-compliant populations such as NV/MV children with ASD. PMID:27560378

  16. Visuocortical Changes During Delay and Trace Aversive Conditioning: Evidence From Steady-State Visual Evoked Potentials

    PubMed Central

    Miskovic, Vladimir; Keil, Andreas

    2015-01-01

    The visual system is biased towards sensory cues that have been associated with danger or harm through temporal co-occurrence. An outstanding question about conditioning-induced changes in visuocortical processing is the extent to which they are driven primarily by top-down factors such as expectancy or by low-level factors such as the temporal proximity between conditioned stimuli and aversive outcomes. Here, we examined this question using two different differential aversive conditioning experiments: participants learned to associate a particular grating stimulus with an aversive noise that was presented either in close temporal proximity (delay conditioning experiment) or after a prolonged stimulus-free interval (trace conditioning experiment). In both experiments we probed cue-related cortical responses by recording steady-state visual evoked potentials (ssVEPs). Although behavioral ratings indicated that all participants successfully learned to discriminate between the grating patterns that predicted the presence versus absence of the aversive noise, selective amplification of population-level responses in visual cortex for the conditioned danger signal was observed only when the grating and the noise were temporally contiguous. Our findings are in line with notions purporting that changes in the electrocortical response of visual neurons induced by aversive conditioning are a product of Hebbian associations among sensory cell assemblies rather than being driven entirely by expectancy-based, declarative processes. PMID:23398582

  17. Digital-Visual-Sensory-Design Anthropology: Ethnography, Imagination and Intervention

    ERIC Educational Resources Information Center

    Pink, Sarah

    2014-01-01

In this article I outline how a digital-visual-sensory approach to anthropological ethnography might participate in the making of relationships between design and anthropology. While design anthropology is itself coming of age, the potential of its relationship with applied visual anthropology methodology and theory has not been considered in the…

  18. Visual Landmarks Facilitate Rodent Spatial Navigation in Virtual Reality Environments

    ERIC Educational Resources Information Center

    Youngstrom, Isaac A.; Strowbridge, Ben W.

    2012-01-01

    Because many different sensory modalities contribute to spatial learning in rodents, it has been difficult to determine whether spatial navigation can be guided solely by visual cues. Rodents moving within physical environments with visual cues engage a variety of nonvisual sensory systems that cannot be easily inhibited without lesioning brain…

  19. Motor Imagery Learning Modulates Functional Connectivity of Multiple Brain Systems in Resting State

    PubMed Central

    Zhang, Hang; Long, Zhiying; Ge, Ruiyang; Xu, Lele; Jin, Zhen; Yao, Li; Liu, Yijun

    2014-01-01

Background Learning motor skills involves subsequent modulation of resting-state functional connectivity in the sensory-motor system. This idea was mostly derived from investigations of motor execution learning, which mainly recruits the processing of sensory-motor information. Behavioral evidence has demonstrated that motor skills used in daily life can be learned through imagery procedures. However, it remains unclear whether this modulation of resting-state functional connectivity also exists in the sensory-motor system after motor imagery learning. Methodology/Principal Findings We performed an fMRI investigation of motor imagery learning from resting state. Based on previous studies, we identified eight sensory and cognitive resting-state networks (RSNs) corresponding to the brain systems and further explored the functional connectivity of these RSNs through two assessments, connectivity strength and network strength, before and after two weeks of consecutive learning. Two intriguing results were revealed: (1) the sensory RSNs, specifically the sensory-motor and lateral visual networks, exhibited greater connectivity strengths in the precuneus and fusiform gyrus after learning; (2) decreased network strength induced by learning was observed in the default mode network, a cognitive RSN. Conclusions/Significance These results indicated that resting-state functional connectivity can be modulated by motor imagery learning in multiple brain systems, and that such modulation in the sensory-motor, visual, and default brain systems may be associated with the establishment of motor schemata and the regulation of introspective thought. These findings further reveal the neural substrates underlying motor skill learning and potentially provide new insights into the therapeutic benefits of motor imagery learning. PMID:24465577

  20. Prenatal sensory experience affects hatching behavior in domestic chicks (Gallus gallus) and Japanese quail chicks (Coturnix coturnix japonica).

    PubMed

    Sleigh, Merry J; Casey, Michael B

    2014-07-01

    Species-typical developmental outcomes result from organismic and environmental constraints and experiences shared by members of a species. We examined the effects of enhanced prenatal sensory experience on hatching behaviors by exposing domestic chicks (n = 95) and Japanese quail (n = 125) to one of four prenatal conditions: enhanced visual stimulation, enhanced auditory stimulation, enhanced auditory and visual stimulation, or no enhanced sensory experience (control condition). In general, across species, control embryos had slower hatching behaviors than all other embryos. Embryos in the auditory condition had faster hatching behaviors than embryos in the visual and control conditions. Auditory-visual condition embryos showed similarities to embryos exposed to either auditory or visual stimulation. These results suggest that prenatal sensory experience can influence hatching behavior of precocial birds, with the type of stimulation being a critical variable. These results also provide further evidence that species-typical outcomes are the result of species-typical prenatal experiences. © 2013 Wiley Periodicals, Inc.

  1. Collective behaviour in vertebrates: a sensory perspective

    PubMed Central

    Collignon, Bertrand; Fernández-Juricic, Esteban

    2016-01-01

Collective behaviour models can predict behaviours of schools, flocks, and herds. However, in many cases these models make biologically unrealistic assumptions about the sensory capabilities of the organism, which are applied across different species. We explored how sensitive collective behaviour models are to these sensory assumptions. Specifically, we used parameters reflecting the visual coverage and visual acuity that determine the spatial range over which an individual can detect and interact with conspecifics. Using metric and topological collective behaviour models, we compared the classic sensory parameters, typically used to model birds and fish, with a set of realistic sensory parameters obtained through physiological measurements. Compared with the classic sensory assumptions, the realistic assumptions increased perceptual ranges, which led to fewer groups and larger group sizes in all species, and to higher polarity values and slightly shorter neighbour distances in the fish species. Overall, classic visual sensory assumptions are not representative of many species showing collective behaviour and unrealistically constrain their perceptual ranges. More importantly, caution must be exercised when empirically testing the predictions of these models in terms of choosing the model species, making realistic predictions, and interpreting the results. PMID:28018616
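The metric and topological interaction rules compared in this record differ only in how an individual's neighbours are selected. A minimal sketch of that distinction (our own illustration, not the authors' implementation; function names and parameter values are assumptions):

```python
import math

def metric_neighbors(positions, i, radius):
    """Metric model: interact with every conspecific within a fixed distance."""
    xi, yi = positions[i]
    return [j for j, (x, y) in enumerate(positions)
            if j != i and math.hypot(x - xi, y - yi) <= radius]

def topological_neighbors(positions, i, k):
    """Topological model: interact with the k nearest conspecifics,
    however far away they are."""
    xi, yi = positions[i]
    others = sorted((j for j in range(len(positions)) if j != i),
                    key=lambda j: math.hypot(positions[j][0] - xi,
                                             positions[j][1] - yi))
    return others[:k]

# Four individuals on a line: one close pair and two stragglers.
flock = [(0.0, 0.0), (1.0, 0.0), (5.0, 0.0), (9.0, 0.0)]
print(metric_neighbors(flock, 0, radius=2.0))   # [1]: only the nearby individual
print(topological_neighbors(flock, 0, k=2))     # [1, 2]: two nearest, distance ignored
```

Widening the detection radius (or coverage angle) to physiologically measured values, as the study does, changes which individuals fall inside these neighbourhoods and hence the predicted group structure.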

  2. Virtually-induced threat in Parkinson's: Dopaminergic interactions between anxiety and sensory-perceptual processing while walking.

    PubMed

    Ehgoetz Martens, Kaylena A; Ellard, Colin G; Almeida, Quincy J

    2015-12-01

Research evidence has suggested that anxiety influences gait in PD, with an identified dopa-sensitive gait response in highly anxious PD. It has been well established that accurate perception of the environment and sensory feedback is essential for gait. Since sensory and perceptual deficits have been noted in PD, anxiety has the potential to exacerbate movement impairments: reducing the resources available to overcome or compensate for sensory-perceptual deficits may lead to even more severe gait impairments. Anxiety in threatening situations might consume more processing resources, limiting the ability to process information about the environment or one's own movement (sensory feedback), especially in highly anxious PD. Therefore, the current study aimed to (i) evaluate whether processing of threat-related aspects of the environment is influenced by anxiety, (ii) evaluate whether anxiety influences the ability to utilize sensory feedback in PD while walking in threatening situations, and (iii) further understand the role of dopaminergic medication in these processes in threatening situations in PD. Forty-eight participants (24 HC; 12 low anxious [LA-PD]; 12 highly anxious [HA-PD]) completed 20 walking trials in virtual reality across a plank that was (i) located on the ground (GROUND) or (ii) elevated above a deep pit (ELEVATED), while walking with or without visual feedback about their lower limbs (+VF; -VF). After each trial, participants were asked to judge the width of the plank they had just walked across; plank width varied from 60 to 100 cm. Both ON and OFF dopaminergic medication states were evaluated in PD. Gait parameters, judgment error, and self-reported anxiety levels were measured. Results showed that HA-PD reported greater levels of anxiety overall (p<0.001) compared to HC and LA-PD, and all participants reported greater anxiety during the ELEVATED condition compared to GROUND (p=0.01). PD showed judgment error similar to HC, and medication state did not significantly influence judgment error in PD. More importantly, HA-PD was the only group that did not adjust step width when feedback was provided during the GROUND condition. However, medication facilitated a reduction in ST-CV when visual feedback was available, only in the HA-PD group. Therefore, the current study provides evidence that anxiety may interfere with information processing, particularly the use of sensory feedback while walking. Dopaminergic medication appears to improve utilization of sensory feedback in stressful situations by reducing anxiety and/or improving resource allocation, especially in those with PD who are highly anxious. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Data of ERPs and spectral alpha power when attention is engaged on visual or verbal/auditory imagery

    PubMed Central

    Villena-González, Mario; López, Vladimir; Rodríguez, Eugenio

    2016-01-01

This article provides data from statistical analysis of event-related brain potentials (ERPs) and spectral power from 20 participants during three attentional conditions. Specifically, P1, N1 and P300 amplitude of ERP were compared when participant's attention was oriented to an external task, to a visual imagery and to an inner speech. The spectral power from alpha band was also compared in these three attentional conditions. These data are related to the research article where sensory processing of external information was compared during these three conditions entitled “Orienting attention to visual or verbal/auditory imagery differentially impairs the processing of visual stimuli” (Villena-Gonzalez et al., 2016) [1]. PMID:27077090

  4. Object Recognition in Mental Representations: Directions for Exploring Diagnostic Features through Visual Mental Imagery.

    PubMed

    Roldan, Stephanie M

    2017-01-01

    One of the fundamental goals of object recognition research is to understand how a cognitive representation produced from the output of filtered and transformed sensory information facilitates efficient viewer behavior. Given that mental imagery strongly resembles perceptual processes in both cortical regions and subjective visual qualities, it is reasonable to question whether mental imagery facilitates cognition in a manner similar to that of perceptual viewing: via the detection and recognition of distinguishing features. Categorizing the feature content of mental imagery holds potential as a reverse pathway by which to identify the components of a visual stimulus which are most critical for the creation and retrieval of a visual representation. This review will examine the likelihood that the information represented in visual mental imagery reflects distinctive object features thought to facilitate efficient object categorization and recognition during perceptual viewing. If it is the case that these representational features resemble their sensory counterparts in both spatial and semantic qualities, they may well be accessible through mental imagery as evaluated through current investigative techniques. In this review, methods applied to mental imagery research and their findings are reviewed and evaluated for their efficiency in accessing internal representations, and implications for identifying diagnostic features are discussed. An argument is made for the benefits of combining mental imagery assessment methods with diagnostic feature research to advance the understanding of visual perceptive processes, with suggestions for avenues of future investigation.

  5. Object Recognition in Mental Representations: Directions for Exploring Diagnostic Features through Visual Mental Imagery

    PubMed Central

    Roldan, Stephanie M.

    2017-01-01

    One of the fundamental goals of object recognition research is to understand how a cognitive representation produced from the output of filtered and transformed sensory information facilitates efficient viewer behavior. Given that mental imagery strongly resembles perceptual processes in both cortical regions and subjective visual qualities, it is reasonable to question whether mental imagery facilitates cognition in a manner similar to that of perceptual viewing: via the detection and recognition of distinguishing features. Categorizing the feature content of mental imagery holds potential as a reverse pathway by which to identify the components of a visual stimulus which are most critical for the creation and retrieval of a visual representation. This review will examine the likelihood that the information represented in visual mental imagery reflects distinctive object features thought to facilitate efficient object categorization and recognition during perceptual viewing. If it is the case that these representational features resemble their sensory counterparts in both spatial and semantic qualities, they may well be accessible through mental imagery as evaluated through current investigative techniques. In this review, methods applied to mental imagery research and their findings are reviewed and evaluated for their efficiency in accessing internal representations, and implications for identifying diagnostic features are discussed. An argument is made for the benefits of combining mental imagery assessment methods with diagnostic feature research to advance the understanding of visual perceptive processes, with suggestions for avenues of future investigation. PMID:28588538

  6. The Human Brain Uses Noise

    NASA Astrophysics Data System (ADS)

    Mori, Toshio; Kai, Shoichi

    2003-05-01

We present the first observation of stochastic resonance (SR) in the human brain's visual processing area. The novel experimental protocol is to stimulate the right eye with a sub-threshold periodic optical signal and the left eye with a noisy one. Because signal and noise enter through different eyes, they are mixed not at the sensory organs but in the visual cortex. With many noise sources present in the brain, higher brain functions, e.g., perception and cognition, may exploit SR.
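The stochastic-resonance mechanism this record describes, a sub-threshold signal that becomes detectable only when noise is added, can be illustrated with a toy threshold detector (a sketch of the general SR principle, not the authors' protocol; all parameter values are assumptions):

```python
import math
import random

def detections(noise_sd, threshold=1.0, amplitude=0.6, freq=1.0,
               steps=2000, dt=0.01, seed=0):
    """Count upward threshold crossings of a sub-threshold sine plus noise."""
    rng = random.Random(seed)
    count = 0
    above = False
    for n in range(steps):
        x = (amplitude * math.sin(2 * math.pi * freq * n * dt)
             + rng.gauss(0.0, noise_sd))
        if x >= threshold and not above:
            count += 1          # each upward crossing counts as one "detection"
        above = x >= threshold
    return count

print(detections(0.0))   # 0: the sub-threshold signal alone is never detected
print(detections(0.3))   # moderate noise pushes the signal over threshold
```

With no noise the 0.6-amplitude signal never reaches the 1.0 threshold; moderate noise lifts the signal peaks over it, so detections cluster at the signal's phase, which is the hallmark of SR. Too much noise would eventually swamp the periodicity.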

  7. Adaptation to sensory-motor reflex perturbations is blind to the source of errors.

    PubMed

    Hudson, Todd E; Landy, Michael S

    2012-01-06

    In the study of visual-motor control, perhaps the most familiar findings involve adaptation to externally imposed movement errors. Theories of visual-motor adaptation based on optimal information processing suppose that the nervous system identifies the sources of errors to effect the most efficient adaptive response. We report two experiments using a novel perturbation based on stimulating a visually induced reflex in the reaching arm. Unlike adaptation to an external force, our method induces a perturbing reflex within the motor system itself, i.e., perturbing forces are self-generated. This novel method allows a test of the theory that error source information is used to generate an optimal adaptive response. If the self-generated source of the visually induced reflex perturbation is identified, the optimal response will be via reflex gain control. If the source is not identified, a compensatory force should be generated to counteract the reflex. Gain control is the optimal response to reflex perturbation, both because energy cost and movement errors are minimized. Energy is conserved because neither reflex-induced nor compensatory forces are generated. Precision is maximized because endpoint variance is proportional to force production. We find evidence against source-identified adaptation in both experiments, suggesting that sensory-motor information processing is not always optimal.

  8. Visual cortical areas of the mouse: comparison of parcellation and network structure with primates

    PubMed Central

    Laramée, Marie-Eve; Boire, Denis

    2015-01-01

    Brains have evolved to optimize sensory processing. In primates, complex cognitive tasks must be executed and evolution led to the development of large brains with many cortical areas. Rodents do not accomplish cognitive tasks of the same level of complexity as primates and remain with small brains both in relative and absolute terms. But is a small brain necessarily a simple brain? In this review, several aspects of the visual cortical networks have been compared between rodents and primates. The visual system has been used as a model to evaluate the level of complexity of the cortical circuits at the anatomical and functional levels. The evolutionary constraints are first presented in order to appreciate the rules for the development of the brain and its underlying circuits. The organization of sensory pathways, with their parallel and cross-modal circuits, is also examined. Other features of brain networks, often considered as imposing constraints on the development of underlying circuitry, are also discussed and their effect on the complexity of the mouse and primate brain are inspected. In this review, we discuss the common features of cortical circuits in mice and primates and see how these can be useful in understanding visual processing in these animals. PMID:25620914

  9. Visual cortical areas of the mouse: comparison of parcellation and network structure with primates.

    PubMed

    Laramée, Marie-Eve; Boire, Denis

    2014-01-01

    Brains have evolved to optimize sensory processing. In primates, complex cognitive tasks must be executed and evolution led to the development of large brains with many cortical areas. Rodents do not accomplish cognitive tasks of the same level of complexity as primates and remain with small brains both in relative and absolute terms. But is a small brain necessarily a simple brain? In this review, several aspects of the visual cortical networks have been compared between rodents and primates. The visual system has been used as a model to evaluate the level of complexity of the cortical circuits at the anatomical and functional levels. The evolutionary constraints are first presented in order to appreciate the rules for the development of the brain and its underlying circuits. The organization of sensory pathways, with their parallel and cross-modal circuits, is also examined. Other features of brain networks, often considered as imposing constraints on the development of underlying circuitry, are also discussed and their effect on the complexity of the mouse and primate brain are inspected. In this review, we discuss the common features of cortical circuits in mice and primates and see how these can be useful in understanding visual processing in these animals.

  10. Visual Occlusion Decreases Motion Sickness in a Flight Simulator.

    PubMed

    Ishak, Shaziela; Bubka, Andrea; Bonato, Frederick

    2018-05-01

    Sensory conflict theories of motion sickness (MS) assert that symptoms may result when incoming sensory inputs (e.g., visual and vestibular) contradict each other. Logic suggests that attenuating input from one sense may reduce conflict and hence lessen MS symptoms. In the current study, it was hypothesized that attenuating visual input by blocking light entering the eye would reduce MS symptoms in a motion provocative environment. Participants sat inside an aircraft cockpit mounted onto a motion platform that simultaneously pitched, rolled, and heaved in two conditions. In the occluded condition, participants wore "blackout" goggles and closed their eyes to block light. In the control condition, participants opened their eyes and had full view of the cockpit's interior. Participants completed separate Simulator Sickness Questionnaires before and after each condition. The posttreatment total Simulator Sickness Questionnaires and subscores for nausea, oculomotor, and disorientation in the control condition were significantly higher than those in the occluded condition. These results suggest that under some conditions attenuating visual input may delay the onset of MS or weaken the severity of symptoms. Eliminating visual input may reduce visual/nonvisual sensory conflict by weakening the influence of the visual channel, which is consistent with the sensory conflict theory of MS.

  11. Sensory Impairments and Cognitive Function in Middle-Aged Adults.

    PubMed

    Schubert, Carla R; Cruickshanks, Karen J; Fischer, Mary E; Chen, Yanjun; Klein, Barbara E K; Klein, Ronald; Pinto, A Alex

    2017-08-01

    Hearing, visual, and olfactory impairments have been associated with cognitive impairment in older adults but less is known about associations with cognitive function in middle-aged adults. Sensory and cognitive functions were measured on participants in the baseline examination (2005-2008) of the Beaver Dam Offspring Study. Cognitive function was measured with the Trail Making tests A (TMTA) and B (TMTB) and the Grooved Peg Board test. Pure-tone audiometry, Pelli-Robson letter charts, and the San Diego Odor Identification test were used to measure hearing, contrast sensitivity, and olfaction, respectively. There were 2,836 participants aged 21-84 years with measures of hearing, visual, olfactory, and cognitive function at the baseline examination. Nineteen percent of the cohort had one sensory impairment and 3% had multiple sensory impairments. In multivariable adjusted linear regression models that included all three sensory impairments, hearing impairment, visual impairment, and olfactory impairment were each independently associated with poorer performance on the TMTA, TMTB, and Grooved Peg Board (p < .05 for all sensory impairments in all models). Participants with a sensory impairment took on average from 2 to 10 seconds longer than participants without the corresponding sensory impairment to complete these tests. Results were similar in models that included adjustment for hearing aid use. Hearing, visual and olfactory impairment were associated with poorer performance on cognitive function tests independent of the other sensory impairments and factors associated with cognition. Sensory impairments in midlife are associated with subtle deficits in cognitive function which may be indicative of early brain aging. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  12. Hearing Shapes: Event-related Potentials Reveal the Time Course of Auditory-Visual Sensory Substitution.

    PubMed

    Graulty, Christian; Papaioannou, Orestis; Bauer, Phoebe; Pitts, Michael A; Canseco-Gonzalez, Enriqueta

    2018-04-01

    In auditory-visual sensory substitution, visual information (e.g., shape) can be extracted through strictly auditory input (e.g., soundscapes). Previous studies have shown that image-to-sound conversions that follow simple rules [such as the Meijer algorithm; Meijer, P. B. L. An experimental system for auditory image representation. Transactions on Biomedical Engineering, 39, 111-121, 1992] are highly intuitive and rapidly learned by both blind and sighted individuals. A number of recent fMRI studies have begun to explore the neuroplastic changes that result from sensory substitution training. However, the time course of cross-sensory information transfer in sensory substitution is largely unexplored and may offer insights into the underlying neural mechanisms. In this study, we recorded ERPs to soundscapes before and after sighted participants were trained with the Meijer algorithm. We compared these posttraining versus pretraining ERP differences with those of a control group who received the same set of 80 auditory/visual stimuli but with arbitrary pairings during training. Our behavioral results confirmed the rapid acquisition of cross-sensory mappings, and the group trained with the Meijer algorithm was able to generalize their learning to novel soundscapes at impressive levels of accuracy. The ERP results revealed an early cross-sensory learning effect (150-210 msec) that was significantly enhanced in the algorithm-trained group compared with the control group as well as a later difference (420-480 msec) that was unique to the algorithm-trained group. These ERP modulations are consistent with previous fMRI results and provide additional insight into the time course of cross-sensory information transfer in sensory substitution.
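The simple image-to-sound rule used by Meijer-style conversions (horizontal position maps to time, vertical position to pitch, brightness to loudness) can be sketched as follows. This is an illustrative toy, not the published vOICe implementation; the function name, frequency range, and other parameters are assumptions:

```python
import math

def image_to_soundscape(image, duration=1.0, sample_rate=8000,
                        f_min=200.0, f_max=2000.0):
    """Scan image columns left to right over time; each pixel row drives a
    sine tone (top rows = high frequency) whose amplitude is the pixel's
    brightness. Returns a list of audio samples."""
    rows, cols = len(image), len(image[0])
    samples_per_col = int(duration * sample_rate / cols)
    # One fixed frequency per row, spaced between f_min and f_max.
    freqs = [f_max - r * (f_max - f_min) / max(rows - 1, 1)
             for r in range(rows)]
    out = []
    for c in range(cols):                  # time axis = horizontal position
        for n in range(samples_per_col):
            t = n / sample_rate
            s = sum(image[r][c] * math.sin(2 * math.pi * freqs[r] * t)
                    for r in range(rows))
            out.append(s / rows)           # keep amplitude bounded
    return out

# A 3x4 brightness grid (0 = black, 1 = white): a bright descending diagonal,
# which should be heard as a falling pitch sweep.
img = [[1, 0, 0, 0],
       [0, 1, 0, 0],
       [0, 0, 1, 1]]
sound = image_to_soundscape(img)
print(len(sound))   # 8000 (= 4 columns x 2000 samples per column)
```

The "simple rules" quality that the record highlights is visible here: the whole mapping is a few lines, which is presumably what makes such soundscapes rapidly learnable.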

  13. Value-driven attentional capture in the auditory domain.

    PubMed

    Anderson, Brian A

    2016-01-01

    It is now well established that the visual attention system is shaped by reward learning. When visual features are associated with a reward outcome, they acquire high priority and can automatically capture visual attention. To date, evidence for value-driven attentional capture has been limited entirely to the visual system. In the present study, I demonstrate that previously reward-associated sounds also capture attention, interfering more strongly with the performance of a visual task. This finding suggests that value-driven attention reflects a broad principle of information processing that can be extended to other sensory modalities and that value-driven attention can bias cross-modal stimulus competition.

  14. Emotional words facilitate lexical but not early visual processing.

    PubMed

    Trauer, Sophie M; Kotz, Sonja A; Müller, Matthias M

    2015-12-12

Emotional scenes and faces have been shown to capture and bind visual resources at early sensory processing stages, i.e., in early visual cortex. However, studies of emotional words have produced mixed results. In the current study, ERPs were assessed simultaneously with steady-state visual evoked potentials (SSVEPs) to measure attention effects on early visual activity during emotional word processing. Neutral and negative words were flickered at 12.14 Hz whilst participants performed a Lexical Decision Task. Emotional word content did not modulate the 12.14 Hz SSVEP amplitude, and neither did word lexicality. However, emotional words affected the ERP. Negative compared to neutral words, as well as words compared to pseudowords, led to enhanced deflections in the P2 time range, indicative of lexico-semantic access. The N400 was reduced for negative compared to neutral words and enhanced for pseudowords compared to words, indicating facilitated semantic processing of emotional words. LPC amplitudes reflected word lexicality and thus the task-relevant response. In line with previous ERP and imaging evidence, the present results indicate that written emotional words are facilitated in processing only subsequent to visual analysis.
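Frequency tagging of this kind, flickering a stimulus at a fixed rate and reading out the SSVEP amplitude at exactly that frequency, reduces to estimating a single Fourier component. A minimal sketch on synthetic data (parameters and the synthetic "EEG" are our own assumptions, not the authors' analysis pipeline):

```python
import cmath
import math
import random

def amplitude_at(signal, freq, sample_rate):
    """Single-frequency DFT: amplitude of the component at `freq`."""
    n = len(signal)
    acc = sum(x * cmath.exp(-2j * math.pi * freq * k / sample_rate)
              for k, x in enumerate(signal))
    return 2 * abs(acc) / n

# Synthetic recording: a 12.14 Hz tagged response of amplitude 0.5 in noise.
fs, dur, tag = 500, 50.0, 12.14   # 50 s window -> 0.02 Hz bins, 12.14 on-bin
rng = random.Random(1)
eeg = [0.5 * math.sin(2 * math.pi * tag * k / fs) + rng.gauss(0.0, 1.0)
       for k in range(int(fs * dur))]
print(amplitude_at(eeg, tag, fs))     # recovers roughly the true 0.5 amplitude
print(amplitude_at(eeg, 17.30, fs))   # an untagged frequency: near zero
```

The window length matters: with a 50 s window the bin spacing is 0.02 Hz, so 12.14 Hz falls exactly on a bin and the tagged component is estimated without spectral leakage.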

  15. The visual cognitive network, but not the visual sensory network, is affected in amnestic mild cognitive impairment: a study of brain oscillatory responses.

    PubMed

    Yener, Görsev G; Emek-Savaş, Derya Durusu; Güntekin, Bahar; Başar, Erol

    2014-10-17

    Mild Cognitive Impairment (MCI) is considered by many as a prodromal stage of Alzheimer's disease (AD). Event-related oscillations (ERO) reflect cognitive responses of the brain, whereas sensory-evoked oscillations (SEO) reflect sensory responses. For this study, we compared visual SEO and ERO responses in MCI to explore brain dynamics (BACKGROUND). Forty-three patients with MCI (mean age = 74.0 years) and 41 age- and education-matched healthy elderly controls (HC) (mean age = 71.1 years) participated in the study. The maximum peak-to-peak amplitudes of each subject's averaged delta response (0.5-3.0 Hz) were measured in two conditions (simple visual stimulation and target stimulation in a classical visual oddball paradigm) (METHOD). Overall, amplitudes of target ERO responses were higher than SEO amplitudes. The preferential location for maximum amplitude values was the frontal lobe for ERO and the occipital lobe for SEO. The ANOVA for delta responses showed a significant group × paradigm interaction. Post-hoc tests indicated that (1) the difference between groups was significant for target delta responses, but not for SEO, (2) ERO elicited higher responses in HC than in MCI patients, and (3) females had higher target ERO than males, a difference that was pronounced in the control group (RESULTS). Overall, cognitive responses display almost double the amplitudes of sensory responses over frontal regions. The topography of oscillatory responses differs depending on stimuli: visual sensory responses are highest over occipital regions and cognitive responses over frontal regions. A group effect is observed in MCI, indicating that visual sensory and cognitive circuits behave differently: visual sensory responses are preserved, but cognitive responses are decreased (CONCLUSION). Copyright © 2014 Elsevier B.V. All rights reserved.
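    The delta-band measure described here (maximum peak-to-peak amplitude of the 0.5-3.0 Hz filtered, averaged response) can be sketched as follows; the sampling rate, filter type, and filter order are illustrative assumptions, not the authors' exact pipeline:

    ```python
    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def delta_peak_to_peak(avg_response, fs, band=(0.5, 3.0), order=4):
        """Max peak-to-peak amplitude of the delta-filtered averaged response."""
        sos = butter(order, band, btype="bandpass", fs=fs, output="sos")
        filtered = sosfiltfilt(sos, avg_response)  # zero-phase filtering
        return filtered.max() - filtered.min()

    # Simulated averaged response: a 2 Hz delta component (5 uV amplitude,
    # i.e. 10 uV peak-to-peak) plus a 40 Hz gamma component to be rejected
    fs = 250.0
    t = np.arange(0, 2.0, 1.0 / fs)
    avg = 5.0 * np.sin(2 * np.pi * 2.0 * t) + 2.0 * np.sin(2 * np.pi * 40.0 * t)

    pp = delta_peak_to_peak(avg, fs)
    ```

    The 40 Hz component lies far outside the passband, so the returned value reflects only the delta component's roughly 10 uV peak-to-peak excursion.
    
    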

  16. Accurate metacognition for visual sensory memory representations.

    PubMed

    Vandenbroucke, Annelinde R E; Sligte, Ilja G; Barrett, Adam B; Seth, Anil K; Fahrenfort, Johannes J; Lamme, Victor A F

    2014-04-01

    The capacity to attend to multiple objects in the visual field is limited. However, introspectively, people feel that they see the whole visual world at once. Some scholars suggest that this introspective feeling is based on short-lived sensory memory representations, whereas others argue that the feeling of seeing more than can be attended to is illusory. Here, we investigated this phenomenon by combining objective memory performance with subjective confidence ratings during a change-detection task. This allowed us to compute a measure of metacognition--the degree of knowledge that subjects have about the correctness of their decisions--for different stages of memory. We show that subjects store more objects in sensory memory than they can attend to but, at the same time, have similar metacognition for sensory memory and working memory representations. This suggests that these subjective impressions are not an illusion but accurate reflections of the richness of visual perception.
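    A common way to quantify metacognition of this kind is the trial-by-trial association between confidence and accuracy, for instance the Goodman-Kruskal gamma correlation; the function below is a generic sketch of that idea, not necessarily the exact statistic the authors computed:

    ```python
    def gamma_correlation(confidence, correct):
        """Goodman-Kruskal gamma between confidence ratings and accuracy.

        confidence : sequence of ordinal ratings (e.g. 1-4)
        correct    : sequence of 0/1 accuracy values, one per trial
        Returns a value in [-1, 1]; positive values mean higher confidence
        accompanies correct responses, i.e. accurate metacognition.
        """
        concordant = discordant = 0
        n = len(confidence)
        for i in range(n):
            for j in range(i + 1, n):
                dc = confidence[i] - confidence[j]
                da = correct[i] - correct[j]
                if dc * da > 0:
                    concordant += 1
                elif dc * da < 0:
                    discordant += 1
        if concordant + discordant == 0:
            return 0.0
        return (concordant - discordant) / (concordant + discordant)

    # Toy data in which confidence tracks accuracy perfectly
    conf = [4, 3, 4, 1, 2, 1, 3, 2]
    acc  = [1, 1, 1, 0, 0, 0, 1, 0]
    g = gamma_correlation(conf, acc)  # -> 1.0 for these toy data
    ```

    Computing this separately for trials probing sensory memory and trials probing working memory allows the comparison of metacognitive accuracy across memory stages described in the abstract.
    
    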

  17. Sensory and demographic characteristics of deafblindness rehabilitation clients in Montréal, Canada.

    PubMed

    Wittich, Walter; Watanabe, Donald H; Gagné, Jean-Pierre

    2012-05-01

      Demographic changes are increasing the number of older adults with combined age-related vision and hearing loss, while medical advances increase the survival probability of children with congenital dual (or multiple) impairments due to prematurity or rare hereditary diseases. Rehabilitation services for these populations are in high demand, since traditional uni-sensory rehabilitation approaches that use the other sense to compensate are not always applicable. Very little is currently known about the characteristics of the client population with dual sensory impairment. The present study provides information about demographic and sensory variables of persons in the Montreal region who were receiving rehabilitation for dual impairment in December 2010. This information can inform researchers, clinicians, educators, as well as administrators about potential research and service delivery priorities. A chart review of all client files across the three rehabilitation agencies that offer integrated dual sensory rehabilitation services in Montreal provided data on visual acuity, visual field, hearing detection thresholds, and demographic variables. The 209 males and 355 females ranged in age from 4 months to 105 years (M = 71.9, S.D. = 24.6), indicating a prevalence estimate for dual sensory impairment of 15 per 100,000. Only 5.7% were under 18 years of age, while 69.1% were over the age of 65 and 43.1% over the age of 85. The diagnostic combination that accounted for 31% of the entire sample was age-related macular degeneration with presbycusis. Their visual and auditory measures indicated that older adults were likely to fall into moderate to severe levels of impairment on both measures. Individuals with Usher syndrome comprised 20.9% (n = 118) of the sample.
The age distribution in this sample of persons with dual sensory impairment indicates that service delivery planning will need to strongly consider the growing presence of older adults as the baby-boomers approach retirement age. The distribution of their visual and auditory limits indicates that the large majority of this client group has residual vision and hearing that can be maximized in the rehabilitation process in order to restore functional abilities and social participation. Future research in this area should identify the specific priorities in both rehabilitation and research in individuals affected with combined vision and hearing loss. Ophthalmic & Physiological Optics © 2012 The College of Optometrists.

  18. The sensory components of high-capacity iconic memory and visual working memory.

    PubMed

    Bradley, Claire; Pearson, Joel

    2012-01-01

    Early visual memory can be split into two primary components: a high-capacity, short-lived iconic memory followed by a limited-capacity visual working memory that can last many seconds. Whereas a large number of studies have investigated visual working memory for low-level sensory features, much research on iconic memory has used more "high-level" alphanumeric stimuli such as letters or numbers. These two forms of memory are typically examined separately, despite an intrinsic overlap in their characteristics. Here, we used a purely sensory paradigm to examine visual short-term memory for 10 homogeneous items of three different visual features (color, orientation and motion) across a range of durations from 0 to 6 s. We found that the amount of information stored in iconic memory is smaller for motion than for color or orientation. Performance declined exponentially with longer storage durations and reached chance levels after ∼2 s. Further experiments showed that performance for the 10 items at 1 s was contingent on unperturbed attentional resources. In addition, for orientation stimuli, performance was contingent on the location of stimuli in the visual field, especially for short cue delays. Overall, our results suggest a smooth transition between an automatic, high-capacity, feature-specific sensory-iconic memory, and an effortful "lower-capacity" visual working memory.
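    An exponential decline toward chance of the kind reported here corresponds to fitting accuracy as chance + (p0 - chance) * exp(-t / tau). A sketch with invented accuracies (not the study's data), where p0 and tau are the fitted initial accuracy and decay constant:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def decay(t, p0, tau, chance=0.5):
        """Exponential decay from initial accuracy p0 toward chance level."""
        return chance + (p0 - chance) * np.exp(-t / tau)

    # Illustrative accuracies at cue delays of 0-6 s (invented numbers)
    delays = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0])
    accuracy = np.array([0.95, 0.78, 0.66, 0.55, 0.51, 0.50])

    # Fit p0 and tau; chance stays fixed at 0.5 because p0 has length 2
    (p0_hat, tau_hat), _ = curve_fit(decay, delays, accuracy, p0=(0.9, 1.0))
    ```

    A decay constant tau around 1 s is consistent with performance reaching chance after roughly 2 s, as the abstract describes.
    
    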

  19. Brain representations for acquiring and recalling visual-motor adaptations

    PubMed Central

    Bédard, Patrick; Sanes, Jerome N.

    2014-01-01

    Humans readily learn and remember new motor skills, a process that likely underlies adaptation to changing environments. During adaptation, the brain develops new sensory-motor relationships, and if consolidation occurs, a memory of the adaptation can be retained for extended periods. Considerable evidence exists that multiple brain circuits participate in acquiring new sensory-motor memories, though less is known about the networks engaged in recalling these memories, and about whether the same brain circuits participate in both formation and recall. To address these issues, we assessed brain activation with functional MRI while young healthy adults learned and recalled new sensory-motor skills by adapting to world-view rotations of visual feedback that guided hand movements. We found cerebellar activation related to adaptation rate, likely reflecting changes related to overall adjustments to the visual rotation. A set of parietal and frontal regions, including the inferior and superior parietal lobules, premotor area, supplementary motor area and primary somatosensory cortex, exhibited non-linear learning-related activation that peaked in the middle of the adaptation phase. Activation in some of these areas, including the inferior parietal lobule, intra-parietal sulcus and somatosensory cortex, likely reflected actual learning, since the activation correlated with learning after-effects. Lastly, we identified several structures with recall-related activation, including the anterior cingulate and the posterior putamen, whose activation correlated with recall efficacy. These findings demonstrate dynamic aspects of brain activation patterns related to formation and recall of a sensory-motor skill, such that non-overlapping brain regions participate in distinctive behavioral events. PMID:25019676

  20. Modeling of Explorative Procedures for Remote Object Identification

    DTIC Science & Technology

    1991-09-01

    …representation of human search models is achieved by using the proprioceptive component of the haptic sensory system and the simulated foveal component of the visual system. Eventually it will allow multiple applications in remote sensing and…superposition of sensory channels. The use of a force reflecting telemanipulator and computer simulated visual foveal component are the tools which…

  1. Sensory Substitution: The Spatial Updating of Auditory Scenes “Mimics” the Spatial Updating of Visual Scenes

    PubMed Central

    Pasqualotto, Achille; Esenkaya, Tayfun

    2016-01-01

    Visual-to-auditory sensory substitution is used to convey visual information through audition, and it was initially created to compensate for blindness; it consists of software converting the visual images captured by a video-camera into the equivalent auditory images, or "soundscapes". Here, it was used by blindfolded sighted participants to learn the spatial position of simple shapes depicted in images arranged on the floor. Very few studies have used sensory substitution to investigate spatial representation, while it has been widely used to investigate object recognition. Additionally, with sensory substitution we could study the performance of participants actively exploring the environment through audition, rather than passively localizing sound sources. Blindfolded participants egocentrically learnt the position of six images by using sensory substitution, and then a judgment of relative direction (JRD) task was used to determine how this scene was represented. This task consists of imagining being in a given location, oriented in a given direction, and pointing towards the required image. Before performing the JRD task, participants explored a map that provided allocentric information about the scene. Although spatial exploration was egocentric, surprisingly we found that performance in the JRD task was better for allocentric perspectives. This suggests that the egocentric representation of the scene was updated. This result is in line with previous studies using visual and somatosensory scenes, thus supporting the notion that different sensory modalities produce equivalent spatial representations. Moreover, our results have practical implications for improving training methods with sensory substitution devices (SSDs). PMID:27148000
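    Visual-to-auditory substitution devices of this kind typically scan the image column by column, mapping vertical position to pitch and brightness to loudness. Below is a simplified sketch of such a conversion; all parameters (sampling rate, frequency range, column duration) are illustrative, and this is not the specific SSD used in the study:

    ```python
    import numpy as np

    def image_to_soundscape(image, fs=8000, col_dur=0.05,
                            f_lo=500.0, f_hi=5000.0):
        """Convert a 2-D grayscale image (rows x cols, values 0-1) into a
        left-to-right soundscape: each column becomes a short audio frame in
        which row position sets frequency (top = high) and brightness sets
        amplitude. A simplified vOICe-style mapping with illustrative values.
        """
        n_rows, n_cols = image.shape
        # Exponential frequency scale, highest frequency for the top row
        freqs = f_hi * (f_lo / f_hi) ** (np.arange(n_rows) / (n_rows - 1))
        t = np.arange(int(fs * col_dur)) / fs
        frames = []
        for c in range(n_cols):
            # Each bright pixel contributes a sine at its row's frequency
            tones = image[:, c, None] * np.sin(2 * np.pi * freqs[:, None] * t)
            frames.append(tones.sum(axis=0))
        sound = np.concatenate(frames)
        peak = np.abs(sound).max()
        return sound / peak if peak > 0 else sound

    # A diagonal line: the soundscape sweeps downward in pitch over time
    img = np.eye(8)
    wave = image_to_soundscape(img)
    ```

    With an 8x8 diagonal image, the listener hears a descending pitch sweep from left to right, which is how shape and position become audible in such devices.
    
    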

  2. Multisensory perceptual learning of temporal order: audiovisual learning transfers to vision but not audition.

    PubMed

    Alais, David; Cass, John

    2010-06-23

    An outstanding question in sensory neuroscience is whether the perceived timing of events is mediated by a central supra-modal timing mechanism, or multiple modality-specific systems. We use a perceptual learning paradigm to address this question. Three groups were trained daily for 10 sessions on an auditory, a visual or a combined audiovisual temporal order judgment (TOJ). Groups were pre-tested on a range of TOJ tasks within and beyond their group's modality prior to learning, so that transfer of any learning from the trained task could be measured by post-testing on the other tasks. Robust TOJ learning (reduced temporal order discrimination thresholds) occurred for all groups, although auditory learning (dichotic 500/2000 Hz tones) was slightly weaker than visual learning (lateralised grating patches). Crossmodal TOJs also displayed robust learning. Post-testing revealed that improvements in temporal resolution acquired during visual learning transferred within modality to other retinotopic locations and orientations, but not to auditory or crossmodal tasks. Auditory learning did not transfer to visual or crossmodal tasks, and neither did it transfer within audition to another frequency pair. In an interesting asymmetry, crossmodal learning transferred to all visual tasks but not to auditory tasks. Finally, in all conditions, learning to make TOJs for stimulus onsets did not transfer at all to discriminating temporal offsets. These data present a complex picture of timing processes. The lack of transfer between unimodal groups indicates no central supramodal timing process for this task; however, the audiovisual-to-visual transfer cannot be explained without some form of sensory interaction. We propose that auditory learning occurred in frequency-tuned processes in the periphery, precluding interactions with more central visual and audiovisual timing processes. 
Functionally, the patterns of featural transfer suggest that perceptual learning of temporal order may be optimised to object-centered rather than viewer-centered constraints.

  3. Visual Sensory and Visual-Cognitive Function and Rate of Crash and Near-Crash Involvement Among Older Drivers Using Naturalistic Driving Data

    PubMed Central

    Huisingh, Carrie; Levitan, Emily B.; Irvin, Marguerite R.; MacLennan, Paul; Wadley, Virginia; Owsley, Cynthia

    2017-01-01

    Purpose An innovative methodology using naturalistic driving data was used to examine the association between visual sensory and visual-cognitive function and rates of future crash or near-crash involvement among older drivers. Methods The Strategic Highway Research Program (SHRP2) Naturalistic Driving Study was used for this prospective analysis. The sample consisted of N = 659 drivers aged ≥70 years and study participation lasted 1 or 2 years for most participants. Distance and near visual acuity, contrast sensitivity, peripheral vision, visual processing speed, and visuospatial skills were assessed at baseline. Crash and near-crash involvement were based on video recordings and vehicle sensors. Poisson regression models were used to generate crude and adjusted rate ratios (RRs) and 95% confidence intervals, while accounting for person-miles of travel. Results After adjustment, severe impairment of the useful field of view (RR = 1.33) was associated with an increased rate of near-crash involvement. Crash, severe crash, and at-fault crash involvement were associated with impaired contrast sensitivity in the worse eye (RRs = 1.38, 1.54, and 1.44, respectively) and far peripheral field loss in both eyes (RRs = 1.74, 2.32, and 1.73, respectively). Conclusions Naturalistic driving data suggest that contrast sensitivity in the worse eye and far peripheral field loss in both eyes elevate the rates of crash involvement, and impaired visual processing speed elevates rates of near-crash involvement among older drivers. Naturalistic driving data may ultimately be critical for understanding the relationship between vision and driving safety. PMID:28605807
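    Rate ratios of this kind are ratios of event rates with person-miles of travel as the exposure denominator; for two groups, a crude rate ratio and its 95% Wald confidence interval can be computed directly. The counts below are invented for illustration, not taken from the study:

    ```python
    import math

    def rate_ratio(events_exposed, miles_exposed, events_ref, miles_ref):
        """Crude rate ratio (per-mile event rate in the exposed group relative
        to the reference group) with a 95% Wald CI on the log scale."""
        rr = (events_exposed / miles_exposed) / (events_ref / miles_ref)
        se_log = math.sqrt(1.0 / events_exposed + 1.0 / events_ref)
        lo = math.exp(math.log(rr) - 1.96 * se_log)
        hi = math.exp(math.log(rr) + 1.96 * se_log)
        return rr, (lo, hi)

    # Invented example: 30 crashes over 100k miles in drivers with impaired
    # contrast sensitivity vs 25 crashes over 120k miles in unimpaired drivers
    rr, ci = rate_ratio(30, 100_000, 25, 120_000)
    ```

    The adjusted rate ratios reported in the abstract come from Poisson regression with log person-miles as an offset, which generalizes this crude calculation while controlling for covariates.
    
    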

  4. Sensory experience ratings (SERs) for 1,659 French words: Relationships with other psycholinguistic variables and visual word recognition.

    PubMed

    Bonin, Patrick; Méot, Alain; Ferrand, Ludovic; Bugaïska, Aurélia

    2015-09-01

    We collected sensory experience ratings (SERs) for 1,659 French words in adults. Sensory experience for words is a recently introduced variable that corresponds to the degree to which words elicit sensory and perceptual experiences (Juhasz & Yap Behavior Research Methods, 45, 160-168, 2013; Juhasz, Yap, Dicke, Taylor, & Gullick Quarterly Journal of Experimental Psychology, 64, 1683-1691, 2011). The relationships of the sensory experience norms with other psycholinguistic variables (e.g., imageability and age of acquisition) were analyzed. We also investigated the degree to which SER predicted performance in visual word recognition tasks (lexical decision, word naming, and progressive demasking). The analyses indicated that SER reliably predicted response times in lexical decision, but not in word naming or progressive demasking. The findings are discussed in relation to the status of SER, the role of semantic code activation in visual word recognition, and the embodied view of cognition.

  5. Psychology and the Handicapped Child.

    ERIC Educational Resources Information Center

    Sherrick, Carl E., Ed.; And Others

    Reviewed in seven author-contributed chapters are findings of experimental psychology relevant to the education of handicapped children in the areas of sensory processes, visual perception, memory, cognition and language development, sustained attention and impulse control, and personality and social development. Noted in an introductory chapter…

  6. ERGONOMICS ABSTRACTS 48347-48982.

    ERIC Educational Resources Information Center

    Ministry of Technology, London (England). Warren Spring Lab.

    In this collection of ergonomics abstracts and annotations, the following areas of concern are represented: general references; methods, facilities, and equipment relating to ergonomics; systems of man and machines; visual, auditory, and other sensory inputs and processes (including speech and intelligibility); input channels; body measurements,…

  7. Audiovisual Modulation in Mouse Primary Visual Cortex Depends on Cross-Modal Stimulus Configuration and Congruency.

    PubMed

    Meijer, Guido T; Montijn, Jorrit S; Pennartz, Cyriel M A; Lansink, Carien S

    2017-09-06

    The sensory neocortex is a highly connected associative network that integrates information from multiple senses, even at the level of the primary sensory areas. Although a growing body of empirical evidence supports this view, the neural mechanisms of cross-modal integration in primary sensory areas, such as the primary visual cortex (V1), are still largely unknown. Using two-photon calcium imaging in awake mice, we show that the encoding of audiovisual stimuli in V1 neuronal populations is highly dependent on the features of the stimulus constituents. When the visual and auditory stimulus features were modulated at the same rate (i.e., temporally congruent), neurons responded with either an enhancement or suppression compared with unisensory visual stimuli, and their prevalence was balanced. Temporally incongruent tones or white-noise bursts included in audiovisual stimulus pairs resulted in predominant response suppression across the neuronal population. Visual contrast did not influence multisensory processing when the audiovisual stimulus pairs were congruent; however, when white-noise bursts were used, neurons generally showed response suppression when the visual stimulus contrast was high whereas this effect was absent when the visual contrast was low. Furthermore, a small fraction of V1 neurons, predominantly those located near the lateral border of V1, responded to sound alone. These results show that V1 is involved in the encoding of cross-modal interactions in a more versatile way than previously thought. SIGNIFICANCE STATEMENT The neural substrate of cross-modal integration is not limited to specialized cortical association areas but extends to primary sensory areas. Using two-photon imaging of large groups of neurons, we show that multisensory modulation of V1 populations is strongly determined by the individual and shared features of cross-modal stimulus constituents, such as contrast, frequency, congruency, and temporal structure. 
Congruent audiovisual stimulation resulted in a balanced pattern of response enhancement and suppression compared with unisensory visual stimuli, whereas incongruent or dissimilar stimuli at full contrast gave rise to a population dominated by response-suppressing neurons. Our results indicate that V1 dynamically integrates nonvisual sources of information while still attributing most of its resources to coding visual information. Copyright © 2017 the authors 0270-6474/17/378783-14$15.00/0.

  8. Saccadic Eye Movements Impose a Natural Bottleneck on Visual Short-Term Memory

    ERIC Educational Resources Information Center

    Ohl, Sven; Rolfs, Martin

    2017-01-01

    Visual short-term memory (VSTM) is a crucial repository of information when events unfold rapidly before our eyes, yet it maintains only a fraction of the sensory information encoded by the visual system. Here, we tested the hypothesis that saccadic eye movements provide a natural bottleneck for the transition of fragile content in sensory memory…

  9. Auditory-musical processing in autism spectrum disorders: a review of behavioral and brain imaging studies.

    PubMed

    Ouimet, Tia; Foster, Nicholas E V; Tryfon, Ana; Hyde, Krista L

    2012-04-01

    Autism spectrum disorder (ASD) is a complex neurodevelopmental condition characterized by atypical social and communication skills, repetitive behaviors, and atypical visual and auditory perception. Studies in vision have reported enhanced detailed ("local") processing but diminished holistic ("global") processing of visual features in ASD. Individuals with ASD also show enhanced processing of simple visual stimuli but diminished processing of complex visual stimuli. Relative to the visual domain, auditory global-local distinctions, and the effects of stimulus complexity on auditory processing in ASD, are less clear. However, one remarkable finding is that many individuals with ASD have enhanced musical abilities, such as superior pitch processing. This review provides a critical evaluation of behavioral and brain imaging studies of auditory processing with respect to current theories in ASD. We have focused on auditory-musical processing in terms of global versus local processing and simple versus complex sound processing. This review contributes to a better understanding of auditory processing differences in ASD. A deeper comprehension of sensory perception in ASD is key to better defining ASD phenotypes and, in turn, may lead to better interventions. © 2012 New York Academy of Sciences.

  10. An Evaluation of the Role of Sensory Drive in the Evolution of Lake Malawi Cichlid Fishes

    PubMed Central

    Smith, Adam R.; van Staaden, Moira J.; Carleton, Karen L.

    2012-01-01

    Although the cichlids of Lake Malawi are an important model system for the study of sensory evolution and sexual selection, the evolutionary processes linking these two phenomena remain unclear. Prior works have proposed that evolutionary divergence is driven by sensory drive, particularly as it applies to the visual system. While evidence suggests that sensory drive has played a role in the speciation of Lake Victoria cichlids, the findings from several lines of research on cichlids of Lake Malawi are not consistent with the primary tenets of this hypothesis. More specifically, three observations make the sensory drive model implausible in Malawi: (i) a lack of environmental constraint due to a broad and intense ambient light spectrum in species rich littoral habitats, (ii) pronounced variation in receiver sensory characteristics, and (iii) pronounced variability in male courtship signal characteristics. In the following work, we synthesize the results from recent studies to draw attention to the importance of sensory variation in cichlid evolution and speciation, and we suggest possible avenues of future research. PMID:22779029

  11. STANDARDS OF FUNCTIONAL MEASUREMENTS IN OCULAR TOXICOLOGY.

    EPA Science Inventory

    The visual system, like other sensory systems, may be a frequent target of exposure to toxic chemicals. A thorough evaluation of visual toxicity should include both structural and functional measures. Sensory evoked potentials are one set of neurophysiological procedures that...

  12. Premotor neural correlates of predictive motor timing for speech production and hand movement: evidence for a temporal predictive code in the motor system.

    PubMed

    Johari, Karim; Behroozmand, Roozbeh

    2017-05-01

    The predictive coding model suggests that neural processing of sensory information is facilitated for temporally-predictable stimuli. This study investigated how temporal processing of visually-presented sensory cues modulates movement reaction time and neural activities in speech and hand motor systems. Event-related potentials (ERPs) were recorded in 13 subjects while they were visually-cued to prepare to produce a steady vocalization of a vowel sound or press a button in a randomized order, and to initiate the cued movement following the onset of a go signal on the screen. The experiment was conducted in two counterbalanced blocks in which the time interval between the visual cue and the go signal was temporally predictable (fixed delay of 1000 ms) or unpredictable (variable between 1000 and 2000 ms). Results of the behavioral response analysis indicated that movement reaction time was significantly decreased for temporally-predictable stimuli in both speech and hand modalities. We identified premotor ERP activities with a left-lateralized parietal distribution for hand and a frontocentral distribution for speech that were significantly suppressed in response to temporally-predictable compared with unpredictable stimuli. The premotor ERPs emerged approximately 100 ms before movement onset and were significantly correlated with speech and hand motor reaction times only in response to temporally-predictable stimuli. These findings suggest that the motor system establishes a predictive code to facilitate movement in response to temporally-predictable sensory stimuli. Our data suggest that the premotor ERP activities are robust neurophysiological biomarkers of such predictive coding mechanisms. These findings provide novel insights into the temporal processing mechanisms of speech and hand motor systems.

  13. Axonal Conduction Delays, Brain State, and Corticogeniculate Communication

    PubMed Central

    2017-01-01

    Thalamocortical conduction times are short, but layer 6 corticothalamic axons display an enormous range of conduction times, some exceeding 40–50 ms. Here, we investigate (1) how axonal conduction times of corticogeniculate (CG) neurons are related to the visual information conveyed to the thalamus, and (2) how alert versus nonalert awake brain states affect visual processing across the spectrum of CG conduction times. In awake female Dutch-Belted rabbits, we found 58% of CG neurons to be visually responsive, and 42% to be unresponsive. All responsive CG neurons had simple, orientation-selective receptive fields, and generated sustained responses to stationary stimuli. CG axonal conduction times were strongly related to modulated firing rates (F1 values) generated by drifting grating stimuli, and their associated interspike interval distributions, suggesting a continuum of visual responsiveness spanning the spectrum of axonal conduction times. CG conduction times were also significantly related to visual response latency, contrast sensitivity (C-50 values), directional selectivity, and optimal stimulus velocity. Increasing alertness did not cause visually unresponsive CG neurons to become responsive and did not change the response linearity (F1/F0 ratios) of visually responsive CG neurons. However, for visually responsive CG neurons, increased alertness nearly doubled the modulated response amplitude to optimal visual stimulation (F1 values), significantly shortened response latency, and dramatically increased response reliability. These effects of alertness were uniform across the broad spectrum of CG axonal conduction times. SIGNIFICANCE STATEMENT Corticothalamic neurons of layer 6 send a dense feedback projection to thalamic nuclei that provide input to sensory neocortex. While sensory information reaches the cortex after brief thalamocortical axonal delays, corticothalamic axons can exhibit conduction delays of <2 ms to 40–50 ms. 
Here, in the corticogeniculate visual system of awake rabbits, we investigate the functional significance of this axonal diversity, and the effects of shifting alert/nonalert brain states on corticogeniculate processing. We show that axonal conduction times are strongly related to multiple visual response properties, suggesting a continuum of visual responsiveness spanning the spectrum of corticogeniculate axonal conduction times. We also show that transitions between awake brain states powerfully affect corticogeniculate processing, in some ways more strongly than in layer 4. PMID:28559382

  14. Sensory experience modifies feature map relationships in visual cortex

    PubMed Central

    Cloherty, Shaun L; Hughes, Nicholas J; Hietanen, Markus A; Bhagavatula, Partha S

    2016-01-01

    The extent to which brain structure is influenced by sensory input during development is a critical but controversial question. A paradigmatic system for studying this is the mammalian visual cortex. Maps of orientation preference (OP) and ocular dominance (OD) in the primary visual cortex of ferrets, cats and monkeys can be individually changed by altered visual input. However, the spatial relationship between OP and OD maps has appeared immutable. Using a computational model we predicted that biasing the visual input to orthogonal orientation in the two eyes should cause a shift of OP pinwheels towards the border of OD columns. We then confirmed this prediction by rearing cats wearing orthogonally oriented cylindrical lenses over each eye. Thus, the spatial relationship between OP and OD maps can be modified by visual experience, revealing a previously unknown degree of brain plasticity in response to sensory input. DOI: http://dx.doi.org/10.7554/eLife.13911.001 PMID:27310531

  15. Noise-Induced Entrainment and Stochastic Resonance in Human Brain Waves

    NASA Astrophysics Data System (ADS)

    Mori, Toshio; Kai, Shoichi

    2002-05-01

    We present the first observation of stochastic resonance (SR) in the human brain's visual processing area. The novel experimental protocol is to stimulate the right eye with a subthreshold periodic optical signal and the left eye with a noisy one. The two stimuli are thus not mixed at the sensory organs but converge in the visual cortex. With many noise sources present in the brain, higher brain functions, e.g., perception and cognition, may exploit SR.
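    Stochastic resonance can be demonstrated with a threshold detector driven by a subthreshold sine plus noise: with no noise the output is silent, with moderate noise the periodic signal is recovered, and with heavy noise it is drowned out again. A minimal simulation sketch, with all parameters illustrative rather than matched to the experiment:

    ```python
    import numpy as np

    def detection_power(noise_sd, amp=0.8, threshold=1.0, f=5.0,
                        fs=200.0, dur=20.0, seed=0):
        """Power at the stimulus frequency in the output of a simple threshold
        detector driven by a subthreshold sine plus Gaussian noise."""
        rng = np.random.default_rng(seed)
        t = np.arange(0, dur, 1.0 / fs)
        x = amp * np.sin(2 * np.pi * f * t) + noise_sd * rng.standard_normal(t.size)
        y = (x > threshold).astype(float)      # all-or-none "spike" output
        spectrum = np.abs(np.fft.rfft(y - y.mean())) ** 2
        k = int(round(f * dur))                # FFT bin of the stimulus frequency
        return spectrum[k]

    p_none = detection_power(0.0)   # subthreshold alone: no crossings at all
    p_some = detection_power(0.4)   # moderate noise: periodic signal recovered
    p_lots = detection_power(5.0)   # heavy noise: signal drowned out again
    ```

    The inverted-U dependence of output signal power on input noise level is the defining signature of stochastic resonance that the brain-wave measurements in this study exhibit.
    
    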

  16. Sensory signals and neuronal groups involved in guiding the sea-ward motor behavior in turtle hatchlings of Chelonia agassizi

    NASA Astrophysics Data System (ADS)

    Fuentes, A. L.; Camarena, V.; Ochoa, G.; Urrutia, J.; Gutierrez, G.

    2007-05-01

    Turtle hatchlings display sea-ward oriented movements as soon as they emerge from the nest. Although most studies have emphasized the role of visual information in this process, less attention has been paid to other sensory modalities. Here, we evaluated the nature of the sensory cues used by turtle hatchlings of Chelonia agassizi to orient their movements towards the ocean. We recorded the time they took to crawl from the nest to the beach front (120 m long) in control conditions and in visually, olfactorily and magnetically deprived circumstances. Visually-deprived hatchlings displayed a high degree of disorientation. Olfactory deprivation and magnetic field distortion impaired, but did not abolish, sea-ward oriented movements. With regard to the neuronal mapping experiments, visual deprivation dramatically reduced c-fos expression in the whole brain. Hatchlings with their nares blocked revealed neurons with c-fos expression above control levels principally in the c and d areas, while those subjected to magnetic field distortion had a widespread activation of neurons throughout the brain, predominantly in the dorsal ventricular ridge. The present results support the view that Chelonia agassizi hatchlings use predominantly visual cues to orient their movements towards the sea. Olfactory and magnetic cues may also be used, but their influence on hatchlings' oriented motor behavior is not as clear as it is for vision. This conclusion is supported by the fact that in the absence of olfactory and magnetic cues, the brain turns on the expression of c-fos in neuronal groups that, in the intact hatchling, are not normally involved in accomplishing the task.

  17. Appraisal of unimodal cues during agonistic interactions in Maylandia zebra

    PubMed Central

    Ben Ammar, Imen; Fernandez, Marie S.A.; Boyer, Nicolas; Attia, Joël; Fonseca, Paulo J.; Amorim, M. Clara P.; Beauchaud, Marilyn

    2017-01-01

    Communication is essential during social interactions, including animal conflicts, and it is often a complex process involving multiple sensory channels or modalities. To better understand how different modalities interact during communication, it is fundamental to study the behavioural responses to both the composite multimodal signal and each unimodal component with adequate experimental protocols. Here we test how an African cichlid, which communicates with multiple senses, responds to different sensory stimuli in a socially relevant scenario. We tested Maylandia zebra males with isolated chemical (urine or holding water, both coming from dominant males), visual (real opponent or video playback) and acoustic (agonistic sounds) cues during agonistic interactions. We showed that (1) these fish relied mostly on the visual modality, showing increased aggressiveness in response to the sight of a real contestant but no response to urine or agonistic sounds presented separately, (2) video playback in our study did not appear appropriate for testing the visual modality and requires further technical refinement, and (3) holding water provoked territorial behaviours and seems promising for investigating the role of the chemical channel in this species. Our findings suggest that unimodal signals are non-redundant, but how different sensory modalities interplay during communication remains largely unknown in fish. PMID:28785523

  18. Preservation of crossmodal selective attention in healthy aging

    PubMed Central

    Hugenschmidt, Christina E.; Peiffer, Ann M.; McCoy, Thomas P.; Hayasaka, Satoru; Laurienti, Paul J.

    2010-01-01

    The goal of the present study was to determine if older adults benefited from attention to a specific sensory modality in a voluntary attention task and evidenced changes in voluntary or involuntary attention when compared to younger adults. Suppressing and enhancing effects of voluntary attention were assessed using two cued forced-choice tasks, one that asked participants to localize and one that asked them to categorize visual and auditory targets. Involuntary attention was assessed using the same tasks, but with no attentional cues. The effects of attention were evaluated using traditional comparisons of means and Cox proportional hazards models. All analyses showed that older adults benefited behaviorally from selective attention in both visual and auditory conditions, including robust suppressive effects of attention. Of note, the performance of the older adults was commensurate with that of younger adults in almost all analyses, suggesting that older adults can successfully engage crossmodal attention processes. Thus, age-related increases in distractibility across sensory modalities are likely due to mechanisms other than deficits in attentional processing. PMID:19404621

  19. Phosphodiesterase Inhibition Increases CREB Phosphorylation and Restores Orientation Selectivity in a Model of Fetal Alcohol Spectrum Disorders

    PubMed Central

    Krahe, Thomas E.; Wang, Weili; Medina, Alexandre E.

    2009-01-01

    Background Fetal alcohol spectrum disorders (FASD) are the leading cause of mental retardation in the western world, and children with FASD present altered somatosensory, auditory and visual processing. There is growing evidence that some of these sensory processing problems may be related to altered cortical maps caused by impaired developmental neuronal plasticity. Methodology/Principal Findings Here we show that the primary visual cortex of ferrets exposed to alcohol during the third trimester equivalent of human gestation has decreased CREB phosphorylation and poor orientation selectivity, as revealed by western blotting, optical imaging of intrinsic signals and single-unit extracellular recording techniques. Treating animals several days after the period of alcohol exposure with a phosphodiesterase type 1 inhibitor (vinpocetine) increased CREB phosphorylation and restored orientation selectivity columns and neuronal orientation tuning. Conclusions/Significance These findings suggest that CREB function is important for the maturation of orientation selectivity and that plasticity enhancement by vinpocetine may play a role in the treatment of sensory problems in FASD. PMID:19680548

  20. Hunger-Dependent Enhancement of Food Cue Responses in Mouse Postrhinal Cortex and Lateral Amygdala.

    PubMed

    Burgess, Christian R; Ramesh, Rohan N; Sugden, Arthur U; Levandowski, Kirsten M; Minnig, Margaret A; Fenselau, Henning; Lowell, Bradford B; Andermann, Mark L

    2016-09-07

    The needs of the body can direct behavioral and neural processing toward motivationally relevant sensory cues. For example, human imaging studies have consistently found specific cortical areas with biased responses to food-associated visual cues in hungry subjects, but not in sated subjects. To obtain a cellular-level understanding of these hunger-dependent cortical response biases, we performed chronic two-photon calcium imaging in postrhinal association cortex (POR) and primary visual cortex (V1) of behaving mice. As in humans, neurons in mouse POR, but not V1, exhibited biases toward food-associated cues that were abolished by satiety. This emergent bias was mirrored by the innervation pattern of amygdalo-cortical feedback axons. Strikingly, these axons exhibited even stronger food cue biases and sensitivity to hunger state and trial history. These findings highlight a direct pathway by which the lateral amygdala may contribute to state-dependent cortical processing of motivationally relevant sensory cues. Published by Elsevier Inc.

  1. A comprehensive wiring diagram of the protocerebral bridge for visual information processing in the Drosophila brain.

    PubMed

    Lin, Chih-Yung; Chuang, Chao-Chun; Hua, Tzu-En; Chen, Chun-Chao; Dickson, Barry J; Greenspan, Ralph J; Chiang, Ann-Shyn

    2013-05-30

    How the brain perceives sensory information and generates meaningful behavior depends critically on its underlying circuitry. The protocerebral bridge (PB) is a major part of the insect central complex (CX), a premotor center that may be analogous to the human basal ganglia. Here, by deconstructing hundreds of PB single neurons and reconstructing them into a common three-dimensional framework, we have constructed a comprehensive map of PB circuits with labeled polarity and predicted directions of information flow. Our analysis reveals a highly ordered information processing system that involves directed information flow among CX subunits through 194 distinct PB neuron types. Circuitry properties such as mirroring, convergence, divergence, tiling, reverberation, and parallel signal propagation were observed; their functional and evolutionary significance is discussed. This layout of PB neuronal circuitry may provide guidelines for further investigations on transformation of sensory (e.g., visual) input into locomotor commands in fly brains. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.

  2. Dynamics of cortico-subcortical cross-modal operations involved in audio-visual object detection in humans.

    PubMed

    Fort, Alexandra; Delpuech, Claude; Pernier, Jacques; Giard, Marie-Hélène

    2002-10-01

    Very recently, a number of neuroimaging studies in humans have begun to investigate the question of how the brain integrates information from different sensory modalities to form unified percepts. Already, intermodal neural processing appears to depend on the modalities of inputs or the nature (speech/non-speech) of the information to be combined. Yet, the variety of paradigms, stimuli and techniques used makes it difficult to understand the relationships between the factors operating at the perceptual level and the underlying physiological processes. In a previous experiment, we used event-related potentials to describe the spatio-temporal organization of audio-visual interactions during a bimodal object recognition task. Here we examined the network of cross-modal interactions involved in simple detection of the same objects. The objects were defined either by unimodal auditory or visual features alone, or by the combination of the two features. As expected, subjects detected bimodal stimuli more rapidly than either unimodal stimulus. Combined analysis of potentials, scalp current densities and dipole modeling revealed several interaction patterns within the first 200 ms post-stimulus: in occipito-parietal visual areas (45-85 ms), in deep brain structures, possibly the superior colliculus (105-140 ms), and in right temporo-frontal regions (170-185 ms). These interactions differed from those found during object identification in sensory-specific areas and possibly in the superior colliculus, indicating that the neural operations governing multisensory integration depend crucially on the nature of the perceptual processes involved.

  3. Maintenance of relational information in working memory leads to suppression of the sensory cortex.

    PubMed

    Ikkai, Akiko; Blacker, Kara J; Lakshmanan, Balaji M; Ewen, Joshua B; Courtney, Susan M

    2014-10-15

    Working memory (WM) for sensory-based information about individual objects and their locations appears to involve interactions between lateral prefrontal and sensory cortices. The mechanisms and representations for maintenance of more abstract, nonsensory information in WM are unknown, particularly whether such actively maintained information can become independent of the sensory information from which it was derived. Previous studies of WM for individual visual items found increased electroencephalogram (EEG) alpha (8-13 Hz) power over posterior electrode sites, which appears to correspond to the suppression of cortical areas that represent irrelevant sensory information. Here, we recorded EEG while participants performed a visual WM task that involved maintaining either concrete spatial coordinates or abstract relational information. Maintenance of relational information resulted in higher alpha power in posterior electrodes. Furthermore, lateralization of alpha power due to a covert shift of attention to one visual hemifield was marginally weaker during storage of relational information than during storage of concrete information. These results suggest that abstract relational information is maintained in WM differently from concrete, sensory representations and that during maintenance of abstract information, posterior sensory regions become task irrelevant and are thus suppressed. Copyright © 2014 the American Physiological Society.

  4. The "serendipitous brain": Low expectancy and timing uncertainty of conscious events improve awareness of unconscious ones (evidence from the Attentional Blink).

    PubMed

    Lasaponara, Stefano; Dragone, Alessio; Lecce, Francesca; Di Russo, Francesco; Doricchi, Fabrizio

    2015-10-01

    To anticipate upcoming sensory events, the brain picks up and exploits statistical regularities in the sensory environment. However, it is untested whether accumulated predictive knowledge about consciously seen stimuli improves the access to awareness of stimuli that usually go unseen. To explore this issue, we exploited the Attentional Blink (AB) effect, where conscious processing of a first visual target (T1) hinders detection of early following targets (T2). We report that timing uncertainty and low expectancy about the occurrence of consciously seen T2s presented outside the AB period improve detection of early and otherwise often unseen T2s presented inside the AB. Recording of high-resolution Event Related Potentials (ERPs) and the study of their intracranial sources showed that the brain achieves this improvement by initially amplifying and extending the pre-conscious storage of T2s' traces signalled by the N2 wave originating in the extra-striate cortex. This enhancement in the N2 wave is followed by specific changes in the latency and amplitude of later components in the P3 wave (P3a and P3b), signalling access of the sensory trace to the network of parietal and frontal areas modulating conscious processing. These findings show that the interaction between conscious and unconscious processing changes adaptively as a function of the probabilistic properties of the sensory environment and that the combination of an active attentional state with loose probabilistic and temporal expectancies on forthcoming conscious events favors the emergence to awareness of otherwise unnoticed visual events. This likely provides an insight on the attentional conditions that predispose an active observer to unexpected "serendipitous" findings. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Interoceptive signals impact visual processing: Cardiac modulation of visual body perception.

    PubMed

    Ronchi, Roberta; Bernasconi, Fosco; Pfeiffer, Christian; Bello-Ruiz, Javier; Kaliuzhna, Mariia; Blanke, Olaf

    2017-09-01

    Multisensory perception research has largely focused on exteroceptive signals, but recent evidence has revealed the integration of interoceptive signals with exteroceptive information. Such research revealed that heartbeat signals affect sensory (e.g., visual) processing; however, it is unknown how they impact the perception of body images. Here we linked our participants' heartbeat to visual stimuli and investigated the spatio-temporal brain dynamics of cardio-visual stimulation on the processing of human body images. We recorded visual evoked potentials with 64-channel electroencephalography while showing a body or a scrambled-body (control) that appeared either at the frequency of the participants' heartbeat, recorded on-line, or not (non-synchronous, control). Extending earlier studies, we found a body-independent effect, with cardiac signals enhancing visual processing during two time periods (77-130 ms and 145-246 ms). Within the second (later) time-window we detected a second effect characterised by enhanced activity in parietal, temporo-occipital, inferior frontal, and right basal ganglia-insula regions, but only when non-scrambled body images were flashed synchronously with the heartbeat (208-224 ms). In conclusion, our results highlight the role of interoceptive information for the visual processing of human body pictures within a network integrating cardio-visual signals of relevance for perceptual and cognitive aspects of visual body processing. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Elemental or contextual? It depends: individual difference in the hippocampal dependence of associative learning for a simple sensory stimulus

    PubMed Central

    Lee, Kyung J.; Park, Seong-Beom; Lee, Inah

    2014-01-01

    Learning theories categorize learning systems into elemental and contextual systems, the former being processed by non-hippocampal regions and the latter being processed in the hippocampus. A set of complex stimuli such as a visual background is often considered a contextual stimulus and simple sensory stimuli such as pure tone and light are considered elemental stimuli. However, this elemental-contextual categorization scheme has only been tested in limited behavioral paradigms and it is largely unknown whether it can be generalized across different learning situations. By requiring rats to respond differently to a common object in association with various types of sensory cues including contextual and elemental stimuli, we tested whether different types of elemental and contextual sensory stimuli depended on the hippocampus to different degrees. In most rats, a surrounding visual background and a tactile stimulus served as contextual (hippocampal dependent) and elemental (non-hippocampal dependent) stimuli, respectively. However, simple tone and light stimuli frequently used as elemental cues in traditional experiments required the hippocampus to varying degrees among rats. Specifically, one group of rats showed a normal contextual bias when both contextual and elemental cues were present. These rats effectively switched to using elemental cues when the hippocampus was inactivated. The other group showed a strong contextual bias (and hippocampal dependence) because these rats were not able to use elemental cues when the hippocampus was unavailable. It is possible that the latter group of rats might have interpreted the elemental cues (light and tone) as background stimuli and depended more on the hippocampus in associating the cues with choice responses. 
Although the exact mechanisms underlying these individual differences are unclear, our findings recommend caution in adopting a simple sensory stimulus as a non-hippocampal sensory cue based only on the literature. PMID:24982624

  7. Global Sensory Qualities and Aesthetic Experience in Music.

    PubMed

    Brattico, Pauli; Brattico, Elvira; Vuust, Peter

    2017-01-01

    A well-known tradition in the study of visual aesthetics holds that the experience of visual beauty is grounded in global computational or statistical properties of the stimulus, for example, scale-invariant Fourier spectrum or self-similarity. Some approaches rely on neural mechanisms, such as efficient computation, processing fluency, or the responsiveness of the cells in the primary visual cortex. These proposals are united by the fact that the contributing factors are hypothesized to be global (i.e., they concern the percept as a whole), formal or non-conceptual (i.e., they concern form instead of content), computational and/or statistical, and based on relatively low-level sensory properties. Here we consider that the study of aesthetic responses to music could benefit from the same approach. Thus, along with local features such as pitch, tuning, consonance/dissonance, harmony, timbre, or beat, also global sonic properties could be viewed as contributing toward creating an aesthetic musical experience. Several such properties are discussed and their neural implementation is reviewed in the light of recent advances in neuroaesthetics.
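    The "global statistical properties" mentioned above are straightforward to compute. A minimal sketch (naive DFT on invented signals, not the stimuli of any study cited here) estimates the log-log slope of a signal's amplitude spectrum, the quantity behind "scale-invariant Fourier spectrum" claims: a 1/f-like signal (here a random walk) gives a steep negative slope, while white noise gives a flat one.

```python
import cmath
import math
import random

def amplitude_spectrum(x):
    """Naive O(n^2) DFT amplitude spectrum (fine for short signals);
    the DC bin (k=0) is skipped so logarithms are well defined."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n)))
            for k in range(1, n // 2)]

def loglog_slope(spec):
    """Least-squares slope of log amplitude vs. log frequency index."""
    pts = [(math.log(k), math.log(a)) for k, a in enumerate(spec, start=1) if a > 0]
    mx = sum(px for px, _ in pts) / len(pts)
    my = sum(py for _, py in pts) / len(pts)
    num = sum((px - mx) * (py - my) for px, py in pts)
    den = sum((px - mx) ** 2 for px, _ in pts)
    return num / den

rng = random.Random(0)
white = [rng.gauss(0, 1) for _ in range(256)]
walk = [0.0]
for _ in range(255):
    walk.append(walk[-1] + rng.gauss(0, 1))

slope_white = loglog_slope(amplitude_spectrum(white))  # near 0 (flat spectrum)
slope_walk = loglog_slope(amplitude_spectrum(walk))    # steeply negative (1/f-like amplitude)
```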

  8. Auditory biofeedback substitutes for loss of sensory information in maintaining stance.

    PubMed

    Dozza, Marco; Horak, Fay B; Chiari, Lorenzo

    2007-03-01

    The importance of sensory feedback for postural control in stance is evident from the balance improvements occurring when sensory information from the vestibular, somatosensory, and visual systems is available. However, the extent to which audio-biofeedback (ABF) information can also improve balance has not been determined. It is also unknown why additional artificial sensory feedback is more effective for some subjects than others and in some environmental contexts than others. The aim of this study was to determine the relative effectiveness of an ABF system to reduce postural sway in stance in healthy control subjects and in subjects with bilateral vestibular loss, under conditions of reduced vestibular, visual, and somatosensory inputs. This ABF system used a threshold region and non-linear scaling parameters customized for each individual, to provide subjects with pitch and volume coding of their body sway. ABF had the largest effect on reducing the body sway of the subjects with bilateral vestibular loss when the environment provided limited visual and somatosensory information; it had the smallest effect on reducing the sway of subjects with bilateral vestibular loss when the environment provided full somatosensory information. The extent to which all subjects substituted ABF information for their loss of sensory information was related to the extent to which each subject was visually dependent or somatosensory-dependent for their postural control. Comparison of postural sway under a variety of sensory conditions suggests that patients with profound bilateral loss of vestibular function show larger than normal information redundancy among the remaining senses and ABF of trunk sway. The results support the hypothesis that the nervous system uses augmented sensory information differently depending both on the environment and on individual proclivities to rely on vestibular, somatosensory or visual information to control sway.

  9. Dynamic modulation of visual and electrosensory gains for locomotor control

    PubMed Central

    Sutton, Erin E.; Demir, Alican; Stamper, Sarah A.; Fortune, Eric S.; Cowan, Noah J.

    2016-01-01

    Animal nervous systems resolve sensory conflict for the control of movement. For example, the glass knifefish, Eigenmannia virescens, relies on visual and electrosensory feedback as it swims to maintain position within a moving refuge. To study how signals from these two parallel sensory streams are used in refuge tracking, we constructed a novel augmented reality apparatus that enables the independent manipulation of visual and electrosensory cues to freely swimming fish (n = 5). We evaluated the linearity of multisensory integration, the change to the relative perceptual weights given to vision and electrosense in relation to sensory salience, and the effect of the magnitude of sensory conflict on sensorimotor gain. First, we found that tracking behaviour obeys superposition of the sensory inputs, suggesting linear sensorimotor integration. In addition, fish rely more on vision when electrosensory salience is reduced, suggesting that fish dynamically alter sensorimotor gains in a manner consistent with Bayesian integration. However, the magnitude of sensory conflict did not significantly affect sensorimotor gain. These studies lay the theoretical and experimental groundwork for future work investigating multisensory control of locomotion. PMID:27170650
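    The Bayesian-integration interpretation above has a compact form: each cue's weight is proportional to its reliability (inverse variance), so degrading one channel shifts the combined estimate toward the other. A minimal sketch with hypothetical numbers, not the paper's data:

```python
def combine(mu_vis, var_vis, mu_elec, var_elec):
    """Reliability-weighted (Bayesian) cue combination: each estimate is
    weighted by its inverse variance. Returns the combined estimate and
    the weight given to vision."""
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_elec)
    return w_vis * mu_vis + (1.0 - w_vis) * mu_elec, w_vis

# Equally reliable cues: equal weights, estimate at the midpoint.
est_eq, w_eq = combine(1.0, 0.5, 3.0, 0.5)
# Electrosense degraded (variance up): the weight, and the estimate, shift toward vision.
est_deg, w_deg = combine(1.0, 0.5, 3.0, 2.0)
```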

  10. Sensory Prioritization in Rats: Behavioral Performance and Neuronal Correlates.

    PubMed

    Lee, Conrad C Y; Diamond, Mathew E; Arabzadeh, Ehsan

    2016-03-16

    Operating with some finite quantity of processing resources, an animal would benefit from prioritizing the sensory modality expected to provide key information in a particular context. The present study investigated whether rats dedicate attentional resources to the sensory modality in which a near-threshold event is more likely to occur. We manipulated attention by controlling the likelihood with which a stimulus was presented from one of two modalities. In a whisker session, 80% of trials contained a brief vibration stimulus applied to whiskers and the remaining 20% of trials contained a brief change of luminance. These likelihoods were reversed in a visual session. When a stimulus was presented in the high-likelihood context, detection performance increased and was faster compared with the same stimulus presented in the low-likelihood context. Sensory prioritization was also reflected in neuronal activity in the vibrissal area of primary somatosensory cortex: single units responded differentially to the whisker vibration stimulus when presented with higher probability compared with lower probability. Neuronal activity in the vibrissal cortex displayed signatures of multiplicative gain control and enhanced response to vibration stimuli during the whisker session. In conclusion, rats allocate priority to the more likely stimulus modality and the primary sensory cortex may participate in the redistribution of resources. Detection of low-amplitude events is critical to survival; for example, to warn prey of predators. To formulate a response, decision-making systems must extract minute neuronal signals from the sensory modality that provides key information. Here, we identify the behavioral and neuronal correlates of sensory prioritization in rats. Rats were trained to detect whisker vibrations or visual flickers. Stimuli were embedded in two contexts in which either visual or whisker modality was more likely to occur. 
When a stimulus was presented in the high-likelihood context, detection was faster and more reliable. Neuronal recording from the vibrissal cortex revealed enhanced representation of vibrations in the prioritized context. These results establish the rat as an alternative model organism to primates for studying attention. Copyright © 2016 the authors.

  11. How actions shape perception: learning action-outcome relations and predicting sensory outcomes promote audio-visual temporal binding

    PubMed Central

    Desantis, Andrea; Haggard, Patrick

    2016-01-01

    To maintain a temporally-unified representation of audio and visual features of objects in our environment, the brain recalibrates audio-visual simultaneity. This process allows adjustment for both differences in time of transmission and time for processing of audio and visual signals. In four experiments, we show that the cognitive processes for controlling instrumental actions also have strong influence on audio-visual recalibration. Participants learned that right and left hand button-presses each produced a specific audio-visual stimulus. Following one action the audio preceded the visual stimulus, while for the other action audio lagged vision. In a subsequent test phase, left and right button-press generated either the same audio-visual stimulus as learned initially, or the pair associated with the other action. We observed recalibration of simultaneity only for previously-learned audio-visual outcomes. Thus, learning an action-outcome relation promotes temporal grouping of the audio and visual events within the outcome pair, contributing to the creation of a temporally unified multisensory object. This suggests that learning action-outcome relations and the prediction of perceptual outcomes can provide an integrative temporal structure for our experiences of external events. PMID:27982063

  12. How actions shape perception: learning action-outcome relations and predicting sensory outcomes promote audio-visual temporal binding.

    PubMed

    Desantis, Andrea; Haggard, Patrick

    2016-12-16

    To maintain a temporally-unified representation of audio and visual features of objects in our environment, the brain recalibrates audio-visual simultaneity. This process allows adjustment for both differences in time of transmission and time for processing of audio and visual signals. In four experiments, we show that the cognitive processes for controlling instrumental actions also have strong influence on audio-visual recalibration. Participants learned that right and left hand button-presses each produced a specific audio-visual stimulus. Following one action the audio preceded the visual stimulus, while for the other action audio lagged vision. In a subsequent test phase, left and right button-press generated either the same audio-visual stimulus as learned initially, or the pair associated with the other action. We observed recalibration of simultaneity only for previously-learned audio-visual outcomes. Thus, learning an action-outcome relation promotes temporal grouping of the audio and visual events within the outcome pair, contributing to the creation of a temporally unified multisensory object. This suggests that learning action-outcome relations and the prediction of perceptual outcomes can provide an integrative temporal structure for our experiences of external events.

  13. The Sensory Components of High-Capacity Iconic Memory and Visual Working Memory

    PubMed Central

    Bradley, Claire; Pearson, Joel

    2012-01-01

    Early visual memory can be split into two primary components: a high-capacity, short-lived iconic memory followed by a limited-capacity visual working memory that can last many seconds. Whereas a large number of studies have investigated visual working memory for low-level sensory features, much research on iconic memory has used more “high-level” alphanumeric stimuli such as letters or numbers. These two forms of memory are typically examined separately, despite an intrinsic overlap in their characteristics. Here, we used a purely sensory paradigm to examine visual short-term memory for 10 homogeneous items of three different visual features (color, orientation and motion) across a range of durations from 0 to 6 s. We found that the amount of information stored in iconic memory is smaller for motion than for color or orientation. Performance declined exponentially with longer storage durations and reached chance levels after ∼2 s. Further experiments showed that performance for the 10 items at 1 s was contingent on unperturbed attentional resources. In addition, for orientation stimuli, performance was contingent on the location of stimuli in the visual field, especially for short cue delays. Overall, our results suggest a smooth transition between an automatic, high-capacity, feature-specific sensory-iconic memory, and an effortful “lower-capacity” visual working memory. PMID:23055993
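    The decay profile described above (exponential decline toward chance within roughly 2 s) can be written as a one-line model. A sketch with illustrative parameters; p0 and tau here are not fitted values from the paper:

```python
import math

def recall_accuracy(t, p0=0.9, chance=0.5, tau=0.7):
    """Toy exponential-decay model of partial-report accuracy:
    starts at p0 at cue delay t = 0 and decays toward chance
    with time constant tau (seconds)."""
    return chance + (p0 - chance) * math.exp(-t / tau)

# Accuracy falls monotonically with cue delay and is near chance by ~2 s.
early, mid, late = recall_accuracy(0.0), recall_accuracy(1.0), recall_accuracy(2.0)
```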

  14. The influence of spontaneous activity on stimulus processing in primary visual cortex.

    PubMed

    Schölvinck, M L; Friston, K J; Rees, G

    2012-02-01

    Spontaneous activity in the resting human brain has been studied extensively; however, how such activity affects the local processing of a sensory stimulus is relatively unknown. Here, we examined the impact of spontaneous activity in primary visual cortex on neuronal and behavioural responses to a simple visual stimulus, using functional MRI. Stimulus-evoked responses remained essentially unchanged by spontaneous fluctuations, combining with them in a largely linear fashion (i.e., with little evidence for an interaction). However, interactions between spontaneous fluctuations and stimulus-evoked responses were evident behaviourally; high levels of spontaneous activity tended to be associated with increased stimulus detection at perceptual threshold. Our results extend those found in studies of spontaneous fluctuations in motor cortex and higher order visual areas, and suggest a fundamental role for spontaneous activity in stimulus processing. Copyright © 2011. Published by Elsevier Inc.

  15. Spontaneous Fluctuations in Sensory Processing Predict Within-Subject Reaction Time Variability.

    PubMed

    Ribeiro, Maria J; Paiva, Joana S; Castelo-Branco, Miguel

    2016-01-01

    When engaged in a repetitive task our performance fluctuates from trial-to-trial. In particular, inter-trial reaction time variability has been the subject of considerable research. It has been claimed to be a strong biomarker of attention deficits, increases with frontal dysfunction, and predicts age-related cognitive decline. Thus, rather than being just a consequence of noise in the system, it appears to be under the control of a mechanism that breaks down under certain pathological conditions. Although the underlying mechanism is still an open question, consensual hypotheses are emerging regarding the neural correlates of reaction time inter-trial intra-individual variability. Sensory processing, in particular, has been shown to covary with reaction time, yet the spatio-temporal profile of the moment-to-moment variability in sensory processing is still poorly characterized. The goal of this study was to characterize the intra-individual variability in the time course of single-trial visual evoked potentials and its relationship with inter-trial reaction time variability. For this, we chose to take advantage of the high temporal resolution of the electroencephalogram (EEG) acquired while participants were engaged in a 2-choice reaction time task. We studied the link between single trial event-related potentials (ERPs) and reaction time using two different analyses: (1) time point by time point correlation analyses thereby identifying time windows of interest; and (2) correlation analyses between single trial measures of peak latency and amplitude and reaction time. To improve extraction of single trial ERP measures related with activation of the visual cortex, we used an independent component analysis (ICA) procedure. Our ERP analysis revealed a relationship between the N1 visual evoked potential and reaction time. 
The earliest time point at which amplitude correlated significantly with reaction time occurred 175 ms after stimulus onset, just after the onset of the N1 peak. Interestingly, single-trial N1 latency correlated significantly with reaction time, whereas N1 amplitude did not. In conclusion, our findings suggest that inter-trial variability in the timing of extrastriate visual processing contributes to reaction time variability.

  16. Spontaneous Fluctuations in Sensory Processing Predict Within-Subject Reaction Time Variability

    PubMed Central

    Ribeiro, Maria J.; Paiva, Joana S.; Castelo-Branco, Miguel

    2016-01-01

When engaged in a repetitive task, our performance fluctuates from trial to trial. In particular, inter-trial reaction time variability has been the subject of considerable research. It has been claimed to be a strong biomarker of attention deficits, to increase with frontal dysfunction, and to predict age-related cognitive decline. Thus, rather than being just a consequence of noise in the system, it appears to be under the control of a mechanism that breaks down under certain pathological conditions. Although the underlying mechanism is still an open question, consensual hypotheses are emerging regarding the neural correlates of inter-trial, intra-individual reaction time variability. Sensory processing, in particular, has been shown to covary with reaction time, yet the spatio-temporal profile of the moment-to-moment variability in sensory processing is still poorly characterized. The goal of this study was to characterize the intra-individual variability in the time course of single-trial visual evoked potentials and its relationship with inter-trial reaction time variability. For this, we took advantage of the high temporal resolution of the electroencephalogram (EEG), acquired while participants were engaged in a 2-choice reaction time task. We studied the link between single-trial event-related potentials (ERPs) and reaction time using two different analyses: (1) time-point-by-time-point correlation analyses, thereby identifying time windows of interest; and (2) correlation analyses between single-trial measures of peak latency and amplitude and reaction time. To improve extraction of single-trial ERP measures related to activation of the visual cortex, we used an independent component analysis (ICA) procedure. Our ERP analysis revealed a relationship between the N1 visual evoked potential and reaction time. 
The earliest time point at which amplitude correlated significantly with reaction time occurred 175 ms after stimulus onset, just after the onset of the N1 peak. Interestingly, single-trial N1 latency correlated significantly with reaction time, whereas N1 amplitude did not. In conclusion, our findings suggest that inter-trial variability in the timing of extrastriate visual processing contributes to reaction time variability. PMID:27242470
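The time-point-by-time-point correlation analysis described above (analysis 1) can be sketched as follows. This is a minimal plain-Python illustration assuming equal-length single-trial epochs; the function names and data layout are hypothetical, not taken from the study:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def pointwise_rt_correlation(trials, rts):
    """For each time point, correlate single-trial amplitude with reaction time.

    trials: list of per-trial amplitude series (all the same length)
    rts:    one reaction time per trial
    Returns one r value per time point; windows where |r| is reliably
    non-zero would be the "time windows of interest".
    """
    n_points = len(trials[0])
    return [pearson_r([t[i] for t in trials], rts) for i in range(n_points)]
```

In practice each r value would be tested against a significance threshold (with correction for the many time points), which is omitted here.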

  17. Shapes, scents and sounds: quantifying the full multi-sensory basis of conceptual knowledge.

    PubMed

    Hoffman, Paul; Lambon Ralph, Matthew A

    2013-01-01

    Contemporary neuroscience theories assume that concepts are formed through experience in multiple sensory-motor modalities. Quantifying the contribution of each modality to different object categories is critical to understanding the structure of the conceptual system and to explaining category-specific knowledge deficits. Verbal feature listing is typically used to elicit this information but has a number of drawbacks: sensory knowledge often cannot easily be translated into verbal features and many features are experienced in multiple modalities. Here, we employed a more direct approach in which subjects rated their knowledge of objects in each sensory-motor modality separately. Compared with these ratings, feature listing over-estimated the importance of visual form and functional knowledge and under-estimated the contributions of other sensory channels. An item's sensory rating proved to be a better predictor of lexical-semantic processing speed than the number of features it possessed, suggesting that ratings better capture the overall quantity of sensory information associated with a concept. Finally, the richer, multi-modal rating data not only replicated the sensory-functional distinction between animals and non-living things but also revealed novel distinctions between different types of artefact. Hierarchical cluster analyses indicated that mechanical devices (e.g., vehicles) were distinct from other non-living objects because they had strong sound and motion characteristics, making them more similar to animals in this respect. Taken together, the ratings align with neuroscience evidence in suggesting that a number of distinct sensory processing channels make important contributions to object knowledge. Multi-modal ratings for 160 objects are provided as supplementary materials. Copyright © 2012 Elsevier Ltd. All rights reserved.
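The hierarchical cluster analysis of per-modality ratings described above can be illustrated with a toy sketch: a minimal single-linkage agglomerative clustering over invented rating vectors (the study's actual analysis, items, and ratings are not reproduced here; all labels and numbers below are hypothetical):

```python
import math

def euclid(a, b):
    """Euclidean distance between two rating vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def single_linkage(items, n_clusters):
    """Agglomerative clustering with single linkage.

    items: dict mapping an object label to its per-modality rating vector
    Repeatedly merges the two closest clusters (closest pair of members)
    until n_clusters clusters remain.
    """
    clusters = [{k} for k in items]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(euclid(items[a], items[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] |= clusters[j]
        del clusters[j]
    return clusters
```

With ratings on, say, (visual form, sound, motion), objects with strong sound and motion profiles would fall into the same cluster, which is the kind of structure that grouped mechanical devices with animals in the study.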

  18. Superior voice recognition in a patient with acquired prosopagnosia and object agnosia.

    PubMed

    Hoover, Adria E N; Démonet, Jean-François; Steeves, Jennifer K E

    2010-11-01

Anecdotally, it has been reported that individuals with acquired prosopagnosia compensate for their inability to recognize faces by using other person identity cues such as hair, gait or the voice. Are they therefore superior at the use of non-face cues, specifically voices, for person identity? Here, we empirically measure person and object identity recognition in a patient with acquired prosopagnosia and object agnosia. We quantify person identity (face and voice) and object identity (car and horn) recognition for visual, auditory, and bimodal (visual and auditory) stimuli. The patient is unable to recognize faces or cars, consistent with his prosopagnosia and object agnosia, respectively. He is perfectly able to recognize people's voices, car horns, and bimodal stimuli. These data show a reversal of the typical weighting of visual over auditory information for audiovisual stimuli in a compromised visual recognition system. Moreover, the patient shows selectively superior voice recognition compared to the controls, revealing that two different stimulus domains, persons and objects, may not be equally affected by sensory adaptation effects. This also implies that person and object identity recognition are processed in separate pathways. These data demonstrate that an individual with acquired prosopagnosia and object agnosia can compensate for the visual impairment and become quite skilled at using spared aspects of sensory processing. In the case of acquired prosopagnosia, it is advantageous to develop a superior use of voices for person identity recognition in everyday life. Copyright © 2010 Elsevier Ltd. All rights reserved.

  19. Age-dependent modulation of the somatosensory network upon eye closure.

    PubMed

    Brodoehl, Stefan; Klingner, Carsten; Witte, Otto W

    2016-02-01

    Eye closure even in complete darkness can improve somatosensory perception by switching the brain to a uni-sensory processing mode. This causes an increased information flow between the thalamus and the somatosensory cortex while decreasing modulation by the visual cortex. Previous work suggests that these modulations are age-dependent and that the benefit in somatosensory performance due to eye closing diminishes with age. The cause of this age-dependency and to what extent somatosensory processing is involved remains unclear. Therefore, we intended to characterize the underlying age-dependent modifications in the interaction and connectivity of different sensory networks caused by eye closure. We performed functional MR-imaging with tactile stimulation of the right hand under the conditions of opened and closed eyes in healthy young and elderly participants. Conditional Granger causality analysis was performed to assess the somatosensory and visual networks, including the thalamus. Independent of age, eye closure improved the information transfer from the thalamus to and within the somatosensory cortex. However, beyond that, we found an age-dependent recruitment strategy. Whereas young participants were characterized by an optimized information flow within the relays of the somatosensory network, elderly participants revealed a stronger modulatory influence of the visual network upon the somatosensory cortex. Our results demonstrate that the modulation of the somatosensory and visual networks by eye closure diminishes with age and that the dominance of the visual system is more pronounced in the aging brain. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Sensory interaction on static balance: a comparison concerning the history of falls of community-dwelling elderly.

    PubMed

    Ricci, Natalia Aquaroni; de Faria Figueiredo Gonçalves, Daniele; Coimbra, Arlete Maria Valente; Coimbra, Ibsen Bellini

    2009-06-01

To determine whether elderly subjects with distinct histories of falls presented differences concerning the influence of sensory interaction on balance. Cross-sectional research. Ninety-six community-dwelling elderly subjects were divided into three groups, according to the history of falls within the past year (group 1, no falls; group 2, one fall; and group 3, recurrent falls). The Clinical Test of Sensory Interaction and Balance was used to evaluate the influence of sensory inputs on standing balance. The test required the subject to maintain stability for 30 s under six conditions: (i) firm surface with eyes open; (ii) firm surface with eyes closed; (iii) firm surface with visual conflict; (iv) unstable surface with eyes open; (v) unstable surface with eyes closed; and (vi) unstable surface with visual conflict. The time expended on each condition and the number of abnormal cases were compared between groups. Each group was evaluated in relation to its performance in the progression of conditions. More abnormal cases occurred in group 3 compared to group 1 for conditions (iv) and (v), and compared to group 2 for condition (iv). Group 3 maintained balance for less time than group 1 under conditions (iv), (v) and (vi). Groups 1, 2 and 3 presented relevant decrements in trial duration from condition (iv) to (v). For group 3, a significant decay was also noted from condition (i) to (ii). Sensory interaction in the elderly varies according to their history of falls. Thus, it is possible to correctly guide the rehabilitation process and to prevent sensory decline according to an individual's history of falls.
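Scoring the six-condition test described above can be sketched as follows. This is a hedged illustration: the 30 s cutoff matches the trial length given in the abstract, but the abnormality criterion is an assumption, and criteria vary across studies:

```python
CONDITIONS = [
    "firm / eyes open", "firm / eyes closed", "firm / visual conflict",
    "unstable / eyes open", "unstable / eyes closed", "unstable / visual conflict",
]

def score_ctsib(times, cutoff=30.0):
    """Summarize one subject's trial: total time and abnormal conditions.

    times: seconds of maintained stability in each of the 6 conditions
    A condition is flagged abnormal here if the subject lost stability
    before the cutoff (an illustrative criterion, not the study's).
    """
    abnormal = [name for name, t in zip(CONDITIONS, times) if t < cutoff]
    return {"total": sum(times), "abnormal": abnormal}
```

A subject who completes the three firm-surface conditions but falls early on the unstable surface would be flagged on the last three conditions, the pattern the abstract reports for the recurrent-fall group.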

  1. Multisensory and modality specific processing of visual speech in different regions of the premotor cortex

    PubMed Central

    Callan, Daniel E.; Jones, Jeffery A.; Callan, Akiko

    2014-01-01

    Behavioral and neuroimaging studies have demonstrated that brain regions involved with speech production also support speech perception, especially under degraded conditions. The premotor cortex (PMC) has been shown to be active during both observation and execution of action (“Mirror System” properties), and may facilitate speech perception by mapping unimodal and multimodal sensory features onto articulatory speech gestures. For this functional magnetic resonance imaging (fMRI) study, participants identified vowels produced by a speaker in audio-visual (saw the speaker's articulating face and heard her voice), visual only (only saw the speaker's articulating face), and audio only (only heard the speaker's voice) conditions with varying audio signal-to-noise ratios in order to determine the regions of the PMC involved with multisensory and modality specific processing of visual speech gestures. The task was designed so that identification could be made with a high level of accuracy from visual only stimuli to control for task difficulty and differences in intelligibility. The results of the functional magnetic resonance imaging (fMRI) analysis for visual only and audio-visual conditions showed overlapping activity in inferior frontal gyrus and PMC. The left ventral inferior premotor cortex (PMvi) showed properties of multimodal (audio-visual) enhancement with a degraded auditory signal. The left inferior parietal lobule and right cerebellum also showed these properties. The left ventral superior and dorsal premotor cortex (PMvs/PMd) did not show this multisensory enhancement effect, but there was greater activity for the visual only over audio-visual conditions in these areas. 
The results suggest that the inferior regions of the ventral premotor cortex are involved with integrating multisensory information, whereas more superior and dorsal regions of the PMC are involved with mapping unimodal (in this case visual) sensory features of the speech signal onto articulatory speech gestures. PMID:24860526

  2. The activity in the anterior insulae is modulated by perceptual decision-making difficulty.

    PubMed

    Lamichhane, Bidhan; Adhikari, Bhim M; Dhamala, Mukesh

    2016-07-07

Previous neuroimaging studies provide evidence for the involvement of the anterior insulae (INSs) in perceptual decision-making processes. However, how the insular cortex is involved in integrating degraded sensory information to create a conscious percept of the environment and to drive our behaviors remains a mystery. In this study, using functional magnetic resonance imaging (fMRI) and four different perceptual categorization tasks in visual and audio-visual domains, we measured blood oxygen level dependent (BOLD) signals and examined the roles of the INSs in easy and difficult perceptual decision-making. We created varying degrees of degraded stimuli by manipulating the task-specific stimuli in these four experiments to examine the effects of task difficulty on insular cortex response. We hypothesized that significantly higher BOLD response would be associated with the ambiguity of the sensory information and decision-making difficulty. In all of our experimental tasks, we found that INS activity consistently increased with task difficulty and that participants' behavioral performance changed with the ambiguity of the presented sensory information. These findings support the hypothesis that the anterior insulae are involved in sensory-guided, goal-directed behaviors and that their activity can predict perceptual load and task difficulty. Copyright © 2016 IBRO. Published by Elsevier Ltd. All rights reserved.

  3. Short-term memory for event duration: modality specificity and goal dependency.

    PubMed

    Takahashi, Kohske; Watanabe, Katsumi

    2012-11-01

Time perception is involved in various cognitive functions. This study investigated the characteristics of short-term memory for event duration by examining how the length of the retention period affects inter- and intramodal duration judgment. On each trial, a sample stimulus was followed by a comparison stimulus after a variable delay period (0.5-5 s). The sample and comparison stimuli were presented in the visual or auditory modality. The participants determined whether the comparison stimulus was longer or shorter than the sample stimulus. The distortion pattern of subjective duration during the delay period depended on the sensory modality of the comparison stimulus but was not affected by that of the sample stimulus. When the comparison stimulus was presented visually, the retained duration of the sample stimulus shortened as the delay period increased. In contrast, when the comparison stimulus was presented in the auditory modality, the delay period had little to no effect on the retained duration. Furthermore, when the participants did not know the sensory modality of the comparison stimulus beforehand, the effect of the delay period disappeared. These results suggest that the memory process for event duration is specific to sensory modality and that its performance depends on the sensory modality in which the retained duration will subsequently be used.

  4. Modelling auditory attention

    PubMed Central

    Kaya, Emine Merve

    2017-01-01

    Sounds in everyday life seldom appear in isolation. Both humans and machines are constantly flooded with a cacophony of sounds that need to be sorted through and scoured for relevant information—a phenomenon referred to as the ‘cocktail party problem’. A key component in parsing acoustic scenes is the role of attention, which mediates perception and behaviour by focusing both sensory and cognitive resources on pertinent information in the stimulus space. The current article provides a review of modelling studies of auditory attention. The review highlights how the term attention refers to a multitude of behavioural and cognitive processes that can shape sensory processing. Attention can be modulated by ‘bottom-up’ sensory-driven factors, as well as ‘top-down’ task-specific goals, expectations and learned schemas. Essentially, it acts as a selection process or processes that focus both sensory and cognitive resources on the most relevant events in the soundscape; with relevance being dictated by the stimulus itself (e.g. a loud explosion) or by a task at hand (e.g. listen to announcements in a busy airport). Recent computational models of auditory attention provide key insights into its role in facilitating perception in cluttered auditory scenes. This article is part of the themed issue ‘Auditory and visual scene analysis’. PMID:28044012

  5. Cerebral Palsy for the Pediatric Eye Care Team Part III: Diagnosis and Management of Associated Visual and Sensory Disorders.

    PubMed

    Arnoldi, Kyle A; Pendarvis, Lauren; Jackson, Jorie; Batra, Noopur Nikki Agarwal

    2006-01-01

Cerebral palsy (CP) is a term used to describe a spectrum of deficits of muscle tone and posture resulting from damage to the developing nervous system. Though considered a motor disorder, CP can be associated with disorders of the sensory visual pathway. This paper, the final in a series of three articles, will present the frequency, diagnosis, and management of the visual and binocular vision deficits associated with CP. Topics for discussion will include the prevalence and etiology of decreased acuity, the effect of CP on sensory and motor fusion, and the response to treatment for these sensory deficits. A retrospective chart review of all cases of cerebral palsy referred to the St. Louis Children's Hospital Eye Center was done. Detailed data on the sensory and motor deficits documented in these children were collected. Also recorded were the management strategy and response to treatment. Of the 131 cases reviewed (mean age 5.2 years at presentation), 46% had decreased vision in at least one eye due to amblyopia (24%), optic nerve abnormality (16%), cortical visual impairment (14%), or a combination. Forty-nine (37%) had significant refractive error. Sixty-four percent of those with significant refractive error responded to spectacle correction. Forty-three percent of those with amblyopia responded to conventional therapies. Of the nonstrabismic patients, 89% demonstrated sensory fusion, 90% had stereopsis, and 91% had motor fusion. No patient lacking fusion or stereopsis prior to strabismus surgery gained these abilities with realignment of the eyes. While children with CP are capable of age-appropriate acuity and binocular vision, they are at increased risk for sensory visual deficits. These deficits are not the direct result of CP itself, but either share a common underlying cause, or occur as sequelae to the strabismus that is prevalent in CP. Most importantly, some sensory deficits may respond to standard treatment methods.

  6. Serotonin Decreases the Gain of Visual Responses in Awake Macaque V1.

    PubMed

    Seillier, Lenka; Lorenz, Corinna; Kawaguchi, Katsuhisa; Ott, Torben; Nieder, Andreas; Pourriahi, Paria; Nienborg, Hendrikje

    2017-11-22

Serotonin, an important neuromodulator in the brain, is implicated in affective and cognitive functions. However, its role even for basic cortical processes is controversial. For example, in the mammalian primary visual cortex (V1), heterogeneous serotonergic modulation has been observed in anesthetized animals. Here, we combined extracellular single-unit recordings with iontophoresis in awake animals. We examined the role of serotonin on well-defined tuning properties (orientation, spatial frequency, contrast, and size) in V1 of two male macaque monkeys. We find that in the awake macaque the modulatory effect of serotonin is surprisingly uniform: it causes a mainly multiplicative decrease of the visual responses and a slight increase in the stimulus-selective response latency. Moreover, serotonin systematically changes neither the selectivity nor the variability of the response, nor the interneuronal correlation unexplained by the stimulus ("noise-correlation"). The modulation by serotonin has qualitative similarities with that for a decrease in stimulus contrast, but differs quantitatively from decreasing contrast. It can be captured by a simple additive change to a threshold-linear spiking nonlinearity. Together, our results show that serotonin is well suited to control the response gain of neurons in V1 depending on the animal's behavioral or motivational context, complementing other known state-dependent gain-control mechanisms. SIGNIFICANCE STATEMENT Serotonin is an important neuromodulator in the brain and a major target for drugs used to treat psychiatric disorders. Nonetheless, surprisingly little is known about how it shapes information processing in sensory areas. Here we examined the serotonergic modulation of visual processing in the primary visual cortex of awake behaving macaque monkeys. We found that serotonin mainly decreased the gain of the visual responses, without systematically changing their selectivity, variability, or covariability. 
This identifies a simple computational function of serotonin for state-dependent sensory processing, depending on the animal's affective or motivational state. Copyright © 2017 Seillier, Lorenz et al.

  7. Serotonin Decreases the Gain of Visual Responses in Awake Macaque V1

    PubMed Central

    Seillier, Lenka; Lorenz, Corinna; Kawaguchi, Katsuhisa; Ott, Torben; Pourriahi, Paria

    2017-01-01

Serotonin, an important neuromodulator in the brain, is implicated in affective and cognitive functions. However, its role even for basic cortical processes is controversial. For example, in the mammalian primary visual cortex (V1), heterogeneous serotonergic modulation has been observed in anesthetized animals. Here, we combined extracellular single-unit recordings with iontophoresis in awake animals. We examined the role of serotonin on well-defined tuning properties (orientation, spatial frequency, contrast, and size) in V1 of two male macaque monkeys. We find that in the awake macaque the modulatory effect of serotonin is surprisingly uniform: it causes a mainly multiplicative decrease of the visual responses and a slight increase in the stimulus-selective response latency. Moreover, serotonin systematically changes neither the selectivity nor the variability of the response, nor the interneuronal correlation unexplained by the stimulus (“noise-correlation”). The modulation by serotonin has qualitative similarities with that for a decrease in stimulus contrast, but differs quantitatively from decreasing contrast. It can be captured by a simple additive change to a threshold-linear spiking nonlinearity. Together, our results show that serotonin is well suited to control the response gain of neurons in V1 depending on the animal's behavioral or motivational context, complementing other known state-dependent gain-control mechanisms. SIGNIFICANCE STATEMENT Serotonin is an important neuromodulator in the brain and a major target for drugs used to treat psychiatric disorders. Nonetheless, surprisingly little is known about how it shapes information processing in sensory areas. Here we examined the serotonergic modulation of visual processing in the primary visual cortex of awake behaving macaque monkeys. We found that serotonin mainly decreased the gain of the visual responses, without systematically changing their selectivity, variability, or covariability. 
This identifies a simple computational function of serotonin for state-dependent sensory processing, depending on the animal's affective or motivational state. PMID:29042433
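The model described above — an additive change to a threshold-linear spiking nonlinearity — can be sketched as follows. This is a schematic reading of the abstract, with the additive change applied to the effective threshold; parameter values are illustrative, not fitted values from the study:

```python
def threshold_linear(drive, threshold=0.0, gain=1.0):
    """Threshold-linear (rectified) spiking nonlinearity."""
    return gain * max(0.0, drive - threshold)

def response(contrast, serotonin_shift=0.0, gain=20.0, threshold=0.1):
    """Visual response under an additive shift of the nonlinearity.

    serotonin_shift > 0 models serotonergic suppression as an additive
    change to the effective threshold. Above threshold, the output
    decrease scales with the fixed gain, which is why the net effect on
    responses looks mainly multiplicative-like in its suppression.
    """
    return threshold_linear(contrast, threshold + serotonin_shift, gain)
```

For example, a stimulus well above threshold yields a suppressed but still graded response under the shift, while a near-threshold stimulus can be silenced entirely.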

  8. Visual Bias Predicts Gait Adaptability in Novel Sensory Discordant Conditions

    NASA Technical Reports Server (NTRS)

    Brady, Rachel A.; Batson, Crystal D.; Peters, Brian T.; Mulavara, Ajitkumar P.; Bloomberg, Jacob J.

    2010-01-01

We designed a gait training study that presented combinations of visual flow and support-surface manipulations to investigate the response of healthy adults to novel discordant sensorimotor conditions. We aimed to determine whether a relationship existed between subjects' visual dependence and their postural stability and cognitive performance in a new discordant environment presented at the conclusion of training (Transfer Test). Our training system comprised a treadmill placed on a motion base facing a virtual visual scene that provided a variety of sensory challenges. Ten healthy adults completed 3 training sessions during which they walked on a treadmill at 1.1 m/s while receiving discordant support-surface and visual manipulations. At the first visit, in an analysis of normalized torso translation measured in a scene-movement-only condition, 3 of 10 subjects were classified as visually dependent. During the Transfer Test, all participants received a 2-minute novel exposure. In a combined measure of stride frequency and reaction time, the non-visually dependent subjects showed improved adaptation on the Transfer Test compared to their visually dependent counterparts. This finding suggests that individual differences in the ability to adapt to new sensorimotor conditions may be explained by individuals' innate sensory biases. An accurate preflight assessment of crewmembers' biases for visual dependence could be used to predict their propensities to adapt to novel sensory conditions. It may also facilitate the development of customized training regimens that could expedite adaptation to alternate gravitational environments.

  9. Abnormal late visual responses and alpha oscillations in neurofibromatosis type 1: a link to visual and attention deficits

    PubMed Central

    2014-01-01

Background Neurofibromatosis type 1 (NF1) affects several areas of cognitive function including visual processing and attention. We investigated the neural mechanisms underlying the visual deficits of children and adolescents with NF1 by studying visual evoked potentials (VEPs) and brain oscillations during visual stimulation and rest periods. Methods Electroencephalogram/event-related potential (EEG/ERP) responses were measured during visual processing (NF1 n = 17; controls n = 19) and idle periods with eyes closed and eyes open (NF1 n = 12; controls n = 14). Visual stimulation was chosen to bias activation of the three detection mechanisms: achromatic, red-green and blue-yellow. Results We found significant differences between the groups for late chromatic VEPs and a specific enhancement of parieto-occipital alpha amplitude both during visual stimulation and during idle periods. Alpha modulation and the negative influence of alpha oscillations on visual performance were found in both groups. Conclusions Our findings suggest abnormal later stages of visual processing and enhanced amplitude of alpha oscillations, supporting the existence of deficits in basic sensory processing in NF1. Given the link between alpha oscillations, visual perception and attention, these results indicate a neural mechanism that might underlie the visual sensitivity deficits and increased lapses of attention observed in individuals with NF1. PMID:24559228

  10. Effect of prism adaptation on left dichotic listening deficit in neglect patients: glasses to hear better?

    PubMed

    Jacquin-Courtois, S; Rode, G; Pavani, F; O'Shea, J; Giard, M H; Boisson, D; Rossetti, Y

    2010-03-01

    Unilateral neglect is a disabling syndrome frequently observed following right hemisphere brain damage. Symptoms range from visuo-motor impairments through to deficient visuo-spatial imagery, but impairment can also affect the auditory modality. A short period of adaptation to a rightward prismatic shift of the visual field is known to improve a wide range of hemispatial neglect symptoms, including visuo-manual tasks, mental imagery, postural imbalance, visuo-verbal measures and number bisection. The aim of the present study was to assess whether the beneficial effects of prism adaptation may generalize to auditory manifestations of neglect. Auditory extinction, whose clinical manifestations are independent of the sensory modalities engaged in visuo-manual adaptation, was examined in neglect patients before and after prism adaptation. Two separate groups of neglect patients (all of whom exhibited left auditory extinction) underwent prism adaptation: one group (n = 6) received a classical prism treatment ('Prism' group), the other group (n = 6) was submitted to the same procedure, but wore neutral glasses creating no optical shift (placebo 'Control' group). Auditory extinction was assessed by means of a dichotic listening task performed three times: prior to prism exposure (pre-test), upon prism removal (0 h post-test) and 2 h later (2 h post-test). The total number of correct responses, the lateralization index (detection asymmetry between the two ears) and the number of left-right fusion errors were analysed. Our results demonstrate that prism adaptation can improve left auditory extinction, thus revealing transfer of benefit to a sensory modality that is orthogonal to the visual, proprioceptive and motor modalities directly implicated in the visuo-motor adaptive process. The observed benefit was specific to the detection asymmetry between the two ears and did not affect the total number of responses. 
This indicates a specific effect of prism adaptation on lateralized processes rather than on general arousal. Our results suggest that the effects of prism adaptation can extend to unexposed sensory systems. The bottom-up approach of visuo-motor adaptation appears to interact with higher order brain functions related to multisensory integration and can have beneficial effects on sensory processing in different modalities. These findings should stimulate the development of therapeutic approaches aimed at bypassing the affected sensory processing modality by adapting other sensory modalities.
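The lateralization index analysed above can be illustrated as a normalized left-right difference in correct detections. The study's exact formula is not reproduced in the abstract, so the convention below is an assumption:

```python
def lateralization_index(left_correct, right_correct):
    """Ear-advantage index from dichotic listening scores.

    Positive values indicate a right-ear advantage; in these patients a
    large positive index reflects left-ear extinction. The index is
    independent of the total number of responses, matching the finding
    that prism adaptation changed the asymmetry but not the total.
    """
    total = left_correct + right_correct
    if total == 0:
        return 0.0
    return (right_correct - left_correct) / total
```

A patient detecting 10 left-ear and 30 right-ear items scores 0.5; a symmetric listener scores 0.0, so a post-adaptation drop in the index would indicate reduced extinction.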

  11. Integrating brain, behavior, and phylogeny to understand the evolution of sensory systems in birds

    PubMed Central

    Wylie, Douglas R.; Gutiérrez-Ibáñez, Cristian; Iwaniuk, Andrew N.

    2015-01-01

The comparative anatomy of sensory systems has played a major role in developing theories and principles central to evolutionary neuroscience. This includes the central tenet of many comparative studies, the principle of proper mass, which states that the size of a neural structure reflects its processing capacity. The size of structures within the sensory system is not, however, the only salient variable in sensory evolution. Further, the evolution of the brain and behavior is intimately tied to phylogenetic history, requiring studies to integrate neuroanatomy with behavior and phylogeny to gain a more holistic view of brain evolution. Birds have proven to be a useful group for these studies because of widespread interest in their phylogenetic relationships and a wealth of information on the functional organization of most of their sensory pathways. In this review, we examine the principle of proper mass in relation to differences in the sensory capabilities among birds. We discuss how neuroanatomy, behavior, and phylogeny can be integrated to understand the evolution of sensory systems in birds, providing evidence from the visual, auditory, and somatosensory systems. We also consider the concept of a “trade-off,” whereby one sensory system (or subpathway within a sensory system) may be expanded in size at the expense of others, which are reduced in size. PMID:26321905

  12. The Sound of Vision Project: On the Feasibility of an Audio-Haptic Representation of the Environment, for the Visually Impaired

    PubMed Central

    Jóhannesson, Ómar I.; Balan, Oana; Unnthorsson, Runar; Moldoveanu, Alin; Kristjánsson, Árni

    2016-01-01

    The Sound of Vision project involves developing a sensory substitution device aimed at creating and conveying a rich auditory representation of the surrounding environment to the visually impaired. However, the feasibility of such an approach is strongly constrained by neural flexibility, the possibilities of sensory substitution, and adaptation to changed sensory input. We review evidence for such flexibility from various perspectives. We discuss neuroplasticity of the adult brain, with an emphasis on functional changes in the visually impaired compared to sighted people. We discuss effects of adaptation on brain activity, in particular short-term and long-term effects of repeated exposure to particular stimuli. We then discuss evidence for sensory substitution of the kind that Sound of Vision involves, and finally discuss evidence for adaptation to changes in the auditory environment. We conclude that sensory substitution enterprises such as Sound of Vision are quite feasible in light of the available evidence, which is encouraging for such projects. PMID:27355966

  13. Spontaneous cortical activity alternates between motifs defined by regional axonal projections

    PubMed Central

    Mohajerani, Majid H.; Chan, Allen W.; Mohsenvand, Mostafa; LeDue, Jeffrey; Liu, Rui; McVea, David A.; Boyd, Jamie D.; Wang, Yu Tian; Reimers, Mark; Murphy, Timothy H.

    2014-01-01

    In lightly anaesthetized or awake adult mice using millisecond timescale voltage sensitive dye imaging, we show that a palette of sensory-evoked and hemisphere-wide activity motifs are represented in spontaneous activity. These motifs can reflect multiple modes of sensory processing including vision, audition, and touch. Similar cortical networks were found with direct cortical activation using channelrhodopsin-2. Regional analysis of activity spread indicated modality specific sources such as primary sensory areas, and a common posterior-medial cortical sink where sensory activity was extinguished within the parietal association area, and a secondary anterior medial sink within the cingulate/secondary motor cortices for visual stimuli. Correlation analysis between functional circuits and intracortical axonal projections indicated a common framework corresponding to long-range mono-synaptic connections between cortical regions. Maps of intracortical mono-synaptic structural connections predicted hemisphere-wide patterns of spontaneous and sensory-evoked depolarization. We suggest that an intracortical monosynaptic connectome shapes the ebb and flow of spontaneous cortical activity. PMID:23974708

  14. Integrating Information from Different Senses in the Auditory Cortex

    PubMed Central

    King, Andrew J.; Walker, Kerry M.M.

    2015-01-01

    Multisensory integration was once thought to be the domain of brain areas high in the cortical hierarchy, with early sensory cortical fields devoted to unisensory processing of inputs from their given set of sensory receptors. More recently, a wealth of evidence documenting visual and somatosensory responses in auditory cortex, even as early as the primary fields, has changed this view of cortical processing. These multisensory inputs may serve to enhance responses to sounds that are accompanied by other sensory cues, effectively making them easier to hear, but may also act more selectively to shape the receptive field properties of auditory cortical neurons to the location or identity of these events. We discuss the new, converging evidence that multiplexing of neural signals may play a key role in informatively encoding and integrating signals in auditory cortex across multiple sensory modalities. We highlight some of the many open research questions that exist about the neural mechanisms that give rise to multisensory integration in auditory cortex, which should be addressed in future experimental and theoretical studies. PMID:22798035

  15. Pilot Errors Involving Head-Up Displays (HUDs), Helmet-Mounted Displays (HMDs), and Night Vision Goggles (NVGs)

    DTIC Science & Technology

    1992-01-01

    results in stimulation of spatial-motion-location visual processes, which are known to take precedence over any other sensory or cognitive stimuli. In...or version he is flying. This was initially an observation that stimulated the birth of the human-factors engineering discipline during World War II...collisions with the surface, the pilot needs inputs to sensory channels other than the focal visual system. Properly designed auditory and

  16. Stress Potentiates Early and Attenuates Late Stages of Visual Processing

    DTIC Science & Technology

    2011-01-19

    threat (M = 6.5, SD = 20.0) than during safety (M = 19.3, SD = 11.6), t(31) = 6.7, p < 0.001. They also expressed more intense negative emotion on their...threats increase risk assessment (Kavaliers and Choleris, 2001), and fearful facial expressions enhance sensory intake (Susskind et al., 2008). These...visual analog scales to rate the intensity of their emotional experience (anxious, happy, safe, or stressed) during safety and threat blocks. To minimize

  17. [Age-related changes of sensory system].

    PubMed

    Iwamoto, Toshihiko; Hanyu, Haruo; Umahara, Takahiko

    2013-10-01

    Pathological processes usually superimpose on physiological aging even in the sensory system, including visual, hearing, olfactory, taste, and somatosensory functions. Representative age-related changes are presbyopia, cataracts, and presbyacusis. A reduced sense of smell is seen in normal aging, but a prominent reduction, detected by the odor stick identification test, is noticed especially in the early stages of Alzheimer disease or Parkinson disease. A reduced sense of taste is well known, especially for salty taste, while changes in sweet, bitter, and sour tastes differ among individuals. Finally, deep sensation of vibration and proprioception decreases with age, as does superficial sensation (touch, temperature, pain). As a result, an impaired sensory system can lead to deterioration of the activities of daily living and quality of life in the elderly.

  18. Postural Stability of Patients with Schizophrenia during Challenging Sensory Conditions: Implication of Sensory Integration for Postural Control.

    PubMed

    Teng, Ya-Ling; Chen, Chiung-Ling; Lou, Shu-Zon; Wang, Wei-Tsan; Wu, Jui-Yen; Ma, Hui-Ing; Chen, Vincent Chin-Hung

    2016-01-01

    Postural dysfunctions are prevalent in patients with schizophrenia and affect their daily life and ability to work. In addition, sensory functions and sensory integration that are crucial for postural control are also compromised. This study examined how patients with schizophrenia coordinate multiple sensory systems to maintain postural stability in dynamic sensory conditions. Twenty-nine patients with schizophrenia and 32 control subjects were recruited. Postural stability of the participants was examined in six sensory conditions with different levels of congruency of multisensory information, based on combinations of correct, removed, or conflicting sensory inputs from the visual, somatosensory, and vestibular systems. The excursion of the center of pressure was measured by posturography. Equilibrium scores were derived to indicate the range of anterior-posterior (AP) postural sway, and sensory ratios were calculated to explore the ability to use sensory information to maintain balance. The overall AP postural sway was significantly larger for patients with schizophrenia compared to the controls [patients (69.62±8.99); controls (76.53±7.47); t(59) = -3.28, p < 0.001]. The results of mixed-model ANOVAs showed a significant interaction between group and sensory condition [F(5,295) = 5.55, p < 0.001]. Further analysis indicated that AP postural sway was significantly larger for patients compared to the controls in conditions containing unreliable somatosensory information, either with visual deprivation or with conflicting visual information. Sensory ratios were not significantly different between groups, although a small, non-significant inefficiency in utilizing vestibular information was also noted. No significant correlations were found between postural stability and clinical characteristics.
    To sum up, patients with schizophrenia showed increased postural sway and a higher rate of falls during challenging sensory conditions, which was independent of clinical characteristics. Patients nevertheless demonstrated a pattern and level of utilizing sensory information to maintain balance similar to the controls.
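    The equilibrium scores and sensory ratios above follow a standard posturography convention; a minimal sketch, assuming the common Sensory Organization Test formula with a 12.5° anterior-posterior sway limit (function names and example values here are illustrative, not taken from the study):

```python
def equilibrium_score(theta_max_deg, theta_min_deg, sway_limit_deg=12.5):
    """Equilibrium score: 100 = perfectly stable, 0 = AP sway spanning the
    theoretical limit of stability (commonly taken as 12.5 degrees)."""
    sway_range = theta_max_deg - theta_min_deg
    return 100.0 * (1.0 - sway_range / sway_limit_deg)

def sensory_ratio(score_condition, score_baseline):
    """Ratio of equilibrium scores between a challenging condition and a
    baseline condition; lower values suggest poorer use of the remaining
    sensory inputs."""
    return score_condition / score_baseline

# Illustrative values: 3 degrees of AP sway versus a 1-degree baseline
es_challenge = equilibrium_score(2.0, -1.0)   # sway range 3 deg, score ~76
es_baseline = equilibrium_score(0.5, -0.5)    # sway range 1 deg, score ~92
print(round(es_challenge, 1), round(es_baseline, 1),
      round(sensory_ratio(es_challenge, es_baseline), 3))
```

    Lower scores indicate larger sway; a sensory ratio near 1 suggests the challenging condition degrades balance little relative to baseline.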

  19. Women process multisensory emotion expressions more efficiently than men.

    PubMed

    Collignon, O; Girard, S; Gosselin, F; Saint-Amour, D; Lepore, F; Lassonde, M

    2010-01-01

    Despite claims in the popular press, experiments investigating whether female observers are more efficient than male observers at processing expressions of emotion have produced inconsistent findings. In the present study, participants were asked to categorize fear and disgust expressions displayed auditorily, visually, or audio-visually. Results revealed an advantage for women in all conditions of stimulus presentation. We also observed more nonlinear probabilistic summation in the bimodal conditions in female than in male observers, indicating greater neural integration of different sensory-emotional information. These findings indicate robust differences between genders in the multisensory perception of emotion expression.

  20. Multi-sensory landscape assessment: the contribution of acoustic perception to landscape evaluation.

    PubMed

    Gan, Yonghong; Luo, Tao; Breitung, Werner; Kang, Jian; Zhang, Tianhai

    2014-12-01

    In this paper, the contributions of visual and acoustic preference to multi-sensory landscape evaluation were quantitatively compared. Real landscapes were treated as a dual-sensory ambiance and separated into visual landscape and soundscape. Both were evaluated by 63 respondents under laboratory conditions. The analysis of the relationship between respondents' visual and acoustic preferences, as well as their respective contributions to landscape preference, showed that (1) some common attributes are universally identified in assessing visual, aural, and audio-visual preference, such as naturalness or degree of human disturbance; (2) with acoustic and visual preferences as variables, a multivariate linear regression model can satisfactorily predict landscape preference (R² = 0.740), while the coefficients of determination for unitary linear regression models were 0.345 and 0.720 with visual and acoustic preference as predicting factors, respectively; (3) acoustic preference played a much more important role in landscape evaluation than visual preference in this study (the former being about 4.5 times the latter), which strongly suggests a rethinking of the role of soundscape in environment perception research and landscape planning practice.
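    The two-predictor regression reported above can be sketched with ordinary least squares. The data below are synthetic (generated so the acoustic weight dominates, loosely mirroring the reported weighting), and all variable names are illustrative, not the study's dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 63  # same number of respondents as the study; data here are synthetic

# Hypothetical preference ratings (0-10) for the two sensory channels
acoustic = rng.uniform(0, 10, n)
visual = rng.uniform(0, 10, n)
# Simulated overall landscape preference with a dominant acoustic weight
landscape = 0.9 * acoustic + 0.2 * visual + rng.normal(0, 0.5, n)

# Multivariate linear regression via ordinary least squares
X = np.column_stack([np.ones(n), acoustic, visual])
coef, *_ = np.linalg.lstsq(X, landscape, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((landscape - pred) ** 2) / np.sum((landscape - landscape.mean()) ** 2)
print("intercept, b_acoustic, b_visual:", np.round(coef, 2), "R^2:", round(r2, 3))
```

    With both predictors included, R² measures how much of the preference variance the joint model explains, which is how the study compared the unitary and multivariate models.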

  1. Beyond sensory images: Object-based representation in the human ventral pathway

    PubMed Central

    Pietrini, Pietro; Furey, Maura L.; Ricciardi, Emiliano; Gobbini, M. Ida; Wu, W.-H. Carolyn; Cohen, Leonardo; Guazzelli, Mario; Haxby, James V.

    2004-01-01

    We investigated whether the topographically organized, category-related patterns of neural response in the ventral visual pathway are a representation of sensory images or a more abstract representation of object form that is not dependent on sensory modality. We used functional MRI to measure patterns of response evoked during visual and tactile recognition of faces and manmade objects in sighted subjects and during tactile recognition in blind subjects. Results showed that visual and tactile recognition evoked category-related patterns of response in a ventral extrastriate visual area in the inferior temporal gyrus that were correlated across modality for manmade objects. Blind subjects also demonstrated category-related patterns of response in this “visual” area, and in more ventral cortical regions in the fusiform gyrus, indicating that these patterns are not due to visual imagery and, furthermore, that visual experience is not necessary for category-related representations to develop in these cortices. These results demonstrate that the representation of objects in the ventral visual pathway is not simply a representation of visual images but, rather, is a representation of more abstract features of object form. PMID:15064396

  2. A simple approach to ignoring irrelevant variables by population decoding based on multisensory neurons

    PubMed Central

    Kim, HyungGoo R.; Pitkow, Xaq; Angelaki, Dora E.

    2016-01-01

    Sensory input reflects events that occur in the environment, but multiple events may be confounded in sensory signals. For example, under many natural viewing conditions, retinal image motion reflects some combination of self-motion and movement of objects in the world. To estimate one stimulus event and ignore others, the brain can perform marginalization operations, but the neural bases of these operations are poorly understood. Using computational modeling, we examine how multisensory signals may be processed to estimate the direction of self-motion (i.e., heading) and to marginalize out effects of object motion. Multisensory neurons represent heading based on both visual and vestibular inputs and come in two basic types: “congruent” and “opposite” cells. Congruent cells have matched heading tuning for visual and vestibular cues and have been linked to perceptual benefits of cue integration during heading discrimination. Opposite cells have mismatched visual and vestibular heading preferences and are ill-suited for cue integration. We show that decoding a mixed population of congruent and opposite cells substantially reduces errors in heading estimation caused by object motion. In addition, we present a general formulation of an optimal linear decoding scheme that approximates marginalization and can be implemented biologically by simple reinforcement learning mechanisms. We also show that neural response correlations induced by task-irrelevant variables may greatly exceed intrinsic noise correlations. Overall, our findings suggest a general computational strategy by which neurons with mismatched tuning for two different sensory cues may be decoded to perform marginalization operations that dissociate possible causes of sensory inputs. PMID:27334948
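    A deliberately simplified linear toy model of the marginalization idea above: if congruent cells sum the visual (heading plus object motion) and vestibular (heading only) signals while opposite cells take their difference, a linear read-out of both cell types cancels the object-motion term. All quantities and functions here are hypothetical illustrations, not the paper's model:

```python
import random

def congruent_response(heading, object_motion, noise):
    # Visual input reflects heading + object motion; vestibular reflects heading only
    visual = heading + object_motion
    vestibular = heading
    return visual + vestibular + noise()

def opposite_response(heading, object_motion, noise):
    visual = heading + object_motion
    vestibular = heading
    return visual - vestibular + noise()

random.seed(1)
noise = lambda: random.gauss(0, 0.05)

errors_mixed, errors_congruent_only = [], []
for _ in range(1000):
    h = random.uniform(-1, 1)   # true heading (arbitrary units)
    o = random.uniform(-1, 1)   # task-irrelevant object motion
    rc = congruent_response(h, o, noise)
    ro = opposite_response(h, o, noise)
    # Linear decoder over both cell types subtracts out object motion
    errors_mixed.append(abs((rc - ro) / 2 - h))
    # Congruent cells alone leave a bias proportional to object motion
    errors_congruent_only.append(abs(rc / 2 - h))

print(sum(errors_mixed) / 1000, sum(errors_congruent_only) / 1000)
```

    The mixed-population decoder's error reflects only the sensory noise, whereas the congruent-only decoder inherits a systematic bias from object motion, which is the qualitative effect the modeling study reports.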

  3. Sensory modality of smoking cues modulates neural cue reactivity.

    PubMed

    Yalachkov, Yavor; Kaiser, Jochen; Görres, Andreas; Seehaus, Arne; Naumer, Marcus J

    2013-01-01

    Behavioral experiments have demonstrated that the sensory modality of presentation modulates drug cue reactivity. The present study on nicotine addiction tested whether neural responses to smoking cues are modulated by the sensory modality of stimulus presentation. We measured brain activation using functional magnetic resonance imaging (fMRI) in 15 smokers and 15 nonsmokers while they viewed images of smoking paraphernalia and control objects and while they touched the same objects without seeing them. Haptically presented, smoking-related stimuli induced more pronounced neural cue reactivity than visual cues in the left dorsal striatum in smokers compared to nonsmokers. The severity of nicotine dependence correlated positively with the preference for haptically explored smoking cues in the left inferior parietal lobule/somatosensory cortex, right fusiform gyrus/inferior temporal cortex/cerebellum, hippocampus/parahippocampal gyrus, posterior cingulate cortex, and supplementary motor area. These observations are in line with the hypothesized role of the dorsal striatum for the expression of drug habits and the well-established concept of drug-related automatized schemata, since haptic perception is more closely linked to the corresponding object-specific action pattern than visual perception. Moreover, our findings demonstrate that with the growing severity of nicotine dependence, brain regions involved in object perception, memory, self-processing, and motor control exhibit an increasing preference for haptic over visual smoking cues. This difference was not found for control stimuli. Considering the sensory modality of the presented cues could serve to develop more reliable fMRI-specific biomarkers, more ecologically valid experimental designs, and more effective cue-exposure therapies of addiction.

  4. A neural correlate of working memory in the monkey primary visual cortex.

    PubMed

    Supèr, H; Spekreijse, H; Lamme, V A

    2001-07-06

    The brain frequently needs to store information for short periods. In vision, this means that the perceptual correlate of a stimulus has to be maintained temporarily once the stimulus has been removed from the visual scene. However, it is not known how the visual system transfers sensory information into a memory component. Here, we identify a neural correlate of working memory in the monkey primary visual cortex (V1). We propose that this component may link sensory activity with memory activity.

  5. Audiovisual associations alter the perception of low-level visual motion

    PubMed Central

    Kafaligonul, Hulusi; Oluk, Can

    2015-01-01

    Motion perception is a pervasive feature of vision and is affected both by the immediate pattern of sensory inputs and by prior experiences acquired through associations. Recently, several studies reported that an association can be established quickly between directions of visual motion and static sounds of distinct frequencies. After the association is formed, sounds are able to change the perceived direction of visual motion. To determine whether such rapidly acquired audiovisual associations and their subsequent influence on visual motion perception depend on the involvement of higher-order attentive tracking mechanisms, we designed psychophysical experiments using regular and reverse-phi random dot motion, isolating low-level pre-attentive motion processing. Our results show that an association between the directions of low-level visual motion and static sounds can be formed, and that this audiovisual association alters the subsequent perception of low-level visual motion. These findings support the view that audiovisual associations are not restricted to the high-level, attention-based motion system and that early-level visual motion processing plays some role. PMID:25873869

  6. Advocating for a Population-Specific Health Literacy for People With Visual Impairments.

    PubMed

    Harrison, Tracie; Lazard, Allison

    2015-01-01

    Health literacy, the ability to access, process, and understand health information, is enhanced by the visual senses among people who are typically sighted. Emotions, meaning, speed of knowledge transfer, level of attention, and degree of relevance are all manipulated by the visual design of health information when people can see. When consumers of health information are blind or visually impaired, they access, process, and understand their health information in a multitude of ways, using a variety of accommodations depending upon the severity and type of impairment. They are taught, or they learn, how to accommodate their differences by using alternative sensory experiences and interpretations. In this article, we argue that because of the unique and powerful aspects of visual learning, and because of the differences in knowledge creation when people are not visually oriented, health literacy must be considered a unique construct for people with visual impairment, one that requires a distinctive theoretical basis for determining the impact of their mind-constructed representations of health.

  7. Ground-based training for the stimulus rearrangement encountered during spaceflight

    NASA Technical Reports Server (NTRS)

    Reschke, M. F.; Parker, D. E.; Harm, D. L.; Michaud, L.

    1988-01-01

    Approximately 65-70% of the crew members now experience motion sickness of some degree during the first 72 h of orbital flight on the Space Shuttle. Lack of congruence among signals from spatial orientation systems leads to sensory conflict, which appears to be the basic cause of space motion sickness. A project to develop training devices and procedures to preadapt astronauts to the stimulus rearrangements of microgravity is currently being pursued. The preflight adaptation trainers (PATs) are intended to: demonstrate sensory phenomena likely to be experienced in flight, allow astronauts to train preflight in an altered sensory environment, alter sensory-motor reflexes, and alleviate or shorten the duration of space motion sickness. Four part-task PATs are anticipated. The trainers are designed to evoke two adaptation processes, sensory compensation and sensory reinterpretation, which are necessary to maintain spatial orientation in a weightless environment. Recent investigations using one of the trainers indicate that self-motion perception of linear translation is enhanced when body tilt is combined with visual surround translation, and that a 270 degrees phase angle relationship between tilt and surround motion produces maximum translation perception.

  8. Strategies for Characterizing the Sensory Environment: Objective and Subjective Evaluation Methods using the VisiSonic Real Space 64/5 Audio-Visual Panoramic Camera

    DTIC Science & Technology

    2017-11-01

    ARL-TR-8205 ● NOV 2017 US Army Research Laboratory Strategies for Characterizing the Sensory Environment: Objective and...Subjective Evaluation Methods using the VisiSonic Real Space 64/5 Audio-Visual Panoramic Camera By Joseph McArdle, Ashley Foots, Chris Stachowiak, and...

  9. Frequency-band signatures of visual responses to naturalistic input in ferret primary visual cortex during free viewing.

    PubMed

    Sellers, Kristin K; Bennett, Davis V; Fröhlich, Flavio

    2015-02-19

    Neuronal firing responses in visual cortex reflect the statistics of visual input and emerge from the interaction with endogenous network dynamics. Artificial visual stimuli presented to animals in which the network dynamics were constrained by anesthetic agents or trained behavioral tasks have provided fundamental understanding of how individual neurons in primary visual cortex respond to input. In contrast, very little is known about the mesoscale network dynamics and their relationship to microscopic spiking activity in the awake animal during free viewing of naturalistic visual input. To address this gap in knowledge, we recorded local field potential (LFP) and multiunit activity (MUA) simultaneously in all layers of primary visual cortex (V1) of awake, freely viewing ferrets presented with naturalistic visual input (nature movie clips). We found that naturalistic visual stimuli modulated the entire oscillation spectrum; low-frequency oscillations were mostly suppressed, whereas higher-frequency oscillations were enhanced. On average across all cortical layers, stimulus-induced changes in delta and alpha power correlated negatively with MUA responses, whereas sensory-evoked increases in gamma power correlated positively with MUA responses. The time course of the band-limited power in these frequency bands provided evidence for a model in which naturalistic visual input switches V1 between two distinct, endogenously present activity states defined by the power of low- (delta, alpha) and high- (gamma) frequency oscillatory activity. Therefore, the two mesoscale activity states delineated in this study may define the degree of engagement of the circuit with the processing of sensory input. Copyright © 2014 Elsevier B.V. All rights reserved.
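    Band-limited power of the kind analyzed above can be estimated from a raw periodogram; a minimal sketch on a synthetic "LFP" with a dominant 10 Hz (alpha) component. The sampling rate, band edges, and signal composition are illustrative assumptions, not the study's recording parameters:

```python
import numpy as np

fs = 1000  # sampling rate (Hz), assumed for illustration
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(42)
# Synthetic signal: strong alpha (10 Hz), weaker gamma (60 Hz), white noise
lfp = (2.0 * np.sin(2 * np.pi * 10 * t)
       + 0.5 * np.sin(2 * np.pi * 60 * t)
       + 0.2 * rng.standard_normal(t.size))

def band_power(signal, fs, f_lo, f_hi):
    """Mean periodogram power within [f_lo, f_hi] Hz."""
    freqs = np.fft.rfftfreq(signal.size, 1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * signal.size)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].mean()

bands = {"delta": (1, 4), "alpha": (8, 14), "gamma": (30, 80)}
power = {name: band_power(lfp, fs, lo, hi) for name, (lo, hi) in bands.items()}
print({k: round(v, 4) for k, v in power.items()})
```

    Time-resolved versions of this quantity (computed in sliding windows) are what allow band power in each frequency range to be correlated with MUA responses.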

  10. Non-Attended Representations are Perceptual Rather than Unconscious in Nature

    PubMed Central

    Fahrenfort, Johannes J.; Ambroziak, Klaudia B.; Lamme, Victor A. F.

    2012-01-01

    Introspectively we experience a phenomenally rich world. In stark contrast, many studies show that we can only report on the few items that we happen to attend to. So what happens to the unattended objects? Are these consciously processed as our first person perspective would have us believe, or are they – in fact – entirely unconscious? Here, we attempt to resolve this question by investigating the perceptual characteristics of visual sensory memory. Sensory memory is a fleeting, high-capacity form of memory that precedes attentional selection and working memory. We found that memory capacity benefits from figural information induced by the Kanizsa illusion. Importantly, this benefit was larger for sensory memory than for working memory and depended critically on the illusion, not on the stimulus configuration. This shows that pre-attentive sensory memory contains representations that have a genuinely perceptual nature, suggesting that non-attended representations are phenomenally experienced rather than unconscious. PMID:23209639

  11. Multisensory architectures for action-oriented perception

    NASA Astrophysics Data System (ADS)

    Alba, L.; Arena, P.; De Fiore, S.; Listán, J.; Patané, L.; Salem, A.; Scordino, G.; Webb, B.

    2007-05-01

    In order to solve the navigation problem of a mobile robot in an unstructured environment, a versatile sensory system and efficient locomotion control algorithms are necessary. In this paper an innovative sensory system for action-oriented perception applied to a legged robot is presented. An important problem we address is how to utilize a large variety and number of sensors while keeping the system operating in real time. Our solution is to use sensory systems that incorporate analog and parallel processing, inspired by biological systems, to reduce the required data exchange with the motor control layer. In particular, for the visual system we use the Eye-RIS v1.1 board made by Anafocus, which is based on a fully parallel mixed-signal array sensor-processor chip. The hearing sensor is inspired by the cricket hearing system and allows efficient localization of a specific sound source with a very simple analog circuit. Our robot utilizes additional sensors for touch, posture, load, distance, and heading, and thus requires customized and parallel processing for concurrent acquisition. Therefore, Field Programmable Gate Array (FPGA)-based hardware was used to manage the multi-sensory acquisition and processing. This choice was made because FPGAs permit the implementation of customized digital logic blocks that operate in parallel, allowing the sensors to be driven simultaneously. With this approach, the proposed multi-sensory architecture can achieve real-time capabilities.

  12. Vicariously touching products through observing others' hand actions increases purchasing intention, and the effect of visual perspective in this process: An fMRI study.

    PubMed

    Liu, Yi; Zang, Xuelian; Chen, Lihan; Assumpção, Leonardo; Li, Hong

    2018-01-01

    The growth of online shopping increases consumers' dependence on vicarious sensory experiences, such as observing others touching products in commercials. However, empirical evidence on whether observing others' sensory experiences increases purchasing intention is still scarce. In the present study, participants observed others interacting with products from the first- or third-person perspective in video clips, and their neural responses were measured with functional magnetic resonance imaging (fMRI). We investigated (1) whether and how vicariously touching certain products affects purchasing intention, and the neural correlates of this process; and (2) how visual perspective interacts with vicarious tactility. Vicarious tactile experiences were manipulated by hand actions touching or not touching the products, while visual perspective was manipulated by showing the hand actions in either the first- or third-person perspective. During fMRI scanning, participants watched the video clips and rated their purchasing intention for each product. The results showed that observing others touching (vs. not touching) the products increased purchasing intention, with vicarious neural responses found in the mirror neuron system (MNS) and lateral occipital complex (LOC). Moreover, stronger neural activity in the MNS was associated with higher purchasing intention. Effects of visual perspective were found in the left superior parietal lobule (SPL), while the interaction of tactility and visual perspective was seen in the precuneus and in precuneus-LOC connectivity. The present study provides the first evidence that vicariously touching a given product increases purchasing intention, and that neural activity in the bilateral MNS, LOC, left SPL, and precuneus is involved in this process. Hum Brain Mapp 39:332-343, 2018. © 2017 Wiley Periodicals, Inc.

  13. Multimodal stimulation of the Colorado potato beetle: Prevalence of visual over olfactory cues

    USDA-ARS?s Scientific Manuscript database

    Orientation of insects to host plants and conspecifics is the result of detection and integration of chemical and physical cues present in the environment. Sensory organs have evolved to be sensitive to important signals, providing neural input for higher order processing and behavioral output. He...

  14. Processing Determinants of Reading Speed.

    ERIC Educational Resources Information Center

    Jackson, Mark D.; McClelland, James L.

    1979-01-01

    Two groups of undergraduates differing in reading ability were tested on a number of reaction-time tasks designed to determine the speed of encoding visual information at several different levels, tests of sensory functions, verbal and quantitative reasoning ability, short-term auditory memory span, and ability to comprehend spoken text.…

  15. Reevaluating the Sensory Account of Visual Working Memory Storage.

    PubMed

    Xu, Yaoda

    2017-10-01

    Recent human fMRI pattern-decoding studies have highlighted the involvement of sensory areas in visual working memory (VWM) tasks and argue for a sensory account of VWM storage. In this review, evidence is examined from human behavior, fMRI decoding, and transcranial magnetic stimulation (TMS) studies, as well as from monkey neurophysiology studies. Contrary to the prevalent view, the available evidence provides little support for the sensory account of VWM storage. Instead, when the ability to resist distraction and the existence of top-down feedback are taken into account, VWM-related activities in sensory areas seem to reflect feedback signals indicative of VWM storage elsewhere in the brain. Collectively, the evidence shows that prefrontal and parietal regions, rather than sensory areas, play more significant roles in VWM storage. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Top-down controlled alpha band activity in somatosensory areas determines behavioral performance in a discrimination task.

    PubMed

    Haegens, Saskia; Händel, Barbara F; Jensen, Ole

    2011-04-06

    The brain receives a rich flow of information which must be processed according to behavioral relevance. How is the state of the sensory system adjusted to up- or downregulate processing according to anticipation? We used magnetoencephalography to investigate whether prestimulus alpha band activity (8-14 Hz) reflects allocation of attentional resources in the human somatosensory system. Subjects performed a tactile discrimination task where a visual cue directed attention to their right or left hand. The strength of attentional modulation was controlled by varying the reliability of the cue in three experimental blocks (100%, 75%, or 50% valid cueing). While somatosensory prestimulus alpha power lateralized strongly with a fully predictive cue (100%), lateralization was decreased with lower cue reliability (75%) and virtually absent if the cue had no predictive value at all (50%). Importantly, alpha lateralization influenced the subjects' behavioral performance positively: both accuracy and speed of response improved with the degree of alpha lateralization. This study demonstrates that prestimulus alpha lateralization in the somatosensory system behaves similarly to posterior alpha activity observed in visual attention tasks. Our findings extend the notion that alpha band activity is involved in shaping the functional architecture of the working brain by determining both the engagement and disengagement of specific regions: the degree of anticipation modulates the alpha activity in sensory regions in a graded manner. Thus, the alpha activity is under top-down control and seems to play an important role for setting the state of sensory regions to optimize processing.
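    A common way to quantify the graded alpha lateralization described above is a normalized hemispheric difference index. The index formula is standard in the attention literature, but the power values and condition labels below are hypothetical illustrations of the reported graded pattern:

```python
def alpha_lateralization_index(power_contra, power_ipsi):
    """Normalized difference in alpha power between the hemispheres
    contralateral and ipsilateral to the attended hand; ranges -1..1,
    with more negative values indicating stronger contralateral
    alpha suppression (stronger lateralization)."""
    return (power_contra - power_ipsi) / (power_contra + power_ipsi)

# Hypothetical prestimulus alpha power (arbitrary units) per cue condition
conditions = {
    "100% valid": (0.6, 1.4),  # strong lateralization
    "75% valid": (0.8, 1.2),   # weaker lateralization
    "50% valid": (1.0, 1.0),   # no lateralization
}
for cue, (contra, ipsi) in conditions.items():
    print(cue, round(alpha_lateralization_index(contra, ipsi), 2))
```

    A graded decrease of the index magnitude across cue-reliability conditions is the qualitative signature the study reports for top-down control of somatosensory alpha.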

  17. The persistence of a visual dominance effect in a telemanipulator task: A comparison between visual and electrotactile feedback

    NASA Technical Reports Server (NTRS)

    Gaillard, J. P.

    1981-01-01

    The possibility of using electrotactile stimulation in teleoperation, and how operators interpret such information as feedback, was investigated. It is proposed that visual feedback is more informative than electrotactile feedback, and that complex electrotactile feedback slows down both the motor decision and motor response processes, is processed as an all-or-nothing signal, and bypasses the receptive structure to access working memory directly, where information is processed sequentially and treatment capacity is limited. The electrotactile stimulation is used as an alerting signal. It is suggested that the visual dominance effect results from the advantage of both a transfer function and a sensory memory register in which information is pretreated and memorized for a short time. It is found that dividing attention affects the acquisition of information but not the subsequent decision processes.

  18. Confident false memories for spatial location are mediated by V1.

    PubMed

    Karanian, Jessica M; Slotnick, Scott D

    2018-06-27

    Prior functional magnetic resonance imaging (fMRI) results suggest that true memories, but not false memories, activate early sensory cortex. It is thought that false memories, which reflect conscious processing, do not activate early sensory cortex because these regions are associated with nonconscious processing. We posited that false memories may activate the earliest visual cortical processing region (i.e., V1) when task conditions are manipulated to evoke conscious processing in this region. In an fMRI experiment, abstract shapes were presented to the left or right of fixation during encoding. During retrieval, old shapes were presented at fixation and participants characterized each shape as previously on the "left" or "right" followed by an "unsure"-"sure"-"very sure" confidence rating. False memories for spatial location (i.e., "right"/left or "left"/right trials with "sure" or "very sure" confidence ratings) were associated with activity in bilateral early visual regions, including V1. In a follow-up fMRI-guided transcranial magnetic stimulation (TMS) experiment that employed the same paradigm, we assessed whether V1 activity was necessary for false memory construction. Between the encoding phase and the retrieval phase of each run, TMS (1 Hz, 8 min) was used to target the location of false memory activity (identified in the fMRI experiment) in left V1, right V1, or the vertex (control site). Confident false memories for spatial location were significantly reduced following TMS to V1, as compared to vertex. The results of the present experiments provide convergent evidence that early sensory cortex can contribute to false memory construction under particular task conditions.

  19. Multiple Sensory-Motor Pathways Lead to Coordinated Visual Attention

    ERIC Educational Resources Information Center

    Yu, Chen; Smith, Linda B.

    2017-01-01

    Joint attention has been extensively studied in the developmental literature because of overwhelming evidence that the ability to socially coordinate visual attention to an object is essential to healthy developmental outcomes, including language learning. The goal of this study was to understand the complex system of sensory-motor behaviors that…

  20. Visual perception and imagery: a new molecular hypothesis.

    PubMed

    Bókkon, I

    2009-05-01

    Here, we put forward a redox molecular hypothesis about the natural biophysical substrate of visual perception and visual imagery. This hypothesis is based on the redox and bioluminescent processes of neuronal cells in retinotopically organized cytochrome oxidase-rich visual areas. Our hypothesis is in line with the functional roles of reactive oxygen and nitrogen species in living cells, which are not part of a haphazard process but rather of a strictly regulated mechanism used in signaling pathways. We point out that there is a direct relationship between neuronal activity and the biophoton emission process in the brain. Electrical and biochemical processes in the brain represent sensory information from the external world. During encoding or retrieval of information, electrical signals of neurons can be converted into synchronized biophoton signals by bioluminescent radical and non-radical processes. Therefore, information in the brain appears not only as an electrical (chemical) signal but also as a regulated biophoton (weak optical) signal inside neurons. During visual perception, the topological distribution of photon stimuli on the retina is represented by electrical neuronal activity in retinotopically organized visual areas. These retinotopic electrical signals in visual neurons can be converted into synchronized biophoton signals by radical and non-radical processes in retinotopically organized mitochondria-rich areas. As a result, regulated bioluminescent biophotons can create intrinsic pictures (depictive representations) in retinotopically organized cytochrome oxidase-rich visual areas during visual imagery and visual perception. Long-term visual memory is interpreted as epigenetic information regulated by free radicals and redox processes. This hypothesis does not claim to solve the secret of consciousness, but proposes that the evolution of higher levels of complexity made the intrinsic picture representation of the external visual world possible by regulated redox and bioluminescent reactions in the visual system during visual perception and visual imagery.

  1. Vision in two cyprinid fish: implications for collective behavior

    PubMed Central

    Moore, Bret A.; Tyrrell, Luke P.; Fernández-Juricic, Esteban

    2015-01-01

    Many species of fish rely on their visual systems to interact with conspecifics, and these interactions can lead to collective behavior. Individual-based models have been used to predict collective interactions; however, these models generally make simplistic assumptions about sensory systems, which are applied to different species without proper empirical testing. This could limit our ability to predict (and test empirically) collective behavior in species with very different sensory requirements. In this study, we characterized components of the visual system in two species of cyprinid fish known to engage in visually dependent collective interactions (zebrafish Danio rerio and golden shiner Notemigonus crysoleucas) and derived quantitative predictions about the positioning of individuals within schools. We found that both species had relatively narrow binocular and blind fields and wide visual coverage. However, golden shiners had more visual coverage in the vertical plane (binocular field extending behind the head) and higher visual acuity than zebrafish. The centers of acute vision (areae) of both species projected in the fronto-dorsal region of the visual field, but those of the zebrafish projected more dorsally than those of the golden shiner. Based on this visual sensory information, we predicted that: (a) predator detection time could be increased by >1,000% in zebrafish and >100% in golden shiners with an increase in nearest neighbor distance, (b) zebrafish schools would have a higher roughness value (surface area/volume ratio) than those of golden shiners, and (c) nearest neighbor distance would vary from 8 to 20 cm to visually resolve conspecific striping patterns in both species. Overall, considering between-species differences in the sensory systems of species exhibiting collective behavior could change predictions about the positioning of individuals in the group as well as the shape of the school, which has implications for group cohesion. We suggest that more effort should be invested in assessing the role of the sensory system in shaping the local interactions that drive collective behavior. PMID:26290783

  2. The role of visual deprivation and experience on the performance of sensory substitution devices.

    PubMed

    Stronks, H Christiaan; Nau, Amy C; Ibbotson, Michael R; Barnes, Nick

    2015-10-22

    It is commonly accepted that the blind can partially compensate for their loss of vision by developing enhanced abilities with their remaining senses. This visual compensation may be related to the fact that blind people rely on their other senses in everyday life. Many studies have indeed shown that experience plays an important role in visual compensation. Numerous neuroimaging studies have shown that the visual cortices of the blind are recruited by other functional brain areas and can become responsive to tactile or auditory input instead. These cross-modal plastic changes are more pronounced in the early blind compared to late blind individuals. The functional consequences of cross-modal plasticity on visual compensation in the blind are debated, as are the influences of various etiologies of vision loss (i.e., blindness acquired early or late in life). Distinguishing between the influences of experience and visual deprivation on compensation is especially relevant for rehabilitation of the blind with sensory substitution devices. The BrainPort artificial vision device and The vOICe are assistive devices for the blind that redirect visual information to another intact sensory system. Establishing how experience and different etiologies of vision loss affect the performance of these devices may help to improve existing rehabilitation strategies, formulate effective selection criteria and develop prognostic measures. In this review we will discuss studies that investigated the influence of training and visual deprivation on the performance of various sensory substitution approaches. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Do the Contents of Visual Working Memory Automatically Influence Attentional Selection During Visual Search?

    PubMed Central

    Woodman, Geoffrey F.; Luck, Steven J.

    2007-01-01

    In many theories of cognition, researchers propose that working memory and perception operate interactively. For example, in previous studies researchers have suggested that sensory inputs matching the contents of working memory will have an automatic advantage in the competition for processing resources. The authors tested this hypothesis by requiring observers to perform a visual search task while concurrently maintaining object representations in visual working memory. The hypothesis that working memory activation produces a simple but uncontrollable bias signal leads to the prediction that items matching the contents of working memory will automatically capture attention. However, no evidence for automatic attentional capture was obtained; instead, the participants avoided attending to these items. Thus, the contents of working memory can be used in a flexible manner for facilitation or inhibition of processing. PMID:17469973

  4. Do the contents of visual working memory automatically influence attentional selection during visual search?

    PubMed

    Woodman, Geoffrey F; Luck, Steven J

    2007-04-01

    In many theories of cognition, researchers propose that working memory and perception operate interactively. For example, in previous studies researchers have suggested that sensory inputs matching the contents of working memory will have an automatic advantage in the competition for processing resources. The authors tested this hypothesis by requiring observers to perform a visual search task while concurrently maintaining object representations in visual working memory. The hypothesis that working memory activation produces a simple but uncontrollable bias signal leads to the prediction that items matching the contents of working memory will automatically capture attention. However, no evidence for automatic attentional capture was obtained; instead, the participants avoided attending to these items. Thus, the contents of working memory can be used in a flexible manner for facilitation or inhibition of processing.

  5. Parallel processing in the honeybee olfactory pathway: structure, function, and evolution.

    PubMed

    Rössler, Wolfgang; Brill, Martin F

    2013-11-01

    Animals face highly complex and dynamic olfactory stimuli in their natural environments, which require fast and reliable olfactory processing. Parallel processing is a common principle of sensory systems supporting this task, for example in visual and auditory systems, but its role in olfaction remained unclear. Studies in the honeybee focused on a dual olfactory pathway. Two sets of projection neurons connect glomeruli in two antennal-lobe hemilobes via lateral and medial tracts in opposite sequence with the mushroom bodies and lateral horn. Comparative studies suggest that this dual-tract circuit represents a unique adaptation in Hymenoptera. Imaging studies indicate that glomeruli in both hemilobes receive redundant sensory input. Recent simultaneous multi-unit recordings from projection neurons of both tracts revealed widely overlapping response profiles, strongly indicating parallel olfactory processing. Whereas lateral-tract neurons respond fast with broad (generalistic) profiles, medial-tract neurons are odorant-specific and respond more slowly. In analogy to "what" and "where" subsystems in visual pathways, this suggests two parallel olfactory subsystems providing "what" (quality) and "when" (temporal) information. Temporal response properties may support across-tract coincidence coding in higher centers. Parallel olfactory processing likely enhances perception of complex odorant mixtures to decode the diverse and dynamic olfactory world of a social insect.

  6. Timescale- and Sensory Modality-Dependency of the Central Tendency of Time Perception.

    PubMed

    Murai, Yuki; Yotsumoto, Yuko

    2016-01-01

    When individuals are asked to reproduce stimulus intervals of various durations presented in intermixed order, longer intervals are often underestimated and shorter intervals overestimated. This phenomenon may be attributed to the central tendency of time perception, and suggests that our brain optimally encodes a stimulus interval based on current stimulus input and prior knowledge of the distribution of stimulus intervals. Two distinct systems are thought to be recruited in the perception of sub- and supra-second intervals. Sub-second timing is subject to local sensory processing, whereas supra-second timing depends on more centralized mechanisms. To clarify the factors that influence time perception, the present study investigated how both sensory modality and timescale affect the central tendency. In Experiment 1, participants were asked to reproduce sub- or supra-second intervals, defined by visual or auditory stimuli. In the sub-second range, the magnitude of the central tendency was significantly larger for visual intervals compared to auditory intervals, while visual and auditory intervals exhibited a correlated and comparable central tendency in the supra-second range. In Experiment 2, the ability to discriminate sub-second intervals in the reproduction task was controlled across modalities by using an interval discrimination task. Even when the ability to discriminate intervals was controlled, visual intervals exhibited a larger central tendency than auditory intervals in the sub-second range. In addition, the magnitude of the central tendency for visual and auditory sub-second intervals was significantly correlated. These results suggest that a common modality-independent mechanism is responsible for the supra-second central tendency, and that both the modality-dependent and modality-independent components of the timing system contribute to the central tendency in the sub-second range.
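
    The optimal-encoding account of the central tendency is often formalized as Bayesian cue combination: the reproduced interval is a precision-weighted average of a noisy sensory measurement and the prior mean of the interval distribution. A minimal illustrative simulation, with all parameter values assumed rather than taken from the study:

```python
import numpy as np

# Bayesian-observer sketch of the central tendency effect: each reproduction
# is a precision-weighted average of a noisy measurement and the prior mean
# of the interval distribution. All parameter values here are illustrative.

def reproduce(true_interval, prior_mean, sigma_sensory, sigma_prior, rng):
    measurement = true_interval + rng.normal(0.0, sigma_sensory)
    w = sigma_prior**2 / (sigma_prior**2 + sigma_sensory**2)  # weight on data
    return w * measurement + (1.0 - w) * prior_mean

rng = np.random.default_rng(0)
intervals = [0.4, 0.6, 0.8]  # seconds; the sampled stimulus range
prior_mean = sum(intervals) / len(intervals)
mean_estimate = {t: np.mean([reproduce(t, prior_mean, 0.1, 0.1, rng)
                             for _ in range(5000)])
                 for t in intervals}
# With equal sensory and prior noise, estimates shrink halfway toward the
# prior mean: short intervals are overestimated, long ones underestimated.
```

    In this framework, a larger sigma_sensory (e.g., for visual versus auditory stimuli in the sub-second range) increases the shrinkage, which is one way to read the larger visual central tendency the study reports.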

  7. Neural mechanisms underlying sound-induced visual motion perception: An fMRI study.

    PubMed

    Hidaka, Souta; Higuchi, Satomi; Teramoto, Wataru; Sugita, Yoichi

    2017-07-01

    Studies of crossmodal interactions in motion perception have reported activation in several brain areas, including those related to motion processing and/or sensory association, in response to multimodal (e.g., visual and auditory) stimuli that were both in motion. Recent studies have demonstrated that sounds can trigger illusory visual apparent motion to static visual stimuli (sound-induced visual motion: SIVM): A visual stimulus blinking at a fixed location is perceived to be moving laterally when an alternating left-right sound is also present. Here, we investigated brain activity related to the perception of SIVM using a 7T functional magnetic resonance imaging technique. Specifically, we focused on the patterns of neural activities in SIVM and visually induced visual apparent motion (VIVM). We observed shared activations in the middle occipital area (V5/hMT), which is thought to be involved in visual motion processing, for SIVM and VIVM. Moreover, as compared to VIVM, SIVM resulted in greater activation in the superior temporal area and dominant functional connectivity between the V5/hMT area and the areas related to auditory and crossmodal motion processing. These findings indicate that similar but partially different neural mechanisms could be involved in auditory-induced and visually induced motion perception, and that neural signals in auditory, visual, and crossmodal motion processing areas closely and directly interact in the perception of SIVM. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. ‘If you are good, I get better’: the role of social hierarchy in perceptual decision-making

    PubMed Central

    Pannunzi, Mario; Ayneto, Alba; Deco, Gustavo; Sebastián-Gallés, Nuria

    2014-01-01

    So far, it has been unclear whether social hierarchy can influence sensory or perceptual cognitive processes. We evaluated the effects of social hierarchy on these processes using a basic visual perceptual decision task. We constructed a social hierarchy where participants performed the perceptual task separately with two covertly simulated players (superior, inferior). Participants were faster (better) when performing the discrimination task with the superior player. We studied the time course of social hierarchy processing using event-related potentials and observed hierarchical effects even in early stages of sensory-perceptual processing, suggesting early top–down modulation by social hierarchy. Moreover, in a parallel analysis, we fitted a drift-diffusion model (DDM) to the results to evaluate the decision-making process of this perceptual task in the context of a social hierarchy. Consistently, the DDM pointed to nondecision time (probably perceptual encoding) as the principal period influenced by social hierarchy. PMID:23946003
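
    The drift-diffusion model decomposes reaction times into an evidence-accumulation stage and a nondecision time t0 (perceptual encoding plus motor output). A minimal simulation, with illustrative parameters rather than the study's fitted values, shows how a shorter t0 speeds responses without touching the decision process itself:

```python
import numpy as np

# Minimal drift-diffusion model: evidence starts midway between boundaries,
# drifts toward the correct bound at rate v with Gaussian noise, and the
# nondecision time t0 is added to the first-passage time. Parameters are
# illustrative, not fitted to the study's data.

def ddm_trial(rng, v=0.25, a=1.0, t0=0.35, dt=0.001, noise=1.0):
    x, t = a / 2.0, 0.0
    while 0.0 < x < a:
        x += v * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return t0 + t, x >= a  # (reaction time in s, correct response?)

rng = np.random.default_rng(1)
rt_superior = np.mean([ddm_trial(rng, t0=0.30)[0] for _ in range(1000)])
rt_inferior = np.mean([ddm_trial(rng, t0=0.40)[0] for _ in range(1000)])
# Shortening t0 shifts mean RT while leaving drift and boundary unchanged.
```

    On this reading, a superior co-player speeds perceptual encoding (t0) rather than raising the drift rate or lowering the decision boundary.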

  9. Impact of enhanced sensory input on treadmill step frequency: infants born with myelomeningocele.

    PubMed

    Pantall, Annette; Teulier, Caroline; Smith, Beth A; Moerchen, Victoria; Ulrich, Beverly D

    2011-01-01

    To determine the effect of enhanced sensory input on the step frequency of infants with myelomeningocele (MMC) when supported on a motorized treadmill. Twenty-seven infants aged 2 to 10 months with MMC lesions at, or caudal to, L1 participated. We supported infants upright on the treadmill for 2 sets of 6 trials, each 30 seconds long. Enhanced sensory inputs within each set were presented in random order and included baseline, visual flow, unloading, weights, Velcro, and friction. Overall, friction and visual flow significantly increased step rate, particularly for the older subjects. Friction and Velcro increased stance-phase duration. Enhanced sensory input had minimal effect on leg activity when infants were not stepping. Increased friction via Dycem and enhancing visual flow via a checkerboard pattern on the treadmill belt appear to be more effective than the traditional smooth black belt surface for eliciting stepping patterns in infants with MMC.

  10. Impact of Enhanced Sensory Input on Treadmill Step Frequency: Infants Born With Myelomeningocele

    PubMed Central

    Pantall, Annette; Teulier, Caroline; Smith, Beth A; Moerchen, Victoria; Ulrich, Beverly D.

    2012-01-01

    Purpose To determine the effect of enhanced sensory input on the step frequency of infants with myelomeningocele (MMC) when supported on a motorized treadmill. Methods Twenty seven infants aged 2 to 10 months with MMC lesions at or caudal to L1 participated. We supported infants upright on the treadmill for 2 sets of 6 trials, each 30s long. Enhanced sensory inputs within each set were presented in random order and included: baseline, visual flow, unloading, weights, Velcro and friction. Results Overall friction and visual flow significantly increased step rate, particularly for the older group. Friction and Velcro increased stance phase duration. Enhanced sensory input had minimal effect on leg activity when infants were not stepping. Conclusions Increased friction via Dycem and enhancing visual flow via a checkerboard pattern on the treadmill belt appear more effective than the traditional smooth black belt surface for eliciting stepping patterns in infants with MMC. PMID:21266940

  11. Evidence of a visual-to-auditory cross-modal sensory gating phenomenon as reflected by the human P50 event-related brain potential modulation.

    PubMed

    Lebib, Riadh; Papo, David; de Bode, Stella; Baudonnière, Pierre Marie

    2003-05-08

    We investigated the existence of a cross-modal sensory gating reflected by the modulation of an early electrophysiological index, the P50 component. We analyzed event-related brain potentials elicited by audiovisual speech stimuli manipulated along two dimensions: congruency and discriminability. The results showed that the P50 was attenuated when visual and auditory speech information were redundant (i.e. congruent), in comparison with this same event-related potential component elicited with discrepant audiovisual dubbing. When hard to discriminate, however, bimodal incongruent speech stimuli elicited a similar pattern of P50 attenuation. We concluded that a visual-to-auditory cross-modal sensory gating phenomenon exists. These results corroborate previous findings revealing a very early audiovisual interaction during speech perception. Finally, we postulated that the sensory gating system includes a cross-modal dimension.

  12. Differential Effects of Motor Efference Copies and Proprioceptive Information on Response Evaluation Processes

    PubMed Central

    Stock, Ann-Kathrin; Wascher, Edmund; Beste, Christian

    2013-01-01

    It is well-known that sensory information influences the way we execute motor responses. However, less is known about whether and how sensory and motor information are integrated in the subsequent process of response evaluation. We used a modified Simon Task to investigate how these streams of information are integrated in response evaluation processes, applying an in-depth neurophysiological analysis of event-related potentials (ERPs), time-frequency decomposition and sLORETA. The results show that response evaluation processes are differentially modulated by afferent proprioceptive information and efference copies. While the influence of proprioceptive information is mediated via oscillations in different frequency bands, efference copy based information about the motor execution is specifically mediated via oscillations in the theta frequency band. Stages of visual perception and attention were not modulated by the interaction of proprioception and motor efference copies. Brain areas modulated by the interactive effects of proprioceptive and efference copy based information included the middle frontal gyrus and the supplementary motor area (SMA), suggesting that these areas integrate sensory information for the purpose of response evaluation. The results show how motor response evaluation processes are modulated by information about both the execution and the location of a response. PMID:23658624

  13. Real-Time Cognitive Computing Architecture for Data Fusion in a Dynamic Environment

    NASA Technical Reports Server (NTRS)

    Duong, Tuan A.; Duong, Vu A.

    2012-01-01

    A novel cognitive computing architecture is conceptualized for processing multiple channels of multi-modal sensory data streams simultaneously, and fusing the information in real time to generate intelligent reaction sequences. This unique architecture is capable of assimilating parallel data streams that could be analog, digital, or synchronous/asynchronous, and could be programmed to act as a knowledge synthesizer and/or an "intelligent perception" processor. In this architecture, the bio-inspired models of visual pathway and olfactory receptor processing are combined as processing components, to achieve the composite function of "searching for a source of food while avoiding the predator." The architecture is particularly suited for scene analysis from visual and odorant data.

  14. A Computational Model for Aperture Control in Reach-to-Grasp Movement Based on Predictive Variability

    PubMed Central

    Takemura, Naohiro; Fukui, Takao; Inui, Toshio

    2015-01-01

    In human reach-to-grasp movement, visual occlusion of a target object leads to a larger peak grip aperture compared to conditions where online vision is available. However, no previous computational and neural network models for reach-to-grasp movement explain the mechanism of this effect. We simulated the effect of online vision on the reach-to-grasp movement by proposing a computational control model based on the hypothesis that the grip aperture is controlled to compensate for both motor variability and sensory uncertainty. In this model, the aperture is formed to achieve a target aperture size that is sufficiently large to accommodate the actual target; it also includes a margin to ensure proper grasping despite sensory and motor variability. To this end, the model considers: (i) the variability of the grip aperture, which is predicted by the Kalman filter, and (ii) the uncertainty of the object size, which is affected by visual noise. Using this model, we simulated experiments in which the effect of the duration of visual occlusion was investigated. The simulation replicated the experimental result wherein the peak grip aperture increased when the target object was occluded, especially in the early phase of the movement. Both predicted motor variability and sensory uncertainty play important roles in the online visuomotor process responsible for grip aperture control. PMID:26696874
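
    The control idea can be caricatured in a few lines: the planned aperture equals the estimated object size plus a safety margin that grows with total uncertainty. In the paper, the motor term comes from a Kalman-filter prediction of aperture variability; here both terms are collapsed into fixed standard deviations, and the margin rule (k standard deviations) and all numeric values are assumptions:

```python
import numpy as np

# Sketch of aperture planning as "estimated size + safety margin". The
# margin scales with the combined (independent) motor and visual
# uncertainties; k and all sigma values are illustrative assumptions.

def target_aperture(object_size_cm, sigma_motor, sigma_visual, k=2.0):
    sigma_total = np.hypot(sigma_motor, sigma_visual)  # independent sources add in quadrature
    return object_size_cm + k * sigma_total

aperture_online = target_aperture(6.0, sigma_motor=0.4, sigma_visual=0.2)
aperture_occluded = target_aperture(6.0, sigma_motor=0.4, sigma_visual=0.8)
# Occlusion inflates visual uncertainty, so the planned aperture widens,
# mirroring the larger peak grip aperture observed without online vision.
```

    Under this rule, anything that raises either uncertainty term (occlusion, noisy proprioception, early-movement motor variability) pushes the grip wider.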

  15. Escape from harm: linking affective vision and motor responses during active avoidance

    PubMed Central

    Keil, Andreas

    2014-01-01

    When organisms confront unpleasant objects in their natural environments, they engage in behaviors that allow them to avoid aversive outcomes. Here, we linked visual processing of threat to its behavioral consequences by including a motor response that terminated exposure to an aversive event. Dense-array steady-state visual evoked potentials were recorded in response to conditioned threat and safety signals viewed in active or passive behavioral contexts. The amplitude of neuronal responses in visual cortex increased additively, as a function of emotional value and action relevance. The gain in local cortical population activity for threat relative to safety cues persisted when aversive reinforcement was behaviorally terminated, suggesting a lingering emotionally based response amplification within the visual system. Distinct patterns of long-range neural synchrony emerged between the visual cortex and extravisual regions. Increased coupling between visual and higher-order structures was observed specifically during active perception of threat, consistent with a reorganization of neuronal populations involved in linking sensory processing to action preparation. PMID:24493849

  16. Behavioural system identification of visual flight speed control in Drosophila melanogaster

    PubMed Central

    Rohrseitz, Nicola; Fry, Steven N.

    2011-01-01

    Behavioural control in many animals involves complex mechanisms with intricate sensory-motor feedback loops. Modelling allows functional aspects to be captured without relying on a description of the underlying complex, and often unknown, mechanisms. A wide range of engineering techniques are available for modelling, but their ability to describe time-continuous processes is rarely exploited to describe sensory-motor control mechanisms in biological systems. We performed a system identification of visual flight speed control in the fruitfly Drosophila, based on an extensive dataset of open-loop responses previously measured under free flight conditions. We identified a second-order under-damped control model with just six free parameters that well describes both the transient and steady-state characteristics of the open-loop data. We then used the identified control model to predict flight speed responses after a visual perturbation under closed-loop conditions and validated the model with behavioural measurements performed in free-flying flies under the same closed-loop conditions. Our system identification of the fruitfly's flight speed response uncovers the high-level control strategy of a fundamental flight control reflex without depending on assumptions about the underlying physiological mechanisms. The results are relevant for future investigations of the underlying neuromotor processing mechanisms, as well as for the design of biomimetic robots, such as micro-air vehicles. PMID:20525744

  17. Behavioural system identification of visual flight speed control in Drosophila melanogaster.

    PubMed

    Rohrseitz, Nicola; Fry, Steven N

    2011-02-06

    Behavioural control in many animals involves complex mechanisms with intricate sensory-motor feedback loops. Modelling allows functional aspects to be captured without relying on a description of the underlying complex, and often unknown, mechanisms. A wide range of engineering techniques are available for modelling, but their ability to describe time-continuous processes is rarely exploited to describe sensory-motor control mechanisms in biological systems. We performed a system identification of visual flight speed control in the fruitfly Drosophila, based on an extensive dataset of open-loop responses previously measured under free flight conditions. We identified a second-order under-damped control model with just six free parameters that well describes both the transient and steady-state characteristics of the open-loop data. We then used the identified control model to predict flight speed responses after a visual perturbation under closed-loop conditions and validated the model with behavioural measurements performed in free-flying flies under the same closed-loop conditions. Our system identification of the fruitfly's flight speed response uncovers the high-level control strategy of a fundamental flight control reflex without depending on assumptions about the underlying physiological mechanisms. The results are relevant for future investigations of the underlying neuromotor processing mechanisms, as well as for the design of biomimetic robots, such as micro-air vehicles.
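
    A second-order under-damped model of the kind identified here maps a step change in the visual input (commanded speed) onto a response that overshoots and rings before settling. The sketch below integrates the standard form y'' + 2·zeta·wn·y' + wn²·y = K·wn²·u; the gain, natural frequency, and damping ratio are illustrative, not the six fitted parameters from the paper:

```python
import numpy as np

# Unit-step response of a second-order under-damped system,
#   y'' + 2*zeta*wn*y' + wn**2*y = K*wn**2*u,
# integrated with semi-implicit Euler. Parameter values are illustrative.

def step_response(K=1.0, wn=2.0, zeta=0.4, t_end=10.0, dt=0.01):
    n = int(t_end / dt)
    y, v = 0.0, 0.0
    trace = np.empty(n)
    for i in range(n):
        trace[i] = y
        accel = K * wn**2 * 1.0 - 2.0 * zeta * wn * v - wn**2 * y
        v += accel * dt  # update velocity first (semi-implicit Euler)
        y += v * dt
    return trace

y = step_response()
# zeta < 1 gives the under-damped signature: overshoot, then decay to K.
```

    Once identified from open-loop data, such a model can be closed through a simulated visual feedback loop to predict responses to perturbations, which is how the authors validated it against free-flight measurements.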

  18. Modality-independent coding of spatial layout in the human brain

    PubMed Central

    Wolbers, Thomas; Klatzky, Roberta L.; Loomis, Jack M.; Wutte, Magdalena G.; Giudice, Nicholas A.

    2011-01-01

    Summary In many non-human species, neural computations of navigational information such as position and orientation are not tied to a specific sensory modality [1, 2]. Rather, spatial signals are integrated from multiple input sources, likely leading to abstract representations of space. In contrast, the potential for abstract spatial representations in humans is not known, as most neuroscientific experiments on human navigation have focused exclusively on visual cues. Here, we tested the modality independence hypothesis with two fMRI experiments that characterized computations in regions implicated in processing spatial layout [3]. According to the hypothesis, such regions should be recruited for spatial computation of 3-D geometric configuration, independent of a specific sensory modality. In support of this view, sighted participants showed strong activation of the parahippocampal place area (PPA) and the retrosplenial cortex (RSC) for visual and haptic exploration of information-matched scenes but not objects. Functional connectivity analyses suggested that these effects were not related to visual recoding, which was further supported by a similar preference for haptic scenes found with blind participants. Taken together, these findings establish the PPA/RSC network as critical in modality-independent spatial computations and provide important evidence for a theory of high-level abstract spatial information processing in the human brain. PMID:21620708

  19. Occipital GABA correlates with cognitive failures in daily life.

    PubMed

    Sandberg, Kristian; Blicher, Jakob Udby; Dong, Mia Yuan; Rees, Geraint; Near, Jamie; Kanai, Ryota

    2014-02-15

    The brain has limited capacity, and so selective attention enhances relevant incoming information while suppressing irrelevant information. This process is not always successful, and the frequency of such cognitive failures varies to a large extent between individuals. Here we hypothesised that individual differences in cognitive failures might be reflected in inhibitory processing in the sensory cortex. To test this hypothesis, we measured GABA in human visual cortex using MR spectroscopy and found a negative correlation between occipital GABA (GABA+/Cr ratio) and cognitive failures as measured by an established cognitive failures questionnaire (CFQ). For a second site in parietal cortex, no correlation between CFQ score and GABA+/Cr ratio was found, thus establishing the regional specificity of the link between occipital GABA and cognitive failures. We further found that grey matter volume in the left superior parietal lobule (SPL) correlated with cognitive failures independently from the impact of occipital GABA and together, occipital GABA and SPL grey matter volume statistically explained around 50% of the individual variability in daily cognitive failures. We speculate that the amount of GABA in sensory areas may reflect the potential capacity to selectively suppress irrelevant information already at the sensory level, or alternatively that GABA influences the specificity of neural representations in visual cortex thus improving the effectiveness of successful attentional modulation. © 2013. Published by Elsevier Inc. All rights reserved.
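The claim that occipital GABA and SPL grey matter volume together "statistically explained around 50%" of the variability corresponds to the R² of a two-predictor linear regression. A minimal sketch with synthetic stand-in data (all variable names and values are hypothetical, not the study's measurements):

```python
import numpy as np

def r_squared(X, y):
    """Proportion of variance in y explained by a least-squares fit
    on the columns of X, with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

# Synthetic stand-ins for GABA+/Cr ratio and SPL grey matter volume:
rng = np.random.default_rng(2)
gaba = rng.standard_normal(60)
spl = rng.standard_normal(60)
cfq = -0.7 * gaba + 0.7 * spl + rng.standard_normal(60)  # simulated CFQ scores

X = np.column_stack([gaba, spl])
print(round(r_squared(X, cfq), 2))  # roughly half the variance, by construction
```

The synthetic effect sizes are tuned so the two predictors jointly account for about half the variance, mirroring the structure (not the data) of the reported result.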

  20. Effects of visual span on reading speed and parafoveal processing in eye movements during sentence reading.

    PubMed

    Risse, Sarah

    2014-07-15

The visual span (or “uncrowded window”), which limits the sensory information on each fixation, has been shown to determine reading speed in tasks involving rapid serial visual presentation of single words. The present study investigated whether this is also true for fixation durations during sentence reading when all words are presented at the same time and parafoveal preview of words prior to fixation typically reduces later word-recognition times. If so, a larger visual span may allow more efficient parafoveal processing and thus faster reading. In order to test this hypothesis, visual span profiles (VSPs) were collected from 60 participants and related to data from an eye-tracking reading experiment. The results confirmed a positive relationship between the readers’ VSPs and fixation-based reading speed. However, this relationship was not determined by parafoveal processing. There was no evidence that individual differences in VSPs predicted differences in parafoveal preview benefit. Nevertheless, preview benefit correlated with reading speed, suggesting an independent effect on oculomotor control during reading. In summary, the present results indicate a more complex relationship between the visual span, parafoveal processing, and reading speed than initially assumed. © 2014 ARVO.

  1. Effects of aging and sensory loss on glial cells in mouse visual and auditory cortices.

    PubMed

    Tremblay, Marie-Ève; Zettel, Martha L; Ison, James R; Allen, Paul D; Majewska, Ania K

    2012-04-01

Normal aging is often accompanied by a progressive loss of receptor sensitivity in hearing and vision, whose consequences on cellular function in cortical sensory areas have remained largely unknown. By examining the primary auditory (A1) and visual (V1) cortices in two inbred strains of mice undergoing either age-related loss of audition (C57BL/6J) or vision (CBA/CaJ), we were able to describe cellular and subcellular changes that were associated with normal aging (occurring in A1 and V1 of both strains) or specifically with age-related sensory loss (only in A1 of C57BL/6J or V1 of CBA/CaJ), using immunocytochemical electron microscopy and light microscopy. While the changes were subtle in neurons, glial cells and especially microglia were transformed in aged animals. Microglia became more numerous and irregularly distributed, displayed more variable cell body and process morphologies, occupied smaller territories, and accumulated phagocytic inclusions that often displayed ultrastructural features of synaptic elements. Additionally, evidence of myelination defects was observed, and aged oligodendrocytes became more numerous and were more often encountered in contiguous pairs. Most of these effects were profoundly exacerbated by age-related sensory loss. Together, our results suggest that the age-related alteration of glial cells in sensory cortical areas can be accelerated by activity-driven central mechanisms that result from an age-related loss of peripheral sensitivity. In light of our observations, these age-related changes in sensory function should be considered when investigating cellular, cortical, and behavioral functions throughout the lifespan in these commonly used C57BL/6J and CBA/CaJ mouse models. Copyright © 2012 Wiley Periodicals, Inc.

  2. Effects of aging and sensory loss on glial cells in mouse visual and auditory cortices

    PubMed Central

    Tremblay, Marie-Ève; Zettel, Martha L.; Ison, James R.; Allen, Paul D.; Majewska, Ania K.

    2011-01-01

Normal aging is often accompanied by a progressive loss of receptor sensitivity in hearing and vision, whose consequences on cellular function in cortical sensory areas have remained largely unknown. By examining the primary auditory (A1) and visual (V1) cortices in two inbred strains of mice undergoing either age-related loss of audition (C57BL/6J) or vision (CBA/CaJ), we were able to describe cellular and subcellular changes that were associated with normal aging (occurring in A1 and V1 of both strains) or specifically with age-related sensory loss (only in A1 of C57BL/6J or V1 of CBA/CaJ), using immunocytochemical electron microscopy and light microscopy. While the changes were subtle in neurons, glial cells and especially microglia were transformed in aged animals. Microglia became more numerous and irregularly distributed, displayed more variable cell body and process morphologies, occupied smaller territories, and accumulated phagocytic inclusions that often displayed ultrastructural features of synaptic elements. Additionally, evidence of myelination defects was observed, and aged oligodendrocytes became more numerous and were more often encountered in contiguous pairs. Most of these effects were profoundly exacerbated by age-related sensory loss. Together, our results suggest that the age-related alteration of glial cells in sensory cortical areas can be accelerated by activity-driven central mechanisms that result from an age-related loss of peripheral sensitivity. In light of our observations, these age-related changes in sensory function should be considered when investigating cellular, cortical and behavioral functions throughout the lifespan in these commonly used C57BL/6J and CBA/CaJ mouse models. PMID:22223464

  3. Combined mirror visual and auditory feedback therapy for upper limb phantom pain: a case report

    PubMed Central

    2011-01-01

Introduction Phantom limb sensation and phantom limb pain are very common after amputations. In recent years, accumulating data have implicated 'mirror visual feedback', or 'mirror therapy', as helpful in the treatment of phantom limb sensation and phantom limb pain. Case presentation We present the case of a 24-year-old Caucasian man, a left upper limb amputee, treated with mirror visual feedback combined with auditory feedback, with improved pain relief. Conclusion This case suggests that auditory feedback might enhance the effectiveness of mirror visual feedback and serve as a valuable addition to the complex multi-sensory processing of body perception in patients who are amputees. PMID:21272334

  4. Evaluation of Sensory Aids for the Visually Handicapped.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC.

    Presented are 11 papers given at a conference on the evaluation of sensory aids for the visually handicapped which emphasized mobility and reading aids beginning to be tested and distributed widely. Many of the presentations are by the principal developers or advocates of the aids. Introductory readings compare the role of evaluation in the…

  5. Origins of task-specific sensory-independent organization in the visual and auditory brain: neuroscience evidence, open questions and clinical implications.

    PubMed

    Heimler, Benedetta; Striem-Amit, Ella; Amedi, Amir

    2015-12-01

    Evidence of task-specific sensory-independent (TSSI) plasticity from blind and deaf populations has led to a better understanding of brain organization. However, the principles determining the origins of this plasticity remain unclear. We review recent data suggesting that a combination of the connectivity bias and sensitivity to task-distinctive features might account for TSSI plasticity in the sensory cortices as a whole, from the higher-order occipital/temporal cortices to the primary sensory cortices. We discuss current theories and evidence, open questions and related predictions. Finally, given the rapid progress in visual and auditory restoration techniques, we address the crucial need to develop effective rehabilitation approaches for sensory recovery. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. Electrophysiological assessments of cognition and sensory processing in TBI: applications for diagnosis, prognosis and rehabilitation.

    PubMed

    Folmer, Robert L; Billings, Curtis J; Diedesch-Rouse, Anna C; Gallun, Frederick J; Lew, Henry L

    2011-10-01

    Traumatic brain injuries are often associated with damage to sensory and cognitive processing pathways. Because evoked potentials (EPs) and event-related potentials (ERPs) are generated by neuronal activity, they are useful for assessing the integrity of neural processing capabilities in patients with traumatic brain injury (TBI). This review of somatosensory, auditory and visual ERPs in assessments of TBI patients is provided with the hope that it will be of interest to clinicians and researchers who conduct or interpret electrophysiological evaluations of this population. Because this article reviews ERP studies conducted in three different sensory modalities, involving patients with a wide range of TBI severity ratings and circumstances, it is difficult to provide a coherent summary of findings. However, some general trends emerge that give rise to the following observations and recommendations: 1) bilateral absence of somatosensory evoked potentials (SEPs) is often associated with poor clinical prognosis and outcome; 2) the presence of normal ERPs does not guarantee favorable outcome; 3) ERPs evoked by a variety of sensory stimuli should be used to evaluate TBI patients, especially those with severe injuries; 4) time since onset of injury should be taken into account when conducting ERP evaluations of TBI patients or interpreting results; 5) because sensory deficits (e.g., vision impairment or hearing loss) affect ERP results, tests of peripheral sensory integrity should be conducted in conjunction with ERP recordings; and 6) patients' state of consciousness, physical and cognitive abilities to respond and follow directions should be considered when conducting or interpreting ERP evaluations. Published by Elsevier B.V.

  7. Role of orientation reference selection in motion sickness

    NASA Technical Reports Server (NTRS)

    Peterka, Robert J.; Black, F. Owen

    1988-01-01

    Previous experiments with moving platform posturography have shown that different people have varying abilities to resolve conflicts among vestibular, visual, and proprioceptive sensory signals used to control upright posture. In particular, there is one class of subjects with a vestibular disorder known as benign paroxysmal positional vertigo (BPPV) who often are particularly sensitive to inaccurate visual information. That is, they will use visual sensory information for the control of their posture even when that visual information is inaccurate and is in conflict with accurate proprioceptive and vestibular sensory signals. BPPV has been associated with disorders of both posterior semicircular canal function and possibly otolith function. The present proposal hopes to take advantage of the similarities between the space motion sickness problem and the sensory orientation reference selection problems associated with the BPPV syndrome. These similarities include both etiology related to abnormal vertical canal-otolith function, and motion sickness initiating events provoked by pitch and roll head movements. The objectives of this proposal are to explore and quantify the orientation reference selection abilities of subjects and the relation of this selection to motion sickness in humans.

  8. Evolution of crossmodal reorganization of the voice area in cochlear-implanted deaf patients.

    PubMed

    Rouger, Julien; Lagleyre, Sébastien; Démonet, Jean-François; Fraysse, Bernard; Deguine, Olivier; Barone, Pascal

    2012-08-01

    Psychophysical and neuroimaging studies in both animal and human subjects have clearly demonstrated that cortical plasticity following sensory deprivation leads to a brain functional reorganization that favors the spared modalities. In postlingually deaf patients, the use of a cochlear implant (CI) allows a recovery of the auditory function, which will probably counteract the cortical crossmodal reorganization induced by hearing loss. To study the dynamics of such reversed crossmodal plasticity, we designed a longitudinal neuroimaging study involving the follow-up of 10 postlingually deaf adult CI users engaged in a visual speechreading task. While speechreading activates Broca's area in normally hearing subjects (NHS), the activity level elicited in this region in CI patients is abnormally low and increases progressively with post-implantation time. Furthermore, speechreading in CI patients induces abnormal crossmodal activations in right anterior regions of the superior temporal cortex normally devoted to processing human voice stimuli (temporal voice-sensitive areas-TVA). These abnormal activity levels diminish with post-implantation time and tend towards the levels observed in NHS. First, our study revealed that the neuroplasticity after cochlear implantation involves not only auditory but also visual and audiovisual speech processing networks. Second, our results suggest that during deafness, the functional links between cortical regions specialized in face and voice processing are reallocated to support speech-related visual processing through cross-modal reorganization. Such reorganization allows a more efficient audiovisual integration of speech after cochlear implantation. These compensatory sensory strategies are later completed by the progressive restoration of the visuo-audio-motor speech processing loop, including Broca's area. Copyright © 2011 Wiley Periodicals, Inc.

  9. Learning enhances the relative impact of top-down processing in the visual cortex

    PubMed Central

    Makino, Hiroshi; Komiyama, Takaki

    2015-01-01

    Theories have proposed that in sensory cortices learning can enhance top-down modulation by higher brain areas while reducing bottom-up sensory inputs. To address circuit mechanisms underlying this process, we examined the activity of layer 2/3 (L2/3) excitatory neurons in the mouse primary visual cortex (V1) as well as L4 neurons, the main bottom-up source, and long-range top-down projections from the retrosplenial cortex (RSC) during associative learning over days using chronic two-photon calcium imaging. During learning, L4 responses gradually weakened, while RSC inputs became stronger. Furthermore, L2/3 acquired a ramp-up response temporal profile with learning, coinciding with a similar change in RSC inputs. Learning also reduced the activity of somatostatin-expressing inhibitory neurons (SOM-INs) in V1 that could potentially gate top-down inputs. Finally, RSC inactivation or SOM-IN activation was sufficient to partially reverse the learning-induced changes in L2/3. Together, these results reveal a learning-dependent dynamic shift in the balance between bottom-up and top-down information streams and uncover a role of SOM-INs in controlling this process. PMID:26167904

  10. Sensory-based expert monitoring and control

    NASA Astrophysics Data System (ADS)

    Yen, Gary G.

    1999-03-01

    Field operators use their eyes, ears, and nose to detect process behavior and to trigger corrective control actions. For instance, in daily practice an experienced operator in the sulfuric acid treatment of phosphate rock may observe froth color or bubble character to control process material in-flow, or may use the acoustic sound of cavitation or boiling/flashing to increase or decrease material flow rates in tank levels. By contrast, process control computers continue to be limited to taking action on P, T, F, and A signals. Yet there is sufficient evidence from the field that visual and acoustic information can be used for control and identification. Smart in-situ sensors have provided a potential mechanism for factory automation with promising industrial applicability. In response to these critical needs, a generic, structured health monitoring approach is proposed. The system assumes that a given sensor suite will act as an on-line health usage monitor and, at best, provide real-time control autonomy. The sensor suite can incorporate various types of sensory devices, from vibration accelerometers, directional microphones, machine vision CCDs, and pressure gauges to temperature indicators. The decision can be shown on a visual on-board display or fed to the control block to invoke controller reconfiguration.

  11. Retention interval affects visual short-term memory encoding.

    PubMed

    Bankó, Eva M; Vidnyánszky, Zoltán

    2010-03-01

    Humans can efficiently store fine-detailed facial emotional information in visual short-term memory for several seconds. However, an unresolved question is whether the same neural mechanisms underlie high-fidelity short-term memory for emotional expressions at different retention intervals. Here we show that retention interval affects the neural processes of short-term memory encoding using a delayed facial emotion discrimination task. The early sensory P100 component of the event-related potentials (ERP) was larger in the 1-s interstimulus interval (ISI) condition than in the 6-s ISI condition, whereas the face-specific N170 component was larger in the longer ISI condition. Furthermore, the memory-related late P3b component of the ERP responses was also modulated by retention interval: it was reduced in the 1-s ISI as compared with the 6-s condition. The present findings cannot be explained based on differences in sensory processing demands or overall task difficulty because there was no difference in the stimulus information and subjects' performance between the two different ISI conditions. These results reveal that encoding processes underlying high-precision short-term memory for facial emotional expressions are modulated depending on whether information has to be stored for one or for several seconds.

  12. Brainstem origins for cortical 'what' and 'where' pathways in the auditory system.

    PubMed

    Kraus, Nina; Nicol, Trent

    2005-04-01

    We have developed a data-driven conceptual framework that links two areas of science: the source-filter model of acoustics and cortical sensory processing streams. The source-filter model describes the mechanics behind speech production: the identity of the speaker is carried largely in the vocal cord source and the message is shaped by the ever-changing filters of the vocal tract. Sensory processing streams, popularly called 'what' and 'where' pathways, are well established in the visual system as a neural scheme for separately carrying different facets of visual objects, namely their identity and their position/motion, to the cortex. A similar functional organization has been postulated in the auditory system. Both speaker identity and the spoken message, which are simultaneously conveyed in the acoustic structure of speech, can be disentangled into discrete brainstem response components. We argue that these two response classes are early manifestations of auditory 'what' and 'where' streams in the cortex. This brainstem link forges a new understanding of the relationship between the acoustics of speech and cortical processing streams, unites two hitherto separate areas in science, and provides a model for future investigations of auditory function.

  13. A measure for assessing the effects of audiovisual speech integration.

    PubMed

    Altieri, Nicholas; Townsend, James T; Wenger, Michael J

    2014-06-01

    We propose a measure of audiovisual speech integration that takes into account accuracy and response times. This measure should prove beneficial for researchers investigating multisensory speech recognition, since it relates to normal-hearing and aging populations. As an example, age-related sensory decline influences both the rate at which one processes information and the ability to utilize cues from different sensory modalities. Our function assesses integration when both auditory and visual information are available, by comparing performance on these audiovisual trials with theoretical predictions for performance under the assumptions of parallel, independent self-terminating processing of single-modality inputs. We provide example data from an audiovisual identification experiment and discuss applications for measuring audiovisual integration skills across the life span.
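The benchmark of "parallel, independent self-terminating processing of single-modality inputs" is commonly formalized as a race-model prediction built from the unimodal response-time distributions: the predicted audiovisual CDF is F_AV(t) = F_A(t) + F_V(t) − F_A(t)·F_V(t). A minimal sketch of that prediction (the data and function name below are illustrative, not the authors' measure, which also incorporates accuracy):

```python
import numpy as np

def race_model_bound(rt_a, rt_v, t):
    """Predicted audiovisual CDF at times t under parallel, independent,
    self-terminating processing of the auditory and visual channels."""
    fa = np.mean(np.asarray(rt_a)[:, None] <= t, axis=0)  # empirical CDF, auditory
    fv = np.mean(np.asarray(rt_v)[:, None] <= t, axis=0)  # empirical CDF, visual
    return fa + fv - fa * fv

# Hypothetical unimodal response times (seconds), for illustration only:
rng = np.random.default_rng(0)
rt_a = rng.normal(0.50, 0.05, 200)
rt_v = rng.normal(0.55, 0.05, 200)
t = np.linspace(0.3, 0.8, 6)

pred = race_model_bound(rt_a, rt_v, t)
print(np.all(np.diff(pred) >= 0))  # a CDF is non-decreasing
```

Observed audiovisual performance that exceeds this prediction indicates integration beyond what independent parallel channels can produce.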

  14. Effects of microgravity on vestibular development and function in rats: genetics and environment

    NASA Technical Reports Server (NTRS)

    Ronca, A. E.; Fritzsch, B.; Alberts, J. R.; Bruce, L. L.

    2000-01-01

    Our anatomical and behavioral studies of embryonic rats that developed in microgravity suggest that the vestibular sensory system, like the visual system, has genetically mediated processes of development that establish crude connections between the periphery and the brain. Environmental stimuli also regulate connection formation including terminal branch formation and fine-tuning of synaptic contacts. Axons of vestibular sensory neurons from gravistatic as well as linear acceleration receptors reach their targets in both microgravity and normal gravity, suggesting that this is a genetically regulated component of development. However, microgravity exposure delays the development of terminal branches and synapses in gravistatic but not linear acceleration-sensitive neurons and also produces behavioral changes. These latter changes reflect environmentally controlled processes of development.

  15. The interaction of Bayesian priors and sensory data and its neural circuit implementation in visually-guided movement

    PubMed Central

    Yang, Jin; Lee, Joonyeol; Lisberger, Stephen G.

    2012-01-01

    Sensory-motor behavior results from a complex interaction of noisy sensory data with priors based on recent experience. By varying the stimulus form and contrast for the initiation of smooth pursuit eye movements in monkeys, we show that visual motion inputs compete with two independent priors: one prior biases eye speed toward zero; the other prior attracts eye direction according to the past several days’ history of target directions. The priors bias the speed and direction of the initiation of pursuit for the weak sensory data provided by the motion of a low-contrast sine wave grating. However, the priors have relatively little effect on pursuit speed and direction when the visual stimulus arises from the coherent motion of a high-contrast patch of dots. For any given stimulus form, the mean and variance of eye speed co-vary in the initiation of pursuit, as expected for signal-dependent noise. This relationship suggests that pursuit implements a trade-off between movement accuracy and variation, reducing both when the sensory signals are noisy. The tradeoff is implemented as a competition of sensory data and priors that follows the rules of Bayesian estimation. Computer simulations show that the priors can be understood as direction specific control of the strength of visual-motor transmission, and can be implemented in a neural-network model that makes testable predictions about the population response in the smooth eye movement region of the frontal eye fields. PMID:23223286
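The competition between noisy sensory data and priors described above follows the standard rules of Gaussian Bayesian estimation: the posterior mean is a precision-weighted average of the prior mean and the sensory measurement. A minimal sketch (all numbers are illustrative, not the paper's data):

```python
def combine_gaussian(prior_mean, prior_var, sense_mean, sense_var):
    """Posterior mean for a Gaussian prior combined with a Gaussian
    sensory likelihood: a precision-weighted average of the two."""
    wp = 1.0 / prior_var   # precision of the prior
    ws = 1.0 / sense_var   # precision of the sensory evidence
    return (wp * prior_mean + ws * sense_mean) / (wp + ws)

# The speed prior biases estimates toward zero; target moves at 20 deg/s.
prior_mean, prior_var = 0.0, 4.0
target_speed = 20.0

# Low contrast ~ noisy sensory data (large variance); high contrast ~ reliable data.
low_contrast = combine_gaussian(prior_mean, prior_var, target_speed, sense_var=16.0)
high_contrast = combine_gaussian(prior_mean, prior_var, target_speed, sense_var=1.0)
print(low_contrast, high_contrast)  # noisy input is pulled harder toward the prior
```

As in the pursuit data, the weaker (noisier) sensory signal yields an estimate biased toward the prior, while reliable sensory data dominate.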

  16. Axonal Conduction Delays, Brain State, and Corticogeniculate Communication.

    PubMed

    Stoelzel, Carl R; Bereshpolova, Yulia; Alonso, Jose-Manuel; Swadlow, Harvey A

    2017-06-28

    Thalamocortical conduction times are short, but layer 6 corticothalamic axons display an enormous range of conduction times, some exceeding 40-50 ms. Here, we investigate (1) how axonal conduction times of corticogeniculate (CG) neurons are related to the visual information conveyed to the thalamus, and (2) how alert versus nonalert awake brain states affect visual processing across the spectrum of CG conduction times. In awake female Dutch-Belted rabbits, we found 58% of CG neurons to be visually responsive, and 42% to be unresponsive. All responsive CG neurons had simple, orientation-selective receptive fields, and generated sustained responses to stationary stimuli. CG axonal conduction times were strongly related to modulated firing rates (F1 values) generated by drifting grating stimuli, and their associated interspike interval distributions, suggesting a continuum of visual responsiveness spanning the spectrum of axonal conduction times. CG conduction times were also significantly related to visual response latency, contrast sensitivity (C-50 values), directional selectivity, and optimal stimulus velocity. Increasing alertness did not cause visually unresponsive CG neurons to become responsive and did not change the response linearity (F1/F0 ratios) of visually responsive CG neurons. However, for visually responsive CG neurons, increased alertness nearly doubled the modulated response amplitude to optimal visual stimulation (F1 values), significantly shortened response latency, and dramatically increased response reliability. These effects of alertness were uniform across the broad spectrum of CG axonal conduction times. SIGNIFICANCE STATEMENT Corticothalamic neurons of layer 6 send a dense feedback projection to thalamic nuclei that provide input to sensory neocortex. While sensory information reaches the cortex after brief thalamocortical axonal delays, corticothalamic axons can exhibit conduction delays of <2 ms to 40-50 ms. 
Here, in the corticogeniculate visual system of awake rabbits, we investigate the functional significance of this axonal diversity, and the effects of shifting alert/nonalert brain states on corticogeniculate processing. We show that axonal conduction times are strongly related to multiple visual response properties, suggesting a continuum of visual responsiveness spanning the spectrum of corticogeniculate axonal conduction times. We also show that transitions between awake brain states powerfully affect corticogeniculate processing, in some ways more strongly than in layer 4. Copyright © 2017 the authors 0270-6474/17/376342-17$15.00/0.

  17. Effects of lifetime occupational pesticide exposure on postural control among farmworkers and non-farmworkers

    PubMed Central

    Sunwook, Kim; Nussbaum, Maury A.; Quandt, Sara A.; Laurienti, Paul J.; Arcury, Thomas A.

    2015-01-01

    Objective Assess potential chronic effects of pesticide exposure on postural control by examining the postural balance of farmworkers and non-farmworkers with diverse self-reported lifetime exposures. Methods Balance was assessed during quiet upright stance under four experimental conditions (2 visual × 2 cognitive difficulty). Results Significant differences in baseline balance performance (eyes open without cognitive task) between occupational groups were apparent in postural sway complexity. When adding a cognitive task to the eyes-open condition, the influence of lifetime exposure on complexity ratios appeared different between occupational groups. Removing visual information revealed a negative association of lifetime exposure with complexity ratios. Conclusions Farmworkers and non-farmworkers may use different postural control strategies even when controlling for the level of lifetime pesticide exposure. Long-term exposure can affect somatosensory/vestibular sensory systems and the central processing of sensory information for postural control. PMID:26849257

  18. The function and failure of sensory predictions.

    PubMed

    Bansal, Sonia; Ford, Judith M; Spering, Miriam

    2018-04-23

    Humans and other primates are equipped with neural mechanisms that allow them to automatically make predictions about future events, facilitating processing of expected sensations and actions. Prediction-driven control and monitoring of perceptual and motor acts are vital to normal cognitive functioning. This review provides an overview of corollary discharge mechanisms involved in predictions across sensory modalities and discusses consequences of predictive coding for cognition and behavior. Converging evidence now links impairments in corollary discharge mechanisms to neuropsychiatric symptoms such as hallucinations and delusions. We review studies supporting a prediction-failure hypothesis of perceptual and cognitive disturbances. We also outline neural correlates underlying prediction function and failure, highlighting similarities across the visual, auditory, and somatosensory systems. In linking basic psychophysical and psychophysiological evidence of visual, auditory, and somatosensory prediction failures to neuropsychiatric symptoms, our review furthers our understanding of disease mechanisms. © 2018 New York Academy of Sciences.

  19. Event-related potentials to visual, auditory, and bimodal (combined auditory-visual) stimuli.

    PubMed

    Isoğlu-Alkaç, Ummühan; Kedzior, Karina; Keskindemirci, Gonca; Ermutlu, Numan; Karamursel, Sacit

    2007-02-01

    The purpose of this study was to investigate the response properties of event-related potentials to unimodal and bimodal stimulation. The amplitudes of N1 and P2 were larger for bimodal evoked potentials (BEPs) than for auditory evoked potentials (AEPs) at the anterior sites, and the amplitudes of P1 were larger for BEPs than for visual evoked potentials (VEPs), especially at the parieto-occipital locations. Responses to bimodal stimulation had longer latencies than responses to unimodal stimulation. The N1 and P2 components were larger in amplitude and longer in latency during the bimodal paradigm and predominantly occurred at the anterior sites. Therefore, the current bimodal paradigm can be used to investigate the involvement and location of specific neural generators that contribute to higher processing of sensory information. Moreover, this paradigm may be a useful tool to investigate the level of sensory dysfunction in clinical samples.

  20. Multisensory effects on somatosensation: a trimodal visuo-vestibular-tactile interaction

    PubMed Central

    Kaliuzhna, Mariia; Ferrè, Elisa Raffaella; Herbelin, Bruno; Blanke, Olaf; Haggard, Patrick

    2016-01-01

    Vestibular information about self-motion is combined with other sensory signals. Previous research described both visuo-vestibular and vestibular-tactile bilateral interactions, but the simultaneous interaction between all three sensory modalities has not been explored. Here we exploit a previously reported visuo-vestibular integration to investigate multisensory effects on tactile sensitivity in humans. Tactile sensitivity was measured during passive whole body rotations alone or in conjunction with optic flow, creating either purely vestibular or visuo-vestibular sensations of self-motion. Our results demonstrate that tactile sensitivity is modulated by perceived self-motion, as provided by a combined visuo-vestibular percept, and not by the visual and vestibular cues independently. We propose a hierarchical multisensory interaction that underpins somatosensory modulation: visual and vestibular cues are first combined to produce a multisensory self-motion percept. Somatosensory processing is then enhanced according to the degree of perceived self-motion. PMID:27198907

  1. Large-scale functional brain network changes in taxi drivers: evidence from resting-state fMRI.

    PubMed

    Wang, Lubin; Liu, Qiang; Shen, Hui; Li, Hong; Hu, Dewen

    2015-03-01

    Driving a car in the environment is a complex behavior that involves cognitive processing of visual information to generate the proper motor outputs and action controls. Previous neuroimaging studies have used virtual simulation to identify the brain areas that are associated with various driving-related tasks. Few studies, however, have focused on the specific patterns of functional organization in the driver's brain. The aim of this study was to assess differences in the resting-state networks (RSNs) of the brains of drivers and nondrivers. Forty healthy subjects (20 licensed taxi drivers, 20 nondrivers) underwent an 8-min resting-state functional MRI acquisition. Using independent component analysis, three sensory (primary and extrastriate visual, sensorimotor) RSNs and four cognitive (anterior and posterior default mode, left and right frontoparietal) RSNs were retrieved from the data. We then examined the group differences in the intrinsic brain activity of each RSN and in the functional network connectivity (FNC) between the RSNs. We found that the drivers had reduced intrinsic brain activity in the visual RSNs and reduced FNC between the sensory RSNs compared with the nondrivers. The major finding of this study, however, was that the FNC between the cognitive and sensory RSNs became more positively or less negatively correlated in the drivers relative to that in the nondrivers. Notably, the strength of the FNC between the left frontoparietal and primary visual RSNs was positively correlated with the number of taxi-driving years. Our findings may provide new insight into how the brain supports driving behavior. © 2014 Wiley Periodicals, Inc.
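
Functional network connectivity of the kind computed here is, at its core, a matrix of correlations between network time courses extracted by ICA. A minimal sketch on synthetic time courses (plain Pearson correlation; the network labels and numbers are hypothetical, not the study's data):

```python
import numpy as np

def functional_network_connectivity(timecourses):
    """Pairwise FNC: Pearson correlations between RSN time courses.

    timecourses: array of shape (n_networks, n_timepoints), one row per
    independent-component time course. Returns an (n_networks, n_networks)
    correlation matrix.
    """
    return np.corrcoef(timecourses)

# Toy example: three hypothetical RSN time courses over 240 volumes.
rng = np.random.default_rng(0)
shared = rng.standard_normal(240)          # common driving signal
tc = np.stack([
    shared + 0.5 * rng.standard_normal(240),   # "visual" RSN
    shared + 0.5 * rng.standard_normal(240),   # "frontoparietal" RSN
    rng.standard_normal(240),                  # unrelated RSN
])
fnc = functional_network_connectivity(tc)
# Networks 0 and 1 share a driving signal, so fnc[0, 1] is strongly
# positive while fnc[0, 2] hovers near zero.
```

Group contrasts like "more positively or less negatively correlated in drivers" are then comparisons of such off-diagonal entries between groups.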

  2. Large-Scale Brain Networks Supporting Divided Attention across Spatial Locations and Sensory Modalities

    PubMed Central

    Santangelo, Valerio

    2018-01-01

    Higher-order cognitive processes have been shown to rely on the interplay between large-scale neural networks. However, the brain networks involved in the capability to split attentional resources over multiple spatial locations and multiple stimuli or sensory modalities have been largely unexplored to date. Here I re-analyzed data from Santangelo et al. (2010) to explore the causal interactions between large-scale brain networks during divided attention. During fMRI scanning, participants monitored streams of visual and/or auditory stimuli in one or two spatial locations for detection of occasional targets. This design allowed comparing a condition in which participants monitored one stimulus/modality (either visual or auditory) in two spatial locations vs. a condition in which participants monitored two stimuli/modalities (both visual and auditory) in one spatial location. The analysis of the independent components (ICs) revealed that dividing attentional resources across two spatial locations necessitated a brain network involving the left ventro- and dorso-lateral prefrontal cortex plus the posterior parietal cortex, including the intraparietal sulcus (IPS) and the angular gyrus, bilaterally. The analysis of Granger causality highlighted that the activity of the lateral prefrontal regions was predictive of the activity of all of the posterior parietal nodes. By contrast, dividing attention across two sensory modalities necessitated a brain network including nodes belonging to the dorsal frontoparietal network, i.e., the bilateral frontal eye-fields (FEF) and IPS, plus nodes belonging to the salience network, i.e., the anterior cingulate cortex and the left and right anterior insular cortex (aIC). The analysis of Granger causality also highlighted a tight interdependence between the dorsal frontoparietal and salience nodes in trials requiring divided attention between different sensory modalities. The current findings therefore highlight a dissociation among the brain networks implicated during divided attention across spatial locations and sensory modalities, pointing out the importance of investigating the effective connectivity of large-scale brain networks supporting complex behavior. PMID:29535614
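
Granger causality, as used in this record, asks whether the past of one network's time course improves prediction of another's beyond that network's own past. A toy pairwise sketch using a log-variance-ratio index (the published analyses are multivariate and statistically tested; all signals here are synthetic):

```python
import numpy as np

def _lagged(v, lag):
    """Design matrix of v's past values: column k holds v[t - k]."""
    n = len(v)
    return np.column_stack([v[lag - k: n - k] for k in range(1, lag + 1)])

def granger_gain(x, y, lag=2):
    """ln(restricted residual variance / full residual variance).

    Fits y[t] on its own past (restricted) vs. its own past plus the past
    of x (full) by least squares; values well above zero suggest that x
    Granger-causes y.
    """
    Y = y[lag:]
    restricted = _lagged(y, lag)
    full = np.hstack([restricted, _lagged(x, lag)])
    def resid_var(X):
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return np.mean((Y - X @ beta) ** 2)
    return float(np.log(resid_var(restricted) / resid_var(full)))

# Toy data: x drives y at a one-sample delay, so the index should be
# clearly positive for x -> y and near zero for y -> x.
rng = np.random.default_rng(3)
x = rng.standard_normal(500)
y = 0.8 * np.roll(x, 1) + 0.3 * rng.standard_normal(500)
y[0] = 0.0
gain_xy = granger_gain(x, y)
gain_yx = granger_gain(y, x)
```

The asymmetry between the two indices is what licenses directional claims such as "prefrontal activity was predictive of parietal activity".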

  3. Large-Scale Brain Networks Supporting Divided Attention across Spatial Locations and Sensory Modalities.

    PubMed

    Santangelo, Valerio

    2018-01-01

    Higher-order cognitive processes have been shown to rely on the interplay between large-scale neural networks. However, the brain networks involved in the capability to split attentional resources over multiple spatial locations and multiple stimuli or sensory modalities have been largely unexplored to date. Here I re-analyzed data from Santangelo et al. (2010) to explore the causal interactions between large-scale brain networks during divided attention. During fMRI scanning, participants monitored streams of visual and/or auditory stimuli in one or two spatial locations for detection of occasional targets. This design allowed comparing a condition in which participants monitored one stimulus/modality (either visual or auditory) in two spatial locations vs. a condition in which participants monitored two stimuli/modalities (both visual and auditory) in one spatial location. The analysis of the independent components (ICs) revealed that dividing attentional resources across two spatial locations necessitated a brain network involving the left ventro- and dorso-lateral prefrontal cortex plus the posterior parietal cortex, including the intraparietal sulcus (IPS) and the angular gyrus, bilaterally. The analysis of Granger causality highlighted that the activity of the lateral prefrontal regions was predictive of the activity of all of the posterior parietal nodes. By contrast, dividing attention across two sensory modalities necessitated a brain network including nodes belonging to the dorsal frontoparietal network, i.e., the bilateral frontal eye-fields (FEF) and IPS, plus nodes belonging to the salience network, i.e., the anterior cingulate cortex and the left and right anterior insular cortex (aIC). The analysis of Granger causality also highlighted a tight interdependence between the dorsal frontoparietal and salience nodes in trials requiring divided attention between different sensory modalities. The current findings therefore highlight a dissociation among the brain networks implicated during divided attention across spatial locations and sensory modalities, pointing out the importance of investigating the effective connectivity of large-scale brain networks supporting complex behavior.

  4. Characterizing the roles of alpha and theta oscillations in multisensory attention.

    PubMed

    Keller, Arielle S; Payne, Lisa; Sekuler, Robert

    2017-05-01

    Cortical alpha oscillations (8-13 Hz) appear to play a role in suppressing distractions when just one sensory modality is being attended, but do they also contribute when attention is distributed over multiple sensory modalities? For an answer, we examined cortical oscillations in human subjects who were dividing attention between auditory and visual sequences. In Experiment 1, subjects performed an oddball task with auditory, visual, or simultaneous audiovisual sequences in separate blocks, while the electroencephalogram was recorded using high-density scalp electrodes. Alpha oscillations were present continuously over posterior regions while subjects were attending to auditory sequences. This supports the idea that the brain suppresses processing of visual input in order to advantage auditory processing. During a divided-attention audiovisual condition, an oddball (a rare, unusual stimulus) occurred in either the auditory or the visual domain, requiring that attention be divided between the two modalities. Fronto-central theta band (4-7 Hz) activity was strongest in this audiovisual condition, when subjects monitored auditory and visual sequences simultaneously. Theta oscillations have been associated with both attention and with short-term memory. Experiment 2 sought to distinguish these possible roles of fronto-central theta activity during multisensory divided attention. Using a modified version of the oddball task from Experiment 1, Experiment 2 showed that differences in theta power among conditions were independent of short-term memory load. Ruling out theta's association with short-term memory, we conclude that fronto-central theta activity is likely a marker of multisensory divided attention. Copyright © 2017 Elsevier Ltd. All rights reserved.
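
Alpha (8-13 Hz) and theta (4-7 Hz) activity of the kind analyzed here can be quantified as power in the EEG spectrum within each band. A minimal periodogram sketch on a synthetic signal containing a 10 Hz rhythm (not the authors' high-density EEG pipeline; sampling rate and amplitudes are assumptions):

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Mean power in [low, high] Hz from the periodogram (rFFT)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

# Toy "EEG": a 10 Hz alpha rhythm buried in noise, 2 s at 250 Hz.
fs = 250
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(1)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
alpha = band_power(eeg, fs, 8, 13)   # dominated by the 10 Hz rhythm
theta = band_power(eeg, fs, 4, 7)    # noise floor only
```

Contrasting such band-power estimates across conditions (auditory vs. visual vs. audiovisual blocks) and electrode regions is the basic operation behind the alpha and theta effects reported above.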

  5. Characterizing the roles of alpha and theta oscillations in multisensory attention

    PubMed Central

    Keller, Arielle S.; Payne, Lisa; Sekuler, Robert

    2017-01-01

    Cortical alpha oscillations (8–13 Hz) appear to play a role in suppressing distractions when just one sensory modality is being attended, but do they also contribute when attention is distributed over multiple sensory modalities? For an answer, we examined cortical oscillations in human subjects who were dividing attention between auditory and visual sequences. In Experiment 1, subjects performed an oddball task with auditory, visual, or simultaneous audiovisual sequences in separate blocks, while the electroencephalogram was recorded using high-density scalp electrodes. Alpha oscillations were present continuously over posterior regions while subjects were attending to auditory sequences. This supports the idea that the brain suppresses processing of visual input in order to advantage auditory processing. During a divided-attention audiovisual condition, an oddball (a rare, unusual stimulus) occurred in either the auditory or the visual domain, requiring that attention be divided between the two modalities. Fronto-central theta band (4–7 Hz) activity was strongest in this audiovisual condition, when subjects monitored auditory and visual sequences simultaneously. Theta oscillations have been associated with both attention and with short-term memory. Experiment 2 sought to distinguish these possible roles of fronto-central theta activity during multisensory divided attention. Using a modified version of the oddball task from Experiment 1, Experiment 2 showed that differences in theta power among conditions were independent of short-term memory load. Ruling out theta’s association with short-term memory, we conclude that fronto-central theta activity is likely a marker of multisensory divided attention. PMID:28259771

  6. Impaired capacity of cerebellar patients to perceive and learn two-dimensional shapes based on kinesthetic cues.

    PubMed

    Shimansky, Y; Saling, M; Wunderlich, D A; Bracha, V; Stelmach, G E; Bloedel, J R

    1997-01-01

    This study addresses the issue of the role of the cerebellum in the processing of sensory information by determining the capability of cerebellar patients to acquire and use kinesthetic cues received via the active or passive tracing of an irregular shape while blindfolded. Patients with cerebellar lesions and age-matched healthy controls were tested on four tasks: (1) learning to discriminate a reference shape from three others through the repeated tracing of the reference template; (2) reproducing the reference shape from memory by drawing blindfolded; (3) performing the same task with vision; and (4) visually recognizing the reference shape. The cues used to acquire and then to recognize the reference shape were generated under four conditions: (1) "active kinesthesia," in which cues were acquired by the blindfolded subject while actively tracing a reference template; (2) "passive kinesthesia," in which the tracing was performed while the hand was guided passively through the template; (3) "sequential vision," in which the shape was visualized by the serial exposure of small segments of its outline; and (4) "full vision," in which the entire shape was visualized. The sequential vision condition was employed to emulate the sequential way in which kinesthetic information is acquired while tracing the reference shape. The results demonstrate a substantial impairment of cerebellar patients in their capability to perceive two-dimensional irregular shapes based only on kinesthetic cues. There also is evidence that this deficit in part relates to a reduced capacity to integrate temporal sequences of sensory cues into a complete image useful for shape discrimination tasks or for reproducing the shape through drawing. Consequently, the cerebellum has an important role in this type of sensory information processing even when it is not directly associated with the execution of movements.

  7. Dissociated emergent-response system and fine-processing system in human neural network and a heuristic neural architecture for autonomous humanoid robots.

    PubMed

    Yan, Xiaodan

    2010-01-01

    The current study investigated the functional connectivity of the primary sensory system with resting state fMRI and applied such knowledge into the design of the neural architecture of autonomous humanoid robots. Correlation and Granger causality analyses were utilized to reveal the functional connectivity patterns. Dissociation was within the primary sensory system, in that the olfactory cortex and the somatosensory cortex were strongly connected to the amygdala whereas the visual cortex and the auditory cortex were strongly connected with the frontal cortex. The posterior cingulate cortex (PCC) and the anterior cingulate cortex (ACC) were found to maintain constant communication with the primary sensory system, the frontal cortex, and the amygdala. Such neural architecture inspired the design of dissociated emergent-response system and fine-processing system in autonomous humanoid robots, with separate processing units and another consolidation center to coordinate the two systems. Such design can help autonomous robots to detect and respond quickly to danger, so as to maintain their sustainability and independence.

  8. The Visual Arts and Qualitative Research: Diverse and Emerging Voices.

    ERIC Educational Resources Information Center

    Stephen, Veronica P.

    The arts are basic educational processes that involve students with different abilities and from differing age groups in sensory perception. This perception, augmented by the use of art compositions, establishes a critical dialogue between the medium and the viewer. What one views, sees, and observes in an art piece serves to create a…

  9. The power of projectomes: genetic mosaic labeling in the larval zebrafish brain reveals organizing principles of sensory circuits.

    PubMed

    Robles, Estuardo

    2017-09-01

    In no vertebrate species do we possess an accurate, comprehensive tally of neuron types in the brain. This is in no small part due to the vast diversity of neuronal types that comprise complex vertebrate nervous systems. A fundamental goal of neuroscience is to construct comprehensive catalogs of cell types defined by structure, connectivity, and physiological response properties. This type of information will be invaluable for generating models of how assemblies of neurons encode and distribute sensory information and correspondingly alter behavior. This review summarizes recent efforts in the larval zebrafish to construct sensory projectomes, comprehensive analyses of axonal morphologies in sensory axon tracts. Focusing on the olfactory and optic tract, these studies revealed principles of sensory information processing in the olfactory and visual systems that could not have been directly quantified by other methods. In essence, these studies reconstructed the optic and olfactory tract in a virtual manner, providing insights into patterns of neuronal growth that underlie the formation of sensory axon tracts. Quantitative analysis of neuronal diversity revealed organizing principles that determine information flow through sensory systems in the zebrafish that are likely to be conserved across vertebrate species. The generation of comprehensive cell type classifications based on structural, physiological, and molecular features will lead to testable hypotheses on the functional role of individual sensory neuron subtypes in controlling specific sensory-evoked behaviors.

  10. Disruption of visual awareness during the attentional blink is reflected by selective disruption of late-stage neural processing

    PubMed Central

    Harris, Joseph A.; McMahon, Alex R.; Woldorff, Marty G.

    2015-01-01

    Any information represented in the brain holds the potential to influence behavior. It is therefore of broad interest to determine the extent and quality of neural processing of stimulus input that occurs with and without awareness. The attentional blink is a useful tool for dissociating neural and behavioral measures of perceptual visual processing across conditions of awareness. The extent of higher-order visual information beyond basic sensory signaling that is processed during the attentional blink remains controversial. To determine what neural processing at the level of visual-object identification occurs in the absence of awareness, electrophysiological responses to images of faces and houses were recorded both within and outside of the attentional blink period during a rapid serial visual presentation (RSVP) stream. Electrophysiological results were sorted according to behavioral performance (correctly identified targets versus missed targets) within these blink and non-blink periods. An early index of face-specific processing (the N170, 140–220 ms post-stimulus) was observed regardless of whether the subject demonstrated awareness of the stimulus, whereas a later face-specific effect with the same topographic distribution (500–700 ms post-stimulus) was only seen for accurate behavioral discrimination of the stimulus content. The present findings suggest a multi-stage process of object-category processing, with only the later phase being associated with explicit visual awareness. PMID:23859644

  11. Supralinear and Supramodal Integration of Visual and Tactile Signals in Rats: Psychophysics and Neuronal Mechanisms.

    PubMed

    Nikbakht, Nader; Tafreshiha, Azadeh; Zoccolan, Davide; Diamond, Mathew E

    2018-02-07

    To better understand how object recognition can be triggered independently of the sensory channel through which information is acquired, we devised a task in which rats judged the orientation of a raised, black and white grating. They learned to recognize two categories of orientation: 0° ± 45° ("horizontal") and 90° ± 45° ("vertical"). Each trial required a visual (V), a tactile (T), or a visual-tactile (VT) discrimination; VT performance was better than that predicted by optimal linear combination of V and T signals, indicating synergy between sensory channels. We examined posterior parietal cortex (PPC) and uncovered key neuronal correlates of the behavioral findings: PPC carried both graded information about object orientation and categorical information about the rat's upcoming choice; single neurons exhibited identical responses under the three modality conditions. Finally, a linear classifier of neuronal population firing replicated the behavioral findings. Taken together, these findings suggest that PPC is involved in the supramodal processing of shape. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
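
The "optimal linear combination" benchmark against which bimodal performance is compared is conventionally the independent-channels prediction d'_VT = sqrt(d'_V^2 + d'_T^2). A sketch with hypothetical unimodal sensitivities (the numbers are illustrative, not the study's data):

```python
import math

def predicted_bimodal_dprime(d_v, d_t):
    """Independent-channels ('optimal linear') prediction:
    d'_VT = sqrt(d'_V**2 + d'_T**2)."""
    return math.sqrt(d_v ** 2 + d_t ** 2)

# Hypothetical unimodal sensitivities for the orientation task.
d_v, d_t = 1.2, 1.0
pred = predicted_bimodal_dprime(d_v, d_t)
# An observed visual-tactile d' exceeding this prediction indicates
# supralinear synergy between the channels, as the study reports.
```

The prediction always exceeds either unimodal sensitivity alone, so "better than optimal linear combination" is a strictly stronger claim than "better than either modality".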

  12. Global Sensory Qualities and Aesthetic Experience in Music

    PubMed Central

    Brattico, Pauli; Brattico, Elvira; Vuust, Peter

    2017-01-01

    A well-known tradition in the study of visual aesthetics holds that the experience of visual beauty is grounded in global computational or statistical properties of the stimulus, for example, scale-invariant Fourier spectrum or self-similarity. Some approaches rely on neural mechanisms, such as efficient computation, processing fluency, or the responsiveness of the cells in the primary visual cortex. These proposals are united by the fact that the contributing factors are hypothesized to be global (i.e., they concern the percept as a whole), formal or non-conceptual (i.e., they concern form instead of content), computational and/or statistical, and based on relatively low-level sensory properties. Here we consider that the study of aesthetic responses to music could benefit from the same approach. Thus, along with local features such as pitch, tuning, consonance/dissonance, harmony, timbre, or beat, also global sonic properties could be viewed as contributing toward creating an aesthetic musical experience. Several such properties are discussed and their neural implementation is reviewed in the light of recent advances in neuroaesthetics. PMID:28424573

  13. Action video game playing is associated with improved visual sensitivity, but not alterations in visual sensory memory.

    PubMed

    Appelbaum, L Gregory; Cain, Matthew S; Darling, Elise F; Mitroff, Stephen R

    2013-08-01

    Action video game playing has been experimentally linked to a number of perceptual and cognitive improvements. These benefits are captured through a wide range of psychometric tasks and have led to the proposition that action video game experience may promote the ability to extract statistical evidence from sensory stimuli. Such an advantage could arise from a number of possible mechanisms: improvements in visual sensitivity, enhancements in the capacity or duration for which information is retained in visual memory, or higher-level strategic use of information for decision making. The present study measured the capacity and time course of visual sensory memory using a partial report performance task as a means to distinguish between these three possible mechanisms. Sensitivity measures and parameter estimates that describe sensory memory capacity and the rate of memory decay were compared between individuals who reported high levels and low levels of action video game experience. Our results revealed a uniform increase in partial report accuracy at all stimulus-to-cue delays for action video game players but no difference in the rate or time course of the memory decay. The present findings suggest that action video game playing may be related to enhancements in the initial sensitivity to visual stimuli, but not to a greater retention of information in iconic memory buffers.
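
Partial-report data of this kind are commonly summarized by an exponential decay from an initial sensory capacity toward a durable-store asymptote. A sketch of the pattern the study reports, a uniform accuracy gain with an identical decay constant (all parameter values hypothetical):

```python
import numpy as np

def partial_report_accuracy(delay_ms, a0, a_inf, tau_ms):
    """Iconic-memory decay: accuracy falls from a0 toward the
    durable-store asymptote a_inf with time constant tau_ms."""
    return a_inf + (a0 - a_inf) * np.exp(-delay_ms / tau_ms)

delays = np.array([0, 100, 200, 400, 800])  # stimulus-to-cue delays (ms)
# Hypothetical fits: gamers show uniformly higher accuracy (a0 and a_inf
# both shifted up) but the same decay constant, matching the abstract.
gamers     = partial_report_accuracy(delays, a0=0.90, a_inf=0.55, tau_ms=250)
non_gamers = partial_report_accuracy(delays, a0=0.80, a_inf=0.45, tau_ms=250)
```

With these parameters the group difference is constant across delays, which is exactly the signature that distinguishes a sensitivity advantage from a slower memory decay.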

  14. A physiologically based nonhomogeneous Poisson counter model of visual identification.

    PubMed

    Christensen, Jeppe H; Markussen, Bo; Bundesen, Claus; Kyllingsbæk, Søren

    2018-04-30

    A physiologically based nonhomogeneous Poisson counter model of visual identification is presented. The model was developed in the framework of a Theory of Visual Attention (Bundesen, 1990; Kyllingsbæk, Markussen, & Bundesen, 2012) and meant for modeling visual identification of objects that are mutually confusable and hard to see. The model assumes that the visual system's initial sensory response consists in tentative visual categorizations, which are accumulated by leaky integration of both transient and sustained components comparable with those found in spike density patterns of early sensory neurons. The sensory response (tentative categorizations) feeds independent Poisson counters, each of which accumulates tentative object categorizations of a particular type to guide overt identification performance. We tested the model's ability to predict the effect of stimulus duration on observed distributions of responses in a nonspeeded (pure accuracy) identification task with eight response alternatives. The time courses of correct and erroneous categorizations were well accounted for when the event-rates of competing Poisson counters were allowed to vary independently over time in a way that mimicked the dynamics of receptive field selectivity as found in neurophysiological studies. Furthermore, the initial sensory response yielded theoretical hazard rate functions that closely resembled empirically estimated ones. Finally, supplied with a Naka-Rushton type contrast gain control, the model provided an explanation for Bloch's law. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
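
The counter architecture described here can be caricatured as a race between Poisson counters whose event rates vary over the stimulus duration, with the fullest counter determining the identification response. A toy simulation (the rate profiles, bin size, and number of alternatives are assumptions for illustration, not the fitted model):

```python
import numpy as np

def simulate_poisson_counter_trial(rates, dt, rng):
    """One trial of a nonhomogeneous Poisson counter race.

    rates: (n_alternatives, n_timesteps) event rates in Hz, one row per
    response counter. Counts accumulate over the whole stimulus duration;
    the counter with the most events wins, ties broken at random.
    """
    counts = rng.poisson(rates * dt).sum(axis=1)
    winners = np.flatnonzero(counts == counts.max())
    return rng.choice(winners)

rng = np.random.default_rng(2)
dt = 0.001                       # 1 ms bins
t = np.arange(0, 0.2, dt)        # 200 ms stimulus
# Hypothetical rates: a transient-plus-sustained profile for the correct
# alternative, flat low rates for seven confusable alternatives.
target_rate = 80 * np.exp(-t / 0.05) + 40
rates = np.vstack([target_rate] + [np.full(t.size, 20.0)] * 7)
acc = np.mean([simulate_poisson_counter_trial(rates, dt, rng) == 0
               for _ in range(500)])
```

Varying the stimulus duration in such a simulation changes accumulated counts and hence accuracy, which is the qualitative effect the model was tested against.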

  15. The Sense of Agency Is More Sensitive to Manipulations of Outcome than Movement-Related Feedback Irrespective of Sensory Modality

    PubMed Central

    David, Nicole; Skoruppa, Stefan; Gulberti, Alessandro

    2016-01-01

    The sense of agency describes the ability to experience oneself as the agent of one's own actions. Previous studies of the sense of agency manipulated the predicted sensory feedback related either to movement execution or to the movement’s outcome, for example by delaying the movement of a virtual hand or the onset of a tone that resulted from a button press. Such temporal sensorimotor discrepancies reduce the sense of agency. It remains unclear whether movement-related feedback is processed differently than outcome-related feedback in terms of agency experience, especially if these types of feedback differ with respect to sensory modality. We employed a mixed-reality setup, in which participants tracked their finger movements by means of a virtual hand. They performed a single tap, which elicited a sound. The temporal contingency between the participants’ finger movements and (i) the movement of the virtual hand or (ii) the expected auditory outcome was systematically varied. In a visual control experiment, the tap elicited a visual outcome. For each feedback type and participant, changes in the sense of agency were quantified using a forced-choice paradigm and the Method of Constant Stimuli. Participants were more sensitive to delays of outcome than to delays of movement execution. This effect was very similar for visual or auditory outcome delays. Our results indicate different contributions of movement- versus outcome-related sensory feedback to the sense of agency, irrespective of the modality of the outcome. We propose that this differential sensitivity reflects the behavioral importance of assessing authorship of the outcome of an action. PMID:27536948
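
Forced-choice data collected with the Method of Constant Stimuli, as in this study, are typically fit with a cumulative Gaussian whose mean estimates the point of subjective simultaneity and whose standard deviation indexes sensitivity (smaller sigma, steeper slope, higher sensitivity). A grid-search sketch on entirely hypothetical data:

```python
import numpy as np
from math import erf, sqrt

def cum_gauss(x, mu, sigma):
    """Cumulative Gaussian psychometric function."""
    return np.array([0.5 * (1.0 + erf((xi - mu) / (sigma * sqrt(2.0))))
                     for xi in x])

def fit_constant_stimuli(levels, p_yes, mus, sigmas):
    """Least-squares grid search for (mu, sigma) over candidate values."""
    best = (None, None, np.inf)
    for mu in mus:
        for sigma in sigmas:
            err = np.sum((cum_gauss(levels, mu, sigma) - p_yes) ** 2)
            if err < best[2]:
                best = (mu, sigma, err)
    return best[0], best[1]

# Hypothetical data: delays (ms) between action and tone, and the
# proportion of trials judged "delayed". Per the abstract, outcome
# delays would yield a smaller fitted sigma than movement delays.
delays = np.array([0, 100, 200, 300, 400, 500])
p_delayed = np.array([0.02, 0.10, 0.45, 0.80, 0.95, 0.99])
pss, sigma = fit_constant_stimuli(delays, p_delayed,
                                  mus=np.arange(100, 400, 5),
                                  sigmas=np.arange(20, 300, 5))
```

Comparing fitted sigmas between feedback types (and outcome modalities) is how "more sensitive to delays of outcome than of movement execution" would be quantified under this approach.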

  16. Sensorimotor aspects of high-speed artificial gravity: I. Sensory conflict in vestibular adaptation

    NASA Technical Reports Server (NTRS)

    Brown, Erika L.; Hecht, Heiko; Young, Laurence R.

    2002-01-01

    Short-radius centrifugation offers a promising and affordable countermeasure to the adverse effects of prolonged weightlessness. However, head movements made in a fast rotating environment elicit Coriolis effects, which seriously compromise sensory and motor processes. We found that participants can adapt to these Coriolis effects when exposed intermittently to high rotation rates and, at the same time, can maintain their perceptual-motor coordination in stationary environments. In this paper, we explore the role of inter-sensory conflict in this adaptation process. Different measures (vertical nystagmus, illusory body tilt, motion sickness) react differently to visual-vestibular conflict and adapt differently. In particular, proprioceptive-vestibular conflict sufficed to adapt subjective parameters and the time constant of nystagmus decay, while retinal slip was required for VOR gain adaptation. A simple correlation between the strength of intersensory conflict and the efficacy of adaptation fails to explain the data. Implications of these findings, which differ from existing data for low rotation rates, are discussed.

  17. Perception and Cognition in the Ageing Brain: A Brief Review of the Short- and Long-Term Links between Perceptual and Cognitive Decline

    PubMed Central

    Roberts, Katherine L.; Allen, Harriet A.

    2016-01-01

    Ageing is associated with declines in both perception and cognition. We review evidence for an interaction between perceptual and cognitive decline in old age. Impoverished perceptual input can increase the cognitive difficulty of tasks, while changes to cognitive strategies can compensate, to some extent, for impaired perception. While there is strong evidence from cross-sectional studies for a link between sensory acuity and cognitive performance in old age, there is not yet compelling evidence from longitudinal studies to suggest that poor perception causes cognitive decline, nor to demonstrate that correcting sensory impairment can improve cognition in the longer term. Most studies have focused on relatively simple measures of sensory (visual and auditory) acuity, but more complex measures of suprathreshold perceptual processes, such as temporal processing, can show a stronger link with cognition. The reviewed evidence underlines the importance of fully accounting for perceptual deficits when investigating cognitive decline in old age. PMID:26973514

  18. Play with your food! Sensory play is associated with tasting of fruits and vegetables in preschool children.

    PubMed

    Coulthard, Helen; Sealy, Annemarie

    2017-06-01

    The objective of the current study was to ascertain whether taking part in a sensory play activity with real fruits and vegetables (FV) can encourage tasting in preschool children, compared to a non-food activity or visual exposure to the activity. Three to four year old pre-school children (N = 62) were recruited from three preschool nursery classes from a school in Northamptonshire, UK. A between-participants experimental study was conducted with each class assigned to one of three conditions: sensory FV play, sensory non-food play, and visual FV exposure. Parental reports of several baseline variables were taken: child baseline liking of the foods used in the study, parental and child FV consumption (portions/day), child neophobia, and child tactile sensitivity. Outcome measures were the number of fruits and vegetables tasted in a post-experiment taste test which featured (n = 5) or did not feature (n = 3) in the task. Analyses of covariance controlling for food neophobia and baseline liking of foods showed that, after the activity, children in the sensory FV play condition tried more FV than both children in the non-food sensory play task (p < 0.001) and children in the visual FV exposure task (p < 0.001). This was true not only for the five foods used in the activity (p < 0.001), but also for the three foods that were not used in the activity (p < 0.05). Sensory play activities using fruits and vegetables may encourage FV tasting in preschool children more than non-food play or visual exposure alone. Long-term intervention studies need to be carried out to see if these effects can be sustained over time. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Stochastic correlative firing for figure-ground segregation.

    PubMed

    Chen, Zhe

    2005-03-01

    Segregation of sensory inputs into separate objects is a central aspect of perception and arises in all sensory modalities. The figure-ground segregation problem requires identifying an object of interest in a complex scene, in many cases given binaural auditory or binocular visual observations. The computations required for visual and auditory figure-ground segregation share many common features and can be cast within a unified framework. Sensory perception can be viewed as a problem of optimizing information transmission. Here we suggest a stochastic correlative firing mechanism and an associative learning rule for figure-ground segregation in several classic sensory perception tasks, including the cocktail party problem in binaural hearing, binocular fusion of stereo images, and Gestalt grouping in motion perception.

  20. Sensory adaptation for timing perception.

    PubMed

    Roseboom, Warrick; Linares, Daniel; Nishida, Shin'ya

    2015-04-22

    Recent sensory experience modifies subjective timing perception. For example, when visual events repeatedly lead auditory events, such as when the sound and video tracks of a movie are out of sync, subsequent vision-leads-audio presentations are reported as more simultaneous. This phenomenon could provide insights into the fundamental problem of how timing is represented in the brain, but the underlying mechanisms are poorly understood. Here, we show that the effect of recent experience on timing perception is not just subjective; recent sensory experience also modifies relative timing discrimination. This result indicates that recent sensory history alters the encoding of relative timing in sensory areas, excluding explanations of the subjective phenomenon based only on decision-level changes. The pattern of changes in timing discrimination suggests the existence of two sensory components, similar to those previously reported for visual spatial attributes: a lateral shift in the nonlinear transducer that maps relative timing into perceptual relative timing and an increase in transducer slope around the exposed timing. The existence of these components would suggest that previous explanations of how recent experience may change the sensory encoding of timing, such as changes in sensory latencies or simple implementations of neural population codes, cannot account for the effect of sensory adaptation on timing perception.
