Sample records for visual sensory structures

  1. Flexibility and Stability in Sensory Processing Revealed Using Visual-to-Auditory Sensory Substitution

    PubMed Central

    Hertz, Uri; Amedi, Amir

    2015-01-01

    The classical view of sensory processing involves independent processing in sensory cortices and multisensory integration in associative areas. This hierarchical structure has been challenged by evidence of multisensory responses in sensory areas, and dynamic weighting of sensory inputs in associative areas, thus far reported independently. Here, we used a visual-to-auditory sensory substitution algorithm (SSA) to manipulate the information conveyed by sensory inputs while keeping the stimuli intact. During scan sessions before and after SSA learning, subjects were presented with visual images and auditory soundscapes. The findings reveal 2 dynamic processes. First, crossmodal attenuation of sensory cortices changed direction after SSA learning from visual attenuations of the auditory cortex to auditory attenuations of the visual cortex. Secondly, associative areas changed their sensory response profile from strongest response for visual to that for auditory. The interaction between these phenomena may play an important role in multisensory processing. Consistent features were also found in the sensory dominance in sensory areas and audiovisual convergence in associative area Middle Temporal Gyrus. These 2 factors allow for both stability and a fast, dynamic tuning of the system when required. PMID:24518756

  2. Flexibility and Stability in Sensory Processing Revealed Using Visual-to-Auditory Sensory Substitution.

    PubMed

    Hertz, Uri; Amedi, Amir

    2015-08-01

    The classical view of sensory processing involves independent processing in sensory cortices and multisensory integration in associative areas. This hierarchical structure has been challenged by evidence of multisensory responses in sensory areas, and dynamic weighting of sensory inputs in associative areas, thus far reported independently. Here, we used a visual-to-auditory sensory substitution algorithm (SSA) to manipulate the information conveyed by sensory inputs while keeping the stimuli intact. During scan sessions before and after SSA learning, subjects were presented with visual images and auditory soundscapes. The findings reveal 2 dynamic processes. First, crossmodal attenuation of sensory cortices changed direction after SSA learning from visual attenuations of the auditory cortex to auditory attenuations of the visual cortex. Secondly, associative areas changed their sensory response profile from strongest response for visual to that for auditory. The interaction between these phenomena may play an important role in multisensory processing. Consistent features were also found in the sensory dominance in sensory areas and audiovisual convergence in associative area Middle Temporal Gyrus. These 2 factors allow for both stability and a fast, dynamic tuning of the system when required. © The Author 2014. Published by Oxford University Press.

  3. The Inversion of Sensory Processing by Feedback Pathways: A Model of Visual Cognitive Functions.

    ERIC Educational Resources Information Center

    Harth, E.; And Others

    1987-01-01

    Explains the hierarchic structure of the mammalian visual system. Proposes a model in which feedback pathways serve to modify sensory stimuli in ways that enhance and complete sensory input patterns. Investigates the functioning of the system through computer simulations. (ML)

  4. Sensory system plasticity in a visually specialized, nocturnal spider.

    PubMed

    Stafstrom, Jay A; Michalik, Peter; Hebets, Eileen A

    2017-04-21

    The interplay between an animal's environmental niche and its behavior can influence the evolutionary form and function of its sensory systems. While intraspecific variation in sensory systems has been documented across distant taxa, fewer studies have investigated how changes in behavior might relate to plasticity in sensory systems across developmental time. To investigate the relationships among behavior, peripheral sensory structures, and central processing regions in the brain, we take advantage of a dramatic within-species shift of behavior in a nocturnal, net-casting spider (Deinopis spinosa), where males cease visually-mediated foraging upon maturation. We compared eye diameters and brain region volumes across sex and life stage, the latter through micro-computed X-ray tomography. We show that mature males possess altered peripheral visual morphology when compared to their juvenile counterparts, as well as juvenile and mature females. Matching peripheral sensory structure modifications, we uncovered differences in relative investment in both lower-order and higher-order processing regions in the brain responsible for visual processing. Our study provides evidence for sensory system plasticity when individuals dramatically change behavior across life stages, uncovering new avenues of inquiry focusing on altered reliance of specific sensory information when entering a new behavioral niche.

  5. Learning to Perceive Structure from Motion and Neural Plasticity in Patients with Alzheimer's Disease

    ERIC Educational Resources Information Center

    Kim, Nam-Gyoon; Park, Jong-Hee

    2010-01-01

    Recent research has demonstrated that Alzheimer's disease (AD) affects the visual sensory pathways, producing a variety of visual deficits, including the capacity to perceive structure-from-motion (SFM). Because the sensory areas of the adult brain are known to retain a large degree of plasticity, the present study was conducted to explore whether…

  6. STANDARDS OF FUNCTIONAL MEASUREMENTS IN OCULAR TOXICOLOGY.

    EPA Science Inventory

    The visual system, like other sensory systems, may be a frequent target of exposure to toxic chemicals. A thorough evaluation of visual toxicity should include both structural and functional measures. Sensory evoked potentials are one set of neurophysiological procedures that...

  7. Aging and the interaction of sensory cortical function and structure.

    PubMed

    Peiffer, Ann M; Hugenschmidt, Christina E; Maldjian, Joseph A; Casanova, Ramon; Srikanth, Ryali; Hayasaka, Satoru; Burdette, Jonathan H; Kraft, Robert A; Laurienti, Paul J

    2009-01-01

    Even the healthiest older adults experience changes in cognitive and sensory function. Studies show that older adults have reduced neural responses to sensory information. However, it is well known that sensory systems do not act in isolation but function cooperatively to either enhance or suppress neural responses to individual environmental stimuli. Very little research has been dedicated to understanding how aging affects the interactions between sensory systems, especially cross-modal deactivations or the ability of one sensory system (e.g., audition) to suppress the neural responses in another sensory system cortex (e.g., vision). Such cross-modal interactions have been implicated in attentional shifts between sensory modalities and could account for increased distractibility in older adults. To assess age-related changes in cross-modal deactivations, functional MRI studies were performed in 61 adults between 18 and 80 years old during simple auditory and visual discrimination tasks. Results within visual cortex confirmed previous findings of decreased responses to visual stimuli for older adults. Age-related changes in the visual cortical response to auditory stimuli were, however, much more complex and suggested an alteration with age in the functional interactions between the senses. Ventral visual cortical regions exhibited cross-modal deactivations in younger but not older adults, whereas more dorsal aspects of visual cortex were suppressed in older but not younger adults. These differences in deactivation also remained after adjusting for age-related reductions in brain volume of sensory cortex. Thus, functional differences in cortical activity between older and younger adults cannot solely be accounted for by differences in gray matter volume. (c) 2007 Wiley-Liss, Inc.

  8. Managing daily life with age-related sensory loss: cognitive resources gain in importance.

    PubMed

    Heyl, Vera; Wahl, Hans-Werner

    2012-06-01

    This paper investigates the role of cognitive resources in everyday functioning, comparing visually impaired, hearing impaired, and sensory unimpaired older adults. According to arguments that cognitive resources are of increased importance and a greater awareness of cognitive restrictions exists among sensory impaired individuals, in particular among visually impaired individuals, we hypothesized differential relationships between resources and outcomes when comparing sensory impaired and sensory unimpaired older adults. Findings are based on samples of 121 visually impaired, 116 hearing impaired, and 150 sensory unimpaired older adults (M = 82 years). Results from a sample of 43 dual sensory impaired older adults are reported for comparison. Assessment relied on established instruments (e.g., WAIS-R, ADL/IADL). Structural equation modeling showed that cognitive resources and behavior-related everyday functioning were more strongly related in the sensory impaired groups as compared to the sensory unimpaired group. Cognitive resources and evaluation of everyday functioning were significantly linked only among the sensory impaired groups. When medical condition was controlled for, these effects persisted. It is concluded that both cognitive training as well as psychosocial support may serve as important additions to classic vision and hearing loss rehabilitation. PsycINFO Database Record (c) 2012 APA, all rights reserved

  9. Decoding the future from past experience: learning shapes predictions in early visual cortex.

    PubMed

    Luft, Caroline D B; Meeson, Alan; Welchman, Andrew E; Kourtzi, Zoe

    2015-05-01

    Learning the structure of the environment is critical for interpreting the current scene and predicting upcoming events. However, the brain mechanisms that support our ability to translate knowledge about scene statistics to sensory predictions remain largely unknown. Here we provide evidence that learning of temporal regularities shapes representations in early visual cortex that relate to our ability to predict sensory events. We tested the participants' ability to predict the orientation of a test stimulus after exposure to sequences of leftward- or rightward-oriented gratings. Using fMRI decoding, we identified brain patterns related to the observers' visual predictions rather than stimulus-driven activity. Decoding of predicted orientations following structured sequences was enhanced after training, while decoding of cued orientations following exposure to random sequences did not change. These predictive representations appear to be driven by the same large-scale neural populations that encode actual stimulus orientation and to be specific to the learned sequence structure. Thus our findings provide evidence that learning temporal structures supports our ability to predict future events by reactivating selective sensory representations as early as in primary visual cortex. Copyright © 2015 the American Physiological Society.
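
    The decoding logic in this record can be illustrated with a short sketch. The code below is only a schematic stand-in, not the authors' analysis pipeline: it trains a cross-validated linear classifier (scikit-learn) on simulated voxel patterns, the same kind of multivariate readout used to decode predicted versus presented orientations; the array sizes and signal strengths are illustrative assumptions.

```python
# Minimal sketch of fMRI pattern decoding (simulated data; not the study's pipeline).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_voxels = 200, 100
labels = rng.integers(0, 2, n_trials)              # 0 = leftward, 1 = rightward orientation
weights = rng.normal(size=n_voxels)                # hypothetical voxel tuning pattern
signal = np.outer(labels - 0.5, weights)           # weak orientation-dependent signal
patterns = signal + rng.normal(scale=2.0, size=(n_trials, n_voxels))  # add measurement noise

# Cross-validated decoding accuracy; in the study, above-chance decoding of the *predicted*
# (rather than physically presented) orientation after training is the key observation.
clf = SVC(kernel="linear")
print("decoding accuracy:", cross_val_score(clf, patterns, labels, cv=5).mean())
```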

  10. Sensory experience modifies feature map relationships in visual cortex

    PubMed Central

    Cloherty, Shaun L; Hughes, Nicholas J; Hietanen, Markus A; Bhagavatula, Partha S

    2016-01-01

    The extent to which brain structure is influenced by sensory input during development is a critical but controversial question. A paradigmatic system for studying this is the mammalian visual cortex. Maps of orientation preference (OP) and ocular dominance (OD) in the primary visual cortex of ferrets, cats and monkeys can be individually changed by altered visual input. However, the spatial relationship between OP and OD maps has appeared immutable. Using a computational model, we predicted that biasing the visual input to orthogonal orientations in the two eyes should cause a shift of OP pinwheels towards the border of OD columns. We then confirmed this prediction by rearing cats wearing orthogonally oriented cylindrical lenses over each eye. Thus, the spatial relationship between OP and OD maps can be modified by visual experience, revealing a previously unknown degree of brain plasticity in response to sensory input. DOI: http://dx.doi.org/10.7554/eLife.13911.001 PMID:27310531

  11. Rhythmic Oscillations of Visual Contrast Sensitivity Synchronized with Action

    PubMed Central

    Tomassini, Alice; Spinelli, Donatella; Jacono, Marco; Sandini, Giulio; Morrone, Maria Concetta

    2016-01-01

    It is well known that the motor and the sensory systems structure sensory data collection and cooperate to achieve an efficient integration and exchange of information. Increasing evidence suggests that both motor and sensory functions are regulated by rhythmic processes reflecting alternating states of neuronal excitability, and these may be involved in mediating sensory-motor interactions. Here we show an oscillatory fluctuation in early visual processing time locked with the execution of voluntary action, and, crucially, even for visual stimuli irrelevant to the motor task. Human participants were asked to perform a reaching movement toward a display and judge the orientation of a Gabor patch, near contrast threshold, briefly presented at random times before and during the reaching movement. When the data are temporally aligned to the onset of movement, visual contrast sensitivity oscillates with periodicity within the theta band. Importantly, the oscillations emerge during the motor planning stage, ~500 ms before movement onset. We suggest that brain oscillatory dynamics may mediate an automatic coupling between early motor planning and early visual processing, possibly instrumental in linking and closing up the visual-motor control loop. PMID:25948254

  12. The dorsal raphe modulates sensory responsiveness during arousal in zebrafish

    PubMed Central

    Yokogawa, Tohei; Hannan, Markus C.; Burgess, Harold A.

    2012-01-01

    During waking behavior animals adapt their state of arousal in response to environmental pressures. Sensory processing is regulated in aroused states and several lines of evidence imply that this is mediated at least partly by the serotonergic system. However there is little information directly showing that serotonergic function is required for state-dependent modulation of sensory processing. Here we find that zebrafish larvae can maintain a short-term state of arousal during which neurons in the dorsal raphe modulate sensory responsiveness to behaviorally relevant visual cues. Following a brief exposure to water flow, larvae show elevated activity and heightened sensitivity to perceived motion. Calcium imaging of neuronal activity after flow revealed increased activity in serotonergic neurons of the dorsal raphe. Genetic ablation of these neurons abolished the increase in visual sensitivity during arousal without affecting baseline visual function or locomotor activity. We traced projections from the dorsal raphe to a major visual area, the optic tectum. Laser ablation of the tectum demonstrated that this structure, like the dorsal raphe, is required for improved visual sensitivity during arousal. These findings reveal that serotonergic neurons of the dorsal raphe have a state-dependent role in matching sensory responsiveness to behavioral context. PMID:23100441

  13. Higher order visual input to the mushroom bodies in the bee, Bombus impatiens.

    PubMed

    Paulk, Angelique C; Gronenberg, Wulfila

    2008-11-01

    To produce appropriate behaviors based on biologically relevant associations, sensory pathways conveying different modalities are integrated by higher-order central brain structures, such as insect mushroom bodies. To address this function of sensory integration, we characterized the structure and response of optic lobe (OL) neurons projecting to the calyces of the mushroom bodies in bees. Bees are well known for their visual learning and memory capabilities and their brains possess major direct visual input from the optic lobes to the mushroom bodies. To functionally characterize these visual inputs to the mushroom bodies, we recorded intracellularly from neurons in bumblebees (Apidae: Bombus impatiens) and a single neuron in a honeybee (Apidae: Apis mellifera) while presenting color and motion stimuli. All of the mushroom body input neurons were color sensitive while a subset was motion sensitive. Additionally, most of the mushroom body input neurons would respond to the first, but not to subsequent, presentations of repeated stimuli. In general, the medulla or lobula neurons projecting to the calyx signaled specific chromatic, temporal, and motion features of the visual world to the mushroom bodies, which included sensory information required for the biologically relevant associations bees form during foraging tasks.

  14. Auditory and visual cortex of primates: a comparison of two sensory systems

    PubMed Central

    Rauschecker, Josef P.

    2014-01-01

    A comparative view of the brain, comparing related functions across species and sensory systems, offers a number of advantages. In particular, it allows separating the formal purpose of a model structure from its implementation in specific brains. Models of auditory cortical processing can be conceived by analogy to the visual cortex, incorporating neural mechanisms that are found in both the visual and auditory systems. Examples of such canonical features on the columnar level are direction selectivity, size/bandwidth selectivity, as well as receptive fields with segregated versus overlapping on- and off-sub-regions. On a larger scale, parallel processing pathways have been envisioned that represent the two main facets of sensory perception: 1) identification of objects and 2) processing of space. Expanding this model in terms of sensorimotor integration and control offers an overarching view of cortical function independent of sensory modality. PMID:25728177

  15. [Contemporary approach to evaluation of sensory disorders in polyneuropathy due to vibration].

    PubMed

    Nepershina, C P; Lagutina, G N; Kuzmina, L P; Skrypnik, O V; Ryabininal, S N; Lagutina, A P

    2016-08-01

    Recent studies have sought ways to visualize and objectify sensory disorders in polyneuropathy caused by vibration, with particular attention to the injured structures responsible for temperature and pain sensitivity. The examination covered 92 patients with vibration disease, aged 34 to 73 years. Methods used were pallesthesiometry, quantitative sensory tests, and pain questionnaires and scales (visual analog scale (VAS) of pain, Pain-Detect, MPQ DN-, HADS). Correlations were found between temperature and pain thresholds, VAS scores, and pallesthesiometry parameters. Analysis of the results indicates the formation of a distal polyneuropathy syndrome of the upper limbs with concomitant pain in vibration disease.

  16. ON THE PERCEPTION OF PROBABLE THINGS

    PubMed Central

    Albright, Thomas D.

    2012-01-01

    Perception is influenced both by the immediate pattern of sensory inputs and by memories acquired through prior experiences with the world. Throughout much of its illustrious history, however, study of the cellular basis of perception has focused on neuronal structures and events that underlie the detection and discrimination of sensory stimuli. Relatively little attention has been paid to the means by which memories interact with incoming sensory signals. Building upon recent neurophysiological/behavioral studies of the cortical substrates of visual associative memory, I propose a specific functional process by which stored information about the world supplements sensory inputs to yield neuronal signals that can account for visual perceptual experience. This perspective represents a significant shift in the way we think about the cellular bases of perception. PMID:22542178

  17. Enhanced and bilateralized visual sensory processing in the ventral stream may be a feature of normal aging.

    PubMed

    De Sanctis, Pierfilippo; Katz, Richard; Wylie, Glenn R; Sehatpour, Pejman; Alexopoulos, George S; Foxe, John J

    2008-10-01

    Evidence has emerged for age-related amplification of basic sensory processing indexed by early components of the visual evoked potential (VEP). However, since these age-related effects have been incidental to the main focus of these studies, it is unclear whether they are performance dependent or alternately, represent intrinsic sensory processing changes. High-density VEPs were acquired from 19 healthy elderly and 15 young control participants who viewed alphanumeric stimuli in the absence of any active task. The data show both enhanced and delayed neural responses within structures of the ventral visual stream, with reduced hemispheric asymmetry in the elderly that may be indicative of a decline in hemispheric specialization. Additionally, considerably enhanced early frontal cortical activation was observed in the elderly, suggesting frontal hyper-activation. These age-related differences in early sensory processing are discussed in terms of recent proposals that normal aging involves large-scale compensatory reorganization. Our results suggest that such compensatory mechanisms are not restricted to later higher-order cognitive processes but may also be a feature of early sensory-perceptual processes.

  18. Locomotor Sensory Organization Test: How Sensory Conflict Affects the Temporal Structure of Sway Variability During Gait.

    PubMed

    Chien, Jung Hung; Mukherjee, Mukul; Siu, Ka-Chun; Stergiou, Nicholas

    2016-05-01

    When maintaining postural stability under increased sensory conflict, a temporally more rigid response is used, in which the available degrees of freedom are essentially frozen. The current study investigated whether such a strategy is also utilized in more dynamic situations of postural control, as is the case with walking. This study attempted to answer this question by using the Locomotor Sensory Organization Test (LSOT), an apparatus that incorporates SOT-inspired perturbations of the visual and somatosensory systems. Ten healthy young adults performed the six conditions of the traditional SOT and the corresponding six conditions on the LSOT. The temporal structure of sway variability was evaluated for all conditions. The results showed that in the anterior-posterior direction, somatosensory input is crucial for postural control during both walking and standing; visual input also had an effect but was not as prominent as the somatosensory input. In the medial-lateral direction and with respect to walking, visual input has a much larger effect than somatosensory input, possibly due to the added contribution of peripheral vision during walking; in standing, such contributions may not be as significant for postural control. In sum, as sensory conflict increases, more rigid and regular sway patterns are found during standing, confirming previous results in the literature, whereas the opposite was the case with walking, where more exploratory and adaptive movement patterns are present.

  19. Strength of figure-ground activity in monkey primary visual cortex predicts saccadic reaction time in a delayed detection task.

    PubMed

    Supèr, Hans; Lamme, Victor A F

    2007-06-01

    When and where are decisions made? In the visual system a saccade, which is a fast shift of gaze toward a target in the visual scene, is the behavioral outcome of a decision. Current neurophysiological data and reaction time models show that saccadic reaction times are determined by a build-up of activity in motor-related structures, such as the frontal eye fields, a build-up that depends on the sensory evidence of the stimulus. Here we use a delayed figure-ground detection task to show that late modulated activity in the visual cortex (V1) predicts saccadic reaction time. This predictive activity is part of the process of figure-ground segregation and is specific for the saccade target location. These observations indicate that sensory signals are directly involved in the decision of when and where to look.

  20. Strabismus and the Oculomotor System: Insights from Macaque Models

    PubMed Central

    Das, Vallabh E.

    2017-01-01

    Disrupting binocular vision in infancy leads to strabismus and oftentimes to a variety of associated visual sensory deficits and oculomotor abnormalities. Investigation of this disorder has been aided by the development of various animal models, each of which has advantages and disadvantages. In comparison to studies of binocular visual responses in cortical structures, investigations of neural oculomotor structures that mediate the misalignment and abnormalities of eye movements have been more recent, and these studies have shown that different brain areas are intimately involved in driving several aspects of the strabismic condition, including horizontal misalignment, dissociated deviations, A and V patterns of strabismus, disconjugate eye movements, nystagmus, and fixation switch. The responses of cells in visual and oculomotor areas that potentially drive the sensory deficits and also eye alignment and eye movement abnormalities follow a general theme of disrupted calibration, lower sensitivity, and poorer specificity compared with the normally developed visual oculomotor system. PMID:28532347

  21. A dual-trace model for visual sensory memory.

    PubMed

    Cappiello, Marcus; Zhang, Weiwei

    2016-11-01

    Visual sensory memory refers to a transient memory lingering briefly after the stimulus offset. Although previous literature suggests that visual sensory memory is supported by a fine-grained trace for continuous representation and a coarse-grained trace of categorical information, simultaneous separation and assessment of these traces can be difficult without a quantitative model. The present study used a continuous estimation procedure to test a novel mathematical model of the dual-trace hypothesis of visual sensory memory, according to which visual sensory memory can be modeled as a mixture of 2 von Mises (2VM) distributions differing in standard deviation. When visual sensory memory and working memory (WM) for colors were distinguished using different experimental manipulations in the first 3 experiments, the 2VM model outperformed Zhang and Luck's (2008) standard mixture model (SM), which represents a mixture of a single memory trace and random guesses, even though SM outperformed 2VM for WM. Experiment 4 generalized the 2VM model's advantage over SM in fitting visual sensory memory data from color to orientation. Furthermore, a single-trace model and 4 other alternative models were ruled out, suggesting the necessity and sufficiency of dual traces for visual sensory memory. Together these results support the dual-trace model of visual sensory memory and provide a preliminary inquiry into the nature of information loss from visual sensory memory to WM. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
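
    For readers who want to see the model's form concretely, here is a minimal sketch of the 2VM idea described above. It is not the authors' code: it simply defines a two-component von Mises mixture over recall errors (two traces sharing a mean but differing in concentration) and fits it by maximum likelihood with SciPy; the function names, starting values and simulated data are illustrative assumptions.

```python
# Sketch of the dual-trace (2VM) model: recall errors on a circular feature dimension are
# modeled as a mixture of a fine-grained and a coarse-grained von Mises trace
# (not the authors' implementation; parameter values are illustrative).
import numpy as np
from scipy.stats import vonmises
from scipy.optimize import minimize

def neg_log_likelihood(params, errors):
    """params = (mixing weight, kappa of fine trace, kappa of coarse trace); errors in radians."""
    w, k_fine, k_coarse = params
    pdf = w * vonmises.pdf(errors, k_fine) + (1 - w) * vonmises.pdf(errors, k_coarse)
    return -np.sum(np.log(pdf + 1e-12))

def fit_2vm(errors):
    """Maximum-likelihood fit of the two-component von Mises mixture."""
    start = np.array([0.5, 10.0, 1.0])
    bounds = [(0.01, 0.99), (0.01, 100.0), (0.01, 100.0)]
    return minimize(neg_log_likelihood, start, args=(errors,), bounds=bounds).x

# Simulated example: 60% fine trace (kappa = 12), 40% coarse trace (kappa = 1.5).
rng = np.random.default_rng(0)
errors = np.concatenate([vonmises.rvs(12, size=600, random_state=rng),
                         vonmises.rvs(1.5, size=400, random_state=rng)])
print(fit_2vm(errors))   # roughly recovers the generating parameters (up to label order)
```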

  22. Disentangling the role of floral sensory stimuli in pollination networks.

    PubMed

    Kantsa, Aphrodite; Raguso, Robert A; Dyer, Adrian G; Olesen, Jens M; Tscheulin, Thomas; Petanidou, Theodora

    2018-03-12

    Despite progress in understanding pollination network structure, the functional roles of floral sensory stimuli (visual, olfactory) have never been addressed comprehensively in a community context, even though such traits are known to mediate plant-pollinator interactions. Here, we use a comprehensive dataset of floral traits and a novel dynamic data-pooling methodology to explore the impacts of floral sensory diversity on the structure of a pollination network in a Mediterranean scrubland. Our approach tracks transitions in the network behaviour of each plant species throughout its flowering period and, despite dynamism in visitor composition, reveals significant links to floral scent and/or colour as perceived by pollinators. Having accounted for floral phenology, abundance and phylogeny, the persistent association between floral sensory traits and visitor guilds supports a deeper role for sensory bias and diffuse coevolution in structuring plant-pollinator networks. This knowledge of floral sensory diversity, by identifying the most influential phenotypes, could help prioritize efforts for plant-pollinator community restoration.

  23. Visual function and cognitive speed of processing mediate age-related decline in memory span and fluid intelligence

    PubMed Central

    Clay, Olivio J.; Edwards, Jerri D.; Ross, Lesley A.; Okonkwo, Ozioma; Wadley, Virginia G.; Roth, David L.; Ball, Karlene K.

    2010-01-01

    Objectives: To evaluate the relationship between sensory and cognitive decline, particularly with respect to speed of processing, memory span, and fluid intelligence. Additionally, the common cause, sensory degradation, and speed of processing hypotheses were compared. Methods: Structural equation modeling was used to investigate the complex relationships among age-related decrements in these areas. Results: Cross-sectional data analyses included 842 older adult participants (M = 73 years). After accounting for age-related declines in vision and processing speed, the direct associations between age and memory span and between age and fluid intelligence were nonsignificant. Older age was associated with visual decline, which was associated with slower speed of processing, which in turn was associated with greater cognitive deficits. Discussion: The findings support both the sensory degradation and speed of processing accounts of age-related cognitive decline. Further, the findings highlight positive aspects of normal cognitive aging in that older age may not be associated with a loss of fluid intelligence if visual sensory functioning and processing speed can be maintained. PMID:19436063
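
    The mediation chain summarized above (older age, poorer vision, slower processing, lower memory span) can be sketched with a toy regression example. The code below is an illustrative assumption, not the study's structural equation model or data: it simulates the hypothesized causal chain and shows how the direct age effect shrinks once the mediators are included.

```python
# Toy illustration of the serial mediation logic (simulated data; not the study's SEM).
import numpy as np

rng = np.random.default_rng(3)
n = 842                                                   # sample size reported in the abstract
age = rng.uniform(65, 90, n)
vision = -0.5 * age + rng.normal(scale=5, size=n)         # older age -> poorer visual function
speed = 0.6 * vision + rng.normal(scale=5, size=n)        # poorer vision -> slower processing
memory_span = 0.7 * speed + rng.normal(scale=5, size=n)   # slower processing -> lower span

def slopes(y, X):
    """Ordinary least-squares slopes (intercept column added internally)."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

total_age_effect = slopes(memory_span, age)[0]
direct_age_effect = slopes(memory_span, np.column_stack([age, vision, speed]))[0]
print(f"total age effect: {total_age_effect:.2f}")
print(f"direct age effect (vision and speed controlled): {direct_age_effect:.2f}")
# As in the study, the direct association shrinks toward zero once the mediators carry the age effect.
```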

  24. Visual Cortex Plasticity: A Complex Interplay of Genetic and Environmental Influences

    PubMed Central

    Maya-Vetencourt, José Fernando; Origlia, Nicola

    2012-01-01

    The central nervous system architecture is highly dynamic and continuously modified by sensory experience through processes of neuronal plasticity. Plasticity is achieved by a complex interplay of environmental influences and physiological mechanisms that ultimately activate intracellular signal transduction pathways regulating gene expression. In addition to the remarkable variety of transcription factors and their combinatorial interaction at specific gene promoters, epigenetic mechanisms that regulate transcription have emerged as conserved processes by which the nervous system accomplishes the induction of plasticity. Experience-dependent changes of DNA methylation patterns and histone posttranslational modifications are, in fact, recruited as targets of plasticity-associated signal transduction mechanisms. Here, we shall concentrate on structural and functional consequences of early sensory deprivation in the visual system and discuss how intracellular signal transduction pathways associated with experience regulate changes of chromatin structure and gene expression patterns that underlie these plastic phenomena. Recent experimental evidence for mechanisms of cross-modal plasticity following congenital or acquired sensory deprivation both in human and animal models will be considered as well. We shall also review different experimental strategies that can be used to achieve the recovery of sensory functions after long-term deprivation in humans. PMID:22852098

  25. Visual system plasticity in mammals: the story of monocular enucleation-induced vision loss

    PubMed Central

    Nys, Julie; Scheyltjens, Isabelle; Arckens, Lutgarde

    2015-01-01

    The groundbreaking work of Hubel and Wiesel in the 1960s on ocular dominance plasticity instigated many studies of the visual system of mammals, enriching our understanding of how the development of its structure and function depends on high quality visual input through both eyes. These studies have mainly employed lid suturing, dark rearing and eye patching applied to different species to reduce or impair visual input, and have created extensive knowledge on binocular vision. However, not all aspects and types of plasticity in the visual cortex have been covered in full detail. In that regard, a more drastic deprivation method like enucleation, leading to complete vision loss, appears useful as it has more widespread effects on the afferent visual pathway and even on non-visual brain regions. One-eyed vision due to monocular enucleation (ME) profoundly affects the contralateral retinorecipient subcortical and cortical structures, thereby creating a powerful means to investigate cortical plasticity phenomena in which binocular competition has no vote. In this review, we will present current knowledge about the specific application of ME as an experimental tool to study visual and cross-modal brain plasticity and compare early postnatal stages up into adulthood. The structural and physiological consequences of this type of extensive sensory loss as documented and studied in several animal species and human patients will be discussed. We will summarize how ME studies have been instrumental to our current understanding of the differentiation of sensory systems and how the structure and function of cortical circuits in mammals are shaped in response to such an extensive alteration in experience. In conclusion, we will highlight future perspectives and the clinical relevance of adding ME to the list of more longstanding deprivation models in visual system research. PMID:25972788

  26. Mental Imagery and Visual Working Memory

    PubMed Central

    Keogh, Rebecca; Pearson, Joel

    2011-01-01

    Visual working memory provides an essential link between past and future events. Despite recent efforts, capacity limits, their genesis and the underlying neural structures of visual working memory remain unclear. Here we show that performance in visual working memory - but not iconic visual memory - can be predicted by the strength of mental imagery as assessed with binocular rivalry in a given individual. In addition, for individuals with strong imagery, modulating the background luminance diminished performance on visual working memory and imagery tasks, but not working memory for number strings. This suggests that luminance signals were disrupting sensory-based imagery mechanisms and not a general working memory system. Individuals with poor imagery still performed above chance in the visual working memory task, but their performance was not affected by the background luminance, suggesting a dichotomy in strategies for visual working memory: individuals with strong mental imagery rely on sensory-based imagery to support mnemonic performance, while those with poor imagery rely on different strategies. These findings could help reconcile current controversy regarding the mechanism and location of visual mnemonic storage. PMID:22195024

  27. Mental imagery and visual working memory.

    PubMed

    Keogh, Rebecca; Pearson, Joel

    2011-01-01

    Visual working memory provides an essential link between past and future events. Despite recent efforts, capacity limits, their genesis and the underlying neural structures of visual working memory remain unclear. Here we show that performance in visual working memory--but not iconic visual memory--can be predicted by the strength of mental imagery as assessed with binocular rivalry in a given individual. In addition, for individuals with strong imagery, modulating the background luminance diminished performance on visual working memory and imagery tasks, but not working memory for number strings. This suggests that luminance signals were disrupting sensory-based imagery mechanisms and not a general working memory system. Individuals with poor imagery still performed above chance in the visual working memory task, but their performance was not affected by the background luminance, suggesting a dichotomy in strategies for visual working memory: individuals with strong mental imagery rely on sensory-based imagery to support mnemonic performance, while those with poor imagery rely on different strategies. These findings could help reconcile current controversy regarding the mechanism and location of visual mnemonic storage.

  28. Computer-aided training sensorimotor cortex functions in humans before the upper limb transplantation using virtual reality and sensory feedback.

    PubMed

    Kurzynski, Marek; Jaskolska, Anna; Marusiak, Jaroslaw; Wolczowski, Andrzej; Bierut, Przemyslaw; Szumowski, Lukasz; Witkowski, Jerzy; Kisiel-Sajewicz, Katarzyna

    2017-08-01

    One of the biggest problems of upper limb transplantation is the lack of certainty as to whether a patient will be able to control voluntary movements of the transplanted hands. Based on findings of recent research on cortical plasticity, a premise can be drawn that mental training supported with visual and sensory feedback can cause structural and functional reorganization of the sensorimotor cortex, which leads to recovery of function associated with the control of movements performed by the upper limbs. In this study, the authors - based on the above observations - propose a computer-aided training (CAT) system which, by generating visual and sensory stimuli, should enhance the effectiveness of mental training applied to humans before upper limb transplantation. The basis for the concept of the computer-aided training system is a virtual hand whose reaching and grasping movements the trained patient can observe on a VR headset screen (visual feedback) and whose contact with virtual objects the patient can feel as touch (sensory feedback). The computer training system is composed of three main components: (1) the system generating the 3D virtual world in which the patient sees the virtual limb from a perspective as if it were his/her own hand; (2) sensory feedback transforming information about the interaction of the virtual hand with the grasped object into mechanical vibration; (3) the therapist's panel for controlling the training course. Results of the case study demonstrate that mental training supported with visual and sensory stimuli generated by the computer system leads to a beneficial change in the brain activity related to motor control of reaching in a patient with bilateral upper limb congenital transverse deficiency. Copyright © 2017 Elsevier Ltd. All rights reserved.

  29. Superior short-term learning effect of visual and sensory organisation ability when sensory information is unreliable in adolescent rhythmic gymnasts.

    PubMed

    Chen, Hui-Ya; Chang, Hsiao-Yun; Ju, Yan-Ying; Tsao, Hung-Ting

    2017-06-01

    Rhythmic gymnasts specialise in dynamic balance under sensory conditions of numerous somatosensory, visual, and vestibular stimulations. This study investigated whether adolescent rhythmic gymnasts are superior to peers in Sensory Organisation Test (SOT) performance, which quantifies the ability to maintain standing balance in six sensory conditions, and explored whether they plateaued faster during familiarisation with the SOT. Three and six sessions of SOTs were administered to 15 female rhythmic gymnasts (15.0 ± 1.8 years) and matched peers (15.1 ± 2.1 years), respectively. The gymnasts were superior to their peers in terms of fitness measures, and their performance was better in the SOT equilibrium score when visual information was unreliable. The SOT learning effects were shown in more challenging sensory conditions between Sessions 1 and 2 and were equivalent in both groups; however, over time, the gymnasts gained a marginally significant improvement in visual ability and relied less on vision when it was unreliable. In conclusion, adolescent rhythmic gymnasts have generally the same sensory organisation ability and learning rates as their peers. However, when visual information is unreliable, they have superior sensory organisation ability and learn faster to rely less on vision.

  30. Top-down influence on the visual cortex of the blind during sensory substitution

    PubMed Central

    Murphy, Matthew C.; Nau, Amy C.; Fisher, Christopher; Kim, Seong-Gi; Schuman, Joel S.; Chan, Kevin C.

    2017-01-01

    Visual sensory substitution devices provide a non-surgical and flexible approach to vision rehabilitation in the blind. These devices convert images taken by a camera into cross-modal sensory signals that are presented as a surrogate for direct visual input. While previous work has demonstrated that the visual cortex of blind subjects is recruited during sensory substitution, the cognitive basis of this activation remains incompletely understood. To test the hypothesis that top-down input provides a significant contribution to this activation, we performed functional MRI scanning in 11 blind (7 acquired and 4 congenital) and 11 sighted subjects under two conditions: passive listening of image-encoded soundscapes before sensory substitution training and active interpretation of the same auditory sensory substitution signals after a 10-minute training session. We found that the modulation of visual cortex activity due to active interpretation was significantly stronger in the blind over sighted subjects. In addition, congenitally blind subjects showed stronger task-induced modulation in the visual cortex than acquired blind subjects. In a parallel experiment, we scanned 18 blind (11 acquired and 7 congenital) and 18 sighted subjects at rest to investigate alterations in functional connectivity due to visual deprivation. The results demonstrated that visual cortex connectivity of the blind shifted away from sensory networks and toward known areas of top-down input. Taken together, our data support the model of the brain, including the visual system, as a highly flexible task-based and not sensory-based machine. PMID:26584776

  31. Auditory and visual sequence learning in humans and monkeys using an artificial grammar learning paradigm.

    PubMed

    Milne, Alice E; Petkov, Christopher I; Wilson, Benjamin

    2017-07-05

    Language flexibly supports the human ability to communicate using different sensory modalities, such as writing and reading in the visual modality and speaking and listening in the auditory domain. Although it has been argued that nonhuman primate communication abilities are inherently multisensory, direct behavioural comparisons between human and nonhuman primates are scant. Artificial grammar learning (AGL) tasks and statistical learning experiments can be used to emulate ordering relationships between words in a sentence. However, previous comparative work using such paradigms has primarily investigated sequence learning within a single sensory modality. We used an AGL paradigm to evaluate how humans and macaque monkeys learn and respond to identically structured sequences of either auditory or visual stimuli. In the auditory and visual experiments, we found that both species were sensitive to the ordering relationships between elements in the sequences. Moreover, the humans and monkeys produced largely similar response patterns to the visual and auditory sequences, indicating that the sequences are processed in comparable ways across the sensory modalities. These results provide evidence that human sequence processing abilities stem from an evolutionarily conserved capacity that appears to operate comparably across the sensory modalities in both human and nonhuman primates. The findings set the stage for future neurobiological studies to investigate the multisensory nature of these sequencing operations in nonhuman primates and how they compare to related processes in humans. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  32. Visual indication of environmental humidity by using poly(ionic liquid) photonic crystals.

    PubMed

    Huang, Jing; Tao, Cheng-an; An, Qi; Lin, Changxu; Li, Xuesong; Xu, Dan; Wu, Yiguang; Li, Xiaogang; Shen, Dezhong; Li, Guangtao

    2010-06-21

    The combination of poly(ionic liquid) and photonic structure affords a new class of self-reporting humidity sensory materials with excellent reversibility, which are able to rapidly, sensitively and visually indicate environmental humidity with a colour change from blue to green, orange, and red, covering the whole visible range.

  33. Not Your Mother's View: The Dynamics of Toddler Visual Experience

    ERIC Educational Resources Information Center

    Smith, Linda B.; Yu, Chen; Pereira, Alfredo F.

    2011-01-01

    Human toddlers learn about objects through second-by-second, minute-by-minute sensory-motor interactions. In an effort to understand how toddlers' bodily actions structure the visual learning environment, mini-video cameras were placed low on the foreheads of toddlers, and for comparison also on the foreheads of their parents, as they jointly…

  34. The Role of Sensory-Motor Information in Object Recognition: Evidence from Category-Specific Visual Agnosia

    ERIC Educational Resources Information Center

    Wolk, D.A.; Coslett, H.B.; Glosser, G.

    2005-01-01

    The role of sensory-motor representations in object recognition was investigated in experiments involving AD, a patient with mild visual agnosia who was impaired in the recognition of visually presented living as compared to non-living entities. AD named visually presented items for which sensory-motor information was available significantly more…

  35. The effects of neck flexion on cerebral potentials evoked by visual, auditory and somatosensory stimuli and focal brain blood flow in related sensory cortices

    PubMed Central

    2012-01-01

    Background: A flexed neck posture leads to non-specific activation of the brain. Sensory evoked cerebral potentials and focal brain blood flow have been used to evaluate the activation of the sensory cortex. We investigated the effects of a flexed neck posture on the cerebral potentials evoked by visual, auditory and somatosensory stimuli and focal brain blood flow in the related sensory cortices. Methods: Twelve healthy young adults received right visual hemi-field, binaural auditory and left median nerve stimuli while sitting with the neck in a resting and flexed (20° flexion) position. Sensory evoked potentials were recorded from the right occipital region, Cz in accordance with the international 10–20 system, and 2 cm posterior from C4, during visual, auditory and somatosensory stimulations. The oxidative-hemoglobin concentration was measured in the respective sensory cortex using near-infrared spectroscopy. Results: Latencies of the late component of all sensory evoked potentials significantly shortened, and the amplitude of auditory evoked potentials increased when the neck was in a flexed position. Oxidative-hemoglobin concentrations in the left and right visual cortices were higher during visual stimulation in the flexed neck position. The left visual cortex is responsible for receiving the visual information. In addition, oxidative-hemoglobin concentrations in the bilateral auditory cortex during auditory stimulation, and in the right somatosensory cortex during somatosensory stimulation, were higher in the flexed neck position. Conclusions: Visual, auditory and somatosensory pathways were activated by neck flexion. The sensory cortices were selectively activated, reflecting the modalities in sensory projection to the cerebral cortex and inter-hemispheric connections. PMID:23199306

  36. Visual motion detection and habitat preference in Anolis lizards.

    PubMed

    Steinberg, David S; Leal, Manuel

    2016-11-01

    The perception of visual stimuli has been a major area of inquiry in sensory ecology, and much of this work has focused on coloration. However, for visually oriented organisms, the process of visual motion detection is often equally crucial to survival and reproduction. Despite the importance of motion detection to many organisms' daily activities, the degree of interspecific variation in the perception of visual motion remains largely unexplored. Furthermore, the factors driving this potential variation (e.g., ecology or evolutionary history) along with the effects of such variation on behavior are unknown. We used a behavioral assay under laboratory conditions to quantify the visual motion detection systems of three species of Puerto Rican Anolis lizard that prefer distinct structural habitat types. We then compared our results to data previously collected for anoles from Cuba, Puerto Rico, and Central America. Our findings indicate that general visual motion detection parameters are similar across species, regardless of habitat preference or evolutionary history. We argue that these conserved sensory properties may drive the evolution of visual communication behavior in this clade.

  37. Top-down influence on the visual cortex of the blind during sensory substitution.

    PubMed

    Murphy, Matthew C; Nau, Amy C; Fisher, Christopher; Kim, Seong-Gi; Schuman, Joel S; Chan, Kevin C

    2016-01-15

    Visual sensory substitution devices provide a non-surgical and flexible approach to vision rehabilitation in the blind. These devices convert images taken by a camera into cross-modal sensory signals that are presented as a surrogate for direct visual input. While previous work has demonstrated that the visual cortex of blind subjects is recruited during sensory substitution, the cognitive basis of this activation remains incompletely understood. To test the hypothesis that top-down input provides a significant contribution to this activation, we performed functional MRI scanning in 11 blind (7 acquired and 4 congenital) and 11 sighted subjects under two conditions: passive listening of image-encoded soundscapes before sensory substitution training and active interpretation of the same auditory sensory substitution signals after a 10-minute training session. We found that the modulation of visual cortex activity due to active interpretation was significantly stronger in the blind over sighted subjects. In addition, congenitally blind subjects showed stronger task-induced modulation in the visual cortex than acquired blind subjects. In a parallel experiment, we scanned 18 blind (11 acquired and 7 congenital) and 18 sighted subjects at rest to investigate alterations in functional connectivity due to visual deprivation. The results demonstrated that visual cortex connectivity of the blind shifted away from sensory networks and toward known areas of top-down input. Taken together, our data support the model of the brain, including the visual system, as a highly flexible task-based and not sensory-based machine. Copyright © 2015 Elsevier Inc. All rights reserved.

  38. Into the black and back: the ecology of brain investment in Neotropical army ants (Formicidae: Dorylinae)

    NASA Astrophysics Data System (ADS)

    Bulova, S.; Purce, K.; Khodak, P.; Sulger, E.; O'Donnell, S.

    2016-04-01

    Shifts to new ecological settings can drive evolutionary changes in animal sensory systems and in the brain structures that process sensory information. We took advantage of the diverse habitat ecology of Neotropical army ants to test whether evolutionary transitions from below- to above-ground activity were associated with changes in brain structure. Our estimates of genus-typical frequencies of above-ground activity suggested a high degree of evolutionary plasticity in habitat use among Neotropical army ants. Brain structure consistently corresponded to degree of above-ground activity among genera and among species within genera. The most above-ground genera (and species) invested relatively more in visual processing brain tissues; the most subterranean species invested relatively less in central processing higher-brain centers (mushroom body calyces). These patterns suggest a strong role of sensory ecology (e.g., light levels) in selecting for army ant brain investment evolution and further suggest that the subterranean environment poses reduced cognitive challenges to workers. The highly above-ground active genus Eciton was exceptional in having relatively large brains and particularly large and structurally complex optic lobes. These patterns suggest that the transition to above-ground activity from ancestors that were largely subterranean for approximately 60 million years was followed by re-emergence of enhanced visual function in workers.

  39. Visual cortex responses reflect temporal structure of continuous quasi-rhythmic sensory stimulation.

    PubMed

    Keitel, Christian; Thut, Gregor; Gross, Joachim

    2017-02-01

    Neural processing of dynamic continuous visual input, and cognitive influences thereon, are frequently studied in paradigms employing strictly rhythmic stimulation. However, the temporal structure of natural stimuli is hardly ever fully rhythmic but possesses certain spectral bandwidths (e.g. lip movements in speech, gestures). Examining periodic brain responses elicited by strictly rhythmic stimulation might thus represent ideal, yet isolated cases. Here, we tested how the visual system reflects quasi-rhythmic stimulation with frequencies continuously varying within ranges of the classical theta (4-7 Hz), alpha (8-13 Hz) and beta (14-20 Hz) bands using EEG. Our findings substantiate a systematic and sustained neural phase-locking to stimulation in all three frequency ranges. Further, we found that allocation of spatial attention enhances EEG-stimulus locking to theta- and alpha-band stimulation. Our results bridge recent findings regarding phase locking ("entrainment") to quasi-rhythmic visual input and "frequency-tagging" experiments employing strictly rhythmic stimulation. We propose that sustained EEG-stimulus locking can be considered as a continuous neural signature of processing dynamic sensory input in early visual cortices. Accordingly, EEG-stimulus locking serves to trace the temporal evolution of rhythmic as well as quasi-rhythmic visual input and is subject to attentional bias. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
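
    As a rough illustration of what "EEG-stimulus locking" to quasi-rhythmic input means computationally, the sketch below builds a frequency-modulated alpha-band stimulus, a noisy "EEG" channel that lags and partly tracks it, and then computes their spectral coherence with SciPy. This is a simplified stand-in under stated assumptions, not the paper's EEG analysis.

```python
# Simulated example of stimulus-brain locking for quasi-rhythmic input (not the paper's method).
import numpy as np
from scipy.signal import coherence

fs, dur = 250, 120                                     # sampling rate (Hz) and duration (s), assumed
t = np.arange(0, dur, 1 / fs)
inst_freq = 10.5 + 2.5 * np.sin(2 * np.pi * 0.1 * t)   # instantaneous frequency drifts within 8-13 Hz
stimulus = np.sin(2 * np.pi * np.cumsum(inst_freq) / fs)

lag = int(0.07 * fs)                                   # ~70 ms response latency, assumed
rng = np.random.default_rng(2)
eeg = 0.3 * np.roll(stimulus, lag) + rng.normal(size=t.size)   # attenuated, lagged copy plus noise

f, coh = coherence(stimulus, eeg, fs=fs, nperseg=4 * fs)
alpha = (f >= 8) & (f <= 13)
print(f"mean alpha-band stimulus-EEG coherence: {coh[alpha].mean():.2f}")
```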

  40. Context generalization in Drosophila visual learning requires the mushroom bodies

    NASA Astrophysics Data System (ADS)

    Liu, Li; Wolf, Reinhard; Ernst, Roman; Heisenberg, Martin

    1999-08-01

    The world is permanently changing. Laboratory experiments on learning and memory normally minimize this feature of reality, keeping all conditions except the conditioned and unconditioned stimuli as constant as possible. In the real world, however, animals need to extract from the universe of sensory signals the actual predictors of salient events by separating them from non-predictive stimuli (context). In principle, this can be achieved if only those sensory inputs that resemble the reinforcer in their temporal structure are taken as predictors. Here we study visual learning in the fly Drosophila melanogaster, using a flight simulator, and show that memory retrieval is, indeed, partially context-independent. Moreover, we show that the mushroom bodies, which are required for olfactory but not visual or tactile learning, effectively support context generalization. In visual learning in Drosophila, it appears that a facilitating effect of context cues for memory retrieval is the default state, whereas making recall context-independent requires additional processing.

  41. Projection-specific visual feature encoding by layer 5 cortical subnetworks

    PubMed Central

    Lur, Gyorgy; Vinck, Martin A.; Tang, Lan; Cardin, Jessica A.; Higley, Michael J.

    2016-01-01

    Primary neocortical sensory areas act as central hubs, distributing afferent information to numerous cortical and subcortical structures. However, it remains unclear whether each downstream target receives distinct versions of sensory information. We used in vivo calcium imaging combined with retrograde tracing to monitor visual response properties of three distinct subpopulations of projection neurons in primary visual cortex. While there is overlap across the groups, on average corticotectal (CT) cells exhibit lower contrast thresholds and broader tuning for orientation and spatial frequency in comparison to corticostriatal (CS) cells, while corticocortical (CC) cells have intermediate properties. Noise correlational analyses support the hypothesis that CT cells integrate information across diverse layer 5 populations, whereas CS and CC cells form more selectively interconnected groups. Overall, our findings demonstrate the existence of functional subnetworks within layer 5 that may differentially route visual information to behaviorally relevant downstream targets. PMID:26972011

  2. Supranormal orientation selectivity of visual neurons in orientation-restricted animals.

    PubMed

    Sasaki, Kota S; Kimura, Rui; Ninomiya, Taihei; Tabuchi, Yuka; Tanaka, Hiroki; Fukui, Masayuki; Asada, Yusuke C; Arai, Toshiya; Inagaki, Mikio; Nakazono, Takayuki; Baba, Mika; Kato, Daisuke; Nishimoto, Shinji; Sanada, Takahisa M; Tani, Toshiki; Imamura, Kazuyuki; Tanaka, Shigeru; Ohzawa, Izumi

    2015-11-16

    Altered sensory experience in early life often leads to remarkable adaptations so that humans and animals can make the best use of the available information in a particular environment. By restricting visual input to a limited range of orientations in young animals, this investigation shows that stimulus selectivity, e.g., the sharpness of tuning of single neurons in the primary visual cortex, is modified to match a particular environment. Specifically, neurons tuned to an experienced orientation in orientation-restricted animals show sharper orientation tuning than neurons in normal animals, whereas the opposite was true for neurons tuned to non-experienced orientations. This sharpened tuning appears to be due to elongated receptive fields. Our results demonstrate that restricted sensory experiences can sculpt the supranormal functions of single neurons tailored for a particular environment. The above findings, in addition to the minimal population response to orientations close to the experienced one, agree with the predictions of a sparse coding hypothesis in which information is represented efficiently by a small number of activated neurons. This suggests that early brain areas adopt an efficient strategy for coding information even when animals are raised in a severely limited visual environment where sensory inputs have an unnatural statistical structure.

  3. Supranormal orientation selectivity of visual neurons in orientation-restricted animals

    PubMed Central

    Sasaki, Kota S.; Kimura, Rui; Ninomiya, Taihei; Tabuchi, Yuka; Tanaka, Hiroki; Fukui, Masayuki; Asada, Yusuke C.; Arai, Toshiya; Inagaki, Mikio; Nakazono, Takayuki; Baba, Mika; Kato, Daisuke; Nishimoto, Shinji; Sanada, Takahisa M.; Tani, Toshiki; Imamura, Kazuyuki; Tanaka, Shigeru; Ohzawa, Izumi

    2015-01-01

    Altered sensory experience in early life often leads to remarkable adaptations so that humans and animals can make the best use of the available information in a particular environment. By restricting visual input to a limited range of orientations in young animals, this investigation shows that stimulus selectivity, e.g., the sharpness of tuning of single neurons in the primary visual cortex, is modified to match a particular environment. Specifically, neurons tuned to an experienced orientation in orientation-restricted animals show sharper orientation tuning than neurons in normal animals, whereas the opposite was true for neurons tuned to non-experienced orientations. This sharpened tuning appears to be due to elongated receptive fields. Our results demonstrate that restricted sensory experiences can sculpt the supranormal functions of single neurons tailored for a particular environment. The above findings, in addition to the minimal population response to orientations close to the experienced one, agree with the predictions of a sparse coding hypothesis in which information is represented efficiently by a small number of activated neurons. This suggests that early brain areas adopt an efficient strategy for coding information even when animals are raised in a severely limited visual environment where sensory inputs have an unnatural statistical structure. PMID:26567927

  4. Phylostratigraphic profiles reveal a deep evolutionary history of the vertebrate head sensory systems

    PubMed Central

    2013-01-01

    Background: The vertebrate head is a highly derived trait with a heavy concentration of sophisticated sensory organs that allow complex behaviour in this lineage. The head sensory structures arise during vertebrate development from cranial placodes and the neural crest. It is generally thought that derivatives of these ectodermal embryonic tissues played a central role in the evolutionary transition at the onset of vertebrates. Despite the obvious importance of head sensory organs for vertebrate biology, their evolutionary history is still uncertain. Results: To give a fresh perspective on the adaptive history of the vertebrate head sensory organs, we applied genomic phylostratigraphy to large-scale in situ expression data of the developing zebrafish Danio rerio. Contrary to traditional predictions, we found that dominant adaptive signals in the analyzed sensory structures largely precede the evolutionary advent of vertebrates. The leading adaptive signals at the bilaterian-chordate transition suggested that the visual system was the first sensory structure to evolve. The olfactory, vestibuloauditory, and lateral line sensory organs displayed a strong link with the urochordate-vertebrate ancestor. The only structures that qualified as genuine vertebrate innovations were the neural crest derivatives, trigeminal ganglion and adenohypophysis. We also found evidence that the cranial placodes evolved before the neural crest despite their proposed embryological relatedness. Conclusions: Taken together, our findings reveal pre-vertebrate roots and a stepwise adaptive history of the vertebrate sensory systems. This study also underscores that large genomic and expression datasets are rich sources of macroevolutionary information that can be recovered by phylostratigraphic mining. PMID:23587066

  5. Parallel pathways from whisker and visual sensory cortices to distinct frontal regions of mouse neocortex

    PubMed Central

    Sreenivasan, Varun; Kyriakatos, Alexandros; Mateo, Celine; Jaeger, Dieter; Petersen, Carl C.H.

    2016-01-01

    The spatial organization of mouse frontal cortex is poorly understood. Here, we used voltage-sensitive dye to image electrical activity in the dorsal cortex of awake head-restrained mice. Whisker deflection evoked the earliest sensory response in a localized region of primary somatosensory cortex, and visual stimulation evoked the earliest responses in a localized region of primary visual cortex. Over the next milliseconds, the initial sensory response spread within the respective primary sensory cortex and into the surrounding higher order sensory cortices. In addition, secondary hotspots in the frontal cortex were evoked by whisker and visual stimulation, with the frontal hotspot for whisker deflection being more anterior and lateral compared to the frontal hotspot evoked by visual stimulation. Investigating axonal projections, we found that the somatosensory whisker cortex and the visual cortex directly innervated frontal cortex, with visual cortex axons innervating a region medial and posterior to the innervation from somatosensory cortex, consistent with the location of sensory responses in frontal cortex. In turn, the axonal outputs of these two frontal cortical areas innervate distinct regions of striatum, superior colliculus, and brainstem. Sensory input, therefore, appears to map onto modality-specific regions of frontal cortex, perhaps participating in distinct sensorimotor transformations, and directing distinct motor outputs. PMID:27921067

  6. Enhanced alpha-oscillations in visual cortex during anticipation of self-generated visual stimulation.

    PubMed

    Stenner, Max-Philipp; Bauer, Markus; Haggard, Patrick; Heinze, Hans-Jochen; Dolan, Ray

    2014-11-01

    The perceived intensity of sensory stimuli is reduced when these stimuli are caused by the observer's actions. This phenomenon is traditionally explained by forward models of sensory action-outcome, which arise from motor processing. Although these forward models critically predict anticipatory modulation of sensory neural processing, neurophysiological evidence for anticipatory modulation is sparse and has not been linked to perceptual data showing sensory attenuation. By combining a psychophysical task involving contrast discrimination with source-level time-frequency analysis of MEG data, we demonstrate that the amplitude of alpha-oscillations in visual cortex is enhanced before the onset of a visual stimulus when the identity and onset of the stimulus are controlled by participants' motor actions. Critically, this prestimulus enhancement of alpha-amplitude is paralleled by psychophysical judgments of a reduced contrast for this stimulus. We suggest that alpha-oscillations in visual cortex preceding self-generated visual stimulation are a likely neurophysiological signature of motor-induced sensory anticipation and mediate sensory attenuation. We discuss our results in relation to proposals that attribute generic inhibitory functions to alpha-oscillations in prioritizing and gating sensory information via top-down control.

  7. Integrating brain, behavior, and phylogeny to understand the evolution of sensory systems in birds

    PubMed Central

    Wylie, Douglas R.; Gutiérrez-Ibáñez, Cristian; Iwaniuk, Andrew N.

    2015-01-01

    The comparative anatomy of sensory systems has played a major role in developing theories and principles central to evolutionary neuroscience. This includes the central tenet of many comparative studies, the principle of proper mass, which states that the size of a neural structure reflects its processing capacity. The size of structures within the sensory system is not, however, the only salient variable in sensory evolution. Further, the evolution of the brain and behavior are intimately tied to phylogenetic history, requiring studies to integrate neuroanatomy with behavior and phylogeny to gain a more holistic view of brain evolution. Birds have proven to be a useful group for these studies because of widespread interest in their phylogenetic relationships and a wealth of information on the functional organization of most of their sensory pathways. In this review, we examine the principle of proper mass in relation to differences in the sensory capabilities among birds. We discuss how neuroanatomy, behavior, and phylogeny can be integrated to understand the evolution of sensory systems in birds, providing evidence from visual, auditory, and somatosensory systems. We also consider the concept of a “trade-off,” whereby one sensory system (or subpathway within a sensory system) may be expanded in size at the expense of others, which are reduced in size. PMID:26321905

  8. Perception and the strongest sensory memory trace of multi-stable displays both form shortly after the stimulus onset.

    PubMed

    Pastukhov, Alexander

    2016-02-01

    We investigated the relation between perception and sensory memory of multi-stable structure-from-motion displays. The latter is an implicit visual memory that reflects a recent history of perceptual dominance and influences only the initial perception of multi-stable displays. First, we established the earliest time point when the direction of an illusory rotation can be reversed after the display onset (29-114 ms). Because our display manipulation did not bias perception towards a specific direction of illusory rotation but only signaled the change in motion, this means that the perceptual dominance was established no later than 29-114 ms after the stimulus onset. Second, we used orientation-selectivity of sensory memory to establish which display orientation produced the strongest memory trace and when this orientation was presented during the preceding prime interval (80-140 ms). Surprisingly, both estimates point towards the time interval immediately after the display onset, indicating that both perception and sensory memory form at approximately the same time. This suggests a tighter integration between perception and sensory memory than previously thought, warrants a reconsideration of its role in visual perception, and indicates that sensory memory could be a unique behavioral correlate of the earlier perceptual inference that can be studied post hoc.

  9. Collective motion in animal groups from a neurobiological perspective: the adaptive benefits of dynamic sensory loads and selective attention.

    PubMed

    Lemasson, B H; Anderson, J J; Goodwin, R A

    2009-12-21

    We explore mechanisms associated with collective animal motion by drawing on the neurobiological bases of sensory information processing and decision-making. The model uses simplified retinal processes to translate neighbor movement patterns into information through spatial signal integration and threshold responses. The structure provides a mechanism by which individuals can vary their sets of influential neighbors, a measure of an individual's sensory load. Sensory loads are correlated with group order and density, and we discuss their adaptive values in an ecological context. The model also provides a mechanism by which group members can identify, and rapidly respond to, novel visual stimuli.
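
    The following is a toy sketch of the core idea described above: each agent integrates a distance-weighted "movement signal" from its neighbours and responds only when the integrated signal crosses a threshold, so the set of influential neighbours (the sensory load) varies from moment to moment. All parameters and the update rule are illustrative assumptions, not the published model.

      # Toy sketch of threshold-based neighbour influence in a moving group.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 50
      pos = rng.uniform(0, 20, size=(n, 2))       # positions in a 20 x 20 arena
      heading = rng.uniform(0, 2 * np.pi, n)      # heading angles (rad)
      threshold, decay = 1.0, 0.3                 # response threshold, signal decay with distance

      def step(pos, heading):
          new_heading = heading.copy()
          for i in range(n):
              d = np.linalg.norm(pos - pos[i], axis=1)
              d[i] = np.inf
              weights = np.exp(-decay * d)                               # retinal signal falls off with distance
              signal = weights * np.abs(np.sin(heading - heading[i]))    # neighbours turning away drive a response
              influential = signal > 0.05                                # this agent's current sensory load
              if signal[influential].sum() > threshold:
                  # Align toward the mean heading of the influential neighbours.
                  target = np.arctan2(np.sin(heading[influential]).mean(),
                                      np.cos(heading[influential]).mean())
                  new_heading[i] += 0.2 * np.angle(np.exp(1j * (target - heading[i])))
          new_pos = pos + 0.1 * np.c_[np.cos(new_heading), np.sin(new_heading)]
          return new_pos, new_heading

      for _ in range(100):
          pos, heading = step(pos, heading)
      order = np.abs(np.mean(np.exp(1j * heading)))   # 1 = fully aligned group
      print(f"heading order parameter after 100 steps: {order:.2f}")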

  10. The synaptic pharmacology underlying sensory processing in the superior colliculus.

    PubMed

    Binns, K E

    1999-10-01

    The superior colliculus (SC) is one of the most ancient regions of the vertebrate central sensory system. In this hub afferents from several sensory pathways converge, and an extensive range of neural circuits enable primary sensory processing, multi-sensory integration and the generation of motor commands for orientation behaviours. The SC has a laminar structure and is usually considered in two parts; the superficial visual layers and the deep multi-modal/motor layers. Neurones in the superficial layers integrate visual information from the retina, cortex and other sources, while the deep layers draw together data from many cortical and sub-cortical sensory areas, including the superficial layers, to generate motor commands. Functional studies in anaesthetized subjects and in slice preparations have used pharmacological tools to probe some of the SC's interacting circuits. The studies reviewed here reveal important roles for ionotropic glutamate receptors in the mediation of sensory inputs to the SC and in transmission between the superficial and deep layers. N-methyl-D-aspartate receptors appear to have special responsibility for the temporal matching of retinal and cortical activity in the superficial layers and for the integration of multiple sensory data-streams in the deep layers. Sensory responses are shaped by intrinsic inhibitory mechanisms mediated by GABA(A) and GABA(B) receptors and influenced by nicotinic acetylcholine receptors. These sensory and motor-command activities of SC neurones are modulated by levels of arousal through extrinsic connections containing GABA, serotonin and other transmitters. It is possible to naturally stimulate many of the SC's sensory and non-sensory inputs either independently or simultaneously and this brain area is an ideal location in which to study: (a) interactions between inputs from the same sensory system; (b) the integration of inputs from several sensory systems; and (c) the influence of non-sensory systems on sensory processing.

  11. Cryptically Patterned Moths Perceive Bark Structure When Choosing Body Orientations That Match Wing Color Pattern to the Bark Pattern

    PubMed Central

    Kang, Chang-ku; Moon, Jong-yeol; Lee, Sang-im; Jablonski, Piotr G.

    2013-01-01

    Many moths have wing patterns that resemble bark of trees on which they rest. The wing patterns help moths to become camouflaged and to avoid predation because the moths are able to assume specific body orientations that produce a very good match between the pattern on the bark and the pattern on the wings. Furthermore, after landing on a bark moths are able to perceive stimuli that correlate with their crypticity and are able to re-position their bodies to new more cryptic locations and body orientations. However, the proximate mechanisms, i.e. how a moth finds an appropriate resting position and orientation, are poorly studied. Here, we used a geometrid moth Jankowskia fuscaria to examine i) whether a choice of resting orientation by moths depends on the properties of natural background, and ii) what sensory cues moths use. We studied moths’ behavior on natural (a tree log) and artificial backgrounds, each of which was designed to mimic one of the hypothetical cues that moths may perceive on a tree trunk (visual pattern, directional furrow structure, and curvature). We found that moths mainly used structural cues from the background when choosing their resting position and orientation. Our findings highlight the possibility that moths use information from one type of sensory modality (structure of furrows is probably detected through tactile channel) to achieve crypticity in another sensory modality (visual). This study extends our knowledge of how behavior, sensory systems and morphology of animals interact to produce crypsis. PMID:24205118

  12. Cryptically patterned moths perceive bark structure when choosing body orientations that match wing color pattern to the bark pattern.

    PubMed

    Kang, Chang-Ku; Moon, Jong-Yeol; Lee, Sang-Im; Jablonski, Piotr G

    2013-01-01

    Many moths have wing patterns that resemble bark of trees on which they rest. The wing patterns help moths to become camouflaged and to avoid predation because the moths are able to assume specific body orientations that produce a very good match between the pattern on the bark and the pattern on the wings. Furthermore, after landing on a bark moths are able to perceive stimuli that correlate with their crypticity and are able to re-position their bodies to new more cryptic locations and body orientations. However, the proximate mechanisms, i.e. how a moth finds an appropriate resting position and orientation, are poorly studied. Here, we used a geometrid moth Jankowskia fuscaria to examine i) whether a choice of resting orientation by moths depends on the properties of natural background, and ii) what sensory cues moths use. We studied moths' behavior on natural (a tree log) and artificial backgrounds, each of which was designed to mimic one of the hypothetical cues that moths may perceive on a tree trunk (visual pattern, directional furrow structure, and curvature). We found that moths mainly used structural cues from the background when choosing their resting position and orientation. Our findings highlight the possibility that moths use information from one type of sensory modality (structure of furrows is probably detected through tactile channel) to achieve crypticity in another sensory modality (visual). This study extends our knowledge of how behavior, sensory systems and morphology of animals interact to produce crypsis.

  13. Task-dependent modulation of the visual sensory thalamus assists visual-speech recognition.

    PubMed

    Díaz, Begoña; Blank, Helen; von Kriegstein, Katharina

    2018-05-14

    The cerebral cortex modulates early sensory processing via feed-back connections to sensory pathway nuclei. The functions of this top-down modulation for human behavior are poorly understood. Here, we show that top-down modulation of the visual sensory thalamus (the lateral geniculate body, LGN) is involved in visual-speech recognition. In two independent functional magnetic resonance imaging (fMRI) studies, LGN response increased when participants processed fast-varying features of articulatory movements required for visual-speech recognition, as compared to temporally more stable features required for face identification with the same stimulus material. The LGN response during the visual-speech task correlated positively with the visual-speech recognition scores across participants. In addition, the task-dependent modulation was present for speech movements and did not occur for control conditions involving non-speech biological movements. In face-to-face communication, visual speech recognition is used to enhance or even enable understanding what is said. Speech recognition is commonly explained in frameworks focusing on cerebral cortex areas. Our findings suggest that task-dependent modulation at subcortical sensory stages has an important role for communication: Together with similar findings in the auditory modality the findings imply that task-dependent modulation of the sensory thalami is a general mechanism to optimize speech recognition. Copyright © 2018. Published by Elsevier Inc.

  14. Other ways of seeing: From behavior to neural mechanisms in the online “visual” control of action with sensory substitution

    PubMed Central

    Proulx, Michael J.; Gwinnutt, James; Dell’Erba, Sara; Levy-Tzedek, Shelly; de Sousa, Alexandra A.; Brown, David J.

    2015-01-01

    Vision is the dominant sense for perception-for-action in humans and other higher primates. Advances in sight restoration now utilize the other intact senses to provide information that is normally sensed visually through sensory substitution to replace missing visual information. Sensory substitution devices translate visual information from a sensor, such as a camera or ultrasound device, into a format that the auditory or tactile systems can detect and process, so the visually impaired can see through hearing or touch. Online control of action is essential for many daily tasks such as pointing, grasping and navigating, and adapting to a sensory substitution device successfully requires extensive learning. Here we review the research on sensory substitution for vision restoration in the context of providing the means of online control for action in the blind or blindfolded. It appears that the use of sensory substitution devices utilizes the neural visual system; this suggests the hypothesis that sensory substitution draws on the same underlying mechanisms as unimpaired visual control of action. Here we review the current state of the art for sensory substitution approaches to object recognition, localization, and navigation, and the potential these approaches have for revealing a metamodal behavioral and neural basis for the online control of action. PMID:26599473

  15. Visual perception of ADHD children with sensory processing disorder.

    PubMed

    Jung, Hyerim; Woo, Young Jae; Kang, Je Wook; Choi, Yeon Woo; Kim, Kyeong Mi

    2014-04-01

    The aim of the present study was to investigate the visual perception difference between ADHD children with and without sensory processing disorder, and the relationship between sensory processing and visual perception of the children with ADHD. Participants were 47 outpatients, aged 6-8 years, diagnosed with ADHD. After excluding those who met exclusion criteria, 38 subjects were clustered into two groups, ADHD children with and without sensory processing disorder (SPD), using SSP reported by their parents, then subjects completed K-DTVP-2. Spearman correlation analysis was run to determine the relationship between sensory processing and visual perception, and Mann-Whitney-U test was conducted to compare the K-DTVP-2 score of two groups respectively. The ADHD children with SPD performed inferiorly to ADHD children without SPD in the on 3 quotients of K-DTVP-2. The GVP of K-DTVP-2 score was related to Movement Sensitivity section (r=0.368(*)) and Low Energy/Weak section of SSP (r=0.369*). The result of the present study suggests that among children with ADHD, the visual perception is lower in those children with co-morbid SPD. Also, visual perception may be related to sensory processing, especially in the reactions of vestibular and proprioceptive senses. Regarding academic performance, it is necessary to consider how sensory processing issues affect visual perception in children with ADHD.

  16. A model of attention-guided visual perception and recognition.

    PubMed

    Rybak, I A; Gusakova, V I; Golovan, A V; Podladchikova, L N; Shevtsova, N A

    1998-08-01

    A model of visual perception and recognition is described. The model contains: (i) a low-level subsystem which performs both a fovea-like transformation and detection of primary features (edges), and (ii) a high-level subsystem which includes separated 'what' (sensory memory) and 'where' (motor memory) structures. Image recognition occurs during the execution of a 'behavioral recognition program' formed during the primary viewing of the image. The recognition program contains both programmed attention window movements (stored in the motor memory) and predicted image fragments (stored in the sensory memory) for each consecutive fixation. The model shows the ability to recognize complex images (e.g. faces) invariantly with respect to shift, rotation and scale.
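
    To make the 'what'/'where' idea concrete, here is a toy sketch (not the authors' model) of a stored recognition program: each entry pairs an attention-window shift (motor memory) with a predicted feature fragment (sensory memory), and recognition replays the shifts while checking the predictions. Feature extraction is reduced to a trivial local-gradient descriptor; all names and tolerances are assumptions.

      # Toy sketch of a scanpath-style recognition program with 'what' and 'where' memories.
      import numpy as np

      def local_features(image, xy, win=5):
          """Crude stand-in for fovea-like edge features around fixation point xy = (x, y)."""
          x, y = xy
          patch = image[max(0, y - win):y + win, max(0, x - win):x + win]
          gy, gx = np.gradient(patch.astype(float))
          return np.array([np.abs(gx).mean(), np.abs(gy).mean()])

      def learn_program(image, fixations):
          """Primary viewing: store (gaze shift, predicted fragment) for each fixation."""
          program = []
          for prev, cur in zip(fixations[:-1], fixations[1:]):
              shift = (cur[0] - prev[0], cur[1] - prev[1])       # 'where' (motor memory)
              fragment = local_features(image, cur)              # 'what' (sensory memory)
              program.append((shift, fragment))
          return fixations[0], program

      def recognize(image, start, program, tol=0.1):
          """Replay the program; recognition succeeds if every predicted fragment matches."""
          xy = list(start)
          for shift, fragment in program:
              xy[0] += shift[0]; xy[1] += shift[1]
              if np.linalg.norm(local_features(image, tuple(xy)) - fragment) > tol:
                  return False
          return True

      img = np.outer(np.linspace(0, 1, 64), np.linspace(0, 1, 64))    # smooth learned image
      other = np.random.default_rng(2).random((64, 64))               # different, noisy image
      start, prog = learn_program(img, [(10, 10), (30, 20), (50, 40)])
      print("same image recognized:", recognize(img, start, prog))
      print("other image recognized:", recognize(other, start, prog))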

  17. Sensory Symptoms and Processing of Nonverbal Auditory and Visual Stimuli in Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Stewart, Claire R.; Sanchez, Sandra S.; Grenesko, Emily L.; Brown, Christine M.; Chen, Colleen P.; Keehn, Brandon; Velasquez, Francisco; Lincoln, Alan J.; Müller, Ralph-Axel

    2016-01-01

    Atypical sensory responses are common in autism spectrum disorder (ASD). While evidence suggests impaired auditory-visual integration for verbal information, findings for nonverbal stimuli are inconsistent. We tested for sensory symptoms in children with ASD (using the Adolescent/Adult Sensory Profile) and examined unisensory and bisensory…

  18. Screening for hearing, visual and dual sensory impairment in older adults using behavioural cues: a validation study.

    PubMed

    Roets-Merken, Lieve M; Zuidema, Sytse U; Vernooij-Dassen, Myrra J F J; Kempen, Gertrudis I J M

    2014-11-01

    This study investigated the psychometric properties of the Severe Dual Sensory Loss screening tool, a tool designed to help nurses and care assistants to identify hearing, visual and dual sensory impairment in older adults. Construct validity of the Severe Dual Sensory Loss screening tool was evaluated using Cronbach's alpha and factor analysis. Interrater reliability was calculated using Kappa statistics. To evaluate the predictive validity, sensitivity and specificity were calculated by comparison with the criterion standard assessment for hearing and vision. The criterion used for hearing impairment was a hearing loss of ≥40 decibel measured by pure-tone audiometry, and the criterion for visual impairment was a visual acuity of ≤0.3 diopter or a visual field of ≤0.3°. Feasibility was evaluated by the time needed to fill in the screening tool and the clarity of the instruction and items. Prevalence of dual sensory impairment was calculated. A total of 56 older adults receiving aged care and 12 of their nurses and care assistants participated in the study. Cronbach's alpha was 0.81 for the hearing subscale and 0.84 for the visual subscale. Factor analysis showed two constructs for hearing and two for vision. Kappa was 0.71 for the hearing subscale and 0.74 for the visual subscale. The predictive validity showed a sensitivity of 0.71 and a specificity of 0.72 for the hearing subscale, and a sensitivity of 0.69 and a specificity of 0.78 for the visual subscale. The optimum cut-off point for each subscale was a score of 1. The nurses and care assistants reported that the Severe Dual Sensory Loss screening tool was easy to use. The prevalence of hearing and vision impairment was 55% and 29%, respectively, and that of dual sensory impairment was 20%. The Severe Dual Sensory Loss screening tool was compared with the criterion standards for hearing and visual impairment and was found to be a valid and reliable tool, enabling nurses and care assistants to identify hearing, visual and dual sensory impairment among older adults. Copyright © 2014 Elsevier Ltd. All rights reserved.
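
    As a worked illustration of the psychometrics reported above, the following sketch computes Cronbach's alpha, Cohen's kappa, and sensitivity/specificity against a criterion standard. The data are simulated placeholders, not the study's ratings, and the helper functions are generic textbook formulas.

      # Minimal sketch of screening-tool psychometrics on simulated data.
      import numpy as np

      def cronbach_alpha(items):
          """items: subjects x items matrix of scores."""
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_var / total_var)

      def cohen_kappa(a, b):
          a, b = np.asarray(a), np.asarray(b)
          po = np.mean(a == b)
          pe = sum(np.mean(a == c) * np.mean(b == c) for c in np.unique(np.r_[a, b]))
          return (po - pe) / (1 - pe)

      def sens_spec(screen_pos, criterion_pos):
          screen_pos = np.asarray(screen_pos, bool)
          criterion_pos = np.asarray(criterion_pos, bool)
          sens = (screen_pos & criterion_pos).sum() / criterion_pos.sum()
          spec = (~screen_pos & ~criterion_pos).sum() / (~criterion_pos).sum()
          return sens, spec

      rng = np.random.default_rng(3)
      items = rng.integers(0, 3, size=(56, 6)) + rng.integers(0, 2, size=(56, 1))  # correlated item scores
      rater_a = rng.integers(0, 2, 56)
      rater_b = np.where(rng.random(56) < 0.85, rater_a, 1 - rater_a)   # mostly agreeing second rater
      criterion = rng.random(56) < 0.5
      screen = np.where(rng.random(56) < 0.75, criterion, ~criterion)   # imperfect screening result

      print("Cronbach's alpha:", round(cronbach_alpha(items), 2))
      print("Cohen's kappa:   ", round(cohen_kappa(rater_a, rater_b), 2))
      print("sens, spec:      ", [round(v, 2) for v in sens_spec(screen, criterion)])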

  19. Reading with sounds: sensory substitution selectively activates the visual word form area in the blind.

    PubMed

    Striem-Amit, Ella; Cohen, Laurent; Dehaene, Stanislas; Amedi, Amir

    2012-11-08

    Using a visual-to-auditory sensory-substitution algorithm, congenitally fully blind adults were taught to read and recognize complex images using "soundscapes"--sounds topographically representing images. fMRI was used to examine key questions regarding the visual word form area (VWFA): its selectivity for letters over other visual categories without visual experience, its feature tolerance for reading in a novel sensory modality, and its plasticity for scripts learned in adulthood. The blind activated the VWFA specifically and selectively during the processing of letter soundscapes relative to both textures and visually complex object categories and relative to mental imagery and semantic-content controls. Further, VWFA recruitment for reading soundscapes emerged after 2 hr of training in a blind adult on a novel script. Therefore, the VWFA shows category selectivity regardless of input sensory modality, visual experience, and long-term familiarity or expertise with the script. The VWFA may perform a flexible task-specific rather than sensory-specific computation, possibly linking letter shapes to phonology. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. The Neural Correlates of Hierarchical Predictions for Perceptual Decisions.

    PubMed

    Weilnhammer, Veith A; Stuke, Heiner; Sterzer, Philipp; Schmack, Katharina

    2018-05-23

    Sensory information is inherently noisy, sparse, and ambiguous. In contrast, visual experience is usually clear, detailed, and stable. Bayesian theories of perception resolve this discrepancy by assuming that prior knowledge about the causes underlying sensory stimulation actively shapes perceptual decisions. The CNS is believed to entertain a generative model aligned to dynamic changes in the hierarchical states of our volatile sensory environment. Here, we used model-based fMRI to study the neural correlates of the dynamic updating of hierarchically structured predictions in male and female human observers. We devised a crossmodal associative learning task with covertly interspersed ambiguous trials in which participants engaged in hierarchical learning based on changing contingencies between auditory cues and visual targets. By inverting a Bayesian model of perceptual inference, we estimated individual hierarchical predictions, which significantly biased perceptual decisions under ambiguity. Although "high-level" predictions about the cue-target contingency correlated with activity in supramodal regions such as orbitofrontal cortex and hippocampus, dynamic "low-level" predictions about the conditional target probabilities were associated with activity in retinotopic visual cortex. Our results suggest that our CNS updates distinct representations of hierarchical predictions that continuously affect perceptual decisions in a dynamically changing environment. SIGNIFICANCE STATEMENT Bayesian theories posit that our brain entertains a generative model to provide hierarchical predictions regarding the causes of sensory information. Here, we use behavioral modeling and fMRI to study the neural underpinnings of such hierarchical predictions. We show that "high-level" predictions about the strength of dynamic cue-target contingencies during crossmodal associative learning correlate with activity in orbitofrontal cortex and the hippocampus, whereas "low-level" conditional target probabilities were reflected in retinotopic visual cortex. Our findings empirically corroborate theorizations on the role of hierarchical predictions in visual perception and contribute substantially to a longstanding debate on the link between sensory predictions and orbitofrontal or hippocampal activity. Our work fundamentally advances the mechanistic understanding of perceptual inference in the human brain. Copyright © 2018 the authors 0270-6474/18/385008-14$15.00/0.
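
    The following is a toy illustration, not the authors' Bayesian model: a "high-level" estimate of the cue-target contingency is updated trial by trial with a simple delta rule, and the resulting "low-level" conditional target probability biases the decision whenever the target is ambiguous. Learning rate, trial structure, and probabilities are assumptions chosen only to show the mechanism.

      # Toy two-level prediction model biasing decisions on ambiguous trials.
      import numpy as np

      rng = np.random.default_rng(4)
      n_trials, lr = 400, 0.1
      contingency = 0.2                      # running estimate of P(target A | cue A)
      true_contingency = 0.8                 # environment: cue A predicts target A 80% of the time

      for t in range(n_trials):
          cue_is_A = rng.random() < 0.5
          target_is_A = rng.random() < (true_contingency if cue_is_A else 1 - true_contingency)
          ambiguous = rng.random() < 0.2     # covertly interspersed ambiguous trials

          # Low-level prediction: conditional probability of target A given the cue.
          p_target_A = contingency if cue_is_A else 1 - contingency

          if ambiguous:
              choice_A = rng.random() < p_target_A          # prediction resolves the ambiguity
          else:
              choice_A = target_is_A                        # unambiguous stimulus dominates
              # High-level update only when the outcome is observable.
              outcome = float(target_is_A) if cue_is_A else float(not target_is_A)
              contingency += lr * (outcome - contingency)

      print(f"final contingency estimate: {contingency:.2f} (true value {true_contingency})")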

  1. Proprioceptive feedback determines visuomotor gain in Drosophila

    PubMed Central

    Bartussek, Jan; Lehmann, Fritz-Olaf

    2016-01-01

    Multisensory integration is a prerequisite for effective locomotor control in most animals. In particular, the impressive aerial performance of insects relies on rapid and precise integration of multiple sensory modalities that provide feedback on different time scales. In flies, continuous visual signalling from the compound eyes is fused with phasic proprioceptive feedback to ensure precise neural activation of wing steering muscles (WSM) within narrow temporal phase bands of the stroke cycle. This phase-locked activation relies on mechanoreceptors distributed over wings and gyroscopic halteres. Here we investigate visual steering performance of tethered flying fruit flies with reduced haltere and wing feedback signalling. Using a flight simulator, we evaluated visual object fixation behaviour, optomotor altitude control and saccadic escape reflexes. The behavioural assays show an antagonistic effect of wing and haltere signalling on visuomotor gain during flight. Compared with controls, suppression of haltere feedback attenuates, while suppression of wing feedback enhances, the animal’s wing steering range. Our results suggest that the generation of motor commands owing to visual perception is dynamically controlled by proprioception. We outline a potential physiological mechanism based on the biomechanical properties of WSM and sensory integration processes at the level of motoneurons. Collectively, the findings contribute to our general understanding of how moving animals integrate sensory information with dynamically changing temporal structure. PMID:26909184

  2. The power of projectomes: genetic mosaic labeling in the larval zebrafish brain reveals organizing principles of sensory circuits.

    PubMed

    Robles, Estuardo

    2017-09-01

    In no vertebrate species do we possess an accurate, comprehensive tally of neuron types in the brain. This is in no small part due to the vast diversity of neuronal types that comprise complex vertebrate nervous systems. A fundamental goal of neuroscience is to construct comprehensive catalogs of cell types defined by structure, connectivity, and physiological response properties. This type of information will be invaluable for generating models of how assemblies of neurons encode and distribute sensory information and correspondingly alter behavior. This review summarizes recent efforts in the larval zebrafish to construct sensory projectomes, comprehensive analyses of axonal morphologies in sensory axon tracts. Focusing on the olfactory and optic tract, these studies revealed principles of sensory information processing in the olfactory and visual systems that could not have been directly quantified by other methods. In essence, these studies reconstructed the optic and olfactory tract in a virtual manner, providing insights into patterns of neuronal growth that underlie the formation of sensory axon tracts. Quantitative analysis of neuronal diversity revealed organizing principles that determine information flow through sensory systems in the zebrafish that are likely to be conserved across vertebrate species. The generation of comprehensive cell type classifications based on structural, physiological, and molecular features will lead to testable hypotheses on the functional role of individual sensory neuron subtypes in controlling specific sensory-evoked behaviors.

  3. Seeing the Song: Left Auditory Structures May Track Auditory-Visual Dynamic Alignment

    PubMed Central

    Mossbridge, Julia A.; Grabowecky, Marcia; Suzuki, Satoru

    2013-01-01

    Auditory and visual signals generated by a single source tend to be temporally correlated, such as the synchronous sounds of footsteps and the limb movements of a walker. Continuous tracking and comparison of the dynamics of auditory-visual streams is thus useful for the perceptual binding of information arising from a common source. Although language-related mechanisms have been implicated in the tracking of speech-related auditory-visual signals (e.g., speech sounds and lip movements), it is not well known what sensory mechanisms generally track ongoing auditory-visual synchrony for non-speech signals in a complex auditory-visual environment. To begin to address this question, we used music and visual displays that varied in the dynamics of multiple features (e.g., auditory loudness and pitch; visual luminance, color, size, motion, and organization) across multiple time scales. Auditory activity (monitored using auditory steady-state responses, ASSR) was selectively reduced in the left hemisphere when the music and dynamic visual displays were temporally misaligned. Importantly, ASSR was not affected when attentional engagement with the music was reduced, or when visual displays presented dynamics clearly dissimilar to the music. These results appear to suggest that left-lateralized auditory mechanisms are sensitive to auditory-visual temporal alignment, but perhaps only when the dynamics of auditory and visual streams are similar. These mechanisms may contribute to correct auditory-visual binding in a busy sensory environment. PMID:24194873

  4. Effects of Binaural Sensory Aids on the Development of Visual Perceptual Abilities in Visually Handicapped Infants. Final Report, April 15, 1982-November 15, 1982.

    ERIC Educational Resources Information Center

    Hart, Verna; Ferrell, Kay

    Twenty-four congenitally visually handicapped infants, aged 6-24 months, participated in a study to determine (1) those stimuli best able to elicit visual attention, (2) the stability of visual acuity over time, and (3) the effects of binaural sensory aids on both visual attention and visual acuity. Ss were dichotomized into visually handicapped…

  5. Adaptation to sensory input tunes visual cortex to criticality

    NASA Astrophysics Data System (ADS)

    Shew, Woodrow L.; Clawson, Wesley P.; Pobst, Jeff; Karimipanah, Yahya; Wright, Nathaniel C.; Wessel, Ralf

    2015-08-01

    A long-standing hypothesis at the interface of physics and neuroscience is that neural networks self-organize to the critical point of a phase transition, thereby optimizing aspects of sensory information processing. This idea is partially supported by strong evidence for critical dynamics observed in the cerebral cortex, but the impact of sensory input on these dynamics is largely unknown. Thus, the foundations of this hypothesis--the self-organization process and how it manifests during strong sensory input--remain unstudied experimentally. Here we show in visual cortex and in a computational model that strong sensory input initially elicits cortical network dynamics that are not critical, but adaptive changes in the network rapidly tune the system to criticality. This conclusion is based on observations of multifaceted scaling laws predicted to occur at criticality. Our findings establish sensory adaptation as a self-organizing mechanism that maintains criticality in visual cortex during sensory information processing.
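
    For readers unfamiliar with the scaling laws mentioned above, the following sketch shows two common criticality diagnostics on simulated branching-process activity: the mean avalanche size and a crude check of the heavy-tailed size distribution. This is a generic illustration, not the authors' dataset or analysis code, and the branching parameter and thresholds are assumptions.

      # Minimal sketch of avalanche-based criticality diagnostics on a simulated branching process.
      import numpy as np

      rng = np.random.default_rng(5)
      sigma = 1.0          # branching parameter; 1.0 corresponds to the critical point

      def avalanche_size(sigma, max_steps=2000):
          """Run one avalanche of a branching process and return its total size."""
          active, size = 1, 1
          for _ in range(max_steps):
              active = rng.poisson(sigma * active)
              size += active
              if active == 0:
                  break
          return size

      sizes = np.array([avalanche_size(sigma) for _ in range(10000)])

      # Near criticality the size distribution is heavy-tailed (roughly P(S) ~ S**-1.5).
      print(f"mean avalanche size: {sizes.mean():.1f}")
      print(f"fraction of avalanches with size >= 100: {(sizes >= 100).mean():.3f}")

      # Crude tail-exponent check via a log-log fit to the empirical CCDF.
      s = np.sort(sizes)
      ccdf = 1 - np.arange(1, s.size + 1) / s.size
      mask = (s > 5) & (ccdf > 0)
      slope = np.polyfit(np.log(s[mask]), np.log(ccdf[mask]), 1)[0]
      print(f"CCDF log-log slope ~ {slope:.2f} (expect ~ -0.5 for a size exponent of -1.5)")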

  6. Hardy's stargazers and the astronomy of other minds.

    PubMed

    Henchman, A

    2008-01-01

    This essay argues that Thomas Hardy compares the act of observing another person to the scientific practice of observing the stars in order to reveal structural obstacles to accessing other minds. He draws on astronomy and optics to underscore the discrepancy between the full perception one has of one's own consciousness and the lack of such sensory evidence for the consciousness of others. His scenes of stargazing show such obstacles being temporarily overcome; the stargazer turns away from the thick sensory detail of earthly life and uses minimal visual information as a jumping-off point for the imagination. These visual journeys into space are analogous to those Hardy's readers experience as he wrests them out of their bodies into imaginary landscapes and unfamiliar minds.

  7. Mate choice in the eye and ear of the beholder? Female multimodal sensory configuration influences her preferences.

    PubMed

    Ronald, Kelly L; Fernández-Juricic, Esteban; Lucas, Jeffrey R

    2018-05-16

    A common assumption in sexual selection studies is that receivers decode signal information similarly. However, receivers may vary in how they rank signallers if signal perception varies with an individual's sensory configuration. Furthermore, receivers may vary in their weighting of different elements of multimodal signals based on their sensory configuration. This could lead to complex levels of selection on signalling traits. We tested whether multimodal sensory configuration could affect preferences for multimodal signals. We used brown-headed cowbird (Molothrus ater) females to examine how auditory sensitivity and auditory filters, which influence auditory spectral and temporal resolution, affect song preferences, and how visual spatial resolution and visual temporal resolution, which influence resolution of a moving visual signal, affect visual display preferences. Our results show that multimodal sensory configuration significantly affects preferences for male displays: females with better auditory temporal resolution preferred songs that were shorter, with lower Wiener entropy, and higher frequency; and females with better visual temporal resolution preferred males with less intense visual displays. Our findings provide new insights into mate-choice decisions and receiver signal processing. Furthermore, our results challenge a long-standing assumption in animal communication which can affect how we address honest signalling, assortative mating and sensory drive. © 2018 The Author(s).

  8. Aging and Visual Attention

    PubMed Central

    Madden, David J.

    2007-01-01

    Older adults are often slower and less accurate than are younger adults in performing visual-search tasks, suggesting an age-related decline in attentional functioning. Age-related decline in attention, however, is not entirely pervasive. Visual search that is based on the observer’s expectations (i.e., top-down attention) is relatively preserved as a function of adult age. Neuroimaging research suggests that age-related decline occurs in the structure and function of brain regions mediating the visual sensory input, whereas activation of regions in the frontal and parietal lobes is often greater for older adults than for younger adults. This increased activation may represent an age-related increase in the role of top-down attention during visual tasks. To obtain a more complete account of age-related decline and preservation of visual attention, current research is beginning to explore the relation of neuroimaging measures of brain structure and function to behavioral measures of visual attention. PMID:18080001

  9. 75 FR 54915 - Notice Pursuant to the National Cooperative Research and Production Act of 1993-Sensory System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-09

    ... DEPARTMENT OF JUSTICE Antitrust Division Notice Pursuant to the National Cooperative Research and Production Act of 1993--Sensory System for Critical Infrastructure Defect Recognition, Visualization and... Critical Infrastructure Defect Recognition, Visualization and Failure Prediction ('Sensory System'') has...

  10. Characterizing structural association alterations within brain networks in normal aging using Gaussian Bayesian networks.

    PubMed

    Guo, Xiaojuan; Wang, Yan; Chen, Kewei; Wu, Xia; Zhang, Jiacai; Li, Ke; Jin, Zhen; Yao, Li

    2014-01-01

    Recent multivariate neuroimaging studies have revealed aging-related alterations in brain structural networks. However, the sensory/motor networks, such as the auditory, visual and motor networks, have received much less attention in normal aging research. In this study, we used Gaussian Bayesian networks (BN), an approach investigating possible inter-regional directed relationships, to characterize aging effects on structural associations between core brain regions within each of these structural sensory/motor networks using volumetric MRI data. We then further examined the discriminability of BN models for the young (N = 109; mean age = 22.73 years, range 20-28) and old (N = 82; mean age = 74.37 years, range 60-90) groups. The results of the BN modeling demonstrated that structural associations exist between two homotopic brain regions from the left and right hemispheres in each of the three networks. In particular, compared with the young group, the old group had significant connection reductions in each of the three networks and fewer connections in the visual network. Moreover, it was found that the aging-related BN models could distinguish the young and old individuals with 90.05, 73.82, and 88.48% accuracy for the auditory, visual, and motor networks, respectively. Our findings suggest that BN models can be used to investigate the normal aging process with reliable statistical power. Moreover, these differences in structural inter-regional interactions may help elucidate the neuronal mechanism of anatomical changes in normal aging.
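
    To illustrate the general idea behind linear-Gaussian Bayesian-network modelling of regional volumes, here is a toy sketch: each region's volume is modelled as a linear-Gaussian function of its parent regions under a hand-specified DAG, parameters are fitted per group, and individuals are classified by comparing group log-likelihoods. The region names, DAG, simulated data, and classification step are all assumptions, not the authors' learned structures or procedure.

      # Toy sketch of linear-Gaussian BN fitting and group classification on simulated volumes.
      import numpy as np

      rng = np.random.default_rng(6)
      regions = ["L_V1", "R_V1", "L_V2", "R_V2"]
      parents = {"L_V1": [], "R_V1": ["L_V1"], "L_V2": ["L_V1"], "R_V2": ["R_V1", "L_V2"]}

      def simulate(n, coupling):
          data = {"L_V1": rng.normal(10, 1, n)}
          data["R_V1"] = coupling * data["L_V1"] + rng.normal(0, 1, n)
          data["L_V2"] = coupling * data["L_V1"] + rng.normal(0, 1, n)
          data["R_V2"] = 0.5 * coupling * (data["R_V1"] + data["L_V2"]) + rng.normal(0, 1, n)
          return np.column_stack([data[r] for r in regions])

      def fit(group):
          """Least-squares linear-Gaussian parameters for each region given its parents."""
          params = {}
          for i, r in enumerate(regions):
              idx = [regions.index(p) for p in parents[r]]
              X = np.column_stack([group[:, idx], np.ones(len(group))])
              beta, *_ = np.linalg.lstsq(X, group[:, i], rcond=None)
              resid = group[:, i] - X @ beta
              params[r] = (idx, beta, resid.std())
          return params

      def loglik(x, params):
          total = 0.0
          for i, r in enumerate(regions):
              idx, beta, sd = params[r]
              mu = np.r_[x[idx], 1.0] @ beta
              total += -0.5 * ((x[i] - mu) / sd) ** 2 - np.log(sd)
          return total

      young, old = simulate(109, coupling=0.9), simulate(82, coupling=0.4)   # weaker coupling in "old"
      p_young, p_old = fit(young), fit(old)
      test = simulate(20, coupling=0.4)                                      # held-out "old" volumes
      correct = sum(loglik(x, p_old) > loglik(x, p_young) for x in test)
      print(f"classified {correct}/20 held-out 'old' volumes correctly")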

  11. Assessment and Therapeutic Application of the Expressive Therapies Continuum: Implications for Brain Structures and Functions

    ERIC Educational Resources Information Center

    Lusebrink, Vija B.

    2010-01-01

    The Expressive Therapies Continuum (ETC) provides a theoretical model for art-based assessments and applications of media in art therapy. The three levels of the ETC (Kinesthetic/Sensory, Perceptual/Affective, and Cognitive/Symbolic) appear to reflect different functions and structures in the brain that process visual and affective information.…

  12. Shared sensory estimates for human motion perception and pursuit eye movements.

    PubMed

    Mukherjee, Trishna; Battifarano, Matthew; Simoncini, Claudio; Osborne, Leslie C

    2015-06-03

    Are sensory estimates formed centrally in the brain and then shared between perceptual and motor pathways or is centrally represented sensory activity decoded independently to drive awareness and action? Questions about the brain's information flow pose a challenge because systems-level estimates of environmental signals are only accessible indirectly as behavior. Assessing whether sensory estimates are shared between perceptual and motor circuits requires comparing perceptual reports with motor behavior arising from the same sensory activity. Extrastriate visual cortex both mediates the perception of visual motion and provides the visual inputs for behaviors such as smooth pursuit eye movements. Pursuit has been a valuable testing ground for theories of sensory information processing because the neural circuits and physiological response properties of motion-responsive cortical areas are well studied, sensory estimates of visual motion signals are formed quickly, and the initiation of pursuit is closely coupled to sensory estimates of target motion. Here, we analyzed variability in visually driven smooth pursuit and perceptual reports of target direction and speed in human subjects while we manipulated the signal-to-noise level of motion estimates. Comparable levels of variability throughout viewing time and across conditions provide evidence for shared noise sources in the perception and action pathways arising from a common sensory estimate. We found that conditions that create poor, low-gain pursuit create a discrepancy between the precision of perception and that of pursuit. Differences in pursuit gain arising from differences in optic flow strength in the stimulus reconcile much of the controversy on this topic. Copyright © 2015 the authors 0270-6474/15/358515-16$15.00/0.

  13. Shared Sensory Estimates for Human Motion Perception and Pursuit Eye Movements

    PubMed Central

    Mukherjee, Trishna; Battifarano, Matthew; Simoncini, Claudio

    2015-01-01

    Are sensory estimates formed centrally in the brain and then shared between perceptual and motor pathways or is centrally represented sensory activity decoded independently to drive awareness and action? Questions about the brain's information flow pose a challenge because systems-level estimates of environmental signals are only accessible indirectly as behavior. Assessing whether sensory estimates are shared between perceptual and motor circuits requires comparing perceptual reports with motor behavior arising from the same sensory activity. Extrastriate visual cortex both mediates the perception of visual motion and provides the visual inputs for behaviors such as smooth pursuit eye movements. Pursuit has been a valuable testing ground for theories of sensory information processing because the neural circuits and physiological response properties of motion-responsive cortical areas are well studied, sensory estimates of visual motion signals are formed quickly, and the initiation of pursuit is closely coupled to sensory estimates of target motion. Here, we analyzed variability in visually driven smooth pursuit and perceptual reports of target direction and speed in human subjects while we manipulated the signal-to-noise level of motion estimates. Comparable levels of variability throughout viewing time and across conditions provide evidence for shared noise sources in the perception and action pathways arising from a common sensory estimate. We found that conditions that create poor, low-gain pursuit create a discrepancy between the precision of perception and that of pursuit. Differences in pursuit gain arising from differences in optic flow strength in the stimulus reconcile much of the controversy on this topic. PMID:26041919

  14. Factor analysis of persistent postconcussive symptoms within a military sample with blast exposure.

    PubMed

    Franke, Laura M; Czarnota, Jenna N; Ketchum, Jessica M; Walker, William C

    2015-01-01

    To determine the factor structure of persistent postconcussive syndrome symptoms in a blast-exposed military sample and validate factors against objective and symptom measures. Veterans Affairs medical center and military bases. One hundred eighty-one service members and veterans with at least 1 significant exposure to blast during deployment within the 2 years prior to study enrollment. Confirmatory and exploratory factor analyses of the Rivermead Postconcussion Questionnaire. Rivermead Postconcussion Questionnaire, PTSD (posttraumatic stress disorder) Symptom Checklist-Civilian, Center for Epidemiological Studies Depression scale, Sensory Organization Test, Paced Auditory Serial Addition Test, California Verbal Learning Test, and Delis-Kaplan Executive Function System subtests. The 3-factor structure of persistent postconcussive syndrome was not confirmed. A 4-factor structure was extracted, and factors were interpreted as reflecting emotional, cognitive, visual, and vestibular functions. All factors were associated with scores on psychological symptom inventories; visual and vestibular factors were also associated with balance performance. There was no significant association between the cognitive factor and neuropsychological performance or between a history of mild traumatic brain injury and factor scores. Persistent postconcussive symptoms observed months after blast exposure seem to be related to 4 distinct forms of distress, but not to mild traumatic brain injury per se, with vestibular and visual factors possibly related to injury of sensory organs by blast.
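
    As a generic illustration of the analysis type used above, the sketch below fits an exploratory 4-factor model to simulated questionnaire items with scikit-learn's FactorAnalysis and reports each item's dominant factor. The items are simulated stand-ins, not the Rivermead data, and the loading structure is an assumption.

      # Minimal sketch of an exploratory factor analysis on simulated item scores.
      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(7)
      n_subjects, n_items, n_factors = 181, 16, 4

      # Simulate items generated by 4 latent factors (4 items per factor) plus noise.
      latent = rng.normal(size=(n_subjects, n_factors))
      loadings = np.zeros((n_factors, n_items))
      for f in range(n_factors):
          loadings[f, f * 4:(f + 1) * 4] = rng.uniform(0.6, 0.9, 4)
      items = latent @ loadings + 0.5 * rng.normal(size=(n_subjects, n_items))

      fa = FactorAnalysis(n_components=n_factors, rotation="varimax", random_state=0)
      fa.fit(items)

      # Report the dominant factor for each item (rows of components_ are factors).
      dominant = np.abs(fa.components_).argmax(axis=0)
      for item in range(n_items):
          print(f"item {item:2d} loads mainly on factor {dominant[item]}")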

  15. The structure of pairwise correlation in mouse primary visual cortex reveals functional organization in the absence of an orientation map.

    PubMed

    Denman, Daniel J; Contreras, Diego

    2014-10-01

    Neural responses to sensory stimuli are not independent. Pairwise correlation can reduce coding efficiency, occur independent of stimulus representation, or serve as an additional channel of information, depending on the timescale of correlation and the method of decoding. Any role for correlation depends on its magnitude and structure. In sensory areas with maps, like the orientation map in primary visual cortex (V1), correlation is strongly related to the underlying functional architecture, but it is unclear whether this correlation structure is an essential feature of the system or arises from the arrangement of cells in the map. We assessed the relationship between functional architecture and pairwise correlation by measuring both synchrony and correlated spike count variability in mouse V1, which lacks an orientation map. We observed significant pairwise synchrony, which was organized by distance and relative orientation preference between cells. We also observed nonzero correlated variability in both the anesthetized (0.16) and awake states (0.18). Our results indicate that the structure of pairwise correlation is maintained in the absence of an underlying anatomical organization and may be an organizing principle of the mammalian visual system preserved by nonrandom connectivity within local networks. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
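
    The two pairwise measures named above can be sketched as follows on simulated spike trains: millisecond-scale synchrony as excess coincidences relative to a rate-based expectation, and spike-count correlation across repeated trials. Rates, window sizes, and the shared-input construction are illustrative assumptions, not the authors' pipeline.

      # Minimal sketch of pairwise synchrony and spike-count correlation on simulated trains.
      import numpy as np

      rng = np.random.default_rng(8)
      n_trials, dur, dt = 100, 2.0, 0.001           # 2 s trials at 1 ms resolution
      n_bins = int(dur / dt)
      rate_a, rate_b, shared = 10.0, 12.0, 4.0      # Hz; `shared` drives common spikes

      def poisson_train(rate):
          return rng.random(n_bins) < rate * dt

      counts_a, counts_b, coincidences, expected = [], [], 0.0, 0.0
      window = 5                                    # coincidence window of +/- 5 ms
      for _ in range(n_trials):
          common = poisson_train(shared)
          a = poisson_train(rate_a) | common
          b = poisson_train(rate_b) | common
          counts_a.append(a.sum()); counts_b.append(b.sum())
          # Coincidences within the window, compared with the rate-based expectation.
          smeared_b = np.convolve(b, np.ones(2 * window + 1), mode="same") > 0
          coincidences += np.sum(a & smeared_b)
          expected += a.sum() * b.sum() * (2 * window + 1) / n_bins

      sync_index = coincidences / expected
      r_sc = np.corrcoef(counts_a, counts_b)[0, 1]
      print(f"synchrony index (1 = chance): {sync_index:.2f}")
      print(f"spike-count correlation r_sc: {r_sc:.2f}")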

  16. Contribution of amygdalar and lateral hypothalamic neurons to visual information processing of food and nonfood in monkey.

    PubMed

    Ono, T; Tamura, R; Nishijo, H; Nakamura, K; Tabuchi, E

    1989-02-01

    Visual information processing was investigated in the inferotemporal cortical (ITCx)-amygdalar (AM)-lateral hypothalamic (LHA) axis, which contributes to food-nonfood discrimination. Neuronal activity was recorded from monkey AM and LHA during discrimination of sensory stimuli including sight of food or nonfood. The task had four phases: control, visual, bar press, and ingestion. Of 710 AM neurons tested, 220 (31.0%) responded during the visual phase: 48 to only visual stimulation, 13 (1.9%) to visual plus oral sensory stimulation, 142 (20.0%) to multimodal stimulation and 17 (2.4%) to one affectively significant item. Of 669 LHA neurons tested, 106 (15.8%) responded in the visual phase. Of 80 visual-related neurons tested systematically, 33 (41.2%) responded selectively to the sight of any object predicting the availability of reward, and 47 (58.8%) responded nondifferentially to both food and nonfood. Many AM neuron responses were graded according to the degree of affective significance of sensory stimuli (sensory-affective association), but responses of LHA food-responsive neurons did not depend on the kind of reward indicated by the sensory stimuli (stimulus-reinforcement association). Some AM and LHA food responses were modulated by extinction or reversal. Dynamic information processing in the ITCx-AM-LHA axis was investigated by reversibly inactivating bilateral ITCx or AM by cooling. ITCx cooling suppressed discrimination by vision-responsive AM neurons (8/17). AM cooling suppressed LHA responses to food (9/22). We suggest deep AM-LHA involvement in food-nonfood discrimination based on AM sensory-affective association and LHA stimulus-reinforcement association.

  17. Differential sensory cortical involvement in auditory and visual sensorimotor temporal recalibration: Evidence from transcranial direct current stimulation (tDCS).

    PubMed

    Aytemür, Ali; Almeida, Nathalia; Lee, Kwang-Hyuk

    2017-02-01

    Adaptation to delayed sensory feedback following an action produces a subjective time compression between the action and the feedback (temporal recalibration effect, TRE). TRE is important for sensory delay compensation to maintain a relationship between causally related events. It is unclear whether TRE is a sensory modality-specific phenomenon. In 3 experiments employing a sensorimotor synchronization task, we investigated this question using cathodal transcranial direct-current stimulation (tDCS). We found that cathodal tDCS over the visual cortex, and to a lesser extent over the auditory cortex, produced decreased visual TRE. However, both auditory and visual cortex tDCS did not produce any measurable effects on auditory TRE. Our study revealed different nature of TRE in auditory and visual domains. Visual-motor TRE, which is more variable than auditory TRE, is a sensory modality-specific phenomenon, modulated by the auditory cortex. The robustness of auditory-motor TRE, unaffected by tDCS, suggests the dominance of the auditory system in temporal processing, by providing a frame of reference in the realignment of sensorimotor timing signals. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. The answer is blowing in the wind: free-flying honeybees can integrate visual and mechano-sensory inputs for making complex foraging decisions.

    PubMed

    Ravi, Sridhar; Garcia, Jair E; Wang, Chun; Dyer, Adrian G

    2016-11-01

    Bees navigate in complex environments using visual, olfactory and mechano-sensorial cues. In the lowest region of the atmosphere, the wind environment can be highly unsteady and bees employ fine motor-skills to enhance flight control. Recent work reveals sophisticated multi-modal processing of visual and olfactory channels by the bee brain to enhance foraging efficiency, but it currently remains unclear whether wind-induced mechano-sensory inputs are also integrated with visual information to facilitate decision making. Individual honeybees were trained in a linear flight arena with appetitive-aversive differential conditioning to use a context-setting cue of 3 m s -1 cross-wind direction to enable decisions about either a 'blue' or 'yellow' star stimulus being the correct alternative. Colour stimuli properties were mapped in bee-specific opponent-colour spaces to validate saliency, and to thus enable rapid reverse learning. Bees were able to integrate mechano-sensory and visual information to facilitate decisions that were significantly different to chance expectation after 35 learning trials. An independent group of bees were trained to find a single rewarding colour that was unrelated to the wind direction. In these trials, wind was not used as a context-setting cue and served only as a potential distracter in identifying the relevant rewarding visual stimuli. Comparison between respective groups shows that bees can learn to integrate visual and mechano-sensory information in a non-elemental fashion, revealing an unsuspected level of sensory processing in honeybees, and adding to the growing body of knowledge on the capacity of insect brains to use multi-modal sensory inputs in mediating foraging behaviour. © 2016. Published by The Company of Biologists Ltd.

  19. Visual sensory networks and effective information transfer in animal groups.

    PubMed

    Strandburg-Peshkin, Ariana; Twomey, Colin R; Bode, Nikolai W F; Kao, Albert B; Katz, Yael; Ioannou, Christos C; Rosenthal, Sara B; Torney, Colin J; Wu, Hai Shan; Levin, Simon A; Couzin, Iain D

    2013-09-09

    Social transmission of information is vital for many group-living animals, allowing coordination of motion and effective response to complex environments. Revealing the interaction networks underlying information flow within these groups is a central challenge. Previous work has modeled interactions between individuals based directly on their relative spatial positions: each individual is considered to interact with all neighbors within a fixed distance (metric range), a fixed number of nearest neighbors (topological range), a 'shell' of near neighbors (Voronoi range), or some combination (Figure 1A). However, conclusive evidence to support these assumptions is lacking. Here, we employ a novel approach that considers individual movement decisions to be based explicitly on the sensory information available to the organism. In other words, we consider that while spatial relations do inform interactions between individuals, they do so indirectly, through individuals' detection of sensory cues. We reconstruct computationally the visual field of each individual throughout experiments designed to investigate information propagation within fish schools (golden shiners, Notemigonus crysoleucas). Explicitly considering visual sensing allows us to more accurately predict the propagation of behavioral change in these groups during leadership events. Furthermore, we find that structural properties of visual interaction networks differ markedly from those of metric and topological counterparts, suggesting that previous assumptions may not appropriately reflect information flow in animal groups. Copyright © 2013 Elsevier Ltd. All rights reserved.
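
    The interaction rules this record contrasts (metric, topological, and Voronoi ranges) can be made concrete with a small sketch. The following Python fragment is purely illustrative and is not the authors' code: the positions, the 2-unit metric radius, and k = 3 are invented parameters.

      # Illustrative sketch of three neighbour-selection rules used in
      # collective-behaviour models; all parameter values are invented.
      import numpy as np
      from scipy.spatial import Delaunay

      positions = np.random.rand(20, 2) * 10.0   # 20 individuals in a 10 x 10 arena
      focal = 0                                   # index of the focal individual

      # Metric rule: all neighbours within a fixed distance.
      dists = np.linalg.norm(positions - positions[focal], axis=1)
      metric_neighbours = np.where((dists > 0) & (dists <= 2.0))[0]

      # Topological rule: the k nearest neighbours, regardless of distance.
      k = 3
      topological_neighbours = np.argsort(dists)[1:k + 1]   # skip the focal itself

      # Voronoi ('shell') rule: individuals whose Voronoi cells touch the focal
      # cell, i.e. those sharing a Delaunay edge with the focal individual.
      tri = Delaunay(positions)
      indptr, indices = tri.vertex_neighbor_vertices
      voronoi_neighbours = indices[indptr[focal]:indptr[focal + 1]]

      print(metric_neighbours, topological_neighbours, voronoi_neighbours)

    A visual-field-based network, as used in the study, would instead retain only those neighbours that remain at least partly unoccluded in the reconstructed field of view.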

  20. Multiple Sensory-Motor Pathways Lead to Coordinated Visual Attention

    PubMed Central

    Yu, Chen; Smith, Linda B.

    2016-01-01

    Joint attention has been extensively studied in the developmental literature because of overwhelming evidence that the ability to socially coordinate visual attention to an object is essential to healthy developmental outcomes, including language learning. The goal of the present study is to understand the complex system of sensory-motor behaviors that may underlie the establishment of joint attention between parents and toddlers. In an experimental task, parents and toddlers played together with multiple toys. We objectively measured joint attention – and the sensory-motor behaviors that underlie it – using a dual head-mounted eye-tracking system and frame-by-frame coding of manual actions. By tracking the momentary visual fixations and hand actions of each participant, we precisely determined just how often they fixated on the same object at the same time, the visual behaviors that preceded joint attention, and manual behaviors that preceded and co-occurred with joint attention. We found that multiple sequential sensory-motor patterns lead to joint attention. In addition, there are developmental changes in this multi-pathway system evidenced as variations in strength among multiple routes. We propose that coordinated visual attention between parents and toddlers is primarily a sensory-motor behavior. Skill in achieving coordinated visual attention in social settings – like skills in other sensory-motor domains – emerges from multiple pathways to the same functional end. PMID:27016038

  1. Multiple Sensory-Motor Pathways Lead to Coordinated Visual Attention.

    PubMed

    Yu, Chen; Smith, Linda B

    2017-02-01

    Joint attention has been extensively studied in the developmental literature because of overwhelming evidence that the ability to socially coordinate visual attention to an object is essential to healthy developmental outcomes, including language learning. The goal of this study was to understand the complex system of sensory-motor behaviors that may underlie the establishment of joint attention between parents and toddlers. In an experimental task, parents and toddlers played together with multiple toys. We objectively measured joint attention, and the sensory-motor behaviors that underlie it, using a dual head-mounted eye-tracking system and frame-by-frame coding of manual actions. By tracking the momentary visual fixations and hand actions of each participant, we precisely determined just how often they fixated on the same object at the same time, the visual behaviors that preceded joint attention, and manual behaviors that preceded and co-occurred with joint attention. We found that multiple sequential sensory-motor patterns lead to joint attention. In addition, there are developmental changes in this multi-pathway system evidenced as variations in strength among multiple routes. We propose that coordinated visual attention between parents and toddlers is primarily a sensory-motor behavior. Skill in achieving coordinated visual attention in social settings, like skills in other sensory-motor domains, emerges from multiple pathways to the same functional end. Copyright © 2016 Cognitive Science Society, Inc.

  2. Eagle-eyed visual acuity: an experimental investigation of enhanced perception in autism.

    PubMed

    Ashwin, Emma; Ashwin, Chris; Rhydderch, Danielle; Howells, Jessica; Baron-Cohen, Simon

    2009-01-01

    Anecdotal accounts of sensory hypersensitivity in individuals with autism spectrum conditions (ASC) have been noted since the first reports of the condition. Over time, empirical evidence has supported the notion that those with ASC have superior visual abilities compared with control subjects. However, it remains unclear whether these abilities are specifically the result of differences in sensory thresholds (low-level processing), rather than higher-level cognitive processes. This study investigates visual thresholds in n = 15 individuals with ASC and n = 15 individuals without ASC, using a standardized optometric test, the Freiburg Visual Acuity and Contrast Test, to measure basic low-level visual acuity. Individuals with ASC have significantly better visual acuity (20:7) compared with control subjects (20:13), acuity so superior that it lies in the region reported for birds of prey. The results of this study suggest that inclusion of sensory hypersensitivity in the diagnostic criteria for ASC may be warranted and that basic standardized tests of sensory thresholds may inform causal theories of ASC.

  3. Modality-specificity of sensory aging in vision and audition: evidence from event-related potentials.

    PubMed

    Ceponiene, R; Westerfield, M; Torki, M; Townsend, J

    2008-06-18

    Major accounts of aging implicate changes in processing external stimulus information. Little is known about the differential effects of auditory and visual sensory aging, and the mechanisms of sensory aging are still poorly understood. Using event-related potentials (ERPs) elicited by unattended stimuli in younger (M=25.5 years) and older (M=71.3 years) subjects, this study examined mechanisms of sensory aging under minimized attention conditions. Auditory and visual modalities were examined to address the modality-specificity vs. generality of sensory aging. Between-modality differences were robust. The earlier-latency responses (P1, N1) were unaffected in the auditory modality but were diminished in the visual modality. The auditory N2 and early visual N2 were diminished. Two similarities between the modalities were age-related enhancements in the late P2 range and a positive correlation between behavior and the early N2, the latter suggesting that N2 may reflect long-latency inhibition of irrelevant stimuli. Since there is no evidence for salient differences in neurobiological aging between the two sensory regions, the observed between-modality differences are best explained by the differential reliance of auditory and visual systems on attention. Visual sensory processing relies on facilitation by visuo-spatial attention, withdrawal of which appears to be more disadvantageous in older populations. In contrast, auditory processing is equipped with powerful inhibitory capacities. However, when the whole auditory modality is unattended, thalamo-cortical gating deficits may not manifest in the elderly. In contrast, ERP indices of longer-latency, stimulus-level inhibitory modulation appear to diminish with age.

  4. Late development of cue integration is linked to sensory fusion in cortex.

    PubMed

    Dekker, Tessa M; Ban, Hiroshi; van der Velde, Bauke; Sereno, Martin I; Welchman, Andrew E; Nardini, Marko

    2015-11-02

    Adults optimize perceptual judgements by integrating different types of sensory information [1, 2]. This engages specialized neural circuits that fuse signals from the same [3-5] or different [6] modalities. Whereas young children can use sensory cues independently, adult-like precision gains from cue combination only emerge around ages 10 to 11 years [7-9]. Why does it take so long to make best use of sensory information? Existing data cannot distinguish whether this (1) reflects surprisingly late changes in sensory processing (sensory integration mechanisms in the brain are still developing) or (2) depends on post-perceptual changes (integration in sensory cortex is adult-like, but higher-level decision processes do not access the information) [10]. We tested visual depth cue integration in the developing brain to distinguish these possibilities. We presented children aged 6-12 years with displays depicting depth from binocular disparity and relative motion and made measurements using psychophysics, retinotopic mapping, and pattern classification fMRI. Older children (>10.5 years) showed clear evidence for sensory fusion in V3B, a visual area thought to integrate depth cues in the adult brain [3-5]. By contrast, in younger children (<10.5 years), there was no evidence for sensory fusion in any visual area. This significant age difference was paired with a shift in perceptual performance around ages 10 to 11 years and could not be explained by motion artifacts, visual attention, or signal quality differences. Thus, whereas many basic visual processes mature early in childhood [11, 12], the brain circuits that fuse cues take a very long time to develop. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
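
    The "precision gain" from cue combination referred to above is conventionally modelled as reliability-weighted averaging of single-cue estimates. The sketch below illustrates that standard model with invented disparity and motion values; it is not code or data from the study.

      # Minimal sketch of reliability-weighted (maximum-likelihood) cue combination.
      # The depth estimates and standard deviations below are invented.
      import numpy as np

      depth_disparity, sigma_disparity = 1.20, 0.20   # cue 1 (arbitrary units)
      depth_motion, sigma_motion = 1.50, 0.40         # cue 2

      w_disparity = sigma_disparity**-2 / (sigma_disparity**-2 + sigma_motion**-2)
      w_motion = 1.0 - w_disparity

      combined_estimate = w_disparity * depth_disparity + w_motion * depth_motion
      combined_sigma = np.sqrt(1.0 / (sigma_disparity**-2 + sigma_motion**-2))

      # The combined standard deviation is smaller than either single-cue value;
      # this is the adult-like precision gain that emerges around 10 to 11 years.
      print(combined_estimate, combined_sigma)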

  5. Late Development of Cue Integration Is Linked to Sensory Fusion in Cortex

    PubMed Central

    Dekker, Tessa M.; Ban, Hiroshi; van der Velde, Bauke; Sereno, Martin I.; Welchman, Andrew E.; Nardini, Marko

    2015-01-01

    Summary Adults optimize perceptual judgements by integrating different types of sensory information [1, 2]. This engages specialized neural circuits that fuse signals from the same [3, 4, 5] or different [6] modalities. Whereas young children can use sensory cues independently, adult-like precision gains from cue combination only emerge around ages 10 to 11 years [7, 8, 9]. Why does it take so long to make best use of sensory information? Existing data cannot distinguish whether this (1) reflects surprisingly late changes in sensory processing (sensory integration mechanisms in the brain are still developing) or (2) depends on post-perceptual changes (integration in sensory cortex is adult-like, but higher-level decision processes do not access the information) [10]. We tested visual depth cue integration in the developing brain to distinguish these possibilities. We presented children aged 6–12 years with displays depicting depth from binocular disparity and relative motion and made measurements using psychophysics, retinotopic mapping, and pattern classification fMRI. Older children (>10.5 years) showed clear evidence for sensory fusion in V3B, a visual area thought to integrate depth cues in the adult brain [3, 4, 5]. By contrast, in younger children (<10.5 years), there was no evidence for sensory fusion in any visual area. This significant age difference was paired with a shift in perceptual performance around ages 10 to 11 years and could not be explained by motion artifacts, visual attention, or signal quality differences. Thus, whereas many basic visual processes mature early in childhood [11, 12], the brain circuits that fuse cues take a very long time to develop. PMID:26480841

  6. Brain representations for acquiring and recalling visual-motor adaptations

    PubMed Central

    Bédard, Patrick; Sanes, Jerome N.

    2014-01-01

    Humans readily learn and remember new motor skills, a process that likely underlies adaptation to changing environments. During adaptation, the brain develops new sensory-motor relationships, and if consolidation occurs, a memory of the adaptation can be retained for extended periods. Considerable evidence exists that multiple brain circuits participate in acquiring new sensory-motor memories, but the networks engaged in recalling these memories, and whether the same brain circuits participate in both their formation and recall, are less clear. To address these issues, we assessed brain activation with functional MRI while young healthy adults learned and recalled new sensory-motor skills by adapting to world-view rotations of visual feedback that guided hand movements. We found cerebellar activation related to adaptation rate, likely reflecting changes related to overall adjustments to the visual rotation. A set of parietal and frontal regions, including the inferior and superior parietal lobules, premotor area, supplementary motor area and primary somatosensory cortex, exhibited non-linear learning-related activation that peaked in the middle of the adaptation phase. Activation in some of these areas, including the inferior parietal lobule, intra-parietal sulcus and somatosensory cortex, likely reflected actual learning, since the activation correlated with learning after-effects. Lastly, we identified several structures with recall-related activation, including the anterior cingulate and the posterior putamen, whose activation correlated with recall efficacy. These findings demonstrate dynamic aspects of brain activation patterns related to formation and recall of a sensory-motor skill, such that non-overlapping brain regions participate in distinctive behavioral events. PMID:25019676

  7. Evaluation of Sensory Skills among Students with Visual Impairment

    ERIC Educational Resources Information Center

    Saleem, Suhib Saleem; Al-Salahat, Mohammad Mousa

    2016-01-01

    The purpose of the study was to evaluate the sensory skills of students with visual impairment (SVI). The sample consisted of 30 students who were blind or had low vision, enrolled in mainstreaming programs at general education schools in Najran in the Kingdom of Saudi Arabia. A sensory skills scale was developed. The scale consisted of 20 items was…

  8. Multisensory integration, sensory substitution and visual rehabilitation.

    PubMed

    Proulx, Michael J; Ptito, Maurice; Amedi, Amir

    2014-04-01

    Sensory substitution has advanced remarkably over the past 35 years since first introduced to the scientific literature by Paul Bach-y-Rita. In this issue dedicated to his memory, we describe a collection of reviews that assess the current state of neuroscience research on sensory substitution, visual rehabilitation, and multisensory processes. Copyright © 2014. Published by Elsevier Ltd.

  9. Otolith shape lends support to the sensory drive hypothesis in rockfishes.

    PubMed

    Tuset, V M; Otero-Ferrer, J L; Gómez-Zurita, J; Venerus, L A; Stransky, C; Imondi, R; Orlov, A M; Ye, Z; Santschi, L; Afanasiev, P K; Zhuang, L; Farré, M; Love, M S; Lombarte, A

    2016-10-01

    The sensory drive hypothesis proposes that environmental factors affect both signalling dynamics and the evolution of signals and receivers. Sound detection and equilibrium in marine fishes are senses dependent on the sagittae otoliths, whose morphological variability appears intrinsically linked to the environment. The aim of this study was to understand whether, and which, environmental factors could be conditioning the evolution of this sensory structure, thereby lending support to the sensory drive hypothesis. Thus, we analysed the otolith shape of 42 rockfish species (Sebastes spp.) to test potential associations with phylogeny and with biological (age), ecological (feeding habit and depth distribution) and biogeographical factors. The results showed strong differences in the otolith shapes of some species, noticeably influenced by ecological and biogeographical factors. Moreover, otolith shape was clearly conditioned by phylogeny, but with a strong environmental effect, cautioning against the use of this structure for the systematics of rockfishes or other marine fishes. However, our most relevant finding is that the data supported the sensory drive hypothesis as a force promoting the radiation of the genus Sebastes. This hypothesis holds that adaptive divergence in communication has significant influence relative to other life-history traits. It has already been established in Sebastes for visual characters and organs; our results showed that it applies to otolith transformations as well (despite the clear influence of feeding and depth), expanding the scope of the hypothesis to other sensory structures. © 2016 European Society For Evolutionary Biology. Journal of Evolutionary Biology © 2016 European Society For Evolutionary Biology.

  10. Thalamic control of sensory selection in divided attention.

    PubMed

    Wimmer, Ralf D; Schmitt, L Ian; Davidson, Thomas J; Nakajima, Miho; Deisseroth, Karl; Halassa, Michael M

    2015-10-29

    How the brain selects appropriate sensory inputs and suppresses distractors is unknown. Given the well-established role of the prefrontal cortex (PFC) in executive function, its interactions with sensory cortical areas during attention have been hypothesized to control sensory selection. To test this idea and, more generally, dissect the circuits underlying sensory selection, we developed a cross-modal divided-attention task in mice that allowed genetic access to this cognitive process. By optogenetically perturbing PFC function in a temporally precise window, the ability of mice to select appropriately between conflicting visual and auditory stimuli was diminished. Equivalent sensory thalamocortical manipulations showed that behaviour was causally dependent on PFC interactions with the sensory thalamus, not sensory cortex. Consistent with this notion, we found neurons of the visual thalamic reticular nucleus (visTRN) to exhibit PFC-dependent changes in firing rate predictive of the modality selected. visTRN activity was causal to performance as confirmed by bidirectional optogenetic manipulations of this subnetwork. Using a combination of electrophysiology and intracellular chloride photometry, we demonstrated that visTRN dynamically controls visual thalamic gain through feedforward inhibition. Our experiments introduce a new subcortical model of sensory selection, in which the PFC biases thalamic reticular subnetworks to control thalamic sensory gain, selecting appropriate inputs for further processing.

  11. Audiovisual Modulation in Mouse Primary Visual Cortex Depends on Cross-Modal Stimulus Configuration and Congruency.

    PubMed

    Meijer, Guido T; Montijn, Jorrit S; Pennartz, Cyriel M A; Lansink, Carien S

    2017-09-06

    The sensory neocortex is a highly connected associative network that integrates information from multiple senses, even at the level of the primary sensory areas. Although a growing body of empirical evidence supports this view, the neural mechanisms of cross-modal integration in primary sensory areas, such as the primary visual cortex (V1), are still largely unknown. Using two-photon calcium imaging in awake mice, we show that the encoding of audiovisual stimuli in V1 neuronal populations is highly dependent on the features of the stimulus constituents. When the visual and auditory stimulus features were modulated at the same rate (i.e., temporally congruent), neurons responded with either an enhancement or suppression compared with unisensory visual stimuli, and their prevalence was balanced. Temporally incongruent tones or white-noise bursts included in audiovisual stimulus pairs resulted in predominant response suppression across the neuronal population. Visual contrast did not influence multisensory processing when the audiovisual stimulus pairs were congruent; however, when white-noise bursts were used, neurons generally showed response suppression when the visual stimulus contrast was high whereas this effect was absent when the visual contrast was low. Furthermore, a small fraction of V1 neurons, predominantly those located near the lateral border of V1, responded to sound alone. These results show that V1 is involved in the encoding of cross-modal interactions in a more versatile way than previously thought. SIGNIFICANCE STATEMENT The neural substrate of cross-modal integration is not limited to specialized cortical association areas but extends to primary sensory areas. Using two-photon imaging of large groups of neurons, we show that multisensory modulation of V1 populations is strongly determined by the individual and shared features of cross-modal stimulus constituents, such as contrast, frequency, congruency, and temporal structure. Congruent audiovisual stimulation resulted in a balanced pattern of response enhancement and suppression compared with unisensory visual stimuli, whereas incongruent or dissimilar stimuli at full contrast gave rise to a population dominated by response-suppressing neurons. Our results indicate that V1 dynamically integrates nonvisual sources of information while still attributing most of its resources to coding visual information. Copyright © 2017 the authors 0270-6474/17/378783-14$15.00/0.

  12. Sensory Substitution: The Spatial Updating of Auditory Scenes "Mimics" the Spatial Updating of Visual Scenes.

    PubMed

    Pasqualotto, Achille; Esenkaya, Tayfun

    2016-01-01

    Visual-to-auditory sensory substitution is used to convey visual information through audition, and it was initially created to compensate for blindness; it consists of software converting the visual images captured by a video-camera into the equivalent auditory images, or "soundscapes". Here, it was used by blindfolded sighted participants to learn the spatial position of simple shapes depicted in images arranged on the floor. Very few studies have used sensory substitution to investigate spatial representation, while it has been widely used to investigate object recognition. Additionally, with sensory substitution we could study the performance of participants actively exploring the environment through audition, rather than passively localizing sound sources. Blindfolded participants egocentrically learnt the position of six images by using sensory substitution and then a judgment of relative direction task (JRD) was used to determine how this scene was represented. This task consists of imagining being in a given location, oriented in a given direction, and pointing towards the required image. Before performing the JRD task, participants explored a map that provided allocentric information about the scene. Although spatial exploration was egocentric, surprisingly we found that performance in the JRD task was better for allocentric perspectives. This suggests that the egocentric representation of the scene was updated. This result is in line with previous studies using visual and somatosensory scenes, thus supporting the notion that different sensory modalities produce equivalent spatial representation(s). Moreover, our results have practical implications to improve training methods with sensory substitution devices (SSD).

  13. Visual short-term memory load reduces retinotopic cortex response to contrast.

    PubMed

    Konstantinou, Nikos; Bahrami, Bahador; Rees, Geraint; Lavie, Nilli

    2012-11-01

    Load Theory of attention suggests that high perceptual load in a task leads to reduced sensory visual cortex response to task-unrelated stimuli resulting in "load-induced blindness" [e.g., Lavie, N. Attention, distraction and cognitive control under load. Current Directions in Psychological Science, 19, 143-148, 2010; Lavie, N. Distracted and confused?: Selective attention under load. Trends in Cognitive Sciences, 9, 75-82, 2005]. Consideration of the findings that visual STM (VSTM) involves sensory recruitment [e.g., Pasternak, T., & Greenlee, M. Working memory in primate sensory systems. Nature Reviews Neuroscience, 6, 97-107, 2005] within Load Theory led us to a new hypothesis regarding the effects of VSTM load on visual processing. If VSTM load draws on sensory visual capacity, then similar to perceptual load, high VSTM load should also reduce visual cortex response to incoming stimuli leading to a failure to detect them. We tested this hypothesis with fMRI and behavioral measures of visual detection sensitivity. Participants detected the presence of a contrast increment during the maintenance delay in a VSTM task requiring maintenance of color and position. Increased VSTM load (manipulated by increased set size) led to reduced retinotopic visual cortex (V1-V3) responses to contrast as well as reduced detection sensitivity, as we predicted. Additional visual detection experiments established a clear tradeoff between the amount of information maintained in VSTM and detection sensitivity, while ruling out alternative accounts for the effects of VSTM load in terms of differential spatial allocation strategies or task difficulty. These findings extend Load Theory to demonstrate a new form of competitive interactions between early visual cortex processing and visual representations held in memory under load and provide a novel line of support for the sensory recruitment hypothesis of VSTM.
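
    The behavioural "detection sensitivity" measured here is conventionally summarised with d' from signal detection theory. The snippet below is a generic sketch with invented hit and false-alarm rates, not the study's analysis code.

      # Generic signal-detection sketch: d' = z(hit rate) - z(false-alarm rate).
      # The example rates under low and high VSTM load are invented.
      from scipy.stats import norm

      def d_prime(hit_rate, false_alarm_rate):
          return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

      print(d_prime(0.85, 0.10))   # hypothetical low-load condition
      print(d_prime(0.70, 0.15))   # hypothetical high-load condition: lower d'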

  14. Auditory and visual connectivity gradients in frontoparietal cortex

    PubMed Central

    Hellyer, Peter J.; Wise, Richard J. S.; Leech, Robert

    2016-01-01

    Abstract A frontoparietal network of brain regions is often implicated in both auditory and visual information processing. Although it is possible that the same set of multimodal regions subserves both modalities, there is increasing evidence that there is a differentiation of sensory function within frontoparietal cortex. Magnetic resonance imaging (MRI) in humans was used to investigate whether different frontoparietal regions showed intrinsic biases in connectivity with visual or auditory modalities. Structural connectivity was assessed with diffusion tractography and functional connectivity was tested using functional MRI. A dorsal–ventral gradient of function was observed, where connectivity with visual cortex dominates dorsal frontal and parietal connections, while connectivity with auditory cortex dominates ventral frontal and parietal regions. A gradient was also observed along the posterior–anterior axis, although in opposite directions in prefrontal and parietal cortices. The results suggest that the location of neural activity within frontoparietal cortex may be influenced by these intrinsic biases toward visual and auditory processing. Thus, the location of activity in frontoparietal cortex may be influenced as much by stimulus modality as the cognitive demands of a task. It was concluded that stimulus modality was spatially encoded throughout frontal and parietal cortices, and was speculated that such an arrangement allows for top–down modulation of modality‐specific information to occur within higher‐order cortex. This could provide a potentially faster and more efficient pathway by which top–down selection between sensory modalities could occur, by constraining modulations to within frontal and parietal regions, rather than long‐range connections to sensory cortices. Hum Brain Mapp 38:255–270, 2017. © 2016 Wiley Periodicals, Inc. PMID:27571304

  15. Sight or Scent: Lemur Sensory Reliance in Detecting Food Quality Varies with Feeding Ecology

    PubMed Central

    Rushmore, Julie; Leonhardt, Sara D.; Drea, Christine M.

    2012-01-01

    Visual and olfactory cues provide important information to foragers, yet we know little about species differences in sensory reliance during food selection. In a series of experimental foraging studies, we examined the relative reliance on vision versus olfaction in three diurnal, primate species with diverse feeding ecologies, including folivorous Coquerel's sifakas (Propithecus coquereli), frugivorous ruffed lemurs (Varecia variegata spp), and generalist ring-tailed lemurs (Lemur catta). We used animals with known color-vision status and foods for which different maturation stages (and hence quality) produce distinct visual and olfactory cues (the latter determined chemically). We first showed that lemurs preferentially selected high-quality foods over low-quality foods when visual and olfactory cues were simultaneously available for both food types. Next, using a novel apparatus in a series of discrimination trials, we either manipulated food quality (while holding sensory cues constant) or manipulated sensory cues (while holding food quality constant). Among our study subjects that showed relatively strong preferences for high-quality foods, folivores required both sensory cues combined to reliably identify their preferred foods, whereas generalists could identify their preferred foods using either cue alone, and frugivores could identify their preferred foods using olfactory, but not visual, cues alone. Moreover, when only high-quality foods were available, folivores and generalists used visual rather than olfactory cues to select food, whereas frugivores used both cue types equally. Lastly, individuals in all three of the study species predominantly relied on sight when choosing between low-quality foods, but species differed in the strength of their sensory biases. Our results generally emphasize visual over olfactory reliance in foraging lemurs, but we suggest that the relative sensory reliance of animals may vary with their feeding ecology. PMID:22870229

  16. Anemonefishes rely on visual and chemical cues to correctly identify conspecifics

    NASA Astrophysics Data System (ADS)

    Johnston, Nicole K.; Dixson, Danielle L.

    2017-09-01

    Organisms rely on sensory cues to interpret their environment and make important life-history decisions. Accurate recognition is of particular importance in diverse reef environments. Most evidence on the use of sensory cues focuses on those used in predator avoidance or habitat recognition, with little information on their role in conspecific recognition. Yet conspecific recognition is essential for life-history decisions including settlement, mate choice, and dominance interactions. Using a sensory-manipulated tank and a two-chamber choice flume, anemonefish conspecific response was measured in the presence and absence of chemical and/or visual cues. Experiments were then repeated in the presence or absence of two heterospecific species to evaluate whether a heterospecific fish altered the conspecific response. Anemonefishes responded to both the visual and chemical cues of conspecifics, but relied on the combination of the two cues to recognize conspecifics inside the sensory-manipulated tank. These results contrast with previous studies focusing on predator detection, where anemonefishes were found to compensate for the loss of one sensory cue (chemical) by utilizing a second cue (visual). This lack of sensory compensation may impact the ability of anemonefishes to acclimate to changing reef environments in the future.

  17. Digital-Visual-Sensory-Design Anthropology: Ethnography, Imagination and Intervention

    ERIC Educational Resources Information Center

    Pink, Sarah

    2014-01-01

    In this article I outline how a digital-visual-sensory approach to anthropological ethnography might participate in the making of a relationship between design and anthropology. While design anthropology is itself coming of age, the potential of its relationship with applied visual anthropology methodology and theory has not been considered in the…

  18. Visual Landmarks Facilitate Rodent Spatial Navigation in Virtual Reality Environments

    ERIC Educational Resources Information Center

    Youngstrom, Isaac A.; Strowbridge, Ben W.

    2012-01-01

    Because many different sensory modalities contribute to spatial learning in rodents, it has been difficult to determine whether spatial navigation can be guided solely by visual cues. Rodents moving within physical environments with visual cues engage a variety of nonvisual sensory systems that cannot be easily inhibited without lesioning brain…

  19. Prenatal sensory experience affects hatching behavior in domestic chicks (Gallus gallus) and Japanese quail chicks (Coturnix coturnix japonica).

    PubMed

    Sleigh, Merry J; Casey, Michael B

    2014-07-01

    Species-typical developmental outcomes result from organismic and environmental constraints and experiences shared by members of a species. We examined the effects of enhanced prenatal sensory experience on hatching behaviors by exposing domestic chicks (n = 95) and Japanese quail (n = 125) to one of four prenatal conditions: enhanced visual stimulation, enhanced auditory stimulation, enhanced auditory and visual stimulation, or no enhanced sensory experience (control condition). In general, across species, control embryos had slower hatching behaviors than all other embryos. Embryos in the auditory condition had faster hatching behaviors than embryos in the visual and control conditions. Auditory-visual condition embryos showed similarities to embryos exposed to either auditory or visual stimulation. These results suggest that prenatal sensory experience can influence hatching behavior of precocial birds, with the type of stimulation being a critical variable. These results also provide further evidence that species-typical outcomes are the result of species-typical prenatal experiences. © 2013 Wiley Periodicals, Inc.

  20. Learning Enhances Sensory and Multiple Non-sensory Representations in Primary Visual Cortex

    PubMed Central

    Poort, Jasper; Khan, Adil G.; Pachitariu, Marius; Nemri, Abdellatif; Orsolic, Ivana; Krupic, Julija; Bauza, Marius; Sahani, Maneesh; Keller, Georg B.; Mrsic-Flogel, Thomas D.; Hofer, Sonja B.

    2015-01-01

    Summary We determined how learning modifies neural representations in primary visual cortex (V1) during acquisition of a visually guided behavioral task. We imaged the activity of the same layer 2/3 neuronal populations as mice learned to discriminate two visual patterns while running through a virtual corridor, where one pattern was rewarded. Improvements in behavioral performance were closely associated with increasingly distinguishable population-level representations of task-relevant stimuli, as a result of stabilization of existing and recruitment of new neurons selective for these stimuli. These effects correlated with the appearance of multiple task-dependent signals during learning: those that increased neuronal selectivity across the population when expert animals engaged in the task, and those reflecting anticipation or behavioral choices specifically in neuronal subsets preferring the rewarded stimulus. Therefore, learning engages diverse mechanisms that modify sensory and non-sensory representations in V1 to adjust its processing to task requirements and the behavioral relevance of visual stimuli. PMID:26051421
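
    How "distinguishable" two stimuli are at the population level is commonly quantified with cross-validated decoding. The sketch below applies a linear decoder to simulated trial-by-neuron data; it is an illustration of the general approach, not the authors' analysis pipeline.

      # Sketch: cross-validated linear decoding of two visual patterns from
      # simulated population activity (trials x neurons). Data are synthetic.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      n_trials, n_neurons = 100, 50
      labels = np.repeat([0, 1], n_trials // 2)           # two task stimuli
      responses = rng.normal(size=(n_trials, n_neurons))
      responses[labels == 1, :5] += 0.8                   # a few selective neurons

      accuracy = cross_val_score(LogisticRegression(max_iter=1000),
                                 responses, labels, cv=5).mean()
      print(accuracy)   # higher accuracy ~ more distinguishable representations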

  1. Orthographic units in the absence of visual processing: Evidence from sublexical structure in braille.

    PubMed

    Fischer-Baum, Simon; Englebretson, Robert

    2016-08-01

    Reading relies on the recognition of units larger than single letters and smaller than whole words. Previous research has linked sublexical structures in reading to properties of the visual system, specifically on the parallel processing of letters that the visual system enables. But whether the visual system is essential for this to happen, or whether the recognition of sublexical structures may emerge by other means, is an open question. To address this question, we investigate braille, a writing system that relies exclusively on the tactile rather than the visual modality. We provide experimental evidence demonstrating that adult readers of (English) braille are sensitive to sublexical units. Contrary to prior assumptions in the braille research literature, we find strong evidence that braille readers do indeed access sublexical structure, namely the processing of multi-cell contractions as single orthographic units and the recognition of morphemes within morphologically-complex words. Therefore, we conclude that the recognition of sublexical structure is not exclusively tied to the visual system. However, our findings also suggest that there are aspects of morphological processing on which braille and print readers differ, and that these differences may, crucially, be related to reading using the tactile rather than the visual sensory modality. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Collective behaviour in vertebrates: a sensory perspective

    PubMed Central

    Collignon, Bertrand; Fernández-Juricic, Esteban

    2016-01-01

    Collective behaviour models can predict behaviours of schools, flocks, and herds. However, in many cases, these models make biologically unrealistic assumptions in terms of the sensory capabilities of the organism, which are applied across different species. We explored how sensitive collective behaviour models are to these sensory assumptions. Specifically, we used parameters reflecting the visual coverage and visual acuity that determine the spatial range over which an individual can detect and interact with conspecifics. Using metric and topological collective behaviour models, we compared the classic sensory parameters, typically used to model birds and fish, with a set of realistic sensory parameters obtained through physiological measurements. Compared with the classic sensory assumptions, the realistic assumptions increased perceptual ranges, which led to fewer groups and larger group sizes in all species, and higher polarity values and slightly shorter neighbour distances in the fish species. Overall, classic visual sensory assumptions are not representative of many species showing collective behaviour and constrain unrealistically their perceptual ranges. More importantly, caution must be exercised when empirically testing the predictions of these models in terms of choosing the model species, making realistic predictions, and interpreting the results. PMID:28018616
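
    The "realistic" sensory parameters described here amount to restricting which conspecifics an individual can respond to, given its visual coverage and an acuity-limited detection distance. The function below is an invented illustration of that filtering step; the field-of-view angle and distance are arbitrary placeholders, not the measured physiological values.

      # Sketch: keep only neighbours inside the focal animal's visual coverage
      # and within an acuity-limited detection distance. Values are invented.
      import numpy as np

      def visible_neighbours(positions, focal, heading_deg,
                             field_of_view_deg=300.0, max_detect_dist=5.0):
          offsets = positions - positions[focal]
          dists = np.linalg.norm(offsets, axis=1)
          bearings = np.degrees(np.arctan2(offsets[:, 1], offsets[:, 0]))
          rel_angle = (bearings - heading_deg + 180.0) % 360.0 - 180.0
          in_view = np.abs(rel_angle) <= field_of_view_deg / 2.0
          in_range = (dists > 0) & (dists <= max_detect_dist)
          return np.where(in_view & in_range)[0]

      positions = np.random.rand(15, 2) * 10.0
      print(visible_neighbours(positions, focal=0, heading_deg=90.0))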

  3. Factor Analysis of Persistent Post-Concussive Symptoms within a Military Sample with Blast Exposure

    PubMed Central

    Franke, L.M.; Czarnota, J.N.; Ketchum, J.M.; Walker, W.C.

    2014-01-01

    Objective: To determine the factor structure of persistent post-concussive syndrome (PPCS) symptoms in a blast-exposed military sample and validate factors against objective and symptom measures. Setting: Veterans Affairs medical center and military bases. Participants: One hundred eighty-one service members and veterans with at least one significant exposure to blast during deployment within the two years prior to study enrollment. Design: Confirmatory and exploratory factor analysis of the Rivermead Post-concussion Questionnaire (RPQ). Main Measures: RPQ, PTSD Symptom Checklist-Civilian, Center for Epidemiologic Studies Depression inventory, Sensory Organization Test, Paced Auditory Serial Addition Test, California Verbal Learning Test, Delis-Kaplan Executive Function System subtests. Results: The three-factor structure of PPCS was not confirmed. A four-factor structure was extracted, and factors were interpreted as reflecting emotional, cognitive, visual, and vestibular functions. All factors were associated with scores on psychological symptom inventories; visual and vestibular factors were also associated with balance performance. There was no significant association between the cognitive factor and neuropsychological performance, nor between a history of mTBI and factor scores. Conclusion: Persistent post-concussive symptoms observed months after blast exposure seem to be related to four distinct forms of distress, but not to mTBI per se, with vestibular and visual factors possibly related to injury of sensory organs by blast. PMID:24695267
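
    An exploratory factor analysis of questionnaire items, as described in this record, can be approximated in general-purpose software. The sketch below runs a four-factor solution on simulated item scores with scikit-learn; it is not the study's statistical pipeline, which would normally use dedicated factor-analysis software with rotation and model-fit statistics. The 16-item layout simply mirrors the commonly used RPQ item count.

      # Sketch: exploratory factor analysis of simulated questionnaire scores
      # (respondents x items). All data are synthetic.
      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(1)
      n_respondents, n_items = 181, 16
      item_scores = rng.integers(0, 5, size=(n_respondents, n_items)).astype(float)

      fa = FactorAnalysis(n_components=4, random_state=0)
      fa.fit(item_scores)
      loadings = fa.components_.T      # items x factors loading matrix
      print(loadings.shape)            # (16, 4)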

  4. Visual Occlusion Decreases Motion Sickness in a Flight Simulator.

    PubMed

    Ishak, Shaziela; Bubka, Andrea; Bonato, Frederick

    2018-05-01

    Sensory conflict theories of motion sickness (MS) assert that symptoms may result when incoming sensory inputs (e.g., visual and vestibular) contradict each other. Logic suggests that attenuating input from one sense may reduce conflict and hence lessen MS symptoms. In the current study, it was hypothesized that attenuating visual input by blocking light entering the eye would reduce MS symptoms in a motion-provocative environment. Participants sat inside an aircraft cockpit mounted onto a motion platform that simultaneously pitched, rolled, and heaved in two conditions. In the occluded condition, participants wore "blackout" goggles and closed their eyes to block light. In the control condition, participants opened their eyes and had full view of the cockpit's interior. Participants completed separate Simulator Sickness Questionnaires before and after each condition. The posttreatment total Simulator Sickness Questionnaire scores and the subscores for nausea, oculomotor symptoms, and disorientation were significantly higher in the control condition than in the occluded condition. These results suggest that under some conditions attenuating visual input may delay the onset of MS or weaken the severity of symptoms. Eliminating visual input may reduce visual/nonvisual sensory conflict by weakening the influence of the visual channel, which is consistent with the sensory conflict theory of MS.

  5. Sensory Impairments and Cognitive Function in Middle-Aged Adults.

    PubMed

    Schubert, Carla R; Cruickshanks, Karen J; Fischer, Mary E; Chen, Yanjun; Klein, Barbara E K; Klein, Ronald; Pinto, A Alex

    2017-08-01

    Hearing, visual, and olfactory impairments have been associated with cognitive impairment in older adults but less is known about associations with cognitive function in middle-aged adults. Sensory and cognitive functions were measured on participants in the baseline examination (2005-2008) of the Beaver Dam Offspring Study. Cognitive function was measured with the Trail Making tests A (TMTA) and B (TMTB) and the Grooved Peg Board test. Pure-tone audiometry, Pelli-Robson letter charts, and the San Diego Odor Identification test were used to measure hearing, contrast sensitivity, and olfaction, respectively. There were 2,836 participants aged 21-84 years with measures of hearing, visual, olfactory, and cognitive function at the baseline examination. Nineteen percent of the cohort had one sensory impairment and 3% had multiple sensory impairments. In multivariable adjusted linear regression models that included all three sensory impairments, hearing impairment, visual impairment, and olfactory impairment were each independently associated with poorer performance on the TMTA, TMTB, and Grooved Peg Board (p < .05 for all sensory impairments in all models). Participants with a sensory impairment took on average from 2 to 10 seconds longer than participants without the corresponding sensory impairment to complete these tests. Results were similar in models that included adjustment for hearing aid use. Hearing, visual and olfactory impairment were associated with poorer performance on cognitive function tests independent of the other sensory impairments and factors associated with cognition. Sensory impairments in midlife are associated with subtle deficits in cognitive function which may be indicative of early brain aging. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
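
    The multivariable models described here follow a standard linear-regression form: a timed cognitive score regressed on the three sensory impairments plus covariates. The sketch below uses simulated data and statsmodels, and does not reproduce the study's covariate set or effect sizes.

      # Sketch: regress a timed cognitive score on three sensory-impairment
      # indicators plus age. Data and coefficients are simulated.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(2)
      n = 2836
      df = pd.DataFrame({
          "age": rng.uniform(21, 84, n),
          "hearing_imp": rng.integers(0, 2, n),
          "visual_imp": rng.integers(0, 2, n),
          "olfactory_imp": rng.integers(0, 2, n),
      })
      df["trail_making_b_sec"] = (40 + 0.5 * df["age"] + 5 * df["hearing_imp"]
                                  + 4 * df["visual_imp"] + 3 * df["olfactory_imp"]
                                  + rng.normal(0, 10, n))

      model = smf.ols("trail_making_b_sec ~ age + hearing_imp + visual_imp"
                      " + olfactory_imp", data=df).fit()
      print(model.params)   # each impairment adds seconds to completion time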

  6. Hearing Shapes: Event-related Potentials Reveal the Time Course of Auditory-Visual Sensory Substitution.

    PubMed

    Graulty, Christian; Papaioannou, Orestis; Bauer, Phoebe; Pitts, Michael A; Canseco-Gonzalez, Enriqueta

    2018-04-01

    In auditory-visual sensory substitution, visual information (e.g., shape) can be extracted through strictly auditory input (e.g., soundscapes). Previous studies have shown that image-to-sound conversions that follow simple rules [such as the Meijer algorithm; Meijer, P. B. L. An experimental system for auditory image representation. Transactions on Biomedical Engineering, 39, 111-121, 1992] are highly intuitive and rapidly learned by both blind and sighted individuals. A number of recent fMRI studies have begun to explore the neuroplastic changes that result from sensory substitution training. However, the time course of cross-sensory information transfer in sensory substitution is largely unexplored and may offer insights into the underlying neural mechanisms. In this study, we recorded ERPs to soundscapes before and after sighted participants were trained with the Meijer algorithm. We compared these posttraining versus pretraining ERP differences with those of a control group who received the same set of 80 auditory/visual stimuli but with arbitrary pairings during training. Our behavioral results confirmed the rapid acquisition of cross-sensory mappings, and the group trained with the Meijer algorithm was able to generalize their learning to novel soundscapes at impressive levels of accuracy. The ERP results revealed an early cross-sensory learning effect (150-210 msec) that was significantly enhanced in the algorithm-trained group compared with the control group as well as a later difference (420-480 msec) that was unique to the algorithm-trained group. These ERP modulations are consistent with previous fMRI results and provide additional insight into the time course of cross-sensory information transfer in sensory substitution.
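
    The Meijer-style conversion referred to above scans an image column by column, mapping vertical position to pitch and pixel brightness to loudness. The function below is a drastically simplified illustration of that idea; the frequency range, scan duration, and sampling rate are arbitrary assumptions, and this is not the published algorithm itself.

      # Simplified column-by-column image-to-soundscape mapping: row -> frequency,
      # brightness -> amplitude. Parameter values are arbitrary assumptions.
      import numpy as np

      def image_to_soundscape(image, scan_seconds=1.0, sample_rate=22050,
                              f_low=500.0, f_high=5000.0):
          n_rows, n_cols = image.shape
          samples_per_col = int(scan_seconds * sample_rate / n_cols)
          t = np.arange(samples_per_col) / sample_rate
          freqs = np.geomspace(f_low, f_high, n_rows)[::-1]   # top rows = high pitch
          columns = []
          for c in range(n_cols):
              tones = image[:, c, None] * np.sin(2 * np.pi * freqs[:, None] * t)
              columns.append(tones.sum(axis=0))
          sound = np.concatenate(columns)
          return sound / (np.max(np.abs(sound)) + 1e-12)

      shape = np.zeros((16, 16))
      shape[4:12, 4:12] = 1.0                  # a bright square on a dark background
      waveform = image_to_soundscape(shape)    # left-to-right scan, roughly 1 s long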

  7. Strain differences of the effect of enucleation and anophthalmia on the size and growth of sensory cortices in mice.

    PubMed

    Massé, Ian O; Guillemette, Sonia; Laramée, Marie-Eve; Bronchti, Gilles; Boire, Denis

    2014-11-07

    Anophthalmia is a condition in which the eye does not develop from the early embryonic period. Early blindness induces cross-modal plastic modifications in the brain, such as auditory and haptic activations of the visual cortex, and also leads to a greater solicitation of the somatosensory and auditory cortices. The visual cortex is activated by auditory stimuli in anophthalmic mice, and activity is known to alter the growth pattern of the cerebral cortex. The sizes of the primary visual, auditory and somatosensory cortices and of the corresponding specific sensory thalamic nuclei were measured in intact and enucleated C57Bl/6J mice and in ZRDCT anophthalmic mice (ZRDCT/An) to evaluate the contribution of cross-modal activity to the growth of the cerebral cortex. In addition, the sizes of these structures were compared in intact, enucleated and anophthalmic fourth-generation backcrossed hybrid C57Bl/6J×ZRDCT/An mice to parse out the effects of mouse strain and of the different visual deprivations. The visual cortex was smaller in the anophthalmic ZRDCT/An than in the intact and enucleated C57Bl/6J mice. The auditory cortex was also larger, and the somatosensory cortex smaller, in the ZRDCT/An than in the intact and enucleated C57Bl/6J mice. The size differences of sensory cortices between the enucleated and anophthalmic mice were no longer present in the hybrid mice, showing specific genetic differences between C57Bl/6J and ZRDCT mice. The postnatal size increase of the visual cortex was smaller in the enucleated than in the anophthalmic and intact hybrid mice. This suggests differences in the activity of the visual cortex between enucleated and anophthalmic mice, and that early in-utero spontaneous neural activity in the visual system contributes to the shaping of functional properties of cortical networks. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. The visual cognitive network, but not the visual sensory network, is affected in amnestic mild cognitive impairment: a study of brain oscillatory responses.

    PubMed

    Yener, Görsev G; Emek-Savaş, Derya Durusu; Güntekin, Bahar; Başar, Erol

    2014-10-17

    Mild Cognitive Impairment (MCI) is considered by many to be a prodromal stage of Alzheimer's disease (AD). Event-related oscillations (ERO) reflect cognitive responses of the brain, whereas sensory-evoked oscillations (SEO) reflect sensory responses. For this study, we compared visual SEO and ERO responses in MCI to explore brain dynamics (BACKGROUND). Forty-three patients with MCI (mean age=74.0 years) and 41 age- and education-matched healthy elderly controls (HC) (mean age=71.1 years) participated in the study. The maximum peak-to-peak amplitudes of each subject's averaged delta response (0.5-3.0 Hz) were measured for two conditions (simple visual stimulation and target stimulation in a classical visual oddball paradigm) (METHOD). Overall, amplitudes of target ERO responses were higher than SEO amplitudes. The preferential location for maximum amplitude values was the frontal lobe for ERO and the occipital lobe for SEO. The ANOVA for delta responses showed a significant group × paradigm interaction. Post-hoc tests indicated that (1) the difference between groups was significant for target delta responses, but not for SEO, (2) ERO elicited higher responses for HC than for MCI patients, and (3) females had higher target ERO than males, a difference that was pronounced in the control group (RESULTS). Overall, cognitive responses display almost double the amplitudes of sensory responses over frontal regions. The topography of oscillatory responses differs depending on the stimuli: visual sensory responses are highest over occipital regions and cognitive responses over frontal regions. A group effect is observed in MCI, indicating that visual sensory and cognitive circuits behave differently, with preserved visual sensory responses but decreased cognitive responses (CONCLUSION). Copyright © 2014 Elsevier B.V. All rights reserved.
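
    The delta-response measure used here (the maximum peak-to-peak amplitude of the 0.5-3.0 Hz filtered average) can be sketched as follows. The sampling rate, filter order, and simulated signal are assumptions for illustration only, not the study's exact settings.

      # Sketch: band-pass an averaged response to the delta range (0.5-3.0 Hz)
      # and take the maximum peak-to-peak amplitude. All values are assumed.
      import numpy as np
      from scipy.signal import butter, sosfiltfilt

      fs = 500.0                                    # assumed sampling rate (Hz)
      t = np.arange(0, 1.0, 1 / fs)
      erp = 5e-6 * np.sin(2 * np.pi * 2.0 * t) + 1e-6 * np.random.randn(t.size)

      sos = butter(4, [0.5, 3.0], btype="bandpass", fs=fs, output="sos")
      delta = sosfiltfilt(sos, erp)

      peak_to_peak_amplitude = delta.max() - delta.min()
      print(peak_to_peak_amplitude)                 # volts (microvolt scale here)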

  9. Spontaneous cortical activity alternates between motifs defined by regional axonal projections

    PubMed Central

    Mohajerani, Majid H.; Chan, Allen W.; Mohsenvand, Mostafa; LeDue, Jeffrey; Liu, Rui; McVea, David A.; Boyd, Jamie D.; Wang, Yu Tian; Reimers, Mark; Murphy, Timothy H.

    2014-01-01

    In lightly anaesthetized or awake adult mice, using millisecond-timescale voltage-sensitive dye imaging, we show that a palette of sensory-evoked and hemisphere-wide activity motifs is represented in spontaneous activity. These motifs can reflect multiple modes of sensory processing, including vision, audition, and touch. Similar cortical networks were found with direct cortical activation using channelrhodopsin-2. Regional analysis of activity spread indicated modality-specific sources such as primary sensory areas, a common posterior-medial cortical sink where sensory activity was extinguished within the parietal association area, and a secondary anterior-medial sink within the cingulate/secondary motor cortices for visual stimuli. Correlation analysis between functional circuits and intracortical axonal projections indicated a common framework corresponding to long-range mono-synaptic connections between cortical regions. Maps of intracortical mono-synaptic structural connections predicted hemisphere-wide patterns of spontaneous and sensory-evoked depolarization. We suggest that an intracortical monosynaptic connectome shapes the ebb and flow of spontaneous cortical activity. PMID:23974708

  10. The Role of Ribbons at Sensory Synapses

    PubMed Central

    LoGiudice, Lisamarie; Matthews, Gary

    2009-01-01

    Synaptic ribbons are organelles that tether vesicles at the presynaptic active zones of sensory neurons in the visual, auditory and vestibular systems. These neurons generate sustained, graded electrical signals in response to sensory stimuli, and fidelity of transmission therefore requires their synapses to release neurotransmitter continuously at high rates. It has long been thought that the ribbons at the active zones of sensory synapses accomplish this task by enhancing the size and accessibility of the readily releasable pool of synaptic vesicles, which may represent the vesicles attached to the ribbon. Recent evidence suggests that synaptic ribbons immobilize vesicles in the resting cell and coordinate the transient, synchronous release of vesicles in response to stimulation, but it is not yet clear how the ribbon can efficiently mobilize and coordinate multiple vesicles for release. However, detailed anatomical, electrophysiological and optical studies have begun to reveal the mechanics of release at ribbon synapses, and this multidisciplinary approach promises to reconcile structure, function, and mechanism at these important sensory synapses. PMID:19264728

  11. Accurate metacognition for visual sensory memory representations.

    PubMed

    Vandenbroucke, Annelinde R E; Sligte, Ilja G; Barrett, Adam B; Seth, Anil K; Fahrenfort, Johannes J; Lamme, Victor A F

    2014-04-01

    The capacity to attend to multiple objects in the visual field is limited. However, introspectively, people feel that they see the whole visual world at once. Some scholars suggest that this introspective feeling is based on short-lived sensory memory representations, whereas others argue that the feeling of seeing more than can be attended to is illusory. Here, we investigated this phenomenon by combining objective memory performance with subjective confidence ratings during a change-detection task. This allowed us to compute a measure of metacognition (the degree of knowledge that subjects have about the correctness of their decisions) for different stages of memory. We show that subjects store more objects in sensory memory than they can attend to but, at the same time, have similar metacognition for sensory memory and working memory representations. This suggests that these subjective impressions are not an illusion but accurate reflections of the richness of visual perception.
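
    A simple way to express "knowledge about the correctness of one's own decisions" is a type-2 analysis relating confidence to accuracy. The sketch below computes a type-2 ROC area from simulated trials; it is a generic illustration, not the specific metacognition measure computed in the study.

      # Generic type-2 (metacognitive) sketch: how well confidence ratings
      # discriminate correct from incorrect responses. Data are simulated.
      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(3)
      n_trials = 200
      correct = rng.integers(0, 2, n_trials)        # 1 = correct decision
      confidence = rng.normal(loc=2.0 + correct, scale=1.0, size=n_trials)

      type2_auroc = roc_auc_score(correct, confidence)
      print(type2_auroc)   # 0.5 = no metacognitive insight, 1.0 = perfect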

  12. The sensory components of high-capacity iconic memory and visual working memory.

    PubMed

    Bradley, Claire; Pearson, Joel

    2012-01-01

    Early visual memory can be split into two primary components: a high-capacity, short-lived iconic memory followed by a limited-capacity visual working memory that can last many seconds. Whereas a large number of studies have investigated visual working memory for low-level sensory features, much research on iconic memory has used more "high-level" alphanumeric stimuli such as letters or numbers. These two forms of memory are typically examined separately, despite an intrinsic overlap in their characteristics. Here, we used a purely sensory paradigm to examine visual short-term memory for 10 homogeneous items of three different visual features (color, orientation and motion) across a range of durations from 0 to 6 s. We found that the amount of information stored in iconic memory is smaller for motion than for color or orientation. Performance declined exponentially with longer storage durations and reached chance levels after ∼2 s. Further experiments showed that performance for the 10 items at 1 s was contingent on unperturbed attentional resources. In addition, for orientation stimuli, performance was contingent on the location of stimuli in the visual field, especially for short cue delays. Overall, our results suggest a smooth transition between an automatic, high-capacity, feature-specific sensory-iconic memory, and an effortful "lower-capacity" visual working memory.

  13. Modeling of Explorative Procedures for Remote Object Identification

    DTIC Science & Technology

    1991-09-01

    haptic sensory system and the simulated foveal component of the visual system. Eventually it will allow multiple applications in remote sensing and...superposition of sensory channels. The use of a force reflecting telemanipulator and computer simulated visual foveal component are the tools which...representation of human search models is achieved by using the proprioceptive component of the haptic sensory system and the simulated foveal component of the

  14. Sensory Substitution: The Spatial Updating of Auditory Scenes “Mimics” the Spatial Updating of Visual Scenes

    PubMed Central

    Pasqualotto, Achille; Esenkaya, Tayfun

    2016-01-01

    Visual-to-auditory sensory substitution is used to convey visual information through audition, and it was initially created to compensate for blindness; it consists of software converting the visual images captured by a video-camera into the equivalent auditory images, or “soundscapes”. Here, it was used by blindfolded sighted participants to learn the spatial position of simple shapes depicted in images arranged on the floor. Very few studies have used sensory substitution to investigate spatial representation, while it has been widely used to investigate object recognition. Additionally, with sensory substitution we could study the performance of participants actively exploring the environment through audition, rather than passively localizing sound sources. Blindfolded participants egocentrically learnt the position of six images by using sensory substitution and then a judgment of relative direction task (JRD) was used to determine how this scene was represented. This task consists of imagining being in a given location, oriented in a given direction, and pointing towards the required image. Before performing the JRD task, participants explored a map that provided allocentric information about the scene. Although spatial exploration was egocentric, surprisingly we found that performance in the JRD task was better for allocentric perspectives. This suggests that the egocentric representation of the scene was updated. This result is in line with previous studies using visual and somatosensory scenes, thus supporting the notion that different sensory modalities produce equivalent spatial representation(s). Moreover, our results have practical implications to improve training methods with sensory substitution devices (SSD). PMID:27148000

  15. Sensory experience ratings (SERs) for 1,659 French words: Relationships with other psycholinguistic variables and visual word recognition.

    PubMed

    Bonin, Patrick; Méot, Alain; Ferrand, Ludovic; Bugaïska, Aurélia

    2015-09-01

    We collected sensory experience ratings (SERs) for 1,659 French words in adults. Sensory experience for words is a recently introduced variable that corresponds to the degree to which words elicit sensory and perceptual experiences (Juhasz & Yap Behavior Research Methods, 45, 160-168, 2013; Juhasz, Yap, Dicke, Taylor, & Gullick Quarterly Journal of Experimental Psychology, 64, 1683-1691, 2011). The relationships of the sensory experience norms with other psycholinguistic variables (e.g., imageability and age of acquisition) were analyzed. We also investigated the degree to which SER predicted performance in visual word recognition tasks (lexical decision, word naming, and progressive demasking). The analyses indicated that SER reliably predicted response times in lexical decision, but not in word naming or progressive demasking. The findings are discussed in relation to the status of SER, the role of semantic code activation in visual word recognition, and the embodied view of cognition.

  16. Action preparation modulates sensory perception in unseen personal space: An electrophysiological investigation.

    PubMed

    Job, Xavier E; de Fockert, Jan W; van Velzen, José

    2016-08-01

    Behavioural and electrophysiological evidence has demonstrated that preparation of goal-directed actions modulates sensory perception at the goal location before the action is executed. However, previous studies have focused on sensory perception in areas of peripersonal space. The present study investigated visual and tactile sensory processing at the goal location of upcoming movements towards the body, much of which is not visible, as well as visible peripersonal space. A motor task cued participants to prepare a reaching movement towards goals either in peripersonal space in front of them or personal space on the upper chest. In order to assess modulations of sensory perception during movement preparation, event-related potentials (ERPs) were recorded in response to task-irrelevant visual and tactile probe stimuli delivered randomly at one of the goal locations of the movements. In line with previous neurophysiological findings, movement preparation modulated visual processing at the goal of a movement in peripersonal space. Movement preparation also modulated somatosensory processing at the movement goal in personal space. The findings demonstrate that tactile perception in personal space is subject to similar top-down sensory modulation by motor preparation as observed for visual stimuli presented in peripersonal space. These findings show for the first time that the principles and mechanisms underlying adaptive modulation of sensory processing in the context of action extend to tactile perception in unseen personal space. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Saccadic Eye Movements Impose a Natural Bottleneck on Visual Short-Term Memory

    ERIC Educational Resources Information Center

    Ohl, Sven; Rolfs, Martin

    2017-01-01

    Visual short-term memory (VSTM) is a crucial repository of information when events unfold rapidly before our eyes, yet it maintains only a fraction of the sensory information encoded by the visual system. Here, we tested the hypothesis that saccadic eye movements provide a natural bottleneck for the transition of fragile content in sensory memory…

  18. Linking express saccade occurrence to stimulus properties and sensorimotor integration in the superior colliculus.

    PubMed

    Marino, Robert A; Levy, Ron; Munoz, Douglas P

    2015-08-01

    Express saccades represent the fastest possible eye movements to visual targets with reaction times that approach minimum sensory-motor conduction delays. Previous work in monkeys has identified two specific neural signals in the superior colliculus (SC: a midbrain sensorimotor integration structure involved in gaze control) that are required to execute express saccades: 1) previsual activity consisting of a low-frequency increase in action potentials in sensory-motor neurons immediately before the arrival of a visual response; and 2) a transient visual-sensory response consisting of a high-frequency burst of action potentials in visually responsive neurons resulting from the appearance of a visual target stimulus. To better understand how these two neural signals interact to produce express saccades, we manipulated the arrival time and magnitude of visual responses in the SC by altering target luminance and we examined the corresponding influences on SC activity and express saccade generation. We recorded from saccade neurons with visual-, motor-, and previsual-related activity in the SC of monkeys performing the gap saccade task while target luminance was systematically varied between 0.001 and 42.5 cd/m² against a black background (∼0.0001 cd/m²). Our results demonstrated that 1) express saccade latencies were linked directly to the arrival time in the SC of visual responses produced by abruptly appearing visual stimuli; 2) express saccades were generated toward both dim and bright targets whenever sufficient previsual activity was present; and 3) target luminance altered the likelihood of producing an express saccade. When an express saccade was generated, visuomotor neurons increased their activity immediately before the arrival of the visual response in the SC and saccade initiation. Furthermore, the visual and motor responses of visuomotor neurons merged into a single burst of action potentials, while the visual response of visual-only neurons was unaffected. A linear combination model was used to test which SC signals best predicted the likelihood of producing an express saccade. In addition to visual response magnitude and previsual activity of saccade neurons, the model identified presaccadic activity (activity occurring during the 30-ms epoch immediately before saccade initiation) as a third important signal for predicting express saccades. We conclude that express saccades can be predicted by visual, previsual, and presaccadic signals recorded from visuomotor neurons in the intermediate layers of the SC. Copyright © 2015 the American Physiological Society.
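    The linear combination model is described only qualitatively above; as a rough illustration of the idea, the sketch below fits a logistic regression that predicts express-saccade occurrence from three trial-wise signals (previsual activity, visual response magnitude, presaccadic activity). All variable names and data are hypothetical, and logistic regression is only one way to combine the signals linearly.

    ```python
    # Sketch of a linear-combination approach: predict express-saccade occurrence
    # from trial-wise SC signals. Data and coefficients are simulated.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    n_trials = 500
    previsual = rng.normal(size=n_trials)      # low-frequency pre-target activity
    visual_mag = rng.normal(size=n_trials)     # visual burst magnitude
    presaccadic = rng.normal(size=n_trials)    # activity in the 30 ms before saccade

    # Hypothetical ground truth: all three signals contribute.
    logit = 0.8 * previsual + 1.2 * visual_mag + 0.6 * presaccadic
    express = rng.random(n_trials) < 1 / (1 + np.exp(-logit))

    X = np.column_stack([previsual, visual_mag, presaccadic])
    model = LogisticRegression().fit(X, express)
    print(dict(zip(["previsual", "visual", "presaccadic"], model.coef_[0].round(2))))
    ```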

  19. Linking express saccade occurrence to stimulus properties and sensorimotor integration in the superior colliculus

    PubMed Central

    Levy, Ron; Munoz, Douglas P.

    2015-01-01

    Express saccades represent the fastest possible eye movements to visual targets with reaction times that approach minimum sensory-motor conduction delays. Previous work in monkeys has identified two specific neural signals in the superior colliculus (SC: a midbrain sensorimotor integration structure involved in gaze control) that are required to execute express saccades: 1) previsual activity consisting of a low-frequency increase in action potentials in sensory-motor neurons immediately before the arrival of a visual response; and 2) a transient visual-sensory response consisting of a high-frequency burst of action potentials in visually responsive neurons resulting from the appearance of a visual target stimulus. To better understand how these two neural signals interact to produce express saccades, we manipulated the arrival time and magnitude of visual responses in the SC by altering target luminance and we examined the corresponding influences on SC activity and express saccade generation. We recorded from saccade neurons with visual-, motor-, and previsual-related activity in the SC of monkeys performing the gap saccade task while target luminance was systematically varied between 0.001 and 42.5 cd/m2 against a black background (∼0.0001 cd/m2). Our results demonstrated that 1) express saccade latencies were linked directly to the arrival time in the SC of visual responses produced by abruptly appearing visual stimuli; 2) express saccades were generated toward both dim and bright targets whenever sufficient previsual activity was present; and 3) target luminance altered the likelihood of producing an express saccade. When an express saccade was generated, visuomotor neurons increased their activity immediately before the arrival of the visual response in the SC and saccade initiation. Furthermore, the visual and motor responses of visuomotor neurons merged into a single burst of action potentials, while the visual response of visual-only neurons was unaffected. A linear combination model was used to test which SC signals best predicted the likelihood of producing an express saccade. In addition to visual response magnitude and previsual activity of saccade neurons, the model identified presaccadic activity (activity occurring during the 30-ms epoch immediately before saccade initiation) as a third important signal for predicting express saccades. We conclude that express saccades can be predicted by visual, previsual, and presaccadic signals recorded from visuomotor neurons in the intermediate layers of the SC. PMID:26063770

  20. Sensory Eye Dominance in Treated Anisometropic Amblyopia

    PubMed Central

    Chen, Yao

    2017-01-01

    Amblyopia results from inadequate visual experience during the critical period of visual development. Abnormal binocular interactions are believed to play a critical role in amblyopia. These binocular deficits can often be resolved, owing to the residual visual plasticity in amblyopes. In this study, we quantitatively measured the sensory eye dominance in treated anisometropic amblyopes to determine whether they had fully recovered. Fourteen treated anisometropic amblyopes with normal or corrected to normal visual acuity participated, and their sensory eye dominance was assessed by using a binocular phase combination paradigm. We found that the two eyes were unequal in binocular combination in most (11 out of 14) of our treated anisometropic amblyopes, but none of the controls. We concluded that the treated anisometropic amblyopes, even those with a normal range of visual acuity, exhibited abnormal binocular processing. Our results thus suggest that there is potential for improvement in treated anisometropic amblyopes that may further enhance their binocular visual functioning. PMID:28573051

  1. Common Sense in Choice: The Effect of Sensory Modality on Neural Value Representations.

    PubMed

    Shuster, Anastasia; Levy, Dino J

    2018-01-01

    Although it is well established that the ventromedial prefrontal cortex (vmPFC) represents value using a common currency across categories of rewards, it is unknown whether the vmPFC represents value irrespective of the sensory modality in which alternatives are presented. In the current study, male and female human subjects completed a decision-making task while their neural activity was recorded using functional magnetic resonance imaging. On each trial, subjects chose between a safe alternative and a lottery, which was presented visually or aurally. A univariate conjunction analysis revealed that the anterior portion of the vmPFC tracks subjective value (SV) irrespective of the sensory modality. Using a novel cross-modality multivariate classifier, we were able to decode auditory value based on visual trials and vice versa. In addition, we found that the visual and auditory sensory cortices, which were identified using functional localizers, are also sensitive to the value of stimuli, albeit in a modality-specific manner. Whereas both primary and higher-order auditory cortices represented auditory SV (aSV), only a higher-order visual area represented visual SV (vSV). These findings expand our understanding of the common currency network of the brain and shed a new light on the interplay between sensory and value information processing.
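    The cross-modality classification described above can be sketched as follows: train a classifier on activity patterns from visual trials and test it on auditory trials, and vice versa. The feature matrices below are simulated stand-ins for vmPFC voxel patterns, so only the cross-modal train/test logic mirrors the study.

    ```python
    # Sketch of cross-modality decoding of value (low vs. high) with simulated
    # activity patterns; a shared "value axis" makes transfer possible.
    import numpy as np
    from sklearn.svm import LinearSVC

    rng = np.random.default_rng(2)
    n, n_voxels = 120, 50
    value_visual = rng.integers(0, 2, n)        # low vs. high subjective value
    value_auditory = rng.integers(0, 2, n)
    shared_axis = rng.normal(size=n_voxels)     # value coded on a common axis

    X_visual = np.outer(value_visual, shared_axis) + rng.normal(0, 1.0, (n, n_voxels))
    X_auditory = np.outer(value_auditory, shared_axis) + rng.normal(0, 1.0, (n, n_voxels))

    clf = LinearSVC().fit(X_visual, value_visual)        # train on visual trials
    print("visual -> auditory accuracy:", clf.score(X_auditory, value_auditory))
    clf = LinearSVC().fit(X_auditory, value_auditory)    # and the reverse
    print("auditory -> visual accuracy:", clf.score(X_visual, value_visual))
    ```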

  2. Common Sense in Choice: The Effect of Sensory Modality on Neural Value Representations

    PubMed Central

    2018-01-01

    Abstract Although it is well established that the ventromedial prefrontal cortex (vmPFC) represents value using a common currency across categories of rewards, it is unknown whether the vmPFC represents value irrespective of the sensory modality in which alternatives are presented. In the current study, male and female human subjects completed a decision-making task while their neural activity was recorded using functional magnetic resonance imaging. On each trial, subjects chose between a safe alternative and a lottery, which was presented visually or aurally. A univariate conjunction analysis revealed that the anterior portion of the vmPFC tracks subjective value (SV) irrespective of the sensory modality. Using a novel cross-modality multivariate classifier, we were able to decode auditory value based on visual trials and vice versa. In addition, we found that the visual and auditory sensory cortices, which were identified using functional localizers, are also sensitive to the value of stimuli, albeit in a modality-specific manner. Whereas both primary and higher-order auditory cortices represented auditory SV (aSV), only a higher-order visual area represented visual SV (vSV). These findings expand our understanding of the common currency network of the brain and shed a new light on the interplay between sensory and value information processing. PMID:29619408

  3. Locomotor sensory organization test: a novel paradigm for the assessment of sensory contributions in gait.

    PubMed

    Chien, Jung Hung; Eikema, Diderik-Jan Anthony; Mukherjee, Mukul; Stergiou, Nicholas

    2014-12-01

    Feedback based balance control requires the integration of visual, proprioceptive and vestibular input to detect the body's movement within the environment. When the accuracy of sensory signals is compromised, the system reorganizes the relative contributions through a process of sensory recalibration, for upright postural stability to be maintained. Whereas this process has been studied extensively in standing using the Sensory Organization Test (SOT), less is known about these processes in more dynamic tasks such as locomotion. In the present study, ten healthy young adults performed the six conditions of the traditional SOT to quantify standing postural control when exposed to sensory conflict. The same subjects performed these six conditions using a novel experimental paradigm, the Locomotor SOT (LSOT), to study dynamic postural control during walking under similar types of sensory conflict. To quantify postural control during walking, the net Center of Pressure sway variability was used. This corresponds to the Performance Index of the center of pressure trajectory, which is used to quantify postural control during standing. Our results indicate that dynamic balance control during locomotion in healthy individuals is affected by the systematic manipulation of multisensory inputs. The sway variability patterns observed during locomotion reflect similar balance performance with standing posture, indicating that similar feedback processes may be involved. However, the contribution of visual input is significantly increased during locomotion, compared to standing in similar sensory conflict conditions. The increased visual gain in the LSOT conditions reflects the importance of visual input for the control of locomotion. Since balance perturbations tend to occur in dynamic tasks and in response to environmental constraints not present during the SOT, the LSOT may provide additional information for clinical evaluation on healthy and deficient sensory processing.
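    A minimal sketch of a net center-of-pressure (COP) sway-variability measure of the kind referenced above, computed here as the standard deviation of the resultant COP displacement about its mean; the study's exact Performance Index computation may differ, and the COP trace below is simulated.

    ```python
    import numpy as np

    def net_cop_sway_variability(cop_ap, cop_ml):
        """SD of the resultant COP displacement about the mean COP position."""
        ap = cop_ap - np.mean(cop_ap)
        ml = cop_ml - np.mean(cop_ml)
        return np.std(np.hypot(ap, ml))

    rng = np.random.default_rng(4)
    cop_ap = np.cumsum(rng.normal(0, 0.05, 6000))   # toy random-walk COP, anterior-posterior
    cop_ml = np.cumsum(rng.normal(0, 0.05, 6000))   # toy random-walk COP, medio-lateral
    print(round(net_cop_sway_variability(cop_ap, cop_ml), 2))
    ```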

  4. Which Aspects of Visual Attention Are Changed by Deafness? The Case of the Attentional Network Test

    ERIC Educational Resources Information Center

    Dye, Matthew W. G.; Baril, Dara E.; Bavelier, Daphne

    2007-01-01

    The loss of one sensory modality can lead to a reorganization of the other intact sensory modalities. In the case of individuals who are born profoundly deaf, there is growing evidence of changes in visual functions. Specifically, deaf individuals demonstrate enhanced visual processing in the periphery, and in particular enhanced peripheral visual…

  5. Orienting attention to visual or verbal/auditory imagery differentially impairs the processing of visual stimuli.

    PubMed

    Villena-González, Mario; López, Vladimir; Rodríguez, Eugenio

    2016-05-15

    When attention is oriented toward inner thoughts, as spontaneously occurs during mind wandering, the processing of external information is attenuated. However, the potential effects of thought content on sensory attenuation are still unknown. The present study aims to assess if the representational format of thoughts, such as visual imagery or inner speech, might differentially affect the sensory processing of external stimuli. We recorded the brain activity of 20 participants (12 women) while they were exposed to a probe visual stimulus in three different conditions: executing a task on the visual probe (externally oriented attention), and two conditions involving inward-turned attention, i.e., generating inner speech and performing visual imagery. Event-related potential results showed that the P1 amplitude, related to the sensory response, was significantly attenuated during both tasks involving inward attention compared with the external task. When both representational formats were compared, the visual imagery condition showed stronger attenuation in sensory processing than the inner speech condition. Alpha power in visual areas was measured as an index of cortical inhibition. Larger alpha amplitude was found when participants engaged in internal thought than during the external task, with visual imagery showing even more alpha power than the inner speech condition. Our results show, for the first time to our knowledge, that visual attentional processing of external stimuli during self-generated thoughts is differentially affected by the representational format of the ongoing train of thoughts. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Maintenance of relational information in working memory leads to suppression of the sensory cortex.

    PubMed

    Ikkai, Akiko; Blacker, Kara J; Lakshmanan, Balaji M; Ewen, Joshua B; Courtney, Susan M

    2014-10-15

    Working memory (WM) for sensory-based information about individual objects and their locations appears to involve interactions between lateral prefrontal and sensory cortexes. The mechanisms and representations for maintenance of more abstract, nonsensory information in WM are unknown, particularly whether such actively maintained information can become independent of the sensory information from which it was derived. Previous studies of WM for individual visual items found increased electroencephalogram (EEG) alpha (8-13 Hz) power over posterior electrode sites, which appears to correspond to the suppression of cortical areas that represent irrelevant sensory information. Here, we recorded EEG while participants performed a visual WM task that involved maintaining either concrete spatial coordinates or abstract relational information. Maintenance of relational information resulted in higher alpha power in posterior electrodes. Furthermore, lateralization of alpha power due to a covert shift of attention to one visual hemifield was marginally weaker during storage of relational information than during storage of concrete information. These results suggest that abstract relational information is maintained in WM differently from concrete, sensory representations and that during maintenance of abstract information, posterior sensory regions become task irrelevant and are thus suppressed. Copyright © 2014 the American Physiological Society.
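    Posterior alpha (8-13 Hz) power of the kind reported above is typically estimated from the EEG power spectrum, and a hemispheric lateralization index can then be formed from left- and right-posterior values. The sketch below shows one standard way to do this on a toy signal; channel data and the sampling rate are placeholders.

    ```python
    # Sketch: alpha-band power via Welch's method plus a simple asymmetry index.
    import numpy as np
    from scipy.signal import welch

    def alpha_power(eeg, fs, band=(8.0, 13.0)):
        """Integrate the Welch PSD over the alpha band."""
        freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return np.trapz(psd[mask], freqs[mask])

    fs = 250.0                                    # assumed sampling rate (Hz)
    t = np.arange(0, 4.0, 1 / fs)
    left_posterior = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
    right_posterior = 0.7 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

    p_left, p_right = alpha_power(left_posterior, fs), alpha_power(right_posterior, fs)
    lateralization = (p_left - p_right) / (p_left + p_right)   # simple asymmetry index
    print(round(p_left, 3), round(p_right, 3), round(lateralization, 2))
    ```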

  7. Using Innate Visual Biases to Guide Face Learning in Natural Scenes: A Computational Investigation

    ERIC Educational Resources Information Center

    Balas, Benjamin

    2010-01-01

    Newborn infants appear to possess an innate bias that guides preferential orienting to and tracking of human faces. There is, however, no clear agreement as to the underlying mechanism supporting such a preference. In particular, two competing theories (known as the "structural" and "sensory" hypotheses) conjecture fundamentally different biasing…

  8. Molecular Mechanisms at the Basis of Plasticity in the Developing Visual Cortex: Epigenetic Processes and Gene Programs

    PubMed Central

    Maya-Vetencourt, José Fernando; Pizzorusso, Tommaso

    2013-01-01

    Neuronal circuitries in the mammalian visual system change as a function of experience. Sensory experience modifies neuronal networks connectivity via the activation of different physiological processes such as excitatory/inhibitory synaptic transmission, neurotrophins, and signaling of extracellular matrix molecules. Long-lasting phenomena of plasticity occur when intracellular signal transduction pathways promote epigenetic alterations of chromatin structure that regulate the induction of transcription factors that in turn drive the expression of downstream targets, the products of which then work via the activation of structural and functional mechanisms that modify synaptic connectivity. Here, we review recent findings in the field of visual cortical plasticity while focusing on how physiological mechanisms associated with experience promote structural changes that determine functional modifications of neural circuitries in V1. We revise the role of microRNAs as molecular transducers of environmental stimuli and the role of immediate early genes that control gene expression programs underlying plasticity in the developing visual cortex. PMID:25157210

  9. Sensori-motor experience leads to changes in visual processing in the developing brain.

    PubMed

    James, Karin Harman

    2010-03-01

    Since Broca's studies on language processing, cortical functional specialization has been considered to be integral to efficient neural processing. A fundamental question in cognitive neuroscience concerns the type of learning that is required for functional specialization to develop. To address this issue with respect to the development of neural specialization for letters, we used functional magnetic resonance imaging (fMRI) to compare brain activation patterns in pre-school children before and after different letter-learning conditions: a sensori-motor group practised printing letters during the learning phase, while the control group practised visual recognition. Results demonstrated an overall left-hemisphere bias for processing letters in these pre-literate participants, but, more interestingly, showed enhanced blood oxygen-level-dependent activation in the visual association cortex during letter perception only after sensori-motor (printing) learning. It is concluded that sensori-motor experience augments processing in the visual system of pre-school children. The change of activation in these neural circuits provides important evidence that 'learning-by-doing' can lay the foundation for, and potentially strengthen, the neural systems used for visual letter recognition.

  10. Multimodal Neuroimaging in Schizophrenia: Description and Dissemination.

    PubMed

    Aine, C J; Bockholt, H J; Bustillo, J R; Cañive, J M; Caprihan, A; Gasparovic, C; Hanlon, F M; Houck, J M; Jung, R E; Lauriello, J; Liu, J; Mayer, A R; Perrone-Bizzozero, N I; Posse, S; Stephen, J M; Turner, J A; Clark, V P; Calhoun, Vince D

    2017-10-01

    In this paper we describe an open-access collection of multimodal neuroimaging data in schizophrenia for release to the community. Data were acquired from approximately 100 patients with schizophrenia and 100 age-matched controls during rest as well as several task activation paradigms targeting a hierarchy of cognitive constructs. Neuroimaging data include structural MRI, functional MRI, diffusion MRI, MR spectroscopic imaging, and magnetoencephalography. For three of the hypothesis-driven projects, task activation paradigms were acquired on subsets of ~200 volunteers that examined a range of sensory and cognitive processes (e.g., auditory sensory gating, auditory/visual multisensory integration, visual transverse patterning). Neuropsychological data were also acquired, and genetic material was collected via saliva samples from most of the participants and has been typed for both genome-wide polymorphism data and genome-wide methylation data. Some results are also presented from the individual studies as well as from our data-driven multimodal analyses (e.g., multimodal examinations of network structure and network dynamics and multitask fMRI data analysis across projects). All data will be released through the Mind Research Network's collaborative informatics and neuroimaging suite (COINS).

  11. Higher-order neural processing tunes motion neurons to visual ecology in three species of hawkmoths.

    PubMed

    Stöckl, A L; O'Carroll, D; Warrant, E J

    2017-06-28

    To sample information optimally, sensory systems must adapt to the ecological demands of each animal species. These adaptations can occur peripherally, in the anatomical structures of sensory organs and their receptors; and centrally, as higher-order neural processing in the brain. While a rich body of investigations has focused on peripheral adaptations, our understanding is sparse when it comes to central mechanisms. We quantified how peripheral adaptations in the eyes, and central adaptations in the wide-field motion vision system, set the trade-off between resolution and sensitivity in three species of hawkmoths active at very different light levels: nocturnal Deilephila elpenor, crepuscular Manduca sexta, and diurnal Macroglossum stellatarum. Using optical measurements and physiological recordings from the photoreceptors and wide-field motion neurons in the lobula complex, we demonstrate that all three species use spatial and temporal summation to improve visual performance in dim light. The diurnal Macroglossum relies least on summation, but can only see at brighter intensities. Manduca, with large sensitive eyes, relies less on neural summation than the smaller-eyed Deilephila, but both species attain similar visual performance at nocturnal light levels. Our results reveal how the visual systems of these three hawkmoth species are intimately matched to their visual ecologies. © 2017 The Author(s).

  12. Auditory biofeedback substitutes for loss of sensory information in maintaining stance.

    PubMed

    Dozza, Marco; Horak, Fay B; Chiari, Lorenzo

    2007-03-01

    The importance of sensory feedback for postural control in stance is evident from the balance improvements occurring when sensory information from the vestibular, somatosensory, and visual systems is available. However, the extent to which also audio-biofeedback (ABF) information can improve balance has not been determined. It is also unknown why additional artificial sensory feedback is more effective for some subjects than others and in some environmental contexts than others. The aim of this study was to determine the relative effectiveness of an ABF system to reduce postural sway in stance in healthy control subjects and in subjects with bilateral vestibular loss, under conditions of reduced vestibular, visual, and somatosensory inputs. This ABF system used a threshold region and non-linear scaling parameters customized for each individual, to provide subjects with pitch and volume coding of their body sway. ABF had the largest effect on reducing the body sway of the subjects with bilateral vestibular loss when the environment provided limited visual and somatosensory information; it had the smallest effect on reducing the sway of subjects with bilateral vestibular loss, when the environment provided full somatosensory information. The extent that all subjects substituted ABF information for their loss of sensory information was related to the extent that each subject was visually dependent or somatosensory-dependent for their postural control. Comparison of postural sway under a variety of sensory conditions suggests that patients with profound bilateral loss of vestibular function show larger than normal information redundancy among the remaining senses and ABF of trunk sway. The results support the hypothesis that the nervous system uses augmented sensory information differently depending both on the environment and on individual proclivities to rely on vestibular, somatosensory or visual information to control sway.
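    The pitch and volume coding described above can be sketched as a mapping from trunk sway to sound with a silent threshold region and nonlinear scaling. The threshold, exponent, and frequency range below are illustrative assumptions, not the study's individually customized parameters.

    ```python
    # Sketch: sway-to-audio mapping with a dead zone and nonlinear scaling.
    def sway_to_audio(sway_deg, threshold_deg=0.5, max_deg=4.0,
                      f_base=440.0, f_span=440.0, exponent=2.0):
        """Map trunk sway (degrees) to (frequency_hz, volume_0_to_1)."""
        excess = max(abs(sway_deg) - threshold_deg, 0.0)   # inside threshold: silence
        drive = min(excess / (max_deg - threshold_deg), 1.0) ** exponent  # nonlinear scaling
        frequency = f_base + f_span * drive
        return frequency, drive

    for sway in (0.2, 1.0, 2.5, 5.0):
        print(sway, sway_to_audio(sway))
    ```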

  13. Dynamic modulation of visual and electrosensory gains for locomotor control

    PubMed Central

    Sutton, Erin E.; Demir, Alican; Stamper, Sarah A.; Fortune, Eric S.; Cowan, Noah J.

    2016-01-01

    Animal nervous systems resolve sensory conflict for the control of movement. For example, the glass knifefish, Eigenmannia virescens, relies on visual and electrosensory feedback as it swims to maintain position within a moving refuge. To study how signals from these two parallel sensory streams are used in refuge tracking, we constructed a novel augmented reality apparatus that enables the independent manipulation of visual and electrosensory cues to freely swimming fish (n = 5). We evaluated the linearity of multisensory integration, the change to the relative perceptual weights given to vision and electrosense in relation to sensory salience, and the effect of the magnitude of sensory conflict on sensorimotor gain. First, we found that tracking behaviour obeys superposition of the sensory inputs, suggesting linear sensorimotor integration. In addition, fish rely more on vision when electrosensory salience is reduced, suggesting that fish dynamically alter sensorimotor gains in a manner consistent with Bayesian integration. However, the magnitude of sensory conflict did not significantly affect sensorimotor gain. These studies lay the theoretical and experimental groundwork for future work investigating multisensory control of locomotion. PMID:27170650
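    The Bayesian interpretation mentioned above is commonly formalized as reliability-weighted (inverse-variance) cue combination: as the electrosensory estimate becomes noisier, the weight on vision rises. A minimal worked example with illustrative numbers:

    ```python
    # Sketch: inverse-variance-weighted fusion of two position estimates.
    def combine(mu_vis, var_vis, mu_elec, var_elec):
        """Reliability-weighted combination of visual and electrosensory estimates."""
        w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_elec)
        mu = w_vis * mu_vis + (1 - w_vis) * mu_elec
        var = 1 / (1 / var_vis + 1 / var_elec)
        return mu, var, w_vis

    # Equal reliability: weights are balanced.
    print(combine(1.0, 0.2, 0.0, 0.2))
    # Reduced electrosensory salience (noisier estimate): vision dominates.
    print(combine(1.0, 0.2, 0.0, 1.0))
    ```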

  14. Persistent recruitment of somatosensory cortex during active maintenance of hand images in working memory.

    PubMed

    Galvez-Pol, A; Calvo-Merino, B; Capilla, A; Forster, B

    2018-07-01

    Working memory (WM) supports temporary maintenance of task-relevant information. This process is associated with persistent activity in the sensory cortex processing the information (e.g., visual stimuli activate visual cortex). However, we argue here that more multifaceted stimuli moderate this sensory-locked activity and recruit distinctive cortices. Specifically, perception of bodies recruits somatosensory cortex (SCx) beyond early visual areas (suggesting embodiment processes). Here we explore persistent activation in processing areas beyond the sensory cortex initially relevant to the modality of the stimuli. Using visual and somatosensory evoked-potentials in a visual WM task, we isolated different levels of visual and somatosensory involvement during encoding of body and non-body-related images. Persistent activity increased in SCx only when maintaining body images in WM, whereas visual/posterior regions' activity increased significantly when maintaining non-body images. Our results bridge WM and embodiment frameworks, supporting a dynamic WM process where the nature of the information summons specific processing resources. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. The Sensory Components of High-Capacity Iconic Memory and Visual Working Memory

    PubMed Central

    Bradley, Claire; Pearson, Joel

    2012-01-01

    Early visual memory can be split into two primary components: a high-capacity, short-lived iconic memory followed by a limited-capacity visual working memory that can last many seconds. Whereas a large number of studies have investigated visual working memory for low-level sensory features, much research on iconic memory has used more “high-level” alphanumeric stimuli such as letters or numbers. These two forms of memory are typically examined separately, despite an intrinsic overlap in their characteristics. Here, we used a purely sensory paradigm to examine visual short-term memory for 10 homogeneous items of three different visual features (color, orientation and motion) across a range of durations from 0 to 6 s. We found that the amount of information stored in iconic memory is smaller for motion than for color or orientation. Performance declined exponentially with longer storage durations and reached chance levels after ∼2 s. Further experiments showed that performance for the 10 items at 1 s was contingent on unperturbed attentional resources. In addition, for orientation stimuli, performance was contingent on the location of stimuli in the visual field, especially for short cue delays. Overall, our results suggest a smooth transition between an automatic, high-capacity, feature-specific sensory-iconic memory, and an effortful “lower-capacity” visual working memory. PMID:23055993

  16. Asymmetries of the human social brain in the visual, auditory and chemical modalities.

    PubMed

    Brancucci, Alfredo; Lucci, Giuliana; Mazzatenta, Andrea; Tommasi, Luca

    2009-04-12

    Structural and functional asymmetries are present in many regions of the human brain responsible for motor control, sensory and cognitive functions and communication. Here, we focus on hemispheric asymmetries underlying the domain of social perception, broadly conceived as the analysis of information about other individuals based on acoustic, visual and chemical signals. By means of these cues the brain establishes the border between 'self' and 'other', and interprets the surrounding social world in terms of the physical and behavioural characteristics of conspecifics essential for impression formation and for creating bonds and relationships. We show that, considered from the standpoint of single- and multi-modal sensory analysis, the neural substrates of the perception of voices, faces, gestures, smells and pheromones, as evidenced by modern neuroimaging techniques, are characterized by a general pattern of right-hemispheric functional asymmetry that might benefit from other aspects of hemispheric lateralization rather than constituting a true specialization for social information.

  17. Temporal Processing in the Olfactory System: Can We See a Smell?

    PubMed Central

    Gire, David H.; Restrepo, Diego; Sejnowski, Terrence J.; Greer, Charles; De Carlos, Juan A.; Lopez-Mascaraque, Laura

    2013-01-01

    Sensory processing circuits in the visual and olfactory systems receive input from complex, rapidly changing environments. Although patterns of light and plumes of odor create different distributions of activity in the retina and olfactory bulb, both structures use what appears on the surface similar temporal coding strategies to convey information to higher areas in the brain. We compare temporal coding in the early stages of the olfactory and visual systems, highlighting recent progress in understanding the role of time in olfactory coding during active sensing by behaving animals. We also examine studies that address the divergent circuit mechanisms that generate temporal codes in the two systems, and find that they provide physiological information directly related to functional questions raised by neuroanatomical studies of Ramon y Cajal over a century ago. Consideration of differences in neural activity in sensory systems contributes to generating new approaches to understand signal processing. PMID:23664611

  18. Touch to see: neuropsychological evidence of a sensory mirror system for touch.

    PubMed

    Bolognini, Nadia; Olgiati, Elena; Xaiz, Annalisa; Posteraro, Lucio; Ferraro, Francesco; Maravita, Angelo

    2012-09-01

    The observation of touch can be grounded in the activation of brain areas underpinning direct tactile experience, namely the somatosensory cortices. What is the behavioral impact of such a mirror sensory activity on visual perception? To address this issue, we investigated the causal interplay between observed and felt touch in right brain-damaged patients, as a function of their underlying damaged visual and/or tactile modalities. Patients and healthy controls underwent a detection task, comprising visual stimuli depicting touches or without a tactile component. Touch and No-touch stimuli were presented in egocentric or allocentric perspectives. Seeing touches, regardless of the viewing perspective, differently affects visual perception depending on which sensory modality is damaged: In patients with a selective visual deficit, but without any tactile defect, the sight of touch improves the visual impairment; this effect is associated with a lesion to the supramarginal gyrus. In patients with a tactile deficit, but intact visual perception, the sight of touch disrupts visual processing, inducing a visual extinction-like phenomenon. This disruptive effect is associated with the damage of the postcentral gyrus. Hence, a damage to the somatosensory system can lead to a dysfunctional visual processing, and an intact somatosensory processing can aid visual perception.

  19. The contribution of visual information to the perception of speech in noise with and without informative temporal fine structure

    PubMed Central

    Stacey, Paula C.; Kitterick, Pádraig T.; Morris, Saffron D.; Sumner, Christian J.

    2017-01-01

    Understanding what is said in demanding listening situations is assisted greatly by looking at the face of a talker. Previous studies have observed that normal-hearing listeners can benefit from this visual information when a talker's voice is presented in background noise. These benefits have also been observed in quiet listening conditions in cochlear-implant users, whose device does not convey the informative temporal fine structure cues in speech, and when normal-hearing individuals listen to speech processed to remove these informative temporal fine structure cues. The current study (1) characterised the benefits of visual information when listening in background noise; and (2) used sine-wave vocoding to compare the size of the visual benefit when speech is presented with or without informative temporal fine structure. The accuracy with which normal-hearing individuals reported words in spoken sentences was assessed across three experiments. The availability of visual information and informative temporal fine structure cues was varied within and across the experiments. The results showed that visual benefit was observed using open- and closed-set tests of speech perception. The size of the benefit increased when informative temporal fine structure cues were removed. This finding suggests that visual information may play an important role in the ability of cochlear-implant users to understand speech in many everyday situations. Models of audio-visual integration were able to account for the additional benefit of visual information when speech was degraded and suggested that auditory and visual information was being integrated in a similar way in all conditions. The modelling results were consistent with the notion that audio-visual benefit is derived from the optimal combination of auditory and visual sensory cues. PMID:27085797

  20. Cerebral Palsy for the Pediatric Eye Care Team Part III: Diagnosis and Management of Associated Visual and Sensory Disorders.

    PubMed

    Arnoldi, Kyle A; Pendarvis, Lauren; Jackson, Jorie; Batra, Noopur Nikki Agarwal

    2006-01-01

    Cerebral palsy (CP) is a term used to describe a spectrum of deficits of muscle tone and posture resulting from damage to the developing nervous system. Though considered a motor disorder, CP can be associated with disorders of the sensory visual pathway. This paper, the final in a series of three articles, will present frequency, diagnosis, and management of the visual and binocular vision deficits associated with CP. Topics for discussion will include the prevalence and etiology of decreased acuity, the effect of CP on sensory and motor fusion, and the response to treatment for these sensory deficits. A retrospective chart review of all cases of cerebral palsy referred to the St. Louis Children's Hospital Eye Center was done. Detailed data on the sensory and motor deficits documented in these children was collected. Also recorded was the management strategy and response to treatment. Of the 131 cases reviewed (mean age 5.2 years at presentation), 46% had decreased vision in at least one eye due to amblyopia (24%), optic nerve abnormality (16%), cortical visual impairment (14%), or a combination. Forty-nine (37%) had significant refractive error. Sixty-four percent of those with significant refractive error responded to spectacle correction. Forty-three percent of those with amblyopia responded to conventional therapies. Of the nonstrabismic patients, 89% demonstrated sensory fusion, 90% had stereopsis, and 91% had motor fusion. No patient lacking fusion or stereopsis prior to strabismus surgery gained these abilities with realignment of the eyes. While children with CP are capable of age-appropriate acuity and binocular vision, they are at increased risk for sensory visual deficits. These deficits are not the direct result of CP itself, but either share a common underlying cause, or occur as sequelae to the strabismus that is prevalent in CP. Most importantly, some sensory deficits may respond to standard treatment methods.

  1. Taking Attention Away from the Auditory Modality: Context-dependent Effects on Early Sensory Encoding of Speech.

    PubMed

    Xie, Zilong; Reetzke, Rachel; Chandrasekaran, Bharath

    2018-05-24

    Increasing visual perceptual load can reduce pre-attentive auditory cortical activity to sounds, a reflection of the limited and shared attentional resources for sensory processing across modalities. Here, we demonstrate that modulating visual perceptual load can impact the early sensory encoding of speech sounds, and that the impact of visual load is highly dependent on the predictability of the incoming speech stream. Participants (n = 20, 9 females) performed a visual search task of high (target similar to distractors) and low (target dissimilar to distractors) perceptual load, while early auditory electrophysiological responses were recorded to native speech sounds. Speech sounds were presented either in a 'repetitive context', or a less predictable 'variable context'. Independent of auditory stimulus context, pre-attentive auditory cortical activity was reduced during high visual load, relative to low visual load. We applied a data-driven machine learning approach to decode speech sounds from the early auditory electrophysiological responses. Decoding performance was found to be poorer under conditions of high (relative to low) visual load, when the incoming acoustic stream was predictable. When the auditory stimulus context was less predictable, decoding performance was substantially greater for the high (relative to low) visual load conditions. Our results provide support for shared attentional resources between visual and auditory modalities that substantially influence the early sensory encoding of speech signals in a context-dependent manner. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.

  2. Visual Bias Predicts Gait Adaptability in Novel Sensory Discordant Conditions

    NASA Technical Reports Server (NTRS)

    Brady, Rachel A.; Batson, Crystal D.; Peters, Brian T.; Mulavara, Ajitkumar P.; Bloomberg, Jacob J.

    2010-01-01

    We designed a gait training study that presented combinations of visual flow and support-surface manipulations to investigate the response of healthy adults to novel discordant sensorimotor conditions. We aimed to determine whether a relationship existed between subjects' visual dependence and their postural stability and cognitive performance in a new discordant environment presented at the conclusion of training (Transfer Test). Our training system comprised a treadmill placed on a motion base facing a virtual visual scene that provided a variety of sensory challenges. Ten healthy adults completed 3 training sessions during which they walked on a treadmill at 1.1 m/s while receiving discordant support-surface and visual manipulations. At the first visit, in an analysis of normalized torso translation measured in a scene-movement-only condition, 3 of 10 subjects were classified as visually dependent. During the Transfer Test, all participants received a 2-minute novel exposure. In a combined measure of stride frequency and reaction time, the non-visually dependent subjects showed improved adaptation on the Transfer Test compared to their visually dependent counterparts. This finding suggests that individual differences in the ability to adapt to new sensorimotor conditions may be explained by individuals' innate sensory biases. An accurate preflight assessment of crewmembers' biases for visual dependence could be used to predict their propensities to adapt to novel sensory conditions. It may also facilitate the development of customized training regimens that could expedite adaptation to alternate gravitational environments.

  3. Visually Evoked 3-5 Hz Membrane Potential Oscillations Reduce the Responsiveness of Visual Cortex Neurons in Awake Behaving Mice.

    PubMed

    Einstein, Michael C; Polack, Pierre-Olivier; Tran, Duy T; Golshani, Peyman

    2017-05-17

    Low-frequency membrane potential (Vm) oscillations were once thought to only occur in sleeping and anesthetized states. Recently, low-frequency Vm oscillations have been described in inactive awake animals, but it is unclear whether they shape sensory processing in neurons and whether they occur during active awake behavioral states. To answer these questions, we performed two-photon guided whole-cell Vm recordings from primary visual cortex layer 2/3 excitatory and inhibitory neurons in awake mice during passive visual stimulation and performance of visual and auditory discrimination tasks. We recorded stereotyped 3-5 Hz Vm oscillations where the Vm baseline hyperpolarized as the Vm underwent high amplitude rhythmic fluctuations lasting 1-2 s in duration. When 3-5 Hz Vm oscillations coincided with visual cues, excitatory neuron responses to preferred cues were significantly reduced. Despite this disruption to sensory processing, visual cues were critical for evoking 3-5 Hz Vm oscillations when animals performed discrimination tasks and passively viewed drifting grating stimuli. Using pupillometry and animal locomotive speed as indicators of arousal, we found that 3-5 Hz oscillations were not restricted to unaroused states and that they occurred equally in aroused and unaroused states. Therefore, low-frequency Vm oscillations play a role in shaping sensory processing in visual cortical neurons, even during active wakefulness and decision making. SIGNIFICANCE STATEMENT: A neuron's membrane potential (Vm) strongly shapes how information is processed in sensory cortices of awake animals. Yet, very little is known about how low-frequency Vm oscillations influence sensory processing and whether they occur in aroused awake animals. By performing two-photon guided whole-cell recordings from layer 2/3 excitatory and inhibitory neurons in the visual cortex of awake behaving animals, we found visually evoked stereotyped 3-5 Hz Vm oscillations that disrupt excitatory responsiveness to visual stimuli. Moreover, these oscillations occurred when animals were in high and low arousal states as measured by animal speed and pupillometry. These findings show, for the first time, that low-frequency Vm oscillations can significantly modulate sensory signal processing, even in awake active animals. Copyright © 2017 the authors.
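    One simple way to operationalize the 3-5 Hz oscillation epochs described above is to band-pass the membrane potential, take the analytic amplitude, and keep supra-threshold runs longer than a minimum duration. The sketch below does this on a toy trace; the threshold and minimum duration are illustrative, not the authors' criteria.

    ```python
    # Sketch: flagging 3-5 Hz oscillation epochs from band-limited amplitude.
    import numpy as np
    from scipy.signal import butter, sosfiltfilt, hilbert

    def oscillation_epochs(vm, fs, band=(3.0, 5.0), z_thresh=1.0, min_dur_s=1.0):
        """Return (start_s, end_s) epochs where band-limited amplitude is elevated."""
        sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
        envelope = np.abs(hilbert(sosfiltfilt(sos, vm)))
        z = (envelope - envelope.mean()) / envelope.std()
        above = np.append(z > z_thresh, False)
        epochs, start = [], None
        for i, flag in enumerate(above):
            if flag and start is None:
                start = i                                  # epoch begins
            elif not flag and start is not None:
                if (i - start) / fs >= min_dur_s:          # keep only long runs
                    epochs.append((start / fs, i / fs))
                start = None
        return epochs

    fs = 1000.0
    t = np.arange(0, 6, 1 / fs)
    vm = np.where((t > 2) & (t < 4), np.sin(2 * np.pi * 4 * t), 0.0) + 0.1 * np.random.randn(t.size)
    print(oscillation_epochs(vm, fs))
    ```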

  4. The Sound of Vision Project: On the Feasibility of an Audio-Haptic Representation of the Environment, for the Visually Impaired

    PubMed Central

    Jóhannesson, Ómar I.; Balan, Oana; Unnthorsson, Runar; Moldoveanu, Alin; Kristjánsson, Árni

    2016-01-01

    The Sound of Vision project involves developing a sensory substitution device that is aimed at creating and conveying a rich auditory representation of the surrounding environment to the visually impaired. However, the feasibility of such an approach is strongly constrained by neural flexibility, possibilities of sensory substitution and adaptation to changed sensory input. We review evidence for such flexibility from various perspectives. We discuss neuroplasticity of the adult brain with an emphasis on functional changes in the visually impaired compared to sighted people. We discuss effects of adaptation on brain activity, in particular short-term and long-term effects of repeated exposure to particular stimuli. We then discuss evidence for sensory substitution such as Sound of Vision involves, while finally discussing evidence for adaptation to changes in the auditory environment. We conclude that sensory substitution enterprises such as Sound of Vision are quite feasible in light of the available evidence, which is encouraging regarding such projects. PMID:27355966

  5. Sequential sensory and decision processing in posterior parietal cortex

    PubMed Central

    Ibos, Guilhem; Freedman, David J

    2017-01-01

    Decisions about the behavioral significance of sensory stimuli often require comparing sensory inference of what we are looking at to internal models of what we are looking for. Here, we test how neuronal selectivity for visual features is transformed into decision-related signals in posterior parietal cortex (area LIP). Monkeys performed a visual matching task that required them to detect target stimuli composed of conjunctions of color and motion-direction. Neuronal recordings from area LIP revealed two main findings. First, the sequential processing of visual features and the selection of target-stimuli suggest that LIP is involved in transforming sensory information into decision-related signals. Second, the patterns of color and motion selectivity and their impact on decision-related encoding suggest that LIP plays a role in detecting target stimuli by comparing bottom-up sensory inputs (what the monkeys were looking at) and top-down cognitive encoding inputs (what the monkeys were looking for). DOI: http://dx.doi.org/10.7554/eLife.23743.001 PMID:28418332

  6. The sensory timecourses associated with conscious visual item memory and source memory.

    PubMed

    Thakral, Preston P; Slotnick, Scott D

    2015-09-01

    Previous event-related potential (ERP) findings have suggested that during visual item and source memory, nonconscious and conscious sensory (occipital-temporal) activity onsets may be restricted to early (0-800 ms) and late (800-1600 ms) temporal epochs, respectively. In an ERP experiment, we tested this hypothesis by separately assessing whether the onset of conscious sensory activity was restricted to the late epoch during source (location) memory and item (shape) memory. We found that conscious sensory activity had a late (>800 ms) onset during source memory and an early (<200 ms) onset during item memory. In a follow-up fMRI experiment, conscious sensory activity was localized to BA17, BA18, and BA19. Of primary importance, the distinct source memory and item memory ERP onsets contradict the hypothesis that there is a fixed temporal boundary separating nonconscious and conscious processing during all forms of visual conscious retrieval. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Cross-frequency synchronization connects networks of fast and slow oscillations during visual working memory maintenance.

    PubMed

    Siebenhühner, Felix; Wang, Sheng H; Palva, J Matias; Palva, Satu

    2016-09-26

    Neuronal activity in sensory and fronto-parietal (FP) areas underlies the representation and attentional control, respectively, of sensory information maintained in visual working memory (VWM). Within these regions, beta/gamma phase-synchronization supports the integration of sensory functions, while synchronization in theta/alpha bands supports the regulation of attentional functions. A key challenge is to understand which mechanisms integrate neuronal processing across these distinct frequencies and thereby the sensory and attentional functions. We investigated whether such integration could be achieved by cross-frequency phase synchrony (CFS). Using concurrent magneto- and electroencephalography, we found that CFS was load-dependently enhanced between theta and alpha-gamma and between alpha and beta-gamma oscillations during VWM maintenance among visual, FP, and dorsal attention (DA) systems. CFS also connected the hubs of within-frequency-synchronized networks and its strength predicted individual VWM capacity. We propose that CFS integrates processing among synchronized neuronal networks from theta to gamma frequencies to link sensory and attentional functions.
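    Cross-frequency phase synchrony of the kind analyzed above is often quantified as an n:m phase-locking value between band-limited phases. The sketch below computes a 1:m version from Hilbert phases of two toy signals; real MEG/EEG analyses add source modeling, surrogate statistics, and multiple-comparison control.

    ```python
    # Sketch: 1:m cross-frequency phase synchrony from Hilbert phases.
    import numpy as np
    from scipy.signal import butter, sosfiltfilt, hilbert

    def band_phase(x, fs, lo, hi):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        return np.angle(hilbert(sosfiltfilt(sos, x)))

    def cfs_plv(x_slow, x_fast, fs, slow_band, fast_band, m):
        """Phase-locking value between m x (slow phase) and the fast phase."""
        phi_slow = band_phase(x_slow, fs, *slow_band)
        phi_fast = band_phase(x_fast, fs, *fast_band)
        return np.abs(np.mean(np.exp(1j * (m * phi_slow - phi_fast))))

    fs = 1000.0
    t = np.arange(0, 10, 1 / fs)
    theta = np.sin(2 * np.pi * 6 * t)                 # 6 Hz "theta"
    gamma = np.sin(2 * np.pi * 60 * t + 0.3)          # 60 Hz "gamma", 1:10 locked to theta
    noise = 0.2 * np.random.randn(t.size)
    print(round(cfs_plv(theta + noise, gamma + noise, fs, (4, 8), (50, 70), m=10), 2))
    ```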

  8. Postural Stability of Patients with Schizophrenia during Challenging Sensory Conditions: Implication of Sensory Integration for Postural Control.

    PubMed

    Teng, Ya-Ling; Chen, Chiung-Ling; Lou, Shu-Zon; Wang, Wei-Tsan; Wu, Jui-Yen; Ma, Hui-Ing; Chen, Vincent Chin-Hung

    2016-01-01

    Postural dysfunctions are prevalent in patients with schizophrenia and affect their daily life and ability to work. In addition, sensory functions and sensory integration that are crucial for postural control are also compromised. This study intended to examine how patients with schizophrenia coordinate multiple sensory systems to maintain postural stability in dynamic sensory conditions. Twenty-nine patients with schizophrenia and 32 control subjects were recruited. Postural stability of the participants was examined in six sensory conditions with different levels of congruency of multiple sensory information, based on combinations of correct, removed, or conflicting sensory inputs from the visual, somatosensory, and vestibular systems. The excursion of the center of pressure was measured by posturography. Equilibrium scores were derived to indicate the range of anterior-posterior (AP) postural sway, and sensory ratios were calculated to explore the ability to use sensory information to maintain balance. The overall AP postural sway was significantly larger for patients with schizophrenia compared to the controls [patients (69.62±8.99); controls (76.53±7.47); t(1,59) = -3.28, p<0.001]. The results of mixed-model ANOVAs showed a significant interaction between group and sensory conditions [F(5,295) = 5.55, p<0.001]. Further analysis indicated that AP postural sway was significantly larger for patients compared to the controls in conditions containing unreliable somatosensory information, either with visual deprivation or with conflicting visual information. Sensory ratios were not significantly different between groups, although a small, non-significant difference in the efficiency of utilizing vestibular information was also noted. No significant correlations were found between postural stability and clinical characteristics. To sum up, patients with schizophrenia showed increased postural sway and a higher rate of falls during challenging sensory conditions, independent of clinical characteristics. Patients further demonstrated a similar pattern and level of utilizing sensory information to maintain balance compared to the controls.
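    For reference, the equilibrium score and sensory ratios mentioned above are commonly computed as below, with anterior-posterior sway normalized by a theoretical stability limit (12.5 degrees in the usual SOT convention) and ratios formed between pairs of conditions. The condition pairings and sway values here are illustrative and may differ from the study's exact definitions.

    ```python
    # Sketch: SOT-style equilibrium scores and sensory ratios (one common convention).
    def equilibrium_score(ap_sway_peak_to_peak_deg, theoretical_limit_deg=12.5):
        return 100.0 * (1.0 - ap_sway_peak_to_peak_deg / theoretical_limit_deg)

    def sensory_ratio(score_test_condition, score_reference_condition):
        return score_test_condition / score_reference_condition

    # Illustrative peak-to-peak AP sway (degrees) for conditions 1, 2, 4 and 5.
    es = {c: equilibrium_score(sway) for c, sway in {1: 1.0, 2: 1.5, 4: 2.5, 5: 4.0}.items()}
    print("somatosensory ratio:", round(sensory_ratio(es[2], es[1]), 2))
    print("visual ratio:", round(sensory_ratio(es[4], es[1]), 2))
    print("vestibular ratio:", round(sensory_ratio(es[5], es[1]), 2))
    ```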

  9. Multi-sensory landscape assessment: the contribution of acoustic perception to landscape evaluation.

    PubMed

    Gan, Yonghong; Luo, Tao; Breitung, Werner; Kang, Jian; Zhang, Tianhai

    2014-12-01

    In this paper, the contributions of visual and acoustic preference to multi-sensory landscape evaluation were quantitatively compared. Real landscapes were treated as a dual-sensory ambiance and separated into visual landscape and soundscape, both of which were evaluated by 63 respondents under laboratory conditions. Analysis of the relationship between respondents' visual and acoustic preferences, and of their respective contributions to overall landscape preference, showed that (1) some common attributes, such as naturalness or degree of human disturbance, are identified across visual, aural, and audio-visual preference judgments; (2) with acoustic and visual preference as predictors, a multivariate linear regression model satisfactorily predicted landscape preference (R² = 0.740), whereas the coefficients of determination for single-predictor linear regression models were 0.345 and 0.720 for visual and acoustic preference, respectively; (3) acoustic preference played a much more important role in landscape evaluation than visual preference in this study (the former about 4.5 times the latter), which strongly suggests a rethinking of the role of soundscape in environmental perception research and landscape planning practice.
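
    To make the regression comparison above concrete, the following sketch (not the authors' code) fits single-predictor and two-predictor ordinary-least-squares models of landscape preference on visual and acoustic ratings; the data, weights, and noise level are made up for illustration.

      # Hypothetical illustration of predicting landscape preference from
      # visual and acoustic preference ratings with ordinary least squares.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 63                                   # matches the number of respondents
      acoustic = rng.uniform(1, 7, n)
      visual = rng.uniform(1, 7, n)
      preference = 0.9 * acoustic + 0.2 * visual + rng.normal(0, 0.8, n)  # made-up weights

      def r_squared(X, y):
          X = np.column_stack([np.ones(len(y)), X])        # add intercept column
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          resid = y - X @ beta
          return round(float(1 - resid.var() / y.var()), 3)

      print(r_squared(acoustic.reshape(-1, 1), preference))              # acoustic-only model
      print(r_squared(visual.reshape(-1, 1), preference))                # visual-only model
      print(r_squared(np.column_stack([acoustic, visual]), preference))  # combined model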

  10. Beyond sensory images: Object-based representation in the human ventral pathway

    PubMed Central

    Pietrini, Pietro; Furey, Maura L.; Ricciardi, Emiliano; Gobbini, M. Ida; Wu, W.-H. Carolyn; Cohen, Leonardo; Guazzelli, Mario; Haxby, James V.

    2004-01-01

    We investigated whether the topographically organized, category-related patterns of neural response in the ventral visual pathway are a representation of sensory images or a more abstract representation of object form that is not dependent on sensory modality. We used functional MRI to measure patterns of response evoked during visual and tactile recognition of faces and manmade objects in sighted subjects and during tactile recognition in blind subjects. Results showed that visual and tactile recognition evoked category-related patterns of response in a ventral extrastriate visual area in the inferior temporal gyrus that were correlated across modality for manmade objects. Blind subjects also demonstrated category-related patterns of response in this “visual” area, and in more ventral cortical regions in the fusiform gyrus, indicating that these patterns are not due to visual imagery and, furthermore, that visual experience is not necessary for category-related representations to develop in these cortices. These results demonstrate that the representation of objects in the ventral visual pathway is not simply a representation of visual images but, rather, is a representation of more abstract features of object form. PMID:15064396

  11. A neural correlate of working memory in the monkey primary visual cortex.

    PubMed

    Supèr, H; Spekreijse, H; Lamme, V A

    2001-07-06

    The brain frequently needs to store information for short periods. In vision, this means that the perceptual correlate of a stimulus has to be maintained temporarily once the stimulus has been removed from the visual scene. However, it is not known how the visual system transfers sensory information into a memory component. Here, we identify a neural correlate of working memory in the monkey primary visual cortex (V1). We propose that this component may link sensory activity with memory activity.

  12. Link between orientation and retinotopic maps in primary visual cortex

    PubMed Central

    Paik, Se-Bum; Ringach, Dario L.

    2012-01-01

    Maps representing the preference of neurons for the location and orientation of a stimulus on the visual field are a hallmark of primary visual cortex. It is not yet known how these maps develop and what function they play in visual processing. One hypothesis postulates that orientation maps are initially seeded by the spatial interference of ON- and OFF-center retinal receptive field mosaics. Here we show that such a mechanism predicts a link between the layout of orientation preferences around singularities of different signs and the cardinal axes of the retinotopic map. Moreover, we confirm that the predicted relationship holds in tree shrew primary visual cortex. These findings provide additional support for the notion that spatially structured input from the retina may provide a blueprint for the early development of cortical maps and receptive fields. More broadly, they raise the possibility that spatially structured input from the periphery may shape the organization of primary sensory cortices of other modalities as well. PMID:22509015

  13. Strategies for Characterizing the Sensory Environment: Objective and Subjective Evaluation Methods using the VisiSonic Real Space 64/5 Audio-Visual Panoramic Camera

    DTIC Science & Technology

    2017-11-01

    ARL-TR-8205 ● NOV 2017 ● US Army Research Laboratory. Strategies for Characterizing the Sensory Environment: Objective and Subjective Evaluation Methods using the VisiSonic Real Space 64/5 Audio-Visual Panoramic Camera. By Joseph McArdle, Ashley Foots, Chris Stachowiak, and ...

  14. On the dependence of response inhibition processes on sensory modality.

    PubMed

    Bodmer, Benjamin; Beste, Christian

    2017-04-01

    The ability to inhibit responses is a central sensorimotor function, but the importance of sensory processes for motor inhibition mechanisms has only recently come into research focus. In particular, it remains unclear whether sensory modalities differ in their capacity to trigger response inhibition. On functional neuroanatomical grounds, strong differences may exist, for example, between the visual and the tactile modality. In the current study we examined which neurophysiological mechanisms and functional neuroanatomical networks are modulated during response inhibition, using a Go/NoGo paradigm with a novel combination of visual, tactile, and visuotactile stimuli. The data show that the tactile modality is more powerful than the visual modality in triggering response inhibition. However, the tactile modality loses this efficacy when combined with the visual modality, possibly because of competitive mechanisms that suppress certain sensory stimuli at the response selection level. Variations in sensory modality specifically affected conflict monitoring processes during response inhibition by modulating activity in a fronto-parietal network including the right inferior frontal gyrus, anterior cingulate cortex, and temporoparietal junction; attentional selection processes were not modulated. The results suggest that the functional neuroanatomical networks involved in response inhibition depend critically on the nature of the sensory input. Hum Brain Mapp 38:1941-1951, 2017. © 2017 Wiley Periodicals, Inc.

  15. Reevaluating the Sensory Account of Visual Working Memory Storage.

    PubMed

    Xu, Yaoda

    2017-10-01

    Recent human fMRI pattern-decoding studies have highlighted the involvement of sensory areas in visual working memory (VWM) tasks and argue for a sensory account of VWM storage. In this review, evidence is examined from human behavior, fMRI decoding, and transcranial magnetic stimulation (TMS) studies, as well as from monkey neurophysiology studies. Contrary to the prevalent view, the available evidence provides little support for the sensory account of VWM storage. Instead, when the ability to resist distraction and the existence of top-down feedback are taken into account, VWM-related activities in sensory areas seem to reflect feedback signals indicative of VWM storage elsewhere in the brain. Collectively, the evidence shows that prefrontal and parietal regions, rather than sensory areas, play more significant roles in VWM storage. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Awake vs. anesthetized: layer-specific sensory processing in visual cortex and functional connectivity between cortical areas

    PubMed Central

    Sellers, Kristin K.; Bennett, Davis V.; Hutt, Axel; Williams, James H.

    2015-01-01

    During general anesthesia, global brain activity and behavioral state are profoundly altered. Yet it remains mostly unknown how anesthetics alter sensory processing across cortical layers and modulate functional cortico-cortical connectivity. To address this gap in knowledge of the micro- and mesoscale effects of anesthetics on sensory processing in the cortical microcircuit, we recorded multiunit activity and local field potential in awake and anesthetized ferrets (Mustela putoris furo) during sensory stimulation. To understand how anesthetics alter sensory processing in a primary sensory area and the representation of sensory input in higher-order association areas, we studied the local sensory responses and long-range functional connectivity of primary visual cortex (V1) and prefrontal cortex (PFC). Isoflurane combined with xylazine provided general anesthesia for all anesthetized recordings. We found that anesthetics altered the duration of sensory-evoked responses, disrupted the response dynamics across cortical layers, suppressed both multimodal interactions in V1 and sensory responses in PFC, and reduced functional cortico-cortical connectivity between V1 and PFC. Together, the present findings demonstrate altered sensory responses and impaired functional network connectivity during anesthesia at the level of multiunit activity and local field potential across cortical layers. PMID:25833839

  17. Cortical Neuroprosthesis Merges Visible and Invisible Light Without Impairing Native Sensory Function

    PubMed Central

    Thomson, Eric E.; Zea, Ivan; França, Wendy

    2017-01-01

    Abstract Adult rats equipped with a sensory prosthesis, which transduced infrared (IR) signals into electrical signals delivered to somatosensory cortex (S1), took approximately 4 d to learn a four-choice IR discrimination task. Here, we show that when such IR signals are projected to the primary visual cortex (V1), rats that are pretrained in a visual-discrimination task typically learn the same IR discrimination task on their first day of training. However, without prior training on a visual discrimination task, the learning rates for S1- and V1-implanted animals converged, suggesting there is no intrinsic difference in learning rate between the two areas. We also discovered that animals were able to integrate IR information into the ongoing visual processing stream in V1, performing a visual-IR integration task in which they had to combine IR and visual information. Furthermore, when the IR prosthesis was implanted in S1, rats showed no impairment in their ability to use their whiskers to perform a tactile discrimination task. Instead, in some rats, this ability was actually enhanced. Cumulatively, these findings suggest that cortical sensory neuroprostheses can rapidly augment the representational scope of primary sensory areas, integrating novel sources of information into ongoing processing while incurring minimal loss of native function. PMID:29279860

  18. Signs as Pictures and Signs as Words: Effect of Language Knowledge on Memory for New Vocabulary.

    ERIC Educational Resources Information Center

    Siple, Patricia; And Others

    1982-01-01

    The role of sensory attributes in a vocabulary learning task was investigated for a non-oral language using deaf and hearing individuals, more or less skilled in the use of sign language. Skilled signers encoded invented signs in terms of linguistic structure rather than as visual-pictorial events. (Author/RD)

  19. Multiple Sensory-Motor Pathways Lead to Coordinated Visual Attention

    ERIC Educational Resources Information Center

    Yu, Chen; Smith, Linda B.

    2017-01-01

    Joint attention has been extensively studied in the developmental literature because of overwhelming evidence that the ability to socially coordinate visual attention to an object is essential to healthy developmental outcomes, including language learning. The goal of this study was to understand the complex system of sensory-motor behaviors that…

  20. Hierarchical organization of macaque and cat cortical sensory systems explored with a novel network processor.

    PubMed

    Hilgetag, C C; O'Neill, M A; Young, M P

    2000-01-29

    Neuroanatomists have described a large number of connections between the various structures of monkey and cat cortical sensory systems. Because of the complexity of the connection data, analysis is required to unravel what principles of organization they imply. To date, analysis of laminar origin and termination connection data to reveal hierarchical relationships between the cortical areas has been the most widely acknowledged approach. We programmed a network processor that searches for optimal hierarchical orderings of cortical areas given known hierarchical constraints and rules for their interpretation. For all cortical systems and all cost functions, the processor found a multitude of equally low-cost hierarchies. Laminar hierarchical constraints that are presently available in the anatomical literature were therefore insufficient to constrain a unique ordering for any of the sensory systems we analysed. Hierarchical orderings of the monkey visual system that have been widely reported, but which were derived by hand, were not among the optimal orderings. All the cortical systems we studied displayed a significant degree of hierarchical organization, and the anatomical constraints from the monkey visual and somato-motor systems were satisfied with very few constraint violations in the optimal hierarchies. The visual and somato-motor systems in that animal were therefore surprisingly strictly hierarchical. Most inconsistencies between the constraints and the hierarchical relationships in the optimal structures for the visual system were related to connections of area FST (fundus of superior temporal sulcus). We found that the hierarchical solutions could be further improved by assuming that FST consists of two areas, which differ in the nature of their projections. Indeed, we found that perfect hierarchical arrangements of the primate visual system, without any violation of anatomical constraints, could be obtained under two reasonable conditions, namely the subdivision of FST into two distinct areas, whose connectivity we predict, and the abolition of at least one of the less reliable rule constraints. Our analyses showed that the future collection of the same type of laminar constraints, or the inclusion of new hierarchical constraints from thalamocortical connections, will not resolve the problem of multiple optimal hierarchical representations for the primate visual system. Further data, however, may help to specify the relative ordering of some more areas. This indeterminacy of the visual hierarchy is in part due to the reported absence of some connections between cortical areas. These absences are consistent with limited cross-talk between differentiated processing streams in the system. Hence, hierarchical representation of the visual system is affected by, and must take into account, other organizational features, such as processing streams.
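
    The search described above can be pictured as cost minimization over orderings, where the cost counts violated laminar constraints. The toy sketch below uses a hypothetical subset of areas and constraints and a simulated-annealing search; it illustrates the general idea only and is not the authors' network processor.

      # Toy sketch: search for hierarchical orderings that minimize violations of
      # pairwise "A is below B" laminar constraints (simulated annealing over ranks).
      # Areas and constraints here are hypothetical stand-ins.
      import math, random

      areas = ["V1", "V2", "V4", "MT", "FST", "TEO"]            # hypothetical subset
      constraints = [("V1", "V2"), ("V2", "V4"), ("V2", "MT"),  # (lower, higher) pairs
                     ("V4", "TEO"), ("MT", "FST")]

      def cost(rank):
          # Count constraints where the "lower" area is not ranked below the "higher" one.
          return sum(rank[lo] >= rank[hi] for lo, hi in constraints)

      random.seed(1)
      rank = {a: i for i, a in enumerate(areas)}
      temp = 2.0
      for step in range(5000):
          a, b = random.sample(areas, 2)
          new = dict(rank); new[a], new[b] = new[b], new[a]     # swap two ranks
          delta = cost(new) - cost(rank)
          if delta <= 0 or random.random() < math.exp(-delta / temp):
              rank = new
          temp *= 0.999                                         # cool slowly

      print(sorted(rank, key=rank.get), "violations:", cost(rank))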

  1. Vision in two cyprinid fish: implications for collective behavior

    PubMed Central

    Moore, Bret A.; Tyrrell, Luke P.; Fernández-Juricic, Esteban

    2015-01-01

    Many species of fish rely on their visual systems to interact with conspecifics and these interactions can lead to collective behavior. Individual-based models have been used to predict collective interactions; however, these models generally make simplistic assumptions about the sensory systems, which are applied to different species without proper empirical testing. This could limit our ability to predict (and test empirically) collective behavior in species with very different sensory requirements. In this study, we characterized components of the visual system in two species of cyprinid fish known to engage in visually dependent collective interactions (zebrafish Danio rerio and golden shiner Notemigonus crysoleucas) and derived quantitative predictions about the positioning of individuals within schools. We found that both species had relatively narrow binocular and blind fields and wide visual coverage. However, golden shiners had more visual coverage in the vertical plane (binocular field extending behind the head) and higher visual acuity than zebrafish. The centers of acute vision (areae) of both species projected in the fronto-dorsal region of the visual field, but those of the zebrafish projected more dorsally than those of the golden shiner. Based on this visual sensory information, we predicted that: (a) predator detection time could be increased by >1,000% in zebrafish and >100% in golden shiners with an increase in nearest neighbor distance, (b) zebrafish schools would have a higher roughness value (surface area/volume ratio) than those of golden shiners, and (c) nearest neighbor distance would vary from 8 to 20 cm to visually resolve conspecific striping patterns in both species. Overall, considering between-species differences in the sensory system of species exhibiting collective behavior could change the predictions about the positioning of individuals in the group as well as the shape of the school, which can have implications for group cohesion. We suggest that more effort should be invested in assessing the role of the sensory system in shaping local interactions driving collective behavior. PMID:26290783
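
    Prediction (c) above rests on simple geometry linking visual acuity to the distance at which a stripe pattern of a given spatial period can be resolved. The sketch below applies that geometry under the small-angle approximation, with placeholder acuity and stripe-period values rather than the measured ones.

      # Sketch: maximum distance at which a conspecific stripe pattern of spatial
      # period `period_cm` is resolvable, given acuity in cycles/degree.
      # Small-angle geometry; the acuity and period values are placeholders.
      import math

      def max_resolvable_distance_cm(acuity_cpd, period_cm):
          # One stripe cycle must subtend at least 1/acuity degrees to be resolved.
          min_angle_rad = math.radians(1.0 / acuity_cpd)
          return period_cm / (2.0 * math.tan(min_angle_rad / 2.0))

      for species, acuity in [("species A (assumed)", 1.5), ("species B (assumed)", 2.5)]:
          print(species, round(max_resolvable_distance_cm(acuity, period_cm=0.3), 1), "cm")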

  2. Biasing the brain's attentional set: I. cue driven deployments of intersensory selective attention.

    PubMed

    Foxe, John J; Simpson, Gregory V; Ahlfors, Seppo P; Saron, Clifford D

    2005-10-01

    Brain activity associated with directing attention to one of two possible sensory modalities was examined using high-density mapping of human event-related potentials. The deployment of selective attention was based on visually presented symbolic cue-words instructing subjects on a trial-by-trial basis, which sensory modality to attend. We measured the spatio-temporal pattern of activation in the approximately 1 second period between the cue-instruction and a subsequent compound auditory-visual imperative stimulus. This allowed us to assess the flow of processing across brain regions involved in deploying and sustaining inter-sensory selective attention, prior to the actual selective processing of the compound audio-visual target stimulus. Activity over frontal and parietal areas showed sensory specific increases in activation during the early part of the anticipatory period (~230 ms), probably representing the activation of fronto-parietal attentional deployment systems for top-down control of attention. In the later period preceding the arrival of the "to-be-attended" stimulus, sustained differential activity was seen over fronto-central regions and parieto-occipital regions, suggesting the maintenance of sensory-specific biased attentional states that would allow for subsequent selective processing. Although there was clear sensory biasing in this late sustained period, it was also clear that both sensory systems were being prepared during the cue-target period. These late sensory-specific biasing effects were also accompanied by sustained activations over frontal cortices that also showed both common and sensory specific activation patterns, suggesting that maintenance of the biased state includes top-down inputs from generators in frontal cortices, some of which are sensory-specific regions. These data support extensive interactions between sensory, parietal and frontal regions during processing of cue information, deployment of attention, and maintenance of the focus of attention in anticipation of impending attentionally relevant input.

  3. The role of visual deprivation and experience on the performance of sensory substitution devices.

    PubMed

    Stronks, H Christiaan; Nau, Amy C; Ibbotson, Michael R; Barnes, Nick

    2015-10-22

    It is commonly accepted that the blind can partially compensate for their loss of vision by developing enhanced abilities with their remaining senses. This visual compensation may be related to the fact that blind people rely on their other senses in everyday life. Many studies have indeed shown that experience plays an important role in visual compensation. Numerous neuroimaging studies have shown that the visual cortices of the blind are recruited by other functional brain areas and can become responsive to tactile or auditory input instead. These cross-modal plastic changes are more pronounced in the early blind compared to late blind individuals. The functional consequences of cross-modal plasticity on visual compensation in the blind are debated, as are the influences of various etiologies of vision loss (i.e., blindness acquired early or late in life). Distinguishing between the influences of experience and visual deprivation on compensation is especially relevant for rehabilitation of the blind with sensory substitution devices. The BrainPort artificial vision device and The vOICe are assistive devices for the blind that redirect visual information to another intact sensory system. Establishing how experience and different etiologies of vision loss affect the performance of these devices may help to improve existing rehabilitation strategies, formulate effective selection criteria and develop prognostic measures. In this review we will discuss studies that investigated the influence of training and visual deprivation on the performance of various sensory substitution approaches. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Hierarchical organization of brain functional networks during visual tasks.

    PubMed

    Zhuo, Zhao; Cai, Shi-Min; Fu, Zhong-Qian; Zhang, Jie

    2011-09-01

    The functional network of the brain is known to demonstrate modular structure over different hierarchical scales. In this paper, we systematically investigated the hierarchical modular organization of brain functional networks derived from the extent of phase synchronization among high-resolution EEG time series during a visual task. In particular, we compare the modular structure of the functional network from EEG channels with that of the anatomical parcellation of the brain cortex. Our results show that the modular architectures of brain functional networks correspond well to those of the anatomical structures over different levels of hierarchy. Most importantly, we find that the consistency between the modular structures of the functional network and the anatomical network becomes more pronounced in the visual, sensory, visual-temporal, and motor cortices during the visual task, which implies that the strong modularity in these areas forms the functional basis for the visual task. The structure-function relationship further reveals that phase synchronization of EEG time series within the same anatomical group is much stronger than that between different anatomical groups during the task, and that the hierarchical organization of the functional brain network may be a consequence of functional segmentation of the brain cortex.
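
    A minimal sketch of the kind of pipeline implied above — phase-locking values between EEG channels feeding a weighted graph whose modules are then extracted — is shown below with synthetic signals; the filtering choices and the community-detection algorithm are assumptions for illustration, not the study's methods.

      # Sketch: phase-locking values between channels -> weighted graph -> modules.
      # Synthetic narrow-band-like data; real EEG would be band-pass filtered first.
      import numpy as np
      import networkx as nx
      from networkx.algorithms.community import greedy_modularity_communities
      from scipy.signal import hilbert

      rng = np.random.default_rng(0)
      n_ch, n_samp = 8, 2000
      base = np.sin(2 * np.pi * 10 * np.linspace(0, 2, n_samp))        # shared 10 Hz rhythm
      signals = np.array([base * (i < 4) + rng.normal(0, 1, n_samp) for i in range(n_ch)])

      phases = np.angle(hilbert(signals, axis=1))
      plv = np.abs(np.exp(1j * (phases[:, None, :] - phases[None, :, :])).mean(axis=2))

      G = nx.Graph()
      for i in range(n_ch):
          for j in range(i + 1, n_ch):
              G.add_edge(i, j, weight=plv[i, j])

      modules = greedy_modularity_communities(G, weight="weight")
      print([sorted(m) for m in modules])   # channels sharing the rhythm group together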

  5. Sensory processing patterns predict the integration of information held in visual working memory.

    PubMed

    Lowe, Matthew X; Stevenson, Ryan A; Wilson, Kristin E; Ouslis, Natasha E; Barense, Morgan D; Cant, Jonathan S; Ferber, Susanne

    2016-02-01

    Given the limited resources of visual working memory, multiple items may be remembered as an averaged group or ensemble. As a result, local information may be ill-defined, but these ensemble representations provide accurate diagnostics of the natural world by combining gist information with item-level information held in visual working memory. Some neurodevelopmental disorders are characterized by sensory processing profiles that predispose individuals to avoid or seek out sensory stimulation, fundamentally altering their perceptual experience. Here, we report that such processing styles affect the computation of ensemble statistics in the general population. Using stable adult sensory processing patterns, we demonstrate that individuals with low sensory thresholds, who show a greater proclivity for active response strategies to prevent sensory overstimulation, are less likely to integrate mean size information across a set of similar items and are therefore more likely to be biased away from the mean size representation of an ensemble display. We therefore propose that the study of ensemble processing should extend beyond the statistics of the display and also consider the statistics of the observer. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. Impact of enhanced sensory input on treadmill step frequency: infants born with myelomeningocele.

    PubMed

    Pantall, Annette; Teulier, Caroline; Smith, Beth A; Moerchen, Victoria; Ulrich, Beverly D

    2011-01-01

    To determine the effect of enhanced sensory input on the step frequency of infants with myelomeningocele (MMC) when supported on a motorized treadmill. Twenty-seven infants aged 2 to 10 months with MMC lesions at, or caudal to, L1 participated. We supported infants upright on the treadmill for 2 sets of 6 trials, each 30 seconds long. Enhanced sensory inputs within each set were presented in random order and included baseline, visual flow, unloading, weights, Velcro, and friction. Overall, friction and visual flow significantly increased step rate, particularly for the older subjects. Friction and Velcro increased stance-phase duration. Enhanced sensory input had minimal effect on leg activity when infants were not stepping. Increased friction via Dycem and enhanced visual flow via a checkerboard pattern on the treadmill belt appear to be more effective than the traditional smooth black belt surface for eliciting stepping patterns in infants with MMC.

  7. Impact of Enhanced Sensory Input on Treadmill Step Frequency: Infants Born With Myelomeningocele

    PubMed Central

    Pantall, Annette; Teulier, Caroline; Smith, Beth A; Moerchen, Victoria; Ulrich, Beverly D.

    2012-01-01

    Purpose: To determine the effect of enhanced sensory input on the step frequency of infants with myelomeningocele (MMC) when supported on a motorized treadmill. Methods: Twenty-seven infants aged 2 to 10 months with MMC lesions at or caudal to L1 participated. We supported infants upright on the treadmill for 2 sets of 6 trials, each 30 s long. Enhanced sensory inputs within each set were presented in random order and included: baseline, visual flow, unloading, weights, Velcro, and friction. Results: Overall, friction and visual flow significantly increased step rate, particularly for the older group. Friction and Velcro increased stance-phase duration. Enhanced sensory input had minimal effect on leg activity when infants were not stepping. Conclusions: Increased friction via Dycem and enhanced visual flow via a checkerboard pattern on the treadmill belt appear to be more effective than the traditional smooth black belt surface for eliciting stepping patterns in infants with MMC. PMID:21266940

  8. Evidence of a visual-to-auditory cross-modal sensory gating phenomenon as reflected by the human P50 event-related brain potential modulation.

    PubMed

    Lebib, Riadh; Papo, David; de Bode, Stella; Baudonnière, Pierre Marie

    2003-05-08

    We investigated the existence of cross-modal sensory gating as reflected by the modulation of an early electrophysiological index, the P50 component. We analyzed event-related brain potentials elicited by audiovisual speech stimuli manipulated along two dimensions: congruency and discriminability. The results showed that the P50 was attenuated when visual and auditory speech information were redundant (i.e., congruent), compared with the same event-related potential component elicited by discrepant audiovisual dubbing. When hard to discriminate, however, bimodal incongruent speech stimuli elicited a similar pattern of P50 attenuation. We conclude that a visual-to-auditory cross-modal sensory gating phenomenon exists. These results corroborate previous findings revealing a very early audiovisual interaction during speech perception. Finally, we postulate that the sensory gating system includes a cross-modal dimension.

  9. High perceptual load leads to both reduced gain and broader orientation tuning

    PubMed Central

    Stolte, Moritz; Bahrami, Bahador; Lavie, Nilli

    2014-01-01

    Due to its limited capacity, visual perception depends on the allocation of attention. The resultant phenomena of inattentional blindness, accompanied by reduced sensory visual cortex response to unattended stimuli in conditions of high perceptual load in the attended task, are now well established (Lavie, 2005; Lavie, 2010, for reviews). However, the underlying mechanisms for these effects remain to be elucidated. Specifically, is reduced perceptual processing under high perceptual load a result of reduced sensory signal gain, broader tuning, or both? We examined this question with psychophysical measures of orientation tuning under different levels of perceptual load in the task performed. Our results show that increased perceptual load leads to both reduced sensory signal and broadening of tuning. These results clarify the effects of attention on elementary visual perception and suggest that high perceptual load is critical for attentional effects on sensory tuning. PMID:24610952
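
    To make the gain-versus-bandwidth distinction concrete, the sketch below parameterizes an orientation tuning curve with separate gain and width terms and compares a "low-load" and a "high-load" setting; the parameter values are illustrative, not fitted to the study's psychophysical data.

      # Illustrative Gaussian orientation tuning curve: gain vs. bandwidth changes.
      # All parameter values are made up for illustration.
      import numpy as np

      def tuning(theta_deg, gain=1.0, sigma_deg=20.0, pref_deg=0.0, baseline=0.05):
          return baseline + gain * np.exp(-0.5 * ((theta_deg - pref_deg) / sigma_deg) ** 2)

      theta = np.linspace(-90, 90, 181)
      low_load  = tuning(theta, gain=1.0, sigma_deg=20.0)
      high_load = tuning(theta, gain=0.7, sigma_deg=30.0)   # reduced gain AND broader tuning

      print("peak response, low vs. high load:", low_load.max().round(2), high_load.max().round(2))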

  10. Visual search and attention: an overview.

    PubMed

    Davis, Elizabeth T; Palmer, John

    2004-01-01

    This special feature issue is devoted to attention and visual search. Attention is a central topic in psychology and visual search is both a versatile paradigm for the study of visual attention and a topic of study in itself. Visual search depends on sensory, perceptual, and cognitive processes. As a result, the search paradigm has been used to investigate a diverse range of phenomena. Manipulating the search task can vary the demands on attention. In turn, attention modulates visual search by selecting and limiting the information available at various levels of processing. Focusing on the intersection of attention and search provides a relatively structured window into the wide world of attentional phenomena. In particular, the effects of divided attention are illustrated by the effects of set size (the number of stimuli in a display) and the effects of selective attention are illustrated by cueing subsets of stimuli within the display. These two phenomena provide the starting point for the articles in this special issue. The articles are organized into four general topics to help structure the issues of attention and search.

  11. A quantitative comparison of the hemispheric, areal, and laminar origins of sensory and motor cortical projections to the superior colliculus of the cat.

    PubMed

    Butler, Blake E; Chabot, Nicole; Lomber, Stephen G

    2016-09-01

    The superior colliculus (SC) is a midbrain structure central to orienting behaviors. The organization of descending projections from sensory cortices to the SC has garnered much attention; however, rarely have projections from multiple modalities been quantified and contrasted, allowing for meaningful conclusions within a single species. Here, we examine corticotectal projections from visual, auditory, somatosensory, motor, and limbic cortices via retrograde pathway tracers injected throughout the superficial and deep layers of the cat SC. As anticipated, the majority of cortical inputs to the SC originate in the visual cortex. In fact, each field implicated in visual orienting behavior makes a substantial projection. Conversely, only one area of the auditory orienting system, the auditory field of the anterior ectosylvian sulcus (fAES), and no area involved in somatosensory orienting, shows significant corticotectal inputs. Although small relative to visual inputs, the projection from the fAES is of particular interest, as it represents the only bilateral cortical input to the SC. This detailed, quantitative study allows for comparison across modalities in an animal that serves as a useful model for both auditory and visual perception. Moreover, the differences in patterns of corticotectal projections between modalities inform the ways in which orienting systems are modulated by cortical feedback. J. Comp. Neurol. 524:2623-2642, 2016. © 2016 Wiley Periodicals, Inc.

  12. Making Sense of Education: Sensory Ethnography and Visual Impairment

    ERIC Educational Resources Information Center

    Morris, Ceri

    2017-01-01

    Education involves the engagement of the full range of the senses in the accomplishment of tasks and the learning of knowledge and skills. However both in pedagogical practices and in the process of educational research, there has been a tendency to privilege the visual. To explore these issues, detailed sensory ethnographic fieldwork was…

  13. Evaluation of Sensory Aids for the Visually Handicapped.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC.

    Presented are 11 papers given at a conference on the evaluation of sensory aids for the visually handicapped which emphasized mobility and reading aids beginning to be tested and distributed widely. Many of the presentations are by the principal developers or advocates of the aids. Introductory readings compare the role of evaluation in the…

  14. Origins of task-specific sensory-independent organization in the visual and auditory brain: neuroscience evidence, open questions and clinical implications.

    PubMed

    Heimler, Benedetta; Striem-Amit, Ella; Amedi, Amir

    2015-12-01

    Evidence of task-specific sensory-independent (TSSI) plasticity from blind and deaf populations has led to a better understanding of brain organization. However, the principles determining the origins of this plasticity remain unclear. We review recent data suggesting that a combination of the connectivity bias and sensitivity to task-distinctive features might account for TSSI plasticity in the sensory cortices as a whole, from the higher-order occipital/temporal cortices to the primary sensory cortices. We discuss current theories and evidence, open questions and related predictions. Finally, given the rapid progress in visual and auditory restoration techniques, we address the crucial need to develop effective rehabilitation approaches for sensory recovery. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  15. Role of orientation reference selection in motion sickness

    NASA Technical Reports Server (NTRS)

    Peterka, Robert J.; Black, F. Owen

    1988-01-01

    Previous experiments with moving platform posturography have shown that different people have varying abilities to resolve conflicts among vestibular, visual, and proprioceptive sensory signals used to control upright posture. In particular, there is one class of subjects with a vestibular disorder known as benign paroxysmal positional vertigo (BPPV) who often are particularly sensitive to inaccurate visual information. That is, they will use visual sensory information for the control of their posture even when that visual information is inaccurate and is in conflict with accurate proprioceptive and vestibular sensory signals. BPPV has been associated with disorders of both posterior semicircular canal function and possibly otolith function. The present proposal hopes to take advantage of the similarities between the space motion sickness problem and the sensory orientation reference selection problems associated with the BPPV syndrome. These similarities include both etiology related to abnormal vertical canal-otolith function, and motion sickness initiating events provoked by pitch and roll head movements. The objectives of this proposal are to explore and quantify the orientation reference selection abilities of subjects and the relation of this selection to motion sickness in humans.

  16. The trait of sensory processing sensitivity and neural responses to changes in visual scenes

    PubMed Central

    Xu, Xiaomeng; Aron, Arthur; Aron, Elaine; Cao, Guikang; Feng, Tingyong; Weng, Xuchu

    2011-01-01

    This exploratory study examined the extent to which individual differences in sensory processing sensitivity (SPS), a temperament/personality trait characterized by social, emotional and physical sensitivity, are associated with neural response in visual areas in response to subtle changes in visual scenes. Sixteen participants completed the Highly Sensitive Person questionnaire, a standard measure of SPS. Subsequently, they were tested on a change detection task while undergoing functional magnetic resonance imaging (fMRI). SPS was associated with significantly greater activation in brain areas involved in high-order visual processing (i.e. right claustrum, left occipitotemporal, bilateral temporal and medial and posterior parietal regions) as well as in the right cerebellum, when detecting minor (vs major) changes in stimuli. These findings remained strong and significant after controlling for neuroticism and introversion, traits that are often correlated with SPS. These results provide the first evidence of neural differences associated with SPS, the first direct support for the sensory aspect of this trait that has been studied primarily for its social and affective implications, and preliminary evidence for heightened sensory processing in individuals high in SPS. PMID:20203139

  17. Processes to Preserve Spice and Herb Quality and Sensory Integrity During Pathogen Inactivation

    PubMed Central

    Moberg, Kayla; Amin, Kemia N.; Wright, Melissa; Newkirk, Jordan J.; Ponder, Monica A.; Acuff, Gary R.; Dickson, James S.

    2017-01-01

    Abstract Selected processing methods, demonstrated to be effective at reducing Salmonella, were assessed to determine if spice and herb quality was affected. Black peppercorn, cumin seed, oregano, and onion powder were irradiated to a target dose of 8 kGy. Two additional processes were examined for whole black peppercorns and cumin seeds: ethylene oxide (EtO) fumigation and vacuum-assisted steam (82.22 °C, 7.5 psia). Treated and untreated spices/herbs were compared (visual, odor) using sensory similarity testing protocols (α = 0.20; β = 0.05; proportion of discriminators: 20%) to determine if processing altered sensory quality. Analytical assessment of quality (color, water activity, and volatile chemistry) was completed. Irradiation did not alter the visual or odor sensory quality of black peppercorn, cumin seed, or oregano but created differences in onion powder, which was lighter (higher L*) and more red (higher a*) in color, and resulted in nearly complete loss of measured volatile compounds. EtO processing did not create detectable odor or appearance differences in black peppercorn; however, visual and odor sensory quality differences, supported by changes in color (higher b*; lower L*) and increased concentrations of most volatiles, were detected for cumin seeds. Steam processing of black peppercorn resulted in perceptible odor differences, supported by increased concentration of monoterpene volatiles and loss of all sesquiterpenes; only visual differences were noted for cumin seed. An important step in process validation is the verification that no effect is detectable from a sensory perspective. PMID:28407236

  18. The interaction of Bayesian priors and sensory data and its neural circuit implementation in visually-guided movement

    PubMed Central

    Yang, Jin; Lee, Joonyeol; Lisberger, Stephen G.

    2012-01-01

    Sensory-motor behavior results from a complex interaction of noisy sensory data with priors based on recent experience. By varying the stimulus form and contrast for the initiation of smooth pursuit eye movements in monkeys, we show that visual motion inputs compete with two independent priors: one prior biases eye speed toward zero; the other prior attracts eye direction according to the past several days’ history of target directions. The priors bias the speed and direction of the initiation of pursuit for the weak sensory data provided by the motion of a low-contrast sine wave grating. However, the priors have relatively little effect on pursuit speed and direction when the visual stimulus arises from the coherent motion of a high-contrast patch of dots. For any given stimulus form, the mean and variance of eye speed co-vary in the initiation of pursuit, as expected for signal-dependent noise. This relationship suggests that pursuit implements a trade-off between movement accuracy and variation, reducing both when the sensory signals are noisy. The tradeoff is implemented as a competition of sensory data and priors that follows the rules of Bayesian estimation. Computer simulations show that the priors can be understood as direction specific control of the strength of visual-motor transmission, and can be implemented in a neural-network model that makes testable predictions about the population response in the smooth eye movement region of the frontal eye fields. PMID:23223286
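
    The competition between priors and noisy sensory data described above follows the familiar precision-weighted form of Bayesian estimation. The sketch below shows how a noisier (low-contrast) speed estimate is pulled more strongly toward a slow-speed prior; all numerical values are illustrative, not fitted to the pursuit data.

      # Precision-weighted combination of a slow-speed prior with a sensory likelihood.
      # Numbers are illustrative only.

      def posterior_speed(sensory_speed, sensory_sigma, prior_mean=0.0, prior_sigma=5.0):
          w_sens = 1.0 / sensory_sigma ** 2       # sensory precision
          w_prior = 1.0 / prior_sigma ** 2        # prior precision
          return (w_sens * sensory_speed + w_prior * prior_mean) / (w_sens + w_prior)

      target_speed = 20.0                                        # deg/s
      print(posterior_speed(target_speed, sensory_sigma=1.0))    # high contrast: ~19.2 deg/s
      print(posterior_speed(target_speed, sensory_sigma=5.0))    # low contrast:  ~10.0 deg/s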

  19. Visual cortical areas of the mouse: comparison of parcellation and network structure with primates

    PubMed Central

    Laramée, Marie-Eve; Boire, Denis

    2015-01-01

    Brains have evolved to optimize sensory processing. In primates, complex cognitive tasks must be executed and evolution led to the development of large brains with many cortical areas. Rodents do not accomplish cognitive tasks of the same level of complexity as primates and remain with small brains both in relative and absolute terms. But is a small brain necessarily a simple brain? In this review, several aspects of the visual cortical networks have been compared between rodents and primates. The visual system has been used as a model to evaluate the level of complexity of the cortical circuits at the anatomical and functional levels. The evolutionary constraints are first presented in order to appreciate the rules for the development of the brain and its underlying circuits. The organization of sensory pathways, with their parallel and cross-modal circuits, is also examined. Other features of brain networks, often considered as imposing constraints on the development of underlying circuitry, are also discussed and their effect on the complexity of the mouse and primate brain are inspected. In this review, we discuss the common features of cortical circuits in mice and primates and see how these can be useful in understanding visual processing in these animals. PMID:25620914

  20. Visual cortical areas of the mouse: comparison of parcellation and network structure with primates.

    PubMed

    Laramée, Marie-Eve; Boire, Denis

    2014-01-01

    Brains have evolved to optimize sensory processing. In primates, complex cognitive tasks must be executed and evolution led to the development of large brains with many cortical areas. Rodents do not accomplish cognitive tasks of the same level of complexity as primates and remain with small brains both in relative and absolute terms. But is a small brain necessarily a simple brain? In this review, several aspects of the visual cortical networks have been compared between rodents and primates. The visual system has been used as a model to evaluate the level of complexity of the cortical circuits at the anatomical and functional levels. The evolutionary constraints are first presented in order to appreciate the rules for the development of the brain and its underlying circuits. The organization of sensory pathways, with their parallel and cross-modal circuits, is also examined. Other features of brain networks, often considered as imposing constraints on the development of underlying circuitry, are also discussed and their effect on the complexity of the mouse and primate brain are inspected. In this review, we discuss the common features of cortical circuits in mice and primates and see how these can be useful in understanding visual processing in these animals.

  1. Cross-frequency synchronization connects networks of fast and slow oscillations during visual working memory maintenance

    PubMed Central

    Siebenhühner, Felix; Wang, Sheng H; Palva, J Matias; Palva, Satu

    2016-01-01

    Neuronal activity in sensory and fronto-parietal (FP) areas underlies the representation and attentional control, respectively, of sensory information maintained in visual working memory (VWM). Within these regions, beta/gamma phase-synchronization supports the integration of sensory functions, while synchronization in theta/alpha bands supports the regulation of attentional functions. A key challenge is to understand which mechanisms integrate neuronal processing across these distinct frequencies and thereby the sensory and attentional functions. We investigated whether such integration could be achieved by cross-frequency phase synchrony (CFS). Using concurrent magneto- and electroencephalography, we found that CFS was load-dependently enhanced between theta and alpha–gamma and between alpha and beta-gamma oscillations during VWM maintenance among visual, FP, and dorsal attention (DA) systems. CFS also connected the hubs of within-frequency-synchronized networks and its strength predicted individual VWM capacity. We propose that CFS integrates processing among synchronized neuronal networks from theta to gamma frequencies to link sensory and attentional functions. DOI: http://dx.doi.org/10.7554/eLife.13451.001 PMID:27669146
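
    For concreteness, cross-frequency phase synchrony of the sort analyzed above can be quantified as n:m phase locking between band-limited signals. The sketch below computes a 1:2 theta-alpha phase-locking value on synthetic data; the frequencies, noise level, and the omission of explicit band-pass filtering are simplifications for illustration, not the study's parameters.

      # Sketch: n:m cross-frequency phase synchrony (CFS) between two oscillations.
      # Synthetic signals; in practice each signal would be band-pass filtered first.
      import numpy as np
      from scipy.signal import hilbert

      fs, dur = 500, 10.0
      t = np.arange(0, dur, 1 / fs)
      rng = np.random.default_rng(0)

      theta = np.cos(2 * np.pi * 6 * t)                                          # 6 Hz "theta"
      alpha = np.cos(2 * np.pi * 12 * t + 0.3) + 0.5 * rng.normal(size=t.size)   # 12 Hz "alpha"

      def cfs_plv(x_slow, x_fast, n=1, m=2):
          """n:m phase-locking value between a slow and a fast oscillation (default 1:2)."""
          phi_slow = np.angle(hilbert(x_slow))
          phi_fast = np.angle(hilbert(x_fast))
          return np.abs(np.mean(np.exp(1j * (m * phi_slow - n * phi_fast))))

      print(round(cfs_plv(theta, alpha), 3))   # values near 1 indicate strong 1:2 coupling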

  2. Cerebellar contributions to motor timing: a PET study of auditory and visual rhythm reproduction.

    PubMed

    Penhune, V B; Zatorre, R J; Evans, A C

    1998-11-01

    The perception and production of temporal patterns, or rhythms, is important for both music and speech. However, the way in which the human brain achieves accurate timing of perceptual input and motor output is as yet little understood. Central control of both motor timing and perceptual timing across modalities has been linked to both the cerebellum and the basal ganglia (BG). The present study was designed to test the hypothesized central control of temporal processing and to examine the roles of the cerebellum, BG, and sensory association areas. In this positron emission tomography (PET) activation paradigm, subjects reproduced rhythms of increasing temporal complexity that were presented separately in the auditory and visual modalities. The results provide support for a supramodal contribution of the lateral cerebellar cortex and cerebellar vermis to the production of a timed motor response, particularly when it is complex and/or novel. The results also give partial support to the involvement of BG structures in motor timing, although this may be more directly related to implementation of the motor response than to timing per se. Finally, sensory association areas and the ventrolateral frontal cortex were found to be involved in modality-specific encoding and retrieval of the temporal stimuli. Taken together, these results point to the participation of a number of neural structures in the production of a timed motor response from an external stimulus. The role of the cerebellum in timing is conceptualized not as a clock or counter but simply as the structure that provides the necessary circuitry for the sensory system to extract temporal information and for the motor system to learn to produce a precisely timed response.

  3. Interactions between the spatial and temporal stimulus factors that influence multisensory integration in human performance.

    PubMed

    Stevenson, Ryan A; Fister, Juliane Krueger; Barnett, Zachary P; Nidiffer, Aaron R; Wallace, Mark T

    2012-05-01

    In natural environments, human sensory systems work in a coordinated and integrated manner to perceive and respond to external events. Previous research has shown that the spatial and temporal relationships of sensory signals are paramount in determining how information is integrated across sensory modalities, but in ecologically plausible settings, these factors are not independent. In the current study, we provide a novel exploration of the impact on behavioral performance for systematic manipulations of the spatial location and temporal synchrony of a visual-auditory stimulus pair. Simple auditory and visual stimuli were presented across a range of spatial locations and stimulus onset asynchronies (SOAs), and participants performed both a spatial localization and simultaneity judgment task. Response times in localizing paired visual-auditory stimuli were slower in the periphery and at larger SOAs, but most importantly, an interaction was found between the two factors, in which the effect of SOA was greater in peripheral as opposed to central locations. Simultaneity judgments also revealed a novel interaction between space and time: individuals were more likely to judge stimuli as synchronous when occurring in the periphery at large SOAs. The results of this study provide novel insights into (a) how the speed of spatial localization of an audiovisual stimulus is affected by location and temporal coincidence and the interaction between these two factors and (b) how the location of a multisensory stimulus impacts judgments concerning the temporal relationship of the paired stimuli. These findings provide strong evidence for a complex interdependency between spatial location and temporal structure in determining the ultimate behavioral and perceptual outcome associated with a paired multisensory (i.e., visual-auditory) stimulus.

  4. Action video game playing is associated with improved visual sensitivity, but not alterations in visual sensory memory.

    PubMed

    Appelbaum, L Gregory; Cain, Matthew S; Darling, Elise F; Mitroff, Stephen R

    2013-08-01

    Action video game playing has been experimentally linked to a number of perceptual and cognitive improvements. These benefits are captured through a wide range of psychometric tasks and have led to the proposition that action video game experience may promote the ability to extract statistical evidence from sensory stimuli. Such an advantage could arise from a number of possible mechanisms: improvements in visual sensitivity, enhancements in the capacity or duration for which information is retained in visual memory, or higher-level strategic use of information for decision making. The present study measured the capacity and time course of visual sensory memory using a partial report performance task as a means to distinguish between these three possible mechanisms. Sensitivity measures and parameter estimates that describe sensory memory capacity and the rate of memory decay were compared between individuals who reported high levels and low levels of action video game experience. Our results revealed a uniform increase in partial report accuracy at all stimulus-to-cue delays for action video game players but no difference in the rate or time course of the memory decay. The present findings suggest that action video game playing may be related to enhancements in the initial sensitivity to visual stimuli, but not to a greater retention of information in iconic memory buffers.
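
    The capacity and decay parameters compared above are commonly estimated by fitting an exponential decay of partial-report accuracy over the stimulus-to-cue delay. The sketch below fits such a curve to made-up data points; the functional form and values are assumptions for illustration only.

      # Sketch: fit partial-report accuracy vs. cue delay with an exponential decay,
      # accuracy(t) = a_inf + (a0 - a_inf) * exp(-t / tau). Data points are made up.
      import numpy as np
      from scipy.optimize import curve_fit

      def decay(t_ms, a0, a_inf, tau_ms):
          return a_inf + (a0 - a_inf) * np.exp(-t_ms / tau_ms)

      delays = np.array([0, 100, 300, 600, 1000.0])          # ms
      accuracy = np.array([0.85, 0.74, 0.58, 0.47, 0.44])    # hypothetical proportions

      params, _ = curve_fit(decay, delays, accuracy, p0=[0.9, 0.4, 300.0])
      a0, a_inf, tau = params
      print(f"initial={a0:.2f}, asymptote={a_inf:.2f}, tau={tau:.0f} ms")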

  5. Top-down modulation of visual and auditory cortical processing in aging.

    PubMed

    Guerreiro, Maria J S; Eck, Judith; Moerel, Michelle; Evers, Elisabeth A T; Van Gerven, Pascal W M

    2015-02-01

    Age-related cognitive decline has been accounted for by an age-related deficit in top-down attentional modulation of sensory cortical processing. In light of recent behavioral findings showing that age-related differences in selective attention are modality dependent, our goal was to investigate the role of sensory modality in age-related differences in top-down modulation of sensory cortical processing. This question was addressed by testing younger and older individuals in several memory tasks while undergoing fMRI. Throughout these tasks, perceptual features were kept constant while attentional instructions were varied, allowing us to devise all combinations of relevant and irrelevant, visual and auditory information. We found no top-down modulation of auditory sensory cortical processing in either age group. In contrast, we found top-down modulation of visual cortical processing in both age groups, and this effect did not differ between age groups. That is, older adults enhanced cortical processing of relevant visual information and suppressed cortical processing of visual distractors during auditory attention to the same extent as younger adults. The present results indicate that older adults are capable of suppressing irrelevant visual information in the context of cross-modal auditory attention, and thereby challenge the view that age-related attentional and cognitive decline is due to a general deficit in the ability to suppress irrelevant information. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. A physiologically based nonhomogeneous Poisson counter model of visual identification.

    PubMed

    Christensen, Jeppe H; Markussen, Bo; Bundesen, Claus; Kyllingsbæk, Søren

    2018-04-30

    A physiologically based nonhomogeneous Poisson counter model of visual identification is presented. The model was developed in the framework of a Theory of Visual Attention (Bundesen, 1990; Kyllingsbæk, Markussen, & Bundesen, 2012) and meant for modeling visual identification of objects that are mutually confusable and hard to see. The model assumes that the visual system's initial sensory response consists in tentative visual categorizations, which are accumulated by leaky integration of both transient and sustained components comparable with those found in spike density patterns of early sensory neurons. The sensory response (tentative categorizations) feeds independent Poisson counters, each of which accumulates tentative object categorizations of a particular type to guide overt identification performance. We tested the model's ability to predict the effect of stimulus duration on observed distributions of responses in a nonspeeded (pure accuracy) identification task with eight response alternatives. The time courses of correct and erroneous categorizations were well accounted for when the event-rates of competing Poisson counters were allowed to vary independently over time in a way that mimicked the dynamics of receptive field selectivity as found in neurophysiological studies. Furthermore, the initial sensory response yielded theoretical hazard rate functions that closely resembled empirically estimated ones. Finally, supplied with a Naka-Rushton type contrast gain control, the model provided an explanation for Bloch's law. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
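
    As a much-simplified illustration of this model class, the sketch below drives independent Poisson counters with a time-varying rate scaled by a Naka-Rushton contrast term and reads out the counter with the most evidence; every parameter value is an arbitrary stand-in rather than a fitted one.

      # Simplified sketch of a nonhomogeneous Poisson counter model with
      # Naka-Rushton contrast gain. Parameters are arbitrary illustrations.
      import numpy as np

      rng = np.random.default_rng(0)

      def naka_rushton(contrast, r_max=100.0, c50=0.2, n=2.0):
          return r_max * contrast**n / (contrast**n + c50**n)

      def run_trial(contrast, duration_ms, n_alternatives=8, target=0):
          dt = 0.001                                   # 1 ms time steps
          t = np.arange(0, duration_ms / 1000.0, dt)
          envelope = np.exp(-t / 0.1) + 0.3            # transient + sustained components
          gain = naka_rushton(contrast)
          counts = np.zeros(n_alternatives)
          for alt in range(n_alternatives):
              bias = 3.0 if alt == target else 1.0     # target drives its counter hardest
              rate = bias * gain * envelope            # spikes/s at each time step
              counts[alt] = rng.poisson(rate * dt).sum()
          return int(np.argmax(counts))                # reported identity

      responses = [run_trial(contrast=0.5, duration_ms=80) for _ in range(200)]
      print("accuracy:", np.mean([r == 0 for r in responses]))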

  7. Episodic Memory Retrieval Functionally Relies on Very Rapid Reactivation of Sensory Information.

    PubMed

    Waldhauser, Gerd T; Braun, Verena; Hanslmayr, Simon

    2016-01-06

    Episodic memory retrieval is assumed to rely on the rapid reactivation of sensory information that was present during encoding, a process termed "ecphory." We investigated the functional relevance of this scarcely understood process in two experiments in human participants. We presented stimuli to the left or right of fixation at encoding, followed by an episodic memory test with centrally presented retrieval cues. This allowed us to track the reactivation of lateralized sensory memory traces during retrieval. Successful episodic retrieval led to a very early (∼100-200 ms) reactivation of lateralized alpha/beta (10-25 Hz) electroencephalographic (EEG) power decreases in the visual cortex contralateral to the visual field at encoding. Applying rhythmic transcranial magnetic stimulation to interfere with early retrieval processing in the visual cortex led to decreased episodic memory performance specifically for items encoded in the visual field contralateral to the site of stimulation. These results demonstrate, for the first time, that episodic memory functionally relies on very rapid reactivation of sensory information. Remembering personal experiences requires a "mental time travel" to revisit sensory information perceived in the past. This process is typically described as a controlled, relatively slow process. However, by using electroencephalography to measure neural activity with a high time resolution, we show that such episodic retrieval entails a very rapid reactivation of sensory brain areas. Using transcranial magnetic stimulation to alter brain function during retrieval revealed that this early sensory reactivation is causally relevant for conscious remembering. These results give first neural evidence for a functional, preconscious component of episodic remembering. This provides new insight into the nature of human memory and may help in the understanding of psychiatric conditions that involve the automatic intrusion of unwanted memories. Copyright © 2016 the authors 0270-6474/16/360251-10$15.00/0.

  8. Play with your food! Sensory play is associated with tasting of fruits and vegetables in preschool children.

    PubMed

    Coulthard, Helen; Sealy, Annemarie

    2017-06-01

    The objective of the current study was to ascertain whether taking part in a sensory play activity with real fruits and vegetables (FV) can encourage tasting in preschool children, compared to a non-food activity or visual exposure to the activity. Three- to four-year-old preschool children (N = 62) were recruited from three preschool nursery classes at a school in Northamptonshire, UK. A between-participants experimental study was conducted, with each class assigned to one of three conditions: sensory FV play, sensory non-food play, and visual FV exposure. Parental reports of several baseline variables were taken: child baseline liking of the foods used in the study, parental and child FV consumption (portions/day), child neophobia, and child tactile sensitivity. The outcome measure was the number of fruits and vegetables tasted in a post-experiment taste test, comprising foods that had featured (n = 5) or had not featured (n = 3) in the activity. Analyses of covariance controlling for food neophobia and baseline liking of foods showed that, after the activity, children in the sensory FV play condition tried more FV than children in the non-food sensory play task (p < 0.001) and children in the visual FV exposure task (p < 0.001). This was true not only for the five foods used in the activity (p < 0.001) but also for the three foods that were not used in the activity (p < 0.05). Sensory play activities using fruits and vegetables may encourage FV tasting in preschool children more than non-food play or visual exposure alone. Long-term intervention studies need to be carried out to see whether these effects can be sustained over time. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Stochastic correlative firing for figure-ground segregation.

    PubMed

    Chen, Zhe

    2005-03-01

    Segregation of sensory inputs into separate objects is a central aspect of perception and arises in all sensory modalities. The figure-ground segregation problem requires identifying an object of interest in a complex scene, in many cases given binaural auditory or binocular visual observations. The computations required for visual and auditory figure-ground segregation share many common features and can be cast within a unified framework. Sensory perception can be viewed as a problem of optimizing information transmission. Here we suggest a stochastic correlative firing mechanism and an associative learning rule for figure-ground segregation in several classic sensory perception tasks, including the cocktail party problem in binaural hearing, binocular fusion of stereo images, and Gestalt grouping in motion perception.
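
    A hedged sketch of the correlative-firing intuition behind this kind of figure-ground grouping: units driven by the same latent object fire in a correlated way, and a covariance-style Hebbian rule binds them together. This is an illustrative stand-in, not the specific stochastic mechanism or learning rule proposed in the record, and all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two latent "objects" drive disjoint subsets of 8 input units; correlated firing within an
# object is the grouping cue (a stand-in for the record's correlative-firing idea).
n_units, n_steps, lr = 8, 5000, 0.01
membership = np.array([0, 0, 0, 0, 1, 1, 1, 1])      # ground-truth figure/ground assignment
W = np.zeros((n_units, n_units))                     # associative weights

for _ in range(n_steps):
    sources = rng.random(2) < 0.2                    # each object is stochastically active
    p_fire = 0.05 + 0.6 * sources[membership]        # units fire more when their object is active
    x = (rng.random(n_units) < p_fire).astype(float)  # stochastic (Bernoulli) spikes
    # Covariance-style Hebbian rule: strengthen weights between co-active units,
    # weaken them otherwise, with a slow decay term to keep the weights bounded.
    W += lr * (np.outer(x, x) - x.mean() ** 2 - 0.01 * W)

np.fill_diagonal(W, 0.0)
# Units belonging to the same object should end up far more strongly coupled.
same = W[membership[:, None] == membership[None, :]]
diff = W[membership[:, None] != membership[None, :]]
print("mean within-object weight :", same.mean())
print("mean between-object weight:", diff.mean())
```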

  10. Brain correlates of automatic visual change detection.

    PubMed

    Cléry, H; Andersson, F; Fonlupt, P; Gomot, M

    2013-07-15

    A number of studies support the presence of visual automatic detection of change, but little is known about the brain generators involved in such processing and about the modulation of brain activity according to the salience of the stimulus. The study presented here was designed to locate the brain activity elicited by unattended visual deviant and novel stimuli using fMRI. Seventeen adult participants were presented with a passive visual oddball sequence while performing a concurrent visual task. Variations in BOLD signal were observed in the modality-specific sensory cortex, but also in non-specific areas involved in preattentional processing of changing events. A degree-of-deviance effect was observed, since novel stimuli elicited more activity in the sensory occipital regions and at the medial frontal site than small changes. These findings could be compared to those obtained in the auditory modality and might suggest a "general" change detection process operating in several sensory modalities. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Visual Functions of the Thalamus

    PubMed Central

    Usrey, W. Martin; Alitto, Henry J.

    2017-01-01

    The thalamus is the heavily interconnected partner of the neocortex. All areas of the neocortex receive afferent input from and send efferent projections to specific thalamic nuclei. Through these connections, the thalamus serves to provide the cortex with sensory input, and to facilitate interareal cortical communication and motor and cognitive functions. In the visual system, the lateral geniculate nucleus (LGN) of the dorsal thalamus is the gateway through which visual information reaches the cerebral cortex. Visual processing in the LGN includes spatial and temporal influences on visual signals that serve to adjust response gain, transform the temporal structure of retinal activity patterns, and increase the signal-to-noise ratio of the retinal signal while preserving its basic content. This review examines recent advances in our understanding of LGN function and circuit organization and places these findings in a historical context. PMID:28217740

  12. Sensory adaptation for timing perception.

    PubMed

    Roseboom, Warrick; Linares, Daniel; Nishida, Shin'ya

    2015-04-22

    Recent sensory experience modifies subjective timing perception. For example, when visual events repeatedly lead auditory events, such as when the sound and video tracks of a movie are out of sync, subsequent vision-leads-audio presentations are reported as more simultaneous. This phenomenon could provide insights into the fundamental problem of how timing is represented in the brain, but the underlying mechanisms are poorly understood. Here, we show that the effect of recent experience on timing perception is not just subjective; recent sensory experience also modifies relative timing discrimination. This result indicates that recent sensory history alters the encoding of relative timing in sensory areas, excluding explanations of the subjective phenomenon based only on decision-level changes. The pattern of changes in timing discrimination suggests the existence of two sensory components, similar to those previously reported for visual spatial attributes: a lateral shift in the nonlinear transducer that maps relative timing into perceptual relative timing and an increase in transducer slope around the exposed timing. The existence of these components would suggest that previous explanations of how recent experience may change the sensory encoding of timing, such as changes in sensory latencies or simple implementations of neural population codes, cannot account for the effect of sensory adaptation on timing perception.
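
    The two adaptation components described above, a lateral shift of the timing transducer and a local increase in its slope, can be made concrete with a toy transducer. The tanh form and all parameter values below are assumptions chosen for illustration; this is not the authors' model.

```python
import numpy as np

def transducer(soa_ms, shift_ms=0.0, slope=0.02):
    """Toy nonlinear transducer mapping physical audio-visual asynchrony (ms, vision-leads
    positive) onto an internal relative-timing estimate. 'shift_ms' and 'slope' stand in for
    the two adaptation components described in the record."""
    return np.tanh(slope * (soa_ms - shift_ms))

def local_slope(soa_ms, **kw):
    """Finite-difference slope of the transducer; a steeper slope means finer discrimination."""
    eps = 1e-3
    return (transducer(soa_ms + eps, **kw) - transducer(soa_ms - eps, **kw)) / (2 * eps)

# Before vs. after adaptation to a +50 ms (vision-leads) lag; the adapted values are assumed.
before = dict(shift_ms=0.0, slope=0.02)
after = dict(shift_ms=50.0, slope=0.03)
print("internal estimate at 0 ms    :", transducer(0.0, **before), "->", transducer(0.0, **after))
print("discrimination slope at +50  :", local_slope(50.0, **before), "->", local_slope(50.0, **after))
```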

  13. [Interest of ultrasonographic guidance in paediatric regional anaesthesia].

    PubMed

    Dadure, C; Raux, O; Rochette, A; Capdevila, X

    2009-10-01

    Ultrasonographic guidance for regional anaesthesia has attracted considerable interest in children in recent years. Linear ultrasound probes with a 25 mm active surface area (or probes with a 38 mm active surface area in older children), operating at high frequencies in the range of 8-14 MHz, offer a good compromise between excellent resolution of superficial structures and adequate penetration depth. In children, the easiest ultrasound-guided blocks are axillary, femoral, fascia iliaca compartment, ilio-inguinal, para-umbilical, and caudal blocks. They permit safe and straightforward learning of these techniques. The main advantage of ultrasound-guided regional anaesthesia is the visualization of the different anatomical structures and of the approximate location of the needle tip. The other advantages of ultrasound-guided peripheral nerve blocks in children are a faster onset of sensory and motor block, a longer duration of sensory blockade, improved blockade quality, and a reduction in the amount of local anaesthetic injected. For central blocks, ultrasonographic guidance allows visualization of the spine and its contents. The spinous processes, ligamentum flavum, dura mater, conus medullaris, and cerebrospinal fluid are identifiable and provide information on the spine, the epidural space, and the distance between the epidural space and the skin. Finally, for caudal blocks, ultrasound permits evaluation of the anatomy of the caudal epidural space, especially the relationship of the sacral hiatus to the dural sac, and screening for occult spinal dysraphism. The benefit of this technique is the visualization of the targeted nerves or spaces and of the spread of the injected local anaesthetic.

  14. The contributions of vision and haptics to reaching and grasping

    PubMed Central

    Stone, Kayla D.; Gonzalez, Claudia L. R.

    2015-01-01

    This review aims to provide a comprehensive outlook on the sensory (visual and haptic) contributions to reaching and grasping. The focus is on studies in developing children, normal, and neuropsychological populations, and in sensory-deprived individuals. Studies have suggested a right-hand/left-hemisphere specialization for visually guided grasping and a left-hand/right-hemisphere specialization for haptically guided object recognition. This poses the interesting possibility that when vision is not available and grasping relies heavily on the haptic system, there is an advantage to use the left hand. We review the evidence for this possibility and dissect the unique contributions of the visual and haptic systems to grasping. We ultimately discuss how the integration of these two sensory modalities shape hand preference. PMID:26441777

  15. Visual and acoustic communication in non-human animals: a comparison.

    PubMed

    Rosenthal, G G; Ryan, M J

    2000-09-01

    The visual and auditory systems are two major sensory modalities employed in communication. Although communication in these two sensory modalities can serve analogous functions and evolve in response to similar selection forces, the two systems also operate under different constraints imposed by the environment and by the degree to which these sensory modalities are recruited for non-communication functions. The research traditions in each also tend to differ: studies of the mechanisms of acoustic communication tend to take a more reductionist tack, often concentrating on single signal parameters, whereas studies of visual communication tend to be more concerned with multivariate signal arrays in natural environments and with higher-level processing of such signals. Each research tradition would benefit by being more expansive in its approach.

  16. Mouse auditory cortex differs from visual and somatosensory cortices in the laminar distribution of cytochrome oxidase and acetylcholinesterase.

    PubMed

    Anderson, L A; Christianson, G B; Linden, J F

    2009-02-03

    Cytochrome oxidase (CYO) and acetylcholinesterase (AChE) staining density varies across the cortical layers in many sensory areas. The laminar variations likely reflect differences between the layers in levels of metabolic activity and cholinergic modulation. The question of whether these laminar variations differ between primary sensory cortices has never been systematically addressed in the same set of animals, since most studies of sensory cortex focus on a single sensory modality. Here, we compared the laminar distribution of CYO and AChE activity in the primary auditory, visual, and somatosensory cortices of the mouse, using Nissl-stained sections to define laminar boundaries. Interestingly, for both CYO and AChE, laminar patterns of enzyme activity were similar in the visual and somatosensory cortices, but differed in the auditory cortex. In the visual and somatosensory areas, staining densities for both enzymes were highest in layers III/IV or IV and in lower layer V. In the auditory cortex, CYO activity showed a reliable peak only at the layer III/IV border, while AChE distribution was relatively homogeneous across layers. These results suggest that laminar patterns of metabolic activity and cholinergic influence are similar in the mouse visual and somatosensory cortices, but differ in the auditory cortex.

  17. A Review of the Benefits of Nature Experiences: More Than Meets the Eye

    PubMed Central

    Franco, Lara S.; Shanahan, Danielle F.

    2017-01-01

    Evidence that experiences of nature can benefit people has accumulated rapidly. Yet perhaps because of the domination of the visual sense in humans, most research has focused on the visual aspects of nature experiences. However, humans are multisensory, and it seems likely that many benefits are delivered through the non-visual senses and these are potentially avenues through which a physiological mechanism could occur. Here we review the evidence around these lesser studied sensory pathways—through sound, smell, taste, touch, and three non-sensory pathways. Natural sounds and smells underpin experiences of nature for many people, and this may well be rooted in evolutionary psychology. Tactile experiences of nature, particularly beyond animal petting, are understudied yet potentially fundamentally important. Tastes of nature, through growing and consuming natural foods, have been linked with a range of health and well-being benefits. Beyond the five senses, evidence is emerging for other non-visual pathways for nature experiences to be effective. These include ingestion or inhalation of phytoncides, negative air ions and microbes. We conclude that (i) these non-visual avenues are potentially important for delivering benefits from nature experiences; (ii) the evidence base is relatively weak and often based on correlational studies; and (iii) deeper exploration of these sensory and non-sensory avenues is needed. PMID:28763021

  18. A Review of the Benefits of Nature Experiences: More Than Meets the Eye.

    PubMed

    Franco, Lara S; Shanahan, Danielle F; Fuller, Richard A

    2017-08-01

    Evidence that experiences of nature can benefit people has accumulated rapidly. Yet perhaps because of the domination of the visual sense in humans, most research has focused on the visual aspects of nature experiences. However, humans are multisensory, and it seems likely that many benefits are delivered through the non-visual senses and these are potentially avenues through which a physiological mechanism could occur. Here we review the evidence around these lesser studied sensory pathways-through sound, smell, taste, touch, and three non-sensory pathways. Natural sounds and smells underpin experiences of nature for many people, and this may well be rooted in evolutionary psychology. Tactile experiences of nature, particularly beyond animal petting, are understudied yet potentially fundamentally important. Tastes of nature, through growing and consuming natural foods, have been linked with a range of health and well-being benefits. Beyond the five senses, evidence is emerging for other non-visual pathways for nature experiences to be effective. These include ingestion or inhalation of phytoncides, negative air ions and microbes. We conclude that (i) these non-visual avenues are potentially important for delivering benefits from nature experiences; (ii) the evidence base is relatively weak and often based on correlational studies; and (iii) deeper exploration of these sensory and non-sensory avenues is needed.

  19. Visual Information Present in Infragranular Layers of Mouse Auditory Cortex.

    PubMed

    Morrill, Ryan J; Hasenstaub, Andrea R

    2018-03-14

    The cerebral cortex is a major hub for the convergence and integration of signals from across the sensory modalities; sensory cortices, including primary regions, are no exception. Here we show that visual stimuli influence neural firing in the auditory cortex of awake male and female mice, using multisite probes to sample single units across multiple cortical layers. We demonstrate that visual stimuli influence firing in both primary and secondary auditory cortex. We then determine the laminar location of recording sites through electrode track tracing with fluorescent dye and optogenetic identification using layer-specific markers. Spiking responses to visual stimulation occur deep in auditory cortex and are particularly prominent in layer 6. Visual modulation of firing rate occurs more frequently at areas with secondary-like auditory responses than those with primary-like responses. Auditory cortical responses to drifting visual gratings are not orientation-tuned, unlike visual cortex responses. The deepest cortical layers thus appear to be an important locus for cross-modal integration in auditory cortex. SIGNIFICANCE STATEMENT The deepest layers of the auditory cortex are often considered its most enigmatic, possessing a wide range of cell morphologies and atypical sensory responses. Here we show that, in mouse auditory cortex, these layers represent a locus of cross-modal convergence, containing many units responsive to visual stimuli. Our results suggest that this visual signal conveys the presence and timing of a stimulus rather than specifics about that stimulus, such as its orientation. These results shed light on both how and what types of cross-modal information is integrated at the earliest stages of sensory cortical processing. Copyright © 2018 the authors 0270-6474/18/382854-09$15.00/0.

  20. Age-related changes in human posture control: Sensory organization tests

    NASA Technical Reports Server (NTRS)

    Peterka, R. J.; Black, F. O.

    1989-01-01

    Postural control was measured in 214 human subjects ranging in age from 7 to 81 years. Sensory organization tests measured the magnitude of anterior-posterior body sway during six 21 s trials in which visual and somatosensory orientation cues were altered (by rotating the visual surround and support surface in proportion to the subject's sway) or vision eliminated (eyes closed) in various combinations. No age-related increase in postural sway was found for subjects standing on a fixed support surface with eyes open or closed. However, age-related increases in sway were found for conditions involving altered visual or somatosensory cues. Subjects older than about 55 years showed the largest sway increases. Subjects younger than about 15 years were also sensitive to alteration of sensory cues. On average, the older subjects were more affected by altered visual cues whereas younger subjects had more difficulty with altered somatosensory cues.
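
    A minimal sketch of the kind of sway quantification used in sensory organization testing: peak-to-peak and RMS anterior-posterior sway over a 21-s trial, plus a normalized score. The simulated trace and the 12.5-degree normalization constant (a commonly quoted theoretical sway limit) are assumptions for illustration, not values taken from the record.

```python
import numpy as np

rng = np.random.default_rng(2)

def sway_metrics(theta_deg, limit_deg=12.5):
    """Peak-to-peak and RMS anterior-posterior sway plus an equilibrium-style score.
    The 12.5 deg normalization is an assumption, not a value from the record."""
    p2p = theta_deg.max() - theta_deg.min()
    rms = np.sqrt(np.mean((theta_deg - theta_deg.mean()) ** 2))
    score = max(0.0, 100.0 * (1.0 - p2p / limit_deg))
    return p2p, rms, score

# Simulate a 21-s AP sway trace (degrees) as low-pass filtered noise -- stand-in data only.
t = np.arange(0, 21, 0.01)
noise = rng.normal(0, 1, t.size)
kernel = np.exp(-np.arange(0, 2, 0.01) / 0.5)
theta = 1.5 * np.convolve(noise, kernel / kernel.sum(), mode="same")

p2p, rms, score = sway_metrics(theta)
print(f"peak-to-peak sway: {p2p:.2f} deg, RMS sway: {rms:.2f} deg, score: {score:.1f}")
```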

  1. Emotional facilitation of sensory processing in the visual cortex.

    PubMed

    Schupp, Harald T; Junghöfer, Markus; Weike, Almut I; Hamm, Alfons O

    2003-01-01

    A key function of emotion is the preparation for action. However, organization of successful behavioral strategies depends on efficient stimulus encoding. The present study tested the hypothesis that perceptual encoding in the visual cortex is modulated by the emotional significance of visual stimuli. Event-related brain potentials were measured while subjects viewed pleasant, neutral, and unpleasant pictures. Early selective encoding of pleasant and unpleasant images was associated with a posterior negativity, indicating primary sources of activation in the visual cortex. The study also replicated previous findings in that affective cues also elicited enlarged late positive potentials, indexing increased stimulus relevance at higher-order stages of stimulus processing. These results support the hypothesis that sensory encoding of affective stimuli is facilitated implicitly by natural selective attention. Thus, the affect system not only modulates motor output (i.e., favoring approach or avoidance dispositions), but already operates at an early level of sensory encoding.

  2. Sensory determinants of the autonomous sensory meridian response (ASMR): understanding the triggers.

    PubMed

    Barratt, Emma L; Spence, Charles; Davis, Nick J

    2017-01-01

    The autonomous sensory meridian response (ASMR) is an atypical sensory phenomenon involving electrostatic-like tingling sensations in response to certain sensory, primarily audio-visual, stimuli. The current study used an online questionnaire, completed by 130 people who self-reported experiencing ASMR. We aimed to extend preliminary investigations into the experience, and establish key multisensory factors contributing to the successful induction of ASMR through online media. Aspects such as timing and trigger load, atmosphere, and characteristics of ASMR content, ideal spatial distance from various types of stimuli, visual characteristics, context and use of ASMR triggers, and audio preferences are explored. Lower-pitched, complex sounds were found to be especially effective triggers, as were slow-paced, detail-focused videos. Conversely, background music inhibited the sensation for many respondents. These results will help in designing media for ASMR induction.

  3. Sensory determinants of the autonomous sensory meridian response (ASMR): understanding the triggers

    PubMed Central

    Barratt, Emma L.; Spence, Charles

    2017-01-01

    The autonomous sensory meridian response (ASMR) is an atypical sensory phenomenon involving electrostatic-like tingling sensations in response to certain sensory, primarily audio-visual, stimuli. The current study used an online questionnaire, completed by 130 people who self-reported experiencing ASMR. We aimed to extend preliminary investigations into the experience, and establish key multisensory factors contributing to the successful induction of ASMR through online media. Aspects such as timing and trigger load, atmosphere, and characteristics of ASMR content, ideal spatial distance from various types of stimuli, visual characteristics, context and use of ASMR triggers, and audio preferences are explored. Lower-pitched, complex sounds were found to be especially effective triggers, as were slow-paced, detail-focused videos. Conversely, background music inhibited the sensation for many respondents. These results will help in designing media for ASMR induction. PMID:29018601

  4. Equilibration and Sensory Overload in the Pre-School Child: Some Effects of Children's Television Programming.

    ERIC Educational Resources Information Center

    Miller, Thomas W.

    This paper reports an attempt to research sensory overstimulation in a variety of children's television programs by rating the level of visual sensory stimulation, auditory sensory stimulation, verbal response patterns and nonverbal response patterns in 45 television programs designed for pre-school children. The Television Rating Inventory (TVRI)…

  5. Attention stabilizes the shared gain of V4 populations

    PubMed Central

    Rabinowitz, Neil C; Goris, Robbe L; Cohen, Marlene; Simoncelli, Eero P

    2015-01-01

    Responses of sensory neurons represent stimulus information, but are also influenced by internal state. For example, when monkeys direct their attention to a visual stimulus, the response gain of specific subsets of neurons in visual cortex changes. Here, we develop a functional model of population activity to investigate the structure of this effect. We fit the model to the spiking activity of bilateral neural populations in area V4, recorded while the animal performed a stimulus discrimination task under spatial attention. The model reveals four separate time-varying shared modulatory signals, the dominant two of which each target task-relevant neurons in one hemisphere. In attention-directed conditions, the associated shared modulatory signal decreases in variance. This finding provides an interpretable and parsimonious explanation for previous observations that attention reduces variability and noise correlations of sensory neurons. Finally, the recovered modulatory signals reflect previous reward, and are predictive of subsequent choice behavior. DOI: http://dx.doi.org/10.7554/eLife.08998.001 PMID:26523390
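
    The core idea of a shared multiplicative modulator can be illustrated with a toy modulated-Poisson population. This is not the authors' fitted model: the lognormal gain, the firing rates, and the mapping of attention onto a smaller gain SD are illustrative assumptions. Shrinking the variance of the shared gain reduces both Fano factors and noise correlations, the signature discussed in the record.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_population(n_neurons=50, n_trials=400, gain_sd=0.3):
    """Spike counts ~ Poisson(base_rate * shared gain). A single multiplicative gain,
    shared across neurons and varying across trials, inflates count variance and
    noise correlations; reducing its SD reverses both effects."""
    base = rng.uniform(5, 20, n_neurons)                 # mean counts per trial
    gain = np.exp(rng.normal(0.0, gain_sd, n_trials))    # shared lognormal modulator
    counts = rng.poisson(np.outer(gain, base))           # trials x neurons
    fano = counts.var(axis=0) / counts.mean(axis=0)
    corr = np.corrcoef(counts, rowvar=False)
    mean_rsc = corr[np.triu_indices(n_neurons, k=1)].mean()
    return fano.mean(), mean_rsc

for label, sd in [("attention away (large gain SD)", 0.3), ("attention toward (small gain SD)", 0.1)]:
    fano, rsc = simulate_population(gain_sd=sd)
    print(f"{label}: mean Fano ~ {fano:.2f}, mean noise correlation ~ {rsc:.3f}")
```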

  6. Asymmetries of the human social brain in the visual, auditory and chemical modalities

    PubMed Central

    Brancucci, Alfredo; Lucci, Giuliana; Mazzatenta, Andrea; Tommasi, Luca

    2008-01-01

    Structural and functional asymmetries are present in many regions of the human brain responsible for motor control, sensory and cognitive functions and communication. Here, we focus on hemispheric asymmetries underlying the domain of social perception, broadly conceived as the analysis of information about other individuals based on acoustic, visual and chemical signals. By means of these cues the brain establishes the border between ‘self’ and ‘other’, and interprets the surrounding social world in terms of the physical and behavioural characteristics of conspecifics essential for impression formation and for creating bonds and relationships. We show that, considered from the standpoint of single- and multi-modal sensory analysis, the neural substrates of the perception of voices, faces, gestures, smells and pheromones, as evidenced by modern neuroimaging techniques, are characterized by a general pattern of right-hemispheric functional asymmetry that might benefit from other aspects of hemispheric lateralization rather than constituting a true specialization for social information. PMID:19064350

  7. Cortical Feedback Regulates Feedforward Retinogeniculate Refinement

    PubMed Central

    Thompson, Andrew D; Picard, Nathalie; Min, Lia; Fagiolini, Michela; Chen, Chinfei

    2016-01-01

    According to the prevailing view of neural development, sensory pathways develop sequentially in a feedforward manner, whereby each local microcircuit refines and stabilizes before directing the wiring of its downstream target. In the visual system, retinal circuits are thought to mature first and direct refinement in the thalamus, after which cortical circuits refine with experience-dependent plasticity. In contrast, we now show that feedback from cortex to thalamus critically regulates refinement of the retinogeniculate projection during a discrete window in development, beginning at postnatal day 20 in mice. Disrupting cortical activity during this window, pharmacologically or chemogenetically, increases the number of retinal ganglion cells innervating each thalamic relay neuron. These results suggest that primary sensory structures develop through the concurrent and interdependent remodeling of subcortical and cortical circuits in response to sensory experience, rather than through a simple feedforward process. Our findings also highlight an unexpected function for the corticothalamic projection. PMID:27545712

  8. Transformation priming helps to disambiguate sudden changes of sensory inputs.

    PubMed

    Pastukhov, Alexander; Vivian-Griffiths, Solveiga; Braun, Jochen

    2015-11-01

    Retinal input is riddled with abrupt transients due to self-motion, changes in illumination, object-motion, etc. Our visual system must correctly interpret each of these changes to keep visual perception consistent and sensitive. This poses an enormous challenge, as many transients are highly ambiguous in that they are consistent with many alternative physical transformations. Here we investigated inter-trial effects in three situations with sudden and ambiguous transients, each presenting two alternative appearances (rotation-reversing structure-from-motion, polarity-reversing shape-from-shading, and streaming-bouncing object collisions). In every situation, we observed priming of transformations as the outcome perceived in earlier trials tended to repeat in subsequent trials and this repetition was contingent on perceptual experience. The observed priming was specific to transformations and did not originate in priming of perceptual states preceding a transient. Moreover, transformation priming was independent of attention and specific to low level stimulus attributes. In summary, we show how "transformation priors" and experience-driven updating of such priors helps to disambiguate sudden changes of sensory inputs. We discuss how dynamic transformation priors can be instantiated as "transition energies" in an "energy landscape" model of the visual perception. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Differential effects of ADORA2A gene variations in pre-attentive visual sensory memory subprocesses.

    PubMed

    Beste, Christian; Stock, Ann-Kathrin; Ness, Vanessa; Epplen, Jörg T; Arning, Larissa

    2012-08-01

    The ADORA2A gene encodes the adenosine A(2A) receptor that is highly expressed in the striatum where it plays a role in modulating glutamatergic and dopaminergic transmission. Glutamatergic signaling has been suggested to play a pivotal role in cognitive functions related to the pre-attentive processing of external stimuli. Yet, the precise molecular mechanism of these processes is poorly understood. Therefore, we aimed to investigate whether ADORA2A gene variation has modulating effects on visual pre-attentive sensory memory processing. Studying two polymorphisms, rs5751876 and rs2298383, in 199 healthy control subjects who performed a partial-report paradigm, we find that ADORA2A variation is associated with differences in the efficiency of pre-attentive sensory memory sub-processes. We show that especially the initial visual availability of stimulus information is rendered more efficiently in the homozygous rare genotype groups. Processes related to the transfer of information into working memory and the duration of visual sensory (iconic) memory are compromised in the homozygous rare genotype groups. Our results show a differential genotype-dependent modulation of pre-attentive sensory memory sub-processes. Hence, we assume that this modulation may be due to differential effects of increased adenosine A(2A) receptor signaling on glutamatergic transmission and striatal medium spiny neuron (MSN) interaction. Copyright © 2011 Elsevier B.V. and ECNP. All rights reserved.

  10. Processes to Preserve Spice and Herb Quality and Sensory Integrity During Pathogen Inactivation.

    PubMed

    Duncan, Susan E; Moberg, Kayla; Amin, Kemia N; Wright, Melissa; Newkirk, Jordan J; Ponder, Monica A; Acuff, Gary R; Dickson, James S

    2017-05-01

    Selected processing methods, demonstrated to be effective at reducing Salmonella, were assessed to determine if spice and herb quality was affected. Black peppercorn, cumin seed, oregano, and onion powder were irradiated to a target dose of 8 kGy. Two additional processes were examined for whole black peppercorns and cumin seeds: ethylene oxide (EtO) fumigation and vacuum-assisted steam (82.22 °C, 7.5 psia). Treated and untreated spices/herbs were compared (visual, odor) using sensory similarity testing protocols (α = 0.20; β = 0.05; proportion of discriminators: 20%) to determine if processing altered sensory quality. Analytical assessment of quality (color, water activity, and volatile chemistry) was completed. Irradiation did not alter visual or odor sensory quality of black peppercorn, cumin seed, or oregano but created differences in onion powder, which was lighter (higher L*) and more red (higher a*) in color, and resulted in nearly complete loss of measured volatile compounds. EtO processing did not create detectable odor or appearance differences in black peppercorn; however, visual and odor sensory quality differences, supported by changes in color (higher b*; lower L*) and increased concentrations of most volatiles, were detected for cumin seeds. Steam processing of black peppercorn resulted in perceptible odor differences, supported by increased concentration of monoterpene volatiles and loss of all sesquiterpenes; only visual differences were noted for cumin seed. An important step in process validation is the verification that no effect is detectable from a sensory perspective. © 2017 The Authors. Journal of Food Science published by Wiley Periodicals, Inc. on behalf of Institute of Food Technologists.

  11. Push-Pull Receptive Field Organization and Synaptic Depression: Mechanisms for Reliably Encoding Naturalistic Stimuli in V1

    PubMed Central

    Kremkow, Jens; Perrinet, Laurent U.; Monier, Cyril; Alonso, Jose-Manuel; Aertsen, Ad; Frégnac, Yves; Masson, Guillaume S.

    2016-01-01

    Neurons in the primary visual cortex are known for responding vigorously but with high variability to classical stimuli such as drifting bars or gratings. By contrast, natural scenes are encoded more efficiently by sparse and temporally precise spiking responses. We used a conductance-based model of the visual system in higher mammals to investigate how two specific features of the thalamo-cortical pathway, namely push-pull receptive field organization and fast synaptic depression, can contribute to this contextual reshaping of V1 responses. By comparing cortical dynamics evoked respectively by natural vs. artificial stimuli in a comprehensive parametric space analysis, we demonstrate that the reliability and sparseness of the spiking responses during natural vision are not a mere consequence of the increased bandwidth in the sensory input spectrum. Rather, they result from the combined impacts of fast synaptic depression and push-pull inhibition, the latter acting for natural scenes as a form of “effective” feed-forward inhibition as demonstrated in other sensory systems. Thus, the combination of feedforward-like inhibition with fast thalamo-cortical synaptic depression by simple cells receiving a direct structured input from thalamus constitutes a generic computational mechanism for generating a sparse and reliable encoding of natural sensory events. PMID:27242445
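
    Fast thalamo-cortical depression, one of the two ingredients above, can be sketched with a standard Tsodyks-Markram-style depressing synapse. This generic model is used here only as a stand-in for the conductance-based implementation in the record, and the rates and time constants are assumptions. Sustained, grating-like high-rate drive depletes synaptic resources, whereas sparse, natural-scene-like transients arrive onto a recovered synapse and are transmitted more strongly per spike.

```python
import numpy as np

def depressing_synapse(spike_train, dt=0.001, tau_rec=0.5, U=0.5):
    """Tsodyks-Markram-style short-term depression. 'x' is the fraction of available
    resources; each presynaptic spike uses a fraction U of what remains, and the
    resources recover toward 1 with time constant tau_rec."""
    x, drive = 1.0, []
    for s in spike_train:
        if s:
            drive.append(U * x)             # effective synaptic drive of this spike
            x -= U * x                      # resources consumed by the spike
        x += (1.0 - x) * dt / tau_rec       # recovery toward full availability
    return np.array(drive)

rng = np.random.default_rng(4)
dt, T = 0.001, 2.0
n = int(T / dt)
sustained = rng.random(n) < 80 * dt          # grating-like sustained ~80 Hz drive
sparse = np.zeros(n, dtype=bool)
sparse[::400] = True                         # natural-scene-like sparse transients (~5 Hz)

print("mean drive per spike, sustained:", depressing_synapse(sustained).mean())
print("mean drive per spike, sparse   :", depressing_synapse(sparse).mean())
```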

  12. Areas activated during naturalistic reading comprehension overlap topological visual, auditory, and somatomotor maps.

    PubMed

    Sood, Mariam R; Sereno, Martin I

    2016-08-01

    Cortical mapping techniques using fMRI have been instrumental in identifying the boundaries of topological (neighbor-preserving) maps in early sensory areas. The presence of topological maps beyond early sensory areas raises the possibility that they might play a significant role in other cognitive systems, and that topological mapping might help to delineate areas involved in higher cognitive processes. In this study, we combine surface-based visual, auditory, and somatomotor mapping methods with a naturalistic reading comprehension task in the same group of subjects to provide a qualitative and quantitative assessment of the cortical overlap between sensory-motor maps in all major sensory modalities, and reading processing regions. Our results suggest that cortical activation during naturalistic reading comprehension overlaps more extensively with topological sensory-motor maps than has been heretofore appreciated. Reading activation in regions adjacent to occipital lobe and inferior parietal lobe almost completely overlaps visual maps, whereas a significant portion of frontal activation for reading in dorsolateral and ventral prefrontal cortex overlaps both visual and auditory maps. Even classical language regions in superior temporal cortex are partially overlapped by topological visual and auditory maps. By contrast, the main overlap with somatomotor maps is restricted to a small region on the anterior bank of the central sulcus near the border between the face and hand representations of M-I. Hum Brain Mapp 37:2784-2810, 2016. © 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.
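
    The overlap quantification described above reduces, at its simplest, to comparing binary vertex labels on the cortical surface. The sketch below uses random toy masks in place of real thresholded fMRI maps and reports the fraction of "reading" vertices falling inside a topological map together with a Dice coefficient; it illustrates the kind of metric involved, not the authors' exact pipeline.

```python
import numpy as np

rng = np.random.default_rng(5)

def overlap_stats(activation_mask, map_mask):
    """Fraction of activated surface vertices that fall inside a topological map ROI,
    plus the Dice coefficient of the two binary masks."""
    inter = np.logical_and(activation_mask, map_mask).sum()
    frac_act_in_map = inter / max(activation_mask.sum(), 1)
    dice = 2 * inter / max(activation_mask.sum() + map_mask.sum(), 1)
    return frac_act_in_map, dice

# Toy stand-ins for per-vertex labels on a cortical surface; a real analysis would use
# thresholded activation and map labels sampled onto the subject's surface mesh.
n_vertices = 10000
reading = rng.random(n_vertices) < 0.10
visual_map = rng.random(n_vertices) < 0.25

frac, dice = overlap_stats(reading, visual_map)
print(f"fraction of reading activation inside the visual map: {frac:.2f}, Dice: {dice:.2f}")
```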

  13. Relational Associative Learning Induces Cross-Modal Plasticity in Early Visual Cortex

    PubMed Central

    Headley, Drew B.; Weinberger, Norman M.

    2015-01-01

    Neurobiological theories of memory posit that the neocortex is a storage site of declarative memories, a hallmark of which is the association of two arbitrary neutral stimuli. Early sensory cortices, once assumed uninvolved in memory storage, recently have been implicated in associations between neutral stimuli and reward or punishment. We asked whether links between neutral stimuli also could be formed in early visual or auditory cortices. Rats were presented with a tone paired with a light using a sensory preconditioning paradigm that enabled later evaluation of successful association. Subjects that acquired this association developed enhanced sound evoked potentials in their primary and secondary visual cortices. Laminar recordings localized this potential to cortical Layers 5 and 6. A similar pattern of activation was elicited by microstimulation of primary auditory cortex in the same subjects, consistent with a cortico-cortical substrate of association. Thus, early sensory cortex has the capability to form neutral stimulus associations. This plasticity may constitute a declarative memory trace between sensory cortices. PMID:24275832

  14. Experimental and Computational Studies of Cortical Neural Network Properties Through Signal Processing

    NASA Astrophysics Data System (ADS)

    Clawson, Wesley Patrick

    Previous studies, both theoretical and experimental, of network-level dynamics in the cerebral cortex show evidence for a statistical phenomenon called criticality, a phenomenon originally studied in the context of phase transitions in physical systems and associated with favorable information processing in the brain. The focus of this thesis is to expand upon past results with new experimentation and modeling to show a relationship between criticality and the ability to detect and discriminate sensory input. A line of theoretical work predicts maximal sensory discrimination as a functional benefit of criticality, which can be characterized using the mutual information between the sensory input (visual stimulus) and the neural response. The primary finding of our experiments in turtle visual cortex, together with neuronal network modeling, confirms this theoretical prediction: sensory discrimination is maximized when visual cortex operates near criticality. In addition to presenting this primary finding in detail, this thesis also addresses preliminary results on change-point detection in experimentally measured cortical dynamics.
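
    The discrimination measure referred to above, the mutual information between visual stimulus and neural response, can be computed from a stimulus-response count table with a plug-in estimator. The confusion tables below are toy numbers chosen only to show that a stronger diagonal (better discrimination) yields higher information; they are not data from the thesis.

```python
import numpy as np

def mutual_information(joint_counts):
    """Plug-in estimate of I(S;R) in bits from a stimulus-by-response count table."""
    p = joint_counts / joint_counts.sum()
    ps = p.sum(axis=1, keepdims=True)      # stimulus marginal
    pr = p.sum(axis=0, keepdims=True)      # response marginal
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (ps @ pr)[nz])))

# Toy confusion tables: rows = which of 4 visual stimuli was shown, columns = decoded response.
sharp_discrimination = np.array([[40,  5,  3,  2],
                                 [ 4, 41,  3,  2],
                                 [ 3,  4, 40,  3],
                                 [ 2,  3,  4, 41]])
poor_discrimination = np.full((4, 4), 12) + np.eye(4, dtype=int) * 4

print("I(S;R), sharp discrimination (toy):", round(mutual_information(sharp_discrimination), 2), "bits")
print("I(S;R), poor discrimination  (toy):", round(mutual_information(poor_discrimination), 2), "bits")
```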

  15. The Use of a Tactile-Vision Sensory Substitution System as an Augmentative Tool for Individuals with Visual Impairments

    ERIC Educational Resources Information Center

    Williams, Michael D.; Ray, Christopher T.; Griffith, Jennifer; De l'Aune, William

    2011-01-01

    The promise of novel technological strategies and solutions to assist persons with visual impairments (that is, those who are blind or have low vision) is frequently discussed and held to be widely beneficial in countless applications and daily activities. One such approach involving a tactile-vision sensory substitution modality as a mechanism to…

  16. Fundamental Visual Representations of Social Cognition in ASD

    DTIC Science & Technology

    2015-10-01

    …autism spectrum disorder as assessed by high density electrical mapping… Russo, N. N., & Foxe, J. J. (2013). Atypical cortical representation of peripheral visual space in children with an autism spectrum disorder. European Journal of Neuroscience, 38(1), 2125-2138. …Sensory processing issues are prevalent in the autism spectrum (ASD) population, and sensory adaptation can be a potential biomarker…

  17. Visual learning with reduced adaptation is eccentricity-specific.

    PubMed

    Harris, Hila; Sagi, Dov

    2018-01-12

    Visual learning is known to be specific to the trained target location, showing little transfer to untrained locations. Recently, learning was shown to transfer across equal-eccentricity retinal locations when sensory adaptation due to repetitive stimulation was minimized. It was suggested that learning transfers to previously untrained locations when the learned representation is location invariant, with sensory adaptation introducing location-dependent representations, thus preventing transfer. Spatial invariance may also fail when the trained and tested locations are at different distances from the center of gaze (different retinal eccentricities), due to differences in the corresponding low-level cortical representations (e.g., allocated cortical area decreases with eccentricity). Thus, if learning improves performance by better classifying target-dependent early visual representations, generalization is predicted to fail when locations of different retinal eccentricities are trained and tested in the absence of sensory adaptation. Here, using the texture discrimination task, we show specificity of learning across different retinal eccentricities (4-8°) using reduced adaptation training. The existence of generalization across equal-eccentricity locations but not across different eccentricities demonstrates that learning accesses visual representations preceding location-independent representations, with specificity of learning explained by inhomogeneous sensory representation.

  18. Multi-modal distraction: insights from children's limited attention.

    PubMed

    Matusz, Pawel J; Broadbent, Hannah; Ferrari, Jessica; Forrest, Benjamin; Merkley, Rebecca; Scerif, Gaia

    2015-03-01

    How does the multi-sensory nature of stimuli influence information processing? Cognitive systems with limited selective attention can elucidate these processes. Six-year-olds, 11-year-olds and 20-year-olds engaged in a visual search task that required them to detect a pre-defined coloured shape under conditions of low or high visual perceptual load. On each trial, a peripheral distractor that could be either compatible or incompatible with the current target colour was presented either visually, auditorily or audiovisually. Unlike unimodal distractors, audiovisual distractors elicited reliable compatibility effects across the two levels of load in adults and in the older children, but high visual load significantly reduced distraction for all children, especially the youngest participants. This study provides the first demonstration that multi-sensory distraction has powerful effects on selective attention: Adults and older children alike allocate attention to potentially relevant information across multiple senses. However, poorer attentional resources can, paradoxically, shield the youngest children from the deleterious effects of multi-sensory distraction. Furthermore, we highlight how developmental research can enrich the understanding of distinct mechanisms controlling adult selective attention in multi-sensory environments. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Prestimulus neural oscillations inhibit visual perception via modulation of response gain.

    PubMed

    Chaumon, Maximilien; Busch, Niko A

    2014-11-01

    The ongoing state of the brain radically affects how it processes sensory information. How does this ongoing brain activity interact with the processing of external stimuli? Spontaneous oscillations in the alpha range are thought to inhibit sensory processing, but little is known about the psychophysical mechanisms of this inhibition. We recorded ongoing brain activity with EEG while human observers performed a visual detection task with stimuli of different contrast intensities. To move beyond qualitative description, we formally compared psychometric functions obtained under different levels of ongoing alpha power and evaluated the inhibitory effect of ongoing alpha oscillations in terms of contrast or response gain models. This procedure opens the way to understanding the actual functional mechanisms by which ongoing brain activity affects visual performance. We found that strong prestimulus occipital alpha oscillations (but not more anterior mu oscillations) reduce performance most strongly for stimuli of the highest intensities tested. This inhibitory effect is best explained by a divisive reduction of response gain. Ongoing occipital alpha oscillations thus reflect changes in the visual system's input/output transformation that are independent of the sensory input to the system. They selectively scale the system's response, rather than change its sensitivity to sensory information.
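
    The contrast gain versus response gain distinction drawn above can be made explicit with a Naka-Rushton contrast response function: response gain divisively scales the output, whereas contrast gain shifts the semi-saturation point. The parameter values below are illustrative assumptions; the point is the qualitative pattern that a divisive response-gain reduction hurts the highest contrasts most, as reported in the record, while contrast gain hurts intermediate contrasts most.

```python
import numpy as np

def contrast_response(c, r_max=1.0, c50=0.2, n=2.0):
    """Naka-Rushton contrast response function."""
    return r_max * c**n / (c**n + c50**n)

contrasts = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
baseline = contrast_response(contrasts)

# Two candidate effects of strong prestimulus alpha (parameter changes are illustrative):
response_gain = contrast_response(contrasts, r_max=0.7)   # divisive scaling of the output
contrast_gain = contrast_response(contrasts, c50=0.3)     # rightward shift of sensitivity

for c, b, rg, cg in zip(contrasts, baseline, response_gain, contrast_gain):
    print(f"c={c:.2f}  baseline={b:.2f}  response-gain={rg:.2f}  contrast-gain={cg:.2f}")
# Response gain reduces the output most at the highest contrasts; contrast gain produces
# its largest losses at intermediate contrasts near the semi-saturation point.
```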

  20. The sensory substrate of multimodal communication in brown-headed cowbirds: are females sensory 'specialists' or 'generalists'?

    PubMed

    Ronald, Kelly L; Sesterhenn, Timothy M; Fernandez-Juricic, Esteban; Lucas, Jeffrey R

    2017-11-01

    Many animals communicate with multimodal signals. While we have an understanding of multimodal signal production, we know relatively less about receiver filtering of multimodal signals and whether filtering capacity in one modality influences filtering in a second modality. Most multimodal signals contain a temporal element, such as change in frequency over time or a dynamic visual display. We examined the relationship in temporal resolution across two modalities to test whether females are (1) sensory 'specialists', where a trade-off exists between the sensory modalities, (2) sensory 'generalists', where a positive relationship exists between the modalities, or (3) whether no relationship exists between modalities. We used female brown-headed cowbirds (Molothrus ater) to investigate this question as males court females with an audiovisual display. We found a significant positive relationship between female visual and auditory temporal resolution, suggesting that females are sensory 'generalists'. Females appear to resolve information well across multiple modalities, which may select for males that signal their quality similarly across modalities.

  1. Influences of selective adaptation on perception of audiovisual speech

    PubMed Central

    Dias, James W.; Cook, Theresa C.; Rosenblum, Lawrence D.

    2016-01-01

    Research suggests that selective adaptation in speech is a low-level process dependent on sensory-specific information shared between the adaptor and test-stimuli. However, previous research has only examined how adaptors shift perception of unimodal test stimuli, either auditory or visual. In the current series of experiments, we investigated whether adaptation to cross-sensory phonetic information can influence perception of integrated audio-visual phonetic information. We examined how selective adaptation to audio and visual adaptors shift perception of speech along an audiovisual test continuum. This test-continuum consisted of nine audio-/ba/-visual-/va/ stimuli, ranging in visual clarity of the mouth. When the mouth was clearly visible, perceivers “heard” the audio-visual stimulus as an integrated “va” percept 93.7% of the time (e.g., McGurk & MacDonald, 1976). As visibility of the mouth became less clear across the nine-item continuum, the audio-visual “va” percept weakened, resulting in a continuum ranging in audio-visual percepts from /va/ to /ba/. Perception of the test-stimuli was tested before and after adaptation. Changes in audiovisual speech perception were observed following adaptation to visual-/va/ and audiovisual-/va/, but not following adaptation to auditory-/va/, auditory-/ba/, or visual-/ba/. Adaptation modulates perception of integrated audio-visual speech by modulating the processing of sensory-specific information. The results suggest that auditory and visual speech information are not completely integrated at the level of selective adaptation. PMID:27041781

  2. Artificial organs: recent progress in artificial hearing and vision.

    PubMed

    Ifukube, Tohru

    2009-01-01

    Artificial sensory organs are a prosthetic means of sending visual or auditory information to the brain by electrical stimulation of the optic or auditory nerves to assist visually impaired or hearing-impaired people. However, clinical application of artificial sensory organs, except for cochlear implants, is still a trial-and-error process. This is because how and where the information transmitted to the brain is processed is still unknown, and also because changes in brain function (plasticity) remain unknown, even though brain plasticity plays an important role in meaningful interpretation of new sensory stimuli. This article discusses some basic unresolved issues and potential solutions in the development of artificial sensory organs such as cochlear implants, brainstem implants, artificial vision, and artificial retinas.

  3. Semantic congruency and the (reversed) Colavita effect in children and adults.

    PubMed

    Wille, Claudia; Ebersbach, Mirjam

    2016-01-01

    When presented with auditory, visual, or bimodal audiovisual stimuli in a discrimination task, adults tend to ignore the auditory component in bimodal stimuli and respond to the visual component only (i.e., Colavita visual dominance effect). The same is true for older children, whereas young children are dominated by the auditory component of bimodal audiovisual stimuli. This suggests a change of sensory dominance during childhood. The aim of the current study was to investigate, in three experimental conditions, whether children and adults show sensory dominance when presented with complex semantic stimuli and whether this dominance can be modulated by stimulus characteristics such as semantic (in)congruency, frequency of bimodal trials, and color information. Semantic (in)congruency did not affect the magnitude of the auditory dominance effect in 6-year-olds or the visual dominance effect in adults, but it was a modulating factor of the visual dominance in 9-year-olds (Conditions 1 and 2). Furthermore, the absence of color information (Condition 3) did not affect auditory dominance in 6-year-olds and hardly affected visual dominance in adults, whereas the visual dominance in 9-year-olds disappeared. Our results suggest that (a) sensory dominance in children and adults is not restricted to simple lights and sounds, as used in previous research, but can be extended to semantically meaningful stimuli and that (b) sensory dominance is more robust in 6-year-olds and adults than in 9-year-olds, implying a transitional stage around this age. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Being a Parent's Eyes and Ears: Emotional Literacy and Empathy of Children Whose Parents Have a Sensory Disability

    ERIC Educational Resources Information Center

    Eden, Sigal; Romi, Shlomo; Braun Aviyashar, Einat

    2017-01-01

    Children of parents with sensory disability may feel that their experience helped nurture their sense of empathy. The study was designed to examine the connection between parents' sensory disability (visual disability to blindness and hearing disability to deafness) and the empathy and emotional literacy of their non-sensory-disabled children.…

  5. Fractality of sensations and the brain health: the theory linking neurodegenerative disorder with distortion of spatial and temporal scale-invariance and fractal complexity of the visible world

    PubMed Central

    Zueva, Marina V.

    2015-01-01

    The theory that ties normal functioning and pathology of the brain and visual system with the spatial–temporal structure of the visual and other sensory stimuli is described for the first time in the present study. The deficit of fractal complexity of environmental influences can lead to the distortion of fractal complexity in the visual pathways of the brain and abnormalities of development or aging. The use of fractal light stimuli and fractal stimuli of other modalities can help to restore the functions of the brain, particularly in the elderly and in patients with neurodegenerative disorders or amblyopia. Non-linear dynamics of these physiological processes have a strong base of evidence, which is seen in the impaired fractal regulation of rhythmic activity in aged and diseased brains. From birth to old age, we live in a non-linear world, in which objects and processes with the properties of fractality and non-linearity surround us. Against this background, the evolution of man took place and all periods of life unfolded. Works of art created by man may also have fractal properties. The positive influence of music on cognitive functions is well-known. Insufficiency of sensory experience is believed to play a crucial role in the pathogenesis of amblyopia and age-dependent diseases. The brain is very plastic in its early development, and the plasticity decreases throughout life. However, several studies showed the possibility to reactivate the adult’s neuroplasticity in a variety of ways. We propose that a non-linear structure of sensory information on many spatial and temporal scales is crucial to the brain health and fractal regulation of physiological rhythms. Theoretical substantiation of the author’s theory is presented. Possible applications and the future research that can experimentally confirm or refute the theoretical concept are considered. PMID:26236232
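
    One concrete way to give a stimulus the scale-invariant (fractal) temporal structure discussed above is to shape white noise so that its power spectrum falls off as 1/f^beta. The sketch below is a generic spectral-shaping recipe with assumed parameters, not a stimulus specification taken from the record.

```python
import numpy as np

rng = np.random.default_rng(6)

def fractal_noise(n_samples, beta=1.0):
    """Generate a sequence with ~1/f**beta power spectrum by shaping white noise in the
    frequency domain (beta=1 gives 'pink', scale-invariant fluctuations)."""
    freqs = np.fft.rfftfreq(n_samples, d=1.0)
    amplitude = np.zeros_like(freqs)
    amplitude[1:] = freqs[1:] ** (-beta / 2.0)            # amplitude ~ f^(-beta/2), no DC term
    phases = np.exp(2j * np.pi * rng.random(freqs.size))  # random phases
    signal = np.fft.irfft(amplitude * phases, n=n_samples)
    return (signal - signal.mean()) / signal.std()        # normalized luminance modulation

# A 10 s flicker sequence sampled at 100 Hz whose fluctuations are scale-invariant in time.
stimulus = fractal_noise(1000, beta=1.0)
print("first samples of the 1/f luminance modulation:", np.round(stimulus[:5], 3))
```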

  6. Processing Complex Sounds Passing through the Rostral Brainstem: The New Early Filter Model

    PubMed Central

    Marsh, John E.; Campbell, Tom A.

    2016-01-01

    The rostral brainstem receives both “bottom-up” input from the ascending auditory system and “top-down” descending corticofugal connections. Speech information passing through the inferior colliculus of elderly listeners reflects the periodicity envelope of a speech syllable. This information arguably also reflects a composite of temporal-fine-structure (TFS) information from the higher frequency vowel harmonics of that repeated syllable. The amplitude of those higher frequency harmonics, bearing even higher frequency TFS information, correlates positively with the word recognition ability of elderly listeners under reverberatory conditions. Also relevant is that working memory capacity (WMC), which is subject to age-related decline, constrains the processing of sounds at the level of the brainstem. Turning to the effects of a visually presented sensory or memory load on auditory processes, there is a load-dependent reduction of that processing, as manifest in the auditory brainstem responses (ABR) evoked by to-be-ignored clicks. Wave V decreases in amplitude with increases in the visually presented memory load. A visually presented sensory load also produces a load-dependent reduction of a slightly different sort: The sensory load of visually presented information limits the disruptive effects of background sound upon working memory performance. A new early filter model is thus advanced whereby systems within the frontal lobe (affected by sensory or memory load) cholinergically influence top-down corticofugal connections. Those corticofugal connections constrain the processing of complex sounds such as speech at the level of the brainstem. Selective attention thereby limits the distracting effects of background sound entering the higher auditory system via the inferior colliculus. Processing TFS in the brainstem relates to perception of speech under adverse conditions. Attentional selectivity is crucial when the signal heard is degraded or masked: e.g., speech in noise, speech in reverberatory environments. The assumptions of a new early filter model are consistent with these findings: A subcortical early filter, with a predictive selectivity based on acoustical (linguistic) context and foreknowledge, is under cholinergic top-down control. A prefrontal capacity limitation constrains this top-down control, which is guided by the cholinergic processing of contextual information in working memory. PMID:27242396

  7. Processing Complex Sounds Passing through the Rostral Brainstem: The New Early Filter Model.

    PubMed

    Marsh, John E; Campbell, Tom A

    2016-01-01

    The rostral brainstem receives both "bottom-up" input from the ascending auditory system and "top-down" descending corticofugal connections. Speech information passing through the inferior colliculus of elderly listeners reflects the periodicity envelope of a speech syllable. This information arguably also reflects a composite of temporal-fine-structure (TFS) information from the higher frequency vowel harmonics of that repeated syllable. The amplitude of those higher frequency harmonics, bearing even higher frequency TFS information, correlates positively with the word recognition ability of elderly listeners under reverberatory conditions. Also relevant is that working memory capacity (WMC), which is subject to age-related decline, constrains the processing of sounds at the level of the brainstem. Turning to the effects of a visually presented sensory or memory load on auditory processes, there is a load-dependent reduction of that processing, as manifest in the auditory brainstem responses (ABR) evoked by to-be-ignored clicks. Wave V decreases in amplitude with increases in the visually presented memory load. A visually presented sensory load also produces a load-dependent reduction of a slightly different sort: The sensory load of visually presented information limits the disruptive effects of background sound upon working memory performance. A new early filter model is thus advanced whereby systems within the frontal lobe (affected by sensory or memory load) cholinergically influence top-down corticofugal connections. Those corticofugal connections constrain the processing of complex sounds such as speech at the level of the brainstem. Selective attention thereby limits the distracting effects of background sound entering the higher auditory system via the inferior colliculus. Processing TFS in the brainstem relates to perception of speech under adverse conditions. Attentional selectivity is crucial when the signal heard is degraded or masked: e.g., speech in noise, speech in reverberatory environments. The assumptions of a new early filter model are consistent with these findings: A subcortical early filter, with a predictive selectivity based on acoustical (linguistic) context and foreknowledge, is under cholinergic top-down control. A prefrontal capacity limitation constrains this top-down control, which is guided by the cholinergic processing of contextual information in working memory.

  8. Impacts of Ocean Acidification on Sensory Function in Marine Organisms.

    PubMed

    Ashur, Molly M; Johnston, Nicole K; Dixson, Danielle L

    2017-07-01

    Ocean acidification has been identified as a major contributor to ocean ecosystem decline, impacting the calcification, survival, and behavior of marine organisms. Numerous studies have observed altered sensory perception of chemical, auditory, and visual cues after exposure to elevated CO2. Sensory systems enable the observation of the external environment and therefore play a critical role in survival, communication, and behavior of marine organisms. This review seeks to (1) summarize the current knowledge of sensory impairment caused by ocean acidification, (2) discuss potential mechanisms behind this disruption, and (3) analyze the expected taxa differences in sensitivities to elevated CO2 conditions. Although a lack of standardized methodology makes cross-study comparisons challenging, trends and biases arise from this synthesis including a substantial focus on vertebrates, larvae or juveniles, the reef ecosystem, and chemosensory perception. Future studies must broaden the scope of the field by diversifying the taxa and ecosystems studied, incorporating ontogenetic comparisons, and focusing on cryptic sensory systems such as electroreception, magnetic sense, and the lateral line system. A discussion of possible mechanisms reveals GABAA receptor reversal as the conspicuous physiological mechanism. However, the potential remains for alternative disruption through structure or cue changes. Finally, a taxonomic comparison of physiological complexity reveals few trends in sensory sensitivities to lowered pH, but we hypothesize potential correlations relating to habitat, life history or relative use of sensory systems. Elevated CO2, in concordance with other global and local stressors, has the potential to drastically shift community composition and structure. Therefore research addressing the extent of sensory impairment, the underlying mechanisms, and the differences between taxa is vital for improved predictions of organismal response to ocean acidification. © The Author 2017. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.

  9. 3D hierarchical spatial representation and memory of multimodal sensory data

    NASA Astrophysics Data System (ADS)

    Khosla, Deepak; Dow, Paul A.; Huber, David J.

    2009-04-01

    This paper describes an efficient method and system for representing, processing and understanding multi-modal sensory data. More specifically, it describes a computational method and system for how to process and remember multiple locations in multimodal sensory space (e.g., visual, auditory, somatosensory, etc.). The multimodal representation and memory is based on a biologically-inspired hierarchy of spatial representations implemented with novel analogues of real representations used in the human brain. The novelty of the work is in the computationally efficient and robust spatial representation of 3D locations in multimodal sensory space as well as an associated working memory for storage and recall of these representations at the desired level for goal-oriented action. We describe (1) A simple and efficient method for human-like hierarchical spatial representations of sensory data and how to associate, integrate and convert between these representations (head-centered coordinate system, body-centered coordinate, etc.); (2) a robust method for training and learning a mapping of points in multimodal sensory space (e.g., camera-visible object positions, location of auditory sources, etc.) to the above hierarchical spatial representations; and (3) a specification and implementation of a hierarchical spatial working memory based on the above for storage and recall at the desired level for goal-oriented action(s). This work is most useful for any machine or human-machine application that requires processing of multimodal sensory inputs, making sense of it from a spatial perspective (e.g., where is the sensory information coming from with respect to the machine and its parts) and then taking some goal-oriented action based on this spatial understanding. A multi-level spatial representation hierarchy means that heterogeneous sensory inputs (e.g., visual, auditory, somatosensory, etc.) can map onto the hierarchy at different levels. When controlling various machine/robot degrees of freedom, the desired movements and action can be computed from these different levels in the hierarchy. The most basic embodiment of this machine could be a pan-tilt camera system, an array of microphones, a machine with arm/hand like structure or/and a robot with some or all of the above capabilities. We describe the approach, system and present preliminary results on a real-robotic platform.
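    The coordinate-frame conversions described above (head-centered to body-centered, and so on) come down to rigid transforms applied at each level of the hierarchy. The following minimal Python sketch, with an invented head pose, target location, and toy two-level working memory, illustrates that step; it is not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a two-level spatial hierarchy:
# a point expressed in a head-centered frame is re-expressed in a
# body-centered frame via a rigid transform, then both are stored in a small
# working memory keyed by representation level. All values are illustrative.
import numpy as np

def rigid_transform(point, rotation, translation):
    """Map a 3D point from one frame into another: p' = R @ p + t."""
    return rotation @ point + translation

# Hypothetical head pose relative to the body: 30 degree yaw, 20 cm offset.
yaw = np.deg2rad(30.0)
R_head_to_body = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                           [np.sin(yaw),  np.cos(yaw), 0.0],
                           [0.0,          0.0,         1.0]])
t_head_in_body = np.array([0.0, 0.0, 0.20])

# A visual target localized by a head-mounted camera (head-centered metres).
target_head = np.array([1.0, 0.2, 0.0])
target_body = rigid_transform(target_head, R_head_to_body, t_head_in_body)

# Toy hierarchical working memory: recall at whichever level an action needs.
spatial_memory = {"head": target_head, "body": target_body}
print("body-centered target:", spatial_memory["body"])
```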

  10. Smell or vision? The use of different sensory modalities in predator discrimination.

    PubMed

    Fischer, Stefan; Oberhummer, Evelyne; Cunha-Saraiva, Filipa; Gerber, Nina; Taborsky, Barbara

    2017-01-01

    Theory predicts that animals should adjust their escape responses to the perceived predation risk. The information animals obtain about potential predation risk may differ qualitatively depending on the sensory modality by which a cue is perceived. For instance, olfactory cues may reveal better information about the presence or absence of threats, whereas visual information can reliably transmit the position and potential attack distance of a predator. While this suggests a differential use of information perceived through the two sensory channels, the relative importance of visual vs. olfactory cues when distinguishing between different predation threats is still poorly understood. Therefore, we exposed individuals of the cooperatively breeding cichlid Neolamprologus pulcher to a standardized threat stimulus combined with either predator or non-predator cues presented either visually or chemically. We predicted that flight responses towards a threat stimulus are more pronounced if cues of dangerous rather than harmless heterospecifics are presented and that N. pulcher , being an aquatic species, relies more on olfaction when discriminating between dangerous and harmless heterospecifics. N. pulcher responded faster to the threat stimulus, reached a refuge faster and entered a refuge more likely when predator cues were perceived. Unexpectedly, the sensory modality used to perceive the cues did not affect the escape response or the duration of the recovery phase. This suggests that N. pulcher are able to discriminate heterospecific cues with similar acuity when using vision or olfaction. We discuss that this ability may be advantageous in aquatic environments where the visibility conditions strongly vary over time. The ability to rapidly discriminate between dangerous predators and harmless heterospecifics is crucial for the survival of prey animals. In seasonally fluctuating environment, sensory conditions may change over the year and may make the use of multiple sensory modalities for heterospecific discrimination highly beneficial. Here we compared the efficacy of visual and olfactory senses in the discrimination ability of the cooperatively breeding cichlid Neolamprologus pulcher . We presented individual fish with visual or olfactory cues of predators or harmless heterospecifics and recorded their flight response. When exposed to predator cues, individuals responded faster, reached a refuge faster and were more likely to enter the refuge. Unexpectedly, the olfactory and visual senses seemed to be equally efficient in this discrimination task, suggesting that seasonal variation of water conditions experienced by N. pulcher may necessitate the use of multiple sensory channels for the same task.

  11. Shapes, scents and sounds: quantifying the full multi-sensory basis of conceptual knowledge.

    PubMed

    Hoffman, Paul; Lambon Ralph, Matthew A

    2013-01-01

    Contemporary neuroscience theories assume that concepts are formed through experience in multiple sensory-motor modalities. Quantifying the contribution of each modality to different object categories is critical to understanding the structure of the conceptual system and to explaining category-specific knowledge deficits. Verbal feature listing is typically used to elicit this information but has a number of drawbacks: sensory knowledge often cannot easily be translated into verbal features and many features are experienced in multiple modalities. Here, we employed a more direct approach in which subjects rated their knowledge of objects in each sensory-motor modality separately. Compared with these ratings, feature listing over-estimated the importance of visual form and functional knowledge and under-estimated the contributions of other sensory channels. An item's sensory rating proved to be a better predictor of lexical-semantic processing speed than the number of features it possessed, suggesting that ratings better capture the overall quantity of sensory information associated with a concept. Finally, the richer, multi-modal rating data not only replicated the sensory-functional distinction between animals and non-living things but also revealed novel distinctions between different types of artefact. Hierarchical cluster analyses indicated that mechanical devices (e.g., vehicles) were distinct from other non-living objects because they had strong sound and motion characteristics, making them more similar to animals in this respect. Taken together, the ratings align with neuroscience evidence in suggesting that a number of distinct sensory processing channels make important contributions to object knowledge. Multi-modal ratings for 160 objects are provided as supplementary materials. Copyright © 2012 Elsevier Ltd. All rights reserved.
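    The hierarchical cluster analysis mentioned above can be illustrated with a short sketch: agglomerative clustering of objects described by per-modality ratings. The objects, rating columns, and values below are invented for demonstration and do not reproduce the published 160-item dataset.

```python
# Illustrative sketch (not the published analysis): hierarchical clustering of
# objects described by per-modality knowledge ratings, to see whether items
# with strong sound/motion ratings (e.g., vehicles) separate from other
# artefacts. The toy ratings below are invented.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Columns: visual form, sound, motion, function (0-5 rating scale, made up).
objects = ["dog", "car", "hammer", "train", "cup", "bird"]
ratings = np.array([
    [4.5, 3.8, 4.2, 1.0],   # dog
    [4.0, 4.1, 4.5, 3.5],   # car
    [3.5, 1.2, 0.8, 4.8],   # hammer
    [4.2, 4.4, 4.6, 3.2],   # train
    [3.8, 0.5, 0.3, 4.5],   # cup
    [4.4, 3.6, 4.0, 0.8],   # bird
])

Z = linkage(ratings, method="ward")            # agglomerative clustering
labels = fcluster(Z, t=2, criterion="maxclust")
for name, lab in zip(objects, labels):
    print(f"{name}: cluster {lab}")
```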

  12. Eyes Matched to the Prize: The State of Matched Filters in Insect Visual Circuits.

    PubMed

    Kohn, Jessica R; Heath, Sarah L; Behnia, Rudy

    2018-01-01

    Confronted with an ever-changing visual landscape, animals must be able to detect relevant stimuli and translate this information into behavioral output. A visual scene contains an abundance of information: to interpret the entirety of it would be uneconomical. To optimally perform this task, neural mechanisms exist to enhance the detection of important features of the sensory environment while simultaneously filtering out irrelevant information. This can be accomplished by using a circuit design that implements specific "matched filters" that are tuned to relevant stimuli. Following this rule, the well-characterized visual systems of insects have evolved to streamline feature extraction on both a structural and functional level. Here, we review examples of specialized visual microcircuits for vital behaviors across insect species, including feature detection, escape, and estimation of self-motion. Additionally, we discuss how these microcircuits are modulated to weigh relevant input with respect to different internal and behavioral states.

  13. A visual pathway links brain structures active during magnetic compass orientation in migratory birds.

    PubMed

    Heyers, Dominik; Manns, Martina; Luksch, Harald; Güntürkün, Onur; Mouritsen, Henrik

    2007-09-26

    The magnetic compass of migratory birds has been suggested to be light-dependent. Retinal cryptochrome-expressing neurons and a forebrain region, "Cluster N", show high neuronal activity when night-migratory songbirds perform magnetic compass orientation. By combining neuronal tracing with behavioral experiments leading to sensory-driven gene expression of the neuronal activity marker ZENK during magnetic compass orientation, we demonstrate a functional neuronal connection between the retinal neurons and Cluster N via the visual thalamus. Thus, the two areas of the central nervous system being most active during magnetic compass orientation are part of an ascending visual processing stream, the thalamofugal pathway. Furthermore, Cluster N seems to be a specialized part of the visual wulst. These findings strongly support the hypothesis that migratory birds use their visual system to perceive the reference compass direction of the geomagnetic field and that migratory birds "see" the reference compass direction provided by the geomagnetic field.

  14. Alpha-Band Rhythms in Visual Task Performance: Phase-Locking by Rhythmic Sensory Stimulation

    PubMed Central

    de Graaf, Tom A.; Gross, Joachim; Paterson, Gavin; Rusch, Tessa; Sack, Alexander T.; Thut, Gregor

    2013-01-01

    Oscillations are an important aspect of neuronal activity. Interestingly, oscillatory patterns are also observed in behaviour, such as in visual performance measures after the presentation of a brief sensory event in the visual or another modality. These oscillations in visual performance cycle at the typical frequencies of brain rhythms, suggesting that perception may be closely linked to brain oscillations. We here investigated this link for a prominent rhythm of the visual system (the alpha-rhythm, 8–12 Hz) by applying rhythmic visual stimulation at alpha-frequency (10.6 Hz), known to lead to a resonance response in visual areas, and testing its effects on subsequent visual target discrimination. Our data show that rhythmic visual stimulation at 10.6 Hz: 1) has specific behavioral consequences, relative to stimulation at control frequencies (3.9 Hz, 7.1 Hz, 14.2 Hz), and 2) leads to alpha-band oscillations in visual performance measures, that 3) correlate in precise frequency across individuals with resting alpha-rhythms recorded over parieto-occipital areas. The most parsimonious explanation for these three findings is entrainment (phase-locking) of ongoing perceptually relevant alpha-band brain oscillations by rhythmic sensory events. These findings are in line with occipital alpha-oscillations underlying periodicity in visual performance, and suggest that rhythmic stimulation at frequencies of intrinsic brain-rhythms can be used to reveal influences of these rhythms on task performance to study their functional roles. PMID:23555873
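    One way to quantify such oscillations in performance, sketched below on simulated data, is to Fourier-transform a densely sampled behavioral time course and read off its dominant frequency. The sampling step, modulation depth, and noise level are illustrative assumptions, not the study's analysis pipeline.

```python
# Illustrative sketch: detect a rhythmic modulation in behavior by
# Fourier-transforming mean-removed accuracy measured at densely sampled
# delays after rhythmic stimulation. Data are simulated.
import numpy as np

dt = 0.01                                   # 10 ms sampling of delays
t = np.arange(0.0, 1.0, dt)
rng = np.random.default_rng(5)
accuracy = 0.75 + 0.05 * np.cos(2 * np.pi * 10.6 * t) \
           + rng.normal(0, 0.02, t.size)

spectrum = np.abs(np.fft.rfft(accuracy - accuracy.mean()))
freqs = np.fft.rfftfreq(t.size, dt)
print("dominant behavioral frequency:", freqs[np.argmax(spectrum)], "Hz")
```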

  15. Linking crowding, visual span, and reading.

    PubMed

    He, Yingchen; Legge, Gordon E

    2017-09-01

    The visual span is hypothesized to be a sensory bottleneck on reading speed with crowding thought to be the major sensory factor limiting the size of the visual span. This proposed linkage between crowding, visual span, and reading speed is challenged by the finding that training to read crowded letters reduced crowding but did not improve reading speed (Chung, 2007). Here, we examined two properties of letter-recognition training that may influence the transfer to improved reading: the spatial arrangement of training stimuli and the presence of flankers. Three groups of nine young adults were trained with different configurations of letter stimuli at 10° in the lower visual field: a flanked-local group (flanked letters localized at one position), a flanked-distributed group (flanked letters distributed across different horizontal locations), and an isolated-distributed group (isolated and distributed letters). We found that distributed training, but not the presence of flankers, appears to be necessary for the training benefit to transfer to increased reading speed. Localized training may have biased attention to one specific, small area in the visual field, thereby failing to improve reading. We conclude that the visual span represents a sensory bottleneck on reading, but there may also be an attentional bottleneck. Reducing the impact of crowding can enlarge the visual span and can potentially facilitate reading, but not when adverse attentional bias is present. Our results clarify the association between crowding, visual span, and reading.

  16. Linking crowding, visual span, and reading

    PubMed Central

    He, Yingchen; Legge, Gordon E.

    2017-01-01

    The visual span is hypothesized to be a sensory bottleneck on reading speed with crowding thought to be the major sensory factor limiting the size of the visual span. This proposed linkage between crowding, visual span, and reading speed is challenged by the finding that training to read crowded letters reduced crowding but did not improve reading speed (Chung, 2007). Here, we examined two properties of letter-recognition training that may influence the transfer to improved reading: the spatial arrangement of training stimuli and the presence of flankers. Three groups of nine young adults were trained with different configurations of letter stimuli at 10° in the lower visual field: a flanked-local group (flanked letters localized at one position), a flanked-distributed group (flanked letters distributed across different horizontal locations), and an isolated-distributed group (isolated and distributed letters). We found that distributed training, but not the presence of flankers, appears to be necessary for the training benefit to transfer to increased reading speed. Localized training may have biased attention to one specific, small area in the visual field, thereby failing to improve reading. We conclude that the visual span represents a sensory bottleneck on reading, but there may also be an attentional bottleneck. Reducing the impact of crowding can enlarge the visual span and can potentially facilitate reading, but not when adverse attentional bias is present. Our results clarify the association between crowding, visual span, and reading. PMID:28973564

  17. Integration and binding in rehabilitative sensory substitution: Increasing resolution using a new Zooming-in approach

    PubMed Central

    Buchs, Galit; Maidenbaum, Shachar; Levy-Tzedek, Shelly; Amedi, Amir

    2015-01-01

    Purpose: To visually perceive our surroundings we constantly move our eyes and focus on particular details, and then integrate them into a combined whole. Current visual rehabilitation methods, both invasive, like bionic eyes, and non-invasive, like Sensory Substitution Devices (SSDs), down-sample visual stimuli into low-resolution images. Zooming-in to sub-parts of the scene could potentially improve detail perception. Can congenitally blind individuals integrate a ‘visual’ scene when offered this information via different sensory modalities, such as audition? Can they integrate visual information, perceived in parts, into larger percepts despite never having had any visual experience? Methods: We explored these questions using a zooming-in functionality embedded in the EyeMusic visual-to-auditory SSD. Eight blind participants were tasked with identifying cartoon faces by integrating their individual components recognized via the EyeMusic’s zooming mechanism. Results: After specialized training of just 6–10 hours, blind participants successfully and actively integrated facial features into cartooned identities in 79±18% of the trials in a highly significant manner (chance level 10%; rank-sum P < 1.55E-04). Conclusions: These findings show that even users who lacked any previous visual experience whatsoever can indeed integrate this visual information with increased resolution. This potentially has important practical visual rehabilitation implications for both invasive and non-invasive methods. PMID:26518671
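    The abstract reports performance against a 10% chance level using a rank-sum test. As a simpler stand-in, the sketch below checks one hypothetical participant's hit count against chance with an exact binomial tail probability; the trial and hit counts are invented.

```python
# Hedged sketch: test a hypothetical participant's identification accuracy
# against a 10% chance level with an exact binomial tail probability.
# (The paper itself reports a rank-sum test across participants.)
from scipy.stats import binom

n_trials = 40          # hypothetical number of identification trials
n_correct = 32         # hypothetical hits (~80% correct)
chance = 0.10          # ten response alternatives

# P(X >= n_correct) under the chance-level binomial distribution.
p_value = binom.sf(n_correct - 1, n_trials, chance)
print(f"p = {p_value:.2e}")
```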

  18. Experientially guided robots [for planet exploration]

    NASA Technical Reports Server (NTRS)

    Merriam, E. W.; Becker, J. D.

    1974-01-01

    This paper argues that an experientially guided robot is necessary to successfully explore far-away planets. Such a robot is characterized as having sense organs which receive sensory information from its environment and motor systems which allow it to interact with that environment. The sensori-motor information which it receives is organized into an experiential knowledge structure and this knowledge in turn is used to guide the robot's future actions. A summary is presented of a problem solving system which is being used as a test bed for developing such a robot. The robot currently engages in the behaviors of visual tracking, focusing down, and looking around in a simulated Martian landscape. Finally, some unsolved problems are outlined whose solutions are necessary before an experientially guided robot can be produced. These problems center around organizing the motivational and memory structure of the robot and understanding its high-level control mechanisms.

  19. Pictorial communication in virtual and real environments

    NASA Technical Reports Server (NTRS)

    Ellis, Stephen R. (Editor)

    1991-01-01

    Papers about the communication between human users and machines in real and synthetic environments are presented. Individual topics addressed include: pictorial communication, distortions in memory for visual displays, cartography and map displays, efficiency of graphical perception, volumetric visualization of 3D data, spatial displays to increase pilot situational awareness, teleoperation of land vehicles, computer graphics system for visualizing spacecraft in orbit, visual display aid for orbital maneuvering, multiaxis control in telemanipulation and vehicle guidance, visual enhancements in pick-and-place tasks, target axis effects under transformed visual-motor mappings, adapting to variable prismatic displacement. Also discussed are: spatial vision within egocentric and exocentric frames of reference, sensory conflict in motion sickness, interactions of form and orientation, perception of geometrical structure from congruence, prediction of three-dimensionality across continuous surfaces, effects of viewpoint in the virtual space of pictures, visual slant underestimation, spatial constraints of stereopsis in video displays, stereoscopic stance perception, paradoxical monocular stereopsis and perspective vergence. (No individual items are abstracted in this volume)

  20. Grouping and Segregation of Sensory Events by Actions in Temporal Audio-Visual Recalibration.

    PubMed

    Ikumi, Nara; Soto-Faraco, Salvador

    2016-01-01

    Perception in multi-sensory environments involves both grouping and segregation of events across sensory modalities. Temporal coincidence between events is considered a strong cue to resolve multisensory perception. However, differences in physical transmission and neural processing times amongst modalities complicate this picture. This is illustrated by cross-modal recalibration, whereby adaptation to audio-visual asynchrony produces shifts in perceived simultaneity. Here, we examined whether voluntary actions might serve as a temporal anchor to cross-modal recalibration in time. Participants were tested on an audio-visual simultaneity judgment task after an adaptation phase where they had to synchronize voluntary actions with audio-visual pairs presented at a fixed asynchrony (vision leading or vision lagging). Our analysis focused on the magnitude of cross-modal recalibration to the adapted audio-visual asynchrony as a function of the nature of the actions during adaptation, putatively fostering cross-modal grouping or segregation. We found larger temporal adjustments when actions promoted grouping than segregation of sensory events. However, a control experiment suggested that additional factors, such as attention to planning/execution of actions, could have an impact on recalibration effects. Contrary to the view that cross-modal temporal organization is mainly driven by external factors related to the stimulus or environment, our findings add supporting evidence for the idea that perceptual adjustments strongly depend on the observer's inner states induced by motor and cognitive demands.

  1. Grouping and Segregation of Sensory Events by Actions in Temporal Audio-Visual Recalibration

    PubMed Central

    Ikumi, Nara; Soto-Faraco, Salvador

    2017-01-01

    Perception in multi-sensory environments involves both grouping and segregation of events across sensory modalities. Temporal coincidence between events is considered a strong cue to resolve multisensory perception. However, differences in physical transmission and neural processing times amongst modalities complicate this picture. This is illustrated by cross-modal recalibration, whereby adaptation to audio-visual asynchrony produces shifts in perceived simultaneity. Here, we examined whether voluntary actions might serve as a temporal anchor to cross-modal recalibration in time. Participants were tested on an audio-visual simultaneity judgment task after an adaptation phase where they had to synchronize voluntary actions with audio-visual pairs presented at a fixed asynchrony (vision leading or vision lagging). Our analysis focused on the magnitude of cross-modal recalibration to the adapted audio-visual asynchrony as a function of the nature of the actions during adaptation, putatively fostering cross-modal grouping or segregation. We found larger temporal adjustments when actions promoted grouping than segregation of sensory events. However, a control experiment suggested that additional factors, such as attention to planning/execution of actions, could have an impact on recalibration effects. Contrary to the view that cross-modal temporal organization is mainly driven by external factors related to the stimulus or environment, our findings add supporting evidence for the idea that perceptual adjustments strongly depend on the observer's inner states induced by motor and cognitive demands. PMID:28154529

  2. Top-down beta oscillatory signaling conveys behavioral context in early visual cortex.

    PubMed

    Richter, Craig G; Coppola, Richard; Bressler, Steven L

    2018-05-03

    Top-down modulation of sensory processing is a critical neural mechanism subserving numerous important cognitive roles, one of which may be to inform lower-order sensory systems of the current 'task at hand' by conveying behavioral context to these systems. Accumulating evidence indicates that top-down cortical influences are carried by directed interareal synchronization of oscillatory neuronal populations, with recent results pointing to beta-frequency oscillations as particularly important for top-down processing. However, it remains to be determined if top-down beta-frequency oscillations indeed convey behavioral context. We measured spectral Granger Causality (sGC) using local field potentials recorded from microelectrodes chronically implanted in visual areas V1/V2, V4, and TEO of two rhesus macaque monkeys, and applied multivariate pattern analysis to the spatial patterns of top-down sGC. We decoded behavioral context by discriminating patterns of top-down (V4/TEO-to-V1/V2) beta-peak sGC for two different task rules governing correct responses to identical visual stimuli. The results indicate that top-down directed influences are carried to visual cortex by beta oscillations, and differentiate task demands even before visual stimulus processing. They suggest that top-down beta-frequency oscillatory processes coordinate processing of sensory information by conveying global knowledge states to early levels of the sensory cortical hierarchy independently of bottom-up stimulus-driven processing.
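    The decoding step described above can be sketched as cross-validated classification of trial-wise top-down sGC patterns. The example below uses synthetic data and scikit-learn logistic regression as a stand-in for the study's multivariate pattern analysis; the dimensions and effect sizes are assumptions.

```python
# Illustrative decoding sketch (not the study's pipeline): classify task rule
# from spatial patterns of top-down beta-band Granger causality using
# cross-validated logistic regression on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_site_pairs = 120, 16          # V4/TEO -> V1/V2 channel pairs (toy)
rule = rng.integers(0, 2, n_trials)       # two task rules

# Synthetic beta-peak sGC patterns with a small rule-dependent shift.
sgc = rng.normal(0.2, 0.05, (n_trials, n_site_pairs))
sgc[rule == 1, :4] += 0.03                # rule information in a few pairs

clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, sgc, rule, cv=5)
print("decoding accuracy:", acc.mean())
```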

  3. How well do you see what you hear? The acuity of visual-to-auditory sensory substitution

    PubMed Central

    Haigh, Alastair; Brown, David J.; Meijer, Peter; Proulx, Michael J.

    2013-01-01

    Sensory substitution devices (SSDs) aim to compensate for the loss of a sensory modality, typically vision, by converting information from the lost modality into stimuli in a remaining modality. “The vOICe” is a visual-to-auditory SSD which encodes images taken by a camera worn by the user into “soundscapes” such that experienced users can extract information about their surroundings. Here we investigated how much detail was resolvable during the early induction stages by testing the acuity of blindfolded sighted, naïve vOICe users. Initial performance was well above chance. Participants who took the test twice as a form of minimal training showed a marked improvement on the second test. Acuity was slightly but not significantly impaired when participants wore a camera and judged letter orientations “live”. A positive correlation was found between participants' musical training and their acuity. The relationship between auditory expertise via musical training and the lack of a relationship with visual imagery, suggests that early use of a SSD draws primarily on the mechanisms of the sensory modality being used rather than the one being substituted. If vision is lost, audition represents the sensory channel of highest bandwidth of those remaining. The level of acuity found here, and the fact it was achieved with very little experience in sensory substitution by naïve users is promising. PMID:23785345

  4. The involvement of central attention in visual search is determined by task demands.

    PubMed

    Han, Suk Won

    2017-04-01

    Attention, the mechanism by which a subset of sensory inputs is prioritized over others, operates at multiple processing stages. Specifically, attention enhances weak sensory signal at the perceptual stage, while it serves to select appropriate responses or consolidate sensory representations into short-term memory at the central stage. This study investigated the independence and interaction between perceptual and central attention. To do so, I used a dual-task paradigm, pairing a four-alternative choice task with a visual search task. The results showed that central attention for response selection was engaged in perceptual processing for visual search when the number of search items increased, thereby increasing the demand for serial allocation of focal attention. By contrast, central attention and perceptual attention remained independent as far as the demand for serial shifting of focal attention remained constant; decreasing stimulus contrast or increasing the set size of a parallel search did not evoke the involvement of central attention in visual search. These results suggest that the nature of concurrent visual search process plays a crucial role in the functional interaction between two different types of attention.

  5. Facial markings in the social cuckoo wasp Polistes sulcifer: No support for the visual deception and the assessment hypotheses.

    PubMed

    Cini, Alessandro; Ortolani, Irene; Zechini, Luigi; Cervo, Rita

    2015-02-01

    Insect social parasites have to conquer a host colony by overcoming its defensive barriers. In addition to increased fighting abilities, many social parasites evolved sophisticated sensory deception mechanisms to elude host colonies' defenses by exploiting host communication channels. Recently, it has been shown that the conspicuous facial markings of a paper wasp social parasite, Polistes sulcifer, decrease the aggressiveness of host foundresses. Two main hypotheses stand as explanations of this phenomenon: visual sensory deception (i.e. the black patterning reduces host aggression by exploiting the host visual communication system) and visual quality assessment (i.e. facial markings reduce aggressiveness as they signal the increased fighting ability of parasites). Through behavioral assays and morphological measurements we tested three predictions resulting from these hypotheses and found no support either for the visual sensory deception or for the quality assessment to explain the reduction in host aggressiveness towards the parasite. Our results suggest that other discrimination processes may explain the observed phenomenon. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Sensory substitution: closing the gap between basic research and widespread practical visual rehabilitation.

    PubMed

    Maidenbaum, Shachar; Abboud, Sami; Amedi, Amir

    2014-04-01

    Sensory substitution devices (SSDs) have come a long way since first developed for visual rehabilitation. They have produced exciting experimental results, and have furthered our understanding of the human brain. Unfortunately, they are still not used for practical visual rehabilitation, and are currently considered as reserved primarily for experiments in controlled settings. Over the past decade, our understanding of the neural mechanisms behind visual restoration has changed as a result of converging evidence, much of which was gathered with SSDs. This evidence suggests that the brain is more than a pure sensory-machine but rather is a highly flexible task-machine, i.e., brain regions can maintain or regain their function in vision even with input from other senses. This complements a recent set of more promising behavioral achievements using SSDs and new promising technologies and tools. All these changes strongly suggest that the time has come to revive the focus on practical visual rehabilitation with SSDs and we chart several key steps in this direction such as training protocols and self-train tools. Copyright © 2014 The Authors. Published by Elsevier Ltd.. All rights reserved.

  7. Structure-function relationships between aldolase C/zebrin II expression and complex spike synchrony in the cerebellum.

    PubMed

    Tsutsumi, Shinichiro; Yamazaki, Maya; Miyazaki, Taisuke; Watanabe, Masahiko; Sakimura, Kenji; Kano, Masanobu; Kitamura, Kazuo

    2015-01-14

    Simple and regular anatomical structure is a hallmark of the cerebellar cortex. Parasagittally arrayed alternate expression of aldolase C/zebrin II in Purkinje cells (PCs) has been extensively studied, but surprisingly little is known about its functional significance. Here we found a precise structure-function relationship between aldolase C expression and synchrony of PC complex spike activities that reflect climbing fiber inputs to PCs. We performed two-photon calcium imaging in transgenic mice in which aldolase C compartments can be visualized in vivo, and identified highly synchronous complex spike activities among aldolase C-positive or aldolase C-negative PCs, but not across these populations. The boundary of aldolase C compartments corresponded to that of complex spike synchrony at single-cell resolution. Sensory stimulation evoked aldolase C compartment-specific complex spike responses and synchrony. This result further revealed the structure-function segregation. In awake animals, complex spike synchrony both within and between PC populations across the aldolase C boundary was enhanced in response to sensory stimuli, such that two functionally distinct PC ensembles were coactivated. These results suggest that PC populations characterized by aldolase C expression precisely represent distinct functional units of the cerebellar cortex, and these functional units can cooperate to process sensory information in awake animals. Copyright © 2015 the authors 0270-6474/15/350843-10$15.00/0.
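    A toy version of the synchrony comparison, assuming simulated binary complex-spike event trains rather than imaging data, is sketched below: mean pairwise correlation within a compartment versus across the aldolase C boundary.

```python
# Toy sketch of the synchrony comparison described above: correlate binary
# complex-spike event trains and contrast mean pairwise correlation within an
# aldolase C compartment against pairs that straddle the boundary.
import numpy as np

rng = np.random.default_rng(1)
n_frames = 2000
shared_pos = rng.random(n_frames) < 0.02   # shared drive, zebrin-positive band
shared_neg = rng.random(n_frames) < 0.02   # shared drive, zebrin-negative band

def make_cell(shared, p_noise=0.01):
    """Event train = shared compartment events plus independent events."""
    return (shared | (rng.random(n_frames) < p_noise)).astype(float)

pos_cells = [make_cell(shared_pos) for _ in range(5)]
neg_cells = [make_cell(shared_neg) for _ in range(5)]

def mean_corr(cells_a, cells_b, same_group):
    rs = []
    for i, a in enumerate(cells_a):
        for j, b in enumerate(cells_b):
            if same_group and j <= i:      # skip self and duplicate pairs
                continue
            rs.append(np.corrcoef(a, b)[0, 1])
    return np.mean(rs)

print("within-compartment r:", mean_corr(pos_cells, pos_cells, True))
print("across-boundary r:  ", mean_corr(pos_cells, neg_cells, False))
```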

  8. Parent Handbook: A Curricular Approach To Support the Transition to Adulthood of Adolescents with Visual or Dual Sensory Impairments and Cognitive Disabilities.

    ERIC Educational Resources Information Center

    O'Neill, John; And Others

    This handbook for parents is part of a packet intended to aid educators, families, and adult service providers to facilitate the transition from school to adult life in the community for students with both cognitive disabilities and visual or dual sensory impairments. Emphasis is on preparation of students for adult lifestyles through transition…

  9. Exploring Mechanisms Underlying Impaired Brain Function in Gulf War Illness through Advanced Network Analysis

    DTIC Science & Technology

    2017-10-01

    …networks of the brain responsible for visual processing, mood regulation, motor coordination, sensory processing, and language command, but increased connectivity in… For each subject, the rsFMRI voxel time-series were temporally shifted to account for differences in slice acquisition times…

  10. Sensory Correlations in Autism

    ERIC Educational Resources Information Center

    Kern, Janet K.; Trivedi, Madhukar H.; Grannemann, Bruce D.; Garver, Carolyn R.; Johnson, Danny G.; Andrews, Alonzo A.; Savla, Jayshree S.; Mehta, Jyutika A.; Schroeder, Jennifer L.

    2007-01-01

    This study examined the relationship between auditory, visual, touch, and oral sensory dysfunction in autism and their relationship to multisensory dysfunction and severity of autism. The Sensory Profile was completed on 104 persons with a diagnosis of autism, 3 to 56 years of age. Analysis showed a significant correlation between the different…

  11. Visual Inference Programming

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin; Timucin, Dogan; Rabbette, Maura; Curry, Charles; Allan, Mark; Lvov, Nikolay; Clanton, Sam; Pilewskie, Peter

    2002-01-01

    The goal of visual inference programming is to develop a software framework data analysis and to provide machine learning algorithms for inter-active data exploration and visualization. The topics include: 1) Intelligent Data Understanding (IDU) framework; 2) Challenge problems; 3) What's new here; 4) Framework features; 5) Wiring diagram; 6) Generated script; 7) Results of script; 8) Initial algorithms; 9) Independent Component Analysis for instrument diagnosis; 10) Output sensory mapping virtual joystick; 11) Output sensory mapping typing; 12) Closed-loop feedback mu-rhythm control; 13) Closed-loop training; 14) Data sources; and 15) Algorithms. This paper is in viewgraph form.

  12. Cross-modal cueing of attention alters appearance and early cortical processing of visual stimuli

    PubMed Central

    Störmer, Viola S.; McDonald, John J.; Hillyard, Steven A.

    2009-01-01

    The question of whether attention makes sensory impressions appear more intense has been a matter of debate for over a century. Recent psychophysical studies have reported that attention increases apparent contrast of visual stimuli, but the issue continues to be debated. We obtained converging neurophysiological evidence from human observers as they judged the relative contrast of visual stimuli presented to the left and right visual fields following a lateralized auditory cue. Cross-modal cueing of attention boosted the apparent contrast of the visual target in association with an enlarged neural response in the contralateral visual cortex that began within 100 ms after target onset. The magnitude of the enhanced neural response was positively correlated with perceptual reports of the cued target being higher in contrast. The results suggest that attention increases the perceived contrast of visual stimuli by boosting early sensory processing in the visual cortex. PMID:20007778

  13. Cross-modal cueing of attention alters appearance and early cortical processing of visual stimuli.

    PubMed

    Störmer, Viola S; McDonald, John J; Hillyard, Steven A

    2009-12-29

    The question of whether attention makes sensory impressions appear more intense has been a matter of debate for over a century. Recent psychophysical studies have reported that attention increases apparent contrast of visual stimuli, but the issue continues to be debated. We obtained converging neurophysiological evidence from human observers as they judged the relative contrast of visual stimuli presented to the left and right visual fields following a lateralized auditory cue. Cross-modal cueing of attention boosted the apparent contrast of the visual target in association with an enlarged neural response in the contralateral visual cortex that began within 100 ms after target onset. The magnitude of the enhanced neural response was positively correlated with perceptual reports of the cued target being higher in contrast. The results suggest that attention increases the perceived contrast of visual stimuli by boosting early sensory processing in the visual cortex.

  14. Congruent representation of visual and acoustic space in the superior colliculus of the echolocating bat Phyllostomus discolor.

    PubMed

    Hoffmann, Susanne; Vega-Zuniga, Tomas; Greiter, Wolfgang; Krabichler, Quirin; Bley, Alexandra; Matthes, Mariana; Zimmer, Christiane; Firzlaff, Uwe; Luksch, Harald

    2016-11-01

    The midbrain superior colliculus (SC) commonly features a retinotopic representation of visual space in its superficial layers, which is congruent with maps formed by multisensory neurons and motor neurons in its deep layers. Information flow between layers is suggested to enable the SC to mediate goal-directed orienting movements. While most mammals strongly rely on vision for orienting, some species such as echolocating bats have developed alternative strategies, which raises the question how sensory maps are organized in these animals. We probed the visual system of the echolocating bat Phyllostomus discolor and found that binocular high acuity vision is frontally oriented and thus aligned with the biosonar system, whereas monocular visual fields cover a large area of peripheral space. For the first time in echolocating bats, we could show that in contrast with other mammals, visual processing is restricted to the superficial layers of the SC. The topographic representation of visual space, however, followed the general mammalian pattern. In addition, we found a clear topographic representation of sound azimuth in the deeper collicular layers, which was congruent with the superficial visual space map and with a previously documented map of orienting movements. Especially for bats navigating at high speed in densely structured environments, it is vitally important to transfer and coordinate spatial information between sensors and motor systems. Here, we demonstrate first evidence for the existence of congruent maps of sensory space in the bat SC that might serve to generate a unified representation of the environment to guide motor actions. © 2016 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  15. The effect of visual-vestibulosomatosensory conflict induced by virtual reality on postural stability in humans.

    PubMed

    Nishiike, Suetaka; Okazaki, Suzuyo; Watanabe, Hiroshi; Akizuki, Hironori; Imai, Takao; Uno, Atsuhiko; Kitahara, Tadashi; Horii, Arata; Takeda, Noriaki; Inohara, Hidenori

    2013-01-01

    In this study, we examined the effects of sensory inputs of visual-vestibulosomatosensory conflict induced by virtual reality (VR) on subjective dizziness, posture stability and visual dependency on postural control in humans. Eleven healthy young volunteers were immersed in two different VR conditions. In the control condition, subjects walked voluntarily with the background images of interactive computer graphics proportionally synchronized to their walking pace. In the visual-vestibulosomatosensory conflict condition, subjects kept still, but the background images that subjects experienced in the control condition were presented. The scores of both Graybiel's and Hamilton's criteria, postural instability and Romberg ratio were measured before and after the two conditions. After immersion in the conflict condition, both subjective dizziness and objective postural instability were significantly increased, and Romberg ratio, an index of the visual dependency on postural control, was slightly decreased. These findings suggest that sensory inputs of visual-vestibulosomatosensory conflict induced by VR induced motion sickness, resulting in subjective dizziness and postural instability. They also suggest that adaptation to the conflict condition decreases the contribution of visual inputs to postural control with re-weighting of vestibulosomatosensory inputs. VR may be used as a rehabilitation tool for dizzy patients by its ability to induce sensory re-weighting of postural control.
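    For reference, a Romberg-type ratio is conventionally computed as sway with eyes closed divided by sway with eyes open; the short sketch below uses invented sway values purely to show the calculation.

```python
# Quick illustration of a Romberg-type ratio as an index of visual dependency
# on postural control (values invented; e.g., sway path length in cm).
sway_eyes_open = 1.8
sway_eyes_closed = 2.7
romberg_ratio = sway_eyes_closed / sway_eyes_open
print(romberg_ratio)    # > 1 indicates reliance on vision for postural control
```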

  16. Effect of altered sensory conditions on multivariate descriptors of human postural sway

    NASA Technical Reports Server (NTRS)

    Kuo, A. D.; Speers, R. A.; Peterka, R. J.; Horak, F. B.; Peterson, B. W. (Principal Investigator)

    1998-01-01

    Multivariate descriptors of sway were used to test whether altered sensory conditions result not only in changes in amount of sway but also in postural coordination. Eigenvalues and directions of eigenvectors of the covariance of shank and hip angles were used as a set of multivariate descriptors. These quantities were measured in 14 healthy adult subjects performing the Sensory Organization test, which disrupts visual and somatosensory information used for spatial orientation. Multivariate analysis of variance and discriminant analysis showed that resulting sway changes were at least bivariate in character, with visual and somatosensory conditions producing distinct changes in postural coordination. The most significant changes were found when somatosensory information was disrupted by sway-referencing of the support surface (P = 3.2 x 10(-10)). The resulting covariance measurements showed that subjects not only swayed more but also used increased hip motion analogous to the hip strategy. Disruption of vision, by either closing the eyes or sway-referencing the visual surround, also resulted in altered sway (P = 1.7 x 10(-10)), with proportionately more motion of the center of mass than with platform sway-referencing. As shown by discriminant analysis, an optimal univariate measure could explain at most 90% of the behavior due to altered sensory conditions. The remaining 10%, while smaller, are highly significant changes in posture control that depend on sensory conditions. The results imply that normal postural coordination of the trunk and legs requires both somatosensory and visual information and that each sensory modality makes a unique contribution to posture control. Descending postural commands are multivariate in nature, and the motion at each joint is affected uniquely by input from multiple sensors.
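    The multivariate descriptors above reduce to an eigen-decomposition of the 2 x 2 covariance of shank and hip sway angles. The sketch below computes it on simulated angle traces; the noise levels and the anti-phase coupling are illustrative assumptions, not recorded data.

```python
# Minimal sketch of the multivariate sway descriptor described above:
# eigenvalues and eigenvector directions of the covariance of shank and hip
# sway angles (angle traces simulated for illustration).
import numpy as np

rng = np.random.default_rng(2)
n_samples = 3000
shank = rng.normal(0.0, 0.8, n_samples)                 # degrees (toy)
hip = -1.2 * shank + rng.normal(0.0, 0.6, n_samples)    # anti-phase hip motion

cov = np.cov(np.vstack([shank, hip]))       # 2x2 covariance of joint angles
eigvals, eigvecs = np.linalg.eigh(cov)      # sway magnitudes and directions

# The dominant eigenvector's direction distinguishes ankle- from hip-like sway.
dominant = eigvecs[:, np.argmax(eigvals)]
print("eigenvalues:", eigvals)
print("dominant sway direction (shank, hip):", dominant)
```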

  17. Cue Integration in Categorical Tasks: Insights from Audio-Visual Speech Perception

    PubMed Central

    Bejjanki, Vikranth Rao; Clayards, Meghan; Knill, David C.; Aslin, Richard N.

    2011-01-01

    Previous cue integration studies have examined continuous perceptual dimensions (e.g., size) and have shown that human cue integration is well described by a normative model in which cues are weighted in proportion to their sensory reliability, as estimated from single-cue performance. However, this normative model may not be applicable to categorical perceptual dimensions (e.g., phonemes). In tasks defined over categorical perceptual dimensions, optimal cue weights should depend not only on the sensory variance affecting the perception of each cue but also on the environmental variance inherent in each task-relevant category. Here, we present a computational and experimental investigation of cue integration in a categorical audio-visual (articulatory) speech perception task. Our results show that human performance during audio-visual phonemic labeling is qualitatively consistent with the behavior of a Bayes-optimal observer. Specifically, we show that the participants in our task are sensitive, on a trial-by-trial basis, to the sensory uncertainty associated with the auditory and visual cues, during phonemic categorization. In addition, we show that while sensory uncertainty is a significant factor in determining cue weights, it is not the only one and participants' performance is consistent with an optimal model in which environmental, within category variability also plays a role in determining cue weights. Furthermore, we show that in our task, the sensory variability affecting the visual modality during cue-combination is not well estimated from single-cue performance, but can be estimated from multi-cue performance. The findings and computational principles described here represent a principled first step towards characterizing the mechanisms underlying human cue integration in categorical tasks. PMID:21637344
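    For comparison with the categorical case studied above, the standard normative rule for continuous dimensions weights each cue by its inverse variance. The short sketch below shows that baseline computation with invented noise values.

```python
# Sketch of the standard reliability-weighted cue-combination rule referenced
# above (weights proportional to inverse variance); values are illustrative.
sigma_aud = 2.0    # auditory cue noise (arbitrary units)
sigma_vis = 1.0    # visual cue noise

w_aud = (1 / sigma_aud**2) / (1 / sigma_aud**2 + 1 / sigma_vis**2)
w_vis = 1.0 - w_aud

est_aud, est_vis = 4.0, 5.0                   # single-cue estimates
combined = w_aud * est_aud + w_vis * est_vis  # optimal bimodal estimate
combined_var = 1 / (1 / sigma_aud**2 + 1 / sigma_vis**2)
print(combined, combined_var)
```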

  18. Model-based analysis of pattern motion processing in mouse primary visual cortex

    PubMed Central

    Muir, Dylan R.; Roth, Morgane M.; Helmchen, Fritjof; Kampa, Björn M.

    2015-01-01

    Neurons in sensory areas of neocortex exhibit responses tuned to specific features of the environment. In visual cortex, information about features such as edges or textures with particular orientations must be integrated to recognize a visual scene or object. Connectivity studies in rodent cortex have revealed that neurons make specific connections within sub-networks sharing common input tuning. In principle, this sub-network architecture enables local cortical circuits to integrate sensory information. However, whether feature integration indeed occurs locally in rodent primary sensory areas has not been examined directly. We studied local integration of sensory features in primary visual cortex (V1) of the mouse by presenting drifting grating and plaid stimuli, while recording the activity of neuronal populations with two-photon calcium imaging. Using a Bayesian model-based analysis framework, we classified single-cell responses as being selective for either individual grating components or for moving plaid patterns. Rather than relying on trial-averaged responses, our model-based framework takes into account single-trial responses and can easily be extended to consider any number of arbitrary predictive models. Our analysis method was able to successfully classify significantly more responses than traditional partial correlation (PC) analysis, and provides a rigorous statistical framework to rank any number of models and reject poorly performing models. We also found a large proportion of cells that respond strongly to only one stimulus class. In addition, a quarter of selectively responding neurons had more complex responses that could not be explained by any simple integration model. Our results show that a broad range of pattern integration processes already take place at the level of V1. This diversity of integration is consistent with processing of visual inputs by local sub-networks within V1 that are tuned to combinations of sensory features. PMID:26300738
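    The model-based classification can be sketched as a comparison of single-trial log-likelihoods under competing response predictions. The example below, with synthetic tuning curves and Gaussian noise, keeps whichever of a "component" or "pattern" prediction explains the simulated trials better; it is a simplified stand-in for the Bayesian framework described above.

```python
# Hedged sketch of model-based response classification: compare summed
# log-likelihoods of single-trial plaid responses under a "component" versus
# a "pattern" prediction and keep the better model. All data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
directions = np.arange(0, 360, 30)                      # plaid directions (deg)

def grating_tuning(theta, pref=90.0, width=30.0):
    """Circular Gaussian tuning curve for a drifting grating."""
    d = np.deg2rad(theta - pref)
    return np.exp((np.cos(d) - 1) / np.deg2rad(width) ** 2)

# Pattern prediction: tuned to the plaid's overall motion direction.
pattern_pred = grating_tuning(directions)
# Component prediction: average response to the two gratings (+/- 60 deg).
component_pred = 0.5 * (grating_tuning(directions - 60) +
                        grating_tuning(directions + 60))

# Simulate a pattern-selective cell: 10 noisy trials per direction.
trials = pattern_pred + rng.normal(0, 0.1, (10, directions.size))

def gaussian_loglik(data, pred, sigma=0.1):
    # Log-likelihood up to an additive constant shared by both models.
    return -0.5 * np.sum(((data - pred) / sigma) ** 2)

ll_pattern = gaussian_loglik(trials, pattern_pred)
ll_component = gaussian_loglik(trials, component_pred)
print("classified as:", "pattern" if ll_pattern > ll_component else "component")
```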

  19. When a hit sounds like a kiss: An electrophysiological exploration of semantic processing in visual narrative.

    PubMed

    Manfredi, Mirella; Cohn, Neil; Kutas, Marta

    2017-06-01

    Researchers have long questioned whether information presented through different sensory modalities involves distinct or shared semantic systems. We investigated uni-sensory cross-modal processing by recording event-related brain potentials to words replacing the climactic event in a visual narrative sequence (comics). We compared Onomatopoeic words, which phonetically imitate action sounds (Pow!), with Descriptive words, which describe an action (Punch!), that were (in)congruent within their sequence contexts. Across two experiments, larger N400s appeared to Anomalous Onomatopoeic or Descriptive critical panels than to their congruent counterparts, reflecting a difficulty in semantic access/retrieval. Also, Descriptive words evinced a greater late frontal positivity compared to Onomatopoetic words, suggesting that, though plausible, they may be less predictable/expected in visual narratives. Our results indicate that uni-sensory cross-model integration of word/letter-symbol strings within visual narratives elicit ERP patterns typically observed for written sentence processing, thereby suggesting the engagement of similar domain-independent integration/interpretation mechanisms. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Food and conspecific chemical cues modify visual behavior of zebrafish, Danio rerio.

    PubMed

    Stephenson, Jessica F; Partridge, Julian C; Whitlock, Kathleen E

    2012-06-01

    Animals use the different qualities of olfactory and visual sensory information to make decisions. Ethological and electrophysiological evidence suggests that there is cross-modal priming between these sensory systems in fish. We present the first experimental study showing that ecologically relevant chemical mixtures alter visual behavior, using adult male and female zebrafish, Danio rerio. Neutral-density filters were used to attenuate the light reaching the tank to an initial light intensity of 2.3×10(16) photons/s/m2. Fish were exposed to food cue and to alarm cue. The light intensity was then increased by the removal of one layer of filter (nominal absorbance 0.3) every minute until, after 10 minutes, the light level was 15.5×10(16) photons/s/m2. Adult male and female zebrafish responded to a moving visual stimulus at lower light levels if they had been first exposed to food cue, or to conspecific alarm cue. These results suggest the need for more integrative studies of sensory biology.

  1. Modeling the Perception of Audiovisual Distance: Bayesian Causal Inference and Other Models

    PubMed Central

    2016-01-01

    Studies of audiovisual perception of distance are rare. Here, visual and auditory cue interactions in distance are tested against several multisensory models, including a modified causal inference model. This causal inference model includes predictions of estimate distributions. In our study, the audiovisual perception of distance was overall better explained by Bayesian causal inference than by other traditional models, such as sensory dominance and mandatory integration, and no interaction. Causal inference resolved with probability matching yielded the best fit to the data. Finally, we propose that sensory weights can also be estimated from causal inference. The analysis of the sensory weights allows us to obtain windows within which there is an interaction between the audiovisual stimuli. We find that the visual stimulus always contributes more than 80% to the perception of visual distance. The visual stimulus also contributes more than 50% to the perception of auditory distance, but only within a mobile window of interaction, which ranges from 1 to 4 m. PMID:27959919
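    A minimal sketch of the Bayesian causal inference computation, adapted to distance and resolved with probability matching as described above, follows. All parameters (cue noise, prior, and the prior probability of a common cause) are invented for illustration, and the model form follows the standard causal inference formulation rather than the paper's exact modified version.

```python
# Hedged sketch of Bayesian causal inference for an audiovisual distance
# judgment, resolved with probability matching. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(4)

# Noisy single-modality distance measurements (metres) and their noise SDs.
x_v, sigma_v = 2.0, 0.3        # visual
x_a, sigma_a = 2.8, 1.0        # auditory
mu_p, sigma_p = 3.0, 2.0       # Gaussian prior over distance
p_common = 0.5                 # prior probability of a single source

def gauss(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

# Likelihood of the two measurements under one source vs. two sources.
var_c = (sigma_v**2 * sigma_a**2 + sigma_v**2 * sigma_p**2
         + sigma_a**2 * sigma_p**2)
like_c1 = (np.exp(-0.5 * ((x_v - x_a) ** 2 * sigma_p**2 +
                          (x_v - mu_p) ** 2 * sigma_a**2 +
                          (x_a - mu_p) ** 2 * sigma_v**2) / var_c)
           / (2 * np.pi * np.sqrt(var_c)))
like_c2 = (gauss(x_v, mu_p, np.sqrt(sigma_v**2 + sigma_p**2)) *
           gauss(x_a, mu_p, np.sqrt(sigma_a**2 + sigma_p**2)))

post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

# Fused (common-cause) and segregated estimates of auditory distance.
fused = ((x_v / sigma_v**2 + x_a / sigma_a**2 + mu_p / sigma_p**2)
         / (1 / sigma_v**2 + 1 / sigma_a**2 + 1 / sigma_p**2))
segregated = ((x_a / sigma_a**2 + mu_p / sigma_p**2)
              / (1 / sigma_a**2 + 1 / sigma_p**2))

# Probability matching: pick the fused estimate with probability post_c1.
estimate = fused if rng.random() < post_c1 else segregated
print(post_c1, estimate)
```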

  2. When vision is not an option: children's integration of auditory and haptic information is suboptimal.

    PubMed

    Petrini, Karin; Remark, Alicia; Smith, Louise; Nardini, Marko

    2014-05-01

    When visual information is available, human adults, but not children, have been shown to reduce sensory uncertainty by taking a weighted average of sensory cues. In the absence of reliable visual information (e.g. extremely dark environment, visual disorders), the use of other information is vital. Here we ask how humans combine haptic and auditory information from childhood. In the first experiment, adults and children aged 5 to 11 years judged the relative sizes of two objects in auditory, haptic, and non-conflicting bimodal conditions. In the second experiment, different groups of adults and children were tested in non-conflicting and conflicting bimodal conditions. In the first experiment, adults reduced sensory uncertainty by integrating the cues optimally, while children did not. In the second experiment, adults and children used similar weighting strategies to solve audio-haptic conflict. These results suggest that, in the absence of visual information, optimal integration of cues for discrimination of object size develops late in childhood. © 2014 The Authors. Developmental Science Published by John Wiley & Sons Ltd.

  3. When a hit sounds like a kiss: an electrophysiological exploration of semantic processing in visual narrative

    PubMed Central

    Manfredi, Mirella; Cohn, Neil; Kutas, Marta

    2017-01-01

    Researchers have long questioned whether information presented through different sensory modalities involves distinct or shared semantic systems. We investigated uni-sensory cross-modal processing by recording event-related brain potentials to words replacing the climactic event in a visual narrative sequence (comics). We compared Onomatopoeic words, which phonetically imitate action sounds (Pow!), with Descriptive words, which describe an action (Punch!), that were (in)congruent within their sequence contexts. Across two experiments, larger N400s appeared to Anomalous Onomatopoeic or Descriptive critical panels than to their congruent counterparts, reflecting a difficulty in semantic access/retrieval. Also, Descriptive words evinced a greater late frontal positivity compared to Onomatopoetic words, suggesting that, though plausible, they may be less predictable/expected in visual narratives. Our results indicate that uni-sensory cross-model integration of word/letter-symbol strings within visual narratives elicit ERP patterns typically observed for written sentence processing, thereby suggesting the engagement of similar domain-independent integration/interpretation mechanisms. PMID:28242517

  4. The sensory strength of voluntary visual imagery predicts visual working memory capacity.

    PubMed

    Keogh, Rebecca; Pearson, Joel

    2014-10-09

    How much we can actively hold in mind is severely limited and differs greatly from one person to the next. Why some individuals have greater capacities than others is largely unknown. Here, we investigated why such large variations in visual working memory (VWM) capacity might occur, by examining the relationship between visual working memory and visual mental imagery. To assess visual working memory capacity participants were required to remember the orientation of a number of Gabor patches and make subsequent judgments about relative changes in orientation. The sensory strength of voluntary imagery was measured using a previously documented binocular rivalry paradigm. Participants with greater imagery strength also had greater visual working memory capacity. However, they were no better on a verbal number working memory task. Introducing a uniform luminous background during the retention interval of the visual working memory task reduced memory capacity, but only for those with strong imagery. Likewise, for the good imagers increasing background luminance during imagery generation reduced its effect on subsequent binocular rivalry. Luminance increases did not affect any of the subgroups on the verbal number working memory task. Together, these results suggest that luminance was disrupting sensory mechanisms common to both visual working memory and imagery, and not a general working memory system. The disruptive selectivity of background luminance suggests that good imagers, unlike moderate or poor imagers, may use imagery as a mnemonic strategy to perform the visual working memory task. © 2014 ARVO.

  5. Visualization of Sensory Neurons and Their Projections in an Upper Motor Neuron Reporter Line.

    PubMed

    Genç, Barış; Lagrimas, Amiko Krisa Bunag; Kuru, Pınar; Hess, Robert; Tu, Michael William; Menichella, Daniela Maria; Miller, Richard J; Paller, Amy S; Özdinler, P Hande

    2015-01-01

    Visualization of peripheral nervous system axons and cell bodies is important to understand their development, target recognition, and integration into complex circuitries. Numerous studies have used protein gene product (PGP) 9.5 [a.k.a. ubiquitin carboxy-terminal hydrolase L1 (UCHL1)] expression as a marker to label sensory neurons and their axons. Enhanced green fluorescent protein (eGFP) expression, under the control of the UCHL1 promoter, is stable and long lasting in the UCHL1-eGFP reporter line. In addition to the genetic labeling of corticospinal motor neurons in the motor cortex and degeneration-resistant spinal motor neurons in the spinal cord, here we report that neurons of the peripheral nervous system are also fluorescently labeled in the UCHL1-eGFP reporter line. eGFP expression is turned on at embryonic ages and lasts through adulthood, allowing detailed studies of cell bodies, axons and target innervation patterns of all sensory neurons in vivo. In addition, visualization of both the sensory and the motor neurons in the same animal offers many advantages. In this report, we used the UCHL1-eGFP reporter line in two different disease paradigms: diabetes and motor neuron disease. eGFP expression in sensory axons helped determine changes in epidermal nerve fiber density in a high-fat diet-induced diabetes model. Our findings corroborate previous studies, and suggest that more than five months is required for significant skin denervation. Crossing UCHL1-eGFP with hSOD1G93A mice generated the hSOD1G93A-UeGFP reporter line of amyotrophic lateral sclerosis (ALS), and revealed sensory nervous system defects, especially towards disease end-stage. Our studies not only emphasize the complexity of the disease in ALS, but also reveal that the UCHL1-eGFP reporter line would be a valuable tool to visualize and study various aspects of sensory nervous system development and degeneration in the context of numerous diseases.

  6. Unimodal primary sensory cortices are directly connected by long-range horizontal projections in the rat sensory cortex.

    PubMed

    Stehberg, Jimmy; Dang, Phat T; Frostig, Ron D

    2014-01-01

    Research based on functional imaging and neuronal recordings in the barrel cortex subdivision of primary somatosensory cortex (SI) of the adult rat has revealed novel aspects of structure-function relationships in this cortex. Specifically, it has demonstrated that single whisker stimulation evokes subthreshold neuronal activity that spreads symmetrically within gray matter from the appropriate barrel area, crosses cytoarchitectural borders of SI and reaches deeply into other unimodal primary cortices such as primary auditory (AI) and primary visual (VI). It was further demonstrated that this spread is supported by a spatially matching underlying diffuse network of border-crossing, long-range projections that could also reach deeply into AI and VI. Here we seek to determine whether such a network of border-crossing, long-range projections is unique to barrel cortex or characterizes also other primary, unimodal sensory cortices and therefore could directly connect them. Using anterograde (BDA) and retrograde (CTb) tract-tracing techniques, we demonstrate that such diffuse horizontal networks directly and mutually connect VI, AI and SI. These findings suggest that diffuse, border-crossing axonal projections connecting directly primary cortices are an important organizational motif common to all major primary sensory cortices in the rat. Potential implications of these findings for topics including cortical structure-function relationships, multisensory integration, functional imaging, and cortical parcellation are discussed.

  7. Unimodal primary sensory cortices are directly connected by long-range horizontal projections in the rat sensory cortex

    PubMed Central

    Stehberg, Jimmy; Dang, Phat T.; Frostig, Ron D.

    2014-01-01

    Research based on functional imaging and neuronal recordings in the barrel cortex subdivision of primary somatosensory cortex (SI) of the adult rat has revealed novel aspects of structure-function relationships in this cortex. Specifically, it has demonstrated that single whisker stimulation evokes subthreshold neuronal activity that spreads symmetrically within gray matter from the appropriate barrel area, crosses cytoarchitectural borders of SI and reaches deeply into other unimodal primary cortices such as primary auditory (AI) and primary visual (VI). It was further demonstrated that this spread is supported by a spatially matching underlying diffuse network of border-crossing, long-range projections that could also reach deeply into AI and VI. Here we seek to determine whether such a network of border-crossing, long-range projections is unique to barrel cortex or characterizes also other primary, unimodal sensory cortices and therefore could directly connect them. Using anterograde (BDA) and retrograde (CTb) tract-tracing techniques, we demonstrate that such diffuse horizontal networks directly and mutually connect VI, AI and SI. These findings suggest that diffuse, border-crossing axonal projections connecting directly primary cortices are an important organizational motif common to all major primary sensory cortices in the rat. Potential implications of these findings for topics including cortical structure-function relationships, multisensory integration, functional imaging, and cortical parcellation are discussed. PMID:25309339

  8. Sandwich masking eliminates both visual awareness of faces and face-specific brain activity through a feedforward mechanism.

    PubMed

    Harris, Joseph A; Wu, Chien-Te; Woldorff, Marty G

    2011-06-07

    It is generally agreed that considerable amounts of low-level sensory processing of visual stimuli can occur without conscious awareness. On the other hand, the degree of higher level visual processing that occurs in the absence of awareness is as yet unclear. Here, event-related potential (ERP) measures of brain activity were recorded during a sandwich-masking paradigm, a commonly used approach for attenuating conscious awareness of visual stimulus content. In particular, the present study used a combination of ERP activation contrasts to track both early sensory-processing ERP components and face-specific N170 ERP activations, in trials with versus without awareness. The electrophysiological measures revealed that the sandwich masking abolished the early face-specific N170 neural response (peaking at ~170 ms post-stimulus), an effect that paralleled the abolition of awareness of face versus non-face image content. Furthermore, however, the masking appeared to render a strong attenuation of earlier feedforward visual sensory-processing signals. This early attenuation presumably resulted in insufficient information being fed into the higher level visual system pathways specific to object category processing, thus leading to unawareness of the visual object content. These results support a coupling of visual awareness and neural indices of face processing, while also demonstrating an early low-level mechanism of interference in sandwich masking.

  9. Manually controlled human balancing using visual, vestibular and proprioceptive senses involves a common, low frequency neural process

    PubMed Central

    Lakie, Martin; Loram, Ian D

    2006-01-01

    Ten subjects balanced their own body or a mechanically equivalent unstable inverted pendulum by hand, through a compliant spring linkage. Their balancing process was always characterized by repeated small reciprocating hand movements. These bias adjustments were an observable sign of intermittent alterations in neural output. On average, the adjustments occurred at intervals of ∼400 ms. To generate appropriate stabilizing bias adjustments, sensory information about body or load movement is needed. Subjects used visual, vestibular or proprioceptive sensation alone and in combination to perform the tasks. We first ask: is the time between adjustments (bias duration) sensory specific? Vision is associated with slow responses. Other senses involved with balance are known to be faster. Our second question is: does bias duration depend on sensory abundance? An appropriate bias adjustment cannot occur until unplanned motion is unambiguously perceived (a sensory threshold). The addition of more sensory data should therefore expedite action, decreasing the mean bias adjustment duration. Statistical analysis showed that (1) the mean bias adjustment duration was remarkably independent of the sensory modality and (2) the addition of one or two sensory modalities made a small, but significant, decrease in the mean bias adjustment duration. Thus, a threshold effect can alter only a very minor part of the bias duration. The bias adjustment duration in manual balancing must reflect something more than visual sensation and perceptual thresholds; our suggestion is that it is a common central motor planning process. We predict that similar processes may be identified in the control of standing. PMID:16959857

  10. Social learning of predators in the dark: understanding the role of visual, chemical and mechanical information.

    PubMed

    Manassa, R P; McCormick, M I; Chivers, D P; Ferrari, M C O

    2013-08-22

    The ability of prey to observe and learn to recognize potential predators from the behaviour of nearby individuals can dramatically increase survival and, not surprisingly, is widespread across animal taxa. A range of sensory modalities are available for this learning, with visual and chemical cues being well-established modes of transmission in aquatic systems. The use of other sensory cues in mediating social learning in fishes, including mechano-sensory cues, remains unexplored. Here, we examine the role of different sensory cues in social learning of predator recognition, using juvenile damselfish (Amphiprion percula). Specifically, we show that a predator-naive observer can socially learn to recognize a novel predator when paired with a predator-experienced conspecific in total darkness. Furthermore, this study demonstrates that when threatened, individuals release chemical cues (known as disturbance cues) into the water. These cues induce an anti-predator response in nearby individuals; however, they do not facilitate learnt recognition of the predator. As such, another sensory modality, probably mechano-sensory in origin, is responsible for information transfer in the dark. This study highlights the diversity of sensory cues used by coral reef fishes in a social learning context.

  11. The cortical basis of true memory and false memory for motion.

    PubMed

    Karanian, Jessica M; Slotnick, Scott D

    2014-02-01

    Behavioral evidence indicates that false memory, like true memory, can be rich in sensory detail. By contrast, there is fMRI evidence that true memory for visual information produces greater activity in earlier visual regions than false memory, which suggests true memory is associated with greater sensory detail. However, false memory in previous fMRI paradigms may have lacked sufficient sensory detail to recruit earlier visual processing regions. To investigate this possibility in the present fMRI study, we employed a paradigm that produced feature-specific false memory with a high degree of visual detail. During the encoding phase, moving or stationary abstract shapes were presented to the left or right of fixation. During the retrieval phase, shapes from encoding were presented at fixation and participants classified each item as previously "moving" or "stationary" within each visual field. Consistent with previous fMRI findings, true memory but not false memory for motion activated motion processing region MT+, while both true memory and false memory activated later cortical processing regions. In addition, false memory but not true memory for motion activated language processing regions. The present findings indicate that true memory activates earlier visual regions to a greater degree than false memory, even under conditions of detailed retrieval. Thus, the dissociation between previous behavioral findings and fMRI findings does not appear to be task dependent. Future work will be needed to assess whether the same pattern of true memory and false memory activity is observed for different sensory modalities. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. A graph-theoretical analysis algorithm for quantifying the transition from sensory input to motor output by an emotional stimulus.

    PubMed

    Karmonik, Christof; Fung, Steve H; Dulay, M; Verma, A; Grossman, Robert G

    2013-01-01

    Graph-theoretical analysis algorithms have been used for identifying subnetworks in the human brain during the Default Mode State. Here, these methods are expanded to determine the interaction of the sensory and the motor subnetworks during the performance of an approach-avoidance paradigm, utilizing the correlation strength between the signal intensity time courses as a measure of synchrony. From functional magnetic resonance imaging (fMRI) data of 9 healthy volunteers, two signal time courses, one from the primary visual cortex (sensory input) and one from the motor cortex (motor output), were identified and a correlation difference map was calculated. Graph networks were created from this map and visualized with spring-embedded layouts and 3D layouts in the original anatomical space. Functional clusters in these networks were identified with the MCODE clustering algorithm. Interactions between the sensory sub-network and the motor sub-network were quantified through the interaction strengths of these clusters. The percentage of interactions involving the visual cortex ranged from 85 % to 18 %, and the percentage involving the motor cortex ranged from 40 % to 9 %. Other regions with high interactions were: frontal cortex (19 ± 18 %), insula (17 ± 22 %), cuneus (16 ± 15 %), supplementary motor area (SMA, 11 ± 18 %) and subcortical regions (11 ± 10 %). Interactions between motor cortex, SMA and visual cortex accounted for 12 %, between visual cortex and cuneus for 8 % and between motor cortex, SMA and cuneus for 6 % of all interactions. These quantitative findings are supported by the visual impressions from the 2D and 3D network layouts.
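
    As a rough illustration of the pipeline described (seed correlations, a correlation-based graph, and cluster-level interaction measures), here is a sketch on synthetic data. networkx's greedy modularity communities are used as a stand-in for MCODE, and all array shapes, thresholds, and region labels are assumptions.

```python
# Sketch of a seed-correlation / graph-clustering pipeline on synthetic data.
# Greedy modularity communities stand in for MCODE; values are placeholders.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(1)
n_regions, n_timepoints = 50, 200
latent = rng.standard_normal((2, n_timepoints))              # two latent networks
loadings = rng.integers(0, 2, size=(n_regions, 2))           # which regions load on which
loadings[loadings.sum(axis=1) == 0, 0] = 1                   # every region joins a network
ts = loadings @ latent + rng.standard_normal((n_regions, n_timepoints))
visual_seed, motor_seed = ts[0], ts[1]                       # sensory input / motor output

# Seed correlations and their difference map.
corr_visual = np.array([np.corrcoef(visual_seed, r)[0, 1] for r in ts])
corr_motor = np.array([np.corrcoef(motor_seed, r)[0, 1] for r in ts])
diff_map = corr_visual - corr_motor

# Graph whose edges connect regions with synchronous time courses.
corr_matrix = np.corrcoef(ts)
G = nx.Graph()
G.add_nodes_from(range(n_regions))
threshold = 0.3                                              # illustrative edge threshold
for i in range(n_regions):
    for j in range(i + 1, n_regions):
        if abs(corr_matrix[i, j]) > threshold:
            G.add_edge(i, j, weight=abs(corr_matrix[i, j]))

# Functional clusters (MCODE stand-in) and the cluster containing the visual seed.
clusters = list(greedy_modularity_communities(G, weight="weight"))
visual_cluster = next(c for c in clusters if 0 in c)
print(f"{len(clusters)} clusters; visual-seed cluster contains {len(visual_cluster)} regions")
```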

  13. Memorable Audiovisual Narratives Synchronize Sensory and Supramodal Neural Responses

    PubMed Central

    2016-01-01

    Abstract Our brains integrate information across sensory modalities to generate perceptual experiences and form memories. However, it is difficult to determine the conditions under which multisensory stimulation will benefit or hinder the retrieval of everyday experiences. We hypothesized that the determining factor is the reliability of information processing during stimulus presentation, which can be measured through intersubject correlation of stimulus-evoked activity. We therefore presented biographical auditory narratives and visual animations to 72 human subjects visually, auditorily, or combined, while neural activity was recorded using electroencephalography. Memory for the narrated information, contained in the auditory stream, was tested 3 weeks later. While the visual stimulus alone led to no meaningful retrieval, this related stimulus improved memory when it was combined with the story, even when it was temporally incongruent with the audio. Further, individuals with better subsequent memory elicited neural responses during encoding that were more correlated with their peers. Surprisingly, portions of this predictive synchronized activity were present regardless of the sensory modality of the stimulus. These data suggest that the strength of sensory and supramodal activity is predictive of memory performance after 3 weeks, and that neural synchrony may explain the mnemonic benefit of the functionally uninformative visual context observed for these real-world stimuli. PMID:27844062
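
    Intersubject correlation can be illustrated with a simple leave-one-out scheme: each subject's stimulus-evoked response is correlated with the average response of the remaining subjects. The sketch below uses simulated data and a basic correlation; the study's exact ISC method may differ in detail.

```python
# Leave-one-out intersubject correlation (ISC) on simulated EEG-like data.
# Data shapes and signal strengths are illustrative placeholders.
import numpy as np

def intersubject_correlation(data):
    """data: array of shape (n_subjects, n_timepoints) for one component."""
    n_subjects = data.shape[0]
    isc = np.empty(n_subjects)
    for s in range(n_subjects):
        others = np.delete(data, s, axis=0).mean(axis=0)   # leave-one-out average
        isc[s] = np.corrcoef(data[s], others)[0, 1]
    return isc

rng = np.random.default_rng(2)
shared = rng.standard_normal(1000)                         # stimulus-driven signal
data = 0.4 * shared + rng.standard_normal((72, 1000))      # 72 simulated subjects
print(f"mean ISC: {intersubject_correlation(data).mean():.2f}")
# Per the abstract, subjects with higher ISC during encoding would be
# predicted to show better memory for the narrated information weeks later.
```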

  14. Is Attentional Resource Allocation Across Sensory Modalities Task-Dependent?

    PubMed

    Wahn, Basil; König, Peter

    2017-01-01

    Human information processing is limited by attentional resources. That is, via attentional mechanisms, humans select a limited amount of sensory input to process while other sensory input is neglected. In multisensory research, a matter of ongoing debate is whether there are distinct pools of attentional resources for each sensory modality or whether attentional resources are shared across sensory modalities. Recent studies have suggested that attentional resource allocation across sensory modalities is in part task-dependent. That is, the recruitment of attentional resources across the sensory modalities depends on whether processing involves object-based attention (e.g., the discrimination of stimulus attributes) or spatial attention (e.g., the localization of stimuli). In the present paper, we review findings in multisensory research related to this view. For the visual and auditory sensory modalities, findings suggest that distinct resources are recruited when humans perform object-based attention tasks, whereas for the visual and tactile sensory modalities, partially shared resources are recruited. If object-based attention tasks are time-critical, shared resources are recruited across the sensory modalities. When humans perform an object-based attention task in combination with a spatial attention task, partly shared resources are recruited across the sensory modalities as well. Conversely, for spatial attention tasks, attentional processing consistently involves shared attentional resources across the sensory modalities. Generally, findings suggest that the attentional system flexibly allocates attentional resources depending on task demands. We propose that such flexibility reflects a large-scale optimization strategy that minimizes the brain's costly resource expenditures and simultaneously maximizes capability to process currently relevant information.

  15. Sensory gain control (amplification) as a mechanism of selective attention: electrophysiological and neuroimaging evidence.

    PubMed Central

    Hillyard, S A; Vogel, E K; Luck, S J

    1998-01-01

    Both physiological and behavioral studies have suggested that stimulus-driven neural activity in the sensory pathways can be modulated in amplitude during selective attention. Recordings of event-related brain potentials indicate that such sensory gain control or amplification processes play an important role in visual-spatial attention. Combined event-related brain potential and neuroimaging experiments provide strong evidence that attentional gain control operates at an early stage of visual processing in extrastriate cortical areas. These data support early selection theories of attention and provide a basis for distinguishing between separate mechanisms of attentional suppression (of unattended inputs) and attentional facilitation (of attended inputs). PMID:9770220

  16. Temporal recalibration of motor and visual potentials in lag adaptation in voluntary movement.

    PubMed

    Cai, Chang; Ogawa, Kenji; Kochiyama, Takanori; Tanaka, Hirokazu; Imamizu, Hiroshi

    2018-05-15

    Adaptively recalibrating motor-sensory asynchrony is critical for animals to perceive self-produced action consequences. It is controversial whether motor- or sensory-related neural circuits recalibrate this asynchrony. By combining magnetoencephalography (MEG) and functional MRI (fMRI), we investigate the temporal changes in brain activities caused by repeated exposure to a 150-ms delay inserted between a button-press action and a subsequent flash. We found that readiness potentials significantly shift later in the motor system, especially in parietal regions (average: 219.9 ms), while visually evoked potentials significantly shift earlier in occipital regions (average: 49.7 ms) in the delay condition compared to the no-delay condition. Moreover, the shift in readiness potentials, but not in visually evoked potentials, was significantly correlated with the psychophysical measure of motor-sensory adaptation. These results suggest that although both motor and sensory processes contribute to the recalibration, the motor process plays the major role, given the magnitudes of shift and the correlation with the psychophysical measure. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Postural Control Deficits in Autism Spectrum Disorder: The Role of Sensory Integration

    ERIC Educational Resources Information Center

    Doumas, Michail; McKenna, Roisin; Murphy, Blain

    2016-01-01

    We investigated the nature of sensory integration deficits in postural control of young adults with ASD. Postural control was assessed in a fixed environment, and in three environments in which sensory information about body sway from visual, proprioceptive or both channels was inaccurate. Furthermore, two levels of inaccurate information were…

  18. Sensory Aids Research Project - Clarke School for the Deaf.

    ERIC Educational Resources Information Center

    Boothroyd, Arthur

    Described is a program of research into sensory aids for the deaf, emphasizing research on factors involved in the effective use of sensory aids rather than evaluation of particular devices. Aspects of the program are the development of a programed testing and training unit, the control of fundamental voice frequency using visual feedback, and…

  19. Escape from harm: linking affective vision and motor responses during active avoidance

    PubMed Central

    Keil, Andreas

    2014-01-01

    When organisms confront unpleasant objects in their natural environments, they engage in behaviors that allow them to avoid aversive outcomes. Here, we linked visual processing of threat to its behavioral consequences by including a motor response that terminated exposure to an aversive event. Dense-array steady-state visual evoked potentials were recorded in response to conditioned threat and safety signals viewed in active or passive behavioral contexts. The amplitude of neuronal responses in visual cortex increased additively, as a function of emotional value and action relevance. The gain in local cortical population activity for threat relative to safety cues persisted when aversive reinforcement was behaviorally terminated, suggesting a lingering emotionally based response amplification within the visual system. Distinct patterns of long-range neural synchrony emerged between the visual cortex and extravisual regions. Increased coupling between visual and higher-order structures was observed specifically during active perception of threat, consistent with a reorganization of neuronal populations involved in linking sensory processing to action preparation. PMID:24493849

  20. Perceptual congruency of audio-visual speech affects ventriloquism with bilateral visual stimuli.

    PubMed

    Kanaya, Shoko; Yokosawa, Kazuhiko

    2011-02-01

    Many studies on multisensory processes have focused on performance in simplified experimental situations, with a single stimulus in each sensory modality. However, these results cannot necessarily be applied to explain our perceptual behavior in natural scenes where various signals exist within one sensory modality. We investigated the role of audio-visual syllable congruency on participants' auditory localization bias or the ventriloquism effect using spoken utterances and two videos of a talking face. Salience of facial movements was also manipulated. Results indicated that more salient visual utterances attracted participants' auditory localization. Congruent pairing of audio-visual utterances elicited greater localization bias than incongruent pairing, while previous studies have reported little dependency on the reality of stimuli in ventriloquism. Moreover, audio-visual illusory congruency, owing to the McGurk effect, caused substantial visual interference on auditory localization. Multisensory performance appears more flexible and adaptive in this complex environment than in previous studies.

  1. Direct neural pathways convey distinct visual information to Drosophila mushroom bodies

    PubMed Central

    Vogt, Katrin; Aso, Yoshinori; Hige, Toshihide; Knapek, Stephan; Ichinose, Toshiharu; Friedrich, Anja B; Turner, Glenn C; Rubin, Gerald M; Tanimoto, Hiromu

    2016-01-01

    Previously, we demonstrated that visual and olfactory associative memories of Drosophila share mushroom body (MB) circuits (Vogt et al., 2014). Unlike for odor representation, the MB circuit for visual information has not been characterized. Here, we show that a small subset of MB Kenyon cells (KCs) selectively responds to visual but not olfactory stimulation. The dendrites of these atypical KCs form a ventral accessory calyx (vAC), distinct from the main calyx that receives olfactory input. We identified two types of visual projection neurons (VPNs) directly connecting the optic lobes and the vAC. Strikingly, these VPNs are differentially required for visual memories of color and brightness. The segregation of visual and olfactory domains in the MB allows independent processing of distinct sensory memories and may be a conserved form of sensory representations among insects. DOI: http://dx.doi.org/10.7554/eLife.14009.001 PMID:27083044

  2. Age-equivalent top-down modulation during cross-modal selective attention.

    PubMed

    Guerreiro, Maria J S; Anguera, Joaquin A; Mishra, Jyoti; Van Gerven, Pascal W M; Gazzaley, Adam

    2014-12-01

    Selective attention involves top-down modulation of sensory cortical areas, such that responses to relevant information are enhanced whereas responses to irrelevant information are suppressed. Suppression of irrelevant information, unlike enhancement of relevant information, has been shown to be deficient in aging. Although these attentional mechanisms have been well characterized within the visual modality, little is known about these mechanisms when attention is selectively allocated across sensory modalities. The present EEG study addressed this issue by testing younger and older participants in three different tasks: Participants attended to the visual modality and ignored the auditory modality, attended to the auditory modality and ignored the visual modality, or passively perceived information presented through either modality. We found overall modulation of visual and auditory processing during cross-modal selective attention in both age groups. Top-down modulation of visual processing was observed as a trend toward enhancement of visual information in the setting of auditory distraction, but no significant suppression of visual distraction when auditory information was relevant. Top-down modulation of auditory processing, on the other hand, was observed as suppression of auditory distraction when visual stimuli were relevant, but no significant enhancement of auditory information in the setting of visual distraction. In addition, greater visual enhancement was associated with better recognition of relevant visual information, and greater auditory distractor suppression was associated with a better ability to ignore auditory distraction. There were no age differences in these effects, suggesting that when relevant and irrelevant information are presented through different sensory modalities, selective attention remains intact in older age.

  3. [Ventriloquism and audio-visual integration of voice and face].

    PubMed

    Yokosawa, Kazuhiko; Kanaya, Shoko

    2012-07-01

    Presenting synchronous auditory and visual stimuli in separate locations creates the illusion that the sound originates from the direction of the visual stimulus. Participants' auditory localization bias, called the ventriloquism effect, has revealed factors affecting the perceptual integration of audio-visual stimuli. However, many studies on audio-visual processes have focused on performance in simplified experimental situations, with a single stimulus in each sensory modality. These results cannot necessarily explain our perceptual behavior in natural scenes, where various signals exist within a single sensory modality. In the present study we report the contributions of a cognitive factor, that is, the audio-visual congruency of speech, although this factor has often been underestimated in previous ventriloquism research. Thus, we investigated the contribution of speech congruency on the ventriloquism effect using a spoken utterance and two videos of a talking face. The salience of facial movements was also manipulated. As a result, when bilateral visual stimuli are presented in synchrony with a single voice, cross-modal speech congruency was found to have a significant impact on the ventriloquism effect. This result also indicated that more salient visual utterances attracted participants' auditory localization. The congruent pairing of audio-visual utterances elicited greater localization bias than did incongruent pairing, whereas previous studies have reported little dependency on the reality of stimuli in ventriloquism. Moreover, audio-visual illusory congruency, owing to the McGurk effect, caused substantial visual interference to auditory localization. This suggests that a greater flexibility in responding to multi-sensory environments exists than has been previously considered.

  4. Visual stimuli induced by self-motion and object-motion modify odour-guided flight of male moths (Manduca sexta L.).

    PubMed

    Verspui, Remko; Gray, John R

    2009-10-01

    Animals rely on multimodal sensory integration for proper orientation within their environment. For example, odour-guided behaviours often require appropriate integration of concurrent visual cues. To gain a further understanding of mechanisms underlying sensory integration in odour-guided behaviour, our study examined the effects of visual stimuli induced by self-motion and object-motion on odour-guided flight in male M. sexta. By placing stationary objects (pillars) on either side of a female pheromone plume, moths produced self-induced visual motion during odour-guided flight. These flights showed a reduction in both ground and flight speeds and inter-turn interval when compared with flight tracks without stationary objects. Presentation of an approaching 20 cm disc, to simulate object-motion, resulted in interrupted odour-guided flight and changes in flight direction away from the pheromone source. Modifications of odour-guided flight behaviour in the presence of stationary objects suggest that visual information, in conjunction with olfactory cues, can be used to control the rate of counter-turning. We suggest that the behavioural responses to visual stimuli induced by object-motion indicate the presence of a neural circuit that relays visual information to initiate escape responses. These behavioural responses also suggest the presence of a sensory conflict requiring a trade-off between olfactory and visually driven behaviours. The mechanisms underlying olfactory and visual integration are discussed in the context of these behavioural responses.

  5. Prefrontal contributions to visual selective attention.

    PubMed

    Squire, Ryan F; Noudoost, Behrad; Schafer, Robert J; Moore, Tirin

    2013-07-08

    The faculty of attention endows us with the capacity to process important sensory information selectively while disregarding information that is potentially distracting. Much of our understanding of the neural circuitry underlying this fundamental cognitive function comes from neurophysiological studies within the visual modality. Past evidence suggests that a principal function of the prefrontal cortex (PFC) is selective attention and that this function involves the modulation of sensory signals within posterior cortices. In this review, we discuss recent progress in identifying the specific prefrontal circuits controlling visual attention and its neural correlates within the primate visual system. In addition, we examine the persisting challenge of precisely defining how behavior should be affected when attentional function is lost.

  6. An inside look at the sensory biology of triatomines.

    PubMed

    Barrozo, Romina B; Reisenman, Carolina E; Guerenstein, Pablo; Lazzari, Claudio R; Lorenzo, Marcelo G

    Although kissing bugs (Triatominae: Reduviidae) are perhaps best known as vectors of Chagas disease, they are important experimental models in studies of insect sensory physiology, pioneered by the seminal studies of Wigglesworth and Gillet more than eighty years ago. Since then, many investigations have revealed that the thermal, hygric, visual and olfactory senses play critical roles in the orientation of these blood-sucking insects towards hosts. Here we review the current knowledge about the role of these sensory systems, focussing on relevant stimuli, sensory structures, receptor physiology and the molecular players involved in the complex and cryptic behavioural repertoire of these nocturnal insects. Odours are particularly relevant, as they are involved in host search and are used for sexual, aggregation and alarm communication. Tastants are critical for a proper recognition of hosts, food and conspecifics. Heat and relative humidity mediate orientation towards hosts and are also important for the selection of resting places. Vision, which mediates negative phototaxis and flight dispersion, is also critical for modulating shelter use and mediating escape responses. The molecular bases underlying the detection of sensory stimuli started to be uncovered by means of functional genetics due to both the recent publication of the genome sequence of Rhodnius prolixus and the availability of modern genome editing techniques. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Age-Related Changes in Visual Temporal Order Judgment Performance: Relation to Sensory and Cognitive Capacities

    PubMed Central

    Busey, Thomas; Craig, James; Clark, Chris; Humes, Larry

    2010-01-01

    Five measures of temporal order judgments were obtained from 261 participants, including 146 elder, 44 middle aged, and 71 young participants. Strong age group differences were observed in all five measures, although the group differences were reduced when letter discriminability was matched for all participants. Significant relations were found between these measures of temporal processing and several cognitive and sensory assays, and structural equation modeling revealed the degree to which temporal order processing can be viewed as a latent factor that depends in part on contributions from sensory and cognitive capacities. The best-fitting model involved two different latent factors representing temporal order processing at same and different locations, and the sensory and cognitive factors were more successful predicting performance in the different location factor than the same-location factor. Processing speed, even measured using high-contrast symbols on a paper-and-pencil test, was a surprisingly strong predictor of variability in both latent factors. However, low-level sensory measures also made significant contributions to the latent factors. The results demonstrate the degree to which temporal order processing relates to other perceptual and cognitive capacities, and address the question of whether age-related declines in these capacities share a common cause. PMID:20580644

  8. Age-related changes in visual temporal order judgment performance: Relation to sensory and cognitive capacities.

    PubMed

    Busey, Thomas; Craig, James; Clark, Chris; Humes, Larry

    2010-08-06

    Five measures of temporal order judgments were obtained from 261 participants, including 146 elder, 44 middle aged, and 71 young participants. Strong age group differences were observed in all five measures, although the group differences were reduced when letter discriminability was matched for all participants. Significant relations were found between these measures of temporal processing and several cognitive and sensory assays, and structural equation modeling revealed the degree to which temporal order processing can be viewed as a latent factor that depends in part on contributions from sensory and cognitive capacities. The best-fitting model involved two different latent factors representing temporal order processing at same and different locations, and the sensory and cognitive factors were more successful predicting performance in the different location factor than the same-location factor. Processing speed, even measured using high-contrast symbols on a paper-and-pencil test, was a surprisingly strong predictor of variability in both latent factors. However, low-level sensory measures also made significant contributions to the latent factors. The results demonstrate the degree to which temporal order processing relates to other perceptual and cognitive capacities, and address the question of whether age-related declines in these capacities share a common cause. Copyright 2010 Elsevier Ltd. All rights reserved.

  9. Explicit Encoding of Multimodal Percepts by Single Neurons in the Human Brain

    PubMed Central

    Quiroga, Rodrigo Quian; Kraskov, Alexander; Koch, Christof; Fried, Itzhak

    2010-01-01

    Summary Different pictures of Marilyn Monroe can evoke the same percept, even if greatly modified as in Andy Warhol’s famous portraits. But how does the brain recognize highly variable pictures as the same percept? Various studies have provided insights into how visual information is processed along the “ventral pathway,” via both single-cell recordings in monkeys [1, 2] and functional imaging in humans [3, 4]. Interestingly, in humans, the same “concept” of Marilyn Monroe can be evoked with other stimulus modalities, for instance by hearing or reading her name. Brain imaging studies have identified cortical areas selective to voices [5, 6] and visual word forms [7, 8]. However, how visual, text, and sound information can elicit a unique percept is still largely unknown. By using presentations of pictures and of spoken and written names, we show that (1) single neurons in the human medial temporal lobe (MTL) respond selectively to representations of the same individual across different sensory modalities; (2) the degree of multimodal invariance increases along the hierarchical structure within the MTL; and (3) such neuronal representations can be generated within less than a day or two. These results demonstrate that single neurons can encode percepts in an explicit, selective, and invariant manner, even if evoked by different sensory modalities. PMID:19631538

  10. A hierarchy of timescales explains distinct effects of local inhibition of primary visual cortex and frontal eye fields

    PubMed Central

    Cocchi, Luca; Sale, Martin V; L Gollo, Leonardo; Bell, Peter T; Nguyen, Vinh T; Zalesky, Andrew; Breakspear, Michael; Mattingley, Jason B

    2016-01-01

    Within the primate visual system, areas at lower levels of the cortical hierarchy process basic visual features, whereas those at higher levels, such as the frontal eye fields (FEF), are thought to modulate sensory processes via feedback connections. Despite these functional exchanges during perception, there is little shared activity between early and late visual regions at rest. How interactions emerge between regions encompassing distinct levels of the visual hierarchy remains unknown. Here we combined neuroimaging, non-invasive cortical stimulation and computational modelling to characterize changes in functional interactions across widespread neural networks before and after local inhibition of primary visual cortex or FEF. We found that stimulation of early visual cortex selectively increased feedforward interactions with FEF and extrastriate visual areas, whereas identical stimulation of the FEF decreased feedback interactions with early visual areas. Computational modelling suggests that these opposing effects reflect a fast-slow timescale hierarchy from sensory to association areas. DOI: http://dx.doi.org/10.7554/eLife.15252.001 PMID:27596931

  11. A hierarchy of timescales explains distinct effects of local inhibition of primary visual cortex and frontal eye fields.

    PubMed

    Cocchi, Luca; Sale, Martin V; L Gollo, Leonardo; Bell, Peter T; Nguyen, Vinh T; Zalesky, Andrew; Breakspear, Michael; Mattingley, Jason B

    2016-09-06

    Within the primate visual system, areas at lower levels of the cortical hierarchy process basic visual features, whereas those at higher levels, such as the frontal eye fields (FEF), are thought to modulate sensory processes via feedback connections. Despite these functional exchanges during perception, there is little shared activity between early and late visual regions at rest. How interactions emerge between regions encompassing distinct levels of the visual hierarchy remains unknown. Here we combined neuroimaging, non-invasive cortical stimulation and computational modelling to characterize changes in functional interactions across widespread neural networks before and after local inhibition of primary visual cortex or FEF. We found that stimulation of early visual cortex selectively increased feedforward interactions with FEF and extrastriate visual areas, whereas identical stimulation of the FEF decreased feedback interactions with early visual areas. Computational modelling suggests that these opposing effects reflect a fast-slow timescale hierarchy from sensory to association areas.

  12. Identifying selective visual attention biases related to fear of pain by tracking eye movements within a dot-probe paradigm.

    PubMed

    Yang, Zhou; Jackson, Todd; Gao, Xiao; Chen, Hong

    2012-08-01

    This research examined selective biases in visual attention related to fear of pain by tracking eye movements (EM) toward pain-related stimuli among the pain-fearful. EM of 21 young adults scoring high on a fear of pain measure (H-FOP) and 20 lower-scoring (L-FOP) control participants were measured during a dot-probe task that featured sensory pain-neutral, health catastrophe-neutral and neutral-neutral word pairs. Analyses indicated that the H-FOP group was more likely to direct immediate visual attention toward sensory pain and health catastrophe words than was the L-FOP group. The H-FOP group also had comparatively shorter first fixation latencies toward sensory pain and health catastrophe words. Conversely, groups did not differ on EM indices of attentional maintenance (i.e., first fixation duration, gaze duration, and average fixation duration) or reaction times to dot probes. Finally, both groups showed a cycle of disengagement followed by re-engagement toward sensory pain words relative to other word types. In sum, this research is the first to reveal biases toward pain stimuli during very early stages of visual information processing among the highly pain-fearful and highlights the utility of EM tracking as a means to evaluate visual attention as a dynamic process in the context of FOP. Copyright © 2012 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  13. Measuring the effect of attention on simple visual search.

    PubMed

    Palmer, J; Ames, C T; Lindsey, D T

    1993-02-01

    Set-size effects in visual search may be due to one or more of three factors: sensory processes such as lateral masking between stimuli, attentional processes limiting the perception of individual stimuli, or attentional processes affecting the decision rules for combining information from multiple stimuli. These possibilities were evaluated in tasks such as searching for a longer line among shorter lines. To evaluate sensory contributions, display set-size effects were compared with cuing conditions that held sensory phenomena constant. Similar effects for the display and cue manipulations suggested that sensory processes contributed little under the conditions of this experiment. To evaluate the contribution of decision processes, the set-size effects were modeled with signal detection theory. In these models, a decision effect alone was sufficient to predict the set-size effects without any attentional limitation due to perception.
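
    The decision-level (signal detection) account of set-size effects can be made concrete with a max-rule simulation: accuracy declines with display size even though each item's sensory representation is unchanged. The d' value and set sizes below are illustrative assumptions.

```python
# Max-rule (decision integration) simulation of set-size effects in 2AFC search.
# d-prime and set sizes are illustrative, not the paper's fitted values.
import numpy as np

rng = np.random.default_rng(3)

def percent_correct(set_size, d_prime=2.0, n_trials=20000):
    """2AFC search: choose the display containing the longer (target) line."""
    # Target display: one target item plus (set_size - 1) distractors.
    target_disp = rng.standard_normal((n_trials, set_size))
    target_disp[:, 0] += d_prime
    # Foil display: distractors only.
    foil_disp = rng.standard_normal((n_trials, set_size))
    # Max rule: pick whichever display contains the largest observed value.
    correct = target_disp.max(axis=1) > foil_disp.max(axis=1)
    return correct.mean()

for n in (1, 2, 4, 8):
    print(f"set size {n}: {100 * percent_correct(n):.1f}% correct")
# Accuracy falls with set size purely because more noise samples enter the
# max-rule decision, with no attentional limit on individual-item perception.
```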

  14. Visual Working Memory Is Independent of the Cortical Spacing Between Memoranda.

    PubMed

    Harrison, William J; Bays, Paul M

    2018-03-21

    The sensory recruitment hypothesis states that visual short-term memory is maintained in the same visual cortical areas that initially encode a stimulus' features. Although it is well established that the distance between features in visual cortex determines their visibility, a limitation known as crowding, it is unknown whether short-term memory is similarly constrained by the cortical spacing of memory items. Here, we investigated whether the cortical spacing between sequentially presented memoranda affects the fidelity of memory in humans (of both sexes). In a first experiment, we varied cortical spacing by taking advantage of the log-scaling of visual cortex with eccentricity, presenting memoranda in peripheral vision sequentially along either the radial or tangential visual axis with respect to the fovea. In a second experiment, we presented memoranda sequentially either within or beyond the critical spacing of visual crowding, a distance within which visual features cannot be perceptually distinguished due to their nearby cortical representations. In both experiments and across multiple measures, we found strong evidence that the ability to maintain visual features in memory is unaffected by cortical spacing. These results indicate that the neural architecture underpinning working memory has properties inconsistent with the known behavior of sensory neurons in visual cortex. Instead, the dissociation between perceptual and memory representations supports a role of higher cortical areas such as posterior parietal or prefrontal regions or may involve an as yet unspecified mechanism in visual cortex in which stimulus features are bound to their temporal order. SIGNIFICANCE STATEMENT Although much is known about the resolution with which we can remember visual objects, the cortical representation of items held in short-term memory remains contentious. A popular hypothesis suggests that memory of visual features is maintained via the recruitment of the same neural architecture in sensory cortex that encodes stimuli. We investigated this claim by manipulating the spacing in visual cortex between sequentially presented memoranda such that some items shared cortical representations more than others while preventing perceptual interference between stimuli. We found clear evidence that short-term memory is independent of the intracortical spacing of memoranda, revealing a dissociation between perceptual and memory representations. Our data indicate that working memory relies on different neural mechanisms from sensory perception. Copyright © 2018 Harrison and Bays.

  15. Stronger Neural Modulation by Visual Motion Intensity in Autism Spectrum Disorders

    PubMed Central

    Peiker, Ina; Schneider, Till R.; Milne, Elizabeth; Schöttle, Daniel; Vogeley, Kai; Münchau, Alexander; Schunke, Odette; Siegel, Markus; Engel, Andreas K.; David, Nicole

    2015-01-01

    Theories of autism spectrum disorders (ASD) have focused on altered perceptual integration of sensory features as a possible core deficit. Yet, there is little understanding of the neuronal processing of elementary sensory features in ASD. For typically developed individuals, we previously established a direct link between frequency-specific neural activity and the intensity of a specific sensory feature: Gamma-band activity in the visual cortex increased approximately linearly with the strength of visual motion. Using magnetoencephalography (MEG), we investigated whether in individuals with ASD neural activity reflect the coherence, and thus intensity, of visual motion in a similar fashion. Thirteen adult participants with ASD and 14 control participants performed a motion direction discrimination task with increasing levels of motion coherence. A polynomial regression analysis revealed that gamma-band power increased significantly stronger with motion coherence in ASD compared to controls, suggesting excessive visual activation with increasing stimulus intensity originating from motion-responsive visual areas V3, V6 and hMT/V5. Enhanced neural responses with increasing stimulus intensity suggest an enhanced response gain in ASD. Response gain is controlled by excitatory-inhibitory interactions, which also drive high-frequency oscillations in the gamma-band. Thus, our data suggest that a disturbed excitatory-inhibitory balance underlies enhanced neural responses to coherent motion in ASD. PMID:26147342
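
    The group comparison described here amounts to fitting a power-versus-coherence regression per participant and comparing the resulting slopes between groups. A toy sketch with simulated values follows; the numbers and the simple linear fit are placeholders, not the study's data or exact model.

```python
# Toy per-participant slope comparison for gamma power vs. motion coherence.
# Values, group sizes, and the linear fit are illustrative placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
coherence = np.array([0.0, 0.25, 0.5, 0.75, 1.0])        # motion coherence levels

def simulate_slopes(n_participants, mean_slope):
    slopes = []
    for _ in range(n_participants):
        gamma_power = mean_slope * coherence + rng.normal(0, 0.1, coherence.size)
        slopes.append(np.polyfit(coherence, gamma_power, deg=1)[0])  # slope term
    return np.array(slopes)

asd_slopes = simulate_slopes(13, mean_slope=1.0)          # steeper gain (illustrative)
control_slopes = simulate_slopes(14, mean_slope=0.6)
t, p = stats.ttest_ind(asd_slopes, control_slopes)
print(f"group difference in slope: t = {t:.2f}, p = {p:.3f}")
```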

  16. Multisensory integration across the senses in young and old adults

    PubMed Central

    Mahoney, Jeannette R.; Li, Po Ching Clara; Oh-Park, Mooyeon; Verghese, Joe; Holtzer, Roee

    2011-01-01

    Stimuli are processed concurrently and across multiple sensory inputs. Here we directly compared the effect of multisensory integration (MSI) on reaction time across three paired sensory inputs in eighteen young (M=19.17 yrs) and eighteen old (M=76.44 yrs) individuals. Participants were determined to be non-demented and without any medical or psychiatric conditions that would affect their performance. Participants responded to randomly presented unisensory (auditory, visual, somatosensory) stimuli and three paired sensory inputs consisting of auditory-somatosensory (AS), auditory-visual (AV), and visual-somatosensory (VS) stimuli. Results revealed that reaction time (RT) to all multisensory pairings was significantly faster than those elicited to the constituent unisensory conditions across age groups; findings that could not be accounted for by simple probability summation. Both young and old participants responded the fastest to multisensory pairings containing somatosensory input. Compared to younger adults, older adults demonstrated a significantly greater RT benefit when processing concurrent VS information. In terms of co-activation, older adults demonstrated a significant increase in the magnitude of visual-somatosensory co-activation (i.e., multisensory integration), while younger adults demonstrated a significant increase in the magnitude of auditory-visual and auditory-somatosensory co-activation. This study provides first evidence in support of the facilitative effect of pairing somatosensory with visual stimuli in older adults. PMID:22024545
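
    The claim that facilitation "could not be accounted for by simple probability summation" is typically assessed against the race-model bound on cumulative RT distributions. A minimal sketch with simulated reaction times follows; all RT values are placeholders.

```python
# Race-model (probability summation) check on simulated reaction times.
# Positive violations indicate co-activation beyond statistical facilitation.
import numpy as np

rng = np.random.default_rng(5)
rt_a = rng.normal(320, 40, 500)          # unisensory auditory RTs (ms)
rt_v = rng.normal(340, 40, 500)          # unisensory visual RTs (ms)
rt_av = rng.normal(270, 35, 500)         # multisensory RTs (ms)

def ecdf(sample, t):
    """Empirical cumulative distribution of `sample` evaluated at times `t`."""
    return np.mean(sample[:, None] <= t[None, :], axis=0)

t_grid = np.linspace(150, 450, 61)
race_bound = np.minimum(ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid), 1.0)
violation = ecdf(rt_av, t_grid) - race_bound
print(f"max race-model violation: {violation.max():.3f}")
# Where the multisensory CDF exceeds the race-model bound, the RT gain cannot
# be explained by simple probability summation of the two unisensory channels.
```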

  17. Pharmacologic attenuation of cross-modal sensory augmentation within the chronic pain insula

    PubMed Central

    Harte, Steven E.; Ichesco, Eric; Hampson, Johnson P.; Peltier, Scott J.; Schmidt-Wilcke, Tobias; Clauw, Daniel J.; Harris, Richard E.

    2016-01-01

    Abstract Pain can be elicited through all mammalian sensory pathways yet cross-modal sensory integration, and its relationship to clinical pain, is largely unexplored. Centralized chronic pain conditions such as fibromyalgia are often associated with symptoms of multisensory hypersensitivity. In this study, female patients with fibromyalgia demonstrated cross-modal hypersensitivity to visual and pressure stimuli compared with age- and sex-matched healthy controls. Functional magnetic resonance imaging revealed that insular activity evoked by an aversive level of visual stimulation was associated with the intensity of fibromyalgia pain. Moreover, attenuation of this insular activity by the analgesic pregabalin was accompanied by concomitant reductions in clinical pain. A multivariate classification method using support vector machines (SVM) applied to visual-evoked brain activity distinguished patients with fibromyalgia from healthy controls with 82% accuracy. A separate SVM classification of treatment effects on visual-evoked activity reliably identified when patients were administered pregabalin as compared with placebo. Both SVM analyses identified significant weights within the insular cortex during aversive visual stimulation. These data suggest that abnormal integration of multisensory and pain pathways within the insula may represent a pathophysiological mechanism in some chronic pain conditions and that insular response to aversive visual stimulation may have utility as a marker for analgesic drug development. PMID:27101425
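
    As an illustration of the classification approach reported, the sketch below trains a linear SVM with cross-validation on simulated response patterns. Feature dimensions, labels, and effect sizes are placeholders, not the study's data or exact pipeline.

```python
# Linear SVM classification of simulated visual-evoked response patterns,
# with cross-validated accuracy. All data and dimensions are placeholders.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
n_per_group, n_features = 20, 100                            # subjects x voxel features
patients = rng.normal(0.3, 1.0, (n_per_group, n_features))   # shifted mean (illustrative)
controls = rng.normal(0.0, 1.0, (n_per_group, n_features))
X = np.vstack([patients, controls])
y = np.array([1] * n_per_group + [0] * n_per_group)          # 1 = patient, 0 = control

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)                    # 5-fold cross-validation
print(f"cross-validated accuracy: {scores.mean():.2f}")
# In a linear SVM, the learned feature weights (e.g., over insula voxels)
# indicate which regions carry the most weight in the classification.
```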

  18. The sensory basis of rheotaxis in turbulent flow

    NASA Astrophysics Data System (ADS)

    Elder, John P.

    Rheotaxis is a robust, multisensory behavior with many potential benefits for fish and other aquatic animals, yet the influence of different fluvial conditions on rheotactic performance and its sensory basis is still poorly understood. Here, we examine the role that vision and the lateral line play in the rheotactic behavior of a stream-dwelling species (Mexican tetra, Astyanax mexicanus) under both rectilinear and turbulent flow conditions. Turbulence enhanced overall rheotactic strength and lowered the flow speed at which rheotaxis was initiated; this effect did not depend on the availability of either visual or lateral line information. Compared to fish without access to visual information, fish with access to visual information exhibited increased levels of positional stability and as a result, increased levels of rheotactic accuracy. No disruption in rheotactic performance was found when the lateral line was disabled, suggesting that this sensory system is not necessary for either rheotaxis or turbulence detection under the conditions of this study.

  19. [Personnel with poor vision at fighter pilot school].

    PubMed

    Corbé, C; Menu, J P

    1997-10-01

    The piloting of fighter aircraft, the navigation of a space shuttle, and the piloting of a helicopter in tactical flight at an altitude of 50 metres require the use of all the sensory systems: ocular, vestibular, proprioceptive, and others. The selection and follow-up of the pilots of these aircraft therefore call for a very complete assessment of medical parameters, in particular of the sensory and notably the visual system. The physicians and expert researchers in aeronautical and space medicine of the Army Health Department, who are responsible for the medical supervision of flight crews, must study, create, and improve tests of visual sensory function developed from fundamental and applied research. These tests, validated with military pilots, were applied in ophthalmology for the assessment of normal and deficient vision. Following this work, a proposal to change the World Health Organisation norms applied to vision in persons with low vision was also introduced.

  20. Subsystems of sensory attention for skilled reaching: vision for transport and pre-shaping and somatosensation for grasping, withdrawal and release.

    PubMed

    Sacrey, Lori-Ann R; Whishaw, Ian Q

    2012-06-01

    Skilled reaching is a forelimb movement in which a subject reaches for a piece of food that is placed in the mouth for eating. It is a natural movement used by many animal species and is a routine, daily activity for humans. Its prominent features include transport of the hand to a target, shaping the digits in preparation for grasping, grasping, and withdrawal of the hand to place the food in the mouth. Studies on normal human adults show that skilled reaching is mediated by at least two sensory attention processes. Hand transport to the target and hand shaping are temporally coupled with visual fixation on the target. Grasping, withdrawal, and placing the food into the mouth are associated with visual disengagement and somatosensory guidance. Studies on nonhuman animal species illustrate that shared visual and somatosensory attention likely evolved in the primate lineage. Studies on developing infants illustrate that shared attention requires both experience and maturation. Studies on subjects with Parkinson's disease and Huntington's disease illustrate that decomposition of shared attention also features compensatory visual guidance. The evolutionary, developmental, and neural control of skilled reaching suggests that associative learning processes are importantly related to normal adult attention sharing and so can be used in remediation. The economical use of sensory attention in the different phases of skilled reaching ensures efficiency in eating, reduces sensory interference between sensory reference frames, and provides efficient neural control of the advance and withdrawal components of skilled reaching movements. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Visual Predictions in the Orbitofrontal Cortex Rely on Associative Content

    PubMed Central

    Chaumon, Maximilien; Kveraga, Kestutis; Barrett, Lisa Feldman; Bar, Moshe

    2014-01-01

    Predicting upcoming events from incomplete information is an essential brain function. The orbitofrontal cortex (OFC) plays a critical role in this process by facilitating recognition of sensory inputs via predictive feedback to sensory cortices. In the visual domain, the OFC is engaged by low spatial frequency (LSF) and magnocellular-biased inputs, but beyond this, we know little about the information content required to activate it. Is the OFC automatically engaged to analyze any LSF information for meaning? Or is it engaged only when LSF information matches preexisting memory associations? We tested these hypotheses and show that only LSF information that could be linked to memory associations engages the OFC. Specifically, LSF stimuli activated the OFC in 2 distinct medial and lateral regions only if they resembled known visual objects. More identifiable objects increased activity in the medial OFC, known for its function in affective responses. Furthermore, these objects also increased the connectivity of the lateral OFC with the ventral visual cortex, a crucial region for object identification. At the interface between sensory, memory, and affective processing, the OFC thus appears to be attuned to the associative content of visual information and to play a central role in visuo-affective prediction. PMID:23771980

  2. A sensorimotor account of vision and visual consciousness.

    PubMed

    O'Regan, J K; Noë, A

    2001-10-01

    Many current neurophysiological, psychophysical, and psychological approaches to vision rest on the idea that when we see, the brain produces an internal representation of the world. The activation of this internal representation is assumed to give rise to the experience of seeing. The problem with this kind of approach is that it leaves unexplained how the existence of such a detailed internal representation might produce visual consciousness. An alternative proposal is made here. We propose that seeing is a way of acting. It is a particular way of exploring the environment. Activity in internal representations does not generate the experience of seeing. The outside world serves as its own, external, representation. The experience of seeing occurs when the organism masters what we call the governing laws of sensorimotor contingency. The advantage of this approach is that it provides a natural and principled way of accounting for visual consciousness, and for the differences in the perceived quality of sensory experience in the different sensory modalities. Several lines of empirical evidence are brought forward in support of the theory, in particular: evidence from experiments in sensorimotor adaptation, visual "filling in," visual stability despite eye movements, change blindness, sensory substitution, and color perception.

  3. Countermeasures to Enhance Sensorimotor Adaptability

    NASA Technical Reports Server (NTRS)

    Bloomberg, J. J.; Peters, B. T.; Mulavara, A. P.; Brady, R. A.; Batson, C. C.; Miller, C. A.; Cohen, H. S.

    2011-01-01

    During exploration-class missions, sensorimotor disturbances may lead to disruption in the ability to ambulate and perform functional tasks during the initial introduction to a novel gravitational environment following a landing on a planetary surface. The goal of our current project is to develop a sensorimotor adaptability (SA) training program to facilitate rapid adaptation to novel gravitational environments. We have developed a unique training system composed of a treadmill placed on a motion-base facing a virtual visual scene that provides an unstable walking surface combined with incongruent visual flow designed to enhance sensorimotor adaptability. We have conducted a series of studies that have shown: Training using a combination of modified visual flow and support surface motion during treadmill walking enhances locomotor adaptability to a novel sensorimotor environment. Trained individuals become more proficient at performing multiple competing tasks while walking during adaptation to novel discordant sensorimotor conditions. Trained subjects can retain their increased level of adaptability over a six-month period. SA training is effective in producing increased adaptability in a more complex over-ground ambulatory task on an obstacle course. This confirms that for a complex task like walking, treadmill training contains enough of the critical features of overground walking to be an effective training modality. The structure of individual training sessions can be optimized to promote fast/strategic motor learning. Training sessions that each contain short-duration exposures to multiple perturbation stimuli allow subjects to acquire a greater ability to rapidly reorganize appropriate response strategies when encountering a novel sensory environment. Individual sensory biases (i.e. increased visual dependency) can predict adaptive responses to novel sensory environments, suggesting that customized training prescriptions can be developed to enhance adaptability. These results indicate that SA training techniques can be added to existing treadmill exercise equipment and procedures to produce a single integrated countermeasure system to improve performance of astro/cosmonauts during prolonged exploratory space missions.

  4. Estimating the relative weights of visual and auditory tau versus heuristic-based cues for time-to-contact judgments in realistic, familiar scenes by older and younger adults.

    PubMed

    Keshavarz, Behrang; Campos, Jennifer L; DeLucia, Patricia R; Oberfeld, Daniel

    2017-04-01

    Estimating time to contact (TTC) involves multiple sensory systems, including vision and audition. Previous findings suggested that the ratio of an object's instantaneous optical size/sound intensity to its instantaneous rate of change in optical size/sound intensity (τ) drives TTC judgments. Other evidence has shown that heuristic-based cues are used, including final optical size or final sound pressure level. Most previous studies have used decontextualized and unfamiliar stimuli (e.g., geometric shapes on a blank background). Here we evaluated TTC estimates by using a traffic scene with an approaching vehicle to evaluate the weights of visual and auditory TTC cues under more realistic conditions. Younger (18-39 years) and older (65+ years) participants made TTC estimates in three sensory conditions: visual-only, auditory-only, and audio-visual. Stimuli were presented within an immersive virtual-reality environment, and cue weights were calculated for both visual cues (e.g., visual τ, final optical size) and auditory cues (e.g., auditory τ, final sound pressure level). The results demonstrated the use of visual τ as well as heuristic cues in the visual-only condition. TTC estimates in the auditory-only condition, however, were primarily based on an auditory heuristic cue (final sound pressure level), rather than on auditory τ. In the audio-visual condition, the visual cues dominated overall, with the highest weight being assigned to visual τ by younger adults, and a more equal weighting of visual τ and heuristic cues in older adults. Overall, better characterizing the effects of combined sensory inputs, stimulus characteristics, and age on the cues used to estimate TTC will provide important insights into how these factors may affect everyday behavior.
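
    As a worked illustration of the τ variable discussed above, the sketch below computes visual τ as the ratio of the instantaneous optical angle to its rate of change for a simulated approaching object; the object width, approach speed, and sampling rate are arbitrary assumptions, not stimulus parameters from the study.

        # Illustrative sketch (assumed scene parameters): visual tau as
        # tau(t) = theta(t) / (d theta / dt), which approximates time to contact
        # for an object approaching at constant speed.
        import numpy as np

        dt = 1.0 / 60.0                             # 60 Hz sampling (assumed)
        t = np.arange(0.0, 2.0, dt)
        distance = 30.0 - 10.0 * t                  # starts 30 m away, approaches at 10 m/s
        theta = 2.0 * np.arctan(1.0 / distance)     # optical angle of a 2 m wide object (rad)

        dtheta = np.gradient(theta, dt)             # numerical rate of change (rad/s)
        tau = theta / dtheta                        # estimated time to contact (s)
        print(round(tau[0], 2), "s to contact at t = 0")   # ~3 s for this example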

  5. Seeing Your Error Alters My Pointing: Observing Systematic Pointing Errors Induces Sensori-Motor After-Effects

    PubMed Central

    Ronchi, Roberta; Revol, Patrice; Katayama, Masahiro; Rossetti, Yves; Farnè, Alessandro

    2011-01-01

    During the procedure of prism adaptation, subjects execute pointing movements to visual targets under a lateral optical displacement: As a consequence of the discrepancy between visual and proprioceptive inputs, their visuo-motor activity is characterized by pointing errors. The perception of such final errors triggers error-correction processes that eventually result in sensori-motor compensation, opposite to the prismatic displacement (i.e., after-effects). Here we tested whether the mere observation of erroneous pointing movements, similar to those executed during prism adaptation, is sufficient to produce adaptation-like after-effects. Neurotypical participants observed, from a first-person perspective, the examiner's arm making incorrect pointing movements that systematically overshot the visual target location to the right, thus simulating a rightward optical deviation. Three classical after-effect measures (proprioceptive, visual and visual-proprioceptive shift) were recorded before and after first-person perspective observation of pointing errors. Results showed that mere visual exposure to an arm that systematically points to the right side of a target (i.e., without error correction) produces a leftward after-effect, which mostly affects the observer's proprioceptive estimation of her body midline. In addition, being exposed to such a constant visual error induced in the observer the illusion of “feeling” the seen movement. These findings indicate that it is possible to elicit sensori-motor after-effects by mere observation of movement errors. PMID:21731649

  6. Visual Reliance for Balance Control in Older Adults Persists When Visual Information Is Disrupted by Artificial Feedback Delays

    PubMed Central

    Balasubramaniam, Ramesh

    2014-01-01

    Sensory information from our eyes, skin and muscles helps guide and correct balance. Less appreciated, however, is that delays in the transmission of sensory information between our eyes, limbs and central nervous system can exceed several tens of milliseconds. Investigating how these time-delayed sensory signals influence balance control is central to understanding the postural system. Here, we investigate how delayed visual feedback and cognitive performance influence postural control in healthy young and older adults. The task required that participants position their center of pressure (COP) in a fixed target as accurately as possible without visual feedback about their COP location (eyes-open balance), or with artificial time delays imposed on visual COP feedback. On selected trials, the participants also performed a silent arithmetic task (cognitive dual task). We separated COP time series into distinct frequency components using low- and high-pass filtering routines. Visual feedback delays affected low-frequency postural corrections in young and older adults, with larger increases in postural sway noted for the group of older adults. In comparison, cognitive performance reduced the variability of rapid center of pressure displacements in young adults, but did not alter postural sway in the group of older adults. Our results demonstrate that older adults prioritize vision to control posture. This visual reliance persists even when feedback about the task is delayed by several hundreds of milliseconds. PMID:24614576
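
    The frequency separation described above can be sketched as follows; the sampling rate, filter order, and cutoff frequency are assumptions chosen for illustration and may differ from the routines actually used in the study.

        # Minimal sketch (assumed parameters): split a centre-of-pressure (COP) trace
        # into slow postural corrections and rapid displacements with zero-phase
        # Butterworth low-/high-pass filters.
        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 100.0                                   # sampling rate in Hz (assumed)
        rng = np.random.default_rng(1)
        cop = np.cumsum(rng.standard_normal(int(30 * fs))) * 0.01   # placeholder 30 s COP trace (cm)

        cutoff = 0.5                                 # Hz, assumed boundary between components
        b_lo, a_lo = butter(4, cutoff / (fs / 2), btype="low")
        b_hi, a_hi = butter(4, cutoff / (fs / 2), btype="high")

        cop_slow = filtfilt(b_lo, a_lo, cop)         # low-frequency postural corrections
        cop_fast = filtfilt(b_hi, a_hi, cop)         # rapid COP displacements
        print(np.std(cop_slow), np.std(cop_fast))    # variability of each component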

  7. System of Attitudes in Parents of Young People Having Sensory Disorders

    ERIC Educational Resources Information Center

    Posokhova, Svetlana; Konovalova, Natalia; Sorokin, Victor; Demyanov, Yuri; Kolosova, Tatyana; Didenko, Elena

    2016-01-01

    The objective of the research was to identify the system of attitudes in parents of young people having sensory disorders. The survey covered parents of children aged 17 and older having hearing disorders, visual disorders, and no sensory disorders. The parents' system of attitudes united the attitude of the parents to themselves, to the child and…

  8. [Sensory loss and brain reorganization].

    PubMed

    Fortin, Madeleine; Voss, Patrice; Lassonde, Maryse; Lepore, Franco

    2007-11-01

    It is without a doubt that humans are first and foremost visual beings. Even though the other sensory modalities provide us with valuable information, it is vision that generally offers the most reliable and detailed information concerning our immediate surroundings. It is therefore not surprising that nearly a third of the human brain processes, in one way or another, visual information. But what happens when the visual information no longer reaches these brain regions responsible for processing it? Indeed, numerous medical conditions such as congenital glaucoma, retinitis pigmentosa and retinal detachment, to name a few, can disrupt the visual system and lead to blindness. So, do the brain areas responsible for processing visual stimuli simply shut down and become non-functional? Do they become dead weight and simply stop contributing to cognitive and sensory processes? Current data suggest that this is not the case. Quite the contrary, it would seem that congenitally blind individuals benefit from the recruitment of these areas by other sensory modalities to carry out non-visual tasks. In fact, our laboratory has been studying blindness and its consequences on both the brain and behaviour for many years now. We have shown that blind individuals demonstrate exceptional hearing abilities. This finding holds true for stimuli originating from both near and far space. It also holds true, under certain circumstances, for those who lost their sight later in life, beyond a period generally believed to limit the brain changes following the loss of sight. In the case of the early blind, we have shown that their ability to localize sounds is strongly correlated with activity in the occipital cortex (the location of visual processing), demonstrating that these areas are functionally engaged by the task. Therefore, it would seem that the plastic nature of the human brain allows them to make new use of the cerebral areas normally dedicated to visual processing.

  9. Cross-sensory reference frame transfer in spatial memory: the case of proprioceptive learning.

    PubMed

    Avraamides, Marios N; Sarrou, Mikaella; Kelly, Jonathan W

    2014-04-01

    In three experiments, we investigated whether the information available to visual perception prior to encoding the locations of objects in a path through proprioception would influence the reference direction from which the spatial memory was formed. Participants walked a path whose orientation was misaligned to the walls of the enclosing room and to the square sheet that covered the path prior to learning (Exp. 1) and, in addition, to the intrinsic structure of a layout studied visually prior to walking the path and to the orientation of stripes drawn on the floor (Exps. 2 and 3). Despite the availability of prior visual information, participants constructed spatial memories that were aligned with the canonical axes of the path, as opposed to the reference directions primed by visual experience. The results are discussed in the context of previous studies documenting transfer of reference frames within and across perceptual modalities.

  10. Regeneration of the Rhopalium and the Rhopalial Nervous System in the Box Jellyfish Tripedalia cystophora.

    PubMed

    Stamatis, Sebastian-Alexander; Worsaae, Katrine; Garm, Anders

    2018-02-01

    Cubozoans have the most intricate visual apparatus within Cnidaria. It comprises four identical sensory structures, the rhopalia, each of which holds six eyes of four morphological types. Two of these eyes are camera-type eyes that are, in many ways, similar to the vertebrate eye. The visual input is used to control complex behaviors, such as navigation and obstacle avoidance, and is processed by an elaborate rhopalial nervous system. Several studies have examined the rhopalial nervous system, which, despite a radial symmetric body plan, is bilaterally symmetrical, connecting the two sides of the rhopalium through commissures in an extensive neuropil. The four rhopalia are interconnected by a nerve ring situated in the oral margin of the bell, and together these structures constitute the cubozoan central nervous system. Cnidarians have excellent regenerative capabilities, enabling most species to regenerate large body areas or body parts, and some species can regenerate completely from just a few hundred cells. Here we test whether cubozoans are capable of regenerating the rhopalia, despite the complexity of the visual system and the rhopalial nervous system. The results show that the rhopalia are readily regrown after amputation and have developed most, if not all, neural elements within two weeks. Using electrophysiology, we investigated the functionality of the regrown rhopalia and found that they generated pacemaker signals and that the lens eyes showed a normal response to light. Our findings substantiate the remarkable regenerative ability of Cnidaria and establish the complex sensory system of Cubozoa as a model system highly applicable to studies of neurogenesis.

  11. Filling gaps in visual motion for target capture

    PubMed Central

    Bosco, Gianfranco; Delle Monache, Sergio; Gravano, Silvio; Indovina, Iole; La Scaleia, Barbara; Maffei, Vincenzo; Zago, Myrka; Lacquaniti, Francesco

    2015-01-01

    A remarkable challenge our brain must face constantly when interacting with the environment is represented by ambiguous and, at times, even missing sensory information. This is particularly compelling for visual information, being the main sensory system we rely upon to gather cues about the external world. It is not uncommon, for example, that objects catching our attention may disappear temporarily from view, occluded by visual obstacles in the foreground. Nevertheless, we are often able to keep our gaze on them throughout the occlusion or even catch them on the fly in the face of the transient lack of visual motion information. This implies that the brain can fill the gaps of missing sensory information by extrapolating the object motion through the occlusion. In recent years, much experimental evidence has been accumulated that both perceptual and motor processes exploit visual motion extrapolation mechanisms. Moreover, neurophysiological and neuroimaging studies have identified brain regions potentially involved in the predictive representation of the occluded target motion. Within this framework, ocular pursuit and manual interceptive behavior have proven to be useful experimental models for investigating visual extrapolation mechanisms. Studies in these fields have pointed out that visual motion extrapolation processes depend on manifold information related to short-term memory representations of the target motion before the occlusion, as well as to longer term representations derived from previous experience with the environment. We will review recent oculomotor and manual interception literature to provide up-to-date views on the neurophysiological underpinnings of visual motion extrapolation. PMID:25755637

  12. Filling gaps in visual motion for target capture.

    PubMed

    Bosco, Gianfranco; Monache, Sergio Delle; Gravano, Silvio; Indovina, Iole; La Scaleia, Barbara; Maffei, Vincenzo; Zago, Myrka; Lacquaniti, Francesco

    2015-01-01

    A remarkable challenge our brain must face constantly when interacting with the environment is represented by ambiguous and, at times, even missing sensory information. This is particularly compelling for visual information, being the main sensory system we rely upon to gather cues about the external world. It is not uncommon, for example, that objects catching our attention may disappear temporarily from view, occluded by visual obstacles in the foreground. Nevertheless, we are often able to keep our gaze on them throughout the occlusion or even catch them on the fly in the face of the transient lack of visual motion information. This implies that the brain can fill the gaps of missing sensory information by extrapolating the object motion through the occlusion. In recent years, much experimental evidence has been accumulated that both perceptual and motor processes exploit visual motion extrapolation mechanisms. Moreover, neurophysiological and neuroimaging studies have identified brain regions potentially involved in the predictive representation of the occluded target motion. Within this framework, ocular pursuit and manual interceptive behavior have proven to be useful experimental models for investigating visual extrapolation mechanisms. Studies in these fields have pointed out that visual motion extrapolation processes depend on manifold information related to short-term memory representations of the target motion before the occlusion, as well as to longer term representations derived from previous experience with the environment. We will review recent oculomotor and manual interception literature to provide up-to-date views on the neurophysiological underpinnings of visual motion extrapolation.

  13. Negative BOLD in sensory cortices during verbal memory: a component in generating internal representations?

    PubMed

    Azulay, Haim; Striem, Ella; Amedi, Amir

    2009-05-01

    People tend to close their eyes when trying to retrieve an event or a visual image from memory. However, the brain mechanisms behind this phenomenon remain poorly understood. Recently, we showed that during visual mental imagery, auditory areas show a much more robust deactivation than during visual perception. Here we ask whether this is a special case of a more general phenomenon involving retrieval of intrinsic, internally stored information, which would result in crossmodal deactivations in other sensory cortices which are irrelevant to the task at hand. To test this hypothesis, a group of 9 sighted individuals were scanned while performing a memory retrieval task for highly abstract words (i.e., with low imaginability scores). We also scanned a group of 10 congenitally blind individuals, who by definition do not have any visual imagery per se. In sighted subjects, both auditory and visual areas were robustly deactivated during memory retrieval, whereas in the blind the auditory cortex was deactivated while visual areas, shown previously to be relevant for this task, presented a positive BOLD signal. These results suggest that deactivation may be most prominent in task-irrelevant sensory cortices whenever there is a need for retrieval or manipulation of internally stored representations. Thus, there is a task-dependent balance of activation and deactivation that might allow maximization of resources and filtering out of non-relevant information to enable allocation of attention to the required task. Furthermore, these results suggest that the balance between positive and negative BOLD might be crucial to our understanding of a large variety of intrinsic and extrinsic tasks including high-level cognitive functions, sensory processing and multisensory integration.

  14. Impaired Visual Motor Coordination in Obese Adults.

    PubMed

    Gaul, David; Mat, Arimin; O'Shea, Donal; Issartel, Johann

    2016-01-01

    Objective. To investigate whether obesity alters the sensory motor integration process and movement outcome during a visual rhythmic coordination task. Methods. 88 participants (44 obese and 44 matched control) sat on a chair equipped with a wrist pendulum oscillating in the sagittal plane. The task was to swing the pendulum in synchrony with a moving visual stimulus displayed on a screen. Results. Obese participants demonstrated significantly (p < 0.01) higher values for continuous relative phase (CRP), indicating a poorer level of coordination, increased movement variability (p < 0.05), and a larger amplitude (p < 0.05) than their healthy weight counterparts. Conclusion. These results highlight the existence of visual sensory integration deficiencies for obese participants. The obese group have greater difficulty in synchronizing their movement with a visual stimulus. Considering that visual motor coordination is an essential component of many activities of daily living, any impairment could significantly affect quality of life.
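
    Continuous relative phase is commonly computed from the instantaneous (Hilbert) phases of the two signals; the sketch below illustrates that general approach for a pendulum-movement trace and the driving visual stimulus, and is not necessarily the exact procedure used in this study.

        # Illustrative sketch (simulated signals): continuous relative phase (CRP)
        # between a wrist-pendulum movement and a visual stimulus, from the
        # Hilbert-transform phases of the centred signals.
        import numpy as np
        from scipy.signal import hilbert

        fs = 100.0
        t = np.arange(0.0, 20.0, 1.0 / fs)
        stimulus = np.sin(2 * np.pi * 1.0 * t)           # 1 Hz driving stimulus (placeholder)
        movement = np.sin(2 * np.pi * 1.0 * t - 0.4)     # movement lagging by 0.4 rad (placeholder)

        phase_stim = np.angle(hilbert(stimulus - stimulus.mean()))
        phase_move = np.angle(hilbert(movement - movement.mean()))
        crp = np.unwrap(phase_move - phase_stim)         # continuous relative phase (rad)

        print(np.degrees(crp).mean(), np.degrees(crp).std())   # mean lag and its variability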

  15. Causal evidence for retina dependent and independent visual motion computations in mouse cortex

    PubMed Central

    Hillier, Daniel; Fiscella, Michele; Drinnenberg, Antonia; Trenholm, Stuart; Rompani, Santiago B.; Raics, Zoltan; Katona, Gergely; Juettner, Josephine; Hierlemann, Andreas; Rozsa, Balazs; Roska, Botond

    2017-01-01

    How neuronal computations in the sensory periphery contribute to computations in the cortex is not well understood. We examined this question in the context of visual-motion processing in the retina and primary visual cortex (V1) of mice. We disrupted retinal direction selectivity – either exclusively along the horizontal axis using FRMD7 mutants or along all directions by ablating starburst amacrine cells – and monitored neuronal activity in layer 2/3 of V1 during stimulation with visual motion. In control mice, we found an overrepresentation of cortical cells preferring posterior visual motion, the dominant motion direction an animal experiences when it moves forward. In mice with disrupted retinal direction selectivity, the overrepresentation of posterior-motion-preferring cortical cells disappeared, and their response at higher stimulus speeds was reduced. This work reveals the existence of two functionally distinct, sensory-periphery-dependent and -independent computations of visual motion in the cortex. PMID:28530661

  16. Short-Term Memory for Space and Time Flexibly Recruit Complementary Sensory-Biased Frontal Lobe Attention Networks.

    PubMed

    Michalka, Samantha W; Kong, Lingqiang; Rosen, Maya L; Shinn-Cunningham, Barbara G; Somers, David C

    2015-08-19

    The frontal lobes control wide-ranging cognitive functions; however, functional subdivisions of human frontal cortex are only coarsely mapped. Here, functional magnetic resonance imaging reveals two distinct visual-biased attention regions in lateral frontal cortex, superior precentral sulcus (sPCS) and inferior precentral sulcus (iPCS), anatomically interdigitated with two auditory-biased attention regions, transverse gyrus intersecting precentral sulcus (tgPCS) and caudal inferior frontal sulcus (cIFS). Intrinsic functional connectivity analysis demonstrates that sPCS and iPCS fall within a broad visual-attention network, while tgPCS and cIFS fall within a broad auditory-attention network. Interestingly, we observe that spatial and temporal short-term memory (STM), respectively, recruit visual and auditory attention networks in the frontal lobe, independent of sensory modality. These findings not only demonstrate that both sensory modality and information domain influence frontal lobe functional organization, they also demonstrate that spatial processing co-localizes with visual processing and that temporal processing co-localizes with auditory processing in lateral frontal cortex. Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Dynamics of cortico-subcortical cross-modal operations involved in audio-visual object detection in humans.

    PubMed

    Fort, Alexandra; Delpuech, Claude; Pernier, Jacques; Giard, Marie-Hélène

    2002-10-01

    Very recently, a number of neuroimaging studies in humans have begun to investigate the question of how the brain integrates information from different sensory modalities to form unified percepts. Already, intermodal neural processing appears to depend on the modalities of inputs or the nature (speech/non-speech) of information to be combined. Yet, the variety of paradigms, stimuli and techniques used makes it difficult to understand the relationships between the factors operating at the perceptual level and the underlying physiological processes. In a previous experiment, we used event-related potentials to describe the spatio-temporal organization of audio-visual interactions during a bimodal object recognition task. Here we examined the network of cross-modal interactions involved in simple detection of the same objects. The objects were defined either by unimodal auditory or visual features alone, or by the combination of the two features. As expected, subjects detected bimodal stimuli more rapidly than either unimodal stimulus. Combined analysis of potentials, scalp current densities and dipole modeling revealed several interaction patterns within the first 200 ms post-stimulus: in occipito-parietal visual areas (45-85 ms), in deep brain structures, possibly the superior colliculus (105-140 ms), and in right temporo-frontal regions (170-185 ms). These interactions differed from those found during object identification in sensory-specific areas and possibly in the superior colliculus, indicating that the neural operations governing multisensory integration depend crucially on the nature of the perceptual processes involved.

  18. A Portable Sensory Augmentation Device for Balance Rehabilitation Using Fingertip Skin Stretch Feedback.

    PubMed

    Pan, Yi-Tsen; Yoon, Han U; Hur, P

    2017-01-01

    Neurological disorders are the leading causes of poor balance. Previous studies have shown that biofeedback can compensate for weak or missing sensory information in people with sensory deficits. These biofeedback inputs can be easily recognized and converted into proper information by the central nervous system (CNS), which integrates the appropriate sensorimotor information and stabilizes the human posture. In this study, we proposed a form of cutaneous feedback which stretches the fingertip pad with a rotational contactor, so-called skin stretch. Skin stretch at a fingertip pad can be simply perceived and its small contact area makes it favored for small wearable devices. Taking advantage of skin stretch feedback, we developed a portable sensory augmentation device (SAD) for rehabilitation of balance. SAD was designed to provide postural sway information through additional skin stretch feedback. To demonstrate the feasibility of the SAD, quiet standing on a force plate was evaluated while sensory deficits were simulated. Fifteen healthy young adults were asked to stand quietly under six sensory conditions: three levels of sensory deficits (normal, visual deficit, and visual + vestibular deficits) combined with and without augmented sensation provided by SAD. The results showed that augmented sensation via skin stretch feedback helped subjects correct their posture and balance, especially as the deficit level of sensory feedback increased. These findings demonstrate the potential use of skin stretch feedback in balance rehabilitation.
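
    One simple way to picture the feedback loop described above is a proportional mapping from postural sway to a contactor rotation command; the gain and saturation limit below are assumptions for illustration, not the SAD's actual control parameters.

        # Illustrative sketch (assumed gain/limits, not the device's controller):
        # convert anterior-posterior centre-of-pressure displacement into a rotation
        # command for the fingertip skin-stretch contactor.
        def stretch_command(cop_ap_cm, gain_deg_per_cm=10.0, max_deg=30.0):
            """Rotate the contactor proportionally to sway, saturating at +/- max_deg."""
            angle = gain_deg_per_cm * cop_ap_cm
            return max(-max_deg, min(max_deg, angle))

        for sway_cm in (-4.0, -0.5, 0.0, 1.2, 5.0):      # placeholder sway samples
            print(sway_cm, "cm ->", stretch_command(sway_cm), "deg")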

  19. How actions shape perception: learning action-outcome relations and predicting sensory outcomes promote audio-visual temporal binding

    PubMed Central

    Desantis, Andrea; Haggard, Patrick

    2016-01-01

    To maintain a temporally-unified representation of audio and visual features of objects in our environment, the brain recalibrates audio-visual simultaneity. This process allows adjustment for both differences in time of transmission and time for processing of audio and visual signals. In four experiments, we show that the cognitive processes for controlling instrumental actions also have strong influence on audio-visual recalibration. Participants learned that right and left hand button-presses each produced a specific audio-visual stimulus. Following one action the audio preceded the visual stimulus, while for the other action audio lagged vision. In a subsequent test phase, left and right button-press generated either the same audio-visual stimulus as learned initially, or the pair associated with the other action. We observed recalibration of simultaneity only for previously-learned audio-visual outcomes. Thus, learning an action-outcome relation promotes temporal grouping of the audio and visual events within the outcome pair, contributing to the creation of a temporally unified multisensory object. This suggests that learning action-outcome relations and the prediction of perceptual outcomes can provide an integrative temporal structure for our experiences of external events. PMID:27982063

  20. How actions shape perception: learning action-outcome relations and predicting sensory outcomes promote audio-visual temporal binding.

    PubMed

    Desantis, Andrea; Haggard, Patrick

    2016-12-16

    To maintain a temporally-unified representation of audio and visual features of objects in our environment, the brain recalibrates audio-visual simultaneity. This process allows adjustment for both differences in time of transmission and time for processing of audio and visual signals. In four experiments, we show that the cognitive processes for controlling instrumental actions also have strong influence on audio-visual recalibration. Participants learned that right and left hand button-presses each produced a specific audio-visual stimulus. Following one action the audio preceded the visual stimulus, while for the other action audio lagged vision. In a subsequent test phase, left and right button-press generated either the same audio-visual stimulus as learned initially, or the pair associated with the other action. We observed recalibration of simultaneity only for previously-learned audio-visual outcomes. Thus, learning an action-outcome relation promotes temporal grouping of the audio and visual events within the outcome pair, contributing to the creation of a temporally unified multisensory object. This suggests that learning action-outcome relations and the prediction of perceptual outcomes can provide an integrative temporal structure for our experiences of external events.

  1. Nascent body ego: metapsychological and neurophysiological aspects.

    PubMed

    Lehtonen, Johannes; Partanen, Juhani; Purhonen, Maija; Valkonen-Korhonen, Minna; Kononen, Mervi; Saarikoski, Seppo; Launiala, Kari

    2006-10-01

    For Freud, body ego was the organizing basis of the structural theory. He defined it as a psychic projection of the body surface. Isakower's and Lewin's classical findings suggest that the body surface experiences of nursing provide the infant with sensory-affective stimulation that initiates a projection of sensory processes towards the psychic realm. During nursing, somato-sensory, gustatory and olfactory modalities merge with a primitive somatic affect of satiation, whereas auditory modality is involved more indirectly and visual contact more gradually. Repeated regularly, such nascent experiences are likely to play a part in the organization of the primitive protosymbolic mental experience. In support of this hypothesis, the authors review findings from a neurophysiological study of infants before, during and after nursing. Nursing is associated with a significant amplitude change in the newborn electroencephalogram (EEG), which wanes before the age of 3 months, and is transformed at the age of 6 months into rhythmic 3-5 Hz hedonic theta-activity. Sucking requires active physiological work, which is shown in a regular rise in heart rate. The hypothesis of a sensory-affective organization of the nascent body ego, enhanced by nursing and active sucking, seems concordant with neurophysiological phenomena related to nursing.

  2. Image Mapping and Visual Attention on the Sensory Ego-Sphere

    NASA Technical Reports Server (NTRS)

    Fleming, Katherine Achim; Peters, Richard Alan, II

    2012-01-01

    The Sensory Ego-Sphere (SES) is a short-term memory for a robot in the form of an egocentric, tessellated, spherical, sensory-motor map of the robot's locale. Visual attention enables fast alignment of overlapping images without warping or position optimization, since an attentional point (AP) on the composite typically corresponds to one on each of the collocated regions in the images. Such alignment speeds analysis of the multiple images of the area. Compositing and attention were performed in two ways and compared: (1) APs were computed directly on the composite and not on the full-resolution images until the time of retrieval; and (2) the attentional operator was applied to all incoming imagery. It was found that although the second method was slower, it produced consistent and, thereby, more useful APs. The SES is an integral part of a control system that will enable a robot to learn new behaviors based on its previous experiences, and that will enable it to recombine its known behaviors in such a way as to solve related, but novel, task problems with apparent creativity. The approach is to combine sensory-motor data association and dimensionality reduction to learn navigation and manipulation tasks as sequences of basic behaviors that can be implemented with a small set of closed-loop controllers. Over time, the aggregate of behaviors and their transition probabilities form a stochastic network. Then, given a task, the robot finds a path in the network that leads from its current state to the goal. The SES provides a short-term memory for the cognitive functions of the robot, association of sensory and motor data via spatio-temporal coincidence, direction of the attention of the robot, navigation through spatial localization with respect to known or discovered landmarks, and structured data sharing between the robot and human team members, the individuals in multi-robot teams, or with a C3 center.

  3. Association of visual sensory function and higher order visual processing skills with incident driving cessation

    PubMed Central

    Huisingh, Carrie; McGwin, Gerald; Owsley, Cynthia

    2017-01-01

    Background Many studies on vision and driving cessation have relied on measures of sensory function, which are insensitive to the higher order cognitive aspects of visual processing. The purpose of this study was to examine the association of traditional measures of visual sensory function and higher order visual processing skills with incident driving cessation in a population-based sample of older drivers. Methods Two thousand licensed drivers aged ≥70 were enrolled and followed up for three years. Tests for central vision and visual processing were administered at baseline and included visual acuity, contrast sensitivity, sensitivity in the driving visual field, visual processing speed (Useful Field of View (UFOV) Subtest 2 and Trails B), and spatial ability measured by the Visual Closure Subtest of the Motor-free Visual Perception Test. Participants self-reported the month and year of driving cessation and provided a reason for cessation. Cox proportional hazards models were used to generate crude and adjusted hazard ratios with 95% confidence intervals between visual functioning characteristics and risk of driving cessation over a three-year period. Results During the study period, 164 participants stopped driving, which corresponds to a cumulative incidence of 8.5%. Impaired contrast sensitivity, visual fields, visual processing speed (UFOV and Trails B), and spatial ability were significant risk factors for subsequent driving cessation after adjusting for age, gender, marital status, number of medical conditions, and miles driven. Visual acuity impairment was not associated with driving cessation. Medical problems (63%), specifically musculoskeletal and neurological problems, as well as vision problems (17%), were cited most frequently as the reason for driving cessation. Conclusion Assessment of cognitive and visual functioning can provide useful information about subsequent risk of driving cessation among older drivers. In addition, a variety of factors, not just vision, influenced the decision to stop driving and may be amenable to intervention. PMID:27353969
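
    The survival analysis described above can be sketched with a Cox proportional hazards fit; the example below uses the lifelines package on a small, entirely hypothetical data frame, so the column names and values are placeholders rather than the study's data.

        # Minimal sketch (hypothetical data): Cox proportional hazards model for time
        # to driving cessation as a function of baseline measures, using lifelines.
        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.DataFrame({
            "months_followed": [36, 14, 36, 22, 29, 36, 8, 36],   # follow-up time (placeholder)
            "stopped_driving": [0, 1, 0, 1, 1, 0, 1, 0],          # 1 = cessation observed
            "contrast_sensitivity": [1.7, 1.3, 1.8, 1.5, 1.2, 1.6, 1.4, 1.3],
            "age": [72, 81, 74, 77, 85, 76, 79, 83],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="months_followed", event_col="stopped_driving")
        cph.print_summary()   # hazard ratios = exp(coef), with 95% confidence intervals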

  4. System and method for image mapping and visual attention

    NASA Technical Reports Server (NTRS)

    Peters, II, Richard A. (Inventor)

    2010-01-01

    A method is described for mapping dense sensory data to a Sensory Ego Sphere (SES). Methods are also described for finding and ranking areas of interest in the images that form a complete visual scene on an SES. Further, attentional processing of image data is best done by performing attentional processing on individual full-size images from the image sequence, mapping each attentional location to the nearest node, and then summing attentional locations at each node.
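
    The nearest-node mapping and per-node summing described above can be sketched as follows; the node set (a 12-vertex icosahedron) and the attentional points are hypothetical stand-ins, since an actual SES uses a much finer tessellation.

        # Illustrative sketch (hypothetical geometry, not the patented implementation):
        # map attentional points, given as direction vectors, to the nearest node of a
        # tessellated sphere and sum the attention weights at each node.
        import numpy as np

        def unit(v):
            v = np.asarray(v, dtype=float)
            return v / np.linalg.norm(v, axis=-1, keepdims=True)

        # Placeholder node set: the 12 vertices of an icosahedron.
        phi = (1 + 5 ** 0.5) / 2
        nodes = unit([(0, 1, phi), (0, -1, phi), (0, 1, -phi), (0, -1, -phi),
                      (1, phi, 0), (-1, phi, 0), (1, -phi, 0), (-1, -phi, 0),
                      (phi, 0, 1), (phi, 0, -1), (-phi, 0, 1), (-phi, 0, -1)])

        def accumulate(points, weights, nodes):
            """Assign each attentional point to its nearest node (largest dot product
            on the unit sphere) and sum the weights per node."""
            totals = np.zeros(len(nodes))
            for p, w in zip(unit(points), weights):
                totals[np.argmax(nodes @ p)] += w
            return totals

        aps = [(0.1, 0.9, 0.4), (0.0, 1.0, 0.5), (-0.8, 0.1, 0.2)]   # placeholder APs
        print(accumulate(aps, weights=[1.0, 0.5, 2.0], nodes=nodes))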

  5. System and method for image mapping and visual attention

    NASA Technical Reports Server (NTRS)

    Peters, II, Richard A. (Inventor)

    2011-01-01

    A method is described for mapping dense sensory data to a Sensory Ego Sphere (SES). Methods are also described for finding and ranking areas of interest in the images that form a complete visual scene on an SES. Further, attentional processing of image data is best done by performing attentional processing on individual full-size images from the image sequence, mapping each attentional location to the nearest node, and then summing all attentional locations at each node.

  6. Visual noise disrupts conceptual integration in reading.

    PubMed

    Gao, Xuefei; Stine-Morrow, Elizabeth A L; Noh, Soo Rim; Eskew, Rhea T

    2011-02-01

    The Effortfulness Hypothesis suggests that sensory impairment (either simulated or age-related) may decrease capacity for semantic integration in language comprehension. We directly tested this hypothesis by measuring resource allocation to different levels of processing during reading (i.e., word vs. semantic analysis). College students read three sets of passages word-by-word, one at each of three levels of dynamic visual noise. There was a reliable interaction between processing level and noise, such that visual noise increased resources allocated to word-level processing, at the cost of attention paid to semantic analysis. Recall of the most important ideas also decreased with increasing visual noise. Results suggest that sensory challenge can impair higher-level cognitive functions in learning from text, supporting the Effortfulness Hypothesis.

  7. Cross-Modal Correspondences Enhance Performance on a Colour-to-Sound Sensory Substitution Device.

    PubMed

    Hamilton-Fletcher, Giles; Wright, Thomas D; Ward, Jamie

    Visual sensory substitution devices (SSDs) can represent visual characteristics through distinct patterns of sound, allowing a visually impaired user access to visual information. Previous SSDs have avoided colour and when they do encode colour, have assigned sounds to colour in a largely unprincipled way. This study introduces a new tablet-based SSD termed the ‘Creole’ (so called because it combines tactile scanning with image sonification) and a new algorithm for converting colour to sound that is based on established cross-modal correspondences (intuitive mappings between different sensory dimensions). To test the utility of correspondences, we examined the colour–sound associative memory and object recognition abilities of sighted users who had their device either coded in line with or opposite to sound–colour correspondences. Improved colour memory and reduced colour-errors were made by users who had the correspondence-based mappings. Interestingly, the colour–sound mappings that provided the highest improvements during the associative memory task also saw the greatest gains for recognising realistic objects that also featured these colours, indicating a transfer of abilities from memory to recognition. These users were also marginally better at matching sounds to images varying in luminance, even though luminance was coded identically across the different versions of the device. These findings are discussed with relevance for both colour and correspondences for sensory substitution use.
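
    An illustrative correspondence-based mapping in the spirit described above might tie luminance to pitch (brighter is higher) and saturation to loudness (more saturated is louder); the specific formulas and ranges below are assumptions for illustration, not the Creole's actual colour-to-sound algorithm.

        # Illustrative sketch only (assumed mappings): a correspondence-based
        # colour-to-sound rule in which luminance drives pitch and saturation
        # drives loudness.
        import colorsys

        def colour_to_sound(r, g, b):
            """Map an RGB colour (components in 0-1) to a (frequency_hz, amplitude) pair."""
            h, l, s = colorsys.rgb_to_hls(r, g, b)
            freq_hz = 200.0 * (2.0 ** (3.0 * l))   # 200-1600 Hz across the luminance range
            amplitude = 0.2 + 0.8 * s              # quieter for desaturated colours
            return freq_hz, amplitude

        print(colour_to_sound(1.0, 1.0, 0.0))      # saturated yellow -> mid-high pitch, loud
        print(colour_to_sound(0.3, 0.3, 0.3))      # dark grey -> lower pitch, quiet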

  8. Mental Imagery Induces Cross-Modal Sensory Plasticity and Changes Future Auditory Perception.

    PubMed

    Berger, Christopher C; Ehrsson, H Henrik

    2018-04-01

    Can what we imagine in our minds change how we perceive the world in the future? A continuous process of multisensory integration and recalibration is responsible for maintaining a correspondence between the senses (e.g., vision, touch, audition) and, ultimately, a stable and coherent perception of our environment. This process depends on the plasticity of our sensory systems. The so-called ventriloquism aftereffect-a shift in the perceived localization of sounds presented alone after repeated exposure to spatially mismatched auditory and visual stimuli-is a clear example of this type of plasticity in the audiovisual domain. In a series of six studies with 24 participants each, we investigated an imagery-induced ventriloquism aftereffect in which imagining a visual stimulus elicits the same frequency-specific auditory aftereffect as actually seeing one. These results demonstrate that mental imagery can recalibrate the senses and induce the same cross-modal sensory plasticity as real sensory stimuli.

  9. Learning effects of dynamic postural control by auditory biofeedback versus visual biofeedback training.

    PubMed

    Hasegawa, Naoya; Takeda, Kenta; Sakuma, Moe; Mani, Hiroki; Maejima, Hiroshi; Asaka, Tadayoshi

    2017-10-01

    Augmented sensory biofeedback (BF) for postural control is widely used to improve postural stability. However, the effective sensory information in BF systems of motor learning for postural control is still unknown. The purpose of this study was to investigate the learning effects of visual versus auditory BF training in dynamic postural control. Eighteen healthy young adults were randomly divided into two groups (visual BF and auditory BF). In test sessions, participants were asked to bring the real-time center of pressure (COP) in line with a hidden target by body sway in the sagittal plane. The target moved in seven cycles of sine curves at 0.23 Hz in the vertical direction on a monitor. In training sessions, the visual and auditory BF groups were required to change the magnitude of a visual circle and a sound, respectively, according to the distance between the COP and target in order to reach the target. The perceptual magnitudes of visual and auditory BF were equalized according to Stevens' power law. At the retention test, the auditory but not visual BF group demonstrated decreased postural performance errors in both the spatial and temporal parameters under the no-feedback condition. These findings suggest that visual BF increases the dependence on visual information to control postural performance, while auditory BF may enhance the integration of the proprioceptive sensory system, which contributes to motor learning without BF. These results suggest that auditory BF training improves motor learning of dynamic postural control. Copyright © 2017 Elsevier B.V. All rights reserved.
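
    The equalization step mentioned above relies on Stevens' power law, in which perceived magnitude grows as psi = k * phi**n with modality-specific exponents; the sketch below matches an auditory intensity to a visual one under assumed constants, which are not the values calibrated in the study.

        # Minimal sketch (assumed constants/exponents, not the study's calibration):
        # pick the auditory intensity whose perceived magnitude, under Stevens' power
        # law psi = k * phi**n, equals that of a given visual feedback magnitude.
        def perceived(phi, k, n):
            return k * phi ** n

        def match_auditory_to_visual(phi_visual, k_v=1.0, n_v=0.7, k_a=1.0, n_a=0.6):
            """Solve k_a * phi_a**n_a = k_v * phi_visual**n_v for phi_a."""
            psi = perceived(phi_visual, k_v, n_v)
            return (psi / k_a) ** (1.0 / n_a)

        print(match_auditory_to_visual(4.0))   # auditory intensity matching visual phi = 4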

  10. Sensory Substitution and Multimodal Mental Imagery.

    PubMed

    Nanay, Bence

    2017-09-01

    Many philosophers use findings about sensory substitution devices in the grand debate about how we should individuate the senses. The big question is this: Is "vision" assisted by (tactile) sensory substitution really vision? Or is it tactile perception? Or some sui generis novel form of perception? My claim is that sensory substitution assisted "vision" is neither vision nor tactile perception, because it is not perception at all. It is mental imagery: visual mental imagery triggered by tactile sensory stimulation. But it is a special form of mental imagery that is triggered by corresponding sensory stimulation in a different sense modality, which I call "multimodal mental imagery."

  11. Sensory and motoric influences on attention dynamics during standing balance recovery in young and older adults.

    PubMed

    Redfern, Mark S; Chambers, April J; Jennings, J Richard; Furman, Joseph M

    2017-08-01

    This study investigated the impact of attention on the sensory and motor actions during postural recovery from underfoot perturbations in young and older adults. A dual-task paradigm was used involving disjunctive and choice reaction time (RT) tasks to auditory and visual stimuli at different delays from the onset of two types of platform perturbations (rotations and translations). The RTs were increased prior to the perturbation (preparation phase) and during the immediate recovery response (response initiation) in young and older adults, but this interference dissipated rapidly after the perturbation response was initiated (<220 ms). The sensory modality of the RT task impacted the results with interference being greater for the auditory task compared to the visual task. As motor complexity of the RT task increased (disjunctive versus choice) there was greater interference from the perturbation. Finally, increasing the complexity of the postural perturbation by mixing the rotational and translational perturbations together increased interference for the auditory RT tasks, but did not affect the visual RT responses. These results suggest that sensory and motoric components of postural control are under the influence of different dynamic attentional processes.

  12. Association Between Sensory Impairment and Dementia in Older Adults: Evidence from China.

    PubMed

    Luo, Yanan; He, Ping; Guo, Chao; Chen, Gong; Li, Ning; Zheng, Xiaoying

    2018-03-01

    To determine the association between sensory impairment and dementia in Chinese older adults. Cross-sectional. Older adults in 31 provinces of China. Individuals aged 65 and older (N = 250,752). Psychiatrists ascertained dementia based on the International Classification of Diseases, 10th Revision. Sensory impairment was measured as only hearing impairment, only vision impairment, and combined sensory impairment (combined hearing and vision impairment). Hearing impairment was defined as greater than 40 dB loss in the better ear according to the World Health Organization (WHO) Prevention of Deafness and Hearing Impairment (PDH) standard 97.3. Ophthalmologists assessed vision impairment according to the WHO best-corrected visual acuity (BCVA) criteria (low vision: 0.05 ≤ BCVA ≤ 0.29; blindness: no light perception ≤ BCVA < 0.05 or visual field less than 10 degrees; assessed in the better-seeing eye). The prevalence of dementia was 0.41% (95% CI = 0.39-0.44%) without sensory impairment, 0.83% (95% CI = 0.70-0.99%) with only visual impairment, 0.61% (95% CI = 0.53-0.71%) with only hearing impairment, and 1.27% (95% CI = 1.00-1.61%) with combined sensory impairments. After adjusting for sociodemographic characteristics, vision impairment (odds ratio (OR) = 1.58, 95% CI = 1.28-1.96) and combined sensory impairments (OR = 1.64, 95% CI = 1.23-2.20) were associated with greater risk of severe to extremely severe dementia. Hearing impairment was not significantly associated with dementia. Sensory impairments are associated with greater risk of dementia in Chinese older adults. Studies are needed to further explore the pathway of this association in Chinese elderly adults and to provide suggestions to improve health status for this population. © 2018, Copyright the Authors. Journal compilation © 2018, The American Geriatrics Society.
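
    Adjusted odds ratios of the kind reported above are typically obtained from a logistic regression with a categorical impairment term and sociodemographic covariates; the sketch below shows that general approach on simulated data, so the variable names and values are hypothetical rather than the survey data analysed here.

        # Illustrative sketch (simulated data): adjusted odds ratios for dementia by
        # sensory-impairment category from a logistic regression with statsmodels.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(2)
        n = 2000
        df = pd.DataFrame({
            "dementia": rng.binomial(1, 0.01, n),
            "impairment": rng.choice(["none", "vision", "hearing", "both"], n),
            "age": rng.integers(65, 100, n),
            "female": rng.binomial(1, 0.5, n),
        })

        model = smf.logit("dementia ~ C(impairment, Treatment('none')) + age + female",
                          data=df).fit()
        print(np.exp(model.params))      # odds ratios relative to no impairment
        print(np.exp(model.conf_int()))  # 95% confidence intervals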

  13. Does manipulating the speed of visual flow in virtual reality change distance estimation while walking in Parkinson's disease?

    PubMed

    Ehgoetz Martens, Kaylena A; Ellard, Colin G; Almeida, Quincy J

    2015-03-01

    Although dopaminergic replacement therapy is believed to improve sensory processing in PD and delayed perceptual speed is thought to be caused by a predominantly cholinergic deficit, it is unclear whether sensory-perceptual deficits are a result of corrupt sensory processing, or a delay in updating perceived feedback during movement. The current study aimed to examine these two hypotheses by manipulating visual flow speed and dopaminergic medication to determine which influenced distance estimation in PD. Fourteen PD and sixteen HC participants were instructed to estimate the distance of a remembered target by walking to the position the target formerly occupied. This task was completed in virtual reality in order to manipulate the visual flow (VF) speed in real time. Three conditions were carried out: (1) BASELINE: VF speed was equal to participants' real-time movement speed; (2) SLOW: VF speed was reduced by 50 %; (3) FAST: VF speed was increased by 30 %. Individuals with PD performed the experiment in their ON and OFF state. PD demonstrated significantly greater judgement error during BASELINE and FAST conditions compared to HC, although PD did not improve their judgement error during the SLOW condition. Additionally, PD had greater variable error during baseline compared to HC; however, during the SLOW conditions, PD had significantly less variable error compared to baseline and similar variable error to HC participants. Overall, dopaminergic medication did not significantly influence judgement error. Therefore, these results suggest that corrupt processing of sensory information is the main contributor to sensory-perceptual deficits during movement in PD rather than delayed updating of sensory feedback.

  14. Medial temporal lobe-dependent repetition suppression and enhancement due to implicit vs. explicit processing of individual repeated search displays

    PubMed Central

    Geyer, Thomas; Baumgartner, Florian; Müller, Hermann J.; Pollmann, Stefan

    2012-01-01

    Using visual search, functional magnetic resonance imaging (fMRI) and patient studies have demonstrated that medial temporal lobe (MTL) structures differentiate repeated from novel displays—even when observers are unaware of display repetitions. This suggests a role for MTL in both explicit and, importantly, implicit learning of repeated sensory information (Greene et al., 2007). However, recent behavioral studies suggest, by examining visual search and recognition performance concurrently, that observers have explicit knowledge of at least some of the repeated displays (Geyer et al., 2010). The aim of the present fMRI study was thus to contribute new evidence regarding the contribution of MTL structures to explicit vs. implicit learning in visual search. It was found that MTL activation was increased for explicit and, respectively, decreased for implicit relative to baseline displays. These activation differences were most pronounced in left anterior parahippocampal cortex (aPHC), especially when observers were highly trained on the repeated displays. The data are taken to suggest that explicit and implicit memory processes are linked within MTL structures, but expressed via functionally separable mechanisms (repetition-enhancement vs. -suppression). They further show that repetition effects in visual search would have to be investigated at the display level. PMID:23060776

  15. Cross-sectional and longitudinal relationship between neuroticism and cognitive ability in advanced old age: the moderating role of severe sensory impairment.

    PubMed

    Wettstein, Markus; Kuźma, Elżbieta; Wahl, Hans-Werner; Heyl, Vera

    2016-09-01

    Gaining a comprehensive picture of the network of constructs in which cognitive functioning is embedded is crucial across the full lifespan. With respect to personality, previous findings support a relationship between neuroticism and cognitive abilities. However, findings regarding old age are inconsistent. In particular, little is known about potentially moderating variables which might explain some of the inconsistency. Our aim was to examine the moderating effect of severe sensory impairment on cross-sectional and longitudinal associations between neuroticism and cognitive functioning. The study sample consisted of 121 visually impaired (VI), 116 hearing impaired (HI), and 150 sensory unimpaired older adults (UI). Mean age was 82.50 years (SD = 4.71 years). Neuroticism was assessed by the NEO Five Factor Inventory, and multiple established tests were used for the assessment of cognitive performance (e.g., subtests of the revised Wechsler Adult Intelligence Scale). Bivariate correlations and multi-group structural equation models indicated stronger relationships between cognitive abilities and neuroticism in both sensory impaired groups (VI and HI) compared to UI older individuals. This relationship was attenuated but still significant in both sensory impaired groups when controlling for age, education and health (number of chronic conditions). In cross-lagged panel models, higher baseline neuroticism was significantly associated with lower cognitive performance four years later in VI and HI individuals. Our results suggest that sensory impairment moderates both cross-sectional and longitudinal associations between neuroticism and cognitive function in advanced old age.

  16. Virtual Reality: Visualization in Three Dimensions.

    ERIC Educational Resources Information Center

    McLellan, Hilary

    Virtual reality is a newly emerging tool for scientific visualization that makes possible multisensory, three-dimensional modeling of scientific data. While the emphasis is on visualization, the other senses are added to enhance what the scientist can visualize. Researchers are working to extend the sensory range of what can be perceived in…

  17. Functional connectivity of visual cortex in the blind follows retinotopic organization principles

    PubMed Central

    Ovadia-Caro, Smadar; Caramazza, Alfonso; Margulies, Daniel S.; Villringer, Arno

    2015-01-01

    Is visual input during critical periods of development crucial for the emergence of the fundamental topographical mapping of the visual cortex? And would this structure be retained throughout life-long blindness or would it fade as a result of plastic, use-based reorganization? We used functional connectivity magnetic resonance imaging based on intrinsic blood oxygen level-dependent fluctuations to investigate whether significant traces of topographical mapping of the visual scene in the form of retinotopic organization, could be found in congenitally blind adults. A group of 11 fully and congenitally blind subjects and 18 sighted controls were studied. The blind demonstrated an intact functional connectivity network structural organization of the three main retinotopic mapping axes: eccentricity (centre-periphery), laterality (left-right), and elevation (upper-lower) throughout the retinotopic cortex extending to high-level ventral and dorsal streams, including characteristic eccentricity biases in face- and house-selective areas. Functional connectivity-based topographic organization in the visual cortex was indistinguishable from the normally sighted retinotopic functional connectivity structure as indicated by clustering analysis, and was found even in participants who did not have a typical retinal development in utero (microphthalmics). While the internal structural organization of the visual cortex was strikingly similar, the blind exhibited profound differences in functional connectivity to other (non-visual) brain regions as compared to the sighted, which were specific to portions of V1. Central V1 was more connected to language areas but peripheral V1 to spatial attention and control networks. These findings suggest that current accounts of critical periods and experience-dependent development should be revisited even for primary sensory areas, in that the connectivity basis for visual cortex large-scale topographical organization can develop without any visual experience and be retained through life-long experience-dependent plasticity. Furthermore, retinotopic divisions of labour, such as that between the visual cortex regions normally representing the fovea and periphery, also form the basis for topographically-unique plastic changes in the blind. PMID:25869851

  18. Enhanced audio-visual interactions in the auditory cortex of elderly cochlear-implant users.

    PubMed

    Schierholz, Irina; Finke, Mareike; Schulte, Svenja; Hauthal, Nadine; Kantzke, Christoph; Rach, Stefan; Büchner, Andreas; Dengler, Reinhard; Sandmann, Pascale

    2015-10-01

    Auditory deprivation and the restoration of hearing via a cochlear implant (CI) can induce functional plasticity in auditory cortical areas. How these plastic changes affect the ability to integrate combined auditory (A) and visual (V) information is not yet well understood. In the present study, we used electroencephalography (EEG) to examine whether age, temporary deafness and altered sensory experience with a CI can affect audio-visual (AV) interactions in post-lingually deafened CI users. Young and elderly CI users and age-matched normal-hearing (NH) listeners performed a speeded response task on basic auditory, visual and audio-visual stimuli. Regarding the behavioral results, a redundant signals effect, that is, faster response times to cross-modal (AV) stimuli than to either of the two modality-specific stimuli (A, V), was revealed for all groups of participants. Moreover, in all four groups, we found evidence for audio-visual integration. Regarding event-related potentials (ERPs), we observed a more pronounced visual modulation of the cortical auditory response at N1 latency (approximately 100 ms after stimulus onset) in the elderly CI users when compared with young CI users and elderly NH listeners. Thus, elderly CI users showed enhanced audio-visual binding, which may be a consequence of compensatory strategies developed due to temporary deafness and/or degraded sensory input after implantation. These results indicate that the combination of aging, sensory deprivation and CI facilitates the coupling between the auditory and the visual modality. We suggest that this enhancement in multisensory interactions could be used to optimize auditory rehabilitation, especially in elderly CI users, by the application of strong audio-visually based rehabilitation strategies after implant switch-on. Copyright © 2015 Elsevier B.V. All rights reserved.
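
    The redundant signals effect mentioned here is conventionally tested against Miller's race-model inequality, P(RT_AV ≤ t) ≤ P(RT_A ≤ t) + P(RT_V ≤ t); the abstract does not state which test the authors used, so the following is a generic, hedged sketch on simulated reaction times rather than the study's analysis.

```python
# Hedged sketch of a race-model inequality check for a redundant signals
# effect. All reaction-time data below are simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
rt_a = rng.normal(420, 60, 300)    # auditory-only reaction times (ms)
rt_v = rng.normal(440, 60, 300)    # visual-only reaction times (ms)
rt_av = rng.normal(380, 55, 300)   # audio-visual reaction times (ms)

def ecdf(sample, t):
    """Empirical cumulative distribution function evaluated at times t."""
    return np.searchsorted(np.sort(sample), t, side="right") / len(sample)

t_grid = np.linspace(200, 700, 101)
violation = ecdf(rt_av, t_grid) - (ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid))

# Positive values at some t indicate responses faster than any race of
# independent unisensory processes predicts, i.e. evidence of integration.
print(f"maximum race-model violation: {violation.max():.3f}")
```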

  19. [Sound improves distinction of low intensities of light in the visual cortex of a rabbit].

    PubMed

    Polianskiĭ, V B; Alymkulov, D E; Evtikhin, D V; Chernyshev, B V

    2011-01-01

    Electrodes were implanted into the cranium above the primary visual cortex of four rabbits (Oryctolagus cuniculus). At the first stage, visual evoked potentials (VEPs) were recorded in response to substitution of threshold visual stimuli (0.28 and 0.31 cd/m2). Then a sound (2000 Hz, 84 dB, duration 40 ms) was added simultaneously to every visual stimulus. Sounds alone (without visual stimuli) did not produce a VEP response. The amplitude of the VEP component N1 (85-110 ms) in response to complex (visual plus sound) stimuli increased 1.6-fold compared with "simple" visual stimulation. At the second stage, paired substitutions of 8 different visual stimuli (range 0.38-20.2 cd/m2) by each other were performed. Sensory spaces of intensity were reconstructed on the basis of factor analysis. Sensory spaces of the complexes were reconstructed in a similar way for simultaneous visual and sound stimulation. Comparison of the vectors representing the stimuli in these spaces showed that the addition of a sound led to a 1.4-fold expansion of the space occupied by the smaller intensities (0.28; 1.02; 3.05; 6.35 cd/m2). The addition of the sound also led to an arrangement of the intensities in ascending order. At the same time, the sound narrowed the space of the larger intensities (8.48; 13.7; 16.8; 20.2 cd/m2) 1.33-fold. It is suggested that the addition of a sound improves the discrimination of smaller intensities and impairs the discrimination of larger intensities. The sensory spaces revealed by the complex stimuli were two-dimensional, which may be a consequence of the integration of sound and light into a unified complex during simultaneous stimulation.
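
    The analysis pipeline described here, responses to pairwise stimulus substitutions treated as dissimilarities and embedded in a low-dimensional "sensory space", can be sketched generically. In the sketch below the dissimilarity values are simulated and metric multidimensional scaling stands in for the authors' factor analysis; only the overall pipeline is taken from the abstract.

```python
# Hedged sketch: embedding stimuli in a low-dimensional sensory space from
# pairwise substitution responses. Dissimilarities are simulated, and MDS is
# used in place of the authors' factor analysis.
import numpy as np
from sklearn.manifold import MDS

intensities = np.array([0.38, 1.02, 3.05, 6.35, 8.48, 13.7, 16.8, 20.2])  # cd/m^2

# Simulated stand-in for N1 amplitudes: larger response for larger intensity jumps.
rng = np.random.default_rng(1)
diss = np.abs(np.log(intensities[:, None]) - np.log(intensities[None, :]))
diss = np.abs(diss + rng.normal(0, 0.05, diss.shape))
diss = (diss + diss.T) / 2          # symmetrise
np.fill_diagonal(diss, 0.0)

embedding = MDS(n_components=2, dissimilarity="precomputed",
                random_state=0).fit_transform(diss)
for lum, (x, y) in zip(intensities, embedding):
    print(f"{lum:5.2f} cd/m^2 -> ({x:+.2f}, {y:+.2f})")
```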

  20. Prevalence and correlates of hearing and visual impairments in European nursing homes: results from the SHELTER study.

    PubMed

    Yamada, Yukari; Vlachova, Martina; Richter, Tomas; Finne-Soveri, Harriet; Gindin, Jacob; van der Roest, Henriëtte; Denkinger, Michael D; Bernabei, Roberto; Onder, Graziano; Topinkova, Eva

    2014-10-01

    Visual and hearing impairments are known to be related to functional disability, cognitive impairment, and depression in community-dwelling older people. The aim of this study was to examine the prevalence of sensory impairment in nursing home residents, and whether sensory impairment is related to other common clinical problems in nursing homes, mediated by functional disability, cognitive impairment, and depressive symptoms. Cross-sectional data of 4007 nursing home residents in 59 facilities in 8 countries from the SHELTER study were analyzed. Visual and hearing impairments were assessed by trained staff using the interRAI instrument for Long-Term Care Facilities. Generalized linear mixed models adjusted for functional disability, cognitive impairment, and depressive symptoms were used to analyze associations of sensory impairments with prevalence of clinical problems, including behavioral symptoms, incontinence, fatigue, falls, problems with balance, sleep, nutrition, and communication. Of the participants, 32% had vision or hearing impairment (single impairment) and another 32% had both vision and hearing impairments (dual impairment). Residents with single impairment had significantly higher rates of communication problems, fatigue, balance problems, and sleep problems, as compared with residents without any sensory impairment. Those with dual impairment had significantly higher rates of all clinical problems assessed in this study as compared with those without sensory impairment. For each clinical problem, the magnitude of the odds ratio for specific clinical problems was higher for dual impairment than for single impairment. Visual and hearing impairments are associated with higher rates of common clinical problems among nursing home residents, independent of functional disability, cognitive impairment, and depressive symptoms. Copyright © 2014 AMDA – The Society for Post-Acute and Long-Term Care Medicine. Published by Elsevier Inc. All rights reserved.

  1. Cross-sensory correspondences and symbolism in spoken and written language.

    PubMed

    Walker, Peter

    2016-09-01

    Lexical sound symbolism in language appears to exploit the feature associations embedded in cross-sensory correspondences. For example, words incorporating relatively high acoustic frequencies (i.e., front/close rather than back/open vowels) are deemed more appropriate as names for concepts associated with brightness, lightness in weight, sharpness, smallness, speed, and thinness, because higher pitched sounds appear to have these cross-sensory features. Correspondences also support prosodic sound symbolism. For example, speakers might raise the fundamental frequency of their voice to emphasize the smallness of the concept they are naming. The conceptual nature of correspondences and their functional bidirectionality indicate they should also support other types of symbolism, including a visual equivalent of prosodic sound symbolism. For example, the correspondence between auditory pitch and visual thinness predicts that a typeface with relatively thin letter strokes will reinforce a word's reference to a relatively high pitch sound (e.g., squeal). An initial rating study confirms that the thinness-thickness of a typeface's letter strokes accesses the same cross-sensory correspondences observed elsewhere. A series of speeded word classification experiments then confirms that the thinness-thickness of letter strokes can facilitate a reader's comprehension of the pitch of a sound named by a word (thinner letter strokes being appropriate for higher pitch sounds), as can the brightness of the text (e.g., white-on-gray text being appropriate for the names of higher pitch sounds). It is proposed that the elementary visual features of text are represented in the same conceptual system as word meaning, allowing cross-sensory correspondences to support visual symbolism in language. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  2. Supramodal processing optimizes visual perceptual learning and plasticity.

    PubMed

    Zilber, Nicolas; Ciuciu, Philippe; Gramfort, Alexandre; Azizi, Leila; van Wassenhove, Virginie

    2014-06-01

    Multisensory interactions are ubiquitous in cortex and it has been suggested that sensory cortices may be supramodal i.e. capable of functional selectivity irrespective of the sensory modality of inputs (Pascual-Leone and Hamilton, 2001; Renier et al., 2013; Ricciardi and Pietrini, 2011; Voss and Zatorre, 2012). Here, we asked whether learning to discriminate visual coherence could benefit from supramodal processing. To this end, three groups of participants were briefly trained to discriminate which of a red or green intermixed population of random-dot-kinematograms (RDKs) was most coherent in a visual display while being recorded with magnetoencephalography (MEG). During training, participants heard no sound (V), congruent acoustic textures (AV) or auditory noise (AVn); importantly, congruent acoustic textures shared the temporal statistics - i.e. coherence - of visual RDKs. After training, the AV group significantly outperformed participants trained in V and AVn although they were not aware of their progress. In pre- and post-training blocks, all participants were tested without sound and with the same set of RDKs. When contrasting MEG data collected in these experimental blocks, selective differences were observed in the dynamic pattern and the cortical loci responsive to visual RDKs. First and common to all three groups, vlPFC showed selectivity to the learned coherence levels whereas selectivity in visual motion area hMT+ was only seen for the AV group. Second and solely for the AV group, activity in multisensory cortices (mSTS, pSTS) correlated with post-training performances; additionally, the latencies of these effects suggested feedback from vlPFC to hMT+ possibly mediated by temporal cortices in AV and AVn groups. Altogether, we interpret our results in the context of the Reverse Hierarchy Theory of learning (Ahissar and Hochstein, 2004) in which supramodal processing optimizes visual perceptual learning by capitalizing on sensory-invariant representations - here, global coherence levels across sensory modalities. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Cognitive Processes in Intelligence Analysis: A Descriptive Model and Review of the Literature

    DTIC Science & Technology

    1979-12-01

    The indexed excerpt of this report is fragmentary; its recoverable content concerns the sensory modalities (vision, hearing, touch, and the muscular sense), the means by which sensory input is made available to the rest of the cognitive structure, the roles of awareness and attention, and a sensory buffer with several characteristics.

  4. Inferring Nonlinear Neuronal Computation Based on Physiologically Plausible Inputs

    PubMed Central

    McFarland, James M.; Cui, Yuwei; Butts, Daniel A.

    2013-01-01

    The computation represented by a sensory neuron's response to stimuli is constructed from an array of physiological processes both belonging to that neuron and inherited from its inputs. Although many of these physiological processes are known to be nonlinear, linear approximations are commonly used to describe the stimulus selectivity of sensory neurons (i.e., linear receptive fields). Here we present an approach for modeling sensory processing, termed the Nonlinear Input Model (NIM), which is based on the hypothesis that the dominant nonlinearities imposed by physiological mechanisms arise from rectification of a neuron's inputs. Incorporating such ‘upstream nonlinearities’ within the standard linear-nonlinear (LN) cascade modeling structure implicitly allows for the identification of multiple stimulus features driving a neuron's response, which become directly interpretable as either excitatory or inhibitory. Because its form is analogous to an integrate-and-fire neuron receiving excitatory and inhibitory inputs, model fitting can be guided by prior knowledge about the inputs to a given neuron, and elements of the resulting model can often result in specific physiological predictions. Furthermore, by providing an explicit probabilistic model with a relatively simple nonlinear structure, its parameters can be efficiently optimized and appropriately regularized. Parameter estimation is robust and efficient even with large numbers of model components and in the context of high-dimensional stimuli with complex statistical structure (e.g. natural stimuli). We describe detailed methods for estimating the model parameters, and illustrate the advantages of the NIM using a range of example sensory neurons in the visual and auditory systems. We thus present a modeling framework that can capture a broad range of nonlinear response functions while providing physiologically interpretable descriptions of neural computation. PMID:23874185
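
    The model's core structure, rectified ("upstream") nonlinearities applied to linear stimulus filters whose outputs are summed with excitatory or inhibitory signs and passed through a spiking nonlinearity, can be written compactly. The following is a minimal forward-simulation sketch with made-up filters and stimuli; it is not the authors' published fitting code, which additionally estimates these components from spike data.

```python
# Minimal forward sketch of a Nonlinear Input Model (NIM)-style response:
# rectified outputs of linear stimulus filters are summed with excitatory (+1)
# or inhibitory (-1) weights and passed through a spiking nonlinearity.
# Filters and stimulus are invented here for illustration only.
import numpy as np

rng = np.random.default_rng(2)
n_lags, n_t = 20, 1000
stimulus = rng.normal(size=(n_t, n_lags))       # stimulus history per time bin

k_exc = np.exp(-np.arange(n_lags) / 4.0)        # excitatory temporal filter
k_inh = np.exp(-np.arange(n_lags) / 8.0)        # slower inhibitory filter
subunits = [(+1.0, k_exc), (-1.0, k_inh)]

relu = lambda x: np.maximum(x, 0.0)             # upstream rectifying nonlinearity
softplus = lambda x: np.log1p(np.exp(x))        # spiking nonlinearity F(.)

generator = sum(w * relu(stimulus @ k) for w, k in subunits)
rate = softplus(generator - 1.0)                # firing rate per bin, offset -1
spikes = rng.poisson(rate)                      # Poisson spike counts

print(f"mean rate: {rate.mean():.2f} spikes/bin, total spikes: {spikes.sum()}")
```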

  5. Reducing Problems in Fine Motor Development among Primary Children through the Use of Multi-Sensory Techniques.

    ERIC Educational Resources Information Center

    Wessel, Dorothy

    A 10-week classroom intervention program was implemented to facilitate the fine-motor development of eight first-grade children assessed as being deficient in motor skills. The program was divided according to five deficits to be remediated: visual motor, visual discrimination, visual sequencing, visual figure-ground, and visual memory. Each area…

  6. Modelling effects on grid cells of sensory input during self‐motion

    PubMed Central

    Raudies, Florian; Hinman, James R.

    2016-01-01

    Abstract The neural coding of spatial location for memory function may involve grid cells in the medial entorhinal cortex, but the mechanism of generating the spatial responses of grid cells remains unclear. This review describes some current theories and experimental data concerning the role of sensory input in generating the regular spatial firing patterns of grid cells, and changes in grid cell firing fields with movement of environmental barriers. As described here, the influence of visual features on spatial firing could involve either computations of self‐motion based on optic flow, or computations of absolute position based on the angle and distance of static visual cues. Due to anatomical selectivity of retinotopic processing, the sensory features on the walls of an environment may have a stronger effect on ventral grid cells that have wider spaced firing fields, whereas the sensory features on the ground plane may influence the firing of dorsal grid cells with narrower spacing between firing fields. These sensory influences could contribute to the potential functional role of grid cells in guiding goal‐directed navigation. PMID:27094096
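
    As a point of reference for the "regular spatial firing patterns" discussed above, a grid field is often idealized as the sum of three plane waves oriented 60 degrees apart. This textbook idealization is not prescribed by the review itself, and the spacing and phase values below are arbitrary.

```python
# Illustrative sketch of an idealized grid-cell firing map: the sum of three
# cosine gratings oriented 60 degrees apart, rectified into firing fields.
# This is a common idealization, not a model taken from the review.
import numpy as np

spacing = 0.5                                   # grid spacing (m), arbitrary
k = 4 * np.pi / (np.sqrt(3) * spacing)          # wave number for that spacing
angles = np.deg2rad([0, 60, 120])

x, y = np.meshgrid(np.linspace(0, 2, 200), np.linspace(0, 2, 200))

g = sum(np.cos(k * (x * np.cos(a) + y * np.sin(a))) for a in angles)
rate_map = np.maximum(g, 0.0) ** 2              # rectify/sharpen into firing fields

peak = np.unravel_index(rate_map.argmax(), rate_map.shape)
print(f"peak firing at x={x[peak]:.2f} m, y={y[peak]:.2f} m")
```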

  7. Functional near-infrared spectroscopy (fNIRS) brain imaging of multi-sensory integration during computerized dynamic posturography in middle-aged and older adults.

    PubMed

    Lin, Chia-Cheng; Barker, Jeffrey W; Sparto, Patrick J; Furman, Joseph M; Huppert, Theodore J

    2017-04-01

    Studies suggest that aging affects the sensory re-weighting process, but the neuroimaging evidence is minimal. Functional Near-Infrared Spectroscopy (fNIRS) is a novel neuroimaging tool that can detect brain activity during dynamic movement conditions. In this study, fNIRS was used to investigate the hemodynamic changes in the frontal-lateral, temporal-parietal, and occipital regions of interest (ROIs) during four sensory integration conditions that manipulated visual and somatosensory feedback in 15 middle-aged and 15 older adults. The results showed that the temporal-parietal ROI was activated more when somatosensory and visual information were absent in both groups, which indicated the sole use of vestibular input for maintaining balance. While both older adults and middle-aged adults had greater activity in most brain ROIs during changes in the sensory conditions, the older adults had greater increases in the occipital and frontal-lateral ROIs. These findings suggest a cortical component to sensory re-weighting that is more distributed and requires greater attention in older adults.

  8. M.I.T./Canadian vestibular experiments on the Spacelab-1 mission. I - Sensory adaptation to weightlessness and readaptation to one-g: An overview

    NASA Technical Reports Server (NTRS)

    Young, L. R.; Oman, C. M.; Lichtenberg, B. K.; Watt, D. G. D.; Money, K. E.

    1986-01-01

    Human sensory/motor adaptation to weightlessness and readaptation to earth's gravity are assessed. Preflight and postflight vestibular and visual responses for the crew on the Spacelab-1 mission are studied; the effect of the abnormal pattern of otolith afferent signals caused by weightlessness on the pitch and roll perception and postural adjustments of the subjects is examined. It is observed that body position and postural reactions change due to weightlessness in order to utilize the varied sensory inputs in a manner suited to microgravity conditions. The aspects of reinterpretation include: (1) tilt acceleration reinterpretation, (2) reduced postural response to z-axis linear acceleration, and (3) increased attention to visual cues.

  9. Studying Sensory Perception.

    ERIC Educational Resources Information Center

    Ackerly, Spafford C.

    2001-01-01

    Explains the vestibular organ's role in balancing the body and stabilizing the visual world using the example of a hunter. Describes the relationship between sensory perception and learning. Recommends using optical illusions to illustrate the distinctions between external realities and internal perceptions. (Contains 13 references.) (YDS)

  10. Iconic Memory and Reading Performance in Nine-Year-Old Children

    ERIC Educational Resources Information Center

    Riding, R. J.; Pugh, J. C.

    1977-01-01

    The reading process incorporates three factors: images registered in visual sensory memory, semantic analysis in short-term memory, and long-term memory storage. The focus here is on the contribution of sensory memory to reading performance. (Author/RK)

  11. Evaluating the Visually Impaired: Neuropsychological Techniques.

    ERIC Educational Resources Information Center

    Price, J. R.; And Others

    1987-01-01

    Assessment of nonvisual neuropsychological impairments in visually impaired persons can be achieved through modification of existing intelligence, memory, sensory-motor, personality, language, and achievement tests so that they do not require vision or penalize visually impaired persons. The Halstead-Reitan and Luria-Nebraska neuropsychological…

  12. Weighted integration of short-term memory and sensory signals in the oculomotor system.

    PubMed

    Deravet, Nicolas; Blohm, Gunnar; de Xivry, Jean-Jacques Orban; Lefèvre, Philippe

    2018-05-01

    Oculomotor behaviors integrate sensory and prior information to overcome sensory-motor delays and noise. After much debate about this process, reliability-based integration has recently been proposed and several models of smooth pursuit now include recurrent Bayesian integration or Kalman filtering. However, there is a lack of behavioral evidence in humans supporting these theoretical predictions. Here, we independently manipulated the reliability of visual and prior information in a smooth pursuit task. Our results show that both smooth pursuit eye velocity and catch-up saccade amplitude were modulated by visual and prior information reliability. We interpret these findings as the continuous reliability-based integration of a short-term memory of target motion with visual information, which supports this modeling work. Furthermore, we suggest that saccadic and pursuit systems share this short-term memory. We propose that this short-term memory of target motion is quickly built and continuously updated, and constitutes a general building block present in all sensorimotor systems.
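
    The reliability-based integration invoked here is usually formalized as inverse-variance (precision) weighting of the prior (the short-term memory of target motion) and the current visual estimate. The numbers in the sketch below are invented; only the weighting principle reflects the abstract. Lowering visual reliability shifts the fused estimate toward the prior.

```python
# Hedged sketch of reliability-based (inverse-variance) integration of a prior
# target-velocity memory with a noisy visual measurement. Values are invented.

def integrate(prior_mean, prior_var, visual_mean, visual_var):
    """Precision-weighted fusion of prior and visual velocity estimates."""
    w_visual = (1.0 / visual_var) / (1.0 / visual_var + 1.0 / prior_var)
    mean = w_visual * visual_mean + (1.0 - w_visual) * prior_mean
    var = 1.0 / (1.0 / visual_var + 1.0 / prior_var)
    return mean, var

prior_mean, prior_var = 15.0, 4.0        # remembered target velocity (deg/s)
for visual_var in (1.0, 4.0, 16.0):      # increasingly unreliable visual input
    mean, var = integrate(prior_mean, prior_var, visual_mean=20.0,
                          visual_var=visual_var)
    print(f"visual variance {visual_var:5.1f} -> fused estimate {mean:5.2f} deg/s")
```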

  13. Attention distributed across sensory modalities enhances perceptual performance

    PubMed Central

    Mishra, Jyoti; Gazzaley, Adam

    2012-01-01

    This study investigated the interaction between top-down attentional control and multisensory processing in humans. Using semantically congruent and incongruent audiovisual stimulus streams, we found target detection to be consistently improved in the setting of distributed audiovisual attention versus focused visual attention. This performance benefit was manifested as faster reaction times for congruent audiovisual stimuli, and as accuracy improvements for incongruent stimuli, resulting in a resolution of stimulus interference. Electrophysiological recordings revealed that these behavioral enhancements were associated with reduced neural processing of both auditory and visual components of the audiovisual stimuli under distributed vs. focused visual attention. These neural changes were observed at early processing latencies, within 100–300 ms post-stimulus onset, and localized to auditory, visual, and polysensory temporal cortices. These results highlight a novel neural mechanism for top-down driven performance benefits via enhanced efficacy of sensory neural processing during distributed audiovisual attention relative to focused visual attention. PMID:22933811

  14. Visual Perceptual Learning and Models.

    PubMed

    Dosher, Barbara; Lu, Zhong-Lin

    2017-09-15

    Visual perceptual learning through practice or training can significantly improve performance on visual tasks. Originally seen as a manifestation of plasticity in the primary visual cortex, perceptual learning is more readily understood as improvements in the function of brain networks that integrate processes, including sensory representations, decision, attention, and reward, and balance plasticity with system stability. This review considers the primary phenomena of perceptual learning, theories of perceptual learning, and perceptual learning's effect on signal and noise in visual processing and decision. Models, especially computational models, play a key role in behavioral and physiological investigations of the mechanisms of perceptual learning and for understanding, predicting, and optimizing human perceptual processes, learning, and performance. Performance improvements resulting from reweighting or readout of sensory inputs to decision provide a strong theoretical framework for interpreting perceptual learning and transfer that may prove useful in optimizing learning in real-world applications.
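
    The reweighting account summarized here treats learning as adjusting the readout weights from fixed, noisy sensory channels to a decision unit, typically with a simple delta rule. The sketch below is a generic toy instance under arbitrary choices of channel tuning, noise, and learning rate; it is not the authors' published reweighting model.

```python
# Toy sketch of the "reweighting" account of perceptual learning: fixed, noisy
# sensory channels feed a decision unit whose readout weights are adjusted by
# a delta rule. All parameter choices are arbitrary illustrations.
import numpy as np

rng = np.random.default_rng(3)
n_channels, n_trials, lr = 8, 2000, 0.002
tuning = np.linspace(-1.0, 1.0, n_channels)     # channel preferences

weights = np.zeros(n_channels)
correct = np.zeros(n_trials)
for t in range(n_trials):
    label = rng.choice([-1.0, 1.0])             # two stimulus categories
    channels = label * tuning + rng.normal(0, 0.8, n_channels)  # noisy evidence
    score = weights @ channels + rng.normal(0, 0.1)             # decision variable
    decision = 1.0 if score > 0 else -1.0
    correct[t] = decision == label
    weights += lr * (label - weights @ channels) * channels     # delta-rule update

print(f"accuracy, first 200 trials: {correct[:200].mean():.2f}")
print(f"accuracy, last 200 trials:  {correct[-200:].mean():.2f}")
```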

  15. Predictions penetrate perception: Converging insights from brain, behaviour and disorder

    PubMed Central

    O’Callaghan, Claire; Kveraga, Kestutis; Shine, James M; Adams, Reginald B.; Bar, Moshe

    2018-01-01

    It is argued that during ongoing visual perception, the brain is generating top-down predictions to facilitate, guide and constrain the processing of incoming sensory input. Here we demonstrate that these predictions are drawn from a diverse range of cognitive processes, in order to generate the richest and most informative prediction signals. This is consistent with a central role for cognitive penetrability in visual perception. We review behavioural and mechanistic evidence indicating that a wide spectrum of domains, including object recognition, contextual associations, cognitive biases and affective state, can directly influence visual perception. We combine these insights from the healthy brain with novel observations from neuropsychiatric disorders involving visual hallucinations, which highlight the consequences of imbalance between top-down signals and incoming sensory information. Together, these lines of evidence converge to indicate that predictive penetration, be it cognitive, social or emotional, should be considered a fundamental framework that supports visual perception. PMID:27222169

  16. Straightening the Eyes Doesn't Rebalance the Brain

    PubMed Central

    Zhou, Jiawei; Wang, Yonghua; Feng, Lixia; Wang, Jiafeng; Hess, Robert F.

    2017-01-01

    Surgery to align the two eyes is commonly used in treating strabismus. However, the role of strabismic surgery on patients' binocular visual processing is not yet fully understood. In this study, we asked two questions: (1) Does realigning the eyes by strabismic surgery produce an immediate benefit to patients' sensory eye balance? (2) If not, is there a subsequent period of “alignment adaptation” akin to refractive adaptation where sensory benefits to binocular function accrue? Seventeen patients with strabismus (mean age: 17.06 ± 5.16 years old) participated in our experiment. All participants had normal or corrected to normal visual acuity (LogMAR < 0.10) in the two eyes. We quantitatively measured their sensory eye balance before and after surgery using a binocular phase combination paradigm. For the seven patients whose sensory eye balance was measured before surgery, we found no significant change [t(6) = −0.92; p = 0.39] in the sensory eye balance measured 0.5–1 months after the surgery, indicating that the surgical re-alignment didn't by itself produce any immediate benefit for sensory eye balance. To answer the second question, we measured 16 patients' sensory eye balance at around 5–12 months after their eyes had been surgically re-aligned and compared this with our measurements 0.5–1 months after surgery. We found no significant change [t(15) = −0.89; p = 0.39] in sensory eye balance 5–12 months after the surgery. These results suggest that strabismic surgery while being necessary is not itself sufficient for re-establishing balanced sensory eye dominance. PMID:28955214

  17. Temporal profile of pain and other sensory manifestations in Guillain-Barré syndrome during ten days of hospitalization.

    PubMed

    Karkare, K; Taly, Arun B; Sinha, Sanjib; Rao, S

    2011-01-01

    Focused studies on sensory manifestations, especially pain and paresthesia, in Guillain-Barré (GB) syndrome are few and far between. The aim was to study the sensory manifestations in GB syndrome during 10 days of hospitalization, with clinico-electrophysiological correlation. The study included 60 non-consecutive patients with GB syndrome fulfilling the National Institute of Neurological and Communicative Disorders and Stroke (NINCDS) criteria for GB syndrome. Data related especially to clinical and electrophysiological evidence of sensory involvement were analyzed. Pain and paresthesia were assessed using (a) a visual analogue scale for paraesthesia (Vapar), (b) a visual analogue scale for pain (Vap), and (c) a verbal rating scale for pain (Verp). Sensory symptoms were widely prevalent: paraesthesia in 45 (75%) patients and pain in 30 (50%) patients. Impairment of different sensory modalities included pain in 8 (13.3%), joint position sense in 14 (23.3%), and vibration in 11 (18.3%). Electrophysiological evidence of abnormal sensory nerve conduction was noted in 35 (58.3%) patients. Assessment using Vapar, Vap, and Verp from Day 1 to Day 10 of hospitalization revealed that the degree and frequency of sensory symptoms and signs decreased from Day 7 onwards. On comparing clinico-electrophysiological parameters among patients with and without pain and paresthesia, the presence of respiratory distress correlated with pain and paresthesia (P=0.02). Sensory manifestations in GB syndrome are often under-recognized and under-emphasized. This study analyzed the evolution and profile of pain and paresthesia in GB syndrome during hospitalization. Knowledge about the evolution of pain and paresthesia during hospitalization might improve understanding and patient care.

  18. Sensory signals during active versus passive movement.

    PubMed

    Cullen, Kathleen E

    2004-12-01

    Our sensory systems are simultaneously activated as the result of our own actions and changes in the external world. The ability to distinguish self-generated sensory events from those that arise externally is thus essential for perceptual stability and accurate motor control. Recently, progress has been made towards understanding how this distinction is made. It has been proposed that an internal prediction of the consequences of our actions is compared to the actual sensory input to cancel the resultant self-generated activation. Evidence in support of this hypothesis has been obtained for early stages of sensory processing in the vestibular, visual and somatosensory systems. These findings have implications for the sensory-motor transformations that are needed to guide behavior.

  19. Moving in Dim Light: Behavioral and Visual Adaptations in Nocturnal Ants.

    PubMed

    Narendra, Ajay; Kamhi, J Frances; Ogawa, Yuri

    2017-11-01

    Visual navigation is a benchmark information processing task that can be used to identify the consequence of being active in dim-light environments. Visual navigational information that animals use during the day includes celestial cues such as the sun or the pattern of polarized skylight and terrestrial cues such as the entire panorama, canopy pattern, or significant salient features in the landscape. At night, some of these navigational cues are either unavailable or are significantly dimmer or less conspicuous than during the day. Even under these circumstances, animals navigate between locations of importance. Ants are a tractable system for studying navigation during day and night because the fine-scale movement of individual animals can be recorded in high spatial and temporal detail. Ant species range from strictly diurnal through crepuscular to nocturnal. In addition, a number of species have the ability to change from a day- to a night-active lifestyle owing to environmental demands. Ants also offer an opportunity to identify the evolution of sensory structures for discrete temporal niches not only between species but also within a single species. Their unique caste system, with an exclusively pedestrian mode of locomotion in workers and an exclusive life on the wing in males, allows us to disentangle sensory adaptations that cater for different lifestyles. In this article, we review the visual navigational abilities of nocturnal ants and identify the optical and physiological adaptations they have evolved for being efficient visual navigators in dim light. © The Author 2017. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.

  20. Aging effects on functional auditory and visual processing using fMRI with variable sensory loading.

    PubMed

    Cliff, Michael; Joyce, Dan W; Lamar, Melissa; Dannhauser, Thomas; Tracy, Derek K; Shergill, Sukhwinder S

    2013-05-01

    Traditionally, studies investigating the functional implications of age-related structural brain alterations have focused on higher cognitive processes; by increasing stimulus load, these studies assess behavioral and neurophysiological performance. In order to understand age-related changes in these higher cognitive processes, it is crucial to examine changes in visual and auditory processes that are the gateways to higher cognitive functions. This study provides evidence for age-related functional decline in visual and auditory processing, and regional alterations in functional brain processing, using non-invasive neuroimaging. Using functional magnetic resonance imaging (fMRI), younger (n=11; mean age=31) and older (n=10; mean age=68) adults were imaged while observing flashing checkerboard images (passive visual stimuli) and hearing word lists (passive auditory stimuli) across varying stimuli presentation rates. Younger adults showed greater overall levels of temporal and occipital cortical activation than older adults for both auditory and visual stimuli. The relative change in activity as a function of stimulus presentation rate showed differences between young and older participants. In visual cortex, the older group showed a decrease in fMRI blood oxygen level dependent (BOLD) signal magnitude as stimulus frequency increased, whereas the younger group showed a linear increase. In auditory cortex, the younger group showed a relative increase as a function of word presentation rate, while older participants showed a relatively stable magnitude of fMRI BOLD response across all rates. When analyzing participants across all ages, only the auditory cortical activation showed a continuous, monotonically decreasing BOLD signal magnitude as a function of age. Our preliminary findings show an age-related decline in demand-related, passive early sensory processing. As stimulus demand increases, visual and auditory cortex do not show increases in activity in older compared to younger people. This may negatively impact on the fidelity of information available to higher cognitive processing. Such evidence may inform future studies focused on cognitive decline in aging. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Mapping the structure of perceptual and visual-motor abilities in healthy young adults.

    PubMed

    Wang, Lingling; Krasich, Kristina; Bel-Bahar, Tarik; Hughes, Lauren; Mitroff, Stephen R; Appelbaum, L Gregory

    2015-05-01

    The ability to quickly detect and respond to visual stimuli in the environment is critical to many human activities. While such perceptual and visual-motor skills are important in a myriad of contexts, considerable variability exists between individuals in these abilities. To better understand the sources of this variability, we assessed perceptual and visual-motor skills in a large sample of 230 healthy individuals via the Nike SPARQ Sensory Station, and compared variability in their behavioral performance to demographic, state, sleep and consumption characteristics. Dimension reduction and regression analyses indicated three underlying factors: Visual-Motor Control, Visual Sensitivity, and Eye Quickness, which accounted for roughly half of the overall population variance in performance on this battery. Inter-individual variability in Visual-Motor Control was correlated with gender and circadian patterns such that performance on this factor was better for males and for those who had been awake for a longer period of time before assessment. The current findings indicate that abilities involving coordinated hand movements in response to stimuli are subject to greater individual variability, while visual sensitivity and oculomotor control are largely stable across individuals. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Environmental influences on neural systems of relational complexity

    PubMed Central

    Kalbfleisch, M. Layne; deBettencourt, Megan T.; Kopperman, Rebecca; Banasiak, Meredith; Roberts, Joshua M.; Halavi, Maryam

    2013-01-01

    Constructivist learning theory contends that we construct knowledge by experience and that environmental context influences learning. To explore this principle, we examined the cognitive process relational complexity (RC), defined as the number of visual dimensions considered during problem solving on a matrix reasoning task and a well-documented measure of mature reasoning capacity. We sought to determine how the visual environment influences RC by examining the influence of color and visual contrast on RC in a neuroimaging task. To specify the contributions of sensory demand and relational integration to reasoning, our participants performed a non-verbal matrix task comprised of color, no-color line, or black-white visual contrast conditions parametrically varied by complexity (relations 0, 1, 2). The use of matrix reasoning is ecologically valid for its psychometric relevance and for its potential to link the processing of psychophysically specific visual properties with various levels of RC during reasoning. The role of these elements is important because matrix tests assess intellectual aptitude based on these seemingly context-less exercises. This experiment is a first step toward examining the psychophysical underpinnings of performance on these types of problems. The importance of this is increased in light of recent evidence that intelligence can be linked to visual discrimination. We submit three main findings. First, color and black-white visual contrast (BWVC) add demand at a basic sensory level, but contributions from color and from BWVC are dissociable in cortex such that color engages a “reasoning heuristic” and BWVC engages a “sensory heuristic.” Second, color supports contextual sense-making by boosting salience resulting in faster problem solving. Lastly, when visual complexity reaches 2-relations, color and visual contrast relinquish salience to other dimensions of problem solving. PMID:24133465

  3. Visual and cross-modal cues increase the identification of overlapping visual stimuli in Balint's syndrome.

    PubMed

    D'Imperio, Daniela; Scandola, Michele; Gobbetto, Valeria; Bulgarelli, Cristina; Salgarello, Matteo; Avesani, Renato; Moro, Valentina

    2017-10-01

    Cross-modal interactions improve the processing of external stimuli, particularly when an isolated sensory modality is impaired. When information from different modalities is integrated, object recognition is facilitated probably as a result of bottom-up and top-down processes. The aim of this study was to investigate the potential effects of cross-modal stimulation in a case of simultanagnosia. We report a detailed analysis of clinical symptoms and an 18F-fluorodeoxyglucose (FDG) brain positron emission tomography/computed tomography (PET/CT) study of a patient affected by Balint's syndrome, a rare and invasive visual-spatial disorder following bilateral parieto-occipital lesions. An experiment was conducted to investigate the effects of visual and nonvisual cues on performance in tasks involving the recognition of overlapping pictures. Four modalities of sensory cues were used: visual, tactile, olfactory, and auditory. Data from neuropsychological tests showed the presence of ocular apraxia, optic ataxia, and simultanagnosia. The results of the experiment indicate a positive effect of the cues on the recognition of overlapping pictures, not only in the identification of the congruent valid-cued stimulus (target) but also in the identification of the other, noncued stimuli. All the sensory modalities analyzed (except the auditory stimulus) were efficacious in terms of increasing visual recognition. Cross-modal integration improved the patient's ability to recognize overlapping figures. However, while in the visual unimodal modality both bottom-up (priming, familiarity effect, disengagement of attention) and top-down processes (mental representation and short-term memory, the endogenous orientation of attention) are involved, in the cross-modal integration it is semantic representations that mainly activate visual recognition processes. These results are potentially useful for the design of rehabilitation training for attentional and visual-perceptual deficits.

  4. Conservation implications of anthropogenic impacts on visual communication and camouflage.

    PubMed

    Delhey, Kaspar; Peters, Anne

    2017-02-01

    Anthropogenic environmental impacts can disrupt the sensory environment of animals and affect important processes from mate choice to predator avoidance. Currently, these effects are best understood for auditory and chemosensory modalities, and recent reviews highlight their importance for conservation. We examined how anthropogenic changes to the visual environment (ambient light, transmission, and backgrounds) affect visual communication and camouflage and considered the implications of these effects for conservation. Human changes to the visual environment can increase predation risk by affecting camouflage effectiveness, lead to maladaptive patterns of mate choice, and disrupt mutualistic interactions between pollinators and plants. Implications for conservation are particularly evident for disrupted camouflage due to its tight links with survival. The conservation importance of impaired visual communication is less documented. The effects of anthropogenic changes on visual communication and camouflage may be severe when they affect critical processes such as pollination or species recognition. However, when impaired mate choice does not lead to hybridization, the conservation consequences are less clear. We suggest that the demographic effects of human impacts on visual communication and camouflage will be particularly strong when human-induced modifications to the visual environment are evolutionarily novel (i.e., very different from natural variation); affected species and populations have low levels of intraspecific (genotypic and phenotypic) variation and behavioral, sensory, or physiological plasticity; and the processes affected are directly related to survival (camouflage), species recognition, or number of offspring produced, rather than offspring quality or attractiveness. Our findings suggest that anthropogenic effects on the visual environment may be of similar importance relative to conservation as anthropogenic effects on other sensory modalities. © 2016 Society for Conservation Biology.

  5. Thalamic nuclei convey diverse contextual information to layer 1 of visual cortex

    PubMed Central

    Imhof, Fabia; Martini, Francisco J.; Hofer, Sonja B.

    2017-01-01

    Sensory perception depends on the context within which a stimulus occurs. Prevailing models emphasize cortical feedback as the source of contextual modulation. However, higher-order thalamic nuclei, such as the pulvinar, interconnect with many cortical and subcortical areas, suggesting a role for the thalamus in providing sensory and behavioral context – yet the nature of the signals conveyed to cortex by higher-order thalamus remains poorly understood. Here we use axonal calcium imaging to measure information provided to visual cortex by the pulvinar equivalent in mice, the lateral posterior nucleus (LP), as well as the dorsolateral geniculate nucleus (dLGN). We found that dLGN conveys retinotopically precise visual signals, while LP provides distributed information from the visual scene. Both LP and dLGN projections carry locomotion signals. However, while dLGN inputs often respond to positive combinations of running and visual flow speed, LP signals discrepancies between self-generated and external visual motion. This higher-order thalamic nucleus therefore conveys diverse contextual signals that inform visual cortex about visual scene changes not predicted by the animal’s own actions. PMID:26691828

  6. The primary visual cortex in the neural circuit for visual orienting

    NASA Astrophysics Data System (ADS)

    Zhaoping, Li

    The primary visual cortex (V1) is traditionally viewed as remote from the brain's motor outputs. However, V1 provides the most abundant cortical inputs directly to the sensory layers of the superior colliculus (SC), a midbrain structure that commands visual orienting behaviors such as gaze shifts and head turns. I will show physiological, anatomical, and behavioral data suggesting that V1 transforms visual input into a saliency map to guide a class of visual orienting that is reflexive or involuntary. In particular, V1 receives a retinotopic map of visual features, such as the orientation, color, and motion direction of local visual inputs; local interactions between V1 neurons perform a local-to-global computation to arrive at a saliency map that highlights conspicuous visual locations through higher V1 responses. The conspicuous locations are usually, but not always, where the statistics of the visual input change. The population of V1 outputs to SC, which is also retinotopic, enables SC to select, via lateral inhibition between SC neurons, the most salient location as the saccadic target. Experimental tests of this hypothesis will be shown. Variations of the neural circuit for visual orienting across animal species, with more or less V1 involvement, will be discussed. Supported by the Gatsby Charitable Foundation.
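
    The described pipeline, a V1 saliency map built from local feature contrast followed by selection of the most salient location in SC, can be caricatured as a center-surround contrast operation plus a winner-take-all. The sketch below is exactly such a cartoon with an invented feature map; it is not Zhaoping's actual V1 circuit model.

```python
# Cartoon sketch of the V1-saliency-to-SC pipeline described above: local
# feature contrast ("center minus surround mean") stands in for V1's saliency
# computation, and a winner-take-all stands in for lateral inhibition in SC.
import numpy as np

rng = np.random.default_rng(4)
# A retinotopic "orientation" feature map: mostly similar items (near 0)
# with one odd, tilted item (value 1) at a known location.
feature_map = rng.normal(0, 0.05, (16, 16))
feature_map[11, 4] = 1.0                         # the orientation singleton

def local_contrast(fmap, radius=2):
    """Saliency as absolute difference between each location and its local mean."""
    padded = np.pad(fmap, radius, mode="edge")
    sal = np.zeros_like(fmap)
    for i in range(fmap.shape[0]):
        for j in range(fmap.shape[1]):
            patch = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            sal[i, j] = abs(fmap[i, j] - patch.mean())
    return sal

saliency = local_contrast(feature_map)
target = np.unravel_index(saliency.argmax(), saliency.shape)  # winner-take-all
print(f"selected saccade target (row, col): {target}")        # expect (11, 4)
```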

  7. Selection on quantitative colour variation in Centaurea cyanus: the role of the pollinator's visual system.

    PubMed

    Renoult, J P; Thomann, M; Schaefer, H M; Cheptou, P-O

    2013-11-01

    Even though the importance of selection for trait evolution is well established, we still lack a functional understanding of the mechanisms underlying phenotypic selection. Because animals necessarily use their sensory system to perceive phenotypic traits, the model of sensory bias assumes that sensory systems are the main determinant of signal evolution. Yet, it has remained poorly known how sensory systems contribute to shaping the fitness surface of selected individuals. In a greenhouse experiment, we quantified the strength and direction of selection on floral coloration in a population of cornflowers exposed to bumblebees as unique pollinators during 4 days. We detected significant selection on the chromatic and achromatic (brightness) components of floral coloration. We then studied whether these patterns of selection are explicable by accounting for the visual system of the pollinators. Using data on bumblebee colour vision, we first showed that bumblebees should discriminate among quantitative colour variants. The observed selection was then compared to the selection predicted by psychophysical models of bumblebee colour vision. The achromatic but not the chromatic channel of the bumblebee's visual system could explain the observed pattern of selection. These results highlight that (i) pollinators can select quantitative variation in floral coloration and could thus account for a gradual evolution of flower coloration, and (ii) stimulation of the visual system represents, at least partly, a functional mechanism potentially explaining pollinators' selection on floral colour variants. © 2013 The Authors. Journal of Evolutionary Biology © 2013 European Society For Evolutionary Biology.

  8. Properties of intermodal transfer after dual visuo- and auditory-motor adaptation.

    PubMed

    Schmitz, Gerd; Bock, Otmar L

    2017-10-01

    Previous work documented that sensorimotor adaptation transfers between sensory modalities: When subjects adapt with one arm to a visuomotor distortion while responding to visual targets, they also appear to be adapted when they are subsequently tested with auditory targets. Vice versa, when they adapt to an auditory-motor distortion while pointing to auditory targets, they appear to be adapted when they are subsequently tested with visual targets. Therefore, it was concluded that visuomotor and auditory-motor adaptation use the same adaptation mechanism. Furthermore, it has been proposed that sensory information from the trained modality is weighted more heavily than sensory information from an untrained one, because transfer between sensory modalities is incomplete. The present study tested these hypotheses for dual-arm adaptation. One arm adapted to an auditory-motor distortion and the other to an oppositely directed auditory-motor or visuomotor distortion. We found that both arms adapted significantly. However, compared to reference data on single-arm adaptation, adaptation in the dominant arm was reduced, indicating interference from the non-dominant to the dominant arm. We further found that arm-specific aftereffects of adaptation, which reflect recalibration of sensorimotor transformation rules, were stronger or equally strong when targets were presented in the previously adapted compared to the non-adapted sensory modality, even when one arm adapted visually and the other auditorily. The findings are discussed with respect to a recently published schematic model of sensorimotor adaptation. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Performance evaluation of a kinesthetic-tactual display

    NASA Technical Reports Server (NTRS)

    Jagacinski, R. J.; Flach, J. M.; Gilson, R. D.; Dunn, R. S.

    1982-01-01

    Simulator studies demonstrated the feasibility of using kinesthetic-tactual (KT) displays for providing collective and cyclic command information, and suggested that KT displays may increase pilot workload capability. A dual-axis laboratory tracking task suggested that beyond reduction in visual scanning, there may be additional sensory or cognitive benefits to the use of multiple sensory modalities. Single-axis laboratory tracking tasks revealed performance with a quickened KT display to be equivalent to performance with a quickened visual display for a low frequency sum-of-sinewaves input. In contrast, an unquickened KT display was inferior to an unquickened visual display. Full scale simulator studies and/or inflight testing are recommended to determine the generality of these results.

  10. Changing motor perception by sensorimotor conflicts and body ownership

    PubMed Central

    Salomon, R.; Fernandez, N. B.; van Elk, M.; Vachicouras, N.; Sabatier, F.; Tychinskaya, A.; Llobera, J.; Blanke, O.

    2016-01-01

    Experimentally induced sensorimotor conflicts can result in a loss of the feeling of control over a movement (sense of agency). These findings are typically interpreted in terms of a forward model in which the predicted sensory consequences of the movement are compared with the observed sensory consequences. In the present study we investigated whether a mismatch between movements and their observed sensory consequences does not only result in a reduced feeling of agency, but may affect motor perception as well. Visual feedback of participants’ finger movements was manipulated using virtual reality to be anatomically congruent or incongruent to the performed movement. Participants made a motor perception judgment (i.e. which finger did you move?) or a visual perceptual judgment (i.e. which finger did you see moving?). Subjective measures of agency and body ownership were also collected. Seeing movements that were visually incongruent to the performed movement resulted in a lower accuracy for motor perception judgments, but not visual perceptual judgments. This effect was modified by rotating the virtual hand (Exp.2), but not by passively induced movements (Exp.3). Hence, sensorimotor conflicts can modulate the perception of one’s motor actions, causing viewed “alien actions” to be felt as one’s own. PMID:27225834

  11. VISUAL DEFICIENCIES AND READING DISABILITY.

    ERIC Educational Resources Information Center

    ROSEN, CARL L.

    THE ROLE OF VISUAL SENSORY DEFICIENCIES IN THE CAUSATION OF READING DISABILITY IS DISCUSSED. PREVIOUS AND CURRENT RESEARCH STUDIES DEALING WITH SPECIFIC VISUAL PROBLEMS WHICH HAVE BEEN FOUND TO BE NEGATIVELY RELATED TO SUCCESSFUL READING ACHIEVEMENT ARE LISTED--(1) FARSIGHTEDNESS, (2) ASTIGMATISM, (3) BINOCULAR INCOORDINATIONS, AND (4) FUSIONAL…

  12. RELEVANCE OF VISUAL EFFECTS OF VOLATILE ORGANIC COMPOUNDS TO HUMAN HEALTH RISK ASSESSMENT

    EPA Science Inventory

    Traditional measures of neurotoxicity have included assessment of sensory, cognitive, and motor function. Visual system function and the neurobiological substrates are well characterized across species. Dysfunction in the visual system may be specific or may be surrogate for mor...

  13. Cross-Sensory Transfer of Reference Frames in Spatial Memory

    ERIC Educational Resources Information Center

    Kelly, Jonathan W.; Avraamides, Marios N.

    2011-01-01

    Two experiments investigated whether visual cues influence spatial reference frame selection for locations learned through touch. Participants experienced visual cues emphasizing specific environmental axes and later learned objects through touch. Visual cues were manipulated and haptic learning conditions were held constant. Imagined perspective…

  14. Age-Related Sensory Impairments and Risk of Cognitive Impairment

    PubMed Central

    Fischer, Mary E; Cruickshanks, Karen J.; Schubert, Carla R; Pinto, Alex A; Carlsson, Cynthia M; Klein, Barbara EK; Klein, Ronald; Tweed, Ted S.

    2016-01-01

    Background/Objectives: To evaluate the associations of sensory impairments with the 10-year risk of cognitive impairment. Previous work has primarily focused on the relationship between a single sensory system and cognition. Design: The Epidemiology of Hearing Loss Study (EHLS) is a longitudinal, population-based study of aging in the Beaver Dam, WI community. Baseline examinations were conducted in 1993 and follow-up exams have been conducted every 5 years. Setting: General community. Participants: EHLS members without cognitive impairment at EHLS-2 (1998–2000). There were 1,884 participants (mean age = 66.7 years) with complete EHLS-2 sensory data and follow-up information. Measurements: Cognitive impairment was a Mini-Mental State Examination score of < 24 or history of dementia or Alzheimer’s disease. Hearing impairment was a pure-tone average of hearing thresholds (0.5, 1, 2 and 4 kHz) of > 25 decibel Hearing Level in either ear. Visual impairment was Pelli-Robson contrast sensitivity of < 1.55 log units in the better eye, and olfactory impairment was a San Diego Odor Identification Test score of < 6. Results: Hearing, visual, and olfactory impairment were independently associated with cognitive impairment risk [Hearing: Hazard Ratio (HR) = 1.90, 95% Confidence Interval (C.I.) = 1.11, 3.26; Vision: HR = 2.05, 95% C.I. = 1.24, 3.38; Olfaction: HR = 3.92, 95% C.I. = 2.45, 6.26]. However, 85% with hearing impairment, 81% with visual impairment, and 76% with olfactory impairment did not develop cognitive impairment during follow-up. Conclusion: The relationship between sensory impairment and cognitive impairment was not unique to one sensory system, suggesting sensorineural health may be a marker of brain aging. The development of a combined sensorineurocognitive measure may be useful in uncovering mechanisms of healthy brain aging. PMID:27611845
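
    The impairment definitions reported in the abstract translate directly into threshold checks. The sketch below applies them to one participant's measurements; the function and field names are hypothetical, but the cut-offs follow the text.

    ```python
    # Minimal sketch of the impairment definitions in the abstract above.
    # Function and argument names are hypothetical; thresholds follow the text.
    def classify_impairments(pta_each_ear_db, contrast_sensitivity_log,
                             odor_identification_score, mmse_score,
                             dementia_history=False):
        """Return impairment flags for one participant."""
        return {
            # hearing: pure-tone average (0.5, 1, 2, 4 kHz) > 25 dB HL in either ear
            "hearing": max(pta_each_ear_db) > 25,
            # vision: Pelli-Robson contrast sensitivity < 1.55 log units (better eye)
            "vision": contrast_sensitivity_log < 1.55,
            # olfaction: San Diego Odor Identification Test score < 6
            "olfaction": odor_identification_score < 6,
            # cognitive impairment: MMSE < 24 or history of dementia/Alzheimer's
            "cognitive": mmse_score < 24 or dementia_history,
        }

    print(classify_impairments((22, 31), 1.60, 5, 27))
    ```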

  15. T-type calcium channels cause bursts of spikes in motor but not sensory thalamic neurons during mimicry of natural patterns of synaptic input.

    PubMed

    Kim, Haram R; Hong, Su Z; Fiorillo, Christopher D

    2015-01-01

    Although neurons within intact nervous systems can be classified as 'sensory' or 'motor,' it is not known whether there is any general distinction between sensory and motor neurons at the cellular or molecular levels. Here, we extend and test a theory according to which activation of certain subtypes of voltage-gated ion channel (VGC) generates patterns of spikes in neurons of motor systems, whereas VGC are proposed to counteract patterns in sensory neurons. We previously reported experimental evidence for the theory from visual thalamus, where we found that T-type calcium channels (TtCCs) did not cause bursts of spikes but instead served the function of 'predictive homeostasis' to maximize the causal and informational link between retinogeniculate excitation and spike output. Here, we have recorded neurons in brain slices from eight sensory and motor regions of rat thalamus while mimicking key features of natural excitatory and inhibitory post-synaptic potentials. As predicted by theory, TtCC did cause bursts of spikes in motor thalamus. TtCC-mediated responses in motor thalamus were activated at more hyperpolarized potentials and caused larger depolarizations with more spikes than in visual and auditory thalamus. Somatosensory thalamus is known to be more closely connected to motor regions relative to auditory and visual thalamus, and likewise the strength of its TtCC responses was intermediate between these regions and motor thalamus. We also observed lower input resistance, as well as limited evidence of stronger hyperpolarization-induced ('H-type') depolarization, in nuclei closer to motor output. These findings support our theory of a specific difference between sensory and motor neurons at the cellular level.

  16. From Sensory Signals to Modality-Independent Conceptual Representations: A Probabilistic Language of Thought Approach

    PubMed Central

    Erdogan, Goker; Yildirim, Ilker; Jacobs, Robert A.

    2015-01-01

    People learn modality-independent, conceptual representations from modality-specific sensory signals. Here, we hypothesize that any system that accomplishes this feat will include three components: a representational language for characterizing modality-independent representations, a set of sensory-specific forward models for mapping from modality-independent representations to sensory signals, and an inference algorithm for inverting forward models—that is, an algorithm for using sensory signals to infer modality-independent representations. To evaluate this hypothesis, we instantiate it in the form of a computational model that learns object shape representations from visual and/or haptic signals. The model uses a probabilistic grammar to characterize modality-independent representations of object shape, uses a computer graphics toolkit and a human hand simulator to map from object representations to visual and haptic features, respectively, and uses a Bayesian inference algorithm to infer modality-independent object representations from visual and/or haptic signals. Simulation results show that the model infers identical object representations when an object is viewed, grasped, or both. That is, the model’s percepts are modality invariant. We also report the results of an experiment in which different subjects rated the similarity of pairs of objects in different sensory conditions, and show that the model provides a very accurate account of subjects’ ratings. Conceptually, this research significantly contributes to our understanding of modality invariance, an important type of perceptual constancy, by demonstrating how modality-independent representations can be acquired and used. Methodologically, it provides an important contribution to cognitive modeling, particularly an emerging probabilistic language-of-thought approach, by showing how symbolic and statistical approaches can be combined in order to understand aspects of human perception. PMID:26554704
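
    The architecture described above inverts modality-specific forward models to recover a modality-independent representation. The toy sketch below scores candidate shape hypotheses against visual and haptic observations under Gaussian likelihoods; the hypotheses, stand-in forward models, and noise level are all illustrative substitutes for the paper's probabilistic grammar, graphics/hand simulators, and inference algorithm.

    ```python
    import numpy as np

    # Toy analysis-by-synthesis: score candidate modality-independent shape
    # hypotheses by how well their predicted visual and haptic features match
    # the observed signals. Everything here is an illustrative stand-in.
    rng = np.random.default_rng(0)

    hypotheses = {"cube": np.array([1.0, 0.0]), "cylinder": np.array([0.0, 1.0])}

    def render_visual(shape_params):   # stand-in for a graphics forward model
        return shape_params + np.array([0.1, -0.1])

    def render_haptic(shape_params):   # stand-in for a hand-simulator forward model
        return 2.0 * shape_params

    observed_visual = render_visual(hypotheses["cube"]) + 0.05 * rng.standard_normal(2)
    observed_haptic = render_haptic(hypotheses["cube"]) + 0.05 * rng.standard_normal(2)

    def log_posterior(params, sigma=0.1):
        # flat prior over hypotheses; Gaussian likelihood for each modality
        ll_v = -np.sum((observed_visual - render_visual(params)) ** 2) / (2 * sigma ** 2)
        ll_h = -np.sum((observed_haptic - render_haptic(params)) ** 2) / (2 * sigma ** 2)
        return ll_v + ll_h

    best = max(hypotheses, key=lambda name: log_posterior(hypotheses[name]))
    print("inferred shape:", best)  # same answer whether scored on vision, touch, or both
    ```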

  17. Sensory organization for balance: specific deficits in Alzheimer's but not in Parkinson's disease.

    PubMed

    Chong, R K; Horak, F B; Frank, J; Kaye, J

    1999-03-01

    The cause of frequent falling in patients with dementia of the Alzheimer type (AD) is not well understood. Distraction from incongruent visual stimuli may be an important factor as suggested by their poor performance in tests of shifting visual attention in other studies. The purpose of this study was to determine whether AD patients have difficulty maintaining upright balance under absent and/or incongruent visual and other sensory conditions compared to nondemented healthy elderly persons and individuals with Parkinson's disease (PD). Seventeen healthy older adults, 15 medicated PD subjects, and 11 AD subjects underwent the Sensory Organization Test protocol. The incidence of loss of balance ("falls"), and the peak-to-peak amplitude of body center of mass sway during stance in the six sensory conditions were used to infer the ability to use visual, somatosensory, and vestibular signals when they provided useful information for balance, and to suppress them when they were incongruent as an orientation reference. Vestibular reflex tests were conducted to ensure normal vestibular function in the subjects. AD subjects had normal vestibular function but had trouble using it in condition 6, where they had to concurrently suppress both incongruent visual and somatosensory inputs. All 11 AD subjects fell in the first trial of this condition. With repeated trials, only three AD subjects were able to stay balanced. AD subjects were able to keep their balance when only somatosensory input was incongruent. In this condition, all AD subjects were able to maintain balance whereas some falls occurred in the other groups. In all conditions, when AD subjects did not fall, they were able to control as large a sway as the healthy controls, except when standing with eyes closed in condition 2: AD subjects did not increase their sway whereas the other groups did. In the PD group, the total fall incidence was similar to the AD group, but the distribution was generalized across more sensory conditions. PD subjects were also able to improve with repeated trials in condition 6. Patients with dementia of the Alzheimer type have decreased ability to suppress incongruent visual stimuli when trying to maintain balance. However, they did not seem to be dependent on vision for balance because they did not increase their sway when vision was absent. Parkinsonian patients have a more general balance control problem in the sensory organization test, possibly related to difficulty changing set.

  18. Laminar and cytoarchitectonic features of the cerebral cortex in the Risso's dolphin (Grampus griseus), striped dolphin (Stenella coeruleoalba), and bottlenose dolphin (Tursiops truncatus).

    PubMed

    Furutani, Rui

    2008-09-01

    The present investigation carried out Nissl, Klüver-Barrera, and Golgi studies of the cerebral cortex in three distinct genera of oceanic dolphins (Risso's dolphin, striped dolphin and bottlenose dolphin) to identify and classify cortical laminar and cytoarchitectonic structures in four distinct functional areas, including primary motor (M1), primary sensory (S1), primary visual (V1), and primary auditory (A1) cortices. The laminar and cytoarchitectonic organization of each of these cortical areas was similar among the three dolphin species. M1 was visualized as a five-layer structure that included the molecular layer (layer I), external granular layer (layer II), external pyramidal layer (layer III), internal pyramidal layer (layer V), and fusiform layer (layer VI). The internal granular layer was absent. The cetacean sensory-related cortical areas S1, V1, and A1 were also found to have a five-layer organization comprising layers I, II, III, V and VI. In particular, A1 was characterized by the broadest layers I and II and a well-developed band of pyramidal neurons in layers III (sublayers IIIa, IIIb and IIIc) and V. A patch organization consisting of layer IIIb pyramidal neurons was detected in S1 and V1, but not in A1. The laminar patterns of V1 and S1 were similar, but the cytoarchitectonic structures of the two areas were different. V1 was characterized by a broader layer II than that of S1, and also contained specialized pyramidal and multipolar stellate neurons in layers III and V.

  19. Reliability and relative weighting of visual and nonvisual information for perceiving direction of self-motion during walking

    PubMed Central

    Saunders, Jeffrey A.

    2014-01-01

    Direction of self-motion during walking is indicated by multiple cues, including optic flow, nonvisual sensory cues, and motor prediction. I measured the reliability of perceived heading from visual and nonvisual cues during walking, and whether cues are weighted in an optimal manner. I used a heading alignment task to measure perceived heading during walking. Observers walked toward a target in a virtual environment with and without global optic flow. The target was simulated to be infinitely far away, so that it did not provide direct feedback about direction of self-motion. Variability in heading direction was low even without optic flow, with average RMS error of 2.4°. Global optic flow reduced variability to 1.9°–2.1°, depending on the structure of the environment. The small amount of variance reduction was consistent with optimal use of visual information. The relative contribution of visual and nonvisual information was also measured using cue conflict conditions. Optic flow specified a conflicting heading direction (±5°), and bias in walking direction was used to infer relative weighting. Visual feedback influenced heading direction by 16%–34% depending on scene structure, with more effect with dense motion parallax. The weighting of visual feedback was close to the predictions of an optimal integration model given the observed variability measures. PMID:24648194
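
    The optimal-integration prediction tested above is standard inverse-variance cue weighting. A minimal sketch, using the reported nonvisual variability (about 2.4° RMS) and an assumed visual-cue reliability, shows how the predicted visual weight and combined variability are computed; the visual standard deviation here is an illustrative assumption, not a fitted value.

    ```python
    # Reliability-weighted (inverse-variance) cue combination:
    #   w_visual = (1/sigma_v^2) / (1/sigma_v^2 + 1/sigma_nv^2)
    #   sigma_combined^2 = 1 / (1/sigma_v^2 + 1/sigma_nv^2)
    sigma_nonvisual = 2.4          # deg, heading variability without optic flow (from the abstract)

    def optimal_weight_and_sd(sigma_visual, sigma_nonvisual):
        prec_v, prec_nv = 1 / sigma_visual**2, 1 / sigma_nonvisual**2
        w_visual = prec_v / (prec_v + prec_nv)
        sigma_combined = (prec_v + prec_nv) ** -0.5
        return w_visual, sigma_combined

    # Example with an assumed visual-cue reliability of 3.5 deg
    w_v, sigma_c = optimal_weight_and_sd(3.5, sigma_nonvisual)
    print(f"predicted visual weight {w_v:.2f}, combined SD {sigma_c:.2f} deg")
    ```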

  20. Sensory-based expert monitoring and control

    NASA Astrophysics Data System (ADS)

    Yen, Gary G.

    1999-03-01

    Field operators use their eyes, ears, and nose to detect process behavior and to trigger corrective control actions. For instance, in daily practice the experienced operator in sulfuric acid treatment of phosphate rock may observe froth color or bubble character to control process material in-flow, or may use the acoustic sound of cavitation or boiling/flashing to increase or decrease material flow rates in tank levels. By contrast, process control computers continue to be limited to taking action on P, T, F, and A signals. Yet there is sufficient evidence from the field that visual and acoustic information can be used for control and identification, and smart in-situ sensors offer a potential mechanism for factory automation with promising industrial applicability. In response to these needs, a generic, structured health monitoring approach is proposed. The system assumes that a given sensor suite will act as an on-line health usage monitor and, at best, provide real-time control autonomy. The sensor suite can incorporate various types of sensory devices, from vibration accelerometers, directional microphones, machine vision CCDs, and pressure gauges to temperature indicators. The decision can be shown on a visual on-board display or fed to the control block to invoke controller reconfiguration.

  1. Brainstem origins for cortical 'what' and 'where' pathways in the auditory system.

    PubMed

    Kraus, Nina; Nicol, Trent

    2005-04-01

    We have developed a data-driven conceptual framework that links two areas of science: the source-filter model of acoustics and cortical sensory processing streams. The source-filter model describes the mechanics behind speech production: the identity of the speaker is carried largely in the vocal cord source and the message is shaped by the ever-changing filters of the vocal tract. Sensory processing streams, popularly called 'what' and 'where' pathways, are well established in the visual system as a neural scheme for separately carrying different facets of visual objects, namely their identity and their position/motion, to the cortex. A similar functional organization has been postulated in the auditory system. Both speaker identity and the spoken message, which are simultaneously conveyed in the acoustic structure of speech, can be disentangled into discrete brainstem response components. We argue that these two response classes are early manifestations of auditory 'what' and 'where' streams in the cortex. This brainstem link forges a new understanding of the relationship between the acoustics of speech and cortical processing streams, unites two hitherto separate areas in science, and provides a model for future investigations of auditory function.

  2. Location-specific effects of attention during visual short-term memory maintenance.

    PubMed

    Matsukura, Michi; Cosman, Joshua D; Roper, Zachary J J; Vatterott, Daniel B; Vecera, Shaun P

    2014-06-01

    Recent neuroimaging studies suggest that early sensory areas such as area V1 are recruited to actively maintain a selected feature of the item held in visual short-term memory (VSTM). These findings raise the possibility that visual attention operates in a similar manner across perceptual and memory representations to a certain extent, even though memory-level and perception-level selection are functionally dissociable. If VSTM operates by retaining "reasonable copies" of scenes constructed during sensory processing (Serences et al., 2009, p. 207, the sensory recruitment hypothesis), then it is possible that selective attention can be guided by both exogenous (peripheral) and endogenous (central) cues during VSTM maintenance. Yet the results from previous studies that examined this issue are inconsistent. In the present study, we investigated whether attention can be directed to a specific item's location represented in VSTM with an exogenous cue in a well-controlled setting. The results from the four experiments suggest that, as observed with endogenous cues, exogenous cues can efficiently guide selective attention during VSTM maintenance. The finding is not only consistent with the sensory recruitment hypothesis but also validates the legitimacy of exogenous cue use in past and future studies. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  3. Fine-scale topography in sensory systems: insights from Drosophila and vertebrates

    PubMed Central

    Kaneko, Takuya; Ye, Bing

    2015-01-01

    To encode the positions of sensory stimuli, sensory circuits form topographic maps in the central nervous system through specific point-to-point connections between pre- and post-synaptic neurons. In vertebrate visual systems, the establishment of topographic maps involves the formation of a coarse topography followed by that of fine-scale topography that distinguishes the axon terminals of neighboring neurons. It is known that intrinsic differences in the form of broad gradients of guidance molecules instruct coarse topography while neuronal activity is required for fine-scale topography. On the other hand, studies in the Drosophila visual system have shown that intrinsic differences in cell adhesion among the axon terminals of neighboring neurons instruct the fine-scale topography. Recent studies on activity-dependent topography in the Drosophila somatosensory system have revealed a role of neuronal activity in creating molecular differences among sensory neurons for establishing fine-scale topography, implicating a conserved principle. Here we review the findings in both Drosophila and vertebrates and propose an integrated model for fine-scale topography. PMID:26091779

  4. Fine-scale topography in sensory systems: insights from Drosophila and vertebrates.

    PubMed

    Kaneko, Takuya; Ye, Bing

    2015-09-01

    To encode the positions of sensory stimuli, sensory circuits form topographic maps in the central nervous system through specific point-to-point connections between pre- and postsynaptic neurons. In vertebrate visual systems, the establishment of topographic maps involves the formation of a coarse topography followed by that of fine-scale topography that distinguishes the axon terminals of neighboring neurons. It is known that intrinsic differences in the form of broad gradients of guidance molecules instruct coarse topography while neuronal activity is required for fine-scale topography. On the other hand, studies in the Drosophila visual system have shown that intrinsic differences in cell adhesion among the axon terminals of neighboring neurons instruct the fine-scale topography. Recent studies on activity-dependent topography in the Drosophila somatosensory system have revealed a role of neuronal activity in creating molecular differences among sensory neurons for establishing fine-scale topography, implicating a conserved principle. Here we review the findings in both Drosophila and vertebrates and propose an integrated model for fine-scale topography.

  5. The associations between multisensory temporal processing and symptoms of schizophrenia.

    PubMed

    Stevenson, Ryan A; Park, Sohee; Cochran, Channing; McIntosh, Lindsey G; Noel, Jean-Paul; Barense, Morgan D; Ferber, Susanne; Wallace, Mark T

    2017-01-01

    Recent neurobiological accounts of schizophrenia have included an emphasis on changes in sensory processing. These sensory and perceptual deficits can have a cascading effect onto higher-level cognitive processes and clinical symptoms. One form of sensory dysfunction that has been consistently observed in schizophrenia is altered temporal processing. In this study, we investigated temporal processing within and across the auditory and visual modalities in individuals with schizophrenia (SCZ) and age-matched healthy controls. Individuals with SCZ showed auditory and visual temporal processing abnormalities, as well as multisensory temporal processing dysfunction that extended beyond that attributable to unisensory processing dysfunction. Most importantly, these multisensory temporal deficits were associated with the severity of hallucinations. This link between atypical multisensory temporal perception and clinical symptomatology suggests that clinical symptoms of schizophrenia may be at least partly a result of cascading effects from (multi)sensory disturbances. These results are discussed in terms of underlying neural bases and the possible implications for remediation. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Autistic traits, but not schizotypy, predict increased weighting of sensory information in Bayesian visual integration.

    PubMed

    Karvelis, Povilas; Seitz, Aaron R; Lawrie, Stephen M; Seriès, Peggy

    2018-05-14

    Recent theories propose that schizophrenia/schizotypy and autism spectrum disorder are related to impairments in Bayesian inference, that is, in how the brain integrates sensory information (likelihoods) with prior knowledge. However, existing accounts fail to clarify (i) how the proposed theories differ in their accounts of ASD vs. schizophrenia and (ii) whether the impairments result from weaker priors or enhanced likelihoods. Here, we directly address these issues by characterizing how 91 healthy participants, scored for autistic and schizotypal traits, implicitly learned and combined priors with sensory information. This was accomplished through a visual statistical learning paradigm designed to quantitatively assess variations in individuals' likelihoods and priors. The acquisition of the priors was found to be intact along both trait spectra. However, autistic traits were associated with more veridical perception and a weaker influence of expectations. Bayesian modeling revealed that this was due, not to weaker prior expectations, but to more precise sensory representations. © 2018, Karvelis et al.
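
    The distinction drawn above, weaker priors versus more precise likelihoods, falls out of standard Gaussian prior-likelihood combination. The sketch below holds the prior fixed and sharpens the likelihood, which shifts the posterior toward the sensory observation; all numbers are illustrative, not the study's estimates.

    ```python
    # Gaussian prior-likelihood combination: a more precise likelihood
    # (smaller sigma_likelihood) increases the weight on the sensory evidence
    # even when the prior itself is unchanged. Values are illustrative.
    def posterior_mean(sensory_obs, sigma_likelihood, prior_mean, sigma_prior):
        prec_l, prec_p = 1 / sigma_likelihood**2, 1 / sigma_prior**2
        w_sensory = prec_l / (prec_l + prec_p)        # weight on the sensory evidence
        return w_sensory * sensory_obs + (1 - w_sensory) * prior_mean, w_sensory

    prior_mean, sigma_prior = 0.0, 1.0                # same acquired prior in both cases
    obs = 2.0                                         # current sensory input

    for sigma_likelihood in (1.0, 0.5):               # sharper likelihood in the second case
        mu, w = posterior_mean(obs, sigma_likelihood, prior_mean, sigma_prior)
        print(f"sigma_likelihood={sigma_likelihood}: posterior mean {mu:.2f}, sensory weight {w:.2f}")
    ```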

  7. Grey matter connectivity within and between auditory, language and visual systems in prelingually deaf adolescents.

    PubMed

    Li, Wenjing; Li, Jianhong; Wang, Zhenchang; Li, Yong; Liu, Zhaohui; Yan, Fei; Xian, Junfang; He, Huiguang

    2015-01-01

    Previous studies have shown brain reorganization after early deprivation of auditory input. However, changes in grey matter connectivity have not yet been investigated in prelingually deaf adolescents. In the present study, we aimed to investigate changes in grey matter connectivity within and between the auditory, language and visual systems in prelingually deaf adolescents. We recruited 16 prelingually deaf adolescents and 16 age- and gender-matched normal controls, and extracted grey matter volume as the structural characteristic from 14 regions of interest involved in auditory, language or visual processing to investigate the changes in grey matter connectivity within and between the auditory, language and visual systems. Sparse inverse covariance estimation (SICE) was utilized to construct grey matter connectivity between these brain regions. The results show that prelingually deaf adolescents present weaker grey matter connectivity within the auditory and visual systems, and that connectivity between the language and visual systems is reduced. Notably, significantly increased connectivity was found between the auditory and visual systems in prelingually deaf adolescents. Our results indicate "cross-modal" plasticity after deprivation of auditory input in prelingually deaf adolescents, especially between the auditory and visual systems. Besides, auditory deprivation and visual deficits might affect the connectivity pattern within the language and visual systems in prelingually deaf adolescents.
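
    Sparse inverse covariance estimation of the kind named above can be illustrated with scikit-learn's graphical lasso, where nonzero off-diagonal entries of the estimated precision matrix are read as conditional connections between ROIs. The data below are random stand-ins and the penalty is arbitrary; this sketches only the SICE step, not the study's pipeline or its exact estimator settings.

    ```python
    import numpy as np
    from sklearn.covariance import GraphicalLasso

    # Illustrative SICE (graphical lasso) over ROI-wise grey matter volumes.
    # Rows are subjects, columns are ROIs; values are random stand-ins.
    rng = np.random.default_rng(0)
    n_subjects, n_rois = 32, 14
    grey_matter_volumes = rng.standard_normal((n_subjects, n_rois))

    model = GraphicalLasso(alpha=0.5).fit(grey_matter_volumes)
    precision = model.precision_                      # sparse inverse covariance matrix

    # Nonzero off-diagonal entries are read as (conditional) ROI-ROI connections.
    connected = np.abs(precision) > 1e-6
    np.fill_diagonal(connected, False)
    print("number of estimated connections:", connected.sum() // 2)
    ```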

  8. 38 CFR 17.149 - Sensori-neural aids.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... attendance or by reason of being permanently housebound; (6) Those who have a visual or hearing impairment... normally occurring visual or hearing impairments; and (8) Those visually or hearing impaired so severely... frequency ranges which contribute to a loss of communication ability; however, hearing aids are to be...

  9. Methods and Apparatus for Autonomous Robotic Control

    NASA Technical Reports Server (NTRS)

    Gorshechnikov, Anatoly (Inventor); Livitz, Gennady (Inventor); Versace, Massimiliano (Inventor); Palma, Jesse (Inventor)

    2017-01-01

    Sensory processing of visual, auditory, and other sensor information (e.g., visual imagery, LIDAR, RADAR) is conventionally based on "stovepiped," or isolated processing, with little interactions between modules. Biological systems, on the other hand, fuse multi-sensory information to identify nearby objects of interest more quickly, more efficiently, and with higher signal-to-noise ratios. Similarly, examples of the OpenSense technology disclosed herein use neurally inspired processing to identify and locate objects in a robot's environment. This enables the robot to navigate its environment more quickly and with lower computational and power requirements.

  10. Attention modulates specific motor cortical circuits recruited by transcranial magnetic stimulation.

    PubMed

    Mirdamadi, J L; Suzuki, L Y; Meehan, S K

    2017-09-17

    Skilled performance and acquisition are dependent upon afferent input to motor cortex. The present study used short-latency afferent inhibition (SAI) to probe how manipulation of sensory afference by attention affects different circuits projecting to pyramidal tract neurons in motor cortex. SAI was assessed in the first dorsal interosseous muscle while participants performed a low or high attention-demanding visual detection task. SAI was evoked by preceding a suprathreshold transcranial magnetic stimulus with electrical stimulation of the median nerve at the wrist. To isolate different afferent intracortical circuits in motor cortex, SAI was evoked using either posterior-anterior (PA) or anterior-posterior (AP) monophasic current. In an independent sample, somatosensory processing during the same attention-demanding visual detection tasks was assessed using somatosensory-evoked potentials (SEPs) elicited by median nerve stimulation. SAI elicited by AP TMS was reduced under high compared to low visual attention demands. SAI elicited by PA TMS was not affected by visual attention demands. SEPs revealed that the high visual attention load reduced the fronto-central P20-N30 but not the contralateral parietal N20-P25 SEP component. The P20-N30 reduction confirmed that the visual attention task altered sensory afference. The current results offer further support that PA and AP TMS recruit different neuronal circuits. AP circuits may be one substrate by which cognitive strategies shape sensorimotor processing during skilled movement by altering sensory processing in premotor areas. Copyright © 2017 IBRO. Published by Elsevier Ltd. All rights reserved.

  11. The origins of metamodality in visual object area LO: Bodily topographical biases and increased functional connectivity to S1

    PubMed Central

    Tal, Zohar; Geva, Ran; Amedi, Amir

    2016-01-01

    Recent evidence from blind participants suggests that visual areas are task-oriented and sensory modality input independent rather than sensory-specific to vision. Specifically, visual areas are thought to retain their functional selectivity when using non-visual inputs (touch or sound) even without having any visual experience. However, this theory is still controversial since it is not clear whether this also characterizes the sighted brain, and whether the reported results in the sighted reflect basic fundamental a-modal processes or are an epiphenomenon to a large extent. In the current study, we addressed these questions using a series of fMRI experiments aimed to explore visual cortex responses to passive touch on various body parts and the coupling between the parietal and visual cortices as manifested by functional connectivity. We show that passive touch robustly activated the object selective parts of the lateral–occipital (LO) cortex while deactivating almost all other occipital–retinotopic-areas. Furthermore, passive touch responses in the visual cortex were specific to hand and upper trunk stimulations. Psychophysiological interaction (PPI) analysis suggests that LO is functionally connected to the hand area in the primary somatosensory homunculus (S1), during hand and shoulder stimulations but not to any of the other body parts. We suggest that LO is a fundamental hub that serves as a node between visual-object selective areas and S1 hand representation, probably due to the critical evolutionary role of touch in object recognition and manipulation. These results might also point to a more general principle suggesting that recruitment or deactivation of the visual cortex by other sensory input depends on the ecological relevance of the information conveyed by this input to the task/computations carried out by each area or network. This is likely to rely on the unique and differential pattern of connectivity for each visual area with the rest of the brain. PMID:26673114
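
    The psychophysiological interaction (PPI) analysis mentioned above tests whether seed-target coupling changes with task context by regressing the target time course on the seed, the task, and their product. The sketch below shows that regression structure with simulated data; a real analysis would use deconvolved, HRF-convolved regressors in a package such as SPM or FSL, and the coupling values here are arbitrary.

    ```python
    import numpy as np

    # Minimal PPI sketch: target ~ intercept + task + seed + (task x seed).
    # Data are simulated; the "seed" stands in for S1 hand-area activity and
    # the "target" for LO, with made-up coupling strengths.
    rng = np.random.default_rng(1)
    n_scans = 200

    task = np.repeat([0.0, 1.0], n_scans // 2)                # e.g., hand stimulation off/on
    seed = rng.standard_normal(n_scans)                       # seed-region time course (simulated)
    ppi = seed * (task - task.mean())                         # interaction regressor

    # Target whose coupling to the seed increases during the task (illustrative).
    target = 0.2 * seed + 0.6 * ppi + rng.standard_normal(n_scans)

    X = np.column_stack([np.ones(n_scans), task, seed, ppi])  # design matrix
    betas, *_ = np.linalg.lstsq(X, target, rcond=None)
    print("PPI (context-dependent coupling) beta:", round(betas[3], 2))
    ```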

  12. Functional double dissociation within the entorhinal cortex for visual scene-dependent choice behavior

    PubMed Central

    Yoo, Seung-Woo; Lee, Inah

    2017-01-01

    How visual scene memory is processed differentially by the upstream structures of the hippocampus is largely unknown. We sought to dissociate functionally the lateral and medial subdivisions of the entorhinal cortex (LEC and MEC, respectively) in visual scene-dependent tasks by temporarily inactivating the LEC and MEC in the same rat. When the rat made spatial choices in a T-maze using visual scenes displayed on LCD screens, the inactivation of the MEC but not the LEC produced severe deficits in performance. However, when the task required the animal to push a jar or to dig in the sand in the jar using the same scene stimuli, the LEC but not the MEC became important. Our findings suggest that the entorhinal cortex is critical for scene-dependent mnemonic behavior, and the response modality may interact with a sensory modality to determine the involvement of the LEC and MEC in scene-based memory tasks. DOI: http://dx.doi.org/10.7554/eLife.21543.001 PMID:28169828

  13. Assessing morphology and function of the semicircular duct system: introducing new in-situ visualization and software toolbox

    PubMed Central

    David, R.; Stoessel, A.; Berthoz, A.; Spoor, F.; Bennequin, D.

    2016-01-01

    The semicircular duct system is part of the sensory organ of balance and essential for navigation and spatial awareness in vertebrates. Its function in detecting head rotations has been modelled with increasing sophistication, but the biomechanics of actual semicircular duct systems has rarely been analyzed, foremost because the fragile membranous structures in the inner ear are hard to visualize undistorted and in full. Here we present a new, easy-to-apply and non-invasive method for three-dimensional in-situ visualization and quantification of the semicircular duct system, using X-ray micro tomography and tissue staining with phosphotungstic acid. Moreover, we introduce Ariadne, a software toolbox which provides comprehensive and improved morphological and functional analysis of any visualized duct system. We demonstrate the potential of these methods by presenting results for the duct system of humans, the squirrel monkey and the rhesus macaque, making comparisons with past results from neurophysiological, oculometric and biomechanical studies. Ariadne is freely available at http://www.earbank.org. PMID:27604473

  14. The persistence of a visual dominance effect in a telemanipulator task: A comparison between visual and electrotactile feedback

    NASA Technical Reports Server (NTRS)

    Gaillard, J. P.

    1981-01-01

    The possibility of using electrotactile stimulation in teleoperation, and of interpreting such information as feedback to the operator, was investigated. It is proposed that visual feedback is more informative than electrotactile feedback, and that complex electrotactile feedback slows down both the motor decision and motor response processes, is processed as an all-or-nothing signal, and bypasses the receptive structure to access directly a working memory in which information is processed sequentially and treatment capacity is limited. The electrotactile stimulation is used as an alerting signal. It is suggested that the visual dominance effect results from the advantage of both a transfer function and a sensory memory register where information is pretreated and memorized for a short time. It is found that dividing attention has an effect on the acquisition of the information but not on the subsequent decision processes.

  15. Spatial resolution in visual memory.

    PubMed

    Ben-Shalom, Asaf; Ganel, Tzvi

    2015-04-01

    Representations in visual short-term memory are considered to contain relatively elaborated information on object structure. Conversely, representations in earlier stages of the visual hierarchy are thought to be dominated by a sensory-based, feed-forward buildup of information. In four experiments, we compared the spatial resolution of different object properties between two points in time along the processing hierarchy in visual short-term memory. Subjects were asked either to estimate the distance between objects or to estimate the size of one of the objects' features under two experimental conditions, of either a short or a long delay period between the presentation of the target stimulus and the probe. When different objects were referred to, similar spatial resolution was found for the two delay periods, suggesting that initial processing stages are sensitive to object-based properties. Conversely, superior resolution was found for the short, as compared with the long, delay when features were referred to. These findings suggest that initial representations in visual memory are hybrid in that they allow fine-grained resolution for object features alongside normal visual sensitivity to the segregation between objects. The findings are also discussed in reference to the distinction made in earlier studies between visual short-term memory and iconic memory.

  16. Sensory and demographic characteristics of deafblindness rehabilitation clients in Montréal, Canada.

    PubMed

    Wittich, Walter; Watanabe, Donald H; Gagné, Jean-Pierre

    2012-05-01

    Demographic changes are increasing the number of older adults with combined age-related vision and hearing loss, while medical advances increase the survival probability of children with congenital dual (or multiple) impairments due to prematurity or rare hereditary diseases. Rehabilitation services for these populations are in high demand, since traditional uni-sensory rehabilitation approaches that use the other sense to compensate are not always applicable. Very little is currently known about the characteristics of the client population with dual sensory impairment. The present study provides information about demographic and sensory variables of persons in the Montreal region who were receiving rehabilitation for dual impairment in December 2010. This information can inform researchers, clinicians, educators, as well as administrators about potential research and service delivery priorities. A chart review of all client files across the three rehabilitation agencies that offer integrated dual sensory rehabilitation services in Montreal provided data on visual acuity, visual field, hearing detection thresholds, and demographic variables. The 209 males and 355 females ranged in age from 4 months to 105 years (M = 71.9, S.D. = 24.6), indicating a prevalence estimate for dual sensory impairment of 15 per 100,000. Only 5.7% were under 18 years of age, while 69.1% were over the age of 65 years and 43.1% were over the age of 85 years. The diagnostic combination that accounted for 31% of the entire sample was age-related macular degeneration with presbycusis. The visual and auditory measures indicated that older adults were likely to fall into moderate to severe levels of impairment on both measures. Individuals with Usher Syndrome comprised 20.9% (n = 118) of the sample. The age distribution in this sample of persons with dual sensory impairment indicates that service delivery planning will need to strongly consider the growing presence of older adults as the baby-boomers approach retirement age. The distribution of their visual and auditory limits indicates that the large majority of this client group has residual vision and hearing that can be maximized in the rehabilitation process in order to restore functional abilities and social participation. Future research in this area should identify the specific priorities in both rehabilitation and research for individuals affected by combined vision and hearing loss. Ophthalmic & Physiological Optics © 2012 The College of Optometrists.

  17. Visual attention: low-level and high-level viewpoints

    NASA Astrophysics Data System (ADS)

    Stentiford, Fred W. M.

    2012-06-01

    This paper provides a brief outline of approaches to modeling human visual attention. Bottom-up and top-down mechanisms are described, together with some of the problems that they face. It has been suggested in brain science that memory functions by trading measurement precision for associative power: sensory inputs from the environment are never identical on separate occasions, but the associations with memory compensate for the differences. A graphical representation for image similarity is described that relies on the size of maximally associative structures (cliques) found between pairs of images. This is applied to the recognition of movie posters, the location and recognition of characters, and the recognition of faces. The similarity mechanism is shown to model pop-out effects when constraints are placed on the physical separation of pixels that correspond to nodes in the maximal cliques. The effect extends to modeling human visual behaviour on the Poggendorff illusion.
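
    The clique-based similarity idea can be sketched with networkx: nodes stand for candidate correspondences between the two images, edges connect mutually consistent correspondences, and the size of the largest maximal clique serves as the similarity score. The graph below is hand-built for illustration; the paper's construction from pixel-level associations is considerably more elaborate.

    ```python
    import networkx as nx

    # Toy clique-based similarity: nodes are candidate correspondences between
    # features of image A and image B; edges mark pairwise-consistent
    # correspondences (e.g., those preserving relative pixel separations).
    G = nx.Graph()
    correspondences = [("a1", "b1"), ("a2", "b2"), ("a3", "b3"), ("a1", "b2")]
    G.add_nodes_from(correspondences)

    # Assume the first three correspondences are mutually consistent, while
    # ("a1", "b2") conflicts with them (illustrative choice).
    G.add_edges_from([
        (("a1", "b1"), ("a2", "b2")),
        (("a1", "b1"), ("a3", "b3")),
        (("a2", "b2"), ("a3", "b3")),
    ])

    largest_clique = max(nx.find_cliques(G), key=len)
    print("similarity score (largest clique size):", len(largest_clique))
    ```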

  18. Visual Working Memory Enhances the Neural Response to Matching Visual Input.

    PubMed

    Gayet, Surya; Guggenmos, Matthias; Christophel, Thomas B; Haynes, John-Dylan; Paffen, Chris L E; Van der Stigchel, Stefan; Sterzer, Philipp

    2017-07-12

    Visual working memory (VWM) is used to maintain visual information available for subsequent goal-directed behavior. The content of VWM has been shown to affect the behavioral response to concurrent visual input, suggesting that visual representations originating from VWM and from sensory input draw upon a shared neural substrate (i.e., a sensory recruitment stance on VWM storage). Here, we hypothesized that visual information maintained in VWM would enhance the neural response to concurrent visual input that matches the content of VWM. To test this hypothesis, we measured fMRI BOLD responses to task-irrelevant stimuli acquired from 15 human participants (three males) performing a concurrent delayed match-to-sample task. In this task, observers were sequentially presented with two shape stimuli and a retro-cue indicating which of the two shapes should be memorized for subsequent recognition. During the retention interval, a task-irrelevant shape (the probe) was briefly presented in the peripheral visual field, which could either match or mismatch the shape category of the memorized stimulus. We show that this probe stimulus elicited a stronger BOLD response, and allowed for increased shape-classification performance, when it matched rather than mismatched the concurrently memorized content, despite identical visual stimulation. Our results demonstrate that VWM enhances the neural response to concurrent visual input in a content-specific way. This finding is consistent with the view that neural populations involved in sensory processing are recruited for VWM storage, and it provides a common explanation for a plethora of behavioral studies in which VWM-matching visual input elicits a stronger behavioral and perceptual response. SIGNIFICANCE STATEMENT Humans heavily rely on visual information to interact with their environment and frequently must memorize such information for later use. Visual working memory allows for maintaining such visual information in the mind's eye after termination of its retinal input. It is hypothesized that information maintained in visual working memory relies on the same neural populations that process visual input. Accordingly, the content of visual working memory is known to affect our conscious perception of concurrent visual input. Here, we demonstrate for the first time that visual input elicits an enhanced neural response when it matches the content of visual working memory, both in terms of signal strength and information content. Copyright © 2017 the authors 0270-6474/17/376638-10$15.00/0.

  19. Visual cortex extrastriate body-selective area activation in congenitally blind people "seeing" by using sounds.

    PubMed

    Striem-Amit, Ella; Amedi, Amir

    2014-03-17

    Vision is by far the most prevalent sense for experiencing others' body shapes, postures, actions, and intentions, and its congenital absence may dramatically hamper body-shape representation in the brain. We investigated whether the absence of visual experience and limited exposure to others' body shapes could still lead to body-shape selectivity. We taught congenitally fully-blind adults to perceive full-body shapes conveyed through a sensory-substitution algorithm topographically translating images into soundscapes [1]. Despite the limited experience of the congenitally blind with external body shapes (via touch of close-by bodies and for ~10 hr via soundscapes), once the blind could retrieve body shapes via soundscapes, they robustly activated the visual cortex, specifically the extrastriate body area (EBA; [2]). Furthermore, body selectivity versus textures, objects, and faces in both the blind and sighted control groups was not found in the temporal (auditory) or parietal (somatosensory) cortex but only in the visual EBA. Finally, resting-state data showed that the blind EBA is functionally connected to the temporal cortex temporal-parietal junction/superior temporal sulcus Theory-of-Mind areas [3]. Thus, the EBA preference is present without visual experience and with little exposure to external body-shape information, supporting the view that the brain has a sensory-independent, task-selective supramodal organization rather than a sensory-specific organization. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Distributed patterns of activity in sensory cortex reflect the precision of multiple items maintained in visual short-term memory.

    PubMed

    Emrich, Stephen M; Riggall, Adam C; Larocque, Joshua J; Postle, Bradley R

    2013-04-10

    Traditionally, load sensitivity of sustained, elevated activity has been taken as an index of storage for a limited number of items in visual short-term memory (VSTM). Recently, studies have demonstrated that the contents of a single item held in VSTM can be decoded from early visual cortex, despite the fact that these areas do not exhibit elevated, sustained activity. It is unknown, however, whether the patterns of neural activity decoded from sensory cortex change as a function of load, as one would expect from a region storing multiple representations. Here, we use multivoxel pattern analysis to examine the neural representations of VSTM in humans across multiple memory loads. In an important extension of previous findings, our results demonstrate that the contents of VSTM can be decoded from areas that exhibit a transient response to visual stimuli, but not from regions that exhibit elevated, sustained load-sensitive delay-period activity. Moreover, the neural information present in these transiently activated areas decreases significantly with increasing load, indicating load sensitivity of the patterns of activity that support VSTM maintenance. Importantly, the decrease in classification performance as a function of load is correlated with within-subject changes in mnemonic resolution. These findings indicate that distributed patterns of neural activity in putatively sensory visual cortex support the representation and precision of information in VSTM.
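
    Multivoxel pattern analysis of the kind used above can be illustrated with a cross-validated linear classifier applied separately at each memory load. The data below are synthetic, with a signal term that is deliberately made to shrink with load purely to mimic the reported pattern; the classifier choice, feature counts, and signal model are all assumptions, not the study's settings.

    ```python
    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import cross_val_score

    # Schematic MVPA: decode the remembered stimulus class from voxel patterns,
    # separately for each memory load, and compare decoding accuracies.
    rng = np.random.default_rng(0)
    n_trials, n_voxels = 120, 300

    for load in (1, 2, 3):
        labels = rng.integers(0, 2, size=n_trials)             # remembered class (e.g., direction bin)
        patterns = rng.standard_normal((n_trials, n_voxels))
        patterns += (0.3 / load) * labels[:, None]              # weaker signal at higher load (illustrative)
        acc = cross_val_score(LinearSVC(), patterns, labels, cv=5).mean()
        print(f"load {load}: mean decoding accuracy {acc:.2f}")
    ```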

  1. Parallel pathways for cross-modal memory retrieval in Drosophila.

    PubMed

    Zhang, Xiaonan; Ren, Qingzhong; Guo, Aike

    2013-05-15

    Memory-retrieval processing of cross-modal sensory preconditioning is vital for understanding the plasticity underlying the interactions between modalities. As part of the sensory preconditioning paradigm, it has been hypothesized that the conditioned response to an unreinforced cue depends on the memory of the reinforced cue via a sensory link between the two cues. To test this hypothesis, we studied cross-modal memory-retrieval processing in a genetically tractable model organism, Drosophila melanogaster. By expressing the dominant temperature-sensitive shibire(ts1) (shi(ts1)) transgene, which blocks synaptic vesicle recycling of specific neural subsets with the Gal4/UAS system at the restrictive temperature, we specifically blocked visual and olfactory memory retrieval, either alone or in combination; memory acquisition remained intact for these modalities. Blocking the memory retrieval of the reinforced olfactory cues did not impair the conditioned response to the unreinforced visual cues or vice versa, in contrast to the canonical memory-retrieval processing of sensory preconditioning. In addition, these conditioned responses can be abolished by blocking the memory retrieval of the two modalities simultaneously. In sum, our results indicated that a conditioned response to an unreinforced cue in cross-modal sensory preconditioning can be recalled through parallel pathways.

  2. Predictive Coding or Evidence Accumulation? False Inference and Neuronal Fluctuations

    PubMed Central

    Friston, Karl J.; Kleinschmidt, Andreas

    2010-01-01

    Perceptual decisions can be made when sensory input affords an inference about what generated that input. Here, we report findings from two independent perceptual experiments conducted during functional magnetic resonance imaging (fMRI) with a sparse event-related design. The first experiment, in the visual modality, involved forced-choice discrimination of coherence in random dot kinematograms that contained either subliminal or periliminal motion coherence. The second experiment, in the auditory domain, involved free response detection of (non-semantic) near-threshold acoustic stimuli. We analysed fluctuations in ongoing neural activity, as indexed by fMRI, and found that neuronal activity in sensory areas (extrastriate visual and early auditory cortex) biases perceptual decisions towards correct inference and not towards a specific percept. Hits (detection of near-threshold stimuli) were preceded by significantly higher activity than both misses of identical stimuli or false alarms, in which percepts arise in the absence of appropriate sensory input. In accord with predictive coding models and the free-energy principle, this observation suggests that cortical activity in sensory brain areas reflects the precision of prediction errors and not just the sensory evidence or prediction errors per se. PMID:20369004

  3. Intracranial Cortical Responses during Visual–Tactile Integration in Humans

    PubMed Central

    Quinn, Brian T.; Carlson, Chad; Doyle, Werner; Cash, Sydney S.; Devinsky, Orrin; Spence, Charles; Halgren, Eric

    2014-01-01

    Sensory integration of touch and sight is crucial to perceiving and navigating the environment. While recent evidence from other sensory modality combinations suggests that low-level sensory areas integrate multisensory information at early processing stages, little is known about how the brain combines visual and tactile information. We investigated the dynamics of multisensory integration between vision and touch using the high spatial and temporal resolution of intracranial electrocorticography in humans. We present a novel, two-step metric for defining multisensory integration. The first step compares the sum of the unisensory responses with the bimodal response to identify candidate multisensory interactions. The second step eliminates the possibility that double addition of sensory responses could be misinterpreted as interactions. Using these criteria, averaged local field potentials and high-gamma-band power demonstrate a functional processing cascade whereby sensory integration occurs late, both anatomically and temporally, in the temporo–parieto–occipital junction (TPOJ) and dorsolateral prefrontal cortex. Results further suggest two neurophysiologically distinct and temporally separated integration mechanisms in TPOJ, while providing direct evidence for local suppression as a dominant mechanism for synthesizing visual and tactile input. These results tend to support earlier concepts of multisensory integration as relatively late and centered in tertiary multimodal association cortices. PMID:24381279
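
    The first step of the metric, comparing the bimodal response with the sum of the unisensory responses, reduces to a simple additivity check. The sketch below applies it to simulated trial-averaged responses; the statistical testing and the second, double-addition control step follow the paper only loosely here.

    ```python
    import numpy as np

    # Additivity check for multisensory integration: compare the mean bimodal
    # (visual+tactile) response with the sum of the mean unisensory responses
    # at each time point. Data are random stand-ins with an arbitrary offset.
    rng = np.random.default_rng(0)
    n_trials, n_samples = 60, 300                       # trials x time points

    visual = rng.standard_normal((n_trials, n_samples))
    tactile = rng.standard_normal((n_trials, n_samples))
    bimodal = rng.standard_normal((n_trials, n_samples)) + 0.3   # illustrative superadditive offset

    predicted_additive = visual.mean(axis=0) + tactile.mean(axis=0)
    observed = bimodal.mean(axis=0)
    interaction = observed - predicted_additive          # nonzero -> candidate integration

    print("mean |bimodal - (V + T)| =", round(np.abs(interaction).mean(), 3))
    ```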

  4. Multichannel brain recordings in behaving Drosophila reveal oscillatory activity and local coherence in response to sensory stimulation and circuit activation

    PubMed Central

    Paulk, Angelique C.; Zhou, Yanqiong; Stratton, Peter; Liu, Li

    2013-01-01

    Neural networks in vertebrates exhibit endogenous oscillations that have been associated with functions ranging from sensory processing to locomotion. It remains unclear whether oscillations may play a similar role in the insect brain. We describe a novel “whole brain” readout for Drosophila melanogaster using a simple multichannel recording preparation to study electrical activity across the brain of flies exposed to different sensory stimuli. We recorded local field potential (LFP) activity from >2,000 registered recording sites across the fly brain in >200 wild-type and transgenic animals to uncover specific LFP frequency bands that correlate with: 1) brain region; 2) sensory modality (olfactory, visual, or mechanosensory); and 3) activity in specific neural circuits. We found endogenous and stimulus-specific oscillations throughout the fly brain. Central (higher-order) brain regions exhibited sensory modality-specific increases in power within narrow frequency bands. Conversely, in sensory brain regions such as the optic or antennal lobes, LFP coherence, rather than power, best defined sensory responses across modalities. By transiently activating specific circuits via expression of TrpA1, we found that several circuits in the fly brain modulate LFP power and coherence across brain regions and frequency domains. However, activation of a neuromodulatory octopaminergic circuit specifically increased neuronal coherence in the optic lobes during visual stimulation while decreasing coherence in central brain regions. Our multichannel recording and brain registration approach provides an effective way to track activity simultaneously across the fly brain in vivo, allowing investigation of functional roles for oscillations in processing sensory stimuli and modulating behavior. PMID:23864378
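
    The two LFP measures emphasized above, band power at single sites and coherence between pairs of sites, can be computed with scipy.signal. The signals, sampling rate, and frequency band in the sketch below are assumptions for illustration, not values taken from the recordings.

    ```python
    import numpy as np
    from scipy.signal import welch, coherence

    # Band power at one recording site (Welch periodogram) and coherence
    # between two sites, on synthetic signals sharing a 35 Hz component.
    fs = 1000.0                                  # Hz, assumed sampling rate
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(0)

    site_a = np.sin(2 * np.pi * 35 * t) + rng.standard_normal(t.size)
    site_b = np.sin(2 * np.pi * 35 * t + 0.5) + rng.standard_normal(t.size)

    f_p, pxx = welch(site_a, fs=fs, nperseg=1024)
    f_c, cxy = coherence(site_a, site_b, fs=fs, nperseg=1024)

    band_p = (f_p >= 30) & (f_p <= 45)            # example band, chosen arbitrarily
    band_c = (f_c >= 30) & (f_c <= 45)
    print("30-45 Hz power (a.u.):", round(pxx[band_p].mean(), 3))
    print("30-45 Hz coherence:", round(cxy[band_c].mean(), 3))
    ```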

  5. Validity of Sensory Systems as Distinct Constructs

    PubMed Central

    Su, Chia-Ting

    2014-01-01

    This study investigated the validity of sensory systems as distinct measurable constructs as part of a larger project examining Ayres’s theory of sensory integration. Confirmatory factor analysis (CFA) was conducted to test whether sensory questionnaire items represent distinct sensory system constructs. Data were obtained from clinical records of two age groups, 2- to 5-yr-olds (n = 231) and 6- to 10-yr-olds (n = 223). With each group, we tested several CFA models for goodness of fit with the data. The accepted model was identical for each group and indicated that tactile, vestibular–proprioceptive, visual, and auditory systems form distinct, valid factors that are not age dependent. In contrast, alternative models that grouped items according to sensory processing problems (e.g., over- or underresponsiveness within or across sensory systems) did not yield valid factors. Results indicate that distinct sensory system constructs can be measured validly using questionnaire data. PMID:25184467

  6. Sinking Maps: A Conceptual Tool for Visual Metaphor

    ERIC Educational Resources Information Center

    Giampa, Joan Marie

    2012-01-01

    Sinking maps, created by Northern Virginia Community College professor Joan Marie Giampa, are tools that teach fine art students how to construct visual metaphor by conceptually mapping sensory perceptions. Her dissertation answers the question, "Can visual metaphor be conceptually mapped in the art classroom?" In the Prologue, Giampa…

  7. Slipped Lips: Onset Asynchrony Detection of Auditory-Visual Language in Autism

    ERIC Educational Resources Information Center

    Grossman, Ruth B.; Schneps, Matthew H.; Tager-Flusberg, Helen

    2009-01-01

    Background: It has frequently been suggested that individuals with autism spectrum disorder (ASD) have deficits in auditory-visual (AV) sensory integration. Studies of language integration have mostly used non-word syllables presented in congruent and incongruent AV combinations and demonstrated reduced influence of visual speech in individuals…

  8. Motor imagery learning modulates functional connectivity of multiple brain systems in resting state.

    PubMed

    Zhang, Hang; Long, Zhiying; Ge, Ruiyang; Xu, Lele; Jin, Zhen; Yao, Li; Liu, Yijun

    2014-01-01

    Learning motor skills involves subsequent modulation of resting-state functional connectivity in the sensory-motor system. This idea was mostly derived from investigations of motor execution learning, which mainly recruits the processing of sensory-motor information. Behavioral evidence has demonstrated that motor skills in our daily lives can be learned through imagery procedures. However, it remains unclear whether the modulation of resting-state functional connectivity also exists in the sensory-motor system after motor imagery learning. We performed an fMRI investigation of motor imagery learning from resting state. Based on previous studies, we identified eight sensory and cognitive resting-state networks (RSNs) corresponding to the brain systems and further explored the functional connectivity of these RSNs by assessing connectivity and network strengths before and after two weeks of consecutive learning. Two intriguing results were revealed: (1) the sensory RSNs, specifically the sensory-motor and lateral visual networks, exhibited greater connectivity strengths in the precuneus and fusiform gyrus after learning; (2) decreased network strength induced by learning was found in the default mode network, a cognitive RSN. These results indicate that resting-state functional connectivity can be modulated by motor imagery learning in multiple brain systems, and such modulation in the sensory-motor, visual and default brain systems may be associated with the establishment of motor schemata and the regulation of introspective thought. These findings further reveal the neural substrates underlying motor skill learning and potentially provide new insights into the therapeutic benefits of motor imagery learning.

  9. Touch Precision Modulates Visual Bias.

    PubMed

    Misceo, Giovanni F; Jones, Maurice D

    2018-01-01

    The sensory precision hypothesis holds that different seen and felt cues about the size of an object resolve themselves in favor of the more reliable modality. To examine this precision hypothesis, 60 college students were asked to look at one size while manually exploring another unseen size either with their bare fingers or, to lessen the reliability of touch, with their fingers sleeved in rigid tubes. Afterwards, the participants estimated either the seen size or the felt size by finding a match from a visual display of various sizes. Results showed that the seen size biased the estimates of the felt size when the reliability of touch decreased. This finding supports the interaction between touch reliability and visual bias predicted by statistically optimal models of sensory integration.

  10. Efficient encoding of motion is mediated by gap junctions in the fly visual system.

    PubMed

    Wang, Siwei; Borst, Alexander; Zaslavsky, Noga; Tishby, Naftali; Segev, Idan

    2017-12-01

    Understanding the computational implications of specific synaptic connectivity patterns is a fundamental goal in neuroscience. In particular, the computational role of ubiquitous electrical synapses operating via gap junctions remains elusive. In the fly visual system, the cells in the vertical-system network, which play a key role in visual processing, primarily connect to each other via axonal gap junctions. This network therefore provides a unique opportunity to explore the functional role of gap junctions in sensory information processing. Our information theoretical analysis of a realistic VS network model shows that within 10 ms following the onset of the visual input, the presence of axonal gap junctions enables the VS system to efficiently encode the axis of rotation, θ, of the fly's ego motion. This encoding efficiency, measured in bits, is near-optimal with respect to the physical limits of performance determined by the statistical structure of the visual input itself. The VS network is known to be connected to downstream pathways via a subset of triplets of the vertical system cells; we found that because of the axonal gap junctions, the efficiency of this subpopulation in encoding θ is superior to that of the whole vertical system network and is robust to a wide range of signal to noise ratios. We further demonstrate that this efficient encoding of motion by this subpopulation is necessary for the fly's visually guided behavior, such as banked turns in evasive maneuvers. Because gap junctions are formed among the axons of the vertical system cells, they only impact the system's readout, while maintaining the dendritic input intact, suggesting that the computational principles implemented by neural circuitries may be much richer than previously appreciated based on point neuron models. Our study provides new insights as to how specific network connectivity leads to efficient encoding of sensory stimuli.
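    The encoding efficiency reported here is an information-theoretic quantity: the mutual information, in bits, between the axis of rotation θ and the population response. As a hypothetical illustration of how such a quantity can be estimated from discretized stimulus-response samples (a plug-in histogram estimator, not the authors' analysis pipeline):

        import numpy as np

        def mutual_information_bits(stim, resp, bins=16):
            """Plug-in estimate of I(stimulus; response) in bits from paired samples."""
            joint, _, _ = np.histogram2d(stim, resp, bins=bins)
            p_xy = joint / joint.sum()
            p_x = p_xy.sum(axis=1, keepdims=True)
            p_y = p_xy.sum(axis=0, keepdims=True)
            nz = p_xy > 0
            return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))

        # Hypothetical example: a noisy readout of the rotation axis theta.
        rng = np.random.default_rng(0)
        theta = rng.uniform(-np.pi, np.pi, 5000)               # axis of rotation
        readout = theta + rng.normal(0.0, 0.3, theta.size)     # noisy population readout
        print(f"I(theta; readout) ~ {mutual_information_bits(theta, readout):.2f} bits")

    Comparing such an estimate against the information available in the visual input itself is what allows encoding to be called near-optimal with respect to the physical limits of performance.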

  11. Sensory-driven and spontaneous gamma oscillations engage distinct cortical circuitry

    PubMed Central

    2015-01-01

    Gamma oscillations are a robust component of sensory responses but are also part of the background spontaneous activity of the brain. To determine whether the properties of gamma oscillations in cortex are specific to their mechanism of generation, we compared in mouse visual cortex in vivo the laminar geometry and single-neuron rhythmicity of oscillations produced during sensory representation with those occurring spontaneously in the absence of stimulation. In mouse visual cortex under anesthesia (isoflurane and xylazine), visual stimulation triggered oscillations mainly between 20 and 50 Hz, which, because of their similar functional significance to gamma oscillations in higher mammals, we define here as gamma range. Sensory representation in visual cortex specifically increased gamma oscillation amplitude in the supragranular (L2/3) and granular (L4) layers and strongly entrained putative excitatory and inhibitory neurons in infragranular layers, while spontaneous gamma oscillations were distributed evenly through the cortical depth and primarily entrained putative inhibitory neurons in the infragranular (L5/6) cortical layers. The difference in laminar distribution of gamma oscillations during the two different conditions may result from differences in the source of excitatory input to the cortex. In addition, modulation of superficial gamma oscillation amplitude did not result in a corresponding change in deep-layer oscillations, suggesting that superficial and deep layers of cortex may utilize independent but related networks for gamma generation. These results demonstrate that stimulus-driven gamma oscillations engage cortical circuitry in a manner distinct from spontaneous oscillations and suggest multiple networks for the generation of gamma oscillations in cortex. PMID:26719085

  12. Eye Movement as an Indicator of Sensory Components in Thought.

    ERIC Educational Resources Information Center

    Buckner, Michael; And Others

    1987-01-01

    Investigated Neuro-Linguistic Programming eye movement model's claim that specific eye movements are indicative of specific sensory components in thought. Agreement between students' (N=48) self-reports and trained observers' records support visual and auditory portions of model; do not support kinesthetic portion. Interrater agreement supports…

  13. Sensory Cues, Visualization and Physics Learning

    ERIC Educational Resources Information Center

    Reiner, Miriam

    2009-01-01

    Bodily manipulations, such as juggling, suggest a well-synchronized physical interaction as if the person were a physics expert. The juggler uses "knowledge" that is rooted in bodily experience, to interact with the environment. Such enacted bodily knowledge is powerful, efficient, predictive, and relates to sensory perception of the dynamics of…

  14. Sensory Integration and Ego Development in a Schizophrenic Adolescent Male.

    ERIC Educational Resources Information Center

    Pettit, Karen A.

    1987-01-01

    A retrospective study compared hours spent by a schizophrenic adolescent in "time out" before and after initiation of treatment. The study evaluated the effects of sensory integrative treatment on the ability to handle anger and frustration. Results demonstrate the utility of statistical analysis versus visual comparison to validate effectiveness…

  15. Central Processing Dysfunctions in Children: A Review of Research.

    ERIC Educational Resources Information Center

    Chalfant, James C.; Scheffelin, Margaret A.

    Research on central processing dysfunctions in children is reviewed in three major areas. The first, dysfunctions in the analysis of sensory information, includes auditory, visual, and haptic processing. The second, dysfunction in the synthesis of sensory information, covers multiple stimulus integration and short-term memory. The third area of…

  16. Feature-Selective Attentional Modulations in Human Frontoparietal Cortex.

    PubMed

    Ester, Edward F; Sutterer, David W; Serences, John T; Awh, Edward

    2016-08-03

    Control over visual selection has long been framed in terms of a dichotomy between "source" and "site," where top-down feedback signals originating in frontoparietal cortical areas modulate or bias sensory processing in posterior visual areas. This distinction is motivated in part by observations that frontoparietal cortical areas encode task-level variables (e.g., what stimulus is currently relevant or what motor outputs are appropriate), while posterior sensory areas encode continuous or analog feature representations. Here, we present evidence that challenges this distinction. We used fMRI, a roving searchlight analysis, and an inverted encoding model to examine representations of an elementary feature property (orientation) across the entire human cortical sheet while participants attended either the orientation or luminance of a peripheral grating. Orientation-selective representations were present in a multitude of visual, parietal, and prefrontal cortical areas, including portions of the medial occipital cortex, the lateral parietal cortex, and the superior precentral sulcus (thought to contain the human homolog of the macaque frontal eye fields). Additionally, representations in many, but not all, of these regions were stronger when participants were instructed to attend orientation relative to luminance. Collectively, these findings challenge models that posit a strict segregation between sources and sites of attentional control on the basis of representational properties by demonstrating that simple feature values are encoded by cortical regions throughout the visual processing hierarchy, and that representations in many of these areas are modulated by attention. Influential models of visual attention posit a distinction between top-down control and bottom-up sensory processing networks. These models are motivated in part by demonstrations showing that frontoparietal cortical areas associated with top-down control represent abstract or categorical stimulus information, while visual areas encode parametric feature information. Here, we show that multivariate activity in human visual, parietal, and frontal cortical areas encodes representations of a simple feature property (orientation). Moreover, representations in several (though not all) of these areas were modulated by feature-based attention in a similar fashion. These results provide an important challenge to models that posit dissociable top-down control and sensory processing networks on the basis of representational properties. Copyright © 2016 the authors 0270-6474/16/368188-12$15.00/0.
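    An inverted encoding model of the kind used here first expresses each voxel's response as a weighted sum of orientation-tuned channels, estimates those weights from training data, and then inverts the weights to reconstruct a channel response profile from held-out activation patterns. A schematic sketch under simplified assumptions (simulated data, ordinary least squares, a raised-cosine channel basis); the counts, basis exponent and noise level are illustrative, not the study's parameters:

        import numpy as np

        rng = np.random.default_rng(1)
        n_chan, n_vox, n_trials = 9, 100, 180
        oris = np.linspace(0, 180, n_chan, endpoint=False)            # channel centres (deg)
        trial_oris = rng.choice(oris, size=n_trials)

        def channel_responses(ori_deg):
            """Raised-cosine channel tuning with 180-degree periodicity (trials x channels)."""
            d = np.deg2rad(ori_deg[:, None] - oris[None, :]) * 2.0    # double the angle: 180-deg period
            return (0.5 + 0.5 * np.cos(d)) ** 5

        C = channel_responses(trial_oris)                             # design matrix: channel responses
        W_true = rng.normal(size=(n_chan, n_vox))                     # simulated channel-to-voxel weights
        B = C @ W_true + rng.normal(0.0, 1.0, (n_trials, n_vox))      # simulated voxel patterns

        train, test = slice(0, 120), slice(120, None)
        W_hat = np.linalg.lstsq(C[train], B[train], rcond=None)[0]    # step 1: estimate weights
        C_hat = B[test] @ np.linalg.pinv(W_hat)                       # step 2: invert to channel space

        # Re-centre each reconstructed profile on its trial's orientation and average.
        centre = n_chan // 2
        aligned = np.array([np.roll(profile, centre - int(np.argmin(np.abs(oris - o))))
                            for profile, o in zip(C_hat, trial_oris[test])])
        print(np.round(aligned.mean(axis=0), 2))                      # peak should sit at the centre bin

    The amplitude of the averaged, re-centred profile is the kind of quantity that can then be compared between attend-orientation and attend-luminance conditions.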

  17. Restless 'rest': intrinsic sensory hyperactivity and disinhibition in post-traumatic stress disorder.

    PubMed

    Clancy, Kevin; Ding, Mingzhou; Bernat, Edward; Schmidt, Norman B; Li, Wen

    2017-07-01

    Post-traumatic stress disorder is characterized by exaggerated threat response, and theoretical accounts to date have focused on impaired threat processing and dysregulated prefrontal-cortex-amygdala circuitry. Nevertheless, evidence is accruing for broad, threat-neutral sensory hyperactivity in post-traumatic stress disorder. As low-level, sensory processing impacts higher-order operations, such sensory anomalies can contribute to widespread dysfunctions, presenting an additional aetiological mechanism for post-traumatic stress disorder. To elucidate a sensory pathology of post-traumatic stress disorder, we examined intrinsic visual cortical activity (based on posterior alpha oscillations) and bottom-up sensory-driven causal connectivity (Granger causality in the alpha band) during a resting state (eyes open) and a passive, serial picture viewing state. Compared to patients with generalized anxiety disorder (n = 24) and healthy control subjects (n = 20), patients with post-traumatic stress disorder (n = 25) demonstrated intrinsic sensory hyperactivity (suppressed posterior alpha power, source-localized to the visual cortex-cuneus and precuneus) and bottom-up inhibition deficits (reduced posterior→frontal Granger causality). As sensory input increased from resting to passive picture viewing, patients with post-traumatic stress disorder failed to demonstrate alpha adaptation, highlighting a rigid, set mode of sensory hyperactivity. Interestingly, patients with post-traumatic stress disorder also showed heightened frontal processing (augmented frontal gamma power, source-localized to the superior frontal gyrus and dorsal cingulate cortex), accompanied by attenuated top-down inhibition (reduced frontal→posterior causality). Importantly, not only did suppressed alpha power and bottom-up causality correlate with heightened frontal gamma power, they also correlated with increased severity of sensory and executive dysfunctions (i.e. hypervigilance and impulse control deficits, respectively). Therefore, sensory aberrations help construct a vicious cycle in post-traumatic stress disorder that is in action even at rest, implicating dysregulated triangular sensory-prefrontal-cortex-amygdala circuitry: intrinsic sensory hyperactivity and disinhibition give rise to frontal overload and disrupt executive control, fuelling and perpetuating post-traumatic stress disorder symptoms. Absent in generalized anxiety disorder, these aberrations highlight a unique sensory pathology of post-traumatic stress disorder (ruling out effects merely reflecting anxious hyperarousal), motivating new interventions targeting sensory processing and the sensory brain in these patients. © The Author (2017). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
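    Posterior alpha power of the kind analysed here (an inverse index of visual-cortical excitability) can be summarized from resting EEG by estimating a power spectrum and averaging within the alpha band. A minimal sketch with simulated data; the sampling rate, band limits and amplitudes are illustrative assumptions, not the study's parameters:

        import numpy as np
        from scipy.signal import welch

        fs = 250.0                                    # sampling rate (Hz), illustrative
        t = np.arange(0, 120 * fs) / fs               # two minutes of simulated "resting" EEG
        rng = np.random.default_rng(2)
        # Simulated posterior channel: a 10 Hz alpha rhythm embedded in broadband noise.
        eeg = 4.0 * np.sin(2 * np.pi * 10.0 * t) + rng.normal(0.0, 5.0, t.size)

        freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))    # Welch PSD with 4-s windows
        alpha = (freqs >= 8.0) & (freqs <= 12.0)
        alpha_power = psd[alpha].mean()
        print(f"mean alpha-band (8-12 Hz) power: {alpha_power:.2f} (a.u.)")
        # Lower values over posterior sensors would index the kind of intrinsic
        # sensory hyperactivity (suppressed alpha) described in the abstract.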

  18. A biologically inspired neural model for visual and proprioceptive integration including sensory training.

    PubMed

    Saidi, Maryam; Towhidkhah, Farzad; Gharibzadeh, Shahriar; Lari, Abdolaziz Azizi

    2013-12-01

    Humans perceive the surrounding world by integrating information from different sensory modalities. Earlier models of multisensory integration rely mainly on traditional Bayesian inference for a single cause (source) and causal Bayesian inference for two causes (e.g., for two senses such as the visual and auditory systems), respectively. In this paper a new recurrent neural model is presented for the integration of visual and proprioceptive information. The model is based on population coding, which is able to mimic multisensory integration in neural centers of the human brain. The simulation results agree with those obtained by causal Bayesian inference. The model can also simulate the sensory training process for visual and proprioceptive information in humans. The training process in multisensory integration has received little attention in the literature. The effect of proprioceptive training on multisensory perception was investigated through a set of experiments in our previous study. The current study evaluates the effect of training in both modalities, i.e., visual and proprioceptive training, and compares them with each other through a set of new experiments. In these experiments, the subject was asked to move his/her hand in a circle and estimate its position. The experiments were performed on eight subjects with proprioceptive training and eight subjects with visual training. The results show three important points: (1) the visual learning rate is significantly higher than that of proprioception; (2) mean visual and proprioceptive errors are decreased by training, but statistical analysis shows that this decrement is significant for proprioceptive error and non-significant for visual error; and (3) visual errors in the training phase, even at its beginning, are much smaller than errors in the main test stage, because in the main test the subject has to attend to two senses. The results of these experiments are in agreement with the results of the neural model simulation.
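    For reference, the causal Bayesian inference the model is benchmarked against can be written compactly: under a common cause (C = 1) the visual and proprioceptive cues x_v and x_p are fused with inverse-variance weights, while the posterior probability of a common cause arbitrates between fusion and segregation. This is the standard formulation stated as background, not an equation taken from the paper itself:

        \hat{s}_{\mathrm{fused}} \;=\; \frac{x_v/\sigma_v^2 + x_p/\sigma_p^2}{1/\sigma_v^2 + 1/\sigma_p^2},
        \qquad
        \sigma_{\mathrm{fused}}^2 \;=\; \left(\frac{1}{\sigma_v^2} + \frac{1}{\sigma_p^2}\right)^{-1},

        P(C{=}1 \mid x_v, x_p) \;=\;
        \frac{p(x_v, x_p \mid C{=}1)\,P(C{=}1)}
             {p(x_v, x_p \mid C{=}1)\,P(C{=}1) + p(x_v, x_p \mid C{=}2)\,P(C{=}2)},

    where sigma_v and sigma_p are the cue noise standard deviations and C = 2 denotes separate sources, in which case each cue is attributed to its own cause rather than fused.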

  19. Information fusion via isocortex-based Area 37 modeling

    NASA Astrophysics Data System (ADS)

    Peterson, James K.

    2004-08-01

    A simplified model of information processing in the brain can be constructed using primary sensory input from two modalities (auditory and visual) and recurrent connections to the limbic subsystem. Information fusion would then occur in Area 37 of the temporal cortex. The creation of meta concepts from the low order primary inputs is managed by models of isocortex processing. Isocortex algorithms are used to model parietal (auditory), occipital (visual), temporal (polymodal fusion) cortex and the limbic system. Each of these four modules is constructed out of five cortical stacks in which each stack consists of three vertically oriented six layer isocortex models. The input to output training of each cortical model uses the OCOS (on center - off surround) and FFP (folded feedback pathway) circuitry of (Grossberg, 1) which is inherently a recurrent network type of learning characterized by the identification of perceptual groups. Models of this sort are thus closely related to cognitive models as it is difficult to divorce the sensory processing subsystems from the higher level processing in the associative cortex. The overall software architecture presented is biologically based and is presented as a potential architectural prototype for the development of novel sensory fusion strategies. The algorithms are motivated to some degree by specific data from projects on musical composition and autonomous fine art painting programs, but only in the sense that these projects use two specific types of auditory and visual cortex data. Hence, the architectures are presented for an artificial information processing system which utilizes two disparate sensory sources. The exact nature of the two primary sensory input streams is irrelevant.

  20. "The Mask Who Wasn't There": Visual Masking Effect with the Perceptual Absence of the Mask

    ERIC Educational Resources Information Center

    Rey, Amandine Eve; Riou, Benoit; Muller, Dominique; Dabic, Stéphanie; Versace, Rémy

    2015-01-01

    Does a visual mask need to be perceptually present to disrupt processing? In the present research, we proposed to explore the link between perceptual and memory mechanisms by demonstrating that a typical sensory phenomenon (visual masking) can be replicated at a memory level. Experiment 1 highlighted an interference effect of a visual mask on the…

  1. Enhanced visual responses in the superior colliculus in an animal model of attention-deficit hyperactivity disorder and their suppression by D-amphetamine.

    PubMed

    Clements, K M; Devonshire, I M; Reynolds, J N J; Overton, P G

    2014-08-22

    Attention-deficit hyperactivity disorder (ADHD) is a prevalent neurodevelopmental disorder characterized by overactivity, impulsiveness and attentional problems, including an increase in distractibility. A structure that is intimately linked with distractibility is the superior colliculus (SC), a midbrain sensory structure which plays a particular role in the production of eye and head movements. Although others have proposed the involvement of such diverse elements as the frontal cortex and forebrain noradrenaline in ADHD, given the role of the colliculus in distractibility and the increased distractibility in ADHD, we have proposed that distractibility in ADHD arises due to collicular sensory hyper-responsiveness. To further investigate this possibility, we recorded the extracellular activity (multi-unit (MUA) and local field potential (LFP)) in the superficial visual layers of the SC in an animal model of ADHD, the New Zealand genetically hypertensive (GH) rat, in response to wholefield light flashes. The MUA and LFP peak amplitude and summed activity within a one-second time window post-stimulus were both significantly greater in GH rats than in Wistar controls, across the full range of stimulus intensities. Given that baseline firing rate did not differ between the strains, this suggests that the signal-to-noise ratio is elevated in GH animals. D-Amphetamine reduced the peak amplitude and summed activity of the multi-unit response in Wistar animals. It also reduced the peak amplitude and summed activity of the multi-unit response in GH animals, at higher doses bringing it down to levels that were equivalent to those of Wistar animals at baseline. The present results provide convergent evidence that a collicular dysfunction (sensory hyper-responsiveness) is present in ADHD, and that it may underlie the enhanced distractibility. In addition, D-amphetamine - a widely used treatment in ADHD - may have one of its loci of therapeutic action at the level of the colliculus. Copyright © 2014 IBRO. Published by Elsevier Ltd. All rights reserved.
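    The two response measures compared between strains (peak amplitude and summed activity in the one-second post-stimulus window) are straightforward to compute from a peristimulus trace. A hypothetical sketch with synthetic data; the sampling rate, baseline rate and response shape are illustrative only:

        import numpy as np

        def flash_response_measures(trace, fs, stim_index, window_s=1.0):
            """Peak amplitude and summed activity in a 1-s window after stimulus onset.

            trace      : 1-D peristimulus activity (e.g., MUA rate or rectified LFP)
            fs         : samples per second
            stim_index : sample index of the light flash
            """
            post = trace[stim_index:stim_index + int(window_s * fs)]
            baseline = trace[:stim_index].mean()            # pre-stimulus activity level
            return (post - baseline).max(), (post - baseline).sum()

        # Illustrative synthetic trace: baseline firing plus a transient flash response.
        fs = 1000
        rng = np.random.default_rng(3)
        trace = rng.poisson(5.0, 2 * fs).astype(float)
        trace[fs:fs + 100] += np.linspace(30, 0, 100)        # decaying response to a flash at t = 1 s
        peak, summed = flash_response_measures(trace, fs, stim_index=fs)
        print(f"peak = {peak:.1f}, summed = {summed:.1f}")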

  2. Sensory and Working Memory Representations of Small and Large Numerosities in the Crow Endbrain.

    PubMed

    Ditz, Helen M; Nieder, Andreas

    2016-11-23

    Neurons in the avian nidopallium caudolaterale (NCL), an endbrain structure that originated independently from the mammalian neocortex, process visual numerosities. To clarify the code for number in this anatomically distinct endbrain area in birds, neuronal responses to a broad range of numerosities were analyzed. We recorded single-neuron activity from the NCL of crows performing a delayed match-to-sample task with visual numerosities as discriminanda. The responses of >20% of randomly selected neurons were modulated significantly by numerosities ranging from one to 30 items. Numerosity-selective neurons showed bell-shaped tuning curves with one of the presented numerosities as preferred numerosity regardless of the physical appearance of the items. The resulting labeled-line code exhibited logarithmic compression obeying the Weber-Fechner law for magnitudes. Comparable proportions of selective neurons were found, not only during stimulus presentation, but also in the delay phase, indicating a dominant role of the NCL in numerical working memory. Both during sensory encoding and memorization of numerosities in working memory, NCL activity predicted the crows' number discrimination performance. These neuronal data reveal striking similarities across vertebrate taxa in their code for number despite convergently evolved and anatomically distinct endbrain structures. Birds are known for their capabilities to process numerical quantity. However, birds lack the six-layered neocortex that endows primates with numerical competence. We aimed to decipher the neuronal code for numerical quantity in the independently and distinctly evolved endbrain of birds. We recorded the activity of neurons in an endbrain association area termed nidopallium caudolaterale (NCL) from crows that assessed and briefly memorized numerosities from one to 30 dots. We report a neuronal code for sensory representation and working memory of numerosities in the crow NCL exhibiting several characteristics that are surprisingly similar to the ones found in primates. Our data suggest a common code for number in two different vertebrate taxa that has evolved based on convergent evolution. Copyright © 2016 the authors 0270-6474/16/3612044-09$15.00/0.
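    The logarithmic compression noted here, the Weber-Fechner law for magnitudes, has a compact standard form: the internal representation grows with the logarithm of numerosity, so tuning curves that look asymmetric on a linear axis become symmetric when plotted on a logarithmic axis. In standard notation (given as background, not copied from this paper):

        S \;=\; k \,\ln\!\frac{I}{I_0},
        \qquad
        \frac{\Delta I}{I} \;=\; \text{const (Weber's law)},

    where I is the numerosity (stimulus magnitude), I_0 a reference magnitude, S the sensed magnitude, and k a scaling constant.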

  3. A crossmodal crossover: opposite effects of visual and auditory perceptual load on steady-state evoked potentials to irrelevant visual stimuli.

    PubMed

    Jacoby, Oscar; Hall, Sarah E; Mattingley, Jason B

    2012-07-16

    Mechanisms of attention are required to prioritise goal-relevant sensory events under conditions of stimulus competition. According to the perceptual load model of attention, the extent to which task-irrelevant inputs are processed is determined by the relative demands of discriminating the target: the more perceptually demanding the target task, the less unattended stimuli will be processed. Although much evidence supports the perceptual load model for competing stimuli within a single sensory modality, the effects of perceptual load in one modality on distractor processing in another is less clear. Here we used steady-state evoked potentials (SSEPs) to measure neural responses to irrelevant visual checkerboard stimuli while participants performed either a visual or auditory task that varied in perceptual load. Consistent with perceptual load theory, increasing visual task load suppressed SSEPs to the ignored visual checkerboards. In contrast, increasing auditory task load enhanced SSEPs to the ignored visual checkerboards. This enhanced neural response to irrelevant visual stimuli under auditory load suggests that exhausting capacity within one modality selectively compromises inhibitory processes required for filtering stimuli in another. Copyright © 2012 Elsevier Inc. All rights reserved.

  4. Role of multisensory stimuli in vigilance enhancement- a single trial event related potential study.

    PubMed

    Abbasi, Nida Itrat; Bodala, Indu Prasad; Bezerianos, Anastasios; Yu Sun; Al-Nashash, Hasan; Thakor, Nitish V

    2017-07-01

    The development of interventions to prevent vigilance decrement has important applications in safety-critical areas such as transportation and defence. The objective of this work is to use multisensory (visual and haptic) stimuli for cognitive enhancement during mundane tasks. Two different epoch intervals, representing sensory perception and motor response, were analysed using minimum variance distortionless response (MVDR)-based single-trial ERP estimation to understand how performance depends on both factors. Bereitschaftspotential (BP) latency L3 was significantly correlated with reaction time (r=0.6 in phase 1 (visual) and r=0.71 in phase 2 (visual and haptic)), whereas sensory ERP latency L2 was not (r=0.1 in both phases). This implies that low performance in monotonous tasks depends predominantly on the prolonged neural interaction with the muscles required to initiate movement. Further, a negative relationship was found between the occurrence of epochs in which multisensory cues were provided and the ERP latencies related to sensory perception and the Bereitschaftspotential (BP). This suggests that vigilance decrement is reduced by multisensory stimulus presentation in prolonged monotonous tasks.

  5. The effect of extended sensory range via the EyeCane sensory substitution device on the characteristics of visionless virtual navigation.

    PubMed

    Maidenbaum, Shachar; Levy-Tzedek, Shelly; Chebat, Daniel Robert; Namer-Furstenberg, Rinat; Amedi, Amir

    2014-01-01

    Mobility training programs for helping the blind navigate through unknown places with a White-Cane significantly improve their mobility. However, what is the effect of new assistive technologies, offering more information to the blind user, on the underlying premises of these programs, such as navigation patterns? We developed the virtual-EyeCane, a minimalistic sensory substitution device translating single-point distance into auditory cues identical to the EyeCane's in the real world. We compared performance in virtual environments when using the virtual-EyeCane, a virtual-White-Cane, no device and visual navigation. We show that the characteristics of virtual-EyeCane navigation differ from navigation with a virtual-White-Cane or no device, and that virtual-EyeCane users complete more levels successfully, taking shorter paths and with fewer collisions than these groups, and we demonstrate the relative similarity of virtual-EyeCane and visual navigation patterns. This suggests that additional distance information indeed changes navigation patterns from virtual-White-Cane use, and brings them closer to visual navigation.

  6. Training Modalities to Increase Sensorimotor Adaptability

    NASA Technical Reports Server (NTRS)

    Bloomberg, J. J.; Mulavara, A. P.; Peters, B. T.; Brady, R.; Audas, C.; Cohen, H. S.

    2009-01-01

    During the acute phase of adaptation to novel gravitational environments, sensorimotor disturbances have the potential to disrupt the ability of astronauts to perform required mission tasks. The goal of our current series of studies is develop a sensorimotor adaptability (SA) training program designed to facilitate recovery of functional capabilities when astronauts transition to different gravitational environments. The project has conducted a series of studies investigating the efficacy of treadmill training combined with a variety of sensory challenges (incongruent visual input, support surface instability) designed to increase adaptability. SA training using a treadmill combined with exposure to altered visual input was effective in producing increased adaptability in a more complex over-ground ambulatory task on an obstacle course. This confirms that for a complex task like walking, treadmill training contains enough of the critical features of overground walking to be an effective training modality. SA training can be optimized by using a periodized training schedule. Test sessions that each contain short-duration exposures to multiple perturbation stimuli allows subjects to acquire a greater ability to rapidly reorganize appropriate response strategies when encountering a novel sensory environment. Using a treadmill mounted on top of a six degree-of-freedom motion base platform we investigated locomotor training responses produced by subjects introduced to a dynamic walking surface combined with alterations in visual flow. Subjects who received this training had improved locomotor performance and faster reaction times when exposed to the novel sensory stimuli compared to control subjects. Results also demonstrate that individual sensory biases (i.e. increased visual dependency) can predict adaptive responses to novel sensory environments suggesting that individual training prescription can be developed to enhance adaptability. These data indicate that SA training can be effectively integrated with treadmill exercise and optimized to provide a unique system that combines multiple training requirements in a single countermeasure system. Learning Objectives: The development of a new countermeasure approach that enhances sensorimotor adaptability will be discussed.

  7. Long-term sensorimotor and therapeutical effects of a mild regime of prism adaptation in spatial neglect. A double-blind RCT essay.

    PubMed

    Rode, G; Lacour, S; Jacquin-Courtois, S; Pisella, L; Michel, C; Revol, P; Alahyane, N; Luauté, J; Gallagher, S; Halligan, P; Pélisson, D; Rossetti, Y

    2015-04-01

    Spatial neglect (SN) is commonly associated with poor functional outcome. Adaptation to a rightward optical deviation of vision has been shown to benefit SN rehabilitation. The neurophysiological foundations and the optimal modalities of prism adaptation (PA) therapy however remain to be validated. This study is aimed at exploring the long-term sensory-motor, cognitive and functional effects produced by weekly PA sessions over a period of four weeks. A double-blind, monocentric randomized and controlled trial (RCT) was carried out. Twenty patients with left SN secondary to stroke were included, 10 in the "prism" group and 10 in the "control" group. The sensory-motor effects of PA were evaluated by measurement of the manual and visual straight-ahead, and also by the precision of pointing without visual feedback, before and after each PA session. The functional independence measure (FIM) was evaluated before and at 1, 3 and 6 months after PA, while SN severity was assessed using the Behavioural Inattention Test (BIT) before and 6 months after PA. Before the intervention, only manual straight-ahead pointing constituted a reproducible sensory-motor measurement. During prism exposure, a questionnaire showed that not a single patient was aware of the direct effects of the optical deviation on pointing movement performance. The sensory-motor after-effects of PA included a more rapid reduction of the rightward manual straight-ahead shift, which was secondarily followed by the visual straight-ahead. These sensory-motor effects helped to clarify the action mechanisms of PA on SN. At the conclusion of the 6-month follow-up, the two groups showed similar improvement, indicating that a weekly PA session over 4 weeks was not sufficient to produce long-term functional benefit. This improvement was correlated with the evolution of the visual straight-ahead, which can be proposed as a marker of patient outcome. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  8. Passive Double-Sensory Evoked Coherence Correlates with Long-Term Memory Capacity.

    PubMed

    Horwitz, Anna; Mortensen, Erik L; Osler, Merete; Fagerlund, Birgitte; Lauritzen, Martin; Benedek, Krisztina

    2017-01-01

    HIGHLIGHTS Memory correlates with the difference between single- and double-sensory evoked steady-state coherence in the gamma range (ΔC). The correlation is most pronounced for the anterior brain region (ΔCA). The correlation is not driven by birth size, education, speed of processing, or intelligence. The sensitivity of ΔCA for detecting low memory capacity is 90%. Cerebral rhythmic activity and oscillations are important pathways of communication between cortical cell assemblies and may be key factors in memory. We asked whether memory performance is related to gamma coherence in a non-task sensory steady-state stimulation. We investigated 40 healthy males born in 1953 who were part of a Danish birth cohort study. Coherence was measured in the gamma range in response to a single-sensory visual stimulation (36 Hz) and a double-sensory combined audiovisual stimulation (auditive: 40 Hz; visual: 36 Hz). The individual difference in coherence (ΔC) between the bimodal and monomodal stimulation was calculated for each subject and used as the main explanatory variable. ΔC for the total brain was significantly negatively correlated with long-term verbal recall. This correlation was pronounced for the anterior region. In addition, the correlation between ΔC and long-term memory was robust when controlling for working memory, as well as a wide range of potentially confounding factors, including intelligence, length of education, speed of processing, visual attention and executive function. Moreover, we found that the difference in anterior coherence (ΔCA) is a better predictor of memory than power in multivariate models. The sensitivity of ΔCA for detecting low memory capacity is 92%. Finally, ΔCA was also associated with other types of memory: verbal learning, visual recognition, and spatial memory, and these additional correlations were also robust to controlling for a range of potentially confounding factors. Thus, ΔC is a predictor of memory performance and may be useful in cognitive neuropsychological testing.
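    The main explanatory variable, ΔC, is the difference between steady-state coherence under double-sensory and single-sensory stimulation at the visual tag frequency. A schematic sketch of how such a quantity could be computed from two recorded channels using magnitude-squared coherence; the simulated signals, gains and window lengths are illustrative assumptions, not the study's recording or analysis parameters:

        import numpy as np
        from scipy.signal import coherence

        fs, dur = 500.0, 60.0                      # sampling rate (Hz) and duration (s), illustrative
        t = np.arange(0, dur * fs) / fs
        rng = np.random.default_rng(4)

        def two_channels(drive_gain):
            """Two noisy channels sharing a 36-Hz steady-state drive of a given strength."""
            drive = np.sin(2 * np.pi * 36.0 * t)
            ch1 = drive_gain * drive + rng.normal(0.0, 1.0, t.size)
            ch2 = drive_gain * drive + rng.normal(0.0, 1.0, t.size)
            return ch1, ch2

        def gamma_coherence(x, y, f_tag=36.0):
            f, cxy = coherence(x, y, fs=fs, nperseg=int(2 * fs))
            return cxy[np.argmin(np.abs(f - f_tag))]    # coherence at the tag frequency

        c_single = gamma_coherence(*two_channels(drive_gain=0.5))   # visual-only stimulation
        c_double = gamma_coherence(*two_channels(drive_gain=0.7))   # audiovisual stimulation
        delta_c = c_double - c_single
        print(f"coherence at 36 Hz: single = {c_single:.2f}, double = {c_double:.2f}, dC = {delta_c:+.2f}")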

  9. Passive Double-Sensory Evoked Coherence Correlates with Long-Term Memory Capacity

    PubMed Central

    Horwitz, Anna; Mortensen, Erik L.; Osler, Merete; Fagerlund, Birgitte; Lauritzen, Martin; Benedek, Krisztina

    2017-01-01

    HIGHLIGHTS Memory correlates with the difference between single- and double-sensory evoked steady-state coherence in the gamma range (ΔC). The correlation is most pronounced for the anterior brain region (ΔCA). The correlation is not driven by birth size, education, speed of processing, or intelligence. The sensitivity of ΔCA for detecting low memory capacity is 90%. Cerebral rhythmic activity and oscillations are important pathways of communication between cortical cell assemblies and may be key factors in memory. We asked whether memory performance is related to gamma coherence in a non-task sensory steady-state stimulation. We investigated 40 healthy males born in 1953 who were part of a Danish birth cohort study. Coherence was measured in the gamma range in response to a single-sensory visual stimulation (36 Hz) and a double-sensory combined audiovisual stimulation (auditive: 40 Hz; visual: 36 Hz). The individual difference in coherence (ΔC) between the bimodal and monomodal stimulation was calculated for each subject and used as the main explanatory variable. ΔC for the total brain was significantly negatively correlated with long-term verbal recall. This correlation was pronounced for the anterior region. In addition, the correlation between ΔC and long-term memory was robust when controlling for working memory, as well as a wide range of potentially confounding factors, including intelligence, length of education, speed of processing, visual attention and executive function. Moreover, we found that the difference in anterior coherence (ΔCA) is a better predictor of memory than power in multivariate models. The sensitivity of ΔCA for detecting low memory capacity is 92%. Finally, ΔCA was also associated with other types of memory: verbal learning, visual recognition, and spatial memory, and these additional correlations were also robust to controlling for a range of potentially confounding factors. Thus, ΔC is a predictor of memory performance and may be useful in cognitive neuropsychological testing. PMID:29311868

  10. M.I.T./Canadian Vestibular Experiments on the Spacelab-1 Mission. Part 1: Sensory Adaptation to Weightlessness and Readaptation to One-G: An Overview

    NASA Technical Reports Server (NTRS)

    Young, Laurence R.; Oman, C. M.; Watt, D. G. D.; Money, K. E.; Lichtenberg, B. K.; Kenyon, R. V.; Arrott, A. P.

    1991-01-01

    Experiments on human spatial orientation were conducted on four crewmembers of Space Shuttle Spacelab Mission 1. The conceptual background of the project, the relationship among the experiments, and their relevance to a 'sensory reinterpretation hypothesis' are presented. Detailed experiment procedures and results are presented in the accompanying papers in this series. The overall findings are discussed as they pertain to the following aspects of hypothesized sensory reinterpretation in weightlessness: (1) utricular otolith afferent signals are reinterpreted as indicating head translation rather than tilt, (2) sensitivity of reflex responses to footward acceleration is reduced, and (3) increased weighting is given to visual and tactile cues in orientation perception and posture control. Results suggest increased weighting of visual cues and reduced weighting of graviceptor signals in weightlessness.

  11. Recruitment of Occipital Cortex during Sensory Substitution Training Linked to Subjective Experience of Seeing in People with Blindness

    PubMed Central

    Ortiz, Tomás; Poch, Joaquín; Santos, Juan M.; Requena, Carmen; Martínez, Ana M.; Ortiz-Terán, Laura; Turrero, Agustín; Barcia, Juan; Nogales, Ramón; Calvo, Agustín; Martínez, José M.; Córdoba, José L.; Pascual-Leone, Alvaro

    2011-01-01

    Over three months of intensive training with a tactile stimulation device, 18 blind and 10 blindfolded seeing subjects improved in their ability to identify geometric figures by touch. Seven blind subjects spontaneously reported ‘visual qualia’, the subjective sensation of seeing flashes of light congruent with tactile stimuli. In the latter subjects tactile stimulation evoked activation of occipital cortex on electroencephalography (EEG). None of the blind subjects who failed to experience visual qualia, despite identical tactile stimulation training, showed EEG recruitment of occipital cortex. None of the blindfolded seeing humans reported visual-like sensations during tactile stimulation. These findings support the notion that the conscious experience of seeing is linked to the activation of occipital brain regions in people with blindness. Moreover, the findings indicate that provision of visual information can be achieved through non-visual sensory modalities which may help to minimize the disability of blind individuals, affording them some degree of object recognition and navigation aid. PMID:21853098

  12. A number-form area in the blind

    PubMed Central

    Abboud, Sami; Maidenbaum, Shachar; Dehaene, Stanislas; Amedi, Amir

    2015-01-01

    Distinct preference for visual number symbols was recently discovered in the human right inferior temporal gyrus (rITG). It remains unclear how this preference emerges, what is the contribution of shape biases to its formation and whether visual processing underlies it. Here we use congenital blindness as a model for brain development without visual experience. During fMRI, we present blind subjects with shapes encoded using a novel visual-to-music sensory-substitution device (The EyeMusic). Greater activation is observed in the rITG when subjects process symbols as numbers compared with control tasks on the same symbols. Using resting-state fMRI in the blind and sighted, we further show that the areas with preference for numerals and letters exhibit distinct patterns of functional connectivity with quantity and language-processing areas, respectively. Our findings suggest that specificity in the ventral ‘visual’ stream can emerge independently of sensory modality and visual experience, under the influence of distinct connectivity patterns. PMID:25613599

  13. Devices for visually impaired people: High technological devices with low user acceptance and no adaptability for children.

    PubMed

    Gori, Monica; Cappagli, Giulia; Tonelli, Alessia; Baud-Bovy, Gabriel; Finocchietti, Sara

    2016-10-01

    Considering that cortical plasticity is maximal in the child, why are the majority of technological devices available for visually impaired users meant for adults and not for children? Moreover, despite high technological advancements in recent years, why is there still no full user acceptance of existing sensory substitution devices? The goal of this review is to create a link between neuroscientists and engineers by opening a discussion about the direction that the development of technological devices for visually impaired people is taking. Firstly, we review works on spatial and social skills in children with visual impairments, showing that lack of vision is associated with other sensory and motor delays. Secondly, we present some of the technological solutions developed to date for visually impaired people. Doing this, we highlight the core features of these systems and discuss their limits. We also discuss the possible reasons behind the low adaptability in children. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  14. Aversive learning shapes neuronal orientation tuning in human visual cortex.

    PubMed

    McTeague, Lisa M; Gruss, L Forest; Keil, Andreas

    2015-07-28

    The responses of sensory cortical neurons are shaped by experience. As a result perceptual biases evolve, selectively facilitating the detection and identification of sensory events that are relevant for adaptive behaviour. Here we examine the involvement of human visual cortex in the formation of learned perceptual biases. We use classical aversive conditioning to associate one out of a series of oriented gratings with a noxious sound stimulus. After as few as two grating-sound pairings, visual cortical responses to the sound-paired grating show selective amplification. Furthermore, as learning progresses, responses to the orientations with greatest similarity to the sound-paired grating are increasingly suppressed, suggesting inhibitory interactions between orientation-selective neuronal populations. Changes in cortical connectivity between occipital and fronto-temporal regions mirror the changes in visuo-cortical response amplitudes. These findings suggest that short-term behaviourally driven retuning of human visual cortical neurons involves distal top-down projections as well as local inhibitory interactions.

  15. Diagnosing Contributions of Sensory and Cognitive Deficits to Hearing Dysfunction in Blast Exposed/TBI Service Members

    DTIC Science & Technology

    2016-10-01

    Annual report for award W81XWH-15-1-0490, "Diagnosing Contributions of Sensory and Cognitive Deficits to Hearing Dysfunction in Blast-Exposed/TBI Service Members," covering 15 Sep 2015 - 14 Sep 2016. Finalized versions of both the auditory and visual selective attention tasks have been installed and are running at WRNMMC, and subject recruitment has started.

  16. A magnetoencephalography study of multi-modal processing of pain anticipation in primary sensory cortices.

    PubMed

    Gopalakrishnan, R; Burgess, R C; Plow, E B; Floden, D P; Machado, A G

    2015-09-24

    Pain anticipation plays a critical role in pain chronification and results in disability due to pain avoidance. It is important to understand how different sensory modalities (auditory, visual or tactile) may influence pain anticipation as different strategies could be applied to mitigate anticipatory phenomena and chronification. In this study, using a countdown paradigm, we evaluated with magnetoencephalography the neural networks associated with pain anticipation elicited by different sensory modalities in normal volunteers. When encountered with well-established cues that signaled pain, visual and somatosensory cortices engaged the pain neuromatrix areas early during the countdown process, whereas the auditory cortex displayed delayed processing. In addition, during pain anticipation, the visual cortex displayed independent processing capabilities after learning the contextual meaning of cues from associative and limbic areas. Interestingly, cross-modal activation was also evident and strong when visual and tactile cues signaled upcoming pain. Dorsolateral prefrontal cortex and mid-cingulate cortex showed significant activity during pain anticipation regardless of modality. Our results show pain anticipation is processed with great time efficiency by a highly specialized and hierarchical network. The highest degree of higher-order processing is modulated by context (pain) rather than content (modality) and rests within the associative limbic regions, corroborating their intrinsic role in chronification. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.

  17. Model of rhythmic ball bouncing using a visually controlled neural oscillator.

    PubMed

    Avrin, Guillaume; Siegler, Isabelle A; Makarov, Maria; Rodriguez-Ayerbe, Pedro

    2017-10-01

    The present paper investigates the sensory-driven modulations of central pattern generator dynamics that can be expected to reproduce human behavior during rhythmic hybrid tasks. We propose a theoretical model of human sensorimotor behavior able to account for the observed data from the ball-bouncing task. The novel control architecture is composed of a Matsuoka neural oscillator coupled with the environment through visual sensory feedback. The architecture's ability to reproduce human-like performance during the ball-bouncing task in the presence of perturbations is quantified by comparison of simulated and recorded trials. The results suggest that human visual control of the task is achieved online. The adaptive behavior is made possible by a parametric and state control of the limit cycle emerging from the interaction of the rhythmic pattern generator, the musculoskeletal system, and the environment. NEW & NOTEWORTHY The study demonstrates that a behavioral model based on a neural oscillator controlled by visual information is able to accurately reproduce human modulations in a motor action with respect to sensory information during the rhythmic ball-bouncing task. The model attractor dynamics emerging from the interaction between the neuromusculoskeletal system and the environment met task requirements, environmental constraints, and human behavioral choices without relying on movement planning and explicit internal models of the environment. Copyright © 2017 the American Physiological Society.
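    The rhythmic core of such an architecture, a Matsuoka oscillator, is a pair of mutually inhibiting neurons with self-adaptation; sensory terms added to the tonic drive or to the neuron states provide the kind of parametric and state control described above. A minimal Euler-integration sketch of the unforced oscillator (parameter values are generic textbook choices, not those fitted in the paper):

        import numpy as np

        def matsuoka(T=10.0, dt=0.001, tau_r=0.25, tau_a=0.5, beta=2.5, a=2.5, s=1.0):
            """Two mutually inhibiting neurons with adaptation (Matsuoka oscillator).

            tau_r, tau_a : rise and adaptation time constants
            beta         : self-adaptation gain, a : mutual-inhibition gain, s : tonic drive
            Returns the oscillator output y1 - y2 over time.
            """
            u = np.array([0.1, 0.0])      # membrane states (small asymmetry starts the rhythm)
            v = np.zeros(2)               # adaptation states
            out = []
            for _ in range(int(T / dt)):
                y = np.maximum(0.0, u)                          # firing rates
                du = (-u - beta * v - a * y[::-1] + s) / tau_r  # cross-inhibition via y[::-1]
                dv = (-v + y) / tau_a
                u, v = u + dt * du, v + dt * dv
                out.append(y[0] - y[1])
            return np.array(out)

        signal = matsuoka()
        print("output range:", signal.min(), signal.max())      # alternating positive/negative bursts

    In a visually coupled version, the tonic drive s (or the states u) would be modulated online by the ball-tracking signal, which is the sense in which the limit cycle is shaped by the interaction with the environment.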

  18. Electrotactile and vibrotactile displays for sensory substitution systems

    NASA Technical Reports Server (NTRS)

    Kaczmarek, Kurt A.; Webster, John G.; Bach-Y-rita, Paul; Tompkins, Willis J.

    1991-01-01

    Sensory substitution systems provide their users with environmental information through a human sensory channel (eye, ear, or skin) different from that normally used or with the information processed in some useful way. The authors review the methods used to present visual, auditory, and modified tactile information to the skin and discuss present and potential future applications of sensory substitution, including tactile vision substitution (TVS), tactile auditory substitution, and remote tactile sensing or feedback (teletouch). The relevant sensory physiology of the skin, including the mechanisms of normal touch and the mechanisms and sensations associated with electrical stimulation of the skin using surface electrodes (electrotactile, or electrocutaneous, stimulation), is reviewed. The information-processing ability of the tactile sense and its relevance to sensory substitution is briefly summarized. The limitations of current tactile display technologies are discussed.

  19. Functional connectivity of visual cortex in the blind follows retinotopic organization principles.

    PubMed

    Striem-Amit, Ella; Ovadia-Caro, Smadar; Caramazza, Alfonso; Margulies, Daniel S; Villringer, Arno; Amedi, Amir

    2015-06-01

    Is visual input during critical periods of development crucial for the emergence of the fundamental topographical mapping of the visual cortex? And would this structure be retained throughout life-long blindness or would it fade as a result of plastic, use-based reorganization? We used functional connectivity magnetic resonance imaging based on intrinsic blood oxygen level-dependent fluctuations to investigate whether significant traces of topographical mapping of the visual scene in the form of retinotopic organization, could be found in congenitally blind adults. A group of 11 fully and congenitally blind subjects and 18 sighted controls were studied. The blind demonstrated an intact functional connectivity network structural organization of the three main retinotopic mapping axes: eccentricity (centre-periphery), laterality (left-right), and elevation (upper-lower) throughout the retinotopic cortex extending to high-level ventral and dorsal streams, including characteristic eccentricity biases in face- and house-selective areas. Functional connectivity-based topographic organization in the visual cortex was indistinguishable from the normally sighted retinotopic functional connectivity structure as indicated by clustering analysis, and was found even in participants who did not have a typical retinal development in utero (microphthalmics). While the internal structural organization of the visual cortex was strikingly similar, the blind exhibited profound differences in functional connectivity to other (non-visual) brain regions as compared to the sighted, which were specific to portions of V1. Central V1 was more connected to language areas but peripheral V1 to spatial attention and control networks. These findings suggest that current accounts of critical periods and experience-dependent development should be revisited even for primary sensory areas, in that the connectivity basis for visual cortex large-scale topographical organization can develop without any visual experience and be retained through life-long experience-dependent plasticity. Furthermore, retinotopic divisions of labour, such as that between the visual cortex regions normally representing the fovea and periphery, also form the basis for topographically-unique plastic changes in the blind. © The Author (2015). Published by Oxford University Press on behalf of the Guarantors of Brain.

  20. Handwriting Error Patterns of Children with Mild Motor Difficulties.

    ERIC Educational Resources Information Center

    Malloy-Miller, Theresa; And Others

    1995-01-01

    A test of handwriting legibility and 6 perceptual-motor tests were completed by 66 children ages 7-12. Among handwriting error patterns, execution was associated with visual-motor skill and sensory discrimination, aiming with visual-motor and fine-motor skills. The visual-spatial factor had no significant association with perceptual-motor…

  1. Enhanced attentional gain as a mechanism for generalized perceptual learning in human visual cortex.

    PubMed

    Byers, Anna; Serences, John T

    2014-09-01

    Learning to better discriminate a specific visual feature (i.e., a specific orientation in a specific region of space) has been associated with plasticity in early visual areas (sensory modulation) and with improvements in the transmission of sensory information from early visual areas to downstream sensorimotor and decision regions (enhanced readout). However, in many real-world scenarios that require perceptual expertise, observers need to efficiently process numerous exemplars from a broad stimulus class as opposed to just a single stimulus feature. Some previous data suggest that perceptual learning leads to highly specific neural modulations that support the discrimination of specific trained features. However, the extent to which perceptual learning acts to improve the discriminability of a broad class of stimuli via the modulation of sensory responses in human visual cortex remains largely unknown. Here, we used functional MRI and a multivariate analysis method to reconstruct orientation-selective response profiles based on activation patterns in the early visual cortex before and after subjects learned to discriminate small offsets in a set of grating stimuli that were rendered in one of nine possible orientations. Behavioral performance improved across 10 training sessions, and there was a training-related increase in the amplitude of orientation-selective response profiles in V1, V2, and V3 when orientation was task relevant compared with when it was task irrelevant. These results suggest that generalized perceptual learning can lead to modified responses in the early visual cortex in a manner that is suitable for supporting improved discriminability of stimuli drawn from a large set of exemplars. Copyright © 2014 the American Physiological Society.

  2. Functional Architecture of the Retina: Development and Disease

    PubMed Central

    Hoon, Mrinalini; Okawa, Haruhisa; Santina, Luca Della; Wong, Rachel O.L.

    2014-01-01

    Structure and function are highly correlated in the vertebrate retina, a sensory tissue that is organized into cell layers with microcircuits working in parallel and together to encode visual information. All vertebrate retinas share a fundamental plan, comprising five major neuronal cell classes with cell body distributions and connectivity arranged in stereotypic patterns. Conserved features in retinal design have enabled detailed analysis and comparisons of structure, connectivity and function across species. Each species, however, can adopt structural and/or functional retinal specializations, implementing variations to the basic design in order to satisfy unique requirements in visual function. Recent advances in molecular tools, imaging and electrophysiological approaches have greatly facilitated identification of the cellular and molecular mechanisms that establish the fundamental organization of the retina and the specializations of its microcircuits during development. Here, we review advances in our understanding of how these mechanisms act to shape structure and function at the single cell level, to coordinate the assembly of cell populations, and to define their specific circuitry. We also highlight how structure is rearranged and function is disrupted in disease, and discuss current approaches to re-establish the intricate functional architecture of the retina. PMID:24984227

  3. Functional architecture of the retina: development and disease.

    PubMed

    Hoon, Mrinalini; Okawa, Haruhisa; Della Santina, Luca; Wong, Rachel O L

    2014-09-01

    Structure and function are highly correlated in the vertebrate retina, a sensory tissue that is organized into cell layers with microcircuits working in parallel and together to encode visual information. All vertebrate retinas share a fundamental plan, comprising five major neuronal cell classes with cell body distributions and connectivity arranged in stereotypic patterns. Conserved features in retinal design have enabled detailed analysis and comparisons of structure, connectivity and function across species. Each species, however, can adopt structural and/or functional retinal specializations, implementing variations to the basic design in order to satisfy unique requirements in visual function. Recent advances in molecular tools, imaging and electrophysiological approaches have greatly facilitated identification of the cellular and molecular mechanisms that establish the fundamental organization of the retina and the specializations of its microcircuits during development. Here, we review advances in our understanding of how these mechanisms act to shape structure and function at the single cell level, to coordinate the assembly of cell populations, and to define their specific circuitry. We also highlight how structure is rearranged and function is disrupted in disease, and discuss current approaches to re-establish the intricate functional architecture of the retina. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Decoding visual object categories in early somatosensory cortex.

    PubMed

    Smith, Fraser W; Goodale, Melvyn A

    2015-04-01

    Neurons, even in the earliest sensory areas of cortex, are subject to a great deal of contextual influence from both within and across modality connections. In the present work, we investigated whether the earliest regions of somatosensory cortex (S1 and S2) would contain content-specific information about visual object categories. We reasoned that this might be possible due to the associations formed through experience that link different sensory aspects of a given object. Participants were presented with visual images of different object categories in 2 fMRI experiments. Multivariate pattern analysis revealed reliable decoding of familiar visual object category in bilateral S1 (i.e., postcentral gyri) and right S2. We further show that this decoding is observed for familiar but not unfamiliar visual objects in S1. In addition, whole-brain searchlight decoding analyses revealed several areas in the parietal lobe that could mediate the observed context effects between vision and somatosensation. These results demonstrate that even the first cortical stages of somatosensory processing carry information about the category of visually presented familiar objects. © The Author 2013. Published by Oxford University Press.
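    Multivariate pattern analysis of this kind amounts to training a linear classifier on voxel patterns and testing its cross-validated accuracy against chance. A schematic, self-contained sketch with simulated "S1" patterns and scikit-learn; the trial counts, voxel counts and noise levels are illustrative, and the real analysis would of course use GLM-derived activation estimates, run-wise cross-validation and appropriate statistics:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(5)
        n_trials_per_cat, n_vox = 60, 200
        categories = np.repeat([0, 1], n_trials_per_cat)           # e.g., two familiar object categories

        # Simulated voxel patterns: a weak category-specific signal plus noise.
        templates = rng.normal(0.0, 0.3, (2, n_vox))               # one "template" pattern per category
        patterns = templates[categories] + rng.normal(0.0, 1.0, (categories.size, n_vox))

        clf = LogisticRegression(max_iter=1000)
        acc = cross_val_score(clf, patterns, categories, cv=5)     # 5-fold cross-validated decoding
        print(f"decoding accuracy: {acc.mean():.2f} (chance = 0.50)")

    Above-chance accuracy in a region such as postcentral gyrus is what licenses the claim that its activity patterns carry information about the visually presented category.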

  5. Decoding Visual Object Categories in Early Somatosensory Cortex

    PubMed Central

    Smith, Fraser W.; Goodale, Melvyn A.

    2015-01-01

    Neurons, even in the earliest sensory areas of cortex, are subject to a great deal of contextual influence from both within and across modality connections. In the present work, we investigated whether the earliest regions of somatosensory cortex (S1 and S2) would contain content-specific information about visual object categories. We reasoned that this might be possible due to the associations formed through experience that link different sensory aspects of a given object. Participants were presented with visual images of different object categories in 2 fMRI experiments. Multivariate pattern analysis revealed reliable decoding of familiar visual object category in bilateral S1 (i.e., postcentral gyri) and right S2. We further show that this decoding is observed for familiar but not unfamiliar visual objects in S1. In addition, whole-brain searchlight decoding analyses revealed several areas in the parietal lobe that could mediate the observed context effects between vision and somatosensation. These results demonstrate that even the first cortical stages of somatosensory processing carry information about the category of visually presented familiar objects. PMID:24122136

  6. Comparing the visual spans for faces and letters

    PubMed Central

    He, Yingchen; Scholz, Jennifer M.; Gage, Rachel; Kallie, Christopher S.; Liu, Tingting; Legge, Gordon E.

    2015-01-01

    The visual span—the number of adjacent text letters that can be reliably recognized on one fixation—has been proposed as a sensory bottleneck that limits reading speed (Legge, Mansfield, & Chung, 2001). Like reading, searching for a face is an important daily task that involves pattern recognition. Is there a similar limitation on the number of faces that can be recognized in a single fixation? Here we report on a study in which we measured and compared the visual-span profiles for letter and face recognition. A serial two-stage model for pattern recognition was developed to interpret the data. The first stage is characterized by factors limiting recognition of isolated letters or faces, and the second stage represents the interfering effect of nearby stimuli on recognition. Our findings show that the visual span for faces is smaller than that for letters. Surprisingly, however, when differences in first-stage processing for letters and faces are accounted for, the two visual spans become nearly identical. These results suggest that the concept of visual span may describe a common sensory bottleneck that underlies different types of pattern recognition. PMID:26129858
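    To make the serial two-stage account above concrete, the short Python sketch below treats accuracy at each letter (or face) slot as the product of an isolated-recognition stage and an interference stage, and counts the slots exceeding a criterion as the visual span. The functional forms, parameter values, and the 80%-correct criterion are illustrative assumptions, not the parameters fitted in the study.

      # Minimal sketch (not the authors' fitted model) of a serial two-stage account of
      # visual-span profiles: stage 1 limits recognition of an isolated item, and stage 2
      # models interference from neighboring items; both are assumed to worsen with
      # distance from fixation.
      import math

      def stage1_isolated(position, acuity_falloff):
          # P(correct) for an isolated letter or face presented at a given slot.
          return 0.99 * math.exp(-acuity_falloff * abs(position))

      def stage2_interference(position, crowding_strength):
          # P(no interference from neighboring items), assumed to drop with eccentricity.
          return 1.0 / (1.0 + crowding_strength * abs(position))

      def visual_span_profile(positions, acuity_falloff, crowding_strength):
          # Two-stage model: overall accuracy is the product of the two stage probabilities.
          return {p: stage1_isolated(p, acuity_falloff) * stage2_interference(p, crowding_strength)
                  for p in positions}

      if __name__ == "__main__":
          slots = range(-5, 6)  # item slots left and right of fixation
          letters = visual_span_profile(slots, acuity_falloff=0.10, crowding_strength=0.05)
          faces = visual_span_profile(slots, acuity_falloff=0.30, crowding_strength=0.05)
          span = lambda profile: sum(acc >= 0.80 for acc in profile.values())  # 80%-correct criterion
          print("letter span:", span(letters), "slots; face span:", span(faces), "slots")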

  7. Neural mechanisms of human perceptual choice under focused and divided attention.

    PubMed

    Wyart, Valentin; Myers, Nicholas E; Summerfield, Christopher

    2015-02-25

    Perceptual decisions occur after the evaluation and integration of momentary sensory inputs, and dividing attention between spatially disparate sources of information impairs decision performance. However, it remains unknown whether dividing attention degrades the precision of sensory signals, precludes their conversion into decision signals, or dampens the integration of decision information toward an appropriate response. Here we recorded human electroencephalographic (EEG) activity while participants categorized one of two simultaneous and independent streams of visual gratings according to their average tilt. By analyzing trial-by-trial correlations between EEG activity and the information offered by each sample, we obtained converging behavioral and neural evidence that dividing attention between left and right visual fields does not dampen the encoding of sensory or decision information. Under divided attention, momentary decision information from both visual streams was encoded in slow parietal signals without interference but was lost downstream during their integration as reflected in motor mu- and beta-band (10-30 Hz) signals, resulting in a "leaky" accumulation process that conferred greater behavioral influence to more recent samples. By contrast, sensory inputs that were explicitly cued as irrelevant were not converted into decision signals. These findings reveal that a late cognitive bottleneck on information integration limits decision performance under divided attention, and places new capacity constraints on decision-theoretic models of information integration under cognitive load. Copyright © 2015 the authors.

  8. Neural mechanisms of human perceptual choice under focused and divided attention

    PubMed Central

    Wyart, Valentin; Myers, Nicholas E.; Summerfield, Christopher

    2015-01-01

    Perceptual decisions occur after evaluation and integration of momentary sensory inputs, and dividing attention between spatially disparate sources of information impairs decision performance. However, it remains unknown whether dividing attention degrades the precision of sensory signals, precludes their conversion into decision signals, or dampens the integration of decision information towards an appropriate response. Here we recorded human electroencephalographic (EEG) activity whilst participants categorised one of two simultaneous and independent streams of visual gratings according to their average tilt. By analyzing trial-by-trial correlations between EEG activity and the information offered by each sample, we obtained converging behavioural and neural evidence that dividing attention between left and right visual fields does not dampen the encoding of sensory or decision information. Under divided attention, momentary decision information from both visual streams was encoded in slow parietal signals without interference but was lost downstream during their integration as reflected in motor mu- and beta-band (10–30 Hz) signals, resulting in a ‘leaky’ accumulation process which conferred greater behavioural influence to more recent samples. By contrast, sensory inputs that were explicitly cued as irrelevant were not converted into decision signals. These findings reveal that a late cognitive bottleneck on information integration limits decision performance under divided attention, and place new capacity constraints on decision-theoretic models of information integration under cognitive load. PMID:25716848
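    The 'leaky' accumulation described in this record can be illustrated with a simple leaky integrator, in which evidence decays by a constant factor per sample so that later samples carry more weight in the final decision variable. The following Python sketch uses an assumed leak factor and made-up tilt samples; it is a toy illustration of the idea, not the model fitted to the EEG data.

      # Minimal sketch of a leaky evidence accumulator: with leak 0 < lam < 1, the weight
      # of a sample presented k steps before the decision is lam**k, so recent samples
      # dominate the final decision variable.
      def leaky_accumulate(samples, lam=0.8):
          # Integrate evidence samples with a multiplicative leak applied at every step.
          dv = 0.0
          for s in samples:
              dv = lam * dv + s  # old evidence decays, the new sample is added in full
          return dv

      def sample_weights(n_samples, lam=0.8):
          # Effective weight of each sample in the final decision variable (recency profile).
          return [lam ** (n_samples - 1 - k) for k in range(n_samples)]

      if __name__ == "__main__":
          tilts = [0.2, -0.1, 0.4, 0.1, 0.5, -0.2, 0.3, 0.6]  # made-up tilt evidence per sample
          print("decision variable:", round(leaky_accumulate(tilts, lam=0.8), 3))
          print("per-sample weights:", [round(w, 2) for w in sample_weights(len(tilts), lam=0.8)])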

  9. The primate amygdala represents the positive and negative value of visual stimuli during learning

    PubMed Central

    Paton, Joseph J.; Belova, Marina A.; Morrison, Sara E.; Salzman, C. Daniel

    2008-01-01

    Visual stimuli can acquire positive or negative value through their association with rewards and punishments, a process called reinforcement learning. Although we now know a great deal about how the brain analyses visual information, we know little about how visual representations become linked with values. To study this process, we turned to the amygdala, a brain structure implicated in reinforcement learning [1–5]. We recorded the activity of individual amygdala neurons in monkeys while abstract images acquired either positive or negative value through conditioning. After monkeys had learned the initial associations, we reversed image value assignments. We examined neural responses in relation to these reversals in order to estimate the relative contribution to neural activity of the sensory properties of images and their conditioned values. Here we show that changes in the values of images modulate neural activity, and that this modulation occurs rapidly enough to account for, and correlates with, monkeys’ learning. Furthermore, distinct populations of neurons encode the positive and negative values of visual stimuli. Behavioural and physiological responses to visual stimuli may therefore be based in part on the plastic representation of value provided by the amygdala. PMID:16482160

  10. More Than Meets the Eye: The Eye and Migraine-What You Need to Know.

    PubMed

    Digre, Kathleen B

    2018-05-02

    Migraine has long been associated with disturbances of vision, especially migraine with aura. However, the eye plays an important role in sensory processing as well. We have found that the visual quality of life is reduced in migraine. In this review, we discuss how the migraine and eye pain pathways are similar and affect many of the common complaints seen in ophthalmology and neuro-ophthalmology offices, such as dry eye and postoperative eye pain. We also review related phenomena, including visual snow and photophobia, which are likewise related to altered sensory processing in migraine.

  11. Individual variation in cone photoreceptor density in house sparrows: implications for between-individual differences in visual resolution and chromatic contrast.

    PubMed

    Ensminger, Amanda L; Fernández-Juricic, Esteban

    2014-01-01

    Between-individual variation has been documented in a wide variety of taxa, especially for behavioral characteristics; however, intra-population variation in sensory systems has not received similar attention in wild animals. We measured a key trait of the visual system, the density of retinal cone photoreceptors, in a wild population of house sparrows (Passer domesticus). We tested whether individuals differed from each other in cone densities given within-individual variation across the retina and across eyes. We further tested whether the existing variation could lead to individual differences in two aspects of perception: visual resolution and chromatic contrast. We found consistent between-individual variation in the densities of all five types of avian cones, involved in chromatic and achromatic vision. Using perceptual modeling, we found that this degree of variation translated into significant between-individual differences in visual resolution and the chromatic contrast of a plumage signal that has been associated with mate choice and agonistic interactions. However, there was no evidence for a relationship between individual visual resolution and chromatic contrast. The implication is that some birds may have the sensory potential to perform "better" in certain visual tasks, but not necessarily in both resolution and contrast simultaneously. Overall, our findings (a) highlight the need to consider multiple individuals when characterizing sensory traits of a species, and (b) provide some mechanistic basis for between-individual variation in different behaviors (i.e., animal personalities) and for testing the predictions of several widely accepted hypotheses (e.g., honest signaling).

  12. Individual Variation in Cone Photoreceptor Density in House Sparrows: Implications for Between-Individual Differences in Visual Resolution and Chromatic Contrast

    PubMed Central

    Ensminger, Amanda L.; Fernández-Juricic, Esteban

    2014-01-01

    Between-individual variation has been documented in a wide variety of taxa, especially for behavioral characteristics; however, intra-population variation in sensory systems has not received similar attention in wild animals. We measured a key trait of the visual system, the density of retinal cone photoreceptors, in a wild population of house sparrows (Passer domesticus). We tested whether individuals differed from each other in cone densities given within-individual variation across the retina and across eyes. We further tested whether the existing variation could lead to individual differences in two aspects of perception: visual resolution and chromatic contrast. We found consistent between-individual variation in the densities of all five types of avian cones, involved in chromatic and achromatic vision. Using perceptual modeling, we found that this degree of variation translated into significant between-individual differences in visual resolution and the chromatic contrast of a plumage signal that has been associated with mate choice and agonistic interactions. However, there was no evidence for a relationship between individual visual resolution and chromatic contrast. The implication is that some birds may have the sensory potential to perform “better” in certain visual tasks, but not necessarily in both resolution and contrast simultaneously. Overall, our findings (a) highlight the need to consider multiple individuals when characterizing sensory traits of a species, and (b) provide some mechanistic basis for between-individual variation in different behaviors (i.e., animal personalities) and for testing the predictions of several widely accepted hypotheses (e.g., honest signaling). PMID:25372039
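    As an illustration of how cone density can translate into visual resolution, the sketch below applies a standard back-of-the-envelope calculation: the Nyquist limit of an assumed hexagonal cone mosaic, scaled by a retinal magnification factor (RMF). The densities, the RMF value, and the hexagonal-packing assumption are hypothetical and stand in for the species-specific perceptual modeling used in the study.

      # Illustrative calculation (not the paper's perceptual model): an upper bound on
      # spatial resolution from cone density, assuming a hexagonal cone mosaic and a known
      # retinal magnification factor (RMF, mm of retina per degree of visual angle).
      import math

      def nyquist_cycles_per_mm(cone_density_per_mm2):
          # Nyquist limit of a hexagonal receptor mosaic, in cycles per mm of retina.
          return math.sqrt(cone_density_per_mm2 / (2.0 * math.sqrt(3.0)))

      def resolution_cycles_per_degree(cone_density_per_mm2, rmf_mm_per_deg):
          # Convert the retinal Nyquist limit into cycles per degree of visual angle.
          return nyquist_cycles_per_mm(cone_density_per_mm2) * rmf_mm_per_deg

      if __name__ == "__main__":
          rmf = 0.15  # hypothetical RMF; in practice this is estimated per species and eye
          for label, density in [("individual A", 180000), ("individual B", 230000)]:  # cones per mm^2
              print(label, round(resolution_cycles_per_degree(density, rmf), 1), "cycles/degree")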

  13. Reliability-Weighted Integration of Audiovisual Signals Can Be Modulated by Top-down Attention

    PubMed Central

    Noppeney, Uta

    2018-01-01

    Abstract Behaviorally, it is well established that human observers integrate signals near-optimally weighted in proportion to their reliabilities as predicted by maximum likelihood estimation. Yet, despite abundant behavioral evidence, it is unclear how the human brain accomplishes this feat. In a spatial ventriloquist paradigm, participants were presented with auditory, visual, and audiovisual signals and reported the location of the auditory or the visual signal. Combining psychophysics, multivariate functional MRI (fMRI) decoding, and models of maximum likelihood estimation (MLE), we characterized the computational operations underlying audiovisual integration at distinct cortical levels. We estimated observers’ behavioral weights by fitting psychometric functions to participants’ localization responses. Likewise, we estimated the neural weights by fitting neurometric functions to spatial locations decoded from regional fMRI activation patterns. Our results demonstrate that low-level auditory and visual areas encode predominantly the spatial location of the signal component of a region’s preferred auditory (or visual) modality. By contrast, intraparietal sulcus forms spatial representations by integrating auditory and visual signals weighted by their reliabilities. Critically, the neural and behavioral weights and the variance of the spatial representations depended not only on the sensory reliabilities as predicted by the MLE model but also on participants’ modality-specific attention and report (i.e., visual vs. auditory). These results suggest that audiovisual integration is not exclusively determined by bottom-up sensory reliabilities. Instead, modality-specific attention and report can flexibly modulate how intraparietal sulcus integrates sensory signals into spatial representations to guide behavioral responses (e.g., localization and orienting). PMID:29527567
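    The reliability-weighted (maximum likelihood) rule referenced above has a compact closed form: each cue is weighted by its inverse variance, and the fused estimate has lower variance than either cue alone. The Python sketch below implements that textbook MLE rule with illustrative cue locations and variances; it is not the neurometric fitting procedure used in the study.

      # Minimal sketch of the maximum likelihood estimation (MLE) rule for cue combination:
      # each cue is weighted by its reliability (inverse variance), and the variance of the
      # fused estimate is lower than that of either cue alone.
      def mle_fuse(loc_a, var_a, loc_v, var_v):
          # Reliability-weighted audiovisual location estimate and its predicted variance.
          r_a, r_v = 1.0 / var_a, 1.0 / var_v          # reliabilities
          w_a, w_v = r_a / (r_a + r_v), r_v / (r_a + r_v)
          fused_loc = w_a * loc_a + w_v * loc_v
          fused_var = 1.0 / (r_a + r_v)
          return fused_loc, fused_var, w_a, w_v

      if __name__ == "__main__":
          # Illustrative ventriloquist-style trial: auditory cue at 0 degrees, visual cue at
          # 4 degrees, with the visual cue twice as reliable as the auditory cue.
          loc, var, w_a, w_v = mle_fuse(loc_a=0.0, var_a=4.0, loc_v=4.0, var_v=2.0)
          print("fused location: %.2f deg, variance: %.2f, weights A/V: %.2f/%.2f" % (loc, var, w_a, w_v))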

  14. Sex and Caste-Specific Variation in Compound Eye Morphology of Five Honeybee Species

    PubMed Central

    Streinzer, Martin; Brockmann, Axel; Nagaraja, Narayanappa; Spaethe, Johannes

    2013-01-01

    Ranging from dwarfs to giants, the species of honeybees show remarkable differences in body size that have placed evolutionary constraints on the size of sensory organs and the brain. Colonies comprise three adult phenotypes, drones and two female castes, the reproductive queen and sterile workers. The phenotypes differ with respect to tasks and thus selection pressures, which additionally constrain the shape of sensory systems. In a first step to explore the variability and interaction between species size-limitations and sex and caste-specific selection pressures in sensory and neural structures in honeybees, we compared eye size, ommatidia number and distribution of facet lens diameters in drones, queens and workers of five species (Apis andreniformis, A. florea, A. dorsata, A. mellifera, A. cerana). In these species, male and female eyes show a consistent sex-specific organization with respect to eye size and regional specialization of facet diameters. Drones possess distinctly enlarged eyes with large dorsal facets. Aside from these general patterns, we found signs of unique adaptations in the eyes of A. florea and A. dorsata drones. In both species, drone eyes are disproportionately enlarged. In A. dorsata the increased eye size results from enlarged facets, a likely adaptation to crepuscular mating flights. In contrast, the relative enlargement of A. florea drone eyes results from an increase in ommatidia number, suggesting strong selection for high spatial resolution. Comparison of eye morphology and published mating flight times indicates a correlation between overall light sensitivity and species-specific mating flight times. The correlation suggests an important role of ambient light intensities in the regulation of species-specific mating flight times and the evolution of the visual system. Our study further deepens insights into visual adaptations within the genus Apis and opens up future perspectives for research to better understand the timing mechanisms and sensory physiology of mating-related signals. PMID:23460896

  15. The dusp1 Immediate Early Gene is Regulated by Natural Stimuli Predominantly in Sensory Input Neurons

    PubMed Central

    Horita, Haruhito; Wada, Kazuhiro; Rivas, Miriam V.; Hara, Erina; Jarvis, Erich D.

    2010-01-01

    Many immediate early genes (IEGs) have activity-dependent induction in a subset of brain subdivisions or neuron types. However, none have been reported yet with regulation specific to thalamic-recipient sensory neurons of the telencephalon or in the thalamic sensory input neurons themselves. Here, we report the first such gene, dual specificity phosphatase 1 (dusp1). Dusp1 is an inactivator of mitogen-activated protein kinase (MAPK), and MAPK activates expression of egr1, one of the most commonly studied IEGs, as determined in cultured cells. We found that in the brain of naturally behaving songbirds and other avian species, hearing song, seeing visual stimuli, or performing motor behavior caused high dusp1 upregulation, respectively, in auditory, visual, and somatosensory input cell populations of the thalamus and thalamic-recipient sensory neurons of the telencephalic pallium, whereas high egr1 upregulation occurred only in subsequently connected secondary and tertiary sensory neuronal populations of these same pathways. Motor behavior did not induce high levels of dusp1 expression in the motor-associated areas adjacent to song nuclei, where egr1 is upregulated in response to movement. Our analysis of dusp1 expression in mouse brain suggests similar regulation in the sensory input neurons of the thalamus and thalamic-recipient layer IV and VI neurons of the cortex. These findings suggest that dusp1 has specialized regulation to sensory input neurons of the thalamus and telencephalon; they further suggest that this regulation may serve to attenuate stimulus-induced expression of egr1 and other IEGs, leading to unique molecular properties of forebrain sensory input neurons. PMID:20506480

  16. Sensory feedback by peripheral nerve stimulation improves task performance in individuals with upper limb loss using a myoelectric prosthesis.

    PubMed

    Schiefer, Matthew; Tan, Daniel; Sidek, Steven M; Tyler, Dustin J

    2016-02-01

    Tactile feedback is critical to grip and object manipulation. Its absence results in reliance on visual and auditory cues. Our objective was to assess the effect of sensory feedback on task performance in individuals with limb loss. Stimulation of the peripheral nerves using implanted cuff electrodes provided two subjects with sensory feedback with intensity proportional to forces on the thumb, index, and middle fingers of their prosthetic hand during object manipulation. Both subjects perceived the sensation on their phantom hand at locations corresponding to the locations of the forces on the prosthetic hand. A bend sensor measured prosthetic hand span. Hand span modulated the intensity of sensory feedback perceived on the thenar eminence for subject 1 and the middle finger for subject 2. We performed three functional tests with the blindfolded subjects. First, the subject tried to determine whether or not a wooden block had been placed in his prosthetic hand. Second, the subject had to locate and remove magnetic blocks from a metal table. Third, the subject performed the Southampton Hand Assessment Procedure (SHAP). We also measured the subject's sense of embodiment with a survey and his self-confidence. Blindfolded performance with sensory feedback was similar to sighted performance in the wooden block and magnetic block tasks. Performance on the SHAP, a measure of hand mechanical function and control, was similar with and without sensory feedback. An embodiment survey showed an improved sense of integration of the prosthesis in self body image with sensory feedback. Sensory feedback by peripheral nerve stimulation improved object discrimination and manipulation, embodiment, and confidence. With both forms of feedback, the blindfolded subjects tended toward results obtained with visual feedback.

  17. Hydrocortisone accelerates the decay of iconic memory traces: on the modulation of executive and stimulus-driven constituents of sensory information maintenance.

    PubMed

    Miller, Robert; Weckesser, Lisa J; Smolka, Michael N; Kirschbaum, Clemens; Plessow, Franziska

    2015-03-01

    A substantial amount of research documents the impact of glucocorticoids on higher-order cognitive functioning. By contrast, surprisingly little is known about the susceptibility of basic sensory processes to glucocorticoid exposure given that the glucocorticoid receptor density in the human visual cortex exceeds those observed in prefrontal and most hippocampal brain regions. As executive tasks also rely on these sensory processes, the present study investigates the impact of glucocorticoid exposure on different performance parameters characterizing the maintenance and transfer of sensory information from iconic memory (IM; the sensory buffer of the visual system) to working memory (WM). Using a crossover factorial design, we administered one out of three doses of hydrocortisone (0.06, 0.12, or 0.24mg/kg bodyweight) and a placebo to 18 healthy young men. Thereafter participants performed a partial report task, which was used to assess their individual ability to process sensory information. Blood samples were concurrently drawn to determine free and total cortisol concentrations. The compiled pharmacokinetic and psychophysical data demonstrates that free cortisol specifically accelerated the decay of sensory information (r=0.46) without significantly affecting the selective information transfer from IM to WM or the capacity limit of WM. Specifically, nonparametric regression revealed a sigmoid dose-response relationship between free cortisol levels during the testing period and the IM decay rates. Our findings highlight that glucocorticoid exposure may not only impact on the recruitment of top-down control for an active maintenance of sensory information, but alter their passive (stimulus-driven) maintenance thereby changing the availability of information prior to subsequent executive processing. Copyright © 2014 Elsevier Ltd. All rights reserved.
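    A minimal sketch of the quantities discussed above, under assumed functional forms: partial-report accuracy is modeled as decaying exponentially from an iconic-memory level toward a working-memory asymptote, and the decay rate is made a sigmoid (logistic) function of free cortisol. All parameter values below are hypothetical and chosen only to illustrate the shape of such a dose-response relationship, not to reproduce the study's fits.

      # Minimal sketch under assumed functional forms (not the study's fitted model):
      # partial-report accuracy decays exponentially from an iconic-memory (IM) level toward
      # a working-memory (WM) asymptote, and the decay rate rises as a sigmoid function of
      # free cortisol.
      import math

      def partial_report_accuracy(cue_delay_s, decay_rate, im_level=0.9, wm_asymptote=0.4):
          # Accuracy as the iconic trace decays toward the capacity-limited WM level.
          return wm_asymptote + (im_level - wm_asymptote) * math.exp(-decay_rate * cue_delay_s)

      def decay_rate_from_cortisol(free_cortisol, base=2.0, gain=3.0, midpoint=10.0, slope=0.5):
          # Sigmoid (logistic) dose-response of the IM decay rate on free cortisol.
          return base + gain / (1.0 + math.exp(-slope * (free_cortisol - midpoint)))

      if __name__ == "__main__":
          for cortisol in (5.0, 10.0, 20.0):  # hypothetical free cortisol levels
              rate = decay_rate_from_cortisol(cortisol)
              acc = partial_report_accuracy(cue_delay_s=0.3, decay_rate=rate)
              print("cortisol %5.1f -> decay rate %.2f per s, accuracy at 300 ms cue delay: %.2f" % (cortisol, rate, acc))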

  18. Haptic-assistive technologies for audition and vision sensory disabilities.

    PubMed

    Sorgini, Francesca; Caliò, Renato; Carrozza, Maria Chiara; Oddo, Calogero Maria

    2018-05-01

    The aim of this review is to analyze haptic sensory substitution technologies for deaf, blind and deaf-blind individuals. The literature search has been performed in Scopus, PubMed and Google Scholar databases using selected keywords, analyzing studies from the 1960s to the present. The search of databases for scientific publications has been accompanied by a web search for commercial devices. Results have been classified by sensory disability and functionality, and analyzed by assistive technology. Complementary analyses have also been carried out on websites of public international agencies, such as the World Health Organization (WHO), and of associations representing sensory disabled persons. The reviewed literature provides evidence that sensory substitution aids are able to mitigate in part the deficits in language learning, communication and navigation for deaf, blind and deaf-blind individuals, and that the tactile sense can be a means of communication to provide some kind of information to sensory disabled individuals. A lack of acceptance emerged from the discussion of capabilities and limitations of haptic assistive technologies. Future research should move towards miniaturized, custom-designed and low-cost haptic interfaces and their integration with personal devices such as smartphones, to promote wider adoption of sensory aids among disabled users. Implications for rehabilitation: this article provides a systematic review of the state of the art of haptic assistive technologies for vision and audition sensory disabilities. Sensory substitution systems for visual and hearing disabilities have a central role in the transmission of information for patients with sensory impairments, enabling users to interact with the non-disabled community in daily activities. Visual and auditory inputs are converted into haptic feedback via different actuation technologies. The information is presented in the form of static or dynamic stimulation of the skin. Their effectiveness and ease of use make haptic sensory substitution systems suitable for patients with different levels of disability. They constitute a cheaper and less invasive alternative to implantable partial sensory restitution systems. Future research is oriented towards the optimization of the stimulation parameters, together with the development of miniaturized, custom-designed and low-cost aids operating in synergy in networks, aiming to increase patients' acceptance of these technologies.

  19. Topographic contribution of early visual cortex to short-term memory consolidation: a transcranial magnetic stimulation study.

    PubMed

    van de Ven, Vincent; Jacobs, Christianne; Sack, Alexander T

    2012-01-04

    The neural correlates for retention of visual information in visual short-term memory are considered separate from those of sensory encoding. However, recent findings suggest that sensory areas may play a role also in short-term memory. We investigated the functional relevance, spatial specificity, and temporal characteristics of human early visual cortex in the consolidation of capacity-limited topographic visual memory using transcranial magnetic stimulation (TMS). Topographically specific TMS pulses were delivered over lateralized occipital cortex at 100, 200, or 400 ms into the retention phase of a modified change detection task with low or high memory loads. For the high but not the low memory load, we found decreased memory performance for memory trials in the visual field contralateral, but not ipsilateral to the side of TMS, when pulses were delivered at 200 ms into the retention interval. A behavioral version of the TMS experiment, in which a distractor stimulus (memory mask) replaced the TMS pulses, further corroborated these findings. Our findings suggest that retinotopic visual cortex contributes to the short-term consolidation of topographic visual memory during early stages of the retention of visual information. Further, TMS-induced interference decreased the strength (amplitude) of the memory representation, which most strongly affected the high memory load trials.

  20. Laminar and cytoarchitectonic features of the cerebral cortex in the Risso's dolphin (Grampus griseus), striped dolphin (Stenella coeruleoalba), and bottlenose dolphin (Tursiops truncatus)

    PubMed Central

    Furutani, Rui

    2008-01-01

    The present investigation carried out Nissl, Klüver-Barrera, and Golgi studies of the cerebral cortex in three distinct genera of oceanic dolphins (Risso's dolphin, striped dolphin and bottlenose dolphin) to identify and classify cortical laminar and cytoarchitectonic structures in four distinct functional areas, including primary motor (M1), primary sensory (S1), primary visual (V1), and primary auditory (A1) cortices. The laminar and cytoarchitectonic organization of each of these cortical areas was similar among the three dolphin species. M1 was visualized as a five-layer structure that included the molecular layer (layer I), external granular layer (layer II), external pyramidal layer (layer III), internal pyramidal layer (layer V), and fusiform layer (layer VI). The internal granular layer was absent. The cetacean sensory-related cortical areas S1, V1, and A1 were also found to have a five-layer organization comprising layers I, II, III, V and VI. In particular, A1 was characterized by the broadest layer I and layer II, and a developed band of pyramidal neurons in layers III (sublayers IIIa, IIIb and IIIc) and V. A patch organization consisting of layer IIIb pyramidal neurons was detected in S1 and V1, but not in A1. The laminar patterns of V1 and S1 were similar, but the cytoarchitectonic structures of the two areas were different. V1 was characterized by a broader layer II than that of S1, and also contained specialized pyramidal and multipolar stellate neurons in layers III and V. PMID:18625031

  1. Development of a Bayesian Estimator for Audio-Visual Integration: A Neurocomputational Study

    PubMed Central

    Ursino, Mauro; Crisafulli, Andrea; di Pellegrino, Giuseppe; Magosso, Elisa; Cuppini, Cristiano

    2017-01-01

    The brain integrates information from different sensory modalities to generate a coherent and accurate percept of external events. Several experimental studies suggest that this integration follows the principle of Bayesian estimation. However, the neural mechanisms responsible for this behavior, and its development in a multisensory environment, are still insufficiently understood. We recently presented a neural network model of audio-visual integration (Neural Computation, 2017) to investigate how a Bayesian estimator can spontaneously develop from the statistics of external stimuli. The model assumes the presence of two topologically organized unimodal areas (auditory and visual). Neurons in each area receive an input from the external environment, computed as the inner product of the sensory-specific stimulus and the receptive field synapses, and a cross-modal input from neurons of the other modality. Based on sensory experience, synapses were trained via Hebbian potentiation and a decay term. The aim of this work is to improve the previous model by including a more realistic distribution of visual stimuli: visual stimuli have higher spatial accuracy at the central azimuthal coordinate and lower accuracy at the periphery. Moreover, their prior probability is higher at the center, and decreases toward the periphery. Simulations show that, after training, the receptive fields of visual and auditory neurons shrink to reproduce the accuracy of the input (both at the center and at the periphery in the visual case), thus realizing the likelihood estimate of unimodal spatial position. Moreover, the preferred positions of visual neurons contract toward the center, thus encoding the prior probability of the visual input. Finally, a prior probability of the co-occurrence of audio-visual stimuli is encoded in the cross-modal synapses. The model is able to simulate the main properties of a Bayesian estimator and to reproduce behavioral data in all conditions examined. In particular, in unisensory conditions the visual estimates exhibit a bias toward the fovea, which increases with the level of noise. In cross-modal conditions, the SD of the estimates decreases when using congruent audio-visual stimuli, and a ventriloquism effect becomes evident in the case of spatially disparate stimuli. Moreover, the ventriloquism effect decreases with eccentricity. PMID:29046631
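    The training rule summarized in this record (feedforward input as an inner product of the stimulus with receptive-field synapses, plus Hebbian potentiation with a decay term) can be sketched in a few lines of Python. The network layout, learning-rate and decay values, activation function, and activity-gated form of the decay are assumptions for illustration and do not reproduce the published model or its cross-modal synapses.

      # Minimal sketch of the training rule summarized above: feedforward drive computed as
      # an inner product of the stimulus with receptive-field synapses, and a Hebbian update
      # with a decay term. The layout, learning rate, activation function, and activity-gated
      # decay are assumptions, not the published model.
      import numpy as np

      N_POS = 100                     # spatial positions (azimuth bins)
      rng = np.random.default_rng(0)

      def gaussian_stimulus(center, sigma):
          # External stimulus: a Gaussian bump of activity over input positions.
          x = np.arange(N_POS)
          return np.exp(-0.5 * ((x - center) / sigma) ** 2)

      def train_area(n_epochs=2000, lr=0.02, decay=0.02, stim_sigma=4.0):
          # One unimodal area: weights[neuron, input position] are its receptive-field synapses.
          w = 0.1 * rng.random((N_POS, N_POS))
          for _ in range(n_epochs):
              s = gaussian_stimulus(rng.integers(0, N_POS), stim_sigma)
              u = w @ s                            # inner product: feedforward input to each neuron
              r = np.maximum(u - u.mean(), 0.0)    # simple rectified activation (an assumption)
              w += lr * np.outer(r, s) - decay * w * r[:, None]   # Hebbian term minus gated decay
              np.clip(w, 0.0, 1.0, out=w)
          return w

      if __name__ == "__main__":
          w_visual = train_area(stim_sigma=3.0)    # narrower, more accurate input
          w_auditory = train_area(stim_sigma=8.0)  # broader, less accurate input
          print("trained receptive-field matrices:", w_visual.shape, w_auditory.shape)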

  2. Emotional Intelligence Levels of Students with Sensory Impairment

    ERIC Educational Resources Information Center

    Al-Tal, Suhair; AL-Jawaldeh, Fuad; AL-Taj, Heyam; Maharmeh, Lina

    2017-01-01

    This study aimed at revealing the emotional intelligence levels of students with sensory disabilities in Amman, Jordan. The participants of the study were 200 students: 140 hearing-impaired students and 60 visually impaired students enrolled in the special education schools and centers for the academic year 2016-2017. The study adopted the…

  3. Feedback control of one's own action: Self-other sensory attribution in motor control.

    PubMed

    Asai, Tomohisa

    2015-12-15

    The sense of agency, the subjective experience of controlling one's own action, has an important function in motor control. When we move our own body or even external tools, we attribute that movement to ourselves and utilize that sensory information in order to correct "our own" movement in theory. The dynamic relationship between conscious self-other attribution and feedback control, however, is still unclear. Participants were required to make a sinusoidal reaching movement and received its visual feedback (i.e., cursor). When participants received a fake movement that was spatio-temporally close to their actual movement, illusory self-attribution of the fake movement was observed. In this situation, since participants tried to control the cursor but it was impossible to do so, the movement error was increased (Experiment 1). However, when the visual feedback was reduced to make self-other attribution difficult, there was no further increase in the movement error (Experiment 2). These results indicate that conscious self-other sensory attribution might coordinate sensory input and motor output. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Impairments of Multisensory Integration and Cross-Sensory Learning as Pathways to Dyslexia

    PubMed Central

    Hahn, Noemi; Foxe, John J.; Molholm, Sophie

    2014-01-01

    Two sensory systems are intrinsic to learning to read. Written words enter the brain through the visual system and associated sounds through the auditory system. The task before the beginning reader is quite basic. She must learn correspondences between orthographic tokens and phonemic utterances, and she must do this to the point that there is seamless automatic ‘connection’ between these sensorially distinct units of language. It is self-evident then that learning to read requires formation of cross-sensory associations to the point that deeply encoded multisensory representations are attained. While the majority of individuals manage this task to a high degree of expertise, some struggle to attain even rudimentary capabilities. Why do dyslexic individuals, who learn well in myriad other domains, fail at this particular task? Here, we examine the literature as it pertains to multisensory processing in dyslexia. We find substantial support for multisensory deficits in dyslexia, and make the case that to fully understand its neurological basis, it will be necessary to thoroughly probe the integrity of auditory-visual integration mechanisms. PMID:25265514

  5. Sensory over-responsivity in adults with autism spectrum conditions.

    PubMed

    Tavassoli, Teresa; Miller, Lucy J; Schoen, Sarah A; Nielsen, Darci M; Baron-Cohen, Simon

    2014-05-01

    Anecdotal reports and empirical evidence suggest that sensory processing issues are a key feature of autism spectrum conditions. This study set out to investigate whether adults with autism spectrum conditions report more sensory over-responsivity than adults without autism spectrum conditions. Another goal of the study was to identify whether autistic traits in adults with and without autism spectrum conditions were associated with sensory over-responsivity. Adults with (n = 221) and without (n = 181) autism spectrum conditions participated in an online survey. The Autism Spectrum Quotient, the Raven Matrices and the Sensory Processing Scale were used to characterize the sample. Adults with autism spectrum conditions reported more sensory over-responsivity than control participants across various sensory domains (visual, auditory, tactile, olfactory, gustatory and proprioceptive). Sensory over-responsivity correlated positively with autistic traits (Autism Spectrum Quotient) at a significant level across groups and within groups. Adults with autism spectrum conditions experience sensory over-responsivity to daily sensory stimuli to a high degree. A positive relationship exists between sensory over-responsivity and autistic traits. Understanding sensory over-responsivity and ways of measuring it in adults with autism spectrum conditions has implications for research and clinical settings.

  6. Semantics of the visual environment encoded in parahippocampal cortex

    PubMed Central

    Bonner, Michael F.; Price, Amy Rose; Peelle, Jonathan E.; Grossman, Murray

    2016-01-01

    Semantic representations capture the statistics of experience and store this information in memory. A fundamental component of this memory system is knowledge of the visual environment, including knowledge of objects and their associations. Visual semantic information underlies a range of behaviors, from perceptual categorization to cognitive processes such as language and reasoning. Here we examine the neuroanatomic system that encodes visual semantics. Across three experiments, we found converging evidence indicating that knowledge of verbally mediated visual concepts relies on information encoded in a region of the ventral-medial temporal lobe centered on parahippocampal cortex. In an fMRI study, this region was strongly engaged by the processing of concepts relying on visual knowledge but not by concepts relying on other sensory modalities. In a study of patients with the semantic variant of primary progressive aphasia (semantic dementia), atrophy that encompassed this region was associated with a specific impairment in verbally mediated visual semantic knowledge. Finally, in a structural study of healthy adults from the fMRI experiment, gray matter density in this region related to individual variability in the processing of visual concepts. The anatomic location of these findings aligns with recent work linking the ventral-medial temporal lobe with high-level visual representation, contextual associations, and reasoning through imagination. Together this work suggests a critical role for parahippocampal cortex in linking the visual environment with knowledge systems in the human brain. PMID:26679216

  7. Semantics of the Visual Environment Encoded in Parahippocampal Cortex.

    PubMed

    Bonner, Michael F; Price, Amy Rose; Peelle, Jonathan E; Grossman, Murray

    2016-03-01

    Semantic representations capture the statistics of experience and store this information in memory. A fundamental component of this memory system is knowledge of the visual environment, including knowledge of objects and their associations. Visual semantic information underlies a range of behaviors, from perceptual categorization to cognitive processes such as language and reasoning. Here we examine the neuroanatomic system that encodes visual semantics. Across three experiments, we found converging evidence indicating that knowledge of verbally mediated visual concepts relies on information encoded in a region of the ventral-medial temporal lobe centered on parahippocampal cortex. In an fMRI study, this region was strongly engaged by the processing of concepts relying on visual knowledge but not by concepts relying on other sensory modalities. In a study of patients with the semantic variant of primary progressive aphasia (semantic dementia), atrophy that encompassed this region was associated with a specific impairment in verbally mediated visual semantic knowledge. Finally, in a structural study of healthy adults from the fMRI experiment, gray matter density in this region related to individual variability in the processing of visual concepts. The anatomic location of these findings aligns with recent work linking the ventral-medial temporal lobe with high-level visual representation, contextual associations, and reasoning through imagination. Together, this work suggests a critical role for parahippocampal cortex in linking the visual environment with knowledge systems in the human brain.

  8. Effects of aging on perception of motion

    NASA Astrophysics Data System (ADS)

    Kaur, Manpreet; Wilder, Joseph; Hung, George; Julesz, Bela

    1997-09-01

    Driving requires two basic visual components: 'visual sensory function' and 'higher order skills.' Among the elderly, it has been observed that when attention must be divided in the presence of multiple objects, attentional skills and relational processes are markedly impaired, along with impairment of basic visual sensory function. A high frame rate imaging system was developed to assess the elderly driver's ability to locate and distinguish computer-generated images of vehicles and to determine their direction of motion in a simulated intersection. Preliminary experiments were performed at varying target speeds and angular displacements to study the effect of these parameters on motion perception. Results for subjects in four different age groups, ranging from mid-twenties to mid-sixties, show significantly better performance for the younger subjects as compared to the older ones.

  9. Tactile discrimination activates the visual cortex of the recently blind naive to Braille: a functional magnetic resonance imaging study in humans.

    PubMed

    Sadato, Norihiro; Okada, Tomohisa; Kubota, Kiyokazu; Yonekura, Yoshiharu

    2004-04-08

    The occipital cortex of blind subjects is known to be activated during tactile discrimination tasks such as Braille reading. To investigate whether this is due to long-term learning of Braille or to sensory deafferentation, we used fMRI to study tactile discrimination tasks in subjects who had recently lost their sight and never learned Braille. The occipital cortex of the blind subjects without Braille training was activated during the tactile discrimination task, whereas that of control sighted subjects was not. This finding suggests that the activation of the visual cortex of the blind during performance of a tactile discrimination task may be due to sensory deafferentation, wherein a competitive imbalance favors the tactile over the visual modality.

  10. Areas activated during naturalistic reading comprehension overlap topological visual, auditory, and somatomotor maps

    PubMed Central

    2016-01-01

    Abstract Cortical mapping techniques using fMRI have been instrumental in identifying the boundaries of topological (neighbor‐preserving) maps in early sensory areas. The presence of topological maps beyond early sensory areas raises the possibility that they might play a significant role in other cognitive systems, and that topological mapping might help to delineate areas involved in higher cognitive processes. In this study, we combine surface‐based visual, auditory, and somatomotor mapping methods with a naturalistic reading comprehension task in the same group of subjects to provide a qualitative and quantitative assessment of the cortical overlap between sensory‐motor maps in all major sensory modalities, and reading processing regions. Our results suggest that cortical activation during naturalistic reading comprehension overlaps more extensively with topological sensory‐motor maps than has been heretofore appreciated. Reading activation in regions adjacent to occipital lobe and inferior parietal lobe almost completely overlaps visual maps, whereas a significant portion of frontal activation for reading in dorsolateral and ventral prefrontal cortex overlaps both visual and auditory maps. Even classical language regions in superior temporal cortex are partially overlapped by topological visual and auditory maps. By contrast, the main overlap with somatomotor maps is restricted to a small region on the anterior bank of the central sulcus near the border between the face and hand representations of M‐I. Hum Brain Mapp 37:2784–2810, 2016. © 2016 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc. PMID:27061771

  11. Sensory Contributions to Impaired Emotion Processing in Schizophrenia

    PubMed Central

    Butler, Pamela D.; Abeles, Ilana Y.; Weiskopf, Nicole G.; Tambini, Arielle; Jalbrzikowski, Maria; Legatt, Michael E.; Zemon, Vance; Loughead, James; Gur, Ruben C.; Javitt, Daniel C.

    2009-01-01

    Both emotion and visual processing deficits are documented in schizophrenia, and preferential magnocellular visual pathway dysfunction has been reported in several studies. This study examined the contribution to emotion-processing deficits of magnocellular and parvocellular visual pathway function, based on stimulus properties and shape of contrast response functions. Experiment 1 examined the relationship between contrast sensitivity to magnocellular- and parvocellular-biased stimuli and emotion recognition using the Penn Emotion Recognition (ER-40) and Emotion Differentiation (EMODIFF) tests. Experiment 2 altered the contrast levels of the faces themselves to determine whether emotion detection curves would show a pattern characteristic of magnocellular neurons and whether patients would show a deficit in performance related to early sensory processing stages. Results for experiment 1 showed that patients had impaired emotion processing and a preferential magnocellular deficit on the contrast sensitivity task. Greater deficits in ER-40 and EMODIFF performance correlated with impaired contrast sensitivity to the magnocellular-biased condition, which remained significant for the EMODIFF task even when nonspecific correlations due to group were considered in a step-wise regression. Experiment 2 showed contrast response functions indicative of magnocellular processing for both groups, with patients showing impaired performance. Impaired emotion identification on this task was also correlated with magnocellular-biased visual sensory processing dysfunction. These results provide evidence for a contribution of impaired early-stage visual processing in emotion recognition deficits in schizophrenia and suggest that a bottom-up approach to remediation may be effective. PMID:19793797

  12. Training to Facilitate Adaptation to Novel Sensory Environments

    NASA Technical Reports Server (NTRS)

    Bloomberg, J. J.; Peters, B. T.; Mulavara, A. P.; Brady, R. A.; Batson, C. D.; Ploutz-Snyder, R. J.; Cohen, H. S.

    2010-01-01

    After spaceflight, the process of readapting to Earth's gravity causes locomotor dysfunction. We are developing a gait training countermeasure to facilitate adaptive responses in locomotor function. Our training system comprises a treadmill placed on a motion base facing a virtual visual scene, providing an unstable walking surface combined with incongruent visual flow designed to train subjects to rapidly adapt their gait patterns to changes in the sensory environment. The goal of our present study was to determine if training improved both the locomotor and dual-tasking responses to a novel sensory environment and to quantify the retention of training. Subjects completed three 30-minute training sessions during which they walked on the treadmill while receiving discordant support surface and visual input. Control subjects walked on the treadmill without any support surface or visual alterations. To determine the efficacy of training, all subjects were then tested using a novel visual flow and support surface movement not previously experienced during training. This test was performed 20 minutes, 1 week, and 1, 3, and 6 months after the final training session. Stride frequency and auditory reaction time were collected as measures of postural stability and cognitive effort, respectively. Subjects who received training showed less alteration in stride frequency and auditory reaction time compared to controls. Trained subjects maintained their level of performance over 6 months. We conclude that, with training, individuals became more proficient at walking in novel discordant sensorimotor conditions and were able to devote more attention to competing tasks.

  13. Neural correlates of tactile perception during pre-, peri-, and post-movement.

    PubMed

    Juravle, Georgiana; Heed, Tobias; Spence, Charles; Röder, Brigitte

    2016-05-01

    Tactile information is differentially processed over the various phases of goal-directed movements. Here, event-related potentials (ERPs) were used to investigate the neural correlates of tactile and visual information processing during movement. Participants performed goal-directed reaches for an object placed centrally on the table in front of them. Tactile and visual stimulation (100 ms) was presented in separate trials during the different phases of the movement (i.e. preparation, execution, and post-movement). These stimuli were independently delivered to either the moving or resting hand. In a control condition, the participants only performed the movement, while omission (i.e. movement-only) ERPs were recorded. Participants were instructed to ignore the presence or absence of any sensory events and to concentrate solely on the execution of the movement. Enhanced ERPs were observed 80-200 ms after tactile stimulation, as well as 100-250 ms after visual stimulation: These modulations were greatest during the execution of the goal-directed movement, and they were effector based (i.e. significantly more negative for stimuli presented to the moving hand). Furthermore, ERPs revealed enhanced sensory processing during goal-directed movements for visual stimuli as well. Such enhanced processing of both tactile and visual information during the execution phase suggests that incoming sensory information is continuously monitored for a potential adjustment of the current motor plan. Furthermore, the results reported here also highlight a tight coupling between spatial attention and the execution of motor actions.

  14. Sensory contributions to impaired emotion processing in schizophrenia.

    PubMed

    Butler, Pamela D; Abeles, Ilana Y; Weiskopf, Nicole G; Tambini, Arielle; Jalbrzikowski, Maria; Legatt, Michael E; Zemon, Vance; Loughead, James; Gur, Ruben C; Javitt, Daniel C

    2009-11-01

    Both emotion and visual processing deficits are documented in schizophrenia, and preferential magnocellular visual pathway dysfunction has been reported in several studies. This study examined the contribution to emotion-processing deficits of magnocellular and parvocellular visual pathway function, based on stimulus properties and shape of contrast response functions. Experiment 1 examined the relationship between contrast sensitivity to magnocellular- and parvocellular-biased stimuli and emotion recognition using the Penn Emotion Recognition (ER-40) and Emotion Differentiation (EMODIFF) tests. Experiment 2 altered the contrast levels of the faces themselves to determine whether emotion detection curves would show a pattern characteristic of magnocellular neurons and whether patients would show a deficit in performance related to early sensory processing stages. Results for experiment 1 showed that patients had impaired emotion processing and a preferential magnocellular deficit on the contrast sensitivity task. Greater deficits in ER-40 and EMODIFF performance correlated with impaired contrast sensitivity to the magnocellular-biased condition, which remained significant for the EMODIFF task even when nonspecific correlations due to group were considered in a step-wise regression. Experiment 2 showed contrast response functions indicative of magnocellular processing for both groups, with patients showing impaired performance. Impaired emotion identification on this task was also correlated with magnocellular-biased visual sensory processing dysfunction. These results provide evidence for a contribution of impaired early-stage visual processing in emotion recognition deficits in schizophrenia and suggest that a bottom-up approach to remediation may be effective.
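    The 'contrast response function' analyses referenced above are commonly described with the hyperbolic-ratio (Naka-Rushton) form R(c) = Rmax * c^n / (c^n + c50^n). The sketch below uses illustrative parameter values to contrast a steeply rising, early-saturating (magnocellular-like) curve with a more linear (parvocellular-like) one; the specific values are assumptions, not those estimated for the patient or control groups.

      # Minimal sketch of a contrast response function using the standard hyperbolic-ratio
      # (Naka-Rushton) form R(c) = r_max * c**n / (c**n + c50**n). Parameter values below are
      # illustrative only: the "magno-like" curve rises steeply and saturates at low contrast,
      # while the "parvo-like" curve grows more gradually across the contrast range.
      def contrast_response(c, r_max=1.0, c50=0.10, n=2.0):
          # Hyperbolic-ratio contrast response function.
          return r_max * c ** n / (c ** n + c50 ** n)

      if __name__ == "__main__":
          contrasts = [0.02, 0.05, 0.10, 0.20, 0.40, 0.80]
          for c in contrasts:
              magno = contrast_response(c, c50=0.10, n=2.0)   # saturating, high contrast gain
              parvo = contrast_response(c, c50=0.80, n=1.2)   # quasi-linear over this range
              print("contrast %.2f: magno-like %.2f, parvo-like %.2f" % (c, magno, parvo))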

  15. Impaired downregulation of visual cortex during auditory processing is associated with autism symptomatology in children and adolescents with autism spectrum disorder.

    PubMed

    Jao Keehn, R Joanne; Sanchez, Sandra S; Stewart, Claire R; Zhao, Weiqi; Grenesko-Stevens, Emily L; Keehn, Brandon; Müller, Ralph-Axel

    2017-01-01

    Autism spectrum disorders (ASD) are pervasive developmental disorders characterized by impairments in language development and social interaction, along with restricted and stereotyped behaviors. These behaviors often include atypical responses to sensory stimuli; some children with ASD are easily overwhelmed by sensory stimuli, while others may seem unaware of their environment. Vision and audition are two sensory modalities important for social interactions and language, and are differentially affected in ASD. In the present study, 16 children and adolescents with ASD and 16 typically developing (TD) participants matched for age, gender, nonverbal IQ, and handedness were tested using a mixed event-related/blocked functional magnetic resonance imaging paradigm to examine basic perceptual processes that may form the foundation for later-developing cognitive abilities. Auditory (high or low pitch) and visual conditions (dot located high or low in the display) were presented, and participants indicated whether the stimuli were "high" or "low." Results for the auditory condition showed downregulated activity of the visual cortex in the TD group, but upregulation in the ASD group. This atypical activity in visual cortex was associated with autism symptomatology. These findings suggest atypical crossmodal (auditory-visual) modulation linked to sociocommunicative deficits in ASD, in agreement with the general hypothesis of low-level sensorimotor impairments affecting core symptomatology. Autism Res 2017, 10: 130-143. © 2016 International Society for Autism Research, Wiley Periodicals, Inc.

  16. Freezing of Gait in Parkinson's Disease: An Overload Problem?

    PubMed

    Beck, Eric N; Ehgoetz Martens, Kaylena A; Almeida, Quincy J

    2015-01-01

    Freezing of gait (FOG) is arguably the most severe symptom associated with Parkinson's disease (PD), and often occurs while performing dual tasks or approaching narrowed and cluttered spaces. While it is well known that visual cues alleviate FOG, it is not clear if this effect may be the result of cognitive or sensorimotor mechanisms. Nevertheless, the role of vision may be a critical link that might allow us to disentangle this question. Gaze behaviour has yet to be carefully investigated while freezers approach narrow spaces; thus, the overall objective of this study was to explore the interaction between cognitive and sensory-perceptual influences on FOG. In experiment #1, if cognitive load is the underlying factor leading to FOG, then one might expect that a dual-task would elicit FOG episodes even in the presence of visual cues, since the load on attention would interfere with utilization of visual cues. Alternatively, if visual cues alleviate gait despite performance of a dual-task, then it may be more probable that sensory mechanisms are at play. To complement this, the aim of experiment #2 was to further challenge the sensory systems, by removing vision of the lower limbs and thereby forcing participants to rely on other forms of sensory feedback rather than vision while walking toward the narrow space. Spatiotemporal aspects of gait, percentage of gaze fixation frequency and duration, as well as skin conductance levels were measured in freezers and non-freezers across both experiments. Results from experiment #1 indicated that although freezers and non-freezers both walked with worse gait while performing the dual-task, in freezers, gait was relieved by visual cues regardless of whether the cognitive demands of the dual-task were present. At baseline and while dual-tasking, freezers demonstrated a gaze behaviour that neglected the doorway and instead focused primarily on the pathway, a strategy that non-freezers adopted only when performing the dual-task. Interestingly, with the combination of visual cues and dual-task, freezers increased the frequency and duration of fixations toward the doorway, compared to non-freezers. These results suggest that although increasing demand on attention does significantly deteriorate gait in freezers, an increase in cognitive demand is not exclusively responsible for freezing (since visual cues were able to overcome any interference elicited by the dual-task). When vision of the lower limbs was removed in experiment #2, only the freezers' gait was affected. However, when visual cues were present, freezers' gait improved regardless of the dual-task. This gait behaviour was accompanied by a greater amount of time spent looking at the visual cues irrespective of the dual-task. Since removing vision of the lower limbs hindered gait even under low attentional demand, restricted sensory feedback may be an important factor in the mechanisms underlying FOG.

  17. Freezing of Gait in Parkinson’s Disease: An Overload Problem?

    PubMed Central

    Beck, Eric N.; Ehgoetz Martens, Kaylena A.; Almeida, Quincy J.

    2015-01-01

    Freezing of gait (FOG) is arguably the most severe symptom associated with Parkinson’s disease (PD), and often occurs while performing dual tasks or approaching narrowed and cluttered spaces. While it is well known that visual cues alleviate FOG, it is not clear whether this effect is the result of cognitive or sensorimotor mechanisms. Nevertheless, the role of vision may be a critical link that might allow us to disentangle this question. Gaze behaviour has yet to be carefully investigated while freezers approach narrow spaces; thus, the overall objective of this study was to explore the interaction between cognitive and sensory-perceptual influences on FOG. In experiment #1, if cognitive load is the underlying factor leading to FOG, then one might expect that a dual-task would elicit FOG episodes even in the presence of visual cues, since the load on attention would interfere with utilization of visual cues. Alternatively, if visual cues alleviate gait despite performance of a dual-task, then it may be more probable that sensory mechanisms are at play. To complement this, the aim of experiment #2 was to further challenge the sensory systems, by removing vision of the lower limbs and thereby forcing participants to rely on other forms of sensory feedback rather than vision while walking toward the narrow space. Spatiotemporal aspects of gait, percentage of gaze fixation frequency and duration, as well as skin conductance levels were measured in freezers and non-freezers across both experiments. Results from experiment #1 indicated that although freezers and non-freezers both walked with worse gait while performing the dual-task, in freezers, gait was relieved by visual cues regardless of whether the cognitive demands of the dual-task were present. At baseline and while dual-tasking, freezers demonstrated a gaze behaviour that neglected the doorway and instead focused primarily on the pathway, a strategy that non-freezers adopted only when performing the dual-task. Interestingly, with the combination of visual cues and dual-task, freezers increased the frequency and duration of fixations toward the doorway, compared to non-freezers. These results suggest that although increasing demand on attention does significantly worsen gait in freezers, an increase in cognitive demand is not exclusively responsible for freezing (since visual cues were able to overcome any interference elicited by the dual-task). When vision of the lower limbs was removed in experiment #2, only the freezers’ gait was affected. However, when visual cues were present, freezers’ gait improved regardless of the dual-task. This gait behaviour was accompanied by a greater amount of time spent looking at the visual cues irrespective of the dual-task. Since removing vision of the lower limbs hindered gait even under low attentional demand, restricted sensory feedback may be an important factor in the mechanisms underlying FOG. PMID:26678262

  18. Mosaic and Concerted Evolution in the Visual System of Birds

    PubMed Central

    Gutiérrez-Ibáñez, Cristián; Iwaniuk, Andrew N.; Moore, Bret A.; Fernández-Juricic, Esteban; Corfield, Jeremy R.; Krilow, Justin M.; Kolominsky, Jeffrey; Wylie, Douglas R.

    2014-01-01

    Two main models have been proposed to explain how the relative size of neural structures varies through evolution. In the mosaic evolution model, individual brain structures vary in size independently of each other, whereas in the concerted evolution model developmental constraints result in different parts of the brain varying in size in a coordinated manner. Several studies have shown variation of the relative size of individual nuclei in the vertebrate brain, but it is currently not known if nuclei belonging to the same functional pathway vary independently of each other or in a concerted manner. The visual system of birds offers an ideal opportunity to specifically test which of the two models applies to an entire sensory pathway. Here, we examine the relative size of 9 different visual nuclei across 98 species of birds. This includes data on interspecific variation in the cytoarchitecture and relative size of the isthmal nuclei, which has not been previously reported. We also use a combination of statistical analyses, including phylogenetically corrected principal component analysis and evolutionary rates of change of the absolute and relative sizes of the nine nuclei, to test whether the visual nuclei evolved in a concerted or mosaic manner. Our results strongly indicate a combination of mosaic and concerted evolution in the relative sizes of the nine nuclei within the avian visual system. Specifically, the relative size of the isthmal nuclei and parts of the tectofugal pathway covary across species in a concerted fashion, whereas the relative volumes of the other visual nuclei measured vary independently of one another, as predicted by the mosaic model. Our results suggest that the covariation of different neural structures depends not only on the functional connectivity of each nucleus, but also on the diversity of afferents and efferents of each nucleus. PMID:24621573

  19. Motor Imagery Learning Modulates Functional Connectivity of Multiple Brain Systems in Resting State

    PubMed Central

    Zhang, Hang; Long, Zhiying; Ge, Ruiyang; Xu, Lele; Jin, Zhen; Yao, Li; Liu, Yijun

    2014-01-01

    Background Learning motor skills involves subsequent modulation of resting-state functional connectivity in the sensory-motor system. This idea was mostly derived from investigations of motor execution learning, which mainly recruits the processing of sensory-motor information. Behavioral evidence has demonstrated that motor skills in our daily lives can be learned through imagery procedures. However, it remains unclear whether the modulation of resting-state functional connectivity also exists in the sensory-motor system after motor imagery learning. Methodology/Principal Findings We performed an fMRI investigation of motor imagery learning in the resting state. Based on previous studies, we identified eight sensory and cognitive resting-state networks (RSNs) corresponding to the brain systems and further explored the functional connectivity of these RSNs through two assessments, connectivity strength and network strength, before and after two weeks of consecutive learning. Two intriguing results were revealed: (1) the sensory RSNs, specifically the sensory-motor and lateral visual networks, exhibited greater connectivity strengths in the precuneus and fusiform gyrus after learning; (2) decreased network strength induced by learning was observed in the default mode network, a cognitive RSN. Conclusions/Significance These results indicated that resting-state functional connectivity can be modulated by motor imagery learning in multiple brain systems, and the modulation displayed in the sensory-motor, visual, and default brain systems may be associated with the establishment of motor schemas and the regulation of introspective thought. These findings further reveal the neural substrates underlying motor skill learning and potentially provide new insights into the therapeutic benefits of motor imagery learning. PMID:24465577

  20. Aging and response interference across sensory modalities.

    PubMed

    Guerreiro, Maria J S; Adam, Jos J; Van Gerven, Pascal W M

    2014-06-01

    Advancing age is associated with decrements in selective attention. It was recently hypothesized that age-related differences in selective attention depend on sensory modality. The goal of the present study was to investigate the role of sensory modality in age-related vulnerability to distraction, using a response interference task. To this end, 16 younger (mean age = 23.1 years) and 24 older (mean age = 65.3 years) adults performed four response interference tasks, involving all combinations of visual and auditory targets and distractors. The results showed that response interference effects differ across sensory modalities, but not across age groups. These results indicate that sensory modality plays an important role in vulnerability to distraction, but not in age-related distractibility by irrelevant spatial information.

  1. Successful tactile based visual sensory substitution use functions independently of visual pathway integrity

    PubMed Central

    Lee, Vincent K.; Nau, Amy C.; Laymon, Charles; Chan, Kevin C.; Rosario, Bedda L.; Fisher, Chris

    2014-01-01

    Purpose: Neuronal reorganization after blindness is of critical interest because it has implications for the rational prescription of artificial vision devices. The purpose of this study was to distinguish the microstructural differences between perinatally blind (PB), acquired blind (AB), and normally sighted controls (SCs) and relate these differences to performance on functional tasks using a sensory substitution device (BrainPort). Methods: We enrolled 52 subjects (PB n = 11; AB n = 35; SC n = 6). All subjects spent 15 h undergoing BrainPort device training. Outcomes of light perception, motion, direction, temporal resolution, grating, and acuity were tested at baseline and after training. Twenty-six of the subjects were scanned with a three Tesla MRI scanner for diffusion tensor imaging (DTI), and with a positron emission tomography (PET) scanner for mapping regional brain glucose consumption during sensory substitution function. Non-parametric models were used to analyze fractional anisotropy (FA; a DTI measure of microstructural integrity) of the brain via region-of-interest (ROI) analysis and tract-based spatial statistics (TBSS). Results: At baseline, all subjects performed all tasks at chance level. After training, light perception, temporal resolution, location and grating acuity tasks improved significantly for all subject groups. ROI and TBSS analyses of FA maps show areas of statistically significant differences (p ≤ 0.025) in the bilateral optic radiations and some visual association connections between all three groups. No relationship was found between FA and functional performance with the BrainPort. Discussion: All subjects showed performance improvements using the BrainPort irrespective of nature and duration of blindness. Definite brain areas with significant microstructural integrity changes exist among PB, AB, and SC, and these variations are most pronounced in the visual pathways. However, the use of sensory substitution devices is feasible irrespective of microstructural integrity of the primary visual pathways between the eye and the brain. Therefore, tongue-based devices may be usable for a broad array of non-sighted patients. PMID:24860473
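
    Fractional anisotropy, the DTI measure analyzed above, is computed from the eigenvalues of the fitted diffusion tensor. A minimal sketch of that standard formula (Python; the example eigenvalues and the function name are illustrative, not taken from the study):

        import numpy as np

        def fractional_anisotropy(l1, l2, l3):
            # Standard FA formula from the three diffusion-tensor eigenvalues:
            # 0 for isotropic diffusion, approaching 1 when diffusion is strongly
            # directional along a coherent fiber bundle.
            num = np.sqrt((l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2)
            den = np.sqrt(2.0 * (l1 ** 2 + l2 ** 2 + l3 ** 2))
            return num / den

        # Hypothetical eigenvalues (10^-3 mm^2/s) for a white-matter voxel.
        print(fractional_anisotropy(1.7, 0.3, 0.3))   # ~0.80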

  2. Neural correlates of auditory short-term memory in rostral superior temporal cortex

    PubMed Central

    Scott, Brian H.; Mishkin, Mortimer; Yin, Pingbo

    2014-01-01

    Summary Background Auditory short-term memory (STM) in the monkey is less robust than visual STM and may depend on a retained sensory trace, which is likely to reside in the higher-order cortical areas of the auditory ventral stream. Results We recorded from the rostral superior temporal cortex as monkeys performed serial auditory delayed-match-to-sample (DMS). A subset of neurons exhibited modulations of their firing rate during the delay between sounds, during the sensory response, or both. This distributed subpopulation carried a predominantly sensory signal modulated by the mnemonic context of the stimulus. Excitatory and suppressive effects on match responses were dissociable in their timing, and in their resistance to sounds intervening between the sample and match. Conclusions Like the monkeys’ behavioral performance, these neuronal effects differ from those reported in the same species during visual DMS, suggesting different neural mechanisms for retaining dynamic sounds and static images in STM. PMID:25456448

  3. Partitioning neuronal variability

    PubMed Central

    Goris, Robbe L.T.; Movshon, J. Anthony; Simoncelli, Eero P.

    2014-01-01

    Responses of sensory neurons differ across repeated measurements. This variability is usually treated as stochasticity arising within neurons or neural circuits. However, some portion of the variability arises from fluctuations in excitability due to factors that are not purely sensory, such as arousal, attention, and adaptation. To isolate these fluctuations, we developed a model in which spikes are generated by a Poisson process whose rate is the product of a drive that is sensory in origin, and a gain summarizing stimulus-independent modulatory influences on excitability. This model provides an accurate account of response distributions of visual neurons in macaque LGN, V1, V2, and MT, revealing that variability originates in large part from excitability fluctuations which are correlated over time and between neurons, and which increase in strength along the visual pathway. The model provides a parsimonious explanation for observed systematic dependencies of response variability and covariability on firing rate. PMID:24777419
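
    The product-of-drive-and-gain model described above lends itself to a compact simulation. A minimal sketch (Python): the stimulus-independent gain is drawn here from a mean-one gamma distribution, an illustrative assumption, since the abstract specifies only a multiplicative, stimulus-independent gain; the names and parameter values are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)

        def modulated_poisson_counts(drive, gain_sd=0.5, n_trials=10000):
            # Spike counts from a Poisson process whose rate is the product of a
            # fixed sensory drive and a trial-by-trial excitability gain.
            shape = 1.0 / gain_sd ** 2                       # gamma with mean 1, sd = gain_sd
            gain = rng.gamma(shape, scale=1.0 / shape, size=n_trials)
            return rng.poisson(drive * gain)

        counts = modulated_poisson_counts(drive=10.0)
        # Variance exceeds the mean (super-Poisson), and the excess grows with
        # firing rate, matching the systematic rate dependence noted above.
        print(counts.mean(), counts.var())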

  4. Sensory and non-sensory factors and the concept of externality in obese subjects.

    PubMed

    Gardner, R M; Brake, S J; Reyes, B; Maestas, D

    1983-08-01

    9 obese and 9 normal subjects performed a psychophysical task in which food- or non-food-related stimuli were briefly flashed tachistoscopically at a speed and intensity near the visual threshold. A signal was presented on one-half the trials and noise only on the other one-half of the trials. Using signal detection theory methodology, separate measures of sensory sensitivity (d') and response bias (beta) were calculated. No differences were noted between obese and normal subjects on measures of sensory sensitivity but significant differences on response bias. Obese subjects had consistently lower response criteria than normal ones. Analysis for subjects categorized by whether they were restrained or unrestrained eaters gave findings identical to those for obese and normal. The importance of using a methodology that separates sensory and non-sensory factors in research on obesity is discussed.
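
    The separation of sensory sensitivity (d') and response bias (beta) above follows the standard equal-variance Gaussian signal detection model. A minimal sketch of that computation (Python; the hit and false-alarm rates are hypothetical):

        import math
        from scipy.stats import norm

        def sdt_measures(hit_rate, fa_rate):
            # Equal-variance Gaussian SDT: d' indexes sensory sensitivity,
            # beta the likelihood-ratio response criterion (bias).
            z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
            d_prime = z_hit - z_fa
            beta = math.exp((z_fa ** 2 - z_hit ** 2) / 2.0)
            return d_prime, beta

        # A lower (more liberal) criterion raises both hits and false alarms,
        # shifting beta while leaving d' largely unchanged.
        print(sdt_measures(0.80, 0.30))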

  5. Wittgenstein on Köhler and Gestalt psychology: a critique.

    PubMed

    Pastore, N

    1991-10-01

    Wittgenstein's objections to Köhler and gestalt psychology are critically examined. Principal features of Köhler's Gestalt Psychology are discussed that are relevant to Wittgenstein's views. They include Köhler's concepts of "subjective" and "objective" experiences, "sensory organization," and "empiristic theory." Wittgenstein's objections, which focus on the concept of sensory organization, are examined. Wittgenstein employs the term "aspect," which is derived from the findings of gestalt psychology, as a replacement for Köhler's term "sensory organization." After tracing his uses of aspect, it is shown that aspect is a superordinate entity distinct from 'sensory content' (colors and shapes). This dualism of aspect and sensory content is of the same kind that prevailed in the empiristic theory of visual perception. Wittgenstein's adherence to the empiristic theory is discussed. Finally, the difference between Wittgenstein's aspect and Köhler's sensory organization is examined.

  6. Bottlenecks of Motion Processing during a Visual Glance: The Leaky Flask Model

    PubMed Central

    Öğmen, Haluk; Ekiz, Onur; Huynh, Duong; Bedell, Harold E.; Tripathy, Srimant P.

    2013-01-01

    Where do the bottlenecks for information and attention lie when our visual system processes incoming stimuli? The human visual system encodes the incoming stimulus and transfers its contents into three major memory systems with increasing time scales, viz., sensory (or iconic) memory, visual short-term memory (VSTM), and long-term memory (LTM). It is commonly believed that the major bottleneck of information processing resides in VSTM. In contrast to this view, we show major bottlenecks for motion processing prior to VSTM. In the first experiment, we examined bottlenecks at the stimulus encoding stage through a partial-report technique by delivering the cue immediately at the end of the stimulus presentation. In the second experiment, we varied the cue delay to investigate sensory memory and VSTM. Performance decayed exponentially as a function of cue delay and we used the time-constant of the exponential-decay to demarcate sensory memory from VSTM. We then decomposed performance in terms of quality and quantity measures to analyze bottlenecks along these dimensions. In terms of the quality of information, two thirds to three quarters of the motion-processing bottleneck occurs in stimulus encoding rather than memory stages. In terms of the quantity of information, the motion-processing bottleneck is distributed, with the stimulus-encoding stage accounting for one third of the bottleneck. The bottleneck for the stimulus-encoding stage is dominated by the selection compared to the filtering function of attention. We also found that the filtering function of attention is operating mainly at the sensory memory stage in a specific manner, i.e., influencing only quantity and sparing quality. These results provide a novel and more complete understanding of information processing and storage bottlenecks for motion processing. PMID:24391806
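
    The demarcation of sensory memory from VSTM by the decay time constant amounts to fitting an exponential to accuracy as a function of cue delay. A minimal sketch (Python; the delays, accuracies, and starting values are hypothetical, not the study's data):

        import numpy as np
        from scipy.optimize import curve_fit

        def decay(t, p0, baseline, tau):
            # Performance falls exponentially from p0 toward a VSTM baseline
            # with time constant tau.
            return baseline + (p0 - baseline) * np.exp(-t / tau)

        delays = np.array([0, 50, 150, 300, 600, 1000, 2000])            # ms
        accuracy = np.array([0.90, 0.82, 0.70, 0.60, 0.52, 0.50, 0.49])

        (p0_fit, base_fit, tau_fit), _ = curve_fit(decay, delays, accuracy,
                                                   p0=[0.9, 0.5, 200.0])
        # Delays well below tau probe sensory (iconic) memory; delays well
        # above tau probe VSTM.
        print(round(tau_fit), "ms")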

  7. Bottlenecks of motion processing during a visual glance: the leaky flask model.

    PubMed

    Öğmen, Haluk; Ekiz, Onur; Huynh, Duong; Bedell, Harold E; Tripathy, Srimant P

    2013-01-01

    Where do the bottlenecks for information and attention lie when our visual system processes incoming stimuli? The human visual system encodes the incoming stimulus and transfers its contents into three major memory systems with increasing time scales, viz., sensory (or iconic) memory, visual short-term memory (VSTM), and long-term memory (LTM). It is commonly believed that the major bottleneck of information processing resides in VSTM. In contrast to this view, we show major bottlenecks for motion processing prior to VSTM. In the first experiment, we examined bottlenecks at the stimulus encoding stage through a partial-report technique by delivering the cue immediately at the end of the stimulus presentation. In the second experiment, we varied the cue delay to investigate sensory memory and VSTM. Performance decayed exponentially as a function of cue delay and we used the time-constant of the exponential-decay to demarcate sensory memory from VSTM. We then decomposed performance in terms of quality and quantity measures to analyze bottlenecks along these dimensions. In terms of the quality of information, two thirds to three quarters of the motion-processing bottleneck occurs in stimulus encoding rather than memory stages. In terms of the quantity of information, the motion-processing bottleneck is distributed, with the stimulus-encoding stage accounting for one third of the bottleneck. The bottleneck for the stimulus-encoding stage is dominated by the selection compared to the filtering function of attention. We also found that the filtering function of attention is operating mainly at the sensory memory stage in a specific manner, i.e., influencing only quantity and sparing quality. These results provide a novel and more complete understanding of information processing and storage bottlenecks for motion processing.

  8. The dark side of the alpha rhythm: fMRI evidence for induced alpha modulation during complete darkness.

    PubMed

    Ben-Simon, Eti; Podlipsky, Ilana; Okon-Singer, Hadas; Gruberger, Michal; Cvetkovic, Dean; Intrator, Nathan; Hendler, Talma

    2013-03-01

    The unique role of the EEG alpha rhythm in different states of cortical activity is still debated. The main theories regarding alpha function posit either sensory processing or attention allocation as the main processes governing its modulation. Closing and opening the eyes, a well-known manipulation of the alpha rhythm, can be regarded as a shift of attention between inward and outward focus, although in the light it is also accompanied by a change in visual input. To disentangle the effects of attention allocation and sensory visual input on alpha modulation, 14 healthy subjects were asked to open and close their eyes during conditions of light and of complete darkness while simultaneous recordings of EEG and fMRI were acquired. Thus, during complete darkness the eyes-open condition is not related to visual input but only to attention allocation, allowing direct examination of its role in alpha modulation. A data-driven ridge regression classifier was applied to the EEG data in order to ascertain the contribution of the alpha rhythm to eyes-open/eyes-closed inference in both lighting conditions. Classifier results revealed significant alpha contribution during both light and dark conditions, suggesting that alpha rhythm modulation is closely linked to the change in the direction of attention regardless of the presence of visual sensory input. Furthermore, fMRI activation maps derived from an alpha modulation time-course during the complete darkness condition exhibited a right frontal cortical network associated with attention allocation. These findings support the importance of top-down processes such as attention allocation to alpha rhythm modulation, possibly as a prerequisite to its known bottom-up processing of sensory input. © 2012 Federation of European Neuroscience Societies and Blackwell Publishing Ltd.

  9. Visually suboptimal bananas: How ripeness affects consumer expectation and perception.

    PubMed

    Symmank, Claudia; Zahn, Susann; Rohm, Harald

    2018-01-01

    One reason for the significant amount of food that is wasted in developed countries is that consumers often expect visually suboptimal food to be less palatable. Using bananas as an example, the objective of this study was to determine how appearance affects consumer overall liking, the rating of sensory attributes, purchase intention, and the intended use of bananas. The ripeness degree (RD) of the samples was adjusted to RD 5 (control) and RD 7 (more ripened, visually suboptimal). After preliminary experiments, a total of 233 participants were asked to judge their satisfaction with the intensity of sensory attributes that referred to flavor, taste, and texture using just-about-right scales. Subjects who received peeled samples were asked after tasting, whereas subjects who received unpeeled bananas judged expectation and, after peeling and tasting, perception. Expected overall liking and purchase intention were significantly lower for RD 7 bananas. Purchase intention was still significantly different between RD 5 and RD 7 after tasting, whereas no difference in overall liking was observed. Significant differences between RD 5 and RD 7 were observed when asking participants for their intended use of the bananas. Concerning the sensory attributes, penalty analysis revealed that only the firmness of the RD 7 bananas was still not just-about-right after tasting. The importance that consumers attribute to the shelf-life of food had a pronounced impact on the purchase intention for bananas of different ripeness degrees. In the case of suboptimal bananas, the results demonstrate a positive relationship between sensory perception and overall liking and purchase intention. Convincing consumers that visually suboptimal food is still tasty is highly relevant for recommending suitable ways of communication. Copyright © 2017 Elsevier Ltd. All rights reserved.
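
    Penalty analysis on just-about-right (JAR) data is typically a mean-drop computation: the loss in overall liking for respondents who rated an attribute below or above "just about right". A minimal sketch under common conventions (Python; the 1-5 JAR coding, the data, and the function name are illustrative assumptions, not the study's):

        import numpy as np

        def penalty_analysis(jar_ratings, liking, jar_code=3):
            # For one JAR attribute, return the mean liking drop and the share
            # of respondents in the "too little" (<jar_code) and "too much"
            # (>jar_code) groups, relative to the just-about-right group.
            jar_ratings, liking = np.asarray(jar_ratings), np.asarray(liking)
            jar_mean = liking[jar_ratings == jar_code].mean()
            result = {}
            for label, mask in (("too little", jar_ratings < jar_code),
                                ("too much", jar_ratings > jar_code)):
                result[label] = (jar_mean - liking[mask].mean(), mask.mean())
            return result

        # Hypothetical firmness JAR ratings and overall-liking scores.
        print(penalty_analysis([2, 3, 3, 4, 5, 3, 2, 5], [6, 8, 7, 6, 5, 8, 7, 4]))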

  10. Cortico-fugal output from visual cortex promotes plasticity of innate motor behaviour.

    PubMed

    Liu, Bao-Hua; Huberman, Andrew D; Scanziani, Massimo

    2016-10-20

    The mammalian visual cortex massively innervates the brainstem, a phylogenetically older structure, via cortico-fugal axonal projections. Many cortico-fugal projections target brainstem nuclei that mediate innate motor behaviours, but the function of these projections remains poorly understood. A prime example of such behaviours is the optokinetic reflex (OKR), an innate eye movement mediated by the brainstem accessory optic system, which stabilizes images on the retina as the animal moves through the environment and is thus crucial for vision. The OKR is plastic, allowing the amplitude of this reflex to be adaptively adjusted relative to other oculomotor reflexes and thereby ensuring image stability throughout life. Although the plasticity of the OKR is thought to involve subcortical structures such as the cerebellum and vestibular nuclei, cortical lesions have suggested that the visual cortex might also be involved. Here we show that projections from the mouse visual cortex to the accessory optic system promote the adaptive plasticity of the OKR. OKR potentiation, a compensatory plastic increase in the amplitude of the OKR in response to vestibular impairment, is diminished by silencing the visual cortex. Furthermore, targeted ablation of a sparse population of cortico-fugal neurons that specifically project to the accessory optic system severely impairs OKR potentiation. Finally, OKR potentiation results from an enhanced drive exerted by the visual cortex onto the accessory optic system. Thus, cortico-fugal projections to the brainstem enable the visual cortex, an area that has been principally studied for its sensory processing function, to plastically adapt the execution of innate motor behaviours.

  11. Sensory Prioritization in Rats: Behavioral Performance and Neuronal Correlates.

    PubMed

    Lee, Conrad C Y; Diamond, Mathew E; Arabzadeh, Ehsan

    2016-03-16

    Operating with some finite quantity of processing resources, an animal would benefit from prioritizing the sensory modality expected to provide key information in a particular context. The present study investigated whether rats dedicate attentional resources to the sensory modality in which a near-threshold event is more likely to occur. We manipulated attention by controlling the likelihood with which a stimulus was presented from one of two modalities. In a whisker session, 80% of trials contained a brief vibration stimulus applied to whiskers and the remaining 20% of trials contained a brief change of luminance. These likelihoods were reversed in a visual session. When a stimulus was presented in the high-likelihood context, detection performance increased and was faster compared with the same stimulus presented in the low-likelihood context. Sensory prioritization was also reflected in neuronal activity in the vibrissal area of primary somatosensory cortex: single units responded differentially to the whisker vibration stimulus when presented with higher probability compared with lower probability. Neuronal activity in the vibrissal cortex displayed signatures of multiplicative gain control and enhanced response to vibration stimuli during the whisker session. In conclusion, rats allocate priority to the more likely stimulus modality and the primary sensory cortex may participate in the redistribution of resources. Detection of low-amplitude events is critical to survival; for example, to warn prey of predators. To formulate a response, decision-making systems must extract minute neuronal signals from the sensory modality that provides key information. Here, we identify the behavioral and neuronal correlates of sensory prioritization in rats. Rats were trained to detect whisker vibrations or visual flickers. Stimuli were embedded in two contexts in which either visual or whisker modality was more likely to occur. When a stimulus was presented in the high-likelihood context, detection was faster and more reliable. Neuronal recording from the vibrissal cortex revealed enhanced representation of vibrations in the prioritized context. These results establish the rat as an alternative model organism to primates for studying attention. Copyright © 2016 the authors 0270-6474/16/363243-11$15.00/0.

  12. Inattentional Deafness: Visual Load Leads to Time-Specific Suppression of Auditory Evoked Responses

    PubMed Central

    Molloy, Katharine; Griffiths, Timothy D.; Lavie, Nilli

    2015-01-01

    Due to capacity limits on perception, conditions of high perceptual load lead to reduced processing of unattended stimuli (Lavie et al., 2014). Accumulating work demonstrates the effects of visual perceptual load on visual cortex responses, but the effects on auditory processing remain poorly understood. Here we establish the neural mechanisms underlying “inattentional deafness”—the failure to perceive auditory stimuli under high visual perceptual load. Participants performed a visual search task of low (target dissimilar to nontarget items) or high (target similar to nontarget items) load. On a random subset (50%) of trials, irrelevant tones were presented concurrently with the visual stimuli. Brain activity was recorded with magnetoencephalography, and time-locked responses to the visual search array and to the incidental presence of unattended tones were assessed. High, compared to low, perceptual load led to increased early visual evoked responses (within 100 ms from onset). This was accompanied by reduced early (∼100 ms from tone onset) auditory evoked activity in superior temporal sulcus and posterior middle temporal gyrus. A later suppression of the P3 “awareness” response to the tones was also observed under high load. A behavioral experiment revealed reduced tone detection sensitivity under high visual load, indicating that the reduction in neural responses was indeed associated with reduced awareness of the sounds. These findings support a neural account of shared audiovisual resources, which, when depleted under load, leads to failures of sensory perception and awareness. SIGNIFICANCE STATEMENT The present work clarifies the neural underpinning of inattentional deafness under high visual load. The findings of near-simultaneous load effects on both visual and auditory evoked responses suggest shared audiovisual processing capacity. Temporary depletion of shared capacity in perceptually demanding visual tasks leads to a momentary reduction in sensory processing of auditory stimuli, resulting in inattentional deafness. The dynamic “push–pull” pattern of load effects on visual and auditory processing furthers our understanding of both the neural mechanisms of attention and of cross-modal effects across visual and auditory processing. These results also offer an explanation for many previous failures to find cross-modal effects in experiments where the visual load effects may not have coincided directly with auditory sensory processing. PMID:26658858

  13. Multisensory Integration in Non-Human Primates during a Sensory-Motor Task

    PubMed Central

    Lanz, Florian; Moret, Véronique; Rouiller, Eric Michel; Loquet, Gérard

    2013-01-01

    Every day our central nervous system receives inputs via several sensory modalities, processes them, and integrates the information in order to produce suitable behavior. Remarkably, such multisensory integration brings all of this information into a unified percept. One approach to investigating this property is to show that perception is better and faster when multimodal stimuli are used as compared to unimodal stimuli. This forms the first part of the present study, conducted in a non-human primate model (n = 2) engaged in a sensory-motor detection task in which visual and auditory stimuli were displayed individually or simultaneously. The measured parameters were the reaction time (RT) between stimulus and onset of arm movement, the percentages of successes and errors, and the evolution of these parameters with training. As expected, RTs were shorter when the subjects were exposed to combined stimuli. The gains for both subjects were around 20 and 40 ms, as compared with the auditory and visual stimulus alone, respectively. Moreover, the number of correct responses increased in response to bimodal stimuli. We interpreted this multisensory advantage in terms of the redundant-signal effect, which decreases perceptual ambiguity, increases the speed of stimulus detection, and improves performance accuracy. The second part of the study presents single-unit recordings derived from the premotor cortex (PM) of the same subjects during the sensory-motor task. Response patterns to sensory/multisensory stimulation are documented and the proportions of specific response types are reported. Characterization of bimodal neurons indicates a mechanism of audio-visual integration, possibly through a decrease of inhibition. Nevertheless, the neural processing that leads to a faster motor response from PM, a polysensory association cortical area, remains unclear. PMID:24319421
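
    The redundant-signal gain reported above is commonly evaluated against the race-model inequality, which bounds how much speed-up an independent race between the two unimodal channels can produce; this companion test is a standard convention rather than a method stated in the abstract, and the reaction-time samples below are hypothetical. A minimal sketch (Python):

        import numpy as np

        def ecdf(samples, t):
            # Empirical cumulative probability P(RT <= t).
            samples = np.sort(np.asarray(samples))
            return np.searchsorted(samples, t, side="right") / samples.size

        def race_model_violation(rt_a, rt_v, rt_av, t_grid):
            # Race-model inequality: P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t).
            # Positive values mean the bimodal speed-up exceeds what an
            # independent race between the unimodal channels allows.
            bound = np.clip(ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid), 0.0, 1.0)
            return ecdf(rt_av, t_grid) - bound

        rng = np.random.default_rng(1)
        rt_a = rng.normal(330, 40, 200)     # hypothetical auditory-only RTs (ms)
        rt_v = rng.normal(350, 40, 200)     # hypothetical visual-only RTs (ms)
        rt_av = rng.normal(305, 35, 200)    # hypothetical bimodal RTs (ms)
        t_grid = np.arange(200, 500, 10)
        print(race_model_violation(rt_a, rt_v, rt_av, t_grid).max())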

  14. Age-Related Sensory Impairments and Risk of Cognitive Impairment.

    PubMed

    Fischer, Mary E; Cruickshanks, Karen J; Schubert, Carla R; Pinto, Alex A; Carlsson, Cynthia M; Klein, Barbara E K; Klein, Ronald; Tweed, Ted S

    2016-10-01

    Objectives: To evaluate the associations between sensory impairments and the 10-year risk of cognitive impairment. Design: The Epidemiology of Hearing Loss Study (EHLS), a longitudinal, population-based study of aging in the Beaver Dam, Wisconsin community; baseline examinations were conducted in 1993, and follow-up examinations have been conducted every 5 years. Setting: General community. Participants: EHLS members without cognitive impairment at EHLS-2 (1998-2000); there were 1,884 participants (mean age 66.7) with complete EHLS-2 sensory data and follow-up information. Measurements: Cognitive impairment was defined as a Mini-Mental State Examination score of <24 or a history of dementia or Alzheimer's disease. Hearing impairment was a pure-tone average of hearing thresholds (0.5, 1, 2, 4 kHz) of >25 dB hearing level in either ear, visual impairment was a Pelli-Robson contrast sensitivity of <1.55 log units in the better eye, and olfactory impairment was a San Diego Odor Identification Test score of <6. Results: Hearing, visual, and olfactory impairment were independently associated with cognitive impairment risk (hearing: hazard ratio (HR) = 1.90, 95% confidence interval (CI) = 1.11-3.26; vision: HR = 2.05, 95% CI = 1.24-3.38; olfaction: HR = 3.92, 95% CI = 2.45-6.26). Nevertheless, 85% of participants with hearing impairment, 81% with visual impairment, and 76% with olfactory impairment did not develop cognitive impairment during follow-up. Conclusion: The relationship between sensory impairment and cognitive impairment was not unique to one sensory system, suggesting that sensorineural health may be a marker of brain aging. The development of a combined sensorineurocognitive measure may be useful in uncovering mechanisms of healthy brain aging. © 2016, Copyright the Authors Journal compilation © 2016, The American Geriatrics Society.
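
    The impairment definitions above are simple threshold rules, which can be written out directly. A minimal sketch (Python; the cut-offs are those stated in the abstract, while the function and argument names are illustrative):

        def impairments(pta_worse_ear_db, contrast_sensitivity_log, odor_id_score, mmse):
            # Threshold definitions as stated in the abstract.
            return {
                "hearing": pta_worse_ear_db > 25,            # pure-tone average > 25 dB HL in either ear
                "visual": contrast_sensitivity_log < 1.55,   # Pelli-Robson, better eye
                "olfactory": odor_id_score < 6,              # San Diego Odor Identification Test
                "cognitive": mmse < 24,                      # or a history of dementia / Alzheimer's disease
            }

        print(impairments(pta_worse_ear_db=30, contrast_sensitivity_log=1.60,
                          odor_id_score=5, mmse=27))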

  15. Effects of Attention and Laterality on Motion and Orientation Discrimination in Deaf Signers

    ERIC Educational Resources Information Center

    Bosworth, Rain G.; Petrich, Jennifer A. F.; Dobkins, Karen R.

    2013-01-01

    Previous studies have asked whether visual sensitivity and attentional processing in deaf signers are enhanced or altered as a result of their different sensory experiences during development, i.e., auditory deprivation and exposure to a visual language. In particular, deaf and hearing signers have been shown to exhibit a right visual field/left…

  16. Brain architecture of the Pacific White Shrimp Penaeus vannamei Boone, 1931 (Malacostraca, Dendrobranchiata): correspondence of brain structure and sensory input?

    PubMed

    Meth, Rebecca; Wittfoth, Christin; Harzsch, Steffen

    2017-08-01

    Penaeus vannamei (Dendrobranchiata, Decapoda) is best known as the "Pacific White Shrimp" and is currently the most important crustacean in commercial aquaculture worldwide. Although the neuroanatomy of crustaceans has been well examined in representatives of reptant decapods ("ground-dwelling decapods"), there are only a few studies focusing on shrimps and prawns. In order to obtain insights into the architecture of the brain of P. vannamei, we use neuroanatomical methods including X-ray micro-computed tomography, 3D reconstruction and immunohistochemical staining combined with confocal laser-scanning microscopy and serial sectioning. The brain of P. vannamei exhibits all the prominent neuropils and tracts that characterize the ground pattern of decapod crustaceans. However, the relative sizes of some neuropils are striking. The large lateral protocerebrum, which comprises the visual neuropils as well as the hemiellipsoid body and medulla terminalis, is remarkable. This observation corresponds with the large size of the compound eyes of these animals. In contrast, the remaining median part of the brain is relatively small. It is dominated by the paired antenna 2 neuropils, while the deutocerebral chemosensory lobes play a minor role. Our findings suggest that visual input from the compound eyes and mechanosensory input from the second pair of antennae are the major sensory modalities processed by this brain.

  17. Parallel processing in the honeybee olfactory pathway: structure, function, and evolution.

    PubMed

    Rössler, Wolfgang; Brill, Martin F

    2013-11-01

    Animals face highly complex and dynamic olfactory stimuli in their natural environments, which require fast and reliable olfactory processing. Parallel processing is a common principle of sensory systems supporting this task, for example in visual and auditory systems, but its role in olfaction remained unclear. Studies in the honeybee focused on a dual olfactory pathway. Two sets of projection neurons connect glomeruli in two antennal-lobe hemilobes via lateral and medial tracts in opposite sequence with the mushroom bodies and lateral horn. Comparative studies suggest that this dual-tract circuit represents a unique adaptation in Hymenoptera. Imaging studies indicate that glomeruli in both hemilobes receive redundant sensory input. Recent simultaneous multi-unit recordings from projection neurons of both tracts revealed widely overlapping response profiles strongly indicating parallel olfactory processing. Whereas lateral-tract neurons respond fast with broad (generalistic) profiles, medial-tract neurons are odorant specific and respond slower. In analogy to the "what" and "where" subsystems in visual pathways, this suggests two parallel olfactory subsystems providing "what" (quality) and "when" (temporal) information. Temporal response properties may support across-tract coincidence coding in higher centers. Parallel olfactory processing likely enhances perception of complex odorant mixtures to decode the diverse and dynamic olfactory world of a social insect.

  18. Beauty and the beholder: the role of visual sensitivity in visual preference

    PubMed Central

    Spehar, Branka; Wong, Solomon; van de Klundert, Sarah; Lui, Jessie; Clifford, Colin W. G.; Taylor, Richard P.

    2015-01-01

    For centuries, the essence of aesthetic experience has remained one of the most intriguing mysteries for philosophers, artists, art historians and scientists alike. Recently, views emphasizing the link between aesthetics, perception and brain function have become increasingly prevalent (Ramachandran and Hirstein, 1999; Zeki, 1999; Livingstone, 2002; Ishizu and Zeki, 2013). The link between art and the fractal-like structure of natural images has also been highlighted (Spehar et al., 2003; Graham and Field, 2007; Graham and Redies, 2010). Motivated by these claims and our previous findings that humans display a consistent preference across various images with fractal-like statistics, here we explore the possibility that observers’ preference for visual patterns might be related to their sensitivity for such patterns. We measure sensitivity to simple visual patterns (sine-wave gratings varying in spatial frequency and random textures with varying scaling exponent) and find that they are highly correlated with visual preferences exhibited by the same observers. Although we do not attempt to offer a comprehensive neural model of aesthetic experience, we demonstrate a strong relationship between visual sensitivity and preference for simple visual patterns. Broadly speaking, our results support assertions that there is a close relationship between aesthetic experience and the sensory coding of natural stimuli. PMID:26441611

  19. Retinal and visual system: occupational and environmental toxicology.

    PubMed

    Fox, Donald A

    2015-01-01

    Occupational chemical exposure often results in sensory systems alterations that occur without other clinical signs or symptoms. Approximately 3000 chemicals are toxic to the retina and central visual system. Their dysfunction can have immediate, long-term, and delayed effects on mental health, physical health, and performance and lead to increased occupational injuries. The aims of this chapter are fourfold. First, provide references on retinal/visual system structure, function, and assessment techniques. Second, discuss the retinal features that make it especially vulnerable to toxic chemicals. Third, review the clinical and corresponding experimental data regarding retinal/visual system deficits produced by occupational toxicants: organic solvents (carbon disulfide, trichloroethylene, tetrachloroethylene, styrene, toluene, and mixtures) and metals (inorganic lead, methyl mercury, and mercury vapor). Fourth, discuss occupational and environmental toxicants as risk factors for late-onset retinal diseases and degeneration. Overall, the toxicants altered color vision, rod- and/or cone-mediated electroretinograms, visual fields, spatial contrast sensitivity, and/or retinal thickness. The findings elucidate the importance of conducting multimodal noninvasive clinical, electrophysiologic, imaging and vision testing to monitor toxicant-exposed workers for possible retinal/visual system alterations. Finally, since the retina is a window into the brain, an increased awareness and understanding of retinal/visual system dysfunction should provide additional insight into acquired neurodegenerative disorders. © 2015 Elsevier B.V. All rights reserved.

  20. Mastoid vibration affects dynamic postural control during gait in healthy older adults

    NASA Astrophysics Data System (ADS)

    Chien, Jung Hung; Mukherjee, Mukul; Kent, Jenny; Stergiou, Nicholas

    2017-01-01

    Vestibular disorders are difficult to diagnose early due to the lack of a systematic assessment. Our previous work developed a reliable experimental design and showed promising results indicating that vestibular sensory input during walking can be affected through mastoid vibration (MV), with the changes occurring in the direction of motion. In the present paper, we extended this work to older adults and investigated how manipulating sensory input through MV could affect dynamic postural control during walking. Three levels of MV (none, unilateral, and bilateral) applied via vibrating elements placed on the mastoid processes were combined with the Locomotor Sensory Organization Test (LSOT) paradigm to challenge the visual and somatosensory systems. We hypothesized that MV would affect sway variability during walking in older adults. Our results revealed that MV not only significantly increased the amount of sway variability but also decreased its temporal structure, in the anterior-posterior direction only. Importantly, bilateral MV generally produced larger effects than unilateral MV. This is an important finding that confirms our experimental design, and the results could guide more reliable screening for vestibular system deterioration.
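
    The two dependent measures, the amount and the temporal structure of sway variability, are commonly quantified with a dispersion statistic and a regularity statistic, respectively. A minimal sketch under that assumption (Python; the standard-deviation/sample-entropy pairing and the simulated trace are illustrative, not necessarily the measures or data used in the study):

        import numpy as np

        def sample_entropy(x, m=2, r_factor=0.2):
            # Sample entropy as an index of temporal structure: lower values
            # indicate a more regular, less complex sway signal.
            x = np.asarray(x, dtype=float)
            r = r_factor * x.std()
            def matches(mm):
                templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
                dist = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
                return np.sum(dist <= r) - len(templates)   # exclude self-matches
            return -np.log(matches(m + 1) / matches(m))

        rng = np.random.default_rng(2)
        sway_ap = np.cumsum(rng.standard_normal(500)) * 0.1   # hypothetical AP sway trace
        print("amount (SD):", sway_ap.std())
        print("temporal structure (SampEn):", sample_entropy(sway_ap))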
