Science.gov

Sample records for multisensory multifunctional nucleus

  1. Multisensory constraints on awareness.

    PubMed

    Deroy, Ophelia; Chen, Yi-Chuan; Spence, Charles

    2014-05-05

    Given that multiple senses are often stimulated at the same time, perceptual awareness is most likely to take place in multisensory situations. However, theories of awareness are based on studies and models established for a single sense (mostly vision). Here, we consider the methodological and theoretical challenges raised by taking a multisensory perspective on perceptual awareness. First, we consider how well tasks designed to study unisensory awareness perform when used in multisensory settings, stressing that studies using binocular rivalry, bistable figure perception, continuous flash suppression, the attentional blink, repetition blindness and backward masking can demonstrate multisensory influences on unisensory awareness, but fall short of tackling multisensory awareness directly. Studies interested in the latter phenomenon rely on a method of subjective contrast and can, at best, delineate conditions under which individuals report experiencing a multisensory object or two unisensory objects. As there is not a perfect match between these conditions and those in which multisensory integration and binding occur, the link between awareness and binding advocated for visual information processing needs to be revised for multisensory cases. These challenges point at the need to question the very idea of multisensory awareness.

  2. Multisensory constraints on awareness

    PubMed Central

    Deroy, Ophelia; Chen, Yi-Chuan; Spence, Charles

    2014-01-01

    Given that multiple senses are often stimulated at the same time, perceptual awareness is most likely to take place in multisensory situations. However, theories of awareness are based on studies and models established for a single sense (mostly vision). Here, we consider the methodological and theoretical challenges raised by taking a multisensory perspective on perceptual awareness. First, we consider how well tasks designed to study unisensory awareness perform when used in multisensory settings, stressing that studies using binocular rivalry, bistable figure perception, continuous flash suppression, the attentional blink, repetition blindness and backward masking can demonstrate multisensory influences on unisensory awareness, but fall short of tackling multisensory awareness directly. Studies interested in the latter phenomenon rely on a method of subjective contrast and can, at best, delineate conditions under which individuals report experiencing a multisensory object or two unisensory objects. As there is not a perfect match between these conditions and those in which multisensory integration and binding occur, the link between awareness and binding advocated for visual information processing needs to be revised for multisensory cases. These challenges point at the need to question the very idea of multisensory awareness. PMID:24639579

  3. Principles of multisensory behavior.

    PubMed

    Otto, Thomas U; Dassy, Brice; Mamassian, Pascal

    2013-04-24

    The combined use of multisensory signals is often beneficial. Based on neuronal recordings in the superior colliculus of cats, three basic rules were formulated to describe the effectiveness of multisensory signals: the enhancement of neuronal responses to multisensory compared with unisensory signals is largest when signals occur at the same location ("spatial rule"), when signals are presented at the same time ("temporal rule"), and when signals are rather weak ("principle of inverse effectiveness"). These rules are also considered with respect to multisensory benefits as observed with behavioral measures, but do they capture these benefits best? To uncover the principles that rule benefits in multisensory behavior, we here investigated the classical redundant signal effect (RSE; i.e., the speedup of response times in multisensory compared with unisensory conditions) in humans. Based on theoretical considerations using probability summation, we derived two alternative principles to explain the effect. First, the "principle of congruent effectiveness" states that the benefit in multisensory behavior (here the speedup of response times) is largest when behavioral performance in corresponding unisensory conditions is similar. Second, the "variability rule" states that the benefit is largest when performance in corresponding unisensory conditions is unreliable. We then tested these predictions in two experiments, in which we manipulated the relative onset and the physical strength of distinct audiovisual signals. Our results, which are based on a systematic analysis of response time distributions, show that the RSE follows these principles very well, thereby providing compelling evidence in favor of probability summation as the underlying combination rule.
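
    The probability-summation account tested in this record can be illustrated with a short sketch. For two independent channels racing to trigger a response, the predicted multisensory response-time CDF is G_A(t) + G_V(t) - G_A(t)G_V(t), and Miller's inequality, G_AV(t) <= G_A(t) + G_V(t), bounds any race model. The code below uses simulated reaction times (not data from the paper) to compare an "observed" audiovisual RT distribution against these benchmarks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated unisensory reaction times in ms (illustrative values only).
rt_a = rng.normal(320, 40, 2000)   # auditory-only trials
rt_v = rng.normal(300, 55, 2000)   # visual-only trials
# Simulate audiovisual trials as a race between two independent channels.
rt_av = np.minimum(rng.normal(320, 40, 2000), rng.normal(300, 55, 2000))

def ecdf(samples, t):
    """Empirical cumulative distribution of `samples` evaluated at times t."""
    return np.searchsorted(np.sort(samples), t, side="right") / len(samples)

t = np.linspace(150, 500, 200)
g_a, g_v, g_av = ecdf(rt_a, t), ecdf(rt_v, t), ecdf(rt_av, t)

# Race-model (probability summation) prediction for independent channels.
g_race = g_a + g_v - g_a * g_v
# Miller's bound: an upper limit on facilitation for any race model.
g_miller = np.minimum(g_a + g_v, 1.0)

print(f"max deviation from independent-race prediction: "
      f"{np.max(np.abs(g_av - g_race)):.3f}")
print(f"max violation of Miller's bound: {np.max(g_av - g_miller):.3f} "
      "(values > 0 would argue against probability summation)")
```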

  4. Neglect: a multisensory deficit?

    PubMed

    Jacobs, Stéphane; Brozzoli, Claudio; Farnè, Alessandro

    2012-05-01

    Neglect is a neurological syndrome characterised by a lack of conscious perception of events localised in the contralesional side of space. Here, we consider the possible multisensory nature of this disorder, critically reviewing the literature devoted to multisensory manifestations and processing in neglect. Although its most striking manifestations have been observed in the visual domain, a number of studies demonstrate that neglect can affect virtually any sensory modality, in particular touch and audition. Furthermore, a few recent studies have reported a correlation in severity between visual and non-visual neglect-related deficits evaluated in the same patients, providing some preliminary support for a multisensory conception of neglect. Sensory stimulation and sensorimotor adaptation techniques, aimed at alleviating neglect, have also been shown to affect several sensory modalities, including some that were not directly affected by the intervention. Finally, in some cases neglect can bias multisensory interactions known to occur in healthy individuals, leading to abnormal behaviour or uncovering multisensory compensation mechanisms. This evidence, together with neurophysiological and neuroimaging data revealing the multisensory role played by the areas that are most commonly damaged in neglect patients, seems to speak in favour of neglect as a multisensory disorder. However, since most previous studies were not conducted with the specific purpose of systematically investigating the multisensory nature of neglect, we conclude that more research is needed to appropriately assess this question, and suggest some methodological guidelines that we hope will help clarify this issue. At present, the conception of neglect as a multisensory disorder remains a promising working hypothesis that may help define the pathophysiology of this syndrome.

  5. Multisensory flavor perception.

    PubMed

    Spence, Charles

    2015-03-26

    The perception of flavor is perhaps the most multisensory of our everyday experiences. The latest research by psychologists and cognitive neuroscientists increasingly reveals the complex multisensory interactions that give rise to the flavor experiences we all know and love, demonstrating how they rely on the integration of cues from all of the human senses. This Perspective explores the contributions of distinct senses to our perception of food and the growing realization that the same rules of multisensory integration that have been thoroughly explored in interactions between audition, vision, and touch may also explain the combination of the (admittedly harder to study) flavor senses. Academic advances are now spilling out into the real world, with chefs and the food industry increasingly taking the latest scientific findings on board in their food design.

  6. Metacognition in Multisensory Perception.

    PubMed

    Deroy, Ophelia; Spence, Charles; Noppeney, Uta

    2016-10-01

    Metacognition - the ability to monitor one's own decisions and representations, their accuracy and uncertainty - is considered a hallmark of intelligent behavior. Little is known about metacognition in our natural multisensory environment. To form a coherent percept, the brain should integrate signals from a common cause but segregate those from independent causes. Multisensory perception thus relies on inferring the world's causal structure, raising new challenges for metacognition. We discuss the extent to which observers can monitor their uncertainties not only about their final integrated percept but also about the individual sensory signals and the world's causal structure. The latter causal metacognition highlights fundamental links between perception and other cognitive domains such as social and abstract reasoning.

  7. Parietal connectivity mediates multisensory facilitation.

    PubMed

    Brang, David; Taich, Zachary J; Hillyard, Steven A; Grabowecky, Marcia; Ramachandran, V S

    2013-09-01

    Our senses interact in daily life through multisensory integration, facilitating perceptual processes and behavioral responses. The neural mechanisms proposed to underlie this multisensory facilitation include anatomical connections directly linking early sensory areas, indirect connections to higher-order multisensory regions, as well as thalamic connections. Here we examine the relationship between white matter connectivity, as assessed with diffusion tensor imaging, and individual differences in multisensory facilitation and provide the first demonstration of a relationship between anatomical connectivity and multisensory processing in typically developed individuals. Using a whole-brain analysis and contrasting anatomical models of multisensory processing we found that increased connectivity between parietal regions and early sensory areas was associated with the facilitation of reaction times to multisensory (auditory-visual) stimuli. Furthermore, building on prior animal work suggesting the involvement of the superior colliculus in this process, using probabilistic tractography we determined that the strongest cortical projection area connected with the superior colliculus includes the region of connectivity implicated in our independent whole-brain analysis.

  8. Multifunctional nanocrystals

    DOEpatents

    Klimov, Victor I.; Hollingsworth, Jennifer A.; Crooker, Scott A.; Kim, Hyungrak

    2010-06-22

    Multifunctional nanocomposites are provided including a core of either a magnetic material or an inorganic semiconductor, and a shell of either a magnetic material or an inorganic semiconductor, wherein the core and the shell are of differing materials, such multifunctional nanocomposites having multifunctional properties including magnetic properties from the magnetic material and optical properties from the inorganic semiconductor material. Various applications of such multifunctional nanocomposites are also provided.

  9. Multifunctional nanocrystals

    DOEpatents

    Klimov, Victor I.; Hollingsworth, Jennifer A.; Crooker, Scott A.; Kim, Hyungrak

    2007-08-28

    Multifunctional nanocomposites are provided including a core of either a magnetic material or an inorganic semiconductor, and a shell of either a magnetic material or an inorganic semiconductor, wherein the core and the shell are of differing materials, such multifunctional nanocomposites having multifunctional properties including magnetic properties from the magnetic material and optical properties from the inorganic semiconductor material. Various applications of such multifunctional nanocomposites are also provided.

  10. Generalization of multisensory perceptual learning.

    PubMed

    Powers III, Albert R; Hillock-Dunn, Andrea; Wallace, Mark T

    2016-03-22

    Life in a multisensory world requires the rapid and accurate integration of stimuli across the different senses. In this process, the temporal relationship between stimuli is critical in determining which stimuli share a common origin. Numerous studies have described a multisensory temporal binding window-the time window within which audiovisual stimuli are likely to be perceptually bound. In addition to characterizing this window's size, recent work has shown it to be malleable, with the capacity for substantial narrowing following perceptual training. However, the generalization of these effects to other measures of perception is not known. This question was examined by characterizing the ability of training on a simultaneity judgment task to influence perception of the temporally-dependent sound-induced flash illusion (SIFI). Results do not demonstrate a change in performance on the SIFI itself following training. However, data do show an improved ability to discriminate rapidly-presented two-flash control conditions following training. Effects were specific to training and scaled with the degree of temporal window narrowing exhibited. Results do not support generalization of multisensory perceptual learning to other multisensory tasks. However, results do show that training results in improvements in visual temporal acuity, suggesting a generalization effect of multisensory training on unisensory abilities.

  11. Generalization of multisensory perceptual learning

    PubMed Central

    Powers III, Albert R.; Hillock-Dunn, Andrea; Wallace, Mark T.

    2016-01-01

    Life in a multisensory world requires the rapid and accurate integration of stimuli across the different senses. In this process, the temporal relationship between stimuli is critical in determining which stimuli share a common origin. Numerous studies have described a multisensory temporal binding window—the time window within which audiovisual stimuli are likely to be perceptually bound. In addition to characterizing this window’s size, recent work has shown it to be malleable, with the capacity for substantial narrowing following perceptual training. However, the generalization of these effects to other measures of perception is not known. This question was examined by characterizing the ability of training on a simultaneity judgment task to influence perception of the temporally-dependent sound-induced flash illusion (SIFI). Results do not demonstrate a change in performance on the SIFI itself following training. However, data do show an improved ability to discriminate rapidly-presented two-flash control conditions following training. Effects were specific to training and scaled with the degree of temporal window narrowing exhibited. Results do not support generalization of multisensory perceptual learning to other multisensory tasks. However, results do show that training results in improvements in visual temporal acuity, suggesting a generalization effect of multisensory training on unisensory abilities. PMID:27000988
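
    The temporal binding window (TBW) characterized in the two records above is commonly estimated by fitting the proportion of "simultaneous" responses in a simultaneity-judgment task as a function of stimulus-onset asynchrony (SOA) and reading off the width of the fitted curve at some criterion. The sketch below fits a Gaussian to made-up judgment data and uses a 75%-of-peak criterion; the data, criterion, and parameter values are illustrative assumptions, not results from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative simultaneity-judgment data: stimulus-onset asynchronies (ms,
# negative = auditory first) and proportion of "simultaneous" responses.
soa = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300], float)
p_simult = np.array([0.08, 0.25, 0.62, 0.85, 0.95, 0.90, 0.70, 0.30, 0.10])

def gaussian(soa, amp, mu, sigma):
    """Gaussian model of the simultaneity-judgment curve."""
    return amp * np.exp(-0.5 * ((soa - mu) / sigma) ** 2)

params, _ = curve_fit(gaussian, soa, p_simult, p0=[1.0, 0.0, 100.0])
amp, mu, sigma = params

# Define the TBW as the SOA range where the fitted curve exceeds 75% of its
# peak -- one common, but not universal, criterion.
half_width = sigma * np.sqrt(-2.0 * np.log(0.75))
print(f"point of subjective simultaneity: {mu:.1f} ms")
print(f"TBW (75%-of-peak criterion): {2 * half_width:.0f} ms wide")
```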

  12. Early Experience & Multisensory Perceptual Narrowing

    PubMed Central

    Lewkowicz, David J.

    2014-01-01

    Perceptual narrowing is a reflection of early experience and contributes in key ways to perceptual and cognitive development. In general, findings have shown that unisensory perceptual sensitivity in early infancy is broadly tuned such that young infants respond to, and discriminate, native as well as non-native sensory inputs, whereas older infants only respond to native inputs. Recently, my colleagues and I discovered that perceptual narrowing occurs at the multisensory processing level as well. The present article reviews this new evidence and puts it in the larger context of multisensory perceptual development and the role that perceptual experience plays in it. Together, the evidence on unisensory and multisensory narrowing shows that early experience shapes the emergence of perceptual specialization and expertise. PMID:24435505

  13. A multisensory perspective of working memory

    PubMed Central

    Quak, Michel; London, Raquel Elea; Talsma, Durk

    2015-01-01

    Although our sensory experience is mostly multisensory in nature, research on working memory representations has focused mainly on examining the senses in isolation. Results from the multisensory processing literature make it clear that the senses interact in a more intimate manner than previously assumed. These interactions raise questions regarding the manner in which multisensory information is maintained in working memory. We discuss the current status of research on multisensory processing and the implications of these findings for our theoretical understanding of working memory. To do so, we focus on reviewing working memory research conducted from a multisensory perspective, and discuss the relation between working memory, attention, and multisensory processing in the context of the predictive coding framework. We argue that a multisensory approach to the study of working memory is indispensable to achieve a realistic understanding of how working memory processes maintain and manipulate information. PMID:25954176

  14. Multisensory integration mechanisms during aging.

    PubMed

    Freiherr, Jessica; Lundström, Johan N; Habel, Ute; Reetz, Kathrin

    2013-12-13

    The rapid demographical shift occurring in our society implies that understanding of healthy aging and age-related diseases is one of our major future challenges. Sensory impairments have an enormous impact on our lives and are closely linked to cognitive functioning. Due to the inherent complexity of sensory perceptions, we are commonly presented with complex multisensory stimulation, and the brain integrates the information from the individual sensory channels into a unique and holistic percept. The cerebral processes involved are essential for our perception of sensory stimuli and become especially important during the perception of emotional content. Despite ongoing deterioration of the individual sensory systems during aging, there is evidence for an increase in, or maintenance of, multisensory integration processing in aging individuals. Within this comprehensive literature review on multisensory integration, we aim to highlight basic mechanisms and potential compensatory strategies the human brain utilizes to help maintain multisensory integration capabilities during healthy aging, in order to facilitate a broader understanding of age-related pathological conditions. Further, our goal was to identify where further research is needed.

  15. Multisensory integration mechanisms during aging

    PubMed Central

    Freiherr, Jessica; Lundström, Johan N.; Habel, Ute; Reetz, Kathrin

    2013-01-01

    The rapid demographical shift occurring in our society implies that understanding of healthy aging and age-related diseases is one of our major future challenges. Sensory impairments have an enormous impact on our lives and are closely linked to cognitive functioning. Due to the inherent complexity of sensory perceptions, we are commonly presented with complex multisensory stimulation, and the brain integrates the information from the individual sensory channels into a unique and holistic percept. The cerebral processes involved are essential for our perception of sensory stimuli and become especially important during the perception of emotional content. Despite ongoing deterioration of the individual sensory systems during aging, there is evidence for an increase in, or maintenance of, multisensory integration processing in aging individuals. Within this comprehensive literature review on multisensory integration, we aim to highlight basic mechanisms and potential compensatory strategies the human brain utilizes to help maintain multisensory integration capabilities during healthy aging, in order to facilitate a broader understanding of age-related pathological conditions. Further, our goal was to identify where further research is needed. PMID:24379773

  16. Multisensory stimulation in stroke rehabilitation.

    PubMed

    Johansson, Barbro Birgitta

    2012-01-01

    The brain has a large capacity for automatic simultaneous processing and integration of sensory information. Combining information from different sensory modalities facilitates our ability to detect, discriminate, and recognize sensory stimuli, and learning is often optimal in a multisensory environment. Currently used multisensory stimulation methods in stroke rehabilitation include motor imagery, action observation, training with a mirror or in a virtual environment, and various kinds of music therapy. Non-invasive brain stimulation has shown promising preliminary results in aphasia and neglect. Patient heterogeneity and the interaction of age, gender, genes, and environment are discussed. Randomized controlled longitudinal trials starting earlier post-stroke are needed. Advances in brain network science and neuroimaging, which enable longitudinal studies of structural and functional networks, are likely to have an important impact on patient selection for specific interventions in future stroke rehabilitation. It is proposed that we should pay more attention to age, gender, and laterality in clinical studies.

  17. Multisensory Stimulation in Stroke Rehabilitation

    PubMed Central

    Johansson, Barbro Birgitta

    2012-01-01

    The brain has a large capacity for automatic simultaneous processing and integration of sensory information. Combining information from different sensory modalities facilitates our ability to detect, discriminate, and recognize sensory stimuli, and learning is often optimal in a multisensory environment. Currently used multisensory stimulation methods in stroke rehabilitation include motor imagery, action observation, training with a mirror or in a virtual environment, and various kinds of music therapy. Non-invasive brain stimulation has shown promising preliminary results in aphasia and neglect. Patient heterogeneity and the interaction of age, gender, genes, and environment are discussed. Randomized controlled longitudinal trials starting earlier post-stroke are needed. Advances in brain network science and neuroimaging, which enable longitudinal studies of structural and functional networks, are likely to have an important impact on patient selection for specific interventions in future stroke rehabilitation. It is proposed that we should pay more attention to age, gender, and laterality in clinical studies. PMID:22509159

  18. Predictive coding of multisensory timing

    PubMed Central

    Shi, Zhuanghua; Burr, David

    2016-01-01

    The sense of time is foundational for perception and action, yet it frequently departs significantly from physical time. In the paper we review recent progress on temporal contextual effects, multisensory temporal integration, temporal recalibration, and related computational models. We suggest that subjective time arises from minimizing prediction errors and adaptive recalibration, which can be unified in the framework of predictive coding, a framework rooted in Helmholtz’s ‘perception as inference’. PMID:27695705
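
    The idea that subjective timing arises from minimizing prediction errors can be made concrete with a minimal delta-rule sketch of temporal recalibration: repeated exposure to a fixed audiovisual lag pulls the internal lag estimate (and hence the point of subjective simultaneity) toward that lag. This is a schematic illustration of the framework the record describes, not the authors' computational model; the learning rate and exposure values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)

def recalibrate(asynchronies, learning_rate=0.1):
    """Track an internal estimate of audiovisual lag by repeatedly reducing
    the prediction error between the expected and the observed asynchrony
    (a minimal delta-rule reading of predictive-coding recalibration)."""
    lag_estimate = 0.0
    history = []
    for observed in asynchronies:
        prediction_error = observed - lag_estimate
        lag_estimate += learning_rate * prediction_error
        history.append(lag_estimate)
    return np.array(history)

# Adaptation phase: audio consistently lags vision by ~100 ms (plus noise).
exposure = rng.normal(100.0, 20.0, 60)
estimates = recalibrate(exposure)
print(f"lag estimate after exposure: {estimates[-1]:.0f} ms "
      "(subjective simultaneity shifts toward the adapted lag)")
```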

  19. Walking to a multisensory beat.

    PubMed

    Roy, Charlotte; Lagarde, Julien; Dotov, Dobromir; Dalla Bella, Simone

    2017-04-01

    Living in a complex and multisensory environment demands constant interaction between perception and action. In everyday life it is common to efficiently combine simultaneous signals coming from different modalities. There is evidence of a multisensory benefit in a variety of laboratory tasks (temporal judgement, reaction time tasks). It is less clear whether this effect extends to ecological tasks, such as walking. Furthermore, benefits of multimodal stimulation are linked to temporal properties such as the temporal window of integration and temporal recalibration. These properties have been examined in tasks involving single, non-repeating stimulus presentations. Here we investigate the same temporal properties in the context of a rhythmic task, namely audio-tactile stimulation during walking. The effect of audio-tactile rhythmic cues on gait variability and the ability to synchronize to the cues was studied in young adults. Participants walked with rhythmic cues presented at different stimulus-onset asynchronies. We observed a multisensory benefit by comparing audio-tactile to unimodal stimulation. Moreover, both the temporal window of integration and temporal recalibration mediated the response to multimodal stimulation. In sum, rhythmic behaviours obey the same principles as temporal discrimination and detection behaviours and thus can also benefit from multimodal stimulation.

  20. Multisensory Integration and Child Neurodevelopment

    PubMed Central

    Dionne-Dostie, Emmanuelle; Paquette, Natacha; Lassonde, Maryse; Gallagher, Anne

    2015-01-01

    A considerable number of cognitive processes depend on the integration of multisensory information. The brain integrates this information, providing a complete representation of our surrounding world and giving us the ability to react optimally to the environment. Infancy is a period of great changes in brain structure and function that are reflected by the increase of processing capacities of the developing child. However, it is unclear if the optimal use of multisensory information is present early in childhood or develops only later, with experience. The first part of this review has focused on the typical development of multisensory integration (MSI). We have described the two hypotheses on the developmental process of MSI in neurotypical infants and children, and have introduced MSI and its neuroanatomic correlates. The second section has discussed the neurodevelopmental trajectory of MSI in cognitively-challenged infants and children. A few studies have brought to light various difficulties to integrate sensory information in children with a neurodevelopmental disorder. Consequently, we have exposed certain possible neurophysiological relationships between MSI deficits and neurodevelopmental disorders, especially dyslexia and attention deficit disorder with/without hyperactivity. PMID:25679116

  1. Multisensory Teaching of Basic Language Skills.

    ERIC Educational Resources Information Center

    Birsh, Judith R., Ed.

    This text on multisensory structured language education (MSLE) provides the foundation for MSLE and offers components of instruction and effective teaching strategies that teachers can put into practice for students with dyslexia and others struggling to learn to read, write, and spell. Chapters include: (1) "Multisensory Instruction" (Louisa C.…

  2. Multi-Sensory Intervention Observational Research

    ERIC Educational Resources Information Center

    Thompson, Carla J.

    2011-01-01

    An observational research study based on sensory integration theory was conducted to examine the observed impact of student selected multi-sensory experiences within a multi-sensory intervention center relative to the sustained focus levels of students with special needs. A stratified random sample of 50 students with severe developmental…

  3. Multisensory Teaching of Basic Language Skills.

    ERIC Educational Resources Information Center

    Birsh, Judith R., Ed.

    This text on multisensory structured language education (MSLE) provides the foundation for MSLE and offers components of instruction and effective teaching strategies that teachers can put into practice for students with dyslexia and others struggling to learn to read, write, and spell. Chapters include: (1) "Multisensory Instruction" (Louisa C.…

  4. Multisensory attention training for treatment of tinnitus.

    PubMed

    Spiegel, D P; Linford, T; Thompson, B; Petoe, M A; Kobayashi, K; Stinear, C M; Searchfield, G D

    2015-05-28

    Tinnitus is the conscious perception of sound with no physical sound source. Some models of tinnitus pathophysiology suggest that networks associated with attention, memory, distress and multisensory experience are involved in tinnitus perception. The aim of this study was to evaluate whether a multisensory attention training paradigm which used audio, visual, and somatosensory stimulation would reduce tinnitus. Eighteen participants with predominantly unilateral chronic tinnitus were randomized between two groups receiving 20 daily sessions of either integration (attempting to reduce salience to tinnitus by binding with multisensory stimuli) or attention diversion (multisensory stimuli opposite side to tinnitus) training. The training resulted in small but statistically significant reductions in Tinnitus Functional Index and Tinnitus Severity Numeric Scale scores and improved attentional abilities. No statistically significant improvements in tinnitus were found between the training groups. This study demonstrated that a short period of multisensory attention training reduced unilateral tinnitus, but directing attention toward or away from the tinnitus side did not differentiate this effect.

  5. Multisensory attention training for treatment of tinnitus

    PubMed Central

    Spiegel, D. P.; Linford, T.; Thompson, B.; Petoe, M. A.; Kobayashi, K.; Stinear, C. M.; Searchfield, G. D.

    2015-01-01

    Tinnitus is the conscious perception of sound with no physical sound source. Some models of tinnitus pathophysiology suggest that networks associated with attention, memory, distress and multisensory experience are involved in tinnitus perception. The aim of this study was to evaluate whether a multisensory attention training paradigm which used audio, visual, and somatosensory stimulation would reduce tinnitus. Eighteen participants with predominantly unilateral chronic tinnitus were randomized between two groups receiving 20 daily sessions of either integration (attempting to reduce salience to tinnitus by binding with multisensory stimuli) or attention diversion (multisensory stimuli opposite side to tinnitus) training. The training resulted in small but statistically significant reductions in Tinnitus Functional Index and Tinnitus Severity Numeric Scale scores and improved attentional abilities. No statistically significant improvements in tinnitus were found between the training groups. This study demonstrated that a short period of multisensory attention training reduced unilateral tinnitus, but directing attention toward or away from the tinnitus side did not differentiate this effect. PMID:26020589

  6. Multisensory integration is independent of perceived simultaneity.

    PubMed

    Harrar, Vanessa; Harris, Laurence R; Spence, Charles

    2017-03-01

    The importance of multisensory integration for perception and action has long been recognised. Integrating information from individual senses increases the chance of survival by reducing the variability in the incoming signals, thus allowing us to respond more rapidly. Reaction times (RTs) are fastest when the components of the multisensory signals are simultaneous. This response facilitation is traditionally attributed to multisensory integration. However, it is unclear whether facilitation of RTs occurs when stimuli are perceived as synchronous or when they are actually physically synchronous. Repeated exposure to audiovisual asynchrony can change the delay at which multisensory stimuli are perceived as simultaneous, thus changing the delay at which the stimuli are perceptually integrated. Here we set out to determine how such changes in multisensory integration for perception affect our ability to respond to multisensory events. If stimuli perceived as simultaneous were reacted to most rapidly, it would suggest a common system for multisensory integration for perception and action. If not, it would suggest separate systems. We measured RTs to auditory, visual, and audiovisual stimuli following exposure to audiovisual asynchrony. Exposure affected the variability of the unisensory RT distributions; in particular, the slowest RTs were either sped up or slowed down (in the direction predicted from shifts in perceived simultaneity). Additionally, the multisensory facilitation of RTs (beyond statistical summation) only occurred when audiovisual onsets were physically synchronous, rather than when they appeared simultaneous. We conclude that the perception of synchrony is therefore independent of multisensory integration and suggest a division between multisensory processes that are fast (automatic and unaffected by temporal adaptation) and those that are slow (perceptually driven and adaptable).

  7. Connectional parameters determine multisensory processing in a spiking network model of multisensory convergence.

    PubMed

    Lim, H K; Keniston, L P; Shin, J H; Allman, B L; Meredith, M A; Cios, K J

    2011-09-01

    For the brain to synthesize information from different sensory modalities, connections from different sensory systems must converge onto individual neurons. However, despite being the definitive, first step in the multisensory process, little is known about multisensory convergence at the neuronal level. This lack of knowledge may be due to the difficulty for biological experiments to manipulate and test the connectional parameters that define convergence. Therefore, the present study used a computational network of spiking neurons to measure the influence of convergence from two separate projection areas on the responses of neurons in a convergent area. Systematic changes in the proportion of extrinsic projections, the proportion of intrinsic connections, or the amount of local inhibitory contacts affected the multisensory properties of neurons in the convergent area by influencing (1) the proportion of multisensory neurons generated, (2) the proportion of neurons that generate integrated multisensory responses, and (3) the magnitude of multisensory integration. These simulations provide insight into the connectional parameters of convergence that contribute to the generation of populations of multisensory neurons in different neural regions as well as indicate that the simple effect of multisensory convergence is sufficient to generate multisensory properties like those of biological multisensory neurons.
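
    As a concrete (and much simplified) illustration of convergence onto a single multisensory neuron, the sketch below drives a leaky integrate-and-fire unit with two independent Poisson projections and reports the conventional multisensory enhancement index (combined response versus the best unisensory response). It is a toy single-neuron model with arbitrary parameters, not a re-implementation of the spiking network described in this record.

```python
import numpy as np

rng = np.random.default_rng(1)

def mean_spike_count(rate_a, rate_b, w=2.5, tau=20.0, v_thresh=6.0,
                     dt=1.0, t_max=500.0, n_trials=100):
    """Mean spike count of a leaky integrate-and-fire neuron receiving two
    converging Poisson projections ("modality A" and "modality B"); rates in Hz."""
    steps = int(t_max / dt)
    counts = np.zeros(n_trials)
    for trial in range(n_trials):
        ev_a = rng.poisson(rate_a * dt / 1000.0, steps)   # spikes from area A
        ev_b = rng.poisson(rate_b * dt / 1000.0, steps)   # spikes from area B
        v, spikes = 0.0, 0
        for i_a, i_b in zip(ev_a, ev_b):
            v += dt * (-v / tau) + w * (i_a + i_b)        # leaky integration
            if v >= v_thresh:                             # threshold crossing
                spikes += 1
                v = 0.0                                   # reset after a spike
        counts[trial] = spikes
    return counts.mean()

r_a = mean_spike_count(rate_a=100.0, rate_b=0.0)     # modality A alone
r_b = mean_spike_count(rate_a=0.0, rate_b=100.0)     # modality B alone
r_ab = mean_spike_count(rate_a=100.0, rate_b=100.0)  # both projections active

best_uni = max(r_a, r_b)
enhancement = 100.0 * (r_ab - best_uni) / best_uni if best_uni > 0 else float("nan")
print(f"A alone: {r_a:.1f}, B alone: {r_b:.1f}, combined: {r_ab:.1f} spikes")
print(f"multisensory enhancement: {enhancement:.0f}%")
```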

  8. Visual learning in multisensory environments.

    PubMed

    Jacobs, Robert A; Shams, Ladan

    2010-04-01

    We study the claim that multisensory environments are useful for visual learning because nonvisual percepts can be processed to produce error signals that people can use to adapt their visual systems. This hypothesis is motivated by a Bayesian network framework. The framework is useful because it ties together three observations that have appeared in the literature: (a) signals from nonvisual modalities can "teach" the visual system; (b) signals from nonvisual modalities can facilitate learning in the visual system; and (c) visual signals can become associated with (or be predicted by) signals from nonvisual modalities. Experimental data consistent with each of these observations are reviewed.

  9. Altered multisensory temporal integration in obesity

    PubMed Central

    Scarpina, Federica; Migliorati, Daniele; Marzullo, Paolo; Mauro, Alessandro; Scacchi, Massimo; Costantini, Marcello

    2016-01-01

    Eating is a multisensory behavior. The act of placing food in the mouth provides us with a variety of sensory information, including gustatory, olfactory, somatosensory, visual, and auditory. Evidence suggests altered eating behavior in obesity. Nonetheless, multisensory integration in obesity has been scantily investigated so far. Starting from this gap in the literature, we seek to provide the first comprehensive investigation of multisensory integration in obesity. Twenty male obese participants and twenty male healthy-weight participants took part in the study aimed at describing the multisensory temporal binding window (TBW). The TBW is defined as the range of stimulus onset asynchrony in which multiple sensory inputs have a high probability of being integrated. To investigate possible multisensory temporal processing deficits in obesity, we investigated performance in two multisensory audiovisual temporal tasks, namely simultaneity judgment and temporal order judgment. Results showed a wider TBW in obese participants as compared to healthy-weight controls. This holds true for both the simultaneity judgment and the temporal order judgment tasks. An explanatory hypothesis would regard the effect of metabolic alterations and low-grade inflammatory state, clinically observed in obesity, on the temporal organization of ongoing brain activity, which is one of the neural mechanisms enabling multisensory integration. PMID:27324727

  10. Multisensory maps in parietal cortex

    PubMed Central

    Sereno, Martin I; Huang, Ruey-Song

    2014-01-01

    Parietal cortex has long been known to be a site of sensorimotor integration. Recent findings in humans have shown that it is divided up into a number of small areas somewhat specialized for eye movements, reaching, and hand movements, but also face-related movements (avoidance, eating), lower body movements, and movements coordinating multiple body parts. The majority of these areas contain rough sensory (receptotopic) maps, including a substantial multisensory representation of the lower body and lower visual field immediately medial to face VIP. There is strong evidence for retinotopic remapping in LIP and face-centered remapping in VIP, and weaker evidence for hand-centered remapping. The larger size of the functionally distinct inferior parietal default mode network in humans compared to monkeys results in a superior and medial displacement of middle parietal areas (e.g., the saccade-related LIP's). Multisensory superior parietal areas located anterior to the angular gyrus such as AIP and VIP are less medially displaced relative to macaque monkeys, so that human LIP paradoxically ends up medial to human VIP. PMID:24492077

  11. Multisensory temporal integration in autism spectrum disorders.

    PubMed

    Stevenson, Ryan A; Siemann, Justin K; Schneider, Brittany C; Eberly, Haley E; Woynaroski, Tiffany G; Camarata, Stephen M; Wallace, Mark T

    2014-01-15

    The new DSM-5 diagnostic criteria for autism spectrum disorders (ASDs) include sensory disturbances in addition to the well-established language, communication, and social deficits. One sensory disturbance seen in ASD is an impaired ability to integrate multisensory information into a unified percept. This may arise from an underlying impairment in which individuals with ASD have difficulty perceiving the temporal relationship between cross-modal inputs, an important cue for multisensory integration. Such impairments in multisensory processing may cascade into higher-level deficits, impairing day-to-day functioning on tasks, such as speech perception. To investigate multisensory temporal processing deficits in ASD and their links to speech processing, the current study mapped performance on a number of multisensory temporal tasks (with both simple and complex stimuli) onto the ability of individuals with ASD to perceptually bind audiovisual speech signals. High-functioning children with ASD were compared with a group of typically developing children. Performance on the multisensory temporal tasks varied with stimulus complexity for both groups; less precise temporal processing was observed with increasing stimulus complexity. Notably, individuals with ASD showed a speech-specific deficit in multisensory temporal processing. Most importantly, the strength of perceptual binding of audiovisual speech observed in individuals with ASD was strongly related to their low-level multisensory temporal processing abilities. Collectively, the results represent the first to illustrate links between multisensory temporal function and speech processing in ASD, strongly suggesting that deficits in low-level sensory processing may cascade into higher-order domains, such as language and communication.

  12. Nucleus-nucleus potentials

    SciTech Connect

    Satchler, G.R.

    1983-01-01

    The significance of a nucleus-nucleus potential is discussed. Information about such potentials obtained from scattering experiments is reviewed, including recent examples of so-called rainbow scattering that probe the potential at smaller distances. The evidence for interactions involving the nuclear spins is summarized, along with their possible origin in couplings to non-elastic channels. Various models of the potentials are discussed.

  13. Impact of response duration on multisensory integration.

    PubMed

    Ghose, Dipanwita; Barnett, Zachary P; Wallace, Mark T

    2012-11-01

    Multisensory neurons in the superior colliculus (SC) have been shown to have large receptive fields that are heterogeneous in nature. These neurons have the capacity to integrate their different sensory inputs, a process that has been shown to depend on the physical characteristics of the stimuli that are combined (i.e., spatial and temporal relationship and relative effectiveness). Recent work has highlighted the interdependence of these factors in driving multisensory integration, adding a layer of complexity to our understanding of multisensory processes. In the present study our goal was to add to this understanding by characterizing how stimulus location impacts the temporal dynamics of multisensory responses in cat SC neurons. The results illustrate that locations within the spatial receptive fields (SRFs) of these neurons can be divided into those showing short-duration responses and long-duration response profiles. Most importantly, discharge duration appears to be a good determinant of multisensory integration, such that short-duration responses are typically associated with a high magnitude of multisensory integration (i.e., superadditive responses) while long-duration responses are typically associated with low integrative capacity. These results further reinforce the complexity of the integrative features of SC neurons and show that the large SRFs of these neurons are characterized by vastly differing temporal dynamics, dynamics that strongly shape the integrative capacity of these neurons.

  14. Multisensory Integration Uses a Real-Time Unisensory-Multisensory Transform.

    PubMed

    Miller, Ryan L; Stein, Barry E; Rowland, Benjamin A

    2017-05-17

    The manner in which the brain integrates different sensory inputs to facilitate perception and behavior has been the subject of numerous speculations. By examining multisensory neurons in cat superior colliculus, the present study demonstrated that two operational principles are sufficient to understand how this remarkable result is achieved: (1) unisensory signals are integrated continuously and in real time as soon as they arrive at their common target neuron and (2) the resultant multisensory computation is modified in shape and timing by a delayed, calibrating inhibition. These principles were tested for descriptive sufficiency by embedding them in a neurocomputational model and using it to predict a neuron's moment-by-moment multisensory response given only knowledge of its responses to the individual modality-specific component cues. The predictions proved to be highly accurate, reliable, and unbiased and were, in most cases, not statistically distinguishable from the neuron's actual instantaneous multisensory response at any phase throughout its entire duration. The model was also able to explain why different multisensory products are often observed in different neurons at different time points, as well as the higher-order properties of multisensory integration, such as the dependency of multisensory products on the temporal alignment of crossmodal cues. These observations not only reveal this fundamental integrative operation, but also identify quantitatively the multisensory transform used by each neuron. As a result, they provide a means of comparing the integrative profiles among neurons and evaluating how they are affected by changes in intrinsic or extrinsic factors.SIGNIFICANCE STATEMENT Multisensory integration is the process by which the brain combines information from multiple sensory sources (e.g., vision and audition) to maximize an organism's ability to identify and respond to environmental stimuli. The actual transformative process by which

  15. Multi-sensory integration in brainstem and auditory cortex.

    PubMed

    Basura, Gregory J; Koehler, Seth D; Shore, Susan E

    2012-11-16

    Tinnitus is the perception of sound in the absence of a physical sound stimulus. It is thought to arise from aberrant neural activity within central auditory pathways that may be influenced by multiple brain centers, including the somatosensory system. Auditory-somatosensory (bimodal) integration occurs in the dorsal cochlear nucleus (DCN), where electrical activation of somatosensory regions alters pyramidal cell spike timing and firing rates in response to sound stimuli. Moreover, in conditions of tinnitus, bimodal integration in DCN is enhanced, producing greater spontaneous and sound-driven neural activity, which are neural correlates of tinnitus. In primary auditory cortex (A1), a similar auditory-somatosensory integration has been described in the normal system (Lakatos et al., 2007), where sub-threshold multisensory modulation may be a direct reflection of subcortical multisensory responses (Tyll et al., 2011). The present work utilized simultaneous recordings from both DCN and A1 to directly compare bimodal integration across these separate brain stations of the intact auditory pathway. Four-shank, 32-channel electrodes were placed in DCN and A1 to simultaneously record tone-evoked unit activity in the presence and absence of spinal trigeminal nucleus (Sp5) electrical activation. Bimodal stimulation led to long-lasting facilitation or suppression of single and multi-unit responses to subsequent sound in both DCN and A1. Immediate (bimodal response) and long-lasting (bimodal plasticity) effects of Sp5-tone stimulation were facilitation or suppression of tone-evoked firing rates in DCN and A1 at all Sp5-tone pairing intervals (10, 20, and 40 ms), and greater suppression at 20 ms pairing intervals for single unit responses. Understanding the complex relationships between DCN and A1 bimodal processing in the normal animal provides the basis for studying its disruption in hearing loss and tinnitus models. This article is part of a Special Issue entitled: Tinnitus Neuroscience.

  16. Decentralized Multisensory Information Integration in Neural Systems

    PubMed Central

    Zhang, Wen-hao; Chen, Aihua

    2016-01-01

    How multiple sensory cues are integrated in neural circuitry remains a challenge. The common hypothesis is that information integration might be accomplished in a dedicated multisensory integration area receiving feedforward inputs from the modalities. However, recent experimental evidence suggests that it is not a single multisensory brain area, but rather many multisensory brain areas that are simultaneously involved in the integration of information. Why many mutually connected areas should be needed for information integration is puzzling. Here, we investigated theoretically how information integration could be achieved in a distributed fashion within a network of interconnected multisensory areas. Using biologically realistic neural network models, we developed a decentralized information integration system that comprises multiple interconnected integration areas. Studying an example of combining visual and vestibular cues to infer heading direction, we show that such a decentralized system is in good agreement with anatomical evidence and experimental observations. In particular, we show that this decentralized system can integrate information optimally. The decentralized system predicts that optimally integrated information should emerge locally from the dynamics of the communication between brain areas and sheds new light on the interpretation of the connectivity between multisensory brain areas. SIGNIFICANCE STATEMENT To extract information reliably from ambiguous environments, the brain integrates multiple sensory cues, which provide different aspects of information about the same entity of interest. Here, we propose a decentralized architecture for multisensory integration. In such a system, no processor is in the center of the network topology and information integration is achieved in a distributed manner through reciprocally connected local processors. Through studying the inference of heading direction with visual and vestibular cues, we show that
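
    The "optimal integration" benchmark against which models like this are usually compared is maximum-likelihood cue combination: with Gaussian likelihoods, each cue is weighted by its inverse variance and the fused estimate is more reliable than either cue alone. The sketch below computes that benchmark for a hypothetical visual-vestibular heading estimate; it illustrates the normative target, not the decentralized network itself, and all numbers are made up.

```python
import numpy as np

def fuse_gaussian_cues(mu_vis, sigma_vis, mu_ves, sigma_ves):
    """Maximum-likelihood fusion of two Gaussian heading cues.

    Each cue is weighted by its reliability (inverse variance); the fused
    variance is always <= the smaller single-cue variance.
    """
    w_vis = 1.0 / sigma_vis**2
    w_ves = 1.0 / sigma_ves**2
    mu = (w_vis * mu_vis + w_ves * mu_ves) / (w_vis + w_ves)
    sigma = np.sqrt(1.0 / (w_vis + w_ves))
    return mu, sigma

# Hypothetical single-trial heading estimates (degrees) and cue noise levels.
mu, sigma = fuse_gaussian_cues(mu_vis=12.0, sigma_vis=3.0,
                               mu_ves=8.0, sigma_ves=6.0)
print(f"fused heading: {mu:.1f} deg, fused sd: {sigma:.2f} deg")
# The fused estimate lies closer to the more reliable (visual) cue, and the
# fused sd (~2.68) is below the best single-cue sd (3.0).
```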

  17. Computing Multisensory Target Probabilities on a Neural Map

    DTIC Science & Technology

    2007-11-02

    … integrates multisensory input and participates in the generation of orienting movements directed toward the source of sensory stimulation (target). … SC neurons exhibit multisensory enhancement, in which the response to input of one modality is augmented by input of another modality. … (Anastasio, T. J.; Patton, P. E., Department of Molecular and Integrative Physiology)

  18. Multisensory body representation in autoimmune diseases.

    PubMed

    Finotti, Gianluca; Costantini, Marcello

    2016-02-12

    Body representation has been linked to the processing and integration of multisensory signals. An outstanding example of the pivotal role played by multisensory mechanisms in body representation is the Rubber Hand Illusion (RHI). In this paradigm, multisensory stimulation induces a sense of ownership over a fake limb. Previous work has shown high interindividual differences in the susceptibility to the RHI. The origin of this variability remains largely unknown. Given the tight and bidirectional communication between the brain and the immune system, we predicted that the origin of this variability could be traced, in part, to the immune system's functioning, which is altered by several clinical conditions, including Coeliac Disease (CD). Consistent with this prediction, we found that the Rubber Hand Illusion is stronger in CD patients as compared to healthy controls. We propose a biochemical mechanism accounting for the dependency of multisensory body representation upon the Immune system. Our finding has direct implications for a range of neurological, psychiatric and immunological conditions where alterations of multisensory integration, body representation and dysfunction of the immune system co-exist.

  19. Multisensory body representation in autoimmune diseases

    PubMed Central

    Finotti, Gianluca; Costantini, Marcello

    2016-01-01

    Body representation has been linked to the processing and integration of multisensory signals. An outstanding example of the pivotal role played by multisensory mechanisms in body representation is the Rubber Hand Illusion (RHI). In this paradigm, multisensory stimulation induces a sense of ownership over a fake limb. Previous work has shown high interindividual differences in the susceptibility to the RHI. The origin of this variability remains largely unknown. Given the tight and bidirectional communication between the brain and the immune system, we predicted that the origin of this variability could be traced, in part, to the immune system’s functioning, which is altered by several clinical conditions, including Coeliac Disease (CD). Consistent with this prediction, we found that the Rubber Hand Illusion is stronger in CD patients as compared to healthy controls. We propose a biochemical mechanism accounting for the dependency of multisensory body representation upon the Immune system. Our finding has direct implications for a range of neurological, psychiatric and immunological conditions where alterations of multisensory integration, body representation and dysfunction of the immune system co-exist. PMID:26867786

  20. A multisensory perspective on object memory.

    PubMed

    Matusz, Pawel J; Wallace, Mark T; Murray, Micah M

    2017-04-08

    Traditional studies of memory and object recognition involved objects presented within a single sensory modality (i.e., purely visual or purely auditory objects). However, in naturalistic settings, objects are often evaluated and processed in a multisensory manner. This begets the question of how object representations that combine information from the different senses are created and utilised by memory functions. Here we review research that has demonstrated that a single multisensory exposure can influence memory for both visual and auditory objects. In an old/new object discrimination task, objects that were presented initially with a task-irrelevant stimulus in another sense were better remembered compared to stimuli presented alone, most notably when the two stimuli were semantically congruent. The brain discriminates between these two types of object representations within the first 100ms post-stimulus onset, indicating early "tagging" of objects/events by the brain based on the nature of their initial presentation context. Interestingly, the specific brain networks supporting the improved object recognition vary based on a variety of factors, including the effectiveness of the initial multisensory presentation and the sense that is task-relevant. We specify the requisite conditions for multisensory contexts to improve object discrimination following single exposures, and the individual differences that exist with respect to these improvements. Our results shed light onto how memory operates on the multisensory nature of object representations as well as how the brain stores and retrieves memories of objects.

  1. Causal Inference in Multisensory Heading Estimation

    PubMed Central

    Katliar, Mikhail; Bülthoff, Heinrich H.

    2017-01-01

    A large body of research shows that the Central Nervous System (CNS) integrates multisensory information. However, this strategy should only apply to multisensory signals that have a common cause; independent signals should be segregated. Causal Inference (CI) models account for this notion. Surprisingly, previous findings suggested that visual and inertial cues on heading of self-motion are integrated regardless of discrepancy. We hypothesized that CI does occur, but that characteristics of the motion profiles affect multisensory processing. Participants estimated heading of visual-inertial motion stimuli with several different motion profiles and a range of intersensory discrepancies. The results support the hypothesis that judgments of signal causality are included in the heading estimation process. Moreover, the data suggest a decreasing tolerance for discrepancies and an increasing reliance on visual cues for longer duration motions. PMID:28060957
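
    Causal Inference (CI) models of the kind referred to in this record generally follow Körding et al.'s Bayesian formulation: infer the posterior probability that the visual and inertial cues share a common cause, then combine the fused and segregated estimates accordingly (here by model averaging). The sketch below is a generic version of that computation with made-up noise parameters; it is not the specific model or data from the paper.

```python
import numpy as np
from scipy.stats import multivariate_normal, norm

def causal_inference_heading(x_vis, x_ves, sigma_vis=2.0, sigma_ves=4.0,
                             mu_prior=0.0, sigma_prior=20.0, p_common=0.5):
    """Bayesian causal-inference estimate of heading from two noisy cues.

    Weighs a common-cause interpretation (cues fused by reliability) against
    an independent-causes interpretation, and model-averages the resulting
    estimates by the posterior probability of a common cause.
    """
    var_v, var_s, var_p = sigma_vis**2, sigma_ves**2, sigma_prior**2

    # Marginal likelihood of the cue pair under a single common cause:
    # both cues equal the same latent heading (drawn from the prior) plus noise.
    cov_c1 = np.array([[var_p + var_v, var_p],
                       [var_p, var_p + var_s]])
    like_c1 = multivariate_normal.pdf([x_vis, x_ves],
                                      mean=[mu_prior, mu_prior], cov=cov_c1)
    # Marginal likelihood under two independent causes.
    like_c2 = (norm.pdf(x_vis, mu_prior, np.sqrt(var_p + var_v))
               * norm.pdf(x_ves, mu_prior, np.sqrt(var_p + var_s)))

    post_c1 = like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

    # Reliability-weighted estimates under each causal structure.
    fused = ((x_vis / var_v + x_ves / var_s + mu_prior / var_p)
             / (1 / var_v + 1 / var_s + 1 / var_p))
    vis_only = (x_vis / var_v + mu_prior / var_p) / (1 / var_v + 1 / var_p)

    # Model averaging: final heading report based on the visual estimate.
    return post_c1 * fused + (1 - post_c1) * vis_only, post_c1

# Small discrepancy -> cues are mostly fused; large discrepancy -> segregated.
for dx in (5.0, 40.0):
    est, p_c1 = causal_inference_heading(x_vis=10.0, x_ves=10.0 + dx)
    print(f"discrepancy {dx:>4.0f} deg: p(common cause)={p_c1:.2f}, "
          f"visual heading estimate={est:.1f} deg")
```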

  2. Multisensory integration: flexible use of general operations

    PubMed Central

    van Atteveldt, Nienke; Murray, Micah M.; Thut, Gregor; Schroeder, Charles

    2014-01-01

    Research into the anatomical substrates and “principles” for integrating inputs from separate sensory surfaces has yielded divergent findings. This suggests that multisensory integration is flexible and context-dependent, and underlines the need for dynamically adaptive neuronal integration mechanisms. We propose that flexible multisensory integration can be explained by a combination of canonical, population-level integrative operations, such as oscillatory phase-resetting and divisive normalization. These canonical operations subsume multisensory integration into a fundamental set of principles as to how the brain integrates all sorts of information, and they are being used proactively and adaptively. We illustrate this proposition by unifying recent findings from different research themes such as timing, behavioral goal and experience-related differences in integration. PMID:24656248
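
    Divisive normalization, one of the canonical operations this record invokes, has a simple standard form: a unit's driven response is divided by a normalization term (a semi-saturation constant plus pooled drive). The toy sketch below shows how this single operation yields inverse effectiveness in a multisensory unit, with weak cue pairs combining super-additively and strong pairs sub-additively. It is a single-unit simplification of population-level normalization models, with arbitrary parameters, not a model taken from the paper.

```python
def normalized_response(i_vis, i_aud, w_vis=1.0, w_aud=1.0,
                        n=2.0, alpha=5.0, r_max=100.0):
    """Single-unit divisive-normalization response to visual + auditory drive.

    Linear multisensory drive is passed through an expansive nonlinearity and
    divided by a semi-saturation term (a simplification of population-level
    normalization models of multisensory integration).
    """
    drive = (w_vis * i_vis + w_aud * i_aud) ** n
    return r_max * drive / (alpha ** n + drive)

def additivity_index(i_vis, i_aud):
    """Combined response relative to the sum of the unisensory responses."""
    r_v = normalized_response(i_vis, 0.0)
    r_a = normalized_response(0.0, i_aud)
    r_va = normalized_response(i_vis, i_aud)
    return r_va / (r_v + r_a)

for intensity in (1.0, 3.0, 10.0, 30.0):
    idx = additivity_index(intensity, intensity)
    regime = "super-additive" if idx > 1 else "sub-additive"
    print(f"input strength {intensity:>4}: combined/sum = {idx:.2f} ({regime})")
```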

  3. Functional Analytic Multisensory Environmental Therapy for People with Dementia

    PubMed Central

    Staal, Jason A.

    2012-01-01

    This paper introduces Functional Analytic Multisensory Environmental Therapy (FAMSET) for use with elders with dementia while using a multisensory environment/snoezelen room. The model introduces behavioral theory and practice to the multisensory environment treatment, addressing assessment and within-session techniques, and integrating behavioral interventions with emotion-oriented care. A modular approach is emphasized to delineate different treatment phases for multisensory environment therapy. The aim of the treatment is to provide a safe and effective framework for reducing the behavioral disturbance of the disease process, increasing elder well-being, and promoting the transfer of positive effects to other environments outside of the multisensory treatment room. PMID:22347667

  4. The Complex Interplay Between Multisensory Integration and Perceptual Awareness.

    PubMed

    Deroy, O; Faivre, N; Lunghi, C; Spence, C; Aller, M; Noppeney, U

    2016-01-01

    The integration of information has been considered a hallmark of human consciousness, as it requires information being globally available via widespread neural interactions. Yet the complex interdependencies between multisensory integration and perceptual awareness, or consciousness, remain to be defined. While perceptual awareness has traditionally been studied in a single sense, in recent years we have witnessed a surge of interest in the role of multisensory integration in perceptual awareness. Based on a recent IMRF symposium on multisensory awareness, this review discusses three key questions from conceptual, methodological and experimental perspectives: (1) What do we study when we study multisensory awareness? (2) What is the relationship between multisensory integration and perceptual awareness? (3) Which experimental approaches are most promising to characterize multisensory awareness? We hope that this review paper will provoke lively discussions, novel experiments, and conceptual considerations to advance our understanding of the multifaceted interplay between multisensory integration and consciousness.

  5. The Complex Interplay Between Multisensory Integration and Perceptual Awareness

    PubMed Central

    Aller, M.; Noppeney, U.

    2016-01-01

    The integration of information has been considered a hallmark of human consciousness, as it requires information being globally available via widespread neural interactions. Yet the complex interdependencies between multisensory integration and perceptual awareness, or consciousness, remain to be defined. While perceptual awareness has traditionally been studied in a single sense, in recent years we have witnessed a surge of interest in the role of multisensory integration in perceptual awareness. Based on a recent IMRF symposium on multisensory awareness, this review discusses three key questions from conceptual, methodological and experimental perspectives: (1) What do we study when we study multisensory awareness? (2) What is the relationship between multisensory integration and perceptual awareness? (3) Which experimental approaches are most promising to characterize multisensory awareness? We hope that this review paper will provoke lively discussions, novel experiments, and conceptual considerations to advance our understanding of the multifaceted interplay between multisensory integration and consciousness. PMID:27795942

  6. Dual-targeted peptide-conjugated multifunctional fluorescent probe with AIEgen for efficient nucleus-specific imaging and long-term tracing of cancer cells

    PubMed Central

    Cheng, Yong; Sun, Chunli; Ou, Xiaowen; Liu, Bifeng

    2017-01-01

    Precisely targeted transportation of a long-term tracing reagent to a nucleus with low toxicity is one of the most challenging concerns in revealing cancer cell behaviors. Here, we report a dual-targeted peptide-conjugated multifunctional fluorescent probe (cNGR-CPP-NLS-RGD-PyTPE, TCNTP) with aggregation-induced emission (AIE) characteristic, for efficient nucleus-specific imaging and long-term and low-toxicity tracing of cancer cells. TCNTP mainly consists of two components: one is a functionalized combinatorial peptide (TCNT) containing two targeted peptides (cNGR and RGD), a cell-penetrating peptide (CPP) and a nuclear localization signal (NLS), which can specifically bind to a cell surface and effectively enter into the nucleus; the other one is an AIE-active tetraphenylethene derivative (PyTPE, a typical AIEgen) as a fluorescence imaging reagent. In the presence of aminopeptidase N (CD13) and integrin αvβ3, TCNTP can specifically bind to both of them using cNGR and RGD, respectively, lighting up its yellow fluorescence. Because it contains CPP, TCNTP can be efficiently internalized into the cytoplasm, and then be delivered into the nucleus with the help of NLS. TCNTP exhibited strong fluorescence in the nuclei of cells overexpressing CD13 and integrin αvβ3 due to its specific targeting ability, efficient transport capacity and AIE characteristic in a more crowded space. Furthermore, TCNTP can be applied for long-term tracing in living cells, scarcely affecting normal cells with negligible toxicity in more than ten passages. PMID:28626568

  7. Improving Vocabulary Acquisition with Multisensory Instruction

    ERIC Educational Resources Information Center

    D'Alesio, Rosemary; Scalia, Maureen T.; Zabel, Renee M.

    2007-01-01

    The purpose of this action research project was to improve student vocabulary acquisition through a multisensory, direct instructional approach. The study involved three teachers and a target population of 73 students in second and seventh grade classrooms. The intervention was implemented from September through December of 2006 and analyzed in…

  8. Multisensory integration, sensory substitution and visual rehabilitation.

    PubMed

    Proulx, Michael J; Ptito, Maurice; Amedi, Amir

    2014-04-01

    Sensory substitution has advanced remarkably over the past 35 years since first introduced to the scientific literature by Paul Bach-y-Rita. In this issue dedicated to his memory, we describe a collection of reviews that assess the current state of neuroscience research on sensory substitution, visual rehabilitation, and multisensory processes.

  9. Multisensory Instruction in Foreign Language Education.

    ERIC Educational Resources Information Center

    Robles, Teresita del Rosario Caballero; Uglem, Craig Thomas Chase

    This paper reviews some theories that through history have explained the process of learning. It also taps some new findings on how the brain learns. Multisensory instruction is a pedagogic strategy that covers the greatest number of individual preferences in the classroom, language laboratories, and multimedia rooms for a constant and diverse…

  10. Supervised calibration relies on the multisensory percept

    PubMed Central

    Zaidel, Adam; Ma, Wei Ji; Angelaki, Dora E.

    2013-01-01

    Multisensory plasticity enables us to dynamically adapt sensory cues to one another and to the environment. Without external feedback, “unsupervised” multisensory calibration reduces cue conflict in a manner largely independent of cue-reliability. But environmental feedback regarding cue-accuracy (“supervised”) also affects calibration. Here we measured the combined influence of cue-accuracy and cue-reliability on supervised multisensory calibration, using discrepant visual and vestibular motion stimuli. When the less-reliable cue was inaccurate, it alone got calibrated. However, when the more-reliable cue was inaccurate, cues were yoked and calibrated together in the same direction. Strikingly, the less-reliable cue shifted away from external feedback, becoming less accurate. A computational model in which supervised and unsupervised calibration work in parallel, where the former only relies on the multisensory percept, but the latter can calibrate cues individually, accounts for the observed behavior. In combination, they could ultimately achieve the optimal solution of both external accuracy and internal consistency. PMID:24290205

  11. Laminar and connectional organization of a multisensory cortex.

    PubMed

    Foxworthy, W Alex; Clemo, H Ruth; Meredith, M Alex

    2013-06-01

    The transformation of sensory signals as they pass through cortical circuits has been revealed almost exclusively through studies of the primary sensory cortices, for which principles of laminar organization, local connectivity, and parallel processing have been elucidated. In contrast, almost nothing is known about the circuitry or laminar features of multisensory processing in higher order, multisensory cortex. Therefore, using the ferret higher order multisensory rostral posterior parietal (PPr) cortex, the present investigation employed a combination of multichannel recording and neuroanatomical techniques to elucidate the laminar basis of multisensory cortical processing. The proportion of multisensory neurons, the share of neurons showing multisensory integration, and the magnitude of multisensory integration were all found to differ by layer in a way that matched the functional or connectional characteristics of the PPr. Specifically, the supragranular layers (L2/3) demonstrated among the highest proportions of multisensory neurons and the highest incidence of multisensory response enhancement, while also receiving the highest levels of extrinsic inputs, exhibiting the highest dendritic spine densities, and providing a major source of local connectivity. In contrast, layer 6 showed the highest proportion of unisensory neurons while receiving the fewest external and local projections and exhibiting the lowest dendritic spine densities. Coupled with a lack of input from principal thalamic nuclei and a minimal layer 4, these observations indicate that this higher level multisensory cortex shows functional and organizational modifications from the well-known patterns identified for primary sensory cortical regions.

  12. Laminar and Connectional Organization of a Multisensory Cortex

    PubMed Central

    Foxworthy, W. Alex; Clemo, H. Ruth; Meredith, M. Alex

    2012-01-01

    The transformation of sensory signals as they pass through cortical circuits has been revealed almost exclusively through studies of the primary sensory cortices, where principles of laminar organization, local connectivity and parallel processing have been elucidated. In contrast, almost nothing is known about the circuitry or laminar features of multisensory processing in higher-order, multisensory cortex. Therefore, using the ferret higher-order multisensory rostral posterior parietal (PPr) cortex, the present investigation employed a combination of multichannel recording and neuroanatomical techniques to elucidate the laminar basis of multisensory cortical processing. The proportion of multisensory neurons, the share of neurons showing multisensory integration, and the magnitude of multisensory integration were all found to differ by layer in a way that matched the functional or connectional characteristics of the PPr. Specifically, the supragranular layers (L2–3) demonstrated among the highest proportions of multisensory neurons and the highest incidence of multisensory response enhancement, while also receiving the highest levels of extrinsic inputs, exhibiting the highest dendritic spine densities, and providing a major source of local connectivity. In contrast, layer 6 showed the highest proportion of unisensory neurons while receiving the fewest external and local projections and exhibiting the lowest dendritic spine densities. Coupled with a lack of input from principal thalamic nuclei and a minimal layer 4, these observations indicate that this higher-level multisensory cortex shows unique functional and organizational modifications from the well-known patterns identified for primary sensory cortical regions. PMID:23172137

  13. The interactions of multisensory integration with endogenous and exogenous attention.

    PubMed

    Tang, Xiaoyu; Wu, Jinglong; Shen, Yong

    2016-02-01

    Stimuli from multiple sensory organs can be integrated into a coherent representation through multiple phases of multisensory processing; this phenomenon is called multisensory integration. Multisensory integration can interact with attention. Here, we propose a framework in which attention modulates multisensory processing in both endogenous (goal-driven) and exogenous (stimulus-driven) ways. Moreover, multisensory integration exerts not only bottom-up but also top-down control over attention. Specifically, we propose the following: (1) endogenous attentional selectivity acts on multiple levels of multisensory processing to determine the extent to which simultaneous stimuli from different modalities can be integrated; (2) integrated multisensory events exert top-down control on attentional capture via multisensory search templates that are stored in the brain; (3) integrated multisensory events can capture attention efficiently, even in quite complex circumstances, due to their increased salience compared to unimodal events and can thus improve search accuracy; and (4) within a multisensory object, endogenous attention can spread from one modality to another in an exogenous manner.

  14. The interactions of multisensory integration with endogenous and exogenous attention

    PubMed Central

    Tang, Xiaoyu; Wu, Jinglong; Shen, Yong

    2016-01-01

    Stimuli from multiple sensory organs can be integrated into a coherent representation through multiple phases of multisensory processing; this phenomenon is called multisensory integration. Multisensory integration can interact with attention. Here, we propose a framework in which attention modulates multisensory processing in both endogenous (goal-driven) and exogenous (stimulus-driven) ways. Moreover, multisensory integration exerts not only bottom-up but also top-down control over attention. Specifically, we propose the following: (1) endogenous attentional selectivity acts on multiple levels of multisensory processing to determine the extent to which simultaneous stimuli from different modalities can be integrated; (2) integrated multisensory events exert top-down control on attentional capture via multisensory search templates that are stored in the brain; (3) integrated multisensory events can capture attention efficiently, even in quite complex circumstances, due to their increased salience compared to unimodal events and can thus improve search accuracy; and (4) within a multisensory object, endogenous attention can spread from one modality to another in an exogenous manner. PMID:26546734

  15. Where are multisensory signals combined for perceptual decision-making?

    PubMed

    Bizley, Jennifer K; Jones, Gareth P; Town, Stephen M

    2016-10-01

    Multisensory integration is observed in many subcortical and cortical locations including primary and non-primary sensory cortex, and higher cortical areas including frontal and parietal cortex. During unisensory perceptual tasks many of these same brain areas show neural signatures associated with decision-making. It is unclear whether multisensory representations in sensory cortex directly inform decision-making in a multisensory task, or if cross-modal signals are only combined after the accumulation of unisensory evidence at a final decision-making stage in higher cortical areas. Manipulations of neuronal activity are required to establish causal roles for given brain regions in multisensory perceptual decision-making, and so far indicate that distributed networks underlie multisensory decision-making. Understanding multisensory integration requires synthesis of small-scale pathway specific and large-scale network level manipulations.

  16. Multisensory Mechanisms of Gaze Stabilization and Flight Control

    DTIC Science & Technology

    2008-12-17

    Report cover-page and table-of-contents fragments; recoverable details: "Multisensory Mechanisms of Gaze Stabilization and Flight Control", H. G. Krapp (h.g.krapp@imperial.ac.uk), Imperial College, South Kensington Campus, London SW7 2AZ. Listed section topics include multisensory integration and efference copies in behaving animals, state-dependent processing in LPTCs, and characterization of head movements.

  17. Multisensory oddity detection as bayesian inference.

    PubMed

    Hospedales, Timothy; Vijayakumar, Sethu

    2009-01-01

    A key goal for the perceptual system is to optimally combine information from all the senses that may be available in order to develop the most accurate and unified picture possible of the outside world. The contemporary theoretical framework of ideal observer maximum likelihood integration (MLI) has been highly successful in modelling how the human brain combines information from a variety of different sensory modalities. However, in various recent experiments involving multisensory stimuli of uncertain correspondence, MLI breaks down as a successful model of sensory combination. Within the paradigm of direct stimulus estimation, perceptual models which use Bayesian inference to resolve correspondence have recently been shown to generalize successfully to these cases where MLI fails. This approach has been known variously as model inference, causal inference or structure inference. In this paper, we examine causal uncertainty in another important class of multisensory perception paradigm--that of oddity detection--and demonstrate how a Bayesian ideal observer also treats oddity detection as a structure inference problem. We validate this approach by showing that it provides an intuitive and quantitative explanation of an important pair of multisensory oddity detection experiments--involving cues across and within modalities--for which MLI previously failed dramatically, allowing a novel unifying treatment of within- and cross-modal multisensory perception. Our successful application of structure inference models to the new 'oddity detection' paradigm, and the resultant unified explanation of across- and within-modality cases provide further evidence to suggest that structure inference may be a commonly evolved principle for combining perceptual information in the brain.
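
    To make the structure-inference idea concrete, here is a toy ideal-observer rule for a three-interval oddity task under simplifying assumptions (independent Gaussian measurement noise and a flat prior over stimulus values): the marginal likelihood that a given interval contains the odd stimulus then depends only on how well the other two measurements agree relative to their combined noise. This is a sketch of the general approach, not the model fitted in the paper.

```python
import numpy as np

def oddity_choice(x, sigma):
    """Toy structure-inference rule for a 3-interval oddity task.

    x     -- three noisy internal measurements, one per interval
    sigma -- the measurement noise (std dev) associated with each interval
    Under a flat prior over stimulus values, the evidence that interval i is
    the odd one depends only on how well the OTHER two measurements agree.
    """
    x, sigma = np.asarray(x, float), np.asarray(sigma, float)
    log_like = np.empty(3)
    for i in range(3):
        j, k = [m for m in range(3) if m != i]
        var = sigma[j] ** 2 + sigma[k] ** 2
        # log N(x_j - x_k; 0, var): agreement of the putative 'same' pair
        log_like[i] = -0.5 * (x[j] - x[k]) ** 2 / var - 0.5 * np.log(var)
    return int(np.argmax(log_like))

# Interval 2 carries the odd stimulus; the observer recovers it from structure.
print(oddity_choice(x=[1.0, 1.2, 3.5], sigma=[0.5, 0.5, 0.5]))  # -> 2
```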

  18. Behavioural benefits of multisensory processing in ferrets.

    PubMed

    Hammond-Kenny, Amy; Bajo, Victoria M; King, Andrew J; Nodal, Fernando R

    2017-01-01

    Enhanced detection and discrimination, along with faster reaction times, are the most typical behavioural manifestations of the brain's capacity to integrate multisensory signals arising from the same object. In this study, we examined whether multisensory behavioural gains are observable across different components of the localization response that are potentially under the command of distinct brain regions. We measured the ability of ferrets to localize unisensory (auditory or visual) and spatiotemporally coincident auditory-visual stimuli of different durations that were presented from one of seven locations spanning the frontal hemifield. During the localization task, we recorded the head movements made following stimulus presentation, as a metric for assessing the initial orienting response of the ferrets, as well as the subsequent choice of which target location to approach to receive a reward. Head-orienting responses to auditory-visual stimuli were more accurate and faster than those made to visual but not auditory targets, suggesting that these movements were guided principally by sound alone. In contrast, approach-to-target localization responses were more accurate and faster to spatially congruent auditory-visual stimuli throughout the frontal hemifield than to either visual or auditory stimuli alone. Race model inequality analysis of head-orienting reaction times and approach-to-target response times indicates that different processes, probability summation and neural integration, respectively, are likely to be responsible for the effects of multisensory stimulation on these two measures of localization behaviour.
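
    The race model inequality mentioned here can be tested directly on reaction-time distributions: under probability summation alone, the multisensory CDF can never exceed the sum of the two unisensory CDFs at any time point (Miller's bound). The sketch below implements that check on synthetic data; the quantile grid, sample sizes, and RT parameters are illustrative choices, not values from this study.

```python
import numpy as np

def race_model_violation(rt_aud, rt_vis, rt_av, grid=None):
    """Maximum violation of Miller's race model inequality.

    The inequality states that, under probability summation alone,
    P(RT_AV <= t) <= P(RT_A <= t) + P(RT_V <= t) for every t.
    A positive return value is taken as evidence for neural integration.
    """
    rt_aud, rt_vis, rt_av = map(np.asarray, (rt_aud, rt_vis, rt_av))
    if grid is None:
        grid = np.percentile(np.concatenate([rt_aud, rt_vis, rt_av]),
                             np.arange(5, 100, 5))

    def cdf(sample, t):
        return np.mean(sample[:, None] <= t[None, :], axis=0)

    bound = np.minimum(cdf(rt_aud, grid) + cdf(rt_vis, grid), 1.0)
    return float(np.max(cdf(rt_av, grid) - bound))

# Illustrative synthetic RTs (ms); a positive value violates the race model.
rng = np.random.default_rng(0)
aud = rng.normal(320, 40, 200)
vis = rng.normal(340, 45, 200)
av = rng.normal(275, 35, 200)
print(f"max violation of Miller's bound: {race_model_violation(aud, vis, av):.3f}")
```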

  19. Multisensory integration and attention in developmental dyslexia.

    PubMed

    Harrar, Vanessa; Tammam, Jonathan; Pérez-Bellido, Alexis; Pitt, Anna; Stein, John; Spence, Charles

    2014-03-03

    Developmental dyslexia affects 5%-10% of the population, resulting in poor spelling and reading skills. While there are well-documented differences in the way dyslexics process low-level visual and auditory stimuli, it is mostly unknown whether there are similar differences in audiovisual multisensory processes. Here, we investigated audiovisual integration using the redundant target effect (RTE) paradigm. Some conditions demonstrating audiovisual integration appear to depend upon magnocellular pathways, and dyslexia has been associated with deficits in this pathway; so, we postulated that developmental dyslexics ("dyslexics" hereafter) would show differences in audiovisual integration compared with controls. Reaction times (RTs) to multisensory stimuli were compared with predictions from Miller's race model. Dyslexics showed difficulty shifting their attention between modalities; but such "sluggish attention shifting" (SAS) appeared only when dyslexics shifted their attention from the visual to the auditory modality. These results suggest that dyslexics distribute their crossmodal attention resources differently from controls, causing different patterns in multisensory responses compared to controls. From this, we propose that dyslexia training programs should take into account the asymmetric shifts of crossmodal attention.

  20. Uncovering Multisensory Processing through Non-Invasive Brain Stimulation.

    PubMed

    Bolognini, Nadia; Maravita, Angelo

    2011-01-01

    Most of the current knowledge about the mechanisms of multisensory integration of environmental stimuli by the human brain derives from neuroimaging experiments. However, neuroimaging studies do not always provide conclusive evidence about the causal role of a given area for multisensory interactions, since these techniques can mainly derive correlations between brain activations and behavior. Conversely, techniques of non-invasive brain stimulation (NIBS) represent a unique and powerful approach to inform models of causal relations between specific brain regions and individual cognitive and perceptual functions. Although NIBS has been widely used in cognitive neuroscience, its use in the study of multisensory processing in the human brain is a relatively novel field of research. In this paper, we review and discuss recent studies that have used two techniques of NIBS, namely transcranial magnetic stimulation and transcranial direct current stimulation, for investigating the causal involvement of unisensory and heteromodal cortical areas in multisensory processing, the effects of multisensory cues on cortical excitability in unisensory areas, and the putative functional connections among different cortical areas subserving multisensory interactions. The emerging view is that NIBS is an essential tool available to neuroscientists seeking causal relationships between a given area or network and multisensory processes. With its already large and fast increasing usage, future work using NIBS in isolation, as well as in conjunction with different neuroimaging techniques, could substantially improve our understanding of multisensory processing in the human brain.

  1. The COGs (context, object, and goals) in multisensory processing.

    PubMed

    ten Oever, Sanne; Romei, Vincenzo; van Atteveldt, Nienke; Soto-Faraco, Salvador; Murray, Micah M; Matusz, Pawel J

    2016-05-01

    Our understanding of how perception operates in real-world environments has been substantially advanced by studying both multisensory processes and "top-down" control processes influencing sensory processing via activity from higher-order brain areas, such as attention, memory, and expectations. As the two topics have been traditionally studied separately, the mechanisms orchestrating real-world multisensory processing remain unclear. Past work has revealed that the observer's goals gate the influence of many multisensory processes on brain and behavioural responses, whereas some other multisensory processes might occur independently of these goals. Consequently, other forms of top-down control beyond goal dependence are necessary to explain the full range of multisensory effects currently reported at the brain and the cognitive level. These forms of control include sensitivity to stimulus context as well as the detection of matches (or lack thereof) between a multisensory stimulus and categorical attributes of naturalistic objects (e.g. tools, animals). In this review we discuss and integrate the existing findings that demonstrate the importance of such goal-, object- and context-based top-down control over multisensory processing. We then put forward a few principles emerging from this literature review with respect to the mechanisms underlying multisensory processing and discuss their possible broader implications.

  2. TMS of posterior parietal cortex disrupts visual tactile multisensory integration.

    PubMed

    Pasalar, Siavash; Ro, Tony; Beauchamp, Michael S

    2010-05-01

    Functional neuroimaging studies have implicated a number of brain regions, especially the posterior parietal cortex (PPC), as being potentially important for visual-tactile multisensory integration. However, neuroimaging studies are correlational and do not prove the necessity of a region for the behavioral improvements that are the hallmark of multisensory integration. To remedy this knowledge gap, we interrupted activity in the PPC, near the junction of the anterior intraparietal sulcus and the postcentral sulcus, using MRI-guided transcranial magnetic stimulation (TMS) while subjects localized touches delivered to different fingers. As the touches were delivered, subjects viewed a congruent touch video, an incongruent touch video, or no video. Without TMS, a strong effect of multisensory integration was observed, with significantly better behavioral performance for discrimination of congruent multisensory touch than for unisensory touch alone. Incongruent multisensory touch produced a smaller improvement in behavioral performance. TMS of the PPC eliminated the behavioral advantage of both congruent and incongruent multisensory stimuli, reducing performance to unisensory levels. These results demonstrate a causal role for the PPC in visual-tactile multisensory integration. Taken together with converging evidence from other studies, these results support a model in which the PPC contains a map of space around the hand that receives input from both the visual and somatosensory modalities. Activity in this map is likely to be the neural substrate for visual-tactile multisensory integration.

  3. The multisensory nature of unisensory cortices: a puzzle continued.

    PubMed

    Kayser, Christoph

    2010-07-29

    Multisensory integration is central to perception, and recent work casts it as a distributed process involving many and even primary sensory cortices. Studies in behaving animals performing a multisensory task provide an ideal means to elucidate the underlying neural basis, and a new study by Lemus et al. in this issue of Neuron pushes in this direction.

  4. Multisensory Modalities for Blending and Segmenting among Early Readers

    ERIC Educational Resources Information Center

    Lee, Lay Wah

    2016-01-01

    With the advent of touch-screen interfaces on the tablet computer, multisensory elements in reading instruction have taken on a new dimension. This computer assisted language learning research aimed to determine whether specific technology features of a tablet computer can add to the functionality of multisensory instruction in early reading…

  5. Multisensory Teaching of Basic Language Skills Activity Book. Revised Edition

    ERIC Educational Resources Information Center

    Carreker, Suzanne; Birsh, Judith R.

    2011-01-01

    With the new edition of this activity book--the companion to Judith Birsh's bestselling text, "Multisensory Teaching of Basic Language Skills"--students and practitioners will get the practice they need to use multisensory teaching effectively with students who have dyslexia and other learning disabilities. Ideal for both pre-service teacher…

  6. Multisensory Modalities for Blending and Segmenting among Early Readers

    ERIC Educational Resources Information Center

    Lee, Lay Wah

    2016-01-01

    With the advent of touch-screen interfaces on the tablet computer, multisensory elements in reading instruction have taken on a new dimension. This computer assisted language learning research aimed to determine whether specific technology features of a tablet computer can add to the functionality of multisensory instruction in early reading…

  7. A Rational Analysis of the Acquisition of Multisensory Representations

    ERIC Educational Resources Information Center

    Yildirim, Ilker; Jacobs, Robert A.

    2012-01-01

    How do people learn multisensory, or amodal, representations, and what consequences do these representations have for perceptual performance? We address this question by performing a rational analysis of the problem of learning multisensory representations. This analysis makes use of a Bayesian nonparametric model that acquires latent multisensory…

  8. The Relationship between Multisensory Integration and IQ in Children

    ERIC Educational Resources Information Center

    Barutchu, Ayla; Crewther, Sheila G.; Fifer, Joanne; Shivdasani, Mohit N.; Innes-Brown, Hamish; Toohey, Sarah; Danaher, Jaclyn; Paolini, Antonio G.

    2011-01-01

    It is well accepted that multisensory integration has a facilitative effect on perceptual and motor processes, evolutionarily enhancing the chance of survival of many species, including humans. Yet, there is limited understanding of the relationship between multisensory processes, environmental noise, and children's cognitive abilities. Thus, this…

  9. The Relationship between Multisensory Integration and IQ in Children

    ERIC Educational Resources Information Center

    Barutchu, Ayla; Crewther, Sheila G.; Fifer, Joanne; Shivdasani, Mohit N.; Innes-Brown, Hamish; Toohey, Sarah; Danaher, Jaclyn; Paolini, Antonio G.

    2011-01-01

    It is well accepted that multisensory integration has a facilitative effect on perceptual and motor processes, evolutionarily enhancing the chance of survival of many species, including humans. Yet, there is limited understanding of the relationship between multisensory processes, environmental noise, and children's cognitive abilities. Thus, this…

  10. Using Multisensory Phonics to Foster Reading Skills of Adolescent Delinquents

    ERIC Educational Resources Information Center

    Warnick, Kristan; Caldarella, Paul

    2016-01-01

    This study examined the effectiveness of a multisensory phonics-based reading remediation program for adolescent delinquents classified as poor readers living at a residential treatment center. We used a pretest--posttest control group design with random assignment. The treatment group participated in a 30-hr multisensory phonics reading…

  11. Using Multisensory Phonics to Foster Reading Skills of Adolescent Delinquents

    ERIC Educational Resources Information Center

    Warnick, Kristan; Caldarella, Paul

    2016-01-01

    This study examined the effectiveness of a multisensory phonics-based reading remediation program for adolescent delinquents classified as poor readers living at a residential treatment center. We used a pretest--posttest control group design with random assignment. The treatment group participated in a 30-hr multisensory phonics reading…

  12. Multisensory Teaching of Basic Language Skills Activity Book

    ERIC Educational Resources Information Center

    Carreker, Suzanne; Birsh, Judith R.

    2005-01-01

    With this companion workbook to Judith Birsh's bestselling resource, "Multisensory Teaching of Basic Language Skills, Second Edition," students and practitioners alike will improve their knowledge of multisensory teaching and hone their language and instruction skills. Ideal for both preservice teacher education courses and inservice professional…

  13. Multisensory Teaching of Basic Language Skills Activity Book. Revised Edition

    ERIC Educational Resources Information Center

    Carreker, Suzanne; Birsh, Judith R.

    2011-01-01

    With the new edition of this activity book--the companion to Judith Birsh's bestselling text, "Multisensory Teaching of Basic Language Skills"--students and practitioners will get the practice they need to use multisensory teaching effectively with students who have dyslexia and other learning disabilities. Ideal for both pre-service teacher…

  14. Multisensory Processes: A Balancing Act across the Lifespan.

    PubMed

    Murray, Micah M; Lewkowicz, David J; Amedi, Amir; Wallace, Mark T

    2016-08-01

    Multisensory processes are fundamental in scaffolding perception, cognition, learning, and behavior. How and when stimuli from different sensory modalities are integrated rather than treated as separate entities is poorly understood. We review how the relative reliance on stimulus characteristics versus learned associations dynamically shapes multisensory processes. We illustrate the dynamism in multisensory function across two timescales: one long term that operates across the lifespan and one short term that operates during the learning of new multisensory relations. In addition, we highlight the importance of task contingencies. We conclude that these highly dynamic multisensory processes, based on the relative weighting of stimulus characteristics and learned associations, provide both stability and flexibility to brain functions over a wide range of temporal scales.

  15. The multisensory approach to birth and aromatherapy.

    PubMed

    Gutteridge, Kathryn

    2014-05-01

    The birth environment continues to be a subject of midwifery discourse within theory and practice. This article discusses the birth environment from the perspective of understanding aromas and aromatherapy for the benefit of women and midwives. The dynamic between the olfactory system and stimulation of normal birth processes proves to be fascinating. By examining other health models of care we can incorporate simple but powerful methods that can shape clinical outcomes. There is still more that midwives can do by using aromatherapy in the context of a multisensory approach to make birth environments synchronise with women's potential to birth in a positive way.

  16. A model of the temporal dynamics of multisensory enhancement

    PubMed Central

    Rowland, Benjamin A.; Stein, Barry E.

    2014-01-01

    The senses transduce different forms of environmental energy, and the brain synthesizes information across them to enhance responses to salient biological events. We hypothesize that the potency of multisensory integration is attributable to the convergence of independent and temporally aligned signals derived from cross-modal stimulus configurations onto multisensory neurons. The temporal profile of multisensory integration in neurons of the deep superior colliculus (SC) is consistent with this hypothesis. The responses of these neurons to visual, auditory, and combinations of visual–auditory stimuli reveal that multisensory integration takes place in real-time; that is, the input signals are integrated as soon as they arrive at the target neuron. Interactions between cross-modal signals may appear to reflect linear or nonlinear computations on a moment-by-moment basis, the aggregate of which determines the net product of multisensory integration. Modeling observations presented here suggest that the early nonlinear components of the temporal profile of multisensory integration can be explained with a simple spiking neuron model, and do not require more sophisticated assumptions about the underlying biology. A transition from nonlinear “super-additive” computation to linear, additive computation can be accomplished via scaled inhibition. The findings provide a set of design constraints for artificial implementations seeking to exploit the basic principles and potency of biological multisensory integration in contexts of sensory substitution or augmentation. PMID:24374382
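
    The published account is a continuous-time model that uses scaled inhibition to linearize the late response; the sketch below is a much simpler illustration of one ingredient such models rely on, namely that a spike threshold alone yields strongly super-additive spike counts for weak inputs and much weaker enhancement for strong inputs (inverse effectiveness). All parameter values are toy choices and the code is not the authors' implementation.

```python
import numpy as np

def lif_spike_count(drive, t_ms=500.0, dt=0.5, tau=20.0,
                    v_thresh=1.0, noise_sd=0.05, seed=0):
    """Spike count of a toy leaky integrate-and-fire unit driven by a constant
    input plus a little membrane noise. All values are illustrative."""
    rng = np.random.default_rng(seed)
    v, spikes = 0.0, 0
    for _ in range(int(t_ms / dt)):
        v += dt / tau * (drive - v) + noise_sd * np.sqrt(dt) * rng.standard_normal()
        if v >= v_thresh:
            spikes += 1
            v = 0.0
    return spikes

def enhancement(i_vis, i_aud):
    """Multisensory spike count relative to the sum of the unisensory counts."""
    n_vis = lif_spike_count(i_vis)
    n_aud = lif_spike_count(i_aud)
    n_multi = lif_spike_count(i_vis + i_aud)
    return n_multi / max(n_vis + n_aud, 1)

# Weak inputs (individually near or below threshold) combine super-additively;
# stronger inputs combine far less so, a simple account of inverse effectiveness
# arising from the spike threshold alone.
print(f"weak inputs:   {enhancement(0.6, 0.6):.1f}x the summed unisensory counts")
print(f"strong inputs: {enhancement(1.5, 1.5):.1f}x the summed unisensory counts")
```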

  17. Neonatal cortical ablation disrupts multisensory development in superior colliculus

    PubMed Central

    Jiang, Wan; Jiang, Huai; Stein, Barry E.

    2006-01-01

    The ability of cat superior colliculus (SC) neurons to synthesize information from different senses depends on influences from two areas of the cortex: the anterior ectosylvian sulcus (AES) and the rostral lateral suprasylvian sulcus (rLS). Reversibly deactivating the inputs to the SC from either of these areas in normal adults severely compromises this ability and the SC-mediated behaviors that depend on it. In the present study we found that removal of these areas in neonatal animals precluded the normal development of multisensory SC processes. At maturity there was a substantial decrease in the incidence of multisensory neurons, and those multisensory neurons that did develop were highly abnormal. Their cross-modal receptive field register was severely compromised, as was their ability to integrate cross-modal stimuli. Apparently, despite the impressive plasticity of the neonatal brain, it cannot compensate for the early loss of these cortices. Surprisingly, however, neonatal removal of either AES or rLS had comparatively minor consequences on these properties. At maturity multisensory SC neurons were quite common: they developed the characteristic spatial register among their unisensory receptive fields and exhibited normal adult-like multisensory integration. These observations suggest that during early ontogeny, when the multisensory properties of SC neurons are being crafted, AES and rLS may have the ability to compensate for the loss of one another’s cortico-collicular influences so that normal multisensory processes can develop in the SC. PMID:16267111

  18. A rational analysis of the acquisition of multisensory representations.

    PubMed

    Yildirim, Ilker; Jacobs, Robert A

    2012-03-01

    How do people learn multisensory, or amodal, representations, and what consequences do these representations have for perceptual performance? We address this question by performing a rational analysis of the problem of learning multisensory representations. This analysis makes use of a Bayesian nonparametric model that acquires latent multisensory features that optimally explain the unisensory features arising in individual sensory modalities. The model qualitatively accounts for several important aspects of multisensory perception: (a) it integrates information from multiple sensory sources in such a way that it leads to superior performances in, for example, categorization tasks; (b) its performances suggest that multisensory training leads to better learning than unisensory training, even when testing is conducted in unisensory conditions; (c) its multisensory representations are modality invariant; and (d) it predicts "missing" sensory representations in modalities when the input to those modalities is absent. Our rational analysis indicates that all of these aspects emerge as part of the optimal solution to the problem of learning to represent complex multisensory environments.

  19. The effects of visual training on multisensory temporal processing.

    PubMed

    Stevenson, Ryan A; Wilson, Magdalena M; Powers, Albert R; Wallace, Mark T

    2013-04-01

    The importance of multisensory integration for human behavior and perception is well documented, as is the impact that temporal synchrony has on driving such integration. Thus, the more temporally coincident two sensory inputs from different modalities are, the more likely they will be perceptually bound. This temporal integration process is captured by the construct of the temporal binding window - the range of temporal offsets within which an individual is able to perceptually bind inputs across sensory modalities. Recent work has shown that this window is malleable and can be narrowed via a multisensory perceptual feedback training process. In the current study, we seek to extend this by examining the malleability of the multisensory temporal binding window through changes in unisensory experience. Specifically, we measured the ability of visual perceptual feedback training to induce changes in the multisensory temporal binding window. Visual perceptual training with feedback successfully improved temporal visual processing, and more importantly, this visual training increased the temporal precision across modalities, which manifested as a narrowing of the multisensory temporal binding window. These results are the first to establish the ability of unisensory temporal training to modulate multisensory temporal processes, findings that can provide mechanistic insights into multisensory integration and which may have a host of practical applications.

  20. The effects of visual training on multisensory temporal processing

    PubMed Central

    Stevenson, Ryan A.; Wilson, Magdalena M.; Powers, Albert R.; Wallace, Mark T.

    2013-01-01

    The importance of multisensory integration for human behavior and perception is well documented, as is the impact that temporal synchrony has on driving such integration. Thus, the more temporally coincident two sensory inputs from different modalities are, the more likely they will be perceptually bound. This temporal integration process is captured by the construct of the temporal binding window - the range of temporal offsets within which an individual is able to perceptually bind inputs across sensory modalities. Recent work has shown that this window is malleable, and can be narrowed via a multisensory perceptual feedback training process. In the current study, we seek to extend this by examining the malleability of the multisensory temporal binding window through changes in unisensory experience. Specifically, we measured the ability of visual perceptual feedback training to induce changes in the multisensory temporal binding window. Visual perceptual training with feedback successfully improved temporal visual processing and more importantly, this visual training increased the temporal precision across modalities, which manifested as a narrowing of the multisensory temporal binding window. These results are the first to establish the ability of unisensory temporal training to modulate multisensory temporal processes, findings that can provide mechanistic insights into multisensory integration and which may have a host of practical applications. PMID:23307155

  1. A model of the temporal dynamics of multisensory enhancement.

    PubMed

    Rowland, Benjamin A; Stein, Barry E

    2014-04-01

    The senses transduce different forms of environmental energy, and the brain synthesizes information across them to enhance responses to salient biological events. We hypothesize that the potency of multisensory integration is attributable to the convergence of independent and temporally aligned signals derived from cross-modal stimulus configurations onto multisensory neurons. The temporal profile of multisensory integration in neurons of the deep superior colliculus (SC) is consistent with this hypothesis. The responses of these neurons to visual, auditory, and combinations of visual-auditory stimuli reveal that multisensory integration takes place in real-time; that is, the input signals are integrated as soon as they arrive at the target neuron. Interactions between cross-modal signals may appear to reflect linear or nonlinear computations on a moment-by-moment basis, the aggregate of which determines the net product of multisensory integration. Modeling observations presented here suggest that the early nonlinear components of the temporal profile of multisensory integration can be explained with a simple spiking neuron model, and do not require more sophisticated assumptions about the underlying biology. A transition from nonlinear "super-additive" computation to linear, additive computation can be accomplished via scaled inhibition. The findings provide a set of design constraints for artificial implementations seeking to exploit the basic principles and potency of biological multisensory integration in contexts of sensory substitution or augmentation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. The multisensory function of the human primary visual cortex.

    PubMed

    Murray, Micah M; Thelen, Antonia; Thut, Gregor; Romei, Vincenzo; Martuzzi, Roberto; Matusz, Pawel J

    2016-03-01

    It has been nearly 10 years since Ghazanfar and Schroeder (2006) proposed that the neocortex is essentially multisensory in nature. However, it is only recently that sufficient and hard evidence that supports this proposal has accrued. We review evidence that activity within the human primary visual cortex plays an active role in multisensory processes and directly impacts behavioural outcome. This evidence emerges from a full palette of human brain imaging and brain mapping methods with which multisensory processes are quantitatively assessed by taking advantage of particular strengths of each technique as well as advances in signal analyses. Several general conclusions about multisensory processes in primary visual cortex of humans are supported relatively solidly. First, haemodynamic methods (fMRI/PET) show that there is both convergence and integration occurring within primary visual cortex. Second, primary visual cortex is involved in multisensory processes during early post-stimulus stages (as revealed by EEG/ERP/ERFs as well as TMS). Third, multisensory effects in primary visual cortex directly impact behaviour and perception, as revealed by correlational (EEG/ERPs/ERFs) as well as more causal measures (TMS/tACS). While the provocative claim of Ghazanfar and Schroeder (2006) that the whole of neocortex is multisensory in function has yet to be demonstrated, this can now be considered established in the case of the human primary visual cortex.

  3. Multisensory perceptual learning and sensory substitution.

    PubMed

    Proulx, Michael J; Brown, David J; Pasqualotto, Achille; Meijer, Peter

    2014-04-01

    One of the most exciting recent findings in neuroscience has been the capacity for neural plasticity in adult humans and animals. Studies of perceptual learning have provided key insights into the mechanisms of neural plasticity and the changes in functional neuroanatomy that it affords. Key questions in this field of research concern how practice of a task leads to specific or general improvement. Although much of this work has been carried out with a focus on a single sensory modality, primarily visual, there is increasing interest in multisensory perceptual learning. Here we will examine how advances in perceptual learning research both inform and can be informed by the development and advancement of sensory substitution devices for blind persons. To allow 'sight' to occur in the absence of visual input through the eyes, visual information can be transformed by a sensory substitution device into a representation that can be processed as sound or touch, and thus give one the potential to 'see' through the ears or tongue. Investigations of auditory, visual and multisensory perceptual learning can have key benefits for the advancement of sensory substitution, and the study of sensory deprivation and sensory substitution likewise will further the understanding of perceptual learning in general and the reverse hierarchy theory in particular. It also has significant importance for the developing understanding of the brain in metamodal terms, where functional brain areas might be best defined by the computations they carry out rather than by their sensory-specific processing role. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Multisensory architectures for action-oriented perception

    NASA Astrophysics Data System (ADS)

    Alba, L.; Arena, P.; De Fiore, S.; Listán, J.; Patané, L.; Salem, A.; Scordino, G.; Webb, B.

    2007-05-01

    In order to solve the navigation problem of a mobile robot in an unstructured environment a versatile sensory system and efficient locomotion control algorithms are necessary. In this paper an innovative sensory system for action-oriented perception applied to a legged robot is presented. An important problem we address is how to utilize a large variety and number of sensors, while having systems that can operate in real time. Our solution is to use sensory systems that incorporate analog and parallel processing, inspired by biological systems, to reduce the required data exchange with the motor control layer. In particular, as concerns the visual system, we use the Eye-RIS v1.1 board made by Anafocus, which is based on a fully parallel mixed-signal array sensor-processor chip. The hearing sensor is inspired by the cricket hearing system and allows efficient localization of a specific sound source with a very simple analog circuit. Our robot utilizes additional sensors for touch, posture, load, distance, and heading, and thus requires customized and parallel processing for concurrent acquisition. Therefore a Field Programmable Gate Array (FPGA) based hardware was used to manage the multi-sensory acquisition and processing. This choice was made because FPGAs permit the implementation of customized digital logic blocks that can operate in parallel allowing the sensors to be driven simultaneously. With this approach the multi-sensory architecture proposed can achieve real time capabilities.

  5. The question of simultaneity in multisensory integration

    NASA Astrophysics Data System (ADS)

    Leone, Lynnette; McCourt, Mark E.

    2012-03-01

    Early reports of audiovisual (AV) multisensory integration (MI) indicated that unisensory stimuli must evoke simultaneous physiological responses to produce decreases in reaction time (RT) such that for unisensory stimuli with unequal RTs the stimulus eliciting the faster RT had to be delayed relative to the stimulus eliciting the slower RT. The "temporal rule" states that MI depends on the temporal proximity of unisensory stimuli, the neural responses to which must fall within a window of integration. Ecological validity demands that MI should occur only for simultaneous events (which may give rise to non-simultaneous neural activations). However, spurious neural response simultaneities which are unrelated to singular environmental multisensory occurrences must somehow be rejected. Using an RT/race model paradigm we measured AV MI as a function of stimulus onset asynchrony (SOA: +/-200 ms, 50 ms intervals) under fully dark adapted conditions for visual (V) stimuli that were either weak (scotopic 525 nm flashes; 511 ms mean RT) or strong (photopic 630 nm flashes; 356 ms mean RT). Auditory (A) stimulus (1000 Hz pure tone) intensity was constant. Despite the 155 ms slower mean RT to the scotopic versus photopic stimulus, facilitative AV MI in both conditions nevertheless occurred exclusively at an SOA of 0 ms. Thus, facilitative MI demands both physical and physiological simultaneity. We consider the mechanisms by which the nervous system may take account of variations in response latency arising from changes in stimulus intensity in order to selectively integrate only those physiological simultaneities that arise from physical simultaneities.

  6. Multisensory temporal integration: Task and stimulus dependencies

    PubMed Central

    Stevenson, Ryan A.; Wallace, Mark T.

    2013-01-01

    The ability of human sensory systems to integrate information across the different modalities provides a wide range of behavioral and perceptual benefits. This integration process is dependent upon the temporal relationship of the different sensory signals, with stimuli occurring close together in time typically resulting in the largest behavior changes. The range of temporal intervals over which such benefits are seen is typically referred to as the temporal binding window (TBW). Given the importance of temporal factors in multisensory integration under both normal and atypical circumstances such as autism and dyslexia, the TBW has been measured with a variety of experimental protocols that differ according to criterion, task, and stimulus type, making comparisons across experiments difficult. In the current study we attempt to elucidate the role that these various factors play in the measurement of this important construct. The results show a strong effect of stimulus type, with the TBW assessed with speech stimuli being both larger and more symmetrical than that seen using simple and complex non-speech stimuli. These effects are robust across task and statistical criteria, and are highly consistent within individuals, suggesting substantial overlap in the neural and cognitive operations that govern multisensory temporal processes. PMID:23604624

  7. Multisensory temporal integration: task and stimulus dependencies.

    PubMed

    Stevenson, Ryan A; Wallace, Mark T

    2013-06-01

    The ability of human sensory systems to integrate information across the different modalities provides a wide range of behavioral and perceptual benefits. This integration process is dependent upon the temporal relationship of the different sensory signals, with stimuli occurring close together in time typically resulting in the largest behavior changes. The range of temporal intervals over which such benefits are seen is typically referred to as the temporal binding window (TBW). Given the importance of temporal factors in multisensory integration under both normal and atypical circumstances such as autism and dyslexia, the TBW has been measured with a variety of experimental protocols that differ according to criterion, task, and stimulus type, making comparisons across experiments difficult. In the current study, we attempt to elucidate the role that these various factors play in the measurement of this important construct. The results show a strong effect of stimulus type, with the TBW assessed with speech stimuli being both larger and more symmetrical than that seen using simple and complex non-speech stimuli. These effects are robust across task and statistical criteria and are highly consistent within individuals, suggesting substantial overlap in the neural and cognitive operations that govern multisensory temporal processes.
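
    A temporal binding window is typically summarized by fitting a smooth curve to the proportion of "simultaneous" (or "bound") reports across SOAs and reading off the curve's width at some criterion. The sketch below does this with a Gaussian fit and a half-maximum criterion on made-up data; the criterion, fitting function, and data are illustrative assumptions, precisely because the paper's point is that such conventions differ across studies.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(soa, amp, mu, sigma):
    """Proportion of 'simultaneous' reports as a function of SOA (ms)."""
    return amp * np.exp(-(soa - mu) ** 2 / (2 * sigma ** 2))

def temporal_binding_window(soas, p_simultaneous, criterion=0.5):
    """Fit a Gaussian to simultaneity-judgment rates and return the width (ms)
    over which the fitted curve exceeds `criterion` of its peak, plus its centre.
    This is one common convention among several; purely illustrative here."""
    p0 = [max(p_simultaneous), 0.0, 100.0]
    (amp, mu, sigma), _ = curve_fit(gaussian, soas, p_simultaneous, p0=p0)
    width = 2.0 * abs(sigma) * np.sqrt(2.0 * np.log(1.0 / criterion))
    return width, mu

# Illustrative data: auditory-leading (negative) to visual-leading (positive) SOAs.
soas = np.array([-400, -300, -200, -100, 0, 100, 200, 300, 400], float)
p_sj = np.array([0.05, 0.15, 0.45, 0.80, 0.95, 0.85, 0.60, 0.25, 0.10])
width, centre = temporal_binding_window(soas, p_sj)
print(f"TBW ~ {width:.0f} ms wide, centred at {centre:.0f} ms (visual lead)")
```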

  8. The multisensory brain and its ability to learn music.

    PubMed

    Zimmerman, Emily; Lahav, Amir

    2012-04-01

    Playing a musical instrument requires a complex skill set that depends on the brain's ability to quickly integrate information from multiple senses. It has been well documented that intensive musical training alters brain structure and function within and across multisensory brain regions, supporting the experience-dependent plasticity model. Here, we argue that this experience-dependent plasticity occurs because of the multisensory nature of the brain and may be an important contributing factor to musical learning. This review highlights key multisensory regions within the brain and discusses their role in the context of music learning and rehabilitation.

  9. On the relative contributions of multisensory integration and crossmodal exogenous spatial attention to multisensory response enhancement.

    PubMed

    Van der Stoep, N; Spence, C; Nijboer, T C W; Van der Stigchel, S

    2015-11-01

    Two processes that can give rise to multisensory response enhancement (MRE) are multisensory integration (MSI) and crossmodal exogenous spatial attention. It is, however, currently unclear what the relative contribution of each of these is to MRE. We investigated this issue using two tasks that are generally assumed to measure MSI (a redundant target effect task) and crossmodal exogenous spatial attention (a spatial cueing task). One block of trials consisted of unimodal auditory and visual targets designed to provide a unimodal baseline. In two other blocks of trials, the participants were presented with spatially and temporally aligned and misaligned audiovisual (AV) targets (0, 50, 100, and 200 ms SOA). In the integration block, the participants were instructed to respond to the onset of the first target stimulus that they detected (A or V). The instruction for the cueing block was to respond only to the onset of the visual targets. The targets could appear at one of three locations: left, center, and right. The participants were instructed to respond only to lateral targets. The results indicated that MRE was caused by MSI at 0 ms SOA. At 50 ms SOA, both crossmodal exogenous spatial attention and MSI contributed to the observed MRE, whereas the MRE observed at the 100 and 200 ms SOAs was attributable to crossmodal exogenous spatial attention, alerting, and temporal preparation. These results therefore suggest that there may be a temporal window in which both MSI and exogenous crossmodal spatial attention can contribute to multisensory response enhancement.

  10. Auditory and multisensory responses in the tectofugal pathway of the barn owl.

    PubMed

    Reches, Amit; Gutfreund, Yoram

    2009-07-29

    A common visual pathway in all amniotes is the tectofugal pathway connecting the optic tectum with the forebrain. The tectofugal pathway has been suggested to be involved in tasks such as orienting and attention, tasks that may benefit from integrating information across senses. Nevertheless, previous research has characterized the tectofugal pathway as strictly visual. Here we recorded from two stations along the tectofugal pathway of the barn owl: the thalamic nucleus rotundus (nRt) and the forebrain entopallium (E). We report that neurons in E and nRt respond to auditory stimuli as well as to visual stimuli. Visual tuning to the horizontal position of the stimulus and auditory tuning to the corresponding spatial cue (interaural time difference) were generally broad, covering a large portion of the contralateral space. Responses to spatiotemporally coinciding multisensory stimuli were mostly enhanced above the responses to the single modality stimuli, whereas spatially misaligned stimuli were not. Results from inactivation experiments suggest that the auditory responses in E are of tectal origin. These findings support the notion that the tectofugal pathway is involved in multisensory processing. In addition, the findings suggest that the ascending auditory information to the forebrain is not as bottlenecked through the auditory thalamus as previously thought.

  11. Multisensory enhancement of electromotor responses to a single moving object.

    PubMed

    Pluta, Scott R; Kawasaki, Masashi

    2008-09-01

    Weakly electric fish possess three cutaneous sensory organs structured in arrays with overlapping receptive fields. Theoretically, these tuberous electrosensory, ampullary electrosensory and mechanosensory lateral line receptors receive spatiotemporally congruent stimulation in the presence of a moving object. The current study is the first to quantify the magnitude of multisensory enhancement across these mechanosensory and electrosensory systems during moving-object recognition. We used the novelty response of a pulse-type weakly electric fish to quantitatively compare multisensory responses to their component unisensory responses. Principally, we discovered that multisensory novelty responses are significantly larger than their arithmetically summed component unisensory responses. Additionally, multimodal stimulation yielded a significant increase in novelty response amplitude, probability and the rate of a high-frequency burst, known as a 'scallop'. Supralinear multisensory enhancement of the novelty response may signify an augmentation of perception driven by the ecological significance of multimodal stimuli. Scalloping may function as a sensory scan aimed at rapidly facilitating the electrolocation of novel stimuli.
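
    The comparison described above, a multisensory response against the arithmetic sum of its unisensory components, can be expressed as a simple additivity index. The sketch below is only a generic illustration of that comparison, with made-up response magnitudes; it is not the authors' analysis.

```python
def additivity_index(multi, uni_a, uni_b):
    """(multisensory - sum of unisensory) / sum of unisensory.
    Values > 0 indicate supralinear (superadditive) enhancement of the kind
    reported for the novelty responses above; values < 0 indicate subadditivity."""
    return (multi - (uni_a + uni_b)) / (uni_a + uni_b)

# Hypothetical response magnitudes (arbitrary units)
print(additivity_index(multi=14.0, uni_a=5.0, uni_b=6.0))   # ~0.27, superadditive
```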

  12. The efficacy of single-trial multisensory memories.

    PubMed

    Thelen, Antonia; Murray, Micah M

    2013-01-01

    This review article summarizes evidence that multisensory experiences at one point in time have long-lasting effects on subsequent unisensory visual and auditory object recognition. The efficacy of single-trial exposure to task-irrelevant multisensory events is its ability to modulate memory performance and brain activity to unisensory components of these events presented later in time. Object recognition (either visual or auditory) is enhanced if the initial multisensory experience had been semantically congruent and can be impaired if this multisensory pairing was either semantically incongruent or entailed meaningless information in the task-irrelevant modality, when compared to objects encountered exclusively in a unisensory context. Processes active during encoding cannot straightforwardly explain these effects; performance on all initial presentations was indistinguishable despite leading to opposing effects with stimulus repetitions. Brain responses to unisensory stimulus repetitions differ during early processing stages (∼100 ms post-stimulus onset) according to whether or not they had been initially paired in a multisensory context. Moreover, the network exhibiting differential responses varies according to whether memory performance is enhanced or impaired. The collective findings we review indicate that multisensory associations formed via single-trial learning exert influences on later unisensory processing to promote distinct object representations that manifest as differentiable brain networks whose activity is correlated with memory performance. These influences occur incidentally, despite many intervening stimuli, and are distinguishable from the encoding/learning processes during the formation of the multisensory associations. These are consequences of multisensory interactions that persist over time to impact memory retrieval and object discrimination.

  13. Looming signals reveal synergistic principles of multisensory integration.

    PubMed

    Cappe, Céline; Thelen, Antonia; Romei, Vincenzo; Thut, Gregor; Murray, Micah M

    2012-01-25

    Multisensory interactions are a fundamental feature of brain organization. Principles governing multisensory processing have been established by varying stimulus location, timing and efficacy independently. Determining whether and how such principles operate when stimuli vary dynamically in their perceived distance (as when looming/receding) provides an assay for synergy among the above principles and also means for linking multisensory interactions between rudimentary stimuli with higher-order signals used for communication and motor planning. Human participants indicated movement of looming or receding versus static stimuli that were visual, auditory, or multisensory combinations while 160-channel EEG was recorded. Multivariate EEG analyses and distributed source estimations were performed. Nonlinear interactions between looming signals were observed at early poststimulus latencies (∼75 ms) in analyses of voltage waveforms, global field power, and source estimations. These looming-specific interactions positively correlated with reaction time facilitation, providing direct links between neural and performance metrics of multisensory integration. Statistical analyses of source estimations identified looming-specific interactions within the right claustrum/insula extending inferiorly into the amygdala and also within the bilateral cuneus extending into the inferior and lateral occipital cortices. Multisensory effects common to all conditions, regardless of perceived distance and congruity, followed (∼115 ms) and manifested as faster transition between temporally stable brain networks (vs summed responses to unisensory conditions). We demonstrate the early-latency, synergistic interplay between existing principles of multisensory interactions. Such findings change the manner in which to model multisensory interactions at neural and behavioral/perceptual levels. We also provide neurophysiologic backing for the notion that looming signals receive preferential

  14. Initiating the development of multisensory integration by manipulating sensory experience.

    PubMed

    Yu, Liping; Rowland, Benjamin A; Stein, Barry E

    2010-04-07

    The multisensory integration capabilities of superior colliculus neurons emerge gradually during early postnatal life as a consequence of experience with cross-modal stimuli. Without such experience neurons become responsive to multiple sensory modalities but are unable to integrate their inputs. The present study demonstrates that neurons retain sensitivity to cross-modal experience well past the normal developmental period for acquiring multisensory integration capabilities. Experience surprisingly late in life was found to rapidly initiate the development of multisensory integration, even more rapidly than expected based on its normal developmental time course. Furthermore, the requisite experience was acquired by the anesthetized brain and in the absence of any of the stimulus-response contingencies generally associated with learning. The key experiential factor was repeated exposure to the relevant stimuli, and this required that the multiple receptive fields of a multisensory neuron encompassed the cross-modal exposure site. Simple exposure to the individual components of a cross-modal stimulus was ineffective in this regard. Furthermore, once a neuron acquired multisensory integration capabilities at the exposure site, it generalized this experience to other locations, albeit with lowered effectiveness. These observations suggest that the prolonged period during which multisensory integration normally appears is due to developmental factors in neural circuitry in addition to those required for incorporating the statistics of cross-modal events; that neurons learn a multisensory principle based on the specifics of experience and can then apply it to other stimulus conditions; and that the incorporation of this multisensory information does not depend on an alert brain.

  15. Initiating the development of multisensory integration by manipulating sensory experience

    PubMed Central

    Yu, Liping; Rowland, Benjamin A.; Stein, Barry E.

    2010-01-01

    The multisensory integration capabilities of superior colliculus (SC) neurons emerge gradually during early postnatal life as a consequence of experience with cross-modal stimuli. Without such experience neurons become responsive to multiple sensory modalities but are unable to integrate their inputs. The present study demonstrates that neurons retain sensitivity to cross-modal experience well past the normal developmental period for acquiring multisensory integration capabilities. Experience surprisingly late in life was found to rapidly initiate the development of multisensory integration, even more rapidly than expected based on its normal developmental time course. Furthermore, the requisite experience was acquired by the anesthetized brain and in the absence of any of the stimulus-response contingencies generally associated with learning. The key experiential factor was repeated exposure to the relevant stimuli, and this required that the multiple receptive fields of a multisensory neuron encompassed the cross-modal exposure site. Simple exposure to the individual components of a cross-modal stimulus was ineffective in this regard. Furthermore, once a neuron acquired multisensory integration capabilities at the exposure site, it generalized this experience to other locations, albeit with lowered effectiveness. These observations suggest that the prolonged period during which multisensory integration normally appears is due to developmental factors in neural circuitry in addition to those required for incorporating the statistics of cross-modal events; that neurons learn a multisensory principle based on the specifics of experience and can then apply it to other stimulus conditions; and that the incorporation of this multisensory information does not depend on an alert brain. PMID:20371810

  16. Multisensory dysfunction accompanies crossmodal plasticity following adult hearing impairment.

    PubMed

    Meredith, M A; Keniston, L P; Allman, B L

    2012-07-12

    Until now, cortical crossmodal plasticity has largely been regarded as the effect of early and complete sensory loss. Recently, massive crossmodal cortical reorganization was demonstrated to result from profound hearing loss in adult ferrets (Allman et al., 2009a). Moderate adult hearing loss, on the other hand, induced not just crossmodal reorganization, but also merged new crossmodal inputs with residual auditory function to generate multisensory neurons. Because multisensory convergence can lead to dramatic levels of response integration when stimuli from more than one modality are present (and thereby potentially interfere with residual auditory processing), the present investigation sought to evaluate the multisensory properties of auditory cortical neurons in partially deafened adult ferrets. When compared with hearing controls, partially-deaf animals revealed elevated levels of spontaneous activity and a dramatic increase (∼2 times) in the proportion of multisensory cortical neurons, few of which showed multisensory integration. Moreover, a large proportion (68%) of neurons with somatosensory and/or visual inputs was vigorously active in core auditory cortex in the absence of auditory stimulation. Collectively, these results not only demonstrate multisensory dysfunction in core auditory cortical neurons from hearing impaired adults but also reveal a potential cortical substrate for maladaptive perceptual effects such as tinnitus.

  17. Voluntary initiation of movement: multifunctional integration of subjective agency.

    PubMed

    Grüneberg, Patrick; Kadone, Hideki; Suzuki, Kenji

    2015-01-01

    This paper investigates subjective agency (SA) as a special type of efficacious action consciousness. Our central claims are, firstly, that SA is a conscious act of voluntarily initiating bodily motion. Secondly, we argue that SA is a case of multifunctional integration of behavioral functions being analogous to multisensory integration of sensory modalities. This is based on new perspectives on the initiation of action opened up by recent advancements in robot assisted neuro-rehabilitation which depends on the active participation of the patient and yields experimental evidence that there is SA in terms of a conscious act of voluntarily initiating bodily motion (phenomenal performance). Conventionally, action consciousness has been considered as a sense of agency (SoA). According to this view, the conscious subject merely echoes motor performance and does not cause bodily motion. Depending on sensory input, SoA is implemented by means of unifunctional integration (binding) and inevitably results in non-efficacious action consciousness. In contrast, SA comes as a phenomenal performance which causes motion and builds on multifunctional integration. Therefore, the common conception of the brain should be shifted toward multifunctional integration in order to allow for efficacious action consciousness. For this purpose, we suggest the heterarchic principle of asymmetric reciprocity and neural operators underlying SA. The general idea is that multifunctional integration allows conscious acts to be simultaneously implemented with motor behavior so that the resulting behavior (SA) comes as efficacious action consciousness. Regarding the neural implementation, multifunctional integration rather relies on operators than on modular functions. A robotic case study and possible experimental setups with testable hypotheses building on SA are presented.

  18. Voluntary initiation of movement: multifunctional integration of subjective agency

    PubMed Central

    Grüneberg, Patrick; Kadone, Hideki; Suzuki, Kenji

    2015-01-01

    This paper investigates subjective agency (SA) as a special type of efficacious action consciousness. Our central claims are, firstly, that SA is a conscious act of voluntarily initiating bodily motion. Secondly, we argue that SA is a case of multifunctional integration of behavioral functions being analogous to multisensory integration of sensory modalities. This is based on new perspectives on the initiation of action opened up by recent advancements in robot assisted neuro-rehabilitation which depends on the active participation of the patient and yields experimental evidence that there is SA in terms of a conscious act of voluntarily initiating bodily motion (phenomenal performance). Conventionally, action consciousness has been considered as a sense of agency (SoA). According to this view, the conscious subject merely echoes motor performance and does not cause bodily motion. Depending on sensory input, SoA is implemented by means of unifunctional integration (binding) and inevitably results in non-efficacious action consciousness. In contrast, SA comes as a phenomenal performance which causes motion and builds on multifunctional integration. Therefore, the common conception of the brain should be shifted toward multifunctional integration in order to allow for efficacious action consciousness. For this purpose, we suggest the heterarchic principle of asymmetric reciprocity and neural operators underlying SA. The general idea is that multifunctional integration allows conscious acts to be simultaneously implemented with motor behavior so that the resulting behavior (SA) comes as efficacious action consciousness. Regarding the neural implementation, multifunctional integration rather relies on operators than on modular functions. A robotic case study and possible experimental setups with testable hypotheses building on SA are presented. PMID:26052308

  19. Modeling of multisensory convergence with a network of spiking neurons: a reverse engineering approach.

    PubMed

    Lim, Hun Ki; Keniston, Leslie P; Cios, Krzysztof J

    2011-07-01

    Multisensory processing in the brain underlies a wide variety of perceptual phenomena, but little is known about the underlying mechanisms of how multisensory neurons are formed. This lack of knowledge is due to the difficulty of manipulating and testing the parameters of multisensory convergence, the first and definitive step in the multisensory process, in biological experiments. Therefore, by using a computational model of multisensory convergence, this study seeks to provide insight into the mechanisms of multisensory convergence. To reverse-engineer multisensory convergence, we used a biologically realistic neuron model and a biology-inspired plasticity rule, but did not make any a priori assumptions about multisensory properties of neurons in the network. The network consisted of two separate projection areas that converged upon neurons in a third area, and stimulation involved activation of one of the projection areas (or the other) or their combination. Experiments consisted of two parts: network training and multisensory simulation. Analyses were performed, first, to find multisensory properties in the simulated networks; second, to reveal properties of the network using a graph-theoretical approach; and third, to generate hypotheses related to multisensory convergence. The results showed that the generation of multisensory neurons was related to the topological properties of the network; in particular, the strength of connections after training was found to play an important role in forming, and thus distinguishing, multisensory neuron types.
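
    The authors' network used a biologically realistic neuron model and a plasticity rule that are not reproduced here. As a much-reduced stand-in, the sketch below shows how convergence of two Poisson input streams onto a single leaky integrate-and-fire unit can already yield a combined response that exceeds the sum of the two unisensory responses, because only near-coincident X and Y events reach threshold. All parameters are arbitrary.

```python
import numpy as np

def lif_convergence(rate_x=40.0, rate_y=40.0, w=0.6, t_max=1.0,
                    dt=1e-3, tau=0.005, v_th=1.0, seed=0):
    """Minimal leaky integrate-and-fire unit receiving two converging Poisson
    spike trains ('X' and 'Y' projection areas). Single inputs rarely cross
    threshold; coincidences of X and Y events do. Returns the output spike count."""
    rng = np.random.default_rng(seed)
    v, spikes = 0.0, 0
    for _ in range(int(t_max / dt)):
        drive = w * (rng.poisson(rate_x * dt) + rng.poisson(rate_y * dt))
        v = v * np.exp(-dt / tau) + drive   # leaky integration of both inputs
        if v >= v_th:
            spikes += 1
            v = 0.0                         # reset after an output spike
    return spikes

print("X alone :", lif_convergence(rate_y=0.0))
print("Y alone :", lif_convergence(rate_x=0.0))
print("X and Y :", lif_convergence())       # typically more than the sum above
```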

  20. Spatial determinants of multisensory integration in cat superior colliculus neurons.

    PubMed

    Meredith, M A; Stein, B E

    1996-05-01

    1. Although a representation of multisensory space is contained in the superior colliculus, little is known about the spatial requirements of multisensory stimuli that influence the activity of neurons here. Critical to this problem is an assessment of the registry of the different receptive fields within individual multisensory neurons. The present study was initiated to determine how closely the receptive fields of individual multisensory neurons are aligned, the physiological role of that alignment, and the possible functional consequences of inducing receptive-field misalignment. 2. Individual multisensory neurons in the superior colliculus of anesthetized, paralyzed cats were studied with the use of standard extracellular recording techniques. The receptive fields of multisensory neurons were large, as reported previously, but exhibited a surprisingly high degree of spatial coincidence. The average proportion of receptive-field overlap was 86% for the population of visual-auditory neurons sampled. 3. Because of this high degree of intersensory receptive-field correspondence, combined-modality stimuli that were coincident in space tended to fall within the excitatory regions of the receptive fields involved. The result was a significantly enhanced neuronal response in 88% of the multisensory neurons studied. If stimuli were spatially disparate, so that one fell outside its receptive field, either a decreased response occurred (56%), or no intersensory effect was apparent (44%). 4. The normal alignment of the different receptive fields of a multisensory neuron could be disrupted by passively displacing the eyes, pinnae, or limbs/body. In no case was a shift in location or size observed in a neuron's other receptive field(s) to compensate for this displacement. The physiological result of receptive-field misalignment was predictable and based on the location of the stimuli relative to the new positions of their respective receptive fields. Now, for example, one

  1. Multifunctionality in molecular magnetism.

    PubMed

    Pinkowicz, Dawid; Czarnecki, Bernard; Reczyński, Mateusz; Arczyński, Mirosław

    2015-01-01

    Molecular magnetism draws from the fundamental ideas of structural chemistry and combines them with experimental physics, resulting in one of the highest-profile current topics, namely molecular materials that exhibit multifunctionality. Recent advances in the design of new generations of multifunctional molecular magnets that retain the functions of the building blocks and exhibit non-trivial magnetic properties at higher temperatures provide promising evidence that they may be useful for the future construction of nanoscale devices. This article is not a complete review but is rather an introduction to the fascinating world of multifunctional solids with magnetism as the leitmotif. We provide a subjective selection and discussion of the most inspiring examples of multifunctional molecular magnets: magnetic sponges, guest-responsive magnets, molecular magnets with ionic conductivity, photomagnets and non-centrosymmetric and chiral magnets.

  2. Multifunctional thin film surface

    DOEpatents

    Brozik, Susan M.; Harper, Jason C.; Polsky, Ronen; Wheeler, David R.; Arango, Dulce C.; Dirk, Shawn M.

    2015-10-13

    A thin film with multiple binding functionality can be prepared on an electrode surface via consecutive electroreduction of two or more aryl-onium salts with different functional groups. This versatile and simple method for forming multifunctional surfaces provides an effective means for immobilization of diverse molecules at close proximities. The multifunctional thin film has applications in bioelectronics, molecular electronics, clinical diagnostics, and chemical and biological sensing.

  3. Multifunctional cellulase and hemicellulase

    DOEpatents

    Fox, Brian G.; Takasuka, Taichi; Bianchetti, Christopher M.

    2015-09-29

    A multifunctional polypeptide capable of hydrolyzing cellulosic materials, xylan, and mannan is disclosed. The polypeptide includes the catalytic core (cc) of Clostridium thermocellum Cthe_0797 (CelE), the cellulose-specific carbohydrate-binding module CBM3 of the cellulosome anchoring protein cohesion region (CipA) of Clostridium thermocellum (CBM3a), and a linker region interposed between the catalytic core and the cellulose-specific carbohydrate binding module. Methods of using the multifunctional polypeptide are also disclosed.

  4. Multisensory integration during short-term music reading training enhances both uni- and multisensory cortical processing.

    PubMed

    Paraskevopoulos, Evangelos; Kuchenbuch, Anja; Herholz, Sibylle C; Pantev, Christo

    2014-10-01

    The human ability to integrate the input of several sensory systems is essential for building a meaningful interpretation out of the complexity of the environment. Training studies have shown that the involvement of multiple senses during training enhances neuroplasticity, but it is not clear to what extent integration of the senses during training is required for the observed effects. This study intended to elucidate the differential contributions of uni- and multisensory elements of music reading training in the resulting plasticity of abstract audiovisual incongruency identification. We used magnetoencephalography to measure the pre- and posttraining cortical responses of two randomly assigned groups of participants that followed either an audiovisual music reading training that required multisensory integration (AV-Int group) or a unisensory training that had separate auditory and visual elements (AV-Sep group). Results revealed a network of frontal generators for the abstract audiovisual incongruency response, confirming previous findings, and indicated the central role of anterior prefrontal cortex in this process. Differential neuroplastic effects of the two types of training in frontal and temporal regions point to the crucial role of multisensory integration occurring during training. Moreover, a comparison of the posttraining cortical responses of both groups to a group of musicians that were tested using the same paradigm revealed that long-term music training leads to significantly greater responses than the short-term training of the AV-Int group in anterior prefrontal regions as well as to significantly greater responses than both short-term training protocols in the left superior temporal gyrus (STG).

  5. The effect of multisensory cues on attention in aging.

    PubMed

    Mahoney, Jeannette R; Verghese, Joe; Dumas, Kristina; Wang, Cuiling; Holtzer, Roee

    2012-09-07

    The attention network test (ANT) assesses the effect of alerting and orienting cues on a visual flanker task measuring executive attention. Previous findings revealed that older adults demonstrate greater reaction time (RT) benefits when provided with visual orienting cues that offer both spatial and temporal information about an ensuing target. Given the overlap of neural substrates and networks involved in multisensory processing and cueing (i.e., alerting and orienting), an investigation of multisensory cueing effects on RT was warranted. The current study was designed to determine whether participants, both old and young, benefited from receiving multisensory alerting and orienting cues. Eighteen young (M=19.17 years; 45% female) and eighteen old (M=76.44 years; 61% female) individuals who were determined to be non-demented and without any medical or psychiatric conditions that would affect their performance were included. Results revealed main effects for the executive attention and orienting networks, but not for the alerting network. In terms of orienting, both old and young adults demonstrated significant orienting effects for auditory-somatosensory (AS), auditory-visual (AV), and visual-somatosensory (VS) cues. RT benefits of multisensory compared to unisensory orienting effects differed by cue type and age group; younger adults demonstrated greater RT benefits for AS orienting cues whereas older adults demonstrated greater RT benefits for AV orienting cues. Both groups, however, demonstrated significant RT benefits for multisensory VS orienting cues. These findings provide evidence for the facilitative effect of multisensory orienting cues, and not multisensory alerting cues, in old and young adults.
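
    For readers unfamiliar with the ANT, the network scores are simple mean-RT contrasts between cue and flanker conditions. The sketch below uses the conventional unisensory ANT contrasts; the multisensory variant described above adds bimodal cue conditions, which would enter the same subtractive comparisons. Condition names and RTs are hypothetical.

```python
import numpy as np

def ant_effects(rt):
    """Standard attention-network contrasts from mean RTs (ms), where rt maps
    condition name -> list of correct-trial RTs."""
    m = {k: float(np.mean(v)) for k, v in rt.items()}
    return {
        "alerting":  m["no_cue"] - m["double_cue"],        # benefit of temporal warning
        "orienting": m["center_cue"] - m["spatial_cue"],   # benefit of spatial information
        "executive": m["incongruent"] - m["congruent"],    # flanker-conflict cost
    }

# Hypothetical data
rts = {
    "no_cue": [560, 580, 570], "double_cue": [530, 545, 540],
    "center_cue": [540, 550, 555], "spatial_cue": [500, 510, 505],
    "congruent": [520, 515, 525], "incongruent": [610, 600, 620],
}
print(ant_effects(rts))
```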

  6. Multisensory integration, aging, and the sound-induced flash illusion.

    PubMed

    DeLoss, Denton J; Pierce, Russell S; Andersen, George J

    2013-09-01

    The present study examined age-related differences in multisensory integration and the role of attention in age-related differences in multisensory integration. The sound-induced flash illusion--the misperception of the number of visual flashes due to the simultaneous presentation of a different number of auditory beeps--was used to examine the strength of multisensory integration in older and younger observers. The effects of integration were examined when discriminating 1-3 flashes, 1-3 beeps, or 1-3 flashes presented with 1-3 beeps. Stimulus conditions were blocked according to these conditions with baseline (unisensory) performance assessed during the multisensory block. Older participants demonstrated greater multisensory integration--a greater influence of the beeps when judging the number of visual flashes--than younger observers. In a second experiment, the role of attention was assessed using a go/no-go paradigm. The results of Experiment 2 replicated those of Experiment 1. In addition, the strength of the illusion was modulated by the sensory domain of the go/no-go task, though this did not differ by age group. In the visual go/no-go task we found a decrease in the illusion, yet in the auditory go/no-go task we found an increase in the illusion. These results demonstrate that older individuals exhibit increased multisensory integration compared with younger individuals. Attention was also found to modulate the strength of the sound-induced flash illusion. However, the results also suggest that attention was not likely to be a factor in the age-related differences in multisensory integration.

  7. Bayesian-based integration of multisensory naturalistic perithreshold stimuli.

    PubMed

    Regenbogen, Christina; Johansson, Emilia; Andersson, Patrik; Olsson, Mats J; Lundström, Johan N

    2016-07-29

    Most studies exploring multisensory integration have used clearly perceivable stimuli. According to the principle of inverse effectiveness, the added neural and behavioral benefit of integrating clear stimuli is reduced in comparison to stimuli with degraded and less salient unisensory information. Traditionally, speed and accuracy measures have been analyzed separately with few studies merging these to gain an understanding of speed-accuracy trade-offs in multisensory integration. In two separate experiments, we assessed multisensory integration of naturalistic audio-visual objects consisting of individually-tailored perithreshold dynamic visual and auditory stimuli, presented within a multiple-choice task, using a Bayesian Hierarchical Drift Diffusion Model that combines response time and accuracy. For both experiments, unisensory stimuli were degraded to reach a 75% identification accuracy level for all individuals and stimuli to promote multisensory binding. In Experiment 1, we subsequently presented uni- and their respective bimodal stimuli followed by a 5-alternative-forced-choice task. In Experiment 2, we controlled for low-level integration and attentional differences. Both experiments demonstrated significant superadditive multisensory integration of bimodal perithreshold dynamic information. We present evidence that the use of degraded sensory stimuli may provide a link between previous findings of inverse effectiveness on a single neuron level and overt behavior. We further suggest that a combined measure of accuracy and reaction time may be a more valid and holistic approach of studying multisensory integration and propose the application of drift diffusion models for studying behavioral correlates as well as brain-behavior relationships of multisensory integration.
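
    The hierarchical Bayesian estimation used in the study is beyond a short sketch, but the core drift-diffusion idea of explaining accuracy and response time together can be illustrated by simulation: increasing the drift rate, as stronger (e.g., bimodal rather than degraded unisensory) evidence would, raises accuracy and shortens RTs at the same time. All parameters below are arbitrary.

```python
import numpy as np

def simulate_ddm(drift, boundary=1.0, noise=1.0, dt=1e-3,
                 non_decision=0.3, n_trials=500, seed=0):
    """Simulate a basic two-boundary drift-diffusion process and return
    (accuracy, mean RT in seconds). A larger drift rate yields both higher
    accuracy and faster responses, which is why the model can combine the
    two behavioral measures in a single fit."""
    rng = np.random.default_rng(seed)
    correct, rts = 0, []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < boundary:                       # accumulate noisy evidence
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        correct += x >= boundary                       # upper boundary = correct
        rts.append(t + non_decision)
    return correct / n_trials, float(np.mean(rts))

print("degraded unisensory:", simulate_ddm(drift=0.8))
print("bimodal            :", simulate_ddm(drift=1.6))
```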

  8. Creating Multisensory Environments: Practical Ideas for Teaching and Learning. David Fulton/Nasen

    ERIC Educational Resources Information Center

    Davies, Christopher

    2011-01-01

    Multi-sensory environments in the classroom provide a wealth of stimulating learning experiences for all young children whose senses are still under development. "Creating Multisensory Environments: Practical Ideas for Teaching and Learning" is a highly practical guide to low-cost, easy-to-assemble multi-sensory environments. With a…

  9. Neuromodulation of early multisensory interactions in the visual cortex.

    PubMed

    Convento, Silvia; Vallar, Giuseppe; Galantini, Chiara; Bolognini, Nadia

    2013-05-01

    Merging information derived from different sensory channels allows the brain to amplify minimal signals to reduce their ambiguity, thereby improving the ability of orienting to, detecting, and identifying environmental events. Although multisensory interactions have been mostly ascribed to the activity of higher-order heteromodal areas, multisensory convergence may arise even in primary sensory-specific areas located very early along the cortical processing stream. In three experiments, we investigated early multisensory interactions in lower-level visual areas, by using a novel approach, based on the coupling of behavioral stimulation with two noninvasive brain stimulation techniques, namely, TMS and transcranial direct current stimulation (tDCS). First, we showed that redundant multisensory stimuli can increase visual cortical excitability, as measured by means of phosphene induction by occipital TMS; such physiological enhancement is followed by a behavioral facilitation through the amplification of signal intensity in sensory-specific visual areas. The more sensory inputs are combined (i.e., trimodal vs. bimodal stimuli), the greater are the benefits on phosphene perception. Second, neuroelectrical activity changes induced by tDCS in the temporal and in the parietal cortices, but not in the occipital cortex, can further boost the multisensory enhancement of visual cortical excitability, by increasing the auditory and tactile inputs from temporal and parietal regions, respectively, to lower-level visual areas.

  10. Greater benefits of multisensory integration during complex sensorimotor transformations.

    PubMed

    Buchholz, Verena N; Goonetilleke, Samanthi C; Medendorp, W Pieter; Corneil, Brian D

    2012-06-01

    Multisensory integration enables rapid and accurate behavior. To orient in space, sensory information registered initially in different reference frames has to be integrated with the current postural information to produce an appropriate motor response. In some postures, multisensory integration requires convergence of sensory evidence across hemispheres, which would presumably lessen or hinder integration. Here, we examined orienting gaze shifts in humans to visual, tactile, or visuotactile stimuli when the hands were either in a default uncrossed posture or a crossed posture requiring convergence across hemispheres. Surprisingly, we observed the greatest benefits of multisensory integration in the crossed posture, as indexed by reaction time (RT) decreases. Moreover, such shortening of RTs to multisensory stimuli did not come at the cost of increased error propensity. To explain these results, we propose that two accepted principles of multisensory integration, the spatial principle and inverse effectiveness, dynamically interact to aid the rapid and accurate resolution of complex sensorimotor transformations. First, early mutual inhibition of initial visual and tactile responses registered in different hemispheres reduces error propensity. Second, inverse effectiveness in the integration of the weakened visual response with the remapped tactile representation expedites the generation of the correct motor response. Our results imply that the concept of inverse effectiveness, which is usually associated with external stimulus properties, might extend to internal spatial representations that are more complex given certain body postures.

  11. Neuronal Plasticity and Multisensory Integration in Filial Imprinting

    PubMed Central

    Town, Stephen Michael; McCabe, Brian John

    2011-01-01

    Many organisms sample their environment through multiple sensory systems and the integration of multisensory information enhances learning. However, the mechanisms underlying multisensory memory formation and their similarity to unisensory mechanisms remain unclear. Filial imprinting is one example in which experience is multisensory, and the mechanisms of unisensory neuronal plasticity are well established. We investigated the storage of audiovisual information through experience by comparing the activity of neurons in the intermediate and medial mesopallium of imprinted and naïve domestic chicks (Gallus gallus domesticus) in response to an audiovisual imprinting stimulus and novel object and their auditory and visual components. We find that imprinting enhanced the mean response magnitude of neurons to unisensory but not multisensory stimuli. Furthermore, imprinting enhanced responses to incongruent audiovisual stimuli comprised of mismatched auditory and visual components. Our results suggest that the effects of imprinting on the unisensory and multisensory responsiveness of IMM neurons differ and that IMM neurons may function to detect unexpected deviations from the audiovisual imprinting stimulus. PMID:21423770

  12. Neuronal plasticity and multisensory integration in filial imprinting.

    PubMed

    Town, Stephen Michael; McCabe, Brian John

    2011-03-10

    Many organisms sample their environment through multiple sensory systems and the integration of multisensory information enhances learning. However, the mechanisms underlying multisensory memory formation and their similarity to unisensory mechanisms remain unclear. Filial imprinting is one example in which experience is multisensory, and the mechanisms of unisensory neuronal plasticity are well established. We investigated the storage of audiovisual information through experience by comparing the activity of neurons in the intermediate and medial mesopallium of imprinted and naïve domestic chicks (Gallus gallus domesticus) in response to an audiovisual imprinting stimulus and novel object and their auditory and visual components. We find that imprinting enhanced the mean response magnitude of neurons to unisensory but not multisensory stimuli. Furthermore, imprinting enhanced responses to incongruent audiovisual stimuli comprised of mismatched auditory and visual components. Our results suggest that the effects of imprinting on the unisensory and multisensory responsiveness of IMM neurons differ and that IMM neurons may function to detect unexpected deviations from the audiovisual imprinting stimulus.

  13. Multisensory simultaneity judgment and proximity to the body

    PubMed Central

    Noel, Jean-Paul; Łukowska, Marta; Wallace, Mark; Serino, Andrea

    2016-01-01

    The integration of information across different sensory modalities is known to be dependent upon the statistical characteristics of the stimuli to be combined. For example, the spatial and temporal proximity of stimuli are important determinants with stimuli that are close in space and time being more likely to be bound. These multisensory interactions occur not only for singular points in space/time, but over “windows” of space and time that likely relate to the ecological statistics of real-world stimuli. Relatedly, human psychophysical work has demonstrated that individuals are highly prone to judge multisensory stimuli as co-occurring over a wide range of time—a so-called simultaneity window (SW). Similarly, there exists a spatial representation of peripersonal space (PPS) surrounding the body in which stimuli related to the body and to external events occurring near the body are highly likely to be jointly processed. In the current study, we sought to examine the interaction between these temporal and spatial dimensions of multisensory representation by measuring the SW for audiovisual stimuli through proximal–distal space (i.e., PPS and extrapersonal space). Results demonstrate that the audiovisual SWs within PPS are larger than outside PPS. In addition, we suggest that this effect is likely due to an automatic and additional computation of these multisensory events in a body-centered reference frame. We discuss the current findings in terms of the spatiotemporal constraints of multisensory interactions and the implication of distinct reference frames on this process. PMID:26891828

  14. Development of multisensory reweighting for posture control in children.

    PubMed

    Bair, Woei-Nan; Kiemel, Tim; Jeka, John J; Clark, Jane E

    2007-12-01

    Reweighting to multisensory inputs adaptively contributes to stable and flexible upright stance control. However, few studies have examined how early a child develops multisensory reweighting ability, or how this ability develops through childhood. The purpose of the study was to characterize a developmental landscape of multisensory reweighting for upright postural control in children 4-10 years of age. Children were presented with simultaneous small-amplitude somatosensory and visual environmental movement at 0.28 and 0.2 Hz, respectively, within five conditions that independently varied the amplitude of the stimuli. The primary measure was body sway amplitude relative to each stimulus: touch gain and vision gain. We found that children can reweight to multisensory inputs from 4 years on. Specifically, intra-modal reweighting was exhibited by children as young as 4 years of age; however, inter-modal reweighting was only observed in the older children. The amount of reweighting increased with age indicating development of a better adaptive ability. Our results rigorously demonstrate the development of simultaneous reweighting to two sensory inputs for postural control in children. The present results provide further evidence that the development of multisensory reweighting contributes to more stable and flexible control of upright stance, which ultimately serves as the foundation for functional behaviors such as locomotion and reaching.
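
    Touch and vision gains of the kind described above are commonly computed as the ratio of sway amplitude to stimulus amplitude at the drive frequency. The sketch below shows that computation for a single hypothetical trial; it is not the authors' analysis pipeline, and all signals are simulated.

```python
import numpy as np

def frequency_gain(sway, stimulus, fs, drive_hz):
    """Gain of body sway relative to a sinusoidal sensory drive, computed as
    the ratio of Fourier amplitudes at the drive frequency."""
    freqs = np.fft.rfftfreq(len(sway), d=1.0 / fs)
    idx = int(np.argmin(np.abs(freqs - drive_hz)))          # bin nearest the drive
    return np.abs(np.fft.rfft(sway))[idx] / np.abs(np.fft.rfft(stimulus))[idx]

# Hypothetical 200 s trial sampled at 50 Hz with a 0.2 Hz visual drive (mm)
fs, dur, f_drive = 50.0, 200.0, 0.2
t = np.arange(0, dur, 1.0 / fs)
stimulus = 4.0 * np.sin(2 * np.pi * f_drive * t)
sway = 1.0 * np.sin(2 * np.pi * f_drive * t + 0.4) \
       + 0.3 * np.random.default_rng(0).standard_normal(t.size)
print(frequency_gain(sway, stimulus, fs, f_drive))           # approx. 0.25
```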

  15. The function of consciousness in multisensory integration.

    PubMed

    Palmer, Terry D; Ramsey, Ashley K

    2012-12-01

    The function of consciousness was explored in two contexts of audio-visual speech: cross-modal visual attention guidance and McGurk cross-modal integration. Experiments 1, 2, and 3 utilized a novel cueing paradigm in which two different flash-suppressed lip-streams co-occurred with speech sounds matching one of these streams. A visual target was then presented at either the audio-visually congruent or incongruent location. Target recognition differed for the congruent versus incongruent trials, and the nature of this difference depended on the probabilities of a target appearing at these respective locations. Thus, even though the lip-streams were never consciously perceived, they were nevertheless meaningfully integrated with the consciously perceived sounds, and participants learned to guide their attention according to statistical regularities between targets and these unconsciously perceived cross-modal cues. In Experiment 4, McGurk stimuli were presented in which the lip-streams were either flash-suppressed (4a) or unsuppressed (4b), and the McGurk effect was found to vanish under conditions of flash suppression. Overall, these results suggest a simple yet fundamental principle regarding the function of consciousness in multisensory integration: cross-modal effects can occur in the absence of consciousness, but the influencing modality must be consciously perceived for its information to cross modalities.

  16. A multisensory algorithm of satellite radiothermovision

    NASA Astrophysics Data System (ADS)

    Ermakov, D. M.; Sharkov, E. A.; Chernushich, A. P.

    2016-12-01

    A multisensory algorithm of satellite radiothermovision has been proposed that makes it possible to combine the data of satellite radiothermal monitoring of the Earth from different sources within a single method of spatiotemporal interpolation taking into account the differences in time of measurements and in the spatial resolution of different instruments. The new algorithm was tested in a series of measurements with SSMIS instruments onboard F16, F17 satellites of DMSP and AMSR-2 instruments onboard the GCOMW1 satellite in November 2013, as well as a series of measurements with the same instruments in August 2012 supplemented with data from WindSat. The parameters of the spatiotemporal interpolation of total precipitable water fields are improved nearly twofold (with a time step of 1.5 h on a regular grid with 0.125° sampling) compared with the method that previously used only SSM/I data. The achieved spatial sampling exceeds known world analogues while maintaining a high accuracy of interpolation. In addition, the problem of the transition from the fields at a fixed local time to the fields at a fixed universal time is briefly considered. It has been shown that this transition is mainly relevant in the study of the fast atmospheric processes on a global scale with high temporal resolution.
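
    As a deliberately oversimplified stand-in for the merging step, the sketch below blends two co-registered fields linearly in time onto a target time. The actual radiothermovision algorithm additionally accounts for differing sensor spatial resolutions and the motion of atmospheric features, none of which is modeled here; all arrays and numbers are hypothetical.

```python
import numpy as np

def blend_in_time(field_a, t_a, field_b, t_b, t_target):
    """Time-weighted blend of two co-registered total-precipitable-water fields
    observed at times t_a and t_b (hours), evaluated at t_target."""
    w = (t_target - t_a) / (t_b - t_a)
    return (1.0 - w) * field_a + w * field_b

# Hypothetical 0.125-degree fields (kg/m^2) from two overpasses 3 h apart
rng = np.random.default_rng(1)
f_a = 30.0 + rng.standard_normal((160, 160))
f_b = f_a + 2.0
print(blend_in_time(f_a, 0.0, f_b, 3.0, 1.5).mean())   # approx. 31, the midpoint field
```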

  17. How prior expectations shape multisensory perception.

    PubMed

    Gau, Remi; Noppeney, Uta

    2016-01-01

    The brain generates a representation of our environment by integrating signals from a common source, but segregating signals from different sources. This fMRI study investigated how the brain arbitrates between perceptual integration and segregation based on top-down congruency expectations and bottom-up stimulus-bound congruency cues. Participants were presented audiovisual movies of phonologically congruent, incongruent or McGurk syllables that can be integrated into an illusory percept (e.g. "ti" percept for visual «ki» with auditory /pi/). They reported the syllable they perceived. Critically, we manipulated participants' top-down congruency expectations by presenting McGurk stimuli embedded in blocks of congruent or incongruent syllables. Behaviorally, participants were more likely to fuse audiovisual signals into an illusory McGurk percept in congruent than incongruent contexts. At the neural level, the left inferior frontal sulcus (lIFS) showed increased activations for bottom-up incongruent relative to congruent inputs. Moreover, lIFS activations were increased for physically identical McGurk stimuli, when participants segregated the audiovisual signals and reported their auditory percept. Critically, this activation increase for perceptual segregation was amplified when participants expected audiovisually incongruent signals based on prior sensory experience. Collectively, our results demonstrate that the lIFS combines top-down prior (in)congruency expectations with bottom-up (in)congruency cues to arbitrate between multisensory integration and segregation.

  18. Capturing spatial attention with multisensory cues.

    PubMed

    Santangelo, Valerio; Ho, Cristy; Spence, Charles

    2008-04-01

    We assessed the influence of multisensory interactions on the exogenous orienting of spatial attention by comparing the ability of auditory, tactile, and audiotactile exogenous cues to capture visuospatial attention under conditions of no perceptual load versus high perceptual load. In Experiment 1, participants discriminated the elevation of visual targets preceded by either unimodal or bimodal cues under conditions of either a high perceptual load (involving the monitoring of a rapidly presented central stream of visual letters for occasionally presented target digits) or no perceptual load (when the central stream was replaced by a fixation point). All of the cues captured spatial attention in the no-load condition, whereas only the bimodal cues captured visuospatial attention in the high-load condition. In Experiment 2, we ruled out the possibility that the presentation of any changing stimulus at fixation (i.e., a passively monitored stream of letters) would eliminate exogenous orienting, which instead appears to be a consequence of high perceptual load conditions (Experiment 1). These results demonstrate that multisensory cues capture spatial attention more effectively than unimodal cues under conditions of concurrent perceptual load.

  19. Multisensorial Vision For Autonomous Vehicle Driving

    NASA Astrophysics Data System (ADS)

    Giusto, Daniele D.; Regazzoni, Carlo S.; Vernazza, Gianni L.

    1989-03-01

    A multisensorial vision system for autonomous vehicle driving that operates in outdoor natural environments is presented. The system, currently under development in our laboratories, will be able to integrate data provided by different sensors in order to achieve a more reliable description of a scene and to meet safety requirements. We chose to perform a high-level symbolic fusion of the data to better accomplish the recognition task. A knowledge-based approach is followed, which provides a more accurate solution; in particular, it will be possible to integrate both physical data, furnished by each channel, and different fusion strategies, by using an appropriate control structure. The high complexity of data integration is reduced by acquiring, filtering, segmenting and extracting features from each sensor channel. Production rules, divided into groups according to specific goals, drive the fusion process, linking to a symbolic frame all the segmented regions characterized by similar properties. As a first application, road and obstacle detection is performed. A particular fusion strategy is tested that integrates results separately obtained by applying the recognition module to each different sensor according to the related model description. Preliminary results are very promising and confirm the validity of the proposed approach.

  20. Molecular basis of dental sensitivity: The odontoblasts are multisensory cells and express multifunctional ion channels.

    PubMed

    Solé-Magdalena, A; Martínez-Alonso, M; Coronado, C A; Junquera, L M; Cobo, J; Vega, J A

    2017-09-24

    Odontoblasts are the dental pulp cells responsible for the formation of dentin. In addition, accumulating data strongly suggest that they can also function as sensory cells that mediate the early steps of mechanical, thermic, and chemical dental sensitivity. This assumption is based on the expression of different families of ion channels involved in various modalities of sensitivity and the release of putative neurotransmitters in response to odontoblast stimulation, which are able to act on pulp sensory nerve fibers. This review updates the current knowledge on the expression of transient receptor potential ion channels and acid-sensing ion channels in odontoblasts, nerve fibers innervating them and trigeminal sensory neurons, as well as in pulp cells. Moreover, the innervation of the odontoblasts and the interrelationship between odontoblasts and nerve fibers mediated by neurotransmitters was also revisited. These data might provide the basis for novel therapeutic approaches for the treatment of dentin sensibility and/or dental pain.

  1. Synchronous multisensory stimulation blurs self-other boundaries.

    PubMed

    Paladino, Maria-Paola; Mazzurega, Mara; Pavani, Francesco; Schubert, Thomas W

    2010-09-01

    In a study that builds on recent cognitive neuroscience research on body perception and social psychology research on social relations, we tested the hypothesis that synchronous multisensory stimulation leads to self-other merging. We brushed the cheek of each study participant as he or she watched a stranger's cheek being brushed in the same way, either in synchrony or in asynchrony. We found that this multisensory procedure had an effect on participants' body perception as well as social perception. Study participants exposed to synchronous stimulation showed more merging of self and the other than participants exposed to asynchronous stimulation. The degree of self-other merging was determined by measuring participants' body sensations and their perception of face resemblance, as well as participants' judgment of the inner state of the other, closeness felt toward the other, and conformity behavior. The results of this study show how multisensory integration can affect social perception and create a sense of self-other similarity.

  2. Senses make sense: An individualized multisensory stimulation for dementia.

    PubMed

    Cui, Yuanwu; Shen, Minxue; Ma, Yan; Wen, Shi Wu

    2017-01-01

    Nonpharmacologic interventions have been recommended as first-line treatments for dementia, and multisensory stimulation environments have been used as a non-pharmacological treatment for dementia patients in the last decade. However, the clinical effect of multisensory stimulation environments remains temporary and uncertain. Individualized medicine has been suggested to hold great promise in medicine, and it should be equally important for dementia. Reminiscence integrating individual experiences into therapeutic schemes has shown potential in the field of improving cognitive functions and depressive symptoms for dementia patients, and interactive music also demonstrated a positive outcome by using individualized music for the hearing aspect. We therefore hypothesize that an individualized multisensory stimulation in a natural and realistic environment integrating personal experience may be an effective intervention for patients suffering from dementia.

  3. Multisensory integration in complete unawareness: evidence from audiovisual congruency priming.

    PubMed

    Faivre, Nathan; Mudrik, Liad; Schwartz, Naama; Koch, Christof

    2014-11-01

    Multisensory integration is thought to require conscious perception. Although previous studies have shown that an invisible stimulus could be integrated with an audible one, none have demonstrated integration of two subliminal stimuli of different modalities. Here, pairs of identical or different audiovisual target letters (the sound /b/ with the written letter "b" or "m," respectively) were preceded by pairs of masked identical or different audiovisual prime digits (the sound /6/ with the written digit "6" or "8," respectively). In three experiments, awareness of the audiovisual digit primes was manipulated, such that participants were either unaware of the visual digit, the auditory digit, or both. Priming of the semantic relations between the auditory and visual digits was found in all experiments. Moreover, a further experiment showed that unconscious multisensory integration was not obtained when participants did not undergo prior conscious training of the task. This suggests that following conscious learning, unconscious processing suffices for multisensory integration.

  4. Visuo-haptic multisensory object recognition, categorization, and representation

    PubMed Central

    Lacey, Simon; Sathian, K.

    2014-01-01

    Visual and haptic unisensory object processing show many similarities in terms of categorization, recognition, and representation. In this review, we discuss how these similarities contribute to multisensory object processing. In particular, we show that similar unisensory visual and haptic representations lead to a shared multisensory representation underlying both cross-modal object recognition and view-independence. This shared representation suggests a common neural substrate and we review several candidate brain regions, previously thought to be specialized for aspects of visual processing, that are now known also to be involved in analogous haptic tasks. Finally, we lay out the evidence for a model of multisensory object recognition in which top-down and bottom-up pathways to the object-selective lateral occipital complex are modulated by object familiarity and individual differences in object and spatial imagery. PMID:25101014

  5. Multisensory Integration in the Virtual Hand Illusion with Active Movement

    PubMed Central

    Satoh, Satoru; Hachimura, Kozaburo

    2016-01-01

    Improving the sense of immersion is one of the core issues in virtual reality. Perceptual illusions of ownership can be perceived over a virtual body in a multisensory virtual reality environment. Rubber Hand and Virtual Hand Illusions showed that body ownership can be manipulated by applying suitable visual and tactile stimulation. In this study, we investigate the effects of multisensory integration in the Virtual Hand Illusion with active movement. A virtual xylophone playing system which can interactively provide synchronous visual, tactile, and auditory stimulation was constructed. We conducted two experiments regarding different movement conditions and different sensory stimulations. Our results demonstrate that multisensory integration with free active movement can improve the sense of immersion in virtual reality. PMID:27847822

  6. Balanced crossmodal excitation and inhibition essential for maximizing multisensory gain.

    PubMed

    Hoshino, Osamu

    2014-07-01

    We examined whether and how the balancing of crossmodal excitation and inhibition affects intersensory facilitation. A neural network model, comprising lower-order unimodal networks (X, Y) and a higher-order multimodal network (M), was simulated. Crossmodal excitation was made by direct activation of principal cells of the X network by the Y network. Crossmodal inhibition was made in an indirect manner: the Y network activated glial cells of the X network. This let glial plasma membrane transporters export GABA molecules into the extracellular space and increased the level of ambient GABA. The ambient GABA molecules were accepted by extrasynaptic GABAa receptors and tonically inhibited principal cells of the X network. Namely, crossmodal inhibition was made through GABAergic gliotransmission. Intersensory facilitation was assessed in terms of multisensory gain: the difference between the numbers of spikes evoked by multisensory (XY) stimulation and unisensory (X-alone) stimulation. The maximal multisensory gain (XY-X) could be achieved at an intermediate noise level by balancing crossmodal excitation and inhibition. This result supports an experimentally derived conclusion: intersensory facilitation under noisy environmental conditions is not necessarily in accord with the principle of inverse effectiveness; rather, multisensory gain is maximal at intermediate signal-to-noise ratio (SNR) levels. The maximal multisensory gain was available at the weakest signal if noise was not present, indicating that the principle of inverse effectiveness is a special case of the intersensory facilitation model proposed here. We suggest that the balancing of crossmodal excitation and inhibition may be crucial for intersensory facilitation. The GABAergic gliotransmission-mediated crossmodal inhibitory mechanism effectively works for intersensory facilitation and determines the maximal multisensory gain over the entire SNR range between the two extremes of low and high SNR.
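
    The gain measure defined above is simply the difference in spike counts between combined (XY) and unisensory (X-alone) stimulation, evaluated at each noise level. A minimal sketch with made-up counts, showing a peak at an intermediate noise level as the abstract describes:

```python
import numpy as np

def multisensory_gain(spikes_xy, spikes_x):
    """Multisensory gain (XY - X): spikes to combined stimulation minus spikes
    to X-alone stimulation, per noise level."""
    return np.asarray(spikes_xy) - np.asarray(spikes_x)

# Hypothetical spike counts across increasing ambient-noise levels
noise_levels = [0.0, 0.1, 0.2, 0.3, 0.4]
xy = [30, 38, 45, 40, 33]
x_alone = [22, 26, 28, 30, 31]
gain = multisensory_gain(xy, x_alone)
print(gain, "-> gain peaks at noise level",
      noise_levels[int(np.argmax(gain))])   # intermediate noise maximizes gain
```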

  7. A novel behavioral paradigm to assess multisensory processing in mice

    PubMed Central

    Siemann, Justin K.; Muller, Christopher L.; Bamberger, Gary; Allison, John D.; Veenstra-VanderWeele, Jeremy; Wallace, Mark T.

    2015-01-01

    Human psychophysical and animal behavioral studies have illustrated the benefits that can be conferred from having information available from multiple senses. Given the central role of multisensory integration for perceptual and cognitive function, it is important to design behavioral paradigms for animal models to provide mechanistic insights into the neural bases of these multisensory processes. Prior studies have focused on large mammals, yet the mouse offers a host of advantages, most importantly the wealth of available genetic manipulations relevant to human disease. To begin to employ this model species for multisensory research it is necessary to first establish and validate a robust behavioral assay for the mouse. Two common mouse strains (C57BL/6J and 129S6/SvEv) were first trained to respond to unisensory (visual and auditory) stimuli separately. Once trained, performance with paired audiovisual stimuli was then examined with a focus on response accuracy and behavioral gain. Stimulus durations varied from 50 ms to 1 s in order to modulate the effectiveness of the stimuli and to determine if the well-established “principle of inverse effectiveness” held in this model. Response accuracy in the multisensory condition was greater than for either unisensory condition for all stimulus durations, with significant gains observed at the 300 ms and 100 ms durations. Main effects of stimulus duration, stimulus modality and a significant interaction between these factors were observed. The greatest behavioral gain was seen for the 100 ms duration condition, with a trend observed that as the stimuli became less effective, larger behavioral gains were observed upon their pairing (i.e., inverse effectiveness). These results are the first to validate the mouse as a species that shows demonstrable behavioral facilitations under multisensory conditions and provides a platform for future mechanistically directed studies to examine the neural bases of multisensory

  8. Multifunctional nanoparticles: analytical prospects.

    PubMed

    de Dios, Alejandro Simón; Díaz-García, Marta Elena

    2010-05-07

    Multifunctional nanoparticles are among the most exciting nanomaterials with promising applications in analytical chemistry. These applications include (bio)sensing, (bio)assays, catalysis and separations. Although most of these applications are based on the magnetic, optical and electrochemical properties of multifunctional nanoparticles, other aspects such as the synergistic effect of the functional groups and the amplification effect associated with the nanoscale dimension have also been observed. Considering not only the nature of the raw material but also the shape, there is a huge variety of nanoparticles. In this review only magnetic, quantum dots, gold nanoparticles, carbon and inorganic nanotubes as well as silica, titania and gadolinium oxide nanoparticles are addressed. This review presents a narrative summary on the use of multifunctional nanoparticles for analytical applications, along with a discussion on some critical challenges existing in the field and possible solutions that have been or are being developed to overcome these challenges.

  9. Mechanics of Multifunctional Materials & Microsystems

    DTIC Science & Technology

    2012-03-09

    Report documentation fragment; only subject terms and a briefing outline are recoverable. Subject terms: Mechanics of Materials; Life Prediction (Materials & Micro-devices); Sensing, Precognition & Diagnosis; Multifunctional Design of Autonomic Systems. Approved for public release; distribution is unlimited. Stated vision: site-specific, autonomic aerospace structures incorporating sensing and precognition.

  10. Implicit multisensory associations influence voice recognition.

    PubMed

    von Kriegstein, Katharina; Giraud, Anne-Lise

    2006-10-01

    Natural objects provide partially redundant information to the brain through different sensory modalities. For example, voices and faces both give information about the speech content, age, and gender of a person. Thanks to this redundancy, multimodal recognition is fast, robust, and automatic. In unimodal perception, however, only part of the information about an object is available. Here, we addressed whether, even under conditions of unimodal sensory input, crossmodal neural circuits that have been shaped by previous associative learning become activated and underpin a performance benefit. We measured brain activity with functional magnetic resonance imaging before, while, and after participants learned to associate either sensory redundant stimuli, i.e. voices and faces, or arbitrary multimodal combinations, i.e. voices and written names, ring tones, and cell phones or brand names of these cell phones. After learning, participants were better at recognizing unimodal auditory voices that had been paired with faces than those paired with written names, and association of voices with faces resulted in an increased functional coupling between voice and face areas. No such effects were observed for ring tones that had been paired with cell phones or names. These findings demonstrate that brief exposure to ecologically valid and sensory redundant stimulus pairs, such as voices and faces, induces specific multisensory associations. Consistent with predictive coding theories, associative representations become thereafter available for unimodal perception and facilitate object recognition. These data suggest that for natural objects effective predictive signals can be generated across sensory systems and proceed by optimization of functional connectivity between specialized cortical sensory modules.

  11. Implicit Multisensory Associations Influence Voice Recognition

    PubMed Central

    von Kriegstein, Katharina; Giraud, Anne-Lise

    2006-01-01

    Natural objects provide partially redundant information to the brain through different sensory modalities. For example, voices and faces both give information about the speech content, age, and gender of a person. Thanks to this redundancy, multimodal recognition is fast, robust, and automatic. In unimodal perception, however, only part of the information about an object is available. Here, we addressed whether, even under conditions of unimodal sensory input, crossmodal neural circuits that have been shaped by previous associative learning become activated and underpin a performance benefit. We measured brain activity with functional magnetic resonance imaging before, while, and after participants learned to associate either sensory redundant stimuli, i.e. voices and faces, or arbitrary multimodal combinations, i.e. voices and written names, ring tones, and cell phones or brand names of these cell phones. After learning, participants were better at recognizing unimodal auditory voices that had been paired with faces than those paired with written names, and association of voices with faces resulted in an increased functional coupling between voice and face areas. No such effects were observed for ring tones that had been paired with cell phones or names. These findings demonstrate that brief exposure to ecologically valid and sensory redundant stimulus pairs, such as voices and faces, induces specific multisensory associations. Consistent with predictive coding theories, associative representations become thereafter available for unimodal perception and facilitate object recognition. These data suggest that for natural objects effective predictive signals can be generated across sensory systems and proceed by optimization of functional connectivity between specialized cortical sensory modules. PMID:17002519

  12. Heterogeneity in the spatial receptive field architecture of multisensory neurons of the superior colliculus and its effects on multisensory integration.

    PubMed

    Ghose, D; Wallace, M T

    2014-01-03

    Multisensory integration has been widely studied in neurons of the mammalian superior colliculus (SC). This has led to the description of various determinants of multisensory integration, including those based on stimulus- and neuron-specific factors. The most widely characterized of these illustrate the importance of the spatial and temporal relationships of the paired stimuli as well as their relative effectiveness in eliciting a response in determining the final integrated output. Although these stimulus-specific factors have generally been considered in isolation (i.e., manipulating stimulus location while holding all other factors constant), they have an intrinsic interdependency that has yet to be fully elucidated. For example, changes in stimulus location will likely also impact both the temporal profile of response and the effectiveness of the stimulus. The importance of better describing this interdependency is further reinforced by the fact that SC neurons have large receptive fields, and that responses at different locations within these receptive fields are far from equivalent. To address these issues, the current study was designed to examine the interdependency between the stimulus factors of space and effectiveness in dictating the multisensory responses of SC neurons. The results show that neuronal responsiveness changes dramatically with changes in stimulus location - highlighting a marked heterogeneity in the spatial receptive fields of SC neurons. More importantly, this receptive field heterogeneity played a major role in the integrative product exhibited by stimulus pairings, such that pairings at weakly responsive locations of the receptive fields resulted in the largest multisensory interactions. Together these results provide greater insight into the interrelationship of the factors underlying multisensory integration in SC neurons, and may have important mechanistic implications for multisensory integration and the role it plays in shaping

  13. Multifunctional Tanks for Spacecraft

    NASA Technical Reports Server (NTRS)

    Collins, David H.; Lewis, Joseph C.; MacNeal, Paul D.

    2006-01-01

    A document discusses multifunctional tanks as means to integrate additional structural and functional efficiencies into designs of spacecraft. Whereas spacecraft tanks are traditionally designed primarily to store fluids and only secondarily to provide other benefits, multifunctional tanks are designed to simultaneously provide multiple primary benefits. In addition to one or more chamber(s) for storage of fluids, a multifunctional tank could provide any or all of the following: a) Passageways for transferring the fluids; b) Part or all of the primary structure of a spacecraft; c) All or part of an enclosure; d) Mechanical interfaces to components, subsystems, and/or systems; e) Paths and surfaces for transferring heat; f) Shielding against space radiation; g) Shielding against electromagnetic interference; h) Electrically conductive paths and surfaces; and i) Shades and baffles to protect against sunlight and/or other undesired light. Many different multifunctional-tank designs are conceivable. The design of a particular tank can be tailored to the requirements for the spacecraft in which the tank is to be installed. For example, the walls of the tank can be flat or curved or have more complicated shapes, and the tank can include an internal structure for strengthening the tank and/or other uses.

  14. Multisensory Public Access Catalogs on CD-ROM.

    ERIC Educational Resources Information Center

    Harrison, Nancy; Murphy, Brower

    1987-01-01

    BiblioFile Intelligent Catalog is a CD-ROM-based public access catalog system which incorporates graphics and sound to provide a multisensory interface and artificial intelligence techniques to increase search precision. The system can be updated frequently and inexpensively by linking hard disk drives to CD-ROM optical drives. (MES)

  15. Multisensory Speech Perception Without the Left Superior Temporal Sulcus

    PubMed Central

    Baum, Sarah H.; Martin, Randi C.; Hamilton, A. Cris; Beauchamp, Michael S.

    2012-01-01

    Converging evidence suggests that the left superior temporal sulcus (STS) is a critical site for multisensory integration of auditory and visual information during speech perception. We report a patient, SJ, who suffered a stroke that damaged the left temporo-parietal area, resulting in mild anomic aphasia. Structural MRI showed complete destruction of the left middle and posterior STS, as well as damage to adjacent areas in the temporal and parietal lobes. Surprisingly, SJ demonstrated preserved multisensory integration measured with two independent tests. First, she perceived the McGurk effect, an illusion that requires integration of auditory and visual speech. Second, her perception of morphed audiovisual speech with ambiguous auditory or visual information was significantly influenced by the opposing modality. To understand the neural basis for this preserved multisensory integration, blood-oxygen level dependent functional magnetic resonance imaging (BOLD fMRI) was used to examine brain responses to audiovisual speech in SJ and 23 healthy age-matched controls. In controls, bilateral STS activity was observed. In SJ, no activity was observed in the damaged left STS, but in the right STS more cortex was active in SJ than in any of the normal controls. Further, the amplitude of the BOLD response to McGurk stimuli in the right STS was significantly greater in SJ than in controls. The simplest explanation of these results is a reorganization of SJ's cortical language networks such that the right STS now subserves multisensory integration of speech. PMID:22634292

  16. The Multisensory Sound Lab: Sounds You Can See and Feel.

    ERIC Educational Resources Information Center

    Lederman, Norman; Hendricks, Paula

    1994-01-01

    A multisensory sound lab has been developed at the Model Secondary School for the Deaf (District of Columbia). A special floor allows vibrations to be felt, and a spectrum analyzer displays frequencies and harmonics visually. The lab is used for science education, auditory training, speech therapy, music and dance instruction, and relaxation…

  17. Detection of Iberian ham aroma by a semiconductor multisensorial system.

    PubMed

    Otero, Laura; Horrillo, M A Carmen; García, María; Sayago, Isabel; Aleixandre, Manuel; Fernández, M A Jesús; Arés, Luis; Gutiérrez, Javier

    2003-11-01

    A semiconductor multisensorial system based on tin oxide for controlling the quality of dry-cured Iberian hams is described. Two types of ham (subjected to different drying temperatures) were selected. Good responses were obtained from the 12 elements forming the multisensor for different operating temperatures. Discrimination between the two types of ham was successfully realised through principal component analysis (PCA).
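
    As a rough illustration of the kind of analysis reported above, the sketch below applies principal component analysis to a matrix of responses from a 12-element sensor array. The synthetic response values and the two ham classes are invented for illustration and do not reflect the actual measurements in this record.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Synthetic responses of a 12-element tin-oxide sensor array
# (rows = samples, columns = sensors); two classes with shifted profiles.
class_a = rng.normal(loc=1.0, scale=0.1, size=(20, 12))
class_b = rng.normal(loc=1.3, scale=0.1, size=(20, 12))
responses = np.vstack([class_a, class_b])
labels = ["A"] * 20 + ["B"] * 20

# Project onto the first two principal components, as in a typical
# electronic-nose discrimination analysis.
pca = PCA(n_components=2)
scores = pca.fit_transform(responses)

print("Explained variance ratio:", pca.explained_variance_ratio_)
for label, (pc1, pc2) in list(zip(labels, scores))[:5]:
    print(f"class {label}: PC1={pc1:+.2f}, PC2={pc2:+.2f}")
```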

  18. Musicians react faster and are better multisensory integrators.

    PubMed

    Landry, Simon P; Champoux, François

    2017-02-01

    The results from numerous investigations suggest that musical training might enhance how senses interact. Despite repeated confirmation of anatomical and structural changes in visual, tactile, and auditory regions, significant changes have only been reported in the audiovisual domain and for the detection of audio-tactile incongruencies. In the present study, we aim at testing whether long-term musical training might also enhance other multisensory processes at a behavioural level. An audio-tactile reaction time task was administered to a group of musicians and non-musicians. We found significantly faster reaction times for musicians for auditory, tactile, and audio-tactile stimulations. Statistical analyses between the combined uni- and multisensory reaction times revealed that musicians possess a statistical advantage when responding to multisensory stimuli compared to non-musicians. These results suggest for the first time that long-term musical training reduces simple non-musical auditory, tactile, and multisensory reaction times. Taken together with the previous results from other sensory modalities, these results strongly point towards musicians being better at integrating the inputs from various senses.

  19. Multisensory in-car warning signals for collision avoidance.

    PubMed

    Ho, Cristy; Reed, Nick; Spence, Charles

    2007-12-01

    A driving simulator study was conducted in order to assess the relative utility of unimodal auditory, unimodal vibrotactile, and combined audiotactile (i.e., multisensory) in-car warning signals to alert and inform drivers of likely front-to-rear-end collision events in a situation modeled on real-world driving. The implementation of nonvisual in-car warning signals may have important safety implications in lessening any visual overload during driving. Multisensory integration can provide synergistic facilitation effects. The participants drove along a rural road in a car-following scenario in either the presence or absence of a radio program in the background. The brake light signals of the lead vehicle were also unpredictably either enabled or disabled on a trial-by-trial basis. The results showed that the participants initiated their braking responses significantly more rapidly following the presentation of audiotactile warning signals than following the presentation of either unimodal auditory or unimodal vibrotactile warning signals. Multisensory warning signals offer a particularly effective means of capturing driver attention in demanding situations such as driving. The potential value of such multisensory in-car warning signals is explained with reference to recent cognitive neuroscience research.

  20. Early Visual Deprivation Alters Multisensory Processing in Peripersonal Space

    ERIC Educational Resources Information Center

    Collignon, Olivier; Charbonneau, Genevieve; Lassonde, Maryse; Lepore, Franco

    2009-01-01

    Multisensory peripersonal space develops in a maturational process that is thought to be influenced by early sensory experience. We investigated the role of vision in the effective development of audiotactile interactions in peripersonal space. Early blind (EB), late blind (LB) and sighted control (SC) participants were asked to lateralize…

  1. Multiple Pathways to Self: A Multisensory Art Experience.

    ERIC Educational Resources Information Center

    Jensen, Sharon M.

    1997-01-01

    Reports on a multisensory intervention that combined art, music, and movement within a long-term care setting for Alzheimer's patients. Details the benefits derived by some of the participants who attended the sessions regularly. Many were able to retrieve memories, enjoy socialization, and have the opportunity for affective expression. (RJM)

  2. Evidence for Diminished Multisensory Integration in Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Stevenson, Ryan A.; Siemann, Justin K.; Woynaroski, Tiffany G.; Schneider, Brittany C.; Eberly, Haley E.; Camarata, Stephen M.; Wallace, Mark T.

    2014-01-01

    Individuals with autism spectrum disorders (ASD) exhibit alterations in sensory processing, including changes in the integration of information across the different sensory modalities. In the current study, we used the sound-induced flash illusion to assess multisensory integration in children with ASD and typically-developing (TD) controls.…

  3. Bringing Art to Life through Multi-Sensory Tours

    ERIC Educational Resources Information Center

    Dodek, Wendy L.

    2012-01-01

    Learning occurs in myriad ways yet most art museums remain wedded to visual instruction. Adult visitors touring the galleries are offered audio guides or lecture style tours to complement the visual but are there other ways to enhance learning? This article reports on a case study that found that active, multi-sensory experiences in art museums…

  4. Multisensory Information Boosts Numerical Matching Abilities in Young Children

    ERIC Educational Resources Information Center

    Jordan, Kerry E.; Baker, Joseph

    2011-01-01

    This study presents the first evidence that preschool children perform more accurately in a numerical matching task when given multisensory rather than unisensory information about number. Three- to 5-year-old children learned to play a numerical matching game on a touchscreen computer, which asked them to match a sample numerosity with a…

  5. Multisensory Teaching of Basic Language Skills. Third Edition

    ERIC Educational Resources Information Center

    Birsh, Judith R., Ed.

    2011-01-01

    As new research shows how effective systematic and explicit teaching of language-based skills is for students with learning disabilities--along with the added benefits of multisensory techniques--discover the latest on this popular teaching approach with the third edition of this bestselling textbook. Adopted by colleges and universities across…

  6. Multisensory integration across the senses in young and old adults

    PubMed Central

    Mahoney, Jeannette R.; Li, Po Ching Clara; Oh-Park, Mooyeon; Verghese, Joe; Holtzer, Roee

    2011-01-01

    Stimuli are processed concurrently and across multiple sensory inputs. Here we directly compared the effect of multisensory integration (MSI) on reaction time across three paired sensory inputs in eighteen young (M=19.17 yrs) and eighteen old (M=76.44 yrs) individuals. Participants were determined to be non-demented and without any medical or psychiatric conditions that would affect their performance. Participants responded to randomly presented unisensory (auditory, visual, somatosensory) stimuli and three paired sensory inputs consisting of auditory-somatosensory (AS), auditory-visual (AV), and visual-somatosensory (VS) stimuli. Results revealed that reaction time (RT) to all multisensory pairings was significantly faster than those elicited by the constituent unisensory conditions across age groups; findings that could not be accounted for by simple probability summation. Both young and old participants responded the fastest to multisensory pairings containing somatosensory input. Compared to younger adults, older adults demonstrated a significantly greater RT benefit when processing concurrent VS information. In terms of co-activation, older adults demonstrated a significant increase in the magnitude of visual-somatosensory co-activation (i.e., multisensory integration), while younger adults demonstrated a significant increase in the magnitude of auditory-visual and auditory-somatosensory co-activation. This study provides the first evidence in support of the facilitative effect of pairing somatosensory with visual stimuli in older adults. PMID:22024545

  7. Multisensory integration across the senses in young and old adults.

    PubMed

    Mahoney, Jeannette R; Li, Po Ching Clara; Oh-Park, Mooyeon; Verghese, Joe; Holtzer, Roee

    2011-12-02

    Stimuli are processed concurrently and across multiple sensory inputs. Here we directly compared the effect of multisensory integration (MSI) on reaction time across three paired sensory inputs in eighteen young (M=19.17 years) and eighteen old (M=76.44 years) individuals. Participants were determined to be non-demented and without any medical or psychiatric conditions that would affect their performance. Participants responded to randomly presented unisensory (auditory, visual, somatosensory) stimuli and three paired sensory inputs consisting of auditory-somatosensory (AS), auditory-visual (AV), and visual-somatosensory (VS) stimuli. Results revealed that reaction time (RT) to all multisensory pairings was significantly faster than those elicited by the constituent unisensory conditions across age groups; findings that could not be accounted for by simple probability summation. Both young and old participants responded the fastest to multisensory pairings containing somatosensory input. Compared to younger adults, older adults demonstrated a significantly greater RT benefit when processing concurrent VS information. In terms of co-activation, older adults demonstrated a significant increase in the magnitude of visual-somatosensory co-activation (i.e., multisensory integration), while younger adults demonstrated a significant increase in the magnitude of auditory-visual and auditory-somatosensory co-activation. This study provides the first evidence in support of the facilitative effect of pairing somatosensory with visual stimuli in older adults.
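
    The probability-summation comparison mentioned in this record is commonly tested with Miller's race model inequality, which bounds the multisensory reaction-time distribution by the sum of the unisensory distributions. The sketch below is a generic illustration of that test on simulated reaction times; the distributions and sample sizes are assumptions for demonstration, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated reaction times (ms) for unisensory and multisensory conditions.
rt_auditory = rng.normal(320, 40, 500)
rt_visual = rng.normal(300, 40, 500)
rt_av = rng.normal(255, 35, 500)  # faster than either unisensory condition

def cdf(samples, t):
    """Empirical cumulative distribution function evaluated at times t."""
    return np.searchsorted(np.sort(samples), t, side="right") / len(samples)

# Race model (Miller) inequality: F_AV(t) <= F_A(t) + F_V(t).
t_grid = np.linspace(150, 500, 71)
bound = np.minimum(1.0, cdf(rt_auditory, t_grid) + cdf(rt_visual, t_grid))
violation = cdf(rt_av, t_grid) - bound

print(f"Maximum race-model violation: {violation.max():.3f}")
print("Co-activation suggested" if violation.max() > 0 else "No violation")
```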

  10. Multisensory Teaching of Basic Language Skills. Second Edition

    ERIC Educational Resources Information Center

    Birsh, Judith R., Ed.

    2005-01-01

    For students with dyslexia and other learning disabilities--and for their peers--creative teaching methods that use two or more senses can dramatically improve language skills and academic outcomes. That is why every current and future educator needs the second edition of this definitive guide to multisensory teaching. A core text for a variety of…

  11. An extended multisensory temporal binding window in autism spectrum disorders

    PubMed Central

    Foss-Feig, Jennifer H.; Kwakye, Leslie D.; Cascio, Carissa J.; Burnette, Courtney P.; Kadivar, Haleh; Stone, Wendy L.

    2010-01-01

    Autism spectrum disorders (ASD) form a continuum of neurodevelopmental disorders, characterized by deficits in communication and reciprocal social interaction, as well as by repetitive behaviors and restricted interests. Sensory disturbances are also frequently reported in clinical and autobiographical accounts. However, surprisingly few empirical studies have characterized the fundamental features of sensory and multisensory processing in ASD. The current study is structured to test for potential differences in multisensory temporal function in ASD by making use of a temporally dependent, low-level multisensory illusion. In this illusion, the presentation of a single flash of light accompanied by multiple sounds often results in the illusory perception of multiple flashes. By systematically varying the temporal structure of the audiovisual stimuli, a “temporal window” within which these stimuli are likely to be bound into a single perceptual entity can be defined. The results of this study revealed that children with ASD report the flash-beep illusion over an extended range of stimulus onset asynchronies relative to children with typical development, suggesting that children with ASD have altered multisensory temporal function. These findings provide valuable new insights into our understanding of sensory processing in ASD and may hold promise for the development of more sensitive diagnostic measures and improved remediation strategies. PMID:20390256

  13. Multisensory Emplaced Learning: Resituating Situated Learning in a Moving World

    ERIC Educational Resources Information Center

    Fors, Vaike; Backstrom, Asa; Pink, Sarah

    2013-01-01

    This article outlines the implications of a theory of "sensory-emplaced learning" for understanding the interrelationships between the embodied and environmental in learning processes. Understanding learning as multisensory and contingent within everyday place-events, this framework analytically describes how people establish themselves as…

  15. Accelerating Early Language Development with Multi-Sensory Training

    ERIC Educational Resources Information Center

    Bjorn, Piia M.; Kakkuri, Irma; Karvonen, Pirkko; Leppanen, Paavo H. T.

    2012-01-01

    This paper reports the outcome of a multi-sensory intervention on infant language skills. A programme titled "Rhyming Game and Exercise Club", which included kinaesthetic-tactile mother-child rhyming games performed in natural joint attention situations, was intended to accelerate Finnish six- to eight-month-old infants' language development. The…

  16. Multisensory Speech Perception in Children with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Woynaroski, Tiffany G.; Kwakye, Leslie D.; Foss-Feig, Jennifer H.; Stevenson, Ryan A.; Stone, Wendy L.; Wallace, Mark T.

    2013-01-01

    This study examined unisensory and multisensory speech perception in 8-17 year old children with autism spectrum disorders (ASD) and typically developing controls matched on chronological age, sex, and IQ. Consonant-vowel syllables were presented in visual only, auditory only, matched audiovisual, and mismatched audiovisual ("McGurk")…

  17. Please! Teach All of Me: Multisensory Activities for Preschoolers.

    ERIC Educational Resources Information Center

    Crawford, Jackie; Hanson, Joni; Gums, Marcia; Neys, Paula

    Most people, including children, have preferences for how they learn about the world. When these preferences are clearly noticeable, they may be thought of as sensory strengths. For some children, sensory strengths develop because of a weakness in another sensory area. For these children, multisensory instruction can be very helpful. Multisensory…

  19. Attention and multisensory modulation argue against total encapsulation.

    PubMed

    de Haas, Benjamin; Schwarzkopf, Dietrich Samuel; Rees, Geraint

    2016-01-01

    Firestone & Scholl (F&S) postulate that vision proceeds without any direct interference from cognition. We argue that this view is extreme and not in line with the available evidence. Specifically, we discuss two well-established counterexamples: Attention directly affects core aspects of visual processing, and multisensory modulations of vision originate on multiple levels, some of which are unlikely to fall "within perception."

  4. Influences of Multisensory Experience on Subsequent Unisensory Processing

    PubMed Central

    Shams, Ladan; Wozny, David R.; Kim, Robyn; Seitz, Aaron

    2011-01-01

    Multisensory perception has been the focus of intense investigation in recent years. It is now well-established that crossmodal interactions are ubiquitous in perceptual processing and endow the system with improved precision, accuracy, processing speed, etc. While these findings have shed much light on principles and mechanisms of perception, ultimately it is not very surprising that multiple sources of information provide benefits in performance compared to a single source of information. Here, we argue that the more surprising recent findings are those showing that multisensory experience also influences the subsequent unisensory processing. For example, exposure to auditory–visual stimuli can change the way that auditory or visual stimuli are processed subsequently even in isolation. We review three sets of findings that represent three different types of learning ranging from perceptual learning, to sensory recalibration, to associative learning. In all these cases exposure to multisensory stimuli profoundly influences the subsequent unisensory processing. This diversity of phenomena may suggest that continuous modification of unisensory representations by multisensory relationships may be a general learning strategy employed by the brain. PMID:22028697

  5. Measuring multisensory integration: from reaction times to spike counts.

    PubMed

    Colonius, Hans; Diederich, Adele

    2017-06-08

    A neuron is categorized as "multisensory" if there is a statistically significant difference between the response evoked, e.g., by a crossmodal stimulus combination and that evoked by the most effective of its components separately. Being responsive to multiple sensory modalities does not guarantee that a neuron has actually engaged in integrating its multiple sensory inputs: it could simply respond to the stimulus component eliciting the strongest response in a given trial. Crossmodal enhancement is commonly expressed as a proportion of the strongest mean unisensory response. This traditional index does not take into account any statistical dependency between the sensory channels under crossmodal stimulation. We propose an alternative index measuring by how much the multisensory response surpasses the level obtainable by optimally combining the unisensory responses, with optimality defined as probability summation under maximal negative stochastic dependence. The new index is analogous to measuring crossmodal enhancement in reaction time studies by the strength of violation of the "race model inequality," a numerical measure of multisensory integration. Since the new index tends to be smaller than the traditional one, neurons previously labeled as "multisensory" may lose that property. The index is easy to compute and it is sensitive to variability in data.
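
    To make the comparison above concrete, here is a small Python sketch that computes the traditional index described in this record, i.e. crossmodal enhancement expressed relative to the strongest mean unisensory response, for simulated spike counts. The spike counts are invented for illustration, and the proposed dependence-aware index is only paraphrased in a comment because its exact formula is not given in this record.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated single-trial spike counts for one neuron (illustrative values).
auditory = rng.poisson(4.0, 30)
visual = rng.poisson(6.0, 30)
audiovisual = rng.poisson(9.0, 30)

# Traditional index: crossmodal enhancement as a percentage of the
# strongest mean unisensory response.
strongest_unisensory = max(auditory.mean(), visual.mean())
enhancement = 100.0 * (audiovisual.mean() - strongest_unisensory) / strongest_unisensory
print(f"Traditional crossmodal enhancement: {enhancement:.1f}%")

# The index proposed by Colonius and Diederich instead compares the
# multisensory response with an upper bound obtained by probability
# summation under maximal negative dependence between the unisensory
# channels; because that bound uses the full trial-by-trial
# distributions, it is generally stricter than the comparison above.
```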

  7. Alterations to multisensory and unisensory integration by stimulus competition

    PubMed Central

    Rowland, Benjamin A.; Stanford, Terrence R.; Stein, Barry E.

    2011-01-01

    In environments containing sensory events at competing locations, selecting a target for orienting requires prioritization of stimulus values. Although the superior colliculus (SC) is causally linked to the stimulus selection process, the manner in which SC multisensory integration operates in a competitive stimulus environment is unknown. Here we examined how the activity of visual-auditory SC neurons is affected by placement of a competing target in the opposite hemifield, a stimulus configuration that would, in principle, promote interhemispheric competition for access to downstream motor circuitry. Competitive interactions between the targets were evident in how they altered unisensory and multisensory responses of individual neurons. Responses elicited by a cross-modal stimulus (multisensory responses) proved to be substantially more resistant to competitor-induced depression than were unisensory responses (evoked by the component modality-specific stimuli). Similarly, when a cross-modal stimulus served as the competitor, it exerted considerably more depression than did its individual component stimuli, in some cases producing more depression than predicted by their linear sum. These findings suggest that multisensory integration can help resolve competition among multiple targets by enhancing orientation to the location of cross-modal events while simultaneously suppressing orientation to events at alternate locations. PMID:21957224

  9. Altered Auditory and Multisensory Temporal Processing in Autism Spectrum Disorders

    PubMed Central

    Kwakye, Leslie D.; Foss-Feig, Jennifer H.; Cascio, Carissa J.; Stone, Wendy L.; Wallace, Mark T.

    2011-01-01

    Autism spectrum disorders (ASD) are characterized by deficits in social reciprocity and communication, as well as by repetitive behaviors and restricted interests. Unusual responses to sensory input and disruptions in the processing of both unisensory and multisensory stimuli also have been reported frequently. However, the specific aspects of sensory processing that are disrupted in ASD have yet to be fully elucidated. Recent published work has shown that children with ASD can integrate low-level audiovisual stimuli, but do so over an extended range of time when compared with typically developing (TD) children. However, the possible contributions of altered unisensory temporal processes to the demonstrated changes in multisensory function are yet unknown. In the current study, unisensory temporal acuity was measured by determining individual thresholds on visual and auditory temporal order judgment (TOJ) tasks, and multisensory temporal function was assessed through a cross-modal version of the TOJ task. Whereas no differences in thresholds for the visual TOJ task were seen between children with ASD and TD, thresholds were higher in ASD on the auditory TOJ task, providing preliminary evidence for impairment in auditory temporal processing. On the multisensory TOJ task, children with ASD showed performance improvements over a wider range of temporal intervals than TD children, reinforcing prior work showing an extended temporal window of multisensory integration in ASD. These findings contribute to a better understanding of basic sensory processing differences, which may be critical for understanding more complex social and cognitive deficits in ASD, and ultimately may contribute to more effective diagnostic and interventional strategies. PMID:21258617

  10. Affect differentially modulates brain activation in uni- and multisensory body-voice perception.

    PubMed

    Jessen, Sarah; Kotz, Sonja A

    2015-01-01

    Emotion perception naturally entails multisensory integration. It is also assumed that multisensory emotion perception is characterized by enhanced activation of brain areas implicated in multisensory integration, such as the superior temporal gyrus and sulcus (STG/STS). However, most previous studies have employed designs and stimuli that preclude other forms of multisensory interaction, such as crossmodal prediction, leaving open the question of whether classical integration is the only relevant process in multisensory emotion perception. Here, we used video clips containing emotional and neutral body and vocal expressions to investigate the role of crossmodal prediction in multisensory emotion perception. While emotional multisensory expressions increased activation in the bilateral fusiform gyrus (FFG), neutral expressions compared to emotional ones enhanced activation in the bilateral middle temporal gyrus (MTG) and posterior STS. Hence, while neutral stimuli activate classical multisensory areas, emotional stimuli invoke areas linked to unisensory visual processing. Emotional stimuli may therefore trigger a prediction of upcoming auditory information based on prior visual information. Such prediction may be stronger for highly salient emotional compared to less salient neutral information. Therefore, we suggest that multisensory emotion perception involves at least two distinct mechanisms: classical multisensory integration, as shown for neutral expressions, and crossmodal prediction, as evident for emotional expressions.

  11. Multifunctional Antenna Techniques

    DTIC Science & Technology

    2015-11-25

    Report documentation fragment; only keywords and citation information are recoverable. Keywords: multifunctional antennas, reconfigurable antennas, electromagnetics. Associated publications include an article in the Journal of Electromagnetic Analysis and Applications (2013): 223, doi: 10.4236/jemaa.2013.55036, and Teng-Kai Chen and Gregory H. Huff, "Transmission line analysis of the Archimedean spiral antenna in free space," Journal of Electromagnetic Waves and Applications (2014): 1175.

  12. Multifunctional Materials and Structures

    DTIC Science & Technology

    2003-07-01

    Report documentation and briefing fragment. Recoverable content: the proposed solution is the "LIVE" Ship Concept, a lightweight, high-performance multifunctional composite structure addressing power and affordability needs, presented at an ONR review (University of Delaware, 1 July 2003) in connection with the Navy DD(X) program; see www.onr.navy.mil/sci_tech/grandc.htm.

  13. Adaptive multifunctional composites

    NASA Astrophysics Data System (ADS)

    Wang, Ya; Inman, Daniel J.

    2013-05-01

    The adaptive multifunctional composite structure studied here addresses two issues that remain for the lightweight structural composites required by many engineering applications. The first is to add functionality to multifunctional composites, and the second is to provide adaptive damping in structures over a wide range of frequencies and temperatures. Because of its potential for practical payoffs, passive structural damping can find wide application through the use of high-damping viscoelastic polymers or elastomers. However, passive damping based on these materials fails at certain temperatures and in certain frequency ranges. The extreme environments often encountered by engineering systems involve high temperatures, which is exactly where damping levels in structures drop, causing unacceptable vibrations. In addition, damping levels also fall off as loading frequencies decrease, and many loads experienced by large structures are low frequency. The proposed research extends the range over which damping is effective by addressing the temperature and frequency dependence of material damping with a multifunctional composite system containing an active element. Previous research yielded a finite element model of linear viscoelastic material and structural behavior that captures the characteristic frequency-dependent behavior; continuing research has addressed the accommodation of temperature dependence and the examination of the new concept of "electronic damping" or "e-damping". The resulting modeling approach is verified experimentally.

  14. Unisensory processing and multisensory integration in schizophrenia: a high-density electrical mapping study.

    PubMed

    Stone, David B; Urrea, Laura J; Aine, Cheryl J; Bustillo, Juan R; Clark, Vincent P; Stephen, Julia M

    2011-10-01

    In real-world settings, information from multiple sensory modalities is combined to form a complete, behaviorally salient percept - a process known as multisensory integration. While deficits in auditory and visual processing are often observed in schizophrenia, little is known about how multisensory integration is affected by the disorder. The present study examined auditory, visual, and combined audio-visual processing in schizophrenia patients using high-density electrical mapping. An ecologically relevant task was used to compare unisensory and multisensory evoked potentials from schizophrenia patients to potentials from healthy normal volunteers. Analysis of unisensory responses revealed a large decrease in the N100 component of the auditory-evoked potential, as well as early differences in the visual-evoked components in the schizophrenia group. Differences in early evoked responses to multisensory stimuli were also detected. Multisensory facilitation was assessed by comparing the sum of auditory and visual evoked responses to the audio-visual evoked response. Schizophrenia patients showed a significantly greater absolute magnitude response to audio-visual stimuli than to summed unisensory stimuli when compared to healthy volunteers, indicating significantly greater multisensory facilitation in the patient group. Behavioral responses also indicated increased facilitation from multisensory stimuli. The results represent the first report of increased multisensory facilitation in schizophrenia and suggest that, although unisensory deficits are present, compensatory mechanisms may exist under certain conditions that permit improved multisensory integration in individuals afflicted with the disorder.
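
    The facilitation measure described above follows the common additive (A + V versus AV) comparison for evoked potentials. Below is a minimal, generic sketch of that comparison on simulated evoked-response arrays; the trial counts, time base, and amplitudes are illustrative assumptions rather than values from the study.

```python
import numpy as np

rng = np.random.default_rng(4)

n_trials, n_times = 100, 300  # e.g. 300 samples spanning -100 to 500 ms

def simulated_erp(amplitude):
    """Trial-averaged evoked response: a noisy Gaussian-shaped deflection."""
    t = np.arange(n_times)
    component = amplitude * np.exp(-0.5 * ((t - 150) / 20.0) ** 2)
    trials = component + rng.normal(0, 1.0, (n_trials, n_times))
    return trials.mean(axis=0)

erp_auditory = simulated_erp(3.0)
erp_visual = simulated_erp(2.5)
erp_audiovisual = simulated_erp(6.5)  # larger than the unisensory sum here

# Additive model: compare AV against the sum of the unisensory responses.
summed = erp_auditory + erp_visual
facilitation = np.abs(erp_audiovisual).max() - np.abs(summed).max()
print(f"Peak |AV| minus peak |A + V|: {facilitation:.2f} (arbitrary units)")
```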

  15. Multisensory brain mechanisms of bodily self-consciousness.

    PubMed

    Blanke, Olaf

    2012-07-18

    Recent research has linked bodily self-consciousness to the processing and integration of multisensory bodily signals in temporoparietal, premotor, posterior parietal and extrastriate cortices. Studies in which subjects receive ambiguous multisensory information about the location and appearance of their own body have shown that these brain areas reflect the conscious experience of identifying with the body (self-identification (also known as body-ownership)), the experience of where 'I' am in space (self-location) and the experience of the position from where 'I' perceive the world (first-person perspective). Along with phenomena of altered states of self-consciousness in neurological patients and electrophysiological data from non-human primates, these findings may form the basis for a neurobiological model of bodily self-consciousness.

  16. The Behavioral Relevance of Multisensory Neural Response Interactions

    PubMed Central

    Sperdin, Holger F.; Cappe, Céline; Murray, Micah M.

    2009-01-01

    Sensory information can interact to impact perception and behavior. Foods are appreciated according to their appearance, smell, taste and texture. Athletes and dancers combine visual, auditory, and somatosensory information to coordinate their movements. Under laboratory settings, detection and discrimination are likewise facilitated by multisensory signals. Research over the past several decades has shown that the requisite anatomy exists to support interactions between sensory systems in regions canonically designated as exclusively unisensory in their function and, more recently, that neural response interactions occur within these same regions, including even primary cortices and thalamic nuclei, at early post-stimulus latencies. Here, we review evidence concerning direct links between early, low-level neural response interactions and behavioral measures of multisensory integration. PMID:20582260

  17. Auditory, Somatosensory, and Multisensory Insular Cortex in the Rat

    PubMed Central

    Rodgers, Krista M.; Benison, Alexander M.; Klein, Andrea

    2008-01-01

    Compared with other areas of the forebrain, the function of insular cortex is poorly understood. This study examined the unisensory and multisensory function of the rat insula using high-resolution, whole-hemisphere, epipial evoked potential mapping. We found the posterior insula to contain distinct auditory and somatotopically organized somatosensory fields with an interposed and overlapping region capable of integrating these sensory modalities. Unisensory and multisensory responses were uninfluenced by complete lesioning of primary and secondary auditory and somatosensory cortices, suggesting a high degree of parallel afferent input from the thalamus. In light of the established connections of the posterior insula with the amygdala, we propose that integration of auditory and somatosensory modalities reported here may play a role in auditory fear conditioning. PMID:18424777

  18. Multisensory Information Processing For Enhanced Human Machine Symbiosis

    DTIC Science & Technology

    2015-08-02

    Fragmentary text; recoverable content: contextual information is often essential for understanding human activities but is frequently unavailable, and activity recognition typically requires behavior modeling and high-level reasoning. Cited references include Ghazanfar, A. A., & Schroeder, C. E., "Is neocortex essentially multisensory?", Trends in Cognitive Sciences, 10(6), 278-285 (2006).

  19. Social perception of others shapes one's own multisensory peripersonal space.

    PubMed

    Pellencin, Elisa; Paladino, Maria Paola; Herbelin, Bruno; Serino, Andrea

    2017-09-06

    The perception of our self is not restricted to our physical boundaries, but it extends beyond the body to incorporate the space where individual-environment interactions occur, i.e., the peripersonal space (PPS). PPS is generally conceived as a low-level multisensory-motor interface mediating hand-object interactions. Recent studies, however, showed that PPS representation is affected by higher-level cognitive factors. Here we asked whether the multisensory representation of PPS is influenced by high-level mechanisms implicated in social interactions, such as the social perception of others. To this aim, in Experiment 1, we developed and validated a new multisensory interaction task in mixed reality (i.e., the Social PPS task). This task allows measuring the boundaries of PPS between oneself and another person in a fully controlled, yet highly ecological, set-up. In Experiment 2, we used this task to measure how participants' PPS varied when facing another person. The social perception of this person was manipulated via a classic social psychology procedure, so that, in two conditions, she was perceived either as a moral or an immoral character. We found that PPS representation is sensitive to the social perception of the other, being more extended when participants were facing a moral than when facing an immoral person. This effect was specific to the social context, as no change in PPS was found if participants were facing an object instead of the person. Interestingly, the social manipulation also affected attitudes, identification, and willingness to interact with the other, as well as interpersonal distance. Together these findings show that social perception of others affects both the psychological representation of the others in relation to oneself and the multisensory representations of the space between oneself and the other, offering new insights about the role of social cognition in body representation.

  20. Attention modeled as information in learning multisensory integration.

    PubMed

    Bauer, Johannes; Magg, Sven; Wermter, Stefan

    2015-05-01

    Top-down cognitive processes affect the way bottom-up cross-sensory stimuli are integrated. In this paper, we therefore extend a successful previous neural network model of learning multisensory integration in the superior colliculus (SC) with top-down, attentional input and train it on different classes of cross-modal stimuli. The network not only learns to integrate cross-modal stimuli, but the model also reproduces neurons specializing in different combinations of modalities as well as behavioral and neurophysiological phenomena associated with spatial and feature-based attention. Importantly, we do not provide the model with any information about which input neurons are sensory and which are attentional. If the basic mechanisms of our model (self-organized learning of input statistics and divisive normalization) play a major role in the ontogenesis of the SC, then this work shows that these mechanisms suffice to explain a wide range of aspects of both bottom-up multisensory integration and the top-down influence on multisensory integration.
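
    A minimal sketch of the divisive normalization step named above, applied to a vector of model-unit activations. The exponent and semi-saturation constant are generic textbook choices, not parameters from the model in this record.

```python
import numpy as np

def divisive_normalization(drive, sigma=1.0, n=2.0):
    """Normalize each unit's drive by the pooled activity of the population."""
    drive = np.asarray(drive, dtype=float)
    return drive ** n / (sigma ** n + np.sum(drive ** n))

# Example: a multisensory unit receiving visual, auditory, and attentional input.
inputs = np.array([3.0, 2.0, 1.5])
print(divisive_normalization(inputs))  # responses scaled by total network drive
```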

  1. Perceived Object Stability Depends on Multisensory Estimates of Gravity

    PubMed Central

    Barnett-Cowan, Michael; Fleming, Roland W.; Singh, Manish; Bülthoff, Heinrich H.

    2011-01-01

    Background: How does the brain estimate object stability? Objects fall over when the gravity-projected centre-of-mass lies outside the point or area of support. To estimate an object's stability visually, the brain must integrate information across the shape and compare its orientation to gravity. When observers lie on their sides, gravity is perceived as tilted toward body orientation, consistent with a representation of gravity derived from multisensory information. We exploited this to test whether vestibular and kinesthetic information affect this visual task or whether the brain estimates object stability solely from visual information. Methodology/Principal Findings: In three body orientations, participants viewed images of objects close to a table edge. We measured the critical angle at which each object appeared equally likely to fall over or right itself. Perceived gravity was measured using the subjective visual vertical. The results show that the perceived critical angle was significantly biased in the same direction as the subjective visual vertical (i.e., towards the multisensory estimate of gravity). Conclusions/Significance: Our results rule out a general explanation that the brain depends solely on visual heuristics and assumptions about object stability. Instead, they suggest that multisensory estimates of gravity govern the perceived stability of objects, resulting in objects appearing more stable than they are when the head is tilted in the same direction in which they fall. PMID:21556363

  2. Sensorimotor synchronization with audio-visual stimuli: limited multisensory integration.

    PubMed

    Armstrong, Alan; Issartel, Johann

    2014-11-01

    Understanding how we synchronize our actions with stimuli from different sensory modalities plays a central role in helping to establish how we interact with our multisensory environment. Recent research has shown better performance with multisensory over unisensory stimuli; however, the types of stimuli used have mainly been auditory and tactile. The aim of this article was to expand our understanding of sensorimotor synchronization with multisensory audio-visual stimuli and to compare these findings to their individual unisensory counterparts. This research also aimed to assess the role of spatio-temporal structure for each sensory modality. The visual and/or auditory stimuli had either temporal or spatio-temporal information available and were presented to the participants in unimodal and bimodal conditions. Globally, performance was significantly better for the bimodal than for the unimodal conditions; however, this benefit was limited to only one of the bimodal conditions. In terms of the unimodal conditions, the level of synchronization with visual stimuli was better than with auditory stimuli, and while there was an observed benefit for the spatio-temporal compared to the temporal visual stimulus, this was not replicated for the auditory stimulus.

  3. Multisensory integration in children with Developmental Coordination Disorder.

    PubMed

    Coats, R O A; Britten, L; Utley, A; Astill, S L

    2015-10-01

    This study examines how multisensory stimuli affect the performance of children with Developmental Coordination Disorder (DCD) on a choice reaction time (CRT) task. Ten children with DCD, identified using the Movement Assessment Battery for Children-2, aged 7-10 years (4F, M=8 y 3 m, SD=17 m) and 10 typically developing peers (TDC) (5F, M=8 y 4 m, SD=17 m) reached to unimodal (auditory (AO), visual (VO)) and bimodal (audiovisual (AV)) stimuli at one of three target locations. A multisensory (AV) stimulus reduced RTs for both groups (p<0.001, η(2)=0.36). While the children with DCD had longer RTs in all conditions, the AV stimulus produced RTs in children with DCD (494 ms) that were equivalent to those produced by the TDC to the VO stimulus (493 ms). Movement Time (DCD=486 ms; TDC=434 ms) and Path Length (DCD=25.6 cm; TDC=24.2 cm) were longer in children with DCD compared to TDC, as expected (p<0.05). Only the TDC benefited from the AV information for movement control, as deceleration time of the dominant hand decreased when moving to an AV stimulus (p<0.05). Overall, the data show that children with DCD benefit from a bimodal stimulus when planning their movements, but not for movement control. Further research is required to understand whether this is a result of impaired multisensory integration.

  4. Verbal and novel multisensory associative learning in adults

    PubMed Central

    Crewther, Sheila G

    2013-01-01

    To date, few studies have focused on the behavioural differences between the learning of multisensory auditory-visual and intra-modal associations. More specifically, the relative benefits of novel auditory-visual and verbal-visual associations for learning have not been directly compared. In Experiment 1, 20 adult volunteers completed three paired-associate learning tasks: non-verbal novel auditory-visual (novel-AV), verbal-visual (verbal-AV; using pseudowords), and visual-visual (shape-VV). Participants were directed to make a motor response to matching novel and arbitrarily related stimulus pairs. Feedback was provided to facilitate trial-and-error learning. The results of Signal Detection Theory analyses suggested a multisensory enhancement of learning, with significantly higher discriminability measures (d-prime) in both the novel-AV and verbal-AV tasks than in the shape-VV task. Motor reaction times were also significantly faster during the verbal-AV task than during the non-verbal learning tasks. Experiment 2 (n = 12) used a forced-choice discrimination paradigm to assess whether a difference in unisensory stimulus discriminability could account for the learning trends in Experiment 1. Participants were significantly slower at discriminating unisensory pseudowords than the novel sounds and visual shapes, which was notable given that these stimuli produced superior learning. Together the findings suggest that verbal information has an added enhancing effect on multisensory associative learning in adults. PMID:24627770

  5. Multisensory object representation: insights from studies of vision and touch.

    PubMed

    Lacey, Simon; Sathian, K

    2011-01-01

    Behavioral studies show that the unisensory representations underlying within-modal visual and haptic object recognition are strikingly similar in terms of view- and size-sensitivity, and integration of structural and surface properties. However, the basis for these attributes differs in each modality, indicating that while these representations are functionally similar, they are not identical. Imaging studies reveal bisensory, visuo-haptic object selectivity, notably in the lateral occipital complex and the intraparietal sulcus, that suggests a shared representation of objects. Such a multisensory representation could underlie visuo-haptic cross-modal object recognition. In this chapter, we compare visual and haptic within-modal object recognition and trace a progression from functionally similar but separate unisensory representations to a shared multisensory representation underlying cross-modal object recognition as well as view-independence, regardless of modality. We outline, and provide evidence for, a model of multisensory object recognition in which representations are flexibly accessible via top-down or bottom-up processing, the choice of route being influenced by object familiarity and individual preference along the object-spatial continuum of mental imagery. Copyright © 2011 Elsevier B.V. All rights reserved.

  6. Video game players show more precise multisensory temporal processing abilities.

    PubMed

    Donohue, Sarah E; Woldorff, Marty G; Mitroff, Stephen R

    2010-05-01

    Recent research has demonstrated enhanced visual attention and visual perception in individuals with extensive experience playing action video games. These benefits manifest in several realms, but much remains unknown about the ways in which video game experience alters perception and cognition. In the present study, we examined whether video game players' benefits generalize beyond vision to multisensory processing by presenting auditory and visual stimuli within a short temporal window to video game players and non-video game players. Participants performed two discrimination tasks, both of which revealed benefits for video game players: In a simultaneity judgment task, video game players were better able to distinguish whether simple visual and auditory stimuli occurred at the same moment or slightly offset in time, and in a temporal-order judgment task, they revealed an enhanced ability to determine the temporal sequence of multisensory stimuli. These results suggest that people with extensive experience playing video games display benefits that extend beyond the visual modality to also impact multisensory processing.

  7. Perceived object stability depends on multisensory estimates of gravity.

    PubMed

    Barnett-Cowan, Michael; Fleming, Roland W; Singh, Manish; Bülthoff, Heinrich H

    2011-04-27

    How does the brain estimate object stability? Objects fall over when the gravity-projected centre-of-mass lies outside the point or area of support. To estimate an object's stability visually, the brain must integrate information across the shape and compare its orientation to gravity. When observers lie on their sides, gravity is perceived as tilted toward body orientation, consistent with a representation of gravity derived from multisensory information. We exploited this to test whether vestibular and kinesthetic information affect this visual task or whether the brain estimates object stability solely from visual information. In three body orientations, participants viewed images of objects close to a table edge. We measured the critical angle at which each object appeared equally likely to fall over or right itself. Perceived gravity was measured using the subjective visual vertical. The results show that the perceived critical angle was significantly biased in the same direction as the subjective visual vertical (i.e., towards the multisensory estimate of gravity). Our results rule out a general explanation that the brain depends solely on visual heuristics and assumptions about object stability. Instead, they suggest that multisensory estimates of gravity govern the perceived stability of objects, resulting in objects appearing more stable than they are when the head is tilted in the same direction in which they fall.

  8. Neuroscience robotics to investigate multisensory integration and bodily awareness.

    PubMed

    Duenas, J; Chapuis, D; Pfeiffer, C; Martuzzi, R; Ionta, S; Blanke, O; Gassert, R

    2011-01-01

    Humans experience the self as localized within their body. This aspect of bodily self-consciousness can be experimentally manipulated by exposing individuals to conflicting multisensory input, or it can be abnormal following focal brain injury. Recent technological developments have helped to unravel some of the mechanisms underlying multisensory integration and self-location, but the neural underpinnings are still under investigation, and the manual application of stimuli has resulted in large variability that is difficult to control. This paper presents the development and evaluation of an MR-compatible stroking device capable of presenting moving tactile stimuli to both legs and the back of participants lying on a scanner bed while functional neuroimaging data are acquired. The platform consists of four independent stroking devices with a travel of 16-20 cm and a maximum stroking velocity of 15 cm/s, actuated by non-magnetic ultrasonic motors. Complemented with virtual reality, this setup provides a unique research platform for investigating multisensory integration and its effects on self-location under well-controlled experimental conditions. The MR-compatibility of the system was evaluated in both a 3 and a 7 Tesla scanner and showed negligible interference with brain imaging. In a preliminary study using a prototype device with only one tactile stimulator, fMRI data acquired from 12 healthy participants showed visuo-tactile synchrony-related and body-specific modulations of brain activity in bilateral temporoparietal cortex.

  9. Multisensory perceptual learning is dependent upon task difficulty.

    PubMed

    De Niear, Matthew A; Koo, Bonhwang; Wallace, Mark T

    2016-11-01

    There has been a growing interest in developing behavioral tasks to enhance temporal acuity, as recent findings have demonstrated changes in temporal processing in a number of clinical conditions. Prior research has demonstrated that perceptual training can enhance temporal acuity both within and across different sensory modalities. Although certain forms of unisensory perceptual learning have been shown to be dependent upon task difficulty, this relationship has not been explored for multisensory learning. The present study sought to determine the effects of task difficulty on multisensory perceptual learning. Prior to and following a single training session, participants completed a simultaneity judgment (SJ) task, which required them to judge whether a visual stimulus (flash) and an auditory stimulus (beep), presented either in synchrony or at various stimulus onset asynchronies (SOAs), occurred synchronously or asynchronously. During the training session, participants completed the same SJ task but received feedback regarding the accuracy of their responses. Participants were randomly assigned to one of three levels of difficulty during training: easy, moderate, and hard, which were distinguished by the SOAs used during training. We report that only the most difficult (i.e., hard) training protocol enhanced temporal acuity. We conclude that perceptual training protocols for enhancing multisensory temporal acuity may be optimized by employing audiovisual stimuli for which it is difficult to discriminate temporal synchrony from asynchrony.

  10. Multifunctional materials and composites

    DOEpatents

    Seo, Dong-Kyun; Jeon, Ki-Wan

    2017-08-22

    Forming multifunctional materials and composites thereof includes contacting a first material having a plurality of oxygen-containing functional groups with a chalcogenide compound, and initiating a chemical reaction between the first material and the chalcogenide compound, thereby replacing oxygen in some of the oxygen-containing functional groups with chalcogen from the chalcogenide compound to yield a second material having chalcogen-containing functional groups and oxygen-containing functional groups. The first material is a carbonaceous material or a macromolecular material. A product including the second material is collected and may be processed further to yield a modified product or a composite.

  11. Templated biomimetic multifunctional coatings

    NASA Astrophysics Data System (ADS)

    Sun, Chih-Hung; Gonzalez, Adriel; Linn, Nicholas C.; Jiang, Peng; Jiang, Bin

    2008-02-01

    We report a bioinspired templating technique for fabricating multifunctional optical coatings that mimic both the unique antireflective functionality of moth eyes and the superhydrophobicity of cicada wings. Subwavelength-structured fluoropolymer nipple arrays are created by a soft-lithography-like process. The utilization of fluoropolymers simultaneously enhances the antireflective performance and the hydrophobicity of the replicated films. The specular reflectivity matches the optical simulation using a thin-film multilayer model. The dependence of the resulting antireflective properties on the size and crystalline ordering of the replicated nipples has also been investigated by experiment and modeling. These biomimetic materials may find important technological application in self-cleaning antireflection coatings.

  12. Multifunctional reference electrode

    DOEpatents

    Redey, L.; Vissers, D.R.

    1981-12-30

    A multifunctional, low-mass reference electrode is described, consisting of a nickel tube; thermocouple means inside the nickel tube, electrically insulated from it, for measuring its temperature; a housing surrounding the nickel tube; and an electrolyte having a fixed sulfide ion activity between the housing and the outer surface of the nickel tube, forming the nickel/nickel sulfide/sulfide half-cell. An ion diffusion barrier is associated with the housing in contact with the electrolyte. Also disclosed is a cell using the reference electrode to measure characteristics of a working electrode.

  13. Multifunctional reference electrode

    DOEpatents

    Redey, Laszlo; Vissers, Donald R.

    1983-01-01

    A multifunctional, low-mass reference electrode is described, consisting of a nickel tube; thermocouple means inside the nickel tube, electrically insulated from it, for measuring its temperature; a housing surrounding the nickel tube; and an electrolyte having a fixed sulfide ion activity between the housing and the outer surface of the nickel tube, forming the nickel/nickel sulfide/sulfide half-cell. An ion diffusion barrier is associated with the housing in contact with the electrolyte. Also disclosed is a cell using the reference electrode to measure characteristics of a working electrode.

  14. Multisensory interactions in the depth plane in front and rear space: a review.

    PubMed

    Van der Stoep, N; Nijboer, T C W; Van der Stigchel, S; Spence, C

    2015-04-01

    In this review, we evaluate the neurophysiological, neuropsychological, and psychophysical evidence relevant to the claim that multisensory information is processed differently depending on the region of space in which it happens to be presented. We discuss how the majority of studies of multisensory interactions in the depth plane that have been conducted to date have focused on visuotactile and audiotactile interactions in frontal peripersonal space and underline the importance of such multisensory interactions in defining peripersonal space. Based on our review of studies of multisensory interactions in depth, we question the extent to which peri- and extra-personal space (both frontal and rear) are characterized by differences in multisensory interactions (as evidenced by multisensory stimuli producing a different behavioral outcome as compared to unisensory stimulation). In addition to providing an overview of studies of multisensory interactions in different regions of space, our goal in writing this review has been to demonstrate that the various kinds of multisensory interactions that have been documented may follow very similar organizing principles. Multisensory interactions in depth that involve tactile stimuli are constrained by the fact that such stimuli typically need to contact the skin surface. Therefore, depth-related preferences of multisensory interactions involving touch can largely be explained in terms of their spatial alignment in depth and their alignment with the body. As yet, no such depth-related asymmetry has been observed in the case of audiovisual interactions. We therefore suggest that the spatial boundary of peripersonal space and the enhanced audiotactile and visuotactile interactions that occur in peripersonal space can be explained in terms of the particular spatial alignment of stimuli from different modalities with the body and that they likely reflect the result of prior multisensory experience.

  15. Single-trial multisensory memories affect later auditory and visual object discrimination.

    PubMed

    Thelen, Antonia; Talsma, Durk; Murray, Micah M

    2015-05-01

    Multisensory memory traces established via single-trial exposures can impact subsequent visual object recognition. This impact appears to depend on the meaningfulness of the initial multisensory pairing, implying that multisensory exposures establish distinct object representations that are accessible during later unisensory processing. Multisensory contexts may be particularly effective in influencing auditory discrimination, given the purportedly inferior recognition memory in this sensory modality. The possibility of this generalization and the equivalence of effects when memory discrimination was performed in the visual vs. auditory modality were the focus of this study. First, we demonstrate that visual object discrimination is affected by the context of prior multisensory encounters, replicating and extending previous findings by controlling for the probability of multisensory contexts during initial as well as repeated object presentations. Second, we provide the first evidence that single-trial multisensory memories impact subsequent auditory object discrimination. Auditory object discrimination was enhanced when initial presentations entailed semantically congruent multisensory pairs and was impaired after semantically incongruent multisensory encounters, compared to sounds that had been encountered only in a unisensory manner. Third, the impact of single-trial multisensory memories upon unisensory object discrimination was greater when the task was performed in the auditory vs. visual modality. Fourth, there was no evidence of a correlation between the effects of past multisensory experiences on visual and auditory processing, suggestive of largely independent object processing mechanisms between modalities. We discuss these findings in terms of the conceptual short term memory (CSTM) model and predictive coding. Our results suggest differential recruitment and modulation of conceptual memory networks according to the sensory task at hand.

  16. Multifunctional sandwich composites

    NASA Astrophysics Data System (ADS)

    Vaidya, Uday K.

    2003-10-01

    Sandwich composites find increasing use as flexural load bearing lightweight sub-elements in air/space vehicles, rail/ground transportation, marine and sporting goods. The core in these applications is usually balsa wood, foam or honeycomb with laminated carbon or glass facesheets. A limitation of traditional sandwich configurations is that the space in the core becomes inaccessible once the facesheets are bonded in place. Significant multi-functional benefits can be obtained by making either the facesheets or the core space accessible. Multi-functionality generally refers to value added to the structure that enhances functions beyond traditional load bearing. Such functions may include sound/vibration damping and the ability to route wires or embed sensors. The present work reviews recent work done in enhancing the functionality of the core by use of the space in the core. The damage created by impact to sandwich constructions is always a limiting issue in design. In the present work, the low velocity impact (LVI) response of newer/multi-functional sandwich constructions has been studied. Concepts for increasing sandwich core functionality are reported.

  17. Cell Nucleus-Targeting Zwitterionic Carbon Dots

    PubMed Central

    Jung, Yun Kyung; Shin, Eeseul; Kim, Byeong-Su

    2015-01-01

    An innovative nucleus-targeting zwitterionic carbon dot (CD) vehicle has been developed for anticancer drug delivery and optical monitoring. The zwitterionic functional groups of the CDs, introduced by a simple one-step synthesis using β-alanine as a passivating and zwitterionic ligand, allow cytoplasmic uptake and subsequent nuclear translocation of the CDs. Moreover, multicolor fluorescence improves the accuracy of the CDs as an optical code. The CD-based drug delivery system, constructed by non-covalent grafting of doxorubicin, exhibits superior antitumor efficacy owing to enhanced nuclear delivery in vitro and tumor accumulation in vivo, resulting in highly effective tumor growth inhibition. Since the zwitterionic CDs are highly biocompatible and effectively translocated into the nucleus, they provide a compelling basis for a multifunctional nanoparticle that substantially enhances nuclear uptake of drugs and allows optical monitoring of translocation. PMID:26689549

  18. Cell Nucleus-Targeting Zwitterionic Carbon Dots.

    PubMed

    Jung, Yun Kyung; Shin, Eeseul; Kim, Byeong-Su

    2015-12-22

    An innovative nucleus-targeting zwitterionic carbon dot (CD) vehicle has been developed for anticancer drug delivery and optical monitoring. The zwitterionic functional groups of the CDs, introduced by a simple one-step synthesis using β-alanine as a passivating and zwitterionic ligand, allow cytoplasmic uptake and subsequent nuclear translocation of the CDs. Moreover, multicolor fluorescence improves the accuracy of the CDs as an optical code. The CD-based drug delivery system, constructed by non-covalent grafting of doxorubicin, exhibits superior antitumor efficacy owing to enhanced nuclear delivery in vitro and tumor accumulation in vivo, resulting in highly effective tumor growth inhibition. Since the zwitterionic CDs are highly biocompatible and effectively translocated into the nucleus, they provide a compelling basis for a multifunctional nanoparticle that substantially enhances nuclear uptake of drugs and allows optical monitoring of translocation.

  19. The Race that Precedes Coactivation: Development of Multisensory Facilitation in Children

    ERIC Educational Resources Information Center

    Barutchu, Ayla; Crewther, David P.; Crewther, Sheila G.

    2009-01-01

    Rationale: The facilitating effect of multisensory integration on motor responses in adults is much larger than predicted by race-models and is in accordance with the idea of coactivation. However, the development of multisensory facilitation of endogenously driven motor processes and its relationship to the development of complex cognitive skills…

  20. Binding of Sights and Sounds: Age-Related Changes in Multisensory Temporal Processing

    ERIC Educational Resources Information Center

    Hillock, Andrea R.; Powers, Albert R.; Wallace, Mark T.

    2011-01-01

    We live in a multisensory world and one of the challenges the brain is faced with is deciding what information belongs together. Our ability to make assumptions about the relatedness of multisensory stimuli is partly based on their temporal and spatial relationships. Stimuli that are proximal in time and space are likely to be bound together by…

  1. Individual Differences in the Multisensory Temporal Binding Window Predict Susceptibility to Audiovisual Illusions

    ERIC Educational Resources Information Center

    Stevenson, Ryan A.; Zemtsov, Raquel K.; Wallace, Mark T.

    2012-01-01

    Human multisensory systems are known to bind inputs from the different sensory modalities into a unified percept, a process that leads to measurable behavioral benefits. This integrative process can be observed through multisensory illusions, including the McGurk effect and the sound-induced flash illusion, both of which demonstrate the ability of…

  4. Hierarchical multifunctional nanocomposites

    NASA Astrophysics Data System (ADS)

    Ghasemi-Nejhad, Mehrdad N.

    2014-03-01

    Nanocomposites, including nano-materials such as nano-particles, nanoclays, nanofibers, nanotubes, and nanosheets, are of significant importance in the rapidly developing field of nanotechnology. Due to the nanometer size of these inclusions, their physicochemical characteristics differ significantly from those of micron-size and bulk materials. The field of nanocomposites involves the study of multiphase materials where at least one of the constituent phases has one dimension less than 100 nm. This is the range where the phenomena associated with atomic and molecular interaction strongly influence the macroscopic properties of materials. Since the building blocks of nanocomposites are at the nanoscale, they have an enormous surface area with numerous interfaces between the two intermixed phases. The special properties of the nanocomposite arise from the interaction of its phases at the interface and/or interphase regions. By contrast, in a conventional composite based on micrometer-sized fillers such as carbon fibers, the interfaces between the filler and matrix constituents have a much smaller surface-to-volume fraction of the bulk material, and hence influence the properties of the host structure to a much smaller extent. The optimum amount of nanomaterials in the nanocomposites depends on the filler size, shape, homogeneity of particle distribution, and the interfacial bonding properties between the fillers and matrix. The promise of nanocomposites lies in their multifunctionality, i.e., the possibility of realizing a unique combination of properties unachievable with traditional materials. The challenges in reaching this promise are tremendous. They include control over the distribution in size and dispersion of the nanosize constituents, and tailoring and understanding the role of interfaces between structurally or chemically dissimilar phases on bulk properties. While the properties of the matrix can be improved by the inclusion of nanomaterials, the

  5. High energy nucleus-nucleus collisions

    NASA Technical Reports Server (NTRS)

    Wosiek, B.

    1986-01-01

    Experimental results on high energy nucleus-nucleus interactions are presented. The data are discussed within the framework of standard superposition models and from the point of view of the possible formation of new states of matter in heavy ion collisions.

  6. Multi-functional windows

    NASA Astrophysics Data System (ADS)

    Nag, Nagendra; Goldman, Lee M.; Balasubramanian, Sreeram; Sastri, Suri

    2013-06-01

    The requirements for modern aircraft are driving the need for conformal windows for future sensor systems. However, limitations on optical systems and the physical properties of optically transparent materials currently limit the geometry of existing windows and window assemblies to faceted assemblies of flat windows held in weight bearing frames. Novel material systems will have to be developed which combine different materials (e.g. ductile metals with transparent ceramics) into structures that combine transparency with structural integrity. Surmet's demonstrated ability to produce novel transparent ceramic/metal structures will allow us to produce such structures in the types of conformal shapes required for future aircraft applications. Furthermore, the ability to incorporate transparencies into such structures also holds out the promise of creating multi-functional windows which provide a broad range of capabilities that might include RF antennas and de-icing in addition to transparency. Recent results in this area will be presented.

  7. Learning Multisensory Integration and Coordinate Transformation via Density Estimation

    PubMed Central

    Sabes, Philip N.

    2013-01-01

    Sensory processing in the brain includes three key operations: multisensory integration—the task of combining cues into a single estimate of a common underlying stimulus; coordinate transformations—the change of reference frame for a stimulus (e.g., retinotopic to body-centered) effected through knowledge about an intervening variable (e.g., gaze position); and the incorporation of prior information. Statistically optimal sensory processing requires that each of these operations maintains the correct posterior distribution over the stimulus. Elements of this optimality have been demonstrated in many behavioral contexts in humans and other animals, suggesting that the neural computations are indeed optimal. That the relationships between sensory modalities are complex and plastic further suggests that these computations are learned—but how? We provide a principled answer, by treating the acquisition of these mappings as a case of density estimation, a well-studied problem in machine learning and statistics, in which the distribution of observed data is modeled in terms of a set of fixed parameters and a set of latent variables. In our case, the observed data are unisensory-population activities, the fixed parameters are synaptic connections, and the latent variables are multisensory-population activities. In particular, we train a restricted Boltzmann machine with the biologically plausible contrastive-divergence rule to learn a range of neural computations not previously demonstrated under a single approach: optimal integration; encoding of priors; hierarchical integration of cues; learning when not to integrate; and coordinate transformation. The model makes testable predictions about the nature of multisensory representations. PMID:23637588
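
    The sketch below is a minimal, self-contained example of the core technique the abstract names: a binary restricted Boltzmann machine trained with one-step contrastive divergence (CD-1) on concatenated "unisensory" population vectors, with the hidden layer playing the role of the multisensory population. The toy Gaussian population code, layer sizes, and learning rate are invented and do not reproduce the full model of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def population_code(s, centers, width=0.02):
    """Gaussian population code for a scalar stimulus s (toy encoding)."""
    return np.exp(-((centers - s) ** 2) / width)

# Visible units = two concatenated unisensory populations encoding the same
# hypothetical stimulus; hidden units stand in for the multisensory population.
n_vis, n_hid, n_samples = 20, 10, 500
centers = np.linspace(0.0, 1.0, n_vis // 2)
stimuli = rng.uniform(0.0, 1.0, n_samples)
data = np.array([np.concatenate([population_code(s, centers),
                                 population_code(s, centers)]) for s in stimuli])
data = (data > 0.5).astype(float)            # binarize for a binary RBM

W = 0.01 * rng.standard_normal((n_vis, n_hid))
b_vis = np.zeros(n_vis)
b_hid = np.zeros(n_hid)
lr = 0.05

for epoch in range(20):
    for v0 in data:
        # Positive phase: sample hidden units given the data vector
        p_h0 = sigmoid(v0 @ W + b_hid)
        h0 = (rng.uniform(size=n_hid) < p_h0).astype(float)
        # Negative phase: one step of Gibbs sampling (CD-1)
        p_v1 = sigmoid(h0 @ W.T + b_vis)
        v1 = (rng.uniform(size=n_vis) < p_v1).astype(float)
        p_h1 = sigmoid(v1 @ W + b_hid)
        # Contrastive-divergence parameter updates
        W += lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))
        b_vis += lr * (v0 - v1)
        b_hid += lr * (p_h0 - p_h1)

print("trained weight matrix shape:", W.shape)
```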

  8. Behavioral measures of multisensory integration: bounds on bimodal detection probability.

    PubMed

    Colonius, Hans

    2015-01-01

    One way to test and quantify multisensory integration in a behavioral paradigm is to compare bimodal detection probability with bounds defined by some combination of the unimodal detection probabilities. Here we (1) improve on an upper bound recently suggested by Stevenson et al. (Brain Topogr 27(6):707-730, 2014), (2) present a lower bound, (3) interpret the bounds in terms of stochastic dependency between the detection probabilities, (4) discuss some additional assumptions required for the validity of any such bound, (5) suggest some potential applications to neurophysiologic measures, and point out some parallels to the 'race model inequality' for reaction times.
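
    For context, the following standard probability-summation benchmarks frame this kind of analysis; they hold under common modeling assumptions (e.g., that a bimodal stimulus is detected whenever either unisensory channel detects its component), and the improved bounds derived in the paper itself are not reproduced here:

```latex
% p_A, p_V: unimodal detection probabilities; p_{AV}: bimodal detection probability.
\max(p_A,\, p_V) \;\le\; p_{AV} \;\le\; \min\{1,\; p_A + p_V\},
\qquad
p_{AV}^{\text{indep}} \;=\; p_A + p_V - p_A\, p_V .
```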

  9. [Cross-modal stochastic resonance--a special multisensory integration].

    PubMed

    Liu, Jie; Ai, Leit; Lou, Kewet; Liu, Jun

    2010-08-01

    Cross-modal stochastic resonance is a ubiquitous phenomenon whereby a weak signal from one sensory pathway can be enhanced by noise from a different sensory pathway. It is a special form of multisensory integration (MI) that cannot be explained by the inverse-effectiveness rule. According to cross-modal stochastic resonance, the detectability of a signal is an inverted U-shaped function of noise intensity. In this paper, we review research on cross-modal stochastic resonance and put forward some possible explanations for it. This work suggests a new perspective on neural encoding and information processing in the brain.

  10. Protein Multifunctionality: Principles and Mechanisms

    PubMed Central

    Zaretsky, Joseph Z.; Wreschner, Daniel H.

    2008-01-01

    In this review, the nature of protein multifunctionality is analyzed. In the first part of the review, the principles of the structural/functional organization of proteins are discussed. In the second part, the main mechanisms involved in the development of multiple functions by a single gene product are analyzed. The last part presents a number of examples showing that multifunctionality is a basic feature of biologically active proteins. PMID:21566747

  11. Multisensory interactions in early evoked brain activity follow the principle of inverse effectiveness.

    PubMed

    Senkowski, Daniel; Saint-Amour, Dave; Höfle, Marion; Foxe, John J

    2011-06-15

    A major determinant of multisensory integration, derived from single-neuron studies in animals, is the principle of inverse effectiveness (IE), which describes the phenomenon whereby maximal multisensory response enhancements occur when the constituent unisensory stimuli are minimally effective in evoking responses. Human behavioral studies, which have shown that multisensory interactions are strongest when stimuli are low in intensity, are in agreement with the IE principle, but the neurophysiologic basis for this finding is unknown. In this high-density electroencephalography (EEG) study, we examined the effects of stimulus intensity on multisensory audiovisual processing in event-related potentials (ERPs) and on response time (RT) facilitation in the bisensory redundant target effect (RTE). The RTE refers to the finding that RTs are faster for bisensory redundant targets than for the respective unisensory targets. Participants were presented with semantically meaningless unisensory auditory, unisensory visual and bisensory audiovisual stimuli of low, middle and high intensity, and were instructed to make a speeded button response when a stimulus in either modality was presented. Behavioral data showed that the RTE exceeded predictions based on probability summation of unisensory RTs, indicative of integrative multisensory processing, but only for low-intensity stimuli. Paralleling this finding, multisensory interactions in short-latency (40-60 ms) ERPs with a left posterior and right anterior topography were found particularly for stimuli of low intensity. Our findings demonstrate that the IE principle is applicable to early multisensory processing in humans.
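
    The probability-summation benchmark mentioned above is usually assessed with Miller's race-model inequality, F_AV(t) <= F_A(t) + F_V(t). The sketch below shows one conventional way to evaluate it on response-time samples; the toy data and the quantile grid are invented for illustration and do not reproduce the analysis pipeline of the study.

```python
import numpy as np

def race_model_violation(rt_av, rt_a, rt_v, quantiles=np.linspace(0.05, 0.95, 19)):
    """Compare the bimodal RT CDF against the Miller race-model bound.

    Returns, per evaluation time point, the amount by which the observed
    audiovisual CDF exceeds F_A(t) + F_V(t); positive values indicate
    responses faster than probability summation of the unisensory RTs allows.
    """
    ts = np.quantile(np.concatenate([rt_av, rt_a, rt_v]), quantiles)
    cdf = lambda rts, t: np.mean(rts <= t)
    return np.array([cdf(rt_av, t) - min(1.0, cdf(rt_a, t) + cdf(rt_v, t)) for t in ts])

# Toy RT samples in milliseconds (made-up numbers, not data from the study)
rng = np.random.default_rng(1)
rt_a = rng.normal(320, 40, 200)
rt_v = rng.normal(330, 40, 200)
rt_av = rng.normal(270, 35, 200)
print(race_model_violation(rt_av, rt_a, rt_v).round(3))
```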

  12. Ludic content in multisensory stimulation environments: an exploratory study about practice in Portugal.

    PubMed

    Castelhano, Nuno; Silva, Fabiana; Rezende, Márcia; Roque, Licínio; Magalhães, Lívia

    2013-09-01

    This article aims to document the use of multisensory stimulation environments and the related perceptions concerning ludic content, play and computer-mediated ludic activity, from the perspective of professionals organizing and delivering therapeutic activities in these spaces with children with developmental disabilities in Portugal. Face-to-face open interviews with 12 professionals working in multisensory stimulation environments, selected by convenience criteria, were individually recorded, transcribed and submitted to content analysis. Three main themes emerged from the data: multisensory stimulation environments offer multiple possibilities for intervention, play is part of the intervention in multisensory environments, and the computer-mediated ludic experience is perceived as useful for intervention. The data suggest that multisensory stimulation environments are used as versatile spaces, both considered and explored by the interviewed professionals for their ludic potential. This fact may renew interest in multisensory environments, in particular for the area of play in Occupational Therapy, in which the use of the computer-mediated ludic experience is a recognized possibility. Limitations of this study are associated with the level of representativeness of the interviews in relation to the diverse universe of professionals using multisensory environments. The method for collecting data is also highly sensitive to the influence of the interviewer.

  13. Binding of sights and sounds: Age-related changes in multisensory temporal processing

    PubMed Central

    Hillock, Andrea R.; Powers, Albert R.; Wallace, Mark T.

    2011-01-01

    We live in a multisensory world and one of the challenges the brain is faced with is deciding what information belongs together. Our ability to make assumptions about the relatedness of multisensory stimuli is partly based on their temporal and spatial relationships. Stimuli that are proximal in time and space are likely to be bound together by the brain and ascribed to a common external event. Using this framework we can describe multisensory processes in the context of spatial and temporal filters or windows that compute the probability of the relatedness of stimuli. Whereas numerous studies have examined the characteristics of these multisensory filters in adults and discrepancies in window size have been reported between infants and adults, virtually nothing is known about multisensory temporal processing in childhood. To examine this, we compared the ability of 10- and 11-year-olds and adults to detect audiovisual temporal asynchrony. Findings revealed striking and asymmetric age-related differences. Whereas children were able to identify asynchrony as readily as adults when visual stimuli preceded auditory cues, significant group differences were identified at moderately long stimulus onset asynchronies (150–350 ms) where the auditory stimulus was first. Results suggest that changes in audiovisual temporal perception extend beyond the first decade of life. In addition to furthering our understanding of basic multisensory developmental processes, these findings have implications for disorders (e.g., autism, dyslexia) in which emerging evidence suggests alterations in multisensory temporal function. PMID:21134385

  14. Evidence for training-induced plasticity in multisensory brain structures: an MEG study.

    PubMed

    Paraskevopoulos, Evangelos; Kuchenbuch, Anja; Herholz, Sibylle C; Pantev, Christo

    2012-01-01

    Multisensory learning and the resulting neural brain plasticity have recently become a topic of renewed interest in human cognitive neuroscience. Music notation reading is an ideal stimulus to study multisensory learning, as it allows studying the integration of visual, auditory and sensorimotor information processing. The present study aimed to answer whether multisensory learning alters uni-sensory structures, interconnections of uni-sensory structures, or specific multisensory areas. In a short-term piano training procedure, musically naive subjects were trained to play tone sequences from visually presented patterns in a music notation-like system [Auditory-Visual-Somatosensory group (AVS)], while another group received audio-visual training only, which involved viewing the patterns and attentively listening to the recordings of the AVS training sessions [Auditory-Visual group (AV)]. Training-related changes in cortical networks were assessed by pre- and post-training magnetoencephalographic (MEG) recordings of an auditory, a visual and an integrated audio-visual mismatch negativity (MMN). The two groups (AVS and AV) were differently affected by the training. The results suggest that multisensory training alters the function of multisensory structures, rather than that of uni-sensory structures and their interconnections, and thus provide an answer to an important question posed by cognitive models of multisensory training.

  15. Effects of multisensory integration processes on response inhibition in adolescent autism spectrum disorder.

    PubMed

    Chmielewski, W X; Wolff, N; Mückschel, M; Roessner, V; Beste, C

    2016-10-01

    In everyday life it is often required to integrate multisensory input to successfully conduct response inhibition (RI) and thus major executive control processes. Both RI and multisensory processes have been suggested to be altered in autism spectrum disorder (ASD). It is, however, unclear which neurophysiological processes relate to changes in RI in ASD and to what extent these processes are affected by possible multisensory integration deficits in ASD. Combining high-density EEG recordings with source localization analyses, we examined a group of adolescent ASD patients (n = 20) and healthy controls (n = 20) using a novel RI task. Compared to controls, adolescents with ASD showed generally compromised RI processes. This aggravation of RI processes is modulated by the content of multisensory information. The neurophysiological data suggest that deficits in ASD emerge in attentional selection and resource allocation processes related to occipito-parietal and middle frontal regions. Most importantly, conflict monitoring subprocesses during RI were specifically modulated by the content of multisensory information in the superior frontal gyrus. RI processes are overstrained in adolescent ASD, especially when conflicting multisensory information has to be integrated to perform RI. It seems that the content of multisensory input, and its effects on cognitive control processes, is important to consider in ASD.

  16. Perceptual learning shapes multisensory causal inference via two distinct mechanisms.

    PubMed

    McGovern, David P; Roudaia, Eugenie; Newell, Fiona N; Roach, Neil W

    2016-04-19

    To accurately represent the environment, our brains must integrate sensory signals from a common source while segregating those from independent sources. A reasonable strategy for performing this task is to restrict integration to cues that coincide in space and time. However, because multisensory signals are subject to differential transmission and processing delays, the brain must retain a degree of tolerance for temporal discrepancies. Recent research suggests that the width of this 'temporal binding window' can be reduced through perceptual learning; however, little is known about the mechanisms underlying these experience-dependent effects. Here, in separate experiments, we measure the temporal and spatial binding windows of human participants before and after training on an audiovisual temporal discrimination task. We show that training leads to two distinct effects on multisensory integration in the form of (i) a specific narrowing of the temporal binding window that does not transfer to spatial binding and (ii) a general reduction in the magnitude of crossmodal interactions across all spatiotemporal disparities. These effects arise naturally from a Bayesian model of causal inference in which learning improves the precision of audiovisual timing estimation, whilst concomitantly decreasing the prior expectation that stimuli emanate from a common source.
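
    To make the modelling claim concrete, the sketch below implements a generic Bayesian causal-inference rule for temporal disparities: as timing noise shrinks, the posterior probability that two signals share a source falls off more steeply with disparity, narrowing the effective binding window. The parameter names, values, and Gaussian assumptions are illustrative only and are not taken from the paper.

```python
import numpy as np

def gauss_pdf(x, sigma):
    """Zero-mean Gaussian density, used for both likelihoods below."""
    return np.exp(-x**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)

def p_common_cause(delta_t, sigma_t, prior_common=0.5, spread=0.5):
    """Posterior probability that two signals share a source, given the
    measured audiovisual temporal disparity delta_t (seconds).

    sigma_t -- standard deviation of the observer's timing noise; perceptual
               learning is assumed to reduce this value.
    spread  -- standard deviation of true asynchronies under independent causes.
    """
    like_common = gauss_pdf(delta_t, np.sqrt(2.0) * sigma_t)
    like_indep = gauss_pdf(delta_t, np.sqrt(2.0 * sigma_t**2 + spread**2))
    num = like_common * prior_common
    return num / (num + like_indep * (1.0 - prior_common))

# Better timing precision (smaller sigma_t) makes the posterior fall off more
# steeply with disparity, i.e. a narrower temporal binding window.
for sigma_t in (0.15, 0.05):
    probs = [round(p_common_cause(dt, sigma_t), 2) for dt in (0.0, 0.1, 0.3)]
    print(f"sigma_t={sigma_t}: {probs}")
```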

  17. A perspective on multisensory integration and rapid perturbation responses.

    PubMed

    Cluff, Tyler; Crevecoeur, Frédéric; Scott, Stephen H

    2015-05-01

    In order to perform accurate movements, the nervous system must transform sensory feedback into motor commands that compensate for errors caused by motor variability and external disturbances. Recent studies focusing on the importance of sensory feedback in motor control have illustrated that the brain generates highly flexible responses to visual perturbations (hand-cursor or target jumps) or following mechanical loads applied to the limb. These parallel approaches have emphasized sophisticated, goal-directed feedback control, but also reveal that flexible perturbation responses are expressed at different latencies depending on which sensory system is engaged by the perturbation. Across studies, goal-directed visuomotor responses consistently emerge in muscle activity ∼100 ms after a perturbation, while mechanical perturbations evoke goal-directed muscle responses in as little as ∼60 ms (long-latency responses). We discuss the limitations of current models of multisensory integration in light of these asynchronous processing delays, and suggest that understanding how the brain performs real-time multisensory integration is an open question for future studies.

  18. Multisensory cueing for enhancing orientation information during flight.

    PubMed

    Albery, William B

    2007-05-01

    The U.S. Air Force still regards spatial disorientation (SD) and loss of situational awareness (SA) as major contributing factors in operational Class A aircraft mishaps ($1M in aircraft loss and/or pilot fatality). Air Force Safety Agency data show 71 Class A SD mishaps from 1991-2004 in both fixed and rotary-wing aircraft. These mishaps resulted in 62 fatalities and an aircraft cost of over $2.0B. These losses account for 21% of the USAF's Class A mishaps during that 14-yr period. Even non-mishap SD events negatively impact aircrew performance and reduce mission effectiveness. A multisensory system called the Spatial Orientation Retention Device (SORD) has been developed to enhance the aircraft attitude information presented to the pilot. SORD incorporates multisensory aids including helmet-mounted symbology and tactile and audio cues. SORD has been prototyped and demonstrated in the Air Force Research Laboratory at Wright-Patterson AFB, OH. The technology has now been transitioned to a Rotary Wing Brownout program. This paper discusses the development of SORD and a potential application, including an augmented cognition application. Unlike automatic ground collision avoidance systems, SORD does not take over the aircraft if a pre-set altitude is breached by the pilot; rather, SORD provides complementary attitude cues to the pilot via the tactile, audio, and visual systems that allow the pilot to continue flying through disorienting conditions.

  19. Cortical hierarchies perform Bayesian causal inference in multisensory perception.

    PubMed

    Rohe, Tim; Noppeney, Uta

    2015-02-01

    To form a veridical percept of the environment, the brain needs to integrate sensory signals from a common source but segregate those from independent sources. Thus, perception inherently relies on solving the "causal inference problem." Behaviorally, humans solve this problem optimally as predicted by Bayesian Causal Inference; yet, the underlying neural mechanisms are unexplored. Combining psychophysics, Bayesian modeling, functional magnetic resonance imaging (fMRI), and multivariate decoding in an audiovisual spatial localization task, we demonstrate that Bayesian Causal Inference is performed by a hierarchy of multisensory processes in the human brain. At the bottom of the hierarchy, in auditory and visual areas, location is represented under the assumption that the two signals are generated by independent sources (= segregation). At the next stage, in posterior intraparietal sulcus, location is estimated under the assumption that the two signals are from a common source (= forced fusion). Only at the top of the hierarchy, in anterior intraparietal sulcus, is the uncertainty about the causal structure of the world taken into account and are sensory signals combined as predicted by Bayesian Causal Inference. Characterizing the computational operations of signal interactions reveals the hierarchical nature of multisensory perception in human neocortex. It unravels how the brain accomplishes Bayesian Causal Inference, a statistical computation fundamental for perception and cognition. Our results demonstrate how the brain combines information in the face of uncertainty about the underlying causal structure of the world.
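
    For orientation, the top-of-hierarchy estimate described above is commonly written as a model average over the two causal structures; the standard formulation below (for the auditory spatial estimate, with notation chosen here rather than taken from the paper) weights the forced-fusion and segregation estimates by the posterior probability of a common cause:

```latex
% x_A, x_V: auditory and visual measurements; sigma_A, sigma_V: sensory noise;
% mu_P, sigma_P: spatial prior; C = 1 denotes a common cause.
\hat{S}_A \;=\; P(C{=}1 \mid x_A, x_V)\,\hat{S}_{\mathrm{fused}}
          \;+\; \bigl(1 - P(C{=}1 \mid x_A, x_V)\bigr)\,\hat{S}_{A,\mathrm{seg}},
\qquad
\hat{S}_{\mathrm{fused}} \;=\;
\frac{x_A/\sigma_A^{2} + x_V/\sigma_V^{2} + \mu_P/\sigma_P^{2}}
     {1/\sigma_A^{2} + 1/\sigma_V^{2} + 1/\sigma_P^{2}} .
```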

  20. The role of expectation in multisensory body representation - neural evidence.

    PubMed

    Ferri, Francesca; Ambrosini, Ettore; Pinti, Paola; Merla, Arcangelo; Costantini, Marcello

    2017-08-01

    Sensory events contribute to body ownership, the feeling that the body belongs to me. However, the encoding of sensory events is not only reactive but also proactive, in that our brain generates predictions about forthcoming stimuli. In previous studies, we have shown that prediction of sensory events is a sufficient condition to induce the sense of body ownership. In this study, we investigated the underlying neural mechanisms. Participants were seated with their right arm resting upon a table just below another, smaller table. Hence, the real hand was hidden from the participant's view, and a life-sized rubber model of a right hand was placed on the small table in front of them. Participants observed a wooden plank approaching, without touching, the rubber hand. We measured the phenomenology of the illusion by means of a questionnaire. Neural activity was recorded by means of functional near-infrared spectroscopy (fNIRS). Results showed higher activation of multisensory parietal cortices in the rubber hand illusion induced by touch expectation. Furthermore, this activity was correlated with the subjective feeling of owning the rubber hand. Our results enrich current models of body ownership, suggesting that our multisensory brain regions generate predictions about what could be one's body and what could not. This finding might have interesting implications in all those cases in which body representation is altered, such as anorexia, bulimia nervosa and obesity. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  1. The multisensory body revealed through its cast shadows

    PubMed Central

    Pavani, Francesco; Galfano, Giovanni

    2015-01-01

    One key issue when conceiving the body as a multisensory object is how the cognitive system integrates visible instances of the self and other bodies with one’s own somatosensory processing, to achieve self-recognition and body ownership. Recent research has strongly suggested that shadows cast by our own body have a special status for cognitive processing, directing attention to the body in a fast and highly specific manner. The aim of the present article is to review the most recent scientific contributions addressing how body shadows affect both sensory/perceptual and attentional processes. The review examines three main points: (1) body shadows as a special window to investigate the construction of multisensory body perception; (2) experimental paradigms and related findings; (3) open questions and future trajectories. The reviewed literature suggests that shadows cast by one’s own body promote binding between personal and extrapersonal space and elicit automatic orienting of attention toward the body-part casting the shadow. Future research should address whether the effects exerted by body shadows are similar to those observed when observers are exposed to other visual instances of their body. The results will further clarify the processes underlying the merging of vision and somatosensation when creating body representations. PMID:26042079

  2. Crocodylians evolved scattered multi-sensory micro-organs

    PubMed Central

    2013-01-01

    Background During their evolution towards a complete life cycle on land, stem reptiles developed both an impermeable multi-layered keratinized epidermis and skin appendages (scales) providing mechanical, thermal, and chemical protection. Previous studies have demonstrated that, despite the presence of a particularly armored skin, crocodylians have exquisite mechanosensory abilities thanks to the presence of small integumentary sensory organs (ISOs) distributed on postcranial and/or cranial scales. Results Here, we analyze and compare the structure, innervation, embryonic morphogenesis and sensory functions of postcranial, cranial, and lingual sensory organs of the Nile crocodile (Crocodylus niloticus) and the spectacled caiman (Caiman crocodilus). Our molecular analyses indicate that sensory neurons of crocodylian ISOs express a large repertoire of transduction channels involved in mechano-, thermo-, and chemosensory functions, and our electrophysiological analyses confirm that each ISO exhibits a combined sensitivity to mechanical, thermal and pH stimuli (but not hyper-osmotic salinity), making them remarkable multi-sensorial micro-organs with no equivalent in the sensory systems of other vertebrate lineages. We also show that ISOs all exhibit similar morphologies and modes of development, despite forming at different stages of scale morphogenesis across the body. Conclusions The ancestral vertebrate diffused sensory system of the skin was transformed in the crocodylian lineages into an array of discrete multi-sensory micro-organs innervated by multiple pools of sensory neurons. This discretization of skin sensory expression sites is unique among vertebrates and allowed crocodylians to develop a highly-armored, but very sensitive, skin. PMID:23819918

  3. A cellular mechanism for inverse effectiveness in multisensory integration

    PubMed Central

    Truszkowski, Torrey LS; Carrillo, Oscar A; Bleier, Julia; Ramirez-Vizcarrondo, Carolina M; Felch, Daniel L; McQuillan, Molly; Truszkowski, Christopher P; Khakhalin, Arseny S; Aizenman, Carlos D

    2017-01-01

    To build a coherent view of the external world, an organism needs to integrate multiple types of sensory information from different sources, a process known as multisensory integration (MSI). Previously, we showed that the temporal dependence of MSI in the optic tectum of Xenopus laevis tadpoles is mediated by the network dynamics of the recruitment of local inhibition by sensory input (Felch et al., 2016). This was one of the first cellular-level mechanisms described for MSI. Here, we expand this cellular-level view of MSI by focusing on the principle of inverse effectiveness, another central feature of MSI stating that the amount of multisensory enhancement observed depends inversely on the size of unisensory responses. We show that non-linear summation of crossmodal synaptic responses, mediated by NMDA-type glutamate receptor (NMDAR) activation, forms the cellular basis for inverse effectiveness, at both the cellular and behavioral levels. DOI: http://dx.doi.org/10.7554/eLife.25392.001 PMID:28315524
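
    As a quick illustration of how inverse effectiveness is usually quantified (the conventional enhancement index from the MSI literature, not a calculation taken from this paper), the sketch below shows that supra-additive summation of weak unisensory responses yields a far larger percentage enhancement than near-additive summation of strong ones; all response values are invented.

```python
def multisensory_enhancement(crossmodal, best_unisensory):
    """Percent multisensory enhancement relative to the best unisensory response."""
    return 100.0 * (crossmodal - best_unisensory) / best_unisensory

# Weak unisensory responses with supra-additive summation -> large enhancement
print(multisensory_enhancement(crossmodal=6.0, best_unisensory=2.0))    # 200.0
# Strong unisensory responses with near-additive summation -> small enhancement
print(multisensory_enhancement(crossmodal=22.0, best_unisensory=20.0))  # 10.0
```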

  4. Behavioural effects of long-term multi-sensory stimulation.

    PubMed

    Martin, N T; Gaffan, E A; Williams, T

    1998-02-01

    Regular access to a multi-sensory environment (MSE or Snoezelen room) was compared with access to a non-complex sensory environment for individuals with learning disabilities. We also tested the prediction that those individuals whose challenging behaviour was maintained by sensory consequences would benefit most from exposure to the MSE. The conditions were compared over 16-week periods using a double crossover design and were matched for social contact and attention from the enabler. Participants were randomly assigned to orders of treatments. Participants were 27 adults with severe/profound learning disabilities who exhibited challenging behaviour. Behaviour was assessed before and after each treatment phase using both direct observation and standardized assessments (the Functional Performance Record and the Problem Behaviour Inventory). The behavioural observations formed the basis of a functional analysis of each individual's challenging behaviour. Some participants became calmer and more relaxed while in the MSE; however, the objective measures of behaviour outside the treatment settings revealed no difference between the MSE and control conditions. Challenging behaviour maintained by sensory consequences showed no greater responsivity to the MSE than to the control condition. The multi-sensory environment had no effects beyond those that could be ascribed to the social interaction between participant and enabler. Anecdotal evidence of favourable responses within the MSE itself could not be confirmed outside that environment.

  5. Multifunctions of bounded variation

    NASA Astrophysics Data System (ADS)

    Vinter, R. B.

    2016-02-01

    Consider control systems described by a differential equation with a control term or, more generally, by a differential inclusion with velocity set F(t, x). Certain properties of state trajectories can be derived when it is assumed that F(t, x) is merely measurable w.r.t. the time variable t. But sometimes a refined analysis requires the imposition of stronger hypotheses regarding the time dependence. Stronger forms of necessary conditions for minimizing state trajectories can be derived, for example, when F(t, x) is Lipschitz continuous w.r.t. time. It has recently become apparent that significant additional properties of state trajectories can still be derived when the Lipschitz continuity hypothesis is replaced by the weaker requirement that F(t, x) has bounded variation w.r.t. time. This paper introduces a new concept of multifunctions F(t, x) that have bounded variation w.r.t. time near a given state trajectory, of special relevance to control. We provide an application to sensitivity analysis.

  6. Multifunctional periodic cellular metals.

    PubMed

    Wadley, Haydn N G

    2006-01-15

    Periodic cellular metals with honeycomb and corrugated topologies are widely used for the cores of lightweight sandwich panel structures. Honeycombs have closed cell pores and are well suited for thermal protection while also providing efficient load support. Corrugated core structures provide less efficient and highly anisotropic load support, but enable cross flow heat exchange opportunities because their pores are continuous in one direction. Recent advances in topology design and fabrication have led to the emergence of lattice truss structures with open cell structures. These three classes of periodic cellular metals can now be fabricated from a wide variety of structural alloys. Many topologies are found to provide adequate stiffness and strength for structural load support when configured as the cores of sandwich panels. Sandwich panels with core relative densities of 2-10% and cell sizes in the millimetre range are being assessed for use as multifunctional structures. The open, three-dimensional interconnected pore networks of lattice truss topologies provide opportunities for simultaneously supporting high stresses while also enabling cross flow heat exchange. These highly compressible structures also provide opportunities for the mitigation of high intensity dynamic loads created by impacts and shock waves in air or water. By filling the voids with polymers and hard ceramics, these structures have also been found to offer significant resistance to penetration by projectiles.

  7. Multifunctional Mitochondrial AAA Proteases

    PubMed Central

    Glynn, Steven E.

    2017-01-01

    Mitochondria perform numerous functions necessary for the survival of eukaryotic cells. These activities are coordinated by a diverse complement of proteins encoded in both the nuclear and mitochondrial genomes that must be properly organized and maintained. Misregulation of mitochondrial proteostasis impairs organellar function and can result in the development of severe human diseases. ATP-driven AAA+ proteins play crucial roles in preserving mitochondrial activity by removing and remodeling protein molecules in accordance with the needs of the cell. Two mitochondrial AAA proteases, i-AAA and m-AAA, are anchored to either face of the mitochondrial inner membrane, where they engage and process an array of substrates to impact protein biogenesis, quality control, and the regulation of key metabolic pathways. The functionality of these proteases is extended through multiple substrate-dependent modes of action, including complete degradation, partial processing, or dislocation from the membrane without proteolysis. This review discusses recent advances made toward elucidating the mechanisms of substrate recognition, handling, and degradation that allow these versatile proteases to control diverse activities in this multifunctional organelle. PMID:28589125

  8. A study on the effect of multisensory stimulation in behaving rats.

    PubMed

    Semprini, Marianna; Boi, Fabio; Tucci, Valter; Vato, Alessandro

    2016-08-01

    This study explored the psychophysical effects of intracortical microstimulation (ICMS) coupled to auditory stimulation during a behavioral detection task in rats. ICMS directed to the sensory areas of the cortex can be instrumental in facilitating operant conditioning behavior. Moreover, multisensory stimulation promotes learning by enabling the subject to access multiple information channels. However, the extent to which multisensory information can be used as a cue to make decisions is not yet fully understood. This study explored the parameters of multisensory stimulation delivered to behaving rats in an operant conditioning task. Preliminary data indicate that animal decisions can be shaped by changing the stimulation parameters online.

  9. Sounds and beyond: multisensory and other non-auditory signals in the inferior colliculus

    PubMed Central

    Gruters, Kurtis G.; Groh, Jennifer M.

    2012-01-01

    The inferior colliculus (IC) is a major processing center situated mid-way along both the ascending and descending auditory pathways of the brain stem. Although it is fundamentally an auditory area, the IC also receives anatomical input from non-auditory sources. Neurophysiological studies corroborate that non-auditory stimuli can modulate auditory processing in the IC and even elicit responses independent of coincident auditory stimulation. In this article, we review anatomical and physiological evidence for multisensory and other non-auditory processing in the IC. Specifically, the contributions of signals related to vision, eye movements and position, somatosensation, and behavioral context to neural activity in the IC will be described. These signals are potentially important for localizing sound sources, attending to salient stimuli, distinguishing environmental from self-generated sounds, and perceiving and generating communication sounds. They suggest that the IC should be thought of as a node in a highly interconnected sensory, motor, and cognitive network dedicated to synthesizing a higher-order auditory percept rather than simply reporting patterns of air pressure detected by the cochlea. We highlight some of the potential pitfalls that can arise from experimental manipulations that may disrupt the normal function of this network, such as the use of anesthesia or the severing of connections from cortical structures that project to the IC. Finally, we note that the presence of these signals in the IC has implications for our understanding not just of the IC but also of the multitude of other regions within and beyond the auditory system that are dependent on signals that pass through the IC. Whatever the IC “hears” would seem to be passed both “upward” to thalamus and thence to auditory cortex and beyond, as well as “downward” via centrifugal connections to earlier areas of the auditory pathway such as the cochlear nucleus. PMID:23248584

  10. Multisensory stimulation for people with dementia: a review of the literature.

    PubMed

    Sánchez, Alba; Millán-Calenti, José C; Lorenzo-López, Laura; Maseda, Ana

    2013-02-01

    The use of multisensory stimulation in people with dementia has become increasingly popular over the last few decades. The aim of this review is to analyze the therapeutic effectiveness of multisensory stimulation in people with dementia. We searched the Medline and Web of Science databases for all research published from 1990 to 2012 that used multisensory stimulation techniques in people with dementia. The review of the 18 articles that fulfilled the inclusion/exclusion criteria seems to provide evidence that multisensory stimulation environments produce immediate positive effects on the behavior and mood of people with dementia. On this basis, we think multisensory stimulation can be a useful nonpharmacological intervention for neuropsychological symptoms, although more methodologically reliable protocols would be needed in order to establish its long-term effectiveness.

  11. Suppression of multisensory integration by modality-specific attention in aging

    PubMed Central

    Hugenschmidt, Christina E.; Mozolic, Jennifer L.; Laurienti, Paul J.

    2009-01-01

    Previous research demonstrates that modality-specific selective attention attenuates multisensory integration in healthy young adults. Additionally, older adults evidence enhanced multisensory integration compared to younger adults. We hypothesized that these increases were due to changes in top-down suppression, and therefore older adults would demonstrate multisensory integration while selectively attending. Performance of older and younger adults was compared on a cued discrimination task. Older adults had greater multisensory integration than younger adults in all conditions, yet were still able to reduce integration using selective attention. This suggests that attentional processes are intact in older adults, but are unable to compensate for an overall increase in the amount of sensory processing during divided attention. PMID:19218871

  12. Multisensory-Based Rehabilitation Approach: Translational Insights from Animal Models to Early Intervention.

    PubMed

    Purpura, Giulia; Cioni, Giovanni; Tinelli, Francesca

    2017-01-01

    Multisensory processes permit the combination of several inputs coming from different sensory systems, allowing for a coherent representation of biological events and facilitating adaptation to the environment. For these reasons, their application in neurological and neuropsychological rehabilitation has expanded over the last decades. Recent studies on animal and human models have indicated that, on the one hand, multisensory integration matures gradually during postnatal life and its development is closely linked to environment and experience, and, on the other hand, that modality-specific information does not seem to benefit from redundancy across multiple sense modalities and is more readily perceived in unimodal than in multimodal stimulation. In this review, the development of multisensory processes is analyzed, highlighting the clinical effects of its manipulation for the rehabilitation of sensory disorders in animal and human models. In addition, new methods of early intervention based on a multisensory rehabilitation approach and their applications to different infant populations at risk of neurodevelopmental disabilities are discussed.

  13. Enhanced multifunctional paint for detection of radiation

    DOEpatents

    Farmer, Joseph C.; Moses, Edward Ira; Rubenchik, Alexander M.

    2017-03-07

    An enhanced multifunctional paint apparatus, systems, and methods for detecting radiation on a surface include providing scintillation particles; providing an enhanced neutron-absorptive material; providing a binder; combining the scintillation particles, the enhanced neutron-absorptive material, and the binder to create a multifunctional paint; applying the multifunctional paint to the surface; and monitoring the surface to detect radiation.

  14. Terahertz Nanoscience of Multifunctional Materials: Atomistic Exploration

    DTIC Science & Technology

    2014-03-28

    Approved for Public Release; Distribution Unlimited. Final report on the project "Terahertz Nanoscience of Multifunctional Materials: Atomistic Exploration" (PI: Inna Ponomareva).

  15. Multifunctional pattern-generating circuits.

    PubMed

    Briggman, K L; Kristan, W B

    2008-01-01

    The ability of distinct anatomical circuits to generate multiple behavioral patterns is widespread among vertebrate and invertebrate species. These multifunctional neuronal circuits are the result of multistable neural dynamics and modular organization. The evidence suggests multifunctional circuits can be classified by distinct architectures, yet the activity patterns of individual neurons involved in more than one behavior can vary dramatically. Several mechanisms, including sensory input, the parallel activity of projection neurons, neuromodulation, and biomechanics, are responsible for the switching between patterns. Recent advances in both analytical and experimental tools have aided the study of these complex circuits.

  16. Altered neural oscillations during multisensory integration in adolescents with fetal alcohol spectrum disorder.

    PubMed

    Bolaños, Alfredo D; Coffman, Brian A; Candelaria-Cook, Felicha T; Kodituwakku, Piyadasa; Stephen, Julia M

    2017-09-25

    Children with fetal alcohol spectrum disorder (FASD), who were exposed to alcohol in utero, display a broad range of sensory, cognitive, and behavioral deficits, which are broadly theorized to be rooted in altered brain function and structure. Based on the role of neural oscillations in multisensory integration from past studies, we hypothesized that adolescents with FASD would show a decrease in oscillatory power during event-related gamma oscillatory activity (30-100 Hz), when compared to typically-developing healthy controls (HC), and that such a decrease in oscillatory power would predict behavioral performance. We measured sensory neurophysiology using magnetoencephalography (MEG) during passive auditory (A), somatosensory (S), and multisensory (synchronous A/S) stimulation in 19 adolescents (12-21 years) with FASD and 23 age- and gender-matched HC. We employed a cross-hemisphere multisensory paradigm to assess interhemispheric connectivity deficits in children with FASD. Time-frequency analysis of MEG data revealed a significant decrease in gamma oscillatory power for both unisensory and multisensory conditions in the FASD group relative to HC, based on permutation testing of significant group differences. Greater beta oscillatory power (15-30 Hz) was also noted in the FASD group compared to HC in both unisensory and multisensory conditions. Regression analysis revealed greater predictive power of multisensory oscillations from unisensory oscillations in the FASD group compared to the HC group. Furthermore, multisensory oscillatory power, for both groups, predicted performance on the Intra-Extradimensional Set Shift Task and the Cambridge Gambling Task. Altered oscillatory power in the FASD group may reflect a restricted ability to process somatosensory and multisensory stimuli during day-to-day interactions. These alterations in neural oscillations may be associated with the neurobehavioral deficits experienced by adolescents with FASD, and may carry over to
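
    As a rough illustration of what a band-limited "oscillatory power" estimate involves (the study itself used time-frequency analysis of MEG data with permutation statistics, not the simple spectral estimate sketched here), the snippet below computes gamma- and beta-band power from a simulated sensor trace; all values and names are illustrative:

    ```python
    import numpy as np
    from scipy.signal import welch

    def band_power(trace, fs, fmin=30.0, fmax=100.0):
        """Mean spectral power density in a frequency band (gamma: 30-100 Hz by default)."""
        freqs, psd = welch(trace, fs=fs, nperseg=int(fs))
        band = (freqs >= fmin) & (freqs <= fmax)
        return psd[band].mean()

    # Simulated 2-s sensor trace sampled at 1000 Hz: a 40 Hz component plus noise.
    fs = 1000.0
    t = np.arange(0.0, 2.0, 1.0 / fs)
    trace = 0.5 * np.sin(2 * np.pi * 40.0 * t) + np.random.default_rng(1).normal(0.0, 1.0, t.size)
    print("gamma-band power:", band_power(trace, fs))
    print("beta-band power:", band_power(trace, fs, 15.0, 30.0))
    ```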

  17. What does a neuron learn from multisensory experience?

    PubMed Central

    Xu, Jinghong; Yu, Liping; Stanford, Terrence R.; Rowland, Benjamin A.

    2014-01-01

    The brain's ability to integrate information from different senses is acquired only after extensive sensory experience. However, whether early life experience instantiates a general integrative capacity in multisensory neurons or one limited to the particular cross-modal stimulus combinations to which one has been exposed is not known. By selectively restricting either visual-nonvisual or auditory-nonauditory experience during the first few months of life, the present study found that trisensory neurons in cat superior colliculus (as well as their bisensory counterparts) became adapted to the cross-modal stimulus combinations specific to each rearing environment. Thus, even at maturity, trisensory neurons did not integrate all cross-modal stimulus combinations to which they were capable of responding, but only those that had been linked via experience to constitute a coherent spatiotemporal event. This selective maturational process determines which environmental events will become the most effective targets for superior colliculus-mediated shifts of attention and orientation. PMID:25392160

  18. Multisensory cues capture spatial attention regardless of perceptual load.

    PubMed

    Santangelo, Valerio; Spence, Charles

    2007-12-01

    We compared the ability of auditory, visual, and audiovisual (bimodal) exogenous cues to capture visuo-spatial attention under conditions of no load versus high perceptual load. Participants had to discriminate the elevation (up vs. down) of visual targets preceded by either unimodal or bimodal cues under conditions of high perceptual load (in which they had to monitor a rapidly presented central stream of visual letters for occasionally presented target digits) or no perceptual load (in which the central stream was replaced by a fixation point). The results of 3 experiments showed that all 3 cues captured visuo-spatial attention in the no-load condition. By contrast, only the bimodal cues captured visuo-spatial attention in the high-load condition, indicating for the first time that multisensory integration can play a key role in disengaging spatial attention from a concurrent perceptually demanding stimulus.

  19. A Multisensory Cortical Network for Understanding Speech in Noise

    PubMed Central

    Bishop, Christopher W.; Miller, Lee M.

    2010-01-01

    In noisy environments, listeners tend to hear a speaker’s voice yet struggle to understand what is said. The most effective way to improve intelligibility in such conditions is to watch the speaker’s mouth movements. Here we identify the neural networks that distinguish understanding from merely hearing speech, and determine how the brain applies visual information to improve intelligibility. Using functional magnetic resonance imaging, we show that understanding speech-in-noise is supported by a network of brain areas including the left superior parietal lobule, the motor/premotor cortex, and the left anterior superior temporal sulcus (STS), a likely apex of the acoustic processing hierarchy. Multisensory integration likely improves comprehension through improved communication between the left temporal–occipital boundary, the left medial-temporal lobe, and the left STS. This demonstrates how the brain uses information from multiple modalities to improve speech comprehension in naturalistic, acoustically adverse conditions. PMID:18823249

  20. The benefit of multisensory integration with biological motion signals.

    PubMed

    Mendonça, Catarina; Santos, Jorge A; López-Moliner, Joan

    2011-09-01

    Assessing the intentions, direction, and velocity of others is necessary for most daily tasks, and such information is often made available by both visual and auditory motion cues. It is therefore not surprising that we are highly skilled at perceiving human motion. Here, we explore the multisensory integration of cues to the walking speed of biological motion. After testing for audiovisual asynchronies (visual signals led auditory ones by 30 ms within simultaneity temporal windows of 76.4 ms), in the main experiment visual, auditory, and bimodal stimuli were compared to a standard audiovisual walker in a velocity discrimination task. The reduction in discrimination variance conformed to optimal integration of congruent bimodal stimuli across all subjects. Interestingly, the perceptual judgements were still close to optimal for stimuli at the smallest level of incongruence. Comparison of slopes allows us to estimate an integration window of about 60 ms, which is smaller than that reported for audiovisual speech.
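
    For reference, the maximum-likelihood ("optimal") cue-combination model against which such variance-reduction results are usually compared predicts inverse-variance weighting of the unisensory estimates; the notation below is generic and not taken from the study itself:

    \[ \hat{s}_{AV} = w_A\,\hat{s}_A + w_V\,\hat{s}_V, \qquad w_A = \frac{1/\sigma_A^2}{1/\sigma_A^2 + 1/\sigma_V^2}, \qquad w_V = 1 - w_A, \]

    \[ \sigma_{AV}^2 = \frac{\sigma_A^2\,\sigma_V^2}{\sigma_A^2 + \sigma_V^2} \le \min(\sigma_A^2,\, \sigma_V^2), \]

    so the bimodal discrimination variance should never exceed that of the better unisensory cue, which is the benchmark the abstract describes as optimal integration.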

  1. Multi-sensory intervention for preterm infants improves sucking organization

    PubMed Central

    Medoff-Cooper, Barbara; Rankin, Kristin; Li, Zhuoying; Liu, Li; White-Traut, Rosemary

    2015-01-01

    Objective: The aim of this RCT was to evaluate sucking organization in premature infants following a preterm infant multi-sensory intervention, the Auditory, Tactile, Visual, and Vestibular (ATVV) intervention. Study Design: A convenience sample of 183 healthy premature infants born at 29-34 weeks post-menstrual age (PMA) was enrolled. Sucking organization was measured at baseline and then in weekly assessments during the infant's hospital stay. Results: A quadratic trend was observed for number of sucks, sucks per burst, and maturity index, with the intervention group increasing significantly faster by day 7 (model estimates for group*day: β = 13.69, p < 0.01; β = 1.16, p < 0.01; and β = 0.12, p < 0.05, respectively). Sucking pressure increased linearly over time, with significant between-group differences at day 14 (β = 45.66, p < 0.01). Conclusion: ATVV infants exhibited improved sucking organization during hospitalization, suggesting that the ATVV intervention improves oral feeding. PMID:25822519

  2. Survey on multisensory feedback virtual reality dental training systems.

    PubMed

    Wang, D; Li, T; Zhang, Y; Hou, J

    2016-11-01

    Compared with traditional dental training methods, virtual reality training systems integrated with multisensory feedback possess potential advantages. However, many technical challenges remain in developing a satisfactory simulator. In this manuscript, we systematically survey several current dental training systems to identify the gaps between the capabilities of these systems and clinical training requirements. After briefly summarising the components, functions and unique features of each system, we discuss the technical challenges behind these systems, including the software, hardware and user evaluation methods. Finally, the clinical requirements of an ideal dental training system are proposed. Future research/development areas are identified based on an analysis of the gaps between current systems and clinical training requirements.

  3. Coding of multisensory temporal patterns in human superior temporal sulcus

    PubMed Central

    Noesselt, Tömme; Bergmann, Daniel; Heinze, Hans-Jochen; Münte, Thomas; Spence, Charles

    2012-01-01

    Philosophers, psychologists, and neuroscientists have long been interested in how the temporal aspects of perception are represented in the brain. In the present study, we investigated the neural basis of the temporal perception of synchrony/asynchrony for audiovisual speech stimuli using functional magnetic resonance imaging (fMRI). Subjects judged the temporal relation of (a)synchronous audiovisual speech streams, and indicated any changes in their perception of the stimuli over time. Differential hemodynamic responses for synchronous versus asynchronous stimuli were observed in the multisensory superior temporal sulcus complex (mSTS-c) and prefrontal cortex. Within mSTS-c we found adjacent regions expressing an enhanced BOLD-response to the different physical (a)synchrony conditions. These regions were further modulated by the subjects' perceptual state. By calculating the distances between the modulated regions within mSTS-c in single-subjects we demonstrate that the “auditory leading (AL)” and “visual leading (VL) areas” lie closer to “synchrony areas” than to each other. Moreover, analysis of interregional connectivity indicates a stronger functional connection between multisensory prefrontal cortex and mSTS-c during the perception of asynchrony. Taken together, these results therefore suggest the presence of distinct sub-regions within the human STS-c for the maintenance of temporal relations for audiovisual speech stimuli plus differential functional connectivity with prefrontal regions. The respective local activity in mSTS-c is dependent both upon the physical properties of the stimuli presented and upon the subjects' perception of (a)synchrony. PMID:22973202

  4. The multi-sensory approach as a geoeducational strategy

    NASA Astrophysics Data System (ADS)

    Musacchio, Gemma; Piangiamore, Giovanna Lucia; Pino, Nicola Alessandro

    2014-05-01

    Geoscience knowledge has a strong impact on modern society, as it relates to natural hazards, sustainability and environmental issues. The general public has a demanding attitude towards the understanding of crucial geo-scientific topics that is only partly satisfied by science communication strategies and/or by outreach or school programs. A proper knowledge of the phenomena can help trigger crucial inquiries when approaching the mitigation of geo-hazards and the use of geo-resources, while providing the right tools for understanding news and ideas circulating on the web and other media; in other words, it helps communication become more efficient. Nonetheless, the available educational resources seem inadequate to meet this goal, while research institutions face the challenge of testing new communication strategies and non-conventional ways of learning capable of conveying crucial scientific content. We suggest the multi-sensory approach as a successful non-conventional way of learning for children and as a different perspective on learning for older students and adults. Stimulation of the sense organs is perceived and processed to build knowledge of the surroundings, including all sorts of hazards. Relying heavily on the sense of sight, humans have somehow lost much of their ability for a deep perception of the environment enriched by all the other senses. Since hazards involve emotions, we argue that new ways of learning might work precisely through the emotions elicited by a tactile experience or by auditory or olfactory stimulation. To test and support this idea we are building a package of learning activities and exhibits based on a multi-sensory experience in which sight is excluded.

  5. Developmental changes in the multisensory temporal binding window persist into adolescence.

    PubMed

    Hillock-Dunn, Andrea; Wallace, Mark T

    2012-09-01

    We live in a world rich in sensory information, and consequently the brain is challenged with deciphering which cues from the various sensory modalities belong together. Determinations regarding the relatedness of sensory information appear to be based, at least in part, on the spatial and temporal relationships between the stimuli. Stimuli that are presented in close spatial and temporal correspondence are more likely to be associated with one another and thus 'bound' into a single perceptual entity. While there is a robust literature delineating behavioral changes in perception induced by multisensory stimuli, maturational changes in multisensory processing, particularly in the temporal realm, are poorly understood. The current study examines the developmental progression of multisensory temporal function by analyzing responses on an audiovisual simultaneity judgment task in 6- to 23-year-old participants. The overarching hypothesis for the study was that multisensory temporal function will mature with increasing age, with the developmental trajectory for this change being the primary point of inquiry. Results indeed reveal an age-dependent decrease in the size of the 'multisensory temporal binding window', the temporal interval within which multisensory stimuli are likely to be perceptually bound, with changes occurring over a surprisingly protracted time course that extends into adolescence.
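
    To make the notion of a "multisensory temporal binding window" concrete, the sketch below fits a Gaussian to hypothetical proportions of "simultaneous" reports across audiovisual SOAs and reads off a window width. The data, the Gaussian form, and the 75%-of-peak criterion are illustrative assumptions, not the exact procedure used in this study:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Proportion of "simultaneous" reports at each audiovisual SOA in ms
    # (negative = auditory leading); values are illustrative, not real data.
    soa = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300], dtype=float)
    p_sync = np.array([0.10, 0.30, 0.70, 0.90, 0.95, 0.92, 0.80, 0.45, 0.15])

    def gaussian(x, amp, mu, sigma):
        """Gaussian tuning of simultaneity reports over SOA."""
        return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

    (amp, mu, sigma), _ = curve_fit(gaussian, soa, p_sync, p0=[1.0, 0.0, 100.0])

    # One common (but not the only) criterion: the binding window is the SOA
    # range over which the fitted curve stays above 75% of its peak.
    half_width = sigma * np.sqrt(-2.0 * np.log(0.75))
    print(f"window centre = {mu:.1f} ms, width = +/-{half_width:.1f} ms")
    ```

    A narrowing of this fitted width with age is the kind of developmental change the study reports.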

  6. Evidence for enhanced multisensory facilitation with stimulus relevance: an electrophysiological investigation.

    PubMed

    Barutchu, Ayla; Freestone, Dean R; Innes-Brown, Hamish; Crewther, David P; Crewther, Sheila G

    2013-01-01

    Currently debate exists relating to the interplay between multisensory processes and bottom-up and top-down influences. However, few studies have looked at neural responses to newly paired audiovisual stimuli that differ in their prescribed relevance. For such newly associated audiovisual stimuli, optimal facilitation of motor actions was observed only when both components of the audiovisual stimuli were targets. Relevant auditory stimuli were found to significantly increase the amplitudes of the event-related potentials at the occipital pole during the first 100 ms post-stimulus onset, though this early integration was not predictive of multisensory facilitation. Activity related to multisensory behavioral facilitation was observed approximately 166 ms post-stimulus, at left central and occipital sites. Furthermore, optimal multisensory facilitation was found to be associated with a latency shift of induced oscillations in the beta range (14-30 Hz) at right hemisphere parietal scalp regions. These findings demonstrate the importance of stimulus relevance to multisensory processing by providing the first evidence that the neural processes underlying multisensory integration are modulated by the relevance of the stimuli being combined. We also provide evidence that such facilitation may be mediated by changes in neural synchronization in occipital and centro-parietal neural populations at early and late stages of neural processing that coincided with stimulus selection, and the preparation and initiation of motor action.

  7. Taking a call is facilitated by the multisensory processing of smartphone vibrations, sounds, and flashes.

    PubMed

    Pomper, Ulrich; Brincker, Jana; Harwood, James; Prikhodko, Ivan; Senkowski, Daniel

    2014-01-01

    Many electronic devices that we use in our daily lives provide inputs that need to be processed and integrated by our senses. For instance, ringing, vibrating, and flashing indicate incoming calls and messages in smartphones. Whether the presentation of multiple smartphone stimuli simultaneously provides an advantage over the processing of the same stimuli presented in isolation has not yet been investigated. In this behavioral study we examined multisensory processing between visual (V), tactile (T), and auditory (A) stimuli produced by a smartphone. Unisensory V, T, and A stimuli as well as VA, AT, VT, and trisensory VAT stimuli were presented in random order. Participants responded to any stimulus appearance by touching the smartphone screen using the stimulated hand (Experiment 1), or the non-stimulated hand (Experiment 2). We examined violations of the race model to test whether shorter response times to multisensory stimuli exceed probability summations of unisensory stimuli. Significant violations of the race model, indicative of multisensory processing, were found for VA stimuli in both experiments and for VT stimuli in Experiment 1. Across participants, the strength of this effect was not associated with prior learning experience and daily use of smartphones. This indicates that this integration effect, similar to what has been previously reported for the integration of semantically meaningless stimuli, could involve bottom-up driven multisensory processes. Our study demonstrates for the first time that multisensory processing of smartphone stimuli facilitates taking a call. Thus, research on multisensory integration should be taken into consideration when designing electronic devices such as smartphones.
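
    The race-model (probability-summation) test mentioned here can be illustrated with a short script. The sketch below compares the empirical multisensory response-time distribution against Miller's bound in a simplified form; the simulated response times and all names are illustrative only:

    ```python
    import numpy as np

    def ecdf(rts, t_grid):
        """Empirical cumulative distribution of response times at the grid points."""
        return np.searchsorted(np.sort(rts), t_grid, side="right") / rts.size

    def race_model_violation(rt_a, rt_v, rt_av, n_points=200):
        """Miller's race-model inequality: F_AV(t) <= F_A(t) + F_V(t).

        Positive values indicate responses faster than probability summation of
        the unisensory conditions allows, i.e. evidence of multisensory processing.
        """
        all_rts = np.concatenate([rt_a, rt_v, rt_av])
        t_grid = np.linspace(all_rts.min(), all_rts.max(), n_points)
        bound = np.minimum(ecdf(rt_a, t_grid) + ecdf(rt_v, t_grid), 1.0)
        return t_grid, ecdf(rt_av, t_grid) - bound

    # Simulated response times in milliseconds (illustrative only).
    rng = np.random.default_rng(0)
    rt_a = rng.normal(350.0, 50.0, 500)   # auditory-only (ringing)
    rt_v = rng.normal(360.0, 50.0, 500)   # visual-only (flashing)
    rt_av = rng.normal(310.0, 45.0, 500)  # audiovisual
    t_grid, violation = race_model_violation(rt_a, rt_v, rt_av)
    print("maximum race-model violation:", round(float(violation.max()), 3))
    ```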

  8. Developmental changes in the multisensory temporal binding window persist into adolescence

    PubMed Central

    Hillock-Dunn, Andrea; Wallace, Mark T.

    2014-01-01

    We live in a world rich in sensory information, and consequently the brain is challenged with deciphering which cues from the various sensory modalities belong together. Determinations regarding the relatedness of sensory information appear to be based, at least in part, on the spatial and temporal relationships between the stimuli. Stimuli that are presented in close spatial and temporal correspondence are more likely to be associated with one another and thus ‘bound’ into a single perceptual entity. While there is a robust literature delineating behavioral changes in perception induced by multisensory stimuli, maturational changes in multisensory processing, particularly in the temporal realm, are poorly understood. The current study examines the developmental progression of multisensory temporal function by analyzing responses on an audiovisual simultaneity judgment task in 6- to 23-year-old participants. The overarching hypothesis for the study was that multisensory temporal function will mature with increasing age, with the developmental trajectory for this change being the primary point of inquiry. Results indeed reveal an age-dependent decrease in the size of the ‘multisensory temporal binding window’, the temporal interval within which multisensory stimuli are likely to be perceptually bound, with changes occurring over a surprisingly protracted time course that extends into adolescence. PMID:22925516

  9. Evidence for Enhanced Multisensory Facilitation with Stimulus Relevance: An Electrophysiological Investigation

    PubMed Central

    Barutchu, Ayla; Freestone, Dean R.; Innes-Brown, Hamish; Crewther, David P.; Crewther, Sheila G.

    2013-01-01

    Currently debate exists relating to the interplay between multisensory processes and bottom-up and top-down influences. However, few studies have looked at neural responses to newly paired audiovisual stimuli that differ in their prescribed relevance. For such newly associated audiovisual stimuli, optimal facilitation of motor actions was observed only when both components of the audiovisual stimuli were targets. Relevant auditory stimuli were found to significantly increase the amplitudes of the event-related potentials at the occipital pole during the first 100 ms post-stimulus onset, though this early integration was not predictive of multisensory facilitation. Activity related to multisensory behavioral facilitation was observed approximately 166 ms post-stimulus, at left central and occipital sites. Furthermore, optimal multisensory facilitation was found to be associated with a latency shift of induced oscillations in the beta range (14–30 Hz) at right hemisphere parietal scalp regions. These findings demonstrate the importance of stimulus relevance to multisensory processing by providing the first evidence that the neural processes underlying multisensory integration are modulated by the relevance of the stimuli being combined. We also provide evidence that such facilitation may be mediated by changes in neural synchronization in occipital and centro-parietal neural populations at early and late stages of neural processing that coincided with stimulus selection, and the preparation and initiation of motor action. PMID:23372652

  10. Inverse Effectiveness and Multisensory Interactions in Visual Event-Related Potentials with Audiovisual Speech

    PubMed Central

    Bushmakin, Maxim; Kim, Sunah; Wallace, Mark T.; Puce, Aina; James, Thomas W.

    2013-01-01

    In recent years, it has become evident that neural responses previously considered to be unisensory can be modulated by sensory input from other modalities. In this regard, visual neural activity elicited to viewing a face is strongly influenced by concurrent incoming auditory information, particularly speech. Here, we applied an additive-factors paradigm aimed at quantifying the impact that auditory speech has on visual event-related potentials (ERPs) elicited to visual speech. These multisensory interactions were measured across parametrically varied stimulus salience, quantified in terms of signal to noise, to provide novel insights into the neural mechanisms of audiovisual speech perception. First, we measured a monotonic increase of the amplitude of the visual P1-N1-P2 ERP complex during a spoken-word recognition task with increases in stimulus salience. ERP component amplitudes varied directly with stimulus salience for visual, audiovisual, and summed unisensory recordings. Second, we measured changes in multisensory gain across salience levels. During audiovisual speech, the P1 and P1-N1 components exhibited less multisensory gain relative to the summed unisensory components with reduced salience, while N1-P2 amplitude exhibited greater multisensory gain as salience was reduced, consistent with the principle of inverse effectiveness. The amplitude interactions were correlated with behavioral measures of multisensory gain across salience levels as measured by response times, suggesting that change in multisensory gain associated with unisensory salience modulations reflects an increased efficiency of visual speech processing. PMID:22367585

  11. Somatosensory and multisensory properties of the medial bank of the ferret rostral suprasylvian sulcus

    PubMed Central

    Keniston, L. P.; Allman, B. L.; Meredith, M. A.

    2010-01-01

    In ferret cortex, the rostral portion of the suprasylvian sulcus separates primary somatosensory cortex (SI) from the anterior auditory fields. The boundary of the SI extends to this sulcus, but the adjoining medial sulcal bank has been described as “unresponsive.” Given its location between the representations of two different sensory modalities, it seems possible that the medial bank of the rostral suprasylvian sulcus (MRSS) might be multisensory in nature and contains neurons responsive to stimuli not examined by previous studies. The aim of this investigation was to determine if the MRSS contained tactile, auditory and/or multisensory neurons and to evaluate if its anatomical connections were consistent with these properties. The MRSS was found to be primarily responsive to low-threshold cutaneous stimulation, with regions of the head, neck and upper trunk represented somatotopically that were primarily connected with the SI face representation. Unlike the adjoining SI, the MRSS exhibited a different cytoarchitecture, its cutaneous representation was largely bilateral, and it contained a mixture of somatosensory, auditory and multisensory neurons. Despite the presence of multisensory neurons, however, auditory inputs exerted only modest effects on tactile processing in MRSS neurons and showed no influence on the averaged population response. These results identify the MRSS as a distinct, higher order somatosensory region as well as demonstrate that an area containing multisensory neurons may not necessarily exhibit activity indicative of multisensory processing at the population level. PMID:19466399

  12. The TLC: a novel auditory nucleus of the mammalian brain.

    PubMed

    Saldaña, Enrique; Viñuela, Antonio; Marshall, Allen F; Fitzpatrick, Douglas C; Aparicio, M-Auxiliadora

    2007-11-28

    We have identified a novel nucleus of the mammalian brain and termed it the tectal longitudinal column (TLC). Basic histologic stains, tract-tracing techniques and three-dimensional reconstructions reveal that the rat TLC is a narrow, elongated structure spanning the midbrain tectum longitudinally. This paired nucleus is located close to the midline, immediately dorsal to the periaqueductal gray matter. It occupies what has traditionally been considered the most medial region of the deep superior colliculus and the most medial region of the inferior colliculus. The TLC differs from the neighboring nuclei of the superior and inferior colliculi and the periaqueductal gray by its distinct connections and cytoarchitecture. Extracellular electrophysiological recordings show that TLC neurons respond to auditory stimuli with physiologic properties that differ from those of neurons in the inferior or superior colliculi. We have identified the TLC in rodents, lagomorphs, carnivores, nonhuman primates, and humans, which indicates that the nucleus is conserved across mammals. The discovery of the TLC reveals an unexpected level of longitudinal organization in the mammalian tectum and raises questions as to the participation of this mesencephalic region in essential, yet completely unexplored, aspects of multisensory and/or sensorimotor integration.

  13. Glyconanoparticles: multifunctional nanomaterials for biomedical applications.

    PubMed

    García, Isabel; Marradi, Marco; Penadés, Soledad

    2010-07-01

    Metal-based glyconanoparticles (GNPs) are biofunctional nanomaterials that combine the unique physical, chemical and optical properties of the metallic nucleus with the characteristics of the carbohydrate coating. The latter characteristics comprise a series of advantages that range from ensuring water solubility, biocompatibility and stability to targeting properties. The selection of suitable carbohydrates for specifically targeting biomarkers opens up the possibility to employ metallic GNPs in diagnostics and/or therapy. Within the vast nanoscience field, this review intends to focus on the advances of multifunctional and multimodal GNPs, which make use of the 'glycocode' to specifically address pathogens or pathological-related biomedical problems. Examples of their potential application in antiadhesion therapy and diagnosis are highlighted. From the ex vivo diagnostic perspective, it can be predicted that GNPs will soon be used clinically. However, the in vivo application of metallic GNPs in humans will probably need more time. In particular, major concerns regarding nanotoxicity need to be exhaustively addressed. However, it is expected that the sugar shell of GNPs will lower the intrinsic toxicity of metal nanoclusters better than other non-natural coatings.

  14. Multi-functional composite structures

    DOEpatents

    Mulligan, Anthony C.; Halloran, John; Popovich, Dragan; Rigali, Mark J.; Sutaria, Manish P.; Vaidyanathan, K. Ranji; Fulcher, Michael L.; Knittel, Kenneth L.

    2010-04-27

    Fibrous monolith processing techniques to fabricate multifunctional structures capable of performing more than one discrete function such as structures capable of bearing structural loads and mechanical stresses in service and also capable of performing at least one additional non-structural function.

  15. Multi-functional composite structures

    DOEpatents

    Mulligan, Anthony C.; Halloran, John; Popovich, Dragan; Rigali, Mark J.; Sutaria, Manish P.; Vaidyanathan, K. Ranji; Fulcher, Michael L.; Knittel, Kenneth L.

    2004-10-19

    Fibrous monolith processing techniques to fabricate multifunctional structures capable of performing more than one discrete function such as structures capable of bearing structural loads and mechanical stresses in service and also capable of performing at least one additional non-structural function.

  16. Multifunctional reactive nanocomposite materials

    NASA Astrophysics Data System (ADS)

    Stamatis, Demitrios

    Many multifunctional nanocomposite materials have been developed for use in propellants, explosives, pyrotechnics, and reactive structures. These materials exhibit high reaction rates due to their developed reaction interfacial area. Two applications addressed in this work are nanocomposite powders prepared by arrested reactive milling (ARM) for burn rate modifiers and for reactive structures. In burn rate modifiers, addition of reactive nanocomposite powders to aluminized propellants increases the burn rate of aluminum and thus the overall reaction rate of an energetic formulation. Replacing only a small fraction of aluminum by 8Al·MoO3 and 2B·Ti nanocomposite powders enhances the reaction rate with little change to the thermodynamic performance of the formulation; both the rate of pressure rise and the maximum pressure measured in the constant volume explosion test increase. For reactive structures, nanocomposite powders with bulk compositions of 8Al·MoO3, 12Al·MoO3, and 8Al·3CuO were prepared by ARM and consolidated using a uniaxial die. Consolidated samples had densities greater than 90% of the theoretical maximum density while maintaining their high reactivity. Pellets prepared using 8Al·MoO3 powders were ignited by a CO2 laser. Ignition delays increased at lower laser powers and greater pellet densities. A simplified numerical model describing heating and thermal initiation of the reactive pellets adequately predicted the observed effects of both laser power and pellet density on the measured ignition delays. To investigate the reaction mechanisms in nanocomposite thermites, two types of nanocomposite reactive materials with the same bulk composition, 8Al·MoO3, were prepared by different methods. One of the materials was manufactured by ARM and the other, the so-called metastable interstitial composite (MIC), by mixing of nano-scaled individual powders. Clear differences in the low-temperature redox reactions, well detectable by differential scanning calorimetry

  17. Changes in effective connectivity of human superior parietal lobule under multisensory and unisensory stimulation.

    PubMed

    Moran, R J; Molholm, S; Reilly, R B; Foxe, J J

    2008-05-01

    Previous event-related potential (ERP) studies have identified the superior parietal lobule (SPL) as actively multisensory. This study compares effective, or contextually active, connections to this region under unisensory and multisensory conditions. Effective connectivity, the influence of one brain region over another, was investigated during unisensory visual, unisensory auditory and multisensory audiovisual stimulation. ERPs were recorded from subdural electrodes placed over the parietal lobe of three patients while they conducted a rapid reaction-time task. A generative model of interacting neuronal ensembles for ERPs was inverted in a scheme allowing investigation of the connections from and to the SPL, a multisensory processing area. Important features of the ensemble model include inhibitory and excitatory feedback connections to pyramidal cells and extrinsic input to the stellate cell pool, with extrinsic forward and backward connections delineated by laminar connection differences between ensembles. The framework embeds the SPL in a plausible network of distinct neuronal ensembles mirroring the integrated brain regions involved in the response task. Bayesian model comparison was used to test competing feed-forward and feed-backward models of how the electrophysiological data were generated. Comparisons were performed between multisensory and unisensory data. Findings from three patients show differences in summed unisensory and multisensory ERPs that can be accounted for by a mediation of both forward and backward connections to the SPL. In particular, a negative gain in all forward and backward connections to the SPL from other regions was observed during the period of multisensory integration, while a positive gain was observed for forward projections that arise from the SPL.
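
    As general background to the Bayesian model comparison mentioned above (and not a description of the authors' exact computation), dynamic causal models are typically scored by an approximation to their log-evidence and compared via the log Bayes factor:

    \[ \ln BF_{12} = \ln p(y \mid m_1) - \ln p(y \mid m_2) \approx F_1 - F_2, \]

    where \(F_i\) is the variational free-energy bound on the log-evidence of model \(m_i\); a log Bayes factor of about 3 or more (odds of roughly 20:1) is conventionally taken as strong evidence for the winning model.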

  18. Role of the anterior insular cortex in integrative causal signaling during multisensory auditory-visual attention.

    PubMed

    Chen, Tianwen; Michels, Lars; Supekar, Kaustubh; Kochalka, John; Ryali, Srikanth; Menon, Vinod

    2015-01-01

    Coordinated attention to information from multiple senses is fundamental to our ability to respond to salient environmental events, yet little is known about brain network mechanisms that guide integration of information from multiple senses. Here we investigate dynamic causal mechanisms underlying multisensory auditory-visual attention, focusing on a network of right-hemisphere frontal-cingulate-parietal regions implicated in a wide range of tasks involving attention and cognitive control. Participants performed three 'oddball' attention tasks involving auditory, visual and multisensory auditory-visual stimuli during fMRI scanning. We found that the right anterior insula (rAI) demonstrated the most significant causal influences on all other frontal-cingulate-parietal regions, serving as a major causal control hub during multisensory attention. Crucially, we then tested two competing models of the role of the rAI in multisensory attention: an 'integrated' signaling model in which the rAI generates a common multisensory control signal associated with simultaneous attention to auditory and visual oddball stimuli versus a 'segregated' signaling model in which the rAI generates two segregated and independent signals in each sensory modality. We found strong support for the integrated, rather than the segregated, signaling model. Furthermore, the strength of the integrated control signal from the rAI was most pronounced on the dorsal anterior cingulate and posterior parietal cortices, two key nodes of saliency and central executive networks respectively. These results were preserved with the addition of a superior temporal sulcus region involved in multisensory processing. Our study provides new insights into the dynamic causal mechanisms by which the AI facilitates multisensory attention.

  19. Experience with adults shapes multisensory representation of social familiarity in the brain of a songbird.

    PubMed

    George, Isabelle; Cousillas, Hugo; Richard, Jean-Pierre; Hausberger, Martine

    2012-01-01

    Social animals learn to perceive their social environment, and their social skills and preferences are thought to emerge from greater exposure to and hence familiarity with some social signals rather than others. Familiarity appears to be tightly linked to multisensory integration. The ability to differentiate and categorize familiar and unfamiliar individuals and to build a multisensory representation of known individuals emerges from successive social interactions, in particular with adult, experienced models. In different species, adults have been shown to shape the social behavior of young by promoting selective attention to multisensory cues. The question of what representation of known conspecifics adult-deprived animals may build therefore arises. Here we show that starlings raised with no experience with adults fail to develop a multisensory representation of familiar and unfamiliar starlings. Electrophysiological recordings of neuronal activity throughout the primary auditory area of these birds, while they were exposed to audio-only or audiovisual familiar and unfamiliar cues, showed that visual stimuli did, as in wild-caught starlings, modulate auditory responses but that, unlike what was observed in wild-caught birds, this modulation was not influenced by familiarity. Thus, adult-deprived starlings seem to fail to discriminate between familiar and unfamiliar individuals. This suggests that adults may shape multisensory representation of known individuals in the brain, possibly by focusing the young's attention on relevant, multisensory cues. Multisensory stimulation by experienced, adult models may thus be ubiquitously important for the development of social skills (and of the neural properties underlying such skills) in a variety of species.

  20. Multisensory temporal function and EEG complexity in patients with epilepsy and psychogenic nonepileptic events.

    PubMed

    Noel, Jean-Paul; Kurela, LeAnne; Baum, Sarah H; Yu, Hong; Neimat, Joseph S; Gallagher, Martin J; Wallace, Mark

    2017-05-01

    Cognitive and perceptual comorbidities frequently accompany epilepsy and psychogenic nonepileptic events (PNEE). However, and despite the fact that perceptual function is built upon a multisensory foundation, little knowledge exists concerning multisensory function in these populations. Here, we characterized facets of multisensory processing abilities in patients with epilepsy and PNEE, and probed the relationship between individual resting-state EEG complexity and these psychophysical measures in each patient. We prospectively studied a cohort of patients with epilepsy (N=18) and PNEE (N=20) patients who were admitted to Vanderbilt's Epilepsy Monitoring Unit (EMU) and weaned off of anticonvulsant drugs. Unaffected age-matched persons staying with the patients in the EMU (N=15) were also recruited as controls. All participants performed two tests of multisensory function: an audio-visual simultaneity judgment and an audio-visual redundant target task. Further, in the cohort of patients with epilepsy and PNEE we quantified resting state EEG gamma power and complexity. Compared with both patients with epilepsy and control subjects, patients with PNEE exhibited significantly poorer acuity in audiovisual temporal function as evidenced in significantly larger temporal binding windows (i.e., they perceived larger stimulus asynchronies as being presented simultaneously). These differences appeared to be specific for temporal function, as there was no difference among the three groups in a non-temporally based measure of multisensory function - the redundant target task. Further, patients with PNEE exhibited more complex resting state EEG patterns as compared to their patients with epilepsy, and EEG complexity correlated with multisensory temporal performance on a subject-by-subject manner. Taken together, findings seem to indicate that patients with PNEE bind information from audition and vision over larger temporal intervals when compared with control subjects as well

  1. Role of the anterior insular cortex in integrative causal signaling during multisensory auditory–visual attention

    PubMed Central

    Chen, Tianwen; Michels, Lars; Supekar, Kaustubh; Kochalka, John; Ryali, Srikanth; Menon, Vinod

    2014-01-01

    Coordinated attention to information from multiple senses is fundamental to our ability to respond to salient environmental events, yet little is known about brain network mechanisms that guide integration of information from multiple senses. Here we investigate dynamic causal mechanisms underlying multisensory auditory–visual attention, focusing on a network of right-hemisphere frontal–cingulate–parietal regions implicated in a wide range of tasks involving attention and cognitive control. Participants performed three ‘oddball’ attention tasks involving auditory, visual and multisensory auditory–visual stimuli during fMRI scanning. We found that the right anterior insula (rAI) demonstrated the most significant causal influences on all other frontal–cingulate–parietal regions, serving as a major causal control hub during multisensory attention. Crucially, we then tested two competing models of the role of the rAI in multisensory attention: an ‘integrated’ signaling model in which the rAI generates a common multisensory control signal associated with simultaneous attention to auditory and visual oddball stimuli versus a ‘segregated’ signaling model in which the rAI generates two segregated and independent signals in each sensory modality. We found strong support for the integrated, rather than the segregated, signaling model. Furthermore, the strength of the integrated control signal from the rAI was most pronounced on the dorsal anterior cingulate and posterior parietal cortices, two key nodes of saliency and central executive networks respectively. These results were preserved with the addition of a superior temporal sulcus region involved in multisensory processing. Our study provides new insights into the dynamic causal mechanisms by which the AI facilitates multisensory attention. PMID:25352218

  2. The Nucleus Introduced

    PubMed Central

    Pederson, Thoru

    2011-01-01

    Now is an opportune moment to address the confluence of cell biological form and function that is the nucleus. Its arrival is especially timely because the recognition that the nucleus is extremely dynamic has now been solidly established as a paradigm shift over the past two decades, and also because we now see on the horizon numerous ways in which organization itself, including gene location and possibly self-organizing bodies, underlies nuclear functions. PMID:20660024

  3. Multifunctional nanoparticles for cancer immunotherapy

    PubMed Central

    Saleh, Tayebeh; Shojaosadati, Seyed Abbas

    2016-01-01

    During the last few decades significant progress has been made in the field of cancer immunotherapy. However, cancer vaccines have not been successful in clinical trials due to the poor immunogenicity of antigens, safety limitations associated with traditional systemic delivery, and the complex regulation of the immune system in the tumor microenvironment. In recent years, nanotechnology-based delivery systems have attracted great interest in the field of immunotherapy since they provide new opportunities to fight cancer. In particular, for the delivery of cancer vaccines, multifunctional nanoparticles offer many advantages, such as targeted delivery to immune cells, co-delivery of therapeutic agents, reduced adverse outcomes, blockade of immune checkpoint molecules, and amplified immune activation via the use of stimuli-responsive or immunostimulatory materials. In this review article, we highlight recent progress and the future promise of multifunctional nanoparticles that have been applied to enhance the efficiency of cancer vaccines. PMID:26901287

  4. Multifunctional Information Distribution System (MIDS)

    DTIC Science & Technology

    2015-12-01

    Selected Acquisition Report (SAR) RCS: DD-A&T(Q&A)823-554, Multifunctional Information Distribution System (MIDS), as of the FY 2017 President’s Budget. ...Program Office Estimate; RDT&E - Research, Development, Test, and Evaluation; SAR - Selected Acquisition Report; SCP - Service Cost Position; TBD - To Be... [remainder of the record is an extraction fragment of a performance table listing terminal variants such as LVT(2) with multiple selectable levels]

  5. The roles of physical and physiological simultaneity in audiovisual multisensory facilitation

    PubMed Central

    Leone, Lynnette M.; McCourt, Mark E.

    2013-01-01

    A series of experiments measured the audiovisual stimulus onset asynchrony (SOA_AV) yielding facilitative multisensory integration. We evaluated (1) the range of SOA_AV over which facilitation occurred when unisensory stimuli were weak; (2) whether the range of SOA_AV producing facilitation supported the hypothesis that physiological simultaneity of unisensory activity governs multisensory facilitation; and (3) whether AV multisensory facilitation depended on relative stimulus intensity. We compared response-time distributions to unisensory auditory (A) and visual (V) stimuli with those to AV stimuli over a wide range of SOA_AV (spanning 300 ms, in 20 ms increments), across four conditions of varying stimulus intensity. In condition 1, the intensity of unisensory stimuli was adjusted such that d′ ≈ 2. In condition 2, V stimulus intensity was increased (d′ > 4), while A stimulus intensity was as in condition 1. In condition 3, A stimulus intensity was increased (d′ > 4) while V stimulus intensity was as in condition 1. In condition 4, both A and V stimulus intensities were increased to clearly suprathreshold levels (d′ > 4). Across all conditions of stimulus intensity, significant multisensory facilitation occurred exclusively for simultaneously presented A and V stimuli. In addition, facilitation increased as stimulus intensity increased, in disagreement with inverse effectiveness. These results indicate that the requirements for facilitative multisensory integration include both physical and physiological simultaneity. PMID:24349682
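
    As a point of reference for the d′ values quoted above, sensitivity in a yes/no detection task is conventionally computed from hit and false-alarm rates as d′ = z(hits) - z(false alarms). The sketch below is a generic illustration of that calculation with hypothetical rates, not the authors' analysis code.

    ```python
    # Minimal sketch: signal-detection sensitivity (d') from hit and false-alarm
    # rates in a yes/no detection task. The rates below are hypothetical.
    from scipy.stats import norm

    def d_prime(hit_rate, false_alarm_rate):
        """d' = z(hit rate) - z(false-alarm rate)."""
        return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

    # A stimulus intensity giving ~84% hits against ~16% false alarms corresponds
    # to d' of roughly 2, the level targeted for the weak stimuli above.
    print(round(d_prime(0.84, 0.16), 2))  # ~1.99
    ```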

  6. Dynamic weighting of multisensory stimuli shapes decision-making in rats and humans.

    PubMed

    Sheppard, John P; Raposo, David; Churchland, Anne K

    2013-05-08

    Stimuli that animals encounter in the natural world are frequently time-varying and activate multiple sensory systems together. Such stimuli pose a major challenge for the brain: Successful multisensory integration requires subjects to estimate the reliability of each modality and use these estimates to weight each signal appropriately. Here, we examined whether humans and rats can estimate the reliability of time-varying multisensory stimuli when stimulus reliability changes unpredictably from trial to trial. Using an existing multisensory decision task that features time-varying audiovisual stimuli, we independently manipulated the signal-to-noise ratios of each modality and measured subjects' decisions on single- and multi-sensory trials. We report three main findings: (a) Sensory reliability influences how subjects weight multisensory evidence even for time-varying, stochastic stimuli. (b) The ability to exploit sensory reliability extends beyond human and nonhuman primates: Rodents and humans both weight incoming sensory information in a reliability-dependent manner. (c) Regardless of sensory reliability, most subjects are disinclined to make "snap judgments" and instead base decisions on evidence presented over the majority of the trial duration. Rare departures from this trend highlight the importance of using time-varying stimuli that permit this analysis. Taken together, these results suggest that the brain's ability to use stimulus reliability to guide decision-making likely relies on computations that are conserved across species and operate over a wide range of stimulus conditions.
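
    Reliability-dependent weighting of this sort is usually formalized as inverse-variance (maximum-likelihood) cue combination, in which each modality's weight is proportional to 1/sigma^2. The sketch below illustrates that standard model with hypothetical noise levels; it is not the specific model fit used in this study.

    ```python
    import numpy as np

    # Minimal sketch of inverse-variance (reliability-weighted) cue combination.
    # sigma_a and sigma_v are hypothetical single-modality noise levels; the
    # reliability of each cue is 1 / sigma**2.
    def combine(est_a, sigma_a, est_v, sigma_v):
        w_a = (1 / sigma_a**2) / (1 / sigma_a**2 + 1 / sigma_v**2)
        w_v = 1 - w_a
        combined_estimate = w_a * est_a + w_v * est_v
        # Predicted noise of the combined estimate is lower than either cue alone.
        combined_sigma = np.sqrt((sigma_a**2 * sigma_v**2) / (sigma_a**2 + sigma_v**2))
        return combined_estimate, combined_sigma

    # When the auditory stream is noisier, the visual estimate dominates.
    print(combine(est_a=12.0, sigma_a=4.0, est_v=9.0, sigma_v=2.0))
    ```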

  7. Augmented multisensory feedback enhances locomotor adaptation in humans with incomplete spinal cord injury.

    PubMed

    Yen, Sheng-Che; Landry, Jill M; Wu, Ming

    2014-06-01

    Different forms of augmented feedback may engage different motor learning pathways, but it is unclear how these pathways interact with each other, especially in patients with incomplete spinal cord injury (SCI). The purpose of this study was to test whether augmented multisensory feedback could enhance aftereffects following short term locomotor training (i.e., adaptation) in patients with incomplete SCI. A total of 10 subjects with incomplete SCI were recruited to perform locomotor adaptation. Three types of augmented feedback were provided during the adaptation: (a) computerized visual cues showing the actual and target stride length (augmented visual feedback); (b) a swing resistance applied to the leg (augmented proprioceptive feedback); (c) a combination of the visual cues and resistance (augmented multisensory feedback). The results showed that subjects' stride length increased in all conditions following the adaptation, but the increase was greater and retained longer in the multisensory feedback condition. The multisensory feedback provided in this study may engage both explicit and implicit learning pathways during the adaptation and in turn enhance the aftereffect. The results implied that multisensory feedback may be used as an adjunctive approach to enhance gait recovery in humans with SCI.

  8. Multisensory gain within and across hemispaces in simple and choice reaction time paradigms.

    PubMed

    Girard, Simon; Collignon, Olivier; Lepore, Franco

    2011-09-01

    Recent results on the nature and limits of multisensory enhancement are inconsistent when stimuli are presented across spatial regions. We presented visual, tactile and visuotactile stimuli to participants in two speeded response tasks. Each unisensory stimulus was presented to either the left or right hemispace, and multisensory stimuli were presented as either aligned (e.g. visual right/tactile right) or misaligned (e.g. visual right/tactile left). The first task was a simple reaction time (SRT) paradigm where participants responded to all stimulations irrespective of spatial position. Results showed that multisensory gain and coactivation were the same for spatially aligned and misaligned visuotactile stimulation. In the second task, a choice reaction time (CRT) paradigm where participants responded to right-sided stimuli only, misaligned stimuli yielded slower reaction times. No difference in multisensory gain was found between the SRT and CRT tasks for aligned stimulation. Overall, the results suggest that when spatial information is task-irrelevant, multisensory integration of spatially aligned and misaligned stimuli is equivalent. However, manipulating task requirements can alter this effect.
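
    Coactivation in redundant-signal paradigms is commonly assessed against the race-model (probability-summation) bound on the multisensory response-time distribution. The sketch below shows that generic test on empirical RT distributions using made-up data; it is not this study's actual analysis pipeline.

    ```python
    import numpy as np

    # Generic race-model (Miller) inequality test for coactivation: the CDF of
    # multisensory RTs is compared with the bound min(1, F_visual + F_tactile).
    # The RT samples below are made up for illustration only.
    rng = np.random.default_rng(0)
    rt_v = rng.normal(320, 40, 500)    # visual-only RTs (ms)
    rt_t = rng.normal(340, 45, 500)    # tactile-only RTs (ms)
    rt_vt = rng.normal(280, 35, 500)   # visuotactile RTs (ms)

    t = np.linspace(150, 500, 200)     # time points at which CDFs are evaluated

    def ecdf(samples):
        return np.searchsorted(np.sort(samples), t, side="right") / samples.size

    bound = np.minimum(1.0, ecdf(rt_v) + ecdf(rt_t))
    violation = ecdf(rt_vt) - bound    # positive values indicate coactivation
    print("maximum violation of the race-model bound:", float(violation.max()))
    ```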

  9. Dynamic weighting of multisensory stimuli shapes decision-making in rats and humans

    PubMed Central

    Sheppard, John P.; Raposo, David; Churchland, Anne K.

    2013-01-01

    Stimuli that animals encounter in the natural world are frequently time-varying and activate multiple sensory systems together. Such stimuli pose a major challenge for the brain: Successful multisensory integration requires subjects to estimate the reliability of each modality and use these estimates to weight each signal appropriately. Here, we examined whether humans and rats can estimate the reliability of time-varying multisensory stimuli when stimulus reliability changes unpredictably from trial to trial. Using an existing multisensory decision task that features time-varying audiovisual stimuli, we independently manipulated the signal-to-noise ratios of each modality and measured subjects' decisions on single- and multi-sensory trials. We report three main findings: (a) Sensory reliability influences how subjects weight multisensory evidence even for time-varying, stochastic stimuli. (b) The ability to exploit sensory reliability extends beyond human and nonhuman primates: Rodents and humans both weight incoming sensory information in a reliability-dependent manner. (c) Regardless of sensory reliability, most subjects are disinclined to make “snap judgments” and instead base decisions on evidence presented over the majority of the trial duration. Rare departures from this trend highlight the importance of using time-varying stimuli that permit this analysis. Taken together, these results suggest that the brain's ability to use stimulus reliability to guide decision-making likely relies on computations that are conserved across species and operate over a wide range of stimulus conditions. PMID:23658374

  10. The Construct of the Multisensory Temporal Binding Window and its Dysregulation in Developmental Disabilities

    PubMed Central

    Wallace, Mark T.; Stevenson, Ryan A.

    2014-01-01

    Behavior, perception and cognition are strongly shaped by the synthesis of information across the different sensory modalities. Such multisensory integration often results in performance and perceptual benefits that reflect the additional information conferred by having cues from multiple senses providing redundant or complementary information. The spatial and temporal relationships of these cues provide powerful statistical information about how these cues should be integrated or “bound” in order to create a unified perceptual representation. Much recent work has examined the temporal factors that are integral in multisensory processing, with much of it focused on the construct of the multisensory temporal binding window – the epoch of time within which stimuli from different modalities are likely to be integrated and perceptually bound. Emerging evidence suggests that this temporal window is altered in a series of neurodevelopmental disorders, including autism, dyslexia and schizophrenia. In addition to their role in sensory processing, these deficits in multisensory temporal function may play an important role in the perceptual and cognitive weaknesses that characterize these clinical disorders. Within this context, focus on improving the acuity of multisensory temporal function may have important implications for the amelioration of the “higher-order” deficits that serve as the defining features of these disorders. PMID:25128432
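
    One common way of operationalizing the width of such a temporal binding window is to fit a smooth function to the proportion of "synchronous" reports across audiovisual asynchronies and read off the width of the fitted curve. The sketch below is a generic illustration with hypothetical data and a Gaussian fit; individual studies differ in the exact function and width criterion they use.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Generic sketch: estimate a temporal binding window by fitting a Gaussian to
    # the proportion of "synchronous" reports across SOAs. Data are hypothetical.
    soa = np.array([-400, -300, -200, -100, 0, 100, 200, 300, 400])          # ms
    p_sync = np.array([0.05, 0.15, 0.45, 0.80, 0.95, 0.85, 0.60, 0.30, 0.10])

    def gauss(x, amp, mu, sigma):
        return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

    (amp, mu, sigma), _ = curve_fit(gauss, soa, p_sync, p0=[1.0, 0.0, 150.0])

    # One convention: report the window as the full width at half maximum.
    fwhm = 2.355 * abs(sigma)
    print(f"window centre {mu:.0f} ms, width (FWHM) {fwhm:.0f} ms")
    ```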

  11. Facilitation of multisensory integration by the "unity effect" reveals that speech is special.

    PubMed

    Vatakis, Argiro; Ghazanfar, Asif A; Spence, Charles

    2008-07-29

    Whenever two or more sensory inputs are highly consistent in one or more dimension(s), observers will be more likely to perceive them as a single multisensory event rather than as separate unimodal events. For audiovisual speech, but not for other noncommunicative events, participants exhibit a "unity effect," whereby they are less sensitive to temporal asynchrony (i.e., they are more likely to bind the multisensory signals together) for matched (than for mismatched) speech events. This finding suggests that the modulation of multisensory integration by the unity effect in humans may be specific to speech. To test this hypothesis directly, we investigated whether the unity effect would also influence the multisensory integration of vocalizations from another primate species, the rhesus monkey. Human participants made temporal order judgments for both matched and mismatched audiovisual stimuli presented at a range of stimulus-onset asynchronies. The unity effect was examined with (1) a single call-type across two different monkeys, (2) two different call-types from the same monkey, (3) human versus monkey "cooing," and (4) speech sounds produced by a male and a female human. The results show that the unity effect only influenced participants' performance for the speech stimuli; no effect was observed for monkey vocalizations or for the human imitations of monkey calls. These findings suggest that the facilitation of multisensory integration by the unity effect is specific to human speech signals.

  12. The roles of physical and physiological simultaneity in audiovisual multisensory facilitation.

    PubMed

    Leone, Lynnette M; McCourt, Mark E

    2013-01-01

    A series of experiments measured the audiovisual stimulus onset asynchrony (SOA_AV) yielding facilitative multisensory integration. We evaluated (1) the range of SOA_AV over which facilitation occurred when unisensory stimuli were weak; (2) whether the range of SOA_AV producing facilitation supported the hypothesis that physiological simultaneity of unisensory activity governs multisensory facilitation; and (3) whether AV multisensory facilitation depended on relative stimulus intensity. We compared response-time distributions to unisensory auditory (A) and visual (V) stimuli with those to AV stimuli over a wide range of SOA_AV (spanning 300 ms, in 20 ms increments), across four conditions of varying stimulus intensity. In condition 1, the intensity of unisensory stimuli was adjusted such that d' ≈ 2. In condition 2, V stimulus intensity was increased (d' > 4), while A stimulus intensity was as in condition 1. In condition 3, A stimulus intensity was increased (d' > 4) while V stimulus intensity was as in condition 1. In condition 4, both A and V stimulus intensities were increased to clearly suprathreshold levels (d' > 4). Across all conditions of stimulus intensity, significant multisensory facilitation occurred exclusively for simultaneously presented A and V stimuli. In addition, facilitation increased as stimulus intensity increased, in disagreement with inverse effectiveness. These results indicate that the requirements for facilitative multisensory integration include both physical and physiological simultaneity.

  13. Human upright posture control models based on multisensory inputs; in fast and slow dynamics.

    PubMed

    Chiba, Ryosuke; Takakusaki, Kaoru; Ota, Jun; Yozu, Arito; Haga, Nobuhiko

    2016-03-01

    Posture control to maintain an upright stance is one of the most important and basic requirements in the daily life of humans. The sensory inputs involved in posture control include visual and vestibular inputs, as well as proprioceptive and tactile somatosensory inputs. These multisensory inputs are integrated to represent the body state (body schema); this is then utilized in the brain to generate the motion. Changes in the multisensory inputs result in postural alterations (fast dynamics), as well as long-term alterations in multisensory integration and posture control itself (slow dynamics). In this review, we discuss the fast and slow dynamics, with a focus on multisensory integration including an introduction of our study to investigate "internal force control" with multisensory integration-evoked posture alteration. We found that the study of the slow dynamics is lagging compared to that of fast dynamics, such that our understanding of long-term alterations is insufficient to reveal the underlying mechanisms and to propose suitable models. Additional studies investigating slow dynamics are required to expand our knowledge of this area, which would support the physical training and rehabilitation of elderly and impaired persons. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  14. Primary and multisensory cortical activity is correlated with audiovisual percepts.

    PubMed

    Benoit, Margo McKenna; Raij, Tommi; Lin, Fa-Hsuan; Jääskeläinen, Iiro P; Stufflebeam, Steven

    2010-04-01

    Incongruent auditory and visual stimuli can elicit audiovisual illusions such as the McGurk effect where visual /ka/ and auditory /pa/ fuse into another percept such as /ta/. In the present study, human brain activity was measured with adaptation functional magnetic resonance imaging to investigate which brain areas support such audiovisual illusions. Subjects viewed trains of four movies beginning with three congruent /pa/ stimuli to induce adaptation. The fourth stimulus could be (i) another congruent /pa/, (ii) a congruent /ka/, (iii) an incongruent stimulus that evokes the McGurk effect in susceptible individuals (lips /ka/, voice /pa/), or (iv) the converse combination that does not cause the McGurk effect (lips /pa/, voice /ka/). This paradigm was predicted to show increased release from adaptation (i.e. stronger brain activation) when the fourth movie and the related percept were increasingly different from the three previous movies. A stimulus change in either the auditory or the visual stimulus from /pa/ to /ka/ (iii, iv) produced within-modality and cross-modal responses in primary auditory and visual areas. A greater release from adaptation was observed for incongruent non-McGurk (iv) compared to incongruent McGurk (iii) trials. A network including the primary auditory and visual cortices, nonprimary auditory cortex, and several multisensory areas (superior temporal sulcus, intraparietal sulcus, insula, and pre-central cortex) showed a correlation between perceiving the McGurk effect and the fMRI signal, suggesting that these areas support the audiovisual illusion.

  15. Assessing the benefits of multisensory audiotactile stimulation for overweight individuals.

    PubMed

    Wan, Xiaoang; Spence, Charles; Mu, Bingbing; Zhou, Xi; Ho, Cristy

    2014-04-01

    We report an experiment designed to examine whether individuals who are overweight would perform differently when trying to detect and/or discriminate auditory, vibrotactile, and audiotactile targets. The vibrotactile stimuli were delivered either to the participant's abdomen or to his hand. Thirty-six young male participants were classified into normal, underweight, or overweight groups based on their body mass index. All three groups exhibited a significant benefit of multisensory (over the best of the unisensory) stimulation, but the magnitude of this benefit was modulated by the weight of the participant, the task, and the location from which the vibrotactile stimuli happened to be presented. For the detection task, the overweight group exhibited a significantly smaller benefit than the underweight group. In the discrimination task, the overweight group showed significantly more benefits than the other two groups when the vibrotactile stimuli were delivered to their hands, but not when the stimuli were delivered to their abdomens. These results might raise some interesting questions regarding the mechanisms underlying audiotactile information processing and have applied relevance for the design of the most effective warning signal (e.g., for drivers).

  16. Multisensory interactions between auditory and haptic object recognition.

    PubMed

    Kassuba, Tanja; Menz, Mareike M; Röder, Brigitte; Siebner, Hartwig R

    2013-05-01

    Object manipulation produces characteristic sounds and causes specific haptic sensations that facilitate the recognition of the manipulated object. To identify the neural correlates of audio-haptic binding of object features, healthy volunteers underwent functional magnetic resonance imaging while they matched a target object to a sample object within and across audition and touch. By introducing a delay between the presentation of sample and target stimuli, it was possible to dissociate haptic-to-auditory and auditory-to-haptic matching. We hypothesized that only semantically coherent auditory and haptic object features activate cortical regions that host unified conceptual object representations. The left fusiform gyrus (FG) and posterior superior temporal sulcus (pSTS) showed increased activation during crossmodal matching of semantically congruent but not incongruent object stimuli. In the FG, this effect was found for haptic-to-auditory and auditory-to-haptic matching, whereas the pSTS only displayed a crossmodal matching effect for congruent auditory targets. Auditory and somatosensory association cortices showed increased activity during crossmodal object matching which was, however, independent of semantic congruency. Together, the results show multisensory interactions at different hierarchical stages of auditory and haptic object processing. Object-specific crossmodal interactions culminate in the left FG, which may provide a higher order convergence zone for conceptual object knowledge.

  17. Multi-sensory integration as a result of perception

    SciTech Connect

    Stansfield, S.A.

    1988-09-13

    For the most part, research into multi-sensor data fusion has concentrated on two areas: sensing and sensor modeling, and the creation of some central, usually geometric, representation of the world. From our perspective, this approach is narrow and local. The perceptual system will in fact need to integrate sensed data in many different ways: homogeneously, heterogeneously, supportively, and directly. Integration of sensory information takes place at several different levels and utilizes several different mechanisms. A first level of integration takes place as sensor primitives are extracted from the world and combined to form more complex features. As these features are identified, they may be used to invoke motor functions which extract other, related, features from the object being perceived. This use of one set of sensory inputs to guide the extraction of others may be thought of as the second level of integration carried out by the perceptual system. Finally, at the highest level, multiple, disparate sensory features are aggregated into a heterogeneous central representation of the object as a whole. How the information contained within this aggregate is used by the system is a function of both the available information and the task at hand. Thus, at this level, integration may be thought of as a top-down, knowledge-driven process. This paper explores how this model of multi-sensory integration might be implemented and utilized within a robotic system equipped with visual and tactile sensors. 21 refs., 5 figs.
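
    The three levels of integration described above can be caricatured in a short sketch. Every function, argument and data structure below is hypothetical and purely illustrative of the layered architecture; none of it is code from the paper.

    ```python
    # Purely illustrative sketch of the three integration levels described above;
    # all names and data structures are hypothetical.

    def extract_visual_features(image):
        """Level 1: combine raw sensor primitives into candidate object features."""
        return {"shape": "cylinder", "region": (10, 20, 30, 40)}   # toy output

    def probe_with_touch(visual_features, touch_fn):
        """Level 2: use one modality's features to guide sensing in another."""
        return touch_fn(visual_features["region"])   # touch only where vision points

    def aggregate(visual_features, tactile_features, task_features):
        """Level 3: top-down, task-driven aggregation into a single description."""
        merged = {**visual_features, **tactile_features}
        return {k: v for k, v in merged.items() if k in task_features}

    # Toy usage: a fake tactile probe reports hardness for the visually cued region.
    vis = extract_visual_features(image=None)
    tac = probe_with_touch(vis, touch_fn=lambda region: {"hardness": "rigid"})
    print(aggregate(vis, tac, task_features={"shape", "hardness"}))
    ```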

  18. Effect of odour on multisensory environmental evaluations of road traffic

    SciTech Connect

    Jiang, Like; Masullo, Massimiliano; Maffei, Luigi

    2016-09-15

    This study investigated the effect of odour on multisensory environmental evaluations of road traffic. The study aimed to answer: (1) Does odour have any effect on evaluations of noise, landscape and the overall environment? (2) How different are participants' responses to odour stimuli, and are these differences influential on the evaluations? Experimental scenarios varying across three Traffic levels, three Tree screening conditions and two Odour presence conditions were designed and presented to participants in virtual reality. Perceived Loudness, Noise Annoyance, Landscape Quality and Overall Pleasantness of each scenario were evaluated and the results were analysed. The results show that Odour presence did not have a significant main effect on any of the evaluations, but had significant interactions with Traffic level on Noise Annoyance and with Tree screening on Landscape Quality, indicating the potential of odour to modulate noise and visual landscape perceptions in specific environmental contexts. Concerning participants' responses to odour stimuli, large differences were found in this study; however, these differences did not appear to influence the environmental evaluations. A larger sample of participants might yield clearer effects of odour.

  19. Multisensory interactions between vestibular, visual and somatosensory signals.

    PubMed

    Ferrè, Elisa Raffaella; Walther, Leif Erik; Haggard, Patrick

    2015-01-01

    Vestibular inputs are constantly processed and integrated with signals from other sensory modalities, such as vision and touch. The multiply-connected nature of vestibular cortical anatomy led us to investigate whether vestibular signals could participate in a multi-way interaction with visual and somatosensory perception. We used signal detection methods to identify whether vestibular stimulation might interact with both visual and somatosensory events in a detection task. Participants were instructed to detect near-threshold somatosensory stimuli that were delivered to the left index finger in one half of experimental trials. A visual signal occurred close to the finger in half of the trials, independent of somatosensory stimuli. A novel Near infrared caloric vestibular stimulus (NirCVS) was used to artificially activate the vestibular organs. Sham stimulations were used to control for non-specific effects of NirCVS. We found that both visual and vestibular events increased somatosensory sensitivity. Critically, we found no evidence for supra-additive multisensory enhancement when both visual and vestibular signals were administered together: in fact, we found a trend towards sub-additive interaction. The results are compatible with a vestibular role in somatosensory gain regulation.

  20. Multisensory Interactions between Vestibular, Visual and Somatosensory Signals

    PubMed Central

    Ferrè, Elisa Raffaella; Walther, Leif Erik; Haggard, Patrick

    2015-01-01

    Vestibular inputs are constantly processed and integrated with signals from other sensory modalities, such as vision and touch. The multiply-connected nature of vestibular cortical anatomy led us to investigate whether vestibular signals could participate in a multi-way interaction with visual and somatosensory perception. We used signal detection methods to identify whether vestibular stimulation might interact with both visual and somatosensory events in a detection task. Participants were instructed to detect near-threshold somatosensory stimuli that were delivered to the left index finger in one half of experimental trials. A visual signal occurred close to the finger in half of the trials, independent of somatosensory stimuli. A novel Near infrared caloric vestibular stimulus (NirCVS) was used to artificially activate the vestibular organs. Sham stimulations were used to control for non-specific effects of NirCVS. We found that both visual and vestibular events increased somatosensory sensitivity. Critically, we found no evidence for supra-additive multisensory enhancement when both visual and vestibular signals were administered together: in fact, we found a trend towards sub-additive interaction. The results are compatible with a vestibular role in somatosensory gain regulation. PMID:25875819

  1. Learning to integrate contradictory multisensory self-motion cue pairings.

    PubMed

    Kaliuzhna, Mariia; Prsa, Mario; Gale, Steven; Lee, Stella J; Blanke, Olaf

    2015-01-14

    Humans integrate multisensory information to reduce perceptual uncertainty when perceiving the world and self. Integration fails, however, if a common causality is not attributed to the sensory signals, as would occur in conditions of spatiotemporal discrepancies. In the case of passive self-motion, visual and vestibular cues are integrated according to statistical optimality, yet the extent of cue conflicts that do not compromise this optimality is currently underexplored. Here, we investigate whether human subjects can learn to integrate two arbitrary, but co-occurring, visual and vestibular cues of self-motion. Participants made size comparisons between two successive whole-body rotations using only visual, only vestibular, and both modalities together. The vestibular stimulus provided a yaw self-rotation cue, the visual a roll (Experiment 1) or pitch (Experiment 2) rotation cue. Experimentally measured thresholds in the bimodal condition were compared with theoretical predictions derived from the single-cue thresholds. Our results show that human subjects combine and optimally integrate vestibular and visual information, each signaling self-motion around a different rotation axis (yaw vs. roll and yaw vs. pitch). This finding suggests that the experience of two temporally co-occurring but spatially unrelated self-motion cues leads to inferring a common cause for these two initially unrelated sources of information about self-motion. We discuss our results in terms of specific task demands, cross-modal adaptation, and spatial compatibility. The importance of these results for the understanding of bodily illusions is also discussed.

  2. Hydrotherapy combined with Snoezelen multi-sensory therapy.

    PubMed

    Lavie, Efrat; Shapiro, Michele; Julius, Mona

    2005-01-01

    The aim of this article is to present a new and challenging model of treatment that combines two therapeutic interventions: hydrotherapy and Snoezelen or controlled multisensory stimulation. The combination of the two therapeutic approaches enhances the treatment effect by utilizing the unique characteristics of each approach. We believe that this combined model will further enhance each medium to the benefit of the clients and create a new intervention approach. This article relates to a hydrotherapy swimming pool facility that has been established at the Williams Island Therapeutic Swimming and Recreation Center, Beit Issie Shapiro, Raanana in Israel, after acquiring many years of experience and gaining substantial knowledge both in the field of hydrotherapy and Snoezelen intervention. Beit Issie Shapiro is a non-profit community organization providing a range of services for children with developmental disabilities and their families. The organization provides direct services for nearly 6,000 children and adults each year. This article provides an overview of hydrotherapy and Snoezelen and presents a case study, which will demonstrate the new model of treatment and show how this new and innovative form of therapy can be used as a successful intervention. We believe it will open a path to enriching the repertoire of therapists helping people with special needs. This article is also addressed to researchers to provide ideas for further studies in this area.

  3. Multisensory space: from eye-movements to self-motion

    PubMed Central

    Bremmer, Frank

    2011-01-01

    We perceive the world around us as stable. This is remarkable given that our body parts as well as we ourselves are constantly in motion. Humans and other primates move their eyes more often than their hearts beat. Such eye movements lead to coherent motion of the images of the outside world across the retina. Furthermore, during everyday life, we constantly approach targets, avoid obstacles or otherwise move in space. These movements induce motion across different sensory receptor epithelia: optical flow across the retina, tactile flow across the body surface and even auditory flow as detected from the two ears. It is generally assumed that motion signals as induced by one's own movement have to be identified and differentiated from the real motion in the outside world. In a number of experimental studies we and others have functionally characterized the primate posterior parietal cortex (PPC) and its role in multisensory encoding of spatial and motion information. Extracellular recordings in the macaque monkey showed that during steady fixation the visual, auditory and tactile spatial representations in the ventral intraparietal area (VIP) are congruent. This finding was of major importance given that a functional MRI (fMRI) study determined the functional equivalent of macaque area VIP in humans. Further recordings in other areas of the dorsal stream of the visual cortical system of the macaque pointed towards the neural basis of perceptual phenomena (heading detection during eye movements, saccadic suppression, mislocalization of visual stimuli during eye movements) as determined in psychophysical studies in humans. PMID:20921203

  4. Using multisensory cues to facilitate air traffic management.

    PubMed

    Ngo, Mary K; Pierce, Russell S; Spence, Charles

    2012-12-01

    In the present study, we sought to investigate whether auditory and tactile cuing could be used to facilitate a complex, real-world air traffic management scenario. Auditory and tactile cuing provides an effective means of improving both the speed and accuracy of participants' performance in a variety of laboratory-based visual target detection and identification tasks. A low-fidelity air traffic simulation task was used in which participants monitored and controlled aircraft. The participants had to ensure that the aircraft landed or exited at the correct altitude, speed, and direction and that they maintained a safe separation from all other aircraft and boundaries. The performance measures recorded included en route time, handoff delay, and conflict resolution delay (the performance measure of interest). In a baseline condition, the aircraft in conflict was highlighted in red (visual cue), and in the experimental conditions, this standard visual cue was accompanied by a simultaneously presented auditory, vibrotactile, or audiotactile cue. Participants responded significantly more rapidly, but no less accurately, to conflicts when presented with an additional auditory or audiotactile cue than with either a vibrotactile or visual cue alone. Auditory and audiotactile cues have the potential for improving operator performance by reducing the time it takes to detect and respond to potential visual target events. These results have important implications for the design and use of multisensory cues in air traffic management.

  5. Comparative Effects of Multisensory and Metacognitive Instructional Approaches on English Vocabulary Achievement of Underachieving Nigerian Secondary School Students

    ERIC Educational Resources Information Center

    Adeniyi, Folakemi O.; Lawal, R. Adebayo

    2012-01-01

    The purpose of this study was to find out the relative effects of three instructional approaches, i.e., Multisensory, Metacognitive, and a combination of Multisensory and Metacognitive Instructional Approaches, on the vocabulary achievement of underachieving secondary school students. The study adopted the quasi-experimental design in which a…

  6. Bimodal and trimodal multisensory enhancement: effects of stimulus onset and intensity on reaction time.

    PubMed

    Diederich, Adele; Colonius, Hans

    2004-11-01

    Manual reaction times to visual, auditory, and tactile stimuli presented simultaneously, or with a delay, were measured to test for multisensory interaction effects in a simple detection task with redundant signals. Responses to trimodal stimulus combinations were faster than those to bimodal combinations, which in turn were faster than reactions to unimodal stimuli. Response enhancement increased with decreasing auditory and tactile stimulus intensity and was a U-shaped function of stimulus onset asynchrony. Distribution inequality tests indicated that the multisensory interaction effects were larger than predicted by separate activation models, including the difference between bimodal and trimodal response facilitation. The results are discussed with respect to previous findings in a focused attention task and are compared with multisensory integration rules observed in bimodal and trimodal superior colliculus neurons in the cat and monkey.

  7. Auditory-driven phase reset in visual cortex: Human electrocorticography reveals mechanisms of early multisensory integration

    PubMed Central

    Mercier, Manuel R.; Foxe, John J.; Fiebelkorn, Ian C.; Butler, John S.; Schwartz, Theodore H.; Molholm, Sophie

    2013-01-01

    Findings in animal models demonstrate that activity within hierarchically early sensory cortical regions can be modulated by cross-sensory inputs through resetting of the phase of ongoing intrinsic neural oscillations. Here, subdural recordings evaluated whether phase resetting by auditory inputs would impact multisensory integration processes in human visual cortex. Results clearly showed auditory-driven phase reset in visual cortices and, in some cases, frank auditory event-related potentials (ERP) were also observed over these regions. Further, when audiovisual bisensory stimuli were presented, this led to robust multisensory integration effects which were observed in both the ERP and in measures of phase concentration. These results extend findings from animal models to human visual cortices, and highlight the impact of cross-sensory phase resetting by a non-primary stimulus on multisensory integration in ostensibly unisensory cortices. PMID:23624493
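
    The "measures of phase concentration" mentioned here are typically inter-trial coherence values, i.e., the magnitude of the mean unit phase vector across trials. The sketch below computes that generic quantity from synthetic per-trial phase angles; it is not the authors' electrocorticography pipeline.

    ```python
    import numpy as np

    # Generic phase-concentration (inter-trial coherence) measure:
    # ITC = |mean over trials of exp(i * phase)|; 0 = uniform phases across
    # trials, 1 = perfect phase alignment. Phases here are synthetic.
    rng = np.random.default_rng(1)

    def itc(phases):
        return np.abs(np.mean(np.exp(1j * phases)))

    reset_like = rng.normal(loc=0.0, scale=0.3, size=200)    # tightly clustered phases
    no_reset = rng.uniform(-np.pi, np.pi, size=200)          # uniformly scattered phases
    print(f"phase-reset-like ITC ~ {itc(reset_like):.2f}, no-reset ITC ~ {itc(no_reset):.2f}")
    ```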

  8. Effect of mechanical tactile noise on amplitude of visual evoked potentials: multisensory stochastic resonance.

    PubMed

    Méndez-Balbuena, Ignacio; Huidobro, Nayeli; Silva, Mayte; Flores, Amira; Trenado, Carlos; Quintanar, Luis; Arias-Carrión, Oscar; Kristeva, Rumyana; Manjarrez, Elias

    2015-10-01

    The present investigation documents the electrophysiological occurrence of multisensory stochastic resonance in the human visual pathway elicited by tactile noise. We define multisensory stochastic resonance of brain evoked potentials as the phenomenon in which an intermediate level of input noise of one sensory modality enhances the brain evoked response of another sensory modality. Here we examined this phenomenon in visual evoked potentials (VEPs) modulated by the addition of tactile noise. Specifically, we examined whether a particular level of mechanical Gaussian noise applied to the index finger can improve the amplitude of the VEP. We compared the amplitude of the positive P100 VEP component between zero noise (ZN), optimal noise (ON), and high mechanical noise (HN). The data disclosed an inverted U-like graph for all the subjects, thus demonstrating the occurrence of a multisensory stochastic resonance in the P100 VEP.
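
    The inverted-U dependence on noise level reported here is the defining signature of stochastic resonance. The toy simulation below, a hard-threshold detector driven by a subthreshold periodic input plus Gaussian noise, reproduces the qualitative pattern; it is unrelated to the actual tactile-noise/VEP paradigm.

    ```python
    import numpy as np

    # Toy stochastic-resonance demo: a subthreshold periodic signal crosses a hard
    # threshold only with the help of added Gaussian noise, and signal transmission
    # peaks at an intermediate noise level (inverted U). Not the VEP experiment.
    rng = np.random.default_rng(2)
    t = np.arange(0, 200, 0.01)
    signal = 0.8 * np.sin(2 * np.pi * 0.5 * t)     # amplitude below the threshold of 1.0

    def transmission(noise_sd, threshold=1.0):
        crossings = (signal + rng.normal(0, noise_sd, t.size)) > threshold
        if crossings.std() == 0:                   # no crossings at all: nothing transmitted
            return 0.0
        # Correlation between the crossing train and the driving signal as a crude
        # proxy for how well the periodic input is transmitted.
        return np.corrcoef(crossings.astype(float), signal)[0, 1]

    for sd in (0.05, 0.3, 3.0):                    # too little, intermediate, too much noise
        print(f"noise sd {sd:>4}: transmission ~ {transmission(sd):.2f}")
    ```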

  9. Effect of mechanical tactile noise on amplitude of visual evoked potentials: multisensory stochastic resonance

    PubMed Central

    Huidobro, Nayeli; Silva, Mayte; Flores, Amira; Trenado, Carlos; Quintanar, Luis; Arias-Carrión, Oscar; Kristeva, Rumyana

    2015-01-01

    The present investigation documents the electrophysiological occurrence of multisensory stochastic resonance in the human visual pathway elicited by tactile noise. We define multisensory stochastic resonance of brain evoked potentials as the phenomenon in which an intermediate level of input noise of one sensory modality enhances the brain evoked response of another sensory modality. Here we examined this phenomenon in visual evoked potentials (VEPs) modulated by the addition of tactile noise. Specifically, we examined whether a particular level of mechanical Gaussian noise applied to the index finger can improve the amplitude of the VEP. We compared the amplitude of the positive P100 VEP component between zero noise (ZN), optimal noise (ON), and high mechanical noise (HN). The data disclosed an inverted U-like graph for all the subjects, thus demonstrating the occurrence of a multisensory stochastic resonance in the P100 VEP. PMID:26156387

  10. Development of multisensory integration from the perspective of the individual neuron

    PubMed Central

    Stein, Barry E.; Stanford, Terrence R.; Rowland, Benjamin A.

    2014-01-01

    The ability to use cues from multiple senses in concert is a fundamental aspect of brain function. It maximizes the brain’s use of the information available to it at any given moment and enhances the physiological salience of external events. Because each sense conveys a unique perspective of the external world, synthesizing information across senses affords computational benefits that cannot otherwise be achieved. Multisensory integration not only has substantial survival value but can also create unique experiences that emerge when signals from different sensory channels are bound together. However, neurons in a newborn’s brain are not capable of multisensory integration, and studies in the midbrain have shown that the development of this process is not predetermined. Rather, its emergence and maturation critically depend on cross-modal experiences that alter the underlying neural circuit in such a way that optimizes multisensory integrative capabilities for the environment in which the animal will function. PMID:25158358

  11. Electrical neuroimaging of memory discrimination based on single-trial multisensory learning.

    PubMed

    Thelen, Antonia; Cappe, Céline; Murray, Micah M

    2012-09-01

    Multisensory experiences influence subsequent memory performance and brain responses. Studies have thus far concentrated on semantically congruent pairings, leaving unresolved the influence of stimulus pairing and memory sub-types. Here, we paired images with unique, meaningless sounds during a continuous recognition task to determine if purely episodic, single-trial multisensory experiences can incidentally impact subsequent visual object discrimination. Psychophysics and electrical neuroimaging analyses of visual evoked potentials (VEPs) compared responses to repeated images either paired or not with a meaningless sound during initial encounters. Recognition accuracy was significantly impaired for images initially presented as multisensory pairs and could not be explained in terms of differential attention or transfer of effects from encoding to retrieval. VEP modulations occurred at 100-130 ms and 270-310 ms and stemmed from topographic differences indicative of network configuration changes within the brain. Distributed source estimations localized the earlier effect to regions of the right posterior superior temporal gyrus (STG) and the later effect to regions of the middle temporal gyrus (MTG). Responses in these regions were stronger for images previously encountered as multisensory pairs. Only the later effect correlated with performance such that greater MTG activity in response to repeated visual stimuli was linked with greater performance decrements. The present findings suggest that brain networks involved in this discrimination may critically depend on whether multisensory events facilitate or impair later visual memory performance. More generally, the data support models whereby effects of multisensory interactions persist to incidentally affect subsequent behavior as well as visual processing during its initial stages.

  12. Individual differences in the multisensory temporal binding window predict susceptibility to audiovisual illusions.

    PubMed

    Stevenson, Ryan A; Zemtsov, Raquel K; Wallace, Mark T

    2012-12-01

    Human multisensory systems are known to bind inputs from the different sensory modalities into a unified percept, a process that leads to measurable behavioral benefits. This integrative process can be observed through multisensory illusions, including the McGurk effect and the sound-induced flash illusion, both of which demonstrate the ability of one sensory modality to modulate perception in a second modality. Such multisensory integration is highly dependent upon the temporal relationship of the different sensory inputs, with perceptual binding occurring within a limited range of asynchronies known as the temporal binding window (TBW). Previous studies have shown that this window is highly variable across individuals, but it is unclear how these variations in the TBW relate to an individual's ability to integrate multisensory cues. Here we provide evidence linking individual differences in multisensory temporal processes to differences in the individual's audiovisual integration of illusory stimuli. Our data provide strong evidence that the temporal processing of multiple sensory signals and the merging of multiple signals into a single, unified perception, are highly related. Specifically, the width of the right side of an individual's TBW, where the auditory stimulus follows the visual, is significantly correlated with the strength of illusory percepts, as indexed via both an increase in the strength of binding synchronous sensory signals and in an improvement in correctly dissociating asynchronous signals. These findings are discussed in terms of their possible neurobiological basis, relevance to the development of sensory integration, and possible importance for clinical conditions in which there is growing evidence that multisensory integration is compromised.

  13. Individual Differences in the Multisensory Temporal Binding Window Predict Susceptibility to Audiovisual Illusions

    PubMed Central

    Stevenson, Ryan A.; Zemtsov, Raquel K.; Wallace, Mark T.

    2013-01-01

    Human multisensory systems are known to bind inputs from the different sensory modalities into a unified percept, a process that leads to measurable behavioral benefits. This integrative process can be observed through multisensory illusions, including the McGurk effect and the sound-induced flash illusion, both of which demonstrate the ability of one sensory modality to modulate perception in a second modality. Such multisensory integration is highly dependent upon the temporal relationship of the different sensory inputs, with perceptual binding occurring within a limited range of asynchronies known as the temporal binding window (TBW). Previous studies have shown that this window is highly variable across individuals, but it is unclear how these variations in the TBW relate to an individual’s ability to integrate multisensory cues. Here we provide evidence linking individual differences in multisensory temporal processes to differences in the individual’s audiovisual integration of illusory stimuli. Our data provide strong evidence that the temporal processing of multiple sensory signals and the merging of multiple signals into a single, unified perception, are highly related. Specifically, the width of the right side of an individual’s TBW, where the auditory stimulus follows the visual, is significantly correlated with the strength of illusory percepts, as indexed via both an increase in the strength of binding synchronous sensory signals and in an improvement in correctly dissociating asynchronous signals. These findings are discussed in terms of their possible neurobiological basis, relevance to the development of sensory integration, and possible importance for clinical conditions in which there is growing evidence that multisensory integration is compromised. PMID:22390292

  14. Unconscious integration of multisensory bodily inputs in the peripersonal space shapes bodily self-consciousness.

    PubMed

    Salomon, Roy; Noel, Jean-Paul; Łukowska, Marta; Faivre, Nathan; Metzinger, Thomas; Serino, Andrea; Blanke, Olaf

    2017-09-01

    Recent studies have highlighted the role of multisensory integration as a key mechanism of self-consciousness. In particular, integration of bodily signals within the peripersonal space (PPS) underlies the experience of the self in a body we own (self-identification) and that is experienced as occupying a specific location in space (self-location), two main components of bodily self-consciousness (BSC). Experiments investigating the effects of multisensory integration on BSC have typically employed supra-threshold sensory stimuli, neglecting the role of unconscious sensory signals in BSC, as tested in other consciousness research. Here, we used psychophysical techniques to test whether multisensory integration of bodily stimuli underlying BSC also occurs for multisensory inputs presented below the threshold of conscious perception. Our results indicate that visual stimuli rendered invisible through continuous flash suppression boost processing of tactile stimuli on the body (Exp. 1), and enhance the perception of near-threshold tactile stimuli (Exp. 2), only once they entered PPS. We then employed unconscious multisensory stimulation to manipulate BSC. Participants were presented with tactile stimulation on their body and with visual stimuli on a virtual body, seen at a distance, which were either visible or rendered invisible. We found that participants reported higher self-identification with the virtual body in the synchronous visuo-tactile stimulation (as compared to asynchronous stimulation; Exp. 3), and shifted their self-location toward the virtual body (Exp. 4), even if stimuli were fully invisible. Our results indicate that multisensory inputs, even outside of awareness, are integrated and affect the phenomenological content of self-consciousness, grounding BSC firmly in the field of psychophysical consciousness studies. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Nonvisual Multisensory Impairment of Body Perception in Anorexia Nervosa: A Systematic Review of Neuropsychological Studies

    PubMed Central

    Gaudio, Santino; Brooks, Samantha Jane; Riva, Giuseppe

    2014-01-01

    Background: Body image distortion is a central symptom of Anorexia Nervosa (AN). Even though corporeal awareness is multisensory, the majority of AN studies have mainly investigated visual misperception. We systematically reviewed AN studies that have investigated different nonvisual sensory inputs using an integrative multisensory approach to body perception. We also discussed the findings in the light of AN neuroimaging evidence. Methods: PubMed and PsycINFO were searched until March, 2014. To be included in the review, studies were mainly required to: investigate a sample of patients with current or past AN and a control group and use tasks that directly elicited one or more nonvisual sensory domains. Results: Thirteen studies were included. They studied a total of 223 people with current or past AN and 273 control subjects. Overall, results show impairment in tactile and proprioceptive domains of body perception in AN patients. Interoception and multisensory integration have been poorly explored directly in AN patients. A limitation of this review is the relatively small amount of literature available. Conclusions: Our results showed that AN patients had a multisensory impairment of body perception that goes beyond visual misperception and involves tactile and proprioceptive sensory components. Furthermore, impairment of tactile and proprioceptive components may be associated with parietal cortex alterations in AN patients. Interoception and multisensory integration have been weakly explored directly. Further research, using multisensory approaches as well as neuroimaging techniques, is needed to better define the complexity of body image distortion in AN. Key Findings: The review suggests an altered capacity of AN patients in processing and integration of bodily signals: body parts are experienced as dissociated from their holistic and perceptive dimensions. Specifically, it is likely that not only perception but memory, and in particular sensorimotor/proprioceptive memory

  16. Staying within the lines: the formation of visuospatial boundaries influences multisensory feature integration.

    PubMed

    Fiebelkorn, Ian C; Foxe, John J; Schwartz, Theodore H; Molholm, Sophie

    2010-05-01

    The brain processes multisensory features of an object (e.g., its sound and shape) in separate cortical regions. A key question is how representations of these features bind together to form a coherent percept (the 'binding problem'). Here we tested the hypothesis that the determination of an object's visuospatial boundaries is paramount to the linking of its multisensory features (i.e., that the refinement of attended space through the formation of visual boundaries establishes the boundaries for multisensory feature integration). We recorded both scalp and intracranial electrophysiological data in response to Kanizsa-type illusory contour stimuli (in which pacman-like elements give the impression of a single object), their non-illusory counterparts, and auditory stimuli. Participants performed a visual task and ignored sounds. Enhanced processing of task-irrelevant sounds when paired with attended visual stimuli served as our metric for multisensory feature integration [e.g., Busse et al. (2005) Proc. Natl Acad. Sci. USA 102: 18751-18756]. According to our hypothesis, task-irrelevant sounds paired with Kanizsa-type illusory contour stimuli (which have well-defined boundaries) should receive enhanced processing relative to task-irrelevant sounds paired with non-illusory contour stimuli (which have ambiguous boundaries). The scalp data clearly support this prediction and, combined with the intracranial data, advocate for an important extension of models for multisensory feature integration. We propose a model in which (i) the visual boundaries of an object are established through processing in occipitotemporal cortex, and (ii) attention then spreads to cortical regions that process features that fall within the object's established visual boundaries, including its task-irrelevant multisensory features.

  17. Basic multisensory functions can be acquired after congenital visual pattern deprivation in humans.

    PubMed

    Putzar, Lisa; Gondan, Matthias; Röder, Brigitte

    2012-01-01

    People treated for bilateral congenital cataracts offer a model to study the influence of visual deprivation in early infancy on visual and multisensory development. We investigated cross-modal integration capabilities in cataract patients using a simple detection task that provided redundant information to two different senses. In both patients and controls, redundancy gains were consistent with coactivation models, indicating an integrated processing of modality-specific information. This finding is in contrast with recent studies showing impaired higher-level multisensory interactions in cataract patients. The present results suggest that basic cross-modal integrative processes for simple short stimuli do not depend on visual and/or crossmodal input since birth.

  18. Kaon-nucleus scattering

    NASA Technical Reports Server (NTRS)

    Hong, Byungsik; Maung, Khin Maung; Wilson, John W.; Buck, Warren W.

    1989-01-01

    The derivations of the Lippmann-Schwinger equation and Watson multiple scattering are given. A simple optical potential is found to be the first term of that series. The number density distribution models of the nucleus, harmonic well and Woods-Saxon, are used without the t-matrix taken from the scattering experiments. The parameterized two-body inputs, which are kaon-nucleon total cross sections, elastic slope parameters, and the ratio of the real to the imaginary part of the forward elastic scattering amplitude, are presented. The eikonal approximation was chosen as our solution method to estimate the total and absorptive cross sections for the kaon-nucleus scattering.
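
    For orientation, the standard textbook forms of the quantities named above are collected below. These are the generic expressions, not necessarily the exact conventions or normalizations used in this report.

    ```latex
    % Lippmann-Schwinger equation for the transition operator
    T = V + V\,G_0(E)\,T, \qquad G_0(E) = \frac{1}{E - H_0 + i\epsilon}
    % First-order ("t-rho") optical potential from the Watson multiple-scattering series,
    % with t_{KN} the kaon-nucleon t-matrix and \rho(\mathbf{r}) the nuclear number density
    U_{\mathrm{opt}}(\mathbf{r}) \simeq A\, t_{KN}\, \rho(\mathbf{r})
    % Eikonal phase and the resulting total and absorption cross sections
    \chi(b) = -\frac{1}{\hbar v}\int_{-\infty}^{\infty} U_{\mathrm{opt}}\!\left(\sqrt{b^{2}+z^{2}}\right) dz, \qquad
    \sigma_{\mathrm{tot}} = 2\int d^{2}b\,\left[1 - \operatorname{Re}\, e^{i\chi(b)}\right], \qquad
    \sigma_{\mathrm{abs}} = \int d^{2}b\,\left[1 - \left|e^{i\chi(b)}\right|^{2}\right]
    ```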

  19. Kaon-nucleus scattering

    NASA Technical Reports Server (NTRS)

    Hong, Byungsik; Buck, Warren W.; Maung, Khin M.

    1989-01-01

    Two kinds of number density distributions of the nucleus, harmonic well and Woods-Saxon models, are used with the t-matrix that is taken from the scattering experiments to find a simple optical potential. The parameterized two-body inputs, which are kaon-nucleon total cross sections, elastic slope parameters, and the ratio of the real to the imaginary part of the forward elastic scattering amplitude, are shown. The eikonal approximation was chosen as the solution method to estimate the total and absorptive cross sections for the kaon-nucleus scattering.

  20. Primary and Multisensory Cortical Activity is Correlated with Audiovisual Percepts

    PubMed Central

    Benoit, Margo McKenna; Raij, Tommi; Lin, Fa-Hsuan; Jääskeläinen, Iiro P.; Stufflebeam, Steven

    2012-01-01

    Incongruent auditory and visual stimuli can elicit audiovisual illusions such as the McGurk effect where visual /ka/ and auditory /pa/ fuse into another percept such as /ta/. In the present study, human brain activity was measured with adaptation functional magnetic resonance imaging to investigate which brain areas support such audiovisual illusions. Subjects viewed trains of four movies beginning with three congruent /pa/ stimuli to induce adaptation. The fourth stimulus could be (i) another congruent /pa/, (ii) a congruent /ka/, (iii) an incongruent stimulus that evokes the McGurk effect in susceptible individuals (lips /ka/, voice /pa/), or (iv) the converse combination that does not cause the McGurk effect (lips /pa/, voice /ka/). This paradigm was predicted to show increased release from adaptation (i.e. stronger brain activation) when the fourth movie and the related percept were increasingly different from the three previous movies. A stimulus change in either the auditory or the visual stimulus from /pa/ to /ka/ (iii, iv) produced within-modality and cross-modal responses in primary auditory and visual areas. A greater release from adaptation was observed for incongruent non-McGurk (iv) compared to incongruent McGurk (iii) trials. A network including the primary auditory and visual cortices, nonprimary auditory cortex, and several multisensory areas (superior temporal sulcus, intraparietal sulcus, insula, and pre-central cortex) showed a correlation between perceiving the McGurk effect and the fMRI signal, suggesting that these areas support the audiovisual illusion. PMID:19780040

  1. Multisensory fusion and the stochastic structure of postural sway.

    PubMed

    Kiemel, Tim; Oie, Kelvin S; Jeka, John J

    2002-10-01

    We analyze the stochastic structure of postural sway and demonstrate that this structure imposes important constraints on models of postural control. Linear stochastic models of various orders were fit to the center-of-mass trajectories of subjects during quiet stance in four sensory conditions: (i) light touch and vision, (ii) light touch, (iii) vision, and (iv) neither touch nor vision. For each subject and condition, the model of appropriate order was determined, and this model was characterized by the eigenvalues and coefficients of its autocovariance function. In most cases, postural-sway trajectories were similar to those produced by a third-order model with eigenvalues corresponding to a slow first-order decay plus a faster-decaying damped oscillation. The slow-decay fraction, which we define as the slow-decay autocovariance coefficient divided by the total variance, was usually near 1. We compare the stochastic structure of our data to two linear control-theory models: (i) a proportional-integral-derivative control model in which the postural system's state is assumed to be known, and (ii) an optimal-control model in which the system's state is estimated based on noisy multisensory information using a Kalman filter. Under certain assumptions, both models have eigenvalues consistent with our results. However, the slow-decay fraction predicted by both models is less than we observe. We show that our results are more consistent with a modification of the optimal-control model in which noise is added to the computations performed by the state estimator. This modified model has a slow-decay fraction near 1 in a parameter regime in which sensory information related to the body's velocity is more accurate than sensory information related to position and acceleration. These findings suggest that: (i) computation noise is responsible for much of the variance observed in postural sway, and (ii) the postural control system under the conditions tested resides in the
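
    As a rough illustration of the quantity defined above, the following sketch (assumptions: simulated AR(3) data standing in for a center-of-mass trajectory and an ordinary least-squares fit; the study's actual estimation procedure is not reproduced here) fits a third-order linear stochastic model, extracts its eigenvalues, and computes the slow-decay fraction as the slow-mode autocovariance coefficient divided by the total variance.

    ```python
    # Hypothetical sketch: estimate a "slow-decay fraction" for a sway-like signal.
    # Not the original authors' code; all parameter values are arbitrary.
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate an AR(3) process with one slow real pole plus a damped oscillation.
    poles = [0.995, 0.90 * np.exp(1j * 0.30), 0.90 * np.exp(-1j * 0.30)]
    a_true = -np.poly(poles).real[1:]              # x_t = a1*x_{t-1} + a2*x_{t-2} + a3*x_{t-3} + noise
    T = 50_000
    x = np.zeros(T)
    for t in range(3, T):
        x[t] = a_true @ x[t - 3:t][::-1] + rng.standard_normal()

    # Fit an AR(3) model by ordinary least squares and get its eigenvalues.
    X = np.column_stack([x[2:-1], x[1:-2], x[:-3]])
    a_hat, *_ = np.linalg.lstsq(X, x[3:], rcond=None)
    ev = np.linalg.eigvals(np.vstack([a_hat, np.eye(3)[:2]]))

    # Decompose the empirical autocovariance into a slow decay plus a damped oscillation.
    K = 400
    xc = x - x.mean()
    C = np.array([xc[:T - k] @ xc[k:] / T for k in range(K)])
    slow = ev[np.argmin(np.abs(ev.imag))].real     # slow, real eigenvalue
    osc = ev[np.argmax(ev.imag)]                   # one member of the complex pair
    k = np.arange(K)
    B = np.column_stack([slow ** k,
                         np.abs(osc) ** k * np.cos(np.angle(osc) * k),
                         np.abs(osc) ** k * np.sin(np.angle(osc) * k)])
    coef, *_ = np.linalg.lstsq(B, C, rcond=None)

    slow_decay_fraction = coef[0] / C[0]           # slow coefficient / total variance
    print("eigenvalues:", np.round(ev, 3))
    print(f"slow-decay fraction ≈ {slow_decay_fraction:.2f}")
    ```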

  2. Multisensory Training Improves Auditory Spatial Processing following Bilateral Cochlear Implantation

    PubMed Central

    Isaiah, Amal; Vongpaisal, Tara; King, Andrew J.

    2014-01-01

    Cochlear implants (CIs) partially restore hearing to the deaf by directly stimulating the inner ear. In individuals fitted with CIs, lack of auditory experience due to loss of hearing before language acquisition can adversely impact outcomes. For example, adults with early-onset hearing loss generally do not integrate inputs from both ears effectively when fitted with bilateral CIs (BiCIs). Here, we used an animal model to investigate the effects of long-term deafness on auditory localization with BiCIs and approaches for promoting the use of binaural spatial cues. Ferrets were deafened either at the age of hearing onset or as adults. All animals were implanted in adulthood, either unilaterally or bilaterally, and were subsequently assessed for their ability to localize sound in the horizontal plane. The unilaterally implanted animals were unable to perform this task, regardless of the duration of deafness. Among animals with BiCIs, early-onset hearing loss was associated with poor auditory localization performance, compared with late-onset hearing loss. However, performance in the early-deafened group with BiCIs improved significantly after multisensory training with interleaved auditory and visual stimuli. We demonstrate a possible neural substrate for this by showing a training-induced improvement in the responsiveness of auditory cortical neurons and in their sensitivity to interaural level differences, the principal localization cue available to BiCI users. Importantly, our behavioral and physiological evidence demonstrates a facilitative role for vision in restoring auditory spatial processing following potential cross-modal reorganization. These findings support investigation of a similar training paradigm in human CI users. PMID:25122908

  3. Structure-power multifunctional materials for UAV's

    NASA Astrophysics Data System (ADS)

    Thomas, James; Qidwai, Muhammad A.; Matic, Peter; Everett, Richard; Gozdz, Antoni S.; Keennon, Matt; Grasmeyer, Joel

    2002-07-01

    This paper presents multifunctional structure-plus-power developments being pursued under DARPA sponsorship with the focus on structure-battery components for unmanned air vehicles (UAV). New design strategies, analysis methods, performance indices, and prototypes for multifunctional structure-battery materials are described along with the development of two UAV prototypes with structure-battery implementation.

  4. Locally rare species influence grassland ecosystem multifunctionality.

    PubMed

    Soliveres, Santiago; Manning, Peter; Prati, Daniel; Gossner, Martin M; Alt, Fabian; Arndt, Hartmut; Baumgartner, Vanessa; Binkenstein, Julia; Birkhofer, Klaus; Blaser, Stefan; Blüthgen, Nico; Boch, Steffen; Böhm, Stefan; Börschig, Carmen; Buscot, Francois; Diekötter, Tim; Heinze, Johannes; Hölzel, Norbert; Jung, Kirsten; Klaus, Valentin H; Klein, Alexandra-Maria; Kleinebecker, Till; Klemmer, Sandra; Krauss, Jochen; Lange, Markus; Morris, E Kathryn; Müller, Jörg; Oelmann, Yvonne; Overmann, Jörg; Pašalić, Esther; Renner, Swen C; Rillig, Matthias C; Schaefer, H Martin; Schloter, Michael; Schmitt, Barbara; Schöning, Ingo; Schrumpf, Marion; Sikorski, Johannes; Socher, Stephanie A; Solly, Emily F; Sonnemann, Ilja; Sorkau, Elisabeth; Steckel, Juliane; Steffan-Dewenter, Ingolf; Stempfhuber, Barbara; Tschapka, Marco; Türke, Manfred; Venter, Paul; Weiner, Christiane N; Weisser, Wolfgang W; Werner, Michael; Westphal, Catrin; Wilcke, Wolfgang; Wolters, Volkmar; Wubet, Tesfaye; Wurst, Susanne; Fischer, Markus; Allan, Eric

    2016-05-19

    Species diversity promotes the delivery of multiple ecosystem functions (multifunctionality). However, the relative functional importance of rare and common species in driving the biodiversity-multifunctionality relationship remains unknown. We studied the relationship between the diversity of rare and common species (according to their local abundances and across nine different trophic groups), and multifunctionality indices derived from 14 ecosystem functions on 150 grasslands across a land-use intensity (LUI) gradient. The diversity of above- and below-ground rare species had opposite effects, with rare above-ground species being associated with high levels of multifunctionality, probably because their effects on different functions did not trade off against each other. Conversely, common species were only related to average, not high, levels of multifunctionality, and their functional effects declined with LUI. Apart from the community-level effects of diversity, we found significant positive associations between the abundance of individual species and multifunctionality in 6% of the species tested. Species-specific functional effects were best predicted by their response to LUI: species that declined in abundance with land use intensification were those associated with higher levels of multifunctionality. Our results highlight the importance of rare species for ecosystem multifunctionality and help guide future conservation priorities.
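
    For readers unfamiliar with how such indices are typically constructed, the sketch below computes two commonly used summaries, an averaging index and a threshold-based index, on made-up data (the threshold value and all names are illustrative; the study's exact index definitions may differ):

    ```python
    # Hypothetical sketch of two common ecosystem-multifunctionality indices.
    # `functions` is a plots x functions matrix of measured ecosystem functions.
    import numpy as np

    rng = np.random.default_rng(42)
    functions = rng.random((150, 14))             # 150 grasslands x 14 functions (fake data)

    # Averaging index: mean of each function after standardizing it to 0-1 across plots.
    f_min, f_max = functions.min(axis=0), functions.max(axis=0)
    averaging_index = ((functions - f_min) / (f_max - f_min)).mean(axis=1)

    # Threshold index: number of functions exceeding, e.g., 50% of their observed maximum.
    threshold_index = (functions > 0.5 * functions.max(axis=0)).sum(axis=1)

    print(averaging_index[:5].round(2), threshold_index[:5])
    ```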

  5. Locally rare species influence grassland ecosystem multifunctionality

    PubMed Central

    Manning, Peter; Prati, Daniel; Gossner, Martin M.; Alt, Fabian; Arndt, Hartmut; Baumgartner, Vanessa; Binkenstein, Julia; Birkhofer, Klaus; Blaser, Stefan; Blüthgen, Nico; Boch, Steffen; Böhm, Stefan; Börschig, Carmen; Buscot, Francois; Diekötter, Tim; Heinze, Johannes; Hölzel, Norbert; Jung, Kirsten; Klaus, Valentin H.; Klein, Alexandra-Maria; Kleinebecker, Till; Klemmer, Sandra; Krauss, Jochen; Lange, Markus; Morris, E. Kathryn; Müller, Jörg; Oelmann, Yvonne; Overmann, Jörg; Pašalić, Esther; Renner, Swen C.; Rillig, Matthias C.; Schaefer, H. Martin; Schloter, Michael; Schmitt, Barbara; Schöning, Ingo; Schrumpf, Marion; Sikorski, Johannes; Socher, Stephanie A.; Solly, Emily F.; Sonnemann, Ilja; Sorkau, Elisabeth; Steckel, Juliane; Steffan-Dewenter, Ingolf; Stempfhuber, Barbara; Tschapka, Marco; Türke, Manfred; Venter, Paul; Weiner, Christiane N.; Weisser, Wolfgang W.; Werner, Michael; Westphal, Catrin; Wilcke, Wolfgang; Wolters, Volkmar; Wubet, Tesfaye; Wurst, Susanne; Fischer, Markus; Allan, Eric

    2016-01-01

    Species diversity promotes the delivery of multiple ecosystem functions (multifunctionality). However, the relative functional importance of rare and common species in driving the biodiversity–multifunctionality relationship remains unknown. We studied the relationship between the diversity of rare and common species (according to their local abundances and across nine different trophic groups), and multifunctionality indices derived from 14 ecosystem functions on 150 grasslands across a land-use intensity (LUI) gradient. The diversity of above- and below-ground rare species had opposite effects, with rare above-ground species being associated with high levels of multifunctionality, probably because their effects on different functions did not trade off against each other. Conversely, common species were only related to average, not high, levels of multifunctionality, and their functional effects declined with LUI. Apart from the community-level effects of diversity, we found significant positive associations between the abundance of individual species and multifunctionality in 6% of the species tested. Species-specific functional effects were best predicted by their response to LUI: species that declined in abundance with land use intensification were those associated with higher levels of multifunctionality. Our results highlight the importance of rare species for ecosystem multifunctionality and help guide future conservation priorities. PMID:27114572

  6. Generic Automated Multi-function Finger Design

    NASA Astrophysics Data System (ADS)

    Honarpardaz, M.; Tarkian, M.; Sirkett, D.; Ölvander, J.; Feng, X.; Elf, J.; Sjögren, R.

    2016-11-01

    Multi-function fingers that are able to handle multiple workpieces are crucial for improving a robot workcell. Design automation of multi-function fingers is in high demand in the robotics industry as a way to overcome the current iterative, time-consuming and complex manual design process. However, the existing approaches to multi-function finger design automation are unable to entirely meet this need. This paper proposes a generic approach for the design automation of multi-function fingers. The proposed approach completely automates the design process and requires no expert skill. In addition, it executes the design process much faster than the current manual process. To validate the approach, multi-function fingers are successfully designed for two case studies. Further, the results are discussed and benchmarked against existing approaches.

  7. The development of audiovisual multisensory integration across childhood and early adolescence: a high-density electrical mapping study.

    PubMed

    Brandwein, Alice B; Foxe, John J; Russo, Natalie N; Altschuler, Ted S; Gomes, Hilary; Molholm, Sophie

    2011-05-01

    The integration of multisensory information is essential to forming meaningful representations of the environment. Adults benefit from related multisensory stimuli but the extent to which the ability to optimally integrate multisensory inputs for functional purposes is present in children has not been extensively examined. Using a cross-sectional approach, high-density electrical mapping of event-related potentials (ERPs) was combined with behavioral measures to characterize neurodevelopmental changes in basic audiovisual (AV) integration from middle childhood through early adulthood. The data indicated a gradual fine-tuning of multisensory facilitation of performance on an AV simple reaction time task (as indexed by race model violation), which reaches mature levels by about 14 years of age. They also revealed a systematic relationship between age and the brain processes underlying multisensory integration (MSI) in the time frame of the auditory N1 ERP component (∼ 120 ms). A significant positive correlation between behavioral and neurophysiological measures of MSI suggested that the underlying brain processes contributed to the fine-tuning of multisensory facilitation of behavior that was observed over middle childhood. These findings are consistent with protracted plasticity in a dynamic system and provide a starting point from which future studies can begin to examine the developmental course of multisensory processing in clinical populations.
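
    As a rough illustration of the race-model test referred to above (a minimal sketch using Miller's inequality on simulated reaction times; the study's preprocessing and statistics are not reproduced here), multisensory facilitation beyond probability summation is indicated wherever the audiovisual CDF exceeds the summed unisensory CDFs:

    ```python
    # Hypothetical sketch: test for race-model (Miller) inequality violations.
    import numpy as np

    rng = np.random.default_rng(7)
    rt_a = rng.normal(320, 50, 500)               # auditory-only RTs in ms (simulated)
    rt_v = rng.normal(340, 55, 500)               # visual-only RTs in ms (simulated)
    rt_av = rng.normal(270, 45, 500)              # audiovisual RTs in ms (simulated)

    def ecdf(samples, t):
        """Empirical cumulative distribution function evaluated at times t."""
        return np.searchsorted(np.sort(samples), t, side="right") / len(samples)

    t = np.linspace(150, 500, 200)                           # evaluation grid (ms)
    bound = np.minimum(ecdf(rt_a, t) + ecdf(rt_v, t), 1.0)   # race-model bound
    violation = ecdf(rt_av, t) - bound                       # > 0 violates the bound

    print(f"max violation: {violation.max():.3f} at {t[violation.argmax()]:.0f} ms")
    ```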

  8. Electrospun multifunctional tissue engineering scaffolds

    NASA Astrophysics Data System (ADS)

    Wang, Chong; Wang, Min

    2014-03-01

    Tissue engineering holds great promise for providing successful treatments of human body tissue loss that current methods are unable to treat or for which they cannot achieve satisfactory clinical outcomes. In scaffold-based tissue engineering, a high-performance scaffold underpins the success of a tissue engineering strategy, and a major direction in the field is to create multifunctional tissue engineering scaffolds for enhanced biological performance and for regenerating complex body tissues. Electrospinning can produce nanofibrous scaffolds that are highly desirable for tissue engineering. The enormous interest in electrospinning and electrospun fibrous structures by the science, engineering and medical communities has led to various developments of the electrospinning technology and wide investigations of electrospun products in many industries, including biomedical engineering, over the past two decades. It is now possible to create novel, multicomponent tissue engineering scaffolds with multiple functions. This article provides a concise review of recent advances in the R & D of electrospun multifunctional tissue engineering scaffolds. It also presents our philosophy and research on the design and fabrication of electrospun multicomponent scaffolds with multiple functions.

  9. Phototriggered multifunctional drug delivery device

    NASA Astrophysics Data System (ADS)

    Härtner, S.; Kim, H.-C.; Hampp, N.

    2006-02-01

    Although phototriggered cleavage of chemical bonds induced by single-photon or two-photon absorption provides attractive tools for controlled drug delivery, the choice of drugs is still limited by the linker system to which the therapeutic molecules need to be bound covalently. The use of a multifunctional linker system suitable for coupling a broad spectrum of drugs to the polymeric carrier will open a new field for drug delivery. We have developed a novel photocleavable multifunctional linker system based on coumarin dimers, whose unique photochemical behavior is well characterized. As a first example, an acrylic polymer-drug conjugate with antimetabolites is explored. The cleavage of the link between the drug and the polymer backbone is triggered by both single- and two-photon absorption, and the release of the drug is investigated. It is possible to manufacture a polymeric drug delivery device with several drugs in different areas. In particular, the two-photon-absorption-induced process offers the possibility of addressing the drug of interest owing to its superior spatial resolution. The key to such devices is a versatile linker system which can be adapted to work with various drug compounds.

  10. Onset of deconfinement in nucleus-nucleus collisions

    SciTech Connect

    Gazdzicki, M.; Gorenstein, M. I.; Seyboth, P.

    2012-05-15

    The energy dependence of hadron production in relativistic nucleus-nucleus collisions reveals anomalies: the kink, the horn, and the step. These were predicted as signals of the deconfinement phase transition and were observed by the NA49 Collaboration in central Pb+Pb collisions at the CERN SPS. This indicates the onset of deconfinement in nucleus-nucleus collisions at about 30 A GeV.

  11. Integration of semicircular canal and otolith information for multisensory orientation stimuli

    NASA Technical Reports Server (NTRS)

    Ormsby, C. C.; Young, L. R.

    1977-01-01

    This paper presents a model for the perception of dynamic orientation resulting from stimuli which involve both the otoliths and the semicircular canals. The model was applied to several multisensory stimuli and its predictions evaluated. In all cases, the model predictions were in substantial agreement with the known illusions or with the relevant experimental data.

  12. A Teacher's Guide to Multisensory Learning: Improving Literacy by Engaging the Senses

    ERIC Educational Resources Information Center

    Baines, Lawrence

    2008-01-01

    Discover how teachers can motivate students and help them retain more knowledge longer by using sight, sound, smell, taste, touch, and movement in the classroom. In this first-ever guide to multisensory learning, author Lawrence Baines explains how teachers in every grade and subject can change curriculum from a series of assignments to a series…

  13. Multi-sensory Contexts and Support in Science for Special Needs Pupils.

    ERIC Educational Resources Information Center

    Bancroft, Jill

    1999-01-01

    Describes an activity that was adapted from the Chemical Industry Education Center (CIEC) materials for use with special-needs students. The activity, "Transporting Chocolate," addresses a real-world problem and makes use of multisensory contexts to maximize the student's capabilities while minimizing limitations. (WRM)

  14. A Multisensory Approach to Teach Arabic Decoding to Students with Dyslexia

    ERIC Educational Resources Information Center

    Hazoury, Katia H.; Oweini, Ahmad A.; Bahous, Rima

    2009-01-01

    This paper proposes a technique for teaching decoding of the Arabic language to Arab dyslexic students following the multisensory, systematic, explicit phonics approach and based in part on the Orton-Gillingham approach. This technique emphasizes vocabulary controlled, font-modified, cumulative, color-coded reading materials, and orthographic…

  15. "Magic Day": Multi-Disciplinary, Multi-Sensory Awareness Gathered and Integrated into the Curriculum.

    ERIC Educational Resources Information Center

    Wall, Guy; And Others

    This document contains a comprehensive set of activities that serve to integrate all the curricular areas commonly taught in elementary schools. The 45 activities are designed to encourage multi-disciplinary and multi-sensory learning experiences in a cemetery. In addition to their use in cemeteries, these field tested activities may also be…

  16. Effects of Multisensory Environments on Stereotyped Behaviours Assessed as Maintained by Automatic Reinforcement

    ERIC Educational Resources Information Center

    Hill, Lindsay; Trusler, Karen; Furniss, Frederick; Lancioni, Giulio

    2012-01-01

    Background: The aim of the present study was to evaluate the effects of the sensory equipment provided in a multi-sensory environment (MSE) and the level of social contact provided on levels of stereotyped behaviours assessed as being maintained by automatic reinforcement. Method: Stereotyped and engaged behaviours of two young people with severe…

  17. An Empirical Evaluation of an Interactive Multisensory Environment for Children with Disability.

    ERIC Educational Resources Information Center

    Houghton, Stephen; Douglas, Graham; Brigg, John; Langsford, Shane; Powell, Lesley; West, John; Chapman, Annaliese; Kellner, Rick

    1998-01-01

    Seventeen students with severe disability (ages 5 to 18) were assessed on Foundation Outcome Statement (FOS) Skills and subsequently exposed to an interactive multi-sensory environment which included equipment for light and visual stimulation and touch/tactile activities. Students increased in their number of FOS Skills immediately following…

  18. Independent mechanisms for ventriloquism and multisensory integration as revealed by theta-burst stimulation.

    PubMed

    Bertini, Caterina; Leo, Fabrizio; Avenanti, Alessio; Làdavas, Elisabetta

    2010-05-01

    The visual and auditory systems often concur to create a unified perceptual experience and to determine the localization of objects in the external world. Co-occurring auditory and visual stimuli in spatial coincidence are known to enhance performance of auditory localization due to the integration of stimuli from different sensory channels (i.e. multisensory integration). However, auditory localization of audiovisual stimuli presented at spatial disparity might also induce a mislocalization of the sound towards the visual stimulus (i.e. ventriloquism effect). Using repetitive transcranial magnetic stimulation we tested the role of right temporoparietal (rTPC), right occipital (rOC) and right posterior parietal (rPPC) cortex in an auditory localization task in which indices of ventriloquism and multisensory integration were computed. We found that suppression of rTPC excitability by means of continuous theta-burst stimulation (cTBS) reduced multisensory integration. No similar effect was found for cTBS over rOC. Moreover, inhibition of rOC, but not of rTPC, suppressed the visual bias in the contralateral hemifield. In contrast, cTBS over rPPC did not produce any modulation of ventriloquism or integrative effects. The double dissociation found in the present study suggests that ventriloquism and audiovisual multisensory integration are functionally independent phenomena and may be underpinned by partially different neural circuits.

  19. Beta/Gamma Oscillations and Event-Related Potentials Indicate Aberrant Multisensory Processing in Schizophrenia.

    PubMed

    Balz, Johanna; Roa Romero, Yadira; Keil, Julian; Krebber, Martin; Niedeggen, Michael; Gallinat, Jürgen; Senkowski, Daniel

    2016-01-01

    Recent behavioral and neuroimaging studies have suggested multisensory processing deficits in patients with schizophrenia (SCZ). Thus far, the neural mechanisms underlying these deficits are not well understood. Previous studies with unisensory stimulation have shown altered neural oscillations in SCZ; such altered oscillations could contribute to aberrant multisensory processing in this patient group. To test this assumption, we conducted an electroencephalography (EEG) study in 15 SCZ and 15 control participants in whom we examined neural oscillations and event-related potentials (ERPs) in the sound-induced flash illusion (SIFI). In the SIFI, multiple auditory stimuli presented alongside a single visual stimulus can induce the illusory percept of multiple visual stimuli. In SCZ and control participants we compared ERPs and neural oscillations between trials that induced an illusion and trials that did not. On the behavioral level, SCZ (55.7%) and control participants (55.4%) did not significantly differ in illusion rates. The analysis of ERPs revealed diminished amplitudes and altered multisensory processing in SCZ compared to controls around 135 ms after stimulus onset. Moreover, the analysis of neural oscillations revealed altered 25-35 Hz power between 100 and 150 ms over occipital scalp in SCZ compared to controls. Our findings extend previous observations of aberrant neural oscillations in unisensory perception paradigms. They suggest that altered ERPs and altered occipital beta/gamma band power reflect aberrant multisensory processing in SCZ.

  20. The recalibration patterns of perceptual synchrony and multisensory integration after exposure to asynchronous speech.

    PubMed

    Yuan, Xiangyong; Bi, Cuihua; Yin, Huazhan; Li, Baolin; Huang, Xiting

    2014-05-21

    Perceptual synchrony and multisensory integration both vary as a function of stimulus onset asynchrony, but evidence from behavioral, patient, and lesion studies supports some dissociation between these two processes. Although it has been found that both perceptual synchrony and multisensory integration are recalibrated after exposure to asynchronous multisensory stimuli, no studies have directly compared these two recalibration patterns. We addressed this by using McGurk speech and requiring participants to perform simultaneity judgments and a syllable identification task in separate sessions. The results revealed that after exposure to asynchrony, both perceptual synchrony and McGurk fusion shifted toward the temporal lag. The recalibration aftereffects (i.e., the magnitudes of the shifts) of the two processes were neither significantly different nor significantly correlated. In addition, McGurk fusion increased strongly in the direction of the temporal lag, which could not be fully explained by the fusion shifts. Thus, the present research implies that the recalibration patterns of explicit and implicit timing, represented by perceptual synchrony and multisensory integration respectively, show both similarities and differences.

  1. Assessing the Role of the ‘Unity Assumption’ on Multisensory Integration: A Review

    PubMed Central

    Chen, Yi-Chuan; Spence, Charles

    2017-01-01

    There has been longstanding interest from both experimental psychologists and cognitive neuroscientists in the potential modulatory role of various top–down factors on multisensory integration/perception in humans. One such top–down influence, often referred to in the literature as the ‘unity assumption,’ is thought to occur in those situations in which an observer considers that various of the unisensory stimuli that they have been presented with belong to one and the same object or event (Welch and Warren, 1980). Here, we review the possible factors that may lead to the emergence of the unity assumption. We then critically evaluate the evidence concerning the consequences of the unity assumption from studies of the spatial and temporal ventriloquism effects, from the McGurk effect, and from the Colavita visual dominance paradigm. The research that has been published to date using these tasks provides support for the claim that the unity assumption influences multisensory perception under at least a subset of experimental conditions. We then consider whether the notion has been superseded in recent years by the introduction of priors in Bayesian causal inference models of human multisensory perception. We suggest that the prior of common cause (that is, the prior concerning whether multisensory signals originate from the same source or not) offers the most useful way to quantify the unity assumption as a continuous cognitive variable. PMID:28408890

  2. The maxillary palp of aedes aegypti, a model of multisensory integration

    USDA-ARS?s Scientific Manuscript database

    Female yellow-fever mosquitoes, Aedes aegypti, are obligate blood-feeders and vectors of the pathogens that cause dengue fever, yellow fever and Chikungunya. This feeding behavior concludes a series of multisensory events guiding the mosquito to its host from a distance. The antennae and maxillary...

  3. Multisensory adaptation of spatial-to-motor transformations in children with developmental coordination disorder.

    PubMed

    King, Bradley R; Kagerer, Florian A; Harring, Jeffrey R; Contreras-Vidal, Jose L; Clark, Jane E

    2011-07-01

    Recent research has demonstrated that adaptation to a visuomotor distortion systematically influenced movements to auditory targets in adults and typically developing (TD) children, suggesting that the adaptation of spatial-to-motor transformations for reaching movements is multisensory (i.e., generalizable across sensory modalities). The multisensory characteristics of these transformations in children with developmental coordination disorder (DCD) have not been examined. Given that previous research has demonstrated that children with DCD have deficits in sensorimotor integration, these children may also have impairments in the formation of multisensory spatial-to-motor transformations for target-directed arm movements. To investigate this hypothesis, children with and without DCD executed discrete arm movements to visual and acoustic targets prior to and following exposure to an abrupt visual feedback rotation. Results demonstrated that the magnitudes of the visual aftereffects were equivalent in the TD children and the children with DCD, indicating that both groups of children adapted similarly to the visuomotor perturbation. Moreover, the influence of visuomotor adaptation on auditory-motor performance was similar in the two groups of children. This suggests that the multisensory processes underlying adaptation of spatial-to-motor transformations are similar in children with DCD and TD children.

  4. Microcontroller based fibre-optic visual presentation system for multisensory neuroimaging.

    PubMed

    Kurniawan, Veldri; Klemen, Jane; Chambers, Christopher D

    2011-10-30

    Presenting visual stimuli in physical 3D space during fMRI experiments carries significant technical challenges. Certain types of multisensory visuotactile experiments and visuomotor tasks require presentation of visual stimuli in peripersonal space, which cannot be accommodated by ordinary projection screens or binocular goggles. However, light points produced by a group of LEDs can be transmitted through fibre-optic cables and positioned anywhere inside the MRI scanner. Here we describe the design and implementation of a microcontroller-based programmable digital device for controlling fibre-optically transmitted LED lights from a PC. The main feature of this device is the ability to independently control the colour, brightness, and timing of each LED. Moreover, the device was designed in a modular and extensible way, which enables easy adaptation for various experimental paradigms. The device was tested and validated in three fMRI experiments involving basic visual perception, a simple colour discrimination task, and a blocked multisensory visuo-tactile task. The results revealed significant lateralized activation in occipital cortex of all participants, a reliable response in ventral occipital areas to colour stimuli elicited by the device, and strong activations in multisensory brain regions in the multisensory task. Overall, these findings confirm the suitability of this device for presenting complex fibre-optic visual and cross-modal stimuli inside the scanner.

  5. Program Evaluation of a School District's Multisensory Reading Initiative

    ERIC Educational Resources Information Center

    Asip, Michael Patrick

    2012-01-01

    The purpose of this study was to conduct a formative program evaluation of a school district's multisensory reading initiative. The mixed methods study involved semi-structured interviews, online survey, focus groups, document review, and analysis of extant special education student reading achievement data. Participants included elementary…

  6. The Impact of Using Multi-Sensory Approach for Teaching Students with Learning Disabilities

    ERIC Educational Resources Information Center

    Obaid, Majeda Al Sayyed

    2013-01-01

    The purpose of this study is to investigate the effect of using the Multi-Sensory Approach for teaching students with learning disabilities on the sixth grade students' achievement in mathematics at Jordanian public schools. To achieve the purpose of the study, a pre/post-test was constructed to measure students' achievement in mathematics. The…

  7. Enhanced multisensory integration and motor reactivation after active motor learning of audiovisual associations.

    PubMed

    Butler, Andrew J; James, Thomas W; James, Karin Harman

    2011-11-01

    Everyday experience affords us many opportunities to learn about objects through multiple senses using physical interaction. Previous work has shown that active motor learning of unisensory items enhances memory and leads to the involvement of motor systems during subsequent perception. However, the impact of active motor learning on subsequent perception and recognition of associations among multiple senses has not been investigated. Twenty participants were included in an fMRI study that explored the impact of active motor learning on subsequent processing of unisensory and multisensory stimuli. Participants were exposed to visuo-motor associations between novel objects and novel sounds either through self-generated actions on the objects or by observing an experimenter produce the actions. Immediately after exposure, accuracy, RT, and BOLD fMRI measures were collected with unisensory and multisensory stimuli in associative perception and recognition tasks. Response times during audiovisual associative and unisensory recognition were enhanced by active learning, as was accuracy during audiovisual associative recognition. The difference in motor cortex activation between old and new associations was greater for the active than the passive group. Furthermore, functional connectivity between visual and motor cortices was stronger after active learning than passive learning. Active learning also led to greater activation of the fusiform gyrus during subsequent unisensory visual perception. Finally, brain regions implicated in audiovisual integration (e.g., STS) showed greater multisensory gain after active learning than after passive learning. Overall, the results show that active motor learning modulates the processing of multisensory associations.

  8. Improving Reading and Decoding Skills through the Use of Multisensory Teaching Strategies.

    ERIC Educational Resources Information Center

    O'Dea, Donna

    This paper describes a project for improving the reading and decoding skills of 23 high school students with reading learning disabilities. The targeted population consisted of high school students in a suburban community in the Midwest. The project used Auditory Discrimination in Depth, which is a multisensory program that develops…

  9. TIMING OF AUDIOVISUAL INPUTS TO THE PREFRONTAL CORTEX AND MULTISENSORY INTEGRATION

    PubMed Central

    ROMANSKI, L. M.; HWANG, J.

    2013-01-01

    A number of studies have demonstrated that the relative timing of audiovisual stimuli is especially important for multisensory integration of speech signals although the neuronal mechanisms underlying this complex behavior are unknown. Temporal coincidence and congruency are thought to underlie the successful merging of two inter-modal stimuli into a coherent perceptual representation. It has been previously shown that single neurons in the non-human primate prefrontal cortex integrate face and vocalization information. However, these multisensory responses and the degree to which they depend on temporal coincidence have yet to be determined. In this study we analyzed the response latency of ventrolateral prefrontal (VLPFC) neurons to face, vocalization and combined face–vocalization stimuli and an offset (asynchronous) version of the face–vocalization stimulus. Our results indicate that for most prefrontal multisensory neurons, the response latency for the vocalization was the shortest, followed by the combined face–vocalization stimuli. The face stimulus had the longest onset response latency. When tested with a dynamic face–vocalization stimulus that had been temporally offset (asynchronous) one-third of multisensory cells in VLPFC demonstrated a change in response compared to the response to the natural, synchronous face–vocalization movie. Our results indicate that prefrontal neurons are sensitive to the temporal properties of audiovisual stimuli. A disruption in the temporal synchrony of an audiovisual signal which results in a change in the firing of communication related prefrontal neurons could underlie the loss in intelligibility which occurs with asynchronous speech stimuli. PMID:22516006

  10. A Multisensory Aquatic Environment for Individuals with Intellectual/Developmental Disabilities

    ERIC Educational Resources Information Center

    Potter, Cindy; Erzen, Carol

    2008-01-01

    This article presents the eighth of a 12-part series exploring the benefits of aquatic therapy and recreation for people with special needs. Here, the authors describe the process of development and installation of an aquatic multisensory environment (MSE) and the many factors that one should consider for a successful result. There are many…

  11. A Multisensory Language Approach to the Introduction of the Alphabet to Hearing Impaired Preschoolers.

    ERIC Educational Resources Information Center

    Jaworski, Anne Porter; Schroder, Ann

    The project was designed to develop a multisensory, language-oriented curriculum to introduce the letters of the alphabet to six hearing impaired preschoolers. Every week a new letter is introduced via such tasks as art and cooking activities, snacks, beginning sound picture cards, yarn and lacing letters, sandpaper letters, alphabet string beads,…

  12. Technologically and Artistically Enhanced Multi-Sensory Computer-Programming Education

    ERIC Educational Resources Information Center

    Katai, Zoltan; Toth, Laszlo

    2010-01-01

    Over the last decades more and more research has analysed relatively new or rediscovered teaching-learning concepts like blended, hybrid, multi-sensory or technologically enhanced learning. This increased interest in these educational forms can be explained by new exciting discoveries in brain research and cognitive psychology, as well as by the…

  13. Behavioral States of Children with Severe Disabilities in the Multisensory Environment

    ERIC Educational Resources Information Center

    Tunson, Je'na; Candler, Catherine

    2010-01-01

    The purpose of this study was to examine the behavioral states of individual children for evidence of responsiveness within and without a multisensory environment (MSE). Three children in the age range of 3-10 years with severe multiple disabilities participated in the study. A single-system ABAB design was used. Participants' behavioral states,…

  14. Multisensory aversive stimuli differentially modulate negative feelings in near and far space.

    PubMed

    Taffou, Marine; Ondřej, Jan; O'Sullivan, Carol; Warusfel, Olivier; Dubal, Stéphanie; Viaud-Delmon, Isabelle

    2016-05-05

    Affect, space, and multisensory integration are processes that are closely linked. However, it is unclear whether the spatial location of emotional stimuli interacts with multisensory presentation to influence the emotional experience they induce in the perceiver. In this study, we used the unique advantages of virtual reality techniques to present potentially aversive crowd stimuli embedded in a natural context and to control their display in terms of sensory and spatial presentation. Individuals high in crowdphobic fear navigated in an auditory-visual virtual environment, in which they encountered virtual crowds presented through the visual channel, the auditory channel, or both. They reported the intensity of their negative emotional experience at a far distance and at a close distance from the crowd stimuli. Whereas auditory-visual presentation of close feared stimuli amplified negative feelings, auditory-visual presentation of distant feared stimuli did not. This suggests that spatial closeness allows multisensory processes to modulate the intensity of the emotional experience induced by aversive stimuli. Nevertheless, the specific role of auditory stimulation must be investigated to better understand this interaction between multisensory, affective, and spatial representation processes. This phenomenon may serve the implementation of defensive behaviors in response to aversive stimuli that are in a position to threaten an individual's feeling of security.

  15. Multi-sensory Contexts and Support in Science for Special Needs Pupils.

    ERIC Educational Resources Information Center

    Bancroft, Jill

    1999-01-01

    Describes an activity that was adapted from the Chemical Industry Education Center (CIEC) materials for use with special-needs students. The activity, "Transporting Chocolate," addresses a real-world problem and makes use of multisensory contexts to maximize the student's capabilities while minimizing limitations. (WRM)

  16. The Participation of Children with Multi-Sensory Impairment in Person-Centred Planning

    ERIC Educational Resources Information Center

    Taylor, Kim

    2007-01-01

    Consultation with pupils with learning disabilities through the use of person-centred planning methods is becoming increasingly common. However, little research has focused on pupils with multi-sensory impairment (MSI). Kim Taylor has taught children with special educational needs for over 25 years and holds a post-graduate diploma in…

  17. The Effects of Multisensory Imagery in Conjunction with Physical Movement Rehearsal on Golf Putting Performance

    ERIC Educational Resources Information Center

    Ploszay, A. J.; Gentner, Noah B.; Skinner, Christopher H.; Wrisberg, Craig A.

    2006-01-01

    A multiple-baseline design was used to evaluate the effects of a pre-shot putting routine on the putting performance of four NCAA Division I golfers. The routine involved a combination of multisensory imagery and simulated putting movements. Results suggested that the intervention was effective for some participants. Discussion focuses on…

  18. A dynamical framework to relate perceptual variability with multisensory information processing

    PubMed Central

    Thakur, Bhumika; Mukherjee, Abhishek; Sen, Abhijit; Banerjee, Arpan

    2016-01-01

    Multisensory processing involves participation of individual sensory streams, e.g., vision and audition, to facilitate perception of environmental stimuli. An experimental realization of the underlying complexity is captured by the “McGurk effect”: incongruent auditory and visual vocalization stimuli eliciting perception of illusory speech sounds. Further studies have established that the time delay between the onset of auditory and visual signals (AV lag) and perturbations in the unisensory streams are key variables that modulate perception. However, as of now only a few quantitative theoretical frameworks have been proposed to understand the interplay among these psychophysical variables or the neural systems-level interactions that govern perceptual variability. Here, we propose a dynamic systems model consisting of the basic ingredients of any multisensory processing: two unisensory sub-systems and one multisensory sub-system (nodes), as reported by several researchers. The nodes are connected such that biophysically inspired coupling parameters and time delays become key parameters of this network. We observed that zero AV lag results in maximum synchronization of the constituent nodes and that the degree of synchronization decreases for non-zero lags. The attractor states of this network can thus be interpreted as facilitating the stabilization of specific perceptual experiences. Thereby, the dynamic model presents a quantitative framework for understanding multisensory information processing. PMID:27502974
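
    The published model's equations are not given in the abstract; purely as a generic illustration of how an AV lag can reduce network synchronization in a delay-coupled system, the sketch below uses three Kuramoto-style phase oscillators (one per node) with a delayed auditory input to the multisensory node (all parameter values are arbitrary and not taken from the paper):

    ```python
    # Generic illustration (not the authors' model): delay-coupled phase oscillators.
    # The multisensory node receives the auditory phase with a lag; synchronization,
    # measured by the Kuramoto order parameter R, is highest at zero AV lag.
    import numpy as np

    def order_parameter(av_lag_ms, f=2.0, K=8.0, dt=1e-3, T=5.0):
        """Mean Kuramoto order parameter across the three nodes for a given AV lag."""
        n = int(T / dt)
        lag = int(av_lag_ms * 1e-3 / dt)
        omega = 2 * np.pi * f                      # shared natural frequency (rad/s)
        theta = np.zeros((n, 3))                   # columns: auditory, visual, multisensory
        for t in range(n - 1):
            a_delayed = theta[max(t - lag, 0), 0]  # auditory input arrives late
            m = theta[t, 2]
            dm = omega + K * (np.sin(a_delayed - m) + np.sin(theta[t, 1] - m))
            theta[t + 1] = theta[t] + dt * np.array([omega, omega, dm])
        tail = theta[n // 2:]                      # discard the transient
        return np.abs(np.exp(1j * tail).mean(axis=1)).mean()

    for lag_ms in (0, 50, 100, 200):
        print(f"AV lag {lag_ms:3d} ms -> R = {order_parameter(lag_ms):.3f}")
    ```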

  19. THE USE OF INDIVIDUALIZED MULTISENSORY MATERIALS TO DEVELOP A BASIC SIGHT VOCABULARY.

    ERIC Educational Resources Information Center

    CRAWFORD, FRANCES N.

    Two sets of multisensory devices were used to determine whether their individualized use would help retarded readers develop a basic sight vocabulary. Students who had spent 9 or 10 years in school and who were reading at the second-reader instructional level were given the Daniels Word Recognition List, Forms A and B, as pretests and post-tests.…

  20. Methods for Sight Word Recognition in Kindergarten: Traditional Flashcard Method vs. Multisensory Approach

    ERIC Educational Resources Information Center

    Phillips, William E.; Feng, Jay

    2012-01-01

    A quasi-experimental action research study with a pretest-posttest same-subject design was implemented to determine whether the flashcard method and the multisensory approach differ in their effect on kindergarteners' achievement in sight word recognition and, if there is any difference, which method is more effective. Instrumentation for pretest and…

  1. Temporal limits on rubber hand illusion reflect individuals' temporal resolution in multisensory perception.

    PubMed

    Costantini, Marcello; Robinson, Jeffrey; Migliorati, Daniele; Donno, Brunella; Ferri, Francesca; Northoff, Georg

    2016-12-01

    Synchronous, but not asynchronous, multisensory stimulation has been successfully employed to manipulate the experience of body ownership, as in the case of the rubber hand illusion. Hence, it has been assumed that the rubber hand illusion is bound by the same temporal rules as in multisensory integration. However, empirical evidence of a direct link between the temporal limits on the rubber hand illusion and those on multisensory integration is still lacking. Here we provide the first comprehensive evidence that individual susceptibility to the rubber hand illusion depends upon the individual temporal resolution in multisensory perception, as indexed by the temporal binding window. In particular, in two studies we showed that the degree of temporal asynchrony necessary to prevent the induction of the rubber hand illusion depends upon the individuals' sensitivity to perceiving asynchrony during visuo-tactile stimulation. That is, the larger the temporal binding window, as inferred from a simultaneity judgment task, the higher the level of asynchrony tolerated in the rubber hand illusion. Our results suggest that current neurocognitive models of body ownership can be enriched with a temporal dimension. Moreover, our results suggest that the different aspects of body ownership operate over different time scales.
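
    As a rough illustration of how a temporal binding window can be read off a simultaneity judgment task (a minimal sketch on made-up response proportions; window definitions vary across studies and are here taken, for illustration only, as twice the fitted Gaussian width):

    ```python
    # Hypothetical sketch: estimate a temporal binding window (TBW) by fitting a
    # Gaussian to the proportion of "simultaneous" responses as a function of
    # stimulus onset asynchrony (SOA). Data values are invented.
    import numpy as np
    from scipy.optimize import curve_fit

    soa = np.array([-400, -300, -200, -100, 0, 100, 200, 300, 400])     # ms
    p_simultaneous = np.array([0.05, 0.15, 0.45, 0.80, 0.95, 0.85, 0.55, 0.20, 0.08])

    def gaussian(x, amplitude, mu, sigma):
        return amplitude * np.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

    (amplitude, mu, sigma), _ = curve_fit(gaussian, soa, p_simultaneous, p0=[1.0, 0.0, 150.0])
    tbw = 2 * abs(sigma)               # one common (but not universal) definition
    print(f"point of subjective simultaneity ≈ {mu:.0f} ms, TBW ≈ {tbw:.0f} ms")
    ```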

  2. Temporo-nasal asymmetry in multisensory integration mediated by the Superior Colliculus.

    PubMed

    Bertini, Caterina; Leo, Fabrizio; Làdavas, Elisabetta

    2008-11-25

    Temporo-nasal asymmetry in visual responses has been observed in many behavioural studies. These observations have typically been attributed to the anatomical asymmetry of fibres projecting to the Superior Colliculus (SC), even though this attribution is debated. The present study investigates temporo-nasal asymmetry in multisensory integration, and, by exploiting the absence of S-cone input to the SC, measures a behavioural response dependent strictly on the activity of the SC itself. We used a redundant signal paradigm for simple reaction times, with visual stimuli (red or purple) presented in either the temporal or the nasal hemifield. Participants responded more quickly to concurrent audio-visual (AV) stimuli than to either an auditory or a visual stimulus alone, an established phenomenon known as the Redundant Target Effect (RTE). The nature of this effect was dependent on the colour of the visual stimuli, suggesting its modulation by collicular circuits. When spatially-coincident audio-visual stimuli were visible to the SC (i.e. red stimuli), the RTE depended on a neural coactivation mechanism, suggesting an integration of multisensory information. When using stimuli invisible to the SC (i.e. purple stimuli), the RTE depended only on a simple statistical facilitation effect, in which the two sensory stimuli were processed by independent channels. Finally, we demonstrate that the multisensory integration effect was stronger for stimuli presented to the temporal hemifield than to the nasal hemifield. Taken together, these findings suggested that multisensory stimulation can be differentially effective depending on specific stimulus parameters.

  3. Multi-Sensory Exercises: An Approach to Communicative Practice. 1975-1979.

    ERIC Educational Resources Information Center

    Kalivoda, Theodore B.

    A reprint of a 1975 article on multi-sensory exercises for communicative second language learning is presented. The article begins by noting that the use of drills as a language learning and practice technique had been lost in the trend toward communicative language teaching, but that drills can provide a means of gaining functional control of…

  4. Technologically and Artistically Enhanced Multi-Sensory Computer-Programming Education

    ERIC Educational Resources Information Center

    Katai, Zoltan; Toth, Laszlo

    2010-01-01

    Over the last decades more and more research has analysed relatively new or rediscovered teaching-learning concepts like blended, hybrid, multi-sensory or technologically enhanced learning. This increased interest in these educational forms can be explained by new exciting discoveries in brain research and cognitive psychology, as well as by the…

  5. The impact of multisensory instruction on learning letter names and sounds, word reading, and spelling.

    PubMed

    Schlesinger, Nora W; Gray, Shelley

    2017-03-02

    The purpose of this study was to investigate whether the use of simultaneous multisensory structured language instruction promoted better letter name and sound production, word reading, and word spelling for second-grade children with typical development (N = 6) or with dyslexia (N = 5) than structured language instruction alone. The use of non-English graphemes (letters) to represent two pretend languages was used to control for children's lexical knowledge. A multiple-baseline, multiple-probe-across-subjects single-case design, with an embedded alternating treatments design, was used to compare the efficacy of the multisensory and structured language interventions. Both interventions provided explicit, systematic phonics instruction; however, the multisensory intervention also utilized simultaneous engagement of at least two sensory modalities (visual, auditory, and kinesthetic/tactile). Participants' graphed data were visually analyzed, and individual Tau-U and weighted Tau-U effect sizes were calculated for the outcome variables of letter name production, letter sound production, word reading, and word spelling. The multisensory intervention did not provide an advantage over the structured intervention for participants with typical development or dyslexia. However, both interventions had an overall treatment effect for participants with typical development and dyslexia, although intervention effects varied by outcome variable.
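
    For readers unfamiliar with Tau-based effect sizes, the sketch below computes the basic non-overlap Tau between a baseline (A) and an intervention (B) phase on made-up data; the full Tau-U statistic additionally corrects for baseline trend, which is omitted here:

    ```python
    # Hypothetical sketch: basic non-overlap Tau for a single-case A-B comparison.
    # Tau-U proper also adjusts for baseline trend, which this sketch omits.
    import numpy as np

    baseline = np.array([2, 3, 2, 4, 3])               # e.g. correct letter names per probe
    intervention = np.array([5, 6, 5, 7, 8, 7])

    pairs = intervention[None, :] - baseline[:, None]  # every baseline-intervention pair
    tau = (np.sum(pairs > 0) - np.sum(pairs < 0)) / pairs.size
    print(f"Tau (A vs B non-overlap) = {tau:.2f}")
    ```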

  6. Teaching a foreign language using multisensory structured language techniques to at-risk learners: a review.

    PubMed

    Sparks, R L; Miller, K S

    2000-01-01

    An overview of multisensory structured language (MSL) techniques used to teach a foreign language to at-risk students is outlined. Research supporting the use of MSL techniques is reviewed. Specific activities using the MSL approach to teach the phonology/orthography, grammar and vocabulary of the foreign language as well as reading and communicative activities in the foreign language are presented.

  7. Beta/Gamma Oscillations and Event-Related Potentials Indicate Aberrant Multisensory Processing in Schizophrenia

    PubMed Central

    Balz, Johanna; Roa Romero, Yadira; Keil, Julian; Krebber, Martin; Niedeggen, Michael; Gallinat, Jürgen; Senkowski, Daniel

    2016-01-01

    Recent behavioral and neuroimaging studies have suggested multisensory processing deficits in patients with schizophrenia (SCZ). Thus far, the neural mechanisms underlying these deficits are not well understood. Previous studies with unisensory stimulation have shown altered neural oscillations in SCZ; such altered oscillations could contribute to aberrant multisensory processing in this patient group. To test this assumption, we conducted an electroencephalography (EEG) study in 15 SCZ and 15 control participants in whom we examined neural oscillations and event-related potentials (ERPs) in the sound-induced flash illusion (SIFI). In the SIFI, multiple auditory stimuli presented alongside a single visual stimulus can induce the illusory percept of multiple visual stimuli. In SCZ and control participants we compared ERPs and neural oscillations between trials that induced an illusion and trials that did not. On the behavioral level, SCZ (55.7%) and control participants (55.4%) did not significantly differ in illusion rates. The analysis of ERPs revealed diminished amplitudes and altered multisensory processing in SCZ compared to controls around 135 ms after stimulus onset. Moreover, the analysis of neural oscillations revealed altered 25–35 Hz power between 100 and 150 ms over occipital scalp in SCZ compared to controls. Our findings extend previous observations of aberrant neural oscillations in unisensory perception paradigms. They suggest that altered ERPs and altered occipital beta/gamma band power reflect aberrant multisensory processing in SCZ. PMID:27999553

  8. Multi-Sensory Storytelling: A Tool for Teaching or an Intervention Technique?

    ERIC Educational Resources Information Center

    Preece, David; Zhao, Yu

    2015-01-01

    This article reports on research undertaken to investigate how multi-sensory storytelling (MSST) was being used within schools for students with profound and multiple learning difficulties and other special educational needs. Semi-structured interviews (n = 27) and observations (n = 18) were undertaken in five schools in the East Midlands and…

  9. A Multisensory Aquatic Environment for Individuals with Intellectual/Developmental Disabilities

    ERIC Educational Resources Information Center

    Potter, Cindy; Erzen, Carol

    2008-01-01

    This article presents the eighth of a 12-part series exploring the benefits of aquatic therapy and recreation for people with special needs. Here, the authors describe the process of development and installation of an aquatic multisensory environment (MSE) and the many factors that one should consider for a successful result. There are many…

  10. An Empirical Evaluation of an Interactive Multisensory Environment for Children with Disability.

    ERIC Educational Resources Information Center

    Houghton, Stephen; Douglas, Graham; Brigg, John; Langsford, Shane; Powell, Lesley; West, John; Chapman, Annaliese; Kellner, Rick

    1998-01-01

    Seventeen students with severe disability (ages 5 to 18) were assessed on Foundation Outcome Statement (FOS) Skills and subsequently exposed to an interactive multi-sensory environment which included equipment for light and visual stimulation and touch/tactile activities. Students increased in their number of FOS Skills immediately following…

  11. The Clinical Effectiveness of a Multisensory Therapy on Clients with Developmental Disability

    ERIC Educational Resources Information Center

    Chan, Sally; Fung, Maggie Yuen; Tong, Chien Wai; Thompson, David

    2005-01-01

    Many clients in Hong Kong with developmental disabilities stay in mental hospitals because of mental disorders and behavioural problems. There is a need to identify strategies that promote psychological well-being and reduce problem behaviours in this group of clients. This study evaluates the impact of multisensory therapy on participants'…

  12. A Teacher's Guide to Multisensory Learning: Improving Literacy by Engaging the Senses

    ERIC Educational Resources Information Center

    Baines, Lawrence

    2008-01-01

    Discover how teachers can motivate students and help them retain more knowledge longer by using sight, sound, smell, taste, touch, and movement in the classroom. In this first-ever guide to multisensory learning, author Lawrence Baines explains how teachers in every grade and subject can change curriculum from a series of assignments to a series…

  13. Effects of Aging in Multisensory Integration: A Systematic Review

    PubMed Central

    de Dieuleveult, Alix L.; Siemonsma, Petra C.; van Erp, Jan B. F.; Brouwer, Anne-Marie

    2017-01-01

    Multisensory integration (MSI) is the integration by the brain of environmental information acquired through more than one sense. Accurate MSI has been shown to be a key component of successful aging and to be crucial for the processes underlying activities of daily living (ADLs). Problems in MSI could prevent older adults (OA) from aging in place and living independently. However, there is a need to know how to assess changes in MSI in individuals. This systematic review provides an overview of tests assessing the effect of age on MSI in the healthy elderly population (aged 60 years and older). A literature search was done in Scopus. Articles from the earliest records available to January 20, 2016, were eligible for inclusion if they assessed effects of aging on MSI in the healthy elderly population compared to younger adults (YA). These articles were rated for risk of bias with the Newcastle-Ottawa quality assessment. Out of 307 identified research articles, 49 articles were included for final review, describing 69 tests. The review indicated that OA maximize the use of multiple sources of information in comparison to YA (20 studies). In tasks that require more cognitive function, or when participants need to adapt rapidly to a situation, or when a dual task is added to the experiment, OA have problems selecting and integrating information properly as compared to YA (19 studies). Additionally, irrelevant or wrong information (i.e., distractors) has a greater impact on OA than on YA (21 studies). OA failing to weigh sensory information properly has not been described in previous reviews. Anatomical changes (i.e., reduction of brain volume and differences in the recruitment of brain areas) and information-processing changes (i.e., general cognitive slowing, inverse effectiveness, a larger time window of integration, deficits in attentional control and increased noise at baseline) can only partly explain the differences between OA and YA regarding MSI. Since we have an interest in

  14. Effects of Aging in Multisensory Integration: A Systematic Review.

    PubMed

    de Dieuleveult, Alix L; Siemonsma, Petra C; van Erp, Jan B F; Brouwer, Anne-Marie

    2017-01-01

    Multisensory integration (MSI) is the integration by the brain of environmental information acquired through more than one sense. Accurate MSI has been shown to be a key component of successful aging and to be crucial for processes underlying activities of daily living (ADLs). Problems in MSI could prevent older adults (OA) from aging in place and living independently. However, there is a need to know how to assess changes in MSI in individuals. This systematic review provides an overview of tests assessing the effect of age on MSI in the healthy elderly population (aged 60 years and older). A literature search was done in Scopus. Articles from the earliest records available to January 20, 2016, were eligible for inclusion if they assessed effects of aging on MSI in the healthy elderly population compared to younger adults (YA). These articles were rated for risk of bias with the Newcastle-Ottawa quality assessment. Out of 307 identified research articles, 49 articles were included for final review, describing 69 tests. The review indicated that OA maximize the use of multiple sources of information in comparison to YA (20 studies). In tasks that require more cognitive function, or when participants need to adapt rapidly to a situation, or when a dual task is added to the experiment, OA have problems selecting and integrating information properly as compared to YA (19 studies). Additionally, irrelevant or wrong information (i.e., distractors) has a greater impact on OA than on YA (21 studies). That OA fail to weigh sensory information properly has not been described in previous reviews. Anatomical changes (i.e., reduction of brain volume and differences of brain areas' recruitment) and information processing changes (i.e., general cognitive slowing, inverse effectiveness, larger time window of integration, deficits in attentional control and increased noise at baseline) can only partly explain the differences between OA and YA regarding MSI. Since we have an interest in

  15. Nuclear-Targeted Multifunctional Magnetic Nanoparticles for Photothermal Therapy.

    PubMed

    Peng, Haibao; Tang, Jing; Zheng, Rui; Guo, Guannan; Dong, Angang; Wang, Yajun; Yang, Wuli

    2017-01-27

    The pursuit of multifunctional, innovative, more efficient, and safer cancer treatment has gained increasing interest in the research of preclinical nanoparticle-mediated photothermal therapy (PTT). The cell nucleus is recognized as an ideal target for cancer treatment because it is where genetic information is stored and the transcription machinery resides. In this work, an efficient nuclear-targeted PTT strategy is proposed using transferrin and TAT peptide (TAT: YGRKKRRQRRR) conjugated monodisperse magnetic nanoparticles, which can be readily functionalized and stabilized for potential diagnostic and therapeutic applications. The monodisperse magnetic nanoparticles exhibit high photothermal conversion efficiency (≈37%) and considerable photothermal stability. They also show a high magnetization value and transverse relaxivity (207.1 mM⁻¹ s⁻¹), which could be applied for magnetic resonance imaging. The monodisperse magnetic nanoparticles conjugated with TAT peptides can efficiently target the nucleus, enabling imaging-guided and efficient killing of cancer cells. Therefore, this work may present a practicable strategy to develop subcellular organelle-targeted PTT agents for simultaneous cancer targeting, imaging, and therapy.

  16. Multisensory Stimulation to Improve Low- and Higher-Level Sensory Deficits after Stroke: A Systematic Review.

    PubMed

    Tinga, Angelica Maria; Visser-Meily, Johanna Maria Augusta; van der Smagt, Maarten Jeroen; Van der Stigchel, Stefan; van Ee, Raymond; Nijboer, Tanja Cornelia Wilhelmina

    2016-03-01

    The aim of this systematic review was to integrate and assess evidence for the effectiveness of multisensory stimulation (i.e., stimulating at least two of the following sensory systems: visual, auditory, and somatosensory) as a possible rehabilitation method after stroke. Evidence was considered with a focus on low-level, perceptual (visual, auditory and somatosensory deficits), as well as higher-level, cognitive, sensory deficits. We referred to the electronic databases Scopus and PubMed to search for articles that were published before May 2015. Studies were included that evaluated the effects of multisensory stimulation on patients with low- or higher-level sensory deficits caused by stroke. Twenty-one studies were included in this review and the quality of these studies was assessed (based on eight elements: randomization, inclusion of control patient group, blinding of participants, blinding of researchers, follow-up, group size, reporting effect sizes, and reporting time post-stroke). Twenty of the twenty-one included studies demonstrate beneficial effects on low- and/or higher-level sensory deficits after stroke. Notwithstanding these beneficial effects, the quality of the studies is insufficient for a valid conclusion that multisensory stimulation can be successfully applied as an effective intervention. A valuable and necessary next step would be to set up well-designed randomized controlled trials to examine the effectiveness of multisensory stimulation as an intervention for low- and/or higher-level sensory deficits after stroke. Finally, we consider the potential mechanisms of multisensory stimulation for rehabilitation to guide this future research.

  17. Early, Low-Level Auditory-Somatosensory Multisensory Interactions Impact Reaction Time Speed

    PubMed Central

    Sperdin, Holger F.; Cappe, Céline; Foxe, John J.; Murray, Micah M.

    2009-01-01

    Several lines of research have documented early-latency non-linear response interactions between audition and touch in humans and non-human primates. That these effects have been obtained under anesthesia, passive stimulation, as well as speeded reaction time tasks would suggest that some multisensory effects are not directly influencing behavioral outcome. We investigated whether the initial non-linear neural response interactions have a direct bearing on the speed of reaction times. Electrical neuroimaging analyses were applied to event-related potentials in response to auditory, somatosensory, or simultaneous auditory–somatosensory multisensory stimulation that were in turn averaged according to trials leading to fast and slow reaction times (using a median split of individual subject data for each experimental condition). Responses to multisensory stimulus pairs were contrasted with each unisensory response as well as summed responses from the constituent unisensory conditions. Behavioral analyses indicated that neural response interactions were only implicated in the case of trials producing fast reaction times, as evidenced by facilitation in excess of probability summation. In agreement, supra-additive non-linear neural response interactions between multisensory and the sum of the constituent unisensory stimuli were evident over the 40–84 ms post-stimulus period only when reaction times were fast, whereas subsequent effects (86–128 ms) were observed independently of reaction time speed. Distributed source estimations further revealed that these earlier effects followed from supra-additive modulation of activity within posterior superior temporal cortices. These results indicate the behavioral relevance of early multisensory phenomena. PMID:19404410
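
    The median-split and supra-additivity logic described in this abstract can be sketched in a few lines. This is a hedged illustration with assumed variable names and array shapes, not the authors' electrical neuroimaging pipeline:

    import numpy as np

    def median_split_erps(rts, epochs):
        # epochs: single-trial array of shape (trials, channels, time)
        rts = np.asarray(rts)
        fast = rts <= np.median(rts)
        return epochs[fast].mean(axis=0), epochs[~fast].mean(axis=0)

    def nonlinear_interaction(erp_multisensory, erp_auditory, erp_somatosensory):
        # Supra-additive interaction: multisensory ERP minus the summed unisensory ERPs
        return erp_multisensory - (erp_auditory + erp_somatosensory)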

  18. Antinucleon-nucleus interactions

    SciTech Connect

    Dover, C.B.

    1987-01-01

    Recent experimental and theoretical results on anti p-nucleus interactions are reviewed. We focus on determinations of the anti p optical potential from elastic scattering, the use of (anti p, anti p') inelastic scattering to reveal aspects of the spin-isospin dependence of N anti N amplitudes, and some puzzling features of (anti p, anti n) charge exchange reactions on nuclei. 47 refs., 7 figs.

  19. Synthetic approaches to multifunctional indenes

    PubMed Central

    López-Pérez, Sara; Dinarès, Immaculada

    2011-01-01

    The synthesis of multifunctional indenes with at least two different functional groups has not yet been extensively explored. Among the plausible synthetic routes to 3,5-disubstituted indenes bearing two different functional groups, such as the [3-(aminoethyl)inden-5-yl]amines, a reasonable pathway involves the (5-nitro-3-indenyl)acetamides as key intermediates. Although several multistep synthetic approaches can be applied to obtain these advanced intermediates, we describe herein their preparation by an aldol-type reaction between 5-nitroindan-1-ones and the lithium salt of N,N-disubstituted acetamides, followed immediately by dehydration with acid. This classical condensation process, which is neither simple nor trivial despite its apparent directness, permits an efficient entry to a variety of indene-based molecular modules, which could be adapted to a range of functionalized indanones. PMID:22238553

  20. Toward multifunctional "clickable" diamond nanoparticles.

    PubMed

    Khanal, Manakamana; Turcheniuk, Volodymyr; Barras, Alexandre; Rosay, Elodie; Bande, Omprakash; Siriwardena, Aloysius; Zaitsev, Vladimir; Pan, Guo-Hui; Boukherroub, Rabah; Szunerits, Sabine

    2015-04-07

    Nanodiamonds (NDs) are among the most promising new carbon based materials for biomedical applications, and the simultaneous integration of various functions onto NDs is an urgent necessity. A multifunctional nanodiamond based formulation is proposed here. Our strategy relies on orthogonal surface modification using different dopamine anchors. NDs simultaneously functionalized with triethylene glycol (EG) and azide (-N3) functions were fabricated through a stoichiometrically controlled integration of the dopamine ligands onto the surface of hydroxylated NDs. The presence of EG functionalities rendered NDs soluble in water and biological media, while the -N3 group allowed postsynthetic modification of the NDs using "click" chemistry. As a proof of principle, alkynyl terminated di(amido amine) ligands were linked to these ND particles.

  1. Metalenses: Versatile multifunctional photonic components.

    PubMed

    Khorasaninejad, Mohammadreza; Capasso, Federico

    2017-10-05

    Recent progress in metasurface designs fueled by advanced-fabrication techniques has led to the realization of ultrathin, lightweight, and flat lenses (metalenses) with unprecedented functionalities. Due to straightforward fabrication, generally requiring a single-step lithography, and the possibility of vertical integration, these planar lenses can potentially replace or complement their conventional refractive and diffractive counterparts, leading to further miniaturization of high-performance optical devices and systems. Here, we give a brief overview of the evolution of metalenses with an emphasis on the visible and near-infrared spectrum and summarize their important features: diffraction-limited focusing, high-quality imaging and multifunctionalities. Future challenges, including the correction of aberrations, as well as current issues and solutions are discussed. We conclude by providing an outlook of this technology platform and by identifying future promising directions. Copyright © 2017, American Association for the Advancement of Science.

  2. Multifunctionalities driven by ferroic domains

    SciTech Connect

    Yang, J. C.; Huang, Y. L.; Chu, Y. H.; He, Q.

    2014-08-14

    Considerable attention has been paid to ferroic systems in pursuit of advanced applications in past decades. Most recently, the emergence and development of multiferroics, which exhibit the coexistence of different ferroic natures, has offered a new route to create functionalities in the system. In this manuscript, we step from domain engineering to explore a roadmap for discovering intriguing phenomena and multifunctionalities driven by periodic domain patterns. As-grown periodic domains, offering exotic order parameters, periodic local perturbations and the capability of tailoring local spin, charge, orbital and lattice degrees of freedom, are introduced as modeling templates for fundamental studies and novel applications. We discuss related significant findings on ferroic domain, nanoscopic domain walls, and conjunct heterostructures based on the well-organized domain patterns, and end with future prospects and challenges in the field.

  3. Multifunction display system, volume 1

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The design and construction of a multifunction display man/machine interface for use with a 4 pi IBM-360 System are described. The system is capable of displaying superimposed volatile alphanumeric and graphical data on a 512 x 512 element plasma panel, and holographically stored multicolor archival information. The volatile data may be entered from a keyboard or by means of an I/O interface to the 360 system. A 2-page memory local to the display is provided for storing the entered data. The archival data is stored as a phase hologram on a vinyl tape strip. This data is accessible by means of a rapid transport system which responds to inputs provided by the I/O channel on the keyboard. As many as 500 frames may be stored on a tape strip for access in under 6 seconds.

  4. Multifunctional composites for energy storage

    NASA Astrophysics Data System (ADS)

    Shuvo, Mohammad Arif I.; Karim, Hasanul; Rajib, Md; Delfin, Diego; Lin, Yirong

    2014-03-01

    Electrochemical super-capacitors have become one of the most important topics in both academia and industry as novel energy storage devices because of their high power density, long life cycles, and high charge/discharge efficiency. Recently, there has been an increasing interest in the development of multifunctional structural energy storage devices such as structural super-capacitors for applications in aerospace, automobiles and portable electronics. These multifunctional structural super-capacitors provide lighter structures combining energy storage and load bearing functionalities. Due to their superior materials properties, carbon fiber composites have been widely used in structural applications for aerospace and automotive industries. Besides, carbon fiber has good electrical conductivity which will provide lower equivalent series resistance; therefore, it can be an excellent candidate for structural energy storage applications. Hence, this paper is focused on performing a pilot study for using nanowire/carbon fiber hybrids as building materials for structural energy storage, aiming at enhancing the charge/discharge rate and energy density. This hybrid material combines the high specific surface area of carbon fiber and pseudo-capacitive effect of metal oxide nanowires which were grown hydrothermally in an aligned fashion on carbon fibers. The aligned nanowire array could provide a higher specific surface area that leads to high electrode-electrolyte contact area and fast ion diffusion rates. Scanning Electron Microscopy (SEM) and X-ray Diffraction (XRD) measurements were used for the initial characterization of this nanowire/carbon fiber hybrid material system. Electrochemical testing has been performed using a potentio-galvanostat. The results show that gold-sputtered nanowire hybrid carbon fiber provides 65.9% better performance than bare carbon fiber cloth as a super-capacitor.
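
    As a side note on the electrochemical testing mentioned above, specific capacitance is commonly extracted from a galvanostatic discharge curve via C = I·Δt/(m·ΔV). The sketch below is a generic illustration with made-up example numbers, not the authors' analysis:

    def specific_capacitance(current_a, discharge_time_s, mass_g, voltage_window_v):
        # C = I * dt / (m * dV), reported in farads per gram of active material
        return current_a * discharge_time_s / (mass_g * voltage_window_v)

    print(specific_capacitance(0.001, 120.0, 0.002, 0.8))  # illustrative values only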

  5. The variability of multisensory processes of natural stimuli in human and non-human primates in a detection task

    PubMed Central

    Juan, Cécile; Cappe, Céline; Alric, Baptiste; Roby, Benoit; Gilardeau, Sophie; Barone, Pascal; Girard, Pascal

    2017-01-01

    Background: Behavioral studies in both humans and animals generally converge to the dogma that multisensory integration improves reaction times (RTs) in comparison to unimodal stimulation. These multisensory effects depend on diverse conditions among which the most studied were the spatial and temporal congruences. Further, most of the studies use relatively simple stimuli, while in everyday life, we are confronted with a large variety of complex stimulations constantly changing our attentional focus over time, a modality switch that can impact on stimuli detection. In the present study, we examined the potential sources of the variability in reaction times and multisensory gains with respect to the intrinsic features of a large set of natural stimuli. Methodology/Principal findings: Rhesus macaque monkeys and human subjects performed a simple audio-visual stimulus detection task in which a large collection of unimodal and bimodal natural stimuli with semantic specificities was presented at different saliencies. Although we were able to reproduce the well-established redundant signal effect, we failed to reveal a systematic violation of the race model which is considered to demonstrate multisensory integration. In both monkey and human species, our study revealed a large range of multisensory gains, with negative and positive values. While modality switch has clear effects on reaction times, one of the main causes of the variability of multisensory gains appeared to be linked to the intrinsic physical parameters of the stimuli. Conclusion/Significance: Based on the variability of multisensory benefits, our results suggest that the neuronal mechanisms responsible for the redundant effect (interactions vs. integration) are highly dependent on the stimulus complexity, suggesting different implications of uni- and multisensory brain regions. Further, in a simple detection task, the semantic values of individual stimuli tend to have no significant impact on task

  6. Keeping in touch with the visual system: spatial alignment and multisensory integration of visual-somatosensory inputs.

    PubMed

    Mahoney, Jeannette R; Molholm, Sophie; Butler, John S; Sehatpour, Pejman; Gomez-Ramirez, Manuel; Ritter, Walter; Foxe, John J

    2015-01-01

    Correlated sensory inputs coursing along the individual sensory processing hierarchies arrive at multisensory convergence zones in cortex where inputs are processed in an integrative manner. The exact hierarchical level of multisensory convergence zones and the timing of their inputs are still under debate, although increasingly, evidence points to multisensory integration (MSI) at very early sensory processing levels. While MSI is said to be governed by stimulus properties including space, time, and magnitude, violations of these rules have been documented. The objective of the current study was to determine, both psychophysically and electrophysiologically, whether differential visual-somatosensory (VS) integration patterns exist for stimuli presented to the same versus opposite hemifields. Using high-density electrical mapping and complementary psychophysical data, we examined multisensory integrative processing for combinations of visual and somatosensory inputs presented to both left and right spatial locations. We assessed how early during sensory processing VS interactions were seen in the event-related potential and whether spatial alignment of the visual and somatosensory elements resulted in differential integration effects. Reaction times to all VS pairings were significantly faster than those to the unisensory conditions, regardless of spatial alignment, pointing to engagement of integrative multisensory processing in all conditions. In support, electrophysiological results revealed significant differences between multisensory simultaneous VS and summed V + S responses, regardless of the spatial alignment of the constituent inputs. Nonetheless, multisensory effects were earlier in the aligned conditions, and were found to be particularly robust in the case of right-sided inputs (beginning at just 55 ms). In contrast to previous work on audio-visual and audio-somatosensory inputs, the current work suggests a degree of spatial specificity to the earliest

  7. Keeping in touch with the visual system: spatial alignment and multisensory integration of visual-somatosensory inputs

    PubMed Central

    Mahoney, Jeannette R.; Molholm, Sophie; Butler, John S.; Sehatpour, Pejman; Gomez-Ramirez, Manuel; Ritter, Walter; Foxe, John J.

    2015-01-01

    Correlated sensory inputs coursing along the individual sensory processing hierarchies arrive at multisensory convergence zones in cortex where inputs are processed in an integrative manner. The exact hierarchical level of multisensory convergence zones and the timing of their inputs are still under debate, although increasingly, evidence points to multisensory integration (MSI) at very early sensory processing levels. While MSI is said to be governed by stimulus properties including space, time, and magnitude, violations of these rules have been documented. The objective of the current study was to determine, both psychophysically and electrophysiologically, whether differential visual-somatosensory (VS) integration patterns exist for stimuli presented to the same versus opposite hemifields. Using high-density electrical mapping and complementary psychophysical data, we examined multisensory integrative processing for combinations of visual and somatosensory inputs presented to both left and right spatial locations. We assessed how early during sensory processing VS interactions were seen in the event-related potential and whether spatial alignment of the visual and somatosensory elements resulted in differential integration effects. Reaction times to all VS pairings were significantly faster than those to the unisensory conditions, regardless of spatial alignment, pointing to engagement of integrative multisensory processing in all conditions. In support, electrophysiological results revealed significant differences between multisensory simultaneous VS and summed V + S responses, regardless of the spatial alignment of the constituent inputs. Nonetheless, multisensory effects were earlier in the aligned conditions, and were found to be particularly robust in the case of right-sided inputs (beginning at just 55 ms). In contrast to previous work on audio-visual and audio-somatosensory inputs, the current work suggests a degree of spatial specificity to the earliest

  8. The variability of multisensory processes of natural stimuli in human and non-human primates in a detection task.

    PubMed

    Juan, Cécile; Cappe, Céline; Alric, Baptiste; Roby, Benoit; Gilardeau, Sophie; Barone, Pascal; Girard, Pascal

    2017-01-01

    Behavioral studies in both humans and animals generally converge to the dogma that multisensory integration improves reaction times (RTs) in comparison to unimodal stimulation. These multisensory effects depend on diverse conditions among which the most studied were the spatial and temporal congruences. Further, most of the studies use relatively simple stimuli, while in everyday life, we are confronted with a large variety of complex stimulations constantly changing our attentional focus over time, a modality switch that can impact on stimuli detection. In the present study, we examined the potential sources of the variability in reaction times and multisensory gains with respect to the intrinsic features of a large set of natural stimuli. Rhesus macaque monkeys and human subjects performed a simple audio-visual stimulus detection task in which a large collection of unimodal and bimodal natural stimuli with semantic specificities was presented at different saliencies. Although we were able to reproduce the well-established redundant signal effect, we failed to reveal a systematic violation of the race model which is considered to demonstrate multisensory integration. In both monkey and human species, our study revealed a large range of multisensory gains, with negative and positive values. While modality switch has clear effects on reaction times, one of the main causes of the variability of multisensory gains appeared to be linked to the intrinsic physical parameters of the stimuli. Based on the variability of multisensory benefits, our results suggest that the neuronal mechanisms responsible for the redundant effect (interactions vs. integration) are highly dependent on the stimulus complexity, suggesting different implications of uni- and multisensory brain regions. Further, in a simple detection task, the semantic values of individual stimuli tend to have no significant impact on task performances, an effect which is probably present in more cognitive tasks.
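
    The race-model (probability summation) test referred to above is typically implemented as Miller's inequality: under a race account, the multisensory RT distribution function cannot exceed the sum of the unisensory ones. A minimal sketch, assuming per-condition vectors of reaction times (not the authors' code):

    import numpy as np

    def race_model_violation(rt_av, rt_a, rt_v, probs=np.linspace(0.05, 0.95, 19)):
        # Positive values indicate the audio-visual CDF exceeds the race-model bound
        rt_av, rt_a, rt_v = (np.asarray(x) for x in (rt_av, rt_a, rt_v))
        t = np.quantile(rt_av, probs)  # evaluate CDFs at audio-visual RT quantiles
        ecdf = lambda x: np.mean(x[:, None] <= t[None, :], axis=0)
        bound = np.minimum(ecdf(rt_a) + ecdf(rt_v), 1.0)  # Miller's bound
        return ecdf(rt_av) - bound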

  9. The effects of multisensory therapy on behaviour of adult clients with developmental disabilities--a systematic review.

    PubMed

    Chan, Sally Wai-chi; Thompson, David R; Chau, Janita P C; Tam, Wilson W S; Chiu, Ivy W S; Lo, Susanne H S

    2010-01-01

    There is a growing use of multisensory therapy in enhancing the sense of well-being and reducing challenging or stereotypic self-stimulating behaviour in people with a developmental disability. This review aimed to present the best available evidence on the effect of multisensory therapy in adult clients with developmental disabilities on the frequency of challenging behavior, stereotypic self-stimulating behavior, and positive behaviour; and changes of physiological measures. Systematic review. A search of electronic databases of published research studies (January 1985-December 2008) was conducted, using appropriate search terms. The reference lists and bibliographies of retrieved articles were reviewed to identify research not located through other search strategies. Studies that investigated the effects of a multisensory environment in relation to outcomes were examined. Data were extracted independently by two reviewers. Methodological quality was also assessed by two reviewers against key quality criteria. One hundred and thirty-two studies were identified from the database search, of which 17 met the inclusion criteria for review. The evidence supports that participants displayed more positive behaviour after multisensory therapy sessions. There is no strong evidence supporting that multisensory therapy could help in reducing challenging behaviour or stereotypic self-stimulating behaviour. This systematic review demonstrates a beneficial effect of multisensory therapy in promoting participants' positive emotions. While the reviewers acknowledge the difficulty in carrying out randomized controlled trials in people with developmental disabilities and challenging behavior, the lack of trial-derived evidence makes it difficult to arrive at a conclusion about the effectiveness of multisensory therapy. Future studies should use well-designed randomised controlled trials to evaluate the short- and long-term effectiveness of multisensory therapy. There is also a need for

  10. Advanced Multifunctional MMOD Shield: Radiation Shielding Assessment

    NASA Technical Reports Server (NTRS)

    Rojdev, Kristina; Christiansen, Eric

    2013-01-01

    Deep space missions must contend with a harsh radiation environment that impacts both crew and electronics, so there is a need to invest in multifunctionality for spacecraft optimization, including the MMOD shield. Goals: increase the radiation mitigation potential while retaining overall MMOD shielding performance.

  11. Microbial diversity drives multifunctionality in terrestrial ecosystems

    PubMed Central

    Delgado-Baquerizo, Manuel; Maestre, Fernando T.; Reich, Peter B.; Jeffries, Thomas C.; Gaitan, Juan J.; Encinar, Daniel; Berdugo, Miguel; Campbell, Colin D.; Singh, Brajesh K.

    2016-01-01

    Despite the importance of microbial communities for ecosystem services and human welfare, the relationship between microbial diversity and multiple ecosystem functions and services (that is, multifunctionality) at the global scale has yet to be evaluated. Here we use two independent, large-scale databases with contrasting geographic coverage (from 78 global drylands and from 179 locations across Scotland, respectively), and report that soil microbial diversity positively relates to multifunctionality in terrestrial ecosystems. The direct positive effects of microbial diversity were maintained even when accounting simultaneously for multiple multifunctionality drivers (climate, soil abiotic factors and spatial predictors). Our findings provide empirical evidence that any loss in microbial diversity will likely reduce multifunctionality, negatively impacting the provision of services such as climate regulation, soil fertility and food and fibre production by terrestrial ecosystems. PMID:26817514

  12. Microbial diversity drives multifunctionality in terrestrial ecosystems.

    PubMed

    Delgado-Baquerizo, Manuel; Maestre, Fernando T; Reich, Peter B; Jeffries, Thomas C; Gaitan, Juan J; Encinar, Daniel; Berdugo, Miguel; Campbell, Colin D; Singh, Brajesh K

    2016-01-28

    Despite the importance of microbial communities for ecosystem services and human welfare, the relationship between microbial diversity and multiple ecosystem functions and services (that is, multifunctionality) at the global scale has yet to be evaluated. Here we use two independent, large-scale databases with contrasting geographic coverage (from 78 global drylands and from 179 locations across Scotland, respectively), and report that soil microbial diversity positively relates to multifunctionality in terrestrial ecosystems. The direct positive effects of microbial diversity were maintained even when accounting simultaneously for multiple multifunctionality drivers (climate, soil abiotic factors and spatial predictors). Our findings provide empirical evidence that any loss in microbial diversity will likely reduce multifunctionality, negatively impacting the provision of services such as climate regulation, soil fertility and food and fibre production by terrestrial ecosystems.

  13. Leukocyte nucleus segmentation and nucleus lobe counting.

    PubMed

    Chan, Yung-Kuan; Tsai, Meng-Hsiun; Huang, Der-Chen; Zheng, Zong-Han; Hung, Kun-Ding

    2010-11-12

    Leukocytes play an important role in the human immune system. The family of leukocytes comprises lymphocytes, monocytes, eosinophils, basophils, and neutrophils. Any infection or acute stress may increase or decrease the number of leukocytes. An increased percentage of neutrophils may be caused by an acute infection, while an increased percentage of lymphocytes can be caused by a chronic bacterial infection. It is important to recognize an abnormal variation in the leukocytes. The five types of leukocytes can be distinguished by their cytoplasmic granules, staining properties of the granules, size of cell, the proportion of the nuclear to the cytoplasmic material, and the type of nuclear lobes. The number of lobes increases when leukemia, chronic nephritis, liver disease, cancer, sepsis, or vitamin B12 or folate deficiency occurs. Clinical neutrophil hypersegmentation has been widely used as an indicator of B12 or folate deficiency. Biomedical technologists can currently recognize abnormal leukocytes using human eyes. However, the quality and efficiency of diagnosis may be compromised due to the limitations of the biomedical technologists' eyesight, strength, and medical knowledge. Therefore, the development of an automatic leukocyte recognition system is feasible and necessary. It is essential to extract the leukocyte region from a blood smear image in order to develop an automatic leukocyte recognition system. The purpose of this paper is to contribute an automatic leukocyte nuclei image segmentation method for such recognition technology. The other goal of this paper is to develop a method for counting the number of lobes in a cell nucleus. The experimental results demonstrated impressive segmentation accuracy
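
    A generic illustration of the two steps named in the abstract (nucleus segmentation, then lobe counting), using scikit-image; this is a simplified sketch under the assumption that stained nuclei dominate the saturation channel, not the method proposed in the paper:

    from skimage import color, filters, measure, morphology

    def segment_nuclei(rgb_image):
        # Threshold the saturation channel (leukocyte nuclei stain strongly),
        # drop small specks, and label connected nuclei.
        hsv = color.rgb2hsv(rgb_image)
        mask = hsv[..., 1] > filters.threshold_otsu(hsv[..., 1])
        mask = morphology.remove_small_objects(mask, min_size=200)
        return measure.label(mask)

    def count_lobes(nucleus_mask):
        # Erode so thin bridges between lobes break, then count the pieces.
        eroded = morphology.binary_erosion(nucleus_mask, morphology.disk(3))
        return int(measure.label(eroded).max())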

  14. Targeting Prostate Cancer with Multifunctional Nanoparticles

    DTIC Science & Technology

    2015-10-01

    AWARD NUMBER: W81XWH-14-1-0487. TITLE: Targeting Prostate Cancer with Multifunctional Nanoparticles. PRINCIPAL INVESTIGATOR: Darryl Martin. DISTRIBUTION STATEMENT: Approved for Public Release; Distribution Unlimited. ABSTRACT: Prostate cancer cells were transfected with claudin siRNA

  15. Multifunctional magnetic quantum dots for cancer theranostics.

    PubMed

    Singh, Surinder P

    2011-02-01

    The development of an innovative platform for cancer theranostics that will be capable of noninvasive imaging and treatment of cancerous tumors using biocompatible and multifunctional Fe3O4-ZnO core-shell magnetic quantum dots (M-QDs) is being explored. This multifunctional approach will facilitate deep tumor targeting using a combination of a specific cancer marker and an external magnetic field, while simultaneously providing therapy, and may evolve into a new paradigm in cancer theranostics.

  16. Functional trait diversity maximizes ecosystem multifunctionality.

    PubMed

    Gross, Nicolas; Le Bagousse-Pinguet, Yoann; Liancourt, Pierre; Berdugo, Miguel; Gotelli, Nicholas J; Maestre, Fernando T

    2017-05-01

    Understanding the relationship between biodiversity and ecosystem functioning has been a core ecological research topic over the last decades. Although a key hypothesis is that the diversity of functional traits determines ecosystem functioning, we do not know how much trait diversity is needed to maintain multiple ecosystem functions simultaneously (multifunctionality). Here, we uncovered a scaling relationship between the abundance distribution of two key plant functional traits (specific leaf area, maximum plant height) and multifunctionality in 124 dryland plant communities spread over all continents except Antarctica. For each trait, we found a strong empirical relationship between the skewness and the kurtosis of the trait distributions that cannot be explained by chance. This relationship predicted a strikingly high trait diversity within dryland plant communities, which was associated with a local maximization of multifunctionality. Skewness and kurtosis had a much stronger impact on multifunctionality than other important multifunctionality drivers such as species richness and aridity. The scaling relationship identified here quantifies how much trait diversity is required to maximize multifunctionality locally. Trait distributions can be used to predict the functional consequences of biodiversity loss in terrestrial ecosystems.
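
    The skewness and kurtosis referred to above are community-level moments of a trait distribution. A minimal sketch of abundance-weighted versions (illustrative variable names, not the authors' pipeline):

    import numpy as np

    def weighted_trait_moments(trait_values, abundances):
        # Abundance-weighted skewness and kurtosis of a trait, e.g. specific leaf area
        x = np.asarray(trait_values, dtype=float)
        w = np.asarray(abundances, dtype=float)
        w = w / w.sum()
        mean = np.sum(w * x)
        var = np.sum(w * (x - mean) ** 2)
        skew = np.sum(w * (x - mean) ** 3) / var ** 1.5
        kurt = np.sum(w * (x - mean) ** 4) / var ** 2
        return skew, kurt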

  17. Does multifunctionality matter to US farmers? Farmer motivations and conceptions of multifunctionality in dairy systems.

    PubMed

    Brummel, Rachel F; Nelson, Kristen C

    2014-12-15

    The concept of multifunctionality describes and promotes the multiple non-production benefits that emerge from agricultural systems. The notion of multifunctional agriculture was conceived in a European context and largely has been used in European policy arenas to promote and protect the non-production goods emerging from European agriculture. Thus scholars and policy-makers disagree about the relevance of multifunctionality for United States agricultural policy and US farmers. In this study, we explore lived expressions of multifunctional agriculture at the farm-level to examine the salience of the multifunctionality concept in the US. In particular, we investigate rotational grazing and confinement dairy farms in the eastern United States as case studies of multifunctional and productivist agriculture. We also analyze farmer motivations for transitioning from confinement dairy to rotational grazing systems. Through interviews with a range of dairy producers in Wisconsin, Pennsylvania, and New York, we found that farmers were motivated by multiple factors--including improved cow health and profitability--to transition to rotational grazing systems to achieve greater farm-level multifunctionality. Additionally, rotational grazing farmers attributed a broader range of production and non-production benefits to their farm practice than confinement dairy farmers. Further, rotational grazing dairy farmers described a system-level notion of multifunctionality based on the interdependence of multiple benefits across scales--from the farm to the national level--emerging from grazing operations. We find that the concept of multifunctionality could be expanded in the US to address the interdependence of benefits emerging from farming practices, as well as private benefits to farmers. We contend that understanding agricultural benefits as experienced by the farmer is an important contribution to enriching the multifunctionality concept in the US context, informing agri

  18. Nucleus from string theory

    NASA Astrophysics Data System (ADS)

    Hashimoto, Koji; Morita, Takeshi

    2011-08-01

    In generic holographic QCD, we find that baryons are bound to form a nucleus, and that its radius obeys the empirically known mass-number (A) dependence r ∝ A^(1/3) for large A. Our result is robust, since we use only a generic property of D-brane actions in string theory. We also show that nucleons are bound completely in a finite volume. Furthermore, employing a concrete holographic model (derived by Hashimoto, Iizuka, and Yi, describing a multibaryon system in the Sakai-Sugimoto model), the nuclear radius is evaluated as O(1) × A^(1/3) fm, which is consistent with experiments.
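
    For scale, the quoted A^(1/3) dependence matches the standard empirical radius formula R ≈ r0·A^(1/3); the value r0 ≈ 1.2 fm used below is the conventional empirical constant, not a number taken from the paper:

    R0_FM = 1.2  # conventional empirical constant, in femtometres
    for A in (16, 56, 208):  # oxygen, iron, lead mass numbers
        print(A, round(R0_FM * A ** (1 / 3), 2), "fm")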

  19. Reality of comet nucleus.

    NASA Technical Reports Server (NTRS)

    Lyttleton, R. A.

    1972-01-01

    The prime problem of a comet mission must be to settle whether the cometary nucleus has an actual tangible material existence, or whether it arises from some optical effect present only at times within comets. The absence of any large particles in a comet seems to be demonstrated by certain meteor showers. A feature that would seem to indicate that a comet consists primarily of a swarm of particles is that the coma in general contracts as the comet approaches the sun, roughly in proportion with the distance, and then expands again as it recedes.

  20. Neutrino-nucleus interactions

    SciTech Connect

    Gallagher, H.; Garvey, G.; Zeller, G.P.; /Fermilab

    2011-01-01

    The study of neutrino oscillations has necessitated a new generation of neutrino experiments that are exploring neutrino-nuclear scattering processes. We focus in particular on charged-current quasi-elastic scattering, a particularly important channel that has been extensively investigated both in the bubble-chamber era and by current experiments. Recent results have led to theoretical reexamination of this process. We review the standard picture of quasi-elastic scattering as developed in electron scattering, review and discuss experimental results, and discuss additional nuclear effects such as exchange currents and short-range correlations that may play a significant role in neutrino-nucleus scattering.

  1. Higgs-boson production in nucleus-nucleus collisions

    NASA Technical Reports Server (NTRS)

    Norbury, J. W.; Townsend, L. W. (Principal Investigator)

    1990-01-01

    Cross-section calculations are presented for the production of intermediate-mass Higgs bosons produced in ultrarelativistic nucleus-nucleus collisions via two-photon fusion. The calculations are performed in position space using Baur's method for folding together the Weizsacker-Williams virtual-photon spectra of the two colliding nuclei. It is found that two-photon fusion in nucleus-nucleus collisions is a plausible way of finding intermediate-mass Higgs bosons at the Superconducting Super Collider or the CERN Large Hadron Collider.

  2. Higgs-Boson Production in Nucleus-Nucleus Collisions

    NASA Technical Reports Server (NTRS)

    Norbury, John W.

    1992-01-01

    Cross section calculations are presented for the production of intermediate-mass Higgs bosons produced in ultrarelativistic nucleus-nucleus collisions via two photon fusion. The calculations are performed in position space using Baur's method for folding together the Weizsacker-Williams virtual-photon spectra of the two colliding nuclei. It is found that two photon fusion in nucleus-nucleus collisions is a plausible way of finding intermediate-mass Higgs bosons at the Superconducting Super Collider or the CERN Large Hadron Collider.

  3. Higgs-Boson Production in Nucleus-Nucleus Collisions

    NASA Technical Reports Server (NTRS)

    Norbury, John W.

    1992-01-01

    Cross section calculations are presented for the production of intermediate-mass Higgs bosons produced in ultrarelativistic nucleus-nucleus collisions via two photon fusion. The calculations are performed in position space using Baur's method for folding together the Weizsacker-Williams virtual-photon spectra of the two colliding nuclei. It is found that two photon fusion in nucleus-nucleus collisions is a plausible way of finding intermediate-mass Higgs bosons at the Superconducting Super Collider or the CERN Large Hadron Collider.

  4. Higgs-boson production in nucleus-nucleus collisions

    NASA Technical Reports Server (NTRS)

    Norbury, J. W.; Townsend, L. W. (Principal Investigator)

    1990-01-01

    Cross-section calculations are presented for the production of intermediate-mass Higgs bosons produced in ultrarelativistic nucleus-nucleus collisions via two-photon fusion. The calculations are performed in position space using Baur's method for folding together the Weizsacker-Williams virtual-photon spectra of the two colliding nuclei. It is found that two-photon fusion in nucleus-nucleus collisions is a plausible way of finding intermediate-mass Higgs bosons at the Superconducting Super Collider or the CERN Large Hadron Collider.

  5. Networking the nucleus

    PubMed Central

    Rajapakse, Indika; Scalzo, David; Tapscott, Stephen J; Kosak, Steven T; Groudine, Mark

    2010-01-01

    The nuclei of differentiating cells exhibit several fundamental principles of self-organization. They are composed of many dynamical units connected physically and functionally to each other—a complex network—and the different parts of the system are mutually adapted and produce a characteristic end state. A unique cell-specific signature emerges over time from complex interactions among constituent elements that delineate coordinate gene expression and chromosome topology. Each element itself consists of many interacting components, all dynamical in nature. Self-organizing systems can be simplified while retaining complex information using approaches that examine the relationship between elements, such as spatial relationships and transcriptional information. These relationships can be represented using well-defined networks. We hypothesize that during the process of differentiation, networks within the cell nucleus rewire according to simple rules, from which a higher level of order emerges. Studying the interaction within and among networks provides a useful framework for investigating the complex organization and dynamic function of the nucleus. PMID:20664641

  6. Oscillatory brain activity during multisensory attention reflects activation, disinhibition, and cognitive control

    PubMed Central

    Friese, Uwe; Daume, Jonathan; Göschl, Florian; König, Peter; Wang, Peng; Engel, Andreas K.

    2016-01-01

    In this study, we used a novel multisensory attention paradigm to investigate attention-modulated cortical oscillations over a wide range of frequencies using magnetoencephalography in healthy human participants. By employing a task that required the evaluation of the congruence of audio-visual stimuli, we promoted the formation of widespread cortical networks including early sensory cortices as well as regions associated with cognitive control. We found that attention led to increased high-frequency gamma-band activity and decreased lower frequency theta-, alpha-, and beta-band activity in early sensory cortex areas. Moreover, alpha-band coherence decreased in visual cortex. Frontal cortex was found to exert attentional control through increased low-frequency phase synchronisation. Crossmodal congruence modulated beta-band coherence in mid-cingulate and superior temporal cortex. Together, these results offer an integrative view on the concurrence of oscillations at different frequencies during multisensory attention. PMID:27604647
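
    A minimal sketch of the kind of band-limited coherence measure discussed above, using scipy; band edges and parameters are illustrative assumptions, not the authors' MEG pipeline:

    from scipy.signal import coherence

    def band_coherence(x, y, fs=1000.0):
        # Mean magnitude-squared coherence between two signals in canonical bands
        bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 80)}
        f, cxy = coherence(x, y, fs=fs, nperseg=int(fs))
        return {name: float(cxy[(f >= lo) & (f < hi)].mean())
                for name, (lo, hi) in bands.items()}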

  7. Audio-visual multisensory integration in superior parietal lobule revealed by human intracranial recordings.

    PubMed

    Molholm, Sophie; Sehatpour, Pejman; Mehta, Ashesh D; Shpaner, Marina; Gomez-Ramirez, Manuel; Ortigue, Stephanie; Dyke, Jonathan P; Schwartz, Theodore H; Foxe, John J

    2006-08-01

    Intracranial recordings from three human subjects provide the first direct electrophysiological evidence for audio-visual multisensory processing in the human superior parietal lobule (SPL). Auditory and visual sensory inputs project to the same highly localized region of the parietal cortex with auditory inputs arriving considerably earlier (30 ms) than visual inputs (75 ms). Multisensory integration processes in this region were assessed by comparing the response to simultaneous audio-visual stimulation with the algebraic sum of responses to the constituent auditory and visual unisensory stimulus conditions. Significant integration effects were seen with almost identical morphology across the three subjects, beginning between 120 and 160 ms. These results are discussed in the context of the role of SPL in supramodal spatial attention and sensory-motor transformations.

  8. Cortical hubs form a module for multisensory integration on top of the hierarchy of cortical networks.

    PubMed

    Zamora-López, Gorka; Zhou, Changsong; Kurths, Jürgen

    2010-01-01

    Sensory stimuli entering the nervous system follow particular paths of processing, typically separated (segregated) from the paths of other modal information. However, sensory perception, awareness and cognition emerge from the combination of information (integration). The corticocortical networks of cats and macaque monkeys display three prominent characteristics: (i) modular organisation (facilitating the segregation), (ii) abundant alternative processing paths and (iii) the presence of highly connected hubs. Here, we study in detail the organisation and potential function of the cortical hubs by graph analysis and information theoretical methods. We find that the cortical hubs form a spatially delocalised, but topologically central module with the capacity to integrate multisensory information in a collaborative manner. With this, we resolve the underlying anatomical substrate that supports the simultaneous capacity of the cortex to segregate and to integrate multisensory information.
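
    The graph-analysis step described above can be illustrated with networkx: rank areas by degree and ask how densely interconnected the top-degree hubs are. A toy sketch on a stand-in random network, not the cat/macaque connectome data:

    import networkx as nx

    def hub_module_density(G, n_hubs=10):
        # Density of the subgraph spanned by the n highest-degree nodes ("hubs")
        ranked = sorted(G.degree, key=lambda node_deg: node_deg[1], reverse=True)
        hubs = [node for node, _ in ranked[:n_hubs]]
        return nx.density(G.subgraph(hubs))

    G = nx.watts_strogatz_graph(60, 6, 0.2, seed=1)  # stand-in for a cortical network
    print(hub_module_density(G))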

  9. Synaptic diversity enables temporal coding of coincident multi-sensory inputs in single neurons

    PubMed Central

    Chabrol, François P.; Arenz, Alexander; Wiechert, Martin T.; Margrie, Troy W.; DiGregorio, David A.

    2015-01-01

    The ability of the brain to rapidly process information from multiple pathways is critical for reliable execution of complex sensory-motor behaviors, yet the cellular mechanisms underlying a neuronal representation of multimodal stimuli are poorly understood. Here we explored the possibility that the physiological diversity of mossy fiber (MF) to granule cell (GC) synapses within the mouse vestibulocerebellum may contribute to the processing of coincident multisensory information at the level of individual GCs. We found that the strength and short-term dynamics of individual MF-GC synapses can act as biophysical signatures for primary vestibular, secondary vestibular and visual input pathways. The majority of GCs receive inputs from different modalities, which when co-activated, produced enhanced GC firing rates and distinct first spike latencies. Thus, pathway-specific synaptic response properties permit temporal coding of correlated multisensory input by single GCs, thereby enriching sensory representation and facilitating pattern separation. PMID:25821914

  10. Rett syndrome management with Snoezelen or controlled multi-sensory stimulation. A review.

    PubMed

    Lotan, Meir; Merrick, Joav

    2004-01-01

    Rett syndrome is a neurological disorder resulting from an X-linked dominant mutation. It is characterized by a variety of physical and perceptual disabilities, resulting in a need for constant therapy programs to be administered on a regular basis throughout life. Resistance to physical activity has driven the authors in a search for new intervention techniques which might improve the ability to cope while reducing difficulty in handling an external physical facilitator. Snoezelen, or multi-sensory environment, can provide a soothing environment appealing to the child or adolescent with Rett syndrome while at the same time improving physical abilities. The article reviews Rett syndrome typical phenotype and suggests suitable activities that might take place in the multi-sensory environment.

  11. Cortical Hubs Form a Module for Multisensory Integration on Top of the Hierarchy of Cortical Networks

    PubMed Central

    Zamora-López, Gorka; Zhou, Changsong; Kurths, Jürgen

    2009-01-01

    Sensory stimuli entering the nervous system follow particular paths of processing, typically separated (segregated) from the paths of other modal information. However, sensory perception, awareness and cognition emerge from the combination of information (integration). The corticocortical networks of cats and macaque monkeys display three prominent characteristics: (i) modular organisation (facilitating the segregation), (ii) abundant alternative processing paths and (iii) the presence of highly connected hubs. Here, we study in detail the organisation and potential function of the cortical hubs by graph analysis and information theoretical methods. We find that the cortical hubs form a spatially delocalised, but topologically central module with the capacity to integrate multisensory information in a collaborative manner. With this, we resolve the underlying anatomical substrate that supports the simultaneous capacity of the cortex to segregate and to integrate multisensory information. PMID:20428515

  12. Brief Cortical Deactivation Early in Life Has Long-Lasting Effects on Multisensory Behavior

    PubMed Central

    Jiang, Wan; Stein, Barry E.

    2014-01-01

    Detecting and locating environmental events are markedly enhanced by the midbrain's ability to integrate visual and auditory cues. Its capacity for multisensory integration develops in cats 1–4 months after birth but only after acquiring extensive visual–auditory experience. However, briefly deactivating specific regions of association cortex during this period induced long-term disruption of this maturational process, such that even 1 year later animals were unable to integrate visual and auditory cues to enhance their behavioral performance. The data from this animal model reveal a window of sensitivity within which association cortex mediates the encoding of cross-modal experience in the midbrain. Surprisingly, however, 3 years later, and without any additional intervention, the capacity appeared fully developed. This suggests that, although sensitivity degrades with age, the potential for acquiring or modifying multisensory integration capabilities extends well into adulthood. PMID:24849354

  13. Temporary deafness can impair multisensory integration: a study of cochlear-implant users.

    PubMed

    Landry, Simon P; Guillemot, Jean-Paul; Champoux, François

    2013-07-01

    Previous investigations suggest that temporary deafness can have a dramatic impact on audiovisual speech processing. The aim of this study was to test whether temporary deafness disturbs other multisensory processes in adults. A nonspeech task involving an audiotactile illusion was administered to a group of normally hearing individuals and a group of individuals who had been temporarily auditorily deprived. Members of this latter group had their auditory detection thresholds restored to normal levels through the use of a cochlear implant. Control conditions revealed that auditory and tactile discrimination capabilities were identical in the two groups. However, whereas normally hearing individuals integrated auditory and tactile information, so that they experienced the audiotactile illusion, individuals who had been temporarily deprived did not. Given the basic nature of the task, failure to integrate multisensory information could not be explained by the use of the cochlear implant. Thus, the results suggest that normally anticipated audiotactile interactions are disturbed following temporary deafness.

  14. The Influence of Embodiment on Multisensory Integration using the Mirror Box Illusion

    PubMed Central

    Medina, Jared; Khurana, Priya; Coslett, H. Branch

    2015-01-01

    We examined the relationship between subcomponents of embodiment and multisensory integration using a mirror box illusion. The participants’ left hand was positioned against the mirror, while their right hidden hand was positioned 12″, 6″, or 0″ from the mirror – creating a conflict between visual and proprioceptive estimates of limb position in some conditions. After synchronous tapping, asynchronous tapping, or no movement of both hands, participants gave position estimates for the hidden limb and filled out a brief embodiment questionnaire. We found a relationship between different subcomponents of embodiment and illusory displacement towards the visual estimate. Illusory visual displacement was positively correlated with feelings of deafference in the asynchronous and no movement conditions, whereas it was positively correlated with ratings of visual capture and limb ownership in the synchronous and no movement conditions. These results provide evidence for dissociable contributions of different aspects of embodiment to multisensory integration. PMID:26320868

  15. Multifunctional self-assembled monolayers

    SciTech Connect

    Zawodzinski, T.; Bar, G.; Rubin, S.; Uribe, F.; Ferrais, J.

    1996-06-01

    This is the final report of a three-year Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). The specific goals of this research project were threefold: to develop multifunctional self-assembled monolayers, to understand the role of monolayer structure on the functioning of such systems, and to apply this knowledge to the development of electrochemical enzyme sensors. An array of molecules that can be used to attach electrochemically active biomolecules to gold surfaces has been synthesized. Several members of a class of electroactive compounds have been characterized and the factors controlling surface modification are beginning to be characterized. Enzymes have been attached to self-assembled molecules arranged on the gold surface, a critical step toward the ultimate goal of this project. Several alternative enzyme attachment strategies to achieve robust enzyme-modified surfaces have been explored. Several means of juxtaposing enzymes and mediators, electroactive compounds through which the enzyme can exchange electrons with the electrode surface, have also been investigated. Finally, the development of sensitive biosensors based on films loaded with nanoscale-supported gold particles whose surfaces are modified with the self-assembled enzyme and mediator has been explored.

  16. Multifunction automated crawling system (MACS)

    NASA Astrophysics Data System (ADS)

    Bar-Cohen, Yoseph; Backes, Paul G.; Joffe, Benjamin

    1996-11-01

    Nondestructive evaluation instruments and sensors are becoming smaller with enhanced computer controlled capability and increasingly use commercially available hardware and software. Further, robotic instruments are being developed to serve as mobility platforms allowing automation of the inspection process. This combination of miniaturized sensing and robotics technology enables hybrid miniature technology solutions for identified aircraft inspection needs. Integration of inspection and robotics technologies is benefited by the use of a standard computing platform. JPL investigated the application of telerobotic technology to inspection of aircraft structures using capabilities that were developed for use in space exploration. A miniature crawler that can travel on the surface of aircraft using suction cups for adherence was developed and is called the multifunction automated crawling system (MACS). MACS is an operational tool that can perform rapid large-area inspection of aircraft and has a relatively large platform to carry a payload of miniature inspection instruments. The capability of MACS and the trend towards autonomous inspection crawlers will be reviewed and discussed in this paper.

  17. Multifunctional nanorods for gene delivery

    NASA Astrophysics Data System (ADS)

    Salem, Aliasger K.; Searson, Peter C.; Leong, Kam W.

    2003-10-01

    The goal of gene therapy is to introduce foreign genes into somatic cells to supplement defective genes or provide additional biological functions, and can be achieved using either viral or synthetic non-viral delivery systems. Compared with viral vectors, synthetic gene-delivery systems, such as liposomes and polymers, offer several advantages including ease of production and reduced risk of cytotoxicity and immunogenicity, but their use has been limited by the relatively low transfection efficiency. This problem mainly stems from the difficulty in controlling their properties at the nanoscale. Synthetic inorganic gene carriers have received limited attention in the gene-therapy community, the only notable example being gold nanoparticles with surface-immobilized DNA applied to intradermal genetic immunization by particle bombardment. Here we present a non-viral gene-delivery system based on multisegment bimetallic nanorods that can simultaneously bind compacted DNA plasmids and targeting ligands in a spatially defined manner. This approach allows precise control of composition, size and multifunctionality of the gene-delivery system. Transfection experiments performed in vitro and in vivo provide promising results that suggest potential in genetic vaccination applications.

  18. Plasticity of somatosensory inputs to the cochlear nucleus--implications for tinnitus.

    PubMed

    Shore, S E

    2011-11-01

    This chapter reviews evidence for functional connections of the somatosensory and auditory systems at the very lowest levels of the nervous system. Neural inputs from the dorsal root and trigeminal ganglia, as well as their brain stem nuclei, the cuneate, gracilis and trigeminal nuclei, terminate in the cochlear nuclei. Terminations are primarily in the shell regions surrounding the cochlear nuclei but some terminals are found in the magnocellular regions of the cochlear nucleus. The effects of stimulating these inputs on multisensory integration are shown to be both short- and long-term, and both suppressive and enhancing. Evidence that these projections are glutamatergic and are altered after cochlear damage is provided in the light of probable influences on the modulation and generation of tinnitus.

  19. Plasticity of somatosensory inputs to the cochlear nucleus – implications for tinnitus

    PubMed Central

    Shore, S.E.

    2011-01-01

    This chapter reviews evidence for functional connections of the somatosensory and auditory systems at the very lowest levels of the nervous system. Neural inputs from the dorsal root and trigeminal ganglia, as well as their brain stem nuclei, the cuneate, gracilis and trigeminal nuclei, terminate in the cochlear nuclei. Terminations are primarily in the shell regions surrounding the cochlear nuclei but some terminals are found in the magnocellular regions of the cochlear nucleus. The effects of stimulating these inputs on multisensory integration are shown to be both short- and long-term, and both suppressive and enhancing. Evidence that these projections are glutamatergic and are altered after cochlear damage is provided in the light of probable influences on the modulation and generation of tinnitus. PMID:21620940

  20. Neuro-oscillatory phase alignment drives speeded multisensory response times: an electro-corticographic investigation.

    PubMed

    Mercier, Manuel R; Molholm, Sophie; Fiebelkorn, Ian C; Butler, John S; Schwartz, Theodore H; Foxe, John J

    2015-06-03

    Even simple tasks rely on information exchange between functionally distinct and often relatively distant neuronal ensembles. Considerable work indicates oscillatory synchronization through phase alignment is a major agent of inter-regional communication. In the brain, different oscillatory phases correspond to low- and high-excitability states. Optimally aligned phases (or high-excitability states) promote inter-regional communication. Studies have also shown that sensory stimulation can modulate or reset the phase of ongoing cortical oscillations. For example, auditory stimuli can reset the phase of oscillations in visual cortex, influencing processing of a simultaneous visual stimulus. Such cross-regional phase reset represents a candidate mechanism for aligning oscillatory phase for inter-regional communication. Here, we explored the role of local and inter-regional phase alignment in driving a well established behavioral correlate of multisensory integration: the redundant target effect (RTE), which refers to the fact that responses to multisensory inputs are substantially faster than to unisensory stimuli. In a speeded detection task, human epileptic patients (N = 3) responded to unisensory (auditory or visual) and multisensory (audiovisual) stimuli with a button press, while electrocorticography was recorded over auditory and motor regions. Visual stimulation significantly modulated auditory activity via phase reset in the delta and theta bands. During the period between stimulation and subsequent motor response, transient synchronization between auditory and motor regions was observed. Phase synchrony to multisensory inputs was faster than to unisensory stimulation. This sensorimotor phase alignment correlated with behavior such that stronger synchrony was associated with faster responses, linking the commonly observed RTE with phase alignment across a sensorimotor network.
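
    Phase reset of ongoing oscillations and inter-regional phase alignment of the kind described in this record are commonly quantified with inter-trial phase coherence (ITC), the consistency of instantaneous phase across trials at each time point. Below is a minimal sketch of a generic ITC computation on epoched, band-passed data; the array name `epochs` and the simulated signal are assumptions for illustration, not the authors' analysis pipeline.

```python
import numpy as np
from scipy.signal import hilbert

def inter_trial_phase_coherence(epochs):
    """Inter-trial phase coherence (ITC) at each time sample.

    epochs : array, shape (n_trials, n_samples)
        Single-trial signals, already band-pass filtered (e.g., delta or
        theta band) and time-locked to stimulus onset.
    Returns values in [0, 1]; values near 1 indicate consistent phase
    alignment across trials, i.e., a phase reset by the stimulus.
    """
    phases = np.angle(hilbert(epochs, axis=1))          # instantaneous phase per trial
    return np.abs(np.mean(np.exp(1j * phases), axis=0))

# Hypothetical usage: 100 trials of a noisy 4 Hz signal, 500 samples each
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
epochs = np.sin(2 * np.pi * 4 * t) + 0.5 * rng.standard_normal((100, 500))
print(inter_trial_phase_coherence(epochs).max())
```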

  1. The effects of attention on the temporal integration of multisensory stimuli

    PubMed Central

    Donohue, Sarah E.; Green, Jessica J.; Woldorff, Marty G.

    2015-01-01

    In unisensory contexts, spatially-focused attention tends to enhance perceptual processing. How attention influences the processing of multisensory stimuli, however, has been of much debate. In some cases, attention has been shown to be important for processes related to the integration of audio-visual stimuli, but in other cases such processes have been reported to occur independently of attention. To address these conflicting results, we performed three experiments to examine how attention interacts with a key facet of multisensory processing: the temporal window of integration (TWI). The first two experiments used a novel cued-spatial-attention version of the bounce/stream illusion, wherein two moving visual stimuli with intersecting paths tend to be perceived as bouncing off rather than streaming through each other when a brief sound occurs near in time. When the task was to report whether the visual stimuli appeared to bounce or stream, attention served to narrow this measure of the TWI and bias perception toward “streaming”. When the participants’ task was to explicitly judge the simultaneity of the sound with the intersection of the moving visual stimuli, however, the results were quite different. Specifically, attention served to mainly widen the TWI, increasing the likelihood of simultaneity perception, while also substantially increasing the simultaneity judgment accuracy when the stimuli were actually physically simultaneous. Finally, in Experiment 3, where the task was to judge the simultaneity of a simple, temporally discrete, flashed visual stimulus and the same brief tone pip, attention had no effect on the measured TWI. These results highlight the flexibility of attention in enhancing multisensory perception and show that the effects of attention on multisensory processing are highly dependent on the task demands and observer goals. PMID:25954167
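
    In simultaneity-judgment tasks such as Experiment 3, the temporal window of integration is typically estimated by fitting a Gaussian to the proportion of "simultaneous" responses as a function of stimulus onset asynchrony (SOA) and reading the window width off the fitted parameters. The sketch below illustrates that generic fitting step with made-up data; it is a simplified stand-in, not the authors' exact model or data.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(soa, amplitude, center, width):
    """Proportion of 'simultaneous' responses as a function of SOA (ms)."""
    return amplitude * np.exp(-((soa - center) ** 2) / (2 * width ** 2))

# Hypothetical simultaneity-judgment data (SOA in ms, proportion "simultaneous")
soas = np.array([-300, -200, -100, 0, 100, 200, 300], dtype=float)
p_simultaneous = np.array([0.10, 0.30, 0.75, 0.95, 0.80, 0.35, 0.15])

params, _ = curve_fit(gaussian, soas, p_simultaneous, p0=[1.0, 0.0, 100.0])
amplitude, center, width = params
print(f"window width (SD of fitted Gaussian): {width:.0f} ms, centred at {center:.0f} ms")
```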

  2. An exploratory event-related potential study of multisensory integration in sensory over-responsive children.

    PubMed

    Brett-Green, Barbara A; Miller, Lucy J; Schoen, Sarah A; Nielsen, Darci M

    2010-03-19

    Children who are over-responsive to sensation have defensive and "fight or flight" reactions to ordinary levels of sensory stimulation in the environment. Based on clinical observations, sensory over-responsivity is hypothesized to reflect atypical neural integration of sensory input. To examine a possible underlying neural mechanism of the disorder, multisensory integration (MSI) of simultaneous auditory and somatosensory stimulation was studied in twenty children with sensory over-responsivity (SOR) using event-related potentials (ERPs). Three types of sensory stimuli were presented and ERPs were recorded from thirty-two scalp electrodes while participants watched a silent cartoon: bilateral auditory clicks, right somatosensory median nerve electrical pulses, or both simultaneously. The paradigm was passive; no behavioral responses were required. To examine integration, responses to simultaneous multisensory auditory-somatosensory stimulation were compared to the sum of unisensory auditory plus unisensory somatosensory responses in four time-windows: 60-80 ms, 80-110 ms, 110-150 ms, and 180-220 ms. Specific midline and lateral electrode sites were examined over scalp regions where auditory-somatosensory integration was expected based on previous studies. Midline electrode sites (Fz, Cz, and Pz) showed significant integration during two time-windows: 60-80 ms and 180-220 ms. Significant integration was also found at the contralateral electrode site (C3) for the time-window between 180 and 220 ms. At ipsilateral electrode sites (C4 and CP6), no significant integration was found during any of the time-windows (i.e. the multisensory ERP was not significantly different from the summed unisensory ERP). These results demonstrate that MSI can be reliably measured in children with SOR and provide evidence that multisensory auditory-somatosensory input is integrated during both early and later stages of sensory information processing, mainly over fronto-central scalp regions.
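
    The integration test used in this kind of ERP study, comparing the response to combined auditory-somatosensory stimulation against the sum of the unisensory responses within fixed time windows, amounts to a paired comparison of mean amplitudes. The sketch below illustrates that additive-model comparison on hypothetical per-subject ERP arrays (the variable names and sampling rate are assumptions); it is not the authors' full statistical procedure.

```python
import numpy as np
from scipy import stats

def msi_window_test(erp_multi, erp_aud, erp_som, window_ms, srate=1000.0, t0_ms=-100.0):
    """Compare the multisensory ERP with the summed unisensory ERPs in one window.

    erp_multi, erp_aud, erp_som : arrays, shape (n_subjects, n_samples), one electrode.
    window_ms : (start, end) of the analysis window in ms relative to stimulus onset.
    t0_ms : time of the first sample relative to stimulus onset, in ms.
    Returns the paired t-test result for multisensory vs. summed-unisensory amplitude.
    """
    start = int((window_ms[0] - t0_ms) * srate / 1000.0)
    end = int((window_ms[1] - t0_ms) * srate / 1000.0)
    multi_mean = erp_multi[:, start:end].mean(axis=1)
    summed_mean = (erp_aud + erp_som)[:, start:end].mean(axis=1)
    return stats.ttest_rel(multi_mean, summed_mean)

# Hypothetical data: 20 subjects, 500 samples spanning -100 to 400 ms
rng = np.random.default_rng(1)
erp_aud, erp_som, erp_multi = (rng.standard_normal((20, 500)) for _ in range(3))
print(msi_window_test(erp_multi, erp_aud, erp_som, window_ms=(60, 80)))
```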

  3. Multisensory speech perception in autism spectrum disorder: From phoneme to whole-word perception.

    PubMed

    Stevenson, Ryan A; Baum, Sarah H; Segers, Magali; Ferber, Susanne; Barense, Morgan D; Wallace, Mark T

    2017-07-01

    Speech perception in noisy environments is boosted when a listener can see the speaker's mouth and integrate the auditory and visual speech information. Autistic children have a diminished capacity to integrate sensory information across modalities, which contributes to core symptoms of autism, such as impairments in social communication. We investigated the abilities of autistic and typically-developing (TD) children to integrate auditory and visual speech stimuli in various signal-to-noise ratios (SNR). Measurements of both whole-word and phoneme recognition were recorded. At the level of whole-word recognition, autistic children exhibited reduced performance in both the auditory and audiovisual modalities. Importantly, autistic children showed reduced behavioral benefit from multisensory integration with whole-word recognition, specifically at low SNRs. At the level of phoneme recognition, autistic children exhibited reduced performance relative to their TD peers in auditory, visual, and audiovisual modalities. However, and in contrast to their performance at the level of whole-word recognition, both autistic and TD children showed benefits from multisensory integration for phoneme recognition. In accordance with the principle of inverse effectiveness, both groups exhibited greater benefit at low SNRs relative to high SNRs. Thus, while autistic children showed typical multisensory benefits during phoneme recognition, these benefits did not translate to typical multisensory benefit of whole-word recognition in noisy environments. We hypothesize that sensory impairments in autistic children raise the SNR threshold needed to extract meaningful information from a given sensory input, resulting in subsequent failure to exhibit behavioral benefits from additional sensory information at the level of whole-word recognition. Autism Res 2017, 10: 1280-1290. © 2017 International Society for Autism Research, Wiley Periodicals, Inc.

  4. Hemodynamic responses in human multisensory and auditory association cortex to purely visual stimulation

    PubMed Central

    Meyer, Martin; Baumann, Simon; Marchina, Sarah; Jancke, Lutz

    2007-01-01

    Background Recent findings of a tight coupling between visual and auditory association cortices during multisensory perception in monkeys and humans raise the question of whether consistent paired presentation of simple visual and auditory stimuli prompts conditioned responses in unimodal auditory regions or multimodal association cortex once visual stimuli are presented in isolation in a post-conditioning run. To address this issue, fifteen healthy participants partook in a "silent" sparse temporal event-related fMRI study. In the first (visual control) habituation phase they were presented with brief red flashing visual stimuli. In the second (auditory control) habituation phase they heard brief telephone ringing. In the third (conditioning) phase we presented the visual stimulus (CS) paired simultaneously with the auditory stimulus (UCS). In the fourth phase participants either viewed flashes paired with the auditory stimulus (maintenance, CS-) or viewed the visual stimulus in isolation (extinction, CS+) according to a 5:10 partial reinforcement schedule. The participants had no other task than attending to the stimuli and indicating the end of each trial by pressing a button. Results During unpaired visual presentations (preceding and following the paired presentation) we observed significant brain responses beyond primary visual cortex in the bilateral posterior auditory association cortex (planum temporale, planum parietale) and in the right superior temporal sulcus whereas the primary auditory regions were not involved. By contrast, the activity in auditory core regions was markedly larger when participants were presented with auditory stimuli. Conclusion These results demonstrate involvement of multisensory and auditory association areas in perception of unimodal visual stimulation which may reflect the instantaneous forming of multisensory associations and cannot be attributed to sensation of an auditory event. More importantly, we are able to show that brain

  5. Alpha-Band Oscillations Reflect Altered Multisensory Processing of the McGurk Illusion in Schizophrenia

    PubMed Central

    Roa Romero, Yadira; Keil, Julian; Balz, Johanna; Niedeggen, Michael; Gallinat, Jürgen; Senkowski, Daniel

    2016-01-01

    The formation of coherent multisensory percepts requires integration of stimuli across the multiple senses. Patients with schizophrenia (ScZ) often experience a loss of coherent perception and hence, they might also show dysfunctional multisensory processing. In this high-density electroencephalography study, we investigated the neural signatures of the McGurk illusion, as a phenomenon of speech-specific multisensory processing. In the McGurk illusion, lip movements are paired with incongruent auditory syllables, which can induce a fused percept. In ScZ patients and healthy controls we compared neural oscillations and event-related potentials (ERPs) to congruent audiovisual speech stimuli and McGurk illusion trials, where a visual /ga/ and an auditory /pa/ were often perceived as /ka/. There were no significant group differences in illusion rates. The EEG data analysis revealed larger short latency ERPs to McGurk illusion compared with congruent trials in controls. The reversed effect pattern was found in ScZ patients, indicating an early audiovisual processing deficit. Moreover, we observed stronger suppression of medio-central alpha-band power (8–10 Hz, 550–700 ms) in response to McGurk illusion compared with control trials in the control group. Again, the reversed pattern was found in ScZ patients. Moreover, within groups, alpha-band suppression was negatively correlated with the McGurk illusion rate in ScZ patients, while the correlation tended to be positive in controls. The topography of alpha-band effects indicated an involvement of auditory and/or frontal structures. Our study suggests that short latency ERPs and long latency alpha-band oscillations reflect abnormal multisensory processing of the McGurk illusion in ScZ. PMID:26903845

  6. The effect of ageing on multisensory integration for the control of movement timing.

    PubMed

    Elliott, Mark T; Wing, Alan M; Welchman, Andrew E

    2011-09-01

    Previously, it has been shown that synchronising actions with periodic pacing stimuli is unaffected by ageing. However, synchronisation often requires combining evidence across multiple sources of timing information. We have previously shown that the brain integrates multisensory cues to achieve a best estimate of the events in time and subsequently reduces variability in synchronised movements (Elliott et al. in Eur J Neurosci 31(10):1828-1835, 2010). Yet, it is unclear if sensory integration of temporal cues in older adults is degraded and whether this leads to reduced synchronisation performance. Here, we test for age-related changes when synchronising actions to multisensory temporal cues. We compared synchronisation performance between young (N = 15, aged 18-37 years) and older adults (N = 15, aged 63-80 years) using a finger-tapping task to auditory and tactile metronomes presented unimodally and bimodally. We added temporal jitter to the auditory metronome to determine whether participants would integrate auditory and tactile signals, with reduced weighting of the auditory metronome as its reliability decreased under bimodal conditions. We found that older adults matched the performance of young adults when synchronising to an isochronous auditory or tactile metronome. When the temporal regularity of the auditory metronome was reduced, older adults' performance was degraded to a greater extent than that of the young adults in both unimodal and bimodal conditions. However, proportionally both groups showed similar improvements in synchronisation performance in bimodal conditions compared with the equivalent, auditory-only conditions. We conclude that while older adults become more variable in synchronising to less regular beats, they do not show any deficit in the integration of multisensory temporal cues, suggesting that using multisensory information may help mitigate any deficits in coordinating actions to complex timing cues.

  7. A three-finger multisensory hand for dexterous space robotic tasks

    NASA Technical Reports Server (NTRS)

    Murase, Yuichi; Komada, Satoru; Uchiyama, Takashi; Machida, Kazuo; Akita, Kenzo

    1994-01-01

    The National Space Development Agency of Japan will launch ETS-7 in 1997 as a test bed for next-generation space technologies of RV&D and space robotics. MITI has been developing a three-finger multisensory hand for complex space robotic tasks. The hand can be operated under remote control or autonomously. This paper describes the design and development of the hand and the performance of a breadboard model.

  8. Organization, Maturation, and Plasticity of Multisensory Integration: Insights from Computational Modeling Studies

    PubMed Central

    Cuppini, Cristiano; Magosso, Elisa; Ursino, Mauro

    2011-01-01

    In this paper, we present two neural network models – devoted to two specific and widely investigated aspects of multisensory integration – in order to demonstrate the potential of computational models to provide insight into the neural mechanisms underlying organization, development, and plasticity of multisensory integration in the brain. The first model considers visual–auditory interaction in a midbrain structure named superior colliculus (SC). The model is able to reproduce and explain the main physiological features of multisensory integration in SC neurons and to describe how SC integrative capability – not present at birth – develops gradually during postnatal life depending on sensory experience with cross-modal stimuli. The second model tackles the problem of how tactile stimuli on a body part and visual (or auditory) stimuli close to the same body part are integrated in multimodal parietal neurons to form the perception of peripersonal (i.e., near) space. The model investigates how the extension of peripersonal space – where multimodal integration occurs – may be modified by experience such as use of a tool to interact with the far space. The utility of the modeling approach relies on several aspects: (i) The two models, although devoted to different problems and simulating different brain regions, share some common mechanisms (lateral inhibition and excitation, non-linear neuron characteristics, recurrent connections, competition, Hebbian rules of potentiation and depression) that may govern more generally the fusion of senses in the brain, and the learning and plasticity of multisensory integration. (ii) The models may help the interpretation of behavioral and psychophysical responses in terms of neural activity and synaptic connections. (iii) The models can make testable predictions that can help guide future experiments in order to validate, reject, or modify the main assumptions. PMID:21687448
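
    As a concrete illustration of the shared mechanisms listed in this abstract (Hebbian potentiation with depression, non-linear neuron responses, experience-dependent development), the toy sketch below shows a single model collicular neuron whose visual and auditory input weights strengthen with correlated cross-modal experience. All names and parameters are assumptions for illustration; this is not the authors' published model.

```python
import numpy as np

rng = np.random.default_rng(42)

weights = np.array([0.1, 0.1])   # [visual, auditory] synaptic weights, weak at "birth"
learning_rate = 0.01
decay = 0.001                    # crude depression term that keeps weights bounded

def neuron_output(inputs, weights):
    """Simple sigmoidal (non-linear) response of the model multisensory neuron."""
    return 1.0 / (1.0 + np.exp(-(inputs @ weights - 0.5)))

for trial in range(5000):
    # Cross-modal experience: the auditory input usually co-occurs with the visual one.
    visual = float(rng.random() < 0.5)
    auditory = visual if rng.random() < 0.8 else float(rng.random() < 0.5)
    inputs = np.array([visual, auditory])

    out = neuron_output(inputs, weights)
    weights += learning_rate * out * inputs - decay * weights   # Hebbian rule with decay

print("learned weights [visual, auditory]:", np.round(weights, 2))
```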

  9. Biochemistry and biology of the inducible multifunctional transcription factor TFII-I.

    PubMed

    Roy, A L

    2001-08-22

    An animal cell has the capability to respond to a variety of external signals through cell surface receptors. The response is usually manifested in terms of altered gene expression in the nucleus. Thus, in modern molecular and cell biology, it has become important to understand how the communication between extracellular signals and nuclear gene transcription is achieved. Originally discovered as a basal factor required for initiator-dependent transcription in vitro, recent evidence suggests that TFII-I is also an inducible multifunctional transcription factor that is activated in response to a variety of extracellular signals and translocates to the nucleus to turn on signal-induced genes. Here I review the biochemical and biological properties of TFII-I and related proteins in nuclear gene transcription, signal transduction and genetic disorders.

  10. Multi-Sensory and Sensorimotor Foundation of Bodily Self-Consciousness – An Interdisciplinary Approach

    PubMed Central

    Ionta, Silvio; Gassert, Roger; Blanke, Olaf

    2011-01-01

    Scientific investigations on the nature of the self have so far focused on high-level mechanisms. Recent evidence, however, suggests that low-level bottom-up mechanisms of multi-sensory integration play a fundamental role in encoding specific components of bodily self-consciousness, such as self-location and first-person perspective (Blanke and Metzinger, 2009). Self-location and first-person perspective are abnormal in neurological patients suffering from out-of-body experiences (Blanke et al., 2004), and can be manipulated experimentally in healthy subjects by imposing multi-sensory conflicts (Lenggenhager et al., 2009). Activity of the temporo-parietal junction (TPJ) reflects experimentally induced changes in self-location and first-person perspective (Ionta et al., 2011), and dysfunctions in TPJ are causally associated with out-of-body experiences (Blanke et al., 2002). We argue that TPJ is one of the key areas for multi-sensory integration of bodily self-consciousness, that its levels of activity reflect the experience of the conscious “I” as embodied and localized within bodily space, and that these mechanisms can be systematically investigated using state of the art technologies such as robotics, virtual reality, and non-invasive neuroimaging. PMID:22207860

  11. Multisensory integration of colors and scents: insights from bees and flowers.

    PubMed

    Leonard, Anne S; Masek, Pavel

    2014-06-01

    Karl von Frisch's studies of bees' color vision and chemical senses opened a window into the perceptual world of a species other than our own. A century of subsequent research on bees' visual and olfactory systems has developed along two productive but independent trajectories, leaving the questions of how and why bees use these two senses in concert largely unexplored. Given current interest in multimodal communication and recently discovered interplay between olfaction and vision in humans and Drosophila, understanding multisensory integration in bees is an opportunity to advance knowledge across fields. Using a classic ethological framework, we formulate proximate and ultimate perspectives on bees' use of multisensory stimuli. We discuss interactions between scent and color in the context of bee cognition and perception, focusing on mechanistic and functional approaches, and we highlight opportunities to further explore the development and evolution of multisensory integration. We argue that although the visual and olfactory worlds of bees are perhaps the best-studied of any non-human species, research focusing on the interactions between these two sensory modalities is vitally needed.

  12. The effect of early visual deprivation on the neural bases of multisensory processing.

    PubMed

    Guerreiro, Maria J S; Putzar, Lisa; Röder, Brigitte

    2015-06-01

    Developmental vision is deemed to be necessary for the maturation of multisensory cortical circuits. Thus far, this has only been investigated in animal studies, which have shown that congenital visual deprivation markedly reduces the capability of neurons to integrate cross-modal inputs. The present study investigated the effect of transient congenital visual deprivation on the neural mechanisms of multisensory processing in humans. We used functional magnetic resonance imaging to compare responses of visual and auditory cortical areas to visual, auditory and audio-visual stimulation in cataract-reversal patients and normally sighted controls. The results showed that cataract-reversal patients, unlike normally sighted controls, did not exhibit multisensory integration in auditory areas. Furthermore, cataract-reversal patients, but not normally sighted controls, exhibited lower visual cortical processing within visual cortex during audio-visual stimulation than during visual stimulation. These results indicate that congenital visual deprivation affects the capability of cortical areas to integrate cross-modal inputs in humans, possibly because visual processing is suppressed during cross-modal stimulation. Arguably, the lack of vision in the first months after birth may result in a reorganization of visual cortex, including the suppression of noisy visual input from the deprived retina in order to reduce interference during auditory processing.

  13. Neural Substrates of Reliability-Weighted Visual-Tactile Multisensory Integration

    PubMed Central

    Beauchamp, Michael S.; Pasalar, Siavash; Ro, Tony

    2010-01-01

    As sensory systems deteriorate in aging or disease, the brain must relearn the appropriate weights to assign each modality during multisensory integration. Using blood-oxygen level dependent functional magnetic resonance imaging of human subjects, we tested a model for the neural mechanisms of sensory weighting, termed “weighted connections.” This model holds that the connection weights between early and late areas vary depending on the reliability of the modality, independent of the level of early sensory cortex activity. When subjects detected viewed and felt touches to the hand, a network of brain areas was active, including visual areas in lateral occipital cortex, somatosensory areas in inferior parietal lobe, and multisensory areas in the intraparietal sulcus (IPS). In agreement with the weighted connection model, the connection weight measured with structural equation modeling between somatosensory cortex and IPS increased for somatosensory-reliable stimuli, and the connection weight between visual cortex and IPS increased for visual-reliable stimuli. This double dissociation of connection strengths was similar to the pattern of behavioral responses during incongruent multisensory stimulation, suggesting that weighted connections may be a neural mechanism for behavioral reliability weighting. PMID:20631844
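
    The behavioural counterpart of this "weighted connections" idea is the standard maximum-likelihood cue-combination rule, in which each modality is weighted by its inverse variance so that the more reliable cue dominates. The sketch below shows that generic computation with hypothetical visual and tactile estimates; it illustrates the rule itself, not this study's structural equation model.

```python
import numpy as np

def ml_cue_combination(estimates, variances):
    """Maximum-likelihood (inverse-variance weighted) combination of two or more cues.

    estimates : per-modality position estimates (e.g., visual, tactile).
    variances : per-modality noise variances; lower variance means higher reliability.
    Returns the fused estimate and its variance (never larger than the best cue's).
    """
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = (1.0 / variances) / np.sum(1.0 / variances)
    fused = np.sum(weights * estimates)
    fused_variance = 1.0 / np.sum(1.0 / variances)
    return fused, fused_variance

# Hypothetical example: reliable touch (variance 1) vs. noisy vision (variance 4)
print(ml_cue_combination(estimates=[10.0, 14.0], variances=[1.0, 4.0]))
# fused estimate 10.8 is pulled toward the tactile cue; fused variance is 0.8
```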

  14. Natural asynchronies in audiovisual communication signals regulate neuronal multisensory interactions in voice-sensitive cortex.

    PubMed

    Perrodin, Catherine; Kayser, Christoph; Logothetis, Nikos K; Petkov, Christopher I

    2015-01-06

    When social animals communicate, the onset of informative content in one modality varies considerably relative to the other, such as when visual orofacial movements precede a vocalization. These naturally occurring asynchronies do not disrupt intelligibility or perceptual coherence. However, they occur on time scales where they likely affect integrative neuronal activity in ways that have remained unclear, especially for hierarchically downstream regions in which neurons exhibit temporally imprecise but highly selective responses to communication signals. To address this, we exploited naturally occurring face- and voice-onset asynchronies in primate vocalizations. Using these as stimuli we recorded cortical oscillations and neuronal spiking responses from functional MRI (fMRI)-localized voice-sensitive cortex in the anterior temporal lobe of macaques. We show that the onset of the visual face stimulus resets the phase of low-frequency oscillations, and that the face-voice asynchrony affects the prominence of two key types of neuronal multisensory responses: enhancement or suppression. Our findings show a three-way association between temporal delays in audiovisual communication signals, phase-resetting of ongoing oscillations, and the sign of multisensory responses. The results reveal how natural onset asynchronies in cross-sensory inputs regulate network oscillations and neuronal excitability in the voice-sensitive cortex of macaques, a suggested animal model for human voice areas. These findings also advance predictions on the impact of multisensory input on neuronal processes in face areas and other brain regions.

  15. Uni- and multisensory brain areas are synchronised across spectators when watching unedited dance recordings.

    PubMed

    Jola, Corinne; McAleer, Phil; Grosbras, Marie-Hélène; Love, Scott A; Morison, Gordon; Pollick, Frank E

    2013-01-01

    The superior temporal sulcus (STS) and gyrus (STG) are commonly identified as functionally relevant for multisensory integration of audiovisual (AV) stimuli. However, most neuroimaging studies on AV integration used stimuli of short duration in explicit evaluative tasks. Importantly though, many of our AV experiences are of a long duration and ambiguous. It is unclear if the enhanced activity in audio, visual, and AV brain areas would also be synchronised over time across subjects when they are exposed to such multisensory stimuli. We used intersubject correlation to investigate which brain areas are synchronised across novices for uni- and multisensory versions of a 6-min 26-s recording of an unfamiliar, unedited Indian dance performance (Bharatanatyam). In Bharatanatyam, music and dance are choreographed together in a highly intermodal-dependent manner. Activity in the middle and posterior STG was significantly correlated between subjects and also showed significant enhancement for AV integration when the functional magnetic resonance signals were contrasted against each other using a general linear model conjunction analysis. These results extend previous studies by showing an intermediate step of synchronisation for novices: while there was a consensus across subjects' brain activity in areas relevant for unisensory processing and AV integration of related audio and visual stimuli, we found no evidence for synchronisation of higher level cognitive processes, suggesting these were idiosyncratic.
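
    Intersubject correlation of this kind reduces, at its core, to the average pairwise correlation of subjects' regional time courses recorded while they watch the same material. A minimal sketch of that computation follows, using a hypothetical array of per-subject time courses with a shared stimulus-driven component; it is illustrative and does not reproduce the authors' full analysis.

```python
import numpy as np
from itertools import combinations

def intersubject_correlation(timecourses):
    """Mean pairwise Pearson correlation across subjects.

    timecourses : array, shape (n_subjects, n_timepoints)
        Signal from one region for each subject watching the same recording.
    """
    pair_r = [
        np.corrcoef(timecourses[i], timecourses[j])[0, 1]
        for i, j in combinations(range(len(timecourses)), 2)
    ]
    return float(np.mean(pair_r))

# Hypothetical usage: 12 subjects, 200 time points, shared stimulus-driven signal
rng = np.random.default_rng(3)
shared = rng.standard_normal(200)
timecourses = shared + rng.standard_normal((12, 200))
print(f"ISC = {intersubject_correlation(timecourses):.2f}")
```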

  16. Multisensory Interactions Influence Neuronal Spike Train Dynamics in the Posterior Parietal Cortex

    PubMed Central

    VanGilder, Paul; Shi, Ying; Apker, Gregory; Dyson, Keith; Buneo, Christopher A.

    2016-01-01

    Although significant progress has been made in understanding multisensory interactions at the behavioral level, their underlying neural mechanisms remain relatively poorly understood in cortical areas, particularly during the control of action. In recent experiments where animals reached to and actively maintained their arm position at multiple spatial locations while receiving either proprioceptive or visual-proprioceptive position feedback, multisensory interactions were shown to be associated with reduced spiking (i.e. subadditivity) as well as reduced intra-trial and across-trial spiking variability in the superior parietal lobule (SPL). To further explore the nature of such interaction-induced changes in spiking variability we quantified the spike train dynamics of 231 of these neurons. Neurons were classified as Poisson, bursty, refractory, or oscillatory (in the 13–30 Hz “beta-band”) based on their spike train power spectra and autocorrelograms. No neurons were classified as Poisson-like in either the proprioceptive or visual-proprioceptive conditions. Instead, oscillatory spiking was most commonly observed with many neurons exhibiting these oscillations under only one set of feedback conditions. The results suggest that the SPL may belong to a putative beta-synchronized network for arm position maintenance and that position estimation may be subserved by different subsets of neurons within this network depending on available sensory information. In addition, the nature of the observed spiking variability suggests that models of multisensory interactions in the SPL should account for both Poisson-like and non-Poisson variability. PMID:28033334
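
    The spike-train classification described here rests on spectral analysis of the binned spike train. The sketch below shows one generic way to estimate such a spectrum with a Welch estimator and look for a beta-band (13-30 Hz) peak, using a simulated beta-modulated spike train; it is a simplified stand-in for the authors' power-spectrum and autocorrelogram analyses.

```python
import numpy as np
from scipy.signal import welch

def spike_train_spectrum(spike_times, duration, bin_ms=1.0):
    """Welch power spectrum of a binned, mean-subtracted spike train.

    spike_times : spike times in seconds; duration : recording length in seconds.
    Returns (frequencies in Hz, power spectral density).
    """
    fs = 1000.0 / bin_ms
    counts, _ = np.histogram(spike_times, bins=int(duration * fs), range=(0, duration))
    counts = counts - counts.mean()              # remove the DC component
    return welch(counts, fs=fs, nperseg=1024)

# Hypothetical spike train whose firing rate is modulated at 20 Hz for 10 s
rng = np.random.default_rng(4)
t = np.arange(0, 10, 0.001)
rate = 20 * (1 + np.sin(2 * np.pi * 20 * t))     # instantaneous rate in spikes/s
spikes = t[rng.random(t.size) < rate * 0.001]
freqs, power = spike_train_spectrum(spikes, duration=10.0)
beta = (freqs >= 13) & (freqs <= 30)
print("peak beta-band frequency:", freqs[beta][np.argmax(power[beta])], "Hz")
```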

  17. Preliminary evidence for deficits in multisensory integration in autism spectrum disorders: the mirror neuron hypothesis.

    PubMed

    Oberman, Lindsay M; Ramachandran, Vilayanur S

    2008-01-01

    Autism is a complex disorder, characterized by social, cognitive, communicative, and motor symptoms. One suggestion, proposed in the current study, to explain the spectrum of symptoms is an underlying impairment in multisensory integration (MSI) systems such as a mirror neuron-like system. The mirror neuron system, thought to play a critical role in skills such as imitation, empathy, and language can be thought of as a multisensory system, converting sensory stimuli into motor representations. Consistent with this, we report preliminary evidence for deficits in a task thought to tap into MSI--"the bouba-kiki task" in children with ASD. The bouba-kiki effect is produced when subjects are asked to pair nonsense shapes with nonsense "words". We found that neurotypical children chose the nonsense "word" whose phonemic structure corresponded with the visual shape of the stimuli 88% of the time. This is presumably because of mirror neuron-like multisensory systems that integrate the visual shape with the corresponding motor gestures used to pronounce the nonsense word. Surprisingly, individuals with ASD only chose the corresponding name 56% of the time. The poor performance by the ASD group on this task suggests a deficit in MSI, perhaps related to impaired MSI brain systems. Though this is a behavioral study, it provides a testable hypothesis for the communication impairments in children with ASD that implicates a specific neural system and fits well with the current findings suggesting an impairment in the mirror systems in individuals with ASD.

  18. Impairments of multisensory integration and cross-sensory learning as pathways to dyslexia.

    PubMed

    Hahn, Noemi; Foxe, John J; Molholm, Sophie

    2014-11-01

    Two sensory systems are intrinsic to learning to read. Written words enter the brain through the visual system and associated sounds through the auditory system. The task before the beginning reader is quite basic. She must learn correspondences between orthographic tokens and phonemic utterances, and she must do this to the point that there is seamless automatic 'connection' between these sensorially distinct units of language. It is self-evident then that learning to read requires formation of cross-sensory associations to the point that deeply encoded multisensory representations are attained. While the majority of individuals manage this task to a high degree of expertise, some struggle to attain even rudimentary capabilities. Why do dyslexic individuals, who learn well in myriad other domains, fail at this particular task? Here, we examine the literature as it pertains to multisensory processing in dyslexia. We find substantial support for multisensory deficits in dyslexia, and make the case that to fully understand its neurological basis, it will be necessary to thoroughly probe the integrity of auditory-visual integration mechanisms.

  19. Lost in space: multisensory conflict yields adaptation in spatial representations across frames of reference.

    PubMed

    Lohmann, Johannes; Butz, Martin V

    2017-03-27

    According to embodied cognition, bodily interactions with our environment shape the perception and representation of our body and the surrounding space, that is, peripersonal space. To investigate the adaptive nature of these spatial representations, we introduced a multisensory conflict between vision and proprioception in an immersive virtual reality. During individual bimanual interaction trials, we gradually shifted the visual hand representation. As a result, participants unknowingly shifted their actual hands to compensate for the visual shift. We then measured the adaptation to the invoked multisensory conflict by means of a self-localization and an external localization task. While effects of the conflict were observed in both tasks, the effects systematically interacted with the type of localization task and the available visual information while performing the localization task (i.e., the visibility of the virtual hands). The results imply that the localization of one's own hands is based on a multisensory integration process, which is modulated by the saliency of the currently most relevant sensory modality and the involved frame of reference. Moreover, the results suggest that our brain strives for consistency between its body and spatial estimates, thereby adapting multiple, related frames of reference, and the spatial estimates within, due to a sensory conflict in one of them.

  20. Testing sensory and multisensory function in children with autism spectrum disorder.

    PubMed

    Baum, Sarah H; Stevenson, Ryan A; Wallace, Mark T

    2015-04-22

    In addition to impairments in social communication and the presence of restricted interests and repetitive behaviors, deficits in sensory processing are now recognized as a core symptom in autism spectrum disorder (ASD). Our ability to perceive and interact with the external world is rooted in sensory processing. For example, listening to a conversation entails processing the auditory cues coming from the speaker (speech content, prosody, syntax) as well as the associated visual information (facial expressions, gestures). Collectively, the "integration" of these multisensory (i.e., combined audiovisual) pieces of information results in better comprehension. Such multisensory integration has been shown to be strongly dependent upon the temporal relationship of the paired stimuli. Thus, stimuli that occur in close temporal proximity are highly likely to result in behavioral and perceptual benefits--gains believed to be reflective of the perceptual system's judgment of the likelihood that these two stimuli came from the same source. Changes in this temporal integration are expected to strongly alter perceptual processes, and are likely to diminish the ability to accurately perceive and interact with our world. Here, a battery of tasks designed to characterize various aspects of sensory and multisensory temporal processing in children with ASD is described. In addition to its utility in autism, this battery has great potential for characterizing changes in sensory function in other clinical populations, as well as being used to examine changes in these processes across the lifespan.

  1. Functional specializations of the ventral intraparietal area for multisensory heading discrimination.

    PubMed

    Chen, Aihua; Deangelis, Gregory C; Angelaki, Dora E

    2013-02-20

    The ventral intraparietal area (VIP) of the macaque brain is a multimodal cortical region with directionally selective responses to visual and vestibular stimuli. To explore how these signals contribute to self-motion perception, neural activity in VIP was monitored while macaques performed a fine heading discrimination task based on vestibular, visual, or multisensory cues. For neurons with congruent visual and vestibular heading tuning, discrimination thresholds improved during multisensory stimulation, suggesting that VIP (like the medial superior temporal area; MSTd) may contribute to the improved perceptual discrimination seen during cue combination. Unlike MSTd, however, few VIP neurons showed opposite visual/vestibular tuning over the range of headings relevant to behavior, and those few cells showed reduced sensitivity under cue combination. Our data suggest that the heading tuning of some VIP neurons may be locally remodeled to increase the proportion of cells with congruent tuning over the behaviorally relevant stimulus range. VIP neurons also showed much stronger trial-by-trial correlations with perceptual decisions (choice probabilities; CPs) than MSTd neurons. While this may suggest that VIP neurons are more strongly linked to heading perception, we also find that correlated noise is much stronger among pairs of VIP neurons, with noise correlations averaging 0.14 in VIP as compared with 0.04 in MSTd. Thus, the large CPs in VIP could be a consequence of strong interneuronal correlations. Together, our findings suggest that VIP neurons show specializations that may make them well equipped to play a role in multisensory integration for heading perception.
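
    Two of the quantities reported here have standard, easily sketched definitions: noise correlation is the Pearson correlation of two neurons' trial-by-trial responses after removing stimulus-driven means, and choice probability is the ROC area separating a neuron's firing rates on trials ending in each choice. The code below illustrates both on hypothetical data (all variable names and numbers are made up); in practice responses are usually z-scored within stimulus condition before computing choice probability, and this is not the authors' analysis code.

```python
import numpy as np
from scipy.stats import rankdata

def noise_correlation(rates_a, rates_b, stimulus_ids):
    """Pearson correlation of residual (per-stimulus mean-subtracted) responses."""
    resid_a, resid_b = rates_a.astype(float).copy(), rates_b.astype(float).copy()
    for stim in np.unique(stimulus_ids):
        mask = stimulus_ids == stim
        resid_a[mask] -= resid_a[mask].mean()
        resid_b[mask] -= resid_b[mask].mean()
    return np.corrcoef(resid_a, resid_b)[0, 1]

def choice_probability(rates, choices):
    """ROC area separating firing rates on 'preferred' (1) vs 'null' (0) choice trials."""
    pref, null = rates[choices == 1], rates[choices == 0]
    ranks = rankdata(np.concatenate([pref, null]))       # average ranks handle ties
    u_stat = ranks[: len(pref)].sum() - len(pref) * (len(pref) + 1) / 2.0
    return u_stat / (len(pref) * len(null))              # Mann-Whitney U / (n1 * n2) = AUC

# Hypothetical data: 200 trials, two neurons sharing correlated noise, binary choices
rng = np.random.default_rng(5)
stimulus_ids = rng.integers(0, 2, 200)
shared_noise = rng.standard_normal(200)
rates_a = 20 + 5 * stimulus_ids + 2 * shared_noise + rng.standard_normal(200)
rates_b = 18 + 4 * stimulus_ids + 2 * shared_noise + rng.standard_normal(200)
choices = (rates_a + rng.standard_normal(200) > rates_a.mean()).astype(int)
print("noise correlation:", round(noise_correlation(rates_a, rates_b, stimulus_ids), 2))
print("choice probability:", round(choice_probability(rates_a, choices), 2))
```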

  2. Multisensory Integration and Calibration in Children and Adults with and without Sensory and Motor Disabilities.

    PubMed

    Gori, Monica

    2015-01-01

    During the first years of life, sensory modalities communicate with each other. This process is fundamental for the development of unisensory and multisensory skills. The absence of one sensory input impacts on the development of other modalities. Since 2008 we have studied these aspects and developed our cross-sensory calibration theory. This theory emerged from the observation that children start to integrate multisensory information (such as vision and touch) only after 8-10 years of age. Before this age, the more accurate sense teaches (calibrates) the others; when one calibrating modality is missing, the other modalities are impaired. Children with visual disability have problems in understanding the haptic or auditory perception of space and children with motor disabilities have problems in understanding the visual dimension of objects. This review presents our recent studies on multisensory integration and cross-sensory calibration in children and adults with and without sensory and motor disabilities. The goal of this review is to show the importance of interaction between sensory systems during the early period of life in order for correct perceptual development to occur.

  3. Recurrent network for multisensory integration-identification of common sources of audiovisual stimuli.

    PubMed

    Yamashita, Itsuki; Katahira, Kentaro; Igarashi, Yasuhiko; Okanoya, Kazuo; Okada, Masato

    2013-01-01

    We perceive our surrounding environment by using different sense organs. However, it is not clear how the brain estimates information from our surroundings from the multisensory stimuli it receives. While Bayesian inference provides a normative account of the computational principle at work in the brain, it does not provide information on how the nervous system actually implements the computation. To provide an insight into how the neural dynamics are related to multisensory integration, we constructed a recurrent network model that can implement computations related to multisensory integration. Our model not only extracts information from noisy neural activity patterns, it also estimates a causal structure; i.e., it can infer whether the different stimuli came from the same source or different sources. We show that our model can reproduce the results of psychophysical experiments on spatial unity and localization bias which indicate that a shift occurs in the perceived position of a stimulus through the effect of another simultaneous stimulus. The experimental data have been reproduced in previous studies using Bayesian models. By comparing the Bayesian model and our neural network model, we investigated how the Bayesian prior is represented in neural circuits.
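
    The normative Bayesian account that such network models are usually compared against is the causal-inference model, in which the observer computes the posterior probability that the auditory and visual samples arose from a single common source rather than two independent sources. A minimal sketch of that posterior under Gaussian likelihoods and a zero-centred Gaussian prior is shown below; it is a generic illustration of the computation, not the specific model or parameters used in this paper.

```python
import numpy as np

def posterior_common_cause(x_a, x_v, sigma_a, sigma_v, sigma_prior, p_common=0.5):
    """Posterior probability that auditory and visual samples share one source.

    x_a, x_v : noisy auditory and visual position samples.
    sigma_a, sigma_v : sensory noise standard deviations.
    sigma_prior : SD of a zero-mean Gaussian prior over source positions.
    p_common : prior probability of a common cause.
    """
    va, vv, vp = sigma_a ** 2, sigma_v ** 2, sigma_prior ** 2
    # Likelihood of the sample pair under one common source
    denom1 = va * vv + va * vp + vv * vp
    like_c1 = (np.exp(-((x_a - x_v) ** 2 * vp + x_a ** 2 * vv + x_v ** 2 * va)
                      / (2 * denom1))
               / (2 * np.pi * np.sqrt(denom1)))
    # Likelihood under two independent sources
    like_c2 = (np.exp(-(x_a ** 2 / (2 * (va + vp)) + x_v ** 2 / (2 * (vv + vp))))
               / (2 * np.pi * np.sqrt((va + vp) * (vv + vp))))
    return p_common * like_c1 / (p_common * like_c1 + (1 - p_common) * like_c2)

# Nearby samples favour a common cause; widely separated samples favour two causes.
print(posterior_common_cause(x_a=1.0, x_v=2.0, sigma_a=2.0, sigma_v=1.0, sigma_prior=10.0))
print(posterior_common_cause(x_a=-8.0, x_v=8.0, sigma_a=2.0, sigma_v=1.0, sigma_prior=10.0))
```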

  4. Auditory projections to extrastriate visual cortex: connectional basis for multisensory processing in 'unimodal' visual neurons.

    PubMed

    Clemo, H Ruth; Sharma, Giriraj K; Allman, Brian L; Meredith, M Alex

    2008-10-01

    Neurophysiological studies have recently documented multisensory properties in 'unimodal' visual neurons of the cat posterolateral lateral suprasylvian (PLLS) cortex, a retinotopically organized area involved in visual motion processing. In this extrastriate visual area, a region has been identified where both visual and auditory stimuli were independently effective in activating neurons (bimodal zone), as well as a second region where visually-evoked activity was significantly facilitated by concurrent auditory stimulation but was unaffected by auditory stimulation alone (subthreshold multisensory region). Given their different distributions, the possible corticocortical connectivity underlying these distinct forms of crossmodal convergence was examined using biotinylated dextran amine (BDA) tracer methods in 21 adult cats. The auditory cortical areas examined included the anterior auditory field (AAF), primary auditory cortex (AI), dorsal zone (DZ), secondary auditory cortex (AII), field of the rostral suprasylvian sulcus (FRS), field of the anterior ectosylvian sulcus (FAES) and the posterior auditory field (PAF). Of these regions, the DZ, AI, AII, and FAES were found to project to both the bimodal zone and the subthreshold region of the PLLS. This convergence of crossmodal inputs to the PLLS suggests not only that complex auditory information has access to this region but also that these connections provide the substrate for the different forms (bimodal versus subthreshold) of multisensory processing which may facilitate its functional role in visual motion processing.

  5. Multisensory representation of the space near the hand: from perception to action and interindividual interactions.

    PubMed

    Brozzoli, Claudio; Ehrsson, H Henrik; Farnè, Alessandro

    2014-04-01

    When interacting with objects and other people, the brain needs to locate our limbs and the relevant visual information surrounding them. Studies on monkeys showed that information from different sensory modalities converge at the single cell level within a set of interconnected multisensory frontoparietal areas. It is largely accepted that this network allows for multisensory processing of the space surrounding the body (peripersonal space), whose function has been linked to the sensory guidance of appetitive and defensive movements, and localization of the limbs in space. In the current review, we consider multidisciplinary findings about the processing of the space near the hands in humans and offer a convergent view of its functions and underlying neural mechanisms. We will suggest that evolution has provided the brain with a clever tool for representing visual information around the hand, which takes the hand itself as a reference for the coding of surrounding visual space. We will contend that the hand-centered representation of space, known as perihand space, is a multisensory-motor interface that allows interaction with the objects and other persons around us.

  6. Multi-sensory and sensorimotor foundation of bodily self-consciousness - an interdisciplinary approach.

    PubMed

    Ionta, Silvio; Gassert, Roger; Blanke, Olaf

    2011-01-01

    Scientific investigations on the nature of the self have so far focused on high-level mechanisms. Recent evidence, however, suggests that low-level bottom-up mechanisms of multi-sensory integration play a fundamental role in encoding specific components of bodily self-consciousness, such as self-location and first-person perspective (Blanke and Metzinger, 2009). Self-location and first-person perspective are abnormal in neurological patients suffering from out-of-body experiences (Blanke et al., 2004), and can be manipulated experimentally in healthy subjects by imposing multi-sensory conflicts (Lenggenhager et al., 2009). Activity of the temporo-parietal junction (TPJ) reflects experimentally induced changes in self-location and first-person perspective (Ionta et al., 2011), and dysfunctions in TPJ are causally associated with out-of-body experiences (Blanke et al., 2002). We argue that TPJ is one of the key areas for multi-sensory integration of bodily self-consciousness, that its levels of activity reflect the experience of the conscious "I" as embodied and localized within bodily space, and that these mechanisms can be systematically investigated using state of the art technologies such as robotics, virtual reality, and non-invasive neuroimaging.

  7. Multisensory processing in children with autism: high-density electrical mapping of auditory-somatosensory integration.

    PubMed

    Russo, Natalie; Foxe, John J; Brandwein, Alice B; Altschuler, Ted; Gomes, Hilary; Molholm, Sophie

    2010-10-01

    Successful integration of signals from the various sensory systems is crucial for normal sensory-perceptual functioning, allowing for the perception of coherent objects rather than a disconnected cluster of fragmented features. Several prominent theories of autism suggest that automatic integration is impaired in this population, but there have been few empirical tests of this thesis. A standard electrophysiological metric of multisensory integration (MSI) was used to test the integrity of auditory-somatosensory integration in children with autism (N=17, aged 6-16 years), compared to age- and IQ-matched typically developing (TD) children. High-density electrophysiology was recorded while participants were presented with either auditory or somatosensory stimuli alone (unisensory conditions), or as a combined auditory-somatosensory stimulus (multisensory condition), in randomized order. Participants watched a silent movie during testing, ignoring concurrent stimulation. Significant differences between neural responses to the multisensory auditory-somatosensory stimulus and the unisensory stimuli (the sum of the responses to the auditory and somatosensory stimuli when presented alone) served as the dependent measure. The data revealed group differences in the integration of auditory and somatosensory information that appeared at around 175 ms, and were characterized by the presence of MSI for the TD but not the autism spectrum disorder (ASD) children. Overall, MSI was less extensive in the ASD group. These findings are discussed within the framework of current knowledge of MSI in typical development as well as in relation to theories of ASD.

  8. Natural asynchronies in audiovisual communication signals regulate neuronal multisensory interactions in voice-sensitive cortex

    PubMed Central

    Perrodin, Catherine; Kayser, Christoph; Logothetis, Nikos K.; Petkov, Christopher I.

    2015-01-01

    When social animals communicate, the onset of informative content in one modality varies considerably relative to the other, such as when visual orofacial movements precede a vocalization. These naturally occurring asynchronies do not disrupt intelligibility or perceptual coherence. However, they occur on time scales where they likely affect integrative neuronal activity in ways that have remained unclear, especially for hierarchically downstream regions in which neurons exhibit temporally imprecise but highly selective responses to communication signals. To address this, we exploited naturally occurring face- and voice-onset asynchronies in primate vocalizations. Using these as stimuli we recorded cortical oscillations and neuronal spiking responses from functional MRI (fMRI)-localized voice-sensitive cortex in the anterior temporal lobe of macaques. We show that the onset of the visual face stimulus resets the phase of low-frequency oscillations, and that the face–voice asynchrony affects the prominence of two key types of neuronal multisensory responses: enhancement or suppression. Our findings show a three-way association between temporal delays in audiovisual communication signals, phase-resetting of ongoing oscillations, and the sign of multisensory responses. The results reveal how natural onset asynchronies in cross-sensory inputs regulate network oscillations and neuronal excitability in the voice-sensitive cortex of macaques, a suggested animal model for human voice areas. These findings also advance predictions on the impact of multisensory input on neuronal processes in face areas and other brain regions. PMID:25535356

  9. Perceptuo-motor compatibility governs multisensory integration in bimanual coordination dynamics.

    PubMed

    Zelic, Gregory; Mottet, Denis; Lagarde, Julien

    2016-02-01

    The brain has the remarkable ability to bind together inputs from different sensory origin into a coherent percept. Behavioral benefits can result from such ability, e.g., a person typically responds faster and more accurately to cross-modal stimuli than to unimodal stimuli. To date, it is, however, largely unknown whether such multisensory benefits, shown for discrete reactive behaviors, generalize to the continuous coordination of movements. The present study addressed multisensory integration from the perspective of bimanual coordination dynamics, where the perceptual activity no longer triggers a single response but continuously guides the motor action. The task consisted in coordinating anti-symmetrically the continuous flexion-extension of the index fingers, while synchronizing with an external pacer. Three different configurations of metronome were tested, for which we examined whether a cross-modal pacing (audio-tactile beats) improved the stability of the coordination in comparison with unimodal pacing condition (auditory or tactile beats). We found a more stable bimanual coordination for cross-modal pacing, but only when the metronome configuration directly matched the anti-symmetric coordination pattern. We conclude that multisensory integration can benefit the continuous coordination of movements; however, this is constrained by whether the perceptual and motor activities match in space and time.

  10. The electrophysiological time course of the interaction of stimulus conflict and the multisensory spread of attention.

    PubMed

    Zimmer, U; Itthipanyanan, S; Grent-'t-Jong, T; Woldorff, M G

    2010-05-01

    Previously, we have shown that spatial attention to a visual stimulus can spread across both space and modality to a synchronously presented but task-irrelevant sound arising from a different location, reflected by a late-onsetting, sustained, negative-polarity event-related potential (ERP) wave over frontal-central scalp sites, probably originating in part from the auditory cortices. Here we explore the influence of cross-modal conflict on the amplitude and temporal dynamics of this multisensory spreading-of-attention activity. Subjects attended selectively to one of two concurrently presented lateral visually-presented letter streams to perform a sequential comparison task, while ignoring task-irrelevant, centrally presented spoken letters that could occur synchronously with either the attended or unattended lateral visual letters and could be either congruent or incongruent with them. Extracted auditory ERPs revealed that, collapsed across congruency conditions, attentional spreading across modalities started at approximately 220 ms, replicating our earlier findings. The interaction between attentional spreading and conflict occurred beginning at approximately 300 ms, with attentional-spreading activity being larger for incongruent trials. Thus, the increased processing of an incongruent, task-irrelevant sound during multisensory stimulation appeared to occur some time after attention had spread from the attended visual part to the ignored auditory part, presumably reflecting conflict detection and associated attentional capture that require the accrual of multisensory interaction processes at a higher-level semantic processing stage.

  11. Uni- and multisensory brain areas are synchronised across spectators when watching unedited dance recordings

    PubMed Central

    Jola, Corinne; McAleer, Phil; Grosbras, Marie-Hélène; Love, Scott A.; Morison, Gordon; Pollick, Frank E.

    2013-01-01

    The superior temporal sulcus (STS) and gyrus (STG) are commonly identified to be functionally relevant for multisensory integration of audiovisual (AV) stimuli. However, most neuroimaging studies on AV integration used stimuli of short duration in explicit evaluative tasks. Importantly though, many of our AV experiences are of a long duration and ambiguous. It is unclear if the enhanced activity in audio, visual, and AV brain areas would also be synchronised over time across subjects when they are exposed to such multisensory stimuli. We used intersubject correlation to investigate which brain areas are synchronised across novices for uni- and multisensory versions of a 6-min 26-s recording of an unfamiliar, unedited Indian dance performance (Bharatanatyam). In Bharatanatyam, music and dance are choreographed together in a highly intermodal-dependent manner. Activity in the middle and posterior STG was significantly correlated between subjects and also showed significant enhancement for AV integration when the functional magnetic resonance signals were contrasted against each other using a general linear model conjunction analysis. These results extend previous studies by showing an intermediate step of synchronisation for novices: while there was a consensus across subjects' brain activity in areas relevant for unisensory processing and AV integration of related audio and visual stimuli, we found no evidence for synchronisation of higher-level cognitive processes, suggesting these were idiosyncratic. PMID:24349687
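    The intersubject correlation approach named in this record can be summarized with a short sketch. The leave-one-out variant below correlates each spectator's regional time course with the average time course of the remaining spectators; the data, array shapes, and subject count are assumptions for illustration, not the study's own code.

    ```python
    # Hedged sketch of a leave-one-out intersubject correlation (ISC) analysis,
    # the general approach described in the record above (synthetic data).
    import numpy as np

    def leave_one_out_isc(timecourses):
        """timecourses: array of shape (n_subjects, n_timepoints) for one brain region.
        Returns one ISC value per subject: the Pearson correlation between that
        subject's time course and the average time course of all other subjects."""
        n_subj = timecourses.shape[0]
        isc = np.empty(n_subj)
        for s in range(n_subj):
            others = np.delete(timecourses, s, axis=0).mean(axis=0)
            isc[s] = np.corrcoef(timecourses[s], others)[0, 1]
        return isc

    # Example: 12 hypothetical spectators, 386 fMRI volumes (~6 min 26 s run).
    rng = np.random.default_rng(0)
    shared = rng.standard_normal(386)                     # stimulus-driven signal
    data = shared + 0.8 * rng.standard_normal((12, 386))  # plus subject-specific noise
    print(leave_one_out_isc(data).mean())
    ```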

  12. Impairments of Multisensory Integration and Cross-Sensory Learning as Pathways to Dyslexia

    PubMed Central

    Hahn, Noemi; Foxe, John J.; Molholm, Sophie

    2014-01-01

    Two sensory systems are intrinsic to learning to read. Written words enter the brain through the visual system and associated sounds through the auditory system. The task before the beginning reader is quite basic. She must learn correspondences between orthographic tokens and phonemic utterances, and she must do this to the point that there is seamless automatic ‘connection’ between these sensorially distinct units of language. It is self-evident then that learning to read requires formation of cross-sensory associations to the point that deeply encoded multisensory representations are attained. While the majority of individuals manage this task to a high degree of expertise, some struggle to attain even rudimentary capabilities. Why do dyslexic individuals, who learn well in myriad other domains, fail at this particular task? Here, we examine the literature as it pertains to multisensory processing in dyslexia. We find substantial support for multisensory deficits in dyslexia, and make the case that to fully understand its neurological basis, it will be necessary to thoroughly probe the integrity of auditory-visual integration mechanisms. PMID:25265514

  13. The sense of body ownership relaxes temporal constraints for multisensory integration

    PubMed Central

    Maselli, Antonella; Kilteni, Konstantina; López-Moliner, Joan; Slater, Mel

    2016-01-01

    Experimental work on body ownership illusions showed how simple multisensory manipulation can generate the illusory experience of an artificial limb as being part of the own-body. This work highlighted how own-body perception relies on a plastic brain representation emerging from multisensory integration. The flexibility of this representation is reflected in the short-term modulations of physiological states and perceptual processing observed during these illusions. Here, we explore the impact of ownership illusions on the temporal dimension of multisensory integration. We show that, during the illusion, the temporal window for integrating touch on the physical body with touch seen on a virtual body representation, increases with respect to integration with visual events seen close but separated from the virtual body. We show that this effect is mediated by the ownership illusion. Crucially, the temporal window for visuotactile integration was positively correlated with participants’ scores rating the illusory experience of owning the virtual body and touching the object seen in contact with it. Our results corroborate the recently proposed causal inference mechanism for illusory body ownership. As a novelty, they show that the ensuing illusory causal binding between stimuli from the real and fake body relaxes constraints for the integration of bodily signals. PMID:27485049

  14. Meson multiplicity versus energy in relativistic nucleus-nucleus collisions

    NASA Technical Reports Server (NTRS)

    Atwater, T. W.; Freier, P. S.

    1986-01-01

    A systematic study of meson multiplicity as a function of energy at energies up to 100 GeV/u in nucleus-nucleus collisions has been made, using cosmic-ray data in nuclear emulsion. The data are consistent with simple nucleon-nucleon superposition models. Multiplicity per interacting nucleon in AA collisions does not appear to differ significantly from pp collisions.

  15. Momentum loss in proton-nucleus and nucleus-nucleus collisions

    NASA Technical Reports Server (NTRS)

    Khan, Ferdous; Townsend, Lawrence W.

    1993-01-01

    An optical model description, based on multiple scattering theory, of longitudinal momentum loss in proton-nucleus and nucleus-nucleus collisions is presented. The crucial role of the imaginary component of the nucleon-nucleon transition matrix in accounting for longitudinal momentum transfer is demonstrated. Results obtained with this model are compared with Intranuclear Cascade (INC) calculations, as well as with predictions from Vlasov-Uehling-Uhlenbeck (VUU) and quantum molecular dynamics (QMD) simulations. Comparisons are also made with experimental data where available. These indicate that the present model is adequate to account for longitudinal momentum transfer in both proton-nucleus and nucleus-nucleus collisions over a wide range of energies.

  16. The Galactic Nucleus

    NASA Astrophysics Data System (ADS)

    Melia, Fulvio

    Exciting new broadband observations of the galactic nucleus have placed the heart of the Milky Way under intense scrutiny in recent years. This has been due in part to the growing interest from theorists motivated to study the physics of black hole accretion, magnetized gas dynamics, and unusual star formation. The center of our Galaxy is now known to harbor the most compelling supermassive black hole candidate, weighing in at 3-4 million solar masses. Its nearby environment is comprised of a molecular dusty ring, clusters of evolved and young stars, diffuse hot gas, ionized gas streamers, and several supernova remnants. This chapter will focus on the physical makeup of this dynamic region and the feasibility of actually imaging the black hole's shadow in the coming decade with mm interferometry.

  17. The use of multi-sensory interventions to manage dementia-related behaviours in the residential aged care setting: a survey of one Australian state.

    PubMed

    Bauer, Michael; Rayner, Jo-Anne; Koch, Susan; Chenco, Carol

    2012-11-01

    To describe the use of multi-sensory interventions in residential aged care services (RACS) for the management of dementia-related behaviours in Victoria, Australia. The popularity of multi-sensory interventions has spread worldwide, including for use in residential aged care, despite limited evidence to support their efficacy. This study reports the findings of the first stage of a two-stage project that was undertaken to describe and evaluate the use of multi-sensory interventions for the management of dementia-related behaviours in all residential aged care facilities in Victoria, Australia. A computer-assisted telephone interview survey was developed and administered to residential aged care facilities in Victoria, Australia, to collect descriptive data on the use of multi-sensory interventions for the management of dementia-related behaviours. A diverse and eclectic range of multi-sensory interventions is currently being used by residential aged care facilities. The findings suggest that multi-sensory interventions are used in an ad hoc manner: there is no universal definition of multi-sensory interventions, little formal training for staff administering the interventions, no guidelines for their use, and no evaluation of their impact on residents' behaviour. Multi-sensory interventions have been widely adopted for use in RACS in Victoria, Australia, and are currently being used without formal guidelines and with little evidence to support their use in clinical practice. In the absence of a formal definition of what constitutes a multi-sensory intervention, training for staff, and careful assessment and monitoring of residents who receive multi-sensory interventions, we recommend further research and the development of policy and procedures to safeguard the use of multi-sensory interventions for people with dementia. © 2012 Blackwell Publishing Ltd.

  18. EHF multifunction phased array antenna

    NASA Astrophysics Data System (ADS)

    Solbach, Klaus

    1986-07-01

    The design of a low cost demonstration EHF multifunction phased array antenna is described. Both the radiating elements and the phase-shifter circuits are realized on microstrip substrate material in order to allow photolithographic batch fabrication. Self-encapsulated beam-lead PIN-diodes are employed as the electronic switch elements to avoid expensive hermetic encapsulation of the semiconductors or complete circuits. A space-feed using a horn-radiator to illuminate the array from the front-side is found to be the simplest and most inexpensive feed. The phased array antenna thus operates as a reflect-array, with the antenna elements employed in a dual role: collecting energy from the feed-horn and re-radiating the phase-shifted waves (in transmit mode). The antenna is divided into modules containing the radiator/phase-shifter plate plus drive- and BITE-circuitry at the back. Both drive- and BITE-components use gate-array integrated circuits especially designed for the purpose. Several bus-systems are used to supply bias and logical data flows to the modules. The beam-steering unit utilizes several signal processors and high-speed discrete adder circuits to combine the pointing, frequency and beam-shape information from the radar system computer with the stored phase-shift codes for the array elements. Since space, weight and power consumption are prime considerations, only the most advanced technology is used in the design of both the microwave and the digital/drive circuitry.

  19. Compensatory Recovery after Multisensory Stimulation in Hemianopic Patients: Behavioral and Neurophysiological Components

    PubMed Central

    Grasso, Paolo A.; Làdavas, Elisabetta; Bertini, Caterina

    2016-01-01

    Lateralized post-chiasmatic lesions of the primary visual pathway result in loss of visual perception in the field retinotopically corresponding to the damaged cortical area. However, patients with visual field defects have shown enhanced detection and localization of multisensory audio-visual pairs presented in the blind field. This preserved multisensory integrative ability (i.e., crossmodal blindsight) seems to be subserved by the spared retino-colliculo-dorsal pathway. According to this view, audio-visual integrative mechanisms could be used to increase the functionality of the spared circuit and, as a consequence, might represent an important tool for the rehabilitation of visual field defects. The present study tested this hypothesis, investigating whether exposure to systematic multisensory audio-visual stimulation could induce long-lasting improvements in the visual performance of patients with visual field defects. A group of 10 patients with chronic visual field defects were exposed to audio-visual training for 4 h daily, over a period of 2 weeks. Behavioral, oculomotor and electroencephalography (EEG) measures were recorded during several visual tasks before and after audio-visual training. After audio-visual training, improvements in visual search abilities, visual detection, self-perceived disability in daily life activities and oculomotor parameters were found, suggesting the implementation of more effective visual exploration strategies. At the electrophysiological level, after training, patients showed a significant reduction of the P3 amplitude in response to stimuli presented in the intact field, reflecting a reduction in attentional resources allocated to the intact field, which might co-occur with a shift of spatial attention towards the blind field. More interestingly, both the behavioral improvements and the electrophysiological changes observed after training were found to be stable at a follow-up session (on average, 8 months after training

  20. Compensatory Recovery after Multisensory Stimulation in Hemianopic Patients: Behavioral and Neurophysiological Components.

    PubMed

    Grasso, Paolo A; Làdavas, Elisabetta; Bertini, Caterina

    2016-01-01

    Lateralized post-chiasmatic lesions of the primary visual pathway result in loss of visual perception in the field retinotopically corresponding to the damaged cortical area. However, patients with visual field defects have shown enhanced detection and localization of multisensory audio-visual pairs presented in the blind field. This preserved multisensory integrative ability (i.e., crossmodal blindsight) seems to be subserved by the spared retino-colliculo-dorsal pathway. According to this view, audio-visual integrative mechanisms could be used to increase the functionality of the spared circuit and, as a consequence, might represent an important tool for the rehabilitation of visual field defects. The present study tested this hypothesis, investigating whether exposure to systematic multisensory audio-visual stimulation could induce long-lasting improvements in the visual performance of patients with visual field defects. A group of 10 patients with chronic visual field defects were exposed to audio-visual training for 4 h daily, over a period of 2 weeks. Behavioral, oculomotor and electroencephalography (EEG) measures were recorded during several visual tasks before and after audio-visual training. After audio-visual training, improvements in visual search abilities, visual detection, self-perceived disability in daily life activities and oculomotor parameters were found, suggesting the implementation of more effective visual exploration strategies. At the electrophysiological level, after training, patients showed a significant reduction of the P3 amplitude in response to stimuli presented in the intact field, reflecting a reduction in attentional resources allocated to the intact field, which might co-occur with a shift of spatial attention towards the blind field. More interestingly, both the behavioral improvements and the electrophysiological changes observed after training were found to be stable at a follow-up session (on average, 8 months after training

  1. The intercalatus nucleus of Staderini.

    PubMed

    Cascella, Marco

    2016-01-01

    Rutilio Staderini was one of the leading Italian anatomists of the twentieth century, alongside scientists such as Giulio Chiarugi and Giovanni Vitali. He belonged to a new generation of anatomists who continued the tradition of the most famous Italian scientists, a tradition stretching from the Renaissance to the nineteenth century. Although he carried out important studies in neuroanatomy, comparative anatomy, and embryology, his name is rarely remembered by medical historians. It is linked to the nucleus he discovered: the Staderini nucleus, or intercalated nucleus, a collection of nerve cells in the medulla oblongata located lateral to the hypoglossal nucleus. This article focuses on the biography of the neuroanatomist, the nucleus that carries his name, and his other research, especially in comparative anatomy and embryology.

  2. Multifunctional nanocomposite foams for space applications

    NASA Astrophysics Data System (ADS)

    Rollins, Diandra J.

    Materials combined with a small amount of nanoparticles offer new possibilities for synthesizing multifunctional materials. Graphene nanoplatelets (GnP) are multifunctional nanoreinforcing agents consisting of stacks of graphene sheets with properties comparable to those of a single graphene layer, at an overall lower cost and in a more robust form. Such particles have been shown to have good thermal, mechanical and electrical properties. In addition, a low density multifunctional nanocomposite foam has the potential for multiple applications, including use in the aerospace industry. This dissertation investigates two different microporous (foam) polymers that are modified by the addition of GnP to combat this density effect and improve the foams' macroscopic properties. Three sizes of GnP with varying aspect ratios were used to improve the polymeric foams' dielectric, electrical and mechanical properties. (Abstract shortened by ProQuest.)

  3. Multifunctional materials for bone cancer treatment

    PubMed Central

    Marques, Catarina; Ferreira, José MF; Andronescu, Ecaterina; Ficai, Denisa; Sonmez, Maria; Ficai, Anton

    2014-01-01

    The purpose of this review is to present the most recent findings in bone tissue engineering. Special attention is given to multifunctional materials based on collagen and collagen–hydroxyapatite composites used for skin and bone cancer treatments. The multi-functionality of these materials was obtained by adding to the base regenerative grafts proper components, such as ferrites (magnetite being the most important representative), cytostatics (cisplatin, carboplatin, vincristine, methotrexate, paclitaxel, doxorubicin), silver nanoparticles, antibiotics (anthracyclines, geldanamycin), and/or analgesics (ibuprofen, fentanyl). The suitability of complex systems for the intended applications was systematically analyzed. The developmental possibilities of multifunctional materials with regenerative and curative roles (antitumoral as well as pain management) in the field of skin and bone cancer treatment are discussed. It is worth mentioning that better materials are likely to be developed by combining conventional and unconventional experimental strategies. PMID:24920907

  4. Multisensory Convergence of Visual and Vestibular Heading Cues in the Pursuit Area of the Frontal Eye Field.

    PubMed

    Gu, Yong; Cheng, Zhixian; Yang, Lihua; DeAngelis, Gregory C; Angelaki, Dora E

    2016-09-01

    Both visual and vestibular sensory cues are important for perceiving one's direction of heading during self-motion. Previous studies have identified multisensory, heading-selective neurons in the dorsal medial superior temporal area (MSTd) and the ventral intraparietal area (VIP). Both MSTd and VIP have strong recurrent connections with the pursuit area of the frontal eye field (FEFsem), but whether FEFsem neurons may contribute to multisensory heading perception remains unknown. We characterized the tuning of macaque FEFsem neurons to visual, vestibular, and multisensory heading stimuli. About two-thirds of FEFsem neurons exhibited significant heading selectivity based on either vestibular or visual stimulation. These multisensory neurons shared many properties, including distributions of tuning strength and heading preferences, with MSTd and VIP neurons. Fisher information analysis also revealed that the average FEFsem neuron was almost as sensitive as MSTd or VIP cells. Visual and vestibular heading preferences in FEFsem tended to be either matched (congruent cells) or discrepant (opposite cells), such that combined stimulation strengthened heading selectivity for congruent cells but weakened heading selectivity for opposite cells. These findings demonstrate that, in addition to oculomotor functions, FEFsem neurons also exhibit properties that may allow them to contribute to a cortical network that processes multisensory heading cues.
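    The Fisher information analysis mentioned in this record is, for a single neuron, commonly approximated from the tuning curve as FI(h) = f'(h)^2 / var(h). The sketch below illustrates that computation on a hypothetical heading tuning curve; the tuning parameters and the Poisson-like variance assumption are placeholders, not values from the study.

    ```python
    # Minimal sketch of a single-neuron Fisher information estimate of heading
    # sensitivity from a tuning curve: FI(h) = f'(h)^2 / var(h).
    # Hypothetical tuning curve; illustrates the analysis named above, not the authors' code.
    import numpy as np

    headings = np.linspace(-180, 180, 9)                        # tested heading directions (deg)
    mean_rate = 20 + 15 * np.cos(np.radians(headings - 30))     # assumed tuning curve (spikes/s)
    var_rate = mean_rate.copy()                                 # assume Poisson-like variance

    slope = np.gradient(mean_rate, headings)                    # df/dh, spikes/s per degree
    fisher_info = slope ** 2 / var_rate                         # information per degree^2
    print(fisher_info.max())                                    # peak discriminability along the curve
    ```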

  5. To bridge or not to bridge the multisensory time gap: bimanual coordination to sound and touch with temporal lags.

    PubMed

    Roy, C; Dalla Bella, S; Lagarde, J

    2017-01-01

    Living in a complex and multisensory environment involves constant interaction between perception and action. There is evidence that multisensory integration is governed by temporal factors, such as physiological synchrony between cross-modal stimuli favouring multisensory benefit, and the existence of a range of asynchrony between the stimuli which affords their binding (the temporal window of integration). These factors were examined in this study in a bimanual sensorimotor synchronization task with cross-modal stimuli. Participants synchronized each hand to a pair of audio-tactile stimuli, in which the asynchrony between the onsets of the auditory and tactile stimuli was systematically manipulated. In cross-modal conditions, they were instructed to tap either to the auditory stimuli or to the tactile stimuli. The results revealed a temporal window of integration of 160 ms, centred between 40 and 80 ms (tactile first). Moreover, the temporal interval between the auditory and tactile stimuli affected the stability of bimanual coordination and of synchronization only when participants were instructed to synchronize with the tactile stimuli. Overall, the results indicate that both physiological asynchrony and the temporal window of integration apply to cross-modal integration in a bimanual synchronization task. In addition, they show an effect of auditory dominance on multisensory temporal processes. This study sheds light on the role of temporal factors in multisensory processes when perception and action are rhythmic and coupled.

  6. Learning multisensory representations for auditory-visual transfer of sequence category knowledge: a probabilistic language of thought approach.

    PubMed

    Yildirim, Ilker; Jacobs, Robert A

    2015-06-01

    If a person is trained to recognize or categorize objects or events using one sensory modality, the person can often recognize or categorize those same (or similar) objects and events via a novel modality. This phenomenon is an instance of cross-modal transfer of knowledge. Here, we study the Multisensory Hypothesis which states that people extract the intrinsic, modality-independent properties of objects and events, and represent these properties in multisensory representations. These representations underlie cross-modal transfer of knowledge. We conducted an experiment evaluating whether people transfer sequence category knowledge across auditory and visual domains. Our experimental data clearly indicate that we do. We also developed a computational model accounting for our experimental results. Consistent with the probabilistic language of thought approach to cognitive modeling, our model formalizes multisensory representations as symbolic "computer programs" and uses Bayesian inference to learn these representations. Because the model demonstrates how the acquisition and use of amodal, multisensory representations can underlie cross-modal transfer of knowledge, and because the model accounts for subjects' experimental performances, our work lends credence to the Multisensory Hypothesis. Overall, our work suggests that people automatically extract and represent objects' and events' intrinsic properties, and use these properties to process and understand the same (and similar) objects and events when they are perceived through novel sensory modalities.

  7. Disintegration of Multisensory Signals from the Real Hand Reduces Default Limb Self-Attribution: An fMRI Study

    PubMed Central

    Guterstam, Arvid; Brozzoli, Claudio; Ehrsson, H. Henrik

    2013-01-01

    The perception of our limbs in space is built upon the integration of visual, tactile, and proprioceptive signals. Accumulating evidence suggests that these signals are combined in areas of premotor, parietal, and cerebellar cortices. However, it remains to be determined whether neuronal populations in these areas integrate hand signals according to basic temporal and spatial congruence principles of multisensory integration. Here, we developed a setup based on advanced 3D video technology that allowed us to manipulate the spatiotemporal relationships of visuotactile (VT) stimuli delivered on a healthy human participant's real hand during fMRI and investigate the ensuing neural and perceptual correlates. Our experiments revealed two novel findings. First, we found responses in premotor, parietal, and cerebellar regions that were dependent upon the spatial and temporal congruence of VT stimuli. This multisensory integration effect required a simultaneous match between the seen and felt postures of the hand, which suggests that congruent visuoproprioceptive signals from the upper limb are essential for successful VT integration. Second, we observed that multisensory conflicts significantly disrupted the default feeling of ownership of the seen real limb, as indexed by complementary subjective, psychophysiological, and BOLD measures. The degree to which self-attribution was impaired could be predicted from the attenuation of neural responses in key multisensory areas. These results elucidate the neural bases of the integration of multisensory hand signals according to basic spatiotemporal principles and demonstrate that the disintegration of these signals leads to “disownership” of the seen real hand. PMID:23946393

  8. The fMRI BOLD response to unisensory and multisensory smoking cues in nicotine-dependent adults

    PubMed Central

    Cortese, Bernadette M.; Uhde, Thomas W.; Brady, Kathleen T.; McClernon, F. Joseph; Yang, Qing X.; Collins, Heather R.; LeMatty, Todd; Hartwell, Karen J.

    2015-01-01

    Given that the vast majority of functional magnetic resonance imaging (fMRI) studies of drug cue reactivity use unisensory visual cues, but that multisensory cues may elicit greater craving-related brain responses, the current study sought to compare the fMRI BOLD response to unisensory visual and multisensory, visual plus odor, smoking cues in 17 nicotine-dependent adult cigarette smokers. Brain activation to smoking-related, compared to neutral, pictures was assessed under cigarette smoke and odorless odor conditions. While smoking pictures elicited a pattern of activation consistent with the addiction literature, the multisensory (odor + picture) smoking cues elicited significantly greater and more widespread activation in mainly frontal and temporal regions. BOLD signal elicited by the multi-sensory, but not unisensory cues, was significantly related to participants’ level of control over craving as well. Results demonstrated that the co-presentation of cigarette smoke odor with smoking-related visual cues, compared to the visual cues alone, elicited greater levels of craving-related brain activation in key regions implicated in reward. These preliminary findings support future research aimed at a better understanding of multisensory integration of drug cues and craving. PMID:26475784

  9. Novel hybrid multifunctional magnetoelectric porous composite films

    NASA Astrophysics Data System (ADS)

    Martins, P.; Gonçalves, R.; Lopes, A. C.; Venkata Ramana, E.; Mendiratta, S. K.; Lanceros-Mendez, S.

    2015-12-01

    Novel multifunctional porous films have been developed by the integration of magnetic CoFe2O4 (CFO) nanoparticles into poly(vinylidene fluoride-trifluoroethylene) (P(VDF-TrFE)), taking advantage of the synergies of the magnetostrictive filler and the piezoelectric polymer. The porous films show a piezoelectric response with an effective d33 coefficient of -22 pC N⁻¹, a maximum magnetization of 12 emu g⁻¹ and a maximum magnetoelectric coefficient of 9 mV cm⁻¹ Oe⁻¹. In this way, a multifunctional membrane has been developed that is suitable for advanced applications ranging from biomedicine to water treatment.
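    As a rough illustration of what the reported magnetoelectric coefficient implies for a device, the output voltage scales with film thickness and drive field; the thickness and field in the worked example below are assumed values chosen for illustration, not quantities taken from the record.

    ```latex
    % Illustrative estimate (assumed thickness and drive field, not from the paper):
    % output voltage implied by the reported magnetoelectric coefficient.
    V_{\mathrm{out}} \;=\; \alpha_{\mathrm{ME}}\, t\, H_{\mathrm{ac}}
      \;=\; 9~\frac{\mathrm{mV}}{\mathrm{cm \cdot Oe}} \times 0.005~\mathrm{cm} \times 1~\mathrm{Oe}
      \;\approx\; 45~\mu\mathrm{V}
    ```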

  10. Complex Multifunctional Polymer/Carbon-Nanotube Composites

    NASA Technical Reports Server (NTRS)

    Patel, Pritesh; Balasubramaniyam, Gobinath; Chen, Jian

    2009-01-01

    A methodology for developing complex multifunctional materials that consist of or contain polymer/carbon-nanotube composites has been conceived. As used here, "multifunctional" signifies having additional and/or enhanced physical properties that polymers or polymer-matrix composites would not ordinarily be expected to have. Such properties include useful amounts of electrical conductivity, increased thermal conductivity, and/or increased strength. In the present methodology, these properties are imparted to a given composite through the choice and processing of its polymeric and CNT constituents.

  11. The development of fluorescence turn-on probe for Al(III) sensing and live cell nucleus-nucleoli staining

    NASA Astrophysics Data System (ADS)

    Saini, Anoop Kumar; Sharma, Vinay; Mathur, Pradeep; Shaikh, Mobin M.

    2016-10-01

    The morphology of the nucleus and nucleolus is a powerful indicator of physiological and pathological conditions. Specific staining of the nucleolus has recently gained much attention due to the limited and expensive availability of the only existing stain, “SYTO RNA-Select”. Here, a new multifunctional salen-type ligand (L1) and its Al3+ complex (1) are designed and synthesized. L1 acts as a chemosensor for Al3+, whereas 1 demonstrates specific staining of the nucleus as well as the nucleoli. The binding of 1 with nucleic acid is probed by DNase and RNase digestion in stained cells. 1 shows excellent photostability, whereas limited photostability is a drawback of existing nucleus stains during long-term observations. 1 is considered a potential candidate as an alternative to expensive commercial dyes for nucleus and nucleoli staining.

  12. The development of fluorescence turn-on probe for Al(III) sensing and live cell nucleus-nucleoli staining

    PubMed Central

    Saini, Anoop Kumar; Sharma, Vinay; Mathur, Pradeep; Shaikh, Mobin M.

    2016-01-01

    The morphology of the nucleus and nucleolus is a powerful indicator of physiological and pathological conditions. Specific staining of the nucleolus has recently gained much attention due to the limited and expensive availability of the only existing stain, “SYTO RNA-Select”. Here, a new multifunctional salen-type ligand (L1) and its Al3+ complex (1) are designed and synthesized. L1 acts as a chemosensor for Al3+, whereas 1 demonstrates specific staining of the nucleus as well as the nucleoli. The binding of 1 with nucleic acid is probed by DNase and RNase digestion in stained cells. 1 shows excellent photostability, whereas limited photostability is a drawback of existing nucleus stains during long-term observations. 1 is considered a potential candidate as an alternative to expensive commercial dyes for nucleus and nucleoli staining. PMID:27721431

  13. Mechanics of the Nucleus

    PubMed Central

    Lammerding, Jan

    2015-01-01

    The nucleus is the distinguishing feature of eukaryotic cells. Until recently, it was often considered simply as a unique compartment containing the genetic information of the cell and associated machinery, without much attention to its structure and mechanical properties. This article provides compelling examples that illustrate how specific nuclear structures are associated with important cellular functions, and how defects in nuclear mechanics can cause a multitude of human diseases. During differentiation, embryonic stem cells modify their nuclear envelope composition and chromatin structure, resulting in stiffer nuclei that reflect decreased transcriptional plasticity. In contrast, neutrophils have evolved characteristic lobulated nuclei that increase their physical plasticity, enabling passage through narrow tissue spaces in their response to inflammation. Research on diverse cell types further demonstrates how induced nuclear deformations during cellular compression or stretch can modulate cellular function. Pathological examples of disturbed nuclear mechanics include the many diseases caused by mutations in the nuclear envelope proteins lamin A/C and associated proteins, as well as cancer cells that are often characterized by abnormal nuclear morphology. In this article, we will focus on determining the functional relationship between nuclear mechanics and cellular (dys-)function, describing the molecular changes associated with physiological and pathological examples, the resulting defects in nuclear mechanics, and the effects on cellular function. New insights into the close relationship between nuclear mechanics and cellular organization and function will yield a better understanding of normal biology and will offer new clues into therapeutic approaches to the various diseases associated with defective nuclear mechanics. PMID:23737203

  14. Hyperon-nucleus potentials

    NASA Astrophysics Data System (ADS)

    Dover, C. B.; Gal, A.

    We review models for the interaction of baryons (N, Λ, Σ and Ξ) with nuclei, emphasizing the underlying meson exchange picture. Starting from a phenomenological one boson exchange model (the Nijmegen potential, as an example) which accounts for the available NN, ΛN and ΣN two-body scattering data, we show how to construct the effective baryon-nucleon interaction (G-matrix). Employing the folding model, we then obtain the many-body potentials for bound states in terms of the nuclear density and the appropriate spin-isospin weighted G-matrices. The models we emphasize most impose SU(3) constraints on baryon-baryon coupling constants (SU(3) is broken through the use of physical masses), although we also compare with rough estimates based on quark model relations between coupling constants. We stress the essential unity and economy of such models, in which nucleon and hyperon-nucleus potentials are intimately related via SU(3), and the connection between the two-body and many-body potentials is preserved. We decompose the nuclear potentials into central and spin-orbit parts, each of which is isospin dependent. For nucleons, the microscopic origin of the isospin-dependent Lane potential V_1^N is clarified. For Λ and Σ hyperons, the one boson exchange model with SU(3) constraints leads to one-body spin-orbit strengths V_LS^B which are relatively weak (V_LS^Λ ≈ 1.5-2 MeV, V_LS^Σ ≈ 2.5-3 MeV, compared to V_LS^N ≈ 7-9 MeV). We demonstrate the interplay between symmetric and antisymmetric two-body spin-orbit forces which give rise to these results, as well as the special role of K and K* exchange for hyperons. We contrast these results with predictions based on the naive quark model. From S- and P-wave two-body interactions, a Lane potential for the Σ of depth V_1^Σ ≈ 50-60 MeV is predicted, although this result is somewhat uncertain. For the Ξ, the nuclear potential is very different in various models for the two-body interaction based on SU(3) or the quark
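    The folding-model construction described in this record can be written schematically as a convolution of the nuclear density with the effective hyperon-nucleon G-matrix interaction, then split into central and spin-orbit parts. The notation below is chosen here for illustration and is not the authors' exact expression.

    ```latex
    % Schematic folding-model potential (illustrative notation, not the paper's own):
    U_Y(\mathbf{r}) \;=\; \int d^3r'\, \rho_A(\mathbf{r}')\, G_{YN}\bigl(|\mathbf{r}-\mathbf{r}'|\bigr),
    \qquad
    U_Y(r) \;=\; U_Y^{\mathrm{central}}(r) \;+\; U_Y^{\mathrm{s.o.}}(r)\,\boldsymbol{\ell}\cdot\mathbf{s}
    ```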

  15. Brain and language: evidence for neural multifunctionality.

    PubMed

    Cahana-Amitay, Dalia; Albert, Martin L

    2014-01-01

    This review paper presents converging evidence from studies of brain damage and longitudinal studies of language in aging which supports the following thesis: the neural basis of language can best be understood by the concept of neural multifunctionality. In this paper the term "neural multifunctionality" refers to incorporation of nonlinguistic functions into language models of the intact brain, reflecting a multifunctional perspective whereby a constant and dynamic interaction exists among neural networks subserving cognitive, affective, and praxic functions with neural networks specialized for lexical retrieval, sentence comprehension, and discourse processing, giving rise to language as we know it. By way of example, we consider effects of executive system functions on aspects of semantic processing among persons with and without aphasia, as well as the interaction of executive and language functions among older adults. We conclude by indicating how this multifunctional view of brain-language relations extends to the realm of language recovery from aphasia, where evidence of the influence of nonlinguistic factors on the reshaping of neural circuitry for aphasia rehabilitation is clearly emerging.

  16. Multifunctional lubricant additives and compositions thereof

    SciTech Connect

    Farng, L.O.; Horodysky, A.G.

    1991-03-26

    This paper discusses an antioxidant/antiwear/extreme pressure/load carrying lubricant composition. It comprises a major proportion of an oil of lubricating viscosity, or grease or other solid lubricant prepared therefrom, and a minor amount of an ashless multifunctional antioxidant/antiwear/extreme pressure/load carrying additive product comprising a thiophosphate derived from a dihydrocarbyl dithiocarbamate.

  17. Multisensory processing of naturalistic objects in motion: a high-density electrical mapping and source estimation study.

    PubMed

    Senkowski, Daniel; Saint-Amour, Dave; Kelly, Simon P; Foxe, John J

    2007-07-01

    In everyday life, we continuously and effortlessly integrate the multiple sensory inputs from objects in motion. For instance, the sound and the visual percept of vehicles in traffic provide us with complementary information about the location and motion of vehicles. Here, we used high-density electrical mapping and local auto-regressive average (LAURA) source estimation to study the integration of multisensory objects in motion as reflected in event-related potentials (ERPs). A randomized stream of naturalistic multisensory-audiovisual (AV), unisensory-auditory (A), and unisensory-visual (V) "splash" clips (i.e., a drop falling and hitting a water surface) was presented among non-naturalistic abstract motion stimuli. The visual clip onset preceded the "splash" onset by 100 ms for multisensory stimuli. For naturalistic objects early multisensory integration effects beginning 120-140 ms after sound onset were observed over posterior scalp, with distributed sources localized to occipital cortex, temporal lobule, insular, and medial frontal gyrus (MFG). These effects, together with longer latency interactions (210-250 and 300-350 ms) found in a widespread network of occipital, temporal, and frontal areas, suggest that naturalistic objects in motion are processed at multiple stages of multisensory integration. The pattern of integration effects differed considerably for non-naturalistic stimuli. Unlike naturalistic objects, no early interactions were found for non-naturalistic objects. The earliest integration effects for non-naturalistic stimuli were observed 210-250 ms after sound onset including large portions of the inferior parietal cortex (IPC). As such, there were clear differences in the cortical networks activated by multisensory motion stimuli as a consequence of the semantic relatedness (or lack thereof) of the constituent sensory elements.

  18. Unexpected doubly-magic nucleus.

    SciTech Connect

    Janssens, R. V. F.; Physics

    2009-01-01

    Nuclei with a 'magic' number of both protons and neutrons, dubbed doubly magic, are particularly stable. The oxygen isotope ²⁴O has been found to be one such nucleus - yet it lies just at the limit of stability.

  19. Multifunctional, High-Temperature Nanocomposites

    NASA Technical Reports Server (NTRS)

    Connell, John W.; Smith, Joseph G.; Siochi, Emilie J.; Working, Dennis C.; Criss, Jim M.; Watson, Kent A.; Delozier, Donavon M.; Ghose, Sayata

    2007-01-01

    In experiments conducted as part of a continuing effort to incorporate multifunctionality into advanced composite materials, blends of multi-walled carbon nanotubes and a resin denoted "PETI-330" (wherein "PETI" is an abbreviation for "phenylethynyl-terminated imide") were prepared, characterized, and fabricated into moldings. PETI-330 was selected as the matrix resin in these experiments because of its low melt viscosity (<10 poise at a temperature of 280 °C), excellent melt stability (lifetime >2 hours at 280 °C), and high temperature performance (>1,000 hours at 288 °C). The multi-walled carbon nanotubes (MWCNTs), obtained from the University of Kentucky, were selected because of their electrical and thermal conductivity and their small diameters. The purpose of these experiments was to determine the combination of thermal, electrical, and mechanical properties achievable while still maintaining melt processability. The PETI-330/MWCNT mixtures were prepared at concentrations ranging from 3 to 25 weight-percent of MWCNTs by dry mixing of the constituents in a ball mill using zirconia beads. The resulting powders were characterized for degree of mixing and thermal and rheological properties. The neat resin was found to have melt viscosity between 5 and 10 poise. At 280 °C and a fixed strain rate, the viscosity was found to increase with time. At this temperature, the phenylethynyl groups do not readily react and so no significant curing of the resin occurred. For MWCNT-filled samples, melt viscosity was reasonably steady at 280 °C and was greater in samples containing greater proportions of MWCNTs. The melt viscosity for 20 weight-percent of MWCNTs was found to be ~28,000 poise, which is lower than the initial estimated allowable maximum value of 60,000 poise for injection molding. Hence, MWCNT loadings of as much as 20 percent were deemed to be suitable compositions for scale-up. High-resolution scanning electron microscopy (HRSEM) showed the MWCNTs to be well

  20. Magnetically Attached Multifunction Maintenance Rover

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Joffe, Benjamin

    2005-01-01

    A versatile mobile telerobot, denoted the magnetically attached multifunction maintenance rover (MAGMER), has been proposed for use in the inspection and maintenance of the surfaces of ships, tanks containing petrochemicals, and other large ferromagnetic structures. As its name suggests, this robot would utilize magnetic attraction to adhere to a structure. As it moved along the surface of the structure, the MAGMER would perform tasks that could include close-up visual inspection by use of video cameras, various sensors, and/or removal of paint by water-jet blasting, laser heating, or induction heating. The water-jet nozzles would be mounted coaxially within compressed-air-powered venturi nozzles that would collect the paint debris dislodged by the jets. The MAGMER would be deployed, powered, and controlled from a truck, to which it would be connected by hoses for water, compressed air, and collection of debris and by cables for electric power and communication (see Figure 1). The operation of the MAGMER on a typical large structure would necessitate the use of long cables and hoses, which can be heavy. To reduce the load of the hoses and cables on the MAGMER and thereby ensure its ability to adhere to vertical and overhanging surfaces, the hoses and cables would be paid out through telescopic booms that would be parts of a MAGMER support system. The MAGMER would move by use of four motorized, steerable wheels, each of which would be mounted in an assembly that would include permanent magnets and four pole pieces (see Figure 2). The wheels would protrude from between the pole pieces by only about 3 mm, so that the gap between the pole pieces and the ferromagnetic surface would be just large enough to permit motion along the surface but not so large as to reduce the magnetic attraction excessively. In addition to the wheel assemblies, the MAGMER would include magnetic adherence enhancement fixtures, which would comprise arrays of permanent magnets and pole pieces

  1. Multisensory emotion perception in congenitally, early, and late deaf CI users.

    PubMed

    Fengler, Ineke; Nava, Elena; Villwock, Agnes K; Büchner, Andreas; Lenarz, Thomas; Röder, Brigitte

    2017-01-01

    Emotions are commonly recognized by combining auditory and visual signals (i.e., vocal and facial expressions). Yet it is unknown whether the ability to link emotional signals across modalities depends on early experience with audio-visual stimuli. In the present study, we investigated the role of auditory experience at different stages of development for auditory, visual, and multisensory emotion recognition abilities in three groups of adolescent and adult cochlear implant (CI) users. CI users had a different deafness onset and were compared to three groups of age- and gender-matched hearing control participants. We hypothesized that congenitally deaf (CD) but not early deaf (ED) and late deaf (LD) CI users would show reduced multisensory interactions and a higher visual dominance in emotion perception than their hearing controls. The CD (n = 7), ED (deafness onset: <3 years of age; n = 7), and LD (deafness onset: >3 years; n = 13) CI users and the control participants performed an emotion recognition task with auditory, visual, and audio-visual emotionally congruent and incongruent nonsense speech stimuli. In different blocks, participants judged either the vocal (Voice task) or the facial expressions (Face task). In the Voice task, all three CI groups performed overall less efficiently than their respective controls and experienced higher interference from incongruent facial information. Furthermore, the ED CI users benefitted more than their controls from congruent faces and the CD CI users showed an analogous trend. In the Face task, recognition efficiency of the CI users and controls did not differ. Our results suggest that CI users acquire multisensory interactions to some degree, even after congenital deafness. When judging affective prosody they appear impaired and more strongly biased by concurrent facial information than typically hearing individuals. We speculate that limitations inherent to the CI contribute to these group differences.

  2. Multisensory Integration in Non-Human Primates during a Sensory-Motor Task.

    PubMed

    Lanz, Florian; Moret, Véronique; Rouiller, Eric Michel; Loquet, Gérard

    2013-01-01

    Daily, our central nervous system receives inputs via several sensory modalities, processes them, and integrates information in order to produce a suitable behavior. The remarkable part is that such multisensory integration brings all information into a unified percept. An approach to start investigating this property is to show that perception is better and faster when multimodal stimuli are used as compared to unimodal stimuli. This forms the first part of the present study, conducted in a non-human primate model (n = 2) engaged in a detection sensory-motor task where visual and auditory stimuli were displayed individually or simultaneously. The measured parameters were the reaction time (RT) between stimulus and onset of arm movement, the percentages of successes and errors, as well as the evolution of these parameters with training. As expected, RTs were shorter when the subjects were exposed to combined stimuli. The gains for both subjects were around 20 and 40 ms, as compared with the auditory and visual stimulus alone, respectively. Moreover, the number of correct responses increased in response to bimodal stimuli. We interpreted this multisensory advantage in terms of a redundant signal effect, which decreases perceptual ambiguity, increases the speed of stimulus detection, and improves performance accuracy. The second part of the study presents single-unit recordings derived from the premotor cortex (PM) of the same subjects during the sensory-motor task. Response patterns to sensory/multisensory stimulation are documented and specific type proportions are reported. Characterization of bimodal neurons indicates a mechanism of audio-visual integration, possibly through a decrease of inhibition. Nevertheless, the neural processing leading to faster motor responses from PM, a polysensory association cortical area, remains unclear.
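    The reaction-time gain described in this record (roughly 20-40 ms for bimodal stimuli) can be computed from per-trial response times as in the minimal sketch below; the RT distributions are simulated placeholders, not the monkeys' data.

    ```python
    # Minimal sketch (simulated data): computing the multisensory reaction-time gain,
    # i.e. how much faster responses are to combined audio-visual stimuli than to the
    # faster of the two unisensory conditions.
    import numpy as np

    rng = np.random.default_rng(1)
    rt_auditory = rng.normal(330, 40, 500)      # simulated RTs in ms
    rt_visual = rng.normal(350, 45, 500)
    rt_audiovisual = rng.normal(310, 35, 500)

    best_unisensory = min(np.median(rt_auditory), np.median(rt_visual))
    gain = best_unisensory - np.median(rt_audiovisual)
    print(f"multisensory gain: {gain:.1f} ms")  # on the order of the 20-40 ms reported
    ```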

  3. Multisensory and modality specific processing of visual speech in different regions of the premotor cortex

    PubMed Central

    Callan, Daniel E.; Jones, Jeffery A.; Callan, Akiko

    2014-01-01

    Behavioral and neuroimaging studies have demonstrated that brain regions involved with speech production also support speech perception, especially under degraded conditions. The premotor cortex (PMC) has been shown to be active during both observation and execution of action (“Mirror System” properties), and may facilitate speech perception by mapping unimodal and multimodal sensory features onto articulatory speech gestures. For this functional magnetic resonance imaging (fMRI) study, participants identified vowels produced by a speaker in audio-visual (saw the speaker's articulating face and heard her voice), visual only (only saw the speaker's articulating face), and audio only (only heard the speaker's voice) conditions with varying audio signal-to-noise ratios in order to determine the regions of the PMC involved with multisensory and modality specific processing of visual speech gestures. The task was designed so that identification could be made with a high level of accuracy from visual only stimuli to control for task difficulty and differences in intelligibility. The results of the functional magnetic resonance imaging (fMRI) analysis for visual only and audio-visual conditions showed overlapping activity in inferior frontal gyrus and PMC. The left ventral inferior premotor cortex (PMvi) showed properties of multimodal (audio-visual) enhancement with a degraded auditory signal. The left inferior parietal lobule and right cerebellum also showed these properties. The left ventral superior and dorsal premotor cortex (PMvs/PMd) did not show this multisensory enhancement effect, but there was greater activity for the visual only over audio-visual conditions in these areas. The results suggest that the inferior regions of the ventral premotor cortex are involved with integrating multisensory information, whereas, more superior and dorsal regions of the PMC are involved with mapping unimodal (in this case visual) sensory features of the speech signal with

  4. Parkinson's disease alters multisensory perception: Insights from the Rubber Hand Illusion.

    PubMed

    Ding, Catherine; Palmer, Colin J; Hohwy, Jakob; Youssef, George J; Paton, Bryan; Tsuchiya, Naotsugu; Stout, Julie C; Thyagarajan, Dominic

    2017-03-01

    Manipulation of multisensory integration induces illusory perceptions of body ownership. Patients with Parkinson's disease (PD), a neurodegenerative disorder characterised by striatal dopamine deficiency, are prone to illusions and hallucinations and have sensory deficits. Dopaminergic treatment also aggravates hallucinations in PD. Whether multisensory integration in body ownership is altered by PD is unexplored. To study the effect of dopamine neurotransmission on illusory perceptions of body ownership. We studied the Rubber Hand Illusion (RHI) in 21 PD patients (on- and off-medication) and 21 controls. In this experimental paradigm, synchronous stroking of a rubber hand and the subject's hidden real hand results in the illusory experience of 'feeling' the rubber hand, and proprioceptive mislocalisation of the real hand towards the rubber hand ('proprioceptive drift'). Asynchronous stroking typically attenuates the RHI. The effect of PD on illusory experience depended on the stroking condition (b = -2.15, 95% CI [-3.06, -1.25], p < .0001): patients scored questionnaire items eliciting the RHI experience higher than controls in the illusion-attenuating (asynchronous) condition, but not in the illusion-promoting (synchronous) condition. PD, independent of stroking condition, predicted greater proprioceptive drift (b = 15.05, 95% CI [6.05, 24.05], p = .0022); the longer the disease duration, the greater the proprioceptive drift. However, the RHI did not affect subsequent reaching actions. On-medication patients scored both illusion (critical) and mock (control) questionnaire items higher than when off-medication, an effect that increased with disease severity (log (OR) =.014, 95% CI [.01, .02], p < .0001). PD affects illusory perceptions of body ownership in situations that do not typically induce them, implicating dopamine deficit and consequent alterations in cortico-basal ganglia-thalamic circuitry in multisensory integration. Dopaminergic treatment appears to

  5. The sound-induced flash illusion reveals dissociable age-related effects in multisensory integration.

    PubMed

    McGovern, David P; Roudaia, Eugenie; Stapleton, John; McGinnity, T Martin; Newell, Fiona N

    2014-01-01

    While aging can lead to significant declines in perceptual and cognitive function, the effects of age on multisensory integration, the process in which the brain combines information across the senses, are less clear. Recent reports suggest that older adults are susceptible to the sound-induced flash illusion (Shams et al., 2000) across a much wider range of temporal asynchronies than younger adults (Setti et al., 2011). To assess whether this cost for multisensory integration is a general phenomenon of combining asynchronous audiovisual input, we compared the time courses of two variants of the sound-induced flash illusion in young and older adults: the fission illusion, where one flash accompanied by two beeps appears as two flashes, and the fusion illusion, where two flashes accompanied by one beep appear as one flash. Twenty-five younger (18-30 years) and older (65+ years) adults were required to report whether they perceived one or two flashes, whilst ignoring irrelevant auditory beeps, in bimodal trials where auditory and visual stimuli were separated by one of six stimulus onset asynchronies (SOAs). There was a marked difference in the pattern of results for the two variants of the illusion. In conditions known to produce the fission illusion, older adults were significantly more susceptible to the illusion at longer SOAs compared to younger participants. In contrast, the performance of the younger and older groups was almost identical in conditions known to produce the fusion illusion. This surprising difference between sound-induced fission and fusion in older adults suggests dissociable age-related effects in multisensory integration, consistent with the idea that these illusions are mediated by distinct neural mechanisms.

  6. Functional mobility and balance in community-dwelling elderly submitted to multisensory versus strength exercises

    PubMed Central

    Alfieri, Fábio Marcon; Riberto, Marcelo; Gatz, Lucila Silveira; Ribeiro, Carla Paschoal Corsi; Lopes, José Augusto Fernandes; Santarém, José Maria; Battistella, Linamara Rizzo

    2010-01-01

    It is well documented that aging impairs balance and functional mobility. The objective of this study was to compare the efficacy of multisensory versus strength exercises on these parameters. We performed a single-blinded randomized controlled trial with 46 community-dwelling elderly allocated to strength ([GST], N = 23, 70.2 ± 4.8 years old) or multisensory ([GMS], N = 23, 68.8 ± 5.9 years old) exercises twice a week for 12 weeks. Subjects were evaluated by blinded raters using the timed ‘up and go’ test (TUG), the Guralnik test battery, and a force platform. By the end of the treatment, the GMS group showed significant improvements in the TUG (9.1 ± 1.9 seconds (s) to 8.0 ± 1.0 s, P = 0.002), the Guralnik test battery (10.6 ± 1.2 to 11.3 ± 0.8, P = 0.009), and lateromedial (6.1 ± 11.7 cm to 3.1 ± 1.6 cm, P = 0.02) and anteroposterior displacement (4.7 ± 4.2 cm to 3.4 ± 1.0 cm, P = 0.03), improvements that were not observed in the GST group. These results reproduce previous findings in the literature and indicate that sensory stimulation leads to better control of balance and dynamic activities. Multisensory exercises were shown to be more efficacious than strength exercises for improving functional mobility. PMID:20711437
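    The pre- versus post-training comparisons reported in this record (for example the timed 'up and go' times) are the kind of paired contrast sketched below; the numbers are simulated around the reported group means and the test choice is illustrative, not necessarily the authors' exact statistics.

    ```python
    # Sketch (synthetic numbers): a pre- vs post-training comparison of
    # timed 'up and go' (TUG) scores using a paired t-test.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    tug_pre = rng.normal(9.1, 1.9, 23)              # seconds, simulated multisensory group
    tug_post = tug_pre - rng.normal(1.1, 0.8, 23)   # simulated post-training improvement

    t_stat, p_value = stats.ttest_rel(tug_pre, tug_post)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    ```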

  7. Musicians have enhanced audiovisual multisensory binding: experience-dependent effects in the double-flash illusion.

    PubMed

    Bidelman, Gavin M

    2016-10-01

    Musical training is associated with behavioral and neurophysiological enhancements in auditory processing for both musical and nonmusical sounds (e.g., speech). Yet, whether the benefits of musicianship extend beyond enhancements to auditory-specific skills and impact multisensory (e.g., audiovisual) processing has yet to be fully validated. Here, we investigated multisensory integration of auditory and visual information in musicians and nonmusicians using a double-flash illusion, whereby the presentation of multiple auditory stimuli (beeps) concurrent with a single visual object (flash) induces an illusory perception of multiple flashes. We parametrically varied the onset asynchrony between auditory and visual events (leads and lags of ±300 ms) to quantify participants' "temporal window" of integration, i.e., stimuli in which auditory and visual cues were fused into a single percept. Results show that musically trained individuals were both faster and more accurate at processing concurrent audiovisual cues than their nonmusician peers; nonmusicians had a higher susceptibility for responding to audiovisual illusions and perceived double flashes over an extended range of onset asynchronies compared to trained musicians. Moreover, temporal window estimates indicated that musicians' windows (<100 ms) were ~2-3× shorter than nonmusicians' (~200 ms), suggesting more refined multisensory integration and audiovisual binding. Collectively, findings indicate a more refined binding of auditory and visual cues in musically trained individuals. We conclude that experience-dependent plasticity of intensive musical experience extends beyond simple listening skills, improving multimodal processing and the integration of multiple sensory systems in a domain-general manner.
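    The "temporal window" estimates quoted in this record (<100 ms for musicians versus ~200 ms for nonmusicians) are typically derived by fitting a function to illusion-report rates across audiovisual onset asynchronies and taking its width. The sketch below shows one common variant (Gaussian fit, full width at half maximum) on hypothetical data; it is not the paper's fitting procedure.

    ```python
    # Hedged sketch: estimating a temporal window of audiovisual integration by
    # fitting a Gaussian to double-flash illusion rates across onset asynchronies.
    # Data below are hypothetical; one common approach, not the paper's code.
    import numpy as np
    from scipy.optimize import curve_fit

    soas = np.array([-300, -200, -100, 0, 100, 200, 300], dtype=float)  # ms (audio lead/lag)
    illusion_rate = np.array([0.10, 0.25, 0.60, 0.85, 0.55, 0.20, 0.08])

    def gaussian(x, amp, mu, sigma, base):
        return base + amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

    params, _ = curve_fit(gaussian, soas, illusion_rate, p0=[0.8, 0.0, 100.0, 0.1])
    amp, mu, sigma, base = params
    fwhm = 2.355 * sigma        # full width at half maximum as one window definition
    print(f"temporal window (FWHM): {fwhm:.0f} ms")
    ```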

  8. Sleeping on the rubber-hand illusion: Memory reactivation during sleep facilitates multisensory recalibration.

    PubMed

    Honma, Motoyasu; Plass, John; Brang, David; Florczak, Susan M; Grabowecky, Marcia; Paller, Ken A

    2016-01-01

    Plasticity is essential in body perception so that physical changes in the body can be accommodated and assimilated. Multisensory integration of visual, auditory, tactile, and proprioceptive signals contributes both to conscious perception of the body's current state and to associated learning. However, much is unknown about how novel information is assimilated into body perception networks in the brain. Sleep-based consolidation can facilitate various types of learning via the reactivation of networks involved in prior encoding or through synaptic down-scaling. Sleep may likewise contribute to perceptual learning of bodily information by providing an optimal time for multisensory recalibration. Here we used methods for targeted memory reactivation (TMR) during slow-wave sleep to examine the influence of sleep-based reactivation of experimentally induced alterations in body perception. The rubber-hand illusion was induced with concomitant auditory stimulation in 24 healthy participants on 3 consecutive days. While each participant was sleeping in his or her own bed during intervening nights, electrophysiological detection of slow-wave sleep prompted covert stimulation with either the sound heard during illusion induction, a counterbalanced novel sound, or neither. TMR systematically enhanced feelings of bodily ownership after subsequent inductions of the rubber-hand illusion. TMR also enhanced spatial recalibration of perceived hand location in the direction of the rubber hand. This evidence for a sleep-based facilitation of a body-perception illusion demonstrates that the spatial recalibration of multisensory signals can be altered overnight to stabilize new learning of bodily representations. Sleep-based memory processing may thus constitute a fundamental component of body-image plasticity.

  9. The rapid distraction of attentional resources toward the source of incongruent stimulus input during multisensory conflict

    PubMed Central

    Donohue, Sarah E.; Todisco, Alexandra E.; Woldorff, Marty G.

    2013-01-01

    Neuroimaging work on multisensory conflict suggests that the relevant modality receives enhanced processing in the face of incongruency. However, the degree of stimulus processing in the irrelevant modality and the temporal cascade of the attentional modulations in either the relevant or irrelevant modalities are unknown. Here, we employed an audiovisual conflict paradigm with a sensory probe in the task-irrelevant modality (vision) to gauge the attentional allocation to that modality. Event-related potentials (ERPs) were recorded as subjects attended to and discriminated spoken auditory letters while ignoring simultaneous bilateral visual letter stimuli that were either fully congruent, fully incongruent, or partially incongruent (one side incongruent, one congruent) with the auditory stimulation. Half of the audiovisual letter stimuli were followed 500-700 ms later by a bilateral visual probe stimulus. As expected, ERPs to the audiovisual stimuli showed an incongruency ERP effect (fully incongruent versus fully congruent) of an enhanced, centrally distributed, negative-polarity wave starting ~250 ms. More critically here, the sensory ERP components to the visual probes were larger when they followed fully incongruent versus fully congruent multisensory stimuli, with these enhancements greatest on fully incongruent trials with the slowest response times. In addition, on the slowest-response partially incongruent trials, the P2 sensory component to the visual probes was larger contralateral to the preceding incongruent visual stimulus. These data suggest that, in response to conflicting multisensory stimulus input, the initial cognitive effect is a capture of attention by the incongruent irrelevant-modality input, pulling neural processing resources toward that modality, resulting in rapid enhancement, rather than rapid suppression, of that input. PMID:23249355

  10. Multisensory integration of dynamic emotional faces and voices: method for simultaneous EEG-fMRI measurements

    PubMed Central

    Schelenz, Patrick D.; Klasen, Martin; Reese, Barbara; Regenbogen, Christina; Wolf, Dhana; Kato, Yutaka; Mathiak, Klaus

    2013-01-01

    Combined EEG-fMRI analysis correlates time courses from single electrodes or independent EEG components with the hemodynamic response. Implementing information from only one electrode, however, may miss relevant information from complex electrophysiological networks. Component-based analysis, in turn, depends on a priori knowledge of the signal topography. Complex designs such as studies on multisensory integration of emotions investigate subtle differences in distributed networks based on only a few trials per condition. Thus, they require a sensitive and comprehensive approach which does not rely on a priori knowledge about the underlying neural processes. In this pilot study, the feasibility and sensitivity of source localization-driven analysis for EEG-fMRI were tested using a multisensory integration paradigm. Dynamic audiovisual stimuli consisting of emotional talking faces and pseudowords with emotional prosody were rated in a delayed response task. The trials comprised affectively congruent and incongruent displays. In addition to event-locked EEG and fMRI analyses, induced oscillatory EEG responses at estimated cortical sources and in specific temporo-spectral windows were correlated with the corresponding BOLD responses. EEG analysis showed high data quality with less than 10% trial rejection. In an early time window, alpha oscillations were suppressed in bilateral occipital cortices and fMRI analysis confirmed high data quality with reliable activation in auditory, visual and frontal areas to the presentation of multisensory stimuli. In line with previous studies, we obtained reliable correlation patterns for event-locked occipital alpha suppression and BOLD signal time course. Our results suggest a valid methodological approach to investigate complex stimuli using the present source localization-driven method for EEG-fMRI. This novel procedure may help to investigate combined EEG-fMRI data from novel complex paradigms with high spatial and temporal resolution.
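
    The core correlation step can be sketched in a few lines, leaving aside the study's source-localization pipeline: build an occipital alpha-power regressor, convolve it with a canonical hemodynamic response function, and correlate it with an ROI BOLD time course. The sampling rates, filter settings, and all signals below are simulated assumptions for illustration only.

```python
# A minimal sketch of an alpha-power vs. BOLD correlation on simulated data.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from scipy.stats import gamma, pearsonr

fs, tr, n_vols = 250.0, 2.0, 150                  # EEG rate (Hz), fMRI TR (s), volumes
eeg = np.random.randn(int(n_vols * tr * fs))      # placeholder occipital EEG trace

b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")
alpha_power = np.abs(hilbert(filtfilt(b, a, eeg))) ** 2

# Average alpha power within each TR to match the fMRI sampling grid
samples_per_tr = int(fs * tr)
regressor = alpha_power[: n_vols * samples_per_tr].reshape(n_vols, samples_per_tr).mean(axis=1)

# Canonical double-gamma HRF sampled at the TR
t = np.arange(0, 30, tr)
hrf = gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 16)
hrf /= hrf.sum()
predicted = np.convolve(regressor - regressor.mean(), hrf)[:n_vols]

bold = np.random.randn(n_vols)                    # placeholder ROI BOLD signal
r, p = pearsonr(predicted, bold)
print(f"alpha-power vs. BOLD correlation: r = {r:.2f}, p = {p:.3f}")
```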

  11. Sleeping on the rubber-hand illusion: Memory reactivation during sleep facilitates multisensory recalibration

    PubMed Central

    Honma, Motoyasu; Plass, John; Brang, David; Florczak, Susan M.; Grabowecky, Marcia; Paller, Ken A.

    2016-01-01

    Plasticity is essential in body perception so that physical changes in the body can be accommodated and assimilated. Multisensory integration of visual, auditory, tactile, and proprioceptive signals contributes both to conscious perception of the body’s current state and to associated learning. However, much is unknown about how novel information is assimilated into body perception networks in the brain. Sleep-based consolidation can facilitate various types of learning via the reactivation of networks involved in prior encoding or through synaptic down-scaling. Sleep may likewise contribute to perceptual learning of bodily information by providing an optimal time for multisensory recalibration. Here we used methods for targeted memory reactivation (TMR) during slow-wave sleep to examine the influence of sleep-based reactivation of experimentally induced alterations in body perception. The rubber-hand illusion was induced with concomitant auditory stimulation in 24 healthy participants on 3 consecutive days. While each participant was sleeping in his or her own bed during intervening nights, electrophysiological detection of slow-wave sleep prompted covert stimulation with either the sound heard during illusion induction, a counterbalanced novel sound, or neither. TMR systematically enhanced feelings of bodily ownership after subsequent inductions of the rubber-hand illusion. TMR also enhanced spatial recalibration of perceived hand location in the direction of the rubber hand. This evidence for a sleep-based facilitation of a body-perception illusion demonstrates that the spatial recalibration of multisensory signals can be altered overnight to stabilize new learning of bodily representations. Sleep-based memory processing may thus constitute a fundamental component of body-image plasticity. PMID:28184322

  12. Multisensory Bayesian Inference Depends on Synapse Maturation during Training: Theoretical Analysis and Neural Modeling Implementation.

    PubMed

    Ursino, Mauro; Cuppini, Cristiano; Magosso, Elisa

    2017-03-01

    Recent theoretical and experimental studies suggest that in multisensory conditions, the brain performs a near-optimal Bayesian estimate of external events, giving more weight to the more reliable stimuli. However, the neural mechanisms responsible for this behavior, and its progressive maturation in a multisensory environment, are still insufficiently understood. The aim of this letter is to analyze this problem with a neural network model of audiovisual integration, based on probabilistic population coding, the idea that a population of neurons can encode probability functions to perform Bayesian inference. The model consists of two chains of unisensory neurons (auditory and visual) topologically organized. They receive the corresponding input through a plastic receptive field and reciprocally exchange plastic cross-modal synapses, which encode the spatial co-occurrence of visual-auditory inputs. A third chain of multisensory neurons performs a simple sum of auditory and visual excitations. The work includes a theoretical part and a computer simulation study. We show how a simple rule for synapse learning (consisting of Hebbian reinforcement and a decay term) can be used during training to shrink the receptive fields and encode the unisensory likelihood functions. Hence, after training, each unisensory area realizes a maximum likelihood estimate of stimulus position (auditory or visual). In cross-modal conditions, the same learning rule can encode information on prior probability into the cross-modal synapses. Computer simulations confirm the theoretical results and show that the proposed network can realize a maximum likelihood estimate of auditory (or visual) positions in unimodal conditions and a Bayesian estimate, with moderate deviations from optimality, in cross-modal conditions. Furthermore, the model explains the ventriloquism illusion and, looking at the activity in the multimodal neurons, explains the automatic reweighting of auditory and visual inputs.
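
    The reliability-weighted (Bayesian) estimate that such a network approximates can be written down directly: each unisensory likelihood is Gaussian, and the cross-modal estimate weights each cue by its inverse variance. The sketch below shows this benchmark computation for a single audiovisual trial; the positions and noise levels are illustrative assumptions, not values taken from the model.

```python
# A minimal sketch of inverse-variance (reliability) weighting of two cues.
import numpy as np

x_aud, sigma_aud = 12.0, 8.0   # auditory position estimate (deg) and its noise
x_vis, sigma_vis = 0.0, 2.0    # visual estimate is more reliable (smaller sigma)

w_aud = 1 / sigma_aud**2
w_vis = 1 / sigma_vis**2
x_comb = (w_aud * x_aud + w_vis * x_vis) / (w_aud + w_vis)   # estimate pulled toward vision
sigma_comb = np.sqrt(1 / (w_aud + w_vis))                    # reduced uncertainty

print(f"combined estimate: {x_comb:.1f} deg (visual {x_vis}, auditory {x_aud})")
print(f"combined sd: {sigma_comb:.2f} deg, smaller than either single cue")
```

    The pull of the combined estimate toward the more reliable visual cue is the same computation that accounts for ventriloquism-like capture.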

  13. The sound-induced flash illusion reveals dissociable age-related effects in multisensory integration

    PubMed Central

    McGovern, David P.; Roudaia, Eugenie; Stapleton, John; McGinnity, T. Martin; Newell, Fiona N.

    2014-01-01

    While aging can lead to significant declines in perceptual and cognitive function, the effects of age on multisensory integration, the process in which the brain combines information across the senses, are less clear. Recent reports suggest that older adults are susceptible to the sound-induced flash illusion (Shams et al., 2000) across a much wider range of temporal asynchronies than younger adults (Setti et al., 2011). To assess whether this cost for multisensory integration is a general phenomenon of combining asynchronous audiovisual input, we compared the time courses of two variants of the sound-induced flash illusion in young and older adults: the fission illusion, where one flash accompanied by two beeps appears as two flashes, and the fusion illusion, where two flashes accompanied by one beep appear as one flash. Twenty-five younger (18–30 years) and older (65+ years) adults were required to report whether they perceived one or two flashes, whilst ignoring irrelevant auditory beeps, in bimodal trials where auditory and visual stimuli were separated by one of six stimulus onset asynchronies (SOAs). There was a marked difference in the pattern of results for the two variants of the illusion. In conditions known to produce the fission illusion, older adults were significantly more susceptible to the illusion at longer SOAs compared to younger participants. In contrast, the performance of the younger and older groups was almost identical in conditions known to produce the fusion illusion. This surprising difference between sound-induced fission and fusion in older adults suggests dissociable age-related effects in multisensory integration, consistent with the idea that these illusions are mediated by distinct neural mechanisms. PMID:25309430

  14. Stepping to phase-perturbed metronome cues: multisensory advantage in movement synchrony but not correction

    PubMed Central

    Wright, Rachel L.; Spurgeon, Laura C.; Elliott, Mark T.

    2014-01-01

    Humans can synchronize movements with auditory beats or rhythms without apparent effort. This ability to entrain to the beat is considered automatic, such that any perturbations are corrected for, even if the perturbation was not consciously noted. Temporal correction of upper limb (e.g., finger tapping) and lower limb (e.g., stepping) movements to a phase perturbed auditory beat usually results in individuals being back in phase after just a few beats. When a metronome is presented in more than one sensory modality, a multisensory advantage is observed, with reduced temporal variability in finger tapping movements compared to unimodal conditions. Here, we investigate synchronization of lower limb movements (stepping in place) to auditory, visual and combined auditory-visual (AV) metronome cues. In addition, we compare movement corrections to phase advance and phase delay perturbations in the metronome for the three sensory modality conditions. We hypothesized that, as with upper limb movements, there would be a multisensory advantage, with stepping variability being lowest in the bimodal condition. As such, we further expected correction to the phase perturbation to be quickest in the bimodal condition. Our results revealed lower variability in the asynchronies between foot strikes and the metronome beats in the bimodal condition, compared to unimodal conditions. However, while participants corrected substantially quicker to perturbations in auditory compared to visual metronomes, there was no multisensory advantage in the phase correction task—correction under the bimodal condition was almost identical to the auditory-only (AO) condition. On the whole, we noted that corrections in the stepping task were smaller than those previously reported for finger tapping studies. We conclude that temporal corrections are not only affected by the reliability of the sensory information, but also the complexity of the movement itself. PMID:25309397
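
    A standard way to think about the correction results above is a linear phase-correction model, in which each asynchrony is reduced by a fixed proportion (the correction gain) on the next beat. The sketch below simulates the return to synchrony after a step perturbation for a fast and a slow gain; it is a generic textbook model with illustrative parameters, not the authors' analysis.

```python
# A minimal sketch of linear phase correction after a step phase perturbation.
import numpy as np

def simulate_asynchronies(alpha, perturbation_ms=60.0, n_steps=12, noise_sd=5.0, seed=0):
    rng = np.random.default_rng(seed)
    asyn = np.zeros(n_steps)
    asyn[0] = perturbation_ms                      # step change introduced by the metronome
    for n in range(1, n_steps):
        asyn[n] = (1 - alpha) * asyn[n - 1] + rng.normal(0, noise_sd)
    return asyn

for label, alpha in [("auditory-like (fast correction)", 0.7),
                     ("visual-like (slow correction)", 0.3)]:
    trace = simulate_asynchronies(alpha)
    print(label, np.round(trace[:6], 1))
```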

  15. Bioinspired Multifunctional Membrane for Aquatic Micropollutants Removal.

    PubMed

    Cao, Xiaotong; Luo, Jianquan; Woodley, John M; Wan, Yinhua

    2016-11-09

    Micropollutants present in water have many detrimental effects on the ecosystem. Membrane technology plays an important role in the removal of micropollutants, but there remain significant challenges such as concentration polarization, membrane fouling, and variable permeate quality. The work reported here uses a multifunctional membrane with rejection, adsorption, and catalysis functions to solve these problems. On the basis of mussel-inspired chemistry and biological membrane properties, a multifunctional membrane was prepared by applying "reverse filtration" of a laccase solution and subsequent "dopamine coating" on a nanofiltration (NF) membrane support, which was tested on bisphenol A (BPA) removal. Three NF membranes were chosen for the preparation of the multifunctional membranes on the basis of the membrane properties and enzyme immobilization efficiency. Compared with the pristine membrane, the multifunctional membrane exhibited significant improvement of BPA removal (78.21 ± 1.95%, 84.27 ± 7.30%, and 97.04 ± 0.33% for NT103, NF270, and NF90, respectively), all of which are clearly superior to the conventional Fenton treatment (55.0%) under similar conditions and comparable to soluble laccase coupled with NF270 membrane filtration (89.0%). The improvement would appear to be due to a combination of separation (reducing the enzymatic burden), adsorption (enriching the substrate concentration as well as prolonging the residence time), and lastly, catalysis (oxidizing the pollutants and breaking the "adsorption saturation limits"). Furthermore, the synergistic effect of the polydopamine (PDA) layer on the enzymatic oxidation of BPA was confirmed, which was due to its enhanced adsorption and electron transfer performance. The multifunctional membrane could be reused for at least seven cycles with an acceptable activity loss, demonstrating good potential for removal of micropollutants.

  16. Surface albedo of cometary nucleus

    NASA Astrophysics Data System (ADS)

    Mukai, T.; Mukai, S.

    A variation of the albedo on the illuminated disk of a comet nucleus is estimated, taking into account the multiple reflection of incident light due to small scale roughness. The dependences of the average albedo over the illuminated disk on the degree of roughness and on the complex refractive index of the surface materials are examined. The variation estimates are compared with measurements of the nucleus albedo of Comet Halley (Reitsema et al., 1987).

  17. Neural functional organization of hallucinations in schizophrenia: multisensory dissolution of pathological emergence in consciousness.

    PubMed

    Jardri, Renaud; Pins, Delphine; Bubrovszky, Maxime; Lucas, Bernard; Lethuc, Vianney; Delmaire, Christine; Vantyghem, Vincent; Despretz, Pascal; Thomas, Pierre

    2009-06-01

    Although complex hallucinations are extremely vivid, painful symptoms in schizophrenia, little is known about the underlying mechanisms of multisensory integration in such a phenomenon. We investigated the neural basis of these altered states of consciousness in a patient with schizophrenia, by combining state-of-the-art neuroscientific exploratory methods such as functional MRI, diffusion tensor imaging, cortical thickness analysis, electrical source reconstruction and transcranial magnetic stimulation. The results shed light on the functional architecture of the hallucinatory processes, in which unimodal information from different modalities is strongly functionally connected to higher-order integrative areas.

  18. Behavioral and Neural Foundations of Multisensory Face-Voice Perception in Infancy.

    PubMed

    Hyde, Daniel C; Flom, Ross; Porter, Chris L

    In this article, we describe behavioral and neurophysiological evidence for infants' multimodal face-voice perception. We argue that the behavioral development of face-voice perception, like multimodal perception more broadly, is consistent with the intersensory redundancy hypothesis (IRH). Furthermore, we highlight that several recently observed features of the neural responses in infants converge with the behavioral predictions of the intersensory redundancy hypothesis. Finally, we discuss the potential benefits of combining brain and behavioral measures to study multisensory processing, as well as some applications of this work for atypical development.

  19. Sensitivity of cross sections for elastic nucleus-nucleus scattering to halo nucleus density distributions

    SciTech Connect

    Alkhazov, G. D.; Sarantsev, V. V.

    2012-12-15

    To clarify the sensitivity of nucleus-nucleus scattering to the nuclear matter distributions in exotic halo nuclei, we have calculated differential cross sections for elastic scattering of the ⁶He and ¹¹Li nuclei on several nuclear targets at an energy of 0.8 GeV/nucleon, using different assumed nuclear density distributions in ⁶He and ¹¹Li.

  20. A neural model to study sensory abnormalities and multisensory effects in autism.

    PubMed

    Noriega, Gerardo

    2015-03-01

    Computational modeling plays an increasingly prominent role in complementing critical research in the genetics, neuroscience, and psychology of autism. This paper presents a model that supports the notion that weak central coherence, a processing bias for features and local information, may be responsible for perception abnormalities by failing to "control" sensory issues in autism. The model has a biologically plausible architecture based on a self-organizing map. It incorporates temporal information in input stimuli, with emphasis on real auditory signals, and provides a mechanism to model multisensory effects. Through comprehensive simulations, the paper studies the effect of a control mechanism (akin to central coherence) in compensating for the effects of temporal information in the presentation of stimuli, sensory abnormalities, and crosstalk between domains. The mechanism is successful in balancing out timing effects, basic hypersensitivities and, to a lesser degree, multisensory effects. An analysis of the effect of the control mechanism's onset time on performance suggests that most of the potential benefits are still attainable even when started rather late in the learning process. This high level of adaptability shown by the neural network highlights the importance of appropriate teaching and intervention throughout the lifetime of persons with autism and other neurological disorders.
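
    The self-organizing-map architecture underlying the model rests on a simple learning scheme that is easy to sketch. The code below trains a minimal 1-D self-organizing map on random feature vectors, with a decaying learning rate and a shrinking neighbourhood; it illustrates the SOM mechanism only and omits the paper's temporal coding, control mechanism, and multisensory crosstalk. All parameters are illustrative.

```python
# A minimal 1-D self-organizing map (SOM) trained on placeholder feature vectors.
import numpy as np

rng = np.random.default_rng(0)
n_units, n_features, n_iters = 20, 3, 2000
weights = rng.random((n_units, n_features))
data = rng.random((500, n_features))              # placeholder sensory feature vectors

for it in range(n_iters):
    x = data[rng.integers(len(data))]
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))       # best-matching unit
    lr = 0.5 * (1 - it / n_iters)                               # decaying learning rate
    radius = max(1.0, n_units / 2 * (1 - it / n_iters))         # shrinking neighbourhood
    dist = np.abs(np.arange(n_units) - bmu)
    influence = np.exp(-dist**2 / (2 * radius**2))
    weights += lr * influence[:, None] * (x - weights)          # pull weights toward input

print("trained SOM weight range:", weights.min().round(2), weights.max().round(2))
```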

  1. Does media multitasking always hurt? A positive correlation between multitasking and multisensory integration.

    PubMed

    Lui, Kelvin F H; Wong, Alan C-N

    2012-08-01

    Heavy media multitaskers have been found to perform poorly in certain cognitive tasks involving task switching, selective attention, and working memory. An account for this is that with a breadth-biased style of cognitive control, multitaskers tend to pay attention to various information available in the environment, without sufficient focus on the information most relevant to the task at hand. This cognitive style, however, may not cause a general deficit in all kinds of tasks. We tested the hypothesis that heavy media multitaskers would perform better in a multisensory integration task than would others, due to their extensive experience in integrating information from different modalities. Sixty-three participants filled out a questionnaire about their media usage and completed a visual search task with and without synchronous tones (pip-and-pop paradigm). It was found that a higher degree of media multitasking was correlated with better multisensory integration. The fact that heavy media multitaskers are not deficient in all kinds of cognitive tasks suggests that media multitasking does not always hurt.

  2. Multisensory brand search: How the meaning of sounds guides consumers' visual attention.

    PubMed

    Knoeferle, Klemens M; Knoeferle, Pia; Velasco, Carlos; Spence, Charles

    2016-06-01

    Building on models of crossmodal attention, the present research proposes that brand search is inherently multisensory, in that the consumers' visual search for a specific brand can be facilitated by semantically related stimuli that are presented in another sensory modality. A series of 5 experiments demonstrates that the presentation of spatially nonpredictive auditory stimuli associated with products (e.g., usage sounds or product-related jingles) can crossmodally facilitate consumers' visual search for, and selection of, products. Eye-tracking data (Experiment 2) revealed that the crossmodal effect of auditory cues on visual search manifested itself not only in RTs, but also in the earliest stages of visual attentional processing, thus suggesting that the semantic information embedded within sounds can modulate the perceptual saliency of the target products' visual representations. Crossmodal facilitation was even observed for newly learnt associations between unfamiliar brands and sonic logos, implicating multisensory short-term learning in establishing audiovisual semantic associations. The facilitation effect was stronger when searching complex rather than simple visual displays, thus suggesting a modulatory role of perceptual load.

  3. Wireless Wearable Multisensory Suite and Real-Time Prediction of Obstructive Sleep Apnea Episodes.

    PubMed

    Le, Trung Q; Cheng, Changqing; Sangasoongsong, Akkarapol; Wongdhamma, Woranat; Bukkapatnam, Satish T S

    2013-01-01

    Obstructive sleep apnea (OSA) is a common sleep disorder found in 24% of adult men and 9% of adult women. Although continuous positive airway pressure (CPAP) has emerged as a standard therapy for OSA, a majority of patients are not tolerant to this treatment, largely because of the uncomfortable nasal air delivery during their sleep. Recent advances in wireless communication and advanced ("big data") predictive analytics technologies offer radically new point-of-care treatment approaches for OSA episodes with unprecedented comfort and affordability. We introduce a Dirichlet process-based mixture Gaussian process (DPMG) model to predict the onset of sleep apnea episodes based on analyzing complex cardiorespiratory signals gathered from a custom-designed wireless wearable multisensory suite. Extensive testing with signals from the multisensory suite as well as PhysioNet's OSA database suggests that the accuracy of offline OSA classification is 88%, and accuracy for predicting an OSA episode 1-min ahead is 83% and 3-min ahead is 77%. Such accurate prediction of an impending OSA episode can be used to adaptively adjust CPAP airflow (toward improving the patient's adherence) or the torso posture (e.g., minor chin adjustments to maintain steady levels of the airflow).

  4. Plasticity in unimodal and multimodal brain areas reflects multisensory changes in self-face identification.

    PubMed

    Apps, Matthew A J; Tajadura-Jiménez, Ana; Sereno, Marty; Blanke, Olaf; Tsakiris, Manos

    2015-01-01

    Nothing provides as strong a sense of self as seeing one's face. Nevertheless, it remains unknown how the brain processes the sense of self during the multisensory experience of looking at one's face in a mirror. Synchronized visuo-tactile stimulation on one's own and another's face, an experience that is akin to looking in the mirror but seeing another's face, causes the illusory experience of ownership over the other person's face and changes in self-recognition. Here, we investigate the neural correlates of this enfacement illusion using fMRI. We examine activity in the human brain as participants experience tactile stimulation delivered to their face, while observing either temporally synchronous or asynchronous tactile stimulation delivered to another's face on either a specularly congruent or incongruent location. Activity in the multisensory right temporo-parietal junction, intraparietal sulcus, and the unimodal inferior occipital gyrus showed an interaction between the synchronicity and the congruency of the stimulation and varied with the self-reported strength of the illusory experience, which was recorded after each stimulation block. Our results highlight the important interplay between unimodal and multimodal information processing for self-face recognition, and elucidate the neurobiological basis for the plasticity required for identifying with our continuously changing visual appearance.

  5. Multisensory System for the Detection and Localization of Peripheral Subcutaneous Veins

    PubMed Central

    Fernández, Roemi; Armada, Manuel

    2017-01-01

    This paper proposes a multisensory system for the detection and localization of peripheral subcutaneous veins, as a first step for achieving automatic robotic insertion of catheters in the near future. The multisensory system is based on the combination of a SWIR (Short-Wave Infrared) camera, a TOF (Time-Of-Flight) camera and a NIR (Near Infrared) lighting source. The associated algorithm consists of two main parts: one devoted to the features extraction from the SWIR image, and another envisaged for the registration of the range data provided by the TOF camera, with the SWIR image and the results of the peripheral veins detection. In this way, the detected subcutaneous veins are mapped onto the 3D reconstructed surface, providing a full representation of the region of interest for the automatic catheter insertion. Several experimental tests were carried out in order to evaluate the capabilities of the presented approach. Preliminary results demonstrate the feasibility of the proposed design and highlight the potential benefits of the solution. PMID:28422075
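
    The final step described above, mapping detected veins onto the 3D reconstructed surface, can be illustrated with a simple back-projection, assuming the SWIR vein mask has already been registered to the TOF camera frame (the paper's registration procedure is more involved). The camera intrinsics, depth map, and vein mask below are hypothetical placeholders.

```python
# A minimal sketch: back-project TOF depth to 3D with pinhole intrinsics and
# keep the surface points flagged as veins by the (already registered) mask.
import numpy as np

h, w = 120, 160
fx = fy = 180.0            # hypothetical focal lengths (pixels)
cx, cy = w / 2, h / 2      # hypothetical principal point

depth = np.full((h, w), 0.35)                     # placeholder depth map (metres)
vein_mask = np.zeros((h, w), dtype=bool)          # placeholder registered vein detections
vein_mask[55:65, 40:120] = True

u, v = np.meshgrid(np.arange(w), np.arange(h))
x = (u - cx) * depth / fx
y = (v - cy) * depth / fy
points_3d = np.stack([x, y, depth], axis=-1)      # 3D reconstruction of the imaged surface

vein_points = points_3d[vein_mask]                # 3D coordinates of detected vein pixels
print("vein surface points:", vein_points.shape)
```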

  6. Spatiotemporal interplay between multisensory excitation and recruited inhibition in the lamprey optic tectum

    PubMed Central

    Grillner, Sten

    2016-01-01

    Animals integrate the different senses to facilitate event-detection for navigation in their environment. In vertebrates, the optic tectum (superior colliculus) commands gaze shifts by synaptic integration of different sensory modalities. Recent work suggests that the tectum can elaborate gaze reorientation commands on its own, rather than merely acting as a relay from upstream/forebrain circuits to downstream premotor centers. We show that tectal circuits can perform multisensory computations independently and, hence, configure final motor commands. Single tectal neurons receive converging visual and electrosensory inputs, as investigated in the lamprey, a phylogenetically conserved vertebrate. When these two sensory inputs overlap in space and time, response enhancement of output neurons occurs locally in the tectum, whereas surrounding areas and temporally misaligned inputs are inhibited. Retinal and electrosensory afferents elicit local monosynaptic excitation, quickly followed by inhibition via recruitment of GABAergic interneurons. Multisensory inputs can thus regulate event-detection within the tectum through local inhibition without forebrain control. DOI: http://dx.doi.org/10.7554/eLife.16472.001 PMID:27635636

  7. Management of young children with Rett disorder in the controlled multi-sensory (Snoezelen) environment.

    PubMed

    Lotan, Meir; Shapiro, Michele

    2005-11-01

    Rett syndrome is a neurological disorder resulting from an X-linked dominant mutation. It is characterized by a variety of physical and perceptual disabilities, resulting in a need for constant therapy programs to be administered on a regular basis throughout the client's life. As the child with Rett disorder (RD) is entering the more obvious, hectic phase of this syndrome (stage II), signs of extreme agitation and discomfort are usually exhibited. This behavior is suspected to reflect damaging chaotic processes occurring in the brain at that time. Experts advise that calming techniques might be helpful for children with Rett during this period. This may be our earliest opportunity to change the course of the disorder. Now that our knowledge of RD has increased and children are being diagnosed at a substantially earlier age, new intervention methods should be introduced for parents and therapists. This may ensure more suitable treatment. The multi-sensory environment may provide a soothing haven, which appeals to the child with RD. This article provides a short review of the typical RD phenotype and suggests suitable activities that could take place in the multi-sensory environment with this population during the early stages of Rett disorder.

  8. Behavioral, perceptual, and neural alterations in sensory and multisensory function in autism spectrum disorder.

    PubMed

    Baum, Sarah H; Stevenson, Ryan A; Wallace, Mark T

    2015-11-01

    Although sensory processing challenges have been noted since the first clinical descriptions of autism, it has taken until the release of the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) in 2013 for sensory problems to be included as part of the core symptoms of autism spectrum disorder (ASD) in the diagnostic profile. Because sensory information forms the building blocks for higher-order social and cognitive functions, we argue that sensory processing is not only an additional piece of the puzzle, but rather a critical cornerstone for characterizing and understanding ASD. In this review we discuss what is currently known about sensory processing in ASD, how sensory function fits within contemporary models of ASD, and what is understood about the differences in the underlying neural processing of sensory and social communication observed between individuals with and without ASD. In addition to highlighting the sensory features associated with ASD, we also emphasize the importance of multisensory processing in building perceptual and cognitive representations, and how deficits in multisensory integration may also be a core characteristic of ASD.

  9. Multisensory integration and behavioral plasticity in sharks from different ecological niches.

    PubMed

    Gardiner, Jayne M; Atema, Jelle; Hueter, Robert E; Motta, Philip J

    2014-01-01

    The underwater sensory world and the sensory systems of aquatic animals have become better understood in recent decades, but typically have been studied one sense at a time. A comprehensive analysis of multisensory interactions during complex behavioral tasks has remained a subject of discussion without experimental evidence. We set out to generate a general model of multisensory information extraction by aquatic animals. For our model we chose to analyze the hierarchical, integrative, and sometimes alternate use of various sensory systems during the feeding sequence in three species of sharks that differ in sensory anatomy and behavioral ecology. By blocking senses in different combinations, we show that when some of their normal sensory cues were unavailable, sharks were often still capable of successfully detecting, tracking and capturing prey by switching to alternate sensory modalities. While there were significant species differences, odor was generally the first signal detected, leading to upstream swimming and wake tracking. Closer to the prey, as more sensory cues became available, the preferred sensory modalities varied among species, with vision, hydrodynamic imaging, electroreception, and touch being important for orienting to, striking at, and capturing the prey. Experimental deprivation of senses showed how sharks exploit the many signals that comprise their sensory world, each sense coming into play as they provide more accurate information during the behavioral sequence of hunting. The results may be applicable to aquatic hunting in general and, with appropriate modification, to other types of animal behavior.

  10. The multisensory basis of the self: From body to identity to others

    PubMed Central

    Tsakiris, Manos

    2017-01-01

    ABSTRACT By grounding the self in the body, experimental psychology has taken the body as the starting point for a science of the self. One fundamental dimension of the bodily self is the sense of body ownership that refers to the special perceptual status of one’s own body, the feeling that “my body” belongs to me. The primary aim of this review article is to highlight recent advances in the study of body ownership and our understanding of the underlying neurocognitive processes in three ways. I first consider how the sense of body ownership has been investigated and elucidated in the context of multisensory integration. Beyond exteroception, recent studies have considered how this exteroceptively driven sense of body ownership can be linked to the other side of embodiment, that of the unobservable, yet felt, interoceptive body, suggesting that these two sides of embodiment interact to provide a unifying bodily self. Lastly, the multisensorial understanding of the self has been shown to have implications for our understanding of social relationships, especially in the context of self–other boundaries. Taken together, these three research strands motivate a unified model of the self inspired by current predictive coding models. PMID:27100132

  11. Multisensory Integration and Behavioral Plasticity in Sharks from Different Ecological Niches

    PubMed Central

    Gardiner, Jayne M.; Atema, Jelle; Hueter, Robert E.; Motta, Philip J.

    2014-01-01

    The underwater sensory world and the sensory systems of aquatic animals have become better understood in recent decades, but typically have been studied one sense at a time. A comprehensive analysis of multisensory interactions during complex behavioral tasks has remained a subject of discussion without experimental evidence. We set out to generate a general model of multisensory information extraction by aquatic animals. For our model we chose to analyze the hierarchical, integrative, and sometimes alternate use of various sensory systems during the feeding sequence in three species of sharks that differ in sensory anatomy and behavioral ecology. By blocking senses in different combinations, we show that when some of their normal sensory cues were unavailable, sharks were often still capable of successfully detecting, tracking and capturing prey by switching to alternate sensory modalities. While there were significant species differences, odor was generally the first signal detected, leading to upstream swimming and wake tracking. Closer to the prey, as more sensory cues became available, the preferred sensory modalities varied among species, with vision, hydrodynamic imaging, electroreception, and touch being important for orienting to, striking at, and capturing the prey. Experimental deprivation of senses showed how sharks exploit the many signals that comprise their sensory world, each sense coming into play as they provide more accurate information during the behavioral sequence of hunting. The results may be applicable to aquatic hunting in general and, with appropriate modification, to other types of animal behavior. PMID:24695492

  12. Multisensory distortions of the hand have differential effects on tactile perception.

    PubMed

    Treshi-marie Perera, A; Newport, Roger; McKenzie, Kirsten J

    2015-11-01

    Research has suggested that altering the perceived shape and size of the body image significantly affects perception of somatic events. The current study investigated how multisensory illusions applied to the body altered tactile perception using the somatic signal detection task. Thirty-one healthy volunteers were asked to report the presence or absence of near-threshold tactile stimuli delivered to the index finger under three multisensory illusion conditions: stretched finger, shrunken finger and detached finger, as well as a veridical baseline condition. Both stretching and shrinking the stimulated finger enhanced correct touch detections; however, the mechanisms underlying this increase were found to be different. In contrast, the detached appearance reduced false touch reports, possibly due to reduced tactile noise, as a result of attention being directed to the tip of the finger only. These findings suggest that distorted representations of the body could have different modulatory effects on attention to touch and provide a link between perceived body representation and somatosensory decision-making.
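
    The somatic signal detection task lends itself to a standard signal detection analysis that separates sensitivity from response bias. The sketch below computes d-prime and criterion from hit and false-alarm counts for three hypothetical conditions; the counts are illustrative and the correction scheme is one common choice, not necessarily the authors'.

```python
# A minimal sketch of a signal detection analysis (d-prime and criterion).
from scipy.stats import norm

def sdt_indices(hits, misses, false_alarms, correct_rejections):
    # Log-linear correction keeps rates away from 0 and 1
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
    criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))
    return d_prime, criterion

# Hypothetical counts: (hits, misses, false alarms, correct rejections)
for condition, counts in {"baseline": (30, 20, 10, 40),
                          "stretched": (36, 14, 10, 40),
                          "detached": (30, 20, 4, 46)}.items():
    d, c = sdt_indices(*counts)
    print(f"{condition:10s} d' = {d:.2f}  criterion = {c:.2f}")
```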

  13. Multisensory Response Modulation in the Superficial Layers of the Superior Colliculus

    PubMed Central

    Maier, Alexander; Nidiffer, Aaron; Wallace, Mark T.

    2014-01-01

    The mammalian superior colliculus (SC) is made up of seven distinct layers. Based on overall differences in neuronal morphology, afferent and efferent projection patterns, physiological properties, and presumptive behavioral role, the upper three layers have been classically grouped together as the superficial layers and the remaining four layers collectively make up the deep layers. Although the superficial layers receive their primary inputs from the retina and primary visual cortex, the deep layers receive inputs from extrastriate visual cortical areas and from auditory, somatosensory, and motor-related structures. In contrast, there is no evidence of monosynaptic nonvisual inputs to the superficial layers. However, more recent studies have revealed anatomical connections between the superficial and deep layers, thus providing the substrate for possible communication between these two functional divisions of the SC. In this study, we provide physiological evidence for auditory influences on visual responses in the superficial layers of the SC. Using extracellular recordings of local field potentials (LFPs) and multiunit activity, we demonstrate multisensory effects in the superficial layers of the cat SC such that subthreshold auditory activity (as seen in the LFP) modulates visual responses (reflected in spiking activity) when the two stimuli are presented together. These results have important implications for our understanding of the functional organization of the SC and for the neural basis of multisensory integration in general. PMID:24647954

  14. Wireless Wearable Multisensory Suite and Real-Time Prediction of Obstructive Sleep Apnea Episodes

    PubMed Central

    Cheng, Changqing; Sangasoongsong, Akkarapol; Wongdhamma, Woranat; Bukkapatnam, Satish T. S.

    2013-01-01

    Obstructive sleep apnea (OSA) is a common sleep disorder found in 24% of adult men and 9% of adult women. Although continuous positive airway pressure (CPAP) has emerged as a standard therapy for OSA, a majority of patients are not tolerant to this treatment, largely because of the uncomfortable nasal air delivery during their sleep. Recent advances in wireless communication and advanced (“big data”) predictive analytics technologies offer radically new point-of-care treatment approaches for OSA episodes with unprecedented comfort and affordability. We introduce a Dirichlet process-based mixture Gaussian process (DPMG) model to predict the onset of sleep apnea episodes based on analyzing complex cardiorespiratory signals gathered from a custom-designed wireless wearable multisensory suite. Extensive testing with signals from the multisensory suite as well as PhysioNet's OSA database suggests that the accuracy of offline OSA classification is 88%, and accuracy for predicting an OSA episode 1-min ahead is 83% and 3-min ahead is 77%. Such accurate prediction of an impending OSA episode can be used to adaptively adjust CPAP airflow (toward improving the patient's adherence) or the torso posture (e.g., minor chin adjustments to maintain steady levels of the airflow). PMID:27170854

  15. Semantic confusion regarding the development of multisensory integration: a practical solution

    PubMed Central

    Stein, Barry E.; Burr, David; Constantinidis, Christos; Laurienti, Paul J.; Meredith, M. Alex; Perrault, Thomas J.; Ramachandran, Ramnarayan; Röder, Brigitte; Rowland, Benjamin A.; Sathian, K.; Schroeder, Charles E.; Shams, Ladan; Stanford, Terrence R.; Wallace, Mark T.; Yu, Liping; Lewkowicz, David J.

    2011-01-01

    There is now a good deal of data from neurophysiological studies in animals and behavioral studies in human infants regarding the development of multisensory processing capabilities. Although the conclusions drawn from these different datasets sometimes appear to conflict, many of the differences are due to the use of different terms to mean the same thing and, more problematic, the use of similar terms to mean different things. Semantic issues are pervasive in the field and complicate communication among groups using different methods to study similar issues. Achieving clarity of communication among different investigative groups is essential for each to make full use of the findings of others, and an important step in this direction is to identify areas of semantic confusion. In this way investigators can be encouraged to use terms whose meaning and underlying assumptions are unambiguous because they are commonly accepted. Although this issue is of obvious importance to the large and very rapidly growing number of researchers working on multisensory processes, it is perhaps even more important to the non-cognoscenti. Those who wish to benefit from the scholarship in this field but are unfamiliar with the issues identified here are most likely to be confused by semantic inconsistencies. The current discussion attempts to document some of the more problematic of these, begin a discussion about the nature of the confusion and suggest some possible solutions. PMID:20584174

  16. The multisensory basis of the self: From body to identity to others.

    PubMed

    Tsakiris, Manos

    2017-04-01

    By grounding the self in the body, experimental psychology has taken the body as the starting point for a science of the self. One fundamental dimension of the bodily self is the sense of body ownership that refers to the special perceptual status of one's own body, the feeling that "my body" belongs to me. The primary aim of this review article is to highlight recent advances in the study of body ownership and our understanding of the underlying neurocognitive processes in three ways. I first consider how the sense of body ownership has been investigated and elucidated in the context of multisensory integration. Beyond exteroception, recent studies have considered how this exteroceptively driven sense of body ownership can be linked to the other side of embodiment, that of the unobservable, yet felt, interoceptive body, suggesting that these two sides of embodiment interact to provide a unifying bodily self. Lastly, the multisensorial understanding of the self has been shown to have implications for our understanding of social relationships, especially in the context of self-other boundaries. Taken together, these three research strands motivate a unified model of the self inspired by current predictive coding models.

  17. A multi-sensorial hybrid control for robotic manipulation in human-robot workspaces.

    PubMed

    Pomares, Jorge; Perea, Ivan; García, Gabriel J; Jara, Carlos A; Corrales, Juan A; Torres, Fernando

    2011-01-01

    Autonomous manipulation in semi-structured environments where human operators can interact is an increasingly common task in robotic applications. This paper describes an intelligent multi-sensorial approach that solves this issue by providing a multi-robotic platform with a high degree of autonomy and the capability to perform complex tasks. The proposed sensorial system is composed of a hybrid visual servo control to efficiently guide the robot towards the object to be manipulated, an inertial motion capture system and an indoor localization system to avoid possible collisions between human operators and robots working in the same workspace, and a tactile sensor algorithm to correctly manipulate the object. The proposed controller employs the whole multi-sensorial system and combines the measurements of each one of the used sensors during two different phases considered in the robot task: a first phase where the robot approaches the object to be grasped, and a second phase of manipulation of the object. In both phases, the unexpected presence of humans is taken into account. This paper also presents the successful results obtained in several experimental setups which verify the validity of the proposed approach.

  18. Auditory, tactile, and multisensory cues facilitate search for dynamic visual stimuli.

    PubMed

    Ngo, Mary Kim; Spence, Charles

    2010-08-01

    Presenting an auditory or tactile cue in temporal synchrony with a change in the color of a visual target can facilitate participants' visual search performance. In the present study, we compared the magnitude of unimodal auditory, vibrotactile, and bimodal (i.e., multisensory) cuing benefits when the nonvisual cues were presented in temporal synchrony with the changing of the target's color (Experiments 1 and 2). The target (a horizontal or vertical line segment) was presented among a number of distractors (tilted line segments) that also changed color at various times. In Experiments 3 and 4, the cues were also made spatially informative with regard to the location of the visual target. The unimodal and bimodal cues gave rise to an equivalent (significant) facilitation of participants' visual search performance relative to a no-cue baseline condition. Making the unimodal auditory and vibrotactile cues spatially informative produced further performance improvements (on validly cued trials), as compared with cues that were spatially uninformative or otherwise spatially invalid. A final experiment was conducted in order to determine whether cue location (close to versus far from the visual display) would influence participants' visual search performance. Auditory cues presented close to the visual search display were found to produce significantly better performance than cues presented over headphones. Taken together, these results have implications for the design of nonvisual and multisensory warning signals used in complex visual displays.

  19. A Multi-Sensorial Hybrid Control for Robotic Manipulation in Human-Robot Workspaces

    PubMed Central

    Pomares, Jorge; Perea, Ivan; García, Gabriel J.; Jara, Carlos A.; Corrales, Juan A.; Torres, Fernando

    2011-01-01

    Autonomous manipulation in semi-structured environments where human operators can interact is an increasingly common task in robotic applications. This paper describes an intelligent multi-sensorial approach that solves this issue by providing a multi-robotic platform with a high degree of autonomy and the capability to perform complex tasks. The proposed sensorial system is composed of a hybrid visual servo control to efficiently guide the robot towards the object to be manipulated, an inertial motion capture system and an indoor localization system to avoid possible collisions between human operators and robots working in the same workspace, and a tactile sensor algorithm to correctly manipulate the object. The proposed controller employs the whole multi-sensorial system and combines the measurements of each one of the used sensors during two different phases considered in the robot task: a first phase where the robot approaches the object to be grasped, and a second phase of manipulation of the object. In both phases, the unexpected presence of humans is taken into account. This paper also presents the successful results obtained in several experimental setups which verify the validity of the proposed approach. PMID:22163729

  20. Causal links between dorsal medial superior temporal area neurons and multisensory heading perception.

    PubMed

    Gu, Yong; Deangelis, Gregory C; Angelaki, Dora E

    2012-02-15

    The dorsal medial superior temporal area (MSTd) in the extrastriate visual cortex is thought to play an important role in heading perception because neurons in this area are tuned to both optic flow and vestibular signals. MSTd neurons also show significant correlations with perceptual judgments during a fine heading direction discrimination task. To test for a causal link with heading perception, we used microstimulation and reversible inactivation techniques to artificially perturb MSTd activity while monitoring behavioral performance. Electrical microstimulation significantly biased monkeys' heading percepts based on optic flow, but did not significantly impact vestibular heading judgments. The latter result may be due to the fact that vestibular heading preferences in MSTd are more weakly clustered than visual preferences and multiunit tuning for vestibular stimuli is weak. Reversible chemical inactivation, however, increased behavioral thresholds when heading judgments were based on either optic flow or vestibular cues, although the magnitude of the effects was substantially stronger for optic flow. Behavioral deficits in a combined visual/vestibular stimulus condition were intermediate between the single-cue effects. Despite deficits in discrimination thresholds, animals were able to combine visual and vestibular cues near optimally, even after large bilateral muscimol injections into MSTd. Simulations show that the overall pattern of results following inactivation is consistent with a mixture of contributions from MSTd and other areas with vestibular-dominant tuning for heading. Our results support a causal link between MSTd neurons and multisensory heading perception but suggest that other multisensory brain areas also contribute.
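
    The "near optimal" combination mentioned above is usually benchmarked against the prediction that the combined-cue threshold equals the single-cue thresholds combined in quadrature (inverse-variance weighting). The sketch below computes that prediction from two illustrative single-cue thresholds and compares it with a hypothetical observed value; none of the numbers come from the study.

```python
# A minimal sketch of the optimal-integration benchmark for discrimination thresholds.
import numpy as np

sigma_visual = 2.5        # heading threshold from optic flow alone (deg), illustrative
sigma_vestibular = 4.0    # heading threshold from vestibular cues alone (deg), illustrative

predicted_combined = np.sqrt((sigma_visual**2 * sigma_vestibular**2) /
                             (sigma_visual**2 + sigma_vestibular**2))
observed_combined = 2.3   # hypothetical measured combined-cue threshold

print(f"predicted optimal combined threshold: {predicted_combined:.2f} deg")
print(f"observed combined threshold:          {observed_combined:.2f} deg")
```

    An observed combined threshold close to (or below) the prediction is what is meant by near-optimal integration; thresholds well above it indicate suboptimal cue combination.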