Science.gov

Sample records for affects visual perception

  1. Dscam2 affects visual perception in Drosophila melanogaster

    PubMed Central

    Bosch, Danny S.; van Swinderen, Bruno; Millard, S. Sean

    2015-01-01

    Dscam2, a cell surface protein that mediates cellular repulsion, plays a crucial role in the development of the Drosophila melanogaster visual system. Dscam2 generates boundaries between neighboring modules in the fly optic lobe; in Dscam2 mutants this visual system modularity is compromised. Although developmental wiring defects have been well described in the Dscam2 mutant, behavioral consequences have not been investigated. To address this, we examined the visual behavior of Dscam2 mutant flies. Using a phototaxis assay, we ascertained that these flies are not blind, but have a reduced phototaxic response. Through population-based and single fly optomotor assays, we found that Dscam2 mutant flies can track motion but that their response is opposite to control flies under defined experimental conditions. In a fixation paradigm, which allows tethered flies to control the angular position of a visual stimulus, mutant flies' responses were diametrically opposed to those seen in control flies. These data suggest that modest changes in the modularity of the fly visual system in the Dscam2 mutant can dramatically change the perception of specific visual cues and modify behavior. PMID:26106310

  2. Conditions affecting beliefs about visual perception among children and adults.

    PubMed

    Winer, G A; Cottrell, J E; Karefilaki, K D; Chronister, M

    1996-03-01

Children and adults were tested on their beliefs about whether visual processes involved intromissions (visual input) or extramissions (visual output) across a variety of situations. The idea that extramissions are part of the process of vision was first expressed by ancient philosophers, including Plato, Euclid, and Ptolemy, and has been shown to be evident in children and in some adults. The present research showed that when questions about vision referred to luminous as opposed to nonluminous objects, under certain conditions there was some increase in intromission beliefs, but almost no corresponding decline in extramission beliefs, and no evidence of transfer of intromission responses to questions referring to nonluminous objects. A separate study showed that college students, but not children, increased their extramission responses to questions providing a positive emotional context. The results are inconsistent with the idea that simple experiences increase or reinforce a coherent theory of vision. The results also have implications for understanding the nature of beliefs about scientific processes and for education. PMID:8812034

  3. Body ownership affects visual perception of object size by rescaling the visual representation of external space.

    PubMed

    van der Hoort, Björn; Ehrsson, H Henrik

    2014-07-01

    Size perception is most often explained by a combination of cues derived from the visual system. However, this traditional cue approach neglects the role of the observer's body beyond mere visual comparison. In a previous study, we used a full-body illusion to show that objects appear larger and farther away when participants experience a small artificial body as their own and that objects appear smaller and closer when they assume ownership of a large artificial body ("Barbie-doll illusion"; van der Hoort, Guterstam, & Ehrsson, PLoS ONE, 6(5), e20195, 2011). The first aim of the present study was to test the hypothesis that this own-body-size effect is distinct from the role of the seen body as a direct familiar-size cue. To this end, we developed a novel setup that allowed for occlusion of the artificial body during the presentation of test objects. Our results demonstrate that the feeling of ownership of an artificial body can alter the perceived sizes of objects without the need for a visible body. Second, we demonstrate that fixation shifts do not contribute to the own-body-size effect. Third, we show that the effect exists in both peri-personal space and distant extra-personal space. Finally, through a meta-analysis, we demonstrate that the own-body-size effect is independent of and adds to the classical visual familiar-size cue effect. Our results suggest that, by changing body size, the entire spatial layout rescales and new objects are now perceived according to this rescaling, without the need to see the body. PMID:24806404

  4. Cardio-visual integration modulates the subjective perception of affectively neutral stimuli.

    PubMed

    Azevedo, Ruben T; Ainley, Vivien; Tsakiris, Manos

    2016-01-01

    Interoception, which refers to the perception of internal body signals, has been consistently associated with emotional processing and with the sense of self. However, its influence on the subjective appraisal of affectively neutral and body-unrelated stimuli is still largely unknown. Across two experiments we sought to investigate this issue by asking participants to detect changes in the flashing rhythm of a simple stimulus (a circle) that could either be pulsing synchronously with their own heartbeats or following the pattern of another person's heart. While overall task performance did not vary as a function of cardio-visual synchrony, participants were better at identifying trials in which no change occurred when the flashes were synchronous with their own heartbeats. This study adds to the growing body of research indicating that we use our body as a reference point when perceiving the world; and extends this view by focusing on the role that signals coming from inside the body, such as heartbeats, may play in this referencing process. Specifically we show that private interoceptive sensations can be combined with affectively neutral information unrelated to the self to influence the processing of a multisensory percept. Results are discussed in terms of both standard multisensory integration processes and predictive coding theories. PMID:26620928

  5. Perception and Attention for Visualization

    ERIC Educational Resources Information Center

    Haroz, Steve

    2013-01-01

    This work examines how a better understanding of visual perception and attention can impact visualization design. In a collection of studies, I explore how different levels of the visual system can measurably affect a variety of visualization metrics. The results show that expert preference, user performance, and even computational performance are…

  6. Visual Imagery without Visual Perception?

    ERIC Educational Resources Information Center

    Bertolo, Helder

    2005-01-01

The question regarding visual imagery and visual perception remains an open issue. Many studies have tried to understand if the two processes share the same mechanisms or if they are independent, using different neural substrates. Most research has been directed towards the need for activation of primary visual areas during imagery. Here we review…

  7. Non-conscious visual cues related to affect and action alter perception of effort and endurance performance

    PubMed Central

    Blanchfield, Anthony; Hardy, James; Marcora, Samuele

    2014-01-01

    The psychobiological model of endurance performance proposes that endurance performance is determined by a decision-making process based on perception of effort and potential motivation. Recent research has reported that effort-based decision-making during cognitive tasks can be altered by non-conscious visual cues relating to affect and action. The effects of these non-conscious visual cues on effort and performance during physical tasks are however unknown. We report two experiments investigating the effects of subliminal priming with visual cues related to affect and action on perception of effort and endurance performance. In Experiment 1 thirteen individuals were subliminally primed with happy or sad faces as they cycled to exhaustion in a counterbalanced and randomized crossover design. A paired t-test (happy vs. sad faces) revealed that individuals cycled significantly longer (178 s, p = 0.04) when subliminally primed with happy faces. A 2 × 5 (condition × iso-time) ANOVA also revealed a significant main effect of condition on rating of perceived exertion (RPE) during the time to exhaustion (TTE) test with lower RPE when subjects were subliminally primed with happy faces (p = 0.04). In Experiment 2, a single-subject randomization tests design found that subliminal priming with action words facilitated a significantly longer TTE (399 s, p = 0.04) in comparison to inaction words. Like Experiment 1, this greater TTE was accompanied by a significantly lower RPE (p = 0.03). These experiments are the first to show that subliminal visual cues relating to affect and action can alter perception of effort and endurance performance. Non-conscious visual cues may therefore influence the effort-based decision-making process that is proposed to determine endurance performance. Accordingly, the findings raise notable implications for individuals who may encounter such visual cues during endurance competitions, training, or health related exercise. PMID:25566014
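
    The core statistical comparison in Experiment 1 is a paired t-test on time to exhaustion across the two priming conditions. The sketch below illustrates that analysis with invented data (the thirteen values and their differences are made up purely for illustration; only the design mirrors the abstract), assuming Python with SciPy.

    ```python
    import numpy as np
    from scipy.stats import ttest_rel

    # Hypothetical time-to-exhaustion data (seconds) for the same 13 cyclists under
    # the two priming conditions; the numbers are invented for illustration only.
    happy = np.array([1450, 1320, 1610, 1500, 1385, 1720, 1290, 1555, 1430, 1600, 1365, 1510, 1475])
    sad = happy - np.array([210, 150, 260, 120, 90, 300, 60, 230, 170, 250, 110, 190, 180])

    t_stat, p_value = ttest_rel(happy, sad)  # paired comparison, as in Experiment 1
    print(f"mean difference = {np.mean(happy - sad):.0f} s, t = {t_stat:.2f}, p = {p_value:.3f}")
    ```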

  8. Tuning to the significant: neural and genetic processes underlying affective enhancement of visual perception and memory.

    PubMed

    Markovic, Jelena; Anderson, Adam K; Todd, Rebecca M

    2014-02-01

    Emotionally arousing events reach awareness more easily and evoke greater visual cortex activation than more mundane events. Recent studies have shown that they are also perceived more vividly and that emotionally enhanced perceptual vividness predicts memory vividness. We propose that affect-biased attention (ABA) - selective attention to emotionally salient events - is an endogenous attentional system tuned by an individual's history of reward and punishment. We present the Biased Attention via Norepinephrine (BANE) model, which unifies genetic, neuromodulatory, neural and behavioural evidence to account for ABA. We review evidence supporting BANE's proposal that a key mechanism of ABA is locus coeruleus-norepinephrine (LC-NE) activity, which interacts with activity in hubs of affective salience networks to modulate visual cortex activation and heighten the subjective vividness of emotionally salient stimuli. We further review literature on biased competition and look at initial evidence for its potential as a neural mechanism behind ABA. We also review evidence supporting the role of the LC-NE system as a driving force of ABA. Finally, we review individual differences in ABA and memory including differences in sensitivity to stimulus category and valence. We focus on differences arising from a variant of the ADRA2b gene, which codes for the alpha2b adrenoreceptor as a way of investigating influences of NE availability on ABA in humans. PMID:24269973

  9. Bodily action penetrates affective perception

    PubMed Central

    Rigutti, Sara; Gerbino, Walter

    2016-01-01

    Fantoni & Gerbino (2014) showed that subtle postural shifts associated with reaching can have a strong hedonic impact and affect how actors experience facial expressions of emotion. Using a novel Motor Action Mood Induction Procedure (MAMIP), they found consistent congruency effects in participants who performed a facial emotion identification task after a sequence of visually-guided reaches: a face perceived as neutral in a baseline condition appeared slightly happy after comfortable actions and slightly angry after uncomfortable actions. However, skeptics about the penetrability of perception (Zeimbekis & Raftopoulos, 2015) would consider such evidence insufficient to demonstrate that observer’s internal states induced by action comfort/discomfort affect perception in a top-down fashion. The action-modulated mood might have produced a back-end memory effect capable of affecting post-perceptual and decision processing, but not front-end perception. Here, we present evidence that performing a facial emotion detection (not identification) task after MAMIP exhibits systematic mood-congruent sensitivity changes, rather than response bias changes attributable to cognitive set shifts; i.e., we show that observer’s internal states induced by bodily action can modulate affective perception. The detection threshold for happiness was lower after fifty comfortable than uncomfortable reaches; while the detection threshold for anger was lower after fifty uncomfortable than comfortable reaches. Action valence induced an overall sensitivity improvement in detecting subtle variations of congruent facial expressions (happiness after positive comfortable actions, anger after negative uncomfortable actions), in the absence of significant response bias shifts. Notably, both comfortable and uncomfortable reaches impact sensitivity in an approximately symmetric way relative to a baseline inaction condition. All of these constitute compelling evidence of a genuine top-down effect on
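
    The key distinction drawn in this record is between sensitivity changes and response bias changes. As a minimal illustration of how the two are separated in a yes/no detection task, the sketch below computes the standard signal-detection indices d' (sensitivity) and c (criterion) from hit and false-alarm counts; the counts and the log-linear correction are illustrative assumptions, not data or methods from the study.

    ```python
    from scipy.stats import norm

    def sdt_indices(hits, misses, false_alarms, correct_rejections):
        """Sensitivity (d') and response bias (criterion c) from a yes/no detection
        task, with a standard log-linear correction for extreme rates."""
        hit_rate = (hits + 0.5) / (hits + misses + 1)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
        d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
        criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))
        return d_prime, criterion

    # Hypothetical counts for detecting a slightly happy expression after comfortable
    # vs. uncomfortable reaching (all numbers are made up for illustration only).
    d1, c1 = sdt_indices(42, 8, 10, 40)
    d2, c2 = sdt_indices(30, 20, 11, 39)
    print(f"comfortable:   d' = {d1:.2f}, c = {c1:.2f}")
    print(f"uncomfortable: d' = {d2:.2f}, c = {c2:.2f}")
    ```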

  10. A model of visual perception.

    PubMed

    Borello, L; Ferraro, M; Penengo, P; Rossotti, M L

    1981-01-01

In this paper we propose a model of visual perception in which a positive feedback mechanism can reproduce the pattern stimulus on a screen of neurons. The reproduction of the pattern stimulus is based on information coming from the spatial derivatives of the visual pattern. This information, together with the responses of the feature extractors, drives the reproduction of the visual pattern as electrical activity on the neuron screen. We simulate several input patterns and show that the model reproduces the percept. PMID:7236747

  11. Sound can suppress visual perception.

    PubMed

    Hidaka, Souta; Ide, Masakazu

    2015-01-01

    In a single modality, the percept of an input (e.g., voices of neighbors) is often suppressed by another (e.g., the sound of a car horn nearby) due to close interactions of neural responses to these inputs. Recent studies have also suggested that close interactions of neural responses could occur even across sensory modalities, especially for audio-visual interactions. However, direct behavioral evidence regarding the audio-visual perceptual suppression effect has not been reported in a study with humans. Here, we investigated whether sound could have a suppressive effect on visual perception. We found that white noise bursts presented through headphones degraded visual orientation discrimination performance. This auditory suppression effect on visual perception frequently occurred when these inputs were presented in a spatially and temporally consistent manner. These results indicate that the perceptual suppression effect could occur across auditory and visual modalities based on close and direct neural interactions among those sensory inputs. PMID:26023877

  12. Structure of visual perception.

    PubMed Central

    Zhang, J; Wu, S Y

    1990-01-01

    The response properties of a class of motion detectors (Reichardt detectors) are investigated extensively here. Since the outputs of the detectors, responding to an image undergoing two-dimensional rigid translation, are dependent on both the image velocity and the image intensity distribution, they are nonuniform across the entire image, even though the object is moving rigidly as a whole. To achieve perceptual "oneness" in the rigid motion, we are led to contend that visual perception must take place in a space that is non-Euclidean in nature. We then derive the affine connection and the metric of this perceptual space. The Riemann curvature tensor is identically zero, which means that the perceptual space is intrinsically flat. A geodesic in this space is composed of points of constant image intensity gradient along a certain direction. The deviation of geodesics (which are perceptually "straight") from physically straight lines may offer an explanation to the perceptual distortion of angular relationships such as the Hering illusion. PMID:2235999
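
    The Reichardt (correlation-type) detector named in the abstract computes motion by multiplying a delayed signal from one receptor with the undelayed signal from its neighbour and subtracting the mirror-symmetric product. A minimal 1-D sketch, with made-up delay and spacing parameters, is given below; it is a generic textbook implementation, not the authors' formulation.

    ```python
    import numpy as np

    def reichardt_output(stimulus, dx=1, delay=5):
        """Minimal 1-D Reichardt (correlation) motion detector.

        stimulus: 2-D array of shape (time, space) holding image intensity over time.
        Returns the opponent detector output at each time step and spatial position.
        """
        # Signals from two neighbouring receptors, one delayed relative to the other.
        left = stimulus[:, :-dx]
        right = stimulus[:, dx:]
        left_delayed = np.roll(left, delay, axis=0)
        right_delayed = np.roll(right, delay, axis=0)
        left_delayed[:delay] = 0
        right_delayed[:delay] = 0
        # Each half-detector correlates a delayed signal with its undelayed neighbour;
        # subtracting the mirror-symmetric halves gives a direction-selective response.
        return left_delayed * right - right_delayed * left

    # Gratings drifting in opposite directions yield mean outputs of opposite sign,
    # the signature of a direction-selective correlation detector.
    t = np.arange(200)[:, None]
    x = np.arange(100)[None, :]
    one_direction = np.sin(2 * np.pi * (0.05 * x - 0.02 * t))
    other_direction = np.sin(2 * np.pi * (0.05 * x + 0.02 * t))
    print(reichardt_output(one_direction).mean(), reichardt_output(other_direction).mean())
    ```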

  13. Serial dependence in visual perception

    PubMed Central

    Fischer, Jason; Whitney, David

    2014-01-01

    Visual input often arrives in a noisy and discontinuous stream, owing to head and eye movements, occlusion, lighting changes, and many other factors. Yet the physical world is generally stable—objects and physical characteristics rarely change spontaneously. How then does the human visual system capitalize on continuity in the physical environment over time? Here we show that visual perception is serially dependent, using both prior and present input to inform perception at the present moment. Using an orientation judgment task, we found that even when visual input changes randomly over time, perceived orientation is strongly and systematically biased toward recently seen stimuli. Further, the strength of this bias is modulated by attention and tuned to the spatial and temporal proximity of successive stimuli. These results reveal a serial dependence in perception characterized by a spatiotemporally tuned, orientation-selective operator—which we call a continuity field—that may promote visual stability over time. PMID:24686785
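
    The reported analysis relates each trial's orientation error to the orientation of the preceding stimulus. The sketch below simulates a toy observer with an attractive, derivative-of-Gaussian-shaped pull toward the previous orientation and then runs the standard binned analysis; the amplitude, width, and noise values are invented, and the simulation stands in for real response data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def wrap(deg):
        """Wrap orientation differences into (-90, 90] degrees."""
        return (deg + 90) % 180 - 90

    # Toy observer: perception is attractively biased toward the previous stimulus
    # with a derivative-of-Gaussian profile (made-up amplitude and width).
    n_trials = 5000
    stim = rng.uniform(0, 180, n_trials)
    delta_prev = wrap(np.roll(stim, 1) - stim)            # previous minus current orientation
    bias = 3.0 * (delta_prev / 30.0) * np.exp(-0.5 * (delta_prev / 30.0) ** 2)
    resp = stim + bias + rng.normal(0, 4.0, n_trials)     # add judgment noise
    error = wrap(resp - stim)

    # Standard serial-dependence analysis: mean signed error binned by how far the
    # previous stimulus was from the current one. Positive error at positive deltas
    # indicates attraction toward the preceding orientation.
    bins = np.arange(-90, 91, 15)
    centers = (bins[:-1] + bins[1:]) / 2
    for lo, hi, c in zip(bins[:-1], bins[1:], centers):
        sel = (delta_prev[1:] >= lo) & (delta_prev[1:] < hi)
        print(f"delta_prev {c:+6.1f} deg -> mean error {error[1:][sel].mean():+.2f} deg")
    ```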

  14. Visual adaptation and face perception

    PubMed Central

    Webster, Michael A.; MacLeod, Donald I. A.

    2011-01-01

    The appearance of faces can be strongly affected by the characteristics of faces viewed previously. These perceptual after-effects reflect processes of sensory adaptation that are found throughout the visual system, but which have been considered only relatively recently in the context of higher level perceptual judgements. In this review, we explore the consequences of adaptation for human face perception, and the implications of adaptation for understanding the neural-coding schemes underlying the visual representation of faces. The properties of face after-effects suggest that they, in part, reflect response changes at high and possibly face-specific levels of visual processing. Yet, the form of the after-effects and the norm-based codes that they point to show many parallels with the adaptations and functional organization that are thought to underlie the encoding of perceptual attributes like colour. The nature and basis for human colour vision have been studied extensively, and we draw on ideas and principles that have been developed to account for norms and normalization in colour vision to consider potential similarities and differences in the representation and adaptation of faces. PMID:21536555

  15. Visual field asymmetry in facial affect perception: moderating effects of hypnosis, hypnotic susceptibility level, absorption, and sustained attentional abilities.

    PubMed

    Crawford, H J; Harrison, D W; Kapelis, L

    1995-05-01

    Effects of hypnotic level, affect valence and cerebral asymmetry on reaction time (RT) in the discrimination of Ekman and Friesen's (1978) stimuli of angry and happy faces were studied in counterbalanced conditions of waking and hypnosis. Assessed previously on two hypnotic susceptibility scales [Harvard Group Scale of Hypnotic Susceptibility; Stanford Hypnotic Susceptibility Scale, Form C (SHSSC)], non-depressed subjects were 16 low (0-4 SHSSC) and 17 highly (10-12 SHSSC) hypnotizable, right-handed college students. Subjects were required to identify affects of faces, presented tachistoscopically to left (LVF) or right (RVF) visual fields, by using a forced-choice RT paradigm. Highs were significantly faster than lows in angry and happy affect recognition. Hypnosis had no significant effects. For highs only, angry emotional valence was identified faster when presented to the right hemisphere (LVF), but there were no significant hemispheric effects for happy emotional valence. For lows there were no hemispheric differences. Gender was a nonsignificant factor. Significant correlations showed that faster reaction times to angry and happy stimuli, in both LVF and RVF in waking and hypnosis, were obtained by subjects who reported more deeply absorbed and extremely focused and sustained attention on the Tellegen (1982) Absorption Scale and a subscale of the Differential Attentional Processes Inventory (Grumbles & Crawford, 1981). Vividness of Visual Imagery Questionnaire (Marks, 1973) and Affect Intensity Measure (Larsen, 1985), in general, did not correlate with RTs. The potential role of the fronto-limbic attentional system in the recognition of external visual sensory affect is discussed. PMID:7591508

  16. Camouflage and visual perception

    PubMed Central

    Troscianko, Tom; Benton, Christopher P.; Lovell, P. George; Tolhurst, David J.; Pizlo, Zygmunt

    2008-01-01

    How does an animal conceal itself from visual detection by other animals? This review paper seeks to identify general principles that may apply in this broad area. It considers mechanisms of visual encoding, of grouping and object encoding, and of search. In most cases, the evidence base comes from studies of humans or species whose vision approximates to that of humans. The effort is hampered by a relatively sparse literature on visual function in natural environments and with complex foraging tasks. However, some general constraints emerge as being potentially powerful principles in understanding concealment—a ‘constraint’ here means a set of simplifying assumptions. Strategies that disrupt the unambiguous encoding of discontinuities of intensity (edges), and of other key visual attributes, such as motion, are key here. Similar strategies may also defeat grouping and object-encoding mechanisms. Finally, the paper considers how we may understand the processes of search for complex targets in complex scenes. The aim is to provide a number of pointers towards issues, which may be of assistance in understanding camouflage and concealment, particularly with reference to how visual systems can detect the shape of complex, concealed objects. PMID:18990671

  17. Attentional Episodes in Visual Perception

    ERIC Educational Resources Information Center

    Wyble, Brad; Potter, Mary C.; Bowman, Howard; Nieuwenstein, Mark

    2011-01-01

    Is one's temporal perception of the world truly as seamless as it appears? This article presents a computationally motivated theory suggesting that visual attention samples information from temporal episodes (episodic simultaneous type/serial token model; Wyble, Bowman, & Nieuwenstein, 2009). Breaks between these episodes are punctuated by periods…

  18. Perception, Cognition, and Visualization.

    ERIC Educational Resources Information Center

    Arnheim, Rudolf

    1991-01-01

Describes how pictures can combine aspects of naturalistic representation with more formal shapes to enhance cognitive understanding. These "diagrammatic" shapes derive from elementary geometrical forms and thereby bestow visual concreteness on concepts conveyed by the pictures. Leonardo da Vinci's anatomical drawings are used as examples…

  19. Neuron analysis of visual perception

    NASA Technical Reports Server (NTRS)

    Chow, K. L.

    1980-01-01

The receptive fields of single cells in the visual system of the cat and squirrel monkey were studied, investigating the vestibular input affecting the cells and the cells' responses during the visual discrimination learning process. The receptive field characteristics of the rabbit visual system, its normal development, its abnormal development following visual deprivation, and the structural and functional re-organization of the visual system following neonatal and prenatal surgery were also studied. The results of each individual part of each investigation are detailed.

  20. Eye movements reset visual perception

    PubMed Central

    Paradiso, Michael A.; Meshi, Dar; Pisarcik, Jordan; Levine, Samuel

    2012-01-01

    Human vision uses saccadic eye movements to rapidly shift the sensitive foveal portion of our retina to objects of interest. For vision to function properly amidst these ballistic eye movements, a mechanism is needed to extract discrete percepts on each fixation from the continuous stream of neural activity that spans fixations. The speed of visual parsing is crucial because human behaviors ranging from reading to driving to sports rely on rapid visual analysis. We find that a brain signal associated with moving the eyes appears to play a role in resetting visual analysis on each fixation, a process that may aid in parsing the neural signal. We quantified the degree to which the perception of tilt is influenced by the tilt of a stimulus on a preceding fixation. Two key conditions were compared, one in which a saccade moved the eyes from one stimulus to the next and a second simulated saccade condition in which the stimuli moved in the same manner but the subjects did not move their eyes. We find that there is a brief period of time at the start of each fixation during which the tilt of the previous stimulus influences perception (in a direction opposite to the tilt aftereffect)—perception is not instantaneously reset when a fixation starts. Importantly, the results show that this perceptual bias is much greater, with nearly identical visual input, when saccades are simulated. This finding suggests that, in real-saccade conditions, some signal related to the eye movement may be involved in the reset phenomenon. While proprioceptive information from the extraocular muscles is conceivably a factor, the fast speed of the effect we observe suggests that a more likely mechanism is a corollary discharge signal associated with eye movement. PMID:23241264

  2. Effects of strenuous exercise on visual perception are independent of visual resolution.

    PubMed

    Ando, Soichi; Kokubu, Masahiro; Nakae, Satoshi; Kimura, Misaka; Hojo, Tatsuya; Ebine, Naoyuki

    2012-05-15

Strenuous exercise may have detrimental effects on visual perception. However, it is unclear whether visual resolution is related to these detrimental effects. The purpose of this study was to examine whether the effects of strenuous exercise on visual perception depend on visual resolution. Given that visual resolution decreases in the periphery of the visual field, we hypothesized that if visual resolution plays a role in the detrimental effects on visual perception, the detrimental effects may be exaggerated toward the periphery of the visual field. Simple visual reaction time (RT) was measured at rest and during cycling at 40% and 75% of peak oxygen uptake (VO(2)). Visual stimuli were randomly presented at 2°, 10°, 30°, and 50° to either the right or left of the midpoint between the eyes with equal probability. RT was fractionated into premotor and motor components (i.e., premotor time and motor time) based on electromyographic recording. The premotor time during exercise at 40% peak VO(2) was not different from that at rest. In contrast, the premotor time during exercise at 75% peak VO(2) was significantly longer than that at rest (p=0.018). The increase in the premotor time was observed irrespective of eccentricity, and the detrimental effects were not exaggerated toward the periphery of the visual field. The motor time was not affected by exercise. The current findings suggest that the detrimental effects of strenuous exercise on visual perception are independent of visual resolution. PMID:22285211
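
    Premotor and motor components are obtained by locating EMG onset between the stimulus and the button press. The sketch below shows one common convention (smoothed, rectified EMG crossing a baseline-derived threshold) on a synthetic trial; the sampling rate, threshold rule, and trial timings are assumptions, since the abstract does not specify them.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    fs = 1000  # EMG sampling rate in Hz (assumed)

    def fractionate_rt(emg, stim_onset_s, press_s, baseline_s=0.2, win_s=0.01, k=4.0):
        """Split reaction time into premotor and motor components from an EMG trace.

        EMG onset is taken as the first post-stimulus sample whose smoothed, rectified
        amplitude exceeds the pre-stimulus baseline mean + k standard deviations
        (one common convention; the study does not report its exact criterion).
        """
        win = int(win_s * fs)
        envelope = np.convolve(np.abs(emg), np.ones(win) / win, mode="same")
        stim_idx = int(stim_onset_s * fs)
        baseline = envelope[stim_idx - int(baseline_s * fs):stim_idx]
        threshold = baseline.mean() + k * baseline.std()
        onset_idx = stim_idx + np.argmax(envelope[stim_idx:] > threshold)
        premotor = onset_idx / fs - stim_onset_s   # central (perceptual + decision) component
        motor = press_s - onset_idx / fs           # peripheral (muscular) component
        return premotor, motor

    # Toy trial: quiet baseline, a burst of muscle activity 220 ms after the stimulus
    # (stimulus at 0.5 s), and a button press 300 ms after the stimulus.
    t = np.arange(0, 1.0, 1 / fs)
    emg = rng.normal(0, 0.05, t.size)
    emg[int(0.72 * fs):int(0.80 * fs)] += rng.normal(0, 1.0, int(0.08 * fs))
    premotor, motor = fractionate_rt(emg, stim_onset_s=0.5, press_s=0.8)
    print(f"premotor time = {premotor * 1000:.0f} ms, motor time = {motor * 1000:.0f} ms")
    ```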

  3. Adaptive design of visual perception experiments

    NASA Astrophysics Data System (ADS)

    O'Connor, John D.; Hixson, Jonathan; Thomas, James M., Jr.; Peterson, Matthew S.; Parasuraman, Raja

    2010-04-01

    Meticulous experimental design may not always prevent confounds from affecting experimental data acquired during visual perception experiments. Although experimental controls reduce the potential effects of foreseen sources of interference, interaction, or noise, they are not always adequate for preventing the confounding effects of unforeseen forces. Visual perception experimentation is vulnerable to unforeseen confounds because of the nature of the associated cognitive processes involved in the decision task. Some confounds are beyond the control of experimentation, such as what a participant does immediately prior to experimental participation, or the participant's attitude or emotional state. Other confounds may occur through ignorance of practical control methods on the part of the experiment's designer. The authors conducted experiments related to experimental fatigue and initially achieved significant results that were, upon re-examination, attributable to a lack of adequate controls. Re-examination of the original results and the processes and events that led to them yielded a second experimental design with more experimental controls and significantly different results. The authors propose that designers of visual perception experiments can benefit from planning to use a test-fix-test or adaptive experimental design cycle, so that unforeseen confounds in the initial design can be remedied.

  4. Affecting speed and accuracy in perception.

    PubMed

    Bocanegra, Bruno R

    2014-12-01

An account of affective modulations in perceptual speed and accuracy (ASAP: Affecting Speed and Accuracy in Perception) is proposed and tested. This account assumes an emotion-induced inhibitory interaction between parallel channels in the visual system that modulates the onset latencies and response durations of visual signals. By trading off speed and accuracy between channels, this mechanism achieves (a) fast visuo-motor responding to coarse-grained information, and (b) accurate visuo-attentional selection of fine-grained information. ASAP gives a functional account of previously counterintuitive findings, and may be useful for explaining affective influences in both featural-level single-stimulus tasks and object-level multistimulus tasks. PMID:24853268

  5. Separate visual representations for perception and for visually guided behavior

    NASA Technical Reports Server (NTRS)

    Bridgeman, Bruce

    1989-01-01

Converging evidence from several sources indicates that two distinct representations of visual space mediate perception and visually guided behavior, respectively. The two maps of visual space follow different rules; spatial values in either one can be biased without affecting the other. Ordinarily the two maps give equivalent responses because both are veridically in register with the world; special techniques are required to pull them apart. One such technique is saccadic suppression: small target displacements during saccadic eye movements are not perceived, though the displacements can change eye movements or pointing to the target. A second way to separate cognitive and motor-oriented maps is with induced motion: a slowly moving frame will make a fixed target appear to drift in the opposite direction, while motor behavior toward the target is unchanged. The same result occurs with stroboscopic induced motion, where the frame jumps abruptly and the target seems to jump in the opposite direction. A third method of separating cognitive and motor maps, requiring no motion of target, background or eye, is the Roelofs effect: a target surrounded by an off-center rectangular frame will appear to be off-center in the direction opposite the frame. Again the effect influences perception, but in half of the subjects it does not influence pointing to the target. This experiment also reveals more characteristics of the maps and their interactions with one another: the motor map apparently has little or no memory, and must be fed from the biased cognitive map if an enforced delay occurs between stimulus presentation and motor response. In designing spatial displays, the results mean that what you see isn't necessarily what you get. Displays must be designed with either perception or visually guided behavior in mind.

  6. Stochastic Resonance In Visual Perception

    NASA Astrophysics Data System (ADS)

    Simonotto, Enrico

    1996-03-01

Stochastic resonance (SR) is a well established physical phenomenon wherein some measure of the coherence of a weak signal can be optimized by random fluctuations, or "noise" (K. Wiesenfeld and F. Moss, Nature, 373, 33 (1995)). In all experiments to date the coherence has been measured using numerical analysis of the data, for example, signal-to-noise ratios obtained from power spectra. But can this analysis be replaced by a perceptive task? Previously we had demonstrated this possibility with a numerical model of perceptual bistability applied to the interpretation of ambiguous figures (M. Riani and E. Simonotto, Phys. Rev. Lett., 72, 3120 (1994)). Here I describe an experiment wherein SR is detected in visual perception. A recognizable grayscale photograph was digitized and presented. The picture was then thresholded: every pixel for which the grayscale exceeded the threshold was painted white, and all others black. For a large enough threshold the picture is unrecognizable, but the addition of a random number to every pixel renders it interpretable (C. Seife and M. Roberts, The Economist, 336, 59, July 29 (1995)). However, the addition of dynamical noise to the pixels greatly enhances an observer's ability to interpret the picture. Here I report the results of psychophysics experiments wherein the effects of both the intensity of the noise and its correlation time were studied.
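
    The described manipulation, thresholding a picture and adding time-varying noise, is easy to reproduce in a few lines. The sketch below uses a synthetic radial-gradient image as a stand-in for the photograph and measures recognizability as correlation with the original; the image, threshold, and noise levels are arbitrary choices, but the non-monotonic dependence on noise intensity is the stochastic-resonance signature described here.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy image standing in for the digitized photograph: a smooth radial gradient.
    y, x = np.mgrid[0:64, 0:64]
    image = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / (2 * 15.0 ** 2))  # values in (0, 1]

    threshold = 0.9  # high enough that the noiselessly thresholded image is nearly blank

    def thresholded(noise_sd, n_frames=30):
        """Threshold the image after adding zero-mean Gaussian noise; averaging over
        frames mimics the dynamical (time-varying) noise used in the experiment."""
        frames = [(image + rng.normal(0, noise_sd, image.shape)) > threshold
                  for _ in range(n_frames)]
        return np.mean(frames, axis=0)

    def similarity(recovered):
        """Correlation between the recovered pattern and the original image."""
        return np.corrcoef(recovered.ravel(), image.ravel())[0, 1]

    # Too little noise leaves the sub-threshold structure invisible, too much buries it;
    # an intermediate noise level maximizes the correlation.
    for sd in (0.01, 0.1, 0.3, 1.0, 3.0):
        print(f"noise sd {sd:>4}: correlation with original {similarity(thresholded(sd)):.3f}")
    ```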

  7. Visual motion integration for perception and pursuit

    NASA Technical Reports Server (NTRS)

    Stone, L. S.; Beutter, B. R.; Lorenceau, J.

    2000-01-01

    To examine the relationship between visual motion processing for perception and pursuit, we measured the pursuit eye-movement and perceptual responses to the same complex-motion stimuli. We show that humans can both perceive and pursue the motion of line-figure objects, even when partial occlusion makes the resulting image motion vastly different from the underlying object motion. Our results show that both perception and pursuit can perform largely accurate motion integration, i.e. the selective combination of local motion signals across the visual field to derive global object motion. Furthermore, because we manipulated perceived motion while keeping image motion identical, the observed parallel changes in perception and pursuit show that the motion signals driving steady-state pursuit and perception are linked. These findings disprove current pursuit models whose control strategy is to minimize retinal image motion, and suggest a new framework for the interplay between visual cortex and cerebellum in visuomotor control.

  8. Saccadic Corollary Discharge Underlies Stable Visual Perception

    PubMed Central

    Berman, Rebecca A.; Joiner, Wilsaan M.; Wurtz, Robert H.

    2016-01-01

    Saccadic eye movements direct the high-resolution foveae of our retinas toward objects of interest. With each saccade, the image jumps on the retina, causing a discontinuity in visual input. Our visual perception, however, remains stable. Philosophers and scientists over centuries have proposed that visual stability depends upon an internal neuronal signal that is a copy of the neuronal signal driving the eye movement, now referred to as a corollary discharge (CD) or efference copy. In the old world monkey, such a CD circuit for saccades has been identified extending from superior colliculus through MD thalamus to frontal cortex, but there is little evidence that this circuit actually contributes to visual perception. We tested the influence of this CD circuit on visual perception by first training macaque monkeys to report their perceived eye direction, and then reversibly inactivating the CD as it passes through the thalamus. We found that the monkey's perception changed; during CD inactivation, there was a difference between where the monkey perceived its eyes to be directed and where they were actually directed. Perception and saccade were decoupled. We established that the perceived eye direction at the end of the saccade was not derived from proprioceptive input from eye muscles, and was not altered by contextual visual information. We conclude that the CD provides internal information contributing to the brain's creation of perceived visual stability. More specifically, the CD might provide the internal saccade vector used to unite separate retinal images into a stable visual scene. SIGNIFICANCE STATEMENT Visual stability is one of the most remarkable aspects of human vision. The eyes move rapidly several times per second, displacing the retinal image each time. The brain compensates for this disruption, keeping our visual perception stable. A major hypothesis explaining this stability invokes a signal within the brain, a corollary discharge, that informs

  9. How perception affects racial categorization: On the influence of initial visual exposure on labelling people as diverse individuals or racial subjects.

    PubMed

    Harsányi, Géza; Carbon, Claus-Christian

    2015-01-01

In research on racial categorization we tend to focus on socialization, on environmental influences, and on social factors. One important factor, though, is perception itself. In our experiment we let people label persons on dimensions which they could freely use. The participants were initially exposed to a full series either of black faces or of white faces. We observed a clear effect of initial exposure on explicit verbal categorizations. When initially exposed to white faces, participants used racial labels for the subsequent black faces only. In contrast, racial labels were used for black as well as white faces after initial exposure to black faces, which indicates a shift to in-group categorization after having initially inspected black faces. In conclusion, this effect documents highly adaptive categorizations caused by visual context alone, suggesting that racial thoughts are based on relatively volatile category representations. PMID:26489221

  10. Visual perception of order-disorder transition

    PubMed Central

    Katkov, Mikhail; Harris, Hila; Sagi, Dov

    2015-01-01

    Our experience with the natural world, as composed of ordered entities, implies that perception captures relationships between image parts. For instance, regularities in the visual scene are rapidly identified by our visual system. Defining the regularities that govern perception is a basic, unresolved issue in neuroscience. Mathematically, perfect regularities are represented by symmetry (perfect order). The transition from ordered configurations to completely random ones has been extensively studied in statistical physics, where the amount of order is characterized by a symmetry-specific order parameter. Here we applied tools from statistical physics to study order detection in humans. Different sets of visual textures, parameterized by the thermodynamic temperature in the Boltzmann distribution, were designed. We investigated how much order is required in a visual texture for it to be discriminated from random noise. The performance of human observers was compared to Ideal and Order observers (based on the order parameter). The results indicated a high consistency in performance across human observers, much below that of the Ideal observer, but well-approximated by the Order observer. Overall, we provide a novel quantitative paradigm to address order perception. Our findings, based on this paradigm, suggest that the statistical physics formalism of order captures regularities to which the human visual system is sensitive. An additional analysis revealed that some order perception properties are captured by traditional texture discrimination models according to which discrimination is based on integrated energy within maps of oriented linear filters. PMID:26113826
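
    The abstract describes textures drawn from a Boltzmann distribution parameterized by temperature, with order quantified by a symmetry-specific order parameter. As a generic stand-in (the study's actual texture ensembles and symmetries are not specified here), the sketch below samples the familiar 2-D Ising model with Metropolis updates and reports a magnetization-style order parameter.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def ising_texture(size=24, temperature=2.27, sweeps=200):
        """Binary texture drawn from the Boltzmann distribution of a 2-D Ising model
        via Metropolis sampling, starting from a fully ordered ("cold") lattice."""
        spins = np.ones((size, size), dtype=int)
        for _ in range(sweeps * size * size):
            i, j = rng.integers(0, size, 2)
            # Energy change for flipping one spin, with periodic boundaries.
            nb = (spins[(i + 1) % size, j] + spins[(i - 1) % size, j]
                  + spins[i, (j + 1) % size] + spins[i, (j - 1) % size])
            dE = 2 * spins[i, j] * nb
            if dE <= 0 or rng.random() < np.exp(-dE / temperature):
                spins[i, j] *= -1
        return spins

    def order_parameter(spins):
        """Magnetization-style order parameter: near 1 for order, near 0 for noise."""
        return abs(spins.mean())

    # Low temperature keeps the texture ordered; high temperature yields a noise-like
    # texture whose order parameter approaches zero.
    for T in (1.0, 2.27, 5.0):
        print(f"T = {T}: order parameter = {order_parameter(ising_texture(temperature=T)):.2f}")
    ```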

  11. Neural pathways for visual speech perception

    PubMed Central

    Bernstein, Lynne E.; Liebenthal, Einat

    2014-01-01

    This paper examines the questions, what levels of speech can be perceived visually, and how is visual speech represented by the brain? Review of the literature leads to the conclusions that every level of psycholinguistic speech structure (i.e., phonetic features, phonemes, syllables, words, and prosody) can be perceived visually, although individuals differ in their abilities to do so; and that there are visual modality-specific representations of speech qua speech in higher-level vision brain areas. That is, the visual system represents the modal patterns of visual speech. The suggestion that the auditory speech pathway receives and represents visual speech is examined in light of neuroimaging evidence on the auditory speech pathways. We outline the generally agreed-upon organization of the visual ventral and dorsal pathways and examine several types of visual processing that might be related to speech through those pathways, specifically, face and body, orthography, and sign language processing. In this context, we examine the visual speech processing literature, which reveals widespread diverse patterns of activity in posterior temporal cortices in response to visual speech stimuli. We outline a model of the visual and auditory speech pathways and make several suggestions: (1) The visual perception of speech relies on visual pathway representations of speech qua speech. (2) A proposed site of these representations, the temporal visual speech area (TVSA) has been demonstrated in posterior temporal cortex, ventral and posterior to multisensory posterior superior temporal sulcus (pSTS). (3) Given that visual speech has dynamic and configural features, its representations in feedforward visual pathways are expected to integrate these features, possibly in TVSA. PMID:25520611

  12. How Do Observer's Responses Affect Visual Long-Term Memory?

    ERIC Educational Resources Information Center

    Makovski, Tal; Jiang, Yuhong V.; Swallow, Khena M.

    2013-01-01

    How does responding to an object affect explicit memory for visual information? The close theoretical relationship between action and perception suggests that items that require a response should be better remembered than items that require no response. However, conclusive evidence for this claim is lacking, as semantic coherence, category size,…

  13. Neocortical Rebound Depolarization Enhances Visual Perception.

    PubMed

    Funayama, Kenta; Minamisawa, Genki; Matsumoto, Nobuyoshi; Ban, Hiroshi; Chan, Allen W; Matsuki, Norio; Murphy, Timothy H; Ikegaya, Yuji

    2015-08-01

    Animals are constantly exposed to the time-varying visual world. Because visual perception is modulated by immediately prior visual experience, visual cortical neurons may register recent visual history into a specific form of offline activity and link it to later visual input. To examine how preceding visual inputs interact with upcoming information at the single neuron level, we designed a simple stimulation protocol in which a brief, orientated flashing stimulus was subsequently coupled to visual stimuli with identical or different features. Using in vivo whole-cell patch-clamp recording and functional two-photon calcium imaging from the primary visual cortex (V1) of awake mice, we discovered that a flash of sinusoidal grating per se induces an early, transient activation as well as a long-delayed reactivation in V1 neurons. This late response, which started hundreds of milliseconds after the flash and persisted for approximately 2 s, was also observed in human V1 electroencephalogram. When another drifting grating stimulus arrived during the late response, the V1 neurons exhibited a sublinear, but apparently increased response, especially to the same grating orientation. In behavioral tests of mice and humans, the flashing stimulation enhanced the detection power of the identically orientated visual stimulation only when the second stimulation was presented during the time window of the late response. Therefore, V1 late responses likely provide a neural basis for admixing temporally separated stimuli and extracting identical features in time-varying visual environments. PMID:26274866

  15. Acoustic noise improves visual perception and modulates occipital oscillatory states.

    PubMed

    Gleiss, Stephanie; Kayser, Christoph

    2014-04-01

Perception is a multisensory process, and previous work has shown that multisensory interactions occur not only for object-related stimuli but also for simplistic and apparently unrelated inputs to the different senses. We here compare the facilitation of visual perception induced by transient (target-synchronized) sounds to the facilitation provided by continuous, background-noise-like sounds. Specifically, we show that continuous acoustic noise improves visual contrast detection by systematically shifting psychometric curves in an amplitude-dependent manner. This multisensory benefit was found to be both qualitatively and quantitatively similar to that induced by a transient and target-synchronized sound in the same paradigm. Studying the underlying neural mechanisms using electrical neuroimaging (EEG), we found that acoustic noise alters occipital alpha (8-12 Hz) power and decreases beta-band (14-20 Hz) coupling of occipital and temporal sites. Task-irrelevant and continuous sounds thereby have an amplitude-dependent effect on cortical mechanisms implicated in shaping visual cortical excitability. The same oscillatory mechanisms also mediate visual facilitation by transient sounds, and our results suggest that task-related sounds and task-irrelevant background noises could induce perceptually and mechanistically similar enhancement of visual perception. Given the omnipresence of sounds and noises in our environment, such multisensory interactions may affect perception in many everyday scenarios. PMID:24236698
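
    The EEG analysis centres on occipital alpha-band (8-12 Hz) power; the coupling result would additionally require between-site coherence. A minimal band-power sketch on a synthetic trace, using Welch's method, is shown below; the sampling rate, epoch length, and synthetic signal are assumptions, not details from the study.

    ```python
    import numpy as np
    from scipy.signal import welch

    rng = np.random.default_rng(3)
    fs = 250  # sampling rate in Hz (assumed)

    # Synthetic occipital EEG trace: broadband noise plus a 10 Hz alpha rhythm.
    t = np.arange(0, 20, 1 / fs)
    eeg = rng.normal(0, 1.0, t.size) + 1.5 * np.sin(2 * np.pi * 10 * t)

    def band_power(signal, fs, lo, hi):
        """Average power spectral density within a frequency band, via Welch's method."""
        freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
        band = (freqs >= lo) & (freqs <= hi)
        return psd[band].mean()

    alpha = band_power(eeg, fs, 8, 12)   # occipital alpha band
    beta = band_power(eeg, fs, 14, 20)   # beta band used in the coupling analysis
    print(f"alpha power: {alpha:.2f}, beta power: {beta:.2f}")
    ```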

  16. Human Visual Perception--Learning at Workstations

    ERIC Educational Resources Information Center

    Schaal, Stefen; Bogner, Franz X.

    2005-01-01

    This study compares two methods of instruction in practical school biology. The content remains the same but two teaching methods are used, one based on workstations (Group 1) and the other a conventional approach (Group 2). The content was a regular 9th grade syllabus issue: visual perception. Method 1 included a phenomenological introduction,…

  17. Visual perception of spatial relations in depth

    NASA Astrophysics Data System (ADS)

    Doumen, M. J. A.

    2006-09-01

The visual perception of spatial relations of two objects was investigated in a series of experiments. We examined spatial and contextual parameters. The effect of spatial parameters was investigated with various two-dimensional tasks: an exocentric pointing task, a parallelity task and a collinearity task. Whereas spatial parameters like relative distance and visual angle influenced the settings of all observers in a similar way, there were differences between tasks in their dependence on different parameters. For example, whereas the settings of the other tasks were dependent on the relative distance, the settings of the parallelity task were not. This can be explained by different task demands that are specific to each of these tasks. In the exocentric pointing task an observer has to direct a pointer, with a remote control, towards a target. We expanded the exocentric pointing task to a three-dimensional version in which the height was also varied. Therefore, we had two dependent variables: the deviations in the horizontal plane (slant) and in the vertical plane (tilt). With this extension of the task, we could conclude that visual space is anisotropic since, in contrast to the slant, the tilt was not dependent on the relative distance. Another three-dimensional task that has been used is the ball-in-plane task, a task in which the observer has to hang a ball in a plane defined by three other balls by adjusting its height. We found settings that were best described as concave settings, which is in agreement with most conclusions of the work described above. Furthermore, we investigated the effects of context on the 3D exocentric pointing task. We tested whether the settings of the observer were dependent on an egocentric reference like frontoparallelity or an allocentric reference like parallelity to a wall. It turns out that people differ in the references they use to do the task. However, in another experiment, we concluded that a reference like one's own

  18. Visual-somatotopic interactions in spatial perception.

    PubMed

    Samad, Majed; Shams, Ladan

    2016-02-10

    Ventriloquism is a well-studied multisensory illusion of audiovisual spatial perception in which the perceived location of an auditory stimulus is shifted in the direction of a synchronous, but spatially discrepant visual stimulus. This effect is because of vision's superior acuity in the spatial dimension, but has also been shown to be influenced by the perception of unity of the two signals. We sought to investigate whether a similar phenomenon may occur between vision and somatosensation along the surface of the body as vision is known to possess superior spatial acuity to somatosensation. We report the first demonstration of the visuotactile ventriloquist illusion: individuals were instructed to localize visual stimuli (small white disks) or tactile stimuli (brief localized vibrations) that were presented concurrently or individually along the surface of the forearm, where bimodal presentations included spatially congruent and incongruent stimuli. Participants showed strong visual-tactile interactions. The tactile localization was strongly biased in the direction of the visual stimulus and the magnitude of this bias decreased as the spatial disparity between the two stimuli increased. The Bayesian causal inference model that has previously been shown to account for auditory-visual spatial localization and the ventriloquism effect also accounted well for the present data. Therefore, crossmodal interactions involving spatial representation along the surface of the body follow the same rules as crossmodal interactions involving representations of external space (auditory-visual). PMID:26709693
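
    The Bayesian causal inference model referred to here (in the style of Koerding et al., 2007) weighs a common-cause interpretation of the two signals against independent causes and averages the corresponding location estimates. Below is a grid-based sketch with invented noise and prior parameters; it illustrates the model class, not the parameters fitted in this study.

    ```python
    import numpy as np

    def gauss(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    s = np.linspace(-60, 60, 2001)       # candidate source positions (arbitrary units)
    ds = s[1] - s[0]
    prior = gauss(s, 0.0, 20.0)          # spatial prior, assumed centred on the midline

    def causal_inference(x_vis, x_tac, sigma_vis=1.0, sigma_tac=8.0, p_common=0.5):
        """Model-averaged tactile location estimate under Bayesian causal inference."""
        like_vis = gauss(x_vis, s, sigma_vis)
        like_tac = gauss(x_tac, s, sigma_tac)
        # Evidence under a common cause: one source explains both measurements.
        ev_c1 = np.sum(like_vis * like_tac * prior) * ds
        # Evidence under independent causes: each measurement has its own source.
        ev_c2 = (np.sum(like_vis * prior) * ds) * (np.sum(like_tac * prior) * ds)
        post_c1 = ev_c1 * p_common / (ev_c1 * p_common + ev_c2 * (1 - p_common))
        # Posterior-mean estimates of the tactile source under each causal structure.
        post_fused = like_vis * like_tac * prior
        s_fused = np.sum(s * post_fused) / np.sum(post_fused)
        post_tac = like_tac * prior
        s_tac = np.sum(s * post_tac) / np.sum(post_tac)
        return post_c1 * s_fused + (1 - post_c1) * s_tac

    # A tactile vibration at 0 with a visual flash some distance away is pulled toward
    # the flash; the proportion of the disparity captured by the estimate falls off as
    # the disparity grows, mirroring the reported disparity dependence.
    for gap in (3, 6, 12, 24):
        print(gap, round(causal_inference(x_vis=gap, x_tac=0.0), 2))
    ```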

  19. Visual space perception on a computer graphics night visual attachment

    NASA Technical Reports Server (NTRS)

    Palmer, E.; Petitt, J.

    1976-01-01

    A series of experiments was conducted to compare five psychophysical methods of measuring how people perceive visual space in simulators. Psychologists have used such methods traditionally to measure visual space perception in the real world. Of the five tasks - objective-size judgments, angular-size judgments, shape judgments, slant judgments, and distance judgments - only the angular-size judgment task proved to be of potential use as a measure of simulator realism. In this experiment pilots estimated the relative angular size of triangles placed at various distances along a simulated runway. Estimates made when the display was collimated were closer to real-world performance than estimates made with an uncollimated display.

  20. Do gender differences in audio-visual benefit and visual influence in audio-visual speech perception emerge with age?

    PubMed Central

    Alm, Magnus; Behne, Dawn

    2015-01-01

Gender and age have been found to affect adults' audio-visual (AV) speech perception. However, research on adult aging focuses on adults over 60 years, who have an increasing likelihood for cognitive and sensory decline, which may confound positive effects of age-related AV-experience and its interaction with gender. Observed age and gender differences in AV speech perception may also depend on measurement sensitivity and AV task difficulty. Consequently, both AV benefit and visual influence were used to measure visual contribution for gender-balanced groups of young (20–30 years) and middle-aged adults (50–60 years), with task difficulty varied using AV syllables from different talkers in alternative auditory backgrounds. Females had better speech-reading performance than males. Whereas no gender differences in AV benefit or visual influence were observed for young adults, visually influenced responses were significantly greater for middle-aged females than middle-aged males. That speech-reading performance did not influence AV benefit may be explained by visual speech extraction and AV integration constituting independent abilities. Contrastingly, the gender difference in visually influenced responses in middle adulthood may reflect an experience-related shift in females' general AV perceptual strategy. Although young females' speech-reading proficiency may not readily contribute to greater visual influence, between young and middle adulthood recurrent confirmation of the contribution of visual cues induced by speech-reading proficiency may gradually shift females' AV perceptual strategy toward more visually dominated responses. PMID:26236274

  1. A Comparative Study on the Visual Perceptions of Children with Attention Deficit Hyperactivity Disorder

    NASA Astrophysics Data System (ADS)

    Ahmetoglu, Emine; Aral, Neriman; Butun Ayhan, Aynur

    This study was conducted in order to (a) compare the visual perceptions of seven-year-old children diagnosed with attention deficit hyperactivity disorder with those of normally developing children of the same age and development level and (b) determine whether the visual perceptions of children with attention deficit hyperactivity disorder vary with respect to gender, having received preschool education, and parents' educational level. A total of 60 children, 30 with attention deficit hyperactivity disorder and 30 with normal development, were assigned to the study. Data about children with attention deficit hyperactivity disorder and their families were collected using a General Information Form, and the visual perception of the children was examined through the Frostig Developmental Test of Visual Perception. The Mann-Whitney U-test and Kruskal-Wallis variance analysis were used to determine whether there was a difference between the visual perceptions of children with normal development and those diagnosed with attention deficit hyperactivity disorder and to discover whether the variables of gender, preschool education, and parents' educational status affected the visual perceptions of children with attention deficit hyperactivity disorder. The results showed that there was a statistically significant difference between the visual perceptions of the two groups and that the visual perceptions of children with attention deficit hyperactivity disorder were significantly affected by gender, preschool education, and parents' educational status.

  2. Perceptions of the Visually Impaired toward Pursuing Geography Courses and Majors in Higher Education

    ERIC Educational Resources Information Center

    Murr, Christopher D.; Blanchard, R. Denise

    2011-01-01

    Advances in classroom technology have lowered barriers for the visually impaired to study geography, yet few participate. Employing stereotype threat theory, we examined whether beliefs held by the visually impaired affect perceptions toward completing courses and majors in visually oriented disciplines. A test group received a low-level threat…

  3. Adaptive optics without altering visual perception

    PubMed Central

    DE, Koenig; NW, Hart; HJ, Hofer

    2014-01-01

    Adaptive optics combined with visual psychophysics creates the potential to study the relationship between visual function and the retina at the cellular scale. This potential is hampered, however, by visual interference from the wavefront-sensing beacon used during correction. For example, we have previously shown that even a dim, visible beacon can alter stimulus perception (Hofer, H. J., Blaschke, J., Patolia, J., & Koenig, D. E. (2012). Fixation light hue bias revisited: Implications for using adaptive optics to study color vision. Vision Research, 56, 49-56). Here we describe a simple strategy employing a longer wavelength (980nm) beacon that, in conjunction with appropriate restriction on timing and placement, allowed us to perform psychophysics when dark adapted without altering visual perception. The method was verified by comparing detection and color appearance of foveally presented small spot stimuli with and without the wavefront beacon present in 5 subjects. As an important caution, we found that significant perceptual interference can occur even with a subliminal beacon when additional measures are not taken to limit exposure. Consequently, the lack of perceptual interference should be verified for a given system, and not assumed based on invisibility of the beacon. PMID:24607992

  4. Audio-visual affective expression recognition

    NASA Astrophysics Data System (ADS)

    Huang, Thomas S.; Zeng, Zhihong

    2007-11-01

    Automatic affective expression recognition has attracted increasing attention from researchers in different disciplines; it will contribute significantly to a new paradigm for human-computer interaction (affect-sensitive interfaces, socially intelligent environments) and advance research in affect-related fields including psychology, psychiatry, and education. Multimodal information integration is a process that enables humans to assess affective states robustly and flexibly. In order to understand the richness and subtlety of human emotional behavior, the computer should be able to integrate information from multiple sensors. We introduce in this paper our efforts toward machine understanding of audio-visual affective behavior, based on both deliberate and spontaneous displays. Some promising methods are presented to integrate information from both audio and visual modalities. Our experiments show the advantage of audio-visual fusion in affective expression recognition over audio-only or visual-only approaches.

  5. Audibility and visual biasing in speech perception

    NASA Astrophysics Data System (ADS)

    Clement, Bart Richard

    Although speech perception has been considered a predominantly auditory phenomenon, large benefits from vision in degraded acoustic conditions suggest integration of audition and vision. More direct evidence of this comes from studies of audiovisual disparity that demonstrate vision can bias and even dominate perception (McGurk & MacDonald, 1976). It has been observed that hearing-impaired listeners demonstrate more visual biasing than normally hearing listeners (Walden et al., 1990). It is argued here that stimulus audibility must be equated across groups before true differences can be established. In the present investigation, effects of visual biasing on perception were examined as audibility was degraded for 12 young normally hearing listeners. Biasing was determined by quantifying the degree to which listener identification functions for a single synthetic auditory /ba-da-ga/ continuum changed across two conditions: (1) an auditory-only listening condition; and (2) an auditory-visual condition in which every item of the continuum was synchronized with visual articulations of the consonant-vowel (CV) tokens /ba/ and /ga/, as spoken by each of two talkers. Audibility was altered by presenting the conditions in quiet and in noise at each of three signal-to-noise (S/N) ratios. For the visual-/ba/ context, large effects of audibility were found. As audibility decreased, visual biasing increased. A large talker effect also was found, with one talker eliciting more biasing than the other. An independent lipreading measure demonstrated that this talker was more visually intelligible than the other. For the visual-/ga/ context, audibility and talker effects were less robust, possibly obscured by strong listener effects, which were characterized by marked differences in perceptual processing patterns among participants. Some demonstrated substantial biasing whereas others demonstrated little, indicating a strong reliance on audition even in severely degraded acoustic

  6. Emotion and Perception: The Role of Affective Information

    PubMed Central

    Zadra, Jonathan R.; Clore, Gerald L.

    2011-01-01

    Visual perception and emotion are traditionally considered separate domains of study. In this article, however, we review research showing them to be less separable than usually assumed. In fact, emotions routinely affect how and what we see. Fear, for example, can affect low-level visual processes, sad moods can alter susceptibility to visual illusions, and goal-directed desires can change the apparent size of goal-relevant objects. In addition, the layout of the physical environment, including the apparent steepness of a hill and the distance to the ground from a balcony, can be affected by emotional states. We propose that emotions provide embodied information about the costs and benefits of anticipated action, information that can be used automatically and immediately, circumventing the need for cogitating on the possible consequences of potential actions. Emotions thus provide a strong motivating influence on how the environment is perceived. PMID:22039565

  7. Visual Arts Teaching in Kindergarten through 3rd-Grade Classrooms in the UAE: Teacher Profiles, Perceptions, and Practices

    ERIC Educational Resources Information Center

    Buldu, Mehmet; Shaban, Mohamed S.

    2010-01-01

    This study portrayed a picture of kindergarten through 3rd-grade teachers who teach visual arts, their perceptions of the value of visual arts, their visual arts teaching practices, visual arts experiences provided to young learners in school, and major factors and/or influences that affect their teaching of visual arts. The sample for this study…

  8. Auditory motion affects visual biological motion processing.

    PubMed

    Brooks, A; van der Zwan, R; Billard, A; Petreska, B; Clarke, S; Blanke, O

    2007-02-01

    The processing of biological motion is a critical, everyday task performed with remarkable efficiency by human sensory systems. Interest in this ability has focused to a large extent on biological motion processing in the visual modality (see, for example, Cutting, J. E., Moore, C., & Morrison, R. (1988). Masking the motions of human gait. Perception and Psychophysics, 44(4), 339-347). In naturalistic settings, however, it is often the case that biological motion is defined by input to more than one sensory modality. For this reason, here in a series of experiments we investigate behavioural correlates of multisensory, in particular audiovisual, integration in the processing of biological motion cues. More specifically, using a new psychophysical paradigm we investigate the effect of suprathreshold auditory motion on perceptions of visually defined biological motion. Unlike data from previous studies investigating audiovisual integration in linear motion processing [Meyer, G. F. & Wuerger, S. M. (2001). Cross-modal integration of auditory and visual motion signals. Neuroreport, 12(11), 2557-2560; Wuerger, S. M., Hofbauer, M., & Meyer, G. F. (2003). The integration of auditory and motion signals at threshold. Perception and Psychophysics, 65(8), 1188-1196; Alais, D. & Burr, D. (2004). No direction-specific bimodal facilitation for audiovisual motion detection. Cognitive Brain Research, 19, 185-194], we report the existence of direction-selective effects: relative to control (stationary) auditory conditions, auditory motion in the same direction as the visually defined biological motion target increased its detectability, whereas auditory motion in the opposite direction had the inverse effect. Our data suggest these effects do not arise through general shifts in visuo-spatial attention, but instead are a consequence of motion-sensitive, direction-tuned integration mechanisms that are, if not unique to biological visual motion, at least not common to all types of

  9. Contribution of a visual pigment absorption spectrum to a visual function: depth perception in a jumping spider

    PubMed Central

    Nagata, Takashi; Arikawa, Kentaro; Terakita, Akihisa

    2013-01-01

    Absorption spectra of visual pigments are adaptively tuned to optimize informational capacity in most visual systems. Our recent investigation of the eyes of the jumping spider reveals an apparent exception: the absorption characteristics of a visual pigment cause defocusing of the image, reducing visual acuity generally in a part of the retina. However, the amount of defocus can theoretically provide a quantitative indication of the distance of an object. Therefore, we proposed a novel mechanism for depth perception in jumping spiders based on image defocus. Behavioral experiments revealed that the depth perception of the spider depended on the wavelength of the ambient light, which affects the amount of defocus because of chromatic aberration of the lens. This wavelength effect on depth perception was in close agreement with theoretical predictions based on our hypothesis. These data strongly support the hypothesis that the depth perception mechanism of jumping spiders is based on image defocus.

  10. A Bayesian model for visual space perception

    NASA Technical Reports Server (NTRS)

    Curry, R. E.

    1972-01-01

    A model for visual space perception is proposed that contains desirable features in the theories of Gibson and Brunswik. This model is a Bayesian processor of proximal stimuli which contains three important elements: an internal model of the Markov process describing the knowledge of the distal world, the a priori distribution of the state of the Markov process, and an internal model relating state to proximal stimuli. The universality of the model is discussed and it is compared with signal detection theory models. Experimental results of Kinchla are used as a special case.
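
    A minimal sketch, assuming a discrete state space, of the three elements the abstract lists: an internal Markov model of the distal world, an a priori state distribution, and an observation model relating states to proximal stimuli. This is not Curry's 1972 formulation; the state names, matrices, and values below are illustrative assumptions.

        import numpy as np

        def bayes_filter_step(belief, transition, likelihood):
            """One predict/update cycle of a discrete Bayesian estimator of the distal world."""
            predicted = belief @ transition      # internal Markov model (prediction)
            posterior = predicted * likelihood   # observation model applied to the proximal stimulus
            return posterior / posterior.sum()   # normalize to a probability distribution

        # Illustrative use with two hypothetical distal states ("near", "far"):
        belief = np.array([0.5, 0.5])                      # a priori distribution
        transition = np.array([[0.9, 0.1], [0.2, 0.8]])    # assumed state dynamics
        likelihood = np.array([0.7, 0.2])                  # P(observed stimulus | state)
        print(bayes_filter_step(belief, transition, likelihood))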

  11. [Synchronized, oscillatory brain activity in visual perception].

    PubMed

    Braunitzer, Gábor

    2008-09-30

    The present study investigates one of the most promising developments of the brain-mind question, namely the possible links between synchronized oscillatory brain activity and certain (visual) perceptual processes. Through a review of the relevant literature, the author introduces the reader to the most important theories of coherent perception ('binding'), and makes an attempt to show how synchronization of EEG-registrable oscillatory activities from various frequency bands might explain binding. Finally, a number of clinical problems are also mentioned, regarding which the presented theoretical framework might deserve further consideration. PMID:18841649

  12. The role of human ventral visual cortex in motion perception

    PubMed Central

    Saygin, Ayse P.; Lorenzi, Lauren J.; Egan, Ryan; Rees, Geraint; Behrmann, Marlene

    2013-01-01

    Visual motion perception is fundamental to many aspects of visual perception. Visual motion perception has long been associated with the dorsal (parietal) pathway and the involvement of the ventral ‘form’ (temporal) visual pathway has not been considered critical for normal motion perception. Here, we evaluated this view by examining whether circumscribed damage to ventral visual cortex impaired motion perception. The perception of motion in basic, non-form tasks (motion coherence and motion detection) and complex structure-from-motion, for a wide range of motion speeds, all centrally displayed, was assessed in five patients with a circumscribed lesion to either the right or left ventral visual pathway. Patients with a right, but not with a left, ventral visual lesion displayed widespread impairments in central motion perception even for non-form motion, for both slow and for fast speeds, and this held true independent of the integrity of areas MT/V5, V3A or parietal regions. In contrast with the traditional view in which only the dorsal visual stream is critical for motion perception, these novel findings implicate a more distributed circuit in which the integrity of the right ventral visual pathway is also necessary even for the perception of non-form motion. PMID:23983030

  13. Contrast affects flicker and speed perception differently

    NASA Technical Reports Server (NTRS)

    Thompson, P.; Stone, L. S.

    1997-01-01

    We have previously shown that contrast affects speed perception, with lower-contrast, drifting gratings perceived as moving slower. In a recent study, we examined the implications of this result on models of speed perception that use the amplitude of the response of linear spatio-temporal filters to determine speed. In this study, we investigate whether the contrast dependence of speed can be understood within the context of models in which speed estimation is made using the temporal frequency of the response of linear spatio-temporal filters. We measured the effect of contrast on flicker perception and found that contrast manipulations produce opposite effects on perceived drift rate and perceived flicker rate, i.e., reducing contrast increases the apparent temporal frequency of counterphase modulated gratings. This finding argues that, if a temporal frequency-based algorithm underlies speed perception, either flicker and speed perception must not be based on the output of the same mechanism or contrast effects on perceived spatial frequency reconcile the disparate effects observed for perceived temporal frequency and speed.

  14. Effective Connectivity within Human Primary Visual Cortex Predicts Interindividual Diversity in Illusory Perception

    PubMed Central

    Schwarzkopf, D. Samuel; Lutti, Antoine; Li, Baojuan; Kanai, Ryota; Rees, Geraint

    2013-01-01

    Visual perception depends strongly on spatial context. A classic example is the tilt illusion where the perceived orientation of a central stimulus differs from its physical orientation when surrounded by tilted spatial contexts. Here we show that such contextual modulation of orientation perception exhibits trait-like interindividual diversity that correlates with interindividual differences in effective connectivity within human primary visual cortex. We found that the degree to which spatial contexts induced illusory orientation perception, namely, the magnitude of the tilt illusion, varied across healthy human adults in a trait-like fashion independent of stimulus size or contrast. Parallel to contextual modulation of orientation perception, the presence of spatial contexts affected effective connectivity within human primary visual cortex between peripheral and foveal representations that responded to spatial context and central stimulus, respectively. Importantly, this effective connectivity from peripheral to foveal primary visual cortex correlated with interindividual differences in the magnitude of the tilt illusion. Moreover, this correlation with illusion perception was observed for effective connectivity under tilted contextual stimulation but not for that under iso-oriented contextual stimulation, suggesting that it reflected the impact of orientation-dependent intra-areal connections. Our findings revealed an interindividual correlation between intra-areal connectivity within primary visual cortex and contextual influence on orientation perception. This neurophysiological-perceptual link provides empirical evidence for theoretical proposals that intra-areal connections in early visual cortices are involved in contextual modulation of visual perception. PMID:24285885

  15. Visual Perception of Force: Comment on White (2012)

    ERIC Educational Resources Information Center

    Hubbard, Timothy L.

    2012-01-01

    White (2012) proposed that kinematic features in a visual percept are matched to stored representations containing information regarding forces (based on prior haptic experience) and that information in the matched, stored representations regarding forces is then incorporated into visual perception. Although some elements of White's (2012) account…

  16. Refractive Errors Affect the Vividness of Visual Mental Images

    PubMed Central

    Palermo, Liana; Nori, Raffaella; Piccardi, Laura; Zeri, Fabrizio; Babino, Antonio; Giusberti, Fiorella; Guariglia, Cecilia

    2013-01-01

    The hypothesis that visual perception and mental imagery are equivalent has never been explored in individuals with vision defects not preventing the visual perception of the world, such as refractive errors. Refractive error (i.e., myopia, hyperopia or astigmatism) is a condition where the refracting system of the eye fails to focus objects sharply on the retina. As a consequence refractive errors cause blurred vision. We subdivided 84 individuals according to their spherical equivalent refraction into Emmetropes (control individuals without refractive errors) and Ametropes (individuals with refractive errors). Participants performed a vividness task and completed a questionnaire that explored their cognitive style of thinking before their vision was checked by an ophthalmologist. Although results showed that Ametropes had less vivid mental images than Emmetropes this did not affect the development of their cognitive style of thinking; in fact, Ametropes were able to use both verbal and visual strategies to acquire and retrieve information. Present data are consistent with the hypothesis of equivalence between imagery and perception. PMID:23755186

  17. Hand Positions Alter Bistable Visual Motion Perception

    PubMed Central

    Saito, Godai; Gyoba, Jiro

    2016-01-01

    We found that a hand posture with the palms together located just below the stream/bounce display could increase the proportion of bouncing perception. This effect, called the hands-induced bounce (HIB) effect, did not occur in the hands-cross condition or in the one-hand condition. By using rubber hands or covering the participants’ hands with a cloth, we demonstrated that the visual information of the hand shapes was not a critical factor in producing the HIB effect, whereas proprioceptive information seemed to be important. We also found that the HIB effect did not occur when the participants’ hands were far from the coincidence point, suggesting that the HIB effect might be produced within a limited spatial area around the hands. PMID:27433332

  19. See what I hear? Beat perception in auditory and visual rhythms.

    PubMed

    Grahn, Jessica A

    2012-07-01

    Our perception of time is affected by the modality in which it is conveyed. Moreover, certain temporal phenomena appear to exist in only one modality. The perception of temporal regularity or structure (e.g., the 'beat') in rhythmic patterns is one such phenomenon: visual beat perception is rare. The modality-specificity for beat perception is puzzling, as the durations that comprise rhythmic patterns are much longer than the limits of visual temporal resolution. Moreover, the optimization that beat perception provides for memory of auditory sequences should be equally relevant to visual sequences. Why does beat perception appear to be modality specific? One possibility is that the nature of the visual stimulus plays a role. Previous studies have usually used brief stimuli (e.g., light flashes) to present visual rhythms. In the current study, a rotating line that appeared sequentially in different spatial orientations was used to present a visual rhythm. Discrimination accuracy for visual rhythms and auditory rhythms was compared for different types of rhythms. The rhythms either had a regular temporal structure that previously has been shown to induce beat perception in the auditory modality, or they had an irregular temporal structure without beat-inducing qualities. Overall, the visual rhythms were discriminated more poorly than the auditory rhythms. The beat-based structure, however, increased accuracy for visual as well as auditory rhythms. These results indicate that beat perception can occur in the visual modality and improve performance on a temporal discrimination task, when certain types of stimuli are used. PMID:22623092

  20. Visual Perception Associated With Diabetes Mellitus

    NASA Astrophysics Data System (ADS)

    Suaste, Ernesto

    2004-09-01

    We designed and implemented an instrumental methodology for analyzing the pupillary response to chromatic stimuli, in order to observe the changes in pupillary area during contraction and dilation in diabetic patients. Visual stimuli were used in the visible spectrum (400 nm-650 nm). Three different programs were used to determine the best stimulation for obtaining a well-contrasted pupillary response for diagnosis of the visual perception of colors. The stimulators PG0, PG12 and PG20 were designed in our laboratory. The test was carried out with 44 people, 33 men, 10 women and a boy (22-52 and 6 years); 12 were tested with the stimulator PG0, 21 with PG12 and 17 with PG20, and 7 subjects participated in more than one test. According to the Ishihara plates, 40 of those subjects have normal color vision, one subject has dichromacy (inability to differentiate or perceive red and green), and three present deficiencies in perceiving the blue and red spectrum (they have type II diabetes mellitus). With this instrumental methodology, we aim to obtain an indicator in the pupillary variability for the early diagnosis of diabetes mellitus, as well as a monitoring instrument for it.

  1. Volumic visual perception: principally novel concept

    NASA Astrophysics Data System (ADS)

    Petrov, Valery

    1996-01-01

    The general concept of volumic view (VV) as a universal property of space is introduced. VV exists at every point of the universe that electromagnetic (EM) waves can reach and where a point or quasi-point receiver (detector) of EM waves can be placed. A classification of receivers is given for the first time. They are classified into three main categories: biological, man-made non-biological, and mathematically specified hypothetical receivers. The principally novel concept of volumic perception is introduced. It differs chiefly from the traditional concept, which traces back to Euclid and pre-Euclidean times, and much later to the discoveries of Leonardo da Vinci and Giovanni Battista della Porta and to practical stereoscopy as introduced by C. Wheatstone. The basic idea of the novel concept is that humans and animals acquire volumic visual data flows in series rather than in parallel. In this case the brain is free from extremely sophisticated real-time parallel processing of two volumic visual data flows in order to combine them. Such a procedure seems hardly probable even for humans, who are unable to combine two primitive static stereoscopic images into one more quickly than in a few seconds. Some people are unable to perform this procedure at all.

  2. Word selection affects perceptions of synthetic biology

    PubMed Central

    2011-01-01

    Members of the synthetic biology community have discussed the significance of word selection when describing synthetic biology to the general public. In particular, many leaders proposed that the word "create" was laden with negative connotations. We found that word choice and framing do affect public perception of synthetic biology. In a controlled experiment, participants perceived synthetic biology more negatively when "create" was used to describe the field compared to "construct" (p = 0.008). Contrary to popular opinion among synthetic biologists, however, low-religiosity individuals were more negatively influenced by the framing manipulation than high-religiosity people. Our results suggest that synthetic biologists directly influence public perception of their field through avoidance of the word "create". PMID:21777466

  3. Short-term visual deprivation improves the perception of harmonicity.

    PubMed

    Landry, Simon P; Shiller, Douglas M; Champoux, François

    2013-12-01

    Neuroimaging studies have shown that the perception of auditory stimuli involves occipital cortical regions traditionally associated with visual processing, even in the absence of any overt visual component to the task. Analogous behavioral evidence of an interaction between visual and auditory processing during purely auditory tasks comes from studies of short-term visual deprivation on the perception of auditory cues; however, the results of such studies remain equivocal. Although some data have suggested that visual deprivation significantly increases loudness and pitch discrimination and reduces spatial localization inaccuracies, it is still unclear whether such improvement extends to the perception of spectrally complex cues, such as those involved in speech and music perception. We present data demonstrating that a 90-min period of visual deprivation causes a transient improvement in the perception of harmonicity: a spectrally complex cue that plays a key role in music and speech perception. The results provide clear behavioral evidence supporting a role for the visual system in the processing of complex auditory stimuli, even in the absence of any visual component to the task. PMID:23957309

  4. Do Hearing Aids Improve Affect Perception?

    PubMed

    Schmidt, Juliane; Herzog, Diana; Scharenborg, Odette; Janse, Esther

    2016-01-01

    Normal-hearing listeners use acoustic cues in speech to interpret a speaker's emotional state. This study investigates the effect of hearing aids on the perception of the emotion dimensions arousal (aroused/calm) and valence (positive/negative attitude) in older adults with hearing loss. More specifically, we investigate whether wearing a hearing aid improves the correlation between affect ratings and affect-related acoustic parameters. To that end, affect ratings by 23 hearing-aid users were compared for aided and unaided listening. Moreover, these ratings were compared to the ratings by an age-matched group of 22 participants with age-normal hearing. For arousal, hearing-aid users rated utterances as generally more aroused in the aided than in the unaided condition. Intensity differences were the strongest indicator of degree of arousal. Among the hearing-aid users, those with poorer hearing used additional prosodic cues (i.e., tempo and pitch) for their arousal ratings, compared to those with relatively good hearing. For valence, pitch was the only acoustic cue that was associated with valence. Neither listening condition nor hearing loss severity (differences among the hearing-aid users) influenced affect ratings or the use of affect-related acoustic parameters. Compared to the normal-hearing reference group, ratings of hearing-aid users in the aided condition did not generally differ in either emotion dimension. However, hearing-aid users were more sensitive to intensity differences in their arousal ratings than the normal-hearing participants. We conclude that the use of hearing aids is important for the rehabilitation of affect perception and particularly influences the interpretation of arousal. PMID:27080645

  5. Affective and motivational influences in person perception.

    PubMed

    Kuzmanovic, Bojana; Jefferson, Anneli; Bente, Gary; Vogeley, Kai

    2013-01-01

    Interpersonal impression formation is highly consequential for social interactions in private and public domains. These perceptions of others rely on different sources of information and processing mechanisms, all of which have been investigated in independent research fields. In social psychology, inferences about states and traits of others as well as activations of semantic categories and corresponding stereotypes have attracted great interest. On the other hand, research on emotion and reward demonstrated affective and motivational influences of social cues on the observer, which in turn modulate attention, categorization, evaluation, and decision processes. While inferential and categorical social processes have been shown to recruit a network of cortical brain regions associated with mentalizing and evaluation, the affective influence of social cues has been linked to subcortical areas that play a central role in detection of salient sensory input and reward processing. In order to extend existing integrative approaches to person perception, both the inferential-categorical processing of information about others, and affective and motivational influences of this information on the beholder should be taken into account. PMID:23781188

  6. Analysis of texture characteristics associated with visual complexity perception

    NASA Astrophysics Data System (ADS)

    Guo, Xiaoying; Asano, Chie Muraki; Asano, Akira; Kurita, Takio; Li, Liang

    2012-09-01

    In our previous work we determined that five important characteristics affect the perception of visual complexity of a texture: regularity, roughness, directionality, density, and understandability. In this paper, a set of objective methods for measuring these characteristics is proposed: regularity is estimated by an autocorrelation function; roughness is computed based on local changes; directionality is measured by the maximum line-likeness of edges in different directions; and density is calculated from the edge density. Our analysis shows a significant correlation between the objective measures and subjective evaluations. In addition, for the estimation of understandability, a new approach is proposed. We asked the respondents to name each texture, and then we sorted all these names into different types, including names that were similar. We discovered that understandability is affected by two factors of a texture: the maximum number of similar names assigned to a specific type and the total number of types.
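
    The record names the measures but not the exact formulas; below is a minimal, hypothetical Python sketch of plausible proxies for three of them (roughness from local intensity changes, density from edge density, and a crude regularity score from the image autocorrelation). Thresholds and function names are illustrative assumptions, not the authors' method.

        import numpy as np

        def texture_measures(img):
            """Rough, illustrative proxies for roughness, edge density, and regularity."""
            img = np.asarray(img, dtype=float)
            gy, gx = np.gradient(img)
            grad = np.hypot(gx, gy)

            # Roughness: mean magnitude of local intensity changes.
            roughness = grad.mean()

            # Density: fraction of pixels whose gradient exceeds a simple threshold.
            density = (grad > grad.mean() + grad.std()).mean()

            # Regularity: strength of the largest non-zero-lag autocorrelation value,
            # computed via FFT; periodic textures yield strong secondary peaks.
            f = np.fft.fft2(img - img.mean())
            acorr = np.real(np.fft.ifft2(f * np.conj(f)))
            acorr /= acorr.flat[0]                      # normalize so zero lag = 1
            regularity = np.sort(acorr.ravel())[-2]     # crude proxy for periodicity
            return {"roughness": roughness, "density": density, "regularity": regularity}

    Directionality and understandability, as described in the abstract, would require edge-orientation analysis and human naming data, respectively, and are omitted from this sketch.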

  7. Behavioral model of visual perception and recognition

    NASA Astrophysics Data System (ADS)

    Rybak, Ilya A.; Golovan, Alexander V.; Gusakova, Valentina I.

    1993-09-01

    In the processes of visual perception and recognition, human eyes actively select essential information by way of successive fixations at the most informative points of the image. A behavioral program defining a scanpath of the image is formed at the stage of learning (object memorizing) and consists of sequential motor actions, which are shifts of attention from one point of fixation to another, and sensory signals expected to arrive in response to each shift of attention. In the modern view of the problem, invariant object recognition is provided by the following: (1) separated processing of 'what' (object features) and 'where' (spatial features) information at high levels of the visual system; (2) mechanisms of visual attention using 'where' information; (3) representation of 'what' information in an object-based frame of reference (OFR). However, most recent models of vision based on OFR have demonstrated the ability of invariant recognition of only simple objects like letters or binary objects without background, i.e., objects to which a frame of reference is easily attached. In contrast, we use not an OFR but a feature-based frame of reference (FFR), connected with the basic feature (edge) at the fixation point. This has provided our model with the ability to represent complex objects in gray-level images invariantly, but it demands realization of the behavioral aspects of vision described above. The developed model contains a neural network subsystem of low-level vision, which extracts a set of primary features (edges) in each fixation, and a high-level subsystem consisting of 'what' (Sensory Memory) and 'where' (Motor Memory) modules. The resolution of primary feature extraction decreases with distance from the point of fixation. FFR provides both the invariant representation of object features in Sensory Memory and shifts of attention in Motor Memory. Object recognition consists of successive recall (from Motor Memory) and execution of shifts of attention and

  8. Opposite Distortions in Interval Timing Perception for Visual and Auditory Stimuli with Temporal Modulations

    PubMed Central

    Yuasa, Kenichi; Yotsumoto, Yuko

    2015-01-01

    When an object is presented visually and moves or flickers, the perception of its duration tends to be overestimated. Such an overestimation is called time dilation. Perceived time can also be distorted when a stimulus is presented aurally as an auditory flutter, but the mechanisms and their relationship to visual processing remain unclear. In the present study, we measured interval timing perception while modulating the temporal characteristics of visual and auditory stimuli, and investigated whether the interval timing of visually and aurally presented objects shares a common mechanism. In these experiments, participants compared the durations of flickering or fluttering stimuli to standard stimuli, which were presented continuously. Perceived durations of auditory flutters were underestimated, while perceived durations of visual flickers were overestimated. When auditory flutters and visual flickers were presented simultaneously, these distortion effects were cancelled out. When auditory flutters were presented with a constantly presented visual stimulus, the interval timing perception of the visual stimulus was affected by the auditory flutters. These results indicate that interval timing perception is governed by independent mechanisms for visual and auditory processing, and that there are some interactions between the two processing systems. PMID:26292285

  9. On Environmental Model-Based Visual Perception for Humanoids

    NASA Astrophysics Data System (ADS)

    Gonzalez-Aguirre, D.; Wieland, S.; Asfour, T.; Dillmann, R.

    In this article an autonomous visual perception framework for humanoids is presented. This model-based framework exploits the available knowledge and the context acquired during global localization in order to overcome the limitations of pure data-driven approaches. The reasoning for perception and the proprioceptive components are the key elements for solving complex visual assertion queries with proficient performance. Experimental evaluation with the humanoid robot ARMAR-IIIa is presented.

  10. Visual-vestibular integration motion perception reporting

    NASA Technical Reports Server (NTRS)

    Harm, Deborah L.; Reschke, Millard R.; Parker, Donald E.

    1999-01-01

    Self-orientation and self/surround-motion perception derive from a multimodal sensory process that integrates information from the eyes, vestibular apparatus, proprioceptive and somatosensory receptors. Results from short and long duration spaceflight investigations indicate that: (1) perceptual and sensorimotor function was disrupted during the initial exposure to microgravity and gradually improved over hours to days (individuals adapt), (2) the presence and/or absence of information from different sensory modalities differentially affected the perception of orientation, self-motion and surround-motion, (3) perceptual and sensorimotor function was initially disrupted upon return to Earth-normal gravity and gradually recovered to preflight levels (individuals readapt), and (4) the longer the exposure to microgravity, the more complete the adaptation, the more profound the postflight disturbances, and the longer the recovery period to preflight levels. While much has been learned about perceptual and sensorimotor reactions and adaptation to microgravity, there is much remaining to be learned about the mechanisms underlying the adaptive changes, and about how intersensory interactions affect perceptual and sensorimotor function during voluntary movements. During space flight, SMS and perceptual disturbances have led to reductions in performance efficiency and sense of well-being. During entry and immediately after landing, such disturbances could have a serious impact on the ability of the commander to land the Orbiter and on the ability of all crew members to egress from the Orbiter, particularly in a non-nominal condition or following extended stays in microgravity. An understanding of spatial orientation and motion perception is essential for developing countermeasures for Space Motion Sickness (SMS) and perceptual disturbances during spaceflight and upon return to Earth. Countermeasures for optimal performance in flight and a successful return to Earth require

  11. A Dynamic Systems Theory Model of Visual Perception Development

    ERIC Educational Resources Information Center

    Coté, Carol A.

    2015-01-01

    This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts to a hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model vision and ocular motor abilities are not foundational to perception, they are seen…

  12. Fast Modulation of Visual Perception by Basal Forebrain Cholinergic Neurons

    PubMed Central

    Estandian, Daniel; Xu, Min; Kwan, Alex C.; Lee, Seung-Hee; Harrison, Thomas C.; Feng, Guoping; Dan, Yang

    2014-01-01

    The basal forebrain provides the primary source of cholinergic input to the cortex, and it plays a crucial role in promoting wakefulness and arousal. However, whether rapid changes in basal forebrain neuron spiking in awake animals can dynamically influence sensory perception is unclear. Here we show that basal forebrain cholinergic neurons rapidly regulate cortical activity and visual perception in awake, behaving mice. Optogenetic activation of the cholinergic neurons or their V1 axon terminals improved performance of a visual discrimination task on a trial-by-trial basis. In V1, basal forebrain activation enhanced visual responses and desynchronized neuronal spiking, which could partly account for the behavioral improvement. Conversely, optogenetic basal forebrain inactivation decreased behavioral performance, synchronized cortical activity and impaired visual responses, indicating the importance of cholinergic activity in normal visual processing. These results underscore the causal role of basal forebrain cholinergic neurons in fast, bidirectional modulation of cortical processing and sensory perception. PMID:24162654

  13. Visual perception and imagery: a new molecular hypothesis.

    PubMed

    Bókkon, I

    2009-05-01

    Here, we put forward a redox molecular hypothesis about the natural biophysical substrate of visual perception and visual imagery. This hypothesis is based on the redox and bioluminescent processes of neuronal cells in retinotopically organized cytochrome oxidase-rich visual areas. Our hypothesis is in line with the functional roles of reactive oxygen and nitrogen species in living cells, which are not part of a haphazard process but rather a very strict mechanism used in signaling pathways. We point out that there is a direct relationship between neuronal activity and the biophoton emission process in the brain. Electrical and biochemical processes in the brain represent sensory information from the external world. During encoding or retrieval of information, electrical signals of neurons can be converted into synchronized biophoton signals by bioluminescent radical and non-radical processes. Therefore, information in the brain appears not only as an electrical (chemical) signal but also as a regulated biophoton (weak optical) signal inside neurons. During visual perception, the topological distribution of photon stimuli on the retina is represented by electrical neuronal activity in retinotopically organized visual areas. These retinotopic electrical signals in visual neurons can be converted into synchronized biophoton signals by radical and non-radical processes in retinotopically organized mitochondria-rich areas. As a result, regulated bioluminescent biophotons can create intrinsic pictures (depictive representation) in retinotopically organized cytochrome oxidase-rich visual areas during visual imagery and visual perception. The long-term visual memory is interpreted as epigenetic information regulated by free radicals and redox processes. This hypothesis does not claim to solve the secret of consciousness, but proposes that the evolution of higher levels of complexity made the intrinsic picture representation of the external visual world possible by regulated

  14. Unintentionality of affective attention across visual processing stages

    PubMed Central

    Uusberg, Andero; Uibo, Helen; Kreegipuu, Kairi; Tamm, Maria; Raidvee, Aire; Allik, Jüri

    2013-01-01

    Affective attention involves bottom-up perceptual selection that prioritizes motivationally significant stimuli. To clarify the extent to which this process is automatic, we investigated the dependence of affective attention on the intention to process emotional meaning. Affective attention was manipulated by presenting affective images with variable arousal and intentionality by requiring participants to make affective and non-affective evaluations. Polytomous rather than binary decisions were required from the participants in order to elicit relatively deep emotional processing. The temporal dynamics of prioritized processing were assessed using early posterior negativity (EPN, 175–300 ms) as well as P3-like (P3, 300–500 ms) and slow wave (SW, 500–1500 ms) portions of the late positive potential. All analyzed components were differentially sensitive to stimulus categories suggesting that they indeed reflect distinct stages of motivational significance encoding. The intention to perceive emotional meaning had no effect on EPN, an additive effect on P3, and an interactive effect on SW. We concluded that affective attention went from completely unintentional during the EPN to partially unintentional during P3 and SW where top-down signals, respectively, complemented and modulated bottom-up differences in stimulus prioritization. The findings were interpreted in light of two-stage models of visual perception by associating the EPN with large-capacity initial relevance detection and the P3 as well as SW with capacity-limited consolidation and elaboration of affective stimuli. PMID:24421772

  15. Flicker-light induced visual phenomena: frequency dependence and specificity of whole percepts and percept features.

    PubMed

    Allefeld, Carsten; Pütz, Peter; Kastner, Kristina; Wackermann, Jiří

    2011-12-01

    Flickering light induces visual hallucinations in human observers. Despite a long history of the phenomenon, little is known about the dependence of flicker-induced subjective impressions on the flicker frequency. We investigate this question using Ganzfeld stimulation and an experimental paradigm combining a continuous frequency scan (1-50 Hz) with a focus on re-occurring, whole percepts. On the single-subject level, we find a high degree of frequency stability of percepts. To generalize across subjects, we apply two rating systems, (1) a set of complex percept classes derived from subjects' reports and (2) an enumeration of elementary percept features, and determine distributions of occurrences over flicker frequency. We observe a stronger frequency specificity for complex percept classes than elementary percept features. Comparing the similarity relations among percept categories to those among frequency profiles, we observe that though percepts are preferentially induced by particular frequencies, the frequency does not unambiguously determine the experienced percept. PMID:21123084

  16. Visual perception of texture in aggressive behavior of Betta splendens.

    PubMed

    Bando, T

    1991-07-01

    In order to elucidate the role of texture in fish vision, the agonistic behavior of male Siamese fighting fish (Betta splendens) was tested in response to models composed by means of image processing techniques. With models bearing the contour shape of a side view of Betta splendens in an aggressive state, the responses were vigorous when there was a fine distribution of brightness and naturalistic color, producing textures like a scale pattern. Reactions became weaker as the brightness and color distribution reverted to more homogeneous levels and the scale pattern disappeared. When artificial models with a circular contour shape were used, models with the scale pattern evoked more aggressive behaviors than those without it, while the presence of spherical gradation affected the behavior only slightly. These results suggest that texture plays an important role in fish visual perception. PMID:1941718

  17. Seeing and Seeing: Visual Perception in Art and Science

    ERIC Educational Resources Information Center

    Campbell, Peter

    2004-01-01

    This article takes a brief walk through two complex cultures, looking at similarities and differences between them. Visual perception is vital to both art and science, for to see is to understand. The article compares how education in each subject fosters visualization and creative thinking.

  18. Experience, Context, and the Visual Perception of Human Movement

    ERIC Educational Resources Information Center

    Jacobs, Alissa; Pinto, Jeannine; Shiffrar, Maggie

    2004-01-01

    Why are human observers particularly sensitive to human movement? Seven experiments examined the roles of visual experience and motor processes in human movement perception by comparing visual sensitivities to point-light displays of familiar, unusual, and impossible gaits across gait-speed and identity discrimination tasks. In both tasks, visual…

  19. Visual Angle as Determinant Factor for Relative Distance Perception

    ERIC Educational Resources Information Center

    Matsushima, Elton H.; de Oliveira, Artur P.; Ribeiro-Filho, Nilton P.; Da Silva, Jose A.

    2005-01-01

    Visual angles are defined as the angle between the line of sight to the mean point of a relative distance and the relative distance itself. In one experiment, we examined the functional aspect of visual angle in relative distance perception using two different layouts composed of 14 stakes, one of them with its center 23 m away from the observation…

  20. General Markers of Conscious Visual Perception and Their Timing

    PubMed Central

    Rutiku, Renate; Aru, Jaan; Bachmann, Talis

    2016-01-01

    Previous studies have observed different onset times for the neural markers of conscious perception. This variability could be attributed to procedural differences between studies. Here we show that the onset times for the markers of conscious visual perception can strongly vary even within a single study. A heterogeneous stimulus set was presented at threshold contrast. Trials with and without conscious perception were contrasted on 100 balanced subsets of the data. Importantly, the 100 subsets with heterogeneous stimuli did not differ in stimulus content, but only with regard to specific trials used. This approach enabled us to study general markers of conscious visual perception independent of stimulus content, characterize their onset and its variability within one study. N200 and P300 were the two reliable markers of conscious visual perception common to all perceived stimuli and absent for all non-perceived stimuli. The estimated mean onset latency for both markers was shortly after 200 ms. However, the onset latency of these markers was associated with considerable variability depending on which subsets of the data were considered. We show that it is first and foremost the amplitude fluctuation in the condition without conscious perception that explains the observed variability in onset latencies of the markers of conscious visual perception. PMID:26869905

  1. [Perception of physiological visual illusions by individuals with schizophrenia].

    PubMed

    Ciszewski, Słowomir; Wichowicz, Hubert Michał; Żuk, Krzysztof

    2015-01-01

    Visual perception by individuals with schizophrenia has not been extensively researched. The focus of this review is the perception of physiological visual illusions by patients with schizophrenia, and the differences in perception reported in a small number of studies. The increased or decreased susceptibility of these patients to various illusions seems to be unconnected to the location of origin in the visual apparatus, which is also the case for illusions connected to other modalities. The susceptibility of patients with schizophrenia to haptic illusions has not yet been investigated, although the need for such investigation is clear. The emerging picture is that some individuals with schizophrenia are "resistant" to some of the illusions and are able to assess visual phenomena more "rationally", yet certain illusions (e.g., Müller-Lyer's) are perceived more intensely. Disturbances in the perception of visual illusions have neither been classified as possible diagnostic indicators of a dangerous mental condition, nor included in the endophenotype of schizophrenia. Although the relevant data are sparse, the ability to replicate the results is limited, and the research model lacks a "gold standard", some preliminary conclusions may be drawn. There are indications that disturbances in visual perception are connected to the extent of disorganization, poor initial social functioning, poor prognosis, and the types of schizophrenia described as neurodevelopmental. Patients with schizophrenia usually fail to perceive those illusions that require volitional controlled attention, and show a lack of sensitivity to the contrast between shape and background. PMID:26093596

  2. How do musical tonality and experience affect visual working memory?

    PubMed

    Yang, Hua; Lu, Jing; Gong, Diankun; Yao, Dezhong

    2016-01-20

    The influence of music on the human brain has continued to attract increasing attention from neuroscientists and musicologists. Currently, tonal music is widely present in people's daily lives; however, atonal music has gradually become an important part of modern music. In this study, we conducted two experiments: the first tested for differences in the perceived distractibility of tonal music and atonal music. The second experiment tested how tonal music and atonal music affect visual working memory by comparing musicians and nonmusicians who were placed in contexts with background tonal music, atonal music, and silence. They were instructed to complete a delayed matching memory task. The results show that musicians and nonmusicians evaluate the distractibility of tonal and atonal music differently, possibly indicating that long-term training may lead to a higher auditory perception threshold among musicians. For the working memory task, musicians reacted faster than nonmusicians in all background music cases, and musicians took more time to respond in the tonal background music condition than in the other conditions. Therefore, our results suggest that for a visual memory task, background tonal music may occupy more cognitive resources than atonal music or silence for musicians, leaving few resources for the memory task. Moreover, the musicians outperformed the nonmusicians because of their higher sensitivity to background music, a finding that needs to be confirmed in a further longitudinal study. PMID:26619232

  3. Human perception of visual air quality (uniform haze)

    NASA Astrophysics Data System (ADS)

    Malm, William; Kelley, Karen; Molenar, John; Daniel, Terry

    The National Park Service and the U.S. Environmental Protection Agency are cooperatively conducting ongoing studies of human perception of visual air quality. Major objectives of this program include: (1) determination of the relationship between judgments of visual air quality of actual three-dimensional scenes and surrogate slide representations of those scenes, (2) examination of the effect of sun angle and meteorological conditions on perceived visual air quality, (3) examination of the effect of demographic background on observers' judgments of visual air quality, (4) establishment of a functional relationship between human perception of visual air quality and various electro-optical parameters for several different scenic vistas, and (5) development of a model capable of predicting the sensitivity of a park to visual air pollution impact. Preliminary results of a previous study involving one vista revealed a linear relationship between human perception and apparent vista contrast for constant vista illumination and ground cover. A more general formalism for averaging vista color contrast appeared to account for the effects that snow cover and varying illumination have on the sensitivity of perceived visual air quality to air pollution. These functional relationships are re-examined using a number of southwestern vistas. A first-order model capable of predicting perceived visual air quality as a function of change in air pollution is developed. In addition, the relationship between perceived visual air quality of actual three-dimensional scenes and pictorial surrogates is examined.
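
    The abstract describes, but does not give, the functional form; the following is a minimal, hypothetical Python sketch of the kind of first-order relationship implied: apparent vista contrast attenuates with atmospheric extinction along the sight path (a Koschmieder-type relation), and a perceived visual air quality (VAQ) rating is taken to vary linearly with apparent contrast. The extinction values, distance, slope, and intercept are illustrative assumptions, not values from the study.

        import numpy as np

        def apparent_contrast(inherent_contrast, extinction_per_km, distance_km):
            """Koschmieder-type attenuation of vista contrast along the sight path."""
            return inherent_contrast * np.exp(-extinction_per_km * distance_km)

        def perceived_vaq(contrast, slope=10.0, intercept=0.0):
            """Hypothetical linear mapping from apparent contrast to a VAQ rating."""
            return intercept + slope * contrast

        # Illustrative: the same vista 40 km away under cleaner vs. hazier conditions.
        for b_ext in (0.02, 0.10):   # assumed extinction coefficients in 1/km
            print(b_ext, perceived_vaq(apparent_contrast(0.8, b_ext, 40.0)))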

  4. The Affective Bases of Risk Perception: Negative Feelings and Stress Mediate the Relationship between Mental Imagery and Risk Perception

    PubMed Central

    Sobkow, Agata; Traczyk, Jakub; Zaleskiewicz, Tomasz

    2016-01-01

    Recent research has documented that affect plays a crucial role in risk perception. When no information about numerical risk estimates is available (e.g., probability of loss or magnitude of consequences), people may rely on positive and negative affect toward perceived risk. However, determinants of affective reactions to risks are poorly understood. In a series of three experiments, we addressed the question of whether and to what degree mental imagery eliciting negative affect and stress influences risk perception. In each experiment, participants were instructed to visualize consequences of risk taking and to rate riskiness. In Experiment 1, participants who imagined negative risk consequences reported more negative affect and perceived risk as higher compared to the control condition. In Experiment 2, we found that this effect was driven by affect elicited by mental imagery rather than its vividness and intensity. In this study, imagining positive risk consequences led to lower perceived risk than visualizing negative risk consequences. Finally, we tested the hypothesis that negative affect related to higher perceived risk was caused by negative feelings of stress. In Experiment 3, we introduced risk-irrelevant stress to show that participants in the stress condition rated perceived risk as higher in comparison to the control condition. This experiment showed that higher ratings of perceived risk were influenced by psychological stress. Taken together, our results demonstrate that affect-laden mental imagery dramatically changes risk perception through negative affect (i.e., psychological stress). PMID:27445901

  6. High visual resolution matters in audiovisual speech perception, but only for some.

    PubMed

    Alsius, Agnès; Wayne, Rachel V; Paré, Martin; Munhall, Kevin G

    2016-07-01

    The basis for individual differences in the degree to which visual speech input enhances comprehension of acoustically degraded speech is largely unknown. Previous research indicates that fine facial detail is not critical for visual enhancement when auditory information is available; however, these studies did not examine individual differences in ability to make use of fine facial detail in relation to audiovisual speech perception ability. Here, we compare participants based on their ability to benefit from visual speech information in the presence of an auditory signal degraded with noise, modulating the resolution of the visual signal through low-pass spatial frequency filtering and monitoring gaze behavior. Participants who benefited most from the addition of visual information (high visual gain) were more adversely affected by the removal of high spatial frequency information, compared to participants with low visual gain, for materials with both poor and rich contextual cues (i.e., words and sentences, respectively). Differences as a function of gaze behavior between participants with the highest and lowest visual gains were observed only for words, with participants with the highest visual gain fixating longer on the mouth region. Our results indicate that the individual variance in audiovisual speech in noise performance can be accounted for, in part, by better use of fine facial detail information extracted from the visual signal and increased fixation on mouth regions for short stimuli. Thus, for some, audiovisual speech perception may suffer when the visual input (in addition to the auditory signal) is less than perfect. PMID:27150616
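
    The visual-resolution manipulation described above (low-pass spatial frequency filtering of the talker's face) can be approximated with a plain Gaussian blur, as in the sketch below; the image and blur strength are arbitrary stand-ins, not the filtering pipeline or cutoffs used in the study.

      # Generic low-pass spatial-frequency filtering of an image via Gaussian blur.
      import numpy as np
      from scipy import ndimage

      def low_pass(image: np.ndarray, sigma_pixels: float) -> np.ndarray:
          """Attenuate high spatial frequencies; larger sigma removes finer facial detail."""
          return ndimage.gaussian_filter(image.astype(float), sigma=sigma_pixels)

      frame = np.random.rand(240, 320)            # stand-in for one video frame of the talker
      blurred = low_pass(frame, sigma_pixels=4.0) # stronger blur = lower effective resolution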

  7. Visual Cues for Enhancing Depth Perception.

    ERIC Educational Resources Information Center

    O'Donnell, L. M.; Smith, A. J.

    1994-01-01

    This article describes the physiological mechanisms involved in three-dimensional depth perception and presents a variety of distance and depth cues and strategies for detecting and estimating curbs and steps for individuals with impaired vision. (Author/DB)

  8. Analysis of visual acuity and motion resolvability as measures for optimal visual perception of the workspace.

    PubMed

    Janabi-Sharifi, Farrokh; Vakanski, Aleksandar

    2011-03-01

    For working tasks with high visual demand, ergonomic design of workstations requires criteria for comparative evaluation and analysis of visual perceptibility in different regions of the workspace. This paper provides kinematic models of visual acuity and motion resolvability as adopted measures of visual perceptibility of the workspace. The proposed models were examined through two sets of experiments. The first experiment was designed to compare the models' outputs with empirical measurements; time measurements of the participants' responses to visual events were used to calculate the perceptibility measures. The overall comparison shows similar patterns and moderate statistical errors between the measured and kinematically modelled values of the parameters. In the second experiment, the proposed set of visual perceptibility measures was examined for a simulated industrial task of inserting electronic chips into slots of a working table, resembling a fine assembly line in transponder manufacturing. The results of ANOVA tests for visual acuity and motion resolvability justify, in terms of the visual perceptibility measures, the postures adopted by the participants when completing the insertion tasks. PMID:20947063
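
    The intuition behind such perceptibility measures can be made concrete with the standard visual-angle criterion: a feature is resolvable only if the angle it subtends at the eye exceeds the observer's acuity limit. The sketch below is a generic illustration of that criterion, not the kinematic model proposed in the paper.

      # Generic sketch: is a feature of a given size resolvable from a given distance?
      import math

      def visual_angle_arcmin(feature_size_m: float, viewing_distance_m: float) -> float:
          """Angle subtended by a feature at the eye, in arcminutes."""
          angle_rad = 2.0 * math.atan(feature_size_m / (2.0 * viewing_distance_m))
          return math.degrees(angle_rad) * 60.0

      def is_resolvable(feature_size_m, viewing_distance_m, acuity_limit_arcmin=1.0):
          """1 arcmin corresponds roughly to 20/20 (6/6) acuity."""
          return visual_angle_arcmin(feature_size_m, viewing_distance_m) >= acuity_limit_arcmin

      # Example: a 0.5 mm chip lead viewed from 40 cm subtends about 4.3 arcmin.
      print(is_resolvable(0.0005, 0.40))  # True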

  9. Visual size perception and haptic calibration during development.

    PubMed

    Gori, Monica; Giuliana, Luana; Sandini, Giulio; Burr, David

    2012-11-01

    It is still unclear how the visual system accurately perceives the size of objects at different distances. One suggestion, dating back to Berkeley's famous essay, is that vision is calibrated by touch. If so, we may expect different mechanisms to be involved for near, reachable distances and far, unreachable distances. To study how the haptic system calibrates vision we measured size constancy in children (from 6 to 16 years of age) and adults, at various distances. At all ages, the accuracy of visual size perception changes with distance, and is almost veridical inside the haptic workspace, in agreement with the idea that the haptic system acts to calibrate visual size perception. Outside this space, systematic errors occurred, which varied with age. Adults tended to overestimate the visual size of distant objects (over-compensation for distance), while children younger than 14 underestimated their size (under-compensation). At 16 years of age there seemed to be a transition point, with veridical perception of distant objects. When young subjects were allowed to touch the object inside the haptic workspace, the visual biases disappeared, while older subjects showed multisensory integration. All results are consistent with the idea that the haptic system can be used to calibrate visual size perception during development, more effectively within than outside the haptic workspace, and that the calibration mechanisms are different in children than in adults. PMID:23106739

  10. Relating Standardized Visual Perception Measures to Simulator Visual System Performance

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Sweet, Barbara T.

    2013-01-01

    Human vision is quantified through the use of standardized clinical vision measurements. These measurements typically include visual acuity (near and far), contrast sensitivity, color vision, stereopsis (a.k.a. stereo acuity), and visual field periphery. Simulator visual system performance is specified in terms such as brightness, contrast, color depth, color gamut, gamma, resolution, and field-of-view. How do these simulator performance characteristics relate to the perceptual experience of the pilot in the simulator? In this paper, visual acuity and contrast sensitivity will be related to simulator visual system resolution, contrast, and dynamic range; similarly, color vision will be related to color depth/color gamut. Finally, we will consider how some characteristics of human vision not typically included in current clinical assessments could be used to better inform simulator requirements (e.g., relating dynamic characteristics of human vision to update rate and other temporal display characteristics).
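
    One concrete way to relate clinical acuity to simulator resolution is pixels per degree: 20/20 acuity corresponds to resolving roughly one arcminute, i.e. about 60 pixels per degree of visual field. The sketch below compares a hypothetical display against that rule of thumb; the numbers are illustrative and are not requirements taken from the paper.

      # Compare a display's average angular resolution with the acuity-limited value.
      def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
          """Average pixels per degree across the display's horizontal field of view."""
          return horizontal_pixels / horizontal_fov_deg

      def acuity_limited_ppd(acuity_arcmin: float = 1.0) -> float:
          """20/20 vision resolves ~1 arcmin, i.e. roughly 60 pixels per degree."""
          return 60.0 / acuity_arcmin

      display_ppd = pixels_per_degree(horizontal_pixels=4096, horizontal_fov_deg=120.0)
      print(display_ppd, acuity_limited_ppd())  # ~34 ppd available vs ~60 ppd for 20/20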

  11. Aging affects postural tracking of complex visual motion cues.

    PubMed

    Sotirakis, H; Kyvelidou, A; Mademli, L; Stergiou, N; Hatzitaki, V

    2016-09-01

    Postural tracking of visual motion cues improves perception-action coupling in aging, yet the nature of the visual cues to be tracked is critical for the efficacy of such a paradigm. We investigated how well healthy older (72.45 ± 4.72 years) and young (22.98 ± 2.9 years) adults can follow with their gaze and posture horizontally moving visual target cues of different degrees of complexity. Participants continuously tracked for 120 s the motion of a visual target (dot) that oscillated in three different patterns: a simple periodic pattern (a sine wave), a more complex pattern (the Lorenz attractor, which is deterministic but displays mathematical chaos) and an ultra-complex random pattern (a surrogate of the Lorenz attractor). The degree of coupling between performance (posture and gaze) and the target motion was quantified by the spectral coherence, gain, phase and cross-approximate entropy (cross-ApEn) between the signals. Sway-target coherence decreased as a function of target complexity and was lower for the older compared to the young participants when tracking the chaotic target. On the other hand, gaze-target coherence was not affected by either target complexity or age. Yet, a lower cross-ApEn value when tracking the chaotic stimulus motion revealed a more synchronous gaze-target relationship for both age groups. The results suggest limitations in online visuo-motor processing of complex motion cues and a less efficient exploitation of the body sway dynamics with age. Complex visual motion cues may provide a suitable training stimulus to improve visuo-motor integration and restore sway variability in older adults. PMID:27126061
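
    Sway-target coupling of the kind quantified above can be computed with ordinary spectral coherence; the sketch below does so for synthetic signals with arbitrary parameters, purely to make the measure concrete, and does not reproduce the study's analysis settings.

      # Illustrative sway-target coherence on synthetic signals (not study data).
      import numpy as np
      from scipy import signal

      fs = 100.0                                  # assumed sampling rate, Hz
      t = np.arange(0, 120, 1 / fs)               # one 120-s tracking trial
      target = np.sin(2 * np.pi * 0.25 * t)       # simple periodic target motion
      sway = 0.6 * np.sin(2 * np.pi * 0.25 * t + 0.4) + 0.3 * np.random.randn(t.size)

      f, coh = signal.coherence(sway, target, fs=fs, nperseg=1024)
      band = (f > 0.2) & (f < 0.3)
      print("mean coherence near the driving frequency:", coh[band].mean())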

  12. Auditory Emotional Cues Enhance Visual Perception

    ERIC Educational Resources Information Center

    Zeelenberg, Rene; Bocanegra, Bruno R.

    2010-01-01

    Recent studies show that emotional stimuli impair performance to subsequently presented neutral stimuli. Here we show a cross-modal perceptual enhancement caused by emotional cues. Auditory cue words were followed by a visually presented neutral target word. Two-alternative forced-choice identification of the visual target was improved by…

  13. Crossmodal integration between visual linguistic information and flavour perception.

    PubMed

    Razumiejczyk, Eugenia; Macbeth, Guillermo; Marmolejo-Ramos, Fernando; Noguchi, Kimihiro

    2015-08-01

    Many studies have found processing interference in working memory when complex information that enters the cognitive system from different modalities has to be integrated to understand the environment and promote adjustment. Here, we report on a Stroop study that provides evidence concerned with the crossmodal processing of flavour perception and visual language. We found a facilitation effect in the congruency condition. Acceleration was observed for incomplete words and anagrams compared to complete words. A crossmodal completion account is presented for such findings. It is concluded that the crossmodal integration between flavour and visual language perception requires the active participation of top-down and bottom-up processing. PMID:25843936

  14. Undetectable Changes in Image Resolution of Luminance-Contrast Gradients Affect Depth Perception

    PubMed Central

    Tsushima, Yoshiaki; Komine, Kazuteru; Sawahata, Yasuhito; Morita, Toshiya

    2016-01-01

    A great number of studies have suggested a variety of ways to get depth information from two-dimensional images, such as binocular disparity, shape-from-shading, size gradient/foreshortening, aerial perspective, and so on. Are there any other new factors affecting depth perception? A recent psychophysical study investigated the correlation between image resolution and the depth sensation produced by Cylinder images (rectangles containing gradual luminance-contrast changes). It was reported that higher-resolution images facilitate depth perception. However, it is still not clear whether the finding generalizes to other kinds of visual stimuli, because there are more appropriate stimuli for exploring depth perception from luminance-contrast changes, such as the Gabor patch. Here, we further examined the relationship between image resolution and depth perception by conducting a series of psychophysical experiments with not only Cylinders but also Gabor patches, which have smoother luminance-contrast gradients. As a result, higher-resolution images produced stronger depth sensation with both kinds of images. This finding suggests that image resolution affects depth perception from simple luminance-contrast differences (Gabor patch) as well as from shape-from-shading (Cylinder). In addition, this phenomenon was found even when the resolution difference was undetectable. This indicates the existence of consciously available and unavailable information in our visual system. These findings further support the view that image resolution is a previously ignored cue for depth perception. It partially explains the unparalleled viewing experience of novel high-resolution displays. PMID:26941693
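
    The Gabor patch mentioned above is simply a sinusoidal luminance grating under a Gaussian envelope; the sketch below generates one with arbitrary parameter values to make the stimulus concrete, and is not the stimulus code used in the study.

      # Generate a generic Gabor patch: sinusoidal grating times a Gaussian envelope.
      import numpy as np

      def gabor(size=256, wavelength_px=32.0, sigma_px=40.0, orientation_deg=0.0, phase=0.0):
          half = size // 2
          y, x = np.mgrid[-half:half, -half:half].astype(float)
          theta = np.deg2rad(orientation_deg)
          x_rot = x * np.cos(theta) + y * np.sin(theta)
          grating = np.cos(2 * np.pi * x_rot / wavelength_px + phase)
          envelope = np.exp(-(x**2 + y**2) / (2 * sigma_px**2))
          return grating * envelope   # contrast values in [-1, 1] around mean luminance

      patch = gabor()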

  15. Visual attention modulates the asymmetric influence of each cerebral hemisphere on spatial perception.

    PubMed

    Wang, Meijian; Wang, Xiuhai; Xue, Lingyan; Huang, Dan; Chen, Yao

    2016-01-01

    Although the allocation of brain functions across the two cerebral hemispheres has aroused public interest over the past century, asymmetric interhemispheric cooperation under attentional modulation has been scarcely investigated. An example of interhemispheric cooperation is visual spatial perception. During this process, visual information from each hemisphere is integrated because each half of the visual field predominantly projects to the contralateral visual cortex. Both egocentric and allocentric coordinates can be employed for visual spatial representation, but they activate different areas in primate cerebral hemispheres. Recent studies have determined that egocentric representation affects the reaction time of allocentric perception; furthermore, this influence is asymmetric between the two visual hemifields. The egocentric-allocentric incompatibility effect and its asymmetry between the two hemispheres can produce this phenomenon. Using an allocentric position judgment task, we found that this incompatibility effect was reduced, and its asymmetry was eliminated on an attentional task rather than a neutral task. Visual attention might activate cortical areas that process conflicting information, such as the anterior cingulate cortex, and balance the asymmetry between the two hemispheres. Attention may enhance and balance this interhemispheric cooperation because this imbalance may also be caused by the asymmetric cooperation of each hemisphere in spatial perception. PMID:26758349

  17. Instructional Designer's Intentions and Learners' Perceptions of the Instructional Functions of Visuals in an e-Learning Context

    ERIC Educational Resources Information Center

    Jin, Sung-Hee; Boling, Elizabeth

    2010-01-01

    The purpose of this study is to compare an instructional designer's intentions with the learners' perceptions of the instructional functions of visuals in one specific e-learning lesson. An instructional designer created each visual with more than two purposes related to the psychological, cognitive, and affective aspects of learning. Contrary to…

  18. Odors Bias Time Perception in Visual and Auditory Modalities.

    PubMed

    Yue, Zhenzhu; Gao, Tianyu; Chen, Lihan; Wu, Jiashuang

    2016-01-01

    Previous studies have shown that emotional states alter our perception of time. However, attention, which is modulated by a number of factors, such as emotional events, also influences time perception. To exclude potential attentional effects associated with emotional events, various types of odors (inducing different levels of emotional arousal) were used to explore whether olfactory events modulated time perception differently in visual and auditory modalities. Participants either saw a visual dot or heard a continuous tone for 1000 or 4000 ms while they were exposed to odors of jasmine, lavender, or garlic. Participants then reproduced the temporal durations of the preceding visual or auditory stimuli by pressing the spacebar twice. Their reproduced durations were compared to those in the control condition (without odor). The results showed that participants produced significantly longer time intervals in the lavender condition than in the jasmine or garlic conditions. The overall influence of odor on time perception was equivalent for both visual and auditory modalities. The analysis of the interaction effect showed that participants produced longer durations than the actual duration in the short interval condition, but they produced shorter durations in the long interval condition. The effect sizes were larger for the auditory modality than for the visual modality. Moreover, by comparing performance across the initial and the final blocks of the experiment, we found odor adaptation effects were mainly manifested as longer reproductions for the short time interval later in the adaptation phase, and there was a larger effect size in the auditory modality. In summary, the present results indicate that odors imposed differential impacts on reproduced time durations, and they were constrained by different sensory modalities, valence of the emotional events, and target durations. Biases in time perception could be accounted for by a framework of

  20. Relationships between Categorical Perception of Phonemes, Phoneme Awareness, and Visual Attention Span in Developmental Dyslexia.

    PubMed

    Zoubrinetzky, Rachel; Collet, Gregory; Serniclaes, Willy; Nguyen-Morel, Marie-Ange; Valdois, Sylviane

    2016-01-01

    We tested the hypothesis that the categorical perception deficit of speech sounds in developmental dyslexia is related to phoneme awareness skills, whereas a visual attention (VA) span deficit constitutes an independent deficit. Phoneme awareness tasks, VA span tasks and categorical perception tasks of phoneme identification and discrimination using a d/t voicing continuum were administered to 63 dyslexic children and 63 control children matched on chronological age. Results showed significant differences in categorical perception between the dyslexic and control children. Significant correlations were found between categorical perception skills, phoneme awareness and reading. Although VA span correlated with reading, no significant correlations were found between either categorical perception or phoneme awareness and VA span. Mediation analyses performed on the whole dyslexic sample suggested that the effect of categorical perception on reading might be mediated by phoneme awareness. This relationship was independent of the participants' VA span abilities. Two groups of dyslexic children with a single phoneme awareness or a single VA span deficit were then identified. The phonologically impaired group showed lower categorical perception skills than the control group but categorical perception was similar in the VA span impaired dyslexic and control children. The overall findings suggest that the link between categorical perception, phoneme awareness and reading is independent from VA span skills. These findings provide new insights on the heterogeneity of developmental dyslexia. They suggest that phonological processes and VA span independently affect reading acquisition. PMID:26950210
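
    A regression-based mediation check of the kind referred to above (categorical perception → phoneme awareness → reading) can be sketched as follows; the data are random placeholders, and the simple two-step comparison only stands in for the specific mediation procedure used by the authors.

      # Placeholder mediation sketch: does phoneme awareness (mediator) absorb the
      # effect of categorical perception (predictor) on reading (outcome)?
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 63
      categorical_perception = rng.normal(size=n)
      phoneme_awareness = 0.6 * categorical_perception + rng.normal(scale=0.8, size=n)
      reading = 0.5 * phoneme_awareness + rng.normal(scale=0.8, size=n)

      # Total effect of categorical perception on reading.
      total = sm.OLS(reading, sm.add_constant(categorical_perception)).fit()
      # Direct effect once the mediator is in the model.
      X = sm.add_constant(np.column_stack([categorical_perception, phoneme_awareness]))
      direct = sm.OLS(reading, X).fit()

      print("total effect:", total.params[1])
      print("direct effect with mediator:", direct.params[1])  # shrinks toward 0 if mediated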

  2. Black-white asymmetry in visual perception.

    PubMed

    Lu, Zhong-Lin; Sperling, George

    2012-01-01

    With eleven different types of stimuli that exercise a wide gamut of spatial and temporal visual processes, negative perturbations from mean luminance are found to be typically 25% more effective visually than positive perturbations of the same magnitude (range 8-67%). In Experiment 12, the magnitude of the black-white asymmetry is shown to be a saturating function of stimulus contrast. Experiment 13 shows black-white asymmetry primarily involves a nonlinearity in the visual representation of decrements. Black-white asymmetry in early visual processing produces even-harmonic distortion frequencies in all ordinary stimuli and in illusions such as the perceived asymmetry of optically perfect sine wave gratings. In stimuli intended to stimulate exclusively second-order processing in which motion or shape are defined not by luminance differences but by differences in texture contrast, the black-white asymmetry typically generates artifactual luminance (first-order) motion and shape components. Because black-white asymmetry pervades psychophysical and neurophysiological procedures that utilize spatial or temporal variations of luminance, it frequently needs to be considered in the design and evaluation of experiments that involve visual stimuli. Simple procedures to compensate for black-white asymmetry are proposed. PMID:22984221

  4. Strengthening the Visual Perception of Deaf Children. Final Report.

    ERIC Educational Resources Information Center

    Sachs, David A.; And Others

    Learning sets programs were administered to preschool deaf children from a variety of representative educational programs throughout the southwest to improve their visual perception skills. The concept of learning sets was described as progression from trial-and-error learning to immediate problem solving by insight. The project consisted of six…

  5. Audio-Visual Speech Perception: A Developmental ERP Investigation

    ERIC Educational Resources Information Center

    Knowland, Victoria C. P.; Mercure, Evelyne; Karmiloff-Smith, Annette; Dick, Fred; Thomas, Michael S. C.

    2014-01-01

    Being able to see a talking face confers a considerable advantage for speech perception in adulthood. However, behavioural data currently suggest that children fail to make full use of these available visual speech cues until age 8 or 9. This is particularly surprising given the potential utility of multiple informational cues during language…

  6. Perceptual Training Strongly Improves Visual Motion Perception in Schizophrenia

    ERIC Educational Resources Information Center

    Norton, Daniel J.; McBain, Ryan K.; Ongur, Dost; Chen, Yue

    2011-01-01

    Schizophrenia patients exhibit perceptual and cognitive deficits, including in visual motion processing. Given that cognitive systems depend upon perceptual inputs, improving patients' perceptual abilities may be an effective means of cognitive intervention. In healthy people, motion perception can be enhanced through perceptual learning, but it…

  7. Visual Perception of Touchdown Point During Simulated Landing

    ERIC Educational Resources Information Center

    Palmisano, Stephen; Gillam, Barbara

    2005-01-01

    Experiments examined the accuracy of visual touchdown point perception during oblique descents (1.5°-15°) toward a ground plane consisting of (a) randomly positioned dots, (b) a runway outline, or (c) a grid. Participants judged whether the perceived touchdown point was above or below a probe that appeared at a random position following each…

  8. Dynamic Visual Perception and Reading Development in Chinese School Children

    ERIC Educational Resources Information Center

    Meng, Xiangzhi; Cheng-Lai, Alice; Zeng, Biao; Stein, John F.; Zhou, Xiaolin

    2011-01-01

    The development of reading skills may depend to a certain extent on the development of basic visual perception. The magnocellular theory of developmental dyslexia assumes that deficits in the magnocellular pathway, indicated by less sensitivity in perceiving dynamic sensory stimuli, are responsible for a proportion of reading difficulties…

  9. Asymmetries for the Visual Expression and Perception of Speech

    ERIC Educational Resources Information Center

    Nicholls, Michael E. R.; Searle, Dara A.

    2006-01-01

    This study explored asymmetries for movement, expression and perception of visual speech. Sixteen dextral models were videoed as they articulated: "bat," "cat," "fat," and "sat." Measurements revealed that the right side of the mouth was opened wider and for a longer period than the left. The asymmetry was accentuated at the beginning and ends of…

  10. Curriculum Model for Oculomotor. Binocular, and Visual Perception Dysfunctions.

    ERIC Educational Resources Information Center

    Journal of Optometric Education, 1988

    1988-01-01

    A curriculum for disorders of oculomotor control, binocular vision, and visual perception, adopted by the Association of Schools and Colleges of Optometry, is outlined. The curriculum's 14 objectives in physiology, perceptual and cognitive development, epidemiology, public health, diagnosis and management, environmental influences, care delivery,…

  11. Enhanced visual perception with occipital transcranial magnetic stimulation.

    PubMed

    Mulckhuyse, Manon; Kelley, Todd A; Theeuwes, Jan; Walsh, Vincent; Lavie, Nilli

    2011-10-01

    Transcranial magnetic stimulation (TMS) over the occipital pole can produce an illusory percept of a light flash (or 'phosphene'), suggesting an excitatory effect. Whereas previously reported effects of single-pulse occipital pole TMS are typically disruptive, here we report the first demonstration of a location-specific facilitatory effect on visual perception in humans. Observers performed a spatial cueing orientation discrimination task. An orientation target was presented in one of two peripheral placeholders. A single pulse below the phosphene threshold applied to the occipital pole 150 or 200 ms before stimulus onset was found to facilitate target discrimination in the contralateral compared with the ipsilateral visual field. At the 150-ms time window contralateral TMS also amplified cueing effects, increasing both facilitation effects for valid cues and interference effects for invalid cues. These results are the first to show location-specific enhanced visual perception with single-pulse occipital pole stimulation prior to stimulus presentation, suggesting that occipital stimulation can enhance the excitability of the visual cortex and thereby facilitate subsequent perception. PMID:21848918

  12. Infant Perception of Audio-Visual Speech Synchrony

    ERIC Educational Resources Information Center

    Lewkowicz, David J.

    2010-01-01

    Three experiments investigated perception of audio-visual (A-V) speech synchrony in 4- to 10-month-old infants. Experiments 1 and 2 used a convergent-operations approach by habituating infants to an audiovisually synchronous syllable (Experiment 1) and then testing for detection of increasing degrees of A-V asynchrony (366, 500, and 666 ms) or by…

  13. Prenatal exposure to recreational drugs affects global motion perception in preschool children

    PubMed Central

    Chakraborty, Arijit; Anstice, Nicola S.; Jacobs, Robert J.; LaGasse, Linda L.; Lester, Barry M.; Wouldes, Trecia A.; Thompson, Benjamin

    2015-01-01

    Prenatal exposure to recreational drugs impairs motor and cognitive development; however it is currently unknown whether visual brain areas are affected. To address this question, we investigated the effect of prenatal drug exposure on global motion perception, a behavioural measure of processing within the dorsal extrastriate visual cortex that is thought to be particularly vulnerable to abnormal neurodevelopment. Global motion perception was measured in one hundred and forty-five 4.5-year-old children who had been exposed to different combinations of methamphetamine, alcohol, nicotine and marijuana prior to birth and 25 unexposed children. Self-reported drug use by the mothers was verified by meconium analysis. We found that global motion perception was impaired by prenatal exposure to alcohol and improved significantly by exposure to marijuana. Exposure to both drugs prenatally had no effect. Other visual functions such as habitual visual acuity and stereoacuity were not affected by drug exposure. Prenatal exposure to methamphetamine did not influence visual function. Our results demonstrate that prenatal drug exposure can influence a behavioural measure of visual development, but that the effects are dependent on the specific drugs used during pregnancy. PMID:26581958

  15. Integration of visual and motion cues for simulator requirements and ride quality investigation. [computerized simulation of aircraft landing, visual perception of aircraft pilots

    NASA Technical Reports Server (NTRS)

    Young, L. R.

    1975-01-01

    Preliminary tests and evaluations of pilot performance during landing (flight paths) using computer-generated images (video tapes) are presented. Psychophysiological factors affecting pilot visual perception were measured. A turning flight maneuver (pitch and roll) was specifically studied using a training device, and the scaling laws involved were determined. Also presented are medical studies (abstracts) on human responses to gravity variations without visual cues, the effects of acceleration stimuli on the semicircular canals, neurons affecting eye movements, and vestibular tests.

  16. Quality of Visual Cue Affects Visual Reweighting in Quiet Standing.

    PubMed

    Moraes, Renato; de Freitas, Paulo Barbosa; Razuk, Milena; Barela, José Angelo

    2016-01-01

    Sensory reweighting is a characteristic of postural control functioning adopted to accommodate environmental changes. The use of monocular or binocular cues reduces or increases the influence of a moving room on postural sway, suggesting a visual reweighting driven by the quality of the available sensory cues. Because in our previous study visual conditions were set before each trial, participants could adjust the weight of the different sensory systems in an anticipatory manner based upon the reduction in quality of the visual information. Nevertheless, in daily situations this adjustment is a dynamical process and occurs during ongoing movement. The purpose of this study was to examine the effect of visual transitions on the coupling between visual information and body sway at two different distances from the front wall of a moving room. Eleven young adults stood upright inside a moving room at two distances (75 and 150 cm) wearing liquid crystal lens goggles, which allow each lens to switch from opaque to transparent and vice versa. Participants stood still for five minutes in each trial and the lens status changed every minute (no vision to binocular vision, no vision to monocular vision, binocular vision to monocular vision, and vice versa). Results showed that the farther distance and monocular vision reduced the effect of the visual manipulation on postural sway. The effect of visual transition was condition dependent, with a stronger effect when transitions involved binocular vision than monocular vision. Based upon these results, we conclude that the increased distance from the front wall of the room reduced the effect of visual manipulation on postural sway and that sensory reweighting is stimulus-quality dependent, with binocular vision producing a much stronger down/up-weighting than monocular vision. PMID:26939058

  18. Audiovisual associations alter the perception of low-level visual motion.

    PubMed

    Kafaligonul, Hulusi; Oluk, Can

    2015-01-01

    Motion perception is a pervasive feature of vision and is affected by both the immediate pattern of sensory inputs and prior experiences acquired through associations. Recently, several studies reported that an association can be established quickly between directions of visual motion and static sounds of distinct frequencies. After the association is formed, sounds are able to change the perceived direction of visual motion. To determine whether such rapidly acquired audiovisual associations and their subsequent influences on visual motion perception depend on the involvement of higher-order attentive tracking mechanisms, we designed psychophysical experiments using regular and reverse-phi random dot motions that isolate low-level pre-attentive motion processing. Our results show that an association between the directions of low-level visual motion and static sounds can be formed, and that this audiovisual association alters the subsequent perception of low-level visual motion. These findings support the view that audiovisual associations are not restricted to the high-level, attention-based motion system and that early-level visual motion processing has some potential role. PMID:25873869

  19. Adaptation to visual or auditory time intervals modulates the perception of visual apparent motion

    PubMed Central

    Zhang, Huihui; Chen, Lihan; Zhou, Xiaolin

    2012-01-01

    It is debated whether sub-second timing is subserved by a centralized mechanism or by the intrinsic properties of task-related neural activity in specific modalities (Ivry and Schlerf, 2008). By using a temporal adaptation task, we investigated whether adapting to different time intervals conveyed through stimuli in different modalities (i.e., frames of a visual Ternus display, visual blinking discs, or auditory beeps) would affect the subsequent implicit perception of visual timing, i.e., inter-stimulus interval (ISI) between two frames in a Ternus display. The Ternus display can induce two percepts of apparent motion (AM), depending on the ISI between the two frames: “element motion” for short ISIs, in which the endmost disc is seen as moving back and forth while the middle disc at the overlapping or central position remains stationary; “group motion” for longer ISIs, in which both discs appear to move in a manner of lateral displacement as a whole. In Experiment 1, participants adapted to either the typical “element motion” (ISI = 50 ms) or the typical “group motion” (ISI = 200 ms). In Experiments 2 and 3, participants adapted to a time interval of 50 or 200 ms through observing a series of two paired blinking discs at the center of the screen (Experiment 2) or hearing a sequence of two paired beeps (with pitch 1000 Hz). In Experiment 4, participants adapted to sequences of paired beeps with either low pitches (500 Hz) or high pitches (5000 Hz). After adaptation in each trial, participants were presented with a Ternus probe in which the ISI between the two frames was equal to the transitional threshold of the two types of motions, as determined by a pretest. Results showed that adapting to the short time interval in all the situations led to more reports of “group motion” in the subsequent Ternus probes; adapting to the long time interval, however, caused no aftereffect for visual adaptation but significantly more reports of group motion for

  20. The Perception of Cooperativeness Without Any Visual or Auditory Communication.

    PubMed

    Chang, Dong-Seon; Burger, Franziska; Bülthoff, Heinrich H; de la Rosa, Stephan

    2015-12-01

    Perceiving social information such as the cooperativeness of another person is an important part of human interaction. But can people perceive the cooperativeness of others even without any visual or auditory information? In a novel experimental setup, we connected two people with a rope and made them accomplish a point-collecting task together while they could not see or hear each other. We observed a consistently emerging turn-taking behavior in the interactions and installed a confederate in a subsequent experiment who either minimized or maximized this behavior. Participants experienced this only through the haptic force-feedback of the rope and made evaluations about the confederate after each interaction. We found that perception of cooperativeness was significantly affected only by the manipulation of this turn-taking behavior. Gender- and size-related judgments also significantly differed. Our results suggest that people can perceive social information such as the cooperativeness of other people even in situations where possibilities for communication are minimal. PMID:27551362

  2. Neural Population Tuning Links Visual Cortical Anatomy to Human Visual Perception

    PubMed Central

    Song, Chen; Schwarzkopf, Dietrich Samuel; Kanai, Ryota; Rees, Geraint

    2015-01-01

    The anatomy of cerebral cortex is characterized by two genetically independent variables, cortical thickness and cortical surface area, that jointly determine cortical volume. It remains unclear how cortical anatomy might influence neural response properties and whether such influences would have behavioral consequences. Here, we report that thickness and surface area of human early visual cortices exert opposite influences on neural population tuning with behavioral consequences for perceptual acuity. We found that visual cortical thickness correlated negatively with the sharpness of neural population tuning and the accuracy of perceptual discrimination at different visual field positions. In contrast, visual cortical surface area correlated positively with neural population tuning sharpness and perceptual discrimination accuracy. Our findings reveal a central role for neural population tuning in linking visual cortical anatomy to visual perception and suggest that a perceptually advantageous visual cortex is a thinned one with an enlarged surface area. PMID:25619658

  3. Perception, Cognition, and Effectiveness of Visualizations with Applications in Science and Engineering

    NASA Astrophysics Data System (ADS)

    Borkin, Michelle A.

    Visualization is a powerful tool for data exploration and analysis. With data ever-increasing in quantity and becoming integrated into our daily lives, having effective visualizations is necessary. But how does one design an effective visualization? To answer this question, we need to understand how humans perceive, process, and understand visualizations. Through visualization evaluation studies we can gain deeper insight into the basic perception and cognition theory of visualizations, both through domain-specific case studies and generalized laboratory experiments. This dissertation presents the results of four evaluation studies, each of which contributes new knowledge to the theory of perception and cognition of visualizations. The results of these studies include a deeper, clearer understanding of how color, data representation dimensionality, spatial layout, and visual complexity affect a visualization's effectiveness, as well as how visualization types and visual attributes affect the memorability of a visualization. We first present the results of two domain-specific case study evaluations. The first study is in the field of biomedicine, in which we developed a new heart disease diagnostic tool and conducted a study to evaluate the effectiveness of 2D versus 3D data representations as well as color maps. In the second study, we developed a new visualization tool for filesystem provenance data with applications in computer science and the sciences more broadly. We additionally developed a new time-based hierarchical node grouping method. We then conducted a study to evaluate the effectiveness of the new tool with its radial layout versus the conventional node-link diagram, and the new node grouping method. Finally, we discuss the results of two generalized studies designed to understand what makes a visualization memorable. In the first evaluation we focused on visualization memorability and conducted an online study using Amazon's Mechanical Turk with

  4. Spatial and temporal characteristics of visual motion perception involving V5 visual cortex.

    PubMed

    d'Alfonso, A A; van Honk, J; Schutter, D J; Caffé, A R; Postma, A; de Haan, E H

    2002-04-01

    The anatomical substrates of the perception of motion have not yet been established in a detailed way on an individual level. The aim of this study was to develop a systematic procedure for mapping the visual cortex using Transcranial Magnetic Stimulation (TMS). The results showed that such an individual and detailed map of the spatial and temporal characteristics of motion perception can be constructed using TMS. PMID:11958420

  5. Nondeterministic data base for computerized visual perception

    NASA Technical Reports Server (NTRS)

    Yakimovsky, Y.

    1976-01-01

    A description is given of the knowledge representation data base in the perception subsystem of the Mars robot vehicle prototype. Two types of information are stored. The first is generic information that represents general rules that are conformed to by structures in the expected environments. The second kind of information is a specific description of a structure, i.e., the properties and relations of objects in the specific case being analyzed. The generic knowledge is represented so that it can be applied to extract and infer the description of specific structures. The generic model of the rules is substantially a Bayesian representation of the statistics of the environment, which means it is geared to representation of nondeterministic rules relating properties of, and relations between, objects. The description of a specific structure is also nondeterministic in the sense that all properties and relations may take a range of values with an associated probability distribution.
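
    The nondeterministic style of description outlined above, in which an object's properties and relations take a range of values with associated probabilities rather than single values, can be expressed with a minimal structure like the one below; the class, property names, and probabilities are invented for illustration only.

      # Minimal sketch of a nondeterministic object description: each property maps
      # to a probability distribution over candidate values (all names are invented).
      from dataclasses import dataclass, field

      @dataclass
      class ObjectDescription:
          name: str
          properties: dict[str, dict[str, float]] = field(default_factory=dict)

          def most_likely(self, prop: str) -> str:
              """Return the highest-probability value for a property."""
              dist = self.properties[prop]
              return max(dist, key=dist.get)

      region = ObjectDescription("region_17")
      region.properties["class"] = {"rock": 0.6, "shadow": 0.1, "soil": 0.3}
      print(region.most_likely("class"))  # -> "rock"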

  6. Modeling visual clutter perception using proto-object segmentation

    PubMed Central

    Yu, Chen-Ping; Samaras, Dimitris; Zelinsky, Gregory J.

    2014-01-01

    We introduce the proto-object model of visual clutter perception. This unsupervised model segments an image into superpixels, then merges neighboring superpixels that share a common color cluster to obtain proto-objects—defined here as spatially extended regions of coherent features. Clutter is estimated by simply counting the number of proto-objects. We tested this model using 90 images of realistic scenes that were ranked by observers from least to most cluttered. Comparing this behaviorally obtained ranking to a ranking based on the model clutter estimates, we found a significant correlation between the two (Spearman's ρ = 0.814, p < 0.001). We also found that the proto-object model was highly robust to changes in its parameters and was generalizable to unseen images. We compared the proto-object model to six other models of clutter perception and demonstrated that it outperformed each, in some cases dramatically. Importantly, we also showed that the proto-object model was a better predictor of clutter perception than an actual count of the number of objects in the scenes, suggesting that the set size of a scene may be better described by proto-objects than objects. We conclude that the success of the proto-object model is due in part to its use of an intermediate level of visual representation—one between features and objects—and that this is evidence for the potential importance of a proto-object representation in many common visual percepts and tasks. PMID:24904121
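
    The clutter measure described above can be approximated in a few lines: segment the image into superpixels, cluster superpixel colours, merge touching superpixels that fall in the same colour cluster, and count the resulting proto-objects. The sketch below follows that recipe with off-the-shelf tools and arbitrary parameter values; it approximates the idea rather than reproducing the authors' implementation.

      # Approximate proto-object clutter count: SLIC superpixels, k-means colour
      # clusters, merge touching same-cluster superpixels, count merged regions.
      # Parameter values are arbitrary; this is not the authors' implementation.
      import numpy as np
      from skimage import data, segmentation
      from sklearn.cluster import KMeans
      from scipy.ndimage import label

      image = data.astronaut()                       # any RGB scene image
      superpixels = segmentation.slic(image, n_segments=300, compactness=10, start_label=0)

      # Mean colour per superpixel, then a coarse colour clustering.
      n_sp = superpixels.max() + 1
      mean_colors = np.array([image[superpixels == i].mean(axis=0) for i in range(n_sp)])
      color_cluster = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(mean_colors)

      # Relabel each pixel by its superpixel's colour cluster; touching superpixels in
      # the same cluster then fuse into one connected region (a proto-object).
      cluster_map = color_cluster[superpixels]
      proto_object_count = 0
      for c in range(8):
          _, n_regions = label(cluster_map == c)
          proto_object_count += n_regions

      print("estimated clutter (number of proto-objects):", proto_object_count)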

  7. Visual movement perception in deaf and hearing individuals

    PubMed Central

    Hauthal, Nadine; Sandmann, Pascale; Debener, Stefan; Thorne, Jeremy D.

    2013-01-01

    A number of studies have investigated changes in the perception of visual motion as a result of altered sensory experiences. An animal study has shown that auditory-deprived cats exhibit enhanced performance in a visual movement detection task compared to hearing cats (Lomber, Meredith, & Kral, 2010). In humans, the behavioural evidence regarding the perception of motion is less clear. The present study investigated deaf and hearing adult participants using a movement localization task and a direction of motion task employing coherently-moving and static visual dot patterns. Overall, deaf and hearing participants did not differ in their movement localization performance, although within the deaf group, a left visual field advantage was found. When discriminating the direction of motion, however, deaf participants responded faster and tended to be more accurate when detecting small differences in direction compared with the hearing controls. These results conform to the view that visual abilities are enhanced after auditory deprivation and extend previous findings regarding visual motion processing in deaf individuals. PMID:23826037

  8. Boosting visual cortex function and plasticity with acetylcholine to enhance visual perception

    PubMed Central

    Kang, Jun Il; Huppé-Gourgues, Frédéric; Vaucher, Elvire

    2014-01-01

    The cholinergic system is a potent neuromodulatory system that plays critical roles in cortical plasticity, attention and learning. In this review, we propose that the cellular effects of acetylcholine (ACh) in the primary visual cortex during the processing of visual inputs might induce perceptual learning; i.e., long-term changes in visual perception. Specifically, the pairing of cholinergic activation with visual stimulation increases the signal-to-noise ratio, cue detection ability and long-term facilitation in the primary visual cortex. This cholinergic enhancement would increase the strength of thalamocortical afferents to facilitate the treatment of a novel stimulus while decreasing the cortico-cortical signaling to reduce recurrent or top-down modulation. This balance would be mediated by different cholinergic receptor subtypes that are located on both glutamatergic and GABAergic neurons of the different cortical layers. The mechanisms of cholinergic enhancement are closely linked to attentional processes, long-term potentiation (LTP) and modulation of the excitatory/inhibitory balance. Recently, it was found that boosting the cholinergic system during visual training robustly enhances sensory perception in a long-term manner. Our hypothesis is that repetitive pairing of cholinergic and sensory stimulation over a long period of time induces long-term changes in the processing of trained stimuli that might improve perceptual ability. Various non-invasive approaches to the activation of the cholinergic neurons have strong potential to improve visual perception. PMID:25278848

  9. The Geometry of Visual Perception: Retinotopic and Non-retinotopic Representations in the Human Visual System

    PubMed Central

    Öğmen, Haluk; Herzog, Michael H.

    2011-01-01

    Geometry is closely linked to visual perception; yet, very little is known about the geometry of visual processing beyond early retinotopic organization. We present a variety of perceptual phenomena showing that a retinotopic representation is neither sufficient nor necessary to support form perception. We discuss the popular “object files” concept as a candidate for non-retinotopic representations and, based on its shortcomings, suggest future directions for research using local manifold representations. We suggest that these manifolds are created by the emergence of dynamic reference-frames that result from motion segmentation. We also suggest that the metric of these manifolds is based on relative motion vectors. PMID:22334763

  10. PERCEPT: indoor navigation for the blind and visually impaired.

    PubMed

    Ganz, Aura; Gandhi, Siddhesh Rajan; Schafer, James; Singh, Tushar; Puleo, Elaine; Mullett, Gary; Wilson, Carole

    2011-01-01

    In order to enhance the perception of indoor and unfamiliar environments for the blind and visually-impaired, we introduce the PERCEPT system that supports a number of unique features such as: a) Low deployment and maintenance cost; b) Scalability, i.e. we can deploy the system in very large buildings; c) An on-demand system that does not overwhelm the user, as it offers small amounts of information on demand; and d) Portability and ease-of-use, i.e., the custom handheld device carried by the user is compact and instructions are received audibly. PMID:22254445

  11. Visual Timing of Structured Dance Movements Resembles Auditory Rhythm Perception.

    PubMed

    Su, Yi-Huang; Salazar-López, Elvira

    2016-01-01

    Temporal mechanisms for processing auditory musical rhythms are well established, in which a perceived beat is beneficial for timing purposes. It is yet unknown whether such beat-based timing would also underlie visual perception of temporally structured, ecological stimuli connected to music: dance. In this study, we investigated whether observers extracted a visual beat when watching dance movements to assist visual timing of these movements. Participants watched silent videos of dance sequences and reproduced the movement duration by mental recall. We found better visual timing for limb movements with regular patterns in the trajectories than without, similar to the beat advantage for auditory rhythms. When movements involved both the arms and the legs, the benefit of a visual beat relied only on the latter. The beat-based advantage persisted despite auditory interferences that were temporally incongruent with the visual beat, arguing for the visual nature of these mechanisms. Our results suggest that visual timing principles for dance parallel their auditory counterparts for music, which may be based on common sensorimotor coupling. These processes likely yield multimodal rhythm representations in the scenario of music and dance. PMID:27313900

  13. Producing Curious Affects: Visual Methodology as an Affecting and Conflictual Wunderkammer

    ERIC Educational Resources Information Center

    Staunaes, Dorthe; Kofoed, Jette

    2015-01-01

    Digital video cameras, smartphones, the internet, and iPads are increasingly used as visual research methods with the purpose of creating an affective corpus of data. Such visual methods are often combined with interviews or observations. Not only are visual methods part of the research methods used; the visual products themselves are used as requisites in…

  14. Accuracy aspects of stereo side-looking radar. [analysis of its visual perception and binocular vision

    NASA Technical Reports Server (NTRS)

    Leberl, F. W.

    1979-01-01

    The geometry of the radar stereo model and factors affecting visual radar stereo perception are reviewed. Limits to the vertical exaggeration factor of stereo radar are defined. Radar stereo model accuracies are analyzed with respect to coordinate errors caused by errors of radar sensor position and of range, and with respect to errors of coordinate differences, i.e., cross-track distances and height differences.

  15. Audio-visual speech perception: a developmental ERP investigation

    PubMed Central

    Knowland, Victoria CP; Mercure, Evelyne; Karmiloff-Smith, Annette; Dick, Fred; Thomas, Michael SC

    2014-01-01

    Being able to see a talking face confers a considerable advantage for speech perception in adulthood. However, behavioural data currently suggest that children fail to make full use of these available visual speech cues until age 8 or 9. This is particularly surprising given the potential utility of multiple informational cues during language learning. We therefore explored this at the neural level. The event-related potential (ERP) technique has been used to assess the mechanisms of audio-visual speech perception in adults, with visual cues reliably modulating auditory ERP responses to speech. Previous work has shown congruence-dependent shortening of auditory N1/P2 latency and congruence-independent attenuation of amplitude in the presence of auditory and visual speech signals, compared to auditory alone. The aim of this study was to chart the development of these well-established modulatory effects over mid-to-late childhood. Experiment 1 employed an adult sample to validate a child-friendly stimulus set and paradigm by replicating previously observed effects of N1/P2 amplitude and latency modulation by visual speech cues; it also revealed greater attenuation of component amplitude given incongruent audio-visual stimuli, pointing to a new interpretation of the amplitude modulation effect. Experiment 2 used the same paradigm to map cross-sectional developmental change in these ERP responses between 6 and 11 years of age. The effect of amplitude modulation by visual cues emerged over development, while the effect of latency modulation was stable over the child sample. These data suggest that auditory ERP modulation by visual speech represents separable underlying cognitive processes, some of which show earlier maturation than others over the course of development. PMID:24176002

  17. Image Watermarking Based on Adaptive Models of Human Visual Perception

    NASA Astrophysics Data System (ADS)

    Khawne, Amnach; Hamamoto, Kazuhiko; Chitsobhuk, Orachat

    This paper proposes a digital image watermarking scheme based on adaptive models of human visual perception. The algorithm exploits the local activity estimated from the wavelet coefficients of each subband to adaptively control luminance masking. The adaptive luminance masking is then combined with contrast masking and edge detection and adopted as a visibility threshold. With this combination of adaptive visual-sensitivity parameters, the proposed perceptual model can adapt to the differing characteristics of various images. The weighting function is chosen such that fidelity, imperceptibility, and robustness are preserved without any perceptible difference in image quality.
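
    As a hedged illustration of the general idea (embedding strength adapted to local wavelet activity), the sketch below spreads a bit pattern into one detail subband. The masking model and all constants are simplified placeholders rather than the authors' luminance, contrast, and edge models.

        import numpy as np
        import pywt

        def embed_watermark(image, bits, base_strength=2.0):
            """Embed a +/-1 bit pattern into a wavelet detail subband, scaled by local activity."""
            coeffs = pywt.wavedec2(image.astype(float), "db2", level=2)
            cH, cV, cD = coeffs[1]                      # coarsest-level detail subbands
            activity = np.abs(cH)                       # crude proxy for texture/edge masking
            threshold = base_strength * (1.0 + activity / (activity.mean() + 1e-9))
            pattern = np.resize(np.where(np.asarray(bits) > 0, 1.0, -1.0), cH.shape)
            coeffs[1] = (cH + threshold * pattern, cV, cD)
            return pywt.waverec2(coeffs, "db2")         # watermarked image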

  18. A STUDY OF CONCEPTUAL DEVELOPMENT AND VISUAL PERCEPTION IN SIX-YEAR-OLD CHILDREN.

    PubMed

    Bütün Ayhan, Aynur; Aki, Esra; Mutlu, Burcu; Aral, Neriman

    2015-12-01

    Visual perception comprises established responses to visual stimuli. Conceptual development accompanies the development of visual perception skills. Both visual perception and sufficient conceptual development are vital to a child's academic skills and social participation. The aim of this study was to examine the relationship between conceptual development and the visual perceptual skills of six-year-old children. 140 children were administered Bracken's (1998) Basic Concept Scale (BBCS-R) and the Frostig Developmental Visual Perception Test. BBCS-R scores were weakly correlated with FDVPT Discrimination of figure-ground, and had moderate and significant correlations with Constancy of the figures, Perception of position in space, Perception of spatial relation, and the Total score on visual perception. Also, a moderate correlation was found between the total score of the FDVPT and the total score of the BBCS-R. PMID:26595200

  19. Influence of aging on visual perception and visual motor integration in Korean adults.

    PubMed

    Kim, Eunhwi; Park, Young-Kyung; Byun, Yong-Hyun; Park, Mi-Sook; Kim, Hong

    2014-08-01

    This study investigated age-related changes of cognitive function in Korean adults using the Korean-Developmental Test of Visual Perception-2 (K-DTVP-2) and the Visual Motor Integration-3rd Revision (VMI-3R) test, and determined the main factors influencing VP and VMI in older adults. For this research, 139 adults for the K-DTVP-2 and 192 adults for the VMI-3R, from a total of 283 participants, were randomly and separately recruited in a province of Korea. The present study showed that the mean scores of the K-DTVP-2 and VMI-3R in 10-yr age increments significantly decreased as age increased (K-DTVP-2, F= 41.120, P< 0.001; VMI-3R, F= 16.583, P< 0.001). The mean scores of the VMI-3R and K-DTVP-2 were significantly lower in participants in their 50s compared to those in their 20s (P< 0.05). Age (t= -9.130, P< 0.001), gender (t= 3.029, P= 0.003), and the presence of diseases (t= -2.504, P= 0.013) were the significant factors affecting K-DTVP-2 score. On the other hand, age (t= -6.300, P< 0.001) was the only significant factor affecting VMI-3R score. K-DTVP-2 score (Standardized β= -0.611) declined more steeply with aging than VMI-3R score (Standardized β= -0.467). The two measurements had a significant positive correlation (r = 0.855, P< 0.001). In conclusion, it can be suggested that VP and VMI should be regularly checked from an individual's 50s, which is a critical period for detecting cognitive decline due to aging. Both the K-DTVP-2 and VMI-3R could be used for determining the level of cognitive deficit due to aging. PMID:25210701

  20. The Effects of Visual Beats on Prosodic Prominence: Acoustic Analyses, Auditory Perception and Visual Perception

    ERIC Educational Resources Information Center

    Krahmer, Emiel; Swerts, Marc

    2007-01-01

    Speakers employ acoustic cues (pitch accents) to indicate that a word is important, but may also use visual cues (beat gestures, head nods, eyebrow movements) for this purpose. Even though these acoustic and visual cues are related, the exact nature of this relationship is far from well understood. We investigate whether producing a visual beat…

  1. Aesthetic perception of visual textures: a holistic exploration using texture analysis, psychological experiment, and perception modeling.

    PubMed

    Liu, Jianli; Lughofer, Edwin; Zeng, Xianyi

    2015-01-01

    Modeling human aesthetic perception of visual textures is important and valuable in numerous industrial domains, such as product design, architectural design, and decoration. Based on results from a semantic differential rating experiment, we modeled the relationship between low-level basic texture features and the aesthetic properties involved in human aesthetic texture perception. First, we compute basic texture features from textural images using four classical methods. These features are neutral, objective, and independent of the socio-cultural context of the visual textures. Then, we conduct a semantic differential rating experiment to collect from evaluators their aesthetic perceptions of selected textural stimuli. In the semantic differential rating experiment, eight pairs of aesthetic properties are chosen, which are strongly related to the socio-cultural context of the selected textures and to human emotions. They are easily understood and connected to everyday life. We propose a hierarchical feed-forward layer model of aesthetic texture perception and assign the eight pairs of aesthetic properties to different layers. Finally, we describe the generation of multiple linear and non-linear regression models for aesthetic prediction by taking dimensionality-reduced texture features and aesthetic properties of visual textures as independent and dependent variables, respectively. Our experimental results indicate that the relationships between each layer and its neighbors in the hierarchical feed-forward layer model of aesthetic texture perception can be fitted well by linear functions, and the models thus generated can successfully bridge the gap between computational texture features and aesthetic texture properties. PMID:26582987
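
    A schematic version of this pipeline (objective texture features, dimensionality reduction, then a linear model predicting a rated aesthetic property) could be sketched as follows. The grey-level co-occurrence features and the PCA step are stand-ins for the four classical feature methods and the reduction actually used by the authors.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        def texture_features(gray_uint8):
            """Objective co-occurrence statistics of one grayscale texture image."""
            glcm = graycomatrix(gray_uint8, distances=[1, 2], angles=[0, np.pi / 2],
                                levels=256, symmetric=True, normed=True)
            props = ["contrast", "homogeneity", "energy", "correlation"]
            return np.hstack([graycoprops(glcm, p).ravel() for p in props])

        def fit_aesthetic_model(gray_images, ratings, n_components=5):
            """Fit reduced texture features (independent) to mean ratings (dependent)."""
            X = np.vstack([texture_features(img) for img in gray_images])
            model = make_pipeline(PCA(n_components=n_components), LinearRegression())
            return model.fit(X, np.asarray(ratings))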

  3. Sound Affects the Speed of Visual Processing

    ERIC Educational Resources Information Center

    Keetels, Mirjam; Vroomen, Jean

    2011-01-01

    The authors examined the effects of a task-irrelevant sound on visual processing. Participants were presented with revolving clocks at or around central fixation and reported the hand position of a target clock at the time an exogenous cue (1 clock turning red) or an endogenous cue (a line pointing toward 1 of the clocks) was presented. A…

  4. The Effect of Combined Sensory and Semantic Components on Audio–Visual Speech Perception in Older Adults

    PubMed Central

    Maguinness, Corrina; Setti, Annalisa; Burke, Kate E.; Kenny, Rose Anne; Newell, Fiona N.

    2011-01-01

    Previous studies have found that perception in older people benefits from multisensory over unisensory information. As normal speech recognition is affected by both the auditory input and the visual lip movements of the speaker, we investigated the efficiency of audio and visual integration in an older population by manipulating the relative reliability of the auditory and visual information in speech. We also investigated the role of the semantic context of the sentence to assess whether audio–visual integration is affected by top-down semantic processing. We presented participants with audio–visual sentences in which the visual component was either blurred or not blurred. We found that there was a greater cost in recall performance for semantically meaningless speech in the audio–visual ‘blur’ compared to audio–visual ‘no blur’ condition and this effect was specific to the older group. Our findings have implications for understanding how aging affects efficient multisensory integration for the perception of speech and suggests that multisensory inputs may benefit speech perception in older adults when the semantic content of the speech is unpredictable. PMID:22207848

  5. Ongoing Slow Fluctuations in V1 Impact on Visual Perception.

    PubMed

    Wohlschläger, Afra M; Glim, Sarah; Shao, Junming; Draheim, Johanna; Köhler, Lina; Lourenço, Susana; Riedl, Valentin; Sorg, Christian

    2016-01-01

    The human brain's ongoing activity is characterized by intrinsic networks of coherent fluctuations, measured for example with correlated functional magnetic resonance imaging signals. So far, however, the brain processes underlying this ongoing blood oxygenation level dependent (BOLD) signal orchestration and their direct relevance for human behavior are not sufficiently understood. In this study, we address the question of whether and how ongoing BOLD activity within intrinsic occipital networks impacts on conscious visual perception. To this end, backwardly masked targets were presented in participants' left visual field only, leaving the ipsi-lateral occipital areas entirely free from direct effects of task throughout the experiment. Signal time courses of ipsi-lateral BOLD fluctuations in visual areas V1 and V2 were then used as proxies for the ongoing contra-lateral BOLD activity within the bilateral networks. Magnitude and phase of these fluctuations were compared in trials with and without conscious visual perception, operationalized by means of subjective confidence ratings. Our results show that ipsi-lateral BOLD magnitudes in V1 were significantly higher at times of peak response when the target was perceived consciously. A significant difference between conscious and non-conscious perception with regard to the pre-target phase of an intrinsic-frequency regime suggests that ongoing V1 fluctuations exert a decisive impact on the access to consciousness already before stimulation. Both effects were absent in V2. These results thus support the notion that ongoing slow BOLD activity within intrinsic networks covering V1 represents localized processes that modulate the degree of readiness for the emergence of visual consciousness. PMID:27601986

  7. Ventral aspect of the visual form pathway is not critical for the perception of biological motion.

    PubMed

    Gilaie-Dotan, Sharon; Saygin, Ayse Pinar; Lorenzi, Lauren J; Rees, Geraint; Behrmann, Marlene

    2015-01-27

    Identifying the movements of those around us is fundamental for many daily activities, such as recognizing actions, detecting predators, and interacting with others socially. A key question concerns the neurobiological substrates underlying biological motion perception. Although the ventral "form" visual cortex is standardly activated by biologically moving stimuli, whether these activations are functionally critical for biological motion perception or are epiphenomenal remains unknown. To address this question, we examined whether focal damage to regions of the ventral visual cortex, resulting in significant deficits in form perception, adversely affects biological motion perception. Six patients with damage to the ventral cortex were tested with sensitive point-light display paradigms. All patients were able to recognize unmasked point-light displays and their perceptual thresholds were not significantly different from those of three different control groups, one of which comprised brain-damaged patients with spared ventral cortex (n > 50). Importantly, these six patients performed significantly better than patients with damage to regions critical for biological motion perception. To assess the necessary contribution of different regions in the ventral pathway to biological motion perception, we complement the behavioral findings with a fine-grained comparison between the lesion location and extent, and the cortical regions standardly implicated in biological motion processing. This analysis revealed that the ventral aspects of the form pathway (e.g., fusiform regions, ventral extrastriate body area) are not critical for biological motion perception. We hypothesize that the role of these ventral regions is to provide enhanced multiview/posture representations of the moving person rather than to represent biological motion perception per se. PMID:25583504

  8. Visual perception in prediagnostic and early stage Huntington's disease.

    PubMed

    O'Donnell, Brian F; Blekher, Tanya M; Weaver, Marjorie; White, Kerry M; Marshall, Jeanine; Beristain, Xabier; Stout, Julie C; Gray, Jacqueline; Wojcieszek, Joanne M; Foroud, Tatiana M

    2008-05-01

    Disturbances of visual perception frequently accompany neurodegenerative disorders but have been little studied in Huntington's disease (HD) gene carriers. We used psychophysical tests to assess visual perception among individuals in the prediagnostic and early stages of HD. The sample comprised four groups, which included 201 nongene carriers (NG), 32 prediagnostic gene carriers with minimal neurological abnormalities (PD1); 20 prediagnostic gene carriers with moderate neurological abnormalities (PD2), and 36 gene carriers with diagnosed HD. Contrast sensitivity for stationary and moving sinusoidal gratings, and tests of form and motion discrimination, were used to probe different visual pathways. Patients with HD showed impaired contrast sensitivity for moving gratings. For one of the three contrast sensitivity tests, the prediagnostic gene carriers with greater neurological abnormality (PD2) also had impaired performance as compared with NG. These findings suggest that early stage HD disrupts visual functions associated with the magnocellular pathway. However, these changes are only observed in individuals diagnosed with HD or who are in the more symptomatic stages of prediagnostic HD. PMID:18419843

  9. Knowledge corruption for visual perception in individuals high on paranoia.

    PubMed

    Moritz, Steffen; Göritz, Anja S; Van Quaquebeke, Niels; Andreou, Christina; Jungclaussen, David; Peters, Maarten J V

    2014-03-30

    Studies revealed that patients with paranoid schizophrenia display overconfidence in errors for memory and social cognition tasks. The present investigation examined whether this pattern holds true for visual perception tasks. Nonclinical participants were recruited via an online panel. Individuals were asked to complete a questionnaire that included the Paranoia Checklist and were then presented with 24 blurry pictures; half contained a hidden object while the other half showed snowy (visual) noise. Participants were asked to state whether the visual items contained an object and how confident they were in their judgment. Data from 1966 individuals were included following a conservative selection process. Participants high on core paranoid symptoms showed a poor calibration of confidence for correct versus incorrect responses. In particular, participants high on paranoia displayed overconfidence in incorrect responses and demonstrated a 20% error rate for responses made with high confidence compared to a 12% error rate in participants with low paranoia scores. Interestingly, paranoia scores declined after performance of the task. For the first time, overconfidence in errors was demonstrated among individuals with high levels of paranoia using a visual perception task, tentatively suggesting it is a ubiquitous phenomenon. In view of the significant decline in paranoia across time, bias modification programs may incorporate items such as the one employed here to teach patients with clinical paranoia the fallibility of human cognition, which may foster subsequent symptom improvement. PMID:24461685

  10. Food pleasantness affects visual selective attention.

    PubMed

    di Pellegrino, Giuseppe; Magarelli, Silvia; Mengarelli, Flavia

    2011-03-01

    Fundamental to adaptive behaviour is the ability to select environmental objects that best satisfy current needs and preferences. Here we investigated whether temporary changes in food preference influence visual selective attention. To this end, we exploited the fact that when a food is eaten to satiety its motivational value and perceived pleasantness decrease relative to other foods not eaten in the meal, an effect termed sensory-specific satiety. A total of 26 hungry participants were fed until sated with one of two palatable foods. Before and after selective satiation, participants rated the pleasantness of the two foods and then viewed the same as stimuli on a computer screen while attention was assessed by a visual probe task. Results showed that the attentional bias for the food eaten decreased markedly from pre- to postsatiety, along with the subjective pleasantness for that food. By contrast, subjective pleasantness and attentional bias for the food not eaten did not show any such decrease. These findings suggest that the allocation of visual selective attention is flexibly and rapidly adjusted to reflect temporary shift in relative preference for different foods. PMID:20835973

  11. Perception of environmental tobacco smoke odors: An olfactory and visual response

    NASA Astrophysics Data System (ADS)

    Moschandreas, D. J.; Relwani, S. M.

    Odor perception of approximately 200 subjects was measured to determine whether visual contact with an odor source affects sensory responses and to estimate the magnitude of such an effect. Environmental tobacco smoke (ETS) odors were generated in a chamber either by a smoke machine or by an investigator who smoked. Several levels of odor intensity were generated. Odor intensity, odor hedonics and odor characters were the parameters measured before and after visual contact with the odor source. Visual contact increased the perceived odor intensity, the hedonic nature of the odor changed directionally toward unpleasant and the number of subjects perceiving tobacco odor increased. The change caused by visual contact led to differences that were statistically significant.

  12. Does Attention Affect Visual Feature Integration?

    ERIC Educational Resources Information Center

    Prinzmetal, William; And Others

    This work investigates, first, whether the integration of color and shape information is affected by attending to the stimulus location, and second, whether attending to a stimulus location enhances the perceptual representation of the stimulus or merely affects decision processes. In three experiments, subjects were briefly presented with colored…

  13. Site-specific visual feedback reduces pain perception.

    PubMed

    Diers, Martin; Zieglgänsberger, Walter; Trojan, Jörg; Drevensek, Annika Mira; Erhardt-Raum, Gertrud; Flor, Herta

    2013-06-01

    One of the most common forms of chronic pain is back pain. Until now, nothing has been known about the influence of visualizing one's own back on pain perception at this site. We tested 18 patients with chronic back pain and 18 healthy controls, by implementing online video feedback of the back during painful pressure and subcutaneous electrical stimuli over the trapezius muscle. Pain threshold and pain tolerance were assessed. Pressure pain stimulation intensity was set to 50% above the pain threshold. Subcutaneous stimulation intensity was set to 70% above the pain threshold. Subjects had to rate pain intensity and unpleasantness after each stimulation block on an 11-point numerical rating scale. Visual feedback of the back reduced perceived pain intensity compared to feedback of the hand in both patients and controls. These findings suggest novel intervention modes for chronic back pain based on visualization of body parts by augmented reality applications. PMID:23582151

  14. Visual perception system and method for a humanoid robot

    NASA Technical Reports Server (NTRS)

    Wells, James W. (Inventor); Mc Kay, Neil David (Inventor); Chelian, Suhas E. (Inventor); Linn, Douglas Martin (Inventor); Wampler, II, Charles W. (Inventor); Bridgwater, Lyndon (Inventor)

    2012-01-01

    A robotic system includes a humanoid robot with robotic joints each moveable using an actuator(s), and a distributed controller for controlling the movement of each of the robotic joints. The controller includes a visual perception module (VPM) for visually identifying and tracking an object in the field of view of the robot under threshold lighting conditions. The VPM includes optical devices for collecting an image of the object, a positional extraction device, and a host machine having an algorithm for processing the image and positional information. The algorithm visually identifies and tracks the object, and automatically adapts an exposure time of the optical devices to prevent feature data loss of the image under the threshold lighting conditions. A method of identifying and tracking the object includes collecting the image, extracting positional information of the object, and automatically adapting the exposure time to thereby prevent feature data loss of the image.
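
    The exposure-adaptation step can be pictured as a small control loop like the one below. The camera accessor functions are hypothetical placeholders and the histogram bounds are illustrative; none of this is taken from the patented system.

        import numpy as np

        def adapt_exposure(get_frame, set_exposure_ms, exposure_ms=10.0,
                           max_iters=20, clip_fraction=0.01):
            """Nudge exposure until the 8-bit image histogram is neither clipped nor crushed."""
            for _ in range(max_iters):
                frame = get_frame()                          # grayscale uint8 image
                hist, _ = np.histogram(frame, bins=256, range=(0, 256))
                total = hist.sum()
                overexposed = hist[250:].sum() / total       # fraction of near-white pixels
                underexposed = hist[:5].sum() / total        # fraction of near-black pixels
                if overexposed > clip_fraction:
                    exposure_ms *= 0.8                       # too bright: shorten exposure
                elif underexposed > clip_fraction:
                    exposure_ms *= 1.25                      # too dark: lengthen exposure
                else:
                    break                                    # feature data preserved
                set_exposure_ms(exposure_ms)
            return exposure_ms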

  15. Visual attention to and perception of undamaged and damaged versions of natural and colored female hair.

    PubMed

    Fink, Bernhard; Neuser, Frauke; Deloux, Gwenelle; Röder, Susanne; Matts, Paul J

    2013-03-01

    Female hair color is thought to influence physical attractiveness, and although there is some evidence for this assertion, research has not yet addressed whether and how physical damage affects the perception of female hair color. Here we investigate whether people are sensitive (in terms of visual attention and age, health, and attractiveness perception) to subtle differences in images of natural and colored hair before and after physical damage. We tracked the eye-gaze of 50 men and 50 women aged 31-50 years whilst they viewed randomized pairs of images of 20 natural and 20 colored hair tresses, each pair displaying the same tress before and after controlled cuticle damage. The hair images were then rated for perceived health, attractiveness, and age. Undamaged versions of natural and colored hair were perceived as significantly younger, healthier, and more attractive than corresponding damaged versions. Visual attention to images of undamaged colored hair was significantly higher compared with their damaged counterparts, while in natural hair, the opposite pattern was found. We argue that the divergence in visual attention to undamaged colored female hair and damaged natural female hair and the associated ratings is due to differences in social perception and discuss the source of the apparent visual difference between undamaged and damaged hair. PMID:23438146

  16. Exploration of complex visual feature spaces for object perception

    PubMed Central

    Leeds, Daniel D.; Pyles, John A.; Tarr, Michael J.

    2014-01-01

    The mid- and high-level visual properties supporting object perception in the ventral visual pathway are poorly understood. In the absence of well-specified theory, many groups have adopted a data-driven approach in which they progressively interrogate neural units to establish each unit's selectivity. Such methods are challenging in that they require search through a wide space of feature models and stimuli using a limited number of samples. To more rapidly identify higher-level features underlying human cortical object perception, we implemented a novel functional magnetic resonance imaging method in which visual stimuli are selected in real-time based on BOLD responses to recently shown stimuli. This work was inspired by earlier primate physiology work, in which neural selectivity for mid-level features in IT was characterized using a simple parametric approach (Hung et al., 2012). To extend such work to human neuroimaging, we used natural and synthetic object stimuli embedded in feature spaces constructed on the basis of the complex visual properties of the objects themselves. During fMRI scanning, we employed a real-time search method to control continuous stimulus selection within each image space. This search was designed to maximize neural responses across a pre-determined 1 cm3 brain region within ventral cortex. To assess the value of this method for understanding object encoding, we examined both the behavior of the method itself and the complex visual properties the method identified as reliably activating selected brain regions. We observed: (1) Regions selective for both holistic and component object features and for a variety of surface properties; (2) Object stimulus pairs near one another in feature space that produce responses at the opposite extremes of the measured activity range. Together, these results suggest that real-time fMRI methods may yield more widely informative measures of selectivity within the broad classes of visual features

  17. Motion coherence affects human perception and pursuit similarly

    NASA Technical Reports Server (NTRS)

    Beutter, B. R.; Stone, L. S.

    2000-01-01

    Pursuit and perception both require accurate information about the motion of objects. Recovering the motion of objects by integrating the motion of their components is a difficult visual task. Successful integration produces coherent global object motion, while a failure to integrate leaves the incoherent local motions of the components unlinked. We compared the ability of perception and pursuit to perform motion integration by measuring direction judgments and the concomitant eye-movement responses to line-figure parallelograms moving behind stationary rectangular apertures. The apertures were constructed such that only the line segments corresponding to the parallelogram's sides were visible; thus, recovering global motion required the integration of the local segment motion. We investigated several potential motion-integration rules by using stimuli with different object, vector-average, and line-segment terminator-motion directions. We used an oculometric decision rule to directly compare direction discrimination for pursuit and perception. For visible apertures, the percept was a coherent object, and both the pursuit and perceptual performance were close to the object-motion prediction. For invisible apertures, the percept was incoherently moving segments, and both the pursuit and perceptual performance were close to the terminator-motion prediction. Furthermore, both psychometric and oculometric direction thresholds were much higher for invisible apertures than for visible apertures. We constructed a model in which both perception and pursuit are driven by a shared motion-processing stage, with perception having an additional input from an independent static-processing stage. Model simulations were consistent with our perceptual and oculomotor data. Based on these results, we propose the use of pursuit as an objective and continuous measure of perceptual coherence. Our results support the view that pursuit and perception share a common motion

  18. Auditory Adaptation in Vocal Affect Perception

    ERIC Educational Resources Information Center

    Bestelmeyer, Patricia E. G.; Rouger, Julien; DeBruine, Lisa M.; Belin, Pascal

    2010-01-01

    Previous research has demonstrated perceptual aftereffects for emotionally expressive faces, but the extent to which they can also be obtained in a different modality is unknown. In two experiments we show for the first time that adaptation to affective, non-linguistic vocalisations elicits significant auditory aftereffects. Adaptation to angry…

  19. Real-time visualization of neuronal activity during perception.

    PubMed

    Muto, Akira; Ohkura, Masamichi; Abe, Gembu; Nakai, Junichi; Kawakami, Koichi

    2013-02-18

    To understand how the brain perceives the external world, it is desirable to observe neuronal activity in the brain in real time during perception. The zebrafish is a suitable model animal for fluorescence imaging studies to visualize neuronal activity because its body is transparent through the embryonic and larval stages. Imaging studies have been carried out to monitor neuronal activity in the larval spinal cord and brain using Ca(2+) indicator dyes and DNA-encoded Ca(2+) indicators, such as Cameleon, GFP-aequorin, and GCaMPs. However, temporal and spatial resolution and sensitivity of these tools are still limited, and imaging of brain activity during perception of a natural object has not yet been demonstrated. Here we demonstrate visualization of neuronal activity in the optic tectum of larval zebrafish by genetically expressing the new version of GCaMP. First, we demonstrate Ca(2+) transients in the tectum evoked by a moving spot on a display and identify direction-selective neurons. Second, we show tectal activity during perception of a natural object, a swimming paramecium, revealing a functional visuotopic map. Finally, we image the tectal responses of a free-swimming larval fish to a paramecium and thereby correlate neuronal activity in the brain with prey capture behavior. PMID:23375894

  20. Can 2- and 3-Year-Old Children Be Trained to Perform Visual Perception Tasks?

    ERIC Educational Resources Information Center

    McGuigan, Nicola

    2007-01-01

    Children aged 2 and 3 years were exposed to a novel paradigm designed to train visual perception skills. The results indicate that children of this age could be trained to perform both percept deprivation and percept diagnosis tasks. Results are discussed with reference to engagement, a precursor to an adult-like understanding of perception.

  1. An Exploratory Investigation into Factors Affecting Visual Balance.

    ERIC Educational Resources Information Center

    Niekamp, Walter

    1981-01-01

    Describes a study using ocular photography to examine factors that affect the visual weights of significant elements in a picture. Results indicating that the upper half of the visual field has the greatest weight are discussed, as are results showing insufficient support for side preferences. Included are 27 references. (Author/BK)

  2. Affective State Influences Perception by Affecting Decision Parameters Underlying Bias and Sensitivity

    PubMed Central

    Lynn, Spencer K.; Zhang, Xuan; Barrett, Lisa Feldman

    2012-01-01

    Studies of the effect of affect on perception often show consistent directional effects of a person’s affective state on perception. Unpleasant emotions have been associated with a “locally focused” style of stimulus evaluation, and positive emotions with a “globally focused” style. Typically, however, studies of affect and perception have not been conducted under the conditions of perceptual uncertainty and behavioral risk inherent to perceptual judgments outside the laboratory. We investigated the influence of perceivers’ experienced affect (valence and arousal) on the utility of social threat perception by combining signal detection theory and behavioral economics. We created three perceptual decision environments that systematically differed with respect to factors that underlie uncertainty and risk: the base rate of threat, the costs of incorrectly identifying threat, and the perceptual similarity of threats and non-threats. We found that no single affective state yielded the best performance on the threat perception task across the three environments. Unpleasant valence promoted calibration of response bias to base rate and costs, high arousal promoted calibration of perceptual sensitivity to perceptual similarity, and low arousal was associated with an optimal adjustment of bias to sensitivity. However, the strength of these associations was conditional upon the difficulty of attaining optimal bias and high sensitivity, such that the effect of the perceiver’s affective state on perception differed with the cause and/or level of uncertainty and risk. PMID:22251054
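
    The decision parameters at issue (perceptual sensitivity, response bias, and the bias that would be optimal for a given base rate and payoff structure) can be written down compactly. The snippet below uses textbook signal-detection formulas with made-up numbers, not data or code from the study.

        from scipy.stats import norm

        def dprime_and_criterion(hit_rate, false_alarm_rate):
            """Sensitivity d' and criterion c from hit and false-alarm rates."""
            z_h, z_fa = norm.ppf(hit_rate), norm.ppf(false_alarm_rate)
            return z_h - z_fa, -0.5 * (z_h + z_fa)

        def optimal_beta(p_signal, v_hit, v_miss, v_cr, v_fa):
            """Optimal likelihood-ratio criterion given base rate and payoffs."""
            return ((1 - p_signal) / p_signal) * (v_cr - v_fa) / (v_hit - v_miss)

        print(dprime_and_criterion(0.85, 0.20))   # approx. (1.88, -0.10)
        print(optimal_beta(0.25, 1, -3, 1, -1))   # rare but costly threats -> beta = 1.5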

  3. Coordinates of Human Visual and Inertial Heading Perception

    PubMed Central

    Crane, Benjamin Thomas

    2015-01-01

    Heading estimation involves both inertial and visual cues. Inertial motion is sensed by the labyrinth, somatic sensation by the body, and optic flow by the retina. Because the eye and head are mobile, these stimuli are sensed relative to different reference frames, and it remains unclear whether perception occurs in a common reference frame. Recent neurophysiologic evidence has suggested that the reference frames remain separate even at higher levels of processing, but has not addressed the resulting perception. Seven human subjects experienced a 2-s, 16-cm/s translation and/or a visual stimulus corresponding with this translation. For each condition 72 stimuli (360° in 5° increments) were delivered in random order. After each stimulus the subject identified the perceived heading using a mechanical dial. Some trial blocks included interleaved conditions in which the influence of ±28° of gaze and/or head position was examined. The observations were fit using a two-degree-of-freedom population vector decoder (PVD) model which considered the relative sensitivity to lateral motion and the coordinate system offset. For visual stimuli, gaze shifts caused shifts in perceived heading estimates in the direction opposite the gaze shift in all subjects. These perceptual shifts averaged 13 ± 2° for eye-only gaze shifts and 17 ± 2° for eye-head gaze shifts. This finding indicates visual headings are biased towards retinal coordinates. Similar gaze and head direction shifts prior to inertial headings had no significant influence on heading direction. Thus inertial headings are perceived in body-centered coordinates. Combined visual and inertial stimuli yielded intermediate results. PMID:26267865
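
    A generic two-parameter reconstruction of the kind of model described (a coordinate offset plus a relative sensitivity to lateral motion, fitted to dial responses) might look like the sketch below. The functional form and the synthetic data are assumptions for illustration, not the authors' decoder.

        import numpy as np
        from scipy.optimize import curve_fit

        def pvd_model(true_heading_deg, lateral_gain, offset_deg):
            """Perceived heading from true heading via an offset and a lateral-motion gain."""
            t = np.radians(true_heading_deg - offset_deg)
            return np.degrees(np.arctan2(lateral_gain * np.sin(t), np.cos(t)))

        # synthetic dial responses over a range that avoids the +/-180 degree wrap
        rng = np.random.default_rng(1)
        headings = np.arange(-80.0, 85.0, 5.0)
        responses = pvd_model(headings, 1.3, 10.0) + rng.normal(0.0, 3.0, headings.size)
        (gain, offset), _ = curve_fit(pvd_model, headings, responses, p0=[1.0, 0.0])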

  5. Stars advantages vs parallel coordinates: shape perception as visualization reserve

    NASA Astrophysics Data System (ADS)

    Grishin, Vladimir; Kovalerchuk, Boris

    2013-12-01

    Shape perception is the brain's main information channel, yet it has been poorly exploited by recent visualization techniques, and the difficulty of modeling it is a key obstacle for visualization theory and application. Existing experimental estimates of shape-perception capability were obtained for low-dimensional data and were usually not tied to particular data structures. We take a more applied approach to detecting specific data structures with shape displays, analytically and experimentally comparing the currently popular Parallel Coordinates (PCs), i.e., 2D Cartesian displays of data vectors, with polar displays known as stars. Advantages of stars over PCs are demonstrated in terms of the Gestalt laws. Psychological experiments on detecting hyper-tube structures in data spaces of dimension up to 100-200, and in their subspaces, show roughly twice-faster feature selection and classification with stars than with PCs. This demonstrates large untapped reserves for enhancing visualization compared with many recent techniques, which usually focus on analyzing only a few data attributes.
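
    For readers unfamiliar with the two displays, the toy matplotlib sketch below draws the same six-dimensional data vector once as a parallel-coordinates polyline and once as a polar "star" glyph; it illustrates the display types only and reproduces nothing from the paper.

        import numpy as np
        import matplotlib.pyplot as plt

        values = np.array([0.8, 0.3, 0.6, 0.9, 0.2, 0.5])      # one 6-D data vector
        dims = np.arange(values.size)

        fig = plt.figure(figsize=(8, 3))

        # parallel coordinates: one vertical axis per dimension, vector = polyline
        ax_pc = fig.add_subplot(1, 2, 1)
        ax_pc.plot(dims, values, marker="o")
        ax_pc.set_xticks(dims)
        ax_pc.set_title("Parallel coordinates")

        # star display: dimensions spread around a circle, vector = closed contour
        ax_star = fig.add_subplot(1, 2, 2, projection="polar")
        angles = dims * 2 * np.pi / values.size
        ax_star.plot(np.append(angles, angles[0]), np.append(values, values[0]))
        ax_star.set_title("Star (polar) display")

        plt.tight_layout()
        plt.show()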

  6. Neural bandwidth of veridical perception across the visual field.

    PubMed

    Wilkinson, Michael O; Anderson, Roger S; Bradley, Arthur; Thibos, Larry N

    2016-01-01

    Neural undersampling of the retinal image limits the range of spatial frequencies that can be represented veridically by the array of retinal ganglion cells conveying visual information from eye to brain. Our goal was to demarcate the neural bandwidth and local anisotropy of veridical perception, unencumbered by optical imperfections of the eye, and to test competing hypotheses that might account for the results. Using monochromatic interference fringes to stimulate the retina with high-contrast sinusoidal gratings, we measured sampling-limited visual resolution along eight meridians from 0° to 50° of eccentricity. The resulting isoacuity contour maps revealed all of the expected features of the human array of retinal ganglion cells. Contours in the radial fringe maps are elongated horizontally, revealing the functional equivalent of the anatomical visual streak, and are extended into nasal retina and superior retina, indicating higher resolution along those meridians. Contours are larger in diameter for radial gratings compared to tangential or oblique gratings, indicating local anisotropy with highest bandwidth for radially oriented gratings. Comparison of these results to anatomical predictions indicates acuity is proportional to the sampling density of retinal ganglion cells everywhere in the retina. These results support the long-standing hypothesis that "pixel density" of the discrete neural image carried by the human optic nerve limits the spatial bandwidth of veridical perception at all retinal locations. PMID:26824638
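
    The "pixel density" argument can be made concrete with the usual Nyquist relation: for a locally square sampling lattice of density D samples per square degree, the sample spacing is 1/sqrt(D) degrees and the highest veridically representable spatial frequency is sqrt(D)/2 cycles per degree. The worked value below uses an assumed density, not a measurement from the paper.

        import math

        def nyquist_limit_cpd(samples_per_deg2: float) -> float:
            """Highest veridically sampled spatial frequency for a square sampling lattice."""
            return math.sqrt(samples_per_deg2) / 2.0

        # e.g. about 100 ganglion-cell samples per square degree would support
        # veridical resolution of gratings up to about 5 cycles per degree
        print(nyquist_limit_cpd(100.0))  # -> 5.0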

  7. Human alteration of the rural landscape: Variations in visual perception

    SciTech Connect

    Cloquell-Ballester, Vicente-Agustin; Torres-Sibille, Ana del Carmen; Cloquell-Ballester, Victor-Andres; Santamarina-Siurana, Maria Cristina

    2012-01-15

    The objective of this investigation is to evaluate how visual perception varies as the rural landscape is altered by human interventions of varying character. An experiment is carried out using Semantic Differential Analysis to analyse the effect of the character and the type of the intervention on perception. Interventions are divided into 'elements of permanent industrial character', 'elements of permanent rural character', and 'elements of temporary character', and these categories are sub-divided into smaller groups according to the type of development. To increase the reliability of the results, the Intraclass Correlation Coefficient tool is applied to validate the semantic space of the perceptual responses and to determine the number of subjects required for a reliable evaluation of the scenes.
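
    The reliability check mentioned above corresponds to a standard two-way random-effects, single-rater intraclass correlation, ICC(2,1). The sketch below is a textbook implementation of that coefficient, not the authors' analysis code.

        import numpy as np

        def icc_2_1(ratings):
            """ICC(2,1) for an (n_scenes x k_raters) matrix of scores."""
            ratings = np.asarray(ratings, dtype=float)
            n, k = ratings.shape
            grand = ratings.mean()
            ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()    # between scenes
            ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()    # between raters
            ss_err = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols
            ms_rows = ss_rows / (n - 1)
            ms_cols = ss_cols / (k - 1)
            ms_err = ss_err / ((n - 1) * (k - 1))
            return (ms_rows - ms_err) / (
                ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)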

  8. Does bilingual experience affect early visual perceptual development?

    PubMed Central

    Schonberg, Christina; Sandhofer, Catherine M.; Tsang, Tawny; Johnson, Scott P.

    2014-01-01

    Visual attention and perception develop rapidly during the first few months after birth, and these behaviors are critical components in the development of language and cognitive abilities. Here we ask how early bilingual experiences might lead to differences in visual attention and perception. Experiments 1–3 investigated the looking behavior of monolingual and bilingual infants when presented with social (Experiment 1), mixed (Experiment 2), or non-social (Experiment 3) stimuli. In each of these experiments, infants' dwell times (DT) and number of fixations to areas of interest (AOIs) were analyzed, giving a sense of where the infants looked. To examine how the infants looked at the stimuli in a more global sense, Experiment 4 combined and analyzed the saccade data collected in Experiments 1–3. There were no significant differences between monolingual and bilingual infants' DTs, AOI fixations, or saccade characteristics (specifically, frequency, and amplitude) in any of the experiments. These results suggest that monolingual and bilingual infants process their visual environments similarly, supporting the idea that the substantial cognitive differences between monolinguals and bilinguals in early childhood are more related to active vocabulary production than perception of the environment. PMID:25566116

  9. Visual perception and grasping for the extravehicular activity robot

    NASA Technical Reports Server (NTRS)

    Starks, Scott A.

    1989-01-01

    The development of an approach to the visual perception of object surface information using laser range data in support of robotic grasping is discussed. This is a very important problem area in that a robot such as the EVAR must be able to formulate a grasping strategy on the basis of its knowledge of the surface structure of the object. A description of the problem domain is given as well as a formulation of an algorithm which derives an object surface description adequate to support robotic grasping. The algorithm is based upon concepts of differential geometry, namely Gaussian and mean curvature.
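
    The differential-geometry step named above can be sketched with the standard Monge-patch formulas, which give the Gaussian and mean curvature of a surface z(x, y) from its partial derivatives. The code below is illustrative only (finite differences on a toy depth map), not the EVAR algorithm itself.

        import numpy as np

        def surface_curvatures(z, spacing=1.0):
            """Gaussian (K) and mean (H) curvature of a range image z(x, y).

            Uses the closed-form Monge-patch expressions with finite-difference
            estimates of the partial derivatives.
            """
            zy, zx = np.gradient(z, spacing)      # first derivatives (rows ~ y, cols ~ x)
            zxy, zxx = np.gradient(zx, spacing)
            zyy, _ = np.gradient(zy, spacing)
            g = 1.0 + zx**2 + zy**2
            K = (zxx * zyy - zxy**2) / g**2
            H = ((1 + zx**2) * zyy - 2 * zx * zy * zxy + (1 + zy**2) * zxx) / (2 * g**1.5)
            return K, H

        # Toy range patch: a spherical cap (hypothetical data, not laser range output)
        y, x = np.mgrid[-1:1:64j, -1:1:64j]
        z = np.sqrt(1.5**2 - x**2 - y**2)
        K, H = surface_curvatures(z, spacing=2.0 / 63)
        print("center K, H:", round(K[32, 32], 3), round(H[32, 32], 3))

    The signs of K and H classify the local surface type (elliptic, hyperbolic, cylindrical, planar), which is the kind of surface description a grasp planner can consume.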

  10. Principals' Perception regarding Factors Affecting the Performance of Teachers

    ERIC Educational Resources Information Center

    Akram, Muhammad Javaid; Raza, Syed Ahmad; Khaleeq, Abdur Rehman; Atika, Samrana

    2011-01-01

    This study investigated the perception of principals on how the factors of subject mastery, teaching methodology, personal characteristics, and attitude toward students affect the performance of teachers at higher secondary level in the Punjab. All principals of higher secondary level in the Punjab were part of the population of the study. From…

  11. Categorical Perception of Affective and Linguistic Facial Expressions

    ERIC Educational Resources Information Center

    McCullough, Stephen; Emmorey, Karen

    2009-01-01

    Two experiments investigated categorical perception (CP) effects for affective facial expressions and linguistic facial expressions from American Sign Language (ASL) for Deaf native signers and hearing non-signers. Facial expressions were presented in isolation (Experiment 1) or in an ASL verb context (Experiment 2). Participants performed ABX…

  12. Inhibitory processes in visual perception: a bilingual advantage.

    PubMed

    Wimmer, Marina C; Marx, Christina

    2014-10-01

    Bilingual inhibitory control advantages are well established. An open question is whether inhibitory superiority also extends to visual perceptual phenomena that involve inhibitory processes. This research used ambiguous figures to assess inhibitory bilingual superiority in 3-, 4-, and 5-year-old mono- and bilingual children (N=141). Findings show that bilinguals across all ages are superior in inhibiting a prevalent interpretation of an ambiguous figure to perceive the alternative interpretation. In contrast, mono- and bilinguals revealed no differences in understanding that an ambiguous figure can have two distinct referents. Together, these results suggest that early bilingual inhibitory control superiority is also evident in visual perception. Bilinguals' conceptual understanding of figure ambiguity is comparable to that of their monolingual peers. PMID:24878102

  13. Relationships between Fine-Motor, Visual-Motor, and Visual Perception Scores and Handwriting Legibility and Speed

    ERIC Educational Resources Information Center

    Klein, Sheryl; Guiltner, Val; Sollereder, Patti; Cui, Ying

    2011-01-01

    Occupational therapists assess fine motor, visual motor, visual perception, and visual skill development, but knowledge of the relationships between scores on sensorimotor performance measures and handwriting legibility and speed is limited. Ninety-nine students in grades three to six with learning and/or behavior problems completed the Upper-Limb…

  14. Spatial Resolution of Conscious Visual Perception in Infants

    PubMed Central

    Farzin, Faraz; Rivera, Susan M.; Whitney, David

    2011-01-01

    Conscious awareness of objects in the visual periphery is limited. This limit is not entirely the result of reduced visual acuity, but is primarily caused by crowding—the inability to identify an object when surrounded by clutter. Crowding represents a fundamental limitation of the visual system, and has to date been unexplored in infants. Do infants have a fine-grained “spotlight”, similar to adults, or a diffuse “lantern” that sets limits on what they can register in the periphery? An eye-tracking paradigm was designed to psychophysically measure crowding in 6- to 15-month-olds by showing pairs of faces at three eccentricities, in the presence or absence of flankers, and recording infants’ first saccade from central fixation to either face. Results reveal that infants can discriminate faces in the periphery, and flankers impair this ability as close as 3 degrees; the effective spatial resolution of visual perception increased with age but was only half that of adults. PMID:20817914

  15. Vibrotactile support: initial effects on visual speech perception.

    PubMed

    Lyxell, B; Rönnberg, J; Andersson, J; Linderoth, E

    1993-01-01

    The study investigated the initial effects of the implementation of vibrotactile support on the individual's speech perception ability. Thirty-two subjects participated in the study; 16 with an acquired deafness and 16 with normal hearing. At a general level, the results indicated no immediate and direct improvement as a function of the implementation across all speech perception tests. However, when the subjects were divided into Skilled and Less Skilled groups, based on their performance in the visual condition of each test, it was found that the performance of the Skilled subjects deteriorated while that of the Less Skilled subjects improved when tactile information was provided in two conditions (word-discrimination and word-decoding conditions). It was concluded that tactile information interferes with Skilled subjects' automaticity of these functions. Furthermore, intercorrelations between discrimination and decoding tasks suggest that there are similarities between visually and tactilely supported speechreading in how they relate to sentence-based speechreading. Clinical implications of the results were discussed. PMID:8210957

  16. Sampling of post-Riley visual artists surreptitiously probing perception

    NASA Astrophysics Data System (ADS)

    Daly, Scott J.

    2003-06-01

    Attending any conference on visual perception undoubtedly leaves one exposed to the work of Salvador Dali, whose extended phase of work exploring what he dubbed "the paranoiac-critical method" is very popular as an example of multiple perceptions arising from conflicting input. While all visual art is intertwined with perceptual science, from convincing three-dimensional illusion during the Renaissance to the isolated visual illusions of Bridget Riley's Op-Art, direct statements about perception are rarely uttered by artists in recent times. However, there are still a number of artists working today whose work contains perceptual questions and exemplars that can be of interest to vision scientists and imaging engineers. This talk will start by sampling from Op-Art, which is most directly related to psychophysical test stimuli, and then will discuss "perceptual installations" from artists such as James Turrell, whose focus is often directly on natural light, with no distortions imposed by any capture or display apparatus. His work generally involves installations that use daylight and focus the viewer on its nuanced qualities, such as umbra, air particle interactions, and effects of light adaptation. He is one of the last artists to actively discuss perception. Next we discuss minimal art and electronic art, with video artist Nam June Paik discussing the "intentionally boring" art of minimalism. Another artist using installations is Sandy Skoglund, who creates environments of constant spectral albedo, with the exception of her human occupants. Tom Shannon also uses installations as his medium to delve into 3D aspects of depth and perspective, but in an atomized fashion. Beginning with installation concepts, Calvin Collum then adds the restrictive viewpoint of photography to create initially confusing images where the pictorial content and depth features are independent (analogous to the work of Patrick Hughes). Andy Goldsworthy also combines photography with concepts of

  17. Vegetarianism and food perception. Selective visual attention to meat pictures.

    PubMed

    Stockburger, Jessica; Renner, Britta; Weike, Almut I; Hamm, Alfons O; Schupp, Harald T

    2009-04-01

    Vegetarianism provides a model system to examine the impact of negative affect towards meat, based on ideational reasoning. It was hypothesized that meat stimuli are efficient attention catchers in vegetarians. Event-related brain potential recordings served to index selective attention processes at the level of initial stimulus perception. Consistent with the hypothesis, late positive potentials to meat pictures were enlarged in vegetarians compared to omnivores. This effect was specific for meat pictures and obtained during passive viewing and an explicit attention task condition. These findings demonstrate the attention capture of food stimuli, deriving affective salience from ideational reasoning and symbolic meaning. PMID:18996158

  18. How (and why) the visual control of action differs from visual perception

    PubMed Central

    Goodale, Melvyn A.

    2014-01-01

    Vision not only provides us with detailed knowledge of the world beyond our bodies, but it also guides our actions with respect to objects and events in that world. The computations required for vision-for-perception are quite different from those required for vision-for-action. The former uses relational metrics and scene-based frames of reference while the latter uses absolute metrics and effector-based frames of reference. These competing demands on vision have shaped the organization of the visual pathways in the primate brain, particularly within the visual areas of the cerebral cortex. The ventral ‘perceptual’ stream, projecting from early visual areas to inferior temporal cortex, helps to construct the rich and detailed visual representations of the world that allow us to identify objects and events, attach meaning and significance to them and establish their causal relations. By contrast, the dorsal ‘action’ stream, projecting from early visual areas to the posterior parietal cortex, plays a critical role in the real-time control of action, transforming information about the location and disposition of goal objects into the coordinate frames of the effectors being used to perform the action. The idea of two visual systems in a single brain might seem initially counterintuitive. Our visual experience of the world is so compelling that it is hard to believe that some other quite independent visual signal—one that we are unaware of—is guiding our movements. But evidence from a broad range of studies from neuropsychology to neuroimaging has shown that the visual signals that give us our experience of objects and events in the world are not the same ones that control our actions. PMID:24789899

  19. See it with feeling: affective predictions during object perception

    PubMed Central

    Barrett, L.F.; Bar, Moshe

    2009-01-01

    People see with feeling. We ‘gaze’, ‘behold’, ‘stare’, ‘gape’ and ‘glare’. In this paper, we develop the hypothesis that the brain's ability to see in the present incorporates a representation of the affective impact of those visual sensations in the past. This representation makes up part of the brain's prediction of what the visual sensations stand for in the present, including how to act on them in the near future. The affective prediction hypothesis implies that responses signalling an object's salience, relevance or value do not occur as a separate step after the object is identified. Instead, affective responses support vision from the very moment that visual stimulation begins. PMID:19528014

  20. Emotion perception, but not affect perception, is impaired with semantic memory loss

    PubMed Central

    Lindquist, Kristen A.; Gendron, Maria; Feldman Barrett, Lisa; Dickerson, Bradford C.

    2014-01-01

    For decades, psychologists and neuroscientists have hypothesized that the ability to perceive emotions on others’ faces is inborn, pre-linguistic, and universal. Concept knowledge about emotion has been assumed to be epiphenomenal to emotion perception. In this paper, we report findings from three patients with semantic dementia that cannot be explained by this “basic emotion” view. These patients, who have substantial deficits in semantic processing abilities, spontaneously perceived pleasant and unpleasant expressions on faces, but not discrete emotions such as anger, disgust, fear, or sadness, even in a task that did not require the use of emotion words. Our findings support the hypothesis that discrete emotion concept knowledge helps transform perceptions of affect (positively or negatively valenced facial expressions) into perceptions of discrete emotions such as anger, disgust, fear and sadness. These findings have important consequences for understanding the processes supporting emotion perception. PMID:24512242

  1. Psychopathic traits affect the visual exploration of facial expressions.

    PubMed

    Boll, Sabrina; Gamer, Matthias

    2016-05-01

    Deficits in emotional reactivity and recognition have been reported in psychopathy. Impaired attention to the eyes along with amygdala malfunctions may underlie these problems. Here, we investigated how different facets of psychopathy modulate the visual exploration of facial expressions by assessing personality traits in a sample of healthy young adults using an eye-tracking based face perception task. Fearless Dominance (the interpersonal-emotional facet of psychopathy) and Coldheartedness scores predicted reduced face exploration consistent with findings on lowered emotional reactivity in psychopathy. Moreover, participants high on the social deviance facet of psychopathy ('Self-Centered Impulsivity') showed a reduced bias to shift attention towards the eyes. Our data suggest that facets of psychopathy modulate face processing in healthy individuals and reveal possible attentional mechanisms which might be responsible for the severe impairments of social perception and behavior observed in psychopathy. PMID:27016126

  2. Behavioral Differences in the Upper and Lower Visual Hemifields in Shape and Motion Perception.

    PubMed

    Zito, Giuseppe A; Cazzoli, Dario; Müri, René M; Mosimann, Urs P; Nef, Tobias

    2016-01-01

    Perceptual accuracy is known to be influenced by stimulus location within the visual field. In particular, it seems to be enhanced in the lower visual hemifield (VH) for motion and space processing, and in the upper VH for object and face processing. The origins of such asymmetries are attributed to attentional biases across the visual field and to the functional organization of the visual system. In this article, we tested content-dependent perceptual asymmetries in different regions of the visual field. Twenty-five healthy volunteers participated in this study. They performed three visual tests involving perception of shapes, orientation and motion, in the four quadrants of the visual field. The results of the visual tests showed that perceptual accuracy was better in the lower than in the upper visual field for motion perception, and better in the upper than in the lower visual field for shape perception. Orientation perception did not show any vertical bias. No difference was found when comparing right and left VHs. The functional organization of the visual system seems to indicate that the dorsal and the ventral visual streams, responsible for motion and shape perception, respectively, show a bias for the lower and upper VHs, respectively. Such a bias depends on the content of the visual information. PMID:27378876

  3. Behavioral Differences in the Upper and Lower Visual Hemifields in Shape and Motion Perception

    PubMed Central

    Zito, Giuseppe A.; Cazzoli, Dario; Müri, René M.; Mosimann, Urs P.; Nef, Tobias

    2016-01-01

    Perceptual accuracy is known to be influenced by stimulus location within the visual field. In particular, it seems to be enhanced in the lower visual hemifield (VH) for motion and space processing, and in the upper VH for object and face processing. The origins of such asymmetries are attributed to attentional biases across the visual field and to the functional organization of the visual system. In this article, we tested content-dependent perceptual asymmetries in different regions of the visual field. Twenty-five healthy volunteers participated in this study. They performed three visual tests involving perception of shapes, orientation and motion, in the four quadrants of the visual field. The results of the visual tests showed that perceptual accuracy was better in the lower than in the upper visual field for motion perception, and better in the upper than in the lower visual field for shape perception. Orientation perception did not show any vertical bias. No difference was found when comparing right and left VHs. The functional organization of the visual system seems to indicate that the dorsal and the ventral visual streams, responsible for motion and shape perception, respectively, show a bias for the lower and upper VHs, respectively. Such a bias depends on the content of the visual information. PMID:27378876

  4. Seen, Unseen or Overlooked? How Can Visual Perception Develop through a Multimodal Enquiry?

    ERIC Educational Resources Information Center

    Payne, Rachel

    2012-01-01

    This article outlines an exploration into the development of visual perception through analysing the process of taking photographs of the mundane as small-scale research. A preoccupation with social construction of the visual lies at the heart of the investigation by correlating the perceptive process to Mitchell's (2002) counter thesis for visual…

  5. Quantitative and qualitative evaluation of PERCEPT indoor navigation system for visually impaired users.

    PubMed

    Ganz, Aura; Schafer, James; Puleo, Elaine; Wilson, Carole; Robertson, Meg

    2012-01-01

    In this paper we introduce qualitative and quantitative evaluation of PERCEPT system, an indoor navigation system for the blind and visually impaired. PERCEPT system trials with 24 blind and visually impaired users in a multi-story building show PERCEPT system effectiveness in providing appropriate navigation instructions to these users. The uniqueness of our system is that it is affordable and that its design follows Orientation and Mobility principles. These results encourage us to generalize the solution to large indoor spaces and test it with significantly larger visually impaired population in diverse settings. We hope that PERCEPT will become a standard deployed in all indoor public spaces. PMID:23367251

  6. Visual and Auditory Components in the Perception of Asynchronous Audiovisual Speech.

    PubMed

    García-Pérez, Miguel A; Alcalá-Quintana, Rocío

    2015-12-01

    Research on asynchronous audiovisual speech perception manipulates experimental conditions to observe their effects on synchrony judgments. Probabilistic models establish a link between the sensory and decisional processes underlying such judgments and the observed data, via interpretable parameters that allow testing hypotheses and making inferences about how experimental manipulations affect such processes. Two models of this type have recently been proposed, one based on independent channels and the other using a Bayesian approach. Both models are fitted here to a common data set, with a subsequent analysis of the interpretation they provide about how experimental manipulations affected the processes underlying perceived synchrony. The data consist of synchrony judgments as a function of audiovisual offset in a speech stimulus, under four within-subjects manipulations of the quality of the visual component. The Bayesian model could not accommodate asymmetric data, was rejected by goodness-of-fit statistics for 8/16 observers, and was found to be nonidentifiable, which renders its parameter estimates uninterpretable. The independent-channels model captured asymmetric data, was rejected for only 1/16 observers, and identified how sensory and decisional processes mediating asynchronous audiovisual speech perception are affected by manipulations that only alter the quality of the visual component of the speech signal. PMID:27551361

  7. Visual and Auditory Components in the Perception of Asynchronous Audiovisual Speech

    PubMed Central

    García-Pérez, Miguel A.; Alcalá-Quintana, Rocío

    2015-01-01

    Research on asynchronous audiovisual speech perception manipulates experimental conditions to observe their effects on synchrony judgments. Probabilistic models establish a link between the sensory and decisional processes underlying such judgments and the observed data, via interpretable parameters that allow testing hypotheses and making inferences about how experimental manipulations affect such processes. Two models of this type have recently been proposed, one based on independent channels and the other using a Bayesian approach. Both models are fitted here to a common data set, with a subsequent analysis of the interpretation they provide about how experimental manipulations affected the processes underlying perceived synchrony. The data consist of synchrony judgments as a function of audiovisual offset in a speech stimulus, under four within-subjects manipulations of the quality of the visual component. The Bayesian model could not accommodate asymmetric data, was rejected by goodness-of-fit statistics for 8/16 observers, and was found to be nonidentifiable, which renders its parameter estimates uninterpretable. The independent-channels model captured asymmetric data, was rejected for only 1/16 observers, and identified how sensory and decisional processes mediating asynchronous audiovisual speech perception are affected by manipulations that only alter the quality of the visual component of the speech signal. PMID:27551361

  8. Influential sources affecting Bangkok adolescent body image perceptions.

    PubMed

    Thianthai, Chulanee

    2006-01-01

    The study of body image-related problems in non-Western countries is still very limited. Thus, this study aims to identify the main influential sources and show how they affect the body image perceptions of Bangkok adolescents. The researcher recruited 400 Thai male and female adolescents in Bangkok, attending high school through the university freshman level and ranging from 16 to 19 years of age, to participate in this study. Survey questionnaires were distributed to every student and follow-up interviews were conducted with 40 students. The findings showed that there are eight main influential sources, ranked from most to least influential: magazines, television, peer group, family, fashion trends, the opposite gender, self-realization and health knowledge. As in studies conducted in Western countries, more than half of the total influence came from mass media and peer groups. Bangkok adolescents also internalized Western ideals of beauty through these mass media channels. Similar to studies conducted in the West, the processes by which these influential sources affect adolescent body image perception were comparable, with the exception of the familial source. In conclusion, identifying the main influential sources and understanding how they affect adolescent body image perceptions can help prevent adolescents from holding unhealthy views and taking risky measures toward their bodies. More studies conducted in non-Western countries are needed in order to build culturally sensitive programs catered to the body image problems occurring in adolescents within a particular society. PMID:17340854

  9. The dynamics of visual perception pictures of stroboscope

    NASA Astrophysics Data System (ADS)

    Zhytaryuk, V. G.

    2015-11-01

    This paper investigates the physical principles underlying the visual perception of flickering images of the spokes of a wheel rotating under alternating and steady reflected light fields. The results make it possible to interpret observations of the stroboscopic effect for the rotating spoked wheels of cars, aircraft propellers, and household fans. It is established that these effects are observable only under artificial illumination by fluorescent, discharge, or pulsed light sources. A "capture" phenomenon was also found, that is, the observation of individual spokes as apparently stationary at frequencies far exceeding the published visual integration time of about 0.1 s (10 Hz); capture was established at frequencies up to and including 50 Hz. This result is not described in the scientific literature and currently lacks a theoretical explanation.

  10. Suppressive mechanisms in visual motion processing: From perception to intelligence.

    PubMed

    Tadin, Duje

    2015-10-01

    Perception operates on an immense amount of incoming information that greatly exceeds the brain's processing capacity. Because of this fundamental limitation, the ability to suppress irrelevant information is a key determinant of perceptual efficiency. Here, I will review a series of studies investigating suppressive mechanisms in visual motion processing, namely perceptual suppression of large, background-like motions. These spatial suppression mechanisms are adaptive, operating only when sensory inputs are sufficiently robust to guarantee visibility. Converging correlational and causal evidence links these behavioral results with inhibitory center-surround mechanisms, namely those in cortical area MT. Spatial suppression is abnormally weak in several special populations, including the elderly and individuals with schizophrenia, a deficit that is evidenced by better-than-normal direction discriminations of large moving stimuli. Theoretical work shows that this abnormal weakening of spatial suppression should result in motion segregation deficits, but direct behavioral support of this hypothesis is lacking. Finally, I will argue that the ability to suppress information is a fundamental neural process that applies not only to perception but also to cognition in general. Supporting this argument, I will discuss recent research that shows individual differences in spatial suppression of motion signals strongly predict individual variations in IQ scores. PMID:26299386

  11. Constructing Visual Perception of Body Movement with the Motor Cortex

    PubMed Central

    Orgs, Guido; Dovern, Anna; Hagura, Nobuhiro; Haggard, Patrick; Fink, Gereon R.; Weiss, Peter H.

    2016-01-01

    The human brain readily perceives fluent movement from static input. Using functional magnetic resonance imaging, we investigated brain mechanisms that mediate fluent apparent biological motion (ABM) perception from sequences of body postures. We presented body and nonbody stimuli varying in objective sequence duration and fluency of apparent movement. Three body postures were ordered to produce a fluent (ABC) or a nonfluent (ACB) apparent movement. This enabled us to identify brain areas involved in the perceptual reconstruction of body movement from identical lower-level static input. Participants judged the duration of a rectangle containing body/nonbody sequences, as an implicit measure of movement fluency. For body stimuli, fluent apparent motion sequences produced subjectively longer durations than nonfluent sequences of the same objective duration. This difference was reduced for nonbody stimuli. This body-specific bias in duration perception was associated with increased blood oxygen level-dependent responses in the primary (M1) and supplementary motor areas. Moreover, fluent ABM was associated with increased functional connectivity between M1/SMA and right fusiform body area. We show that perceptual reconstruction of fluent movement from static body postures does not merely enlist areas traditionally associated with visual body processing, but involves cooperative recruitment of motor areas, consistent with a “motor way of seeing”. PMID:26534907

  12. Professors' Facebook content affects students' perceptions and expectations.

    PubMed

    Sleigh, Merry J; Smith, Aimee W; Laboe, Jason

    2013-07-01

    Facebook users must make choices about level of self-disclosure, and this self-disclosure can influence perceptions of the profile's author. We examined whether the specific type of self-disclosure on a professor's profile would affect students' perceptions of the professor and expectations of his classroom. We created six Facebook profiles for a fictitious male professor, each with a specific emphasis: politically conservative, politically liberal, religious, family oriented, socially oriented, or professional. Undergraduate students randomly viewed one profile and responded to questions that assessed their perceptions and expectations. The social professor was perceived as less skilled but more popular, while his profile was perceived as inappropriate and entertaining. Students reacted more strongly and negatively to the politically focused profiles in comparison to the religious, family, and professional profiles. Students reported being most interested in professional information on a professor's Facebook profile, yet they reported being least influenced by the professional profile. In general, students expressed neutrality about their interest in finding and friending professors on Facebook. These findings suggest that students have the potential to form perceptions about the classroom environment and about their professors based on the specific details disclosed in professors' Facebook profiles. PMID:23614794

  13. Visual perception of upright: Head tilt, visual errors and viewing eye

    PubMed Central

    Kheradmand, Amir; Gonzalez, Grisel; Otero-Millan, Jorge; Lasker, Adrian

    2016-01-01

    BACKGROUND Perception of upright is often assessed by aligning a luminous line to the subjective visual vertical (SVV). OBJECTIVE Here we investigated the effects of visual line rotation and viewing eye on SVV responses and whether there was any change with head tilt. METHODS SVV was measured using a forced-choice paradigm and by combining the following conditions in 22 healthy subjects: head position (20° left tilt, upright and 20° right tilt), viewing eye (left eye, both eyes and right eye) and direction of visual line rotation (clockwise [CW] and counter clockwise [CCW]). RESULTS The accuracy and precision of SVV responses were not different between the viewing eye conditions in all head positions (P > 0.05, Kruskal-Wallis test). The accuracy of SVV responses was different between the CW and CCW line rotations (p ≈ 0.0001; Kruskal-Wallis test) and SVV was tilted in the same direction as the line rotation. This effect of line rotation was however not consistent across head tilts and was only present in the upright and right tilt head positions. The accuracy of SVV responses showed a higher variability among subjects in the left head tilt position with no significant difference between the CW and CCW line rotations (P > 0.05; post-hoc Dunn’s test). CONCLUSIONS In spite of the challenges to the estimate of upright with head tilt, normal subjects did remarkably well irrespective of the viewing eye. The physiological significance of the asymmetry in the effect of line rotation between the head tilt positions is unclear but it suggests a lateralizing effect of head tilt on the visual perception of upright. PMID:26890421

  14. Modeling of image perception and discrimination by the visually impaired

    NASA Astrophysics Data System (ADS)

    Benguigui, Avi; Efron, Uzi

    2006-08-01

    An Image Transceiver-based goggle has been under development at Ben Gurion University and the Holon Institute. The device, aimed at low-vision aid applications [1], is based on a unique LCOS-CMOS Image Transceiver Device (ITD), which is capable of combining the functions of imaging and display in a single chip. The head-mounted goggle will allow the capture of ambient scenery, the necessary image enhancement and processing, and the redirection of the image to the healthy part of the patient's retina. In this presentation we report on the modeling of the imaging, image perception, and discrimination capabilities of the visually impaired. The first part of the study models the spatial frequency response and contrast sensitivity, analyzing the two main cases of central and peripheral vision loss. Studies of the effects of both retinal eccentricity and illumination level on the low-vision spatial frequency response are described. The second part of the modeling incorporates an image discrimination model to assess the ability of the visually impaired, as described by the low-vision model outlined above, to discriminate between two nearly identical images.
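
    "Modeling the spatial frequency response" here refers to a contrast sensitivity function (CSF). As a placeholder for the low-vision CSF models mentioned in the abstract (which are not reproduced in this record), the sketch below evaluates the generic Mannos-Sakrison CSF, a common normal-vision approximation.

        import numpy as np

        def csf_mannos_sakrison(f_cpd):
            """Normalized contrast sensitivity vs spatial frequency (cycles/deg),
            using the Mannos & Sakrison (1974) fit."""
            f = np.asarray(f_cpd, dtype=float)
            return 2.6 * (0.0192 + 0.114 * f) * np.exp(-(0.114 * f) ** 1.1)

        for f in [0.5, 1, 2, 4, 8, 16, 32]:
            print(f"{f:5.1f} c/deg -> relative sensitivity {csf_mannos_sakrison(f):.3f}")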

  15. Talker variability in audio-visual speech perception

    PubMed Central

    Heald, Shannon L. M.; Nusbaum, Howard C.

    2014-01-01

    A change in talker is a change in the context for the phonetic interpretation of acoustic patterns of speech. Different talkers have different mappings between acoustic patterns and phonetic categories and listeners need to adapt to these differences. Despite this complexity, listeners are adept at comprehending speech in multiple-talker contexts, albeit at a slight but measurable performance cost (e.g., slower recognition). So far, this talker variability cost has been demonstrated only in audio-only speech. Other research in single-talker contexts has shown, however, that when listeners are able to see a talker’s face, speech recognition is improved under adverse listening (e.g., noise or distortion) conditions that can increase uncertainty in the mapping between acoustic patterns and phonetic categories. Does seeing a talker’s face reduce the cost of word recognition in multiple-talker contexts? We used a speeded word-monitoring task in which listeners make quick judgments about target word recognition in single- and multiple-talker contexts. Results show faster recognition performance in single-talker conditions compared to multiple-talker conditions for both audio-only and audio-visual speech. However, recognition time in a multiple-talker context was slower in the audio-visual condition compared to the audio-only condition. These results suggest that seeing a talker’s face during speech perception may slow recognition by increasing the importance of talker identification, signaling to the listener that a change in talker has occurred. PMID:25076919

  16. In the eye of the beholder: Visual biases in package and portion size perceptions.

    PubMed

    Ordabayeva, Nailya; Chandon, Pierre

    2016-08-01

    As the sizes of food packages and portions have changed rapidly over the past decades, it has become crucial to understand how consumers perceive and respond to changes in size. Existing evidence suggests that consumers make errors when visually estimating package and portion sizes, and these errors significantly influence subsequent food choices and intake. We outline four visual biases (arising from the underestimation of increasing portion sizes, the dimensionality of the portion size change, labeling effects, and consumer affect) that shape consumers' perceptions of package and portion sizes. We discuss the causes of these biases, review their impact on food consumption decisions, and suggest concrete strategies to reduce them and to promote healthier eating. We conclude with a discussion of important theoretical and practical issues that should be addressed in the future. PMID:26482283

  17. The perception of affective touch in anorexia nervosa.

    PubMed

    Crucianelli, Laura; Cardi, Valentina; Treasure, Janet; Jenkinson, Paul M; Fotopoulou, Aikaterini

    2016-05-30

    Anorexia nervosa (AN) is a disorder characterized by restricted eating, fears of gaining weight, and body image distortions. The etiology remains unknown; however, impairments in social cognition and reward circuits contribute to the onset and maintenance of the disorder. One possibility is that AN is associated with reduced perceived pleasantness during social interactions. We therefore examined the perception of interpersonal 'affective touch' and its social modulation in AN. We measured the perceived pleasantness of light, dynamic stroking touches applied to the forearm of 25 AN patients and 30 healthy controls, delivered at C-tactile (CT) afferent-optimal (3 cm/s) and non-optimal (18 cm/s) velocities, while simultaneously displaying images of faces showing rejecting, neutral and accepting expressions. CT-optimal touch, but not CT non-optimal touch, elicited significantly lower pleasantness ratings in AN patients compared with healthy controls. Pleasantness ratings were modulated by facial expressions in both groups in a similar fashion; namely, presenting socially accepting faces increased the perception of touch pleasantness more than neutral and rejecting faces. Our findings suggest that individuals with AN have a disordered, CT-based affective touch system. This impairment may be linked to their weakened interoceptive perception and distorted body representation. PMID:27137964

  18. Visual exposure to large and small portion sizes and perceptions of portion size normality: Three experimental studies.

    PubMed

    Robinson, Eric; Oldham, Melissa; Cuckson, Imogen; Brunstrom, Jeffrey M; Rogers, Peter J; Hardman, Charlotte A

    2016-03-01

    Portion sizes of many foods have increased in recent times. In three studies we examined the effect that repeated visual exposure to larger versus smaller food portion sizes has on perceptions of what constitutes a normal-sized food portion and measures of portion size selection. In studies 1 and 2 participants were visually exposed to images of large or small portions of spaghetti bolognese, before making evaluations about an image of an intermediate sized portion of the same food. In study 3 participants were exposed to images of large or small portions of a snack food before selecting a portion size of snack food to consume. Across the three studies, visual exposure to larger as opposed to smaller portion sizes resulted in participants considering a normal portion of food to be larger than a reference intermediate sized portion. In studies 1 and 2 visual exposure to larger portion sizes also increased the size of self-reported ideal meal size. In study 3 visual exposure to larger portion sizes of a snack food did not affect how much of that food participants subsequently served themselves and ate. Visual exposure to larger portion sizes may adjust visual perceptions of what constitutes a 'normal' sized portion. However, we did not find evidence that visual exposure to larger portions altered snack food intake. PMID:26702602

  19. Visual exposure to large and small portion sizes and perceptions of portion size normality: Three experimental studies

    PubMed Central

    Robinson, Eric; Oldham, Melissa; Cuckson, Imogen; Brunstrom, Jeffrey M.; Rogers, Peter J.; Hardman, Charlotte A.

    2016-01-01

    Portion sizes of many foods have increased in recent times. In three studies we examined the effect that repeated visual exposure to larger versus smaller food portion sizes has on perceptions of what constitutes a normal-sized food portion and measures of portion size selection. In studies 1 and 2 participants were visually exposed to images of large or small portions of spaghetti bolognese, before making evaluations about an image of an intermediate sized portion of the same food. In study 3 participants were exposed to images of large or small portions of a snack food before selecting a portion size of snack food to consume. Across the three studies, visual exposure to larger as opposed to smaller portion sizes resulted in participants considering a normal portion of food to be larger than a reference intermediate sized portion. In studies 1 and 2 visual exposure to larger portion sizes also increased the size of self-reported ideal meal size. In study 3 visual exposure to larger portion sizes of a snack food did not affect how much of that food participants subsequently served themselves and ate. Visual exposure to larger portion sizes may adjust visual perceptions of what constitutes a ‘normal’ sized portion. However, we did not find evidence that visual exposure to larger portions altered snack food intake. PMID:26702602

  20. PERCEPT Indoor Navigation System for the Blind and Visually Impaired: Architecture and Experimentation

    PubMed Central

    Ganz, Aura; Schafer, James; Gandhi, Siddhesh; Puleo, Elaine; Wilson, Carole; Robertson, Meg

    2012-01-01

    We introduce PERCEPT system, an indoor navigation system for the blind and visually impaired. PERCEPT will improve the quality of life and health of the visually impaired community by enabling independent living. Using PERCEPT, blind users will have independent access to public health facilities such as clinics, hospitals, and wellness centers. Access to healthcare facilities is crucial for this population due to the multiple health conditions that they face such as diabetes and its complications. PERCEPT system trials with 24 blind and visually impaired users in a multistory building show PERCEPT system effectiveness in providing appropriate navigation instructions to these users. The uniqueness of our system is that it is affordable and that its design follows orientation and mobility principles. We hope that PERCEPT will become a standard deployed in all indoor public spaces, especially in healthcare and wellness facilities. PMID:23316225

  1. PERCEPT Indoor Navigation System for the Blind and Visually Impaired: Architecture and Experimentation.

    PubMed

    Ganz, Aura; Schafer, James; Gandhi, Siddhesh; Puleo, Elaine; Wilson, Carole; Robertson, Meg

    2012-01-01

    We introduce PERCEPT system, an indoor navigation system for the blind and visually impaired. PERCEPT will improve the quality of life and health of the visually impaired community by enabling independent living. Using PERCEPT, blind users will have independent access to public health facilities such as clinics, hospitals, and wellness centers. Access to healthcare facilities is crucial for this population due to the multiple health conditions that they face such as diabetes and its complications. PERCEPT system trials with 24 blind and visually impaired users in a multistory building show PERCEPT system effectiveness in providing appropriate navigation instructions to these users. The uniqueness of our system is that it is affordable and that its design follows orientation and mobility principles. We hope that PERCEPT will become a standard deployed in all indoor public spaces, especially in healthcare and wellness facilities. PMID:23316225

  2. Affect of the unconscious: visually suppressed angry faces modulate our decisions.

    PubMed

    Almeida, Jorge; Pajtas, Petra E; Mahon, Bradford Z; Nakayama, Ken; Caramazza, Alfonso

    2013-03-01

    Emotional and affective processing imposes itself over cognitive processes and modulates our perception of the surrounding environment. In two experiments, we addressed the issue of whether nonconscious processing of affect can take place even under deep states of unawareness, such as those induced by interocular suppression techniques, and can elicit an affective response that can influence our understanding of the surrounding environment. In Experiment 1, participants judged the likeability of an unfamiliar item--a Chinese character--that was preceded by a face expressing a particular emotion (either happy or angry). The face was rendered invisible through an interocular suppression technique (continuous flash suppression; CFS). In Experiment 2, backward masking (BM), a less robust masking technique, was used to render the facial expressions invisible. We found that despite equivalent phenomenological suppression of the visual primes under CFS and BM, different patterns of affective processing were obtained with the two masking techniques. Under BM, nonconscious affective priming was obtained for both happy and angry invisible facial expressions. However, under CFS, nonconscious affective priming was obtained only for angry facial expressions. We discuss an interpretation of this dissociation between affective processing and visual masking techniques in terms of distinct routes from the retina to the amygdala. PMID:23224765

  3. Assistive Technology Competencies of Teachers of Students with Visual Impairments: A Comparison of Perceptions

    ERIC Educational Resources Information Center

    Zhou, Li; Smith, Derrick W.; Parker, Amy T.; Griffin-Shirley, Nora

    2011-01-01

    This study surveyed teachers of students with visual impairments in Texas on their perceptions of a set of assistive technology competencies developed for teachers of students with visual impairments by Smith and colleagues (2009). Differences in opinion between practicing teachers of students with visual impairments and Smith's group of…

  4. Categorical perception of affective and linguistic facial expressions

    PubMed Central

    McCullough, Stephen; Emmorey, Karen

    2009-01-01

    Two experiments investigated categorical perception (CP) effects for affective facial expressions and linguistic facial expressions from American Sign Language (ASL) for Deaf native signers and hearing non-signers. Facial expressions were presented in isolation (Experiment 1) or in an ASL verb context (Experiment 2). Participants performed ABX discrimination and identification tasks on morphed affective and linguistic facial expression continua. The continua were created by morphing end-point photo exemplars into 11 images, changing linearly from one expression to another in equal steps. For both affective and linguistic expressions, hearing non-signers exhibited better discrimination across category boundaries than within categories for both experiments, thus replicating previous results with affective expressions and demonstrating CP effects for non-canonical facial expressions. Deaf signers, however, showed significant CP effects only for linguistic facial expressions. Subsequent analyses indicated that order of presentation influenced signers' response time performance for affective facial expressions: viewing linguistic facial expressions first slowed response time for affective facial expressions. We conclude that CP effects for affective facial expressions can be influenced by language experience. PMID:19111287

  5. No-reference quality assessment based on visual perception

    NASA Astrophysics Data System (ADS)

    Li, Junshan; Yang, Yawei; Hu, Shuangyan; Zhang, Jiao

    2014-11-01

    The visual quality assessment of images/videos is an ongoing hot research topic, which has become more and more important for numerous image and video processing applications with the rapid development of digital imaging and communication technologies. The goal of image quality assessment (IQA) algorithms is to automatically assess the quality of images/videos in agreement with human quality judgments. Up to now, two kinds of models have been used for IQA, namely full-reference (FR) and no-reference (NR) models. For FR models, IQA algorithms interpret image quality as fidelity or similarity to a perfect image in some perceptual space. However, the reference image is not available in many practical applications, and a NR IQA approach is desired. Considering natural vision as optimized by millions of years of evolutionary pressure, many methods attempt to achieve consistency in quality prediction by modeling salient physiological and psychological features of the human visual system (HVS). To reach this goal, researchers try to simulate the HVS with image sparse coding and supervised machine learning, two main features of the HVS: a typical HVS captures scenes by sparse coding and uses experienced knowledge to perceive objects. In this paper, we propose a novel IQA approach based on visual perception. Firstly, a standard model of the HVS is studied and analyzed, and the sparse representation of the image is computed with this model; then, the mapping between sparse codes and subjective quality scores is trained with the regression technique of least squares support vector machine (LS-SVM), yielding a regressor that can predict image quality; finally, the visual quality of an image is predicted with the trained regressor. We validate the performance of the proposed approach on the Laboratory for Image and Video Engineering (LIVE) database; the specific distortion types present in the database are: 227 images of JPEG2000, 233
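
    A minimal sketch of the sparse-coding-plus-regression pipeline described above is given below, using scikit-learn. The dictionary size, patch sampling, and random training data are placeholders, and kernel ridge regression (a close relative of LS-SVM regression) stands in for the LS-SVM regressor named in the abstract; this illustrates the general approach, not the authors' implementation.

        import numpy as np
        from sklearn.decomposition import MiniBatchDictionaryLearning
        from sklearn.feature_extraction.image import extract_patches_2d
        from sklearn.kernel_ridge import KernelRidge

        def sparse_feature(image, dico, patch_size=(8, 8), n_patches=200, seed=0):
            """Describe an image by the mean absolute sparse code over random patches."""
            patches = extract_patches_2d(image, patch_size,
                                         max_patches=n_patches, random_state=seed)
            X = patches.reshape(len(patches), -1)
            X = X - X.mean(axis=1, keepdims=True)      # remove local mean luminance
            return np.abs(dico.transform(X)).mean(axis=0)

        rng = np.random.default_rng(0)
        train_images = [rng.random((64, 64)) for _ in range(20)]   # placeholder images
        train_scores = rng.uniform(20, 80, size=20)                # placeholder quality scores

        # Stage 1: learn a patch dictionary (the sparse-coding model of the HVS)
        patches = np.vstack([extract_patches_2d(im, (8, 8), max_patches=200, random_state=0)
                             .reshape(200, -1) for im in train_images])
        patches = patches - patches.mean(axis=1, keepdims=True)
        dico = MiniBatchDictionaryLearning(n_components=64, alpha=1.0,
                                           batch_size=256, random_state=0).fit(patches)

        # Stage 2: regress sparse-code features onto subjective quality scores
        features = np.array([sparse_feature(im, dico) for im in train_images])
        regressor = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1).fit(features, train_scores)

        test_image = rng.random((64, 64))
        print("predicted quality:", float(regressor.predict([sparse_feature(test_image, dico)])[0]))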

  6. A preliminary study on how hypohydration affects pain perception.

    PubMed

    Bear, Tracey; Philipp, Michael; Hill, Stephen; Mündel, Toby

    2016-05-01

    Chronic pain is a prevalent health issue with one in five people suffering from some form of chronic pain, with loss of productivity and medical costs of chronic pain considerable. However, the treatment of pain can be difficult, as pain perception is complex and can be affected by factors other than tissue damage. This study investigated the effect of hypohydration (mild, voluntary dehydration from ∼24 h of limiting fluid intake, mimicking someone drinking less than usual) on a person's pain perception. Seventeen healthy males (age 27 ± 5 years) visited the laboratory on three occasions, once as a familiarization and then twice again while either euhydrated (urine specific gravity: 1.008 ± 0.005) or hypohydrated (urine specific gravity: 1.024 ± 0.003, and -1.4 ± 0.9% body mass). Each visit, they performed a cold pressor test, where their feet were placed in cold water (0-3°C) for a maximum of 4 min. Measures of hydration status, pain sensitivity, pain threshold, and catastrophization were taken. We found that hypohydration predicted increased pain sensitivity (β = 0.43), trait pain catastrophizing, and baseline pain sensitivity (β = 0.37 and 0.47, respectively). These results are consistent with previous research, and suggest that a person's hydration status may be an important factor in their perception of acute pain. PMID:26785699

  7. Interaction Between Optical and Neural Factors Affecting Visual Performance

    NASA Astrophysics Data System (ADS)

    Sabesan, Ramkumar

    The human eye suffers from higher order aberrations, in addition to conventional spherical and cylindrical refractive errors. Advanced optical techniques have been devised to correct them in order to achieve superior retinal image quality. However, vision is not completely defined by the optical quality of the eye, but also depends on how the image quality is processed by the neural system. In particular, how neural processing is affected by the past visual experience with optical blur has remained largely unexplored. The objective of this thesis was to investigate the interaction of optical and neural factors affecting vision. To achieve this goal, pathological keratoconic eyes were chosen as the ideal population to study since they are severely afflicted by degraded retinal image quality due to higher order aberrations and their neural system has been exposed to that habitually for a long period of time. Firstly, we have developed advanced customized ophthalmic lenses for correcting the higher order aberration of keratoconic eyes and demonstrated their feasibility in providing substantial visual benefit over conventional corrective methodologies. However, the achieved visual benefit was significantly smaller than that predicted optically. To better understand this, the second goal of the thesis was set to investigate if the neural system optimizes its underlying mechanisms in response to the long-term visual experience with large magnitudes of higher order aberrations. This study was facilitated by a large-stroke adaptive optics vision simulator, enabling us to access the neural factors in the visual system by manipulating the limit imposed by the optics of the eye. Using this instrument, we have performed a series of experiments to establish that habitual exposure to optical blur leads to an alteration in neural processing thereby alleviating the visual impact of degraded retinal image quality, referred to as neural compensation. However, it was also found that

  8. Person perception informs understanding of cognition during visual search.

    PubMed

    Brennan, Allison A; Watson, Marcus R; Kingstone, Alan; Enns, James T

    2011-08-01

    Does person perception--the impressions we form from watching others--hold clues to the mental states of people engaged in cognitive tasks? We investigated this with a two-phase method: In Phase 1, participants searched on a computer screen (Experiment 1) or in an office (Experiment 2); in Phase 2, other participants rated the searchers' video-recorded behavior. The results showed that blind raters are sensitive to individual differences in search proficiency and search strategy, as well as to environmental factors affecting search difficulty. Also, different behaviors were linked to search success in each setting: Eye movement frequency predicted successful search on a computer screen; head movement frequency predicted search success in an office. In both settings, an active search strategy and positive emotional expressions were linked to search success. These data indicate that person perception informs cognition beyond the scope of performance measures, offering the potential for new measurements of cognition that are both rich and unobtrusive. PMID:21626239

  9. Sound frequency affects speech emotion perception: results from congenital amusia.

    PubMed

    Lolli, Sydney L; Lewenstein, Ari D; Basurto, Julian; Winnik, Sean; Loui, Psyche

    2015-01-01

    Congenital amusics, or "tone-deaf" individuals, show difficulty in perceiving and producing small pitch differences. While amusia has marked effects on music perception, its impact on speech perception is less clear. Here we test the hypothesis that individual differences in pitch perception affect judgment of emotion in speech, by applying low-pass filters to spoken statements of emotional speech. A norming study was first conducted on Mechanical Turk to ensure that the intended emotions from the Macquarie Battery for Evaluation of Prosody were reliably identifiable by US English speakers. The most reliably identified emotional speech samples were used in Experiment 1, in which subjects performed a psychophysical pitch discrimination task, and an emotion identification task under low-pass and unfiltered speech conditions. Results showed a significant correlation between pitch-discrimination threshold and emotion identification accuracy for low-pass filtered speech, with amusics (defined here as those with a pitch discrimination threshold >16 Hz) performing worse than controls. This relationship with pitch discrimination was not seen in unfiltered speech conditions. Given the dissociation between low-pass filtered and unfiltered speech conditions, we inferred that amusics may be compensating for poorer pitch perception by using speech cues that are filtered out in this manipulation. To assess this potential compensation, Experiment 2 was conducted using high-pass filtered speech samples intended to isolate non-pitch cues. No significant correlation was found between pitch discrimination and emotion identification accuracy for high-pass filtered speech. Results from these experiments suggest an influence of low frequency information in identifying emotional content of speech. PMID:26441718
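
    The low-pass speech manipulation referenced above can be reproduced in outline with a zero-phase Butterworth filter. In the sketch below the cutoff frequency is an assumed illustrative value, not the one used in the study, and the waveform is synthetic.

        import numpy as np
        from scipy.signal import butter, sosfiltfilt

        def low_pass_speech(signal, fs, cutoff_hz=500.0, order=4):
            """Zero-phase Butterworth low-pass filter for a mono waveform.
            cutoff_hz is illustrative; the study's exact cutoff is not given here."""
            sos = butter(order, cutoff_hz, btype="low", fs=fs, output="sos")
            return sosfiltfilt(sos, signal)

        fs = 16000
        t = np.arange(0, 1.0, 1.0 / fs)
        # Toy "speech": a low-frequency component plus high-frequency energy
        speech = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 3000 * t)
        filtered = low_pass_speech(speech, fs)
        print(f"RMS before: {np.sqrt(np.mean(speech**2)):.3f}, after: {np.sqrt(np.mean(filtered**2)):.3f}")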

  10. Sound frequency affects speech emotion perception: results from congenital amusia

    PubMed Central

    Lolli, Sydney L.; Lewenstein, Ari D.; Basurto, Julian; Winnik, Sean; Loui, Psyche

    2015-01-01

    Congenital amusics, or “tone-deaf” individuals, show difficulty in perceiving and producing small pitch differences. While amusia has marked effects on music perception, its impact on speech perception is less clear. Here we test the hypothesis that individual differences in pitch perception affect judgment of emotion in speech, by applying low-pass filters to spoken statements of emotional speech. A norming study was first conducted on Mechanical Turk to ensure that the intended emotions from the Macquarie Battery for Evaluation of Prosody were reliably identifiable by US English speakers. The most reliably identified emotional speech samples were used in Experiment 1, in which subjects performed a psychophysical pitch discrimination task, and an emotion identification task under low-pass and unfiltered speech conditions. Results showed a significant correlation between pitch-discrimination threshold and emotion identification accuracy for low-pass filtered speech, with amusics (defined here as those with a pitch discrimination threshold >16 Hz) performing worse than controls. This relationship with pitch discrimination was not seen in unfiltered speech conditions. Given the dissociation between low-pass filtered and unfiltered speech conditions, we inferred that amusics may be compensating for poorer pitch perception by using speech cues that are filtered out in this manipulation. To assess this potential compensation, Experiment 2 was conducted using high-pass filtered speech samples intended to isolate non-pitch cues. No significant correlation was found between pitch discrimination and emotion identification accuracy for high-pass filtered speech. Results from these experiments suggest an influence of low frequency information in identifying emotional content of speech. PMID:26441718

  11. The Perceptual Root of Object-Based Storage: An Interactive Model of Perception and Visual Working Memory

    ERIC Educational Resources Information Center

    Gao, Tao; Gao, Zaifeng; Li, Jie; Sun, Zhongqiang; Shen, Mowei

    2011-01-01

    Mainstream theories of visual perception assume that visual working memory (VWM) is critical for integrating online perceptual information and constructing coherent visual experiences in changing environments. Given the dynamic interaction between online perception and VWM, we propose that how visual information is processed during visual…

  12. The Developmental Test of Visual Perception-Third Edition (DTVP-3): A Review, Critique, and Practice Implications

    ERIC Educational Resources Information Center

    Brown, Ted; Murdolo, Yuki

    2015-01-01

    The "Developmental Test of Visual Perception-Third Edition" (DTVP-3) is a recent revision of the "Developmental Test of Visual Perception-Second Edition" (DTVP-2). The DTVP-3 is designed to assess the visual perceptual and/or visual-motor integration skills of children from 4 to 12 years of age. The test is standardized using…

  13. Colour Terms Affect Detection of Colour and Colour-Associated Objects Suppressed from Visual Awareness

    PubMed Central

    Forder, Lewis; Taylor, Olivia; Mankin, Helen; Scott, Ryan B.; Franklin, Anna

    2016-01-01

    The idea that language can affect how we see the world continues to create controversy. A potentially important study in this field has shown that when an object is suppressed from visual awareness using continuous flash suppression (a form of binocular rivalry), detection of the object is differently affected by a preceding word prime depending on whether the prime matches or does not match the object. This may suggest that language can affect early stages of vision. We replicated this paradigm and further investigated whether colour terms likewise influence the detection of colours or colour-associated object images suppressed from visual awareness by continuous flash suppression. This method presents rapidly changing visual noise to one eye while the target stimulus is presented to the other. It has been shown to delay conscious perception of a target for up to several minutes. In Experiment 1 we presented greyscale photos of objects. They were either preceded by a congruent object label, an incongruent label, or white noise. Detection sensitivity (d’) and hit rates were significantly poorer for suppressed objects preceded by an incongruent label compared to a congruent label or noise. In Experiment 2, targets were coloured discs preceded by a colour term. Detection sensitivity was significantly worse for suppressed colour patches preceded by an incongruent colour term as compared to a congruent term or white noise. In Experiment 3 targets were suppressed greyscale object images preceded by an auditory presentation of a colour term. On congruent trials the colour term matched the object’s stereotypical colour and on incongruent trials the colour term mismatched. Detection sensitivity was significantly poorer on incongruent trials than congruent trials. Overall, these findings suggest that colour terms affect awareness of coloured stimuli and colour-associated objects, and provide new evidence for language-perception interaction in the brain. PMID:27023274

  14. Colour Terms Affect Detection of Colour and Colour-Associated Objects Suppressed from Visual Awareness.

    PubMed

    Forder, Lewis; Taylor, Olivia; Mankin, Helen; Scott, Ryan B; Franklin, Anna

    2016-01-01

    The idea that language can affect how we see the world continues to create controversy. A potentially important study in this field has shown that when an object is suppressed from visual awareness using continuous flash suppression (a form of binocular rivalry), detection of the object is differently affected by a preceding word prime depending on whether the prime matches or does not match the object. This may suggest that language can affect early stages of vision. We replicated this paradigm and further investigated whether colour terms likewise influence the detection of colours or colour-associated object images suppressed from visual awareness by continuous flash suppression. This method presents rapidly changing visual noise to one eye while the target stimulus is presented to the other. It has been shown to delay conscious perception of a target for up to several minutes. In Experiment 1 we presented greyscale photos of objects. They were either preceded by a congruent object label, an incongruent label, or white noise. Detection sensitivity (d') and hit rates were significantly poorer for suppressed objects preceded by an incongruent label compared to a congruent label or noise. In Experiment 2, targets were coloured discs preceded by a colour term. Detection sensitivity was significantly worse for suppressed colour patches preceded by an incongruent colour term as compared to a congruent term or white noise. In Experiment 3 targets were suppressed greyscale object images preceded by an auditory presentation of a colour term. On congruent trials the colour term matched the object's stereotypical colour and on incongruent trials the colour term mismatched. Detection sensitivity was significantly poorer on incongruent trials than congruent trials. Overall, these findings suggest that colour terms affect awareness of coloured stimuli and colour-associated objects, and provide new evidence for language-perception interaction in the brain. PMID:27023274
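
    For readers unfamiliar with the detection sensitivity (d') measure reported above, a brief sketch of the standard signal-detection computation follows; the trial counts and the loglinear-style correction are illustrative assumptions, not the authors' analysis.

      # Sketch of a standard signal-detection computation of d' (not the
      # authors' analysis code); the trial counts below are invented.
      from scipy.stats import norm

      def d_prime(hits, misses, false_alarms, correct_rejections):
          # Loglinear-style correction avoids infinite z-scores at rates of 0 or 1.
          hit_rate = (hits + 0.5) / (hits + misses + 1.0)
          fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
          return norm.ppf(hit_rate) - norm.ppf(fa_rate)

      print(d_prime(hits=38, misses=12, false_alarms=9, correct_rejections=41))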

  15. Interactive effects of musical and visual cues on time perception: an application to waiting lines in banks.

    PubMed

    Chebat, J C; Gelinas-Chebat, C; Filiatrault, P

    1993-12-01

    This study explores the interactive effects of musical and visual cues on time perception in a specific situation, that of waiting in a bank. Videotapes are employed to simulate the situation; a 2 x 3 factorial design (N = 427) is used: 2 (high vs low) amounts of visual information and 2 (fast vs slow) levels of musical tempo in addition to a no-music condition. Two mediating variables, mood and attention, are tested in the relation between the independent variables (musical and visual) and the dependent variable (perceived waiting time). Results of multivariate analysis of variance and a system of simultaneous equations show that musical and visual cues do not have symmetrical effects: the musical tempo has a global (moderating) effect on the whole structure of the relations between dependent, independent, and mediating variables but has no direct influence on time perception. The visual cues affect time perception, the significance of which depends on musical tempo. Also, the "Resource Allocation Model of Time Estimation" predicts the attention-time relation better than Ornstein's "storage-size theory." Mood state serves as a substitute for time information with slow music, but its effects are cancelled with fast music. PMID:8284188
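
    As a rough illustration only (the study itself used MANOVA and a system of simultaneous equations), a 2 x 3 between-subjects design like this one could be analyzed with a two-way ANOVA; the file and column names below are assumptions.

      # Illustrative two-way ANOVA sketch only; the original analysis used MANOVA
      # and simultaneous equations. File and column names are assumptions.
      import pandas as pd
      import statsmodels.formula.api as smf
      from statsmodels.stats.anova import anova_lm

      df = pd.read_csv("waiting_study.csv")   # columns: perceived_wait, visual_info, music
      model = smf.ols("perceived_wait ~ C(visual_info) * C(music)", data=df).fit()
      print(anova_lm(model, typ=2))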

  16. Differentiation of Competence and Affect Self-Perceptions in Elementary School Students: Extending Empirical Evidence

    ERIC Educational Resources Information Center

    Arens, A. Katrin; Hasselhorn, Marcus

    2015-01-01

    This study aimed to address two underexplored research questions regarding support for the separation between competence and affect self-perceptions due to differential relations to outcome criteria. First, it is tested whether higher relations between affect self-perceptions and effort than between competence self-perceptions and effort can also…

  17. Combining spatial and temporal expectations to improve visual perception

    PubMed Central

    Rohenkohl, Gustavo; Gould, Ian C.; Pessoa, Jéssica; Nobre, Anna C.

    2014-01-01

    The importance of temporal expectations in modulating perceptual functions is increasingly recognized. However, the means through which temporal expectations can bias perceptual information processing remains ill understood. Recent theories propose that modulatory effects of temporal expectations rely on the co-existence of other biases based on receptive-field properties, such as spatial location. We tested whether perceptual benefits of temporal expectations in a perceptually demanding psychophysical task depended on the presence of spatial expectations. Foveally presented symbolic arrow cues indicated simultaneously where (location) and when (time) target events were more likely to occur. The direction of the arrow indicated target location (80% validity), while its color (pink or blue) indicated the interval (80% validity) for target appearance. Our results confirmed a strong synergistic interaction between temporal and spatial expectations in enhancing visual discrimination. Temporal expectation significantly boosted the effectiveness of spatial expectation in sharpening perception. However, benefits for temporal expectation disappeared when targets occurred at unattended locations. Our findings suggest that anticipated receptive-field properties of targets provide a natural template upon which temporal expectations can operate in order to help prioritize goal-relevant events from early perceptual stages. PMID:24722562
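
    A sketch of how a trial list with independently 80% valid spatial and temporal cues might be generated; the trial count and factor labels are assumptions, not details from the paper.

      # Sketch of trial generation with independently 80% valid spatial and
      # temporal cues; the number of trials and labels are arbitrary assumptions.
      import random

      def make_trials(n=200, validity=0.8):
          trials = []
          for _ in range(n):
              cued_side = random.choice(["left", "right"])
              cued_interval = random.choice(["early", "late"])
              # Targets follow the cue on ~80% of trials, independently per dimension.
              target_side = cued_side if random.random() < validity else (
                  "right" if cued_side == "left" else "left")
              target_interval = cued_interval if random.random() < validity else (
                  "late" if cued_interval == "early" else "early")
              trials.append({"cued_side": cued_side, "cued_interval": cued_interval,
                             "target_side": target_side, "target_interval": target_interval})
          return trials

      trials = make_trials()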

  18. Evaluating visual perception for assessing reconstructed flap health

    PubMed Central

    Ponticorvo, Adrien; Taydas, Eren; Mazhar, Amaan; Ellstrom, Christopher L.; Rimler, Jonathan; Scholz, Thomas; Tong, June; Evans, Gregory R.D.; Cuccia, David J.

    2015-01-01

    Background: Detecting failing tissue flaps before they are clinically apparent has the potential to improve post-operative flap management and salvage rates. This study demonstrates a model to quantitatively compare clinical appearance, as recorded via digital camera, with spatial frequency domain imaging (SFDI), a non-invasive imaging technique utilizing patterned illumination to generate images of total hemoglobin and tissue oxygen saturation. Methods: Using a swine pedicle model where blood flow was carefully controlled with occlusion cuffs and monitored with ultrasound probes, throughput was reduced by 25%, 50%, 75%, and 100% of baseline values in either the artery or the vein of each of the flaps. The color changes recorded by a digital camera were quantified in order to predict which occlusion levels were visible to the human eye. SFDI was also used to quantify the changes in physiological parameters including total hemoglobin and oxygen saturation associated with each occlusion. Results: There were no statistically significant changes in color above the noticeable perception levels associated with human vision during any of the occlusion levels. However, there were statistically significant changes in total hemoglobin and tissue oxygen saturation levels detected at the 50%, 75%, and 100% occlusion levels for arterial and venous occlusions. Conclusion: As demonstrated by the color imaging data, visual flap changes are difficult to detect until significant occlusion has occurred. SFDI is capable of detecting changes in total hemoglobin and tissue oxygen saturation as a result of partial occlusions before they are perceivable, thereby potentially improving response times and salvage rates. PMID:25935469
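
    One generic way to quantify the colour change between baseline and post-occlusion photographs is a per-pixel CIELAB colour difference (ΔE); the sketch below shows that approach under the assumption of ordinary 8-bit RGB images, and is not necessarily the metric used in the study.

      # Generic sketch: per-pixel CIELAB colour difference between two RGB photos
      # of the same flap; not necessarily the metric used in the paper.
      import numpy as np
      from skimage import io, color

      baseline = color.rgb2lab(io.imread("flap_baseline.png")[:, :, :3] / 255.0)  # assumes 8-bit RGB
      occluded = color.rgb2lab(io.imread("flap_occluded.png")[:, :, :3] / 255.0)

      delta_e = color.deltaE_ciede2000(baseline, occluded)   # per-pixel colour difference
      print("mean ΔE:", float(np.mean(delta_e)))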

  19. Focal Length Affects Depicted Shape and Perception of Facial Images

    PubMed Central

    Třebický, Vít; Fialová, Jitka; Kleisner, Karel; Havlíček, Jan

    2016-01-01

    Static photographs are currently the most often employed stimuli in research on social perception. The method of photograph acquisition might affect the depicted subject’s facial appearance and thus also the impression of such stimuli. An important factor influencing the resulting photograph is focal length, as different focal lengths produce various levels of image distortion. Here we tested whether different focal lengths (50, 85, 105 mm) affect depicted shape and perception of female and male faces. We collected three portrait photographs of 45 (22 females, 23 males) participants under standardized conditions and camera setting varying only in the focal length. Subsequently, the three photographs from each individual were shown on screen in a randomized order using a 3-alternative forced-choice paradigm. The images were judged for attractiveness, dominance, and femininity/masculinity by 369 raters (193 females, 176 males). Facial width-to-height ratio (fWHR) was measured from each photograph and overall facial shape was analysed employing geometric morphometric methods (GMM). Our results showed that photographs taken with 50 mm focal length were rated as significantly less feminine/masculine, attractive, and dominant compared to the images taken with longer focal lengths. Further, shorter focal lengths produced faces with smaller fWHR. Subsequent GMM revealed focal length significantly affected overall facial shape of the photographed subjects. Thus methodology of photograph acquisition, focal length in this case, can significantly affect results of studies using photographic stimuli perhaps due to different levels of perspective distortion that influence shapes and proportions of morphological traits. PMID:26894832

  20. Focal Length Affects Depicted Shape and Perception of Facial Images.

    PubMed

    Třebický, Vít; Fialová, Jitka; Kleisner, Karel; Havlíček, Jan

    2016-01-01

    Static photographs are currently the most often employed stimuli in research on social perception. The method of photograph acquisition might affect the depicted subject's facial appearance and thus also the impression of such stimuli. An important factor influencing the resulting photograph is focal length, as different focal lengths produce various levels of image distortion. Here we tested whether different focal lengths (50, 85, 105 mm) affect depicted shape and perception of female and male faces. We collected three portrait photographs of 45 (22 females, 23 males) participants under standardized conditions and camera setting varying only in the focal length. Subsequently, the three photographs from each individual were shown on screen in a randomized order using a 3-alternative forced-choice paradigm. The images were judged for attractiveness, dominance, and femininity/masculinity by 369 raters (193 females, 176 males). Facial width-to-height ratio (fWHR) was measured from each photograph and overall facial shape was analysed employing geometric morphometric methods (GMM). Our results showed that photographs taken with 50 mm focal length were rated as significantly less feminine/masculine, attractive, and dominant compared to the images taken with longer focal lengths. Further, shorter focal lengths produced faces with smaller fWHR. Subsequent GMM revealed focal length significantly affected overall facial shape of the photographed subjects. Thus methodology of photograph acquisition, focal length in this case, can significantly affect results of studies using photographic stimuli perhaps due to different levels of perspective distortion that influence shapes and proportions of morphological traits. PMID:26894832
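
    The facial width-to-height ratio (fWHR) mentioned above is commonly computed as bizygomatic width divided by upper-face height; a sketch follows, with invented landmark coordinates and the caveat that the paper's exact landmark definitions may differ.

      # Sketch of an fWHR computation from manually placed landmarks; the
      # coordinates are invented and the paper's landmark definitions may differ.
      import numpy as np

      landmarks = {                      # (x, y) pixel coordinates, hypothetical
          "left_zygion": (112.0, 240.0),
          "right_zygion": (388.0, 243.0),
          "brow": (250.0, 180.0),
          "upper_lip": (251.0, 365.0),
      }

      width = np.linalg.norm(np.subtract(landmarks["right_zygion"], landmarks["left_zygion"]))
      height = np.linalg.norm(np.subtract(landmarks["upper_lip"], landmarks["brow"]))
      print(f"fWHR = {width / height:.2f}")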

  1. A color fusion method of infrared and low-light-level images based on visual perception

    NASA Astrophysics Data System (ADS)

    Han, Jing; Yan, Minmin; Zhang, Yi; Bai, Lianfa

    2014-11-01

    Color fusion images can be obtained through the fusion of infrared and low-light-level images and contain the information of both. The fusion images can help observers to understand the multichannel images comprehensively. However, simple fusion may lose target information because targets are inconspicuous in long-distance infrared and low-light-level images; and if target extraction is applied blindly, perception of the scene information is seriously affected. To solve this problem, a new fusion method based on visual perception is proposed in this paper. The extraction of visual targets ("what" information) and a parallel processing mechanism are incorporated into traditional color fusion methods. Infrared and low-light-level color fusion images are obtained based on efficient learning of typical targets. Experimental results show the effectiveness of the proposed method. The fusion images produced by our algorithm not only improve the detection rate of targets but also retain rich natural information about the scenes.
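
    The paper's method additionally learns typical targets, which is not reproduced here; as a point of reference, a common baseline for infrared and low-light-level colour fusion simply maps the two channels into an RGB image, sketched below with placeholder file names.

      # Baseline channel-mapping fusion sketch (the paper's method also learns
      # typical targets, which is not shown). File names are placeholders.
      import numpy as np
      from skimage import io, img_as_float

      lll = img_as_float(io.imread("lowlight.png", as_gray=True))   # low-light-level image in [0, 1]
      ir = img_as_float(io.imread("infrared.png", as_gray=True))    # infrared image in [0, 1]

      fused = np.dstack([ir,                          # R: infrared highlights warm targets
                         lll,                         # G: low-light-level scene detail
                         np.clip(lll - ir, 0, 1)])    # B: de-emphasizes IR-bright regions
      io.imsave("fused.png", (fused * 255).astype(np.uint8))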

  2. A Rhodopsin-Guanylyl Cyclase Gene Fusion Functions in Visual Perception in a Fungus

    PubMed Central

    Avelar, Gabriela M.; Schumacher, Robert I.; Zaini, Paulo A.; Leonard, Guy; Richards, Thomas A.; Gomes, Suely L.

    2014-01-01

    Summary Sensing light is the fundamental property of visual systems, with vision in animals being based almost exclusively on opsin photopigments [1]. Rhodopsin also acts as a photoreceptor linked to phototaxis in green algae [2, 3] and has been implicated by chemical means as a light sensor in the flagellated swimming zoospores of the fungus Allomyces reticulatus [4]; however, the signaling mechanism in these fungi remains unknown. Here we use a combination of genome sequencing and molecular inhibition experiments with light-sensing phenotype studies to examine the signaling pathway involved in visual perception in the closely related fungus Blastocladiella emersonii. Our data show that in these fungi, light perception is accomplished by the function of a novel gene fusion (BeGC1) of a type I (microbial) rhodopsin domain and guanylyl cyclase catalytic domain. Photobleaching of rhodopsin function prevents accumulation of cGMP levels and phototaxis of fungal zoospores exposed to green light, whereas inhibition of guanylyl cyclase activity negatively affects fungal phototaxis. Immunofluorescence microscopy localizes the BeGC1 protein to the external surface of the zoospore eyespot positioned close to the base of the swimming flagellum [4, 5], demonstrating this is a photoreceptive organelle composed of lipid droplets. Taken together, these data indicate that Blastocladiomycota fungi have a cGMP signaling pathway involved in phototaxis similar to the vertebrate vision-signaling cascade but composed of protein domain components arranged as a novel gene fusion architecture and of distant evolutionary ancestry to type II rhodopsins of animals. PMID:24835457

  3. A rhodopsin-guanylyl cyclase gene fusion functions in visual perception in a fungus.

    PubMed

    Avelar, Gabriela M; Schumacher, Robert I; Zaini, Paulo A; Leonard, Guy; Richards, Thomas A; Gomes, Suely L

    2014-06-01

    Sensing light is the fundamental property of visual systems, with vision in animals being based almost exclusively on opsin photopigments [1]. Rhodopsin also acts as a photoreceptor linked to phototaxis in green algae [2, 3] and has been implicated by chemical means as a light sensor in the flagellated swimming zoospores of the fungus Allomyces reticulatus [4]; however, the signaling mechanism in these fungi remains unknown. Here we use a combination of genome sequencing and molecular inhibition experiments with light-sensing phenotype studies to examine the signaling pathway involved in visual perception in the closely related fungus Blastocladiella emersonii. Our data show that in these fungi, light perception is accomplished by the function of a novel gene fusion (BeGC1) of a type I (microbial) rhodopsin domain and guanylyl cyclase catalytic domain. Photobleaching of rhodopsin function prevents accumulation of cGMP levels and phototaxis of fungal zoospores exposed to green light, whereas inhibition of guanylyl cyclase activity negatively affects fungal phototaxis. Immunofluorescence microscopy localizes the BeGC1 protein to the external surface of the zoospore eyespot positioned close to the base of the swimming flagellum [4, 5], demonstrating this is a photoreceptive organelle composed of lipid droplets. Taken together, these data indicate that Blastocladiomycota fungi have a cGMP signaling pathway involved in phototaxis similar to the vertebrate vision-signaling cascade but composed of protein domain components arranged as a novel gene fusion architecture and of distant evolutionary ancestry to type II rhodopsins of animals. PMID:24835457

  4. Lesions to right posterior parietal cortex impair visual depth perception from disparity but not motion cues.

    PubMed

    Murphy, Aidan P; Leopold, David A; Humphreys, Glyn W; Welchman, Andrew E

    2016-06-19

    The posterior parietal cortex (PPC) is understood to be active when observers perceive three-dimensional (3D) structure. However, it is not clear how central this activity is in the construction of 3D spatial representations. Here, we examine whether PPC is essential for two aspects of visual depth perception by testing patients with lesions affecting this region. First, we measured subjects' ability to discriminate depth structure in various 3D surfaces and objects using binocular disparity. Patients with lesions to right PPC (N = 3) exhibited marked perceptual deficits on these tasks, whereas those with left hemisphere lesions (N = 2) were able to reliably discriminate depth as accurately as control subjects. Second, we presented an ambiguous 3D stimulus defined by structure from motion to determine whether PPC lesions influence the rate of bistable perceptual alternations. Patients' percept durations for the 3D stimulus were generally within a normal range, although the two patients with bilateral PPC lesions showed the fastest perceptual alternation rates in our sample. Intermittent stimulus presentation reduced the reversal rate similarly across subjects. Together, the results suggest that PPC plays a causal role in both inferring and maintaining the perception of 3D structure with stereopsis supported primarily by the right hemisphere, but do not lend support to the view that PPC is a critical contributor to bistable perceptual alternations. This article is part of the themed issue 'Vision in our three-dimensional world'. PMID:27269606

  5. Lesions to right posterior parietal cortex impair visual depth perception from disparity but not motion cues

    PubMed Central

    Leopold, David A.; Humphreys, Glyn W.; Welchman, Andrew E.

    2016-01-01

    The posterior parietal cortex (PPC) is understood to be active when observers perceive three-dimensional (3D) structure. However, it is not clear how central this activity is in the construction of 3D spatial representations. Here, we examine whether PPC is essential for two aspects of visual depth perception by testing patients with lesions affecting this region. First, we measured subjects' ability to discriminate depth structure in various 3D surfaces and objects using binocular disparity. Patients with lesions to right PPC (N = 3) exhibited marked perceptual deficits on these tasks, whereas those with left hemisphere lesions (N = 2) were able to reliably discriminate depth as accurately as control subjects. Second, we presented an ambiguous 3D stimulus defined by structure from motion to determine whether PPC lesions influence the rate of bistable perceptual alternations. Patients' percept durations for the 3D stimulus were generally within a normal range, although the two patients with bilateral PPC lesions showed the fastest perceptual alternation rates in our sample. Intermittent stimulus presentation reduced the reversal rate similarly across subjects. Together, the results suggest that PPC plays a causal role in both inferring and maintaining the perception of 3D structure with stereopsis supported primarily by the right hemisphere, but do not lend support to the view that PPC is a critical contributor to bistable perceptual alternations. This article is part of the themed issue ‘Vision in our three-dimensional world’. PMID:27269606

  6. Seeing Is the Hardest Thing to See: Using Illusions to Teach Visual Perception

    ERIC Educational Resources Information Center

    Riener, Cedar

    2015-01-01

    This chapter describes three examples of using illusions to teach visual perception. The illusions present ways for students to change their perspective regarding how their eyes work and also offer opportunities to question assumptions regarding their approach to knowledge.

  7. Visualizing the Perception Filter and Breaching It with Active-Learning Strategies

    ERIC Educational Resources Information Center

    White, Harold B.

    2012-01-01

    Teachers' perception filter operates in all realms of their consciousness. It plays an important part in what and how students learn and should play a central role in what and how they teach. This may be obvious, but having a visual model of a perception filter can guide the way they think about education. In this article, the author talks about…

  8. To See or Not to See: Analyzing Difficulties in Geometry from the Perspective of Visual Perception

    ERIC Educational Resources Information Center

    Gal, Hagar; Linchevski, Liora

    2010-01-01

    In this paper, we consider theories about processes of visual perception and perception-based knowledge representation (VPR) in order to explain difficulties encountered in figural processing in junior high school geometry tasks. In order to analyze such difficulties, we take advantage of the following perspectives of VPR: (1) Perceptual…

  9. Optical images of visible and invisible percepts in the primary visual cortex of primates

    PubMed Central

    Macknik, Stephen L.; Haglund, Michael M.

    1999-01-01

    We optically imaged a visual masking illusion in primary visual cortex (area V-1) of rhesus monkeys to ask whether activity in the early visual system more closely reflects the physical stimulus or the generated percept. Visual illusions can be a powerful way to address this question because they have the benefit of dissociating the stimulus from perception. We used an illusion in which a flickering target (a bar oriented in visual space) is rendered invisible by two counter-phase flickering bars, called masks, which flank and abut the target. The target and masks, when shown separately, each generated correlated activity on the surface of the cortex. During the illusory condition, however, optical signals generated in the cortex by the target disappeared although the image of the masks persisted. The optical image thus was correlated with perception but not with the physical stimulus. PMID:10611363

  10. Dance and Music in "Gangnam Style": How Dance Observation Affects Meter Perception.

    PubMed

    Lee, Kyung Myun; Barrett, Karen Chan; Kim, Yeonhwa; Lim, Yeoeun; Lee, Kyogu

    2015-01-01

    Dance and music often co-occur as evidenced when viewing choreographed dances or singers moving while performing. This study investigated how the viewing of dance motions shapes sound perception. Previous research has shown that dance reflects the temporal structure of its accompanying music, communicating musical meter (i.e. a hierarchical organization of beats) via coordinated movement patterns that indicate where strong and weak beats occur. Experiments here investigated the effects of dance cues on meter perception, hypothesizing that dance could embody the musical meter, thereby shaping participant reaction times (RTs) to sound targets occurring at different metrical positions. In experiment 1, participants viewed a video with dance choreography indicating 4/4 meter (dance condition) or a series of color changes repeated in sequences of four to indicate 4/4 meter (picture condition). A sound track accompanied these videos and participants reacted to timbre targets at different metrical positions. Participants had the slowest RTs at the strongest beats in the dance condition only. In experiment 2, participants viewed the choreography of the horse-riding dance from Psy's "Gangnam Style" in order to examine how a familiar dance might affect meter perception. Moreover, participants in this experiment were divided into a group with experience dancing this choreography and a group without experience. Results again showed slower RTs to stronger metrical positions and the group with experience demonstrated a more refined perception of metrical hierarchy. Results likely stem from the temporally selective division of attention between auditory and visual domains. This study has implications for understanding: 1) the impact of splitting attention among different sensory modalities, and 2) the impact of embodiment, on perception of musical meter. Viewing dance may interfere with sound processing, particularly at critical metrical positions, but embodied familiarity with

  11. Dance and Music in “Gangnam Style”: How Dance Observation Affects Meter Perception

    PubMed Central

    Lee, Kyung Myun; Barrett, Karen Chan; Kim, Yeonhwa; Lim, Yeoeun; Lee, Kyogu

    2015-01-01

    Dance and music often co-occur as evidenced when viewing choreographed dances or singers moving while performing. This study investigated how the viewing of dance motions shapes sound perception. Previous research has shown that dance reflects the temporal structure of its accompanying music, communicating musical meter (i.e. a hierarchical organization of beats) via coordinated movement patterns that indicate where strong and weak beats occur. Experiments here investigated the effects of dance cues on meter perception, hypothesizing that dance could embody the musical meter, thereby shaping participant reaction times (RTs) to sound targets occurring at different metrical positions. In experiment 1, participants viewed a video with dance choreography indicating 4/4 meter (dance condition) or a series of color changes repeated in sequences of four to indicate 4/4 meter (picture condition). A sound track accompanied these videos and participants reacted to timbre targets at different metrical positions. Participants had the slowest RTs at the strongest beats in the dance condition only. In experiment 2, participants viewed the choreography of the horse-riding dance from Psy’s “Gangnam Style” in order to examine how a familiar dance might affect meter perception. Moreover, participants in this experiment were divided into a group with experience dancing this choreography and a group without experience. Results again showed slower RTs to stronger metrical positions and the group with experience demonstrated a more refined perception of metrical hierarchy. Results likely stem from the temporally selective division of attention between auditory and visual domains. This study has implications for understanding: 1) the impact of splitting attention among different sensory modalities, and 2) the impact of embodiment, on perception of musical meter. Viewing dance may interfere with sound processing, particularly at critical metrical positions, but embodied

  12. 3D Shape Perception in Posterior Cortical Atrophy: A Visual Neuroscience Perspective

    PubMed Central

    Gillebert, Céline R.; Schaeverbeke, Jolien; Bastin, Christine; Neyens, Veerle; Bruffaerts, Rose; De Weer, An-Sofie; Seghers, Alexandra; Sunaert, Stefan; Van Laere, Koen; Versijpt, Jan; Vandenbulcke, Mathieu; Salmon, Eric; Todd, James T.; Orban, Guy A.

    2015-01-01

    Posterior cortical atrophy (PCA) is a rare focal neurodegenerative syndrome characterized by progressive visuoperceptual and visuospatial deficits, most often due to atypical Alzheimer's disease (AD). We applied insights from basic visual neuroscience to analyze 3D shape perception in humans affected by PCA. Thirteen PCA patients and 30 matched healthy controls participated, together with two patient control groups with diffuse Lewy body dementia (DLBD) and an amnestic-dominant phenotype of AD, respectively. The hierarchical study design consisted of 3D shape processing for 4 cues (shading, motion, texture, and binocular disparity) with corresponding 2D and elementary feature extraction control conditions. PCA and DLBD exhibited severe 3D shape-processing deficits and AD to a lesser degree. In PCA, deficient 3D shape-from-shading was associated with volume loss in the right posterior inferior temporal cortex. This region coincided with a region of functional activation during 3D shape-from-shading in healthy controls. In PCA patients who performed the same fMRI paradigm, response amplitude during 3D shape-from-shading was reduced in this region. Gray matter volume in this region also correlated with 3D shape-from-shading in AD. 3D shape-from-disparity in PCA was associated with volume loss slightly more anteriorly in posterior inferior temporal cortex as well as in ventral premotor cortex. The findings in right posterior inferior temporal cortex and right premotor cortex are consistent with neurophysiologically based models of the functional anatomy of 3D shape processing. However, in DLBD, 3D shape deficits rely on mechanisms distinct from inferior temporal structural integrity. SIGNIFICANCE STATEMENT Posterior cortical atrophy (PCA) is a neurodegenerative syndrome characterized by progressive visuoperceptual dysfunction and most often an atypical presentation of Alzheimer's disease (AD) affecting the ventral and dorsal visual streams rather than the medial

  13. Biometric Research in Perception and Neurology Related to the Study of Visual Communication.

    ERIC Educational Resources Information Center

    Metallinos, Nikos

    Contemporary research findings in the fields of perceptual psychology and neurology of the human brain that are directly related to the study of visual communication are reviewed and briefly discussed in this paper. Specifically, the paper identifies those major research findings in visual perception that are relevant to the study of visual…

  14. Validity and Reliability of the Developmental Test of Visual Perception - Third Edition (DTVP-3).

    PubMed

    Brown, Ted

    2016-07-01

    The Developmental Test of Visual Perception - Third Edition (DTVP-3) is a recently published revision of a visual perceptual test from the United States, frequently used by occupational therapists. It is important that tests have adequate documented reliability and validity and are evaluated in cross-cultural contexts. The purpose of the study was to assess the reliability and validity of the DTVP-3 when completed by a group of Australian participants. Thirty-nine typically developing children 6-8 years of age completed the DTVP-3 and the Developmental Test of Visual-Motor Integration - 6th edition (VMI-6). The internal consistency of the DTVP-3 was assessed using Cronbach alpha coefficients, and the DTVP-3's convergent validity was examined by correlating it with the VMI-6 and its two supplementary tests. The five DTVP-3 subscales' Cronbach alpha coefficients ranged from .60 to .80, while its three composite indexes had coefficients all at the .80 level. The VMI-6 was significantly correlated with the DTVP-3 Figure Ground and Visual Closure subscales and the Motor-Reduced Visual Perception Index (MRVPI). The VMI-6 Visual Perception Supplementary Test was significantly correlated with the DTVP-3 Figure Ground, Visual Closure, Form Constancy, MRVPI, and General Visual Perception Index. The DTVP-3 exhibited acceptable levels of internal consistency and moderate levels of convergent validity with the VMI-6 when completed by a group of Australian children. PMID:26913939
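
    A sketch of how a Cronbach alpha coefficient of the kind reported above can be computed from a respondents-by-items score matrix; the data here are simulated for illustration, not DTVP-3 scores.

      # Sketch of a Cronbach alpha computation for one subscale from a
      # respondents-by-items score matrix; the data below are simulated.
      import numpy as np

      def cronbach_alpha(scores):
          scores = np.asarray(scores, dtype=float)        # shape: (respondents, items)
          k = scores.shape[1]
          item_vars = scores.var(axis=0, ddof=1).sum()
          total_var = scores.sum(axis=1).var(ddof=1)
          return (k / (k - 1.0)) * (1.0 - item_vars / total_var)

      rng = np.random.default_rng(0)
      trait = rng.normal(size=(39, 1))                     # latent ability per child
      items = trait + rng.normal(scale=0.8, size=(39, 8))  # 8 correlated item scores
      print(round(cronbach_alpha(items), 2))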

  15. Behind Mathematical Learning Disabilities: What about Visual Perception and Motor Skills?

    ERIC Educational Resources Information Center

    Pieters, Stefanie; Desoete, Annemie; Roeyers, Herbert; Vanderswalmen, Ruth; Van Waelvelde, Hilde

    2012-01-01

    In a sample of 39 children with mathematical learning disabilities (MLD) and 106 typically developing controls belonging to three control groups of three different ages, we found that visual perception, motor skills and visual-motor integration explained a substantial proportion of the variance in either number fact retrieval or procedural…

  16. Parents' Perceptions of Physical Activity for Their Children with Visual Impairments

    ERIC Educational Resources Information Center

    Perkins, Kara; Columna, Luis; Lieberman, Lauren; Bailey, JoEllen

    2013-01-01

    Introduction: Ongoing communication with parents and the acknowledgment of their preferences and expectations are crucial to promote the participation of physical activity by children with visual impairments. Purpose: The study presented here explored parents' perceptions of physical activity for their children with visual impairments and explored…

  17. Effects of Positive Affect on Risk Perceptions in Adolescence and Young Adulthood

    ERIC Educational Resources Information Center

    Haase, Claudia M.; Silbereisen, Rainer K.

    2011-01-01

    Affective influences may play a key role in adolescent risk taking, but have rarely been studied. Using an audiovisual method of affect induction, two experimental studies examined the effect of positive affect on risk perceptions in adolescence and young adulthood. Outcomes were risk perceptions regarding drinking alcohol, smoking a cigarette,…

  18. Perceptions of chemistry: Why is the common perception of chemistry, the most visual of sciences, so distorted?

    NASA Astrophysics Data System (ADS)

    Habraken, Clarisse L.

    1996-09-01

    Chemistry has evolved from a science dominated by mathematics into a science highly dependent on spatial-visual intelligence. Yet the chemical content of introductory courses is still taught essentially as it was 40-50 years ago. Chemistry today is recognized by chemists as the molecular science. Yet school chemistry is alienated from that perception. Thanks to the computer, young people are more comfortable with visual imaging than their instructors were at the same age. Thus the time is ripe to reinvigorate chemistry education by means of the visual-spatial approach, an approach wholly in conformance with the way modern chemistry is thought about and practiced.

  19. Integrative Processing of Touch and Affect in Social Perception: An fMRI Study.

    PubMed

    Ebisch, Sjoerd J H; Salone, Anatolia; Martinotti, Giovanni; Carlucci, Leonardo; Mantini, Dante; Perrucci, Mauro G; Saggino, Aristide; Romani, Gian Luca; Di Giannantonio, Massimo; Northoff, Georg; Gallese, Vittorio

    2016-01-01

    Social perception commonly employs multiple sources of information. The present study aimed at investigating the integrative processing of affective social signals. Task-related and task-free functional magnetic resonance imaging was performed in 26 healthy adult participants during a social perception task concerning dynamic visual stimuli simultaneously depicting facial expressions of emotion and tactile sensations that could be either congruent or incongruent. Confounding effects due to affective valence, inhibitory top-down influences, cross-modal integration, and conflict processing were minimized. The results showed that the perception of congruent, compared to incongruent stimuli, elicited enhanced neural activity in a set of brain regions including left amygdala, bilateral posterior cingulate cortex (PCC), and left superior parietal cortex. These congruency effects did not differ as a function of emotion or sensation. A complementary task-related functional interaction analysis preliminarily suggested that amygdala activity depended on previous processing stages in fusiform gyrus and PCC. The findings provide support for the integrative processing of social information about others' feelings from manifold bodily sources (sensory-affective information) in amygdala and PCC. Given that the congruent stimuli were also judged as being more self-related and more familiar in terms of personal experience in an independent sample of participants, we speculate that such integrative processing might be mediated by the linking of external stimuli with self-experience. Finally, the prediction of task-related responses in amygdala by intrinsic functional connectivity between amygdala and PCC during a task-free state implies a neuro-functional basis for an individual predisposition for the integrative processing of social stimulus content. PMID:27242474

  20. Integrative Processing of Touch and Affect in Social Perception: An fMRI Study

    PubMed Central

    Ebisch, Sjoerd J. H.; Salone, Anatolia; Martinotti, Giovanni; Carlucci, Leonardo; Mantini, Dante; Perrucci, Mauro G.; Saggino, Aristide; Romani, Gian Luca; Di Giannantonio, Massimo; Northoff, Georg; Gallese, Vittorio

    2016-01-01

    Social perception commonly employs multiple sources of information. The present study aimed at investigating the integrative processing of affective social signals. Task-related and task-free functional magnetic resonance imaging was performed in 26 healthy adult participants during a social perception task concerning dynamic visual stimuli simultaneously depicting facial expressions of emotion and tactile sensations that could be either congruent or incongruent. Confounding effects due to affective valence, inhibitory top–down influences, cross-modal integration, and conflict processing were minimized. The results showed that the perception of congruent, compared to incongruent stimuli, elicited enhanced neural activity in a set of brain regions including left amygdala, bilateral posterior cingulate cortex (PCC), and left superior parietal cortex. These congruency effects did not differ as a function of emotion or sensation. A complementary task-related functional interaction analysis preliminarily suggested that amygdala activity depended on previous processing stages in fusiform gyrus and PCC. The findings provide support for the integrative processing of social information about others’ feelings from manifold bodily sources (sensory-affective information) in amygdala and PCC. Given that the congruent stimuli were also judged as being more self-related and more familiar in terms of personal experience in an independent sample of participants, we speculate that such integrative processing might be mediated by the linking of external stimuli with self-experience. Finally, the prediction of task-related responses in amygdala by intrinsic functional connectivity between amygdala and PCC during a task-free state implies a neuro-functional basis for an individual predisposition for the integrative processing of social stimulus content. PMID:27242474

  1. Self-motion perception in autism is compromised by visual noise but integrated optimally across multiple senses.

    PubMed

    Zaidel, Adam; Goin-Kochel, Robin P; Angelaki, Dora E

    2015-05-19

    Perceptual processing in autism spectrum disorder (ASD) is marked by superior low-level task performance and inferior complex-task performance. This observation has led to theories of defective integration in ASD of local parts into a global percept. Despite mixed experimental results, this notion maintains widespread influence and has also motivated recent theories of defective multisensory integration in ASD. Impaired ASD performance in tasks involving classic random dot visual motion stimuli, corrupted by noise as a means to manipulate task difficulty, is frequently interpreted to support this notion of global integration deficits. By manipulating task difficulty independently of visual stimulus noise, here we test the hypothesis that heightened sensitivity to noise, rather than integration deficits, may characterize ASD. We found that although perception of visual motion through a cloud of dots was unimpaired without noise, the addition of stimulus noise significantly affected adolescents with ASD, more than controls. Strikingly, individuals with ASD demonstrated intact multisensory (visual-vestibular) integration, even in the presence of noise. Additionally, when vestibular motion was paired with pure visual noise, individuals with ASD demonstrated a different strategy than controls, marked by reduced flexibility. This result could be simulated by using attenuated (less reliable) and inflexible (not experience-dependent) Bayesian priors in ASD. These findings question widespread theories of impaired global and multisensory integration in ASD. Rather, they implicate increased sensitivity to sensory noise and less use of prior knowledge in ASD, suggesting increased reliance on incoming sensory information. PMID:25941373
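
    A toy sketch of the reliability-weighted (Bayesian) cue-combination idea behind the simulation described above; the means, standard deviations, and prior width are invented, and widening the prior mimics reduced use of prior knowledge.

      # Toy sketch (not the authors' model): precision-weighted combination of
      # visual and vestibular heading estimates with a Gaussian prior; widening
      # the prior (larger prior_sigma) mimics reduced reliance on prior knowledge.
      import numpy as np

      def combine(mu_vis, sigma_vis, mu_vest, sigma_vest, mu_prior=0.0, prior_sigma=5.0):
          precisions = np.array([1 / sigma_vis**2, 1 / sigma_vest**2, 1 / prior_sigma**2])
          means = np.array([mu_vis, mu_vest, mu_prior])
          post_var = 1.0 / precisions.sum()
          post_mean = post_var * (precisions * means).sum()
          return post_mean, np.sqrt(post_var)

      # Noisy visual cue (e.g., low-coherence dots) paired with a reliable vestibular cue.
      print(combine(mu_vis=4.0, sigma_vis=6.0, mu_vest=1.0, sigma_vest=2.0, prior_sigma=5.0))
      print(combine(mu_vis=4.0, sigma_vis=6.0, mu_vest=1.0, sigma_vest=2.0, prior_sigma=50.0))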

  2. Applying concepts of visual perception to formats of hospital menus.

    PubMed

    Fankhauser, W L; Vaden, A G; Konz, S A

    1975-11-01

    Standardized printed menu formats for all diets utilizing concepts of visual perception were evaluated in a machine-paced hospital tray-assembly process. Formats of existing menus differed among the various diets. On the redesigned menus, all menu items were arranged in basic groups which were assigned specific positions; groups were accentuated by white strips across the various color-coded selective menus; and accessory items were placed in specific, standard positions on all menus. Criteria for evaluating the effect of using the redesigned menu in tray assembly operations were: overall productivity, individual productivity, and error rate per tray. Data were charted (a) during a control period when the existing menu formats were used to provide baseline data and (b) during an experimental period when the redesigned menu formats were used. Overall productivity was measured by man-minutes per tray. Video tapes of five station operators servicing selected trays were made to study individual productivity. Station operator and checker accuracy were measured in terms of ratio of error-free trays, errors per tray, and errors to possibility of errors per tray. Man-minutes per tray decreased significantly in the experimental period from 2.44 to 2.17--a productivity increase of 11.1 per cent. The individual productivity analysis revealed no significant changes from control to experimental periods. Accuracy of the tray assembly station operators improved significantly. Decreases in ratio of mean number of errors to possibility of errors per tray were recorded in the experimental period. The error rate per tray decreased 44.9 per cent from 0.48 to 0.26, and the ratio of errors to possibility of errors per tray decreased from 6.3 to 3.5 per cent. The percentage of error-free trays rose from 69.9 to 80.9 per cent. Checkers' errors per tray did not change significantly from control to experimental period when data for the two periods were compared. This study provides a

  3. Visual adaptation of the perception of "life": animacy is a basic perceptual dimension of faces.

    PubMed

    Koldewyn, Kami; Hanus, Patricia; Balas, Benjamin

    2014-08-01

    One critical component of understanding another's mind is the perception of "life" in a face. However, little is known about the cognitive and neural mechanisms underlying this perception of animacy. Here, using a visual adaptation paradigm, we ask whether face animacy is (1) a basic dimension of face perception and (2) supported by a common neural mechanism across distinct face categories defined by age and species. Observers rated the perceived animacy of adult human faces before and after adaptation to (1) adult faces, (2) child faces, and (3) dog faces. When testing the perception of animacy in human faces, we found significant adaptation to both adult and child faces, but not dog faces. We did, however, find significant adaptation when morphed dog images and dog adaptors were used. Thus, animacy perception in faces appears to be a basic dimension of face perception that is species specific but not constrained by age categories. PMID:24323739

  4. Infants' Understanding of the Link between Visual Perception and Emotion: "If She Can't See Me Doing It, She Won't Get Angry."

    ERIC Educational Resources Information Center

    Repacholi, Betty M.; Meltzoff, Andrew N.; Olsen, Berit

    2008-01-01

    Two experiments investigated 18-month-olds' understanding of the link between visual perception and emotion. Infants watched an adult perform actions on objects. An emoter then expressed neutral affect or anger toward the adult in response to the adult's actions. Subsequently, infants were given 20 s to interact with each object. In Experiment 1,…

  5. NMDA receptor antagonist ketamine impairs feature integration in visual perception.

    PubMed

    Meuwese, Julia D I; van Loon, Anouk M; Scholte, H Steven; Lirk, Philipp B; Vulink, Nienke C C; Hollmann, Markus W; Lamme, Victor A F

    2013-01-01

    Recurrent interactions between neurons in the visual cortex are crucial for the integration of image elements into coherent objects, such as in figure-ground segregation of textured images. Blocking N-methyl-D-aspartate (NMDA) receptors in monkeys can abolish neural signals related to figure-ground segregation and feature integration. However, it is unknown whether this also affects perceptual integration itself. Therefore, we tested whether ketamine, a non-competitive NMDA receptor antagonist, reduces feature integration in humans. We administered a subanesthetic dose of ketamine to healthy subjects who performed a texture discrimination task in a placebo-controlled double blind within-subject design. We found that ketamine significantly impaired performance on the texture discrimination task compared to the placebo condition, while performance on a control fixation task was much less impaired. This effect is not merely due to task difficulty or a difference in sedation levels. We are the first to show a behavioral effect on feature integration by manipulating the NMDA receptor in humans. PMID:24223927

  6. NMDA Receptor Antagonist Ketamine Impairs Feature Integration in Visual Perception

    PubMed Central

    Meuwese, Julia D. I.; van Loon, Anouk M.; Scholte, H. Steven; Lirk, Philipp B.; Vulink, Nienke C. C.; Hollmann, Markus W.; Lamme, Victor A. F.

    2013-01-01

    Recurrent interactions between neurons in the visual cortex are crucial for the integration of image elements into coherent objects, such as in figure-ground segregation of textured images. Blocking N-methyl-D-aspartate (NMDA) receptors in monkeys can abolish neural signals related to figure-ground segregation and feature integration. However, it is unknown whether this also affects perceptual integration itself. Therefore, we tested whether ketamine, a non-competitive NMDA receptor antagonist, reduces feature integration in humans. We administered a subanesthetic dose of ketamine to healthy subjects who performed a texture discrimination task in a placebo-controlled double blind within-subject design. We found that ketamine significantly impaired performance on the texture discrimination task compared to the placebo condition, while performance on a control fixation task was much less impaired. This effect is not merely due to task difficulty or a difference in sedation levels. We are the first to show a behavioral effect on feature integration by manipulating the NMDA receptor in humans. PMID:24223927

  7. Data visualization optimization via computational modeling of perception.

    PubMed

    Pineo, Daniel; Ware, Colin

    2012-02-01

    We present a method for automatically evaluating and optimizing visualizations using a computational model of human vision. The method relies on a neural network simulation of early perceptual processing in the retina and primary visual cortex. The neural activity resulting from viewing flow visualizations is simulated and evaluated to produce a metric of visualization effectiveness. Visualization optimization is achieved by applying this effectiveness metric as the utility function in a hill-climbing algorithm. We apply this method to the evaluation and optimization of 2D flow visualizations, using two visualization parameterizations: streaklet-based and pixel-based. An emergent property of the streaklet-based optimization is head-to-tail streaklet alignment. It had been previously hypothesized that the effectiveness of head-to-tail alignment results from the perceptual processing of the visual system, but this theory had not been computationally modeled. A second optimization using a pixel-based parameterization produced a LIC-like result. The implications in terms of the selection of primitives are discussed. We argue that computational models can be used for optimizing complex visualizations. In addition, we argue that they can provide a means of computationally evaluating perceptual theories of visualization and serve as a method for quality control of display methods. PMID:21383402
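
    A schematic of the hill-climbing loop described above; the placeholder effectiveness function stands in for the utility derived from the neural simulation, which is not reproduced here, and the parameter vector is arbitrary.

      # Schematic hill-climbing loop; `effectiveness` is a stand-in for the
      # utility derived from the neural simulation of early vision (not reproduced).
      import random

      def effectiveness(params):
          # Placeholder utility peaking at an arbitrary "good" parameter setting.
          return -sum((p - 0.6) ** 2 for p in params)

      def hill_climb(params, step=0.05, iters=500):
          best, best_score = list(params), effectiveness(params)
          for _ in range(iters):
              candidate = [p + random.uniform(-step, step) for p in best]
              score = effectiveness(candidate)
              if score > best_score:          # accept only improving moves
                  best, best_score = candidate, score
          return best, best_score

      print(hill_climb([0.1, 0.9, 0.4]))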

  8. How interviewers' nonverbal behaviors can affect children's perceptions and suggestibility.

    PubMed

    Almerigogna, Jehanne; Ost, James; Akehurst, Lucy; Fluck, Mike

    2008-05-01

    We conducted two studies to examine how interviewers' nonverbal behaviors affect children's perceptions and suggestibility. In the first study, 42 8- to 10-year-olds watched video clips showing an interviewer displaying combinations of supportive and nonsupportive nonverbal behaviors and were asked to rate the interviewer on six attributes (e.g., friendliness, strictness). Smiling received high ratings on the positive attributes (i.e., friendly, helpful, and sincere), and fidgeting received high ratings on the negative attributes (i.e., strict, bored, and stressed). For the second study, 86 8- to 10-year-olds participated in a learning activity about the vocal cords. One week later, they were interviewed individually about the activity by an interviewer adopting either the supportive (i.e., smiling) or nonsupportive (i.e., fidgeting) behavior. Children questioned by the nonsupportive interviewer were less accurate and more likely to falsely report having been touched than were those questioned by the supportive interviewer. Children questioned by the supportive interviewer were also more likely to say that they did not know an answer than were children questioned by the nonsupportive interviewer. Participants in both conditions gave more correct answers to questions about central, as opposed to peripheral, details of the activity. Implications of these findings for the appropriate interviewing of child witnesses are discussed. PMID:18316091

  9. Early, but not late visual distractors affect movement synchronization to a temporal-spatial visual cue

    PubMed Central

    Booth, Ashley J.; Elliott, Mark T.

    2015-01-01

    The ease of synchronizing movements to a rhythmic cue is dependent on the modality of the cue presentation: timing accuracy is much higher when synchronizing with discrete auditory rhythms than an equivalent visual stimulus presented through flashes. However, timing accuracy is improved if the visual cue presents spatial as well as temporal information (e.g., a dot following an oscillatory trajectory). Similarly, when synchronizing with an auditory target metronome in the presence of a second visual distracting metronome, the distraction is stronger when the visual cue contains spatial-temporal information rather than temporal only. The present study investigates individuals’ ability to synchronize movements to a temporal-spatial visual cue in the presence of same-modality temporal-spatial distractors. Moreover, we investigated how increasing the number of distractor stimuli impacted on maintaining synchrony with the target cue. Participants made oscillatory vertical arm movements in time with a vertically oscillating white target dot centered on a large projection screen. The target dot was surrounded by 2, 8, or 14 distractor dots, which had an identical trajectory to the target but at a phase lead or lag of 0, 100, or 200 ms. We found participants’ timing performance was only affected in the phase-lead conditions and when there were large numbers of distractors present (8 and 14). This asymmetry suggests participants still rely on salient events in the stimulus trajectory to synchronize movements. Subsequently, distractions occurring in the window of attention surrounding those events have the maximum impact on timing performance. PMID:26157412
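
    A sketch of the stimulus kinematics described above: a vertically oscillating target and a distractor leading it by 100 ms; the oscillation frequency, amplitude, and refresh rate are assumptions rather than values from the paper.

      # Sketch of the stimulus kinematics: an oscillating target dot and a
      # distractor with a 100 ms phase lead; frequency, amplitude, and refresh
      # rate are illustrative assumptions, not values from the paper.
      import numpy as np

      freq_hz = 0.75                     # assumed oscillation frequency
      amp_px = 150.0                     # assumed amplitude in pixels
      lead_s = 0.100                     # distractor leads the target by 100 ms
      t = np.arange(0.0, 10.0, 1 / 120)  # 10 s sampled at an assumed 120 Hz refresh

      target_y = amp_px * np.sin(2 * np.pi * freq_hz * t)
      distractor_y = amp_px * np.sin(2 * np.pi * freq_hz * (t + lead_s))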

  10. Undergraduate nursing students' perceptions regarding factors that affect math abilities

    NASA Astrophysics Data System (ADS)

    Pyo, Katrina A.

    2011-07-01

    A review of the nursing literature reveals that many undergraduate nursing students lack proficiency with basic mathematical skills, those necessary for safe medication preparation and administration. Few studies exploring the phenomenon from the undergraduate nursing student perspective are reported in the nursing literature. The purpose of this study was to explore undergraduate nursing students’ perceptions of math abilities, factors that affect math abilities, the use of math in nursing, and the extent to which specific math skills were addressed throughout a nursing curriculum. Polya’s Model for Problem Solving and Bloom’s Taxonomy of Educational Objectives (Affective Domain) served as the theoretical background for the study. Qualitative and quantitative methods were utilized to obtain data from a purposive sample of undergraduate nursing students from a private university in western Pennsylvania. Participants were selected based on their proficiency with math skills, as determined by their scores on Elsevier’s HESI™ Admission Assessment (A2) Exam, Math Portion. Ten students from the “Excellent” benchmark group and eleven students from the “Needing Additional Assistance or Improvement” benchmark group participated in one-on-one, semi-structured interviews and completed a 25-item, 4-point Likert scale survey that rated confidence levels with specific math skills and the extent to which these skills were perceived to be addressed in the nursing curriculum. Responses from the two benchmark groups were compared and contrasted. Eight themes emerged from the qualitative data. Findings related to mathematical approach and confidence levels with specific math skills were determined to be statistically significant.

  11. How Perceptions of an Intervention Program Affect Outcomes

    ERIC Educational Resources Information Center

    Forneris, Tanya; Danish, Steven J.; Fries, Elizabeth

    2009-01-01

    Goals for Health was a National Cancer Institute funded program designed to impact health behaviors of adolescents living in rural Virginia and New York. This study examined three specific objectives: (a) to examine participants' perceptions of the program components and the relationship between program components and overall program perception,…

  12. Working Memory Enhances Visual Perception: Evidence from Signal Detection Analysis

    ERIC Educational Resources Information Center

    Soto, David; Wriglesworth, Alice; Bahrami-Balani, Alex; Humphreys, Glyn W.

    2010-01-01

    We show that perceptual sensitivity to visual stimuli can be modulated by matches between the contents of working memory (WM) and stimuli in the visual field. Observers were presented with an object cue (to hold in WM or to merely attend) and subsequently had to identify a brief target presented within a colored shape. The cue could be…

  13. Parallel and Serial Grouping of Image Elements in Visual Perception

    ERIC Educational Resources Information Center

    Houtkamp, Roos; Roelfsema, Pieter R.

    2010-01-01

    The visual system groups image elements that belong to an object and segregates them from other objects and the background. Important cues for this grouping process are the Gestalt criteria, and most theories propose that these are applied in parallel across the visual scene. Here, we find that Gestalt grouping can indeed occur in parallel in some…

  14. Perceptions Concerning Visual Culture Dialogues of Visual Art Pre-Service Teachers

    ERIC Educational Resources Information Center

    Mamur, Nuray

    2012-01-01

    Commentary on visual art by visual art teachers, which helps students process visual culture, is important. This study attempts to describe the effect of including visual culture, grounded in everyday aesthetic experiences, in the art education learning process. The action research design, which is a qualitative approach, is conducted…

  15. Self-motion perception in autism is compromised by visual noise but integrated optimally across multiple senses

    PubMed Central

    Zaidel, Adam; Goin-Kochel, Robin P.; Angelaki, Dora E.

    2015-01-01

    Perceptual processing in autism spectrum disorder (ASD) is marked by superior low-level task performance and inferior complex-task performance. This observation has led to theories of defective integration in ASD of local parts into a global percept. Despite mixed experimental results, this notion maintains widespread influence and has also motivated recent theories of defective multisensory integration in ASD. Impaired ASD performance in tasks involving classic random dot visual motion stimuli, corrupted by noise as a means to manipulate task difficulty, is frequently interpreted to support this notion of global integration deficits. By manipulating task difficulty independently of visual stimulus noise, here we test the hypothesis that heightened sensitivity to noise, rather than integration deficits, may characterize ASD. We found that although perception of visual motion through a cloud of dots was unimpaired without noise, the addition of stimulus noise significantly affected adolescents with ASD, more than controls. Strikingly, individuals with ASD demonstrated intact multisensory (visual–vestibular) integration, even in the presence of noise. Additionally, when vestibular motion was paired with pure visual noise, individuals with ASD demonstrated a different strategy than controls, marked by reduced flexibility. This result could be simulated by using attenuated (less reliable) and inflexible (not experience-dependent) Bayesian priors in ASD. These findings question widespread theories of impaired global and multisensory integration in ASD. Rather, they implicate increased sensitivity to sensory noise and less use of prior knowledge in ASD, suggesting increased reliance on incoming sensory information. PMID:25941373
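
    The "optimal integration" reported here is usually formalized as reliability-weighted cue combination. The minimal sketch below is not the authors' fitted model; every numerical value is an illustrative assumption. It shows how visual and vestibular heading estimates (plus an optional prior) are fused by precision weighting, and how inflating visual noise shifts the weight onto the vestibular cue.

    ```python
    import numpy as np

    def fuse(mu_vis, sigma_vis, mu_vest, sigma_vest, mu_prior=0.0, sigma_prior=np.inf):
        """Combine noisy visual and vestibular heading estimates (and an optional
        Gaussian prior) by precision weighting."""
        precisions = np.array([1 / sigma_vis**2, 1 / sigma_vest**2, 1 / sigma_prior**2])
        means = np.array([mu_vis, mu_vest, mu_prior])
        weights = precisions / precisions.sum()        # cue weight = relative reliability
        mu_post = weights @ means                      # fused heading estimate
        sigma_post = np.sqrt(1.0 / precisions.sum())   # never less precise than the best cue
        return mu_post, sigma_post

    # Clean visual cue dominates; a noisy visual cue hands the weight to the vestibular cue,
    # which is the signature of intact integration described in the abstract.
    print(fuse(mu_vis=4.0, sigma_vis=1.0, mu_vest=0.0, sigma_vest=2.0))
    print(fuse(mu_vis=4.0, sigma_vis=8.0, mu_vest=0.0, sigma_vest=2.0))
    # An "attenuated, inflexible prior" (as hypothesized for ASD) would correspond to a
    # broad, fixed sigma_prior that does not adapt with experience.
    ```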

  16. Close binding of identity and location in visual feature perception

    NASA Technical Reports Server (NTRS)

    Johnston, J. C.; Pashler, H.

    1990-01-01

    The binding of identity and location information in disjunctive feature search was studied. Ss searched a heterogeneous display for a color or a form target, and reported both target identity and location. To avoid better than chance guessing of target identity (by choosing the target less likely to have been seen), the difficulty of the two targets was equalized adaptively; a mathematical model was used to quantify residual effects. A spatial layout was used that minimized postperceptual errors in reporting location. Results showed strong binding of identity and location perception. After correction for guessing, no perception of identity without location was found. A weak trend was found for accurate perception of target location without identity. We propose that activated features generate attention-calling "interrupt" signals, specifying only location; attention then retrieves the properties at that location.

  17. The spatiotemporal profile of cortical processing leading up to visual perception.

    PubMed

    Fahrenfort, J J; Scholte, H S; Lamme, V A F

    2008-01-01

    Much controversy exists around the locus of conscious visual perception in human cortex. Some authors have proposed that its neural correlates correspond with recurrent processing within visual cortex, whereas others have argued they are located in a frontoparietal network. The present experiment aims to bring together these competing viewpoints. We recorded EEG from human subjects who were engaged in detecting masked visual targets. From this, we obtained a spatiotemporal profile of neural activity selectively related to the processing of the targets, which we correlated with the subjects' ability to detect those targets. This made it possible to distinguish between those stages of visual processing that correlate with human perception and those that do not. The results show that target-induced extra-striate feedforward activity peaking at 121 ms does not correlate with perception, whereas more posterior recurrent activity peaking at 160 ms does. Several subsequent stages show an alternating pattern of frontoparietal and occipital activity, all of which correlate highly with perception. This shows that perception emerges early on, but only after an initial feedforward volley, and suggests that multiple reentrant loops are involved in propagating this signal to frontoparietal areas. PMID:18318615

  18. The Effectiveness of Using the Successive Perception Test I to Measure Visual-Haptic Tendencies in Engineering Students.

    ERIC Educational Resources Information Center

    Study, Nancy E.

    2002-01-01

    Compares results of Successive Perception Test I (SPT) for the study population of freshman engineering students to their results on the group-administered Purdue Spatial Visualization Test: Visualization of Rotations (PSVT) and the individually administered Haptic Visual Discrimination Test (HVDT). Concludes that either visual and haptic…

  19. Crossmodal Statistical Binding of Temporal Information and Stimuli Properties Recalibrates Perception of Visual Apparent Motion.

    PubMed

    Zhang, Yi; Chen, Lihan

    2016-01-01

    Recent studies of brain plasticity that pertain to time perception have shown that fast training of temporal discrimination in one modality, for example, the auditory modality, can improve performance of temporal discrimination in another modality, such as the visual modality. We here examined whether the perception of visual Ternus motion could be recalibrated through fast crossmodal statistical binding of temporal information and stimulus properties. We conducted two experiments, composed of three sessions each: pre-test, learning, and post-test. In both the pre-test and the post-test, participants classified the Ternus display as either "element motion" or "group motion." For the training session in Experiment 1, we constructed two types of temporal structures, in which two consecutively presented sound beeps were dominantly (80%) flanked by one leading and one lagging visual Ternus frame (VAAV), or had the two visual Ternus frames inserted between them (AVVA). Participants were required to respond which interval (auditory vs. visual) was longer. In Experiment 2, we presented only a single auditory-visual pair, but with temporal configurations similar to those in Experiment 1, and asked participants to perform an audio-visual temporal order judgment. The results of these two experiments support that statistical binding of temporal information and stimulus properties can quickly and selectively recalibrate the sensitivity of perceiving visual motion, according to the protocols of the specific bindings. PMID:27065910

  1. Auditory-visual speech perception and synchrony detection for speech and nonspeech signals

    PubMed Central

    Conrey, Brianna; Pisoni, David B.

    2012-01-01

    Previous research has identified a “synchrony window” of several hundred milliseconds over which auditory-visual (AV) asynchronies are not reliably perceived. Individual variability in the size of this AV synchrony window has been linked with variability in AV speech perception measures, but it was not clear whether AV speech perception measures are related to synchrony detection for speech only or for both speech and nonspeech signals. An experiment was conducted to investigate the relationship between measures of AV speech perception and AV synchrony detection for speech and nonspeech signals. Variability in AV synchrony detection for both speech and nonspeech signals was found to be related to variability in measures of auditory-only (A-only) and AV speech perception, suggesting that temporal processing for both speech and nonspeech signals must be taken into account in explaining variability in A-only and multisensory speech perception. PMID:16838548

  2. Language and Visual Perception Associations: Meta-Analytic Connectivity Modeling of Brodmann Area 37

    PubMed Central

    Rosselli, Monica

    2015-01-01

    Background. Understanding the functions of different brain areas has represented a major endeavor of the neurosciences. Historically, brain functions have been associated with specific cortical areas; however, modern neuroimaging developments suggest that cognitive functions are associated with networks rather than with individual areas. Objectives. The purpose of this paper was to analyze the connectivity of Brodmann area (BA) 37 (posterior inferior temporal/fusiform gyrus) in relation to (1) language and (2) visual processing. Methods. Two meta-analyses were initially conducted (first-level analysis). The first was intended to assess the language network in which BA37 is involved. The second was intended to assess the visual perception network. A third meta-analysis (second-level analysis) was then performed to assess contrasts and convergence between the two cognitive domains (language and visual perception). The BrainMap database was used. Results. Our results support a role for BA37 in language, but through a network distinct from the one that supports its second major function, visual perception. Conclusion. It was concluded that left BA37 is a common node of two distinct networks: visual recognition (perception) and semantic language functions. PMID:25648869

  3. Quarrelsome behavior in borderline personality disorder: influence of behavioral and affective reactivity to perceptions of others.

    PubMed

    Sadikaj, Gentiana; Moskowitz, D S; Russell, Jennifer J; Zuroff, David C; Paris, Joel

    2013-02-01

    We examined how the amplification of 3 within-person processes (behavioral reactivity to interpersonal perceptions, affect reactivity to interpersonal perceptions, and behavioral reactivity to a person's own affect) accounts for greater quarrelsome behavior among individuals with borderline personality disorder (BPD). Using an event-contingent recording (ECR) methodology, individuals with BPD (N = 38) and community controls (N = 31) reported on their negative affect, quarrelsome behavior, and perceptions of the interaction partner's agreeable-quarrelsome behavior in interpersonal events during a 20-day period. Behavioral reactivity to negative affect was similar in both groups. However, behavioral reactivity and affect reactivity to interpersonal perceptions were elevated in individuals with BPD relative to community controls; specifically, individuals with BPD reported more quarrelsome behavior and more negative affect during interactions in which they perceived others as more cold-quarrelsome. Greater negative affect reactivity to perceptions of others' cold-quarrelsome behavior partly accounted for the increased quarrelsome behavior reported by individuals with BPD during these interactions. This pattern of results suggests a cycle in which the perception of cold-quarrelsome behavior in others triggers elevated negative affect and quarrelsome behavior in individuals with BPD, which in turn elicits more quarrelsome behavior from their interaction partners, which reinforces the perception of others as cold-quarrelsome and begins the cycle anew. PMID:23231460

  4. Visual Perception of Sentences with Temporarily Ambiguous Clause Boundaries.

    ERIC Educational Resources Information Center

    Frazier, Lyn

    The model of sentence perception proposed by Fodor, Bever and Garrett (1974) emphasizes the importance of grammatical cues signalling clause boundaries, and suggests that segmentation of a sentence into clauses precedes computation of the internal structure of those clauses. However, this model has nothing to say about the many sentences in which…

  5. Visual Influences on Speech Perception in Children with Autism

    ERIC Educational Resources Information Center

    Iarocci, Grace; Rombough, Adrienne; Yager, Jodi; Weeks, Daniel J.; Chua, Romeo

    2010-01-01

    The bimodal perception of speech sounds was examined in children with autism as compared to mental age--matched typically developing (TD) children. A computer task was employed wherein only the mouth region of the face was displayed and children reported what they heard or saw when presented with consonant-vowel sounds in unimodal auditory…

  6. That Deceptive Line: Plato, Linear Perspective, Visual Perception, and Tragedy

    ERIC Educational Resources Information Center

    Killian, Jeremy

    2012-01-01

    In "The Renaissance Rediscovery of Linear Perspective," one of Samuel Edgerton's claims is that Filippo Brunelleschi and his contemporaries did not develop a three-dimensional style of representing the world in painting as much as they reappropriated a way to depict the natural world in painting that most mirrored the human perception of it.…

  7. Young Children's Knowledge about Visual Perception: Projective Size and Shape.

    ERIC Educational Resources Information Center

    Pillow, Bradford H.; Flavell, John H.

    1986-01-01

    Four experiments investigated three- and four-year-old children's knowledge of projective size-distance and projective shape-orientation relationships. Results indicated that preschool children's understanding of these relationships seems at least partly cognitive rather than wholly perceptive, providing further evidence for the acquisition of…

  8. Science Visual Literacy: Learners' Perceptions and Knowledge of Diagrams

    ERIC Educational Resources Information Center

    McTigue, Erin M.; Flowers, Amanda C.

    2011-01-01

    Constructing meaning from science texts relies not only on comprehending the words but also the diagrams and other graphics. The goal of this study was to explore elementary students' perceptions of science diagrams and their skills related to diagram interpretation. 30 students, ranging from second grade through middle school, completed a diagram…

  9. Feature-Based Memory-Driven Attentional Capture: Visual Working Memory Content Affects Visual Attention

    ERIC Educational Resources Information Center

    Olivers, Christian N. L.; Meijer, Frank; Theeuwes, Jan

    2006-01-01

    In 7 experiments, the authors explored whether visual attention (the ability to select relevant visual information) and visual working memory (the ability to retain relevant visual information) share the same content representations. The presence of singleton distractors interfered more strongly with a visual search task when it was accompanied by…

  10. Perception of Audio-Visual Speech Synchrony in Spanish-Speaking Children with and without Specific Language Impairment

    ERIC Educational Resources Information Center

    Pons, Ferran; Andreu, Llorenc; Sanz-Torrent, Monica; Buil-Legaz, Lucia; Lewkowicz, David J.

    2013-01-01

    Speech perception involves the integration of auditory and visual articulatory information, and thus requires the perception of temporal synchrony between this information. There is evidence that children with specific language impairment (SLI) have difficulty with auditory speech perception but it is not known if this is also true for the…

  11. Experience affects the use of ego-motion signals during 3D shape perception

    PubMed Central

    Jain, Anshul; Backus, Benjamin T.

    2011-01-01

    Experience has long-term effects on perceptual appearance (Q. Haijiang, J. A. Saunders, R. W. Stone, & B. T. Backus, 2006). We asked whether experience affects the appearance of structure-from-motion stimuli when the optic flow is caused by observer ego-motion. Optic flow is an ambiguous depth cue: a rotating object and its oppositely rotating, depth-inverted dual generate similar flow. However, the visual system exploits ego-motion signals to prefer the percept of an object that is stationary over one that rotates (M. Wexler, F. Panerai, I. Lamouret, & J. Droulez, 2001). We replicated this finding and asked whether this preference for stationarity, the “stationarity prior,” is modulated by experience. During training, two groups of observers were exposed to objects with identical flow, but that were either stationary or moving as determined by other cues. The training caused identical test stimuli to be seen preferentially as stationary or moving by the two groups, respectively. We then asked whether different priors can exist independently at different locations in the visual field. Observers were trained to see objects either as stationary or as moving at two different locations. Observers’ stationarity bias at the two respective locations was modulated in the directions consistent with training. Thus, the utilization of extraretinal ego-motion signals for disambiguating optic flow signals can be updated as the result of experience, consistent with the updating of a Bayesian prior for stationarity. PMID:21191132
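
    The "Bayesian prior for stationarity" can be made concrete with a toy model in which a location-specific prior is combined with (ambiguous) likelihoods and nudged by each training trial. The sketch below is for exposition only: the likelihood values, starting prior, learning rate, and location labels are assumptions, not quantities estimated from these data.

    ```python
    def p_seen_stationary(like_stationary, like_rotating, prior_stationary):
        """Posterior probability of the 'object is stationary' interpretation."""
        num = like_stationary * prior_stationary
        return num / (num + like_rotating * (1.0 - prior_stationary))

    # Two visual-field locations (hypothetical labels), same mild stationarity bias at the start
    prior = {"location_A": 0.6, "location_B": 0.6}
    learning_rate = 0.1

    # Training: objects at location A are disambiguated as stationary, at location B as moving
    for _ in range(40):
        prior["location_A"] += learning_rate * (1.0 - prior["location_A"])
        prior["location_B"] += learning_rate * (0.0 - prior["location_B"])

    # Test: identical, fully ambiguous flow (equal likelihoods) now resolves differently by location
    for location, p in prior.items():
        print(location, round(p_seen_stationary(1.0, 1.0, p), 3))
    ```

    The point of the toy model is simply that experience moves the prior, and the prior, not the (identical) sensory evidence, then determines which interpretation wins at each location.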

  12. A Novel Image Quality Assessment With Globally and Locally Consilient Visual Quality Perception.

    PubMed

    Bae, Sung-Ho; Kim, Munchurl

    2016-05-01

    Computational models for image quality assessment (IQA) have been developed by exploring effective features that are consistent with the characteristics of the human visual system (HVS) for visual quality perception. In this paper, we first reveal that many existing features used in computational IQA methods can hardly characterize visual quality perception for local image characteristics and various distortion types. To solve this problem, we propose a new IQA method, called the structural contrast-quality index (SC-QI), by adopting a structural contrast index (SCI), which can well characterize local and global visual quality perceptions for various image characteristics with structural-distortion types. In addition to SCI, we devise some other perceptually important features for our SC-QI that can effectively reflect the characteristics of the HVS for contrast sensitivity and chrominance component variation. Furthermore, we develop a modified SC-QI, called the structural contrast distortion metric (SC-DM), which inherits the desirable mathematical properties of valid distance metricability and quasi-convexity. It can therefore be used effectively as a distance metric for image quality optimization problems. Extensive experimental results show that both SC-QI and SC-DM can very well characterize the HVS's properties of visual quality perception for local image characteristics and various distortion types, which is a distinctive merit of our methods compared with other IQA methods. As a result, both SC-QI and SC-DM show better performance, with a strong consilience of global and local visual quality perception as well as much lower computational complexity, compared with the state-of-the-art IQA methods. The MATLAB source code of the proposed SC-QI and SC-DM is publicly available online at https://sites.google.com/site/sunghobaecv/iqa. PMID:27046873
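
    The formulas for SCI, SC-QI, and SC-DM are not reproduced here (see the authors' MATLAB code at the link above). As a generic illustration of the block-wise local contrast and structure comparisons that this family of IQA methods builds on, the SSIM-style sketch below compares local standard deviations and mean-removed correlations between a reference and a distorted image and pools them into one score; it is a stand-in, not the proposed metric.

    ```python
    import numpy as np

    def local_quality_map(ref, dist, block=8, c=1e-3):
        """Toy block-wise contrast/structure comparison (SSIM-like stand-in, not SC-QI)."""
        h, w = ref.shape
        scores = []
        for i in range(0, h - block + 1, block):
            for j in range(0, w - block + 1, block):
                a = ref[i:i + block, j:j + block].ravel()
                b = dist[i:i + block, j:j + block].ravel()
                # Contrast term: agreement of local standard deviations
                contrast = (2 * a.std() * b.std() + c) / (a.std()**2 + b.std()**2 + c)
                # Structure term: correlation of the mean-removed blocks
                cov = np.mean((a - a.mean()) * (b - b.mean()))
                structure = (cov + c) / (a.std() * b.std() + c)
                scores.append(contrast * structure)
        return float(np.mean(scores))      # simple average pooling into one quality score

    ref = np.random.rand(64, 64)
    print(local_quality_map(ref, ref + 0.05 * np.random.randn(64, 64)))
    ```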

  13. Analysis of EEG Signals Related to Artists and Nonartists during Visual Perception, Mental Imagery, and Rest Using Approximate Entropy

    PubMed Central

    Shourie, Nasrin; Firoozabadi, Mohammad; Badie, Kambiz

    2014-01-01

    In this paper, differences between multichannel EEG signals of artists and nonartists were analyzed during visual perception and mental imagery of some paintings and at resting condition using approximate entropy (ApEn). It was found that ApEn is significantly higher for artists during the visual perception and the mental imagery in the frontal lobe, suggesting that artists process more information during these conditions. It was also observed that ApEn decreases for the two groups during the visual perception due to increasing mental load; however, their variation patterns are different. This difference may be used for measuring progress in novice artists. In addition, it was found that ApEn is significantly lower during the visual perception than the mental imagery in some of the channels, suggesting that visual perception task requires more cerebral efforts. PMID:25133180
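
    Approximate entropy itself is a standard statistic (Pincus, 1991), so a compact reference implementation can be given; the embedding dimension m = 2 and tolerance r = 0.2 × SD used below are conventional defaults, not necessarily the settings used in this study.

    ```python
    import numpy as np

    def approximate_entropy(u, m=2, r_factor=0.2):
        """ApEn of a 1-D signal: regular signals score low, irregular signals score high."""
        u = np.asarray(u, dtype=float)
        N = len(u)
        r = r_factor * u.std()               # tolerance scaled to the signal's variability

        def phi(m):
            # Overlapping embedding vectors of length m
            x = np.array([u[i:i + m] for i in range(N - m + 1)])
            # Chebyshev distance between every pair of vectors
            d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
            # Fraction of vectors within tolerance of each template (self-matches included)
            C = (d <= r).mean(axis=1)
            return np.mean(np.log(C))

        return phi(m) - phi(m + 1)

    # A regular sine wave yields a much lower ApEn than white noise of the same length.
    t = np.linspace(0, 10, 500)
    print(approximate_entropy(np.sin(2 * np.pi * t)), approximate_entropy(np.random.randn(500)))
    ```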

  14. Auditory, Visual, and Auditory-Visual Perception of Vowels by Hearing-Impaired Children.

    ERIC Educational Resources Information Center

    Hack, Zarita Caplan; Erber, Norman P.

    1982-01-01

    Vowels were presented through auditory, visual, and auditory-visual modalities to 18 hearing impaired children (12 to 15 years old) having good, intermediate, and poor auditory word recognition skills. All the groups had difficulty with acoustic information and visual information alone. The first two groups had only moderate difficulty identifying…

  15. Applications of neural networks in human shape visual perception.

    PubMed

    Wu, Bo-Wen; Fang, Yi-Chin; Lin, David Pei-Cheng

    2015-12-01

    Advances in optical and electronic technology can immensely reduce noise in images and greatly enhance human visual recognition. However, it is still difficult for human eyes to identify low-resolution thermal images, due to the limits imposed by psychological and physiological factors. In addition, changes in monitor brightness and lens resolution may also interfere with visual recognition abilities. To overcome these limitations, we devised a suitable and effective recognition method which may help the military in revising the shape parameters of long-range targets. The modulation transfer function was used as a basis to extend the visual characteristics of the human visual model, and a new model was produced through the incorporation of new shape parameters. The new human visual model was next used in combination with a backpropagation neural network for better recognition of low-resolution thermal images. The new model was then tested in experiments, and the results showed that the recognition accuracy rate steadily rose to over 95%. PMID:26831387

  16. Temporal perception in visual processing as a research tool

    PubMed Central

    Zhou, Bin; Zhang, Ting; Mao, Lihua

    2015-01-01

    Accumulated evidence has shown that the subjective time in the sub-second range can be altered by different factors; some are related to stimulus features such as luminance contrast and spatial frequency, others are processes like perceptual grouping and contextual modulation. These findings indicate that temporal perception uses neural signals involved in non-temporal feature processes and that perceptual organization plays an important role in shaping the experience of elapsed time. We suggest that the temporal representation of objects can be treated as a feature of objects. This new concept implies that psychological time can serve as a tool to study the principles of neural codes in the perception of objects like “reaction time (RT).” Whereas “RT” usually reflects the state of transient signals crossing decision thresholds, “apparent time” in addition reveals the dynamics of sustained signals, thus providing complementary information of what has been obtained from “RT” studies. PMID:25964774

  17. Shining new light on dark percepts: visual sensations induced by TMS.

    PubMed

    Knight, Ramisha; Mazzi, Chiara; Savazzi, Silvia

    2015-11-01

    Phosphenes induced by transcranial magnetic stimulation (TMS) are sensations of light, whereas a missing region in the visual field induced by TMS is generally referred to as a scotoma. It is believed that phosphenes are caused by neural excitation, while scotomas are due to neural inhibition. In light of the recent literature it might, however, be surmised that both phenomena are the result of neural noise injected into the cortex by TMS and that the likelihood of perceiving the two kinds of percepts depends on the state of the cortex at the time of stimulation. In the present study, TMS was applied over the left occipital cortex under different background conditions (Experiments 1-2) and using different TMS intensities (Experiment 3). Behavioral responses indicate the visual system processes luminance in a standardized manner, as lighter percepts were reacted to faster than darker percepts; this effect, however, did not extend to percept size. Our results suggest that phenomenological characteristics of artificial visual percepts are in line with the proposed effects of TMS as the induction of random neural noise interfering with the neural dynamics (the state of the cortex) at the time of stimulation. PMID:26195168

  18. Visual-motor recalibration in geographical slant perception

    NASA Technical Reports Server (NTRS)

    Bhalla, M.; Proffitt, D. R.; Kaiser, M. K. (Principal Investigator)

    1999-01-01

    In 4 experiments, it was shown that hills appear steeper to people who are encumbered by wearing a heavy backpack (Experiment 1), are fatigued (Experiment 2), are of low physical fitness (Experiment 3), or are elderly and/or in declining health (Experiment 4). Visually guided actions are unaffected by these manipulations of physiological potential. Although dissociable, the awareness and action systems were also shown to be interconnected. Recalibration of the transformation relating awareness and actions was found to occur over long-term changes in physiological potential (fitness level, age, and health) but not with transitory changes (fatigue and load). Findings are discussed in terms of a time-dependent coordination between the separate systems that control explicit visual awareness and visually guided action.

  19. What can fish brains tell us about visual perception?

    PubMed

    Rosa Salva, Orsola; Sovrano, Valeria Anna; Vallortigara, Giorgio

    2014-01-01

    Fish are a complex taxonomic group, whose diversity and distance from other vertebrates well suits the comparative investigation of brain and behavior: in fish species we observe substantial differences with respect to the telencephalic organization of other vertebrates and an astonishing variety in the development and complexity of pallial structures. We will concentrate on the contribution of research on fish behavioral biology for the understanding of the evolution of the visual system. We shall review evidence concerning perceptual effects that reflect fundamental principles of the visual system functioning, highlighting the similarities and differences between distant fish groups and with other vertebrates. We will focus on perceptual effects reflecting some of the main tasks that the visual system must attain. In particular, we will deal with subjective contours and optical illusions, invariance effects, second order motion and biological motion and, finally, perceptual binding of object properties in a unified higher level representation. PMID:25324728

  20. Biases in Visual, Auditory, and Audiovisual Perception of Space

    PubMed Central

    Odegaard, Brian; Wozny, David R.; Shams, Ladan

    2015-01-01

    Localization of objects and events in the environment is critical for survival, as many perceptual and motor tasks rely on estimation of spatial location. Therefore, it seems reasonable to assume that spatial localizations should generally be accurate. Curiously, some previous studies have reported biases in visual and auditory localizations, but these studies have used small sample sizes and the results have been mixed. Therefore, it is not clear (1) if the reported biases in localization responses are real (or due to outliers, sampling bias, or other factors), and (2) whether these putative biases reflect a bias in sensory representations of space or a priori expectations (which may be due to the experimental setup, instructions, or distribution of stimuli). Here, to address these questions, a dataset of unprecedented size (obtained from 384 observers) was analyzed to examine presence, direction, and magnitude of sensory biases, and quantitative computational modeling was used to probe the underlying mechanism(s) driving these effects. Data revealed that, on average, observers were biased towards the center when localizing visual stimuli, and biased towards the periphery when localizing auditory stimuli. Moreover, quantitative analysis using a Bayesian Causal Inference framework suggests that while pre-existing spatial biases for central locations exert some influence, biases in the sensory representations of both visual and auditory space are necessary to fully explain the behavioral data. How are these opposing visual and auditory biases reconciled in conditions in which both auditory and visual stimuli are produced by a single event? Potentially, the bias in one modality could dominate, or the biases could interact/cancel out. The data revealed that when integration occurred in these conditions, the visual bias dominated, but the magnitude of this bias was reduced compared to unisensory conditions. Therefore, multisensory integration not only improves the
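
    The Bayesian Causal Inference framework referred to here follows a well-established recipe: infer the probability that the auditory and visual signals arose from a common cause, then model-average the location estimates under the two causal structures. The sketch below illustrates that logic for a single audio-visual trial; the sensory and prior widths, the common-cause prior, and the grid-based numerical integration are assumptions chosen for clarity, not the parameters fitted to this dataset.

    ```python
    import numpy as np

    s = np.linspace(-60, 60, 2401)                  # candidate source locations (deg)
    ds = s[1] - s[0]

    def gauss(x, mu, sd):
        return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

    def bci_visual_estimate(x_v, x_a, sd_v=2.0, sd_a=8.0, sd_p=20.0, p_common=0.5):
        prior = gauss(s, 0.0, sd_p)                 # spatial prior (e.g., a central bias)
        lv, la = gauss(x_v, s, sd_v), gauss(x_a, s, sd_a)
        # Evidence for one shared source vs. two independent sources
        ev_c1 = np.sum(lv * la * prior) * ds
        ev_c2 = (np.sum(lv * prior) * ds) * (np.sum(la * prior) * ds)
        post_c1 = ev_c1 * p_common / (ev_c1 * p_common + ev_c2 * (1 - p_common))
        # Posterior-mean visual location under each causal structure
        s_c1 = np.sum(s * lv * la * prior) / np.sum(lv * la * prior)
        s_c2 = np.sum(s * lv * prior) / np.sum(lv * prior)
        return post_c1 * s_c1 + (1 - post_c1) * s_c2   # model-averaged estimate

    # Discrepant cues: the visual estimate is pulled partway toward the auditory location,
    # with the pull weakening as the discrepancy (and hence evidence for separate causes) grows.
    print(bci_visual_estimate(x_v=10.0, x_a=0.0))
    ```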

  1. Does Increasing Communication through Visual Learning Environments Enhance Student Perceptions of Lecturers?

    ERIC Educational Resources Information Center

    Frumkin, Lara

    2006-01-01

    The current study was conducted in an effort to examine whether increased levels of communication using visual learning environments (VLEs) alters student perceptions of lecturers. Eighty-six MSc students in Computing Science participated by using She and Fisher's (2002) Teacher Communication Behavior Questionnaire (TCBQ). In addition to using the…

  2. The Validity of Two Clinical Tests of Visual-Motor Perception

    ERIC Educational Resources Information Center

    Wallbrown, Jane D.; And Others

    1977-01-01

    The intent of this study was to determine whether the Minnesota Percepto-Diagnostic Test (Fuller, 1969; Fuller & Laird, 1963) is more effective than the Bender-Gestalt (Bender, 1937) with respect to identifying achievement-related errors in visual-motor perception. (Author/RK)

  3. Visual Perception and Frontal Lobe in Intellectual Disabilities: A Study with Evoked Potentials and Neuropsychology

    ERIC Educational Resources Information Center

    Munoz-Ruata, J.; Caro-Martinez, E.; Perez, L. Martinez; Borja, M.

    2010-01-01

    Background: Perception disorders are frequently observed in persons with intellectual disability (ID) and their influence on cognition has been discussed. The objective of this study is to clarify the mechanisms behind these alterations by analysing the visual event related potentials early component, the N1 wave, which is related to perception…

  4. University Teachers' Perception of Inclusion of Visually Impaired in Ghanaian Universities

    ERIC Educational Resources Information Center

    Mamah, Vincent; Deku, Prosper; Darling, Sharon M.; Avoke, Selete K.

    2011-01-01

    This study was undertaken to examine the university teachers' perception of including students with Visual Impairment (VI) in the public universities of Ghana. The sample consisted of 110 teachers from the University of Cape Coast (UCC), the University of Education, Winneba, (UEW), and the University of Ghana (UG). Data were collected through…

  5. Self-Perceptions of Visually Impaired Children Aged 3-10 in India.

    ERIC Educational Resources Information Center

    Christy, Beula; Shanimole; Nuthetie, Rishita

    2002-01-01

    A study analyzed the self-perceptions of 50 Indian children (ages 3-10) with visual impairments in their expressions of mood and feelings, needs and wants, and preferences and decision making. Children had difficulty expressing sympathy, particularly younger children. Children also had difficulty choosing what they wanted to wear. (Contains 7…

  6. The Effect of Learning Background and Imagery Cognitive Development on Visual Perception

    ERIC Educational Resources Information Center

    Chiang, Shyh-Bao; Sun, Chun-Wang

    2013-01-01

    This research looked into the effect of how cognitive development toward imagery is formed through visual perception by means of a quantitative questionnaire. The main variable was the difference between the learning backgrounds of the interviewees. A two-way ANOVA mixed design was the statistical method used for the analysis of the 2 × 4 (2 by 4)…

  7. Perceptions of Older Veterans with Visual Impairments Regarding Computer Access Training and Quality of Life

    ERIC Educational Resources Information Center

    DuBosque, Richard Stanborough

    2013-01-01

    The widespread integration of the computer into the mainstream of daily life presents a challenge to various sectors of society, and the incorporation of this technology into the realm of the older individual with visual impairments is a relatively uncharted field of study. This study was undertaken to acquire the perceptions of the impact of the…

  8. Perception of Words and Non-Words in the Upper and Lower Visual Fields

    ERIC Educational Resources Information Center

    Darker, Iain T.; Jordan, Timothy R.

    2004-01-01

    The findings of previous investigations into word perception in the upper and the lower visual field (VF) are variable and may have incurred non-perceptual biases caused by the asymmetric distribution of information within a word, an advantage for saccadic eye-movements to targets in the upper VF and the possibility that stimuli were not projected…

  9. Human perception of visual stimuli modulated by direction of linear polarization.

    PubMed

    Misson, Gary P; Timmerman, Brenda H; Bryanston-Cross, Peter J

    2015-10-01

    This study explores both theoretically and experimentally the human perception of polarized light beyond that currently established. The radial analyser theory of Haidinger's phenomenon (HP) is used to predict the effect of observing visual stimuli comprising patterned zones characterized by orthogonal planes of linear polarization (linear polarization direction fields, LPD-fields). Any pattern, including optotypes and geometric forms, can be represented as an LPD-field. Simulated percepts differ from the original patterns, although edges are mostly preserved. In edge-rich images, a cross of attenuating contrast spanning the field of view is predicted. The mathematical model is verified experimentally using a liquid crystal display (LCD)-based polarization modulator imaged through a tangential (azimuthal) analyser with properties complementary to a radial analyser. The LCD device is then used in vivo to elicit perceptual responses in human subjects. Normal humans are found to readily detect spatially and temporally modulated, isoluminant, spatially isochromatic, highly polarized LPD stimuli. Most subjects match the stimuli to corresponding images of theoretically predicted percepts. In particular, edge perception and the presence of the contrast cross were confirmed. Unlike HP, static patterned LPD stimuli are perceived without difficulty. The simplest manifestation of human polarization perception is HP, which is the fundamental element of an open set of stimulus-dependent percepts. This study demonstrates that humans have the ability to perceive and identify visual pattern stimuli defined solely by polarization state modulation. PMID:26291073
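
    A rough way to see where such percept predictions come from is to apply Malus's law pixel by pixel with an idealized tangential analyser, as in the sketch below. The grid size and the simple two-zone LPD stimulus are assumptions for illustration; this is not the authors' simulation code.

    ```python
    import numpy as np

    n = 201
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]

    # LPD stimulus: left half polarized at 0 deg, right half at 90 deg (isoluminant pattern)
    pol_angle = np.where(x < 0, 0.0, np.pi / 2)

    # Idealized tangential analyser: transmission axis perpendicular to the radius from fixation
    analyser_angle = np.arctan2(y, x) + np.pi / 2

    # Malus's law: transmitted intensity is cos^2 of the angle between polarization and analyser axes
    percept = np.cos(pol_angle - analyser_angle) ** 2

    # On the horizontal meridian the two zones differ maximally, so the vertical edge is visible;
    # along oblique directions the transmitted intensities of the two zones converge, giving the
    # kind of direction-dependent contrast loss the model predicts.
    print(percept[n // 2, 3 * n // 4], percept[n // 2, n // 4])
    ```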

  10. A framework for the first-person internal sensation of visual perception in mammals and a comparable circuitry for olfactory perception in Drosophila.

    PubMed

    Vadakkan, Kunjumon I

    2015-01-01

    Perception is a first-person internal sensation induced within the nervous system at the time of arrival of sensory stimuli from objects in the environment. Lack of access to these first-person properties has limited perception to being viewed as an emergent property, and it is currently studied using third-person observed findings from various levels. One feasible approach to understanding its mechanism is to build a hypothesis for the specific conditions and required circuit features of the nodal points where the mechanistic operation of perception takes place for one type of sensation in one species, and then to verify the presence of comparable circuit properties for perceiving a different sensation in a different species. The present work explains visual perception in the mammalian nervous system from a first-person frame of reference and provides explanations for the homogeneity of perception of visual stimuli above the flicker fusion frequency, the perception of objects at locations different from their actual position, smooth pursuit and saccadic eye movements, the perception of object borders, and the perception of pressure phosphenes. Using results from temporal resolution studies and the known details of visual cortical circuitry, explanations are provided for (a) the perception of rapidly changing visual stimuli, (b) how the perception of objects occurs in the correct orientation even though, according to the third-person view, activity from the visual stimulus reaches the cortices in an inverted manner, and (c) the functional significance of the well-conserved columnar organization of the visual cortex. A comparable circuitry detected in a different nervous system in a remote species, the olfactory circuitry of the fruit fly Drosophila melanogaster, provides an opportunity to explore circuit functions using genetic manipulations, which, along with high-resolution microscopic techniques and lipid membrane interaction studies, will be able to verify the structure

  11. An Examination of the Factors Affecting Prospective Teachers' Perceptions of Faculty Members Using Chaid Analysis

    ERIC Educational Resources Information Center

    Tanhan, Fuat; Kayri, Murat

    2012-01-01

    This study aims to examine prospective teachers' perceptions of faculty members and the demographic variables affecting these perceptions. The population of the study consists of undergraduate students attending the Faculty of Education of Van Yuzuncu Yil University in the 2009-2010 academic year. A total of 500 students in their 1st, 2nd, 3rd and…

  12. Thresholds for self-motion perception in roll without and with visual fixation target - the visual-vestibular interaction effect

    PubMed Central

    Kolev, Ognyan I.

    2015-01-01

    The purpose of this study was to establish the self-motion perception threshold, in roll, in the visual-vestibular interaction (VVI) state, creating an oculogyral illusion, and to compare this threshold to the self-motion perception threshold in darkness. A further aim was to investigate the dynamics of the threshold at a low frequency range (0.1–1 Hz) of sinusoidal rotation. Seven healthy subjects were tested. A motion platform was used to generate motion. Single cycles of sinusoidal acceleration at four frequencies (0.1, 0.2, 0.5 and 1 Hz) were used as motion stimuli. To avoid otolith stimulation, subjects were rotated about a vertical axis in supine position. To evoke an oculogyral illusion subjects were instructed to fixate their gaze on a cross-shaped object aligned with their head, which rotated with them. The results show a lowering of the self-motion perception threshold in the VVI state, significant for the frequencies 0.1 and 0.2 Hz (p<0.05). In all the subjects, visual fixation on the cross evoked an oculogyral illusion. The threshold in both tested conditions was frequency dependent: it decreased with increasing frequency values. However, this effect was consistently stronger in darkness across all frequencies (p<0.05). In conclusion, the application of sinusoidal rotation during roll at low frequencies in the VVI condition evokes oculogyral illusion. This interaction lowers the self-motion perception threshold compared to that measured during rotation in darkness. This testing method could be of practical benefit in clinical application for revealing brain dysfunction involving integrative mechanisms of perception. PMID:26415781

  14. Exploring Children's Perceptions of Play Using Visual Methodologies

    ERIC Educational Resources Information Center

    Anthamatten, Peter; Wee, Bryan Shao-Chang; Korris, Erin

    2013-01-01

    Objective: A great deal of scholarly work has examined the way that physical, social and cultural environments relate to children's health behaviour, particularly with respect to diet and exercise. While this work is critical, little research attempts to incorporate the views and perspectives of children themselves using visual methodologies.…

  15. Visual Speech Perception in Children with Language Learning Impairments

    ERIC Educational Resources Information Center

    Knowland, Victoria C. P.; Evans, Sam; Snell, Caroline; Rosen, Stuart

    2016-01-01

    Purpose: The purpose of the study was to assess the ability of children with developmental language learning impairments (LLIs) to use visual speech cues from the talking face. Method: In this cross-sectional study, 41 typically developing children (mean age: 8 years 0 months, range: 4 years 5 months to 11 years 10 months) and 27 children with…

  16. Auditory-Visual Perception of Changing Distance by Human Infants.

    ERIC Educational Resources Information Center

    Walker-Andrews, Arlene S.; Lennon, Elizabeth M.

    1985-01-01

    Examines, in two experiments, 5-month-old infants' sensitivity to auditory-visual specification of distance and direction of movement. One experiment presented two films with soundtracks in either a match or mismatch condition; the second showed the two films side-by-side with a single soundtrack appropriate to one. Infants demonstrated visual…

  17. Audio-visual perception system for a humanoid robotic head.

    PubMed

    Viciana-Abad, Raquel; Marfil, Rebeca; Perez-Lorenzo, Jose M; Bandera, Juan P; Romero-Garces, Adrian; Reche-Lopez, Pedro

    2014-01-01

    One of the main issues within the field of social robotics is to endow robots with the ability to direct attention to people with whom they are interacting. Different approaches follow bio-inspired mechanisms, merging audio and visual cues to localize a person using multiple sensors. However, most of these fusion mechanisms have been used in fixed systems, such as those used in video-conference rooms, and thus they may run into difficulties when constrained to the sensors with which a robot can be equipped. In addition, within the scope of interactive autonomous robots, there is a lack of evaluation of the benefits of audio-visual attention mechanisms, compared with audio-only or visual-only approaches, in real scenarios. Most of the tests conducted have been within controlled environments, at short distances, and/or with off-line performance measurements. With the goal of demonstrating the benefit of fusing sensory information with Bayes inference for interactive robotics, this paper presents a system for localizing a person by processing visual and audio data. Moreover, the performance of this system is evaluated and compared by considering the technical limitations of unimodal systems. The experiments show the promise of the proposed approach for the proactive detection and tracking of speakers in a human-robot interactive framework. PMID:24878593

  18. Visual Size Perception and Haptic Calibration during Development

    ERIC Educational Resources Information Center

    Gori, Monica; Giuliana, Luana; Sandini, Giulio; Burr, David

    2012-01-01

    It is still unclear how the visual system perceives accurately the size of objects at different distances. One suggestion, dating back to Berkeley's famous essay, is that vision is calibrated by touch. If so, we may expect different mechanisms involved for near, reachable distances and far, unreachable distances. To study how the haptic system…

  19. Visual perception: bizarre contours go against the odds.

    PubMed

    Fleming, Roland W

    2011-04-12

    A new study shows that the brain sometimes invents visual contours even when they would be highly unlikely to occur in the real world. This presents a challenge to theories assuming that the brain prefers the most probable interpretation of the retinal image. PMID:21481763

  20. Gender Equity & Visual Literacy: Schools Can Help Change Perceptions.

    ERIC Educational Resources Information Center

    Couch, Richard A.

    Background information about gender inequity is provided, and the assertion is made that educators must recognize that many of the problems females encounter are begun and perpetuated in the schools. Visual literacy is part of the change that schools must make in order to make greater strides toward gender equity. Two connections between visual…

  2. Vibrotactile Perception: Perspective Taking by Children Who Are Visually Impaired.

    ERIC Educational Resources Information Center

    Miletic, G.

    1994-01-01

    This study compared the performance on perspective-taking tasks of 8 congenitally blind children (mean age 13.5 years), using either haptic exploration or a vibrotactile prosthetic device, with the performance of 4 children having low vision using their limited visual abilities. The vibrotactile device improved perspective-taking performance…

  3. Altering Visual Perception Abnormalities: A Marker for Body Image Concern.

    PubMed

    Beilharz, Francesca L; Atkins, Kelly J; Duncum, Anna J F; Mundy, Matthew E

    2016-01-01

    The body image concern (BIC) continuum ranges from a healthy and positive body image, to clinical diagnoses of abnormal body image, like body dysmorphic disorder (BDD). BDD and non-clinical, yet high-BIC participants have demonstrated a local visual processing bias, characterised by reduced inversion effects. To examine whether this bias is a potential marker of BDD, the visual processing of individuals across the entire BIC continuum was examined. Dysmorphic Concern Questionnaire (DCQ; quantified BIC) scores were expected to correlate with higher discrimination accuracy and faster reaction times of inverted stimuli, indicating reduced inversion effects (occurring due to increased local visual processing). Additionally, an induced global or local processing bias via Navon stimulus presentation was expected to alter these associations. Seventy-four participants completed the DCQ and upright-inverted face and body stimulus discrimination task. Moderate positive associations were revealed between DCQ scores and accuracy rates for inverted face and body stimuli, indicating a graded local bias accompanying increases in BIC. This relationship supports a local processing bias as a marker for BDD, which has significant assessment implications. Furthermore, a moderate negative relationship was found between DCQ score and inverted face accuracy after inducing global processing, indicating the processing bias can temporarily be reversed in high BIC individuals. Navon stimuli were successfully able to alter the visual processing of individuals across the BIC continuum, which has important implications for treating BDD. PMID:27003715

  4. Modelling Subjectivity in Visual Perception of Orientation for Image Retrieval.

    ERIC Educational Resources Information Center

    Sanchez, D.; Chamorro-Martinez, J.; Vila, M. A.

    2003-01-01

    Discussion of multimedia libraries and the need for storage, indexing, and retrieval techniques focuses on the combination of computer vision and data mining techniques to model high-level concepts for image retrieval based on perceptual features of the human visual system. Uses fuzzy set theory to measure users' assessments and to capture users'…

  5. Temporal stability of the action-perception cycle for postural control in a moving visual environment.

    PubMed

    Dijkstra, T M; Schöner, G; Gielen, C C

    1994-01-01

    When standing human subjects are exposed to a moving visual environment, the induced postural sway forms a stable temporal relationship with the visual information. We have investigated this relationship experimentally with a new set-up in which a computer generates video images which correspond to the motion of a 3D environment. The suggested mean distance to a sinusoidally moving wall is varied and the temporal relationship to induced sway is analysed (1) in terms of the fluctuations of relative phase between visual and sway motion and (2) in terms of the relaxation time of relative phase as determined from the rate of recovery of the stable relative phase pattern following abrupt changes in the visual motion pattern. The two measures are found to converge to a well-defined temporal stability of the action-perception cycle. Furthermore, we show that this temporal stability is a sensitive measure of the strength of the action-perception coupling. It decreases as the distance of the visual scene from the observer increases. This fact and the increase of mean relative phase are consistent with predictions of a linear second-order system driven by the visual expansion rate. However, the amplitude of visual sway decreases little as visual distance increases, in contradiction to the predictions, and is suggestive of a process that actively generates sway. The visual expansion rate on the optic array is found to decrease strongly with visual distance. This leads to the conclusion that postural control in a moving visual environment cannot be understood simply in terms of minimization of retinal slip, and that dynamic coupling of vision into the postural control system must be taken into account. PMID:8187859
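
    The "linear second-order system driven by the visual expansion rate" invoked above can be made concrete with a small simulation. In the sketch below the natural frequency, damping ratio, coupling gain, and driving frequency are illustrative assumptions; the steady-state relative phase is read off from the cross-spectrum at the driving frequency, and the relaxation time of such a system scales roughly as 1 / (zeta * omega0).

    ```python
    import numpy as np

    def simulate_sway(f_drive=0.2, duration=120.0, dt=0.005,
                      omega0=2 * np.pi * 0.3, zeta=0.7, coupling=1.0):
        """Damped linear second-order system driven by a sinusoidal visual signal."""
        t = np.arange(0.0, duration, dt)
        drive = np.sin(2 * np.pi * f_drive * t)       # stand-in for the visual expansion rate
        x, v = 0.0, 0.0
        sway = np.empty_like(t)
        for i in range(t.size):
            a = coupling * drive[i] - 2 * zeta * omega0 * v - omega0 ** 2 * x
            v += a * dt                               # simple Euler integration is adequate at this dt
            x += v * dt
            sway[i] = x
        return t, drive, sway

    t, drive, sway = simulate_sway()
    half = t.size // 2                                # discard the initial transient
    freqs = np.fft.rfftfreq(half, d=t[1] - t[0])
    k = np.argmin(np.abs(freqs - 0.2))                # spectral bin at the driving frequency
    phase = np.angle(np.fft.rfft(sway[-half:])[k] / np.fft.rfft(drive[-half:])[k])
    print(np.degrees(phase))                          # negative: sway lags the visual drive
    ```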

  6. Biased perception about gene technology: How perceived naturalness and affect distort benefit perception.

    PubMed

    Siegrist, Michael; Hartmann, Christina; Sütterlin, Bernadette

    2016-01-01

    In two experiments, the participants showed biased responses when asked to evaluate the benefits of gene technology. They evaluated the importance of additional yields in corn fields due to a newly introduced variety, which would increase a farmer's revenues. In one condition, the newly introduced variety was described as a product of traditional breeding; in the other, it was identified as genetically modified (GM). The two experiments' findings showed that the same benefits were perceived as less important for a farmer when these were the result of GM crops compared with traditionally bred crops. Mediation analyses suggest that perceived naturalness and the affect associated with the technology per se influence the interpretation of the new information. The lack of perceived naturalness of gene technology seems to be the reason for the participants' perceived lower benefits of a new corn variety in the gene technology condition compared with the perceptions of the participants assigned to the traditional breeding condition. The strategy to increase the acceptance of gene technology by introducing plant varieties that better address consumer and producer needs may not work because people discount its associated benefits. PMID:26505287

  7. Dopamine Activation Preserves Visual Motion Perception Despite Noise Interference of Human V5/MT

    PubMed Central

    Yousif, Nada; Fu, Richard Z.; Abou-El-Ela Bourquin, Bilal; Bhrugubanda, Vamsee; Schultz, Simon R.

    2016-01-01

    When processing sensory signals, the brain must account for noise, both noise in the stimulus and that arising from within its own neuronal circuitry. Dopamine receptor activation is known to enhance both visual cortical signal-to-noise-ratio (SNR) and visual perceptual performance; however, it is unknown whether these two dopamine-mediated phenomena are linked. To assess this, we used single-pulse transcranial magnetic stimulation (TMS) applied to visual cortical area V5/MT to reduce the SNR focally and thus disrupt visual motion discrimination performance to visual targets located in the same retinotopic space. The hypothesis that dopamine receptor activation enhances perceptual performance by improving cortical SNR predicts that dopamine activation should antagonize TMS disruption of visual perception. We assessed this hypothesis via a double-blinded, placebo-controlled study with the dopamine receptor agonists cabergoline (a D2 agonist) and pergolide (a D1/D2 agonist) administered in separate sessions (separated by 2 weeks) in 12 healthy volunteers in a William's balance-order design. TMS degraded visual motion perception when the evoked phosphene and the visual stimulus overlapped in time and space in the placebo and cabergoline conditions, but not in the pergolide condition. This suggests that dopamine D1 or combined D1 and D2 receptor activation enhances cortical SNR to boost perceptual performance. That local visual cortical excitability was unchanged across drug conditions suggests the involvement of long-range intracortical interactions in this D1 effect. Because increased internal noise (and thus lower SNR) can impair visual perceptual learning, improving visual cortical SNR via D1/D2 agonist therapy may be useful in boosting rehabilitation programs involving visual perceptual training. SIGNIFICANCE STATEMENT In this study, we address the issue of whether dopamine activation improves visual perception despite increasing sensory noise in the visual cortex

  8. People watching: visual, motor, and social processes in the perception of human movement.

    PubMed

    Shiffrar, Maggie

    2011-01-01

    Successful social behavior requires the accurate perception and interpretation of other people's actions. In the last decade, significant progress has been made in understanding how the human visual system analyzes bodily motion. Neurophysiological studies have identified two neural areas, the superior temporal sulcus (STS) and the premotor cortex, which play key roles in the visual perception of human movement. Patterns of neural activity in these areas are reflective of psychophysical measures of visual sensitivity to human movement. Both vary as a function of stimulus orientation and global stimulus structure. Human observers and STS responsiveness share some developmental similarities, as both exhibit sensitivities that become increasingly tuned for upright, human movement. Furthermore, the observer's own visual and motor experience with an action, as well as the social and emotional content of that action, influence behavioral measures of visual sensitivity and patterns of neural activity in the STS and premotor cortex. Finally, dysfunction of motor processes, such as hemiplegia, and dysfunction of social processes, such as autism, systematically impact visual sensitivity to human movement. In sum, a convergence of visual, motor, and social processes underlies our ability to perceive and interpret the actions of other people. WIREs Cogn Sci 2011, 2, 68-78. DOI: 10.1002/wcs.88. PMID:26301914

  9. Seeing the tipping point: Balance perception and visual shape.

    PubMed

    Firestone, Chaz; Keil, Frank C

    2016-07-01

    In a brief glance at an object or shape, we can appreciate a rich suite of its functional properties, including the organization of the object's parts, its optimal contact points for grasping, and its center of mass, or balancing point. However, in the real world and the laboratory, balance perception shows systematic biases whereby observers may misjudge a shape's center of mass by a severe margin. Are such biases simply quirks of physical reasoning? Or might they instead reflect more fundamental principles of object representation? Here we demonstrate systematically biased center-of-mass estimation for two-dimensional (2D) shapes (Study 1) and advance a surprising explanation of such biases. We suggest that the mind implicitly represents ordinary 2D shapes as rich, volumetric, three-dimensional (3D) objects, and that these "inflated" shape representations intrude on and bias perception of the 2D shape's geometric properties. Such "inflation" is a computer-graphics technique for segmenting shapes into parts, and we show that a model derived from this technique best accounts for the biases in center-of-mass estimation in Study 1. Further supporting this account, we show that reducing the need for inflated shape representations diminishes such biases: Center-of-mass estimation improved when cues to shapehood were attenuated (Study 2) and when shapes' depths were explicitly depicted using real-life objects laser-cut from wood (Study 3). We suggest that the technique of shape inflation is actually implemented in the mind; thus, biases in our impressions of balance reflect a more general functional characteristic of object perception. PMID:27348290
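
    The abstract does not spell out the inflation procedure, but its flavor can be conveyed with a small sketch: treating each interior pixel's distance to the shape boundary as an assumed "inflation" height and comparing the flat 2D centroid with the height-weighted one. The names and the example shape below are hypothetical and only illustrate how inflating a shape can shift its estimated balance point; this is not the authors' model.

```python
# Illustrative sketch only: approximate "inflation" by assigning each pixel a
# height equal to its distance from the shape boundary, then compare the flat
# 2D centroid with the centroid of the inflated volume.
import numpy as np
from scipy.ndimage import distance_transform_edt

def centroids(mask):
    """Return (flat 2D centroid, inflated-volume centroid) as (row, col) pairs."""
    rows, cols = np.nonzero(mask)
    flat = (rows.mean(), cols.mean())
    height = distance_transform_edt(mask)          # assumed "inflation" height map
    total = height.sum()
    inflated = ((height * np.arange(mask.shape[0])[:, None]).sum() / total,
                (height * np.arange(mask.shape[1])[None, :]).sum() / total)
    return flat, inflated

# A key-like shape: a thick square head attached to a thin bar. Inflation
# weights the thick part more, pulling the estimated balance point toward it.
mask = np.zeros((24, 64), dtype=bool)
mask[2:22, 2:22] = True        # thick 20x20 square head
mask[11:13, 22:62] = True      # thin 2x40 bar
print(centroids(mask))
```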

  10. Phosphene Perception Relates to Visual Cortex Glutamate Levels and Covaries with Atypical Visuospatial Awareness.

    PubMed

    Terhune, Devin B; Murray, Elizabeth; Near, Jamie; Stagg, Charlotte J; Cowey, Alan; Cohen Kadosh, Roi

    2015-11-01

    Phosphenes are illusory visual percepts produced by the application of transcranial magnetic stimulation to occipital cortex. Phosphene thresholds, the minimum stimulation intensity required to reliably produce phosphenes, are widely used as an index of cortical excitability. However, the neural basis of phosphene thresholds and their relationship to individual differences in visual cognition are poorly understood. Here, we investigated the neurochemical basis of phosphene perception by measuring basal GABA and glutamate levels in primary visual cortex using magnetic resonance spectroscopy. We further examined whether phosphene thresholds would relate to the visuospatial phenomenology of grapheme-color synesthesia, a condition characterized by atypical binding and involuntary color photisms. Phosphene thresholds negatively correlated with glutamate concentrations in visual cortex, with lower thresholds associated with elevated glutamate. This relationship was robust, present in both controls and synesthetes, and exhibited neurochemical, topographic, and threshold specificity. Projector synesthetes, who experience color photisms as spatially colocalized with inducing graphemes, displayed lower phosphene thresholds than associator synesthetes, who experience photisms as internal images, with both exhibiting lower thresholds than controls. These results suggest that phosphene perception is driven by interindividual variation in glutamatergic activity in primary visual cortex and relates to cortical processes underlying individual differences in visuospatial awareness. PMID:25725043

  11. Phosphene Perception Relates to Visual Cortex Glutamate Levels and Covaries with Atypical Visuospatial Awareness

    PubMed Central

    Terhune, Devin B.; Murray, Elizabeth; Near, Jamie; Stagg, Charlotte J.; Cowey, Alan; Cohen Kadosh, Roi

    2015-01-01

    Phosphenes are illusory visual percepts produced by the application of transcranial magnetic stimulation to occipital cortex. Phosphene thresholds, the minimum stimulation intensity required to reliably produce phosphenes, are widely used as an index of cortical excitability. However, the neural basis of phosphene thresholds and their relationship to individual differences in visual cognition are poorly understood. Here, we investigated the neurochemical basis of phosphene perception by measuring basal GABA and glutamate levels in primary visual cortex using magnetic resonance spectroscopy. We further examined whether phosphene thresholds would relate to the visuospatial phenomenology of grapheme-color synesthesia, a condition characterized by atypical binding and involuntary color photisms. Phosphene thresholds negatively correlated with glutamate concentrations in visual cortex, with lower thresholds associated with elevated glutamate. This relationship was robust, present in both controls and synesthetes, and exhibited neurochemical, topographic, and threshold specificity. Projector synesthetes, who experience color photisms as spatially colocalized with inducing graphemes, displayed lower phosphene thresholds than associator synesthetes, who experience photisms as internal images, with both exhibiting lower thresholds than controls. These results suggest that phosphene perception is driven by interindividual variation in glutamatergic activity in primary visual cortex and relates to cortical processes underlying individual differences in visuospatial awareness. PMID:25725043

  12. Crossmodal Object-Based Attention: Auditory Objects Affect Visual Processing

    ERIC Educational Resources Information Center

    Turatto, M.; Mazza, V.; Umilta, C.

    2005-01-01

    According to the object-based view, visual attention can be deployed to "objects" or perceptual units, regardless of spatial locations. Recently, however, the notion of object has also been extended to the auditory domain, with some authors suggesting possible interactions between visual and auditory objects. Here we show that task-irrelevant…

  13. Factors Affecting the Reading Media Used by Visually Impaired Adults

    ERIC Educational Resources Information Center

    Goudiras, Dimitrios B.; Papadopoulos, Konstantinos S.; Koutsoklenis, Athanasios Ch.; Papageorgiou, Virginia E.; Stergiou, Maria S.

    2009-01-01

    The aim of this study was to examine reading media (braille, cassettes, screen-reader, screen-magnifier, large print, low vision aids, CCTV) used by visually impaired adults. This article reports the results of a research project involving 100 people with visual impairment. The participants were interviewed and asked to fill in a questionnaire to…

  14. Development of Visual Motion Perception for Prospective Control: Brain and Behavioral Studies in Infants

    PubMed Central

    Agyei, Seth B.; van der Weel, F. R. (Ruud); van der Meer, Audrey L. H.

    2016-01-01

    During infancy, smart perceptual mechanisms develop allowing infants to judge time-space motion dynamics more efficiently with age and locomotor experience. This emerging capacity may be vital to enable preparedness for upcoming events and to be able to navigate in a changing environment. Little is known about brain changes that support the development of prospective control and about processes, such as preterm birth, that may compromise it. As a function of perception of visual motion, this paper will describe behavioral and brain studies with young infants investigating the development of visual perception for prospective control. By means of the three visual motion paradigms of occlusion, looming, and optic flow, our research shows the importance of including behavioral data when studying the neural correlates of prospective control. PMID:26903908

  15. Visual behavior and perception of trajectories of moving objects with visual occlusion.

    PubMed

    Moreno, Francisco J; Luis, Vicente; Salgado, Francisco; García, Juan A; Reina, Raúl

    2005-08-01

    Experienced athletes in sports with moving objects have shown greater skill when using visual information to anticipate the direction of a moving object than nonexperienced athletes of those sports. Studies have shown that expert athletes are more effective than novices in occlusion situations in the first stages of the sports sequence. In this study, 12 athletes with different competitive experience in sports with moving objects viewed a sequence of tennis ball launches with and without visual occlusion, launched by a ball-shooting machine toward different areas with respect to the participant's position. The relation among visual behavior, occlusion time, and the precision of the task is reviewed. The spot where the balls bounced was analysed by a digital camera and visual behavior by an Eye Tracking System. Analysis showed that the nonexperienced athletes made significantly more errors and were more variable in visual occlusion conditions. Participants had a stable visual search strategy. PMID:16350604

  16. Does Viewing Documentary Films Affect Environmental Perceptions and Behaviors?

    ERIC Educational Resources Information Center

    Janpol, Henry L.; Dilts, Rachel

    2016-01-01

    This research explored whether viewing documentary films about the natural or built environment can exert a measurable influence on behaviors and perceptions. Different documentary films were viewed by subjects. One film emphasized the natural environment, while the other focused on the built environment. After viewing a film, a computer game…

  17. Affect and Acceptability: Exploring Teachers' Technology-Related Risk Perceptions

    ERIC Educational Resources Information Center

    Howard, Sarah K.

    2011-01-01

    Educational change, such as technology integration, involves risk. Teachers are encouraged to "take risks", but what risks they are asked to take and how do they perceive these risks? Developing an understanding of teachers' technology-related risk perceptions can help explain their choices and behaviours. This paper presents a way to understand…

  18. Teacher Perceptions Affect Boys' and Girls' Reading Motivation Differently

    ERIC Educational Resources Information Center

    Boerma, Inouk E.; Mol, Suzanne E.; Jolles, Jelle

    2016-01-01

    The aim of this study was to examine the relationship between teacher perceptions and children's reading motivation, with specific attention to gender differences. The reading self-concept, task value, and attitude of 160 fifth and sixth graders were measured. Teachers rated each student's reading comprehension. Results showed that for boys,…

  19. Vocal fundamental and formant frequencies affect perceptions of speaker cooperativeness.

    PubMed

    Knowles, Kristen K; Little, Anthony C

    2016-09-01

    In recent years, the perception of social traits in faces and voices has received much attention. Facial and vocal masculinity are linked to perceptions of trustworthiness; however, while feminine faces are generally considered to be trustworthy, vocal trustworthiness is associated with masculinized vocal features. Vocal traits such as pitch and formants have previously been associated with perceived social traits such as trustworthiness and dominance, but the link between these measurements and perceptions of cooperativeness has yet to be examined. In Experiment 1, cooperativeness ratings of male and female voices were examined against four vocal measurements: fundamental frequency (F0), pitch variation (F0-SD), formant dispersion (Df), and formant position (Pf). Feminine pitch traits (F0 and F0-SD) and masculine formant traits (Df and Pf) were associated with higher cooperativeness ratings. In Experiment 2, manipulated voices with feminized F0 were found to be more cooperative than voices with masculinized F0, among both male and female speakers, confirming our results from Experiment 1. Feminine pitch qualities may indicate an individual who is friendly and non-threatening, while masculine formant qualities may reflect an individual that is socially dominant or prestigious, and the perception of these associated traits may influence the perceived cooperativeness of the speakers. PMID:26360784
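
    Of the vocal measurements listed above, formant dispersion (Df) is conventionally computed as the mean spacing between successive formant frequencies. The short sketch below illustrates that convention; the example formant values are hypothetical and the function name is an assumption, not code from the study.

```python
# Illustrative sketch (not code from the study): formant dispersion (Df)
# computed as the mean spacing between successive formant frequencies.
def formant_dispersion(formants_hz):
    """Mean difference between adjacent formant frequencies, in Hz."""
    diffs = [f2 - f1 for f1, f2 in zip(formants_hz, formants_hz[1:])]
    return sum(diffs) / len(diffs)

# Hypothetical first four formants of a speaker (Hz).
print(formant_dispersion([500.0, 1500.0, 2500.0, 3500.0]))  # -> 1000.0
```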

  20. Specific Previous Experience Affects Perception of Harmony and Meter

    ERIC Educational Resources Information Center

    Creel, Sarah C.

    2011-01-01

    Prior knowledge shapes our experiences, but which prior knowledge shapes which experiences? This question is addressed in the domain of music perception. Three experiments were used to determine whether listeners activate specific musical memories during music listening. Each experiment provided listeners with one of two musical contexts that was…

  1. Negative Affect, Risk Perception, and Adolescent Risk Behavior

    ERIC Educational Resources Information Center

    Curry, Laura A.; Youngblade, Lise M.

    2006-01-01

    The prevalence, etiology, and consequences of adolescent risk behavior have stimulated much research. The current study examined relationships among anger and depressive symptomatology (DS), risk perception, self-restraint, and adolescent risk behavior. Telephone surveys were conducted with 290 14- to 20-year-olds (173 females; M = 15.98 years).…

  2. Proprioceptive Body Illusions Modulate the Visual Perception of Reaching Distance

    PubMed Central

    Petroni, Agustin; Carbajal, M. Julia; Sigman, Mariano

    2015-01-01

    The neurobiology of reaching has been extensively studied in human and non-human primates. However, the mechanisms that allow a subject to decide—without engaging in explicit action—whether an object is reachable are not fully understood. Some studies conclude that decisions near the reach limit depend on motor simulations of the reaching movement. Others have shown that the body schema plays a role in explicit and implicit distance estimation, especially after motor practice with a tool. In this study we evaluate the causal role of multisensory body representations in the perception of reachable space. We reasoned that if body schema is used to estimate reach, an illusion of the finger size induced by proprioceptive stimulation should propagate to the perception of reaching distances. To test this hypothesis we induced a proprioceptive illusion of extension or shrinkage of the right index finger while participants judged a series of LEDs as reachable or non-reachable without actual movement. Our results show that reach distance estimation depends on the illusory perceived size of the finger: illusory elongation produced a shift of reaching distance away from the body whereas illusory shrinkage produced the opposite effect. Combining these results with previous findings, we suggest that deciding if a target is reachable requires an integration of body inputs in high order multisensory parietal areas that engage in movement simulations through connections with frontal premotor areas. PMID:26110274

  3. Temporal sensitivity. [time dependent human perception of visual stimuli]

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1986-01-01

    Human visual temporal sensitivity is examined. The stimuli used to measure temporal sensitivity are described and the linear systems theory is reviewed in terms of temporal sensitivity. A working model which represents temporal sensitivity is proposed. The visibility of a number of temporal wave forms, sinusoids, rectangular pulses, and pulse pairs, is analyzed. The relation between spatial and temporal effects is studied. Temporal variations induced by image motion and the effects of light adaptation on temporal sensitivity are considered.
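
    The linear-systems treatment of temporal sensitivity summarized above is commonly expressed as a temporal filter; the formulation below is a generic illustrative sketch with assumed symbols, not the working model proposed in the report.

```latex
% Illustrative linear-systems sketch (symbols assumed): the response r(t) to a
% temporal stimulus s(t) is the output of a linear filter with impulse
% response h(t), and sensitivity to a sinusoid of frequency f follows the
% filter's amplitude response |H(f)|.
\[
r(t) = \int_{0}^{\infty} h(\tau)\, s(t-\tau)\, d\tau,
\qquad
S(f) \propto \lvert H(f) \rvert .
\]
```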

  4. How Does the Use of Visual Media Affect a Nonverbal Student's Communication?

    ERIC Educational Resources Information Center

    Remmel-Gehm, Mary T.

    This report discusses the outcomes of a study that investigated how visual media would affect the communication skills of a 13-year-old nonverbal girl with cerebral palsy and whether the use of visual media would provide documentation of higher cognitive functioning. For the study, the subject used three different tools to add visual information…

  5. Visual perception and mixed-initiative interaction for assisted visualization design.

    PubMed

    Healey, Christopher; Kocherlakota, Sarat; Rao, Vivek; Mehta, Reshma; St Amant, Robert

    2008-01-01

    This paper describes the integration of perceptual guidelines from human vision with an AI-based mixed-initiative search strategy. The result is a visualization assistant called ViA, a system that collaborates with its users to identify perceptually salient visualizations for large, multidimensional datasets. ViA applies knowledge of low-level human vision to: (1) evaluate the effectiveness of a particular visualization for a given dataset and analysis tasks; and (2) rapidly direct its search towards new visualizations that are most likely to offer improvements over those seen to date. Context, domain expertise, and a high-level understanding of a dataset are critical to identifying effective visualizations. We apply a mixed-initiative strategy that allows ViA and its users to share their different strengths and continually improve ViA's understanding of a user's preferences. We visualize historical weather conditions to compare ViA's search strategy to exhaustive analysis, simulated annealing, and reactive tabu search, and to measure the improvement provided by mixed-initiative interaction. We also visualize intelligent agents competing in a simulated online auction to evaluate ViA's perceptual guidelines. Results from each study are positive, suggesting that ViA can construct high-quality visualizations for a range of real-world datasets. PMID:18192718

  6. On the relationship between personal experience, affect and risk perception: The case of climate change

    PubMed Central

    van der Linden, Sander

    2014-01-01

    Examining the conceptual relationship between personal experience, affect, and risk perception is crucial in improving our understanding of how emotional and cognitive process mechanisms shape public perceptions of climate change. This study is the first to investigate the interrelated nature of these variables by contrasting three prominent social-psychological theories. In the first model, affect is viewed as a fast and associative information processing heuristic that guides perceptions of risk. In the second model, affect is seen as flowing from cognitive appraisals (i.e., affect is thought of as a post-cognitive process). Lastly, a third, dual-process model is advanced that integrates aspects from both theoretical perspectives. Four structural equation models were tested on a national sample (N = 808) of British respondents. Results initially provide support for the “cognitive” model, where personal experience with extreme weather is best conceptualized as a predictor of climate change risk perception and, in turn, risk perception a predictor of affect. Yet, closer examination strongly indicates that at the same time, risk perception and affect reciprocally influence each other in a stable feedback system. It is therefore concluded that both theoretical claims are valid and that a dual-process perspective provides a superior fit to the data. Implications for theory and risk communication are discussed. © 2014 The Authors. European Journal of Social Psychology published by John Wiley & Sons, Ltd. PMID:25678723

  7. Acute Zonal Occult Outer Retinopathy in Japanese Patients: Clinical Features, Visual Function, and Factors Affecting Visual Function

    PubMed Central

    Saito, Saho; Saito, Wataru; Saito, Michiyuki; Hashimoto, Yuki; Mori, Shohei; Noda, Kousuke; Namba, Kenichi; Ishida, Susumu

    2015-01-01

    Purpose To evaluate the clinical features and investigate their relationship with visual function in Japanese patients with acute zonal occult outer retinopathy (AZOOR). Methods Fifty-two eyes of 38 Japanese AZOOR patients (31 female and 7 male patients; mean age at first visit, 35.0 years; median follow-up duration, 31 months) were retrospectively collected: 31 untreated eyes with good visual acuity and 21 systemic corticosteroid-treated eyes with progressive visual acuity loss. Variables affecting the logMAR values of best-corrected visual acuity (BCVA) and the mean deviation (MD) on Humphrey perimetry at initial and final visits were examined using multiple stepwise linear regression analysis. Results In untreated eyes, the mean MD at the final visit was significantly higher than that at the initial visit (P = 0.00002). In corticosteroid-treated eyes, the logMAR BCVA and MD at the final visit were significantly better than the initial values (P = 0.007 and P = 0.02, respectively). The final logMAR BCVA was 0.0 or less in 85% of patients. Variables affecting initial visual function were moderate anterior vitreous cells, myopia severity, and a-wave amplitudes on electroretinography; factors affecting final visual function were the initial MD values, female sex, moderate anterior vitreous cells, and retinal atrophy. Conclusions Our data indicated that visual functions in enrolled patients significantly improved spontaneously or after systemic corticosteroids therapy, suggesting that Japanese patients with AZOOR have good visual outcomes during the follow-up period of this study. Furthermore, initial visual field defects, gender, anterior vitreous cells, and retinal atrophy affected final visual functions in these patients. PMID:25919689

  8. Confinement has no effect on visual space perception: The results of the Mars-500 experiment.

    PubMed

    Sikl, Radovan; Simeček, Michal

    2014-02-01

    People confined to a closed space live in a visual environment that differs from a natural open-space environment in several respects. The view is restricted to no more than a few meters, and nearby objects cannot be perceived relative to the position of a horizon. Thus, one might expect to find changes in visual space perception as a consequence of the prolonged experience of confinement. The subjects in our experimental study were participants of the Mars-500 project and spent nearly a year and a half isolated from the outside world during a simulated mission to Mars. The participants were presented with a battery of computer-based psychophysical tests examining their performance on various 3-D perception tasks, and we monitored changes in their perceptual performance throughout their confinement. Contrary to our expectations, no serious effect of the confinement on the crewmembers' 3-D perception was observed in any experiment. Several interpretations of these findings are discussed, including the possibilities that (1) the crewmembers' 3-D perception really did not change significantly, (2) changes in 3-D perception were manifested in the precision rather than the accuracy of perceptual judgments, and/or (3) the experimental conditions and the group sample were problematic. PMID:24288139

  9. Visual perception and stereoscopic imaging: an artist's perspective

    NASA Astrophysics Data System (ADS)

    Mason, Steve

    2015-03-01

    This paper continues my February 2014 IS&T/SPIE Convention exploration of the relationship between stereoscopic vision and consciousness (90141F-1). It was proposed then that by using stereoscopic imaging people may consciously experience, or see, what they are viewing, and thereby become more aware of the way their brains manage and interpret visual information. Environmental imaging was suggested as a way to accomplish this. This paper is the result of further investigation, research, and follow-up imaging. A show of images resulting from this research allows viewers to experience for themselves the effects of stereoscopy on consciousness. Creating dye-infused aluminum prints while employing ChromaDepth® 3D glasses, I hope not only to raise awareness of visual processing but also to explore the differences and similarities between the artist and the scientist―art increases right-brain spatial consciousness, not only empirical thinking, while furthering the viewer's cognizance of the process of seeing. The artist must abandon preconceptions and expectations, despite what the evidence and experience may indicate, in order to see what is happening in his work and to allow it to develop in ways he/she could never anticipate. This process is then revealed to the viewer in a show of work. It is in the experiencing, not just the thinking, that insight is achieved. Directing the viewer's awareness during the experience using stereoscopic imaging allows for further understanding of the brain's function in the visual process. A cognitive transformation occurs, the proverbial "left/right brain shift," in order for viewers to "see" the space. Using what we know from recent brain research, these images will draw on certain parts of the brain when viewed in two dimensions and different ones when viewed stereoscopically, a shift that is quite noticeable if one is looking for it. People who have experienced these images in the context of examining their own

  10. A novel evaluation metric based on visual perception for moving target detection algorithm

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Liu, Lei; Cui, Minjie; Li, He

    2016-05-01

    Traditional performance indices for moving-target detection algorithms each emphasize a different aspect of performance, which makes it difficult to evaluate an algorithm comprehensively and objectively. In particular, when the detection results of different algorithms contain the same numbers of foreground and background pixels, the traditional indices take identical values and cannot distinguish the algorithms; this is an inherent limitation of indices computed on a per-pixel basis. To address this problem, and drawing on characteristics of the human visual perception system, this paper presents a new evaluation index, Visual Fluctuation (VF), which is computed over image blocks rather than individual pixels, to evaluate the performance of moving-target detection algorithms. Experiments showed that the new perception-based index compensates for the deficiency of the traditional ones: its results agree with human visual perception and evaluate the performance of moving-target detection algorithms more objectively.
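
    The abstract does not define Visual Fluctuation precisely; the sketch below only illustrates the general idea of scoring a detection mask against ground truth per block rather than per pixel, so that two results with identical pixel counts can still be distinguished. The function, parameters, and block size are hypothetical, not the paper's VF formula.

```python
# Illustrative sketch only: a block-based comparison of a binary detection
# mask against ground truth, in the spirit of the block-level index the
# abstract describes. This is not the paper's actual VF formula.
import numpy as np

def block_error(detection, truth, block=16):
    """Mean absolute difference in foreground density per block."""
    h, w = truth.shape
    errors = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            d = detection[y:y+block, x:x+block].mean()
            t = truth[y:y+block, x:x+block].mean()
            errors.append(abs(d - t))
    return float(np.mean(errors))

# Two detections with identical pixel counts can still differ per block,
# so a block-level score can separate them where per-pixel counts cannot.
rng = np.random.default_rng(0)
truth = rng.random((64, 64)) < 0.2
print(block_error(truth.astype(float), truth))   # identical masks -> 0.0
```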

  11. Perception and performance in flight simulators: The contribution of vestibular, visual, and auditory information

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The pilot's perception and performance in flight simulators are examined. The areas investigated include: vestibular stimulation, flight management and man-cockpit information interfacing, and visual perception in flight simulation. The effects of higher levels of rotary acceleration on response time to constant acceleration, tracking performance, and thresholds for angular acceleration are examined. Areas of flight management examined are cockpit display of traffic information, workload, synthetic speech callouts during the landing phase of flight, perceptual factors in the use of a microwave landing system, automatic speech recognition, automation of aircraft operation, and total simulation of flight training.

  12. Motion perception and visual signal design in Anolis lizards

    PubMed Central

    Fleishman, Leo J.; Pallus, Adam C.

    2010-01-01

    Anolis lizards communicate with displays consisting of motion of the head and body. Early portions of long-distance displays require movements that are effective at eliciting the attention of potential receivers. We studied signal-motion efficacy using a two-dimensional visual-motion detection (2DMD) model consisting of a grid of correlation-type elementary motion detectors. This 2DMD model has been shown to accurately predict Anolis lizard behavioural response. We tested different patterns of artificially generated motion and found that an abrupt 0.3° shift of position in less than 100 ms is optimal. We quantified motion in displays of 25 individuals from five species. Four species employ near-optimal movement patterns. We tested displays of these species using the 2DMD model on scenes with and without moderate wind. Display movements can easily be detected, even in the presence of windblown vegetation. The fifth species does not typically use the most effective display movements and display movements cannot be discerned by the 2DMD model in the presence of windblown vegetation. A number of Anolis species use abrupt up-and-down head movements approximately 10 mm in amplitude in displays, and these movements appear to be extremely effective for stimulating the receiver visual system. PMID:20591869
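
    A correlation-type elementary motion detector of the kind the 2DMD model is built from can be sketched in a few lines. The filter choice, time constant, and test signal below are assumptions for illustration; they are not the parameters used in the study.

```python
# Minimal sketch of a correlation-type (Reichardt) elementary motion detector:
# each input is delayed (here by a first-order low-pass filter), multiplied
# with the undelayed signal from the neighbouring input, and the two products
# are subtracted to give a direction-selective output.
import numpy as np

def reichardt_emd(left, right, tau=5.0):
    """Opponent EMD output for two 1-D luminance time series."""
    def lowpass(x):                       # first-order low-pass acts as the delay
        y = np.zeros_like(x, dtype=float)
        for t in range(1, len(x)):
            y[t] = y[t - 1] + (x[t] - y[t - 1]) / tau
        return y
    return lowpass(left) * right - lowpass(right) * left

# A pattern drifting from the left input toward the right input yields a
# mostly positive output; the opposite direction yields a negative output.
t = np.arange(200)
signal_at_left = np.sin(2 * np.pi * t / 40)
signal_at_right = np.sin(2 * np.pi * (t - 5) / 40)     # arrives 5 samples later
print(reichardt_emd(signal_at_left, signal_at_right).mean())  # > 0
```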

  13. Restoring visual perception using microsystem technologies: engineering and manufacturing perspectives.

    PubMed

    Krisch, I; Hosticka, B J

    2007-01-01

    Microsystem technologies offer significant advantages in the development of neural prostheses. In the last two decades, it has become feasible to develop intelligent prostheses that are fully implantable into the human body with respect to functionality, complexity, size, weight, and compactness. Design and development require collaboration among various disciplines, including physicians, engineers, and scientists. The retina implant system can be taken as one sophisticated example of a prosthesis which bypasses neural defects and enables direct electrical stimulation of nerve cells. This implantable visual microprosthesis helps blind patients return to a more normal course of life. The retina implant is intended for patients suffering from retinitis pigmentosa or macular degeneration. In this contribution, we focus on the epiretinal prosthesis and discuss topics such as system design, data and power transfer, fabrication, packaging, and testing. In detail, the system is based upon an implantable micro electro stimulator which is powered and controlled via a wireless inductive link. Microelectronic circuits for data encoding and stimulation are assembled on flexible substrates with an integrated electrode array. The implant system is encapsulated using parylene C and silicone rubber. Results from in vivo experiments demonstrate retinotopic activation of the visual cortex. PMID:17691337

  14. Global motion perception deficits in autism are reflected as early as primary visual cortex

    PubMed Central

    Thomas, Cibu; Kravitz, Dwight J.; Wallace, Gregory L.; Baron-Cohen, Simon; Martin, Alex; Baker, Chris I.

    2014-01-01

    Individuals with autism are often characterized as ‘seeing the trees, but not the forest’—attuned to individual details in the visual world at the expense of the global percept they compose. Here, we tested the extent to which global processing deficits in autism reflect impairments in (i) primary visual processing; or (ii) decision-formation, using an archetypal example of global perception, coherent motion perception. In an event-related functional MRI experiment, 43 intelligence quotient and age-matched male participants (21 with autism, age range 15–27 years) performed a series of coherent motion perception judgements in which the amount of local motion signals available to be integrated into a global percept was varied by controlling stimulus viewing duration (0.2 or 0.6 s) and the proportion of dots moving in the correct direction (coherence: 4%, 15%, 30%, 50%, or 75%). Both typical participants and those with autism evidenced the same basic pattern of accuracy in judging the direction of motion, with performance decreasing with reduced coherence and shorter viewing durations. Critically, these effects were exaggerated in autism: despite equal performance at the long duration, performance was more strongly reduced by shortening viewing duration in autism (P < 0.015) and decreasing stimulus coherence (P < 0.008). To assess the neural correlates of these effects we focused on the responses of primary visual cortex and the middle temporal area, critical in the early visual processing of motion signals, as well as a region in the intraparietal sulcus thought to be involved in perceptual decision-making. The behavioural results were mirrored in both primary visual cortex and the middle temporal area, with a greater reduction in response at short, compared with long, viewing durations in autism compared with controls (both P < 0.018). In contrast, there was no difference between the groups in the intraparietal sulcus (P > 0.574). These findings suggest that

  15. Visual Perception and Regulatory Conflict: Motivation and Physiology Influence Distance Perception

    ERIC Educational Resources Information Center

    Cole, Shana; Balcetis, Emily; Zhang, Sam

    2013-01-01

    Regulatory conflict can emerge when people experience a strong motivation to act on goals but a conflicting inclination to withhold action because physical resources available, or "physiological potentials", are low. This study demonstrated that distance perception is biased in ways that theory suggests assists in managing this conflict.…

  16. Factors Affecting Parent's Perception on Air Quality-From the Individual to the Community Level.

    PubMed

    Guo, Yulin; Liu, Fengfeng; Lu, Yuanan; Mao, Zongfu; Lu, Hanson; Wu, Yanyan; Chu, Yuanyuan; Yu, Lichen; Liu, Yisi; Ren, Meng; Li, Na; Chen, Xi; Xiang, Hao

    2016-01-01

    The perception of air quality significantly affects the public's acceptance of the government's environmental policies. The aim of this research is to explore the relationship between parents' perception of air quality and scientific monitoring data and to analyze the factors that affect parents' perceptions. Scientific data on air quality were obtained from Wuhan's environmental condition reports. One thousand parents were surveyed about their knowledge and perception of air quality. The scientific data show that the air quality of Wuhan follows a generally improving trend, while most participants believed that the air quality of Wuhan has deteriorated, indicating a significant gap between public perception and reality. On the individual level, respondents aged 40 or above (40 or above: OR = 3.252; 95% CI: 1.170-9.040), with a higher educational level (college and above: OR = 7.598; 95% CI: 2.244-25.732), or with children in poor health (poor: OR = 6.864; 95% CI: 2.212-21.302) had a much more negative perception of air quality. On the community level, industrial facilities, vehicles, and city construction have major effects on parents' perception of air quality. Our investigation provides baseline information for environmental policy researchers and policymakers regarding the public's perception and expectations of air quality, which can benefit the completion and enforcement of environmental policy. PMID:27187432

  17. Event Boundaries in Perception Affect Memory Encoding and Updating

    ERIC Educational Resources Information Center

    Swallow, Khena M.; Zacks, Jeffrey M.; Abrams, Richard A.

    2009-01-01

    Memory for naturalistic events over short delays is important for visual scene processing, reading comprehension, and social interaction. The research presented here examined relations between how an ongoing activity is perceptually segmented into events and how those events are remembered a few seconds later. In several studies, participants…

  18. Implied Dynamics Biases the Visual Perception of Velocity

    PubMed Central

    La Scaleia, Barbara; Zago, Myrka; Moscatelli, Alessandro; Lacquaniti, Francesco; Viviani, Paolo

    2014-01-01

    We expand on the anecdotal report by Johansson that back-and-forth linear harmonic motions appear uniform. Six experiments explore the role of the shape and spatial orientation of the trajectory of a point-light target in the perceptual judgment of uniform motion. In Experiment 1, the target oscillated back and forth along a circular arc around an invisible pivot. The imaginary segment from the pivot to the midpoint of the trajectory could be oriented vertically downward (consistent with an upright pendulum), horizontally leftward, or vertically upward (upside-down). In Experiments 2 to 5, the target moved uni-directionally. The effect of suppressing the alternation of movement directions was tested with curvilinear (Experiments 2 and 3) or rectilinear (Experiments 4 and 5) paths. Experiment 6 replicated the upright condition of Experiment 1, but participants were asked to hold their gaze on a fixation point. When some features of the trajectory evoked the motion of either a simple pendulum or a mass-spring system, observers identified as uniform the kinematic profiles close to harmonic motion. The bias towards harmonic motion was most consistent in the upright orientation of Experiments 1 and 6. The bias disappeared when the stimuli were incompatible with both pendulum and mass-spring models (Experiments 3 to 5). The results are compatible with the hypothesis that the perception of dynamic stimuli is biased by the laws of motion obeyed by natural events, so that only natural motions appear uniform. PMID:24667578
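
    The contrast at the heart of the abstract can be made explicit: back-and-forth harmonic motion has a strongly nonuniform speed profile, yet it is the profile reported as appearing uniform. The expressions below are a generic illustration with assumed symbols.

```latex
% Illustration: harmonic motion of amplitude A and angular frequency omega
% has a speed that varies between 0 and A*omega within every half-cycle.
\[
x(t) = A\sin(\omega t), \qquad
v(t) = \dot{x}(t) = A\omega\cos(\omega t), \qquad
\lvert v(t) \rvert \in [0,\, A\omega].
\]
```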

  19. The visual perception of distance ratios in physical space.

    PubMed

    Norman, J Farley; Adkins, Olivia C; Pedersen, Lauren E

    2016-06-01

    Past studies have consistently demonstrated that human observers cannot accurately perceive environmental distances. Even so, we obviously detect sufficient spatial information to meet the demands of everyday life. In the current experiment, ten younger adults (mean age was 21.8 years) and ten older adults (mean age was 72.3 years) estimated distance ratios in physical space. On any given trial, observers judged how long one distance interval was relative to another. The 18 stimulus ratios ranged from 1.0 to 9.5; the observers judged each stimulus ratio three times. The average correlation coefficient relating actual distance ratios to perceived ratios was identical (r=0.87) for both younger and older age groups. Despite this strong relationship between perception and reality, the judgments of many individual observers were inaccurate. For example, ten percent of the observers overestimated the stimulus ratios, while fifty percent underestimated the stimulus ratios. Although both under- and overestimation occurred in the current experiment, the results nevertheless demonstrate that human adults can reliably compare environmental distances in different directions. PMID:27155022

  20. Visual perception of effervescence in champagne and other sparkling beverages.

    PubMed

    Liger-Belair, Gérard

    2010-01-01

    The so-called effervescence process, which enlivens champagne, sparkling wines, beers, and carbonated beverages in general, is the result of the fine interplay between CO₂-dissolved gas molecules, tiny air pockets trapped within microscopic particles during the pouring process, and some liquid properties. This chapter summarizes recent advances obtained during the last decade concerning the physicochemical processes behind the nucleation, rise, and burst of bubbles found in glasses poured with sparkling beverages. Those phenomena observed in close-up through high-speed photography are often visually appealing. Moreover, the kinetics of gas discharging from freshly poured glasses was monitored with time, whether champagne is served into a flute or into a coupe. The role of temperature was also examined. We hope that your enjoyment of champagne will be enhanced after reading this fully illustrated review dedicated to the deep beauties of nature often hidden behind many everyday phenomena. PMID:21092901

  1. The Conductor As Visual Guide: Gesture and Perception of Musical Content.

    PubMed

    Kumar, Anita B; Morrison, Steven J

    2016-01-01

    Ensemble conductors are often described as embodying the music. Researchers have determined that expressive gestures affect viewers' perceptions of conducted ensemble performances. This effect may be due, in part, to conductor gesture delineating and amplifying specific expressive aspects of music performances. The purpose of the present study was to determine if conductor gesture affected observers' focus of attention to contrasting aspects of ensemble performances. Audio recordings of two different music excerpts featuring two-part counterpoint (an ostinato paired with a lyric melody, and long chord tones paired with rhythmic interjections) were paired with video of two conductors. Each conductor used gesture appropriate to one or the other musical element (e.g., connected and flowing or detached and crisp) for a total of sixteen videos. Musician participants evaluated 8 of the excerpts for Articulation, Rhythm, Style, and Phrasing using four 10-point differential scales anchored by descriptive terms (e.g., disconnected to connected, and angular to flowing.) Results indicated a relationship between gesture and listeners' evaluations of musical content. Listeners appear to be sensitive to the manner in which a conductor's gesture delineates musical lines, particularly as an indication of overall articulation and style. This effect was observed for the lyric melody and ostinato excerpt, but not for the chords and interjections excerpt. Therefore, this effect appears to be mitigated by the congruence of gesture to preconceptions of the importance of melodic over rhythmic material, of certain instrument timbres over others, and of length between onsets of active material. These results add to a body of literature that supports the importance of the visual component in the multimodal experience of music performance. PMID:27458425

  2. The Conductor As Visual Guide: Gesture and Perception of Musical Content

    PubMed Central

    Kumar, Anita B.; Morrison, Steven J.

    2016-01-01

    Ensemble conductors are often described as embodying the music. Researchers have determined that expressive gestures affect viewers’ perceptions of conducted ensemble performances. This effect may be due, in part, to conductor gesture delineating and amplifying specific expressive aspects of music performances. The purpose of the present study was to determine if conductor gesture affected observers’ focus of attention to contrasting aspects of ensemble performances. Audio recordings of two different music excerpts featuring two-part counterpoint (an ostinato paired with a lyric melody, and long chord tones paired with rhythmic interjections) were paired with video of two conductors. Each conductor used gesture appropriate to one or the other musical element (e.g., connected and flowing or detached and crisp) for a total of sixteen videos. Musician participants evaluated 8 of the excerpts for Articulation, Rhythm, Style, and Phrasing using four 10-point differential scales anchored by descriptive terms (e.g., disconnected to connected, and angular to flowing.) Results indicated a relationship between gesture and listeners’ evaluations of musical content. Listeners appear to be sensitive to the manner in which a conductor’s gesture delineates musical lines, particularly as an indication of overall articulation and style. This effect was observed for the lyric melody and ostinato excerpt, but not for the chords and interjections excerpt. Therefore, this effect appears to be mitigated by the congruence of gesture to preconceptions of the importance of melodic over rhythmic material, of certain instrument timbres over others, and of length between onsets of active material. These results add to a body of literature that supports the importance of the visual component in the multimodal experience of music performance. PMID:27458425

  3. Visual Crowding: a fundamental limit on conscious perception and object recognition

    PubMed Central

    Whitney, David; Levi, Dennis M.

    2011-01-01

    Crowding, the inability to recognize objects in clutter, sets a fundamental limit on conscious visual perception and object recognition throughout most of the visual field. Despite how widespread and essential it is to object recognition, reading, and visually guided action, a solid operational definition of what crowding is has only recently become clear. The goal of this review is to provide a broad-based synthesis of the most recent findings in this area, to define what crowding is and is not, and to set the stage for future work that will extend crowding well beyond low-level vision. Here we define five diagnostic criteria for what counts as crowding, and further describe factors that both escape and break crowding. All of these lead to the conclusion that crowding occurs at multiple stages in the visual hierarchy. PMID:21420894

  4. Fornix and medial temporal lobe lesions lead to comparable deficits in complex visual perception.

    PubMed

    Lech, Robert K; Koch, Benno; Schwarz, Michael; Suchan, Boris

    2016-05-01

    Recent research dealing with the structures of the medial temporal lobe (MTL) has shifted away from exclusively investigating memory-related processes and has repeatedly incorporated the investigation of complex visual perception. Several studies have demonstrated that higher level visual tasks can recruit structures like the hippocampus and perirhinal cortex in order to successfully perform complex visual discriminations, leading to a perceptual-mnemonic or representational view of the medial temporal lobe. The current study employed a complex visual discrimination paradigm in two patients suffering from brain lesions with differing locations and origin. Both patients, one with extensive medial temporal lobe lesions (VG) and one with a small lesion of the anterior fornix (HJK), were impaired in complex discriminations while showing otherwise mostly intact cognitive functions. The current data confirmed previous results while also extending the perceptual-mnemonic theory of the MTL to the main output structure of the hippocampus, the fornix. PMID:26994782

  5. Characteristics of Visual-Perceptual Function Measured by the Motor-Free Visual Perception Test-3 in Korean Adults

    PubMed Central

    Han, A-Reum; Kim, Doo-Yung; Choi, Tae-Woong; Moon, Hyun-Im; Ryu, Byung-Joo; Yang, Seung-Nam

    2014-01-01

    Objective To adapt and standardize the Motor-Free Visual Perception Test-3 (MVPT-3) to Koreans and investigate the change in visual-perceptual function using the MVPT-3 in healthy Korean adults. Methods The Korean version of the MVPT-3 was developed through a cross-cultural adaptation process according to 6 steps, including translation, reconciliation, back translation, cognitive debriefing, feedback, and final reconciliation. A total of 321 healthy Korean volunteers (mean age, 51.05 years) were recruited. We collected participant demographic data, such as sex, age, and years of education, and performed the Korean version of the Mini-Mental State Examination (K-MMSE) and MVPT-3. Internal consistency of the MVPT-3 and the relationships between demographic data, K-MMSE and MVPT-3 scores were analyzed. The results of this study were compared with published data from western countries including the United States and Canada. Results Total score on the MVPT-3 was positively correlated with years of education (r=0.715, p<0.001) and K-MMSE score (r=0.718, p<0.001). However, it had a negative correlation with age (r=-0.669, p<0.001). A post-hoc analysis of MVPT-3 scores classified age into 5 groups of ≤49, 50-59, 60-69, 70-79, ≥80 years and years of education into 4 groups of 0, 1-9, 10-12, ≥13 years. No significant differences in MVPT-3 scores were observed according to sex or country. Conclusion Visual perception was significantly influenced by age, years of education, and cognitive function. Reference values for the MVPT-3 provided in this study will be useful for evaluating and planning a rehabilitation program of visual perceptual function in patients with brain disorders. PMID:25229034

  6. The Effect of a Computerized Visual Perception and Visual-Motor Integration Training Program on Improving Chinese Handwriting of Children with Handwriting Difficulties

    ERIC Educational Resources Information Center

    Poon, K. W.; Li-Tsang, C. W .P.; Weiss, T. P. L.; Rosenblum, S.

    2010-01-01

    This study aimed to investigate the effect of a computerized visual perception and visual-motor integration training program to enhance Chinese handwriting performance among children with learning difficulties, particularly those with handwriting problems. Participants were 26 primary-one children who were assessed by educational psychologists and…

  7. Visual Perception and Visual-Motor Integration in Very Preterm and/or Very Low Birth Weight Children: A Meta-Analysis

    ERIC Educational Resources Information Center

    Geldof, C. J. A.; van Wassenaer, A. G.; de Kieviet, J. F.; Kok, J. H.; Oosterlaan, J.

    2012-01-01

    A range of neurobehavioral impairments, including impaired visual perception and visual-motor integration, are found in very preterm born children, but reported findings show great variability. We aimed to aggregate the existing literature using meta-analysis, in order to provide robust estimates of the effect of very preterm birth on visual…

  8. Syntactic texture and perception for a new generic visual anomalies classification

    NASA Astrophysics Data System (ADS)

    Désage, Simon-Frédéric; Pitard, Gilles; Pillet, Maurice; Favrelière, Hugues; Maire, Jean-Luc; Frelin, Fabrice; Samper, Serge; Le Goïc, Gaëtan

    2015-04-01

    The purpose of this research is to improve the detection and evaluation of aesthetic anomalies based on what the human eye perceives and on the 2006 CIE report.1 It is therefore important to define parameters able to discriminate surfaces in accordance with human visual perception. Our starting point in assessing aesthetic anomalies is the geometric description defined by the ISO standard,2 i.e., translating anomaly descriptions into perceptual terms about their impact on texture. However, human inspectors detect an aesthetic anomaly by its visual effect and then interpret it in terms of its geometric description. The research question is how to define generic parameters for discriminating aesthetic anomalies from enriched visual-texture information, such as that provided by recent surface visual-rendering approaches. We propose an approach from visual texture processing that quantifies spatial variations of pixels to capture changes in color, material, and relief. From a set of images taken under different lighting angles, which gives access to the surface appearance, we propose a path from visual effect to the geometric specification of the aesthetic anomalies identified by current standards.

  9. Task-dependent calibration of auditory spatial perception through environmental visual observation.

    PubMed

    Tonelli, Alessia; Brayda, Luca; Gori, Monica

    2015-01-01

    Visual information is paramount to space perception. Vision influences auditory space estimation. Many studies show that simultaneous visual and auditory cues improve precision of the final multisensory estimate. However, the amount or the temporal extent of visual information, that is sufficient to influence auditory perception, is still unknown. It is therefore interesting to know if vision can improve auditory precision through a short-term environmental observation preceding the audio task and whether this influence is task-specific or environment-specific or both. To test these issues we investigate possible improvements of acoustic precision with sighted blindfolded participants in two audio tasks [minimum audible angle (MAA) and space bisection] and two acoustically different environments (normal room and anechoic room). With respect to a baseline of auditory precision, we found an improvement of precision in the space bisection task but not in the MAA after the observation of a normal room. No improvement was found when performing the same task in an anechoic chamber. In addition, no difference was found between a condition of short environment observation and a condition of full vision during the whole experimental session. Our results suggest that even short-term environmental observation can calibrate auditory spatial performance. They also suggest that echoes can be the cue that underpins visual calibration. Echoes may mediate the transfer of information from the visual to the auditory system. PMID:26082692

  10. Document region classification using low-resolution images: a human visual perception approach

    NASA Astrophysics Data System (ADS)

    Chacon Murguia, Mario I.; Jordan, Jay B.

    1999-10-01

    This paper describes the design of a document region classifier. The regions of a document are classified as large text regions (LTR) and non-LTR. The foundations of the classifier are derived from theories of human visual perception: texture discrimination based on textons, and perceptual grouping. Based on these theories, the classification task is stated as a texture discrimination problem and is implemented as a preattentive process. Once the foundations of the classifier are defined, engineering techniques are developed to extract features for deciding the class of information contained in the regions. The feature derived from the human visual perception theories is a measurement of the periodicity of the blobs in text regions. This feature is used to design a statistical classifier, based on the minimum probability-of-error criterion, to perform the classification of LTR and non-LTR. The method was tested on free-format, low-resolution document images, achieving 93% correct recognition.
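
    The decision rule implied by a minimum probability-of-error criterion on a single scalar feature can be sketched as follows. This is a hedged illustration only: the Gaussian class-conditional models, the prior, and the example parameter values are assumptions for demonstration, not the paper's fitted values.

        from scipy.stats import norm

        def classify_region(periodicity, prior_ltr=0.5,
                            ltr_params=(0.8, 0.1), non_ltr_params=(0.3, 0.2)):
            # Gaussian class-conditional densities p(feature | class); the (mean, sd)
            # pairs are placeholders, not values estimated in the paper.
            mu_l, sd_l = ltr_params
            mu_n, sd_n = non_ltr_params
            p_ltr = prior_ltr * norm.pdf(periodicity, mu_l, sd_l)
            p_non = (1.0 - prior_ltr) * norm.pdf(periodicity, mu_n, sd_n)
            # Choosing the class with the larger posterior minimizes the probability of error.
            return 'LTR' if p_ltr >= p_non else 'non-LTR'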

  11. A Model of Generating Visual Place Cells Based on Environment Perception and Similar Measure.

    PubMed

    Zhou, Yang; Wu, Dewei

    2016-01-01

    Generating visual place cells (VPCs) is an important task in the field of bioinspired navigation. By analyzing the firing characteristics of biological place cells and the existing methods for generating VPCs, a model of generating visual place cells based on environment perception and a similarity measure is abstracted in this paper. The VPC generation process is divided into three phases: environment perception, similarity measurement, and recruitment of a new place cell. Following this process, a specific method for generating VPCs is presented. External reference landmarks are obtained from local invariant image features, and a similarity measure function is designed based on Euclidean distance and a Gaussian function. Simulations validate that the proposed method is effective. The firing characteristics of the generated VPCs are similar to those of biological place cells, and the VPCs' firing fields can be adjusted flexibly by changing the adjustment factor of the firing field (AFFF) and the firing rate threshold (FRT). PMID:27597859
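
    A minimal sketch of the described similarity measure (Euclidean distance passed through a Gaussian) and the recruit-if-unfamiliar step is given below. The parameter names `aff` and `frt` follow the abstract's abbreviations (AFFF, FRT), but the landmark feature vectors, the list-based storage, and the default values are assumptions for illustration.

        import numpy as np

        def gaussian_similarity(x, y, aff=1.0):
            # Euclidean distance mapped through a Gaussian; `aff` plays the role of the
            # adjustment factor of the firing field (AFFF) in the abstract.
            d = np.linalg.norm(np.asarray(x, dtype=float) - np.asarray(y, dtype=float))
            return np.exp(-(d ** 2) / (2.0 * aff ** 2))

        def update_place_cells(place_cells, observation, aff=1.0, frt=0.6):
            # place_cells: list of stored landmark feature vectors (assumed representation).
            rates = [gaussian_similarity(observation, c, aff) for c in place_cells]
            # Recruit a new place cell when no existing cell fires above the threshold (FRT).
            if not rates or max(rates) < frt:
                place_cells.append(np.asarray(observation, dtype=float))
            return rates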

  12. A Model of Generating Visual Place Cells Based on Environment Perception and Similar Measure

    PubMed Central

    2016-01-01

    Generating visual place cells (VPCs) is an important task in the field of bioinspired navigation. By analyzing the firing characteristics of biological place cells and the existing methods for generating VPCs, a model of generating visual place cells based on environment perception and a similarity measure is abstracted in this paper. The VPC generation process is divided into three phases: environment perception, similarity measurement, and recruitment of a new place cell. Following this process, a specific method for generating VPCs is presented. External reference landmarks are obtained from local invariant image features, and a similarity measure function is designed based on Euclidean distance and a Gaussian function. Simulations validate that the proposed method is effective. The firing characteristics of the generated VPCs are similar to those of biological place cells, and the VPCs' firing fields can be adjusted flexibly by changing the adjustment factor of the firing field (AFFF) and the firing rate threshold (FRT). PMID:27597859

  13. Factors affecting the perception of Korean-accented American English

    NASA Astrophysics Data System (ADS)

    Cho, Kwansun; Harris, John G.; Shrivastav, Rahul

    2005-09-01

    This experiment examines the relative contribution of two factors, intonation and articulation errors, to the perception of foreign accent in Korean-accented American English. Ten native speakers of Korean and ten native speakers of American English were asked to read ten English sentences. These sentences were then modified using high-quality speech resynthesis techniques [STRAIGHT; Kawahara et al., Speech Commun. 27, 187-207 (1999)] to generate four sets of stimuli. In the first two sets of stimuli, the intonation patterns of the Korean speakers and American speakers were switched with one another. The articulatory errors for each speaker were not modified. In the final two sets, the sentences from the Korean and American speakers were resynthesized without any modifications. Fifteen listeners were asked to rate all the stimuli for the degree of foreign accent. Preliminary results show that, for native speakers of American English, articulation errors may play a greater role in the perception of foreign accent than errors in intonation patterns. [Work supported by KAIM.]

  14. Specific previous experience affects perception of harmony and meter.

    PubMed

    Creel, Sarah C

    2011-10-01

    Prior knowledge shapes our experiences, but which prior knowledge shapes which experiences? This question is addressed in the domain of music perception. Three experiments were used to determine whether listeners activate specific musical memories during music listening. Each experiment provided listeners with one of two musical contexts that was presented simultaneously with a melody. After a listener was familiarized with melodies embedded in contexts, the listener heard melodies in isolation and judged the fit of a final harmonic or metrical probe event. The probe event matched either the familiar (but absent) context or an unfamiliar context. For both harmonic (Experiments 1 and 3) and metrical (Experiment 2) information, exposure to context shifted listeners' preferences toward a probe matching the context that they had been familiarized with. This suggests that listeners rapidly form specific musical memories without explicit instruction, which are then activated during music listening. These data pose an interesting challenge for models of music perception which implicitly assume that the listener's knowledge base is predominantly schematic or abstract. PMID:21553992

  15. Toward an evolutionary perspective on conceptual representation: species-specific calls activate visual and affective processing systems in the macaque.

    PubMed

    Gil-da-Costa, Ricardo; Braun, Allen; Lopes, Marco; Hauser, Marc D; Carson, Richard E; Herscovitch, Peter; Martin, Alex

    2004-12-14

    Non-human primates produce a diverse repertoire of species-specific calls and have rich conceptual systems. Some of their calls are designed to convey information about concepts such as predators, food, and social relationships, as well as the affective state of the caller. Little is known about the neural architecture of these calls, and much of what we do know is based on single-cell physiology from anesthetized subjects. By using positron emission tomography in awake rhesus macaques, we found that conspecific vocalizations elicited activity in higher-order visual areas, including regions in the temporal lobe associated with the visual perception of object form (TE/TEO) and motion (superior temporal sulcus) and storing visual object information into long-term memory (TE), as well as in limbic (the amygdala and hippocampus) and paralimbic regions (ventromedial prefrontal cortex) associated with the interpretation and memory-encoding of highly salient and affective material. This neural circuitry strongly corresponds to the network shown to support representation of conspecifics and affective information in humans. These findings shed light on the evolutionary precursors of conceptual representation in humans, suggesting that monkeys and humans have a common neural substrate for representing object concepts. PMID:15583132

  16. Contextual Effects of Scene on the Visual Perception of Object Orientation in Depth

    PubMed Central

    Niimi, Ryosuke; Watanabe, Katsumi

    2013-01-01

    We investigated the effect of background scene on the human visual perception of depth orientation (i.e., azimuth angle) of three-dimensional common objects. Participants evaluated the depth orientation of objects. The objects were surrounded by scenes with an apparent axis of the global reference frame, such as a sidewalk scene. When a scene axis was slightly misaligned with the gaze line, object orientation perception was biased, as if the gaze line had been assimilated into the scene axis (Experiment 1). When the scene axis was slightly misaligned with the object, evaluated object orientation was biased, as if it had been assimilated into the scene axis (Experiment 2). This assimilation may be due to confusion between the orientation of the scene and object axes (Experiment 3). Thus, the global reference frame may influence object orientation perception when its orientation is similar to that of the gaze-line or object. PMID:24391947

  17. [An instrument for projecting visual perception by monofocal and bifocal intraocular lenses].

    PubMed

    Reiner, J

    1992-01-01

    With the aid of a simple optical system similar to an astronomical telescope, images of spectacle lenses, contact lenses and intraocular lenses can be projected in front of, onto or into the eye. These optical images function like real lenses and enable their performance to be evaluated. In the case of intraocular lenses, visual perception through monofocal or bifocal lenses and teledioptric systems can be reproduced. PMID:1583844

  18. Affect differentially modulates brain activation in uni- and multisensory body-voice perception.

    PubMed

    Jessen, Sarah; Kotz, Sonja A

    2015-01-01

    Emotion perception naturally entails multisensory integration. It is also assumed that multisensory emotion perception is characterized by enhanced activation of brain areas implied in multisensory integration, such as the superior temporal gyrus and sulcus (STG/STS). However, most previous studies have employed designs and stimuli that preclude other forms of multisensory interaction, such as crossmodal prediction, leaving open the question whether classical integration is the only relevant process in multisensory emotion perception. Here, we used video clips containing emotional and neutral body and vocal expressions to investigate the role of crossmodal prediction in multisensory emotion perception. While emotional multisensory expressions increased activation in the bilateral fusiform gyrus (FFG), neutral expressions compared to emotional ones enhanced activation in the bilateral middle temporal gyrus (MTG) and posterior STS. Hence, while neutral stimuli activate classical multisensory areas, emotional stimuli invoke areas linked to unisensory visual processing. Emotional stimuli may therefore trigger a prediction of upcoming auditory information based on prior visual information. Such prediction may be stronger for highly salient emotional compared to less salient neutral information. Therefore, we suggest that multisensory emotion perception involves at least two distinct mechanisms; classical multisensory integration, as shown for neutral expressions, and crossmodal prediction, as evident for emotional expressions. PMID:25445782

  19. Within- and Cross-Modal Distance Information Disambiguate Visual Size-Change Perception

    PubMed Central

    Battaglia, Peter W.; Di Luca, Massimiliano; Ernst, Marc O.; Schrater, Paul R.; Machulla, Tonja; Kersten, Daniel

    2010-01-01

    Perception is fundamentally underconstrained because different combinations of object properties can generate the same sensory information. To disambiguate sensory information into estimates of scene properties, our brains incorporate prior knowledge and additional “auxiliary” (i.e., not directly relevant to the desired scene property) sensory information to constrain perceptual interpretations. For example, knowing the distance to an object helps in perceiving its size. The literature contains few demonstrations of the use of prior knowledge and auxiliary information in combined visual and haptic disambiguation, and almost no examination of haptic disambiguation of vision beyond “bistable” stimuli. Previous studies have reported that humans integrate multiple unambiguous sensations to perceive single, continuous object properties, like size or position. Here we test whether humans use visual and haptic information, individually and jointly, to disambiguate size from distance. We presented participants with a ball moving in depth with a changing diameter. Because no unambiguous distance information is available under monocular viewing, participants rely on prior assumptions about the ball's distance to disambiguate their size percept. Presenting auxiliary binocular and/or haptic distance information augments participants' prior distance assumptions and improves their size judgment accuracy—though binocular cues were trusted more than haptic cues. Our results suggest both visual and haptic distance information disambiguate size perception, and we interpret these results in the context of probabilistic perceptual reasoning. PMID:20221263
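
    A standard way to formalize this kind of probabilistic combination, stated here as a generic reliability-weighted model rather than the authors' exact formulation, is to weight each distance cue by its inverse variance, so that the more reliable binocular cue dominates the combined estimate:

        \hat{d} = w_b\,\hat{d}_b + w_h\,\hat{d}_h,
        \qquad w_b = \frac{1/\sigma_b^{2}}{1/\sigma_b^{2} + 1/\sigma_h^{2}},
        \qquad w_h = 1 - w_b

    where \hat{d}_b and \hat{d}_h are the binocular and haptic distance estimates and \sigma_b^{2} and \sigma_h^{2} are their variances (symbols introduced here for illustration).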

  20. Vestibular signals in macaque extrastriate visual cortex are functionally appropriate for heading perception

    PubMed Central

    Liu, Sheng; Angelaki, Dora E.

    2009-01-01

    Visual and vestibular signals converge onto the dorsal medial superior temporal area (MSTd) of the macaque extrastriate visual cortex, which is thought to be involved in multisensory heading perception for spatial navigation. Peripheral otolith information, however, is ambiguous and cannot distinguish linear accelerations experienced during self-motion from those due to changes in spatial orientation relative to gravity. Here we show that, unlike peripheral vestibular sensors but similar to lobules 9 and 10 of the cerebellar vermis (nodulus and uvula), MSTd neurons respond selectively to heading and not to changes in orientation relative to gravity. In support of a role in heading perception, MSTd vestibular responses are also dominated by velocity-like temporal dynamics, which might optimize sensory integration with visual motion information. Unlike the cerebellar vermis, however, MSTd neurons also carry a spatial orientation-independent rotation signal from the semicircular canals, which could be useful in compensating for the effects of head rotation on the processing of optic flow. These findings show that vestibular signals in MSTd are appropriately processed to support a functional role in multisensory heading perception. PMID:19605631

  1. Perception and psychological evaluation for visual and auditory environment based on the correlation mechanisms

    NASA Astrophysics Data System (ADS)

    Fujii, Kenji

    2002-06-01

    This dissertation introduces the correlation mechanism as a model of processes in visual perception. It has been well established that the correlation mechanism is effective for describing subjective attributes in auditory perception. The main result is that the correlation mechanism can be applied to processes in temporal and spatial vision as well as in audition. (1) A psychophysical experiment was performed on subjective flicker rates for complex waveforms. A remarkable result is that the phenomenon of the missing fundamental is found in temporal vision, analogous to auditory pitch perception; this implies the existence of a correlation mechanism in the visual system. (2) For spatial vision, autocorrelation analysis provides useful measures for describing three primary perceptual properties of visual texture: contrast, coarseness, and regularity. Another experiment showed that the degree of regularity is a salient cue for texture preference judgments. (3) In addition, the autocorrelation function (ACF) and inter-aural cross-correlation function (IACF) were applied to the analysis of the temporal and spatial properties of environmental noise. It was confirmed that the acoustical properties of aircraft noise and traffic noise are well described. These analyses provided useful parameters, extracted from the ACF and IACF, for assessing subjective annoyance with noise. Thesis advisor: Yoichi Ando. Copies of this thesis, written in English, can be obtained from Junko Atagi, 6813 Mosonou, Saijo-cho, Higashi-Hiroshima 739-0024, Japan. E-mail address: atagi@urban.ne.jp.
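
    To make the role of the autocorrelation function concrete, a minimal sketch is given below. It is an illustration only: the normalization and the use of the height of the first off-zero peak as a regularity (or pitch-strength) cue are common conventions assumed here, not the dissertation's exact analysis.

        import numpy as np

        def normalized_acf(x):
            # Normalized autocorrelation of a 1-D signal (luminance profile, sound pressure, ...).
            x = np.asarray(x, dtype=float)
            x = x - x.mean()
            acf = np.correlate(x, x, mode='full')[x.size - 1:]
            return acf / acf[0]

        def first_peak_height(acf):
            # Height of the first local maximum after lag 0: a crude regularity measure.
            for lag in range(1, acf.size - 1):
                if acf[lag - 1] <= acf[lag] >= acf[lag + 1]:
                    return acf[lag]
            return 0.0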

  2. Color names, color categories, and color-cued visual search: Sometimes, color perception is not categorical

    PubMed Central

    Brown, Angela M; Lindsey, Delwin T; Guckes, Kevin M

    2011-01-01

    The relation between colors and their names is a classic case-study for investigating the Sapir-Whorf hypothesis that categorical perception is imposed on perception by language. Here, we investigate the Sapir-Whorf prediction that visual search for a green target presented among blue distractors (or vice versa) should be faster than search for a green target presented among distractors of a different color of green (or for a blue target among different blue distractors). Gilbert, Regier, Kay & Ivry (2006) reported that this Sapir-Whorf effect is restricted to the right visual field (RVF), because the major brain language centers are in the left cerebral hemisphere. We found no categorical effect at the Green|Blue color boundary, and no categorical effect restricted to the RVF. Scaling of perceived color differences by Maximum Likelihood Difference Scaling (MLDS) also showed no categorical effect, including no effect specific to the RVF. Two models fit the data: a color difference model based on MLDS and a standard opponent-colors model of color discrimination based on the spectral sensitivities of the cones. Neither of these models, nor any of our data, suggested categorical perception of colors at the Green|Blue boundary, in either visual field. PMID:21980188

  3. Visual perception studies to improve the perceived sharpness of television images

    NASA Astrophysics Data System (ADS)

    Glenn, William E.

    2002-06-01

    In this paper several properties of visual perception are used to describe the perceived sharpness of present HDTV transmission and display formats. A method is described that uses these properties to improve perceived sharpness without increasing the transmission bit rate. Because of the oblique effect in vision and the statistical orientation of lines in scenes, diagonal sampling reduces the required number of pixels in an image. Quantitatively, our measurements show that the number of pixels is reduced by a factor of 1.4 for the same perceived sharpness. Interlaced scanning reduces vertical resolution for several reasons involving spatial and temporal masking effects in visual perception. Progressive scan avoids these limitations. In addition, by taking advantage of the octave-wide tuning bands in visual perception, our measurements show that the perceived resolution in the vertical direction for a progressive scan can be double that of an interlaced scan. By using diagonal sampling, a 1920 × 1080 image with progressive scan at 60 frames per second requires the same transmission bit rate as a 1920 × 1080 cardinally sampled image scanned interlaced at 30 frames per second. This results in an image that appears to be much sharper than the 1080-line interlaced format, without the interlace artifacts.

  4. The Perception of Naturalness Correlates with Low-Level Visual Features of Environmental Scenes

    PubMed Central

    Berman, Marc G.; Hout, Michael C.; Kardan, Omid; Hunter, MaryCarol R.; Yourganov, Grigori; Henderson, John M.; Hanayik, Taylor; Karimi, Hossein; Jonides, John

    2014-01-01

    Previous research has shown that interacting with natural environments vs. more urban or built environments can have salubrious psychological effects, such as improvements in attention and memory. Even viewing pictures of nature vs. pictures of built environments can produce similar effects. A major question is: what is it about natural environments that produces these benefits? Problematically, natural and urban environments differ in many qualities, making it difficult to narrow down the dimensions of nature that may lead to these benefits. In this study, we set out to uncover visual features that relate to individuals' perceptions of naturalness in images. We quantified naturalness in two ways: first, implicitly, using a multidimensional scaling analysis and, second, explicitly, with direct naturalness ratings. The features most strongly related to perceptions of naturalness were the density of contrast changes in the scene, the density of straight lines, the average color saturation, and the average hue diversity. We then trained a machine-learning algorithm to predict whether a scene was perceived as natural or not based on these low-level visual features, and we could do so with 81% accuracy. We were thus able to reliably predict subjective perceptions of naturalness from objective low-level visual features. Our results can be used in future studies to determine whether these features, which are related to naturalness, may also lead to the benefits attained from interacting with nature. PMID:25531411
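
    A hedged sketch of such a pipeline is given below: the particular feature definitions (Canny edge density, probabilistic Hough line density, mean saturation, hue-histogram entropy) and the choice of logistic regression are assumptions made for illustration and may well differ from the exact features and model used in the paper.

        import numpy as np
        from skimage import color, feature, transform
        from sklearn.linear_model import LogisticRegression

        def scene_features(rgb):
            # Four low-level features loosely matching those named in the abstract.
            hsv = color.rgb2hsv(rgb)
            gray = color.rgb2gray(rgb)
            edges = feature.canny(gray)
            lines = transform.probabilistic_hough_line(edges)
            hue_hist, _ = np.histogram(hsv[..., 0], bins=16, range=(0, 1))
            p = hue_hist / max(hue_hist.sum(), 1) + 1e-12
            return [edges.mean(),                      # density of contrast changes
                    len(lines) / edges.size,           # density of straight lines
                    hsv[..., 1].mean(),                # average color saturation
                    float(-(p * np.log(p)).sum())]     # hue diversity (entropy)

        def fit_naturalness_model(images, labels):
            # labels: 1 = perceived as natural, 0 = not (assumed encoding).
            X = np.array([scene_features(im) for im in images])
            return LogisticRegression().fit(X, labels)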

  5. Whose reality counts? Factors affecting the perception of volcanic risk

    NASA Astrophysics Data System (ADS)

    Haynes, Katharine; Barclay, Jenni; Pidgeon, Nick

    2008-05-01

    Understanding how people perceive risk has become increasingly important for improving risk communication and reducing risk associated conflicts. This paper builds upon findings, methodologies and lessons learned from other fields to help understand differences between scientists, authorities and the public. Qualitative and quantitative methods were used to analyse underlying attitudes and judgements during an ongoing volcanic crisis on the Caribbean Island of Montserrat. Specific differences between the public, authorities and scientists were found to have been responsible for misunderstandings and misinterpretations of information and roles, resulting in differing perceptions of acceptable risk. Difficulties in the articulation and understanding of uncertainties pertaining to the volcanic risk led to a situation in which the roles of hazard monitoring, risk communication and public protection became confused. In addition, social, economic and political forces were found to have distorted risk messages, leading to a public reliance upon informal information networks. The implications of these findings for volcanic risk management and communication are discussed.

  6. Atypical perception of affective prosody in Autism Spectrum Disorder

    PubMed Central

    Gebauer, Line; Skewes, Joshua; Hørlyck, Lone; Vuust, Peter

    2014-01-01

    Autism Spectrum Disorder (ASD) is characterized by impairments in language and social–emotional cognition. Yet, findings of emotion recognition from affective prosody in individuals with ASD are inconsistent. This study investigated emotion recognition and neural processing of affective prosody in high-functioning adults with ASD relative to neurotypical (NT) adults. Individuals with ASD showed mostly typical brain activation of the fronto-temporal and subcortical brain regions in response to affective prosody. Yet, the ASD group showed a trend towards increased activation of the right caudate during processing of affective prosody and rated the emotional intensity lower than NT individuals. This is likely associated with increased attentional task demands in this group, which might contribute to social–emotional impairments. PMID:25379450

  7. Atypical perception of affective prosody in Autism Spectrum Disorder.

    PubMed

    Gebauer, Line; Skewes, Joshua; Hørlyck, Lone; Vuust, Peter

    2014-01-01

    Autism Spectrum Disorder (ASD) is characterized by impairments in language and social-emotional cognition. Yet, findings of emotion recognition from affective prosody in individuals with ASD are inconsistent. This study investigated emotion recognition and neural processing of affective prosody in high-functioning adults with ASD relative to neurotypical (NT) adults. Individuals with ASD showed mostly typical brain activation of the fronto-temporal and subcortical brain regions in response to affective prosody. Yet, the ASD group showed a trend towards increased activation of the right caudate during processing of affective prosody and rated the emotional intensity lower than NT individuals. This is likely associated with increased attentional task demands in this group, which might contribute to social-emotional impairments. PMID:25379450

  8. The motion/pursuit law for visual depth perception from motion parallax.

    PubMed

    Nawrot, Mark; Stroyan, Keith

    2009-07-01

    One of vision's most important functions is specification of the layout of objects in the 3D world. While the static optical geometry of retinal disparity explains the perception of depth from binocular stereopsis, we propose a new formula to link the pertinent dynamic geometry to the computation of depth from motion parallax. Mathematically, the ratio of retinal image motion (motion) and smooth pursuit of the eye (pursuit) provides the necessary information for the computation of relative depth from motion parallax. We show that this could have been obtained with the approaches of Nakayama and Loomis [Nakayama, K., & Loomis, J. M. (1974). Optical velocity patterns, velocity-sensitive neurons, and space perception: A hypothesis. Perception, 3, 63-80] or Longuet-Higgins and Prazdny [Longuet-Higgins, H. C., & Prazdny, K. (1980). The interpretation of a moving retinal image. Proceedings of the Royal Society of London Series B, 208, 385-397] by adding pursuit to their treatments. Results of a psychophysical experiment show that changes in the motion/pursuit ratio have a much better relationship to changes in the perception of depth from motion parallax than do changes in motion or pursuit alone. The theoretical framework provided by the motion/pursuit law provides the quantitative foundation necessary to study this fundamental visual depth perception ability. PMID:19463848
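
    In its commonly quoted small-angle form, given here as a summary approximation rather than the authors' full derivation, the motion/pursuit law relates the depth d of a point relative to the fixation point, at fixation distance f, to the ratio of retinal image velocity to pursuit velocity:

        \frac{d}{f} \;\approx\; \frac{d\theta/dt}{d\alpha/dt}

    where d\theta/dt is the retinal image motion of the point and d\alpha/dt is the smooth pursuit rate of the eye (symbols chosen here for exposition).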

  9. Visual Contextual Effects of Orientation, Contrast, Flicker, and Luminance: All Are Affected by Normal Aging

    PubMed Central

    Nguyen, Bao N.; McKendrick, Allison M.

    2016-01-01

    The perception of a visual stimulus can be markedly altered by spatial interactions between the stimulus and its surround. For example, a grating stimulus appears lower in contrast when surrounded by a similar pattern of higher contrast: a phenomenon known as surround suppression of perceived contrast. Such center–surround interactions in visual perception are numerous and arise from both cortical and pre-cortical neural circuitry. For example, perceptual surround suppression of luminance and flicker are predominantly mediated pre-cortically, whereas contrast and orientation suppression have strong cortical contributions. Here, we compare the perception of older and younger observers on a battery of tasks designed to assess such visual contextual effects. For all visual dimensions tested (luminance, flicker, contrast, and orientation), on average the older adults showed greater suppression of central targets than the younger adult group. The increase in suppression was consistent in magnitude across all tasks, suggesting that normal aging produces a generalized, non-specific alteration to contextual processing in vision. PMID:27148047

  10. How to make a good animation: A grounded cognition model of how visual representation design affects the construction of abstract physics knowledge

    NASA Astrophysics Data System (ADS)

    Chen, Zhongzhou; Gladding, Gary

    2014-06-01

    Visual representations play a critical role in teaching physics. However, since we do not have a satisfactory understanding of how visual perception impacts the construction of abstract knowledge, most visual representations used in instruction are either created based on existing conventions or designed according to the instructor's intuition, which leads to significant variability in their effectiveness. In this paper we propose a cognitive mechanism based on grounded cognition, suggesting that visual perception affects understanding by activating "perceptual symbols": the basic cognitive units used by the brain to construct a concept. A good visual representation activates perceptual symbols that are essential for the construction of the represented concept, whereas a bad representation does the opposite. As a proof of concept, we conducted a clinical experiment in which participants received three different versions of a multimedia tutorial teaching the integral expression of electric potential. The three versions differed only in the details of the visual representation design, and only one of them contained perceptual features that activate perceptual symbols essential for constructing the idea of "accumulation." On a subsequent post-test, participants receiving this version of the tutorial significantly outperformed those who received the other two versions, which were designed to mimic conventional visual representations used in classrooms.

  11. Visual recovery following open globe injury with initial no light perception

    PubMed Central

    Han, Yong S; Kavoussi, Shaheen C; Adelman, Ron A

    2015-01-01

    Background The purpose of this study was to analyze eyes presenting with no light perception (NLP) after open globe injury (OGI) to determine visual outcomes and prognostic indicators for visual recovery. Methods The records of consecutive patients with at least 6 months of follow-up presenting with OGI and NLP to a single institution between January 1, 2003 and December 31, 2013 were reviewed for demographics, ophthalmic history, context and characteristics of injury, ocular examination findings, surgical interventions, and follow-up visual acuity. Unpaired t-tests and Fisher’s Exact tests were used for statistical analysis. Results Twenty-five patients met our inclusion criteria. The mean age was 50.4±25.5 (range 8–91) years. Four patients (16%) regained vision (hand motion in three patients and light perception in one patient) while 21 patients (84%) remained with NLP or had a prosthesis at final follow-up. Fourteen eyes (56%) were enucleated; nine (36%) were secondary enucleations. Although the sample sizes were small, neither ocular trauma score nor wound size was found to predict visual recovery. Conclusion Four patients regained some vision after presenting with NLP due to OGI. These findings suggest that, in select cases, physicians should discuss the possibility of regaining some vision. PMID:26316683

  12. Visual perception in prediagnostic and early stage Huntington’s disease

    PubMed Central

    O’DONNELL, BRIAN F.; BLEKHER, TANYA M.; WEAVER, MARJORIE; WHITE, KERRY M.; MARSHALL, JEANINE; BERISTAIN, XABIER; STOUT, JULIE C.; GRAY, JACQUELINE; WOJCIESZEK, JOANNE M.; FOROUD, TATIANA M.

    2009-01-01

    Disturbances of visual perception frequently accompany neurodegenerative disorders but have been little studied in Huntington’s disease (HD) gene carriers. We used psychophysical tests to assess visual perception among individuals in the prediagnostic and early stages of HD. The sample comprised four groups, which included 201 nongene carriers (NG), 32 prediagnostic gene carriers with minimal neurological abnormalities (PD1); 20 prediagnostic gene carriers with moderate neurological abnormalities (PD2), and 36 gene carriers with diagnosed HD. Contrast sensitivity for stationary and moving sinusoidal gratings, and tests of form and motion discrimination, were used to probe different visual pathways. Patients with HD showed impaired contrast sensitivity for moving gratings. For one of the three contrast sensitivity tests, the prediagnostic gene carriers with greater neurological abnormality (PD2) also had impaired performance as compared with NG. These findings suggest that early stage HD disrupts visual functions associated with the magnocellular pathway. However, these changes are only observed in individuals diagnosed with HD or who are in the more symptomatic stages of prediagnostic HD. PMID:18419843

  13. Shadows of artistry: cortical synchrony during perception and imagery of visual art.

    PubMed

    Bhattacharya, Joydeep; Petsche, Hellmuth

    2002-04-01

    Functional and topographical differences between two groups, artists and non-artists, during the performances of visual perception and imagery of paintings were presented by means of EEG phase synchrony analysis. In artists as compared with non-artists, significantly higher phase synchrony was found in the high frequency beta and gamma bands during the perception of the paintings; in the low frequency bands (primarily delta), phase synchrony was mostly enhanced during imagery. Strong decreases in phase synchrony of alpha were found primarily in artists for both tasks. The right hemisphere was found to present higher synchrony than the left in artists, whereas hemispheric asymmetry was less significant in non-artists. In the artists, enhanced synchrony in the high frequency band is most likely due to their enhanced binding capabilities of numerous visual attributes, and enhanced synchrony in the low frequency band seems to be due to the higher involvement of long-term visual memory mostly in imagery. Thus, the analysis of phase synchrony from EEG signals yields new information about the dynamical co-operation between neuronal assemblies during the cognition of visual art. PMID:11958960

  14. Are theories of perception necessary? A review of Gibson's The Ecological Approach to Visual Perception.

    PubMed Central

    Costall, A P

    1984-01-01

    Representational theories of perception postulate an isolated and autonomous "subject" set apart from its real environment, and then go on to invoke processes of mental representation, construction, or hypothesizing to explain how perception can nevertheless take place. Although James Gibson's most conspicuous contribution has been to challenge representational theory, his ultimate concern was the cognitivism which now prevails in psychology. He was convinced that the so-called cognitive revolution merely perpetuates, and even promotes, many of psychology's oldest mistakes. This review article considers Gibson's final statement of his "ecological" alternative to cognitivism (Gibson, 1979). It is intended not as a complete account of Gibson's alternative, however, but primarily as an appreciation of his critical contribution. Gibson's sustained attempt to counter representational theory served not only to reveal the variety of arguments used in support of this theory, but also to expose the questionable metaphysical assumptions upon which they rest. In concentrating upon Gibson's criticisms of representational theory, therefore, this paper aims to emphasize the point of his alternative scheme and to explain some of the important concerns shared by Gibson's ecological approach and operant psychology. PMID:6699538

  15. Learning one-to-many mapping functions for audio-visual integrated perception

    NASA Astrophysics Data System (ADS)

    Lim, Jung-Hui; Oh, Do-Kwan; Lee, Soo-Young

    2010-04-01

    In noisy environments, human speech perception utilizes visual lip-reading as well as auditory phonetic classification. This audio-visual integration may be done by combining the two sensory features at an early stage; top-down attention may also integrate the two modalities. For the sensory feature fusion, we introduce mapping functions between the audio and visual manifolds. In particular, we present an algorithm that provides a one-to-many mapping function for the video-to-audio mapping. The top-down attention is also presented to integrate both the sensory features and the classification results of both modalities, which is able to explain the McGurk effect. Each classifier is implemented separately as a Hidden Markov Model (HMM), but the two classifiers are combined at the top level and interact through the top-down attention.
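
    As a generic illustration of combining two modality-specific classifiers at the top level (not the paper's attention mechanism; the dictionary-of-log-likelihoods interface and the scalar `attention` gain are assumptions), one simple scheme is a weighted sum of per-class log-likelihoods:

        def fuse_loglikelihoods(audio_loglik, visual_loglik, attention=0.5):
            # audio_loglik / visual_loglik: dicts mapping class label -> log-likelihood
            # from the modality-specific HMM. `attention` (0..1) is a hypothetical gain
            # that shifts weight toward the visual stream, e.g. in noisy acoustic conditions.
            classes = set(audio_loglik) & set(visual_loglik)
            scores = {c: (1.0 - attention) * audio_loglik[c] + attention * visual_loglik[c]
                      for c in classes}
            return max(scores, key=scores.get)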

  16. The Role of Affective and Cognitive Individual Differences in Social Perception.

    PubMed

    Aquino, Antonio; Haddock, Geoffrey; Maio, Gregory R; Wolf, Lukas J; Alparone, Francesca R

    2016-06-01

    Three studies explored the connection between social perception processes and individual differences in the use of affective and cognitive information in relation to attitudes. Study 1 revealed that individuals high in need for affect (NFA) accentuated differences in evaluations of warm and cold traits, whereas individuals high in need for cognition (NFC) accentuated differences in evaluations of competent and incompetent traits. Study 2 revealed that individual differences in NFA predicted liking of warm or cold targets, whereas individual differences in NFC predicted perceptions of competent or incompetent targets. Furthermore, the effects of NFA and NFC were independent of structural bases and meta-bases of attitudes. Study 3 revealed that differences in the evaluation of warm and cold traits mediated the effects of NFA and NFC on liking of targets. The implications for social perception processes and for individual differences in affect-cognition are discussed. PMID:27460272

  17. Simulated Environments with Animated Agents: Effects on Visual Attention, Emotion, Performance, and Perception

    ERIC Educational Resources Information Center

    Romero-Hall, E.; Watson, G. S.; Adcock, A.; Bliss, J.; Adams Tufts, K.

    2016-01-01

    This research assessed how emotive animated agents in a simulation-based training affect the performance outcomes and perceptions of the individuals interacting in real time with the training application. A total of 56 participants consented to complete the study. The material for this investigation included a nursing simulation in which…

  18. A method for real-time visual stimulus selection in the study of cortical object perception.

    PubMed

    Leeds, Daniel D; Tarr, Michael J

    2016-06-01

    The properties utilized by visual object perception in the mid- and high-level ventral visual pathway are poorly understood. To better establish and explore possible models of these properties, we adopt a data-driven approach in which we repeatedly interrogate neural units using functional magnetic resonance imaging (fMRI) to establish each unit's image selectivity. This approach to imaging necessitates a search through a broad space of stimulus properties using a limited number of samples. To more quickly identify the complex visual features underlying human cortical object perception, we implemented a new fMRI protocol in which visual stimuli are selected in real time based on BOLD responses to recently shown images. Two variations of this protocol were developed, one relying on natural object stimuli and a second based on synthetic object stimuli, both embedded in feature spaces based on the complex visual properties of the objects. During fMRI scanning, we continuously controlled stimulus selection in the context of a real-time search through these image spaces in order to maximize neural responses across pre-determined 1 cm³ brain regions. Elsewhere we have reported the patterns of cortical selectivity revealed by this approach (Leeds et al., 2014). In contrast, here our objective is to present more detailed methods and explore the technical and biological factors influencing the behavior of our real-time stimulus search. We observe that: (1) searches converged more reliably when exploring a more precisely parameterized space of synthetic objects; (2) real-time estimation of cortical responses to stimuli is reasonably consistent; and (3) search behavior was acceptably robust to delays in stimulus display and to subject motion effects. Overall, our results indicate that real-time fMRI methods may provide a valuable platform for continuing study of localized neural selectivity, both for visual object representation and beyond. PMID

  19. Hemispheric Asymmetries in Children's Perception of Nonlinguistic Human Affective Sounds

    ERIC Educational Resources Information Center

    Pollak, Seth D.; Holt, Lori L.; Fries, Alison B. Wismer

    2004-01-01

    In the present work, we developed a database of nonlinguistic sounds that mirror prosodic characteristics typical of language and thus carry affective information, but do not convey linguistic information. In a dichotic-listening task, we used these novel stimuli as a means of disambiguating the relative contributions of linguistic and affective…

  20. Factors Affecting the Effectiveness and Use of Moodle: Students' Perception

    ERIC Educational Resources Information Center

    Damnjanovic, Vesna; Jednak, Sandra; Mijatovic, Ivana

    2015-01-01

    The purpose of this research paper is to identify the factors affecting the effectiveness of Moodle from the students' perspective. The research hypotheses derived from the suggested extended Seddon model have been empirically validated using the responses to a survey on e-learning usage among 255 users. We tested the model across higher education…

  1. Neighborhood Perceptions Affect Dietary Behaviors and Diet Quality

    ERIC Educational Resources Information Center

    Keita, Akilah Dulin; Casazza, Krista; Thomas, Olivia; Fernandez, Jose R.

    2011-01-01

    Objective: The primary purpose of this study was to determine if perceived neighborhood disorder affected dietary quality within a multiethnic sample of children. Design: Children were recruited through the use of fliers, wide-distribution mailers, parent magazines, and school presentations from June 2005 to December 2008. Setting:…

  2. Environmental risk perception from visual cues: the psychophysics of tornado risk perception

    NASA Astrophysics Data System (ADS)

    Dewitt, Barry; Fischhoff, Baruch; Davis, Alexander; Broomell, Stephen B.

    2015-12-01

    Lay judgments of environmental risks are central to both immediate decisions (e.g., taking shelter from a storm) and long-term ones (e.g., building in locations subject to storm surges). Using methods from quantitative psychology, we provide a general approach to studying lay perceptions of environmental risks. As a first application of these methods, we investigate a setting where lay decisions have not taken full advantage of advances in natural science understanding: tornado forecasts in the US and Canada. Because official forecasts are imperfect, members of the public must often evaluate the risks on their own, by checking environmental cues (such as cloud formations) before deciding whether to take protective action. We study lay perceptions of cloud formations, demonstrating an approach that could be applied to other environmental judgments. We use signal detection theory to analyse how well people can distinguish tornadic from non-tornadic clouds, and multidimensional scaling to determine how people make these judgments. We find that participants (N = 400, recruited from Amazon Mechanical Turk) have heuristics that generally serve them well, helping them to separate tornadic from non-tornadic clouds, but that also lead them to misjudge the tornado risk of certain cloud types. The signal detection task revealed confusion regarding shelf clouds, mammatus clouds, and clouds with upper- and mid-level tornadic features, which the multidimensional scaling task suggested was the result of participants focusing on the darkness of the weather scene and the ease of discerning its features. We recommend procedures for training (e.g., for storm spotters) and communications (e.g., tornado warnings) that will reduce systematic misclassifications of tornadicity arising from observers’ reliance on otherwise useful heuristics.
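
    The signal-detection computation implied here can be sketched in a few lines: sensitivity (d') and criterion (c) from hit and false-alarm counts. The log-linear correction for rates of exactly 0 or 1 is a common convention, assumed here rather than taken from the paper.

        from scipy.stats import norm

        def sdt_indices(hits, misses, false_alarms, correct_rejections):
            # Log-linear correction keeps z-scores finite when a rate is exactly 0 or 1.
            hit_rate = (hits + 0.5) / (hits + misses + 1.0)
            fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
            z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
            d_prime = z_hit - z_fa               # sensitivity: tornadic vs. non-tornadic clouds
            criterion = -0.5 * (z_hit + z_fa)    # response bias toward judging "tornadic"
            return d_prime, criterion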

  3. Perception of auditory, visual, and egocentric spatial alignment adapts differently to changes in eye position.

    PubMed

    Cui, Qi N; Razavi, Babak; O'Neill, William E; Paige, Gary D

    2010-02-01

    Vision and audition represent the outside world in spatial synergy that is crucial for guiding natural activities. Input conveying eye-in-head position is needed to maintain spatial congruence because the eyes move in the head while the ears remain head-fixed. Recently, we reported that the human perception of auditory space shifts with changes in eye position. In this study, we examined whether this phenomenon is 1) dependent on a visual fixation reference, 2) selective for frequency bands (high-pass and low-pass noise) related to specific auditory spatial channels, 3) matched by a shift in the perceived straight-ahead (PSA), and 4) accompanied by a spatial shift for visual and/or bimodal (visual and auditory) targets. Subjects were tested in a dark echo-attenuated chamber with their heads fixed facing a cylindrical screen, behind which a mobile speaker/LED presented targets across the frontal field. Subjects fixated alternating reference spots (0, +/-20 degrees ) horizontally or vertically while either localizing targets or indicating PSA using a laser pointer. Results showed that the spatial shift induced by ocular eccentricity is 1) preserved for auditory targets without a visual fixation reference, 2) generalized for all frequency bands, and thus all auditory spatial channels, 3) paralleled by a shift in PSA, and 4) restricted to auditory space. Findings are consistent with a set-point control strategy by which eye position governs multimodal spatial alignment. The phenomenon is robust for auditory space and egocentric perception, and highlights the importance of controlling for eye position in the examination of spatial perception and behavior. PMID:19846626

  4. Tilt and Translation Motion Perception during Pitch Tilt with Visual Surround Translation

    NASA Technical Reports Server (NTRS)

    O'Sullivan, Brita M.; Harm, Deborah L.; Reschke, Millard F.; Wood, Scott J.

    2006-01-01

    The central nervous system must resolve the ambiguity of inertial motion sensory cues in order to derive an accurate representation of spatial orientation. Previous studies suggest that multisensory integration is critical for discriminating linear accelerations arising from tilt and translation head motion. Visual input is especially important at low frequencies, where canal input is declining. The NASA Tilt Translation Device (TTD) was designed to recreate postflight orientation disturbances by exposing subjects to matching tilt self-motion with conflicting visual surround translation. Previous studies have demonstrated that brief exposures to pitch tilt with fore-aft visual surround translation produced changes in compensatory vertical eye movement responses, postural equilibrium, and motion sickness symptoms. Adaptation appeared greatest with visual scene motion leading (versus lagging) the tilt motion, and the adaptation time constant appeared to be approximately 30 min. The purpose of this study was to compare motion perception when the visual surround translation was in-phase versus out-of-phase with pitch tilt. The in-phase stimulus presented the visual surround motion one would experience if the linear acceleration was due to fore-aft self-translation within a stationary surround, while the out-of-phase stimulus had the visual scene motion leading the tilt by 90 deg, as previously used. The tilt stimuli in these conditions were asymmetrical, ranging from an upright orientation to 10 deg pitch back. Another objective of the study was to compare motion perception with the in-phase stimulus when the tilts were asymmetrical relative to upright (0 to 10 deg back) versus symmetrical (10 deg forward to 10 deg back). Twelve subjects (6M, 6F, 22-55 yrs) were tested during 3 sessions separated by at least one week. During each of the three sessions (out-of-phase asymmetrical, in-phase asymmetrical, in-phase symmetrical), subjects were exposed to visual surround translation

  5. Categorical Perception of Colour in the Left and Right Visual Field Is Verbally Mediated: Evidence from Korean

    ERIC Educational Resources Information Center

    Roberson, Debi; Pak, Hyensou; Hanley, J. Richard

    2008-01-01

    In this study we demonstrate that Korean (but not English) speakers show Categorical perception (CP) on a visual search task for a boundary between two Korean colour categories that is not marked in English. These effects were observed regardless of whether target items were presented to the left or right visual field. Because this boundary is…

  6. Combining Strengths and Weaknesses in Visual Perception of Children with an Autism Spectrum Disorder: Perceptual Matching of Facial Expressions

    ERIC Educational Resources Information Center

    Evers, Kris; Noens, Ilse; Steyaert, Jean; Wagemans, Johan

    2011-01-01

    Background: Children with an autism spectrum disorder (ASD) are known to have an atypical visual perception, with deficits in automatic Gestalt formation and an enhanced processing of visual details. In addition, they are sometimes found to have difficulties in emotion processing. Methods: In three experiments, we investigated whether 7-to-11-year…

  7. How Facial Expressions of Emotion Affect Distance Perception.

    PubMed

    Kim, Nam-Gyoon; Son, Heejung

    2015-01-01

    Facial expressions of emotion are thought to convey expressers' behavioral intentions, thus priming observers' approach and avoidance tendencies appropriately. The present study examined whether detecting expressions of behavioral intent influences perceivers' estimation of the expresser's distance from them. Eighteen undergraduates (nine male and nine female) participated in the study. Six facial expressions were chosen on the basis of degree of threat: anger and hate (threatening expressions), shame and surprise (neutral expressions), and pleasure and joy (safe expressions). Each facial expression was presented on a tablet PC held by an assistant covered by a black drape who stood 1, 2, or 3 m away from participants. Participants performed a visual matching task to report the perceived distance. Results showed that facial expression influenced distance estimation, with faces exhibiting threatening or safe expressions judged closer than those showing neutral expressions. Females' judgments were more likely to be influenced, but these influences largely disappeared beyond the 2 m distance. These results suggest that facial expressions of emotion (particularly threatening or safe emotions) influence others' (especially females') distance estimations, but only within close proximity. PMID:26635708

  8. How Facial Expressions of Emotion Affect Distance Perception

    PubMed Central

    Kim, Nam-Gyoon; Son, Heejung

    2015-01-01

    Facial expressions of emotion are thought to convey expressers’ behavioral intentions, thus priming observers’ approach and avoidance tendencies appropriately. The present study examined whether detecting expressions of behavioral intent influences perceivers’ estimation of the expresser’s distance from them. Eighteen undergraduates (nine male and nine female) participated in the study. Six facial expressions were chosen on the basis of degree of threat—anger, hate (threatening expressions), shame, surprise (neutral expressions), pleasure, and joy (safe expressions). Each facial expression was presented on a tablet PC held by an assistant covered by a black drape who stood 1, 2, or 3 m away from participants. Participants performed a visual matching task to report the perceived distance. Results showed that facial expression influenced distance estimation, with faces exhibiting threatening or safe expressions judged closer than those showing neutral expressions. Females’ judgments were more likely to be influenced; but these influences largely disappeared beyond the 2 m distance. These results suggest that facial expressions of emotion (particularly threatening or safe emotions) influence others’ (especially females’) distance estimations but only within close proximity. PMID:26635708

  9. Event Boundaries in Perception Affect Memory Encoding and Updating

    PubMed Central

    Swallow, Khena M.; Zacks, Jeffrey M.; Abrams, Richard A.

    2010-01-01

    Memory for naturalistic events over short delays is important for visual scene processing, reading comprehension, and social interaction. The research presented here examined relations between how an ongoing activity is perceptually segmented into events and how those events are remembered a few seconds later. In several studies participants watched movie clips that presented objects in the context of goal-directed activities. Five seconds after an object was presented, the clip paused for a recognition test. Performance on the recognition test depended on the occurrence of perceptual event boundaries. Objects that were present when an event boundary occurred were better recognized than other objects, suggesting that event boundaries structure the contents of memory. This effect was strongest when an object’s type was tested, but was also observed for objects’ perceptual features. Memory also depended on whether an event boundary occurred between presentation and test; this variable produced complex interactive effects that suggested that the contents of memory are updated at event boundaries. These data indicate that perceptual event boundaries have immediate consequences for what, when, and how easily information can be remembered. PMID:19397382

  10. Visual electrophysiology in children with tumours affecting the visual pathway. Case reports.

    PubMed

    Brecelj, J; Stirn-Kranjc, B; Skrbec, M

    2000-09-01

    In 9 children (8-14 years of age) with orbital, suprasellar or postchiasmal tumours, visual loss was studied by visual electrophysiology in relation to ophthalmologic and neuroimaging findings. Pattern electroretinography (PERG) and pattern visual evoked potentials (PVEP) to full and half-field pattern-reversal stimulation were recorded, and PERG and PVEP changes were related to the tumour location. PERG wave P50 attenuation was associated with central retinal dysfunction in the child with orbital rhabdomyosarcoma; PVEP wave P100 delay was associated with optic nerve dysfunction in a child with retrobulbar chondrosarcoma and in a child with optic nerve glioma; PVEP wave P100 asymmetry was associated with crossed-fibre dysfunction in a child with hypothalamic germinoma, and PVEP wave P100 uncrossed asymmetry was associated with postchiasmal dysfunction in children with postchiasmal tumours (one with pilocytic astrocytoma and two with angioma). On the other hand, normal PERG suggested that there was no central retinal dysfunction in a child with pleomorphic adenoma of the lacrimal gland, and normal PVEP to full and half-field stimulation excluded visual pathway dysfunction at the chiasm in a child with a suprasellar arachnoidal cyst. Follow-up was useful in indicating whether visual dysfunction was progressive or not. We conclude that PERG and PVEP findings contributed to understanding whether the dysfunction originated at the retina, in the optic nerve, at the chiasm, or in the postchiasmal pathway. PMID:11200546

  11. MEG brain activities reflecting affection for visual food stimuli.

    PubMed

    Kuriki, Shinya; Miyamura, Takahiro; Uchikawa, Yoshinori

    2010-01-01

    This study aimed to explore the modulation of the alpha rhythm in response to food pictures with distinct affective values. We examined a method to discriminate a subject's state, i.e., whether he/she liked the pictured food or not, from MEG signals detected over the head. Pictures of familiar foods were used as affective stimuli, while the same pictures with a complementary color phase were used as non-affective stimuli. Alpha-band signals in a narrow frequency window around each subject's individual spectral peak were wavelet-analyzed, and the component phase-locked to stimulus onset was obtained as a complex number. The amplitude of the phase-locked component was averaged over 0-1 s after stimulus onset, across 30 epochs in a measurement session and across the 76 MEG sensor channels. In statistical tests of individual subjects, a significant difference was found in the real part of the averaged phase-locked amplitude between the normal-color and reverse-color pictures. These results suggest that affective information processing of food pictures is reflected in the synchronized component of the narrow-band alpha rhythm. PMID:21096510
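
    An illustrative sketch of this type of analysis is given below (not the authors' exact pipeline): each epoch is convolved with a complex Morlet wavelet at the subject's alpha peak, the complex coefficients are averaged across epochs so that only the phase-locked part survives, and the amplitude of that average is returned. The wavelet construction, the 7-cycle width, and the single-channel interface are assumptions.

        import numpy as np

        def morlet_wavelet(fs, f0, n_cycles=7):
            # Complex Morlet wavelet centred at f0 Hz, built explicitly with numpy.
            sigma_t = n_cycles / (2 * np.pi * f0)          # temporal s.d. in seconds
            t = np.arange(-5 * sigma_t, 5 * sigma_t, 1.0 / fs)
            return np.exp(2j * np.pi * f0 * t) * np.exp(-t**2 / (2 * sigma_t**2))

        def phase_locked_alpha(epochs, fs, f0=10.0, n_cycles=7):
            # epochs: array (n_epochs, n_samples) from one MEG channel, time-locked to stimulus onset.
            w = morlet_wavelet(fs, f0, n_cycles)
            coeffs = np.array([np.convolve(ep, w, mode='same') for ep in epochs])
            # Averaging complex coefficients across epochs keeps only the phase-locked part;
            # non-phase-locked alpha activity cancels out.
            locked = coeffs.mean(axis=0)
            return np.abs(locked)   # amplitude envelope of the phase-locked alpha component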

  12. Visual tuning and metrical perception of realistic point-light dance movements.

    PubMed

    Su, Yi-Huang

    2016-01-01

    Humans move to music spontaneously, and this sensorimotor coupling underlies musical rhythm perception. The present research proposed that, based on common action representation, different metrical levels as in auditory rhythms could emerge visually when observing structured dance movements. Participants watched a point-light figure performing basic steps of Swing dance cyclically in different tempi, whereby the trunk bounced vertically at every beat and the limbs moved laterally at every second beat, yielding two possible metrical periodicities. In Experiment 1, participants freely identified a tempo of the movement and tapped along. While some observers only tuned to the bounce and some only to the limbs, the majority tuned to one level or the other depending on the movement tempo, which was also associated with individuals' preferred tempo. In Experiment 2, participants reproduced the tempo of the leg movements by four regular taps, and perceived the leg tempo as slower when the trunk bounced simultaneously in the stimuli than when it did not. This mirrors previous findings of an auditory 'subdivision effect', suggesting that the leg movements were perceived as the beat and the bounce as subdivisions. Together these results support visual metrical perception of dance movements, which may employ similar action-based mechanisms to those underpinning auditory rhythm perception. PMID:26947252

  13. Visual tuning and metrical perception of realistic point-light dance movements

    PubMed Central

    Su, Yi-Huang

    2016-01-01

    Humans move to music spontaneously, and this sensorimotor coupling underlies musical rhythm perception. The present research proposed that, based on common action representation, different metrical levels as in auditory rhythms could emerge visually when observing structured dance movements. Participants watched a point-light figure performing basic steps of Swing dance cyclically in different tempi, whereby the trunk bounced vertically at every beat and the limbs moved laterally at every second beat, yielding two possible metrical periodicities. In Experiment 1, participants freely identified a tempo of the movement and tapped along. While some observers only tuned to the bounce and some only to the limbs, the majority tuned to one level or the other depending on the movement tempo, which was also associated with individuals’ preferred tempo. In Experiment 2, participants reproduced the tempo of the leg movements by four regular taps, and perceived the leg tempo as slower when the trunk bounced simultaneously in the stimuli than when it did not. This mirrors previous findings of an auditory ‘subdivision effect’, suggesting that the leg movements were perceived as the beat and the bounce as subdivisions. Together these results support visual metrical perception of dance movements, which may employ similar action-based mechanisms to those underpinning auditory rhythm perception. PMID:26947252

  14. Assessment of visual perception in adolescents with a history of central coordination disorder in early life – 15-year follow-up study

    PubMed Central

    Kowalski, Ireneusz M.; Domagalska, Małgorzata; Szopa, Andrzej; Dwornik, Michał; Kujawa, Jolanta; Stępień, Agnieszka; Śliwiński, Zbigniew

    2012-01-01

    Introduction Central nervous system damage in early life results in both quantitative and qualitative abnormalities of psychomotor development. Late sequelae of these disturbances may include visual perception disorders which not only affect the ability to read and write but also generally influence the child's intellectual development. This study sought to determine whether a central coordination disorder (CCD) in early life treated according to Vojta's method with elements of the sensory integration (S-I) and neuro-developmental treatment (NDT)/Bobath approaches affects development of visual perception later in life. Material and methods The study involved 44 participants aged 15-16 years, including 19 diagnosed with moderate or severe CCD in the neonatal period, i.e. during the first 2-3 months of life, with diagnosed mild-degree neonatal encephalopathy due to perinatal anoxia, and 25 healthy people without a history of developmental psychomotor disturbances in the neonatal period. The study tool was a visual perception IQ test comprising 96 graphic tasks. Results The study revealed equal proportions of participants (p < 0.05) defined as very skilled (94-96), skilled (91-94), average (71-91), poor (67-71), and very poor (0-67) in both groups. These results mean that adolescents with a history of CCD in the neonatal period did not differ with regard to the level of visual perception from their peers who had not demonstrated psychomotor development disorders in the neonatal period. Conclusions Early treatment of children with CCD affords a possibility of normalising their psychomotor development early enough to prevent consequences in the form of cognitive impairments in later life. PMID:23185199

  15. Evaluation of factors affecting stakeholder risk perception of contaminated sediment disposal in Oslo harbor.

    PubMed

    Sparrevik, Magnus; Ellen, Gerald Jan; Duijn, Mike

    2011-01-01

    The management of environmental pollution has changed considerably since the growth of environmental awareness in the late 1960s. The general increased environmental concern and involvement of stakeholders in today's environmental issues may enhance the need to consider risk in a much broader social context rather than just as an estimate of ecological hazard. Risk perception and the constructs and images of risks held by stakeholders and society are important items to address in the management of environmental projects, including the management of contaminated sediments. Here we present a retrospective case study that evaluates factors affecting stakeholder risk perception of contaminated sediment disposal that occurred during a remediation project in Oslo harbor, Norway. The choice to dispose of dredged contaminated sediments in a confined aquatic disposal (CAD) site rather than at a land disposal site has received a lot of societal attention, attracted large media coverage, and caused many public discussions. A mixed method approach is used to investigate how risk perceptive affective factors (PAF), socio-demographic aspects, and participatory aspects have influenced the various stakeholders' preferences for the two different disposal options. Risk perceptive factors such as transparency in the decision-making process and controllability of the disposal options have been identified as important for risk perception. The results of the study also support the view that there is no sharp distinction in risk perception between experts and other parties, and they emphasize the importance of addressing risk perceptive affective factors in similar environmental decision-making processes. Indeed, PAFs such as transparency, openness, and information are fundamental to address in sensitive environmental decisions, such as sediment disposal alternatives, in order to progress to more technical questions such as controllability and safety. PMID:20809566

  16. Production and perception rules underlying visual patterns: effects of symmetry and hierarchy

    PubMed Central

    Westphal-Fitch, Gesche; Huber, Ludwig; Gómez, Juan Carlos; Fitch, W. Tecumseh

    2012-01-01

    Formal language theory has been extended to two-dimensional patterns, but little is known about two-dimensional pattern perception. We first examined spontaneous two-dimensional visual pattern production by humans, gathered using a novel touch screen approach. Both spontaneous creative production and subsequent aesthetic ratings show that humans prefer ordered, symmetrical patterns over random patterns. We then further explored pattern-parsing abilities in different human groups, and compared them with pigeons. We generated visual plane patterns based on rules varying in complexity. All human groups tested, including children and individuals diagnosed with autism spectrum disorder (ASD), were able to detect violations of all production rules tested. Our ASD participants detected pattern violations with the same speed and accuracy as matched controls. Children's ability to detect violations of a relatively complex rotational rule correlated with age, whereas their ability to detect violations of a simple translational rule did not. By contrast, even with extensive training, pigeons were unable to detect orientation-based structural violations, suggesting that, unlike humans, they did not learn the underlying structural rules. Visual two-dimensional patterns offer a promising new formally-grounded way to investigate pattern production and perception in general, widely applicable across species and age groups. PMID:22688636

  17. SoftAR: visually manipulating haptic softness perception in spatial augmented reality.

    PubMed

    Punpongsanon, Parinya; Iwai, Daisuke; Sato, Kosuke

    2015-11-01

    We present SoftAR, a novel spatial augmented reality (AR) technique based on a pseudo-haptics mechanism that visually manipulates the sense of softness perceived by a user pushing a soft physical object. Considering the limitations of projection-based approaches that change only the surface appearance of a physical object, we propose two projection visual effects, i.e., surface deformation effect (SDE) and body appearance effect (BAE), on the basis of the observations of humans pushing physical objects. The SDE visualizes a two-dimensional deformation of the object surface with a controlled softness parameter, and BAE changes the color of the pushing hand. Through psychophysical experiments, we confirm that the SDE can manipulate softness perception such that the participant perceives significantly greater softness than the actual softness. Furthermore, fBAE, in which BAE is applied only for the finger area, significantly enhances manipulation of the perception of softness. We create a computational model that estimates perceived softness when SDE+fBAE is applied. We construct a prototype SoftAR system in which two application frameworks are implemented. The softness adjustment allows a user to adjust the softness parameter of a physical object, and the softness transfer allows the user to replace the softness with that of another object. PMID:26340774

  18. Production and perception rules underlying visual patterns: effects of symmetry and hierarchy.

    PubMed

    Westphal-Fitch, Gesche; Huber, Ludwig; Gómez, Juan Carlos; Fitch, W Tecumseh

    2012-07-19

    Formal language theory has been extended to two-dimensional patterns, but little is known about two-dimensional pattern perception. We first examined spontaneous two-dimensional visual pattern production by humans, gathered using a novel touch screen approach. Both spontaneous creative production and subsequent aesthetic ratings show that humans prefer ordered, symmetrical patterns over random patterns. We then further explored pattern-parsing abilities in different human groups, and compared them with pigeons. We generated visual plane patterns based on rules varying in complexity. All human groups tested, including children and individuals diagnosed with autism spectrum disorder (ASD), were able to detect violations of all production rules tested. Our ASD participants detected pattern violations with the same speed and accuracy as matched controls. Children's ability to detect violations of a relatively complex rotational rule correlated with age, whereas their ability to detect violations of a simple translational rule did not. By contrast, even with extensive training, pigeons were unable to detect orientation-based structural violations, suggesting that, unlike humans, they did not learn the underlying structural rules. Visual two-dimensional patterns offer a promising new formally-grounded way to investigate pattern production and perception in general, widely applicable across species and age groups. PMID:22688636

  19. Positive affect modulates activity in the visual cortex to images of high calorie foods.

    PubMed

    Killgore, William D S; Yurgelun-Todd, Deborah A

    2007-05-01

    Activity within the visual cortex can be influenced by the emotional salience of a stimulus, but it is not clear whether such cortical activity is modulated by the affective status of the individual. This study used functional magnetic resonance imaging (fMRI) to examine the relationship between affect ratings on the Positive and Negative Affect Schedule and activity within the occipital cortex of 13 normal-weight women while viewing images of high calorie and low calorie foods. Regression analyses revealed that when participants viewed high calorie foods, Positive Affect correlated significantly with activity within the lingual gyrus and calcarine cortex, whereas Negative Affect was unrelated to visual cortex activity. In contrast, during presentations of low calorie foods, affect ratings, regardless of valence, were unrelated to occipital cortex activity. These findings suggest a mechanism whereby positive affective state may affect the early stages of sensory processing, possibly influencing subsequent perceptual experience of a stimulus. PMID:17464782

  20. Embodiments, visualizations, and immersion with enactive affective systems

    NASA Astrophysics Data System (ADS)

    Domingues, Diana; Miosso, Cristiano J.; Rodrigues, Suélia F.; Silva Rocha Aguiar, Carla; Lucena, Tiago F.; Miranda, Mateus; Rocha, Adson F.; Raskar, Ramesh

    2014-02-01

    Our proposal in Bioart and Biomedical Engineering for affective esthetics focuses on the expanded sensorium and investigates problems regarding enactive systems. These systems enhance the sensorial experiences and amplify kinesthesia by adding the sensations that are formed in response to the physical world, which aesthetically constitutes the principle of synaesthesia. In this paper, we also present enactive systems inside the CAVE, configuring compelling experiences in data landscapes and human affective narratives. The interaction occurs through the acquisition, data visualization and analysis of several synchronized physiological signals, to which the landscapes respond and provide immediate feedback, according to the detected participants' actions and the intertwined responses of the environment. The signals we use to analyze the human states include the electrocardiography (ECG) signal, the respiratory flow, the galvanic skin response (GSR) signal, plantar pressures, the pulse signal and others. Each signal is collected by using a specifically designed dedicated electronic board, with reduced dimensions, so it does not interfere with normal movements, according to the principles of transparent technologies. Also, the electronic boards are implemented in a modular approach, so they are independent, and can be used in many different desired combinations, and at the same time provide synchronization between the collected data.

  1. Functional dissociation between action and perception of object shape in developmental visual object agnosia.

    PubMed

    Freud, Erez; Ganel, Tzvi; Avidan, Galia; Gilaie-Dotan, Sharon

    2016-03-01

    According to the two visual systems model, the cortical visual system is segregated into a ventral pathway mediating object recognition, and a dorsal pathway mediating visuomotor control. In the present study we examined whether the visual control of action could develop normally even when visual perceptual abilities are compromised from early childhood onward. Using his fingers, LG, an individual with a rare developmental visual object agnosia, manually estimated (perceptual condition) the width of blocks that varied in width and length (but not in overall size), or simply picked them up across their width (grasping condition). LG's perceptual sensitivity to target width was profoundly impaired in the manual estimation task compared to matched controls. In contrast, the sensitivity to object shape during grasping, as measured by maximum grip aperture (MGA), the time to reach the MGA, the reaction time and the total movement time were all normal in LG. Further analysis, however, revealed that LG's sensitivity to object shape during grasping emerged at a later time stage during the movement compared to controls. Taken together, these results demonstrate a dissociation between action and perception of object shape, and also point to a distinction between different stages of the grasping movement, namely planning versus online control. Moreover, the present study implies that visuomotor abilities can develop normally even when perceptual abilities developed in a profoundly impaired fashion. PMID:26827163

  2. From the heart to the mind's eye: cardiac vagal tone is related to visual perception of fearful faces at high spatial frequency.

    PubMed

    Park, Gewnhi; Van Bavel, Jay J; Vasey, Michael W; Egan, Eric J L; Thayer, Julian F

    2012-05-01

    The neurovisceral integration model (Thayer and Lane, 2000) proposes that vagally mediated heart rate variability (HRV)--an index of cardiac vagal tone--is associated with autonomic flexibility and emotional self-regulation. Two experiments examined the relationship between vagally mediated HRV and visual perception of affectively significant stimuli at different spatial frequencies. In Experiment 1, HRV was positively correlated with superior performance in discriminating the emotion of affectively significant (i.e., fearful) faces at high spatial frequency (HSF). In Experiment 2, processing goals moderated the relationship between HRV and successful discrimination of HSF fearful faces: in contrast to Experiment 1, discriminating the expressiveness of HSF fearful faces was not correlated with HRV. The current research suggests that HRV is positively associated with superior visual discrimination of affectively significant stimuli at high spatial frequency, and that this relationship may be sensitive to the top-down influence of different processing goals. PMID:22391523
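
    A minimal sketch of the kind of relationship examined here, assuming synthetic data and using RMSSD as one common time-domain index of vagally mediated HRV (the study may well have used a different index): compute each participant's HRV from resting RR intervals and correlate it with accuracy at discriminating fearful faces shown at high spatial frequency.

```python
import numpy as np
from scipy.stats import pearsonr

def rmssd(rr_ms):
    """Root mean square of successive differences of RR intervals (ms):
    a common time-domain index of vagally mediated HRV."""
    rr = np.asarray(rr_ms, dtype=float)
    return np.sqrt(np.mean(np.diff(rr) ** 2))

# Synthetic illustration only: per-participant RR series and accuracy for
# discriminating high-spatial-frequency fearful faces (invented numbers).
rng = np.random.default_rng(0)
rr_series = [800 + rng.normal(0, sd, 300) for sd in (10, 20, 30, 40, 50, 60)]
accuracy = np.array([0.61, 0.64, 0.70, 0.72, 0.78, 0.81])

hrv = np.array([rmssd(rr) for rr in rr_series])
r, p = pearsonr(hrv, accuracy)
print(f"HRV-accuracy correlation: r = {r:.2f}, p = {p:.3f}")
```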

  3. Motion perception: a review of developmental changes and the role of early visual experience

    PubMed Central

    Hadad, Batsheva; Schwartz, Sivan; Maurer, Daphne; Lewis, Terri L.

    2015-01-01

    Significant controversies have arisen over the developmental trajectory for the perception of global motion. Studies diverge on the age at which it becomes adult-like, with estimates ranging from as young as 3 years to as old as 16. In this article, we review these apparently conflicting results and suggest a potentially unifying hypothesis that may also account for the contradictory literature in neurodevelopmental disorders, such as Autism Spectrum Disorder (ASD). We also discuss the extent to which patterned visual input during this period is necessary for the later development of motion perception. We conclude by addressing recent studies directly comparing different types of motion integration, both in typical and atypical development, and suggest areas ripe for future research. PMID:26441564

  4. Influence Of Ambient Light On The "Visual" Sensitometric Properties Of, And Detail Perception On, A Radiograph

    NASA Astrophysics Data System (ADS)

    Bollen, Romain; Vranckx, Jean

    1981-07-01

    Lack of perception at high densities on radiographs and the influence of viewing conditions on it are well known. This lack may be caused by blinding effects, by high visual noise at low light intensities, or by a third phenomenon, i.e. the dependence of the sensitometric properties of film on viewing conditions, which is analyzed in this paper. Reflection of ambient light by the film dramatically lowers high densities and, in particular, film contrast at these densities. Sensitometric curves of several films were measured under different viewing conditions by means of a telescopic photometer. The curves can also be deduced from curves measured by a regular densitometer when the optical properties of the film, the ambient light level and the light intensity of the negatoscope are known. The influence of the phenomenon under typical viewing conditions for the Curix MR4 film is demonstrated by means of sensitometric and perceptibility curves.
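
    The dependence of the "visual" sensitometric curve on viewing conditions can be illustrated with a simple, assumed model (not necessarily the formulation used in the paper): ambient light reflected off the film adds to the light transmitted by the negatoscope, so the density the observer effectively sees saturates, and contrast collapses, at high nominal densities.

```python
import numpy as np

def visual_density(d_nominal, l_box, l_reflected):
    """Illustrative model (an assumption, not the paper's exact formulation) of
    how ambient light reflected by the film compresses high densities.

    d_nominal   : densitometer density of the film (measured without ambient light)
    l_box       : luminance of the negatoscope (light box)
    l_reflected : luminance added by ambient light reflected off the film surface
    """
    transmitted = l_box * 10.0 ** (-np.asarray(d_nominal, dtype=float))
    seen = transmitted + l_reflected            # what actually reaches the eye
    reference = l_box + l_reflected             # clear-film (D = 0) region
    return -np.log10(seen / reference)

d = np.linspace(0.0, 3.5, 8)
print(np.round(visual_density(d, l_box=2000.0, l_reflected=5.0), 2))
# High nominal densities flatten out: contrast is lost exactly where D is high.
```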

  5. Motion perception: a review of developmental changes and the role of early visual experience.

    PubMed

    Hadad, Batsheva; Schwartz, Sivan; Maurer, Daphne; Lewis, Terri L

    2015-01-01

    Significant controversies have arisen over the developmental trajectory for the perception of global motion. Studies diverge on the age at which it becomes adult-like, with estimates ranging from as young as 3 years to as old as 16. In this article, we review these apparently conflicting results and suggest a potentially unifying hypothesis that may also account for the contradictory literature in neurodevelopmental disorders, such as Autism Spectrum Disorder (ASD). We also discuss the extent to which patterned visual input during this period is necessary for the later development of motion perception. We conclude by addressing recent studies directly comparing different types of motion integration, both in typical and atypical development, and suggest areas ripe for future research. PMID:26441564

  6. Teachers’ perceptions of aspects affecting seminar learning: a qualitative study

    PubMed Central

    2013-01-01

    Background Many medical schools have embraced small group learning methods in their undergraduate curricula. Given increasing financial constraints on universities, active learning groups like seminars (with 25 students a group) are gaining popularity. To enhance the understanding of seminar learning and to determine how seminar learning can be optimised it is important to investigate stakeholders’ views. In this study, we qualitatively explored the views of teachers on aspects affecting seminar learning. Methods Twenty-four teachers with experience in facilitating seminars in a three-year bachelor curriculum participated in semi-structured focus group interviews. Three focus groups met twice with an interval of two weeks led by one moderator. Sessions were audio taped, transcribed verbatim and independently coded by two researchers using thematic analysis. An iterative process of data reduction resulted in emerging aspects that influence seminar learning. Results Teachers identified seven key aspects affecting seminar learning: the seminar teacher, students, preparation, group functioning, seminar goals and content, course coherence and schedule and facilities. Important components of these aspects were: the teachers’ role in developing seminars (‘ownership’), the amount and quality of preparation materials, a non-threatening learning climate, continuity of group composition, suitability of subjects for seminar teaching, the number and quality of seminar questions, and alignment of different course activities. Conclusions The results of this study contribute to the unravelling of the ‘the black box’ of seminar learning. Suggestions for ways to optimise active learning in seminars are made regarding curriculum development, seminar content, quality assurance and faculty development. PMID:23399475

  7. Visual crowding illustrates the inadequacy of local vs. global and feedforward vs. feedback distinctions in modeling visual perception

    PubMed Central

    Clarke, Aaron M.; Herzog, Michael H.; Francis, Gregory

    2014-01-01

    Experimentalists tend to classify models of visual perception as being either local or global, and involving either feedforward or feedback processing. We argue that these distinctions are not as helpful as they might appear, and we illustrate these issues by analyzing models of visual crowding as an example. Recent studies have argued that crowding cannot be explained by purely local processing, but that instead, global factors such as perceptual grouping are crucial. Theories of perceptual grouping, in turn, often invoke feedback connections as a way to account for their global properties. We examined three types of crowding models that are representative of global processing models, two of which employ feedback processing: a model based on Fourier filtering, a feedback neural network, and a specific feedback neural architecture that explicitly models perceptual grouping. Simulations demonstrate that crucial empirical findings are not accounted for by any of the models. We conclude that empirical investigations that reject a local or feedforward architecture offer almost no constraints for model construction, as there are an uncountable number of global and feedback systems. We propose that the identification of a system as being local or global and feedforward or feedback is less important than the identification of a system's computational details. Only the latter information can provide constraints on model development and promote quantitative explanations of complex phenomena. PMID:25374554

  8. Do Students' Approaches to Learning Affect Their Perceptions of Using Computing and Information Technology?

    ERIC Educational Resources Information Center

    Jelfs, Anne; Colbourn, Chris

    2002-01-01

    Discusses the use of communication and information technology (C&IT) in higher education in the United Kingdom and describes research that examined student perceptions of using C&IT for a virtual seminar series in psychology. Identified student learning approaches within the group and how it affected their adoption or rejection of the electronic…

  9. Perceptions of Educational Barriers Affecting the Academic Achievement of Latino K-12 Students

    ERIC Educational Resources Information Center

    Becerra, David

    2012-01-01

    This study examined different factors affecting the perceptions of barriers in academic achievement of Latino K-12 students. The study used data from 1,508 participants who identified themselves as being of Hispanic or Latino heritage in the 2004 National Survey of Latinos: Education, compiled by the Pew Hispanic Center between August 7 and…

  10. Ethical Ideologies: Do They Affect Shopping Behaviors and Perceptions of Morality?

    ERIC Educational Resources Information Center

    Cho, Hyeon; Yoo, Jeong-Ju; Johnson, Kim K. P.

    2005-01-01

    Counterfeiting is a serious problem facing several industries, including the medical, agricultural, and apparel industries (Bloch, Bush, & Campbell, 1993). The authors investigated whether ethical viewpoints affect perceptions of the morality of particular shopping behaviors, attitudes toward counterfeit products, and intentions to purchase such…

  11. Student and Teacher Affective Perception of Simulation-Gaming as a Pedagogical Technique.

    ERIC Educational Resources Information Center

    Postma, Charles H.; And Others

    A research project investigated the effect which experience with simulation techniques had upon students' and teachers' affective perceptions of the teaching-learning process in which they were involved. Two hundred ninety-five eleventh grade students from Indiana public schools were divided into experimental and control groups for instruction in…

  12. Students Perceptions on Factors That Affect Their Academic Performance: The Case of Great Zimbabwe University (GZU)

    ERIC Educational Resources Information Center

    Mapuranga, Barbra; Musingafi, Maxwell C. C.; Zebron, Shupikai

    2015-01-01

    Some educators argue that entry standards are the most important determinants of successful completion of a university programme; others maintain that non-academic factors must also be considered. In this study we sought to investigate open and distance learning students' perceptions of the factors affecting academic performance and successful…

  13. Preschool Children's Perceptions of the Value of Affection as Seen in Their Drawings

    ERIC Educational Resources Information Center

    Günindi, Yunus

    2015-01-01

    The purpose of this study is to examine the perceptions of children in preschool education with regard to the value of affection in the pictures they draw. The study involved 199 children aged 60 months old or above. The descriptive research method was used and data were collected with the draw-and-explain technique. During the collection of the…

  14. Affective picture perception: Emotion, context, and the late positive potential

    PubMed Central

    Pastor, M. Carmen; Bradley, Margaret M.; Löw, Andreas; Versace, Francesco; Moltó, Javier; Lang, Peter J.

    2010-01-01

    Event-related potentials (ERP) were measured when pleasant, neutral or unpleasant pictures were presented in the context of similarly valenced stimuli, and compared to ERPs elicited when the same pictures were viewed in an intermixed context. An early ERP component (150–300 ms) measured over occipital and fronto-central sensors was specific to viewing pleasant pictures and was not affected by presentation context. Replicating previous studies, emotional pictures prompted a larger late positive potential (LPP, 400–700 ms) and a larger positive slow wave (1–6 s) over centro-parietal sensors that also did not differ by presentation context. On the other hand, ERPs elicited when viewing neutral pictures varied as a function of context, eliciting somewhat larger LPPs when presented in blocks, and prompting smaller slow waves over occipital sensors. Taken together, the data indicate that emotional pictures prompt increased attention and orienting that is unaffected by the context of presentation, whereas neutral pictures are more vulnerable to context manipulations. PMID:18068150

  15. Anticipatory visual perception as a bio-inspired mechanism underlying robot locomotion.

    PubMed

    Barrera, Alejandra; Laschi, Cecilia

    2010-01-01

    Anticipation of sensory consequences of actions is critical for the predictive control of movement that explains most of our sensory-motor behaviors. Numerous neuroscientific studies in humans provide evidence of anticipatory mechanisms based on internal models. Several robotic implementations of predictive behaviors have been inspired by those biological mechanisms in order to achieve adaptive agents. This paper provides an overview of such neuroscientific and robotic evidence; a high-level architecture of sensory-motor coordination based on anticipatory visual perception and internal models is then introduced; and finally, the paper concludes by discussing the relevance of the proposed architecture within the context of current research in humanoid robotics. PMID:21096813
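
    Below is a minimal, generic sketch of a forward internal model of the kind referred to above (the linear dynamics, names and numbers are illustrative assumptions, not the architecture proposed in the paper): the model predicts the sensory consequence of a motor command so that control can proceed before delayed feedback arrives, and the prediction error is what would drive adaptation.

```python
import numpy as np

class ForwardModel:
    """Minimal anticipatory internal (forward) model: it predicts the next
    sensory state from the current state and the motor command, so the
    controller can act on the prediction rather than wait for feedback."""

    def __init__(self, A, B):
        self.A = np.asarray(A, dtype=float)   # state transition
        self.B = np.asarray(B, dtype=float)   # effect of the motor command

    def predict(self, state, command):
        return self.A @ state + self.B @ command

    def prediction_error(self, predicted, observed):
        # The mismatch that would drive adaptation of the model.
        return np.linalg.norm(observed - predicted)

# Toy usage: a 2-D visual state (e.g. target position on the image plane).
fm = ForwardModel(A=np.eye(2), B=0.1 * np.eye(2))
state, command = np.array([0.0, 0.0]), np.array([1.0, -0.5])
predicted = fm.predict(state, command)
observed = np.array([0.12, -0.04])            # hypothetical delayed feedback
print(predicted, fm.prediction_error(predicted, observed))
```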

  16. Visual adaptation of the perception of “life”: animacy is a basic perceptual dimension of faces

    PubMed Central

    Koldewyn, Kami; Hanus, Patricia; Balas, Benjamin

    2014-01-01

    One critical component of understanding another’s mind is the perception of “life” in a face. However, little is known about the cognitive and neural mechanisms underlying this perception of animacy. Here, using a visual adaptation paradigm, we ask whether face animacy is (1) a basic dimension of face perception and (2) supported by a common neural mechanism across distinct face categories defined by age and species. Observers rated the perceived animacy of adult human faces before and after adaptation to (1) adult faces, (2) child faces, and (3) dog faces. When testing the perception of animacy in human faces, we found significant adaptation to both adult and child faces, but not dog faces. We did, however, find significant adaptation when morphed dog images and dog adaptors were used. Thus, animacy perception in faces appears to be a basic dimension of face perception that is species-specific, but not constrained by age categories. PMID:24323739

  17. Emotion Telepresence: Emotion Augmentation through Affective Haptics and Visual Stimuli

    NASA Astrophysics Data System (ADS)

    Tsetserukou, D.; Neviarouskaya, A.

    2012-03-01

    The paper focuses on a novel concept of emotional telepresence. The iFeel_IM! system, which is in the vanguard of this technology, integrates the 3D virtual world Second Life, an intelligent component for automatic emotion recognition from text messages, and innovative affective haptic interfaces providing additional nonverbal communication channels through simulation of emotional feedback and social touch (physical co-presence). Users can not only exchange messages but also emotionally and physically feel the presence of the communication partner (e.g., a family member, friend, or beloved person). The next prototype of the system will include a tablet computer. The user will be able to engage in haptic interaction with the avatar and thus influence its mood and the emotion of the partner. A finger-gesture language will be designed for communication with the avatar. This will bring a new level of immersion to online communication.

  18. [Does music influence visual perception in campimetric measurements of the visual field?].

    PubMed

    Gall, Carolin; Geier, Jens-Stefan; Sabel, Bernhard A; Kasten, Erich

    2009-01-01

    21 subjects (mean age 28.4 +/- 10.9 years, M +/- SD) without any damage to the visual system were examined with computer-based campimetric tests of near-threshold stimulus detection, whereby artificial tunnel vision was induced. Campimetry was performed in four trials in randomized order using a within-subjects design: 1. classical music, 2. Techno music, 3. music for relaxation and 4. no music. Results were slightly better in all music conditions. Performance was best when subjects were listening to Techno music. The average increase of correctly recognized stimuli and fixation controls amounted to 3%. To check the stability of the effects, 9 subjects were tested three times. A moderating influence of personality traits and habits of listening to music was tested for but could not be found. We conclude that music has at least no negative influence on performance in the campimetric measurement. Reasons for the positive effects of music can be seen in a general increase of vigilance and a modulation of perceptual thresholds. PMID:18240114

  19. Time perception of visual motion is tuned by the motor representation of human actions

    PubMed Central

    Gavazzi, Gioele; Bisio, Ambra; Pozzo, Thierry

    2013-01-01

    Several studies have shown that the observation of a rapidly moving stimulus dilates our perception of time. However, this effect appears to be at odds with the fact that our interactions both with environment and with each other are temporally accurate. This work exploits this paradox to investigate whether the temporal accuracy of visual motion uses motor representations of actions. To this aim, the stimuli were a dot moving with kinematics belonging or not to the human motor repertoire and displayed at different velocities. Participants had to replicate its duration with two tasks differing in the underlying motor plan. Results show that independently of the task's motor plan, the temporal accuracy and precision depend on the correspondence between the stimulus' kinematics and the observer's motor competencies. Our data suggest that the temporal mechanism of visual motion exploits a temporal visuomotor representation tuned by the motor knowledge of human actions. PMID:23378903

  20. Parafoveal perception during sentence reading?: An ERP paradigm using rapid serial visual presentation (RSVP) with flankers

    PubMed Central

    Bentin, Shlomo; Kutas, Marta

    2014-01-01

    We describe a new procedure using event-related brain potentials to investigate parafoveal word processing during sentence reading. Sentences were presented word-by-word at fixation, flanked two degrees bilaterally by letter strings. Flanker strings were pseudowords, except for the third word in each sentence, which was flanked by either two pseudowords, or a pseudoword and a word, one on each side. Flanker words were either semantically congruent or incongruent with the sentence context. P2 (175-375 ms) amplitudes were less positive for contextually incongruent than congruent flanker words, but only with flanker words in the right visual field for English, and in the left visual field for Hebrew. Flankered word presentation thus may be a suitable method for the electrophysiological study of parafoveal perception during sentence reading. PMID:21361965

  1. The Use of Virtual Reality in Psychology: A Case Study in Visual Perception

    PubMed Central

    Wilson, Christopher J.; Soranzo, Alessandro

    2015-01-01

    Recent proliferation of available virtual reality (VR) tools has seen increased use in psychological research. This is due to a number of advantages afforded over traditional experimental apparatus such as tighter control of the environment and the possibility of creating more ecologically valid stimulus presentation and response protocols. At the same time, higher levels of immersion and visual fidelity afforded by VR do not necessarily evoke presence or elicit a “realistic” psychological response. The current paper reviews some current uses for VR environments in psychological research and discusses some ongoing questions for researchers. Finally, we focus on the area of visual perception, where both the advantages and challenges of VR are particularly salient. PMID:26339281

  2. Neuronal integration in visual cortex elevates face category tuning to conscious face perception.

    PubMed

    Fahrenfort, Johannes J; Snijders, Tineke M; Heinen, Klaartje; van Gaal, Simon; Scholte, H Steven; Lamme, Victor A F

    2012-12-26

    The human brain has the extraordinary capability to transform cluttered sensory input into distinct object representations. For example, it is able to rapidly and seemingly without effort detect object categories in complex natural scenes. Surprisingly, category tuning is not sufficient to achieve conscious recognition of objects. What neural process beyond category extraction might elevate neural representations to the level where objects are consciously perceived? Here we show that visible and invisible faces produce similar category-selective responses in the ventral visual cortex. The pattern of neural activity evoked by visible faces could be used to decode the presence of invisible faces and vice versa. However, only visible faces caused extensive response enhancements and changes in neural oscillatory synchronization, as well as increased functional connectivity between higher and lower visual areas. We conclude that conscious face perception is more tightly linked to neural processes of sustained information integration and binding than to processes accommodating face category tuning. PMID:23236162
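
    The cross-condition decoding logic can be sketched on synthetic data (shapes, labels and classifier choice are assumptions, not the authors' analysis): a linear classifier trained on activity patterns from visible-face trials is tested on invisible-face trials, and above-chance accuracy indicates that the two conditions share a category-selective pattern.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Cross-condition decoding on synthetic data (shapes and effect sizes invented):
# train on the "visible" condition, test on the "invisible" condition.
rng = np.random.default_rng(1)
n_trials, n_voxels = 80, 200
shared_pattern = rng.normal(0, 1, n_voxels)      # face-selective pattern common to both

def simulate(n, amplitude):
    labels = rng.integers(0, 2, n)               # 1 = face present, 0 = absent
    noise = rng.normal(0, 1, (n, n_voxels))
    patterns = noise + amplitude * labels[:, None] * shared_pattern[None, :]
    return patterns, labels

X_visible, y_visible = simulate(n_trials, amplitude=0.5)
X_invisible, y_invisible = simulate(n_trials, amplitude=0.3)   # weaker, same pattern

clf = LogisticRegression(max_iter=1000).fit(X_visible, y_visible)
acc = accuracy_score(y_invisible, clf.predict(X_invisible))
print(f"visible -> invisible decoding accuracy: {acc:.2f}")    # well above chance (0.5)
```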

  3. Perception of audio-visual speech synchrony in Spanish-speaking children with and without specific language impairment

    PubMed Central

    PONS, FERRAN; ANDREU, LLORENC.; SANZ-TORRENT, MONICA; BUIL-LEGAZ, LUCIA; LEWKOWICZ, DAVID J.

    2014-01-01

    Speech perception involves the integration of auditory and visual articulatory information and, thus, requires the perception of temporal synchrony between these two streams of information. There is evidence that children with Specific Language Impairment (SLI) have difficulty with auditory speech perception, but it is not known if this is also true for the integration of auditory and visual speech. Twenty Spanish-speaking children with SLI, twenty typically developing age-matched Spanish-speaking children, and twenty Spanish-speaking children matched for MLU-w participated in an eye-tracking study to investigate the perception of audiovisual speech synchrony. Results revealed that children with typical language development perceived an audiovisual asynchrony of 666 ms regardless of whether the auditory or visual speech attribute led the other one. Children with SLI only detected the 666 ms asynchrony when the auditory component followed the visual component. None of the groups perceived an audiovisual asynchrony of 366 ms. These results suggest that the difficulty of speech processing by children with SLI would also involve difficulties in integrating auditory and visual aspects of speech perception. PMID:22874648

  4. Visual aspects of perception of multimedia messages on the web through the "eye tracker" method.

    PubMed

    Svilicić, Niksa

    2010-09-01

    Since the dawn of civilisation, visual communication has played a role in everyday life. In early times there were simply shaped drawings of animals and pictograms explaining hunting tactics or strategies for attacking enemies. Through evolution, visual expression became an important component of the communication process on several levels, from the existential and economic to the artistic. However, there has always been the question of how well users receive such visual information in the medium transmitting it. Does the physical positioning of information in the medium contribute to the efficiency of the message? Do the same rules of content positioning apply to traditional (offline) and online media (the Internet)? The rapid development of information technology and the Internet in almost all segments of contemporary life calls for defining rules for designing and positioning multimedia online content on web sites. Recent research indicates beyond doubt that the physical positioning of online content on a web site significantly determines the quality of the user's perception of that content. By employing the "Eye tracking" method it is possible to objectively analyse the level of user perception of multimedia content on a web site. What is the first thing the user observes after opening the web site, and how does he/she visually search the online content? By which methods can this be investigated subjectively and objectively? How can the survey results be used to improve the creation of web sites and to optimise the positioning of relevant content on the site? The answers to these questions will significantly improve the presentation of multimedia interactive content on the Web. PMID:20977068

  5. Perception of linear horizontal self-motion induced by peripheral vision /linearvection/ - Basic characteristics and visual-vestibular interactions

    NASA Technical Reports Server (NTRS)

    Berthoz, A.; Pavard, B.; Young, L. R.

    1975-01-01

    The basic characteristics of the sensation of linear horizontal motion have been studied. Objective linear motion was induced by means of a moving cart. Visually induced linear motion perception (linearvection) was obtained by projection of moving images at the periphery of the visual field. Image velocity and luminance thresholds for the appearance of linearvection have been measured and are in the range of those for image motion detection (without sensation of self motion) by the visual system. Latencies of onset are around 1 sec, and short-term adaptation has been shown. The dynamic range of the visual analyzer as judged by frequency analysis is lower than that of the vestibular analyzer. Conflicting situations in which visual cues contradict vestibular and other proprioceptive cues show, in the case of linearvection, a dominance of vision, which supports the idea of an essential although not independent role of vision in self-motion perception.

  6. On Assisting a Visual-Facial Affect Recognition System with Keyboard-Stroke Pattern Information

    NASA Astrophysics Data System (ADS)

    Stathopoulou, I.-O.; Alepis, E.; Tsihrintzis, G. A.; Virvou, M.

    Towards realizing a multimodal affect recognition system, we are considering the advantages of assisting a visual-facial expression recognition system with keyboard-stroke pattern information. Our work is based on the assumption that the visual-facial and keyboard modalities are complementary to each other and that their combination can significantly improve the accuracy in affective user models. Specifically, we present and discuss the development and evaluation process of two corresponding affect recognition subsystems, with emphasis on the recognition of 6 basic emotional states, namely happiness, sadness, surprise, anger and disgust as well as the emotion-less state which we refer to as neutral. We find that emotion recognition by the visual-facial modality can be aided greatly by keyboard-stroke pattern information and the combination of the two modalities can lead to better results towards building a multimodal affect recognition system.
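
    One simple way to combine the two modalities, sketched below with invented numbers and weights (the system described above may use a different fusion scheme), is late fusion: each modality-specific classifier outputs a probability distribution over the six states, and the combined system takes a weighted average before picking the most likely emotion.

```python
import numpy as np

EMOTIONS = ["happiness", "sadness", "surprise", "anger", "disgust", "neutral"]

def late_fusion(p_face, p_keys, w_face=0.6, w_keys=0.4):
    """Weighted late fusion of two modality-specific classifiers' probability
    outputs (weights here are arbitrary illustrative values, not from the paper)."""
    p_face, p_keys = np.asarray(p_face, float), np.asarray(p_keys, float)
    fused = w_face * p_face + w_keys * p_keys
    return fused / fused.sum()

# Hypothetical outputs: the facial classifier hesitates between anger and sadness,
# while the keystroke classifier points to anger; fusion resolves the ambiguity.
p_face = [0.05, 0.35, 0.05, 0.40, 0.05, 0.10]
p_keys = [0.05, 0.10, 0.05, 0.60, 0.10, 0.10]
fused = late_fusion(p_face, p_keys)
print(EMOTIONS[int(np.argmax(fused))], np.round(fused, 3))
```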

  7. The influence of a working memory task on affective perception of facial expressions.

    PubMed

    Lim, Seung-Lark; Bruce, Amanda S; Aupperle, Robin L

    2014-01-01

    In a dual-task paradigm, participants performed a spatial location working memory task and a forced two-choice perceptual decision task (neutral vs. fearful) with gradually morphed emotional faces (neutral ∼ fearful). Task-irrelevant word distractors (negative, neutral, and control) were experimentally manipulated during spatial working memory encoding. We hypothesized that, if affective perception is influenced by concurrent cognitive load using a working memory task, task-irrelevant emotional distractors would bias subsequent perceptual decision-making on ambiguous facial expression. We found that when either neutral or negative emotional words were presented as task-irrelevant working-memory distractors, participants more frequently reported fearful face perception - but only at the higher emotional intensity levels of morphed faces. Also, the affective perception bias due to negative emotional distractors correlated with a decrease in working memory performance. Taken together, our findings suggest that concurrent working memory load by task-irrelevant distractors has an impact on affective perception of facial expressions. PMID:25347772

  8. Real-time tracking using stereo and motion: Visual perception for space robotics

    NASA Technical Reports Server (NTRS)

    Nishihara, H. Keith; Thomas, Hans; Huber, Eric; Reid, C. Ann

    1994-01-01

    The state-of-the-art in computing technology is rapidly attaining the performance necessary to implement many early vision algorithms at real-time rates. This new capability is helping to accelerate progress in vision research by improving our ability to evaluate the performance of algorithms in dynamic environments. In particular, we are becoming much more aware of the relative stability of various visual measurements in the presence of camera motion and system noise. This new processing speed is also allowing us to raise our sights toward accomplishing much higher-level processing tasks, such as figure-ground separation and active object tracking, in real-time. This paper describes a methodology for using early visual measurements to accomplish higher-level tasks; it then presents an overview of the high-speed accelerators developed at Teleos to support early visual measurements. The final section describes the successful deployment of a real-time vision system to provide visual perception for the Extravehicular Activity Helper/Retriever robotic system in tests aboard NASA's KC135 reduced gravity aircraft.

  9. Field processes in stereovision. A description of stereopsis appropriate to ophthalmology and visual perception.

    PubMed

    Shipley, T

    1987-06-01

    There is, as yet, no satisfactory theory of stereopsis, despite the fact that our overt knowledge of "solid seeing" is now about 150 years old, and that contributions to our understanding come today from many fields: ophthalmology, psychology, psychophysics, neurophysiology, computer modelling, and optical-TV display technology. We review herein, and demonstrate for the reader whenever possible, certain key perceptual properties of the stereoscopic event of which any general theory must take account: vector stereoscopy and the neural grid, depth in empty visual fields, the relationship between stereoscopic and cognitive contours, stereoscopic contour formation in the presence of blur (thus, at low levels of central visual acuity), the phenomenon of cortical locking and of neural grid evocation in the presence of either peripheral or central rivalry, certain unusual ranges of figural mismatch and the concept of the horopter in relation to modern single cell electroneurophysiology in animals and to the constancy of visual directions. Some comments are also made on the concept of disparity processing by single cortical neurons, together with a short discussion of the implications of certain views of the genetics of stereovision for the perception of novel random texture sine-wave stereograms. We conclude that any theory pertinent to ophthalmology and visual science must combine the global concepts of cortical integration, the neural lock and the neural grid, herein introduced, with the more classical concepts of particulate or local binocular cortical correspondence. Certain preliminary steps in this direction are presented. PMID:3319467

  10. Beyond colour perception: auditory-visual synaesthesia induces experiences of geometric objects in specific locations.

    PubMed

    Chiou, Rocco; Stelter, Marleen; Rich, Anina N

    2013-06-01

    Our brain constantly integrates signals across different senses. Auditory-visual synaesthesia is an unusual form of cross-modal integration in which sounds evoke involuntary visual experiences. Previous research primarily focuses on synaesthetic colour, but little is known about non-colour synaesthetic visual features. Here we studied a group of synaesthetes for whom sounds elicit consistent visual experiences of coloured 'geometric objects' located at specific spatial location. Changes in auditory pitch alter the brightness, size, and spatial height of synaesthetic experiences in a systematic manner resembling the cross-modal correspondences of non-synaesthetes, implying synaesthesia may recruit cognitive/neural mechanisms for 'normal' cross-modal processes. To objectively assess the impact of synaesthetic objects on behaviour, we devised a multi-feature cross-modal synaesthetic congruency paradigm and asked participants to perform speeded colour or shape discrimination. We found irrelevant sounds influenced performance, as quantified by congruency effects, demonstrating that synaesthetes were not able to suppress their synaesthetic experiences even when these were irrelevant for the task. Furthermore, we found some evidence for task-specific effects consistent with feature-based attention acting on the constituent features of synaesthetic objects: synaesthetic colours appeared to have a stronger impact on performance than synaesthetic shapes when synaesthetes attended to colour, and vice versa when they attended to shape. We provide the first objective evidence that visual synaesthetic experience can involve multiple features forming object-like percepts and suggest that each feature can be selected by attention despite it being internally generated. These findings suggest theories of the brain mechanisms of synaesthesia need to incorporate a broader neural network underpinning multiple visual features, perceptual knowledge, and feature integration, rather than

  11. Individual differences in beat perception affect gait responses to low- and high-groove music.

    PubMed

    Leow, Li-Ann; Parrott, Taylor; Grahn, Jessica A

    2014-01-01

    Slowed gait in patients with Parkinson's disease (PD) can be improved when patients synchronize footsteps to isochronous metronome cues, but limited retention of such improvements suggest that permanent cueing regimes are needed for long-term improvements. If so, music might make permanent cueing regimes more pleasant, improving adherence; however, music cueing requires patients to synchronize movements to the "beat," which might be difficult for patients with PD who tend to show weak beat perception. One solution may be to use high-groove music, which has high beat salience that may facilitate synchronization, and affective properties, which may improve motivation to move. As a first step to understanding how beat perception affects gait in complex neurological disorders, we examined how beat perception ability affected gait in neurotypical adults. Synchronization performance and gait parameters were assessed as healthy young adults with strong or weak beat perception synchronized to low-groove music, high-groove music, and metronome cues. High-groove music was predicted to elicit better synchronization than low-groove music, due to its higher beat salience. Two musical tempi, or rates, were used: (1) preferred tempo: beat rate matched to preferred step rate and (2) faster tempo: beat rate adjusted to 22.5% faster than preferred step rate. For both strong and weak beat-perceivers, synchronization performance was best with metronome cues, followed by high-groove music, and worst with low-groove music. In addition, high-groove music elicited longer and faster steps than low-groove music, both at preferred tempo and at faster tempo. Low-groove music was particularly detrimental to gait in weak beat-perceivers, who showed slower and shorter steps compared to uncued walking. The findings show that individual differences in beat perception affect gait when synchronizing footsteps to music, and have implications for using music in gait rehabilitation. PMID:25374521

  12. Individual Differences in Beat Perception Affect Gait Responses to Low- and High-Groove Music

    PubMed Central

    Leow, Li-Ann; Parrott, Taylor; Grahn, Jessica A.

    2014-01-01

    Slowed gait in patients with Parkinson’s disease (PD) can be improved when patients synchronize footsteps to isochronous metronome cues, but limited retention of such improvements suggest that permanent cueing regimes are needed for long-term improvements. If so, music might make permanent cueing regimes more pleasant, improving adherence; however, music cueing requires patients to synchronize movements to the “beat,” which might be difficult for patients with PD who tend to show weak beat perception. One solution may be to use high-groove music, which has high beat salience that may facilitate synchronization, and affective properties, which may improve motivation to move. As a first step to understanding how beat perception affects gait in complex neurological disorders, we examined how beat perception ability affected gait in neurotypical adults. Synchronization performance and gait parameters were assessed as healthy young adults with strong or weak beat perception synchronized to low-groove music, high-groove music, and metronome cues. High-groove music was predicted to elicit better synchronization than low-groove music, due to its higher beat salience. Two musical tempi, or rates, were used: (1) preferred tempo: beat rate matched to preferred step rate and (2) faster tempo: beat rate adjusted to 22.5% faster than preferred step rate. For both strong and weak beat-perceivers, synchronization performance was best with metronome cues, followed by high-groove music, and worst with low-groove music. In addition, high-groove music elicited longer and faster steps than low-groove music, both at preferred tempo and at faster tempo. Low-groove music was particularly detrimental to gait in weak beat-perceivers, who showed slower and shorter steps compared to uncued walking. The findings show that individual differences in beat perception affect gait when synchronizing footsteps to music, and have implications for using music in gait rehabilitation. PMID:25374521
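
    Synchronization of footsteps (or taps) to a beat is commonly quantified with circular statistics; the sketch below (synthetic data and assumed variable names, not the study's exact pipeline) converts each step's asynchrony from the nearest beat into a phase angle and returns the mean resultant vector length, where 1 indicates perfect phase locking and 0 indicates none.

```python
import numpy as np

def synchronization_index(step_times, beat_times):
    """Circular-statistics index of cue synchronization: map each step's
    asynchrony to the nearest beat onto a phase angle, then return the mean
    resultant vector length (1 = perfect phase locking, 0 = none).
    A common approach in gait/tapping studies, not necessarily this study's."""
    step_times = np.asarray(step_times, float)
    beat_times = np.asarray(beat_times, float)
    period = np.median(np.diff(beat_times))
    # Asynchrony of each step relative to its nearest beat, mapped to radians.
    nearest = beat_times[np.abs(step_times[:, None] - beat_times[None, :]).argmin(axis=1)]
    phase = 2 * np.pi * (step_times - nearest) / period
    return np.abs(np.mean(np.exp(1j * phase)))

beats = np.arange(0.0, 20.0, 0.6)              # 100 beats-per-minute cue track
steps = beats + np.random.default_rng(2).normal(0.0, 0.03, beats.size)
print(f"synchronization index: {synchronization_index(steps, beats):.2f}")
```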

  13. Interactions between working memory and visual perception: An ERP/EEG study

    PubMed Central

    Agam, Yigal; Sekuler, Robert

    2007-01-01

    How do working memory and perception interact with each other? Recent theories of working memory suggest that they are closely linked, and in fact share certain brain mechanisms. We used a sequential motion imitation task in combination with EEG and ERP techniques for a direct, online examination of memory load’s influence on the processing of visual stimuli. Using a paradigm in which subjects tried to reproduce random motion sequences from memory, we found a systematic decrease in ERP amplitude with each additional motion segment that was viewed and memorized for later imitation. High-frequency (>20 Hz) oscillatory activity exhibited a similar position-dependent decrease. When trials were sorted according to the accuracy of subsequent imitation, the amplitude of the ERPs during stimulus presentation correlated with behavioral performance: The larger the amplitude, the more accurate the subsequent imitation. These findings imply that visual processing of sequential stimuli is not uniform. Rather, earlier information elicits stronger neural activity. We discuss possible explanations for this observation, among them competition for attention between memory and perception and encoding of serial order by means of differential activation strengths. PMID:17512216

  14. [The impact of psilocybin on visual perception and spatial orientation--neuropsychological approach].

    PubMed

    Jastrzebski, Mikolaj; Bala, Aleksandra

    2013-01-01

    Psilocybin is a substance of natural origin that occurs in hallucinogenic mushrooms (most commonly in the genus Psilocybe). After its synthesis in 1958, research began on its psychoactive properties, particularly its strong effects on visual perception and spatial orientation. Because of the very broad spectrum of psilocybin's effects, research has also addressed its different ranges of action, including effects on physiological processes (such as saccadic eye movements). Neuroimaging and neurophysiological research (positron emission tomography, PET, and electroencephalography, EEG) indicates changes in the rate of brain metabolism and desynchronization between the cerebral hemispheres. Experimental studies show psilocybin-induced changes in visual perception and distortions in the handwriting of examined subjects. Subjective experiences reported by the subjects are widely described. There are also efforts to administer questionnaire-based testing to people under the influence of psilocybin, in the context of the similarity of the psilocybin-induced state to the initial stages of schizophrenia, as well as research aimed at creating an 'artificial' model of the disease. PMID:25007546

  15. Does restriction of pitch variation affect the perception of vocal emotions in Mandarin Chinese?

    PubMed

    Wang, Ting; Lee, Yong-Cheol

    2015-01-01

    This study reports a finding about vocal expressions of emotion in Mandarin Chinese. Production and perception experiments used same-tone and mixed-tone sequences to test whether pitch variation is restricted by the presence of lexical tones. Results showed that the restriction of pitch variation occurred in sequences consisting entirely of the high level tone (the tone 1 group) for the expression of happiness, but not for the other, dynamic tone groups. However, the perception analysis revealed that all emotions in every tone group received high identification rates; this indicates that listeners used cues other than pitch variation to identify happiness in the tone 1 group. This study demonstrates that the restriction of pitch variation does not affect the perception of vocal emotions. PMID:25618091

  16. Construction and evaluation of an integrated dynamical model of visual motion perception.

    PubMed

    Tlapale, Émilien; Dosher, Barbara Anne; Lu, Zhong-Lin

    2015-07-01

    Although numerous models describe the individual neural mechanisms that may be involved in the perception of visual motion, few of them have been constructed to take arbitrary stimuli and map them to a motion percept. Here, we propose an integrated dynamical motion model (IDM), which is sufficiently general to handle diverse moving stimuli, yet sufficiently precise to account for a wide-ranging set of empirical observations made on a family of random dot kinematograms. In particular, we constructed models of the cortical areas involved in motion detection, motion integration and perceptual decision. We analyzed their parameters through dynamical simulations and numerical continuation to constrain their proper ranges. Then, empirical data from a family of random-dot kinematogram experiments with systematically varying direction distribution, presentation duration, and stimulus size were used to evaluate our model and estimate the corresponding model parameters. The resulting model provides an excellent account of a demanding set of parametrically varied behavioral effects on motion perception, providing both quantitative and qualitative elements of evaluation. PMID:25897511

  17. Construction and evaluation of an integrated dynamical model of visual motion perception

    PubMed Central

    Dosher, Barbara Anne; Lu, Zhong-Lin

    2015-01-01

    Although numerous models describe the individual neural mechanisms that may be involved in the perception of visual motion, few of them have been constructed to take arbitrary stimuli and map them to a motion percept. Here, we propose an integrated dynamical motion model (IDM), which is sufficiently general to handle diverse moving stimuli, yet sufficiently precise to account for a wide-ranging set of empirical observations made on a family of random dot kinematograms. In particular, we constructed models of the cortical areas involved in motion detection, motion integration and perceptual decision. We analyzed their parameters through dynamical simulations and numerical continuation to constrain their proper ranges. Then, empirical data from a family of random-dot kinematogram experiments with systematically varying direction distribution, presentation duration, and stimulus size were used to evaluate our model and estimate the corresponding model parameters. The resulting model provides an excellent account of a demanding set of parametrically varied behavioral effects on motion perception, providing both quantitative and qualitative elements of evaluation. PMID:25897511

  18. Developing effective serious games: the effect of background sound on visual fidelity perception with varying texture resolution.

    PubMed

    Rojas, David; Kapralos, Bill; Cristancho, Sayra; Collins, Karen; Hogue, Andrew; Conati, Cristina; Dubrowski, Adam

    2012-01-01

    Despite the benefits associated with virtual learning environments and serious games, there are open, fundamental issues regarding simulation fidelity and multi-modal cue interaction and their effect on immersion, transfer of knowledge, and retention. Here we describe the results of a study that examined the effect of ambient (background) sound on the perception of visual fidelity (defined with respect to texture resolution). Results suggest that the perception of visual fidelity is dependent on ambient sound and more specifically, white noise can have detrimental effects on our perception of high quality visuals. The results of this study will guide future studies that will ultimately aid in developing an understanding of the role that fidelity, and multi-modal interactions play with respect to knowledge transfer and retention for users of virtual simulations and serious games. PMID:22357023

  19. Making Time for Nature: Visual Exposure to Natural Environments Lengthens Subjective Time Perception and Reduces Impulsivity

    PubMed Central

    Berry, Meredith S.; Repke, Meredith A.; Nickerson, Norma P.; Conway, Lucian G.; Odum, Amy L.; Jordan, Kerry E.

    2015-01-01

    Impulsivity in delay discounting is associated with maladaptive behaviors such as overeating and drug and alcohol abuse. Researchers have recently noted that delay discounting, even when measured by a brief laboratory task, may be the best predictor of human health-related behaviors (e.g., exercise) currently available. Identifying techniques to decrease impulsivity in delay discounting, therefore, could help improve decision-making on a global scale. Visual exposure to natural environments is one recent approach shown to decrease impulsive decision-making in a delay discounting task, although the mechanism driving this result is currently unknown. The present experiment was thus designed to evaluate not only whether visual exposure to natural (mountains, lakes) relative to built (buildings, cities) environments resulted in less impulsivity, but also whether this exposure influenced time perception. Participants were randomly assigned to either a natural environment condition or a built environment condition. Participants viewed photographs of either natural scenes or built scenes before and during a delay discounting task in which they made choices about receiving immediate or delayed hypothetical monetary outcomes. Participants also completed an interval bisection task in which natural or built stimuli were judged as having relatively longer or shorter presentation durations. Following the delay discounting and interval bisection tasks, additional measures of time perception were administered, including how many minutes participants thought had passed during the session and a scale measurement of whether time "flew" or "dragged" during the session. Participants exposed to natural as opposed to built scenes were less impulsive and also reported longer subjective session times, although no differences across groups were revealed with the interval bisection task. These results are the first to suggest that decreased impulsivity from exposure to natural as opposed to built

  20. Visual perception can account for the close relation between numerosity processing and computational fluency

    PubMed Central

    Zhou, Xinlin; Wei, Wei; Zhang, Yiyun; Cui, Jiaxin; Chen, Chuansheng

    2015-01-01

    Studies have shown that numerosity processing (e.g., comparison of numbers of dots in two dot arrays) is significantly correlated with arithmetic performance. Researchers have attributed this association to the fact that both tasks share magnitude processing. The current investigation tested an alternative hypothesis, which states that visual perceptual ability (as measured by a figure-matching task) can account for the close relation between numerosity processing and arithmetic performance (computational fluency). Four hundred and twenty-four third- to fifth-grade children (220 boys and 204 girls, 8.0–11.0 years old; 120 third graders, 146 fourth graders, and 158 fifth graders) were recruited from two schools (one urban and one suburban) in Beijing, China. Six classes were randomly selected from each school, and all students in each selected class participated in the study. All children were given a series of cognitive and mathematical tests, including numerosity comparison, figure matching, forward verbal working memory, visual tracing, non-verbal matrix reasoning, mental rotation, choice reaction time, arithmetic tests, and a curriculum-based mathematical achievement test. Results showed that figure-matching ability had higher correlations with numerosity processing and computational fluency than did other cognitive factors (e.g., forward verbal working memory, visual tracing, non-verbal matrix reasoning, mental rotation, and choice reaction time). More importantly, hierarchical multiple regression showed that figure-matching ability accounted for the well-established association between numerosity processing and computational fluency. In support of the visual perception hypothesis, the results suggest that visual perceptual ability, rather than magnitude processing, may be the shared component of numerosity processing and arithmetic performance. PMID:26441740
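
    A minimal sketch (not the authors' code) of the hierarchical-regression logic described above: enter numerosity comparison first, then add figure matching and check whether the numerosity term weakens while figure matching remains significant. The column names, the grade covariate, and the use of plain OLS via statsmodels are illustrative assumptions.

      # Hypothetical illustration of the hierarchical regression described above;
      # column names (fluency, numerosity, figure_match, grade) are assumptions.
      import pandas as pd
      import statsmodels.formula.api as smf

      def hierarchical_steps(df: pd.DataFrame):
          # Step 1: numerosity comparison predicting computational fluency.
          step1 = smf.ols("fluency ~ numerosity + grade", data=df).fit()
          # Step 2: add figure matching; if the numerosity coefficient shrinks toward
          # zero while figure_match stays significant, figure matching accounts for
          # the numerosity-fluency association.
          step2 = smf.ols("fluency ~ numerosity + figure_match + grade", data=df).fit()
          return step1, step2

      # Example usage with a hypothetical data file:
      # step1, step2 = hierarchical_steps(pd.read_csv("children_scores.csv"))
      # print(step1.params["numerosity"], step2.params["numerosity"])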

  1. Neural Associations of the Early Retinotopic Cortex with the Lateral Occipital Complex during Visual Perception

    PubMed Central

    Liang, Bishan; Liu, Bo; Liu, Ming; Huang, Ruiwang

    2014-01-01

    Previous studies have demonstrated that the early retinotopic cortex (ERC, i.e., V1/V2/V3) is highly associated with the lateral occipital complex (LOC) during visual perception. However, it remains largely unclear how to evaluate their associations in a quantitative way. The present study applied a multivariate pattern analysis (MVPA) to quantify the neural activity in the ERC and its association with that of the LOC when participants viewed visual images. To this end, we assessed whether low-level visual features (Gabor features) could predict the neural activity in the ERC and LOC according to a voxel-based encoding model (VBEM), and then quantified the association of the neural activity between these regions by using an analogous VBEM. We found that the Gabor features remarkably predicted the activity of the ERC (e.g., the prediction accuracy was 52.5% for one participant) but not that of the LOC (4.2%). Moreover, the MVPA approach can also be used to establish corresponding relationships between the activity patterns in the LOC and those in the ERC (64.2%). In particular, we found that the integration of the Gabor features and LOC visual information could dramatically improve the ‘prediction’ of ERC activity (88.3%). Overall, the present study provides new evidence for the possibility of quantifying the association of the neural activity between the ERC and the LOC. This approach will help to provide further insights into the neural substrates of visual processing. PMID:25251083
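
    The voxel-based encoding model referred to above is not specified in detail in this record; the following is a generic sketch, under the assumption of a regularized linear mapping, of how per-image Gabor features might be mapped to voxel responses and how prediction accuracy on held-out images could be scored. The array shapes, the ridge penalty, and the correlation-based score are assumptions for illustration.

      # Generic voxel-based encoding sketch (assumptions noted above):
      # a ridge regression maps per-image Gabor features to per-image voxel responses.
      import numpy as np
      from sklearn.linear_model import Ridge

      def fit_encoding_model(gabor_train, voxels_train, alpha=1.0):
          """gabor_train: (n_images, n_features); voxels_train: (n_images, n_voxels)."""
          return Ridge(alpha=alpha).fit(gabor_train, voxels_train)

      def voxelwise_prediction_accuracy(model, gabor_test, voxels_test):
          # Pearson correlation between predicted and measured responses, per voxel.
          pred = model.predict(gabor_test)
          pc = pred - pred.mean(axis=0)
          mc = voxels_test - voxels_test.mean(axis=0)
          return (pc * mc).sum(axis=0) / np.sqrt((pc ** 2).sum(axis=0) * (mc ** 2).sum(axis=0))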

  2. Errors in Moral Forecasting: Perceptions of Affect Shape the Gap Between Moral Behaviors and Moral Forecasts.

    PubMed

    Teper, Rimma; Tullett, Alexa M; Page-Gould, Elizabeth; Inzlicht, Michael

    2015-07-01

    Research in moral decision making has shown that there may not be a one-to-one relationship between people's moral forecasts and behaviors. Although past work suggests that physiological arousal may account for part of the behavior-forecasting discrepancy, whether perceptions of affect are an important determinant remains unclear. Here, we investigate whether this discrepancy may arise because people fail to anticipate how they will feel in morally significant situations. In Study 1, forecasters predicted cheating significantly more on a test than participants in a behavior condition actually cheated. Importantly, forecasters who received false somatic feedback, indicative of high arousal, produced forecasts that aligned more closely with behaviors. In Study 2, forecasters who misattributed their arousal to an extraneous source forecasted cheating significantly more. In Study 3, higher dispositional emotional awareness was related to less forecasted cheating. These findings suggest that perceptions of affect play a key role in the behavior-forecasting dissociation. PMID:25900823

  3. Age Differences in Visual-Auditory Self-Motion Perception during a Simulated Driving Task.

    PubMed

    Ramkhalawansingh, Robert; Keshavarz, Behrang; Haycock, Bruce; Shahab, Saba; Campos, Jennifer L

    2016-01-01

    Recent evidence suggests that visual-auditory cue integration may change as a function of age such that integration is heightened among older adults. Our goal was to determine whether these changes in multisensory integration are also observed in the context of self-motion perception under realistic task constraints. Thus, we developed a simulated driving paradigm in which we provided older and younger adults with visual motion cues (i.e., optic flow) and systematically manipulated the presence or absence of congruent auditory cues to self-motion (i.e., engine, tire, and wind sounds). Results demonstrated that the presence or absence of congruent auditory input had different effects on older and younger adults. Both age groups demonstrated a reduction in speed variability when auditory cues were present compared to when they were absent, but older adults demonstrated a proportionally greater reduction in speed variability under combined sensory conditions. These results are consistent with evidence indicating that multisensory integration is heightened in older adults. Importantly, this study is the first to provide evidence to suggest that age differences in multisensory integration may generalize from simple stimulus detection tasks to the integration of the more complex and dynamic visual and auditory cues that are experienced during self-motion. PMID:27199829

  4. Visual Perception of Habitats Adopted for Post-Mining Landscape Rehabilitation

    NASA Astrophysics Data System (ADS)

    Sklenicka, Petr; Molnarova, Kristina

    2010-09-01

    The study presented here focuses on visual preferences expressed by respondents for five relatively natural habitat types used in land reclamation projects in the North-West Bohemian brown coal basins (Czech Republic). Respondents evaluated the perceived beauty of the habitat types using a photograph questionnaire, on the basis of the positively skewed 6-point Likert scale. The order of the habitat types, from most beautiful to least beautiful, was: managed coniferous forest, wild deciduous forest, managed deciduous forest, managed mixed forest, and managed grassland. Higher visual preferences were indicated for older forest habitats (30-40 years old) than for younger habitats (10-20 years old). In addition, respondents preferred wild deciduous forest to managed deciduous forest. Managed grasslands and non-native managed coniferous forests were preferred by older people with a lower level of education and low income living in the post-mining area. On the other hand, native, wild deciduous forest was awarded the highest perceived beauty score by younger, more educated respondents with higher income, living outside the post-mining landscapes. The study confirms differences in the perception of various forms of land reclamation by residents vs. non-residents, and its findings also confirm the need for sociological research in post-mining landscapes within the process of designing rehabilitated landscapes. From the visual standpoint, the results of our study also support the current trend toward using natural succession in the reclamation of post-mining landscapes.

  5. Visual perception of habitats adopted for post-mining landscape rehabilitation.

    PubMed

    Sklenicka, Petr; Molnarova, Kristina

    2010-09-01

    The study presented here focuses on visual preferences expressed by respondents for five relatively natural habitat types used in land reclamation projects in the North-West Bohemian brown coal basins (Czech Republic). Respondents evaluated the perceived beauty of the habitat types using a photograph questionnaire, on the basis of the positively skewed 6-point Likert scale. The order of the habitat types, from most beautiful to least beautiful, was: managed coniferous forest, wild deciduous forest, managed deciduous forest, managed mixed forest, and managed grassland. Higher visual preferences were indicated for older forest habitats (30-40 years old) than for younger habitats (10-20 years old). In addition, respondents preferred wild deciduous forest to managed deciduous forest. Managed grasslands and non-native managed coniferous forests were preferred by older people with a lower level of education and low income living in the post-mining area. On the other hand, native, wild deciduous forest was awarded the highest perceived beauty score by younger, more educated respondents with higher income, living outside the post-mining landscapes. The study confirms differences in the perception of various forms of land reclamation by residents vs. non-residents, and its findings also confirm the need for sociological research in post-mining landscapes within the process of designing rehabilitated landscapes. From the visual standpoint, the results of our study also support the current trend toward using natural succession in the reclamation of post-mining landscapes. PMID:20556383

  6. Age Differences in Visual-Auditory Self-Motion Perception during a Simulated Driving Task

    PubMed Central

    Ramkhalawansingh, Robert; Keshavarz, Behrang; Haycock, Bruce; Shahab, Saba; Campos, Jennifer L.

    2016-01-01

    Recent evidence suggests that visual-auditory cue integration may change as a function of age such that integration is heightened among older adults. Our goal was to determine whether these changes in multisensory integration are also observed in the context of self-motion perception under realistic task constraints. Thus, we developed a simulated driving paradigm in which we provided older and younger adults with visual motion cues (i.e., optic flow) and systematically manipulated the presence or absence of congruent auditory cues to self-motion (i.e., engine, tire, and wind sounds). Results demonstrated that the presence or absence of congruent auditory input had different effects on older and younger adults. Both age groups demonstrated a reduction in speed variability when auditory cues were present compared to when they were absent, but older adults demonstrated a proportionally greater reduction in speed variability under combined sensory conditions. These results are consistent with evidence indicating that multisensory integration is heightened in older adults. Importantly, this study is the first to provide evidence to suggest that age differences in multisensory integration may generalize from simple stimulus detection tasks to the integration of the more complex and dynamic visual and auditory cues that are experienced during self-motion. PMID:27199829

  7. Integration of facial and newly learned visual cues in speech perception.

    PubMed

    Massaro, Dom; Cohen, Michael M; Meyer, Heidi; Stribling, Tracy; Sterling, Cass; Vanderhyden, Sam

    2011-01-01

    We are developing technology to translate acoustic characteristics of speech into visual cues that can be used to supplement speechreading when hearing is limited. Research and theory have established that perceivers are influenced by multiple sources of sensory and contextual information in spoken language processing. Previous research has also shown that additional sources of information can be learned and used to supplement those that are normally available but have been degraded by sensory impairment or difficult environments. We tested whether people can combine or integrate information from the face and information from newly learned cues in an optimal manner. Subjects first learned the visual cues and then were tested under three conditions. Words were presented with just the face, just the visual cues, or both together. Performance was much better with both cues than with either one alone. Similar to the description of previous results with audible and visible speech, the present results were well described by the Fuzzy Logical Model of Perception (Massaro, 1998), which predicts optimal or maximally efficient integration. PMID:21977695
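
    The Fuzzy Logical Model of Perception mentioned above has a standard combination rule: each source assigns a degree of support to every response alternative, supports are multiplied across sources, and the products are normalized across alternatives. The sketch below applies that rule to made-up support values; the numbers are purely illustrative.

      import numpy as np

      def flmp(support):
          """support: (n_sources, n_alternatives) array of truth values in (0, 1)."""
          combined = np.prod(support, axis=0)   # multiplicative integration across sources
          return combined / combined.sum()      # relative goodness across alternatives

      # Two word alternatives, supported by the face and by the newly learned cue.
      print(flmp(np.array([[0.7, 0.3],          # support from the face
                           [0.6, 0.4]])))       # support from the learned visual cue
      # -> alternative 1 receives ~0.78, more than from either source alone.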

  8. Associations between visual perception accuracy and confidence in a dopaminergic manipulation study.

    PubMed

    Andreou, Christina; Bozikas, Vasilis P; Luedtke, Thies; Moritz, Steffen

    2015-01-01

    Delusions are defined as fixed erroneous beliefs that are based on misinterpretation of events or perception, and cannot be corrected by argumentation to the opposite. Cognitive theories of delusions regard this symptom as resulting from specific distorted thinking styles that lead to biased integration and interpretation of perceived stimuli (i.e., reasoning biases). In previous studies, we were able to show that one of these reasoning biases, overconfidence in errors, can be modulated by drugs that act on the dopamine system, a major neurotransmitter system implicated in the pathogenesis of delusions and other psychotic symptoms. Another processing domain suggested to involve the dopamine system and to be abnormal in psychotic disorders is sensory perception. The present study aimed to investigate whether (lower-order) sensory perception and (higher-order) overconfidence in errors are similarly affected by dopaminergic modulation in healthy subjects. Thirty-four healthy individuals were assessed upon administration of l-dopa, placebo, or haloperidol within a randomized, double-blind, cross-over design. Variables of interest were hits and false alarms in an illusory perception paradigm requiring speeded detection of pictures over a noisy background, and subjective confidence ratings for correct and incorrect responses. There was a significant linear increase of false alarm rates from haloperidol to placebo to l-dopa, whereas hit rates were not affected by dopaminergic manipulation. As hypothesized, confidence in error responses was significantly higher with l-dopa compared to placebo. Moreover, confidence in erroneous responses significantly correlated with false alarm rates. These findings suggest that overconfidence in errors and aberrant sensory processing might be both interdependent and related to dopaminergic transmission abnormalities in patients with psychosis. PMID:25932015

  9. Associations between visual perception accuracy and confidence in a dopaminergic manipulation study

    PubMed Central

    Andreou, Christina; Bozikas, Vasilis P.; Luedtke, Thies; Moritz, Steffen

    2015-01-01

    Delusions are defined as fixed erroneous beliefs that are based on misinterpretation of events or perception, and cannot be corrected by argumentation to the opposite. Cognitive theories of delusions regard this symptom as resulting from specific distorted thinking styles that lead to biased integration and interpretation of perceived stimuli (i.e., reasoning biases). In previous studies, we were able to show that one of these reasoning biases, overconfidence in errors, can be modulated by drugs that act on the dopamine system, a major neurotransmitter system implicated in the pathogenesis of delusions and other psychotic symptoms. Another processing domain suggested to involve the dopamine system and to be abnormal in psychotic disorders is sensory perception. The present study aimed to investigate whether (lower-order) sensory perception and (higher-order) overconfidence in errors are similarly affected by dopaminergic modulation in healthy subjects. Thirty-four healthy individuals were assessed upon administration of l-dopa, placebo, or haloperidol within a randomized, double-blind, cross-over design. Variables of interest were hits and false alarms in an illusory perception paradigm requiring speeded detection of pictures over a noisy background, and subjective confidence ratings for correct and incorrect responses. There was a significant linear increase of false alarm rates from haloperidol to placebo to l-dopa, whereas hit rates were not affected by dopaminergic manipulation. As hypothesized, confidence in error responses was significantly higher with l-dopa compared to placebo. Moreover, confidence in erroneous responses significantly correlated with false alarm rates. These findings suggest that overconfidence in errors and aberrant sensory processing might be both interdependent and related to dopaminergic transmission abnormalities in patients with psychosis. PMID:25932015

  10. N1 enhancement in synesthesia during visual and audio–visual perception in semantic cross-modal conflict situations: an ERP study

    PubMed Central

    Sinke, Christopher; Neufeld, Janina; Wiswede, Daniel; Emrich, Hinderk M.; Bleich, Stefan; Münte, Thomas F.; Szycik, Gregor R.

    2014-01-01

    Synesthesia entails a special kind of sensory perception, where stimulation in one sensory modality leads to an internally generated perceptual experience of another, not stimulated sensory modality. This phenomenon can be viewed as an abnormal multisensory integration process as here the synesthetic percept is aberrantly fused with the stimulated modality. Indeed, recent synesthesia research has focused on multimodal processing even outside of the specific synesthesia-inducing context and has revealed changed multimodal integration, thus suggesting perceptual alterations at a global level. Here, we focused on audio–visual processing in synesthesia using a semantic classification task in which animate and inanimate objects were presented either visually or audio–visually, in congruent and incongruent audio–visual combinations. Fourteen subjects with auditory-visual and/or grapheme-color synesthesia and 14 control subjects participated in the experiment. During presentation of the stimuli, event-related potentials were recorded from 32 electrodes. The analysis of reaction times and error rates revealed no group differences, with best performance for audio-visually congruent stimulation, indicating the well-known multimodal facilitation effect. We found enhanced amplitude of the N1 component over occipital electrode sites for synesthetes compared to controls. The differences occurred irrespective of the experimental condition and therefore suggest a global influence on early sensory processing in synesthetes. PMID:24523689

  11. Cultural Differences in Affect Intensity Perception in the Context of Advertising

    PubMed Central

    Pogosyan, Marianna; Engelmann, Jan B.

    2011-01-01

    Cultural differences in the perception of positive affect intensity within an advertising context were investigated among American, Japanese, and Russian participants. Participants were asked to rate the intensity of facial expressions of positive emotions, which displayed either subtle, low intensity, or salient, high intensity expressions of positive affect. In agreement with previous findings from cross-cultural psychological research, current results demonstrate both cross-cultural agreement and differences in the perception of positive affect intensity across the three cultures. Specifically, American participants perceived high arousal (HA) images as significantly less calm than participants from the other two cultures, while the Japanese participants perceived low arousal (LA) images as significantly more excited than participants from the other cultures. The underlying mechanisms of these cultural differences were further investigated through difference scores that probed for cultural differences in perception and categorization of positive emotions. Findings indicate that rating differences are due to (1) perceptual differences in the extent to which HA images were discriminated from LA images, and (2) categorization differences in the extent to which facial expressions were grouped into affect intensity categories. Specifically, American participants revealed significantly higher perceptual differentiation between arousal levels of facial expressions in high and intermediate intensity categories. Japanese participants, on the other hand, did not discriminate between high and low arousal affect categories to the same extent as did the American and Russian participants. These findings indicate the presence of cultural differences in underlying decoding mechanisms of facial expressions of positive affect intensity. Implications of these results for global advertising are discussed. PMID:22084635

  12. Implicit Processing of Visual Emotions Is Affected by Sound-Induced Affective States and Individual Affective Traits

    PubMed Central

    Quarto, Tiziana; Blasi, Giuseppe; Pallesen, Karen Johanne; Bertolino, Alessandro; Brattico, Elvira

    2014-01-01

    The ability to recognize emotions contained in facial expressions is affected by both affective traits and states and varies widely between individuals. While affective traits are stable in time, affective states can be regulated more rapidly by environmental stimuli, such as music, that indirectly modulate the brain state. Here, we tested whether a relaxing or irritating sound environment affects implicit processing of facial expressions. Moreover, we investigated whether and how individual traits of anxiety and emotional control interact with this process. Thirty-two healthy subjects performed an implicit emotion processing task (presented to subjects as a gender discrimination task) while the sound environment was defined either by a) a therapeutic music sequence (MusiCure), b) a noise sequence or c) silence. Individual changes in mood were sampled before and after the task by a computerized questionnaire. Additionally, emotional control and trait anxiety were assessed in a separate session by paper and pencil questionnaires. Results showed a better mood after the MusiCure condition compared with the other experimental conditions and faster responses to happy faces during MusiCure compared with angry faces during Noise. Moreover, individuals with higher trait anxiety were faster in performing the implicit emotion processing task during MusiCure compared with Silence. These findings suggest that sound-induced affective states are associated with differential responses to angry and happy emotional faces at an implicit stage of processing, and that a relaxing sound environment facilitates implicit emotional processing in anxious individuals. PMID:25072162

  13. Implicit processing of visual emotions is affected by sound-induced affective states and individual affective traits.

    PubMed

    Quarto, Tiziana; Blasi, Giuseppe; Pallesen, Karen Johanne; Bertolino, Alessandro; Brattico, Elvira

    2014-01-01

    The ability to recognize emotions contained in facial expressions is affected by both affective traits and states and varies widely between individuals. While affective traits are stable in time, affective states can be regulated more rapidly by environmental stimuli, such as music, that indirectly modulate the brain state. Here, we tested whether a relaxing or irritating sound environment affects implicit processing of facial expressions. Moreover, we investigated whether and how individual traits of anxiety and emotional control interact with this process. Thirty-two healthy subjects performed an implicit emotion processing task (presented to subjects as a gender discrimination task) while the sound environment was defined either by a) a therapeutic music sequence (MusiCure), b) a noise sequence or c) silence. Individual changes in mood were sampled before and after the task by a computerized questionnaire. Additionally, emotional control and trait anxiety were assessed in a separate session by paper and pencil questionnaires. Results showed a better mood after the MusiCure condition compared with the other experimental conditions and faster responses to happy faces during MusiCure compared with angry faces during Noise. Moreover, individuals with higher trait anxiety were faster in performing the implicit emotion processing task during MusiCure compared with Silence. These findings suggest that sound-induced affective states are associated with differential responses to angry and happy emotional faces at an implicit stage of processing, and that a relaxing sound environment facilitates implicit emotional processing in anxious individuals. PMID:25072162

  14. The effect of neurofeedback on a brain wave and visual perception in stroke: a randomized control trial.

    PubMed

    Cho, Hwi-Young; Kim, Kitae; Lee, Byounghee; Jung, Jinhwa

    2015-03-01

    [Purpose] This study investigated brain wave and visual perception changes in stroke subjects using neurofeedback (NFB) training. [Subjects] Twenty-seven stroke subjects were randomly allocated to the NFB group (n = 13) or the control (CON) group (n = 14). [Methods] Two expert therapists provided the NFB and CON groups with traditional rehabilitation therapy in 30 thirty-minute sessions over the course of 6 weeks. NFB training was provided only to the NFB group; the CON group received traditional rehabilitation therapy only. Before and after the 6-week intervention, a brain wave test and the motor-free visual perception test (MVPT) were performed. [Results] Both groups showed significant differences in their relative beta wave values and attention concentration quotients. Moreover, the NFB group showed a significant difference in MVPT visual discrimination, form constancy, visual memory, visual closure, spatial relation, raw score, and processing time. [Conclusion] This study demonstrated that NFB training is more effective than traditional rehabilitation alone for improving concentration and visual perception. In further studies, detailed and diverse investigations should be performed considering the number and characteristics of subjects and the NFB training period. PMID:25931705

  15. Prenatal and acute cocaine exposure affects neural responses and habituation to visual stimuli

    PubMed Central

    Riley, Elizabeth; Kopotiyenko, Konstantin; Zhdanova, Irina

    2015-01-01

    Psychostimulants have many effects on visual function, ranging from adverse effects following acute and prenatal exposure to therapeutic effects on attention deficits. To determine the impact of prenatal and acute cocaine exposure on visual processing, we studied neuronal responses to visual stimuli in two brain regions of a transgenic larval zebrafish expressing the calcium indicator GCaMP-HS. We found that both red light flashes (LF) and dark flashes (DF) elicited similar responses in the optic tectum neuropil (TOn), while the dorsal telencephalon (dTe) responded only to LF. Acute cocaine (0.5 μM) reduced neuronal responses to LF in both brain regions but did not affect responses to DF. Repeated stimulus presentation (RSP) led to habituation of dTe neurons to LF. Acute cocaine prevented habituation. TOn habituated to DF, but not LF, and DF habituation was not modified by cocaine. Remarkably, prenatal cocaine exposure (PCE) prevented the effects of acute cocaine on LF response amplitude and habituation later in development in both brain regions, but did not affect DF responses. We discovered that, in spite of similar neural responses to LF and DF in the TO (superior colliculus in mammals), responses to LF are more complex, involving dTe (homologous to the cerebral cortex), and are more vulnerable to cocaine. Our results demonstrate that acute cocaine exposure affects visual processing differentially by brain region, and that PCE modifies zebrafish visual processing in multiple structures in a stimulus-dependent manner. These findings are in accordance with the major role that the optic tectum and cerebral cortex play in sustaining visual attention, and support the hypothesis that modification of these areas by PCE may be responsible for visual deficits noted in humans. This model offers new methodological approaches for studying the adverse and therapeutic effects of psychostimulants on attention, and for the development of new pharmacological interventions. PMID:26379509

  16. Prenatal and acute cocaine exposure affects neural responses and habituation to visual stimuli.

    PubMed

    Riley, Elizabeth; Kopotiyenko, Konstantin; Zhdanova, Irina

    2015-01-01

    Psychostimulants have many effects on visual function, ranging from adverse effects following acute and prenatal exposure to therapeutic effects on attention deficits. To determine the impact of prenatal and acute cocaine exposure on visual processing, we studied neuronal responses to visual stimuli in two brain regions of a transgenic larval zebrafish expressing the calcium indicator GCaMP-HS. We found that both red light flashes (LF) and dark flashes (DF) elicited similar responses in the optic tectum neuropil (TOn), while the dorsal telencephalon (dTe) responded only to LF. Acute cocaine (0.5 μM) reduced neuronal responses to LF in both brain regions but did not affect responses to DF. Repeated stimulus presentation (RSP) led to habituation of dTe neurons to LF. Acute cocaine prevented habituation. TOn habituated to DF, but not LF, and DF habituation was not modified by cocaine. Remarkably, prenatal cocaine exposure (PCE) prevented the effects of acute cocaine on LF response amplitude and habituation later in development in both brain regions, but did not affect DF responses. We discovered that, in spite of similar neural responses to LF and DF in the TO (superior colliculus in mammals), responses to LF are more complex, involving dTe (homologous to the cerebral cortex), and are more vulnerable to cocaine. Our results demonstrate that acute cocaine exposure affects visual processing differentially by brain region, and that PCE modifies zebrafish visual processing in multiple structures in a stimulus-dependent manner. These findings are in accordance with the major role that the optic tectum and cerebral cortex play in sustaining visual attention, and support the hypothesis that modification of these areas by PCE may be responsible for visual deficits noted in humans. This model offers new methodological approaches for studying the adverse and therapeutic effects of psychostimulants on attention, and for the development of new pharmacological interventions. PMID:26379509

  17. READING YOUR OWN LIPS: COMMON CODING THEORY AND VISUAL SPEECH PERCEPTION

    PubMed Central

    Tye-Murray, Nancy; Spehar, Brent P.; Myerson, Joel; Hale, Sandra; Sommers, Mitchell S.

    2012-01-01

    Common coding theory posits: 1) perceiving an action activates the same representations of motor plans that are activated by actually performing that action; 2) because of individual differences in the way actions are performed, observing recordings of one’s own previous behavior activates motor plans to an even greater degree than observing someone else’s behavior. We hypothesized that if observing oneself activates motor plans to a greater degree than observing others, and these activated plans contribute to perception, then people should be able to lipread silent video clips of their own previous utterances more accurately than they can lipread video clips of other talkers. As predicted, two groups of participants were able to lipread video clips of themselves recorded more than two weeks earlier significantly more accurately than video clips of others. These results suggest that visual input activates speech motor activity that links to word representations in the mental lexicon. PMID:23132604

  18. A tone mapping operator based on neural and psychophysical models of visual perception

    NASA Astrophysics Data System (ADS)

    Cyriac, Praveen; Bertalmio, Marcelo; Kane, David; Vazquez-Corral, Javier

    2015-03-01

    High dynamic range imaging techniques involve capturing and storing real world radiance values that span many orders of magnitude. However, common display devices can usually reproduce intensity ranges of only two to three orders of magnitude. Therefore, in order to display a high dynamic range image on a low dynamic range screen, the dynamic range of the image needs to be compressed without losing details or introducing artefacts, a process called tone mapping. A good tone mapping operator must be able to produce a low dynamic range image that matches the perception of the real world scene as closely as possible. We propose a two-stage tone mapping approach, in which the first stage is a global method for range compression based on the gamma curve that best equalizes the lightness histogram, and the second stage performs local contrast enhancement and color induction using neural activity models for the visual cortex.
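
    As an illustration of the global first stage, the sketch below searches for a gamma on normalized log-luminance that makes the output lightness histogram as flat as possible, using histogram entropy as the flatness score. The entropy criterion, the log-luminance normalization, and the gamma grid are assumptions made for this sketch, not the operator proposed in the paper, and the local second stage is omitted.

      import numpy as np

      def histogram_entropy(values, bins=256):
          # Entropy of the value histogram; maximal when the histogram is uniform.
          hist, _ = np.histogram(values, bins=bins, range=(0.0, 1.0))
          p = hist[hist > 0].astype(np.float64)
          p = p / p.sum()
          return float(-(p * np.log(p)).sum())

      def global_gamma_stage(luminance_hdr, gammas=np.linspace(0.1, 1.0, 46)):
          # Normalize log-luminance to [0, 1], then pick the gamma whose output
          # histogram is closest to uniform (maximal entropy).
          log_l = np.log1p(np.asarray(luminance_hdr, dtype=np.float64))
          norm = (log_l - log_l.min()) / (log_l.max() - log_l.min() + 1e-12)
          best_gamma = max(gammas, key=lambda g: histogram_entropy(norm ** g))
          return norm ** best_gamma, best_gamma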

  19. The phase of pre-stimulus alpha oscillations influences the visual perception of stimulus timing.

    PubMed

    Milton, Alex; Pleydell-Pearce, Christopher W

    2016-06-01

    This study examined the influence of pre-stimulus alpha phase and attention on whether two visual stimuli occurring closely in time were perceived as simultaneous or asynchronous. The results demonstrated that certain phases of alpha in the period immediately preceding stimulus onset were associated with a higher proportion of stimuli judged to be asynchronous. Furthermore, this effect was shown to occur independently of both visuo-spatial attention and alpha amplitude. The findings are compatible with proposals that alpha phase reflects cyclic shifts in neuronal excitability. Importantly, however, the results further suggest that fluctuations in neuronal excitability can create a periodicity in neuronal transfer that can have functional consequences that are decoupled from changes in alpha amplitude. This study therefore provides evidence that perceptual processes fluctuate periodically although it remains uncertain whether this implies the discrete temporal framing of perception. PMID:26924284
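
    One common way to extract the pre-stimulus alpha phase examined above is to band-pass the EEG around 8 to 12 Hz and take the angle of the analytic signal at stimulus onset. The sketch below assumes a single channel, a 500 Hz sampling rate, and a fourth-order Butterworth filter; none of these details come from the study itself.

      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      def prestimulus_alpha_phase(eeg, onset_sample, fs=500.0, band=(8.0, 12.0)):
          """eeg: 1-D array for one channel; onset_sample: index of stimulus onset."""
          low, high = band[0] / (fs / 2.0), band[1] / (fs / 2.0)
          b, a = butter(4, [low, high], btype="band")
          alpha = filtfilt(b, a, eeg)          # zero-phase band-pass around alpha
          phase = np.angle(hilbert(alpha))     # instantaneous phase in radians
          return phase[onset_sample]           # phase at the end of the pre-stimulus window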

  20. Reference Valence Effects of Affective S–R Compatibility: Are Visual and Auditory Results Consistent?

    PubMed Central

    Xiaojun, Zhao; Xuqun, You; Changxiu, Shi; Shuoqiu, Gan; Chaoyi, Hu

    2014-01-01

    Humans may be faster to avoid negative words than to approach negative words, and faster to approach positive words than to avoid positive words. That is an example of affective stimulus–response (S–R) compatibility. The present study identified the reference valence effects of affective stimulus–response (S–R) compatibility when auditory stimulus materials are used. The researchers explored the reference valence effects of affective S–R compatibility using a mixed-design experiment based on visual words, visual pictures and audition. The study computed the average compatibility effect size. A t-test based on visual pictures showed that the compatibility effect size was significantly different from zero, t(22) = 2.43, p < .05 (M = 485 ms). Smaller compatibility effects existed when switching the presentation mode from visual stimuli to auditory stimuli. This study serves as an important reference for the auditory reference valence effects of affective S–R compatibility. PMID:24743797

  1. Responsiveness to the Negative Affect System as a Function of Emotion Perception: Relations Between Affect and Sociability in Three Daily Diary Studies.

    PubMed

    Moeller, Sara K; Nicpon, Catherine G; Robinson, Michael D

    2014-04-30

    Perceiving emotions clearly and accurately is an important component of emotional intelligence (EI). This skill is thought to predict emotional and social outcomes, but evidence for this point appears somewhat underwhelming in cross-sectional designs. The present work adopted a more contextual approach to understanding the correlates of emotion perception. Because emotion perception involves awareness of affect as it occurs, people higher in this skill might reasonably be expected to be more attuned to variations in their affective states and be responsive to them for this reason. This novel hypothesis was pursued in three daily diary studies (total N = 247), which found systematic evidence for the idea that higher levels of daily negative affect predicted lesser sociability particularly, and somewhat exclusively, among people whose emotion perception skills were high rather than low. The results support a contextual understanding of individual differences in emotion perception and how they operate. PMID:24789808

  2. Responsiveness to the Negative Affect System as a Function of Emotion Perception: Relations between Affect and Sociability in Three Daily Diary Studies

    PubMed Central

    Moeller, Sara K.; Nicpon, Catherine G.; Robinson, Michael D.

    2014-01-01

    Perceiving emotions clearly and accurately is an important component of emotional intelligence. This skill is thought to predict emotional and social outcomes, but evidence for this point appears somewhat underwhelming in cross-sectional designs. The present work adopted a more contextual approach to understanding the correlates of emotion perception instead. Because emotion perception involves awareness of affect as it occurs, people higher in this skill might reasonably be expected to be more attuned to variations in their affective states and be responsive to them for this reason. This novel hypothesis was pursued in three daily diary studies (total N = 247), which found systematic evidence for the idea that higher levels of daily negative affect predicted lesser sociability particularly, and somewhat exclusively, among people whose emotion perception skills were high rather than low. The results support a contextual understanding of individual differences in emotion perception and how they operate. PMID:24789808

  3. Image and video compression/decompression based on human visual perception system and transform coding

    SciTech Connect

    Fu, Chi Yung; Petrich, L. I.; Lee, M.

    1997-02-01

    The quantity of information has been growing exponentially, and the form and mix of information have been shifting into the image and video areas. However, neither the storage media nor the available bandwidth can accommodate the vastly expanding requirements for image information. A vital, enabling technology here is compression/decompression. Our compression work is based on a combination of feature-based algorithms inspired by the human visual perception system (HVS) and some transform-based algorithms (such as our enhanced discrete cosine transform and wavelet transforms), vector quantization, and neural networks. All our work was done on desktop workstations using the C++ programming language and commercially available software. During FY 1996, we explored and implemented enhanced feature-based algorithms, vector quantization, and neural-network-based compression technologies. For example, we improved the feature compression for our feature-based algorithms by a factor of two to ten, a substantial improvement. We also found some promising results when using neural networks and applying them to some video sequences. In addition, we investigated objective measures to characterize compression results, because traditional means such as the peak signal-to-noise ratio (PSNR) are not adequate to fully characterize the results, since such measures do not take into account the details of human visual perception. We have successfully used our one-year LDRD funding as seed money to explore new research ideas and concepts; the results of this work have led us to obtain external funding from the dud. At this point, we are seeking matching funds from DOE to match the dud funding so that we can bring such technologies to fruition. 9 figs., 2 tabs.
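
    For reference, the peak signal-to-noise ratio criticized above as an incomplete proxy for perceived quality is computed as 10 log10(peak^2 / MSE). A minimal sketch, assuming 8-bit images with a peak value of 255:

      import numpy as np

      def psnr(original, reconstructed, peak=255.0):
          """Peak signal-to-noise ratio in dB between two equally sized images."""
          diff = original.astype(np.float64) - reconstructed.astype(np.float64)
          mse = np.mean(diff ** 2)
          if mse == 0.0:
              return float("inf")                # identical images
          return 10.0 * np.log10(peak ** 2 / mse)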

  4. The Relationship of Error and Correction of Error in Oral Reading to Visual-Form Perception and Word Attack Skills.

    ERIC Educational Resources Information Center

    Clayman, Deborah P. Goldweber

    The ability of 100 second-grade boys and girls to self-correct oral reading errors was studied in relationship to visual-form perception, phonic skills, response speed, and reading level. Each child was tested individually with the Bender-Error Test, the Gray Oral Paragraphs, and the Roswell-Chall Diagnostic Reading Test and placed into a group of…

  5. Basic to Applied Research: The Benefits of Audio-Visual Speech Perception Research in Teaching Foreign Languages

    ERIC Educational Resources Information Center

    Erdener, Dogu

    2016-01-01

    Traditionally, second language (L2) instruction has emphasised auditory-based instruction methods. However, this approach is restrictive in the sense that speech perception by humans is not just an auditory phenomenon but a multimodal one, and specifically, a visual one as well. In the past decade, experimental studies have shown that the…

  6. Teaching with Concrete and Abstract Visual Representations: Effects on Students' Problem Solving, Problem Representations, and Learning Perceptions

    ERIC Educational Resources Information Center

    Moreno, Roxana; Ozogul, Gamze; Reisslein, Martin

    2011-01-01

    In 3 experiments, we examined the effects of using concrete and/or abstract visual problem representations during instruction on students' problem-solving practice, near transfer, problem representations, and learning perceptions. In Experiments 1 and 2, novice students learned about electrical circuit analysis with an instructional program that…

  7. Contribution of the Visual Perception and Graphic Production Systems to the Copying of Complex Geometrical Drawings: A Developmental Study

    ERIC Educational Resources Information Center

    Bouaziz, Serge; Magnan, Annie

    2007-01-01

    The aim of this study was to determine the contribution of the visual perception and graphic production systems [Van Sommers, P. (1989). "A system for drawing and drawing-related neuropsychology." "Cognitive Neuropsychology," 6, 117-164] to the manifestation of the "Centripetal Execution Principle" (CEP), a graphic rule for the copying of drawings…

  8. Do Visual and Vestibular Inputs Compensate for Somatosensory Loss in the Perception of Spatial Orientation? Insights from a Deafferented Patient.

    PubMed

    Bringoux, Lionel; Scotto Di Cesare, Cécile; Borel, Liliane; Macaluso, Thomas; Sarlegna, Fabrice R

    2016-01-01

    The present study aimed at investigating the consequences of a massive loss of somatosensory inputs on the perception of spatial orientation. The occurrence of possible compensatory processes for external (i.e., object) orientation perception and self-orientation perception was examined by manipulating visual and/or vestibular cues. To that aim, we compared perceptual responses of a deafferented patient (GL) with respect to age-matched Controls in two tasks involving gravity-related judgments. In the first task, subjects had to align a visual rod with the gravitational vertical (i.e., Subjective Visual Vertical: SVV) when facing a tilted visual frame in a classic Rod-and-Frame Test. In the second task, subjects had to report whether they felt tilted when facing different visuo-postural conditions which consisted in very slow pitch tilts of the body and/or visual surroundings away from vertical. Results showed that, much more than Controls, the deafferented patient was fully dependent on spatial cues issued from the visual frame when judging the SVV. On the other hand, the deafferented patient did not rely at all on visual cues for self-tilt detection. Moreover, the patient never reported any sensation of tilt up to 18° contrary to Controls, hence showing that she did not rely on vestibular (i.e., otoliths) signals for the detection of very slow body tilts either. Overall, this study demonstrates that a massive somatosensory deficit substantially impairs the perception of spatial orientation, and that the use of the remaining sensory inputs available to a deafferented patient differs regarding whether the judgment concerns external vs. self-orientation. PMID:27199704

  9. Do Visual and Vestibular Inputs Compensate for Somatosensory Loss in the Perception of Spatial Orientation? Insights from a Deafferented Patient

    PubMed Central

    Bringoux, Lionel; Scotto Di Cesare, Cécile; Borel, Liliane; Macaluso, Thomas; Sarlegna, Fabrice R.

    2016-01-01

    The present study aimed at investigating the consequences of a massive loss of somatosensory inputs on the perception of spatial orientation. The occurrence of possible compensatory processes for external (i.e., object) orientation perception and self-orientation perception was examined by manipulating visual and/or vestibular cues. To that aim, we compared perceptual responses of a deafferented patient (GL) with respect to age-matched Controls in two tasks involving gravity-related judgments. In the first task, subjects had to align a visual rod with the gravitational vertical (i.e., Subjective Visual Vertical: SVV) when facing a tilted visual frame in a classic Rod-and-Frame Test. In the second task, subjects had to report whether they felt tilted when facing different visuo-postural conditions which consisted in very slow pitch tilts of the body and/or visual surroundings away from vertical. Results showed that, much more than Controls, the deafferented patient was fully dependent on spatial cues issued from the visual frame when judging the SVV. On the other hand, the deafferented patient did not rely at all on visual cues for self-tilt detection. Moreover, the patient never reported any sensation of tilt up to 18° contrary to Controls, hence showing that she did not rely on vestibular (i.e., otoliths) signals for the detection of very slow body tilts either. Overall, this study demonstrates that a massive somatosensory deficit substantially impairs the perception of spatial orientation, and that the use of the remaining sensory inputs available to a deafferented patient differs regarding whether the judgment concerns external vs. self-orientation. PMID:27199704

  10. Reducing magnocellular processing of various motion trajectories tests single process theories of visual position perception.

    PubMed

    Chappell, Mark; Potter, Zach; Hine, Trevor J; Mullen, Kathy T; Shand, James

    2013-01-01

    Spatial projection and temporal integration are two prominent theories of visual localization for moving stimuli which gain most of their explanatory power from a single process. Spatial projection theories posit that a moving stimulus' perceived position is projected forwards in order to compensate for processing delays (Eagleman & Sejnowski, 2007; Nijhawan, 2008). Temporal integration theories (Krekelberg & Lappe, 2000) suggest that an averaging over positions occupied by the moving stimulus for a period of time is the dominant process underlying perception of position. We found that when magnocellular (M) pathway processing was reduced, there were opposite effects on localization judgments when a smooth, continuous trajectory was used, compared to when the moving object suddenly appeared, or suddenly reversed direction. The flash-lag illusion was decreased for the continuous trajectory, but increased for the onset and reversal trajectories. This cross-over interaction necessitates processes additional to those proposed by either the spatial projection or temporal integration theories in order to explain the perception of the position of moving stimuli across all our conditions. Differentiating our onset trajectory conditions from a Fröhlich illusion, in a second experiment, we found a null Fröhlich illusion under normal luminance-defined conditions, significantly smaller than the corresponding flash-lag illusion, but significantly increased when M processing was reduced. Our data are most readily accounted for by Kirschfeld and Kammer's (1999) backward-inhibition and focal attention theory. PMID:23986536

  11. A neural basis for the spatial suppression of visual motion perception

    PubMed Central

    Liu, Liu D; Haefner, Ralf M; Pack, Christopher C

    2016-01-01

    In theory, sensory perception should be more accurate when more neurons contribute to the representation of a stimulus. However, psychophysical experiments that use larger stimuli to activate larger pools of neurons sometimes report impoverished perceptual performance. To determine the neural mechanisms underlying these paradoxical findings, we trained monkeys to discriminate the direction of motion of visual stimuli that varied in size across trials, while simultaneously recording from populations of motion-sensitive neurons in cortical area MT. We used the resulting data to constrain a computational model that explained the behavioral data as an interaction of three main mechanisms: noise correlations, which prevented stimulus information from growing with stimulus size; neural surround suppression, which decreased sensitivity for large stimuli; and a read-out strategy that emphasized neurons with receptive fields near the stimulus center. These results suggest that paradoxical percepts reflect tradeoffs between sensitivity and noise in neuronal populations. DOI: http://dx.doi.org/10.7554/eLife.16167.001 PMID:27228283
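
    As a purely illustrative aid (not the authors' fitted model), the toy simulation below shows how the three ingredients named in this abstract can jointly limit discrimination performance for large stimuli: shared (correlated) noise, surround suppression of neural gain, and a read-out that weights units by receptive-field proximity to the stimulus center. All functional forms and parameters are arbitrary assumptions.

```python
# Hypothetical sketch: direction discrimination from a suppressed, noisy population.
import numpy as np

rng = np.random.default_rng(0)
n_units, n_trials = 200, 5000
rf_pos = rng.uniform(-5, 5, n_units)        # receptive-field centers (deg)
weights = np.exp(-rf_pos**2 / 0.5)          # read-out emphasizes central RFs

def mean_response(direction, size):
    """Mean rate for motion direction (+1/-1) and stimulus size (deg)."""
    gain = 1.0 / (1.0 + 0.8 * size)         # surround suppression for large stimuli
    covered = np.abs(rf_pos) < size / 2     # units whose RF lies on the stimulus
    return covered * gain * (5.0 + 2.0 * direction)

def dprime(size, noise_corr=0.2):
    """Discriminability of the two directions from a weighted linear read-out."""
    dvs = []
    for direction in (+1, -1):
        shared = rng.normal(0, 1, (n_trials, 1))       # shared noise -> correlations
        indep = rng.normal(0, 1, (n_trials, n_units))  # independent noise
        resp = (mean_response(direction, size)
                + np.sqrt(noise_corr) * shared
                + np.sqrt(1 - noise_corr) * indep)
        dvs.append(resp @ weights)                     # weighted read-out per trial
    a, b = dvs
    return abs(a.mean() - b.mean()) / np.sqrt(0.5 * (a.var() + b.var()))

for size in (1, 4, 16):
    print(f"stimulus size {size:>2} deg -> simulated d' = {dprime(size):.2f}")
```

    With these arbitrary settings, simulated sensitivity declines as stimulus size grows, qualitatively echoing the paradoxical behavioral finding the study set out to explain.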

  12. First-person and third-person verbs in visual motion-perception regions.

    PubMed

    Papeo, Liuba; Lingnau, Angelika

    2015-02-01

    Verb-related activity is consistently found in the left posterior lateral temporal cortex (PLTC), which also encompasses regions that respond during visual-motion perception. Beyond motion, those regions appear sensitive to distinctions among the depicted entities, including that between first- and third-person ("third-person bias"). In two experiments using functional magnetic resonance imaging (fMRI), we studied whether the implied subject (first/third person) and/or the semantic content (motor/non-motor) of verbs modulates neural activity in the left PLTC regions responsive during basic- and biological-motion perception. In those sites, we found higher activity for verbs than for nouns. This activity was modulated by the person (but not the semantic content) of the verbs, with stronger responses to third- than to first-person verbs. The third-person bias elicited by verbs supports a role for motion-processing regions in encoding information about the entity beyond (and independently of) motion, and casts the role of these regions in verb processing in a new light. PMID:25594153

  13. A Qualitative Case Study of EFL Students' Affective Reactions to and Perceptions of Their Teachers' Written Feedback

    ERIC Educational Resources Information Center

    Mahfoodh, Omer Hassan A.; Pandian, Ambigapathy

    2011-01-01

    The present paper reports a qualitative case study investigating EFL students' affective reactions to and perceptions of their teachers' written feedback. The study also focuses on contextual factors that may influence students' reactions to and perceptions of their teachers' written feedback. Data were collected…

  14. Keeping an eye on the violinist: motor experts show superior timing consistency in a visual perception task

    PubMed Central

    Cañal-Bruland, Rouwen

    2010-01-01

    Common coding theory states that perception and action may reciprocally induce each other. Consequently, motor expertise should map onto perceptual consistency in specific tasks such as predicting the exact timing of a musical entry. To test this hypothesis, ten string musicians (motor experts), ten non-string musicians (visual experts), and ten non-musicians were asked to watch progressively occluded video recordings of a first violinist indicating entries to fellow members of a string quartet. Participants synchronised with the perceived timing of the musical entries. Results revealed significant effects of motor expertise on perception. Compared to visual experts and non-musicians, string players not only responded more accurately, but also with less timing variability. These findings provide evidence that motor experts’ consistency in movement execution—a key characteristic of expert motor performance—is mirrored in lower variability in perceptual judgements, indicating close links between action competence and perception. PMID:20300943

  15. Cerebral Visual Impairment: Which Perceptive Visual Dysfunctions Can Be Expected in Children with Brain Damage? A Systematic Review

    ERIC Educational Resources Information Center

    Boot, F. H.; Pel, J. J. M.; van der Steen, J.; Evenhuis, H. M.

    2010-01-01

    The current definition of Cerebral Visual Impairment (CVI) includes all visual dysfunctions caused by damage to, or malfunctioning of, the retrochiasmatic visual pathways in the absence of damage to the anterior visual pathways or any major ocular disease. CVI is diagnosed by exclusion and the existence of many different causes and symptoms make…

  16. Virtual lesion of right posterior superior temporal sulcus modulates conscious visual perception of fearful expressions in faces and bodies.

    PubMed

    Candidi, Matteo; Stienen, Bernard M C; Aglioti, Salvatore M; de Gelder, Beatrice

    2015-04-01

    The posterior Superior Temporal Sulcus (pSTS) represents a central hub in the complex cerebral network for person perception and emotion recognition, as also suggested by its heavy connections with face- and body-specific cortical structures (e.g., the fusiform face area, FFA, and the extrastriate body area, EBA) and subcortical structures (e.g., the amygdala). Information on whether pSTS is causally involved in sustaining conscious visual perception of emotions expressed by faces and bodies is lacking. We explored this issue by combining a binocular rivalry procedure (where emotional and neutral face and body postures rivaled with house images) with off-line, 1-Hz repetitive transcranial magnetic stimulation (rTMS). We found that temporary inhibition of the right pSTS reduced perceptual dominance of fearful faces and increased perceptual dominance of fearful bodies, while leaving the perception of neutral face and body images unaffected. Inhibition of the vertex had no effect on conscious visual perception of neutral or emotional face or body stimuli. Thus, the right pSTS plays a causal role in shortening conscious vision of fearful faces and in prolonging conscious vision of fearful bodies. These results suggest that pSTS selectively modulates the activity of segregated networks involved in the conscious visual perception of emotional faces or bodies. We speculate that the opposite roles of the right pSTS in the conscious perception of fearful faces and bodies may be explained by the different connections that this region entertains with face- and body-selective visual areas as well as with the amygdalae and premotor regions. PMID:25835522

  17. Can you see what you feel? Color and folding properties affect visual-tactile material discrimination of fabrics.

    PubMed

    Xiao, Bei; Bi, Wenyan; Jia, Xiaodan; Wei, Hanhan; Adelson, Edward H

    2016-01-01

    Humans can often estimate tactile properties of objects from vision alone. For example, during online shopping, we can often infer material properties of clothing from images and judge how the material would feel against our skin. What visual information is important for tactile perception? Previous studies in material perception have focused on measuring surface appearance, such as gloss and roughness, and using verbal reports of material attributes and categories. However, in real life, predicting tactile properties of an object might not require accurate verbal descriptions of its surface attributes or categories. In this paper, we use tactile perception as ground truth to measure visual material perception. Using fabrics as our stimuli, we measure how observers match what they see (photographs of fabric samples) with what they feel (physical fabric samples). The data shows that color has a significant main effect in that removing color significantly reduces accuracy, especially when the images contain 3-D folds. We also find that images of draped fabrics, which revealed 3-D shape information, achieved better matching accuracy than images with flattened fabrics. The data shows a strong interaction between color and folding conditions on matching accuracy, suggesting that, in 3-D folding conditions, the visual system takes advantage of chromatic gradients to infer tactile properties but not in flattened conditions. Together, using a visual-tactile matching task, we show that humans use folding and color information in matching the visual and tactile properties of fabrics. PMID:26913626

  18. Multisensory integration and the concert experience: An overview of how visual stimuli can affect what we hear

    NASA Astrophysics Data System (ADS)

    Hyde, Jerald R.

    2001-05-01

    It is clear to those who "listen" to concert halls and evaluate their degree of acoustical success that it is quite difficult to separate the acoustical response at a given seat from the multi-modal perception of the whole event. Objective concert hall data have been collected for the purpose of finding a link with their related subjective evaluation and ultimately with the architectural correlates which produce the sound field. This exercise, while important, tends to miss the point that a concert or opera event engages all the senses, of which the sound field and visual stimuli are both major contributors to the experience. Objective acoustical factors point to visual input as being significant in the perception of "acoustical intimacy" and in the perception of loudness versus distance in large halls. This paper will review the evidence of visual input as a factor in what we "hear" and introduce concepts of perceptual constancy, distance perception, static and dynamic visual stimuli, and the general process of the psychology of the integrated experience. A survey of acousticians on their opinions about the auditory-visual aspects of the concert hall experience will be presented. [Work supported in part by the Veneklasen Research Foundation and Veneklasen Associates.]

  19. Playing the electric light orchestra—how electrical stimulation of visual cortex elucidates the neural basis of perception

    PubMed Central

    Cicmil, Nela; Krug, Kristine

    2015-01-01

    Vision research has the potential to reveal fundamental mechanisms underlying sensory experience. Causal experimental approaches, such as electrical microstimulation, provide a unique opportunity to test the direct contributions of visual cortical neurons to perception and behaviour. But in spite of their importance, causal methods constitute a minority of the experiments used to investigate the visual cortex to date. We reconsider the function and organization of visual cortex according to results obtained from stimulation techniques, with a special emphasis on electrical stimulation of small groups of cells in awake subjects who can report their visual experience. We compare findings from humans and monkeys, striate and extrastriate cortex, and superficial versus deep cortical layers, and identify a number of revealing gaps in the 'causal map' of visual cortex. Integrating results from different methods and species, we provide a critical overview of the ways in which causal approaches have been used to further our understanding of circuitry, plasticity and information integration in visual cortex. Electrical stimulation not only elucidates the contributions of different visual areas to perception, but also contributes to our understanding of neuronal mechanisms underlying memory, attention and decision-making. PMID:26240421

  20. Musically induced arousal affects pain perception in females but not in males: a psychophysiological examination.

    PubMed

    Kenntner-Mabiala, Ramona; Gorges, Susanne; Alpers, Georg W; Lehmann, Andreas C; Pauli, Paul

    2007-04-01

    The present study investigated affective and physiological responses to changes of tempo and mode in classical music and their effects on heat pain perception. Thirty-eight healthy non-musicians (17 female) listened to sequences of 24 music stimuli which were variations of 4 pieces of classical music. Tempo (46, 60, and 95 beats/min) and mode (major and minor) were manipulated digitally; all other musical elements were held constant. Participants rated valence, arousal, happiness and sadness of the musical stimuli as well as the intensity and the unpleasantness of heat pain stimuli which were applied during music listening. Heart rate, respiratory rate and end-tidal PCO2 were recorded. Pain ratings were highest for the fastest tempo. The fastest tempo also increased participants' arousal ratings and accelerated their respiratory rate and heart rate. The modulation of pain perception by the tempo of music seems to be mediated by the listener's arousal. PMID:17118518

  1. Affordance-based perception-action dynamics: A model of visually guided braking.

    PubMed

    Harrison, Henry S; Turvey, Michael T; Frank, Till D

    2016-04-01

    Behavioral dynamics is a framework for understanding adaptive behavior as arising from the self-organizing interaction between animal and environment. The methods of nonlinear dynamics provide a language for describing behavior that is both stable and flexible. Behavioral dynamics has been criticized for ignoring the animal's sensitivity to its own capabilities, leading to the development of an alternative framework: affordance-based control. Although it is theoretically sound and empirically motivated, affordance-based control has resisted characterization in terms of nonlinear dynamics. Here, we provide a dynamical description of affordance-based control, extending behavioral dynamics to meet its criticisms. We propose a general modeling strategy consistent with both theories. We use visually guided braking as a representative behavior and construct a novel dynamical model. This model demonstrates the possibility of understanding visually guided action as respecting the limits of the actor's capabilities, while still being guided by informational variables associated with desired states of affairs. In addition to such "hard" constraints on behavior, our framework allows for the influence of "soft" constraints such as preference and comfort, opening a new area of inquiry in perception-action dynamics. PMID:26881694

  2. Hybrid fNIRS-EEG based classification of auditory and visual perception processes

    PubMed Central

    Putze, Felix; Hesslinger, Sebastian; Tse, Chun-Yu; Huang, YunYing; Herff, Christian; Guan, Cuntai; Schultz, Tanja

    2014-01-01

    For multimodal Human-Computer Interaction (HCI), it is very useful to identify the modalities on which the user is currently processing information. This would enable a system to select complementary output modalities to reduce the user's workload. In this paper, we develop a hybrid Brain-Computer Interface (BCI) which uses Electroencephalography (EEG) and functional Near Infrared Spectroscopy (fNIRS) to discriminate and detect visual and auditory stimulus processing. We describe the experimental setup we used for collecting our data corpus with 12 subjects. On these data, we performed a cross-validation evaluation and report accuracy for different classification conditions. The results show that the subject-dependent systems achieved a classification accuracy of 97.8% for discriminating visual and auditory perception processes from each other and a classification accuracy of up to 94.8% for detecting modality-specific processes independently of other cognitive activity. The same classification conditions could also be discriminated in a subject-independent fashion with accuracies of up to 94.6% and 86.7%, respectively. We also look at the contributions of the two signal types and show that the fusion of classifiers using different features significantly increases accuracy. PMID:25477777
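
    A minimal, hypothetical sketch of the general late-fusion idea described here: train separate classifiers on EEG and fNIRS feature sets, fuse them by averaging predicted class probabilities, and evaluate with cross-validation. The synthetic feature matrices, the choice of LDA, and the equal-weight fusion rule are all assumptions for illustration, not the authors' pipeline.

```python
# Illustrative late fusion of two feature streams with synthetic placeholder data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import StratifiedKFold

rng = np.random.default_rng(1)
n_trials = 120
y = rng.integers(0, 2, n_trials)                                # 0 = auditory, 1 = visual trial
X_eeg = rng.normal(0, 1, (n_trials, 32)) + y[:, None] * 0.4     # synthetic EEG features
X_fnirs = rng.normal(0, 1, (n_trials, 16)) + y[:, None] * 0.3   # synthetic fNIRS features

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
fused_acc = []
for train, test in cv.split(X_eeg, y):
    clf_eeg = LinearDiscriminantAnalysis().fit(X_eeg[train], y[train])
    clf_fnirs = LinearDiscriminantAnalysis().fit(X_fnirs[train], y[train])
    # Late fusion: average the two classifiers' class probabilities per trial.
    proba = 0.5 * clf_eeg.predict_proba(X_eeg[test]) + 0.5 * clf_fnirs.predict_proba(X_fnirs[test])
    fused_acc.append(np.mean(proba.argmax(axis=1) == y[test]))

print(f"fused cross-validated accuracy: {np.mean(fused_acc):.2f}")
```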

  3. Visual strategies for enhancing user perception of task relationships in emergency operations centers

    NASA Astrophysics Data System (ADS)

    Dudzic, Stephanie; Godwin, Alex; Kilgore, Ryan

    2010-04-01

    In time-sensitive environments, such as DHS emergency operations centers (EOCs), it is imperative for decision makers to rapidly understand and address key logical relationships that exist between tasks, entities, and events, even as conditions fluctuate. These relationships often have important temporal characteristics, such as tasks that must be completed before others can be started (e.g., buses must be transported to an area before an evacuation process can begin). Unfortunately, traditional temporal display methods, such as mission timelines, typically reveal only rudimentary event details and fail to support user understanding of and reasoning about critical temporal constraints and interrelationships across multiple mission components. To address these shortcomings, we developed a visual language to enhance temporal data displays by explicitly and intuitively conveying these constraints and relationships to decision makers. In this paper, we detail these design strategies and describe ongoing evaluation efforts to assess their usability and effectiveness to support decision-making tasks in complex, time-sensitive environments. We present a case study in which we applied our visual enhancements to a timeline display, improving the perception of logical relationships among events in a Master Scenario Event List (MSEL). These methods reduce the cognitive workload of decision makers and improve the efficacy of identification.

  4. Tuning self-motion perception in virtual reality with visual illusions.

    PubMed

    Bruder, Gerd; Steinicke, Frank; Wieland, Phil; Lappe, Markus

    2012-07-01

    Motion perception in immersive virtual environments significantly differs from the real world. For example, previous work has shown that users tend to underestimate travel distances in virtual environments (VEs). As a solution to this problem, researchers proposed to scale the mapped virtual camera motion relative to the tracked real-world movement of a user until real and virtual motion are perceived as equal, i.e., real-world movements could be mapped with a larger gain to the VE in order to compensate for the underestimation. However, introducing discrepancies between real and virtual motion can become a problem, in particular, due to misalignments of both worlds and distorted space cognition. In this paper, we describe a different approach that introduces apparent self-motion illusions by manipulating optic flow fields during movements in VEs. These manipulations can affect self-motion perception in VEs while avoiding a quantitative discrepancy between real and virtual motions. In particular, we consider to which regions of the virtual view these apparent self-motion illusions can be applied, i.e., the ground plane or peripheral vision. We introduce four illusions and show in experiments that optic flow manipulation can significantly affect users' self-motion judgments. Furthermore, we show that with such manipulations of optic flow fields the underestimation of travel distances can be compensated. PMID:22084144
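
    For contrast with the optic-flow approach, the snippet below illustrates the gain-based remapping that this abstract argues against: tracked real-world displacement is multiplied by a translation gain before it is applied to the virtual camera. The function name and gain value are hypothetical, chosen only to make the idea concrete.

```python
def virtual_camera_translation(real_displacement_m: float, translation_gain: float = 1.2) -> float:
    """Map a tracked real-world displacement to a virtual camera displacement.

    A gain > 1 compensates for the typical underestimation of travel distances
    in VEs; a gain of 1 is a one-to-one mapping.
    """
    return translation_gain * real_displacement_m

# Example: a 2 m real walk moves the virtual camera 2.4 m with gain 1.2.
print(virtual_camera_translation(2.0))
```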

  5. The influence of image valence on visual attention and perception of risk in drivers.

    PubMed

    Jones, M P; Chapman, P; Bailey, K

    2014-12-01

    Currently there is little research into the relationship between emotion and driving in the context of advertising and distraction. The research that has examined this issue also has methodological limitations that may themselves account for the results, rather than emotional processing (Trick et al., 2012). The current study investigated the relationship between image valence and risk perception, eye movements and physiological reactions. Participants watched hazard perception clips which had emotional images from the International Affective Picture System overlaid onto them. They rated how hazardous or safe they felt, whilst eye movements, galvanic skin response and heart rate were recorded. Results suggested that participants were more aware of potential hazards when a neutral image had been shown, in comparison to positively and negatively valenced images; that is, participants showed higher subjective ratings of risk, larger physiological responses and marginally longer fixation durations when viewing a hazard after a neutral image, but this effect was attenuated after emotional images. It appears that emotional images reduce sensitivity to potential hazards, and we suggest that future studies could apply these findings to higher-fidelity paradigms such as driving simulators. PMID:25265192

  6. Micro-valences: perceiving affective valence in everyday objects.

    PubMed

    Lebrecht, Sophie; Bar, Moshe; Barrett, Lisa Feldman; Tarr, Michael J

    2012-01-01

    Perceiving the affective valence of objects influences how we think about and react to the world around us. Conversely, the speed and quality with which we visually recognize objects in a visual scene can vary dramatically depending on that scene's affective content. Although typical visual scenes contain mostly "everyday" objects, affect perception for visual objects has been studied using somewhat atypical stimuli with strong affective valences (e.g., guns or roses). Here we explore whether affective valence must be strong or overt to exert an effect on our visual perception. We conclude that everyday objects carry subtle affective valences - "micro-valences" - which are intrinsic to their perceptual representation. PMID:22529828

  7. Perception of affective and linguistic prosody: an ALE meta-analysis of neuroimaging studies

    PubMed Central

    Brown, Steven

    2014-01-01

    Prosody refers to the melodic and rhythmic aspects of speech. Two forms of prosody are typically distinguished: ‘affective prosody’ refers to the expression of emotion in speech, whereas ‘linguistic prosody’ relates to the intonation of sentences, including the specification of focus within sentences and stress within polysyllabic words. While these two processes are united by their use of vocal pitch modulation, they are functionally distinct. In order to examine the localization and lateralization of speech prosody in the brain, we performed two voxel-based meta-analyses of neuroimaging studies of the perception of affective and linguistic prosody. There was substantial sharing of brain activations between analyses, particularly in right-hemisphere auditory areas. However, a major point of divergence was observed in the inferior frontal gyrus: affective prosody was more likely to activate Brodmann area 47, while linguistic prosody was more likely to activate the ventral part of area 44. PMID:23934416

  8. Shedding light on emotional perception: Interaction of brightness and semantic content in extrastriate visual cortex.

    PubMed

    Schettino, Antonio; Keil, Andreas; Porcu, Emanuele; Müller, Matthias M

    2016-06-01

    The rapid extraction of affective cues from the visual environment is crucial for flexible behavior. Previous studies have reported emotion-dependent amplitude modulations of two event-related potential (ERP) components - the N1 and EPN - reflecting sensory gain control mechanisms in extrastriate visual areas. However, it is unclear whether both components are selective electrophysiological markers of attentional orienting toward emotional material or are also influenced by physical features of the visual stimuli. To address this question, electrical brain activity was recorded from seventeen male participants while viewing original and bright versions of neutral and erotic pictures. Bright neutral scenes were rated as more pleasant compared to their original counterpart, whereas erotic scenes were judged more positively when presented in their original version. Classical and mass univariate ERP analysis showed larger N1 amplitude for original relative to bright erotic pictures, with no differences for original and bright neutral scenes. Conversely, the EPN was only modulated by picture content and not by brightness, substantiating the idea that this component is a unique electrophysiological marker of attention allocation toward emotional material. Complementary topographic analysis revealed the early selective expression of a centro-parietal positivity following the presentation of original erotic scenes only, reflecting the recruitment of neural networks associated with sustained attention and facilitated memory encoding for motivationally relevant material. Overall, these results indicate that neural networks subtending the extraction of emotional information are differentially recruited depending on low-level perceptual features, which ultimately influence affective evaluations. PMID:26994832

  9. Strengthening affective organizational commitment: the influence of fairness perceptions of management practices and underlying employee cynicism.

    PubMed

    English, Brian; Chalon, Christopher

    2011-01-01

    This study investigates the relationship between cynicism, the perceived fairness of change management and personnel practices, and affective organizational commitment. High levels of affective organizational commitment have been shown to reduce voluntary turnover in the nursing workforce. Previous research suggests that "unfair" management practices and employee cynicism lead to lower commitment. It is not clear, however, whether the perceived fairness of particular practices influences affective commitment beyond that accounted for by underlying employee cynicism. Data were obtained from a study involving 1104 registered nurses that formed part of a larger investigation of the general well-being of nurses in Western Australia. Only nurses who were permanent or employed on fixed term or temporary contracts were included. Findings indicated that although higher levels of cynicism among nurses were associated with lower levels of affective commitment, their perception of the fairness of change management and personnel practices influenced their affective commitment over and above their cynicism. The perceived fairness of management practices is an important influence on nurses' affective commitment beyond that accounted for by cynicism. The implication for managers is that the affective organizational commitment of nurses is likely to be strengthened by addressing the perceived fairness of change management and personnel practices notwithstanding their beliefs about the integrity of the organization. PMID:21248545

  10. Auditory, Visual, and Auditory-Visual Perception of Emotions by Individuals with Cochlear Implants, Hearing Aids, and Normal Hearing

    ERIC Educational Resources Information Center

    Most, Tova; Aviner, Chen

    2009-01-01

    This study evaluated the benefits of cochlear implant (CI) with regard to emotion perception of participants differing in their age of implantation, in comparison to hearing aid users and adolescents with normal hearing (NH). Emotion perception was examined by having the participants identify happiness, anger, surprise, sadness, fear, and disgust.…

  11. Attention enhances stimulus representations in macaque visual cortex without affecting their signal-to-noise level

    PubMed Central

    Daliri, Mohammad Reza; Kozyrev, Vladislav; Treue, Stefan

    2016-01-01

    The magnitude of the attentional modulation of neuronal responses in visual cortex varies with stimulus contrast. Whether the strength of these attentional influences is similarly dependent on other stimulus properties is unknown. Here we report the effect of spatial attention on responses in the middle temporal area (MT) of macaque visual cortex to moving random dot patterns of various motion coherences, i.e., signal-to-noise ratios. Our data show that allocating spatial attention causes a gain change in MT neurons. The magnitude of this attentional modulation is independent of the attended stimulus' motion coherence, creating a multiplicative scaling of the neuron's coherence-response function. This is consistent with the characteristics of gain models of attentional modulation and suggests that attention strengthens the neuronal representation of behaviorally relevant visual stimuli relative to unattended stimuli, but without affecting their signal-to-noise ratios. PMID:27283275
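
    A toy illustration (hypothetical numbers, not the recorded data) of what "multiplicative scaling of the coherence-response function" means: attention multiplies the response by the same factor at every coherence, leaving the shape of the function, and hence the signal-to-noise relationship, unchanged.

```python
# Purely multiplicative attentional gain applied to a hypothetical coherence-response function.
import numpy as np

coherence = np.linspace(0.0, 1.0, 6)     # motion coherence (signal-to-noise)
unattended = 10 + 30 * coherence         # hypothetical firing rates (spikes/s)
attended = 1.2 * unattended              # same 20% gain at every coherence

for c, r0, r1 in zip(coherence, unattended, attended):
    print(f"coherence {c:.1f}: unattended {r0:5.1f}, attended {r1:5.1f} spikes/s")
```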

  12. Asymmetric bias in perception of facial affect among Roman and Arabic script readers.

    PubMed

    Heath, Robin L; Rouhana, Aida; Ghanem, Dana Abi

    2005-01-01

    The asymmetric chimeric faces test is used frequently as an indicator of right hemisphere involvement in the perception of facial affect, as the test is considered free of linguistic elements. Much of the original research with the asymmetric chimeric faces test was conducted with subjects reading left-to-right Roman script, i.e., English. As readers of right-to-left scripts, such as Arabic, demonstrated a mixed or weak rightward bias in judgements of facial affect, the influence of habitual scanning direction was thought to intersect with laterality. We administered the asymmetric chimeric faces test to 1239 adults who represented a range of script experience, i.e., Roman script readers (English and French), Arabic readers, bidirectional readers of Roman and Arabic scripts, and illiterates. Our findings supported the hypothesis that the bias in facial affect judgement is rooted in laterality, but can be influenced by script direction. Specifically, right-handed readers of Roman script demonstrated the greatest mean leftward score, and mixed-handed Arabic script readers demonstrated the greatest mean rightward score. Biliterates showed a gradual shift in asymmetric perception, as their scores fell between those of Roman and Arabic script readers, basically distributed in the order expected by their handedness and most often used script. Illiterates, whose only directional influence was laterality, showed a slight leftward bias. PMID:15841823

  13. Unseen positive and negative affective information influences social perception in bipolar I disorder and healthy adults

    PubMed Central

    Siegel, Erika H.; Purcell, Amanda L.; Earls, Holly A.; Cooper, Gaia; Barrett, Lisa Feldman

    2016-01-01

    Bipolar disorder is fundamentally a disorder of emotion regulation, and associated with explicit processing biases for socially relevant emotional information in human faces. Less is known, however, about whether implicit processing of this type of emotional information directly influences social perception. We thus investigated group-related differences in the influence of unconscious emotional processing on conscious person perception judgments using a continuous flash suppression task among 22 individuals with remitted bipolar I disorder (BD; mean age = 30.82, SD = 7.04; 68.2% female) compared with 22 healthy adults (CTL; mean age = 20.86, SD = 9.91; 72.2% female). Across both groups, participants rated neutral faces as more trustworthy, warm, and competent when paired with unseen happy faces as compared to unseen angry and neutral faces; participants rated neutral faces as less trustworthy, warm, and competent when paired with unseen angry as compared to neutral faces. These findings suggest that emotion-related disturbances are not explained by early automatic processing stages, and that activity in the dorsal visual stream underlying implicit emotion processing is intact in bipolar disorder. Implications for understanding the etiology of emotion disturbance in BD are discussed. PMID:26745436

  14. Unseen positive and negative affective information influences social perception in bipolar I disorder and healthy adults.

    PubMed

    Gruber, June; Siegel, Erika H; Purcell, Amanda L; Earls, Holly A; Cooper, Gaia; Barrett, Lisa Feldman

    2016-03-01

    Bipolar disorder is fundamentally a disorder of emotion regulation, and associated with explicit processing biases for socially relevant emotional information in human faces. Less is known, however, about whether implicit processing of this type of emotional information directly influences social perception. We thus investigated group-related differences in the influence of unconscious emotional processing on conscious person perception judgments using a continuous flash suppression task among 22 individuals with remitted bipolar I disorder (BD; mean age = 30.82, SD = 7.04; 68.2% female) compared with 22 healthy adults (CTL; mean age = 20.86, SD = 9.91; 72.2% female). Across both groups, participants rated neutral faces as more trustworthy, warm, and competent when paired with unseen happy faces as compared to unseen angry and neutral faces; participants rated neutral faces as less trustworthy, warm, and competent when paired with unseen angry as compared to neutral faces. These findings suggest that emotion-related disturbances are not explained by early automatic processing stages, and that activity in the dorsal visual stream underlying implicit emotion processing is intact in bipolar disorder. Implications for understanding the etiology of emotion disturbance in BD are discussed. PMID:26745436

  15. Speaking rate affects the perception of duration as a suprasegmental lexical-stress cue.

    PubMed

    Reinisch, Eva; Jesse, Alexandra; McQueen, James M

    2011-06-01

    Three categorization experiments investigated whether the speaking rate of a preceding sentence influences durational cues to the perception of suprasegmental lexical-stress patterns. Dutch two-syllable word fragments had to be judged as coming from one of two longer words that matched the fragment segmentally but differed in lexical stress placement. Word pairs contrasted primary stress on either the first versus the second syllable or the first versus the third syllable. The duration of the initial or the second syllable of the fragments and the rate of the preceding context (fast vs. slow) were manipulated. Listeners used speaking rate to decide on the degree of stress on initial syllables, whether the syllables' absolute durations were informative about stress (Experiment 1a) or not (Experiment 1b). Rate effects on the second syllable were visible only when the initial syllable was ambiguous in duration with respect to the preceding rate context (Experiment 2). Absolute second-syllable durations contributed little to stress perception (Experiment 3). These results suggest that speaking rate is used to disambiguate words and that rate-modulated stress cues are more important on initial than noninitial syllables. Speaking rate affects the perception of suprasegmental information. PMID:21848077

  16. Effects of oculo-motor exercise, functional electrical stimulation and proprioceptive neuromuscular stimulation on visual perception of spatial neglect patients

    PubMed Central

    Park, Si-Eun; Oh, Dae-Sik; Moon, Sang-Hyun

    2016-01-01

    [Purpose] The purpose of this study was to identify the effects of oculo-motor exercise, functional electrical stimulation (FES), and proprioceptive neuromuscular facilitation (PNF) on the visual perception of spatial neglect patients. [Subjects and Methods] The subjects were randomly allocated to 3 groups: an oculo-motor exercise (OME) group, an FES with oculo-motor exercise (FOME) group, and a PNF with oculo-motor exercise (POME) group. The line bisection test (LBT), motor-free visual perception test (MVPT), and Catherine Bergego Scale (CBS) were used to measure visual perception. These were performed 5 times per week for 6 weeks. [Results] The OME group and POME group showed significant improvements according to the LBT and MVPT results, but the FOME group showed no significant improvement. According to the CBS, all 3 groups showed significant improvements. The OME and POME groups showed improvement over the FOME group in the LBT and MVPT. However, there was no significant difference among the three groups according to the CBS. [Conclusion] These results indicate that oculo-motor exercise and PNF with oculo-motor exercise had more positive effects than FES with oculo-motor exercise on the visual perception of spatial neglect patients. PMID:27190436

  17. Effects of oculo-motor exercise, functional electrical stimulation and proprioceptive neuromuscular stimulation on visual perception of spatial neglect patients.

    PubMed

    Park, Si-Eun; Oh, Dae-Sik; Moon, Sang-Hyun

    2016-04-01

    [Purpose] The purpose of this study was to identify the effects of oculo-motor exercise, functional electrical stimulation (FES), and proprioceptive neuromuscular facilitation (PNF) on the visual perception of spatial neglect patients. [Subjects and Methods] The subjects were randomly allocated to 3 groups: an oculo-motor exercise (OME) group, an FES with oculo-motor exercise (FOME) group, and a PNF with oculo-motor exercise (POME) group. The line bisection test (LBT), motor-free visual perception test (MVPT), and Catherine Bergego Scale (CBS) were used to measure visual perception. These were performed 5 times per week for 6 weeks. [Results] The OME group and POME group showed significant improvements according to the LBT and MVPT results, but the FOME group showed no significant improvement. According to the CBS, all 3 groups showed significant improvements. The OME and POME groups showed improvement over the FOME group in the LBT and MVPT. However, there was no significant difference among the three groups according to the CBS. [Conclusion] These results indicate that oculo-motor exercise and PNF with oculo-motor exercise had more positive effects than FES with oculo-motor exercise on the visual perception of spatial neglect patients. PMID:27190436

  18. Hand proximity differentially affects visual working memory for color and orientation in a binding task.

    PubMed

    Kelly, Shane P; Brockmole, James R

    2014-01-01

    Observers determined whether two sequentially presented arrays of six lines were the same or different. Differences, when present, involved either a swap in the color of two lines or a swap in the orientation of two lines. Thus, accurate change detection required the binding of color and orientation information for each line within visual working memory. Holding viewing distance constant, the proximity of the arrays to the hands was manipulated. Placing the hands near the to-be-remembered array decreased participants' ability to remember color information, but increased their ability to remember orientation information. This pair of results indicates that hand proximity differentially affects the processing of various types of visual information, a conclusion broadly consistent with functional and anatomical differences in the magnocellular and parvocellular pathways. It further indicates that hand proximity affects the likelihood that various object features will be encoded into integrated object files. PMID:24795671

  19. A Century of Gestalt Psychology in Visual Perception II. Conceptual and Theoretical Foundations

    PubMed Central

    Wagemans, Johan; Feldman, Jacob; Gepshtein, Sergei; Kimchi, Ruth; Pomerantz, James R.; van der Helm, Peter A.; van Leeuwen, Cees

    2012-01-01

    Our first review paper on the occasion of the centennial anniversary of Gestalt psychology focused on perceptual grouping and figure-ground organization. It concluded that further progress requires a reconsideration of the conceptual and theoretical foundations of the Gestalt approach, which is provided here. In particular, we review contemporary formulations of holism within an information-processing framework, allowing for operational definitions (e.g., integral dimensions, emergent features, configural superiority, global precedence, primacy of holistic/configural properties) and a refined understanding of its psychological implications (e.g., at the level of attention, perception, and decision). We also review four lines of theoretical progress regarding the law of Prägnanz—the brain's tendency to be attracted towards states corresponding to the simplest possible organization, given the available stimulation. The first considers the brain as a complex adaptive system and explains how self-organization solves the conundrum of trading between robustness and flexibility of perceptual states. The second specifies the economy principle in terms of optimization of neural resources, showing that elementary sensors working independently to minimize uncertainty can respond optimally at the system level. The third considers how Gestalt percepts (e.g., groups, objects) are optimal given the available stimulation, with optimality specified in Bayesian terms. Fourth, Structural Information Theory explains how a Gestaltist visual system that focuses on internal coding efficiency yields external veridicality as a side-effect. To answer the fundamental question of why things look as they do, a further synthesis of these complementary perspectives is required. PMID:22845750

  20. Emotion-perception interplay in the visual cortex: "the eyes follow the heart".

    PubMed

    Hendler, T; Rotshtein, P; Hadar, U

    2001-12-01

    Emotive aspects of stimuli have been shown to modulate perceptual thresholds. Lately, studies using functional Magnetic Resonance Imaging (fMRI) have shown that emotive aspects of visual stimuli activate not only canonical limbic regions, but also sensory areas in the cerebral cortex. However, it is still arguable to what extent such emotion-related activations in sensory areas of the cortex are affected by physical characteristics or attribute differences of the stimuli. To manipulate the valence of stimuli while keeping visual features largely unchanged, we took advantage of the Expressional Transfiguration (ET) of faces. In addition, to explore the sensitivity of high-level visual regions, we compared repeated with unrepeated (i.e., different) stimulus presentations (fMR adaptation). Thus, the dynamics of brain responses was determined according to the relative signal reduction during "repeated" relative to "different" presentations ("adaptation ratio"). Our results showed, for the first time, that emotional valence produced significant differences in fMR adaptation, but not in overall levels of activation of the lateral occipital complex (LOC). We then asked whether this emotional modulation of sensory cortex could be related to previous personal experience that attached negative attributes to stimuli. To clarify this, we investigated Posttraumatic Stress Disorder (PTSD) and non-PTSD veterans. PTSD is characterized by recurrent revival of trauma-related sensations. Such phenomena have been attributed to a disturbed processing of trauma-related stimuli, either at the perceptual level or at the cognitive level. We assumed that PTSD veterans would differ from non-PTSD veterans (who have similar combat experience) in their high-order visual cortex responses to combat-related visual stimuli that are associated with their traumatic experience. An fMRI study measured the cerebral activation of subjects while viewing pictures with and without combat content, in "repeated" or "different