Science.gov

Sample records for "affects visual perception"

  1. Losing One's Hand: Visual-Proprioceptive Conflict Affects Touch Perception

    PubMed Central

    Folegatti, Alessia; de Vignemont, Frédérique; Pavani, Francesco; Rossetti, Yves; Farnè, Alessandro

    2009-01-01

    Background: While the sense of bodily ownership has now been widely investigated through the rubber hand illusion (RHI), very little is known about the sense of disownership. It has been hypothesized that the RHI also affects the ownership feelings towards the participant's own hand, as if the rubber hand replaced the participant's actual hand. Somatosensory changes observed in the participants' hand while experiencing the RHI have been taken as evidence for disownership of their real hand. Here we propose a theoretical framework to disambiguate whether such somatosensory changes are to be ascribed to disownership of the real hand or rather to the anomalous visuo-proprioceptive conflict experienced by the participant during the RHI. Methodology/Principal Findings: In experiment 1, reaction times (RTs) to tactile stimuli delivered to the participants' hand slowed down following the establishment of the RHI. In experiment 2, the misalignment of visual and proprioceptive inputs was obtained via prismatic displacement, a situation in which ownership of the seen hand was not in doubt. This condition also slowed down the participants' tactile RTs. Thus, similar effects on touch perception emerged following the RHI and prismatic displacement. Both manipulations also induced a proprioceptive drift, toward the fake hand in the first experiment and toward the visual position of the participants' hand in the second experiment. Conclusions/Significance: These findings reveal that the somatosensory alterations in the experimental hand induced by the RHI arise from a cross-modal mismatch between the seen and felt position of the hand. As such, they are not necessarily a signature of disownership. PMID:19738900

  2. Contrast and strength of visual memory and imagery differentially affect visual perception.

    PubMed

    Saad, Elyana; Silvanto, Juha

    2013-01-01

    Visual short-term memory (VSTM) and visual imagery have been shown to modulate visual perception. However, how the subjective experience of VSTM/imagery and its contrast modulate this process has not been investigated. We addressed this issue by asking participants to detect brief masked targets while they were engaged either in VSTM or visual imagery. Subjective experience of memory/imagery (strength scale) and the visual contrast of the memory/mental image (contrast scale) were assessed on a trial-by-trial basis. For both VSTM and imagery, contrast of the memory/mental image was positively associated with reporting target presence. Consequently, at the sensory level, both VSTM and imagery facilitated visual perception. However, subjective strength of VSTM was positively associated with visual detection, whereas the opposite pattern was found for imagery. Thus the relationships between subjective strength of memory/imagery and visual detection are qualitatively different for VSTM and visual imagery, although their impact at the sensory level appears similar. Our results furthermore demonstrate that imagery and VSTM are partly dissociable processes.
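
    The trial-by-trial analysis described here relates a subjective rating to a binary detection report. A minimal sketch of that kind of analysis is given below; the simulated data, rating scale, and variable names are illustrative assumptions, not the study's materials.

      import numpy as np

      rng = np.random.default_rng(0)
      n_trials = 400
      # Hypothetical 1-5 contrast ratings of the memory/mental image on each trial.
      contrast_rating = rng.integers(1, 6, n_trials)
      # Toy positive association between rated contrast and reporting the target.
      p_report = 0.3 + 0.1 * (contrast_rating - 1)
      reported_present = rng.random(n_trials) < p_report

      # Proportion of "target present" reports at each rating level.
      for level in range(1, 6):
          mask = contrast_rating == level
          print(f"rating {level}: P(report present) = {reported_present[mask].mean():.2f}")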

  3. Conditions affecting beliefs about visual perception among children and adults.

    PubMed

    Winer, G A; Cottrell, J E; Karefilaki, K D; Chronister, M

    1996-03-01

    Children and adults were tested on their beliefs about whether visual processes involved intromissions (visual input) or extramissions (visual output) across a variety of situations. The idea that extramissions are part of the process of vision was first expressed by ancient philosophers, including Plato, Euclid, and Ptolemy, and has been shown to be evident in children and in some adults. The present research showed that when questions about vision referred to luminous as opposed to non-luminous objects, under certain conditions there was some increase in intromission beliefs, but almost no corresponding decline in extramission beliefs, and no evidence of transfer of intromission responses to questions referring to non-luminous objects. A separate study showed that college students, but not children, increased their extramission responses to questions providing a positive emotional context. The results are inconsistent with the idea that simple experiences increase or reinforce a coherent theory of vision. The results also have implications for understanding the nature of beliefs about scientific processes and for education. PMID:8812034

  5. Perception and Attention for Visualization

    ERIC Educational Resources Information Center

    Haroz, Steve

    2013-01-01

    This work examines how a better understanding of visual perception and attention can impact visualization design. In a collection of studies, I explore how different levels of the visual system can measurably affect a variety of visualization metrics. The results show that expert preference, user performance, and even computational performance are…

  6. Visual Imagery without Visual Perception?

    ERIC Educational Resources Information Center

    Bertolo, Helder

    2005-01-01

    The question regarding visual imagery and visual perception remains an open issue. Many studies have tried to understand whether the two processes share the same mechanisms or whether they are independent, using different neural substrates. Most research has been directed towards the need for activation of primary visual areas during imagery. Here we review…

  7. Non-conscious visual cues related to affect and action alter perception of effort and endurance performance

    PubMed Central

    Blanchfield, Anthony; Hardy, James; Marcora, Samuele

    2014-01-01

    The psychobiological model of endurance performance proposes that endurance performance is determined by a decision-making process based on perception of effort and potential motivation. Recent research has reported that effort-based decision-making during cognitive tasks can be altered by non-conscious visual cues relating to affect and action. The effects of these non-conscious visual cues on effort and performance during physical tasks are however unknown. We report two experiments investigating the effects of subliminal priming with visual cues related to affect and action on perception of effort and endurance performance. In Experiment 1 thirteen individuals were subliminally primed with happy or sad faces as they cycled to exhaustion in a counterbalanced and randomized crossover design. A paired t-test (happy vs. sad faces) revealed that individuals cycled significantly longer (178 s, p = 0.04) when subliminally primed with happy faces. A 2 × 5 (condition × iso-time) ANOVA also revealed a significant main effect of condition on rating of perceived exertion (RPE) during the time to exhaustion (TTE) test with lower RPE when subjects were subliminally primed with happy faces (p = 0.04). In Experiment 2, a single-subject randomization tests design found that subliminal priming with action words facilitated a significantly longer TTE (399 s, p = 0.04) in comparison to inaction words. Like Experiment 1, this greater TTE was accompanied by a significantly lower RPE (p = 0.03). These experiments are the first to show that subliminal visual cues relating to affect and action can alter perception of effort and endurance performance. Non-conscious visual cues may therefore influence the effort-based decision-making process that is proposed to determine endurance performance. Accordingly, the findings raise notable implications for individuals who may encounter such visual cues during endurance competitions, training, or health related exercise. PMID:25566014
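
    The core comparison in Experiment 1 is a within-subject (paired) contrast of time to exhaustion under two priming conditions. The sketch below shows the form of that analysis on hypothetical TTE values; the numbers and variable names are invented for illustration and are not the study's data.

      import numpy as np
      from scipy import stats

      # Hypothetical time-to-exhaustion values (seconds) for 13 cyclists under the
      # two subliminal priming conditions; invented numbers for illustration only.
      tte_happy = np.array([620, 540, 700, 480, 590, 660, 510, 575, 630, 555, 605, 495, 645])
      tte_sad = tte_happy - np.array([150, 120, 200, 90, 160, 210, 110, 140, 180, 130, 170, 100, 190])

      # Paired t-test across participants (happy vs. sad priming).
      t, p = stats.ttest_rel(tte_happy, tte_sad)
      print(f"mean difference = {np.mean(tte_happy - tte_sad):.0f} s, t = {t:.2f}, p = {p:.3f}")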

  8. How does parents' visual perception of their child's weight status affect their feeding style?

    PubMed

    Yilmaz, Resul; Erkorkmaz, Ünal; Ozcetin, Mustafa; Karaaslan, Erhan

    2013-01-01

    Introduction: Feeding style is one of the prominent factors that determine energy intake. One of the factors influencing parental feeding style is the parents' perception of the child's weight status. Objective: The purpose of this study was to evaluate the relationship between the mother's visual perception of her child's weight status and her feeding style. Method: A cross-sectional study was conducted with the mothers of 380 preschool children aged 5 to 7 years (mean 6.14 years). Visual perception scores were measured using drawings, and maternal feeding style was measured with the validated Parental Feeding Style Questionnaire. Results: Scores on the parental feeding subscales "emotional feeding" and "encouragement to eat" were low for children rated as overweight according to the visual perception classification. Scores on the "emotional feeding" and "permissive control" subscales differed statistically between children classified as correctly perceived and those incorrectly perceived as underweight owing to maternal misperception. Conclusion: Different feeding styles were related to maternal visual perception. The best approach to preventing both obesity and underweight may be to focus on achieving a correct parental perception of children's weight status, thereby improving parenting skills and leading to the adoption of appropriate feeding styles.

  9. [Non-conscious perception of emotional faces affects visual object recognition].

    PubMed

    Gerasimenko, N Iu; Slavutskaia, A V; Kalinin, S A; Mikhaĭlova, E S

    2013-01-01

    In 34 healthy subjects, we analyzed accuracy and reaction time (RT) during the recognition of complex visual images: pictures of animals and non-living objects. The target stimuli were preceded by brief presentations of masking non-target stimuli, which were drawings of emotional (angry, fearful, happy) or neutral faces. We found that, in contrast to accuracy, RT depended on the emotional expression of the preceding faces. RT was significantly shorter when the target objects were paired with angry and fearful faces than with happy and neutral ones. These effects depended on the category of the target stimulus and were more prominent for objects than for animals. Further, the effects of the emotional faces were determined by emotional and communication personality traits (defined by Cattell's Questionnaire) and were more clearly defined in more sensitive, anxious and pessimistic introverts. The data are important for understanding the mechanisms by which non-conscious processing of emotional information shapes human visual behavior. PMID:23885550

  10. Topological Structure in Visual Perception.

    ERIC Educational Resources Information Center

    Chen, L.

    1982-01-01

    Three experiments on tachistoscopic perception of visual stimuli demonstrate that the visual system is sensitive to global topological properties. The results indicate that extraction of global topological properties is a basic factor in perceptual organization. (Author)

  11. Binocular visual surface perception.

    PubMed Central

    Nakayama, K

    1996-01-01

    Binocular disparity, the differential angular separation between pairs of image points in the two eyes, is the well-recognized basis for binocular distance perception. Without denying disparity's role in perceiving depth, we describe two perceptual phenomena, which indicate that a wider view of binocular vision is warranted. First, we show that disparity can play a critical role in two-dimensional perception by determining whether separate image fragments should be grouped as part of a single surface or segregated as parts of separate surfaces. Second, we show that stereoscopic vision is not limited to the registration and interpretation of binocular disparity but that it relies on half-occluded points, visible to one eye and not the other, to determine the layout and transparency of surfaces. Because these half-visible points are coded by neurons carrying eye-of-origin information, we suggest that the perception of these surface properties depends on neural activity available at visual cortical area V1. PMID:8570607

  12. Bodily action penetrates affective perception

    PubMed Central

    Rigutti, Sara; Gerbino, Walter

    2016-01-01

    Fantoni & Gerbino (2014) showed that subtle postural shifts associated with reaching can have a strong hedonic impact and affect how actors experience facial expressions of emotion. Using a novel Motor Action Mood Induction Procedure (MAMIP), they found consistent congruency effects in participants who performed a facial emotion identification task after a sequence of visually-guided reaches: a face perceived as neutral in a baseline condition appeared slightly happy after comfortable actions and slightly angry after uncomfortable actions. However, skeptics about the penetrability of perception (Zeimbekis & Raftopoulos, 2015) would consider such evidence insufficient to demonstrate that observer’s internal states induced by action comfort/discomfort affect perception in a top-down fashion. The action-modulated mood might have produced a back-end memory effect capable of affecting post-perceptual and decision processing, but not front-end perception. Here, we present evidence that performing a facial emotion detection (not identification) task after MAMIP exhibits systematic mood-congruent sensitivity changes, rather than response bias changes attributable to cognitive set shifts; i.e., we show that observer’s internal states induced by bodily action can modulate affective perception. The detection threshold for happiness was lower after fifty comfortable than uncomfortable reaches; while the detection threshold for anger was lower after fifty uncomfortable than comfortable reaches. Action valence induced an overall sensitivity improvement in detecting subtle variations of congruent facial expressions (happiness after positive comfortable actions, anger after negative uncomfortable actions), in the absence of significant response bias shifts. Notably, both comfortable and uncomfortable reaches impact sensitivity in an approximately symmetric way relative to a baseline inaction condition. All of these constitute compelling evidence of a genuine top-down effect on
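
    The key distinction in this abstract is between a change in sensitivity and a change in response bias. In standard signal detection terms, sensitivity is indexed by d' and bias by the criterion c; the short sketch below computes both from hypothetical detection counts (the formulas are the usual textbook ones, not the authors' code).

      from scipy.stats import norm

      def sdt(hits, misses, false_alarms, correct_rejections):
          # Hit and false-alarm rates from raw counts.
          h = hits / (hits + misses)
          f = false_alarms / (false_alarms + correct_rejections)
          d_prime = norm.ppf(h) - norm.ppf(f)                # sensitivity
          criterion = -0.5 * (norm.ppf(h) + norm.ppf(f))     # response bias
          return d_prime, criterion

      # Hypothetical counts after comfortable vs. uncomfortable action blocks.
      print(sdt(hits=40, misses=10, false_alarms=15, correct_rejections=35))
      print(sdt(hits=30, misses=20, false_alarms=15, correct_rejections=35))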

  13. Sound can suppress visual perception.

    PubMed

    Hidaka, Souta; Ide, Masakazu

    2015-05-29

    In a single modality, the percept of an input (e.g., voices of neighbors) is often suppressed by another (e.g., the sound of a car horn nearby) due to close interactions of neural responses to these inputs. Recent studies have also suggested that close interactions of neural responses could occur even across sensory modalities, especially for audio-visual interactions. However, direct behavioral evidence regarding the audio-visual perceptual suppression effect has not been reported in a study with humans. Here, we investigated whether sound could have a suppressive effect on visual perception. We found that white noise bursts presented through headphones degraded visual orientation discrimination performance. This auditory suppression effect on visual perception frequently occurred when these inputs were presented in a spatially and temporally consistent manner. These results indicate that the perceptual suppression effect could occur across auditory and visual modalities based on close and direct neural interactions among those sensory inputs.

  14. Structure of visual perception.

    PubMed Central

    Zhang, J; Wu, S Y

    1990-01-01

    The response properties of a class of motion detectors (Reichardt detectors) are investigated extensively here. Since the outputs of the detectors, responding to an image undergoing two-dimensional rigid translation, are dependent on both the image velocity and the image intensity distribution, they are nonuniform across the entire image, even though the object is moving rigidly as a whole. To achieve perceptual "oneness" in the rigid motion, we are led to contend that visual perception must take place in a space that is non-Euclidean in nature. We then derive the affine connection and the metric of this perceptual space. The Riemann curvature tensor is identically zero, which means that the perceptual space is intrinsically flat. A geodesic in this space is composed of points of constant image intensity gradient along a certain direction. The deviation of geodesics (which are perceptually "straight") from physically straight lines may offer an explanation to the perceptual distortion of angular relationships such as the Hering illusion. PMID:2235999
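
    The detectors discussed here are correlation-type (Reichardt) motion detectors, whose output depends on the local intensity pattern as well as on velocity. A generic textbook version of such a detector is sketched below; it is not the authors' formalism, and the signals are synthetic.

      import numpy as np

      def reichardt(a, b, delay=1):
          """a, b: intensity time series at two neighbouring sample points."""
          a_d = np.roll(a, delay)   # delayed copy of a
          b_d = np.roll(b, delay)   # delayed copy of b
          # Opponent correlation: positive on average for motion from A towards B.
          return a_d * b - b_d * a

      t = np.arange(200)
      a = np.sin(2 * np.pi * t / 40)          # intensity at point A
      b = np.sin(2 * np.pi * (t - 5) / 40)    # same pattern arriving 5 samples later at B
      print(reichardt(a, b, delay=5).mean())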

  15. Serial dependence in visual perception.

    PubMed

    Fischer, Jason; Whitney, David

    2014-05-01

    Visual input often arrives in a noisy and discontinuous stream, owing to head and eye movements, occlusion, lighting changes, and many other factors. Yet the physical world is generally stable; objects and physical characteristics rarely change spontaneously. How then does the human visual system capitalize on continuity in the physical environment over time? We found that visual perception in humans is serially dependent, using both prior and present input to inform perception at the present moment. Using an orientation judgment task, we found that, even when visual input changed randomly over time, perceived orientation was strongly and systematically biased toward recently seen stimuli. Furthermore, the strength of this bias was modulated by attention and tuned to the spatial and temporal proximity of successive stimuli. These results reveal a serial dependence in perception characterized by a spatiotemporally tuned, orientation-selective operator-which we call a continuity field-that may promote visual stability over time.
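
    Serial dependence of this kind is usually quantified by relating the error on each trial to the orientation difference between the previous and the current stimulus: attraction shows up as errors pulled toward the preceding orientation. The simulation below (toy bias shape, arbitrary parameters) illustrates that measurement.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 2000
      orientation = rng.uniform(-45, 45, n)          # presented orientations (deg)
      # Orientation of the previous stimulus relative to the current one.
      prev_minus_curr = np.r_[0.0, orientation[:-1] - orientation[1:]]
      # Toy attractive bias toward the previous stimulus plus response noise.
      bias = 3.0 * np.sin(np.deg2rad(2 * prev_minus_curr))
      response = orientation + bias + rng.normal(0, 4, n)
      error = response - orientation

      # Mean error in coarse bins of the previous trial's relative orientation.
      for lo in (-30, -10, 10):
          sel = (prev_minus_curr >= lo) & (prev_minus_curr < lo + 20)
          print(f"prev - curr in [{lo}, {lo + 20}) deg: mean error = {error[sel].mean():+.2f} deg")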

  16. Visual adaptation and face perception

    PubMed Central

    Webster, Michael A.; MacLeod, Donald I. A.

    2011-01-01

    The appearance of faces can be strongly affected by the characteristics of faces viewed previously. These perceptual after-effects reflect processes of sensory adaptation that are found throughout the visual system, but which have been considered only relatively recently in the context of higher level perceptual judgements. In this review, we explore the consequences of adaptation for human face perception, and the implications of adaptation for understanding the neural-coding schemes underlying the visual representation of faces. The properties of face after-effects suggest that they, in part, reflect response changes at high and possibly face-specific levels of visual processing. Yet, the form of the after-effects and the norm-based codes that they point to show many parallels with the adaptations and functional organization that are thought to underlie the encoding of perceptual attributes like colour. The nature and basis for human colour vision have been studied extensively, and we draw on ideas and principles that have been developed to account for norms and normalization in colour vision to consider potential similarities and differences in the representation and adaptation of faces. PMID:21536555

  17. Camouflage and visual perception

    PubMed Central

    Troscianko, Tom; Benton, Christopher P.; Lovell, P. George; Tolhurst, David J.; Pizlo, Zygmunt

    2008-01-01

    How does an animal conceal itself from visual detection by other animals? This review paper seeks to identify general principles that may apply in this broad area. It considers mechanisms of visual encoding, of grouping and object encoding, and of search. In most cases, the evidence base comes from studies of humans or species whose vision approximates to that of humans. The effort is hampered by a relatively sparse literature on visual function in natural environments and with complex foraging tasks. However, some general constraints emerge as being potentially powerful principles in understanding concealment—a ‘constraint’ here means a set of simplifying assumptions. Strategies that disrupt the unambiguous encoding of discontinuities of intensity (edges), and of other key visual attributes, such as motion, are key here. Similar strategies may also defeat grouping and object-encoding mechanisms. Finally, the paper considers how we may understand the processes of search for complex targets in complex scenes. The aim is to provide a number of pointers towards issues, which may be of assistance in understanding camouflage and concealment, particularly with reference to how visual systems can detect the shape of complex, concealed objects. PMID:18990671
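
    A central idea above is that camouflage works partly by weakening the encoding of intensity discontinuities (edges). The fragment below computes a simple gradient-magnitude edge signal on a synthetic image; the image and threshold are arbitrary and serve only to make the notion of an "edge signal" concrete.

      import numpy as np

      img = np.zeros((64, 64))
      img[20:44, 20:44] = 1.0                  # a bright square on a dark background
      gy, gx = np.gradient(img)                # local intensity gradients
      edge_strength = np.hypot(gx, gy)         # gradient magnitude ~ edge signal
      print("pixels with a strong edge signal:", int((edge_strength > 0.25).sum()))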

  18. Attentional Episodes in Visual Perception

    ERIC Educational Resources Information Center

    Wyble, Brad; Potter, Mary C.; Bowman, Howard; Nieuwenstein, Mark

    2011-01-01

    Is one's temporal perception of the world truly as seamless as it appears? This article presents a computationally motivated theory suggesting that visual attention samples information from temporal episodes (episodic simultaneous type/serial token model; Wyble, Bowman, & Nieuwenstein, 2009). Breaks between these episodes are punctuated by periods…

  19. Perception, Cognition, and Visualization.

    ERIC Educational Resources Information Center

    Arnheim, Rudolf

    1991-01-01

    Describes how pictures can combine aspects of naturalistic representation with more formal shapes to enhance cognitive understanding. These "diagrammatic" shapes derive from geometrical elements and thereby bestow visual concreteness on concepts conveyed by the pictures. Leonardo da Vinci's anatomical drawings are used as examples…

  20. Improving visual perception through neurofeedback.

    PubMed

    Scharnowski, Frank; Hutton, Chloe; Josephs, Oliver; Weiskopf, Nikolaus; Rees, Geraint

    2012-12-01

    Perception depends on the interplay of ongoing spontaneous activity and stimulus-evoked activity in sensory cortices. This raises the possibility that training ongoing spontaneous activity alone might be sufficient for enhancing perceptual sensitivity. To test this, we trained human participants to control ongoing spontaneous activity in circumscribed regions of retinotopic visual cortex using real-time functional MRI-based neurofeedback. After training, we tested participants using a new and previously untrained visual detection task that was presented at the visual field location corresponding to the trained region of visual cortex. Perceptual sensitivity was significantly enhanced only when participants who had previously learned control over ongoing activity were now exercising control and only for that region of visual cortex. Our new approach allows us to non-invasively and non-pharmacologically manipulate regionally specific brain activity and thus provide "brain training" to deliver particular perceptual enhancements. PMID:23223302
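
    The training signal in ROI-based real-time fMRI neurofeedback is, at its core, a running estimate of activity in the trained region converted into a visual display. The loop below is a deliberately simplified sketch of that idea with simulated volumes; the ROI, baseline handling, and scaling are assumptions rather than the published pipeline.

      import numpy as np

      rng = np.random.default_rng(2)
      roi_mask = np.zeros((10, 10, 10), dtype=bool)
      roi_mask[4:7, 4:7, 4:7] = True                 # voxels of the trained region (assumed)

      baseline = 1000.0                              # mean ROI signal from a rest period (assumed)
      for t in range(5):                             # incoming volumes during a regulation block
          volume = rng.normal(1000 + 3 * t, 5, (10, 10, 10))   # simulated EPI volume
          roi_mean = volume[roi_mask].mean()
          psc = 100 * (roi_mean - baseline) / baseline         # percent signal change
          bar_height = np.clip(psc / 2.0, 0.0, 1.0)            # map to a 0-1 thermometer bar
          print(f"volume {t}: PSC = {psc:.2f}%, feedback bar = {bar_height:.2f}")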

  1. Neuron analysis of visual perception

    NASA Technical Reports Server (NTRS)

    Chow, K. L.

    1980-01-01

    The receptive fields of single cells in the visual system of the cat and squirrel monkey were studied, investigating the vestibular input affecting the cells and the cells' responses during the visual discrimination learning process. The receptive field characteristics of the rabbit visual system, its normal development, its abnormal development following visual deprivation, and the structural and functional re-organization of the visual system following neonatal and prenatal surgery were also studied. The results of each individual part of each investigation are detailed.

  2. Picture perception and visual field

    NASA Astrophysics Data System (ADS)

    van Doorn, Andrea J.; de Ridder, Huib; Koenderink, Jan

    2013-03-01

    Looking at a picture fills part of the visual field. In the case of straight photographs there is a notion of the "Field of View" of the camera at the time of exposure. Is there a corresponding notion for the perception of the picture? In most cases the part of the visual field (as measured in degrees) filled by the picture will be quite different from the field of view of the camera. The case of works of art is even more complicated; there need not even exist a well-defined central viewpoint. With several examples we show that there is essentially no notion of a corresponding "field of view" in pictorial perception. This is even the case for drawings in conventional linear perspective. Apparently the "mental eye" of the viewer is often unrelated to the geometry of the camera (or the perspective center used in drawing). Observers often substitute templates instead of attempting an analysis of perspective.

  3. Eye movements reset visual perception.

    PubMed

    Paradiso, Michael A; Meshi, Dar; Pisarcik, Jordan; Levine, Samuel

    2012-12-12

    Human vision uses saccadic eye movements to rapidly shift the sensitive foveal portion of our retina to objects of interest. For vision to function properly amidst these ballistic eye movements, a mechanism is needed to extract discrete percepts on each fixation from the continuous stream of neural activity that spans fixations. The speed of visual parsing is crucial because human behaviors ranging from reading to driving to sports rely on rapid visual analysis. We find that a brain signal associated with moving the eyes appears to play a role in resetting visual analysis on each fixation, a process that may aid in parsing the neural signal. We quantified the degree to which the perception of tilt is influenced by the tilt of a stimulus on a preceding fixation. Two key conditions were compared, one in which a saccade moved the eyes from one stimulus to the next and a second simulated saccade condition in which the stimuli moved in the same manner but the subjects did not move their eyes. We find that there is a brief period of time at the start of each fixation during which the tilt of the previous stimulus influences perception (in a direction opposite to the tilt aftereffect)--perception is not instantaneously reset when a fixation starts. Importantly, the results show that this perceptual bias is much greater, with nearly identical visual input, when saccades are simulated. This finding suggests that, in real-saccade conditions, some signal related to the eye movement may be involved in the reset phenomenon. While proprioceptive information from the extraocular muscles is conceivably a factor, the fast speed of the effect we observe suggests that a more likely mechanism is a corollary discharge signal associated with eye movement. PMID:23241264

  5. The hippocampus and visual perception

    PubMed Central

    Lee, Andy C. H.; Yeung, Lok-Kin; Barense, Morgan D.

    2012-01-01

    In this review, we will discuss the idea that the hippocampus may be involved in both memory and perception, contrary to theories that posit functional and neuroanatomical segregation of these processes. This suggestion is based on a number of recent neuropsychological and functional neuroimaging studies that have demonstrated that the hippocampus is involved in the visual discrimination of complex spatial scene stimuli. We argue that these findings cannot be explained by long-term memory or working memory processing or, in the case of patient findings, dysfunction beyond the medial temporal lobe (MTL). Instead, these studies point toward a role for the hippocampus in higher-order spatial perception. We suggest that the hippocampus processes complex conjunctions of spatial features, and that it may be more appropriate to consider the representations for which this structure is critical, rather than the cognitive processes that it mediates. PMID:22529794

  6. Adaptive design of visual perception experiments

    NASA Astrophysics Data System (ADS)

    O'Connor, John D.; Hixson, Jonathan; Thomas, James M., Jr.; Peterson, Matthew S.; Parasuraman, Raja

    2010-04-01

    Meticulous experimental design may not always prevent confounds from affecting experimental data acquired during visual perception experiments. Although experimental controls reduce the potential effects of foreseen sources of interference, interaction, or noise, they are not always adequate for preventing the confounding effects of unforeseen forces. Visual perception experimentation is vulnerable to unforeseen confounds because of the nature of the associated cognitive processes involved in the decision task. Some confounds are beyond the control of experimentation, such as what a participant does immediately prior to experimental participation, or the participant's attitude or emotional state. Other confounds may occur through ignorance of practical control methods on the part of the experiment's designer. The authors conducted experiments related to experimental fatigue and initially achieved significant results that were, upon re-examination, attributable to a lack of adequate controls. Re-examination of the original results and the processes and events that led to them yielded a second experimental design with more experimental controls and significantly different results. The authors propose that designers of visual perception experiments can benefit from planning to use a test-fix-test or adaptive experimental design cycle, so that unforeseen confounds in the initial design can be remedied.

  7. Visual perception and corollary discharge.

    PubMed

    Sommer, Marc A; Wurtz, Robert H

    2008-01-01

    Perception depends not only on sensory input but also on the state of the brain receiving that input. A classic example is perception of a stable visual world in spite of the saccadic eye movements that shift the images on the retina. A long-standing hypothesis is that the brain compensates for the disruption of visual input by using advance knowledge of the impending saccade, an internally generated corollary discharge. One possible neuronal mechanism for this compensation has been previously identified in parietal and frontal cortex of monkeys, but the origin of the necessary corollary discharge remained unknown. Here, we consider recent experiments that identified a pathway for a corollary discharge for saccades that extends from the superior colliculus in the midbrain to the frontal eye fields in the cerebral cortex with a relay in the medial dorsal nucleus of the thalamus. We first review the nature of the evidence used to identify a corollary discharge signal in the complexity of the primate brain and show its use for guiding a rapid sequence of eye movements. We then consider two experiments that show this same corollary signal may provide the input to the frontal cortex neurons that alters their activity with saccades in ways that could compensate for the displacements in the visual input produced by saccadic eye movements. The first experiment shows that the corollary discharge signal is spatially and temporally appropriate to produce the alterations in the frontal-cortex neurons. The second shows that this signal is necessary for this alteration because inactivation of the corollary reduces the compensation by frontal-cortex neurons. The identification of this relatively simple circuit specifies the organization of a corollary discharge in the primate brain for the first time and provides a specific example upon which consideration of the roles of corollary activity in other systems and for other functions can be evaluated.

  8. Flow, affect and visual creativity.

    PubMed

    Cseh, Genevieve M; Phillips, Louise H; Pearson, David G

    2015-01-01

    Flow (being in the zone) is purported to have positive consequences in terms of affect and performance; however, there is no empirical evidence about these links in visual creativity. Positive affect often--but inconsistently--facilitates creativity, and both may be linked to experiencing flow. This study aimed to determine relationships between these variables within visual creativity. Participants performed the creative mental synthesis task to simulate the creative process. Affect change (pre- vs. post-task) and flow were measured via questionnaires. The creativity of synthesis drawings was rated objectively and subjectively by judges. Findings empirically demonstrate that flow is related to affect improvement during visual creativity. Affect change was linked to productivity and self-rated creativity, but no other objective or subjective performance measures. Flow was unrelated to all external performance measures but was highly correlated with self-rated creativity; flow may therefore motivate perseverance towards eventual excellence rather than provide direct cognitive enhancement.

  9. Rubber Hand Illusion Affects Joint Angle Perception

    PubMed Central

    Butz, Martin V.; Kutter, Esther F.; Lorenz, Corinna

    2014-01-01

    The Rubber Hand Illusion (RHI) is a well-established experimental paradigm. It has been shown that the RHI can affect hand location estimates, arm and hand motion towards goals, the subjective visual appearance of the own hand, and the feeling of body ownership. Several studies also indicate that the peri-hand space is partially remapped around the rubber hand. Nonetheless, the question remains if and to what extent the RHI can affect the perception of other body parts. In this study we ask if the RHI can alter the perception of the elbow joint. Participants had to adjust an angular representation on a screen according to their proprioceptive perception of their own elbow joint angle. The results show that the RHI does indeed alter the elbow joint estimation, increasing the agreement with the position and orientation of the artificial hand. Thus, the results show that the brain does not only adjust the perception of the hand in body-relative space, but it also modifies the perception of other body parts. In conclusion, we propose that the brain continuously strives to maintain a consistent internal body image and that this image can be influenced by the available sensory information sources, which are mediated and mapped onto each other by means of a postural, kinematic body model. PMID:24671172

  10. Separate visual representations for perception and for visually guided behavior

    NASA Technical Reports Server (NTRS)

    Bridgeman, Bruce

    1989-01-01

    Converging evidence from several sources indicates that two distinct representations of visual space mediate perception and visually guided behavior, respectively. The two maps of visual space follow different rules; spatial values in either one can be biased without affecting the other. Ordinarily the two maps give equivalent responses because both are veridically in register with the world; special techniques are required to pull them apart. One such technique is saccadic suppression: small target displacements during saccadic eye movements are not perceived, though the displacements can change eye movements or pointing to the target. A second way to separate cognitive and motor-oriented maps is with induced motion: a slowly moving frame will make a fixed target appear to drift in the opposite direction, while motor behavior toward the target is unchanged. The same result occurs with stroboscopic induced motion, where the frame jumps abruptly and the target seems to jump in the opposite direction. A third method of separating cognitive and motor maps, requiring no motion of target, background or eye, is the Roelofs effect: a target surrounded by an off-center rectangular frame will appear to be off-center in the direction opposite the frame. Again the effect influences perception, but in half of the subjects it does not influence pointing to the target. This experiment also reveals more characteristics of the maps and their interactions with one another: the motor map apparently has little or no memory, and must be fed from the biased cognitive map if an enforced delay occurs between stimulus presentation and motor response. In designing spatial displays, the results mean that what you see isn't necessarily what you get. Displays must be designed with either perception or visually guided behavior in mind.

  11. Perception of biological motion in visual agnosia.

    PubMed

    Huberle, Elisabeth; Rupek, Paul; Lappe, Markus; Karnath, Hans-Otto

    2012-01-01

    Over the past 25 years, visual processing has been discussed in the context of the dual stream hypothesis consisting of a ventral ("what") and a dorsal ("where") visual information processing pathway. Patients with brain damage of the ventral pathway typically present with signs of visual agnosia, the inability to identify and discriminate objects by visual exploration, but show normal motion perception. A dissociation between the perception of biological motion and non-biological motion has been suggested: perception of biological motion might be impaired when "non-biological" motion perception is intact and vice versa. The impact of object recognition on the perception of biological motion remains unclear. We thus investigated this question in a patient with severe visual agnosia, who showed normal perception of non-biological motion. The data suggested that the patient's perception of biological motion remained largely intact. However, when tested with objects constructed of coherently moving dots ("Shape-from-Motion"), recognition was severely impaired. The results are discussed in the context of possible mechanisms of biological motion perception.

  12. Visual influence on haptic torque perception.

    PubMed

    Xu, Yangqing; O'Keefe, Shélan; Suzuki, Satoru; Franconeri, Steven L

    2012-01-01

    The brain receives input from multiple sensory modalities simultaneously, yet we experience the outside world as a single integrated percept. This integration process must overcome instances where perceptual information conflicts across sensory modalities. Under such conflicts, the relative weighting of information from each modality typically depends on the given task. For conflicts between visual and haptic modalities, visual information has been shown to influence haptic judgments of object identity, spatial features (e.g., location, size), texture, and heaviness. Here we test a novel instance of haptic-visual conflict in the perception of torque. We asked participants to hold a left-right unbalanced object while viewing a potentially left-right mirror-reversed image of the object. Despite the intuition that the more proximal haptic information should dominate the perception of torque, we find that visual information exerts substantial influences on torque perception even when participants know that visual information is unreliable.

  13. Statistical extraction affects visually guided action

    PubMed Central

    Corbett, Jennifer E.; Song, Joo-Hyun

    2014-01-01

    The visual system summarizes average properties of ensembles of similar objects. We demonstrated an adaptation aftereffect of one such property, mean size, suggesting it is encoded along a single visual dimension (Corbett, et al., 2012), in a similar manner as basic stimulus properties like orientation and direction of motion. To further explore the fundamental nature of ensemble encoding, here we mapped the evolution of mean size adaptation over the course of visually guided grasping. Participants adapted to two sets of dots with different mean sizes. After adaptation, two test dots replaced the adapting sets. Participants first reached to one of these dots, and then judged whether it was larger or smaller than the opposite dot. Grip apertures were inversely dependent on the average dot size of the preceding adapting patch during the early phase of movements, and this aftereffect dissipated as reaches neared the target. Interestingly, perceptual judgments still showed a marked aftereffect, even though they were made after grasping was completed more-or-less veridically. This effect of mean size adaptation on early visually guided kinematics provides novel evidence that mean size is encoded fundamentally in both perception and action domains, and suggests that ensemble statistics not only influence our perceptions of individual objects but can also affect our physical interactions with the external environment. PMID:25383014

  14. Visual perception through the diffusion of light

    NASA Astrophysics Data System (ADS)

    Ung, Timothy

    Human perception of the visual world is limited by the homogeneity of design and the standardization of materials. After a lighting apparatus made of steel and thousands of transparent threads is constructed, a small amount of light will be directed onto the apparatus and reflected and refracted multiple times, spreading light over a large area. However, visual perception of the light reflecting and refracting through the apparatus will change according to an observer's location in relation to the apparatus. Ultimately, the goal of this thesis is to engage one's perception of the visual world using properties of transparent materials to maximize the diffusion of light.

  15. The Nose Influences Visual and Personality Perception.

    PubMed

    van Schijndel, Olaf; Tasman, Abel-Jan; Litschel, Ralph

    2015-10-01

    Nasal deformities are known to attract attention, are felt to be stigmatizing, and are known to negatively affect the perception of personality. These effects have not been studied on profile views. The objective of this study was the quantification of visual attention directed toward nasal deformities and its impact on the perception of personality traits. Forty observers were divided into two groups and their visual scanpaths were recorded. Both groups observed a series of photographs displaying profile views of 20 adult patients' faces with one or more nasal deformities or computer-morphed corrected noses. Photographs were chosen from a consecutive sample of patients (range: 17-68 years, median: 45) who requested a rhinoplasty at the Department of Otolaryngology, Head and Neck Surgery, Facial Plastic Surgery of the Cantonal Hospital Sankt Gallen, Switzerland. Patients' photographs showed a nasal deformity in one series and a computer-morphed nose in the other series and vice versa. Visual fixation times on the noses were compared between the photographs with and without a nasal deformity. Observers subsequently rated personality traits using visual analog scales. The nasal profile with a deformity received more visual attention in 17 of 20 patients (85%). The mean relative fixation duration on all nasal deformities was significantly larger compared with all computer-simulated noses (17.3 ± 6.9% [SD] vs. 10.6 ± 2.5%; p < 0.001). Cumulative personality questionnaire scores and the score for satisfaction were significantly lower for faces with nasal deformities compared with computer-morphed noses (27.8 ± 6.0 vs. 29.1 ± 6.0, p = 0.040, and 5.3 ± 1.59 vs. 5.7 ± 1.53, p = 0.017, respectively). For deformed noses, the mean relative fixation duration did not correlate with the cumulative personality score (R = -0.399; p = 0.082). To the best knowledge of the authors, an attention-drawing potential of nasal

  16. Fear selectively modulates visual mental imagery and visual perception.

    PubMed

    Borst, Grégoire; Kosslyn, Stephen M

    2010-05-01

    Emotions have been shown to modulate low-level visual processing of simple stimuli. In this study, we investigate whether emotions only modulate processing of visual representations created from direct visual inputs or whether they also modulate representations that underlie visual mental images. Our results demonstrate that when participants visualize or look at the global shape of written words (low-spatial-frequency visual information), the prior brief presentation of fearful faces enhances processing, whereas when participants visualize or look at details of written words (high-spatial-frequency visual information), the prior brief presentation of fearful faces impairs processing. This study demonstrates that emotions have similar effects on low-level processing of visual percepts and of internal representations created on the basis of information stored in long-term memory. PMID:20182955
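
    The manipulation described here hinges on the split between low-spatial-frequency information (global word shape) and high-spatial-frequency information (letter detail). A generic way to make that split explicit is a Gaussian decomposition, sketched below on a synthetic image; the cutoff is an arbitrary assumption.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      rng = np.random.default_rng(3)
      image = rng.random((128, 128))              # stand-in for an image of a written word
      low_sf = gaussian_filter(image, sigma=4)    # low spatial frequencies (global shape)
      high_sf = image - low_sf                    # high spatial frequencies (local detail)
      print(low_sf.std(), high_sf.std())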

  17. Perception of Visual Speed While Moving

    ERIC Educational Resources Information Center

    Durgin, Frank H.; Gigone, Krista; Scott, Rebecca

    2005-01-01

    During self-motion, the world normally appears stationary. In part, this may be due to reductions in visual motion signals during self-motion. In 8 experiments, the authors used magnitude estimation to characterize changes in visual speed perception as a result of biomechanical self-motion alone (treadmill walking), physical translation alone…

  18. Visual motion integration for perception and pursuit

    NASA Technical Reports Server (NTRS)

    Stone, L. S.; Beutter, B. R.; Lorenceau, J.

    2000-01-01

    To examine the relationship between visual motion processing for perception and pursuit, we measured the pursuit eye-movement and perceptual responses to the same complex-motion stimuli. We show that humans can both perceive and pursue the motion of line-figure objects, even when partial occlusion makes the resulting image motion vastly different from the underlying object motion. Our results show that both perception and pursuit can perform largely accurate motion integration, i.e. the selective combination of local motion signals across the visual field to derive global object motion. Furthermore, because we manipulated perceived motion while keeping image motion identical, the observed parallel changes in perception and pursuit show that the motion signals driving steady-state pursuit and perception are linked. These findings disprove current pursuit models whose control strategy is to minimize retinal image motion, and suggest a new framework for the interplay between visual cortex and cerebellum in visuomotor control.

  19. Saccadic Corollary Discharge Underlies Stable Visual Perception

    PubMed Central

    Berman, Rebecca A.; Joiner, Wilsaan M.; Wurtz, Robert H.

    2016-01-01

    Saccadic eye movements direct the high-resolution foveae of our retinas toward objects of interest. With each saccade, the image jumps on the retina, causing a discontinuity in visual input. Our visual perception, however, remains stable. Philosophers and scientists over centuries have proposed that visual stability depends upon an internal neuronal signal that is a copy of the neuronal signal driving the eye movement, now referred to as a corollary discharge (CD) or efference copy. In the old world monkey, such a CD circuit for saccades has been identified extending from superior colliculus through MD thalamus to frontal cortex, but there is little evidence that this circuit actually contributes to visual perception. We tested the influence of this CD circuit on visual perception by first training macaque monkeys to report their perceived eye direction, and then reversibly inactivating the CD as it passes through the thalamus. We found that the monkey's perception changed; during CD inactivation, there was a difference between where the monkey perceived its eyes to be directed and where they were actually directed. Perception and saccade were decoupled. We established that the perceived eye direction at the end of the saccade was not derived from proprioceptive input from eye muscles, and was not altered by contextual visual information. We conclude that the CD provides internal information contributing to the brain's creation of perceived visual stability. More specifically, the CD might provide the internal saccade vector used to unite separate retinal images into a stable visual scene. SIGNIFICANCE STATEMENT Visual stability is one of the most remarkable aspects of human vision. The eyes move rapidly several times per second, displacing the retinal image each time. The brain compensates for this disruption, keeping our visual perception stable. A major hypothesis explaining this stability invokes a signal within the brain, a corollary discharge, that informs

  1. How perception affects racial categorization: On the influence of initial visual exposure on labelling people as diverse individuals or racial subjects.

    PubMed

    Harsányi, Géza; Carbon, Claus-Christian

    2015-01-01

    In research on racial categorization we tend to focus on socialization, on environmental influences, and on social factors. One important factor, though, is perception itself. In our experiment we let people label persons on dimensions which they could freely use. The participants were initially exposed to a full series either of black faces or of white faces. We observed a clear effect of initial exposure on explicit verbal categorizations. When initially exposed to white faces, participants used racial labels for the subsequent black faces only. In contrast, racial labels were used for black as well as white faces after initial exposure to black faces, which indicates a shift to in-group categorization after having initially inspected black faces. In conclusion, this effect documents highly adaptive categorizations caused by visual context alone, suggesting that racial thoughts are based on relatively volatile category representations. PMID:26489221

  2. Neural pathways for visual speech perception

    PubMed Central

    Bernstein, Lynne E.; Liebenthal, Einat

    2014-01-01

    This paper examines the questions, what levels of speech can be perceived visually, and how is visual speech represented by the brain? Review of the literature leads to the conclusions that every level of psycholinguistic speech structure (i.e., phonetic features, phonemes, syllables, words, and prosody) can be perceived visually, although individuals differ in their abilities to do so; and that there are visual modality-specific representations of speech qua speech in higher-level vision brain areas. That is, the visual system represents the modal patterns of visual speech. The suggestion that the auditory speech pathway receives and represents visual speech is examined in light of neuroimaging evidence on the auditory speech pathways. We outline the generally agreed-upon organization of the visual ventral and dorsal pathways and examine several types of visual processing that might be related to speech through those pathways, specifically, face and body, orthography, and sign language processing. In this context, we examine the visual speech processing literature, which reveals widespread diverse patterns of activity in posterior temporal cortices in response to visual speech stimuli. We outline a model of the visual and auditory speech pathways and make several suggestions: (1) The visual perception of speech relies on visual pathway representations of speech qua speech. (2) A proposed site of these representations, the temporal visual speech area (TVSA) has been demonstrated in posterior temporal cortex, ventral and posterior to multisensory posterior superior temporal sulcus (pSTS). (3) Given that visual speech has dynamic and configural features, its representations in feedforward visual pathways are expected to integrate these features, possibly in TVSA. PMID:25520611

  3. Neocortical Rebound Depolarization Enhances Visual Perception

    PubMed Central

    Funayama, Kenta; Ban, Hiroshi; Chan, Allen W.; Matsuki, Norio; Murphy, Timothy H.; Ikegaya, Yuji

    2015-01-01

    Animals are constantly exposed to the time-varying visual world. Because visual perception is modulated by immediately prior visual experience, visual cortical neurons may register recent visual history into a specific form of offline activity and link it to later visual input. To examine how preceding visual inputs interact with upcoming information at the single neuron level, we designed a simple stimulation protocol in which a brief, orientated flashing stimulus was subsequently coupled to visual stimuli with identical or different features. Using in vivo whole-cell patch-clamp recording and functional two-photon calcium imaging from the primary visual cortex (V1) of awake mice, we discovered that a flash of sinusoidal grating per se induces an early, transient activation as well as a long-delayed reactivation in V1 neurons. This late response, which started hundreds of milliseconds after the flash and persisted for approximately 2 s, was also observed in human V1 electroencephalogram. When another drifting grating stimulus arrived during the late response, the V1 neurons exhibited a sublinear, but apparently increased response, especially to the same grating orientation. In behavioral tests of mice and humans, the flashing stimulation enhanced the detection power of the identically orientated visual stimulation only when the second stimulation was presented during the time window of the late response. Therefore, V1 late responses likely provide a neural basis for admixing temporally separated stimuli and extracting identical features in time-varying visual environments. PMID:26274866

  4. Acoustic noise improves visual perception and modulates occipital oscillatory states.

    PubMed

    Gleiss, Stephanie; Kayser, Christoph

    2014-04-01

    Perception is a multisensory process, and previous work has shown that multisensory interactions occur not only for object-related stimuli but also for simplistic and apparently unrelated inputs to the different senses. We here compare the facilitation of visual perception induced by transient (target-synchronized) sounds to the facilitation provided by continuous background noise-like sounds. Specifically, we show that continuous acoustic noise improves visual contrast detection by systematically shifting psychometric curves in an amplitude-dependent manner. This multisensory benefit was found to be both qualitatively and quantitatively similar to that induced by a transient, target-synchronized sound in the same paradigm. Studying the underlying neural mechanisms using electrical neuroimaging (EEG), we found that acoustic noise alters occipital alpha (8-12 Hz) power and decreases beta-band (14-20 Hz) coupling of occipital and temporal sites. Task-irrelevant and continuous sounds thereby have an amplitude-dependent effect on cortical mechanisms implicated in shaping visual cortical excitability. The same oscillatory mechanisms also mediate visual facilitation by transient sounds, and our results suggest that task-related sounds and task-irrelevant background noises could induce perceptually and mechanistically similar enhancement of visual perception. Given the omnipresence of sounds and noises in our environment, such multisensory interactions may affect perception in many everyday scenarios. PMID:24236698
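
    The abstract quantifies the benefit as a shift of the contrast-detection psychometric curve. As a rough illustration of how such a shift can be measured, the sketch below fits a cumulative-Gaussian psychometric function to hypothetical hit rates with and without background sound and compares the fitted thresholds; the data, the function form, and the parameter values are assumptions for illustration, not the authors' analysis pipeline.

```python
# Minimal sketch: quantifying a horizontal shift of a contrast-detection
# psychometric curve, assuming a cumulative-Gaussian form and synthetic data.
# (Illustrative only; not the analysis pipeline used in the study above.)
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(contrast, threshold, slope):
    """Proportion 'detected' as a cumulative Gaussian of log contrast."""
    return norm.cdf((np.log10(contrast) - np.log10(threshold)) / slope)

contrasts = np.array([0.005, 0.01, 0.02, 0.04, 0.08, 0.16])   # Michelson contrast
p_silence = np.array([0.08, 0.20, 0.48, 0.80, 0.95, 0.99])    # hypothetical hit rates
p_noise   = np.array([0.12, 0.30, 0.62, 0.88, 0.97, 0.99])    # hypothetical, with sound

(th_sil, sl_sil), _ = curve_fit(psychometric, contrasts, p_silence, p0=[0.02, 0.3])
(th_noi, sl_noi), _ = curve_fit(psychometric, contrasts, p_noise,   p0=[0.02, 0.3])

print(f"detection threshold, silence: {th_sil:.4f}")
print(f"detection threshold, noise:   {th_noi:.4f}")
print(f"facilitation (threshold ratio): {th_sil / th_noi:.2f}x")
```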

  5. Sequencing Visual Perception Skills. Training Manual.

    ERIC Educational Resources Information Center

    Langstaff, Anne L., Ed.

    The training manual of sequenced visual perception skills offers an assessment guide, explains approximately 20 major types of instructional activities, and describes appropriate instructional materials, illustrated in an associated filmstrip. All activities are organized into four learning steps (recognition, discrimination, recall, and…

  6. How Do Observer's Responses Affect Visual Long-Term Memory?

    ERIC Educational Resources Information Center

    Makovski, Tal; Jiang, Yuhong V.; Swallow, Khena M.

    2013-01-01

    How does responding to an object affect explicit memory for visual information? The close theoretical relationship between action and perception suggests that items that require a response should be better remembered than items that require no response. However, conclusive evidence for this claim is lacking, as semantic coherence, category size,…

  7. Visual perception of thick transparent materials.

    PubMed

    Fleming, Roland W; Jäkel, Frank; Maloney, Laurence T

    2011-06-01

    Under typical viewing conditions, human observers readily distinguish between materials such as silk, marmalade, or granite, an achievement of the visual system that is poorly understood. Recognizing transparent materials is especially challenging. Previous work on the perception of transparency has focused on objects composed of flat, infinitely thin filters. In the experiments reported here, we considered thick transparent objects, such as ice cubes, which are irregular in shape and can vary in refractive index. An important part of the visual evidence signaling the presence of such objects is distortions in the perceived shape of other objects in the scene. We propose a new class of visual cues derived from the distortion field induced by thick transparent objects, and we provide experimental evidence that cues arising from the distortion field predict both the successes and the failures of human perception in judging refractive indices.

  8. Visual perception of thick transparent materials.

    PubMed

    Fleming, Roland W; Jäkel, Frank; Maloney, Laurence T

    2011-06-01

    Under typical viewing conditions, human observers readily distinguish between materials such as silk, marmalade, or granite, an achievement of the visual system that is poorly understood. Recognizing transparent materials is especially challenging. Previous work on the perception of transparency has focused on objects composed of flat, infinitely thin filters. In the experiments reported here, we considered thick transparent objects, such as ice cubes, which are irregular in shape and can vary in refractive index. An important part of the visual evidence signaling the presence of such objects is distortions in the perceived shape of other objects in the scene. We propose a new class of visual cues derived from the distortion field induced by thick transparent objects, and we provide experimental evidence that cues arising from the distortion field predict both the successes and the failures of human perception in judging refractive indices. PMID:21597102

  9. Visual Perception and Its Relation to Reading: An Annotated Bibliography.

    ERIC Educational Resources Information Center

    Vernon, Magdalen D., Comp.

    This annotated bibliography on visual perception and its relation to reading is composed of 55 citations ranging in date from 1952 to 1965. Its divisions include Perception of Shape by Young Children, Perception of Words by Children, Perception in Backward Readers, and Perception of Shapes, Letters, and Words by Adults. Listings which include…

  10. Visual space perception on a computer graphics night visual attachment

    NASA Technical Reports Server (NTRS)

    Palmer, E.; Petitt, J.

    1976-01-01

    A series of experiments was conducted to compare five psychophysical methods of measuring how people perceive visual space in simulators. Psychologists have used such methods traditionally to measure visual space perception in the real world. Of the five tasks - objective-size judgments, angular-size judgments, shape judgments, slant judgments, and distance judgments - only the angular-size judgment task proved to be of potential use as a measure of simulator realism. In this experiment pilots estimated the relative angular size of triangles placed at various distances along a simulated runway. Estimates made when the display was collimated were closer to real-world performance than estimates made with an uncollimated display.
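
    The angular-size judgments described above rest on simple viewing geometry. The sketch below computes the visual angle subtended by an object of a given size at a given simulated distance; the object size and distances are invented for illustration and are not taken from the study.

```python
# Sketch of the geometry behind angular-size judgments: the visual angle
# subtended by an object of a given physical size at a given viewing distance.
# Sizes and distances below are made up for illustration, not from the study.
import math

def visual_angle_deg(object_size_m, distance_m):
    """Visual angle (degrees) subtended by an object seen face-on."""
    return math.degrees(2.0 * math.atan(object_size_m / (2.0 * distance_m)))

triangle_height_m = 1.0
for distance_m in (100.0, 300.0, 900.0):
    angle = visual_angle_deg(triangle_height_m, distance_m)
    print(f"{distance_m:6.0f} m -> {angle:.3f} deg")
# Relative angular size between two distances is what observers estimated;
# e.g. the ratio of the angles at 100 m and 300 m is ~3:1 for such small angles.
```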

  11. Task usefulness affects perception of rivalrous images.

    PubMed

    Chopin, Adrien; Mamassian, Pascal

    2010-12-01

    In bistable perception, several interpretations of the same physical stimulus are perceived in alternation. If one interpretation appears to help the observer to be successful in an auxiliary task, will that interpretation be seen more often than the other? We addressed this question using rivalrous stimuli. One of the elicited percepts presented an advantage for a separate visual search task that was run in close temporal proximity to the rivalry task. We found that the percept that was useful for the search task became dominant over the alternate percept. Observers were not aware of the manipulation that made one percept more useful, which suggests that usefulness was learned implicitly. The learning influenced only the first percept of each rivalrous presentation, but the bias persisted even when the useful percept was no longer useful. The long-lasting aspect of the effect distinguishes it from other documented attentional effects on bistable perception. Therefore, using implicit learning, we demonstrated that task usefulness can durably change the appearance of a stimulus.

  12. Common mechanisms of visual imagery and perception.

    PubMed

    Ishai, A; Sagi, D

    1995-06-23

    Detection of a visual target can be facilitated by flanking visual masks. A similar enhancement in detection thresholds was obtained when observers imagined the previously perceived masks. Imagery-induced facilitation was detected for as long as 5 minutes after observation of the masks by the targeted eye. These results indicated the existence of a low-level (monocular) memory that stores the sensory trace for several minutes and enables reactivation of early representations by higher processes. This memory, with its iconic nature, may subserve the interface between mental images and percepts.

  13. Visual Perception in Correlated Noise

    NASA Astrophysics Data System (ADS)

    Myers, Kyle Jean

    known to exist in the human visual system. We shall show that the presence of such a mechanism can explain the degradation of human observer performance in correlated noise.

  14. A Comparative Study on the Visual Perceptions of Children with Attention Deficit Hyperactivity Disorder

    NASA Astrophysics Data System (ADS)

    Ahmetoglu, Emine; Aral, Neriman; Butun Ayhan, Aynur

    This study was conducted in order to (a) compare the visual perceptions of seven-year-old children diagnosed with attention deficit hyperactivity disorder with those of normally developing children of the same age and development level and (b) determine whether the visual perceptions of children with attention deficit hyperactivity disorder vary with respect to gender, having received preschool education and parents' educational level. A total of 60 children, 30 with attention deficit hyperactivity disorder and 30 with normal development, were assigned to the study. Data about children with attention deficit hyperactivity disorder and their families were collected using a General Information Form, and the visual perception of the children was examined with the Frostig Developmental Test of Visual Perception. The Mann-Whitney U-test and Kruskal-Wallis variance analysis were used to determine whether there was a difference between the visual perceptions of children with normal development and those diagnosed with attention deficit hyperactivity disorder and to discover whether the variables of gender, preschool education and parents' educational status affected the visual perceptions of children with attention deficit hyperactivity disorder. The results showed that there was a statistically significant difference between the visual perceptions of the two groups and that the visual perceptions of children with attention deficit hyperactivity disorder were significantly affected by gender, preschool education and parents' educational status.
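
    For readers who want to reproduce this kind of analysis, the sketch below runs the two nonparametric tests named in the abstract (Mann-Whitney U and Kruskal-Wallis) on synthetic Frostig-style scores; the group sizes echo the study, but the scores, the grouping variable, and any resulting p-values are invented for illustration.

```python
# Sketch of the two nonparametric tests named in the abstract, run on
# synthetic Frostig-style scores (the numbers are invented for illustration).
import numpy as np
from scipy.stats import mannwhitneyu, kruskal

rng = np.random.default_rng(0)
typical_scores = rng.normal(100, 10, 30)   # hypothetical control group
adhd_scores    = rng.normal(90, 12, 30)    # hypothetical ADHD group

# Group comparison (two independent samples).
u_stat, p_u = mannwhitneyu(typical_scores, adhd_scores, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_u:.4f}")

# Effect of a factor with >2 levels (e.g. parents' educational level) within the ADHD group.
low, mid, high = adhd_scores[:10], adhd_scores[10:20], adhd_scores[20:]
h_stat, p_h = kruskal(low, mid, high)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_h:.4f}")
```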

  15. Smelling directions: olfaction modulates ambiguous visual motion perception.

    PubMed

    Kuang, Shenbing; Zhang, Tao

    2014-07-23

    The sense of smell is often accompanied by simultaneous visual sensations. Previous studies have documented enhanced olfactory performance with the concurrent presence of congruent color- or shape-related visual cues, and facilitated visual object perception when congruent smells are simultaneously present. These visual object-olfaction interactions suggest the existence of couplings between the olfactory pathway and the visual ventral processing stream. However, it is not known if olfaction can modulate visual motion perception, a function that is related to the visual dorsal stream. We tested this possibility by examining the influence of olfactory cues on the perception of ambiguous visual motion signals. We showed that, after introducing an association between motion directions and olfactory cues, olfaction could indeed bias ambiguous visual motion perception. Our result that olfaction modulates visual motion processing adds to the current knowledge of cross-modal interactions and implies a possible functional linkage between the olfactory system and the visual dorsal pathway.

  16. Adaptive optics without altering visual perception.

    PubMed

    Koenig, D E; Hart, N W; Hofer, H J

    2014-04-01

    Adaptive optics combined with visual psychophysics creates the potential to study the relationship between visual function and the retina at the cellular scale. This potential is hampered, however, by visual interference from the wavefront-sensing beacon used during correction. For example, we have previously shown that even a dim, visible beacon can alter stimulus perception (Hofer et al., 2012). Here we describe a simple strategy employing a longer wavelength (980nm) beacon that, in conjunction with appropriate restriction on timing and placement, allowed us to perform psychophysics when dark adapted without altering visual perception. The method was verified by comparing detection and color appearance of foveally presented small spot stimuli with and without the wavefront beacon present in 5 subjects. As an important caution, we found that significant perceptual interference can occur even with a subliminal beacon when additional measures are not taken to limit exposure. Consequently, the lack of perceptual interference should be verified for a given system, and not assumed based on invisibility of the beacon.

  17. Asymmetry of the Visual Field: Perception, Retention and Preference of Still Images.

    ERIC Educational Resources Information Center

    Metallinos, Nikos

    Visual field theory was examined insofar as television viewers' perception, retention, and preference for still visual images were concerned. The purpose of the experimental investigation was to determine whether the specific shapes, colors, and placement of visuals within a picture frame affected viewers' abilities to perceive, describe, and…

  18. Perceptions of the Visually Impaired toward Pursuing Geography Courses and Majors in Higher Education

    ERIC Educational Resources Information Center

    Murr, Christopher D.; Blanchard, R. Denise

    2011-01-01

    Advances in classroom technology have lowered barriers for the visually impaired to study geography, yet few participate. Employing stereotype threat theory, we examined whether beliefs held by the visually impaired affect perceptions toward completing courses and majors in visually oriented disciplines. A test group received a low-level threat…

  19. Audibility and visual biasing in speech perception

    NASA Astrophysics Data System (ADS)

    Clement, Bart Richard

    Although speech perception has been considered a predominantly auditory phenomenon, large benefits from vision in degraded acoustic conditions suggest integration of audition and vision. More direct evidence of this comes from studies of audiovisual disparity that demonstrate vision can bias and even dominate perception (McGurk & MacDonald, 1976). It has been observed that hearing-impaired listeners demonstrate more visual biasing than normally hearing listeners (Walden et al., 1990). It is argued here that stimulus audibility must be equated across groups before true differences can be established. In the present investigation, effects of visual biasing on perception were examined as audibility was degraded for 12 young normally hearing listeners. Biasing was determined by quantifying the degree to which listener identification functions for a single synthetic auditory /ba-da-ga/ continuum changed across two conditions: (1) an auditory-only listening condition; and (2) an auditory-visual condition in which every item of the continuum was synchronized with visual articulations of the consonant-vowel (CV) tokens /ba/ and /ga/, as spoken by each of two talkers. Audibility was altered by presenting the conditions in quiet and in noise at each of three signal-to-noise (S/N) ratios. For the visual-/ba/ context, large effects of audibility were found. As audibility decreased, visual biasing increased. A large talker effect also was found, with one talker eliciting more biasing than the other. An independent lipreading measure demonstrated that this talker was more visually intelligible than the other. For the visual-/ga/ context, audibility and talker effects were less robust, possibly obscured by strong listener effects, which were characterized by marked differences in perceptual processing patterns among participants. Some demonstrated substantial biasing whereas others demonstrated little, indicating a strong reliance on audition even in severely degraded acoustic

  20. Audio-visual affective expression recognition

    NASA Astrophysics Data System (ADS)

    Huang, Thomas S.; Zeng, Zhihong

    2007-11-01

    Automatic affective expression recognition has attracted more and more attention from researchers in different disciplines; it will significantly contribute to a new paradigm for human-computer interaction (affect-sensitive interfaces, socially intelligent environments) and advance research in affect-related fields including psychology, psychiatry, and education. Multimodal information integration is a process that enables humans to assess affective states robustly and flexibly. In order to understand the richness and subtlety of human emotion behavior, the computer should be able to integrate information from multiple sensors. We introduce in this paper our efforts toward machine understanding of audio-visual affective behavior, based on both deliberate and spontaneous displays. Some promising methods are presented to integrate information from both audio and visual modalities. Our experiments show the advantage of audio-visual fusion in affective expression recognition over audio-only or visual-only approaches.

  1. Emotion and Perception: The Role of Affective Information

    PubMed Central

    Zadra, Jonathan R.; Clore, Gerald L.

    2011-01-01

    Visual perception and emotion are traditionally considered separate domains of study. In this article, however, we review research showing them to be less separable than usually assumed. In fact, emotions routinely affect how and what we see. Fear, for example, can affect low-level visual processes, sad moods can alter susceptibility to visual illusions, and goal-directed desires can change the apparent size of goal-relevant objects. In addition, the layout of the physical environment, including the apparent steepness of a hill and the distance to the ground from a balcony, can both be affected by emotional states. We propose that emotions provide embodied information about the costs and benefits of anticipated action, information that can be used automatically and immediately, circumventing the need for cogitating on the possible consequences of potential actions. Emotions thus provide a strong motivating influence on how the environment is perceived. PMID:22039565

  2. Uncertainty principle in human visual perception

    NASA Astrophysics Data System (ADS)

    Trifonov, Mikhael I.; Ugolev, Dmitry A.

    1994-05-01

    The orthodox data concerning contrast sensitivity estimation for sine-wave gratings were formally analyzed. The analysis yields a threshold energy value ΔE, an energetic equivalent of a quantum of perception, given by ΔE = αΔLΔX², where α is a proportionality coefficient, ΔL is the threshold luminance, and ΔX is the half-period of the grating. The value of ΔE is constant for a given mean luminance L of the grating and for the middle spatial frequency region. An 'exchange' therefore takes place between the luminance threshold ΔL and the spatial resolution ΔX²: increasing one is accompanied by decreasing the other. We treated this phenomenon as an uncertainty principle in human visual perception and verified its correctness for other spatial frequencies. By also taking into account a threshold wavelength (Δλ) and time (Δt), the uncertainty principle may be extended to a wider class of visual perception problems, including the recognition of colored and flickering objects. We therefore suggest that the uncertainty principle proposed above is one of the cornerstones of the evolution of cognitive systems.
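
    The trade-off stated by the relation ΔE = αΔLΔX² can be made concrete with a short numerical sketch: holding ΔE and α fixed, halving the half-period ΔX forces a fourfold increase in the threshold luminance ΔL. All numerical values below are arbitrary illustrations, not data from the paper.

```python
# Numerical sketch of the trade-off implied by dE = alpha * dL * dX**2:
# if dE is constant at a fixed mean luminance, finer spatial resolution (smaller dX)
# must be paid for with a higher luminance threshold dL. All numbers are illustrative.
alpha = 1.0                 # proportionality coefficient (arbitrary units)
dE = 4.0e-3                 # assumed constant "quantum of perception"

for dX in (2.0, 1.0, 0.5, 0.25):            # half-period of the grating (deg)
    dL = dE / (alpha * dX**2)               # threshold luminance required
    print(f"dX = {dX:5.2f} deg  ->  dL = {dL:.4f} (arbitrary luminance units)")
# Halving dX raises the required dL fourfold: the 'exchange' described in the abstract.
```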

  3. Auditory motion affects visual biological motion processing.

    PubMed

    Brooks, A; van der Zwan, R; Billard, A; Petreska, B; Clarke, S; Blanke, O

    2007-02-01

    The processing of biological motion is a critical, everyday task performed with remarkable efficiency by human sensory systems. Interest in this ability has focused to a large extent on biological motion processing in the visual modality (see, for example, Cutting, J. E., Moore, C., & Morrison, R. (1988). Masking the motions of human gait. Perception and Psychophysics, 44(4), 339-347). In naturalistic settings, however, it is often the case that biological motion is defined by input to more than one sensory modality. For this reason, here in a series of experiments we investigate behavioural correlates of multisensory, in particular audiovisual, integration in the processing of biological motion cues. More specifically, using a new psychophysical paradigm we investigate the effect of suprathreshold auditory motion on perceptions of visually defined biological motion. Unlike data from previous studies investigating audiovisual integration in linear motion processing [Meyer, G. F. & Wuerger, S. M. (2001). Cross-modal integration of auditory and visual motion signals. Neuroreport, 12(11), 2557-2560; Wuerger, S. M., Hofbauer, M., & Meyer, G. F. (2003). The integration of auditory and motion signals at threshold. Perception and Psychophysics, 65(8), 1188-1196; Alais, D. & Burr, D. (2004). No direction-specific bimodal facilitation for audiovisual motion detection. Cognitive Brain Research, 19, 185-194], we report the existence of direction-selective effects: relative to control (stationary) auditory conditions, auditory motion in the same direction as the visually defined biological motion target increased its detectability, whereas auditory motion in the opposite direction had the inverse effect. Our data suggest these effects do not arise through general shifts in visuo-spatial attention, but instead are a consequence of motion-sensitive, direction-tuned integration mechanisms that are, if not unique to biological visual motion, at least not common to all types of

  4. Contribution of a visual pigment absorption spectrum to a visual function: depth perception in a jumping spider.

    PubMed

    Nagata, Takashi; Arikawa, Kentaro; Terakita, Akihisa

    2013-01-01

    Absorption spectra of visual pigments are adaptively tuned to optimize informational capacity in most visual systems. Our recent investigation of the eyes of the jumping spider reveals an apparent exception: the absorption characteristics of a visual pigment cause defocusing of the image, reducing visual acuity generally in a part of the retina. However, the amount of defocus can theoretically provide a quantitative indication of the distance of an object. Therefore, we proposed a novel mechanism for depth perception in jumping spiders based on image defocus. Behavioral experiments revealed that the depth perception of the spider depended on the wavelength of the ambient light, which affects the amount of defocus because of chromatic aberration of the lens. This wavelength effect on depth perception was in close agreement with theoretical predictions based on our hypothesis. These data strongly support the hypothesis that the depth perception mechanism of jumping spiders is based on image defocus.

  5. Contribution of a visual pigment absorption spectrum to a visual function: depth perception in a jumping spider.

    PubMed

    Nagata, Takashi; Arikawa, Kentaro; Terakita, Akihisa

    2013-01-01

    Absorption spectra of visual pigments are adaptively tuned to optimize informational capacity in most visual systems. Our recent investigation of the eyes of the jumping spider reveals an apparent exception: the absorption characteristics of a visual pigment cause defocusing of the image, reducing visual acuity generally in a part of the retina. However, the amount of defocus can theoretically provide a quantitative indication of the distance of an object. Therefore, we proposed a novel mechanism for depth perception in jumping spiders based on image defocus. Behavioral experiments revealed that the depth perception of the spider depended on the wavelength of the ambient light, which affects the amount of defocus because of chromatic aberration of the lens. This wavelength effect on depth perception was in close agreement with theoretical predictions based on our hypothesis. These data strongly support the hypothesis that the depth perception mechanism of jumping spiders is based on image defocus. PMID:27493545

  6. Affective Education for Visually Impaired Children.

    ERIC Educational Resources Information Center

    Locke, Don C.; Gerler, Edwin R., Jr.

    1981-01-01

    Evaluated the effectiveness of the Human Development Program (HDP) and the Developing Understanding of Self and Others (DUSO) program used with visually impaired children. Although HDP and DUSO affected the behavior of visually impaired children, they did not have any effect on children's attitudes toward school. (RC)

  7. Attention, Visual Perception and their Relationship to Sport Performance in Fencing.

    PubMed

    Hijazi, Mona Mohamed Kamal

    2013-12-18

    Attention and visual perception are important in fencing, as they affect the levels of performance and achievement in fencers. This study identifies the levels of attention and visual perception among male and female fencers and the relationship between attention and visual perception dimensions and the sport performance in fencing. The researcher employed a descriptive method in a sample of 16 fencers during the 2010/2011 season. The sample comprised eight males and eight females who participated in the 11-year stage of the Cairo Championships. The Test of Attentional and Interpersonal Style, which was designed by Nideffer and translated by Allawi (1998), was applied. The test consisted of 59 statements that measured seven dimensions. The Test of Visual Perception Skills designed by Alsmadune (2005), which includes seven dimensions, was also used. Among females, a positive and statistically significant correlation between the achievement level and Visual Discrimination, Visual-Spatial Relationships, Visual Sequential Memory, Narrow Attentional Focus and Information Processing was observed, while among males, there was a positive and statistically significant correlation between the achievement level and Visual Discrimination, Visual Sequential Memory, Broad External Attentional Focus and Information Processing. For both males and females, a positive and statistically significant correlation between achievement level and Visual Discrimination, Visual Sequential Memory, Broad External Attentional Focus, Narrow Attentional Focus and Information Processing was found. There were statistically significant differences between males and females in Visual Discrimination and Visual-Form Constancy. PMID:24511355

  8. Attention, Visual Perception and their Relationship to Sport Performance in Fencing

    PubMed Central

    Hijazi, Mona Mohamed Kamal

    2013-01-01

    Attention and visual perception are important in fencing, as they affect the levels of performance and achievement in fencers. This study identifies the levels of attention and visual perception among male and female fencers and the relationship between attention and visual perception dimensions and the sport performance in fencing. The researcher employed a descriptive method in a sample of 16 fencers during the 2010/2011 season. The sample comprised eight males and eight females who participated in the 11-year stage of the Cairo Championships. The Test of Attentional and Interpersonal Style, which was designed by Nideffer and translated by Allawi (1998), was applied. The test consisted of 59 statements that measured seven dimensions. The Test of Visual Perception Skills designed by Alsmadune (2005), which includes seven dimensions, was also used. Among females, a positive and statistically significant correlation between the achievement level and Visual Discrimination, Visual-Spatial Relationships, Visual Sequential Memory, Narrow Attentional Focus and Information Processing was observed, while among males, there was a positive and statistically significant correlation between the achievement level and Visual Discrimination, Visual Sequential Memory, Broad External Attentional Focus and Information Processing. For both males and females, a positive and statistically significant correlation between achievement level and Visual Discrimination, Visual Sequential Memory, Broad External Attentional Focus, Narrow Attentional Focus and Information Processing was found. There were statistically significant differences between males and females in Visual Discrimination and Visual-Form Constancy. PMID:24511355

  9. Visual scene perception in navigating wood ants.

    PubMed

    Lent, David D; Graham, Paul; Collett, Thomas S

    2013-04-22

    Ants, like honeybees, can set their travel direction along foraging routes using just the surrounding visual panorama. This ability gives us a way to explore how visual scenes are perceived. By training wood ants to follow a path in an artificial scene and then examining their path within transformed scenes, we identify several perceptual operations that contribute to the ants' choice of direction. The first is a novel extension to the known ability of insects to compute the "center of mass" of large shapes: ants learn a desired heading toward a point on a distant shape as the proportion of the shape that lies to the left and right of the aiming point--the 'fractional position of mass' (FPM). The second operation, the extraction of local visual features like oriented edges, is familiar from studies of shape perception. Ants may use such features for guidance by keeping them in desired retinal locations. Third, ants exhibit segmentation. They compute the learned FPM over the whole of a simple scene, but over a segmented region of a complex scene. We suggest how the three operations may combine to provide efficient directional guidance.

  10. Visual perception of female physical attractiveness.

    PubMed Central

    Fan, J.; Liu, F.; Wu, J.; Dai, W.

    2004-01-01

    On the basis of visual assessment of figure drawings and front/profile images, past researchers believed that the waist-hip ratio (WHR) and the body mass index (BMI) were two putative cues to female physical attractiveness. However, this view was not tested on three-dimensional (3D) female images. In the present study, 3D images of 31 Caucasian females having varying body weights (BMI ranged from 16 to 35) were shown to 29 male and 25 female viewers, who were asked to rate the physical attractiveness. The results showed that the body volume divided by the square of the height, defined as volume height index (VHI), is the most important and direct visual determinant of female physical attractiveness. In determining the female attractiveness, human observers may first use VHI as a visual cue, which is also a key indicator of health and fertility owing to its strong linear relation to BMI. To fine-tune the judgement, observers may then use body proportions, the most important of which are the ratio of waist height over the chin height (WHC) (a measure of the length of legs over total tallness) and the deviation of WHR from the ideal ratio. It also appears that the effect of the body's physical parameters on the perception of female physical attractiveness conforms to Stevens' power law of psychophysics. PMID:15101692
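
    The volume height index is defined in the abstract as body volume divided by the square of height, and its strong linear relation to BMI follows if body density is roughly constant across people. The sketch below makes that relation explicit with an invented example person; the density value and measurements are assumptions for illustration only.

```python
# Sketch of the volume height index (VHI = body volume / height**2) and its
# near-linear relation to BMI when body density is roughly constant.
# The example person below is invented for illustration.
height_m = 1.70
mass_kg = 65.0
body_density_kg_per_m3 = 1010.0          # approximate, assumed constant across people

volume_m3 = mass_kg / body_density_kg_per_m3
vhi = volume_m3 / height_m**2            # volume height index (m^3 / m^2 = m)
bmi = mass_kg / height_m**2              # body mass index (kg / m^2)

print(f"VHI = {vhi:.4f} m, BMI = {bmi:.1f} kg/m^2")
# Since volume = mass / density, VHI = BMI / density: a fixed density makes
# VHI a linear function of BMI, consistent with the relation noted in the abstract.
```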

  11. Visual Perception of Force: Comment on White (2012)

    ERIC Educational Resources Information Center

    Hubbard, Timothy L.

    2012-01-01

    White (2012) proposed that kinematic features in a visual percept are matched to stored representations containing information regarding forces (based on prior haptic experience) and that information in the matched, stored representations regarding forces is then incorporated into visual perception. Although some elements of White's (2012) account…

  12. Effective Connectivity within Human Primary Visual Cortex Predicts Interindividual Diversity in Illusory Perception

    PubMed Central

    Schwarzkopf, D. Samuel; Lutti, Antoine; Li, Baojuan; Kanai, Ryota; Rees, Geraint

    2013-01-01

    Visual perception depends strongly on spatial context. A classic example is the tilt illusion where the perceived orientation of a central stimulus differs from its physical orientation when surrounded by tilted spatial contexts. Here we show that such contextual modulation of orientation perception exhibits trait-like interindividual diversity that correlates with interindividual differences in effective connectivity within human primary visual cortex. We found that the degree to which spatial contexts induced illusory orientation perception, namely, the magnitude of the tilt illusion, varied across healthy human adults in a trait-like fashion independent of stimulus size or contrast. Parallel to contextual modulation of orientation perception, the presence of spatial contexts affected effective connectivity within human primary visual cortex between peripheral and foveal representations that responded to spatial context and central stimulus, respectively. Importantly, this effective connectivity from peripheral to foveal primary visual cortex correlated with interindividual differences in the magnitude of the tilt illusion. Moreover, this correlation with illusion perception was observed for effective connectivity under tilted contextual stimulation but not for that under iso-oriented contextual stimulation, suggesting that it reflected the impact of orientation-dependent intra-areal connections. Our findings revealed an interindividual correlation between intra-areal connectivity within primary visual cortex and contextual influence on orientation perception. This neurophysiological-perceptual link provides empirical evidence for theoretical proposals that intra-areal connections in early visual cortices are involved in contextual modulation of visual perception. PMID:24285885

  13. Contrast affects flicker and speed perception differently

    NASA Technical Reports Server (NTRS)

    Thompson, P.; Stone, L. S.

    1997-01-01

    We have previously shown that contrast affects speed perception, with lower-contrast, drifting gratings perceived as moving slower. In a recent study, we examined the implications of this result on models of speed perception that use the amplitude of the response of linear spatio-temporal filters to determine speed. In this study, we investigate whether the contrast dependence of speed can be understood within the context of models in which speed estimation is made using the temporal frequency of the response of linear spatio-temporal filters. We measured the effect of contrast on flicker perception and found that contrast manipulations produce opposite effects on perceived drift rate and perceived flicker rate, i.e., reducing contrast increases the apparent temporal frequency of counterphase modulated gratings. This finding argues that, if a temporal frequency-based algorithm underlies speed perception, either flicker and speed perception must not be based on the output of the same mechanism or contrast effects on perceived spatial frequency reconcile the disparate effects observed for perceived temporal frequency and speed.

  14. Hand Positions Alter Bistable Visual Motion Perception.

    PubMed

    Saito, Godai; Gyoba, Jiro

    2016-05-01

    We found that a hand posture with the palms together located just below the stream/bounce display could increase the proportion of bouncing perception. This effect, called the hands-induced bounce (HIB) effect, did not occur in the hands-cross condition or in the one-hand condition. By using rubber hands or covering the participants' hands with a cloth, we demonstrated that the visual information of the hand shapes was not a critical factor in producing the HIB effect, whereas proprioceptive information seemed to be important. We also found that the HIB effect did not occur when the participants' hands were far from the coincidence point, suggesting that the HIB effect might be produced within a limited spatial area around the hands. PMID:27433332

  15. Visual extinction and prior entry: impaired perception of temporal order with intact motion perception after unilateral parietal damage.

    PubMed

    Rorden, C; Mattingley, J B; Karnath, H O; Driver, J

    1997-04-01

    Two patients with left-sided visual extinction after right parietal damage were each given two 'prior entry' tasks that have recently been used to study attentional biases in normals. The first task presented two unconnected bars, one in each visual field, with the patients asked to judge which appeared sooner. Both patients reported that the right bar preceded the left unless the latter led by over 200 msec, suggesting a severe bias to the right affecting the time-course of visual awareness. The second task presented one continuous line in a scrolling format across the same spatial extent, with the patients asked to judge which direction the line moved in. The patients now performed normally. Thus, the perception of temporal order for separate events was impaired by the lesions, but without disrupting motion perception within single events. The implications are discussed for theories of normal and pathological attention, visual awareness, and motion perception.

  16. Refractive Errors Affect the Vividness of Visual Mental Images

    PubMed Central

    Palermo, Liana; Nori, Raffaella; Piccardi, Laura; Zeri, Fabrizio; Babino, Antonio; Giusberti, Fiorella; Guariglia, Cecilia

    2013-01-01

    The hypothesis that visual perception and mental imagery are equivalent has never been explored in individuals with vision defects that do not prevent visual perception of the world, such as refractive errors. Refractive error (i.e., myopia, hyperopia or astigmatism) is a condition where the refracting system of the eye fails to focus objects sharply on the retina. As a consequence, refractive errors cause blurred vision. We subdivided 84 individuals according to their spherical equivalent refraction into Emmetropes (control individuals without refractive errors) and Ametropes (individuals with refractive errors). Participants performed a vividness task and completed a questionnaire that explored their cognitive style of thinking before their vision was checked by an ophthalmologist. Although results showed that Ametropes had less vivid mental images than Emmetropes, this did not affect the development of their cognitive style of thinking; in fact, Ametropes were able to use both verbal and visual strategies to acquire and retrieve information. Present data are consistent with the hypothesis of equivalence between imagery and perception. PMID:23755186
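
    The grouping variable in this study is spherical equivalent refraction, conventionally computed as the sphere power plus half the cylinder power. The sketch below shows that computation and a simple emmetrope/ametrope split; the ±0.50 D cutoff and the example prescriptions are assumptions for illustration, not the criteria reported in the paper.

```python
# Sketch of the spherical equivalent refraction commonly used to summarize a
# spectacle prescription: SE = sphere + cylinder / 2 (diopters). The +/-0.50 D
# cutoff used for grouping below is an assumption, not taken from the paper.
def spherical_equivalent(sphere_d: float, cylinder_d: float) -> float:
    return sphere_d + cylinder_d / 2.0

def classify(se_d: float, cutoff_d: float = 0.50) -> str:
    return "Emmetrope" if abs(se_d) <= cutoff_d else "Ametrope"

prescriptions = [(-2.25, -0.50), (0.00, -0.25), (+1.75, 0.00)]  # invented examples
for sphere, cyl in prescriptions:
    se = spherical_equivalent(sphere, cyl)
    print(f"sphere {sphere:+.2f} D, cyl {cyl:+.2f} D -> SE {se:+.2f} D ({classify(se)})")
```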

  17. Visual Perception Associated With Diabetes Mellitus

    NASA Astrophysics Data System (ADS)

    Suaste, Ernesto

    2004-09-01

    We designed and implemented an instrumental methodology for analyzing the pupillary response to chromatic stimuli in order to observe the changes in pupillary area during contraction and dilation in diabetic patients. Visual stimuli in the visible spectrum (400 nm-650 nm) were used. Three different programs were used to determine the stimulation that yields the best-contrasted pupillary response for diagnosing the visual perception of colors. The stimulators PG0, PG12 and PG20 were designed in our laboratory. The test was carried out with 44 people: 33 men, 10 women and a boy (22-52 and 6 years of age); 12 were tested with the stimulator PG0, 21 with PG12 and 17 with PG20, and 7 subjects participated in more than one test. According to the Ishihara plates, 40 of these subjects have normal color vision, one subject has dichromacy (an inability to differentiate or perceive red and green), and three of them show deficiencies in observing the blue and red parts of the spectrum (they have type II diabetes mellitus). With this instrumental methodology, we intend to obtain an indicator based on pupillary variability for the early diagnosis of diabetes mellitus, as well as a monitoring instrument for it.

  18. Volumic visual perception: principally novel concept

    NASA Astrophysics Data System (ADS)

    Petrov, Valery

    1996-01-01

    The general concept of volumic view (VV) as a universal property of space is introduced. VV exists at every point of the universe that electromagnetic (EM) waves can reach and where a point or quasi-point receiver (detector) of EM waves can be placed. A classification of receivers is given for the first time. They are classified into three main categories: biological, man-made non-biological, and mathematically specified hypothetical receivers. The principally novel concept of volumic perception is introduced. It differs chiefly from the traditional concept, which traces back to Euclid and pre-Euclidean times and, much later, to Leonardo da Vinci's and Giovanni Battista della Porta's discoveries and to practical stereoscopy as introduced by C. Wheatstone. The basic idea of the novel concept is that humans and animals acquire volumic visual data flows in series rather than in parallel. In this case the brain is freed from extremely sophisticated real-time parallel processing of two volumic visual data flows in order to combine them. Such a procedure seems hardly probable even for humans, who are unable to combine two primitive static stereoscopic images into one in less than a few seconds. Some people are unable to perform this procedure at all.

  19. Dynamic visual perception of dyslexic children.

    PubMed

    Fischer, B; Hartnegg, K; Mokler, A

    2000-01-01

    This study describes the capacity of children to detect fast changes of a small visual pattern. Three visual detection tasks were used with a group of normally reading children (N = 140) and a group of dyslexic children (N = 366) in the age range of 7 to 16 years. All three tasks require the detection of the fast-changing orientation of a small pattern before it disappears. In one task, stationary fixation was required, because the orientation changes always took place at the same location. In the saccade condition, the pattern was displaced suddenly to one or the other side and a saccade was required to detect the orientation. In a third condition, a distractor was presented at one side shortly before the oriented pattern appeared at the opposite side. In this case, an antisaccade with respect to the distractor was required. In all three conditions, the dyslexic group as a whole performed significantly below the level of the control group. The performance improved with age in both groups. The differences between the test and control group were largest in the distractor condition. When compared with eye-movement performance in an antisaccade task, a parallel development of the performance of both tasks was observed in both groups. The study shows that a certain percentage of dyslexic children has difficulties in the perception of fast-changing stimuli, a task presumably challenging the magnocellular system.

  20. Visual perception of axes of head rotation

    PubMed Central

    Arnoldussen, D. M.; Goossens, J.; van den Berg, A. V.

    2013-01-01

    Registration of ego-motion is important to accurately navigate through space. Movements of the head and eye relative to space are registered through the vestibular system and optical flow, respectively. Here, we address three questions concerning the visual registration of self-rotation. (1) Eye-in-head movements provide a link between the motion signals received by sensors in the moving eye and sensors in the moving head. How are these signals combined into an ego-rotation percept? We combined optic flow of simulated forward and rotational motion of the eye with different levels of eye-in-head rotation for a stationary head. We dissociated simulated gaze rotation and head rotation by different levels of eye-in-head pursuit. We found that perceived rotation matches simulated head rotation, not gaze rotation. This rejects a model for perceived self-rotation that relies on the rotation of the gaze line. Rather, eye-in-head signals serve to transform the optic flow's rotation information, which specifies rotation of the scene relative to the eye, into a rotation relative to the head. This suggests that transformed visual self-rotation signals may combine with vestibular signals. (2) Do transformed visual self-rotation signals reflect the arrangement of the semi-circular canals (SCC)? Previously, we found sub-regions within MST and V6+ that respond to the speed of the simulated head rotation. Here, we re-analyzed those blood oxygenation level-dependent (BOLD) signals for the presence of a spatial dissociation related to the axes of visually simulated head rotation, such as have been found in sub-cortical regions of various animals. On the contrary, we found a rather uniform BOLD response to simulated rotation along the three SCC axes. (3) We investigated whether subjects' sensitivity to the direction of the head rotation axis shows SCC axis specificity. We found that sensitivity to head rotation is rather uniformly distributed, suggesting that in human cortex, visuo-vestibular integration is

  1. Word selection affects perceptions of synthetic biology.

    PubMed

    Pearson, Brianna; Snell, Sam; Bye-Nagel, Kyri; Tonidandel, Scott; Heyer, Laurie J; Campbell, A Malcolm

    2011-01-01

    Members of the synthetic biology community have discussed the significance of word selection when describing synthetic biology to the general public. In particular, many leaders proposed that the word "create" was laden with negative connotations. We found that word choice and framing do affect public perception of synthetic biology. In a controlled experiment, participants perceived synthetic biology more negatively when "create" was used to describe the field compared to "construct" (p = 0.008). Contrary to popular opinion among synthetic biologists, however, low-religiosity individuals were more negatively influenced by the framing manipulation than high-religiosity people. Our results suggest that synthetic biologists directly influence public perception of their field through avoidance of the word "create". PMID:21777466

  2. Do Hearing Aids Improve Affect Perception?

    PubMed

    Schmidt, Juliane; Herzog, Diana; Scharenborg, Odette; Janse, Esther

    2016-01-01

    Normal-hearing listeners use acoustic cues in speech to interpret a speaker's emotional state. This study investigates the effect of hearing aids on the perception of the emotion dimensions arousal (aroused/calm) and valence (positive/negative attitude) in older adults with hearing loss. More specifically, we investigate whether wearing a hearing aid improves the correlation between affect ratings and affect-related acoustic parameters. To that end, affect ratings by 23 hearing-aid users were compared for aided and unaided listening. Moreover, these ratings were compared to the ratings by an age-matched group of 22 participants with age-normal hearing. For arousal, hearing-aid users rated utterances as generally more aroused in the aided than in the unaided condition. Intensity differences were the strongest indicator of degree of arousal. Among the hearing-aid users, those with poorer hearing used additional prosodic cues (i.e., tempo and pitch) for their arousal ratings, compared to those with relatively good hearing. For valence, pitch was the only acoustic cue that was associated with valence. Neither listening condition nor hearing loss severity (differences among the hearing-aid users) influenced affect ratings or the use of affect-related acoustic parameters. Compared to the normal-hearing reference group, ratings of hearing-aid users in the aided condition did not generally differ on either emotion dimension. However, hearing-aid users were more sensitive to intensity differences in their arousal ratings than the normal-hearing participants. We conclude that the use of hearing aids is important for the rehabilitation of affect perception and particularly influences the interpretation of arousal. PMID:27080645

  3. Affective and motivational influences in person perception

    PubMed Central

    Kuzmanovic, Bojana; Jefferson, Anneli; Bente, Gary; Vogeley, Kai

    2013-01-01

    Interpersonal impression formation is highly consequential for social interactions in private and public domains. These perceptions of others rely on different sources of information and processing mechanisms, all of which have been investigated in independent research fields. In social psychology, inferences about states and traits of others as well as activations of semantic categories and corresponding stereotypes have attracted great interest. On the other hand, research on emotion and reward demonstrated affective and motivational influences of social cues on the observer, which in turn modulate attention, categorization, evaluation, and decision processes. While inferential and categorical social processes have been shown to recruit a network of cortical brain regions associated with mentalizing and evaluation, the affective influence of social cues has been linked to subcortical areas that play a central role in detection of salient sensory input and reward processing. In order to extend existing integrative approaches to person perception, both the inferential-categorical processing of information about others, and affective and motivational influences of this information on the beholder should be taken into account. PMID:23781188

  4. Affective and motivational influences in person perception.

    PubMed

    Kuzmanovic, Bojana; Jefferson, Anneli; Bente, Gary; Vogeley, Kai

    2013-01-01

    Interpersonal impression formation is highly consequential for social interactions in private and public domains. These perceptions of others rely on different sources of information and processing mechanisms, all of which have been investigated in independent research fields. In social psychology, inferences about states and traits of others as well as activations of semantic categories and corresponding stereotypes have attracted great interest. On the other hand, research on emotion and reward demonstrated affective and motivational influences of social cues on the observer, which in turn modulate attention, categorization, evaluation, and decision processes. While inferential and categorical social processes have been shown to recruit a network of cortical brain regions associated with mentalizing and evaluation, the affective influence of social cues has been linked to subcortical areas that play a central role in detection of salient sensory input and reward processing. In order to extend existing integrative approaches to person perception, both the inferential-categorical processing of information about others, and affective and motivational influences of this information on the beholder should be taken into account. PMID:23781188

  5. [Visual perception deficits of cortical origin].

    PubMed

    Stolarska, Urszula; Zajac, Anna; Skowronek-Bała, Barbara; Budziszewska, Bogusława

    2009-01-01

    This work comprises a literature review of visual perception distortions that have their origin in structural or functional irregularities of the brain, resulting in cortical malfunction. The main area we pay attention to is the cerebral cortex, but we should not forget that diseases destructive to lower brain structures also inevitably lead to secondary dysfunction of the cortex, and thus they have also been included in this paper. Cerebral vision disorders are a small percentage of the caseload in either neurology or ophthalmology practice, yet they certainly are interesting for cognitive scientists, as they open a window onto the complex mechanisms of the cerebral clockwork. We present examples of disorders, many of which engage the creative cooperation of specialists from different fields of neuroscience. Three kinds of disorders are presented: vision loss, agnosias and hallucinations. Among others, there is information on cortical blindness, blindsight, Anton's syndrome, hysterical blindness, apperceptive and associative agnosia, prosopagnosia, pure alexia, achromatopsia, Bonnet syndrome, Alice in Wonderland syndrome, peduncular hallucinosis, etc. PMID:20297642

  6. Lip movements affect infants' audiovisual speech perception.

    PubMed

    Yeung, H Henny; Werker, Janet F

    2013-05-01

    Speech is robustly audiovisual from early in infancy. Here we show that audiovisual speech perception in 4.5-month-old infants is influenced by sensorimotor information related to the lip movements they make while chewing or sucking. Experiment 1 consisted of a classic audiovisual matching procedure, in which two simultaneously displayed talking faces (visual [i] and [u]) were presented with a synchronous vowel sound (audio /i/ or /u/). Infants' looking patterns were selectively biased away from the audiovisual matching face when the infants were producing lip movements similar to those needed to produce the heard vowel. Infants' looking patterns returned to those of a baseline condition (no lip movements, looking longer at the audiovisual matching face) when they were producing lip movements that did not match the heard vowel. Experiment 2 confirmed that these sensorimotor effects interacted with the heard vowel, as looking patterns differed when infants produced these same lip movements while seeing and hearing a talking face producing an unrelated vowel (audio /a/). These findings suggest that the development of speech perception and speech production may be mutually informative.

  7. Behavioral model of visual perception and recognition

    NASA Astrophysics Data System (ADS)

    Rybak, Ilya A.; Golovan, Alexander V.; Gusakova, Valentina I.

    1993-09-01

    In the processes of visual perception and recognition human eyes actively select essential information by way of successive fixations at the most informative points of the image. A behavioral program defining a scanpath of the image is formed at the stage of learning (object memorizing) and consists of sequential motor actions, which are shifts of attention from one point of fixation to another, and sensory signals expected to arrive in response to each shift of attention. In the modern view of the problem, invariant object recognition is provided by the following: (1) separated processing of 'what' (object features) and 'where' (spatial features) information at high levels of the visual system; (2) mechanisms of visual attention using 'where' information; (3) representation of 'what' information in an object-based frame of reference (OFR). However, most recent models of vision based on OFR have demonstrated the ability of invariant recognition of only simple objects like letters or binary objects without background, i.e. objects to which a frame of reference is easily attached. In contrast, we use not an OFR but a feature-based frame of reference (FFR), connected with the basic feature (an edge) at the fixation point. This has provided our model with the ability to represent complex objects in gray-level images invariantly, but it demands realization of the behavioral aspects of vision described above. The developed model contains a neural network subsystem of low-level vision, which extracts a set of primary features (edges) at each fixation, and a high-level subsystem consisting of 'what' (Sensory Memory) and 'where' (Motor Memory) modules. The resolution of primary feature extraction decreases with distance from the point of fixation. The FFR provides both the invariant representation of object features in Sensory Memory and shifts of attention in Motor Memory. Object recognition consists in successive recall (from Motor Memory) and execution of shifts of attention and
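
    The recognition procedure sketched in the abstract, replaying stored shifts of attention and checking the expected sensory signal at each fixation, can be expressed schematically as below. Every data structure, name, and the matching rule in this sketch is a hypothetical simplification for illustration, not the authors' neural-network implementation.

```python
# Highly schematic sketch of the recognition loop described in the abstract:
# replay stored shifts of attention (Motor Memory) and compare the features
# extracted at each fixation with those expected (Sensory Memory).
# All structures, names and the matching rule are hypothetical simplifications.
from typing import Callable, List, Tuple

Point = Tuple[int, int]

def recognize(start: Point,
              motor_memory: List[Point],            # stored shifts of attention (dx, dy)
              sensory_memory: List[List[float]],    # expected edge features per fixation
              extract_features: Callable[[Point], List[float]],
              tolerance: float = 0.2) -> bool:
    """Return True if the scanpath reproduces the memorized feature sequence."""
    fixation = start
    for shift, expected in zip(motor_memory, sensory_memory):
        fixation = (fixation[0] + shift[0], fixation[1] + shift[1])
        observed = extract_features(fixation)
        error = sum(abs(o - e) for o, e in zip(observed, expected)) / len(expected)
        if error > tolerance:        # expectation violated: object not recognized
            return False
    return True

# Toy usage with a fake feature extractor that returns exactly the expected values.
demo_memory = [[0.1, 0.9], [0.7, 0.2]]
print(recognize((0, 0), [(5, 0), (0, 5)], demo_memory,
                extract_features=lambda p: demo_memory[0] if p == (5, 0) else demo_memory[1]))
```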

  8. Phase reset affects auditory-visual simultaneity judgment.

    PubMed

    Kambe, Jun; Kakimoto, Yuta; Araki, Osamu

    2015-10-01

    We continuously receive external information from multiple senses simultaneously. The brain must judge the source event of this sensory information and integrate it. Judging the simultaneity of such multisensory stimuli is thought to be an important cue when we discriminate whether the stimuli are derived from one event or not. Although previous studies have investigated the correspondence between auditory-visual (AV) simultaneity perception and neural responses, such studies are still few. Electrophysiological studies have reported that ongoing oscillations in human cortex affect perception. In particular, the phase resetting of ongoing oscillations has been examined, as it plays an important role in multisensory integration. The aim of this study was to investigate the relationship between phase resetting and AV simultaneity judgments. The subjects were successively presented with auditory and visual stimuli with intervals that were controlled as [Formula: see text], and they were asked to report whether they perceived them as simultaneous or not. We investigated the effects of the phase of ongoing oscillations on simultaneity judgments for AV stimuli with stimulus onset asynchronies (SOAs) at which the detection rate of asynchrony was 50%. It was found that phase resetting in the beta frequency band, in the brain area related to the modality of the following stimulus, occurred after the onset of the preceding stimulus only when the subjects perceived the AV stimuli as simultaneous. This result suggests that beta phase resetting occurred in areas related to the subsequent stimulus, supporting the perception of multisensory stimuli as simultaneous.
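
    The abstract's key measure is stimulus-locked phase resetting in the beta band. One common way to quantify such resetting, sketched below on synthetic trials, is to band-pass each trial, extract instantaneous phase with the Hilbert transform, and compute inter-trial phase coherence over time; the filter settings, the simulated data, and the coherence measure itself are assumptions for illustration and not necessarily the authors' analysis pipeline.

```python
# Sketch of one common way to quantify stimulus-locked phase reset: band-pass
# single trials around the beta band, extract instantaneous phase with the
# Hilbert transform, and compute inter-trial phase coherence (ITC) per sample.
# Synthetic data and filter settings below are illustrative, not the authors' pipeline.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500.0                                   # sampling rate (Hz)
t = np.arange(0, 1.0, 1.0 / fs)              # 1-s epochs, stimulus at t = 0.5 s
rng = np.random.default_rng(1)

def make_trial(reset: bool) -> np.ndarray:
    """20-Hz oscillation whose phase is realigned at stimulus onset if reset=True."""
    phase0 = rng.uniform(0, 2 * np.pi)
    sig = np.sin(2 * np.pi * 20 * t + phase0)
    if reset:
        sig[t >= 0.5] = np.sin(2 * np.pi * 20 * (t[t >= 0.5] - 0.5))  # common phase after onset
    return sig + 0.5 * rng.standard_normal(t.size)

trials = np.array([make_trial(reset=True) for _ in range(60)])

b, a = butter(4, [14 / (fs / 2), 20 / (fs / 2)], btype="band")   # beta band-pass
phases = np.angle(hilbert(filtfilt(b, a, trials, axis=1), axis=1))
itc = np.abs(np.mean(np.exp(1j * phases), axis=0))               # 1 = perfect phase alignment

print(f"ITC before stimulus: {itc[t < 0.5].mean():.2f}")
print(f"ITC after stimulus:  {itc[t > 0.55].mean():.2f}")
```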

  9. Visual and auditory perception in preschool children at risk for dyslexia.

    PubMed

    Ortiz, Rosario; Estévez, Adelina; Muñetón, Mercedes; Domínguez, Carolina

    2014-11-01

    Recently, there has been renewed interest in the perceptive problems of dyslexics. A contentious research issue in this area has been the nature of the perception deficit. Another issue is the causal role of this deficit in dyslexia. Most studies have been carried out in literate adults and children; consequently, the observed deficits may be the result rather than the cause of dyslexia. This study addresses these issues by examining visual and auditory perception in children at risk for dyslexia. We compared preschool children with and without risk for dyslexia on auditory and visual temporal order judgment tasks and same-different discrimination tasks. Identical visual and auditory, linguistic and nonlinguistic stimuli were presented in both tasks. The results revealed that both the visual and the auditory perception of children at risk for dyslexia are impaired. The comparison between groups in auditory and visual perception shows that the achievement of children at risk was lower than that of children without risk for dyslexia on the temporal tasks. There were no differences between groups on the auditory discrimination tasks. The difficulties of children at risk in visual and auditory perceptive processing affected both linguistic and nonlinguistic stimuli. Our conclusion is that children at risk for dyslexia show auditory and visual perceptive deficits for linguistic and nonlinguistic stimuli. The auditory impairment may be explained by temporal processing problems, and these problems are more serious for processing language than for processing other auditory stimuli. These visual and auditory perceptive deficits are not the consequence of failing to learn to read; thus, these findings support the temporal processing deficit theory.

  10. Visual-vestibular integration motion perception reporting

    NASA Technical Reports Server (NTRS)

    Harm, Deborah L.; Reschke, Millard R.; Parker, Donald E.

    1999-01-01

    Self-orientation and self/surround-motion perception derive from a multimodal sensory process that integrates information from the eyes, vestibular apparatus, proprioceptive and somatosensory receptors. Results from short and long duration spaceflight investigations indicate that: (1) perceptual and sensorimotor function was disrupted during the initial exposure to microgravity and gradually improved over hours to days (individuals adapt), (2) the presence and/or absence of information from different sensory modalities differentially affected the perception of orientation, self-motion and surround-motion, (3) perceptual and sensorimotor function was initially disrupted upon return to Earth-normal gravity and gradually recovered to preflight levels (individuals readapt), and (4) the longer the exposure to microgravity, the more complete the adaptation, the more profound the postflight disturbances, and the longer the recovery period to preflight levels. While much has been learned about perceptual and sensorimotor reactions and adaptation to microgravity, there is much remaining to be learned about the mechanisms underlying the adaptive changes, and about how intersensory interactions affect perceptual and sensorimotor function during voluntary movements. During space flight, space motion sickness (SMS) and perceptual disturbances have led to reductions in performance efficiency and sense of well-being. During entry and immediately after landing, such disturbances could have a serious impact on the ability of the commander to land the Orbiter and on the ability of all crew members to egress from the Orbiter, particularly in a non-nominal condition or following extended stays in microgravity. An understanding of spatial orientation and motion perception is essential for developing countermeasures for SMS and perceptual disturbances during spaceflight and upon return to Earth. Countermeasures for optimal performance in flight and a successful return to Earth require

  11. A Dynamic Systems Theory Model of Visual Perception Development

    ERIC Educational Resources Information Center

    Coté, Carol A.

    2015-01-01

    This article presents a model for understanding the development of visual perception from a dynamic systems theory perspective. It contrasts with the hierarchical or reductionist model that is often found in the occupational therapy literature. In this proposed model, vision and ocular motor abilities are not foundational to perception; they are seen…

  12. Flicker-light induced visual phenomena: frequency dependence and specificity of whole percepts and percept features.

    PubMed

    Allefeld, Carsten; Pütz, Peter; Kastner, Kristina; Wackermann, Jiří

    2011-12-01

    Flickering light induces visual hallucinations in human observers. Despite a long history of the phenomenon, little is known about the dependence of flicker-induced subjective impressions on the flicker frequency. We investigate this question using Ganzfeld stimulation and an experimental paradigm combining a continuous frequency scan (1-50 Hz) with a focus on re-occurring, whole percepts. On the single-subject level, we find a high degree of frequency stability of percepts. To generalize across subjects, we apply two rating systems, (1) a set of complex percept classes derived from subjects' reports and (2) an enumeration of elementary percept features, and determine distributions of occurrences over flicker frequency. We observe a stronger frequency specificity for complex percept classes than elementary percept features. Comparing the similarity relations among percept categories to those among frequency profiles, we observe that though percepts are preferentially induced by particular frequencies, the frequency does not unambiguously determine the experienced percept. PMID:21123084

  13. Seeing and seeing: visual perception in art and science

    NASA Astrophysics Data System (ADS)

    Campbell, Peter

    2004-11-01

    This article takes a brief walk through two complex cultures, looking at similarities and differences between them. Visual perception is vital to both art and science, for to see is to understand. The article compares how education in each subject fosters visualization and creative thinking.

  14. Visual perception of texture in aggressive behavior of Betta splendens.

    PubMed

    Bando, T

    1991-07-01

    In order to elucidate the role of texture in fish vision, the agonistic behavior of male Siamese fighting fish (Betta splendens) was tested in response to models composed by means of image processing techniques. With models having the contour shape of a side view of Betta splendens in an aggressive state, responses were vigorous when there was a fine distribution of brightness and naturalistic color, producing textures like a scale pattern. Reactions became weaker as the brightness and color distribution reverted to more homogeneous levels and the scale pattern disappeared. When artificial models with a circular contour shape were used, models with the scale pattern evoked more aggressive behaviors than those without it, while the existence of spherical gradation affected the behavior only slightly. These results suggest that texture plays an important role in fish visual perception.

  15. Visual perception of texture in aggressive behavior of Betta splendens.

    PubMed

    Bando, T

    1991-07-01

    In order to elucidate the role of texture in fish vision, the agonistic behavior of male Siamese fighting fish (Betta splendens) was tested in response to models composed by means of image processing techniques. With models having the contour shape of a side view of Betta splendens in an aggressive state, responses were vigorous when there was a fine distribution of brightness and naturalistic color, producing textures like a scale pattern. Reactions became weaker as the brightness and color distribution reverted to more homogeneous levels and the scale pattern disappeared. When artificial models with a circular contour shape were used, models with the scale pattern evoked more aggressive behaviors than those without it, while the existence of spherical gradation affected the behavior only slightly. These results suggest that texture plays an important role in fish visual perception. PMID:1941718

  16. General Markers of Conscious Visual Perception and Their Timing.

    PubMed

    Rutiku, Renate; Aru, Jaan; Bachmann, Talis

    2016-01-01

    Previous studies have observed different onset times for the neural markers of conscious perception. This variability could be attributed to procedural differences between studies. Here we show that the onset times for the markers of conscious visual perception can strongly vary even within a single study. A heterogeneous stimulus set was presented at threshold contrast. Trials with and without conscious perception were contrasted on 100 balanced subsets of the data. Importantly, the 100 subsets with heterogeneous stimuli did not differ in stimulus content, but only with regard to specific trials used. This approach enabled us to study general markers of conscious visual perception independent of stimulus content, characterize their onset and its variability within one study. N200 and P300 were the two reliable markers of conscious visual perception common to all perceived stimuli and absent for all non-perceived stimuli. The estimated mean onset latency for both markers was shortly after 200 ms. However, the onset latency of these markers was associated with considerable variability depending on which subsets of the data were considered. We show that it is first and foremost the amplitude fluctuation in the condition without conscious perception that explains the observed variability in onset latencies of the markers of conscious visual perception.
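
    A schematic of the resampling logic described above, using synthetic data (the sampling rate, trial counts, and onset criterion are assumptions, not the study's parameters): draw many balanced subsets of perceived and non-perceived trials, compute the difference wave for each subset, and record the first post-stimulus time point at which it exceeds a baseline-derived threshold.

      import numpy as np

      rng = np.random.default_rng(1)
      fs, n_time = 250, 250                       # assumed: 250 Hz, 1 s epochs, stimulus at 0 ms
      t = np.arange(n_time) / fs * 1000           # time in ms

      def make_trials(n, effect):
          # Synthetic ERPs: perceived trials carry a component starting around 200 ms.
          erp = effect * np.where(t > 200, np.sin((t - 200) / 80), 0.0)
          return erp + rng.standard_normal((n, n_time))

      perceived, not_perceived = make_trials(300, 1.0), make_trials(400, 0.0)

      def onset_latency(a, b, n_per_cell=150):
          # One balanced subset: equal numbers of perceived and non-perceived trials.
          sub_a = a[rng.choice(len(a), n_per_cell, replace=False)].mean(axis=0)
          sub_b = b[rng.choice(len(b), n_per_cell, replace=False)].mean(axis=0)
          diff = sub_a - sub_b
          thresh = 3 * diff[t < 100].std()        # assumed criterion: 3 SD of the early window
          above = np.flatnonzero((np.abs(diff) > thresh) & (t > 100))
          return t[above[0]] if above.size else np.nan

      onsets = np.array([onset_latency(perceived, not_perceived) for _ in range(100)])
      print(f"onset latency: mean {np.nanmean(onsets):.0f} ms, SD {np.nanstd(onsets):.0f} ms")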

  17. General Markers of Conscious Visual Perception and Their Timing

    PubMed Central

    Rutiku, Renate; Aru, Jaan; Bachmann, Talis

    2016-01-01

    Previous studies have observed different onset times for the neural markers of conscious perception. This variability could be attributed to procedural differences between studies. Here we show that the onset times for the markers of conscious visual perception can strongly vary even within a single study. A heterogeneous stimulus set was presented at threshold contrast. Trials with and without conscious perception were contrasted on 100 balanced subsets of the data. Importantly, the 100 subsets with heterogeneous stimuli did not differ in stimulus content, but only with regard to specific trials used. This approach enabled us to study general markers of conscious visual perception independent of stimulus content, characterize their onset and its variability within one study. N200 and P300 were the two reliable markers of conscious visual perception common to all perceived stimuli and absent for all non-perceived stimuli. The estimated mean onset latency for both markers was shortly after 200 ms. However, the onset latency of these markers was associated with considerable variability depending on which subsets of the data were considered. We show that it is first and foremost the amplitude fluctuation in the condition without conscious perception that explains the observed variability in onset latencies of the markers of conscious visual perception. PMID:26869905

  18. Close to me? The influence of affective closeness on space perception.

    PubMed

    Morgado, Nicolas; Muller, Dominique; Gentaz, Edouard; Palluel-Germain, Richard

    2011-01-01

    Recent data show that psychosocial factors affect visual perception. We tested this hypothesis by investigating the relationship between affective closeness and the perception of apertures between two people. People feel discomfort when they are near someone they are not affectively close to. Therefore, we predict that they will be less likely to perceive that they can pass between two people not affectively close to them. Participants had to imagine passing through the aperture between two life-size classmate pictures. We found that the closer participants felt to their classmates, the more they felt able to pass between them. This provides the first evidence of a relationship between affective closeness and the perception of aperture between two people, suggesting that psychosocial factors constrain space perception.

  19. How do musical tonality and experience affect visual working memory?

    PubMed

    Yang, Hua; Lu, Jing; Gong, Diankun; Yao, Dezhong

    2016-01-20

    The influence of music on the human brain has continued to attract increasing attention from neuroscientists and musicologists. Currently, tonal music is widely present in people's daily lives; however, atonal music has gradually become an important part of modern music. In this study, we conducted two experiments: the first tested for differences in the perceived distractibility of tonal versus atonal music. The second tested how tonal and atonal music affect visual working memory by comparing musicians and nonmusicians placed in contexts with background tonal music, atonal music, or silence. They were instructed to complete a delayed matching memory task. The results show that musicians and nonmusicians evaluate the distractibility of tonal and atonal music differently, possibly indicating that long-term training leads to a higher auditory perception threshold among musicians. For the working memory task, musicians reacted faster than nonmusicians under all background conditions, and musicians took more time to respond with tonal background music than in the other conditions. Therefore, our results suggest that for a visual memory task, background tonal music may occupy more cognitive resources than atonal music or silence for musicians, leaving fewer resources for the memory task. Moreover, the musicians outperformed the nonmusicians, possibly because of their higher sensitivity to background music, although this needs to be confirmed in a further longitudinal study.

  20. Visual Cues for Enhancing Depth Perception.

    ERIC Educational Resources Information Center

    O'Donnell, L. M.; Smith, A. J.

    1994-01-01

    This article describes the physiological mechanisms involved in three-dimensional depth perception and presents a variety of distance and depth cues and strategies for detecting and estimating curbs and steps for individuals with impaired vision. (Author/DB)

  1. High visual resolution matters in audiovisual speech perception, but only for some.

    PubMed

    Alsius, Agnès; Wayne, Rachel V; Paré, Martin; Munhall, Kevin G

    2016-07-01

    The basis for individual differences in the degree to which visual speech input enhances comprehension of acoustically degraded speech is largely unknown. Previous research indicates that fine facial detail is not critical for visual enhancement when auditory information is available; however, these studies did not examine individual differences in ability to make use of fine facial detail in relation to audiovisual speech perception ability. Here, we compare participants based on their ability to benefit from visual speech information in the presence of an auditory signal degraded with noise, modulating the resolution of the visual signal through low-pass spatial frequency filtering and monitoring gaze behavior. Participants who benefited most from the addition of visual information (high visual gain) were more adversely affected by the removal of high spatial frequency information, compared to participants with low visual gain, for materials with both poor and rich contextual cues (i.e., words and sentences, respectively). Differences as a function of gaze behavior between participants with the highest and lowest visual gains were observed only for words, with participants with the highest visual gain fixating longer on the mouth region. Our results indicate that the individual variance in audiovisual speech in noise performance can be accounted for, in part, by better use of fine facial detail information extracted from the visual signal and increased fixation on mouth regions for short stimuli. Thus, for some, audiovisual speech perception may suffer when the visual input (in addition to the auditory signal) is less than perfect.

  2. The Affective Bases of Risk Perception: Negative Feelings and Stress Mediate the Relationship between Mental Imagery and Risk Perception

    PubMed Central

    Sobkow, Agata; Traczyk, Jakub; Zaleskiewicz, Tomasz

    2016-01-01

    Recent research has documented that affect plays a crucial role in risk perception. When no information about numerical risk estimates is available (e.g., probability of loss or magnitude of consequences), people may rely on positive and negative affect toward perceived risk. However, determinants of affective reactions to risks are poorly understood. In a series of three experiments, we addressed the question of whether and to what degree mental imagery eliciting negative affect and stress influences risk perception. In each experiment, participants were instructed to visualize consequences of risk taking and to rate riskiness. In Experiment 1, participants who imagined negative risk consequences reported more negative affect and perceived risk as higher compared to the control condition. In Experiment 2, we found that this effect was driven by affect elicited by mental imagery rather than its vividness and intensity. In this study, imagining positive risk consequences led to lower perceived risk than visualizing negative risk consequences. Finally, we tested the hypothesis that negative affect related to higher perceived risk was caused by negative feelings of stress. In Experiment 3, we introduced risk-irrelevant stress to show that participants in the stress condition rated perceived risk as higher in comparison to the control condition. This experiment showed that higher ratings of perceived risk were influenced by psychological stress. Taken together, our results demonstrate that affect-laden mental imagery dramatically changes risk perception through negative affect (i.e., psychological stress). PMID:27445901

  3. The Affective Bases of Risk Perception: Negative Feelings and Stress Mediate the Relationship between Mental Imagery and Risk Perception.

    PubMed

    Sobkow, Agata; Traczyk, Jakub; Zaleskiewicz, Tomasz

    2016-01-01

    Recent research has documented that affect plays a crucial role in risk perception. When no information about numerical risk estimates is available (e.g., probability of loss or magnitude of consequences), people may rely on positive and negative affect toward perceived risk. However, determinants of affective reactions to risks are poorly understood. In a series of three experiments, we addressed the question of whether and to what degree mental imagery eliciting negative affect and stress influences risk perception. In each experiment, participants were instructed to visualize consequences of risk taking and to rate riskiness. In Experiment 1, participants who imagined negative risk consequences reported more negative affect and perceived risk as higher compared to the control condition. In Experiment 2, we found that this effect was driven by affect elicited by mental imagery rather than its vividness and intensity. In this study, imagining positive risk consequences led to lower perceived risk than visualizing negative risk consequences. Finally, we tested the hypothesis that negative affect related to higher perceived risk was caused by negative feelings of stress. In Experiment 3, we introduced risk-irrelevant stress to show that participants in the stress condition rated perceived risk as higher in comparison to the control condition. This experiment showed that higher ratings of perceived risk were influenced by psychological stress. Taken together, our results demonstrate that affect-laden mental imagery dramatically changes risk perception through negative affect (i.e., psychological stress). PMID:27445901

  4. Relating Standardized Visual Perception Measures to Simulator Visual System Performance

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Sweet, Barbara T.

    2013-01-01

    Human vision is quantified through the use of standardized clinical vision measurements. These measurements typically include visual acuity (near and far), contrast sensitivity, color vision, stereopsis (a.k.a. stereo acuity), and visual field periphery. Simulator visual system performance is specified in terms such as brightness, contrast, color depth, color gamut, gamma, resolution, and field-of-view. How do these simulator performance characteristics relate to the perceptual experience of the pilot in the simulator? In this paper, visual acuity and contrast sensitivity will be related to simulator visual system resolution, contrast, and dynamic range; similarly, color vision will be related to color depth/color gamut. Finally, we will consider how some characteristics of human vision not typically included in current clinical assessments could be used to better inform simulator requirements (e.g., relating dynamic characteristics of human vision to update rate and other temporal display characteristics).
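
    One way to make the acuity-to-resolution relationship concrete is the common back-of-envelope conversion between display pixels per degree and Snellen acuity: a 20/20 observer resolves detail of roughly 1 arcmin, corresponding to about 60 pixels per degree. The display figures below are illustrative assumptions, not requirements stated in the paper.

      def pixels_per_degree(horizontal_pixels, horizontal_fov_deg):
          # Average angular resolution of the display across its field of view.
          return horizontal_pixels / horizontal_fov_deg

      def snellen_equivalent(ppd, eye_limit_ppd=60.0):
          # Rough Snellen denominator for a display-limited observer:
          # about 60 px/deg ~ 20/20, 30 px/deg ~ 20/40, and so on.
          return 20.0 * eye_limit_ppd / ppd

      ppd = pixels_per_degree(horizontal_pixels=4096, horizontal_fov_deg=150.0)  # illustrative values
      print(f"{ppd:.1f} px/deg -> roughly 20/{snellen_equivalent(ppd):.0f} display-limited acuity")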

  5. Direction specific biases in human visual and vestibular heading perception.

    PubMed

    Crane, Benjamin T

    2012-01-01

    Heading direction is determined from visual and vestibular cues. Both sensory modalities have been shown to have better direction discrimination for headings near straight ahead. Previous studies of visual heading estimation have not used the full range of stimuli, and vestibular heading estimation has not previously been reported. The current experiments measure human heading estimation in the horizontal plane to vestibular, visual, and spoken stimuli. The vestibular and visual tasks involved 16 cm of platform or visual motion. The spoken stimulus was a voice command speaking a heading angle. All conditions demonstrated direction dependent biases in perceived headings such that biases increased with headings further from the fore-aft axis. The bias was larger with the visual stimulus when compared with the vestibular stimulus in all 10 subjects. For the visual and vestibular tasks precision was best for headings near fore-aft. The spoken headings had the least bias, and the variation in precision was less dependent on direction. In a separate experiment when headings were limited to ± 45°, the biases were much less, demonstrating the range of headings influences perception. There was a strong and highly significant correlation between the bias curves for visual and spoken stimuli in every subject. The correlation between visual-vestibular and vestibular-spoken biases were weaker but remained significant. The observed biases in both visual and vestibular heading perception qualitatively resembled predictions of a recent population vector decoder model (Gu et al., 2010) based on the known distribution of neuronal sensitivities.
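
    A generic population-vector decoder, offered only to illustrate how a non-uniform distribution of neuronal heading preferences can produce direction-dependent biases; it is not the Gu et al. (2010) model, and the preference distribution and tuning width below are arbitrary assumptions.

      import numpy as np

      rng = np.random.default_rng(2)
      # Assumed preference distribution: preferred headings over-represented near +/-90 deg.
      pref = np.concatenate([rng.vonmises(np.pi / 2, 2.0, 300),
                             rng.vonmises(-np.pi / 2, 2.0, 300)])

      def decode(true_heading_deg, kappa=3.0):
          theta = np.deg2rad(true_heading_deg)
          rates = np.exp(kappa * (np.cos(theta - pref) - 1))   # von Mises tuning curves
          vector = np.sum(rates * np.exp(1j * pref))           # population vector
          return np.rad2deg(np.angle(vector))

      for h in (0, 15, 30, 45, 60):
          est = decode(h)
          print(f"true heading {h:>3} deg -> decoded {est:6.1f} deg (bias {est - h:+5.1f})")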

  6. Direction Specific Biases in Human Visual and Vestibular Heading Perception

    PubMed Central

    Crane, Benjamin T.

    2012-01-01

    Heading direction is determined from visual and vestibular cues. Both sensory modalities have been shown to have better direction discrimination for headings near straight ahead. Previous studies of visual heading estimation have not used the full range of stimuli, and vestibular heading estimation has not previously been reported. The current experiments measure human heading estimation in the horizontal plane to vestibular, visual, and spoken stimuli. The vestibular and visual tasks involved 16 cm of platform or visual motion. The spoken stimulus was a voice command speaking a heading angle. All conditions demonstrated direction dependent biases in perceived headings such that biases increased with headings further from the fore-aft axis. The bias was larger with the visual stimulus when compared with the vestibular stimulus in all 10 subjects. For the visual and vestibular tasks precision was best for headings near fore-aft. The spoken headings had the least bias, and the variation in precision was less dependent on direction. In a separate experiment when headings were limited to ±45°, the biases were much less, demonstrating the range of headings influences perception. There was a strong and highly significant correlation between the bias curves for visual and spoken stimuli in every subject. The correlation between visual-vestibular and vestibular-spoken biases were weaker but remained significant. The observed biases in both visual and vestibular heading perception qualitatively resembled predictions of a recent population vector decoder model (Gu et al., 2010) based on the known distribution of neuronal sensitivities. PMID:23236490

  7. Auditory Emotional Cues Enhance Visual Perception

    ERIC Educational Resources Information Center

    Zeelenberg, Rene; Bocanegra, Bruno R.

    2010-01-01

    Recent studies show that emotional stimuli impair performance to subsequently presented neutral stimuli. Here we show a cross-modal perceptual enhancement caused by emotional cues. Auditory cue words were followed by a visually presented neutral target word. Two-alternative forced-choice identification of the visual target was improved by…

  8. Disentangling visual imagery and perception of real-world objects.

    PubMed

    Lee, Sue-Hyun; Kravitz, Dwight J; Baker, Chris I

    2012-02-15

    During mental imagery, visual representations can be evoked in the absence of "bottom-up" sensory input. Prior studies have reported similar neural substrates for imagery and perception, but studies of brain-damaged patients have revealed a double dissociation with some patients showing preserved imagery in spite of impaired perception and others vice versa. Here, we used fMRI and multi-voxel pattern analysis to investigate the specificity, distribution, and similarity of information for individual seen and imagined objects to try and resolve this apparent contradiction. In an event-related design, participants either viewed or imagined individual named object images on which they had been trained prior to the scan. We found that the identity of both seen and imagined objects could be decoded from the pattern of activity throughout the ventral visual processing stream. Further, there was enough correspondence between imagery and perception to allow discrimination of individual imagined objects based on the response during perception. However, the distribution of object information across visual areas was strikingly different during imagery and perception. While there was an obvious posterior-anterior gradient along the ventral visual stream for seen objects, there was an opposite gradient for imagined objects. Moreover, the structure of representations (i.e. the pattern of similarity between responses to all objects) was more similar during imagery than perception in all regions along the visual stream. These results suggest that while imagery and perception have similar neural substrates, they involve different network dynamics, resolving the tension between previous imaging and neuropsychological studies.
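
    A toy version of the cross-decoding logic (synthetic voxel patterns, not the study's fMRI data): a linear classifier trained on response patterns recorded during perception is tested on patterns recorded during imagery of the same objects. The pattern counts and the weaker-imagery assumption are illustrative only.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(3)
      n_objects, n_voxels, n_reps = 4, 120, 30

      # Each object has a characteristic multi-voxel pattern; imagery is assumed here to evoke
      # a weaker, noisier version of the same pattern.
      templates = rng.standard_normal((n_objects, n_voxels))

      def make_runs(gain, noise):
          X = np.vstack([gain * templates[obj] + noise * rng.standard_normal(n_voxels)
                         for _ in range(n_reps) for obj in range(n_objects)])
          y = np.tile(np.arange(n_objects), n_reps)
          return X, y

      X_perc, y_perc = make_runs(gain=1.0, noise=1.0)   # "seen" trials
      X_imag, y_imag = make_runs(gain=0.4, noise=1.0)   # "imagined" trials

      clf = LogisticRegression(max_iter=1000).fit(X_perc, y_perc)
      print("perception -> imagery decoding accuracy:", clf.score(X_imag, y_imag))
      print("chance level:", 1 / n_objects)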

  9. Aging affects postural tracking of complex visual motion cues.

    PubMed

    Sotirakis, H; Kyvelidou, A; Mademli, L; Stergiou, N; Hatzitaki, V

    2016-09-01

    Postural tracking of visual motion cues improves perception-action coupling in aging, yet the nature of the visual cues to be tracked is critical for the efficacy of such a paradigm. We investigated how well healthy older (72.45 ± 4.72 years) and young (22.98 ± 2.9 years) adults can follow with their gaze and posture horizontally moving visual target cues of different degree of complexity. Participants tracked continuously for 120 s the motion of a visual target (dot) that oscillated in three different patterns: a simple periodic (simulated by a sine), a more complex (simulated by the Lorenz attractor that is deterministic displaying mathematical chaos) and an ultra-complex random (simulated by surrogating the Lorenz attractor) pattern. The degree of coupling between performance (posture and gaze) and the target motion was quantified in the spectral coherence, gain, phase and cross-approximate entropy (cross-ApEn) between signals. Sway-target coherence decreased as a function of target complexity and was lower for the older compared to the young participants when tracking the chaotic target. On the other hand, gaze-target coherence was not affected by either target complexity or age. Yet, a lower cross-ApEn value when tracking the chaotic stimulus motion revealed a more synchronous gaze-target relationship for both age groups. Results suggest limitations in online visuo-motor processing of complex motion cues and a less efficient exploitation of the body sway dynamics with age. Complex visual motion cues may provide a suitable training stimulus to improve visuo-motor integration and restore sway variability in older adults.
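
    For readers unfamiliar with the coupling measure, sway-target spectral coherence can be computed along the following lines; the synthetic signals, sampling rate, and segment length are assumptions, and the study's other measures (gain, phase, cross-ApEn) are omitted.

      import numpy as np
      from scipy.signal import coherence

      fs, dur = 100.0, 120.0                      # assumed: 100 Hz sampling, 120 s trial
      t = np.arange(int(fs * dur)) / fs
      f_target = 0.25                             # assumed target oscillation frequency (Hz)

      rng = np.random.default_rng(4)
      target = np.sin(2 * np.pi * f_target * t)
      # Sway that partly follows the target, with a lag and added noise.
      sway = 0.7 * np.sin(2 * np.pi * f_target * (t - 0.4)) + 0.8 * rng.standard_normal(t.size)

      f, Cxy = coherence(sway, target, fs=fs, nperseg=int(20 * fs))
      print(f"sway-target coherence at {f_target} Hz: {Cxy[np.argmin(np.abs(f - f_target))]:.2f}")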

  10. Aging affects postural tracking of complex visual motion cues.

    PubMed

    Sotirakis, H; Kyvelidou, A; Mademli, L; Stergiou, N; Hatzitaki, V

    2016-09-01

    Postural tracking of visual motion cues improves perception-action coupling in aging, yet the nature of the visual cues to be tracked is critical for the efficacy of such a paradigm. We investigated how well healthy older (72.45 ± 4.72 years) and young (22.98 ± 2.9 years) adults can follow with their gaze and posture horizontally moving visual target cues of different degree of complexity. Participants tracked continuously for 120 s the motion of a visual target (dot) that oscillated in three different patterns: a simple periodic (simulated by a sine), a more complex (simulated by the Lorenz attractor that is deterministic displaying mathematical chaos) and an ultra-complex random (simulated by surrogating the Lorenz attractor) pattern. The degree of coupling between performance (posture and gaze) and the target motion was quantified in the spectral coherence, gain, phase and cross-approximate entropy (cross-ApEn) between signals. Sway-target coherence decreased as a function of target complexity and was lower for the older compared to the young participants when tracking the chaotic target. On the other hand, gaze-target coherence was not affected by either target complexity or age. Yet, a lower cross-ApEn value when tracking the chaotic stimulus motion revealed a more synchronous gaze-target relationship for both age groups. Results suggest limitations in online visuo-motor processing of complex motion cues and a less efficient exploitation of the body sway dynamics with age. Complex visual motion cues may provide a suitable training stimulus to improve visuo-motor integration and restore sway variability in older adults. PMID:27126061

  11. Work-related goal perceptions and affective well-being.

    PubMed

    Ingledew, David K; Wray, Josephine L; Markland, David; Hardy, Lew

    2005-01-01

    The aim was to clarify how perceptions of work-related goals influence affective well-being and goal commitment. Participants (N = 201) completed a Goal Perceptions Questionnaire and affect scales. A model was refined using structural equation modelling. Value and success expectation substantially mediated the effects of other goal perceptions on affects and commitment. Both value and success expectation increased commitment, but whereas value increased positive affects, success expectation reduced negative affects. The determinants of value (e.g. personal origin) were different from those of success expectation (e.g. personal control). Through astute goal setting, it is possible to promote well-being without compromising commitment. PMID:15576503

  12. Crossmodal integration between visual linguistic information and flavour perception.

    PubMed

    Razumiejczyk, Eugenia; Macbeth, Guillermo; Marmolejo-Ramos, Fernando; Noguchi, Kimihiro

    2015-08-01

    Many studies have found processing interference in working memory when complex information that enters the cognitive system from different modalities has to be integrated to understand the environment and promote adjustment. Here, we report on a Stroop study that provides evidence concerned with the crossmodal processing of flavour perception and visual language. We found a facilitation effect in the congruency condition. Acceleration was observed for incomplete words and anagrams compared to complete words. A crossmodal completion account is presented for such findings. It is concluded that the crossmodal integration between flavour and visual language perception requires the active participation of top-down and bottom-up processing. PMID:25843936

  13. Reversal of cortical information flow during visual imagery as compared to visual perception.

    PubMed

    Dentico, Daniela; Cheung, Bing Leung; Chang, Jui-Yang; Guokas, Jeffrey; Boly, Melanie; Tononi, Giulio; Van Veen, Barry

    2014-10-15

    The role of bottom-up and top-down connections during visual perception and the formation of mental images was examined by analyzing high-density EEG recordings of brain activity using two state-of-the-art methods for assessing the directionality of cortical signal flow: state-space Granger causality and dynamic causal modeling. We quantified the directionality of signal flow in an occipito-parieto-frontal cortical network during perception of movie clips versus mental replay of the movies and free visual imagery. Both Granger causality and dynamic causal modeling analyses revealed an increased top-down signal flow in parieto-occipital cortices during mental imagery as compared to visual perception. These results are the first direct demonstration of a reversal of the predominant direction of cortical signal flow during mental imagery as compared to perception. PMID:24910071
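
    The directionality idea can be illustrated with a minimal time-domain Granger comparison between two signals; this is far simpler than the state-space Granger causality and dynamic causal modeling used in the study, and the signals, coupling, and model order below are assumptions.

      import numpy as np

      def granger_improvement(x, y, order=5):
          # How much the past of y improves prediction of x, as a log variance ratio.
          n = len(x)
          past = lambda s, k: s[order - k : n - k]
          X_own = np.column_stack([past(x, k) for k in range(1, order + 1)])
          X_full = np.column_stack([X_own] + [past(y, k) for k in range(1, order + 1)])
          target = x[order:]
          res_own = target - X_own @ np.linalg.lstsq(X_own, target, rcond=None)[0]
          res_full = target - X_full @ np.linalg.lstsq(X_full, target, rcond=None)[0]
          return np.log(res_own.var() / res_full.var())

      # Synthetic example: a "frontal" signal drives an "occipital" signal with a 2-sample delay.
      rng = np.random.default_rng(5)
      frontal = rng.standard_normal(5000)
      occipital = 0.8 * np.roll(frontal, 2) + 0.6 * rng.standard_normal(5000)

      print("frontal -> occipital:", granger_improvement(occipital, frontal))
      print("occipital -> frontal:", granger_improvement(frontal, occipital))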

  14. Reversal of cortical information flow during visual imagery as compared to visual perception

    PubMed Central

    Dentico, Daniela; Cheung, Bing Leung; Chang, Jui-Yang; Guokas, Jeffrey; Boly, Melanie; Tononi, Giulio; Van Veen, Barry

    2014-01-01

    The role of bottom-up and top-down connections during visual perception and the forming of mental images was examined by analyzing high-density EEG recordings of brain activity using two state-of-the-art methods for assessing the directionality of cortical signal flow: state-space Granger causality and dynamic causal modeling. We quantified the directionality of signal flow in an occipito-parieto-frontal cortical network during perception of movie clips versus mental replay of the movies and free visual imagery. Both Granger causality and dynamic causal modeling analyses revealed increased top-down signal flow in parieto-occipital cortices during mental imagery as compared to visual perception. These results are the first direct demonstration of a reversal of the predominant direction of cortical signal flow during mental imagery as compared to perception. PMID:24910071

  15. Auditory-visual crossmodal integration in perception of face gender.

    PubMed

    Smith, Eric L; Grabowecky, Marcia; Suzuki, Satoru

    2007-10-01

    Whereas extensive neuroscientific and behavioral evidence has confirmed a role of auditory-visual integration in representing space [1-6], little is known about the role of auditory-visual integration in object perception. Although recent neuroimaging results suggest integrated auditory-visual object representations [7-11], substantiating behavioral evidence has been lacking. We demonstrated auditory-visual integration in the perception of face gender by using pure tones that are processed in low-level auditory brain areas and that lack the spectral components that characterize human vocalization. When androgynous faces were presented together with pure tones in the male fundamental-speaking-frequency range, faces were more likely to be judged as male, whereas when faces were presented with pure tones in the female fundamental-speaking-frequency range, they were more likely to be judged as female. Importantly, when participants were explicitly asked to attribute gender to these pure tones, their judgments were primarily based on relative pitch and were uncorrelated with the male and female fundamental-speaking-frequency ranges. This perceptual dissociation of absolute-frequency-based crossmodal-integration effects from relative-pitch-based explicit perception of the tones provides evidence for a sensory integration of auditory and visual signals in representing human gender. This integration probably develops because of concurrent neural processing of visual and auditory features of gender.

  16. Dynamic Stimuli And Active Processing In Human Visual Perception

    NASA Astrophysics Data System (ADS)

    Haber, Ralph N.

    1990-03-01

    Theories of visual perception have traditionally taken a static retinal image as the starting point for processing, and have treated that processing as passive and as a literal translation of a frozen, two-dimensional, pictorial image. This paper considers five problem areas in the analysis of human visually guided locomotion, in which the traditional approach is contrasted with newer ones that utilize dynamic definitions of stimulation and an active perceiver: (1) differentiation between object motion and self motion, and among the various kinds of self motion (e.g., eyes only, head only, whole body, and their combinations); (2) the sources and contents of visual information that guide movement; (3) the acquisition and performance of perceptual motor skills; (4) the nature of spatial representations, percepts, and the perceived layout of space; and (5) why the retinal image is a poor starting point for perceptual processing. These newer approaches argue that stimuli must be considered as dynamic: humans process the systematic changes in patterned light when objects move and when they themselves move. Furthermore, the processing of visual stimuli must be active and interactive, so that perceivers can construct panoramic and stable percepts from an interaction of stimulus information and expectancies about what is contained in the visual environment. These developments all suggest a very different approach to the computational analyses of object location and identification, and of the visual guidance of locomotion.

  17. Undetectable Changes in Image Resolution of Luminance-Contrast Gradients Affect Depth Perception

    PubMed Central

    Tsushima, Yoshiaki; Komine, Kazuteru; Sawahata, Yasuhito; Morita, Toshiya

    2016-01-01

    A great number of studies have suggested a variety of ways to obtain depth information from two-dimensional images, such as binocular disparity, shape-from-shading, size gradient/foreshortening, aerial perspective, and so on. Are there any other new factors affecting depth perception? A recent psychophysical study investigated the correlation between image resolution and the depth sensation produced by Cylinder images (a rectangle containing gradual luminance-contrast changes). It was reported that higher resolution images facilitate depth perception. However, it is still not clear whether the finding generalizes to other kinds of visual stimuli, because there are more appropriate visual stimuli for exploring depth perception from luminance-contrast changes, such as the Gabor patch. Here, we further examined the relationship between image resolution and depth perception by conducting a series of psychophysical experiments with not only Cylinders but also Gabor patches, which have smoother luminance-contrast gradients. Higher resolution images produced stronger depth sensation with both types of image. This finding suggests that image resolution affects depth perception from simple luminance-contrast differences (Gabor patch) as well as from shape-from-shading (Cylinder). In addition, this phenomenon was found even when the resolution difference was undetectable. This indicates the existence of consciously available and unavailable information in our visual system. These findings further support the view that image resolution is a previously overlooked cue for depth perception. It partially explains the unparalleled viewing experience of novel high-resolution displays. PMID:26941693

  18. Undetectable Changes in Image Resolution of Luminance-Contrast Gradients Affect Depth Perception.

    PubMed

    Tsushima, Yoshiaki; Komine, Kazuteru; Sawahata, Yasuhito; Morita, Toshiya

    2016-01-01

    A great number of studies have suggested a variety of ways to obtain depth information from two-dimensional images, such as binocular disparity, shape-from-shading, size gradient/foreshortening, aerial perspective, and so on. Are there any other new factors affecting depth perception? A recent psychophysical study investigated the correlation between image resolution and the depth sensation produced by Cylinder images (a rectangle containing gradual luminance-contrast changes). It was reported that higher resolution images facilitate depth perception. However, it is still not clear whether the finding generalizes to other kinds of visual stimuli, because there are more appropriate visual stimuli for exploring depth perception from luminance-contrast changes, such as the Gabor patch. Here, we further examined the relationship between image resolution and depth perception by conducting a series of psychophysical experiments with not only Cylinders but also Gabor patches, which have smoother luminance-contrast gradients. Higher resolution images produced stronger depth sensation with both types of image. This finding suggests that image resolution affects depth perception from simple luminance-contrast differences (Gabor patch) as well as from shape-from-shading (Cylinder). In addition, this phenomenon was found even when the resolution difference was undetectable. This indicates the existence of consciously available and unavailable information in our visual system. These findings further support the view that image resolution is a previously overlooked cue for depth perception. It partially explains the unparalleled viewing experience of novel high-resolution displays.

  19. Odors Bias Time Perception in Visual and Auditory Modalities

    PubMed Central

    Yue, Zhenzhu; Gao, Tianyu; Chen, Lihan; Wu, Jiashuang

    2016-01-01

    Previous studies have shown that emotional states alter our perception of time. However, attention, which is modulated by a number of factors, such as emotional events, also influences time perception. To exclude potential attentional effects associated with emotional events, various types of odors (inducing different levels of emotional arousal) were used to explore whether olfactory events modulated time perception differently in visual and auditory modalities. Participants were shown either a visual dot or heard a continuous tone for 1000 or 4000 ms while they were exposed to odors of jasmine, lavender, or garlic. Participants then reproduced the temporal durations of the preceding visual or auditory stimuli by pressing the spacebar twice. Their reproduced durations were compared to those in the control condition (without odor). The results showed that participants produced significantly longer time intervals in the lavender condition than in the jasmine or garlic conditions. The overall influence of odor on time perception was equivalent for both visual and auditory modalities. The analysis of the interaction effect showed that participants produced longer durations than the actual duration in the short interval condition, but they produced shorter durations in the long interval condition. The effect sizes were larger for the auditory modality than those for the visual modality. Moreover, by comparing performance across the initial and the final blocks of the experiment, we found odor adaptation effects were mainly manifested as longer reproductions for the short time interval later in the adaptation phase, and there was a larger effect size in the auditory modality. In summary, the present results indicate that odors imposed differential impacts on reproduced time durations, and they were constrained by different sensory modalities, valence of the emotional events, and target durations. Biases in time perception could be accounted for by a framework of

  20. Odors Bias Time Perception in Visual and Auditory Modalities.

    PubMed

    Yue, Zhenzhu; Gao, Tianyu; Chen, Lihan; Wu, Jiashuang

    2016-01-01

    Previous studies have shown that emotional states alter our perception of time. However, attention, which is modulated by a number of factors, such as emotional events, also influences time perception. To exclude potential attentional effects associated with emotional events, various types of odors (inducing different levels of emotional arousal) were used to explore whether olfactory events modulated time perception differently in visual and auditory modalities. Participants were shown either a visual dot or heard a continuous tone for 1000 or 4000 ms while they were exposed to odors of jasmine, lavender, or garlic. Participants then reproduced the temporal durations of the preceding visual or auditory stimuli by pressing the spacebar twice. Their reproduced durations were compared to those in the control condition (without odor). The results showed that participants produced significantly longer time intervals in the lavender condition than in the jasmine or garlic conditions. The overall influence of odor on time perception was equivalent for both visual and auditory modalities. The analysis of the interaction effect showed that participants produced longer durations than the actual duration in the short interval condition, but they produced shorter durations in the long interval condition. The effect sizes were larger for the auditory modality than those for the visual modality. Moreover, by comparing performance across the initial and the final blocks of the experiment, we found odor adaptation effects were mainly manifested as longer reproductions for the short time interval later in the adaptation phase, and there was a larger effect size in the auditory modality. In summary, the present results indicate that odors imposed differential impacts on reproduced time durations, and they were constrained by different sensory modalities, valence of the emotional events, and target durations. Biases in time perception could be accounted for by a framework of

  1. Visual perception and saccadic eye movements.

    PubMed

    Ibbotson, Michael; Krekelberg, Bart

    2011-08-01

    We use saccades several times per second to move the fovea between points of interest and build an understanding of our visual environment. Recent behavioral experiments show evidence for the integration of pre- and postsaccadic information (even subliminally), the modulation of visual sensitivity, and the rapid reallocation of attention. The recent physiological literature has identified a characteristic modulation of neural responsiveness (perisaccadic reduction followed by a postsaccadic increase) that is found in many visual areas, but whose source is as yet unknown. This modulation seems optimal for reducing sensitivity during and boosting sensitivity between saccades, but no study has yet established a direct causal link between neural and behavioral changes. PMID:21646014

  2. Relationships between Categorical Perception of Phonemes, Phoneme Awareness, and Visual Attention Span in Developmental Dyslexia.

    PubMed

    Zoubrinetzky, Rachel; Collet, Gregory; Serniclaes, Willy; Nguyen-Morel, Marie-Ange; Valdois, Sylviane

    2016-01-01

    We tested the hypothesis that the categorical perception deficit of speech sounds in developmental dyslexia is related to phoneme awareness skills, whereas a visual attention (VA) span deficit constitutes an independent deficit. Phoneme awareness tasks, VA span tasks and categorical perception tasks of phoneme identification and discrimination using a d/t voicing continuum were administered to 63 dyslexic children and 63 control children matched on chronological age. Results showed significant differences in categorical perception between the dyslexic and control children. Significant correlations were found between categorical perception skills, phoneme awareness and reading. Although VA span correlated with reading, no significant correlations were found between either categorical perception or phoneme awareness and VA span. Mediation analyses performed on the whole dyslexic sample suggested that the effect of categorical perception on reading might be mediated by phoneme awareness. This relationship was independent of the participants' VA span abilities. Two groups of dyslexic children with a single phoneme awareness or a single VA span deficit were then identified. The phonologically impaired group showed lower categorical perception skills than the control group but categorical perception was similar in the VA span impaired dyslexic and control children. The overall findings suggest that the link between categorical perception, phoneme awareness and reading is independent from VA span skills. These findings provide new insights on the heterogeneity of developmental dyslexia. They suggest that phonological processes and VA span independently affect reading acquisition. PMID:26950210

  3. Relationships between Categorical Perception of Phonemes, Phoneme Awareness, and Visual Attention Span in Developmental Dyslexia.

    PubMed

    Zoubrinetzky, Rachel; Collet, Gregory; Serniclaes, Willy; Nguyen-Morel, Marie-Ange; Valdois, Sylviane

    2016-01-01

    We tested the hypothesis that the categorical perception deficit of speech sounds in developmental dyslexia is related to phoneme awareness skills, whereas a visual attention (VA) span deficit constitutes an independent deficit. Phoneme awareness tasks, VA span tasks and categorical perception tasks of phoneme identification and discrimination using a d/t voicing continuum were administered to 63 dyslexic children and 63 control children matched on chronological age. Results showed significant differences in categorical perception between the dyslexic and control children. Significant correlations were found between categorical perception skills, phoneme awareness and reading. Although VA span correlated with reading, no significant correlations were found between either categorical perception or phoneme awareness and VA span. Mediation analyses performed on the whole dyslexic sample suggested that the effect of categorical perception on reading might be mediated by phoneme awareness. This relationship was independent of the participants' VA span abilities. Two groups of dyslexic children with a single phoneme awareness or a single VA span deficit were then identified. The phonologically impaired group showed lower categorical perception skills than the control group but categorical perception was similar in the VA span impaired dyslexic and control children. The overall findings suggest that the link between categorical perception, phoneme awareness and reading is independent from VA span skills. These findings provide new insights on the heterogeneity of developmental dyslexia. They suggest that phonological processes and VA span independently affect reading acquisition.

  4. Relationships between Categorical Perception of Phonemes, Phoneme Awareness, and Visual Attention Span in Developmental Dyslexia

    PubMed Central

    Zoubrinetzky, Rachel; Collet, Gregory; Serniclaes, Willy; Nguyen-Morel, Marie-Ange; Valdois, Sylviane

    2016-01-01

    We tested the hypothesis that the categorical perception deficit of speech sounds in developmental dyslexia is related to phoneme awareness skills, whereas a visual attention (VA) span deficit constitutes an independent deficit. Phoneme awareness tasks, VA span tasks and categorical perception tasks of phoneme identification and discrimination using a d/t voicing continuum were administered to 63 dyslexic children and 63 control children matched on chronological age. Results showed significant differences in categorical perception between the dyslexic and control children. Significant correlations were found between categorical perception skills, phoneme awareness and reading. Although VA span correlated with reading, no significant correlations were found between either categorical perception or phoneme awareness and VA span. Mediation analyses performed on the whole dyslexic sample suggested that the effect of categorical perception on reading might be mediated by phoneme awareness. This relationship was independent of the participants’ VA span abilities. Two groups of dyslexic children with a single phoneme awareness or a single VA span deficit were then identified. The phonologically impaired group showed lower categorical perception skills than the control group but categorical perception was similar in the VA span impaired dyslexic and control children. The overall findings suggest that the link between categorical perception, phoneme awareness and reading is independent from VA span skills. These findings provide new insights on the heterogeneity of developmental dyslexia. They suggest that phonological processes and VA span independently affect reading acquisition. PMID:26950210

  5. Lighting System for Visual Perception Enhancement in Volume Rendering.

    PubMed

    Wang, Lei; Kaufman, Arie E

    2013-01-01

    We introduce a lighting system that enhances the visual cues in a rendered image for the perception of 3D volumetric objects. We divide the lighting effects into global and local effects, and deploy three types of directional lights: the key light and accessory lights (fill and detail lights). The key light provides both lighting effects and carries the visual cues for the perception of local and global shapes and depth. The cues for local shapes are conveyed by gradient; those for global shapes are carried by shadows; and those for depth are provided by shadows and translucent objects. Fill lights produce global effects to increase the perceptibility. Detail lights generate local effects to improve the cues for local shapes. Our method quantifies the perception and uses an exhaustive search to set the lights. It configures accessory lights with the consideration of preserving the global impression conveyed by the key light. It ensures the feeling of smooth light movements in animations. With simplification, it achieves interactive frame rates and produces results that are visually indistinguishable from results using the nonsimplified algorithm. The major contributions of this paper are our lighting system, perception measurement and lighting design algorithm with our indistinguishable simplification.

  6. Unifying account of visual motion and position perception

    PubMed Central

    Kwon, Oh-Sang; Tadin, Duje; Knill, David C.

    2015-01-01

    Despite growing evidence for perceptual interactions between motion and position, no unifying framework exists to account for these two key features of our visual experience. We show that percepts of both object position and motion derive from a common object-tracking system—a system that optimally integrates sensory signals with a realistic model of motion dynamics, effectively inferring their generative causes. The object-tracking model provides an excellent fit to both position and motion judgments in simple stimuli. With no changes in model parameters, the same model also accounts for subjects’ novel illusory percepts in more complex moving stimuli. The resulting framework is characterized by a strong bidirectional coupling between position and motion estimates and provides a rational, unifying account of a number of motion and position phenomena that are currently thought to arise from independent mechanisms. This includes motion-induced shifts in perceived position, perceptual slow-speed biases, slowing of motions shown in visual periphery, and the well-known curveball illusion. These results reveal that motion perception cannot be isolated from position signals. Even in the simplest displays with no changes in object position, our perception is driven by the output of an object-tracking system that rationally infers different generative causes of motion signals. Taken together, we show that object tracking plays a fundamental role in perception of visual motion and position. PMID:26080410
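
    A minimal constant-velocity Kalman filter, given only as a sketch of the kind of optimal tracking computation described (it is not the authors' implementation, and all parameter values are assumptions): position and velocity are estimated jointly, so noisy position measurements and the internal motion model continuously constrain one another.

      import numpy as np

      def track(measurements, dt=0.01, q=5.0, r=0.05):
          # 1-D constant-velocity Kalman filter; q = process noise, r = measurement noise SD.
          F = np.array([[1.0, dt], [0.0, 1.0]])                 # dynamics: position += velocity * dt
          Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
          H = np.array([[1.0, 0.0]])                            # only position is measured
          R = np.array([[r**2]])
          x, P = np.zeros(2), np.eye(2)
          estimates = []
          for z in measurements:
              x, P = F @ x, F @ P @ F.T + Q                     # predict from the motion model
              K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)      # Kalman gain
              x = x + K @ (np.array([z]) - H @ x)               # correct with the measurement
              P = (np.eye(2) - K @ H) @ P
              estimates.append(x.copy())
          return np.array(estimates)                            # columns: position, velocity

      # Synthetic object moving at 2 units/s, observed through noisy position samples.
      rng = np.random.default_rng(6)
      t = np.arange(0, 2, 0.01)
      true_pos = 2.0 * t
      est = track(true_pos + 0.05 * rng.standard_normal(t.size))
      print("final position %.2f (true %.2f), final velocity %.2f (true 2.00)"
            % (est[-1, 0], true_pos[-1], est[-1, 1]))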

  7. Cerebral Visual Impairment: which perceptive visual dysfunctions can be expected in children with brain damage? A systematic review.

    PubMed

    Boot, F H; Pel, J J M; van der Steen, J; Evenhuis, H M

    2010-01-01

    The current definition of Cerebral Visual Impairment (CVI) includes all visual dysfunctions caused by damage to, or malfunctioning of, the retrochiasmatic visual pathways in the absence of damage to the anterior visual pathways or any major ocular disease. CVI is diagnosed by exclusion, and the existence of many different causes and symptoms makes it an overall non-categorized group. To date, no discrimination is made within CVI based on types of perceptive visual dysfunctions. The aim of this review was to outline which perceptive visual dysfunctions are to be expected based on a number of etiologies of brain damage and brain development disorders with their onset in the pre-, peri- or postnatal period. For each period, two etiologies were chosen as the main characteristic forms of brain damage. For each etiology a main search was performed. The selection of the articles was based on the following criteria: age, etiology, imaging, central pathology and perceptive visual function test. The perceptive visual functions included for this review were object recognition, face recognition, visual memory, orientation, visual spatial perception, motion perception and simultaneous perception. Our search resulted in 11 key articles. The amount of prior research varied across the selected etiologies and their relation to perceptive visual dysfunctions. Periventricular Leukomalacia (PVL) was the most studied etiology, and the most frequently tested perceptive visual function was visual spatial perception. In conclusion, the present status of research in the field of CVI does not allow correlations to be drawn between etiology, location and perceptive visual dysfunctions in children with brain damage or a brain development disorder. A limiting factor could be the small number of objective tests performed in children experiencing problems in visual processing. Based on recent insights into central visual information processing, we recommend an alternative approach for the definition of CVI that is based on

  8. Visual Perception of Touchdown Point During Simulated Landing

    ERIC Educational Resources Information Center

    Palmisano, Stephen; Gillam, Barbara

    2005-01-01

    Experiments examined the accuracy of visual touchdown point perception during oblique descents (1.5?-15?) toward a ground plane consisting of (a) randomly positioned dots, (b) a runway outline, or (c) a grid. Participants judged whether the perceived touchdown point was above or below a probe that appeared at a random position following each…

  9. Asymmetries for the Visual Expression and Perception of Speech

    ERIC Educational Resources Information Center

    Nicholls, Michael E. R.; Searle, Dara A.

    2006-01-01

    This study explored asymmetries for movement, expression and perception of visual speech. Sixteen dextral models were videoed as they articulated: "bat," "cat," "fat," and "sat." Measurements revealed that the right side of the mouth was opened wider and for a longer period than the left. The asymmetry was accentuated at the beginning and ends of…

  10. Perceptual Training Strongly Improves Visual Motion Perception in Schizophrenia

    ERIC Educational Resources Information Center

    Norton, Daniel J.; McBain, Ryan K.; Ongur, Dost; Chen, Yue

    2011-01-01

    Schizophrenia patients exhibit perceptual and cognitive deficits, including in visual motion processing. Given that cognitive systems depend upon perceptual inputs, improving patients' perceptual abilities may be an effective means of cognitive intervention. In healthy people, motion perception can be enhanced through perceptual learning, but it…

  11. Infant Perception of Audio-Visual Speech Synchrony

    ERIC Educational Resources Information Center

    Lewkowicz, David J.

    2010-01-01

    Three experiments investigated perception of audio-visual (A-V) speech synchrony in 4- to 10-month-old infants. Experiments 1 and 2 used a convergent-operations approach by habituating infants to an audiovisually synchronous syllable (Experiment 1) and then testing for detection of increasing degrees of A-V asynchrony (366, 500, and 666 ms) or by…

  12. Audio-Visual Speech Perception: A Developmental ERP Investigation

    ERIC Educational Resources Information Center

    Knowland, Victoria C. P.; Mercure, Evelyne; Karmiloff-Smith, Annette; Dick, Fred; Thomas, Michael S. C.

    2014-01-01

    Being able to see a talking face confers a considerable advantage for speech perception in adulthood. However, behavioural data currently suggest that children fail to make full use of these available visual speech cues until age 8 or 9. This is particularly surprising given the potential utility of multiple informational cues during language…

  13. Curriculum Model for Oculomotor. Binocular, and Visual Perception Dysfunctions.

    ERIC Educational Resources Information Center

    Journal of Optometric Education, 1988

    1988-01-01

    A curriculum for disorders of oculomotor control, binocular vision, and visual perception, adopted by the Association of Schools and Colleges of Optometry, is outlined. The curriculum's 14 objectives in physiology, perceptual and cognitive development, epidemiology, public health, diagnosis and management, environmental influences, care delivery,…

  14. Dynamic Visual Perception and Reading Development in Chinese School Children

    ERIC Educational Resources Information Center

    Meng, Xiangzhi; Cheng-Lai, Alice; Zeng, Biao; Stein, John F.; Zhou, Xiaolin

    2011-01-01

    The development of reading skills may depend to a certain extent on the development of basic visual perception. The magnocellular theory of developmental dyslexia assumes that deficits in the magnocellular pathway, indicated by less sensitivity in perceiving dynamic sensory stimuli, are responsible for a proportion of reading difficulties…

  15. Enhanced visual perception with occipital transcranial magnetic stimulation.

    PubMed

    Mulckhuyse, Manon; Kelley, Todd A; Theeuwes, Jan; Walsh, Vincent; Lavie, Nilli

    2011-10-01

    Transcranial magnetic stimulation (TMS) over the occipital pole can produce an illusory percept of a light flash (or 'phosphene'), suggesting an excitatory effect. Whereas previous reported effects produced by single-pulse occipital pole TMS are typically disruptive, here we report the first demonstration of a location-specific facilitatory effect on visual perception in humans. Observers performed a spatial cueing orientation discrimination task. An orientation target was presented in one of two peripheral placeholders. A single pulse below the phosphene threshold applied to the occipital pole 150 or 200 ms before stimulus onset was found to facilitate target discrimination in the contralateral compared with the ipsilateral visual field. At the 150-ms time window contralateral TMS also amplified cueing effects, increasing both facilitation effects for valid cues and interference effects for invalid cues. These results are the first to show location-specific enhanced visual perception with single-pulse occipital pole stimulation prior to stimulus presentation, suggesting that occipital stimulation can enhance the excitability of visual cortex to subsequent perception. PMID:21848918

  16. Exaggerated color perception in a patient with visual form agnosia.

    PubMed

    Yang, Jiongjiong; Wu, Ming; Shen, Zheng

    2007-10-01

    Previous studies on visual form agnosic patients have shown that their color perception is relatively preserved when monochromatic figures are used. However, it is unclear whether their color perception remains normal when figures are composed of two parts in different colors. The results showed that patient X.F. had difficulty in naming both colors when the two colors were placed next to each other, and in discriminating the two-color figure from the figure presented in its larger color. In contrast, X.F. could name the two colors when they were physically separated. These data suggest that X.F. manifests exaggerated color perception, producing a color filling-in effect that may be mediated by her spared early visual area.

  17. The effect of induced visual stress on three dimensional perception.

    PubMed

    Abd-Manan, F

    2000-07-01

    Previous studies have shown that stress on the vergence and accommodation systems, either artificially induced or naturally occurring, results in small misalignments of the visual axes, reduces binocular visual acuity and produces symptoms of ocular discomfort. This study examines the effect of visual stress, artificially induced with ophthalmic prisms, on three-dimensional perception in 30 optometry students aged 19 to 29 years. 6D base-in prisms, equally divided between the eyes (3D base-in each), were used to stress the visual system and produce the misalignment of the visual axes known as fixation disparity. Fixation disparity was quantified using a near-vision Mallett unit and an enlarged scaled diagram. Stereoscopic perception was measured with the TNO test with and without the induced stress, and the results were compared. Wilcoxon's matched-pairs ranked tests showed statistically significant differences in the stereo thresholds between the two conditions (p = 0.01 for advancing stereopsis and p = 0.01 for receding stereopsis). The study concludes that visual stress induced by prisms produces misalignment of the visual axes and thus reduces three-dimensional performance. PMID:22977386

  18. Integration of visual and motion cues for simulator requirements and ride quality investigation. [computerized simulation of aircraft landing, visual perception of aircraft pilots

    NASA Technical Reports Server (NTRS)

    Young, L. R.

    1975-01-01

    Preliminary tests and evaluations of pilot performance during landing (flight paths) using computer-generated images (video tapes) are presented. Psychophysiological factors affecting pilot visual perception were measured. A turning flight maneuver (pitch and roll) was specifically studied using a training device, and the scaling laws involved were determined. Also presented are medical studies (abstracts) on human response to gravity variations without visual cues, the effects of acceleration stimuli on the semicircular canals, neurons affecting eye movements, and vestibular tests.

  19. Types and tokens in visual letter perception.

    PubMed

    Mozer, M C

    1989-05-01

    Five experiments demonstrate that in briefly presented displays, subjects have difficulty distinguishing repeated instances of a letter or digit (multiple tokens of the same type). When subjects were asked to estimate the numerosity of a display, reports were lower for displays containing repeated letters, for example, DDDD, than for displays containing distinct letters, for example, NRVT. This homogeneity effect depends on the common visual form of adjacent letters. A distinct homogeneity effect, one that depends on the repetition of abstract letter identities, was also found: When subjects were asked to report the number of As and Es in a display, performance was poorer on displays containing two instances of a target letter, one appearing in uppercase and the other in lowercase, than on displays containing one of each target letter. This effect must be due to the repetition of identities, because visual form is not repeated in these mixed-case displays. Further experiments showed that this effect was not influenced by the context surrounding the target letters, and that it can be tied to limitations in attentional processing. The results are interpreted in terms of a model in which parallel encoding processes are capable of automatically analyzing information from several regions of the visual field simultaneously, but fail to accurately encode location information. The resulting representation is thus insufficient to distinguish one token from another because two tokens of a given type differ only in location. However, with serial attentional processing multiple tokens can be kept distinct, pointing to yet another limit on the ability to process visual information in parallel.

  20. Quality of Visual Cue Affects Visual Reweighting in Quiet Standing

    PubMed Central

    Moraes, Renato; de Freitas, Paulo Barbosa; Razuk, Milena; Barela, José Angelo

    2016-01-01

    Sensory reweighting is a characteristic of postural control functioning adopted to accommodate environmental changes. The use of monocular or binocular cues reduces or increases the influence of a moving room on postural sway, suggesting visual reweighting based on the quality of the available sensory cues. Because in our previous study visual conditions were set before each trial, participants could adjust the weight of the different sensory systems in an anticipatory manner based upon the reduction in quality of the visual information. Nevertheless, in daily situations this adjustment is a dynamic process and occurs during ongoing movement. The purpose of this study was to examine the effect of visual transitions on the coupling between visual information and body sway at two different distances from the front wall of a moving room. Eleven young adults stood upright inside a moving room at two distances (75 and 150 cm) wearing liquid crystal lens goggles, which allowed each lens to switch from opaque to transparent and vice versa. Participants stood still for five minutes in each trial, and the lens status changed every minute (no vision to binocular vision, no vision to monocular vision, binocular vision to monocular vision, and vice versa). Results showed that the farther distance and monocular vision reduced the effect of the visual manipulation on postural sway. The effect of visual transition was condition dependent, with a stronger effect when transitions involved binocular vision than monocular vision. Based upon these results, we conclude that the increased distance from the front wall of the room reduced the effect of visual manipulation on postural sway and that sensory reweighting is stimulus quality dependent, with binocular vision producing a much stronger down/up-weighting than monocular vision. PMID:26939058

  1. Prenatal exposure to recreational drugs affects global motion perception in preschool children.

    PubMed

    Chakraborty, Arijit; Anstice, Nicola S; Jacobs, Robert J; LaGasse, Linda L; Lester, Barry M; Wouldes, Trecia A; Thompson, Benjamin

    2015-01-01

    Prenatal exposure to recreational drugs impairs motor and cognitive development; however it is currently unknown whether visual brain areas are affected. To address this question, we investigated the effect of prenatal drug exposure on global motion perception, a behavioural measure of processing within the dorsal extrastriate visual cortex that is thought to be particularly vulnerable to abnormal neurodevelopment. Global motion perception was measured in one hundred and forty-five 4.5-year-old children who had been exposed to different combinations of methamphetamine, alcohol, nicotine and marijuana prior to birth and 25 unexposed children. Self-reported drug use by the mothers was verified by meconium analysis. We found that global motion perception was impaired by prenatal exposure to alcohol and improved significantly by exposure to marijuana. Exposure to both drugs prenatally had no effect. Other visual functions such as habitual visual acuity and stereoacuity were not affected by drug exposure. Prenatal exposure to methamphetamine did not influence visual function. Our results demonstrate that prenatal drug exposure can influence a behavioural measure of visual development, but that the effects are dependent on the specific drugs used during pregnancy. PMID:26581958

  2. Prenatal exposure to recreational drugs affects global motion perception in preschool children

    PubMed Central

    Chakraborty, Arijit; Anstice, Nicola S.; Jacobs, Robert J.; LaGasse, Linda L.; Lester, Barry M.; Wouldes, Trecia A.; Thompson, Benjamin

    2015-01-01

    Prenatal exposure to recreational drugs impairs motor and cognitive development; however it is currently unknown whether visual brain areas are affected. To address this question, we investigated the effect of prenatal drug exposure on global motion perception, a behavioural measure of processing within the dorsal extrastriate visual cortex that is thought to be particularly vulnerable to abnormal neurodevelopment. Global motion perception was measured in one hundred and forty-five 4.5-year-old children who had been exposed to different combinations of methamphetamine, alcohol, nicotine and marijuana prior to birth and 25 unexposed children. Self-reported drug use by the mothers was verified by meconium analysis. We found that global motion perception was impaired by prenatal exposure to alcohol and improved significantly by exposure to marijuana. Exposure to both drugs prenatally had no effect. Other visual functions such as habitual visual acuity and stereoacuity were not affected by drug exposure. Prenatal exposure to methamphetamine did not influence visual function. Our results demonstrate that prenatal drug exposure can influence a behavioural measure of visual development, but that the effects are dependent on the specific drugs used during pregnancy. PMID:26581958

  3. Audiovisual associations alter the perception of low-level visual motion.

    PubMed

    Kafaligonul, Hulusi; Oluk, Can

    2015-01-01

    Motion perception is a pervasive aspect of vision and is affected both by the immediate pattern of sensory inputs and by prior experiences acquired through associations. Recently, several studies reported that an association can be established quickly between directions of visual motion and static sounds of distinct frequencies. After the association is formed, sounds are able to change the perceived direction of visual motion. To determine whether such rapidly acquired audiovisual associations and their subsequent influences on visual motion perception depend on the involvement of higher-order attentive tracking mechanisms, we designed psychophysical experiments using regular and reverse-phi random-dot motions that isolate low-level pre-attentive motion processing. Our results show that an association between the directions of low-level visual motion and static sounds can be formed, and that this audiovisual association alters the subsequent perception of low-level visual motion. These findings support the view that audiovisual associations are not restricted to the high-level, attention-based motion system and that early-level visual motion processing plays a potential role.

  4. Audiovisual associations alter the perception of low-level visual motion.

    PubMed

    Kafaligonul, Hulusi; Oluk, Can

    2015-01-01

    Motion perception is a pervasive aspect of vision and is affected both by the immediate pattern of sensory inputs and by prior experiences acquired through associations. Recently, several studies reported that an association can be established quickly between directions of visual motion and static sounds of distinct frequencies. After the association is formed, sounds are able to change the perceived direction of visual motion. To determine whether such rapidly acquired audiovisual associations and their subsequent influences on visual motion perception depend on the involvement of higher-order attentive tracking mechanisms, we designed psychophysical experiments using regular and reverse-phi random-dot motions that isolate low-level pre-attentive motion processing. Our results show that an association between the directions of low-level visual motion and static sounds can be formed, and that this audiovisual association alters the subsequent perception of low-level visual motion. These findings support the view that audiovisual associations are not restricted to the high-level, attention-based motion system and that early-level visual motion processing plays a potential role. PMID:25873869

  5. Adaptation to visual or auditory time intervals modulates the perception of visual apparent motion

    PubMed Central

    Zhang, Huihui; Chen, Lihan; Zhou, Xiaolin

    2012-01-01

    It is debated whether sub-second timing is subserved by a centralized mechanism or by the intrinsic properties of task-related neural activity in specific modalities (Ivry and Schlerf, 2008). By using a temporal adaptation task, we investigated whether adapting to different time intervals conveyed through stimuli in different modalities (i.e., frames of a visual Ternus display, visual blinking discs, or auditory beeps) would affect the subsequent implicit perception of visual timing, i.e., inter-stimulus interval (ISI) between two frames in a Ternus display. The Ternus display can induce two percepts of apparent motion (AM), depending on the ISI between the two frames: “element motion” for short ISIs, in which the endmost disc is seen as moving back and forth while the middle disc at the overlapping or central position remains stationary; “group motion” for longer ISIs, in which both discs appear to move in a manner of lateral displacement as a whole. In Experiment 1, participants adapted to either the typical “element motion” (ISI = 50 ms) or the typical “group motion” (ISI = 200 ms). In Experiments 2 and 3, participants adapted to a time interval of 50 or 200 ms through observing a series of two paired blinking discs at the center of the screen (Experiment 2) or hearing a sequence of two paired beeps (with pitch 1000 Hz). In Experiment 4, participants adapted to sequences of paired beeps with either low pitches (500 Hz) or high pitches (5000 Hz). After adaptation in each trial, participants were presented with a Ternus probe in which the ISI between the two frames was equal to the transitional threshold of the two types of motions, as determined by a pretest. Results showed that adapting to the short time interval in all the situations led to more reports of “group motion” in the subsequent Ternus probes; adapting to the long time interval, however, caused no aftereffect for visual adaptation but significantly more reports of group motion for

  6. The Perception of Cooperativeness Without Any Visual or Auditory Communication.

    PubMed

    Chang, Dong-Seon; Burger, Franziska; Bülthoff, Heinrich H; de la Rosa, Stephan

    2015-12-01

    Perceiving social information such as the cooperativeness of another person is an important part of human interaction. But can people perceive the cooperativeness of others even without any visual or auditory information? In a novel experimental setup, we connected two people with a rope and made them accomplish a point-collecting task together while they could not see or hear each other. We observed a consistently emerging turn-taking behavior in the interactions and installed a confederate in a subsequent experiment who either minimized or maximized this behavior. Participants experienced this only through the haptic force-feedback of the rope and made evaluations about the confederate after each interaction. We found that perception of cooperativeness was significantly affected only by the manipulation of this turn-taking behavior. Gender- and size-related judgments also significantly differed. Our results suggest that people can perceive social information such as the cooperativeness of other people even in situations where possibilities for communication are minimal.

  7. The Perception of Cooperativeness Without Any Visual or Auditory Communication

    PubMed Central

    Chang, Dong-Seon; Burger, Franziska; de la Rosa, Stephan

    2015-01-01

    Perceiving social information such as the cooperativeness of another person is an important part of human interaction. But can people perceive the cooperativeness of others even without any visual or auditory information? In a novel experimental setup, we connected two people with a rope and made them accomplish a point-collecting task together while they could not see or hear each other. We observed a consistently emerging turn-taking behavior in the interactions and installed a confederate in a subsequent experiment who either minimized or maximized this behavior. Participants experienced this only through the haptic force-feedback of the rope and made evaluations about the confederate after each interaction. We found that perception of cooperativeness was significantly affected only by the manipulation of this turn-taking behavior. Gender- and size-related judgments also significantly differed. Our results suggest that people can perceive social information such as the cooperativeness of other people even in situations where possibilities for communication are minimal. PMID:27551362

  8. The Perception of Cooperativeness Without Any Visual or Auditory Communication.

    PubMed

    Chang, Dong-Seon; Burger, Franziska; Bülthoff, Heinrich H; de la Rosa, Stephan

    2015-12-01

    Perceiving social information such as the cooperativeness of another person is an important part of human interaction. But can people perceive the cooperativeness of others even without any visual or auditory information? In a novel experimental setup, we connected two people with a rope and made them accomplish a point-collecting task together while they could not see or hear each other. We observed a consistently emerging turn-taking behavior in the interactions and installed a confederate in a subsequent experiment who either minimized or maximized this behavior. Participants experienced this only through the haptic force-feedback of the rope and made evaluations about the confederate after each interaction. We found that perception of cooperativeness was significantly affected only by the manipulation of this turn-taking behavior. Gender- and size-related judgments also significantly differed. Our results suggest that people can perceive social information such as the cooperativeness of other people even in situations where possibilities for communication are minimal. PMID:27551362

  9. Nondeterministic data base for computerized visual perception

    NASA Technical Reports Server (NTRS)

    Yakimovsky, Y.

    1976-01-01

    A description is given of the knowledge representation data base in the perception subsystem of the Mars robot vehicle prototype. Two types of information are stored. The first is generic information that represents general rules that are conformed to by structures in the expected environments. The second kind of information is a specific description of a structure, i.e., the properties and relations of objects in the specific case being analyzed. The generic knowledge is represented so that it can be applied to extract and infer the description of specific structures. The generic model of the rules is substantially a Bayesian representation of the statistics of the environment, which means it is geared to representation of nondeterministic rules relating properties of, and relations between, objects. The description of a specific structure is also nondeterministic in the sense that all properties and relations may take a range of values with an associated probability distribution.

  10. Modeling visual clutter perception using proto-object segmentation

    PubMed Central

    Yu, Chen-Ping; Samaras, Dimitris; Zelinsky, Gregory J.

    2014-01-01

    We introduce the proto-object model of visual clutter perception. This unsupervised model segments an image into superpixels, then merges neighboring superpixels that share a common color cluster to obtain proto-objects—defined here as spatially extended regions of coherent features. Clutter is estimated by simply counting the number of proto-objects. We tested this model using 90 images of realistic scenes that were ranked by observers from least to most cluttered. Comparing this behaviorally obtained ranking to a ranking based on the model clutter estimates, we found a significant correlation between the two (Spearman's ρ = 0.814, p < 0.001). We also found that the proto-object model was highly robust to changes in its parameters and was generalizable to unseen images. We compared the proto-object model to six other models of clutter perception and demonstrated that it outperformed each, in some cases dramatically. Importantly, we also showed that the proto-object model was a better predictor of clutter perception than an actual count of the number of objects in the scenes, suggesting that the set size of a scene may be better described by proto-objects than objects. We conclude that the success of the proto-object model is due in part to its use of an intermediate level of visual representation—one between features and objects—and that this is evidence for the potential importance of a proto-object representation in many common visual percepts and tasks. PMID:24904121
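
    A minimal sketch of the pipeline as described in the abstract, assuming scikit-image for SLIC superpixels and scikit-learn for colour clustering; the parameter values and the union-find merge of same-cluster neighbours are illustrative choices, not the published implementation.

```python
import numpy as np
from skimage.segmentation import slic
from sklearn.cluster import KMeans

def proto_object_clutter(image, n_segments=300, n_color_clusters=8):
    """Estimate clutter as the number of proto-objects in an RGB image (H x W x 3)."""
    # 1. Over-segment the image into superpixels
    labels = slic(image, n_segments=n_segments, compactness=10, start_label=0)
    _, labels = np.unique(labels, return_inverse=True)        # ensure contiguous labels
    labels = labels.reshape(image.shape[:2])
    n = labels.max() + 1
    # 2. Mean colour of each superpixel, assigned to a coarse colour cluster
    mean_colors = np.array([image[labels == i].mean(axis=0) for i in range(n)])
    clusters = KMeans(n_clusters=n_color_clusters, n_init=10).fit_predict(mean_colors)
    # 3. Merge neighbouring superpixels that fall in the same colour cluster (union-find)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra
    for a, b in zip(labels[:, :-1].ravel(), labels[:, 1:].ravel()):
        if a != b and clusters[a] == clusters[b]:
            union(a, b)
    for a, b in zip(labels[:-1, :].ravel(), labels[1:, :].ravel()):
        if a != b and clusters[a] == clusters[b]:
            union(a, b)
    # 4. Clutter estimate = number of merged regions (proto-objects)
    return len({find(i) for i in range(n)})
```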

  11. Neuronal codes for visual perception and memory.

    PubMed

    Quian Quiroga, Rodrigo

    2016-03-01

    In this review, I describe and contrast the representation of stimuli in visual cortical areas and in the medial temporal lobe (MTL). While cortex is characterized by a distributed and implicit coding that is optimal for recognition and storage of semantic information, the MTL shows a much sparser and explicit coding of specific concepts that is ideal for episodic memory. I will describe the main characteristics of the coding in the MTL by the so-called concept cells and will then propose a model of the formation and recall of episodic memory based on partially overlapping assemblies. PMID:26707718

  12. Perception, Cognition, and Effectiveness of Visualizations with Applications in Science and Engineering

    NASA Astrophysics Data System (ADS)

    Borkin, Michelle A.

    Visualization is a powerful tool for data exploration and analysis. With data ever-increasing in quantity and becoming integrated into our daily lives, having effective visualizations is necessary. But how does one design an effective visualization? To answer this question we need to understand how humans perceive, process, and understand visualizations. Through visualization evaluation studies we can gain deeper insight into the basic perception and cognition theory of visualizations, both through domain-specific case studies as well as generalized laboratory experiments. This dissertation presents the results of four evaluation studies, each of which contributes new knowledge to the theory of perception and cognition of visualizations. The results of these studies include a deeper, clearer understanding of how color, data representation dimensionality, spatial layout, and visual complexity affect a visualization's effectiveness, as well as how visualization types and visual attributes affect the memorability of a visualization. We first present the results of two domain-specific case study evaluations. The first study is in the field of biomedicine in which we developed a new heart disease diagnostic tool, and conducted a study to evaluate the effectiveness of 2D versus 3D data representations as well as color maps. In the second study, we developed a new visualization tool for filesystem provenance data with applications in computer science and the sciences more broadly. We additionally developed a new time-based hierarchical node grouping method. We then conducted a study to evaluate the effectiveness of the new tool with its radial layout versus the conventional node-link diagram, and the new node grouping method. Finally, we discuss the results of two generalized studies designed to understand what makes a visualization memorable. In the first evaluation we focused on visualization memorability and conducted an online study using Amazon's Mechanical Turk with

  13. Visual perception of force: comment on White (2012).

    PubMed

    Hubbard, Timothy L

    2012-07-01

    White (2012) proposed that kinematic features in a visual percept are matched to stored representations containing information regarding forces (based on prior haptic experience) and that information in the matched, stored representations regarding forces is then incorporated into visual perception. Although some elements of White's (2012) account appear consistent with previous findings and theories, other elements do not appear consistent with previous findings and theories or are in need of clarification. Some of the latter elements include the (a) differences between perception and impression (representation of force; relationship of force and resistance; role and necessity of stored representations and of concurrent simulation; roles of rules, cues, and heuristics), (b) characteristics of object motion and human movement (whether motion is internally generated or externally generated and whether motion is biological or nonbiological; generalization of human action and the extent to which perceived force depends upon similarity of object movement to human patterns of movement), (c) related perceptual and cognitive phenomena (representational momentum, imagery, psychophysics of force perception, perception of causality), and (d) scope and limitations of White's account (attributions of intentionality, falsifiability). PMID:22730923

  14. Boosting visual cortex function and plasticity with acetylcholine to enhance visual perception

    PubMed Central

    Kang, Jun Il; Huppé-Gourgues, Frédéric; Vaucher, Elvire

    2014-01-01

    The cholinergic system is a potent neuromodulatory system that plays critical roles in cortical plasticity, attention and learning. In this review, we propose that the cellular effects of acetylcholine (ACh) in the primary visual cortex during the processing of visual inputs might induce perceptual learning; i.e., long-term changes in visual perception. Specifically, the pairing of cholinergic activation with visual stimulation increases the signal-to-noise ratio, cue detection ability and long-term facilitation in the primary visual cortex. This cholinergic enhancement would increase the strength of thalamocortical afferents to facilitate the processing of a novel stimulus while decreasing the cortico-cortical signaling to reduce recurrent or top-down modulation. This balance would be mediated by different cholinergic receptor subtypes that are located on both glutamatergic and GABAergic neurons of the different cortical layers. The mechanisms of cholinergic enhancement are closely linked to attentional processes, long-term potentiation (LTP) and modulation of the excitatory/inhibitory balance. Recently, it was found that boosting the cholinergic system during visual training robustly enhances sensory perception in a long-term manner. Our hypothesis is that repetitive pairing of cholinergic and sensory stimulation over a long period of time induces long-term changes in the processing of trained stimuli that might improve perceptual ability. Various non-invasive approaches to the activation of the cholinergic neurons have strong potential to improve visual perception. PMID:25278848

  15. Dynamics of travelling waves in visual perception.

    PubMed

    Wilson, H R; Blake, R; Lee, S H

    2001-08-30

    Nonlinear wave propagation is ubiquitous in nature, appearing in chemical reaction kinetics, cardiac tissue dynamics, cortical spreading depression and slow wave sleep. The application of dynamical modelling has provided valuable insights into the mechanisms underlying such nonlinear wave phenomena in several domains. Wave propagation can also be perceived as sweeping waves of visibility that occur when the two eyes view radically different stimuli. Termed binocular rivalry, these fluctuating states of perceptual dominance and suppression are thought to provide a window into the neural dynamics that underlie conscious visual awareness. Here we introduce a technique to measure the speed of rivalry dominance waves propagating around a large, essentially one-dimensional annulus. When mapped onto visual cortex, propagation speed is independent of eccentricity. Propagation speed doubles when waves travel along continuous contours, thus demonstrating effects of collinear facilitation. A neural model with reciprocal inhibition between two layers of units provides a quantitative explanation of dominance wave propagation in terms of disinhibition. Dominance waves provide a new tool for investigating fundamental cortical dynamics. PMID:11528478
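
    The reciprocal-inhibition account can be sketched with a standard two-population firing-rate model with slow adaptation, in which dominance alternates between the two monocular populations. The parameters below are illustrative, and the sketch omits the spatial dimension needed to produce travelling waves.

```python
import numpy as np

def simulate_rivalry(duration=20.0, dt=1e-3, drive=1.0,
                     w_inhibit=3.0, g_adapt=2.0, tau=0.02, tau_adapt=1.0):
    """Two reciprocally inhibiting populations with slow adaptation (alternating dominance)."""
    relu = lambda v: np.maximum(v, 0.0)                     # threshold-linear rate function
    steps = int(duration / dt)
    activity = np.zeros((steps, 2))
    E = np.array([0.1, 0.0])                                # slight initial asymmetry
    A = np.zeros(2)                                         # slow adaptation variables
    for t in range(steps):
        inp = drive - w_inhibit * E[::-1] - g_adapt * A     # drive minus cross-inhibition and adaptation
        E = E + dt / tau * (-E + relu(inp))
        A = A + dt / tau_adapt * (-A + E)
        activity[t] = E
    return activity   # dominance switches appear as the two columns alternately exceeding each other
```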

  16. Visual stability and space perception in monocular vision: mathematical model.

    PubMed

    Hadani, I; Ishai, G; Gur, M

    1980-01-01

    A deterministic model for monocular space perception is presented. According to the model, retinal luminance changes due to involuntary eye movements are detected and locally analyzed to yield the angular velocity of each image point. The stable three-dimensional spatial coordinates of viewed objects are then reconstructed using a method of infinitesimal transformations. The extraction of the movement (parallax) field from the optical flow is represented by a set of differential equations, the derivation of which is based on the conservation of energy principle. The relation of the model to retinal neurophysiology and to various aspects of visual space perception is discussed.

  17. Linking brain imaging signals to visual perception.

    PubMed

    Welchman, Andrew E; Kourtzi, Zoe

    2013-11-01

    The rapid advances in brain imaging technology over the past 20 years are affording new insights into cortical processing hierarchies in the human brain. These new data provide a complementary front in seeking to understand the links between perceptual and physiological states. Here we review some of the challenges associated with incorporating brain imaging data into such "linking hypotheses," highlighting some of the considerations needed in brain imaging data acquisition and analysis. We discuss work that has sought to link human brain imaging signals to existing electrophysiological data and opened up new opportunities in studying the neural basis of complex perceptual judgments. We consider a range of approaches when using human functional magnetic resonance imaging to identify brain circuits whose activity changes in a similar manner to perceptual judgments and illustrate these approaches by discussing work that has studied the neural basis of 3D perception and perceptual learning. Finally, we describe approaches that have sought to understand the information content of brain imaging data using machine learning and work that has integrated multimodal data to overcome the limitations associated with individual brain imaging approaches. Together these approaches provide an important route in seeking to understand the links between physiological and psychological states.

  18. Visual Timing of Structured Dance Movements Resembles Auditory Rhythm Perception

    PubMed Central

    Su, Yi-Huang; Salazar-López, Elvira

    2016-01-01

    Temporal mechanisms for processing auditory musical rhythms are well established, in which a perceived beat is beneficial for timing purposes. It is yet unknown whether such beat-based timing would also underlie visual perception of temporally structured, ecological stimuli connected to music: dance. In this study, we investigated whether observers extracted a visual beat when watching dance movements to assist visual timing of these movements. Participants watched silent videos of dance sequences and reproduced the movement duration by mental recall. We found better visual timing for limb movements with regular patterns in the trajectories than without, similar to the beat advantage for auditory rhythms. When movements involved both the arms and the legs, the benefit of a visual beat relied only on the latter. The beat-based advantage persisted despite auditory interferences that were temporally incongruent with the visual beat, arguing for the visual nature of these mechanisms. Our results suggest that visual timing principles for dance parallel their auditory counterparts for music, which may be based on common sensorimotor coupling. These processes likely yield multimodal rhythm representations in the scenario of music and dance. PMID:27313900

  19. Audio-visual speech perception: a developmental ERP investigation.

    PubMed

    Knowland, Victoria C P; Mercure, Evelyne; Karmiloff-Smith, Annette; Dick, Fred; Thomas, Michael S C

    2014-01-01

    Being able to see a talking face confers a considerable advantage for speech perception in adulthood. However, behavioural data currently suggest that children fail to make full use of these available visual speech cues until age 8 or 9. This is particularly surprising given the potential utility of multiple informational cues during language learning. We therefore explored this at the neural level. The event-related potential (ERP) technique has been used to assess the mechanisms of audio-visual speech perception in adults, with visual cues reliably modulating auditory ERP responses to speech. Previous work has shown congruence-dependent shortening of auditory N1/P2 latency and congruence-independent attenuation of amplitude in the presence of auditory and visual speech signals, compared to auditory alone. The aim of this study was to chart the development of these well-established modulatory effects over mid-to-late childhood. Experiment 1 employed an adult sample to validate a child-friendly stimulus set and paradigm by replicating previously observed effects of N1/P2 amplitude and latency modulation by visual speech cues; it also revealed greater attenuation of component amplitude given incongruent audio-visual stimuli, pointing to a new interpretation of the amplitude modulation effect. Experiment 2 used the same paradigm to map cross-sectional developmental change in these ERP responses between 6 and 11 years of age. The effect of amplitude modulation by visual cues emerged over development, while the effect of latency modulation was stable over the child sample. These data suggest that auditory ERP modulation by visual speech represents separable underlying cognitive processes, some of which show earlier maturation than others over the course of development. PMID:24176002

  20. Visual Speech Acts Differently Than Lexical Context in Supporting Speech Perception

    PubMed Central

    Samuel, Arthur G.; Lieblich, Jerrold

    2014-01-01

    The speech signal is often badly articulated, and heard under difficult listening conditions. To deal with these problems, listeners make use of various types of context. In the current study, we examine a type of context that in previous work has been shown to affect how listeners report what they hear: visual speech (i.e., the visible movements of the speaker’s articulators). Despite the clear utility of this type of context under certain conditions, prior studies have shown that visually-driven phonetic percepts (via the “McGurk” effect) are not “real” enough to affect perception of later-occurring speech; such percepts have not produced selective adaptation effects. This failure contrasts with successful adaptation by sounds that are generated by lexical context – the word that a sound occurs within. We demonstrate here that this dissociation is robust, leading to the conclusion that visual and lexical contexts operate differently. We suggest that the dissociation reflects the dual nature of speech as both a perceptual object and a linguistic object. Visual speech seems to contribute directly to the computations of the perceptual object, but not the linguistic one, while lexical context is used in both types of computations. PMID:24749935

  1. Dynamic visual speech perception in a patient with visual form agnosia.

    PubMed

    Munhall, K G; Servos, P; Santi, A; Goodale, M A

    2002-10-01

    To examine the role of dynamic cues in visual speech perception, a patient with visual form agnosia (DF) was tested with a set of static and dynamic visual displays of three vowels. Five conditions were tested: (1) auditory only which provided only vocal pitch information, (2) dynamic visual only, (3) dynamic audiovisual with vocal pitch information, (4) dynamic audiovisual with full voice information and (5) static visual only images of postures during vowel production. DF showed normal performance in all conditions except the static visual only condition in which she scored at chance. Control subjects scored close to ceiling in this condition. The results suggest that spatiotemporal signatures for objects and events are processed separately from static form cues.

  2. Image Watermarking Based on Adaptive Models of Human Visual Perception

    NASA Astrophysics Data System (ADS)

    Khawne, Amnach; Hamamoto, Kazuhiko; Chitsobhuk, Orachat

    This paper proposes a digital image watermarking scheme based on adaptive models of human visual perception. The algorithm exploits the local activity estimated from the wavelet coefficients of each subband to adaptively control luminance masking. The adaptive luminance factor is then combined with contrast masking and edge detection and adopted as a visibility threshold. With this combination of adaptive visual sensitivity parameters, the proposed perceptual model is better suited to the differing characteristics of various images. The weighting function is chosen such that fidelity, imperceptibility and robustness are preserved without any perceptible difference in image quality.
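
    A hedged sketch of how per-subband visibility thresholds of this kind might be assembled from wavelet coefficients, assuming PyWavelets and SciPy. The specific luminance and contrast weighting functions are simplified placeholders, and the edge-detection term is omitted, so this is not the authors' exact perceptual model.

```python
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def _resize_nn(array, shape):
    """Nearest-neighbour resize used to align the approximation band with a detail band."""
    rows = np.arange(shape[0]) * array.shape[0] // shape[0]
    cols = np.arange(shape[1]) * array.shape[1] // shape[1]
    return array[np.ix_(rows, cols)]

def visibility_thresholds(image, wavelet="db2", level=2,
                          base=2.0, lum_gain=0.5, contrast_gain=0.8):
    """Per-coefficient visibility thresholds from local activity and local luminance."""
    image = np.asarray(image, dtype=float) / 255.0
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    approx = coeffs[0]                                      # coarse approximation band (luminance proxy)
    masks = []
    for detail_bands in coeffs[1:]:
        band_masks = []
        for band in detail_bands:
            # contrast masking: local activity = local standard deviation of the coefficients
            mean = uniform_filter(band, size=3)
            var = uniform_filter(band ** 2, size=3) - mean ** 2
            activity = np.sqrt(np.clip(var, 0.0, None))
            # luminance masking: normalised local brightness taken from the approximation band
            lum = _resize_nn(approx / (np.abs(approx).max() + 1e-9), band.shape)
            band_masks.append(base * (1.0 + lum_gain * lum) * (1.0 + contrast_gain * activity))
        masks.append(tuple(band_masks))
    return masks   # one threshold map per detail band; watermark energy is kept below it
```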

  3. Uniocular Pulfrich phenomenon: an abnormality of visual perception.

    PubMed Central

    Ell, J J; Gresty, M A

    1982-01-01

    We describe a patient with multiple sclerosis who experienced the Pulfrich illusion of elliptical motion of a target moving linearly when viewing the motion with one eye as opposed to the well recognised binocular manifestation of the phenomenon. The perception of the illusion was independent of the wave form or velocity characteristics of target motion or of retinal image position. We suggest that the occurrence of the phenomenon does not simply reflect delay in the visual system but is a function of an abnormality of perceptual interpretation of visual stimuli occurring at a high integrative level. PMID:7104283

  4. Producing Curious Affects: Visual Methodology as an Affecting and Conflictual Wunderkammer

    ERIC Educational Resources Information Center

    Staunaes, Dorthe; Kofoed, Jette

    2015-01-01

    Digital video cameras, smartphones, internet and iPads are increasingly used as visual research methods with the purpose of creating an affective corpus of data. Such visual methods are often combined with interviews or observations. Not only are visual methods part of the used research methods, the visual products are used as requisites in…

  5. Auditory and visual information in speech perception: A developmental perspective.

    PubMed

    Taitelbaum-Swead, Riki; Fostick, Leah

    2016-01-01

    This study investigates the development of audiovisual speech perception from age 4 to 80, analysing the contributions of modality, context and particular features of the specific language being tested. Data from 77 participants in five age groups are presented. Speech stimuli were presented via auditory, visual and audiovisual modalities. Monosyllabic meaningful and nonsense words were presented at a signal-to-noise ratio of 0 dB. Speech perception accuracy in the audiovisual and auditory modalities followed an inverse U-shape across age, with the lowest performance at ages 4-5 and 65-80. In the visual modality, a clear difference was shown between the performance of children (ages 4-5 and 8-9) and adults (age 20 and above). The findings of the current study have important implications for strategic planning in rehabilitation programmes for child and adult speakers of different languages with hearing difficulties. PMID:27029217

  6. Deceiving the brain: pictures and visual perception.

    PubMed

    Wade, Nicholas J

    2013-01-01

    Pictures deceive the brain: they provide distillations of objects or ideas into simpler shapes. They create the impression of representing that which cannot be presented. Even at the level of the photograph, the links between pictorial images (the contents of pictures) and objects are tenuous. The dimensions of depth and motion are missing from a pictorial image, and this alone introduces all manner of potential ambiguities. The history of art can be considered as exploring the missing link between image and object. Pictorial images can be spatialized or stylized; spatialized images (like photographs) generally share some of the projective characteristics of the object represented. Written words are also images but they do not resemble the objects they represent--they are stylized or conventional. Pictures can also be illusions--deceptions of vision so that what is seen does not necessarily correspond to what is physically presented. Most of visual science is now concerned with pictorial images--two-dimensional displays on computer monitors. Is vision now the science of deception?

  7. Accuracy aspects of stereo side-looking radar. [analysis of its visual perception and binocular vision

    NASA Technical Reports Server (NTRS)

    Leberl, F. W.

    1979-01-01

    The geometry of the radar stereo model and factors affecting visual radar stereo perception are reviewed. Limits to the vertical exaggeration factor of stereo radar are defined. Radar stereo model accuracies are analyzed with respect to coordinate errors caused by errors of radar sensor position and of range, and with respect to errors of coordinate differences, i.e., cross-track distances and height differences.

  8. Influence of aging on visual perception and visual motor integration in Korean adults.

    PubMed

    Kim, Eunhwi; Park, Young-Kyung; Byun, Yong-Hyun; Park, Mi-Sook; Kim, Hong

    2014-08-01

    This study investigated age-related changes of cognitive function in Korean adults using the Korean-Developmental Test of Visual Perception-2 (K-DTVP-2) and the Visual Motor Integration-3rd Revision (VMI-3R) test, and determined the main factors influencing VP and VMI in older adults. For this research, 139 adults for the K-DTVP-2 and 192 adults for the VMI-3R, from a total of 283 participants, were randomly and separately recruited in province, Korea. The present study showed that the mean score of the K-DTVP-2 and VMI-3R in 10-yr age increments significantly decreased as age increased (K-DTVP-2, F= 41.120, P< 0.001; VMI-3R, F= 16.583, P< 0.001). The mean score of the VMI-3R and K-DTVP-2 were significantly decreased in participants in their 50s compared to those in their 20s (P< 0.05). Age (t= -9.130, P< 0.001), gender (t= 3.029, P= 0.003), and the presence of diseases (t= -2.504, P= 0.013) were the significant factors affecting K-DTVP-2 score. On the other hand, age (t= -6.300, P< 0.001) was the only significant factor affecting VMI-3R score. K-DTVP-2 score (Standardized β= -0.611) decreased more sensitively with aging than VMI-3R (Standardized β= -0.467). The two measurements had a significant positive correlation (r = 0.855, P< 0.001). In conclusion, it can be suggested that VP and VMI should be regularly checked from an individual's 50s, which is a critical period for detecting cognitive decline by aging. Both the K-DTVP-2 and VMI-3R could be used for determining the level of cognitive deficit by aging. PMID:25210701

  9. A STUDY OF CONCEPTUAL DEVELOPMENT AND VISUAL PERCEPTION IN SIX-YEAR-OLD CHILDREN.

    PubMed

    Bütün Ayhan, Aynur; Aki, Esra; Mutlu, Burcu; Aral, Neriman

    2015-12-01

    Visual perception comprises established responses to visual stimuli. Conceptual development accompanies the development of visual perception skills. Both visual perception and sufficient conceptual development are vital to a child's academic skills and social participation. The aim of this study was to examine the relationship between conceptual development and the visual perceptual skills of six-year-old children. A total of 140 children were administered Bracken's (1998) Basic Concept Scale (BBCS-R) and the Frostig Developmental Visual Perception Test. BBCS-R scores were weakly correlated with FDVPT Discrimination of figure-ground, and had moderate and significant correlations with Constancy of the figures, Perception of position in space, Perception of spatial relation, and the Total score on visual perception. Also, a moderate correlation was found between the total score of the FDVPT and the total score of the BBCS-R.

  10. Aesthetic perception of visual textures: a holistic exploration using texture analysis, psychological experiment, and perception modeling.

    PubMed

    Liu, Jianli; Lughofer, Edwin; Zeng, Xianyi

    2015-01-01

    Modeling human aesthetic perception of visual textures is important and valuable in numerous industrial domains, such as product design, architectural design, and decoration. Based on results from a semantic differential rating experiment, we modeled the relationship between low-level basic texture features and aesthetic properties involved in human aesthetic texture perception. First, we compute basic texture features from textural images using four classical methods. These features are neutral, objective, and independent of the socio-cultural context of the visual textures. Then, we conduct a semantic differential rating experiment to collect from evaluators their aesthetic perceptions of selected textural stimuli. In the semantic differential rating experiment, eight pairs of aesthetic properties are chosen, which are strongly related to the socio-cultural context of the selected textures and to human emotions. They are easily understood and connected to everyday life. We propose a hierarchical feed-forward layer model of aesthetic texture perception and assign 8 pairs of aesthetic properties to different layers. Finally, we describe the generation of multiple linear and non-linear regression models for aesthetic prediction by taking dimensionality-reduced texture features and aesthetic properties of visual textures as dependent and independent variables, respectively. Our experimental results indicate that the relationships between each layer and its neighbors in the hierarchical feed-forward layer model of aesthetic texture perception can be fitted well by linear functions, and the models thus generated can successfully bridge the gap between computational texture features and aesthetic texture properties. PMID:26582987
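
    As an illustration of the regression step described above, the sketch below fits one linear model per aesthetic property from dimensionality-reduced texture features. The file names, the PCA step, and the array shapes are hypothetical stand-ins for the paper's actual features and ratings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Hypothetical inputs: one row per texture image.
# features: basic texture descriptors (e.g., filter-bank statistics), shape (n_textures, n_features)
# ratings:  mean semantic-differential ratings for 8 aesthetic property pairs, shape (n_textures, 8)
features = np.load("texture_features.npy")   # hypothetical file
ratings = np.load("aesthetic_ratings.npy")   # hypothetical file

# Dimensionality reduction of the raw texture features
reduced = PCA(n_components=10).fit_transform(features)

# One linear mapping per aesthetic property (the layer-to-layer relations are assumed linear)
models = [LinearRegression().fit(reduced, ratings[:, k]) for k in range(ratings.shape[1])]
fit_quality = [round(m.score(reduced, ratings[:, k]), 3) for k, m in enumerate(models)]
print(fit_quality)   # in-sample R^2 per aesthetic property
```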

  11. Aesthetic perception of visual textures: a holistic exploration using texture analysis, psychological experiment, and perception modeling

    PubMed Central

    Liu, Jianli; Lughofer, Edwin; Zeng, Xianyi

    2015-01-01

    Modeling human aesthetic perception of visual textures is important and valuable in numerous industrial domains, such as product design, architectural design, and decoration. Based on results from a semantic differential rating experiment, we modeled the relationship between low-level basic texture features and aesthetic properties involved in human aesthetic texture perception. First, we compute basic texture features from textural images using four classical methods. These features are neutral, objective, and independent of the socio-cultural context of the visual textures. Then, we conduct a semantic differential rating experiment to collect from evaluators their aesthetic perceptions of selected textural stimuli. In the semantic differential rating experiment, eight pairs of aesthetic properties are chosen, which are strongly related to the socio-cultural context of the selected textures and to human emotions. They are easily understood and connected to everyday life. We propose a hierarchical feed-forward layer model of aesthetic texture perception and assign 8 pairs of aesthetic properties to different layers. Finally, we describe the generation of multiple linear and non-linear regression models for aesthetic prediction by taking dimensionality-reduced texture features and aesthetic properties of visual textures as dependent and independent variables, respectively. Our experimental results indicate that the relationships between each layer and its neighbors in the hierarchical feed-forward layer model of aesthetic texture perception can be fitted well by linear functions, and the models thus generated can successfully bridge the gap between computational texture features and aesthetic texture properties. PMID:26582987

  12. Aesthetic perception of visual textures: a holistic exploration using texture analysis, psychological experiment, and perception modeling.

    PubMed

    Liu, Jianli; Lughofer, Edwin; Zeng, Xianyi

    2015-01-01

    Modeling human aesthetic perception of visual textures is important and valuable in numerous industrial domains, such as product design, architectural design, and decoration. Based on results from a semantic differential rating experiment, we modeled the relationship between low-level basic texture features and aesthetic properties involved in human aesthetic texture perception. First, we compute basic texture features from textural images using four classical methods. These features are neutral, objective, and independent of the socio-cultural context of the visual textures. Then, we conduct a semantic differential rating experiment to collect from evaluators their aesthetic perceptions of selected textural stimuli. In the semantic differential rating experiment, eight pairs of aesthetic properties are chosen, which are strongly related to the socio-cultural context of the selected textures and to human emotions. They are easily understood and connected to everyday life. We propose a hierarchical feed-forward layer model of aesthetic texture perception and assign the 8 pairs of aesthetic properties to different layers. Finally, we describe the generation of multiple linear and non-linear regression models for aesthetic prediction by taking dimensionality-reduced texture features and aesthetic properties of visual textures as dependent and independent variables, respectively. Our experimental results indicate that the relationships between each layer and its neighbors in the hierarchical feed-forward layer model of aesthetic texture perception can be fitted well by linear functions, and the models thus generated can successfully bridge the gap between computational texture features and aesthetic texture properties.
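
    To make the final modeling step concrete, here is a minimal sketch of the kind of regression the abstract describes: dimensionality-reduced texture features predicting a single aesthetic rating scale. The data, the five-component PCA, and the single rating axis are hypothetical stand-ins, not the authors' feature sets or scales.

```python
import numpy as np

# Hypothetical data: 50 texture images, 32 basic texture features each, and a
# mean semantic-differential score on one aesthetic scale per image.
rng = np.random.default_rng(0)
features = rng.normal(size=(50, 32))
ratings = rng.normal(size=50)

# Dimensionality reduction by PCA (via SVD of the centered feature matrix).
centered = features - features.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
reduced = centered @ vt[:5].T          # keep the first 5 principal components

# Multiple linear regression: aesthetic rating ~ reduced texture features.
design = np.column_stack([np.ones(len(reduced)), reduced])
coeffs, *_ = np.linalg.lstsq(design, ratings, rcond=None)
predicted = design @ coeffs
r2 = 1 - np.sum((ratings - predicted) ** 2) / np.sum((ratings - ratings.mean()) ** 2)
print(f"R^2 of the linear fit: {r2:.3f}")
```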

  13. Ongoing Slow Fluctuations in V1 Impact on Visual Perception.

    PubMed

    Wohlschläger, Afra M; Glim, Sarah; Shao, Junming; Draheim, Johanna; Köhler, Lina; Lourenço, Susana; Riedl, Valentin; Sorg, Christian

    2016-01-01

    The human brain's ongoing activity is characterized by intrinsic networks of coherent fluctuations, measured for example with correlated functional magnetic resonance imaging signals. So far, however, the brain processes underlying this ongoing blood oxygenation level dependent (BOLD) signal orchestration and their direct relevance for human behavior are not sufficiently understood. In this study, we address the question of whether and how ongoing BOLD activity within intrinsic occipital networks impacts on conscious visual perception. To this end, backwardly masked targets were presented in participants' left visual field only, leaving the ipsi-lateral occipital areas entirely free from direct effects of task throughout the experiment. Signal time courses of ipsi-lateral BOLD fluctuations in visual areas V1 and V2 were then used as proxies for the ongoing contra-lateral BOLD activity within the bilateral networks. Magnitude and phase of these fluctuations were compared in trials with and without conscious visual perception, operationalized by means of subjective confidence ratings. Our results show that ipsi-lateral BOLD magnitudes in V1 were significantly higher at times of peak response when the target was perceived consciously. A significant difference between conscious and non-conscious perception with regard to the pre-target phase of an intrinsic-frequency regime suggests that ongoing V1 fluctuations exert a decisive impact on the access to consciousness already before stimulation. Both effects were absent in V2. These results thus support the notion that ongoing slow BOLD activity within intrinsic networks covering V1 represents localized processes that modulate the degree of readiness for the emergence of visual consciousness. PMID:27601986
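
    As a rough illustration of how a pre-target phase analysis of this kind can be set up, the sketch below band-limits a hypothetical ongoing signal, extracts its instantaneous phase with a Hilbert transform, and compares circular mean phases between trial types. The sampling rate, frequency band, and trial counts are illustrative assumptions, not the study's parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def phase_at_onsets(signal, onsets, fs, band=(0.01, 0.1)):
    """Instantaneous phase of the band-limited signal at the given sample indices."""
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype='band')
    return np.angle(hilbert(filtfilt(b, a, signal)))[onsets]

def circular_mean(phases):
    return np.angle(np.mean(np.exp(1j * phases)))

# Hypothetical ipsilateral V1 time course and trial onsets (fs = 2 samples/s).
rng = np.random.default_rng(1)
bold = rng.normal(size=2400)
seen = rng.choice(2400, 40, replace=False)      # trials with conscious perception
unseen = rng.choice(2400, 40, replace=False)    # trials without

print("mean pre-target phase, seen:  ", round(circular_mean(phase_at_onsets(bold, seen, 2.0)), 2))
print("mean pre-target phase, unseen:", round(circular_mean(phase_at_onsets(bold, unseen, 2.0)), 2))
```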

  14. Ongoing Slow Fluctuations in V1 Impact on Visual Perception

    PubMed Central

    Wohlschläger, Afra M.; Glim, Sarah; Shao, Junming; Draheim, Johanna; Köhler, Lina; Lourenço, Susana; Riedl, Valentin; Sorg, Christian

    2016-01-01

    The human brain’s ongoing activity is characterized by intrinsic networks of coherent fluctuations, measured for example with correlated functional magnetic resonance imaging signals. So far, however, the brain processes underlying this ongoing blood oxygenation level dependent (BOLD) signal orchestration and their direct relevance for human behavior are not sufficiently understood. In this study, we address the question of whether and how ongoing BOLD activity within intrinsic occipital networks impacts on conscious visual perception. To this end, backwardly masked targets were presented in participants’ left visual field only, leaving the ipsi-lateral occipital areas entirely free from direct effects of task throughout the experiment. Signal time courses of ipsi-lateral BOLD fluctuations in visual areas V1 and V2 were then used as proxies for the ongoing contra-lateral BOLD activity within the bilateral networks. Magnitude and phase of these fluctuations were compared in trials with and without conscious visual perception, operationalized by means of subjective confidence ratings. Our results show that ipsi-lateral BOLD magnitudes in V1 were significantly higher at times of peak response when the target was perceived consciously. A significant difference between conscious and non-conscious perception with regard to the pre-target phase of an intrinsic-frequency regime suggests that ongoing V1 fluctuations exert a decisive impact on the access to consciousness already before stimulation. Both effects were absent in V2. These results thus support the notion that ongoing slow BOLD activity within intrinsic networks covering V1 represents localized processes that modulate the degree of readiness for the emergence of visual consciousness.

  16. Sound Affects the Speed of Visual Processing

    ERIC Educational Resources Information Center

    Keetels, Mirjam; Vroomen, Jean

    2011-01-01

    The authors examined the effects of a task-irrelevant sound on visual processing. Participants were presented with revolving clocks at or around central fixation and reported the hand position of a target clock at the time an exogenous cue (1 clock turning red) or an endogenous cue (a line pointing toward 1 of the clocks) was presented. A…

  17. Knowledge corruption for visual perception in individuals high on paranoia.

    PubMed

    Moritz, Steffen; Göritz, Anja S; Van Quaquebeke, Niels; Andreou, Christina; Jungclaussen, David; Peters, Maarten J V

    2014-03-30

    Studies revealed that patients with paranoid schizophrenia display overconfidence in errors for memory and social cognition tasks. The present investigation examined whether this pattern holds true for visual perception tasks. Nonclinical participants were recruited via an online panel. Individuals were asked to complete a questionnaire that included the Paranoia Checklist and were then presented with 24 blurry pictures; half contained a hidden object while the other half showed snowy (visual) noise. Participants were asked to state whether the visual items contained an object and how confident they were in their judgment. Data from 1966 individuals were included following a conservative selection process. Participants high on core paranoid symptoms showed a poor calibration of confidence for correct versus incorrect responses. In particular, participants high on paranoia displayed overconfidence in incorrect responses and demonstrated a 20% error rate for responses made with high confidence compared to a 12% error rate in participants with low paranoia scores. Interestingly, paranoia scores declined after performance of the task. For the first time, overconfidence in errors was demonstrated among individuals with high levels of paranoia using a visual perception task, tentatively suggesting it is a ubiquitous phenomenon. In view of the significant decline in paranoia across time, bias modification programs may incorporate items such as the one employed here to teach patients with clinical paranoia the fallibility of human cognition, which may foster subsequent symptom improvement.
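
    The calibration measure reported here, the error rate among responses made with high confidence, reduces to a short computation; the sketch below uses made-up trial data for a single participant.

```python
import numpy as np

# Hypothetical trials for one participant: correctness (1/0) and a confidence
# rating (1 = guess ... 4 = very sure) for each of 24 blurry pictures.
correct    = np.array([1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 1,
                       0, 1, 1, 1, 0, 1, 1, 1, 0, 1, 1, 1])
confidence = np.array([4, 3, 4, 2, 4, 4, 3, 4, 2, 4, 4, 3,
                       4, 1, 4, 4, 4, 2, 3, 4, 4, 4, 3, 4])

# Error rate among responses made with the highest confidence level.
high = confidence == confidence.max()
error_rate_high_conf = 1.0 - correct[high].mean()
print(f"Error rate for high-confidence responses: {error_rate_high_conf:.0%}")
```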

  18. Visual perception system and method for a humanoid robot

    NASA Technical Reports Server (NTRS)

    Wells, James W. (Inventor); Mc Kay, Neil David (Inventor); Chelian, Suhas E. (Inventor); Linn, Douglas Martin (Inventor); Wampler, II, Charles W. (Inventor); Bridgwater, Lyndon (Inventor)

    2012-01-01

    A robotic system includes a humanoid robot with robotic joints each moveable using an actuator(s), and a distributed controller for controlling the movement of each of the robotic joints. The controller includes a visual perception module (VPM) for visually identifying and tracking an object in the field of view of the robot under threshold lighting conditions. The VPM includes optical devices for collecting an image of the object, a positional extraction device, and a host machine having an algorithm for processing the image and positional information. The algorithm visually identifies and tracks the object, and automatically adapts an exposure time of the optical devices to prevent feature data loss of the image under the threshold lighting conditions. A method of identifying and tracking the object includes collecting the image, extracting positional information of the object, and automatically adapting the exposure time to thereby prevent feature data loss of the image.
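
    As one way to picture the exposure-adaptation step in the abstract, here is a hedged sketch of a closed-loop exposure controller; the camera model, target intensity, and update rule are invented for the example and are not taken from the patent.

```python
import numpy as np

def adapt_exposure(capture, exposure_ms, target_mean=0.45, tol=0.05,
                   gain=0.8, max_iters=25):
    """Adjust exposure time until the captured image's mean intensity is near a
    target, a stand-in for adapting exposure to prevent feature data loss."""
    image = capture(exposure_ms)
    for _ in range(max_iters):
        mean = image.mean()
        if abs(mean - target_mean) < tol:
            break
        # Multiplicative update: brighten if under-exposed, darken if over-exposed.
        exposure_ms *= 1.0 + gain * (target_mean - mean)
        image = capture(exposure_ms)
    return exposure_ms, image

# Hypothetical camera: pixel intensity scales with exposure time and saturates.
rng = np.random.default_rng(2)
scene = rng.uniform(0.2, 1.0, size=(64, 64))            # fixed scene radiance
fake_capture = lambda exp_ms: np.clip(scene * exp_ms / 30.0, 0.0, 1.0)

exposure, frame = adapt_exposure(fake_capture, exposure_ms=5.0)
print(f"Adapted exposure: {exposure:.1f} ms, mean intensity {frame.mean():.2f}")
```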

  19. Perception of environmental tobacco smoke odors: An olfactory and visual response

    NASA Astrophysics Data System (ADS)

    Moschandreas, D. J.; Relwani, S. M.

    Odor perception of approximately 200 subjects was measured to determine whether visual contact with an odor source affects sensory responses and to estimate the magnitude of such an effect. Environmental tobacco smoke (ETS) odors were generated in a chamber either by a smoke machine or by an investigator who smoked. Several levels of odor intensity were generated. Odor intensity, odor hedonics and odor characters were the parameters measured before and after visual contact with the odor source. Visual contact increased the perceived odor intensity, the hedonic nature of the odor changed directionally toward unpleasant and the number of subjects perceiving tobacco odor increased. The change caused by visual contact led to differences that were statistically significant.

  20. Exploration of complex visual feature spaces for object perception

    PubMed Central

    Leeds, Daniel D.; Pyles, John A.; Tarr, Michael J.

    2014-01-01

    The mid- and high-level visual properties supporting object perception in the ventral visual pathway are poorly understood. In the absence of well-specified theory, many groups have adopted a data-driven approach in which they progressively interrogate neural units to establish each unit's selectivity. Such methods are challenging in that they require search through a wide space of feature models and stimuli using a limited number of samples. To more rapidly identify higher-level features underlying human cortical object perception, we implemented a novel functional magnetic resonance imaging method in which visual stimuli are selected in real-time based on BOLD responses to recently shown stimuli. This work was inspired by earlier primate physiology work, in which neural selectivity for mid-level features in IT was characterized using a simple parametric approach (Hung et al., 2012). To extend such work to human neuroimaging, we used natural and synthetic object stimuli embedded in feature spaces constructed on the basis of the complex visual properties of the objects themselves. During fMRI scanning, we employed a real-time search method to control continuous stimulus selection within each image space. This search was designed to maximize neural responses across a pre-determined 1 cm3 brain region within ventral cortex. To assess the value of this method for understanding object encoding, we examined both the behavior of the method itself and the complex visual properties the method identified as reliably activating selected brain regions. We observed: (1) Regions selective for both holistic and component object features and for a variety of surface properties; (2) Object stimulus pairs near one another in feature space that produce responses at the opposite extremes of the measured activity range. Together, these results suggest that real-time fMRI methods may yield more widely informative measures of selectivity within the broad classes of visual features
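
    The closed-loop stimulus-selection idea can be sketched as a simple explore/exploit search over a stimulus set, with an invented response function standing in for the single-trial BOLD estimate; none of the names or parameters below come from the study.

```python
import numpy as np

def realtime_search(respond, stimuli, n_trials=30, p_explore=0.2,
                    rng=np.random.default_rng(3)):
    """Greedy closed-loop selection: after each trial, the next stimulus is drawn
    near the best responder so far, with occasional random exploration."""
    best_x, best_r = None, -np.inf
    for _ in range(n_trials):
        if best_x is None or rng.random() < p_explore:
            x = stimuli[rng.integers(len(stimuli))]            # explore
        else:
            dists = np.linalg.norm(stimuli - best_x, axis=1)   # exploit: nearby points
            x = stimuli[rng.choice(np.argsort(dists)[:5])]
        r = respond(x)
        if r > best_r:
            best_x, best_r = x, r
    return best_x, best_r

# Hypothetical 2-D feature space with one "preferred" region for the brain region.
stimuli = np.random.default_rng(4).uniform(-1, 1, size=(200, 2))
respond = lambda x: np.exp(-np.sum((x - np.array([0.4, -0.3])) ** 2) / 0.1)
best_stimulus, best_response = realtime_search(respond, stimuli)
print("best stimulus:", best_stimulus.round(2), "response:", round(best_response, 2))
```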

  1. Visual attention to and perception of undamaged and damaged versions of natural and colored female hair.

    PubMed

    Fink, Bernhard; Neuser, Frauke; Deloux, Gwenelle; Röder, Susanne; Matts, Paul J

    2013-03-01

    Female hair color is thought to influence physical attractiveness, and although there is some evidence for this assertion, research has not yet addressed whether and how physical damage affects the perception of female hair color. Here we investigate whether people are sensitive (in terms of visual attention and age, health and attractiveness perception) to subtle differences in hair images of natural and colored hair before and after physical damage. We tracked the eye-gaze of 50 men and 50 women aged 31-50 years whilst they viewed randomized pairs of images of 20 natural and 20 colored hair tresses, each pair displaying the same tress before and after controlled cuticle damage. The hair images were then rated for perceived health, attractiveness, and age. Undamaged versions of natural and colored hair were perceived as significantly younger, healthier, and more attractive than corresponding damaged versions. Visual attention to images of undamaged colored hair was significantly higher compared with their damaged counterparts, while in natural hair, the opposite pattern was found. We argue that the divergence in visual attention to undamaged colored female hair and damaged natural female hair and associated ratings is due to differences in social perception, and discuss the source of the apparent visual difference between undamaged and damaged hair.

  2. Auditory enhancement of visual perception at threshold depends on visual abilities.

    PubMed

    Caclin, Anne; Bouchet, Patrick; Djoulah, Farida; Pirat, Elodie; Pernier, Jacques; Giard, Marie-Hélène

    2011-06-17

    Whether or not multisensory interactions can improve detection thresholds, and thus widen the range of perceptible events is a long-standing debate. Here we revisit this question, by testing the influence of auditory stimuli on visual detection threshold, in subjects exhibiting a wide range of visual-only performance. Above the perceptual threshold, crossmodal interactions have indeed been reported to depend on the subject's performance when the modalities are presented in isolation. We thus tested normal-seeing subjects and short-sighted subjects wearing their usual glasses. We used a paradigm limiting potential shortcomings of previous studies: we chose a criterion-free threshold measurement procedure and precluded exogenous cueing effects by systematically presenting a visual cue whenever a visual target (a faint Gabor patch) might occur. Using this carefully controlled procedure, we found that concurrent sounds only improved visual detection thresholds in the sub-group of subjects exhibiting the poorest performance in the visual-only conditions. In these subjects, for oblique orientations of the visual stimuli (but not for vertical or horizontal targets), the auditory improvement was still present when visual detection was already helped with flanking visual stimuli generating a collinear facilitation effect. These findings highlight that crossmodal interactions are most efficient to improve perceptual performance when an isolated modality is deficient.

  3. Motion coherence affects human perception and pursuit similarly

    NASA Technical Reports Server (NTRS)

    Beutter, B. R.; Stone, L. S.

    2000-01-01

    Pursuit and perception both require accurate information about the motion of objects. Recovering the motion of objects by integrating the motion of their components is a difficult visual task. Successful integration produces coherent global object motion, while a failure to integrate leaves the incoherent local motions of the components unlinked. We compared the ability of perception and pursuit to perform motion integration by measuring direction judgments and the concomitant eye-movement responses to line-figure parallelograms moving behind stationary rectangular apertures. The apertures were constructed such that only the line segments corresponding to the parallelogram's sides were visible; thus, recovering global motion required the integration of the local segment motion. We investigated several potential motion-integration rules by using stimuli with different object, vector-average, and line-segment terminator-motion directions. We used an oculometric decision rule to directly compare direction discrimination for pursuit and perception. For visible apertures, the percept was a coherent object, and both the pursuit and perceptual performance were close to the object-motion prediction. For invisible apertures, the percept was incoherently moving segments, and both the pursuit and perceptual performance were close to the terminator-motion prediction. Furthermore, both psychometric and oculometric direction thresholds were much higher for invisible apertures than for visible apertures. We constructed a model in which both perception and pursuit are driven by a shared motion-processing stage, with perception having an additional input from an independent static-processing stage. Model simulations were consistent with our perceptual and oculomotor data. Based on these results, we propose the use of pursuit as an objective and continuous measure of perceptual coherence. Our results support the view that pursuit and perception share a common motion
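
    An oculometric decision rule of the kind described can be sketched by treating the sign of the pursuit direction, measured relative to the reference direction, as a binary judgment and tabulating it like a psychometric function; the noise level, direction offsets, and threshold criterion below are illustrative assumptions only.

```python
import numpy as np

# Hypothetical trials: direction offsets (deg from the reference direction) and
# noisy pursuit directions measured on each trial.
rng = np.random.default_rng(5)
offsets = np.repeat(np.array([-8.0, -4.0, -2.0, 2.0, 4.0, 8.0]), 40)
pursuit_dir = offsets + rng.normal(scale=4.0, size=offsets.size)

# Oculometric decision rule: the sign of the pursuit direction is the "choice".
choice_cw = pursuit_dir > 0
levels = np.unique(offsets)
p_cw = np.array([choice_cw[offsets == x].mean() for x in levels])

# Crude threshold: offset at which the (assumed monotone) function crosses 75%.
threshold = np.interp(0.75, p_cw, levels)
print(dict(zip(levels.tolist(), p_cw.round(2).tolist())), f"threshold ~ {threshold:.1f} deg")
```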

  4. Coordinates of Human Visual and Inertial Heading Perception

    PubMed Central

    Crane, Benjamin Thomas

    2015-01-01

    Heading estimation involves both inertial and visual cues. Inertial motion is sensed by the labyrinth, somatic sensation by the body, and optic flow by the retina. Because the eye and head are mobile, these stimuli are sensed relative to different reference frames, and it remains unclear whether perception occurs in a common reference frame. Recent neurophysiologic evidence has suggested the reference frames remain separate even at higher levels of processing but has not addressed the resulting perception. Seven human subjects experienced a 2s, 16 cm/s translation and/or a visual stimulus corresponding with this translation. For each condition 72 stimuli (360° in 5° increments) were delivered in random order. After each stimulus the subject identified the perceived heading using a mechanical dial. Some trial blocks included interleaved conditions in which the influence of ±28° of gaze and/or head position was examined. The observations were fit using a two degree-of-freedom population vector decoder (PVD) model that considered the relative sensitivity to lateral motion and the coordinate system offset. For visual stimuli, gaze shifts caused shifts in perceived heading estimates in the direction opposite the gaze shift in all subjects. These perceptual shifts averaged 13 ± 2° for eye-only gaze shifts and 17 ± 2° for eye-head gaze shifts. This finding indicates visual headings are biased towards retinal coordinates. Similar gaze and head direction shifts prior to inertial headings had no significant influence on heading direction. Thus inertial headings are perceived in body-centered coordinates. Combined visual and inertial stimuli yielded intermediate results. PMID:26267865

  5. Coordinates of Human Visual and Inertial Heading Perception.

    PubMed

    Crane, Benjamin Thomas

    2015-01-01

    Heading estimation involves both inertial and visual cues. Inertial motion is sensed by the labyrinth, somatic sensation by the body, and optic flow by the retina. Because the eye and head are mobile, these stimuli are sensed relative to different reference frames, and it remains unclear whether perception occurs in a common reference frame. Recent neurophysiologic evidence has suggested the reference frames remain separate even at higher levels of processing but has not addressed the resulting perception. Seven human subjects experienced a 2s, 16 cm/s translation and/or a visual stimulus corresponding with this translation. For each condition 72 stimuli (360° in 5° increments) were delivered in random order. After each stimulus the subject identified the perceived heading using a mechanical dial. Some trial blocks included interleaved conditions in which the influence of ±28° of gaze and/or head position was examined. The observations were fit using a two degree-of-freedom population vector decoder (PVD) model that considered the relative sensitivity to lateral motion and the coordinate system offset. For visual stimuli, gaze shifts caused shifts in perceived heading estimates in the direction opposite the gaze shift in all subjects. These perceptual shifts averaged 13 ± 2° for eye-only gaze shifts and 17 ± 2° for eye-head gaze shifts. This finding indicates visual headings are biased towards retinal coordinates. Similar gaze and head direction shifts prior to inertial headings had no significant influence on heading direction. Thus inertial headings are perceived in body-centered coordinates. Combined visual and inertial stimuli yielded intermediate results.
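
    A hedged sketch of a two-parameter heading decoder in the spirit of the model described, with one gain for lateral sensitivity and one coordinate-frame offset; the parameterization and the example values are assumptions for illustration, not the fitted values from the study.

```python
import numpy as np

def decoded_heading(true_heading_deg, lateral_gain=1.3, frame_offset_deg=0.0):
    """Perceived heading from a two-parameter model: the lateral (x) component of
    the heading vector is scaled by a sensitivity gain, and the whole frame can be
    rotated by an offset (e.g. a gaze shift pulling visual headings toward the retina)."""
    theta = np.radians(true_heading_deg + frame_offset_deg)
    x, y = np.sin(theta), np.cos(theta)            # lateral and fore-aft components
    perceived = np.degrees(np.arctan2(lateral_gain * x, y))
    return (perceived + 180.0) % 360.0 - 180.0     # wrap to (-180, 180]

for h in (0, 30, 60, 90, 135):
    print(f"{h:3d} deg -> {decoded_heading(h, lateral_gain=1.3, frame_offset_deg=-13):6.1f} deg")
```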

  6. Human visual and vestibular heading perception in the vertical planes.

    PubMed

    Crane, Benjamin T

    2014-02-01

    Heading estimation has not previously been reported in the vertical planes. This is a potentially interesting issue because, although the distribution of neuronal direction sensitivities is near uniform for vertical headings, there is an overrepresentation of otolith organs sensitive to motion in the horizontal relative to the vertical plane. Furthermore, thresholds of horizontal motion perception are considerably lower than those of vertical motion, which has the potential to bias heading perception. The current study measured heading estimation in 14 human subjects (ages 19 to 67) in response to vestibular motion of 14 cm (28 cm/s) over 360° of headings at 5° intervals. An analogous visual motion was tested in separate trials. In this study, earth and head vertical/horizontal were always aligned. Results demonstrated that the horizontal component of heading was overestimated relative to the vertical component for vestibular heading stimuli in the coronal (skew) and sagittal (elevation) planes. For visual headings, the bias was much smaller and in the opposite direction, such that the vertical component of heading was overestimated. Subjects older than 50 had significantly worse precision and larger biases than younger subjects for the vestibular conditions, although visual heading estimates were similar. A vector addition model fit to the data explains the observed heading biases by the known distribution of otolith organs in humans. The greatly decreased precision with age is explained in the model by decreases in end-organ numbers and a relatively greater loss of otoliths sensitive to vertical motion.

  7. Stars advantages vs parallel coordinates: shape perception as visualization reserve

    NASA Astrophysics Data System (ADS)

    Grishin, Vladimir; Kovalerchuk, Boris

    2013-12-01

    Although shape perception is the main information channel for the brain, it has been poorly exploited by recent visualization techniques, and the difficulty of modeling it is a key obstacle for visualization theory and application. Known experimental estimates of shape perception capabilities have been made for low data dimensions and were usually not connected with data structures. A more applied approach, detecting certain data structures by means of shape displays, is considered here through an analytical and experimental comparison of the now-popular parallel coordinates (PCs), i.e., 2D Cartesian displays of data vectors, with polar displays known as stars. Advantages of stars over PCs with respect to the Gestalt laws are shown. Psychological experiments demonstrate roughly twice-as-fast feature selection and classification with stars than with PCs for the detection of hyper-tube structures in data spaces with dimensions up to 100-200 and their subspaces. This demonstrates substantial reserves for visualization enhancement in comparison with many recent techniques, which usually focus on the analysis of only a few data attributes.

  8. Human alteration of the rural landscape: Variations in visual perception

    SciTech Connect

    Cloquell-Ballester, Vicente-Agustin Carmen Torres-Sibille, Ana del; Cloquell-Ballester, Victor-Andres; Santamarina-Siurana, Maria Cristina

    2012-01-15

    The objective of this investigation is to evaluate how visual perception varies as the rural landscape is altered by human interventions of varying character. An experiment is carried out using Semantic Differential Analysis to analyse the effect of the character and the type of the intervention on perception. Interventions are divided into 'elements of permanent industrial character', 'elements of permanent rural character' and 'elements of temporary character', and these categories are sub-divided into smaller groups according to the type of development. To increase the reliability of the results, the Intraclass Correlation Coefficient tool is applied to validate the semantic space of the perceptual responses and to determine the number of subjects required for a reliable evaluation of the scenes.
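
    The reliability check described, an intraclass correlation over evaluators' semantic-differential scores, can be computed directly from a scenes-by-raters matrix; the sketch below implements the standard two-way ICC(2,1) formula on made-up ratings and is not tied to the study's data or to the exact ICC variant the authors used.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    `ratings` is an (n_scenes, k_raters) array of semantic-differential scores."""
    ratings = np.asarray(ratings, float)
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * np.sum((ratings.mean(axis=1) - grand) ** 2)
    ss_cols = n * np.sum((ratings.mean(axis=0) - grand) ** 2)
    ss_err = np.sum((ratings - grand) ** 2) - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Hypothetical ratings: 6 landscape scenes scored by 4 evaluators on one scale.
scores = [[5, 4, 5, 5], [2, 2, 3, 2], [4, 4, 4, 5],
          [1, 2, 1, 1], [3, 3, 4, 3], [5, 5, 5, 4]]
print(f"ICC(2,1) = {icc_2_1(scores):.2f}")
```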

  9. Factors affecting the perceptions of Iranian agricultural researchers towards nanotechnology.

    PubMed

    Hosseini, Seyed Mahmood; Rezaei, Rohollah

    2011-07-01

    This descriptive survey research was undertaken to help design appropriate programs for creating a positive perception of nanotechnology among the intended beneficiaries. To that end, the factors affecting positive perceptions were identified. A stratified random sample of 278 science board members was selected out of 984 researchers working in 22 National Agricultural Research Institutions (NARIs). Data were collected using a mailed questionnaire. The descriptive results revealed that more than half of the respondents had "low" or "very low" familiarity with nanotechnology. Regression analysis indicated that the perceptions of Iranian NARI science board members towards nanotechnology were explained by three variables: the level of their familiarity with emerging applications of nanotechnology in agriculture, the level of their familiarity with nanotechnology, and their work experience. The findings of this study can contribute to a better understanding of the present state of nanotechnology development and the planning of appropriate programs for creating a positive perception of nanotechnology.

  10. Affective state influences perception by affecting decision parameters underlying bias and sensitivity.

    PubMed

    Lynn, Spencer K; Zhang, Xuan; Barrett, Lisa Feldman

    2012-08-01

    Studies of the effect of affect on perception often show consistent directional effects of a person's affective state on perception. Unpleasant emotions have been associated with a "locally focused" style of stimulus evaluation, and positive emotions with a "globally focused" style. Typically, however, studies of affect and perception have not been conducted under the conditions of perceptual uncertainty and behavioral risk inherent to perceptual judgments outside the laboratory. We investigated the influence of perceivers' experienced affect (valence and arousal) on the utility of social threat perception by combining signal detection theory and behavioral economics. We compared 3 perceptual decision environments that systematically differed with respect to factors that underlie uncertainty and risk: the base rate of threat, the costs of incorrect identification of threat, and the perceptual similarity of threats and nonthreats. We found that no single affective state yielded the best performance on the threat perception task across the 3 environments. Unpleasant valence promoted calibration of response bias to base rate and costs, high arousal promoted calibration of perceptual sensitivity to perceptual similarity, and low arousal was associated with an optimal adjustment of bias to sensitivity. However, the strength of these associations was conditional upon the difficulty of attaining optimal bias and high sensitivity, such that the effect of the perceiver's affective state on perception differed with the cause and/or level of uncertainty and risk.
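
    The decision parameters at issue, sensitivity and bias, and the idea of an optimal criterion given base rate and payoffs, follow from standard equal-variance signal detection theory; the sketch below uses that textbook formulation with invented counts and payoff values, not the authors' exact model.

```python
import numpy as np
from scipy.stats import norm

def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
    """Equal-variance Gaussian SDT: sensitivity d' and criterion c from raw counts."""
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    d = norm.ppf(hit_rate) - norm.ppf(fa_rate)
    c = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))
    return d, c

def optimal_criterion_shift(d, p_signal, value_hit=1.0, cost_miss=1.0,
                            value_cr=1.0, cost_fa=1.0):
    """Shift of the decision cut-point away from the neutral point (d'/2) that
    maximizes expected utility for a given base rate and payoff matrix."""
    beta = ((1 - p_signal) / p_signal) * ((value_cr + cost_fa) / (value_hit + cost_miss))
    return np.log(beta) / d

d, c = dprime_and_criterion(hits=38, misses=12, false_alarms=9, correct_rejections=41)
shift = optimal_criterion_shift(d, p_signal=0.25, cost_fa=3.0)
print(f"d' = {d:.2f}, observed criterion c = {c:.2f}, optimal shift = {shift:.2f}")
```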

  11. [Visual perception of dentition esthetic parameters (part 1)].

    PubMed

    Ryakhovsky, A N; Kalacheva, Ya A

    2015-01-01

    The article presents a study on the impact on visual perception of violations of aesthetic parameters such as the inclination of the incisal line, the dislocation of the median interincisal line, and the width of the dental arch. Objective assessment data were compared with the subjective assessments of the respondents. It is shown that all dependencies are linear and can be described by linear regression equations. A similar method can be used as an objective quantitative approach to assessing the aesthetics of teeth in a smile before and after dental treatment.

  12. Visual perception and grasping for the extravehicular activity robot

    NASA Technical Reports Server (NTRS)

    Starks, Scott A.

    1989-01-01

    The development of an approach to the visual perception of object surface information using laser range data in support of robotic grasping is discussed. This is a very important problem area in that a robot such as the EVAR must be able to formulate a grasping strategy on the basis of its knowledge of the surface structure of the object. A description of the problem domain is given, as well as a formulation of an algorithm which derives an object surface description adequate to support robotic grasping. The algorithm is based upon concepts of differential geometry, namely Gaussian and mean curvature.

  13. Does bilingual experience affect early visual perceptual development?

    PubMed Central

    Schonberg, Christina; Sandhofer, Catherine M.; Tsang, Tawny; Johnson, Scott P.

    2014-01-01

    Visual attention and perception develop rapidly during the first few months after birth, and these behaviors are critical components in the development of language and cognitive abilities. Here we ask how early bilingual experiences might lead to differences in visual attention and perception. Experiments 1–3 investigated the looking behavior of monolingual and bilingual infants when presented with social (Experiment 1), mixed (Experiment 2), or non-social (Experiment 3) stimuli. In each of these experiments, infants' dwell times (DT) and number of fixations to areas of interest (AOIs) were analyzed, giving a sense of where the infants looked. To examine how the infants looked at the stimuli in a more global sense, Experiment 4 combined and analyzed the saccade data collected in Experiments 1–3. There were no significant differences between monolingual and bilingual infants' DTs, AOI fixations, or saccade characteristics (specifically, frequency, and amplitude) in any of the experiments. These results suggest that monolingual and bilingual infants process their visual environments similarly, supporting the idea that the substantial cognitive differences between monolinguals and bilinguals in early childhood are more related to active vocabulary production than perception of the environment. PMID:25566116

  14. Relationships between Fine-Motor, Visual-Motor, and Visual Perception Scores and Handwriting Legibility and Speed

    ERIC Educational Resources Information Center

    Klein, Sheryl; Guiltner, Val; Sollereder, Patti; Cui, Ying

    2011-01-01

    Occupational therapists assess fine motor, visual motor, visual perception, and visual skill development, but knowledge of the relationships between scores on sensorimotor performance measures and handwriting legibility and speed is limited. Ninety-nine students in grades three to six with learning and/or behavior problems completed the Upper-Limb…

  15. Mental Fatigue Affects Visual Selective Attention

    PubMed Central

    Faber, Léon G.; Maurits, Natasha M.; Lorist, Monicque M.

    2012-01-01

    Mental fatigue is a form of fatigue, induced by continuous task performance. Mentally fatigued people often report having a hard time keeping their attention focussed and being easily distracted. In this study, we examined the relation between mental fatigue, as induced by time on task, and attention-related changes in event-related potentials (ERPs). EEG, reaction times and response accuracies were obtained from 17 healthy volunteers during two hours of task performance on an adapted Eriksen flanker task. In this task, the size of targets and flankers was manipulated to discern neuronal processes that are related to processing of relevant information from processes related to the processing of irrelevant information. The ERP data showed that effects induced by target size manipulation were not affected by time on task, while an initial effect of flanker size manipulation decreased gradually with increasing time on task. We conclude that attention was affected by mental fatigue, in the form of a decrease in the ability to suppress irrelevant information. In behavioural results, this was reflected by a tendency of participants to increasingly base their response decision on irrelevant information, resulting in decreased response accuracies. PMID:23118927

  16. Mental fatigue affects visual selective attention.

    PubMed

    Faber, Léon G; Maurits, Natasha M; Lorist, Monicque M

    2012-01-01

    Mental fatigue is a form of fatigue, induced by continuous task performance. Mentally fatigued people often report having a hard time keeping their attention focussed and being easily distracted. In this study, we examined the relation between mental fatigue, as induced by time on task, and attention-related changes in event-related potentials (ERPs). EEG, reaction times and response accuracies were obtained from 17 healthy volunteers during two hours of task performance on an adapted Eriksen flanker task. In this task, the size of targets and flankers was manipulated to discern neuronal processes that are related to processing of relevant information from processes related to the processing of irrelevant information. The ERP data showed that effects induced by target size manipulation were not affected by time on task, while an initial effect of flanker size manipulation decreased gradually with increasing time on task. We conclude that attention was affected by mental fatigue, in the form of a decrease in the ability to suppress irrelevant information. In behavioural results, this was reflected by a tendency of participants to increasingly base their response decision on irrelevant information, resulting in decreased response accuracies.

  17. Principals' Perception regarding Factors Affecting the Performance of Teachers

    ERIC Educational Resources Information Center

    Akram, Muhammad Javaid; Raza, Syed Ahmad; Khaleeq, Abdur Rehman; Atika, Samrana

    2011-01-01

    This study investigated the perception of principals on how the factors of subject mastery, teaching methodology, personal characteristics, and attitude toward students affect the performance of teachers at higher secondary level in the Punjab. All principals of higher secondary level in the Punjab were part of the population of the study. From…

  18. Categorical Perception of Affective and Linguistic Facial Expressions

    ERIC Educational Resources Information Center

    McCullough, Stephen; Emmorey, Karen

    2009-01-01

    Two experiments investigated categorical perception (CP) effects for affective facial expressions and linguistic facial expressions from American Sign Language (ASL) for Deaf native signers and hearing non-signers. Facial expressions were presented in isolation (Experiment 1) or in an ASL verb context (Experiment 2). Participants performed ABX…

  19. Vegetarianism and food perception. Selective visual attention to meat pictures.

    PubMed

    Stockburger, Jessica; Renner, Britta; Weike, Almut I; Hamm, Alfons O; Schupp, Harald T

    2009-04-01

    Vegetarianism provides a model system to examine the impact of negative affect towards meat, based on ideational reasoning. It was hypothesized that meat stimuli are efficient attention catchers in vegetarians. Event-related brain potential recordings served to index selective attention processes at the level of initial stimulus perception. Consistent with the hypothesis, late positive potentials to meat pictures were enlarged in vegetarians compared to omnivores. This effect was specific for meat pictures and obtained during passive viewing and an explicit attention task condition. These findings demonstrate the attention capture of food stimuli, deriving affective salience from ideational reasoning and symbolic meaning.

  20. How (and why) the visual control of action differs from visual perception

    PubMed Central

    Goodale, Melvyn A.

    2014-01-01

    Vision not only provides us with detailed knowledge of the world beyond our bodies, but it also guides our actions with respect to objects and events in that world. The computations required for vision-for-perception are quite different from those required for vision-for-action. The former uses relational metrics and scene-based frames of reference while the latter uses absolute metrics and effector-based frames of reference. These competing demands on vision have shaped the organization of the visual pathways in the primate brain, particularly within the visual areas of the cerebral cortex. The ventral ‘perceptual’ stream, projecting from early visual areas to inferior temporal cortex, helps to construct the rich and detailed visual representations of the world that allow us to identify objects and events, attach meaning and significance to them and establish their causal relations. By contrast, the dorsal ‘action’ stream, projecting from early visual areas to the posterior parietal cortex, plays a critical role in the real-time control of action, transforming information about the location and disposition of goal objects into the coordinate frames of the effectors being used to perform the action. The idea of two visual systems in a single brain might seem initially counterintuitive. Our visual experience of the world is so compelling that it is hard to believe that some other quite independent visual signal—one that we are unaware of—is guiding our movements. But evidence from a broad range of studies from neuropsychology to neuroimaging has shown that the visual signals that give us our experience of objects and events in the world are not the same ones that control our actions. PMID:24789899

  1. The contribution of dynamic visual cues to audiovisual speech perception.

    PubMed

    Jaekl, Philip; Pesquita, Ana; Alsius, Agnes; Munhall, Kevin; Soto-Faraco, Salvador

    2015-08-01

    Seeing a speaker's facial gestures can significantly improve speech comprehension, especially in noisy environments. However, the nature of the visual information from the speaker's facial movements that is relevant for this enhancement is still unclear. Like auditory speech signals, visual speech signals unfold over time and contain both dynamic configural information and luminance-defined local motion cues: two information sources that are thought to engage anatomically and functionally separate visual systems. Whereas some past studies have highlighted the importance of local, luminance-defined motion cues in audiovisual speech perception, the contribution of dynamic configural information signalling changes in form over time has not yet been assessed. We therefore attempted to single out the contribution of dynamic configural information to audiovisual speech processing. To this aim, we measured word identification performance in noise using unimodal auditory stimuli, and with audiovisual stimuli. In the audiovisual condition, speaking faces were presented as point light displays achieved via motion capture of the original talker. Point light displays could be isoluminant, to minimise the contribution of effective luminance-defined local motion information, or with added luminance contrast, allowing the combined effect of dynamic configural cues and local motion cues. Audiovisual enhancement was found in both the isoluminant and contrast-based luminance conditions compared to an auditory-only condition, demonstrating, for the first time, the specific contribution of dynamic configural cues to audiovisual speech improvement. These findings imply that globally processed changes in a speaker's facial shape contribute significantly towards the perception of articulatory gestures and the analysis of audiovisual speech.

  2. Spatial Frequency Requirements and Gaze Strategy in Visual-Only and Audiovisual Speech Perception

    ERIC Educational Resources Information Center

    Wilson, Amanda H.; Alsius, Agnès; Parè, Martin; Munhall, Kevin G.

    2016-01-01

    Purpose: The aim of this article is to examine the effects of visual image degradation on performance and gaze behavior in audiovisual and visual-only speech perception tasks. Method: We presented vowel-consonant-vowel utterances visually filtered at a range of frequencies in visual-only, audiovisual congruent, and audiovisual incongruent…

  3. Olfactory-visual integration facilitates perception of subthreshold negative emotion

    PubMed Central

    Novak, Lucas R.; Gitelman, Darren R.; Schulyer, Brianna; Li, Wen

    2015-01-01

    A fast growing literature of multisensory emotion integration notwithstanding, the chemical senses, intimately associated with emotion, have been largely overlooked. Moreover, an ecologically highly relevant principle of “inverse effectiveness”, rendering maximal integration efficacy with impoverished sensory input, remains to be assessed in emotion integration. Presenting minute, subthreshold negative (vs. neutral) cues in faces and odors, we demonstrated olfactory-visual emotion integration in improved emotion detection (especially among individuals with weaker perception of unimodal negative cues) and response enhancement in the amygdala. Moreover, while perceptual gain for visual negative emotion involved the posterior superior temporal sulcus/pSTS, perceptual gain for olfactory negative emotion engaged both the associative olfactory (orbitofrontal) cortex and amygdala. Dynamic causal modeling (DCM) analysis of fMRI time series further revealed connectivity strengthening among these areas during cross modal emotion integration. That multisensory (but not low-level unisensory) areas exhibited both enhanced response and region-to-region coupling favors a top-down (vs. bottom-up) account for olfactory-visual emotion integration. Current findings thus confirm the involvement of multisensory convergence areas, while highlighting unique characteristics of olfaction-related integration. Furthermore, successful crossmodal binding of subthreshold aversive cues not only supports the principle of “inverse effectiveness” in emotion integration but also accentuates the automatic, unconscious quality of crossmodal emotion synthesis. PMID:26359718

  4. Action induction by visual perception of rotational motion.

    PubMed

    Classen, Claudia; Kibele, Armin

    2016-09-01

    A basic process in the planning of everyday actions involves the integration of visually perceived movement characteristics. Such processes of information integration often occur automatically. The aim of the present study was to examine whether the visual perception of spatial characteristics of a rotational motion (rotation direction) can induce a spatially compatible action. Four reaction time experiments were conducted to analyze the effect of perceiving task irrelevant rotational motions of simple geometric figures as well as of gymnasts on a horizontal bar while responding to color changes in these objects. The results show that the participants react faster when the directional information of a rotational motion is compatible with the spatial characteristics of an intended action. The degree of complexity of the perceived event does not play a role in this effect. The spatial features of the used biological motion were salient enough to elicit a motion based Simon effect. However, in the cognitive processing of the visual stimulus, the critical criterion is not the direction of rotation, but rather the relative direction of motion (direction of motion above or below the center of rotation). Nevertheless, this conclusion is tainted with reservations since it is only fully supported by the response behavior of female participants.

  5. Visual model of human blur perception for scene adaptive capturing

    NASA Astrophysics Data System (ADS)

    Kim, Sung-Su; Chung, DaeSu; Park, Byung-Kwan; Kim, Jung-Bae; Lee, Seong-Deok

    2009-01-01

    Despite the rapid spread of digital cameras, many people cannot take the high-quality pictures they want, due to a lack of photographic skill. To help users in unfavorable capturing environments, e.g. 'Night', 'Backlighting', 'Indoor', or 'Portrait', the automatic mode of cameras provides parameter sets defined by manufacturers. Unfortunately, this automatic functionality does not give pleasing image quality in general. In particular, length of exposure (shutter speed) is a critical factor in taking high-quality pictures at night. One of the key factors causing this poor quality at night is image blur, which mainly comes from hand shake during long exposures. In this study, to circumvent this problem and to enhance the image quality of automatic cameras, we propose an intelligent camera processing core comprising SABE (Scene Adaptive Blur Estimation) and VisBLE (Visual Blur Limitation Estimation). SABE analyzes the high-frequency component in the DCT (Discrete Cosine Transform) domain. VisBLE determines the acceptable blur level on the basis of human visual tolerance and a Gaussian model. This visual tolerance model is developed on the basis of the physiological mechanisms of human perception. In experiments, the proposed method outperformed existing imaging systems as judged by general users and photographers alike.
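
    As a rough sketch of a DCT-domain sharpness measure of the general kind SABE is described as using, the function below reports the share of block-DCT energy that falls outside the low-frequency corner of each block; the block size, cutoff, and test images are arbitrary choices for the example, not the paper's algorithm.

```python
import numpy as np
from scipy.fft import dctn

def high_frequency_ratio(gray, block=16, cutoff=8):
    """Mean share of per-block DCT energy outside the low-frequency corner;
    sharp images keep more energy at high frequencies than blurred ones."""
    h, w = (gray.shape[0] // block) * block, (gray.shape[1] // block) * block
    tiles = gray[:h, :w].reshape(h // block, block, w // block, block).swapaxes(1, 2)
    ratios = []
    for tile in tiles.reshape(-1, block, block):
        coeffs = np.abs(dctn(tile, norm='ortho'))
        total = coeffs.sum() - coeffs[0, 0]                 # ignore the DC term
        low = coeffs[:cutoff, :cutoff].sum() - coeffs[0, 0]
        if total > 0:
            ratios.append(1.0 - low / total)
    return float(np.mean(ratios))

# Hypothetical comparison: a sharp random texture vs. the same texture blurred.
rng = np.random.default_rng(6)
sharp = rng.uniform(size=(128, 128))
blurred = sharp.copy()
for _ in range(4):                                          # crude box blur
    blurred = (blurred + np.roll(blurred, 1, 0) + np.roll(blurred, 1, 1)) / 3.0
print(f"sharp: {high_frequency_ratio(sharp):.2f}   blurred: {high_frequency_ratio(blurred):.2f}")
```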

  6. Emotion perception, but not affect perception, is impaired with semantic memory loss.

    PubMed

    Lindquist, Kristen A; Gendron, Maria; Barrett, Lisa Feldman; Dickerson, Bradford C

    2014-04-01

    For decades, psychologists and neuroscientists have hypothesized that the ability to perceive emotions on others' faces is inborn, prelinguistic, and universal. Concept knowledge about emotion has been assumed to be epiphenomenal to emotion perception. In this article, we report findings from 3 patients with semantic dementia that cannot be explained by this "basic emotion" view. These patients, who have substantial deficits in semantic processing abilities, spontaneously perceived pleasant and unpleasant expressions on faces, but not discrete emotions such as anger, disgust, fear, or sadness, even in a task that did not require the use of emotion words. Our findings support the hypothesis that discrete emotion concept knowledge helps transform perceptions of affect (positively or negatively valenced facial expressions) into perceptions of discrete emotions such as anger, disgust, fear, and sadness. These findings have important consequences for understanding the processes supporting emotion perception.

  7. Emotion perception, but not affect perception, is impaired with semantic memory loss

    PubMed Central

    Lindquist, Kristen A.; Gendron, Maria; Feldman Barrett, Lisa; Dickerson, Bradford C.

    2014-01-01

    For decades, psychologists and neuroscientists have hypothesized that the ability to perceive emotions on others’ faces is inborn, pre-linguistic, and universal. Concept knowledge about emotion has been assumed to be epiphenomenal to emotion perception. In this paper, we report findings from three patients with semantic dementia that cannot be explained by this “basic emotion” view. These patients, who have substantial deficits in semantic processing abilities, spontaneously perceived pleasant and unpleasant expressions on faces, but not discrete emotions such as anger, disgust, fear, or sadness, even in a task that did not require the use of emotion words. Our findings support the hypothesis that discrete emotion concept knowledge helps transform perceptions of affect (positively or negatively valenced facial expressions) into perceptions of discrete emotions such as anger, disgust, fear and sadness. These findings have important consequences for understanding the processes supporting emotion perception. PMID:24512242

  8. Behavioral Differences in the Upper and Lower Visual Hemifields in Shape and Motion Perception

    PubMed Central

    Zito, Giuseppe A.; Cazzoli, Dario; Müri, René M.; Mosimann, Urs P.; Nef, Tobias

    2016-01-01

    Perceptual accuracy is known to be influenced by stimuli location within the visual field. In particular, it seems to be enhanced in the lower visual hemifield (VH) for motion and space processing, and in the upper VH for object and face processing. The origins of such asymmetries are attributed to attentional biases across the visual field, and in the functional organization of the visual system. In this article, we tested content-dependent perceptual asymmetries in different regions of the visual field. Twenty-five healthy volunteers participated in this study. They performed three visual tests involving perception of shapes, orientation and motion, in the four quadrants of the visual field. The results of the visual tests showed that perceptual accuracy was better in the lower than in the upper visual field for motion perception, and better in the upper than in the lower visual field for shape perception. Orientation perception did not show any vertical bias. No difference was found when comparing right and left VHs. The functional organization of the visual system seems to indicate that the dorsal and the ventral visual streams, responsible for motion and shape perception, respectively, show a bias for the lower and upper VHs, respectively. Such a bias depends on the content of the visual information. PMID:27378876

  9. Seen, Unseen or Overlooked? How Can Visual Perception Develop through a Multimodal Enquiry?

    ERIC Educational Resources Information Center

    Payne, Rachel

    2012-01-01

    This article outlines an exploration into the development of visual perception through analysing the process of taking photographs of the mundane as small-scale research. A preoccupation with social construction of the visual lies at the heart of the investigation by correlating the perceptive process to Mitchell's (2002) counter thesis for visual…

  10. Comparison of Two Procedures for Teaching Reading to Primary Children with Visual Perception Difficulties.

    ERIC Educational Resources Information Center

    LaPray, Margaret; Ross, Ramon

    Reading abilities of primary children with visual perception problems who were taught by conventional methods were compared to the abilities of children given special training designed to improve faulty or immature visual perception. One control group participated in special activities such as picture coloring and the other control group received…

  11. Visual and Auditory Components in the Perception of Asynchronous Audiovisual Speech.

    PubMed

    García-Pérez, Miguel A; Alcalá-Quintana, Rocío

    2015-12-01

    Research on asynchronous audiovisual speech perception manipulates experimental conditions to observe their effects on synchrony judgments. Probabilistic models establish a link between the sensory and decisional processes underlying such judgments and the observed data, via interpretable parameters that allow testing hypotheses and making inferences about how experimental manipulations affect such processes. Two models of this type have recently been proposed, one based on independent channels and the other using a Bayesian approach. Both models are fitted here to a common data set, with a subsequent analysis of the interpretation they provide about how experimental manipulations affected the processes underlying perceived synchrony. The data consist of synchrony judgments as a function of audiovisual offset in a speech stimulus, under four within-subjects manipulations of the quality of the visual component. The Bayesian model could not accommodate asymmetric data, was rejected by goodness-of-fit statistics for 8/16 observers, and was found to be nonidentifiable, which renders uninterpretable parameter estimates. The independent-channels model captured asymmetric data, was rejected for only 1/16 observers, and identified how sensory and decisional processes mediating asynchronous audiovisual speech perception are affected by manipulations that only alter the quality of the visual component of the speech signal.

  13. Visual and Auditory Components in the Perception of Asynchronous Audiovisual Speech

    PubMed Central

    García-Pérez, Miguel A; Alcalá-Quintana, Rocío

    2015-01-01

    Research on asynchronous audiovisual speech perception manipulates experimental conditions to observe their effects on synchrony judgments. Probabilistic models establish a link between the sensory and decisional processes underlying such judgments and the observed data, via interpretable parameters that allow testing hypotheses and making inferences about how experimental manipulations affect such processes. Two models of this type have recently been proposed, one based on independent channels and the other using a Bayesian approach. Both models are fitted here to a common data set, with a subsequent analysis of the interpretation they provide about how experimental manipulations affected the processes underlying perceived synchrony. The data consist of synchrony judgments as a function of audiovisual offset in a speech stimulus, under four within-subjects manipulations of the quality of the visual component. The Bayesian model could not accommodate asymmetric data, was rejected by goodness-of-fit statistics for 8/16 observers, and was found to be nonidentifiable, which renders uninterpretable parameter estimates. The independent-channels model captured asymmetric data, was rejected for only 1/16 observers, and identified how sensory and decisional processes mediating asynchronous audiovisual speech perception are affected by manipulations that only alter the quality of the visual component of the speech signal. PMID:27551361

  14. The dynamics of visual perception pictures of stroboscope

    NASA Astrophysics Data System (ADS)

    Zhytaryuk, V. G.

    2015-11-01

    This paper investigates the physical principles underlying the visual perception of flickering images of the spokes of a wheel rotating under alternating and direct (constant) reflected light fields. The results make it possible to interpret clearly the stroboscopic effects observed with rotating car wheel spokes, aircraft propellers, and household fans. It is established that these effects can be observed only under artificial illumination from fluorescent, discharge, or pulsed light sources. A "capture" effect was also found, i.e., individual spokes were seen as stationary at frequencies far exceeding the published limit, usually given as 0.1 s (10 Hz); capture was established at frequencies up to and including 50 Hz. This result is not described in the scientific literature and has no theoretical explanation.

  15. Functional modulation of power-law distribution in visual perception

    NASA Astrophysics Data System (ADS)

    Shimono, Masanori; Owaki, Takashi; Amano, Kaoru; Kitajo, Keiichi; Takeda, Tsunehiro

    2007-05-01

    Neuronal activities have recently been reported to exhibit power-law scaling behavior. However, it has not been demonstrated that the power-law component can play an important role in human perceptual functions. Here, we demonstrate that the power spectrum of magnetoencephalograph recordings of brain activity varies in coordination with perception of subthreshold visual stimuli. We observed that perceptual performance could be better explained by modulation of the power-law component than by modulation of the peak power in particular narrow frequency ranges. The results suggest that the brain operates in a state of self-organized criticality, modulating the power spectral exponent of its activity to optimize its internal state for response to external stimuli.
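
    As a minimal illustration of how such a power-law (spectral) exponent can be estimated, the sketch below fits a line to the log-log power spectrum of a synthetic 1/f-like signal; the synthetic data and the 1-100 Hz fitting band are assumptions standing in for the MEG recordings.

    ```python
    import numpy as np
    from scipy.signal import welch

    # Synthetic 1/f-like signal standing in for an MEG channel
    rng = np.random.default_rng(1)
    fs = 1000.0                                   # sampling rate, Hz
    white = rng.standard_normal(60 * int(fs))
    signal = np.cumsum(white)                     # integration gives ~1/f^2 power

    f, pxx = welch(signal, fs=fs, nperseg=4096)
    band = (f >= 1) & (f <= 100)                  # fit over 1-100 Hz

    # Power-law exponent = slope of the log-log spectrum
    slope, intercept = np.polyfit(np.log10(f[band]), np.log10(pxx[band]), 1)
    print(f"estimated spectral exponent: {slope:.2f}")   # expect roughly -2 here
    ```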

  16. Short-term visual deprivation reduces interference effects of task-irrelevant facial expressions on affective prosody judgments.

    PubMed

    Fengler, Ineke; Nava, Elena; Röder, Brigitte

    2015-01-01

    Several studies have suggested that neuroplasticity can be triggered by short-term visual deprivation in healthy adults. Specifically, these studies have provided evidence that visual deprivation reversibly affects basic perceptual abilities. The present study investigated the long-lasting effects of short-term visual deprivation on emotion perception. To this aim, we visually deprived a group of young healthy adults, age-matched with a group of non-deprived controls, for 3 h and tested them before and after visual deprivation (i.e., after 8 h on average and at 4-week follow-up) on an audio-visual (i.e., faces and voices) emotion discrimination task. To observe changes at the level of basic perceptual skills, we additionally employed a simple audio-visual (i.e., tone bursts and light flashes) discrimination task and two unimodal (one auditory and one visual) perceptual threshold measures. During the 3 h period, both groups performed a series of auditory tasks. To exclude the possibility that changes in emotion discrimination emerge merely as a consequence of exposure to auditory stimulation during the 3 h stay in the dark, we visually deprived an additional group of age-matched participants who concurrently performed tasks unrelated to the later-tested abilities (i.e., tactile tasks). The two visually deprived groups showed enhanced affective prosodic discrimination in the context of incongruent facial expressions following the period of visual deprivation; this effect was partially maintained at follow-up. By contrast, no changes were observed in affective facial expression discrimination or in the basic perception tasks in any group. These findings suggest that short-term visual deprivation per se triggers a reweighting of visual and auditory emotional cues, which may persist over longer durations.

  17. Task-irrelevant stimulus salience affects visual search.

    PubMed

    Lamy, Dominique; Zoaris, Loren

    2009-05-01

    The relative contributions of stimulus salience and task-related goals in guiding attention remain an issue of debate. Several studies have demonstrated that top-down factors play an important role, as they often override capture by salient irrelevant objects. However, Yantis and Egeth [Yantis, S., & Egeth, H. E. (1999). On the distinction between visual salience and stimulus-driven attentional capture. Journal of Experimental Psychology: Human Perception and Performance, 25, 661-676.] have made the more radical claim that salience plays no role in visual search unless the observer adopts an attentional set for singletons or "singleton-detection mode". We reexamine their claim while disentangling effects of stimulus salience from effects of attentional set and inter-trial repetition. The results show that stimulus salience guides attention even when salience is task irrelevant.

  18. Identifying the information for the visual perception of relative phase.

    PubMed

    Wilson, Andrew D; Bingham, Geoffrey P

    2008-04-01

    The production and perception of coordinated rhythmic movement are very specifically structured. For production and perception, 0 degree mean relative phase is stable, 180 degrees is less stable, and no other state is stable without training. It has been hypothesized that perceptual stability characteristics underpin the movement stability characteristics, which has led to the development of a phase-driven oscillator model (e.g., Bingham, 2004a, 2004b). In the present study, a novel perturbation method was used to explore the identity of the perceptual information being used in rhythmic movement tasks. In the three conditions, relative position, relative speed, and frequency (variables motivated by the model) were selectively perturbed. Ten participants performed a judgment task to identify 0 degree or 180 degrees under these perturbation conditions, and 8 participants who had been trained to visually discriminate 90 degrees performed the task with perturbed 90 degrees displays. Discrimination of 0 degree and 180 degrees was unperturbed in 7 out of the 10 participants, but discrimination of 90 degrees was completely disrupted by the position perturbation and was made noisy by the frequency perturbation. We concluded that (1) the information used by most observers to perceive relative phase at 0 degree and 180 degrees was relative direction and (2) becoming an expert perceiver of 90 degrees entails learning a new variable composed of position and speed.

  19. Constructing Visual Perception of Body Movement with the Motor Cortex.

    PubMed

    Orgs, Guido; Dovern, Anna; Hagura, Nobuhiro; Haggard, Patrick; Fink, Gereon R; Weiss, Peter H

    2016-01-01

    The human brain readily perceives fluent movement from static input. Using functional magnetic resonance imaging, we investigated brain mechanisms that mediate fluent apparent biological motion (ABM) perception from sequences of body postures. We presented body and nonbody stimuli varying in objective sequence duration and fluency of apparent movement. Three body postures were ordered to produce a fluent (ABC) or a nonfluent (ACB) apparent movement. This enabled us to identify brain areas involved in the perceptual reconstruction of body movement from identical lower-level static input. Participants judged the duration of a rectangle containing body/nonbody sequences, as an implicit measure of movement fluency. For body stimuli, fluent apparent motion sequences produced subjectively longer durations than nonfluent sequences of the same objective duration. This difference was reduced for nonbody stimuli. This body-specific bias in duration perception was associated with increased blood oxygen level-dependent responses in the primary (M1) and supplementary motor areas. Moreover, fluent ABM was associated with increased functional connectivity between M1/SMA and right fusiform body area. We show that perceptual reconstruction of fluent movement from static body postures does not merely enlist areas traditionally associated with visual body processing, but involves cooperative recruitment of motor areas, consistent with a "motor way of seeing". PMID:26534907

  20. Suppressive mechanisms in visual motion processing: from perception to intelligence

    PubMed Central

    Tadin, Duje

    2015-01-01

    Perception operates on an immense amount of incoming information that greatly exceeds the brain's processing capacity. Because of this fundamental limitation, the ability to suppress irrelevant information is a key determinant of perceptual efficiency. Here, I will review a series of studies investigating suppressive mechanisms in visual motion processing, namely perceptual suppression of large, background-like motions. These spatial suppression mechanisms are adaptive, operating only when sensory inputs are sufficiently robust to guarantee visibility. Converging correlational and causal evidence links these behavioral results with inhibitory center-surround mechanisms, namely those in cortical area MT. Spatial suppression is abnormally weak in several special populations, including the elderly and those with schizophrenia—a deficit that is evidenced by better-than-normal direction discriminations of large moving stimuli. Theoretical work shows that this abnormal weakening of spatial suppression should result in motion segregation deficits, but direct behavioral support of this hypothesis is lacking. Finally, I will argue that the ability to suppress information is a fundamental neural process that applies not only to perception but also to cognition in general. Supporting this argument, I will discuss recent research that shows individual differences in spatial suppression of motion signals strongly predict individual variations in IQ scores. PMID:26299386

  1. Unseen Affective Faces Influence Person Perception Judgments in Schizophrenia

    PubMed Central

    Kring, Ann M.; Siegel, Erika H.; Barrett, Lisa Feldman

    2014-01-01

    To demonstrate the influence of unconscious affective processing on consciously processed information among people with and without schizophrenia, we used a continuous flash suppression (CFS) paradigm to examine whether early and rapid processing of affective information influences first impressions of structurally neutral faces. People with and without schizophrenia rated visible neutral faces as more or less trustworthy, warm, and competent when paired with unseen smiling or scowling faces compared to when paired with unseen neutral faces. Yet, people with schizophrenia also exhibited a deficit in explicit affect perception. These findings indicate that early processing of affective information is intact in schizophrenia but the integration of this information with semantic contexts is problematic. Furthermore, people with schizophrenia who were more influenced by smiling faces presented outside awareness reported experiencing more anticipatory pleasure, suggesting that the ability to rapidly process affective information is important for anticipation of future pleasurable events. PMID:25664225

  2. Influential sources affecting Bangkok adolescent body image perceptions.

    PubMed

    Thianthai, Chulanee

    2006-01-01

    The study of body image-related problems in non-Western countries is still very limited. Thus, this study aims to identify the main influential sources and show how they affect the body image perceptions of Bangkok adolescents. The researcher recruited 400 Thai male and female adolescents in Bangkok, from high school to university freshman level and ranging in age from 16 to 19 years, to participate in this study. Survey questionnaires were distributed to every student and follow-up interviews were conducted with 40 students. The findings showed eight main influential sources, ranked from most to least influential: magazines, television, peer group, family, fashion trends, the opposite gender, self-realization, and health knowledge. As in studies conducted in Western countries, more than half of the total influence came from mass media and peer groups. Bangkok adolescents also internalized Western ideals of beauty through these mass media channels. The process by which these sources affect Bangkok adolescents' body image perceptions was similar to that reported in Western studies, with the exception of the familial source. In conclusion, identifying the main influential sources and understanding how they affect adolescent body image perceptions can help prevent adolescents from holding unhealthy views and taking risky measures toward their bodies. More studies conducted in non-Western countries are needed in order to build culturally sensitive programs catering to the body image problems of adolescents within each particular society. PMID:17340854

  3. Talker variability in audio-visual speech perception.

    PubMed

    Heald, Shannon L M; Nusbaum, Howard C

    2014-01-01

    A change in talker is a change in the context for the phonetic interpretation of acoustic patterns of speech. Different talkers have different mappings between acoustic patterns and phonetic categories, and listeners need to adapt to these differences. Despite this complexity, listeners are adept at comprehending speech in multiple-talker contexts, albeit at a slight but measurable performance cost (e.g., slower recognition). So far, this talker variability cost has been demonstrated only in audio-only speech. Other research in single-talker contexts has shown, however, that when listeners are able to see a talker's face, speech recognition is improved under adverse listening (e.g., noise or distortion) conditions that can increase uncertainty in the mapping between acoustic patterns and phonetic categories. Does seeing a talker's face reduce the cost of word recognition in multiple-talker contexts? We used a speeded word-monitoring task in which listeners made quick judgments about target word recognition in single- and multiple-talker contexts. Results show faster recognition performance in single-talker conditions than in multiple-talker conditions for both audio-only and audio-visual speech. However, recognition time in a multiple-talker context was slower in the audio-visual condition than in the audio-only condition. These results suggest that seeing a talker's face during speech perception may slow recognition by increasing the importance of talker identification, signaling to the listener that a change in talker has occurred. PMID:25076919

  4. Visual perception in acoustically deprived and normally hearing children.

    PubMed

    Thannhauser, Joanna; Buldańczyk, Agnieszka; Salomon, Ewa; Jankowska, Elżbieta; Borodulin-Nadzieja, Ludmiła; Kraszewska, Barbara; Heisig, Monika

    2009-09-01

    In the present study an attempt was made to establish whether and to what extent auditory deprivation modifies the processes of visual analysis and synthesis. The study included 54 children aged 10-16 years with hearing impairment attending the School and Educational Center for Children with Hearing Impairment in Wrocław (group I) and 127 children with normal hearing acuity attending public schools (group II), who formed a reference group. Hearing impairment in the children of group I ranged from 60 to 100 dB. In 9 of these children the hearing impairment was inherited, while in others it was acquired, resulting from rubella during the mother's pregnancy (5 subjects), a severe disease course in childhood such as cerebral meningitis (4 subjects), or antibiotic therapy for otolaryngological conditions (7 subjects). In the remaining subjects the cause of auditory deprivation was unknown. Apart from the genetically conditioned cases, hearing impairment appeared in the first months or years of life. The general intellectual level of the examined children was similar to that of their control counterparts, as confirmed by school psychologists during routine examinations. Visual perception was examined by means of two tests from the nonverbal scale of the Wechsler Intelligence Scale for Children: Puzzles and Block Design. The children with a hearing deficit generally needed more time to perform the tasks than those with normal hearing. The investigated parameters of visual perception improved with age, but the dynamics of these changes differed between the two study groups. PMID:19387878

  5. Professors' Facebook content affects students' perceptions and expectations.

    PubMed

    Sleigh, Merry J; Smith, Aimee W; Laboe, Jason

    2013-07-01

    Facebook users must make choices about their level of self-disclosure, and this self-disclosure can influence perceptions of the profile's author. We examined whether the specific type of self-disclosure on a professor's profile would affect students' perceptions of the professor and expectations of his classroom. We created six Facebook profiles for a fictitious male professor, each with a specific emphasis: politically conservative, politically liberal, religious, family oriented, socially oriented, or professional. Undergraduate students randomly viewed one profile and responded to questions that assessed their perceptions and expectations. The social professor was perceived as less skilled but more popular, and his profile was perceived as inappropriate but entertaining. Students reacted more strongly and negatively to the politically focused profiles than to the religious, family, and professional profiles. Students reported being most interested in professional information on a professor's Facebook profile, yet they reported being least influenced by the professional profile. In general, students expressed neutrality about their interest in finding and friending professors on Facebook. These findings suggest that students have the potential to form perceptions about the classroom environment and about their professors based on the specific details disclosed in professors' Facebook profiles. PMID:23614794

  6. Prediction of visual perceptions with artificial neural networks in a visual prosthesis for the blind.

    PubMed

    Archambeau, Cédric; Delbeke, Jean; Veraart, Claude; Verleysen, Michel

    2004-11-01

    Within the framework of the OPTIVIP project, an optic nerve based visual prosthesis is developed in order to restore partial vision to the blind. One of the main challenges is to understand, decode and model the physiological process linking the stimulating parameters to the visual sensations produced in the visual field of a blind volunteer. We propose to use adaptive neural techniques. Two prediction models are investigated. The first one is a grey-box model exploiting the neurophysiological knowledge available up to now. It combines a neurophysiological model with artificial neural networks, such as multi-layer perceptrons and radial basis function networks, in order to predict the features of the visual perceptions. The second model is entirely of the black-box type. We show that both models provide satisfactory prediction tools and achieve similar prediction accuracies. Moreover, we demonstrate that significant improvement (25%) was gained with respect to linear statistical methods, suggesting that the biological process is strongly non-linear.
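
    The black-box approach amounts to a generic regression from stimulation parameters to perception features. A hedged sketch using a scikit-learn multi-layer perceptron on synthetic data is shown below; the input and output variable names (pulse amplitude/duration/frequency, phosphene position/size) are illustrative assumptions, not the project's actual encoding.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(2)

    # Synthetic stand-in data: stimulation parameters -> perception features
    # X columns: pulse amplitude, pulse duration, train frequency (arbitrary units)
    X = rng.uniform(0, 1, size=(500, 3))
    # Y columns: phosphene azimuth, elevation, size (a made-up nonlinear mapping)
    Y = np.column_stack([
        np.sin(2 * X[:, 0]) + 0.3 * X[:, 2],
        X[:, 1] ** 2 - 0.5 * X[:, 0],
        0.2 + X[:, 0] * X[:, 1],
    ]) + 0.05 * rng.standard_normal((500, 3))

    X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.2, random_state=0)

    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
    model.fit(X_tr, Y_tr)
    print("held-out R^2:", round(model.score(X_te, Y_te), 3))
    ```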

  7. Causal evidence for subliminal percept-to-memory interference in early visual cortex.

    PubMed

    Silvanto, Juha; Soto, David

    2012-01-01

    There has been recent interest in the neural correlates of visual short-term memory (VSTM) interference by irrelevant perceptual input. These studies, however, presented distracters that were subjected to conscious scrutiny by participants, thus strongly involving attentional control mechanisms. In order to minimize the role of attentional control and to investigate interference occurring at the level of sensory representations, we developed a paradigm in which a subliminal visual distracter is presented during the delay period of a visual short-term memory task requiring the maintenance of stimulus orientation. This subliminal distracter could be either congruent or incongruent with the orientation of the memory item. Behavioral results showed that the intervening distracter affected the fidelity of VSTM when it was incongruent with the memory cue. We then assessed the causal role of the early visual cortex in this interaction by using transcranial magnetic stimulation (TMS). We found that occipital TMS impaired the fidelity of VSTM content in the absence of the memory mask. Interestingly, TMS facilitated VSTM performance in the presence of a subliminal memory mask that was incongruent with the memory content. Signal detection analyses indicated that TMS did not modulate perceptual sensitivity to the masked distracter. That the impact of TMS on the precision of VSTM was dissociated by the presence vs. absence of a subliminal perceptual distracter and by its congruency with the VSTM content provides causal evidence for the view that competitive interactions between memory and perception can occur at the earliest cortical stages of visual processing.

  8. In the eye of the beholder: Visual biases in package and portion size perceptions.

    PubMed

    Ordabayeva, Nailya; Chandon, Pierre

    2016-08-01

    As the sizes of food packages and portions have changed rapidly over the past decades, it has become crucial to understand how consumers perceive and respond to changes in size. Existing evidence suggests that consumers make errors when visually estimating package and portion sizes, and these errors significantly influence subsequent food choices and intake. We outline four visual biases (arising from the underestimation of increasing portion sizes, the dimensionality of the portion size change, labeling effects, and consumer affect) that shape consumers' perceptions of package and portion sizes. We discuss the causes of these biases, review their impact on food consumption decisions, and suggest concrete strategies to reduce them and to promote healthier eating. We conclude with a discussion of important theoretical and practical issues that should be addressed in the future. PMID:26482283

  9. Visual exposure to large and small portion sizes and perceptions of portion size normality: Three experimental studies

    PubMed Central

    Robinson, Eric; Oldham, Melissa; Cuckson, Imogen; Brunstrom, Jeffrey M.; Rogers, Peter J.; Hardman, Charlotte A.

    2016-01-01

    Portion sizes of many foods have increased in recent times. In three studies we examined the effect that repeated visual exposure to larger versus smaller food portion sizes has on perceptions of what constitutes a normal-sized food portion and measures of portion size selection. In studies 1 and 2 participants were visually exposed to images of large or small portions of spaghetti bolognese, before making evaluations about an image of an intermediate sized portion of the same food. In study 3 participants were exposed to images of large or small portions of a snack food before selecting a portion size of snack food to consume. Across the three studies, visual exposure to larger as opposed to smaller portion sizes resulted in participants considering a normal portion of food to be larger than a reference intermediate sized portion. In studies 1 and 2 visual exposure to larger portion sizes also increased the size of self-reported ideal meal size. In study 3 visual exposure to larger portion sizes of a snack food did not affect how much of that food participants subsequently served themselves and ate. Visual exposure to larger portion sizes may adjust visual perceptions of what constitutes a ‘normal’ sized portion. However, we did not find evidence that visual exposure to larger portions altered snack food intake. PMID:26702602

  10. Cue Integration in Categorical Tasks: Insights from Audio-Visual Speech Perception

    PubMed Central

    Bejjanki, Vikranth Rao; Clayards, Meghan; Knill, David C.; Aslin, Richard N.

    2011-01-01

    Previous cue integration studies have examined continuous perceptual dimensions (e.g., size) and have shown that human cue integration is well described by a normative model in which cues are weighted in proportion to their sensory reliability, as estimated from single-cue performance. However, this normative model may not be applicable to categorical perceptual dimensions (e.g., phonemes). In tasks defined over categorical perceptual dimensions, optimal cue weights should depend not only on the sensory variance affecting the perception of each cue but also on the environmental variance inherent in each task-relevant category. Here, we present a computational and experimental investigation of cue integration in a categorical audio-visual (articulatory) speech perception task. Our results show that human performance during audio-visual phonemic labeling is qualitatively consistent with the behavior of a Bayes-optimal observer. Specifically, we show that the participants in our task are sensitive, on a trial-by-trial basis, to the sensory uncertainty associated with the auditory and visual cues, during phonemic categorization. In addition, we show that while sensory uncertainty is a significant factor in determining cue weights, it is not the only one and participants' performance is consistent with an optimal model in which environmental, within category variability also plays a role in determining cue weights. Furthermore, we show that in our task, the sensory variability affecting the visual modality during cue-combination is not well estimated from single-cue performance, but can be estimated from multi-cue performance. The findings and computational principles described here represent a principled first step towards characterizing the mechanisms underlying human cue integration in categorical tasks. PMID:21637344
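
    For reference, the baseline normative rule mentioned above (weights proportional to sensory reliability, as used for continuous dimensions) can be written in a few lines; the numbers are made up for illustration, and this sketch does not include the environmental, within-category variance term that the paper adds for categorical tasks.

    ```python
    def combine_cues(est_a, var_a, est_v, var_v):
        """Reliability-weighted (maximum-likelihood) combination of two cues.

        Weights are proportional to inverse variance; the combined estimate
        has lower variance than either cue alone.
        """
        w_a = (1 / var_a) / (1 / var_a + 1 / var_v)
        w_v = 1 - w_a
        combined = w_a * est_a + w_v * est_v
        combined_var = 1 / (1 / var_a + 1 / var_v)
        return combined, combined_var

    # Example: the auditory cue is noisier than the visual cue
    print(combine_cues(est_a=0.8, var_a=4.0, est_v=0.2, var_v=1.0))
    # -> (0.32, 0.8): the estimate is pulled toward the more reliable cue
    ```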

  11. The perception of affective touch in anorexia nervosa.

    PubMed

    Crucianelli, Laura; Cardi, Valentina; Treasure, Janet; Jenkinson, Paul M; Fotopoulou, Aikaterini

    2016-05-30

    Anorexia nervosa (AN) is a disorder characterized by restricted eating, fears of gaining weight, and body image distortions. The etiology remains unknown; however, impairments in social cognition and reward circuits contribute to the onset and maintenance of the disorder. One possibility is that AN is associated with reduced perceived pleasantness during social interactions. We therefore examined the perception of interpersonal, 'affective touch' and its social modulation in AN. We measured the perceived pleasantness of light, dynamic stroking touches applied to the forearm of 25 AN patients and 30 healthy controls at C-tactile (CT) afferent-optimal (3 cm/s) and non-optimal (18 cm/s) velocities, while simultaneously displaying images of faces showing rejecting, neutral, and accepting expressions. CT-optimal touch, but not CT non-optimal touch, elicited significantly lower pleasantness ratings in AN patients compared with healthy controls. Pleasantness ratings were modulated by facial expressions in both groups in a similar fashion; namely, presenting socially accepting faces increased the perception of touch pleasantness more than neutral and rejecting faces did. Our findings suggest that individuals with AN have a disordered, CT-based affective touch system. This impairment may be linked to their weakened interoceptive perception and distorted body representation. PMID:27137964

  12. PERCEPT Indoor Navigation System for the Blind and Visually Impaired: Architecture and Experimentation

    PubMed Central

    Ganz, Aura; Schafer, James; Gandhi, Siddhesh; Puleo, Elaine; Wilson, Carole; Robertson, Meg

    2012-01-01

    We introduce the PERCEPT system, an indoor navigation system for the blind and visually impaired. PERCEPT will improve the quality of life and health of the visually impaired community by enabling independent living. Using PERCEPT, blind users will have independent access to public health facilities such as clinics, hospitals, and wellness centers. Access to healthcare facilities is crucial for this population due to the multiple health conditions that they face, such as diabetes and its complications. PERCEPT system trials with 24 blind and visually impaired users in a multistory building show the system's effectiveness in providing appropriate navigation instructions to these users. The uniqueness of our system is that it is affordable and that its design follows orientation and mobility principles. We hope that PERCEPT will become a standard deployed in all indoor public spaces, especially in healthcare and wellness facilities. PMID:23316225

  14. Additivity in perception of affect from limb motion.

    PubMed

    Etemad, S Ali; Arya, Ali; Parush, Avi

    2014-01-13

    In this study, the notion of additivity in perception of affect from limb motion is investigated. Specifically, we examine whether the impact of multiple limbs in perception of affect is equal to the sum of the impacts of each individual limb. Several neutral, happy, and sad walking sequences are first aligned and averaged. Four distinct body regions or limbs are defined for this study: arms and hands, legs and feet, head and neck, and torso. The three average walks are used to create the stimuli. The motion of each limb and combination of limbs from the neutral sequence are replaced with those of the happy and sad sequences. Through collecting perceptual ratings for when individual limbs contain affective features, and comparing the sums of these ratings to instances where multiple limbs of the body simultaneously contain affective features, additivity is investigated. We find that while the results are highly correlated, additivity does not hold in the classical sense. Based on the results, a mathematical model is proposed for describing the observed relationship. PMID:24269980

  15. Focal Length Affects Depicted Shape and Perception of Facial Images.

    PubMed

    Třebický, Vít; Fialová, Jitka; Kleisner, Karel; Havlíček, Jan

    2016-01-01

    Static photographs are currently the most commonly employed stimuli in research on social perception. The method of photograph acquisition might affect the depicted subject's facial appearance and thus also the impression made by such stimuli. An important factor influencing the resulting photograph is focal length, as different focal lengths produce different levels of image distortion. Here we tested whether different focal lengths (50, 85, 105 mm) affect the depicted shape and perception of female and male faces. We collected three portrait photographs of 45 participants (22 females, 23 males) under standardized conditions and camera settings varying only in focal length. Subsequently, the three photographs of each individual were shown on screen in randomized order using a 3-alternative forced-choice paradigm. The images were judged for attractiveness, dominance, and femininity/masculinity by 369 raters (193 females, 176 males). Facial width-to-height ratio (fWHR) was measured from each photograph, and overall facial shape was analysed using geometric morphometric methods (GMM). Our results showed that photographs taken with a 50 mm focal length were rated as significantly less feminine/masculine, attractive, and dominant compared with images taken with longer focal lengths. Further, shorter focal lengths produced faces with a smaller fWHR. Subsequent GMM revealed that focal length significantly affected the overall facial shape of the photographed subjects. Thus the methodology of photograph acquisition, focal length in this case, can significantly affect the results of studies using photographic stimuli, perhaps due to different levels of perspective distortion that influence the shapes and proportions of morphological traits.

  16. Assistive Technology Competencies of Teachers of Students with Visual Impairments: A Comparison of Perceptions

    ERIC Educational Resources Information Center

    Zhou, Li; Smith, Derrick W.; Parker, Amy T.; Griffin-Shirley, Nora

    2011-01-01

    This study surveyed teachers of students with visual impairments in Texas on their perceptions of a set of assistive technology competencies developed for teachers of students with visual impairments by Smith and colleagues (2009). Differences in opinion between practicing teachers of students with visual impairments and Smith's group of…

  17. Eye movements and attention in reading, scene perception, and visual search.

    PubMed

    Rayner, Keith

    2009-08-01

    Eye movements are now widely used to investigate cognitive processes during reading, scene perception, and visual search. In this article, research on the following topics is reviewed with respect to reading: (a) the perceptual span (or span of effective vision), (b) preview benefit, (c) eye movement control, and (d) models of eye movements. Related issues with respect to eye movements during scene perception and visual search are also reviewed. It is argued that research on eye movements during reading has been somewhat advanced over research on eye movements in scene perception and visual search and that some of the paradigms developed to study reading should be more widely adopted in the study of scene perception and visual search. Research dealing with "real-world" tasks and research utilizing the visual-world paradigm are also briefly discussed.

  18. No-reference quality assessment based on visual perception

    NASA Astrophysics Data System (ADS)

    Li, Junshan; Yang, Yawei; Hu, Shuangyan; Zhang, Jiao

    2014-11-01

    The visual quality assessment of images and videos is an ongoing, active research topic that has become increasingly important for numerous image and video processing applications with the rapid development of digital imaging and communication technologies. The goal of image quality assessment (IQA) algorithms is to assess the quality of images or videos automatically, in agreement with human quality judgments. Two kinds of models have been used for IQA: full-reference (FR) and no-reference (NR) models. FR models interpret image quality as fidelity or similarity to a perfect image in some perceptual space. However, the reference image is not available in many practical applications, and an NR IQA approach is desired. Viewing natural vision as optimized by millions of years of evolutionary pressure, many methods attempt to achieve consistency in quality prediction by modeling salient physiological and psychological features of the human visual system (HVS). To reach this goal, researchers simulate the HVS with image sparse coding and supervised machine learning, two main characteristics of HVS processing: a typical HVS captures scenes via sparse coding and uses experienced knowledge to apperceive objects. In this paper, we propose a novel IQA approach based on visual perception. First, a standard model of the HVS is studied and analyzed, and the sparse representation of the image is computed with this model; then the mapping between sparse codes and subjective quality scores is learned with least-squares support vector machine (LS-SVM) regression, yielding a regressor that can predict image quality; finally, the visual quality metric of an image is predicted with the trained regressor. We validate the performance of the proposed approach on the Laboratory for Image and Video Engineering (LIVE) database; the specific distortion types present in the database are: 227 images of JPEG2000, 233
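
    The regression step, mapping per-image features to subjective quality scores, can be illustrated generically. The sketch below uses scikit-learn's epsilon-SVR with an RBF kernel as a stand-in for the LS-SVM named in the abstract, and random features in place of actual sparse codes and LIVE opinion scores; evaluation by Spearman rank correlation (SROCC) follows the usual IQA convention.

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import train_test_split
    from scipy.stats import spearmanr

    rng = np.random.default_rng(3)

    # Stand-ins: rows = images, columns = sparse-code statistics; y = opinion score
    X = rng.standard_normal((300, 20))
    y = X @ rng.standard_normal(20) + 0.5 * rng.standard_normal(300)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    reg = SVR(kernel="rbf", C=10.0, epsilon=0.1)   # stand-in for LS-SVM regression
    reg.fit(X_tr, y_tr)

    # IQA models are usually scored by rank correlation with subjective ratings
    rho, _ = spearmanr(reg.predict(X_te), y_te)
    print("SROCC on held-out images:", round(rho, 3))
    ```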

  19. Affect of the unconscious: Visually suppressed angry faces modulate our decisions

    PubMed Central

    Pajtas, Petra E.; Mahon, Bradford Z.; Nakayama, Ken; Caramazza, Alfonso

    2016-01-01

    Emotional and affective processing imposes itself over cognitive processes and modulates our perception of the surrounding environment. In two experiments, we addressed the issue of whether nonconscious processing of affect can take place even under deep states of unawareness, such as those induced by interocular suppression techniques, and can elicit an affective response that can influence our understanding of the surrounding environment. In Experiment 1, participants judged the likeability of an unfamiliar item—a Chinese character—that was preceded by a face expressing a particular emotion (either happy or angry). The face was rendered invisible through an interocular suppression technique (continuous flash suppression; CFS). In Experiment 2, backward masking (BM), a less robust masking technique, was used to render the facial expressions invisible. We found that despite equivalent phenomenological suppression of the visual primes under CFS and BM, different patterns of affective processing were obtained with the two masking techniques. Under BM, nonconscious affective priming was obtained for both happy and angry invisible facial expressions. However, under CFS, nonconscious affective priming was obtained only for angry facial expressions. We discuss an interpretation of this dissociation between affective processing and visual masking techniques in terms of distinct routes from the retina to the amygdala. PMID:23224765

  20. Relative Visual Oscillation Can Facilitate Visually Induced Self-Motion Perception

    PubMed Central

    Palmisano, Stephen; Kim, Juno

    2016-01-01

    Adding simulated viewpoint jitter or oscillation to displays enhances visually induced illusions of self-motion (vection). The cause of this enhancement is yet to be fully understood. Here, we conducted psychophysical experiments to investigate the effects of different types of simulated oscillation on vertical vection. Observers viewed horizontally oscillating and nonoscillating optic flow fields simulating downward self-motion through an aperture. The aperture was visually simulated to be nearer to the observer and was stationary or oscillating in-phase or counter-phase to the direction of background horizontal oscillations of optic flow. Results showed that vection strength was modulated by the oscillation of the aperture relative to the background optic flow. Vertical vection strength increased as the relative oscillatory horizontal motion between the flow and the aperture increased. However, such increases in vection were only generated when the added oscillations were orthogonal to the principal direction of the optic flow pattern, and not when they occurred in the same direction. The oscillation effects observed in this investigation could not be explained by motion adaptation or different (motion parallax based) effects on depth perception. Instead, these results suggest that the oscillation advantage for vection depends on relative visual motion. PMID:27698982

  2. Categorical perception of affective and linguistic facial expressions.

    PubMed

    McCullough, Stephen; Emmorey, Karen

    2009-02-01

    Two experiments investigated categorical perception (CP) effects for affective facial expressions and linguistic facial expressions from American Sign Language (ASL) for Deaf native signers and hearing non-signers. Facial expressions were presented in isolation (Experiment 1) or in an ASL verb context (Experiment 2). Participants performed ABX discrimination and identification tasks on morphed affective and linguistic facial expression continua. The continua were created by morphing end-point photo exemplars into 11 images, changing linearly from one expression to another in equal steps. For both affective and linguistic expressions, hearing non-signers exhibited better discrimination across category boundaries than within categories for both experiments, thus replicating previous results with affective expressions and demonstrating CP effects for non-canonical facial expressions. Deaf signers, however, showed significant CP effects only for linguistic facial expressions. Subsequent analyses indicated that order of presentation influenced signers' response time performance for affective facial expressions: viewing linguistic facial expressions first slowed response time for affective facial expressions. We conclude that CP effects for affective facial expressions can be influenced by language experience. PMID:19111287

  3. The Developmental Test of Visual Perception-Third Edition (DTVP-3): A Review, Critique, and Practice Implications

    ERIC Educational Resources Information Center

    Brown, Ted; Murdolo, Yuki

    2015-01-01

    The "Developmental Test of Visual Perception-Third Edition" (DTVP-3) is a recent revision of the "Developmental Test of Visual Perception-Second Edition" (DTVP-2). The DTVP-3 is designed to assess the visual perceptual and/or visual-motor integration skills of children from 4 to 12 years of age. The test is standardized using…

  4. The Perceptual Root of Object-Based Storage: An Interactive Model of Perception and Visual Working Memory

    ERIC Educational Resources Information Center

    Gao, Tao; Gao, Zaifeng; Li, Jie; Sun, Zhongqiang; Shen, Mowei

    2011-01-01

    Mainstream theories of visual perception assume that visual working memory (VWM) is critical for integrating online perceptual information and constructing coherent visual experiences in changing environments. Given the dynamic interaction between online perception and VWM, we propose that how visual information is processed during visual…

  5. Sound frequency affects speech emotion perception: results from congenital amusia

    PubMed Central

    Lolli, Sydney L.; Lewenstein, Ari D.; Basurto, Julian; Winnik, Sean; Loui, Psyche

    2015-01-01

    Congenital amusics, or “tone-deaf” individuals, show difficulty in perceiving and producing small pitch differences. While amusia has marked effects on music perception, its impact on speech perception is less clear. Here we test the hypothesis that individual differences in pitch perception affect judgment of emotion in speech, by applying low-pass filters to spoken statements of emotional speech. A norming study was first conducted on Mechanical Turk to ensure that the intended emotions from the Macquarie Battery for Evaluation of Prosody were reliably identifiable by US English speakers. The most reliably identified emotional speech samples were used in Experiment 1, in which subjects performed a psychophysical pitch discrimination task, and an emotion identification task under low-pass and unfiltered speech conditions. Results showed a significant correlation between pitch-discrimination threshold and emotion identification accuracy for low-pass filtered speech, with amusics (defined here as those with a pitch discrimination threshold >16 Hz) performing worse than controls. This relationship with pitch discrimination was not seen in unfiltered speech conditions. Given the dissociation between low-pass filtered and unfiltered speech conditions, we inferred that amusics may be compensating for poorer pitch perception by using speech cues that are filtered out in this manipulation. To assess this potential compensation, Experiment 2 was conducted using high-pass filtered speech samples intended to isolate non-pitch cues. No significant correlation was found between pitch discrimination and emotion identification accuracy for high-pass filtered speech. Results from these experiments suggest an influence of low frequency information in identifying emotional content of speech. PMID:26441718

  6. Sound frequency affects speech emotion perception: results from congenital amusia.

    PubMed

    Lolli, Sydney L; Lewenstein, Ari D; Basurto, Julian; Winnik, Sean; Loui, Psyche

    2015-01-01

    Congenital amusics, or "tone-deaf" individuals, show difficulty in perceiving and producing small pitch differences. While amusia has marked effects on music perception, its impact on speech perception is less clear. Here we test the hypothesis that individual differences in pitch perception affect judgment of emotion in speech, by applying low-pass filters to spoken statements of emotional speech. A norming study was first conducted on Mechanical Turk to ensure that the intended emotions from the Macquarie Battery for Evaluation of Prosody were reliably identifiable by US English speakers. The most reliably identified emotional speech samples were used in Experiment 1, in which subjects performed a psychophysical pitch discrimination task, and an emotion identification task under low-pass and unfiltered speech conditions. Results showed a significant correlation between pitch-discrimination threshold and emotion identification accuracy for low-pass filtered speech, with amusics (defined here as those with a pitch discrimination threshold >16 Hz) performing worse than controls. This relationship with pitch discrimination was not seen in unfiltered speech conditions. Given the dissociation between low-pass filtered and unfiltered speech conditions, we inferred that amusics may be compensating for poorer pitch perception by using speech cues that are filtered out in this manipulation. To assess this potential compensation, Experiment 2 was conducted using high-pass filtered speech samples intended to isolate non-pitch cues. No significant correlation was found between pitch discrimination and emotion identification accuracy for high-pass filtered speech. Results from these experiments suggest an influence of low frequency information in identifying emotional content of speech. PMID:26441718
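
    The filtering manipulation described here, removing higher-frequency spectral cues so that mainly low-frequency (pitch-related) information remains, can be sketched with a standard Butterworth low-pass filter; the 500 Hz cutoff and the synthetic waveform below are illustrative assumptions, not the study's exact settings.

    ```python
    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def low_pass_speech(waveform, fs, cutoff_hz=500.0, order=4):
        """Zero-phase Butterworth low-pass filter for a mono speech waveform."""
        sos = butter(order, cutoff_hz, btype="low", fs=fs, output="sos")
        return sosfiltfilt(sos, waveform)

    # Example on a synthetic stand-in signal (1 s at 16 kHz):
    # a low 'pitch-like' component plus a high-frequency component to be removed
    fs = 16000
    t = np.arange(fs) / fs
    speech_like = np.sin(2 * np.pi * 200 * t) + 0.5 * np.sin(2 * np.pi * 3000 * t)
    filtered = low_pass_speech(speech_like, fs)
    ```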

  7. Visual space perception at different levels of depth description.

    PubMed

    Šikl, Radovan; Šimeček, Michal

    2015-08-01

    The main purpose of this study was to determine the effect of the depth description levels required in experimental tasks on visual space perception. Six observers assessed the locations of 11 posts by determining a distance ranking order, comparing the distances between posts with a reference unit, and estimating the absolute distances between the posts. The experiments were performed in an open outdoor field under normal daylight conditions with posts at distances ranging from 2 to 12 m. To directly assess and compare the observers' perceptual performance in all three phases of the experiment, the raw data were transformed to common measurement levels. A pairwise comparison analysis provided nonmetric information regarding the observers' relative distance judgments, and a multidimensional-scaling procedure provided metric information regarding the relationship between a perceived spatial layout and the layout of the actual scene. The common finding in all of the analyses was that the precision and consistency of the observers' ordinal distance judgments were greater than those of their ratio distance judgments, which were, in turn, greater than the precision and consistency of their absolute-magnitude distance judgments. Our findings raise questions regarding the ecological validity of standard experimental tasks.
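
    The multidimensional-scaling step, recovering a metric spatial layout from pairwise distance judgments, can be sketched with scikit-learn's metric MDS applied to a synthetic dissimilarity matrix standing in for observers' judgments (the 11 posts and the noise level below are illustrative assumptions):

    ```python
    import numpy as np
    from sklearn.manifold import MDS

    rng = np.random.default_rng(5)

    # Stand-in ground truth: 11 post positions in a 2-D field (metres)
    posts = rng.uniform(0, 12, size=(11, 2))

    # Pairwise 'judged' distances = true distances plus perceptual noise
    true_d = np.linalg.norm(posts[:, None, :] - posts[None, :, :], axis=-1)
    judged = np.abs(true_d + 0.3 * rng.standard_normal(true_d.shape))
    judged = (judged + judged.T) / 2              # keep the matrix symmetric
    np.fill_diagonal(judged, 0.0)

    # Recover a 2-D layout consistent with the judged distances
    mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
    layout = mds.fit_transform(judged)
    print(layout.shape)   # (11, 2)
    ```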

  8. Combining spatial and temporal expectations to improve visual perception

    PubMed Central

    Rohenkohl, Gustavo; Gould, Ian C.; Pessoa, Jéssica; Nobre, Anna C.

    2014-01-01

    The importance of temporal expectations in modulating perceptual functions is increasingly recognized. However, the means through which temporal expectations can bias perceptual information processing remains ill understood. Recent theories propose that modulatory effects of temporal expectations rely on the co-existence of other biases based on receptive-field properties, such as spatial location. We tested whether perceptual benefits of temporal expectations in a perceptually demanding psychophysical task depended on the presence of spatial expectations. Foveally presented symbolic arrow cues indicated simultaneously where (location) and when (time) target events were more likely to occur. The direction of the arrow indicated target location (80% validity), while its color (pink or blue) indicated the interval (80% validity) for target appearance. Our results confirmed a strong synergistic interaction between temporal and spatial expectations in enhancing visual discrimination. Temporal expectation significantly boosted the effectiveness of spatial expectation in sharpening perception. However, benefits for temporal expectation disappeared when targets occurred at unattended locations. Our findings suggest that anticipated receptive-field properties of targets provide a natural template upon which temporal expectations can operate in order to help prioritize goal-relevant events from early perceptual stages. PMID:24722562

  9. Colour Terms Affect Detection of Colour and Colour-Associated Objects Suppressed from Visual Awareness.

    PubMed

    Forder, Lewis; Taylor, Olivia; Mankin, Helen; Scott, Ryan B; Franklin, Anna

    2016-01-01

    The idea that language can affect how we see the world continues to create controversy. A potentially important study in this field has shown that when an object is suppressed from visual awareness using continuous flash suppression (a form of binocular rivalry), detection of the object is differently affected by a preceding word prime depending on whether the prime matches or does not match the object. This may suggest that language can affect early stages of vision. We replicated this paradigm and further investigated whether colour terms likewise influence the detection of colours or colour-associated object images suppressed from visual awareness by continuous flash suppression. This method presents rapidly changing visual noise to one eye while the target stimulus is presented to the other. It has been shown to delay conscious perception of a target for up to several minutes. In Experiment 1 we presented greyscale photos of objects. They were either preceded by a congruent object label, an incongruent label, or white noise. Detection sensitivity (d') and hit rates were significantly poorer for suppressed objects preceded by an incongruent label compared to a congruent label or noise. In Experiment 2, targets were coloured discs preceded by a colour term. Detection sensitivity was significantly worse for suppressed colour patches preceded by an incongruent colour term as compared to a congruent term or white noise. In Experiment 3 targets were suppressed greyscale object images preceded by an auditory presentation of a colour term. On congruent trials the colour term matched the object's stereotypical colour and on incongruent trials the colour term mismatched. Detection sensitivity was significantly poorer on incongruent trials than congruent trials. Overall, these findings suggest that colour terms affect awareness of coloured stimuli and colour- associated objects, and provide new evidence for language-perception interaction in the brain. PMID:27023274

  11. Colour Terms Affect Detection of Colour and Colour-Associated Objects Suppressed from Visual Awareness

    PubMed Central

    Forder, Lewis; Taylor, Olivia; Mankin, Helen; Scott, Ryan B.; Franklin, Anna

    2016-01-01

    The idea that language can affect how we see the world continues to create controversy. A potentially important study in this field has shown that when an object is suppressed from visual awareness using continuous flash suppression (a form of binocular rivalry), detection of the object is differently affected by a preceding word prime depending on whether the prime matches or does not match the object. This may suggest that language can affect early stages of vision. We replicated this paradigm and further investigated whether colour terms likewise influence the detection of colours or colour-associated object images suppressed from visual awareness by continuous flash suppression. This method presents rapidly changing visual noise to one eye while the target stimulus is presented to the other. It has been shown to delay conscious perception of a target for up to several minutes. In Experiment 1 we presented greyscale photos of objects. They were either preceded by a congruent object label, an incongruent label, or white noise. Detection sensitivity (d’) and hit rates were significantly poorer for suppressed objects preceded by an incongruent label compared to a congruent label or noise. In Experiment 2, targets were coloured discs preceded by a colour term. Detection sensitivity was significantly worse for suppressed colour patches preceded by an incongruent colour term as compared to a congruent term or white noise. In Experiment 3 targets were suppressed greyscale object images preceded by an auditory presentation of a colour term. On congruent trials the colour term matched the object’s stereotypical colour and on incongruent trials the colour term mismatched. Detection sensitivity was significantly poorer on incongruent trials than congruent trials. Overall, these findings suggest that colour terms affect awareness of coloured stimuli and colour-associated objects, and provide new evidence for language-perception interaction in the brain. PMID:27023274
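
    The detection sensitivity (d') reported in the abstracts above is a standard signal detection measure derived from hit and false-alarm rates. The sketch below shows one common way to compute it under the equal-variance Gaussian model, using hypothetical trial counts (not values from the study) and a log-linear correction assumption to avoid infinite z-scores.

```python
# Minimal sketch of the detection-sensitivity measure d' under the standard
# equal-variance signal detection model. The trial counts are hypothetical,
# for illustration only; they are not taken from the study summarized above.
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Compute d' from raw trial counts, applying a log-linear correction
    so that hit/false-alarm rates of exactly 0 or 1 remain finite."""
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# Hypothetical counts for congruent vs. incongruent label trials
print(d_prime(hits=38, misses=12, false_alarms=8, correct_rejections=42))   # congruent
print(d_prime(hits=29, misses=21, false_alarms=9, correct_rejections=41))   # incongruent
```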

  12. A color fusion method of infrared and low-light-level images based on visual perception

    NASA Astrophysics Data System (ADS)

    Han, Jing; Yan, Minmin; Zhang, Yi; Bai, Lianfa

    2014-11-01

    Color fusion images can be obtained by fusing infrared and low-light-level images, and they contain the information of both sources. Such fusion images help observers understand the multichannel images comprehensively. However, simple fusion may lose target information because targets are inconspicuous in long-distance infrared and low-light-level images; and if target extraction is adopted blindly, perception of the scene information is seriously affected. To solve this problem, a new fusion method based on visual perception is proposed in this paper. The extraction of visual targets ("what" information) and a parallel processing mechanism are applied to traditional color fusion methods. The infrared and low-light-level color fusion images are obtained through efficient learning of typical targets. Experimental results show the effectiveness of the proposed method. The fusion images produced by our algorithm not only improve the detection rate of targets, but also retain rich natural information about the scenes.

  13. Differentiation of Competence and Affect Self-Perceptions in Elementary School Students: Extending Empirical Evidence

    ERIC Educational Resources Information Center

    Arens, A. Katrin; Hasselhorn, Marcus

    2015-01-01

    This study aimed to address two underexplored research questions regarding support for the separation between competence and affect self-perceptions due to differential relations to outcome criteria. First, it is tested whether higher relations between affect self-perceptions and effort than between competence self-perceptions and effort can also…

  14. Lesions to right posterior parietal cortex impair visual depth perception from disparity but not motion cues

    PubMed Central

    Leopold, David A.; Humphreys, Glyn W.; Welchman, Andrew E.

    2016-01-01

    The posterior parietal cortex (PPC) is understood to be active when observers perceive three-dimensional (3D) structure. However, it is not clear how central this activity is in the construction of 3D spatial representations. Here, we examine whether PPC is essential for two aspects of visual depth perception by testing patients with lesions affecting this region. First, we measured subjects' ability to discriminate depth structure in various 3D surfaces and objects using binocular disparity. Patients with lesions to right PPC (N = 3) exhibited marked perceptual deficits on these tasks, whereas those with left hemisphere lesions (N = 2) were able to reliably discriminate depth as accurately as control subjects. Second, we presented an ambiguous 3D stimulus defined by structure from motion to determine whether PPC lesions influence the rate of bistable perceptual alternations. Patients' percept durations for the 3D stimulus were generally within a normal range, although the two patients with bilateral PPC lesions showed the fastest perceptual alternation rates in our sample. Intermittent stimulus presentation reduced the reversal rate similarly across subjects. Together, the results suggest that PPC plays a causal role in both inferring and maintaining the perception of 3D structure with stereopsis supported primarily by the right hemisphere, but do not lend support to the view that PPC is a critical contributor to bistable perceptual alternations. This article is part of the themed issue ‘Vision in our three-dimensional world’. PMID:27269606

  15. Lesions to right posterior parietal cortex impair visual depth perception from disparity but not motion cues.

    PubMed

    Murphy, Aidan P; Leopold, David A; Humphreys, Glyn W; Welchman, Andrew E

    2016-06-19

    The posterior parietal cortex (PPC) is understood to be active when observers perceive three-dimensional (3D) structure. However, it is not clear how central this activity is in the construction of 3D spatial representations. Here, we examine whether PPC is essential for two aspects of visual depth perception by testing patients with lesions affecting this region. First, we measured subjects' ability to discriminate depth structure in various 3D surfaces and objects using binocular disparity. Patients with lesions to right PPC (N = 3) exhibited marked perceptual deficits on these tasks, whereas those with left hemisphere lesions (N = 2) were able to reliably discriminate depth as accurately as control subjects. Second, we presented an ambiguous 3D stimulus defined by structure from motion to determine whether PPC lesions influence the rate of bistable perceptual alternations. Patients' percept durations for the 3D stimulus were generally within a normal range, although the two patients with bilateral PPC lesions showed the fastest perceptual alternation rates in our sample. Intermittent stimulus presentation reduced the reversal rate similarly across subjects. Together, the results suggest that PPC plays a causal role in both inferring and maintaining the perception of 3D structure with stereopsis supported primarily by the right hemisphere, but do not lend support to the view that PPC is a critical contributor to bistable perceptual alternations. This article is part of the themed issue 'Vision in our three-dimensional world'. PMID:27269606

  16. Lesions to right posterior parietal cortex impair visual depth perception from disparity but not motion cues.

    PubMed

    Murphy, Aidan P; Leopold, David A; Humphreys, Glyn W; Welchman, Andrew E

    2016-06-19

    The posterior parietal cortex (PPC) is understood to be active when observers perceive three-dimensional (3D) structure. However, it is not clear how central this activity is in the construction of 3D spatial representations. Here, we examine whether PPC is essential for two aspects of visual depth perception by testing patients with lesions affecting this region. First, we measured subjects' ability to discriminate depth structure in various 3D surfaces and objects using binocular disparity. Patients with lesions to right PPC (N = 3) exhibited marked perceptual deficits on these tasks, whereas those with left hemisphere lesions (N = 2) were able to reliably discriminate depth as accurately as control subjects. Second, we presented an ambiguous 3D stimulus defined by structure from motion to determine whether PPC lesions influence the rate of bistable perceptual alternations. Patients' percept durations for the 3D stimulus were generally within a normal range, although the two patients with bilateral PPC lesions showed the fastest perceptual alternation rates in our sample. Intermittent stimulus presentation reduced the reversal rate similarly across subjects. Together, the results suggest that PPC plays a causal role in both inferring and maintaining the perception of 3D structure with stereopsis supported primarily by the right hemisphere, but do not lend support to the view that PPC is a critical contributor to bistable perceptual alternations. This article is part of the themed issue 'Vision in our three-dimensional world'.

  17. Seeing Is the Hardest Thing to See: Using Illusions to Teach Visual Perception

    ERIC Educational Resources Information Center

    Riener, Cedar

    2015-01-01

    This chapter describes three examples of using illusions to teach visual perception. The illusions present ways for students to change their perspective regarding how their eyes work and also offer opportunities to question assumptions regarding their approach to knowledge.

  18. Focal Length Affects Depicted Shape and Perception of Facial Images.

    PubMed

    Třebický, Vít; Fialová, Jitka; Kleisner, Karel; Havlíček, Jan

    2016-01-01

    Static photographs are currently the most often employed stimuli in research on social perception. The method of photograph acquisition might affect the depicted subject's facial appearance and thus also the impression of such stimuli. An important factor influencing the resulting photograph is focal length, as different focal lengths produce various levels of image distortion. Here we tested whether different focal lengths (50, 85, 105 mm) affect depicted shape and perception of female and male faces. We collected three portrait photographs of 45 (22 females, 23 males) participants under standardized conditions and camera setting varying only in the focal length. Subsequently, the three photographs from each individual were shown on screen in a randomized order using a 3-alternative forced-choice paradigm. The images were judged for attractiveness, dominance, and femininity/masculinity by 369 raters (193 females, 176 males). Facial width-to-height ratio (fWHR) was measured from each photograph and overall facial shape was analysed employing geometric morphometric methods (GMM). Our results showed that photographs taken with 50 mm focal length were rated as significantly less feminine/masculine, attractive, and dominant compared to the images taken with longer focal lengths. Further, shorter focal lengths produced faces with smaller fWHR. Subsequent GMM revealed focal length significantly affected overall facial shape of the photographed subjects. Thus methodology of photograph acquisition, focal length in this case, can significantly affect results of studies using photographic stimuli perhaps due to different levels of perspective distortion that influence shapes and proportions of morphological traits. PMID:26894832

  19. Focal Length Affects Depicted Shape and Perception of Facial Images

    PubMed Central

    Třebický, Vít; Fialová, Jitka; Kleisner, Karel; Havlíček, Jan

    2016-01-01

    Static photographs are currently the most often employed stimuli in research on social perception. The method of photograph acquisition might affect the depicted subject’s facial appearance and thus also the impression of such stimuli. An important factor influencing the resulting photograph is focal length, as different focal lengths produce various levels of image distortion. Here we tested whether different focal lengths (50, 85, 105 mm) affect depicted shape and perception of female and male faces. We collected three portrait photographs of 45 (22 females, 23 males) participants under standardized conditions and camera setting varying only in the focal length. Subsequently, the three photographs from each individual were shown on screen in a randomized order using a 3-alternative forced-choice paradigm. The images were judged for attractiveness, dominance, and femininity/masculinity by 369 raters (193 females, 176 males). Facial width-to-height ratio (fWHR) was measured from each photograph and overall facial shape was analysed employing geometric morphometric methods (GMM). Our results showed that photographs taken with 50 mm focal length were rated as significantly less feminine/masculine, attractive, and dominant compared to the images taken with longer focal lengths. Further, shorter focal lengths produced faces with smaller fWHR. Subsequent GMM revealed focal length significantly affected overall facial shape of the photographed subjects. Thus methodology of photograph acquisition, focal length in this case, can significantly affect results of studies using photographic stimuli perhaps due to different levels of perspective distortion that influence shapes and proportions of morphological traits. PMID:26894832
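
    The facial width-to-height ratio (fWHR) measured in the study above is conventionally computed as bizygomatic width divided by upper-face height. The sketch below illustrates that calculation from 2D landmark coordinates; the landmark names and pixel values are hypothetical, since the abstracts do not specify the landmarking protocol.

```python
# Illustrative fWHR computation from hypothetical 2D facial landmarks.
# Assumes the common definition: bizygomatic width / upper-face height.
def fwhr(left_zygion, right_zygion, brow_midpoint, upper_lip_midpoint):
    """Return facial width-to-height ratio from (x, y) landmark coordinates."""
    width = abs(right_zygion[0] - left_zygion[0])          # cheekbone-to-cheekbone distance
    height = abs(brow_midpoint[1] - upper_lip_midpoint[1])  # brow-to-upper-lip distance
    return width / height

# Hypothetical landmarks (x, y) in pixels from a single portrait photograph
print(round(fwhr((120, 300), (360, 300), (240, 220), (240, 350)), 3))
```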

  20. Effects of dynamic luminance modulation on visually induced self-motion perception: observers' perception of illumination is important in perceiving self-motion.

    PubMed

    Nakamura, Shinji; Seno, Takeharu; Ito, Hiroyuki; Sunaga, Shoji

    2013-01-01

    Coherent luminance modulation of visual objects affects visually induced perception of self-motion (vection). The perceptual mechanisms underlying the effects of dynamic luminance modulation were investigated with a visual stimulus simulating an external environment illuminated by a moving spotlight (the normal spotlight condition) or an inverted luminance version of it (the inverted luminance condition). Two psychophysical experiments indicated that vection was generally weakened in the inverted luminance condition. The results cannot be fully explained by the undesirable differences of luminosity within the experimental environment, and suggest that the contrast polarity of the visual stimulus has a significant impact on vection. Furthermore, the results show that the dynamic luminance variations weaken vection in the normal spotlight condition in which the observers perceived illumination modulations. In contrast, in the inverted luminance condition, in which the observers cannot perceive the illumination manipulation, the dynamic luminance variations may not impair vection, and may even be expected to strengthen vection, even though they shared similar global and systematic luminance variation with the normal spotlight condition. These experiments suggest that the observer's perception of illumination is a key factor in considering the effects of dynamic luminance modulation of the visual stimulus.

  1. To See or Not to See: Analyzing Difficulties in Geometry from the Perspective of Visual Perception

    ERIC Educational Resources Information Center

    Gal, Hagar; Linchevski, Liora

    2010-01-01

    In this paper, we consider theories about processes of visual perception and perception-based knowledge representation (VPR) in order to explain difficulties encountered in figural processing in junior high school geometry tasks. In order to analyze such difficulties, we take advantage of the following perspectives of VPR: (1) Perceptual…

  2. Visualizing the Perception Filter and Breaching It with Active-Learning Strategies

    ERIC Educational Resources Information Center

    White, Harold B.

    2012-01-01

    Teachers' perception filter operates in all realms of their consciousness. It plays an important part in what and how students learn and should play a central role in what and how they teach. This may be obvious, but having a visual model of a perception filter can guide the way they think about education. In this article, the author talks about…

  3. Optical images of visible and invisible percepts in the primary visual cortex of primates

    PubMed Central

    Macknik, Stephen L.; Haglund, Michael M.

    1999-01-01

    We optically imaged a visual masking illusion in primary visual cortex (area V-1) of rhesus monkeys to ask whether activity in the early visual system more closely reflects the physical stimulus or the generated percept. Visual illusions can be a powerful way to address this question because they have the benefit of dissociating the stimulus from perception. We used an illusion in which a flickering target (a bar oriented in visual space) is rendered invisible by two counter-phase flickering bars, called masks, which flank and abut the target. The target and masks, when shown separately, each generated correlated activity on the surface of the cortex. During the illusory condition, however, optical signals generated in the cortex by the target disappeared although the image of the masks persisted. The optical image thus was correlated with perception but not with the physical stimulus. PMID:10611363

  4. Two-dimensional grouping affects perisaccadic perception of depth and synchrony.

    PubMed

    Aruga, Reiko; Saito, Hideo; Ando, Hideyuki; Watanabe, Junji

    2014-01-01

    There is considerable evidence that, when visual stimuli are presented around the time of a saccade, spatial and temporal perceptions of them are distorted. However, only a small number of previous studies have addressed the perception of a visual image induced by a saccade eye movement (visual image that is dynamically drawn on the retina during a saccade at the speed of the eye movement). Here we investigated three-dimensional and temporal perceptions of the saccade-induced images and found that perceptual grouping of objects has a significant effect on the perceived depth and timing of the images.

  5. Altered visual experience and acute visual deprivation affect predatory targeting by infrared-imaging Boid snakes.

    PubMed

    Grace, M S; Woodward, O M

    2001-11-23

    Boid and Crotaline snakes use both their eyes and infrared-imaging facial pit organs to target homeothermic prey. These snakes can target in complete darkness, but the eyes can also effectively direct predatory strikes. We investigated the behavioral correlates of boid snakes' simultaneous use of two imaging systems by testing whether congenital unilateral visual deprivation affects targeting performance. Normally sighted Burmese pythons exhibited average targeting angle of zero (on the midline axis of the head), but three unilaterally anophthalmic Burmese pythons targeted preferentially on the sighted side. A unilaterally anophthalmic amethystine python also targeted on the sighted side, and a unilaterally anophthalmic Brazilian rainbow boa tended to target on the sighted side, though its mean targeting angle was not significantly different from zero. When unilaterally anophthalmic Burmese pythons were temporarily blinded, mean strike angle changed to that of normally sighted snakes. These results show that while infrared-imaging snakes can shift between visual and infrared information under acute experimental conditions, loss of part of the visual field during development results in abnormal predatory targeting behavior. In contrast, normally sighted snakes subjected to temporary unilateral blinding do not target preferentially on the sighted side. Therefore, while loss of part of the visual field may be compensated for by infrared input in normal snakes, partial absence of visual input during development may alter central organization of visual information. Conversely, absence of half the visual field during development does not alter targeting performance based upon infrared input alone, suggesting that organization of the central infrared map does not depend upon normal organization of visual input.

  6. Dance and Music in "Gangnam Style": How Dance Observation Affects Meter Perception.

    PubMed

    Lee, Kyung Myun; Barrett, Karen Chan; Kim, Yeonhwa; Lim, Yeoeun; Lee, Kyogu

    2015-01-01

    Dance and music often co-occur as evidenced when viewing choreographed dances or singers moving while performing. This study investigated how the viewing of dance motions shapes sound perception. Previous research has shown that dance reflects the temporal structure of its accompanying music, communicating musical meter (i.e. a hierarchical organization of beats) via coordinated movement patterns that indicate where strong and weak beats occur. Experiments here investigated the effects of dance cues on meter perception, hypothesizing that dance could embody the musical meter, thereby shaping participant reaction times (RTs) to sound targets occurring at different metrical positions. In experiment 1, participants viewed a video with dance choreography indicating 4/4 meter (dance condition) or a series of color changes repeated in sequences of four to indicate 4/4 meter (picture condition). A sound track accompanied these videos and participants reacted to timbre targets at different metrical positions. Participants had the slowest RTs at the strongest beats in the dance condition only. In experiment 2, participants viewed the choreography of the horse-riding dance from Psy's "Gangnam Style" in order to examine how a familiar dance might affect meter perception. Moreover, participants in this experiment were divided into a group with experience dancing this choreography and a group without experience. Results again showed slower RTs to stronger metrical positions and the group with experience demonstrated a more refined perception of metrical hierarchy. Results likely stem from the temporally selective division of attention between auditory and visual domains. This study has implications for understanding: 1) the impact of splitting attention among different sensory modalities, and 2) the impact of embodiment, on perception of musical meter. Viewing dance may interfere with sound processing, particularly at critical metrical positions, but embodied familiarity with

  7. Dance and Music in “Gangnam Style”: How Dance Observation Affects Meter Perception

    PubMed Central

    Lee, Kyung Myun; Barrett, Karen Chan; Kim, Yeonhwa; Lim, Yeoeun; Lee, Kyogu

    2015-01-01

    Dance and music often co-occur as evidenced when viewing choreographed dances or singers moving while performing. This study investigated how the viewing of dance motions shapes sound perception. Previous research has shown that dance reflects the temporal structure of its accompanying music, communicating musical meter (i.e. a hierarchical organization of beats) via coordinated movement patterns that indicate where strong and weak beats occur. Experiments here investigated the effects of dance cues on meter perception, hypothesizing that dance could embody the musical meter, thereby shaping participant reaction times (RTs) to sound targets occurring at different metrical positions. In experiment 1, participants viewed a video with dance choreography indicating 4/4 meter (dance condition) or a series of color changes repeated in sequences of four to indicate 4/4 meter (picture condition). A sound track accompanied these videos and participants reacted to timbre targets at different metrical positions. Participants had the slowest RTs at the strongest beats in the dance condition only. In experiment 2, participants viewed the choreography of the horse-riding dance from Psy’s “Gangnam Style” in order to examine how a familiar dance might affect meter perception. Moreover, participants in this experiment were divided into a group with experience dancing this choreography and a group without experience. Results again showed slower RTs to stronger metrical positions and the group with experience demonstrated a more refined perception of metrical hierarchy. Results likely stem from the temporally selective division of attention between auditory and visual domains. This study has implications for understanding: 1) the impact of splitting attention among different sensory modalities, and 2) the impact of embodiment, on perception of musical meter. Viewing dance may interfere with sound processing, particularly at critical metrical positions, but embodied

  8. 3D Shape Perception in Posterior Cortical Atrophy: A Visual Neuroscience Perspective

    PubMed Central

    Gillebert, Céline R.; Schaeverbeke, Jolien; Bastin, Christine; Neyens, Veerle; Bruffaerts, Rose; De Weer, An-Sofie; Seghers, Alexandra; Sunaert, Stefan; Van Laere, Koen; Versijpt, Jan; Vandenbulcke, Mathieu; Salmon, Eric; Todd, James T.; Orban, Guy A.

    2015-01-01

    Posterior cortical atrophy (PCA) is a rare focal neurodegenerative syndrome characterized by progressive visuoperceptual and visuospatial deficits, most often due to atypical Alzheimer's disease (AD). We applied insights from basic visual neuroscience to analyze 3D shape perception in humans affected by PCA. Thirteen PCA patients and 30 matched healthy controls participated, together with two patient control groups with diffuse Lewy body dementia (DLBD) and an amnestic-dominant phenotype of AD, respectively. The hierarchical study design consisted of 3D shape processing for 4 cues (shading, motion, texture, and binocular disparity) with corresponding 2D and elementary feature extraction control conditions. PCA and DLBD exhibited severe 3D shape-processing deficits and AD to a lesser degree. In PCA, deficient 3D shape-from-shading was associated with volume loss in the right posterior inferior temporal cortex. This region coincided with a region of functional activation during 3D shape-from-shading in healthy controls. In PCA patients who performed the same fMRI paradigm, response amplitude during 3D shape-from-shading was reduced in this region. Gray matter volume in this region also correlated with 3D shape-from-shading in AD. 3D shape-from-disparity in PCA was associated with volume loss slightly more anteriorly in posterior inferior temporal cortex as well as in ventral premotor cortex. The findings in right posterior inferior temporal cortex and right premotor cortex are consistent with neurophysiologically based models of the functional anatomy of 3D shape processing. However, in DLBD, 3D shape deficits rely on mechanisms distinct from inferior temporal structural integrity. SIGNIFICANCE STATEMENT Posterior cortical atrophy (PCA) is a neurodegenerative syndrome characterized by progressive visuoperceptual dysfunction and most often an atypical presentation of Alzheimer's disease (AD) affecting the ventral and dorsal visual streams rather than the medial

  9. Validity and Reliability of the Developmental Test of Visual Perception - Third Edition (DTVP-3).

    PubMed

    Brown, Ted

    2016-07-01

    The Developmental Test of Visual Perception - Third Edition (DTVP-3) is a recently published revision of a visual perceptual test from the United States, frequently used by occupational therapists. It is important that tests have adequate documented reliability and validity and are evaluated in cross-cultural contexts. The purpose of the study was to assess the reliability and validity of the DTVP-3 when completed by a group of Australian participants. Thirty-nine typically developing children 6-8 years of age completed the DTVP-3 and the Developmental Test of Visual-Motor Integration - 6th edition (VMI-6). The internal consistency of the DTVP-3 was assessed using Cronbach alpha coefficients and the DTVP-3's convergent validity was examined by correlating it with the VMI-6 and its two supplementary tests. The five DTVP-3 subscales' Cronbach alpha coefficients ranged from .60 to .80 while its three composite indexes had coefficients all at the .80 level. The VMI-6 was significantly correlated with the DTVP-3 Figure Ground and Visual Closure subscales and the Motor-Reduced Visual Perception Index (MRVPI). The VMI-6 Visual Perception Supplementary Test was significantly correlated with the DTVP-3 Figure Ground, Visual Closure, Form Constancy, MRVPI, and General Visual Perception Index. The DTVP-3 exhibited acceptable levels of internal consistency and moderate levels of convergent validity with the VMI-6 when completed by a group of Australian children.
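
    Cronbach's alpha, the internal-consistency coefficient reported above, can be computed from a respondents-by-items score matrix. The sketch below uses a small hypothetical data set (not DTVP-3 data) to show the calculation.

```python
# Minimal sketch of Cronbach's alpha from a respondents-by-items score matrix.
# The scores below are hypothetical illustrations, not data from the study.
import numpy as np

def cronbach_alpha(items):
    """items: 2D array-like, rows = respondents, columns = items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                                   # number of items
    item_variances = items.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_variance = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical scores of 5 children on 4 items
scores = [[3, 4, 3, 5],
          [2, 2, 3, 2],
          [4, 5, 4, 4],
          [3, 3, 2, 3],
          [5, 4, 5, 5]]
print(round(cronbach_alpha(scores), 2))
```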

  10. Biometric Research in Perception and Neurology Related to the Study of Visual Communication.

    ERIC Educational Resources Information Center

    Metallinos, Nikos

    Contemporary research findings in the fields of perceptual psychology and neurology of the human brain that are directly related to the study of visual communication are reviewed and briefly discussed in this paper. Specifically, the paper identifies those major research findings in visual perception that are relevant to the study of visual…

  11. Impact of Language on Development of Auditory-Visual Speech Perception

    ERIC Educational Resources Information Center

    Sekiyama, Kaoru; Burnham, Denis

    2008-01-01

    The McGurk effect paradigm was used to examine the developmental onset of inter-language differences between Japanese and English in auditory-visual speech perception. Participants were asked to identify syllables in audiovisual (with congruent or discrepant auditory and visual components), audio-only, and video-only presentations at various…

  12. Role of Visual Integration in Gaze Perception and Emotional Intelligence in Schizophrenia

    PubMed Central

    Tso, Ivy F.

    2014-01-01

    Background: Individuals with schizophrenia demonstrate a wide range of social cognitive deficits that significantly compromise functioning. Early visual processing is frequently disrupted in schizophrenia, and growing evidence suggests a role of perceptual dysfunctions in socioemotional functioning in the disorder. This study examined visual integration (the ability to effectively integrate individual, local visual features into a holistic representation), a target construct of basic perception identified by the Cognitive Neuroscience Treatment Research to Improve Cognition in Schizophrenia initiative, and its relationship with eye-contact perception and emotional intelligence in schizophrenia. Methods: Twenty-nine participants with schizophrenia (SCZ) and 23 healthy controls (HC) completed tasks measuring visual integration (Coherent Motion Task, Contour Integration Task), an eye-contact perception task, and a measure of emotional intelligence. Results: SCZ participants showed compromised visual integration as suggested by poorer performance on the Contour Integration Task relative to HC. Visual integration was a significant predictor of eye-contact perception and emotional intelligence among SCZ. The amounts of variances in these 2 social cognitive areas accounted for by visual integration were comparable to and overlapped with those accounted for by the diagnosis of schizophrenia. Conclusions: Individuals with schizophrenia showed compromised visual integration, and this may play a significant role in the observed deficits in higher level processing of social information in the disorder. PMID:23666503

  13. Validity and Reliability of the Developmental Test of Visual Perception - Third Edition (DTVP-3).

    PubMed

    Brown, Ted

    2016-07-01

    The Developmental Test of Visual Perception - Third Edition (DTVP-3) is a recently published revision of a visual perceptual test from the United States, frequently used by occupational therapists. It is important that tests have adequate documented reliability and validity and are evaluated in cross-cultural contexts. The purpose of the study was to assess the reliability and validity of the DTVP-3 when completed by a group of Australian participants. Thirty-nine typically developing children 6-8 years of age completed the DTVP-3 and the Developmental Test of Visual-Motor Integration - 6th edition (VMI-6). The internal consistency of the DTVP-3 was assessed using Cronbach alpha coefficients and the DTVP-3's convergent validity was examined by correlating it with the VMI-6 and its two supplementary tests. The five DTVP-3 subscales' Cronbach alpha coefficients ranged from .60 to .80 while its three composite indexes had coefficients all at the .80 level. The VMI-6 was significantly correlated with the DTVP-3 Figure Ground and Visual Closure subscales and the Motor-Reduced Visual Perception Index (MRVPI). The VMI-6 Visual Perception Supplementary Test was significantly correlated with the DTVP-3 Figure Ground, Visual Closure, Form Constancy, MRVPI, and General Visual Perception Index. The DTVP-3 exhibited acceptable levels of internal consistency and moderate levels of convergent validity with the VMI-6 when completed by a group of Australian children. PMID:26913939

  14. Behind Mathematical Learning Disabilities: What about Visual Perception and Motor Skills?

    ERIC Educational Resources Information Center

    Pieters, Stefanie; Desoete, Annemie; Roeyers, Herbert; Vanderswalmen, Ruth; Van Waelvelde, Hilde

    2012-01-01

    In a sample of 39 children with mathematical learning disabilities (MLD) and 106 typically developing controls belonging to three control groups of three different ages, we found that visual perception, motor skills and visual-motor integration explained a substantial proportion of the variance in either number fact retrieval or procedural…

  15. Parents' Perceptions of Physical Activity for Their Children with Visual Impairments

    ERIC Educational Resources Information Center

    Perkins, Kara; Columna, Luis; Lieberman, Lauren; Bailey, JoEllen

    2013-01-01

    Introduction: Ongoing communication with parents and the acknowledgment of their preferences and expectations are crucial to promote the participation of physical activity by children with visual impairments. Purpose: The study presented here explored parents' perceptions of physical activity for their children with visual impairments and explored…

  16. The chronometry of visual perception: review of occipital TMS masking studies.

    PubMed

    de Graaf, Tom A; Koivisto, Mika; Jacobs, Christianne; Sack, Alexander T

    2014-09-01

    Transcranial magnetic stimulation (TMS) continues to deliver on its promise as a research tool. In this review article we focus on the application of TMS to early visual cortex (V1, V2, V3) in studies of visual perception and visual awareness. Depending on the asynchrony between visual stimulus onset and TMS pulse (SOA), TMS can suppress visual perception, allowing one to track the time course of functional relevance (chronometry) of early visual cortex for vision. This procedure has revealed multiple masking effects ('dips'), some consistently (∼+100ms SOA) but others less so (∼-50ms, ∼-20ms, ∼+30ms, ∼+200ms SOA). We review the state of TMS masking research, focusing on the evidence for these multiple dips, the relevance of several experimental parameters to the obtained 'masking curve', and the use of multiple measures of visual processing (subjective measures of awareness, objective discrimination tasks, priming effects). Lastly, we consider possible future directions for this field. We conclude that while TMS masking has yielded many fundamental insights into the chronometry of visual perception already, much remains unknown. Not only are there several temporal windows when TMS pulses can induce visual suppression, even the well-established 'classical' masking effect (∼+100ms) may reflect more than one functional visual process.

  17. Emotional reactivity during anticipation and perception of affective pictures.

    PubMed

    Pastor, M Carmen; Poy, Rosario; Segarra, Pilar; Moltó, Javier

    2015-01-13

    The focus of the present study was on further exploring anticipatory responses to emotional stimuli by measuring the eyeblink startle reflex in a variation of the picture-picture affective learning procedure. Participants (113 undergraduate women) were not explicitly instructed before the experiment began. Instead, they had to learn the specific relations between cues (geometrical shapes) and emotional pictures based on pairings during the first part of the task. Plausible contingency learning effects were tested afterwards, in a parallel sequence of trials including auditory probes during cues and pictures processing during the second part of the task. Results did show the typical affective startle modulation pattern during perception, linear F(1, 200) = 52.67, p < .0001, but unexpected inhibition for both pleasant and unpleasant, compared to neutral cues, during anticipation, quadratic F(1, 200) = 7.07, p < .009. All patterns of startle modulation were independent of cue-picture contingency awareness (all interactions Fs < 1). Skin conductance changes showed the predictable quadratic trend either during picture perception or anticipatory periods (greater activity for emotional vs. neutral; overall quadratic F(1, 224) = 7.04, p < .01), only for participants fully aware of the cue-picture contingency, quadratic F(1, 158) = 5.86, p < .02. Overall, our results during anticipation (cues processing) seem to suggest that more resources were allocated to highly arousing pictures that engage attention. Differences between the present results and prior research may be attributed to procedural variations in the sample, cues, or instructions. Future studies should also explore in more detail the role of the contingency awareness during anticipation.
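
    The linear and quadratic trends reported above are orthogonal polynomial contrasts applied across the ordered valence categories (unpleasant, neutral, pleasant). The sketch below shows how such contrast scores are formed; the condition means are hypothetical illustrations, not values from the study.

```python
# Sketch of linear and quadratic polynomial contrasts over three ordered
# valence conditions (unpleasant, neutral, pleasant). The condition means
# are hypothetical and only illustrate the trend analysis named above.
import numpy as np

means = np.array([54.0, 50.0, 47.0])         # hypothetical mean startle magnitudes
linear_weights = np.array([-1, 0, 1])         # linear trend contrast weights
quadratic_weights = np.array([1, -2, 1])      # quadratic trend contrast weights

print("linear contrast:", means @ linear_weights)        # negative: startle decreases toward pleasant
print("quadratic contrast:", means @ quadratic_weights)  # nonzero: neutral departs from the emotional mean
```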

  18. Effects of Positive Affect on Risk Perceptions in Adolescence and Young Adulthood

    ERIC Educational Resources Information Center

    Haase, Claudia M.; Silbereisen, Rainer K.

    2011-01-01

    Affective influences may play a key role in adolescent risk taking, but have rarely been studied. Using an audiovisual method of affect induction, two experimental studies examined the effect of positive affect on risk perceptions in adolescence and young adulthood. Outcomes were risk perceptions regarding drinking alcohol, smoking a cigarette,…

  19. Integrative Processing of Touch and Affect in Social Perception: An fMRI Study

    PubMed Central

    Ebisch, Sjoerd J. H.; Salone, Anatolia; Martinotti, Giovanni; Carlucci, Leonardo; Mantini, Dante; Perrucci, Mauro G.; Saggino, Aristide; Romani, Gian Luca; Di Giannantonio, Massimo; Northoff, Georg; Gallese, Vittorio

    2016-01-01

    Social perception commonly employs multiple sources of information. The present study aimed at investigating the integrative processing of affective social signals. Task-related and task-free functional magnetic resonance imaging was performed in 26 healthy adult participants during a social perception task concerning dynamic visual stimuli simultaneously depicting facial expressions of emotion and tactile sensations that could be either congruent or incongruent. Confounding effects due to affective valence, inhibitory top–down influences, cross-modal integration, and conflict processing were minimized. The results showed that the perception of congruent, compared to incongruent stimuli, elicited enhanced neural activity in a set of brain regions including left amygdala, bilateral posterior cingulate cortex (PCC), and left superior parietal cortex. These congruency effects did not differ as a function of emotion or sensation. A complementary task-related functional interaction analysis preliminarily suggested that amygdala activity depended on previous processing stages in fusiform gyrus and PCC. The findings provide support for the integrative processing of social information about others’ feelings from manifold bodily sources (sensory-affective information) in amygdala and PCC. Given that the congruent stimuli were also judged as being more self-related and more familiar in terms of personal experience in an independent sample of participants, we speculate that such integrative processing might be mediated by the linking of external stimuli with self-experience. Finally, the prediction of task-related responses in amygdala by intrinsic functional connectivity between amygdala and PCC during a task-free state implies a neuro-functional basis for an individual predisposition for the integrative processing of social stimulus content. PMID:27242474

  20. Integrative Processing of Touch and Affect in Social Perception: An fMRI Study.

    PubMed

    Ebisch, Sjoerd J H; Salone, Anatolia; Martinotti, Giovanni; Carlucci, Leonardo; Mantini, Dante; Perrucci, Mauro G; Saggino, Aristide; Romani, Gian Luca; Di Giannantonio, Massimo; Northoff, Georg; Gallese, Vittorio

    2016-01-01

    Social perception commonly employs multiple sources of information. The present study aimed at investigating the integrative processing of affective social signals. Task-related and task-free functional magnetic resonance imaging was performed in 26 healthy adult participants during a social perception task concerning dynamic visual stimuli simultaneously depicting facial expressions of emotion and tactile sensations that could be either congruent or incongruent. Confounding effects due to affective valence, inhibitory top-down influences, cross-modal integration, and conflict processing were minimized. The results showed that the perception of congruent, compared to incongruent stimuli, elicited enhanced neural activity in a set of brain regions including left amygdala, bilateral posterior cingulate cortex (PCC), and left superior parietal cortex. These congruency effects did not differ as a function of emotion or sensation. A complementary task-related functional interaction analysis preliminarily suggested that amygdala activity depended on previous processing stages in fusiform gyrus and PCC. The findings provide support for the integrative processing of social information about others' feelings from manifold bodily sources (sensory-affective information) in amygdala and PCC. Given that the congruent stimuli were also judged as being more self-related and more familiar in terms of personal experience in an independent sample of participants, we speculate that such integrative processing might be mediated by the linking of external stimuli with self-experience. Finally, the prediction of task-related responses in amygdala by intrinsic functional connectivity between amygdala and PCC during a task-free state implies a neuro-functional basis for an individual predisposition for the integrative processing of social stimulus content. PMID:27242474

  1. Changing psychiatric perception of African-Americans with affective disorders.

    PubMed

    Jarvis, G Eric

    2012-12-01

    This article explored the origins and implications of the underdiagnosis of affective disorders in African-Americans. MEDLINE and old collections were searched using relevant key words. Reference lists from the articles that were gathered from this procedure were reviewed. The historical record indicated that the psychiatric perception of African-Americans with affective disorders changed significantly during the last 200 years. In the antebellum period, the mental disorders of slaves mostly went unnoticed. By the early 20th century, African-Americans were reported to have high rates of manic-depressive disorder compared with whites. By the mid-century, rates of manic-depressive disorder in African-Americans plummeted, whereas depression remained virtually nonexistent. In recent decades, diagnosed depression and bipolar disorder, whether in clinical or research settings, were inexplicably low in African-Americans compared with whites. Given these findings, American psychiatry needs to appraise the deep-seated effects of historical stereotypes on the diagnosis and treatment of African-Americans.

  2. Visual adaptation of the perception of "life": animacy is a basic perceptual dimension of faces.

    PubMed

    Koldewyn, Kami; Hanus, Patricia; Balas, Benjamin

    2014-08-01

    One critical component of understanding another's mind is the perception of "life" in a face. However, little is known about the cognitive and neural mechanisms underlying this perception of animacy. Here, using a visual adaptation paradigm, we ask whether face animacy is (1) a basic dimension of face perception and (2) supported by a common neural mechanism across distinct face categories defined by age and species. Observers rated the perceived animacy of adult human faces before and after adaptation to (1) adult faces, (2) child faces, and (3) dog faces. When testing the perception of animacy in human faces, we found significant adaptation to both adult and child faces, but not dog faces. We did, however, find significant adaptation when morphed dog images and dog adaptors were used. Thus, animacy perception in faces appears to be a basic dimension of face perception that is species specific but not constrained by age categories.

  3. NMDA receptor antagonist ketamine impairs feature integration in visual perception.

    PubMed

    Meuwese, Julia D I; van Loon, Anouk M; Scholte, H Steven; Lirk, Philipp B; Vulink, Nienke C C; Hollmann, Markus W; Lamme, Victor A F

    2013-01-01

    Recurrent interactions between neurons in the visual cortex are crucial for the integration of image elements into coherent objects, such as in figure-ground segregation of textured images. Blocking N-methyl-D-aspartate (NMDA) receptors in monkeys can abolish neural signals related to figure-ground segregation and feature integration. However, it is unknown whether this also affects perceptual integration itself. Therefore, we tested whether ketamine, a non-competitive NMDA receptor antagonist, reduces feature integration in humans. We administered a subanesthetic dose of ketamine to healthy subjects who performed a texture discrimination task in a placebo-controlled double blind within-subject design. We found that ketamine significantly impaired performance on the texture discrimination task compared to the placebo condition, while performance on a control fixation task was much less impaired. This effect is not merely due to task difficulty or a difference in sedation levels. We are the first to show a behavioral effect on feature integration by manipulating the NMDA receptor in humans. PMID:24223927

  4. NMDA Receptor Antagonist Ketamine Impairs Feature Integration in Visual Perception

    PubMed Central

    Meuwese, Julia D. I.; van Loon, Anouk M.; Scholte, H. Steven; Lirk, Philipp B.; Vulink, Nienke C. C.; Hollmann, Markus W.; Lamme, Victor A. F.

    2013-01-01

    Recurrent interactions between neurons in the visual cortex are crucial for the integration of image elements into coherent objects, such as in figure-ground segregation of textured images. Blocking N-methyl-D-aspartate (NMDA) receptors in monkeys can abolish neural signals related to figure-ground segregation and feature integration. However, it is unknown whether this also affects perceptual integration itself. Therefore, we tested whether ketamine, a non-competitive NMDA receptor antagonist, reduces feature integration in humans. We administered a subanesthetic dose of ketamine to healthy subjects who performed a texture discrimination task in a placebo-controlled double blind within-subject design. We found that ketamine significantly impaired performance on the texture discrimination task compared to the placebo condition, while performance on a control fixation task was much less impaired. This effect is not merely due to task difficulty or a difference in sedation levels. We are the first to show a behavioral effect on feature integration by manipulating the NMDA receptor in humans. PMID:24223927

  5. Effect of transcranial direct current stimulation on visual perception function and performance capability of activities of daily living in stroke patients

    PubMed Central

    Kim, Ko-Un; Kim, Su-Han; An, Tae-Gyu

    2016-01-01

    [Purpose] The purpose of this study was to examine the effects of transcranial direct current stimulation (tDCS) on visual perception and performance of activities of daily living in patients with stroke. [Subjects and Methods] Thirty subjects were assigned equally to a tDCS plus traditional occupational therapy group (experimental group) and a traditional occupational therapy group (control group). The intervention was implemented five times per week, 30 minutes each, for six weeks. In order to assess visual perception function before and after the intervention, the motor-free visual perception test (MVPT) was conducted, and in order to compare the performance of activities of daily living, the Functional Independence Measure scale was employed. [Results] According to the results, both groups improved in visual perception function and in performance of activities of daily living. Although there was no significant difference between the two groups, the experimental group exhibited higher scores. [Conclusion] In conclusion, the application of tDCS for the rehabilitation of patients with stroke may positively affect their visual perception and ability to perform activities of daily living. PMID:27799697

  6. Audio-visual interactions for motion perception in depth modulate activity in visual area V3A.

    PubMed

    Ogawa, Akitoshi; Macaluso, Emiliano

    2013-05-01

    Multisensory signals can enhance the spatial perception of objects and events in the environment. Changes of visual size and auditory intensity provide us with the main cues about motion direction in depth. However, frequency changes in audition and binocular disparity in vision also contribute to the perception of motion in depth. Here, we presented subjects with several combinations of auditory and visual depth-cues to investigate multisensory interactions during processing of motion in depth. The task was to discriminate the direction of auditory motion in depth according to increasing or decreasing intensity. Rising or falling auditory frequency provided an additional within-audition cue that matched or did not match the intensity change (i.e. intensity-frequency (IF) "matched vs. unmatched" conditions). In two-thirds of the trials, a task-irrelevant visual stimulus moved either in the same or opposite direction of the auditory target, leading to audio-visual "congruent vs. incongruent" between-modalities depth-cues. Furthermore, these conditions were presented either with or without binocular disparity. Behavioral data showed that the best performance was observed in the audio-visual congruent condition with IF matched. Brain imaging results revealed maximal response in visual area V3A when all cues provided congruent and reliable depth information (i.e. audio-visual congruent, IF-matched condition including disparity cues). Analyses of effective connectivity revealed increased coupling from auditory cortex to V3A specifically in audio-visual congruent trials. We conclude that within- and between-modalities cues jointly contribute to the processing of motion direction in depth, and that they do so via dynamic changes of connectivity between visual and auditory cortices.

  7. Time-Resolved Influences of Functional DAT1 and COMT Variants on Visual Perception and Post-Processing

    PubMed Central

    Bender, Stephan; Rellum, Thomas; Freitag, Christine; Resch, Franz; Rietschel, Marcella; Treutlein, Jens; Jennen-Steinmetz, Christine; Brandeis, Daniel; Banaschewski, Tobias; Laucht, Manfred

    2012-01-01

    Background: Dopamine plays an important role in orienting and the regulation of selective attention to relevant stimulus characteristics. Thus, we examined the influences of functional variants related to dopamine inactivation in the dopamine transporter (DAT1) and catechol-O-methyltransferase genes (COMT) on the time-course of visual processing in a contingent negative variation (CNV) task. Methods: 64-channel EEG recordings were obtained from 195 healthy adolescents of a community-based sample during a continuous performance task (A-X version). Early and late CNV as well as preceding visual evoked potential components were assessed. Results: Significant additive main effects of DAT1 and COMT on the occipito-temporal early CNV were observed. In addition, there was a trend towards an interaction between the two polymorphisms. Source analysis showed early CNV generators in the ventral visual stream and in frontal regions. There was a strong negative correlation between occipito-temporal visual post-processing and the frontal early CNV component. The early CNV time interval 500–1000 ms after the visual cue was specifically affected while the preceding visual perception stages were not influenced. Conclusions: Late visual potentials allow the genomic imaging of dopamine inactivation effects on visual post-processing. The same specific time-interval has been found to be affected by DAT1 and COMT during motor post-processing but not motor preparation. We propose the hypothesis that similar dopaminergic mechanisms modulate working memory encoding in both the visual and motor and perhaps other systems. PMID:22844499

  8. Visual-Proprioceptive Intermodal Perception Using Point Light Displays.

    ERIC Educational Resources Information Center

    Schmuckler, Mark A.; Fairhall, Jennifer L.

    2001-01-01

    Three experiments explored 5- and 7-month-olds' intermodal coordination of proprioceptive information produced by leg movements and visual movement information specifying these same motions. Results suggested that coordination of visual and proprioceptive inputs is constrained by infants' information processing of the displays and have…

  9. Parallel and Serial Grouping of Image Elements in Visual Perception

    ERIC Educational Resources Information Center

    Houtkamp, Roos; Roelfsema, Pieter R.

    2010-01-01

    The visual system groups image elements that belong to an object and segregates them from other objects and the background. Important cues for this grouping process are the Gestalt criteria, and most theories propose that these are applied in parallel across the visual scene. Here, we find that Gestalt grouping can indeed occur in parallel in some…

  10. Working Memory Enhances Visual Perception: Evidence from Signal Detection Analysis

    ERIC Educational Resources Information Center

    Soto, David; Wriglesworth, Alice; Bahrami-Balani, Alex; Humphreys, Glyn W.

    2010-01-01

    We show that perceptual sensitivity to visual stimuli can be modulated by matches between the contents of working memory (WM) and stimuli in the visual field. Observers were presented with an object cue (to hold in WM or to merely attend) and subsequently had to identify a brief target presented within a colored shape. The cue could be…

  11. Depth perception: cuttlefish (Sepia officinalis) respond to visual texture density gradients.

    PubMed

    Josef, Noam; Mann, Ofri; Sykes, António V; Fiorito, Graziano; Reis, João; Maccusker, Steven; Shashar, Nadav

    2014-11-01

    Studies concerning the perceptual processes of animals are not only interesting, but are fundamental to the understanding of other developments in information processing among non-humans. Carefully used visual illusions have been proven to be an informative tool for understanding visual perception. In this behavioral study, we demonstrate that cuttlefish are responsive to visual cues involving texture gradients. Specifically, 12 out of 14 animals avoided swimming over a solid surface with a gradient picture that to humans resembles an illusionary crevasse, while only 5 out of 14 avoided a non-illusionary texture. Since texture gradients are well-known cues for depth perception in vertebrates, we suggest that these cephalopods were responding to the depth illusion created by the texture density gradient. Density gradients and relative densities are key features in distance perception in vertebrates. Our results suggest that they are fundamental features of vision in general, appearing also in cephalopods.

  12. Neighborhood Perceptions Affect Dietary Behaviors and Diet Quality

    PubMed Central

    Keita, Akilah Dulin; Casazza, Krista; Thomas, Olivia; Fernandez, Jose R.

    2009-01-01

    Objective The primary purpose of this study was to determine if perceived neighborhood disorder affected dietary quality within a multiethnic sample of children. Design Children were recruited through the use of fliers, wide-distribution mailers, parent magazines, and school presentations from June 2005 to December 2008. Setting Birmingham-Hoover, Alabama metropolitan area. Participants Sample of 100 children aged 7 to 12. Main Outcome Measure Dietary quality was assessed using the average of two 24 hour recalls and analyzed using the Nutrition Data System for Research. Analysis Multivariate linear regression analyses were conducted to assess the relationship between neighborhood disorder and dietary quality. Results Perceived neighborhood disorder was associated with increased iron intake (P = .031) and lower potassium levels (P = .041). Perceived neighborhood disorder was marginally associated with increased energy intake (P = .074) and increased sodium intake (P = .078). Conclusions and Implications Perceived neighborhood disorder was significantly related to differences in dietary quality. This indicates that subjective neighborhood characteristics may pose barriers to healthful eating behaviors for children. Future research efforts and policy should address sociostructural factors and ways to manipulate and improve food environments and individual’s perceptions of their neighborhoods. PMID:20880752

  13. Perceptions Concerning Visual Culture Dialogues of Visual Art Pre-Service Teachers

    ERIC Educational Resources Information Center

    Mamur, Nuray

    2012-01-01

    The commentary that visual art teachers offer to help students process visual culture is important. This study attempts to describe the effect of including visual culture, grounded in everyday aesthetic experiences, in the art education learning process. An action research design, a qualitative approach, is conducted…

  14. Undergraduate nursing students' perceptions regarding factors that affect math abilities

    NASA Astrophysics Data System (ADS)

    Pyo, Katrina A.

    2011-07-01

    A review of the nursing literature reveals many undergraduate nursing students lack proficiency with basic mathematical skills, those necessary for safe medication preparation and administration. Few studies exploring the phenomenon from the undergraduate nursing student perspective are reported in the nursing literature. The purpose of this study was to explore undergraduate nursing students’ perceptions of math abilities, factors that affect math abilities, the use of math in nursing, and the extent to which specific math skills were addressed throughout a nursing curriculum. Polya’s Model for Problem Solving and the Bloom’s Taxonomy of Educational Objectives, Affective Domain served as the theoretical background for the study. Qualitative and quantitative methods were utilized to obtain data from a purposive sample of undergraduate nursing students from a private university in western Pennsylvania. Participants were selected based on the proficiency level with math skills, as determined by a score on the Elsevier’s HESI™ Admission Assessment (A2) Exam, Math Portion. Ten students from the “Excellent” benchmark group and eleven students from the “Needing Additional Assistance or Improvement” benchmark group participated in one-on-one, semi-structured interviews, and completed a 25-item, 4-point Likert scale survey that rated confidence levels with specific math skills and the extent to which these skills were perceived to be addressed in the nursing curriculum. Responses from the two benchmark groups were compared and contrasted. Eight themes emerged from the qualitative data. Findings related to mathematical approach and confidence levels with specific math skills were determined to be statistically significant.

  15. Close binding of identity and location in visual feature perception

    NASA Technical Reports Server (NTRS)

    Johnston, J. C.; Pashler, H.

    1990-01-01

    The binding of identity and location information in disjunctive feature search was studied. Ss searched a heterogeneous display for a color or a form target, and reported both target identity and location. To avoid better than chance guessing of target identity (by choosing the target less likely to have been seen), the difficulty of the two targets was equalized adaptively; a mathematical model was used to quantify residual effects. A spatial layout was used that minimized postperceptual errors in reporting location. Results showed strong binding of identity and location perception. After correction for guessing, no perception of identity without location was found. A weak trend was found for accurate perception of target location without identity. We propose that activated features generate attention-calling "interrupt" signals, specifying only location; attention then retrieves the properties at that location.
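 
    The abstract mentions a mathematical model used to quantify residual guessing effects; its exact form is not given here, so the sketch below shows only the textbook high-threshold correction for guessing as a stand-in.

```python
# Textbook high-threshold correction for guessing (illustrative stand-in,
# not necessarily the model the authors used).
def corrected_accuracy(observed_correct, guess_rate):
    """p_true = (p_observed - g) / (1 - g) for a forced choice with guess rate g."""
    return (observed_correct - guess_rate) / (1.0 - guess_rate)

# Example: 80% observed accuracy in a two-alternative report (guess rate 0.5).
print(corrected_accuracy(0.80, 0.5))  # -> 0.6 estimated true-perception rate
```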

  16. How do visual and postural cues combine for self-tilt perception during slow pitch rotations?

    PubMed

    Scotto Di Cesare, C; Buloup, F; Mestre, D R; Bringoux, L

    2014-11-01

    Self-orientation perception relies on the integration of multiple sensory inputs which convey spatially-related visual and postural cues. In the present study, an experimental set-up was used to tilt the body and/or the visual scene to investigate how these postural and visual cues are integrated for self-tilt perception (the subjective sensation of being tilted). Participants were required to repeatedly rate a confidence level for self-tilt perception during slow (0.05°·s(-1)) body and/or visual scene pitch tilts up to 19° relative to vertical. Concurrently, subjects also had to perform arm reaching movements toward a body-fixed target at certain specific angles of tilt. While performance of a concurrent motor task did not influence the main perceptual task, self-tilt detection did vary according to the visuo-postural stimuli. Slow forward or backward tilts of the visual scene alone did not induce a marked sensation of self-tilt contrary to actual body tilt. However, combined body and visual scene tilt influenced self-tilt perception more strongly, although this effect was dependent on the direction of visual scene tilt: only a forward visual scene tilt combined with a forward body tilt facilitated self-tilt detection. In such a case, visual scene tilt did not seem to induce vection but rather may have produced a deviation of the perceived orientation of the longitudinal body axis in the forward direction, which may have lowered the self-tilt detection threshold during actual forward body tilt. PMID:25299446

  17. Visual speech perception in foveal and extrafoveal vision: further implications for divisions in hemispheric projections.

    PubMed

    Jordan, Timothy R; Sheen, Mercedes; Abedipour, Lily; Paterson, Kevin B

    2014-01-01

    When observing a talking face, it has often been argued that visual speech to the left and right of fixation may produce differences in performance due to divided projections to the two cerebral hemispheres. However, while it seems likely that such a division in hemispheric projections exists for areas away from fixation, the nature and existence of a functional division in visual speech perception at the foveal midline remains to be determined. We investigated this issue by presenting visual speech in matched hemiface displays to the left and right of a central fixation point, either exactly abutting the foveal midline or else located away from the midline in extrafoveal vision. The location of displays relative to the foveal midline was controlled precisely using an automated, gaze-contingent eye-tracking procedure. Visual speech perception showed a clear right hemifield advantage when presented in extrafoveal locations but no hemifield advantage (left or right) when presented abutting the foveal midline. Thus, while visual speech observed in extrafoveal vision appears to benefit from unilateral projections to left-hemisphere processes, no evidence was obtained to indicate that a functional division exists when visual speech is observed around the point of fixation. Implications of these findings for understanding visual speech perception and the nature of functional divisions in hemispheric projection are discussed.

  18. Effects of auditory information on self-motion perception during simultaneous presentation of visual shearing motion

    PubMed Central

    Tanahashi, Shigehito; Ashihara, Kaoru; Ujike, Hiroyasu

    2015-01-01

    Recent studies have found that self-motion perception induced by simultaneous presentation of visual and auditory motion is facilitated when the directions of visual and auditory motion stimuli are identical. They did not, however, examine possible contributions of auditory motion information for determining direction of self-motion perception. To examine this, a visual stimulus projected on a hemisphere screen and an auditory stimulus presented through headphones were presented separately or simultaneously, depending on experimental conditions. The participant continuously indicated the direction and strength of self-motion during the 130-s experimental trial. When the visual stimulus with a horizontal shearing rotation and the auditory stimulus with a horizontal one-directional rotation were presented simultaneously, the duration and strength of self-motion perceived in the opposite direction of the auditory rotation stimulus were significantly longer and stronger than those perceived in the same direction of the auditory rotation stimulus. However, the auditory stimulus alone could not sufficiently induce self-motion perception, and if it did, its direction was not consistent within each experimental trial. We concluded that auditory motion information can determine perceived direction of self-motion during simultaneous presentation of visual and auditory motion information, at least when visual stimuli moved in opposing directions (around the yaw-axis). We speculate that the contribution of auditory information depends on the plausibility and information balance of visual and auditory information. PMID:26113828

  20. How Perceptions of an Intervention Program Affect Outcomes

    ERIC Educational Resources Information Center

    Forneris, Tanya; Danish, Steven J.; Fries, Elizabeth

    2009-01-01

    Goals for Health was a National Cancer Institute funded program designed to impact health behaviors of adolescents living in rural Virginia and New York. This study examined three specific objectives: (a) to examine participants' perceptions of the program components and the relationship between program components and overall program perception,…

  1. The spatiotemporal profile of cortical processing leading up to visual perception.

    PubMed

    Fahrenfort, J J; Scholte, H S; Lamme, V A F

    2008-01-01

    Much controversy exists around the locus of conscious visual perception in human cortex. Some authors have proposed that its neural correlates correspond with recurrent processing within visual cortex, whereas others have argued they are located in a frontoparietal network. The present experiment aims to bring together these competing viewpoints. We recorded EEG from human subjects that were engaged in detecting masked visual targets. From this, we obtained a spatiotemporal profile of neural activity selectively related to the processing of the targets, which we correlated with the subjects' ability to detect those targets. This made it possible to distinguish between those stages of visual processing that correlate with human perception and those that do not. The results show that target induced extra-striate feedforward activity peaking at 121 ms does not correlate with perception, whereas more posterior recurrent activity peaking at 160 ms does. Several subsequent stages show an alternating pattern of frontoparietal and occipital activity, all of which correlate highly with perception. This shows that perception emerges early on, but only after an initial feedforward volley, and suggests that multiple reentrant loops are involved in propagating this signal to frontoparietal areas. PMID:18318615

  2. Crossmodal Statistical Binding of Temporal Information and Stimuli Properties Recalibrates Perception of Visual Apparent Motion.

    PubMed

    Zhang, Yi; Chen, Lihan

    2016-01-01

    Recent studies of brain plasticity that pertain to time perception have shown that fast training of temporal discrimination in one modality, for example, the auditory modality, can improve performance of temporal discrimination in another modality, such as the visual modality. We here examined whether the perception of visual Ternus motion could be recalibrated through fast crossmodal statistical binding of temporal information and stimuli properties binding. We conducted two experiments, composed of three sessions each: pre-test, learning, and post-test. In both the pre-test and the post-test, participants classified the Ternus display as either "element motion" or "group motion." For the training session in Experiment 1, we constructed two types of temporal structures, in which two consecutively presented sound beeps were dominantly (80%) flanked by one leading visual Ternus frame and by one lagging visual Ternus frame (VAAV) or dominantly inserted by two Ternus visual frames (AVVA). Participants were required to respond which interval (auditory vs. visual) was longer. In Experiment 2, we presented only a single auditory-visual pair but with similar temporal configurations as in Experiment 1, and asked participants to perform an audio-visual temporal order judgment. The results of these two experiments support that statistical binding of temporal information and stimuli properties can quickly and selectively recalibrate the sensitivity of perceiving visual motion, according to the protocols of the specific bindings. PMID:27065910

  3. The Effectiveness of Using the Successive Perception Test I to Measure Visual-Haptic Tendencies in Engineering Students.

    ERIC Educational Resources Information Center

    Study, Nancy E.

    2002-01-01

    Compares results of Successive Perception Test I (SPT) for the study population of freshman engineering students to their results on the group-administered Purdue Spatial Visualization Test: Visualization of Rotations (PSVT) and the individually administered Haptic Visual Discrimination Test (HVDT). Concludes that either visual and haptic…

  4. Language and Visual Perception Associations: Meta-Analytic Connectivity Modeling of Brodmann Area 37

    PubMed Central

    Rosselli, Monica

    2015-01-01

    Background. Understanding the functions of different brain areas has represented a major endeavor of neurosciences. Historically, brain functions have been associated with specific cortical brain areas; however, modern neuroimaging developments suggest cognitive functions are associated to networks rather than to areas. Objectives. The purpose of this paper was to analyze the connectivity of Brodmann area (BA) 37 (posterior, inferior, and temporal/fusiform gyrus) in relation to (1) language and (2) visual processing. Methods. Two meta-analyses were initially conducted (first level analysis). The first one was intended to assess the language network in which BA37 is involved. The second one was intended to assess the visual perception network. A third meta-analysis (second level analysis) was then performed to assess contrasts and convergence between the two cognitive domains (language and visual perception). The DataBase of Brainmap was used. Results. Our results support the role of BA37 in language but by means of a distinct network from the network that supports its second most important function: visual perception. Conclusion. It was concluded that left BA37 is a common node of two distinct networks—visual recognition (perception) and semantic language functions. PMID:25648869

  5. Effects of attention and perceptual uncertainty on cerebellar activity during visual motion perception.

    PubMed

    Baumann, Oliver; Mattingley, Jason B

    2014-02-01

    Recent clinical and neuroimaging studies have revealed that the human cerebellum plays a role in visual motion perception, but the nature of its contribution to this function is not understood. Some reports suggest that the cerebellum might facilitate motion perception by aiding attentive tracking of visual objects. Others have identified a particular role for the cerebellum in discriminating motion signals in perceptually uncertain conditions. Here, we used functional magnetic resonance imaging to determine the degree to which cerebellar involvement in visual motion perception can be explained by a role in sustained attentive tracking of moving stimuli in contrast to a role in visual motion discrimination. While holding the visual displays constant, we manipulated attention by having participants attend covertly to a field of random-dot motion or a colored spot at fixation. Perceptual uncertainty was manipulated by varying the percentage of signal dots contained within the random-dot arrays. We found that attention to motion under high perceptual uncertainty was associated with strong activity in left cerebellar lobules VI and VII. By contrast, attending to motion under low perceptual uncertainty did not cause differential activation in the cerebellum. We found no evidence to support the suggestion that the cerebellum is involved in simple attentive tracking of salient moving objects. Instead, our results indicate that specific subregions of the cerebellum are involved in facilitating the detection and discrimination of task-relevant moving objects under conditions of high perceptual uncertainty. We conclude that the cerebellum aids motion perception under conditions of high perceptual demand.

  6. Gravity and observer's body orientation influence the visual perception of human body postures.

    PubMed

    Lopez, Christophe; Bachofner, Christelle; Mercier, Manuel; Blanke, Olaf

    2009-05-04

    Since human behavior and perception have evolved within the Earth's gravitational field, humans possess an internal model of gravity. Although gravity is known to influence the visual perception of moving objects, the evidence is less clear concerning the visual perception of static objects. We investigated whether a visual judgment of the stability of human body postures (static postures of a human standing on a platform and tilted in the roll plane) may also be influenced by gravity and by the participant's orientation. Pictures of human body postures were presented in different orientations with respect to gravity and the participant's body. The participant's body was aligned to gravity (upright) or not (lying on one side). Participants performed stability judgments with respect to the platform, imagining that gravity operates in the direction indicated by the platform (that was or was not concordant with physical gravity). Such visual judgments were influenced by the picture's orientation with respect to physical gravity. When pictures were tilted by 90 degrees with respect to physical gravity, the human postures that were tilted toward physical gravity (down) were perceived as more unstable than similar postures tilted away from physical gravity (up). Stability judgments were also influenced by the picture's orientation with respect to the participant's body. This indicates that gravity and the participant's body position may influence the visual perception of static objects.

  7. Visual Influences on Speech Perception in Children with Autism

    ERIC Educational Resources Information Center

    Iarocci, Grace; Rombough, Adrienne; Yager, Jodi; Weeks, Daniel J.; Chua, Romeo

    2010-01-01

    The bimodal perception of speech sounds was examined in children with autism as compared to mental age--matched typically developing (TD) children. A computer task was employed wherein only the mouth region of the face was displayed and children reported what they heard or saw when presented with consonant-vowel sounds in unimodal auditory…

  8. Science Visual Literacy: Learners' Perceptions and Knowledge of Diagrams

    ERIC Educational Resources Information Center

    McTigue, Erin M.; Flowers, Amanda C.

    2011-01-01

    Constructing meaning from science texts relies not only on comprehending the words but also the diagrams and other graphics. The goal of this study was to explore elementary students' perceptions of science diagrams and their skills related to diagram interpretation. 30 students, ranging from second grade through middle school, completed a diagram…

  9. That Deceptive Line: Plato, Linear Perspective, Visual Perception, and Tragedy

    ERIC Educational Resources Information Center

    Killian, Jeremy

    2012-01-01

    In "The Renaissance Rediscovery of Linear Perspective," one of Samuel Edgerton's claims is that Filippo Brunelleschi and his contemporaries did not develop a three-dimensional style of representing the world in painting as much as they reappropriated a way to depict the natural world in painting that most mirrored the human perception of it.…

  10. Visual Chemistry: Three-Dimensional Perception of Chemical Structures.

    ERIC Educational Resources Information Center

    Balaban, Alexandru T.

    1999-01-01

    Discusses in great detail aspects connected with the visual and mental processing of chemical images. Presents various types of conventions for translating three-dimensional objects into two-dimensional representations. (Author/CCM)

  11. Visual hallucinosis: the major clinical determinant of distorted chromatic contour perception in Parkinson's disease.

    PubMed

    Büttner, T; Kuhn, W; Müller, T; Welter, F L; Federlein, J; Heidbrink, K; Przuntek, H

    1996-01-01

    Recently distorted chromatic contour perception has been demonstrated in Parkinson's disease (PD). The aim of our study is to determine the clinical factors which influence chromatic contour perception in PD. Chromatic and achromatic contour perception, colour discrimination and clinical data were evaluated in 73 patients with PD. We used a computer-aided method to determine the chromatic fusion time (CFT) which indicates the acuity of monochromatic contour perception. Chromatic CFT was generally shortened in patients as compared to controls (p < 0.01), whereas achromatic CFT was not significantly different. Variance analysis revealed the ability of colour discrimination and the risk of visual hallucinations as statistically significant (p < 0.05) variables influencing contour perception of certain stimuli. In contrast, disease stage, disease duration and disease severity have no relevant effect on chromatic contour perception in Parkinson's disease. On the basis of those properties one may suggest that distorted chromatic contour perception is due to an impairment at a central stage of visual processing in PD and an imbalance of the serotonergic system. Whether CFT is a reliable method to predict the individual risk of hallucinosis in PD has to be evaluated.

  12. Changes of visual vertical perception: a long-term sign of unilateral and bilateral vestibular loss.

    PubMed

    Lopez, Christophe; Lacour, Michel; Ahmadi, Abdessadek El; Magnan, Jacques; Borel, Liliane

    2007-05-15

    This study investigates how unilateral and bilateral vestibular deafferentation modifies visual vertical perception in the presence of dynamic and static visual cues. We tested 40 Menière's patients before and after (from 1 week to 1 year) a curative unilateral vestibular neurotomy (UVN), and 4 patients with bilateral vestibular loss. Patients' performances were compared with those of 24 healthy subjects. The perception of the dynamic visual vertical (DVV) was investigated during optokinetic stimulations around the line of sight at various angular velocities. The static visual vertical (SVV) was recorded with a stationary visual pattern. In the acute stage after UVN, Menière's patients exhibited drastic impairment of DVV, which was tilted towards the lesioned side, whatever the direction of the optokinetic stimulation. In addition, the SVV was systematically tilted towards the lesioned side. The optokinetic-induced tilt of the vertical was asymmetrically organized around the new SVV with a significant decrease for contralesional stimulations and no change for ipsilesional stimulations, whatever the postoperative time. The SVV regained normal values 1 year postoperatively. For the patients with bilateral vestibular loss, the optokinetic-induced tilt of the visual vertical was drastically increased and symmetrically organized around an unmodified SVV aligned with the gravitational vertical. This study constitutes the first description of the recovery time-course of DVV perception after unilateral vestibular loss. Data reveal a long-term impairment of the DVV perception after unilateral vestibular loss, suggesting an asymmetrical processing of visual information and a permanent increased weight of dynamic visual cues after bilateral vestibular loss. PMID:17382977

  13. Feature-Based Memory-Driven Attentional Capture: Visual Working Memory Content Affects Visual Attention

    ERIC Educational Resources Information Center

    Olivers, Christian N. L.; Meijer, Frank; Theeuwes, Jan

    2006-01-01

    In 7 experiments, the authors explored whether visual attention (the ability to select relevant visual information) and visual working memory (the ability to retain relevant visual information) share the same content representations. The presence of singleton distractors interfered more strongly with a visual search task when it was accompanied by…

  14. Ignition's glow: Ultra-fast spread of global cortical activity accompanying local "ignitions" in visual cortex during conscious visual perception.

    PubMed

    Noy, N; Bickel, S; Zion-Golumbic, E; Harel, M; Golan, T; Davidesco, I; Schevon, C A; McKhann, G M; Goodman, R R; Schroeder, C E; Mehta, A D; Malach, R

    2015-09-01

    Despite extensive research, the spatiotemporal span of neuronal activations associated with the emergence of a conscious percept is still debated. The debate can be formulated in the context of local vs. global models, emphasizing local activity in visual cortex vs. a global fronto-parietal "workspace" as the key mechanisms of conscious visual perception. These alternative models lead to differential predictions with regard to the precise magnitude, timing and anatomical spread of neuronal activity during conscious perception. Here we aimed to test a specific aspect of these predictions in which local and global models appear to differ - namely the extent to which fronto-parietal regions modulate their activity during task performance under similar perceptual states. So far the main experimental results relevant to this debate have been obtained from non-invasive methods and led to conflicting interpretations. Here we examined these alternative predictions through large-scale intracranial measurements (Electrocorticogram - ECoG) in 43 patients and 4445 recording sites. Both ERP and broadband high frequency (50-150 Hz - BHF) responses were examined through the entire cortex during a simple 1-back visual recognition memory task. Our results reveal short latency intense visual responses, localized first in early visual cortex followed (at ∼200 ms) by higher order visual areas, but failed to show significant delayed (300 ms) visual activations. By contrast, oddball image repeat events, linked to overt motor responses, were associated with a significant increase in a delayed (300 ms) peak of BHF power in fronto-parietal cortex. Comparing BHF responses with ERP revealed an additional peak in the ERP response - having a similar latency to the well-studied P3 scalp EEG response. Posterior and temporal regions demonstrated robust visual category selectivity. An unexpected observation was that high-order visual cortex responses were essentially concurrent (at ∼200 ms
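 
    For readers unfamiliar with broadband high-frequency (BHF) estimates, the sketch below shows one generic way to extract a 50-150 Hz amplitude envelope from a single channel using a band-pass filter and the Hilbert transform; it is not the authors' pipeline, and the sampling rate and data are placeholders.

```python
# Generic BHF (50-150 Hz) amplitude extraction for one channel (illustrative).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000.0                                   # assumed sampling rate in Hz
t = np.arange(0, 2.0, 1.0 / fs)
signal = np.random.randn(t.size)              # placeholder for a recorded channel

b, a = butter(4, [50 / (fs / 2), 150 / (fs / 2)], btype="bandpass")
bhf = filtfilt(b, a, signal)                  # 50-150 Hz band-limited signal
envelope = np.abs(hilbert(bhf))               # instantaneous BHF amplitude
print(envelope.mean())
```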

  15. Perceptions of Visual Literacy. Selected Readings from the Annual Conference of the International Visual Literacy Association (21st, Scottsdale, Arizona, October 1989).

    ERIC Educational Resources Information Center

    Braden, Roberts A., Ed.; And Others

    These proceedings contain 37 papers from 51 authors noted for their expertise in the field of visual literacy. The collection is divided into three sections: (1) "Examining Visual Literacy" (including, in addition to a 7-year International Visual Literacy Association bibliography covering the period from 1983-1989, papers on the perception of…

  16. Perception of Audio-Visual Speech Synchrony in Spanish-Speaking Children with and without Specific Language Impairment

    ERIC Educational Resources Information Center

    Pons, Ferran; Andreu, Llorenc; Sanz-Torrent, Monica; Buil-Legaz, Lucia; Lewkowicz, David J.

    2013-01-01

    Speech perception involves the integration of auditory and visual articulatory information, and thus requires the perception of temporal synchrony between this information. There is evidence that children with specific language impairment (SLI) have difficulty with auditory speech perception but it is not known if this is also true for the…

  17. Auditory, Visual, and Auditory-Visual Perception of Vowels by Hearing-Impaired Children.

    ERIC Educational Resources Information Center

    Hack, Zarita Caplan; Erber, Norman P.

    1982-01-01

    Vowels were presented through auditory, visual, and auditory-visual modalities to 18 hearing impaired children (12 to 15 years old) having good, intermediate, and poor auditory word recognition skills. All the groups had difficulty with acoustic information and visual information alone. The first two groups had only moderate difficulty identifying…

  18. Applications of neural networks in human shape visual perception.

    PubMed

    Wu, Bo-Wen; Fang, Yi-Chin; Lin, David Pei-Cheng

    2015-12-01

    Advances in optical and electronic technology can immensely reduce noise in images and greatly enhance human visual recognition. However, it is still difficult for human eyes to identify low-resolution thermal images, due to the limits imposed by psychological and physiological factors. In addition, changes in monitor brightness and lens resolution may also interfere with visual recognition abilities. To overcome these limitations, we devised a suitable and effective recognition method which may help the military in revising the shape parameters of long-range targets. The modulation transfer function was used as a basis to extend the visual characteristics of the human visual model and a new model was produced through the incorporation of new shape parameters. The new human visual model was next used in combination with a backpropagation neural network for better recognition of low-resolution thermal images. The new model was then tested in experiments and the results showed that the accuracy rate of recognition steadily rose by over 95%.
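 
    As an illustration of pairing hand-crafted shape parameters with a backpropagation network, the sketch below trains a small multilayer perceptron on synthetic features; the feature count, labels, and library choice are assumptions, not details from the paper.

```python
# Purely illustrative: a small backpropagation network (MLP) trained on
# synthetic "shape parameter" features standing in for thermal-image descriptors.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 8))                  # 8 hypothetical shape parameters
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # two target classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```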

  19. Hearing brighter: changing in-depth visual perception through looming sounds.

    PubMed

    Sutherland, Clare A M; Thut, Gregor; Romei, Vincenzo

    2014-09-01

    Rapidly approaching (looming) sounds are ecologically salient stimuli that are perceived as nearer than they are due to overestimation of their loudness change and underestimation of their distance (Neuhoff, 1998; Seifritz et al., 2002). Despite evidence for crossmodal influence by looming sounds onto visual areas (Romei, Murray, Cappe, & Thut, 2009, 2013; Tyll et al., 2013), it is unknown whether such sounds bias visual percepts in similar ways. Nearer objects appear to be larger and brighter than distant objects. If looming sounds impact visual processing, then visual stimuli paired with looming sounds should be perceived as brighter and larger, even when the visual stimuli do not provide motion cues, i.e. are static. In Experiment 1 we found that static visual objects paired with looming tones (but not static or receding tones) were perceived as larger and brighter than their actual physical properties, as if they appear closer to the observer. In a second experiment, we replicate and extend the findings of Experiment 1. Crucially, we did not find evidence of such bias by looming sounds when visual processing was disrupted via masking or when catch trials were presented, ruling out simple response bias. Finally, in a third experiment we found that looming tones do not bias visual stimulus characteristics that do not carry visual depth information such as shape, providing further evidence that they specifically impact in-depth visual processing. We conclude that looming sounds impact visual perception through a mechanism transferring in-depth sound motion information onto the relevant in-depth visual dimensions (such as size and luminance but not shape) in a crossmodal remapping of information for a genuine, evolutionary advantage in stimulus detection.

  20. Two reference frames for visual perception in two gravity conditions.

    PubMed

    Lipshits, Mark; Bengoetxea, Ana; Cheron, Guy; McIntyre, Joseph

    2005-01-01

    The processing and storage of visual information concerning the orientation of objects in space is carried out in anisotropic reference frames in which all orientations are not treated equally. The perceptual anisotropies, and the implicit reference frames that they define, are evidenced by the observation of 'oblique effects' in which performance on a given perceptual task is better for horizontally and vertically oriented stimuli. The question remains how the preferred horizontal and vertical reference frames are defined. In these experiments cosmonaut subjects reproduced the remembered orientation of a visual stimulus in 1g (on the ground) and in 0g, both attached to a chair and while free-floating within the International Space Station. Results show that while the remembered orientation of a visual stimulus may be stored in a multimodal reference frame that includes gravity, an egocentric reference is sufficient to elicit the oblique effect when all gravitational and haptic cues are absent.

  1. Visual-motor recalibration in geographical slant perception

    NASA Technical Reports Server (NTRS)

    Bhalla, M.; Proffitt, D. R.; Kaiser, M. K. (Principal Investigator)

    1999-01-01

    In 4 experiments, it was shown that hills appear steeper to people who are encumbered by wearing a heavy backpack (Experiment 1), are fatigued (Experiment 2), are of low physical fitness (Experiment 3), or are elderly and/or in declining health (Experiment 4). Visually guided actions are unaffected by these manipulations of physiological potential. Although dissociable, the awareness and action systems were also shown to be interconnected. Recalibration of the transformation relating awareness and actions was found to occur over long-term changes in physiological potential (fitness level, age, and health) but not with transitory changes (fatigue and load). Findings are discussed in terms of a time-dependent coordination between the separate systems that control explicit visual awareness and visually guided action.

  2. What can fish brains tell us about visual perception?

    PubMed Central

    Rosa Salva, Orsola; Sovrano, Valeria Anna; Vallortigara, Giorgio

    2014-01-01

    Fish are a complex taxonomic group, whose diversity and distance from other vertebrates well suits the comparative investigation of brain and behavior: in fish species we observe substantial differences with respect to the telencephalic organization of other vertebrates and an astonishing variety in the development and complexity of pallial structures. We will concentrate on the contribution of research on fish behavioral biology for the understanding of the evolution of the visual system. We shall review evidence concerning perceptual effects that reflect fundamental principles of the visual system functioning, highlighting the similarities and differences between distant fish groups and with other vertebrates. We will focus on perceptual effects reflecting some of the main tasks that the visual system must attain. In particular, we will deal with subjective contours and optical illusions, invariance effects, second order motion and biological motion and, finally, perceptual binding of object properties in a unified higher level representation. PMID:25324728

  3. Analysis of EEG Signals Related to Artists and Nonartists during Visual Perception, Mental Imagery, and Rest Using Approximate Entropy

    PubMed Central

    Shourie, Nasrin; Firoozabadi, Mohammad; Badie, Kambiz

    2014-01-01

    In this paper, differences between multichannel EEG signals of artists and nonartists were analyzed during visual perception and mental imagery of some paintings and at resting condition using approximate entropy (ApEn). It was found that ApEn is significantly higher for artists during the visual perception and the mental imagery in the frontal lobe, suggesting that artists process more information during these conditions. It was also observed that ApEn decreases for the two groups during the visual perception due to increasing mental load; however, their variation patterns are different. This difference may be used for measuring progress in novice artists. In addition, it was found that ApEn is significantly lower during the visual perception than the mental imagery in some of the channels, suggesting that visual perception task requires more cerebral efforts. PMID:25133180
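 
    Approximate entropy itself is straightforward to compute; the sketch below is a compact reference implementation with common default parameters (m = 2, r = 0.2 × SD), which are assumptions and not necessarily the settings used in the study.

```python
# Compact reference implementation of approximate entropy (ApEn) for one channel.
import numpy as np

def approximate_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    n = x.size
    if r is None:
        r = 0.2 * x.std()                      # common default tolerance

    def phi(m):
        # All overlapping templates of length m.
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates.
        dists = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        counts = (dists <= r).mean(axis=1)     # fraction of matching templates
        return np.log(counts).mean()

    return phi(m) - phi(m + 1)

print(approximate_entropy(np.sin(np.linspace(0, 20 * np.pi, 500))))
```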

  4. Biases in Visual, Auditory, and Audiovisual Perception of Space.

    PubMed

    Odegaard, Brian; Wozny, David R; Shams, Ladan

    2015-12-01

    Localization of objects and events in the environment is critical for survival, as many perceptual and motor tasks rely on estimation of spatial location. Therefore, it seems reasonable to assume that spatial localizations should generally be accurate. Curiously, some previous studies have reported biases in visual and auditory localizations, but these studies have used small sample sizes and the results have been mixed. Therefore, it is not clear (1) if the reported biases in localization responses are real (or due to outliers, sampling bias, or other factors), and (2) whether these putative biases reflect a bias in sensory representations of space or a priori expectations (which may be due to the experimental setup, instructions, or distribution of stimuli). Here, to address these questions, a dataset of unprecedented size (obtained from 384 observers) was analyzed to examine presence, direction, and magnitude of sensory biases, and quantitative computational modeling was used to probe the underlying mechanism(s) driving these effects. Data revealed that, on average, observers were biased towards the center when localizing visual stimuli, and biased towards the periphery when localizing auditory stimuli. Moreover, quantitative analysis using a Bayesian Causal Inference framework suggests that while pre-existing spatial biases for central locations exert some influence, biases in the sensory representations of both visual and auditory space are necessary to fully explain the behavioral data. How are these opposing visual and auditory biases reconciled in conditions in which both auditory and visual stimuli are produced by a single event? Potentially, the bias in one modality could dominate, or the biases could interact/cancel out. The data revealed that when integration occurred in these conditions, the visual bias dominated, but the magnitude of this bias was reduced compared to unisensory conditions. Therefore, multisensory integration not only improves the
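 
    The Bayesian Causal Inference framework mentioned here can be illustrated with a toy version of the standard model: the posterior probability that the visual and auditory signals arose from a single source, given their discrepancy. The sketch below assumes a zero-mean Gaussian spatial prior and invented noise parameters; it is not the authors' fitted model.

```python
# Toy causal-inference sketch: posterior probability of a common audio-visual cause.
import numpy as np

def p_common(x_v, x_a, sigma_v=2.0, sigma_a=8.0, sigma_p=15.0, prior_common=0.5):
    """x_v, x_a: noisy visual/auditory location estimates (deg); sigmas assumed."""
    # Likelihood of the pair under one shared source (source integrated out).
    var1 = sigma_v**2 * sigma_a**2 + sigma_v**2 * sigma_p**2 + sigma_a**2 * sigma_p**2
    like_c1 = (np.exp(-0.5 * ((x_v - x_a) ** 2 * sigma_p ** 2
                              + x_v ** 2 * sigma_a ** 2
                              + x_a ** 2 * sigma_v ** 2) / var1)
               / (2 * np.pi * np.sqrt(var1)))
    # Likelihood under two independent sources.
    var_v, var_a = sigma_v**2 + sigma_p**2, sigma_a**2 + sigma_p**2
    like_c2 = (np.exp(-0.5 * x_v**2 / var_v) / np.sqrt(2 * np.pi * var_v)
               * np.exp(-0.5 * x_a**2 / var_a) / np.sqrt(2 * np.pi * var_a))
    return like_c1 * prior_common / (like_c1 * prior_common
                                     + like_c2 * (1 - prior_common))

print(p_common(x_v=2.0, x_a=4.0))   # nearby signals -> high probability of one cause
```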

  6. Shining new light on dark percepts: visual sensations induced by TMS.

    PubMed

    Knight, Ramisha; Mazzi, Chiara; Savazzi, Silvia

    2015-11-01

    Phosphenes induced by transcranial magnetic stimulation (TMS) are sensations of light, whereas a missing region in the visual field induced by TMS is generally referred to as a scotoma. It is believed that phosphenes are caused by neural excitation, while scotomas are due to neural inhibition. In light of the recent literature it might, however, be surmised that both phenomena are the result of neural noise injected into the cortex by TMS and that the likelihood of perceiving the two kinds of percepts depends on the state of the cortex at the time of stimulation. In the present study, TMS was applied over the left occipital cortex under different background conditions (Experiments 1-2) and using different TMS intensities (Experiment 3). Behavioral responses indicate the visual system processes luminance in a standardized manner, as lighter percepts were reacted to faster than darker percepts; this effect, however, did not extend to percept size. Our results suggest that phenomenological characteristics of artificial visual percepts are in line with the proposed effects of TMS as the induction of random neural noise interfering with the neural dynamics (the state of the cortex) at the time of stimulation.

  7. Retention interval affects visual short-term memory encoding.

    PubMed

    Bankó, Eva M; Vidnyánszky, Zoltán

    2010-03-01

    Humans can efficiently store fine-detailed facial emotional information in visual short-term memory for several seconds. However, an unresolved question is whether the same neural mechanisms underlie high-fidelity short-term memory for emotional expressions at different retention intervals. Here we show that retention interval affects the neural processes of short-term memory encoding using a delayed facial emotion discrimination task. The early sensory P100 component of the event-related potentials (ERP) was larger in the 1-s interstimulus interval (ISI) condition than in the 6-s ISI condition, whereas the face-specific N170 component was larger in the longer ISI condition. Furthermore, the memory-related late P3b component of the ERP responses was also modulated by retention interval: it was reduced in the 1-s ISI as compared with the 6-s condition. The present findings cannot be explained based on differences in sensory processing demands or overall task difficulty because there was no difference in the stimulus information and subjects' performance between the two different ISI conditions. These results reveal that encoding processes underlying high-precision short-term memory for facial emotional expressions are modulated depending on whether information has to be stored for one or for several seconds.

  8. The Effect of Learning Background and Imagery Cognitive Development on Visual Perception

    ERIC Educational Resources Information Center

    Chiang, Shyh-Bao; Sun, Chun-Wang

    2013-01-01

    This research looked into the effect of how cognitive development toward imagery is formed through visual perception by means of a quantitative questionnaire. The main variable was the difference between the learning backgrounds of the interviewees. A two-way ANOVA mixed design was the statistical method used for the analysis of the 2 × 4 (2 by 4)…
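 
    A 2 × 4 mixed-design ANOVA of the kind described can be run, for example, with the pingouin library; the sketch below uses invented factor names and random data purely to show the structure of such an analysis.

```python
# Hypothetical 2 (between) x 4 (within) mixed-design ANOVA with invented data.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(1)
subjects = np.repeat(np.arange(40), 4)                       # 40 participants x 4 conditions
background = np.repeat(["design", "non-design"], 20 * 4)     # between-subjects factor
condition = np.tile(["c1", "c2", "c3", "c4"], 40)            # within-subjects factor
score = rng.normal(5, 1, size=160)

df = pd.DataFrame({"subject": subjects, "background": background,
                   "condition": condition, "score": score})
aov = pg.mixed_anova(data=df, dv="score", within="condition",
                     between="background", subject="subject")
print(aov)
```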

  9. The Validity of Two Clinical Tests of Visual-Motor Perception

    ERIC Educational Resources Information Center

    Wallbrown, Jane D.; And Others

    1977-01-01

    The intent of this study was to determine whether the Minnesota Percepto-Diagnostic Test (Fuller, 1969; Fuller & Laird, 1963) is more effective than the Bender-Gestalt (Bender, 1937) with respect to identifying achievement-related errors in visual-motor perception. (Author/RK)

  10. Visual Images of Subjective Perception of Time in a Literary Text

    ERIC Educational Resources Information Center

    Nesterik, Ella V.; Issina, Gaukhar I.; Pecherskikh, Taliya F.; Belikova, Oxana V.

    2016-01-01

    The article is devoted to the subjective perception of time, or psychological time, as a text category and a literary image. It focuses on the visual images that are characteristic of different types of literary time--accelerated, decelerated and frozen (vanished). The research is based on the assumption that the category of subjective perception…

  11. Perceptions of Older Veterans with Visual Impairments Regarding Computer Access Training and Quality of Life

    ERIC Educational Resources Information Center

    DuBosque, Richard Stanborough

    2013-01-01

    The widespread integration of the computer into the mainstream of daily life presents a challenge to various sectors of society, and the incorporation of this technology into the realm of the older individual with visual impairments is a relatively uncharted field of study. This study was undertaken to acquire the perceptions of the impact of the…

  12. Perception of Words and Non-Words in the Upper and Lower Visual Fields

    ERIC Educational Resources Information Center

    Darker, Iain T.; Jordan, Timothy R.

    2004-01-01

    The findings of previous investigations into word perception in the upper and the lower visual field (VF) are variable and may have incurred non-perceptual biases caused by the asymmetric distribution of information within a word, an advantage for saccadic eye-movements to targets in the upper VF and the possibility that stimuli were not projected…

  13. The Analysis of Reading Skills and Visual Perception Levels of First Grade Turkish Students

    ERIC Educational Resources Information Center

    Memis, Aysel; Sivri, Diler Ayvaz

    2016-01-01

    In this study, primary school first-grade students' reading skills and visual perception levels were investigated. The sample of the study, which was designed with a relational scanning model, consisted of 168 first-grade students studying at three public primary schools in Kozlu, Zonguldak, in the 2013-2014 education year. Students' reading level, reading…

  14. Increase of Universality in Human Brain during Mental Imagery from Visual Perception

    PubMed Central

    Bhattacharya, Joydeep

    2009-01-01

    Background Different complex systems behave in a similar way near their critical points of phase transitions which leads to an emergence of a universal scaling behaviour. Universality indirectly implies a long-range correlation between constituent subsystems. As the distributed correlated processing is a hallmark of higher complex cognition, I investigated a measure of universality in human brain during perception and mental imagery of complex real-life visual object like visual art. Methodology/Principal Findings A new method was presented to estimate the strength of hidden universal structure in a multivariate data set. In this study, I investigated this method in the electrical activities (electroencephalogram signals) of human brain during complex cognition. Two broad groups - artists and non-artists - were studied during the encoding (perception) and retrieval (mental imagery) phases of actual paintings. Universal structure was found to be stronger in visual imagery than in visual perception, and this difference was stronger in artists than in non-artists. Further, this effect was found to be largest in the theta band oscillations and over the prefrontal regions bilaterally. Conclusions/Significance Phase transition like dynamics was observed in the electrical activities of human brain during complex cognitive processing, and closeness to phase transition was higher in mental imagery than in real perception. Further, the effect of long-term training on the universal scaling was also demonstrated. PMID:19122817

  15. Visual Perception and Frontal Lobe in Intellectual Disabilities: A Study with Evoked Potentials and Neuropsychology

    ERIC Educational Resources Information Center

    Munoz-Ruata, J.; Caro-Martinez, E.; Perez, L. Martinez; Borja, M.

    2010-01-01

    Background: Perception disorders are frequently observed in persons with intellectual disability (ID) and their influence on cognition has been discussed. The objective of this study is to clarify the mechanisms behind these alterations by analysing the visual event related potentials early component, the N1 wave, which is related to perception…

  16. Optical Phonetics and Visual Perception of Lexical and Phrasal Stress in English

    ERIC Educational Resources Information Center

    Scarborough, Rebecca; Keating, Patricia; Mattys, Sven L.; Cho, Taehong; Alwan, Abeer

    2009-01-01

    In a study of optical cues to the visual perception of stress, three American English talkers spoke words that differed in lexical stress and sentences that differed in phrasal stress, while video and movements of the face were recorded. The production of stressed and unstressed syllables from these utterances was analyzed along many measures of…

  17. Sustained interactions between perception and action in visual extinction and neglect: evidence from sequential pointing.

    PubMed

    Kitadono, Keiko; Humphreys, Glyn W

    2009-05-01

    Interactions between perception and action were examined by assessing the effects of action programming on extinction and neglect. In an extension of prior work, effects of sequential motor programming were assessed under conditions in which attention was first directed to an ipsilesional stimulus. Despite pointing and reporting a stimulus on the ipsilesional side first, programming a second action to the contralesional side reduced the spatial deficit on report, improving the report of contralesional stimuli (relative to when the patient just pointed to the ipsilesional side) while decreasing the report of ipsilesional items. The data suggest that perception and action interact through motor feedback to early visual coding, helping a patient overcome a lack of visual awareness to contralesional stimuli. This is effective even when attention has to be disengaged from the ipsilesional side suggesting that motor programming decreases ipsilesional capture and exerts a sustained influence on perception.

  18. Modelling Subjectivity in Visual Perception of Orientation for Image Retrieval.

    ERIC Educational Resources Information Center

    Sanchez, D.; Chamorro-Martinez, J.; Vila, M. A.

    2003-01-01

    Discussion of multimedia libraries and the need for storage, indexing, and retrieval techniques focuses on the combination of computer vision and data mining techniques to model high-level concepts for image retrieval based on perceptual features of the human visual system. Uses fuzzy set theory to measure users' assessments and to capture users'…

  19. Exploring Children's Perceptions of Play Using Visual Methodologies

    ERIC Educational Resources Information Center

    Anthamatten, Peter; Wee, Bryan Shao-Chang; Korris, Erin

    2013-01-01

    Objective: A great deal of scholarly work has examined the way that physical, social and cultural environments relate to children's health behaviour, particularly with respect to diet and exercise. While this work is critical, little research attempts to incorporate the views and perspectives of children themselves using visual methodologies.…

  20. Audio-visual perception system for a humanoid robotic head.

    PubMed

    Viciana-Abad, Raquel; Marfil, Rebeca; Perez-Lorenzo, Jose M; Bandera, Juan P; Romero-Garces, Adrian; Reche-Lopez, Pedro

    2014-01-01

    One of the main issues within the field of social robotics is to endow robots with the ability to direct attention to people with whom they are interacting. Different approaches follow bio-inspired mechanisms, merging audio and visual cues to localize a person using multiple sensors. However, most of these fusion mechanisms have been used in fixed systems, such as those used in video-conference rooms, and thus, they may incur difficulties when constrained to the sensors with which a robot can be equipped. Moreover, for interactive autonomous robots, the benefits of audio-visual attention mechanisms over audio-only or visual-only approaches have rarely been evaluated in real scenarios; most of the tests conducted have been within controlled environments, at short distances, and/or with off-line performance measurements. With the goal of demonstrating the benefit of fusing sensory information with Bayes inference for interactive robotics, this paper presents a system for localizing a person by processing visual and audio data. In addition, the performance of this system is evaluated against unimodal systems, taking their technical limitations into account. The experiments show the promise of the proposed approach for the proactive detection and tracking of speakers in a human-robot interactive framework.
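 
    The simplest Bayes-style fusion rule consistent with this idea is a precision-weighted combination of independent bearing estimates; the sketch below is a minimal illustration under that assumption and is not the system described in the paper.

```python
# Minimal sketch: fusing independent visual and auditory bearing estimates of a
# speaker via a product of Gaussians (precision-weighted average).
def fuse_bearings(mu_vis, var_vis, mu_aud, var_aud):
    """Return the fused bearing estimate and its variance (angles in degrees)."""
    w_vis, w_aud = 1.0 / var_vis, 1.0 / var_aud
    mu = (w_vis * mu_vis + w_aud * mu_aud) / (w_vis + w_aud)
    return mu, 1.0 / (w_vis + w_aud)

# Example: vision is precise but narrow-field; audio is coarse but omnidirectional.
print(fuse_bearings(mu_vis=12.0, var_vis=4.0, mu_aud=20.0, var_aud=36.0))
```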

  1. Visual Size Perception and Haptic Calibration during Development

    ERIC Educational Resources Information Center

    Gori, Monica; Giuliana, Luana; Sandini, Giulio; Burr, David

    2012-01-01

    It is still unclear how the visual system perceives accurately the size of objects at different distances. One suggestion, dating back to Berkeley's famous essay, is that vision is calibrated by touch. If so, we may expect different mechanisms involved for near, reachable distances and far, unreachable distances. To study how the haptic system…

  2. Vibrotactile Perception: Perspective Taking by Children Who Are Visually Impaired.

    ERIC Educational Resources Information Center

    Miletic, G.

    1994-01-01

    This study compared the performance on perspective-taking tasks of 8 congenitally blind children (mean age 13.5 years), using either haptic exploration or a vibrotactile prosthetic device, with the performance of 4 children having low vision using their limited visual abilities. The vibrotactile device improved perspective-taking performance…

  3. Visual Speech Perception in Children with Language Learning Impairments

    ERIC Educational Resources Information Center

    Knowland, Victoria C. P.; Evans, Sam; Snell, Caroline; Rosen, Stuart

    2016-01-01

    Purpose: The purpose of the study was to assess the ability of children with developmental language learning impairments (LLIs) to use visual speech cues from the talking face. Method: In this cross-sectional study, 41 typically developing children (mean age: 8 years 0 months, range: 4 years 5 months to 11 years 10 months) and 27 children with…

  4. Auditory-Visual Perception of Changing Distance by Human Infants.

    ERIC Educational Resources Information Center

    Walker-Andrews, Arlene S.; Lennon, Elizabeth M.

    1985-01-01

    Examines, in two experiments, 5-month-old infants' sensitivity to auditory-visual specification of distance and direction of movement. One experiment presented two films with soundtracks in either a match or mismatch condition; the second showed the two films side-by-side with a single soundtrack appropriate to one. Infants demonstrated visual…

  5. Altering Visual Perception Abnormalities: A Marker for Body Image Concern

    PubMed Central

    Duncum, Anna J. F.; Mundy, Matthew E.

    2016-01-01

    The body image concern (BIC) continuum ranges from a healthy and positive body image, to clinical diagnoses of abnormal body image, like body dysmorphic disorder (BDD). BDD and non-clinical, yet high-BIC participants have demonstrated a local visual processing bias, characterised by reduced inversion effects. To examine whether this bias is a potential marker of BDD, the visual processing of individuals across the entire BIC continuum was examined. Dysmorphic Concern Questionnaire (DCQ; quantified BIC) scores were expected to correlate with higher discrimination accuracy and faster reaction times of inverted stimuli, indicating reduced inversion effects (occurring due to increased local visual processing). Additionally, an induced global or local processing bias via Navon stimulus presentation was expected to alter these associations. Seventy-four participants completed the DCQ and upright-inverted face and body stimulus discrimination task. Moderate positive associations were revealed between DCQ scores and accuracy rates for inverted face and body stimuli, indicating a graded local bias accompanying increases in BIC. This relationship supports a local processing bias as a marker for BDD, which has significant assessment implications. Furthermore, a moderate negative relationship was found between DCQ score and inverted face accuracy after inducing global processing, indicating the processing bias can temporarily be reversed in high BIC individuals. Navon stimuli were successfully able to alter the visual processing of individuals across the BIC continuum, which has important implications for treating BDD. PMID:27003715

  6. Visual and Tactual Perception of Shape by Young Children

    ERIC Educational Resources Information Center

    Bryant, P. E.; Raz, I.

    1975-01-01

    Simultaneous and successive visual and tactual shape discrimination were examined in this study which replicated with modifications an earlier study. When ceiling effects were precluded, data support the conclusion that children often find it more difficult to discriminate shapes by touch than by vision. (GO)

  7. A framework for the first-person internal sensation of visual perception in mammals and a comparable circuitry for olfactory perception in Drosophila.

    PubMed

    Vadakkan, Kunjumon I

    2015-01-01

    Perception is a first-person internal sensation induced within the nervous system at the time of arrival of sensory stimuli from objects in the environment. Lack of access to the first-person properties has limited researchers to viewing perception as an emergent property, and it is currently being studied using third-person observed findings from various levels. One feasible approach to understand its mechanism is to build a hypothesis for the specific conditions and required circuit features of the nodal points where the mechanistic operation of perception takes place for one type of sensation in one species and to verify it for the presence of comparable circuit properties for perceiving a different sensation in a different species. The present work explains visual perception in the mammalian nervous system from a first-person frame of reference and provides explanations for the homogeneity of perception of visual stimuli above flicker fusion frequency, the perception of objects at locations different from their actual position, the smooth pursuit and saccadic eye movements, the perception of object borders, and the perception of pressure phosphenes. Using results from temporal resolution studies and the known details of visual cortical circuitry, explanations are provided for (a) the perception of rapidly changing visual stimuli, (b) how the perception of objects occurs in the correct orientation even though, according to the third-person view, activity from the visual stimulus reaches the cortices in an inverted manner, and (c) the functional significance of the well-conserved columnar organization of the visual cortex. A comparable circuitry detected in a different nervous system in a remote species, the olfactory circuitry of the fruit fly Drosophila melanogaster, provides an opportunity to explore circuit functions using genetic manipulations, which, along with high-resolution microscopic techniques and lipid membrane interaction studies, will be able to verify the structure

  8. Toward unsupervised outbreak detection through visual perception of new patterns

    PubMed Central

    Lévy, Pierre P; Valleron, Alain-Jacques

    2009-01-01

    Background Statistical algorithms are routinely used to detect outbreaks of well-defined syndromes, such as influenza-like illness. These methods cannot be applied to the detection of emerging diseases for which no preexisting information is available. This paper presents a method aimed at facilitating the detection of outbreaks, when there is no a priori knowledge of the clinical presentation of cases. Methods The method uses a visual representation of the symptoms and diseases coded during a patient consultation according to the International Classification of Primary Care 2nd version (ICPC-2). The surveillance data are transformed into color-coded cells, ranging from white to red, reflecting the increasing frequency of observed signs. They are placed in a graphic reference frame mimicking body anatomy. Simple visual observation of color-change patterns over time, concerning a single code or a combination of codes, enables detection in the setting of interest. Results The method is demonstrated through retrospective analyses of two data sets: description of the patients referred to the hospital by their general practitioners (GPs) participating in the French Sentinel Network and description of patients directly consulting at a hospital emergency department (HED). Informative image color-change alert patterns emerged in both cases: the health consequences of the August 2003 heat wave were visualized with GPs' data (but passed unnoticed with conventional surveillance systems), and the flu epidemics, which are routinely detected by standard statistical techniques, were recognized visually with HED data. Conclusion Using human visual pattern-recognition capacities to detect the onset of unexpected health events implies a convenient image representation of epidemiological surveillance and well-trained "epidemiology watchers". Once these two conditions are met, one could imagine that the epidemiology watchers could signal epidemiological alerts, based on "image walls
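
    As a toy illustration of the colour-coding step described above (this is not the published tool; the weekly counts are invented for the example), a Python sketch that maps the frequency of each ICPC-2 code onto a white-to-red cell colour:

      def frequency_to_rgb(count, max_count):
          # Linear interpolation from white (count = 0) to pure red (count = max_count).
          if max_count == 0:
              return (255, 255, 255)
          t = min(count / max_count, 1.0)
          fade = int(round(255 * (1.0 - t)))
          return (255, fade, fade)

      weekly_counts = {"R05 cough": 42, "A03 fever": 17, "D10 vomiting": 3}  # invented data
      peak = max(weekly_counts.values())
      for code, n in weekly_counts.items():
          print(code, frequency_to_rgb(n, peak))

    In the published method, such cells are then arranged in a reference frame mimicking body anatomy, so that a cluster of reddening cells can be spotted by eye as a possible alert pattern.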

  9. Temporal stability of the action-perception cycle for postural control in a moving visual environment.

    PubMed

    Dijkstra, T M; Schöner, G; Gielen, C C

    1994-01-01

    When standing human subjects are exposed to a moving visual environment, the induced postural sway forms a stable temporal relationship with the visual information. We have investigated this relationship experimentally with a new set-up in which a computer generates video images which correspond to the motion of a 3D environment. The suggested mean distance to a sinusoidally moving wall is varied and the temporal relationship to induced sway is analysed (1) in terms of the fluctuations of relative phase between visual and sway motion and (2) in terms of the relaxation time of relative phase as determined from the rate of recovery of the stable relative phase pattern following abrupt changes in the visual motion pattern. The two measures are found to converge to a well-defined temporal stability of the action-perception cycle. Furthermore, we show that this temporal stability is a sensitive measure of the strength of the action-perception coupling. It decreases as the distance of the visual scene from the observer increases. This fact and the increase of mean relative phase are consistent with predictions of a linear second-order system driven by the visual expansion rate. However, the amplitude of visual sway decreases little as visual distance increases, in contradiction to the predictions, and is suggestive of a process that actively generates sway. The visual expansion rate on the optic array is found to decrease strongly with visual distance. This leads to the conclusion that postural control in a moving visual environment cannot be understood simply in terms of minimization of retinal slip, and that dynamic coupling of vision into the postural control system must be taken into account. PMID:8187859
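
    A minimal simulation sketch of the driven second-order model mentioned above (the parameter values are illustrative, not the values fitted in the study): sway x(t) is driven by the visual expansion rate produced by a sinusoidally moving wall, and the relative phase between sway and wall motion is read off at the driving frequency.

      import numpy as np

      dt, T, f_wall = 0.001, 60.0, 0.2                 # time step (s), duration (s), wall frequency (Hz)
      t = np.arange(0.0, T, dt)
      wall = 0.05 * np.sin(2 * np.pi * f_wall * t)     # wall displacement (m)

      distance = 1.0                                   # assumed mean viewing distance (m)
      expansion = np.gradient(wall, dt) / distance     # expansion rate ~ wall velocity / distance

      omega0, zeta, gain = 2 * np.pi * 0.3, 0.7, 1.0   # natural frequency, damping, visual coupling (illustrative)
      x = np.zeros_like(t)                             # sway position
      v = np.zeros_like(t)                             # sway velocity
      for i in range(1, t.size):
          a = gain * expansion[i - 1] - 2 * zeta * omega0 * v[i - 1] - omega0 ** 2 * x[i - 1]
          v[i] = v[i - 1] + a * dt
          x[i] = x[i - 1] + v[i - 1] * dt

      # Relative phase between sway and wall motion at the driving frequency.
      freqs = np.fft.rfftfreq(t.size, dt)
      k = int(np.argmin(np.abs(freqs - f_wall)))
      phase = np.angle(np.fft.rfft(x)[k]) - np.angle(np.fft.rfft(wall)[k])
      print("relative phase at %.1f Hz: %.1f deg" % (f_wall, np.degrees(phase)))

    The same phase measure can be tracked after an abrupt change in the wall motion to estimate a relaxation time, which is the stability measure used in the study.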

  10. Dopamine Activation Preserves Visual Motion Perception Despite Noise Interference of Human V5/MT

    PubMed Central

    Yousif, Nada; Fu, Richard Z.; Abou-El-Ela Bourquin, Bilal; Bhrugubanda, Vamsee; Schultz, Simon R.

    2016-01-01

    When processing sensory signals, the brain must account for noise, both noise in the stimulus and that arising from within its own neuronal circuitry. Dopamine receptor activation is known to enhance both visual cortical signal-to-noise-ratio (SNR) and visual perceptual performance; however, it is unknown whether these two dopamine-mediated phenomena are linked. To assess this, we used single-pulse transcranial magnetic stimulation (TMS) applied to visual cortical area V5/MT to reduce the SNR focally and thus disrupt visual motion discrimination performance to visual targets located in the same retinotopic space. The hypothesis that dopamine receptor activation enhances perceptual performance by improving cortical SNR predicts that dopamine activation should antagonize TMS disruption of visual perception. We assessed this hypothesis via a double-blinded, placebo-controlled study with the dopamine receptor agonists cabergoline (a D2 agonist) and pergolide (a D1/D2 agonist) administered in separate sessions (separated by 2 weeks) in 12 healthy volunteers in a William's balance-order design. TMS degraded visual motion perception when the evoked phosphene and the visual stimulus overlapped in time and space in the placebo and cabergoline conditions, but not in the pergolide condition. This suggests that dopamine D1 or combined D1 and D2 receptor activation enhances cortical SNR to boost perceptual performance. That local visual cortical excitability was unchanged across drug conditions suggests the involvement of long-range intracortical interactions in this D1 effect. Because increased internal noise (and thus lower SNR) can impair visual perceptual learning, improving visual cortical SNR via D1/D2 agonist therapy may be useful in boosting rehabilitation programs involving visual perceptual training. SIGNIFICANCE STATEMENT In this study, we address the issue of whether dopamine activation improves visual perception despite increasing sensory noise in the visual cortex

  11. Seeing the tipping point: Balance perception and visual shape.

    PubMed

    Firestone, Chaz; Keil, Frank C

    2016-07-01

    In a brief glance at an object or shape, we can appreciate a rich suite of its functional properties, including the organization of the object's parts, its optimal contact points for grasping, and its center of mass, or balancing point. However, in the real world and the laboratory, balance perception shows systematic biases whereby observers may misjudge a shape's center of mass by a severe margin. Are such biases simply quirks of physical reasoning? Or might they instead reflect more fundamental principles of object representation? Here we demonstrate systematically biased center-of-mass estimation for two-dimensional (2D) shapes (Study 1) and advance a surprising explanation of such biases. We suggest that the mind implicitly represents ordinary 2D shapes as rich, volumetric, three-dimensional (3D) objects, and that these "inflated" shape representations intrude on and bias perception of the 2D shape's geometric properties. Such "inflation" is a computer-graphics technique for segmenting shapes into parts, and we show that a model derived from this technique best accounts for the biases in center-of-mass estimation in Study 1. Further supporting this account, we show that reducing the need for inflated shape representations diminishes such biases: Center-of-mass estimation improved when cues to shapehood were attenuated (Study 2) and when shapes' depths were explicitly depicted using real-life objects laser-cut from wood (Study 3). We suggest that the technique of shape inflation is actually implemented in the mind; thus, biases in our impressions of balance reflect a more general functional characteristic of object perception. PMID:27348290
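
    As a rough, hypothetical illustration of how "inflating" a flat shape can shift its centre of mass (this is not the model used in the paper; weighting each pixel by its distance to the nearest boundary is only a crude stand-in for volumetric inflation), a Python sketch comparing the ordinary 2D centroid of an L-shaped silhouette with a thickness-weighted centroid:

      import numpy as np
      from scipy.ndimage import distance_transform_edt

      silhouette = np.zeros((60, 60), dtype=bool)
      silhouette[5:50, 10:20] = True    # thin vertical bar
      silhouette[35:50, 10:55] = True   # thicker horizontal base -> an "L" shape

      ys, xs = np.nonzero(silhouette)
      flat_com = (ys.mean(), xs.mean())                 # ordinary 2D (area-weighted) centroid

      thickness = distance_transform_edt(silhouette)    # local thickness as a proxy for inflated depth
      w = thickness[ys, xs]
      inflated_com = (np.average(ys, weights=w), np.average(xs, weights=w))

      print("flat centroid:     (%.2f, %.2f)" % flat_com)
      print("inflated centroid: (%.2f, %.2f)" % inflated_com)

    The thickness-weighted centroid is pulled toward the thicker part of the shape relative to the flat centroid, which is the kind of systematic shift a volumetric, inflated representation would produce.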

  12. Perception of affective prosody in major depression: a link to executive functions?

    PubMed

    Uekermann, Jennifer; Abdel-Hamid, Mona; Lehmkämper, Caroline; Vollmoeller, Wolfgang; Daum, Irene

    2008-07-01

    Major depression is associated with impairments of executive functions and affect perception deficits, both being linked to dysfunction of fronto-subcortical networks. So far, little is known about the relationship between cognitive and affective deficits in major depression. In the present investigation, affect perception and executive functions were assessed in 29 patients with a diagnosis of major depression (Dep) and 29 healthy controls (HC). Both groups were comparable on IQ, age, and gender distribution. Depressed patients showed deficits of perception of affective prosody, which were significantly related to inhibition, set shifting, and working memory. Our findings suggest a significant association between cognitive deficits and affect perception impairments in major depression, which may be of considerable clinical relevance and might be addressed in treatment approaches. Future studies are desirable to investigate the nature of the association in more detail.

  13. Inaccurate perception of asthma symptoms: a cognitive-affective framework and implications for asthma treatment.

    PubMed

    Janssens, Thomas; Verleden, Geert; De Peuter, Steven; Van Diest, Ilse; Van den Bergh, Omer

    2009-06-01

    Inaccurate perception of respiratory symptoms is often found in asthma patients. Typically, patients who inaccurately perceive asthma symptoms are divided into underperceivers and overperceivers. In this paper we point out that this division is problematic. We argue that little evidence exists for a trait-like stability of under- and overperception and that accuracy of respiratory symptom perception is highly variable within persons and strongly influenced by contextual information. Particularly, expectancy and affective cues appear to have a powerful influence on symptom accuracy. Based on these findings and incorporating recent work on associative learning, attention and mental representations in anxiety and symptom perception, we propose a cognitive-affective model of symptom perception in asthma. The model can act as a framework to understand both normal perception as well as under- and overperception of asthma symptoms and can guide the development of affect-related interventions to improve perceptual accuracy, asthma control and quality of life in asthma patients. PMID:19285771

  14. Phosphene Perception Relates to Visual Cortex Glutamate Levels and Covaries with Atypical Visuospatial Awareness

    PubMed Central

    Terhune, Devin B.; Murray, Elizabeth; Near, Jamie; Stagg, Charlotte J.; Cowey, Alan; Cohen Kadosh, Roi

    2015-01-01

    Phosphenes are illusory visual percepts produced by the application of transcranial magnetic stimulation to occipital cortex. Phosphene thresholds, the minimum stimulation intensity required to reliably produce phosphenes, are widely used as an index of cortical excitability. However, the neural basis of phosphene thresholds and their relationship to individual differences in visual cognition are poorly understood. Here, we investigated the neurochemical basis of phosphene perception by measuring basal GABA and glutamate levels in primary visual cortex using magnetic resonance spectroscopy. We further examined whether phosphene thresholds would relate to the visuospatial phenomenology of grapheme-color synesthesia, a condition characterized by atypical binding and involuntary color photisms. Phosphene thresholds negatively correlated with glutamate concentrations in visual cortex, with lower thresholds associated with elevated glutamate. This relationship was robust, present in both controls and synesthetes, and exhibited neurochemical, topographic, and threshold specificity. Projector synesthetes, who experience color photisms as spatially colocalized with inducing graphemes, displayed lower phosphene thresholds than associator synesthetes, who experience photisms as internal images, with both exhibiting lower thresholds than controls. These results suggest that phosphene perception is driven by interindividual variation in glutamatergic activity in primary visual cortex and relates to cortical processes underlying individual differences in visuospatial awareness. PMID:25725043

  16. On the advantage of being left-handed in volleyball: further evidence of the specificity of skilled visual perception.

    PubMed

    Loffing, Florian; Schorer, Jörg; Hagemann, Norbert; Baker, Joseph

    2012-02-01

    High ball speeds and close distances between competitors require athletes in interactive sports to correctly anticipate an opponent's intentions in order to render appropriate reactions. Although it is considered crucial for successful performance, such skill appears impaired when athletes are confronted with a left-handed opponent, possibly because of athletes' reduced perceptual familiarity with rarely encountered left-handed actions. To test this negative perceptual frequency effect hypothesis, we invited 18 skilled and 18 novice volleyball players to predict shot directions of left- and right-handed attacks in a video-based visual anticipation task. In accordance with our predictions, and with recent reports on laterality differences in visual perception, the outcome of left-handed actions was significantly less accurately predicted than the outcome of right-handed attacks. In addition, this left-right bias was most distinct when predictions had to be based on preimpact (i.e., before hand-ball contact) kinematic cues, and skilled players were generally more affected by the opponents' handedness than were novices. The study's findings corroborate the assumption that skilled visual perception is attuned to more frequently encountered actions.

  17. Visual behavior and perception of trajectories of moving objects with visual occlusion.

    PubMed

    Moreno, Francisco J; Luis, Vicente; Salgado, Francisco; García, Juan A; Reina, Raúl

    2005-08-01

    Experienced athletes in sports with moving objects have shown greater skill when using visual information to anticipate the direction of a moving object than nonexperienced athletes of those sports. Studies have shown that expert athletes are more effective than novices in occlusion situations in the first stages of the sports sequence. In this study, 12 athletes with different competitive experience in sports with moving objects viewed a sequence of tennis ball launches with and without visual occlusion, launched by a ball-shooting machine toward different areas with respect to the participant's position. The relation among visual behavior, occlusion time, and the precision of the task is reviewed. The spot where the balls bounced was analysed by a digital camera and visual behavior by an Eye Tracking System. Analysis showed that the nonexperienced athletes made significantly more errors and were more variable in visual occlusion conditions. Participants had a stable visual search strategy. PMID:16350604

  18. Biased perception about gene technology: How perceived naturalness and affect distort benefit perception.

    PubMed

    Siegrist, Michael; Hartmann, Christina; Sütterlin, Bernadette

    2016-01-01

    In two experiments, the participants showed biased responses when asked to evaluate the benefits of gene technology. They evaluated the importance of additional yields in corn fields due to a newly introduced variety, which would increase a farmer's revenues. In one condition, the newly introduced variety was described as a product of traditional breeding; in the other, it was identified as genetically modified (GM). The two experiments' findings showed that the same benefits were perceived as less important for a farmer when these were the result of GM crops compared with traditionally bred crops. Mediation analyses suggest that perceived naturalness and the affect associated with the technology per se influence the interpretation of the new information. The lack of perceived naturalness of gene technology seems to be the reason for the participants' perceived lower benefits of a new corn variety in the gene technology condition compared with the perceptions of the participants assigned to the traditional breeding condition. The strategy to increase the acceptance of gene technology by introducing plant varieties that better address consumer and producer needs may not work because people discount its associated benefits. PMID:26505287

  20. Visual learning in the perception of texture: simple and contingent aftereffects of texture density.

    PubMed

    Durgin, F H; Proffitt, D R

    1996-01-01

    Novel results elucidating the magnitude, binocularity and retinotopicity of aftereffects of visual texture density adaptation are reported as is a new contingent aftereffect of texture density which suggests that the perception of visual texture density is quite malleable. Texture aftereffects contingent upon orientation, color and temporal sequence are discussed. A fourth effect is demonstrated in which auditory contingencies are shown to produce a different kind of visual distortion. The merits and limitations of error-correction and classical conditioning theories of contingent adaptation are reviewed. It is argued that a third kind of theory which emphasizes coding efficiency and informational considerations merits close attention. It is proposed that malleability in the registration of texture information can be understood as part of the functional adaptability of perception.

  1. Illusions of having small or large invisible bodies influence visual perception of object size

    PubMed Central

    van der Hoort, Björn; Ehrsson, H. Henrik

    2016-01-01

    The size of our body influences the perceived size of the world so that objects appear larger to children than to adults. The mechanisms underlying this effect remain unclear. It has been difficult to dissociate visual rescaling of the external environment based on an individual’s visible body from visual rescaling based on a central multisensory body representation. To differentiate these potential causal mechanisms, we manipulated body representation without a visible body by taking advantage of recent developments in body representation research. Participants experienced the illusion of having a small or large invisible body while object-size perception was tested. Our findings show that the perceived size of test-objects was determined by the size of the invisible body (inverse relation), and by the strength of the invisible body illusion. These findings demonstrate how central body representation directly influences visual size perception, without the need for a visible body, by rescaling the spatial representation of the environment. PMID:27708344

  2. Development of Visual Motion Perception for Prospective Control: Brain and Behavioral Studies in Infants

    PubMed Central

    Agyei, Seth B.; van der Weel, F. R. (Ruud); van der Meer, Audrey L. H.

    2016-01-01

    During infancy, smart perceptual mechanisms develop allowing infants to judge time-space motion dynamics more efficiently with age and locomotor experience. This emerging capacity may be vital to enable preparedness for upcoming events and to be able to navigate in a changing environment. Little is known about brain changes that support the development of prospective control and about processes, such as preterm birth, that may compromise it. As a function of perception of visual motion, this paper will describe behavioral and brain studies with young infants investigating the development of visual perception for prospective control. By means of the three visual motion paradigms of occlusion, looming, and optic flow, our research shows the importance of including behavioral data when studying the neural correlates of prospective control. PMID:26903908

  3. Time course of visual perception: coarse-to-fine processing and beyond.

    PubMed

    Hegdé, Jay

    2008-04-01

    Our perception of a visual scene changes rapidly in time, even when the scene itself does not. It is increasingly clear that understanding how the visual percept changes in time is crucial to understanding how we see. We are still far from fully understanding the temporal changes in the visual percept and the neural mechanisms that underlie them. But recently, many disparate lines of evidence are beginning to converge to produce a complex but fuzzy picture of visual temporal dynamics. It is clear, largely from psychophysical studies in humans, that one can get the 'gist' of complex visual scenes within about 150 ms after the stimulus onset, even when the stimulus itself is presented as briefly as 10 ms or so. It generally takes longer processing, if not longer stimulus presentation, to identify individual objects. It may take even longer for a fuller semantic understanding, or awareness, of the scene to emerge and be encoded in short-term memory. Microelectrode recording studies in monkeys, along with neuroimaging studies mostly in humans, have elucidated many important temporal dynamic phenomena at the level of individual neurons and neuronal populations. Many of the temporal changes at the perceptual and the neural levels can be captured by the multifaceted and somewhat ambiguous concept of coarse-to-fine processing, although it is clear that not all temporal changes can be characterized this way. A more comprehensive, albeit unproven, alternative framework for understanding visual temporal dynamics is to view it as a sequential, Bayesian decision-making process. At each step, the visual system infers the likely nature of the visual scene by jointly evaluating the available processed image information and prior knowledge about the scene, including prior inferences. Whether the processing proceeds in a coarse-to-fine fashion depends largely on whether the underlying computations are hierarchical or not. Characterizing these inferential steps from the computational
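
    A minimal sketch of the sequential Bayesian view described above (the candidate scene interpretations and likelihood values are invented for illustration): at each step the previous posterior becomes the prior and is combined with the likelihood of the newly available, progressively finer evidence.

      import numpy as np

      scenes = ["beach", "street", "forest"]
      posterior = np.array([1 / 3, 1 / 3, 1 / 3])   # flat prior before any evidence

      evidence_likelihoods = [
          np.array([0.50, 0.20, 0.30]),   # coarse gist (global colour and layout)
          np.array([0.60, 0.10, 0.30]),   # intermediate detail
          np.array([0.80, 0.05, 0.15]),   # fine, object-level detail
      ]

      for step, likelihood in enumerate(evidence_likelihoods, start=1):
          posterior = posterior * likelihood        # prior inference combined with new evidence
          posterior /= posterior.sum()
          print("step %d: %s" % (step, ", ".join("%s=%.2f" % (s, p) for s, p in zip(scenes, posterior))))

    Whether the resulting trajectory looks coarse-to-fine then depends on the order in which the evidence becomes available, echoing the point above that this hinges on whether the underlying computations are hierarchical.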

  5. Proprioceptive body illusions modulate the visual perception of reaching distance.

    PubMed

    Petroni, Agustin; Carbajal, M Julia; Sigman, Mariano

    2015-01-01

    The neurobiology of reaching has been extensively studied in human and non-human primates. However, the mechanisms that allow a subject to decide-without engaging in explicit action-whether an object is reachable are not fully understood. Some studies conclude that decisions near the reach limit depend on motor simulations of the reaching movement. Others have shown that the body schema plays a role in explicit and implicit distance estimation, especially after motor practice with a tool. In this study we evaluate the causal role of multisensory body representations in the perception of reachable space. We reasoned that if body schema is used to estimate reach, an illusion of the finger size induced by proprioceptive stimulation should propagate to the perception of reaching distances. To test this hypothesis we induced a proprioceptive illusion of extension or shrinkage of the right index finger while participants judged a series of LEDs as reachable or non-reachable without actual movement. Our results show that reach distance estimation depends on the illusory perceived size of the finger: illusory elongation produced a shift of reaching distance away from the body whereas illusory shrinkage produced the opposite effect. Combining these results with previous findings, we suggest that deciding if a target is reachable requires an integration of body inputs in high order multisensory parietal areas that engage in movement simulations through connections with frontal premotor areas. PMID:26110274

  6. Visual perception in the brain of a jumping spider.

    PubMed

    Menda, Gil; Shamble, Paul S; Nitzany, Eyal I; Golden, James R; Hoy, Ronald R

    2014-11-01

    Jumping spiders (Salticidae) are renowned for a behavioral repertoire that can seem more vertebrate, or even mammalian, than spider-like in character. This is made possible by a unique visual system that supports their stalking hunting style and elaborate mating rituals in which the bizarrely marked and colored appendages of males highlight their song-and-dance displays. Salticids perform these tasks with information from four pairs of functionally specialized eyes, providing a near 360° field of view and forward-looking spatial resolution surpassing that of all insects and even some mammals, processed by a brain roughly the size of a poppy seed. Salticid behavior, evolution, and ecology are well documented, but attempts to study the neurophysiological basis of their behavior had been thwarted by the pressurized nature of their internal body fluids, making typical physiological techniques infeasible and restricting all previous neural work in salticids to a few recordings from the eyes. We report the first survey of neurophysiological recordings from the brain of a jumping spider, Phidippus audax (Salticidae). The data include single-unit recordings in response to artificial and naturalistic visual stimuli. The salticid visual system is unique in that high-acuity and motion vision are processed by different pairs of eyes. We found nonlinear interactions between the principal and secondary eyes, which can be inferred from the emergence of spatiotemporal receptive fields. Ecologically relevant images, including prey-like objects such as flies, elicited bursts of excitation from single units. PMID:25308077

  7. Anode heel affect in thoracic radiology: a visual grading analysis

    NASA Astrophysics Data System (ADS)

    Mearon, T.; Brennan, P. C.

    2006-03-01

    For decades, the antero-posterior (AP) projection of the thoracic spine has represented a substantial challenge. Patient thickness varies substantially along the cranio-caudal axis, resulting in images that are too dark for the upper vertebrae and too light, or with excessive quantum mottle, towards the 9th to 12th thoracic vertebra. The anode heel effect is a well-known phenomenon; however, there is a paucity of reports demonstrating its exploitation in clinical departments for optimising images. The current work, using an adult, tissue-equivalent anthropomorphic phantom, explores whether appropriate positioning of the anode can improve image quality for thoracic spine radiology. At each of five kVp settings (70, 81, 90, 102, and 109), thirty AP thoracic spine images were produced, 15 with the anode end of the tube towards the cranial part of the phantom and 15 with the anode end of the tube facing caudally. Visual grading analysis of the resultant images demonstrated significant improvements in overall image quality and visualisation of specific anatomical features for the cranially facing anode compared with the alternative position, which were most pronounced for the 1st to 4th and 9th to 12th vertebrae. These improvements were evident at 70, 81 and 90 kVp, but not for the higher beam energies. The results demonstrate that correct positioning of the X-ray tube can improve image quality for thoracic radiology at specific tube potentials. Further work is ongoing to investigate whether this easy-to-implement and cost-free technique can be employed for other examinations.

  8. Does Viewing Documentary Films Affect Environmental Perceptions and Behaviors?

    ERIC Educational Resources Information Center

    Janpol, Henry L.; Dilts, Rachel

    2016-01-01

    This research explored whether viewing documentary films about the natural or built environment can exert a measurable influence on behaviors and perceptions. Different documentary films were viewed by subjects. One film emphasized the natural environment, while the other focused on the built environment. After viewing a film, a computer game…

  9. Negative Affect, Risk Perception, and Adolescent Risk Behavior

    ERIC Educational Resources Information Center

    Curry, Laura A.; Youngblade, Lise M.

    2006-01-01

    The prevalence, etiology, and consequences of adolescent risk behavior have stimulated much research. The current study examined relationships among anger and depressive symptomatology (DS), risk perception, self-restraint, and adolescent risk behavior. Telephone surveys were conducted with 290 14- to 20-year-olds (173 females; M = 15.98 years).…

  10. Teacher Perceptions Affect Boys' and Girls' Reading Motivation Differently

    ERIC Educational Resources Information Center

    Boerma, Inouk E.; Mol, Suzanne E.; Jolles, Jelle

    2016-01-01

    The aim of this study was to examine the relationship between teacher perceptions and children's reading motivation, with specific attention to gender differences. The reading self-concept, task value, and attitude of 160 fifth and sixth graders were measured. Teachers rated each student's reading comprehension. Results showed that for boys,…

  11. Specific Previous Experience Affects Perception of Harmony and Meter

    ERIC Educational Resources Information Center

    Creel, Sarah C.

    2011-01-01

    Prior knowledge shapes our experiences, but which prior knowledge shapes which experiences? This question is addressed in the domain of music perception. Three experiments were used to determine whether listeners activate specific musical memories during music listening. Each experiment provided listeners with one of two musical contexts that was…

  12. Visual perception in space and time--mapping the visual field of temporal resolution.

    PubMed

    Poggel, Dorothe A; Strasburger, Hans

    2004-01-01

    To characterize temporal aspects of information processing in the human visual field, we studied the topographical distribution of temporal and non-temporal performance parameters in 95 normally sighted subjects. Visual field maps of double-pulse resolution thresholds (DPR) (the minimum detectable temporal gap between two light stimuli) and simple visual reaction times (RT) (measuring the speed of reaction to a light stimulus) were compared to maps of luminance thresholds determined by standard perimetry. Thus, for the first time, the topography of a visual variable without temporal constraints (perimetry) could be compared to visual variables in the temporal domain, with (RT) and without (DPR) motor reaction. The goal of the study was to obtain and to describe the pattern of co-variation of performance indicators. In all three measures, performance was best in the central visual field and dropped significantly towards the periphery. Although the correlation between DPR and RT was significant, shared variance was low, and we observed large topographical differences between these two temporal-performance variables. In contrast, DPR and perimetric thresholds correlated more substantially, and visual field maps were similar. The Gestalt of DPR maps shares characteristics of basic visual processing (e.g., light sensitivity), but it also reflects top-down influences, i.e., from spatial attention. Although the correlation between DPR and RT suggests common characteristics between these two temporal variables, the topographic distributions reveal significant differences, indicating separate underlying processing mechanisms. PMID:15283484
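
    A small worked illustration of the distinction drawn above between a significant correlation and low shared variance (the data here are simulated, not the study's measurements): the proportion of variance two measures share is the squared correlation coefficient.

      import numpy as np

      rng = np.random.default_rng(0)
      dpr = rng.normal(50.0, 10.0, 500)                # simulated double-pulse thresholds (ms)
      rt = 0.3 * dpr + rng.normal(300.0, 30.0, 500)    # weakly related simulated reaction times (ms)

      r = np.corrcoef(dpr, rt)[0, 1]
      print("r = %.2f, shared variance r^2 = %.1f%%" % (r, 100 * r ** 2))

    With a correlation of about 0.3, for example, only roughly 9% of the variance is shared, which is how two measures can correlate reliably yet still map quite differently across the visual field.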

  13. Towards Evidence of Visual Literacy: Assessing Pre-Service Teachers' Perceptions of Instructional Visuals

    ERIC Educational Resources Information Center

    Yeh, Hsin-Te; Lohr, Linda

    2010-01-01

    This research describes how eight pre-service teachers interpreted and analyzed instructional visuals, defined visual literacy, and assessed the effectiveness of four instructional images. A phenomenological procedure was used to collect, organize, and analyze the data. Findings suggest that pre-service teachers in this study were able to define…

  14. Diabetes reduces the cognitive function with the decrease of the visual perception and visual motor integration in male older adults.

    PubMed

    Yun, Hyo-Soon; Kim, Eunhwi; Suh, Soon-Rim; Kim, Mi-Han; Kim, Hong

    2013-01-01

    This study investigated the influence of diabetes on cognitive decline between the diabetes and non-diabetes patients and identified the associations between diabetes and cognitive function, visual perception (VP), and visual motor integration (VMI). Sixty elderly men (67.10 ± 1.65 yr) with and without diabetes (n = 30 in each group) who were surveyed by interview and questionnaire in South Korea were enrolled in this study. The score of Mini-Mental State Examination of Korean version (MMSE-KC), Motor-free Visual Perception Test-Vertical Format (MVPT-V), and Visual-Motor Integration 3rd Revision (VMI-3R) were assessed in all of the participants to evaluate cognitive function, VP, and VMI in each. The score of MMSE-KC in the diabetic group was significantly lower than that of the non-diabetes group (P < 0.01). Participants in the diabetes group also had lower MVPT-V and VMI-3R scores than those in the non-diabetes group (P < 0.01, respectively). Especially, the scores of figure-ground and visual memory among the subcategories of MVPT-V were significantly lower in the diabetes group than in the non-diabetes group (P < 0.01). These findings indicate that the decline in cognitive function in individuals with diabetes may be greater than that in non-diabetics. In addition, the cognitive decline in older adults with diabetes might be associated with the decrease of VP and VMI. In conclusion, we propose that VP and VMI will be helpful to monitor the change of cognitive function in older adults with diabetes as part of the routine management of diabetes-induced cognitive declines. PMID:24282807

  15. Comparative studies of color fields, visual acuity fields, and movement perception limits among varsity athletes and non-varsity groups.

    PubMed

    Mizusawa, K; Sweeting, R L; Knouse, S B

    1983-06-01

    This paper examined effects of sports practice on patterns of color fields, limits of peripheral movement perception, and visual acuity field by comparing varsity ball players and non-varsity control groups. The first study measured extent of color fields and limits of horizontal and vertical meridians for peripheral movement perception of 139 college students. The second study tested visual acuity fields of female and male basketball players and female and male controls. The first study indicated that athletes had wider limits for horizontal movement perception, while the non-athletes had better vertical movement perception limits. Basketball players demonstrated color fields and limits for peripheral movement perception superior to those of soccer players. In the second study, athletes did not have any wider visual acuity fields than non-athletes, but their movement-perception limits were significantly wider than those of non-athletes.

  16. The perceptual root of object-based storage: an interactive model of perception and visual working memory.

    PubMed

    Gao, Tao; Gao, Zaifeng; Li, Jie; Sun, Zhongqiang; Shen, Mowei

    2011-12-01

    Mainstream theories of visual perception assume that visual working memory (VWM) is critical for integrating online perceptual information and constructing coherent visual experiences in changing environments. Given the dynamic interaction between online perception and VWM, we propose that how visual information is processed during visual perception can directly determine how the information is going to be selected, consolidated, and maintained in VWM. We demonstrate the validity of this hypothesis by investigating what kinds of perceptual information can be stored as integrated objects in VWM. Three criteria for object-based storage are introduced: (a) automatic selection of task-irrelevant features, (b) synchronous consolidation of multiple features, and (c) stable maintenance of feature conjunctions. The results show that the outputs of parallel perception meet all three criteria, as opposed to the outputs of serial attentive processing, which fail all three criteria. These results indicate that (a) perception and VWM are not two sequential processes, but are dynamically intertwined; (b) there are dissociated mechanisms in VWM for storing information identified at different stages of perception; and (c) the integrated object representations in VWM originate from the "preattentive" or "proto" objects created by parallel perception. These results suggest how visual perception, attention, and VWM can be explained by a unified framework.

  17. Visual perception and stereoscopic imaging: an artist's perspective

    NASA Astrophysics Data System (ADS)

    Mason, Steve

    2015-03-01

    This paper continues my February 2014 IS&T/SPIE Convention exploration into the relationship of stereoscopic vision and consciousness (90141F-1). It was proposed then that by using stereoscopic imaging people may consciously experience, or see, what they are viewing, and thereby become more aware of the way their brains manage and interpret visual information. Environmental imaging was suggested as a way to accomplish this. This paper is the result of further investigation, research, and follow-up imaging. A show of images that resulted from this research allows viewers to experience for themselves the effects of stereoscopy on consciousness. Creating dye-infused aluminum prints while employing ChromaDepth® 3D glasses, I hope to not only raise awareness of visual processing but also explore the differences and similarities between the artist and scientist: art increases right brain spatial consciousness, not only empirical thinking, while furthering the viewer's cognizance of the process of seeing. The artist must abandon preconceptions and expectations, despite what evidence and experience may indicate, in order to see what is happening in his work and to allow it to develop in ways he/she could never anticipate. This process is then revealed to the viewer in a show of work. It is in the experiencing, not just the thinking, that insight is achieved. Directing the viewer's awareness during the experience using stereoscopic imaging allows for further understanding of the brain's function in the visual process. A cognitive transformation occurs, the proverbial "left/right brain shift," in order for viewers to "see" the space. Using what we know from recent brain research, these images will draw from certain parts of the brain when viewed in two dimensions and different ones when viewed stereoscopically, a shift, if one is looking for it, which is quite noticeable. People who have experienced these images in the context of examining their own

  18. Turning body and self inside out: visualized heartbeats alter bodily self-consciousness and tactile perception.

    PubMed

    Aspell, Jane Elizabeth; Heydrich, Lukas; Marillier, Guillaume; Lavanchy, Tom; Herbelin, Bruno; Blanke, Olaf

    2013-12-01

    Prominent theories highlight the importance of bodily perception for self-consciousness, but it is currently not known whether bodily perception is based on interoceptive or exteroceptive signals or on integrated signals from these anatomically distinct systems. In the research reported here, we combined both types of signals by surreptitiously providing participants with visual exteroceptive information about their heartbeat: A real-time video image of a periodically illuminated silhouette outlined participants' (projected, "virtual") bodies and flashed in synchrony with their heartbeats. We investigated whether these "cardio-visual" signals could modulate bodily self-consciousness and tactile perception. We report two main findings. First, synchronous cardio-visual signals increased self-identification with and self-location toward the virtual body, and second, they altered the perception of tactile stimuli applied to participants' backs so that touch was mislocalized toward the virtual body. We argue that the integration of signals from the inside and the outside of the human body is a fundamental neurobiological process underlying self-consciousness.

  20. Confinement has no effect on visual space perception: The results of the Mars-500 experiment.

    PubMed

    Sikl, Radovan; Simeček, Michal

    2014-02-01

    People confined to a closed space live in a visual environment that differs from a natural open-space environment in several respects. The view is restricted to no more than a few meters, and nearby objects cannot be perceived relative to the position of a horizon. Thus, one might expect to find changes in visual space perception as a consequence of the prolonged experience of confinement. The subjects in our experimental study were participants of the Mars-500 project and spent nearly a year and a half isolated from the outside world during a simulated mission to Mars. The participants were presented with a battery of computer-based psychophysical tests examining their performance on various 3-D perception tasks, and we monitored changes in their perceptual performance throughout their confinement. Contrary to our expectations, no serious effect of the confinement on the crewmembers' 3-D perception was observed in any experiment. Several interpretations of these findings are discussed, including the possibilities that (1) the crewmembers' 3-D perception really did not change significantly, (2) changes in 3-D perception were manifested in the precision rather than the accuracy of perceptual judgments, and/or (3) the experimental conditions and the group sample were problematic. PMID:24288139

  1. Acute Zonal Occult Outer Retinopathy in Japanese Patients: Clinical Features, Visual Function, and Factors Affecting Visual Function

    PubMed Central

    Saito, Saho; Saito, Wataru; Saito, Michiyuki; Hashimoto, Yuki; Mori, Shohei; Noda, Kousuke; Namba, Kenichi; Ishida, Susumu

    2015-01-01

    Purpose To evaluate the clinical features and investigate their relationship with visual function in Japanese patients with acute zonal occult outer retinopathy (AZOOR). Methods Fifty-two eyes of 38 Japanese AZOOR patients (31 female and 7 male patients; mean age at first visit, 35.0 years; median follow-up duration, 31 months) were retrospectively collected: 31 untreated eyes with good visual acuity and 21 systemic corticosteroid-treated eyes with progressive visual acuity loss. Variables affecting the logMAR values of best-corrected visual acuity (BCVA) and the mean deviation (MD) on Humphrey perimetry at initial and final visits were examined using multiple stepwise linear regression analysis. Results In untreated eyes, the mean MD at the final visit was significantly higher than that at the initial visit (P = 0.00002). In corticosteroid-treated eyes, the logMAR BCVA and MD at the final visit were significantly better than the initial values (P = 0.007 and P = 0.02, respectively). The final logMAR BCVA was 0.0 or less in 85% of patients. Variables affecting initial visual function were moderate anterior vitreous cells, myopia severity, and a-wave amplitudes on electroretinography; factors affecting final visual function were the initial MD values, female sex, moderate anterior vitreous cells, and retinal atrophy. Conclusions Our data indicated that visual functions in enrolled patients significantly improved spontaneously or after systemic corticosteroids therapy, suggesting that Japanese patients with AZOOR have good visual outcomes during the follow-up period of this study. Furthermore, initial visual field defects, gender, anterior vitreous cells, and retinal atrophy affected final visual functions in these patients. PMID:25919689
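
    A minimal sketch of forward stepwise selection of the general kind referred to above (this is not the authors' analysis; the candidate predictors, the synthetic data, and the stopping rule are illustrative only):

      import numpy as np

      def rss(X, y):
          # Residual sum of squares of an ordinary least-squares fit with an intercept.
          A = np.column_stack([np.ones(len(y)), X])
          beta, *_ = np.linalg.lstsq(A, y, rcond=None)
          return float(np.sum((y - A @ beta) ** 2))

      rng = np.random.default_rng(1)
      n = 52
      candidates = {
          "initial_MD": rng.normal(-8.0, 4.0, n),
          "age": rng.normal(35.0, 10.0, n),
          "myopia": rng.normal(-4.0, 2.0, n),
      }
      # Synthetic outcome: the final mean deviation is driven mainly by the initial MD.
      y = 0.8 * candidates["initial_MD"] + 0.02 * candidates["age"] + rng.normal(0.0, 1.0, n)

      selected, remaining = [], list(candidates)
      best = rss(np.empty((n, 0)), y)                   # intercept-only fit
      while remaining:
          trial = {name: rss(np.column_stack([candidates[v] for v in selected + [name]]), y)
                   for name in remaining}
          name, score = min(trial.items(), key=lambda kv: kv[1])
          if score > 0.95 * best:                       # stop when the improvement is negligible (ad hoc rule)
              break
          selected.append(name)
          remaining.remove(name)
          best = score

      print("selected predictors:", selected)

    Published stepwise procedures typically use F-tests or information criteria rather than this ad hoc threshold, but the enter-one-variable-at-a-time logic is the same.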

  2. Transient perceptual neglect: visual working memory load affects conscious object processing.

    PubMed

    Emrich, Stephen M; Burianová, Hana; Ferber, Susanne

    2011-10-01

    Visual working memory (VWM) is a capacity-limited cognitive resource that plays an important role in complex cognitive behaviors. Recent studies indicate that regions subserving VWM may play a role in the perception and recognition of visual objects, suggesting that conscious object perception may depend on the same cognitive and neural architecture that supports the maintenance of visual object information. In the present study, we examined this question by testing object processing under a concurrent VWM load. Under a high VWM load, recognition was impaired for objects presented in the left visual field, in particular when two objects were presented simultaneously. Multivariate fMRI revealed that two independent but partially overlapping networks of brain regions contribute to object recognition. The first network consisted of regions involved in VWM encoding and maintenance. Importantly, these regions were also sensitive to object load. The second network comprised regions of the ventral temporal lobes traditionally associated with object recognition. Importantly, activation in both networks predicted object recognition performance. These results indicate that information processing in regions that mediate VWM may be critical to conscious visual perception. Moreover, the observation of a hemifield asymmetry in object recognition performance has important theoretical and clinical significance for the study of visual neglect.

  3. Preserved implicit form perception and orientation adaptation in visual form agnosia.

    PubMed

    Yang, Jiongjiong; Wu, Ming; Shen, Zheng

    2006-01-01

    Visual form agnosia is mainly characterized by profound deficits in visual form and shape discrimination. Previous studies have shown that patients retain the capacity for coordinated motor behaviors, color naming and implicit letter perception. However, it is unknown to what extent other visual functions, such as implicit form and orientation perception, are preserved. To address these questions, we investigated a single visual form agnosic patient, X.F., in two distinct experiments. X.F.'s visual lesions were mainly localized in the bilateral occipitotemporal cortex, with the dorsal visual stream and early visual cortex largely spared. In Experiment 1, X.F. named the color of different forms across 12 blocks of trials. After the first six blocks, the combinations of a form with its color were changed and the new combination was presented for the remaining six blocks. X.F.'s reaction time increased during the switch block and was significantly greater than the overall RT changes between adjacent, non-switch blocks. This indicates that X.F. retained the ability to perceive changes in form despite her inability to discriminate the forms. In Experiment 2, X.F. showed selective orientation adaptation effects to different spatial frequencies; that is, her contrast threshold was significantly higher when the adapting and test orientations were the same than when they were orthogonal, although her orientation discrimination performance was severely impaired. These data provide evidence of a functional dissociation between explicit and implicit visual abilities, and suggest that the residual early visual cortex mediates form and orientation processing in the absence of awareness.

  4. Visual depth perception in normal and deprived rats: effects of environmental enrichment.

    PubMed

    Baroncelli, L; Braschi, C; Maffei, L

    2013-04-16

    A proper maturation of stereoscopic functions requires binocular visual experience and early disruption of sensory-driven activity can result in long-term or even permanent visual function impairment. Amblyopia is one paradigmatic case of visual system disorder, with early conditions of functional imbalance between the two eyes leading to severe deficits of visual acuity and depth-perception abilities. In parallel to the reduction of neural plasticity levels, the brain potential for functional recovery declines with age. Recent evidence has challenged this traditional view and experimental paradigms enhancing experience-dependent plasticity in the adult brain have been described. Here, we show that environmental enrichment (EE), a condition of increased cognitive and sensory-motor stimulation, restores experience-dependent plasticity of stereoscopic perception in response to sensory deprivation well after the end of the critical period and reinstates depth-perception abilities of adult amblyopic animals in the range of normal values. Our results encourage efforts in the clinical application of paradigms based on EE as an intervention strategy for treating amblyopia in adulthood.

  5. The Development of Face Perception in Infancy: Intersensory Interference and Unimodal Visual Facilitation

    PubMed Central

    Bahrick, Lorraine E.; Lickliter, Robert; Castellanos, Irina

    2014-01-01

    Although research has demonstrated impressive face perception skills of young infants, little attention has focused on conditions that enhance versus impair infant face perception. The present studies tested the prediction, generated from the Intersensory Redundancy Hypothesis (IRH), that face discrimination, which relies on detection of visual featural information, would be impaired in the context of intersensory redundancy provided by audiovisual speech, and enhanced in the absence of intersensory redundancy (unimodal visual and asynchronous audiovisual speech) in early development. Later in development, following improvements in attention, faces should be discriminated in both redundant audiovisual and nonredundant stimulation. Results supported these predictions. Two-month-old infants discriminated a novel face in unimodal visual and asynchronous audiovisual speech but not in synchronous audiovisual speech. By 3 months, face discrimination was evident even during synchronous audiovisual speech. These findings indicate that infant face perception is enhanced and emerges developmentally earlier following unimodal visual than synchronous audiovisual exposure and that intersensory redundancy generated by naturalistic audiovisual speech can interfere with face processing. PMID:23244407

  6. Augmented depth perception visualization in 2D/3D image fusion.

    PubMed

    Wang, Jian; Kreiser, Matthias; Wang, Lejing; Navab, Nassir; Fallavollita, Pascal

    2014-12-01

    2D/3D image fusion applications are widely used in endovascular interventions. Complaints from interventionists about existing state-of-the-art visualization software are usually related to the strong compromise between 2D and 3D visibility or the lack of depth perception. In this paper, we investigate several concepts enabling improvement of current image fusion visualization found in the operating room. First, a contour-enhanced visualization is used to circumvent hidden information in the X-ray image. Second, an occlusion and depth color-coding scheme is considered to improve depth perception. To validate our visualization technique, both phantom and clinical data are considered. An evaluation is performed in the form of a questionnaire that included 24 participants: ten clinicians and fourteen non-clinicians. Results indicate that the occlusion correction method provides 100% correctness when determining the true position of an aneurysm in X-ray. Further, when integrating an RGB or RB color-depth encoding in the image fusion, both perception and intuitiveness are improved.
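
    The abstract refers to an occlusion and depth color-coding scheme for the fused overlay. The exact scheme is not specified here, so the following is only a generic sketch of the underlying idea: mapping a normalized depth value per overlay pixel to a red (near) to blue (far) color ramp.

        # Hedged sketch: encode per-pixel depth of a 3D overlay as a red-to-blue ramp
        # (near = red, far = blue). A generic illustration, not the authors' scheme.
        import numpy as np

        def depth_to_rgb(depth, d_min=None, d_max=None):
            """Map a 2D depth map to an (H, W, 3) RGB image in [0, 1]."""
            d_min = np.nanmin(depth) if d_min is None else d_min
            d_max = np.nanmax(depth) if d_max is None else d_max
            t = np.clip((depth - d_min) / (d_max - d_min + 1e-9), 0.0, 1.0)
            rgb = np.zeros(depth.shape + (3,))
            rgb[..., 0] = 1.0 - t  # red channel dominates for near structures
            rgb[..., 2] = t        # blue channel dominates for far structures
            return rgb

        depth = np.linspace(0.0, 100.0, 16).reshape(4, 4)  # synthetic depth map (mm)
        rgb = depth_to_rgb(depth)
        print(rgb[0, 0], rgb[3, 3])  # nearest pixel ~red, farthest pixel ~blue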

  7. Perception and performance in flight simulators: The contribution of vestibular, visual, and auditory information

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The pilot's perception and performance in flight simulators are examined. The areas investigated include: vestibular stimulation, flight management and man-cockpit information interfacing, and visual perception in flight simulation. The effects of higher levels of rotary acceleration on response time to constant acceleration, tracking performance, and thresholds for angular acceleration are examined. Areas of flight management examined are cockpit display of traffic information, workload, synthetic speech callouts during the landing phase of flight, perceptual factors in the use of a microwave landing system, automatic speech recognition, automation of aircraft operation, and total simulation of flight training.

  8. On the relationship between personal experience, affect and risk perception: The case of climate change

    PubMed Central

    van der Linden, Sander

    2014-01-01

    Examining the conceptual relationship between personal experience, affect, and risk perception is crucial in improving our understanding of how emotional and cognitive process mechanisms shape public perceptions of climate change. This study is the first to investigate the interrelated nature of these variables by contrasting three prominent social-psychological theories. In the first model, affect is viewed as a fast and associative information processing heuristic that guides perceptions of risk. In the second model, affect is seen as flowing from cognitive appraisals (i.e., affect is thought of as a post-cognitive process). Lastly, a third, dual-process model is advanced that integrates aspects from both theoretical perspectives. Four structural equation models were tested on a national sample (N = 808) of British respondents. Results initially provide support for the “cognitive” model, where personal experience with extreme weather is best conceptualized as a predictor of climate change risk perception and, in turn, risk perception a predictor of affect. Yet, closer examination strongly indicates that at the same time, risk perception and affect reciprocally influence each other in a stable feedback system. It is therefore concluded that both theoretical claims are valid and that a dual-process perspective provides a superior fit to the data. Implications for theory and risk communication are discussed. © 2014 The Authors. European Journal of Social Psychology published by John Wiley & Sons, Ltd. PMID:25678723

  9. Visual Perception and Regulatory Conflict: Motivation and Physiology Influence Distance Perception

    ERIC Educational Resources Information Center

    Cole, Shana; Balcetis, Emily; Zhang, Sam

    2013-01-01

    Regulatory conflict can emerge when people experience a strong motivation to act on goals but a conflicting inclination to withhold action because physical resources available, or "physiological potentials", are low. This study demonstrated that distance perception is biased in ways that theory suggests assist in managing this conflict.…

  10. Effects of attention and perceptual uncertainty on cerebellar activity during visual motion perception.

    PubMed

    Baumann, Oliver; Mattingley, Jason B

    2014-02-01

    Recent clinical and neuroimaging studies have revealed that the human cerebellum plays a role in visual motion perception, but the nature of its contribution to this function is not understood. Some reports suggest that the cerebellum might facilitate motion perception by aiding attentive tracking of visual objects. Others have identified a particular role for the cerebellum in discriminating motion signals in perceptually uncertain conditions. Here, we used functional magnetic resonance imaging to determine the degree to which cerebellar involvement in visual motion perception can be explained by a role in sustained attentive tracking of moving stimuli in contrast to a role in visual motion discrimination. While holding the visual displays constant, we manipulated attention by having participants attend covertly to a field of random-dot motion or a colored spot at fixation. Perceptual uncertainty was manipulated by varying the percentage of signal dots contained within the random-dot arrays. We found that attention to motion under high perceptual uncertainty was associated with strong activity in left cerebellar lobules VI and VII. By contrast, attending to motion under low perceptual uncertainty did not cause differential activation in the cerebellum. We found no evidence to support the suggestion that the cerebellum is involved in simple attentive tracking of salient moving objects. Instead, our results indicate that specific subregions of the cerebellum are involved in facilitating the detection and discrimination of task-relevant moving objects under conditions of high perceptual uncertainty. We conclude that the cerebellum aids motion perception under conditions of high perceptual demand. PMID:23982589
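
    Perceptual uncertainty in this study was controlled by the percentage of coherently moving signal dots. As a rough illustration of how such a random-dot kinematogram is typically updated frame to frame (a generic sketch, not the authors' stimulus code):

        # Hedged sketch: one update step of a random-dot motion stimulus in which a given
        # fraction of "signal" dots moves in a common direction and the rest move randomly.
        import numpy as np

        def update_dots(xy, coherence, direction_deg, speed, rng):
            """xy: (N, 2) dot positions in [0, 1)^2; returns displaced positions."""
            n = xy.shape[0]
            signal = rng.random(n) < coherence          # e.g. coherence = 0.04 ... 0.75
            angles = rng.uniform(0, 2 * np.pi, n)       # noise dots: random directions
            angles[signal] = np.deg2rad(direction_deg)  # signal dots: common direction
            step = speed * np.column_stack([np.cos(angles), np.sin(angles)])
            return (xy + step) % 1.0                    # wrap dots around the aperture

        rng = np.random.default_rng(1)
        dots = rng.random((200, 2))
        dots = update_dots(dots, coherence=0.30, direction_deg=0, speed=0.01, rng=rng)
        print(dots.shape)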

  11. Global motion perception deficits in autism are reflected as early as primary visual cortex.

    PubMed

    Robertson, Caroline E; Thomas, Cibu; Kravitz, Dwight J; Wallace, Gregory L; Baron-Cohen, Simon; Martin, Alex; Baker, Chris I

    2014-09-01

    Individuals with autism are often characterized as 'seeing the trees, but not the forest'-attuned to individual details in the visual world at the expense of the global percept they compose. Here, we tested the extent to which global processing deficits in autism reflect impairments in (i) primary visual processing; or (ii) decision-formation, using an archetypal example of global perception, coherent motion perception. In an event-related functional MRI experiment, 43 intelligence quotient and age-matched male participants (21 with autism, age range 15-27 years) performed a series of coherent motion perception judgements in which the amount of local motion signals available to be integrated into a global percept was varied by controlling stimulus viewing duration (0.2 or 0.6 s) and the proportion of dots moving in the correct direction (coherence: 4%, 15%, 30%, 50%, or 75%). Both typical participants and those with autism evidenced the same basic pattern of accuracy in judging the direction of motion, with performance decreasing with reduced coherence and shorter viewing durations. Critically, these effects were exaggerated in autism: despite equal performance at the long duration, performance was more strongly reduced by shortening viewing duration in autism (P < 0.015) and decreasing stimulus coherence (P < 0.008). To assess the neural correlates of these effects we focused on the responses of primary visual cortex and the middle temporal area, critical in the early visual processing of motion signals, as well as a region in the intraparietal sulcus thought to be involved in perceptual decision-making. The behavioural results were mirrored in both primary visual cortex and the middle temporal area, with a greater reduction in response at short, compared with long, viewing durations in autism compared with controls (both P < 0.018). In contrast, there was no difference between the groups in the intraparietal sulcus (P > 0.574). These findings suggest that reduced

  12. Implied Dynamics Biases the Visual Perception of Velocity

    PubMed Central

    La Scaleia, Barbara; Zago, Myrka; Moscatelli, Alessandro; Lacquaniti, Francesco; Viviani, Paolo

    2014-01-01

    We expand the anecdotal report by Johansson that back-and-forth linear harmonic motions appear uniform. Six experiments explore the role of shape and spatial orientation of the trajectory of a point-light target in the perceptual judgment of uniform motion. In Experiment 1, the target oscillated back-and-forth along a circular arc around an invisible pivot. The imaginary segment from the pivot to the midpoint of the trajectory could be oriented vertically downward (consistent with an upright pendulum), horizontally leftward, or vertically upward (upside-down). In Experiments 2 to 5, the target moved uni-directionally. The effect of suppressing the alternation of movement directions was tested with curvilinear (Experiments 2 and 3) or rectilinear (Experiments 4 and 5) paths. Experiment 6 replicated the upright condition of Experiment 1, but participants were asked to hold their gaze on a fixation point. When some features of the trajectory evoked the motion of either a simple pendulum or a mass-spring system, observers identified as uniform the kinematic profiles close to harmonic motion. The bias towards harmonic motion was most consistent in the upright orientation of Experiments 1 and 6. The bias disappeared when the stimuli were incompatible with both pendulum and mass-spring models (Experiments 3 to 5). The results are compatible with the hypothesis that the perception of dynamic stimuli is biased by the laws of motion obeyed by natural events, so that only natural motions appear uniform. PMID:24667578
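
    The key contrast is between constant-velocity motion and simple harmonic (pendulum-like) motion over the same path. A brief numerical sketch of the two kinematic profiles over one half-cycle, assuming an arbitrary path length and duration (not the experimental stimulus code):

        # Hedged sketch: position and speed of a point covering the same path segment with
        # (a) uniform velocity and (b) simple harmonic (pendulum-like) kinematics.
        import numpy as np

        L = 0.2   # path length (arbitrary units)
        T = 1.0   # duration of one half-cycle (s)
        t = np.linspace(0, T, 6)

        x_uniform = L * t / T                                # constant speed L / T
        x_harmonic = (L / 2) * (1 - np.cos(np.pi * t / T))   # harmonic: slow-fast-slow
        v_harmonic = (L * np.pi / (2 * T)) * np.sin(np.pi * t / T)

        for ti, xu, xh, vh in zip(t, x_uniform, x_harmonic, v_harmonic):
            print(f"t={ti:.2f}s  uniform x={xu:.3f}  harmonic x={xh:.3f}  harmonic v={vh:.3f}")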

  13. Visual perception of effervescence in champagne and other sparkling beverages.

    PubMed

    Liger-Belair, Gérard

    2010-01-01

    The so-called effervescence process, which enlivens champagne, sparkling wines, beers, and carbonated beverages in general, is the result of the fine interplay between CO₂-dissolved gas molecules, tiny air pockets trapped within microscopic particles during the pouring process, and some liquid properties. This chapter summarizes recent advances obtained during the last decade concerning the physicochemical processes behind the nucleation, rise, and burst of bubbles found in glasses poured with sparkling beverages. Those phenomena observed in close-up through high-speed photography are often visually appealing. Moreover, the kinetics of gas discharging from freshly poured glasses was monitored with time, whether champagne is served into a flute or into a coupe. The role of temperature was also examined. We hope that your enjoyment of champagne will be enhanced after reading this fully illustrated review dedicated to the deep beauties of nature often hidden behind many everyday phenomena. PMID:21092901

  14. Visual perception in aviation: Glide path performance during impoverished visual conditions

    NASA Astrophysics Data System (ADS)

    Gibb, Randall William

    Research has attempted to identify which visual cues are most salient for glide path (GP) performance during an approach to landing by a pilot flying in both rich and impoverished visual conditions. Numerous aviation accidents have occurred when a shallow GP was induced by a black hole illusion (BHI) or featureless terrain environment during night visual approaches to landing. Identifying the landing surface's orientation as well as size, distance, and depth cues is critical for a safe approach to landing. Twenty pilots accomplished simulated approaches while exposed to manipulated visual cues of horizon, runway length/width (ratio), random terrain objects, and approach lighting system (ALS) configurations. Participants were assessed on their performance relative to a 3 degree GP in terms of precision, bias, and stability in both degrees and altitude deviation over a distance of 5 nm (9.3 km) assessed at equal intervals to landing. Runway ratio and distance from the runway were the most dominant aspects of the visual scene that differentiated pilot performance and mediated other visual cues. The horizon was most influential for the first two-thirds of the approach and random terrain objects influenced the final portion. An ALS commonly used at airports today, mediated by a high runway ratio, induced shallow GPs; however, the worst GP performance, regardless of ratio, was produced by a combination ALS consisting of both side and approach lights. Pilot performance suggested a three-phase perceptual process, Assess-Act-React, used by pilots as they accumulated visual cues to guide their behavior. Perceptual learning demonstrated that despite recognizing the black hole approach, pilots confidently flew dangerously low but did improve with practice, implying that visual spatial disorientation education and training would be effective if accomplished in flight simulators.
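
    For reference, a 3 degree glide path ties altitude to distance from the runway by simple trigonometry; shallow, black-hole-type approaches fall below this line. A small worked calculation over the 5 nm assessment interval (standard geometry, not the study's analysis):

        # Hedged sketch: altitude (ft) on an ideal 3-degree glide path as a function of
        # distance to the runway threshold, evaluated at 1 nm intervals over 5 nm.
        import math

        GP_DEG = 3.0
        FT_PER_NM = 6076.12

        for d_nm in range(5, 0, -1):
            alt_ft = d_nm * FT_PER_NM * math.tan(math.radians(GP_DEG))
            print(f"{d_nm} nm out: ~{alt_ft:.0f} ft above threshold elevation")
        # Rule of thumb: about 318 ft per nm, so roughly 1590 ft at 5 nm.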

  15. Spatially specific vs. unspecific disruption of visual orientation perception using chronometric pre-stimulus TMS.

    PubMed

    de Graaf, Tom A; Duecker, Felix; Fernholz, Martin H P; Sack, Alexander T

    2015-01-01

    Transcranial magnetic stimulation (TMS) over occipital cortex can impair visual processing. Such "TMS masking" has repeatedly been shown at several stimulus onset asynchronies (SOAs), with TMS pulses generally applied after the onset of a visual stimulus. Following increased interest in the neuronal state-dependency of visual processing, we recently explored the efficacy of TMS at "negative SOAs", when no visual processing can yet occur. We could reveal pre-stimulus TMS disruption, with results moreover hinting at two separate mechanisms in occipital cortex biasing subsequent orientation perception. Here we extended this work, including a chronometric design to map the temporal dynamics of spatially specific and unspecific mechanisms of state-dependent visual processing, while moreover controlling for TMS-induced pupil covering. TMS pulses applied 60-40 ms prior to a visual stimulus decreased orientation processing independent of stimulus location, while a local suppressive effect was found for TMS applied 30-10 ms pre-stimulus. These results contribute to our understanding of spatiotemporal mechanisms in occipital cortex underlying the state-dependency of visual processing, providing a basis for future work to link pre-stimulus TMS suppression effects to other known visual biasing mechanisms.

  16. Spatially specific vs. unspecific disruption of visual orientation perception using chronometric pre-stimulus TMS

    PubMed Central

    de Graaf, Tom A.; Duecker, Felix; Fernholz, Martin H. P.; Sack, Alexander T.

    2015-01-01

    Transcranial magnetic stimulation (TMS) over occipital cortex can impair visual processing. Such “TMS masking” has repeatedly been shown at several stimulus onset asynchronies (SOAs), with TMS pulses generally applied after the onset of a visual stimulus. Following increased interest in the neuronal state-dependency of visual processing, we recently explored the efficacy of TMS at “negative SOAs”, when no visual processing can yet occur. We could reveal pre-stimulus TMS disruption, with results moreover hinting at two separate mechanisms in occipital cortex biasing subsequent orientation perception. Here we extended this work, including a chronometric design to map the temporal dynamics of spatially specific and unspecific mechanisms of state-dependent visual processing, while moreover controlling for TMS-induced pupil covering. TMS pulses applied 60–40 ms prior to a visual stimulus decreased orientation processing independent of stimulus location, while a local suppressive effect was found for TMS applied 30–10 ms pre-stimulus. These results contribute to our understanding of spatiotemporal mechanisms in occipital cortex underlying the state-dependency of visual processing, providing a basis for future work to link pre-stimulus TMS suppression effects to other known visual biasing mechanisms. PMID:25688194

  17. Fornix and medial temporal lobe lesions lead to comparable deficits in complex visual perception.

    PubMed

    Lech, Robert K; Koch, Benno; Schwarz, Michael; Suchan, Boris

    2016-05-01

    Recent research dealing with the structures of the medial temporal lobe (MTL) has shifted away from exclusively investigating memory-related processes and has repeatedly incorporated the investigation of complex visual perception. Several studies have demonstrated that higher level visual tasks can recruit structures like the hippocampus and perirhinal cortex in order to successfully perform complex visual discriminations, leading to a perceptual-mnemonic or representational view of the medial temporal lobe. The current study employed a complex visual discrimination paradigm in two patients suffering from brain lesions with differing locations and origin. Both patients, one with extensive medial temporal lobe lesions (VG) and one with a small lesion of the anterior fornix (HJK), were impaired in complex discriminations while showing otherwise mostly intact cognitive functions. The current data confirmed previous results while also extending the perceptual-mnemonic theory of the MTL to the main output structure of the hippocampus, the fornix. PMID:26994782

  18. Visual Crowding: a fundamental limit on conscious perception and object recognition

    PubMed Central

    Whitney, David; Levi, Dennis M.

    2011-01-01

    Crowding, the inability to recognize objects in clutter, sets a fundamental limit on conscious visual perception and object recognition throughout most of the visual field. Despite how widespread and essential it is to object recognition, reading, and visually guided action, a solid operational definition of what crowding is has only recently become clear. The goal of this review is to provide a broad-based synthesis of the most recent findings in this area, to define what crowding is and is not, and to set the stage for future work that will extend crowding well beyond low-level vision. Here we define five diagnostic criteria for what counts as crowding, and further describe factors that both escape and break crowding. All of these lead to the conclusion that crowding occurs at multiple stages in the visual hierarchy. PMID:21420894

  19. The Conductor As Visual Guide: Gesture and Perception of Musical Content

    PubMed Central

    Kumar, Anita B.; Morrison, Steven J.

    2016-01-01

    Ensemble conductors are often described as embodying the music. Researchers have determined that expressive gestures affect viewers' perceptions of conducted ensemble performances. This effect may be due, in part, to conductor gesture delineating and amplifying specific expressive aspects of music performances. The purpose of the present study was to determine if conductor gesture affected observers' focus of attention to contrasting aspects of ensemble performances. Audio recordings of two different music excerpts featuring two-part counterpoint (an ostinato paired with a lyric melody, and long chord tones paired with rhythmic interjections) were paired with video of two conductors. Each conductor used gesture appropriate to one or the other musical element (e.g., connected and flowing or detached and crisp) for a total of sixteen videos. Musician participants evaluated eight of the excerpts for Articulation, Rhythm, Style, and Phrasing using four 10-point differential scales anchored by descriptive terms (e.g., disconnected to connected, and angular to flowing). Results indicated a relationship between gesture and listeners' evaluations of musical content. Listeners appear to be sensitive to the manner in which a conductor's gesture delineates musical lines, particularly as an indication of overall articulation and style. This effect was observed for the lyric melody and ostinato excerpt, but not for the chords and interjections excerpt. Therefore, this effect appears to be mitigated by the congruence of gesture to preconceptions of the importance of melodic over rhythmic material, of certain instrument timbres over others, and of length between onsets of active material. These results add to a body of literature that supports the importance of the visual component in the multimodal experience of music performance. PMID:27458425

  20. Event Boundaries in Perception Affect Memory Encoding and Updating

    ERIC Educational Resources Information Center

    Swallow, Khena M.; Zacks, Jeffrey M.; Abrams, Richard A.

    2009-01-01

    Memory for naturalistic events over short delays is important for visual scene processing, reading comprehension, and social interaction. The research presented here examined relations between how an ongoing activity is perceptually segmented into events and how those events are remembered a few seconds later. In several studies, participants…

  1. The Effect of Group Administration of Selected Individual Tests of Language, Visual Perception, and Auditory Perception to Kindergarten, First-, Second- and Third-Grade Children.

    ERIC Educational Resources Information Center

    Becker, John T.

    This study endeavored to determine (1) the reliability with which selected individual tests of language, visual and auditory perception, and auditory-visual perceptual integration can be administered through group testing; (2) the decrease in administration and scoring time by using these instruments in a group manner, and (3) the relationships…

  2. Factors Affecting Parent's Perception on Air Quality-From the Individual to the Community Level.

    PubMed

    Guo, Yulin; Liu, Fengfeng; Lu, Yuanan; Mao, Zongfu; Lu, Hanson; Wu, Yanyan; Chu, Yuanyuan; Yu, Lichen; Liu, Yisi; Ren, Meng; Li, Na; Chen, Xi; Xiang, Hao

    2016-01-01

    The perception of air quality significantly affects public acceptance of the government's environmental policies. The aim of this research is to explore the relationship between parents' perception of air quality and scientific monitoring data, and to analyze the factors that affect parents' perceptions. Scientific data on air quality were obtained from Wuhan's environmental condition reports. One thousand parents were surveyed about their knowledge and perception of air quality. The scientific data show that the air quality of Wuhan follows a generally improving trend, while most participants believed that the air quality of Wuhan had deteriorated, indicating a significant gap between public perception and reality. On the individual level, respondents aged 40 or above (40 or above: OR = 3.252; 95% CI: 1.170-9.040), with a higher educational level (college and above: OR = 7.598; 95% CI: 2.244-25.732), or with children in poor health (poor: OR = 6.864; 95% CI: 2.212-21.302) had a much more negative perception of air quality. On the community level, industrial facilities, vehicles and city construction have major effects on parents' perception of air quality. Our investigation provides baseline information for environmental policy researchers and policy makers regarding the public's perception and expectations of air quality, and may benefit the formulation and enforcement of environmental policy.
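
    The individual-level effects are reported as odds ratios with Wald-type 95% confidence intervals, presumably from logistic regression. Below is a minimal sketch of how an odds ratio and its CI follow from a coefficient and standard error; the beta and SE are back-calculated from the reported age effect purely for illustration, not taken from the authors' model output.

        # Hedged sketch: odds ratio and Wald 95% CI from a logistic-regression coefficient.
        # beta/se are back-calculated from the reported age-effect OR (3.252, CI 1.170-9.040)
        # for illustration only.
        import math

        def odds_ratio_ci(beta, se, z=1.96):
            return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

        beta = math.log(3.252)
        se = (math.log(9.040) - math.log(1.170)) / (2 * 1.96)
        or_, lo, hi = odds_ratio_ci(beta, se)
        print(f"OR = {or_:.3f}, 95% CI: {lo:.3f}-{hi:.3f}")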

  3. Factors Affecting Parent's Perception on Air Quality-From the Individual to the Community Level.

    PubMed

    Guo, Yulin; Liu, Fengfeng; Lu, Yuanan; Mao, Zongfu; Lu, Hanson; Wu, Yanyan; Chu, Yuanyuan; Yu, Lichen; Liu, Yisi; Ren, Meng; Li, Na; Chen, Xi; Xiang, Hao

    2016-01-01

    The perception of air quality significantly affects public acceptance of the government's environmental policies. The aim of this research is to explore the relationship between parents' perception of air quality and scientific monitoring data, and to analyze the factors that affect parents' perceptions. Scientific data on air quality were obtained from Wuhan's environmental condition reports. One thousand parents were surveyed about their knowledge and perception of air quality. The scientific data show that the air quality of Wuhan follows a generally improving trend, while most participants believed that the air quality of Wuhan had deteriorated, indicating a significant gap between public perception and reality. On the individual level, respondents aged 40 or above (40 or above: OR = 3.252; 95% CI: 1.170-9.040), with a higher educational level (college and above: OR = 7.598; 95% CI: 2.244-25.732), or with children in poor health (poor: OR = 6.864; 95% CI: 2.212-21.302) had a much more negative perception of air quality. On the community level, industrial facilities, vehicles and city construction have major effects on parents' perception of air quality. Our investigation provides baseline information for environmental policy researchers and policy makers regarding the public's perception and expectations of air quality, and may benefit the formulation and enforcement of environmental policy. PMID:27187432

  4. Task-dependent calibration of auditory spatial perception through environmental visual observation.

    PubMed

    Tonelli, Alessia; Brayda, Luca; Gori, Monica

    2015-01-01

    Visual information is paramount to space perception. Vision influences auditory space estimation. Many studies show that simultaneous visual and auditory cues improve precision of the final multisensory estimate. However, the amount or the temporal extent of visual information, that is sufficient to influence auditory perception, is still unknown. It is therefore interesting to know if vision can improve auditory precision through a short-term environmental observation preceding the audio task and whether this influence is task-specific or environment-specific or both. To test these issues we investigate possible improvements of acoustic precision with sighted blindfolded participants in two audio tasks [minimum audible angle (MAA) and space bisection] and two acoustically different environments (normal room and anechoic room). With respect to a baseline of auditory precision, we found an improvement of precision in the space bisection task but not in the MAA after the observation of a normal room. No improvement was found when performing the same task in an anechoic chamber. In addition, no difference was found between a condition of short environment observation and a condition of full vision during the whole experimental session. Our results suggest that even short-term environmental observation can calibrate auditory spatial performance. They also suggest that echoes can be the cue that underpins visual calibration. Echoes may mediate the transfer of information from the visual to the auditory system. PMID:26082692

  5. Syntactic texture and perception for a new generic visual anomalies classification

    NASA Astrophysics Data System (ADS)

    Désage, Simon-Frédéric; Pitard, Gilles; Pillet, Maurice; Favrelière, Hugues; Maire, Jean-Luc; Frelin, Fabrice; Samper, Serge; Le Goïc, Gaëtan

    2015-04-01

    The purpose of this research is to improve the detection and evaluation of aesthetic anomalies based on what is perceived by the human eye and on the 2006 CIE report [1]. It is therefore important to define parameters able to discriminate surfaces in accordance with the perception of the human eye. Our starting point in assessing aesthetic anomalies is the geometric description defined by the ISO standard [2], i.e., translating the description of anomalies into perceptual terms about the impact of texture divergence. However, human controllers observe (detect) an aesthetic anomaly through its visual effect and interpret it through its geometric description. The research question is how to define generic parameters for discriminating aesthetic anomalies from the enhanced information of visual texture provided by recent surface visual rendering approaches. We propose to use an approach from visual texture processing that quantifies spatial variations of pixels to capture changes in color, material and relief. From a set of images taken under different angles of light, which gives us access to the surface appearance, we propose an approach that goes from the visual effect to the geometrical specifications by which the current standards identify aesthetic anomalies.

  6. High-resolution remotely sensed small target detection by imitating fly visual perception mechanism.

    PubMed

    Huang, Fengchen; Xu, Lizhong; Li, Min; Tang, Min

    2012-01-01

    Small target detection for high-resolution remote sensing data is difficult and limited by existing methods, and has become a recent research hot spot. Inspired by the information capture and processing theory of the fly visual system, this paper constructs a characterized model of information perception that exploits the fly's fast and accurate small target detection in complex and variable natural environments. The proposed model forms a theoretical basis of small target detection for high-resolution remote sensing data. After comparing the prevailing simulation mechanisms behind fly visual systems, we propose a fly-imitated visual-system method of information processing for high-resolution remote sensing data. A small target detector and a corresponding detection algorithm are designed by simulating the mechanisms of information acquisition, compression, and fusion in the fly visual system, the function of the pool cell, and the character of nonlinear self-adaptation. Experiments verify the feasibility and rationality of the proposed small target detection model and fly-imitated visual perception method.

  7. The Effect of a Computerized Visual Perception and Visual-Motor Integration Training Program on Improving Chinese Handwriting of Children with Handwriting Difficulties

    ERIC Educational Resources Information Center

    Poon, K. W.; Li-Tsang, C. W .P.; Weiss, T. P. L.; Rosenblum, S.

    2010-01-01

    This study aimed to investigate the effect of a computerized visual perception and visual-motor integration training program to enhance Chinese handwriting performance among children with learning difficulties, particularly those with handwriting problems. Participants were 26 primary-one children who were assessed by educational psychologists and…

  8. Visual Perception and Visual-Motor Integration in Very Preterm and/or Very Low Birth Weight Children: A Meta-Analysis

    ERIC Educational Resources Information Center

    Geldof, C. J. A.; van Wassenaer, A. G.; de Kieviet, J. F.; Kok, J. H.; Oosterlaan, J.

    2012-01-01

    A range of neurobehavioral impairments, including impaired visual perception and visual-motor integration, are found in very preterm born children, but reported findings show great variability. We aimed to aggregate the existing literature using meta-analysis, in order to provide robust estimates of the effect of very preterm birth on visual…

  9. Immersion factors affecting perception and behaviour in a virtual reality power wheelchair simulator.

    PubMed

    Alshaer, Abdulaziz; Regenbrecht, Holger; O'Hare, David

    2017-01-01

    Virtual Reality based driving simulators are increasingly used to train and assess users' abilities to operate vehicles in a controlled and safe way. For the development of those simulators it is important to identify and evaluate design factors affecting perception, behaviour, and driving performance. In an exemplary power wheelchair simulator setting we identified the three immersion factors display type (head-mounted display v monitor), ability to freely change the field of view (FOV), and the visualisation of the user's avatar as potentially affecting perception and behaviour. In a study with 72 participants we found all three factors affected the participants' sense of presence in the virtual environment. In particular the display type significantly affected both perceptual and behavioural measures whereas FOV only affected behavioural measures. Our findings could guide future Virtual Reality simulator designers to evoke targeted user behaviours and perceptions. PMID:27633192

  10. Immersion factors affecting perception and behaviour in a virtual reality power wheelchair simulator.

    PubMed

    Alshaer, Abdulaziz; Regenbrecht, Holger; O'Hare, David

    2017-01-01

    Virtual Reality based driving simulators are increasingly used to train and assess users' abilities to operate vehicles in a controlled and safe way. For the development of those simulators it is important to identify and evaluate design factors affecting perception, behaviour, and driving performance. In an exemplary power wheelchair simulator setting we identified the three immersion factors display type (head-mounted display v monitor), ability to freely change the field of view (FOV), and the visualisation of the user's avatar as potentially affecting perception and behaviour. In a study with 72 participants we found all three factors affected the participants' sense of presence in the virtual environment. In particular the display type significantly affected both perceptual and behavioural measures whereas FOV only affected behavioural measures. Our findings could guide future Virtual Reality simulator designers to evoke targeted user behaviours and perceptions.

  11. A Model of Generating Visual Place Cells Based on Environment Perception and Similar Measure

    PubMed Central

    2016-01-01

    Generating visual place cells (VPCs) is an important problem in the field of bioinspired navigation. By analyzing the firing characteristics of biological place cells and the existing methods for generating VPCs, a model of generating visual place cells based on environment perception and a similarity measure is abstracted in this paper. The VPC generation process is divided into three phases: environment perception, similarity measurement, and recruitment of a new place cell. According to this process, a specific method for generating VPCs is presented. External reference landmarks are obtained based on local invariant characteristics of the image, and a similarity measure function is designed based on Euclidean distance and a Gaussian function. Simulations validate that the proposed method is effective. The firing characteristics of the generated VPCs are similar to those of biological place cells, and the VPCs' firing fields can be adjusted flexibly by changing the adjustment factor of the firing field (AFFF) and the firing-rate threshold (FRT).
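
    The described similarity measure combines Euclidean distance with a Gaussian function, and a new place cell is recruited when no stored cell responds strongly enough. A compact, hedged sketch of that recruit-or-fire loop follows; the class and parameter names are ours, not the paper's.

        # Hedged sketch: Gaussian similarity over Euclidean distance between the current
        # visual descriptor and stored place-cell templates; recruit a new cell when the
        # best firing rate falls below the firing-rate threshold (FRT). Illustrative only.
        import numpy as np

        class VisualPlaceCells:
            def __init__(self, sigma=0.5, frt=0.3):
                self.sigma = sigma   # width of the firing field (AFFF-like factor)
                self.frt = frt       # firing-rate threshold for recruiting a new cell
                self.templates = []  # one stored descriptor per place cell

            def firing_rates(self, descriptor):
                d = np.array([np.linalg.norm(descriptor - t) for t in self.templates])
                return np.exp(-d**2 / (2 * self.sigma**2))

            def observe(self, descriptor):
                if self.templates:
                    rates = self.firing_rates(descriptor)
                    if rates.max() >= self.frt:
                        return int(rates.argmax())   # an existing place cell fires
                self.templates.append(np.asarray(descriptor, dtype=float))
                return len(self.templates) - 1       # recruit a new place cell

        vpc = VisualPlaceCells()
        rng = np.random.default_rng(2)
        for _ in range(5):
            print("active cell:", vpc.observe(rng.normal(size=4)))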

  12. A Model of Generating Visual Place Cells Based on Environment Perception and Similar Measure.

    PubMed

    Zhou, Yang; Wu, Dewei

    2016-01-01

    Generating visual place cells (VPCs) is an important problem in the field of bioinspired navigation. By analyzing the firing characteristics of biological place cells and the existing methods for generating VPCs, a model of generating visual place cells based on environment perception and a similarity measure is abstracted in this paper. The VPC generation process is divided into three phases: environment perception, similarity measurement, and recruitment of a new place cell. According to this process, a specific method for generating VPCs is presented. External reference landmarks are obtained based on local invariant characteristics of the image, and a similarity measure function is designed based on Euclidean distance and a Gaussian function. Simulations validate that the proposed method is effective. The firing characteristics of the generated VPCs are similar to those of biological place cells, and the VPCs' firing fields can be adjusted flexibly by changing the adjustment factor of the firing field (AFFF) and the firing-rate threshold (FRT).

  13. A Model of Generating Visual Place Cells Based on Environment Perception and Similar Measure

    PubMed Central

    2016-01-01

    Generating visual place cells (VPCs) is an important problem in the field of bioinspired navigation. By analyzing the firing characteristics of biological place cells and the existing methods for generating VPCs, a model of generating visual place cells based on environment perception and a similarity measure is abstracted in this paper. The VPC generation process is divided into three phases: environment perception, similarity measurement, and recruitment of a new place cell. According to this process, a specific method for generating VPCs is presented. External reference landmarks are obtained based on local invariant characteristics of the image, and a similarity measure function is designed based on Euclidean distance and a Gaussian function. Simulations validate that the proposed method is effective. The firing characteristics of the generated VPCs are similar to those of biological place cells, and the VPCs' firing fields can be adjusted flexibly by changing the adjustment factor of the firing field (AFFF) and the firing-rate threshold (FRT). PMID:27597859

  14. A Model of Generating Visual Place Cells Based on Environment Perception and Similar Measure.

    PubMed

    Zhou, Yang; Wu, Dewei

    2016-01-01

    Generating visual place cells (VPCs) is an important problem in the field of bioinspired navigation. By analyzing the firing characteristics of biological place cells and the existing methods for generating VPCs, a model of generating visual place cells based on environment perception and a similarity measure is abstracted in this paper. The VPC generation process is divided into three phases: environment perception, similarity measurement, and recruitment of a new place cell. According to this process, a specific method for generating VPCs is presented. External reference landmarks are obtained based on local invariant characteristics of the image, and a similarity measure function is designed based on Euclidean distance and a Gaussian function. Simulations validate that the proposed method is effective. The firing characteristics of the generated VPCs are similar to those of biological place cells, and the VPCs' firing fields can be adjusted flexibly by changing the adjustment factor of the firing field (AFFF) and the firing-rate threshold (FRT). PMID:27597859

  15. The role of visual perception in action anticipation in basketball athletes.

    PubMed

    Wu, Y; Zeng, Y; Zhang, L; Wang, S; Wang, D; Tan, X; Zhu, X; Zhang, J; Zhang, J

    2013-05-01

    Athletes exhibit better anticipation abilities than novices. However, it is not known whether this difference is related to different visual perceptions between them and which neural elements are involved in producing this difference. Fifteen elite basketball players and 15 novices participated in an action anticipation task with basketball free throws. Anticipation accuracy and gaze behavior were analyzed. Functional brain activity was recorded using functional magnetic resonance imaging. We found that anticipation accuracy was higher in athletes than in novices. Athletes showed more stable gaze fixation than novices, and the locus of fixation was reliable in athletes but not in novices. Athletes showed higher activity in the inferior parietal lobule and inferior frontal gyrus than novices during action anticipation. We conclude that the processes for action anticipation in elite athletes and novices are different and that this difference is caused by different visual perceptions between them.

  16. Mean and Random Errors of Visual Roll Rate Perception from Central and Peripheral Visual Displays

    NASA Technical Reports Server (NTRS)

    Vandervaart, J. C.; Hosman, R. J. A. W.

    1984-01-01

    A large number of roll rate stimuli, covering rates from zero to plus or minus 25 deg/sec, were presented to subjects in random order at 2 sec intervals. Subjects were to make estimates of the magnitude of perceived roll rate stimuli presented on either a central display, on displays in the peripheral field of vision, or on all displays simultaneously. Responses were given by way of a digital keyboard device, and stimulus exposure times were varied. The present experiment differs from earlier perception tasks by the same authors in that mean rate perception error (and standard deviation) was obtained as a function of rate stimulus magnitude, whereas the earlier experiments only yielded mean absolute error magnitude. Moreover, in the present experiment, all stimulus rates had an equal probability of occurrence, whereas the earlier tests featured a Gaussian stimulus probability density function. Results yield a good illustration of the nonlinear functions relating rate presented to rate perceived by human observers or operators.

  17. Contextual effects of scene on the visual perception of object orientation in depth.

    PubMed

    Niimi, Ryosuke; Watanabe, Katsumi

    2013-01-01

    We investigated the effect of background scene on the human visual perception of depth orientation (i.e., azimuth angle) of three-dimensional common objects. Participants evaluated the depth orientation of objects. The objects were surrounded by scenes with an apparent axis of the global reference frame, such as a sidewalk scene. When a scene axis was slightly misaligned with the gaze line, object orientation perception was biased, as if the gaze line had been assimilated into the scene axis (Experiment 1). When the scene axis was slightly misaligned with the object, evaluated object orientation was biased, as if it had been assimilated into the scene axis (Experiment 2). This assimilation may be due to confusion between the orientation of the scene and object axes (Experiment 3). Thus, the global reference frame may influence object orientation perception when its orientation is similar to that of the gaze-line or object.

  18. Auditory, visual, and auditory-visual perception of vowels by hearing-impaired children.

    PubMed

    Hack, Z C; Erber, N P

    1982-03-01

    The vowels (foreign letters in text) were presented through auditory, visual, and combined auditory-visual modalities to hearing-impaired children having good, intermediate, and poor auditory word-recognition skills. When they received acoustic information only, children with good word-recognition skills confused neighboring vowels (i.e., those having similar formant frequencies). Children with intermediate word-recognition skills demonstrated this same difficulty and confused front and back vowels. Children with poor word-recognition skills identified vowels mainly on the basis of temporal and intensity cues. Through lipreading alone, all three groups distinguished spread from rounded vowels but could not reliably identify vowels within the categories. The first two groups exhibited only moderate difficulty in identifying vowels audiovisually. The third group, although showing a small amount of improvement over lipreading alone, still experienced difficulty in identifying vowels through combined auditory and visual modes.

  19. If you feel bad, it's unfair: a quantitative synthesis of affect and organizational justice perceptions.

    PubMed

    Barsky, Adam; Kaplan, Seth A

    2007-01-01

    Whereas research interest in both individual affect/temperament and organizational justice has grown substantially in recent years, affect's role in the perception of organizational justice has received scant attention. Here, the authors integrate these literatures and test bivariate relationships between state affect (e.g., moods), trait affect (e.g., affectivity), and organizational justice variables using meta-analytically aggregated effect sizes. Results indicated that state and trait positive and negative affect exhibit statistically significant relationships with perceptions of distributive, procedural, and interactional justice in the predicted directions, with mean population-level correlations ranging in absolute magnitude from M(rho) = .09 to M(rho) = .43. Correlations involving state affect generally were larger but not significantly different from those involving trait affect. Finally, the authors propose ideas for investigations at the primary-study level.
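
    The M(rho) values are meta-analytically aggregated correlations. As a bare-bones illustration of the aggregation step, a sample-size-weighted mean correlation across hypothetical studies is sketched below; the psychometric corrections used in such meta-analyses are not reproduced.

        # Hedged sketch: sample-size-weighted mean correlation (the simplest form of
        # meta-analytic aggregation). The (r, n) pairs are invented for illustration.
        studies = [(0.25, 120), (0.41, 300), (0.18, 80)]  # (correlation, sample size)

        weighted_mean_r = sum(r * n for r, n in studies) / sum(n for _, n in studies)
        print(f"M(rho) ~ {weighted_mean_r:.2f}")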

  20. Specific previous experience affects perception of harmony and meter.

    PubMed

    Creel, Sarah C

    2011-10-01

    Prior knowledge shapes our experiences, but which prior knowledge shapes which experiences? This question is addressed in the domain of music perception. Three experiments were used to determine whether listeners activate specific musical memories during music listening. Each experiment provided listeners with one of two musical contexts that was presented simultaneously with a melody. After a listener was familiarized with melodies embedded in contexts, the listener heard melodies in isolation and judged the fit of a final harmonic or metrical probe event. The probe event matched either the familiar (but absent) context or an unfamiliar context. For both harmonic (Experiments 1 and 3) and metrical (Experiment 2) information, exposure to context shifted listeners' preferences toward a probe matching the context that they had been familiarized with. This suggests that listeners rapidly form specific musical memories without explicit instruction, which are then activated during music listening. These data pose an interesting challenge for models of music perception which implicitly assume that the listener's knowledge base is predominantly schematic or abstract. PMID:21553992

  1. Factors affecting the perception of Korean-accented American English

    NASA Astrophysics Data System (ADS)

    Cho, Kwansun; Harris, John G.; Shrivastav, Rahul

    2005-09-01

    This experiment examines the relative contribution of two factors, intonation and articulation errors, on the perception of foreign accent in Korean-accented American English. Ten native speakers of Korean and ten native speakers of American English were asked to read ten English sentences. These sentences were then modified using high-quality speech resynthesis techniques [STRAIGHT; Kawahara et al., Speech Commun. 27, 187-207 (1999)] to generate four sets of stimuli. In the first two sets of stimuli, the intonation patterns of the Korean speakers and American speakers were switched with one another. The articulatory errors for each speaker were not modified. In the final two sets, the sentences from the Korean and American speakers were resynthesized without any modifications. Fifteen listeners were asked to rate all the stimuli for the degree of foreign accent. Preliminary results show that, for native speakers of American English, articulation errors may play a greater role in the perception of foreign accent than errors in intonation patterns. [Work supported by KAIM.]

  2. The influence of yaw motion on the perception of active vs passive visual curvilinear displacement.

    PubMed

    Savona, Florian; Stratulat, Anca Melania; Roussarie, Vincent; Bourdin, Christophe

    2015-01-01

    Self-motion perception, which partly determines the realism of dynamic driving simulators, is based on multisensory integration. However, it remains unclear how the brain integrates these cues to create adequate motion perception, especially for curvilinear displacements. In the present study, the effect of visual, inertial and visuo-inertial cues (concordant or discordant bimodal cues) on self-motion perception was analyzed. Subjects were asked to evaluate (externally produced) or produce (self-controlled) curvilinear displacements as accurately as possible. The results show systematic overestimation of displacement, with better performance for active subjects than for passive ones. Furthermore, it was demonstrated that participants used unimodal or bimodal cues differently in performing their activity. When passive, subjects systematically integrated visual and inertial cues even when discordant, but with weightings that depended on the dynamics. On the contrary, active subjects were able to reject the inertial cue when the discordance became too high, producing self-motion perception on the basis of more reliable information. Thereby, multisensory integration seems to follow a non-linear integration model, i.e., the cues' weights change with cue reliability and/or the intensity of the stimuli, as reported by previous studies. These results provide a basis for adapting the motion cueing algorithms developed for dynamic driving simulators, by taking into account the dynamics of simulated motion in line with the status of the participant (driver or passenger).
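
    The non-linear integration model invoked here generalizes the standard reliability-weighted (inverse-variance) combination of cues. That linear special case can be sketched in a few lines (a textbook illustration with invented numbers, not the authors' model):

        # Hedged sketch: reliability-weighted fusion of a visual and an inertial estimate
        # of displacement. Weights are inverse variances; the numbers are invented.
        def fuse(est_visual, var_visual, est_inertial, var_inertial):
            w_v = 1.0 / var_visual
            w_i = 1.0 / var_inertial
            fused = (w_v * est_visual + w_i * est_inertial) / (w_v + w_i)
            fused_var = 1.0 / (w_v + w_i)
            return fused, fused_var

        # Example: the visual cue says 12 m (reliable), the inertial cue says 18 m (noisy).
        print(fuse(12.0, 1.0, 18.0, 4.0))  # -> (13.2, 0.8)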

  3. [Perception, processing of visual information and resistance to emotional stresses in athletes of different ages].

    PubMed

    Korobeĭnikova, L H; Makarchuk, M Iu

    2013-01-01

    Among the numerous studies devoted to perception and information processing, no data are available on the effects of age on these processes. In this paper we studied the influence of psycho-emotional stress, at different levels, on the mental processes of perception and information processing in highly skilled athletes divided into two groups. The first group included athletes aged 19-24 years (12 athletes, members of the Ukrainian team in Greco-Roman wrestling); the second group included athletes aged 27-31 years (7 highly skilled athletes, members of the Ukrainian team in Greco-Roman wrestling). We found that the athletes of the first group had higher productivity and better visual perception and visual information processing efficiency compared with athletes from the second group. This observation suggests a dependency of the cognitive component of perception and information processing on the age of the athletes. Sportsmen from the second, older group, however, showed higher stress resistance than those from the younger group.

  4. Investigating affective color association of media content in language and perception based on online RGB experiment

    NASA Astrophysics Data System (ADS)

    Lee, Kyung Jae

    2005-03-01

    As an investigation of color categorization in language and perception, this research intends to study the affective associations between certain colors and different media content (i.e., movie genres). Compared to non-entertainment graphics (medical imaging and engineering graphics), entertainment graphics (video games and movies) are designed to deliver emotionally stimulating content to audiences. Based on an online color survey of 19 subjects, this study investigated whether or not subjects had different color preferences for diverse movie genres. Instead of providing a predefined limited number of color chips (or pictures) as stimuli, this study was conducted by asking the subjects to visualize their own images of movie genres and to select their preferred colors through an online RGB color palette. By providing a combined application interface of three color sliders (red, green, blue) and 216 digital color cells, the subjects were able to interactively select their preferred colors for different movie genres. To compare the distribution across movie genres, the user-selected colors were mapped onto the CIE chromaticity diagram. This study also investigated preferred color naming for different movie genres as well as the three primary color names of each subject's most favorite genre. The results showed that the subjects had different color associations with specific movie genres, and certain genres showed higher individual differences. Regardless of genre differences, the subjects selected blue, red or green as the three primary color names representing their favorite movie genres. Also, the results support Berlin & Kay's eleven color terms.
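
    Mapping the selected colors onto the CIE chromaticity diagram amounts to converting each sRGB triplet to XYZ and then to xy coordinates. A compact sketch using the standard sRGB (D65) matrix (generic colorimetry, not the study's own tooling):

        # Hedged sketch: convert an 8-bit sRGB color to CIE xy chromaticity coordinates
        # (linearize sRGB, apply the standard sRGB-to-XYZ (D65) matrix, normalize).
        import numpy as np

        M_SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                                  [0.2126, 0.7152, 0.0722],
                                  [0.0193, 0.1192, 0.9505]])

        def srgb_to_xy(r, g, b):
            rgb = np.array([r, g, b]) / 255.0
            lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
            X, Y, Z = M_SRGB_TO_XYZ @ lin
            s = X + Y + Z
            return (X / s, Y / s) if s > 0 else (0.3127, 0.3290)  # D65 white for black

        print(srgb_to_xy(255, 0, 0))  # sRGB red -> approximately (0.640, 0.330)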

  5. The effect of visual spatial attention on audiovisual speech perception in adults with Asperger syndrome.

    PubMed

    Saalasti, Satu; Tiippana, Kaisa; Kätsyri, Jari; Sams, Mikko

    2011-09-01

    Individuals with Asperger syndrome (AS) have problems following conversation, especially in situations where several people are talking. This might result from impairments in audiovisual speech perception, especially from difficulties in focusing attention on speech-relevant visual information and ignoring distracting information. We studied the effect of visual spatial attention on the audiovisual speech perception of adult individuals with AS and matched control participants. Two faces were presented side by side, one uttering /aka/ and the other /ata/, while an auditory stimulus of /apa/ was played. The participants fixated on a central cross and directed their attention to the face that an arrow pointed to, reporting which consonant they heard. We hypothesized that the adults with AS would be more distracted by a competing talking face than the controls. Instead, they were able to covertly attend to the talking face, and they were as distracted by a competing face as the controls were. Independently of the attentional effect, there was a qualitative difference in audiovisual speech perception: when the visual articulation was /aka/, the control participants heard /aka/ almost exclusively, while the participants with AS frequently heard /ata/. This finding may relate to difficulties in face-to-face communication in AS.

  6. Vestibular signals in macaque extrastriate visual cortex are functionally appropriate for heading perception

    PubMed Central

    Liu, Sheng; Angelaki, Dora E.

    2009-01-01

    Visual and vestibular signals converge onto the dorsal medial superior temporal area (MSTd) of the macaque extrastriate visual cortex, which is thought to be involved in multisensory heading perception for spatial navigation. Peripheral otolith information, however, is ambiguous and cannot distinguish linear accelerations experienced during self-motion from those due to changes in spatial orientation relative to gravity. Here we show that, unlike peripheral vestibular sensors but similar to lobules 9 and 10 of the cerebellar vermis (nodulus and uvula), MSTd neurons respond selectively to heading and not to changes in orientation relative to gravity. In support of a role in heading perception, MSTd vestibular responses are also dominated by velocity-like temporal dynamics, which might optimize sensory integration with visual motion information. Unlike the cerebellar vermis, however, MSTd neurons also carry a spatial orientation-independent rotation signal from the semicircular canals, which could be useful in compensating for the effects of head rotation on the processing of optic flow. These findings show that vestibular signals in MSTd are appropriately processed to support a functional role in multisensory heading perception. PMID:19605631
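    The otolith ambiguity referred to above is commonly summarized by the gravito-inertial force relation (standard textbook notation, not this paper's):

    \[
    \mathbf{f} = \mathbf{g} - \mathbf{a},
    \]

    where \(\mathbf{f}\) is the gravito-inertial acceleration transduced by the otoliths, \(\mathbf{g}\) is gravity, and \(\mathbf{a}\) is the linear acceleration of the head. Because only \(\mathbf{f}\) is sensed, a tilt that reorients \(\mathbf{g}\) and a translation that produces \(\mathbf{a}\) can yield identical otolith signals, and additional canal or visual information is needed to disambiguate them.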

  7. Color names, color categories, and color-cued visual search: sometimes, color perception is not categorical.

    PubMed

    Brown, Angela M; Lindsey, Delwin T; Guckes, Kevin M

    2011-01-01

    The relation between colors and their names is a classic case study for investigating the Sapir-Whorf hypothesis that categorical perception is imposed on perception by language. Here, we investigate the Sapir-Whorf prediction that visual search for a green target presented among blue distractors (or vice versa) should be faster than search for a green target presented among distractors of a different color of green (or for a blue target among different blue distractors). A. L. Gilbert, T. Regier, P. Kay, and R. B. Ivry (2006) reported that this Sapir-Whorf effect is restricted to the right visual field (RVF), because the major brain language centers are in the left cerebral hemisphere. We found no categorical effect at the Green-Blue color boundary and no categorical effect restricted to the RVF. Scaling of perceived color differences by Maximum Likelihood Difference Scaling (MLDS) also showed no categorical effect, including no effect specific to the RVF. Two models fit the data: a color difference model based on MLDS and a standard opponent-colors model of color discrimination based on the spectral sensitivities of the cones. Neither of these models nor any of our data suggested categorical perception of colors at the Green-Blue boundary, in either visual field.
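    Maximum Likelihood Difference Scaling, used here to scale perceived color differences, rests on a simple decision model; the sketch below is the generic MLDS formulation rather than anything specific to this study. On each trial the observer sees two pairs of stimuli, \((a,b)\) and \((c,d)\), and reports which pair looks more different:

    \[
    \Delta = \lvert \psi(b) - \psi(a) \rvert - \lvert \psi(d) - \psi(c) \rvert + \varepsilon, \qquad \varepsilon \sim \mathcal{N}(0, \sigma^2),
    \]

    with the first pair chosen when \(\Delta > 0\). The perceptual scale values \(\psi(\cdot)\) and the noise \(\sigma\) are then estimated by maximizing the likelihood of the observed choices; a categorical boundary would appear as a step in the fitted scale.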

  8. Perception and psychological evaluation for visual and auditory environment based on the correlation mechanisms

    NASA Astrophysics Data System (ADS)

    Fujii, Kenji

    2002-06-01

    In this dissertation, the correlation mechanism is introduced in modeling the processes of visual perception. It has been well described that the correlation mechanism is effective for describing subjective attributes in auditory perception. The main result is that it is possible to apply the correlation mechanism to the processes of temporal vision and spatial vision, as well as audition. (1) A psychophysical experiment was performed on subjective flicker rates for complex waveforms. A remarkable result is that the phenomenon of the missing fundamental is found in temporal vision, analogous to auditory pitch perception. This implies the existence of a correlation mechanism in the visual system. (2) For spatial vision, autocorrelation analysis provides useful measures for describing three primary perceptual properties of visual texture: contrast, coarseness, and regularity. Another experiment showed that the degree of regularity is a salient cue for texture preference judgment. (3) In addition, the autocorrelation function (ACF) and inter-aural cross-correlation function (IACF) were applied to analyze the temporal and spatial properties of environmental noise. It was confirmed that the acoustical properties of aircraft noise and traffic noise are well described by these functions. These analyses provided useful parameters extracted from the ACF and IACF for assessing subjective annoyance caused by noise. Thesis advisor: Yoichi Ando Copies of this thesis written in English can be obtained from Junko Atagi, 6813 Mosonou, Saijo-cho, Higashi-Hiroshima 739-0024, Japan. E-mail address: atagi@urban.ne.jp.
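    A normalized autocorrelation function of the kind applied here to flicker, texture, and noise signals can be computed directly; the snippet below is a generic sketch, not the dissertation's own implementation.

```python
import numpy as np

def normalized_acf(x, max_lag):
    """Normalized autocorrelation of a 1-D signal for lags 0..max_lag."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:x.size - k], x[k:]) / denom for k in range(max_lag + 1)])

# Example: a noisy 50 Hz signal sampled at 1 kHz shows ACF peaks every 20 samples
t = np.arange(0, 1, 1 / 1000.0)
signal = np.sin(2 * np.pi * 50 * t) + 0.3 * np.random.randn(t.size)
acf = normalized_acf(signal, 100)
print(np.argmax(acf[10:]) + 10)  # ~20, i.e. the period of the fundamental
```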

  9. The Perception of Naturalness Correlates with Low-Level Visual Features of Environmental Scenes

    PubMed Central

    Berman, Marc G.; Hout, Michael C.; Kardan, Omid; Hunter, MaryCarol R.; Yourganov, Grigori; Henderson, John M.; Hanayik, Taylor; Karimi, Hossein; Jonides, John

    2014-01-01

    Previous research has shown that interacting with natural environments vs. more urban or built environments can have salubrious psychological effects, such as improvements in attention and memory. Even viewing pictures of nature vs. pictures of built environments can produce similar effects. A major question is: What is it about natural environments that produces these benefits? Problematically, there are many differing qualities between natural and urban environments, making it difficult to narrow down the dimensions of nature that may lead to these benefits. In this study, we set out to uncover visual features that related to individuals' perceptions of naturalness in images. We quantified naturalness in two ways: first, implicitly using a multidimensional scaling analysis and second, explicitly with direct naturalness ratings. The features most related to perceptions of naturalness were the density of contrast changes in the scene, the density of straight lines, the average color saturation, and the average hue diversity. We then trained a machine-learning algorithm to predict whether a scene was perceived as being natural or not based on these low-level visual features, and we could do so with 81% accuracy. As such, we were able to reliably predict subjective perceptions of naturalness from objective low-level visual features. Our results can be used in future studies to determine if these features, which are related to naturalness, may also lead to the benefits attained from interacting with nature. PMID:25531411
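    A classifier of the kind described, predicting rated naturalness from low-level image statistics, can be sketched as follows. The feature set, the simulated values, and the logistic-regression choice are all illustrative assumptions, not the authors' pipeline or the reported 81% model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200

# Hypothetical per-image features: edge density, straight-line density,
# mean saturation, hue diversity (simulated here, not real image data).
natural = np.column_stack([rng.normal(0.15, 0.05, n),   # fewer hard contrast edges
                           rng.normal(0.05, 0.02, n),   # few straight lines
                           rng.normal(0.50, 0.10, n),   # higher saturation
                           rng.normal(0.70, 0.10, n)])  # more hue diversity
built = np.column_stack([rng.normal(0.30, 0.05, n),
                         rng.normal(0.25, 0.05, n),
                         rng.normal(0.25, 0.10, n),
                         rng.normal(0.40, 0.10, n)])

X = np.vstack([natural, built])
y = np.concatenate([np.ones(n), np.zeros(n)])  # 1 = perceived natural, 0 = built

print(cross_val_score(LogisticRegression(), X, y, cv=5).mean())  # cross-validated accuracy
```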

  10. Dexterity, visual perception, and activities of daily living in persons with multiple sclerosis.

    PubMed

    Poole, Janet L; Nakamoto, Trisha; McNulty, Tina; Montoya, Janeen R; Weill, Deedra; Dieruf, Kathy; Skipper, Betty

    2010-04-01

    The purposes of this study were to compare dexterity, visual perception, and abilities to carry out activities of daily living (ADL) in persons with different multiple sclerosis (MS) subtypes and to determine what relationships exist between the three variables. Fifty-six persons with MS were administered tests of dexterity, visual perception, and ADL ability. Demographic variables and scores on Kurtzke's Expanded Disability Status Scale were also collected. Scores from the chronic-progressive group were significantly higher than those of the benign and progressive-relapsing groups for the Nine-Hole Peg Test-Left Hand, Grooved Peg Test, and Functional Status Index (except Functional Status Index-Pain). There were no differences between the MS groups for any demographic variables except on the Expanded Disability Status Scale. Visual perception did not correlate with dexterity or ADL ability, and only dexterity scores for the left hand correlated with ADL ability. Persons with the most severe MS subtype were significantly impaired compared with the least severe group for dexterity and ADL ability. Decreased dexterity was associated with needing more assistance and having more perceived difficulty with ADL.

  11. Activation of the prefrontal cortex in the human visual aesthetic perception

    PubMed Central

    Cela-Conde, Camilo J.; Marty, Gisèle; Maestú, Fernando; Ortiz, Tomás; Munar, Enric; Fernández, Alberto; Roca, Miquel; Rosselló, Jaume; Quesney, Felipe

    2004-01-01

    Visual aesthetic perception (“aesthetics”), or the capacity to visually perceive a particular attribute added to other features of objects, such as form, color, and movement, was fixed during the human evolutionary lineage as a trait not shared with any great ape. Although prefrontal brain expansion has been mentioned as responsible for the appearance of such a human trait, no current knowledge exists on the role of prefrontal areas in aesthetic perception. The visual brain consists of “several parallel multistage processing systems, each specialized in a given task such as, color or motion” [Bartels, A. & Zeki, S. (1999) Proc. R. Soc. London Ser. B 265, 2327–2332]. Here we report the results of an experiment carried out with magnetoencephalography which shows that the prefrontal area is selectively activated in humans during the perception of objects qualified as “beautiful” by the participants. Therefore, aesthetics can be hypothetically considered as an attribute perceived by means of a particular brain processing system, in which the prefrontal cortex seems to play a key role. PMID:15079079

  12. Toward an evolutionary perspective on conceptual representation: species-specific calls activate visual and affective processing systems in the macaque.

    PubMed

    Gil-da-Costa, Ricardo; Braun, Allen; Lopes, Marco; Hauser, Marc D; Carson, Richard E; Herscovitch, Peter; Martin, Alex

    2004-12-14

    Non-human primates produce a diverse repertoire of species-specific calls and have rich conceptual systems. Some of their calls are designed to convey information about concepts such as predators, food, and social relationships, as well as the affective state of the caller. Little is known about the neural architecture of these calls, and much of what we do know is based on single-cell physiology from anesthetized subjects. By using positron emission tomography in awake rhesus macaques, we found that conspecific vocalizations elicited activity in higher-order visual areas, including regions in the temporal lobe associated with the visual perception of object form (TE/TEO) and motion (superior temporal sulcus) and storing visual object information into long-term memory (TE), as well as in limbic (the amygdala and hippocampus) and paralimbic regions (ventromedial prefrontal cortex) associated with the interpretation and memory-encoding of highly salient and affective material. This neural circuitry strongly corresponds to the network shown to support representation of conspecifics and affective information in humans. These findings shed light on the evolutionary precursors of conceptual representation in humans, suggesting that monkeys and humans have a common neural substrate for representing object concepts. PMID:15583132

  13. The perception of three-dimensional cast-shadow structure is dependent on visual awareness.

    PubMed

    Khuu, Sieu K; Gordon, Jack; Balcomb, Kaleah; Kim, Juno

    2014-03-19

    In the present study we examined whether the perception of depth from cast shadows is dependent on visual awareness using continuous flash suppression (CFS). As a direct measure of how the visual system infers depth from cast shadows, we examined the cast-shadow motion illusion originally reported by Kersten, Knill, Mamassian, and Bulthoff (1996), in which a moving cast shadow induces illusory motion in depth in a physically stationary object. In Experiment 1, we used a disparity-defined probe to determine the stereo motion speed required to match the cast-shadow motion illusion for different cast shadow speeds (0°/s-1.6°/s) and different lighting directions. We found that configurations implying light from above produce more compelling illusory effects. We also found that increasing shadow speed monotonically increased the stereo motion speed required to match the illusory motion, which suggests that quantitative depth can be derived from cast shadows when they are in motion. In Experiment 2, we used CFS to suppress the cast shadow from visual awareness. Visual suppression of the cast shadow from awareness greatly diminished the perception of illusory motion in depth. In Experiment 3 we confirmed that while CFS suppresses the cast-shadow motion from awareness, it continues to be processed by the visual system sufficiently to generate a significant motion aftereffect. The results of the present study suggest that cast shadows can greatly contribute to the perception of scene depth structure, through a process that is dependent on the conscious awareness of the cast shadow.

  14. Linking the laminar circuits of visual cortex to visual perception: development, grouping, and attention.

    PubMed

    Grossberg, S

    2001-08-01

    How do the laminar circuits of visual cortical areas V1 and V2 implement context-sensitive binding processes such as perceptual grouping and attention, and how do these circuits develop and learn in a stable way? Recent neural models clarify how preattentive and attentive perceptual mechanisms are intimately linked within the laminar circuits of visual cortex, notably how bottom-up, top-down, and horizontal cortical connections interact within the cortical layers. These laminar circuits allow the responses of visual cortical neurons to be influenced, not only by the stimuli within their classical receptive fields, but also by stimuli in the extra-classical surround. Such context-sensitive visual processing can greatly enhance the analysis of visual scenes, especially those containing targets that are low contrast, partially occluded, or crowded by distractors. Attentional enhancement can selectively propagate along groupings of both real and illusory contours, thereby showing how attention can selectively enhance object representations. Recent models explain how attention may have a stronger facilitatory effect on low contrast than on high contrast stimuli, and how pop-out from orientation contrast may occur. The specific functional roles which the model proposes for the cortical layers allow several testable neurophysiological predictions to be made. Model mechanisms clarify how intracortical and intercortical feedback help to stabilize cortical development and learning. Although feedback plays a key role, fast feedforward processing is possible in response to unambiguous information. Model circuits are capable of synchronizing quickly, but context-sensitive persistence of previous events can influence how synchrony develops.

  15. Whose reality counts? Factors affecting the perception of volcanic risk

    NASA Astrophysics Data System (ADS)

    Haynes, Katharine; Barclay, Jenni; Pidgeon, Nick

    2008-05-01

    Understanding how people perceive risk has become increasingly important for improving risk communication and reducing risk associated conflicts. This paper builds upon findings, methodologies and lessons learned from other fields to help understand differences between scientists, authorities and the public. Qualitative and quantitative methods were used to analyse underlying attitudes and judgements during an ongoing volcanic crisis on the Caribbean Island of Montserrat. Specific differences between the public, authorities and scientists were found to have been responsible for misunderstandings and misinterpretations of information and roles, resulting in differing perceptions of acceptable risk. Difficulties in the articulation and understanding of uncertainties pertaining to the volcanic risk led to a situation in which the roles of hazard monitoring, risk communication and public protection became confused. In addition, social, economic and political forces were found to have distorted risk messages, leading to a public reliance upon informal information networks. The implications of these findings for volcanic risk management and communication are discussed.

  16. Brain responses during sentence reading: visual input affects central processes.

    PubMed

    Gunter, T C; Friederici, A D; Hahne, A

    1999-10-19

    The effect of visual contrast on sentence reading was investigated using event-related brain potentials (ERPs). Under the low contrast condition semantic integration as reflected in the N400 ERP component was delayed to some degree. The left anterior negativity (LAN) reflecting initial syntactic processes, in contrast, seemed to change its characteristics as a function of visual input. In the high contrast condition the LAN preceded the P200 component whereas in the low contrast condition it was present after this component. These ERP-data from word-by-word sentence reading together with prior results from sentence listening suggest that the physical characteristics of the input must fall within a certain optimal range to guarantee ERP-effects of fast initial syntactic processes.

  17. Students' Achievement Goals, Emotion Perception Ability and Affect and Performance in the Classroom: A Multilevel Examination

    ERIC Educational Resources Information Center

    Vassiou, Aikaterini; Mouratidis, Athanasios; Andreou, Eleni; Kafetsios, Konstantinos

    2016-01-01

    Performance at school is affected not only by students' achievement goals but also by emotional exchanges among classmates and their teacher. In this study, we investigated relationships between students' achievement goals and emotion perception ability and class affect and performance. Participants were 949 Greek adolescent students in 49 classes…

  18. The motion/pursuit law for visual depth perception from motion parallax.

    PubMed

    Nawrot, Mark; Stroyan, Keith

    2009-07-01

    One of vision's most important functions is specification of the layout of objects in the 3D world. While the static optical geometry of retinal disparity explains the perception of depth from binocular stereopsis, we propose a new formula to link the pertinent dynamic geometry to the computation of depth from motion parallax. Mathematically, the ratio of retinal image motion (motion) and smooth pursuit of the eye (pursuit) provides the necessary information for the computation of relative depth from motion parallax. We show that this could have been obtained with the approaches of Nakayama and Loomis [Nakayama, K., & Loomis, J. M. (1974). Optical velocity patterns, velocity-sensitive neurons, and space perception: A hypothesis. Perception, 3, 63-80] or Longuet-Higgens and Prazdny [Longuet-Higgens, H. C., & Prazdny, K. (1980). The interpretation of a moving retinal image. Proceedings of the Royal Society of London Series B, 208, 385-397] by adding pursuit to their treatments. Results of a psychophysical experiment show that changes in the motion/pursuit ratio have a much better relationship to changes in the perception of depth from motion parallax than do changes in motion or pursuit alone. The theoretical framework provided by the motion/pursuit law provides the quantitative foundation necessary to study this fundamental visual depth perception ability.
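    The motion/pursuit ratio referred to here is usually summarized as a relative-depth approximation; the expression below is a hedged paraphrase of the law as commonly stated, using generic symbols rather than the paper's full derivation:

    \[
    \frac{d}{f} \;\approx\; \frac{d\theta/dt}{d\alpha/dt},
    \]

    where \(d\theta/dt\) is the retinal image motion of a point, \(d\alpha/dt\) is the smooth-pursuit rate needed to keep the fixated point on the fovea during the observer's translation, \(f\) is the fixation distance, and \(d\) is the depth of the point relative to the fixation plane. The approximation holds for small angles and brief translations.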

  19. The role of illness perceptions in the attachment-related process of affect regulation.

    PubMed

    Vilchinsky, Noa; Dekel, Rachel; Asher, Zvia; Leibowitz, Morton; Mosseri, Morris

    2013-01-01

    Based on the predictions of the attachment theory and the Common Sense Model of illness perceptions, the current study focused on the role played by illness perceptions in explaining the path linking attachment orientations to negative affect during recovery from cardiac illness. We predicted two putative mechanisms: (1) illness perceptions would mediate the direct association between attachment-related insecurity (especially attachment anxiety) and levels of distress at follow-up and (2) illness perceptions would interact with attachment orientations (attachment avoidance in particular) in explaining patients' distress. The sample consisted of 111 male patients admitted to the Cardiac Care Unit of the Meir Medical Center, located in the central region of Israel. Patients completed a measure of attachment orientations during hospitalization (baseline). One month later, patients' illness perceptions were measured. Patients' depression and anxiety symptoms were measured at baseline and at the six-month follow-up. The associations between attachment-related anxiety and anxiety symptoms at follow-up were fully mediated by illness perceptions. Attachment-related avoidance was found to interact with illness perceptions in the prediction of depressive symptoms at follow-up. The findings shed light on the possible dynamics among personality, cognitive appraisals, and affect regulation efforts when coping with illness.

  20. Are theories of perception necessary? A review of Gibson's The Ecological Approach to Visual Perception.

    PubMed Central

    Costall, A P

    1984-01-01

    Representational theories of perception postulate an isolated and autonomous "subject" set apart from its real environment, and then go on to invoke processes of mental representation, construction, or hypothesizing to explain how perception can nevertheless take place. Although James Gibson's most conspicuous contribution has been to challenge representational theory, his ultimate concern was the cognitivism which now prevails in psychology. He was convinced that the so-called cognitive revolution merely perpetuates, and even promotes, many of psychology's oldest mistakes. This review article considers Gibson's final statement of his "ecological" alternative to cognitivism (Gibson, 1979). It is intended not as a complete account of Gibson's alternative, however, but primarily as an appreciation of his critical contribution. Gibson's sustained attempt to counter representational theory served not only to reveal the variety of arguments used in support of this theory, but also to expose the questionable metaphysical assumptions upon which they rest. In concentrating upon Gibson's criticisms of representational theory, therefore, this paper aims to emphasize the point of his alternative scheme and to explain some of the important concerns shared by Gibson's ecological approach and operant psychology. PMID:6699538

  1. Atypical perception of affective prosody in Autism Spectrum Disorder.

    PubMed

    Gebauer, Line; Skewes, Joshua; Hørlyck, Lone; Vuust, Peter

    2014-01-01

    Autism Spectrum Disorder (ASD) is characterized by impairments in language and social-emotional cognition. Yet, findings of emotion recognition from affective prosody in individuals with ASD are inconsistent. This study investigated emotion recognition and neural processing of affective prosody in high-functioning adults with ASD relative to neurotypical (NT) adults. Individuals with ASD showed mostly typical brain activation of the fronto-temporal and subcortical brain regions in response to affective prosody. Yet, the ASD group showed a trend towards increased activation of the right caudate during processing of affective prosody and rated the emotional intensity lower than NT individuals. This is likely associated with increased attentional task demands in this group, which might contribute to social-emotional impairments.

  2. Dynamic reweighting of visual and vestibular cues during self-motion perception.

    PubMed

    Fetsch, Christopher R; Turner, Amanda H; DeAngelis, Gregory C; Angelaki, Dora E

    2009-12-01

    The perception of self-motion direction, or heading, relies on integration of multiple sensory cues, especially from the visual and vestibular systems. However, the reliability of sensory information can vary rapidly and unpredictably, and it remains unclear how the brain integrates multiple sensory signals given this dynamic uncertainty. Human psychophysical studies have shown that observers combine cues by weighting them in proportion to their reliability, consistent with statistically optimal integration schemes derived from Bayesian probability theory. Remarkably, because cue reliability is varied randomly across trials, the perceptual weight assigned to each cue must change from trial to trial. Dynamic cue reweighting has not been examined for combinations of visual and vestibular cues, nor has the Bayesian cue integration approach been applied to laboratory animals, an important step toward understanding the neural basis of cue integration. To address these issues, we tested human and monkey subjects in a heading discrimination task involving visual (optic flow) and vestibular (translational motion) cues. The cues were placed in conflict on a subset of trials, and their relative reliability was varied to assess the weights that subjects gave to each cue in their heading judgments. We found that monkeys can rapidly reweight visual and vestibular cues according to their reliability, the first such demonstration in a nonhuman species. However, some monkeys and humans tended to over-weight vestibular cues, inconsistent with simple predictions of a Bayesian model. Nonetheless, our findings establish a robust model system for studying the neural mechanisms of dynamic cue reweighting in multisensory perception.
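    The reliability-proportional weighting tested here can be simulated directly; the snippet below derives the predicted visual weight from single-cue thresholds, with all threshold values invented for illustration.

```python
import numpy as np

def predicted_weights(sigma_visual, sigma_vestibular):
    """Reliability-proportional (Bayesian-optimal) cue weights."""
    r_vis, r_vest = 1 / sigma_visual**2, 1 / sigma_vestibular**2
    w_vis = r_vis / (r_vis + r_vest)
    return w_vis, 1.0 - w_vis

# Hypothetical single-cue heading thresholds (deg) at two visual coherence levels
for label, sigma_vis in [("high coherence", 1.0), ("low coherence", 4.0)]:
    w_vis, w_vest = predicted_weights(sigma_vis, sigma_vestibular=2.0)
    print(f"{label}: visual weight = {w_vis:.2f}, vestibular weight = {w_vest:.2f}")

# high coherence -> visual weight ~0.80; low coherence -> ~0.20. Over-weighting of
# the vestibular cue, as reported for some subjects, would appear as measured
# weights exceeding these predictions.
```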

  3. Visual recovery following open globe injury with initial no light perception

    PubMed Central

    Han, Yong S; Kavoussi, Shaheen C; Adelman, Ron A

    2015-01-01

    Background The purpose of this study was to analyze eyes presenting with no light perception (NLP) after open globe injury (OGI) to determine visual outcomes and prognostic indicators for visual recovery. Methods The records of consecutive patients with at least 6 months of follow-up presenting with OGI and NLP to a single institution between January 1, 2003 and December 31, 2013 were reviewed for demographics, ophthalmic history, context and characteristics of injury, ocular examination findings, surgical interventions, and follow-up visual acuity. Unpaired t-tests and Fisher’s Exact tests were used for statistical analysis. Results Twenty-five patients met our inclusion criteria. The mean age was 50.4±25.5 (range 8–91) years. Four patients (16%) regained vision (hand motion in three patients and light perception in one patient) while 21 patients (84%) remained with NLP or had a prosthesis at final follow-up. Fourteen eyes (56%) were enucleated; nine (36%) were secondary enucleations. Although the sample sizes were small, neither ocular trauma score nor wound size was found to predict visual recovery. Conclusion Four patients regained some vision after presenting with NLP due to OGI. These findings suggest that, in select cases, physicians should discuss the possibility of regaining some vision. PMID:26316683

  4. Visual Contextual Effects of Orientation, Contrast, Flicker, and Luminance: All Are Affected by Normal Aging

    PubMed Central

    Nguyen, Bao N.; McKendrick, Allison M.

    2016-01-01

    The perception of a visual stimulus can be markedly altered by spatial interactions between the stimulus and its surround. For example, a grating stimulus appears lower in contrast when surrounded by a similar pattern of higher contrast: a phenomenon known as surround suppression of perceived contrast. Such center–surround interactions in visual perception are numerous and arise from both cortical and pre-cortical neural circuitry. For example, perceptual surround suppression of luminance and flicker are predominantly mediated pre-cortically, whereas contrast and orientation suppression have strong cortical contributions. Here, we compare the perception of older and younger observers on a battery of tasks designed to assess such visual contextual effects. For all visual dimensions tested (luminance, flicker, contrast, and orientation), on average the older adults showed greater suppression of central targets than the younger adult group. The increase in suppression was consistent in magnitude across all tasks, suggesting that normal aging produces a generalized, non-specific alteration to contextual processing in vision. PMID:27148047

  5. Human fMRI Reveals That Delayed Action Re-Recruits Visual Perception

    PubMed Central

    Kaufman, Liam D.; Culham, Jody C.

    2013-01-01

    Behavioral and neuropsychological research suggests that delayed actions rely on different neural substrates than immediate actions; however, the specific brain areas implicated in the two types of actions remain unknown. We used functional magnetic resonance imaging (fMRI) to measure human brain activation during delayed grasping and reaching. Specifically, we examined activation during visual stimulation and action execution separated by an 18-s delay interval in which subjects had to remember an intended action toward the remembered object. The long delay interval enabled us to unambiguously distinguish visual, memory-related, and action responses. Most strikingly, we observed reactivation of the lateral occipital complex (LOC), a ventral-stream area implicated in visual object recognition, and early visual cortex (EVC) at the time of action. Importantly, this reactivation was observed even though participants remained in complete darkness with no visual stimulation at the time of the action. Moreover, within EVC, higher activation was observed for grasping than reaching during both vision and action execution. Areas in the dorsal visual stream were activated during action execution as expected and, for some, also during vision. Several areas, including the anterior intraparietal sulcus (aIPS), dorsal premotor cortex (PMd), primary motor cortex (M1) and the supplementary motor area (SMA), showed sustained activation during the delay phase. We propose that during delayed actions, dorsal-stream areas plan and maintain coarse action goals; however, at the time of execution, motor programming requires re-recruitment of detailed visual information about the object through reactivation of (1) ventral-stream areas involved in object perception and (2) early visual areas that contain richly detailed visual representations, particularly for grasping. PMID:24040007

  6. How to make a good animation: A grounded cognition model of how visual representation design affects the construction of abstract physics knowledge

    NASA Astrophysics Data System (ADS)

    Chen, Zhongzhou; Gladding, Gary

    2014-06-01

    Visual representations play a critical role in teaching physics. However, since we do not have a satisfactory understanding of how visual perception impacts the construction of abstract knowledge, most visual representations used in instructions are either created based on existing conventions or designed according to the instructor's intuition, which leads to a significant variance in their effectiveness. In this paper we propose a cognitive mechanism based on grounded cognition, suggesting that visual perception affects understanding by activating "perceptual symbols": the basic cognitive unit used by the brain to construct a concept. A good visual representation activates perceptual symbols that are essential for the construction of the represented concept, whereas a bad representation does the opposite. As a proof of concept, we conducted a clinical experiment in which participants received three different versions of a multimedia tutorial teaching the integral expression of electric potential. The three versions were only different by the details of the visual representation design, only one of which contained perceptual features that activate perceptual symbols essential for constructing the idea of "accumulation." On a following post-test, participants receiving this version of tutorial significantly outperformed those who received the other two versions of tutorials designed to mimic conventional visual representations used in classrooms.

  7. Linguistic experience and audio-visual perception of non-native fricatives.

    PubMed

    Wang, Yue; Behne, Dawn M; Jiang, Haisheng

    2008-09-01

    This study examined the effects of linguistic experience on audio-visual (AV) perception of non-native (L2) speech. Canadian English natives and Mandarin Chinese natives differing in degree of English exposure [long and short length of residence (LOR) in Canada] were presented with English fricatives of three visually distinct places of articulation: interdentals nonexistent in Mandarin and labiodentals and alveolars common in both languages. Stimuli were presented in quiet and in a cafe-noise background in four ways: audio only (A), visual only (V), congruent AV (AVc), and incongruent AV (AVi). Identification results showed that overall performance was better in the AVc than in the A or V condition and better in quiet than in cafe noise. While the Mandarin long LOR group approximated the native English patterns, the short LOR group showed poorer interdental identification, more reliance on visual information, and greater AV-fusion with the AVi materials, indicating the failure of L2 visual speech category formation with the short LOR non-natives and the positive effects of linguistic experience with the long LOR non-natives. These results point to an integrated network in AV speech processing as a function of linguistic background and provide evidence to extend auditory-based L2 speech learning theories to the visual domain.

  8. Hearing the speed: visual motion biases the perception of auditory tempo.

    PubMed

    Su, Yi-Huang; Jonikaitis, Donatas

    2011-10-01

    The coupling between sensory and motor processes has been established in various scenarios: for example, the perception of auditory rhythm entails an audiomotor representation of the sounds. Similarly, visual action patterns can also be represented via a visuomotor transformation. In this study, we tested the hypothesis that the visual motor information, such as embedded in a coherent motion flow, can interact with the perception of a motor-related aspect in auditory rhythm: the tempo. In the first two experiments, we employed an auditory tempo judgment task where participants listened to a standard auditory sequence while concurrently watching visual stimuli of different motion information, after which they judged the tempo of a comparison sequence related to the standard. In Experiment 1, we found that the same auditory tempo was perceived as faster when it was accompanied by accelerating visual motion than by non-motion luminance change. In Experiment 2, we compared the perceived auditory tempo among three visual motion conditions, increase in speed, decrease in speed, and no speed change, and found the corresponding bias in judgment of auditory tempo: faster than it was, slower than it was, and no bias. In Experiment 3, the perceptual bias induced by the change in motion speed was consistently reflected in the tempo reproduction task. Taken together, these results indicate that between a visual spatiotemporal and an auditory temporal stimulation, the embedded motor representations from each can interact across modalities, leading to a spatial-to-temporal bias. This suggests that the perceptual process in one modality can incorporate concurrent motor information from cross-modal sensory inputs to form a coherent experience.

  9. Semantic congruence affects hippocampal response to repetition of visual associations.

    PubMed

    McAndrews, Mary Pat; Girard, Todd A; Wilkins, Leanne K; McCormick, Cornelia

    2016-09-01

    Recent research has shown complementary engagement of the hippocampus and medial prefrontal cortex (mPFC) in encoding and retrieving associations based on pre-existing or experimentally-induced schemas, such that the latter supports schema-congruent information whereas the former is more engaged for incongruent or novel associations. Here, we attempted to explore some of the boundary conditions in the relative involvement of those structures in short-term memory for visual associations. The current literature is based primarily on intentional evaluation of schema-target congruence and on study-test paradigms with relatively long delays between learning and retrieval. We used a continuous recognition paradigm to investigate hippocampal and mPFC activation to first and second presentations of scene-object pairs as a function of semantic congruence between the elements (e.g., beach-seashell versus schoolyard-lamp). All items were identical at first and second presentation, and the context scene, which was presented 500 ms prior to the appearance of the target object, was incidental to the task, which required a recognition response to the central target only. Very short lags (2-8 intervening stimuli) occurred between presentations. Encoding the targets with congruent contexts was associated with increased activation in visual cortical regions at initial presentation and faster response time at repetition, but we did not find enhanced activation in mPFC relative to incongruent stimuli at either presentation. We did observe enhanced activation in the right anterior hippocampus, as well as regions in visual and lateral temporal and frontal cortical regions, for the repetition of incongruent scene-object pairs. This pattern demonstrates rapid and incidental effects of schema processing in hippocampal, but not mPFC, engagement during continuous recognition. PMID:27449709

  10. Simulated Environments with Animated Agents: Effects on Visual Attention, Emotion, Performance, and Perception

    ERIC Educational Resources Information Center

    Romero-Hall, E.; Watson, G. S.; Adcock, A.; Bliss, J.; Adams Tufts, K.

    2016-01-01

    This research assessed how emotive animated agents in a simulation-based training affect the performance outcomes and perceptions of the individuals interacting in real time with the training application. A total of 56 participants consented to complete the study. The material for this investigation included a nursing simulation in which…

  11. Environmental risk perception from visual cues: the psychophysics of tornado risk perception

    NASA Astrophysics Data System (ADS)

    Dewitt, Barry; Fischhoff, Baruch; Davis, Alexander; Broomell, Stephen B.

    2015-12-01

    Lay judgments of environmental risks are central to both immediate decisions (e.g., taking shelter from a storm) and long-term ones (e.g., building in locations subject to storm surges). Using methods from quantitative psychology, we provide a general approach to studying lay perceptions of environmental risks. As a first application of these methods, we investigate a setting where lay decisions have not taken full advantage of advances in natural science understanding: tornado forecasts in the US and Canada. Because official forecasts are imperfect, members of the public must often evaluate the risks on their own, by checking environmental cues (such as cloud formations) before deciding whether to take protective action. We study lay perceptions of cloud formations, demonstrating an approach that could be applied to other environmental judgments. We use signal detection theory to analyse how well people can distinguish tornadic from non-tornadic clouds, and multidimensional scaling to determine how people make these judgments. We find that participants (N = 400 recruited from Amazon Mechanical Turk) have heuristics that generally serve them well, helping participants to separate tornadic from non-tornadic clouds, but which also lead them to misjudge the tornado risk of certain cloud types. The signal detection task revealed confusion regarding shelf clouds, mammatus clouds, and clouds with upper- and mid-level tornadic features, which the multidimensional scaling task suggested was the result of participants focusing on the darkness of the weather scene and the ease of discerning its features. We recommend procedures for training (e.g., for storm spotters) and communications (e.g., tornado warnings) that will reduce systematic misclassifications of tornadicity arising from observers’ reliance on otherwise useful heuristics.
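    The signal detection analysis described here reduces to computing sensitivity (d') and criterion from hit and false-alarm rates; the snippet below is a generic sketch of that computation, not the study's analysis code.

```python
from scipy.stats import norm

def sdt_indices(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') and criterion (c) from a 2x2 detection table."""
    # Log-linear correction keeps z-scores finite when a rate is 0 or 1
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    return z_hit - z_fa, -0.5 * (z_hit + z_fa)

# Hypothetical counts: "tornadic" responses to 40 tornadic and 40 non-tornadic clouds
d_prime, criterion = sdt_indices(hits=30, misses=10, false_alarms=12, correct_rejections=28)
print(f"d' = {d_prime:.2f}, criterion = {criterion:.2f}")
```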

  12. The intrinsic value of visual information affects saccade velocities.

    PubMed

    Xu-Wilson, Minnan; Zee, David S; Shadmehr, Reza

    2009-07-01

    Let us assume that the purpose of any movement is to position our body in a more advantageous or rewarding state. For example, we might make a saccade to foveate an image because our brain assigns an intrinsic value to the information that it expects to acquire at the endpoint of that saccade. Different images might have different intrinsic values. Optimal control theory predicts that the intrinsic value that the brain assigns to targets of saccades should be reflected in the trajectory of the saccade. That is, in anticipation of foveating a highly valued image, our brain should produce a saccade with a higher velocity and shorter duration. Here, we considered four types of images: faces, objects, inverted faces, and meaningless visual noise. Indeed, we found that reflexive saccades that were made to a laser light in anticipation of viewing an image of a face had the highest velocities and shortest durations. The intrinsic value of visual information appears to have a small but significant influence on the motor commands that guide saccades.

  13. Collinear integration affects visual search at V1.

    PubMed

    Chow, Hiu Mei; Jingling, Li; Tseng, Chia-huei

    2013-08-29

    Perceptual grouping plays an indispensable role in figure-ground segregation and attention distribution. For example, a column pops out if it contains element bars orthogonal to uniformly oriented element bars. Jingling and Tseng (2013) have reported that contextual grouping in a column matters to visual search behavior: When a column is grouped into a collinear (snakelike) structure, a target positioned on it became harder to detect than on other noncollinear (ladderlike) columns. How and where perceptual grouping interferes with selective attention is still largely unknown. This article contributes to this little-studied area by asking whether collinear contour integration interacts with visual search before or after binocular fusion. We first identified that the previously mentioned search impairment occurs with a distractor of five or nine elements but not one element in a 9 × 9 search display. To pinpoint the site of this effect, we presented the search display with a short collinear bar (one element) to one eye and the extending collinear bars to the other eye, such that when properly fused, the combined binocular collinear length (nine elements) exceeded the critical length. No collinear search impairment was observed, implying that collinear information before binocular fusion shaped participants' search behavior, although contour extension from the other eye after binocular fusion enhanced the effect of collinearity on attention. Our results suggest that attention interacts with perceptual grouping as early as V1.

  14. Tilt and Translation Motion Perception during Pitch Tilt with Visual Surround Translation

    NASA Technical Reports Server (NTRS)

    O'Sullivan, Brita M.; Harm, Deborah L.; Reschke, Millard F.; Wood, Scott J.

    2006-01-01

    The central nervous system must resolve the ambiguity of inertial motion sensory cues in order to derive an accurate representation of spatial orientation. Previous studies suggest that multisensory integration is critical for discriminating linear accelerations arising from tilt and translation head motion. Visual input is especially important at low frequencies where canal input is declining. The NASA Tilt Translation Device (TTD) was designed to recreate postflight orientation disturbances by exposing subjects to matching tilt self motion with conflicting visual surround translation. Previous studies have demonstrated that brief exposures to pitch tilt with fore-aft visual surround translation produced changes in compensatory vertical eye movement responses, postural equilibrium, and motion sickness symptoms. Adaptation appeared greatest with visual scene motion leading (versus lagging) the tilt motion, and the adaptation time constant appeared to be approximately 30 min. The purpose of this study was to compare motion perception when the visual surround translation was in-phase versus out-of-phase with pitch tilt. The in-phase stimulus presented visual surround motion one would experience if the linear acceleration was due to fore-aft self translation within a stationary surround, while the out-of-phase stimulus had the visual scene motion leading the tilt by 90 deg as previously used. The tilt stimuli in these conditions were asymmetrical, ranging from an upright orientation to 10 deg pitch back. Another objective of the study was to compare motion perception with the in-phase stimulus when the tilts were asymmetrical relative to upright (0 to 10 deg back) versus symmetrical (10 deg forward to 10 deg back). Twelve subjects (6M, 6F, 22-55 yrs) were tested during 3 sessions separated by at least one week. During each of the three sessions (out-of-phase asymmetrical, in-phase asymmetrical, in-phase symmetrical), subjects were exposed to visual surround translation

  15. The Role of Affective and Cognitive Individual Differences in Social Perception.

    PubMed

    Aquino, Antonio; Haddock, Geoffrey; Maio, Gregory R; Wolf, Lukas J; Alparone, Francesca R

    2016-06-01

    Three studies explored the connection between social perception processes and individual differences in the use of affective and cognitive information in relation to attitudes. Study 1 revealed that individuals high in need for affect (NFA) accentuated differences in evaluations of warm and cold traits, whereas individuals high in need for cognition (NFC) accentuated differences in evaluations of competent and incompetent traits. Study 2 revealed that individual differences in NFA predicted liking of warm or cold targets, whereas individual differences in NFC predicted perceptions of competent or incompetent targets. Furthermore, the effects of NFA and NFC were independent of structural bases and meta-bases of attitudes. Study 3 revealed that differences in the evaluation of warm and cold traits mediated the effects of NFA and NFC on liking of targets. The implications for social perception processes and for individual differences in affect-cognition are discussed. PMID:27460272

  16. Neighborhood Perceptions Affect Dietary Behaviors and Diet Quality

    ERIC Educational Resources Information Center

    Keita, Akilah Dulin; Casazza, Krista; Thomas, Olivia; Fernandez, Jose R.

    2011-01-01

    Objective: The primary purpose of this study was to determine if perceived neighborhood disorder affected dietary quality within a multiethnic sample of children. Design: Children were recruited through the use of fliers, wide-distribution mailers, parent magazines, and school presentations from June 2005 to December 2008. Setting:…

  17. Hemispheric Asymmetries in Children's Perception of Nonlinguistic Human Affective Sounds

    ERIC Educational Resources Information Center

    Pollak, Seth D.; Holt, Lori L.; Fries, Alison B. Wismer

    2004-01-01

    In the present work, we developed a database of nonlinguistic sounds that mirror prosodic characteristics typical of language and thus carry affective information, but do not convey linguistic information. In a dichotic-listening task, we used these novel stimuli as a means of disambiguating the relative contributions of linguistic and affective…

  18. Factors Affecting the Effectiveness and Use of Moodle: Students' Perception

    ERIC Educational Resources Information Center

    Damnjanovic, Vesna; Jednak, Sandra; Mijatovic, Ivana

    2015-01-01

    The purpose of this research paper is to identify the factors affecting the effectiveness of Moodle from the students' perspective. The research hypotheses derived from the suggested extended Seddon model have been empirically validated using the responses to a survey on e-learning usage among 255 users. We tested the model across higher education…

  19. Influence of Visual Motion, Suggestion, and Illusory Motion on Self-Motion Perception in the Horizontal Plane

    PubMed Central

    Rosenblatt, Steven David; Crane, Benjamin Thomas

    2015-01-01

    A moving visual field can induce the feeling of self-motion or vection. Illusory motion from static repeated asymmetric patterns creates a compelling visual motion stimulus, but it is unclear if such illusory motion can induce a feeling of self-motion or alter self-motion perception. In these experiments, human subjects reported the perceived direction of self-motion for sway translation and yaw rotation at the end of a period of viewing set visual stimuli coordinated with varying inertial stimuli. This tested the hypothesis that illusory visual motion would influence self-motion perception in the horizontal plane. Trials were arranged into 5 blocks based on stimulus type: moving star field with yaw rotation, moving star field with sway translation, illusory motion with yaw, illusory motion with sway, and static arrows with sway. Static arrows were used to evaluate the effect of cognitive suggestion on self-motion perception. Each trial had a control condition; the illusory motion controls were altered versions of the experimental image, which removed the illusory motion effect. For the moving visual stimulus, controls were carried out in a dark room. With the arrow visual stimulus, controls were a gray screen. In blocks containing a visual stimulus there was an 8s viewing interval with the inertial stimulus occurring over the final 1s. This allowed measurement of the visual illusion perception using objective methods. When no visual stimulus was present, only the 1s motion stimulus was presented. Eight women and five men (mean age 37) participated. To assess for a shift in self-motion perception, the effect of each visual stimulus on the self-motion stimulus (cm/s) at which subjects were equally likely to report motion in either direction was measured. Significant effects were seen for moving star fields for both translation (p = 0.001) and rotation (p<0.001), and arrows (p = 0.02). For the visual motion stimuli, inertial motion perception was shifted in the
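    The "equally likely to report motion in either direction" measure described above is a point of subjective equality (PSE); the sketch below shows how such a PSE is typically obtained from a cumulative-Gaussian psychometric fit, with invented data and no claim to match the authors' actual analysis.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(x, pse, sigma):
    """Probability of a 'rightward' report as a cumulative Gaussian."""
    return norm.cdf(x, loc=pse, scale=sigma)

# Hypothetical data: inertial stimulus velocity (cm/s) vs. proportion 'rightward'
velocity = np.array([-4, -2, -1, 0, 1, 2, 4], dtype=float)
p_right = np.array([0.05, 0.15, 0.30, 0.55, 0.75, 0.90, 0.98])

(pse, sigma), _ = curve_fit(psychometric, velocity, p_right, p0=[0.0, 1.0])
print(f"PSE = {pse:.2f} cm/s")  # a shift away from zero indicates a perceptual bias
```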

  20. Combining Strengths and Weaknesses in Visual Perception of Children with an Autism Spectrum Disorder: Perceptual Matching of Facial Expressions

    ERIC Educational Resources Information Center

    Evers, Kris; Noens, Ilse; Steyaert, Jean; Wagemans, Johan

    2011-01-01

    Background: Children with an autism spectrum disorder (ASD) are known to have an atypical visual perception, with deficits in automatic Gestalt formation and an enhanced processing of visual details. In addition, they are sometimes found to have difficulties in emotion processing. Methods: In three experiments, we investigated whether 7-to-11-year…

  1. Categorical Perception of Colour in the Left and Right Visual Field Is Verbally Mediated: Evidence from Korean

    ERIC Educational Resources Information Center

    Roberson, Debi; Pak, Hyensou; Hanley, J. Richard

    2008-01-01

    In this study we demonstrate that Korean (but not English) speakers show Categorical perception (CP) on a visual search task for a boundary between two Korean colour categories that is not marked in English. These effects were observed regardless of whether target items were presented to the left or right visual field. Because this boundary is…

  2. Visual tuning and metrical perception of realistic point-light dance movements

    PubMed Central

    Su, Yi-Huang

    2016-01-01

    Humans move to music spontaneously, and this sensorimotor coupling underlies musical rhythm perception. The present research proposed that, based on common action representation, different metrical levels as in auditory rhythms could emerge visually when observing structured dance movements. Participants watched a point-light figure performing basic steps of Swing dance cyclically in different tempi, whereby the trunk bounced vertically at every beat and the limbs moved laterally at every second beat, yielding two possible metrical periodicities. In Experiment 1, participants freely identified a tempo of the movement and tapped along. While some observers only tuned to the bounce and some only to the limbs, the majority tuned to one level or the other depending on the movement tempo, which was also associated with individuals’ preferred tempo. In Experiment 2, participants reproduced the tempo of leg movements by four regular taps, and showed a slower perceived leg tempo with than without the trunk bouncing simultaneously in the stimuli. This mirrors previous findings of an auditory ‘subdivision effect’, suggesting the leg movements were perceived as beat while the bounce as subdivisions. Together these results support visual metrical perception of dance movements, which may employ similar action-based mechanisms to those underpinning auditory rhythm perception. PMID:26947252

  3. Trends Affecting the Prevalence of Visual Impairment and Demand for Services.

    ERIC Educational Resources Information Center

    Kirchner, Corinne

    1999-01-01

    Discusses the prevalence of people with visual impairment and trends affecting prevalence, including increased overall populations and a growth in the older population, greater ability to preserve lives of high-risk populations, improved fitness, medical advances in prevention, expanding role of computers among other increasing visual demands, and…

  4. Mathematics anxiety affects counting but not subitizing during visual enumeration.

    PubMed

    Maloney, Erin A; Risko, Evan F; Ansari, Daniel; Fugelsang, Jonathan

    2010-02-01

    Individuals with mathematics anxiety have been found to differ from their non-anxious peers on measures of higher-level mathematical processes, but not simple arithmetic. The current paper examines differences between mathematics anxious and non-mathematics anxious individuals in more basic numerical processing using a visual enumeration task. This task allows for the assessment of two systems of basic number processing: subitizing and counting. Mathematics anxious individuals, relative to non-mathematics anxious individuals, showed a deficit in the counting but not in the subitizing range. Furthermore, working memory was found to mediate this group difference. These findings demonstrate that the problems associated with mathematics anxiety exist at a level more basic than would be predicted from the extant literature.
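    The subitizing/counting distinction assessed with the enumeration task is conventionally indexed by response-time slopes across numerosity, shallow in the subitizing range and steep in the counting range; the sketch below illustrates that slope analysis with invented response times.

```python
import numpy as np

# Hypothetical mean response times (ms) for displays of 1-8 items
numerosity = np.arange(1, 9)
rt = np.array([420, 440, 465, 600, 810, 1015, 1230, 1450], dtype=float)

subitizing = numerosity <= 3   # typical subitizing range
counting = numerosity >= 4

slope_sub = np.polyfit(numerosity[subitizing], rt[subitizing], 1)[0]
slope_cnt = np.polyfit(numerosity[counting], rt[counting], 1)[0]
print(f"subitizing slope ~ {slope_sub:.0f} ms/item, counting slope ~ {slope_cnt:.0f} ms/item")
```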

  5. How Facial Expressions of Emotion Affect Distance Perception

    PubMed Central

    Kim, Nam-Gyoon; Son, Heejung

    2015-01-01

    Facial expressions of emotion are thought to convey expressers’ behavioral intentions, thus priming observers’ approach and avoidance tendencies appropriately. The present study examined whether detecting expressions of behavioral intent influences perceivers’ estimation of the expresser’s distance from them. Eighteen undergraduates (nine male and nine female) participated in the study. Six facial expressions were chosen on the basis of degree of threat—anger, hate (threatening expressions), shame, surprise (neutral expressions), pleasure, and joy (safe expressions). Each facial expression was presented on a tablet PC held by an assistant covered by a black drape who stood 1, 2, or 3 m away from participants. Participants performed a visual matching task to report the perceived distance. Results showed that facial expression influenced distance estimation, with faces exhibiting threatening or safe expressions judged closer than those showing neutral expressions. Females’ judgments were more likely to be influenced; but these influences largely disappeared beyond the 2 m distance. These results suggest that facial expressions of emotion (particularly threatening or safe emotions) influence others’ (especially females’) distance estimations but only within close proximity. PMID:26635708

  6. Assessment of visual perception in adolescents with a history of central coordination disorder in early life – 15-year follow-up study

    PubMed Central

    Kowalski, Ireneusz M.; Domagalska, Małgorzata; Szopa, Andrzej; Dwornik, Michał; Kujawa, Jolanta; Stępień, Agnieszka; Śliwiński, Zbigniew

    2012-01-01

    Introduction Central nervous system damage in early life results in both quantitative and qualitative abnormalities of psychomotor development. Late sequelae of these disturbances may include visual perception disorders which not only affect the ability to read and write but also generally influence the child's intellectual development. This study sought to determine whether a central coordination disorder (CCD) in early life treated according to Vojta's method with elements of the sensory integration (S-I) and neuro-developmental treatment (NDT)/Bobath approaches affects development of visual perception later in life. Material and methods The study involved 44 participants aged 15-16 years, including 19 diagnosed with moderate or severe CCD in the neonatal period, i.e. during the first 2-3 months of life, and with mild-degree neonatal encephalopathy due to perinatal anoxia, and 25 healthy people without a history of developmental psychomotor disturbances in the neonatal period. The study tool was a visual perception IQ test comprising 96 graphic tasks. Results The study revealed equal proportions of participants (p < 0.05) defined as very skilled (94-96), skilled (91-94), average (71-91), poor (67-71), and very poor (0-67) in both groups. These results mean that adolescents with a history of CCD in the neonatal period did not differ with regard to the level of visual perception from their peers who had not demonstrated psychomotor development disorders in the neonatal period. Conclusions Early treatment of children with CCD affords a possibility of normalising their psychomotor development early enough to prevent consequences in the form of cognitive impairments in later life. PMID:23185199

  7. Remote haptic perception of slanted surfaces shows the same scale expansion as visual perception.

    PubMed

    Shaffer, Dennis M; McManama, Eric

    2015-04-01

    Previous work has shown that overestimates of geographic slant depend on the modality used (verbal or haptic). Recently, that line of reasoning has come into question for many reasons, not the least of which is that the typical method used for measuring "action" has been the use of a palm board, which is not well calibrated to any type of action toward slanted surfaces. In the present work, we investigated how a remote haptic task that has been well calibrated to action in previous work is related to verbal overestimates of slanted surfaces that are out of reach. The results show that haptic estimates are perceptually equivalent to the verbal overestimates that have been found in numerous previous studies. This work shows that the haptic perceptual system is scaled in the same way as the visual perceptual system for estimating the orientation of slanted surfaces that are out of reach.

  8. Semantic Categorization Precedes Affective Evaluation of Visual Scenes

    ERIC Educational Resources Information Center

    Nummenmaa, Lauri; Hyona, Jukka; Calvo, Manuel G.

    2010-01-01

    We compared the primacy of affective versus semantic categorization by using forced-choice saccadic and manual response tasks. Participants viewed paired emotional and neutral scenes involving humans or animals flashed rapidly in extrafoveal vision. Participants were instructed to categorize the targets by saccading toward the location occupied by…

  9. MEG brain activities reflecting affection for visual food stimuli.

    PubMed

    Kuriki, Shinya; Miyamura, Takahiro; Uchikawa, Yoshinori

    2010-01-01

    This study aimed to explore the modulation of alpha rhythm in response to food pictures with distinct affection values. We examined a method to discriminate the subject's state, i.e., whether he/she liked the article of food or not, from MEG signals detected over the head. Pictures of familiar foods were used as affective stimuli, while the same pictures with complementary color phase were used as non-affective stimuli. Alpha band signals in a narrow frequency window around the spectral peak of individual subjects were wavelet analyzed, and the phase-locked component relative to stimulus onset was obtained as a complex number. The amplitude of the phase-locked component was averaged during 0-1 s after stimulus onset over 30 epochs in a measurement session and across 76 MEG sensor channels. In statistical tests on individual subjects, a significant difference was found in the real part of the averaged phase-locked amplitude between the normal-color and reverse-color pictures. These results suggest that affective information processing of food pictures is reflected in the synchronized component of the narrow-band alpha rhythm. PMID:21096510
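
    The signal-processing pipeline sketched in this abstract (narrow-band complex wavelet decomposition around the individual alpha peak, then averaging the stimulus-locked complex component over epochs, the 0-1 s window, and sensors) can be illustrated as follows. The sampling rate, epoch layout, and wavelet parameters are assumptions made for the sketch, not values reported in the study.

```python
import numpy as np

def phase_locked_alpha(epochs, fs, alpha_peak_hz, n_cycles=7):
    """Average stimulus-phase-locked alpha component across epochs and sensors.

    epochs        : array (n_epochs, n_channels, n_samples); each epoch is
                    assumed to start at stimulus onset (layout is an assumption)
    fs            : sampling rate in Hz
    alpha_peak_hz : individual alpha peak frequency

    Returns a complex number; its real part could then be compared between
    conditions, as described in the abstract.
    """
    # Complex Morlet wavelet centred on the individual alpha peak
    t = np.arange(-1.0, 1.0, 1.0 / fs)
    sigma_t = n_cycles / (2.0 * np.pi * alpha_peak_hz)
    wavelet = np.exp(2j * np.pi * alpha_peak_hz * t) * np.exp(-t**2 / (2.0 * sigma_t**2))
    wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))

    # Complex convolution keeps both amplitude and phase of the alpha band
    analytic = np.array([[np.convolve(channel, wavelet, mode="same")
                          for channel in epoch] for epoch in epochs])

    # Averaging complex values across epochs cancels non-phase-locked alpha
    # and leaves only the component phase-locked to stimulus onset
    phase_locked = analytic.mean(axis=0)

    # Average over the 0-1 s post-stimulus window and over all sensors
    return phase_locked[:, : int(fs)].mean()
```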

  10. Production and perception rules underlying visual patterns: effects of symmetry and hierarchy.

    PubMed

    Westphal-Fitch, Gesche; Huber, Ludwig; Gómez, Juan Carlos; Fitch, W Tecumseh

    2012-07-19

    Formal language theory has been extended to two-dimensional patterns, but little is known about two-dimensional pattern perception. We first examined spontaneous two-dimensional visual pattern production by humans, gathered using a novel touch screen approach. Both spontaneous creative production and subsequent aesthetic ratings show that humans prefer ordered, symmetrical patterns over random patterns. We then further explored pattern-parsing abilities in different human groups, and compared them with pigeons. We generated visual plane patterns based on rules varying in complexity. All human groups tested, including children and individuals diagnosed with autism spectrum disorder (ASD), were able to detect violations of all production rules tested. Our ASD participants detected pattern violations with the same speed and accuracy as matched controls. Children's ability to detect violations of a relatively complex rotational rule correlated with age, whereas their ability to detect violations of a simple translational rule did not. By contrast, even with extensive training, pigeons were unable to detect orientation-based structural violations, suggesting that, unlike humans, they did not learn the underlying structural rules. Visual two-dimensional patterns offer a promising new formally-grounded way to investigate pattern production and perception in general, widely applicable across species and age groups.

  11. Production and perception rules underlying visual patterns: effects of symmetry and hierarchy

    PubMed Central

    Westphal-Fitch, Gesche; Huber, Ludwig; Gómez, Juan Carlos; Fitch, W. Tecumseh

    2012-01-01

    Formal language theory has been extended to two-dimensional patterns, but little is known about two-dimensional pattern perception. We first examined spontaneous two-dimensional visual pattern production by humans, gathered using a novel touch screen approach. Both spontaneous creative production and subsequent aesthetic ratings show that humans prefer ordered, symmetrical patterns over random patterns. We then further explored pattern-parsing abilities in different human groups, and compared them with pigeons. We generated visual plane patterns based on rules varying in complexity. All human groups tested, including children and individuals diagnosed with autism spectrum disorder (ASD), were able to detect violations of all production rules tested. Our ASD participants detected pattern violations with the same speed and accuracy as matched controls. Children's ability to detect violations of a relatively complex rotational rule correlated with age, whereas their ability to detect violations of a simple translational rule did not. By contrast, even with extensive training, pigeons were unable to detect orientation-based structural violations, suggesting that, unlike humans, they did not learn the underlying structural rules. Visual two-dimensional patterns offer a promising new formally-grounded way to investigate pattern production and perception in general, widely applicable across species and age groups. PMID:22688636

  12. Effect of body temperature on visual evoked potential delay and visual perception in multiple sclerosis.

    PubMed

    Regan, D; Murray, T J; Silver, R

    1977-11-01

    Seven multiple sclerosis patients were cooled and four heated, but evoked potential delay changed in only five out of 11 experiments. Control limits were set by cooling eight and heating four control subjects. One patient gave anomalous results in that although heating degraded perceptual delay and visual acuity, and depressed the sine wave grating MTF, double-flash resolution was improved. An explanation is proposed in terms of the pattern of axonal demyelination. The medium frequency flicker evoked potential test seems to be a less reliable means of monitoring the progress of demyelination in multiple sclerosis patients than is double-flash campimetry or perceptual delay campimetry, although in some situations the objectivity of the evoked potential test would be advantageous.

  13. Functional dissociation between action and perception of object shape in developmental visual object agnosia.

    PubMed

    Freud, Erez; Ganel, Tzvi; Avidan, Galia; Gilaie-Dotan, Sharon

    2016-03-01

    According to the two visual systems model, the cortical visual system is segregated into a ventral pathway mediating object recognition, and a dorsal pathway mediating visuomotor control. In the present study we examined whether the visual control of action could develop normally even when visual perceptual abilities are compromised from early childhood onward. Using his fingers, LG, an individual with a rare developmental visual object agnosia, manually estimated (perceptual condition) the width of blocks that varied in width and length (but not in overall size), or simply picked them up across their width (grasping condition). LG's perceptual sensitivity to target width was profoundly impaired in the manual estimation task compared to matched controls. In contrast, the sensitivity to object shape during grasping, as measured by maximum grip aperture (MGA), the time to reach the MGA, the reaction time and the total movement time were all normal in LG. Further analysis, however, revealed that LG's sensitivity to object shape during grasping emerged at a later time stage during the movement compared to controls. Taken together, these results demonstrate a dissociation between action and perception of object shape, and also point to a distinction between different stages of the grasping movement, namely planning versus online control. Moreover, the present study implies that visuomotor abilities can develop normally even when perceptual abilities developed in a profoundly impaired fashion.

  14. The plausibility of visual information for hand ownership modulates multisensory synchrony perception.

    PubMed

    Zopf, Regine; Friedman, Jason; Williams, Mark A

    2015-08-01

    We are frequently changing the position of our bodies and body parts within complex environments. How does the brain keep track of one's own body? Current models of body ownership state that visual body ownership cues such as viewed object form and orientation are combined with multisensory information to correctly identify one's own body, estimate its current location and evoke an experience of body ownership. Within this framework, it may be possible that the brain relies on a separate perceptual analysis of body ownership cues (e.g. form, orientation, multisensory synchrony). Alternatively, these cues may interact in earlier stages of perceptual processing-visually derived body form and orientation cues may, for example, directly modulate temporal synchrony perception. The aim of the present study was to distinguish between these two alternatives. We employed a virtual hand set-up and psychophysical methods. In a two-interval forced-choice task, participants were asked to detect temporal delays between executed index finger movements and observed movements. We found that body-specifying cues interact in perceptual processing. Specifically, we show that plausible visual information (both form and orientation) for one's own body led to significantly better detection performance for small multisensory asynchronies compared to implausible visual information. We suggest that this perceptual modulation when visual information plausible for one's own body is present is a consequence of body-specific sensory predictions. PMID:25980691

  15. Visual features for perception, attention, and working memory: Toward a three-factor framework.

    PubMed

    Huang, Liqiang

    2015-12-01

    Visual features are the general building blocks for attention, perception, and working memory. Here, I explore the factors which can quantitatively predict all the differences they make in various paradigms. I tried to combine the strengths of experimental and correlational approaches in a novel way by developing an individual-item differences analysis to extract the factors from 16 stimulus types on the basis of their roles in eight tasks. A large sample size (410) ensured that all eight tasks had a reliability (Cronbach's α) of no less than 0.975, allowing the factors to be precisely determined. Three orthogonal factors were identified which correspond respectively to featural strength (i.e., how close a stimulus is to a basic feature), visual strength (i.e., visual quality of the stimulus), and spatial strength (i.e., how well a stimulus can be represented as a spatial structure). Featural strength helped substantially in all the tasks but moderately less so in perceptual discrimination; visual strength helped substantially in low-level tasks but not in high-level tasks; and spatial strength helped change detection but hindered ensemble matching and visual search. Jointly, these three factors explained 96.4% of all the variances of the eight tasks, making it clear that they account for almost everything about the roles of these 16 stimulus types in these eight tasks.
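
    Two of the quantities named in this abstract, Cronbach's alpha as a reliability measure for each task and the share of task variance captured by a small set of orthogonal factors, can be computed roughly as in the sketch below. The data shapes are hypothetical, and the study's actual individual-item differences analysis is more elaborate than this plain SVD.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (n_participants, n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_variances / total_variance)

def variance_explained(task_by_stimulus, n_factors=3):
    """Proportion of variance in a (n_tasks, n_stimulus_types) performance
    matrix captured by the first n_factors orthogonal components (SVD on
    column-centred data)."""
    X = np.asarray(task_by_stimulus, dtype=float)
    X = X - X.mean(axis=0, keepdims=True)
    singular_values = np.linalg.svd(X, compute_uv=False)
    return (singular_values[:n_factors] ** 2).sum() / (singular_values ** 2).sum()
```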

  16. Influence Of Ambient Light On The "Visual" Sensitometric Properties Of, And Detail Perception On, A Radiograph

    NASA Astrophysics Data System (ADS)

    Bollen, Romain; Vranckx, Jean

    1981-07-01

    Lack of perception at high densities on radiographs and the influence of viewing conditions on it are well known. This lack may be caused by blinding effects, by high visual noise at low light intensities, or by a third phenomenon, i.e., the dependence of the sensitometric properties of film on viewing conditions, which is analyzed in this paper. Reflection of ambient light by the film mainly lowers high densities dramatically, along with film contrast at these densities. Sensitometric curves of several films were measured under different viewing conditions by means of a telescopic photometer. The curves can also be deduced from curves measured by a regular densitometer when the optical properties of the film, the ambient light level and the light intensity of the negatoscope are known. The influence of the phenomenon under typical viewing conditions for the Curix MR4 film is demonstrated by means of sensitometric and perceptibility curves.
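
    The dependence of the "visual" density curve on viewing conditions can be captured, under a simple additive-veil assumption, by adding the ambient light reflected off the film to the light transmitted from the negatoscope. The formula and variable names below are a hedged reconstruction of that idea, not equations taken from the paper.

```python
import numpy as np

def visual_density(densitometer_density, viewbox_luminance, reflected_ambient):
    """Effective ('visual') optical density under ambient room light.

    Assumes the light reaching the eye is the viewbox light attenuated by the
    film plus a constant veil of ambient light reflected off the film surface;
    the additive-veil model and parameter names are illustrative assumptions.
    """
    d = np.asarray(densitometer_density, dtype=float)
    transmitted = viewbox_luminance * 10.0 ** (-d)
    return -np.log10((transmitted + reflected_ambient) / viewbox_luminance)

# A density-3.0 area viewed on a 2000 cd/m^2 negatoscope with 2 cd/m^2 of
# reflected room light appears at roughly density 2.7, while low densities are
# nearly unchanged -- the compression of high densities described above.
print(visual_density([0.5, 1.5, 3.0], viewbox_luminance=2000.0, reflected_ambient=2.0))
```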

  17. Motion perception: a review of developmental changes and the role of early visual experience

    PubMed Central

    Hadad, Batsheva; Schwartz, Sivan; Maurer, Daphne; Lewis, Terri L.

    2015-01-01

    Significant controversies have arisen over the developmental trajectory for the perception of global motion. Studies diverge on the age at which it becomes adult-like, with estimates ranging from as young as 3 years to as old as 16. In this article, we review these apparently conflicting results and suggest a potentially unifying hypothesis that may also account for the contradictory literature in neurodevelopmental disorders, such as Autism Spectrum Disorder (ASD). We also discuss the extent to which patterned visual input during this period is necessary for the later development of motion perception. We conclude by addressing recent studies directly comparing different types of motion integration, both in typical and atypical development, and suggest areas ripe for future research. PMID:26441564

  18. Evaluation of factors affecting stakeholder risk perception of contaminated sediment disposal in Oslo harbor.

    PubMed

    Sparrevik, Magnus; Ellen, Gerald Jan; Duijn, Mike

    2011-01-01

    The management of environmental pollution has changed considerably since the growth of environmental awareness in the late 1960s. The general increased environmental concern and involvement of stakeholders in today's environmental issues may enhance the need to consider risk in a much broader social context rather than just as an estimate of ecological hazard. Risk perception and the constructs and images of risks held by stakeholders and society are important items to address in the management of environmental projects, including the management of contaminated sediments. Here we present a retrospective case study that evaluates factors affecting stakeholder risk perception of contaminated sediment disposal that occurred during a remediation project in Oslo harbor, Norway. The choice to dispose of dredged contaminated sediments in a confined aquatic disposal (CAD) site rather than at a land disposal site has received a lot of societal attention, attracted large media coverage, and caused many public discussions. A mixed method approach is used to investigate how risk perceptive affective factors (PAF), socio-demographic aspects, and participatory aspects have influenced the various stakeholders' preferences for the two different disposal options. Risk perceptive factors such as transparency in the decision making process and controllability of the disposal options have been identified as important for risk perception. The results of the study also support the view that there is no sharp distinction in risk perception between experts and other parties and emphasize the importance of addressing risk perceptive affective factors in similar environmental decision-making processes. Indeed, PAFs such as transparency, openness, and information are fundamental to address in sensitive environmental decisions, such as sediment disposal alternatives, in order to progress to more technical questions such as controllability and safety.

  19. Emotional prosody rarely affects the spatial distribution of visual attention.

    PubMed

    Godfrey, Hazel K; Grimshaw, Gina M

    2012-01-01

    Emotional manipulations have been demonstrated to produce leftward shifts in perceptual asymmetries. However, much of this research has used linguistic tasks to assess perceptual asymmetry and there are therefore two interpretations of the leftward shift. It may reflect a leftward shift in the spatial distribution of attention as a consequence of emotional activation of the right hemisphere; alternatively it may reflect emotional facilitation of right hemisphere linguistic processing. The current study used two non-linguistic attention tasks to determine whether emotional prosody influences the spatial distribution of visual attention. In a dual-task paradigm participants listened to semantically neutral sentences in neutral, happy or sad prosodies while completing a target discrimination task (Experiment 1) and a target detection task (Experiments 2 and 3). There was only one condition in one experiment that induced perceptual asymmetries that interacted with emotional prosody, suggesting that task-irrelevant emotional prosody only rarely directs attention. Instead a more likely cause of the leftward perceptual shift for comprehension of emotional speech is facilitation of right hemisphere linguistic processing.

  20. Embodiments, visualizations, and immersion with enactive affective systems

    NASA Astrophysics Data System (ADS)

    Domingues, Diana; Miosso, Cristiano J.; Rodrigues, Suélia F.; Silva Rocha Aguiar, Carla; Lucena, Tiago F.; Miranda, Mateus; Rocha, Adson F.; Raskar, Ramesh

    2014-02-01

    Our proposal in Bioart and Biomedical Engineering for affective esthetics focuses on the expanded sensorium and investigates problems regarding enactive systems. These systems enhance the sensorial experiences and amplify kinesthesia by adding the sensations that are formed in response to the physical world, which aesthetically constitutes the principle of synaesthesia. In this paper, we also present enactive systems inside the CAVE, configuring compelling experiences in data landscapes and human affective narratives. The interaction occurs through the acquisition, data visualization and analysis of several synchronized physiological signals, to which the landscapes respond and provide immediate feedback, according to the detected participants' actions and the intertwined responses of the environment. The signals we use to analyze the human states include the electrocardiography (ECG) signal, the respiratory flow, the galvanic skin response (GSR) signal, plantar pressures, the pulse signal and others. Each signal is collected by using a specifically designed dedicated electronic board, with reduced dimensions, so it does not interfere with normal movements, according to the principles of transparent technologies. Also, the electronic boards are implemented in a modular approach, so they are independent, and can be used in many different desired combinations, and at the same time provide synchronization between the collected data.
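
    The synchronization between independently acquired boards mentioned above amounts, at its simplest, to resampling every stream onto a common clock over the interval where all streams overlap. The sketch below assumes each board delivers timestamped samples; the stream names and the use of linear interpolation are illustrative choices, not the authors' implementation.

```python
import numpy as np

def synchronize_streams(streams, target_fs):
    """Resample independently acquired physiological streams onto one clock.

    streams   : dict mapping a name (e.g. 'ecg', 'gsr', 'respiration') to a
                pair (timestamps_in_seconds, samples) from one acquisition board
    target_fs : common output sampling rate in Hz

    Returns (common_time, dict of resampled streams), restricted to the time
    window covered by every stream, using linear interpolation.
    """
    start = max(t[0] for t, _ in streams.values())
    stop = min(t[-1] for t, _ in streams.values())
    common_time = np.arange(start, stop, 1.0 / target_fs)
    resampled = {name: np.interp(common_time, t, x)
                 for name, (t, x) in streams.items()}
    return common_time, resampled
```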

  1. Visual crowding illustrates the inadequacy of local vs. global and feedforward vs. feedback distinctions in modeling visual perception.

    PubMed

    Clarke, Aaron M; Herzog, Michael H; Francis, Gregory

    2014-01-01

    Experimentalists tend to classify models of visual perception as being either local or global, and involving either feedforward or feedback processing. We argue that these distinctions are not as helpful as they might appear, and we illustrate these issues by analyzing models of visual crowding as an example. Recent studies have argued that crowding cannot be explained by purely local processing, but that instead, global factors such as perceptual grouping are crucial. Theories of perceptual grouping, in turn, often invoke feedback connections as a way to account for their global properties. We examined three types of crowding models that are representative of global processing models, and two of which employ feedback processing: a model based on Fourier filtering, a feedback neural network, and a specific feedback neural architecture that explicitly models perceptual grouping. Simulations demonstrate that crucial empirical findings are not accounted for by any of the models. We conclude that empirical investigations that reject a local or feedforward architecture offer almost no constraints for model construction, as there are an uncountable number of global and feedback systems. We propose that the identification of a system as being local or global and feedforward or feedback is less important than the identification of a system's computational details. Only the latter information can provide constraints on model development and promote quantitative explanations of complex phenomena.

  2. Visual crowding illustrates the inadequacy of local vs. global and feedforward vs. feedback distinctions in modeling visual perception

    PubMed Central

    Clarke, Aaron M.; Herzog, Michael H.; Francis, Gregory

    2014-01-01

    Experimentalists tend to classify models of visual perception as being either local or global, and involving either feedforward or feedback processing. We argue that these distinctions are not as helpful as they might appear, and we illustrate these issues by analyzing models of visual crowding as an example. Recent studies have argued that crowding cannot be explained by purely local processing, but that instead, global factors such as perceptual grouping are crucial. Theories of perceptual grouping, in turn, often invoke feedback connections as a way to account for their global properties. We examined three types of crowding models that are representative of global processing models, and two of which employ feedback processing: a model based on Fourier filtering, a feedback neural network, and a specific feedback neural architecture that explicitly models perceptual grouping. Simulations demonstrate that crucial empirical findings are not accounted for by any of the models. We conclude that empirical investigations that reject a local or feedforward architecture offer almost no constraints for model construction, as there are an uncountable number of global and feedback systems. We propose that the identification of a system as being local or global and feedforward or feedback is less important than the identification of a system's computational details. Only the latter information can provide constraints on model development and promote quantitative explanations of complex phenomena. PMID:25374554

  3. Positive affect modulates activity in the visual cortex to images of high calorie foods.

    PubMed

    Killgore, William D S; Yurgelun-Todd, Deborah A

    2007-05-01

    Activity within the visual cortex can be influenced by the emotional salience of a stimulus, but it is not clear whether such cortical activity is modulated by the affective status of the individual. This study used functional magnetic resonance imaging (fMRI) to examine the relationship between affect ratings on the Positive and Negative Affect Schedule and activity within the occipital cortex of 13 normal-weight women while viewing images of high calorie and low calorie foods. Regression analyses revealed that when participants viewed high calorie foods, Positive Affect correlated significantly with activity within the lingual gyrus and calcarine cortex, whereas Negative Affect was unrelated to visual cortex activity. In contrast, during presentations of low calorie foods, affect ratings, regardless of valence, were unrelated to occipital cortex activity. These findings suggest a mechanism whereby positive affective state may affect the early stages of sensory processing, possibly influencing subsequent perceptual experience of a stimulus. PMID:17464782

  4. Teachers’ perceptions of aspects affecting seminar learning: a qualitative study

    PubMed Central

    2013-01-01

    Background Many medical schools have embraced small group learning methods in their undergraduate curricula. Given increasing financial constraints on universities, active learning groups like seminars (with 25 students a group) are gaining popularity. To enhance the understanding of seminar learning and to determine how seminar learning can be optimised it is important to investigate stakeholders’ views. In this study, we qualitatively explored the views of teachers on aspects affecting seminar learning. Methods Twenty-four teachers with experience in facilitating seminars in a three-year bachelor curriculum participated in semi-structured focus group interviews. Three focus groups, led by one moderator, met twice with an interval of two weeks. Sessions were audio taped, transcribed verbatim and independently coded by two researchers using thematic analysis. An iterative process of data reduction resulted in emerging aspects that influence seminar learning. Results Teachers identified seven key aspects affecting seminar learning: the seminar teacher, students, preparation, group functioning, seminar goals and content, course coherence and schedule and facilities. Important components of these aspects were: the teachers’ role in developing seminars (‘ownership’), the amount and quality of preparation materials, a non-threatening learning climate, continuity of group composition, suitability of subjects for seminar teaching, the number and quality of seminar questions, and alignment of different course activities. Conclusions The results of this study contribute to the unravelling of ‘the black box’ of seminar learning. Suggestions for ways to optimise active learning in seminars are made regarding curriculum development, seminar content, quality assurance and faculty development. PMID:23399475

  5. Early Visual Perception Potentiated by Object Affordances: Evidence From a Temporal Order Judgment Task

    PubMed Central

    Yamada, Yuki; Yamani, Yusuke

    2016-01-01

    Perceived objects automatically potentiate afforded action. Object affordances also facilitate perception of such objects, and this occurrence is known as the affordance effect. This study examined whether object affordances facilitate the initial visual processing stage, or perceptual entry processes, using the temporal order judgment task. The onset of the graspable (right-handled) coffee cup was perceived earlier than that of the less graspable (left-handled) cup for right-handed participants. The affordance effect was eliminated when the coffee cups were inverted, which presumably conveyed less affordance information. These results suggest that objects preattentively potentiate the perceptual entry processes in response to their affordances.

  6. [Does music influence visual perception in campimetric measurements of the visual field?].

    PubMed

    Gall, Carolin; Geier, Jens-Stefan; Sabel, Bernhard A; Kasten, Erich

    2009-01-01

    21 subjects (mean age 28.4 +/- 10.9 years, M +/- SD) without any damage of the visual system were examined with computer-based campimetric tests of near-threshold stimulus detection, whereby an artificial tunnel vision was induced. Campimetry was performed in four trials in randomized order using a within-subjects design: 1. classical music, 2. Techno music, 3. music for relaxation and 4. no music. Results were slightly better in all music conditions. Performance was best when subjects were listening to Techno music. The average increase of correctly recognized stimuli and fixation controls amounted to 3%. To check the stability of the effects, 9 subjects were tested three times. A moderating influence of personality traits and habits of listening to music was tested but could not be found. We conclude that music has at least no negative influence on performance in the campimetric measurement. Reasons for the positive effects of music can be seen in a general increase of vigilance and a modulation of perceptual thresholds.

  7. The Use of Virtual Reality in Psychology: A Case Study in Visual Perception

    PubMed Central

    Wilson, Christopher J.; Soranzo, Alessandro

    2015-01-01

    Recent proliferation of available virtual reality (VR) tools has seen increased use in psychological research. This is due to a number of advantages afforded over traditional experimental apparatus such as tighter control of the environment and the possibility of creating more ecologically valid stimulus presentation and response protocols. At the same time, higher levels of immersion and visual fidelity afforded by VR do not necessarily evoke presence or elicit a “realistic” psychological response. The current paper reviews some current uses for VR environments in psychological research and discusses some ongoing questions for researchers. Finally, we focus on the area of visual perception, where both the advantages and challenges of VR are particularly salient. PMID:26339281

  8. Parafoveal perception during sentence reading?: An ERP paradigm using rapid serial visual presentation (RSVP) with flankers

    PubMed Central

    Bentin, Shlomo; Kutas, Marta

    2014-01-01

    We describe a new procedure using event-related brain potentials to investigate parafoveal word processing during sentence reading. Sentences were presented word-by-word at fixation, flanked two degrees bilaterally by letter strings. Flanker strings were pseudowords, except for the third word in each sentence, which was flanked by either two pseudowords, or a pseudoword and a word, one on each side. Flanker words were either semantically congruent or incongruent with the sentence context. P2 (175-375 ms) amplitudes were less positive for contextually incongruent than congruent flanker words but only with flanker words in the right visual field for English, and in the left visual field in Hebrew. Flankered word presentation thus may be a suitable method for the electrophysiological study of parafoveal perception during sentence reading. PMID:21361965

  9. Blind jealousy? Romantic insecurity increases emotion-induced failures of visual perception.

    PubMed

    Most, Steven B; Laurenceau, Jean-Philippe; Graber, Elana; Belcher, Amber; Smith, C Veronica

    2010-04-01

    Does the influence of close relationships pervade so deeply as to impact visual awareness? Results from two experiments involving heterosexual romantic couples suggest that it does. Female partners from each couple performed a rapid detection task where negative emotional distractors typically disrupt visual awareness of subsequent targets; at the same time, their male partners rated attractiveness first of landscapes, then of photos of other women. At the end of both experiments, the degree to which female partners indicated uneasiness about their male partner looking at and rating other women correlated significantly with the degree to which negative emotional distractors had disrupted their target perception during that time. This relationship was robust even when controlling for individual differences in baseline performance. Thus, emotions elicited by social contexts appear to wield power even at the level of perceptual processing.
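
    "Controlling for individual differences in baseline performance" in a correlation of this kind is commonly done by partialling the baseline out of both variables. The sketch below shows one standard way to do that; it is illustrative and not necessarily the authors' exact analysis.

```python
import numpy as np

def partial_correlation(x, y, covariate):
    """Pearson correlation between x and y after removing a covariate's
    linear contribution from each (e.g. baseline task performance)."""
    def residualize(values, cov):
        design = np.column_stack([np.ones(len(cov)), cov])
        beta, *_ = np.linalg.lstsq(design, values, rcond=None)
        return values - design @ beta

    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    cov = np.asarray(covariate, dtype=float)
    return np.corrcoef(residualize(x, cov), residualize(y, cov))[0, 1]
```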

  10. Neuronal integration in visual cortex elevates face category tuning to conscious face perception.

    PubMed

    Fahrenfort, Johannes J; Snijders, Tineke M; Heinen, Klaartje; van Gaal, Simon; Scholte, H Steven; Lamme, Victor A F

    2012-12-26

    The human brain has the extraordinary capability to transform cluttered sensory input into distinct object representations. For example, it is able to rapidly and seemingly without effort detect object categories in complex natural scenes. Surprisingly, category tuning is not sufficient to achieve conscious recognition of objects. What neural process beyond category extraction might elevate neural representations to the level where objects are consciously perceived? Here we show that visible and invisible faces produce similar category-selective responses in the ventral visual cortex. The pattern of neural activity evoked by visible faces could be used to decode the presence of invisible faces and vice versa. However, only visible faces caused extensive response enhancements and changes in neural oscillatory synchronization, as well as increased functional connectivity between higher and lower visual areas. We conclude that conscious face perception is more tightly linked to neural processes of sustained information integration and binding than to processes accommodating face category tuning. PMID:23236162

  11. Parafoveal perception during sentence reading? An ERP paradigm using rapid serial visual presentation (RSVP) with flankers.

    PubMed

    Barber, Horacio A; Ben-Zvi, Shir; Bentin, Shlomo; Kutas, Marta

    2011-04-01

    We describe a new procedure using event-related brain potentials to investigate parafoveal word processing during sentence reading. Sentences were presented word by word at fixation, flanked 2° bilaterally by letter strings. Flanker strings were pseudowords, except for the third word in each sentence, which was flanked by either two pseudowords or a pseudoword and a word, one on each side. Flanker words were either semantically congruent or incongruent with the sentence context. P2 (175-375 ms) amplitudes were less positive for contextually incongruent than congruent flanker words but only with flanker words in the right visual field for English and in the left visual field in Hebrew. Flankered word presentation thus may be a suitable method for the electrophysiological study of parafoveal perception during sentence reading.

  12. Do Students' Approaches to Learning Affect Their Perceptions of Using Computing and Information Technology?

    ERIC Educational Resources Information Center

    Jelfs, Anne; Colbourn, Chris

    2002-01-01

    Discusses the use of communication and information technology (C&IT) in higher education in the United Kingdom and describes research that examined student perceptions of using C&IT for a virtual seminar series in psychology. Identified student learning approaches within the group and how it affected their adoption or rejection of the electronic…

  13. Perceptions of Educational Barriers Affecting the Academic Achievement of Latino K-12 Students

    ERIC Educational Resources Information Center

    Becerra, David

    2012-01-01

    This study examined different factors affecting the perceptions of barriers in academic achievement of Latino K-12 students. The study used data from 1,508 participants who identified themselves as being of Hispanic or Latino heritage in the 2004 National Survey of Latinos: Education, compiled by the Pew Hispanic Center between August 7 and…

  14. A School Principal's Perceptions Regarding Personal Qualities and Pedagogical Qualifications Affecting Teacher Candidate Selection

    ERIC Educational Resources Information Center

    Smith, Pamela Thayer

    2014-01-01

    This study examined the procedures used and the perceptions of a principal as to the personal qualities and pedagogical qualifications affecting the selection of teacher candidates. The approach examined one principal's procedures used to choose which candidates to interview, the process she used to conduct the interviews, the professional…

  15. Ethical Ideologies: Do They Affect Shopping Behaviors and Perceptions of Morality?

    ERIC Educational Resources Information Center

    Cho, Hyeon; Yoo, Jeong-Ju; Johnson, Kim K. P.

    2005-01-01

    Counterfeiting is a serious problem facing several industries, including the medical, agricultural, and apparel industries (Bloch, Bush, & Campbell, 1993). The authors investigated whether ethical viewpoints affect perceptions of the morality of particular shopping behaviors, attitudes toward counterfeit products, and intentions to purchase such…

  16. Public School Principals' Perceptions of Selected External Factors Affecting Job Performance.

    ERIC Educational Resources Information Center

    Reisert, John E.

    Based on principals' own perceptions, this paper examines how the principal's role has changed, what constitutes principals' major problems or concerns, and how state and federal regulations and community pressures have affected the principal's role. The project identified and interviewed 56 public school principals for an 11-county area served by…

  17. Preschool Children's Perceptions of the Value of Affection as Seen in Their Drawings

    ERIC Educational Resources Information Center

    Günindi, Yunus

    2015-01-01

    The purpose of this study is to examine the perceptions of children in preschool education with regard to the value of affection in the pictures they draw. The study involved 199 children aged 60 months old or above. The descriptive research method was used and data were collected with the draw-and-explain technique. During the collection of the…

  18. Students Perceptions on Factors That Affect Their Academic Performance: The Case of Great Zimbabwe University (GZU)

    ERIC Educational Resources Information Center

    Mapuranga, Barbra; Musingafi, Maxwell C. C.; Zebron, Shupikai

    2015-01-01

    Some educators argue that entry standards are the most important determinants of successful completion of a university programme; others maintain that non-academic factors must also be considered. In this study we sought to investigate open and distance learning students' perceptions of the factors affecting academic performance and successful…

  19. Emotion Telepresence: Emotion Augmentation through Affective Haptics and Visual Stimuli

    NASA Astrophysics Data System (ADS)

    Tsetserukou, D.; Neviarouskaya, A.

    2012-03-01

    The paper focuses on a novel concept of emotional telepresence. The iFeel_IM! system, which is in the vanguard of this technology, integrates the 3D virtual world Second Life, an intelligent component for automatic emotion recognition from text messages, and innovative affective haptic interfaces providing additional nonverbal communication channels through simulation of emotional feedback and social touch (physical co-presence). Users can not only exchange messages but also emotionally and physically feel the presence of the communication partner (e.g., family member, friend, or beloved person). The next prototype of the system will include a tablet computer. The user will be able to engage in haptic interaction with the avatar and thus influence its mood and the emotion of the partner. A finger-gesture language will be designed for communication with the avatar. This will bring a new level of immersion to online communication.

  20. Visual aspects of perception of multimedia messages on the web through the "eye tracker" method.

    PubMed

    Svilicić, Niksa

    2010-09-01

    Since the dawn of civilisation, visual communication has played a role in everyday life. In early times, it consisted of simply shaped drawings of animals and pictograms explaining hunting tactics or strategies for attacking enemies. Through evolution, visual expression has become an important component of the communication process on several levels, from the existential and economic level to the artistic level. However, there has always been a question of the level of user reception of such visual information in the medium transmitting it. Does the physical positioning of information in the medium contribute to the efficiency of the message? Do the same rules of content positioning apply to traditional (offline) and online media (Internet)? Rapid development of information technology and the Internet in almost all segments of contemporary life calls for defining the rules of designing and positioning multimedia online contents on web sites. Recent research indicates beyond doubt that the physical positioning of an online content on a web site significantly determines the quality of the user's perception of such content. By employing the "Eye tracking" method it is possible to objectively analyse the level of user perception of a multimedia content on a web site. What is the first thing observed by the user after opening the web site, and how does he/she visually search the online content? By which methods can this be investigated subjectively and objectively? How can the survey results be used to improve the creation of web sites and to optimise the positioning of relevant contents on the site? The answers to these questions will significantly improve the presentation of multimedia interactive contents on the Web.